Seminar by Dr. Andreas Aristidou

Speaker: Dr. Andreas Aristidou 
Date and Time: 9:00–10:00 am, Friday, December 4, 2015
Venue: Room 146, Research Building

Title: Virtual Dance Performance for Interactive Characters
 
Abstract:
Motion capture (mocap) technology is an efficient method for digitizing art performances, and is becoming increasingly popular in the preservation and dissemination of dance creations. Although the captured data can be of very high technical quality, dancing allows stylistic variations and improvisations that cannot be easily identified. The majority of motion analysis algorithms are based on ad-hoc quantitative metrics and thus do not usually provide insights into the style qualities of a performance. In this work, we present a framework based on the principles of Laban Movement Analysis (LMA) that aims to identify style qualities in dance motions. The proposed algorithm uses a feature space that captures the four LMA components (Body, Effort, Shape, Space) and can subsequently be used for motion comparison and evaluation.

We have designed and implemented a prototype virtual reality simulator for teaching dance, in which users can preview dance segments performed by a 3D avatar and repeat them. The user's movements are captured and compared to the dance template motions; intuitive feedback is then provided to the user based on the LMA components. Moreover, a motion search engine is designed and implemented in which users can perform queries using motion clips in a dance database. Results demonstrate that the proposed method can be used in place of, or in combination with, text-based queries to enable more effective and flexible motion database search and retrieval.

Furthermore, we investigate the similarities between various emotional states with regard to the arousal and valence of Russell's circumplex model. We use a variety of features that encode, in addition to the raw geometry, stylistic characteristics of motion based on LMA. Motion capture data from supervised dance performances were used for training and classification purposes. The experimental results show that the proposed features provide a representative space for indexing and classifying dance movements with regard to emotion, giving insights into how people express emotional states using their body. Finally, the aforementioned motion analysis framework is implemented in the context of Motion Graphs, where it is used to eliminate potentially problematic transitions and synthesize style-coherent animation without requiring prior labeling of the data. The effectiveness of our method is demonstrated by synthesizing contemporary dance performances that include a variety of different emotional states; the method composes a dance scenario from existing clips using only plausible movements.
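
As a rough illustration of the feature-space idea described above (not the speaker's actual implementation), the hypothetical Python sketch below reduces a motion clip to a four-element vector loosely corresponding to the Body, Effort, Shape and Space components, and compares two clips with a weighted distance. All feature definitions, names and weights here are illustrative assumptions.

    import numpy as np

    # Hypothetical LMA-inspired descriptor: one scalar per component group
    # (Body, Effort, Shape, Space). The real framework uses a richer feature
    # set; these particular quantities are assumptions for illustration only.
    def lma_features(joint_positions):
        """joint_positions: array of shape (frames, joints, 3)."""
        velocities = np.diff(joint_positions, axis=0)      # frame-to-frame displacement
        speed = np.linalg.norm(velocities, axis=2)         # per-joint speed per frame

        body = joint_positions.std(axis=(0, 1)).mean()     # overall spread of the poses
        effort = speed.mean()                               # average movement energy
        shape = (joint_positions.max(axis=1)
                 - joint_positions.min(axis=1)).mean()      # mean bounding-volume extent
        space = np.abs(np.diff(speed, axis=0)).mean()       # variability of the motion

        return np.array([body, effort, shape, space])

    def motion_distance(template, performance, weights=(1.0, 1.0, 1.0, 1.0)):
        """Weighted Euclidean distance between two clips in the feature space."""
        f1, f2 = lma_features(template), lma_features(performance)
        return float(np.sqrt(np.sum(np.asarray(weights) * (f1 - f2) ** 2)))

    # Toy usage: a "template" clip and a slightly perturbed "user" clip.
    rng = np.random.default_rng(0)
    template = rng.normal(size=(120, 20, 3))      # 120 frames, 20 joints
    performance = template + rng.normal(scale=0.1, size=template.shape)
    print(motion_distance(template, performance))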
 
Bio:
Dr. Andreas Aristidou is a post-doctoral researcher at the Graphics and Virtual Reality Lab, Department of Computer Science, University of Cyprus. Andreas has been awarded the ΔΙΔΑΚΤΩΡ fellowship by the Cyprus Research Promotion Foundation to establish research in motion analysis and classification. In addition, he has been awarded the Office of Naval Research Global (ONRG) Visiting Scientist Program (VSP) scholarship to visit the PhaseSpace Inc. offices in San Diego, CA, USA, in order to be trained on the new PhaseSpace mocap systems and gain experience in related subjects; he continues to collaborate with PhaseSpace Inc., a leading company that offers motion capture solutions for motion tracking and positioning, and he is establishing his own motion capture laboratory at the Graphics and Virtual Reality Lab, University of Cyprus. Following his novel research in folk dance digitization, he has recently been awarded the DARIAH-EU Theme 2015 in Open Humanities for organizing a workshop on the e-documentation of Intangible Cultural Heritage. Andreas has participated in a number of EU-funded projects, including SIM.POL.VR (Virtual Reality Police Simulator) at the University of Cyprus. In addition, he collaborates with Unity 3D on the design of computer games incorporating human-like Inverse Kinematic constraints, and with EU creative industries to design innovative algorithms for motion synthesis and character retargeting. He was a Cambridge European Trust fellow at the Signal Processing and Communications Laboratory, Information Engineering Division, Department of Engineering, University of Cambridge, where he obtained his PhD (2010) under the supervision of Dr. Joan Lasenby. Andreas has a BSc in Informatics and Telecommunications from the National and Kapodistrian University of Athens (2005) and is an honors graduate of King's College London (2006), where he obtained his MSc degree in Mobile and Personal Communications. His main interests focus on 3D motion analysis and classification, motion synthesis, and human animation, and involve optical motion capture, real-time marker prediction and CoR estimation, Inverse Kinematics, and applications of Conformal Geometric Algebra in engineering.