Example-based Approaches for Expressive Locomotion Generation

Title Example-based Approaches for Expressive Locomotion Generation PDF eBook
Author Yejin Kim
Publisher
Pages
Release 2013
Genre
ISBN 9781303442964

The recent development of motion capture technology and its growing popularity make it possible to produce realistic animation from the captured motions of a live actor with far less effort. With these changes in expressive character animation, an effective motion editing approach is required for modifying prerecorded motion clips, since such data are typically specialized to a particular character and to the conditions at the time of capture, and are therefore difficult to edit into a desired output. Thus, recent research in character animation has focused on techniques and tools for reusing a set of motion capture clips as example motions and producing stylistic variations from that set.

For effective editing, each motion property can be classified into one of two categories: the quantitative and qualitative aspects of motion. Quantitative properties are physical values that can be computationally estimated from the motion, such as speed, orientation, and forces, while qualitative properties are associated with more abstract descriptions of motion, including the physical and inner states of a character. Over the years, much research has focused on one aspect or the other for synthesizing stylistic variations from an input motion. Only a few works consider both aspects, and then only in a limited way: they either lack flexible controls for stylizing the input motion or sacrifice the high quality seen in motion capture data. Motivated by this, our goal is to construct a comprehensive motion framework, especially for human locomotion, that adopts motion capture data as its main parameter source and allows an animator to apply a wide variety of stylistic changes via graphical user interfaces (GUIs).
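To illustrate the quantitative side, a property such as body speed can be estimated directly from captured data. The sketch below is illustrative only; the frame rate and the root-position array layout are assumptions, not details from the thesis:

```python
import numpy as np

def root_speed(root_positions, fps=120.0):
    """Estimate per-frame body speed (units/sec) from root-joint positions.

    root_positions: (n_frames, 3) array of world-space root positions.
    """
    # Finite differences between consecutive frames, scaled by frame rate.
    deltas = np.diff(root_positions, axis=0)     # (n_frames - 1, 3)
    return np.linalg.norm(deltas, axis=1) * fps  # speed between frames

# A character moving 0.01 units along x each frame at 120 fps
# walks at a constant 1.2 units/sec.
positions = np.array([[0.01 * i, 0.0, 0.0] for i in range(5)])
speeds = root_speed(positions)
```

Orientation and ground-reaction forces can be estimated in a similar, purely numerical fashion, which is what makes these properties "quantitative" in the sense used above.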
We expect our framework to overcome many limitations of previous systems by editing both the quantitative and qualitative aspects of motion with multiple animation systems, each of which focuses on editing different motion properties during the synthesis process: interactivity, composition, and timing.

For interactive editing of locomotion style, our system is designed to make stylistic changes via the correlations extracted between the end effectors, which we name drives, and the body movement. When an animator interactively controls the positional data for the wrists, ankles, and center of mass, the system automatically updates the current pose at each frame based on the drive positions and the driven orientations, using inverse kinematics (IK) and balance-maintaining routines. The overall editing process is controlled by a set of simple and intuitive linear operations on the motion drives and extracted correlations. Thus, an animator can quickly transform the input locomotion into a desired style at interactive speed.

For expressive locomotion generation on an arbitrary path, we provide a system that adapts multiple example clips to a motion path specified by an animator. Significantly, the system requires only a single example of straight-path locomotion for each style modeled, yet can produce output locomotion for an arbitrary path with arbitrary style transitions. Several techniques are applied to automate the overall synthesis: detection of multiple foot-plants from unlabeled examples, estimation of an adaptive blending length for natural style changes, and a post-processing step that enhances the physical realism of the output animation. Compared with previous approaches, our system requires significantly less data and manual labor while supporting a large range of styles.

When generating locomotion, it is particularly challenging to adjust the motion's style in a qualitative way.
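One of the automated steps mentioned above, foot-plant detection from unlabeled clips, is commonly implemented by thresholding the height and velocity of an ankle joint. The sketch below follows that standard recipe; the thresholds and data layout are illustrative assumptions, not values from the thesis:

```python
import numpy as np

def detect_foot_plants(ankle_positions, fps=120.0,
                       height_thresh=0.05, speed_thresh=0.2):
    """Mark frames where a foot is planted.

    ankle_positions: (n_frames, 3) world positions, y-up.
    A frame counts as planted when the ankle is both near the ground
    AND nearly stationary -- the usual heuristic for unlabeled clips.
    """
    heights = ankle_positions[:, 1]
    velocities = np.gradient(ankle_positions, axis=0) * fps
    speeds = np.linalg.norm(velocities, axis=1)
    return (heights < height_thresh) & (speeds < speed_thresh)

# Toy trajectory: foot resting on the ground, then lifting and swinging.
frames = np.zeros((6, 3))
frames[3:, 0] = [0.1, 0.3, 0.6]   # forward motion during the swing
frames[3:, 1] = [0.1, 0.15, 0.1]  # foot leaves the ground
planted = detect_foot_plants(frames)
```

In practice the two thresholds are tuned per dataset, and short gaps in the detected plant intervals are usually filtered out before the plants are used as blending anchors.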
The component-based system is designed for human locomotion composition driven by a set of example locomotion clips. The distinctive style of each example is analyzed in the form of sub-motion components decomposed from separate body parts via independent component analysis (ICA). During synthesis, we use these components as combinatorial ingredients to generate new locomotion sequences that are stylistically different from the example set. Our system is designed for animators who may not have detailed knowledge of important locomotion properties, such as the correlations throughout the body. Thus, the proposed system analyzes the examples in an unsupervised manner and synthesizes an output locomotion from a small number of control parameters. Our experimental results show that the system can generate physically plausible locomotion in a desired style at interactive speed.

Timing plays an important role in specifying how a character moves from one pose to another. To effectively capture the timing variations in the example set and to utilize them for style transfer, we propose an editing system that provides separate controls over the temporal properties of an input motion via global and upper-body timing transfers. The global timing transfer matches the input motion to the body speed of a selected example motion, conveying the overall sense of the emotional or physical state observed in the example. The timing transfer in the upper body, on the other hand, propagates the sense of movement flow through the torso and arms, which is often referred to as succession. We transfer this succession by capturing the relative changes of joint rotation in the upper body from the example motion and applying them, suitably scaled, to the input motion. Overall, this system provides an animator with temporal edits on locomotion style without destroying the spatial details and constraints preserved in the original motion.
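A minimal sketch of the component-composition idea described above, assuming the style analysis has already been run: each example's joint-angle channels are decomposed into independent sub-motion components (e.g., with an off-the-shelf FastICA), and a new sequence is composed by mixing components drawn from different examples. The variable names and the two-example setup are illustrative, not taken from the thesis:

```python
import numpy as np

def compose_styles(S1, A1, S2, A2, take_from_2):
    """Compose a new motion from ICA components of two example clips.

    S*: (n_frames, n_components) independent sources per example.
    A*: (n_channels, n_components) mixing matrices per example,
        so each clip reconstructs as S @ A.T.
    take_from_2: indices of components borrowed from example 2;
                 all remaining components come from example 1.
    """
    n_components = S1.shape[1]
    mask2 = np.zeros(n_components, dtype=bool)
    mask2[list(take_from_2)] = True
    # Sum each component's contribution, choosing its source clip.
    part1 = S1[:, ~mask2] @ A1[:, ~mask2].T
    part2 = S2[:, mask2] @ A2[:, mask2].T
    return part1 + part2

rng = np.random.default_rng(0)
S1, A1 = rng.normal(size=(100, 4)), rng.normal(size=(30, 4))
S2, A2 = rng.normal(size=(100, 4)), rng.normal(size=(30, 4))
hybrid = compose_styles(S1, A1, S2, A2, take_from_2=[1, 3])
```

With an empty `take_from_2` the function reproduces example 1 exactly, which is a convenient sanity check on the decomposition.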

Transactions on Edutainment VII

Title Transactions on Edutainment VII PDF eBook
Author Zhigeng Pan
Publisher Springer
Pages 295
Release 2013-11-19
Genre Computers
ISBN 3642290507

This journal subline serves as a forum for stimulating and disseminating innovative research ideas, theories, emerging technologies, empirical investigations, state-of-the-art methods, and tools in all different genres of edutainment, such as game-based learning and serious games, interactive storytelling, virtual learning environments, VR-based education, and related fields. It covers aspects from educational and game theories, human-computer interaction, computer graphics, artificial intelligence, and systems design. The 27 papers of this volume deal with virtual humans; graphics rendering and 3D animation; games and 2D animation; and digital media and its applications.

Handbook of Expressive Arts Therapy

Title Handbook of Expressive Arts Therapy PDF eBook
Author Cathy A. Malchiodi
Publisher Guilford Publications
Pages 354
Release 2022-10-26
Genre Medical
ISBN 1462550533

*Authoritative work on helping adults heal, edited by a renowned expert.
*Research is growing for the use of expressive arts to access and regulate powerful emotions and support recovery.
*Practical features include case examples and suggestions for tailoring therapies to individual needs.
*Covers a broad range of approaches--art, music, movement, writing, play, and more.

Expressive Movement Generation with Machine Learning

Title Expressive Movement Generation with Machine Learning PDF eBook
Author Omid Alemi
Publisher
Pages 240
Release 2021
Genre
ISBN

Movement is an essential aspect of our lives. Not only do we move to interact with our physical environment, but we also express ourselves and communicate with others through our movements. In an increasingly computerized world where various technologies and devices surround us, our movements are essential parts of our interaction with and consumption of computational devices and artifacts. In this context, incorporating an understanding of our movements within the design of the technologies surrounding us can significantly improve our daily experiences. This need has given rise to the field of movement computing - developing computational models of movement that can perceive, manipulate, and generate movements. In this thesis, we contribute to the field of movement computing by building machine-learning-based solutions for automatic movement generation. In particular, we focus on using machine learning techniques and motion capture data to create controllable, generative movement models. We also contribute to the field by creating datasets, tools, and libraries that we have developed during our research. We start our research by reviewing the works on building automatic movement generation systems using machine learning techniques and motion capture data. Our review covers background topics such as high-level movement characterization, training data, features representation, machine learning models, and evaluation methods. Building on our literature review, we present WalkNet, an interactive agent walking movement controller based on neural networks. The expressivity of virtual, animated agents plays an essential role in their believability. Therefore, WalkNet integrates controlling the expressive qualities of movement with the goal-oriented behaviour of an animated virtual agent. It allows us to control the generation based on the valence and arousal levels of affect, the movement's walking direction, and the mover's movement signature in real-time. 
Following WalkNet, we examine controlling movement generation with more complex stimuli, such as music represented by audio signals (i.e., non-symbolic music). Music-driven dance generation involves a highly non-linear mapping between temporally dense stimuli (i.e., the audio signal) and movements, which makes the movement modelling problem considerably more challenging. To this end, we present GrooveNet, a real-time machine learning model for music-driven dance generation.
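As a rough illustration of the controllable-generation idea (not WalkNet's actual architecture, whose details are in the thesis), a conditional autoregressive model predicts the next pose from the current pose plus a control vector such as valence, arousal, and walking direction. All sizes and weights below are illustrative stand-ins for a trained network:

```python
import numpy as np

rng = np.random.default_rng(42)
POSE_DIM, CTRL_DIM, HIDDEN = 60, 3, 128  # illustrative sizes

# Randomly initialized weights stand in for a trained network.
W1 = rng.normal(0, 0.1, size=(POSE_DIM + CTRL_DIM, HIDDEN))
W2 = rng.normal(0, 0.1, size=(HIDDEN, POSE_DIM))

def step(pose, control):
    """One autoregressive step: next pose from current pose + controls."""
    x = np.concatenate([pose, control])
    h = np.tanh(x @ W1)   # hidden layer
    return pose + h @ W2  # predict a pose delta

pose = np.zeros(POSE_DIM)
control = np.array([0.8, 0.3, 0.0])  # e.g., valence, arousal, direction
trajectory = []
for _ in range(10):                  # roll the model out for 10 frames
    pose = step(pose, control)
    trajectory.append(pose)
trajectory = np.stack(trajectory)
```

Because the control vector is an input at every step, the mover's expressive qualities can be changed on the fly during generation, which is what makes this family of models attractive for interactive agents.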

Gesture-Based Communication in Human-Computer Interaction

Title Gesture-Based Communication in Human-Computer Interaction PDF eBook
Author Antonio Camurri
Publisher Springer
Pages 571
Release 2011-04-02
Genre Computers
ISBN 3540245987

Research on the multifaceted aspects of modeling, analysis, and synthesis of human gesture is receiving growing interest from both the academic and industrial communities. On one hand, recent scientific developments on cognition, on affect/emotion, on multimodal interfaces, and on multimedia have opened new perspectives on the integration of more sophisticated models of gesture in computer systems. On the other hand, the consolidation of new technologies enabling "disappearing" computers and (multimodal) interfaces to be integrated into the natural environments of users is making it realistic to consider tackling the complex meaning and subtleties of human gesture in multimedia systems, enabling a deeper, user-centered, enhanced physical participation and experience in the human-machine interaction process. The research programs supported by the European Commission and several national institutions and governments have individuated in recent years strategic fields strictly concerned with gesture research. For example, the DG Information Society of the European Commission (www.cordis.lu/ist) supports several initiatives, such as the "Disappearing Computer" and "Presence" EU-IST FET (Future and Emerging Technologies) programs, the IST program "Interfaces & Enhanced Audio-Visual Services" (see for example the project MEGA, Multisensory Expressive Gesture Applications, www.megaproject.org), and the IST strategic objective "Multimodal Interfaces." Several EC projects and other funded research are represented in the chapters of this book. A wide range of applications can benefit from advances in research on gesture, from consolidated areas such as surveillance to new or emerging fields such as therapy and rehabilitation, home consumer goods, entertainment, and audio-visual, cultural and artistic applications, to mention only a few.

Movement, embodiment, kinesemiotics: Interdisciplinary approaches to movement-based communication

Title Movement, embodiment, kinesemiotics: Interdisciplinary approaches to movement-based communication PDF eBook
Author Arianna Maiorani
Publisher Frontiers Media SA
Pages 194
Release 2023-06-02
Genre Science
ISBN 2832524796
