Multi-modal Authoring Tool for Populating a Database of Emotional Reactive Animations
This video was recorded at the 2nd Joint Workshop on Multimodal Interaction and Related Machine Learning Algorithms, Edinburgh 2005. We aim to create a model of emotional reactive virtual humans. A large set of pre-recorded animations will be used to obtain such a model. We have defined a knowledge-based system to store animations of reflex movements, taking into account personality and emotional state. Populating such a database is a complex task. In this paper we describe a multimodal authoring tool that provides a solution to this problem. Our multimodal tool makes use of motion capture equipment, a handheld device and a large projection screen.
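The paper itself does not include code; purely as an illustration of the kind of knowledge base the abstract describes, the sketch below stores pre-recorded reflex-movement clips indexed by emotional state and personality. All names and fields (`ReflexAnimation`, `AnimationDatabase`, `joint_data`) are assumptions for illustration, not the authors' actual schema.

```python
from dataclasses import dataclass, field

# Hypothetical record for one captured reflex animation.
# Field names are illustrative, not taken from the paper.
@dataclass
class ReflexAnimation:
    name: str                      # e.g. "startle_step_back"
    emotion: str                   # emotional state the clip is associated with
    personality: str               # personality profile of the character
    joint_data: list = field(default_factory=list)  # motion-capture frames


class AnimationDatabase:
    """Minimal in-memory store keyed by (emotion, personality)."""

    def __init__(self):
        self._entries = {}

    def add(self, anim: ReflexAnimation) -> None:
        # Group clips under their emotional-state / personality pair.
        self._entries.setdefault((anim.emotion, anim.personality), []).append(anim)

    def lookup(self, emotion: str, personality: str) -> list:
        # Return all reflex clips recorded for this pair, or an empty list.
        return self._entries.get((emotion, personality), [])


# Usage: populating the database from a (hypothetical) capture session,
# the step the multimodal authoring tool is meant to support.
db = AnimationDatabase()
db.add(ReflexAnimation("startle_step_back", emotion="fear", personality="introvert"))
print(len(db.lookup("fear", "introvert")))  # -> 1
```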