Color-Enhanced Emotion
The Color-Enhanced Emotion system recognizes facial expressions and controls skin-pigment components using a real-time processor. The installation allows attendees to experience a system that will usher in a new era in communication and movie editing.
Art and Science
Scientists and engineers have invented several life-enhancing devices, such as glasses, hearing aids, and loudspeakers, that correct and extend human input-output signals. The Color-Enhanced Emotion system enhances the most important and most ambiguous human signal: emotion. The system may enable a new communication paradigm.
Goals
The purpose of this project is to enhance and control ambiguous human emotional signals. Using image control based on physical properties, and by implementing real-time processing, the system displays video images that have never been seen before.
Innovation
The Color-Enhanced Emotion system consists of the following components:
- Emotion recognition based on facial images using computer vision techniques.
- Implementation of a hardware-accelerated real-time processing system that can control the pigment components of the skin to replicate a broad range of conditions: fair, suntanned, pale, drunk, etc. (a sketch of such an adjustment follows this list).
- Decomposition of surface reflection using accurately registered cameras.
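As a rough illustration of how pigment control of this kind can work, the sketch below rescales the melanin and hemoglobin components of a skin image in optical-density space, where pigment absorption combines approximately linearly. This is a minimal assumption-laden sketch, not the installation's actual implementation: the pigment axes, function name, and scale factors are illustrative placeholders, whereas published work on skin color analysis estimates such axes from real skin images (e.g., by independent component analysis).

import numpy as np

# Placeholder melanin / hemoglobin density axes in -log RGB (optical
# density) space. Real systems estimate these from skin images, e.g.
# via independent component analysis; the numbers here are illustrative.
MELANIN_AXIS = np.array([0.74, 0.55, 0.40])
HEMOGLOBIN_AXIS = np.array([0.42, 0.69, 0.58])

def adjust_skin_pigment(rgb, melanin_scale=1.0, hemoglobin_scale=1.0):
    """Rescale the melanin / hemoglobin components of a skin image.

    rgb: float array with values in (0, 1], shape (H, W, 3).
    Returns an adjusted image of the same shape.
    """
    # Convert reflectance to optical density, where pigment absorption
    # combines approximately linearly.
    density = -np.log(np.clip(rgb, 1e-4, 1.0))
    flat = density.reshape(-1, 3).T                     # (3, N)
    axes = np.stack([MELANIN_AXIS, HEMOGLOBIN_AXIS]).T  # (3, 2)
    # Least-squares projection of each pixel onto the two pigment axes.
    coeffs, *_ = np.linalg.lstsq(axes, flat, rcond=None)  # (2, N)
    residual = flat - axes @ coeffs  # shading and other non-pigment terms
    coeffs[0] *= melanin_scale       # >1 suntanned, <1 fair
    coeffs[1] *= hemoglobin_scale    # >1 flushed ("drunk"), <1 pale
    new_density = (axes @ coeffs + residual).T.reshape(rgb.shape)
    return np.exp(-new_density)      # back to reflectance

Applying such an adjustment per frame, restricted to a skin mask, would approximate the fair, suntanned, pale, and flushed variations listed above; the installation performs the equivalent with hardware acceleration to reach real-time rates.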
Ultimately, this project could lead to a new communication culture and a new generation of video editing technology.
Vision
Development of image-based communication methods, such as video phones and video chat, continues, and these applications will certainly become more commonplace. The image most frequently displayed on these systems is the human face, so it will become increasingly important to control image quality in limited-bandwidth environments as emotional expression is enhanced.
Enhancing images with human emotion can help us realize rich and enjoyable communication. Such real-time, highly realistic effects will also have an impact on movie editing, because they enable effects to be applied in real time at a reasonable cost, which has never been possible before.
Contact
Toshiya Nakaguchi
Chiba University
nakaguchi (at) faculty.chiba-u.jp
Contributors
Takao Makino
Yoichi Miyake
Saya Okaguchi
Koichi Takase
Ryoko Usuba
Chiba University
Norimichi Tsumura
Chiba University and PRESTO, JST
Nobutoshi Ojima
Kao Corporation