Monday, January 3, 2011

Researchers develop interactive, emotion-detecting GPS robot

The researchers developed a computer system that tracks feature points on a user's face via a camera and compares the input against a database of hundreds of predefined mental states, interpreting combinations of expressions as emotions. The system also compares the tempo, pitch and emphasis of the user's voice with the same database. Because body movement matters too, posture and gestures are interpreted using the same criteria, as sketched below.
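
To make the matching idea concrete, here is a minimal sketch in Python, not the researchers' actual code: features extracted from one modality (face, voice or posture) are compared against a database of predefined mental-state prototypes, and the closest match is taken as that modality's best guess. The state names, feature vectors and similarity measure are all illustrative assumptions.

    import math

    # Hypothetical database: each mental state maps to a prototype feature
    # vector, e.g. normalised measurements of facial feature-point geometry
    # or voice prosody (tempo, pitch, emphasis).
    MENTAL_STATE_PROTOTYPES = {
        "interested": [0.8, 0.2, 0.6],
        "confused":   [0.3, 0.7, 0.4],
        "bored":      [0.1, 0.1, 0.2],
    }

    def cosine_similarity(a, b):
        # Standard cosine similarity between two feature vectors.
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return dot / norm if norm else 0.0

    def score_mental_states(observed_features):
        # Compare one modality's feature vector with every prototype and
        # return a similarity score per mental state.
        return {
            state: cosine_similarity(observed_features, prototype)
            for state, prototype in MENTAL_STATE_PROTOTYPES.items()
        }

    # Example: features tracked from the face over the last few video frames.
    face_scores = score_mental_states([0.75, 0.25, 0.55])
    print(max(face_scores, key=face_scores.get))  # most likely state for this modality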

The three measures are combined to form an overall picture of the user's emotional state, and the system is said to achieve an accuracy of around 70 per cent – about the same as most of us. Robinson, however, also wanted a system capable of expressing itself...
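
The article does not say how the three measures are combined, so the following is only a sketch of one simple possibility: each modality produces a per-state score as above, and the three are averaged to give the overall estimate. The example inputs are made up.

    def fuse_modalities(face_scores, voice_scores, posture_scores):
        # Combine the three per-modality scores into one overall estimate
        # by simple averaging (an assumed fusion rule, not the real one).
        return {
            state: (face_scores[state] + voice_scores[state] + posture_scores[state]) / 3.0
            for state in face_scores
        }

    overall = fuse_modalities(
        {"interested": 0.9, "confused": 0.4, "bored": 0.1},
        {"interested": 0.7, "confused": 0.5, "bored": 0.2},
        {"interested": 0.8, "confused": 0.3, "bored": 0.3},
    )
    print(max(overall, key=overall.get))  # "interested"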

The first step in the process was to record some common facial expressions and then use software to generate an expressive digital character on a computer screen.
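
On the expressive side, a minimal sketch of how recorded expressions might drive the on-screen character: the detected state selects one of the pre-recorded expression clips for playback. The clip names and selection logic are assumptions for illustration only.

    # Hypothetical mapping from detected mental state to a recorded
    # expression clip that the digital character plays back.
    EXPRESSION_CLIPS = {
        "interested": "clips/nod_and_smile.anim",
        "confused":   "clips/frown_and_tilt.anim",
        "bored":      "clips/slow_blink.anim",
    }

    def choose_expression(detected_state):
        # Fall back to a neutral clip if the state is not in the database.
        return EXPRESSION_CLIPS.get(detected_state, "clips/neutral.anim")

    print(choose_expression("confused"))  # clips/frown_and_tilt.anim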
