Computers with emotions (w/ Video)
December 23, 2010
(PhysOrg.com) — A Cambridge University film provides a glimpse of how robots and humans could interact in the future.
Can computers understand emotions? Can computers express emotions? Can they feel emotions? The latest video from the University of Cambridge shows how emotions can be used to improve interaction between humans and computers.

When people talk to each other, they express their feelings through facial expressions, tone of voice and body posture. They do this even when they are interacting with machines. These hidden signals are an important part of human communication, but computers ignore them.

Professor Peter Robinson leads a team in the Computer Laboratory at the University of Cambridge that is exploring the role of emotions in human-computer interaction. His research is examined in the film The Emotional Computer.
“We’re building emotionally intelligent computers, ones that can read my mind and know how I feel,” Professor Robinson says. “Computers are really good at understanding what someone is typing or even saying. But they need to understand not just what I’m saying, but how I’m saying it.”
The research team is collaborating closely with Professor Simon Baron-Cohen’s team in the University’s Autism Research Centre. Because those researchers study the difficulties that some people have in understanding emotions, their insights help to address the same problems in computers.
Facial expressions are an important way of understanding people’s feelings. One system tracks features on a person’s face, identifies the gestures being made and infers emotions from them. It gets the right answer over 70% of the time, which is about as well as most human observers.
Other systems infer emotions from speech intonation, the way that something is said, while still others analyze body posture and gestures.
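The facial-expression pipeline described above (track features, identify gestures, infer an emotion) can be sketched in miniature. This is a toy illustration under stated assumptions: the feature names, thresholds, gesture rules and emotion labels below are hypothetical, not the Cambridge team's actual model, which is statistical rather than rule-based.

```python
# Hypothetical sketch of the described pipeline:
# tracked facial features -> named gestures -> coarse emotion label.
# All names and thresholds are illustrative assumptions.

def detect_gestures(features):
    """Map tracked feature measurements (0.0-1.0) to named facial gestures."""
    gestures = set()
    if features.get("mouth_corner_raise", 0.0) > 0.5:
        gestures.add("smile")
    if features.get("brow_lower", 0.0) > 0.5:
        gestures.add("frown")
    if features.get("eye_widen", 0.0) > 0.5:
        gestures.add("wide_eyes")
    return gestures

def infer_emotion(gestures):
    """Infer a coarse emotion label from the set of detected gestures."""
    if "smile" in gestures:
        return "happy"
    if "frown" in gestures and "wide_eyes" in gestures:
        return "angry"
    if "frown" in gestures:
        return "displeased"
    if "wide_eyes" in gestures:
        return "surprised"
    return "neutral"

# One video frame's tracked measurements (invented values).
frame = {"mouth_corner_raise": 0.8, "brow_lower": 0.1}
print(infer_emotion(detect_gestures(frame)))  # happy
```

A real system would replace the hand-written rules with a classifier trained on labelled video, which is how a reported accuracy figure like "over 70%" could be measured against human observers.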