Thursday, April 15, 2010
Mertz active vision robotic head from MIT
I've seen robotic heads like the Mertz active vision head before, and they are usually designed to explore what it would be like if robots could display facial expressions.
With that in mind, you might wonder why someone would design this robot head to look like a freaky baby head instead of a handsome face.
Actually, the ultimate purpose of this robot isn’t to test how a human will respond to it, but how the robot will react to humans. Check out more on this MIT experiment after the jump.
According to the source, Mertz is “built to recognize and react to faces and expressions, aiming to research socially situated learning which is similar to an infant’s learning process”.
In other words, it is a robot that can “learn,” much like my baby son is learning to recognize facial features and to speak by parroting those around him.
So how much emotional expression is it capable of? Well, I once heard that the human face can produce thousands of expressions, and I believe Mertz has 13 degrees of articulation in its neck alone. Who knows what its eyes and mouth are capable of.
This robot is the awesome creation of Lijin Aryananda and Jeff Weber at MIT CSAIL. I strongly hope this experiment works, and if it does, it will show that machines can learn. Whoa, not sure I'm ready for that one.