Friday, July 15, 2011

MIT's Media Lab has developed special glasses that read microexpressions.

The glasses use a built-in camera linked to software that analyzes facial micro-expressions; an attached earpiece whispers the interpretation into the wearer's ear. The camera tracks 24 "feature points" on the conversation partner's face, noting which micro-expressions appear, how often, and for how long. The software then compares the collected data against its bank of known expressions.
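The matching step described above can be sketched as a nearest-match lookup. Everything below is an illustrative assumption, not MIT's actual model: the feature names, the tiny three-entry "bank," and the use of Euclidean distance are stand-ins for the real 24-point tracking and classification.

```python
import math

# Toy "bank of known expressions" (hypothetical): each label maps to a
# vector of feature-point measurements, e.g. brow raise, lip-corner pull.
EXPRESSION_BANK = {
    "smile":     [0.10, 0.90, 0.20],
    "confusion": [0.80, 0.20, 0.60],
    "agreement": [0.30, 0.70, 0.10],
}

def classify(observed):
    """Return the bank label whose feature vector is nearest
    (Euclidean distance) to the observed measurements."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(EXPRESSION_BANK, key=lambda label: dist(EXPRESSION_BANK[label], observed))

print(classify([0.15, 0.85, 0.20]))  # closest to the "smile" entry
```

A real system would also weight how often and how long each micro-expression appears, as the article notes, rather than matching a single snapshot.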

When MIT researchers Rosalind Picard and Rana el Kaliouby were calibrating their prototype, they were surprised to find that the average person correctly interpreted only 54 percent of the 24 expressions on real, non-acted faces. They reasoned that most people could use some help reading the real mood of the people they talk with. By contrast, the software correctly identifies 64 percent of the expressions.