Prof Schyns said: “Facial expressions and the interpretation of them are a fundamental part of human communication and our study has revealed how the brain uses facial details in order to make crucial social judgements.
“Our study suggests that facial expressions co-evolved with the brain - the former to be deciphered, the latter to decipher. With time-resolved brain data, we reveal both how the brain uses different expressive features and how long it takes to process enough information for the critical social judgements we take for granted.”
There are six basic facial expressions: happy, fear, surprise, disgust, anger and sadness. Each has distinctive characteristics that the brain can readily tell apart.
Volunteers in the study were shown each expression on 10 different faces, five male, five female, while brain-imaging equipment monitored how quickly different parts of the brain interpreted them.
The results showed that, between 140 and 200 milliseconds after a picture is shown, an information-processing mechanism starts independently in the left and right hemispheres of the brain, looking first at the eyes, then at the rest of the face, before zooming back in on the specific features associated with each basic emotion.
By the end of this process, the brain has enough information to accurately predict the emotional state of the person displaying the facial expression.
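To make the "enough information by about 200 ms" idea concrete, here is a minimal, hypothetical sketch of a time-resolved decoding analysis in the spirit of the paper's model categorizer. It is not the authors' code: the data are synthetic, and the channel count, the 20 ms windows and the class effect injected at 140-200 ms are illustrative assumptions only (Python, using NumPy and scikit-learn).

# Hypothetical sketch of a time-resolved "model categorizer" on synthetic data.
# Train a simple classifier on EEG features in successive time windows and see
# when expression category becomes decodable (an effect is injected at
# 140-200 ms purely for illustration).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 240, 32, 60      # 60 samples = 0-295 ms at 5 ms steps
times_ms = np.arange(n_times) * 5
labels = rng.integers(0, 6, size=n_trials)       # six basic expression categories

# Synthetic EEG: noise plus a class-specific pattern confined to 140-200 ms.
eeg = rng.standard_normal((n_trials, n_channels, n_times))
effect_idx = np.where((times_ms >= 140) & (times_ms <= 200))[0]
for c in range(6):
    trial_idx = np.where(labels == c)[0]
    pattern = 0.8 * rng.standard_normal(n_channels)
    eeg[np.ix_(trial_idx, np.arange(n_channels), effect_idx)] += pattern[None, :, None]

# Cross-validated decoding accuracy in consecutive 20 ms windows.
clf = LogisticRegression(max_iter=1000)
window = 4                                       # 4 samples = 20 ms
for start in range(0, n_times - window + 1, window):
    X = eeg[:, :, start:start + window].reshape(n_trials, -1)
    acc = cross_val_score(clf, X, labels, cv=5).mean()
    print(f"{times_ms[start]:3d}-{times_ms[start] + window * 5:3d} ms: accuracy {acc:.2f}")

In this toy example, accuracy sits near chance (about 0.17 for six categories) outside the 140-200 ms windows and rises within them, mirroring the kind of read-out the study describes.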
The full study is freely available online; see the citation and DOI below.
Abstract: Competent social organisms will read the social signals of their peers. In primates, the face has evolved to transmit the organism's internal emotional state. Adaptive action suggests that the brain of the receiver has co-evolved to efficiently decode expression signals. Here, we review and integrate the evidence for this hypothesis. With a computational approach, we co-examined facial expressions as signals for data transmission and the brain as receiver and decoder of these signals. First, we show in a model observer that facial expressions form a lowly correlated signal set. Second, using time-resolved EEG data, we show how the brain uses spatial frequency information impinging on the retina to decorrelate expression categories. Between 140 to 200 ms following stimulus onset, independently in the left and right hemispheres, an information processing mechanism starts locally with encoding the eye, irrespective of expression, followed by a zooming out to processing the entire face, followed by a zooming back in to diagnostic features (e.g. the opened eyes in “fear”, the mouth in “happy”). A model categorizer demonstrates that at 200 ms, the left and right brain have represented enough information to predict behavioral categorization performance.
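For the abstract's "lowly correlated signal set" point, a toy sketch of the idea follows: treat one image per expression as a signal vector and compute pairwise Pearson correlations. The random 64x64 images are placeholders assumed for illustration, not the study's stimuli or its model-observer analysis.

# Hypothetical "model observer"-style check: how correlated are the expression
# signals with one another? Random images stand in for real expression stimuli.
import numpy as np

rng = np.random.default_rng(1)
expressions = ["happy", "fear", "surprise", "disgust", "anger", "sadness"]
images = {name: rng.random((64, 64)) for name in expressions}   # placeholder stimuli

signals = np.stack([images[name].ravel() for name in expressions])
corr = np.corrcoef(signals)                                     # 6 x 6 correlation matrix

for i, a in enumerate(expressions):
    for j, b in enumerate(expressions):
        if j > i:
            print(f"{a:8s} vs {b:8s}  r = {corr[i, j]:+.2f}")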
Citation: Schyns PG, Petro LS, Smith ML (2009) Transmission of Facial Expressions of Emotion Co-Evolved with Their Efficient Decoding in the Brain: Behavioral and Brain Evidence. PLoS ONE 4(5): e5625. doi:10.1371/journal.pone.0005625