Every face tells a story: New $1.9M grant keeps facial expression research going

In 2014, media outlets worldwide took notice after an electrical and computer engineering professor at The Ohio State University opened up a new field of research by exploring how facial expressions influence language and behavior.

Aleix Martinez, professor and director of the Computational Biology and Cognitive Science Laboratory at Ohio State, specializes in computational models of vision, learning and language. His efforts are leading to a deeper understanding of language similarities across cultures. Through his research, doctors may also someday be able to detect depression and other mental illnesses simply by studying the unique story each face tells.

In support of that work, the National Institutes of Health has awarded Martinez a $1.9 million grant to ensure his research continues.

“This is the newest grant for our ongoing study of facial expressions in sign languages and how these compare to those seen in speech and emotion,” Martinez said.

Leading scientific thinkers have long promoted the concept of six core emotions: happiness, surprise, sadness, anger, fear and disgust. However, in studying American Sign Language (ASL), Martinez and his colleagues discovered a number of distinct facial patterns that assist communication. Research shows that humans across cultural and continental boundaries often understand the same subtle meanings behind those facial expressions.

Martinez said the present NIH award will help his team determine whether these facial expressions go beyond those typically seen in spoken language.

To test his hypothesis, Martinez will collect facial movement data from many volunteers. He will then apply statistical analysis, designing computer algorithms that automatically detect facial features and facial muscle movements. These data are then integrated into linguistic analysis software.
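The article does not detail the lab's actual algorithms, but a minimal sketch can illustrate the kind of automated facial-motion detection described above. The example below assumes Python with the open-source OpenCV and MediaPipe libraries (neither is named in the article); it tracks face-mesh landmarks across video frames and reports their frame-to-frame displacement as a crude proxy for facial muscle movement. The function name and the motion metric are hypothetical.

    # A minimal sketch of automated facial-motion detection, assuming
    # Python with OpenCV and MediaPipe (not the lab's actual tools).
    import cv2
    import mediapipe as mp
    import numpy as np

    def landmark_motion(video_path):
        """Mean frame-to-frame landmark displacement (hypothetical metric)."""
        cap = cv2.VideoCapture(video_path)
        motion, prev = [], None
        with mp.solutions.face_mesh.FaceMesh(static_image_mode=False,
                                             max_num_faces=1) as mesh:
            while True:
                ok, frame = cap.read()
                if not ok:
                    break
                # MediaPipe expects RGB input; OpenCV reads frames as BGR.
                result = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
                if not result.multi_face_landmarks:
                    prev = None  # face lost; reset the motion baseline
                    continue
                pts = np.array([(p.x, p.y) for p in
                                result.multi_face_landmarks[0].landmark])
                if prev is not None:
                    # Average landmark displacement between consecutive
                    # frames: a crude stand-in for facial muscle movement.
                    motion.append(float(np.linalg.norm(pts - prev,
                                                       axis=1).mean()))
                prev = pts
        cap.release()
        return motion

In a full pipeline, per-landmark motion like this would be mapped to specific facial muscle actions and then passed to linguistic analysis software, as the article describes.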

Martinez said he will then be able to pinpoint how the human face produces a large variety of expressions of emotion, as well as expressions with "grammatical function," or expressions that are part of the grammar of the language.

The next question is: how do those expressions affect communication? He hypothesizes that when those facial expressions are removed from ASL, comprehension becomes significantly more difficult.

By analyzing extensive video data, Martinez will examine how distinct facial motions speed up the understanding of ASL communication across cultures.

“To be able to draw conclusive results, it is necessary to study thousands of videos,” Martinez explains. “The proposed computational approach will provide at least a 50-fold reduction in time compared to current methods done by hand.”

From a sociological perspective, this research also applies to the study of human emotions and language comprehension.

“His results suggest that the underlying ability to express emotions may be similar around the world, but cultural biases may simply define emotions in different ways – much the same way that babies are born with the capacity to speak and make sounds for any language, but are trained to speak their native tongue by what they hear around them,” Time Magazine reports.

By studying such detailed facial expressions, researchers may even uncover signs of ongoing depression or mental illness, along with clues to preventing it.

To learn more about Martinez and his research, read his full paper.

Read past articles on Martinez and his research:

"Sign of Success: Martinez wins Google faculty Research Award"

Computer maps 21 distinct emotional expressions - even 'happily disgusted'

More to the face than meets the eye