Newly developed software can interpret emotions in real time

July 6, 2016 | August Schiess, CSL

The system tracks facial expressions to determine the emotions of individuals through video.

We laugh, we cry, we grimace, we smile: our emotions are communicated through our expressions, and until recently, humans were the only ones who could accurately interpret and analyze them. Now, through technology newly developed by ADSC researchers, these expressions can be tracked in real time and measured on a spectrum of emotions.

The expressions of the individual are rated on a continuum based on arousal, valence, and intensity. The cartoon characters in the arousal-valence space are courtesy of Delft University of Technology: studiolab.ide.tudelft.nl/studiolab/pmri/

The team—composed of researchers from the Advanced Digital Sciences Center (ADSC), a University of Illinois research center in Singapore—built software that uses 49 points on the human face to put facial expressions on a spectrum of emotion, measured by arousal, positivity (valence), and intensity.

The team has already licensed this technology to Panasonic, an electronics company based in Japan, and is finding other applications in areas as diverse as advertising, education, human resources, and politics.

Unlike other emotion-tracking software, this system doesn't force expressions into predetermined categories such as happy, sad, disgusted, surprised, or neutral. Instead, expressions are placed on a spectrum that more accurately encompasses the wide variety of human emotion.

“We found that other models try to classify emotions into predetermined categories, usually seven. But people in real life are more complicated than that,” said Stefan Winkler, a distinguished scientist at ADSC. “People could be happy, but they could be slightly, moderately, or very happy—variations in intensity. Or sometimes people exhibit compound emotions. They could be positively surprised or negatively surprised.”

The researchers used a model from the psychology domain that places emotions on a continuum rather than in predefined categories. The horizontal axis measures valence, the degree of pleasure or displeasure; the vertical axis tracks arousal, the degree of activation or deactivation. The intensity of the expression, such as how far the eyebrows are raised or how much the smile curves upward, determines the magnitude of arousal and valence.
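
To make the continuum concrete, here is a minimal sketch, not the ADSC code, of how a point in the arousal-valence plane can be read. Intensity is modeled as distance from the neutral origin; the axis ranges and quadrant labels are illustrative assumptions.

```python
import math

def describe_emotion(valence: float, arousal: float):
    """Interpret a point in the arousal-valence plane (illustrative only).

    valence: -1 (displeasure) to +1 (pleasure), horizontal axis
    arousal: -1 (deactivation) to +1 (activation), vertical axis
    Intensity is modeled as the distance from the neutral origin.
    """
    intensity = math.hypot(valence, arousal)
    quadrant = {
        (True, True): "excited/happy",    # positive valence, high arousal
        (True, False): "calm/content",    # positive valence, low arousal
        (False, True): "angry/afraid",    # negative valence, high arousal
        (False, False): "sad/bored",      # negative valence, low arousal
    }[(valence >= 0, arousal >= 0)]
    return quadrant, intensity

# A mildly positive but highly activated face, e.g. "positively surprised":
print(describe_emotion(valence=0.4, arousal=0.8))  # ('excited/happy', ~0.89)
```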

By tracking 49 points, the software had enough measurements to capture facial expression, but ran into another challenge: how to interpret those points.

“We used a validated database of photos that had been previously rated in regards to perceived emotions and intensity of expression,” said Vassilios Vonikakis, an ADSC research scientist. “We then used this big data set—300,000 images—as training examples for machine learning. The computer taught itself what facial displacements were associated with certain emotions, learning to associate values of arousal and valence with the positions of particular points on the face.”
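
The article doesn't detail ADSC's learner, but the setup it describes, landmark coordinates in and human-rated arousal and valence out, is a standard supervised regression problem. The sketch below is an assumed stand-in that uses scikit-learn's RandomForestRegressor and random placeholder data in place of the real features, ratings, and model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

N_LANDMARKS = 49  # the 49 facial points mentioned in the article

# X: flattened (x, y) coordinates of the 49 points for each image;
# y: human-rated (valence, arousal) pairs. Random placeholder data
# stands in here for the 300,000 rated training images.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, N_LANDMARKS * 2))
y = rng.uniform(-1.0, 1.0, size=(1000, 2))

# Learn the mapping from facial-point positions to the emotion continuum.
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)

# At run time, each video frame's landmarks yield a point on the continuum.
valence, arousal = model.predict(X[:1])[0]
```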

The software, which runs on any laptop with a standard webcam, can also detect engagement.

“We can use the head pose to judge if the person is paying attention to the camera or not. If the person turns away, we can conclude they are less engaged with the screen,” said Siddhanta Chakrabarty, a software engineer at ADSC. “This could have applications in online education—determining if students are engaged with an online lecture.”
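
As an illustration of the head-pose idea, here is a hypothetical heuristic, not ADSC's method: attention scores 1.0 when the head faces the camera and decays to 0.0 as the head turns away, with the 45-degree cutoff an arbitrary assumption.

```python
def engagement_score(yaw_deg: float, pitch_deg: float,
                     max_angle: float = 45.0) -> float:
    """Rough engagement estimate from head pose (hypothetical heuristic).

    A head facing the camera (yaw = pitch = 0) scores 1.0; the score
    decays linearly to 0.0 as the head turns away by max_angle degrees.
    """
    deviation = max(abs(yaw_deg), abs(pitch_deg))
    return max(0.0, 1.0 - deviation / max_angle)

print(engagement_score(5.0, -3.0))  # ~0.89: facing the screen
print(engagement_score(60.0, 0.0))  # 0.0: turned away
```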

The team is exploring a host of other applications. Advertising agencies could measure viewers' reactions to a billboard or commercial to gauge ad effectiveness. Human resources departments could record job candidates' expressions during interviews and quantify their reactions. And the ability to analyze many faces at once could prove useful at political speeches, or anywhere a large crowd gathers.

“Our approach of analyzing emotions on a spectrum—rather than using predefined categories of expressions—makes analyzing a crowd much easier,” said Vonikakis. “Our software gives more fine-grained results individually, and can also estimate the aggregate emotion of a crowd.”  
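
That design choice is what makes aggregation simple: continuous (valence, arousal) estimates can be averaged directly, whereas categorical labels cannot. A minimal sketch, with the function name and sample numbers invented for illustration:

```python
import numpy as np

def aggregate_crowd(points: np.ndarray):
    """Average per-face (valence, arousal) estimates into one crowd reading.

    points: array of shape (n_faces, 2), one row per detected face.
    """
    valence, arousal = points.mean(axis=0)
    return float(valence), float(arousal)

crowd = np.array([[0.6, 0.4], [0.2, 0.7], [-0.1, 0.5]])
print(aggregate_crowd(crowd))  # (~0.23, ~0.53): mildly positive, activated
```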

The group will continue to improve the accuracy of the system, and has already sold several licenses to commercial ventures, including Panasonic.

“ADSC's emotion-tracking system yields good correlations in terms of arousal, valence, and intensity as it leverages big datasets for machine learning. That enables us to build more reliable systems,” said Kim Koon Chan, general manager of Singapore Technology Center of Panasonic Industrial Devices Singapore. “We expect the market to increase rapidly and welcome new customers to this field.”

The team was awarded a GAP grant from A*STAR to develop the system from a research prototype into a commercial product.

“We’re excited to be using this technology in many different areas—it has potential to provide a much more robust way of describing and quantifying facial expressions than has previously been possible,” said Winkler.  


This story was published July 6, 2016.