Wearables

Wearable AI Can Read Your Emotions As You Speak

New research points to a smartwatch that can tell how you feel

(Photo: Jason Dorfman, MIT CSAIL)
Feb 01, 2017 at 6:07 PM ET

Smartphones and other sensor-laden devices already collect enough data to build intimate portraits of our lives, allowing corporations and governments to record our associations, consumer preferences, and physical movements. And thanks to new research from MIT, we’re now a lot closer to having consumer devices capable of understanding perhaps the most intimate data of all: emotions.

In a new project at the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Laboratory, researchers claim they've created a wearable AI system capable of detecting the emotional content of human speech with 83 percent accuracy, using a consumer-grade device that captures and analyzes spoken conversations in real time.

The system works using an artificial intelligence method called deep learning, in which a system learns to perform a task by being "trained" on example data provided by humans. In this case, that data comes from subjects displaying emotional responses while watching "happy" or "sad" videos, as well as telling happy and sad stories of their own.
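To make the train-then-predict flow concrete, here is a toy sketch of the supervised setup: learn "happy" vs. "sad" from labeled feature vectors. The actual system uses a deep neural network; this stand-in nearest-centroid classifier (and its made-up feature vectors) only illustrates how labeled examples become a model that can classify new input.

```python
# Toy supervised-learning sketch; NOT the MIT system's deep network.
# Feature vectors here are invented placeholders for audio features.

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(examples):
    """examples: list of (feature_vector, label), label "happy" or "sad"."""
    by_label = {}
    for vec, label in examples:
        by_label.setdefault(label, []).append(vec)
    # The "model" is just one centroid per emotion label.
    return {label: centroid(vecs) for label, vecs in by_label.items()}

def predict(model, vec):
    """Return the label whose centroid is closest to the input vector."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda label: sq_dist(model[label], vec))
```

Once trained, the model classifies unseen feature vectors by proximity to the learned examples, which is the same basic contract a deep network fulfills with far richer internal representations.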

Once the system is trained on that data, it can begin transcribing and analyzing speech, assigning a “sentiment score” for each 5-second interval of a conversation. The score is generated by measuring speech patterns like the speaker’s tone of voice and speed of delivery. Typically, positive speakers are more energetic, whereas speech communicating sadness or anxiety tends to be slower and have more pauses.
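The windowed scoring described above can be sketched as follows. The feature names (energy, speaking rate, pause fraction) come from the article's description of speech patterns, but the weights and the scoring formula are invented for illustration; the paper's actual scores come from a trained model, not hand-set coefficients.

```python
# Hypothetical per-window sentiment scoring; weights are illustrative.

def sentiment_score(energy, words_per_sec, pause_fraction):
    """Combine simple prosodic features into a score in [-1, 1].

    Per the article: positive speech tends to be energetic and fast,
    while sad or anxious speech is slower with more pauses.
    """
    score = 0.5 * energy + 0.5 * (words_per_sec / 4.0) - pause_fraction
    return max(-1.0, min(1.0, score))

def score_conversation(windows):
    """Score a conversation given one feature tuple per 5-second window.

    windows: list of (energy, words_per_sec, pause_fraction) tuples.
    """
    return [sentiment_score(*w) for w in windows]
```

An energetic, fluent window scores higher than a slow, pause-heavy one, mirroring the patterns the researchers describe.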

The wrist-worn device can also use its sensors to provide physiological data on the wearer’s heart rate, blood pressure, and skin temperature. When combined with speech data, the researchers claim, the system can determine the “overall tone” of the conversation with 83 percent accuracy.
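A minimal sketch of that fusion step, assuming per-window speech scores plus deviations of heart rate and skin temperature from the wearer's baseline. The weights, the direction of each physiological adjustment, and the binary label are all assumptions made for the sketch; the paper's model learns this combination rather than using fixed coefficients.

```python
# Illustrative speech + physiology fusion; coefficients are invented.

def overall_tone(speech_scores, hr_deltas, temp_deltas):
    """Label a conversation's overall tone from per-window signals.

    speech_scores: per-window sentiment scores in [-1, 1].
    hr_deltas / temp_deltas: per-window deviations from the wearer's
    baseline heart rate and skin temperature (arbitrary units).
    """
    n = len(speech_scores)
    avg_speech = sum(speech_scores) / n
    # Treat large heart-rate swings as arousal that tempers the score
    # (an assumption for this sketch, not a claim from the paper).
    avg_arousal = sum(abs(d) for d in hr_deltas) / n
    avg_temp = sum(temp_deltas) / n
    combined = avg_speech - 0.2 * avg_arousal + 0.1 * avg_temp
    return "positive" if combined > 0 else "negative"
```

The point is the shape of the computation: speech carries most of the signal, and the wearable's physiological channels nudge the final call.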

“Imagine if, at the end of a conversation, you could rewind it and see the moments when the people around you felt the most anxious,” says Tuka Alhanai, an MIT graduate student who co-authored a paper for the research project. “Our work is a step in this direction, suggesting that we may not be that far away from a world where people can have an AI social coach right in their pocket.”

The researchers suggest such a system could be a boon for people with often-debilitating social conditions such as anxiety and Asperger's syndrome. But it's easy to see how intimate emotional analysis could also be a privacy nightmare, allowing large corporations, governments, and advertisers not only to record the feelings of people using these devices, but also to use that data to secretly manipulate their emotions and decisions.

“Given its importance for communication, the consequences of misreading emotional intent can be severe, particularly in high-stakes social situations such as salary negotiations or job interviews,” the paper’s researchers wrote. “Machine-aided assessments of historic and real-time interactions may help facilitate more effective communication for such individuals by allowing for long-term social coaching and in-the-moment interventions.”