Emotionally Intelligent AI
The ability to comprehend emotions has long been a key distinction between humans and artificial intelligence, but a recent discovery by Japanese researchers might change that.
‘Humans are emotional creatures’ is an adage as old as time. As humans, our nonverbal cues form an entire dimension of expression critical to any form of communication. Our facial expressions, gestures, intonation and body language come together to convey information just as much as, if not more than, our words.
Like Talking to a Bot
This emotional dimension has been lost on artificial intelligence since its inception, forming a key limitation in its understanding of its human counterparts.
While the cognitive intelligence of AI is helping it power tools that are changing the world as we know it, the lack of emotional intelligence is an impediment to designing systems that are empathetic and provide an immersive experience.
However, things are about to change.
A recent study at the Japan Advanced Institute of Science and Technology reveals a pathway to enable artificial intelligence systems to pick up on human sentiment during a dialogue.
In the endeavour to make AI more ‘emotionally intelligent’, the researchers adopted a group of methods called ‘multimodal sentiment analysis’, which draws on signals such as speech, facial expressions, voice colour and posture to autonomously determine a person’s psychological state.
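To make the idea concrete, multimodal systems commonly score each channel separately and then fuse the scores. The following is a minimal illustrative sketch, not the researchers’ actual method; the modality names, scores and weights are all hypothetical:

```python
# Hypothetical late-fusion sketch of multimodal sentiment analysis.
# Each modality (speech, face, text) is assumed to yield a sentiment
# score in [-1, 1]; a weighted average fuses them into one estimate.

def fuse_sentiment(scores, weights=None):
    """Combine per-modality sentiment scores via a weighted mean.

    scores: dict mapping modality name -> score in [-1, 1]
    weights: optional dict mapping modality name -> non-negative weight
    """
    if weights is None:
        weights = {m: 1.0 for m in scores}
    total = sum(weights[m] for m in scores)
    return sum(scores[m] * weights[m] for m in scores) / total

# Example: the voice sounds positive, the face looks neutral,
# and the words are mildly negative.
estimate = fuse_sentiment(
    {"speech": 0.6, "face": 0.0, "text": -0.2},
    weights={"speech": 2.0, "face": 1.0, "text": 1.0},
)
```

Real systems learn the fusion (often with neural networks over raw features) rather than hand-weighting scores, but the principle of combining complementary channels is the same.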
However, there is more than meets the eye in the science of sentiment detection, which is why the researchers saw the need for a system that also perceives unobservable physiological signals.
“Humans are very good at concealing their feelings. The internal emotional state of a user is not always accurately reflected by the content of the dialog, but since it is difficult for a person to consciously control their biological signals, such as heart rate, it may be useful to use these for estimating their emotional state”, explains Shogo Okada, associate professor at the Japan Advanced Institute of Science and Technology.
“This could make for an AI with sentiment estimation capabilities that are beyond human.”
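One widely used physiological feature of the kind Okada describes is heart-rate variability. The sketch below computes RMSSD, a standard variability metric; it is offered purely as an illustration of a biological signal an emotion estimator might consume, not as anything taken from the study, and the example interval values are invented:

```python
# Illustrative physiological feature: heart-rate variability (RMSSD),
# the root mean square of successive differences between inter-beat
# (RR) intervals in milliseconds. Lower variability is often
# associated with stress or arousal.

import math

def rmssd(rr_intervals_ms):
    """RMSSD over a sequence of RR intervals (ms)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Example: a calm recording shows larger beat-to-beat variation
# than a stressed one (hypothetical values).
calm = rmssd([800, 850, 790, 860, 810])
stressed = rmssd([700, 705, 698, 702, 701])
```

A feature like this could be fed into a sentiment model alongside speech and facial cues, which is precisely the appeal of signals a speaker cannot consciously control.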
The study is described in the IEEE Transactions on Affective Computing.
In just a few years, AI went from drawing boxes around objects in images to performing surgeries, predicting the weather, detecting diseases in humans and restoring words to the paralyzed. Now, it is learning to understand our emotions, just as it understands our world.