AI’s Emotional Revolution is Here


The Gist

  • AI evolution. Emotional intelligence in AI is poised to revolutionize how these models understand and interact with humans.
  • Emotion tech. Hume AI’s technology reads emotions from voices and facial expressions, enhancing AI capabilities.
  • Nonverbal signals. Current AI lacks emotional understanding; integrating nonverbal cues is crucial for better communication.

The AI field has made remarkable progress with incomplete data. Leading generative models like Claude, Gemini, GPT-4, and Llama can understand text but not emotion. These models can’t process your tone of voice, rhythm of speech, or emphasis on words. They can’t read your facial expressions. They are effectively unable to process any of the nonverbal information at the heart of communication. And to advance further, they’ll need to learn.

AI’s Next Leap: Emotional Intelligence

Though much of the AI sector is currently focused on making generative models larger via more data, compute, and energy, the field’s next leap may come from teaching emotional intelligence to the models. The problem is already captivating Mark Zuckerberg and attracting millions in startup funding, and there’s good reason to believe progress may be close.

Related Article: AI Gets Empathetic: Advances in Emotionally Intelligent AI

Zuckerberg: AI Needs Emotional Intelligence

“So much of the human brain is just dedicated to understanding people and understanding your expressions and emotions, and that’s its own whole modality, right?” Zuckerberg told podcaster Dwarkesh Patel last month. “You could say, okay, maybe it’s just video or an image. But it’s clearly a very specialized version of those two.”

Related Article: How AI Is Revolutionizing the Customer Journey in 2024

Ex-Meta Employee Leads AI Emotion Innovation

One of Zuckerberg’s former employees might be the furthest along in teaching emotion to AI. Alan Cowen, CEO of Hume AI, is a former Meta and Google researcher who’s built AI technology that can read the tune, timbre, and rhythm of your voice, as well as your facial expressions, to discern your emotions.

Related Article: Emotions in Marketing: The Art of Anticipatory Customer Experience

Hume’s EVI Bot Reads and Reacts to Emotions

As you speak with Hume’s bot, EVI, it processes the emotions you’re showing — like excitement, surprise, joy, anger, and awkwardness — and expresses its responses with “emotions” of its own. Yell at it, for instance, and it will get sheepish and try to defuse the situation. It will display its calculations on screen, indicating what it’s reading in your voice and what it’s giving back. And it’s quite sticky. Across 100,000 unique conversations, the average interaction between humans and EVI is 10 minutes long, a company spokesperson said.
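To make the idea concrete, here is a minimal, purely hypothetical sketch (not Hume’s actual API or EVI’s real logic) of the pattern the article describes: take a set of detected emotion scores, find the dominant one, and pick a matching response style — e.g., answering anger with a sheepish, de-escalating tone. All function and mapping names here are illustrative assumptions.

```python
# Hypothetical sketch only — not Hume's API. Illustrates mapping detected
# emotion scores to a response style, as the article describes EVI doing.

def dominant_emotion(scores):
    """Return the emotion label with the highest detected score."""
    return max(scores, key=scores.get)

# Illustrative style table: yelling (anger) gets a sheepish, defusing reply.
RESPONSE_STYLE = {
    "anger": "sheepish",
    "joy": "enthusiastic",
    "surprise": "curious",
    "awkwardness": "reassuring",
}

def choose_style(scores):
    """Pick a response style for the dominant emotion, defaulting to neutral."""
    return RESPONSE_STYLE.get(dominant_emotion(scores), "neutral")

# Example: a raised-voice utterance scored mostly as anger
print(choose_style({"anger": 0.71, "joy": 0.05, "surprise": 0.12}))  # -> sheepish
```

A real system would derive the scores from audio and facial-expression models and blend several emotions rather than picking one winner, but the score-in, style-out loop is the core of what EVI displays on screen.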


Related Article: Emotional Intelligence in Customer Service: The Key Differentiator


