Hume AI Raises $50 Million to Build Emotionally Intelligent AI
Yesterday, Hume AI announced it has secured $50 million in a Series B funding round led by EQT Ventures, with participation from Union Square Ventures, Nat Friedman, Daniel Gross, Metaplanet, Northwell Holdings, Comcast Ventures, and LG Technology Ventures.
Founded by CEO Alan Cowen, a former researcher at Google DeepMind, Hume AI differentiates itself by focusing on building an AI assistant that understands and responds to human emotions. This includes developing an API that allows enterprises to create emotion-aware chatbots and applications.
What Sets Hume AI Apart?
Unlike conventional text-based chatbots such as ChatGPT and Claude 3, Hume AI treats voice as a primary interface. It listens to users' intonation, pitch, and pauses to gauge their emotional state.
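To give a sense of what "listening to intonation, pitch, and pauses" involves at the signal level, here is a minimal sketch using the open-source librosa library. This is an illustration of the general idea only, not Hume AI's actual pipeline; the file name and silence threshold are placeholders.

```python
# Illustrative only: extracting pitch and pause cues from an audio clip
# with librosa. Not Hume AI's pipeline; file name and thresholds are
# placeholders.
import librosa
import numpy as np

# Load a mono audio clip (path is a placeholder).
y, sr = librosa.load("utterance.wav", sr=16000)

# Estimate the fundamental frequency (pitch contour) frame by frame.
f0, voiced_flag, voiced_probs = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
)

# Find non-silent intervals; the gaps between them are pauses.
intervals = librosa.effects.split(y, top_db=30)
pauses = [
    (prev_end / sr, next_start / sr)
    for (_, prev_end), (next_start, _) in zip(intervals[:-1], intervals[1:])
]

print("mean pitch (Hz):", np.nanmean(f0))       # NaNs mark unvoiced frames
print("pitch variability (Hz):", np.nanstd(f0))
print("number of pauses:", len(pauses))
```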
The company, named after the Scottish philosopher David Hume and based in New York City, has released a public demo of its “Empathic Voice Interface (EVI),” which it claims is the first conversational AI with emotional intelligence. The demo is available on Hume's website.
The Importance of Understanding Emotions in AI
Creating emotionally aware conversations is complex; Hume AI aspires to go beyond identifying basic feelings like happiness or sadness. Instead, it aims to recognize over 50 nuanced emotions in users, including:
- Admiration
- Anger
- Anxiety
- Boredom
- Joy
- Love
- Sadness
Hume AI is built around the theory that a nuanced understanding of emotions can enhance user experiences — from customer support to collaborative brainstorming.
As Cowen points out, emotional intelligence is crucial if AI interfaces are to interpret user intentions effectively. Studies show that vocal cues offer richer insight into preferences than language alone, making voice a particularly powerful channel for understanding user needs.
How EVI Detects Emotions
Hume AI's EVI identifies emotional cues by analyzing vocal modulations based on extensive training data from hundreds of thousands of participants across multiple cultures. The technology is supported by scientific research published by Cowen and his colleagues, involving experimental data that reveals how emotions are conveyed through vocal and facial expressions.
According to Cowen, the training data also facilitated the development of a speech prosody model, which interprets the tone, rhythm, and timbre of speech, enabling the AI to recognize 48 distinct emotional dimensions. Users can explore an interactive demonstration of the speech prosody model on Hume’s website.
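The speech prosody model itself is proprietary, but conceptually it maps prosodic features of an utterance to a 48-dimensional vector of emotion scores. A toy sketch of that general shape of model follows; the architecture, feature count, and layer sizes are assumptions for illustration, not Hume AI's actual design.

```python
# Toy illustration of the *shape* of a speech prosody model: prosodic
# features in, a 48-dimensional vector of emotion scores out. The
# architecture and feature set here are assumptions for illustration.
import torch
import torch.nn as nn

N_PROSODY_FEATURES = 64   # e.g., pitch, energy, rhythm statistics (assumed)
N_EMOTION_DIMS = 48       # matches the 48 dimensions Hume describes

class ToyProsodyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_PROSODY_FEATURES, 128),
            nn.ReLU(),
            nn.Linear(128, N_EMOTION_DIMS),
            nn.Sigmoid(),  # each dimension scored independently in [0, 1]
        )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        return self.net(features)

model = ToyProsodyModel()
scores = model(torch.randn(1, N_PROSODY_FEATURES))  # one utterance's features
print(scores.shape)  # torch.Size([1, 48])
```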
APIs for Developers
Hume AI offers an “Expression Measurement API” that allows businesses to integrate emotional intelligence into their applications, including real-time analysis of facial expressions, vocal bursts, and emotional language. It also provides a “Custom Models API” that lets organizations train tailored models for contexts ranging from customer interactions to security applications.
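As a rough sketch, integrating an expression-measurement service into an application might look something like the following. The endpoint URL, request parameters, and response shape below are hypothetical assumptions for illustration only; consult Hume AI's developer documentation for the real interface.

```python
# Hypothetical sketch of calling an expression measurement API over HTTP.
# The endpoint, parameters, and response shape are assumptions, not
# Hume AI's documented interface.
import requests

API_KEY = "YOUR_API_KEY"                            # placeholder
ENDPOINT = "https://api.example.com/v0/expression"  # hypothetical URL

with open("utterance.wav", "rb") as f:
    response = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"audio": f},
    )
response.raise_for_status()

# Assumed response shape: {"expressions": [{"emotion": str, "score": float}]}
for item in response.json().get("expressions", []):
    print(f"{item['emotion']}: {item['score']:.2f}")
```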
Ethics and Social Responsibility
Hume AI is committed to ethical guidelines through its affiliated non-profit, The Hume Initiative. This organization brings together social scientists, ethicists, and AI researchers to outline principles for the responsible use of empathic AI. Among the guidelines is the commitment to avoid manipulative applications of emotion-detection technologies.
Positive Reception and Future Outlook
Following the funding announcement and demo, early reception has been largely positive, with industry figures praising Hume's EVI for the naturalness of its interactions. As AI technology continues to evolve, Hume AI is potentially setting new standards for human-like interactivity and emotional responsiveness in voice assistants, a field where competitors like Amazon Alexa may need to innovate further.
When asked about potential partnerships with larger tech companies, Cowen remained non-committal, saying only, "No comment."
Hume AI stands at the forefront of emotional AI technology, promising to enrich user experiences across various sectors.