What is an empathic voice interface? This startup raised $50M just to launch it

We all know that the nuances of voice, facial expressions, and gestures are key to human communication. New York-based Hume AI, a startup focused on human well-being through AI, has just secured $50 million in Series B funding to launch a groundbreaking technology: an Empathic Voice Interface (EVI). This innovation promises to change the way we interact with technology by offering emotionally intelligent AI voices that can be integrated into various applications.

Hume AI’s fundraising round was spearheaded by EQT Ventures, with additional investors including Union Square Ventures, a consortium of Nat Friedman and Daniel Gross, Metaplanet, Northwell Holdings, Comcast Ventures, and LG Technology Ventures.

Last year, San Francisco-based startup InTone, which developed an AI-powered voice platform for real-time accent enhancement, raised $1.7 million in a seed round from Yellow Rocks!, A.Partners, and business angels from the BPO and AI industries.

Building on a legacy of pioneering research

Hume AI was founded by Dr. Alan Cowen, a renowned scientist whose work on semantic space theory fundamentally changed our understanding of emotional experience and expression. This theory delves into the subtleties of voice, face, and gesture, providing a crucial foundation for Hume’s AI development. The company sits at the intersection of AI, human behaviour, and well-being, and has already created advanced tools for measuring human emotion that are used in robotics, healthcare, and user research.

Hume AI’s founder, Dr. Cowen, emphasises the limitations of current AI systems that rely on superficial human input. His vision is for AI to learn directly from proxies of human happiness, essentially reconstructing human preferences and continuously updating them through new interactions and applications.

What is EVI (empathic voice interface)

The centrepiece of Hume’s recent funding is EVI, currently in beta. This emotionally intelligent conversational AI is the first of its kind, trained on vast amounts of human interaction data. EVI can not only detect when users have finished speaking but also predict their preferences and generate vocal responses tuned for user satisfaction over time. Developers can integrate EVI into any application with just a few lines of code, as the sketch below illustrates.
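
To make that integration claim concrete, here is a minimal sketch in Python of what wiring a hosted voice interface into an application over WebSockets could look like. The endpoint URL, message fields, and the converse helper are illustrative assumptions for this article, not Hume’s documented API.

import asyncio
import base64
import json

import websockets  # third-party: pip install websockets


async def converse(audio_chunk: bytes) -> dict:
    """Send one chunk of microphone audio and return the interface's reply."""
    uri = "wss://api.example.com/v0/evi"  # placeholder endpoint, not a real Hume URL
    async with websockets.connect(uri) as ws:
        await ws.send(json.dumps({
            "type": "audio_input",
            "data": base64.b64encode(audio_chunk).decode("ascii"),
        }))
        # A real reply would carry synthesized speech plus expression measures.
        return json.loads(await ws.recv())


# Example call (requires a live endpoint):
# reply = asyncio.run(converse(open("user_utterance.pcm", "rb").read()))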

Current AI voice interfaces often fall short due to their monotonous and robotic nature. EVI aims to bridge this gap by creating immersive conversational experiences that mimic the natural flow of human speech. It achieves this feat through a novel multimodal generative AI system that integrates large language models (LLMs) with expression measures. Hume refers to this as an empathic large language model (eLLM).
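
As a purely conceptual illustration of that idea, and not Hume’s actual architecture, the sketch below pairs a transcribed user turn with expression measures so a downstream language model prompt can be conditioned on both; the class and field names are invented for this example.

from dataclasses import dataclass


@dataclass
class ExpressiveTurn:
    """A transcribed user turn plus scores from an expression-measurement model."""
    text: str
    expressions: dict[str, float]  # e.g. {"amusement": 0.7, "doubt": 0.2}

    def to_prompt(self) -> str:
        # Name the strongest expression so the language model can match its tone.
        dominant = max(self.expressions, key=self.expressions.get)
        return (
            f"The user sounds mostly {dominant}. "
            f"Reply to {self.text!r} in a tone that acknowledges this."
        )


turn = ExpressiveTurn("I can't believe that actually worked!",
                      {"amusement": 0.7, "doubt": 0.2})
print(turn.to_prompt())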

Ted Persson, Partner at EQT Ventures, the lead investor in this round, believes Hume’s empathic models are the missing piece in the AI landscape. He sees EVI as the foundation for creating AI that truly understands human needs and desires, with the potential to become a universal interface.

eLLM: The secret sauce behind EVI’s fluency

The eLLM empowers EVI to adapt its words and vocal tone based on context and the user’s emotional expressions. Additionally, EVI can accurately detect the end of a user’s turn, seamlessly stopping its own speech when interrupted and delivering rapid responses with minimal latency (under 700 milliseconds) for a near-human conversation experience.
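
To make the turn-taking behaviour concrete, here is a small client-side sketch of how an application might honour interruptions: the assistant’s playback loop watches an event that a voice-activity detector sets the moment the user starts speaking again. The detector and playback calls are placeholders, not Hume’s code.

import asyncio


async def play_reply(frames: list[bytes], interrupted: asyncio.Event) -> None:
    """Stream the assistant's reply, bailing out the moment the user barges in."""
    for frame in frames:
        if interrupted.is_set():
            return  # user started talking: stop speaking immediately
        await asyncio.sleep(0.02)  # stand-in for writing ~20 ms of audio to the speaker


async def watch_microphone(interrupted: asyncio.Event) -> None:
    """Stand-in for a voice-activity detector that fires when the user speaks."""
    await asyncio.sleep(0.1)  # pretend speech is detected 100 ms into the reply
    interrupted.set()


async def main() -> None:
    interrupted = asyncio.Event()
    frames = [b"\x00" * 640] * 50  # ~1 second of placeholder reply audio
    await asyncio.gather(play_reply(frames, interrupted),
                         watch_microphone(interrupted))


asyncio.run(main())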

What are Hume AI’s potential applications

Union Square Ventures, another investor, highlights the scientific rigour and exceptional data quality behind Hume’s technology. Andy Weissman, managing partner, emphasises the wide range of potential applications, from customer service to improving medical diagnoses and patient care, as demonstrated by Hume’s collaborations with Softbank, Lawyer.com, and researchers at Harvard and Mt. Sinai.

What we think about Hume

Hume AI’s growing team of 35 researchers, engineers, and scientists is dedicated to advancing Dr. Cowen’s work on semantic space theory. His research, published in prestigious journals such as Nature, has analysed the widest and most diverse range of emotions ever studied, informing Hume’s data-driven approach to creating more empathic AI tools. Hume’s technology goes beyond words, leveraging the nuances of speech, including rhythm, timbre, and nonverbal cues, to enhance human-computer interaction.

Hume AI’s mission extends beyond creating innovative products. The company conducts groundbreaking research, publishes in leading scientific journals, and supports The Hume Initiative, a non-profit organisation that has released the first concrete ethical guidelines for empathic AI. With EVI and a commitment to ethical development, Hume AI could usher in a new era of human-computer interaction marked by empathy and understanding.
