Hume is a research lab and technology company whose stated mission is to ensure that artificial intelligence is built to serve human goals and emotional well-being.
Hume offers the world's first Empathic Voice Interface (EVI), which I used to build Yuri. Through their API, I gave Yuri a layer of emotional intelligence: empathetic, voice-based interactions that adapt to tone, rhythm, and expression. A rough sketch of how that connection works is shown below.
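For anyone curious what the wiring looks like, here is a minimal sketch in Python. The WebSocket endpoint, the api_key query parameter, and the user_input / assistant_message / assistant_end message types reflect my reading of Hume's EVI documentation at the time of writing, so treat them as assumptions and verify against the current API reference before building on this.

```python
# Minimal sketch of opening an EVI chat session from Yuri.
# Endpoint, auth query parameter, and message type names are assumptions
# based on my reading of Hume's docs; check the current API reference.
import asyncio
import json
import os

import websockets  # pip install websockets

EVI_URL = "wss://api.hume.ai/v0/evi/chat"  # assumed EVI WebSocket endpoint

async def chat() -> None:
    api_key = os.environ["HUME_API_KEY"]
    async with websockets.connect(f"{EVI_URL}?api_key={api_key}") as ws:
        # Send a plain text turn; EVI also accepts streamed audio input.
        await ws.send(json.dumps({"type": "user_input",
                                  "text": "Hey Yuri, how are you?"}))
        async for raw in ws:
            message = json.loads(raw)
            # Replies arrive as typed JSON messages; audio chunks and
            # expression scores ride along with the text content.
            if message.get("type") == "assistant_message":
                print(message["message"]["content"])
            elif message.get("type") == "assistant_end":
                break  # assistant has finished this turn

asyncio.run(chat())
```

Yuri actually streams microphone audio rather than typed text (the same docs describe an audio_input message for that), but the session lifecycle is identical: connect, send a turn, and read typed JSON messages until the assistant finishes.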
I’ve been testing out Hume AI, and honestly, it’s one of the more interesting AI tools I’ve used lately. Most emotion AI platforms feel sterile or overly technical, but Hume actually feels like it was built to understand people — not just analyze data.
What stood out to me most was how it picks up on subtle emotional cues in voice and text. I ran a few tests where I changed my tone just slightly, and Hume caught it. Not in a robotic “detected: sadness” way, but more like “we noticed a drop in energy and emotional tone,” which felt more nuanced and respectful. It doesn’t try to label you so much as listen, if that makes sense.
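That listening-rather-than-labeling feel is something you can carry through on the client side too. Here is a hedged sketch of how Yuri-style code might turn Hume's per-emotion scores into a soft, ranked summary instead of one hard label. The models.prosody.scores payload shape follows my reading of EVI's message format, and the numbers are invented purely for illustration.

```python
# Hedged sketch: summarize Hume's per-emotion scores as a ranked blend
# rather than a single argmax label. The models.prosody.scores shape is
# an assumption from EVI's documented message format; values are made up.
from typing import Dict

def summarize_expression(scores: Dict[str, float], top_n: int = 3) -> str:
    """Return a gentle, ranked summary of the strongest expression scores."""
    top = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]
    parts = [f"{name.lower()} ({value:.2f})" for name, value in top]
    return "leaning " + ", ".join(parts)

# Illustrative fragment shaped like an EVI message payload:
message = {
    "models": {
        "prosody": {
            "scores": {"Calmness": 0.41, "Tiredness": 0.33, "Sadness": 0.19}
        }
    }
}

print(summarize_expression(message["models"]["prosody"]["scores"]))
# -> leaning calmness (0.41), tiredness (0.33), sadness (0.19)
```

Reporting a blend of the top few scores, rather than the single highest one, is what makes the output read as an observation instead of a verdict.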
It’s not perfect; there were moments when the interpretations felt a little generic. But considering how complex human emotion is, I’m impressed with how far it’s come. Plus, the company seems genuinely values-driven, which matters to me when we're talking about tech that interprets human feelings.