Her & Him: Is This the Next Chapter in Human-AI Interaction?
In 2013, Spike Jonze’s Her gave us a glimpse into a future where AI wasn’t just a tool—it was a companion. Joaquin Phoenix’s character falls for Samantha, an AI voice that feels real, responsive, and deeply human. A decade later, we’re standing at the edge of that reality with Sesame AI and its two voices: Maya and Miles—or, as I like to call them, Her & Him.
For years, AI voices lived in the uncanny valley—too human-like to feel robotic, yet not quite natural enough to be comforting. But with Sesame’s voice model, that discomfort is fading. These voices pause, chuckle, soften, and respond like an old friend. They can detect your emotions, adjust their tone, and even offer reassurance if you sound distressed. We’re not just hearing AI anymore—we’re feeling it.
So, why is this exciting? Because it signals a new era where AI becomes more seamless, natural, and emotionally intelligent. Unlike the stiff, monotone voices of old, AI like Maya and Miles engage in expressive, dynamic exchanges. The hesitation, the warmth, the humor—it all makes interactions more fluid. They don’t just respond to words; they react to how you say them. Feeling frustrated? The AI picks up on that and adapts its tone accordingly. Imagine having an AI assistant that explains concepts like a patient mentor, or an AI therapist who offers calm, human-like support in moments of stress. We’re moving beyond AI as a tool and toward AI as a presence—something that doesn’t just execute tasks but engages with us meaningfully.
But as we move beyond the uncanny valley, we have to ask: What are we crossing into? If AI voices become indistinguishable from human ones, how do we ensure transparency? If AI sounds more empathetic than some people, will we start to prefer digital interactions over human ones? As Her predicted, could we become emotionally attached to AI? If AI is programmed to be the perfect listener, will we turn to it over real relationships? Imagine AI voices being used for deepfake calls or personalized persuasion. If AI can pick up on emotions, can it also be programmed to exploit them? Sesame AI’s assistant isn’t evil—it’s not HAL 9000 from 2001: A Space Odyssey—but it raises ethical questions. The more real AI feels, the more we have to decide what role we want it to play in our lives.
So where does this leave us? AI voices are no longer a novelty; they’re becoming woven into the fabric of daily life. Whether we use them as assistants, companions, or even creative partners, one thing is clear: We’re no longer just talking to AI—we’re talking with AI. The LinkedIn post by Dana Griffin on AI and belonging speaks to this shift. As AI grows more human-like, we need to ensure it enhances connection rather than replaces it. The challenge isn’t just making AI sound more human—it’s making sure humans stay at the center of the conversation.
We are living in the age of Her & Him. The question is: How do we want them to live alongside us?