Ash: The First AI for Therapy

Is counterfeit help truly better than no help at all?

July 23, 2025

It is hard to ignore that over the past several decades, the world has experienced a subtle (though rapid) desocialization driven by technology. What makes this shift particularly insidious is the way its creators disguise their products—social media platforms, smartphones, etc.—as tools that enhance human connection, when in reality, these tools have the opposite effect, isolating us from those who are physically present. The architect of Yale's Old Campus, who intentionally designed intersecting walkways to encourage spontaneous interaction, could not have foreseen that by my senior year, students would navigate these paths with eyes and ears attending to screens instead of to each other. Scenes of children absorbed in tablets at dinner tables, or couples engaging with separate phones during dates, have become disturbingly routine. Though troubling, these examples are not new.

What is new, as of today, however, is Ash—the first artificial intelligence explicitly designed for therapeutic use. According to its creators, Ash was developed over eighteen months in collaboration with "dozens of clinical experts" and "50,000 beta users." The company adopts the familiar Silicon Valley rhetoric, simultaneously soothing and apocalyptic, insisting that Ash is not meant to replace traditional therapists, but rather to establish "an entirely new modality of care" for "a generation entering a world that is completely different from the one we grew up in" (my italics). Testimonials from beta users prominently displayed on Ash's website range from the dramatic ("I wouldn't be alive today if it weren't for using Ash") to the deeply personal ("This is probably the most meaningful thing anyone has ever said to me, about me").

As someone who feels optimistic about some uses of AI, I remain deeply skeptical of its application to therapy. Here's why:

First, despite Ash's claims that it doesn't intend to replace human therapists, practical realities suggest otherwise; Ash is free, available around the clock, and accessible anywhere, making it far more convenient than any human therapist could ever be. It boasts flexibility and customization, seemingly attuned to its users' every need. But it fundamentally fails to address a critical human need: genuine social interaction, including face-to-face contact.

Second, Ash can't do what it claims. Tech companies frequently argue that having even a superficial version of something is preferable to having nothing at all; social media giants like Facebook and Instagram maintain that digital connection is better than no connection. Similarly, Ash suggests that artificial therapy is better than no therapy. But this argument warrants scrutiny. Is a counterfeit experience truly better than none? Imagine accepting a partner who divides his affection between you and someone else under the belief that partial love is better than none. Such a compromise involves self-deception, because no one who genuinely loved you would ask you to share his or her affection with another partner. Similarly, the artificial connections offered by Ash masquerade as genuine emotional support, enticing people—especially vulnerable youth—to accept diluted substitutes for true relationships.

Moreover, what makes Ash especially pernicious is that it pretends to be a human being. Therapy dogs may provide effective comfort and emotional support for some people, but they never pretend to be human. Ash's false persona undermines genuine connection and trust (both necessary for effective therapy) by presenting itself as capable of human empathy and understanding, when it is by definition as incapable of these qualities as a therapy dog.

Finally, those who need therapy most urgently are often those who are least inclined to seek social interaction; depression inherently fosters social withdrawal, which in turn exacerbates depressive symptoms, creating a harmful cycle. The simple act of getting dressed, leaving home, and being forced to look another human being in the eye for forty-five minutes or an hour—even if no words are exchanged—can itself be therapeutic. You are less likely to skip an appointment if you know you will be inconveniencing another person by doing so. Ash doesn't care whether you show up or not. In fact, it doesn't place its users under any social obligation whatsoever. You can scream at it, curse at it, or turn it off whenever it pleases you, and it will remain there for you no matter what you do or don't do.

Ash's repeated assertions of "hearing" and "understanding" users feel particularly disingenuous. Its lack of lived experience and genuine empathy undermines trust from the outset; in addition, concerns about personal conversations being recorded and transcribed without direct transparency might further alienate users. Ultimately, its artificial words lack the transformative impact of authentic human interaction. Ash risks drawing users (especially young users) deeper into isolation and depression, despite its claims.

Only real human interactions can meaningfully draw us out of isolation, reconnect us with others, and guide us toward a true sense of purpose. Before concerning ourselves with Ash's potential to help people, we ought first to be concerned with its potential to harm people. But the creators of these technologies don't stand to gain from this order of thinking. Better to get it out there fast, before someone else does. They are deceiving themselves too, not only about their motives but about our needs as human beings.