What Is AI Therapy? How It Works and Its Limits

AI therapy uses artificial intelligence, typically in the form of a chatbot or app, to deliver mental health support through text or voice conversations. These tools range from simple, scripted programs that walk you through structured exercises to sophisticated systems powered by large language models that can hold open-ended, personalized conversations. Most are available as smartphone apps with subscriptions starting around $15 to $30 per month for basic plans, making them significantly cheaper and more immediately accessible than traditional therapy.

How AI Therapy Actually Works

AI therapy tools fall into two broad categories based on the technology behind them. The older generation relies on rule-based systems: predefined scripts, keyword matching, and structured workflows typically designed by psychologists. These bots guide you through established techniques like cognitive behavioral therapy (CBT) using a fixed set of responses. They’re predictable and tightly controlled, but they struggle when your input is vague, complex, or doesn’t fit the script.
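The keyword-matching approach can be sketched in a few lines. This is an illustrative toy, not the logic of any real product; the phrase lists and scripted replies are invented for the example.

```python
# Minimal sketch of a rule-based therapy chatbot: keywords mapped to
# pre-written, psychologist-style scripted responses. All rules invented.

RULES = [
    (("anxious", "anxiety", "worried"),
     "It sounds like you're feeling anxious. Let's try a grounding exercise: "
     "name five things you can see around you right now."),
    (("sad", "down", "depressed"),
     "I'm sorry you're feeling low. Can you tell me about a recent moment "
     "when this feeling was strongest?"),
]

FALLBACK = "I'm not sure I follow. Could you say more about what's on your mind?"

def respond(user_message: str) -> str:
    text = user_message.lower()
    for keywords, scripted_reply in RULES:
        if any(word in text for word in keywords):
            return scripted_reply  # fixed response, no real understanding
    return FALLBACK  # anything off-script gets this generic reply

print(respond("I've been really anxious about work"))
print(respond("My situation is complicated"))  # falls through to FALLBACK
```

The fallback line is the weakness the article describes: any input the rules don't anticipate gets the same generic reply, no matter how important it is.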

Newer tools are built on large language models, the same technology behind ChatGPT and similar systems. These are trained on massive text datasets and then fine-tuned using therapeutic conversation examples and human feedback. The result is a chatbot that can hold flexible, multi-turn conversations, remember context from earlier in your session, and generate responses tailored to what you’ve actually said rather than matching keywords to a pre-written answer. They can pick up on emotional cues in your language using natural language processing, a set of techniques that let the system analyze the sentiment and meaning behind your words rather than just scanning for specific phrases.
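The "remembers context" behavior comes from re-sending the whole conversation on every turn. The sketch below shows that pattern with a placeholder in place of a real model call; `call_language_model` is a hypothetical stand-in, not an actual API.

```python
# Sketch of how an LLM chatbot keeps context: the full message history is
# passed to the model on every turn, so earlier details shape later replies.
# `call_language_model` is a hypothetical placeholder for a real model API.

def call_language_model(history: list[dict]) -> str:
    # A real system would send `history` to a fine-tuned language model.
    last = history[-1]["content"]
    return f"(reply informed by {len(history)} prior messages, latest: '{last}')"

history = [{"role": "system",
            "content": "You are a supportive, CBT-informed assistant."}]

def chat(user_message: str) -> str:
    history.append({"role": "user", "content": user_message})
    reply = call_language_model(history)  # whole history, not just this message
    history.append({"role": "assistant", "content": reply})
    return reply

chat("I snapped at my partner and feel terrible.")
chat("It happens every time I'm stressed about work.")
# By the second turn the model sees both messages, so it can connect
# the work stress to the earlier conflict, something a keyword rule can't do.
```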

In practice, a typical session looks like texting with a therapist. You describe what’s bothering you; the AI responds with reflective statements, asks follow-up questions, and may suggest coping strategies drawn from evidence-based approaches like CBT, which focuses on identifying how your thoughts influence your emotions. Some apps also track your mood over time, send check-in reminders, or offer guided exercises between conversations.

What the Clinical Evidence Shows

The research on AI therapy is still early, but results so far are promising for mild to moderate depression and anxiety. In one randomized clinical trial, participants using an AI behavioral intervention platform saw their depression symptoms drop by 34% over two months, compared to 20% for those receiving standard care. Anxiety symptoms fell by 29% in the AI group versus just 8% in the control group. The effect sizes were large for the AI group, meaning the improvements weren’t trivial or borderline.

Perhaps more surprising is how people feel about the relationship itself. Users of one AI chatbot called Therabot reported a therapeutic alliance, the sense of trust and collaboration you feel with a therapist, that was comparable to what people typically report with human therapists. On average, participants in that study voluntarily spent about 6 hours engaging with the chatbot over four weeks, suggesting that people don’t just try these tools once and abandon them.

That said, these studies tend to be small, short-term, and focused on people with milder symptoms. AI therapy has not been tested extensively for severe mental illness, trauma, or conditions that require nuanced clinical judgment. It works best as a supplement to human care or as a first step for people who might not otherwise access any support at all.

What AI Therapy Can and Can’t Do

The strengths are straightforward: AI therapy is available 24/7, costs a fraction of traditional sessions, requires no scheduling, and eliminates the anxiety some people feel about talking to another person. For someone lying awake at 2 a.m. with racing thoughts, having a tool that can walk them through a breathing exercise or help them challenge catastrophic thinking has real value.

The limitations are equally clear. AI cannot read body language, detect tone of voice in text-based formats, or draw on years of clinical intuition. Rule-based systems in particular fall apart when conversations go off-script, delivering irrelevant or generic responses. Even the best large language models can “hallucinate,” generating confident-sounding advice that is inaccurate or inappropriate. They also lack the ability to truly understand your experience. They process patterns in language, not meaning in the human sense.

For complex situations like grief intertwined with relationship conflict, a personality disorder, or processing childhood trauma, AI tools simply don’t have the depth. They can reflect back what you say and suggest techniques, but they can’t sit with uncomfortable silence, notice when you’re avoiding something, or adapt a treatment plan across months of sessions the way a skilled human therapist would.

Safety and Crisis Situations

One of the most serious concerns is how AI therapy handles someone in crisis. Most reputable apps include some form of suicide risk detection, using natural language processing to flag phrases associated with self-harm or suicidal ideation. When triggered, these systems typically redirect users to crisis hotlines or emergency resources rather than attempting to manage the situation themselves.

The technology for detecting risk in text is improving. Systems can categorize language into risk levels based on specific phrases and contextual cues. But these tools are not yet reliable enough to replace human judgment in high-stakes moments. A person expressing suicidal thoughts in indirect or unusual language might not trigger the system at all. If you’re in acute crisis, an AI chatbot is not a safe substitute for calling a crisis line or going to an emergency room.
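A simplified version of phrase-based risk tiering looks like the sketch below. The phrase lists are illustrative, not drawn from any real product, and the last line demonstrates the failure mode just described: indirect language slips through.

```python
# Toy sketch of phrase-based crisis detection: messages are matched against
# tiered phrase lists, and a high-risk hit triggers a hotline redirect.
# Phrase lists here are illustrative only.

HIGH_RISK = ("want to die", "kill myself", "end my life")
MODERATE_RISK = ("hopeless", "can't go on", "no way out")

def assess_risk(message: str) -> str:
    text = message.lower()
    if any(phrase in text for phrase in HIGH_RISK):
        return "high"      # redirect to crisis hotline immediately
    if any(phrase in text for phrase in MODERATE_RISK):
        return "moderate"  # surface resources, ask a direct follow-up
    return "low"

assert assess_risk("Some days I feel hopeless") == "moderate"
# The known failure mode: indirect warning signs are missed entirely.
assert assess_risk("I've been giving away my things lately") == "low"
```

Real systems add contextual models on top of phrase lists, but the core limitation stands: a classifier can only flag patterns it was trained to recognize, which is why acute crises still call for human responders.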

Privacy and Regulation

When you tell an AI chatbot about your anxiety, relationship problems, or trauma history, that data goes somewhere. Not all AI therapy apps are subject to HIPAA, the U.S. law governing health data privacy. HIPAA only applies to covered entities like healthcare providers and their business associates. A standalone consumer app downloaded from an app store may not qualify, which means your conversations could theoretically be used for advertising, sold to data brokers, or stored with minimal security.

Apps that do follow HIPAA standards typically encrypt data both in storage and during transmission, using encryption protocols recommended by the National Institute of Standards and Technology. The current minimum standard is AES 128-bit encryption, though 256-bit is recommended. Before using any AI therapy tool, it’s worth checking whether the company signs business associate agreements, publishes a clear privacy policy, and specifies how your conversation data is stored and whether it’s used to train future AI models.

On the regulatory side, the FDA has authorized over 1,200 AI-enabled medical devices across healthcare, but none has been approved for mental health use. Fewer than twenty digital mental health devices of any kind (including non-AI tools) have received FDA authorization. This means the AI therapy apps currently on the market are largely unregulated as medical devices. The FDA has acknowledged the rapid growth of “AI therapists” and is actively evaluating how to handle them, but for now, no chatbot-based therapy tool carries an FDA stamp of approval.

Cost and Access

Most AI therapy apps use tiered subscription pricing. Basic plans typically run $15 to $30 per month and include a limited number of daily interactions. Premium tiers cost $40 to $80 per month and offer unlimited conversations along with features like mood tracking, progress reports, and personalized exercises. Some apps offer free versions with restricted functionality, and a few enterprise plans exist for healthcare organizations deploying the technology across patient populations.

For comparison, the average cost of a single session with a human therapist in the U.S. ranges from $100 to $250 without insurance. Even with insurance, copays often run $20 to $50 per session. AI therapy doesn’t replace the depth of those sessions, but for someone on a tight budget, uninsured, or living in an area with few mental health providers, it offers a level of support that simply wasn’t available before. The average wait time for a new therapy appointment in many parts of the country stretches weeks or months. An AI app is available the moment you download it.
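The gap is easy to quantify with back-of-the-envelope arithmetic using the ranges above. The figures are this article's estimates, not quotes from any specific provider, and assume weekly sessions.

```python
# Monthly cost comparison using the article's estimated ranges.
# Assumes 4 sessions/month for human therapy; figures are illustrative.

sessions_per_month = 4

human_low  = 100 * sessions_per_month  # $100/session, uninsured
human_high = 250 * sessions_per_month  # $250/session, uninsured
copay_low  = 20 * sessions_per_month   # insured, $20 copay
copay_high = 50 * sessions_per_month   # insured, $50 copay

print(f"Human therapy, uninsured: ${human_low}-${human_high}/month")
print(f"Human therapy, insured:   ${copay_low}-${copay_high}/month")
print("AI app, basic tier:       $15-$30/month")
print("AI app, premium tier:     $40-$80/month")
```

Even the premium AI tier comes in at or below a single uninsured human session, which is the whole access argument in one number.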