AI Therapy
Balanced information for patients, providers, and the public from a real-life human psychologist.
PDF – updated 08/25
Overview & Guidance
Principles for making decisions with AI + recommendations for safely and effectively engaging with AI for personal support.
Future of AI Mental Health
My perspective on where this is all going + implications for chatbot users, mental health professionals, and policy makers.
For Schools & Universities
Suggestions for student messaging about AI mental health + applying a harm reduction model to emerging concerns about student AI use.
FAQ
What is AI therapy?
AI therapy is the use of artificial intelligence tools, apps, and websites to receive mental health support and advice. These tools range from free options like ChatGPT and Claude to paid apps and services that add an interface designed to feel more like a traditional therapist.
AI “therapy” is not equivalent to human psychotherapy, and it cannot diagnose or treat any mental health condition. It is better described as a kind of “pseudotherapy” that resembles psychotherapy by providing supportive statements, practical health advice, and helpful insights in the right situations, but there are clear differences.
My argument is that AI chatbots are the next logical extension of the self-help and peer support traditions, not of professional mental health services. For as long as humans have existed, we have turned to sources of knowledge and wisdom about our health and wellbeing, and have sought out one another for personal support. Current AI mental health tools package the self-help and peer support traditions together in one highly engaging blend.
Is AI therapy as effective as traditional therapy?
Research has shown that AI therapy can be helpful for some non-clinical uses, particularly for people who have mild concerns, who do not have access to traditional therapy, and who have difficulty being honest with a human therapist.
AI “therapy” is difficult to research and to compare fairly with human therapy because the two are distinct services with different goals, methods, and structures. This means no true apples-to-apples comparison is possible outside of highly controlled laboratory settings, which have limited real-world applicability.
Additionally, no AI therapy tool has been approved to provide professional-level services by any government, regulatory agency, or healthcare system in the world to date.
What are the pros and cons of AI therapy?
The pros of AI therapy are that 1) it is cost-effective when using mainstream tools like ChatGPT and Claude, 2) it is available anywhere with an internet connection and with no wait to use it, and 3) it can be used at any time of day with no time limits.
The cons of AI therapy are that 1) it isn’t actually therapy and there are clear limits to what it can help with, 2) data security and privacy policies are continuously changing and are much weaker than current record-keeping standards in real healthcare systems, and 3) there is no professional credibility, oversight, or accountability.
One more point: the term “AI therapist” is misleading and could steer people away from services that may genuinely help them. I use it here only so people can find this information.
Is my private health information safe with an AI therapist?
The privacy of your chat logs with an AI mental health chatbot is one of the top issues in this area. Each company offering AI tools is continuously updating its global data security and privacy policies, and those policies are not at all focused on healthcare uses.
In contrast, the United States (HIPAA), European Union (GDPR), UK (Data Protection Act 2018), Australia (Privacy Act 1988), South Korea (PIPA), Japan (APPI), and many others have extensive healthcare privacy and security laws that real psychologists, psychiatrists, and other mental health professionals must follow to maintain their licenses.
General-purpose AI tools are not covered by these healthcare laws, so there is a greater risk of your data being used, shared, or sold with AI mental health tools than in formal healthcare settings.
What is the best AI therapy app? What is the best free AI therapist?
If you’re going to experiment with the technology for personal support, I recommend staying with a single mainstream LLM, particularly ChatGPT from OpenAI or Claude from Anthropic, both of which have free versions. My current pick is Claude because of Anthropic’s privacy policy, its focus on user safety, and Claude’s more human tone.
I suggest avoiding the other AI bots or apps that are marketed for personal support, that pretend to be a “therapist”, or that have a visual avatar, voice, or designed personality. Offerings from character.ai and Replika are examples of more questionable products: much less is known about them, and the risks of attention manipulation, fostering user dependence, and faking authority are common concerns.
Who should use AI therapy and who shouldn’t?
No AI bots are approved by regulatory agencies for use as part of mental health treatment. However, many people are using them experimentally despite that.
People who may benefit from them 1) have mild mental health concerns or stresses, 2) do not have access to a traditional therapist or services, and 3) feel comfortable with the current level of privacy.
The technology could be more harmful for people who 1) have serious, complex, or safety-related mental health issues, 2) have a history of attachment or dependency challenges, or 3) tend to take advice from various sources without deep personal consideration. If any of these apply to you, I recommend either avoiding this technology altogether or using it only as an adjunct to professional services.
Will AI replace human therapists?
My view is that AI will replace most therapists, but not because AI companions and chatbots will be better than human therapists. Instead, I think AI has the potential to become a prediction engine for people, providing real-time insights and advice that anticipate problems in advance and help each person avoid the consequences of bad decisions.
If this can occur with any accuracy, then therapists will be needed much less because AI will help people prevent mental health problems from developing at all.
That said, I do think human therapists will still exist in 2100 at least, though we will mostly serve people in more serious and complex situations, as the profession was originally intended to do.
CONNECT
willmeekphd@gmail.com