Could AI automation replace human therapists?

Is this the future of psychotherapy? Freepix photo.

Psychotherapy has always been a deeply human endeavor: a patient talking, a therapist listening and responding, and healing happening through words. But with the rapid rise of conversational artificial intelligence, particularly large language models (LLMs), that paradigm is shifting fast.

A team of University of Utah researchers is tackling this change, but not by asking, “Will robots replace therapists?” Rather, they explore more practical questions: What are we automating and how much?

“The history of new technology like this is almost always about collaboration, and it’s about how it supports the human expert in doing the work they can do,” said Zac Imel, a professor of educational psychology and lead author of a new study titled “A Framework for Automation in Psychotherapy.” “It might be useful to think about frameworks for understanding the different types of work that could be done through automation, and that’s what this paper is.”

The study is the result of a cross-campus collaboration among researchers from Utah’s College of Engineering, School of Medicine and College of Education.

Simply put, automation is when machines perform tasks that humans previously performed. In therapy, that could range from a chatbot delivering prewritten coping tips to AI systems that take notes, organize them, analyze therapy sessions, provide feedback to clinicians or even talk directly to patients.

Varying degrees of automation

Co-author Vivek Srikumar uses self-driving cars as an analogy for the varying levels of automation.

“The automobile industry has been introducing driver assistance systems in our cars for many years now, and the end is self-driving cars,” said Srikumar, an associate professor at the Kahlert School of Computing. “This paper can be seen from that perspective. The extreme form of AI in psychotherapy is an AI therapist, but different levels of automation may be associated with varying levels of risk. You might have different capabilities or assistance that is provided to therapists, to clients, to organizations by AI.”

Imel and Srikumar are long-time collaborators who teamed up with Brent Kious, an associate professor of psychiatry, to craft the automation framework, which was posted in advance of publication by Current Directions in Psychological Science.

The team outlined four categories, representing different levels of automation along a continuum.

  • Category A: Scripted systems. Humans prewrite content, which is then provided to patients by chatbots that follow decision trees.
  • Category B: AI evaluates therapists. The AI reviews therapy sessions and gives feedback or ratings.
  • Category C: AI assists therapists. The AI suggests interventions, prompts, or phrasing, but a human therapist delivers care.
  • Category D: AI provides therapy directly. An autonomous agent generates responses and interacts with patients, possibly with supervision.
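The least risky end of that continuum, a Category A scripted system, can be made concrete with a short sketch. Everything the user sees is prewritten by humans; the program only walks a fixed decision tree. The node names and messages below are hypothetical illustrations, not content from the study.

```python
# Category A sketch: a scripted chatbot following a human-authored
# decision tree. No text is generated by a model; the program only
# selects among prewritten messages.

DECISION_TREE = {
    "start": {
        "message": "Are you feeling more anxious or more low in mood today?",
        "options": {"anxious": "anxious", "low": "low_mood"},
    },
    "anxious": {
        "message": "Try a slow breathing exercise: inhale 4s, hold 4s, exhale 6s.",
        "options": {},
    },
    "low_mood": {
        "message": "Consider scheduling one small pleasant activity today.",
        "options": {},
    },
}

def respond(node, choice=None):
    """Return the prewritten message for the current node, following the
    user's choice if one is given; unknown choices stay at the same node."""
    if choice is not None:
        node = DECISION_TREE[node]["options"].get(choice, node)
    return DECISION_TREE[node]["message"]
```

Because every branch and message is authored in advance, a system like this is easy to audit, which is one reason its risk profile differs so sharply from the autonomous Category D end of the continuum.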

The team evaluated each category for its potential utility and risk levels, which vary widely. A scripted chatbot, an AI coaching tool for therapists, and a fully autonomous AI therapist are fundamentally different technologies with different risks. However, it’s often unclear to users, or even to health systems, which technology they are using.

Weighing risks and benefits

“By cataloging the various levels of automation, the same question takes on different flavors at various levels: questions about risk; questions about consent, who gets to consent and how much consent; the impact of potential mistakes; and the questions about who and how much responsibility is borne by various parties,” Srikumar said. “All of these things, the questions remain the same, but the impact of these questions changes.”

The team is particularly interested in how clinicians are evaluated and mentored to enhance the level of care provided to patients.

“We are currently partnering with SafeUT, Utah’s statewide text-based crisis line, to develop tools that help evaluate crisis counselors’ sessions so that they can get feedback to maintain key skills and even develop new ones as we learn more about crisis counseling,” Kious said.

Evaluation and training are where large language models can support therapists without replacing them, Imel said. Current methods are no match for the scale of need in mental health care.

Automating without replacing human therapists

“To evaluate a psychotherapy session is tremendously labor-intensive. It’s slow, it’s unreliable, it rarely gets used,” Imel said. “You’re not recording your sessions and then mailing them off to an expert who can listen to them and evaluate them and give you feedback and then send it back to you so you can learn from it.” Here, appropriately trained LLMs can capture core components of treatment and return that information to therapists quickly, often in real time.
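That evaluation loop, Category B in the framework above, can be illustrated with a deliberately simple sketch: score a transcript and hand a summary back to the therapist. A real system would use a trained language model; the keyword heuristic and skill markers below are only hypothetical stand-ins.

```python
# Toy Category B sketch: automated feedback on a session transcript.
# A real tool would use a trained model; this crude keyword heuristic
# merely illustrates the shape of the input and output.

def feedback(turns):
    """turns: list of (speaker, text) pairs. Counts crude markers of two
    therapist skills: open questions and reflective statements."""
    open_questions = reflections = therapist_turns = 0
    for speaker, text in turns:
        if speaker != "therapist":
            continue
        therapist_turns += 1
        t = text.lower()
        if t.startswith(("what", "how", "tell me")):
            open_questions += 1
        if t.startswith(("it sounds like", "you feel", "so you")):
            reflections += 1
    return {
        "therapist_turns": therapist_turns,
        "open_questions": open_questions,
        "reflections": reflections,
    }
```

The point of the sketch is the workflow, not the scoring rule: the transcript goes in, a structured summary comes back to the clinician, and no machine-generated text ever reaches the patient.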

The researchers note that anyone can now turn to ChatGPT for counseling that might resemble psychotherapy. LLMs are designed to be engaging and sound empathetic, and are trained on vast datasets, but they don’t necessarily use evidence-based psychotherapy techniques. Accordingly, they carry significant risks, as they are known to fabricate information, encode biases and respond unpredictably.

“Why would one want to deploy the riskiest version of a tool when there are so many lighter versions of it that we can already deploy that are going to make life easier?” Srikumar said. “A note-taking application, for example, something that maintains notes across a session. These are already going to improve the quality of life for clinicians, the quality of service.”

The team also envisions a future role for AI in crisis hotlines.

“It’s a really challenging environment where you don’t know anything about the people you’re talking to. They’re calling in; you may only have five or six talk turns to connect with them. You have a very confined space to try and help this person and get them safe and reduce risk,” Srikumar said. “What I do foresee is that AI will heavily augment those future crisis counseling systems because the scale is too big to be satisfied without automation.”

The University of Utah for Newswise. The Gayly online. 4/8/26 @ 1:45 p.m. CST.