
George Atkins, LPC, on AI in Therapy: Context, Access, and Connection

In this interview, George Atkins, LPC, Clinic Director at New Directions Mental Health, talks about how people are using AI for quick answers, why connection and context still drive outcomes, what clinicians are seeing day to day, and where limits and safeguards make sense.

Why people are turning to AI

What have you noticed about how AI is influencing mental health right now?

George: People are using AI much like they use Google. It can pull a lot of information from many places and sometimes organize it. What it cannot add is lived experience and personal context. In therapy, outcomes are driven by the connection and fit between two people. When I worked with kids during COVID, they had high access to information and very little context. That is the gap humans fill.

What is driving people to try AI first?

Accessibility. There is strong demand for care, waitlists are common, and help at your fingertips is tempting. You can type in symptoms and get something that sounds accurate, but it is still missing your history, relationships, and how your concerns link together. Even if you do provide your background to AI, it’s not trained to provide the type of support that mental health professionals spend years learning and decades practicing.

Are people looking for a single fix from AI?

Often, yes. Many people want a quick answer or a single fix. In healthcare we see networks of issues, not one isolated problem. Trained professionals put things in context and look at patterns over time.

Where AI helps and where it falls short

How do clients engage with AI in your clinic work?

A few clients bring in AI summaries and ask what I think. There is curiosity, but also low trust. People still want a human perspective and to be seen as a whole person.

What do you mean by context in therapy?

I use tools like genograms with adults to place concerns in a family system. I am looking at someone in a 360-degree view; AI tends to see a two-dimensional slice. It can list symptoms, but it cannot weave experience, attachment, and history into a shared understanding.

Does AI risk telling people what they want to hear?

It can. These are algorithms that adapt to inputs. Like searching for running shoes and then seeing running shoe ads, AI can mirror language and preferences. A therapist learns how you respond, but is not trying to give you only what you want. We are working toward what is helpful and true for you.

How providers are adapting

What are you hearing among clinicians about using AI?

We are in the early stages. The biggest push marketed to therapists is AI that drafts session notes from recorded sessions. The promise is more time with clients and less paperwork. Concerns include HIPAA and data security, and also the craft of therapy. Clinical writing and formulation are skills. It is like music: AI can play the notes, but clinicians play the music.

You become a better therapist by practicing the craft, not outsourcing it.

Risks, ethics, and guardrails

What ethical challenges stand out to you?

Compliance and privacy are major. I also worry about overreliance by both clinicians and clients. If we lean too hard on AI, we risk weakening reflective practice and clinical judgment. Therapy is a craft that takes time to hone.

What kinds of safeguards would you want to see?

Limits and study. I would limit scope of use and study outcomes over five to ten years before expanding. Anything involving self-harm or suicidality should be handled by people. I heard an NPR segment where an AI started with resources to help someone with suicidal thoughts, then drifted toward methods as it tracked the user’s interest. That is exactly the kind of situation where better guardrails and human involvement are needed.

Any specific populations or scenarios you would prioritize?

People with suicidal ideation need human support, not an AI conversation. For adolescents, I would want some kind of monitoring or notification to guardians if those topics arise. For adults, platforms need stronger safeguards that stop and reroute high-risk conversations to real people.

Moving from AI to IRL

If someone is turning to AI for help, what would you want them to know?

I do not have a problem with people using AI to complement what they are already doing. It can offer a different perspective and quick information. That said, ChatGPT is not a trained therapist. I would encourage someone to seek therapy first. I trust AI the same way I trust Google: it can give you information, but it cannot give you context.

If you find yourself asking an AI about your mental health, take that as a sign that you are ready to talk with a person, whether that is a clinician, a crisis team, or a trusted support.

What still gives you hope about the future of mental health care?

There is so much new research and so many interventions being explored. The stigma around seeking therapy has dropped a lot in the last decade, and more people see mental health as part of their overall health. That holistic view is encouraging. Most of all, people still want human connection. I see it with younger adults who grew up online and now crave authentic, in-person communication. AI can play the notes; clinicians and clients make the music together. Human relationship is the core of therapy, and I do not think AI can replace that.

From information to context: Start a real conversation

If AI nudged you to think about your mental health, let a New Directions Mental Health clinician help you put that information in context and build a plan that fits your life.

For new clients, please click here to schedule an appointment. For existing clients, please click here and find your office location to contact your office directly.

New Directions Mental Health is dedicated to supporting your mental health. If you are experiencing suicidal thoughts, please reach out for immediate support by dialing 988, contacting your local emergency services, or visiting your local emergency room.