AI Didn’t Write That… Or Did It?
Jul 09, 2025
The fine line between automation and accountability in psychotherapy documentation
Artificial Intelligence is officially in the room—and no, I’m not talking about your client’s dissociative symptoms. I’m talking about the AI tools now generating clinical notes faster than some therapists can say “reflective listening.”
Let me be clear: I’m not against AI in behavioral health documentation.
In fact, I partner with AI developers who are building thoughtful, ethical, and incredibly helpful tools for providers. I’ve seen firsthand how AI can reduce note fatigue, improve consistency, and even help newer clinicians structure their documentation more effectively.
But from where I sit—as someone who reviews behavioral health documentation for a living—there’s a real risk we need to talk about.
🚩 The problem isn’t the tool. It’s what we do with it.
More and more, I’m auditing records where the notes feel... off.
They’re:
- Nearly identical from session to session
- Written in a tone that doesn’t match the provider’s style
- Filled with passive phrases or clinical generalizations
- Missing any meaningful reflection of what actually happened in the session
When I ask about the source, the answer is usually:
👉 “We’re using AI to help generate notes.”
Look, I get it. Notes are tedious. Providers are stretched thin. And documentation tools that promise to cut that burden in half are appealing—especially when they’re marketed as “compliant.”
But here’s the truth: no AI tool is compliant out of the box.
Compliance is about accuracy, integrity, and individualized care, and none of those can be automated.
🧾 AI in documentation needs:
- Oversight: Notes must be reviewed and edited by the licensed clinician who provided the service.
- Alignment: Templates should match the provider’s style, client population, and level of care.
- Updates: Tools need to be retrained as clinical programs evolve.
- Boundaries: Just because a system can prefill interventions doesn’t mean they were actually provided.
So where does that leave us?
💡 AI is a tool. It’s not your clinical judgment. It’s not your voice. And it’s certainly not your license.
If you’re using automation to support your practice—great.
If you’re relying on it to think for you—it’s time to rethink.
Let’s use this technology with integrity, not impulse. Your license (and your audit trail) will thank you.
Want help reviewing your documentation for AI red flags? I’ve got you.