AI Documentation in Clinical Practice Is a Competence Question
Some people see AI and psychology, or AI and clinical work, in the same sentence and assume the two have no business being that close together. There are legitimate ethical questions involving professional obligation, clinical reasoning, authorship, the client relationship, and client consent. Concerns about authorship and cognitive offloading are addressed in another post here. Technological change is not new, however, and the APA Ethics Code already includes standards relevant to the adoption of new technologies, which also apply to AI. This post focuses specifically on the competence framework because it maps directly onto the APA Ethics Code and is especially actionable for clinicians. Ethics codes rarely draw black-and-white lines about when a technology should or should not be adopted, but they also do not assume that new uses are automatically acceptable. They ask what competent use would require.
Documentation matters clinically and legally because it is not just a record of what happened. It supports continuity of care, treatment planning, progress tracking, risk management, communication, and professional accountability. In that sense, documentation helps protect both the client and the clinician. APA’s record-keeping guidance frames professional records as a structured part of psychological practice, and APA Services notes that appropriate records can help ensure continuity of care.
As important as what goes into a note is what should be left out of it. Because notes may later be requested, scrutinized, or used in legal or institutional contexts, clinicians already have to think carefully about inclusion, omission, wording, and downstream consequences. This is especially important where records could expose clients to stigma, legal risk, or other harms if handled poorly. The concern is not whether AI writes part of the note. It is whether the clinician remains able to make those decisions thoughtfully and understands that responsibility for the note's content and authorship remains theirs, regardless of what technology assisted with it. HHS guidance also distinguishes psychotherapy notes from the rest of the medical record and gives them special protections, which underscores that different kinds of mental health documentation carry different downstream implications.
Competence and Consent
New technologies require appropriate training, consultation, and expertise so that clinicians understand their risks and benefits when applying them in practice, as reflected in APA Ethics Code Standard 2.01(c). Clinicians do not need to become AI experts, but if they choose to use AI in documentation, it needs to be in a way they can understand, evaluate, and take responsibility for. Technologies such as telehealth platforms, assessment software, and electronic health records involve the same kind of professional judgment and should likewise be used within the clinician's competence and scope. APA has also published guidance emphasizing transparency, informed consent, privacy and security, accuracy, human oversight, and the need for psychologists to remain responsible for final decisions.
Client consent is an essential component of ethical technology use and is closely related to competent use. APA guidance distinguishes ambient listening from shorthand or dictation, placing them at different points on the consent spectrum. Using AI to transform clinician shorthand is generally less invasive, and how consent is communicated may depend on the provider’s workflow and setting. Ambient session recording, however, calls for explicit consent and should be addressed as part of written and ongoing informed consent, especially because workflows and tools may change over time. Competent use requires the clinician to understand which form of documentation assistance is being used and what that mode requires. APA’s AI guidance likewise emphasizes transparent disclosure of relevant AI use and informed consent regarding purpose, application, and risks.
What Remains the Clinician's Responsibility
Documentation is much more than the ability to write. It reflects clinical reasoning based on session observations, interventions, and plans for future care. It requires identifying what is clinically salient, summarizing accurately, and considering safety, risk, cultural factors, and the ongoing treatment plan. Because clinicians remain responsible for their documentation, AI can assist with the drafting process, but the clinician remains accountable for the reasoning the record reflects. That is the more important ethical question. APA's current AI guidance is consistent with this view: AI should augment, not replace, human decision-making, and psychologists remain responsible for final decisions.
This is especially relevant for trainees and students. They are not just learning to produce polished notes. They are learning how to observe during a session, reason about and summarize what happened, translate that into writing, and understand the ethical and legal responsibilities the record carries. The question is therefore not just whether AI helps with wording, but whether it supports or bypasses developing clinical judgment. APA's supervision guidelines also frame training in health service psychology through a competency-based approach aimed at developing supervisee competence while protecting clients and the public.
AI assistance in documentation is neither inherently ethical nor inherently unethical. Its ethical significance depends on how it is used, what parts of the task it supports, and whether the clinician remains able to understand, evaluate, and stand behind the final record. The core issue is not whether AI appears anywhere in the workflow. It is whether clinical judgment, informed use, and responsibility remain with the clinician. That is the standard that matters, because the clinician remains the author of the note and the one accountable for what it says.
Subscribe for future posts
If you want new writing at the intersection of AI and psychology, ethics, and implementation of AI in clinical practice, subscribe on Substack.
The views expressed here are my own and do not necessarily reflect the views of any current or future employer, training site, academic institution, or affiliated organization.