Reflections on Integrating AI into Therapeutic Practice: A Tool for Structure, Not a Substitute for Connection
by Dr Chris Tennyson, Clinical Director
The emergence of AI has understandably raised questions in many professions, and psychotherapy is no exception. Can technology ever belong in something as deeply human as therapy? My view is that, when used thoughtfully and ethically, AI has the potential to become a valuable tool to enhance rather than replace the therapeutic process.
I have begun to experiment with this by entering simulated, anonymised themes and formulations to gauge the kind of structured materials - worksheets, reflective exercises, or road maps - that AI might create. I’ve been genuinely impressed by the fluency and cohesion of the language it generates, and by how effectively it can turn abstract psychological ideas into accessible, usable resources.
My hope is that, in time and with client consent, this approach might be trialled collaboratively, giving clients the opportunity to test whether these AI-assisted resources help them reflect on or apply therapeutic ideas between sessions.
Even so, this process could never replace the relational and empathic work that sits at the heart of therapy. Rather, it might offer a way of turning insights born in the therapy room into concrete tools for growth, giving clients something cohesive, visually clear, and practically usable to reinforce their development. Used thoughtfully, AI could help save time on formatting and structure, allowing more energy to be devoted to what truly matters: thinking, feeling, and connecting.
The thinking itself, however, must always come from the therapist. AI cannot understand theory, emotion, or the subtle interplay of attachment, trauma, and neurodivergence that shapes a person’s inner world. Any prompts must be grounded in a genuine therapeutic relationship - one built on trust, safety, and curiosity. It is that relationship that generates the understanding; AI simply offers one possible medium through which to express it.
Ethics and confidentiality are, of course, non-negotiable. No identifying or confidential client material should ever be entered into AI, and its use should remain limited to assisting with structure, language, and presentation - never to generating clinical opinions or storing information.
Used in this way, AI could become part of a thoughtful, human-centred practice - one where technology supports, rather than mimics, the art and integrity of therapy.
