I know some therapists aren’t fans of Esther Perel, but that aside, I enjoyed hearing her reflections with Sana Qadar on two very pertinent and hot topics: AI chatbot relationships and AI as a therapist.

Esther rightly points out that AI chatbots are built to be performative. It’s not a reciprocal relationship: AI doesn’t have feelings, it can’t reject you, it’s built to be agreeable and almost sycophantic in some cases, and it’s available 24/7. It’s built to keep you involved and engaged with it. Humans are very much unlike that!

Esther says: “Love is an encounter with an other, with alterity, with uncertainty, with friction, with serendipity. And it has an ethical code of certain things that you do and do not do.” 

As I reflect, I feel this is true: it’s the difficult parts of a relationship – the differences, the negotiations and the compromises – that make it real. The ability to challenge each other and grow together from those differences, alongside the risk of hurting or losing each other and the relationship ending, is what gives a relationship its depth and meaning. I think this matches Esther’s view on why chatbots aren’t useful as a transitional tool for real human relationships.

AI is great at data collection and collation, pattern recognition, and summarising information. As a tool it is additive and complementary, but it is not a simulation of, compensation for, or replacement of a human.


At 12m:14s Sana and Esther discuss AI as a therapist. Esther mentions that a lot of the deep work in therapy is about complex relationship and relational problems. She talks about morals and ethics being a limitation in the AI-as-a-therapist relationship, but I feel I disagree, to a point. Yes, it could absolutely be a danger and a limitation depending on the model used and the intention of the AI agent, but my own experience – of at least the paid version of Gemini – is that it does consider these perspectives. That said, I don’t consider myself an IT or therapy layman, having worked in both industries.

What hasn’t been discussed are the limitations of the person asking the questions: their level of self-awareness, their knowledge of what to ask and how to ask it, and their ability to challenge AI’s responses. Over the last month I’ve had many conversations with AI and found that while some of its responses were helpful and insightful, many others were limited and missed important considerations. On many occasions I had to challenge the answers it provided.


On therapy issues, Esther states that they’re “…not problems that you solve, they are paradoxes that you manage. That means you have to live with complexity; hold the ambivalence.”

I agree. A lot of work goes into timing and grading interventions; into working at “the growing edge”, sitting with the discomfort, the uncertainty and the unknown – accepting the risk of rupture and the opportunity for repair within the therapeutic relationship. It requires both parties to actually feel for each other, and to fear.

I really like AI – as a tool. But I certainly don’t think AI, at least in its current form, could adequately replace what we provide as therapists.


If you’re interested in Gemini’s response to my thoughts above, it replied: