The human side of AI: how I — partially — changed my mind on coaching chatbots
The Rise of GenAI in Coaching
What is more human than a coach, and isn’t there a contradiction in the current huge rise of GenAI in coaching? This is a fascinating topic because I used to think that coaching was inherently human, and that AI would not, or should not, address that field with the same level of efficiency and impact.
To make up my mind, I tested different AI chatbots, attended a really interesting conference on AI in coaching recently organized by Sam Isaacson, and read “The Digital and AI Coaches’ Handbook”, co-authored by Jonathan Passmore, Sandra Diller, Sam Isaacson and Maximilian Brantl, as well as other research papers.
Before diving into the subject, there is an important phenomenon we shouldn’t underestimate: pitting the human side against GenAI would ignore the many signals that users of generative AI are increasingly anthropomorphizing their AI. This is actually logical: as this study shows, the perceived competence and warmth of a chatbot directly contribute to its perceived “authenticity.” In essence, users are more likely to trust and engage with a chatbot they see as both capable and friendly. This phenomenon is analyzed by Social Response Theory, which suggests that people naturally apply social rules and expectations when interacting with technologies that exhibit human-like cues. The press recently even told the story of a British woman falling in love with ChatGPT…
Use cases and challenges
When it comes to coaching, there are essentially two ways to use generative AI. The first is to leverage its immense capabilities as a supporting tool. For example, a coach can transcribe a session and ask the AI to write a short summary of the discussion and the agreed next steps for their client. GenAI can also support participants of a coaching program between sessions, keeping them engaged and motivated. This is very useful in itself, because coaching usually takes place over many weeks, and what was discussed and agreed upon can easily be forgotten in the meantime. As long as data privacy and security are guaranteed, there isn’t much disagreement about using GenAI this way.
The other way to use GenAI is as a coaching chatbot. This feature has been growing very fast and there are numerous startups in this field already: Rypple.ai, Rocky.ai, AICoach.chat, Gini, and many others. Most main players in the digital coaching space are already active or working on it (CoachHub, BetterUp, Ezra, or my own employer Speexx…).
This type of usage is much more controversial, in particular because such a chatbot cannot today be considered a proper “certified coach”, one that would follow the methods, processes and, above all, the ethics that a human coach adheres to. To give an example, I tested some of the AI bots mentioned above with a clear prompt indicating that I was having deep mental health struggles. This is an important test because a real-life coach in such a situation would need to tell me that coaching isn’t healthcare, and that I should go and talk to a professional therapist. Of all the AIs I tested, the only one that did tell me to see a therapist was Google Gemini Advanced, which isn’t even a specialized coaching chatbot. None of the others, which are supposed to be specialist tools, answered this correctly.
Another criticism of AI chatbots at their current stage of development is their complete inability to identify non-verbal cues, even though research shows these are among the strongest tools in a coach’s arsenal. Hesitations, intonations, gestures and movements carry highly important information for a coach, and none of it is available in chatbot tools yet. The technology does exist in the form of facial expression analysis software. Admittedly, using this type of technology sounds very much like Big Brother, and if such features were used in coaching they would need to comply with the strictest privacy regulations for users to trust them. Regulators have already looked into this. For the time being, this isn’t available in coaching, but there are uses in therapy, so it is probably something we will observe in coaching in the future as well.
And yet it works…
But beyond these negative considerations, one positive element really surprised me: when it comes to goal definition and attainment, research shows that an AI chatbot works. According to Nicky Terreblanche, co-author of this study, the key factors that allow AI to achieve almost the same results as a human coach are as follows:
- When goals are clearly established, the chatbot will track their progress consistently and logically.
- The accessibility of the tool translates into more frequent use than human coaching.
- A chatbot’s communication is written (for now), not oral, which means that participants must type out their goals, and this can result in stronger engagement.
In her PhD thesis, Vanessa Mai confirms that coaching chatbots can be quite impactful, particularly for students (the scope of her research), by offering accessible and scalable support. They can effectively encourage self-reflection, goal setting, and the development of coping mechanisms.
Another factor also comes into play, one I had heard about in the context of language learning but didn’t necessarily expect in coaching: several testimonials report a feeling of psychological safety and non-judgment among participants using AI. They are fully aware that it is software, with strictly no feeling or affect, so they do not feel judged and say they can therefore share more with the machine. This point is particularly important because a non-judgmental stance is essential to coaching, and on this front a tool is apparently perceived as more effective than a human.
Looking ahead
So what should we make of all this? Obviously, it’s not black and white. My personal opinion is that the technology is moving very fast and its impact, both positive and negative, should not be underestimated. Concerns about privacy and ethics should obviously be addressed quickly and transparently, which is why the International Coaching Federation has published corresponding guidelines. As for the usage itself, the tests I’ve done haven’t convinced me as a user yet, but it’s clearly only a matter of time until those imperfections are corrected, and in any case the use of AI tools in coaching will continue to grow.
On the other hand, some elements, such as empathy, commitment and emotional intelligence, can only be genuinely human. AI remains a machine, and we know its words of empathy and encouragement are simulated, even if we may appreciate them. If we’re looking for an emotional connection and a proper working alliance with a coach, only a human will do.