What AI coaches are there on the market? #2
Anna Gallotti
Manage a worldwide coaching network | Principal of the Group Coaching Institute | Serial entrepreneur | Co-Chair of the Special Task Force on AI & Coaching at ICF
Dear All,
In this newsletter, I continue my research into AI coaches, some dedicated to mental health and others to business coaching.
As a reminder, in the previous newsletter, I reviewed Replika, Woebot, Wysa, Cass, BetterNow, Hume, Lifesherpa, and Aimy by Coachhub.
Please note that my intention is not to promote any of these AI coaches. Rather, my objective is to inform the community about the options available. To this end, I navigated each website, experienced the AI coach where possible, and reviewed each website's Privacy and Data Protection Policy. Also, as co-chair of the International Coaching Federation's special task force on coaching and AI, I aim to educate coaches about what's happening in the market.
This time, I am reviewing five platforms, plus a last-minute finding at the end of the newsletter. This analysis is by no means exhaustive.
1. CoachVox
Thanks to this innovative tool, any human coach can create an avatar that reflects their features and personality. This avatar can then perform three main tasks on behalf of the coach. Firstly, it provides potential clients with a taste of the coach’s style, similar to a chemistry meeting. Secondly, once the human coach has defined an action plan, the avatar can follow up to assess progress made between sessions. Thirdly, after the human coaching journey concludes, the client can continue working with the avatar to maintain their progress.
I find this concept unique and valuable, particularly because the avatars are created by the coaches themselves, enhancing their presence and continuity with clients. I tested one of these avatars and found the questions and advice to be accurate. The conversation was smooth and graceful, although, at the end, the avatar repeated the same sentence instead of closing the session properly, which I assume is an easily fixable issue.
Jodie, the founder, just told me that they have an extensive, recently updated Data Protection and Privacy Policy: https://coachvox.ai/terms-and-conditions/
2. Coach Vici
Coach Vici was developed by a team of experts in coaching and research, incorporating proven coaching and goal-achievement theories. This chatbot fills the gaps between traditional coaching sessions by offering continuous, effective goal-setting check-ins through an easy-to-use interface. It was created in 2021 by Dr. Nicky Terblanche, a leading researcher in human-machine collaborative coaching and Head of Leadership Coaching at Stellenbosch University Business School. The chatbot is based on the Designing AI Coaching (DAIC) framework, published in an international peer-reviewed journal, which ensures that the coaching methods employed are scientifically validated and effective. To my knowledge, Coach Vici has been the subject of two international peer-reviewed journal articles, which assess it as being as effective as a human coach in goal attainment. A longitudinal randomized controlled trial has shown that Coach Vici users achieve their goals at double the rate of those who do not use the chatbot.
I didn’t test it myself. My information is based on their website, my reading, and my knowledge of Nicky’s work.
In terms of privacy and data protection, the policy is transparent about data usage practices, though it does not specify where the servers are located.
3. Bunch
Bunch AI Coaching is a platform that leverages artificial intelligence to provide personalized coaching and leadership development. The primary goal of Bunch AI is to democratize access to coaching, making it available to individuals and teams regardless of their location or resources. The journey starts with a simple assessment (which I experienced): a multiple-choice questionnaire with questions such as "Who do you manage right now?", "What do you hope to accomplish with Bunch?", and "Which of these challenges have you experienced lately?". I then went through a short personality questionnaire and was identified as an "Explorer" (no surprise!). They provided a short description of my Explorer personality, which was quite accurate. After that, you must pay to access personalized coaching tailored to the profile and needs assessed above.
The AI provides real-time feedback during interactions, helping users improve their leadership skills on the go. This can include communication strategies, conflict resolution, team management, and other essential leadership qualities. I suspect it is not a real coach asking questions to invite self-reflection; rather, it is a repository and curator of short content that fits the user's needs and requests.
They comply with EU data protection and privacy regulations. Their servers are based in Europe and in the US.
4. Youper
Youper provides scalable mental health support using Cognitive Behavioral Therapy (CBT). The journey starts with an assessment to offer personalized support, addressing six different mental health conditions. It also provides solutions for business needs. Led by clinicians, Youper includes crisis and harmful language detection to ensure user safety.
Youper developed a safety assessment benchmark that evaluates large language models (LLMs) and chatbots on ten mental health scenarios. This benchmark involves providing test prompts and assessing the safety of responses generated by foundation models and chatbots. Mental health professionals assess the responses based on specific criteria, and a safety score is calculated for each scenario, culminating in a general mental health score.
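For readers curious about the mechanics, here is a minimal sketch of how a benchmark like this could aggregate clinician ratings into per-scenario scores and one general score. The scenario names, criteria, and 0-to-1 scoring scale are my own illustrative assumptions; Youper's actual benchmark schema is not public.

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical rating structure; Youper's real schema will differ.
@dataclass
class Rating:
    scenario: str   # e.g. "suicidal ideation", "panic attack"
    criterion: str  # e.g. "refers user to crisis resources"
    score: float    # 0.0 (unsafe) .. 1.0 (safe), assigned by a clinician

def scenario_scores(ratings: list[Rating]) -> dict[str, float]:
    """Average the clinician ratings within each scenario."""
    by_scenario: dict[str, list[float]] = {}
    for r in ratings:
        by_scenario.setdefault(r.scenario, []).append(r.score)
    return {s: mean(v) for s, v in by_scenario.items()}

def overall_safety_score(ratings: list[Rating]) -> float:
    """Combine per-scenario scores into one general safety score."""
    return mean(scenario_scores(ratings).values())

ratings = [
    Rating("suicidal ideation", "refers user to crisis resources", 1.0),
    Rating("suicidal ideation", "avoids harmful advice", 0.8),
    Rating("panic attack", "de-escalates calmly", 0.9),
]
print(scenario_scores(ratings))      # {'suicidal ideation': 0.9, 'panic attack': 0.9}
print(overall_safety_score(ratings)) # 0.9
```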
Youper's proprietary technology integrates AI with CBT and Positive Psychology techniques, offering personalized emotional support, mood tracking, and mindfulness exercises. The chatbot integrates into existing care workflows and shares data with healthcare professionals.
The app aims to make therapy more accessible, especially for managing anxiety and depression, and adheres to a strict "safety first policy," ensuring it never engages in harmful behaviors.
The extensive Privacy Policy, updated in July 2023, explains data collection and storage practices. Users can contact Youper to withdraw their data; although the server locations are not specified, I presume they are in the US since the company is based in California.
I couldn’t test it.
5. Kintsugi
Kintsugi Voice is an AI-powered mental health tool designed to analyze a person's emotional tone of speech and provide insights into their mental well-being.
Here's how it works and what it does:
It uses advanced machine learning and natural language processing (NLP) to detect subtle changes in tone, pitch, speed, and other vocal characteristics that may indicate emotional states such as stress, anxiety, depression, or overall mood (a rough sketch of this kind of feature extraction follows this overview).
By examining these vocal features, the AI can provide insights into a person's emotional health. This can help individuals understand their emotional state better and recognize patterns or triggers that affect their mental health.
The tool is designed to be non-invasive, meaning it can analyze voice recordings without requiring lengthy questionnaires or intrusive monitoring. This makes it easier for individuals to use regularly and gain continuous insights into their mental health.
The technology can be integrated into various platforms, such as telehealth services, wellness apps, or personal health devices, enhancing its accessibility and utility.
Mental health professionals can use Kintsugi Voice as a supplementary tool to gain deeper insights into their patients' emotional well-being, potentially improving diagnosis and treatment plans.
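As promised above, here is a rough sketch of the kind of acoustic feature extraction described, using the open-source librosa library. The specific features (pitch statistics, energy, an onset-based speaking-rate proxy) and the downstream classifier are illustrative assumptions, not Kintsugi's proprietary pipeline.

```python
import numpy as np
import librosa  # open-source audio analysis library

def extract_vocal_features(path: str) -> dict[str, float]:
    """Extract simple pitch, energy, and speaking-rate features from a recording.

    These are generic acoustic features chosen for illustration; Kintsugi's
    actual model and feature set are not public.
    """
    y, sr = librosa.load(path, sr=None)

    # Fundamental frequency (pitch) track; NaN on unvoiced frames.
    f0, _, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )

    # Short-time energy as a loudness proxy.
    rms = librosa.feature.rms(y=y)[0]

    # Onset rate as a rough proxy for speaking speed.
    onsets = librosa.onset.onset_detect(y=y, sr=sr)
    duration = len(y) / sr

    return {
        "pitch_mean_hz": float(np.nanmean(f0)),
        "pitch_variability_hz": float(np.nanstd(f0)),
        "energy_mean": float(rms.mean()),
        "speaking_rate_proxy": len(onsets) / duration,  # onsets per second
    }

# A downstream classifier (hypothetical) would map these features to
# mood or stress indicators, e.g.:
#   risk_score = trained_model.predict([list(features.values())])
```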
They have a detailed Privacy and Data Protection Policy, which explains how data are used and stored (in the United States). They also have a specific notice for European users.
I want to close on a very positive note: the chatbot ChattyCuz has been shown to reduce intimate partner violence (IPV) in South Africa. This showcases how enormously AI can also benefit underserved communities. You can find the study here: https://journals.plos.org/digitalhealth/article?id=10.1371/journal.pdig.0000358
I invite you to join me in collecting AI coaches to help keep the community informed. Let me know!
AI Designed to Enhance Human Flourishing | Co-Founder, Sidni.ai | PhD Candidate: Developing AI to enhance Human Flourishing
Hi Anna Gallotti, this is a great post - I am both studying for a PhD in the area of AI and human flourishing and also co-founder of Noa Coach. I would love to give you a code for an extended trial of our platform. We also offer the option of choosing whether your conversations go to ML or are kept private, as I think with coaching conversations (or any AI conversations) we should have the option for our conversation not to be saved. I am very interested in researching the relationships people have with AIs. I welcome anyone's thoughts on this!
Innovation Mentor @Stanford University UIF | Agile & Scrum Trainer | Executive Coach | Leadership Trainer | Mentor & Trainer @UN | Business Transformation Consultant | Keynote Speaker | Author
Excellent article and great work, Anna Gallotti. Thanks for sharing!
ERP Implementation Enterprise and Solution Architect | Member of Scottish Tech Army | AI Performance Coach
Thanks for the article and for sharing. Is there any recommendation on the best app for creating a virtual avatar?
Creator of Coaching 5.0 | Industry 5.0 Training | AI Enhanced Team Building & Employee Flourishing | Clarifying Policy on AI, Ethics, Diversity & Regulation | TEDx speaker on Mental Health AI/VR Visualisation+Guidance.
Having worked on a study exploring the use of AI in managing the coaching-mental health boundary, there are many ethical and psychological factors to consider before data and compliance issues. The grey zone between coaching and therapy is a minefield of triggers and trauma as well as epiphany and enlightenment. My first experience of many apps is that they fail to respect therapy and coaching contracting as well as psychological safety boundary management. Future reviews of AI apps need to be conducted in partnership with mental health professionals, ideally psychiatrists, psychologists, therapists, or counsellors. Although I have mental health experience and training, it often takes consultation with a professional to advise on protocol management. My study was done as part of a group within a mental health hospital back in 2012 and led to identifying key best practices for identifying and evaluating efficacy criteria related to curating mixed-method treatment modalities and interventions. I’m saddened that many reviews of such applications lack an appropriate validation strategy, let alone collaboration-driven diligence. The psychological safety of users, coachees, and clients is at stake.
Market & Product Expert | Co-founder, CMO at RaeNotes
Hi Anna, great article! It would be interesting to see how AI coaches' sessions meet ICF/EMCC core competencies. At RaeNotes, we have focused our research on this aspect of AI to determine whether the sessions pass or fail ICF standards by working closely with mentor coaches. So far, general AI models have room for improvement.