
Coaches play an essential role in creating safe and unbiased digital coaching tools

Digital transformation and personalization in the coaching sector risk exposing client data and harming trust. Coaches are essential in addressing these challenges by creating unbiased digital coaching tools that support diversity, transparency, and accountability.

Challenge:

AI bias and safety issues can undermine trust in coaching relationships

Opportunity:

Transparency and diversity in AI coaching tools can guard against bias

Impact:

Prioritizing ethical AI use enhances inclusivity and accessibility for coaching clients


Identifying types of bias and misinformation when using AI helps coaches set up ethical safeguards in their practices

AI has taken the coaching software field by storm, introducing time-saving hybrid models that create greater efficiency for coaches and a more user-friendly interface for clients. However, these technologies rely on continual access to client data, which helps AI adapt to coach and client needs and supports greater personalization. Coaching leaders, including platforms, accrediting bodies, and coaching researchers, are working to evaluate how bias, safety issues, and misinformation can be addressed as coaching and AI are integrated. Wisdom Weaver Jeff Hancock, founding director of the Stanford Social Media Lab and BetterUp Science Board member, suggests that the core problem with introducing AI into the coaching ecosystem is deception, or a lack of authenticity. He explains, “When clients do not know whether their interaction is with a coach, with some machine, or some hybrid of machine with coach, this can cause a drop in trust.” Once that trust has been violated, it is difficult to regain, especially in coaching spaces where people make themselves vulnerable by revealing personal or professional challenges as they try to grow.

A breach of trust in a coaching space can have lasting consequences. If AI is introduced in a way that delivers problematic guidance or instruction, that violation of trust can cause real harm. Jeff continues, “People cannot always tell whether they are receiving sound advice or misinformation from machines and might worry: Should I believe my coach when they say to do this, or is that misinformation because this other website says I should do the opposite? That’ll lead to a lot of uncertainty and confusion.” Jeff believes coaches must remain trusted partners with clients and help develop the ethical principles that guide the integration of machines. This ensures clients feel confident that they will be informed about when and how AI is involved and that they have a choice in the process.


Coaches using coaching software must be aware of client safety issues and work to preserve client data privacy

Beyond trust, data privacy and accuracy emerge as issues for consideration. Wisdom Weaver Gloria Origgi, a philosopher who studies trust, explains that there are many risks regarding what data is used in a machine learning task and who controls that data. Many data sets are openly accessible on the internet, but others are private and unavailable to the public; keeping client data out of public reach is a matter of data privacy. However, private data sets and proprietary algorithms present their own risks. Gloria explains that biases can shape both how data is collected and how algorithms later interpret it: “The fact that data sets are proprietary means they are not open access to the public, so people or organizations can gather data in a biased way. When we have racist and sexist biases in the data, the algorithm will reproduce these biases in its output.”
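To make the mechanism Gloria describes concrete, here is a minimal, hypothetical sketch in plain Python. The group names and records below are invented for illustration and do not come from any real coaching platform or data set. A “model” that simply learns how often each group was recommended in skewed historical records will echo that skew in every score it produces:

from collections import defaultdict

# Hypothetical historical records: (group, was_recommended_for_leadership).
# The way this data was gathered over-represents positive outcomes for group_a.
training_data = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

# "Training": record the observed recommendation rate for each group.
counts = defaultdict(lambda: [0, 0])  # group -> [positives, total]
for group, recommended in training_data:
    counts[group][0] += int(recommended)
    counts[group][1] += 1

# "Prediction": the model echoes the historical rate, so the skew in the
# collected data (0.75 vs. 0.25) drives every future recommendation score,
# even though no real difference between the groups was ever measured.
for group, (positives, total) in counts.items():
    print(f"{group}: predicted recommendation score = {positives / total:.2f}")

Real systems are far more complex than this toy, but the failure mode is the same: if bias enters at collection time, no amount of downstream sophistication removes it on its own.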

Coaching has long been predicated on confidentiality and safety practices. Coaching technology platforms collect details about people to tailor and personalize recommendations for growth, so developers must implement preventative measures to avoid breaches of trust and data security. Because clients provide coaches with personal information used to tailor coaching interventions, clients must feel confident that this data remains private. Wisdom Weaver Dr. Jacinta Jiménez, psychologist and BetterUp VP of Coaching Innovation, explains, “Coaches must consider how clients feel about disclosing personal information across technology platforms and safeguard the information they do provide. By educating themselves on the range of biases, privacy issues, and the possibility of misinformation, coaches can help safeguard their practices to ensure client safety.” Coaches should also be involved in developing these AI coaching tools to offset the potential for bias and safeguard client trust.


Increasing diversity and digital literacy in the development of AI software solutions can support unbiased and personalized development

As AI tools are increasingly used in the coaching field to respond to clients with diverse backgrounds and needs, coaches will need to guard against AI bias by designing and training AI with diverse teams, data sets, and coaching methodologies. Wisdom Weaver Jeff Hancock points out that, because of historic and systemic realities, diverse communities are underrepresented in leadership and executive positions. He observes, “Diversity in leadership positions has only become the recent focus of coaching. If we expand these roles out to diverse communities and assume they function the same as those demographics traditionally in charge, this will be problematic. A lot of these technologies are geared toward non-diverse populations.” To provide personalized support for clients of all backgrounds, Jacinta agrees that coaching must maintain a dynamic scope of practice across cultures. She explains, “And not just how to coach across cultures, but also how different cultures relate to and feel about technology.” When coaches account for the language and varied lived experiences of diverse populations, trust within those communities can be built.

Digital literacy is key to guarding against bias in emerging technologies. Jacinta elaborates, “It’s about enabling our coaches, giving them agency to approach clients in a digital world with heightened awareness, safe practices, and cross-cultural understanding. Digital literacy goes beyond knowing how to operate technology. It’s knowing how to digest information and sort it, as much as possible, into fact and fiction.” Coaches who engage clients through digital platforms and support development with AI tools are adapting their skills to the digital landscape and to how it represents them and their clients in digital form. Jacinta explains that this entire process requires discernment and active engagement from both coaches and clients.


Coaches can support client autonomy through algorithmic transparency and accountability

Coaches can allay client fears over the use of AI coaching models by employing transparency and accountability in their practices. Wisdom Weaver Mara Gerstein, a technology and marketing expert, believes “the real danger is clients not knowing when they are interacting with AI versus a human coach. When AI has created content — and people don’t realize that — the potential for disinformation and propaganda is dangerous.” Mara notes that the United States currently has no regulation in this area: “Europe is at least two years ahead of the US when it comes to regulation, so hopefully, there’s greater collaboration worldwide in creating standards that ensure algorithmic transparency.”

Wisdom Weaver Nicky Terblanche, senior lecturer and coaching expert, warns that the ethical challenges could take a darker turn: “We all know about human biases. Similarly, technology use can have certain discriminatory elements such as bias in AI and racial or gender profiling in a subtle way.” Nicky explains that the creators of these AI coaches could have a hidden agenda and embed subtle messages that push people to conform in ways that ignore employee autonomy. For example, an organization could deny an employee development opportunities because the employee’s interests do not align with the company’s objectives. For Gloria, another worst-case scenario concerns control: “We could see one or two private companies take control over the whole AI market. For example, you can imagine a political campaign that is influenced by the production of bots or deep fakes oriented by the political ideas of the owner of the company.” Ultimately, it all comes down to algorithmic transparency about exactly what these machines do, which is becoming increasingly difficult because even the creators of AI software like ChatGPT cannot always anticipate how the software will behave.

The good news is that coaches have opportunities now to shape the coaching software they decide to use and empower themselves with the knowledge they need to ensure their clients are protected.


A Call to Action for Coaches

More work remains to be done to understand the ethical implications of AI. Fortunately, leaders in coaching research, coaching technology, and accrediting organizations are taking these implications seriously and leading the charge to explore AI’s benefits and possible drawbacks. The ICF Coaching Platform Coalition has brought together stakeholders throughout the coaching industry to create ethical standards and best practices for safe, quality coaching in digital spaces. As part of this effort, the coalition has committed to supporting digital literacy in the coaching industry to create unbiased, science-backed, and inclusive online coaching experiences.

By experimenting and familiarizing themselves with AI technologies, coaches can lead other industries in the use of effective, safe, and human-centered AI. Increased literacy with digital coaching tools can help coaches understand the limitations and potential benefits of tech-supported coaching for client engagement. When used thoughtfully, these tools have the potential to expand coaching services, highlight impact, and support greater personalization in coaching development. With so much potential for bias in AI, engaged coaches have the opportunity to shape how coaching programs counteract biases and foster greater inclusion. Coaches can join professional conversations about digital ethics to ensure client protections and necessary guidelines are in place. They can also take part in the creation of coaching apps to ensure ethical protocols are followed and enacted. Following the example of the EMCC, which recently published guidance on the ethical use of AI for coaches, coaches can create their own protocols for ethical AI use. By understanding AI and taking informed action, coaches can put these powerful technologies to work for the benefit of their clients and practices. We invite you to consider how coaches can ensure the safe, effective, and ethical adoption of coaching technologies as the industry continues to evolve.

