Why coaches should stop caring about data privacy
Image created using Leonardo

The Association for Coaching's Technology and Coaching conference started today. The opening panel began with a poll that included the question:

If you had to argue against the adoption of more technology in coaching, what would you say?

There were several different sorts of responses, each of which I'd love to dive into at some point, ranging from the psychological side-effects of interacting with technology to inherent bias in datasets and a whole lot more. One of the most common answers was along the lines of:

Using technology increases the need for data privacy

At a surface level, that's undeniably true. There's been plenty of talk about ChatGPT over the last 12 months, and each time we interact with it we're sending that conversation data over the internet to OpenAI for processing. That data - particularly if it includes person-identifiable information - should be protected. And it doesn't stop there, because OpenAI will want to hold onto that data for as long as possible in order to continuously train its products.

I received an email the other day from a tool whose marketing list I'd signed up to, boldly declaring how proud they were to have achieved a great milestone. As a sign of how much they value their users, since their last email they had obtained compliance with GDPR.

GDPR is the EU's data privacy legislation. In order for a company to collect and process personal data within the EU (or several other countries, including the UK) it must be compliant with GDPR. That's not a badge of honour showing how much they value their users, it's just the law.

The email I received is like a taxi driver taking a phone call halfway through a journey and proudly telling their passenger that they've just found out they've passed their driving test. It's good that they have, but I expect more from someone whose profession is driving.

Data privacy compliance is important

There are many data privacy laws around the world and they take different forms, but at their root they're the same: nobody wants their private information to be known more widely than is necessary. I understand that a bookseller needs my address to ship my package to me, but I'd be upset if that appeared alongside the 5-star review I then go on to post.

The broad principles data protection regulations follow are good ones. It's comforting to know that a piece of technology is going to be following them. But they're also quite accepting that organisational processes can be complex, so a large number of people may end up with access to personal data.

That bookseller's processes may be relatively straightforward - log the order, find the book in the warehouse, package it up, hand it to a courier and post it through my letterbox - but how many people will see my address as part of that process?

As coaches we should hold ourselves to a higher standard

When I first start working with a new client it always feels important to talk about how I'm not going to be talking to anyone else about what we discuss. Their line manager will never find out, my wife will never find out, the conversation is confidential.

Except, not really.

In the golden olden days where meetings happened in person, that word confidential was quite transparent. Coach and client would enter a room and nobody else would be able to hear what we talked about. The only people with any record at all of any of the details of that conversation would be the two of us. If neither of us took notes, the only evidence we'd met at all might be in a calendar invite somewhere.

Technology removes that possibility entirely. At one level that's quite basic: every participant in a video call needs to send their data to every other participant, so at any given moment that data is somewhere out in the world and could be tapped into. It would probably be encrypted, so it's unlikely a teenager experimenting with their new Raspberry Pi will stumble across my coaching conversation, but that's not where the biggest risk lies.
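To see why encrypted traffic is useless to an eavesdropper without the key, here's a minimal sketch in Python using a toy one-time-pad cipher. This is purely illustrative - real video platforms use protocols like TLS or SRTP, not this - but the principle it shows is the same: intercepted bytes are meaningless unless you hold the key the endpoints share.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the matching key byte.
    Applying it twice with the same key recovers the original."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"confidential coaching conversation"
key = secrets.token_bytes(len(message))  # one random key byte per message byte

ciphertext = xor_cipher(message, key)    # what a wiretapper actually captures
recovered = xor_cipher(ciphertext, key)  # only a key holder can reverse it
```

The ciphertext reveals nothing about the conversation; the weak point, as the article goes on to argue, is not the wire but the endpoints - whoever legitimately holds the keys and the recordings.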

It's quite possible that a record exists of every video call I've ever taken part in. I've not seen it, but some employee of my video call provider probably has. Probably not every employee, and probably not specifically looking at me, but the data's there. And as technology has moved forward, with calls being recorded and now transcribed using AI, those records now exist out there somewhere.

Where do they exist? Am I right to feel a bit concerned that some coaches, even those using an AI transcription service, might not be able to answer that question? If that's being used in a coaching conversation, that suddenly doesn't feel so confidential. The idea (hypothetically) that every employee of an organisation might have access to listen to coaching conversations feels quite some distance from the truly private room of the pre-Covid era.

Next steps

I'd like to make two suggestions for how we should be approaching technology with privacy in mind.

Firstly, let's hold our technology to a higher standard than GDPR. Compliance with the law is important. But confidentiality and data privacy are two different things. We should meet bold claims that products are compliant with GDPR with a high degree of professional scepticism, only choosing to use products that offer the highest levels of security and protection over all data.

Secondly, let's treat all of our technology equally. Of course it's important that AI coach providers protect personal data so we're right to be concerned. But we should be equally concerned with all technology we use. We should be encouraging clients who communicate via SMS to move to RCS. We should be ensuring we send encrypted emails. We should make sure we switch off all AI transcriptions and voice assistants, and probably switch to video providers that don't include them.

Privacy is a valid concern and I'm truly glad the coaching profession is aware of it. Let's make sure we use that concern to select the right technologies for our clients that protect their outcomes and improve the coaching profession as technology continues to develop at pace.

Kate Franklin

C-Suite Coach & Advisor | Speaker | Expert in People, Relationships, Systems & Culture | Founder at Nkuzi Change | 25 years’ experience in Leadership Development

1y

thank you Sam for helping us all make sense of this complex and daunting arena

Dawn Springett

Catalysing Fearless Change | Transforming Chaos to Clarity | Integrating Global Perspectives | Grounded in Real-World Experience | Making Change Manageable Again

1y

You've sparked an important conversation, Sam. There is a similar discussion in the therapy and healthcare space, where the privacy and security of patient data is paramount. I find the Health Insurance Portability and Accountability Act (HIPAA) standard a good benchmark to keep in mind when selecting the digital tools that support our coaching practice (AI, scheduling, file-keeping, note-taking etc.). Curious to hear your take on HIPAA.

Björn Nissen

Certified Team coach - Leadership development - Org design - Connecting the dots ... System thinking & awareness // Diagnose & Shift __>

1y

Thanks Sam for keeping this on the agenda. By saying, "only choosing to use products that offer the highest levels of security and protection over all data" Which (most common) products would you say are ok as of today?

As long as coaching is a non-regulated industry, Life Coaches will continue to scam the public.
