Artificial intelligence could hardwire sexism into our future. Unless we stop it

In five years’ time, we might travel to the office in driverless cars, let our fridges order groceries for us and have robots in the classroom. Yet, according to the World Economic Forum’s Global Gender Gap Report 2017, it will take another 100 years before women and men achieve equality in health, education, economics and politics.

What’s more, the outlook for economic parity is getting worse: the same report estimates it will take a staggering 217 years to close the gender gap in the workplace.

How can it be that the world is making great leaps forward in so many areas, especially technology, yet it's falling backwards when it comes to gender equality?

The picture across industries

Let's start with the facts. The chart below shows how the numbers for men and women vary across sectors – and from talent pipeline to leadership.

There are a few things that strike me as I look at this chart.

First, I started my career in one of the industries at the bottom of the chart. I was part of the 26% in energy and mining. I chose a career in utilities, and enjoyed it hugely until I reached a ceiling at the ripe old age of 26. The Operations Director I worked for gave me some brilliant advice. He said my next job should be his, but as this opportunity wasn’t likely to happen any time soon, I should branch out into consulting and gain international experience.

Along the way, I met many talented, ambitious women who didn't get promoted into leadership positions. We need to challenge why that is happening, and why it is getting worse.

Second, as I think about the future that's coming, the low number of women in software and IT services and in finance does not bode well. We are on the cusp of the Transformative Age, which will fundamentally change how we live and work. Technology – and the financing for innovative ventures – will play key roles in how that future is shaped.

In an age crying out for powerful and innovative solutions – which evidence shows are best generated by diverse teams – we need to take bigger strides forward on gender parity and diversity in all its forms.

Unconscious bias – could AI hardwire it into our future?

Unconscious bias has proven to be a big barrier to diversity in the workplace, particularly for women trying to break through the glass ceiling into leadership positions. For real change to take hold, it has to come from the top.

There are many enlightened CEOs who are actively tackling unconscious bias and pursuing diversity, but more needs to be done. I've written before about minimizing unconscious bias with blind hiring, something EY has done with great success. These practices need to become more widespread before it's too late.

And I say too late because, with the rise of artificial intelligence (AI) and machine learning, there is a real risk that we ‘bake’ today's prevalent biases into the future.

AI and machine learning are fuelled by huge volumes of existing data. When image databases associate women with domestic chores and men with sports, studies have shown that image-recognition software not only replicates those biases but amplifies them.
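
To make that amplification concrete, here is a toy sketch (my own illustration with made-up numbers, not taken from the studies above) of how a model trained on skewed image labels ends up more skewed than the data it learned from: a classifier that simply maximizes accuracy on a two-thirds/one-third split learns to predict the majority gender every single time.

```python
# Toy illustration (made-up numbers): a classifier trained on skewed labels
# can end up more skewed than its training data.
from collections import Counter

# Hypothetical training labels: (activity shown in the image, person's gender).
# Two-thirds of the "cooking" images show a woman; two-thirds of the
# "sports" images show a man.
train = [("cooking", "woman")] * 200 + [("cooking", "man")] * 100 \
      + [("sports", "man")] * 200 + [("sports", "woman")] * 100

# "Training": for each activity, remember the gender it most often co-occurs
# with. This is what a model that only cares about accuracy converges to.
majority = {}
for activity in {a for a, _ in train}:
    genders = Counter(g for a, g in train if a == activity)
    majority[activity] = genders.most_common(1)[0][0]

# "Prediction" on a perfectly balanced test set of cooking images.
test = [("cooking", "woman")] * 100 + [("cooking", "man")] * 100
preds = [majority[activity] for activity, _ in test]

data_share = 200 / 300                            # women in cooking training images
pred_share = preds.count("woman") / len(preds)    # women in the predictions
print(f"women in the training data: {data_share:.0%}")   # 67%
print(f"women in the predictions:   {pred_share:.0%}")   # 100%, the bias is amplified
```

The training data was skewed; the model's behaviour is more skewed still.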

It's a lesson that was learned with voice-recognition software. Fifteen years ago, when the underlying technology for car systems was trained without diverse data sets, women struggled to get the systems to work. One woman who called customer service was told to give up and get her husband to set it up. Thankfully these systems have improved dramatically in recent years, but even now gender bias remains a problem.

Bias is also a risk highlighted in a recent blog, When machine learning goes wrong, by EY's Global Innovation Artificial Intelligence Leader, Nigel Duffy, who writes: "Training data is often collected in some biased way ... We use the data we have available, but is the data right or just cheap?"
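
A practical first step, then, is to audit the data before anything is trained on it. The sketch below uses hypothetical records and field names of my own – it illustrates the kind of question Duffy raises rather than quoting code from his blog – and simply asks: how well is each group represented, and how are outcomes distributed across groups?

```python
# A minimal data audit (hypothetical records and field names): before training,
# check how each group is represented and how outcomes are distributed.
from collections import Counter, defaultdict

records = [
    {"gender": "woman", "hired": True},
    {"gender": "woman", "hired": False},
    {"gender": "woman", "hired": False},
    {"gender": "man",   "hired": True},
    {"gender": "man",   "hired": True},
    {"gender": "man",   "hired": False},
    # ... in practice, the full historical dataset you intend to train on
]

representation = Counter(r["gender"] for r in records)   # how many records per group
positive_outcomes = defaultdict(int)                     # how many "hired" per group
for r in records:
    positive_outcomes[r["gender"]] += r["hired"]

for group, n in representation.items():
    print(f"{group}: {n} records ({n / len(records):.0%} of the data), "
          f"hired rate {positive_outcomes[group] / n:.0%}")
```

If the answers to those questions look nothing like the world we want the system to operate in, the data is cheap rather than right.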

Tech as a positive force

Disruptive technology is changing the nature of how we live and work. I believe this can be a positive force – and used properly, technology could help to close the gender gap. But only if we avoid hardwiring in our current biases and limitations.

So how do we balance the realities of our current world with our aspirations for more equality and fairness? Unlike humans, algorithms can't consciously counteract learned biases. And as AI permeates more aspects of our lives, the stakes will get higher.
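
So the counteraction has to be engineered in deliberately. One simple and widely used technique, shown here as an illustrative sketch with made-up numbers rather than a prescription, is to reweight training examples so that an under-represented group carries as much total weight as an over-represented one.

```python
# Illustrative sketch (made-up numbers): reweight training rows so that each
# group contributes the same total weight to the learner, however unevenly the
# groups are represented in the collected data.
from collections import Counter

# Hypothetical training rows: (group, features, label).
rows = [("woman", ..., 1)] * 50 + [("man", ..., 1)] * 250

counts = Counter(group for group, _, _ in rows)           # {'woman': 50, 'man': 250}
n_groups = len(counts)

# Each row's weight: total rows / (number of groups * rows in its group).
weights = [len(rows) / (n_groups * counts[group]) for group, _, _ in rows]

# Rows from the under-represented group count for more. These weights would
# typically be handed to the learner, e.g. via the `sample_weight` argument
# that many scikit-learn estimators accept in fit().
print({g: round(len(rows) / (n_groups * c), 2) for g, c in counts.items()})
# {'woman': 3.0, 'man': 0.6}
```

The point is not this particular technique; it is that fairness has to be an explicit design decision, because the algorithm will not make it for us.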

In the same way that we are learning to tackle unconscious bias in the way we hire and promote people, we need to make sure we don't allow bias to permeate the tech that will build our shared future. Only then can we create a better working world for all of us.


This article originally appeared on the World Economic Forum Agenda: https://www.weforum.org/agenda/2017/12/sexist-bias-hardwired-by-artificial-intelligence

Leonard Prezecki

Global Product Planning Specialist - Icons

AI will rely on statistics and data. That's why the Terminator is such a scary movie.

Hassan Mokhtari Golpayegani

Chairman at Noavaran Hezareh Danesh Co , Cofounder at Parstouch Interactive Solutions

The most important business in the world is "upbringing", which technology-oriented minds do not care about. When the value of such an important role goes unnoticed, efforts will be concentrated on engaging women in every job in the name of equality, while the children of today, the owners of the future, will be starved of love and culture more and more.

Sebastian Olter

Senior Software Engineer

Machines do not judge what they see, but in the future we could hardwire fear and belief into them, and they would counteract these "learned biases" (empirical, fact-based biases) the same way humans do: with doctrinal biases. There could even be a religion for machines, with a god couple switching seats every month, and they would make judgements based on it. But there are some caveats. Introducing human weaknesses will make machines as mentally weak as humans are, with stubbornness, prejudice and hostility, lack of reliability, or mental illnesses, to name a few. These are serious problems that could lead to unnecessary deaths, because some machines are equipped with potentially lethal technology. Therefore, the implementation of fear/belief would itself be based on the fear and belief of the programmer, because there are no logical reasons to do so. How long will we resist before we do it?
