.. ethics, & unbridled power

As 2018 draws to a close and 2019 unfolds, the ethics of data access and use appears to be gathering momentum. Data ethics refers to systematising, and agreeing on, concepts of 'right and wrong' in relation to the use of data, in particular personal data. The backdrop, of course, is power and data capitalism (making more money); and presumably the push for ethics is resisted for the same reasons.

What does it really mean when I "allow" access to the microphone or address book on my phone? Should I care who knows where I shop, whom I visit, what I purchased and when? And does it matter how much that information is used, reused and sometimes overused, not by one but by many organisations?

What needs to be our common wish for 2019 and beyond? Well, it's that we recognise and use information for the better, and for the common good. So, new technology like quantum computing, along with advances in artificial intelligence, machine learning and improved, more secure databases and communications aside, it is the governance of the information and data itself that needs attention. I have put some thoughts together, and conclude with what I am increasingly convinced should be the basic principles of what we can and should strive to achieve, specifically with regard to the use of pervasive information.

At the start of this year (2018), Mark Zuckerberg of Facebook, in light of the Cambridge Analytica scandal, said that you “own all of the content and information you post on Facebook” and “can control how it is shared.” The editors of the Financial Times, in a recent article, posited that “the answer lies in giving consumers ownership of their own personal data.” And others, like Tim Cook of Apple in a recent speech, suggested that “... companies should recognise that data belongs to users.” The notion of owning "my information" gives me a fleeting sense of power, but it is possibly a misguided sense of control. Control over "my information" is simply fallacious if the data we produce is used to shape society as well as individuals.

Most people value their own privacy. The gradual, incremental erosion of privacy, however, is often not noticed and appears to do little harm to anyone. It is like the environment: trace amounts of pollution, carbon dioxide and other chemicals are scarcely detectable and do no harm to speak of in the short term, yet in aggregate, greenhouse gases cause fundamental damage to the world we know as home. In the same way, a shift in the nature of privacy causes fundamental and tectonic damage to the very fabric of our societies.

To understand this damage, it is necessary to take a new perspective and view it from the outside in. This paradigm must capture the ways in which ambient data changes our relationships with one another as family, as friends, as coworkers, as consumers, and as citizens. It relates directly to the foundational understanding that people have data rights and that governments must safeguard those rights. Most try to resolve this by thinking in terms of ownership; and from 'ownership', concepts of 'consent' and trade are derived.

“Consent”, some say, is at the crux of what needs to be respected with regard to the use of data. Can a health insurance provider deny coverage, or charge more for it, just because someone has a preexisting condition? Does someone who has diabetes have the right not to be discriminated against because of it? Governments do not prescribe whether organisations can or cannot hold that data on patients. What prevents health insurers from using data about preexisting conditions if individual consumers lack the ability to consent, or not? Recently, Australia passed its Assistance and Access Bill 2018. This Bill is significant in its ability to allow government to issue companies or individuals with a “technical capability notice”. These notices compel companies to modify the software and services they provide to allow access to information that could not otherwise be obtained, with large financial penalties for companies that do not comply. So, theoretically, my WhatsApp messages, previously private, are now accessible to government. Yet others argue that it is about ownership.

Data ownership, as a notion, is flawed, often with pernicious consequences.

Ownership of data as a concept is inadequate at best. No one can really own data. But say you could 'own' your information. The multiple uses of that information are still not resolved by controlling who has access to it. Software programs predict the likelihood that a person will commit future crimes. Imagine that such an algorithm says you have a 95% chance of committing another crime because people demographically similar to you are often criminals. Your demographic profile is not something you "own". Even if you do not consent to “your” data being used, your employer can still use data about other people to make statistical extrapolations that affect you. All ownership has done is amplify the divide between those who have access and control and those who do not, and over how data is used.

Data and information are not like any other physical construct.

Information, by its very nature, can be moulded, created and consumed depending on which pieces are joined, linked and combined with which others, conferring power as it is distributed across societies, places and things. And I would go as far as to suggest that the flawed concept of “data ownership” is a pointless way of thinking about data, because existing problems multiply rather than simplify, and new problems are introduced instead of being resolved.

Arguments for data ownership assume that if you regulate personal data well, good societal outcomes follow. But aggregated data is different in character from the individual bits and bytes that make it up, and that does not mean we should never share our data. Every time I turn on a personal device such as my phone, I am sharing real-time, anonymised information. This information is useful, as it translates into rather precise traffic conditions (e.g., it will take me 32 minutes to drive from my house to work this morning if I leave now). And that is for the collective good.

My data on its own is not very useful to, say, a retailer for marketing. But in conjunction with similar data from other people, my data can be used to create algorithms that stereotype me (e.g., “loves blue cheese” or “always takes this route to work”). Sometimes the algorithm is just wrong, because it was trained on erroneous data sets, or because I am simply different and do not conform to the groupings built from the data, i.e. I can be an outlier.

So what needs to be done?

We will not change the nature of data and information. But what we can and should control is the recognition and protection of us as individuals and the basics of just being human. It is the principle that all people should be safe against unreasonable surveillance. It is the basic concept that nobody should have his or her behaviour manipulated, surreptitiously or otherwise. It is the fundamental ethic that no one should be discriminated against on the basis of information or data. It is these core notions that we should be debating, to understand, get right and put into place as we think of 2019 and beyond.


More articles by Dr Ian Tho