Are we ethically challenged when it comes to data?

With recent revelations about how institutions have been using Facebook users' data, sharp focus has fallen on the improper use of customer and social data. It's a challenging topic: this data increasingly yields insights, products, and enablers whose benefits are hoped to significantly outweigh the value of the individual's privacy. But at what cost is this freedom and open access provided, and how do we determine the boundaries of proper and improper use?

I was reminded of an article we wrote a couple of years ago, which I'll repost below. It touched on the risks companies face around the ethics and privacy of employee data as they increasingly seek to derive insights and drive efficiencies. It challenged companies to ask whether they 'should', not merely whether they 'could', use big data to better understand and predict employee behaviors. Enjoy!

----- original article below, published August 2016 -----

The word “privacy” seems to be everywhere these days. Most companies or institutions have a Privacy Policy, which few of us ever read and most of us agree to at the click of a mouse. Yet how many of us remember that privacy is one of our most fundamental human rights? How many of us realize that we are willingly divesting ourselves of that right, often several times a day?

The cost of Big Data

Big Data analytics are being used to map genetics, improve social security services, and make better human resourcing decisions; but the phenomenal value of personal data makes it tempting to disregard the right to privacy of data subjects, not least in the workplace. It is only a matter of time until greater public understanding of the reach of Big Data analytics causes a sizeable social backlash with potentially serious financial consequences.

What are the implications for employees?

The workplace is a particular area of concern. Under the "employee records exemption" in Australia (Section 7B of the Privacy Act), businesses are not bound by the usual privacy protections when it comes to employee data gathered in the context of an employment relationship.

In other words, your employer is able to draw datasets together that, when analyzed, provide insights into your likely emotional state or your intention to start a family. The danger is that the digital behavior on which such conclusions could be founded comes from monitoring and surveillance that many employees are not aware of.

Employers could of course counter that an employee has agreed to this through their employment contract, including the policies in place within the organization. But here is the crux of the issue: are you fully aware of what you have actually agreed to? And even if you did understand this in detail, did you feel you had any choice in accepting the terms set out in the contract? If the answer to either of these questions is "no", how can this be construed as informed consent?

The possibilities presented by recent advances in data analytics are not lost on most users, yet among those who believe they have nothing to hide there remains a blind trust in the benevolence of technology. Take, for instance, the recent introduction by the Australian government of some of the strictest data retention laws in the world. Or reports that health insurers will offer discounts to policy holders who agree to share their health tracker data. The general consensus seems to be that if you have nothing to hide, then sharing your data is not a problem.

Growing awareness and understanding of the power of data analytics may cause a conscious modification of people’s digital behavior, or self-censorship, which ultimately equates to a restriction on the right to free expression.

The immense business value of data means that the urge to monetize or otherwise use collected data will far outweigh the perceived risks of infringing on privacy, at least initially. To date, existing laws have been unevenly enforced: with only one instance of a company being fined, in 2013, there has been no significant litigation in Australia so far.

What can companies do?

Given all this, the time would seem right for companies to lead in defining better practices for data use. Rather than joining the gold rush to reap potentially massive rewards while risking a social backlash that would compromise the beneficial use of big data, companies would do well to take the lead in fostering trust among data providers.

There is clear evidence that individuals are willing to give up a measure of privacy in return for transparency about how their data is used and trust that it is kept safe. By undertaking to self-regulate, laying down a definition of the ethical use of data, and providing clear and intelligible information about such policies, companies can safeguard the rewards of data analytics without risking the loss of their social license to operate. Surely this is something every employer can see the sense in.

Olivia Ryan

Founder, She thought She Could...So She Did | Creator of People Programs | Facilitator | Coach | Artist

6y

Good article and very relevant to the work we are doing in the HS reporting space. I wasn't aware of the Section 7B clause in the Privacy Act; thanks for pointing this out.

Paula McLuskie

Partner at EY, Financial Services

6y

Well said - social risk and how to manage it is becoming more and more important for all businesses #betterquestions

