How Distorting Privacy Helps Big Tech Consolidate Power
The MIT Press
By Elizabeth M. Renieris
Technology companies are using so-called privacy-preserving or privacy-enhancing technologies (PETs) to follow the letter of data-focused laws while violating their spirit. For example, as many data protection laws hinge on the identifiability of individuals, industry is embracing PETs for anonymization, deidentification, pseudonymization, and obfuscation. But academic researchers have repeatedly demonstrated that deidentification, anonymization, and aggregation do not reliably prevent reidentification, and even where they do, privacy and security risks remain, particularly for certain groups and communities. The fitness-tracking app Strava, for instance, made headlines for revealing the locations and activities of US military personnel around clandestine bases in Syria when it published anonymized heat maps of popular running routes. These risks are not solved by differential privacy or other anonymization techniques, which achieve only a narrow, mathematical view of privacy.
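To make that last point concrete, consider what differential privacy actually promises. Below is a minimal sketch of the Laplace mechanism, the textbook construction behind differentially private counts; the data, names, and parameters are hypothetical, and this illustrates the general technique, not any company’s implementation.

```swift
import Foundation

// Illustrative only: an epsilon-differentially private count.

/// Laplace(0, scale) noise, sampled as the difference of two
/// exponential draws (a standard sampling identity).
func laplaceNoise(scale: Double) -> Double {
    func exponentialDraw() -> Double {
        // -scale * ln(U), with U uniform in (0, 1)
        -scale * log(Double.random(in: Double.leastNonzeroMagnitude..<1.0))
    }
    return exponentialDraw() - exponentialDraw()
}

/// Release a count with epsilon-differential privacy. A count has
/// sensitivity 1 (adding or removing one person changes it by at
/// most 1), so Laplace noise with scale 1/epsilon suffices.
func dpCount<T>(_ records: [T], matching predicate: (T) -> Bool, epsilon: Double) -> Double {
    Double(records.filter(predicate).count) + laplaceNoise(scale: 1.0 / epsilon)
}

// Hypothetical query: how many runs passed a given road segment?
let runs = ["base_road", "park", "base_road", "base_road", "riverside"]
print(dpCount(runs, matching: { $0 == "base_road" }, epsilon: 0.5))
```

The guarantee this buys is real but narrow: it bounds how much any single record can shift the published number. It says nothing about what the aggregate itself reveals, which is precisely how an “anonymized” heat map of many runners can still trace the perimeter of a base.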
In fact, PETs are allowing dominant technology companies to reimagine privacy in their own self-interested image. For example, at Facebook’s 2019 developer conference, Mark Zuckerberg said, “Over time . . . a private social platform will be even more important to our lives than our digital town squares.” As with his “the future is private” declaration, Zuckerberg was painting a vision of privacy that is focused on private communications infrastructure owned and controlled by Facebook. Similarly, Apple’s “What happens on your iPhone, stays on your iPhone” campaign, criticized by privacy advocates for misrepresenting the company’s record on privacy, reveals Apple’s increasing tendency to conflate privacy and the confidentiality of data stored on its devices. Even if data remains locally stored and processed on an individual’s device, it can still be used to manipulate their behavior and engineer certain outcomes in ways that intrude on the personal boundaries and autonomy that privacy traditionally seeks to protect. And Google’s shift from third-party tools to greater dependence on its first-party ecosystem suggests a similar view of privacy as confidentiality or secrecy vis-à-vis third parties rather than from Google itself.
Perhaps even more significantly, these moves signal a broader shift toward the enclosure of digital spaces through ever-expanding, privately owned walled gardens or digital fiefdoms governed by opaque automated tools. For instance, rather than ending tracking, Google’s new “privacy” tools would merely enable it through more opaque methods within Google’s own ecosystem. In fact, all the proposals in Google’s Privacy Sandbox would actually channel more and more activity into Google’s own first-party ecosystem, routing everything through Google’s APIs and automated tools, and increasing third-party reliance on Google. Viewed through this lens, it becomes apparent how Google’s planned phaseout of third-party cookies is no skin off its back when it can be the first and only party shaping user behavior through the Chrome browser. Indeed, the title of Google’s original announcement captures it best: “Building a More Private Web.” And Google is not alone.
Apple’s new “privacy” tools and features similarly deepen its dominance through the company’s control over infrastructure, such as mobile hardware and software-based operating systems. In line with its general push toward localized machine learning, Apple introduced the new iCloud Private Relay tool, which is designed to stop websites from building a profile of you by preventing them from matching your website requests with your actual IP address, and the App Tracking Transparency feature, which requires apps to request permission before tracking users. Both tools actually deepen Apple’s vertical integration into digital advertising markets by routing app developers and advertisers through Apple’s ecosystem, without limiting Apple’s access to advertising data. Similarly, Apple’s introduction of a digital identity feature, which allows users to store driver’s licenses and other identity credentials in the Apple Wallet, in much the same way as Apple Pay, will only make Apple users more dependent on and tethered to their iPhones as the device becomes a one-stop shop for payments, identification, communications, and more. In fact, these privacy moves have triggered anticompetitive accusations against Apple from competitors and scrutiny from regulators around the world.
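To see what App Tracking Transparency actually governs, here is a minimal sketch of the consent call an iOS app must make before tracking; ATTrackingManager and the AppTrackingTransparency and AdSupport frameworks are Apple’s real APIs, while the surrounding function and logging are illustrative.

```swift
import AppTrackingTransparency
import AdSupport

/// Illustrative wrapper: under App Tracking Transparency (iOS 14.5+),
/// an app must obtain the user's permission before reading the
/// cross-app advertising identifier (IDFA). Call once the app is active.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // User consented; the IDFA is readable.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking allowed, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // No consent; the IDFA reads as all zeros.
            print("Tracking not allowed")
        @unknown default:
            print("Unhandled authorization status")
        }
    }
}
```

Note the scope of the prompt: it gates third parties’ access to a cross-app identifier. Nothing in the mechanism limits what Apple’s own ecosystem collects, which is the asymmetry described above.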
While there are good reasons to embrace PETs that are proven to work, there are also good reasons to be suspicious when these tools are embraced by some of the most powerful companies on the market—companies that are implementing and embedding infrastructures to surveil as well as control the behavior of individuals, groups, and entire communities. As these examples illustrate, big tech’s rhetorical shift and strategic adoption of PETs and other measures taken in the name of “privacy” are actually helping it preserve and consolidate power in the face of a governance crackdown narrowly focused on data. Such measures allow these companies to preach the gospels of privacy and security without changing the fundamentals of their business models, altering their core activities, or ceding any power or control, all the while putting individuals and communities at risk.
Big tech’s embrace of PETs demonstrates how easily impoverished notions of privacy centered on data can be weaponized by industry to serve an agenda of domination, control, and extraction—in other words, to perpetuate the status quo. As the popularity of PETs shows, data protection or data privacy are not the same as protection or privacy for people, despite frequent conflation of the two. Distracted by these tools, we risk losing sight of the original aims of privacy—to maintain zones or spheres around the inner or private life of the individual; protect the individual’s physical person, home, and family life; create boundaries that are foundational to the exercise and enjoyment of other fundamental rights and freedoms; protect individuals from discrimination and harassment; and ultimately, defend the individual liberty and autonomy necessary for a fully functioning democracy.
---
Elizabeth M. Renieris is a Senior Research Associate at the Institute for Ethics in AI at Oxford University. A lawyer by training, her academic research focuses on cross-border data governance and the ethical implications of emerging technologies.
Her latest book, Beyond Data: Reclaiming Human Rights at the Dawn of the Metaverse, explores why laws focused on data cannot effectively protect people—and how an approach centered on human rights offers the best hope for preserving human dignity and autonomy in a cyberphysical world.