Privacy Rights, Big Data Analytics and Tort Law: The Problems with Trinidad and Tobago's Data Protection Regime

Introduction

Big Data is a fundamental touchstone of the new and increasingly data-driven global economy.[1] Big Data refers to the exponential growth in the availability and automated use of information; it is concerned with the gigantic digital datasets that corporations and governments amass to identify general trends and correlations and to predict human behavioural patterns. But there is an inherent risk in so many players in the digital economy trading in and harvesting personal information at scale: it threatens an individual’s fundamental rights to privacy and to the protection of their personal data. Threats to a person’s privacy threaten the integrity of the person.[2] Privacy is the right to control the disclosure of information about oneself, ultimately reflecting how an individual would like to be presented to society; the right to privacy is therefore inextricably bound up with issues of personality and with protecting individual integrity and reputation[3] through autonomy.[4] But the extent to which an individual is able to exercise this right in the digital age hinges upon the level of protection afforded to their personal data.

As the animating force behind artificial intelligence,[5] data is used to develop sophisticated manipulation technologies that generate comprehensive behavioural profiles from diverse datasets, to predictively police and to deceive the individual into thinking that the choices they make are their own;[6] in reality, those choices are often guided by algorithms. Big nudging and other forms of persuasive computing create a data-driven domain within which technology giants like Facebook and Google rule as ‘emergent transnational sovereigns’ and subtly govern the masses through psychographic, predictive profiling, to the detriment of democracy and outside the remit of public scrutiny and control.[7]

Thus, as data develops, constitutional rights arguably diminish; and with the growing datafication of society, it is personal data that corporations like Facebook, Snapchat and Google are vying for. They offer their products mainly as vehicles to harvest massive quantities of personal data; the user is actually the product being sold.[8] There is ultimately a desperate need to develop data privacy legislation, and while Trinidad and Tobago (TT) tries to establish its data-protection regime through its Data Protection Act, it is the thesis of this research paper that the Act is patently ineffective and fraught with functional deficiencies stemming from the root of its dysfunctionality – a definition of data that is too narrow.

By reason of that fact, the Act fails to touch and concern the critical security risks inherent in Big Data analytics, especially the risks to the individual’s fundamental right to privacy. This exposes the country to large-scale data breaches with the potential to harm its data subjects. It is therefore necessary to seek an alternative channel of remedial relief where a person’s rights to data protection and privacy have been contravened; this research paper is predicated upon the proposition that there is value in seeking alternative remedial relief through the common law tort of negligence, since the crux of a data-protection regime ultimately boils down to the relationship between data-controllers and data-processors and their data-subjects, to whom a duty of care is inexorably and unavoidably owed.

Background: The Facebook-Cambridge Analytica Data Scandal

In 2018, it was revealed that Cambridge Analytica, a political consulting firm, had combined the misappropriation of digital assets, data mining, data brokerage and data analysis with strategic communication to influence elections in multiple countries (including Australia, India, Malta, Mexico and, especially, Trinidad and Tobago).[9] The company acquired and used personal data on Facebook users obtained from an external researcher who had told Facebook he was collecting it for academic purposes.[10] Some 270,000 Facebook users installed an app called ‘This Is Your Digital Life’,[11] and by giving this third-party app permission to acquire their data, they also gave it access to information on their friends’ networks, resulting in the collection of data on over 80 million Facebook users who, for the most part, did not expressly consent to Cambridge Analytica accessing and collecting their data. This raises pressing issues about the use of personal data collected without knowledge or consent to build models of a user’s personality for microtargeting (serving ads, messages and pages based on the issues the data suggests they care about).[12]

In Trinidad and Tobago, the quantitative data was used to create psychometric profiles that taught the company how best to manipulate behaviour and incite psycho-culturally charged political warfare.[13] Cambridge Analytica used Trinidad and Tobago as a data-mining test site in 2013, under the then Persad-Bissessar administration, to discourage political participation[14] in the local elections through a reactive ‘Do So!’ movement that manipulated first-time Afro-Caribbean voters into a resistance movement against politics and voting, all the while knowing that Indo-Caribbean youths were less likely to disobey what their parents told them to do. It repurposed the electronic data science amassed from Trinbagonian citizens in 2009-2010 to create psychographic profiles through data fusion and analytics, to micro-target and influence what should otherwise be deliberate and private decisions,[15] and this led to a 40% shift in the 18-35-year-old turnout, which ultimately swung the elections by 6%. That an external company could be hired to target people through massive social media campaigns for the benefit of its highest bidders is a critical breach of an individual’s rights to privacy and data protection, and there must be some regulation to guard against it.

Yet the investigation into the Cambridge Analytica scandal in Trinidad and Tobago was closed in 2020 as ‘a waste of police time’, the allegations being ‘comments made on a political platform, without evidence.’ But even if it had continued, could TT’s data privacy laws actually guard against this type of threat? It is the thesis of this research paper that they could not, because of inherent deficiencies in the Data Protection Act that must now be further explored.

Methodology

This research paper is centred on bibliographic research into the tension between personal data protection and Big Data analytics. The Literature Review first focuses comparatively on Japanese, Chinese and European Union data privacy law to discern what an adequate data-protection regime requires, and then demonstrates how the TT Data Protection Act falls short. Next, the Analysis section lays out the fundamental problem with the TT Data Protection Act and what that means for the scope and efficacy of the Act taken as a whole. Thereafter, the law of torts is considered as an alternative avenue for remedial relief.

Literature Review

The advent of data-driven economies inaugurates new information and communication technologies (ICTs) that better allow for dataveillance: companies and governments using data-capture devices, through online platforms and the Internet of Things, to undertake the surveillance, interception and collection of a person’s personal data. The Big Data phenomenon absorbs this explosion of proliferating data and revolutionises the methods of analysing it to draw predictive conclusions and patterns of behaviour that companies and governments can exploit for their own gain.[16] That more and more personal data is being collected, processed and shared, and often sold, without an individual’s free, explicit and informed consent violates the human right to digital privacy, and it is necessary to discern what the law does to bar against such severe rights abuses.[17]

The United Nations General Assembly (UNGA) Resolution 68/167 of 2013[18] re-affirmed an individual’s right to privacy in the digital age and, in so doing, extended article 17 of the International Covenant on Civil and Political Rights (ICCPR),[19] when it called for states to ‘respect and protect the individual’s right to privacy within the context of digital communications by developing and implementing adequate legislation that protects against the unlawful and arbitrary collection, processing, retention or use of personal data by governments, business enterprises and private organisations.’[20] But Trinidad and Tobago, through its Data Protection Act (TT DPA), has arguably failed to develop and implement such adequate legislation.

It is necessary first to detail the more comprehensive data privacy laws of the EU, Japan and China in order to best illustrate what the TT DPA lacks. Article 5 of the EU’s General Data Protection Regulation (GDPR) establishes the fundamental principles that arguably lie at the heart of any data-protection regime: lawfulness, fairness and transparency, purpose limitation, data minimisation, accuracy, storage limitation, integrity and confidentiality, and accountability.[21] Article 4 broadly defines personal data to mean any information relating to an identified or identifiable natural person. This allows any ‘physical, physiological, genetic, mental, economic, cultural or social factors specific to the identity of a natural person as well as any geo-location data or online identifiers’[22] to constitute information personal to that individual falling within the scope of protection. Furthermore, it defines processing to mean any set of operations performed on sets of personal data, whether by, inter alia, ‘automated collection, recording, organisation, structuring, storage, adaptation or alteration.’[23] Such a broad definition allows the GDPR to touch and concern the large-scale data-mining and processing activities of Big Data analytics, and this allows the Regulation to adequately protect the individual’s right to data privacy. At the same time, Chapter 3 of the GDPR grants the data-subject the rights to access, to rectify errors within, to restrict the processing of, and to the erasure of their personal data, and this arguably allows the individual a much greater degree of autonomy over how their private data is used than the TT DPA does.

Similarly, China’s 2020 Draft Personal Data Protection Law extends the lawful bases upon which personal data processing is permitted beyond the data-subject’s consent to include processing considered necessary for protecting the public interest (performance of a contract to which the data-subject is a party, fulfilling legal obligations, protecting the life, health or property of an individual under emergency circumstances, and so on) – the critical takeaway being that any data processed must rest on some lawful basis. Moreover, it imposes tighter restrictions on cross-border transfers of personal data: transfers from China to a foreign country are not allowed unless they pass security assessments organised by the cybersecurity administrative authorities, or unless a data-protection contract is in place to ensure that the processing activities of the recipients of the personal data comply with Chinese regulations.[24] The draft law also has extraterritorial effect: it applies to foreign companies processing the personal data of individuals physically located within China if the company supplies goods or services to individuals located within China, or has processed data to analyse or assess the behaviour of individuals in China.[25]

Furthermore, the amended Japanese Act on the Protection of Personal Information of 2017 is consistent with the previous jurisdictions’ data-protection standards and goes a step further than the EU in clarifying that individual identification codes – codes, characters, letters, numbers or symbols into which a bodily feature of a specific individual has been converted for use by computers – fall within the scope of personal data protection, covering not only biometrics generally but DNA sequence data, facial recognition data, iris patterns, voiceprints and gait patterns.[26] It goes a step further still in balancing the need to safeguard data privacy rights against the need to facilitate Big Data, by allowing the use of enormous personal datasets as Anonymously Processed Information (API).[27] The critical point of API is not merely that large chunks of personal data may now be used once anonymised; it is that provision is made for privacy by design – from the outset, Japan’s Act enhances data protection by integrating anonymisation into data-processing procedures so that the person becomes increasingly unidentifiable, reducing the risk that their data will be harvested and mined by Big Data analytics to their detriment.
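
To make privacy by design concrete, the sketch below shows one way anonymisation can be folded into a processing pipeline before any analysis takes place: direct identifiers are dropped, quasi-identifiers are generalised, and the output is checked for k-anonymity. This is a minimal illustration under stated assumptions, not the procedure the Japanese Act prescribes; the record fields and the k threshold are invented for the example.

```python
# A minimal sketch (not the statutory procedure) of anonymisation built
# into a processing pipeline, in the spirit of Japan's API provisions.
# Field names and the k-anonymity threshold are illustrative assumptions.
from collections import Counter

DIRECT_IDENTIFIERS = {"name", "national_id", "email"}  # removed outright

def generalise(record: dict) -> dict:
    """Drop direct identifiers and coarsen quasi-identifiers."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    decade = (record["age"] // 10) * 10
    out["age"] = f"{decade}-{decade + 9}"           # 34 -> "30-39"
    out["postcode"] = record["postcode"][:2] + "*"  # keep only a coarse area
    return out

def is_k_anonymous(records, quasi, k=3):
    """True if every quasi-identifier combination appears at least k times."""
    counts = Counter(tuple(r[q] for q in quasi) for r in records)
    return all(c >= k for c in counts.values())

raw = [
    {"name": "A", "national_id": "101", "email": "a@x.tt", "age": 34, "postcode": "AR001", "purchase": "phone"},
    {"name": "B", "national_id": "102", "email": "b@x.tt", "age": 37, "postcode": "AR009", "purchase": "laptop"},
    {"name": "C", "national_id": "103", "email": "c@x.tt", "age": 31, "postcode": "AR004", "purchase": "phone"},
]

anonymised = [generalise(r) for r in raw]
print(is_k_anonymous(anonymised, quasi=("age", "postcode")))  # True
```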

Trinidad and Tobago’s Data Protection Act is materially underdeveloped in comparison. Where the EU, Japanese and Chinese jurisdictions adopt as broad a definition of personal data as materially possible, to capture all the forms, fashions and forums in which information can be accrued in this digital era, the Trinbagonian DPA is significantly narrower. The Act defines data only in the physical sense (as, inter alia, any document, correspondence, memorandum, book, plan, map, drawing, pictorial or graphic work),[28] so it fails to take account of the increasingly digital data amassed and mined by internet service providers and social media ‘apps’. It also defines personal information too restrictively: compared with the other extra-hemispheric jurisdictions, it focuses only on a limited set of sensitive personal information that must be recorded, rather than simply relating to an identified or identifiable person. There is too much emphasis on public-record-type information, as opposed to an all-encompassing definition capable of touching all the different ways in which data could be considered personal.

Moreover, though it could be said that the EU, Japanese and Chinese jurisdictions fail to extend their definitions of data to touch and concern information posted on online social media accounts, the definitions of processing that they adopt at least capture the phenomenon of data-mining, and this provides a second layer of personal data protection – but the TT DPA does not define processing at all. By failing to define processing in any definitive way, the Act effectively leaves unregulated most, if not all, of the Big Data analytics that have the potential to abuse privacy rights.

Beyond that, the TT DPA is also materially underdeveloped because the general privacy principles it sets out are flawed. Unlike the EU’s GDPR, there is no section dedicated to the individual’s rights under a data-protection regime. Rather, section 6 of the TT DPA seeks to protect personal information through ‘appropriate safeguards’ but fails to explain what those safeguards are. Section 6(b) attempts some variant of a purpose-limitation principle, but this is undermined by s 43, which broadly allows personal information to be disclosed for research purposes. The problem is that this lacks any requirement of a lawful basis for extracting information. Section 43 all but allows the government to mine large quantities of data and disclose it wherever it deems fit under the guise of research – meaning that statistical data could well be drawn and analysed in an attempt to influence elections, exactly as was revealed to have occurred in the 2018 Facebook-Cambridge Analytica data scandal.

Furthermore, while the GDPR clearly delimits the individual’s rights of access, rectification, restriction and erasure, the TT DPA is riddled with ambiguity on these points. The Act is too unconcerned with adequately empowering its data-subjects with the autonomy necessary to truly have a say in how information appurtenant to their person, as an extension of their personality, is dealt with, and this cannot be correct.

Comparatively, therefore, the TT DPA is significantly ill-equipped and too underdeveloped to establish effective privacy laws and, by extension, a well-working data-protection regime. It is materially flawed and ultimately deficient by reason of its excessive narrowness, and it becomes necessary to examine the implications of so infant an Act.

Analysis: The fundamental problem with the Data Protection Act

The fundamental problem of the TT DPA is that it fails to supply an all-encompassing definition of the very thing it is supposed to protect: data. The meaning of data under s 2 of the TT DPA is wholly unhelpful, so one must turn to the meaning of personal information under the Act, and even here it is too limited to adequately touch and concern the multiplicity of data-mining and extractive techniques employed through Big Data analytics.

The Internet of Things is an ecosystem of electronic sensors comprising any human-made or natural object with an internet address that transfers data over a network.[29] These data-capture devices also include online surveillance systems like Facebook’s WhatsApp and Messenger,[30] and ‘cookies’ that track a user’s movement across the internet and deliver this information to servers, whose operators can now see how much time a user spent on a webpage, what they did on the page, and even how far they scrolled.[31] The aggregation and coordination of the data accumulated by the Internet of Things and by surveillance and tracking systems, via third-party cloud storage services, can reveal everything from buying habits to socio-political preferences.[32]
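
As a minimal sketch of the tracking mechanism just described, the following uses only Python’s standard library: the server issues each new browser a random identifier in a cookie, re-sets it on every response, and logs every request against it. The cookie name and logging scheme are invented for illustration; no real ad network’s protocol is being reproduced.

```python
# A minimal sketch of cookie-based tracking: each browser is assigned a
# persistent identifier, and every page request is logged against it.
# The cookie name ("tid") and the log structure are invented examples.
import uuid
from http.cookies import SimpleCookie
from http.server import BaseHTTPRequestHandler, HTTPServer

VISIT_LOG: dict = {}  # tracker_id -> list of pages visited

class TrackingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        cookies = SimpleCookie(self.headers.get("Cookie", ""))
        # Reuse the browser's identifier if it has one, else mint a new one.
        tracker = cookies["tid"].value if "tid" in cookies else str(uuid.uuid4())
        VISIT_LOG.setdefault(tracker, []).append(self.path)  # build the profile

        self.send_response(200)
        # Re-set the cookie so the identifier survives future visits.
        self.send_header("Set-Cookie", f"tid={tracker}; Max-Age=31536000; Path=/")
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(f"pages seen from this browser: {VISIT_LOG[tracker]}".encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), TrackingHandler).serve_forever()
```

In real deployments the tracking cookie is typically set by a third-party domain embedded across many websites, which is what allows a single server to correlate one user’s movements across the web.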

Surveillance capitalism drives the internet and data economy.[33] Because that economy is an ecosystem fuelled by data extraction, online services like social media can only earn revenue by monetising data – this means that they ‘comb through user profiles, preferences and behaviour to find insights and predictions that can now be sold and shared to other third parties for numerous purposes, not the least of which could be elections tampering.’[34]

But the Trinbagonian Data Protection Act cannot even begin to fathom the true extent to which artificial intelligence, Big Data analytics and algorithms use data-mining techniques to turn raw data collected on the Internet of Things into meaningful intelligence with a variety of uses to third-party data companies. Where, by employing broad definitions of ‘data’ and ‘processing’, the EU, Japanese and Chinese jurisdictions are better able to recognise (though not as fully as is materially necessary) the potential for abuse in Big Data analytics, the TT DPA fails to define processing at all, and its definition of ‘personal information’ is narrowly limited to personal information that is recorded, rather than any information relating to an identified or identifiable individual.

The TT DPA is simply incapable of protecting personal information amassed on online social media accounts or platforms, or stored by internet service providers, and this precludes TT privacy law from considering the extent to which the abuse, misuse or misappropriation of mined data exposes a data-subject to harm. The Act essentially separates personal information from a person’s private data, and that hinders the scope, efficacy and capacity of the Act to actually regulate data breaches.

The spill-overs of this fundamental problem: inadequate regulation of data-processors

The TT DPA’s fundamental defect is symptomatic of a larger material flaw in the Act when examined as a whole – its entire premise is too myopic.

Decisional privacy gives the individual autonomy. Autonomy refers to ‘a set of diverse notions including self-governance, liberty rights, privacy, individual choice, liberty to follow one’s will, causing one’s own behaviour and being one’s own person,’[35] but Big Data, artificial intelligence and the Internet of Things undermine it. Online behavioural advertising hinges on access to massive troves of personal data.[36] Companies and governments disclose personal information for the private benefit of third-party advertising technology companies, who use the data acquired to manipulate consumer choice.[37] In other words, as more and more information is accrued, behavioural advertising creates psychological ‘wants’ that masquerade as cognitive choices but are, in reality, little more than subtle manipulation.[38]

But rather than employing some mechanism of purpose limitation or data minimisation to guard against these personal security threats, the TT DPA is preoccupied with justifying and describing the instances in which, and the reasons why, personal information may be disclosed. While ss 41-43 could essentially be construed as an acknowledgement that governments engage in data mining, there is no purpose limitation. In other words, the Act takes little care to specify how exactly data should be disclosed or to whom, or to ensure that the purpose of disclosure is legitimate. This means that a government could harvest, and then distribute or sell, the data mined to third parties under the guise of research – much the same way that the Facebook researcher who sold the data he accrued to Cambridge Analytica did.

At the same time, artificial intelligence and data analytics are cyberthreats to the electoral process.[39] Data can be misused in a weaponised micro-targeting agenda through which voter attitudes are manipulated.[40] Psychometric profiling encroaches upon a data-subject’s free will by ‘using data fusion and analytics to reveal deeply personal and granular details about themselves which are then used to emotionally influence what should be a deliberative, private and thoughtful choice.’[41] This was the scenario with Cambridge Analytica, which used microtargeting and psychographic profiling to swing the 2015 election results by 6%, and which was used in America to ‘put Trump in the White House.’[42] Big Data analytics make distortion campaigns more successful predominantly through social media recommendation algorithms: in other words, ‘the selective presentation of information by an intermediary to meet its agenda rather than to serve its users represents an abuse of a powerful platform and is simply one point on an emerging map of the ability to quietly tilt an election.’[43]
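
As a toy illustration of the profiling step described above, the sketch below fits a simple logistic-regression model that maps behavioural signals (page ‘likes’) to a psychographic label, then scores an unseen user to decide which message they are shown. All features, labels and data are invented; real pipelines fuse millions of records across datasets, but the underlying mechanic is the same.

```python
# A toy sketch of psychographic scoring: invented "likes" data, a
# hand-rolled logistic regression, and a susceptibility score used to
# pick which message a user sees. Nothing here reflects real datasets.
import math

PAGES = ["page_a", "page_b", "page_c"]  # hypothetical liked pages
train = [                               # (likes, 1 = responsive to ad theme)
    ([1, 0, 1], 1),
    ([1, 1, 1], 1),
    ([0, 1, 0], 0),
    ([0, 0, 1], 0),
]

w, b, lr = [0.0] * len(PAGES), 0.0, 0.5

def predict(x):
    """Logistic model: probability that a user matches the target profile."""
    z = b + sum(wi * xi for wi, xi in zip(w, x))
    return 1 / (1 + math.exp(-z))

for _ in range(2000):                   # plain per-example gradient descent
    for x, y in train:
        err = predict(x) - y
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

new_user = [1, 0, 0]                    # an unseen voter who liked page_a
print(f"susceptibility score: {predict(new_user):.2f}")
```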

The bottom line is that while psychographic profiling excels at instigating polarising propaganda,[44] electoral hacking hinges on access to personal data. Yet the TT DPA fails to account for that. Because it fails to adopt a broad definition of data, or any definition of processing at all, the Act cannot truly extend to the cross-border flows of critical personal information discerned from data-capture devices on online social media platforms and the Internet of Things. Section 72 of the TT DPA restricts the disclosure of personal information unless consistent with the purpose for which the information is collected, and s 72(1) and (4) make individual consent a pertinent consideration in whether the information may be disclosed. But because the Act fails to establish a broad definition of data, the scope of the entire Act is materially crippled: it is immaterial that s 72 sets out seemingly adequate safeguards if they cannot apply to data that the Act is incapable of defining.

Furthermore, the Act restricts disclosure of personal information unless consent is given, but it fails to consider all the ways in which information could be amassed, through cookies or data-capture devices within the Internet of Things, without the individual’s knowledge. Finally, its attempted purpose limitation is undermined by the fact that it does not extend to public, governmental entities, which, by ss 41-43, can amass information and disclose it as they see fit and to whomever they see fit, so long as it is under the guise of research purposes. There is therefore little material protection against the acquisition of a local Trinbagonian citizen’s data by foreign conglomerates or organisations.

Conclusively then, because the basic premise of the Act (to protect private data) is undermined at its foundation, the scope and efficacy of the Act are by extension materially crippled, and TT’s supposed data privacy laws are generally unable to build a workable regime that could safeguard Trinbagonians’ rights. There is therefore a need to seek protection and remedial relief elsewhere for breaches of one’s fundamental right to data protection. It is the thesis of this paper that that alternative channel can be found in tort law.

Tort Law

Because the TT DPA is patently ineffective and structurally flawed, it would find itself stretching its statutory mandates too thinly to even begin to reach the type of issues raised by data breaches, and these limited mandates will do little other than prevent broader regulation of data security. The Data Protection Act is inadequate in addressing data privacy law, but data breaches can arguably be better addressed through ex post tortious liability.

Tort law is arguably the best alternative avenue because data-processors and data-controllers are charged with duties and obligations towards their data-subjects – which means that a legal cause of action can be substantiated where those duties and obligations are breached. A duty of care exists when the actions of one could reasonably be expected to affect, and so harm, another. It is an obligation to take reasonable care to avoid harming an individual who would foreseeably be harmed by an act or omission. Data-subjects rely on data-processors and data-controllers to be careful in protecting their personal data. Within the context of the Trinbagonian Data Protection Act, the data-controller (the Commissioner) attracts a duty of non-disclosure,[45] and a duty to take reasonable care and skill to ensure that data was processed on a lawful basis.[46] At the same time, public-body data-processors have a duty to take reasonable care and skill to protect the data they process,[47] and an obligation to safeguard against data breaches by taking such steps as are necessary to ensure that the personal information in their custody or under their control is stored only in TT and accessed only in TT.[48] Similarly, private-entity data-processors have a duty to take reasonable care to protect the personal data they collect, retain, manage, use, process or store,[49] and an obligation to prevent personal information under their custody or control from being disclosed to any third party without the data-subject’s consent.[50]

Irrespective of the functional deficiencies and material discrepancies that cripple the scope and efficacy of the Act, the point is that this duty of care remains, and by reason of that fact, a data-subject can satisfy a cause of action under the law of torts.

Ordinarily, since data rights, like image rights, are privacy rights, it is to the privacy torts that we should turn; but it is only the US jurisdiction, and to a limited extent the Canadian jurisdiction, that recognise a tort of invasion of privacy. These jurisdictions may offer persuasive precedent, but it is first necessary to determine the extent to which they can effectively address digital privacy breaches.

Public Disclosure of Private Facts

Under American law, the public disclosure tort is satisfied where a detail of a person’s life that is not generally known to the public or publicly available is communicated to enough people that it is reasonably likely to become public knowledge, and the disclosure would be offensive to a reasonable person of ordinary sensibilities, unless it can be shown that the private fact possessed some element of newsworthiness.[51] Similarly, in the Canadian jurisdiction, a person is subject to liability under the tort of public disclosure where they invade another’s privacy by giving publicity to a matter concerning that person’s private life, if the matter publicised would be highly offensive to a reasonable person and is not of legitimate concern to the public.[52]

This tort is pre-eminently concerned with protecting secret affairs, concerns or facts that are secluded from public scrutiny,[53] or instances where steps have been taken to maintain one’s privacy,[54] but the fundamental question here is whether private facts include behavioural data mined in digital environments. The answer is that the tort rarely extends to the protection of informational privacy,[55] because the Internet is primarily a public arena in which an individual is unlikely to establish any privacy claim.[56] In other words, because the data accrued within virtual environments is mostly non-sensitive behavioural data, it will not be considered personally identifiable information capable of constituting a revealed personal detail falling within the scope of private facts,[57] so it is highly unlikely that this tort will be able to adequately protect against digital privacy breaches.

Intrusion upon Seclusion

Here, American law stipulates that a cause of action for invasion of privacy can be substantiated where an individual intentionally intrudes upon or invades the private affairs (the solitude or seclusion) of another, resulting in that person’s mental anguish or suffering.[58] In the Canadian jurisdiction, intrusion imposes liability on a person who ‘intentionally or recklessly intrudes, physically or otherwise, upon the seclusion of another or their private affairs or concerns without lawful justification, once the invasion would be highly offensive to a reasonable person of ordinary sensitivities.’[59]

But even here, intrusion is unlikely to adequately address digital privacy breaches, since its focal point is the ‘deliberate and significant invasion of personal privacy’[60] through improper access to the victim’s private information, not a failure to protect such private, personal data from access.[61] Thus, this privacy tort will likely also be incapable of adequately addressing digital privacy breaches.

Appropriation of Name or Likeness

Under US law, appropriation under invasion of privacy occurs when one uses another’s name, likeness or image for commercial purposes without consent. Similarly, in Canada, misappropriation protects the proprietary right of an individual to the exclusive marketing of their personality – name, image and likeness – for commercial gain. But in both jurisdictions the tort is contingent on some degree of commercially saleable, product-advertising capability that makes the personality an asset requiring protection. So even though it might be possible to use this privacy tort in instances of data breaches involving informational trafficking,[62] the tort relies on the victim’s ability to show the loss of an economic opportunity that the appropriator gained instead, and this makes it too limited to adequately touch and concern issues of data privacy rights.

The Problems with Privacy Torts

The problem with privacy torts is that they treat data privacy rights primarily as personality rights, and this makes them ill-equipped in this particular circumstance to touch and concern issues of Big Data analytics, data mining and dataveillance through the Internet of Things. But it may be possible to rectify this by treating digital privacy breaches as issues fundamentally concerned with an individual’s property rights; here, it is to the tort of conversion that we must now turn.

Conversion

Within the US jurisdiction, conversion imposes liability for the intentional assertion of control over another’s property so as to interfere with the owner’s right to control it.[63] Though traditionally concerned only with proprietary rights in physical property, it has come to extend to the rights of owners of digital property as well. Thus, preventing a proprietor from exercising control over his computer records and data can expose the converter to liability, because there is little difference whether the information was held on physical paper or intangibly as computer data.[64]

The problem is that UK law does not recognise this extension. Within the English jurisdiction, extending the tort of conversion to choses in action would ‘drastically reshape the tort and make it inconsistent with the Torts (Interference with Goods) Act of 1977.’[65] And in the Canadian jurisdiction, while it has been held that the unauthorised use of personal information could amount to conversion,[66] this was only in relation to intangible contractual rights. In other words, the UK and Canadian jurisdictions – the jurisdictions most persuasive to the courts of Commonwealth Caribbean territories like Trinidad and Tobago – cannot adequately support a tort of conversion covering digital property so as to make it a plausible cause of action for victims of data breaches.

Negligence

Negligence is perhaps the best alternative, capable of protecting a data-subject’s rights to personal data protection and privacy precisely because it hinges on a duty of care. Once a duty of care can be proven, it is immaterial whether personal data attracts some commercial element, and immaterial whether the right at stake is proprietary or one of personality. This is so because a duty of care fixes one party with an obligation to take reasonable care and skill to avoid harming another. Within the context of privacy laws and data rights, data-controllers and data-processors have a duty to take reasonable steps when collecting and processing the personal information of a data-subject, and to carefully protect that private data from unlawful disclosure and/or processing through data breaches. In other words, the data-subject relies on the data-controller to ensure that data-processors process their data for a lawful cause and store it only for so long as is necessary; and the data-subject relies on the data-processor not to misuse or misappropriate their data such that it would cause them harm. Alternative legal action should be sought through the common law tort of negligence because negligence is not concerned with trying to fit a new phenomenon within the existing confines of some already defined tort – it is concerned with the relationship that one person has with another, and this naturally comes to include the relationship between data-controllers and data-processors and their data-subjects.

Generally, a duty of care will be owed wherever, in the circumstances, it is foreseeable that if the defendant does not exercise due care, the plaintiff will be harmed.[67] In establishing a duty of care, ‘one must first ask whether, as between the alleged wrongdoer and the person who has suffered damage, there is a sufficient relationship of proximity or neighbourhood such that, in the reasonable contemplation of the former, carelessness on their part may be likely to cause damage to the latter. Then it is necessary to consider whether there are any considerations which ought to negative, reduce or limit the scope of the duty or the class of persons to whom it is owed.’[68]

If the data-controller fails to ensure that the personal data of the subject was processed lawfully, he exposes the data-subject to a breach of their privacy that could result in the misuse of that data for psychographic profiling, causing harm through discrimination and subtle manipulation. If the data-controller fails to safeguard the data-subject’s personal information from being accessed, stored or transferred by some extra-hemispheric entity in another jurisdiction, or if data-processors fail to lawfully mine and process the data-subject’s data, they expose the data-subject to potential misuses and misappropriations of their data, and each of these failures is ultimately a material breach of the duty of care owed.

Each of these instances creates some measure of harm that is reasonably foreseeable, and as such, they are all relationships in which a duty is owed. Therefore, in the final analysis, because there is a duty of care between data-controllers and data-processors and their data-subjects, it is immaterial that the TT DPA is patently flawed and ineffective, and immaterial whether the individual’s right to personal data protection and privacy is a right of proprietary ownership (property) or of personality – so long as there has been a breach of the duty owed to take reasonable care when collecting, mining, processing and disclosing the data-subject’s personal data, a cause of action can be substantiated under the common law tort of negligence.

Conclusion

As the world becomes increasingly digitised, Big Data and the Internet of Things continue to expand their spheres of influence, and not without security concerns. As data develops, the individual’s right to privacy in particular is increasingly at risk. The Big Data phenomenon requires an adequate data-protection regime, but in the absence of functional data privacy laws within the Trinbagonian jurisdiction, some alternative must be explored. The common law tort of negligence is arguably the best alternative, since data protection at its core boils down to the relationship that data-controllers and data-processors have with their data-subjects, which ultimately attracts a duty of care.

[1] Habib Kazzi, ‘Digital Trade and Data Protection: The Need for a Global Approach Balancing Policy Objectives’ (2020) 4, 2, European Journal of Economics, Law and Social Sciences, 42; Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, ‘Towards a thriving data-driven economy’ 2 July 2014, COM (2014) 442.

[2] Charles Fried, ‘Privacy’ (1968) 77 Yale Law Journal 475.

[3] Julie Cohen, ‘A Right to Read Anonymously: A Closer Look at “Copyright Management” in Cyberspace’ (1996) 28 University of Connecticut Law Review 981.

[4] Beate Rössler, The Value of Privacy (Polity Press 2005) 116.

[5] Karl Manheim, ‘Artificial Intelligence: Risks to Privacy and Democracy’ (2019) 21, 1 Yale Journal of Law and Technology 5.

[6] Ibid.

[7] Dirk Helbing et al., ‘Will Democracy Survive Big Data and Artificial Intelligence?’ (Sci Am, 25 February 2017) <https://www.scientificamerican.com/article/will-democracy-survive-big-data-and-artificial-intelligence> accessed 28 November 2020; Julie E. Cohen, ‘Law for the Platform Economy’ (2017) 51 UC Davis L Rev 133, 195.

[8] Damien Collins, ‘Summary of Key Issues from the Six4Three Files’ <https://www.parliament.uk/documents/commons-committees/culture-media-and-sport/Note-by-Chair-and-selected-documents-ordered-from-Six4Three.pdf> accessed 28 November 2020.

[9] Reuters Editorial, ‘Factbox: Who is Cambridge Analytica and what did it do?’ (20 March 2018) accessed 28 November 2020.

[10] Matthew Rosenberg, Nicholas Confessore, Carole Cadwalladr ‘How Trump Consultants Exploited the Facebook Data of Millions’ The New York Times (17 March 2018) ISSN 0362-4331 accessed 28 November 2020.

[11] Alex Hern, ‘How to check whether Facebook shared your data with Cambridge Analytica’ The Guardian (10 April 2018) accessed 28 November 2020.

[12] Trent J. Thornley, ‘The Caring Influence: Beyond Autonomy as the Foundation of Undue Influence’ (1996) 71 IND LJ 513, 524.

[13] Carole Cadwalladr, ‘I Made Steve Bannon’s Psychological Warfare Tool: Meet the Data War Whistleblower’ Guardian (18 March 2018) <https://www.theguardian.com/news/2018/mar/17/data-war-whistleblower-christopher-wylie-facebook-nix-bannon-trump> accessed 28 November 2020.

[14] H. Akin Unver, ‘Artificial Intelligence, Authoritarianism and the Future of Political Systems in Cyber Governance and Digital Democracy’ (2018/9) <https://edam.org.tr/wp-content/uploads/2018/07/AKIN-Artificial-Intelligence_Bosch-3.pdf> accessed 28 November 2020; Elaine Kamarck, ‘Malevolent Soft Power, AI, and the Threat to Democracy’ Brookings (28 November 2018) <https://www.brookings.edu/research/malevolentsoft-power-ai-and-the-threat-to-democracy> accessed 28 November, 2020.

[15] ‘Meet Cambridge Analytica: The Big Data Communications Company for Trump & Brexit’ None Above UK (2 February 2017) <https://notauk.org/2017/02/02/meet-cambridge-analytica-the-big-data-communicationscompany-responsible-for-trump-brexit> accessed 28 November 2020.

[16] Celia Zolynski, ‘Big Data and Personal Data between the Principles of Protection and Innovation’ (2020) 12, 1 Law, State and Telecommunications Journal, 225 <https://doi.org/10.26512/lstr.v12i1.30007> accessed 28 November 2020.

[17] Thilla Rajaretnam, ‘Data Mining and Data Matching: Regulatory and Ethical Considerations Relating to Privacy and Confidentiality in Medical Data’ (2014) 9, 4 JICLT 294, 298.

[18] UN Doc A/C.3/71/L.39/Rev.1.

[19] Resolution 2200 A (XXI), [1966] 999 UNTS 302.

[20] UNGA Resolution 68/167 on the right to privacy in the digital age UN Doc A/C.3/71/L.39/Rev.1.

[21] EU General Data Protection Regulation (GDPR): Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data.

[22] Ibid.

[23] Ibid.

[24] Nick Beckett, Amanda Ge and Roxie Meng, ‘China passes draft law on personal data protection’ Lexology (2020) <https://www.lexology.com/library/detail.aspx?g=5a0c5e48-7ead-48dc-ab82-b9bebf710236#:~:text=On%202120October%202020%2C%20China’s,data%20protection%20law%20in%20China> accessed 28 November 2020.

[25] Ibid.

[26] Noriko Higashizawa, ‘Data Privacy Protection of Personal Information Versus Usage of Big Data: Introduction of the Recent Amendment to the Act on the Protection of Personal Information (Japan)’ (2017) 84, 4 Defense Counsel Journal.

[27] Ibid.

[28] Data Protection Act, Trinidad and Tobago (TT) s. 2.

[29] Manheim, ‘Artificial Intelligence’ (n 5) 6; Margaret Rouse, ‘Internet of Things’ Tech Target (2016) <https://internetofthingsagenda.techtarget.com/definition/Internet-of-Things-IoT> accessed 28 November 2020.

[30] Samuel Gibbs ‘How Much Are You Worth to Facebook’ Guardian (28 January 2016) <https://www.theguardian.com/technology/2016/jan/28/how-muchare-you-worth-to-facebook> accessed 28 November 2020.

[31] Joanna Geary, ‘Tracking the trackers: What are Cookies? An Introduction to Web Tracking’ Guardian (23 April 2012) <https://www.theguardian.com/technology/2012/apr/23/cookies-and-web-tracking-intro> accessed 28 November 2020.

[32] Manheim, ‘Artificial Intelligence’ (n 5).

[33] Shoshana Zuboff, ‘Google as Fortune Teller, The Secrets of Surveillance Capitalism’ Pub Purpose (5 March 2016) <https://publicpurpose.com.au/wp-content/uploads/2016/04/Surveillance-capitalism-Shuboff-March-2016.pdf> accessed 28 November 2020. 

[34] Manheim, ‘Artificial Intelligence’ (n 5) 9.

[35] Tom L. Beauchamp and James F. Childress, Principles of Biomedical Ethics (1989) 67-68.

[36] Manheim, ‘Artificial Intelligence’ (n 5).

[37] Ibid.

[38] Ibid.

[39] Daniel Oberhaus, ‘Algorithms Supercharged Gerrymandering. We Should Use Them to Fix it’ Vice (3 October 2017) <https://motherboard.vice.com/en_us/article/7xkmag/gerrymandering-algorithms> accessed 28 November 2020.

[40] Manheim, ‘Artificial Intelligence’ (n 5).

[41] ‘Meet Cambridge Analytica’ (n 15).

[42] Nick Miller, ‘Cambridge Analytica CEO Suspended After Boasts of ‘Putting Trump in the White House’ Sydney Morning Herald (21 March 2018) <https://www.smh.com.au/world/europe/cambridge-analytica-ceo-suspended-after-boasts-of-putting-trump-in-white-house-20180321-p4z5dg.html> accessed 28 November 2020

[43] Jonathan Zittrain, ‘Engineering an Election’ (2014) 127 Harvard Law Review 335, 338.

[44] Manheim, ‘Artificial Intelligence’ (n 5).

[45] Data Protection Act, TT, s. 25.

[46] Ibid s. 9.

[47] Ibid s. 35.

[48] Ibid s. 36.

[49] Ibid s. 69.

[50] Ibid s. 72.

[51] Sidis v F-R Publishing Corp 113 F 2d 806 (2d Cir 1940); Haynes v Alfred A Knopf, Inc 8 F 3d 1222 (7th Cir 1993).

[52] Doe 464533 v ND [2016] ONSC 541.

[53] Vera Bergelson, ‘It’s Personal But is it Mine? Toward Property Rights in Personal Information’ (2003) 37 UC Davis L Rev 379, 407-408.

[54] United States v Gines-Perez 214 F Supp (2d) 205 at 225 (DPR 2002).

[55] Boring v Google Inc, 598 F Supp (2d) 695 (WD Penn 2009).

[56] Bryce Clayton Newell, ‘Rethinking Reasonable Expectations of Privacy in Online Social Networks’ (2010) 17 Rich JL & Tech 1, 21.

[57] Gordon v Canada (Health) 2008 FC 258; Eloise Gratton, Internet and Wireless Privacy: A Legal Guide to Global Business Practices (Toronto: CCH Canadian, 2003).

[58] Nader v General Motors Corp 255 N.E.2d 765 (NY 1970).

[59] Jones v Tsige [2012] ONCA 32.

[60] Ibid.

[61] Robert L. Rabin, ‘Perspectives on Privacy, Data Security and Tort Law’ (2017) 66 DePaul L. Rev 313.  

[62] Fraley v Facebook Inc. 830 F. Supp. 2d 785 (ND California 2011).

[63] Restatement (Second) of Torts 222A (Am. Law. Inst. 1965).

[64] Thyroff v Nationwide Mutual Insurance Co. 864 NE 2d 1272 (NY 2007).

[65] OBG Ltd v Allan [2008] 1 AC 1.

[66] Haug v Saskatchewan [2006] 2 WWR 516.

[67] Donoghue v Stevenson [1932] AC 562, p 579.

[68] Anns v Merton LBC [1977] 2 All ER 492, 498-499.



