
May Privacy Sum Up

News and events:

  1. New draft guidelines on the calculation of administrative fines - The European Data Protection Board (EDPB) published the new Guidelines 04/2022 on the calculation of administrative fines under the GDPR on 16 May. The Guidelines are an effort to harmonise the fine calculation process across all EU member states and to further assist the supervisory authorities in assessing the criteria on which the calculations are based. The Guidelines state that the starting point should be calculated on the basis of three elements: the nature of the infringement, its severity and the turnover of the business. Further, the EDPB suggests a five-step methodology for the assessment of the total fine: (1) Identifying the processing operations in the case and evaluating the application of Article 83(3) GDPR. (2) Finding the starting point for further calculation based on an evaluation of the three elements. (3) Evaluating aggravating and mitigating circumstances related to past or present behaviour of the controller/processor and increasing or decreasing the fine accordingly. (4) Identifying the relevant legal maximums for the different processing operations. (5) Analysing whether the final amount of the calculated fine meets the requirements of effectiveness, dissuasiveness and proportionality, as required by Article 83(1) GDPR, and increasing or decreasing the fine accordingly. A short illustrative sketch of this methodology appears after this list. Please find the guidelines here.
  2. Google sued for using NHS data - A representative action against Google has been brought before the High Court in the UK, as the company allegedly received confidential health data of over 1.6 million UK individuals without their knowledge or consent. In 2015 Google's AI subsidiary, DeepMind, received this data from the Royal Free NHS Trust in London in order to test an app designed to detect acute kidney injuries. In exchange, the Royal Free could use the app at a discounted rate. While the ICO found the arrangement to be unlawful, it decided not to fine the Trust due to a lack of guidelines for the sector. Read more about it here.
  3. Court of Appeal ruling on CCTV in Ireland - This case concerned footage from CCTV cameras placed in the tearoom of a hospice in Ireland following an incident in 2015 in which the words "Kill all whites, ISIS is my life" were found written on the table. The stated purpose of the monitoring was health and safety as well as crime prevention. The case arose at first instance from a complaint to the Irish supervisory authority (DPC) by an employee of the hospice, against whom disciplinary action had been taken based on recordings from the tearoom camera. He claimed he had not been informed that the CCTV footage would be used for this purpose. The DPC decided the footage had not been processed beyond the purposes of security, a decision upheld by the Circuit Court. However, on further appeal, the High Court noted that the data subject had not been made aware that the footage would be used for disciplinary purposes, nor ought he reasonably to have expected such use. The Court of Appeal upheld that decision following an appeal by the DPC. "The Court’s reasoning focused on principles of data transparency (notification to data subject) and the data subject’s reasonable expectation as to the secondary purpose of processing." Please find the full decision here.
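To make the fine-calculation item above more concrete, here is a minimal sketch of how the five-step structure could be expressed in code. Every amount, multiplier and parameter name below is a hypothetical assumption chosen for illustration, not a figure taken from Guidelines 04/2022, and steps 1 and 5 are qualitative assessments that cannot truly be reduced to arithmetic.

```python
# Hypothetical sketch of the EDPB five-step fine methodology (Guidelines 04/2022).
# All amounts, multipliers and parameter names are illustrative assumptions,
# not figures taken from the Guidelines themselves.

def calculate_fine(starting_point: float,
                   aggravating_multiplier: float,
                   mitigating_multiplier: float,
                   legal_maximum: float) -> float:
    """Return an indicative fine following steps 2-5 of the methodology.

    starting_point         -- step 2: amount derived from the nature and
                              severity of the infringement and the turnover
    aggravating_multiplier -- step 3: factor >= 1.0 for aggravating conduct
    mitigating_multiplier  -- step 3: factor <= 1.0 for mitigating conduct
    legal_maximum          -- step 4: applicable cap under Article 83 GDPR
    """
    # Step 3: adjust the starting point for past or present behaviour.
    adjusted = starting_point * aggravating_multiplier * mitigating_multiplier

    # Step 4: the fine can never exceed the relevant legal maximum.
    capped = min(adjusted, legal_maximum)

    # Step 5: effectiveness, dissuasiveness and proportionality are judged
    # qualitatively by the supervisory authority; here we simply return the
    # capped amount unchanged.
    return capped


if __name__ == "__main__":
    # Purely illustrative numbers: a 200,000 EUR starting point, one
    # aggravating circumstance (+20%), one mitigating circumstance (-10%),
    # and a 10,000,000 EUR statutory maximum.
    print(calculate_fine(200_000, 1.2, 0.9, 10_000_000))
```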

Decisions:

  1. Google, Spain - It is not often that Google faces GDPR penalties, and one from the Spanish authority, the AEPD, was certainly not expected. Nonetheless, in mid-May the agency announced that Google had committed 'two very serious infringements' for which it would pay a penalty of €10 million. The breaches concerned the transfer to a Harvard-based academic project, the Lumen Project, of personal data of EU citizens contained in requests for their data to be erased. The project gathers data from online erasure requests for study. The AEPD found that the practice of sharing data about erasure requests with a third party essentially frustrates the right to be forgotten. Both Google and the Lumen Project were asked by the Spanish agency to delete the information covered by the erasure requests, and Google was asked to amend its practices. While it might have been complicated for the AEPD to enforce any changes on the US-based Lumen Project, the group has already agreed to delete the data it received from Google without the consent of the EU citizens concerned. Find out more information on this decision here.
  2. Clearview AI, UK - The ICO fined Clearview AI Inc €9 million for several breaches of the GDPR and the UK GDPR. The facial recognition company operates by collecting images and data publicly available online to create a vast database, which customers can then use to identify or effectively track individuals. The company does not inform the individuals whose data is collected that it is being used for this purpose, which the ICO found unacceptable. Clearview AI collects data and images of individuals worldwide, hence the ICO fairly assumed that a substantial number of those monitored would be Britons, giving it jurisdiction to step in. This could well mean that more EU states will follow. In fact, the company had already been fined €20 million by the Italian authorities in February 2022 for the same breaches, and €10,000 in Germany back in 2020 for insufficient cooperation during an investigation. The ICO found a breach of the purpose limitation principle, an insufficient legal basis for processing, a failure to have a process in place for storage limitation and data deletion, a failure to implement additional safeguards when processing special category data (in this case biometric data), as well as practices that disincentivise individuals from objecting to their data being collected. Please find the full publication here.
  3. Twitter, US - The US authorities have fined the social media giant €139.6 million (or $150 million) for misrepresenting how users' data would be used. Between 2013 and 2019 the company collected users' email addresses and phone numbers for security purposes, including two-factor authentication. This data was then passed on to advertisers to create targeted ads for those users, a use to which they never consented. Moreover, this breach contradicts the company's previous representations that its practices were in line with EU and Swiss data protection frameworks. According to Federal Trade Commission chair Lina Khan, this misuse of data affected over 140 million Twitter users while boosting the company's revenue substantially, considering that ads generate 90% of its turnover. Find the full article here.
