Are we seeing a more targeted approach from both regulators and companies to ensure privacy and online user safety?
While regulatory bodies are cracking down on privacy breaches and passing new legislation, there is also an increased effort by companies to self-regulate their platforms.
The Information Commissioner’s Office (ICO) has recently warned companies against using “emotional analysis” technology, citing inaccuracies in the technology as well as concerns about bias and data collection. This is the first time the ICO has issued a blanket warning on the ineffectiveness of a newer technology, and it signals a more active approach to regulating these systems.
Additionally, the ICO recently issued a warning to the Department for Education after the privacy of up to 28 million children was breached and their student records were used by gambling sites as a means of age verification.
Meanwhile, companies are also making their own active efforts to police their platforms and protect users. Bumble has recently released an open-source version of its AI feature, which was developed to protect users from unsolicited NSFW pictures. By releasing the code, the company hopes it can be adopted by the wider tech community and contribute to a larger conversation about “cyber flashing”.
Instagram will also now require underage users to verify their age with video selfies or by uploading ID, in an attempt to address online child safety and the negative effects of social media on children. While this policy raises questions and concerns about privacy and how the data will be collected and stored, the platform assures that it will comply with data protection law.
Thanks,
Pimloc
(Primary links to the news articles are attached to the images)
--------------------------------
News
DfE receives a warning from the ICO after children’s data was used by gambling companies
The ICO has reprimanded the Department for Education for giving an employment screening company access to a database of children aged 14 and over between 2018 and 2020. This data was subsequently used to conduct age-verification checks for gambling companies, marking a “serious breach” of data protection law.
The ICO warns companies against using biometric technology to carry out emotional analysis
The ICO has warned companies that they face fines if they use biometric technology to conduct emotional analysis, labelling the process “pseudo-scientific”. The regulator argues that AI-powered emotional analysis relies on large amounts of sensitive biometric data and carries a high risk of inaccuracy and systemic bias, particularly if it is used to contribute to meaningful decisions.
Instagram will require young users to verify their age with video selfies
Instagram will be partnering with the tech firm Yoti to develop an age estimation system whereby users who try to edit their date of birth from under 18 to over 18 will now have to verify their age through ID or a video selfie. This change comes as part of an effort to guarantee an age-appropriate experience for users and protect children on social media.
The NYPD joins Ring Neighbors to access neighbourhood surveillance network
The New York Police Department (NYPD) has joined Ring Neighbors, a neighbourhood surveillance network built around Amazon’s Ring doorbell cameras. Through this collaboration, the police force will have access to community posts in the network and will be able to request assistance and video footage from the public for active police matters.
One in five tech workers is under workplace surveillance, according to a recent survey
A recent survey by Prospect Union has found that one in five tech workers is subject to software that monitors their activity, including keystrokes, mouse clicks, physical location and website activity. The survey specifically highlights the extent of digital workplace surveillance in the UK since the Covid-19 pandemic and the rise of monitoring software to watch employees working from home.
--------------------------------
AI News Snippet of the Week
Bumble releases open-source version of AI feature which helps combat cyber-flashing
The Data Science team at Bumble has released a whitepaper and open-source version of its “Private Detector” technology, an AI tool that automatically blurs nude images shared on the app. The company began working on this technology in 2019 in an effort to protect its users from unwanted NSFW pictures and improve user safety.
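For readers curious how a blur-on-detection feature like this fits together, below is a minimal, hypothetical Python sketch. It does not use Bumble’s actual model or code: the `predict_nsfw_probability` stub, the 0.9 threshold and the blur radius are illustrative assumptions standing in for the open-sourced classifier.

```python
from PIL import Image, ImageFilter  # Pillow for image handling


def predict_nsfw_probability(image: Image.Image) -> float:
    """Hypothetical stand-in for an NSFW image classifier.

    Bumble's open-sourced "Private Detector" is a trained deep-learning model;
    this stub simply returns 0.0 so the sketch runs end to end. Swap in real
    model inference here.
    """
    return 0.0


def blur_if_nsfw(path_in: str, path_out: str, threshold: float = 0.9) -> bool:
    """Blur an incoming image before display if the classifier flags it.

    Returns True if the saved image was blurred, False if it was passed through.
    """
    image = Image.open(path_in).convert("RGB")

    if predict_nsfw_probability(image) >= threshold:
        # A heavy Gaussian blur hides the content while preserving the chat
        # layout; the recipient can then decide whether to reveal the original.
        image = image.filter(ImageFilter.GaussianBlur(radius=30))
        image.save(path_out)
        return True

    image.save(path_out)
    return False


if __name__ == "__main__":
    # Example usage with hypothetical file names.
    was_blurred = blur_if_nsfw("incoming.jpg", "display.jpg")
    print("blurred" if was_blurred else "shown as-is")
```

In practice the threshold is a product decision: a higher value avoids blurring innocent photos at the cost of letting more unwanted images through.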
--------------------------------
Policy Updates
EU lawmakers approve and publish the Digital Services Act in the EU Official Journal
The EU has published the Digital Services Act in the EU Official Journal, prompting an update of e-commerce rules in the region. The Digital Services Act, alongside the Digital Markets Act, aims to establish transparency and accountability frameworks for “Internet intermediaries”. The law will take effect from February 2024, though rules specific to large online platforms are expected to become effective sooner.