CPPA Drafts, Privacy Concerns of Connected Cars, & Airbnb's GDPR Investigation

By Robert Bateman and Privado.ai

This week’s Privacy Corner Newsletter explores:

  • A summary of the California Privacy Protection Agency (CPPA)’s new set of draft regulations.
  • The alleged “privacy nightmare” of connected cars.
  • The results of Ireland’s GDPR investigation into Airbnb.
  • Three recommended privacy-related reads published this week.

California Regulator Publishes Draft Risk Assessment Regulations

The California Privacy Protection Agency (CPPA) has published draft regulations on risk assessments under the California Consumer Privacy Act (CCPA).

  • The CCPA requires businesses to conduct risk assessments before engaging in certain types of high-risk data processing and submit certain information about risk assessments to the CPPA.
  • The CPPA’s draft regulations provide rules on the circumstances under which businesses must conduct a risk assessment and the form that risk assessments should take.
  • The draft regulations also propose some modifications to the existing regulations (not yet in force), including definitions of “artificial intelligence” and “automated decision-making”.

More regulations? Didn’t the CPPA publish regulations already? And don’t you mean the CPRA?

OK, let’s get this straight:

  • The California Privacy Rights Act (CPRA) amended the CCPA.
  • That process is complete, so we should generally use “the CCPA” to refer to “the CCPA as amended by the CPRA”.
  • We already have two sets of finalized CCPA regulations.
  • The first set was issued by the California Attorney General under the original CCPA in 2020. We can call these rules the “CCPA Regulations”.
  • The second set of regulations was issued under the CPRA (OK, sometimes it’s helpful to distinguish the laws) by the CPPA. We can call these rules “the first set of CPRA Regulations”.
  • The first set of CPRA Regulations was finalized very late. While those rules were supposed to become enforceable in July, the Sacramento Superior Court pushed the enforcement date back to next March.
  • The CPPA is appealing this court decision, but don’t bet on the appeal succeeding.
  • The first set of CPRA Regulations was also incomplete, missing rules on service provider contracts, automated decision-making, and risk assessments.

So where are we now?

The CPPA has just published its second set of CPRA Regulations, which you’re currently reading about in this edition of The Privacy Corner newsletter.

The second set of CPRA Regulations is still in draft form but incorporates public comments from earlier this year.

What does the second set of CPRA Regulations cover?

The second set of CPRA Regulations fills in the gaps left in the first set of CPRA Regulations. The focus is on risk assessments.

When must a business conduct a risk assessment?

Businesses must conduct a risk assessment before engaging in certain risky activities, including:

  • Selling personal information
  • Processing sensitive personal information
  • Using “automated decision-making technology”
  • Processing personal information about children under 16
  • Monitoring employees, contractors, job applicants, or students
  • Monitoring people’s behavior in publicly accessible places
  • Using personal information to train AI models

What must a risk assessment include?

Some requirements for conducting risk assessments include:

  • A short summary of the processing.
  • The categories of personal information involved.
  • The context of the processing.
  • An analysis of consumers’ reasonable expectations.
  • The operational elements of the processing (including the technology used, the number of consumers affected, and the names of service providers receiving personal information).
  • The purpose of the processing.
  • The benefits of the processing.
  • The potential harms of the processing.
  • An assessment of the safeguards that can be implemented to mitigate the potential harms.
  • An assessment of whether the safeguards will actually work.

Do the draft regulations mention anything else?

There are a couple of other sections of the draft regulations that propose additions and modifications to the first set of CPRA Regulations, including:

  • Specific rules for businesses assessing their use of automated decision-making technology.
  • A requirement for businesses conducting risky processing to conduct annual cybersecurity audits.
  • Definitions of “artificial intelligence” and “automated decision-making”.
  • A new rule requiring service providers to cooperate with businesses undertaking risk assessments, cybersecurity audits, and providing information about automated decision-making.

When will these regulations be finalized?

The CPPA has not announced a timeline for finalizing its draft regulations. The draft will be discussed at a board meeting on September 8.

Given its ongoing litigation regarding the delayed enforcement of the first set of CPRA Regulations, the CPPA has not earned a reputation for impeccable timekeeping.

Modern Cars a ‘Privacy Nightmare’, Says Mozilla Research

In research into connected vehicles published on Wednesday, Firefox developer Mozilla found that cars were “the official worst category of products for privacy” that the group had ever reviewed.

  • The research analyzed how 25 car brands collected personal data about drivers across criteria such as data use, data control, “track record” (the brand’s history of known data leaks and other issues), security, and AI.
  • All 25 car brands tested received Mozilla’s *Privacy Not Included warning label, with only two models enabling users to delete personal data.
  • Tesla received negative results across all five tested categories, with all car brands failing in at least two.

Is it surprising that “computers on wheels” collect a lot of personal data?

The services provided by connected cars require a lot of data processing.

What’s significant about this research is that it confirms an apparently lax attitude among connected car manufacturers towards privacy by design and security.

What sorts of privacy and security issues were revealed?

Here are a few headline stats from Mozilla’s research:

  • 76% of the car brands allegedly sell personal data about their drivers.
  • 84% of brands state that they share personal data about their drivers.
  • 56% of brands state that they will share data with government authorities following a “request” (which Mozilla distinguishes from a court order).
  • 100% of brands failed Mozilla’s transparency tests by using complex or confusing language in their privacy notices.

Some of these stats might need further explanation…

Yes, for example, “sharing” personal data is not necessarily a malicious act.

And sharing personal data with law enforcement is a legal obligation under certain circumstances. But the privacy-conscious among us expect businesses only to do so where such a request is valid.

However, the research revealed serious issues with enabling users to opt out of the unnecessary sharing of their personal data and—predictably—found some apparent problems with many brands’ “consent” processes.

Which brands came off worse?

Only Tesla earned “dings” (dings are bad) across all five criteria, meaning that Mozilla took issue with how the company handled data use, data control, security, and AI, and also with the company’s track record on privacy and security incidents.

Volkswagen allegedly uses data about how people use their seatbelts and brakes, combined with their ages and genders, for targeted advertising purposes.

Renault and Dacia’s models were the least problematic, with only two “dings” each in the areas of “data use” and “security”.

Predictably, these two brands provide the only tested car models available in Europe, where the GDPR appears to have spared drivers from the worst data protection violations.

Ireland’s Airbnb GDPR Investigation Ends in a Reprimand

The Irish Data Protection Commission (DPC) has concluded an investigation into Airbnb, finding multiple GDPR violations and issuing a reprimand and corrective measures.

  • The DPC’s investigation began last November but stems from a complaint submitted in Germany in January 2019.
  • The Irish regulator found that Airbnb had violated GDPR provisions relating to the principles of data processing, lawfulness, transparency, and the right of access.
  • The DPC reprimanded Airbnb and ordered the company to “revise its internal policies and procedures” but issued no fine.

No fine?

The Irish DPC decided that it would not be “necessary, proportionate, or dissuasive” to issue a fine against Airbnb, the global property rental platform whose 2022 revenues were $8.3 billion, despite finding violations across five GDPR provisions.

What actually happened?

Here’s the background:

  • In 2015, an individual submitted access and erasure requests to Airbnb.
  • In June 2018, the individual logged into his Airbnb account, which he expected to be inactive following the 2015 request. He complained to Airbnb.
  • Airbnb asked for a copy of the individual’s ID to authenticate the request, and he refused to provide it.
  • Eventually, the two parties agreed to authenticate the request via a phone call.
  • When Airbnb provided the results of the request, the individual alleged that certain data was missing and that the erasure process was incomplete.
  • The individual, a German speaker, was also unhappy that Airbnb’s cover letter was written in English.
  • On January 1, 2019, the individual complained to the Berlin data protection authority (DPA).

That was almost four years ago.

Yes, the complaint has taken nearly four years to resolve.

It appears that the Berlin DPA sat on the complaint for over a year before forwarding it to Ireland, where Airbnb has its EU establishment. This long delay is not explained in the DPC’s decision.

After a further 22 months of back and forth between the Irish DPC, the Berlin DPA, the individual, and Airbnb, the Irish DPC commenced its formal investigation in December 2022.

Nearly ten months later, we have the result: A reprimand.

What did the DPC find?

Here are the GDPR violations found by the Irish DPC:

  • Article 5 (1) (c) (data minimization): Airbnb should not have requested ID to authenticate the request when less “data-driven” solutions were available.
  • Article 6 (1) (f) (legitimate interests): “Legitimate interests” was not a valid legal basis for processing the individual’s ID.
  • Article 15 (1) (right of access): Airbnb initially provided the individual with only some of the personal data he requested.
  • Article 12 (1) (transparency): Airbnb did not provide the requested personal data in a “concise, transparent, intelligible, and easily accessible form”.
  • Article 12 (3) (transparency): Airbnb failed to keep the individual updated on the progress of his request.

Along with the reprimand, the DPC ordered Airbnb to use the individual’s first language (rather than just English) in the cover letters it attaches to its data subject rights request responses.

What We’re Reading

Take a look at these three privacy-related reads published this week:
