TechTicker Issue 62: January 2025
Ikigai Law
An award-winning law firm helping innovation-led companies find efficient solutions.
New Year, New Rules! No prizes for guessing what we’re talking about. If you've been following us, you already know we’ve been sprinkling memes here and there in earlier editions about how eagerly we have been waiting for (drumroll) the draft Digital Personal Data Protection Rules, 2025 (Rules). And just like that, with the New Year (and a slightly late Xmas gift from the IT Ministry), we’re now setting sail towards the world of data protection compliance.
It’s been a rollercoaster, and with our tech policy friends and fam, we’re all on this ride together. So, after quickly exchanging a flurry of ‘sad-happy weekend to us!’ messages, followed by a collective sigh (yep, we’ve all been there), we put on our analysis hats and got to work. You can read our preliminary analysis here. Our partners Nehaa and Sreenidhi also spent some time unpacking and talking about the rules – specifically how they impact B2C businesses – and that recording is here.
Amidst the flood of information that has been doing the rounds, we’ve got your back by adding another one to the pile. And so, this edition is laser-focused on the Rules.
We start by going back in time and refreshing the story surrounding the origins of the Rules — from the passing of the Act to the numerous events that delayed their launch over the past year. Then, we get into the weeds: spotlighting hot-cake issues and capturing some of the reactions and ‘wait, what’ moments. We round off by sharing some in-house expert takes on questions that we and the broader ecosystem have been grappling with. In case you have any specific queries for our team, shoot away! You can reach out to us at [email protected].
Even though this newsletter is all. about. data. — we will indulge in a minor detour to discuss a couple of important updates on AI and content moderation.
Let’s rewind it a bit
Back in 2012, visionary Justice A.P. Shah presented a report calling for better protection of individuals’ privacy in India. But it took over a decade — after a Supreme Court ruling recognizing the right to privacy; the Srikrishna committee report exploring potential regulatory approaches; and not one, but several draft bills — for the Digital Personal Data Protection Act, 2023 (DPDP Act) to come to life in August 2023.
And we were told, the rules to operationalize this Act would be rolled out immediately after. But then, something hit the brakes. The hold up? Reportedly, there were industry concerns regarding how to technically operationalize verifiable parental consent. Oh, and the Ministry of Home Affairs had to give their stamp of approval as well. The result? Businesses found themselves stuck and restless — in a waiting game worse than Bengaluru’s traffic.
Finally, after what seemed like an eternity (at least to us), the IT Ministry, doing its best rendition of ‘thank god it’s Friday’, released the Draft Rules for public consultation on January 3, 2025. The deadline to submit comments on the draft is February 18, 2025; so, if you want to get your two cents in, now's the time! (Psst: some have sought an extension of this deadline, which reportedly might just be granted.)
Hot-cake issues
Review, reaction, and commentary
Immediately after the DPDP Rules were released, the media was flooded with Twitter rants and government clarifications.
Government clarifies: Union Minister Ashwini Vaishnaw gave a series of interviews (here, here and here) explaining and clarifying different provisions of the DPDP Rules. These included providing a timeline for the implementation of the Rules, noting that sector-specific restrictions may be imposed on cross-border data flows, and highlighting the use of tokens for verifiable parental consent. The IT Ministry also emphasized that data localization will be limited in its application and clarified that sectoral regulators will retain their authority over cross-border data flows. The IT Ministry is also likely to extend the consultation period for the Rules, with focus groups to be formed on specific aspects of the draft.
Industry reacts: Different sectors have had varied reactions to the DPDP Rules. While the Rules have caused concern in sectors like fintech, which is grappling with a potential increase in the operational costs of compliance, others are waiting to get-set-go on innovating and building new products. Meanwhile, marketers are bracing for a change in engagement strategies: the increased emphasis on data minimization and user consent may upend the existing model of hyper-specific targeted advertisements and push the industry to develop marketing practices beyond sharp targeting.
Civil society views: Some civil society organizations have criticized the rules for being ‘too vague’, citing how terms like “reasonable safeguards”, “appropriate measures”, or “necessary purposes” are used without any clear explanations. A commentator highlighted that there is a lack of operative guidance in the Rules, and they are “incomplete and rushed”. However, other experts have praised the Rules for establishing a forward-looking framework to protect the rights of individuals by granting them greater control over information.
Experts’ corner – Ikigai voices on the DPDP
“Who needs to verify and age-gate? How do you go about doing that? This is a concern that has been raised in earlier iterations as well. Are you moving towards a scenario where you age-gate the entire internet? I don’t think that is the intent of the government. It might be an inadvertent effect. In other parts of the world, we have seen language which says that organisations should have age verification or age-gate mechanisms based on the kind of content that is being accessed and the level of risks involved in the online activity. So, if you are a website for adult content, having a checkbox that says ‘I confirm I am above 18’ might not quite cut it. But if you are a news website, should you really be worried about verifying the age and identity of every single individual who is coming on your platform to consume your content? At least in the letter of the law, we currently don’t see this kind of nuance come through. You do have to do a little bit more so that children below the age of 18 cannot easily access, or the converse, that only those above the age of 18 may have access.”
– Nehaa Chaudhari, Partner (in her podcast with The Economic Times)
“SDFs will also be subject to data localisation requirements based on the recommendations of an executive committee,” said Pallavi Sondhi, Senior Associate at Ikigai Law.
Experts believe this rule is likely to encounter strong pushback from industry giants such as Meta, Amazon, Google, and other organisations which handle sensitive health and financial data.
Rule 14, on the other hand, restricts the flow of Indians’ data to foreign countries. This rule, however, unlike Rule 12, applies to all data fiduciaries. It says that if a company processes personal data in India, or processes it outside India for goods or services offered in India, it must follow certain requirements when it comes to sharing that data with foreign governments.
"In case of cross border data transfer, data fiduciaries will have to comply with certain conditions that the Central Government may prescribe through a separate executive order," said Sondhi. "The conditions will relate to providing data access to foreign governments or persons/entities under its control."
– Pallavi Sondhi, Senior Associate (quoted in The Secretariat)
While DPDP Rules definitely stole the spotlight, there have been other exciting and relevant tech policy updates. Moving on to other news…
Connecting the Dots
Following Meta’s announcement that it is winding down its third-party fact-checking programme, fact-checking organizations in India are uncertain about their future. Eleven organizations in the country currently partner with Meta through its fact-checking network, with some relying on Meta for half their revenue. Interestingly, Union Minister Ashwini Vaishnaw commented on the decision, stating that Meta’s shift in policy validated the government’s approach of setting up fact-check units (FCUs) to deal with misinformation pertaining to the government. These FCUs were conceptualized under the amended Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. They were empowered to identify any content concerning any ‘business of the Central Government’ and flag it as fake, false, or misleading. However, the Bombay High Court declared FCUs to be unconstitutional. You can read more about FCUs and the case in our previous edition here.
That’s all for now!
We’d love to hear your feedback, concerns or issues you’d like us to cover. Or, you could just drop in to say hi. We are available at [email protected].
Signing off, the Ticker team for this edition: Isha, Vidushi, Nirmal, Vijayant