Are you ready for the global regulatory crackdown in 2025?

“Tomorrow belongs to those who can hear it coming” - David Bowie

The New Year invariably ushers in a wave of tech predictions and speculative forecasts. What will be the next big buzzword or gadget? How will our digital interactions evolve? Which groundbreaking innovation or market trend will take centre stage? These are all intriguing questions. But let's begin 2025 by focusing on a tech certainty: regulation.

Whether we welcome it or not, 2025 will see a significant rise in online regulation. Moreover, it's clear that a diverse regulatory landscape – particularly given a likely change in US tech regulatory strategy – will persist globally. International organisations will continue to grapple with the challenge of operating across differing approaches. This latest ByteWise Insights article explores the specific technology regulations that will materialise this year. Fuelled by growing concerns around Artificial Intelligence (AI), data protection and privacy, online safety, and consumer protection, regulatory activity will increase substantially. Organisations need to be aware of the evolving regulatory landscape and adapt their practices accordingly. Are you prepared?

Building a safer and more trusted digital world

Online safety is top of the regulatory list. Policymakers and regulators around the world will intensify efforts to combat online harms, including misinformation and disinformation (particularly following Meta's decision to end its third-party fact-checking programme), hate speech, and cyberbullying.

The UK is gearing up for a significant year with the implementation of its Online Safety Act (OSA). This landmark legislation will hold a wide range of online service providers accountable for user-generated content, with a particular focus on protecting children. The law covers search engines, pornography websites, social media platforms, video-sharing services, mobile gaming services and many other 'user-to-user' services. 2025 will be the year of 'age assurance', and Ofcom has already made announcements on what it considers to be 'highly effective' approaches (e.g., facial age estimation and digital identity services). Despite criticism from the likes of former 10 Downing Street policy adviser Rohan Silva (N.B., Times subscription required), the European Union (EU) will be pressing forward with the implementation of its Digital Services Act (DSA). Like the UK OSA, this legislation focuses on content moderation, tackling misinformation, and protecting people's rights, particularly for large online platforms and intermediaries such as Facebook, Instagram, AliExpress, TikTok, and Booking.com.

The EU and UK's efforts are inspiring a global trend, with many countries developing their own online safety frameworks. Australia, for instance, is reviewing its Online Safety Act, seeking to align it with the EU and UK models. One key point of discussion is raising the minimum age for social media use to 16, a development also being explored in Singapore and Indonesia. Again, effective age assurance will be key.

A fairer digital world?

The European Commission (EC) is currently developing a Digital Fairness Act, a step towards modernising consumer protection laws in the digital age. We are unlikely to see the final legislation in 2025, but its development signifies a crucial shift in how online businesses are regulated. The proposals aim to address the evolving landscape of technology and business models, focusing on so-called 'problematic practices' that potentially exploit or unfairly disadvantage consumers (e.g., deceptive and addictive design, discriminatory personalised pricing and advertising, digital subscriptions and more - see a more detailed overview here). The move towards a Digital Fairness Act in 2025 represents a proactive approach by the EC to ensuring a fair and equitable digital marketplace. By addressing these 'problematic practices', the EC seeks to empower consumers, foster fairer competition, and promote a more trustworthy online environment. But for some, this might be another example of the EU over-regulating.

Moving towards a privacy-first world

The EU's General Data Protection Regulation (GDPR) has fundamentally shifted the landscape of data privacy, establishing a consent-based regime that is driving a privacy-first approach to the digital world. This shift is particularly evident in the advertising sector, where reliance on third-party cookies is diminishing. In response, innovative solutions such as the use of first-party data, Universal IDs and other consent-based or privacy-enhancing technologies are emerging, fostering the development of more transparent and user-centric business models.

However, significant challenges remain in achieving consistent and effective compliance with the GDPR. Despite the efforts of the European Data Protection Board (EDPB), established under the GDPR to ensure consistent application and enforcement of data protection law across the EU, harmonisation remains a complex and often impractical undertaking. Take, for example, last year's EDPB decision to strike down the 'Consent or Pay' (CoP) approach. This model, used by many news and content publishers, offers people a choice: either consent to targeted advertising based on their browsing activity or behaviours ('behavioural advertising'), generating revenue that helps fund the content, or pay a fee to access the content without this type of advertising. The EDPB ruled that this model may not always meet the requirements for valid consent under the GDPR. Specifically, it expressed concerns that such an arrangement could create undue pressure on users, potentially undermining the principle of freely given consent. The decision highlights the delicate balance between protecting user privacy and enabling sustainable business models for online content providers, and it underscores the need for ongoing dialogue and collaboration between regulators, businesses, and civil society groups to find solutions that respect user privacy while supporting a vibrant digital economy.

The UK's Information Commissioner's Office (ICO) is adopting a more practical approach, as demonstrated by its 'pay or consent' impact assessment. This allows organisations to proceed, provided they consider four key factors when determining the validity of consent: power imbalances, appropriate fees, equivalence, and privacy by design. The ICO will also prioritise online tracking, focusing on compliance with data protection law and assisting publishers in implementing more privacy-friendly advertising strategies.

The UK's Data Use and Access Bill (DUA) remains under parliamentary debate. Anticipated to become law in 2025, this post-Brexit legislation aims to reform the UK's data protection framework while ensuring the country retains its 'adequacy' status – up for renewal in 2025 – for data flows from the EU, thereby facilitating trade. Meanwhile, Australia has updated its Privacy Act to align with global standards and address the challenges of the digital era. In the US, a comprehensive federal privacy law is unlikely to be enacted in 2025. Instead, we'll see more state-level regulations and an assertive stance from the Federal Trade Commission (FTC) in enforcing consumer privacy. The FTC has already sought to tighten the consent rules on behavioural advertising when targeting children under the age of 13.

Framing the rules for the AI game

My recent article on the increasing scrutiny of AI systems in late 2024 highlighted the lack of global leadership, resulting in a growing divergence in market approaches. This trend continued in early January with South Korea adopting new AI regulations. The EU's AI Act remains a landmark piece of legislation, focusing on high-risk AI systems, including bans on certain uses and stringent oversight for others. This focus will intensify in 2025, particularly in critical sectors like healthcare and finance.

Since publishing my article in early December 2024, two significant regulatory developments have emerged which will influence the rules of the AI game:

  • The EDPB issued an opinion on the GDPR's application to AI systems. This guidance clarifies when and how AI models can be considered anonymous across the EU (N.B., the model should not be able to identify a person directly or indirectly), a crucial aspect for compliance. It also provides a framework for using 'legitimate interest' as a legal basis for AI development, emphasising a three-step process within GDPR impact assessments.
  • The UK Government launched a consultation on AI and copyright in the creative industries. This initiative aims to balance the rights of creators with the growth and development of AI technologies, a key economic growth driver for the UK Government, by exploring mechanisms for content remuneration and control.

And some good news: a more competitive digital landscape?

Tech giants are facing increased scrutiny over their market dominance and potential anti-competitive practices. Enforcement of new regulations will promote fairer competition and help prevent monopolies. The EU's Digital Markets Act (DMA) and the UK's new digital competition regime are leading examples. The UK Competition and Markets Authority (CMA) has already initiated an investigation into Google's market dominance in search and search advertising, examining whether it holds 'Strategic Market Status' (SMS). Similar competition regime reviews are underway in other jurisdictions, such as Australia. These actions signal a global shift towards stricter oversight of tech companies and a commitment to creating a more level playing field in the digital marketplace.

Global regulatory differences will increase, and the battle lines are being drawn

The digital landscape is undergoing a period of rapid regulatory evolution in 2025. We are witnessing a surge in regulations, yet these vary significantly across global regions. This emerging patchwork of rules risks distracting regulators from their core goals. Some advocate for stricter, more swiftly implemented rules, while others champion a laissez-faire approach prioritising 'free speech'. The new Trump administration in the US is expected to mark a major shift in tech regulation, creating a contrast with the approaches of many other jurisdictions. This will create a challenging and uncertain environment for organisations, especially as society increasingly demands access to trustworthy and independent news.

It's crucial to recognise that safeguarding children from harmful or illegal content does not inherently equate to censorship. Similarly, prioritising brand safety and suitability for audiences does not necessarily infringe on freedom of expression. While cultural differences necessitate tailored approaches, the fundamental goal remains clear: people expect a safer and more secure digital environment in their daily lives.

But tomorrow really does belong to those who can hear it coming…

Some industry commentators may dismiss these increasing regulations as toothless due to a lack of enforcement. This criticism holds some merit. However, I contend that 2025 will witness a marked increase in regulatory enforcement. Big Tech companies will likely be the first to face scrutiny. Those – including investors – who believe that operating in regulatory ambiguity provides an advantage, as it did 15 years ago, may come to regret their complacency. The future belongs to those who anticipate and adapt to the evolving regulatory landscape.


A version of this article was first published in New Digital Age.

Nick Stringer, a prominent global technology, public policy, and regulatory affairs adviser, has contributed significantly to the international application of brand safety standards. He is CEO of Flux Digital Policy, a strategic technology advisory, public policy, and public affairs consultancy, Managing Director of Mobile Games Intelligence - the global voice of the mobile gaming industry with policy-makers and regulators - and interim CEO of the International Social Games Association. His extensive experience includes serving as the former Director of Regulatory Affairs at the UK Internet Advertising Bureau (IAB UK).

Follow him for all his ‘ByteWise Insights’ on LinkedIn, X, Medium, Threads, Substack or BlueSky.
