Why we should all be working on our DSA readiness
Maximilian Bleyleben
Founder whisperer | investor | scale-up operator | board director | privacy professional | AI policy wonk
Don't think you need content moderation tech? Think again.
[Reposting as an article here for greater reach. For timely access to my writings please consider subscribing on Substack]
The week before Christmas was an interesting one in EU digital regulation, if only because of the made-for-clickbait announcement from the European Commission that three companies have been added to the list of designated Very Large Platforms (VLOPs): Xvideos, Pornhub and Stripchat.
That brings to 19 the number of companies that will face the most stringent requirements to police content under the far-reaching Digital Services Act (DSA). Not only will they have to report their user numbers, but they will also have to assess the systemic risks they create (especially regarding children), allow access to independent auditors and external researchers, and offer at least one recommender-system option that is not based on user profiling.[1]
The EU has been both commended and criticised for drawing a distinction between the Goliaths (most of which are US-based) and everyone else. This approach was a direct response to findings that GDPR had disproportionately impacted smaller companies and, in effect, made it harder for them to compete.
But with all the focus on VLOPs, we seem to have forgotten that most digital services of any size will face critical new obligations from 17 February this year. Especially around moderation of user-generated content (UGC).
(The following is most definitely not legal advice.)
Under the DSA’s rules, if there is any UGC on your platform—whether it’s comments or reviews, chat messages, content in files people are exchanging or posting, live voice communications, 3D creations—you have to implement means of detecting, flagging and removing illegal content.[2] If you are a marketplace for products, you must have a process to identify and remove illegal goods, including counterfeits.
This could prove a real headache for many growth-stage and midmarket companies that have users or sell products in the EU. While a lot of the technical components for content moderation and user-reporting workflows exist, they still need to be cobbled together in a way that covers all the DSA requirements. At a minimum, and appropriately localised to the 27 EU member states, you'll need a way for users to report illegal content, a review-and-removal workflow, a statement of reasons sent to affected users, and an internal complaint-handling (appeals) process.
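To make the workflow concrete, here is a minimal sketch of that notice-and-action lifecycle, written in Python purely for illustration. Every class, field and function name below is hypothetical; a real implementation would also need audit logging, handling deadlines and transparency reporting on top.

```python
# Illustrative sketch of a DSA-style notice-and-action flow. Not a compliance implementation.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
from typing import Optional

class ReportStatus(Enum):
    RECEIVED = "received"
    UNDER_REVIEW = "under_review"
    CONTENT_REMOVED = "content_removed"
    REPORT_REJECTED = "report_rejected"
    APPEALED = "appealed"

@dataclass
class IllegalContentReport:
    content_id: str
    reporter_contact: str          # notices should let the reporter identify themselves
    explanation: str               # why the reporter believes the content is illegal
    jurisdiction: str              # member state whose law is allegedly infringed, e.g. "DE"
    status: ReportStatus = ReportStatus.RECEIVED
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    statement_of_reasons: Optional[str] = None   # owed to the affected user on any decision

def decide_report(report: IllegalContentReport, is_illegal: bool, reasons: str) -> IllegalContentReport:
    """Record a moderation decision and the statement of reasons sent to the uploader."""
    report.status = ReportStatus.CONTENT_REMOVED if is_illegal else ReportStatus.REPORT_REJECTED
    report.statement_of_reasons = reasons
    return report

def open_appeal(report: IllegalContentReport) -> IllegalContentReport:
    """Internal complaint handling: either party can contest the decision."""
    report.status = ReportStatus.APPEALED
    return report
```

The data model is rarely the hard part; wiring something like this into your existing admin tooling, moderation queues and record-keeping, so that every decision and appeal is traceable, is where the effort goes.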
That’s the technology & process bit. Of course you will first have to come up with a content policy that both satisfies the DSA definition[3] and matches the context of your UGC.
And that’s not all. You will also have to demonstrate that your privacy and security mechanisms were designed to protect minors specifically (including not serving them profile-based ads[4]), and that your interfaces are not deceptive and do not use ‘dark patterns’.
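On the minors point specifically, one way to picture the obligation (a sketch only; the age signal, confidence threshold and function names are my assumptions, not anything the DSA prescribes) is as a hard gate in the ad-serving path that falls back to contextual ads whenever the service has reason to believe the user is a minor:

```python
# Illustrative gate on profile-based ad targeting for (possible) minors.
from typing import List, Optional

ADULT_AGE = 18  # assumption: treat under-18s as minors for ad-profiling purposes

def may_serve_profiled_ads(declared_age: Optional[int], minor_signal_confidence: float) -> bool:
    """Return False whenever the service is 'aware with reasonable certainty' the user is a minor."""
    if declared_age is not None and declared_age < ADULT_AGE:
        return False
    # Hypothetical internal signal (e.g. from age estimation or account metadata).
    if minor_signal_confidence >= 0.8:  # threshold is an assumption; tune to your own risk posture
        return False
    return True

def select_ad(interests: List[str], declared_age: Optional[int], minor_signal_confidence: float) -> str:
    if may_serve_profiled_ads(declared_age, minor_signal_confidence):
        return f"profiled ad based on interests {interests}"
    return "contextual ad (no personal-data profiling)"

print(select_ad(["gaming"], declared_age=15, minor_signal_confidence=0.1))
```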
Today the market for tools and services that can help with content moderation is fragmented. There are plenty of vendors jumping on the double bandwagon of this new regulation and the technology-du-jour (AI). It can be hard to distinguish between those that provide services (based on humans and/or AI) and those that provide tools and components for you to build your own solution (which will likely also require some human moderators). In addition to technology, you’ll need to appoint someone who owns the policy and can evolve it, and to continuously manage a set of principles to help adjudicate disputes.
Finally, content moderation is uniquely complex in that it can be very specific to your service (eg, what is considered a threatening comment in a social community may not be a reportable offence in an adversarial video game chat). At the same time, every company needs to make use of generalised content moderation approaches (eg, how to identify and report Child Sexual Abuse Material, or CSAM). Getting the benefit of the best standards in the industry while optimising for your own service can be hard.
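One common way to reconcile the two is a layered pipeline: generalised, industry-wide checks run first, and service-specific policy rules decide what goes to human review. The sketch below uses hypothetical names, and the placeholder hashing stands in for the vetted, perceptual hash-matching (PhotoDNA, PDQ and similar) that real deployments rely on:

```python
# Hypothetical layered moderation pipeline: industry-wide checks first, service-specific rules second.
import hashlib
from typing import Callable, List, Set

# Placeholder for hashes of known illegal material shared via industry programmes.
# Real matching uses perceptual hashes; SHA-256 here is purely illustrative.
KNOWN_ILLEGAL_HASHES: Set[str] = set()

def generic_check(content: bytes) -> bool:
    """Generalised layer: match against an industry-wide list of known illegal material."""
    return hashlib.sha256(content).hexdigest() in KNOWN_ILLEGAL_HASHES

def service_policy_check(text: str, rules: List[Callable[[str], bool]]) -> bool:
    """Service-specific layer: rules tuned to your own context (community forum vs. game chat)."""
    return any(rule(text) for rule in rules)

def moderate(content: bytes, text: str, rules: List[Callable[[str], bool]]) -> str:
    if generic_check(content):
        return "block_and_escalate"      # industry-standard handling, e.g. reporting to authorities
    if service_policy_check(text, rules):
        return "queue_for_human_review"  # context-dependent calls stay with human moderators
    return "allow"

# A rule that matters in a social community but might be routine trash talk in a game lobby.
threat_rule = lambda t: "i will find you" in t.lower()
print(moderate(b"...", "I will find you after the match", [threat_rule]))
```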
Look out for lots of innovation and company pivots among content moderation solution providers as they try to address the DSA compliance challenge for the midmarket.
[This article first appeared on my Substack. If you like it please share, and consider subscribing.]
[1] In fact, recommender systems (ie, the personalisation algorithms that power your feed) are under attack from all sides in Europe, which will be interesting to watch given that they are by far the most effective driver of user growth; see TikTok's astonishing growth.
[2] The DSA does not create a new definition for what content is illegal – it simply points at existing EU and member state laws. Broadly, illegal content includes anything that incites terrorism, depicts the sexual exploitation of children, incites racism or xenophobia, infringes intellectual property rights or is considered disinformation. But there are also country-specific restrictions to be aware of, such as the prohibition on depicting Nazi symbols in Germany, or more stringent restrictions on racist content in France.
[3] Note that “where a content is illegal only in a given Member State, as a general rule it should only be removed in the territory where it is illegal.” (Questions and Answers: Digital Services Act).
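In principle, honouring that rule can be as simple as tracking, per item, the member states in which it has been found illegal and filtering visibility by the viewer's country. The sketch below is purely illustrative, and the country codes and data store are assumptions:

```python
# Illustrative geo-scoped removal: withhold content only where it has been found illegal.
from collections import defaultdict
from typing import DefaultDict, Set

# content_id -> member states where the item must be withheld
geo_blocks: DefaultDict[str, Set[str]] = defaultdict(set)

def apply_national_removal(content_id: str, member_state: str) -> None:
    """Record that an item is illegal under one member state's law (e.g. 'DE')."""
    geo_blocks[content_id].add(member_state)

def is_visible(content_id: str, viewer_country: str) -> bool:
    return viewer_country not in geo_blocks[content_id]

apply_national_removal("post-123", "DE")
assert is_visible("post-123", "FR") and not is_visible("post-123", "DE")
```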
[4] The DSA draws a very hard line here, directly barring digital ads based on profiling by using personal data of users “when [operators] are aware with reasonable certainty that the recipient of the service is a minor” (Article 28). This puts into much clearer language what had been implied until now by GDPR’s Recital 71 restriction on automated decision-making via profiling. Note that while the targeted advertising ban applies to any company in scope, VLOPs and VLOSEs face additional obligations to mitigate risks including “targeted measures to protect the rights of the child, including age verification and parental control tools.”