Shared interests in a crisis of fakes
What interests do platforms and regulators share?

2025 sees global attitudes to regulating online platforms drifting apart [N1]. Between the laissez faire of the US and the “legislative faire” of the EU and elsewhere, a chasm is forming. Yet despite these diverging views, there are common interests: both want to tell real people from fake people (bots, AI agents, imposters, etc.).

If you want to protect people from content (and platforms), or protect people (and platforms) from regulations, you arrive at the same destination: you want to know that the people you are protecting are the people you want to protect.

Platforms may vary in their reasons to promote free speech and deregulation, but all share a commercial need to know their customers. Consider the motives and follow the money. People and/or organisations need to pay for platforms to operate. Four typical streams of revenue are:

  • Subscription fees
  • Advertising fees
  • Data auctioning
  • Commission on sales [N3]

On subscription, it could be said that platforms don’t care whether subscribers are real people so long as their money is real. How effective X’s efforts to charge for authenticity are as an anti-bot measure is arguable [01], but they are clearly intended to be pro-revenue. Paying a subscription fee for an account doesn’t mean that the account is linked to a real and unique person, but it does change the economics of feeding an army of synthetic accounts.

The advertising industry cares about people, at least in terms of their propensity to influence, be influenced, and purchase. Targeting ads demands that you understand your audience and how likely they are to buy your product. Until synthetic accounts start buying products (when your AI agent sorts out your travel wardrobe for you), or can be convinced to echo and amplify messages, no organisation will want to waste its advertising dollars on fake people, AI-powered or not. The more fakes on a platform, the less attractive it is to advertisers.

Selling online advertising opportunities demands that platforms know their users and their habits well enough to auction the next advertising opportunity to bidders in real time. The more accurate the data, and the more “pure” it is (about real people), the more valuable it becomes. Auctions and licensing deals also offer this data to other organisations for their own purposes [N5].

Even civic-minded “public good” platforms want to be sure that their participants are real people and that interactions between these real people keep within the platform guidelines. If you want to have a safe place, you have to make sure that it is safe.

Science has platforms to worry about too. The measure of progress in science is the publication, review, and citation of papers. Here there are platforms (journals, publications etc.), and here too there are problems. As a recent Conversation article offered: “Fake papers are contaminating the world’s scientific literature, fueling a corrupt industry and slowing legitimate lifesaving medical research” [03].

Platforms of all types and ethical profiles need to know that people are real people, and know enough about them, to grant them rights to participate. Knowing that people are people is essential, but not sufficient to mitigate all risks and protect all interests. Even when platforms have privacy-preserving, trustworthy ways to prove things about people, they will still have difficult problems to solve. Platforms still need to decide what content is harmful and how much (if anything) needs to be known about people in each jurisdiction that they operate in, as well as their response to things like notice-and-takedown for copyrighted content. And they need to find ways to police their decisions. In real time.

The motives of legislators and regulators should also be considered. Here we offer an observation that the subject of their interest is, ultimately, legal entities: people or organisations who pay taxes, and who can be protected, rewarded or penalised through legislation and regulation. Abstract notions of organisations and products are interesting only insofar as who owns, sells or controls them, and hence who can be the subject of regulation.

It’s not just the whole person and nothing but the whole person. In some cases legislation and other commercial interests mean that platforms care about certain aspects of your person: that you are over or under a certain age, for example. We can see this in recent Australian legislation that bans access to social media for people under 16 years old, taking effect in 2025 [04], and most jurisdictions have age constraints on access to adult content and goods.

We live in an era of mega-platforms that operate globally, have more customers than most jurisdictions have citizens, and generate more revenue than the GDP of most countries. Legislation and regulation of these platforms is… “challenging”. Regulators will need to decide where and what lines to draw, and what rules to apply within their jurisdiction for different platforms and participants. Clarity will be needed on who they need to protect, and from what, when and where; and who they need to enable, for what, when and where. Other challenges include the issues of liability and penalties, and, of course, fees and taxes.

And of course we shouldn’t forget that the subject of their shared interest, people, is the one that we care about most - at least those of us who are people. Here we need to balance protection and enablement, to consider how much, and how little, needs to be known and shared. There are some groups for whom anonymity online is of vital interest - journalists and political advocates, for example. For them (and for society) it is important that the platforms don’t know who they are. It is important for platforms to know that an account holder/user is a real person/human, but not to know more about them than is necessary and agreed with them. Transparency and trustworthy behaviour are essential qualities for any sustainable relationship.

Despite their differences, both platforms and regulators share an interest in people. Approached the right way, that shared interest should be an opportunity for a win-win-win, constructive collaboration between platforms, regulators and people, to the benefit of all.

Notes:

[N1] This paper will use “platform” broadly to cover social media, gaming, retail, auction, adult content, etc: digital online services that support communities of users. Cheekily we might say that “platform” refers to an “interactive computer service”, as in Section 230.

[N2] To avoid confusion about the context of this article: both viewpoints have merit. Communication can contain harmful content, target the most vulnerable, and have terrible consequences. Censorship can deny rights, suppress truths and chill cultural development. Absolute views on either position fail when confronted with reality.

[N3] The EU DSA requires that marketplaces on regulated platforms provide identity verification (IDV) to safeguard the interests of buyers.

[N4] One of the benefits of digital advertising is that it makes it possible to link the specific advert instance to the actor and their action. You get to know which half of your advertising budget was wasted.

[N5] There are many legitimate concerns about such deals; however, some purposes may be less nefarious. For example, platform data offers very large-scale, rich data sets for sociologists and other scientific researchers - at least those who can navigate the ethical minefield [02].

Sources:

[01] Haman, M., & Školník, M. (2023). The unverified era: politicians’ Twitter verification post-Musk acquisition. Journal of Information Technology & Politics, 1–5. https://doi.org/10.1080/19331681.2023.2293868

[02] Golder, S., Ahmed, S., Norman, G., & Booth, A. (2017). Attitudes Toward the Ethics of Research Using Social Media: A Systematic Review. Journal of Medical Internet Research, 19(6), e195. https://www.jmir.org/2017/6/e195. DOI: 10.2196/jmir.7082

[03] https://theconversation.com/fake-papers-are-contaminating-the-worlds-scientific-literature-fueling-a-corrupt-industry-and-slowing-legitimate-lifesaving-medical-research-246224, accessed 17 Feb 2025.

[04] https://parlinfo.aph.gov.au/parlInfo/download/legislation/bills/r7284_aspassed/toc_pdf/24150b01.pdf;fileType=application%2Fpdf, accessed 17 Feb 2025.

Jo Spencer

Sezoo Co-Founder | Digital Trust | SSI | Payments | Banking | Consultant | Technical Architect | Musician

1 week

Love this article, John Phillips! We need to keep People as the primary focus, and help Platforms and Regulators to do the same. In working this through we had magic discussions about so many aligned topics. It's not always simple, but should be. More for the near future!

Excellent article. Thank you!

Richard Oliphant

Independent Legal Consultant for Docusign, Adobe, HM Land Registry, Digidentity, OneID, ShareRing, Ascertia, CSC, IoM Govt Digital Agency, Scrive #eidas #esignature #digitalidentity #blockchain #aml #ageverification

1 week

John - that was a good and thought-provoking read. And it’s a good illustration of our multi-polar world.

John Phillips

Digital Trust | Emerging Technology | Innovation | Education

1 week

A downloadable PDF of this article is available on the Sezoo website here: https://www.sezoo.digital/resources/shared-interests-in-a-crisis-of-fakes/ CC BY SA 4.0

Campbell Cowie

Public policy | Regulation | International standards | Digital identity | Coach

1 week

A well put together article and it’s helpful to have the alignment of interests set out like this. I am wary of calls for identity verification to be mandatory on social media (for the reason that anonymity is critical for some groups, as you capture), but for those who place a value in knowing that the source of content is a real person and the real person they believe it to be the technology is there. On reading your article I was reminded of the push against bots from Elon Musk. Pro revenue and, as you explain, pro people. https://www.forbes.com/sites/antoniopequenoiv/2024/04/04/musks-x-says-its-purging-bots-heres-how-the-platform-has-struggled-to-squash-its-bot-problem/
