Are social media companies responsible for content?
Darryl Grauman
Technologist, Strategist, Board Member, Speaker, Author, Restaurateur, Advisor, Investor, Coach, Biohacker, Fitness fiend and Thought Leader. Taking Kiwi ingenuity global.
Over the last week I've been reading a lot of media coverage of the "Christchurch Call", the government and social media conference held in France, and the resulting multi-country, multi-platform non-binding agreement. There's been significant debate, much media attention and many opinions. The discussion has become emotional instead of factual, and I believe it should come back to basic principles. The basic principle is this: should social media companies be regarded as platforms, as publishers, or as both?
I'll use Facebook as an example, but the same is true for Twitter, YouTube, WeChat or any other social media organisation.
Consider this: your telephone company provides you with a telephone line; that's "the platform". We've never held a telephone company responsible for the words spoken across that line. This is why we have a concept called "Lawful Intercept": when there is sufficient legal evidence before law enforcement authorities and the courts, an order can be granted to "wiretap" a telephone line. The telco, under these conditions, is obliged to allow and facilitate this legally sanctioned access to private information. It is never liable for the content.
Equate this to Facebook. If Facebook is simply a platform, like a phone line, how is it that we are now making them responsible for the content delivered via that platform? Is this fair and reasonable?
What happens if we instead regard Facebook as a publisher? Is every person using the platform now an author? This construct would be very different from that of other publishers, whose authors are usually employees under their direct control or otherwise under specific contract to the publisher. In Facebook's case, each of its millions of users is a contributor, and thus an author over whom Facebook has no direct control beyond a click-through terms of use. And, as Facebook is a private company, what organisation or governing body has the authority to interfere in the standards Facebook uses to curate and publish content, and in which jurisdiction? Given ubiquitous access to this content across hundreds of sovereign countries, to whose legal standards of free speech and censorship should Facebook be held? And what would the mechanics of this look like?
Imagine you are Facebook sitting in the middle, you thought you were a platform, now you're being considered a publisher and you have to reconcile your policies to fit across the policies of the USA, China, Iran, Russia, Indonesia and hundreds of countries with different interpretations and degrees of free (or not so free) speech. How do you react? How do you create a single consolidated one-size-fits-all policy? How do you technically and operationally manage this?
Let's bring it closer to home. The despicable manifesto of the Christchurch shooter has been banned in New Zealand and possession of it has been criminalised. This is based on the NZ government's call not to give this horrendous person a voice. But other countries haven't criminalised the manifesto, based on an equally valid view that the right to read the content offers an insight into a dark soul and can be learned from in order to prevent more heinous crimes like this in the future. I've read a compelling argument that if writings from the Holocaust were banned and the names of those who committed atrocities were never spoken, how could we remember, educate and prevent something like that from ever occurring again? (As an aside, Hitler's Mein Kampf is still freely available in New Zealand.)
In this case the NZ government is effectively asking Facebook, the platform, to control to whom it publishes this document and who can possess it. Technically, operationally, logistically and financially, imagine the resources required to conform to the demands of every country, organisation and professional body making similar demands. Could any platform company conform to all of them without going insane, or insolvent?
Let's also look at the "Christchurch Call" pledge to actively manage live online content so that another livestream like the Christchurch shooter's can't be published. A noble cause, absolutely. But imagine tens of thousands of people uploading live content simultaneously: maybe they're streaming themselves playing a first-person shooter video game, maybe it's a combat shooting competition, perhaps it's movie content. Once more, technically, operationally and financially, could a platform provider absolutely comply with this noble cause in real time (and still turn a profit)?
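To make the scale problem concrete, here is a rough back-of-envelope sketch of what purely human review of live streams might cost. Every number in it is an assumption invented for illustration; none comes from Facebook, the Christchurch Call or any published figure:

```python
# Illustrative back-of-envelope estimate of real-time livestream review cost.
# All figures below are hypothetical assumptions, not real platform data.

concurrent_streams = 50_000   # assumed simultaneous live uploads at peak
streams_per_moderator = 4     # assumed streams one human can watch at once
hours_per_day = 24
shift_hours = 8
hourly_wage = 25.0            # assumed fully loaded cost per moderator-hour (USD)

moderators_on_duty = concurrent_streams / streams_per_moderator
shifts_per_day = hours_per_day / shift_hours
headcount = moderators_on_duty * shifts_per_day
annual_cost = moderators_on_duty * hours_per_day * 365 * hourly_wage

print(f"Moderators on duty at any moment: {moderators_on_duty:,.0f}")   # 12,500
print(f"Total headcount across shifts:    {headcount:,.0f}")            # 37,500
print(f"Annual labour cost estimate:      ${annual_cost:,.0f}")         # $2,737,500,000
```

Even with these deliberately modest assumptions the sketch lands in the billions of dollars per year, which is the point of the question above: "absolute" real-time compliance is a scale problem before it is a policy one.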
So here's a hypothesis. What would happen if Facebook were treated purely as a platform, with the onus on them to provide a form of lawful intercept capability within sovereign geographic confines, and to provide tools for governments, censors and lawmakers to curate content accessibility for their own regions? This would effectively shift the burden to each country to act as "content police", hopefully in line with its national censorship rules and regulations. It would move the governance responsibility for content (and the cost of that governance) into the public arena, which, in most democratic countries, would hopefully be transparent to the public. Yes, some non-democratic countries may be more restrictive in the content they allow, but perhaps even those rule-sets could become globally transparent?
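As a thought experiment only, the arrangement proposed above could be sketched as a per-country rule-set that the platform consults at delivery time. The country codes, content identifiers and rules below are hypothetical placeholders, not real policy data:

```python
# Hypothetical sketch: each sovereign country publishes its own rule-set and
# the platform checks the viewer's local rules before serving content.
# All country codes, content IDs and rules here are invented for illustration.

BANNED_BY_COUNTRY = {
    "NZ": {"christchurch-manifesto"},       # criminalised in New Zealand
    "XX": {"example-restricted-document"},  # placeholder for any other regime
}

def visible_to(viewer_country: str, content_id: str) -> bool:
    """Return True if the viewer's national rule-set permits this content."""
    return content_id not in BANNED_BY_COUNTRY.get(viewer_country, set())

print(visible_to("NZ", "christchurch-manifesto"))  # False: blocked by NZ rules
print(visible_to("US", "christchurch-manifesto"))  # True: no rule in this table
```

Publishing each country's entries in such a table openly is what the "globally transparent rule-set" speculation above would amount to: the platform enforces, but each country owns and answers for its own list.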
This rambling is meant to try to simplify the issues, contextualise them in fact (not emotion) and provoke thought, not to criticise or promote any single position. The debate will probably continue for a long time.
Thank you for sharing!
Head of Business Strategy and Architecture at Auckland Council
5y
I agree with Sanjay: link the costs of safety to the revenue stream, particularly when we are really interested in controlling what breaches the basic standards of human decency. All countries share those. Why should a nation's taxpayers have to subsidise Facebook?
Digital Transformation | Emerging Technologies | Business Development | Client Management | Delivery Governance | Business Unit Leadership
5y
Good one Darryl. FB is a smart enterprise following the money, developing sophisticated content-oriented products to get more eyeballs for longer, for more ad dollars. There is a case for FB having some obligation to incur reasonable costs to minimise the adverse unintended public harm that results. FB opening up their platform for country-specific initiatives by governments to suit their regulations is a good idea, though.
Senior Business Analyst - Product Management & Engineering PMO
5y
A good, logical read, Darryl. Thank you for sharing. I wonder if any politicians will come across this article and have a good read too. I will actually send it to my local MP.
Microsoft Technical Specialist Security | Enabling Productivity through Security
5y
It would be interesting to hear your thoughts on a website that encouraged certain content to drive revenue, rather than providing a "free" forum for expression and deriving indirect revenue from targeted advertising. That may be a test to define the relationship of contributors to the platform/publisher. Facebook = platform, KDC = publisher?