Facebook's separation of church and state
Satyarth Priyedarshi
Chief eCommerce Officer, Redington Limited || ex- Google, Flipkart, Jio, Tata || 5 TEDx Talks || LinkedIn Top Voice ‘19 || LinkedIn Power Profile ‘18 || Study Board Member at Universities || Views Personal
Ever since the Cambridge Analytica scandal broke, Facebook hasn't had a moment's rest or been out of the news for the wrong reasons. Since then it has been fined in the EU and is currently negotiating with the FTC to avoid another fine over user privacy.
As the election-meddling news kept coming in last year (2018), Mark Zuckerberg floated a very interesting idea: a sort of "Supreme Court" for moderating content. You see, Facebook faces a unique challenge due to its global stature. What might be wrong in one country might not be wrong in another.
Content is a representation of the societal zeitgeist, so what counts as right or wrong moves with social norms, while Facebook is a global system. Cultures and norms are protected by borders, but content can float anywhere in the borderless world of the internet.
This, combined with the recent faux pas, has made Zuckerberg accept that there is no way the content on Facebook can be moderated by one company. The content is generated by people, and if he wants to keep his profits, it will have to be moderated by a system as big as the people.
What is new
The idea floated last year moved forward a little last week. Zuckerberg has suggested appointing around 40 people to a global board, who will be the final authority on whether a piece of content should be banned or not.
Facebook is creating a separate committee to look into all aspects of content removal. It will have 40 people serving terms of 3-6 years, and they will be able to nominate other people to take their place when they retire. So, in a way, the system will be autonomous and independent.
Right now, Facebook employs some 15,000 people who follow an extensive rulebook to decide whether content is obscene, dangerous, malicious, or protected free speech. Even then, many slips happen. Now there will be a committee, partly funded by Facebook but independent of its business, looking into content quality.
Yep, it's a censor board of sorts.
Facebook's approach seems to be that content quality is something that should be in people's hands rather than a corporation's, and that content removal should be done for the right reasons.
Why go and create such a structure? I believe here is what is really happening.
Protection from lawsuits
Once a committee is authorised to censor material based on policy interpretations, you can take it that a lot of borderline content will fall into the net. In today's world there is no marketing without shock, and if the censors take away the power of the shock, then Facebook becomes about as interesting as television. There are bound to be lawsuits challenging these decisions.
This is a major reason why no platform company has ever attempted to proactively censor, or take responsibility for, the content on its platform. It's called the "safe harbour" defence.
If Facebook as a corporation had taken the decision to censor content, then the responsibility for that content and any underlying damage would have rested with Facebook. In the present scenario, Facebook doesn't censor anything based on subjectivity. It takes action only when an individual raises a complaint about a violation of its terms of service in some shape or form. Facebook has a responsibility to address user concerns, and under this protection it can remove content.
If Facebook were to proactively start taking action, it would become liable for negligence whenever dubious content slipped through. So this solution of creating a separate body with the mandate and the autonomy protects it from lawsuits.
Protection of share price from volatility
Every time a scandal around content or privacy breaks, FB's share price dips a little. This constant upheaval is not good for the company in the long run. The share price moves because it is perceived that FB will face some ramification, a fine, or an erosion of users' trust after a breach.
If the content agency is a body separate from Facebook, and Facebook is able to establish that separation in people's perception, it will stop being seen as the guilty party and will be able to protect its image and share price better.
Having such an agency in place indemnifies Facebook to some degree, and that is a very crucial protection. It can keep focusing on tech and leave all the ruckus this content business creates out of its earnings calls and the newspapers.
-------------------------------------------------------------
Disclaimer: The views expressed here are my own and don't represent the views of any of my employers, present or past.
Satyarth