Should social media platforms be regulated? - Perspectives on Zuckerberg's testimony to US Congress
For the first time ever, I have been glued to my laptop watching the testimony of Mark Zuckerberg in front of the US Congress. The questioning was relentless: some members of Congress threw Zuck softballs that could be easily batted away, while others pinned him down with specific yes/no questions that were very difficult to wriggle out of.
For those of you who didn't watch these marathon 4-5 hour long sessions, let me summarise some of the key points for you and share why I think the decision to regulate companies like Facebook is a very easy one.
Issues with content on Facebook
1) Spreading of terrorist propaganda
2) Selling of opioids, which has led to thousands of American deaths
3) Selling of animal products, such as ivory from elephants, contributing to their extinction
4) Spreading of fake news and other rogue political messages (e.g. Russian troll farms)
5) Incorrectly leaving up personally damaging posts, or taking down content related to certain faiths, ethnic groups and more
Issues with privacy
6) Users not knowing how their information is being shared or what companies have access to it
7) Too many privacy settings, not all in one place
8) Terms and conditions that people do not understand or bother reading
9) People who don't use Facebook still have their data captured by the Facebook Pixel
Issues for children
10) Children over the age of 13 are treated the same as adults, even though they are still impressionable
11) Children are becoming addicted to social media
Issues with management
12) Zuckerberg may have promoted a reckless pursuit of user and data acquisition
13) Facebook seemingly did not treat privacy and security as a top item for management attention
14) Zuckerberg and his team seem not to have had the management information needed to see how these issues were becoming serious problems
If I look at all of these issues, the answer to me is very clear: yes, Facebook and other companies like it should be subject to regulation. The scope of this regulation is of course up for debate, and its application should be balanced.
Since starting DynaRisk, it has been continually reinforced to me that the simplest explanation or solution is usually the best. So here are some basic truths:
People do not like to do things they don't see the value in.
Very few people read terms and conditions pages.
Very few people seek out security and privacy settings unless they have a reason to or are forced to.
The Bystander Effect may be kicking in on social media, resulting in too few people reporting inappropriate content unless it is truly shocking.
Companies are apprehensive about regulation unless it directly benefits them.
Regulation can constrain the flexibility, and thus the growth, of companies, which reduces profitability and returns to shareholders.
Companies, especially large ones, have vast resources to throw at influencing government policy to benefit their businesses, or at least to reduce the impact on them.
We like things that are bad for us:
75% of Americans are projected to be overweight by 2020 (OECD).
19 per cent of adults in the UK smoked in 2014, down from a peak of 46 per cent in 1974 (NHS).
When you combine these areas, you get a recipe for disaster; this is why government needs to step in. When left on our own, we are at the mercy of companies in pursuit of profit, and consumers are simply not strong enough to resist them.
In Zuck's testimony to Congress, he said over and over again that everyone has a choice about how they share data on Facebook. The problem is that the average user:
- might not realise why they should restrict things, so they don't
- forgets about what they posted in the past and how it could impact them
- simply accepts the current privacy situation and doesn't bother
- just clicks on the "next" button to bypass a warning popup because they want to get to the thing they want
I see evidence of this lack of caring every day in the cyber security business: only 10% of Gmail users have turned on two-factor authentication (2FA), for example. It takes 3-4 clicks to do, but people don't do it!
If I could sum up the feeling most people have towards privacy, 2FA, junk food, alcohol, cigarettes (40 years ago) and other vices, it would be simple indifference.
So with all of these problems, what is the solution?
The issue here is that companies control the narrative of how products and services are positioned to us. We need someone we trust to re-frame these critical issues into a narrative that makes them important to us and drives us to take action.
For example: the way a product is originally framed to us, versus the same message re-framed around the risk to us.
Regulation aimed at getting people to stop smoking created a whole new market for smoking cessation products! In fact, this market is projected to reach $4.4 billion by 2023. And as I mentioned earlier, smoking has dropped precipitously over several decades.
Here's an example in the cyber world:
70 new emoji? That doesn't matter to me, so I delay applying the update because I don't care.
Re-framed: Fixes over 12 security vulnerabilities that could lead to a criminal taking complete control of your phone.
Now I'm going to update!
When I look at some of DynaRisk's own data, I see that 38% of users complete our action to review their privacy controls on Facebook. We gave these users an action to take, based on a security score that framed their level of risk in their minds, and that framing prompted them to act.
While we of course want this to be as near to 100% as we can get, if 38% of the people caught up in the Cambridge Analytica fiasco had reviewed their settings or been more aware of their privacy, 33 million fewer people could have been impacted!
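As a rough back-of-the-envelope check, assuming the widely reported figure of up to 87 million accounts affected by Cambridge Analytica:

0.38 × 87 million ≈ 33 million people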
Summary
Excluding the systemic changes Facebook needs to make to its tools, processes and management structure, I think the way we can overcome these types of challenges is to:
1) Use regulation to force a basic level of protection for consumers (e.g. GDPR!)
2) Re-frame privacy/security/risk issues from messages that benefit companies and their advertising revenues into messages that benefit the person. These messages cannot come from the company providing the product, because it is inherently biased by its business model.
3) Drive people to take action to protect themselves based on their own personal circumstances and help them when they have problems.
If we rely on companies like Google and Facebook, which provide us free services funded by ad revenue, to police themselves, we will always be at a disadvantage.