The Letter Facebook's Board Should Write Today
Mark,
You have asked that your board of directors provide direct feedback and guidance on issues of serious consequence to Facebook. We believe we are at such a moment. Your board feels the need to provide you and your management team with our views on recent events, as well as, frankly, a dose of displeasure. We sense that Facebook's content safeguards are not being designed or managed properly. Here are our thoughts:
We understand that Facebook has 2.3 billion content sources. And we know that both human and technical controls are required. And yes, we see that your team has certainly tried hard. But, by any objective measure, Facebook's controls have failed miserably, as evidenced by the sickening tragedy in New Zealand that was broadcast live over our social network. It’s time for Facebook to grow up. Now.
First, many of us on the board were surprised to see Monika Bickert's team profiled so openly in Vanity Fair. Of course we are glad to see our Community Standards made public, but we believe a team focused on policing hate speech and highly inappropriate content should maintain a more modest profile. And the image of her frankly tiny group of camera-ready team members made this grave function look flippant.
As you will recall, we were happy with the recent note you posted on the topic. It was well-structured, clearly argued, and 100% reasonable. But as we discussed then, it was possible that such a focus on collaboration, artificial intelligence, and respect for individual expression, while admirable, might be too weak. Perhaps the time has come for the Facebook team to become stronger, and yes, perhaps even unreasonable. Here's what we recommend:
Tighten the Acceptability Threshold – We know that Facebook content becomes more engaging as it shifts toward the edge of acceptability. And yes, it's good for business to keep this loose. But by tightening this threshold, you can improve things considerably. A side effect is that Michelangelo's David will get flagged, but we need much tighter content requirements. Monika should focus on this immediately.
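The threshold-tightening idea above can be made concrete with a small sketch. Everything here is hypothetical: we assume an upstream classifier assigns each post a risk score in [0, 1], and the post names and scores are invented for illustration, including the point that nudity detectors often flag classical art.

```python
# Minimal sketch of tightening a moderation threshold.
# Assumes each post carries a risk score in [0, 1] from some
# upstream classifier; all names and scores are illustrative.

def flag_for_review(risk_score: float, threshold: float) -> bool:
    """Flag any post whose risk score meets or exceeds the threshold."""
    return risk_score >= threshold

posts = {
    "violent_live_stream": 0.97,
    "borderline_hate_meme": 0.55,
    "michelangelos_david": 0.35,   # classical art often trips nudity detectors
    "vacation_photo": 0.02,
}

loose, tight = 0.90, 0.30  # "tightening" means lowering the bar for review

flagged_loose = [p for p, s in posts.items() if flag_for_review(s, loose)]
flagged_tight = [p for p, s in posts.items() if flag_for_review(s, tight)]

print(flagged_loose)  # only the most extreme content is flagged
print(flagged_tight)  # now David gets flagged too -- the stated side effect
```

The design choice is the usual precision/recall trade: a lower threshold catches more genuinely dangerous posts at the cost of flagging benign ones, which is exactly the trade the letter argues Facebook should now accept.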
World-Class, Real-Time Ops – Many of us on the board have built professional operations centers, which are run with maniacal attention to detail, every second of every hour. These centers are organized under a real-time command structure, usually staffed by ex-military personnel. Please instill in your team the belief that one bad post could literally end the world. With this in mind, we advise creating a live, full-alert nerve center for policing content.
Oversight With Real Consequences – As a board, we will not advocate government regulation. But we have all learned through experience that necessity is the mother of invention. It's time now to accept that your mission is not best-effort filtering of the vast majority of bad posts. Rather, your mission must be to stop all unacceptable posts. Period. No exceptions. And when that standard is violated, the consequences to you personally will be severe.
None of this is fun. And it will cost a lot of money. Facebook will suffer. But we are confident that your team will identify creative solutions. Here's an example (and maybe Monika does this now): of the 2.3 billion members, we'd bet that half would never stream a live murder. This includes IBM, the USDA, Catholic Charities, and so on. These groups live at the steep front end of a Pareto distribution, and thus require much less focus.
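The Pareto point above can be sketched as a simple triage: rank accounts by historical violation rate and concentrate review capacity on the risky tail. The account names and rates below are invented for illustration; the real-world signal and cutoff would of course be far more nuanced.

```python
# Illustrative sketch of Pareto-style triage: if most accounts almost
# never violate policy, review capacity can concentrate on the risky tail.
# Account names and violation rates are invented for illustration.

accounts = [
    ("IBM", 0.0001),
    ("USDA", 0.0001),
    ("Catholic Charities", 0.0002),
    ("anon_burner_account", 0.20),
    ("repeat_offender", 0.35),
]

# Sort by risk, descending: the steep front of the distribution gets
# little scrutiny, while the tail gets the human reviewers.
ranked = sorted(accounts, key=lambda a: a[1], reverse=True)
high_risk = [name for name, rate in ranked if rate > 0.01]

print(high_risk)  # the small tail that deserves nearly all the attention
```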
Your board estimates that you will need to hire three thousand full-time experts to work in this area, developing new solutions (such as time-delays for live streaming), as well as to staff the nerve center. Assuming salaries of about $150K, this will cost roughly $450 million per year. But that is reasonable for a company with a market cap of half a trillion dollars. None of us would blink at requesting that Verizon do this sort of thing, and their market cap is half ours.
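The "time-delays for live streaming" idea mentioned above can be sketched as a held-back frame buffer: frames sit in a delay window where an automated check (or a human in the nerve center) can kill the stream before anything airs. Everything here is a hypothetical illustration, not any real Facebook mechanism; the scan function, the 30-second window, and the one-frame-per-second simplification are all assumptions.

```python
from collections import deque

DELAY_SECONDS = 30
FPS = 1  # one frame per second keeps the sketch simple

def scan_frame(frame: str) -> bool:
    """Placeholder policy check; returns True if the frame is acceptable."""
    return "violence" not in frame

def delayed_broadcast(incoming_frames, delay_frames=DELAY_SECONDS * FPS):
    """Yield frames for broadcast, held back by delay_frames; abort on a violation."""
    buffer = deque()
    for frame in incoming_frames:
        if not scan_frame(frame):
            return  # kill the stream; buffered frames never air
        buffer.append(frame)
        if len(buffer) > delay_frames:
            yield buffer.popleft()  # only frames that cleared the window air

# A violation 31 seconds into the stream means only 1 second ever airs.
aired = list(delayed_broadcast(["intro"] * 31 + ["violence"] * 5))
print(aired)

# The staffing arithmetic in the letter: 3,000 analysts at ~$150K each.
annual_cost = 3_000 * 150_000
print(annual_cost)  # 450000000, i.e., about $450 million
```

The trade-off is latency: a 30-second delay makes "live" slightly less live, which is precisely the kind of business cost the letter says Facebook must now absorb.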
Mark, we've stuck with you through thick and thin, and we've admired your wonderful ability to evolve personally and professionally with the changing landscape. This goes for your management team as well. But the recent live streaming of a mass murder is too much for this board, and we will not sit by and watch ineffective, insufficient methods be used to police content. Get started on this now. We will be watching.
Regards,
The Board of Directors of Facebook.