Dear Sheryl Sandberg
Dear Sheryl,
You’re right: Facebook has “a long way to go” to repair its civil rights track record. As the results of your internal civil rights audit show, and as global pressure mounts for Facebook to do something about hate, political coercion, and false news, there are simple and effective ways to improve your policies and the people who enforce them: adopt a binding set of universal rules that protects the platform, its users, and its advertisers from being targeted or victimized.
Career online community moderators -- from old-school forums to cutting-edge social media -- know that strong guardrails governing online behavior raise the stakes for participants. Those guardrails must be backed by the company’s terms of service, supported by its legal department, and clearly connected to global law enforcement.
Here is an opportunity for Facebook to show its market power and leadership in the face of a bad report card. The timeline for creating an internal body to set clear guidelines (I provide a sample below) is immediate: appoint a C-suite civil rights executive and an advisory board that includes not only bold-face names but also leaders from the growing online community profession -- I have several in my contacts and readily volunteer to help.
Here is a sample outline of basic rules that will protect Facebook and its members from civil rights infractions:
- Constructive dialogue is welcome here. Please think before you post. Do not post attacks or provoke them.
- Bullying, trolling, hate speech, outright threats, veiled threats, and any other attempt to undermine someone else’s safety will not be tolerated.
- Please be vigilant and mindful in your participation here. Proceed at your own risk.
- This platform reserves the right to take action against any participant who violates Rule #2 -- suppression, suspension, expulsion, or, in extreme cases, notification of law enforcement -- at our discretion and in accordance with our Terms of Service. We also reserve the right to change our Terms at any time to meet the changing needs of this platform.
- If you see content which violates our rules, please go here <link> to report it. Our moderation team will review your report within XX business days.
These rules come with a price tag: Facebook must employ thousands of human beings to detect and adjudicate the content shared on the platform. May I suggest a portion of the 3 million college graduates in the Class of 2020 who are looking for work while in lockdown? The members of this untapped workforce are internet natives, capable and hungry.
Facebook must build a scoring system for members (like a driver’s license infraction-points system) that terminates membership after a set number of warnings; it must relentlessly innovate in AI to catch violating content before it goes live; and it must offer continual training, coaching, and mental health support to the moderation workforce. Facebook can and should charge its returning advertisers a “safety tax” for every ad placed on the platform -- not unlike the TSA fee every American pays when traveling by air.
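To make the demerit-points idea concrete, here is a minimal sketch of how such a warnings-to-termination ladder could work. It is illustrative only: the point values, thresholds, and names (DemeritLedger, WARN_AT, SUSPEND_AT, TERMINATE_AT) are my own placeholders, not anything Facebook uses today.

```python
# Illustrative sketch of a demerit-points ledger for members.
# All point values and thresholds below are assumptions for the example.

POINTS = {"bullying": 3, "hate_speech": 4, "veiled_threat": 4, "outright_threat": 6}
WARN_AT, SUSPEND_AT, TERMINATE_AT = 3, 6, 10

class DemeritLedger:
    def __init__(self):
        self.points = {}  # member_id -> accumulated demerit points

    def record_violation(self, member_id: str, violation: str) -> str:
        """Add points for a confirmed violation and return the action to take."""
        self.points[member_id] = self.points.get(member_id, 0) + POINTS[violation]
        total = self.points[member_id]
        if total >= TERMINATE_AT:
            return "terminate membership"
        if total >= SUSPEND_AT:
            return "suspend account"
        if total >= WARN_AT:
            return "issue warning"
        return "log only"

ledger = DemeritLedger()
print(ledger.record_violation("member_123", "bullying"))         # issue warning
print(ledger.record_violation("member_123", "hate_speech"))      # suspend account
print(ledger.record_violation("member_123", "outright_threat"))  # terminate membership
```

The point is that each confirmed violation adds to a running total, and the accumulated score -- not any single incident -- determines whether a member is warned, suspended, or removed.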
It's not too late to restore Facebook to its primary purpose: connecting people to each other and to topics about which they are passionate. Please act now.
Warmly,
Shira Levine
Online Customer Engagement Expert
Facebook member and active user since 2007
Shira Levine is a strategic marketing consultant living in Melbourne, Australia. For more information, contact her on LinkedIn or at Fanchismo.com.
Customer Engagement Expert | Ex-Sephora, eBay, Zynga, Dulux
For more on the Facebook civil rights audit, listen to Rashad Robinson on Pivot: “Google is the latest FAANG to invest in India, Facebook might ban political ads, and civil...” https://one.npr.org/i/894713695:894713697
Ph.D. candidate / teacher at mental health treatment center
I wish I could reply to a reply at times... but thanks, and yes, those were a couple of the people on my team... cool
Associate Professor in Online and Convergent Media at University of Sydney
I'd also like to see Facebook pay its moderators better, to lift the professional bar of moderation, and to employ more community managers as part of a push to strategically design for more inclusive engagement. BTW, we need these types of principles to apply across the social media ecosystem, so that they become normative.
Ph.D. candidate / teacher at mental health treatment center
I love it, Shira, including the "old school" part (with community forums). Erica Fox Friedman will probably relate. Without taking away from the skills of newer social media people, there is a certain je ne sais quoi that "original" online community people (pre-social media) have. I don't think enough of them are leveraged by companies these days. I think it could be useful to put some of the human effort into a sort of "Tier 2" effort. From what I've seen, most platforms seem to struggle to understand when "regular" reporting and support systems aren't working, aren't enough. It's wise to know when a more direct conversation and listening is needed. As an example, I am one of those who warned about data issues before Facebook "got slapped" about it. I was blown off. Listening sooner might have saved a lot of headache. Today I have a shell of an account -- saw too much; personal choice. Maybe it can change in the future. Now maybe you can tackle Instagram and Pinterest. I know many think "oh, just a bunch of pretty pictures". But there's an "underbelly" there.
Passionate about Social & Content | Online Community builder | Aristos in the kitchen
Facebook demerit points -- love it!! Also love "May I suggest a portion of the 3 million college graduates in the Class of 2020 who are looking for work while in lockdown? The members of this untapped workforce are internet natives, capable and hungry." Thanks for articulating and sharing!