Dear Facebook employees,
Honestly, I want the best for Facebook. I poured my heart and soul into the company from 2008–2012. I also want the best for the world, though. I know you do, too. I don’t think those things have to be in conflict. I wrote down some thoughts on how I think we get those things more aligned, in case it’s helpful to you. None of this is revolutionary, quite the contrary. Much of it has been said by others but I don’t think it hurts to be restated. Here goes…
I’ll start with my own admission. In 2009, I said, “We believe in Facebook’s mission that giving people tools to make the world more open is a better way to combat ignorance or deception than censorship.” It turns out that I was wrong. First of all, it’s a false choice. There are more options than just “openness” and “censorship”. Most importantly, though, it’s become obvious in the 11 intervening years that the opposite is actually true. The more successful Facebook is in accomplishing its mission, the more ignorance, deception and the like there appears to be in the world. There is definitely correlation here. Unfortunately, I also believe there is more than a little causation.
How did we get here?
Well, in part, Facebook’s biggest strengths are also its biggest weaknesses. Early on, Facebook focused on the connection. If you get a person connected to another real person they actually know, a lot of other problems go away. And it worked. Friends are much less likely to scam you, be inappropriate or annoy you than strangers. Also, there is a good chance you’ll be interested in the content they share. Unfortunately, it doesn’t solve everything because, you know what? Your Uncle Daryl isn’t a doctor, doesn’t know shit about vaccines and is easily misled by others on the topic. This is doubly bad because of the connection. You’re more likely to believe misinformation from Uncle Daryl than from strangers.
Adding gasoline to the fire is Facebook’s sophisticated content system. Using signals from billions of people and untold pieces of content, it knows what content people will find engaging. You know what’s engaging as heck? Wild conspiracy theories and incendiary rhetoric. Combine a piece of content that comes to you from a trusted source (i.e. your friend) with Facebook making sure you see the really tantalizing stuff, and you get viral misinformation. That’s why Facebook’s system is so susceptible to it and why it spreads so quickly. When the integrity of your entire system is based on the quality of the connection and not the quality of the information, the forces of misinformation see a vulnerability, and they are exploiting it aggressively. The Plandemic movie was a recent and tragic example.
It has been said that a lie gets halfway around the world before the truth has a chance to get its pants on. Now, Facebook’s speed and reach make it more like a lie circles the globe a thousand times before the truth is even awake. This is no accident. Ironically, the one true conspiracy theory appears to be that malevolent nation-states, short-sighted politicians, and misguided interest groups are using conspiracy theories to deliberately misinform the public as a means of accomplishing their long-term strategic goals.
Why isn’t Facebook doing more to address this?
Unfortunately, I do not think it is a coincidence that the choices Facebook makes are the ones that allow the most content–the fuel for the Facebook engine–to remain in the system. I do not think it is a coincidence that Facebook’s choices align with the least resources required, outsourcing important aspects to third parties. I do not think it is a coincidence that Facebook’s choices appease those in power who have made misinformation, blatant racism and inciting violence part of their platform. Facebook says, and may even believe, that it is on the side of free speech. In fact, it has put itself on the side of profit and cowardice.
You don’t have to be, though. Facebook has seemingly limitless resources at its disposal. You’ve got some of the smartest people in the world who work at Facebook. I know, I’ve worked with them. You’ve developed some of the most advanced technology in history and have mountains of capital. As one example, the company has said it may spend as much as ~$34 billion on stock buybacks since 2017 alone. The main ingredient that you lack is the will.
How to find the will?
First of all, it’s helpful to realize the world has changed and so has Facebook. In the four years I worked at Facebook, a lot of precedents were set that are still playing out today. Some of them made sense for the 2008 world but don’t make sense now. In 2008, the professional arbiters of truth–the press–were much stronger both in terms of resources and distribution. In 2008, Facebook’s reach was growing but it only touched a small percentage of the population. In 2008, people used Facebook more to keep up with friends than as a news or information source. Today, all of that has changed dramatically.
Newsrooms have been decimated and the press’ overall distribution has been similarly reduced. Meanwhile, Facebook has become a primary source of news and information for billions of people. In short, when we decided that Facebook would take a hands-off approach to content, the world didn’t need Facebook to fact-check or contextualize information. The world desperately needs that now.
I still believe that Facebook does more good than harm. There has been no better example than the emotional support it has provided during the current health crisis. The value of connection with family and friends during this time is incalculable. However, just doing more good than harm is not enough.
If you think of Facebook as the place where people get their information, it’s like the one grocery store in a town. Everyone shops there and its shelves are mostly filled with food that is nutritious, fun, entertaining, engaging, etc… However, sprinkled through the shelves are foods that look like regular stuff but are actually poison. I’m not talking about junk food with frivolous or empty calories. I’m talking about food that literally poisons one’s mind, turning him or her against science, facts, and other people. If that’s your mindset, what resources would you leave on the table to find the poison? Are there any risks you would not take? At the very least, you would not hesitate to put warning labels on the poison.
That’s not the way Facebook has thought in the past, though. Instead, I believe there is an inherent intent bias within Facebook. That is, you know your intentions are good and therefore you focus on the good outcomes and dismiss the bad. I was definitely guilty of it. It’s easy to do, especially when detractors have the opposite bias. That is, they see some bad outcomes and assume bad intentions.
It would be helpful for Facebook to cut through all of that and be honest with itself. If you believe that productive information on Facebook can create a sisterhood of truckers, sell Clif Bars, start revolutions in the Middle East, and defeat a terrorist organization, then you must also believe that misinformation you host and distribute can destroy lives, incite violence, torture those who have already endured unspeakable tragedy and convince people to make devastating health choices.
Promoting free speech shouldn’t be used as a get-out-of-tough-choices card. Yes, people have the right to express ignorant or misinformed views, but that doesn’t mean you are prevented from providing context on those views or that you are required to give them distribution.
For centuries, the main way people received the free speech of others was through publishers or the press. Maybe a few people heard a speech. Maybe it was even a few thousand people who were present. However, the vast majority of people read about it in the paper, where it was put in context. Even with the advent of radio and TV, the actual video or audio of a speech was followed by the commentary of reporters. These employees of for-profit private companies provided context and attempted to arbitrate truth. Was it perfect? No, but it mostly worked and it kept the forces of misinformation and divisiveness largely at bay.
That system has been disrupted, in large part, by you. You have a responsibility to take an active role in fixing it and/or finding a new system that works better. The Facebook Journalism Project and the support of fact checkers are a great start, but they are band-aids, and we are hemorrhaging civility and truth. A real solution will require orders of magnitude more scale and sophistication.
What should Facebook do?
I don’t have a silver bullet but I know you need to build trust. You need to show the world that you are not putting profit over values. Therefore, I would suspend the stock buyback program. As I mentioned, you’ve committed ~$34 billion to stock buybacks. It looks like you’ve spent about $20 billion. That’s $14 billion left (please check my math). I’d devote the equivalent resources to a goal of better-informed users. You’d be showing that you’re literally choosing users over profit.
What’s the metric? I don’t know but I have confidence that you can figure it out. The spirit is that you have swung the pendulum all the way toward enabling expression. Let’s move it toward the quality of information or an outcome of an accurately informed public. Success on this would be infinitely more valuable to your investors than artificially propping up the stock with buybacks.
I’d put the company in lockdown. We did it in 2011 when Google was launching Google+. They had orders of magnitude more resources, more engineers, the largest distribution platform in the world and had committed everything to squashing Facebook. We worked day and night and kicked their ass. We humiliated them. This challenge is even more daunting but also infinitely more important. I know you can do it.
It will be hard, though. You’ll need courage, money and brainpower. You’ll also need to cast aside long-held beliefs. Just because taking a specific action could be a “slippery slope” doesn’t mean it’s wrong. Just because a solution isn’t currently “scalable” doesn’t mean it’s unworkable or that you couldn’t eventually scale it. Just because something is an “edge case” doesn’t mean it’s irrelevant.
In case it is not clear, the stakes are high. We are in the midst of a global pandemic. Nearly 400,000 people are dead. Many more are likely to die, and that risk is being made worse by content you host. Every. Single. Day. The only way the stakes could be higher is if we were on the brink of a world war. Thankfully, we are not. However, I encourage you to ask yourself where a concerted and systematic undermining of science and truth, and rampant divisiveness, end if left unchecked. A lasting peace? I doubt it.
Whatever you do, I can promise you this: You will continue to be criticized. People will always say that you are both doing too much and not enough. That is the price of leadership. I used to tell Facebook colleagues who complained about criticism to go work at MySpace. No one bothers to criticize them. You don’t work at MySpace, though, because we trounced them, as well. You work at Facebook and you can beat misinformation and divisiveness, too. I’m rooting for you. We all are.
Your friend,
Barry