Facebook's Big Lie
In light of the ongoing boycott of Facebook by some in the advertising industry, Facebook has gone on a PR offensive claiming that they do not profit from hate.
On July 1st, Facebook’s VP of global affairs and communications, Nick Clegg, stated, “I want to be unambiguous: Facebook does not profit from hate.”
I would also like to be unambiguous: Bullshit.
Let’s start at the beginning.
Facebook makes 99% of its money from advertising. The way you make money from advertising is by attracting people to your site. This is the same for other media like TV, radio, and newspapers. The more people you attract, and the longer they stay, the more money you can extract from advertisers.
Next, we have to understand algorithms. In layman's terms, an algorithm is a fancy name for a mathematical formula. Algorithms are formulas used by machines to make decisions. For example, when you go to Facebook, you are shown a completely different page than the one I see when I go to Facebook. Your page has your friends, ads from companies that Facebook thinks you will be interested in, and articles that Facebook thinks you will like. Mine has completely different friends, different ads, and different articles. The decisions about what to show each of us are made in milliseconds by algorithms.
In simple terms, as Facebook gathers data on us, the data is fed into algorithms. The algorithms constantly evolve in a way that is aimed at maintaining our interest. This is how we are encouraged to keep coming back to Facebook and staying as long as possible.
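To make the idea concrete, here is a minimal sketch of engagement-driven ranking. Every name, topic, and weight in it is hypothetical, and a real feed-ranking system is vastly more complex, but the objective is the same: show each user whatever is predicted to hold their attention longest.

```python
# Toy feed ranker: score posts by predicted engagement for one user.
# All data and weights below are invented for illustration only.

def engagement_score(post, user_profile):
    """Score a post by how likely this user is to interact with it."""
    score = 0.0
    for topic, interest in user_profile.items():
        if topic in post["topics"]:
            score += interest
    # Posts that historically provoke strong reactions get boosted.
    score *= 1.0 + post["past_reaction_rate"]
    return score

def build_feed(posts, user_profile, n=3):
    """Return the n posts most likely to keep this user on the site."""
    ranked = sorted(posts, key=lambda p: engagement_score(p, user_profile),
                    reverse=True)
    return ranked[:n]

posts = [
    {"id": 1, "topics": {"baseball"}, "past_reaction_rate": 0.1},
    {"id": 2, "topics": {"politics"}, "past_reaction_rate": 0.9},
    {"id": 3, "topics": {"fashion"},  "past_reaction_rate": 0.2},
]
user = {"politics": 0.7, "baseball": 0.5}

feed = build_feed(posts, user)
```

Note what the toy makes visible: the divisive post (id 2) outranks the baseball post for this user, not because anyone chose divisiveness, but because past reactions feed the score.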
(You may be surprised to know that the data that Facebook gathers about us is not simply from our behaviors on Facebook, but on all kinds of other data they are able to collect even if we are not Facebook users, and even if we have opted out of tracking. But that’s a subject for another day.)
The online media industry has learned that one of the most powerful tools to keep people engaged is controversy. Many studies have shown that hate speech is powerful stuff. According to one study, “content generated by the hateful users tend to spread faster, farther and reach a much wider audience as compared to the content generated by normal users.” This is not a pleasant fact, but it is nonetheless true.
Algorithms learned this very early on and have been designed to serve us emotionally powerful material that will keep us engaged. At times, this material can be divisive and hateful and feed into the most negative aspects of our personalities. For the most part, algorithms don’t make value judgements. They simply feed us material that will keep us in the corral as long as possible.
Facebook goes beyond that. Facebook actively seeks to keep us further engaged by recommending “groups” to us that its algorithms believe will be attractive to us. These may be perfectly innocent groups about fashion, baseball, or books. But they may also be groups of pedophiles, criminals, and advocates of hate and violence.
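The mechanics of “people like you joined X” can be sketched in a few lines. This is a hypothetical, content-blind collaborative filter, not Facebook’s actual system; the group names are invented. The point it illustrates is that the logic recommends whatever groups similar users joined, whether those groups are about gardening or something far worse.

```python
# Toy group recommender: suggest groups joined by users who already
# share a group with you. Purely illustrative; all names are invented.
from collections import Counter

def recommend_groups(user, memberships):
    """Rank groups joined by users who overlap with `user`'s groups."""
    mine = memberships[user]
    counts = Counter()
    for other, groups in memberships.items():
        if other != user and mine & groups:   # shares a group with me
            counts.update(groups - mine)      # suggest their other groups
    return [group for group, _ in counts.most_common()]

memberships = {
    "alice": {"baseball", "books"},
    "bob":   {"baseball", "fringe_militia"},
    "carol": {"baseball", "fringe_militia"},
    "dave":  {"books", "gardening"},
}

recs = recommend_groups("alice", memberships)
```

Nothing in the code inspects what a group is about; a fringe group with more overlapping members simply ranks higher than an innocent one with fewer.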
If you have any doubts about this, let me tell you about an internal study done by a team of Facebook employees. Contrary to Facebook’s stated mission of “connecting the world,” the team found that Facebook was far more effective at dividing the world.
In fact, the Facebook team found that the algorithms they use “to gain user attention & increase time on the platform” are not bringing people together; they are driving people apart. According to the report by the Facebook employees, “64% of all extremist group joins are due to our recommendation tools...Our recommendation systems grow the problem.”
In other words, Facebook’s algorithms help extremists of all kinds find each other, engage with each other, stay longer on the platform, and therefore create the opportunity for Facebook to earn more ad dollars.
Prof. Hany Farid, an expert at the University of California, Berkeley, has said, “They didn’t set out to fuel misinformation and hate and divisiveness, but that’s what the algorithms learned.”
In order to believe that Facebook does not profit from hate, you have to believe that Mark Zuckerberg doesn’t understand how his algorithms work. And if you believe that, I’m afraid you will believe anything.
Bob Hoffman is author of "Advertising For Skeptics."