Dear Mark Zuckerberg

I’ve said it before and I’ll say it again: Facebook needs an editor—to stop Facebook from editing. It needs someone to save Facebook from itself by bringing principles to the discussion of rules.

There is actually nothing new in this latest episode: Facebook sends another takedown notice over a picture with nudity. What is new is that Facebook wants to take down an iconic photo of great journalistic meaning and historic importance and that Facebook did this to a leading editor, Espen Egil Hansen, editor-in-chief of Aftenposten, who answered forcefully:

The media have a responsibility to consider publication in every single case. This may be a heavy responsibility. Each editor must weigh the pros and cons. This right and duty, which all editors in the world have, should not be undermined by algorithms encoded in your office in California…. Editors cannot live with you, Mark, as a master editor.

Facebook has found itself—or put itself—in other tight spots lately, most recently the trending topics mess, in which it hired and then fired human editors to fix a screwy product.

In each case, my friends in media point their fingers, saying that Facebook is media and thus needs to operate under media’s rules, which my media friends help set. Mark Zuckerberg says Facebook is not media.

On this point, I will agree with Zuckerberg (though this isn’t going to get him off the hook). As I’ve said before, we in media tend to look at the world, Godlike, in our own image. We see something that has text and images (we insist on calling that content) with advertising (we call that our revenue) and we say it is media, under the egocentric belief that everyone wants to be like us.

No, Facebook is something else, something new: a platform to connect people, anyone to anyone, so they may do whatever they want. The text and images we see on Facebook’s pages (though, of course, it’s really just one endless page, a different page for every single user) are not content. They are conversation. They are sharing. Content as we media people think of it is allowed in, but only as a tool, a token people use in their conversations. We are guests there.

Every time we in media insist on squeezing Facebook into our institutional pigeonhole, we miss the trees for the forest: We miss understanding that Facebook is a place for people, people we need to develop relationships with and learn to serve in new ways. It’s not a place for content.

For its part, Facebook still refuses to acknowledge the role it has in helping to inform society and the responsibility—like it or not—that now rests on its shoulders. I’ve written about that here and so I’ll spare you the big picture again. Instead, in these two cases, I’ll try to illustrate how an editor—an executive with an editorial worldview—could help advise the company: its principles, its processes, its relationships, and its technology.

The problem at work here is algorithmic thinking. Facebook’s technologists, top down, want to formulate a rule and then enable an algorithm to enforce that rule. That’s not only efficient (who needs editors and customer-service people?) but they also believe it’s fair, equally enforced for all. It scales.

Except life doesn’t scale, and that’s a problem Facebook of all companies should recognize, as it is the post-mass-media company, the company that does not treat us all alike; like Google, it is a personal-services company that gives every user a unique service and experience. The problem with algorithmic thinking, paradoxically, is that it continues a mass mindset.

In the case of Aftenposten and the Vietnam napalm photo, Hansen is quite right that editors cannot live with Mark et al. as master editor. Facebook would be wise to recognize this. It should treat editors of respected, quality news organizations differently and give them the license to make decisions.

Here I argued that Facebook might want to consider giving editors an allocation of attention they can use to better inform their users. In this current case, the editor can decide to post something that might violate a rule for a reason; that’s what editors do. I’m not arguing for a class system, treating editors better. I’m arguing that recognizing signals of trust, authority, and credibility will improve Facebook’s recommendations and service. (As a search company, Google understands those signals better, and this is the basis of the Trust Project Google is helping support.)

When there is disagreement, and there will be, Facebook needs a process in place—a person: an editor—who can negotiate on the company’s behalf. The outside editor needn’t always win; this is still Facebook’s service, brand, and company. But the outside editor should be heard: in short, respected.

These decisions are being made now on two levels: The rule in the algorithm spots a picture of a naked person (check) who is a child (check!) and kills it (because naked child equals child porn). The rule can’t know better. The algorithm should be aiding a human court of appeal that understands when the rule is wrong. On the second level, the rule is informed by the company’s brand protection: “We can’t ever allow a naked child to appear here.” We all get that. But there is a third level Facebook must have in house, another voice at the table when technology, PR, and product come together: a voice of principle.

What are the principles under which Facebook operates? Facebook should decide, but an editor—and an advisory board of editors—could help inform those principles. Does Facebook want to play its role in helping to better inform the public or just let the chips fall where they may (something journalists also need to grapple with)? Does it want to enable smart people—not just editors—to make brave statements about justice? Does it want to have a culture in which intelligence—human intelligence—rules? I think it does. So build procedures and hire people who can help make that possible.

Now to the other case, trending topics. You and Facebook might remind me that here Facebook did hire people and that didn’t help; it got them in hot water when those human beings were accused of having human biases and the world was shocked!

Here the problem is not the algorithm; it is the fundamental conception of the Trending product. It sucks. It spits out crap. An algorithmist might argue that’s the public’s fault: we read crap so it gives us crap—garbage people in, garbage links out. First, just because we read it doesn’t mean we agree with it; we could be discussing what crap it is. Second, the world is filled with a constant share of idiots, bozos, and trolls, and a bad algorithm listens to them; these dogs of hell know how to game the algorithm to gain more influence on it. But third—the important part—if Facebook is going to recommend links, which Trending does, it should take care to recommend good links. If its algorithm can’t figure out how to do that, then kill it. This is a simple matter of quality control. Editors can sometimes help with that, too.

UPDATE: Facebook relented and will publish the photo. Here’s exec Justin Osofsky explaining.


Daniel W McIntosh

MSE, Booking & Contracts, Specialty Entertainment, Team Building, Brand Promotion, Actor, Stunt Performer, Aspiring DJ, Instructor in Specialty Arts.

8 years

FB is on a mass free-speech censorship campaign against those whose opinions differ from the organization's politics.

Brian Podczerwinski

Operations Manager at Astrotech Incorporated / ATR Manufacturing Ltd.

8 years

Facebook = Time Suck.

William Redmond, Dr-Ing/PhD

Data Management Professional

8 years

After more than 51 years in IT, I absolutely refuse to join Facebook. It is a mind-numbing and highly dangerous invention. I have friends who spend an inordinate amount of time perusing various posts when they could be out in the world establishing direct human contact with others and trying in whatever small ways to make the world a place where people understand each other and peacefully coexist. This article is correct in many of its observations, but it is still looking through the tainted dark lens of modern mass media, where the frequently warped and ignorant viewpoints of a few are presented to the rest of the world as sacrosanct truth. Facebook is actually doing the same thing in its own way, pushing the viewpoints of Zuckerberg and his obviously out-of-touch techies. The algorithms being used are based on someone's opinion of what is proper without taking into consideration that community standards vary enormously from place to place. It would, in my opinion, be a much better use of these techies' talents if they were charged with the task of devising a better way for individual Facebook users to filter for themselves what they are exposed to.
