Dear Mark Zuckerberg

Jeff Jarvis
Sep 9, 2016

I’ve said it before and I’ll say it again: Facebook needs an editor — to stop Facebook from editing. It needs someone to save Facebook from itself by bringing principles to the discussion of rules.

There is actually nothing new in this latest episode: Facebook sends another takedown notice over a picture with nudity. What is new is that Facebook wants to take down an iconic photo of great journalistic meaning and historic importance, and that Facebook did this to a leading editor, Espen Egil Hansen, editor-in-chief of Aftenposten, who answered forcefully:

The media have a responsibility to consider publication in every single case. This may be a heavy responsibility. Each editor must weigh the pros and cons. This right and duty, which all editors in the world have, should not be undermined by algorithms encoded in your office in California…. Editors cannot live with you, Mark, as a master editor.

Facebook has found itself — or put itself — in other tight spots lately, most recently the trending topics mess, in which it hired and then fired human editors to fix a screwy product.

In each case, my friends in media point their fingers, saying that Facebook is media and thus needs to operate under media’s rules, which my media friends help set. Mark Zuckerberg says Facebook is not media.

On this point, I will agree with Zuckerberg (though this isn’t going to get him off the hook). As I’ve said before, we in media tend to look at the world, Godlike, in our own image. We see something that has text and images (we insist on calling that content) with advertising (we call that our revenue) and we say it is media, under the egocentric belief that everyone wants to be like us.

No, Facebook is something else, something new: a platform to connect people, anyone to anyone, so they may do whatever they want. The text and images we see on Facebook’s pages (though, of course, it’s really just one endless page, a different page for every single user) are not content. They are conversation. They are sharing. Content as we media people think of it is allowed in but only as a tool, a token people use in their conversations. We are guests there.

Every time we in media insist on squeezing Facebook into our institutional pigeonhole, we miss the trees for the forest: We miss understanding that Facebook is a place for people, people we need to develop relationships with and learn to serve in new ways. It’s not a place for content.

For its part, Facebook still refuses to acknowledge the role it has in helping to inform society and the responsibility — like it or not — that now rests on its shoulders. I’ve written about that here and so I’ll spare you the big picture again. Instead, in these two cases, I’ll try to illustrate how an editor — an executive with an editorial worldview — could help advise the company: its principles, its processes, its relationships, and its technology.

The problem at work here is algorithmic thinking. Facebook’s technologists, top down, want to formulate a rule and then enable an algorithm to enforce that rule. That’s not only efficient (who needs editors and customer-service people?) but, they believe, also fair, equally enforced for all. It scales. Except life doesn’t scale, and that’s a problem Facebook of all companies should recognize, for it is the post-mass-media company, the company that does not treat us all alike; like Google, it is a personal-services company that gives every user a unique service and experience. The problem with algorithmic thinking, paradoxically, is that it continues a mass mindset.

In the case of Aftenposten and the Vietnam napalm photo, Hansen is quite right that editors cannot live with Mark et al. as master editor. Facebook would be wise to recognize this. It should treat editors of respected, quality news organizations differently and give them the license to make decisions. Here I argued that Facebook might want to consider giving editors an allocation of attention they can use to better inform their users. In this current case, the editor can decide to post something that might violate a rule for a reason; that’s what editors do. I’m not arguing for a class system, treating editors better. I’m arguing that recognizing signals of trust, authority, and credibility will improve Facebook’s recommendations and service. (As a search company, Google understands those signals better, and this is the basis of the Trust Project Google is helping support.)

When there is disagreement, and there will be, Facebook needs a process in place — a person: an editor — who can negotiate on the company’s behalf. The outside editor needn’t always win; this is still Facebook’s service, brand, and company. But the outside editor should be heard: in short, respected.

These decisions are being made now on two levels: The rule in the algorithm spots a picture of a naked person (check) who is a child (check!) and kills it (because naked child equals child porn). The rule can’t know better. The algorithm should be aiding a human court of appeal that understands when the rule is wrong. On the second level, the rule is informed by the company’s brand protection: “We can’t ever allow a naked child to appear here.” We all get that. But there is a third level Facebook must have in house, another voice at the table when technology, PR, and product come together: a voice of principle.
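
To make that distinction concrete, here is a purely illustrative sketch, not Facebook’s actual system; the rule checks, the trust signal, and the review queue are all hypothetical names invented for this example. It only shows the difference between a rule that kills a post outright and one that merely flags it and escalates edge cases to human review.

```python
# Hypothetical illustration only. None of these field names or rules
# reflect Facebook's real systems; they exist to contrast auto-removal
# with escalation to a human court of appeal.

def algorithmic_takedown(post):
    # Algorithmic thinking: the rule fires and the post dies. No appeal.
    if post["contains_nudity"] and post["subject_is_minor"]:
        return "removed"
    return "published"

def takedown_with_court_of_appeal(post, review_queue):
    # Same rule, but it only flags; edge cases go to a human reviewer,
    # who can weigh signals of trust such as a respected news publisher.
    if post["contains_nudity"] and post["subject_is_minor"]:
        if post["publisher_is_trusted_news_org"]:
            review_queue.append(post)  # a person decides, and can negotiate
            return "held_for_human_review"
        return "removed"
    return "published"

if __name__ == "__main__":
    napalm_photo_post = {
        "contains_nudity": True,
        "subject_is_minor": True,
        "publisher_is_trusted_news_org": True,
    }
    queue = []
    print(algorithmic_takedown(napalm_photo_post))                  # removed
    print(takedown_with_court_of_appeal(napalm_photo_post, queue))  # held_for_human_review
```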

What are the principles under which Facebook operates? Facebook should decide but an editor — and an advisory board of editors — could help inform those principles. Does Facebook want to play its role in helping to better inform the public or just let the chips fall where they may (something journalists also need to grapple with)? Does it want to enable smart people — not just editors — to make brave statements about justice? Does it want to have a culture in which intelligence — human intelligence — rules? I think it does. So build procedures and hire people who can help make that possible.

Now to the other case, trending topics. You and Facebook might remind me that here Facebook did hire people and that didn’t help; it got them in hot water when those human beings were accused of having human biases and the world was shocked!

Here the problem is not the algorithm; it is the fundamental conception of the Trending product. It sucks. It spits out crap. An algorithmist might argue that’s the public’s fault: we read crap so it gives us crap — garbage people in, garbage links out. First, just because we read it doesn’t mean we agree with it; we could be discussing what crap it is. Second, the world is filled with a constant share of idiots, bozos, and trolls; a bad algorithm listens to them, and these dogs of hell know how to game the algorithm to gain more influence over it. But third — the important part — if Facebook is going to recommend links, which Trending does, it should take care to recommend good links. If its algorithm can’t figure out how to do that, then kill it. This is a simple matter of quality control. Editors can sometimes help with that, too.

UPDATE: Facebook relented and will publish the photo. Here’s exec Justin Osofsky explaining.


Jeff Jarvis

Blogger & prof at CUNY’s Newmark J-school; author of Geeks Bearing Gifts, Public Parts, What Would Google Do?, Gutenberg the Geek