Hey, Mark Zuckerberg: It’s Time To Take Responsibility for Facebook’s Role in How Users Get Their News
On Saturday night, Facebook founder Mark Zuckerberg posted a lengthy status update addressing the question of whether the social network — and the circulation of fake news by its users and “trending” algorithms — could have contributed to the outcome of the November presidential election.
“Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.”
Reading this, I wondered whether Zuckerberg even looks at Facebook, because it doesn’t resemble the social media site most of us use daily. His comment suggests that either he is being deliberately narrow in his definition of “fake news,” or that he doesn’t understand the problem. Either is worrisome. And either way, he’s ducking out of a conversation that absolutely needs to be had — one it’s clear his employees are having with or without him, to judge by this New York Times article:
Some employees are worried about the spread of racist and so-called alt-right memes across the network, according to interviews with 10 current and former Facebook employees. Others are asking whether they contributed to a “filter bubble” among users who largely interact with people who share the same beliefs.
Even more are reassessing Facebook’s role as a media company and wondering how to stop the distribution of false information. Some employees have been galvanized to send suggestions to product managers on how to improve Facebook’s powerful news feed: the streams of status updates, articles, photos and videos that users typically spend the most time interacting with.
At any rate, I would love to hear what Zuckerberg considers “fake news.” Maybe he’s restricting that definition to stories that are entirely made up, like the one mentioned by the New York Times that claimed Pope Francis endorsed Donald Trump for the American presidency (he didn’t). And when Zuck says less than 1 percent of what appears on Facebook is fake news or hoaxes, I wonder if he’s referring only to the links that appear in Facebook’s news/trending sidebar. Because every day, I see people — smart, educated people — sharing exaggerated, biased, clickbait articles that bear no resemblance to well-sourced, well-reported sets of facts.
I recently shared a link on Facebook to a blog post called “Please Stop Sharing Links to These Sites,” which included a list of a dozen or so websites known for clickbait headlines and highly distorted articles — including made-up quotes — that entice people into sharing their links on social media. At the top of the list were two sites I see my liberal friends sharing articles from regularly: Occupy Democrats and Bipartisan Report.
It was easily the most-shared thing I’ve ever posted: 49 people shared it, suggesting that lots of folks are similarly sick of seeing fake news shared on social media and wanted to help their friends discern baloney from real reporting. Maybe that’s because I’m a journalist and many of the people I’m friends with on Facebook are current or former journalists, too. But I think the frustration is more widespread than that.
Unfortunately, within a few days, some of them were back to resharing links from Occupy Democrats, Bipartisan Report and others on the list. I’ve considered spending my days commenting on such links and steering folks toward better sources of information, but I worry that becoming that sort of nag would quickly lead to my perspective being ignored.
Over my 20-odd years in journalism, I have often been faced with readers who are muddy on the difference between a reported news article and an opinion piece placed squarely on the editorial page. The proliferation of online media has, to some extent, probably made things more confusing; for starters, there’s often no clear editorial page anymore. There are satire sites fronting as straight-shooting news sources, and outlets once devoted to clickbait and memes (BuzzFeed, Vice) have gotten more and more serious about producing real journalism. There are more news sources than ever, each operating at a different point on the spectrum between pure fact-based news and punditry, and it’s tough to keep track of which is which.
That may be one reason that, according to a September Gallup poll, trust in the media is at an all-time low. Distrust opens the window for readers to rely on alternative sources of news — and it’s especially easy to trust a source that confirms your suspicions or beliefs about the world around you. On top of that, people don’t really want to spend their time fact-checking articles before they share them. (I often wonder how many people would know how to fact-check if they wanted to; Snopes is good, but it can’t investigate everything.)
Folks cruising around on social media are in a casual frame of mind, one given more to confirmation bias than critical thinking, given more to entertainment and emotion than to the often dull, nuanced presentation of facts in most reported news. In general, real journalism doesn’t tell readers how they should feel about something; it gives them the facts they need to work that out for themselves. But in a social media context, people want quick takes, breezy bites of information (or infotainment) — not longform articles they’ll need a while to digest.
If people got their news directly from legitimate news organizations, or even from good aggregators like Google News, this wouldn’t be much of a problem. But for most Facebook users, the site is now their primary news source: they learn about the day’s events from what their friends are sharing. And what their friends are sharing isn’t likely to be solid reporting from the Washington Post or the Associated Press.
Shrinking circulation and sharing is a big problem for news outlets, which are hemorrhaging reporters at an alarming rate as readership falls, and ad revenue along with it. Few would dispute that we need more solid, investigative news reporting, not less, to keep a close eye on corporate and government corruption. But those ideals don’t appear to be a factor when people choose which links to share on Facebook.
The even bigger problem, though, is that when a populace is less well informed, it holds less power. The phrase “knowledge is power” has been repeated so many times (it dates back to at least the 7th Century CE) that people tend to tune it out, but it’s as important as it ever was. The more knowledge, the more information, the more facts you have, the more power you have.
It’s true when you are the CEO of a company and you understand its inner workings, its secret deals, its legal and fiscal liabilities, and you choose what to tell your underlings. It’s true when you’re an elected leader and you decide whether to tell the populace about a potential economic or defense problem — or when you deny the press access to your daily activities, as Trump is already beginning to do.
It’s also true when you are an average citizen and you’re making decisions about who to elect, what laws and ideas to support, what movements to get behind, whose products to buy — the everyday decisions of life.
Facts are the key component of a functioning democracy. And we’re running dangerously low on facts these days.
Vox recently reported on Facebook’s news-curation difficulties. For a while, the site had its own team of news curators — who were inexperienced in the field of journalism, and who were quickly fired after accusations that their news selections were biased. Those curators were replaced by an algorithm, about which Facebook has said little.
As Vox explains:
This algorithm takes into account a variety of factors, like how close you are to the poster, how many times a post has been shared or liked by other Facebook users, the type of post (wedding and baby announcements seem to bubble up to the top), and so forth. And then it chooses the posts it thinks you’re most likely to enjoy — whether they were posted three minutes or three days ago — and puts them at the top of your news feed.
Most of us will only ever see a fraction of the things our friends post on Facebook, so the ability to decide which posts to show first amounts to the power to control which posts users read at all.
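To make concrete what that kind of curation involves, here is a minimal sketch of a feed-ranking heuristic using the factors Vox describes: affinity with the poster, engagement counts, post type, and recency. The factor names, weights, and decay formula are all my own illustrative assumptions; Facebook has not published its actual formula.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_affinity: float  # how close you are to the poster, 0.0-1.0 (assumed scale)
    engagements: int        # shares + likes so far
    type_weight: float      # e.g. wedding/baby announcements might score higher
    age_hours: float        # how long ago the post went up

def score(post: Post) -> float:
    """Toy relevance score: affinity, engagement, and post type boost a
    post, while age gently decays it. Weights are illustrative only."""
    return (post.author_affinity * 2.0
            + post.engagements * 0.1
            + post.type_weight) / (1.0 + post.age_hours / 24.0)

feed = [
    # A close friend's baby announcement, three days old.
    Post(author_affinity=0.9, engagements=12, type_weight=1.5, age_hours=72.0),
    # A viral link from an acquaintance, three minutes old.
    Post(author_affinity=0.2, engagements=40, type_weight=1.0, age_hours=0.05),
]

# The feed shows the highest-scoring posts first, whether they were
# posted three minutes or three days ago.
ranked = sorted(feed, key=score, reverse=True)
```

The point of the sketch is the design choice, not the numbers: once posts are ordered by a learned or hand-tuned score rather than by time, the ranking function, not the user, decides what gets seen at all.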
So no, Mark Zuckerberg, I don’t think you can get away with saying that 99 percent of what circulates on Facebook is “authentic,” whatever that means. It’s clear that people rely on Facebook for their news, that much of what gets shared on Facebook is biased or misleading in some way, and that Facebook’s own algorithm is further shaping what people see, potentially leading them further away from the facts they need to be powerful participants in our democracy. Take responsibility for what is happening on your site, and find ways to help your users locate the news, reporting and knowledge they need to function in society.