
Facebook and the 2016 Election

A brief history.


Facebook is, to its chagrin, part of the conversation about the outcome of the 2016 election. Here is a list of some special moments that we have all shared with Facebook:

  • Facebook has ~1.8 billion users, 66% of whom get news from the platform.
  • In May, Facebook was accused by conservatives of forcing its “Trending Topics” to have a left-leaning bias.
  • In late August, Facebook fired its “trending news team,” prohibiting them from talking or writing about what happened.
  • In September, Facebook/Oculus founder Palmer Luckey funded white-nationalist “meme magic” for Trump and Facebook censored a famous photo from the Vietnam War for nudity.
  • In October, Facebook began “choking off reach in the news feed.” It was harder for traditional publishers to get distribution without paying for it. They could presumably pay Facebook but they also began to pay celebrities directly. According to Digiday, “When Mic shared a story about convicted rapist Brock Turner on its own Facebook page, it gathered about 7,700 reactions and was shared 4,400 times. When [George] Takei shared that same story the next day, it got nearly three times as many reactions — over 22,000 — and drove over 5,000 shares.”
  • In early November, Facebook was revealed to be hosting viral political content that was made up out of whole cloth by Macedonian teens looking to make money. This content was apparently very popular and widely shared. For example, according to the sociologist Zeynep Tufekci, as quoted in the New York Times, “A fake story claiming Pope Francis — actually a refugee advocate — endorsed Mr. Trump was shared almost a million times, likely visible to tens of millions.”
  • Soon after the election, Facebook founder Mark Zuckerberg said it was “crazy” that the fake news had influenced the election. People on Twitter quickly responded by pointing out that Facebook eagerly sells its influence to advertisers, and took partial credit for mass pro-democracy activist uprisings during the Arab Spring. They asked: So which is it?
  • On November 8, Facebook generated tremendous engagement during the election. According to Forbes, “There were 115.3 million people on Facebook worldwide that generated 716.3 million likes, posts, comments and shares related to the election. There were 643 million views of election-related videos. And, over 10 million people in the U.S. shared on Facebook that they’d voted.”
  • One person with a close connection to Facebook also weighed in. On November 9, a former Facebook product designer, Bobby Goodlatte, wrote the following on Facebook: “This isn’t anybody’s fault. Nobody predicted this. But this must be a wake up call. My aim here is not to point fingers. Again, this isn’t about blaming or accusing. But this election is a clear mandate to act.”
  • On November 12, facing public criticism, Mark Zuckerberg issued a very carefully worded statement that accepted no responsibility for the election results. He downplayed the impact of fake news—“only a very small amount is fake news and hoaxes,” he explained, although he didn’t make clear how often fake news was shared. He then restated the larger issue of news reliability as, essentially, a product problem: “Identifying the ‘truth’ is complicated. While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted. An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual. I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves.”
  • Zuckerberg also linked to the “News Feed FYI,” for those wanting to learn more, but that was last updated in August.
  • Also on November 12th, BuzzFeed reported that the Trump campaign attributed a great deal of its digital fundraising success to Facebook’s advertising products.
  • Today, November 14, it was reported that Facebook apparently had a “fix” for fake news but didn’t want to release it, because doing so would upset the same conservative forces that had led to the “Trending Topics” debacle.
  • Also today, New York Magazine proposed how Facebook could deal with its fake news problem: “…It will need to bring back and expand the human editorial teams, and partner product managers and engineers with people with editorial expertise. Another important step, he says, would be to create a public editor position, similar to the role at the New York Times — someone who acts as a liaison between readers (or users) and the organization itself.”
  • Finally, also today, Tressie McMillan Cottom, a sociologist and activist who writes about digital media, race, and many other issues, had her Facebook account deleted because trolls reported her for violating Facebook’s “real names” policy: “As of 1 PM on November 14, 2016, Facebook had indeed locked me out of my account. I was able to extract ten years of photos first, which is important. But, it does mean both my professional and personal accounts are gone. So, whomever reported me won this round.”

So that’s Facebook.

From a tweet by Matt Haughey

“Is Facebook a media company?”

This is a big question that often gets asked about technology giants. It’s important to understand that “media” here is not just “thing that delivers news and entertainment” but rather “corporation with primary mission of providing a revenue-driving platform that can deliver information and advertising to an audience.”

To technology companies, being a “media company” is basically a death sentence. Look at Google: It’s an advertising company dependent on people publishing web pages on the Internet, but actually look over here at Alphabet, at these self-driving cars and immense opportunities. Media companies have unions and ombudswomen and declining growth. Technology companies fund trips to Mars. So, as Nick Carr wrote in September:

Facebook is an automated data processing company that manages — brilliantly, by any technical standard — an extraordinarily complex network graph, one with well over a billion nodes. To an outsider, the nodes may look like persons or readers or consumers, and the data may look like news stories or photographs or advertisements. But to Facebook they’re just numbers, just the mathematical abstractions of graph theory. Facebook uses software algorithms to optimize data flows among the nodes on its graph in a way that produces a pattern of network activity that maximizes the flow of a certain kind of data (dollars) to one particular node (the one labeled “Facebook”). That’s its business.

“Is it a media company?” is a meaningless question, so avoid it. Facebook is obviously an influence company. It has no significant physical assets and makes no significant consumer electronics products (its mobile phone play was a mess), and its market cap is 150 times greater than that of the New York Times. (But only twice as much as Comcast.) So it’s selling something.

I don’t know how Facebook thinks—who could? But it might be something like this: The real engagement drivers are fun videos, viral things about human cultural identities, the world’s largest banana splits, racist things, boobs, and things about aliens. Everyone is upset today that Macedonian teenagers searching for a quick buck wrote that Hillary Clinton had an FBI employee murdered. But what about tomorrow? Let’s say you blocked all the publishers from Macedonia. Are you also going to take away the articles that say that healing crystals can lower blood pressure? What about if CBS Sunday Morning does a piece about angels?

Facebook, of necessity, sees cultural problems as product problems. I mean, what can you do? Facebook grew so huge, so fast, that it faces entirely new classes of cultural problems every day, and there is literally no precedent. Plus it creates cultural problems that it must also clean up. None of this is Facebook’s fault. It’s just more that our civic and personal life is, in a real and demonstrable way, a side-effect of revenue-driving decisions made by Facebook. We let that happen! No one forced us.

Facebook must find dealing with “The News” exhausting. Everything it does is going to get professional news people angry, which is frustrating because they control the media. Plus having all that media inside of it, and reliant on it for success, forces Facebook to have a different kind of ethics—antiquated media ethics instead of cool Silicon Valley ethics. This is because once you let media in, its ethos infects everything. And everyone yells at you. As tech giants learn over and over, media is a viral industry.

Facebook likes automated solutions that are cheap and require as little human intervention as possible. This is a problem if it is to be the center of global civic life. One result is that it was comfortable firing all the editors who worked on Trending Topics. Another is that strangers could “report” Tressie Cottom and she could not keep her Facebook account from being destroyed.

For Mark Zuckerberg to say that less than 1% of news on Facebook is a hoax is a little like saying that less than 1% of your brain is malignant cancer. It’s not the 1% but the malignancy. It sounds like things are 99% okay, but it’s actually a very bad diagnosis.

Paul Ford is a co-founder of Postlight. 

Story published on Nov 14, 2016.