Facebook's Reckoning Draws Nearer

Sooner or later, the company will be forced to take on the responsibilities that come with being the world's dominant news distributor.

Mark Zuckerberg speaks in 2015. (Reuters/Jose Miguel Gomez)

Nine months after Donald Trump won the presidency by unexpectedly carrying key states in the upper Midwest by slim margins, Facebook’s role in the 2016 election is still not clear.

Just in the last week, Facebook’s advertising has come under new scrutiny. Friday evening, The Wall Street Journal reported, and CNN confirmed, that special counsel Robert Mueller served the company with a search warrant to gather information on Russia-linked accounts that Facebook said had purchased $150,000 worth of ads on the platform.

Earlier in the week, ProPublica found that anti-Semitic terms could be used to target advertising, prompting Facebook to scramble to patch up its advertising-keyword system.

Both incidents highlighted a basic question: What should Facebook know about the automated systems that it uses to make money?

Facebook’s advertising platform allows anyone, really, to purchase advertising targeted at particular populations. That could be people who recently got engaged, graduates of San Jose State, Red Sox fans, or as ProPublica discovered, bigots.

Facebook’s response to the ProPublica investigation was that few people actually targeted ads to people who listed themselves as “Jew haters.” It was a capability that was merely latent in the system.

And the Russian purchase of $150,000 in ads is a drop in the bucket for Facebook, which booked $8.8 billion in revenue in the fourth quarter of 2016 alone.

But American elections are not “Facebook-scale.” They can be swung by thousands of people. And within a massive automated system like Facebook’s, it was possible to stash a targeted disinformation campaign around the 2016 election.

The great irony of Facebook is that a system built to connect people makes high-margin advertising dollars by doing away not just with the middleman, but with any man or woman. There doesn’t have to be any human between an advertiser and the people who are targeted.

In the past, salespeople and production teams would have provided a check on the advertising that ran. Now, the people are gone, but the algorithmic systems that Facebook has created are not yet up to the task.

And not just with the advertising ecosystem. Earlier this year, a New York Times Magazine story asked, more or less, whether Facebook’s News Feed was, on balance, bad for American democracy. The aperture of the critique keeps widening, too: a new Times story highlights that governments all over the world are passing laws to regulate Facebook and other internet platforms.

The new zeitgeist has forced Facebook’s leadership to accept some responsibility for the way it shapes the political information people receive.

“Giving everyone a voice has historically been a very positive force for public discourse because it increases the diversity of ideas shared,” Mark Zuckerberg himself admitted. “But the past year has also shown it may fragment our shared sense of reality.”

That followed the burst of “fake news” in the original sense of the coinage around the election: straight-up hoaxes perpetrated by a variety of actors, like the viral story that the Pope endorsed Donald Trump. (He did not.)

The real problem of “fake news” wasn’t merely that it could spread on Facebook’s system, but that the incentives of News Feed seemed to make fake news perform substantially better than real news. The problem isn’t nonsense-filled URLs themselves, but the distribution system that Facebook’s News Feed represents. If lies, hoaxes, or “truthiness” are so effective, that does not portend good things for factual accounts. The system, whatever else it might be, does not seem to be as neutral as Facebook would like it to be.

It’s possible that the Russian ad buy will end up connected to this larger complex of problems. Many media organizations use Facebook ads to test their content, not to distribute it. It could be that the ads were merely a good way to identify certain kinds of users or to find the most viral, divisive content.

Both kinds of knowledge could have multiplied the impact of any kind of disinformation campaign.

One thing seems clear: Facebook will end up before Congress in one way or another. Before congresspeople, demurrals won’t cut it. Facebook has to know itself better, even if that means hiring a lot more people. Facebook wants these things to be artificial-intelligence problems, but counterintelligence may be the more relevant field right now.

Alexis Madrigal is a contributing writer at The Atlantic and the host of KQED’s Forum.