A Bold New Scheme to Regulate Facebook

Without screwing it up

Mark Zuckerberg addresses journalists at Facebook's annual F8 developer conference last month. (Stephen Lam / Reuters)

Facebook, as a company, works hard to appear neutral.

It talks up the value of openness, expression, and connection—ideas with no particular partisan valence. It has donated to both the Democratic and Republican national conventions this year—even as its CEO, Mark Zuckerberg, implicitly criticizes Donald Trump. And it always writes in the same calm, blandly upbeat voice. Facebook wants to be a bright, limpid window between you and your friends, interfering only to show the occasional ad.

But this week, Facebook’s reputation for neutrality took a major hit. Gizmodo’s Michael Nunez reported that workers tampered with the stories that showed up in its “Trending” module, a list of popularly discussed news events that appears in Facebook’s mobile app and in the top-right corner of its homepage. While the stories in the list often represented legitimately popular topics, the human contractors who controlled it—and who wrote one-sentence summaries of the event in question—sometimes skewed what was actually being said.

According to the report, this manipulation took several forms. First, the curators writing the “trending” headlines might “inject” a topic into the trending list—such as an atrocity in Syria, or a prominent Black Lives Matter protest—even if no one was talking about it, especially if they felt that it better represented the day’s news budget or if outside critics complained it should have been present.

But second, and more seriously, one former worker told Nunez that the workers sometimes missed or ignored popular discussion topics among conservative users because “either the curator didn’t recognize the news topic or it was like they had a bias against Ted Cruz.” He went on to say that many of the Ivy-educated contractors working for Facebook simply did not recognize the significance of the event in question—or, worse, recognized a story’s significance and hoped to downplay it.

These allegations are immediately tricky, because not all of the examples cited by the employee indicate journalistic malpractice. Curators were told not to “trend” a story if only Newsmax or Breitbart, and not The New York Times or CNN or the BBC, had reported it. This is just good editorial guidance: Breitbart and Newsmax have a reputation for playing fast and loose with xenophobia and conspiracy theories, and the major networks are much better sourced. As Will Oremus writes at Slate, much of the “trending” story demonstrates only that editorial judgment can never be fully automated or completely neutral.

Even when a “trending” decision is algorithmic, there is precedent for filtering some topics out. In 2010, Twitter tweaked its trending-topics list so that perennially popular names, like Justin Bieber, did not permanently sit atop it. (This is somewhat like taking the Bible off the monthly bestseller list.)
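For the curious, here is a minimal sketch of what that kind of filtering can look like (illustrative Python only, not Twitter's actual algorithm): a topic counts as trending when its share of recent discussion far exceeds its long-run baseline, so a name that is always popular never crowds the list.

```python
from collections import Counter
from typing import Iterable, List

def trending_topics(recent_posts: Iterable[str],
                    historical_posts: Iterable[str],
                    top_n: int = 5,
                    burst_ratio: float = 3.0) -> List[str]:
    """Illustrative sketch only, not Twitter's real algorithm.

    A topic trends when its share of recent discussion is several times
    larger than its long-run share, so names that are always popular
    (Justin Bieber, circa 2010) never monopolize the list.
    """
    recent = Counter(recent_posts)
    baseline = Counter(historical_posts)
    recent_total = sum(recent.values()) or 1
    baseline_total = sum(baseline.values()) or 1

    scores = {}
    for topic, count in recent.items():
        recent_share = count / recent_total
        # Add-one smoothing so brand-new topics don't divide by zero.
        baseline_share = (baseline.get(topic, 0) + 1) / (baseline_total + 1)
        ratio = recent_share / baseline_share
        if ratio >= burst_ratio:
            scores[topic] = ratio

    return sorted(scores, key=scores.get, reverse=True)[:top_n]

if __name__ == "__main__":
    history = ["justin bieber"] * 500 + ["politics"] * 100 + ["sports"] * 100
    today = ["justin bieber"] * 60 + ["volcano eruption"] * 40 + ["politics"] * 20
    print(trending_topics(today, history))  # prints ['volcano eruption']
```

The point of the baseline comparison is the same as Twitter's 2010 tweak: "trending" rewards sudden surges of interest, not permanent celebrity.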

Yet bias on the world’s largest social network would still be bias. So Senate Republicans opened an inquiry into the company on Tuesday. Senator John Thune of South Dakota* formally asked Facebook to provide documentation of the “Trending” feature’s guidelines and to account for whether its curators had ever buried conservative-leaning news.

But even if Thune and other Republicans wanted to regulate Facebook, it’s not clear how they would do it.

“It’s all tricky, because it’s all speech,” says Jonathan Zittrain, a law and computer-science professor at Harvard University and a co-founder of the school’s Berkman Center for Internet and Society. At the end of the day, Facebook makes an editorial product, and like any editorial product it is protected under the First Amendment. But federal regulators or entrepreneurial legislators would still have several options.

First, the Federal Trade Commission could require that Facebook choose a slightly less objective-seeming label than “Trending” for the feature.

“They could call it ‘Special Topics’ or ‘Highlighted Topics’ instead of ‘Trending Topics,’” Zittrain told me. “But if you change the label, we’re back to where we started. And the sloppiness in labeling may represent a sensibility that says, ‘it’s really not our responsibility what goes on in the minds of our users insofar as it garners us more clicks,’” he said.

Congress could also insist that certain standards be upheld during curation. In the early 1990s, Congress began requiring cable companies to offer a broadcast station (like the local ABC or NBC affiliate) if the signal from that station’s antenna reached a cable subscriber’s home. The courts eventually upheld this “must carry” provision because it was “content neutral”—it regulated how speech was carried without reference to its meaning or political viewpoint.

But Zittrain said there may be an even more promising way to keep Facebook from acting against its users’ interests. In an unpublished paper that he is writing with Jack Balkin, a constitutional-law professor at Yale Law School, Zittrain recommends that companies holding massive repositories of user data—like Apple, Facebook, and Google—be offered a chance to declare themselves “information fiduciaries.” An information fiduciary would owe certain protections to its users, closing the “disconnect between the level of trust we place in [online services] and the level of trust in fact owed to us,” according to the paper.

The key to this idea? Facebook might opt into this regulation itself.

Right now, many technology companies are governed by a patchwork of state data-privacy rules, which impose a fractured and sometimes conflicting set of requirements on them. As Zittrain and Balkin write:

California, for example, requires companies who accidentally expose their customers’ personal data to notify those customers of such potential breaches. While the California law by its terms only requires such notifications to customers in California, companies end up notifying everyone in order to avoid leaving any Californians out.

Trying to comply with the whole body of interstate privacy laws has led some companies to say they would support a single federal privacy standard.

Technology companies might not even have to be compelled to comply with such a law; instead, they could opt into compliance. There’s precedent for this: The Digital Millennium Copyright Act of 1998 formally required companies to do very little to combat copyright infringement, but corporations that complied with its scheme were given broad legal immunity from liability for their users’ infringement.

“In the years since the DMCA’s passage, nearly all major intermediaries conform to its processes in order to avail themselves of the immunity,” says the paper. “While the copyright industries no doubt would have liked those processes to be required outright—and perhaps made stronger—they achieved a meaningful and lasting policy success in a difficult political environment.”

Zittrain and Balkin want to extend an analogous immunity from the state privacy-law patchwork to Facebook, Microsoft, and the other large tech companies—if they agree to act as information fiduciaries. They recognize that this means the information-fiduciary standard would have to be very good, but “such a trade-off offers a clear path to implementation against skeptical and otherwise-near-implacable opposition by the firms to be affected,” they write.

Beyond that, there’s one more way that Facebook could avoid allegations of bias. Zittrain thinks Facebook, as a kind of data wholesaler, ought to make a raw feed of friend activity available to all users. Then users could run their own News Feed-style algorithm on that data, and Facebook’s own algorithm would merely be one option among many, albeit the premier one.
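To make the idea concrete, here is a minimal sketch of that "bring your own algorithm" model (hypothetical Python; it assumes a raw, unranked feed of posts that Facebook does not actually expose): the user picks a ranking function, and an engagement-weighted ranker stands in, very loosely, for Facebook's vastly more elaborate News Feed algorithm.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Callable, List

@dataclass
class Post:
    author: str
    text: str
    timestamp: datetime  # assumed timezone-aware (UTC)
    likes: int

# A "ranker" is any function that orders the raw friend-activity feed.
Ranker = Callable[[List[Post]], List[Post]]

def chronological(posts: List[Post]) -> List[Post]:
    """The simplest alternative: newest first, no curation at all."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_weighted(posts: List[Post]) -> List[Post]:
    """A crude stand-in for a News Feed-style ranker: favor posts that
    are both recent and widely liked."""
    now = datetime.now(timezone.utc)

    def score(post: Post) -> float:
        hours_old = max((now - post.timestamp).total_seconds() / 3600, 1.0)
        return post.likes / hours_old

    return sorted(posts, key=score, reverse=True)

def render_feed(raw_feed: List[Post], ranker: Ranker = engagement_weighted) -> None:
    """Apply whichever ranker the user has chosen to the raw feed."""
    for post in ranker(raw_feed):
        print(f"{post.author}: {post.text} ({post.likes} likes)")
```

Under that arrangement, Facebook's ranker would simply be the default argument; switching to a chronological feed, or to one written by a third party, would be the user's choice rather than the company's.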

“It’s extremely limiting, it’s as if an iPhone could only run software from Apple,” he said of the current regime.

It is also, he admitted, unlikely. “Even though I’m in favor of that,” Zittrain told me, “I don’t see a legal hook.”


* This article originally stated that John Thune is a senator from North Dakota. We regret the error.

Robinson Meyer is a former staff writer at The Atlantic and the former author of the newsletter The Weekly Planet.