
Moderators say Facebook didn’t prepare them to catch Russian propaganda

The company relies on contractors who might screen thousands of ad components daily

Photo by Amelia Holowaty Krales / The Verge

When Facebook disclosed earlier this month that it was the target of a Russian propaganda campaign, it stayed light on details. The company said Russia-linked accounts had been used to purchase at least $100,000 in advertisements aimed at “amplifying divisive social and political messages.” But Facebook declined to provide any specific ads, to say who may have been a target of those ads, or to say how many people the ads may have reached. The incident raises questions about the thoroughness of Facebook’s ad review process.

According to four former Facebook workers who described the ad review system to The Verge, the company relies on workers reviewing different components of an ad, and asks them to review hundreds, if not thousands, of components a day. The workers, three of whom were at the company during the disclosed timeframe of the Russia campaign, described a rote, numbers-driven process that valued speed and automation over nuance, and that wasn’t built to catch something like political propaganda. “I know exactly what these guys did,” one former worker said about the Russian ad buyers. The person said they were reading the news and thinking, “it’s not hard to do.”

The workers said the ad review process was largely powered by a group of contractors in the United States, as well as another team in India. Multiple sources described the US group as a different “class” at the company, hired through a third-party business and sequestered from Facebook employees at large, even when they worked on-site alongside them. Most worked on contract for somewhere around $18 per hour.

Former contractors said that each worker would sit at a computer and, using an internal screening tool, review a different aspect of an ad. Ads were broken down into components: the image, different parts of the text, and the page the ad led to were screened in separate queues.
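To make that division of labor concrete, here is a rough sketch of how such a pipeline might route an ad’s pieces into separate queues. The queue names and fields are hypothetical, inferred from the workers’ description rather than from Facebook’s actual tooling.

```python
# Illustrative sketch only: the queue names, fields, and routing here are
# assumptions based on the workers' description, not Facebook's actual system.
from collections import defaultdict

def split_into_components(ad):
    """Break an ad into the separately reviewed pieces the workers described:
    the image, the text fields, and the landing page."""
    return [
        ("image_queue", ad["image_url"]),
        ("headline_queue", ad["headline"]),
        ("body_text_queue", ad["body_text"]),
        ("landing_page_queue", ad["destination_url"]),
    ]

review_queues = defaultdict(list)

def enqueue_ad(ad):
    # Each component lands in its own queue, so different reviewers can
    # handle different pieces of the same ad.
    for queue_name, component in split_into_components(ad):
        review_queues[queue_name].append({"ad_id": ad["id"], "component": component})

enqueue_ad({
    "id": 42,
    "image_url": "https://example.com/ad.jpg",
    "headline": "Example headline",
    "body_text": "Example body copy",
    "destination_url": "https://example.com/landing",
})
```

In a setup like this, no single reviewer necessarily sees the ad as a whole, which is consistent with what the workers described.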


Using a series of keyboard codes, similar to how a grocery check-out might ring up vegetables, the workers would punch in tags based on what appeared on their screens. The assembly-line division of labor helped scale the system, but it did not promote critical evaluation. “There was no quality metric that I guess we had,” said one former worker. But there were other metrics: the system tracked how long workers spent on each ad, as well as how many ads they flagged.
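As a rough illustration of that keypad-driven workflow, the sketch below maps single-key codes to tags and records the two things the workers said were measured: time spent and ads flagged. The specific codes and field names are invented for the example.

```python
# Hypothetical sketch of the keypad-style tagging loop the workers described.
# The code-to-tag mapping and the recorded metrics are assumptions.
import time

TAG_CODES = {
    "1": "violence",
    "2": "sexually_explicit",
    "3": "too_much_text",
    "4": "possible_scam",
    "0": "approve",
}

def review_component(component):
    """Prompt for a one-key code, record the tag, and track the metrics the
    workers said the system measured: time per decision and ads flagged."""
    start = time.monotonic()
    code = input(f"Ad {component['ad_id']}: enter code> ").strip()
    decision = TAG_CODES.get(code, "approve")
    return {
        "ad_id": component["ad_id"],
        "decision": decision,
        "flagged": decision != "approve",
        "seconds_spent": time.monotonic() - start,
    }
```

A reviewer working through a queue would call something like `review_component` hundreds of times a day, with the timer running on every decision.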

Many of the reasons for flagging were intuitive, and are disclosed openly by Facebook. Ads depicting violence or sexually explicit images were easily flagged, according to the workers, and rules disqualified ads with too much text or ads that appeared to be scams. But when I asked one person whether something as complex as an attempt to undermine democratic institutions could have been flagged, the person gave a deadpan response: sure, just pop in the keypad code for undermining democracy, and it’s good to go. “They weren’t screening for, like, propaganda or anything,” the person said.

There were no particularly onerous demands made on political ads, at least none that the front-line ad reviewers dealt with, although Facebook says it doesn’t allow campaigns that request information like political affiliation. The company also forbids “content that exploits controversial political or social issues for commercial purposes,” and on its face, this would seem to encompass the Russia ads, which dealt with issues like immigration and gun rights. The workers, however, said there was wiggle room. A Confederate flag could be banned when it was used to sell something. But one person said this rule was generally reserved for the most egregious acts, such as using political unrest as a fear-based marketing tool.

Reviewers were supposed to question what an ad buyer’s business model was, and this could have been one hurdle for the Russia campaign. But Facebook’s ad operation was so massive that bizarre ads came through regularly. Facebook has said the politically motivated ads in question were geographically targeted and that reviewers were told to watch for inappropriately targeted ads, but that doesn’t seem to have been an obstacle. The company is currently facing a scandal after ProPublica discovered that ads could be targeted to interests like “jew hater,” and controversy around targeting stretches back even further.
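For a sense of what screening for inappropriate targeting could look like, here is a naive sketch that checks an ad’s targeting criteria against a denylist. The specific terms, the geographic rule, and the threshold are assumptions for illustration, not Facebook’s actual rules.

```python
# Illustrative only: a naive check of targeting criteria against a denylist.
# The disallowed terms and the geographic rule below are assumptions.
DISALLOWED_INTERESTS = {"jew hater"}  # the kind of term ProPublica surfaced

def targeting_flags(targeting):
    flags = []
    for interest in targeting.get("interests", []):
        if interest.lower() in DISALLOWED_INTERESTS:
            flags.append(f"disallowed interest: {interest}")
    # Reviewers were reportedly told to watch for inappropriately targeted ads;
    # one crude proxy might be unusually narrow geographic targeting.
    if targeting.get("geo_radius_miles", 1000) < 5:
        flags.append("unusually narrow geographic targeting")
    return flags

print(targeting_flags({"interests": ["Jew hater"], "geo_radius_miles": 2}))
```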

Facebook did not respond to a request for comment on the precise failure point

Facebook didn’t respond to questions about its ad-screening techniques, or about how the Russia ads may have been screened. The workers said Facebook also uses an algorithm to screen ads, and that the contractors were constantly training the system as they worked. This raises the question of whether the Russian ads were approved by human hands at all, or passed entirely by machines. Facebook also has a sales team that sometimes handles large ad buys. But the company told CNN that the sales team did not make contact with the ad buyers, suggesting either the contractors or the algorithm was responsible for screening.
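Here is a minimal sketch of that human-in-the-loop arrangement, in which contractors’ decisions double as training labels for an automated screener that can then pass judgment on its own. The library, features, and threshold are assumptions for illustration, not a description of Facebook’s system.

```python
# Sketch of a human-in-the-loop screener: human reviewers' tag decisions
# become labels for a text classifier that can then approve or flag ads
# without a person looking at them. Model choice and threshold are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Labels come from the contractors' keypad decisions (1 = flagged, 0 = approved).
training_texts = [
    "Buy one get one free running shoes",
    "Miracle cure doctors don't want you to know about",
]
training_labels = [0, 1]

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(training_texts, training_labels)

def machine_screen(ad_text, threshold=0.5):
    """If the model is confident enough, the ad may never reach a human at all."""
    flag_probability = classifier.predict_proba([ad_text])[0][1]
    return "flag" if flag_probability >= threshold else "approve"

print(machine_screen("Limited time offer on a miracle cure"))
```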

Reporting from The New York Times has already suggested that, internally, Facebook workers are demanding more transparency about the ad process. That concern might be most intimately felt by the workers who were on the front lines. “I was so disappointed but also not surprised,” one of the former workers said, “and it freaks me out that I may have had a hand, in some small way, of disseminating those things.”