In 2010, Accenture signed an accounting contract with Facebook. By 2012, that had broadened to include a content moderation agreement, particularly outside of the United States.
That year, Facebook sent employees to Manila and Warsaw to train Accenture employees to sort posts, said two former Facebook employees involved in the trip. Accenture employees learned how to use a Facebook software system and platform guidelines to leave content in place, remove it, or escalate it for review.
What started out as a few dozen Accenture moderators has grown rapidly.
In 2015, Accenture’s San Francisco Bay Area office set up a team, named Honey Badger, dedicated solely to Facebook’s needs, former employees said. Accenture’s workforce on the account grew from around 300 people in 2015 to around 3,000 in 2016 – a mix of full-time employees and contractors, depending on location and task.
The company quickly parlayed its work with Facebook into moderation deals with YouTube, Twitter, Pinterest and others, executives said. (The digital content moderation industry is expected to reach $8.8 billion next year, according to Everest Group, roughly double the 2020 total.) Facebook has also awarded Accenture contracts in other areas, such as checking for fake or duplicate user accounts and monitoring celebrity and brand accounts to make sure they are not inundated with abuse.
After federal authorities discovered in 2016 that Russian agents had used Facebook to spread divisive messages to American voters ahead of the presidential election, the company increased its number of moderators. Mark Zuckerberg, Facebook’s chief executive, said the company would hire more than 3,000 people – in addition to the 4,500 it already had – to monitor the platform.
“If we are to build a safe community, we have to react quickly,” Mr. Zuckerberg said in a 2017 post.
The following year, Facebook hired Arun Chandra, a former Hewlett Packard Enterprise executive, as vice president of scaled operations to help oversee its relationships with Accenture and others. His division is overseen by Ms. Sandberg.