Here at HardOCP, I think the sheer volume of garbage that makes its way into Facebook posts before moderators take it down is common knowledge. Facebook itself has the daunting task of trying to moderate all that content, and according to a recent writeup from The Verge, it subcontracts some of those moderation duties out to a company called Cognizant. While Cognizant employees allegedly have to sign a strict NDA, The Verge managed to interview a few of them, and what they found isn't pretty. The moderation work takes a serious mental toll on Cognizant's employees, who don't enjoy the generous benefits Facebook employees tend to get, and their descriptions also open a window into Facebook's internal moderation policies. For example, some posts that would seemingly violate Facebook's internal guidelines have to stay up because the group they target isn't categorized as a "protected characteristic" by Facebook, while other, similar posts get taken down. Facebook also reportedly updates its moderation guidance on a near-daily basis, and conflicting sources of information make it difficult for moderators to do their jobs consistently while trying to hit Facebook's target "accuracy" score. A tour of the moderation facility didn't paint it in a particularly bad light, but assuming these accounts are accurate, the work Facebook's moderators do is even harder than it appears. Thanks to cageymaru for the tip.
The fourth source is perhaps the most problematic: Facebook's own internal tools for distributing information. While official policy changes typically arrive every other Wednesday, incremental guidance about developing issues is distributed on a near-daily basis. Often, this guidance is posted to Workplace, the enterprise version of Facebook that the company introduced in 2016. Like Facebook itself, Workplace has an algorithmic News Feed that displays posts based on engagement. During a breaking news event, such as a mass shooting, managers will often post conflicting information about how to moderate individual pieces of content, which then appear out of chronological order on Workplace. Six current and former employees told me that they had made moderation mistakes based on seeing an outdated post at the top of their feed. At times, it feels as if Facebook's own product is working against them. The irony is not lost on the moderators. "It happened all the time," says Diana, a former moderator. "It was horrible - one of the worst things I had to personally deal with, to do my job properly." During times of national tragedy, such as the 2017 Las Vegas shooting, managers would tell moderators to remove a video - and then, in a separate post a few hours later, to leave it up. The moderators would make a decision based on whichever post Workplace served up. "It was such a big mess," Diana says. "We're supposed to be up to par with our decision making, and it was messing up our numbers."
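The failure mode described here is easy to reproduce: when a feed ranks posts by engagement rather than recency, an older instruction that has had hours to accumulate reactions can outrank a newer correction. Below is a minimal sketch of that effect; The Verge doesn't describe Workplace's actual ranking signals, so the scoring and all names here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class GuidancePost:
    guidance: str
    posted_at: int   # hours since the incident began (hypothetical)
    reactions: int   # likes + comments, a crude stand-in for "engagement"

# Two conflicting instructions during a breaking news event: the older
# post has had more time to collect reactions than the correction.
posts = [
    GuidancePost("Remove the video", posted_at=1, reactions=140),
    GuidancePost("Leave the video up", posted_at=4, reactions=12),
]

# A chronological feed surfaces the newest guidance first.
chronological = max(posts, key=lambda p: p.posted_at)

# An engagement-ranked feed surfaces the stale post instead,
# because it has accumulated more reactions.
engagement_ranked = max(posts, key=lambda p: p.reactions)

print("Chronological top post:    ", chronological.guidance)      # Leave the video up
print("Engagement-ranked top post:", engagement_ranked.guidance)  # Remove the video
```

A moderator who trusts whatever sits at the top of the engagement-ranked feed ends up acting on guidance that was superseded hours earlier, which is exactly the mistake the six employees described.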