Facebook Will Make More People Watch Violent Videos so You Don't Have To

Megalith · 24-bit/48kHz · Staff member · Joined Aug 20, 2006 · Messages: 13,000
Facebook currently has a moderation team of 4,500 people reviewing videos and other flagged reports, but Zuckerberg is adding another 3,000 to curb the steadily growing prevalence of disturbing content. He also hints at new tools that will make such material easier to report, though there is little elaboration on how those will compare to the current system. Do you have the stomach for this kind of job?

It is unclear whether these new team members will be officially hired as staff or brought on as contractors, as the company's current moderators are. According to Zuckerberg, this bolstered community operations team will be able to root out offensive content on Facebook more effectively, before too many people see it. Since launching Facebook Live last spring, the company has been mired in controversy over the feature. Everything from sexual assault to suicide has appeared on the platform, and many critics say the company rushed the product to market without adequately equipping itself for these challenges.
 
I couldn't do it. This is one of those jobs AI would be good for, unless it learned too much about our dark side.
 

I'm with you, I couldn't do it either. If an AI does it, though, this job could be what causes the uprising.
 
Didn't someone sue Facebook last year for psychological damages from doing this job?
 

That was Microsoft. I actually got to visit that physical site, it's pretty insane how much data they collect...and how much access they have to our systems.
 
I don't think watching the violence would bug me; I doubt I'll see anything worse than Iraq or Afghanistan. But the hours and hours of other plain idiotic shit people put up would probably be what breaks me, lol. And it probably pays like crap too.
 
I can imagine that there might be some toolsets to autoflag this stuff.

Recognizing objects is relatively easy these days. So an analysis that detects a weapon (firearm/knife/hand grenade) or certain words ("suicide", "I'm done with it", *gunshots* etc) might get flagged for viewing by a moderator.

At least, that's how I'd expect a company like Facebook to handle some of this. Some of it has to be automated, or it'll continue to have the 24-hour delays of recent weeks.
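The autoflag idea above could look something like this, a minimal sketch assuming a detector has already produced object labels and a speech-to-text transcript (all names here are hypothetical; a real system would use trained ML models, not keyword lists):

```python
# Hypothetical auto-flagging filter: queue a video for human review when
# an object detector sees a weapon or the transcript contains an alarming
# phrase. Names and word lists are illustrative assumptions only.

WEAPON_OBJECTS = {"firearm", "knife", "hand grenade"}
ALARM_PHRASES = {"suicide", "i'm done with it", "*gunshots*"}

def should_flag_for_moderator(detected_objects, transcript):
    """Return True if any detected object or transcript phrase suggests
    the video needs review by a human moderator."""
    # Weapon detected in the video frames?
    if WEAPON_OBJECTS & {obj.lower() for obj in detected_objects}:
        return True
    # Alarming phrase heard in the audio transcript?
    text = transcript.lower()
    return any(phrase in text for phrase in ALARM_PHRASES)

# A clip where the detector saw a knife gets queued for review;
# a harmless cat video passes through untouched.
print(should_flag_for_moderator(["knife"], "just cooking dinner"))  # True
print(should_flag_for_moderator(["cat"], "look at my cat"))         # False
```

Only flagged videos would ever reach a moderator's queue, which is how automation could cut the volume humans have to sit through.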
 
I heard it pays pretty well due to the amount of stress and insane stuff they have to put up with. But I imagine most of them burn out pretty quickly.

On the other hand, it is a form of censorship. There's nothing preventing them from blocking anti-Facebook stuff.
 

It's their platform, they can censor anything they want.
 
I feel sorry for the 3,000 Filipinos who will have to watch these stupid videos all day.
 
Facebook's AI already sucks at image assessment; videos, oh brother, and articles, heh. Nah, this is a human-only thing.

So anyone have a link where to sign up? How much $$ etc?
 