DooKey
[H]F Junkie
- Joined
- Apr 25, 2001
- Messages
- 13,500
According to current and former contract workers responsible for training YouTube's AI, the guidelines used to train it are often contradictory and confusing, and this is contributing to child exploitation videos remaining on YouTube. YouTube has promised to hire 10,000 moderators to remove inappropriate videos, but it might also want to take a hard look at the guidelines used to train its AI.
These documents and interviews reveal a confusing and sometimes contradictory set of guidelines, according to raters, that asks them to promote “high quality” videos based largely on production values, even when the content is disturbing. This not only allows thousands of potentially exploitative kids videos to remain online, but could also be algorithmically amplifying their reach.