The algorithm is working as intended, but it's being exploited. The algorithm exists for convenience and to consolidate ad revenue. Take away the algorithm and people will start getting advertisements for things that have nothing to do with their demographic, and/or will have to search for specific videos instead of having them show up in their feed, and then people will bitch even more.
The issue is that it's not YouTube's place to police behavior online. That is the parents' responsibility. YouTube can only curb things that violate its terms of service. An example would be Kyle starting to host videos of video card unboxings, then some people finding them sexual and commenting on them to promote that sexuality. Add to that that Kyle now hosts millions of videos on his site. How is he going to police it, other than by having people flag the videos, when the videos themselves have nothing overtly wrong with them? It's only a small subset of people sexualizing them, over things most would not find sexy. People say that children shouldn't be posting online. <sarcasm>Should we then keep them under lock and key in a closet so that no one will view them? And since we're talking about children, let's admit that it's the girls that are the problem. And since they obviously didn't learn to hide themselves away as girls, they should hide away as women too, because as we know, it's how a woman looks that dictates how she is sexualized.</sarcasm> That last part has some truth in it, of course, but at the same time, people will sexualize things. Should we take My Little Pony off the air because some sexualize it? What about Transformers? Robot Chicken? Trump?
How about starting at home and at school, educating children in ways that minimize threats, exposure, stupidity, etc.? The other thing: why should we have to censor children doing childish things? Should we then have them cover up at the playground because anyone can see them? What about at the public swimming pool? Hotels? Amusement parks?
To me, it seemed the guy was raising awareness of this more to bitch about monetization than out of any altruism.