YouTube To Deploy More Human Moderators To Combat Abuse


YouTube has been having a bit of a content problem in recent times: with some clever maneuvering, certain content creators have managed to upload rather disturbing videos. YouTube has stated several times that it will crack down on this, and in its latest blog post, the company shares some details on how.

One of those ways is to hire more human moderators. While automated filters can catch certain videos, manipulation of a video and its title can sometimes slip past them, which is where human moderators come in, catching things the computer can't.

According to YouTube, "We will continue the significant growth of our teams into next year, with the goal of bringing the total number of people across Google working to address content that might violate our policies to over 10,000 in 2018. At the same time, we are expanding the network of academics, industry groups and subject matter experts who we can learn from and support to help us better understand emerging issues."

Granted, humans might not be as efficient as machines: we need rest, we take breaks, and sometimes we get careless. But hopefully the combined efforts of filters, AI, and human moderators can help make YouTube a safer place for everyone.
