However, it seems that YouTube has come up with a solution to that problem, which is to isolate these videos in a “limited state”. According to YouTube, “We’ll soon be applying tougher treatment to videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism. If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state”.
They add, “The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes. We’ll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter.”
Much like unlisted videos, these videos will not be discoverable by the general YouTube community, which means that, at least on the surface, YouTube will be protecting its users from such content. This latest effort is just another part of YouTube’s overall plan to keep its platform as “safe” as possible. Last month, YouTube announced that it will now redirect searches for extremist videos to anti-terrorism playlists instead.