Content creators on YouTube who follow all of the site’s rules may still face censorship by the platform, under new plans announced by Google.
According to a post on YouTube’s official blog, videos will now be subject to the rule of the mob. If enough users flag a video as “hate speech” or “violent extremism,” YouTube may impose restrictions on the content even if it breaks none of the platform’s rules.
We’ll soon be applying tougher treatment to videos that aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism. If we find that these videos don’t violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state. The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetized, and won’t have key features including comments, suggested videos, and likes.
YouTube has also rolled out a “trusted flagger” program, in which 15 “expert NGOs and institutions” help the company identify hate speech and extremism on its platform.
Among these organizations are the No Hate Speech Movement, a left-wing project pushed by the Council of Europe, as well as the Anti-Defamation League, an organization whose president has been accused of “manufacturing outrage” by the World Jewish Congress.
YouTube is also planning to artificially alter its search results so that searches for “sensitive” topics no longer return the most popular videos, but instead a “playlist of curated YouTube videos that directly confront and debunk violent extremist messages.”
The platform also plans to artificially promote videos created via its “Creators for Change” program, which, in YouTube’s words, features creators who are “using their voices and creativity to speak out against hate speech, xenophobia and extremism.”
We’ve started rolling out features from Jigsaw’s Redirect Method to YouTube. When people search for sensitive keywords on YouTube, they will be redirected towards a playlist of curated YouTube videos that directly confront and debunk violent extremist messages. We also continue to amplify YouTube voices speaking out against hate and radicalization through our YouTube Creators for Change program. Just last week, the U.K. chapter of Creators for Change, Internet Citizens, hosted a two-day workshop for 13-18 year-olds to help them find a positive sense of belonging online and learn skills on how to participate safely and responsibly on the internet.