YouTube says it’s now banning videos that include hate speech or promote supremacist views—a move that comes 14 years after the site’s inception and after hundreds of millions of videos have been posted.
The ban covers videos that suggest one group of people is superior to another based on factors including race, gender, age, or sexual orientation, YouTube said on Wednesday. Some examples include videos that glorify Nazi ideology or deny documented events like the Holocaust.
Additionally, the service said it will remove ads from video channels that repeatedly violate its new policy. Many video creators get a cut of ad revenue and removing that source of money makes it more difficult for them to continue producing videos.
“The openness of YouTube’s platform has helped creativity and access to information thrive,” YouTube said in a blog post on Wednesday. “It’s our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination and violence.”
YouTube did not respond to a request for comment about why it took so long to ban supremacy videos. Previously, it has said it had to remain balanced in its approach to protecting free speech, although it has cracked down on videos that contain nudity and profanity.
The latest policy shift follows some smaller steps YouTube took to mitigate the problem in 2017, when it started to cut down on recommending certain controversial videos to viewers. The move reduced the distribution of these videos on average by 80%. YouTube also removed features like comments from the videos and didn’t allow them to be shared.
YouTube’s decision comes as it ramps up policing of its site amid complaints that it’s become a cesspool of discrimination, harassment, and disinformation. But amid the crackdown, the site has faced mounting pressure from conservatives, who say that their posts are being targeted by liberal-leaning tech companies.
YouTube has had plenty of time to address the problem of supremacist videos since its founding in 2005. And, as at other social networks like Facebook, hate speech and political extremism are major problems on the platform.
Last year, viewers flagged nearly 160,000 videos for hate speech or political extremism—YouTube reports these two items together—according to YouTube’s latest report about user violations on its service. The service removed or blocked nearly 45,000 of those videos, less than 30% of those reported in the category.
Hate speech and political extremism together make up the biggest category of complaints at YouTube. Defamation or insults is second, with 97,000 reports.
Critics have complained that YouTube is inconsistent in enforcing its rules on hate speech and harmful content. Over the past couple of years, those complaints have been getting louder.
For example, Carlos Maza, a writer at Vox, has publicly complained that YouTube personality Steven Crowder has used his YouTube channel to repeatedly attack Maza’s sexual orientation. YouTube initially sided with Crowder, saying in a tweet on Tuesday that the videos were “clearly hurtful,” but that they didn’t violate YouTube’s policies.
On Wednesday, YouTube revised its stance, saying it would no longer allow ads on Crowder’s channel due to “a pattern of egregious actions.”
YouTube’s new policy for supremacy videos is already creating complaints about how it’s being enforced. Ford Fischer, editor in chief of online media outlet News2Share, said on Twitter that “within minutes” of YouTube’s announcement his website was stripped of ads. YouTube also removed two of Fischer’s videos, one of which showed activists confronting a Holocaust denier who cast doubt on the historical event. The other video, which was used in a PBS documentary, showed raw footage of a speech by anti-Semitic conspiracy theorist Michael Peinovich, known as Mike Enoch.
On Wednesday, as part of its new policy against supremacist videos, YouTube said it would not remove videos that expose hate or provide analysis of current events.