Despite instituting a more stringent policy on hateful content in June, YouTube has been criticized for doing too little and not providing enough transparency.
In a blog post on Tuesday, YouTube said it removed more than 100,000 videos and more than 17,000 channels for violating its hate speech rules from April through June, five times as many as it removed in the previous three months. It also took down over 500 million comments for hate speech.
YouTube attributed the increase to its recent efforts to counter the proliferation of hate content. In June, the company updated its hate speech policy to include a ban on supremacist content and the removal of videos that deny well-documented atrocities, such as the Holocaust and the 2012 shooting rampage at Sandy Hook Elementary School.
YouTube, which is owned by Google (GOOGL), also said it has been able to remove more objectionable content before it is widely viewed, efforts that have resulted in an 80% decrease in views of content later taken down for violating YouTube's rules.
Despite the new policy, YouTube has not taken action on channels belonging to prominent purveyors of hate, such as white supremacist Richard Spencer and former KKK leader David Duke.
Earlier this month, the Anti-Defamation League released a report that found at least 29 YouTube channels espousing anti-Semitic and white supremacist content. While some of the channels named in the report have since been taken down, many remain on the platform.
In the last week, YouTube has also flip-flopped on content decisions. For example, it deleted and then reinstated several channels with white nationalist views, including Vdare and the Iconoclast.
“We realize that many may find the viewpoints expressed in these channels deeply offensive. However, after a re-review of the content, we reinstated the channels,” a YouTube spokesperson told CNN Business.
While YouTube didn’t provide a clear explanation of its decision-making process, the spokesperson said it generally removes channels that are entirely dedicated to violating its policies or that repeatedly violate its guidelines.
On Tuesday, YouTube also said its machine learning systems are improving. More than 87% of the 9 million videos it removed during the second quarter were first flagged by its automated systems. Videos can be removed for a variety of reasons beyond hate speech, including copyright infringement, violence, nudity and spam.
Last week, YouTube CEO Susan Wojcicki reiterated that the company is committed to being an open platform.
“A commitment to openness is not easy. It sometimes means leaving up content that is outside the mainstream, controversial or even offensive,” she wrote in a letter to people who create content for YouTube. “But I believe that hearing a broad range of perspectives ultimately makes us a stronger and more informed society, even if we disagree with some of those views.”