Amid harassment complaints, YouTube says it will remove more white supremacist content

YouTube is revising its hate speech policies to prohibit videos with white supremacist and neo-Nazi content amid controversy over the company's response to allegations of harassment.

On Tuesday, the insults were “clearly hurtful,” but they did not violate YouTube rules.

On Wednesday, they were part of “a pattern of egregious actions,” and as punishment, the company would suspend the offending channel’s YouTube revenue stream.

YouTube’s quick reversal on a high-profile harassment case is the latest illustration of a company struggling to define its rules and enforce them with any consistency. Amid the backlash, YouTube rolled out new rules barring certain white supremacist content, the kind of videos the company has faced years of criticism for hosting.

This controversy over homophobic, racist harassment on Google’s dominant video platform was two years in the making. Popular right-wing commentator Steven Crowder had at times used his YouTube channel to mock Vox Media video journalist Carlos Maza. In one video, Crowder called Maza, who is gay and Latino, “a lispy queer” and pantomimed oral sex. Every time Crowder posted a new video, Maza woke to incessant attacks from Crowder’s fans on his Twitter and Instagram accounts. Last year he was bombarded with hundreds of text messages after he was doxxed, meaning his personal information, including his phone number, was posted online.

Fed up with what he describes as harassment from Crowder and his followers, Maza last week stitched together clips in which the right-wing commentator made overt jabs at his sexuality and ethnicity, and, in a series of tweets, denounced YouTube for continuing to allow Crowder to harass him.

YouTube investigated the harassment and on Tuesday concluded that Crowder had not violated its policies and that the company would take no action.

The reasoning: Crowder’s videos weren’t primarily malicious attacks but rather expressions of opinion. YouTube said Crowder had not instructed his viewers to harass Maza, and none of Maza’s personal information was revealed in content uploaded by Crowder.

On Wednesday, the company reversed course, suspending Crowder’s channel’s YouTube revenue stream and citing “a pattern of egregious actions.”

(The tech giant spurred further confusion when it tweeted that Crowder could get back in good standing simply by removing a link to his e-commerce store that sold T-shirts emblazoned with a homophobic slur. The company later clarified that Crowder would need to address the complete pattern of bad behavior to again earn revenue directly from YouTube.)

Crowder has defended himself in videos, saying his comments were “friendly ribbing,” and he accused Vox of trying to “silence voices that they don’t like.” He did not respond to a request for comment.

But Crowder’s videos targeting Maza were a clear violation of YouTube’s long-standing harassment policy, said UC Irvine visiting researcher Kat Lo, who studies content moderation.

“The use of baited language and focus on a marginalized person has clearly incited harassment,” Lo said. “This case is especially interesting because the violation is so clear in terms of use of racist, homophobic and dehumanizing language, but then [YouTube’s] policy interpretation is very, very loose, which suggests there are other factors at play.”

It’s typical for companies to walk back decisions, or to moderate content more aggressively, in response to public backlash, Lo said.

YouTube’s latest attempt to curb hate speech is also a change in direction for a company that has been cautious about intervening in disputes over content.

The new rules, which YouTube outlined in a blog post, prohibit videos “alleging that a group is superior in order to justify discrimination, segregation or exclusion” based on race, gender, caste, religion, sexual orientation and other protected categories. That includes videos endorsing white supremacy and those denying well-documented violent events, such as the Holocaust and the 2012 mass shooting at Sandy Hook Elementary School.

YouTube said it would begin scrubbing offending videos from its servers, and that accounts that repeatedly cross the line would be subject to sanctions, including exclusion from revenue sharing and total suspension.

Though the policy’s prohibition on neo-Nazi speech is clear-cut, YouTube said it would not ban “borderline content and harmful misinformation such as videos promoting a phony miracle cure for a serious illness, or claiming the earth is flat,” but would instead attempt to tamp down circulation by limiting promotion of those videos. Channels that produce occasional hateful content can accumulate “strikes” rather than face an immediate ban.

YouTube is not alone in facing criticism for weak or inconsistent enforcement of hate speech and harassment policies. The video hub, along with Facebook and Twitter, has received heightened scrutiny in Washington for allowing false information and discriminatory ideologies to spread.

Conspiracy theorist Alex Jones built his audience on YouTube despite years of complaints about misinformation before he was ousted from the platform last year. Facebook followed suit in March, banning the InfoWars founder and several other controversial figures for violating policies on hate speech and promoting violence.

In late May, a distorted video of House Speaker Nancy Pelosi racked up millions of views on Facebook and spread on YouTube and Twitter. YouTube removed the video after the Washington Post published a report about it.

Lo, the UC Irvine researcher, said platforms are often hesitant to take down content from public figures, especially when removal might invite accusations of bias against conservatives.

She said YouTube’s decision to leave Crowder’s videos up gives creators a clear strategy for keeping hateful content on the platform: as long as most of their videos don’t contain hateful rhetoric, they can slip past moderation.

“I think they will be more empowered to do so rather than feeling pressure to not use bigoted language — which is just so absurd to me,” she said.

YouTube’s effort to demonetize hateful content will do little to stop bad actors, Maza said.

“Abusers use it as proof they’re being ‘discriminated’ against,” Maza tweeted. “Then they make millions off of selling merch, doing speaking gigs, and getting their followers to support them on Patreon. The ad revenue isn’t the problem. It’s the platform.”
