YouTube is letting users decide on terrorism-related videos

Nudity. Sexual activity. Animal abuse. All are reasons YouTube users can flag a video for removal from the website. Add a new category: promotes terrorism.

YouTube and its parent company, Google, have been criticized by lawmakers for refusing to prescreen militant speeches and propaganda videos that have been cited in more than a dozen terrorism investigations over the last five years.

But rather than submit to policies that many argue would amount to an erosion of 1st Amendment rights, particularly in an open-access environment such as the Internet, YouTube is taking a decidedly more democratic path — let the customers decide.

The approach puts YouTube in the middle of a debate over whether it is possible to protect free speech and deny militants a powerful recruitment tool — slick videos glorifying jihad that reach into the laptops and minds of disaffected young Americans.

After years of calling on YouTube to take down content produced by Islamic extremists, Sen. Joe Lieberman (I-Conn.) called the new flagging protocols a “good first step toward scrubbing mainstream Internet sites of terrorist propaganda.”

“But it shouldn’t take a letter from Congress — or in the worst possible case, a successful terrorist attack — for YouTube to do the right thing,” said Lieberman, whose staff has met with YouTube officials on the issue.

Yet the new category also is “potentially troubling,” said George Washington University law professor Jeffrey Rosen, because the phrase “promotes terrorism” is more subject to interpretation than the longstanding language in the YouTube guidelines that specifically forbids material that incites others to commit violence.

In November, YouTube removed hundreds of videos that featured the American cleric Anwar Awlaki, whom U.S. officials have designated a “global terrorist,” after Rep. Anthony Weiner (D-N.Y.) wrote then-YouTube Chief Executive Chad Hurley a letter detailing Awlaki’s appearance in more than 700 videos with 3.5 million page views on the site.

Despite YouTube’s action, dozens of Awlaki’s speeches are easily found on the site, and users who play the speeches are directed to dozens of other Islamic militant videos under a “suggestions” column.

YouTube has been a favorite tool of Awlaki, who is believed to be hiding in Yemen with other members of the organization Al Qaeda in the Arabian Peninsula. U.S. law enforcement officials think Awlaki’s preaching influenced Umar Farouk Abdulmutallab, who is accused of trying to blow up a plane over Detroit on Christmas Day; Faisal Shahzad, the would-be Times Square bomber; and Army Maj. Nidal Malik Hasan, who is accused of killing 13 people at Ft. Hood, Texas, in November 2009.

A 21-year-old Baltimore construction worker charged last week with plotting to blow up a military recruiting station called Awlaki a “real inspiration,” according to court documents.

U.S. investigators working on domestic terrorism cases during the last five years have repeatedly found Awlaki’s English-language speech “Constants on the Path to Jihad” shared among circles of would-be plotters. The speech, which is still on YouTube, is a lengthy interpretation of the religious justifications for fighting against perceived enemies of Islam.

If a father forbids his son to fight, Awlaki says at one point, the son should disobey. “When the command of Allah clashes with the command of the parents,” Awlaki says, “he will obey the command of Allah.”

After a 21-year-old woman told a British judge that she was inspired to stab a parliamentarian in March after watching Awlaki’s speeches on YouTube, Britain’s security minister, Pauline Neville-Jones, called on the U.S. “to take down this hateful material.”

“Those websites would categorically not be allowed in [Britain] — they incite cold-blooded murder and, as such, are surely contrary to the public good,” Neville-Jones said in an October speech in Washington.

YouTube executives say they are committed to ensuring that the website is not used to “spread terrorist propaganda or incite violence.” But given the massive amount of video uploaded to YouTube — more than 24 hours of video every minute — it is “simply not possible” to prescreen the content, YouTube executive Victoria Grand wrote in a Nov. 10 letter to Weiner.

YouTube relies on users to flag inappropriate videos, which are then reviewed by its employees. YouTube would not disclose how many reviewers it employs or which languages they understand. If the reviewers determine that a video contains nudity, animal abuse or hate speech, or incites violence, it is taken down for violating the site’s terms of use.

But when it comes to deciding whether a video constitutes religious free speech or promotes terrorism, YouTube says it aims “to draw a careful line between enabling free expression and religious speech, while prohibiting content that incites violence.”

It is admirable that YouTube devotes resources to consider religious speech on a case-by-case basis, said Rosen, the law professor. “It is precisely the speech of those we hate that needs the most protection if free expression is going to flourish.”

brian.bennett@latimes.com
