
Social Media Platforms Aren't Doing Enough to Police Domestic Extremists, Report Finds


A new report from Digital Citizens Alliance and The Coalition for a Safer Web says social platforms need to do a better job cracking down on domestic extremists who continue to use digital platforms to recruit new members, spread misinformation and glorify their violent acts.

The 56-page report, released Monday, calls on Congress to enact new penalties for domestic terrorism. It also recommends that social media companies hire more people to police their sites and improve consistency in enforcement by sharing information about bad actors with other online platforms.

"They brag about their ability to track videos and be able to put ads that really relate to the experience. They need to use that prowess and stop treating this like a PR problem and start treating it like a national security issue that it is," said Tom  Galvin, Executive Director of Digital Citizens Alliance. 

Researchers from the DC-based non-profits teamed up and spent more than two months scouring the social media sites YouTube, Facebook, Instagram, Twitter, TikTok and Telegram. The report outlines hundreds of troubling posts, including pictures and videos believed to be from domestic terrorists, militias and other extremists.

"They used it to actually plan their activities [at the Capitol] for January 6th. It was a very convenient and effective way for them to do it," said  Galvin. "We found that they recruit via digital platforms just like jihadists do."

Galvin said social media companies have not only a legal obligation, but a moral one, to ensure hate groups aren't using the platforms to spread misinformation or recruit new members.

But he said the typical business model is to attract as much content and as many users as possible, and to monetize them, which can run contrary to the mission of removing posts or people who violate platform rules.

"We're not here to say the platforms  aren't doing anything, what we're saying is they're not doing enough," said  Galvin.

Galvin said once his researchers started searching for the posts, they encountered an endless loop of suggested content feeding them extremist ideas. He worries that with TikTok catering to younger users and talk of an Instagram platform for kids, a failure to crack down could help recruit a future generation of hate.

"Hopefully with this investigation in this report, we can add to the public discourse about the role of digital platforms, frankly, the rise of domestic extremism to really significant heights not seen since the  1990s. And that is something we need to pay very close attention to,"  Galvin said.

The social media platforms told News 4 they try to be proactive and take down extremist content before it's even reported. Several said they have already removed many of the posts and profiles researchers cited in the report.

"Violent extremism has no place on  TikTok," said a spokesperson.  "We work aggressively to stop the spread of disinformation and ban accounts that attempt to co-opt our platform for violent or hateful behavior."

According to TikTok's most recent transparency report, the platform removed 92.4% of violative videos before they were reported, and 93.5% were removed within 24 hours of being posted.

Digital Citizens Alliance acknowledged that removed content is often quickly replaced with a similar version or the same content under a different name, even citing a specific example from the report that is already back on Instagram. 

In a statement, a spokesperson for Instagram and Facebook wrote, "We expect this movement to change its tactics to try and evade our enforcement. Our teams are monitoring any changes in behavior and the types of content being shared so we can adapt our policies and enforcement as necessary. We continue to remove thousands of accounts, Pages, groups, events and profiles that violate our policies against QAnon."

A YouTube spokesperson took issue with some of the report's findings, writing: "We welcome more research on this front, but disagree with the assertion that we haven't done enough to combat extremist content on YouTube. In fact, in some instances, our policies and efforts to limit the spread of harmful misinformation through our systems are more comprehensive than those of other platforms."

YouTube pointed out that channels belonging to many extremist groups and leaders, including the Proud Boys and Chris Hill, as well as prominent QAnon figures, were terminated months prior to January 6th.

In the fourth quarter of 2020 alone, YouTube said it removed more than 13,000 channels and 72,000 videos for violating its violent extremism policy.

Twitter and Telegram did not respond to requests for comment. 
