Group-Chat App Discord Says It Banned More Than 2,000 Extremist Communities

Group-chat app Discord announced on Monday that in the second half of 2020, it took down more than 2,000 communities dedicated to extremist causes, of which more than 300 focused on the baseless conspiracy theory QAnon.
picture alliance via Getty Images

Discord, the group-chat app that has grown rapidly during the coronavirus pandemic, removed more than 2,000 communities dedicated to extremism and other violent content in the second half of last year, the company reported on Monday.

Officials at Discord said that of the 2,212 extremist and violent communities taken down from its platform, about 1,500 were first detected by the company. That is nearly double the number that was banned for extremist content in the first half of 2020.

"We continue to believe there is no place on Discord for groups organizing around hate, violence, or extremist ideologies," Discord said in its latest transparency report.

The enforcement actions come at a critical time for Discord, with tech giant Microsoft reportedly in talks to acquire the social network for $10 billion.

Discord is a social media site made up mostly of invitation-only group-chat rooms where users typically communicate anonymously. Founded in 2015 as a hub for gamers, Discord has more recently become a gathering place for book clubs, karaoke and Wall Street trading tips.

Among the forums disabled in the latest round were some devoted to the anti-government Boogaloo movement. Discord also said it had seen a spike in activity from QAnon, the pro-Trump conspiracy theory, in its communities. From July to December, Discord deleted 334 communities related to QAnon.

Overall, Discord shut down nearly 30,000 communities across the site for various types of abuse. The most frequently cited violations were cybercrime and exploitative content, which includes revenge porn and sexually explicit content involving minors.

Once seen as a haven for white nationalists and other hateful groups, Discord has been working to expel violence-promoting users and dangerous communities from the platform ever since the deadly Unite the Right rally in Charlottesville, Va., in 2017. Discord was used heavily by many who planned that event, prompting the platform to step up its moderation policies.

Rioters who stormed the Capitol in January communicated on a variety of social networks, from Facebook and Twitter to smaller sites more open to far-right commentary, like Parler and Gab. Still, the left-leaning group Unicorn Riot has documented 18 Discord communities, which the platform calls servers, frequented by some who took part in the Capitol siege.

William Partin, a research analyst at the nonprofit Data & Society who studies disinformation online, said Monday's report shows that Discord continues to be concerned about another possible "infestation of far-right groups" that the social network saw in the lead-up to the deadly Charlottesville rally.

"While reports like this one are part of a public relations campaign that's meant to say, 'Hey, we're taking this seriously,' I think it's also evidence of the considerable progress they've made," Partin said.

Yet the report offers just a limited snapshot, he said. Most moderation on Discord is handled by its own users, who act as administrators enforcing rules and norms.

"This is of course to some extent just outsourcing the highly skilled labor of moderation and of community management," Partin said, adding that though there are advantages to having a community's peers keep community members in line, their actions are not publicly documented.

Discord does not provide data about these moderators, including what kinds of toxic content they have tolerated. Rather, it offers statistics about what company officials do across the site, often following a user report. Most people experience Discord in small, private communities, unlike platforms such as Twitter, where nearly all conversations are public facing.

"So if I see someone harassing someone on Twitter, I can go and report it, but I kind of have to be in the right place at the right time on Discord," Partin said.

According to Discord, 15% of its full-time staff members are dedicated to trust and safety, a share roughly in line with larger social media companies like Facebook and Twitter.

In the second half of 2020, that team deleted some 266,000 accounts from Discord, overwhelmingly for violations of the site's prohibition on exploitative content, which includes nonconsensual pornography and sexual content involving minors.

During that period, more than 27,000 communities were banned, mostly for violations of the platform's rules against cybercrimes.

Discord officials said that while harassment was the most frequent issue reported by users, cybercrime experienced the largest jump late last year, increasing nearly 250% from the first half of 2020.

Spam continues to plague Discord. Trust and safety officials removed 3.2 million accounts for "spammy behavior," according to the report, which separated spam removals from takedowns in other categories.

Civil lawyers and prosecutors sent a steady stream of requests to Discord, where chats are often private and accessible only to those invited but are not encrypted.

In the last six months of 2020, the site complied with more than 1,100 subpoenas and 900 search warrants, according to the company.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Bobby Allyn is a business reporter at NPR based in San Francisco. He covers technology and how Silicon Valley's largest companies are transforming how we live and reshaping society.