Discord says it has banned more than 2,000 extremist communities: NPR


The group chat app Discord announced on Monday that in the second half of 2020, it had removed more than 2,000 communities dedicated to extremist causes, more than 300 of which focused on the baseless QAnon conspiracy theory.

picture alliance / picture alliance via Getty Images

Discord, the group chat app that grew rapidly during the coronavirus pandemic, removed more than 2,000 communities dedicated to extremism and other violent content in the second half of last year, the company reported Monday.

Discord officials said that of the 2,212 extremist and violent communities removed from the platform, around 1,500 were first detected by the company itself, almost double the number banned for extremist content in the first half of 2020.

“We continue to believe that there is no place on Discord for groups that organize around hatred, violence or extremist ideologies,” Discord said in its latest transparency report.

The enforcement measures come at a critical time for Discord, with tech giant Microsoft reportedly in talks to acquire the social network for $10 billion.

Discord is a social media site made up of mostly invitation-only group chat rooms, where users generally communicate anonymously. Founded in 2015 as a hub for gamers, Discord has recently branched out into a gathering place for activities like book clubs, karaoke and Wall Street trading forums.

Among the forums deactivated in the latest round, some were devoted to the anti-government Boogaloo movement. Discord also said it saw a spike in activity in communities devoted to QAnon, the pro-Trump conspiracy theory. From July to December, Discord deleted 334 communities linked to QAnon.

Overall, Discord has shut down nearly 30,000 communities across the site for various types of abuse. The most frequently cited violations involved cybercrime and exploitative content, which includes revenge pornography and sexually explicit content involving minors.

Once seen as a haven for white nationalists and other hate groups, Discord has worked to kick users who incite violence, and the dangerous communities they form, off the platform since the deadly Unite the Right rally in Charlottesville, Va., in 2017. Discord was used heavily by many of the people who planned that event, prompting the platform to step up its moderation policies.

While rioters who stormed the Capitol in January communicated on a variety of social networks like Facebook and Twitter and smaller sites more open to far-right commentary like Parler and Gab, the left-wing group Unicorn Riot documented 18 communities, which Discord calls servers, frequented by some who took part in the siege of the Capitol.

William Partin, a research analyst at Data & Society, a nonprofit that studies disinformation online, said Monday's report shows that Discord remains concerned about another possible far-right infiltration like the one the social network saw in the run-up to the deadly Charlottesville rally.

"While reports like this are part of a public relations campaign to say, 'Hey, we're taking this seriously,' I think it's also proof of the tremendous progress they've made," Partin said.

Still, the report offers only a limited overview, he said. Most moderation on Discord is handled by its own users, who serve as administrators enforcing rules and standards.

"This is, of course, to some extent a way of outsourcing the highly skilled work of moderation and community management," Partin said, adding that while there are benefits to having peers within a community moderate its members, their actions are not publicly documented.

Discord does not provide data on these volunteer moderators, including what kinds of toxic content they tolerated. Instead, Discord offers statistics on the actions its own staff take on the site, often in response to a user report. Unlike on other social media platforms, such as Twitter, where almost all conversations are public, most people experience Discord in small, private communities.

“So if I see someone harassing someone on Twitter, I can go report it, but I kind of gotta be in the right place at the right time on Discord,” Partin said.

According to Discord, 15% of its full-time staff are dedicated to trust and security, a percentage about the same as large social media companies like Facebook and Twitter.

In the second half of 2020, this team deleted some 266,000 accounts from Discord, primarily for violations of the site’s ban on exploitative content, which includes non-consensual pornography and sexual content involving minors.

During this period, more than 27,000 communities were banned, mainly for violating the rules of the platform against cybercrime.

Discord officials said that while harassment was the problem most frequently reported by users, cybercrime saw the biggest jump at the end of last year, increasing almost 250% from the first half of 2020.

Spam continues to plague Discord. Trust and safety officials deleted 3.2 million accounts for "spam behavior," according to the report, which counted spam deletions separately from removals in other categories.

Civil attorneys and prosecutors have sent a steady stream of requests to Discord, where chats are often private and accessible only to invited users, but communications are not encrypted.

In the last six months of 2020, the site complied with more than 1,100 subpoenas and 900 search warrants, according to the company.
