Thanks to the infinitely depressing degree to which COVID-19 has kept everyone trapped inside, Discord is more relevant than ever. But as the company's latest transparency report reveals, that relevance has brought new challenges, as well as improved efforts to address problems Discord probably should have worked harder on sooner.
Discord, which is reportedly in talks with Microsoft to sell for around 1.3 Bethesdas, published the transparency report today. Amid standard operational information about Discord's second half of 2020, a few details stood out. For one, the overall number of user reports increased fairly steadily throughout 2020 (from 26,886 in January to 65,103 in December), with an initial spike in March. That makes sense; people got trapped in their houses, and as a result, Discord grew rapidly. Spam accounted for the largest number of account deletions (more than 3 million), with exploitative content, a category that includes non-consensual pornography, a distant second (129,403) and harassment an even more distant third (33,615).
Discord also noted that of the reports it received, it most often took action on issues involving material harmful to children, cybercrime, doxxing, exploitative content, and extremist or violent content. "This can be explained in part by the team's prioritization in 2020 of issues likely to cause real-world harm," the company said in the transparency report.
In fact, according to the report, Discord removed more than 1,500 servers for violent extremism in the second half of 2020, which it said was "nearly a 93% increase from the first half of the year." It cited groups like the Boogaloo Boys and QAnon as examples.
"This increase can be attributed to the expansion of our anti-extremism efforts, as well as growing trends in the online extremism space," the company wrote. "One of the online trends observed in this period was the growth of QAnon. We adjusted our efforts to address the movement, ultimately removing 334 QAnon-related servers."
Cybercrime server deletions similarly skyrocketed over the course of 2020, up 140% from the first half of the year. In all, Discord removed nearly 6,000 servers for cybercrime during the second half of 2020, which it attributed to a significant increase in reports: more cybercrime spaces got flagged to its trust and safety team, and ultimately more were removed.
Discord also stressed its focus on methods that allow it to "proactively detect and remove the most harmful groups from our platform," pointing to its anti-extremism efforts as an example, but also acknowledging a place where it made a mistake.
"We were disappointed to realize that during this period, one of our tools to proactively detect [sexualized content related to minors] servers contained a bug," Discord wrote. "There were fewer overall flags for our team as a result. This error has since been resolved, and we have resumed actioning the servers the tool surfaces."
The other issue here is that Discord made its concerted effort to remove QAnon content at the same time other platforms did: after the lion's share of the damage had already been done. While the removal may have been proactive by Discord's internal definition, platforms in general were slow to act even reactively when it came to QAnon, which led to real and lasting harm in the United States and around the world. Back in 2017, Discord also served as a major staging ground for the Unite The Right rally in Charlottesville, Virginia, which ultimately resulted in violence and three deaths. The platform has tried to clean up its act in the years since, but it hosted an abundance of abusive and far-right activity in 2017.
Some transparency is much better than none, but it's worth noting that transparency reports from tech companies often provide little insight into how the platforms that essentially govern our online lives actually make decisions and set priorities. Earlier this year, for example, Discord banned the r/WallStreetBets server at the height of the GameStop stonksapalooza. Onlookers suspected foul play, outside interference of some sort. Speaking to Kotaku, however, two sources made it clear that Discord's labyrinthine internal moderation policies drove that decision. Bad timing and poor transparency before and after took care of the rest.
That's just one small example of how this dynamic can play out. There are many more. Platforms can claim to be transparent while ultimately just handing people a pile of barely contextualized numbers. It's hard to say what real transparency should look like in the age of all-encompassing tech platforms, but this isn't it.