Can Discord consider building themselves a dedicated report system?
I'm a professional game moderator with eight years of experience working elsewhere.
I recently had to report a user from my server to Discord's team, and it was a strange user experience.
Firstly, there wasn't an actual "report" button anywhere to escalate reports to Discord, and it took some digging through procedures to figure out how to send in the information.
There really ought to be a "report user" system; it's a bit of a gamble to require users to go through the process of finding IDs and the like just to report safety violations.
Yes, it means having far more reports to process, and far more fake reports sent in, but that is the actual job of Discord and the work that comes with hosting an internet community. I think it's unrealistic to expect everyone who makes a server to know how to moderate at a professional level, and thus Discord really needs its own system.
Discord users, even those with moderator roles on servers, do not always have the means to investigate cases, and obviously cannot report things to NCMEC or the police. It should be easier to report to the company itself, and the company should take an active role in ensuring the safety of users.
Despite the number of very dedicated and mature server moderators and admins on Discord, there are a lot of gaps for things to fall through, and putting liability on server owners for failing to meet professional industry standards is maybe even unfair of Discord.
Secondly, it is unusual that the Discord moderation team appears to be working out of Zendesk or some similar ticket system rather than an actual report system built to handle TOS violations.
Perhaps this is currently cheaper for the company, but it means a lot of case information is potentially lost, and much more is put on the user to accurately report cases that most reporters are not experienced in documenting for the company to investigate.
GDPR does restrict data collection to some degree, but I worked at a European company whose report system still automatically collected "context" whenever a user filed a report, so we could investigate regardless of whether the user sent in proper links to the exact messages. I think this is not a privacy issue; it is more a matter of investing the time and money to build such a system.
Having the user build the case for you with documentation may also lead to a lot of "he said, she said" battles without unbiased context to fall back on.
Discord seems to be (intentionally or not?) "outsourcing" moderation to server owners and their moderation teams, and your current report system is the traditional "Safety" division, to which the worst situations are escalated by (what would normally be company-paid and trained) moderators.
I'm not sure this is the best user experience to give server owners and their communities. Sure, it saves money not to need as many dedicated company moderators, or to maintain a report system and its tools, but the bigger the group on Discord, the closer to a "professional" moderator you need to be to rein it all in.
-- I just saw a giant server I was in (maybe 7,000-10,000 users?) have its admin rage-quit due to the stress of handling so many toxic situations (such as users making threats of terrorism, suicidal or self-harming users, etc.) that they were unprepared to face.
I don't believe Discord offers any "training" material to server admins and their moderators, and I have seen my fellow co-workers (especially volunteer moderators; I was once a volunteer for a game as well) develop combat-like PTSD from their work handling what the UN classifies as "child rape images" (i.e., child pornography).
While I doubt the average server on Discord will get to that level, Discord offers very little support or even guidance to its communities in handling these problems (which are unfortunately quite real).
I would argue that server moderators/admins are very ill-equipped to deal with this in the first place (there's a lot that server moderation tools can't do), and Discord really should make reporting next to "brainless" so your users have less to worry about.
Lastly, it is also a risk to expect users around the world to know what to escalate to the police, and how and when to do so. That's the sort of thing Discord ought to be doing and investigating independently.
I assume Discord does do this for extreme safety cases, but I just got an email back from my report suggesting that I do it myself in severe cases of user endangerment (i.e., suicidal or self-harming users).
Obviously I do not have the resources to track down the user's location and report them to the authorities (even if I needed to), because I cannot run IP searches and I do not have the user's account records at my disposal.
Thankfully I know what to do, but that could be undue pressure to put on an average Joe user (and your customers).
If Discord will not make its own report infrastructure, then I request Discord at least release real training material to help server owners and moderators. The only material I've found thus far basically describes how your tools or server features work, and that hardly covers the complicated situations that inevitably arise.
I am not suggesting that Discord take away or diminish any rights of server owners or their moderators at all, just add a report button to supplement what already exists and empower them further. They're customers too, so it's good to make their lives easier.
Thanks for listening.