Efficiently Report AutoMod-Detected Messages
Let's make it super easy to report flagged messages from our log channel by adding a "Report to Trust and Safety" button in the Actions section of AutoMod.
-
Absolutely necessary IMO. Most of the things that are severe enough to report to Discord T&S are also the things we want as few people to be exposed to as possible. The only current options are:
1. Have it deleted ASAP by Auto-moderation (whether that be community-run moderation bots or Automod), but then have no way to report it, or
2. Have no form of Auto-moderation so that these messages can be reported before deletion, which requires server moderators to be on high alert in all channels at all times and increases the number of users exposed to harmful content.
Allow us to report messages flagged by Auto-moderation via the log it leaves in its designated channel.
-
I believe this defeats the purpose of automod a little: the message is never seen by anyone except the sender and moderators, so, provided they don't try to defeat the automod filter, nobody is exposed to the bad content. If they defeat the filter, you can report that message.
-
I don't agree that it defeats the purpose; Automod is meant to prevent users from being exposed to harmful content, while reporting the message is meant to flag problematic users for Discord to review and remove from their platform at their discretion. These two purposes are not at odds with each other; they directly complement each other. I do not understand why you think reporting a user to Discord would in any way diminish the purpose Automod serves of preventing people from being exposed to harmful content. If you don't want to report a message, you can simply choose not to; you will always have that option. But right now, there is no option to report when it's necessary.
There are certain behaviours that should not be tolerated on the platform, no matter how many people are exposed to them or whether the attempt is stopped. A user who posts (or even merely attempts to post) something like gore, NSFW content depicting minors, or threats of physical harm to another user should absolutely face consequences and be removed from the platform. A ban from a single server is not even close to enough. But Discord will do absolutely nothing if you do not directly report the message, which you cannot do if Automod blocks it.
I help to manage one of the larger public communities on Discord, and our userbase is on the younger side. To be clear, that doesn't mean minors who shouldn't be on Discord; it means young people in the 13-16 age range. Every single day my moderators and I are exposed to some of the most vile things people can find or say on the internet. I could easily share screenshots showing that every day, without fail, we're faced with racially motivated hate speech, inappropriate adult content (reminder: in a server populated with young teens), and other harmful behaviours. And you know why so many people do it? Because there's no consequence. We cannot report them. We have to prioritize the well-being of our users over leaving a message up so that it can be reported to Discord (which would require exposing literally hundreds of thousands of people to it for an extended period of time). It is extremely counter-intuitive.
I am not satisfied with just removing these individuals from my servers, knowing they will simply join another server and do the same thing to other people there.
I see absolutely no reason to be against the ability to report users for messages flagged via Automod.
-
I fully agree with Jake.
-
As it is now, reporting the automod message seems to report the user's originally sent message.
-
Reporting the automod message isn't possible on mobile, so if that's the solution, please enable it on mobile platforms.
Better yet, just add a “report message” button to the actions so that the UX is actually clear.