[Suggestion] Fix NSFW/explicit image detection false positives, add image moderator approval
A lot of completely safe-for-work artwork, photos, and images are getting wrongly flagged as NSFW. I've complained about this issue before, and I appreciate that Discord now actually tells you why things fail to upload (the NSFW bot filter, whereas before they just failed with no explanation whatsoever), but the false positives are still an extremely annoying and unfixable issue. I run a community server, so there's nothing I can do to disable this even if I wanted to, since it's a SFW server, and it's getting in the way of server activities.
I have a feature request that I think would be a better workaround than just sending us off to Discord Support to report it and try to "retrain" the AI. What if anything the bots flagged as NSFW got passed on to server moderators to approve or reject, or even explicitly mark as NSFW vs. SFW? As it stands, server admins essentially have no choice but to bow to the bot's decisions, whether they're right or not. Approval would at least give us a say in the matter. And letting human moderators decide whether something counts as NSFW could even help retrain the bot, though I'd understand why you wouldn't want random people doing that, captcha-style. Just some food for thought.
@... stop it
I was directed to post here by support, so I'm reusing this thread. This was labeled NSFW, but it's just me holding some toads, lol. This one is closest to my issue: mods could have manually approved this image instead of me having to think about photoshopping my hands blue or something.
Oh no! The children must not be allowed to see this incredibly explicit photo of shoes.