39 upvotes, 3 direct replies (showing 3)
View submission: Changelog: Post insights, relevance experiments, and mod notes
Frankly, I am totally against communities being able to see AutoMod rules. If they know how to work around the filters, they can say whatever they want, and that can become a problem.
Comment by ddoeth at 22/03/2022 at 08:10 UTC
13 upvotes, 0 direct replies
Exactly this. We put a lot of work into banning particular spellings and phrasings of hate speech; if that were made public, users could work around those rules and they would become useless.
Comment by MinimumArmadillo2394 at 22/03/2022 at 01:59 UTC
4 upvotes, 0 direct replies
Tbh, they likely already know what is and isn't acceptable, and no amount of tweaking can be proactive enough. From my perspective, if bad people want to do bad things, you can make it harder for them, but it's still a cat-and-mouse game, and AutoMod is notoriously bad at handling any sort of "work around the filters" situation. You're better off building your own bot for that. Reddit's AutoMod can't even tell you who made the post in a modmail because it's hot garbage.
My main point was that it's a good idea to show users *exactly* what's required to post somewhere, and in the past two or so years it's become more and more common for subreddits to use karma/account-age filters.
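(For context: a karma/account-age gate is normally a few lines of AutoMod YAML, but since the comment above suggests building your own bot instead, here is a minimal sketch of the same check as a standalone bot using PRAW. The subreddit name, credentials, and thresholds are placeholders, not anything from the changelog post; the point is that the rule, and the reason for a removal, can be stated openly to the user.)

```python
# Hypothetical example: a standalone moderation bot enforcing a publicly
# documented karma/account-age gate, built with PRAW (the Python Reddit API
# wrapper). Credentials, subreddit name, and thresholds are placeholders.
import time

import praw

MIN_ACCOUNT_AGE_DAYS = 7   # placeholder threshold
MIN_COMBINED_KARMA = 50    # placeholder threshold

reddit = praw.Reddit(
    client_id="CLIENT_ID",
    client_secret="CLIENT_SECRET",
    username="BOT_USERNAME",
    password="BOT_PASSWORD",
    user_agent="example-gate-bot/0.1",
)

for submission in reddit.subreddit("example").stream.submissions(skip_existing=True):
    author = submission.author
    if author is None:  # deleted account; nothing to check
        continue

    age_days = (time.time() - author.created_utc) / 86400
    karma = author.comment_karma + author.link_karma

    if age_days < MIN_ACCOUNT_AGE_DAYS or karma < MIN_COMBINED_KARMA:
        submission.mod.remove()
        # Unlike a silent filter, tell the user exactly which requirement
        # they failed so the removal isn't an effective shadowban.
        notice = submission.reply(
            f"Your post was removed: this subreddit requires accounts at least "
            f"{MIN_ACCOUNT_AGE_DAYS} days old with {MIN_COMBINED_KARMA}+ combined karma."
        )
        notice.mod.distinguish(how="yes", sticky=True)
```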
Some BIG subreddits auto-remove any comment that pings another user or even says the word "moderator." This is the kind of thing that needs to be known to users. Subreddits that don't want you talking about who runs them are bound to be power trips.
If a user is willing to go to the effort of making 10+ accounts to figure out a particular subreddit's AutoMod setup in order to spam, that should 100% be something Reddit can discover and crack down on via IPs, provided the accounts are reported properly. In essence, if the moderation "staff" is doing their job, these sorts of accounts shouldn't be an issue. If they aren't doing their job, it becomes an AutoMod mess where people are confused and don't know why their posts are instantly removed and never seen. It's the lesser of two evils.
Spammers are gonna spam anyway, regardless of what the filters are, AND spammers are more easily tracked down via IPs and such. Meanwhile, new users tend to give up on a social media platform if they don't feel engaged with, and having everything you say auto-removed is a bad thing in that respect.
At the LEAST, Reddit should force AutoMod to notify users of actions taken against their account, and AutoMod's comments/posts should be un-deletable by moderators.
Comment by iSlideInto1st at 22/03/2022 at 01:36 UTC
-12 upvotes, 1 direct replies
Totally. Why should people be able to see why their post was (effectively) shadowbanned (if they even notice, since no reason is given)? Being told what they aren't allowed to say would be absolute anarchy!
Definite problem.