87 upvotes, 4 direct replies (showing 4)
View submission: Additional Insight into Secondary Infektion on Reddit
There are groups, like the t-shirt spammers, with seemingly endless accounts that get around spam detection, yet a state actor only uses a small handful of accounts that get caught immediately?
Why are they so seemingly inept at manipulating Reddit when they're so successful on other platforms?
Comment by worstnerd at 08/04/2020 at 22:56 UTC
89 upvotes, 2 direct replies
Thanks for the question. I think there's a bit of a misconception here. Regarding the t-shirt spammers, we actually do catch many of them, and do so immediately. Those operations are pretty used to changing up their tactics in order to get around the blocks we put in place. The good news is we're also pretty good at detecting these changes and tend to catch on fairly quickly. So some may squeak through, but rarely for long.
With respect to their "ineptitude" on Reddit vs other platforms, there are a few components to that.

First, our moderators and users have a deep understanding of their communities, and it is hard to get something past you all (thank you!).

Second, this campaign didn't really show any signs of attempting to amplify the messages (namely using additional accounts to upvote or engage with the content in any way to make it seem organic...admittedly they were removed from the subreddits almost immediately, so there wasn't much of a chance).

Finally, Reddit is not a platform built to amplify all content; we are built for discussion. You all decide what content should be seen with your up and down votes. If something doesn't fit for your community, mods can remove it and/or users can downvote it. This is in contrast to the model on other platforms, which are constantly searching for eyes for every piece of content.
Comment by [deleted] at 08/04/2020 at 22:50 UTC
8 upvotes, 3 direct replies
[removed]
Comment by get_it_together1 at 10/04/2020 at 14:15 UTC
3 upvotes, 0 direct replies
As others said, we don't know the scale of the state actor operation, and it seems likely that much of what they do doesn't get caught. They likely use a combination of automated scripts and human operators, so we're probably only catching part of the operation.
Comment by birds_are_singing at 09/04/2020 at 00:32 UTC
2 upvotes, 0 direct replies
Only the inept ones get *caught* is the likely answer. Mods have very little additional information compared to users, so there's absolutely no way to know how many influence campaigns go under the radar. Without a radical rethinking of how accounts are provisioned, that won't change.