Comment by Axel_Sig on 05/04/2019 at 07:29 UTC

64 upvotes, 1 direct replies (showing 1)

View submission: SEQUENCE - FINAL STITCH (THEATRICAL)

I agree with all you said, and this question is only tangentially related, but do you feel this small example gives a great showing of just how easy it is to manipulate the Reddit algorithm, whether intentionally or unintentionally? How easy is it for a small group to mass-introduce and control the narrative, not just here but everywhere on Reddit?

Replies

Comment by youngluck on 05/04/2019 at 07:46 UTC

61 upvotes, 2 direct replies

In a small pocket of time, yes. The community began to correct itself, but only after we'd nearly burned through the resources we had to keep it running. I believe that if it had been left to run longer, that manipulation would have been overpowered by the voices and voting power of the collective; hell, even the narrators started to correct themselves. Posts popped up with calls to form alliances against the botnet, but we were all out of steam to see if they'd play out.

To answer your question, though: I do think that in a narrow window of time, and under the condition that only one idea can be presented at any given time, the group that organizes the best (including through unfair vote brigading and the exploitation of other loopholes) will be able to control the narrative, not just here but anywhere on Reddit, and in society in general. Even in places that do allow for correction over time, it's still potentially dangerous. In elections, for instance, correcting an idea means nothing after Election Day.

The smartest people here are working on this problem and have been paying attention to the potential learnings of Sequence. I'm going to get a good night's sleep, and then the team and I will compile our own analysis of what we've learned, not just from April Fools, but from the ARG that preceded it. It's a lot to process.