https://www.reddit.com/r/self/comments/1gouvit/youre_being_targeted_by_disinformation_networks/
created by walkandtalkk on 11/11/2024 at 15:31 UTC*
31903 upvotes, 179 top-level comments (showing 25)
(I wrote this post [1]in March and posted it on r/GenZ. However, a few people messaged me to say that the r/GenZ moderators took it down last week, though I'm not sure why. Given the flood of divisive, gender-war posts we've seen in the past five days, and several countries' demonstrated [2]use of gender-war propaganda to fuel political division, I felt it was important to repost this. This post was written for a U.S. audience, but the implications are increasingly global.)
1: https://www.reddit.com/r/GenZ/comments/1bfto4a/youre_being_targeted_by_disinformation_networks/
And you probably don't realize how well it's working on you.
This is a long post, but I wrote it because this problem is real, and it's much scarier than you think.
In September 2018, a video went viral[3] after being posted by In the Now, a social media news channel. It featured a feminist activist pouring bleach on a male subway passenger for manspreading. It got instant attention, with millions of views and wide social media outrage. Reddit users wrote that it had turned them against feminism.
There was one problem: The video was staged.[4] And In the Now, which publicized it, is a subsidiary of RT, formerly Russia Today, the Kremlin TV channel aimed at foreign, English-speaking audiences.
4: https://euvsdisinfo.eu/viral-manspreading-video-is-staged-kremlin-propaganda/
As an MIT study found in 2019, **Russia's online influence networks** reached 140 million Americans[5] **every month** -- the majority of U.S. social media users.
5: https://www.technologyreview.com/2021/09/16/1035851/facebook-troll-farms-report-us-2020-election/
In 2013, Yevgeny Prigozhin, a confidant of Vladimir Putin, founded the Internet Research Agency (the IRA) in St. Petersburg. It was the Russian government's first coordinated facility for disrupting U.S. society and politics through social media.
Here's what Prigozhin had to say[6] about the IRA's efforts to disrupt the 2022 election:
Gentlemen, we interfered, we interfere and we will interfere. Carefully, precisely, surgically and in our own way, as we know how. During our pinpoint operations, we will remove both kidneys and the liver at once.
In 2014, the IRA and other Russian networks began establishing fake U.S. activist groups on social media[7]. By 2015, hundreds[8] of English-speaking young Russians worked at the IRA. Their assignment was to use those false social-media accounts, especially on Facebook and Twitter -- but also on Reddit, Tumblr, 9gag[9], and other platforms -- **to aggressively spread conspiracy theories and mocking, ad hominem arguments** **that incite American users.**
8: https://www.rferl.org/a/russia-whistle-blowing-troll-gets-her-day-in-court/27047858.html
9: https://www.bbc.com/news/technology-43255285
In 2017, U.S. intelligence found that **Blacktivist**, a Facebook and Twitter group with more followers than the official Black Lives Matter movement, was operated by Russia[10]. Blacktivist regularly attacked America as racist and urged Black users to reject major candidates. On November 2, 2016, just before the 2016 election, Blacktivist's Twitter urged[11] Black Americans: "Choose peace and vote for Jill Stein. Trust me, it's not a wasted vote."
10: https://money.cnn.com/2017/09/28/media/blacktivist-russia-facebook-twitter/index.html
11: https://money.cnn.com/2017/09/28/media/blacktivist-russia-facebook-twitter/index.html
The brilliance of the Russian influence campaign is that it convinces Americans to attack each other, worsening misandry and misogyny, mutual racial hatred, and extreme antisemitism and Islamophobia. In short, it's not just an effort to boost the right wing; it's an effort to radicalize everybody.
Russia uses its trolling networks to **aggressively attack men.** According to MIT[12], in 2019, the most popular Black-oriented Facebook page was the charmingly named "**My Baby Daddy Aint Shit."** It regularly posted memes attacking Black men and government welfare workers. It served two purposes: make poor Black women hate men, and goad Black men into flame wars.
MIT found that My Baby Daddy is run by a large troll network[13] in Eastern Europe likely financed by Russia.
But Russian influence networks are also aggressively misogynistic and aggressively anti-LGBT.
On January 23, 2017, just after the first Women's March, the Internet Research Agency began a coordinated attack[14] on the movement, the New York Times found. Per the Times:
14: https://www.nytimes.com/2022/09/18/us/womens-march-russia-trump.html
More than 4,000 miles away, organizations linked to the Russian government had assigned teams to the Women’s March. At desks in bland offices in St. Petersburg, using models derived from advertising and public relations, copywriters were testing out social media messages critical of the Women’s March movement, adopting the personas of fictional Americans.
**They posted as Black women critical of white feminism, conservative women who felt excluded, and men who mocked participants as hairy-legged whiners.**
But the Russian PR teams realized that one attack worked better than the rest: They accused the movement's co-founder, Arab American Linda Sarsour, of being an antisemite. Over the next 18 months, at least 152 Russian accounts regularly attacked Sarsour. That may not seem like many accounts, but it worked: They drove the Women's March movement into disarray and eventually crippled the organization.
A former federal prosecutor who investigated the Russian disinformation effort summarized it like this[15]:
15: https://www.nytimes.com/2022/09/18/us/womens-march-russia-trump.html
**It wasn’t exclusively about Trump and Clinton anymore. It was deeper and more sinister and more diffuse in its focus on exploiting divisions within society on any number of different levels.**
As the New York Times reported[16] in 2022,
16: https://www.nytimes.com/2022/09/18/us/womens-march-russia-trump.html
There was a routine: Arriving for a shift, **[Russian disinformation] workers would scan news outlets on the ideological fringes, far left and far right, mining for extreme content that they could publish and amplify on the platforms, feeding extreme views into mainstream conversations.**
Last month, the New York Times reported[17] on a new disinformation campaign. **"Spamouflage"** is a Chinese effort to divide Americans by combining AI-generated content with real images of the United States to exacerbate political and social tensions. The goal appears to be to cause Americans to lose hope, by promoting exaggerated stories with fabricated photos about homeless violence and the risk of civil war.
As Ladislav Bittman, a former Czechoslovakian secret police operative, explained[18] about Soviet disinformation, **the strategy is not to invent something totally fake. Rather, it is to act like an evil doctor who expertly diagnoses the patient’s vulnerabilities and exploits them, “prolongs his illness and speeds him to an early grave instead of curing him.”**
18: https://www.nytimes.com/2022/09/18/us/womens-march-russia-trump.html
Russia now runs its most sophisticated online influence efforts through a network called **Fabrika**. Fabrika's operators have bragged[19] that social media platforms catch **only 1%** of their fake accounts across YouTube, Twitter, TikTok, Telegram, and other platforms.
But how effective are these efforts? By 2020, Facebook's most popular pages[20] for Christian and Black American content were run by Eastern European troll farms tied to the Kremlin. And Russia doesn't just target angry Boomers on Facebook. Russian trolls are enormously active on Twitter. And, even, on Reddit[21].
21: https://www.thedailybeast.com/russians-used-reddit-and-tumblr-to-troll-the-2016-election
The term "disinformation" undersells the problem, because much of Russia's social media activity is not trying to spread fake news. Instead, **the goal is to divide and conquer** by making Western audiences depressed and extreme.
Sometimes, through brigading and trolling. Other times, by posting hyper-negative or extremist posts or opinions about the U.S. and the West over and over, until readers assume that's how most people feel. And sometimes, by using trolls to disrupt threads that advance Western unity.
As the RAND think tank explained[22], **the Russian strategy is volume and repetition, from numerous accounts,** to overwhelm real social media users and create the appearance that everyone disagrees with, or even hates, them. And it's not just low-quality bots. Per RAND,
22: https://www.rand.org/pubs/perspectives/PE198.html
Russian propaganda is produced in incredibly large volumes and is broadcast or otherwise distributed via a large number of channels. ... **According to a former paid Russian Internet troll, the trolls are on duty 24 hours a day, in 12-hour shifts, and each has a daily quota of 135 posted comments of at least 200 characters.**
You are being targeted by a sophisticated PR campaign meant to make you more resentful, bitter, and depressed. It's not just disinformation; it's also real-life human writers and advanced bot networks working hard to shift the conversation to the most negative and divisive topics and opinions.
It's why some topics seem to go from non-issues to constant controversy and discussion, with no clear reason, across social media platforms. And a lot of those trolls are actual, "professional" writers whose job is to sound real.
Here are some thoughts:
Comment by Old_Smrgol at 11/11/2024 at 16:22 UTC
836 upvotes, 15 direct replies
And of course all this lines up with the way the social media algorithms work anyway; they give you more of whatever you click/comment on. Which tends to be bait: rage bait, outrage bait, strawman dunk bait.
That and cute animals. And thirst traps.
The obvious solution is for everyone to use all of these platforms a lot less. Including this one.
Comment by contradictoryyy at 11/11/2024 at 16:22 UTC
514 upvotes, 6 direct replies
I did an entire academic research paper on this and got so into the topic I accidentally ran 10 pages over, single spaced (after editing down as much as I could, my professor was extremely cool with the extra content after I sheepishly admitted I enjoyed my topic a little too much). The strategy of volume and repetition is called the Propaganda of Noise and was coined by Joseph Goebbels, the head of propaganda for the Nazis. It essentially holds that it doesn't matter what the truth is; it only matters what is repeated over and over again until it becomes the truth. It's how countries like North Korea have entire swathes of the population that believe the supreme leader is this God-like being. With bots and AI the effect is so much stronger today as well.
It’s fucking wild.
Comment by [deleted] at 11/11/2024 at 15:48 UTC
152 upvotes, 6 direct replies
There's a docu-series on Prime called Brainwashed. I highly recommend watching it.
Comment by SkyMarshal at 11/11/2024 at 17:07 UTC
191 upvotes, 9 direct replies
I saw this when you posted it in GenZ, it's excellent, one of the best explainers of how disinfo works that I've seen. It's insane it was removed from that sub. Maybe also consider re-/cross-posting it to more receptive communities like /r/disinfo and /r/activemeasures.
Comment by [deleted] at 11/11/2024 at 16:17 UTC
430 upvotes, 19 direct replies
Yes, I've convinced 1 person of this in my life, but it's very hard to convince people and most don't listen to me! Iran is also participating in these activities. North Korea recruits and trains citizens to do this from a young age.
Comment by Unable_Sleep_2078 at 11/11/2024 at 16:25 UTC
107 upvotes, 5 direct replies
I recently read a book called "The engineers of chaos" (*Les ingénieurs du chaos*, in French) that also has similar arguments. I don't know if it exists in other languages but it was an eye-opener on Big Data and how it's being used in politics. The fact that governments can freely access other countries' "media space" is a huge problem, in my opinion. But I don't know how this can be solved without killing the internet.
Comment by CurrentImpressive784 at 11/11/2024 at 17:05 UTC*
79 upvotes, 3 direct replies
This is excellent! I will take the opportunity to copy and paste here in the comment section a post that I wrote to inform other teachers. If any of you are interested in learning this stuff, please check out these resources or direct teachers to this comment.
For many of us, this feels like second nature. Something sound fishy? Google it. Someone making a bold claim? Is it true, and what might their intention be for lying? However, if you've paid any attention to... reality, Facebook AI slop, deepfake technology becoming more sophisticated, news media making opposing claims, and recommendation algorithms pushing the most inflammatory, highly interacted posts to the top, then you might **recognize the importance of navigating online spaces where people are actively trying to deceive you**. At its simplest, it's not being tricked by an AI image of the Pope in a Puffer Jacket[1] or not falling for a fake romance scam on a dating app[2]: developing a little bit of tech savviness and street smarts. As much as this might feel like common sense, the tools being used to convince people of lies or of acting against their own values and well-being are becoming increasingly sophisticated, and are succeeding at their goal at an immense scale. The skill of not always accepting things at face value is less intuitive than I would have thought (or hoped), so I personally am finding ways to have these discussions in my classroom (HS Comp Sci), and I think it is important enough not only on a personal level, but on a societal level, to where I want you to consider reading up on media literacy and potentially teaching some of this, if you are in a position where you can introduce it in your own classroom.
1: https://www.cbsnews.com/news/pope-francis-puffer-jacket-fake-photos-deepfake-power-peril-of-ai/
2: https://www.fbi.gov/how-we-can-help-you/scams-and-safety/common-frauds-and-scams/romance-scams
I know that this is preachy, possibly unnerving, and unfortunately inseparable from politics, but regardless of where you sit on the political spectrum, you can probably think of cases where you or someone you know was deceived by something that they found online or that was shared with them, and we need to help our students and ourselves adapt to a world where this will become increasingly common.
If you are interested in learning more, here are a handful of resources:
1. **NEWS LITERACY PROJECT**[3]: Organization created by Alan C. Miller, a Pulitzer Prize winning Journalist, with robust resources for educators including lesson plans, projects, and handy little things like bellringer prompts.
2. **CRASH COURSE MEDIA LITERACY**[4]: If you haven't come across Crash Course while working in education, I'd be impressed. Created by the brothers Hank and John Green, Crash Course has made freely available educational content for years, with many well-produced video series on a variety of subjects.
3. **CIVIC ONLINE REASONING**[5]: Non-profit created by the Stanford History Education Group, providing a curriculum, lessons, assessments, and videos that can be used to incorporate media literacy into classes.
4. **THE ORGANIZATION OF AMERICAN STATES' GUIDE TO MEDIA LITERACY AND CYBERSECURITY**[6]: Made by the OAS, an international organization made up of countries throughout the western hemisphere (so not *those* American states), in collaboration with Twitter (different time). This is a well produced 50+ page guide on media literacy, with more of a lean toward personal cybersecurity and technological literacy. A bit dated, especially with the advent of generative AI and what happened to Twitter not long after this was produced, but this guide has great production value and could serve as a handy resource for figuring out how to present and discuss these topics.
If you are looking for a particular place to start, I recommend teaching **Lateral Reading**, the skill of reading up on the organizations and authors behind content users may find online. A brief video on the subject can be found here[7], AND I have a week-long project that I developed to teach this in my class, that I am willing to share if anyone is interested.
4: https://www.youtube.com/watch?v=sPwJ0obJya0&list=PL8dPuuaLjXtM6jSpzb5gMNsx9kdmqBfmY
5: https://cor.inquirygroup.org/
6: https://www.oas.org/en/sms/cicte/docs/Media-Literacy-and-Digital-Security.pdf
7: https://youtu.be/SHNprb2hgzU?si=0XXkOgeYXu8fMRsi
Comment by Rex_felis at 11/11/2024 at 17:07 UTC
82 upvotes, 2 direct replies
Yeah, there's no way people are this divided by default. Yes, we've certainly got our own issues, but this internet shit isn't as indicative of reality as it seems. The media being pumped is turbocharging these chambers of hate and vitriol. Unfortunately it's appealing to some for various reasons, and it gets tons of engagement, so it's almost irresistible in social media spaces.
The loneliness epidemic is only intensified by this kind of language and troll campaigns. I've been feeling for the last few years that we (Americans, and English speaking net users) are being intentionally driven apart. The only people I meet in real life that parrot these trends are chronically online.
I'm very wary of the content that's being pushed ESPECIALLY in this subreddit. Something fucky is going on. I'm guessing some players are hoping the liberal wing will do something like Jan 6th again too, or simply coalesce into the Trump presidency without any pushback due to the depression and apathy being pumped out.
Things really aren't this divisive in reality for most people. We're being led to think that ideological differences are grounds for conflict escalation. These next few months to years will be interesting. I'm debating getting off social media entirely, including Reddit. I did it from 2018-2021 with minimal usage at various times.
Comment by snackonmywhack at 11/11/2024 at 16:58 UTC
25 upvotes, 2 direct replies
including reddit
Comment by MaximDecimus at 11/11/2024 at 18:52 UTC
25 upvotes, 0 direct replies
The GenZ subreddit is absolutely being blasted with disinformation. Everything is just about amplifying anger and arguing.
Comment by [deleted] at 11/11/2024 at 17:05 UTC
46 upvotes, 2 direct replies
[removed]
Comment by mc_mcfadden at 11/11/2024 at 19:25 UTC
75 upvotes, 8 direct replies
Robert Mueller in ~2017 told Congress that Russia is actively attacking us with misinformation, and nobody did a thing about it or remembers.
Comment by supacrusha at 11/11/2024 at 19:51 UTC
20 upvotes, 0 direct replies
I'm convinced this is what happened to the libertarian meme subreddit. It went from being a niche, pro-liberty subreddit to being a primarily Trump-spam, civil-war-spam, homophobic and racist subreddit over the course of the election cycle. My views haven't changed; theirs have, and watching it happen I felt like I was going insane, because all of a sudden I was getting shit on for the same takes I've always had. I am as much of a voluntarist as ever; definitely not the same with them.
Comment by p3n1sf4ce at 11/11/2024 at 22:53 UTC
20 upvotes, 0 direct replies
I feel like this should be pinned to the front page of Reddit.
Comment by Jesus_Faction at 11/11/2024 at 17:14 UTC
40 upvotes, 1 direct replies
<you are not immune to propaganda>
Comment by TemperateStone at 11/11/2024 at 22:00 UTC
13 upvotes, 1 direct replies
I've been trying to tell people this for years, but people either mock me for it or don't want to hear it, because they don't want to believe they're so manipulated.
Comment by Ariston_Sparta at 11/11/2024 at 16:11 UTC
37 upvotes, 1 direct replies
This is absolutely spot on!
This is all 5th generation warfare... Narrative manipulation.
Your mind is now the battlefield of nations.
Comment by King_LaQueefah at 11/11/2024 at 16:44 UTC
41 upvotes, 0 direct replies
This is the best sub for this at the moment. I feel that this sub is being flooded with posts that do exactly this.
Comment by NotEqualInSQL at 11/11/2024 at 21:00 UTC
12 upvotes, 1 direct replies
I have been saying this forever. Some people on the left think it is only the right that is being targeted and that they can see it plain as day, but people need to really start questioning whether they themselves might be exposed to or victims of it too. Just the inverse of the right's.
Comment by suninabox at 11/11/2024 at 18:49 UTC*
25 upvotes, 2 direct replies
stupendous merciful spark steep profit wise knee selective encourage trees
Comment by Niztoay at 11/11/2024 at 19:25 UTC
24 upvotes, 1 direct replies
Somebody who wants to do something important: find a way to communicate this information visually. Infographics, charts, graphs, gifs, screenshots. Whatever you've got to do to make this accessible for people who don't have the time or ability to read a couple thousand words on the Russian sabotage of our social media spaces.
I get this stuff is important, and I'm not saying the OP needs to do this, but if there's somebody out there who thinks they got the skill to break this down into easily digestible chunks you'll be doing deeply important work for humanity in this fight against totalitarian power.
💜 💜
Comment by [deleted] at 11/11/2024 at 18:51 UTC
12 upvotes, 0 direct replies
Thank you for posting this
Comment by 33ITM420 at 11/11/2024 at 16:38 UTC
25 upvotes, 1 direct replies
reddit is among the worst, tbh
Comment by Daekar3 at 11/11/2024 at 17:35 UTC
8 upvotes, 0 direct replies
SO MANY PEOPLE need to see this! The quote from WarGames is 100% spot on, and will only become more necessary wisdom as AI really takes off.
Comment by [deleted] at 12/11/2024 at 09:57 UTC
9 upvotes, 0 direct replies
I ran and moderated several gaming servers between 2005 and about 2018 when life got busy. The most effective thing I ever did was place a flat IP block on anything registered in Russia, or anything which showed a traceroute as having come through China.
When I implemented that, the toxicity and trolling and awful commentary both in and out of game dropped from near-continuous to virtually zero *literally* at the click of a button. There is something deeply, *deeply* sick in Russian culture.
The West should have severed the Internet hardlines into Russia a decade ago when they first invaded Crimea and made their plans known, and there should be an IP block placed at the backbone hubs in London, Berlin, and the West Coast which blocks anything routed through eastern China. It won't be 100%, but even a 95% block on Russian internet traffic would have a dramatic and immediate positive effect on Western society.
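(Editorially, the flat IP block this commenter describes amounts to a CIDR-membership filter: check each connecting address against a list of country-level network ranges. The sketch below is a minimal illustration; the CIDR ranges are documentation-reserved placeholders, not real Russian allocations, and a real deployment would load ranges from a GeoIP database and enforce them at the firewall rather than in application code.)

```python
import ipaddress

# Illustrative blocklist. These are RFC 5737 documentation ranges used as
# placeholders; a real server would load country-level allocations from a
# GeoIP export and push them into the firewall (e.g. via ipset).
BLOCKED_CIDRS = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/24"),
]

def is_blocked(addr: str) -> bool:
    """Return True if the address falls inside any blocked CIDR range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in BLOCKED_CIDRS)

print(is_blocked("203.0.113.42"))  # inside the first placeholder range -> True
print(is_blocked("192.0.2.1"))     # outside every listed range -> False
```

The check is linear in the number of ranges, which is fine for a few thousand CIDRs; at backbone scale, the same idea is implemented with radix tries or kernel-level ipsets.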