How Disinformation About the Minnesota Shooting Spread Like Wildfire on X

Sep 3, 2025 11:28 AM

Under Elon Musk’s leadership, X has become the perfect platform to supercharge the spread of dangerous disinformation during breaking news events.

People leave flowers at a memorial after the Annunciation Catholic School shooting in Minneapolis on August 27, 2025. Photo by Christopher Mark Juhn/Anadolu via Getty Images

Minutes after the perpetrator of the shooting at Annunciation Catholic Church in Minneapolis last week was identified, YouTube appeared to delete several videos she had shared that morning.

But not before the videos were downloaded and reshared in full on X.

Within hours, the platform was flooded with wild claims about the shooter and her motivation, with everyone from Elon Musk, the site’s owner, to the head of the FBI and left-wing activists posting half-baked allegations blaming anti-Christian hate, transgender genocide, and white supremacy. Many of the posts racked up millions of views per X’s public metrics.

While other social media platforms were also used to share unfounded claims about the shooter’s motivations, X, under Musk, has become the perfect platform to supercharge the spread of dangerous disinformation during breaking news events. The entire team tasked with tackling disinformation on the platform was culled years ago, and X’s biggest users now say the platform incentivizes them to share out-of-context clickbait content over verified facts.

“X’s feed algorithm is fully designed to maximize engagement, even negative engagement,” says Laura Edelson, an assistant professor of computer science at Northeastern University who specializes in tracking disinformation online. “In these conditions, conspiratorial, extreme content tends to perform very well. And when you couple that with X’s significantly weakened content rules, this is exactly what we would expect to result.”

X did not respond to WIRED’s request for comment.

An 11-minute video from the shooter, which was shared by dozens of X accounts in the minutes after her identity was revealed, shows a wide array of guns and ammunition. The weapons were adorned with over 120 symbols, words, and phrases referencing dozens of hateful ideologies, mass shooters, memes, and coded language used by the nihilistic online communities the shooter was a member of.

As extremism researchers warned people against jumping to quick conclusions given the huge swathe of digital, written, and video content that needed to be analyzed, X users took very little notice.

The same day, screenshots from the video were used by everyone from elected lawmakers and senior government officials to law enforcement personnel, activists, podcasters, and conspiracy theorists on X to push particular narratives about what was to blame for the latest mass shooting.

In one of the primary narratives erroneously pushed immediately after the shooting, conservative influencers and politicians claimed that the perpetrator’s gender identity was at fault. Information about the shooter, who identified as transgender and changed her name to Robin Westman when she was 17 years old, spread like wildfire on X, pushed by a long list of right-wing figures, including Georgia representative Marjorie Taylor Greene, right-wing podcaster Benny Johnson, and Musk himself. Even X’s own AI-powered chatbot Grok refuted the idea that transgender people disproportionately carry out mass shootings.

Many X users, like right-wing commentator Nick Sortor, claimed the attack was motivated by hatred of God, citing “all the anti-Christian and anti-God writings” on the shooter’s guns. FBI director Kash Patel seemed to boost these claims by posting that the shooting was being investigated as a “hate crime targeting Catholics.” Conspiracy theorist Laura Loomer alleged that the shooter was “radicalized by leftism and Islam.” Others cited anti-Israel phrases written on the weapons as proof the shooting was antisemitic.

On the left, most users focused on the fact that the shooter praised other mass shooters and used racist language. Left-wing podcaster Benjamin Dixon, who describes himself as “Pastor of Antifa,” described the shooter on X as “a right wing incel aggrieved white boy.”

The platform itself even helped boost some of these claims in the summaries it presented to users about the shooting, which highlighted the “anti-Trump messages” written on some of the shooter’s weapons without mentioning all the other words and phrases.

The reality, according to extremism experts who are still trawling through the shooter’s writings and digital footprint, is that there simply appears to have been no overarching ideology motivating the shooter. Instead, some have determined that the shooter was likely part of a growing group of nihilistic violent extremists whose sole motivation is the violence itself.

“They clearly state several times they are not doing this for any ideology or cause, they are simply doing this for the sake of violence, for their desire for notoriety, to know what it feels like to be one of their idols, to cause chaos and see the fear in the eyes of their victims,” Marc-André Argentino, an extremism researcher, wrote on Bluesky in reference to the diaries the shooter posted to YouTube hours before the attack.

Argentino also warned that nihilistic violent extremists can be performative, their messages and writings designed in part to trick people into boosting specific phrases out of context.

“These kind of attackers put on a performance, there is a script they follow and part of that script is trolling journalists or leaving ironic items in the hope someone in the media will bring on the Streisand effect to further spread their attack and be immortalized,” says Argentino.

But on X, where posting first and often is rewarded more than posting accurate and verified information, such warnings repeatedly fall on deaf ears.

“Context collapse is a rhetorical device that we see a lot in breaking news events—twisting real quotes or events to mislead or presenting it without the context in which it was said,” Nina Jankowicz, the former Biden administration disinformation czar who is now CEO of the American Sunlight Project, tells WIRED. “This is particularly prevalent on microblogging platforms like X, where users are disincentivized to read more than 280 characters or past the headline of a news item.”

Since Musk’s takeover of Twitter in late 2022, almost every move he’s made with regard to content moderation has made it more difficult to find accurate and timely information on X.

Shortly after he bought the company, Musk removed the team responsible for tackling disinformation and replaced it with the crowdsourced Community Notes system, which made the misinformation situation worse, not better. More recently, users have relied on Grok to fact-check posts, which also hasn’t helped. This lack of oversight, combined with a new verification system that rewards posts with the highest engagement over verified sources, has resulted in a toxic stew of disinformation flooding the platform during major global breaking news events. This was clearly seen at the outbreak of the Israel-Hamas war and more recently during the protests in LA.

“There are no guardrails on X anymore, and it’s become overrun by disinformation accounts and grifters,” says Mike Rothschild, an author who writes about conspiracy theories and extremists. “There are certain narratives about mass shootings that will instantly find homes on X, and nothing holds them back from spreading.”
