WIRED Roundup: The Right Embraces Cancel Culture

Sep 22, 2025 1:14 PM

On this episode of Uncanny Valley, we discuss OpenAI’s new teen safety features, the right’s retaliation against critics of the late Charlie Kirk, and more of the week’s biggest stories.
Charlie Kirk (R) shaking hands with US President Donald Trump as he speaks on stage at America Fest 2024 in Phoenix, Arizona. Photo-Illustration: WIRED Staff; Josh Edelson; Getty Images

In today’s episode, our host Zoë Schiffer is joined by WIRED’s senior culture editor Manisha Krishnan to run through five of the best stories we published this week—from OpenAI implementing teen safety features to how human design is the new astrology. Zoë and Manisha also discuss the reverberating reactions to Charlie Kirk’s death and why the work of many creators, from comic book artists to late-night show hosts, is getting canceled.

Mentioned in this episode:
Cancel Culture Comes for Artists Who Posted About Charlie Kirk’s Death by Manisha Krishnan
OpenAI’s Teen Safety Features Will Walk a Thin Line by Kylie Robison
US Tech Giants Race to Spend Billions in UK AI Push by Natasha Bernal
How China’s Propaganda and Surveillance Systems Really Operate by Zeyi Yang and Louise Matsakis
Human Design Is Blowing Up. Following It Might Make You Leave Your Spouse by Mattha Busby

You can follow Zoë Schiffer on Bluesky at @zoeschiffer and Manisha Krishnan on Bluesky at @manishakrishnan. Write to us at uncannyvalley@wired.com.

How to Listen

You can always listen to this week’s podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here’s how:

If you’re on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts and search for “uncanny valley.” We’re on Spotify too.

Transcript

Note: This is an automated transcript, which may contain errors.

Zoë Schiffer: Welcome to WIRED’s Uncanny Valley. I’m WIRED’s director of business and industry, Zoë Schiffer. Today on the show, we’re bringing you five stories that you need to know about this week, including reactions to the death of right-wing media figure, Charlie Kirk, and the content cancellations that have happened as a result of that. I’m joined today by our senior culture editor, Manisha Krishnan. Manisha, welcome to Uncanny Valley.

Manisha Krishnan: Hi, Zoë.

Zoë Schiffer: So Manisha, our first story this week is about OpenAI making this announcement about new teen safety features for ChatGPT. This is part of an ongoing effort to respond to concerns about how minors engage with chatbots. We reported this week that the company is building out an age-prediction system that it says can identify if someone is under the age of 18 and reroute them to an age-appropriate system. A potential scenario that they outlined in the blog post about this was if the system detects that a user is considering suicide or self-harm, it’ll contact the user’s parents, and if the parents are unreachable, it might contact the authorities. This comes at a moment when we’ve seen tons and tons of headlines about people dying by suicide or committing acts of violence after engaging in pretty lengthy conversations with AI chatbots. So I’m curious what you make of it.

Manisha Krishnan: So I think it’s kind of important to contextualize this, because it’s happening at a time when we’re seeing age verification being applied to a range of industries, from porn to video games, and companies are going about it in different ways. And while there obviously is a lot to be concerned about with young people having unfettered access to ChatGPT, I think these efforts always raise a lot of questions: How is age going to be verified? Where is that data going to be stored? And even with the idea of something inappropriate flagging someone’s parents or the authorities, what counts as inappropriate?

Zoë Schiffer: Right.

Manisha Krishnan: I can definitely see why suicide would be something that you’d want to flag, but there are other situations where the authorities or someone’s parents may not be helpful. And I guess I’m thinking more about young people’s exploration of their sexual orientation, just as one example of a culture war topic that’s big right now. So maybe I’m getting ahead of myself, but those are some of the questions that come to mind.

Zoë Schiffer: That was immediately what came to mind for me, especially because you specifically report on the adult content industry, and I feel like this issue always comes up there: What’s the trade-off between privacy and keeping people safe? But when it comes to young people, it really feels like the privacy conversation goes out the window and regulators are much more inclined to be like, safety comes first, and we might not necessarily care if we’re degrading privacy in some kind of fundamental way.

Manisha Krishnan: Yeah, and with the whole porn conversation too, Pornhub obviously has a litany of controversies, but at this point, because they’ve been in trouble so much, they’ve buttoned up a lot of their regulations. And now, in response to the age verification stuff, they’ve removed themselves. So you’re also like, “Am I opening up a vacuum for other, maybe more nefarious or irresponsible, sites?” There’s always something that will crop up in place of something else.

Zoë Schiffer: Well, we’ll wait to see how OpenAI continues to handle this. Staying on the topic of AI for one moment, our next story is about how US tech giants are investing billions of dollars in AI infrastructure in the UK. Our colleague, Natasha Bernal, reported that Microsoft and NVIDIA announced that they’ll be investing up to $45 billion in the form of data centers and AI research. This comes on the heels of another joint venture from NVIDIA, Nscale and OpenAI that’s also aimed at boosting AI infrastructure in the country. Earlier this week, OpenAI CEO, Sam Altman, and NVIDIA CEO, Jensen Huang, traveled with President Trump to the UK during his state visit and then we got a whole bunch of announcements about all of these billion-dollar investments.

Manisha Krishnan: Honestly, one of my first reactions was, is this just another form of American tech imperialism spreading? How do the Brits feel about this, and what is the underlying motivation for these tech companies to make all these announcements? Do they actually want to invest that much in the UK? That would make sense, but is it also to appease Trump?

Zoë Schiffer: There is a lot of controversy, which you pointed to. London is Europe’s largest data center market and has already been really impacted by constraints on power availability and a lack of suitable land. Data centers just require huge amounts of energy, and there’s a lot of opposition from a lot of different groups: environmental advocates, local residents, you name it. It’s one thing to say, this is so great, we’re going to get all of this infrastructure and potentially jobs, but it’s another when you live close to a data center and you’re literally impacted because your power bill is spiking, or you don’t have enough clean water, or they’re just really, really freaking loud.

Manisha Krishnan: Yeah, and we’ve already seen how some of these data centers are disproportionately impacting more marginalized communities. It does seem like kind of a depressing time if you care about climate.

Zoë Schiffer: Moving on. The next story is about China. Our colleagues Zeyi Yang and Louise Matsakis recently reported on a leak of internal documents from a Chinese company that show how digital censorship tools are being marketed and exported globally. The company they focused on in the story sells what amounts to a commercialized Great Firewall to at least four other countries. The groundbreaking leak shows in great detail the capabilities this company has to monitor and hack internet traffic. Researchers who examined the files described it as digital authoritarianism as a service. These companies collaborate with academic institutions on research and development, they personalize their tools to fit a client’s needs, and they chase lucrative government contracts, which sounds honestly very familiar.

Manisha Krishnan: This article was great because it really was sort of demystifying and debunking this notion that so many people have of China’s Great Firewall being this single entity. Zeyi and Louise do point out that these companies have far less transparency, of course, but otherwise they function very similarly to Western tech companies. And I think that’s something we have to grapple with more: how we view China, and how they do things, as being so different from how things are done here. But I think with this administration in particular, we are noticing that that’s no longer always the case.

Zoë Schiffer: One more story before we break. It’s about human design, which is a new astrology-like system that uses birthdates to divide people up into personality types. And it’s been completely blowing up online, perhaps because, unsurprisingly, some people are taking it extremely seriously and designing their lives around it. Tell me about this story, because you were very involved in it being published on WIRED.

Manisha Krishnan: I did commission this story, because I think human design is going to blow up. Human design is like astrology, but it also combines Kabbalah and a bunch of other sort of spiritual systems, and essentially people are divided into five categories: manifestors, generators, manifesting generators, reflectors, and projectors. Some followers use it lightly, like how people use astrology, but then there is a really intense contingent that follows it rigidly. A reporter talked to a woman who broke up with her husband after a single reading. Even on Love Is Blind, this guy kept using the catchphrase, follow your spleen-

Zoë Schiffer: Right.

Manisha Krishnan: … which refers to how some human design followers believe that your spleen is a better guide than your gut. And so he ended up breaking it off with one of the women he was dating on Love Is Blind because, he said, his spleen was silent.

Zoë Schiffer: I was locked in for the first part of this. And then we got to the spleen thing. What does that mean? Is it literally a gut sense? What are they tapping into?

Manisha Krishnan: Honestly, it is really confusing, because they have all of these rules around deconditioning yourself from, essentially, forces within you that don’t jibe with who you really are, but the way that you decondition yourself seems to be, in some cases, very rigid. I saw one person on Reddit posting about how they only eat polenta, because that’s the only ingredient that will allow them to become their truest self according to human design.

Zoë Schiffer: I do want to know, do you know what I am?

Manisha Krishnan: Yes.

Zoë Schiffer: Because you asked me my birthday yesterday, so I’m on the edge of my seat.

Manisha Krishnan: I did. I plugged it in. And you are a generator, which is an energy type with a defined sacral center, characterized by a consistent, self-sustaining life force-

Zoë Schiffer: Wow.

Manisha Krishnan: … that provides stamina and the capacity to do fulfilling work.

Zoë Schiffer: Did WIRED write this?

Manisha Krishnan: I know, I was just thinking that.

Zoë Schiffer: Well, great. I love that for myself. Coming up after the break, we’ll dive into the backlash that some people, from graphic designers to high-profile entertainers, have received after commenting on Charlie Kirk’s death. Welcome back to Uncanny Valley. I’m Zoë Schiffer. I’m joined today by senior culture editor Manisha Krishnan. Manisha, the story that keeps on reverberating this week is that of Charlie Kirk’s death. Our colleague, Jake Lahut, has been covering how the Trump administration and the general right-wing base have maintained their position that Kirk’s death was a result of leftist ideology and maybe even a coordinated attack. Both of these claims have been debunked, but it’s done little to change people’s minds. And this week, you reported that different artists have been facing professional retaliation for voicing their opinions on Kirk. What did you find in your reporting?

Manisha Krishnan: There’s been a bunch of people from different industries who have lost their jobs over posting unsympathetically about Charlie Kirk’s death, from journalists to video game developers. But one that stuck out in my mind was this trans writer I interviewed who was doing a comic series for DC Comics. She referred to Charlie Kirk as a Nazi bitch after he died, and she was suspended on Bluesky for a week, DC fired her, and they’ve canceled the series. And that really stuck out to me because, as she has said, Charlie Kirk was staunchly anti-trans. I mean, he was anti a lot of things that weren’t a straight Christian white male, and he was pretty loud and proud about those views. And so I think it really does stick out to me because it’s almost like people are kind of expected to perform grief, maybe for someone who espoused hateful views towards the community that they’re part of. But it almost feels like this really, really hard line that a lot of corporations have taken. Making someone apologize is one thing, but literally disappearing art, canceling an entire series, or South Park deciding not to re-air an episode about Charlie Kirk that he himself loved. He said he really liked it. I just think it goes a little bit beyond just reprimanding people.

Zoë Schiffer: I’ve been looking at a lot of the commentary from the right, how they’re sort of conceptualizing what they’re doing and interpreting this moment, and I think they would say, we’re not asking people to grieve his death, we’re asking people not to celebrate his death. But I think this has become its own cultural divide, where many people on the left are seeing this as the rise of pretty extreme cancel culture, literally making it so that people lose their jobs, or canceling art that was related to Charlie Kirk. And then I saw one post from a prominent media figure this morning on X saying, “Cancel culture is when you go back in someone’s history and you find their tweets to make them lose their job.” If you say something in real time and people react to it, that’s just experiencing consequences for your actions.

Manisha Krishnan: Yeah, that was such a reach. I mean, I found that ridiculous because it’s kind of splitting hairs. But going back to even the term cancel culture, which I know I used in the headline for my story kind of tongue in cheek, I think what we’re really seeing also is corporate compliance. And it’s preemptive, almost, to a certain degree. And just going back to the point about how they don’t want people to perform grief: There was a list that we had reported on, I think it was called Charlie’s Killers or something, but it was a list of people who had tweeted, and one of my acquaintances, she’s like a Canadian political influencer, was on there, and she said something completely benign. It was not about him at all. It was more about the environment that we’re in right now. And she’s number one on this dox list.

Zoë Schiffer: Wow.

Manisha Krishnan: So it’s pretty dangerous.

Zoë Schiffer: This kind of seemed to culminate in the news on Wednesday that ABC was suspending Jimmy Kimmel’s late night show indefinitely as a result of comments he made about Charlie Kirk. So it cemented this widespread worry that there’s a censorship spree around commenting on the death on every single medium, whether that’s journalistic commentary or your personal view or comedy. So what do you make of that move?

Manisha Krishnan: At first, I was shocked. Maybe I shouldn’t be shocked at this point anymore, but I went and looked at what he said. He really didn’t say anything about Charlie Kirk. What he sort of made a joke about was MAGA trying to deny that the shooter was one of them. And then I found out that Nexstar, which is a media company and like an ABC affiliate, is the one that was apparently up in arms about this, and they are trying to land a $6.2 billion deal right now, so they need FCC approval for that. And once I read that, I was like, “Okay, this does make a lot more sense.” It also makes it a lot more cynical, because we’re looking at direct FCC, direct government involvement in censorship and taking away First Amendment rights.

Zoë Schiffer: Obviously, we’re in a deeply polarized political moment, and that only gets exacerbated on social media, but what do you think it was about this specific incident that seems like, and it’s hard to name a turning point as it’s happening, but it does feel like something has shifted in a big way.

Manisha Krishnan: For one thing, I think that indefinitely canceling the show strikes me as such an overreaction, especially compared to what he actually said. Maybe if you wanted to force him to trot out there and give an apology, that’s more in line with some of the reactions we’ve seen in the past. I also think this happened to someone who, for better or worse, no matter how you feel about Kimmel, is a major pop culture figure. When this kind of censorship happens to journalists, who people don’t really like anyway, or to other random artists, it’s not as big of a deal as when it happens to someone like Jimmy Kimmel. Charlie Kirk, yeah, he had a lot of followers, but Jimmy Kimmel is much, much better known. There’s probably no doubt about that.

Zoë Schiffer: Yeah. Yeah. Well, Manisha, thank you so much for joining me today.

Manisha Krishnan: Thanks for having me.

Zoë Schiffer: That’s our show for today. We’ll link to all the stories we spoke about in the show notes. Make sure to check out Thursday’s episode of Uncanny Valley, which is about how some tech companies are betting big on humanoid robots as the future of AGI. Adriana Tapia and Mark Lyda produced this episode. Amar Lal at Macro Sound mixed this episode. Pran Bandi was our New York studio engineer. Kate Osborn is our executive producer. Condé Nast’s head of global audio is Chris Bannon, and Katie Drummond is WIRED’s global editorial director.

