So, Meta is currently in a bit of a weird legal spot. They just asked a US district court to throw out a lawsuit that basically accuses them of being a bunch of high-tech pirates. The company behind the suit is Strike 3 Holdings. If you don’t know them, they own a massive library of adult films, and they are claiming that Meta “illegally torrented” their movies. Why? To train AI. Specifically, Strike 3 thinks Meta is building some secret, unannounced adult version of their “Movie Gen” AI. Strike 3 is seeking damages that could hit $350 million. That is a lot of money for some downloads.
But Meta says the whole thing is just a fantasy. On Monday, they filed a motion to dismiss, and they didn’t hold back. They called Strike 3’s claims a mix of “guesswork and innuendo.” They even poked at Strike 3’s reputation, mentioning how some people call them a “copyright troll” because they file so many lawsuits.
The “Personal Use” Defense
The core of Meta’s argument is actually pretty funny when you think about it. They aren’t necessarily denying that some movies were downloaded on their network. Instead, they are saying that if it happened, it was just bored employees or random guests looking for a distraction. They say the evidence is “plainly indicative” of private personal use.
Think about the scale. Strike 3 says Meta used a “stealth network” of 2,500 hidden IP addresses to hide their tracks. Meta says that’s nonsense. According to the filing, the actual number of downloads linked to Meta IPs was only about 22 per year. That’s roughly two movies a month.
Meta’s lawyers argued that if a giant tech company were trying to steal a “massive dataset” to train a sophisticated video AI, they wouldn’t just grab 22 files a year. You need millions of data points for AI training. Twenty-two movies isn’t a dataset; it’s a slow weekend. From what I can tell, they are basically saying their employees might be watching porn on the clock, but the company isn’t using it for work.
Breaking Down the Timeline
One of the biggest holes Meta pointed out is the timing. This is where Strike 3’s legal theory starts to look shaky.
- The alleged downloads started in 2018.
- Meta didn’t even start researching “Multimodal Models and Generative Video” until about 2022.
- That’s a four-year gap where they would have been hoarding porn for a project that didn’t exist yet.
Meta also pointed out that their own terms of service actually prohibit the AI from generating adult content. They argue it would be “nonsensical” to train a model on stuff they don’t want the model to ever produce. It would be like a vegetarian restaurant stealing recipes for steak. Why bother?
Who actually clicked the link?
Another issue is the “who” of it all. Meta has tens of thousands of employees. They have contractors. They have visitors. They have repair people. All these people use the Meta Wi-Fi every single day. Strike 3 hasn’t identified a single person who did the downloading. They didn’t link the activity to anyone actually working in the AI department.
Meta’s filing says it’s just as likely that a “freeloader” or a “guest” was responsible. They even brought up a specific contractor Strike 3 mentioned, a guy who was supposedly told to download adult content at his dad’s house. Meta pointed out that he was an “automation engineer” who had nothing to do with sourcing data for AI. When his contract ended, the downloads stopped. Strike 3 says that proves Meta was involved, but Meta says it just proves that this specific guy stopped watching those videos when he stopped working there. Anyway, it’s a bit of a stretch to blame the CEO for what a contractor does at his dad’s place.
The Mystery of the “Stealth Network”
Then there is the “stealth network” claim. Strike 3 says Meta used 2,500 hidden IPs to mask their “theft.” Meta’s response was basically: “If we were trying to hide, why would we leave hundreds of other downloads totally visible on our main corporate IP addresses?”
It’s a fair point. Why hide half the crime and leave the other half out in the open? Meta calls the whole theory “nonsensical.” They also shot down the idea that they should have been “policing” their network better. They argued that monitoring every single file downloaded by every person on their global network would be an “extraordinarily complex and invasive undertaking.” Essentially, they are saying they shouldn’t have to spy on their employees’ every click just to protect Strike 3’s copyrights.
What This Means for AI Training
This case is part of a much bigger fight. Right now, everyone is suing AI companies. Authors are suing because their books were used in training data. Artists are suing over their paintings. But those cases usually involve massive, confirmed scrapes of data. This one is different because it’s based on a handful of torrents.
If Meta wins this, it sets a bit of a precedent. It suggests that a company isn’t automatically responsible for every bit of data that passes through its pipes. It also forces plaintiffs to prove that the data was actually used for training, not just that it existed on a hard drive somewhere in the building.
Meta’s spokesperson was pretty blunt about it: “These claims are bogus.” They want the world to know they don’t want this kind of content in their models. They take “deliberate steps” to avoid it. Whether the court believes that or not is the $350 million question.
The Next Steps
Strike 3 now has about two weeks to file a response. They have to find a way to link those 2,400 movies directly to the AI team. If they can’t, this suit is likely going to the graveyard. Meta is playing the “boring human” card: arguing that people are just people, and sometimes people do weird things on company time that have nothing to do with the company’s “mission.”
It’s a weirdly grounded defense for a company usually obsessed with the “metaverse.” Instead of talking about the future of humanity, they are talking about one guy at his dad’s house and some intermittent torrents. Sometimes the simplest explanation, that people just like watching movies, is the one that wins in court.