Court: Musk’s X Must Prove It Wasn’t Negligent Over Child Abuse Material


A federal appeals court has revived a lawsuit alleging that X was negligent in its handling of child sexual abuse material (CSAM) and “slow-walked its response” to reports.

The Ninth US Circuit Court of Appeals, in San Francisco, said X must face a claim that it failed to promptly report a video containing explicit images of two underage boys to the National Center for Missing and Exploited Children (NCMEC). The lawsuit was originally filed in 2021, before Elon Musk’s 2022 takeover of the platform.

In one example, the plaintiffs claim that a 13-year-old boy was tricked into sharing explicit images of himself via Snapchat, which were then shared on X. The boy and his mother later reported the images, supplying ID and complying with X’s reporting processes. However, X allegedly took nine days to remove the offending content, by which point it had racked up more than 167,000 views and 2,000 retweets and had circulated around the boy’s high school.

The suit alleged that X “passed on opportunities to develop better tools” to stop the spread of this type of content, “despite the inadequacy of its existing infrastructure.” It went on to claim that due to X’s business model, “it receives significant advertising revenue from hosting sought-after or popular posts, including those that depict pornographic content featuring minors.”

The plaintiffs also pointed to numerous limitations in X’s processes for reporting child abuse material. These included not allowing users to report child pornography sent via private messaging, requiring reporters to supply an email address, and requiring the reporter to have, and be logged into, a Twitter account.

Judge Danielle Forrest found that Section 230 of the federal Communications Decency Act, which protects online platforms from liability over user content, didn’t grant X immunity from the negligence claim once it learned about the offending material. However, the court found X immune from allegations that it benefited from sex trafficking.

Adult content represented a large share of the platform’s content at the time of the case. A Reuters report from 2022, citing internal documents, found that 13% of all content on the platform was adult material. Meanwhile, problems with CSAM have persisted on X. According to X’s transparency report covering January through June 2024, 2.78 million accounts were deactivated for child sexual abuse material violations, though this figure fell to 132,155 in the October 2024 to March 2025 period.

X has yet to comment on the court’s decision.




About Will McCurdy

Contributor

I’m a reporter covering weekend news. Before joining PCMag in 2024, I picked up bylines in BBC News, The Guardian, The Times of London, The Daily Beast, Vice, Slate, Fast Company, The Evening Standard, The i, TechRadar, and Decrypt Media.

I’ve been a PC gamer since you had to install games from multiple CD-ROMs by hand. As a reporter, I’m passionate about the intersection of tech and human lives. I’ve covered everything from crypto scandals to the art world, as well as conspiracy theories, UK politics, and Russia and foreign affairs.


