‘This conduct is despicable’: State sues Apple over providing iCloud services for ‘child porn’
Apple has already denied the allegations in a lawsuit brought by private individuals, but now a state, West Virginia, has gone to court to demand action over the tech corporation’s involvement, through its iCloud service, in child sex abuse.
State Attorney General JB McCuskey has filed documents accusing Apple of prioritizing user privacy over child safety.
A statement from his office called its case against Apple the first of its kind by a government agency over the distribution of child sexual abuse material on the tech company’s data storage platform.
A report by Reuters said the state’s complaint cites a 2020 text message from the corporation’s anti-fraud chief acknowledging that, because of Apple’s priorities, iCloud was “the greatest platform for distributing child porn.”
“The state said it is seeking statutory and punitive damages and that the lawsuit filed in Mason County Circuit Court asks a judge to force Apple to implement more effective measures to detect abusive material and implement safer product designs,” Reuters said.
The report noted the company previously considered scanning images but dropped the plan over worries about user privacy.
Among the concerns is that a government could look for material to support censorship or arrests.
But McCuskey said there’s a reason a crackdown is needed.
“These images are a permanent record of a child’s trauma, and that child is revictimized every time the material is shared or viewed,” his statement said. “This conduct is despicable, and Apple’s inaction is inexcusable.”
Other online platforms scan uploaded photos against databases of identifiers for known child sexual abuse material. Apple’s platform was unencrypted, meaning law enforcement could search it with a warrant; the company later considered encryption but dropped the idea.
Reuters said, “In August 2021, Apple announced NeuralHash, which it designed to balance the detection of child abuse material with privacy by scanning images on users’ devices before upload.” But that plan, too, was soon abandoned.
Now it has a feature called Communication Safety that blurs images containing nudity.
The report pointed out that the state’s complaint charges: “Federal law requires U.S.-based technology companies to report abuse material to the National Center for Missing and Exploited Children. Apple in 2023 made 267 reports, compared to 1.47 million by Google and 30.6 million by Meta Platforms.”
Apple allowed child sexual abuse materials on iCloud for years, new lawsuit claims https://t.co/hh6aLIo7iS pic.twitter.com/FKz7wIAZh2
— Eyewitness News (@ABC7NY) February 19, 2026
West Virginia is suing Apple, claiming iCloud became a playground for trouble. If this escalates, expect some bearish vibes on AAPL. #Apple $AAPL
— AI Stock News (@stock_news_AI) February 19, 2026