AI & Technology

Lawsuit: Apple Fails to Prevent Spread of Child Pornography on iCloud

By MNK News | December 15, 2024 | 4 Mins Read


A 27-year-old woman who was a victim of child sexual abuse is suing Apple for over $1.2 billion, claiming the company failed to implement its own system to detect and remove child pornography from its iCloud service.

The New York Times reports that Apple is facing a major lawsuit that could cost the tech giant over $1.2 billion in damages. The suit, filed over the weekend in the U.S. District Court in Northern California, alleges that Apple failed to protect victims of child sexual abuse by not implementing its own system to identify, remove, and report child pornography, known technically as child sexual abuse material (CSAM), stored on iCloud, the company's popular cloud storage service.

The plaintiff, a 27-year-old woman from the Northeast who is using a pseudonym due to the sensitive nature of the case, was a victim of child sexual abuse from infancy. The abuse was perpetrated by a relative who took photographs of the acts and shared them online with others. The woman continues to receive law enforcement notices, sometimes dozens per day, informing her that the illegal images have been found in the possession of individuals charged with crimes related to child pornography.

In 2021, Apple unveiled a tool called NeuralHash that would allow the company to scan for known child sexual abuse images by comparing the distinct digital signatures, or hashes, of those known images against the hashes of photos stored in a user's iCloud account. However, the company quickly abandoned the system after facing criticism from cybersecurity experts who warned that it could create a backdoor to iPhones and enable government surveillance.
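The matching scheme described above can be illustrated with a toy sketch. Note the simplification: NeuralHash is a neural perceptual hash designed to match images even after minor edits, whereas this sketch uses SHA-256, which matches exact bytes only, as a stand-in. The hash database and function names here are hypothetical, purely for illustration.

```python
import hashlib

# Hypothetical database of hex digests of known illegal images.
# (Here it contains only the SHA-256 of the bytes b"test".)
KNOWN_IMAGE_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(data: bytes) -> str:
    """Compute a digest of the image bytes.

    Stand-in for a perceptual hash: a real system would use a hash
    that is robust to resizing, re-encoding, and small edits.
    """
    return hashlib.sha256(data).hexdigest()

def flag_upload(data: bytes) -> bool:
    """Return True if the upload's hash matches a known entry."""
    return image_hash(data) in KNOWN_IMAGE_HASHES
```

In a real deployment, matching happens against millions of hashes supplied by clearinghouses such as NCMEC, and a match triggers human review and reporting rather than an automatic verdict.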

The lawsuit argues that by introducing and then abandoning the NeuralHash system, Apple broke its promise to protect victims of child sexual abuse and allowed the illegal material to proliferate on its platform. The suit seeks to change Apple’s practices and compensate a potential group of 2,680 victims who are eligible to be part of the case. Under the law, victims of child sexual abuse are entitled to a minimum of $150,000 in damages, which could result in a total award exceeding $1.2 billion if a jury finds Apple liable.

This case is the second of its kind against Apple, following a lawsuit filed in August by a 9-year-old girl in North Carolina who was sent child sexual abuse videos through iCloud links and encouraged to film and upload her own nude videos. Apple has filed a motion to dismiss the North Carolina case, citing Section 230 of the Communications Decency Act, which provides tech companies with legal protection for content posted on their platforms by third parties.

The outcome of these lawsuits could have significant implications for the tech industry, as recent rulings by the U.S. Court of Appeals for the Ninth Circuit have determined that Section 230 shields may only apply to content moderation and do not provide blanket liability protection. This has raised hopes among plaintiffs’ attorneys that tech companies could be challenged in court over their handling of illegal content on their platforms.

Apple has defended its practices, stating that it is committed to fighting the ways predators put children at risk while maintaining the security and privacy of its users. The company has introduced safety tools to curtail the spread of newly created illegal images, such as features in its Messages app that warn children of adult content and allow people to report harmful material to Apple.

However, critics argue that Apple has prioritized privacy and profit over the safety of victims of child sexual abuse. For years, the company has reported significantly less abusive material than its peers, capturing and reporting only a small fraction of what is caught by Google and Facebook.

The plaintiff in the current lawsuit decided to sue Apple because she believes the company gave victims of child sexual abuse false hope by introducing and then abandoning the NeuralHash system. As an iPhone user herself, she feels that Apple chose privacy and profit over the well-being of people like her who have suffered immensely from the spread of illegal images.

Read more at the New York Times here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.



