This lawsuit is a class action complaint filed in the United States District Court for the Northern District of California. The plaintiffs, “Amy” and “Jessica,” are represented by attorneys Micha Liberty of Liberty Law, James R. Marsh, Margaret E. Mabie, and Helene M. Weiss of Marsh Law Firm PLLC, and Hillary Nappi of Hach Rose Schirripa & Cheverie LLP.
Nature of the Action
The lawsuit seeks damages for violations of federal criminal child pornography statutes, specifically 18 U.S.C. §§ 2252(a)(4)(B) and (b)(2). The plaintiffs allege that Apple Inc. failed to design and manufacture products that are safe for use, particularly for children. The complaint highlights that Apple knew about the safety issues related to its products but failed to disclose these hazards to consumers.
Key Allegations
- Defective Products: The plaintiffs claim that Apple’s products, including iPhone, MacBook, iPad, and iCloud, were defectively designed and facilitated the spread of child sexual abuse material (CSAM).
- Failure to Act: Despite being aware of the dangers, Apple allegedly failed to implement measures to detect and limit CSAM on its platforms.
- NeuralHash Tool: Apple developed a CSAM detection tool called NeuralHash but failed to implement it effectively, leading to continued harm to the plaintiffs and other victims.
- Legal Violations: The lawsuit asserts that Apple violated federal laws by not reporting known CSAM and failing to protect victims.
Plaintiffs’ Claims
The plaintiffs, “Amy” and “Jessica,” were victims of child sexual abuse, and their abuse was depicted in CSAM circulated on the internet, including on Apple devices and iCloud. They seek relief on behalf of themselves and similarly situated individuals who were harmed by Apple’s defective products.
Relief Sought
The plaintiffs are seeking:
- Compensatory and punitive damages.
- Injunctive relief requiring Apple to identify, remove, and report CSAM on its platforms.
- Implementation of policies and procedures to prevent the continued dissemination of CSAM.
This lawsuit underscores the critical need for tech companies to prioritize user safety, especially for vulnerable populations like children, and to comply with legal obligations to report and prevent the spread of harmful content.
Comment
According to John Shehan, Senior Vice President, Exploited Children Division & International Engagement at the National Center for Missing & Exploited Children (NCMEC), there are currently 2,680 actively traded CSAM series depicting identified victims. These series come from the more than 30,000 victims identified by NCMEC, many of whose CSAM is not yet in circulation but could enter the online CSAM ecosystem at any time.
This lawsuit identifies 45 Marsh Law clients in 89 criminal cases across the country in which Apple and iCloud were a factor in the distribution, collection, and storage of these victims’ child sexual abuse images and videos. In 2023, Apple itself identified 267 incidents of CSAM on its platforms, and because it does not proactively scan for CSAM, the true number is certainly much greater. For the same reason, CSAM depicting some of the more than 30,000 NCMEC-identified victims whose material is not yet in circulation may already be stored on Apple’s platforms.
Most importantly, this lawsuit is about more than raw numbers; it is about lives shattered by companies like Apple that do absolutely nothing to limit the spread of illegal CSAM on and through their platforms. Behind every image and every video is a child whose life was destroyed and whose pain and suffering are magnified by the never-ending mass distribution of their childhood sexual abuse. No responsible company, and no ethical or moral human being, should ever willingly contribute to this inhumane trauma and oppression. Unfortunately, Apple has done just that, and it will now face responsibility and accountability before the justice system and the public.
Currently, 2,680 child victims are eligible for justice as putative members of the Amy v. Apple class. What we know, however, is that many more victims, now and in the future, will be helped when Apple begins fulfilling its moral and ethical responsibility to proactively scan for illegal child sexual abuse images and videos on its platforms. Only by stopping the spread of this pernicious material can we begin to meet our obligations to victims and restore some measure of dignity and compassion to survivors of these horrible crimes. Our message to Apple is to stop harboring and protecting criminals and start demonstrating responsibility and accountability to children. This lawsuit aims to do just that.