CSAM victims sue Apple for dropping planned scanning tool
December 13, 2024


Thousands of CSAM victims are suing Apple over its decision to abandon plans to scan devices for child sexual abuse material.

In addition to facing more than $1.2B in penalties, the company could be forced to reinstate the plans it abandoned after many of us pointed out the risk of abuse by repressive regimes …

The story so far

Most cloud computing services use digital fingerprinting to regularly scan user accounts for child sexual abuse material (CSAM).

These fingerprints provide a way to match known CSAM images without anyone having to look at them, and are deliberately fuzzy, so that they continue to match cropped or otherwise edited copies while producing very few false positives. When a positive match is found, the photo is manually reviewed by a human. If it is confirmed to be CSAM, a report is filed and passed to law enforcement.
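
To make the idea concrete, here is a minimal Swift sketch of this kind of fuzzy matching, assuming 64-bit perceptual hashes have already been computed for each image. The hash values, names, and distance threshold are all illustrative; this is not Apple's NeuralHash or any real fingerprint database.

```swift
// Illustrative sketch of fuzzy fingerprint matching. Assumes 64-bit
// perceptual hashes (e.g. from an average-hash step) have already been
// computed for each image. Hash values and threshold are made up.

/// Number of bits that differ between two fingerprints.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

/// A photo matches if its fingerprint is within `maxDistance` bits of any
/// known hash, which is what lets cropped or lightly edited copies still match.
func matchesKnownFingerprint(_ candidate: UInt64,
                             against knownHashes: [UInt64],
                             maxDistance: Int = 5) -> Bool {
    knownHashes.contains { hammingDistance(candidate, $0) <= maxDistance }
}

// A lightly edited copy changes only a few bits and still matches;
// an unrelated image differs in many bits and does not.
let knownHashes: [UInt64] = [0b1010_1100_0011_0101] // hypothetical database entry
let editedCopy: UInt64 = 0b1010_1100_0011_0111      // 1 bit away -> match
let unrelated: UInt64 = 0b0101_0011_1100_1010       // far away   -> no match

print(matchesKnownFingerprint(editedCopy, against: knownHashes)) // true
print(matchesKnownFingerprint(unrelated, against: knownHashes))  // false
```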

iCloud is one of the very few cloud services that doesn’t do this kind of scanning, with Apple citing privacy reasons.

To introduce CSAM scanning in a privacy-respecting way, Apple proposed running the fingerprint-matching tool on-device, arguing that this would be less invasive than scanning photos in iCloud. Photos would only be manually inspected if multiple matches were found, as a way to further reduce the risk of false positives.
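
For illustration, the sketch below captures the threshold part of that proposal: matches are counted per account, and human review would only come into play once the count exceeds a threshold. The type and function names are invented; the figure of roughly 30 matches is the threshold Apple discussed publicly at the time.

```swift
// Minimal sketch of the threshold idea: no single on-device match triggers
// anything; review would only be possible once an account accumulates more
// matches than the threshold. Names and structure here are illustrative.

struct AccountScanState {
    let reviewThreshold = 30        // figure Apple discussed publicly; assumption here
    private(set) var matchCount = 0

    /// Record the result of scanning one photo; returns true once the
    /// account has crossed the threshold and human review would kick in.
    mutating func recordScan(matched: Bool) -> Bool {
        if matched { matchCount += 1 }
        return matchCount >= reviewThreshold
    }
}

var state = AccountScanState()
// A single match (or a handful) is never enough on its own to trigger review.
let needsReview = state.recordScan(matched: true)
print(needsReview) // false, well below the threshold
```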

The problem, as many of us observed at the time, is the potential for an authoritarian government to abuse the system.

Digital fingerprints can be created for any type of material, not just CSAM. There is nothing to stop an authoritarian government from adding images of political campaign posters or the like to the database.

A tool designed to target serious criminals could easily be used to detect those who oppose the government or one or more of its policies. Apple—which would receive a database of fingerprints from the government—could find itself inadvertently helping to suppress political activists, or worse.

Apple initially said it would never agree to this, but many of us again pointed out that it would have no choice. As the company says every time it has to do something sketchy in order to obey local law, “Apple complies with the laws of every country in which it does business.”

The iPhone maker initially rejected the argument, but eventually acknowledged the reality of the problem and abandoned its CSAM scanning plan before it was too late. Apple has since used this exact argument to oppose proposed legislation.

CSAM victims file lawsuit

Ars Technica reports that CSAM victims are now suing Apple over its failure to scan.

Thousands of victims are suing Apple, accusing it of failing to detect and report illegal child pornography, also known as child sexual abuse material (CSAM) […]

Child sexual abuse survivors have filed a lawsuit accusing Apple of using its cybersecurity defenses to ignore the tech giant’s mandatory CSAM reporting obligations. If they win over a jury, Apple could face fines of more than $1.2 billion. Perhaps most notably for privacy advocates, Apple may also be forced to “identify, remove, and report CSAM on iCloud and implement policies, practices, and procedures to prevent the continued spread of CSAM or child sex trafficking on Apple devices and services.” This could mean a court order to implement the controversial tool or an alternative that meets industry standards for detecting CSAM at scale.

Apple has been accused of profiting from the policy.

As survivors see it, Apple profits by allowing CSAM on iCloud because child predators see its product as a safe haven to store the CSAM that is heavily reported by most other big tech companies. The lawsuit states that Apple reported only 267 known instances of CSAM in 2023, while four other “leading technology companies submitted more than 32 million reports.” If Apple’s lax approach to CSAM continues unchecked, survivors worry that artificial intelligence could exponentially increase the amount of unreported CSAM.

Apple responded by saying that it does take proactive steps to tackle the issue.

Child sexual abuse is abhorrent and we are committed to combating the ways predators put children at risk. We are urgently and aggressively innovating to combat these crimes without compromising the security and privacy of all our users. For example, features like Communication Safety will warn children when they receive or attempt to send content containing nudity to help break the chain of coercion that leads to child sexual abuse. We remain focused on establishing protective measures to help prevent the spread of CSAM.

9to5Mac’s Opinion

This is a no-win situation for all concerned. There is an inevitable conflict between detecting truly heinous crimes and the risk of such tools being exploited by authoritarian governments.

If Apple had made scanning iCloud photos a standard practice from the beginning, this might never have become a controversial issue. Ironically, it was the company’s attempt to achieve the same goal in a more privacy-respecting way that sparked the controversy.

At this point, it may actually be in Apple’s own interest for a court to rule on the matter. If it is forced to implement scanning that a future government could exploit, the company can at least point out that it had no choice. Conversely, if Apple wins the case, it would set a legal precedent and remove the ongoing pressure.

Photo: Unsplash
