πŸ“£ Post by arstechnica

2024-12-09

Thousands of child sex abuse victims sue Apple for lax CSAM reporting
Apple knowingly ignoring child porn is a "never-ending nightmare," lawsuit says.
arstechnica.com/tech-policy/20…


https://mastodon.social/@arstechnica/113624699875479200

https://arstechnica.com/tech-policy/2024/12/thousands-of-child-sex-abuse-victims-sue-apple-for-lax-csam-reporting/?utm_brand=arstechnica&utm_social-type=owned&utm_source=mastodon&utm_medium=social


πŸ’¬ Replies

2024-12-10 doctor_zoidberg

@arstechnica We would all be safer if cops had the right to search our homes at will. Just saying...
Because I don't get how people can compartmentalize privacy like this. Privacy is a basic […]

2024-12-10 geomaster337

@arstechnica while I think they should maybe do SOMETHING, the thing they proposed with iOS 15.2 or whatever was horrible. If they scan unencrypted files in iCloud, fine. If they do anything […]

2024-12-09 MisterMoo

@arstechnica It's always been so weird to me that forums and comment threads are 100% against responsible, on-device CSAM scanning.

2024-12-09 Fleurvervenne

@arstechnica Apple in movies: "Only the good guys can use Apple."
Apple in real life: "We're used a lot by criminals, as we're more a status symbol than anything else... and no matter what we […]

2024-12-09 T2R

@arstechnica As abhorrent as this material is, I don’t want a big brother IN my phone. I have an expectation of privacy on my phone and its contents (99% dog pics). This suit is nothing more […]

2024-12-09 mmalc

@arstechnica
"In 2021, Apple announced an on-device scanning system to identify and report users who stored known child sexual abuse material on their iCloud Photos accounts. …
[…]
