Thousands of child sex abuse victims sue Apple for lax CSAM reporting
Apple knowingly ignoring child porn is a "never-ending nightmare," lawsuit says.
arstechnica.com/tech-policy/20…
https://mastodon.social/@arstechnica/113624699875479200
@arstechnica We would all be safer if cops had the right to search our homes at will. Just saying...
Because I don't get how people can compartmentalize privacy like this. Privacy is a basic […]
@arstechnica while I think they should maybe do SOMETHING, the thing they proposed with iOS 15.2 or whatever was horrible. If they scan unencrypted files in iCloud, fine. If they do anything […]
@arstechnica It's always been so weird to me that forums and comment threads are 100% against responsible, on-device CSAM scanning.
@arstechnica Apple in movies: "only the good guys can use Apple."
Apple in real life: "we're used a lot by criminals, as we're more a status symbol than anything else... and no matter what we […]
@arstechnica As abhorrent as this material is, I don't want a big brother IN my phone. I have an expectation of privacy on my phone and its contents (99% dog pics). This suit is nothing more […]
@arstechnica
"In 2021, Apple announced an on-device scanning system to identify and report users who stored known child sexual abuse material on their iCloud Photos accounts. β¦
[β¦]
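For readers unfamiliar with what "on-device scanning against known material" means in practice, here is a minimal, purely illustrative sketch of hash-set matching with threshold reporting. It is not Apple's actual design: the 2021 proposal used a perceptual hash (NeuralHash), a blinded hash database, and cryptographic "safety vouchers" with threshold secret sharing. The hash function, database, and threshold value below are hypothetical stand-ins.

```swift
import Foundation
import CryptoKit

// Purely illustrative sketch: the plain set lookup and SHA-256 here are
// stand-ins for the perceptual hashing and cryptographic matching a real
// system would use. All names and values below are hypothetical.

let knownHashes: Set<String> = []   // hypothetical database of known-image hashes
let reportingThreshold = 30         // nothing is flagged below this many matches

/// Stand-in for a perceptual hash of one photo's bytes.
func photoHash(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

/// How many of the user's photos match the known-hash set.
func matchCount(in photos: [Data]) -> Int {
    photos.filter { knownHashes.contains(photoHash($0)) }.count
}

/// A report would be generated only once the match count crosses the threshold.
func shouldReport(_ photos: [Data]) -> Bool {
    matchCount(in: photos) >= reportingThreshold
}
```

The threshold step is the part debated in the thread: matching is done only against a fixed list of already-known images, and no report is generated until many matches accumulate, which is meant to keep false positives from ever being seen by a human reviewer.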