Per one tech forum this week: “Google has quietly installed an app on all Android devices called ‘Android System SafetyCore’. It claims to be a ‘security’ application, but whilst running in the background, it collects call logs, contacts, location, your microphone, and much more, making this application ‘spyware’ and a HUGE privacy concern. It is strongly advised to uninstall this program if you can. To do this, navigate to ‘Settings’ > ‘Apps’, then delete the application.”
This is EXACTLY what Apple tried to do with their on-device CSAM detection. It had a ridiculous number of safeguards to protect people’s privacy, and it still got shouted down.
I’m interested in seeing what happens when Holy Google, for which most nerds have a blind spot, does the exact same thing
I have 5 kids. I’m almost certain my photo library of 15 years has a few completely innocent pictures where a naked infant/toddler might be present. I do not have the time to search 10,000+ pics for material that could be taken completely out of context and reported to authorities without my knowledge. Plus, I have quite a few “intimate” photos of my wife in there as well.
I refuse to consent to a corporation searching through my device on the basis of “well just in case”, as the ramifications of false positives can absolutely destroy someone’s life. The unfortunate truth is that “for your security” is a farce, and people who are actually stupid enough to intentionally create that kind of material are gonna find ways to do it regardless of what the law says.
Scanning everyone’s devices is a gross overreach and, given the way I’ve seen Google and other large corporations handle reports of actually-offensive material (i.e. they do fuck-all), I have serious doubts over the effectiveness of this program.
Google did end up doing exactly that, and, predictably, people were falsely accused of child abuse and of possessing CSAM.
Apple had it report suspected matches upstream for review, rather than just warning the user locally.
It got canceled because the fuzzy hashing algorithm (NeuralHash) turned out to be insecure in a way that’s hard to fix: it was easy to craft collisions and plant false positives.
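To see why planted false positives are so easy against fuzzy hashing, here’s a deliberately simplified sketch. This toy “threshold hash” is NOT Apple’s NeuralHash (which is a neural network), and all the names and pixel values below are made up for illustration, but the attack idea is the same one researchers demonstrated against NeuralHash in 2021: because the hash depends on coarse image features, small pixel nudges can steer a visually unrelated image onto any target hash.

```python
# Toy demo of collision planting against a fuzzy image hash.
# Simplified stand-in, not a real perceptual hash.

def threshold_hash(pixels, threshold=128):
    """Hash a flat list of 0-255 grayscale values into a bit tuple."""
    return tuple(1 if p > threshold else 0 for p in pixels)

def plant_collision(pixels, target_hash, threshold=128):
    """Nudge each pixel just across the threshold so the image
    hashes to target_hash while changing as little as possible."""
    forged = []
    for p, bit in zip(pixels, target_hash):
        if bit == 1 and p <= threshold:
            forged.append(threshold + 1)   # just above the cut-off
        elif bit == 0 and p > threshold:
            forged.append(threshold)       # just below the cut-off
        else:
            forged.append(p)               # bit already matches
    return forged

# An innocent "image" and the hash of some image on the blocklist.
innocent = [10, 220, 90, 200, 40, 160, 30, 250]
target   = (1, 1, 0, 0, 1, 0, 1, 0)

forged = plant_collision(innocent, target)
print(threshold_hash(innocent) == target)  # False: no match originally
print(threshold_hash(forged) == target)    # True: collision planted
```

A real perceptual hash is harder to invert by hand, but gradient-based attacks do the same job: send the forged file to a victim, and the scanner flags them.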
The hell it did, that shit was gonna snitch on its users to law enforcement.
Overall, I think this needs to be done by a neutral 3rd party. I just have no idea how such a 3rd party could stay neutral. Same with social media content moderation.