Hacker News

How can you be sure you know what it is you don't want, if you want to have a conversation?


I don't want any functionality capable of scanning photos on my device for "illegal" content; I simply don't want that code living on my device no matter how restricted its (initially planned) use is.


Fair enough. Framed like that, it does seem like only a small step from a dystopia.

But let's face it, we're knee-deep in it already: unless you control what code runs on your device, you're never safe from code that scans data on your device.

I'm not sure I really buy the slippery slope argument. If in the future Apple wanted to be more intrusive, they would just be more intrusive and scan your photos on your device for real, not with a convoluted and probabilistic fingerprint method.

What is the weak point? That it gets people accustomed to being scanned? Aren't people already? Your content is already scanned the moment it reaches the cloud.

What does this extra client-side scanning add to the dystopia that we're not already experiencing?


It's the opening of Pandora's box. Scanning your device is a huge paradigm shift; we all know the cloud is "someone else's computer" and untrustworthy but thus far one's device was sacred and no one dared touch it.

The floodgates are open now; politicians and other "interested parties" have watched this unfold very carefully and gleefully noted the majority didn't care as much as everyone expected, so they'll definitely be pushing for it now.

Imagine Windows Defender (an antimalware / antivirus distributed with all versions of Windows and enabled by default) starts scanning one's hard drive (and attached external drives) for image files (it already scans documents and executables and even uploads samples of malicious binaries to Microsoft for analysis): how would you / the world react?


> Scanning your device is a huge paradigm shift

But that's not what they want to do. They want to perform a client side fingerprint of a subset of images before they get uploaded to the cloud.

You can argue that they could in the future turn this into a scan of everything on the device. But you can also argue that if that's what they want to do in the future they'll do it in the future. It's all about trust. If you don't trust Apple to not push nefarious code on your devices, stay away from Apple, and that's true even before all this.
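To make the "fingerprint" idea concrete: the scheme being discussed compares compact hashes of images against hashes of known images, rather than inspecting content directly. Below is a toy sketch using a simple average hash. This is not Apple's actual NeuralHash (which is a neural-network-based perceptual hash); it's just a minimal illustration of why a fingerprint can match a slightly altered copy of a known image while telling unrelated images apart.

```python
# Toy fingerprint matching via a simple "average hash".
# NOT Apple's NeuralHash -- just a sketch of the general idea:
# reduce an image to a short bit fingerprint, then compare fingerprints
# of known images instead of inspecting raw content.

def average_hash(pixels):
    """Bit fingerprint of a tiny grayscale image (flat list of 0-255 values).
    Each bit records whether a pixel is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming_distance(a, b):
    """Number of differing bits between two fingerprints."""
    return sum(x != y for x, y in zip(a, b))

# A "known" image, a slightly brightened copy, and an unrelated image.
known     = [10, 200, 30, 180, 90, 250, 15, 170, 60]
variant   = [p + 5 for p in known]
unrelated = [128, 127, 129, 126, 130, 125, 131, 124, 128]

# The brightened copy hashes identically; the unrelated image does not.
print(hamming_distance(average_hash(known), average_hash(variant)))    # 0
print(hamming_distance(average_hash(known), average_hash(unrelated)))  # 9
```

A real system would use larger images, a learned hash function, and a match threshold, but the privacy argument in the thread turns on exactly this property: the comparison happens against fingerprints of a fixed set of known images, not as an open-ended scan of everything.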



