Apple will remotely install software to scan all US phones for child sex abuse images

LukeTbk

[H]ard|Gawd
Joined
Sep 10, 2020
Messages
1,509
I liked my iPhone 8 Plus for what it was, but I guess it is time to ditch it. I am an anime fan. If you know anything about anime, you'll know that some character designs might be misconstrued as underage, and possibly in revealing clothing. They're just drawings and fiction, but any agency looking at this might be too retarded, or might not even care. Even if I don't get taken off to jail, I'll probably be marked/flagged by this. I don't use iCloud at all, and I don't just text anime around often, but just seeing this on the horizon is enough of a warning flag. Either this garbage will be stillborn, or it will spiral out of control into something far more insidious than this. I need to bail. Where to, though? I don't know. There's a reason I left the Android landscape. I guess it's time to start looking for a new phone.

From what I understand, it is not an AI trying to determine whether an image is illegal by looking at its content, but a check of whether it matches (even with cropping, rotation, mirroring, and similar modifications) an entry in a giant database of known illegal images, which will not contain drawings. You would still have to be sent to prison by a judge who would look at the images; being flagged and followed is much more in the realm of the possible.

As for being stillborn, this is something Microsoft, Facebook, and Google have been doing for a very long time; Apple was special for not participating more than this change is special. It is really hard to imagine the mainstream buyer caring about a hash database being used on pictures you ask iCloud to host for you.
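To make that concrete, the general shape of that pipeline is something like the sketch below. This is a toy 8x8 "average hash" and a plain Hamming-distance comparison, nowhere near PhotoDNA or Apple's NeuralHash (and the threshold value is just an illustrative guess), but the hash-the-picture-then-compare-against-known-hashes idea is the same:

```python
# Toy illustration of hash-based matching against a database of known images.
# This is a crude 8x8 "average hash"; PhotoDNA / NeuralHash are far more robust,
# but the overall flow is the same: hash the picture, compare to known-bad hashes.
from PIL import Image  # pip install Pillow

def average_hash(path: str) -> int:
    """Downscale to an 8x8 grayscale thumbnail and set one bit per pixel above the mean."""
    img = Image.open(path).convert("L").resize((8, 8), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits  # 64-bit fingerprint

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit fingerprints."""
    return bin(a ^ b).count("1")

def matches_known_database(path: str, known_hashes: set[int], threshold: int = 5) -> bool:
    """A picture "matches" if its hash is within a few bits of any known hash.
    The tolerance gives limited robustness to re-encoding and resizing; real systems
    go much further (cropping, rotation, mirroring) and use bigger descriptors."""
    h = average_hash(path)
    return any(hamming(h, k) <= threshold for k in known_hashes)
```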
 

StoleMyOwnCar

2[H]4U
Joined
Sep 30, 2013
Messages
2,238
From what I understand, it is not an AI trying to determine whether an image is illegal by looking at its content, but a check of whether it matches (even with cropping, rotation, mirroring, and similar modifications) an entry in a giant database of known illegal images, which will not contain drawings. You would still have to be sent to prison by a judge who would look at the images; being flagged and followed is much more in the realm of the possible.

As for being stillborn, this is something Microsoft, Facebook, and Google have been doing for a very long time; Apple was special for not participating more than this change is special. It is really hard to imagine the mainstream buyer caring about a hash database being used on pictures you ask iCloud to host for you.

I don't doubt for a second that this is going to turn into a heuristic and/or pattern-based machine learning/neural network evaluation, given time. Discord already does (or did) something similar: it has a bot that goes around trying to automatically check whether you're posting a lewd image in an SFW channel (by the way, it sucks at its job). At first I was okay with this when I read the initial blueprint, because I assumed it was only going to be for photos uploaded to the cloud and/or texted around. However, it seems like it will actually be functioning actively on your phone as well. And I fully expect later iterations of the iPhone to have some setting buried very far down that will be automatically enabled, checked to upload this information automatically, tied to a temporary and/or trial subscription that you didn't know you had.

I'm not eager to see what rabbit hole this leads down. Stopping child abuse and child pornography is a noble cause, and I obviously do neither... but I'm not eager to sit there and be marked by the faulty algorithms that are going to come out after this initial wave, living my life in fear of my own phone because I play, say, Azur Lane on it. I guess I could just delete everything on my iPhone and only use it as a plain phone... and maybe that's the path I will have to take. But at that point, why even have a smartphone? I am sure you're right that most people will not care about this. I am speaking from my perspective, and there seems to be a lot to care about. If not now, then later.
 

revenant

[H]F Junkie
Joined
Apr 10, 2005
Messages
14,846
This is terrifying... not that I have anything to hide, jesus... but it's just going to spiral from here to the next giant privacy violation. We'll probably be using China's facial recognition social punishment system in 10 years too, or some form thereof.
 

Mchart

Supreme [H]ardness
Joined
Aug 7, 2004
Messages
4,679
This is terrifying... not that I have anything to hide, jesus... but it's just going to spiral from here to the next giant privacy violation. We'll probably be using China's facial recognition social punishment system in 10 years too, or some form thereof.
Y'all realize that every other cloud company has been doing this for years now, correct? I'd prefer Apple not do it either, but it's hilarious how people only care about this stuff and have a hissy fit when Apple announces it. Keep in mind, none of the others, like Adobe and Google, ever announced it. It was just shadow-introduced.
 

revenant

[H]F Junkie
Joined
Apr 10, 2005
Messages
14,846
Y'all realize that every other cloud company has been doing this for years now, correct? I'd prefer Apple not do it either, but it's hilarious how people only care about this stuff and have a hissy fit when Apple announces it. Keep in mind, none of the others, like Adobe and Google, ever announced it. It was just shadow-introduced.
I guess that makes sense... I guess it just hit home right then, reading about it. And I'm not sure I had a full-on hissy... lol.
 

emphy

Limp Gawd
Joined
Aug 31, 2016
Messages
296
The Apple claim is a one in a trillion error rate:
The trained network’s output is an N-dimensional, floating-point descriptor

That sounds like something of relatively small size (1 billion users * 1,000 pictures each) could plausibly have every picture end up with a single unique hash. Even the simple Microsoft GUID:
the number of random version-4 UUIDs which need to be generated in order to have a 50% probability of at least one collision is 2.71 quintillion
this number is equivalent to generating 1 billion UUIDs per second for about 85 years

the probability to find a duplicate within 103 trillion version-4 UUIDs is one in a billion.


PhotoDNA hashes tend to be between 128 and 4096 bits.

Google has been using this since 2008 and Facebook since 2011, and Microsoft has been running something similar on OneDrive for a while (OneDrive has used a PhotoDNA hash database since 2015), so how often false positives occur must be somewhat known.

I would imagine the issues are more in the "is this hash close enough" question than in actual collisions, and it is good to have a human verification process at the end of the chain.

Disclosure: I know almost nothing about the subject.

You don't need to be an expert on the subject to know that a one in a trillion error rate is a cow crap number pulled from someone's behind.
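To be fair to the quoted figures, they do check out as plain birthday-bound math if you assume uniformly random hashes. Here is a quick back-of-the-envelope check (my own sanity check, not Apple's methodology; perceptual hashes are emphatically not uniformly random, which is exactly where the real error rate hides):

```python
# Birthday-bound sanity check of the quoted UUID/collision numbers.
# Assumes uniformly random values, which perceptual hashes are NOT; real false
# positives come from "close enough" near-matches, not random collisions.
import math

def collision_probability(n_items: float, bits: int) -> float:
    """Approximate P(at least one collision) among n_items values drawn uniformly
    from a space of 2**bits, using p ~= 1 - exp(-n^2 / (2 * 2^bits))."""
    space = 2.0 ** bits
    return 1.0 - math.exp(-(n_items ** 2) / (2.0 * space))

print(collision_probability(2.71e18, 122))  # ~0.5: the quoted 50% figure (v4 UUIDs have 122 random bits)
print(collision_probability(103e12, 122))   # ~1e-9: the quoted "one in a billion" figure
print(collision_probability(1e12, 128))     # 1 billion users x 1,000 photos vs. a 128-bit space: ~1.5e-15
```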
 

Jonnycat99

Limp Gawd
Joined
Nov 9, 2006
Messages
292
This is just a stopgap measure until they set up the brain scanning devices outside of public facilities and stores.
 

blackmomba

Gawd
Joined
Dec 5, 2018
Messages
579
Ya'll realize that every other cloud company has been doing this for years now, correct? I'd prefer Apple not do it either, but it's hilarious how people only care about this stuff and have a hissy fit when Apple announces it. Keep in mind, none of the others, like Adobe and Google, never announced it. It was just shadow introduced.
Everyone else puts the pedo hash db on your device and scans your local pictures before they've been uploaded to the cloud?

Where are the citations, please?
 

sfsuphysics

[H]F Junkie
Joined
Jan 14, 2007
Messages
14,956
"... are an excuse to weaken encryption and other security measures that protect millions of users and our national security,” Apple said ...

(hint: not recently)
 

LukeTbk

[H]ard|Gawd
Joined
Sep 10, 2020
Messages
1,509
Everyone else puts the pedo hash db on your device and scans your local pictures before they've been uploaded to the cloud?
I imagine most use Google's service or the Azure PhotoDNA service that Microsoft offers for free (thus not on the local device), but almost all of them do scan now.

Apple's way probably saves Apple money (if they do not want to use Microsoft's service)? Or legally it is better if the picture is tagged before being hosted by them. I'm not sure why they do it like that unlike the others; they have been scanning photos once they are on iCloud since 2020. Does doing the AI work on the phone instead of in the cloud make a big difference?
 

deruberhanyok

2[H]4U
Joined
Aug 22, 2004
Messages
2,105
Does doing the AI work on the phone instead of in the cloud make a big difference?

absolutely it does.

before, the data on your phone was, in theory, secure, as long as you weren’t just sending it up to cloud services.

but now the phone will scan your data and send off a report to somewhere without your control of it. They say it will only happen if the photos are uploaded to iCloud, that’s the trigger for this whole process.

But with this, the capability exists for it to scan whatever, whenever, and report on it, without you knowing about it, and all we have to go on is Apple’s statements that “the system won’t be abused” and “won’t be expanded”. There’s no way to lock the door, and we just have to trust Apple with the keys.

that is a massive change from “if you want to keep your data private just don’t put it in the cloud.”
 

LukeTbk

[H]ard|Gawd
Joined
Sep 10, 2020
Messages
1,509
absolutely it does.
I get the slippery slope argument, and one day it might be different, but not for now; it is pretty much the exact same thing (just Apple saving money), it seems to me.
Y'all realize that every other cloud company has been doing this for years now, correct? I'd prefer Apple not do it either, but it's hilarious how people only care about this stuff and have a hissy fit when Apple announces it. Keep in mind, none of the others, like Adobe and Google, ever announced it. It was just shadow-introduced.
That sounds false:
https://www.theverge.com/2018/9/3/1...ial-moderation-tool-internet-watch-foundation
https://www.blog.google/around-the-...nd-report-child-sexual-abuse-material-online/

Microsoft made that very public over time, it seems:
https://news.microsoft.com/on-the-i...eing-used-to-fight-online-child-exploitation/
https://www.engadget.com/2015-07-16-microsoft-photodna-cloud.html
https://www.theguardian.com/technol... is to introduce a,2bn tweets every five days.
https://www.verizon.com/about/our-company/company-policies/digital-safety
https://blogs.microsoft.com/on-the-...odna-technology-to-combat-child-exploitation/


People didn't seem to mind when Apple started doing the same in 2020; it's the new on-the-phone system that they seem to mind.
 

cybereality

[H]F Junkie
Joined
Mar 22, 2008
Messages
8,317
It's a matter of ownership. When you upload files to a cloud system, a private company owns the servers. This is much different from a scanning process happening locally on a piece of hardware you previously thought you owned.
 

LukeTbk

[H]ard|Gawd
Joined
Sep 10, 2020
Messages
1,509
It's a matter of ownership. When you upload files to a cloud system, a private company owns the servers. This is much different from a scanning process happening locally on a piece of hardware you previously thought you owned.
But (if I understand correctly) it happens after you have asked your phone to upload the file to the cloud system, where it would be a mere second later and where the scan would have occurred anyway (and I imagine it still will as the DB grows), making this mostly a semantic distinction and virtually the same in every effect, besides who pays for the processing and the electricity.
 

SmokeRngs

[H]ard|DCer of the Month - April 2008
Joined
Aug 9, 2001
Messages
17,420
It's a matter of ownership. When you upload files to a cloud system, a private company owns the servers. This is much different from a scanning process happening locally on a piece of hardware you previously thought you owned.
It is and it isn't different. If you're dumb enough to store any data on "the cloud" you should know that anything there can and will be scanned even though the company should never do it. It's there and they're going to do whatever they can with your data.

Your own device should be under your control unlike anything with the cloud. Nothing should be run on it without your permission and authorization. It's your data and no one should have access to it unless you give them access every single time they ask.

And I don't want to hear anything about people saying not to turn on the Apple cloud access. It doesn't matter. Some iOS update is going to turn it on "accidentally" and, with the way Apple is, it probably won't be long until it's forced on and you'll be required to use it in order to use the hardware. Ironically, this will only be possible because of Apple's walled garden and top-to-bottom ecosystem approach. Things may work easily, but this is simply one of the big downsides to jumping into and using an ecosystem like that. Apple has most of the control top to bottom, and it means Apple can basically do what they want. When you're the gatekeeper, it means a level of control which is not possible with something like Android.

The fact of the matter is this is a massive intrusion of privacy. It doesn't matter if others have done it. They're all bad and all of them need to be fought and refused.
 

LukeTbk

[H]ard|Gawd
Joined
Sep 10, 2020
Messages
1,509
The fact of the matter is this is a massive intrusion of privacy. It doesn't matter if others have done it. They're all bad and all of them need to be fought and refused.
I feel that part sounds almost like a misunderstanding of image hashing technology (it could be my understanding of it, obviously).

If Microsoft's and others' false positive rates have been as low as they say, is an algorithm that sends a complete gibberish of bits, a mathematical representation of the rate of change in pixel intensity across an image, to a server a massive intrusion of privacy? It tells Apple how many pictures you are putting there, but considering you were sending the pictures to iCloud, you were already obviously telling a server that in some way.

Some iOS update is going to turn it on "accidentally"
Which would have put the images on iCloud, which would have scanned them in the very same way; the only change here is saving cloud processing by running it on the phone itself, no?

Arguably, having your phone look at the image, create a mathematical representation of it, and send that to the server is more private than the server looking at the image to create the hash, as in the current situation (though I imagine they will continue to scan iCloud from time to time, e.g. old pictures there after the DB gets refreshed).
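For an idea of what that "rate of change in pixel intensity" gibberish looks like in its simplest possible form, here is a toy difference-hash sketch. It is nothing like Apple's NeuralHash (which is a neural-network descriptor), but it shows why the bits that leave the device are not the picture itself:

```python
# Toy "difference hash": one bit per adjacent-pixel brightness comparison, i.e. a
# crude encoding of how pixel intensity changes across the image. Only this bit
# pattern would leave the device, never the picture itself.
from PIL import Image  # pip install Pillow

def dhash(path: str, size: int = 8) -> int:
    # 9x8 grayscale thumbnail -> 8 left/right comparisons per row -> 64 bits
    img = Image.open(path).convert("L").resize((size + 1, size), Image.LANCZOS)
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left < right else 0)
    return bits

print(hex(dhash("some_photo.jpg")))  # placeholder path; prints 64 bits of "gibberish"
```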
 

SmokeRngs

[H]ard|DCer of the Month - April 2008
Joined
Aug 9, 2001
Messages
17,420
I feel that part sounds almost like a misunderstanding of image hashing technology (it could be my understanding of it, obviously).

If Microsoft's and others' false positive rates have been as low as they say, is an algorithm that sends a complete gibberish of bits, a mathematical representation of the rate of change in pixel intensity across an image, to a server a massive intrusion of privacy? It tells Apple how many pictures you are putting there, but considering you were sending the pictures to iCloud, you were already obviously telling a server that in some way.


Which would have put the images on iCloud, which would have scanned them in the very same way; the only change here is saving cloud processing by running it on the phone itself, no?

Arguably, having your phone look at the image, create a mathematical representation of it, and send that to the server is more private than the server looking at the image to create the hash, as in the current situation (though I imagine they will continue to scan iCloud from time to time, e.g. old pictures there after the DB gets refreshed).
It's not a matter of more or less privacy. It's a matter of having privacy or having none at all. There should be no scanning of any photos or other data on your phone. Catching pedos is just fine, as long as it doesn't intrude on my privacy and my rights. If they want to catch pedos, law enforcement needs to gather evidence and obtain a warrant to do so. That is enshrined in the Constitution of the United States and this forced addition by Apple violates that.

No one should be scanning my files on my devices other than me without my expressed permission.

Would you allow a virus or malware scanner to scan all your files and send the results of anything possibly illegal to law enforcement or a private company or some non-profit organization? What Apple is doing is the exact same thing but in smaller scope. I said it earlier and I'll say it again: this is nothing but an attempted end-run around the Constitution by doing an illegal search and seizure without a warrant. If it isn't fought and stopped now it will continue to grow larger with increased infringements of rights by expanding what it searched and what is looked for. It probably won't be long before all data is searched and all results are uploaded no matter what.

Any infringement on rights must be fought and defeated because once the first infringement is made it's only easier for them to create additional infringements.
 

DukenukemX

Supreme [H]ardness
Joined
Jan 30, 2005
Messages
5,810
If you can't trust the company, then you can't trust that the shipped binaries are actually built from the open-source code.
You shouldn't trust any company.
Unless you personally verify the source (which most people aren't qualified to do), and then compile it yourself to load on your phone, you can't be sure there is not a hidden backdoor.
The idea is that a lot of smart people will be looking at the code and will find things if there are things to be found. You know, like installing software to scan all your photos for pedo stuff.
And, even then, there could be compromised firmware or hardware backdoors on the device that would be difficult to find, outside of advanced security researchers. In any case, Apple is never going to open-source their OS.
This is why Paul Daniels exists. If there's a hardware backdoor then Apple broke it and we'll need to fix it. Also this doesn't excuse Apple not having open source code for their software.
 

d3athf1sh

Gawd
Joined
Dec 16, 2015
Messages
920

Attachments

  • apple shenanigans.mp4
    3.6 MB · Views: 0
Last edited:

Aurelius

2[H]4U
Joined
Mar 22, 2003
Messages
3,464
So you guys may be interested to find that iPhones apparently have already been taking random photos, as seen in infrared. Wonder if, when they detect something, they'll try to get a snap of the user?
It's not taking photos. That's the depth sensor periodically going off to provide attention-based features. To quote Apple's technical document on Face ID:

Even if you don’t enroll in Face ID, the TrueDepth camera intelligently activates to support attention aware features, like dimming the display if you aren't looking at your device or lowering the volume of alerts if you're looking at your device. For example, when using Safari, your device checks to determine if you're looking at your device and turns the screen off if you aren’t. If you don’t want to use these features, you can open Settings > Face ID & Passcode and disable Attention Aware Features.

This is what's frustrating when discussing Apple privacy issues, whether it's CSAM or Face ID... there's a lot of people who don't really understand the technology and are really using this to justify their own irrational preconceptions, whether it's extreme paranoia or dreams of a total Android monopoly.
 

Mchart

Supreme [H]ardness
Joined
Aug 7, 2004
Messages
4,679
It's not taking photos. That's the depth sensor periodically going off to provide attention-based features. To quote Apple's technical document on Face ID:

Even if you don’t enroll in Face ID, the TrueDepth camera intelligently activates to support attention aware features, like dimming the display if you aren't looking at your device or lowering the volume of alerts if you're looking at your device. For example, when using Safari, your device checks to determine if you're looking at your device and turns the screen off if you aren’t. If you don’t want to use these features, you can open Settings > Face ID & Passcode and disable Attention Aware Features.

This is what's frustrating when discussing Apple privacy issues, whether it's CSAM or Face ID... there's a lot of people who don't really understand the technology and are really using this to justify their own irrational preconceptions, whether it's extreme paranoia or dreams of a total Android monopoly.
Reminds me of the idiots complaining about Apple stealing your fingerprints when TouchID came out. I wonder if these are the same type of people that thought cameras were stealing their souls 130 years ago.
 

deruberhanyok

2[H]4U
Joined
Aug 22, 2004
Messages
2,105
I get the slippery slope argument, and one day it might be different, but not for now; it is pretty much the exact same thing (just Apple saving money), it seems to me.

“Apple saving money” is not a good reason to bypass probable cause. Which they need because our phones are private property.

if we own our phones, they need permission from us or a legal order to use them to scan the data on them.

if otoh Apple thinks we don’t actually own our phones, someone at Apple ought to make that really clear. Then the legal experts can sort it out.

d3athf1sh yeah, I’ve seen those but as Mchart and Aurelius have pointed out, it’s part of the auto Face ID capability. You can just turn off that part if you don’t like it, and Apple is pretty clear about how it works.

I get that everyone wants to jump on everything Apple now, but this will just further distract from the major issue at hand, which (i think) is whether or not we own our phones and, if so, what Apple can and can’t do to them without our permission.

i actually really like a lot of the other features coming in iOS 15 for both the capabilities they provide and that they’re all opt-out by default. And that they don’t have a nebulously defined “sends reports to Apple who reviews it and maybe passes it on to the authorities” capability.
 
Last edited:

Aurelius

2[H]4U
Joined
Mar 22, 2003
Messages
3,464
An important point on the subject... it'd be highly ironic if the "I'm switching to Android" (or, let's be honest, more likely the existing Anything But Apple camp) ended up compromising their privacy in the end by using a less secure cloud photo platform. That and it sounds like this might not be as much of a change as some people think it is.

Screen Shot 2021-08-10 at 9.51.03 AM.png
 

Mchart

Supreme [H]ardness
Joined
Aug 7, 2004
Messages
4,679
An important point on the subject... it'd be highly ironic if the "I'm switching to Android" (or, let's be honest, more likely the existing Anything But Apple camp) ended up compromising their privacy in the end by using a less secure cloud photo platform. That and it sounds like this might not be as much of a change as some people think it is.

View attachment 383404
Yeah, that’s what I’ve thought is why they are doing this. By moving it to the host device, they can implement full end-to-end security, since they couldn’t scan it in the cloud anymore if they did that.
 

DukenukemX

Supreme [H]ardness
Joined
Jan 30, 2005
Messages
5,810
Yeah, that’s what I’ve thought is why they are doing this. By moving it to the host device, they can implement full end-to-end security, since they couldn’t scan it in the cloud anymore if they did that.
There are probably two reasons Apple is doing this on the host device instead of in the cloud. Firstly, probably a lot of people aren't using the cloud, or at least not as many as Apple would like. Secondly, it saves resources. They're using your hardware, your electricity, and your data to do their work. They know you can't or won't stop them because you're too ignorant of how these devices work to begin with.
“Apple saving money” is not a good reason to bypass probable cause. Which they need because our phones are private property.
That's debatable because for some time now we've been told that a lot of the stuff in our stuff is not owned by us. You can buy a car now and have physical features you can't use because you gotta pay extra to unlock it. None of this has been tested in court because most of the time any lawsuit is settled outside of court.
if we own our phones, they need permission from us or a legal order to use them to scan the data on them.
Good luck with that without access to the source code. It'll eventually become like the elevator buttons where you press it thinking it does something but there's no wires going to the button. It's there to make you feel better about your actions.
if otoh Apple thinks we don’t actually own our phones, someone at Apple ought to make that really clear. Then the legal experts can sort it out.
That'll never happen and Apple as well as everyone else will try to stop you. Most software is in a legal grey area where you license it but own the product that the software is running on. But the product won't work without the software so what's the point? Again, nothing legally tested and all you have to go by is an end user agreement.
I get that everyone wants to jump on everything Apple now, but this will just further distract from the major issue at hand, which (i think) is whether or not we own our phones and, if so, what Apple can and can’t do to them without our permission.
What we should focus on is having Apple and everyone else release their source code for their OS. That way we can look through the code and compile it and then install it onto our phones if we choose to do so. You aren't going to force Apple to change how they operate if you can't see how they operate. There's a reason why so many servers run Linux which releases all their source code.
 

Mchart

Supreme [H]ardness
Joined
Aug 7, 2004
Messages
4,679
There are probably two reasons Apple is doing this on the host device instead of in the cloud. Firstly, probably a lot of people aren't using the cloud, or at least not as many as Apple would like. Secondly, it saves resources. They're using your hardware, your electricity, and your data to do their work. They know you can't or won't stop them because you're too ignorant of how these devices work to begin with.

That's debatable because for some time now we've been told that a lot of the stuff in our stuff is not owned by us. You can buy a car now and have physical features you can't use because you gotta pay extra to unlock it. None of this has been tested in court because most of the time any lawsuit is settled outside of court.

Good luck with that without access to the source code. It'll eventually become like the elevator buttons where you press it thinking it does something but there's no wires going to the button. It's there to make you feel better about your actions.

That'll never happen and Apple as well as everyone else will try to stop you. Most software is in a legal grey area where you license it but own the product that the software is running on. But the product won't work without the software so what's the point? Again, nothing legally tested and all you have to go by is an end user agreement.

What we should focus on is having Apple and everyone else release their source code for their OS. That way we can look through the code and compile it and then install it onto our phones if we choose to do so. You aren't going to force Apple to change how they operate if you can't see how they operate. There's a reason why so many servers run Linux which releases all their source code.
Servers run on Linux because Linux is way easier to just do shit on when it comes to management, scripting, and reliability. Open source is just a perk, but no one is reviewing that code. You think people review all the code when they run yum update? They put a list of trusted repos and call it a day. However, trusted repos get compromised constantly, which is generally how Red Hat had a business model…
 

LukeTbk

[H]ard|Gawd
Joined
Sep 10, 2020
Messages
1,509
Yeah, that’s what I’ve thought is why they are doing this. By moving it to the host device, they can implement full end-to-end security, since they couldn’t scan it in the cloud anymore if they did that.
That would make it impossible to scan older photos as the database of illegal photos grows, too.


No one should be scanning my files on my devices other than me without my expressed permission.

By asking your phone to send an image to iCloud, are you not giving that permission?
 

Mchart

Supreme [H]ardness
Joined
Aug 7, 2004
Messages
4,679
That would make it impossible to scan older photos as the database of illegal photos grows, too.
Well that’s what they’ve said. The ticket is generated at the time the photo is first taken/put into the photo library on the phone.
 

LukeTbk

[H]ard|Gawd
Joined
Sep 10, 2020
Messages
1,509
Well that’s what they’ve said. The ticket is generated at the time the photo is first taken/put into the photo library on the phone.
Yes, but did they say that they will never scan it again over the years on iCloud as well?

Did they say it happens when the photo is taken? That would be extremely misleading, because it is always phrased as happening when sending to the cloud, with your local photos never being scanned.
 

Mchart

Supreme [H]ardness
Joined
Aug 7, 2004
Messages
4,679
Yes, but did they say that they will never scan it again over the years on iCloud as well?

Did they say it happens when the photo is taken? That would be extremely misleading, because it is always phrased as happening when sending to the cloud, with your local photos never being scanned.
Read their tech sheet. They generate a ticket, the ticket gets uploaded along with the photo. The ticket is what they look at.
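Very roughly, the flow the tech sheet describes looks like the sketch below. This is a heavily simplified illustration, not Apple's protocol: per Apple's technical summary, the real design hides the match result inside an encrypted "safety voucher" using a blinded hash database, private set intersection, and threshold secret sharing, so neither the phone nor Apple can read individual results below the threshold, and the threshold constant here is a made-up placeholder:

```python
# Heavily simplified sketch of "a ticket is generated on-device and uploaded with
# the photo; the server only looks at tickets". The real system wraps all of this
# in cryptography (blinded DB, private set intersection, threshold secret sharing).
from dataclasses import dataclass

MATCH_THRESHOLD = 30  # placeholder value, not a published Apple number

@dataclass
class Ticket:
    photo_id: str
    neural_hash: int   # perceptual hash computed on the device at upload time
    matched: bool      # whether it hit the on-device copy of the known-CSAM database

def device_make_ticket(photo_id: str, neural_hash: int, known_hashes: set[int]) -> Ticket:
    """Runs on the phone when a photo is queued for iCloud: hash it, check the local
    database, attach the result to the upload. (In the real design the device cannot
    even tell whether it matched; the result is sealed inside the voucher.)"""
    return Ticket(photo_id, neural_hash, neural_hash in known_hashes)

def server_review(tickets: list[Ticket]) -> str:
    """Runs server-side: nothing is surfaced until an account crosses the threshold,
    and only then does human review of the matched images happen."""
    matches = [t for t in tickets if t.matched]
    if len(matches) < MATCH_THRESHOLD:
        return "below threshold: nothing actionable visible to Apple"
    return "threshold crossed: human review, then possible NCMEC report"
```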
 

LukeTbk

[H]ard|Gawd
Joined
Sep 10, 2020
Messages
1,509
Read their tech sheet. They generate a ticket, the ticket gets uploaded along with the photo. The ticket is what they look at.
OK, yes, they will re-check the ticket, but how would the human validation step be possible if one day it ticks as positive after a DB update? I guess they could simply silently delete them.
 

SmokeRngs

[H]ard|DCer of the Month - April 2008
Joined
Aug 9, 2001
Messages
17,420
By asking your phone to send an image to iCloud, are you not giving that permission?
No. It doesn't ask you each and every time if you want to have the image scanned and a report sent to law enforcement or some organization in order to determine if you've done something illegal. It also doesn't inform you that you are waiving your rights against unlawful search and seizure and that by doing so you may be incriminating yourself. It would also need to do that for every single photo/file that is sent.
 

Mchart

Supreme [H]ardness
Joined
Aug 7, 2004
Messages
4,679
No. It doesn't ask you each and every time if you want to have the image scanned and a report sent to law enforcement or some organization in order to determine if you've done something illegal. It also doesn't inform you that you are waiving your rights against unlawful search and seizure and that by doing so you may be incriminating yourself. It would also need to do that for every single photo/file that is sent.
You agree to the EULA when you activate iCloud on the phone, not to mention the EULA when you use the phone itself for the first time. Further, them sending this data to a private org is not unlawful search & seizure. You can make that claim all you want, but you are factually wrong.
 

SmokeRngs

[H]ard|DCer of the Month - April 2008
Joined
Aug 9, 2001
Messages
17,420
You agree to the EULA when you activate iCloud on the phone, not to mention the EULA when you use the phone itself for the first time. Further, them sending this data to a private org is not unlawful search & seizure. You can make that claim all you want, but you are factually wrong.
And you just proved why this is an end run around the Constitution. And I wouldn't be too sure about it not being illegal. The only use of this (at this time) is to proactively find something possibly illegal in order to pass it off to law enforcement. There is literally no other use for it. This means someone is being used as a state actor and therefore the Constitution and those pesky Amendments are required to be followed. If not, any and all evidence is useless and anything obtained by using that illegal evidence will also be useless.
 

LukeTbk

[H]ard|Gawd
Joined
Sep 10, 2020
Messages
1,509
No. It doesn't ask you each and every time if you want to have the image scanned and a report sent to law enforcement or some organization in order to determine if you've done something illegal. It also doesn't inform you that you are waiving your rights against unlawful search and seizure and that by doing so you may be incriminating yourself. It would also need to do that for every single photo/file that is sent.
With a little "do not ask again" option. But for you, who knows about it (or the few reading the agreement), that nuance becomes really small, it seems to me, especially since it is purely where some part of the hashing process occurs that seems to bother you; why wouldn't everything you say already be true? Purely because your own device is being used in some way in the process? That could be an important nuance for the future, but for the current implementation it seems to change nothing in practice. The pictures were already scanned on arrival; the thing that changes is moving that a bit.

But no report is sent to anyone at that point; at least that's not what seems to be said:
The tool designed to detect known images of child sexual abuse, called “NeuralHash,” will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children notified.

Apple does a human review before filing a report to NCMEC:
https://www.apple.com/child-safety/...s_for_Children_Frequently_Asked_Questions.pdf
 

LukeTbk

[H]ard|Gawd
Joined
Sep 10, 2020
Messages
1,509
And you just proved why this is an end run around the Constitution. And I wouldn't be too sure about it not being illegal. The only use of this (at this time) is to proactively find something possibly illegal in order to pass it off to law enforcement. There is literally no other use for it. This means someone is being used as a state actor and therefore the Constitution and those pesky Amendments are required to be followed. If not, any and all evidence is useless and anything obtained by using that illegal evidence will also be useless.
It has been court tested:

https://casetext.com/case/united-states-v-bohannon-15
Bohannan moves the Court to suppress the fruits of any searches conducted by Microsoft and NCMEC, which includes all evidence of child pornography images obtained in later searches. See Mot. to Suppress at 1, 10-12, 14. Bohannon also moves to suppress the fruits of Servat's visual inspection of the image, and the SFPD's execution of the search warrant.

The Court denies Bohannon's motion. First, the Fourth Amendment does not apply to Microsoft's search, as Microsoft was not acting as a government entity or agent. Even if the Fourth Amendment applied, Bohannon consented to Microsoft's search by agreeing to Microsoft's terms of service.
Second, NCMEC did not conduct any search and thus could not have violated Bohannon's Fourth Amendment rights. Third, when Sergeant Servat confirmed that the image was child pornography, this did not result in any further intrusion on Bohannon's privacy. Finally, the SFPD's warrant was supported by probable cause, and were it not, the good faith exception to the exclusionary rule would apply because law enforcement reasonably relied on the warrant.

Bohannon argues that Microsoft's initial identification of child pornography in the OneDrive account resulted from an unlawful warrantless search. But the Fourth Amendment does not apply to Microsoft's private actions and, in any event, Bohannon consented to Microsoft searching his account.


The judge could have been wrong, obviously, but that seems to be a common legal opinion:

https://ideaexchange.uakron.edu/cgi/viewcontent.cgi?article=2493&context=akronlawreview
Courts must exercise caution when admitting evidence into the record obtained through hash value algorithms. It is not, however, per se unconstitutional for private internet service providers to use hash-based examination

Paying close attention to the circumstances surrounding the search, courts must determine the answer to the following questions: (1) Was the individual that conducted the initial search in fact acting as a private individual and not a state actor? (2) Was the defendant’s reasonable expectation of privacy thwarted by the third-party doctrine? (3) Was there probable cause for the warrantless search? (4) Was the state actor acting within the scope of the private party search? If the answer to any of the preceding questions is no, the defendant should be entitled to suppress the evidence gained during the unconstitutional search. When courts choose to implement this standard analysis, United States citizens receive proper protection both from unreasonable search and seizure by the government, as well as protection from the crime of child exploitation.
 
Last edited: