Apple will remotely install software to scan all US phones for child sex abuse images

This is terrifying... not that I have anything to hide, jesus... but it's just going to spiral from here to the next giant privacy violation. We'll probably be using China's facial-recognition social punishment system in 10 years too, or some form thereof.
 
This is terrifying... not that I have anything to hide, jesus... but it's just going to spiral from here to the next giant privacy violation. We'll probably be using China's facial-recognition social punishment system in 10 years too, or some form thereof.
Y'all realize that every other cloud company has been doing this for years now, correct? I'd prefer Apple not do it either, but it's hilarious how people only care about this stuff and have a hissy fit when Apple announces it. Keep in mind, none of the others, like Adobe and Google, ever announced it. It was just shadow-introduced.
 
Y'all realize that every other cloud company has been doing this for years now, correct? I'd prefer Apple not do it either, but it's hilarious how people only care about this stuff and have a hissy fit when Apple announces it. Keep in mind, none of the others, like Adobe and Google, ever announced it. It was just shadow-introduced.
I guess that makes sense... it just hit home right then, reading about it. And I'm not sure I had a full-on hissy... lol.
 
Apple's claim is a one-in-a-trillion error rate:
The trained network’s output is an N-dimensional, floating-point descriptor

That sounds like something of relatively small size (1 billion users * 1,000 pictures each) could plausibly give every image a single unique hash. Compare the simple Microsoft GUID:
the number of random version-4 UUIDs which need to be generated in order to have a 50% probability of at least one collision is 2.71 quintillion
that number is equivalent to generating 1 billion UUIDs per second for about 85 years

the probability to find a duplicate within 103 trillion version-4 UUIDs is one in a billion.
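
A quick back-of-the-envelope check of those numbers with the standard birthday-bound approximation (a version-4 UUID carries 122 random bits; this is my own sanity check, not Apple's math):

```python
import math

def collision_probability(n: int, bits: int) -> float:
    """Birthday bound: P(collision) ~ 1 - exp(-n^2 / 2^(bits+1))."""
    return 1.0 - math.exp(-(n * n) / 2.0 ** (bits + 1))

# 2.71 quintillion random v4 UUIDs -> ~50% chance of at least one collision
print(collision_probability(2_710_000_000_000_000_000, 122))  # ~0.50

# 103 trillion -> ~one in a billion
print(collision_probability(103_000_000_000_000, 122))        # ~1e-9

# 1 billion users x 1,000 photos = 1e12 hashes
print(collision_probability(10**12, 122))                     # ~9.4e-14
```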


PhotoDNA hashes tend to be between 128 and 4096 bits.

Google has been using this since 2008, Facebook since 2011, and Microsoft has been running something similar for a while, so how often false positives occur must be somewhat known. OneDrive has been using a PhotoDNA hash database since 2015.

I would imagine the issues are more in the "is it close enough" hashing than in actual collisions, and it is good to have a human verification process at the end of the chain.
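
To illustrate the "close enough" part: perceptual matching is typically a Hamming-distance comparison against a tuning threshold rather than an exact-equality check. The hash values and threshold here are made up:

```python
def hamming_distance(a: bytes, b: bytes) -> int:
    """Count of differing bits between two equal-length hashes."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

# Hypothetical 128-bit perceptual hashes of an original and a re-encoded copy
original  = bytes.fromhex("a3f1c2d4e5b69788a3f1c2d4e5b69788")
reencoded = bytes.fromhex("a3f1c2d4e5b69788a3f1c2d4e5b6978b")

THRESHOLD = 10  # "close enough" cutoff, invented for this example
if hamming_distance(original, reencoded) <= THRESHOLD:
    print("match -> escalate to human review")
```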

Disclosure: I know almost nothing about the subject.

You don't need to be an expert on the subject to know that a one-in-a-trillion error rate is a cow-crap number pulled from someone's behind.
 
This is just a stopgap measure until they set up the brain scanning devices outside of public facilities and stores.
 
Y'all realize that every other cloud company has been doing this for years now, correct? I'd prefer Apple not do it either, but it's hilarious how people only care about this stuff and have a hissy fit when Apple announces it. Keep in mind, none of the others, like Adobe and Google, ever announced it. It was just shadow-introduced.
Everyone else puts the pedo hash db on your device and scans your local pictures before they've been uploaded to the cloud?

Where are the citations, please?
 
"... are an excuse to weaken encryption and other security measures that protect millions of users and our national security,” Apple said ...

(hint: not recently)
 
Everyone else puts the pedo hash db on your device and scans your local pictures before they've been uploaded to the cloud?
I imagine most use Google's service or the Azure PhotoDNA service that Microsoft offers for free (thus not the local device), but almost all of them do scan them now.

Apple's way probably saves Apple money (if they do not want to use Microsoft's service)? Or legally it is better if the picture is tagged before being hosted by them. Not sure why they do it like that, unlike the others; they had been scanning pictures once they were on iCloud since 2020. Does doing the AI work on the phone instead of in the cloud make a big difference?
 
Does doing the AI work on the phone instead of in the cloud make a big difference?

absolutely it does.

before, the data on your phone was, in theory, secure, as long as you weren’t just sending it up to cloud services.

but now the phone will scan your data and send off a report somewhere outside of your control. They say it will only happen if the photos are uploaded to iCloud; that's the trigger for this whole process.

But with this, the capability exists for it to scan whatever, whenever, and report on it, without you knowing about it, and all we have to go on are Apple's statements that "the system won't be abused" and "won't be expanded." There's no way to lock the door, and we just have to trust Apple with the keys.

that is a massive change from “if you want to keep your data private just don’t put it in the cloud.”
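
For what it's worth, here is that worry in code form, a deliberately simplified sketch (the hash function and database are stand-ins, not Apple's actual implementation): the iCloud trigger is just a policy check sitting in front of a general scanning capability.

```python
from typing import Optional

def neural_hash(photo: bytes) -> int:
    """Stand-in for the on-device perceptual hash (illustrative only)."""
    return hash(photo) & (2**96 - 1)

def on_device_check(photo: bytes, hash_db: set[int],
                    uploading_to_icloud: bool) -> Optional[int]:
    # Stated policy: the match only runs when an iCloud upload begins.
    # The concern above: this `if` is policy, not a technical limitation,
    # and nothing user-visible changes if it is later removed.
    if not uploading_to_icloud:
        return None
    h = neural_hash(photo)
    return h if h in hash_db else None
```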
 
absolutely it does.
I get the slippery slope, and one day it could be different, but not for now; it is pretty much the exact same (just Apple saving money), it seems to me.
Y'all realize that every other cloud company has been doing this for years now, correct? I'd prefer Apple not do it either, but it's hilarious how people only care about this stuff and have a hissy fit when Apple announces it. Keep in mind, none of the others, like Adobe and Google, ever announced it. It was just shadow-introduced.
That sounds false:
https://www.theverge.com/2018/9/3/1...ial-moderation-tool-internet-watch-foundation
https://www.blog.google/around-the-...nd-report-child-sexual-abuse-material-online/

Microsoft made that very public over time, it seems:
https://news.microsoft.com/on-the-i...eing-used-to-fight-online-child-exploitation/
https://www.engadget.com/2015-07-16-microsoft-photodna-cloud.html
https://www.theguardian.com/technol... is to introduce a,2bn tweets every five days.
https://www.verizon.com/about/our-company/company-policies/digital-safety
https://blogs.microsoft.com/on-the-...odna-technology-to-combat-child-exploitation/


People didn't seem to mind when Apple started to do the same in 2020; it is the new on-the-phone-itself system that they seem to mind.
 
It's a matter of ownership. When you upload files to a cloud system, a private company owns the servers. This is much different from a scanning process happening locally on a piece of hardware you previously thought you owned.
 
It's a matter of ownership. When you upload files to a cloud system, a private company owns the servers. This is much different from a scanning process happening locally on a piece of hardware you previously thought you owned.
But (if I understand) it happens after you have asked your phone to upload the file to the cloud system, where it would arrive within a mere second and where the scan would have occurred anyway (and I imagine still will, as the DB grows). That makes the distinction quite semantic, virtually the same in every effect besides who pays for the processing and the electricity.
 
It's a matter of ownership. When you upload files to a cloud system, a private company owns the servers. This is much different from a scanning process happening locally on a piece of hardware you previously thought you owned.
It is and it isn't different. If you're dumb enough to store any data on "the cloud," you should know that anything there can and will be scanned, even though the company should never do it. It's there, and they're going to do whatever they can with your data.

Your own device should be under your control unlike anything with the cloud. Nothing should be run on it without your permission and authorization. It's your data and no one should have access to it unless you give them access every single time they ask.

And I don't want to hear anything about people saying not to turn on the Apple cloud access. It doesn't matter. Some iOS update is going to turn it on "accidentally," and with the way Apple is, it probably won't be long until it's forced on and you'll be required to use it in order to use the hardware. Ironically, this will only be possible because of Apple's walled garden and top-to-bottom ecosystem approach. Things may work easily, but this is simply one of the big downsides to jumping into and using an ecosystem like that. Apple has most of the control top to bottom, and it means Apple can basically do what it wants. When you're the gatekeeper, it means a level of control which is not possible with something like Android.

The fact of the matter is this is a massive intrusion of privacy. It doesn't matter if others have done it. They're all bad and all of them need to be fought and refused.
 
The fact of the matter is this is a massive intrusion of privacy. It doesn't matter if others have done it. They're all bad and all of them need to be fought and refused.
I feel that part sounds almost like a misunderstanding of image-hashing technology (though it could be my understanding of it, obviously).

If Microsoft's and others' false-positive rates have been as low as they say, is an algorithm that sends a complete gibberish of bits (a mathematical representation of the rate of change in pixel intensity across an image) to a server a massive intrusion of privacy? It does tell Apple how many pictures you are putting there, but considering you were sending the pictures to iCloud, you were already obviously telling a server that in some way.

Some iOS update is going to turn it on "accidentally"
Which would have put the image on iCloud, which would have scanned it in the very same way; the only change here is saving cloud processing by running it on the phone itself, no?

Arguably, having your phone look at the image, create a mathematical expression of it, and send that to the server is more private than the server looking at the image to create the hash, like the current situation (though I imagine they will continue to scan iCloud from time to time, e.g. old pictures there after the DB gets refreshed).
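
That "mathematical expression" is in the family of perceptual hashes. NeuralHash itself is a neural-network descriptor, but a classic difference hash (dHash) shows the flavor of "rate of change in pixel intensity"; the toy image below is made up:

```python
def dhash(pixels: list[list[int]]) -> int:
    """Difference hash: one bit per adjacent-pixel intensity comparison.

    `pixels` is a tiny grayscale image (here 9 wide x 8 tall), assumed
    already downscaled from the original photo.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left < right else 0)
    return bits

# A made-up 9x8 grayscale thumbnail
tiny = [[(x * y + x) % 256 for x in range(9)] for y in range(8)]
print(f"{dhash(tiny):016x}")  # 64 bits of "gibberish"; no pixels leave the device
```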
 
I feel that part sounds almost like a misunderstanding of image-hashing technology (though it could be my understanding of it, obviously). [...] Is an algorithm that sends a complete gibberish of bits (a mathematical representation of the rate of change in pixel intensity across an image) to a server a massive intrusion of privacy?

[...] Arguably, having your phone look at the image, create a mathematical expression of it, and send that to the server is more private than the server looking at the image to create the hash.
It's not a matter of more or less privacy. It's a matter of having privacy or having none at all. There should be no scanning of any photos or other data on your phone. Catching pedos is just fine, as long as it doesn't intrude on my privacy and my rights. If they want to catch pedos, law enforcement needs to gather evidence and obtain a warrant to do so. That is enshrined in the Constitution of the United States and this forced addition by Apple violates that.

No one should be scanning my files on my devices other than me without my expressed permission.

Would you allow a virus or malware scanner to scan all your files and send the results of anything possibly illegal to law enforcement or a private company or some non-profit organization? What Apple is doing is the exact same thing but in smaller scope. I said it earlier and I'll say it again: this is nothing but an attempted end-run around the Constitution by doing an illegal search and seizure without a warrant. If it isn't fought and stopped now, it will continue to grow larger, with increased infringements of rights by expanding what is searched and what is looked for. It probably won't be long before all data is searched and all results are uploaded no matter what.

Any infringement on rights must be fought and defeated because once the first infringement is made it's only easier for them to create additional infringements.
 
If you can't trust the company, then you can't trust that the shipped binaries are actually built from the open-source code.
You shouldn't trust any company.
Unless you personally verify the source (which most people aren't qualified to do), and then compile it yourself to load on your phone, you can't be sure there is not a hidden backdoor.
The idea is that a lot of smart people will be looking at the code and will find things if things are to be found. You know, like installing software to scan all your photos for pedo stuff.
And, even then, there could be compromised firmware or hardware backdoors on the device that would be difficult to find, outside of advanced security researchers. In any case, Apple is never going to open-source their OS.
This is why Paul Daniels exists. If there's a hardware backdoor then Apple broke it and we'll need to fix it. Also this doesn't excuse Apple not having open source code for their software.
 

[Attachment: apple shenanigans.mp4]
so you guys may be interested to find that iPhones apparently have been taking random photos already, as seen in infrared. Wonder if, when they detect something, they'll try to get a snap of the user?

It's not taking photos. That's the depth sensor periodically going off to provide attention-based features. To quote Apple's technical document on Face ID:

Even if you don’t enroll in Face ID, the TrueDepth camera intelligently activates to support attention aware features, like dimming the display if you aren't looking at your device or lowering the volume of alerts if you're looking at your device. For example, when using Safari, your device checks to determine if you're looking at your device and turns the screen off if you aren’t. If you don’t want to use these features, you can open Settings > Face ID & Passcode and disable Attention Aware Features.

This is what's frustrating when discussing Apple privacy issues, whether it's CSAM or Face ID... there's a lot of people who don't really understand the technology and are really using this to justify their own irrational preconceptions, whether it's extreme paranoia or dreams of a total Android monopoly.
 
It's not taking photos. That's the depth sensor periodically going off to provide attention-based features. [...] there's a lot of people who don't really understand the technology and are really using this to justify their own irrational preconceptions, whether it's extreme paranoia or dreams of a total Android monopoly.
Reminds me of the idiots complaining about Apple stealing your fingerprints when TouchID came out. I wonder if these are the same type of people that thought cameras were stealing their souls 130 years ago.
 
I get the slippery slope, and one day it could be different, but not for now; it is pretty much the exact same (just Apple saving money), it seems to me.

“Apple saving money” is not a good reason to bypass probable cause. Which they need because our phones are private property.

if we own our phones, they need permission from us or a legal order to use them to scan the data on them.

if otoh Apple thinks we don’t actually own our phones, someone at Apple ought to make that really clear. Then the legal experts can sort it out.

d3athf1sh yeah, I’ve seen those but as Mchart and Aurelius have pointed out, it’s part of the auto Face ID capability. You can just turn off that part if you don’t like it, and Apple is pretty clear about how it works.

I get that everyone wants to jump on everything Apple now, but this will just further distract from the major issue at hand, which (i think) is whether or not we own our phones and, if so, what Apple can and can’t do to them without our permission.

i actually really like a lot of the other features coming in iOS 15 for both the capabilities they provide and that they’re all opt-out by default. And that they don’t have a nebulously defined “sends reports to Apple who reviews it and maybe passes it on to the authorities” capability.
 
An important point on the subject... it'd be highly ironic if the "I'm switching to Android" (or, let's be honest, more likely the existing Anything But Apple camp) ended up compromising their privacy in the end by using a less secure cloud photo platform. That and it sounds like this might not be as much of a change as some people think it is.

 
An important point on the subject... it'd be highly ironic if the "I'm switching to Android" (or, let's be honest, more likely the existing Anything But Apple camp) ended up compromising their privacy in the end by using a less secure cloud photo platform. That and it sounds like this might not be as much of a change as some people think it is.

Yeah, that's what I've thought about why they are doing this. By moving it to the host device they can implement full end-to-end security, since they wouldn't be able to scan it in the cloud anymore if they did that.
 
Yeah, that's what I've thought about why they are doing this. By moving it to the host device they can implement full end-to-end security, since they wouldn't be able to scan it in the cloud anymore if they did that.
There are probably two reasons Apple is doing this on the host device instead of in the cloud. Firstly, probably a lot of people aren't using the cloud, or at least not as many as Apple would like. Secondly, it saves resources. They're using your hardware, your electricity, and your data to do their work. They know you can't or won't stop them, because you're too ignorant of how these devices work to begin with.
“Apple saving money” is not a good reason to bypass probable cause. Which they need because our phones are private property.
That's debatable, because for some time now we've been told that a lot of the stuff in the things we buy is not owned by us. You can buy a car now that has physical features you can't use because you gotta pay extra to unlock them. None of this has been tested in court, because most of the time any lawsuit is settled outside of court.
if we own our phones, they need permission from us or a legal order to use them to scan the data on them.
Good luck with that without access to the source code. It'll eventually become like the elevator buttons where you press one thinking it does something, but there are no wires going to the button. It's there to make you feel better about your actions.
if otoh Apple thinks we don’t actually own our phones, someone at Apple ought to make that really clear. Then the legal experts can sort it out.
That'll never happen and Apple as well as everyone else will try to stop you. Most software is in a legal grey area where you license it but own the product that the software is running on. But the product won't work without the software so what's the point? Again, nothing legally tested and all you have to go by is an end user agreement.
I get that everyone wants to jump on everything Apple now, but this will just further distract from the major issue at hand, which (i think) is whether or not we own our phones and, if so, what Apple can and can’t do to them without our permission.
What we should focus on is having Apple and everyone else release their source code for their OS. That way we can look through the code, compile it, and then install it onto our phones if we choose to do so. You aren't going to force Apple to change how they operate if you can't see how they operate. There's a reason why so many servers run Linux, which releases all its source code.
 
[...] What we should focus on is having Apple and everyone else release their source code for their OS. That way we can look through the code, compile it, and then install it onto our phones if we choose to do so. You aren't going to force Apple to change how they operate if you can't see how they operate. There's a reason why so many servers run Linux, which releases all its source code.
Servers run on Linux because Linux is way easier to just do shit on when it comes to management, scripting, and reliability. Open source is just a perk, but no one is reviewing that code. You think people review all the code when they run yum update? They put a list of trusted repos and call it a day. However, trusted repos get compromised constantly, which is generally how Red Hat had a business model…
 
Yeah, that's what I've thought about why they are doing this. By moving it to the host device they can implement full end-to-end security, since they wouldn't be able to scan it in the cloud anymore if they did that.
That would make it impossible to scan older photos as the database of illegal photos grows, too.


No one should be scanning my files on my devices other than me without my expressed permission.

By asking your phone to send an image to iCloud, are you not giving that permission?
 
That would make it impossible to scan older photos as the database of illegal photos grows, too.
Well that’s what they’ve said. The ticket is generated at the time the photo is first taken/put into the photo library on the phone.
 
Well that’s what they’ve said. The ticket is generated at the time the photo is first taken/put into the photo library on the phone.
Yes, but did they say that they will never scan it again over the years on iCloud as well?

Did they say when the photo is taken? That would be extremely misleading, because it is always phrased as happening when sending it to the cloud, implying that your local photos would never be scanned.
 
Yes, but did they say that they will never scan it again over the years on iCloud as well?

Did they say when the photo is taken? That would be extremely misleading, because it is always phrased as happening when sending it to the cloud, implying that your local photos would never be scanned.
Read their tech sheet. They generate a ticket, the ticket gets uploaded along with the photo. The ticket is what they look at.
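
For anyone who hasn't read it, the "ticket" is what the tech sheet calls a safety voucher. Roughly this shape, as a bare-bones sketch (field names are mine; the real design layers private set intersection and threshold secret sharing on top):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SafetyVoucher:
    """Toy model of the per-photo 'ticket' uploaded alongside the image."""
    blinded_hash: bytes       # derived on-device from the photo's NeuralHash
    encrypted_payload: bytes  # match metadata and visual derivative; the server
                              # can only open these once an account exceeds
                              # the match threshold

# Hypothetical voucher for one upload
voucher = SafetyVoucher(blinded_hash=b"\x8a" * 32, encrypted_payload=b"\x00" * 64)
```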
 
Read their tech sheet. They generate a ticket, the ticket gets uploaded along with the photo. The ticket is what they look at.
OK, yes, they will look at the ticket again, but how would the human-validation step be possible if one day it registers as positive after a DB update? I guess they could simply silently delete them.
 
By asking your phone to send an image to iCloud, are you not giving that permission?
No. It doesn't ask you each and every time if you want to have the image scanned and a report sent to law enforcement or some organization in order to determine if you've done something illegal. It also doesn't inform you that you are waiving your rights against unlawful search and seizure and that by doing so you may be incriminating yourself. It would also need to do that for every single photo/file that is sent.
 
No. It doesn't ask you each and every time if you want to have the image scanned and a report sent to law enforcement or some organization in order to determine if you've done something illegal. It also doesn't inform you that you are waiving your rights against unlawful search and seizure and that by doing so you may be incriminating yourself. It would also need to do that for every single photo/file that is sent.
You agree to the EULA when you activate iCloud on the phone, not to mention the EULA when you use the phone itself for the first time. Further, them sending this data to a private org is not unlawful search & seizure. You can make that claim all you want, but you are factually wrong.
 
You agree to the EULA when you activate iCloud on the phone, not to mention the EULA when you use the phone itself for the first time. Further, them sending this data to a private org is not unlawful search & seizure. You can make that claim all you want, but you are factually wrong.
And you just proved why this is an end run around the Constitution. And I wouldn't be too sure about it not being illegal. The only use of this (at this time) is to proactively find something possibly illegal in order to pass it off to law enforcement. There is literally no other use for it. This means someone is being used as a state actor and therefore the Constitution and those pesky Amendments are required to be followed. If not, any and all evidence is useless and anything obtained by using that illegal evidence will also be useless.
 
No. It doesn't ask you each and every time if you want to have the image scanned and a report sent to law enforcement or some organization in order to determine if you've done something illegal. It also doesn't inform you that you are waiving your rights against unlawful search and seizure and that by doing so you may be incriminating yourself. It would also need to do that for every single photo/file that is sent.
With a little "do not ask again" option. But for you, who knows about it (or the few reading the agreement), that nuance becomes really small, it seems to me, especially since it is purely where some part of the hashing process occurs that seems to bother you. Why wouldn't everything you say already be true today? Your own device being used in some way in the process could be an important nuance for the future, but for the current implementation it seems to change nothing in practice. The pictures were already scanned on arrival; the thing that changes is moving that a bit.

But no report is sent to anyone at that point, at least that's not what seems to be said:
The tool designed to detect known images of child sexual abuse, called "NeuralHash," will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user's account will be disabled and the National Center for Missing and Exploited Children notified.

Apple does a human review before filing a report to NCMEC:
https://www.apple.com/child-safety/...s_for_Children_Frequently_Asked_Questions.pdf
 
And you just proved why this is an end run around the Constitution. And I wouldn't be too sure about it not being illegal. The only use of this (at this time) is to proactively find something possibly illegal in order to pass it off to law enforcement. There is literally no other use for it. This means someone is being used as a state actor and therefore the Constitution and those pesky Amendments are required to be followed. If not, any and all evidence is useless and anything obtained by using that illegal evidence will also be useless.
It has been court tested:

https://casetext.com/case/united-states-v-bohannon-15
Bohannan moves the Court to suppress the fruits of any searches conducted by Microsoft and NCMEC, which includes all evidence of child pornography images obtained in later searches. See Mot. to Suppress at 1, 10-12, 14. Bohannon also moves to suppress the fruits of Servat's visual inspection of the image, and the SFPD's execution of the search warrant.

The Court denies Bohannon's motion. First, the Fourth Amendment does not apply to Microsoft's search, as Microsoft was not acting as a government entity or agent. Even if the Fourth Amendment applied, Bohannon consented to Microsoft's search by agreeing to Microsoft's terms of service.
Second, NCMEC did not conduct any search and thus could not have violated Bohannon's Fourth Amendment rights. Third, when Sergeant Servat confirmed that the image was child pornography, this did not result in any further intrusion on Bohannon's privacy. Finally, the SFPD's warrant was supported by probable cause, and were it not, the good faith exception to the exclusionary rule would apply because law enforcement reasonably relied on the warrant.

Bohannon argues that Microsoft's initial identification of child pornography in the OneDrive account resulted from an unlawful warrantless search. But the Fourth Amendment does not apply to Microsoft's private actions and, in any event, Bohannon consented to Microsoft searching his account.


The judge could have been wrong, obviously, but that seems to be a common legal opinion:

https://ideaexchange.uakron.edu/cgi/viewcontent.cgi?article=2493&context=akronlawreview
Courts must exercise caution when admitting evidence into the record obtained through hash value algorithms. It is not, however, per se unconstitutional for private internet service providers to use hash-based examination

Paying close attention to the circumstances surrounding the search, courts must determine the answer to the following questions: (1) Was the individual that conducted the initial search in fact acting as a private individual and not a state actor? (2) Was the defendant’s reasonable expectation of privacy thwarted by the third-party doctrine? (3) Was there probable cause for the warrantless search? (4) Was the state actor acting within the scope of the private party search? If the answer to any of the preceding questions is no, the defendant should be entitled to suppress the evidence gained during the unconstitutional search. When courts choose to implement this standard analysis, United States citizens receive proper protection both from unreasonable search and seizure by the government, as well as protection from the crime of child exploitation.
 
It has been court tested:

https://casetext.com/case/united-states-v-bohannon-15
[...] the Fourth Amendment does not apply to Microsoft's search, as Microsoft was not acting as a government entity or agent. [...]

https://ideaexchange.uakron.edu/cgi/viewcontent.cgi?article=2493&context=akronlawreview
[...] (1) Was the individual that conducted the initial search in fact acting as a private individual and not a state actor? (2) Was the defendant's reasonable expectation of privacy thwarted by the third-party doctrine? (3) Was there probable cause for the warrantless search? (4) Was the state actor acting within the scope of the private party search? [...]
Except you backed up my argument perfectly. Did you not read the last part about a state actor? This also has nothing to do with something on your personal device. Again, this case is about information stored on someone else's servers.

Apple is going to be scanning your own device using your own hardware and electricity to intentionally search for something not residing on their hardware and reporting that in order to prosecute someone. Your case does not apply.

Also, if the cloud service becomes mandatory, at that point that case is not likely to apply, since it will be impossible to separate your device from someone else's servers for something that is on your owned device.
 
Yep. Until there are some substantial legal changes to how cloud services are treated, that will be par for the course. I would argue the cloud services have gotten to the point where they should be treated like an ISP and/or utility so companies can't do this stuff. However, that is a massive piece of legislation that needs to be pushed to change the law, and given that the legislative branch of the USA has turned into bread & circuses, I doubt it will ever get better.

All of that being said, cloud services aren't mandatory. You do not have to use iCloud on an iPhone, it will work just fine without it. You can still do manual local backups with iTunes just fine as well. I actually keep most of the iCloud stuff disabled anyways, because I prefer the speed of the local backups, and most people keep iCloud photos off as well given how fast you'll blow through the free iCloud storage. The only feature I see myself liking with iCloud, besides activation lock, is the new cloud relay function.
 