Apple will remotely install software to scan all US phones for child sex abuse images

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
6,533
. . .

noko, I don't disagree with what you're saying about it being a potential avenue for abuse. But if you just want to take one or two sentences, come up with your own interpretation of what it does and how it does it, and put your own spin on it, what does that accomplish?

Yes, the "visual derivative" has to be recognizable as an image for them to be able to perform a manual review. And again, I don't disagree with what you're saying about the potential for abuse. But you're making a lot of suppositions. The potential for modification and expansion is certainly an issue; right now, though, the fact that the capability exists at all is, IMO, the bigger concern, as I've explained.

Don't focus on "what if" - we should be focused on "what is". There's enough issue there already.

You might want to read the technical overview here:

https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf

It goes into much more detail about how the NeuralHash system works. It doesn't specify what the visual derivative is, but I would assume it's grayscaled or intentionally obscured in some way - like having a big fuzzy filter put over it, or an extremely high level of compression. That's purely guesswork on my part, and I agree that Apple not being clear about the derivative is a problem. But the defense of it would be: if reports are only generated when an image matches something, wouldn't you want reviewers to be able to tell, with an extremely high level of confidence, whether an image was a false positive? For that to work, they'd have to be able to see some version of the image.

(Again, not saying it's a good thing. But within the confines of the system they've designed, it's a good solution to the problem).
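For anyone curious how this kind of matching can survive recompression or resizing when a cryptographic hash can't: NeuralHash itself is proprietary, so here's a minimal sketch using a classic difference hash (dHash) on a made-up pixel grid. The names and data are mine, not Apple's; the point is just that a uniform change to an image (like brightening) leaves the perceptual hash unchanged, because only the relative ordering of neighboring pixels matters.

```python
import random

def dhash(pixels):
    """Difference hash: compare each pixel to its right neighbor.

    pixels: an 8x9 grid of grayscale values -> 64-bit integer hash.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

# A tiny synthetic "image" (8 rows x 9 columns of grayscale values).
random.seed(42)
image = [[random.randrange(200) for _ in range(9)] for _ in range(8)]

# Uniformly brighten it: every left/right comparison is unchanged,
# so the perceptual hash is identical even though every byte differs.
brighter = [[v + 40 for v in row] for row in image]

print(dhash(image) == dhash(brighter))  # True
```

A SHA-256 of those two images would differ completely, which is exactly why systems like this use perceptual rather than cryptographic hashes.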

. . .
Thanks for the research and link. I do focus on what-ifs at times; what the current implementation does is defined by code, but what it can do could be much different. Of course, Apple could be saints and deliver 100% with zero bugs, with every employee following every policy 100% of the time, etc. In our world, that is not even close to the norm. For me, red flags are waving everywhere with this, and Apple is not where I would put that kind of trust. Of course my wife does, so she has an iPhone.

Apple has had a long-running security problem anyway with the Pegasus spyware, which can invade your phone without you clicking on anything. It controls your camera and mic, and sees your passwords, documents, and so on.
https://www.theguardian.com/technol...walled-garden-is-no-match-for-pegasus-spyware
And yet Pegasus has worked, in one way or another, on iOS for at least five years. The latest version of the software is even capable of exploiting a brand-new iPhone 12 running iOS 14.6, the newest version of the operating system available to normal users. More than that: the version of Pegasus that infects those phones is a “zero-click” exploit. There is no dodgy link to click, or malicious attachment to open. Simply receiving the message is enough to become a victim of the malware.


It’s worth pausing to note what is, and isn’t, worth criticising Apple for here. No software on a modern computing platform can ever be bug-free, and as a result no software can ever be fully hacker-proof. Governments will pay big money for working iPhone exploits, and that motivates a lot of unscrupulous security researchers to spend a lot of time trying to work out how to break Apple’s security.

But security experts I’ve spoken to say that there is a deeper malaise at work here. “Apple’s self-assured hubris is just unparalleled,” Patrick Wardle, a former NSA employee and founder of the Mac security developer Objective-See, told me last week. “They basically believe that their way is the best way.”

Apple's arrogance and their habit of ignoring their OS's failures are not something I see changing, and adding a number of additional avenues in iOS right at the system level just makes it worse in my book.

Brief rundown on Pegasus:
https://en.wikipedia.org/wiki/Pegasus_(spyware)
 

emphy

Limp Gawd
Joined
Aug 31, 2016
Messages
296
...

But my experience as an analyst was that there are plenty of dumb criminals out there. It will probably catch a lot of them.
My problem with this reasoning is that those dumb criminals would also be caught by an increased police effort. That makes the real choice of implementing this technology one between funding/effort and one of the most important human rights we have in a free society (the presumption of innocence until proven guilty), as opposed to the choice this is framed as: between waiving that right and catching those criminals.
 

deruberhanyok

2[H]4U
Joined
Aug 22, 2004
Messages
2,098
My problem with this reasoning is that those dumb criminals would also be caught by an increased police effort. That makes the real choice of implementing this technology one between funding/effort and one of the most important human rights we have in a free society (the presumption of innocence until proven guilty), as opposed to the choice this is framed as: between waiving that right and catching those criminals.

Totally agreed. Setting aside the "what if" scenarios, they've come up with a really good system design that does what they say it does - provides far more privacy for your photos than the usual "just scan everything in the cloud" model that other providers employ. But IMO, unfortunately, it looks like it comes at the cost of fourth and fifth amendment rights. The frogurt is cursed, if you will.

Maybe that's a tradeoff that's worth it for some. I am... undecided. But I have a feeling if this is implemented without any lawsuits from EFF or whoever that we will start to see similar capabilities in all the various alternative photo applications from Google, Samsung, Dropbox, Facebook, etc. And on Android, too. It could become just part and parcel of having a mobile device, and there may not actually be any realistic alternatives that don't come with some kind of major feature or usability compromises.
 

Aurelius

2[H]4U
Joined
Mar 22, 2003
Messages
3,444
Totally agreed. Setting aside the "what if" scenarios, they've come up with a really good system design that does what they say it does - provides far more privacy for your photos than the usual "just scan everything in the cloud" model that other providers employ. But IMO, unfortunately, it looks like it comes at the cost of fourth and fifth amendment rights. The frogurt is cursed, if you will.

Maybe that's a tradeoff that's worth it for some. I am... undecided. But I have a feeling if this is implemented without any lawsuits from EFF or whoever that we will start to see similar capabilities in all the various alternative photo applications from Google, Samsung, Dropbox, Facebook, etc. And on Android, too. It could become just part and parcel of having a mobile device, and there may not actually be any realistic alternatives that don't come with some kind of major feature or usability compromises.

I see it as likely amounting to a non-event. Apple helps nab more not-so-bright criminals; the slippery slope doesn't materialize; people eventually forget about it. As it stands, companies like Facebook and Google already scan for those photos, just directly in the cloud. The real gotcha is if Google follows Apple (as it often does on safety/security matters) and does on-device screening for Android users uploading to Photos/Drive. Seems like it'd be folly to switch platforms over Apple's move knowing both what Google already does and what it could very easily do next.
 

kirbyrj

Fully [H]
Joined
Feb 1, 2005
Messages
28,824
But IMO, unfortunately, it looks like it comes at the cost of fourth and fifth amendment rights.

Apple isn't a law enforcement/government agency, so those do not apply. The Bill of Rights (in theory, anyway) spells out citizens' rights in relation to the government. Hence, Facebook/Google/Apple/Twitter can essentially silence anyone they deem inappropriate, as they are private companies. You agree to their terms of service upon using their products.
 

Jonnycat99

Limp Gawd
Joined
Nov 9, 2006
Messages
286
I see it as likely amounting to a non-event. Apple helps nab more not-so-bright criminals; the slippery slope doesn't materialize; people eventually forget about it. As it stands, companies like Facebook and Google already scan for those photos, just directly in the cloud. The real gotcha is if Google follows Apple (as it often does on safety/security matters) and does on-device screening for Android users uploading to Photos/Drive. Seems like it'd be folly to switch platforms over Apple's move knowing both what Google already does and what it could very easily do next.

Outside of rooting, smartphones have never been a "user-controlled" piece of hardware. Considering that Google and Dropbox have already implemented CSAM detection on their servers, I think the only reason this is getting as much traction as it is getting is because Apple released a press release on the issue, and because part of the detection process is occurring on the phone itself due to the files themselves being encrypted on the user's device.

The only way to truly have data privacy on a mobile device will be a FOSS ROM or smartphone implementation a la Pinephone or Librem.
 

Aurelius

2[H]4U
Joined
Mar 22, 2003
Messages
3,444
Outside of rooting, smartphones have never been a "user-controlled" piece of hardware. Considering that Google and Dropbox have already implemented CSAM detection on their servers, I think the only reason this is getting as much traction as it is getting is because Apple released a press release on the issue, and because part of the detection process is occurring on the phone itself due to the files themselves being encrypted on the user's device.

The only way to truly have data privacy on a mobile device will be a FOSS ROM or smartphone implementation a la Pinephone or Librem.
Pretty much, although you can always turn iCloud photo syncing off in this case. And hardcore open source phones... well, they're not great phones, at least not these days.
 

SmokeRngs

[H]ard|DCer of the Month - April 2008
Joined
Aug 9, 2001
Messages
17,403
Apple isn't a law enforcement/government agency, so those do not apply. The Bill of Rights (in theory, anyway) spells out citizens' rights in relation to the government. Hence, Facebook/Google/Apple/Twitter can essentially silence anyone they deem inappropriate, as they are private companies. You agree to their terms of service upon using their products.
Except that Apple is acting as an arm of law enforcement with this change. Apple will be scanning your phone with your photos to look for something illegal. This is nowhere near the same as before because then Apple was only scanning what was on their property. That was the loophole they used for scanning; they scanned data on their servers in an effort to avoid possible issues with illegal materials on their own servers so they couldn't get in trouble for having illegal materials.

Scanning the phone is completely different because the phone is not Apple's property. At the same time Apple is acting as an extension of law enforcement AND preemptively scanning for possible illegal materials. This isn't about having illegal materials on Apple property because if it was Apple simply wouldn't allow materials which match to be uploaded to their servers. Instead, it's not only allowed to be uploaded but sent off to law enforcement as evidence for prosecution. This is Apple acting as a state actor and therefore Constitutional protections are in effect. That means the scan itself is an illegal search and seizure.

The computer tech repairing a busted laptop who finds child porn on it reports it to local authorities, and actions are taken. That's legal because finding the child porn is incidental and accidental. The tech and the company he works for are not acting as state actors in that instance. However, if law enforcement comes by and tells that company/tech to scan every computer they get for child porn, the company and tech have become state actors. It's also illegal for law enforcement to ask or demand that anyone do this, and doing so renders useless any evidence obtained that way. The discovery must be incidental and accidental, and what Apple is doing is neither incidental nor accidental. It is using government-provided tools to intentionally search for certain things on property not owned by Apple in order to turn them over to law enforcement.
 

equinox654

Limp Gawd
Joined
Apr 7, 2005
Messages
508
What's gonna be scary is when someone's teenage son sees jailbait on reddit or 4chan, it gets cached by the browser or app, and Apple sics the feds on them.
 

LukeTbk

[H]ard|Gawd
Joined
Sep 10, 2020
Messages
1,447
Unfortunately, people don't have a good grasp that a 32-bit hash of a 1920×1080 24-bit image (49,766,400 bits) is going to have collisions, even if the hash is well distributed.
The UUIDs we commonly use for non-critical purposes are 128 bits (3.40 × 10^38 different values, or 340,282,366,920,938,463,463,374,607,431,768,211,455 - about 4.25 × 10^28 per human on Earth), and hashes can go up to 4096 bits if needed.

Do you have a source? I really doubt it will be a 32-bit hash.

Wouldn't this use some intelligence to try to catch similar pictures that are cropped, stretched, inverted, etc.? I doubt hash collisions would be an issue in 2021 for something that has this much data to hash from - as you pointed out with the number of bits in a single decompressed image.
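To put numbers on the quoted concern about hash width: a quick birthday-bound estimate (my own back-of-the-envelope math; Apple hasn't published its actual hash width) shows why 32 bits would collide constantly across a large photo library while 128 bits effectively never would.

```python
import math

def collision_probability(bits, n_items):
    """Birthday-bound approximation: probability that at least two of
    n_items uniformly random values collide in a space of 2**bits."""
    exponent = -n_items * (n_items - 1) / (2.0 * 2.0 ** bits)
    return 1.0 - math.exp(exponent)

million = 1_000_000
print(collision_probability(32, million))   # ~1.0: collisions are certain
print(collision_probability(128, million))  # ~0.0: effectively impossible
```

So the width of the hash alone settles the collision question; the harder problem is the perceptual matching of cropped/stretched variants, which is a separate design axis from hash size.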

Tbh, the pedos will just keep files encrypted and use a memory only decryption/viewer. Or run linux in a vm. I'm sorry this is not robust and this is using children to get a foot in the door.
I imagine many do, but surprisingly, Apple/Microsoft/Dropbox do catch people who store hundreds or thousands of unencrypted images on the free versions of their services, with accounts not well separated from their identities (i.e., the email used to create them and whatnot), and who use explicit folder names to indicate that's where they put their CP.

For example:
http://www.yourgv.com/news/court/ju...cle_d0e87292-0bfd-11eb-8ff5-2379f117d72b.html

They even use regular email to send links to their iCloud accounts with those pictures:
https://thedesk.net/2020/02/apple-search-warrant-child-porn-scan-emails-icloud-forbes/

There were 45 million hits in 2019, I think, from people using the tech giants' technology:
https://www.nytimes.com/interactive/2019/09/28/us/child-sex-abuse.html

Maybe that will change now, unlike in 2019-2020, when Apple scanned iCloud and Messages for it and it made less press than it does now.
 

kirbyrj

Fully [H]
Joined
Feb 1, 2005
Messages
28,824
Except that Apple is acting as an arm of law enforcement with this change. Apple will be scanning your phone with your photos to look for something illegal. This is nowhere near the same as before because then Apple was only scanning what was on their property. That was the loophole they used for scanning; they scanned data on their servers in an effort to avoid possible issues with illegal materials on their own servers so they couldn't get in trouble for having illegal materials.

Scanning the phone is completely different because the phone is not Apple's property. At the same time Apple is acting as an extension of law enforcement AND preemptively scanning for possible illegal materials. This isn't about having illegal materials on Apple property because if it was Apple simply wouldn't allow materials which match to be uploaded to their servers. Instead, it's not only allowed to be uploaded but sent off to law enforcement as evidence for prosecution. This is Apple acting as a state actor and therefore Constitutional protections are in effect. That means the scan itself is an illegal search and seizure.

The computer tech repairing a busted laptop who finds child porn on it reports it to local authorities, and actions are taken. That's legal because finding the child porn is incidental and accidental. The tech and the company he works for are not acting as state actors in that instance. However, if law enforcement comes by and tells that company/tech to scan every computer they get for child porn, the company and tech have become state actors. It's also illegal for law enforcement to ask or demand that anyone do this, and doing so renders useless any evidence obtained that way. The discovery must be incidental and accidental, and what Apple is doing is neither incidental nor accidental. It is using government-provided tools to intentionally search for certain things on property not owned by Apple in order to turn them over to law enforcement.

Except they aren't a government entity. You can argue all you want, but they are not doing it at the behest of a government entity. They are simply doing it of their own accord and passing the information off to law enforcement. I'm not saying it's right or not. I would agree that it probably isn't, but I'm just saying it isn't fundamentally a constitutional issue, because the parties involved aren't government entities. I would assume that in order to seize the property once Apple reports it, a warrant would be required, etc. I get that it's splitting hairs because the assumption is that a government entity is going to get involved when a violation occurs, but it's not a 4th Amendment issue.

So it comes down to accepting the terms of use of their product and services of a private company.
 

SmokeRngs

[H]ard|DCer of the Month - April 2008
Joined
Aug 9, 2001
Messages
17,403
Except they aren't a government entity. You can argue all you want, but they are not doing it at the behest of a government entity. They are simply doing it of their own accord and passing the information off to law enforcement. I'm not saying it's right or not. I would agree that it probably isn't, but I'm just saying it isn't fundamentally a constitutional issue, because the parties involved aren't government entities. I would assume that in order to seize the property once Apple reports it, a warrant would be required, etc. I get that it's splitting hairs because the assumption is that a government entity is going to get involved when a violation occurs, but it's not a 4th Amendment issue.

So it comes down to accepting the terms of use of their product and services of a private company.
Show me the law that says a company doing the job of the state is not a state actor. Show me the law that says a private company can knowingly and intentionally inspect and search anything you have on a privately owned device and use whatever they find against you in a court of law.

You're not going to find anything but neither one is legal. The only reason it's legal for Apple to search anything you transfer to their cloud service is because that data is stored on their servers and only searched after it's on their servers.

Terms of use mean little or nothing, because it's rare they are held to be legal in the first place, and it's even more rare for them to be challenged legally. They're not some sort of catch-all allowing a company to do whatever illegal thing it wants, which is a mistake you seem to be making.
 

LukeTbk

[H]ard|Gawd
Joined
Sep 10, 2020
Messages
1,447
Show me the law that says a private company can knowingly and intentionally inspect and search anything you have on a privately owned device and use whatever they find against you in a court of law.
That seems a bit of a straw man, as from what I understand they do not do this at all here.

It is a bit more like:
User: Apple, can I put a picture on your cloud infrastructure?
Apple: As per our user agreement, your device will generate a hash, compare it to a DB, and attach a key to the picture you send us; if there is a match, we will be able to look at said picture and will probably do so (in reality, I imagine, only if your account gets flagged a lot and doesn't end up very low on the list).
User: OK, do so, and send the picture to your cloud infrastructure afterwards, please.

And they do not inspect or search anything else on your device.
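That flow can be sketched in code. This is my own illustration with made-up names, not Apple's protocol: the real system uses NeuralHash, private set intersection, and threshold secret sharing, so the device never learns whether a match occurred and the server learns nothing until a match threshold is crossed (reportedly around 30).

```python
THRESHOLD = 30  # reported initial match threshold; an assumption here

class Account:
    """Server-side view: match vouchers accumulate, sealed until threshold."""

    def __init__(self):
        self.matching_vouchers = 0

    def receive_voucher(self, is_match):
        # In the real protocol the server cannot read is_match directly;
        # threshold secret sharing only reveals matches past THRESHOLD.
        if is_match:
            self.matching_vouchers += 1

    def human_review_possible(self):
        return self.matching_vouchers >= THRESHOLD

acct = Account()
for _ in range(29):
    acct.receive_voucher(is_match=True)
print(acct.human_review_possible())  # False: still sealed at 29 matches

acct.receive_voucher(is_match=True)
print(acct.human_review_possible())  # True: threshold crossed at 30
```

The design choice this illustrates is that a single false-positive match is supposed to be invisible even to Apple; only accumulation past the threshold unlocks anything for manual review.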

Terms of use mean little or nothing
In cases very close to this one, where the scan is purely in the cloud, court judgments do say that the 4th Amendment does not apply because of the private-company angle - but also that even if it were a government agent (like a government server), it still would not apply, because the user agreement the person signed becomes a form of 'yes, you can search my car/person if you want.'
 

SmokeRngs

[H]ard|DCer of the Month - April 2008
Joined
Aug 9, 2001
Messages
17,403
That seems a bit of a straw man, as from what I understand they do not do this at all here.

It is a bit more like:
User: Apple, can I put a picture on your cloud infrastructure?
Apple: As per our user agreement, your device will generate a hash, compare it to a DB, and attach a key to the picture you send us; if there is a match, we will be able to look at said picture and will probably do so (in reality, I imagine, only if your account gets flagged a lot and doesn't end up very low on the list).
User: OK, do so, and send the picture to your cloud infrastructure afterwards, please.

And they do not inspect or search anything else on your device.


In cases very close to this one, where the scan is purely in the cloud, court judgments do say that the 4th Amendment does not apply because of the private-company angle - but also that even if it were a government agent (like a government server), it still would not apply, because the user agreement the person signed becomes a form of 'yes, you can search my car/person if you want.'
Have you even bothered to look at what Apple is going to do? Apple is not going to be scanning anything on their servers. Apple will be scanning everything on your phone. The loophole is gone. Period. End of story.

User agreements cannot remove your rights in perpetuity, which is exactly what this program by Apple will do. Cops do not get a single warrant and then have unlimited permission to search your home, devices, vehicles, etc. Warrants are for a specific time, place, and items, and any little error will invalidate the warrant and anything seen or seized by the cops while using that invalid warrant.

Apple is not scanning data on their hardware. They are scanning your data by using your hardware in order to have you prosecuted. They are not asking for permission to do this. They are not getting a warrant. They are going to use anything they might find in their hash file to have you prosecuted. THAT IS BEING A STATE ACTOR. STATE ACTORS ARE REQUIRED TO ABIDE BY THE CONSTITUTION.

The best case legal scenario for Apple would be a scan of any data being uploaded to their servers and disallow anything on the hash file. And that would be the end of it. No turning anything over to the authorities and so not being a state actor.

For the people who don't seem to understand: it is not required to be an official government employee to be a state actor. It only requires intended collusion with the government, OR a directive by the government to do something.
 

LukeTbk

[H]ard|Gawd
Joined
Sep 10, 2020
Messages
1,447
Have you even bothered to look at what Apple is going to do? Apple is not going to be scanning anything on their servers. Apple will be scanning everything on your phone. The loophole is gone. Period. End of story.
Well yes, as I said (sorry for my bad English, it is a second language), I explicitly wrote: "your device will generate a hash, compare it to a DB, and attach a key to the picture you send us."



Apple will be scanning everything on your phone.
Here I feel the wording is imprecise for a reason: your phone will be creating hashes of pictures you explicitly ask them to hold on their own servers, before the upload instead of after, because they will no longer be able to open a file on their cloud without a matching key.

They are not asking for permission to do this.
It is weak, but I imagine they will, in the user agreement they ask you to accept when you activate your phone.

For the people who don't seem to understand. It is not required to be an official government employee to be a state actor. It only requires intended collusion with the government to be a state actor OR a directive by the government to do something.
I think most understand that (the "actor" distinction would not be needed otherwise), and previous judgments on the matter do explore it.

We will see quite soon if it does hold up in court.

User agreements cannot remove your rights in perpetuity, which is exactly what this program by Apple will do.
This has been tested thoroughly with many cloud services before; explicitly consenting to a search, as those services' users do, does seem to have some legal force according to some courts.
https://casetext.com/case/united-states-v-bohannon-15
The Court denies Bohannon's motion. First, the Fourth Amendment does not apply to Microsoft's search, as Microsoft was not acting as a government entity or agent. Even if the Fourth Amendment applied, Bohannon consented to Microsoft's search by agreeing to Microsoft's terms of service. Second, NCMEC did not conduct any search and thus could not have violated Bohannon's Fourth Amendment rights.

Bohannon argues that Microsoft's initial identification of child pornography in the OneDrive account resulted from an unlawful warrantless search. But the Fourth Amendment does not apply to Microsoft's private actions and, in any event, Bohannon consented to Microsoft searching his account.


Everyone who says "I agree" to Apple's terms of service will also have consented to Apple creating, on their device, a hash of the pictures they send to the cloud.

As for it being considered a state actor:
defendant "has the burden of establishing government involvement in a private search." United States v. Cleaveland, 38 F.3d 1092, 1093 (9th Cir. 1994). The "relevant inquiry is: (1) whether the government knew of and acquiesced in the intrusive conduct; and (2) whether the party performing the search intended to assist law enforcement efforts or further his own ends." Id. (quoting United States v. Reed, 15 F.3d 928, 931 (9th Cir. 1994)). At the second step, a private entity may have "dual motives" comprising both a "legitimate, independent motivation to further its own ends" and a desire to assist law enforcement. Id. at 1094 (citation omitted). If that is the case, the Ninth Circuit requires the Court to determine whether "the government's participation" in the search was "so extensive as to trigger Fourth Amendment scrutiny." Id. It is not enough that the government "used the fruits of [the] search to obtain a warrant." Id. Instead, the government must have "actively" participated in the search. Id.; see also United States v. Walther, 652 F.2d 788, 792 (9th Cir. 1981) ("The presence of law enforcement officers who do not take an active role in encouraging or assisting an otherwise private search has been held insufficient to implicate Fourth Amendment interests, especially where the private party has had a legitimate independent motivation for conducting the search."); Corngold v. United States, 367 F.2d 1, 5-6 (9th Cir. 1966) (en banc) (explaining that even when a private party has dual motives, a search that law enforcement "joined actively" is equivalent to a search by the government).

First, Bohannon does not allege that the government knew about this PhotoDNA search, and Microsoft's general practice of using PhotoDNA to identify child pornography—a practice that it has no statutory obligation to perform—is not enough. See United States v. Green, 857 F. Supp. 2d 1015, 1018 (S.D. Cal. 2012). If it were, anything that private persons routinely do and that helps prevent crime could constitute government action. Second, Bohannon has not shown that Microsoft's "legitimate, independent motivation to further its own ends" was "negated" by Microsoft's potential "dual motive to detect or prevent crime." Even if Microsoft is also happy to help law enforcement prevent child sexual abuse and has a "track record" of doing so, Mot. to Suppress at 11, there was no "government involvement" in this PhotoDNA search, let alone the active or "extensive" participation necessary to override Microsoft's "legitimate, independent motive."


If a significant part of a company's motive is branding around why it prevents crime, that's a good argument that it is not acting as a government agent. For example, a company simply not wanting to actively host child pornography on its own computers, with its own electricity, would also be a reason to do this that is independent of any government mandate.

All that to say, I am confident (with no special knowledge) it has a good chance of holding up in court. Apple's action here seems clearly to be about its privacy branding (which is ironic: when they and the government could easily look at non-encrypted pictures on their cloud, their branding was better than it is now, in their effort to make that nearly impossible).

And as previously said, even if government agency were demonstrated by the defense:
, Privacy Statement at 422-23, including its prohibition on unlawful acts and "any activity that exploits, harms, or threatens to harm children," Services Agreement at 135. That is precisely what happened here. Thus, even were Microsoft acting as a government agent, its PhotoDNA search was reasonable under the Fourth Amendment.
 

SmokeRngs

[H]ard|DCer of the Month - April 2008
Joined
Aug 9, 2001
Messages
17,403
Well yes, as I said (sorry for my bad English, it is a second language), I explicitly wrote: "your device will generate a hash, compare it to a DB, and attach a key to the picture you send us."




Here I feel the wording is imprecise for a reason: your phone will be creating hashes of pictures you explicitly ask them to hold on their own servers, before the upload instead of after, because they will no longer be able to open a file on their cloud without a matching key.


It is weak, but I imagine they will, in the user agreement they ask you to accept when you activate your phone.


I think most understand that (the "actor" distinction would not be needed otherwise), and previous judgments on the matter do explore it.

We will see quite soon if it does hold up in court.


This has been tested thoroughly with many cloud services before; explicitly consenting to a search, as those services' users do, does seem to have some legal force according to some courts.
https://casetext.com/case/united-states-v-bohannon-15
The Court denies Bohannon's motion. First, the Fourth Amendment does not apply to Microsoft's search, as Microsoft was not acting as a government entity or agent. Even if the Fourth Amendment applied, Bohannon consented to Microsoft's search by agreeing to Microsoft's terms of service. Second, NCMEC did not conduct any search and thus could not have violated Bohannon's Fourth Amendment rights.

Bohannon argues that Microsoft's initial identification of child pornography in the OneDrive account resulted from an unlawful warrantless search. But the Fourth Amendment does not apply to Microsoft's private actions and, in any event, Bohannon consented to Microsoft searching his account.


Everyone who says "I agree" to Apple's terms of service will also have consented to Apple creating, on their device, a hash of the pictures they send to the cloud.

As for it being considered a state actor:
defendant "has the burden of establishing government involvement in a private search." United States v. Cleaveland, 38 F.3d 1092, 1093 (9th Cir. 1994). The "relevant inquiry is: (1) whether the government knew of and acquiesced in the intrusive conduct; and (2) whether the party performing the search intended to assist law enforcement efforts or further his own ends." Id. (quoting United States v. Reed, 15 F.3d 928, 931 (9th Cir. 1994)). At the second step, a private entity may have "dual motives" comprising both a "legitimate, independent motivation to further its own ends" and a desire to assist law enforcement. Id. at 1094 (citation omitted). If that is the case, the Ninth Circuit requires the Court to determine whether "the government's participation" in the search was "so extensive as to trigger Fourth Amendment scrutiny." Id. It is not enough that the government "used the fruits of [the] search to obtain a warrant." Id. Instead, the government must have "actively" participated in the search. Id.; see also United States v. Walther, 652 F.2d 788, 792 (9th Cir. 1981) ("The presence of law enforcement officers who do not take an active role in encouraging or assisting an otherwise private search has been held insufficient to implicate Fourth Amendment interests, especially where the private party has had a legitimate independent motivation for conducting the search."); Corngold v. United States, 367 F.2d 1, 5-6 (9th Cir. 1966) (en banc) (explaining that even when a private party has dual motives, a search that law enforcement "joined actively" is equivalent to a search by the government).

First, Bohannon does not allege that the government knew about this PhotoDNA search, and Microsoft's general practice of using PhotoDNA to identify child pornography—a practice that it has no statutory obligation to perform—is not enough. See United States v. Green, 857 F. Supp. 2d 1015, 1018 (S.D. Cal. 2012). If it were, anything that private persons routinely do and that helps prevent crime could constitute government action. Second, Bohannon has not shown that Microsoft's "legitimate, independent motivation to further its own ends" was "negated" by Microsoft's potential "dual motive to detect or prevent crime." Even if Microsoft is also happy to help law enforcement prevent child sexual abuse and has a "track record" of doing so, Mot. to Suppress at 11, there was no "government involvement" in this PhotoDNA search, let alone the active or "extensive" participation necessary to override Microsoft's "legitimate, independent motive."
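For what it's worth, PhotoDNA and Apple's NeuralHash are both perceptual hashes: unlike a cryptographic hash, a slightly altered copy of an image still lands on (nearly) the same value. Both algorithms are proprietary, so this is purely an illustrative sketch; here's a toy "average hash" in Python showing the general idea (the real systems are far more sophisticated and robust):

```python
# Toy "average hash" to illustrate the general idea behind perceptual
# hashing schemes like PhotoDNA or NeuralHash. Both real algorithms are
# proprietary and far more sophisticated; this is purely illustrative.

def average_hash(pixels):
    """pixels: flat list of grayscale values (0-255).
    Each bit records whether a pixel is above the image's mean,
    so small brightness/compression changes barely move the hash."""
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(a, b):
    """Number of differing bits; a small distance means near-duplicate images."""
    return bin(a ^ b).count("1")

original     = [10, 200, 30, 220, 15, 210, 25, 230]
recompressed = [12, 198, 28, 223, 17, 208, 27, 228]  # slight pixel noise
different    = [200, 10, 220, 30, 210, 15, 230, 25]

h0, h1, h2 = map(average_hash, (original, recompressed, different))
print(hamming(h0, h1))  # 0: the re-encoded copy still matches
print(hamming(h0, h2))  # 8: the unrelated image is far away
```

The point is that a re-encoded or lightly edited copy keeps essentially the same hash, which is why matching against a database of known-image hashes works at all.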

If a significant part of a company's motive for preventing crime is branding, that is a good argument that it is not acting as a government agent. For example, a company that simply does not want to actively host child pornography on its own computers, running on its own electricity, would also have a reason to act that is independent of any government mandate.

All that to say: I am confident (with no actual legal knowledge) that this has a good chance of passing in court. Apple's action here seems clearly to be about its privacy branding (which is ironic: back when Apple and the government could easily look at unencrypted pictures on its cloud, its branding was better than it is now, amid an effort to make that nearly impossible).

And like previously said, even if government agent was demonstrated by the defense:
, Privacy Statement at 422-23, including its prohibition on unlawful acts and "any activity that exploits, harms, or threatens to harm children," Services Agreement at 135. That is precisely what happened here. Thus, even were Microsoft acting as a government agent, its PhotoDNA search was reasonable under the Fourth Amendment.
What part of it's scanning your phone do you not understand? Every example you posted is about the company scanning data on their own servers. Apple will NOT be doing that. It is not the fault of the owner of the phone that Apple will implement encryption which means Apple can no longer scan the data on its own servers. It also does not give permission for Apple to scan your phone and send anything it finds to law enforcement.

I will repeat. The only reason companies were legally able to send information to law enforcement is because they did scans on their own hardware; not on a privately owned device. The only legal way for Apple to continue with what they have announced is to have ownership of every Apple device which is scanned.

If Apple implements encryption so Apple can no longer read the contents of their own servers, that's too bad for Apple because it was their own decision to do so. It does not give Apple legal recourse to scan any and all devices with the intent to turn information over to law enforcement. The moment Apple does that the company becomes a state actor. The moment Apple becomes a state actor every single search they do on a device without a warrant is unconstitutional and illegal.
 

Aurelius

2[H]4U
Joined
Mar 22, 2003
Messages
3,444
What's gonna be scary is when someone's teenage son sees jailbait on reddit or 4chan, it gets cached by the browser or app, and Apple sics the feds on them.
In order for Apple to sic the feds, that teenage son would need to have saved at least 30 pictures that are already in a collective CSAM database. At that point it's far, far from a "whoopsie" as Reddit and that teen are in trouble.
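To put a number on that threshold: stripped of all the cryptography, the counting logic looks something like the sketch below. To be clear, this is a simplification for illustration only; in Apple's actual design the device never counts matches in the clear, since it uses threshold secret sharing so that Apple learns nothing until the threshold is crossed.

```python
# Simplified illustration of threshold-based flagging (NOT Apple's actual
# implementation). The real system uses NeuralHash plus threshold secret
# sharing, so no party sees match counts in the clear before the threshold;
# this just shows the counting logic in plain Python.

THRESHOLD = 30  # matches required before any review is possible

# Hypothetical database of known-image perceptual hashes (hex strings)
known_hashes = {"a1b2c3d4", "deadbeef", "cafef00d"}

def count_matches(photo_hashes):
    """Count how many of a user's photo hashes appear in the database."""
    return sum(1 for h in photo_hashes if h in known_hashes)

def should_flag(photo_hashes):
    """Only once the match count reaches the threshold is an account flagged."""
    return count_matches(photo_hashes) >= THRESHOLD

# One incidental match stays far below the threshold:
print(should_flag(["a1b2c3d4", "12345678"]))  # False: 1 match < 30
print(should_flag(["a1b2c3d4"] * 30))         # True: 30 matches
```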
 

pendragon1

Fully [H]
Joined
Oct 7, 2000
Messages
32,538
(sorry for my bad English it is a second language)
better than half the English speakers around here ;)
but i agree with smoke, they are acting on behalf of the gov now and scanning devices is out of line. a computer shop can't scan every system that comes through its doors, like he said, it has to be incidental/accidental discovery. otherwise it's an invasion of privacy, and there are laws against that, at least up here (CAN) there are.
 

Jonnycat99

Limp Gawd
Joined
Nov 9, 2006
Messages
286
better than half the English speakers around here ;)
but i agree with smoke, they are acting on behalf of the gov now and scanning devices is out of line. a computer shop can't scan every system that comes through its doors, like he said, it has to be incidental/accidental discovery. otherwise it's an invasion of privacy, and there are laws against that, at least up here (CAN) there are.

The difference is that you can opt out of having them scan your local files if you decline saving them to their cloud. No cloud, no scan (at least for now).
 

DukenukemX

Supreme [H]ardness
Joined
Jan 30, 2005
Messages
5,810
What's gonna be scary is when someone's teenage son sees jailbait on reddit or 4chan, it gets cached by the browser or app, and Apple sics the feds on them.
I'm reminded of my niece who used to live in Paterson, NJ, and how many underage girls there are in relationships with men 18+. Seems to be the culture in that area, and they probably all use iPhones. Not that I think whatever Apple is doing is going to catch anyone sending nudes of themselves underage, or the 18+ receiving them. This is just an excuse from Apple to sell your data. It would be funny to watch how many teenagers using iPhones get in trouble with the police. I'd like to see their faces as they wonder how their personal data was given to law enforcement without their permission. Won't happen, but an Apple hater can dream.
 

LukeTbk

[H]ard|Gawd
Joined
Sep 10, 2020
Messages
1,447
What part of it's scanning your phone do you not understand? Every example you posted is about the company scanning data on their own servers. Apple will NOT be doing that. It is not the fault of the owner of the phone that Apple will implement encryption which means Apple can no longer scan the data on its own servers. It also does not give permission for Apple to scan your phone and send anything it finds to law enforcement.

I will repeat. The only reason companies were legally able to send information to law enforcement is because they did scans on their own hardware; not on a privately owned device. The only legal way for Apple to continue with what they have announced is to have ownership of every Apple device which is scanned.

If Apple implements encryption so Apple can no longer read the contents of their own servers, that's too bad for Apple because it was their own decision to do so. It does not give Apple legal recourse to scan any and all devices with the intent to turn information over to law enforcement. The moment Apple does that the company becomes a state actor. The moment Apple becomes a state actor every single search they do on a device without a warrant is unconstitutional and illegal.
Is that a serious legal opinion (is it yours, or from court precedent and legal experts)? From my understanding, a company aware of CP is legally required to report it to law enforcement.

That makes the intent to turn information over to law enforcement quite fuzzy. Apple's intent could be to not actively host CP on its servers, especially since every other company has much stronger measures than Apple does; the motives become hard to distinguish, because it is impossible for Apple to try to keep CP off its servers and yet not contact law enforcement when it finds some. And again, why do you say any and all devices? It will only be devices that use their cloud service, and only for the content people ask Apple to host.
 

kirbyrj

Fully [H]
Joined
Feb 1, 2005
Messages
28,824
Show me the law that says a company doing the job of the state is not a state actor. Show me the law that says a private company can knowingly and intentionally inspect and search anything you have on a privately owned device and use whatever they find against you in a court of law.

You're not going to find anything but neither one is legal. The only reason it's legal for Apple to search anything you transfer to their cloud service is because that data is stored on their servers and only searched after it's on their servers.

Terms of use mean little or nothing, because it's rare that they are held to be legal in the first place, and even rarer for them to be challenged legally. They're not some sort of catch-all that lets a company do whatever it wants, legal or not, which is a mistake you seem to be making.

There is no law like you say. There's no law telling them they can't do it, either. Your view of what a government entity entails is suspect at best. As an example, say you look into your neighbor's window from your own house, see your neighbor kill his wife, and report it to the police. You are not an arm of the government or a state actor, even though your neighbor had an expectation of privacy in his residence. The police action and the associated warrants that would allow government entities to search the house for the related murder, etc., are the byproduct of a non-government actor's testimony. My take is that this is essentially what Apple is doing, just in a much more technical fashion.

Like I said, I agree with you in principle other than the constitutional issue. It's not an illegal "search and seizure" because Apple is not bound by those rules. I'm not sure this aspect of the relationship between the company, its product, and the use by the owner of the product has been litigated yet to say for sure what the "law" is, honestly. But it's just a bad precedent to set in terms of the company/end-user relationship.
 
Last edited:

SmokeRngs

[H]ard|DCer of the Month - April 2008
Joined
Aug 9, 2001
Messages
17,403
Is that a serious legal opinion (is it yours, or from court precedent and legal experts)? From my understanding, a company aware of CP is legally required to report it to law enforcement.

That makes the intent to turn information over to law enforcement quite fuzzy. Apple's intent could be to not actively host CP on its servers, especially since every other company has much stronger measures than Apple does; the motives become hard to distinguish, because it is impossible for Apple to try to keep CP off its servers and yet not contact law enforcement when it finds some. And again, why do you say any and all devices? It will only be devices that use their cloud service, and only for the content people ask Apple to host.
As I said, look at all the examples and the only thing you're going to find is companies scanning data on hardware they own. They cannot legally scan anything on hardware they do not own and send the results to law enforcement. I don't think it's even legal for law enforcement with a warrant to force a company to scan someone's phone. Law enforcement is required to confiscate the phone with a warrant and get into the phone themselves to see anything which is on it.

It doesn't matter if iCloud is required or not. Apple is still intentionally and purposely scanning hardware which is not owned by Apple, with the intention of sending certain results to law enforcement. It's very simple: if Apple wants to make 100% sure there is no child porn being uploaded to iCloud from users, Apple simply doesn't have to allow encryption. Problem solved.

As it is, Apple scanning anything in iCloud is problematic in the first place. Apple cannot get in trouble for having child porn or any illegal data on their servers as long as they don't know about it and didn't put it there. Legally Apple and all other cloud providers are covered. They only scan it because they make use of it and in doing the scanning they can be aware of the contents of other people's data and therefore turn over the illegal data to law enforcement.

Scanning the data on your phone has nothing to do with Apple trying to keep child porn off their servers. If everything is encrypted and Apple can't read it, Apple cannot be in any trouble for anything stored on their servers. You need to look elsewhere for the real reason Apple wants to scan your phone.
 

SmokeRngs

[H]ard|DCer of the Month - April 2008
Joined
Aug 9, 2001
Messages
17,403
There is no law like you say. There's no law telling them they can't do it, either. Your view of what a government entity entails is suspect at best. As an example, say you look into your neighbor's window from your own house, see your neighbor kill his wife, and report it to the police. You are not an arm of the government or a state actor, even though your neighbor had an expectation of privacy in his residence. The police action and the associated warrants that would allow government entities to search the house for the related murder, etc., are the byproduct of a non-government actor's testimony. My take is that this is essentially what Apple is doing, just in a much more technical fashion.

Like I said, I agree with you in principle other than the constitutional issue. It's not an illegal "search and seizure" because Apple is not bound by those rules. I'm not sure this aspect of the relationship between the company, its product, and the use by the owner of the product has been litigated yet to say for sure what the "law" is, honestly. But it's just a bad precedent to set in terms of the company/end-user relationship.
Bad attempt at a strawman. Your example has nothing to do with what Apple is doing. It's much closer to the incidental and accidental discovery of child porn by a computer tech working on someone's computer. For it to be anything like what Apple is going to do, you'd have to break into your neighbor's house and install cameras and microphones everywhere in order to record and view everything that goes on, specifically so you can find anything illegal and report it to law enforcement, with law enforcement knowing exactly what you're doing. Except, to be a little closer to Apple, you've done the exact same thing to half the residences in the country.

Guess what, all of that is illegal and absolutely none of it is admissible in court because it was obtained illegally.
 

kirbyrj

Fully [H]
Joined
Feb 1, 2005
Messages
28,824
Guess what, all of that is illegal and absolutely none of it is admissible in court because it was obtained illegally.

But you haven't shown that at all. Like LukeTbk said, is that your opinion or something with legal precedent? He pointed out the PhotoDNA example and the litigation surrounding that as acceptable practice. It is not too far out of the realm of possibility for a federal judge to review that precedent when considering any action brought against Apple. A corporate entity is subject to different laws than government entities in regards to search and seizure.
 
Last edited:

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
6,533
There is no law like you say. There's no law telling them they can't do it, either. Your view of what a government entity entails is suspect at best. As an example, say you look into your neighbor's window from your own house, see your neighbor kill his wife, and report it to the police. You are not an arm of the government or a state actor, even though your neighbor had an expectation of privacy in his residence. The police action and the associated warrants that would allow government entities to search the house for the related murder, etc., are the byproduct of a non-government actor's testimony. My take is that this is essentially what Apple is doing, just in a much more technical fashion.

Like I said, I agree with you in principle other than the constitutional issue. It's not an illegal "search and seizure" because Apple is not bound by those rules. I'm not sure this aspect of the relationship between the company, its product, and the use by the owner of the product has been litigated yet to say for sure what the "law" is, honestly. But it's just a bad precedent to set in terms of the company/end-user relationship.
Looks like the precedent has been that evidence found by private citizens, even if obtained illegally, can be accepted in a court of law, but not if found by a state actor. So where does Apple fall in this? I have to say it has to be very convenient for Apple to have a shield of encryption on its servers, making them impossible to search and inspect for anything except CP. They could contain detailed plans, sent via images, for how to make bombs, how and when to strike, assassination assignments, all in secret, while Apple just flags CP and proclaims its servers are all super clean.

https://www.nolo.com/legal-encyclopedia/searches-private-citizens.html
 
Last edited:

kirbyrj

Fully [H]
Joined
Feb 1, 2005
Messages
28,824
Looks like the precedent has been that evidence found by private citizens, even if obtained illegally, can be accepted in a court of law, but not if found by a state actor. So where does Apple fall in this? I have to say it has to be very convenient for Apple to have a shield of encryption on its servers, making them impossible to search and inspect for anything except CP. They could contain detailed plans, sent via images, for how to make bombs, how and when to strike, assassination assignments, all in secret, while Apple just flags CP and proclaims its servers are all super clean.

https://www.nolo.com/legal-encyclopedia/searches-private-citizens.html

Exactly. I was working in a hospital, and the policy was for hospital security personnel to examine purses and bags brought into the emergency room. They did this and found some kind of drug in a woman's purse. The evidence was turned over to law enforcement and the woman was arrested. Law enforcement did not search her bag, but the contents were in fact searched and handed over to law enforcement by a third-party hospital employee following the hospital's corporate policy.
 

cybereality

[H]F Junkie
Joined
Mar 22, 2008
Messages
8,287
And just 5 years ago, Apple wouldn't even unlock the iPhone of a known terrorist. What changed?



https://techcrunch.com/2016/03/21/a...-we-will-not-shrink-from-this-responsibility/
 

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
6,533
And just 5 years ago, Apple wouldn't even unlock the iPhone of a known terrorist. What changed?

View attachment 386203

https://techcrunch.com/2016/03/21/a...-we-will-not-shrink-from-this-responsibility/
That is actually very interestingly worded. If indeed their servers are encrypted with no backdoors, then the best place for your data, if you can get it there before detection, is on their servers. As for your iPhone, Apple's position is: "That is your responsibility and a chance you take." They will no longer be at odds with the government, except that different governments may have other ideas in mind. I wonder if they actually ran this by all the different countries, or is this US only?
 

SmokeRngs

[H]ard|DCer of the Month - April 2008
Joined
Aug 9, 2001
Messages
17,403
But you haven't shown that at all. Like LukeTbk said, is that your opinion or something with legal precedent? He pointed out the PhotoDNA example and the litigation surrounding that as acceptable practice. It is not too far out of the realm of possibility for a federal judge to review that precedent when considering any action brought against Apple. A corporate entity is subject to different laws than government entities in regards to search and seizure.
Show me the legal precedent that says it's legal. It's not for me to prove it's legal, that's your job. You will not find instances of systematic and intentional searches of personal property by private companies with the intention of turning over the results to law enforcement being held up as legal.
 

kirbyrj

Fully [H]
Joined
Feb 1, 2005
Messages
28,824
Show me the legal precedent that says it's legal. It's not for me to prove it's legal, that's your job. You will not find instances of systematic and intentional searches of personal property by private companies with the intention of turning over the results to law enforcement being held up as legal.

That's not how that works at all. Everything is legal until a law is passed forbidding it or restricting its use.

I mean, I am in agreement with you that it's a shitty thing to do and is anti-consumer. I just don't think it is technically illegal.
 

sfsuphysics

[H]F Junkie
Joined
Jan 14, 2007
Messages
14,938
And just 5 years ago, Apple wouldn't even unlock the iPhone of a known terrorist. What changed?

View attachment 386203

https://techcrunch.com/2016/03/21/a...-we-will-not-shrink-from-this-responsibility/
What changed? Well, they had a potential ad bonanza with that: thousands of "we protect your privacy" type billboards across the nation... then they decided to change their stance, and those billboards are gone, replaced with cute pictures of kittens and puppies wearing clothes... all shot on an iPhone, of course.

Or someone higher up had their child's or grandchildren's pictures put on the internet or something, and they have the power to make a change.
 

SmokeRngs

[H]ard|DCer of the Month - April 2008
Joined
Aug 9, 2001
Messages
17,403
That's not how that works at all. Everything is legal until a law is passed forbidding it or restricting its use.

I mean, I am in agreement with you that it's a shitty thing to do and is anti-consumer. I just don't think it is technically illegal.
So how often are private companies allowed to break into your home and search through it? How often are individuals allowed to do this? How legal is it for anyone to do that?

The illegality is already established, and it's the reason why, before this, no one was allowed to search phones and computers, only what was uploaded to their own servers.

And it does work this way. Show me a case where companies were allowed to search private property in order to look for evidence of unlawful activities and turn it over to law enforcement. You're not going to find one where it was ruled legal although I wouldn't be surprised if you found some where it was ruled illegal.
 

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
6,533
So how often are private companies allowed to break into your home and search through it? How often are individuals allowed to do this? How legal is it for anyone to do that?

The illegality is already established, and it's the reason why, before this, no one was allowed to search phones and computers, only what was uploaded to their own servers.

And it does work this way. Show me a case where companies were allowed to search private property in order to look for evidence of unlawful activities and turn it over to law enforcement. You're not going to find one where it was ruled legal although I wouldn't be surprised if you found some where it was ruled illegal.
I would say this will most likely go to court. By accepting the user agreement, you accepted the conditions of use for the software: while you own the hardware, they own the software; you own the content but gave permission. It's like inviting someone into your home and telling them to go through your desk, and they find something. Basically, you said it was OK for Apple to do this. The key difference I see is the unending time period this is in effect for, so it is not like a one-time event. Plus, if you loan your phone out, that person did not give permission. Until privacy laws are made to protect individuals from this, "user beware" may be the best defense. The problem with inviting or accepting people in is that they can also plant stuff.
 
Last edited:

travm

[H]ard|Gawd
Joined
Feb 26, 2016
Messages
1,364
Apple isn't a law enforcement/government agency so those do not apply. The bill of rights (in theory anyway) spells out the citizens rights in relation to the government. Hence, Facebook/Google/Apple/Twitter can essentially silence anyone they deem inappropriate as they are private companies. You agree to their terms of service upon using their products.
But when they pass the info on to law enforcement, who then use that information to make an arrest... this definitely blurs the lines. It's almost like cops hiring mobsters to conduct interrogations because they aren't allowed to waterboard suspects.

Mind you, if someone can be proven to have knowingly stored or shared images of child abuse of this nature, I would support waterboarding with no interrogation. It's the nature of evidence, science, and how we do things as a people that makes me wonder if this is the right way to do it.
 

kirbyrj

Fully [H]
Joined
Feb 1, 2005
Messages
28,824
So how often are private companies allowed to break into your home and search through it? How often are individuals allowed to do this? How legal is it for anyone to do that?

The illegality is already established and it's the reason why before this no one was allowed to search phones and computers but only what was uploaded to their servers.

And it does work this way. Show me a case where companies were allowed to search private property in order to look for evidence of unlawful activities and turn it over to law enforcement. You're not going to find one where it was ruled legal although I wouldn't be surprised if you found some where it was ruled illegal.

And you accused me of creating a straw man earlier? Nobody is breaking into your home and searching through it. But you give Apple broad permissions when you click "Accept" to the TOS.

https://www.apple.com/legal/privacy/pdfs/apple-privacy-policy-en-ww.pdf

Have you ever actually read that? They collect usage information from your phone all the time. Here are a few reminders...

Personal Data Apple Collects from You
• Usage Data. Data about your activity on and use of our offerings, such as app launches within our services, including browsing history; search history; product interaction; crash data, performance and other diagnostic data; and other usage data

Apple’s Use of Personal Data

• Comply with Law. To comply with applicable law — for example, to satisfy tax or reporting obligations, or to comply with a lawful governmental request.
(Note how that is not an all-inclusive list, just a few examples.)

Apple’s Sharing of Personal Data
Others. Apple may share personal data with others at your direction or with your consent, such as when we share information with your carrier to activate your account. We may also disclose information about you if we determine that for purposes of national security, law enforcement, or other issues of public importance, disclosure is necessary or appropriate. We may also disclose information about you where there is a lawful basis for doing so, if we determine that disclosure is reasonably necessary to enforce our terms and conditions or to protect our operations or users, or in the event of a reorganization, merger, or sale.
 