Apple will remotely install software to scan all US phones for child sex abuse images

So if you own the device and own a license to the software, wouldn't that make it illegal to search through the contents without a warrant? Also, wouldn't that render all evidence stemming from Apple's findings unusable in court?

Constitutional amendments only apply to the government, not private entities, unless it can be seen that the private entities are working as an agent of the government.

See: https://www.eff.org/deeplinks/2017/...urth-Amendment-Safeguards-by-using-Geek-Squad
 
I've had the same flip phone since 2008 (maybe '09), so it's pretty safe to say I'm not too concerned with the nonsense they put on those bricks you guys call "smart" phones. But this seems to me like a private company making a decision about the products/services it offers, and anyone who objects to its practices is free to move along. Apple is very much into letting you buy something (at an inflated price) and then dictating how you use it, so it shouldn't be all that surprising that they're forcing something else on their customers. Though, how effective is this really going to be? Wouldn't any interested party be able to simply put the images in an archive and send that instead of the raw images?
 
Constitutional amendments only apply to the government, not private entities, unless it can be seen that the private entities are working as an agent of the government.

See: https://www.eff.org/deeplinks/2017/...urth-Amendment-Safeguards-by-using-Geek-Squad
I would think that instead of the Constitution, the Electronic Communications Privacy Act of 1986 would apply, which pretty much states:
The ECPA, as amended, protects wire, oral, and electronic communications while those communications are being made, are in transit, and when they are stored on computers. The Act applies to email, telephone conversations, and data stored electronically.
This applies to not only government entities but also third parties.

Although I wouldn't be surprised if they EULA'd you into giving consent to this, which in itself should be criminal, since you can't use their stuff without allowing them to illegally scan your data.
 
I would think that instead of the Constitution, the Electronic Communications Privacy Act of 1986 would apply, which pretty much states:
This applies to not only government entities but also third parties.

Although I wouldn't be surprised if they EULA'd you into giving consent to this, which in itself should be criminal, since you can't use their stuff without allowing them to illegally scan your data.

You are storing your images on their server. They (just like Google and Dropbox already do) have the right to scan those images before or after they are stored on their property.

Really the only difference between Apple and [Google && Dropbox] is that Apple is performing this scan on your physical device, out of necessity, because your files are encrypted before they leave your device.

If you do not want your images scanned for CSAM don't use their cloud service.
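The scan-before-encryption ordering described above can be sketched as a toy pipeline. This is illustrative only: Apple's actual system uses a perceptual NeuralHash plus a private set intersection protocol, not plain SHA-256 against a local plaintext set, and the function names here are made up. The point is simply that the hash must be computed on-device, while the plaintext still exists, because only ciphertext ever reaches the server.

```python
import hashlib

# Illustrative database of known-bad image hashes (made-up values)
KNOWN_BAD = {hashlib.sha256(b"known-bad-image").hexdigest()}

def upload_photo(image: bytes, key: bytes) -> tuple[bytes, bool]:
    # 1. Hash on-device, while the plaintext is still available.
    matched = hashlib.sha256(image).hexdigest() in KNOWN_BAD
    # 2. Encrypt client-side (toy XOR keystream as a stand-in for real crypto).
    stream = hashlib.sha256(key).digest() * (len(image) // 32 + 1)
    ciphertext = bytes(a ^ b for a, b in zip(image, stream))
    # 3. Only the ciphertext leaves the device, so the server alone could
    #    never have computed the hash itself.
    return ciphertext, matched
```

In other words, the server-side scanning Google and Dropbox do is impossible once the upload is client-side encrypted, which is why the check moves onto the phone.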
 
I would think that instead of the Constitution, the Electronic Communications Privacy Act of 1986 would apply, which pretty much states:

This applies to not only government entities but also third parties.

Although I wouldn't be surprised if they EULA'd you into giving consent to this, which in itself should be criminal, since you can't use their stuff without allowing them to illegally scan your data.

As people keep explaining... unless Apple is acting on behalf of the government, and there's no indication it is, this is legal. Apple and the child safety groups are the ones deciding when and how they report content, not officials.

Think about the implications of what you said if it was actually true — every social network would be forced to let you post any content you want without screening it, since it's "your data." Now, if the government ordered Apple or those social networks to scan your storage or intercept the data before it reached the public, that'd be another matter... but it's not.
 
As people keep explaining... unless Apple is acting on behalf of the government, and there's no indication it is, this is legal. Apple and the child safety groups are the ones deciding when and how they report content, not officials.

Think about the implications of what you said if it was actually true — every social network would be forced to let you post any content you want without screening it, since it's "your data." Now, if the government ordered Apple or those social networks to scan your storage or intercept the data before it reached the public, that'd be another matter... but it's not.
As has been explained to you multiple times, the only reason the scanning was legal in the first place is because the scanning was taking place on Apple servers. Apple will no longer be scanning their own property but instead someone else's property and thus it's not legal.
 
As has been explained to you multiple times, the only reason the scanning was legal in the first place is because the scanning was taking place on Apple servers. Apple will no longer be scanning their own property but instead someone else's property and thus it's not legal.

Except it's not illegal, as has already been explained multiple times. You are equating electronic access you agree to in a private company's terms of service with the government breaking and entering a home without a warrant.

You don't have to look far; I quoted Apple's true "privacy" policy earlier.

And you accused me of creating a straw man earlier? Nobody is breaking into your home and searching through it. But you give Apple broad permissions when you click "Accept" to the TOS.

https://www.apple.com/legal/privacy/pdfs/apple-privacy-policy-en-ww.pdf

Have you ever actually read that? They collect usage information from your phone all the time. Here are a few reminders...

Personal Data Apple Collects from You
• Usage Data. Data about your activity on and use of our offerings, such as app launches within our services, including browsing history; search history; product interaction; crash data, performance and other diagnostic data; and other usage data

Apple’s Use of Personal Data

• Comply with Law. To comply with applicable law — for example, to satisfy tax or reporting obligations, or to comply with a lawful governmental request.
(Note how that is not an all-inclusive list, just a few examples.)

Apple’s Sharing of Personal Data
Others. Apple may share personal data with others at your direction or with your consent, such as when we share information with your carrier to activate your account. We may also disclose information about you if we determine that for purposes of national security, law enforcement, or other issues of public importance, disclosure is necessary or appropriate. We may also disclose information about you where there is a lawful basis for doing so, if we determine that disclosure is reasonably necessary to enforce our terms and conditions or to protect our operations or users, or in the event of a reorganization, merger, or sale.

 
Modern Apple is a real shit company: a corporate behemoth that knows how to maximize profits. A consumer-friendly company? Far from it.
 
Modern Apple is a real shit company: a corporate behemoth that knows how to maximize profits. A consumer-friendly company? Far from it.

No matter how many rainbow flags and black fists a company puts up on its Twitter page, its only purpose is to generate revenue for its shareholders. Don't believe the hype.
 
As people keep explaining... unless Apple is acting on behalf of the government, and there's no indication it is, this is legal. Apple and the child safety groups are the ones deciding when and how they report content, not officials.

Think about the implications of what you said if it was actually true — every social network would be forced to let you post any content you want without screening it, since it's "your data." Now, if the government ordered Apple or those social networks to scan your storage or intercept the data before it reached the public, that'd be another matter... but it's not.
These groups have taken control of corporate America, telling it what is acceptable. The same happened when Congress allowed corporations to have a say in jail sentences, and mass incarceration followed. A group's main focus is its objective; how it affects others isn't its problem. I have a lot of firearm photos on my phone (non-Apple), but what if they start scanning for "ghost guns," high-capacity magazines, or guns not being shot or stored in a "safe area," in order to red-flag me? They'll eventually want to scan for anything that offends the groups they pander to.

PayPal has already denied any payment related to firearms. Affirm doesn't allow it; payment processors like Square, Stripe, etc., same thing. The list is vast. The tech community knows it holds the keys to the digital economy.
Preventing digital crime always seems like the right thing to do at first, but eventually it will be used for evil. Everyone is fighting for data, and scanning people's phones and logging their locations are the most valuable sources.
 
Except it's not illegal, as has already been explained multiple times. You are equating electronic access you agree to in a private company's terms of service with the government breaking and entering a home without a warrant.

You don't have to look far; I quoted Apple's true "privacy" policy earlier.



Except it is illegal. Go ahead and look at any and every case related to companies scanning data you put on their servers. The only reason it's considered legal is because the data is already on the company's servers. Is there a reason you are incapable of processing that? They can only scan what is on their own servers. I will repeat it for you: they can only scan what is on their own servers.

I will also repeat something else you don't seem to understand. Apple does not own the phone once it is purchased by someone. It is not their hardware, and therefore they cannot legally scan it. If Apple can no longer scan the data on their own servers, that's too bad for Apple. I guess you don't realize that Apple already has data on their servers that they can't scan. All cloud providers have data they can't scan because it has already been encrypted or locked in some way. Why would anything be different now, when this has already been taking place?

There have already been court cases where TOSes and EULAs and whatnot have been found to be not legally enforceable. It's especially so when they infringe on rights. But according to you, it's okay for Apple employees to do anything illegal as long as they put it in the EULA first since according to you it's no longer illegal. You really should stop beating the dead horse of the all powerful EULA.
 
Apple does not own the phone once it is purchased by someone. It is not their hardware and therefore they cannot legally scan it. If Apple can no longer scan the data on their own servers, that's too bad for Apple.

The images on your phone are only being scanned if you store them on Apple's cloud servers. If you do not want Apple scanning the images on your device, don't store them on Apple's cloud servers. This is entirely your choice.

Storing your images on Apple's cloud server comes with some rules you have to follow, if you do not like those rules then don't store your images on Apple's cloud server.
 
Modern Apple is a real shit company: a corporate behemoth that knows how to maximize profits. A consumer-friendly company? Far from it.
So bad that its employees created a website to share their own grievances about the company.
 
The images on your phone are only being scanned if you store them on Apple's cloud servers. If you do not want Apple scanning the images on your device, don't store them on Apple's cloud servers. This is entirely your choice.

Storing your images on Apple's cloud server comes with some rules you have to follow, if you do not like those rules then don't store your images on Apple's cloud server.
If Apple wants to scan everything on their servers they can decide not to implement encryption. It's that simple. There's no reason for them to illegally search someone else's property and it's not Apple's choice.
 
Except it is illegal. Go ahead and look at any and every case related to companies scanning data you put on their servers. The only reason it's considered legal is because the data is already on the company's servers. Is there a reason you are incapable of processing that? They can only scan what is on their own servers. I will repeat it for you: they can only scan what is on their own servers.

I will also repeat something else you don't seem to understand. Apple does not own the phone once it is purchased by someone. It is not their hardware, and therefore they cannot legally scan it. If Apple can no longer scan the data on their own servers, that's too bad for Apple. I guess you don't realize that Apple already has data on their servers that they can't scan. All cloud providers have data they can't scan because it has already been encrypted or locked in some way. Why would anything be different now, when this has already been taking place?

There have already been court cases where TOSes and EULAs and whatnot have been found to be not legally enforceable. It's especially so when they infringe on rights. But according to you, it's okay for Apple employees to do anything illegal as long as they put it in the EULA first since according to you it's no longer illegal. You really should stop beating the dead horse of the all powerful EULA.

You are talking about two different things with your TOS statements. In most cases that I can think of where you have an unenforceable TOS, you have the parent company removing goods or services from a customer due to a customer violation.

This case is totally different. This is a disagreement of whether or not the TOS grants access or the data collected is within the right of the parent company to collect due to an acceptance of a TOS. To my knowledge, this hasn't been litigated.

I mean some aspects of the TOS have to be enforceable or else why bother even writing one?
 
You are talking about two different things with your TOS statements. In most cases that I can think of where you have an unenforceable TOS, you have the parent company removing goods or services from a customer due to a customer violation.

This case is totally different. This is a disagreement of whether or not the TOS grants access or the data collected is within the right of the parent company to collect due to an acceptance of a TOS. To my knowledge, this hasn't been litigated.

I mean some aspects of the TOS have to be enforceable or else why bother even writing one?
Because people are generally too stupid or too poor to litigate it so they don't fight back. It's that simple.
 
Because people are generally too stupid or too poor to litigate it so they don't fight back. It's that simple.

Fair enough, but my point is that if nobody fights back, the terms of the TOS are going to dictate that Apple does what they want, and it isn't as simple as citing the Bill of Rights because they are a private company.

What ground would they have to litigate anyway? Nobody (except for paedophiles) is technically damaged by this policy that was accepted by the TOS. In the prior TOS examples you had customers losing accounts, paid content, etc where they suffered damages. I could see a potential lawsuit dismissed out of hand citing the plaintiffs having no standing.
 
You are storing your images on their server. They (just like Google and Dropbox already do) have the right to scan those images before or after they are stored on their property.

Really the only difference between Apple and [Google && Dropbox] is that Apple is performing this scan on your physical device, out of necessity, because your files are encrypted before they leave your device.

If you do not want your images scanned for CSAM don't use their cloud service.
Sounds like they're still scanning while it's in your possession; regardless of the reason why, that should not be legal. That said, I'm not sure the ECPA requires that you own the hardware the storage is on. It's literally tantamount to wiretapping: you could argue that with traditional wiretapping they're not tapping your phone, they're tapping the central phone office where the call goes, and that still isn't legal.

As people keep explaining... unless Apple is acting on behalf of the government, and there's no indication it is, this is legal. Apple and the child safety groups are the ones deciding when and how they report content, not officials.

Think about the implications of what you said if it was actually true — every social network would be forced to let you post any content you want without screening it, since it's "your data." Now, if the government ordered Apple or those social networks to scan your storage or intercept the data before it reached the public, that'd be another matter... but it's not.
Again, the ECPA is not strictly limited to the government; it's illegal for ANY third party to "wiretap." In fact, if anything, the government can do it more freely because the Patriot Act, among other laws, gives the government more power (aka immunity to lawsuits).
 
Sounds like they're still scanning while it's in your possession; regardless of the reason why, that should not be legal. That said, I'm not sure the ECPA requires that you own the hardware the storage is on. It's literally tantamount to wiretapping: you could argue that with traditional wiretapping they're not tapping your phone, they're tapping the central phone office where the call goes, and that still isn't legal.


Again, the ECPA is not strictly limited to the government; it's illegal for ANY third party to "wiretap." In fact, if anything, the government can do it more freely because the Patriot Act, among other laws, gives the government more power (aka immunity to lawsuits).

But it's not wiretapping so wiretapping laws don't apply. I don't mean to sound pedantic, but there is a fundamental difference between listening in or recording a conversation between two parties and checking hashes on phones against a database of known child porn images.

Using Siri (or any connected device with a microphone) is closer to wiretapping than this is.
 
Fair enough, but my point is that if nobody fights back, the terms of the TOS are going to dictate that Apple does what they want, and it isn't as simple as citing the Bill of Rights because they are a private company.

What ground would they have to litigate anyway? Nobody (except for paedophiles) is technically damaged by this policy that was accepted by the TOS. In the prior TOS examples you had customers losing accounts, paid content, etc where they suffered damages. I could see a potential lawsuit dismissed out of hand citing the plaintiffs having no standing.
Doesn't hurt anyone? Let me guess: you're one of those people who say that if you've done nothing wrong, you have nothing to hide. Everyone is harmed by this. It's called illegal spying, and it has nothing to do with child porn. If something is encrypted and stored on Apple's hardware, Apple cannot get in trouble for it. And getting in trouble for something stored on their hardware is the stated reason for scanning it in the first place.

This is nothing more than an increased invasion of privacy and illegal to boot. Everyone is harmed by this, including you even if you don't realize it.

And I'll repeat it yet again since you don't seem to understand the concept. Putting something in some sort of terms of service does not mean you can do something illegal. It's as simple as that. Unless you think it's okay for Apple employees to rape the mothers of anyone who uses their phones after they put that in the TOS. I assure you, it's still illegal for any Apple employee to rape anyone even if you agreed to allow it in the TOS.
 
But it's not wiretapping so wiretapping laws don't apply. I don't mean to sound pedantic, but there is a fundamental difference between listening in or recording a conversation between two parties and checking hashes on phones against a database of known child porn images.

Using Siri (or any connected device with a microphone) is closer to wiretapping than this is.
And you still don't understand the law. It is wiretapping. It's an electronic communication, and it's being recorded. If it's not recorded, it cannot be checked or turned over to the authorities.
 
Doesn't hurt anyone? Let me guess: you're one of those people who say that if you've done nothing wrong, you have nothing to hide. Everyone is harmed by this. It's called illegal spying, and it has nothing to do with child porn. If something is encrypted and stored on Apple's hardware, Apple cannot get in trouble for it. And getting in trouble for something stored on their hardware is the stated reason for scanning it in the first place.

This is nothing more than an increased invasion of privacy and illegal to boot. Everyone is harmed by this, including you even if you don't realize it.

And I'll repeat it yet again since you don't seem to understand the concept. Putting something in some sort of terms of service does not mean you can do something illegal. It's as simple as that. Unless you think it's okay for Apple employees to rape the mothers of anyone who uses their phones after they put that in the TOS. I assure you, it's still illegal for any Apple employee to rape anyone even if you agreed to allow it in the TOS.

It's funny you say that because I don't think YOU understand the concept. I don't know what to tell you. We agree in principle, but not on the specifics. I don't think anything you said would stand up in an actual court of law, especially when you throw around stupid examples like an "Apple employee to rape anyone even if you agreed to allow it in the TOS."

You don't have standing to claim damages when you can't point to any damages. "Invasion of privacy" isn't going to cut it when you agreed to the TOS. Any use of a smartphone is an invasion of privacy, as I already pointed out. How is a court going to decide when you are OK with sharing personal information with Apple and when you're not? Apple is checking your phone all the time to provide you with all kinds of services.
 
And you still don't understand the law. It is wiretapping. It's an electronic communication, and it's being recorded. If it's not recorded, it cannot be checked or turned over to the authorities.

Don't quit your day job and become a police officer.
 
Sounds like they're still scanning while it's in your possession; regardless of the reason why, that should not be legal. That said, I'm not sure the ECPA requires that you own the hardware the storage is on. It's literally tantamount to wiretapping: you could argue that with traditional wiretapping they're not tapping your phone, they're tapping the central phone office where the call goes, and that still isn't legal.

Yes, they are scanning the images while they are still on your phone, because you agree to this as the terms of storing your images on their cloud server. It is entirely voluntary on your part.

Is it legal for me to come into your house and install cameras so that I can have a computer AI watch you day and night to make sure you aren't up to no good? What if you agree to let me do that in exchange for giving you access to my 100-foot yacht full of pre-med college chicks who are enamored with guys with physics degrees?
 
These groups have taken control of corporate America, telling it what is acceptable. The same happened when Congress allowed corporations to have a say in jail sentences, and mass incarceration followed. A group's main focus is its objective; how it affects others isn't its problem. I have a lot of firearm photos on my phone (non-Apple), but what if they start scanning for "ghost guns," high-capacity magazines, or guns not being shot or stored in a "safe area," in order to red-flag me? They'll eventually want to scan for anything that offends the groups they pander to.

PayPal has already denied any payment related to firearms. Affirm doesn't allow it; payment processors like Square, Stripe, etc., same thing. The list is vast. The tech community knows it holds the keys to the digital economy.
Preventing digital crime always seems like the right thing to do at first, but eventually it will be used for evil. Everyone is fighting for data, and scanning people's phones and logging their locations are the most valuable sources.
"What ifs" aren't really a valid argument here, though. Apple is looking for hashes of known CSAM, with a high threshold to prevent false positives, a limitation to cloud uploads, and a human review process. It's not a generic image recognition system; it's not scanning strictly offline photos; it's not based on what offends the groups, it's material that's explicitly, unambiguously illegal. There's still a large chasm between what you fear Apple might do one day and what it's doing now.

I'll agree companies can have an outsized influence on things, but I'd say Apple's App Store policies have more of an effect than this will.
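The threshold safeguard described above can be sketched as a toy escalation rule. This is for intuition only: Apple's published design used threshold secret sharing, so that matches below the threshold cannot even be decrypted server-side, and the review flow here is simplified; the function name and flag-list representation are made up.

```python
# Apple publicly described an initial threshold of around 30 matches
REVIEW_THRESHOLD = 30

def should_escalate(match_flags: list[bool], threshold: int = REVIEW_THRESHOLD) -> bool:
    # Individual matches do nothing on their own; only when the number of
    # database matches among uploaded photos crosses the threshold is the
    # account queued for human review.
    return sum(match_flags) >= threshold
```

The point of the high threshold is that even a rare false positive on a single photo never flags an account by itself.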
 
This has been an interesting discussion both here and in the tech press. For me, this functions more as a barometer of who has their eye on the future and who gets mired down in the details.

Looking to that future, it is clear that Apple and other tech giants view their cloud services and their operating systems as one and the same. If anything, what consumers actually buy is the physical parts; sure, you can say I own a rectangle of plastic, aluminum, and silicon, but that does not an iPhone make. I hope that eventually our legislative bodies get their collective shit together and put down some sort of consumer protections, but I'm realistic about that outcome, within my lifetime at least.

Story time coming up for you TL;DR'ers: I was having a conversation about this with a good long-term friend who is a corpo lawyer. Now, they are not as tech savvy as, say, the members of a tech forum, but they are pretty savvy when it comes to how power and perception shape the bureaucracy of life. A few gems stood out which I feel most folks lose the plot on: if you are one of those common-sense golden-rule types who shout "but, but it's illegal," are of average means, and reach a courtroom, then you have probably already lost. The law doesn't care about right/wrong, fact/fiction, good/evil, etc.; it only cares about power and perception. Who has that power, and to do what with it, changes over time, but in 99% of the cases that don't make the nightly news, that sentiment remains the same. Then, just as casually, they smiled and said, "We laugh at those people that think the law works like on TV; in fact, we love the law-and-order types. It means that we can put in the least amount of effort possible and still win." Then, leaning in as if to drop a knowledge bomb that was akin to a secret handshake: "The truth is, the law is what we say it is." Let that sink in.

Have all the debates you want, like in Kafka's The Trial, but the reality is that most of the general tech-using but not tech-savvy folks think they are the judge/priest (or Loki, to use modern parlance) when they really are K. (some rando variant office drone) being shuffled along the mortal coil for an unknown crime by a faceless bureaucracy.
 
https://arstechnica.com/tech-policy...e-photo-scanning-plans-to-address-criticisms/

Shockingly, instead of digging in their heels and doubling down on the rhetoric, Apple has said they’ll delay while they think about ways to improve the privacy of their super awesome privacy protecting make-your-own-phone-snitch-on-you-we-promise-governments-won’t-abuse-it system.

Unless they just decide to cancel the plan altogether, or just do what everyone else does and store photos unencrypted and perform the scans on their own servers, I'm not sure any changes they make could be better.
 
https://arstechnica.com/tech-policy...e-photo-scanning-plans-to-address-criticisms/

Shockingly, instead of digging in their heels and doubling down on the rhetoric, Apple has said they’ll delay while they think about ways to improve the privacy of their super awesome privacy protecting make-your-own-phone-snitch-on-you-we-promise-governments-won’t-abuse-it system.

Unless they just decide to cancel the plan altogether, or just do what everyone else does and store photos unencrypted and perform the scans on their own servers, I'm not sure any changes they make could be better.
Apple already scanned photos on its own servers, as I recall, like Google, Facebook and others. The difference was scanning incoming photos on-device.

At any rate, we should be glad Apple is at least thinking carefully about this. While I wasn't as outraged by Apple's initial approach as some, the company does need to recognize that it can't treat CSAM scanning like it does a big software update... that is, it's not something it can design internally and keep largely hush-hush. This requires more collaboration and consultation.
 