Apple will remotely install software to scan all US phones for child sex abuse images

pendragon1

Extremely [H]
Joined
Oct 7, 2000
Messages
32,838
huh. i thought they were all aboot user privacy. not that i support cp/pedos, but if they won't unlock a phone for a murder investigation...
"Security experts, however, have warned that such move could "open the floodgates to extensive surveillance." guaranteed
 

techie81

[H]ard for [H]ardware
Joined
Jan 12, 2005
Messages
5,514
To the crowd that says "if you don't have anything to hide": just wait until you are falsely accused of something due to a technical glitch, or something from your phone is used against you in a completely unrelated accusation.

We are living in dangerous times.
 

sc5mu93

Gawd
Joined
Jul 11, 2018
Messages
660
in my head, i see apple senior execs in a meeting and this is the convo:
"we need to build in a backdoor framework to enable more penetration into the Chinese market."
"but that will not go over well in the west. any suggestions on how to sell this?"


"oh yeah - who will argue with stopping pedos?!!"

Tim Apple: "done!"
 

Nobu

Supreme [H]ardness
Joined
Jun 7, 2007
Messages
6,700
That's the tricky part: even if Apple doesn't go any farther with it, what's to stop a 3rd party from finding a way to hijack it and use it?
My biggest issue with it is Apple is NOT a policing organization. That isn't to say they cannot do this, but I don't want THEM doing it, because there are no laws to protect us from any abuse of their privilege.
 

DeadlyAura

Supreme [H]ardness
Joined
Jun 6, 2005
Messages
4,325
I stole this from the Reddit thread:
"The year is 2030, a reporter reports on evil behavior of Apple suppressing LGBT people in foreign countries. Soon after the FBI arrests the reporter, with claims from apple that "Our automated systems indicated they had child abuse images". Reporter gets dogged on by the rest of the media paid by Apple for being a pedo.

The U.S. government secret court orders (Patriot Act) Apple to add images to their database and report them to the U.S. government. Images like pictures of guns, (Real or not, can't tell the difference), political campaign images, screenshots of text conversations, etc.

The CCP orders Apple to add images like political posters, memes, and Tiananmen Square images, so they can black bag and "disappear" citizens.

Saudi Arabia (Big apple customer and investor) orders Apple to do the same as above. Remember the Saudis tried to get at Jeff Bezos, he had a big article he wrote about it.

If you're a person of any power this is a show stopper. This will be used against you.

Apple censors LGBT content, censors search, and complies with governments across the world in their inhumanitarian acts. You think they won't do it here? Apple is a company where only dollar signs matter. They have *no* morals. If we did still have some decent morality we'd be regulating this company into the ground. Round up Tim Cook and see what kind of images he has on his devices, you'd be surprised."

The argument is that, Apple "does not have access to the raw images, only the image hashes" and that they are using the hashes to compare against a database of known abuse images.

MY argument is that, this is bullshit. It literally means that Apple DOES have access to the data on your phone, they're only choosing to use the hashes currently and NOT the raw images, and that the raw images will only be accessed if they get a match. This, however, means that they DO have access to the raw images, and can access them at their leisure. Sure, maybe right now it doesn't matter. But what will it take for an employee to decide to do something bad? What will it take for Apple to start putting the images on your phone? Maybe you're an outspoken political activist or politician that Apple doesn't like? If they already have access to your device.... what's stopping them from putting the images there?

Or, like the quote above... Right now, they are "looking for child abuse images." What's stopping them from looking for political images? What's stopping them from looking for whatever they want in order to demonize you, or imprison you, or silence you?

This is a horrible, horrible idea that can ONLY end badly.

Let the police, DHS, FBI, etc. do their damn jobs. The tech companies need to stop overstepping their bounds.
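The hash-matching scheme being argued over above boils down to something like this sketch (all hash values here are made-up placeholders, and Apple's system reportedly uses a perceptual hash, NeuralHash, so real matching is fuzzier than this exact lookup):

```python
import hashlib

# Hypothetical database of known-bad image hashes (placeholder values only).
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of(data: bytes) -> str:
    """Cryptographic hash of the raw file bytes."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    """Flag an image only if its exact hash appears in the database."""
    return sha256_of(image_bytes) in KNOWN_BAD_HASHES
```

The point of contention stands either way: to compute a hash at all, the code has to read the raw image bytes, so "we only see hashes" still implies access to the content.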
 

pendragon1

Extremely [H]
Joined
Oct 7, 2000
Messages
32,838
DeadlyAura said:
"This is a horrible, horrible idea that can ONLY end badly. Let the police, DHS, FBI, etc. do their damn jobs. The tech companies need to stop overstepping their bounds."
yup
""The software, reportedly called neuralMatch, is designed to look through images that have been stored on iPhones and uploaded to iCloud storage," Schlitz wrote. "According to the Financial Times, if the software detects child sexual abuse in a photo, it will then pass the material on to human reviewers who will alert law enforcement if they think the images are illegal.""
that doesn't sound like just looking at hashes to me...
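Worth noting: a "hash" here is not like a password hash. A perceptual hash is derived from the picture's visual content, so near-duplicates still match. Below is a toy average-hash (aHash) sketch, purely to illustrate the general idea; this is not NeuralMatch:

```python
# Toy perceptual hash: average hash (aHash) of a grayscale image
# given as a nested list of 0-255 pixel values.

def average_hash(pixels):
    """One bit per pixel: is it brighter than the image's mean?"""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits; small distance = visually similar."""
    return sum(x != y for x, y in zip(a, b))

img = [[10, 10, 200, 200],
       [10, 10, 200, 200]]
# A slightly brightened copy (e.g. after recompression) hashes identically.
brighter = [[p + 5 for p in row] for row in img]
assert hamming(average_hash(img), average_hash(brighter)) == 0

# A visually different image lands at a nonzero distance.
different = [[200, 10, 10, 200],
             [10, 200, 200, 10]]
assert hamming(average_hash(img), average_hash(different)) > 0
```

That robustness to re-encoding is exactly why a perceptual-hash match can then be escalated to human review of the underlying photo, as the quoted article describes.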
 

HAL_404

[H]ard|Gawd
Joined
Dec 16, 2018
Messages
1,221
boy I'm sure glad they upgraded the cameras mounted on the highway overpass bridge last month in my town of 6000 or so people ... don't laugh, we have one 4-way intersection, a McDonalds and a Super Walmart, yet there are cameras watching. Almost everything is on camera today. It starts out looking like a good thing that will keep us SAFE and SECURE from evil people, but everyone knows what will come next. You do know, right? But I digress ...

I don't use anything Apple anymore after my glorious experiences with iTunes ... movies at $20 a pop, but no surround sound for them if you play them on a Windows PC
 

HAL_404

[H]ard|Gawd
Joined
Dec 16, 2018
Messages
1,221
It's things like this that make me laugh. We have security cameras in all big stores like Walmart and in big cities, and most people give out their life story on Facebook and so on. But the second their tech does something like this you have folks getting angry over it.

yea, that guy George Orwell ... look at all the trouble he's causing us today. Things wouldn't be like they are had he not written 1984 :rolleyes:
 

jmilcher

Supreme [H]ardness
Joined
Feb 3, 2008
Messages
5,284
“ The database of processed CSAM images will be distributed in the operating system (OS), the processed images transformed so that users cannot see what the image is, and matching done on those transformed images using private set intersection where the device will not know whether a match has been found. This means that when the features are rolled out, a version of the NCMEC CSAM database will be uploaded onto every single iPhone. “

That’s a quote from the EFF article.


Holy shit. They are going to be keeping hashes of those images locally on every iPhone.
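The "private set intersection" in that EFF quote is a real cryptographic technique. Here is a classroom sketch of one PSI flavor (Diffie-Hellman-style commutative blinding); it illustrates the concept only and is not Apple's actual protocol, which adds threshold secret sharing and hides the match result from the device:

```python
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime; all arithmetic is mod P

def to_group(item: bytes) -> int:
    """Hash an item into the multiplicative group mod P."""
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % P or 1

def blind(element: int, secret: int) -> int:
    """Commutative blinding: blind(blind(x, a), b) == blind(blind(x, b), a)."""
    return pow(element, secret, P)

# Each side keeps a private exponent; neither ever sees the other's raw set.
a = secrets.randbelow(P - 2) + 1  # "server" side (the hash database)
b = secrets.randbelow(P - 2) + 1  # "client" side (the device's photo hashes)

server_set = {b"bad1", b"bad2"}  # known-bad hashes (placeholders)
client_set = {b"cat", b"bad2"}   # the device's photo hashes (placeholders)

# Server publishes its set blinded under `a` (opaque to the client).
server_blinded = {blind(to_group(x), a) for x in server_set}
# Client's items, blinded under `b`, then re-blinded by the server under `a`.
double_client = {blind(blind(to_group(x), b), a) for x in client_set}
# Client re-blinds the server's published set under `b`.
double_server = {blind(x, b) for x in server_blinded}

# A common element appears identically in both doubly-blinded sets,
# because (h^a)^b == (h^b)^a (mod P).
matches = double_client & double_server
assert len(matches) == 1  # only b"bad2" is in both sets
```

Per the EFF quote, Apple's design goes further: the device itself cannot tell whether a match was found, and only the server side learns anything, reportedly only past a threshold number of matches.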
 
Last edited:

jmilcher

Supreme [H]ardness
Joined
Feb 3, 2008
Messages
5,284
No, just the hashes of the images. If the images were there Apple would be in very big trouble.
True. Still doesn’t seem like a very efficient system, and def not a good system for catching new images and offenders. I’m all for protecting children, but like many people I’m also all for having basic privacy.

I suspect Apple may lose some customers if they do not do an about face here.
 

1_rick

[H]ard|Gawd
Joined
Feb 7, 2017
Messages
1,989
jmilcher said:
"Holy shit. They are going to be keeping those images locally on every iPhone."
No, it's a database of hashes. Putting the actual images on your phone would be, at a minimum, entrapment. Not to mention it would literally be them actually trafficking in CP.
 

RPGWiZaRD

[H]ard|Gawd
Joined
Jan 24, 2009
Messages
1,199
I don't see how this will get past EU privacy legislation. In the US it's probably not a problem, given how strongly a company can dictate the rules (take it or leave it™ customer approach)
 

pendragon1

Extremely [H]
Joined
Oct 7, 2000
Messages
32,838
I don't see how this will get past EU privacy legislation. In the US it's probably not a problem, given how strongly a company can dictate the rules (take it or leave it™ customer approach)
"all US phones" - there's nothing aboot the EU here.
 

Lakados

Supreme [H]ardness
Joined
Feb 3, 2014
Messages
4,558
I am super conflicted about this, it does open up some scary possibilities if Apple later decides to abuse this tech.
But I mean they're pedos so, F 'em?
But also I'm in Canada so it doesn't apply, and I'm not a pedo, so it doesn't apply to me. But who's to say Apple doesn't later deploy something that does apply to me? Then again, I have a hard time imagining what I am doing that would be considered bad.
So really it comes down to scope and how that scope gets defined, but at least they are announcing that they are doing this and not just sneaking it in and only telling people after the fact, so yay for disclosure?
 

sc5mu93

Gawd
Joined
Jul 11, 2018
Messages
660
Lakados said:
"But I mean they're pedos so, F 'em? ... So really it comes down to scope and how that scope gets defined."
police have lots of tools to catch pedos.

I think Tim Apple is adding it to meet compliance with regulations in other markets. It's being tested here in the US, and they're trying to put "lipstick on a pig" by promoting the pedo use case.
 

scojer

Supreme [H]ardness
Joined
Jun 13, 2009
Messages
7,710
What's this? Something that Apple is doing before Android? I'm surprised!

It's a good idea on paper, but not in practice. As others have said, it's a slippery slope that will be used for nefarious reasons before we realize what's going on.
 

Zarathustra[H]

Extremely [H]
Joined
Oct 29, 2000
Messages
32,890
I heard this on the radio yesterday,

As much as I like to see pedophiles go to jail, this is a step too far, and should be illegal. Way too big brother.

The content of our devices needs to be considered private. If law enforcement wants to scan it, they should present a warrant.

Otherwise, where does it stop?
 

The Mad Atheist

[H]ard|Gawd
Joined
Mar 9, 2018
Messages
1,291
techie81 said:
"...just wait until you are falsely accused of something due to a technical glitch or something from your phone is used against you in a completely unrelated accusation."
Shh, you'll get cleared in a court of law years later, and the masses (what do they call the mouth breathers these days?) will gradually forgive you after the media posts your arrest everywhere.
 

Loose Nut

Limp Gawd
Joined
Oct 21, 2009
Messages
418
I say do it to all the phones, GoPros, ANY camera that has internet access.
Scan any and all servers in the US and in every country!
Home-based servers too.

And make damn sure you scan ALL phones held by any and all government officials.
Biden and his son first, just to make sure the program works.

Worried about your Privacy Rights ?
hehe, your Internet Privacy was gone BEFORE the first cat image was ever uploaded to the internet.
 

HAL_404

[H]ard|Gawd
Joined
Dec 16, 2018
Messages
1,221
"The Holy See is the name given to the government of the Roman Catholic Church, which is led by the pope as the bishop of Rome. ... Since 1929 it has resided in Vatican City, which was established as an independent state to enable the pope to exercise his universal authority."

That's a haven for pedophilia and those desiring to marry 13 year old children. And you thought evil was just in video games?
 

Mchart

Supreme [H]ardness
Joined
Aug 7, 2004
Messages
4,670
Why not snuff vids? Why not images showing torture, war crimes, blueprints for IEDs?

Because if you oppose this idea, you're a pedo.
Because snuff films are not a massive issue. Pedophiles and pederasts take and share photos like this constantly; it's a massive issue. Considering the way this is being done, via a hash database on iCloud, I honestly don't see the problem.
 

M76

[H]F Junkie
Joined
Jun 12, 2012
Messages
12,282
Indiscriminate surveillance with no probable cause. No thanks. I don't care what excuse they give. And then it gives a false positive and some actual creep goes through your pictures of your kids. Heck no.
 

Zarathustra[H]

Extremely [H]
Joined
Oct 29, 2000
Messages
32,890
Shh, but you'll get cleared in a court of law years later and the, what do they call the mouth breather masses these days, gradually forgive you after the media post your arrest everywhere.

Not to mention when you are bankrupted by having to spend money on attorneys.

People have this cute idea that if you are arrested you can just say "I can't afford an attorney" and your defense will be handled for free.

In reality:

1.) In most cases this means you get an overworked attorney who can't spend enough time on your case, so you will probably either be convicted or forced into a plea deal even if you haven't done anything;

2.) Many places actually make you reimburse them for your public defender; and

3.) Decent lawyers can cost a few hundred bucks an hour, and defending a case takes MANY hours.

Guilty or not, you are probably losing your freedom, your house, your reputation, and many other things.

The system is completely broken.
 
Last edited: