AI Will Save Cops From Trauma of Viewing Child Porn Images

DooKey

Met police in London will be using AI to scan for images of child abuse/porn on suspects' phones and devices within the next 2 to 3 years. They plan to move to this technology so their officers won't have to comb through the approximately 53K devices seized per year for evidence themselves. Once this technology is in action it will spare human officers the psychological trauma of having to wade through very disturbing images, as well as speed up the task. Excellent use of AI, as long as it doesn't miss anything.

The Met's digital forensics team uses bespoke software that can identify drugs, guns and money while scanning someone’s computer or phone. But it has proven problematic when searching for nudity. "Sometimes it comes up with a desert and it thinks it's an indecent image or pornography,” Mr Stokes said.
 
Considering how little actual work cops do, I'm ok with them making the sacrifice. They chose their profession.
 
I always thought I would hate to be the person in the position of having to look at all that stuff. This is a good use for AI, but I suppose there will ultimately still need to be someone to physically review the evidence if a case goes to criminal court. "We have a printout that states you have this stuff on your computer." "Can anyone verify that it actually was on my computer? If not, then how do we know the AI didn't make a mistake?"
 
Considering how little actual work cops do, I'm ok with them making the sacrifice. They chose their profession.

My neighbor is a cop and she deals with a lot of kid stuff. I don’t know how she does it. It’s very emotionally draining and it shows in her personal life. She’s tough as nails, and she’s fine - but make no mistake, she’s taking one for team humanity so the rest of us can live in ignorant bliss. This is a good use of AI if it actually works...
 
There is EVIL in this world, and I for one would not want to look at that every day.
You cannot unsee what you have seen.
 
While I understand the work police do in this regard is very taxing, it is for the greater good. Having AI do it will lead to false positives, and for something like this, that shouldn't ever be a possibility.

What if the computer decides a drawing is a positive match? But what if that drawing is not of a real human? In such an instance there is no abuse of a child. But the false positive might result in a false conviction. Considering the ramifications of such a conviction, it's very similar to a death sentence.

I'm sorry but I genuinely believe AI should NOT be doing this job, ever.
 
I am guessing the AI will just filter through the majority of it but there's no way it can be confirmed without human intervention.
 
All for it; no cop, or any decent person, should ever be forced to look at that.

I have, in the past, wondered how I would handle this exact situation. You want to do good and help people, but you know it's gonna fuck you up on a personal level. Same question I have about gynecologists.

But of course... how do you make sure the AI is correct? Who watches the watchmen?
 
The Met's digital forensics team uses bespoke software that can identify drugs, guns and money while scanning someone’s computer or phone. But it has proven problematic when searching for nudity. "Sometimes it comes up with a desert and it thinks it's an indecent image or pornography,” Mr Stokes said.

Like ice cream with walnut sauce on it???
 
While I understand the work police do in this regard is very taxing, it is for the greater good. Having AI do it will lead to false positives, and for something like this, that shouldn't ever be a possibility.

What if the computer decides a drawing is a positive match? But what if that drawing is not of a real human? In such an instance there is no abuse of a child. But the false positive might result in a false conviction. Considering the ramifications of such a conviction, it's very similar to a death sentence.

I'm sorry but I genuinely believe AI should NOT be doing this job, ever.

False positives occur all the time in the justice system with humans. I mean... if they got arrested they have to be guilty of something... am I right? We love putting people in jail... especially innocent ones. Death row, statistically, is between 5-8% innocent people based on conviction turnover rates.
 
Given the issues Apple is having with facial recognition, expect a lot of false positives and false negatives until this gets a lot of training. Humans will still have to do the final checks for a long time.
 
Considering how little actual work cops do, I'm ok with them making the sacrifice. They chose their profession.

I'm in a job that pays well and sometimes means not doing a lot of work. Kinda like the cops. I'm an operator at a large nuclear plant. You could say the same about me.

However, do you want to do 12-hour shift work that changes from days to nights constantly and is horrible for your health? If there ever is a major emergency such as a meltdown, I get to stay and deal with it, probably take a massive radiation dose, and basically lose my human rights. Do you want to do that?

The answer is probably no.

Same as the police. Do you want to constantly fear being injured or losing your life on the job? Risk mental illness such as post-traumatic stress? Again, shift work that reduces your lifespan? How about dealing with the scum of society constantly?

Have a little respect.
 
While I understand the work police do in this regard is very taxing, it is for the greater good. Having AI do it will lead to false positives, and for something like this, that shouldn't ever be a possibility.

What if the computer decides a drawing is a positive match? But what if that drawing is not of a real human? In such an instance there is no abuse of a child. But the false positive might result in a false conviction. Considering the ramifications of such a conviction, it's very similar to a death sentence.

I'm sorry but I genuinely believe AI should NOT be doing this job, ever.

I have a relative who is an officer who has to deal with this kind of crap. It is incredibly hard on you over time. We will all benefit if technology permits the people who investigate this stuff to do more of it in fewer hours of exposure.

Unless rolled out incredibly stupidly, your concerns are without merit, and I'll explain why.

There are two directions of investigation with kiddie porn pics.

1) The suspect-to-conviction direction: evaluating evidence in the possession of person X and making a case. I don't see how you can really automate this away. However, tune the AI for a low but consistent rate of false positives and the human only has to review what gets flagged rather than trolling through ALL the gigs of horrific shit such people collect. The AI should also be able to do a good job of marking repeats, sorting by victim, etc.

2) The evidence-to-suspect direction. The content has lots of identifiable stuff in it, such as victims, perps, locations, etc. Not only do the criminals trade the porn, they also trade the victims. AI can sort, collate, and generally sift through that shit at a speed and volume people cannot. False positives don't matter because it is essentially a lead generator. Some leads pan out, some leads don't. Hopefully the AI makes for more, and better, leads in less time. If it can make even a halfway decent assessment of age progression for an individual, it may be able to connect investigators with victims over time. Inherently this type of use would be processing data for human use, and thus mistakes would simply be found not useful and discarded. Worst case is it decreases efficiency rather than increases it. If that's the case it gets shitcanned or ignored.
 
Out of curiosity, is this scanning done without the user knowing? Or is the user giving up the phone? I was reading how this same technology is being used on people travelling to different countries, and friends now are wiping their phones clean since anyone can download the info on your phone.

Even more of a concern now that phones hold everything from cryptocurrency logins to company info to just about anything.

The age of losing all privacy is upon us. With that being said, no one should have child pornography. It's just sick.
 
Out of curiosity, is this scanning done without the user knowing? Or is the user giving up the phone? I was reading how this same technology is being used on people travelling to different countries, and friends now are wiping their phones clean since anyone can download the info on your phone.

Even more of a concern now that phones hold everything from cryptocurrency logins to company info to just about anything.

The age of losing all privacy is upon us. With that being said, no one should have child pornography. It's just sick.

If you're suspected of possession of child porn, then usually it's because you were either caught in a sting operation or it was found during the investigation of a crime, and all your data has already been carted away. This software they're developing is most likely for their investigators to use after the seizure. The FBI already has an enormous CP database that they use for cross-referencing known material that's been in the wild. What they're really after is the unique stuff that hasn't popped up on their feed yet.
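For what it's worth, cross-referencing against a database of known material is usually done by comparing image hashes rather than by having a model look at pixels at all. A rough sketch of the idea in Python, with a hypothetical KNOWN_HASHES set standing in for that reference database; real systems use perceptual hashes (PhotoDNA-style) rather than SHA-256 so that resized or re-encoded copies still match:

```python
import hashlib
from pathlib import Path

# Hypothetical reference set: hashes of previously identified material,
# loaded from the investigators' database. Real systems use perceptual
# hashes (e.g. PhotoDNA) instead of SHA-256 so that resized or
# re-encoded copies of a known image still match.
KNOWN_HASHES: set[str] = set()


def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def find_known_material(seized_dir: Path) -> list[Path]:
    """List every file on a seized device image that matches a known hash."""
    return [
        p
        for p in seized_dir.rglob("*")
        if p.is_file() and sha256_of(p) in KNOWN_HASHES
    ]
```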
 
How do you confirm the AI's validity if no living person observes it?
And therein lies the rub. Given how destructive a false positive could be for an innocent person, this should never be handed over wholesale to AI. It sucks for the people who have to view the vile stuff, but it is what it is.
 
The only thing I can imagine being more traumatizing than cops having to go through the child pornography is programming AI to recognize it. From my understanding, setting a baseline for the AI to work off of requires scanning and labeling hundreds, if not thousands, of these images by hand. AI doesn't just magically start to recognize things off the bat.
 
This doesn't make sense to me.

It will certainly save police time, but it's not going to save them any child porn trauma. No legal system worth its salt is going to trust data directly from AI without having a human first review it. So, now instead of seeing ALL pictures, police are going to see ONLY material flagged as suspected child porn.
 
While I understand the work police do in this regard is very taxing, it is for the greater good. Having AI do it will lead to false positives, and for something like this, that shouldn't ever be a possibility.

What if the computer decides a drawing is a positive match? But what if that drawing is not of a real human? In such an instance there is no abuse of a child. But the false positive might result in a false conviction. Considering the ramifications of such a conviction, it's very similar to a death sentence.

I'm sorry but I genuinely believe AI should NOT be doing this job, ever.

To piggyback off your comment: naturist and "intimate" (for lack of a better word) yet innocent family photos of children, from non-prudish families that are more open with each other and don't sexualize the human body, are completely legal to have and share (ruled decades ago by the Supreme Court to be protected by the 1st Amendment in America), yet they will undoubtedly pop up as false positives. Hopefully humans do review the images afterward and decide whether it's an innocent photo or one that's sexualized, and if someone decides it violates the law, lawyers, a jury, and judges will have the ability to view it and make a final determination before sentencing innocent people. We don't need AI making such life-altering decisions by itself!
 
Now they'll look at nothing BUT disturbing images, instead of combing through huge numbers of ordinary snaps. A win for efficiency if it works, but surely this just condenses the horror?
 
The only thing I can imagine being more traumatizing than cops having to go through the child pornography is programming AI to recognize it. From my understanding, setting a baseline for the AI to work off of requires scanning and labeling hundreds, if not thousands, of these images by hand. AI doesn't just magically start to recognize things off the bat.

There already exist manually created databases of kiddie porn victims.

Also, you could just train one model to detect porn and another to detect kids. Got a pic that passes both the porn check and the kid check? Flag it for review.

Also, just training it to sort faces would be useful. Maybe sort by tattoos and such.
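That two-model idea is roughly how a lot of content-moderation pipelines are built. A minimal sketch of the flagging logic, assuming two hypothetical pre-trained classifiers that each return a probability for an image; the 0.8 threshold is just an illustrative number, and everything flagged still goes to a human:

```python
from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass
class Flagged:
    path: str
    porn_score: float   # classifier #1: P(image is pornographic)
    minor_score: float  # classifier #2: P(image depicts a minor)


def flag_for_review(
    image_paths: Iterable[str],
    porn_model: Callable[[str], float],  # hypothetical pre-trained classifier
    age_model: Callable[[str], float],   # hypothetical pre-trained classifier
    threshold: float = 0.8,              # illustrative cut-off only
) -> list[Flagged]:
    """Queue for human review any image that trips BOTH checks."""
    review_queue = []
    for path in image_paths:
        p_porn = porn_model(path)
        p_minor = age_model(path)
        if p_porn >= threshold and p_minor >= threshold:
            review_queue.append(Flagged(path, p_porn, p_minor))
    # Nothing here is treated as evidence; the queue goes to an officer to verify.
    return review_queue
```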
 
If the AI looked on my computer and saw all the Japanese anime I watch, it would think I was into underage pron (any high school anime), underage beatings and torture (Naruto and Gundam), and any number of other things.
 
The problem I see with this is AI mistakes being recorded as even potentially accurate, or tagging someone as a 'potential' criminal. As we know, get categorized that way in any form and you will wind up being presumed guilty before any actual crime has even been committed. Much the same way a lot of society automatically assumes all homosexuals are out to molest children, or how many white people become uneasy at an ATM late at night when a large black man walks in behind them, versus a 90-year-old grandma, for example. Get put into the potential-criminal category and the next thing you know, you're getting rounded up with the usual suspects, even when you've done absolutely nothing to deserve it.
The solution is simple: rotate those who have to do the most horrible work. No one wants to be the guy cleaning out the latrine, so they don't make the same person do it every day.
 
53,000 devices last year? Holy shit, does London have that big of a problem? Or do they get everything from the whole country sent to them?
 
I don't know; if cops can't handle it, why should they be enforcing those laws? They're going to have to see it sometime. Someone is going to have to see it sometime. You think a prosecutor is going to tell a jury, "The AI said there is something bad behind that blurred blob in the image. Trust me, our AI is correct 89.5% of the time. Please convict"?
 
Unfortunately, there will likely never be a solution that saves police from having to view this or other disturbing images. The AI might be able to flag pictures and make the job easier, but it is still going to require human verification. They might not have to troll through hundreds of images, but they would still need to look at at least a sampling of the flagged images. The flagged images could be compared to the FBI database, but mistakes can still happen. Human verification makes it more likely to convict a criminal, and it can also weed out false positives and help prevent innocent people from having their lives ruined over a computer error.
 
I think people are approaching the justice system all wrong. Given how much it costs to keep someone behind bars and current tech levels, you would think at this point someone would have realized it's easier and cheaper to just give people what they want. A thief steals a car worth $30k, and the cost of convicting and keeping the thief behind bars for a year is going to run upwards of $60k. At that rate we may as well be handing out $30k cars to people who are thinking about stealing one.

Obviously, given the current topic, handing out child porn isn't going to work, because the crux of this whole situation is the creation of child pornography and its effect on the victims. It comes from somewhere, and its creation is what needs to stop. So why not address the real issue and cripple the creation market by making an AI that creates "fake porn"? Perverted people are always going to exist, they always have, and trying to hammer out undesirables always leads to misery for innocent people. It's better to build a beacon and let them congregate to it rather than leave them to their own devices in the open world. I can live with a system that fools 90% of pedophiles into thinking what they are seeing is real, rather than a system that only catches 90% of the images of the victims.

Want to stop the drug cartels across the globe? Legalize the stuff and hand it out for free; most of the chemicals are dirt cheap and easy to make anyway.

An alcoholic is the most dependable person in the world: just keep the task simple and have some booze waiting for them when they are done.
 
I'd assume some person would still need to check the images that the software flags. Or if the program says it's child pron, are you guilty automatically? The number of false positives is relatively high with these algorithms. By the time you tune it so it doesn't report false findings, it will report almost nothing.

And will "those are only for AI training purposes for the police" be a valid defence?
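There's a base-rate problem behind that false-positive worry. With purely illustrative numbers, even a fairly accurate classifier produces mostly false alarms when the material it is hunting for is rare among the images on a device:

```python
def precision_of_flags(base_rate: float, sensitivity: float, fp_rate: float) -> float:
    """Fraction of flagged images that are actually illicit (Bayes' rule)."""
    true_flags = base_rate * sensitivity
    false_flags = (1 - base_rate) * fp_rate
    return true_flags / (true_flags + false_flags)


# Illustrative numbers only: 1 in 10,000 images on a device is illicit,
# the model catches 95% of those, and wrongly flags 1% of everything else.
print(precision_of_flags(0.0001, 0.95, 0.01))  # ~0.009: under 1% of flags are real
```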
 
I think people are approaching the justice system all wrong. Given how much it costs to keep someone behind bars and current tech levels, you would think at this point someone would have realized it's easier and cheaper to just give people what they want. A thief steals a car worth $30k, and the cost of convicting and keeping the thief behind bars for a year is going to run upwards of $60k. At that rate we may as well be handing out $30k cars to people who are thinking about stealing one.
I'm all for that. If they handed out the money spent on catching and keeping thieves in prison, we'd all be living as kings. That is what universal basic income is about, yet almost everyone I meet is opposed to it. Of course, most people don't think about the practical reasons, just the same old "they shouldn't get anything for nothing."
 
While I understand the work police do in this regard is very taxing, it is for the greater good. Having AI do it will lead to false positives, and for something like this, that shouldn't ever be a possibility.

Well, I think since it is an AI it will learn over time, and those errors will become minimal as time goes on. I think you and others are misunderstanding what's going on here and how it will work. It's not like they flip a switch on the AI and then cops never have to look at the pictures again. Rather, they turn on the AI, and any results the computer comes back with at, say, less than 80% confidence of illegal content, the cops will double-check. So at first, with the computer not 100% sure and human intervention along the way, human officers will still have to sift through what the machine flags as illegal or not. This will, OVER TIME, cut down on the number of images cops have to look at, rather than relying on it 100% right at the start. Make sense?
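That staged rollout boils down to a confidence cut-off deciding what officers must double-check first. A minimal sketch, assuming a hypothetical classifier that returns a confidence per item and using the 80% figure from the post; nothing is discarded, low-confidence hits simply go to the manual queue:

```python
from typing import Iterable, Tuple

REVIEW_THRESHOLD = 0.80  # the "80% positive" cut-off mentioned in the post


def triage(findings: Iterable[Tuple[str, float]]):
    """Split a hypothetical classifier's hits by confidence.

    Low-confidence hits go straight to officers for manual double-checking;
    high-confidence hits are prioritised but still verified before anything
    is treated as evidence. Nothing is discarded.
    """
    high_confidence, needs_human_check = [], []
    for item, confidence in findings:
        if confidence >= REVIEW_THRESHOLD:
            high_confidence.append((item, confidence))
        else:
            needs_human_check.append((item, confidence))
    return high_confidence, needs_human_check
```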
 
Well, if they are planning on triggering the AI to take over the defence networks and nuke us into extinction, forcing the AI to look at child porn/abuse would be a good start.
 
If the AI is scanning for photos containing illegal activity in a sea of otherwise innocuous images, will it really save a human from having to check its work?

All the AI is doing is separating the wheat from the chaff, so to speak. Someone still has to mill the wheat.
 
Uhhh, no, people don't just get magically arrested because they're for sure guilty of something. There are plenty of unsolved crimes out there, and there are plenty of innocent people arrested. Hell, even convicted and sentenced to death. I'm not saying it doesn't happen, but I don't think AI is the solution to this particular problem.

False positives occur all the time in the justice system with humans. I mean... if they got arrested they have to be guilty of something... am I right? We love putting people in jail... especially innocent ones. Death row, statistically, is between 5-8% innocent people based on conviction turnover rates.
 