Customs Officers Of The Future Will Be Lie-Detecting Robots

The Canada Border Services Agency and the US Department of Homeland Security are now testing lie-detecting robots at the border. What could possibly go wrong?

The Automated Virtual Agent for Truth Assessments in Real Time (AVATAR) is currently being tested by the Canada Border Services Agency and the US Department of Homeland Security. The robot, programmed to look for physiological changes that indicate lying through eye-detection software and other sensors, could help border agents catch terrorists or drug traffickers, according to San Diego State University researchers.
 
No thank you. I've seen AI online that claims to detect your gender, emotions, age, etc., and it's often way off.

Hopefully it uses better tech than some of the stuff we've already seen, like that camera that asks "Did you blink?" when taking pictures of Asian people, or the one that says "No face detected" for black people.

I think we're a ways off. If it can't get simple things right with a high rate of accuracy across a wide range of subjects, I don't want it doing anything "official".
 
There are levels of computer vision, and the broader the subject, the harder it is to get it working properly.

Looking for very specific signs in an eye, or detecting microexpressions, is much easier, orders of magnitude easier, than finding an eye or a face in an arbitrary picture. An AI might not be able to detect a face with near-100% accuracy, but it can detect facial expressions and features once it has been told it is looking at a face.

So you need an operator who points the detector exactly at the person's face and eyes and verifies that they aren't wearing a mask or something like that.
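
For anyone curious what that two-stage split looks like in practice, here is a minimal sketch using OpenCV's stock Haar cascades. This has nothing to do with AVATAR's actual pipeline, and the input file name is just a placeholder; the point is only that once a face region has been found, searching inside it for eyes is a far smaller problem than scanning the whole frame.

import cv2

# Stage 1 model: the hard, open-ended problem (is there a face anywhere in the frame?)
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
# Stage 2 model: the much more constrained problem (where are the eyes inside a known face?)
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

img = cv2.imread("traveler.jpg")  # placeholder image, not real border footage
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    face_roi = gray[y:y + h, x:x + w]  # restrict the search to the detected face
    eyes = eye_cascade.detectMultiScale(face_roi, scaleFactor=1.1, minNeighbors=5)
    print(f"face at ({x}, {y}, {w}, {h}) with {len(eyes)} eye candidates")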
 
Wow, now we're handing this task off to a robot?
The misleading title misled you. It's not replacing agents, just helping them. And it's not really a robot; you don't need a humanoid robot or any kind of robot for this, just a camera and computer vision software.
 

I'd say I'd accept some margin of error, since it would be similar with a human: misinterpreting nervousness, being fooled by good acting, etc.
 
I think with lie detection the most important thing is the quality of the questions. You can't just ask "Are you a terrorist?" and be done with it, and you still need to establish a baseline. So this seems more like a replacement for the old, cumbersome lie detectors. And there is no way it will be used on everyone at a border check.
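
To make the baseline idea concrete, here is a toy sketch; the signal, numbers, questions, and threshold are all invented for illustration and say nothing about how AVATAR actually scores answers. You measure some physiological reading on neutral control questions, then see how far the same reading drifts on the questions you actually care about, and even a large deviation only means "more stressed than baseline", not "lying".

import statistics

# Readings from neutral control questions ("What is your name?" etc.) - made-up values
baseline_readings = [0.42, 0.45, 0.40, 0.44, 0.43]

# Readings from the questions of interest - also made up
test_readings = {
    "Are you carrying goods for someone else?": 0.61,
    "What is the purpose of your trip?": 0.46,
}

mean = statistics.mean(baseline_readings)
stdev = statistics.stdev(baseline_readings)

for question, reading in test_readings.items():
    z = (reading - mean) / stdev  # how many baseline standard deviations away
    flag = "flag for follow-up" if z > 3 else "ok"  # arbitrary threshold
    print(f"{flag}: z = {z:.1f} for {question!r}")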
 
They will just train their people to believe harder. Are you a terrorist? No, I am a freedom fighter; calling me a terrorist is an insult. If you believe it enough, it isn't a lie when you say it, and the lie detector fails.
 
I just watched Blade Runner and this reminded me of the opening scene. Just ask them about tortoises.
 
There is no such thing as actual lie detection.

At most there are indicators of stress, and that is exactly why polygraphs and the like aren't accepted in court. So, yeah, this is based on BS.
 
Won't work...

 
This isn't being handed off to a robot for the reason you might think.

People often wonder what the primary purpose of canines is in vehicle searches. They think it's because their advanced olfactory systems can detect the drugs, but that's easy to mask. What they are most useful for is giving police officers with a good hunch an excuse to pursue further investigation without a warrant, and eager-to-please dogs can read their owner and signal when they sense that that's what they are supposed to do. And if the dog is wrong, well, the dog isn't racist or sexist or whatever to open them up for a lawsuit; it's just a dog. What are you going to do, fire the dog and ruin its career?

I have a feeling this is an electronic version of that, which, like a dog, will be immune from accusations of racial discrimination and the like, a very sensitive issue these days, and will provide an excuse to do random deeper interrogations.

After all, it doesn't really have to work to, well, work.

As long as it goes off, say, one out of ten times, you have introduced a plausible excuse to detain people at random, which places a greater risk on smugglers and the like and has deterrence value in and of itself.
 
If I'm ever asked to take a lie detector test (polygraph), I will refuse, because I know they are only 50% accurate at best. But then in court they will say I refused to take the lie detector test, so I must have something to hide.
 
Reminds me of that robot parole officer in Elysium


or Futurama's Dr. Perceptron with his "you need to relax more" approach, followed by a nice zap to help you relax
 