Scarlett Johansson Can't Stop the Internet from Pasting Her Face on Porn

Megalith

While the world’s highest-paid actress has managed to defend her privacy in the past (e.g., prosecuting the iCloud “hackers” who leaked her nude photos), Avengers star Scarlett Johansson admits there is little she can do about “deepfakes,” a trend that uses AI to swap one person’s face onto another’s in a video. Johansson’s face has been “grafted into dozens of graphic sex scenes” by fans, and while she finds it disturbing, she considers fighting them a “lost cause” given the sheer scale of the global internet and how laws vary from country to country.

Clearly this doesn’t affect me as much because people assume it’s not actually me in a porno, however demeaning it is. I think it’s a useless pursuit, legally, mostly because the internet is a vast wormhole of darkness that eats itself. There are far more disturbing things on the dark web than this, sadly. I think it’s up to an individual to fight for their own right to their image, claim damages, etc. I mean, this is coming from someone who has a guy from Hong Kong get famous from making an AI with my exact face on it that wasn’t “technically” me.
 
Deepfakes could become a real problem in the future in more than just porn. As they get better and better, they could be a problem for the justice system in dealing with evidence, in politics with fabricated clips of politicians saying things they haven't, and in all sorts of areas I can't even think of right now.

I wonder if it will be possible to design analysis algorithms that can scan a video or audio recording and determine if it was real or computer generated.
 
Among the many problems that the world faces today, this is definitely a first world problem. ;)
 
For a friend, right?

 
Deepfakes could become a real problem in the future in more than just porn. As they get better and better, they could be a problem for the justice system in dealing with evidence, in politics with fabricated clips of politicians saying things they haven't, and in all sorts of areas I can't even think of right now.

I wonder if it will be possible to design analysis algorithms that can scan a video or audio recording and determine if it was real or computer generated.
Indeed it will be. Perhaps a form of digital signing of content is the way forward?
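The signing idea can be sketched in a few lines. This is a minimal sketch, not a real content-provenance system: the key name and sample bytes are made up, and HMAC over the content hash stands in for a proper asymmetric signature (e.g., Ed25519), which would let anyone verify authenticity without holding the secret.

```python
import hashlib
import hmac

# Hypothetical publisher-held secret; a real scheme would use an
# asymmetric keypair so verification needs no secret at all.
SECRET_KEY = b"publisher-signing-key"

def sign_content(data: bytes, key: bytes = SECRET_KEY) -> str:
    """Return a hex tag binding the key to the content's SHA-256 hash."""
    digest = hashlib.sha256(data).digest()
    return hmac.new(key, digest, hashlib.sha256).hexdigest()

def verify_content(data: bytes, tag: str, key: bytes = SECRET_KEY) -> bool:
    """Check the tag; any single-bit edit to `data` invalidates it."""
    return hmac.compare_digest(sign_content(data, key), tag)

if __name__ == "__main__":
    video = b"original frame bytes..."
    tag = sign_content(video)
    print(verify_content(video, tag))         # True: untouched
    print(verify_content(video + b"x", tag))  # False: tampered
```

The catch, of course, is that signing only proves a file hasn't changed since it was signed; it says nothing about content that was fake at the moment of capture.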
 
Deepfakes could become a real problem in the future in more than just porn. As they get better and better, they could be a problem for the justice system in dealing with evidence, in politics with fabricated clips of politicians saying things they haven't, and in all sorts of areas I can't even think of right now.

I wonder if it will be possible to design analysis algorithms that can scan a video or audio recording and determine if it was real or computer generated.
If you can make an AI to make these, then you can make an AI to determine whether they are real or not. It's kinda like encryption keys and hashes – the data is out there, you just have to find it.
 
If you can make an AI to make these, then you can make an AI to determine whether they are real or not. It's kinda like encryption keys and hashes – the data is out there, you just have to find it.

You don't even need to go that far, as compression and the like leave their own signature in the noise. You can look at a photoshopped image that looks unquestionably real to the human eye, but examine the noise profile and it stands out like a sore thumb. There are some very advanced tools for this, and it is very hard to fake. People forget there is a lot more to a video or photo than the visual color data.
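The noise-profile point can be illustrated with a toy example. This is only a sketch of the idea, not a real forensic tool (those use techniques like error level analysis or PRNU sensor fingerprinting): it flags pixel blocks whose variance deviates sharply from the rest of the image, the way a pasted-in region carrying a different source's noise might.

```python
import statistics

def block_variances(pixels, block=8):
    """Variance of each consecutive block of `block` pixel values."""
    return [statistics.pvariance(pixels[i:i + block])
            for i in range(0, len(pixels) - block + 1, block)]

def suspicious_blocks(pixels, block=8, factor=4.0):
    """Indices of blocks whose noise variance is `factor`x the median.

    A genuine photo tends to have roughly uniform sensor noise; a
    spliced region often does not, so its blocks stand out.
    """
    vs = block_variances(pixels, block)
    med = statistics.median(vs)
    return [i for i, v in enumerate(vs) if med > 0 and v > factor * med]

if __name__ == "__main__":
    quiet = [100, 101] * 8           # low, uniform noise (two blocks)
    noisy = [0, 255] * 4             # one wildly noisy block spliced in
    print(suspicious_blocks(quiet + noisy + quiet))  # [2]
```

Real detectors work on 2D blocks of luminance data and model JPEG quantization explicitly, but the principle is the same: the statistics, not the visible picture, give the edit away.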
 
I guess I am not surprised and not even terribly horrified since she's a celebrity known for her looks.

This gets really creepy when people start doing this with co-workers / friends / teachers / whatever, and I don't really like that endgame.
 
If you can make an AI to make these, then you can make an AI to determine whether they are real or not. It's kinda like encryption keys and hashes – the data is out there, you just have to find it.
You don't even need to go that far, as compression and the like leave their own signature in the noise. You can look at a photoshopped image that looks unquestionably real to the human eye, but examine the noise profile and it stands out like a sore thumb. There are some very advanced tools for this, and it is very hard to fake. People forget there is a lot more to a video or photo than the visual color data.

These are both true, but remember that most manipulated images are not made with "defending against" this sort of analysis in mind. As the tech to create deepfakes improves, those made just for fun will remain relatively easy to spot with technology, but those commissioned by major interests with money (and therefore power) will use the best technicians to make them look as legit as possible. For instance, if you want to influence the electorate shortly before voting day, unscrupulous lobbyists will pay a fortune to create a deepfake suggesting Candidate X said horrible thing Y, which - based upon huge amounts of gathered data and metadata for microtargeting - will be likely to be seen by Z voters and push them out to vote for Candidate Q. It may come out in time that it was a deepfake and X never said such a thing, but the damage is already done!

Worse, the whole thing means that discerning the truth is even more difficult and requires more knowledge/tools. If even "common" deepfakes get good enough to be indistinguishable to 90%+ of the population on viewing alone, then tools will likely be created to assess and render judgment on authenticity. People will start running anything marginally questionable through their tool of choice and generally believe the "fake" or "authentic" verdict. This opens a whole new can of worms, because there are many circumstances where the tools themselves could be compromised one way or another, and/or simply can't handle the top 10% or 1% of deepfakes that are designed with fooling these tools in mind. As we've seen with other technologies like DNA testing (in this country, anyway), it is a BAD thing when the common person believes a certain technology is pretty much foolproof at determining "truth"; vulnerabilities become twice as hard to deal with when people are trained to believe these things are almost always right!

Consider the aforementioned political example, where the deepfake of Candidate X is created with that top-10%-ish tech, so everyone who runs it through "Google's totally free and trustworthy deepfake checker" sees it come out as "authentic". The expected shitstorm erupts, some run off to the polls while others count their dollars and shake hands. Talking heads will argue over possible vulnerabilities, and some highly technical types on the Internet will confirm that it was in fact fake, but the mainstream will dismiss it as conspiracy theory, while the outrage/conspiracy types take it yet another way. By the time those few highly technical types take their evidence to others, find ethical support elsewhere in the industry, and have it reported that yes, this whole thing was in fact fake, Candidate Q is about to be sworn in as Senator Q, and the PAC that created the whole thing spends a few more bucks on "It was just one video, we've got to look forward, it couldn't have REALLY affected the election even if it WAS false... which we're not saying we actually think it is, blah blah blah," and we're onto the next issue of discussion. Some of this should start to feel a little familiar.

In any event, it is a worrisome potential issue. As for what can be done about it, I do think that deepfakes should be subject to mandatory takedown upon request, and we should probably update our libel/slander/defamation laws to account for them.
 
Deepfakes could become a real problem in the future in more than just porn. As they get better and better, they could be a problem for the justice system in dealing with evidence, in politics with fabricated clips of politicians saying things they haven't, and in all sorts of areas I can't even think of right now.

I wonder if it will be possible to design analysis algorithms that can scan a video or audio recording and determine if it was real or computer generated.

I agree that it's disturbing. And yes, with today's technology you can easily spot fakes. But that's today's technology, and you're always racing against others in this arms race. Tomorrow, AI might be good enough to get around detection. The second someone figures out that artifact X indicates a video has been altered, someone else will figure out how to remove X from the equation.

Remember, 40 years ago, people said a computer would never be able to defeat a human in chess. Now they can defeat Go Masters.

I guess it really comes down to the dedication of whoever wants to ruin a person's life. A politician? Sure - in 20 years I can easily see all audio and video being inadmissible. For Joe Schmo, though, unless he's targeted in an international conspiracy, I doubt regular people will have the tools to ruin his life. That might be 40 years away.
 
To be fair, she's an actor that's famous for her looks/posing in action roles rather than her acting (it always reminds me of Sci-Fi/Sy-Fy B-movie level acting). She'll need to up her game when age catches up with her or chase more lawsuits.
 
Surely it has to be endearing to know an entire generation of teenage boys are polishing the old one eyed gopher to her likeness.
 
I agree that it's disturbing. And yes, with today's technology you can easily spot fakes. But that's today's technology, and you're always racing against others in this arms race. Tomorrow, AI might be good enough to get around detection. The second someone figures out that artifact X indicates a video has been altered, someone else will figure out how to remove X from the equation.

Remember, 40 years ago, people said a computer would never be able to defeat a human in chess. Now they can defeat Go Masters.

I guess it really comes down to the dedication of whoever wants to ruin a person's life. A politician? Sure - in 20 years I can easily see all audio and video being inadmissible. For Joe Schmo, though, unless he's targeted in an international conspiracy, I doubt regular people will have the tools to ruin his life. That might be 40 years away.

As far as ruining someone's life goes: while regular users will not often be targeted by someone with the resources to use the most powerful methods, given the way society works today it's pitifully easy to ruin someone's life even now, and this will make things easier still. All you have to do is expose private (or theoretically private/unseemly) behavior, real or fake, to someone's boss, relatives, friends, or neighborhood... continuously. Deepfakes will make this even worse: the people who can't tell that isn't really a Nigerian prince ready to send them a fortune, or who join pyramid schemes ("It's not a pyramid, it's a triangle of power!"), will be confronted with "evidence" of your misdeeds.

The real problem is society as a whole, and there are many things we need to do to fix it, but the first is to reverse the conventions of the social media age and return, ideologically, to the days before it. Prior to the social media (and big-data analysis/advertising) era, there were two important tenets of the Internet I bet many here are old enough to remember: "Be skeptical about everyone and everything," and most importantly, "Value your privacy; keep your meatspace identity and information off the Internet, especially posted in public." A little over a decade later, we have totally lost those two fundamental principles, for a variety of reasons, motivated primarily by those smelling money and control. If we can reverse that trend, we'll be on the road to a better place.
 
While I do find her attractive at times, honestly I'd rather have her voice coming out of my computer and other devices as a personal assistant, which is way, way sexier to me. Cortana? Not even close. :)

We need to get that Google Tacotron 2 technology to mimic Scarlett's voice so my 'puters can all speak to me in those sexy undertones, the way her voice did in the movie "Her" as the voice of the guy's computer. Shivers, man, shivers. :D
 
Deepfakes could become a real problem in the future in more than just porn. As they get better and better, they could be a problem for the justice system in dealing with evidence, in politics with fabricated clips of politicians saying things they haven't, and in all sorts of areas I can't even think of right now.

I wonder if it will be possible to design analysis algorithms that can scan a video or audio recording and determine if it was real or computer generated.

It's an old idea that we are finally having to deal with. I remember watching The Running Man as a teen and thinking... hmmm, that is where this is going. Need a scapegoat for just about anything? What do you mean you weren't there? This video clearly says otherwise.

To continue with the bad (or perhaps so-bad-they-were-good) movie thread... just imagine Sandra Bullock's The Net ramped up to 11. When someone is out to get you, you won't just have to worry about your driver's license; you'll have to worry about the video of you running over puppies. lol
 
I took one for the [H] team and watched a couple of "her" videos. The deepfakers did a good job with her face in the videos I watched. I don't know about the sound because I had to have it turned off where I was. I will have to show the old lady after a while and see what she thinks, lol.
 
Wow, she is being realistic and rational, pants off...uh, hats off, I mean...
 
It's not acting, it's bit playing. Wow, $40M a year for bit playing; I wonder what she had to do to get the part. There are no more real actors or actresses in Hollywood today. Don't believe that? OK, name ONE actor of today that can be compared to Clark Gable? Like I told you, there are no real actors or actresses in Hollywood today. As consumers we settle for crapola and call it wonderful... small wonder quality has all but vanished, food quality is pathetic, etc.
 
It's not acting, it's bit playing. Wow, $40M a year for bit playing; I wonder what she had to do to get the part. There are no more real actors or actresses in Hollywood today. Don't believe that? OK, name ONE actor of today that can be compared to Clark Gable? Like I told you, there are no real actors or actresses in Hollywood today. As consumers we settle for crapola and call it wonderful... small wonder quality has all but vanished, food quality is pathetic, etc.
Anything else going wrong?
 
OK, name ONE actor of today that can be compared to Clark Gable?

Literally who? I had to Google him.
Were TVs even a thing when he was alive? Could you even see anything, or were they just black-and-white smudges? How can you rate a smudge on its acting?
 
It's not acting, it's bit playing. Wow, $40M a year for bit playing; I wonder what she had to do to get the part. There are no more real actors or actresses in Hollywood today. Don't believe that? OK, name ONE actor of today that can be compared to Clark Gable? Like I told you, there are no real actors or actresses in Hollywood today. As consumers we settle for crapola and call it wonderful... small wonder quality has all but vanished, food quality is pathetic, etc.


I always found the old-timey acting styles of the likes of Gable to be stiff and not believable as real characters. IMHO, most on-screen performances today are BETTER than back then. The characters come across as real, not just some actor elocuting in a stiff half-American, half-British accent.

Yes, there are also lots of terrible actors, but that has always been the case.

These days a lot of the best acting, however, is in television. Watch Bryan Cranston's portrayal of Walter White in Breaking Bad and how the character develops throughout the seasons, then maybe throw in his very different portrayal of LBJ in "All the Way", and tell me he isn't a brilliant actor. He is not the only one; he is just the first to come to mind.


Acting isn't dead and it isn't diminished. Yes, there are many poor actors, but the classic cinema period you are fawning over really wasn't that great either. It had a very distinctive style, but it resulted in stiff characters that didn't come across like real people.

Personally, I find it very difficult to enjoy most films from before the mid to late 70's. There are some exceptions, but in general they just aren't very good. :p
 
Literally who? I had to Google him.
Were TVs even a thing when he was alive? Could you even see anything, or were they just black-and-white smudges? How can you rate a smudge on its acting?

A film or tv show being in black and white does not prevent you from being able to observe or judge performances.
 
Solution: stop googling your own name on the internet. But of course that would mean not being a narcissist.

BTW, your face can't be copyrighted. You didn't create it; you were born looking like that.

I thought the value she added was in the acting, but if she insists that it's only the look, then who am I to say otherwise.
 
Quick, make deepfakes good enough that we can custom-order vids of virtual girls

(and quick, before the SJW/PC/religious crowd says this promotes real-life abuse, creates unrealistic expectations...)
 