Nvidia GPUs Breaking Chrome’s Incognito Mode?

Megalith

This bug was apparently acknowledged by Nvidia. The author says that contents may leak across applications because GPU memory is not erased before being handed to the next program. Have you ever encountered content "sticking" like this?

When I launched Diablo III, I didn't expect the pornography I had been looking at hours previously to be splashed on the screen. But that's exactly what replaced the black loading screen. Like a scene from Hollywood, the game temporarily froze as it launched, preventing any attempt to clear the screen.
 
"splashed" on the screen you say? Kinky.

I had a similar experience when I was a teenager on Windows 95. There were two layers to the desktop wallpaper. I accidentally hit "Set as wallpaper" instead of "Save As" in the browser. That set the risky image as the wallpaper, but the theme's wallpaper was displayed over it. Whenever the machine shut down or started up, for just a second or two, the risky image would show. Since this was on my parents' computer, I got caught and it was awkward.
 
Anyone notice that he said this was reported two years ago to Google and Nvidia with no fix? Let's say you were typing a password that wasn't masked (some websites let you unmask the password). Wouldn't this be a huge security bug? How hard is it to clear out GPU memory? Seems a bit negligent to me...
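
For what it's worth, zeroing a fresh GPU allocation is roughly one call at allocation time. A minimal CUDA sketch of the idea (purely illustrative; the buffer size and setup are my own assumptions, not anything from the article):

    #include <cuda_runtime.h>
    #include <cstdio>

    int main() {
        const size_t size = 64 * 1024 * 1024;  // arbitrary 64 MB allocation for illustration
        void *buf = nullptr;

        // cudaMalloc makes no promise about the contents of the memory it returns;
        // it can still hold whatever the previous user of that VRAM left behind.
        if (cudaMalloc(&buf, size) != cudaSuccess) {
            fprintf(stderr, "cudaMalloc failed\n");
            return 1;
        }

        // Wiping it is one call -- this is the "clear before reuse" step the
        // thread is arguing the driver/OS should be doing on our behalf.
        cudaMemset(buf, 0, size);
        cudaDeviceSynchronize();

        cudaFree(buf);
        return 0;
    }

There is a cost (a big memset per allocation), which is presumably why drivers skip it, but it is not a complicated fix.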
 
lol, I think my background pictures when no one is around would likely cause more issues. I swap between two files: one with exes on it to remind me what not to do, and one with game screenshots. Locking the pre-rendered frames to one might at least stop the frame buffer from hanging on to old data, but I'm actually more curious what the old data is being kept around for. Is it to save operations by avoiding writes of zeroes, or is it to pull the AMD trick of referencing data that has already been rendered? Funny that instead of either company finding a better way, they just copy each other's shortcuts. lol.
Good thing the games still look good. lol.
 

What.
 
Looking at those screenshots, it appears this happens on both Mac and Windows, since one of them was taken on a Mac. So the issue does look more universal.

Has it been shown that this issue does not happen with AMD or Intel video cards?
 
Almost as good as going to a business meeting and seeing history suggestions for 10 xxx sites flash by when the customer types a URL.
 

It only happens on Mac OS X, and it happens with both Nvidia and AMD graphics cards and drivers, so Nvidia is now pointing the finger at Apple. (source, note the update at the end of the article)
 

Interesting, it almost looks like the entire story has changed since day one. The original story talked about how Windows wipes system memory before giving it to a program, and how the same wasn't happening for video memory; it made no mention of Apple anywhere that I noticed. I actually thought Windows was addressed in a few places. Now I only see mention of Apple, and the only mention of Windows is that it doesn't happen there.

So it does sound and look like it is an Apple memory-management bug only, and not a driver bug for the video card.
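
The distinction is easy to demonstrate: the OS hands a process zero-filled pages the first time it touches newly allocated RAM, while GPU APIs generally leave a fresh allocation's contents undefined, so anything the driver or OS doesn't wipe can show through. A rough CUDA sketch of the readback side (my own illustration, not the article's test; how much stale data you actually see varies by driver and OS):

    #include <cuda_runtime.h>
    #include <cstdio>
    #include <vector>

    int main() {
        const size_t size = 16 * 1024 * 1024;  // 16 MB, freshly allocated and never written
        unsigned char *dev = nullptr;
        if (cudaMalloc(reinterpret_cast<void **>(&dev), size) != cudaSuccess) return 1;

        // Copy the uninitialized device memory back to the host and count
        // leftover non-zero bytes, i.e. stale data from whatever used that VRAM last.
        std::vector<unsigned char> host(size);
        cudaMemcpy(host.data(), dev, size, cudaMemcpyDeviceToHost);

        size_t nonzero = 0;
        for (unsigned char b : host) nonzero += (b != 0);
        printf("non-zero bytes in fresh allocation: %zu of %zu\n", nonzero, size);

        cudaFree(dev);
        return 0;
    }

If the article's update is right, the zeroing step is the OS/driver stack's responsibility on OS X, which is why Nvidia is pointing at Apple.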
 