All Intel CPUs from the last 10 years have a critical bug that exposes protected kernel memory areas

GPU bound gaming is obviously not gonna be affected by CPU performance losses, unless they were SO severe they suddenly made you CPU bound, but even the Meltdown+Spectre microcode hit isn't that bad for most 4K scenarios.

The performance loss is significant if you play at 1080p or 1440p and are trying to hit a minimum of 144 fps or even higher in less graphically complex games. There are a lot of competitive games that qualify, where it's possible to get 150-250+ fps averages, like most of the MOBAs, CS:GO, SC2, etc., and where people care about those framerates for high refresh monitors.

Of course, Ryzen is much worse in those scenarios than a patched Intel CPU, so there isn't much you can do but grin and bear it.
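To put some rough numbers on the GPU bound vs. CPU bound point above (a toy model with made-up figures, not measurements from this thread): a frame is limited by whichever of the CPU or GPU takes longer, so a mitigation-induced CPU slowdown only shows up once CPU frame time catches up to GPU frame time.

Code:
# Toy model: a frame ships only when both CPU and GPU work for it is done,
# so frame rate is set by whichever side takes longer per frame.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical per-frame times in milliseconds (illustrative only).
cpu_ms = 5.0                   # CPU frame time before the patches
cpu_patched_ms = cpu_ms * 1.1  # assume a 10% CPU-side hit from the mitigations
gpu_4k_ms = 16.0               # GPU-bound at 4K
gpu_1080p_ms = 4.0             # same GPU in a lighter esports title at 1080p

print(fps(cpu_ms, gpu_4k_ms), fps(cpu_patched_ms, gpu_4k_ms))        # ~62.5 vs ~62.5: no visible change
print(fps(cpu_ms, gpu_1080p_ms), fps(cpu_patched_ms, gpu_1080p_ms))  # ~200 vs ~182: the hit shows up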
 
Those synthetic benchmarks would show if my system took a serious hit. They don't show one, they're canned, and their results are easy to compare, hence their inclusion.

Game-wise, I couldn't tell the difference in The Witcher 3, AC:O, Mass Effect: Andromeda, and BF1. Those are the only games that I've recently been playing.

Then again, I've got a 6950X at 4.4GHz and play at 4K in just about every game. So, I'm certain that I'm GPU bound all the time, unlike tests and games run at 720P. Edit, or even 1080P these days.
Well, since you tested 3 items on your computer, the world is safe and cozy. /s

There have been reports of games taking a hit, so maybe, just maybe, there might be others? It is a crazy world after all.
 
Well, since you tested 3 items on your computer, the world is safe and cozy. /s

There have been reports of games taking a hit, so maybe, just maybe, there might be others? It is a crazy world after all.
For your re-reading pleasure:

"Then again, I've got a 6950X at 4.4GHz and play at 4K in just about every game. So, I'm certain that I'm GPU bound all the time, unlike tests and games run at 720P. Edit, or even 1080P these days."
 
For your re-reading pleasure:

"Then again, I've got a 6950X at 4.4GHz and play at 4K in just about every game. So, I'm certain that I'm GPU bound all the time, unlike tests and games run at 720P. Edit, or even 1080P these days."
Awesome read.

I am sure there will be more tests coming out. It will be interesting to see!
 
GPU bound gaming is obviously not gonna be affected by CPU performance losses, unless they were SO severe they suddenly made you CPU bound, but even the Meltdown+Spectre microcode hit isn't that bad for most 4K scenarios.

The performance loss is significant if you play at 1080p or 1440p and are trying to hit a minimum of 144 fps or even higher in less graphically complex games. There are a lot of competitive games that qualify, where it's possible to get 150-250+ fps averages, like most of the MOBAs, CS:GO, SC2, etc., and where people care about those framerates for high refresh monitors.

Of course, Ryzen is much worse in those scenarios than a patched Intel CPU, so there isn't much you can do but grin and bear it.

This exactly. I play at 1080p with a 1080 ti and I'm definitely CPU bound on the games I play the most. I am pretty sure I will be seeing a substantial hit in some games. It sucks but AMD is far worse for my use-case anyway so what can you do.
 
How long until physical silicon is changed in newer chips to prevent this? I imagine the next generation is probably too far along for changes like that?
 
I am quite certain the next release of chips from both vendors will not significantly address Spectre. I am not even convinced the one after that will.

Spectre is a major PITA. Any solution will take considerable time to research and design.
 
No, they wouldn't as they don't test what is most likely impacted.
Yes, they test what I and most other gamers care about - 3D performance. And it puts the results in a format that can easily be compared.

As to the IOPS slowdown - again, that doesn't affect game load times at all, since load times gained virtually nothing going from quality SATA III SSDs to the fastest NVMe drives out there.

That said, if you run a large database - I'm sorry. Sucks to be you.
 
Yes, they test what I and most other gamers care about - 3D performance.

As to the IOPS slowdown - again, that doesn't affect game load times at all, since load times gained virtually nothing going from quality SATA III SSDs to the fastest NVMe drives out there.

That said, if you run a large database - I'm sorry. Sucks to be you.
There have been tests that show a good amount of slowdown in games, so no, running benchmark programs won't tell you. Do you like synthetic benchmarks for everything, or do you like to see them test the actual product?

I think most people would like to see a bunch of games tested at 1080p to see what happens.
 
Yes, they test what I and most other gamers care about - 3D performance. And it puts the results in a format that can easily be compared.

As to the IOPS slowdown - again, that doesn't affect game load times at all, since load times gained virtually nothing going from quality SATA III SSDs to the fastest NVMe drives out there.

That said, if you run a large database - I'm sorry. Sucks to be you.
What about MMOs? Any network and/or disk heavy game?

Synthetic 3D benchmarks aren't good for real life 3D performance information, let alone real gameplay.
 
There have been tests that show a good amount of slowdown in games, so no, running benchmark programs won't tell you. Do you like synthetic benchmarks for everything, or do you like to see them test the actual product?

I think most people would like to see a bunch of games tested at 1080p to see what happens.
What about MMOs? Any network and/or disk heavy game?

Synthetic 3D benchmarks aren't good for real life 3D performance information, let alone real gameplay.

My results don't bear this out because I'm not CPU bound with my overclocked 6950X - if you are at 1080P with your CPU then I'm sorry. Maybe think about upgrading to 4K or 1440P.
 
My results don't bear this out because I'm not CPU bound with my overclocked 6950X - if you are at 1080P with your CPU then I'm sorry. Maybe think about upgrading to 4K or 1440P.
So you are 4K+, which means everybody is or has to be? In your own little world.
If you had bothered to read my sig, I am already at 4K with an OC 5960x. ;)
I was relaying common sense which more people should try.
 
So you are 4K+, which means everybody is or has to be? In your own little world.
If you had bothered to read my sig, I am already at 4K with an OC 5960x. ;)
I was relaying common sense which more people should try.
For your re-reading pleasure:

My results don't bear this out because I'm not CPU bound with my overclocked 6950X - if you are at 1080P with your CPU then I'm sorry. Maybe think about upgrading to 4K or 1440P.

And you must not use Hardforum on a phone - You should try it out and see what I see. If you don't want to, well I'm sorry for you again.
 
How long until physical silicon is changed in newer chips to prevent this? I imagine the next generation is probably too far along for changes like that?

Just read about the browser patch and it seems that it may be a simple thing or more complex - ain't that usually the truth :cool:

The browser patch works by disconnecting the time it reports from the actual time it fetches from the CPU - it randomizes/offsets the reported time. Since the exploit essentially 'times' the difference between a missed speculation and the actual data, this makes the hacker look in the wrong place. By randomizing its clock reports, the browser patch makes the exploit less effective.

Maybe the silicon could be based on a similar concept? It wouldn't be a major retrofit: add some type of RNG or encrypt the clock signal. Since CPUs are basically built like Legos - yeah, EXTREMELY simplified - cache block here, MMX block here, instruction fetch block here, buffer allocation block here - possibly it could be an easy matter to replace the former clock block with an improved one.

Just 'speculation' <boy, that word is getting a lot of use lately>, but I'm hopeful that we get to keep our performance, since one way to fix it is to undo out-of-order processing and go back to first-in, first-out.

Slower but safer.

Reminds me of how analog computers are used heavily when designing nuclear bombs. Slower, but comparatively EMF-proof, and easy hacks are unlikely. Let's hope a fix doesn't put us back in the '60s when analog computers ruled. ;)
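A minimal sketch of that timer-fuzzing idea, with invented latencies (my own illustration of the concept, not the actual browser patch code): once the clock an attacker can read is coarsened and jittered, the gap between a cached and an uncached access no longer stands out.

Code:
import random

# Hypothetical access latencies in nanoseconds (illustrative, not measured).
CACHE_HIT_NS = 60
CACHE_MISS_NS = 300

def precise_clock(true_ns):
    # An ideal high-resolution timer: reports the true time.
    return true_ns

def fuzzed_clock(true_ns, resolution_ns=1000, jitter_ns=500):
    # Model of the mitigation: round the reading to a coarse resolution
    # and add random jitter, so sub-microsecond differences wash out.
    coarse = (true_ns // resolution_ns) * resolution_ns
    return coarse + random.randint(0, jitter_ns)

def timed_access(latency_ns, clock):
    # What a timing side channel does: read the clock before and after an access.
    start = clock(0)
    end = clock(latency_ns)
    return end - start

random.seed(1)
print(timed_access(CACHE_HIT_NS, precise_clock), timed_access(CACHE_MISS_NS, precise_clock))
# -> 60 300: hit and miss are clearly distinguishable
print(timed_access(CACHE_HIT_NS, fuzzed_clock), timed_access(CACHE_MISS_NS, fuzzed_clock))
# -> noise on the order of the jitter: hit vs. miss is no longer obvious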
 
For your re-reading pleasure:



And you must not use Hardforum on a phone - You should try it out and see what I see. If you don't want to, well I'm sorry for you again.
You assume quite a bit. I do use my phone for [H]. Anything else you are wise and all knowing on, even though wrong every time?
 
Just read about the browser patch and it seems that it may be a simple thing or more complex - ain't that usually the truth :cool:

The browser patch works by disconnecting the time it reports from the actual time it fetches from the CPU - it randomizes/offsets the reported time. Since the exploit essentially 'times' the difference between a missed speculation and the actual data, this makes the hacker look in the wrong place. By randomizing its clock reports, the browser patch makes the exploit less effective.

Maybe the silicon could be based on a similar concept? It wouldn't be a major retrofit: add some type of RNG or encrypt the clock signal. Since CPUs are basically built like Legos - yeah, EXTREMELY simplified - cache block here, MMX block here, instruction fetch block here, buffer allocation block here - possibly it could be an easy matter to replace the former clock block with an improved one.

Just 'speculation' <boy, that word is getting a lot of use lately>, but I'm hopeful that we get to keep our performance, since one way to fix it is to undo out-of-order processing and go back to first-in, first-out.

Slower but safer.

Reminds me of how analog computers are used heavily when designing nuclear bombs. Slower, but comparatively EMF-proof, and easy hacks are unlikely. Let's hope a fix doesn't put us back in the '60s when analog computers ruled. ;)

That's actually not at all easy to do in the CPU. Clocking seems simple, but at these rates, it's actually quite complex inside the chip. That aside, you then have other issues: what is "random", and are you actually just getting back to the performance of non-speculative designs.

I actually expect speculative execution to stay, and there will be hardening against training and data inference from external processes. You can throw gobs of silicon at this problem in a couple of ways, and that's far preferable to a radically new design. But - I'm not in the loop now, so I do not know what the teams are cooking up.
 
The patch & mitigations affect minimum frame render times, aka the creamy smoothness that [H]ard gamers should care about; this has already been shown.
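To show what minimum frame render times look like in numbers, here's a hedged sketch of the usual "1% low" calculation with an invented frame-time log (not data from any of the tests discussed here): the average FPS can look untouched while the slowest 1% of frames tells a different story.

Code:
# Summarize a frame-time log the way reviewers report "1% lows":
# look at the slowest 1% of frames instead of the overall average.
def average_fps(frame_times_ms):
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def one_percent_low_fps(frame_times_ms):
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    cutoff = max(1, len(worst) // 100)            # take the worst 1%
    return 1000.0 / (sum(worst[:cutoff]) / cutoff)

# Invented log: mostly smooth ~7 ms frames with occasional 20 ms hitches.
frames = [7.0] * 990 + [20.0] * 10
print(round(average_fps(frames)))          # ~140 fps - the average looks fine
print(round(one_percent_low_fps(frames)))  # 50 fps - the stutter shows up here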
 
That's actually not at all easy to do in the CPU. Clocking seems simple, but at these rates, it's actually quite complex inside the chip. That aside, you then have other issues: what is "random", and are you actually just getting back to the performance of non-speculative designs.

Yeah(y), I kinda doubt that replacing a "clock block" would be as simple as switching out the old and switching in the new; it's that I bet the real solution would involve redesigning speculative execution. On the surface, that(!) seems like redesigning calculus.

But I don't design CPUs, I just have an interest in what goes on under the 'lid'. It seems to be confirmed that 'first in, first out' is a fix, at a performance loss. Maybe someone will figure out a way to correctly guess, with 99% probability, where to go next on the current clock cycle. Then 99% of the time there is nothing to hack. Good guesses do that.

I wonder <speculate> where the next generation of speculation will come from. It's too much of a performance hit to lose it. As you stated, it's "here to stay".

I wonder if the fix will be publicized?

Or <conspiracy theory here> kept in-house until someone figures it out fifteen years later ;)
 