AMD Ryzen 9 5950X CPU Review & Benchmarks (Workstation, Gaming, Overclocking)

Wow, such utter destruction in IPC, single-core perf, obviously multi, AND at much lower power!!! Insane!!!
 

Toppc from MSI claims 98% of chips can do 2000
So, 98 percent of Ryzen 5000 series can do Infinity Fabric 2000? That's what you're saying?

That's refreshing. I've had two 3900x and one 3950x and none could do 1900. Of course, that was on the December 2019 BIOS. Maybe I should test on the latest.

I have a 2x16 kit of Trident Z B-Die rated for 4000MHz 19-19-19-39 at 1.35V. Best I could do with it was 3733 at 1.45V. I couldn't boot at the XMP settings even with the IF set asynchronous. It would be cool if that RAM was usable now.
 
From the reviews it looks like all of the 5000 series chips are roughly the same for gaming, ultimately.
Reviews for the 5600X/5800X make it look like the i5-10600K is usually faster when gaming. Of course, at 1440p+ it's only a few frames, but the 5950X looks to be much faster than the lower SKUs and more on par with Intel. I was excited for an upgrade from my 3700X, but for gaming it looks like it's practically the same unless I want to drop $800...
 
So, 98 percent of Ryzen 5000 series can do Infinity Fabric 2000? That's what you're saying?

That's refreshing. I've had two 3900x and one 3950x and none could do 1900. Of course, that was on the December 2019 BIOS. Maybe I should test on the latest.
Only issue is, they're about double the price of the 3600 MHz kits.
 
From the reviews it looks like all of the 5000 series chips are roughly the same for gaming, ultimately.
I will finally say it, “RIP Intel.”

In the back of my mind I honestly thought Intel would have an “answer” or work out their issues with 14nm+^9. Looks like I was wrong.
 
Reviews for the 5600X/5800X make it look like the i5-10600K is usually faster when gaming. Of course, at 1440p+ it's only a few frames, but the 5950X looks to be much faster than the lower SKUs and more on par with Intel. I was excited for an upgrade from my 3700X, but for gaming it looks like it's practically the same unless I want to drop $800...
I found it really odd that the whole 5000 series lineup poops all over Intel in low res benches but not at Ultra 1440p on some games. Sometimes Intel is a frame or two ahead.
Interesting.....
 
It's been exactly one day and I'm already bored with the usual talking-heads on YouTube getting hard-ons for 1080p gaming...............Jesus Christ guys we get it, these things DOMINATE at low rez 2010 gaming! =|
1440p/4k numbers would be less thumbnail-worthy but more relevant. VR numbers and Multi-Monitor setups as well. Is there any real reason for us to upgrade from 8000 or 9000 or 10000 series Intel parts (or even Ryzen 2, 3) if Minecraft at fucking 300fps is not what we are after....

Please. Thank You.
 
It's been exactly one day and I'm already bored with the usual talking-heads on YouTube getting hard-ons for 1080p gaming...............Jesus Christ guys we get it, these things DOMINATE at low rez 2010 gaming! =|
1440p/4k numbers would be less thumbnail-worthy but more relevant. VR numbers and Multi-Monitor setups as well. Is there any real reason for us to upgrade from 8000 or 9000 or 10000 series Intel parts (or even Ryzen 2, 3) if Minecraft at fucking 300fps is not what we are after....

Please. Thank You.
Because when gaming at 1440p/4K you are more GPU bound, which means the CPU used doesn't matter much at all.

The lower the resolution, the more CPU performance comes into play.
 
It's been exactly one day and I'm already bored with the usual talking-heads on YouTube getting hard-ons for 1080p gaming...............Jesus Christ guys we get it, these things DOMINATE at low rez 2010 gaming! =|
1440p/4k numbers would be less thumbnail-worthy but more relevant. VR numbers and Multi-Monitor setups as well. Is there any real reason for us to upgrade from 8000 or 9000 or 10000 series Intel parts (or even Ryzen 2, 3) if Minecraft at fucking 300fps is not what we are after....

Please. Thank You.

As others have said, 1920x1080 is a CPU-bound resolution. As you increase resolution, you become more GPU bound. When reviewing CPUs, you want to remove other variables as much as you can and showcase what the CPU can do. Lower-resolution gaming is how you do that in a CPU review. Also, multi-monitor gaming is about as dead as multi-GPU gaming. VR is extremely GPU dependent, so again, you won't see that in a CPU review. As for your direct question: no, probably not. If you are running a more recent Intel CPU, you won't gain much switching to AMD. It certainly wouldn't amount to good gains for the money spent.
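To put rough numbers on that, here's a minimal sketch of the usual bottleneck mental model (every frame rate below is made up purely for illustration): effective FPS is roughly the slower of what the CPU can prepare and what the GPU can render, so as resolution climbs, the GPU term takes over and different CPUs converge to the same result.

# Toy bottleneck model. Effective FPS is limited by whichever of the
# CPU or GPU takes longer per frame. All numbers are invented for
# illustration only.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frame rate is capped by the slower of the two stages."""
    return min(cpu_fps, gpu_fps)

# Hypothetical CPU-limited frame rates (roughly resolution-independent).
cpus = {"CPU A": 240.0, "CPU B": 200.0}

# Hypothetical GPU-limited frame rates at each resolution.
gpu_limit = {"1080p": 300.0, "1440p": 180.0, "4K": 90.0}

for res, gpu_fps in gpu_limit.items():
    row = {name: effective_fps(fps, gpu_fps) for name, fps in cpus.items()}
    print(res, row)

# 1080p {'CPU A': 240.0, 'CPU B': 200.0}  <- CPU differences visible
# 1440p {'CPU A': 180.0, 'CPU B': 180.0}  <- GPU bound, CPUs tie
# 4K {'CPU A': 90.0, 'CPU B': 90.0}       <- GPU bound, CPUs tie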
 
We need better than 3600 to get the gains out of the 5900X and 5950X, you're saying?


4000 1:1 IF is required?
I have no idea, I just know that my first thought was definitely not "these are cool and all, but how will I afford the RAM?"
 
We have a covid retail restriction here of 25% capacity.... which means about 5-6 people in our local part shop. I drove by this morning and the line for their door was almost 2 blocks long. lol

I hope stock is good... AMD is killing it.
 
Probably because at that point you are GPU bound - that's why they test CPU performance in games at 1080p.
Yes, but you would think it would be the AMD chips that were 1 or 2 frames ahead, though...
It's within the margin of error, but it seems like the margin of error always favors Intel. I smell conspiracy! lol jk
 
So getting into 1440p and 4K gaming, it seems pretty close between AMD and Intel. Are any review sites showing minimum FPS? That would show more detail as to which CPU is more powerful once you get into GPU-limited resolutions...
 
Just saw a Flight Simulator test in the LTT review... amazing.
 
Having been on hand for previous AMD vs. Intel chip battles, and having bounced back and forth a few times, it seems that when threatened, Intel always pulls out a secret weapon like Darth Vader with a new lightsaber to burn a hole through the AMD rebellion. Once the threat is gone, they turn their back on the enthusiast market. For now AMD will gain some ground, but I can hear Vader breathing. Still waiting for a good reason to move past my OC'd X99 stuff, for now.
 
So getting into 1440p and 4K gaming, it seems pretty close between AMD and Intel. Are any review sites showing minimum FPS? That would show more detail as to which CPU is more powerful once you get into GPU-limited resolutions...
GamersNexus shows average, 1%, and 0.1% lows.

Linus also shows 95th and 99th percentile, as well as average. Although, his numbers for Intel in general seem way lower than other sites'. Compare his Intel numbers for Shadow of the Tomb Raider vs. TPU's. I wonder if Linus did a gameplay run vs. the in-game benchmark or something.
 
As others have said, 1920x1080 is a CPU-bound resolution. As you increase resolution, you become more GPU bound. When reviewing CPUs, you want to remove other variables as much as you can and showcase what the CPU can do.
At the same time, I think you definitely want both.

You want to isolate the CPU as much as possible to see what is likely to happen in some yet-unknown gaming scenario where you are not GPU bound. But look at how misleading those 400p resolution tests showing a +50% gaming boost would be if we had only those. A CPU's ability to "feed" a video card is an essential part of its gaming performance in almost all the scenarios that someone buying an expensive new CPU would care about, so showing how good they are at that is essential.

Including some Flight Simulator 2020 type of game that is known to be CPU heavy even under 70 FPS, for example, would be a good one, as would testing how much a DirectX 12/Vulkan multicore scenario does.

Many potential buyers have a question: "I have a 2070 Super, or just got a new 3080; what do I gain as a gamer by upgrading my CPU?" That is what they want to see, and if they are playing at 1440p or 4K, it is absolutely those resolutions that will be telling for them, and they should obviously wait until a scenario exists in which it will matter at those resolutions (by then an even better alternative or a better price point is likely to exist).

I suspect there is a bit of "it makes the CPU choice virtually irrelevant and boring to gamers, who are a large part of the audience, and takes views away from reviewers" that could be going on sometimes.

All those benchmarks will need to be redone with the 6800/6800 XT as well, I imagine, and it would be ridiculous, when they compare a 5950X vs. a 10900K with a 6800 XT, to only show low-detail 1080p benchmarks to isolate the CPU.
 
At the same time, I think you definitely want both.

You want to isolate the CPU as much as possible to see what is likely to happen in some yet-unknown gaming scenario where you are not GPU bound. But look at how misleading those 400p resolution tests showing a +50% gaming boost would be if we had only those. A CPU's ability to "feed" a video card is an essential part of its gaming performance in almost all the scenarios that someone buying an expensive new CPU would care about, so showing how good they are at that is essential.

Including some Flight Simulator 2020 type of game that is known to be CPU heavy even under 70 FPS, for example, would be a good one, as would testing how much a DirectX 12/Vulkan multicore scenario does.

Many potential buyers have a question: "I have a 2070 Super, or just got a new 3080; what do I gain as a gamer by upgrading my CPU?" That is what they want to see, and if they are playing at 1440p or 4K, it is absolutely those resolutions that will be telling for them, and they should obviously wait until a scenario exists in which it will matter at those resolutions (by then an even better alternative or a better price point is likely to exist).

I suspect there is a bit of "it makes the CPU choice virtually irrelevant and boring to gamers, who are a large part of the audience, and takes views away from reviewers" that could be going on sometimes.

All those benchmarks will need to be redone with the 6800/6800 XT as well, I imagine, and it would be ridiculous, when they compare a 5950X vs. a 10900K with a 6800 XT, to only show low-detail 1080p benchmarks to isolate the CPU.

You don't have to tell me. I include 4K testing in my CPU reviews. Unfortunately, AMD decided to sample us late and didn't provide us with all the available models. My reviews will be delayed quite some time as we'll have to purchase the 5900X and 5950X ourselves.
 
You don't have to tell me. I include 4K testing in my CPU reviews. Unfortunately, AMD decided to sample us late and didn't provide us with all the available models. My reviews will be delayed quite some time as we'll have to purchase the 5900X and 5950X ourselves.
That sucks.
 
I have no idea, I just know that my first thought was definitely not "these are cool and all, but how will I afford the RAM?"
Oh, my bad.
"The term sweetspot is used incorrectly. It referes to the max speed that you can slap the fastest sticks in and get them to run which is 3600mhz for Zen 2 and now 3800mhz for Zen 3. The difference between the sweetspots is that you can still clock higher but you have to manually do it. Thus on Zen 2 going from 3600mhz to 3800mhz you have to manually set the IF to 1900mhz to support 3800mhz memory and then proceed to tweak the timings. Conversely, you'd have to do the same with Zen 3 when going to 4000mhz ram and 2000mhz IF. Essentially what it means is that the maximum speed the Infinity Fabrick (IF/fclk) will run went from 1900mhz on Zen 2 to 2000mhz on Zen 3. The sweet spot of each is 100mhz below the maximum."
 
We have a covid retail restriction here of 25% capacity.... which means about 5-6 people in our local part shop. I drove by this morning and the line for their door was almost 2 blocks long. lol

I hope stock is good... AMD is killing it.
Just checked the Chicago Micro Center: all SKUs, in store only, 10+ in stock, 3950X, $649.99.
 
Just checked the Chicago Micro Center: all SKUs, in store only, 10+ in stock, 3950X, $649.99.
Of the 5000 series? Because I don't show any availability at either the Chicago or Westmont store. It said 10+ for the 5900X for Westmont before launch this morning and now it doesn't say anything.

I would call ahead. I highly doubt that they're in stock.
 
"The term sweetspot is used incorrectly. It referes to the max speed that you can slap the fastest sticks in and get them to run which is 3600mhz for Zen 2 and now 3800mhz for Zen 3. The difference between the sweetspots is that you can still clock higher but you have to manually do it. Thus on Zen 2 going from 3600mhz to 3800mhz you have to manually set the IF to 1900mhz to support 3800mhz memory and then proceed to tweak the timings. Conversely, you'd have to do the same with Zen 3 when going to 4000mhz ram and 2000mhz IF. Essentially what it means is that the maximum speed the Infinity Fabrick (IF/fclk) will run went from 1900mhz on Zen 2 to 2000mhz on Zen 3. The sweet spot of each is 100mhz below the maximum."
[3:04 PM]
Forge :
"Sweet spot" refers to a place where the difficulty/cost starts to scale upward sharply and the benefits no longer do.
And in this case, Ryzen IMC sweet spots had a general limit, with some few exceptions, just like any other silicon.
And the Ryzen 3 sweet spot has shifted upward slightly versus the Ryzen 2 one. No surprise.
But it's not like 3600 is suddenly garbage.

[3:05 PM]
Forge :
It just means that if you can get 3800+ ram for a small or no price increase over 3600, it now makes sense to do so.
I'm running my Ryzen 2 on DDR4-3000 and I'm not suffering at all.
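
For anyone who wants to sanity-check the arithmetic behind all this, here's a quick back-of-the-envelope sketch (the helper name is mine, and the fclk ceilings are just the rough per-generation limits quoted above; plenty of individual chips land above or below them). DDR4 speed is a data rate, so the actual memory clock is half of it, and 1:1 operation means fclk equals that memory clock.

# Rough 1:1 Infinity Fabric math. DDR4 "speed" is a data rate (MT/s);
# the memory clock is half that, and 1:1 means fclk == memory clock.
# Ceilings are the approximate per-generation fclk limits discussed above.

FCLK_CEILING = {"Zen 2": 1900, "Zen 3": 2000}  # MHz

def fclk_for_1to1(ddr_rate: int) -> float:
    """fclk needed to run 1:1 with a given DDR4 data rate."""
    return ddr_rate / 2

for rate in (3600, 3800, 4000):
    fclk = fclk_for_1to1(rate)
    fits = {gen: fclk <= cap for gen, cap in FCLK_CEILING.items()}
    print(f"DDR4-{rate}: needs fclk {fclk:.0f} MHz -> {fits}")

# DDR4-3600: needs fclk 1800 MHz -> {'Zen 2': True, 'Zen 3': True}
# DDR4-3800: needs fclk 1900 MHz -> {'Zen 2': True, 'Zen 3': True}
# DDR4-4000: needs fclk 2000 MHz -> {'Zen 2': False, 'Zen 3': True}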
 
Just saw a Flight Simulator test in the LTT review... amazing.
That 1920x1080 ultra setting result is interesting:
https://tweakers.net/reviews/8270/1...ight-simulator-2020-en-crysis-remastered.html

If the 5950X didn't exist, then considering that the 5900X, 3900XT, 5600X, and 10900K all have identical average FPS, we would assume the game was GPU bound at ultra graphics settings with them (if it is not a typo), when it could maybe do 70 FPS with a much stronger CPU. Who knows.

I wonder if some other scenarios like that will start to occur (say, an ultra-fast M.2 drive posting the same numbers as a regular SSD, PCI Express 4.0 vs. 3.0 for the video card, little or no difference with RAM past a certain speed) that will be opened up by overclocked 5000-series CPUs.
 
Thanks fellas, understand, I'm aware of why testing at 1080p is done.........my issue was the fact that all of the "known news sources" for this release kinda wasted our collective time by including "WOW OMG!" thumbnails and then telling us shit we expected......"Ryzen is finally beating Intel at gaming.......".

But gaming at 1080p is the only thing they focused on (I KNOW)...to stress the impact of the CPU. Thing is, you've been able to game....at 1080p...for a decade or so on chips you can buy for $50. So while the graphs are great, and chasing 200+ FPS at 1080p for..some reason..competition, whatever....is relevant, it's not really answering the *other half* of the question I think we all want answered: what does this mean for people running 1080 Tis or 2070s or Maxwell 980s....does this CPU magically give you "moar framez!" somehow due to its multicore and faster design......at more modern resolutions? Does your brand new 3080 or 3090 suddenly pick up 20+ frames more than it did on an Intel part "because AMD's sub-processor scaling array with added moxie is now operating at 3.3 gigaflops versus last gen's 2.1 whatevers...", etc.

It's just one of those irritating omissions....I know there will be a smaller impact, but I'm sure all of us would like to know "ok, how small". But you know..that takes away from the *drama*....and drama gets clicks.
 
Thanks fellas, understand, I'm aware of why testing at 1080p is done.........my issue was the fact that all of the "known news sources" for this release kinda wasted our collective time by including "WOW OMG!" thumbnails and then telling us shit we expected......"Ryzen is finally beating Intel at gaming.......".

But gaming at 1080p is the only thing they focused on (I KNOW)...to stress the impact of the CPU. Thing is, you've been able to game....at 1080p...for a decade or so on chips you can buy for $50. So while the graphs are great, and chasing 200+ FPS at 1080p for..some reason..competition, whatever....is relevant, it's not really answering the *other half* of the question I think we all want answered: what does this mean for people running 1080 Tis or 2070s or Maxwell 980s....does this CPU magically give you "moar framez!" somehow due to its multicore and faster design......at more modern resolutions? Does your brand new 3080 or 3090 suddenly pick up 20+ frames more than it did on an Intel part "because AMD's sub-processor scaling array with added moxie is now operating at 3.3 gigaflops versus last gen's 2.1 whatevers...", etc.

It's just one of those irritating omissions....I know there will be a smaller impact, but I'm sure all of us would like to know "ok, how small". But you know..that takes away from the *drama*....and drama gets clicks.
I think after today, almost all of the GPU and gaming benchmarks going forward will be on X570/5950X test benches since it is now the top gaming setup. So you will eventually see 1440p/4K results.
 
[attached chart: GPU-limited benchmark results]

Boring GPU Limited Results, as requested
 
Of the 5000 series? Because I don't show any availability at either the Chicago or Westmont store. It said 10+ for the 5900X for Westmont before launch this morning and now it doesn't say anything.

I would call ahead. I highly doubt that they're in stock.
I don't know why I checked the 3000 series, maybe I had that search saved. Checking the 5000 series, you are correct, 0 in stock, status "LIMITED AVAILABILITY". Sorry for the false hope.
 
That sucks.

Unfortunate side effect of being a newer site. We don't have the readership to warrant a lot of consideration from AMD. Who we are, and the fact that we worked with Kyle for so long, has helped a lot on the video card, PSU, and motherboard side. It has not helped as much on the CPU side. Well, not with AMD anyway. Having said that, I think we get a fair amount more consideration and product than we normally would, given that the PR people know us from HardOCP.

That is to say, we probably wouldn't get anything if we had all popped up out of nowhere with no experience doing reviews. But in the meantime, it sucks that we can't always have day-one reviews for our readers.
 
Thanks fellas, understand, I'm aware of why testing at 1080p is done.........my issue was the fact that all of the "known news sources" for this release kinda wasted our collective time by including "WOW OMG!" thumbnails and then telling us shit we expected......"Ryzen is finally beating Intel at gaming.......".

But gaming at 1080p is the only thing they focused on (I KNOW)...to stress the impact of the CPU. Thing is, you've been able to game....at 1080p...for a decade or so on chips you can buy for $50. So while the graphs are great, and chasing 200+ FPS at 1080p for..some reason..competition, whatever....is relevant, it's not really answering the *other half* of the question I think we all want answered: what does this mean for people running 1080 Tis or 2070s or Maxwell 980s....does this CPU magically give you "moar framez!" somehow due to its multicore and faster design......at more modern resolutions? Does your brand new 3080 or 3090 suddenly pick up 20+ frames more than it did on an Intel part "because AMD's sub-processor scaling array with added moxie is now operating at 3.3 gigaflops versus last gen's 2.1 whatevers...", etc.

It's just one of those irritating omissions....I know there will be a smaller impact, but I'm sure all of us would like to know "ok, how small". But you know..that takes away from the *drama*....and drama gets clicks.
There are plenty of reviews with 1440p and 4K tests.
 
I applaud AMD for taking the "at least" route instead of the "up to" route. Impressive showing today. Well done!
 