Ryzen 5600X3D potential?

After MC sees these aren't the hit they expected, the price will be $199 very quickly. Maybe even over the weekend. I'll get one then.

I'd still wait; the 5800X3D is the only AM4 CPU worth buying at this point. 6-core CPUs should only be in budget builds, and the 5600X bundle at MC that was just $189 last week is the way to go. You can slap that combo in almost any older case, and if you have something built in the last 8 years, you can reuse the DDR4 for it. It also comes with a decent heatsink, so you don't have to purchase one.
 
I would probably buy that processor, but it's only available at Micro Center in the US and you have to pick it up there; they don't ship it anywhere at all, and I'm in Europe. It's a shame it's not available globally and in large quantities.
And normally the price would be lower if it were available in large quantities. The 5800X is too strong for my motherboard...
 
Of course I have no way of knowing how many they have, but Microcenter here still shows 25+.

I’ll get one when it goes on sale or shows under 24 left maybe.

IMG_5444.jpeg
 
Yes, all Washington, DC area stores show 25+ in stock.
And what is 25+ supposed to mean now? 250, 2,500, or 25,000? Nonsense. How many stores does Micro Center have?
It turns out they didn't really have that much left over from Ryzen 5800X3D production after all.
 
And what is 25+ supposed to mean now? 250, 2,500, or 25,000? Nonsense. How many stores does Micro Center have?
It turns out they didn't really have that much left over from Ryzen 5800X3D production after all.

Add it to your cart and then adjust the quantity. Denver currently has 35 in stock. Rockville has 68 in stock, so almost twice as many as Denver.
 
Also, this is the first time in recent memory I've seen a Microcenter processor not have the '1 Per Household' limit imposed. Apparently you can buy as many 5600X3Ds as you want, albeit only in store.
 
I think they have more of these than people think. I was at my local Microcenter the day it came out, picking one up along with some other stuff, and there were at least 20 other people getting one that morning, yet the store still says 25+ today.

Put mine in one of my machines that had a 2600 in it. The processor rips in games; it feels just as good as my 5800X3D does.
 
I think they have more of these than people think. I was at my local Microcenter the day it came out, picking one up along with some other stuff, and there were at least 20 other people getting one that morning, yet the store still says 25+ today.

Put mine in one of my machines that had a 2600 in it. The processor rips in games; it feels just as good as my 5800X3D does.
Yeah, I believe the majority of games don't perform better going from 6 to 8 cores, all else being equal.
I'm thinking having two fewer cores than the 5800X3D but the same total stacked cache should give the CPU some advantage? Maybe that is why they lowered the clocks, to compensate. Anyone have thoughts on this?
 
Yeah, I believe the majority of games don't perform better going from 6 to 8 cores, all else being equal.
I'm thinking having two fewer cores than the 5800X3D but the same stacked cache should give the CPU some advantage? Maybe that is why they lowered the clocks, to compensate. Anyone have thoughts on this?
Could be. It runs pretty cool at its current clocks with the Wraith Prism cooler and the Thermal Grizzly paste I use. Unless I have a frame counter on, it's hard to tell the difference in many games. Of course I don't have apples-to-apples systems, since they're 5800X3D/6800XT/32GB/X570 MB and 5600X3D/6700/32GB/B450 MB, BUT I can switch between them and get a close enough experience.
 
Could be. It runs pretty cool at its current clocks with the Wraith Prism cooler and the Thermal Grizzly paste I use. Unless I have a frame counter on, it's hard to tell the difference in many games. Of course I don't have apples-to-apples systems, since they're 5800X3D/6800XT/32GB/X570 MB and 5600X3D/6700/32GB/B450 MB, BUT I can switch between them and get a close enough experience.
Yeah, I believe the majority of games don't perform better going from 6 to 8 cores, all else being equal.
I'm thinking having two fewer cores than the 5800X3D but the same total stacked cache should give the CPU some advantage? Maybe that is why they lowered the clocks, to compensate. Anyone have thoughts on this?
Yes, the clock speeds seemed very strange to me too.
So is it possible to overclock manually, or is the processor limited to its factory specifications? Forget it, I've now read that there is no overclocking.

My motherboard's support list says the 5800X3D is supported; whether I need a BIOS update for the 5600X3D or not, I don't know.

If someone is selling a 5600X3D at a good price, they can contact me, but it would have to be shipped to Europe at the lowest possible price and in a way that avoids taxes and the like... would that be possible?
 
Yes, the clock speeds seemed very strange to me too.
So is it possible to overclock manually, or is the processor limited to its factory specifications? Forget it, I've now read that there is no overclocking.

My motherboard's support list says the 5800X3D is supported; whether I need a BIOS update for the 5600X3D or not, I don't know.

If someone is selling a 5600X3D at a good price, they can contact me, but it would have to be shipped to Europe at the lowest possible price and in a way that avoids taxes and the like... would that be possible?
Should be fine; my B450's recent BIOS update was for extra 5800X3D support and it runs the 5600X3D perfectly.

And yeah, X3D parts have zero overclocking, just normal Ryzen boosting, so cool it well and it will boost like it should.
 
Yes, the clock speeds seemed very strange to me too.
So is it possible to overclock manually, or is the processor limited to its factory specifications? Forget it, I've now read that there is no overclocking.

My motherboard's support list says the 5800X3D is supported; whether I need a BIOS update for the 5600X3D or not, I don't know.

If someone is selling a 5600X3D at a good price, they can contact me, but it would have to be shipped to Europe at the lowest possible price and in a way that avoids taxes and the like... would that be possible?
Not sure if my point was understood. I am saying that there is LITERALLY more 3D cache per core on the SIX-core vs. the EIGHT-core.
Should this not give an incremental increase? Surely 6 cores with more 3D stacked cache per core have an advantage in scenarios where the 3D cache is utilized?
Hence my point that they deliberately dropped the core clocks so it wouldn't best the "under-utilized" 8-core part.
Do you get what I am getting at?
 
Not sure if my point was understood. I am saying that there is LITERALLY more 3D cache per core on the SIX-core vs. the EIGHT-core.
Should this not give an incremental increase? Surely 6 cores with more 3D stacked cache per core have an advantage in scenarios where the 3D cache is utilized?
Hence my point that they deliberately dropped the core clocks so it wouldn't best the "under-utilized" 8-core part.
Do you get what I am getting at?
Should be fine; my B450's recent BIOS update was for extra 5800X3D support and it runs the 5600X3D perfectly.

And yeah, X3D parts have zero overclocking, just normal Ryzen boosting, so cool it well and it will boost like it should.
Do you want to sell your 5600X3D? I can only buy used from a private individual. I am in the EU.

I understand you: they lowered the clocks on purpose so as not to kill sales of the 5800X, 5700X, and similar.
 
Would the 5600X3D be a good server chip? Something that could be thrown in an ASRock Rack motherboard?
 
Not sure if my point was understood. I am saying that there is LITERALLY more 3D cache per core on the SIX-core vs. the EIGHT-core.
Should this not give an incremental increase? Surely 6 cores with more 3D stacked cache per core have an advantage in scenarios where the 3D cache is utilized?
Hence my point that they deliberately dropped the core clocks so it wouldn't best the "under-utilized" 8-core part.
Do you get what I am getting at?
The 3D cache is shared amongst the cores, so it doesn't matter. If a job only needs 6 cores, the 5800X3D would have the same amount of 3D cache per core.
 
Would the 5600X3D be a good server chip? Something that could be thrown in an ASRock Rack motherboard?
No.

The 5800X3D would be the better chip if there were a comparison to consider, and it is most certainly not regarded as a compute-excelling / non-gaming piece.
 
The 5800X3D is around 6% faster than the 5600X3D at 1080p, 5% at 1440p, and 1% at 2160p.


View: https://www.youtube.com/watch?v=kpG4UoWihEg


CPUs don't get faster or slower depending on your display resolution, they are just more likely to be handicapped by a GPU bottleneck at a higher resolution. Introduce a big enough GPU bottleneck and a 7800X3D would be the same speed as a 2500k. Sort of like how all cars are the same speed while stopped at a red light.

But you have a lot of different ways to reduce GPU bottlenecks on a per-game basis by adjusting graphics settings as necessary.
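To put a rough model on that (purely an illustrative sketch, not a real benchmark; the frame-time numbers below are made up), frame time is roughly the longer of the CPU's and GPU's per-frame work, so once the GPU term dominates, the faster CPU stops showing up in the FPS:

```python
# Toy model: frame time ~= max(CPU frame time, GPU frame time).
# All millisecond values are hypothetical, chosen only to illustrate the point.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Approximate FPS when the slower of the two stages sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

faster_cpu_ms = 5.0   # hypothetical "faster CPU" frame time
slower_cpu_ms = 5.3   # hypothetical "slower CPU" frame time (~6% behind)

for label, gpu_ms in [("GPU-light (low res)", 4.0), ("GPU-heavy (4K)", 12.0)]:
    a = fps(faster_cpu_ms, gpu_ms)
    b = fps(slower_cpu_ms, gpu_ms)
    print(f"{label}: {a:.0f} vs {b:.0f} FPS ({(a / b - 1) * 100:.1f}% gap)")
```

With those made-up numbers the gap is ~6% when the GPU is light and 0% when the GPU is the bottleneck, which is the "same speed at a red light" effect.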
 
CPUs don't get faster or slower depending on your display resolution, they are just more likely to be handicapped by a GPU bottleneck at a higher resolution.
And that "handicap" is demonstrated by the 5600X3D being "SLOWER" than the 5800X3D at performing the same task, under the same conditions.

However you want to spin it, the 5800X3D is still "FASTER" in all resolutions that "MATTER", be it 1080p-1440p-2160p.
HWUB just made a graph to show this by an approximation of how much "SLOWER" the 5600X3D is than the 5800X3D.
 
This seems to show that it would have been a nice product if they hadn't had enough working 8-core chips, i.e. if there were bad bins whose cache would otherwise be "wasted" (if the 3D parts had been available at launch, that could have been a thing).

But it seems to be a pure "demo" product.
 
And that "handicap" is demonstrated by the 5600X3D being "SLOWER" than the 5800X3D at performing the same task, under the same conditions.
The only "handicap" I mentioned was a GPU bottleneck. GPU bottlenecks do not "demonstrate" the differences between CPUs, they hide the differences between CPUs - because you are handicapping the faster CPU(s). This is why the performance difference is smaller at 4K, and it's not due to the pros and cons of either CPU.

However you want to spin it, the 5800X3D is still "FASTER" in all resolutions that "MATTER", be it 1080p-1440p-2160p.
HWUB just made a graph to show this by an approximation of how much "SLOWER" the 5600X3D is than the 5800X3D.

Not sure if you fully understood my post or what I was actually addressing, but I never said that the 5600X3D was faster than or equal to the 5800X3D. The 5600X3D is a 5800X3D with 2 cores disabled and base/boost clocks that are 100MHz lower, so it would be a neat trick if the 5600X3D were faster.
 
The only "handicap" I mentioned was a GPU bottleneck. GPU bottlenecks do not "demonstrate" the differences between CPUs, they hide the differences between CPUs - because you are handicapping the faster CPU(s). This is why the performance difference is smaller at 4K, and it's not due to the pros and cons of either CPU.



Not sure if you fully understood my post or what I was actually addressing, but I never said that the 5600X3D was faster than or equal to the 5800X3D. The 5600X3D is a 5800X3D with 2 cores disabled and base/boost clocks that are 100MHz lower, so it would be a neat trick if the 5600X3D were faster.

star-trek-ball.gif
 
OK, so now instead of saying the 5600X3D was 6% slower than the 5800X3D at 1440p, we will say the game ran 6% slower at 1440p on the 5600X3D.

Same for GPUs: if the game ran 23% slower on X instead of Y, we don't know how much slower the GPU itself was from that information alone...

It seems like everyone knows what is meant, even if they skip that part.
 
OK, so now instead of saying the 5600X3D was 6% slower than the 5800X3D at 1440p, we will say the game ran 6% slower at 1440p on the 5600X3D.
Better yet would be to simply drop the obsession with categorizing CPU performance based on GPU-centric criteria such as display resolution.

"The 5800X3D is 6% faster than the 5600X3D when you are not GPU-limited"

See, it's not that hard.

When you don't really know what you're talking about, post a meme instead. It's easier than learning.

As a Star Trek fan, it pains me that you couldn't even find the full version of that GIF.
 
"The 5800X3D is 6% faster than the 5600X3D when you are not GPU-limited"

See, it's not that hard.
When would that be, and how would we know that not a single ms changed, with the render queue always empty?

I imagine not using a game with a viewport (or a game at all) would be better for that, but people will often be interested in how well a CPU feeds render calls, which requires making enough of them to be relevant, and enough that the rendering risks impacting the game's framerate.

It's quite possible that by the time there's a new Radeon 9900 XTX and DDR5-6000 CL20 with the latest drivers, the same games will run 8% faster on the 5800X3D instead of 6% at 1080p... and that a newer game that uses 8 cores will run 9.5% faster.

A CPU will never be X% faster than another; it will have been X% faster at a specific set of tasks.
 
When you don't really know what you're talking about, post a meme instead. It's easier than learning.

As a Star Trek fan, it pains me that you couldn't even find the full version of that GIF.
I am sorry that I only had half the amount of time to devote to a keyboard warrior of such prestigious standing as yourself.
I'm a Trekkie as well, and Wesley never even made my B list; I was very much a happy camper when the Traveler liberated him from the Enterprise.

Feel free to tell me what I was doing wrong. Would it make you feel better if I just copy and paste some graphs that say the same thing?

1692747504797.png


1692747563032.png


1692747621864.png
 
Would it make you feel better if I just copy and paste some graphs that say the same thing?
Actually the graphs are considerably more informative, since depending on the game you play, the actual benefit (in non-GPU-limited situations, just according to those graphs) can be anywhere from 0-15+%. It's basically what I said a month and a half ago:
It really depends on what games you play.


A CPU will never be X% faster than another; it will have been X% faster at a specific set of tasks.
I was only generalizing as much as you were with your comment:
OK, so now instead of saying the 5600X3D was 6% slower than the 5800X3D at 1440p, we will say the game ran 6% slower at 1440p on the 5600X3D.
Which "specific set of tasks" were you referring to with that comment? Obviously, I do agree that it's better to look at results on a game-by-game basis, based on what games you actually play. I was just giving an example of how it's not necessary to tie CPU benchmarks to specific display resolutions and imply that a CPU is somehow less powerful just because you are running 4K (and thus, in most cases, introducing a GPU bottleneck).

It's actually very easy to determine how GPU-limited you are: you just monitor your GPU utilization. It's either hitting 100% GPU utilization or it's not. It's not rocket science.
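If you'd rather log it than eyeball an overlay, here's a minimal sketch, assuming an NVIDIA card with the nvidia-ml-py (pynvml) package installed; AMD cards would need whatever overlay or driver stats you already use instead:

```python
# Minimal GPU-utilization logger (assumption: NVIDIA GPU, `pip install nvidia-ml-py`).
# If utilization sits at ~100% while the game runs, you're GPU-limited;
# if it dips well below, the CPU (or something else) is the cap.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    for _ in range(30):  # sample roughly once a second for ~30 seconds
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        print(f"GPU: {util.gpu:3d}%  VRAM controller: {util.memory:3d}%")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```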
 