7800X3D will be an utter failure of a CPU

1337Goat (Gawd) · Joined: Apr 15, 2020 · Messages: 743
With last gen's main offerings, the 5800X vs the 5900X, the clock difference was a simple 0.1 GHz: one chip beat the other's base clock, and the other beat its counterpart's boost clock by the same amount. I figured either choice would be decent. Fair trade-offs, in my opinion.

However, with this gen, I'm seeing a terrible clock difference here, of course due to the single CCX design, but nonetheless, for us consumers, behold how we suffer.

The 7800X3D can only boost to 5.0 GHz, while the 7900X3D can boost to 5.6 GHz. This is simply pathetic, and it turns the 7800X3D into a low-end, cheap CPU. In my last build, I went with the 5800X because I could fully appreciate 8 cores with a higher base clock than the more expensive CPUs. That's good enough for gaming and other common work, and on top of that, the high clock speeds were nice for the price.

But now I'm faced with an absolute garbage 7800X3D with lame boost clocks and a "4.X GHz" base clock, which AMD's website reveals to be 4.2 GHz. That's the same as the 7950X3D's base clock, but with literally half the cores.
Can this be justified? No. Not at all.

This forces us to buy the 7900X3D if we want true gaming performance, and the 7950X3D if we want cores and performance.

And if you want a pitiful processor that lacks both cores and fast clocks, that leaves the 7800X3D. That's certainly not what I would purchase, and if I wouldn't do it, who would?
It better be cheap is all I can say.
[Attached image: 2020-10-08-image-14.jpg]

[Attached image: amd-ryzen-7000-x3d.jpeg]
 
The 7900X3D has two CCDs, and only one of them has 3D cache. Only the NON-3D cache CCD can boost that high. I think that's the one missing factor here.

WHAT!!!!

That's just brutally awesome to be quite honest, and only fortifies my main point. Poor 7800X3D. Just not gonna win any awards compared to more versatile and powerful processors.
 
Well, there are a lot of factors at play. A game or program that runs on the 3D cache CCD of the 7900X3D/7950X3D should perform similarly to the 7800X3D; they should have similar boost clocks. Or it could be assigned to the other CCD on the 7900X3D/7950X3D and get potentially higher boost clocks but no 3D cache, which some other programs and games seem to prefer.

So if we assume that every program or game is always being correctly assigned to the CCD it will benefit most from, then that would allow the versatility of the 7900X3D and 7950X3D to shine. But it also highlights the fact that you are really at the mercy of the scheduler to assign programs to the correct CCD. You are also subject to the additional latency of multi-CCD chips when threads have to communicate between cores across the Infinity Fabric.

The comparative simplicity of the 7800X3D might be nice because you remove any issues related to which CCD your program or game will run on. You're not at the mercy of the scheduler. If you know that the main game(s) you play benefit from 3D cache, then this would obviously be your best bet to make sure your game is always running on a 3D cache CCD. And cores never have to communicate across the Infinity Fabric.

So I think there could still be a niche for the 7800X3D. I've always maintained that your best bet when selecting a CPU is to look up benchmarks for the games you actually play. Looking at games that are known to benefit heavily from 3D cache and comparing the 7800X3D vs the 7900X3D should tell the story here.
 
Oh yes, you're absolutely right. In the end, only real research matters.
My opinion here is most certainly a mere gamble. When it really comes down to it, only actual real-world performance matters, and if I get proven wrong, I couldn't be happier.
 
Has anyone used Process Lasso to force programs to a specific CCD?

I was looking to move the gaming PC to the 7800X3D and have the 5950X replace my aging dual-Xeon setup.
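I haven't tried Process Lasso for this, but the basic idea can be sketched by hand with CPU affinity. Rough illustration only (Python with psutil); the core numbering and the exe name below are assumptions you'd need to verify on your own chip:

[CODE]
# Rough sketch: keep a game on one CCD by restricting its CPU affinity.
# Assumptions (check your own topology): the 3D V-Cache CCD is CCD0 and maps
# to logical CPUs 0-15 (8 cores with SMT), and the game exe is "mygame.exe".
import psutil

VCACHE_CPUS = list(range(16))   # hypothetical: logical CPUs on the cache CCD
GAME_EXE = "mygame.exe"         # hypothetical process name

for proc in psutil.process_iter(["name"]):
    if (proc.info["name"] or "").lower() == GAME_EXE:
        proc.cpu_affinity(VCACHE_CPUS)  # scheduler now only uses those CPUs
        print(f"pinned PID {proc.pid} to CPUs {VCACHE_CPUS}")
[/CODE]

As far as I know, Process Lasso is doing essentially the same affinity trick, just persistently and with a nicer UI.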
 
The 7800X3D will most likely age better than the 7900X3D due to having 8 cores with V-Cache rather than 6. The 7900X3D is essentially a 6-core/12-thread CPU for gaming, as you lose performance once the second CCD starts getting involved. TechPowerUp has already previewed what we can expect from the 7800X3D, and it should be about on par with the 7950X3D while not having any potential scheduling issues, thanks to the single-CCD design. The 7900X3D is the odd CPU out IMO; I would go with either the 7800X3D or the 7950X3D for a gaming system, given how these CPUs work.
 
I thought the 5800X3D proved that more cache is broadly beneficial across more games than the extra clock speed you're forgoing (e.g. compared to the 5800X)?
Techspot comparison

I think as long as the clock speed penalty isn't too great (and it doesn't seem to be), the bigger issue is probably the price premium over a non-V-Cache model. Hopefully with Zen 5 or onward AMD will just go with V-Cache from the get-go...
 
The 7800X3D will likely be one of the best bang-for-the-buck gaming CPUs ever made. The 7950X3D has some use cases; the useless one is the 7900X3D: 6-core CCDs, so fewer cores have access to the bigger cache.
 
Yeah, until the next one they release next year. Maybe they can get their shit together enough so that the processor doesn't die when you attempt to overclock it. But I suspect most of us aren't gonna extreme OC any of these chips anyway.

The 7900X3D is gonna be fine, because they dropped the clocks on the 7800X3D significantly. The 7600 is essentially the same at gaming as the 7700, so the 7900X3D will be just fine.

Looks like all the rage is surrounding the speculated 8000 series chips with X3D killing Intel's gaming dominance once and for all. Something is always the best value until the next best value... Hope it happens. If it does, don't expect AMD to be your friend and price it affordably. Nothing preventing them from becoming the Intel of old. They're a corporation, they don't give a shit about you.
 
It's not going to be fine because nobody is buying it. It's a stupid compromise.
Maybe. No one is buying it because there's almost no reason to move off the 5800X3D if you're a 1080P gamer. Honestly, I think the X3D processors are bullshit anyway because they do nothing for higher resolutions. It's a stopgap solution, gluing extra cache on top of a chiplet to speed up certain workloads. It causes a scheduling nightmare in dual-CCD parts, and the new implementation is hardly better than the old one.

Now, on the server side... they are gluing the shit out of their Epyc processors and there is a literal fuck ton of X3D cache on them. Those are the ones to watch.

If AMD really wanted to make a gaming processor they would designate one part in their product stack and bitch it out. Instead they're just slapping cache on one CCD across their current lineup. It's gonna do jack and shit for most people. Why are you even running in 1080P on a 7900X+ anyway? You don't need more than 8 cores for gaming in general. That may change, but it will be a slow change, heavily dependent on how many cores consoles adopt, since consoles drive the gaming market.
 
Honestly, I think the X3D processors are bullshit anyway because they do nothing for higher resolutions.
Again: it really does more in certain games than the "moar FPS" that the first-person gamer base is so worried about. CPU-intensive sim games and the like tend to see more benefit. For some games it is indeed bullshit. In others it's a godsend.

Up to the buyer to do the legwork on which is applicable for them.
 
"This forces us to buy the 7900X3D if we want true gaming performance, and the 7950X3D if we want cores and performance."

This is incorrect. Nobody is forcing you to buy anything. If you have a 5800X3D then you are upgrading for the sake of throwing money at a corporation, so certainly nobody is forcing you to buy a new mobo, RAM, and CPU. Not to mention AM5 is fresh out of the gate; these are processors that came out within the first year of the platform launching. How long was AM4 around before it even offered an X3D option to begin with?

Why not just say AM5 is a total failure because the unrealistic expectations I demand are not being met?

Buy Intel then; it's a toaster oven on old architecture, but you can brag about how fast it is.
 
The 7900X3D is gonna be fine, because they dropped the clocks on the 7800X3D significantly.

They did not "drop the clocks on the 7800X3D significantly". The CCD on the 7900X3D and 7950X3D that has 3D cache ALSO has much lower clocks. It's ONLY the non-3D cache CCD on those chips that has the higher clocks. Of course, it should have been fully expected that AMD's marketing department would gloss over that fact when talking about the clock speeds of those CPUs; only mentioning the faster clocks of the non-3D cache CCD. They can't do the same for the 7800X3D since it only has one CCD and it's an 3D cache CCD.

Why are you even running in 1080P on a 7900X+ anyway?

What makes you so certain that many people are? The reason 1080P benchmarking (and even 720P benchmarking in some cases) is done is to eliminate GPU bottlenecks from the equation during CPU benchmarking. Most sites offer 1440P and 4K benchmarks as well, so they aren't hiding anything. It's simply not helpful or accurate to purposefully introduce GPU bottlenecks into the equation and then claim that that slows down the CPU. Even if that makes a faster CPU irrelevant under certain circumstances, that isn't the point of the benchmark. Someone might upgrade their GPU later on while still owning that same CPU. Most games also offer many in-game video settings that can reduce GPU usage, whereas it's much harder to reduce CPU usage via in-game settings.
 
I agree that the 7900X3D seems like the loser of the bunch. The core penalty seems like it would matter over time more than the slight boost in clock speed it gets vs the 7800X3D when using the 'speed' CCD. The 7800X3D doesn't get the higher clock CCD, but it is cheaper and you don't have to worry about where your games are running. I'm probably just going to get the 7800X3D since I don't do a lot of production stuff these days and I don't want to mess with Xbox Game Bar.
 
If AMD wants to flood the market with multiple SKUs and whatnot per socket generation, I'm all for it. The illusion of an upgrade path lets them convert wary cheapskates like me into performance fiends looking for the cheapest fix.
 
The illusion of an upgrade path lets them convert wary cheapskates like me into performance fiends looking for the cheapest fix.
Or just offer incredible deals. I never thought I'd have a 12-core CPU until Microcenter said "Come right in..."
 
Your first pic should be the X3D parts, because it's a 500 MHz improvement when you compare the correct chips, 5800X3D vs 7800X3D...
Totally going to be a failure, though. It offers the already-proven pile of extra V-Cache on top of a very significant clock boost over the already exalted 5800X3D.

*Edit* Forgot to mention: NO weird CCX shenanigans like the other two X3D chips. Going to be a nice, streamlined gaming CPU. Sign me up.
 
There's a reason the 7800X3D isn't available yet, and it's not that it's going to suck; it's that they didn't want to undercut sales of the other two. As others have mentioned, the clock speed difference on the CCD with 3D cache will be minor, and the other cores won't generally be used for games.

The 7950X3D makes sense as a hybrid gaming/productivity PC, or for those that don't want to make any compromises, though the scheduler complexity creates a drawback for gaming as well. The 7900X3D is the odd man out of the bunch, and it shows in how poorly it's selling, even before there's a cheaper alternative that will likely outperform it in games more often than not.
 
They did not "drop the clocks on the 7800X3D significantly". The CCD on the 7900X3D and 7950X3D that has 3D cache ALSO has much lower clocks. It's ONLY the non-3D cache CCD on those chips that has the higher clocks. Of course, it should have been fully expected that AMD's marketing department would gloss over that fact when talking about the clock speeds of those CPUs; only mentioning the faster clocks of the non-3D cache CCD. They can't do the same for the 7800X3D since it only has one CCD and it's an 3D cache CCD.



What makes you so certain that many people are? The reason why 1080P benchmarking (and even 720P benchmarking in some cases) is done is to eliminate GPU bottlenecks from the equation during CPU benchmarking. Most sites also offer 1440P and 4K benchmarks also, so they aren't hiding anything. It's just simply not helpful or accurate to purposefully introduce GPU bottlenecks into the equation and then claim that that slows down the CPU. Even if that makes a faster CPU irrelevant under certain circumstances, that isn't the point of the benchmark. Someone might upgrade their GPU later on while still owning that same CPU. Most games also offer many in-game video settings that can reduce GPU usage, whereas it's much harder to reduce CPU usage via in-game settings.
I'm not suggesting anything is being hidden. I'm just saying the cache really gives a boost to 1080P gaming, diminishing returns past that.

I understand why they haven't released it yet. They are milking the consumer base.

I guess, aside from the server side of things where AMD is really lining up the cache-enabled parts to kick ass... I just wonder why these parts even matter. It's cool that you can glue on some cache memory to get better performance, but anyone can do that. This doesn't feel like a permanent solution. It feels like an experiment. Like the processor wasn't good enough so we threw cache at the problem to see if it worked. And it did, in some workloads.

IDK - I'm just rambling at this point and you have some good points.
 
I feel it will be a no-brainer, the best CPU for any gamer not already on AM4 or a good 690/790 platform, and maybe too good for the rest of the lineup and hard for AMD to price.

I do not see any sign of how or why it would fail; it should perform almost the same as the tests we saw of the 7950X3D limited to its cache CCD, with the max boost clock only 100 MHz or so lower for the high-cache CCD according to AMD's marketing.

The low frequency will make it really cheap to cool, and DDR5-6000 will be cheaper than what you would need on the Intel alternative to beat it, if that is even possible.
 
Like the processor wasn't good enough so we threw cache at the problem to see if it worked. And it did, in some workloads.
....what? In what way do x3d chips feel like inferior silicon?

It's a cpu aimed at a specific workload. No need to overthink it lol. In the same way that throwing more cores on a cpu benefits certain tasks, or Intel's E cores, etc.
 
....what? In what way do x3d chips feel like inferior silicon?

It's a cpu aimed at a specific workload. No need to overthink it lol. In the same way that throwing more cores on a cpu benefits certain tasks, or Intel's E cores, etc.
I look at it like glued-on bullshit, because it limits clock speeds and/or downright lowers them. It limits the power you can hammer through the silicon, and the chips die if you try to overclock them, as der8auer discovered. It's a gimped processor with memory, literally, glued on top of the chiplet. It has had heat issues going back to the 5800X3D. 3D stacking tech just isn't there yet. Maybe if they ever figure out nano heat pipes or whatever exotic silicon-based solution they can come up with, it will be more feasible.

You can have your own opinion, I can have mine. We can both be right.
 
I'm not suggesting anything is being hidden. I'm just saying the cache really gives a boost to 1080P gaming, diminishing returns past that.
Wouldn't that statement be true of any CPU performance gain? The more time you give the CPU in between frames, the less obvious the gain will be?

And this is more a low-vs-high-FPS statement than a resolution statement, right? Because at high FPS at higher resolutions, the gains are still massive.

7950X3D vs 7950X, on TechPowerUp's series of titles (with a strong enough GPU like the 4090, the 1080p and 1440p gains are quite similar):

720p: +16%
1080p: +14%
1440p: +12%
2160p: +4.2%

For the same games and tests, 7700X vs 5800X:

720p: +21.4%
1080p: +20%
1440p: +17.8%
2160p: +9.2%

CPU-heavy titles seem to be getting popular; Hogwarts Legacy, for example, shows impressive gains going from a 3950X to a 5950X even below 90 fps. With upscalers, and with the collection of older titles being so large, we could start to see high frame rates at 4K more and more.
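A toy way to picture why the gap shrinks as resolution goes up (made-up numbers, not benchmark data): frame time is roughly set by whichever of the CPU or GPU takes longer per frame, so a faster CPU only shows through while the GPU isn't the long pole.

[CODE]
# Toy bottleneck model: frame time ~ max(CPU ms per frame, GPU ms per frame).
# All numbers below are invented purely to illustrate the shape of the effect.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_fast, cpu_slow = 5.0, 6.0  # hypothetical faster vs slower CPU frame times
for res, gpu_ms in [("1080p", 4.0), ("1440p", 5.5), ("2160p", 12.0)]:
    gain = fps(cpu_fast, gpu_ms) / fps(cpu_slow, gpu_ms) - 1
    print(f"{res}: faster CPU is worth {gain:+.0%}")
[/CODE]

With those made-up numbers you get roughly +20% at 1080p, +9% at 1440p and +0% at 2160p, which is the same shape as the real results above: the CPU advantage fades as the GPU becomes the limit.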
 
I look at it like glued-on bullshit, because it limits clock speeds and/or downright lowers them. 3D stacking tech just isn't there yet.
Idk. It's just odd. Maybe I'm just looking at the wording odd. Fair enough.
 
7950X3D vs 7950X on TechPowerUp's series of titles (with a strong enough GPU like the 4090): +16% at 720p, +14% at 1080p, +12% at 1440p, +4.2% at 2160p.
So, you are pairing a 700+ dollar CPU with a 1500-2000 dollar graphics card and at least a 500-700 dollar motherboard probably on 7400Mhz RAM, and a 4TB PCIe 4.0 drive too. Jesus... I'm guessing the results aren't universal if you're not running the most expensive graphics card on the planet. Just saying. I would be more interested if they had a set of benchmarks of this vs a 13900K overclocked to the gills, with the 7900 XTX (which I own), to see what the breakdown actually is.

You can throw FPS at anyone and it doesn't mean a damn thing to them. I couldn't give a shit about either processor.

You're drinking the Kool-Aid; good for you.
 
500-700 dollar motherboard probably on 7400Mhz RAM
Maybe it is some joke, but the 7800X3D will probably not be that expensive, it will not require an expensive motherboard, and "regular" DDR5-6000 will be enough.

if you're not running the most expensive graphics card on the planet. Just saying.
I mean, I'm not sure why we are talking about a good CPU for running hard-to-run games at native 4K with high enough FPS for the CPU to matter in the first place (I was responding to someone who talked about native 4K gaming), but if we do, it will be with powerful GPUs.
 
No one is seriously playing competitive games at 4K. 1440p is currently the most popular resolution for the combination of refresh rate, FPS, cost, and sharpness. I don't have a 4090, but I have serious doubts it would maintain 144+ FPS at 4K.

The 7800X3D having an estimated 12-15% bump in performance over a 7700X seems like an absolute winner to me, but I'll probably buy a 7700X from Microcenter because cost has a higher weight factor for me. I am sitting on a 3900X and I cannot believe how fast it's fallen behind; I literally went from a 6900 XT to a 6600 XT and my FPS at 1440p didn't change, that's how badly it's holding me back.
 
Maybe it is some joke, but the 7800X3D will probably not be that expensive, it will not require an expensive motherboard, and "regular" DDR5-6000 will be enough.

I mean, I'm not sure why we are talking about a good CPU for running hard-to-run games at native 4K with high enough FPS for the CPU to matter in the first place (I was responding to someone who talked about native 4K gaming), but if we do, it will be with powerful GPUs.
The AM5 platform is pretty expensive to get into right now. By the time AM5 hits its last-generation CPU, the boards should have dropped to normal levels of pricing, and PCIe 5.0 will be mainstream by then. The 6900 XT can push 40 FPS at max settings at 4K; the 7900 XTX can do 70, depending on the title. My 13900K has been delivering incredible gains at 4K over every other processor I have owned. It's been a wonderful experience in everything I have used it with.

These X3D parts are a waste of money if all you're doing is playing games at 1080P. Almost everything on the planet can deliver enough FPS to max out most monitor refresh rates at 1080P. If it pushes 100 more FPS when you're already getting 500 FPS... pointless.

No one is seriously playing competitive games at 4K.
Speak for yourself. I play everything at 4K. There are plenty of people that run high refresh on 4K. I've been on 4K for over a decade.
 
I'll probably buy a 7700X from Microcenter because cost has a higher weight factor for me. I am sitting on a 3900X and I cannot believe how fast it's fallen behind.
However, you do you. If cost is an issue, find the best deal you can. Performance on 3000-series AMD was 25-40% lower than 5000-series. Just drop a 5800X3D (or a cheapo 5700X) in your system and screw upgrading to the 7700X if cost is actually holding you back.

In your use case, the 5800X3D makes the most sense, because it will (most likely) be faster than the 7700X in the games you play.

It's funny, I don't care for the tech, but I gotta call it like it is sometimes.
 
I don't have a 4090, but I have serious doubts it would maintain 144+ FPS at 4K.
Depends on the game... I easily maintain 144 FPS on my 4090 with a 5950X CPU in BF2042 & COD: MW2, both set to max graphics. Other games perform just as well, and others hover in the 100s (still GPU limited). It's very game dependent, but even a last-gen CPU can easily do it, and not even the X3D version at that.
 
The 5000-series AMD processors are robust and capable at 4K. I have two in systems at home, and they are both wonderful processors that offer excellent bang for the buck.
 
Plenty of people play games at 4K. I think you'll find that things change a bit when you throw the word "competitive" in there, though. If you're talking about competing for money/fame/etc., they're usually incentivized to play at lower resolutions and details. If you just mean playing a game where you're competing against other people, you'll see people using anything from a toaster to an OC'd 4090 running at 8K.
 
Some weird arguments going on here.

Clock speed doesn't matter, nor does the amount of cache. Bottom line, system throughput/speed is all that matters. Sometimes the way to the best speed overall is to max the clocks; sometimes it's to max the cache. It's going to depend on what is running, how memory-intensive it is, and how cache-friendly the memory access patterns are.

Obviously one would like to have the best of both. The current state of the art doesn't permit it, at least not at an acceptable price point. I want a family car that goes 0-60 in 2.5 seconds and gets 50 MPG; too bad for me.

At least AMD gives us a choice based on use-case needs. Always need max cores and clocks? 7950X. Mixed workload that sometimes needs clocks and sometimes cache? 7950X3D. Cache-heavy workload? 7800X3D. Clock-heavy workload that doesn't need all the cores? 7800X. The only one we're missing is a 7950X3D++ with extra cache on both CCDs, and I don't think that would be a common use case anyway.
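If anyone wants to see the "cache-friendly access patterns" part in action, here is a quick and admittedly unscientific sketch; the absolute timings depend on the machine, but summing the same data in a random order is noticeably slower than walking it sequentially, largely because of cache misses.

[CODE]
# Unscientific demo: same work, different memory access order.
# Timings vary by machine; the gap is mostly cache (and prefetcher) behaviour.
import random, time

N = 1 << 22                     # ~4 million elements; won't fit in L3 on most CPUs
data = list(range(N))
order_seq = list(range(N))
order_rand = order_seq.copy()
random.shuffle(order_rand)

def checksum(order):
    total = 0
    for i in order:
        total += data[i]
    return total

for label, order in (("sequential", order_seq), ("random", order_rand)):
    start = time.perf_counter()
    checksum(order)
    print(f"{label}: {time.perf_counter() - start:.2f} s")
[/CODE]

Same work, same data, same core count; the only difference is how friendly the access pattern is to the cache hierarchy, which is exactly the sort of difference the extra V-Cache is there to hide.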
 
The 7800X3D may match or outperform the 7900X3D in games that benefit from the cache. We will have to wait and see.
 