7800X3D will be an utter failure of a CPU

Some weird arguments going on here.

Clock speed doesn't matter in isolation, nor does the amount of cache. Bottom line: system throughput is all that matters. Sometimes the way to the best overall speed is to max the clocks; sometimes it's to max the cache. It depends on what is running, how memory-intensive it is, and how cache-friendly the memory access patterns are.

Obviously one would like to have the best of both. The current state of the art doesn't permit both, at least not at an acceptable price point. I want to have a family car that goes 0-60 in 2.5 seconds and gets 50 MPG, too bad for me.

At least AMD gives us a choice based on use case needs. Always need max cores and clocks? 7950X. Mixed workload that sometimes needs clocks and sometimes cache? 7950X3D. Cache-heavy workload? 7800X3D. Clock-heavy workload that doesn't need all the cores? 7800X. The only one we're missing is the 7950X3D++ with extra cache on both CCDs, and I don't think that would be a common use case anyway.
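A toy sketch of what "cache-friendly memory access patterns" means in practice: traversing a 2D array in the order it sits in memory versus jumping across rows. Pure Python mutes the effect with interpreter overhead, so treat the timing as illustrative only, but the principle is the same one that makes big caches pay off on some workloads and not others.

```python
import time

N = 1000
matrix = [[1] * N for _ in range(N)]  # row-major: each row is contiguous

def sum_row_major(m):
    # Visits elements in the order they sit in memory: cache-friendly.
    total = 0
    for row in m:
        for x in row:
            total += x
    return total

def sum_col_major(m):
    # Jumps a full row-stride between accesses: cache-unfriendly.
    total = 0
    for j in range(len(m[0])):
        for i in range(len(m)):
            total += m[i][j]
    return total

for fn in (sum_row_major, sum_col_major):
    t0 = time.perf_counter()
    result = fn(matrix)
    print(f"{fn.__name__}: sum={result}, {time.perf_counter() - t0:.4f}s")
```

Both functions compute the same answer; only the access order differs, which is exactly the kind of workload-dependent detail that decides whether extra cache or extra clocks wins.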
Yeah, I thought when we saw a 2.2 GHz Core 2 CPU beating a 3.9 GHz Pentium 4 in single-threaded performance 15 years ago that we would have learned clock speed doesn't matter as much.
 
Yeah, I thought when we saw a 2.2 GHz Core 2 CPU beating a 3.9 GHz Pentium 4 in single-threaded performance 15 years ago that we would have learned clock speed doesn't matter as much.
Exactly.

It reminds me a little of DSLR cameras and megapixels: for a long time people just wanted the most megapixels, but so much more goes into a camera's image quality than its MP count.
 
This thread has a troll title, but it got me to post, so I guess it worked.

I think it's fairly obvious that for below-4K gaming this will sell like all get-out. I DO game at 4K, and I have a motherboard and RAM ready to go when this thing launches.
 
When is this shit out? I need to decide whether to wait till August or get in on the X3D bandwagon now.
 
I'm sure within a few weeks of release they will be widely available.
Maybe, maybe not. Either way, freedom from the grip of my 8+ year old PC is contingent on acquiring the 7800X3D, so I'll be doing whatever it takes to avoid waiting any longer than I have to.

Besides, it could end up being as scarce as the 7950X3D; the demand certainly seems to be there.
 
Ugh, I am so not a morning person. Four hours early means 6 AM. On a good day I wake up at 8; on a better day I wake up at 9:30.

My plan is to try and get there at least a couple hours before opening while simultaneously trying to order online from the usual suspects as a fail-safe.
 
Maybe, maybe not. Either way, freedom from the grip of my 8+ year old PC is contingent on acquiring the 7800X3D, so I'll be doing whatever it takes to avoid waiting any longer than I have to.

Besides, it could end up being as scarce as the 7950X3D; the demand certainly seems to be there.
My PC is just as old but I’m still waiting on the reviews, I’m in no rush. Who knows what deals might spring up for other CPUs to compete with the 7800x3D’s release?
 
My PC is just as old but I’m still waiting on the reviews, I’m in no rush. Who knows what deals might spring up for other CPUs to compete with the 7800x3D’s release?
That would be a smart way to go, but I have a hard time leaving an itch unscratched, and I've already accumulated all the parts for a new build; the processor is the only missing piece. I have faith that it will be every bit as great as anticipated (by everyone but this thread's OP, anyway).
 
That would be a smart way to go, but I have a hard time leaving an itch unscratched, and I've already accumulated all the parts for a new build; the processor is the only missing piece. I have faith that it will be every bit as great as anticipated (by everyone but this thread's OP, anyway).
I’m still playing relatively older games anyway. I have a 2080 Ti in my 5820K setup, and with DLSS it was enough to play the Diablo IV beta at 4K, so it should last me for a while. I can’t afford to upgrade my whole setup at once anyway, so I’ll be holding onto my 2080 Ti for a while.
 
I’m still playing relatively older games anyway. I have a 2080 Ti in my 5820K setup, and with DLSS it was enough to play the Diablo IV beta at 4K, so it should last me for a while. I can’t afford to upgrade my whole setup at once anyway, so I’ll be holding onto my 2080 Ti for a while.
Sounds like you’ve got a good bit of use left. I’m on a GTX 970, released a whole four years before your 2080 Ti!
 
I’m still playing relatively older games anyway. I have a 2080 Ti in my 5820K setup, and with DLSS it was enough to play the Diablo IV beta at 4K, so it should last me for a while. I can’t afford to upgrade my whole setup at once anyway, so I’ll be holding onto my 2080 Ti for a while.
Almost anything will run Diablo IV. Its recommended specs are a 4000-series Intel i7 and an Nvidia GTX 970. You probably don't need DLSS...

I wiped an old Dell laptop for my cousin's kid and gave it to him. It only has a 7000-series i5 in it with a GTX 1050 (2GB), and he's running default settings (in Diablo IV) at 1080P nearly flawlessly. That laptop is four years old. You will be fine.
 
Meh. Wait for reviews lmao. The higher-end ones have 2 CCDs, and when the game uses the CCD with the cache, the boost isn't as high. Come on now lmao.
 
Maybe. No one is buying it because there's almost no reason to move off the 5800X3D if you're a 1080P gamer. Honestly, I think the X3D processors are bullshit anyway because they do nothing for higher resolutions. It's a stop gap solution, gluing extra cache on top of a chiplet to increase certain workloads. It causes a scheduling nightmare in dual CCD units and the new implementation is hardly better than the old one.

Now, on the server side... They are gluing the shit out of their Epyc processors and there is a literal fuck ton of X3D cache on them. Those are the ones to watch.

If AMD really wanted to make a gaming processor they would designate one unit in their product stack and bitch it out. Instead they're just slapping cache on one CCD across their current lineup. It's gonna do jack and shit for most people. Why are you even running at 1080P on a 7900X+ anyway? You don't need more than 8 cores for gaming in general. That may change, but it will be a slow change, heavily dependent on how many cores consoles adopt, since consoles drive the gaming market.

We only have ourselves to blame for the 7900/7950X3D. Literally everyone cried about the 5900/5950X not having a 3D variant even though AMD had said multiple times that the issues/complications with it outweighed the performance benefit. Welp, this time around they did it, and thus proved what they said was true.
 
Maybe. No one is buying it because there's almost no reason to move off the 5800X3D if you're a 1080P gamer. Honestly, I think the X3D processors are bullshit anyway because they do nothing for higher resolutions. It's a stop gap solution, gluing extra cache on top of a chiplet to increase certain workloads. It causes a scheduling nightmare in dual CCD units and the new implementation is hardly better than the old one.

Now, on the server side... They are gluing the shit out of their Epyc processors and there is a literal fuck ton of X3D cache on them. Those are the ones to watch.

If AMD really wanted to make a gaming processor they would designate one unit in their product stack and bitch it out. Instead they're just slapping cache on one CCD across their current lineup. It's gonna do jack and shit for most people. Why are you even running at 1080P on a 7900X+ anyway? You don't need more than 8 cores for gaming in general. That may change, but it will be a slow change, heavily dependent on how many cores consoles adopt, since consoles drive the gaming market.

You realize that you are supporting their method with your rant :p?

"No one needs more than 8 cores for gaming in general" is what they are working with. The 7900X3D and 7950X3D are for people who want both the higher core counts and better gaming performance. One chiplet is used for gaming, and both are used when not gaming. They have new drivers to support that too, so most gaming workloads get confined to only the X3D chiplet. So only 6 or 8 cores (7900X3D/7950X3D) are used for gaming.

So, let's say I want to have enough cores for work, and I also use the same machine for gaming. I have an X3D chip for gaming, and a 16-core chip that clocks 8 cores up to 5.7 GHz and 8 cores to 5.1 GHz for work.
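For what it's worth, the "confine gaming workloads to the X3D chiplet" part boils down to CPU affinity. A minimal, Linux-only sketch of the idea AMD's driver automates, assuming purely for illustration that cores 0-7 belong to the V-Cache CCD (the actual core numbering varies by board and BIOS):

```python
import os

def pin_to_first_ccd(pid=0):
    """Restrict a process (pid=0 means the caller) to the first 8 cores.

    Illustrative only: which logical cores map to the V-Cache CCD is
    board/BIOS-dependent, and this call is Linux-only.
    """
    ccd_cores = set(range(min(8, os.cpu_count())))
    os.sched_setaffinity(pid, ccd_cores)
    return os.sched_getaffinity(pid)
```

Running a game under this affinity mask keeps the scheduler from bouncing its threads onto the frequency-optimized CCD, which is the scheduling problem the dual-CCD X3D parts have to solve.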
 
These X3D parts are a waste of money if all you're doing is playing games at 1080P. Almost everything on the planet can deliver enough FPS to max out most monitor refresh rates at 1080P. If it pushes 100 more FPS when you're already getting 500 FPS... pointless.

(y)

But these new "PC experts" and YouTube shills swear up and down that you need X3D, along with the "But I'm a competitive gamer" nonsense.
 
(y)

But these new "PC experts" and YouTube shills swear up and down that you need X3D, along with the "But I'm a competitive gamer" nonsense.

I'm at a loss actually.

They push that so hard and my experience leads me to believe it's BS.

Back in the day, I was completely dominant at 20-25 FPS on a Voodoo 2. And running Unreal tournament at those FPS levels was considered the best you could do.

So recently I went onto Quake Live and artificially limited my FPS to 20 FPS. There was no change in my general scoring.

BUT: I'm far better as a player with an old Kensington ball mouse. Isn't that strange? The only difference is that I can lift the mouse off the pad while the ball is still spinning and return it to the pad to stop it. It's a technique I learned so as not to run off the side of small mouse pads. It's also good for twitch moves.

That being said I don't really buy into the advantage of gaming above 60FPS.
 
Most games these days rely more on the GPU than the CPU, especially at 4K and higher resolutions. The 7800X3D will do just fine.
 
I'm at a loss actually.

They push that so hard and my experience leads me to believe it's BS.

Back in the day, I was completely dominant at 20-25 FPS on a Voodoo 2. And running Unreal tournament at those FPS levels was considered the best you could do.

So recently I went onto Quake Live and artificially limited my FPS to 20 FPS. There was no change in my general scoring.

BUT: I'm far better as a player with an old Kensington ball mouse. Isn't that strange? The only difference is that I can lift the mouse off the pad while the ball is still spinning and return it to the pad to stop it. It's a technique I learned so as not to run off the side of small mouse pads. It's also good for twitch moves.

That being said I don't really buy into the advantage of gaming above 60FPS.
The big deal about the X3D CPUs is in the 0.1% lows, which is what you will really notice: better frame time consistency turns into smoother gameplay. Most if not all high-skill players will easily tell the difference between having occasional major dips in the framerate and only having minor dips.

Saying you score the same in QL at 20fps and with normal settings doesn't mean much unless you are hitting 40+ LG and 50+ RG regularly while being among the top players on the servers. The really impressive aimers will hit closer to 50 than 40 LG in QL, unless it is duel. Bottom fragging on bad gear and good gear is easy to achieve.
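For anyone unfamiliar with the metric, here is a sketch of how "0.1% lows" are typically derived from a frametime log: take the slowest 0.1% of frames and average them. Exact methodology varies between reviewers, so this is one common approach, not the definitive one.

```python
def percent_low_fps(frametimes_ms, percent=0.1):
    """Average FPS over the slowest `percent`% of frames in a run."""
    worst = sorted(frametimes_ms, reverse=True)     # slowest frames first
    n = max(1, int(len(worst) * percent / 100))     # how many frames to average
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Example: a mostly-smooth ~144 FPS run (6.9 ms frames) with three 50 ms hitches.
frames = [6.9] * 997 + [50.0] * 3
print(percent_low_fps(frames, 0.1))   # → 20.0
```

The average FPS of that run is still ~142, but the 0.1% low collapses to 20 FPS, which is why the metric captures the stutter you actually feel.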
 
^ yes.

Keep your Q6600 / RTX 3080 build and enjoy it. The X3D chips aren’t for you, but some of us enjoy high FPS, high Hz, low response time monitors when playing online shooters.
 
The big deal about the X3D CPUs is in the 0.1% lows, which is what you will really notice: better frame time consistency turns into smoother gameplay. Most if not all high-skill players will easily tell the difference between having occasional major dips in the framerate and only having minor dips.

Saying you score the same in QL at 20fps and with normal settings doesn't mean much unless you are hitting 40+ LG and 50+ RG regularly while being among the top players on the servers. The really impressive aimers will hit closer to 50 than 40 LG in QL, unless it is duel. Bottom fragging on bad gear and good gear is easy to achieve.

And yet.... I'm still dominant at 20FPS.

What I'm saying to you is that I'm an exceptional player and see no difference.
 
And yet.... I'm still dominant at 20FPS.

What I'm saying to you is that I'm an exceptional player and see no difference.
Why are you in this thread? Have you even used an X3D system? You’re obviously not the target market because you’re satisfied with the performance you get in a game released in 2010 that is a remake/upgrade of a game from 1999 (Q3A). If 20 FPS is gold for you then you’d have equal disdain for modern GPUs. Huzzah.
 
Why are you in this thread? Have you even used an X3D system? You’re obviously not the target market because you’re satisfied with the performance you get in a game released in 2010 that is a remake/upgrade of a game from 1999 (Q3A). If 20 FPS is gold for you then you’d have equal disdain for modern GPUs. Huzzah.

It's simple: I'm in this thread because I usually enjoy conversation. I'm bringing an alternate perspective to a conversation about an X3D part. I'm voicing an unpopular opinion: today's frame rate levels are not relevant to gameplay success.

Also, I'm currently responding to a person too emotionally invested in the subject to understand the basis of the criticism.
 
It's simple: I'm in this thread because I usually enjoy conversation. I'm bringing an alternate perspective to a conversation about an X3D part. I'm voicing an unpopular opinion: today's frame rate levels are not relevant to gameplay success.

Also, I'm currently responding to a person too emotionally invested in the subject to understand the basis of the criticism.
It’s just experience. The same you have in your area (although I played Q3A back before launch in the beta while at a dot com). I have in modern computers and games.
 
Excuse me, but this would be solved a little easier if you two went head to head in said video game.
Go ahead. Don't pretend like that's impossible to do!!

You two play a two-out-of-three round 10-kill death match in QL or Q3A or another classic game of raw speed, skill, and domination, and let the winner be known.
I'll certainly know what it means if one of you declines this match. Go for it. Seriously. Do it. I D-double DARE you both to compete properly using a 30 FPS max system for Spirit_Retro and his roly-poly mouse ball, and Sk3tch, using his ultra-fast, low-latency system, of course only using newer parts, and none of the old classic stuff.

I'll state it clearly: If you do not compete, you are assumed to be wrong. And a coward.
And if you both compete, then you both earn respect, but the winner shall be considered right.
 
Excuse me, but this would be solved a little easier if you two went head to head in said video game.
Go ahead. Don't pretend like that's impossible to do!!

You two play a two-out-of-three round 10-kill death match in QL or Q3A or another classic game of raw speed, skill, and domination, and let the winner be known.
I'll certainly know what it means if one of you declines this match. Go for it. Seriously. Do it. I D-double DARE you both to compete properly using a 30 FPS max system for Spirit_Retro and his roly-poly mouse ball, and Sk3tch, using his ultra-fast, low-latency system, of course only using newer parts, and none of the old classic stuff.

I'll state it clearly: If you do not compete, you are assumed to be wrong. And a coward.
And if you both compete, then you both earn respect, but the winner shall be considered right.
Happy to, but that doesn’t get to the core issue. X3D did not exist before 2021. Modern games have different demands than those that are a decade+ old.
 
Happy to, but that doesn’t get to the core issue. X3D did not exist before 2021. Modern games have different demands than those that are a decade+ old.
Well, you might be right, but it would still be cool to see it happen. =p
 
We only have ourselves to blame for the 7900/7950X3D. Literally everyone cried about the 5900/5950X not having a 3D variant even though AMD had said multiple times that the issues/complications with it outweighed the performance benefit. Welp, this time around they did it, and thus proved what they said was true.
How does this explain the 7773X? That has 64 cores with 3D V-Cache and was last-gen tech that worked. They’re doing 3D V-Cache again on their Genoa CPUs as well. I just think they wanted to segment the consumer CPUs better. A 7950X3D with V-Cache on all CCDs put into a server board would be a pretty good alternative to server CPUs for applications needing high clocks and the V-Cache. It wouldn’t be 64 or 96 cores, but it’s also a lot cheaper.

I can understand the weirdness they pulled on something like the 7900x, but doing it on the 7950x just seems like unnecessary cost-cutting/market segmentation.
 
How does this explain the 7773X? That has 64 cores with 3D V-Cache and was last-gen tech that worked. They’re doing 3D V-Cache again on their Genoa CPUs as well. I just think they wanted to segment the consumer CPUs better. A 7950X3D with V-Cache on all CCDs put into a server board would be a pretty good alternative to server CPUs for applications needing high clocks and the V-Cache. It wouldn’t be 64 or 96 cores, but it’s also a lot cheaper.

I can understand the weirdness they pulled on something like the 7900x, but doing it on the 7950x just seems like unnecessary cost-cutting/market segmentation.
It works on Genoa because the clocks are low enough that the negative impact of the V-Cache is less noticeable. The reason they did the split V-Cache on the 79xx chips is that lowering the clocks on both dies would put the 7950X3D below the 7900X in non-gaming workloads where the V-Cache isn't used. Instead you get the benefit of V-Cache while only losing 2-3% performance in non-V-Cache workloads. It was never about cost savings and more about having to play the BS benchmark number game that the media and Intel would have jumped all over.
 
Wow, I finally read their gaming review. That means there is no point in considering the new gen for now. Also, this is the first fucking website whose numbers match my system: every single 1440P and 4K bench they ran exactly matches my numbers for the 5800X3D. Amazing!

I don't know wtf other idiots are smoking.
 
The arguments of 20 vs. 60+ FPS are silly. Put the same player in with better gear and his potential (not skill, but potential to do well) rises, like a pro chef with high-end cookware and knives vs. a butter knife.
 