# 7800X3D will be an utter failure of a CPU

Don't your words and picture say two different things? If both are correct, then Power Delivered = Power Dissipated; divide both sides by "power" and you arrive at Delivered = Dissipated. That assumes we're not converting large portions of the power into some other energy state, like radio waves or other emitted radiation. I guess my comment should have pointed to CPUs, as probably north of 95% of the power draw ends up as waste heat torching your room.

Power delivered:

P = I * V

Power dissipated (in the resistors):

P = I^2 * R
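To make the objection above concrete, here's a small sketch with made-up values showing that for an ohmic load the two formulas agree once you substitute Ohm's law:

```python
# Illustrative sketch (values are made up): for a purely resistive load,
# power delivered (P = I*V) equals power dissipated (P = I^2 * R),
# because Ohm's law gives V = I*R.
R = 10.0   # resistance in ohms
I = 2.0    # current in amps
V = I * R  # voltage across the resistor, by Ohm's law

power_delivered = I * V        # P = I*V
power_dissipated = I**2 * R    # P = I^2 * R

print(power_delivered, power_dissipated)  # both come out to 40.0 W
```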

What I meant is that I am more concerned about heat than power consumption. As a human with a PC under my desk, my balls get sweaty if there is too much heat; for example, the 3090 Ti used to put out too much heat, whereas the 4090, even in mid-summer, is not pushing out that much heat.

The 3090 Ti used to run at 75°C. This card runs at 65°C. 10°C less makes my balls happy.

Hope this is clear. Also, to my understanding, an AC would be more effective at cooling a card operating at 65°C vs. 75°C.

Same applies to CPUs.

What I meant is that I am more concerned about heat than power consumption. As a human with a PC under my desk, my balls get sweaty if there is too much heat; for example, the 3090 Ti used to put out too much heat, whereas the 4090, even in mid-summer, is not pushing out that much heat.
I think that's a very common misconception: a 400W CPU kept at 0°C by very efficient cooling will heat your room exactly as much as a 400W CPU at 105°C. In both cases, 400W of heat is being dissipated.

For current computer chips, almost all power gets converted to heat; very little is "eaten" by the processing itself.

Heat and temperature are a bit different, even if they're linked. For example, putting your hand in a 100°C oven will hurt far less than putting it in 100°C boiling water. Both expose you to the same temperature, but the water transfers much more heat because it is a far better conductor of heat.

Your 4090 does not run milder just because it has better cooling; it is probably not using as much power as your 3090 Ti either, often nearly 100 fewer watts. The card is often twice as efficient and can do a lot at 250-350W.

Think about it: a CPU that used so little power it could be passively cooled if you let it rise to 85°C would add almost nothing to the temperature of the room, even while sitting at 85°C. In comparison, someone keeping a 350W 13900K at room temp +1.5°C via incredible water cooling, with three 480mm radiators, high pump pressure, super everything, would be adding 350W plus the pump's heat to the room.
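A minimal sketch of the point being made here (the numbers are hypothetical): the heat a chip dumps into the room depends only on its power draw, not on the die temperature the cooler holds it at.

```python
# Hypothetical numbers: two 400W CPUs at wildly different die temperatures
# still dump the same heat into the room, because at steady state
# essentially all electrical power drawn becomes heat.
def heat_into_room_joules(power_watts: float, seconds: float) -> float:
    """Energy dissipated into the room over a time interval."""
    return power_watts * seconds

hot_chip = {"die_temp_c": 105, "power_w": 400}   # modest air cooler
cold_chip = {"die_temp_c": 0, "power_w": 400}    # heroic cooling, same draw

hour = 3600
print(heat_into_room_joules(hot_chip["power_w"], hour))   # 1440000 J
print(heat_into_room_joules(cold_chip["power_w"], hour))  # 1440000 J, identical
```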

And you completely missed the point of my post.

And you completely missed the point of my post.
Well... idk. It's just a bit odd to reference the temp at which the card is operating. The chip running 10°C cooler doesn't matter by itself. Now if it's also using 100 fewer watts, oh, for sure that matters.

With pc parts it's not complicated, every watt pulled from the wall is dumped as heat into your room. A PC is an expensive electric heater that can do fancy things while heating the room.

The 4090 is so much more efficient that I'm sure it's drawing way less watts than your 3090 ti on average. The temperature of the card itself does not matter for you or the room at all.

When we do calculations for heat generation in a server room, we look at the wattage being used, never the "temps" the parts run at. Between a chip running at 90°C and drawing 15W and a chip running at 50°C and drawing 100W, the 15W chip will always make less heat. Temperature measured at an object says how "hot" something is in that spot, but it doesn't mean it's contributing a lot of heat energy to the environment.
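As a sketch of that server-room style calculation (the function name is my own), heat load is estimated straight from wattage using the standard conversion of roughly 3.412 BTU/hr per watt:

```python
# Heat-load estimate from wattage alone, as done for server rooms.
# 1 W of continuous draw ~= 3.412 BTU/hr (standard conversion factor).
WATTS_TO_BTU_PER_HR = 3.412

def heat_load_btu_per_hr(total_watts: float) -> float:
    return total_watts * WATTS_TO_BTU_PER_HR

# The two chips from the post: 15W at 90C vs. 100W at 50C.
print(round(heat_load_btu_per_hr(15), 1))   # ~51.2 BTU/hr
print(round(heat_load_btu_per_hr(100), 1))  # ~341.2 BTU/hr: far more heat
```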

In the case of my 7800X3D and 4090 combo, my current total power draw at the wall averages 475W when gaming. My old 3090 would run 400W on its own, as indicated in GPU-Z. So upgrading has definitely cut the amount of heat generated. I don't know how the guys running a 13900KS and 4090 deal with that kind of heat, that's easily 650W in gaming.

I don't know how the guys running a 13900KS and 4090 deal with that kind of heat, that's easily 650W in gaming.

With pc parts it's not complicated, every watt pulled from the wall is dumped as heat into your room. A PC is an expensive electric heater that can do fancy things while heating the room.

The 4090 is so much more efficient that I'm sure it's drawing way less watts than your 3090 ti on average. The temperature of the card itself does not matter for you or the room at all.

That's exactly it. I noticed that there's less heat coming from my PC with the 4090 vs. the 3080 Ti that I had before it. Both GPUs ran at around 65°C under load, so the actual temperature of the GPU had nothing to do with it. It's the fact that the 4090 tends to pull around 350-360 watts under a typical gaming load, while my 3080 Ti was pulling around 450 watts.

Do any of you still rip CDs occasionally? I'll get a new CD every so often, or a lot if I get on a kick, and rip them to FLAC. I've been doing this for years with EAC, dBpowerAMP and AccurateRip and never had a problem...

...until I built this new system last month...

Now any program that uses AccurateRip to detect and set the correct offset for the CD/DVD reader and software absolutely will not detect correctly. This is using the same 2 drives I've had for years that are in the AccurateRip database and normally set up on the first "key" disc. Without the AccurateRip feature, my FLACs are coming out with horrible pops and digital noise.

Well, anyway, I spent days and days trying to figure out what was wrong. I finally gave up and bought an external enclosure that converts SATA to USB 3.1. Figured I'd bypass the SATA as much as possible. Plugged it in, put my key disc in, and... AccurateRip detected it and made the offset adjustments immediately. WTF? Works perfectly via USB. What is causing the direct SATA connection to fail but the USB connection to work fine? I'm guessing it's got to be a BIOS thing. I'm running the SATA ports in AHCI mode. There doesn't seem to be a whole lot of options for them; it's old tech by this point. I wouldn't think that AMD or MSI would have suddenly screwed them up, but there's got to be an explanation.

Anyone?

Interesting note on Street Fighter 6: the 7800X3D doesn't do precompiled shaders; the option wasn't given on startup. On my 5800X3D system, I have that step, and it took 10 minutes. As nice as the 7800X3D is, I have noticed the shader-load issue happen in game. In some instances, when a Drive Impact triggered, the game went into horrible slow-mo before the hit connected.

Interesting note on street fighter 6. 7800x3d doesn't do precompiled shaders. Option wasn't given on startup. On my 5800x3d system, I have that step and took 10mins to do. As nice as the 7800x3d is, I have noticed the shader load issue happen in game. In some instances during a drive impact triggering, the game went into horrible slow mo mode before the hit connected.
There's a button to enable it in the options; check that.

Do any of you still rip CDs occasionally? I'll get a new CD every so often, or a lot if I get on a kick, and rip them to FLAC. I've been doing this for years with EAC, dBpowerAMP and AccurateRip and never had a problem...

...until I built this new system last month...

Now any program that uses AccurateRip to detect and set the correct offset for the CD/DVD reader and software absolutely will not detect correctly. This is using the same 2 drives I've had for years that are in the AccurateRip database and normally setup on the first "key" disc. Without the AccurateRip feature, my FLACs are coming out with horrible pops and digital noise.

Well, anyway, I spent days and days trying to figure out what was wrong. I finally gave up and bought an external enclosure that converts the SATA to USB 3.1. Figured I'd bypass the SATA as much as possible. Plugged it in and put my key disc in and.... AccurateRip detected it and made the offset adjustments immediately. WTF? Works perfectly via USB. What is causing the direct SATA connection to fail, but the USB connection to work fine? I'm guessing it's got to be a BIOS thing. I'm running the SATA ports with AHCI mode. There doesn't seem to be a whole lot of options for them. It's old tech by this point in time. I wouldn't think that AMD or MSI would have suddenly screwed them up, but there's got to be an explanation.

Anyone?
Interesting. Unfortunately, my drive is USB and my FLAC rips are coming out perfect with EAC + AccurateRip on my rig. Have you tried other SATA drivers? The only thing I can think of in the BIOS that might help would be enabling Spread Spectrum.

Do any of you still rip CDs occasionally? I'll get a new CD every so often, or a lot if I get on a kick, and rip them to FLAC. I've been doing this for years with EAC, dBpowerAMP and AccurateRip and never had a problem...

...until I built this new system last month...

Now any program that uses AccurateRip to detect and set the correct offset for the CD/DVD reader and software absolutely will not detect correctly. This is using the same 2 drives I've had for years that are in the AccurateRip database and normally setup on the first "key" disc. Without the AccurateRip feature, my FLACs are coming out with horrible pops and digital noise.

Well, anyway, I spent days and days trying to figure out what was wrong. I finally gave up and bought an external enclosure that converts the SATA to USB 3.1. Figured I'd bypass the SATA as much as possible. Plugged it in and put my key disc in and.... AccurateRip detected it and made the offset adjustments immediately. WTF? Works perfectly via USB. What is causing the direct SATA connection to fail, but the USB connection to work fine? I'm guessing it's got to be a BIOS thing. I'm running the SATA ports with AHCI mode. There doesn't seem to be a whole lot of options for them. It's old tech by this point in time. I wouldn't think that AMD or MSI would have suddenly screwed them up, but there's got to be an explanation.

Anyone?
Are your chipset drivers up to date?

Interesting. Unfortunately my drive is USB and my FLAC rips are coming out perfect with EAC+AccurateRip on my rig. Have you tried other SATA drivers? The only thing I can think of in the bios that might help would be enabling Spread Spectrum.
I'm also getting perfect rips since I switched over to a USB external case and put my DVD drive in it. I don't know that there are any other SATA drivers I can use besides the ones Microsoft installs. I wonder if switching to RAID mode and using the AMD driver for that would make any difference?

are your chipset drivers up to date?
Yup, newest available on AMD's website.

If you play music directly from the CD over SATA, are there any audio issues? If playing the CD is fine, you can rule SATA out as the problem. Also, what are your VSOC and VDDIO set at (if you tweaked these yourself)?

I'm an EE, and heat dissipated depends on the resistance of the circuitry and the current delivered. It's not the same thing.

Power delivered is simply P = I * V. Power DISSIPATED (as heat) is P = I^2 * R.

View attachment 577702
For DC circuits:
Like your image shows, P = I^2 * R is just P = I * V with V = I * R plugged in. You can calculate it either way, P = I * V or P = I^2 * R, and you'll get the same result. Also, P = I^2 * R only applies to ohmic devices; for example, you can't use it to calculate power through diodes, BJTs, or FETs. In those cases you go back to P = I * V. P = I * V will give you power; whether it's thermal or electrical power depends on the context of what you are solving for.

For integrated circuits, electrical power in = thermal power out is usually a pretty safe assumption. If it's an LED, you should also account for the power leaving as emitted light.
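A quick sketch of the distinction above (component values are illustrative, not from any datasheet): P = I^2 * R only holds for ohmic devices, while P = I * V always works, so for something like a diode you fall back to P = I * V.

```python
def power_ohmic(i: float, r: float) -> float:
    """P = I^2 * R -- valid only when the device obeys Ohm's law (V = I*R)."""
    return i**2 * r

def power_general(i: float, v: float) -> float:
    """P = I * V -- always valid, for ohmic and non-ohmic devices alike."""
    return i * v

# Resistor: both forms agree (2 A through 10 ohms, so V = 20 V).
assert power_ohmic(2.0, 10.0) == power_general(2.0, 2.0 * 10.0)

# Diode at ~0.7 V forward drop and 20 mA: there's no meaningful R to plug
# into I^2 * R, so use P = I * V with the measured forward voltage.
print(round(power_general(0.020, 0.7), 6))  # 0.014 W, i.e. 14 mW
```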

For integrated circuits Electrical Power in = Thermal power out is usually a pretty safe assumption.
I figured as much, but I build jets, not processors.

And you completely missed the point of my post.
I think your ballz did

WHAT!!!!

That's just brutally awesome to be quite honest, and only fortifies my main point. Poor 7800X3D. Just not gonna win any awards compared to more versatile and powerful processors.
This aged like milk. Especially when the 9 series dropped so far.

Looks like not many 7800X3D users are going to be changing their CPU anytime soon. I may reconsider if the 9800X3D doesn't suck, but we'll see if they're worth it soon enough. For now the 7800X3D is still the P/W champ.

I mean, it's a CPU. For the folks gaming at 4K (which is probably a lot considering this is [H]) it's mostly just to keep your low end FPS from being too low. There isn't much motivation to get a new one even if you have a 5800X3D.

This aged like milk. Especially when the 9 series dropped so far…
Was there a need to revive this thread though?

[insert something about Linux]

Just for closure.

Some toasters run Linux.

Was there a need to revive this thread though?
I thought it was relevant today more than ever. Still a current best choice for gaming.

Oh man, this is nowhere near the necro level I've seen here… This was a weak necro. And one I thought was relevant today more than ever.
True that. I'm glad that my choice to go with a 7800X3D 2 months ago was not the wrong decision.

I had actually grabbed a 7900X3D for $320, but when I had the opportunity to get a 7800X3D for the same price I returned the 7900X3D. I only use my system for gaming and content consumption, and I do enjoy my indy sim games, which tend to need single-core performance.

Oh man, this is no where near the necro level I’ve seen here…. This was a weak necro. And one I thought was relevant today more than ever.
It's not the age of the corpse, but the fact that the subject matter didn't warrant it being dug up.

But, you know, maybe it was "funny" (?)

Hell yea brother

I bought a 7800X3D 2 weeks ago...so yes the thread is still relevant
There are already other current threads discussing the 9700X and 7800X3D, though. Pretty much the only point of resurrecting this thread is to point out how wildly off the mark the OP was.

The 7800X3D is still the best gaming CPU, even with new CPUs launched by both Intel and AMD, so I think we should keep reviving this awful take until that's no longer the case.

There are already other current threads discussing the 9700X and 7800X3D though. Pretty much the only point of resurrecting this thread is to point out how wildly off mark the OP was.

Off base like an MLID article (waits for the week ban from the mods here for criticizing MLID).

The 7800x3d is still the best gaming cpu, even with new cpu's launched by both Intel and AMD, so I think we should keep reviving this awful take until that's no longer the case .
If Intel doesn't surpass the 7800X3D, the 9800X3D is guaranteed to. We already know that, at minimum, Zen 5 is not a step back from Zen 4. The argument at that point will be which provides better value, which will indisputably be the 7800X3D until it is no longer available.

Also, arguably the 7950X3D is the best current gaming CPU. Those extra cores can help out if there are a lot of simultaneous simulations, e.g., Cities: Skylines 2. It is almost never worse than a 7800X3D in performance; it just requires some (potentially annoying) tweaks in certain games.

If Intel doesn't surpass the 7800X3D, the 9800X3D is guaranteed to. We already know that at minimum Zen 5 is not a step back from Zen 4. The argument at that point will be which provides better value, which will undisputedly be the 7800X3D until it is no longer available.

Also, arguably the 7950X3D is the best current gaming CPU. Those extra cores can help out if there are a lot of simultaneous simulations, i.e. Cities: Skylines 2. It is almost never worse than a 7800X3D in performance, just requires some (potentially annoying) tweaks in certain games.

Not true, actually. There are a few games where the 7700X is in fact FASTER than the 9700X, although maybe that just comes down to Windows being broken or needing a BIOS update or something. I would hope that by the time the 9800X3D launches there will not be a single game where the 7800X3D is faster; otherwise that would be incredibly sad.

Not true actually. There are a few games where the 7700X is in fact FASTER than the 9700X, although maybe that just comes down to Windows being broken or needing a BIOS update something. I would hope by the time the 9800X3D launches, there will not be a single game where the 7800X3D is faster otherwise that would be incredibly sad.

View attachment 677926
The FPS Review has the same game performing marginally better on the 9700X. So who knows; that little decrease might just have been margin of error.

The FPS Review has the same game as performing marginally better on the 9700X. So who knows, that little decrease might just have been margin of error.

Could be, but there are other instances where it may not be. A 10fps difference in the 1% lows here seems outside the margin of error, if just barely. But based on HUB's latest video, it just seems like there's something iffy about Windows and Zen 5; I would imagine that under the correct scenario Zen 5 would be equal to Zen 4 and not worse.

Let's get back on topic (about the utter failure of the 7800X3D).

Let's get back on topic (about the utter failure of the 7800X3D).

OP should just delete this thread and pretend it never happened