Most 9800X3D reviews lacked 1440p and 4K gaming benchmarks, but I found some

So here are my scores. All information is provided. A pretty meh upgrade for me for the stuff I do on my PC. It's also generally running much hotter than my 7800X3D, but that might be fixable once I dial in the voltages. Right now it's running at 5.415 Gs. Thanks to sk3tch for grabbing this for me. Helluva guy!

System specs
DDR5 6000 CL30
Windows 11 24H2
4K 240 Hz DLSS Quality when available, HDR on
Nvidia drivers 566.14

| Benchmark | 7800X3D (PBO -15, Buildzoid RAM timings) | 9800X3D (PBO -15/+200, Buildzoid RAM timings) | Difference |
| --- | --- | --- | --- |
| Cinebench R20 | 6958 | 8997 | 29% |
| Cinebench R23 | 17937 | 23165 | 29% |
| Cinebench R24 | 1099 | 1370 | 25% |
| CPU-Z Bench SC | 692 | 836 | 21% |
| CPU-Z Bench MC | 7425 | 9021 | 21% |
| Assassin's Creed Mirage | 158 | 158 | 0% |
| Shadow of the Tomb Raider | 180 | 180 | 0% |
| Far Cry 6 (no FSR) | 122 | 121 | -1% |
| Ghost Recon Breakpoint (no upscaling, DX11) | 115 | 116 | 1% |
| Cyberpunk 2077 (DLSS FG on) | 50.5 | 50.1 | -1% |
| Forza Horizon 5 (FG on) | 179 | 180 | 1% |
| Horizon Zero Dawn Remastered | 168 | 172 | 2% |
| Hogwarts Legacy (Hogsmeade area walkthrough) | 108 | 115 | 6% |
| Valley Benchmark (MSAA 8X) | 112.3 | 112.8 | 0% |
| Superposition (4K Optimized) | 33003 | 33159 | 0% |
| Red Dead Redemption | 115 | 115 | 0% |
| Heaven Benchmark (MSAA 8X) | 113.4 | 115.5 | 2% |
| Metro Exodus Enhanced Edition (Extreme preset) | 111.09 | 111.54 | 0% |
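The Difference column above is just the 9800X3D score relative to the 7800X3D score; a quick sketch with a few sample rows copied from the table:

```python
# Reproduce the "Difference" column for a few rows of the table above.
# Values are copied from the table; percentages are rounded to whole numbers
# as in the original post.
rows = {
    "Cinebench R23": (17937, 23165),
    "Hogwarts Legacy": (108, 115),
    "Far Cry 6": (122, 121),
}

def pct_diff(old, new):
    """Percent change from the 7800X3D score to the 9800X3D score."""
    return round((new - old) / old * 100)

for name, (x7800, x9800) in rows.items():
    print(f"{name}: {pct_diff(x7800, x9800):+d}%")
```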
 
So as expected, aside from an outlier at 6% difference with Hogwarts Legacy, at 4K it's basically the same performance between the two chips for real world gaming.

Obviously the synthetics are going to look different.

Thanks for compiling!
 
Of course, but people wanted 4K benches and to know more about the CPU. I remember upgrading from the 5800X3D to the 7800X3D and feeling the same, with the hope that the 5090 would bring some gains later. But that never happened. So here's to hoping one more time.

Since I posted, I dialed down the voltages and removed the +200 overclock, as it was causing my fans to go apeshit. I prefer a silent PC and am OK with 5.215 Gs with a -20 offset. I might apply Tech Yes City's settings to keep the CPU under 75 C and hit 5.3 Gs; that's probably the best outcome. At 5.415 unchecked, it was running up to 96 C in synthetics like Cinebench lol.

My board wants to overvolt the crap out of the CPU. The good thing is I tried manual volts that kept the processor at 4.9 GHz (same as the 7800X3D), and unlike my old proc, which ran around 70 C at that speed, this one runs at 62 C, so there is some efficiency gain somewhere. For gaming at 4K none of this matters. Use PBO and enjoy the fastest gaming processor and GPU for now.

For people with 7800X3D and 4K setups, just skip this gen and wait for next one.
 
Good information of real world testing and findings about the 9800X3D, thanks for sharing.
 

Just about every game on your list supports DLSS. This is just me, but I want to get as much fps as possible (I have a 240 Hz monitor), so I always use DLSS Performance mode since it looks barely any worse than native 4K these days. In that scenario you would probably find yourself far more likely to be CPU limited and getting more value out of the 9800X3D. If I'm still CPU limited even on the 9800X3D, then I simply increase the DLSS setting until I no longer am, but now I am pushing as much fps as both my GPU and CPU will allow.
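The reason the DLSS mode shifts the bottleneck is the internal render resolution. A small sketch using the commonly cited per-axis scale factors for the DLSS presets (roughly 0.667 for Quality, 0.58 for Balanced, 0.5 for Performance; treat these as approximations):

```python
# Internal render resolutions for DLSS at a 4K output.
# Per-axis scale factors are the commonly cited preset values; a lower
# internal resolution shifts the bottleneck from the GPU toward the CPU.
OUTPUT = (3840, 2160)
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for mode, scale in MODES.items():
    w, h = round(OUTPUT[0] * scale), round(OUTPUT[1] * scale)
    print(f"{mode}: {w}x{h}")
```

So at 4K, Performance mode renders internally at roughly 1080p, which is why it exposes CPU limits much sooner than Quality mode.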
 
I use DLSS Quality. There is normally a world of difference in IQ between Performance and Quality. If I can use DLAA I prefer that over DLSS Quality, but for this bench, since most people run DLSS Quality, that's what I used for the real-world tests.
On another note, I tried -30 PBO and it crashed, so now I am on -25 PBO and +75 MHz with temps capped at 90 C. I saw some spikes to 86 C while loading Dragon Age, but my PC is silent with a quiet fan profile and I am running 5.3 Gs, which is where I wanted to end up. Most gaming is about 70-75 C, very similar to my 7800X3D.

That's it. Time to play some games.
 

Appreciate the 4K reality check after seeing all the 1080p reviews. It eased my feelings of FOMO and my thoughts of "what if I upgraded my 7800X3D" considering how universally praised the 9800X3D is as a gaming CPU. I guess the money will be better spent elsewhere or even just saved altogether.
 
Great benchmarks bud, thanks for posting them. I ended up going with the 9900X for my build: it was cheaper, and I wanted lower shader compilation times, as my 4K fps difference from an X3D should be minimal. But the FOMO with the 9800X3D was big.
Also regarding DLSS IQ: I also use 4K DLAA exclusively, or DLSS Quality at the very least, as I just can't handle the blurriness. It was a major reason I upgraded to 4K res (which helped, but didn't solve the issue). From what I gather, there's a select few (and from Reddit it seems like a minority) who are 'lucky' enough not to notice TAA's/DLSS Balanced/Performance blurriness and ghosting. Anyone who says they can't spot a difference from 4K native to DLSS Performance should check out the dozens of comparisons done here: https://www.reddit.com/r/FuckTAA/comments/oi0v86/taa_on_vs_taa_off_comparisons_sharpness_texture/
 
There IS a difference between DLAA and DLSS Performance; nobody said there isn't. But that gap has massively closed with the release of v3.7. The difference in IQ is hardly worth the huge fps hit in return. Here's Baldur's Gate 3 running DLAA vs DLSS Performance. The hair strands are more detailed with DLAA, yeah sure, but in modern titles this tiny difference in IQ is also the difference between 50 fps and 100+ fps, and to me that is not worth it. And again, you need to be running at least v3.7 or newer; even games using v3.5 will have noticeably worse DLSS Performance IQ.

EDIT: Also included Stalker 2 DLAA vs DLSS Performance. There is an obvious difference if you zoom in enough, but nobody is gonna eskimo kiss their screen while playing. And the huge power reduction with DLSS Performance is also nice.
 

Attachments

  • IMG_1526.jpeg (820.4 KB)
  • IMG_1523.jpeg (833.9 KB)
  • IMG_1532.jpeg (687.4 KB)
  • IMG_1529.jpeg (467 KB)
1. Truthfully, I don't need screenshots because my own personal experience says otherwise. It might be a personal thing, and I do mean the 'harsh' word I'm going to use: I despise blurriness in games. I'd take 'crisp' FXAA over TAA any.single.day.
2. Different games, different monitor panels (and resolutions), and just different people will all produce different results. It's not the black-and-white situation you're trying to make it sound like.
3. If you don't notice a difference, then honestly, good for you; you should just use whatever you like.
4. Regardless, it's quite hard to notice details in a picture taken with a phone. An actual screenshot would be miles better.
5. And lastly, the issue with DLSS is almost never noticeable while standing still; TAA's/DLSS's blurriness 'shines' during motion. The combination of standing still plus a phone picture of your monitor instead of an in-game screenshot suggests to me a lack of understanding of this specific subject, since that combo alone would make differentiating the settings impossible, regardless of whether a difference is there.

Edit: here's a thread just from last month with >100 comments agreeing that DLSS hurts image quality or is as bad as TAA:
https://www.reddit.com/r/FuckTAA/comments/1gidf3d/i_cannot_stand_dlss/
If you don't agree with it, fine, but don't state 'facts' that DLSS Performance is almost equal to DLAA.
 

I am not stating any facts. Just my opinion, we can agree to disagree. You do you, I'll use DLSS all day everyday.
 
So what anti-aliasing do you use, then? Native in modern games with complex visuals looks absolutely awful; it's a shimmering fest. MSAA is gone, and it did not fix all the shimmering to begin with.

Between DLSS/DLAA and upscaling to 8k when playing at 4k to bring things in line, yeah my choice is quickly made.
 
Almost exclusively DLAA. It completely resolved the TAA issue for me. Thankfully I can 'afford' the performance hit with a 4090, as any modern title with FG+DLAA sits at around 90-120 fps, which I'm comfortable with.
DLAA doesn't upscale to 8K btw; it's still native 4K, just with the AI-based anti-aliasing pass on top. Nvidia's figures put DLAA at a 5-10% performance cost versus native 4K.
 
I never said DLAA upscales to 8k, but the alternative to fix aliasing decently does - supersampling, hence the second part of my sentence.

I'd put a lot of money that at 4k, you would never pass a blind test between DLSS Quality and DLAA. Performance? Doable in some games at least. But most people would fail, especially while playing an active video game.

Personally the bigger thing is performance, because I care about input lag and clarity in motion. DLAA never wins there except in light games where the performance headroom is huge, but even then I'd still use DLSS Quality for the power savings. DLSS Performance is for when I need it to keep things running at my refresh rate.
 
You're going to lose a lot of money on that bet then. Not sure why you're so confident about other people's experiences; seems like a bad mindset.
For me DLAA is a night-and-day difference, so much so that there are games I stopped playing before I could enable DLAA. I'm very sensitive to blurriness and ghosting (and there are more than a few others in that subreddit I linked above who feel the same).
You do you though. Just stop trying to convince me you know better about how I like to play and look at games. The arrogance of some people here is wild.
 
You don't think admitting to using FG with DLAA and then complaining about DLSS is a bit weird? The compromises of FG are much more noticeable and measurable, in both lag and picture quality. I use FG all the time but it's really a last resort solution, to hit your refresh rate.

That being said, yep you do you. But you're sitting on a bit of a high horse with debatable opinions so expect people to pick on you :)
 
Buildzoid is overclocking live right now on his YouTube channel at the time of this post if anybody wants to watch it
 
It has minimal impact on AMD so not sure why someone would pay for those extra timings. The OG timings are good enough.
 
5.3-5.4 GHz all-core is what Jufes got with his timings, pre-delid.
 
I can run 5.4 Gs if I don’t look at temps and wear headphones. But I am not going to do that as perf doesn’t change in games for me.
 
Only the beginning is worth it: I think he tested the 9800X3D direct-die, or has info from direct-die testing, and claims it's pointless. The rest of the video is bullshit; ignore it, he's trying to sell his product. Still waiting for the direct-die testing. Just in case anyone was thinking about it, this is maybe just some info or speculation.


View: https://youtu.be/rW9dqHva6JY?si=NAx9A0nPwwLXIkML
 

Decent watch. I only watched his videos because he makes fun of other "tech" YouTubers, such as Jay and the guy from Level1Techs, exposing them for not knowing certain things they should know, especially with the 13th/14th Gen Intel situation. Other than that, I can't stomach his approach to his videos.
 
Ya, he's just a con artist trying his best. He's selling information that will be widespread all over the Internet anyway; he just curates and condenses it, and his asking price is outrageous. I would never give him a penny.
Although I do like his waterblock and direct-die content, only because he does it from a gamer's perspective.
So I'm looking forward to the 9800X3D and 285K direct-die videos; those are good content.
He's like the Walmart version of der8auer, except he's not an engineer, just a gamer, lmao, slapping shit together willy-nilly and trying to rip people off, but I guess that's most businesses.
der8auer is the real deal, and his products are priced properly and are of the highest quality.
To Jufes' credit, he's saying there is no FPS gain from direct die, but I feel like waterblocking is always a good idea no matter which chip.
 
Btw I figured out how to reduce my temps while retaining the performance. I had a thermal setting set to "motherboard", which was unnecessarily overvolting the CPU when it wasn't required, pushing temps into the mid 80s. Now I've gone back to more daily-driver settings, and the things that actually make an impact in gaming are:

Tight timings
PBO offset (using -20)

Everything else gets you higher Cinebench and CPU-Z bench scores, but there is no difference in gaming for me, in either minimums or average fps. Maybe 1 fps here or there, which honestly I get running the exact same bench more than once. The reason is that max clock speed remains exactly the same at 5.215 Gs. Max volts dropped from 1.190 to 1.165.
With the above settings I dropped about 150 pts in CB20 but cut max temps from 86 C to about 78 C, with no more fans boosting during the bench. Pretty dope.
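For a sense of scale, the clock deltas involved here are small; a rough sketch of the headroom percentages, using the frequencies mentioned in this thread:

```python
# Rough upper bound on what the boost offsets can add in clock speed.
# Frequencies (MHz) are the ones discussed in this thread; even the +200
# offset is under a 4% clock gain, and at 4K the GPU is usually the
# limiter, so even that ceiling rarely shows up in fps.
base = 5215  # -20 offset daily driver
configs = {"+75 offset": 5290, "+200 offset": 5415}

for name, mhz in configs.items():
    gain = (mhz - base) / base * 100
    print(f"{name}: +{gain:.1f}% clock over {base} MHz")
```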
 
Reviews that don't test the more likely, real-world gameplay for a combination of components (a.k.a. just 1080p with a 4090) I find limiting for answering the question: "What would I gain at the resolution I game at, 4K?" Hardware Unboxed's latest, 45 games, 9800X3D vs 14900K, 4090, all at 1080p, was for me a totally pointless endeavour. It's good at pointing out which CPU can push game code faster, but irrelevant for determining whether I should bother upgrading. Luckily others did the work, and there are examples of 4K gaming where I would actually be playing. A 5090 paired with a 5800X3D on a 120 Hz OLED TV at 4K, mostly max settings, is probably overkill on the CPU side in many cases. The monitor is most likely the limiting component in my arsenal.

Yes, lower resolution can expose what's limiting, but I prefer a whole review reflecting how the present-day product would actually be used, not some future crystal ball thrown in there.
 

I don't think we should be viewing HUB's videos as straight-up upgrade/purchase advice; it's always been more on the side of "just for science" IMO. I actually have a bigger issue with his list of games: 45 games is a lot, and most of them are simply games that hardly anyone plays. How many people are playing Shadow of the Tomb Raider today, a single-player game from 2018?
 
I wouldn't call it good science in the end; there are other metrics than just avg FPS and 1% lows. GPU usage and CPU usage, as in how well the CPU can push the GPU, would be great metrics to include but are absent. To me it seems like a marketing video more than a useful review. Yes, if I were going to build a gaming rig now, sure, 9800X3D, but I already have several gaming rigs and am planning on getting a next-generation Nvidia card, that is, if available. Their reviews have almost no useful information for me to use.
 

Correct me if I'm wrong, but another issue with testing CPU performance at 1080p is that ray tracing at higher resolutions will tax the CPU harder due to more rays actually being traced, or something like that. Maybe I'm wrong or misheard that information (I think I got it from Digital Foundry), but if it's true then that's another reason we should be testing CPU performance in RT titles at a higher res than 1080p.
 
Yes, more pixels to shade per triangle of the scene as resolution goes up, meaning for each pixel, lighting has to be calculated, compute shaders run, etc.
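As a back-of-the-envelope illustration of how per-pixel work scales with resolution (assuming roughly one shading invocation or primary ray per pixel, which is a simplification):

```python
# Pixel counts per frame at common test resolutions. With roughly one
# shading invocation (or primary ray) per pixel, 4K does about 4x the
# per-pixel work of 1080p, and 1440p about 1.78x.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels ({w * h / base:.2f}x 1080p)")
```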
 
For real, with all the money they are making from the channel, the asshole can at least spend some money on the latest games, but he's too cheap. Or, if he's penny-pinching, ask the devs for a key to promote/showcase the games. Makes no sense, with so many people watching, to not have the latest games, yet the audience has to put up with his teenage-style cringe comments and gross looks lol.
 
I agree the games chosen are very meh, even though I have been an advocate of expanded games testing. Like where are the RTS games? I only see one represented and no survival-builder games. Where is Palworld? Surely they could source a large base build save file from one of their viewers to really stress the CPUs.
That does really complicate things. Am I correct in assuming ray tracing doesn't scale with more cores? But also, wouldn't the performance deltas be similar? Like 20% faster at 1080p with ray tracing would also be 20% faster at 1440p with ray tracing, assuming no GPU bottleneck.

In general though, you only need to track GPU usage in your system to see if you can benefit from a CPU upgrade. You can also try to see if there are GPU reviews with your game and resolution to determine how much your CPU is holding it back, then use that to determine how much more powerful of a CPU you need and whether that is worth the cost.
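That "check your GPU usage" heuristic can be sketched as a toy bottleneck model: delivered fps is capped by whichever of the CPU or GPU is slower, so a faster CPU only shows up once the GPU has headroom. All fps numbers below are hypothetical, purely for illustration:

```python
# Toy bottleneck model: the frame rate you actually see is capped by the
# slower of the CPU-bound and GPU-bound rates for that game/resolution.
# All numbers are made up for illustration, not measurements.
def delivered_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

cpu = {"7800X3D": 220, "9800X3D": 250}  # hypothetical CPU-bound fps
gpu = {"1080p": 400, "4K": 110}         # hypothetical GPU-bound fps

for res, g in gpu.items():
    a = delivered_fps(cpu["7800X3D"], g)
    b = delivered_fps(cpu["9800X3D"], g)
    print(f"{res}: 7800X3D {a} fps vs 9800X3D {b} fps")
```

At 1080p the CPU gap is visible; at 4K both chips land on the same GPU-limited number, which matches the table at the top of this thread.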
 
If you want comparisons for some games, I am happy to run benches for you if that helps with a purchase decision. I actually enjoy doing that. Now I am going to report my results of OC vs. no OC at 4K to give you guys a glimpse of whether the OC is actually worth anything, because I am still on the fence about it. It also means I need to download about 500 GB of games again, since I deleted a lot of them after the benches I posted lol.
 
Great offer, I personally don't need it so I'll let others who have specific requests make them. While I do enjoy nerding out over this stuff, I'm not upgrading for at least 3 or 4 years, probably longer given my track record. I'll most likely get whatever is the greatest on AM5 about 3 years after its release.

I will say the gaming performance improvement of the 7800X3D over the 7700X is what convinced me to jump straight into the 7800X3D rather than starting at the 7700X. I've also started getting interested in hosting game servers, and there is a woeful lack of benchmarks on that. Especially whether or not the Vcache helps. Mostly for repurposing old hardware so reviews of the latest and greatest don't really help there, lol.
 
Makes sense. As I said, I am doing it more for me. I need to dial in my CPU to the point where I am comfortable with the push vs. noise vs. temps, etc. I will do the balls-to-the-wall run as well just to see what happens, but all these overclocking videos are giving me some FOMO about whether I have the performance dialed in correctly lol.
 
I've also been unsuccessful at finding data for server hosting. I know that single-threaded performance helps, so desktop chips reign absolutely supreme, but the impact of cache and RAM speed for that use case? Could not find anything.

Given that the code is generally very similar to the client, V-Cache might just help. But then, there are no graphics or audio effects to process on the server, so maybe not...

I would run tests if I had a 7950X3D at home, but I have a 7800X3D and an 11900K; the latter is much slower at everything, V-Cache or not, so it doesn't work as a comparison. (It's also the one I use for game hosting, with XMP enabled (4000 MHz) and optimized cooling to maintain a 5.3 GHz single-core clock. Old repurposed hardware like you said, but since I was looking at the 9800X3D, I was wondering how the 7800X3D would fare in its stead.)
 
Here are the results for my daily driver vs. balls-to-the-wall, plus stock, all against the tweaked 7800X3D. Proves the point that going balls to the wall is hit or miss.

System specs
DDR5 6000 CL30
Windows 11 24H2
4K 240 Hz DLSS Quality when available, HDR on
Nvidia drivers 566.36

Configurations:
A: 7800X3D, PBO -15, Buildzoid RAM timings
B: 9800X3D stock, DDR5 6000 CL30 stock
C: 9800X3D, PBO -20/+75, Buildzoid RAM timings (daily driver)
D: 9800X3D, PBO -20/+200, Buildzoid RAM timings

| Benchmark | A | B | B vs A | C | C vs B | D | D vs B | D vs C |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Max Core Frequency (MHz) | 4875 | 5215 | | 5290 | | 5415 | | |
| Max Temps (C) | 85 | 86 | | 90 (limited) | | 96 | | |
| Max Volts (V) | 1.3 | 1.175 | | 1.23 | | 1.28 | | |
| Cinebench R20 | 6958 | 8827 | 27% | 8849 | 0% | 8997 | 2% | 2% |
| Cinebench R23 | 17937 | 23014 | 28% | 23019 | 0% | 23165 | 1% | 1% |
| Cinebench R24 | 1099 | 1345 | 22% | 1355 | 1% | 1370 | 2% | 1% |
| CPU-Z Bench SC | 692 | 815 | 18% | 823 | 1% | 836 | 3% | 2% |
| CPU-Z Bench MC | 7425 | 8628 | 16% | 8899 | 3% | 9021 | 5% | 1% |
| Assassin's Creed Mirage | 158 | 152 | -4% | 158 | 4% | 158 | 4% | 0% |
| Shadow of the Tomb Raider | 180 | 174 | -3% | 181 | 4% | 180 | 3% | -1% |
| Far Cry 6 (no FSR) | 122 | 118 | -3% | 125 | 6% | 121 | 3% | -3% |
| Ghost Recon Breakpoint (no upscaling, DX11) | 115 | 111 | -3% | 117 | 5% | 116 | 5% | -1% |
| Cyberpunk 2077 (DLSS FG on) | 50.5 | 45.2 | -10% | 49.5 | 10% | 50.1 | 11% | 1% |
| Forza Horizon 5 (FG on) | 179 | 174 | -3% | 181 | 4% | 180 | 3% | -1% |
| Horizon Zero Dawn Remastered | 168 | 167 | -1% | 173 | 4% | 172 | 3% | -1% |
| Hogwarts Legacy (Hogsmeade area walkthrough) | 108 | 110 | 2% | 115 | 5% | 115 | 5% | 0% |
| Valley Benchmark (MSAA 8X) | 112.3 | 106.4 | -5% | 111.4 | 5% | 112.8 | 6% | 1% |
| Superposition (4K Optimized) | 33003 | 31764 | -4% | 33661 | 6% | 33159 | 4% | -1% |
| Red Dead Redemption | 115 | 111 | -3% | 115 | 4% | 115 | 4% | 0% |
| Heaven Benchmark (MSAA 8X) | 113.4 | 109.6 | -3% | 114.6 | 5% | 115.5 | 5% | 1% |
| Metro Exodus Enhanced Edition (Extreme preset) | 111.09 | 109 | -2% | 112.3 | 3% | 111.54 | 2% | -1% |
 