Ryzen 7 5800X3D Beats Core i9-12900KS By 16% In Shadow of the Tomb Raider

1_rick

https://www.tomshardware.com/news/r...i9-12900ks-by-16-in-shadow-of-the-tomb-raider

The Ryzen 7 5800X3D delivered an average frame rate of 231 FPS, while the Core i9-12900K and Core i9-12900KS finished with 190 FPS and 200 FPS, respectively. Therefore, AMD’s chip beat the Core i9-12900K by 22% and the Core i9-12900KS by 16%.

Shadow of the Tomb Raider is a title that relies heavily on memory speed and is sensitive to memory latency, which favors the Ryzen 7 5800X3D. It’s necessary to put the AMD chip through more titles to see whether it can be the “world’s fastest gaming CPU,” as AMD has been calling it.

The Ryzen system used a 3080 Ti while the 12900K and KS used a 3090 Ti. Tests were done at 720p to eliminate GPU bottlenecks.
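The percentages are just frame-rate ratios; a quick back-of-the-envelope check of the article's numbers, for anyone curious:

```python
# Quick check of the uplift percentages quoted in the article.
ryzen_x3d = 231     # avg FPS, Ryzen 7 5800X3D
i9_12900k = 190     # avg FPS, Core i9-12900K
i9_12900ks = 200    # avg FPS, Core i9-12900KS

def uplift_pct(new, old):
    """Percentage gain of `new` over `old`."""
    return (new / old - 1) * 100

print(f"5800X3D vs 12900K:  {uplift_pct(ryzen_x3d, i9_12900k):.0f}%")   # ~22%
print(f"5800X3D vs 12900KS: {uplift_pct(ryzen_x3d, i9_12900ks):.0f}%")  # ~16%
```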
 
More interested in 2K/4K results. Anything less is fairly useless these days. In particular, does this perform any better than a 5900X at modern resolutions? I don't see the comparisons to Intel being all that relevant, as most people buying this thing are just upgrading their current AM4 build.
 
DDR5-4800 CL40 is super weak. I would like to see a much better kit paired here. 6400 CL32 can have uplifts of 20-25 percent in some titles compared to the JEDEC-spec 4800.
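For rough context, and purely my own back-of-the-envelope math rather than anything from the article: first-word latency in nanoseconds is CAS cycles divided by the memory clock (half the transfer rate), which is why 6400 CL32 looks so much tighter than JEDEC 4800 CL40:

```python
# Rough first-word latency: CAS cycles / memory clock (clock = transfer rate / 2 for DDR).
def cas_latency_ns(transfer_mt_s, cl):
    clock_mhz = transfer_mt_s / 2          # DDR: two transfers per clock
    return cl / clock_mhz * 1000           # cycles / MHz -> nanoseconds

kits = {
    "DDR5-4800 CL40 (JEDEC)": (4800, 40),  # ~16.7 ns
    "DDR5-6400 CL32":         (6400, 32),  # ~10.0 ns
    "DDR4-3200 CL16":         (3200, 16),  # ~10.0 ns
}
for name, (mt_s, cl) in kits.items():
    print(f"{name}: {cas_latency_ns(mt_s, cl):.1f} ns")
```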
 
More interested in 2K/4K results. Anything less is fairly useless these days. In particular, does this perform any better than a 5900X at modern resolutions? I don't see the comparisons to Intel being all that relevant, as most people buying this thing are just upgrading their current AM4 build.
At 4k/5k it becomes much more about the GPU. I understand why they did what they did. Most places benchmark CPUs nowadays at 1080p. They just took it a step further.
 
I would like to see a more even comparison. It “looks” like they are trying to give Intel an edge with that 3090 when really they are holding it back with that RAM selection. A little skewed there, but it reflects what we already know: in situations where titles want or need that cache, you will see an improvement.
 
At 4k/5k it becomes much more about the GPU. I understand why they did what they did. Most places benchmark CPUs nowadays at 1080p. They just took it a step further.
Yeah, makes sense. Still, it would be nice to see what sort of difference you'd really get at the real-world resolutions you'd actually run a 3080 Ti or 3090 Ti at, and whether it makes any difference vs the 5900X or not.
 
Tests were done at 720p to eliminate GPU bottlenecks.
Congrats to AMD for winning the title of "Best at Something Nobody Cares About." Is it even possible to buy a new 720p screen these days? Or did AMD provide a special adapter that allows you to connect an Apple Watch to use as the display?

1998 would like its benchmark back.
 
hmm....

[attached image: ohno.jpg]
 
Congrats to AMD for winning the title of "Best at Something Nobody Cares About." Is it even possible to buy a new 720p screen these days? Or did AMD provide a special adapter that allows you to connect an Apple Watch to use as the display?

1998 would like its benchmark back.
Yes, in fact I did last year.
 
Congrats to AMD for winning the title of "Best at Something Nobody Cares About." Is it even possible to buy a new 720p screen these days? Or did AMD provide a special adapter that allows you to connect an Apple Watch to use as the display?

1998 would like its benchmark back.
The point was to test the performance of the CPU, so the low resolution was used to minimize the amount of work the video card was doing as much as possible. I still have an issue with using different video cards in each system.
 
The point was to test the performance of the CPU, so the low resolution was used to minimize the amount of work the video card was doing as much as possible. I still have an issue with using different video cards in each system.

I don't know that it would have made a difference at such low of a resolution. The Ryzen system used the "less powerful" GPU anyway.
 
Here's an idea. Put up some benchmarks not related to gaming if you are just trying to compare CPU performance. Benchmarks like these are useless click bait.
I've always agreed about this, but it looks like the ship has sailed long ago on that. For years doing this kind of test to show how Intels were "best for gaming" was an accepted method.
 
The point was to test the performance of the CPU, so the low resolution was used to minimize the amount of work the video card was doing as much as possible. I still have an issue with using different video cards in each system.
In that case, why even pretend to run a game at all? A $4000 gaming rig being used to play a game at 231fps at smart watch resolution is no more realistic of a benchmark than simply comparing raw numbers from a spec sheet. It would have been useful in 1998.

The reason they had to time travel back to 1998 in order to run the benchmark is because the performance difference in 2022 amounts to approximately nobody-cares-percent.
 
I've always agreed about this, but it looks like the ship has sailed long ago on that. For years doing this kind of test to show how Intels were "best for gaming" was an accepted method.
I agree, and yes, the trend has been this for years. It's not just an AMD marketing trick; Intel has been guilty of the same thing: cherry-picking and misleading numbers on the things I would actually care about in a CPU. Plus it's on a game that is years old and has traditionally favored AMD, which leads me to believe the real performance numbers are not going to look very good when compared apples to apples.
 
I agree, and yes, the trend has been this for years. It's not just an AMD marketing trick; Intel has been guilty of the same thing: cherry-picking and misleading numbers on the things I would actually care about in a CPU. Plus it's on a game that is years old and has traditionally favored AMD, which leads me to believe the real performance numbers are not going to look very good when compared apples to apples.

I think it will be more competitive than you think. AMD basically is taking what makes their CPUs competitive with lower clockspeeds and tripling it (albeit lowering the clockspeed more). If Intel decided to triple the cache on ADL, it would be a monster.
 
I would like to see a more even comparison. It “looks” like they are trying to give Intel an edge with that 3090 when really they are holding it back with that RAM selection.
4800 vs 3200 is unfair to Intel? The quick search I did found 4800 to have higher bandwidth but 3200 to have lower latency (on the same 12900K).
 
In that case, why even pretend to run a game at all? A $4000 gaming rig being used to play a game at 231fps at smart watch resolution is no more realistic of a benchmark than simply comparing raw numbers from a spec sheet. It would have been useful in 1998.
Didn't this very site use low-resolution comparisons, back when they still did reviews here, for the exact same reason given in the article? (And yeah, people would crawl out of the woodwork every time with the same complaint.)
 
More interested in 2K/4K results. Anything less is fairly useless these days. In particular, does this perform any better than a 5900X at modern resolutions? I don't see the comparisons to Intel being all that relevant, as most people buying this thing are just upgrading their current AM4 build.
Doesn't make any sense to test a cpu at 2k/4k, might as well use a 4790k :D
 
Congrats to AMD for winning the title of "Best at Something Nobody Cares About." Is it even possible to buy a new 720p screen these days? Or did AMD provide a special adapter that allows you to connect an Apple Watch to use as the display?

1998 would like its benchmark back.
I thought the whole point of the 12900KS is to have the highest-performance gaming CPU. Now that the 5800X3D beats it (at least in one game), it's suddenly useless?
The whole point of the benchmark is to see which CPU "allows" higher performance in a particular game engine. YES, it's not as applicable to the real world, but it's still a valid data point nonetheless, just like Cinebench results.
I wonder what CSGO is like...
 
Doesn't make any sense to test a cpu at 2k/4k, might as well use a 4790k :D
Exactly this. You do not test CPU gaming performance at 4k. All the results would be GPU limited which doesn't mean a damn thing. Now in 2k I can see some games showing the CPU performance. 1080p should be the sweet spot to use since no one is playing at 720p.

Now, to really test CPU performance you do want to use a lower resolution. But let's be honest here: this is one game benchmark. I would like to see way more games before making a final decision.

Remember, the 5800X3D is being pushed as a gaming CPU. When it comes to productivity benchmarks, no one expects it to beat the 5800X or even Intel. What AMD wants to do is take back the gaming CPU performance crown....
 
I wonder what the actual boost clocks are since the 3D variant is supposed to be 300MHz less...
Dat IPC in games...
 
More interested in 2K/4K results. Anything less is fairly useless these days. In particular, does this perform any better than a 5900X at modern resolutions? I don't see the comparisons to Intel being all that relevant, as most people buying this thing are just upgrading their current AM4 build.


No, I wouldn't be. 720p at max settings is the best balance: it pushes the CPU the way most normal people's settings do (while minimizing stress on the GPU).

if you test at 4k, then the results will all be GPU-limited
 
Exactly this. You do not test CPU gaming performance at 4k. All the results would be GPU limited which doesn't mean a damn thing. Now in 2k I can see some games showing the CPU performance. 1080p should be the sweet spot to use since no one is playing at 720p.
For me it is almost obvious that you do a mix (but if you test at 4K, do it with a 3090 Ti / 6950 XT).

People that buy the latest/greatest CPU would often be playing at high resolution and would be the type of people that upgrade a lot (making hypothetical future games or future GPU performance not that important to them, if at all), so how it does right now in the things they would actually do is really relevant to them.

You want the 720p/600p run to test the CPU part to the maximum, to give an idea, when RDNA 3/Lovelace come out, of how much difference you could see at 1440p or even 4K in some titles, even if there is none yet with current GPUs and because you will not test every game.

You want the regular type of run, so people that want to upgrade for today's performance (or wait to upgrade) can have the information.

When it comes to productivity benchmarks, no one expects it to beat the 5800X
Considering how giant a cache they pack on Epyc (with 3D V-Cache coming there as well), I imagine some non-game workloads (databases, very complex math simulations?) could be helped by this as well.
 
For me it is almost obvious that you do a mix (but if you test at 4K, do it with a 3090 Ti / 6950 XT).

People that buy the latest/greatest CPU would often be playing at high resolution and would be the type of people that upgrade a lot, so how it does right now in the things they would actually do is really relevant to them.

You want the 720p/600p run to test the CPU part to the maximum, to give an idea, when RDNA 3/Lovelace come out, of how much difference you could see at 1440p or even 4K in some titles, even if there is none yet with current GPUs and because you will not test every game.

You want the regular type of run, so people that want to upgrade for today's performance (or wait to upgrade) can have the information.

Considering how giant a cache they pack on Epyc (with 3D V-Cache coming there as well), I imagine some non-game workloads (databases, very complex math simulations?) could be helped by this as well.
Great points; 720p to 4K it is. I didn't look at it that way, only the original way of bumping down the resolution to see the raw power of the CPUs.
 
In short, do you have a CPU from the last 2~3 generations of CPUs from either side? Cool, your gaming experience will be just fine at real resolutions... lol.

I would imagine the only people who "might" care for gaming would be those who competitively game for money and go for lowest settings possible for the highest FPS possible.
 
I'm pretty sure everyone understands why tests are done at 720p. The main question is how relevant is it really for most people? Test the cpus at normal resolutions too.
 
In short, do you have a CPU from the last 2~3 generations of CPUs from either side? Cool, your gaming experience will be just fine at real resolutions... lol.

I would imagine the only people who "might" care for gaming would be those who competitively game for money and go for lowest settings possible for the highest FPS possible.
No, I want to see relevant frame times & minimum FPS numbers at modern resolutions. CPUs make a big difference with this. My upgrade from a 2700x->3700x was noticeable in some games. The same goes when I went from 3700x->5900x.

The most relevant question when it comes to the 5800x3D is how it performs compared to a 5900x in the numbers that matter.
 
No, I want to see relevant frame times & minimum FPS numbers at modern resolutions. CPUs make a big difference with this. My upgrade from a 2700x->3700x was noticeable in some games. The same goes when I went from 3700x->5900x.

The most relevant question when it comes to the 5800x3D is how it performs compared to a 5900x in the numbers that matter.
I can't disagree with that point. I noticed quite a jump in frame times and overall smoothness on my last upgrade, even though at 4K my average did not go up by a large amount.
 
No, I want to see relevant frame times & minimum FPS numbers at modern resolutions. CPUs make a big difference with this. My upgrade from a 2700x->3700x was noticeable in some games. The same goes when I went from 3700x->5900x.

The most relevant question when it comes to the 5800x3D is how it performs compared to a 5900x in the numbers that matter.
Theoretically, even if average FPS is about the same, the minimums should be much better. Agreed, it would be nice to know.
 
In short, do you have a CPU from the last 2~3 generations of CPUs from either side? Cool, your gaming experience will be just fine at real resolutions... lol.
Someone with a high-end GPU (or aiming above 100 FPS on average) could be held back by a Ryzen 2600 (I imagine the 0.1% lows scale even more in some games); for someone with a mid-range GPU, or going for 60 FPS for other reasons like a TV/monitor that maxes out there, I imagine you would be right for most games.

[attached chart: average FPS, Ultra settings, 1440p (Avg-Ultra-1440p.png)]
 
720p tests are purely academic. It's not actually useful information, because nobody games at 720p. The marketing of this CPU is that it's a solid boost for gaming. Actual gaming.

Any GPU review that still shows 720p data, I skip right over it.
 
720p tests are purely academic. It's not actually useful information, because nobody games at 720p. The marketing of this CPU is that it's a solid boost for gaming. Actual gaming.

Any GPU review that still shows 720p data, I skip right over it.
It's useful data for a long term builder. 720p results today show you the difference when you upgrade to RTX 5000 series later on.

Excluding maybe 4K and bigger resolutions, those will be GPU constrained for a long time still.
 
It's useful data for a long term builder. 720p results today show you the difference when you upgrade to RTX 5000 series later on.

Excluding maybe 4K and bigger resolutions, those will be GPU constrained for a long time still.
The point here is to look at the chart above.

As you said, any newer CPU is irrelevant if you don't have a higher-end GPU. This is why I specifically want to see tests between the 5800X3D, a 5800X, and a 5900X, along with a 3080 Ti/3090. With these GPUs, the CPU makes a big difference when it comes to frame drops, etc.

The most relevant test of whether a 5800X3D is worth upgrading to is showing said results and comparing them to a 5800X and a 5900X. People can then make a decision based on differences (or not) that are real-world and humanly noticeable. If the 5800X3D doesn't uplift the minimums at all compared to a 5900X, or even a 5800X, it's not worth the money.
 
Oh no, they benchmarked in a way that shows the performance difference of the thing they are talking about and not the GPU. The horror. lol

The fact that that is what you have to do to make the CPU matter AT ALL should tell everyone that buys CPUs for gaming alone what they need to know. You don't need anything more than a mid-range CPU to game... GPU: sure, go for the moon if you can afford/find what you want. CPU: if you're just gaming, it's all the same. Get the best-priced CPU that slots into the platform you want to run.
 
I don't know anything about Shadow of the Tomb Raider. Is it a representative benchmark? Or is it biased like all those sham AotS benchmarks that have been pushed in the past?

Either way, I am glad to see the repeated leapfrogging keep up, and I hope it continues. This benefits the consumer.
 
Congrats to AMD for winning the title of "Best at Something Nobody Cares About." Is it even possible to buy a new 720p screen these days? Or did AMD provide a special adapter that allows you to connect an Apple Watch to use as the display?

1998 would like its benchmark back.

It makes perfect sense to lower resolution and graphics settings as much as possible in order to isolate the CPU performance.

If you want to predict how a system will perform in a given title, look at two benchmarks:

1.) CPU benchmark with your CPU, and all graphics settings and resolutions minimized.
2.) GPU benchmark at the resolution and graphics settings you'd think you'd play at.

Now pick the lower of the two scores. That will be your system performance.

An exact combination of your GPU and CPU at the exact settings will of course always be more accurate, but you can't always find someone testing the exact configuration you might have.
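That "lower of the two" rule is trivial to write down; a minimal sketch with made-up numbers, just to illustrate the idea:

```python
# Rough system-performance prediction: the slower of the two bounds wins.
def predicted_fps(cpu_bound_fps, gpu_bound_fps):
    """cpu_bound_fps: CPU benchmark at minimum settings/resolution.
    gpu_bound_fps: GPU benchmark at your target settings/resolution."""
    return min(cpu_bound_fps, gpu_bound_fps)

# Hypothetical example: a CPU good for ~231 FPS paired with a GPU that manages
# ~120 FPS at 4K ultra -> you end up GPU-limited at ~120 FPS.
print(predicted_fps(cpu_bound_fps=231, gpu_bound_fps=120))  # 120
```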
 
It makes perfect sense to lower resolution and graphics settings as much as possible in order to isolate the CPU performance.

That makes perfect sense, but I also want to see real-world performance gains. Yes, it might be 15% faster when CPU-bound, but how does that translate to actual game performance? It is a fairly useless thing to measure on its own, because no one actually games at those resolutions with settings turned down. I don't mind including it, but they should always include 1080p/1440p/1440p ultrawide and 4K at max settings. Of course CPUs tend not to make much of a difference there, but it would be nice to see how much or how little the gain would be from an older CPU, to inform a purchasing decision.
 