Comprehensive Core i9-10900K Review Leaked: Suggests Intel Option Formidable

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
10,894
"Nothing much is news-worthy in this news." - XL-R8R

"A comprehensive review of the Intel Core i9-10900K 10-core/20-thread processor by Chinese tech publication TecLab leaked to the web on video sharing site bilibili. Its testing data reveals that Intel has a fighting chance against the Ryzen 9 3900X both in gaming- and non-gaming tasks despite a deficit of 2 cores; whereas the much pricier Ryzen 9 3950X only enjoys leads in multi-threaded synthetic- or productivity benchmarks.

Much of Intel's performance leads are attributed to a fairly high core-count, significantly higher clock speeds than the AMD chips, and improved boosting algorithms, such as Thermal Velocity Boost helping the chip out in gaming tests. Where Intel loses hard to AMD is power-draw and energy-efficiency. TecLab tested the three chips with comparable memory- and identical graphics setups."


https://www.techpowerup.com/267287/...eview-leaked-suggests-intel-option-formidable
 
Actually doesn't look so bad. Not a home run hit, but Intel is still in the game.
 
Zen 2 cannot compare to Intel for extreme gaming. I'm waiting for Zen 3 to see if AMD trounces Intel on the gaming front.

Unless you consider Threadripper. IIRC it was beating Intel in most games... if you're willing to drop $2.5-3k on a proc/mobo.
 
Looks like it. +0.1 GHz and +30 W...
Great, another 6700K -> 7700K, with additional TDP on top of it.
I do have to say, though, that it has been impressive how long Intel has stretched out 14nm+++++ over the years.

If their offerings keep getting better, hopefully that will continue to up the competition for AMD, and that is a win-win for us!
 
Was this performance review also set to the same 235w TDP limit that Intel used in its own benchmarks?
 
Was this performance review also set to the same 235w TDP limit that Intel used in its own benchmarks?

Well, the screenshots show it pulling 337 W, so probably not. (To be fair, the 3950X was shown using 306 W. Of course, it's got 16 cores, not just 10.)
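A rough per-core comparison falls out of those two numbers. This is just arithmetic on the leaked figures quoted above, not new measurements:

```python
# Per-core power, using only the package-power figures quoted in
# the thread (337 W for the 10900K, 306 W for the 3950X).
# These are leaked numbers, not validated measurements.
figures = {
    "i9-10900K": {"watts": 337, "cores": 10},
    "R9 3950X":  {"watts": 306, "cores": 16},
}

for name, f in figures.items():
    per_core = f["watts"] / f["cores"]
    print(f"{name}: {per_core:.1f} W per core")
    # i9-10900K: 33.7 W per core
    # R9 3950X:  19.1 W per core
```

So even if the total draws are in the same ballpark, the AMD chip is doing it with roughly 40% less power per core, which is the efficiency gap the review calls out.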
 
Zen 2 cannot compare to Intel for extreme gaming. I'm waiting for Zen 3 to see if AMD trounces Intel on the gaming front.
What are you waiting for? Their 2nd cheapest Zen2 CPU already beats the 9999kkk
3950 already has better 1% and averages in memenite but no one ever talks about that for some strange reason.
 

Attachments

  • fortnight 3300x.jpg (174.7 KB)
Zen 2 cannot compare to Intel for extreme gaming. I'm waiting for Zen 3 to see if AMD trounces Intel on the gaming front.

That’s a bit of an overstatement. It is very situational and depends entirely on what you mean by “extreme gaming”, as that is not an official market definition but something defined on a person-by-person basis. Even in the situations where Intel wins, the average is barely outside of a general margin of error; I would hardly call that not comparing.
 
That’s a bit of an overstatement. It is very situational and depends entirely on what you mean by “extreme gaming” as that is not an official market definition and something defined on a person-by-person basis. Even in the situations where Intel wins the average is barely outside of a general margin of error, I would hardly call that not comparing.

I use AMD in most of my / family’s rigs, but in my main rig I use Intel. If I spend $3-4k on a 2080 Ti powered rig, I am not going to gimp it over $50-100, so I go Intel on that one.

Now if you’re budget constrained or streaming or need the cores, AMD makes allll the sense in the world.

Of course, with no budget at all you’d pick Threadripper...

Intel needs to do better, fast, or AMD will blaze past them in all segments. In most, AMD already has.
 
Zen 2 cannot compare to Intel for extreme gaming. I'm waiting for Zen 3 to see if AMD trounces Intel on the gaming front.
I mean, if you really care that Intel gets 176fps over AMD getting 172fps - I would hardly call that a 'win', especially for the decreased core count and the massive TDP increase.
But, if there is someone in that "extreme gaming" category that must get every fps to achieve their goal, then Intel is the way to go.
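To put the 176-vs-172 figure in perspective, here is a quick back-of-the-envelope using the (hypothetical, round-number) framerates from the post above:

```python
# Relative difference between the two framerates mentioned above
# (176 fps vs 172 fps). These are illustrative round numbers from
# the post, not benchmark results.
intel_fps, amd_fps = 176, 172

gain_pct = (intel_fps - amd_fps) / amd_fps * 100
frametime_intel_ms = 1000 / intel_fps
frametime_amd_ms = 1000 / amd_fps

print(f"fps advantage: {gain_pct:.1f}%")  # ~2.3%
print(f"frametime delta: {frametime_amd_ms - frametime_intel_ms:.3f} ms")
```

A ~2.3% fps gap works out to barely a tenth of a millisecond per frame, which is the kind of margin run-to-run variance can swallow.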
 
Oof, even Kaby Lake improved the iGPU over Skylake...


Only for battery efficiency and the hardware decoders. On the desktop, Kaby Lake's GPU performance was identical.

Much like Haswell Refresh, you tend to only see one improvement in efficiency on the same process. Coffee Lake was just using the improvements that made the 7700K possible to introduce six slower-clocked cores at 65 W TDP.

The 8700K at stock clocks exceeds 100 W package power when you run Cinebench, and they just opened up the gas tanks from that point onward.

multi-core-enhance-cinebench-pwr.png

The 9900K uses 175 W package power at stock clocks.

The 10900K uses somewhere around 250 W fully loaded!

Skylake++++: fuck the power, full-speed ahead!
 
Last edited:
Sure, if you don't mind burning 300W to get 3% better frametimes, it's a great buy.

It's not a 300 W difference. The power-usage difference in most games will be less than that of a 40 W light bulb. Big whoop.
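For what it's worth, that claim is easy to sanity-check. A minimal sketch, where the hours per day and the electricity price are my assumptions for illustration, not figures from the thread:

```python
# What a ~40 W in-game power-draw difference costs over a year.
# Hours/day and price per kWh are assumptions for illustration;
# only the 40 W delta comes from the post above.
extra_watts = 40          # claimed in-game power delta
hours_per_day = 3         # assumed gaming time per day
price_per_kwh = 0.13      # assumed USD/kWh (varies by region)

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.1f} kWh/yr, about ${cost_per_year:.2f}/yr")
# 43.8 kWh/yr, about $5.69/yr
```

A few dollars a year either way, so for gaming-only loads the power delta is more a cooling/noise question than a cost one.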

What are you waiting for? Their 2nd cheapest Zen2 CPU already beats the 9999kkk
3950 already has better 1% and averages in memenite but no one ever talks about that for some strange reason.

Cherry pick one terrible game from some no-name website; a highly overclocked 3300X versus stock 9900K. Talk about a stretch...

So "extreme" gaming is 1080p low settings in 2020?

I mean, I run Intel on my main rig, not a hater, but once you get into 4K, ray tracing, etc., Intel is not far ahead, if ahead at all really.
That’s a bit of an overstatement. It is very situational and depends entirely on what you mean by “extreme gaming” as that is not an official market definition and something defined on a person-by-person basis. Even in the situations where Intel wins the average is barely outside of a general margin of error, I would hardly call that not comparing.
I mean, if you really care that Intel gets 176fps over AMD getting 172fps - I would hardly call that a 'win', especially for the decreased core count and the massive TDP increase.
But, if there is someone in that "extreme gaming" category that must get every fps to achieve their goal, then Intel is the way to go.

All complete myths. Only noobs think you need 1080P resolution to run into CPU bottlenecks, or are playing ancient terrible games/software. In the three games I play (DCS Flight Simulator, Escape from Tarkov, Star Citizen), ALL three are CPU bound with my RTX Titan at 3840x1600/160FPS/Hz and 9900K at 5.4 GHz. I buy the fastest items no matter the brand. Not going to change the fact that my 5.4 GHz 9900K is going to be on average 10-15% faster than an overclocked 3950X in 99.9% of games. Intel just currently has a frequency/overclock advantage that clearly puts it ahead of AMD right now for gaming.

I hope Zen 3 changes that.
 
Last edited:
Waiting for Zen 3, Big Navi/3080, and the Samsung 980 Pro myself. Can't believe my 9900K is almost two years old.
 
I buy the fastest items no matter the brand.

Hey, the world needs whales, and if that's your niche, then lean in. Most people prefer something with a higher value proposition.
 
All complete myths. Only noobs think you need 1080P resolution to run into CPU bottlenecks, or are playing ancient terrible games/software. In the three games I play (DCS Flight Simulator, Escape from Tarkov, Star Citizen), ALL three are CPU bound with my RTX Titan at 3840x1600/160FPS/Hz and 9900K at 5.4 GHz. I buy the fastest items no matter the brand. Not going to change the fact that my 5.4 GHz 9900K is going to be on average 10-15% faster than an overclocked 3950X in 99.9% of games. Intel just currently has a frequency/overclock advantage that clearly puts it ahead of AMD right now for gaming.

I hope Zen 3 changes that.

Your specific use case does not define the entirety of a product’s performance. This is why we use averages when discussing where an item falls in relation to another and don’t rely on cherry picked examples.
 
It's not 300w difference. The power usage difference in most games will be less than the usage of a 40watt light-bulb. Big whoop.

Cherry pick one terrible game from some no-name website; a highly overclocked 3300X versus stock 9900K. Talk about a stretch...

All complete myths. Only noobs think you need 1080P resolution to run into CPU bottlenecks, or are playing ancient terrible games/software. In the three games I play (DCS Flight Simulator, Escape from Tarkov, Star Citizen), ALL three are CPU bound with my RTX Titan at 3840x1600/160FPS/Hz and 9900K at 5.4 GHz. I buy the fastest items no matter the brand. Not going to change the fact that my 5.4 GHz 9900K is going to be on average 10-15% faster than an overclocked 3950X in 99.9% of games. Intel just currently has a frequency/overclock advantage that clearly puts it ahead of AMD right now for gaming.

I hope Zen 3 changes that.

5.4 GHz changes the landscape there. 24/7 stable?
 
5.4 GHz changes the landscape there. 24/7 stable?

Even if it is for gaming, the power draw when doing any sort of non-gaming workload, especially one using AVX, is going to mean a serious investment in cooling. That is a 0.1% type of build, and not for even the average guy who builds his own computer. I'm not against that sort of build at all, but you're tripling the cost of a build to get that last 5% of performance (see also the RTX Titan).
 
I'm not against that sort of build at all, but you're tripling the cost of a build to get that last 5% of performance

This is what will probably keep me off this particular upgrade. It's too bad neither side will offer a chip with fewer cores but the best rated speed.
 
The 8700k at stock clocks exceeds 100w package power when you run Cinebench, and they just opened-up the gas tanks for that point onward.

9900k uses 175w package power at stock clocks.

The 10900k uses somewhere around 250 w fully-loaded!

Yep.. WHEN you're running fully loaded

A 9900K doing minor web browsing, emails, etc. consumes about 3 W when power-saving measures are in place. And most games don't use all cores either; 30-35 W in Assassin's Creed Odyssey, for instance (at 5 GHz, 6 cores/12 threads).
 
Last edited:
Oh I forgot this is no longer the [H]ardforum.. now the [A]verage or [V]alue forum for some people.

Who would have thought that, in a global recession, people would be a bit stingier about spending on technology from 2017.

And I'll have you know - I've gone Zen the last couple of years not because it was fast, but because it's interesting. Intel has been boring for years. That's what [H] was about... not spending the most money to cover niche cases and single-digit performance returns.
 
Yep.. WHEN you're running fully loaded

A 9900K doing minor web browsing, emails, etc. consumes about 3 W when power-saving measures are in place. And most games don't use all cores either; 30-35 W in Assassin's Creed Odyssey, for instance (at 5 GHz, 6 cores/12 threads).

I don't know about that, unless I just don't know what "power saving measures" you're talking about. I routinely see 25-35 W in normal computer tasks like browsing on mine.
 
Last edited:
Oh I forgot this is no longer the [H]ardforum.. now the [A]verage or [V]alue forum for some people.

Of course, if you were really [H]ard you'd have a second 3950X setup just for the tasks that AMD does better than Intel... Evidently you too have fallen prey to the [V]alue mindset.
 
Last edited:
Of course, if you were really [H]ard you'd have a second 3950X setup just for the tasks that AMD does better than Intel... Evidently you too have fallen prey to the [V]alue mindset.

Ya, I'm going to build a second PC to do tasks I don't do or care anything about. Boy, some people get desperate to try and make a point...

Ironically an OC'd 10980XE would still be better at 99% of those same tasks than an OC'd 3950X.

I'm truly sorry (not) that Intel is still clearly better for games; you're just going to have to get over it.
 
Oh I forgot this is no longer the [H]ardforum.. now the [A]verage or [V]alue forum for some people.

While I mostly agree with the sentiment that Intel is still top dog for gaming, I'd argue that simply buying the top binned SKU isn't [H]ard. Being [H]ard has always been a mindset of getting the most out of what we got, IMO.

We both joined [H]ardforum in 2004, and back then dropping an Athlon XP-M 2500+ (Barton), a $75 mobile CPU, into a desktop motherboard then overclocking the shit out of it was irrefutably [H]ard. Shortly thereafter, we were talking about unlocking vertex shaders and pipes in GeForce 6800, X1800 GTO, and more than I can remember. None of these were flagship parts. So yes, to a degree being [H]ard is also about value sentiment and trying to snipe the flagship product without spending flagship money.
 
Last edited:
This whole topic came about because of talk that AMD was faster at games and/or that Intel is only faster at "1080p", both of which are false. I have nothing more to add.
 