No excitement about intel's latest offering??

The problem with the mid range offerings is the price. You're still paying $260+ for an unlocked part which is $100 more than the 3600. The plain old locked i5 is ~$190-200, but you're not going to get that 5Ghz boost speed. You're only looking at 4.3 which is going to perform essentially the same as the 3600(x) for the same price. It's not unreasonable for someone to put an extra $100 toward a video card instead of the unlocked CPU.
Yeah, I know the problem; that's why I'm not holding my breath for great news when real benchmarks come out. I'm looking at some upgrades and swaps across my home server, my desktop, my son's desktop, and my two daughters' desktops. I use my server for recording live TV with Plex, some development (a remote Visual Studio Code install), a couple of local Minecraft servers, a file server, and a few other odds and ends. It would be nice to have something with hardware transcoding, so I'm waiting to compare the new AMD APU against a chip with Quick Sync, versus just reusing my Ryzen 1600 with a cheap GPU thrown in. I am not in a rush. I was going to go all-in on AMD, but if my B450 can't be upgraded to Zen 3 then I have to upgrade it either way, so I may as well check out what Intel has and at what price. I try not to make decisions until I have the data to support them. Like you, I have a pretty good idea that AMD is still going to be the $/perf leader, and they have their new stuff coming out that could put them further ahead (depending on how they price it).
 
No one is interested in the latest 10-core, up-to-5.3GHz chip?? :) My 8700K is starting to struggle a bit in maintaining 4.8GHz; I had to boost the voltage recently to stabilize it. So I was considering a 9900K and calling it a day, then I read about the latest 10900K. Seems pretty interesting. But I haven't seen much hype about it. Is it the same core as the existing Coffee Lake setup, just more of them at a higher clock?

While some 8700Ks struggle to hit 4.8GHz, the 8086Ks are all 5.0+ GHz, and the same goes for the 9900K. The jump from a 9900K to a 10900K will be smaller than from an 8086K to a 9900K, if there's any jump at all.

The only real excitement for the 10th gen parts should be the much cheaper 6C/12T parts, which look to give Intel great value in the midrange while leaving a nice upgrade path to RKL.
 
Intel's biggest problem with 10nm going forward is that their 14nm was too damn good. Now 10nm can't really compete on the desktop, where clock speed is king.

Even TSMC 7nm would be a downgrade for Intel vs its 14nm on desktop. It's possible we might never see another 5GHz+ desktop CPU from anyone below 14nm.

More like 10nm was just too damn crappy. First came CNL on 10nm in late 2018, which was a joke when it got destroyed by 14nm+ in efficiency and clock speed. Now the improved ICL 10nm+ is still not good enough for desktop use, as clock losses still offset IPC gains. Later, even TGL 10nm++ will get passed over on the desktop in favor of RKL, as the IPC gains still can't offset the frequency losses.
 
Oh almost certainly auto, this is a recent Windows install. Only just installed HWINFO to see what's what.
Right, so when you're running AVX loads like BF4 it's running 4.7GHz turbo, as the default AVX offset setting is -3.
 
Right, so when you're running AVX loads like BF4 it's running 4.7GHz turbo, as the default AVX offset setting is -3.
It may not be an AVX load, since it's pegged to 5.0; however, that may be what the board is doing instead. I'll look at it closer.
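For anyone wondering where 4.7 comes from: the AVX offset drops the turbo multiplier in 100MHz steps when AVX code runs. A minimal sketch of that arithmetic (the helper name and the standard 100MHz BCLK step are my assumptions, not anything board-specific):

```python
# AVX offset arithmetic: the board reduces the turbo multiplier by the
# offset whenever AVX instructions are active. Values below match the
# post: 5.0 GHz turbo with the default AVX offset of 3 (100 MHz steps).

def avx_clock_ghz(turbo_ghz: float, avx_offset: int, bclk_mhz: float = 100.0) -> float:
    """Effective clock under AVX load given a multiplier offset."""
    return turbo_ghz - avx_offset * (bclk_mhz / 1000.0)

print(avx_clock_ghz(5.0, avx_offset=3))  # → 4.7
```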
 
Why the lack of excitement? Simple. Aside from SMT on all but the Celeron-branded models and the addition of two more cores in the i9 line, it's just more of the same old thing. It goes to show just how complacent Intel has been in the past decade.
 
I’m planning a 10th gen system.

There are some nice improvements in the midrange imo.

Better ITX boards and all CPUs have hyperthreading.

Have a preorder in for the Gigabyte Z490 ITX and waiting on the 10600K to go on sale.

My situation is somewhat less common: I'm taking the gaming GPU out of my main computer and building a new separate system just for gaming. If I was on Ryzen 3000 or 8/9th gen already, I probably wouldn't upgrade.
 
I have a 6600k right now and have been weighing upgrading to either an R5 3600 on X570, with the intent to upgrade to a 4600 later this year, or to a 10600k on Z490. They'd cost about the same considering I'd need to buy faster RAM for Ryzen, but X570 boards have either been out of stock or have had their prices hyper inflated for several weeks now.

Between my own impatience and not knowing when the X570 situation will change, I am probably going to pull the trigger on Intel on Wednesday.
 
The 10700K looks interesting: 8 cores / 16 threads, $399.
Yeah, or the 3700X, which is $280. And this is why more people are not excited: we already have most of these parts covered at a cheaper price. I'm more interested/excited to see if the Zen 3 IPC increase closes the gap in gaming performance. 200MHz + slightly increased IPC gives me hope. Then if they keep current pricing (or even bump prices UP to meet Intel) it would be a much better buy.

Edit: This doesn't mean Intel is DOA, it just doesn't excite me. I may still end up building an Intel system to replace my server just because QSV is much better than VCE (or whatever AMD is calling it now).
 
Yeah, or the 3700X, which is $280. And this is why more people are not excited: we already have most of these parts covered at a cheaper price. I'm more interested/excited to see if the Zen 3 IPC increase closes the gap in gaming performance. 200MHz + slightly increased IPC gives me hope. Then if they keep current pricing (or even bump prices UP to meet Intel) it would be a much better buy.

Edit: This doesn't mean Intel is DOA, it just doesn't excite me. I may still end up building an Intel system to replace my server just because QSV is much better than VCE (or whatever AMD is calling it now).

The more I think of it, the more likely it is that I'll just wait for zen 3
 
The more I think of it, the more likely it is that I'll just wait for zen 3
Yeah, that's what I'm thinking.. and now that AMD officially said they WILL supply Zen 3 microcode for B450/X470, I won't even have to upgrade my MB to do so, w00t. (I may end up upgrading anyway, but it's nice not being forced to.)
 
Kind of difficult to justify the price of the 9900K / 10900K if you use your PC for more than just gaming... I was considering a 9900K vs 3900X a few days ago, and basically AMD is cheaper while being 95% as fast as Intel in single-threaded workloads and 125% as fast in multi-threaded tasks. From what it seems, the 10900K will have a price comparable to the 3950X... which is basically the same thing again.

Pure gaming build: Intel FTW. Mixed use build: AMD FTW.
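Those 95%/125% figures can be turned into a rough mixed-use score. A sketch of the weighting, where the 70/30 single-thread/multi-thread split is purely an illustrative assumption, not a measurement:

```python
# Weighted relative performance for a mixed workload, using the relative
# figures quoted above (AMD ~95% of Intel single-threaded, ~125%
# multi-threaded). The workload weights are hypothetical; adjust them
# to your own usage mix.

def mixed_use_score(st_rel: float, mt_rel: float, st_weight: float) -> float:
    """Time-weighted relative performance across ST and MT phases."""
    return st_weight * st_rel + (1.0 - st_weight) * mt_rel

# Assumed split: 70% of time bound by single-thread speed, 30% by multi-thread.
amd_vs_intel = mixed_use_score(st_rel=0.95, mt_rel=1.25, st_weight=0.70)
print(f"AMD relative to Intel for this mix: {amd_vs_intel:.2f}")  # ~1.04
```

Even with a gaming-heavy 70/30 split, AMD lands at rough parity before price enters the picture, which is the point being made about mixed-use builds.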
 
Kind of difficult to justify the price of the 9900K / 10900K if you use your PC for more than just gaming... I was considering a 9900K vs 3900X a few days ago, and basically AMD is cheaper while being 95% as fast as Intel in single-threaded workloads and 125% as fast in multi-threaded tasks. From what it seems, the 10900K will have a price comparable to the 3950X... which is basically the same thing again.

Pure gaming build: Intel FTW. Mixed use build: AMD FTW.

BS.

Intel doesn't win gaming because gaming is single-threaded; gaming is far from single-threaded. Intel wins gaming because it has lower latency.

Intel is great for mixed usage. Because most usages are not embarrassingly parallel.

AMD really only wins the embarrassingly parallel workloads. Encoding/Rendering. Of which, only video encoding is popular.

Gaming and Mixed usage: Intel.

Embarrassingly Parallel: AMD.
 
BS.

Intel doesn't win gaming because gaming is single-threaded; gaming is far from single-threaded. Intel wins gaming because it has lower latency.

Intel is great for mixed usage. Because most usages are not embarrassingly parallel.

AMD really only wins the embarrassingly parallel workloads. Encoding/Rendering. Of which, only video encoding is popular.

Gaming and Mixed usage: Intel.

Embarrassingly Parallel: AMD.
I take it you're a hardcore Intel fanboy then? Intel will bounce back, don't get too butt hurt about it, lol.
 
People, it is not a 250W processor unless you load it up and run it at that most of the time, which, I would argue, most people do not.

I have a 9900k. When it is running full pelt it is not a cool processor by any stretch. Enable speed shift (HWP), core duty cycling and sleep states, and it becomes positively civilised.

While I was playing assassin’s creed origins it was only consuming 30-40W last night and the cores were running at 5ghz! 2 cores were sleeping at the time also. This is far from the 220w lofty peaks that you will hear people cry foul about. Starcraft 2 was down around 15-20W.

Intel processors actually idle at lower power than AMD's processors for now, so net power usage will be lower.
 
Kind of difficult to justify the price of the 9900K / 10900K if you use your PC for more than just gaming... I was considering a 9900K vs 3900X a few days ago, and basically AMD is cheaper while being 95% as fast as Intel in single-threaded workloads and 125% as fast in multi-threaded tasks. From what it seems, the 10900K will have a price comparable to the 3950X... which is basically the same thing again.

Pure gaming build: Intel FTW. Mixed use build: AMD FTW.

Urm.. no

I use things like STM32Cube, fusion 360, Simplicity studio, autodesk inventor, Atmel Studio.

I would rather not have to deal with tweaking memory, edge cases, etc. It needs to just work.
 
I wouldn't consider going AMD until they catch up in the GHz race, but make no mistake about it, this is just garbage being shoveled by Intel to keep their name relevant.
 
I wouldn't consider going AMD until they catch up in the GHz race, but make no mistake about it, this is just garbage being shoveled by Intel to keep their name relevant.
It's not garbage. This fanboy hyperbole is getting silly.

It's just a placeholder generation, after a series of placeholder generations; like AMD did with every Bulldozer release. Just likely for less time, and remaining competitive for nearly all of it, unlike Bulldozer which was actually DOA from a performance perspective.

It's also quite nice that AMD is providing new products that are competitive, but at the same time, it should be understood that in general, both are competitive and have their own edge cases -- performance and otherwise -- that make them desirable over the other.
 
It's a pretty solid statement: you can hand a decade-old 980X to someone now and they're unlikely to notice it's slow for day-to-day tasks, and a 2600K is pretty viable even for gaming.
The 10900K isn't exciting because it doesn't really improve on what Intel is good at (low latency and high clock speeds). Instead, it adds two more cores that no one really asked for (if you're married to the mainstream platforms and need cores, AMD is the undisputed best choice, and if your workload scales to 10 cores it could probably benefit from 12 or 16).

Completely agree here - I'm still on a 2600k (at 4.6Ghz, mind you) and it hasn't felt slow for the past 7 years I've had it.
 
Completely agree here - I'm still on a 2600k (at 4.6Ghz, mind you) and it hasn't felt slow for the past 7 years I've had it.

Even my 12-year-old C2Q doesn't feel slow in day-to-day usage. I bought and returned a gaming laptop with a 4C/8T Raven Ridge CPU, which didn't feel any faster at all. It benchmarked faster, but didn't feel faster.

The typical user doing home productivity, internet, and media consumption will find no difference in pretty much any modern CPU, even the few dual cores still available.

You really need to be gaming, or doing some kind of heavy lifting (encoding/rendering), for new CPUs to matter at all. And even then, for gaming you need a powerful GPU to reveal CPU weaknesses, and if you only occasionally encode a video, does it even matter if it takes 10 more minutes? It's not like you watch it encode.
 
People, it is not a 250W processor unless you load it up and run it at that most of the time, which, I would argue, most people do not.

I have a 9900k. When it is running full pelt it is not a cool processor by any stretch. Enable speed shift (HWP), core duty cycling and sleep states, and it becomes positively civilised.

While I was playing assassin’s creed origins it was only consuming 30-40W last night and the cores were running at 5ghz! 2 cores were sleeping at the time also. This is far from the 220w lofty peaks that you will hear people cry foul about. Starcraft 2 was down around 15-20W.

Intel processors actually idle at lower power than AMD's processors for now, so net power usage will be lower.

Not only that, but you have to give it a full all-core load AND overclock it. Here is a screen grab of the 10900K's power usage 5 minutes into a Blender render.

Note that at stock speeds, the 10900K uses slightly less power than a Ryzen 3900X:

[Image: BlenderPower.png]

It only goes nuts on power when you give it a big all-core overclock with an all-core workload.
 
People, it is not a 250W processor unless you load it up and run it at that most of the time, which, I would argue, most people do not.

I have a 9900k. When it is running full pelt it is not a cool processor by any stretch. Enable speed shift (HWP), core duty cycling and sleep states, and it becomes positively civilised.

While I was playing assassin’s creed origins it was only consuming 30-40W last night and the cores were running at 5ghz! 2 cores were sleeping at the time also. This is far from the 220w lofty peaks that you will hear people cry foul about. Starcraft 2 was down around 15-20W.

Intel processors actually idle at lower power than AMD's processors for now, so net power usage will be lower.

When you do a Handbrake video encode, it ramps right up to 180W+. Meanwhile, I throw the same encode on my R5 3600 and it runs with less than half the power. Even sitting here typing this, my 9900K is around 15-25W, not the 2-3W you're claiming in another thread. Gaming is a mixed bag. I don't have AC:Odyssey installed to check your numbers, but in COD:WW2 it would fluctuate between 50-110W depending on the situation, definitely not 30-40W.

My experience is nowhere near as rosy a picture as you're painting. You're complaining about having to tweak the AMD system, but then you're saying that you have to tweak the Intel system to get it to run at lower power (I still haven't found a BIOS setting for core duty cycling on my ASRock Taichi board). I would never make a claim about "net power usage" like that; it's way too dependent on what someone is actually doing with their computer. For example, right now I have my R5 3600 loaded up doing Handbrake encodes, and I'm using significantly less power than I would doing that on my 9900K.
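The "net power usage" question is really a duty-cycle calculation. A sketch with numbers loosely taken from this thread; the hours per day and the AMD idle/gaming wattages are hypothetical placeholders, not measurements:

```python
# Time-weighted average draw for two usage profiles. Encode and idle
# wattages loosely follow figures mentioned in this thread; the hours
# per day and the AMD idle/gaming figures are assumed for illustration.

def avg_power(profile: dict[str, tuple[float, float]]) -> float:
    """profile maps activity -> (hours, watts); returns time-weighted watts."""
    total_hours = sum(h for h, _ in profile.values())
    return sum(h * w for h, w in profile.values()) / total_hours

i9_9900k = {"idle": (20, 20), "gaming": (2, 80), "encode": (2, 180)}
r5_3600  = {"idle": (20, 30), "gaming": (2, 65), "encode": (2, 85)}

print(f"9900K avg: {avg_power(i9_9900k):.1f} W")  # ~38.3 W
print(f"3600 avg:  {avg_power(r5_3600):.1f} W")   # 37.5 W
```

With a mostly-idle day the two chips roughly wash, while an encode-heavy day tips it heavily toward the 3600, which is exactly why blanket "net power" claims don't hold.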
 
Going through some of the published results, the common trend is that the 10900K and 10700K are power-hungry versions of the 3950X and 3900X, which offer far better efficiency and still remain good gaming options, despite none of the tests being done overclocked. I would still say the 3950X is well worth its price point and is still the best all-round option for the high end.

Intel does score a win with the 10600K, which is definitely faster than the 3600. However, the 3600 has been sold to every man and his dog over the past year, and not many people will change platform now given Zen 3 is close.

The big caveat is that all these CPUs need to run much higher clock speeds, run hot, and use a boatload of power compared to AMD. In most benches the gap is 10-15 FPS vs a max-OC 10th gen, while pretty much all these CPUs are in the 160-180 FPS range, which is more than enough. I will take efficiency over a gimmick any time.
 
Going through some of the published results, the common trend is that the 10900K and 10700K are power-hungry versions of the 3950X and 3900X, which offer far better efficiency and still remain good gaming options, despite none of the tests being done overclocked. I would still say the 3950X is well worth its price point and is still the best all-round option for the high end.

Intel does score a win with the 10600K, which is definitely faster than the 3600. However, the 3600 has been sold to every man and his dog over the past year, and not many people will change platform now given Zen 3 is close.

The big caveat is that all these CPUs need to run much higher clock speeds, run hot, and use a boatload of power compared to AMD. In most benches the gap is 10-15 FPS vs a max-OC 10th gen, while pretty much all these CPUs are in the 160-180 FPS range, which is more than enough. I will take efficiency over a gimmick any time.

The 10600k is ~$80 more expensive also.
 
BS.

Intel doesn't win gaming because gaming is single-threaded; gaming is far from single-threaded. Intel wins gaming because it has lower latency.

Intel is great for mixed usage. Because most usages are not embarrassingly parallel.

AMD really only wins the embarrassingly parallel workloads. Encoding/Rendering. Of which, only video encoding is popular.

Gaming and Mixed usage: Intel.

Embarrassingly Parallel: AMD.
Agreed, plus a lot of biased AMD-fanboy tests were always using Premiere and curiously NOT using the iGPU on the 9900K, for example, which makes for a BIG difference in the results.
 
I wouldn't consider going AMD until they catch up on the ghz race but make no mistake about it, this is just a garbage being shoveled by intel just to keep their name relevant.
Well, that "garbage" still performs darn great on 14nm vs the questionable nm ratings of AMD, who are STILL unable to get close to 5GHz... so yeah, there is that. But apparently every AMD fanboy does 100% Blender these days, lol.
 
Agreed, plus a lot of biased AMD-fanboy tests were always using Premiere and curiously NOT using the iGPU on the 9900K, for example, which makes for a BIG difference in the results.

Or NOT running simultaneous renders, which then favors Intel-fanboy tests... No offense, but you can play these games all day, which is why you pick the best CPU for your use case...
 
Not only that, but you have to give it a full all-core load AND overclock it. Here is a screen grab of the 10900K's power usage 5 minutes into a Blender render.

Note that at stock speeds, the 10900K uses slightly less power than a Ryzen 3900X:

[Image: BlenderPower.png]

It only goes nuts on power when you give it a big all-core overclock with an all-core workload.
So, it used about the same power as the 3950X and is about half the speed? Wow, color me impressed. And it was run over 5 minutes to smooth out the power spike from Intel chips:
"For this, we run the test for 5 minutes to ensure that Tau has expired on any CPUs with boosting durations. This brings the power consumption lower as compared to a shorter workload."
So it still doesn't prove anything about Intel not drawing really high wattages.

Hand picked huh?
Same site (gamersnexus) from your "proof"...
The 10900K is using 198W vs 138W for the 3950X. You literally found one result out of all the benchmarks and used it like it was the norm.
https://www.gamersnexus.net/images/media/2020/10900k-review/12_power-cinebench-nt.png

- 199W vs 134W for the 3950X in y-cruncher (Tom's Hardware)
https://cdn.mos.cms.futurecdn.net/Gp2zGqUYDcFcCUfywW5S7n-650-80.png

- 222W in Handbrake when all cores are actually used vs 160W for the 3950X (while getting handily beaten). (Tom's Hardware)
https://cdn.mos.cms.futurecdn.net/hhakAayRyXRuoAVDo9PpAn-650-80.png

And from Anandtech:
"We clocked 220W on our Intel chip for this test however, well beyond the 120W of the AMD processor." in POV-Ray.

You literally found the one power chart that made Intel look not completely disgusting on power consumption (if taken by itself without performance data), congrats. If that chart included efficiency, the 3950x is way ahead still (as it accomplishes much more in the same time frame).

Of course you have to give it a full load; nobody was saying you're going to pull >200W at idle... strawman argument much? It pulls 222W in Handbrake with NO overclock, and Intel has already stated it can pull up to 250W without OC. Not sure what your argument was here.

So, it beats the 9900K by a tiny bit, generates more heat, and I'm still not excited.
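Efficiency is energy per finished job, not instantaneous watts. A rough sketch using the 222W/160W Handbrake draws cited above, and assuming, per the Blender comparison earlier, that the 3950X finishes roughly twice as fast (an approximation from this thread, not a benchmark result):

```python
# Energy per job = power draw x time to finish. The wattages follow the
# figures quoted in this thread; the 2x speed factor is a rough
# approximation taken from the Blender comparison, not a measurement.

def energy_wh(watts: float, hours: float) -> float:
    """Watt-hours consumed running at `watts` for `hours`."""
    return watts * hours

job_hours_3950x = 1.0    # normalize: 3950X takes 1 hour per job
job_hours_10900k = 2.0   # ~half the speed -> roughly twice the time

e_3950x = energy_wh(160, job_hours_3950x)    # 160 Wh
e_10900k = energy_wh(222, job_hours_10900k)  # 444 Wh
print(f"10900K uses {e_10900k / e_3950x:.2f}x the energy per job")  # ~2.78x
```

That is why a single power chart without completion times says little about efficiency.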
 