Anandtech and Gamers Nexus 11700k Reviews

In to defend my 5800X purchase

Ryzen 5 2600-->3600X-->5800X. Same everything except CPU.

Now try doing that starting from a Core i5 9600K through to an 11700K.

$50. Whoopteey dooo.
I wonder how many months it would take most people before that $50 "savings" was eaten up by electricity costs and the math started moving the other way.
 
More than you'd think, unless you are running your computer at 100% load 24/7/365 with unlimited tau, etc. For basic surfing and the like, there shouldn't be a noticeable difference between the two. Honestly, I don't see the difference being made up with average workloads over the normal life cycle of the CPU.
 
Double the wattage when the CPU isn't loaded down isn't going to increase electrical costs? It's somehow not going to increase electrical costs by $50 over several years? I highly doubt that, especially with increasing energy prices. It's also interesting that you only mention "basic surfing" and "average workloads". Why would you even bother with a high-end CPU, or upgrade to one of these, if you'll never actually make use of it?

Trying to create a situation in which it looks "favorable" for the CPU isn't doing the argument any good.

Then again, you could be right. Without spending quite a bit more on cooling, the CPU may never be able to boost high enough (or at all) to really use much more electricity. So, you may have a point here.
 
If I just go by the AVX2 power usage numbers given in the review (using the 0x34 microcode results), it would take 178 days running at those levels (24/7) to make up the $50 difference between it and the 5800X at my current power rate ($0.16/kWh after fees and everything).

Unless I messed that math up...correct me if I'm wrong.
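For anyone who wants to check the arithmetic, here it is as a throwaway Python snippet. The ~73 W figure is just the assumed load-power gap that reproduces the 178-day result at $0.16/kWh; plug in the review's actual AVX2 numbers if you have them in front of you.

```python
# Back-of-the-envelope break-even check.
# DELTA_WATTS is an assumption: the load-power gap between the
# 11700K and 5800X that reproduces the 178-day figure. Swap in
# the review's actual AVX2 numbers if you have them.

PRICE_PER_KWH = 0.16   # $/kWh, after fees and everything
PRICE_GAP = 50.0       # $ price difference between the CPUs
DELTA_WATTS = 73.0     # assumed extra draw under AVX2 load

extra_kwh_per_day = DELTA_WATTS * 24 / 1000        # running 24/7
extra_cost_per_day = extra_kwh_per_day * PRICE_PER_KWH
days_to_break_even = PRICE_GAP / extra_cost_per_day

print(f"${extra_cost_per_day:.3f}/day -> {days_to_break_even:.0f} days")
# -> $0.280/day -> 178 days
```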
 
It's the same as fuel mileage or time to destination: if you spend most of your time at 50 mpg or 30 mph, and maybe a minute at 20 mpg or 80 mph, your overall time/mileage will reflect that.
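To put that analogy in numbers, here's the same idea applied to CPU power draw. Every time fraction and wattage below is a made-up illustration, not a measurement:

```python
# Time-weighted average draw: mostly idle, a little full load.
# Every number here is an illustration, not a measurement.

states = [
    # (fraction of time, system watts)
    (0.90, 70),    # idle / light desktop use
    (0.08, 150),   # medium load
    (0.02, 250),   # flat-out AVX-style load
]

avg_watts = sum(frac * watts for frac, watts in states)
print(f"time-weighted average: {avg_watts:.0f} W")   # ~80 W
```

Point being, the weighted average sits close to the idle number unless the heavy-load slice of your day is actually big.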
 
^ I guess I was looking more at the ease of the upgrade path. Along the same lines as upgrading from my 2005 GMC Envoy (16 mpg) to my 2022 Honda Civic ST (35 mpg).
I only have to stop for a fill-up every 10 days instead of every 4. I notice that more than the $2k/year or whatever it may be I'm saving on fuel.

I didn't have to yank a mainboard out of my case. That's worth $50 to me right there. Plus it's faster than Intel, less heat, less power, less noise, and it was available.
 
The coping and rationalization from some people around the web about this review is fascinating: from magical BIOS fixes, to memory speeds, to cooling, to the fact that the motherboard used is under NDA when the CPU isn't, there's nothing that isn't suspect.

Sounds like the same things that were said during the Bulldozer era. :) Now, I liked my Piledriver CPUs, and when the OS was optimized for them they did work a bit better, but there was no miracle cure for them because their problem was in the hardware cache itself, as in it was way too slow.
 
At least Bulldozer was insanely cheap. Lots of budget builds were viable because of $80 FX-6300s.
 
The coping and rationalization from some people around the web about this review is fascinating: from magical BIOS fixes, to memory speeds, to cooling, to the fact that the motherboard used is under NDA when the CPU isn't, there's nothing that isn't suspect.
I don't think there is anything suspect about it. It seems clear that Intel and the mobo brands are refining BIOS updates for Rocket Lake right up until release day. The mobo used is under NDA because it's getting BIOS updates, and if a site reports that something isn't performing or working correctly, even though it's a pre-release product, people could associate that product with problems or sub-par performance before it's even ready to be evaluated.

The 11700K specifically wasn't covered by the NDA because it was not an officially announced product. In terms of the fine print, it didn't exist.
 
Double the wattage when the CPU isn't loaded down isn't going to increase electrical costs? It's somehow not going to increase electrical costs by $50 over several years?
It doesn't really work that way. Idle power draw is almost the same for most CPUs; they are all pretty efficient at idle.
Guru3D has a graph showing a Ryzen 5 3600 total system draw of 63 watts, with the i7-10700K at 74 watts. Even a Threadripper 3990X is only at 112 watts.
https://www.guru3d.com/articles-pages/intel-core-i7-10700k-processor-review,5.html
I don't see any idle numbers for 11th gen yet, but I would think the idle power usage would still be pretty low. I'm not saying Intel has a good CPU on their hands, though...
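Putting a dollar figure on that gap, using the Guru3D idle numbers above and the $0.16/kWh rate mentioned earlier in the thread:

```python
# Annual cost of the 11 W idle gap from the Guru3D figures above
# (63 W vs 74 W total system), at the $0.16/kWh rate quoted earlier.

gap_watts = 74 - 63                          # 11 W idle difference
kwh_per_year = gap_watts * 24 * 365 / 1000   # ~96 kWh/yr
cost_per_year = kwh_per_year * 0.16
print(f"{kwh_per_year:.0f} kWh/yr -> ${cost_per_year:.2f}/yr")
# -> 96 kWh/yr -> $15.42/yr, and that's only if it idles 24/7
```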
 
I have an 11900K pre-order. I had a 10900K that I bought at Micro Center for $400 (wanted the iGPU, wanted top binning - otherwise I would have opted for the 10850K) - figured for $150 more I may as well try the new hotness. So there you go. One sale. :)

Have a kit with Z490 so I'm "stuck" in that regard (bought to get the 3080 that came with it).
 
In to defend my 5800X purchase

Ryzen 5 2600-->3600X-->5800X. Same everything except CPU.

Now try doing that starting from a Core i5 9600K through to an 11700K.

$50. Whoopteey dooo.

$50? Do you mean the price increase? That doesn't mean much when the 3600 wasn't that much slower than the X or XT models, and most reviewers recommended against buying those simply on price. The 3800X was even more pointless unless you absolutely didn't want to overclock your 3700X, which was $71 cheaper at $329. At $450 for a 5800X, that is a $121 price increase for an 8-core chip.
 
It is a $50 increase, objectively and factually. Now, if you want to be subjective...
 
I bought a couple of 3800Xs because they were $10 cheaper than the 3700X at the time. I am just kicking myself for going Intel because it was available at a good price at a time when I could not get anything AMD for anywhere close to MSRP.
 
It doesn't really work that way. Idle power draw is almost the same for most CPUs; they are all pretty efficient at idle.
Guru3D has a graph showing a Ryzen 5 3600 total system draw of 63 watts, with the i7-10700K at 74 watts. Even a Threadripper 3990X is only at 112 watts.
https://www.guru3d.com/articles-pages/intel-core-i7-10700k-processor-review,5.html
I don't see any idle numbers for 11th gen yet, but I would think the idle power usage would still be pretty low. I'm not saying Intel has a good CPU on their hands, though...
And would you buy one of these CPUs just to let it sit idle and never do anything with it? Idle power means nothing when the system is actually in use, and once you start using these Intel CPUs, the power use starts going sky-high. The point of buying a high-end CPU is to use it and get performance out of it.
 
Yeah, I bought both of my 3700Xs for $279 plus tax each and my 3600 for $175 plus tax.
 
Right now: 3960X at 100% on all cores, 3900X at 100% on all cores, 5800X at 100% on all cores, 6700K with most cores doing nothing. Power does matter. Does Intel have anything like AMD's Curve Optimizer?

OK, a CPU that needs 50 W more power will take more than just 50 W at the wall due to power supply efficiency, VRM efficiency, and any resistive heat losses. Roughly 50 W / 0.9 for the VRM = 56 W more than a lower-powered CPU. Then pulling that 56 W through the power supply, 56 W / 0.9 = 62 W from the plug.

Now add in the extra heat that has to be dissipated into the room if A/C is required (if A/C is not being used, then this is 0). A/C takes about 1/3 of a given amount of heat's power to remove it: 62 W more heat -> 62 W x 1/3 = 21 W of A/C energy needed to cool the added heat from the 50 W extra CPU draw.

So that 50 W extra CPU is more like 62 W + 21 W (if A/C is needed), or 83 W of additional draw overall. Now if you are running more than one CPU - several, 10, or 100 - the costs add up very significantly.
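Here's that chain worked as a quick snippet, with the 90% VRM and PSU efficiencies taken as the same assumptions as above:

```python
# CPU power delta -> VRM loss -> PSU loss -> optional A/C overhead.
# The 90% VRM/PSU efficiencies are the same assumptions as above.

cpu_delta = 50.0         # W of extra CPU package power
vrm_eff = 0.90           # assumed VRM efficiency
psu_eff = 0.90           # assumed PSU efficiency
ac_fraction = 1 / 3      # A/C input power per watt of heat removed

at_wall = cpu_delta / vrm_eff / psu_eff   # ~62 W from the plug
ac_power = at_wall * ac_fraction          # ~21 W of A/C work
total = at_wall + ac_power                # ~82 W all-in

print(f"wall: {at_wall:.0f} W, A/C: {ac_power:.0f} W, total: {total:.0f} W")
```

(Carried through unrounded it comes out to about 82 W; the 83 W above comes from rounding each step. Same ballpark either way.)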
 
All the talk about the idle draw gets me a little excited. Intel has been delivering pretty well on idle and sleep states, especially when paired with 12 V systems from the OEMs; the power states give some fun options there, and I am somewhat interested to see what the big 3 can deliver with them in the short term. Long term, these chips are an obvious stop-gap for their consumer 10 nm parts later this year, once they get their initial push of Xeons out the door for the big June-Sept enterprise deliveries. These chips on 14 nm are a little disappointing, but they leave me very hopeful for the 10 nm HEDT/server parts that should start cropping up in the next few months. I will be paying attention because I really do need a new workstation, and AMD hasn't announced their new Threadrippers yet, so that fight is the one I am really looking forward to.
 
And would you buy one of these CPUs just to let it sit idle and never do anything with it? Idle power means nothing when the system is actually in use, and once you start using these Intel CPUs, the power use starts going sky-high. The point of buying a high-end CPU is to use it and get performance out of it.
I get your points and perspective now. I first sort of skimmed over the GamersNexus video while at work with the sound off. Now at home I looked at it again and realize I really missed a lot. I didn't initially catch the part about how the Asus board they used for testing did actually stick to Intel's power limit specs. Then he mentioned how another Gigabyte board they have did not follow spec and was using an extra 70 watts. Yikes. That is an insane difference. So the extra power difference in their charts is with the Asus board, and those numbers are really best case; the Gigabyte board numbers would be much worse. Again, yikes. That is awful, and much worse than I initially thought. So yeah, I think it would be best to avoid this Intel generation. I wonder how many people will get boards like the Gigabyte one, not know it, and then run into power supply issues. Of course you should have some headroom in your PSU, but if the CPU ends up using an extra 70 watts and you don't realize it, you can get into some trouble.
 
Yeah, the new Asus boards (my 12th gen ones at least) ask when you clear CMOS / first boot whether you want to use Intel power limits or unlock the power.
 