Intel Core i9-12900K, i7-12700K, i5-12600K Specs, Pricing & New TDP!

Picking a CPU without an iGPU in 2021 seems dangerous, unless you have a hot spare dGPU.

I have plenty of video cards. To me, an iGPU is an added cost I can do without.
It's like the '90s when onboard sound saved your butt the night before the big LAN party all over again.
Yeah, I remember those days. I was the one with a giant tote full of spare parts: CPUs, GPUs, mobos, PSUs, RAM, video & sound cards. Most of the time we had a CompUSA/Micro Center nearby for those parts we didn't have on hand.
Yep, I was THAT guy everybody turned to when they needed some part :ROFLMAO:
 
Picking a CPU without an iGPU in 2021 seems dangerous, unless you have a hot spare dGPU.

Unless you are using your GPU to mine, it's not very common for one to experience an early death. Good case airflow and a good PSU help, too...

With that said, I've always been able to make use of integrated GPUs. I run several monitors, but only use my main monitor for gaming. Given the choice, I prefer to only have my main monitor connected to the discrete GPU and have the secondary monitors running from the integrated GPU.
 
Unless you are using your GPU to mine, it's not very common for one to experience an early death. Good case airflow and a good PSU help, too...

Another case of anti-crypto propaganda masquerading as "truth."
 
That's a pretty pro-crypto statement if you ask me.

Sure, I'll admit, I'm pro-crypto, but the fact remains that a card that is undervolted, running at a constant temperature and power, is likely far better off than one that is subjected to the extreme temperature, voltage, and load changes associated with gaming. Yet, nobody thinks twice about using a video card for gaming.
 
Sure, I'll admit, I'm pro-crypto, but the fact remains that a card that is undervolted, running at a constant temperature and power, is likely far better off than one that is subjected to the extreme temperature, voltage, and load changes associated with gaming. Yet, nobody thinks twice about using a video card for gaming.
Well, they are primarily sold for gaming; you would kinda expect that's what they'd be used for. And just like CPUs (even K's), most people don't OC.
 
Well, they are primarily sold for gaming; you would kinda expect that's what they'd be used for. And just like CPUs (even K's), most people don't OC.

At least it's not obvious: a well-run crypto operation, as described, is better than the Amazon game that burned 3090s with a giant power spike in the game menu, for example. But I imagine crypto pushed hard by someone who steals their electricity, versus a well-made game/driver, would be worse, and so on.
 
At least it's not obvious: a well-run crypto operation, as described, is better than the Amazon game that burned 3090s with a giant power spike in the game menu, for example. But I imagine crypto pushed hard by someone who steals their electricity, versus a well-made game/driver, would be worse, and so on.
Read your post again, it does not make sense.
 
Sure, I'll admit, I'm pro-crypto, but the fact remains that a card that is undervolted, running at a constant temperature and power, is likely far better off than one that is subjected to the extreme temperature, voltage, and load changes associated with gaming.

My bad, I missed a word, I thought he said gaming was harder on cards, not crypto. Disregard.
 
Another case of anti-crypto propaganda masquerading as "truth."

Did my comment really have anything to do with Crypto? Or does it have more to do with the fact that most videocards weren't intended to be used at 100% load 24/7?

Crypto doesn't kill cards any worse than gaming.

Yeah... no. Even if you somehow played games 24/7, that still wouldn't mean your GPU was at 100% load the whole time you were playing. Depending on the game and what you're doing in it, there are many cases where you're CPU-limited and the GPU is just loafing along at far less than 100% load.
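One way to see this for yourself rather than take it on faith is to log GPU load and power while you play. A minimal sketch, assuming an Nvidia card with nvidia-smi on the PATH:

```python
# Log GPU load and power draw once a second for a minute while a game runs.
# Assumes an Nvidia card and that nvidia-smi is available on the PATH.
import subprocess
import time

for _ in range(60):
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,power.draw",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())  # e.g. "43 %, 180.55 W" in a CPU-limited scene
    time.sleep(1)
```

In a CPU-limited scene you'll see utilization dip well under 100%, then spike back up when the GPU becomes the bottleneck again.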

Sure, I'll admit, I'm pro-crypto, but the fact remains that a card that is undervolted, running at a constant temperature and power, is likely far better off than one that is subjected to the extreme temperature, voltage, and load changes associated with gaming. Yet, nobody thinks twice about using a video card for gaming.

The fact that you have to move your own goalposts by mentioning things like undervolting sort of proves my point. If mining weren't dangerous to your card, then you wouldn't have to take those kinds of measures to prevent damage. The average miner is not going to bother with that; they are just going to fire up the program and let it roast. If you really think gaming is more dangerous to your card, you could undervolt while gaming too :rolleyes:
 
The fact that you have to move your own goalposts by mentioning things like undervolting sort of proves my point. If mining weren't dangerous to your card, then you wouldn't have to take those kinds of measures to prevent damage. The average miner is not going to bother with that; they are just going to fire up the program and let it roast. If you really think gaming is more dangerous to your card, you could undervolt while gaming too :rolleyes:
Sorry, but I have to disagree. Any miner who does more than start their miner once a month when they remember it knows to limit voltage, both for heat reduction and (this is the most important part) for reduced electricity costs. It's not dangerous; it's simply smart money, because you are throwing money out the window by letting the card drink more juice than necessary. On my 3090 I actually do undervolt most of the time, because in many games the performance is still there while temps and electrical use stay down. Only in games like Cyberpunk do I open it up fully.
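To put rough numbers on the "smart money" point, here's a back-of-the-envelope sketch; the wattages and electricity price below are illustrative assumptions, not measurements:

```python
# Why undervolting is "smart money" for a card that mines 24/7.
# All numbers here are illustrative assumptions, not measurements.
stock_watts = 350        # assumed 3090 power draw at stock settings
undervolt_watts = 280    # assumed power draw after an undervolt
kwh_price = 0.13         # assumed electricity price in $/kWh

hours_per_month = 24 * 30
saved_kwh = (stock_watts - undervolt_watts) / 1000 * hours_per_month
print(f"~{saved_kwh:.0f} kWh saved per month, ~${saved_kwh * kwh_price:.2f} off the bill")
# ~50 kWh saved per month, ~$6.55 off the bill -- per card, every month
```

Small per card, but it compounds across a rig and across a year, which is why miners bother tuning at all.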
 
Did my comment really have anything to do with Crypto? Or does it have more to do with the fact that most videocards weren't intended to be used at 100% load 24/7?



Yeah... no. Even if you somehow played games 24/7, that still wouldn't mean your GPU was at 100% load the whole time you were playing. Depending on the game and what you're doing in it, there are many cases where you're CPU-limited and the GPU is just loafing along at far less than 100% load.



The fact that you have to move your own goalposts by mentioning things like undervolting sort of proves my point. If mining weren't dangerous to your card, then you wouldn't have to take those kinds of measures to prevent damage. The average miner is not going to bother with that; they are just going to fire up the program and let it roast. If you really think gaming is more dangerous to your card, you could undervolt while gaming too :rolleyes:

Basically, you have just demonstrated that you don't have a working knowledge of GPU mining at all (whatever, it's not for everyone). And who is to say that a video card can't be used 24/7? The 3090 is essentially a prosumer card that can be used to render/encode for hours on end. It's been well established in every review I read that a 3090 is a poor gaming value compared to other options.

The fact that your GPU is at high load and then "loafing along at far less than 100% load" is exactly the problem. You have far more hot/cold cycles with varied temps, voltage, and loads. Just because something is running 24/7 doesn't mean that it is in danger of failing, especially when run far below the maximum levels the card could handle. If anything, the fans would be more susceptible to wear, but they are often the easiest parts to replace.

I'm not moving the goalposts at all. You implied that mining is bad for a GPU, and I tried to explain that I don't agree with that blanket assessment. I don't know exactly who you're referring to as the "average miner." Every miner I have talked to has spent time tweaking their card for the highest performance at the least amount of power draw. A casual perusal of YT videos shows various configurations for maximum efficiency, if you bothered to look (but like I said, GPU mining isn't for everyone). Running a card without tweaking settings is inefficient, and inefficiency is wasted potential profit. For example, a regular 3070 gets its highest hash rates while drawing less than 120W, which is far lower (half, maybe?) than what the design of the card allows for.
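To make that last point concrete, a quick efficiency sketch. The hash rates and the ~220W board power are ballpark assumptions for a 3070, not measurements:

```python
# Hash-per-watt for a tuned vs. untuned 3070 (ballpark assumed numbers).
stock = {"mhs": 60, "watts": 220}   # roughly the card's rated board power
tuned = {"mhs": 61, "watts": 120}   # memory OC plus core undervolt/power limit

for name, cfg in (("stock", stock), ("tuned", tuned)):
    print(f"{name}: {cfg['mhs'] / cfg['watts']:.3f} MH/s per watt")
# stock: 0.273 MH/s per watt
# tuned: 0.508 MH/s per watt -> nearly twice the work per joule
```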
 
Basically, you have just demonstrated that you don't have a working knowledge of GPU mining at all (whatever, it's not for everyone). And who is to say that a video card can't be used 24/7? The 3090 is essentially a prosumer card that can be used to render/encode for hours on end. It's been well established in every review I read that a 3090 is a poor gaming value compared to other options.

The fact that your GPU is at high load and then "loafing along at far less than 100% load" is exactly the problem. You have far more hot/cold cycles with varied temps, voltage, and loads. Just because something is running 24/7 doesn't mean that it is in danger of failing, especially when run far below the maximum levels the card could handle. If anything, the fans would be more susceptible to wear, but they are often the easiest parts to replace.

I'm not moving the goalposts at all. You implied that mining is bad for a GPU, and I tried to explain that I don't agree with that blanket assessment. I don't know exactly who you're referring to as the "average miner." Every miner I have talked to has spent time tweaking their card for the highest performance at the least amount of power draw. A casual perusal of YT videos shows various configurations for maximum efficiency, if you bothered to look (but like I said, GPU mining isn't for everyone). Running a card without tweaking settings is inefficient, and inefficiency is wasted potential profit. For example, a regular 3070 gets its highest hash rates while drawing less than 120W, which is far lower (half, maybe?) than what the design of the card allows for.
I think a lot of the stigma about longevity with mining cards probably comes from a few years ago, before undervolting was fairly mainstream and before something like AMD's Wattman feature. I honestly hear about undervolting now from pretty average gamer dudes.
 
All I'm looking for is a processor that'll hold a constant 5.0 to 5.1 GHz boost during gaming sessions, since I play a lot of games which are script-intensive. May jump on the DDR5, Asus Z690 bandwagon, but $600 for the board alone makes me think twice.
 
All I'm looking for is a processor that'll hold a constant 5.0 to 5.1 GHz boost during gaming sessions, since I play a lot of games which are script-intensive. May jump on the DDR5, Asus Z690 bandwagon, but $600 for the board alone makes me think twice.
My 8700K did 5.1 GHz all-core all day, every day. If that's all you need, pick one up and call it a day. Of course, once I upgraded my server to a 5600X, it stomped my 8700K in both single- and multi-threaded loads, even though it only boosts to 4.65 GHz all-core (a single thread can do 4.8). My work is extremely single-thread limited, but I still ended up with a 5950X for my main rig. Let's wait a few more days for reviews; you may find AMD's current offerings are exactly what you need if you forget the magic 5 GHz and focus on the performance. With adequate cooling you can still hit 5 GHz on the higher CPUs; I do on my 5950X.

Edit: Meant to type adequate, not always. Didn't notice the spellcheck.
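For anyone who wants to verify an all-core clock claim under sustained load rather than eyeball a monitoring tool, here's a minimal Linux-only sketch (assuming the cpufreq sysfs interface; HWiNFO or similar shows the same thing on Windows):

```python
# Print each core's current clock from the Linux cpufreq sysfs interface.
import glob
import re

paths = glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq")
# Sort numerically by core index (plain string order puts cpu10 before cpu2).
paths.sort(key=lambda p: int(re.search(r"cpu(\d+)", p).group(1)))
for path in paths:
    core = re.search(r"cpu\d+", path).group(0)
    with open(path) as f:
        khz = int(f.read())          # cpufreq reports frequency in kHz
    print(f"{core}: {khz / 1_000_000:.2f} GHz")
```

Run it while your all-core load is going and you'll see whether the chip actually holds the advertised boost or sags after a few seconds.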
 
All I'm looking for is a processor that'll hold a constant 5.0 to 5.1 GHz boost during gaming sessions, since I play a lot of games which are script-intensive. May jump on the DDR5, Asus Z690 bandwagon, but $600 for the board alone makes me think twice.
Asus seems to be gouging a bit. MSI/Gigabyte/ASRock seem to be charging typical release prices for the previous few gens.
 
Most people push the memory hard in mining use, and the temps on Nvidia cards' memory already tend to be pretty hot. I often feel that's the biggest reason to avoid used mining cards: the memory has been pushed hard and is far more likely to fail. I really doubt the GPU itself is affected much.
 
My 8700K did 5.1 GHz all-core all day, every day. If that's all you need, pick one up and call it a day. Of course, once I upgraded my server to a 5600X, it stomped my 8700K in both single- and multi-threaded loads, even though it only boosts to 4.65 GHz all-core (a single thread can do 4.8). My work is extremely single-thread limited, but I still ended up with a 5950X for my main rig. Let's wait a few more days for reviews; you may find AMD's current offerings are exactly what you need if you forget the magic 5 GHz and focus on the performance. With adequate cooling you can still hit 5 GHz on the higher CPUs; I do on my 5950X.

Edit: Meant to type adequate, not always. Didn't notice the spellcheck.
My 9900K has no issue with 5.0 GHz all core, either. Ran a little warm with a 280mm radiator, but it's been good ever since upgrading to a 360mm rad.
 
The whole industry is heavily leaning this way now, with on-chip/on-die specialty processors/clusters.

We got used to it with video decoders/encoders; now you'll see it for everything else.

This is basically just (the start of) a PC background-task co-processor/cluster.
All things are cyclical. We're back to hardware-based decoders/encoders/etc., until a new set of algorithms comes out, runs in software only at first, and then we wait for it to get implemented in hardware again.
My 9900K has no issue with 5.0 GHz all core, either. Ran a little warm with a 280mm radiator, but it's been good ever since upgrading to a 360mm rad.
4.9 GHz all-core on my 10700K, but that's just straight auto-boosting. Never bothered pushing it further; haven't needed to.
 
All I'm looking for is a processor that'll hold a constant 5.0 to 5.1 GHz boost during gaming sessions, since I play a lot of games which are script-intensive. May jump on the DDR5, Asus Z690 bandwagon, but $600 for the board alone makes me think twice.
I would wait until Z690 board and DDR5 prices go down in the next few months. Just my $0.02. No idea how good Alder Lake will be... but even if you get the cheaper 12600K to save cost, those motherboard and memory prices are YEEEESH bad, if you ask me.

But that's what you get when paying for top-of-the-line technology.
 
Marginally faster than the 5950X with twice the power draw makes this a lukewarm recommendation at best.

Funny to hear Steve state so many reservations along with his analysis. Sounds a bit like a drug commercial at the end.
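Taking the claim at face value, the efficiency math is rough; a tiny sketch treating "marginally faster" as ~5% and "twice the power" as 2x (assumed, workload-dependent figures):

```python
# Back-of-the-envelope perf-per-watt from the claim above.
# ~5% faster and ~2x the power are assumed readings of "marginally
# faster" and "twice the power draw"; real figures vary by workload.
speedup = 1.05       # assumed 12900K performance relative to the 5950X
power_ratio = 2.0    # assumed 12900K power draw relative to the 5950X
print(f"relative perf/watt: {speedup / power_ratio:.2f}x")  # ~0.53x
```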
 
Hmmm, looks to be about 2-5% faster in gaming at 1080p, at least in the Hardware Unboxed review. I actually expected more.

Now in multithreading... wowsa, ya, it whoops AMD, but that is to be expected with 8 more threads/cores. AMD must lower prices now, if you ask me.
 
Hmmm, looks to be about 2-5% faster in gaming at 1080p, at least in the Hardware Unboxed review. I actually expected more.

Now in multithreading... wowsa, ya, it whoops AMD, but that is to be expected with 8 more threads/cores. AMD must lower prices now, if you ask me.
The 12900K has 8 fewer threads compared to the 5950X, and 4 more cores with the same number of threads compared to the 5900X. That's not getting into the P vs. E core discussion.
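For reference, the core/thread bookkeeping behind that correction:

```python
# Core/thread counts behind the comparison above.
# 12900K: 8 P-cores with Hyper-Threading plus 8 E-cores without it.
i9_12900k_threads = 8 * 2 + 8                    # 24 threads (16 cores total)
r9_5950x = {"cores": 16, "threads": 32}
r9_5900x = {"cores": 12, "threads": 24}

print(r9_5950x["threads"] - i9_12900k_threads)   # 8 fewer threads than the 5950X
print(16 - r9_5900x["cores"])                    # 4 more cores than the 5900X
print(i9_12900k_threads == r9_5900x["threads"])  # True: same threads as the 5900X
```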
 