For those who bought Haswell: was it what you expected?

Just upgraded from a Q6600 to a 4770K.

Setup:
Asus Z87-Pro
i7 4770K
2x 2GB DDR3-1600
HD 3870 (I know the video card is old)
Samsung 840 Pro 128GB

Compared to my Q6600 it's night and day, and now I only need to upgrade my RAM and CPU fan.
 
Just upgraded from an i7 920 @ 3.8GHz (mostly); couldn't take the upgrade itch any more. I have been pretty vocal in this thread, so I thought I would offer some feedback.

New Rig

Gigabyte GA-Z87X-OC Mobo
Intel Core i7 4770K @ 4.4GHz
Corsair Hydro Series H110 closed-loop water cooler
CoolerMaster RC-942 case
16GB 1600MHz RAM (already had)
SSD (already had)
GTX 680 (already had).

Games tested
Gaming resolution was 1080p, with all games set to pretty high quality, where the GPU is usually the bottleneck even at a modest 1080p.

BioShock Infinite (no FPS gain)
Metro: Last Light (no FPS gain)
StarCraft II (no FPS gain)
Supreme Commander: Forged Alliance (around a 20-30% sim speed increase, but it STILL maxes one core, no surprise there)
Crysis 3 (no FPS gain)
** Didn't feel the need to test more.

I was expecting the results above, as the only game I play where a core was maxed is SupCom Forged Alliance on large maps with ALX etc.
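
For anyone who wants to double-check that kind of single-core bottleneck, here is a minimal sketch that polls per-core load while the game runs; the third-party psutil package is just my pick for illustration, not something used for the numbers above:

```python
# Poll per-core CPU utilization to spot a single pegged core while a game runs.
# Requires the third-party psutil package (pip install psutil).
import psutil

SAMPLES = 30  # roughly 30 seconds of monitoring

for _ in range(SAMPLES):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # blocks ~1s per sample
    note = "  <-- a core is pegged" if max(per_core) > 95 else ""
    print(" ".join(f"{p:5.1f}%" for p in per_core) + note)
```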

Advice for gamers on Bloomfield/Nehalem at 3.8GHz or better:

How do I put this... um... DON'T DO IT!!! Seriously, NO. Spend the money on hookers!

It's the worst money I have spent in a long time. I keep thinking: why the hell didn't I just stick a Corsair Hydro on the 920 and OC it a little more for fun? That may have satisfied the itch. My 920 is sitting in the corner looking at me, wondering what the hell it did wrong! I can't even look at it, let alone try to explain why it's been replaced.


Bloomfield/Nehalem owners, please feel free to flame me for being an idiot.
 
Thanks much for the feedback. The upgrade itch has been pretty strong with me lately too. I ordered the new Corsair 540 case to hopefully scratch it some. After reading your experience and other similar ones... I think I'll throw any other upgrade money at replacing my 580 and ride my 920 @ 4GHz a little longer.
 
PC gaming is heading into a wall because the tech is advancing too slowly now. The 7970 is a year and a half old and there's still no new generation out, and it's the same with Nvidia; it will have been two and a half years from the previous generation to the 20nm node, and then what?
Three, maybe four years to the next?

I find the upgrade worth it: faster boot, better motherboard features, and a bit better performance in the game I play than the same-OC 2500K I had.

Once you have enough CPU speed, that's really it; and seriously, if you run 2x AA instead of some heavier MSAA mode or such, you gain a ton more frames with current cards.

Games are seldom CPU-speed limited; the GPU, on the other hand...
 
Thanks much for the feedback. The upgrade itch has been pretty strong with me lately too. I ordered the new Corsair 540 case to hopefully scratch it some. After reading your experience and other similar ones... I think I'll throw any other upgrade money at replacing my 580 and ride my 920 @ 4GHz a little longer.

NP, I was trying to be a little humorous in my post, in case I came off as a little crazy.

I have had a little fun overclocking Haswell, but give it a week and the PC will be humdrum again.

Yeah, bumping that 580 you will see a much better return for your money. I came from a 580 to a 680 and it was very measurable and noticeable.
 
Just upgraded from an i7 920 @ 3.8GHz (mostly); couldn't take the upgrade itch any more...

Thanks for this post. Exactly what I am interested in.
 
No.

The i7-4770K was not worth the upgrade coming from an i7-920 at 4.0GHz.

Here is a comparison of my 4770K on the new 3DMark vs. my old i7-920 at 3.8GHz (my lowest-voltage, highest-overclock setting).

The i7-920 had 12GB of 1333MHz RAM and was overclocked to 3.8GHz.
The i7-4770K has 16GB of 1600MHz RAM and is overclocked to 4.2GHz.

The difference in gaming is unnoticeable, and that's my primary PC use. Both benchmarks used my GTX 670 video card at stock settings. Exact same hardware for these tests, save the motherboard, CPU, and RAM.

(Attached: 3DMark comparison screenshot, xue8p.jpg)
 
Yes - I feel the same way. I wasted my money.

My only gain is that the MSI Gaming board I purchased has a Killer NIC embedded, and it does actually work well. In the games I play I've seen a reliable 10ms drop in ping times. Not that a 10ms ping improvement is worth my $350 upgrade expense, but that's the only difference I've seen or felt, besides faster boot times into Windows: my EVGA 3x SLI board took a long time to boot the whole time I had it, while the MSI posts immediately and has a fast-boot option that skips all the health checks in the BIOS.
 
Thanks for that too, Archaea.
I've got the upgrade itch, but my 920 (a C0 no less) is not holding me back too much.

I'd probably be better off going from my MSI 460 Talon to a 760.
The 460 to me is the GPU equivalent of the 920 CPU. It's really been great for a long time, but a 760 will just about double my gaming performance on its own, hopefully.
That's what I'll upgrade now.
Thanks, everyone, for your honest opinions.
 
I've been too lazy to put together my 920 @ 4GHz replacement. Figured I have a four-day weekend (now) ahead. The last few posts are making me sad. :(

Though I guess the reason I haven't put it together is that I know it won't be a crazy upgrade. I even rant to my friends now that basically nobody cares how fast engineers can make chips when freed from restrictions like batteries, BUT I CARE!

Screw it, I will enjoy it. I suppose I will enjoy the modest but very real increase, mostly because I'm not holding my breath to have my socks blown off anytime soon.
 
Bloomfield/Nehalem owners, please feel free to flame me for being an idiot.

I won't flame you, nor do I think you're even remotely an idiot, but I do tend to disagree, albeit with a different argument than the one you make. Is it worth the upgrade strictly from a gaming aspect? Not really. Although I do seem to have higher minimum FPS in WoW (the main game I play), it's not a huge difference and I could have easily done without it. Most of the games you listed are very GPU-bound, especially at higher resolutions. I'd imagine something like Battlefield 3 would show a difference, especially for those who play on those crazy 64+ person servers.

The non-gaming aspects are where the questions start to arise. Did I like that every time I played a game my room would become at least 3-4 degrees hotter than the rest of the house? No, I didn't, and even when idling the 920 put out a ton of heat and really sucked up the power. The full-load power difference between a 3.4GHz i7 920 and a 4.2GHz 4770K measured at the wall is over 60 watts, with idle readings showing similar results. At idle, this thing just sips power like an Englishman sipping tea.

For me it also brought SATA 3, USB 3.0, a UEFI BIOS for quicker boots, a low-power graphics solution if I ever need it, etc. The SATA 3 difference really can't be overstated if you have an SSD, and if you don't have one, buy that before you upgrade your CPU. USB 3.0, on the devices that support it, is a pretty massive improvement as well.

I just felt it was time to move on from the computer I bought in 2008. Even though my overclocks are pathetic compared to others', I don't regret it; I just wish I'd had better luck in the CPU draw.

Edit: I just read what some others were saying, and yes, if you have an older graphics card you'll get more out of upgrading that every time. I've been using a 7970 since January of last year, so I didn't feel the need to upgrade it.
 
For me it also brought SATA 3, USB 3.0, a UEFI BIOS for quicker boots, a low-power graphics solution if I ever need it, etc. The SATA 3 difference really can't be overstated if you have an SSD, and if you don't have one, buy that before you upgrade your CPU. USB 3.0, on the devices that support it, is a pretty massive improvement as well.
Does it include the USB bug?

BTW, I'm CPU limited, not HD-speed limited. Getting a four-core CPU is a priority for everyone who is on Wolfdale and similar 45nm systems. (Server drives with 128MB of cache are not exactly limited in anything. Yes, SSDs, especially high-end business/Intel editions, can be faster, but in most applications it's not that brutal a speed difference.)

SSDs are nice for book reading, however. Put Linux and a proper viewer on one and you are all set. The VelociRaptors or Constellation ES drives can stay silent, and you can enjoy your nightly book/manga reading on a big monitor.
 
My 920 is sitting in the corner looking at me, wondering what the hell it did wrong! I can't even look at it, let alone try to explain why it's been replaced.

This made me LOL pretty hard. My old faithful Core 2 Quad has been staring at me like this for a week. The plan right now is to get a new SSD/HDDs and turn it into some kind of media PC.
 
I don't have any benchmark data, but just from personal experience, going from an FX-8120 build to an i5-4670K has been a pretty smooth transition. Between changing to the i5 and now having PCIe 3.0 slots, I feel there's a noticeable difference while gaming. I really didn't know what to expect, but this jump seems to have smoothed out most of my graphical complaints, so I'd say (and it almost hurts to say it, since I've been pulling for the Red Team for so long) that if you're on the AMD side of the fence, Haswell hasn't been a bust for me.
 
The non-gaming aspects are where the questions start to arise. Did I like that every time I played a game my room would become at least 3-4 degrees hotter than the rest of the house? No, I didn't, and even when idling the 920 put out a ton of heat and really sucked up the power. The full-load power difference between a 3.4GHz i7 920 and a 4.2GHz 4770K measured at the wall is over 60 watts, with idle readings showing similar results. At idle, this thing just sips power like an Englishman sipping tea.

I've just recently read Maximum PC's article on Haswell, and while they have a lot of gripes about it, in the end it is faster than Sandy Bridge (they say roughly 20%) and Ivy Bridge. It's important to note that this wasn't in gaming, though, as you mentioned.

My big question for you is: does Haswell really run that much cooler for you? My 930 puts out a ton of heat at idle, just like yours did. All I've read about Haswell is that it's not a great overclocker due to heat issues... unless you de-lid it, and I'm not sure I'm ready for that. At least not on a brand-new $300 CPU.
 
Yep, it's cooler in general in terms of total heat output. From what I've been reading (I didn't really keep up with Ivy Bridge, since I didn't have a desktop part, just a laptop with no OCing), the whole TIM thing means heat from the cores doesn't transfer efficiently to the heat spreader, so it stays on the die instead. Bloomfield and Lynnfield had the heat spreader soldered on, which provided a much better transfer of heat. That's why people risk de-lidding: they can lower temps 15 degrees just by popping the top off and putting some good TIM in. I haven't researched it much, since my processor is so mediocre it doesn't warrant de-lidding, but I remember reading about using a hammer and vise or a razor, and I'm not confident enough in my handiwork to use either of those around a processor.

So yeah, it's much cooler overall in terms of the heat it puts out. My new system idles at about 80 watts of power usage (I have a lot of stuff in my case); my old 920 idled around 160 watts, with nothing changing except the processor, RAM, and mobo, plus I'm now using an H100i instead of air cooling. Note that I left SpeedStep and even turbo on, and ran my 920 undervolted (1.075V) at 3.4GHz for day-to-day usage, since I didn't feel I needed more.

I haven't recorded Prime95 temps, but I know that during general gaming (which uses GPU power too, so it's not exactly apples to apples) I now draw about 220 watts compared to 300 or so on the undervolted 920.

Haswell doesn't really work like the older architecture from what I've seen; I'm almost always in some sort of low-power state unless more is needed. Even when playing WoW, most of the cores are running slower than 4.2GHz, and usually only one is at the defined voltage. Even at 100% load it's still not terrible; the cores will go to full turbo, but the power draw isn't too crazy. It's only when AVX instructions are running that this thing really heats up and starts sucking down power. Haswell just seems much, much more efficient with its SpeedStep than Nehalem.

TL;DR: Undervolted 920 -> 4770K dropped power usage from 160 watts at idle to 80 according to my meter at the wall. Temps in my room dropped a lot too. Haswell's temperature problem, from what I've read, is caused by a failure to adequately transfer heat from the internal cores to the external heat spreader.
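
As a rough sanity check on what that idle-power drop amounts to, here is the arithmetic as a minimal sketch; the hours-per-day and electricity price are illustrative assumptions, not figures from any post in this thread:

```python
# Back-of-the-envelope energy saving from the reported idle-power drop (160 W -> 80 W).
# The hours-per-day and price-per-kWh values are assumptions, not from the thread.
IDLE_OLD_W = 160      # reported i7 920 system idle draw at the wall
IDLE_NEW_W = 80       # reported 4770K system idle draw at the wall
HOURS_PER_DAY = 8     # assumed idle hours per day
PRICE_PER_KWH = 0.12  # assumed electricity price, USD

kwh_per_year = (IDLE_OLD_W - IDLE_NEW_W) * HOURS_PER_DAY * 365 / 1000
print(f"~{kwh_per_year:.0f} kWh/year saved, roughly ${kwh_per_year * PRICE_PER_KWH:.0f}/year")
# -> ~234 kWh/year saved, roughly $28/year
```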
 
I won't flame you, nor do I think you're even remotely an idiot, but I do tend to disagree, albeit with a different argument than the one you make. Is it worth the upgrade strictly from a gaming aspect? Not really. Although I do seem to have higher minimum FPS in WoW (the main game I play), it's not a huge difference and I could have easily done without it. Most of the games you listed are very GPU-bound, especially at higher resolutions. I'd imagine something like Battlefield 3 would show a difference, especially for those who play on those crazy 64+ person servers.

The non-gaming aspects are where the questions start to arise. Did I like that every time I played a game my room would become at least 3-4 degrees hotter than the rest of the house? No, I didn't, and even when idling the 920 put out a ton of heat and really sucked up the power. The full-load power difference between a 3.4GHz i7 920 and a 4.2GHz 4770K measured at the wall is over 60 watts, with idle readings showing similar results. At idle, this thing just sips power like an Englishman sipping tea.

I appreciate your point of view. My 4770K is not a great OCer either; I settled on 4.4GHz, which is OK thanks to decent cooling but needs a higher vcore than many. Getting to 4.5 is a real mission and not worth the added stress.

WoW is not exactly CPU intensive; because it's so light on the GPU the CPU may max out, but you are probably at such high frame rates that going any higher is moot. That, or another change is affecting the result, e.g. the GPU driver.

People who manually set their Bloomfield mobo voltages could/can easily reduce power and heat; when left on Auto, many boards go nuts, as seen in many of the early reviews. Maybe you left yours on Auto? If we save 30-60 watts on "average" with Haswell, that's not exactly a lot, and I suspect mileage will vary. I understand that running the PC in a small room you may notice a heat difference, but I have always run these in fairly large rooms and notice zero difference.



My big question for you is: does Haswell really run that much cooler for you? My 930 puts out a ton of heat at idle, just like yours did. All I've read about Haswell is that it's not a great overclocker due to heat issues... unless you de-lid it, and I'm not sure I'm ready for that. At least not on a brand-new $300 CPU.

Yep, I am aware of the synthetic and application advantages of Haswell; I'm really just commenting on gaming.

My 4770K idles at around 35C at 4.4GHz with the Corsair H110 closed-loop water cooler, and sits around 80C under Prime95 load. It has a better cooling solution than the 920 had, which sat around 45C idle and 80C load on air. I suspect that if I put the Corsair H110 on the 920 it would drop idle temps a fair amount. The Haswell mobo puts out less heat, but I never found the old platform bad, thanks to manual voltage control and good case cooling. ** Oh, and I turn off all CPU throttling on both.

Does it really, generally "feel" a lot cooler to me? No, not at around 4.4GHz.
 
People who manually set their Bloomfield mobo voltages could/can easily reduce power and heat; when left on Auto, many boards go nuts, as seen in many of the early reviews. If we save 30-60 watts on "average" with Haswell, that's not exactly a lot, and I suspect mileage will vary. I understand that running the PC in a small room you may notice a heat difference, but I have always run these in fairly large rooms.

If you look, I actually undervolted my 920 for day-to-day operation. I was at 1.075V at 3.4GHz; sometimes, when I wanted to see if there was an actual CPU limitation, I'd run it at 1.325V for 4GHz. I just didn't feel that massive increase in voltage was worth it. The power consumption numbers I gave aren't the most accurate in the world due to the equipment I used to get them, but they're good enough for a relative comparison, and I cut my idle power consumption in half.


Oh, and I turn off all CPU throttling on both.

Does it really "feel" a lot cooler to me? No, not at around 4.4GHz.

Just curious why on Earth you'd turn off the CPU throttling. When you're away from your computer, do you really want it running full bore at max voltage? The enhanced SpeedStep was one of the major reasons I bought the 4770K (aside from the increase in performance, obviously), or at least the power reduction was, which SpeedStep is a part of. My room isn't small by any means; it's probably the warmest part of the house, but it's not like an office or anything.
 
I won't flame you, nor do I think you're even remotely an idiot, but I do tend to disagree, albeit with a different argument than the one you make. Is it worth the upgrade strictly from a gaming aspect? Not really. Although I do seem to have higher minimum FPS in WoW (the main game I play), it's not a huge difference and I could have easily done without it. Most of the games you listed are very GPU-bound, especially at higher resolutions. I'd imagine something like Battlefield 3 would show a difference, especially for those who play on those crazy 64+ person servers.

Exactly the way I feel. I didn't get much more game performance out of it, but on the whole the PC is much better. The heat output difference was noticeable: the cores run about 7C cooler at idle than the i7 950 did. At a gaming load they run about the same as the 950, and at max Prime95 they run about 5C hotter. However, this is all relative, as the total heat output of Haswell is still lower than the 950's.

Beyond that, moving to a UEFI BIOS was extremely nice. The POST time is definitely faster and the BIOS is more user friendly. SATA 3 was massive: the speed at which Windows and apps/games load is completely noticeable. Benchmarking my SSDs, I doubled my transfer rates by going to SATA 3. From the moment POST finishes to the Windows logon screen is about 1.7 seconds now.

Side comment: someone mentioned WoW isn't CPU intensive. This is wrong. WoW is generally CPU limited, not GPU limited, due to the way the game's netcode is handled. As the player count in an area increases, so does the workload on the CPU. WoW is one of the rare games that does get a bump from the CPU, especially in minimum FPS.

Overall, the upgrade is very nice. A lot of it has to do with moving to a new platform from the reliable but aging X58.
 
Upgraded to a 4770K from a de-lidded 3770K that could pass an Intel Burn Test standard run at 5GHz.


Without de-lidding, I can only overclock the 4770K to 4.6GHz without it getting too hot.

I'm not sure if I should keep it or not. I'm thinking about getting a 4670K, which may overclock better, and waiting till Intel fixes the 4770K.
 
I think anyone who upgraded from Sandy Bridge/Ivy Bridge to Haswell did not make a smart move.

Don't get me wrong, I love the upgrade itch :) But it just wasn't worth it to upgrade.

Now, if you were able to sell off your old mobo/CPU then yeah, it would be worth it; otherwise your money would be better spent in other areas, like a new SSD or video card.
 
It's worth it to me because I enjoy the experience. Even though it's not going well with the chip that I got, I know that good ones can be found.
 
TL;DR: Undervolted 920 -> 4770K dropped power usage from 160 watts at idle to 80 according to my meter at the wall. Temps in my room dropped a lot too. Haswell's temperature problem, from what I've read, is caused by a failure to adequately transfer heat from the internal cores to the external heat spreader.

I've done quite a bit of research on the TIM thing as well, because heat has become somewhat of a top priority for me in the past couple of years. I'm with you on the handiwork needed to actually de-lid them. I have patience, but I tend to rush things like that. Too excited to see what kind of change it'll make, I guess.

The CPU I've been eyeballing is the 4670K. It's quite a bit cheaper, leaves me room to upgrade later, and doesn't seem to sacrifice too much for my use. I think I could do without Hyper-Threading, and I believe it has a bit less L3 cache: 6MB rather than 8MB. The big thing Maximum PC said a few times is that the processor was designed with the mobile (laptop) platform in mind, which would explain its better efficiency with SpeedStep and whatnot.

I have my machine hooked up through a CyberPower UPS that reports, through the PC, the wattage it's drawing. Granted, I believe the UPS is monitoring everything plugged into it, which includes the speakers and the monitor (plus the PC, of course), and my idle consumption is 228 watts, the lowest it ever gets. I've also got my CPU voltage regulated by the motherboard. A few weeks ago I tried lowering the voltage while keeping my overclock, and nothing lower than what it currently runs (1.376V idle / 1.34V load) was stable under Prime95. So I'm just going to leave it the way it is and hope a new build lowers temps and power consumption.

For me... it still comes down to whether or not I want to drop ~$700 to upgrade. Is it worth it? From a performance standpoint, probably not. So I'm continuing to contemplate whether I should do it. I'd also make the jump from Windows 7 to Windows 8 as well. Frankly, $700 isn't worth the lowered power consumption/heat output alone for me; it'd be a nice-sized side benefit to go along with killing the upgrade itch and better overall performance.
 
For heat and power it would be a huge upgrade indeed. Also SATA 6Gb/s and USB 3.0 as well.

It all depends on your gaming situation, TBH. If you're running at 1080p and 60Hz with your 930 @ 4GHz, then you probably wouldn't see much of an upgrade. You would see a greater increase by getting another GTX 680. Of course, this is my opinion and some may disagree.
 
S'all good. I value most everyone's opinion on here. I've asked many questions and never been steered wrong, or been given bad advice.

I do indeed play at 1920x1080 on a 60Hz monitor. A year ago, when I was running a Sapphire 5850, I ran a second monitor to "monitor", if you will, the CPU/GPU usage in Crysis 2. Obviously the GPU was at 100% for the most part, while the CPU was around... 40-60% tops? It was something like that anyway; I still had a lot of headroom on the CPU. I jumped to a 7970 and that made a huge difference in the smoothness of Crysis 2. The only reason I went to a 680 was AMD driver issues.

Bit of a side question here, but for someone like me (I don't see myself going to a higher resolution any time in the near future), would it make more sense to go SLI or CrossFire, or to keep chasing a better single GPU?
 
I must say I am quite impressed with my 4770K so far; I have her at 4.5GHz at 1.250V. My primary game right now is Natural Selection 2, and my minimum FPS dips during heavy battles have drastically improved. With my 920 @ 4.13GHz I would drop as low as 60fps; running Fraps last night, the frame rate never dropped below 90 and the game was noticeably smoother. Hitching and stuttering when quickly moving into new areas of the maps was also much less apparent.

I really think that for a multi-GPU, 1080p 120Hz setup, the move from Nehalem to Haswell is worth it. I plan on playing Tomb Raider and my heavily modded Skyrim in the next couple of days to further test the new setup.
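
For anyone who wants to compare minimum and average FPS between setups the same way, here is a minimal sketch for crunching a frametime log; it assumes a plain, headerless CSV with one cumulative frame timestamp in milliseconds per line, which is only an approximation of what capture tools like Fraps write, so adjust the parsing to your tool's actual output:

```python
# Compute average FPS, minimum FPS, and 1% lows from a frametime log.
# Assumes "frametimes.csv" is a headerless file with one cumulative frame
# timestamp in milliseconds per line; adapt the parsing to your tool's format.
import csv

with open("frametimes.csv", newline="") as f:
    stamps_ms = [float(row[0]) for row in csv.reader(f) if row]

frame_ms = [b - a for a, b in zip(stamps_ms, stamps_ms[1:])]  # per-frame durations
fps = sorted(1000.0 / ms for ms in frame_ms if ms > 0)        # ascending: slowest first

avg_fps = len(frame_ms) / (sum(frame_ms) / 1000.0)
worst_1pct = fps[: max(1, len(fps) // 100)]

print(f"avg {avg_fps:.1f} fps, min {fps[0]:.1f} fps, "
      f"1% low {sum(worst_1pct) / len(worst_1pct):.1f} fps")
```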
 
Bit of a side question here, but for someone like me (I don't see myself going to a higher resolution any time in the near future), would it make more sense to go SLI or CrossFire, or to keep chasing a better single GPU?

It's been a long time and the technology has probably improved a lot, but after having CrossFired 4870s I swore off ever using two graphics cards again. Between microstutter problems, needing new profiles for new games (or adding them yourself), and the fact that it didn't work for anything in windowed mode, I just gave up on the technology and got a 7970.

AMD drivers have come a surprisingly long way since I got my card last January, so much so that the 3DMark comparison tests I ran improved by over 10%. That being said, a GTX 780 would be my next card, but at 1920x1200 I don't really seem to need it yet.
 
Had I come from such a setup, I think I would have sworn off using two cards again as well. Again, from a power usage/heat standpoint, I think one single "good" GPU is probably the way to go for someone like me. I'd go SLI or CrossFire just to see, but that's about the only reason why, unless of course it made more sense for me to do such a thing.
 
I just finished up my Haswell build and it has more or less met my expectations.

That being said, my expectations were tempered significantly after the official reviews were released, so I am content with my 4.4GHz overclock at 1.2V.

Initially, I was hoping for 5GHz+ overclocks and significantly reduced temps, but that was back in April, when rumors were swirling around about a 7GHz OC on engineering samples.

Is it noticeably faster than my i7 920 (which was also overclocked to 4.4GHz)? No.

However, I now have native USB 3.0, SATA 6Gb/s, and lower power consumption. To me, that was worth the switch.
 
Bit of a side question here, but for someone like me (I don't see myself going to a higher resolution any time in the near future), would it make more sense to go SLI or CrossFire, or to keep chasing a better single GPU?

SLI has a lot fewer issues than CrossFire, a LOT fewer. Now, there are some SLI issues, it isn't perfect, but it isn't as bad as the 6000/7000 series from AMD, or even the 400 series from Nvidia.
 
Moving up from a Core 2 Quad, Haswell has been "better". Ivy Bridge probably would have been in the same ballpark and probably given me everything I wanted. Still, it was time to upgrade my work machine, and it's easy to buy the hot new stuff when you aren't spending your own money :D
 
Are you guys using SpeedStep? I've got my Haswell clocked at 4.5GHz now at 1.2 volts and left SpeedStep enabled. It's much more advanced than my old i7-920's SpeedStep: the frequency range on Haswell is all over the board. It clocks down as low as 800MHz and seems like it can run at any speed it wants, whereas the old i7-920 was simply at one of two speeds with SpeedStep engaged.
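
An easy way to watch that behaviour yourself is to sample the per-core clocks for a few seconds; here is a minimal sketch using the third-party psutil package (my pick for illustration, not something anyone in the thread used):

```python
# Sample per-core CPU frequencies for ten seconds to watch SpeedStep in action.
# Requires the third-party psutil package (pip install psutil); per-core readings
# depend on OS support, so this falls back to a single package-wide reading.
import time
import psutil

for _ in range(10):
    freqs = psutil.cpu_freq(percpu=True) or [psutil.cpu_freq()]
    print(" | ".join(f"core {i}: {f.current:4.0f} MHz" for i, f in enumerate(freqs)))
    time.sleep(1)
```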
 
If you look, I actually undervolted my 920 for day-to-day operation. I was at 1.075V at 3.4GHz; sometimes, when I wanted to see if there was an actual CPU limitation, I'd run it at 1.325V for 4GHz. I just didn't feel that massive increase in voltage was worth it. The power consumption numbers I gave aren't the most accurate in the world due to the equipment I used to get them, but they're good enough for a relative comparison, and I cut my idle power consumption in half.

Just curious why on Earth you'd turn off the CPU throttling. When you're away from your computer, do you really want it running full bore at max voltage? The enhanced SpeedStep was one of the major reasons I bought the 4770K (aside from the increase in performance, obviously), or at least the power reduction was, which SpeedStep is a part of. My room isn't small by any means; it's probably the warmest part of the house, but it's not like an office or anything.

I was talking about your mobo/chipset voltages, not your CPU voltage; socket 1366 boards often heavily overcompensated when left on Auto.

A lot of overclockers turn that SpeedStep rubbish off, and I would advise turning it off for more consistent performance. If I turn my aircon on for two minutes I will use more power than SpeedStep would save me in a year, lol.

Side comment: someone mentioned WoW isn't CPU intensive. This is wrong. WoW is generally CPU limited, not GPU limited, due to the way the game's netcode is handled. As the player count in an area increases, so does the workload on the CPU. WoW is one of the rare games that does get a bump from the CPU, especially in minimum FPS.

Overall, the upgrade is very nice. A lot of it has to do with moving to a new platform from the reliable but aging X58.


Look at this WoW performance review: even at a low 1680x1050 (to ensure the GPU is not the bottleneck), a modest i7 875 at only 3.73GHz is hitting a really good 94 FPS. Hmm, so maybe WoW is not really that CPU intensive :p

http://www.tomshardware.com/reviews/world-of-warcraft-cataclysm-directx-11-performance,2793-9.html
 
Are you guys using SpeedStep? I've got my Haswell clocked at 4.5GHz now at 1.2 volts and left SpeedStep enabled. It's much more advanced than my old i7-920's SpeedStep: the frequency range on Haswell is all over the board. It clocks down as low as 800MHz and seems like it can run at any speed it wants, whereas the old i7-920 was simply at one of two speeds with SpeedStep engaged.

This is one of the first things I noticed. It behaves pretty much identically to my laptop's Ivy Bridge CPU. I'm also not sure whether my 920 could do it or not, but the 4770K speedsteps each core individually, so some cores may run at 800MHz while another is at max speed. Watching ASUS AI Suite is interesting, as it shows that even though a core may be at the max turbo speed, half the time it isn't even drawing the set voltage.

My 920 would rarely speedstep down to somewhere in the low 2GHz range if nothing was going on; the second anything happened it was well over 3GHz. It was very conservative with its speedstepping.

And no, I'd never turn it off; someone would have to show me proof that the modern implementation has some sort of dire effect, which it doesn't. Turning SpeedStep off just makes the chip run hotter all the time and use more energy, putting out more heat which I then have to cool with AC.
 
You can change the minimum and maximum CPU state by going to Control Panel -> Power Options, then under your power plan clicking Change plan settings -> Change advanced power settings to get to the Power Options window. There, scroll down to Processor power management and choose the percentage you want for Minimum processor state. Of course, you want Maximum processor state at 100%.
This is in Win 7, and I believe Vista is similar, but I don't remember.
I have mine set to 60% of my 4.5GHz overclock, so the CPU never goes under 2.7GHz, but it can be set to whatever you want. I see no temperature difference idling at 2.7GHz versus 800MHz.
I haven't done any benchmarks to see if there's any performance hit or boost, but it does even out that up/down speed in CPU-Z.
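
The same setting can also be scripted with Windows' powercfg utility instead of clicking through the Control Panel; here is a minimal sketch driving it from Python, with the 60% figure mirroring the post above (scripting it is just a convenience suggestion, not something described in the thread):

```python
# Set "Minimum processor state" to 60% on the active Windows power plan via powercfg.
# Windows only; run from an elevated prompt. Mirrors the Control Panel steps above.
import subprocess

MIN_STATE_PERCENT = "60"  # e.g. roughly a 2.7GHz floor on a 4.5GHz overclock

for flag in ("/setacvalueindex", "/setdcvalueindex"):  # plugged-in and battery values
    subprocess.run(
        ["powercfg", flag, "SCHEME_CURRENT", "SUB_PROCESSOR",
         "PROCTHROTTLEMIN", MIN_STATE_PERCENT],
        check=True,
    )

# Re-apply the current scheme so the new value takes effect immediately.
subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)
```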
 
Advice for gamers on Bloomfield/Nehalem at 3.8GHz or better: DON'T DO IT!

Thanks for this post. After reading it, I spent the money on a new SSD, a GPU, and 24GB of RAM. I think that was money well spent after looking over this thread. My 4.2GHz OC is still doing fine.
 
Thanks for this post. After reading it, I spent the money on a new SSD, a GPU, and 24GB of RAM. I think that was money well spent after looking over this thread. My 4.2GHz OC is still doing fine.

I haven't really noticed much of a benefit moving from a 4.5GHz 2600K to a 4.3GHz 4770K. I know it's faster, obviously, but I probably should have just put the money towards a second GTX 780. Live and learn!
 
I haven't really noticed much of a benefit moving from a 4.5GHz 2600K to a 4.3GHz 4770K. I know it's faster, obviously, but I probably should have just put the money towards a second GTX 780. Live and learn!

I did both :)
 
Look at this WoW performance review: even at a low 1680x1050 (to ensure the GPU is not the bottleneck), a modest i7 875 at only 3.73GHz is hitting a really good 94 FPS. Hmm, so maybe WoW is not really that CPU intensive :p

http://www.tomshardware.com/reviews/world-of-warcraft-cataclysm-directx-11-performance,2793-9.html

No, Tom's just has no clue what they are doing. They followed a predefined path in the middle of nowhere, rendering next to nothing. If you go into a major city or do a large 25-man raid, the performance difference is instantly noticeable. It's just difficult to benchmark in that environment, so the failures at Tom's ran from point A to B and concluded that this accurately represents WoW performance.

P.S. Not trying to sound like an elitist asshole, but I've been playing since release and am really familiar with the engine by now. Minimum FPS goes up significantly with CPU power, mainly due to all the netcode being processed. You won't notice it in sparsely populated spots, but in areas with a lot of action/combat/people it can drag down CPUs. On my 4770K and GTX 780, WoW can still dip into the 40s. If you look at the graphics in WoW, you'd instantly know it's not because the GPU can't push the polygons. Heck, I think individual characters in games like Crysis 3 have more polygons than a WoW raid instance :)
 