Intel Haswell i7-4770K IPC and Overclocking Review @ [H]

Like Ivy before it, Haswell represents Intel's primary focus on iGPU improvements.

If you want a great CPU and overclocker, you are still best off going with Sandy Bridge or Sandy Bridge-E.
 
Bottom line: Haswell is a great CPU for mobile applications, but on the desktop you Sandy and Ivy guys have no need to upgrade unless you've got money to burn. I might buy one, but I'm coming from a trusty Q6600.

Agreed - like you, I am also coming from a Q6600. Even for us, the only reason to upgrade to Haswell (as opposed to Ivy) is if there is price parity - and when was the last time Intel did that with a new socket?
 
Thanks Kyle, great read for the weekend.

Been running my 920 since April 2010... I'd love to have a reason to upgrade... but other than doing a bunch of video encoding, I just can't see a reason to right now.

Then again, how many other times have I upgraded more on "want" than on "need"...

Going to have to really think this one over.
 
So I guess someone like me with a Z77 and 2600K, even with a Titan/780 at PCI-E 2.0, is fine, eh?

Really no need to upgrade :(

Really great review though
 
Like Ivy before it, Haswell represents Intel's primary focus on iGPU improvements.

If you want a great CPU and overclocker, you are still best off going with Sandy Bridge or Sandy Bridge-E.

Isn't it sort of a wash? The better IPC of Haswell makes up for the lower clock speed. Getting 5GHz on a SB chip is hardly guaranteed; 4.8GHz is considered lucky from what I've seen.

SB-E tends to clock better but is also more expensive.

I see little reason not to get Haswell IF you need a new machine right now - upgrading is a different story, though.
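
To put rough numbers on the "wash" argument, here's a quick back-of-envelope comparison. For CPU-bound work, performance scales roughly with IPC x clock; the IPC uplift and clock speeds below are illustrative assumptions on my part, not measured results.

/* Back-of-envelope throughput comparison: performance ~ IPC x clock.
   The IPC uplift and overclocks below are illustrative assumptions. */
#include <stdio.h>

int main(void)
{
    double sb_ipc = 1.00;   /* Sandy Bridge as the baseline         */
    double hw_ipc = 1.10;   /* assume ~10% higher IPC for Haswell   */
    double sb_ghz = 4.8;    /* a lucky Sandy Bridge air overclock   */
    double hw_ghz = 4.5;    /* a typical Haswell air overclock      */

    double sb = sb_ipc * sb_ghz;
    double hw = hw_ipc * hw_ghz;

    printf("Sandy Bridge @ %.1fGHz: %.2f (relative)\n", sb_ghz, sb);
    printf("Haswell      @ %.1fGHz: %.2f (relative)\n", hw_ghz, hw);
    printf("Haswell vs. Sandy: %+.1f%%\n", (hw / sb - 1.0) * 100.0);
    return 0;
}

Under those assumptions the two land within a few percent of each other, which is why it looks like a wash.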
 
Nice review. I am sitting here with a 920 @ 3.8GHz on an Asus P6X58 Deluxe board, which has 2x SATA 6Gb/s and 2x USB 3.0, so I don't need the upgrade for those. Performance increases from Sandy -> Ivy -> Haswell are really disappointing, but I am still leaning towards upgrading to Haswell now. The Z87 boards look nice and would have more, and native, SATA 6Gb/s and USB 3.0 ports. Would be nice to see an i7 9xx series @ 4GHz added into the review, as I think the increased IPC will make a difference in games like WoW.
 
I have no reason to upgrade my desktop PC with an i7-2600, an Nvidia 570, and 8GB of RAM. However, if I were going to purchase a laptop, now would be the time to get one with Haswell; they are going to be very nice.
 
Guess I will continue to enjoy my 2500K and Z68.

I have a 7950 coming... it's my first purchase of a computer part since Apr 2011.
 
First off, great review, Kyle. However, I'm kind of wondering if it would be a worthwhile investment for me to make the upgrade... I'm coming from an X58 system, and I'd like confirmation from a "professional hobbymaster" such as yourself on this.

That said, the way I see this is....
I've got an i7-930 running at about 3.8 to 4.05GHz right now (varying with my mood) on a Gigabyte X58A-UD5 with 12GB of 1600MHz RAM (it's OC'd to 1620ish)... my big sticking point about not upgrading to Ivy Bridge was the fact that it didn't support more than 2 SATA3 ports for RAID. I'm spoiled right now with a 3-SSD RAID on SATA2, and I'm feeling like I'd still want a 3-drive SATA3 RAID.
I'm kind of assuming the upgrade for the newer functionality, RAID controller, and all that will be MORE than worth it for me, coming off an i7-930 on 1366... Correct me if I am wrong...

It'd be a hefty investment, as I'd plan on getting 3 new SSDs, another 16GB or more of 1866MHz RAM, the mobo and processor, and going a little wild with a Swiftech H20-220 Elite (or similar type of setup).

My only other option is waiting for Ivy Bridge-E, and I'm not sure I really NEED six cores...
 
Not really anything special over Ivy, but I will be replacing my 920 with one of these for the following reasons:

- PCI-E 3.0
- Dumping NF200
- SATA III
- USB 3.0
- Updated instruction set support (about to become very important once the new consoles are out in the wild)
- Power consumption
- Better motherboard layout than my UD5 for SLI
- I'm bored
 
I'm getting an i7-4770K, upgrading from a Q6600, building a new PC from scratch.
I've been buying parts for a while. I purchased a Zalman CNPX12C cooler a while back for the system, originally for an Ivy Bridge build. Does anyone know if this will fit the new Socket 1150? Hope so; it hasn't come out of the box yet and I've had it over 6 months :( Checked the manufacturer's website and it doesn't list it yet.
Any recommendations on a mobo to go with it? Don't want to spend too much, preferably less than 150 Euro.
Thanks.
 
According to HighTechLegion on YouTube, the layout of the socket mounting holes etc. on 1150 boards is the same as on 1155, so Sandy/Ivy Bridge coolers can be used on Haswell. TFFT, hope he's right.
 
Kyle,

Thanks for the review. Now, let's see the pricing. My Q6600 has been doing well; the i7-3770K is looking like the path for me...

Ken
 
Anyone looking for a reason to move beyond 1366 should take a look at this as an example:

http://software.intel.com/en-us/art...it-intel-advanced-vector-extensions-intel-avx

The tech on show in that demo (you will need access to a Sandy Bridge or newer machine) cannot be implemented on 1366/1156 and offers effects directly comparable to GPU-accelerated physics. I'd be willing to put money down on that kind of physics simulation being used in games going forward, as Intel starts to push harder into PhysX's territory with updated versions of Havok now that the consoles support it (shiny cloth effects for everyone!).
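
For anyone curious what that looks like at the code level, here is a minimal sketch of the kind of 8-wide AVX math such a demo builds on. It's my own illustrative snippet, not Intel's demo code, and it assumes a compiler with AVX enabled (e.g. built with -mavx); on a 1366/1156 chip these instructions simply aren't there to execute.

/* Minimal sketch of 8-wide AVX particle integration (illustrative only,
   not Intel's demo code). Build with -mavx; requires Sandy Bridge or newer. */
#include <immintrin.h>
#include <stdio.h>

#define N 16   /* particle count, kept a multiple of 8 for simplicity */

int main(void)
{
    float pos[N], vel[N];
    for (int i = 0; i < N; i++) { pos[i] = 0.0f; vel[i] = (float)i; }

    const __m256 dt      = _mm256_set1_ps(1.0f / 60.0f);  /* timestep     */
    const __m256 gravity = _mm256_set1_ps(-9.81f);        /* acceleration */

    /* Integrate 8 particles per iteration: v += g*dt; p += v*dt */
    for (int i = 0; i < N; i += 8) {
        __m256 v = _mm256_loadu_ps(&vel[i]);
        __m256 p = _mm256_loadu_ps(&pos[i]);
        v = _mm256_add_ps(v, _mm256_mul_ps(gravity, dt));
        p = _mm256_add_ps(p, _mm256_mul_ps(v, dt));
        _mm256_storeu_ps(&vel[i], v);
        _mm256_storeu_ps(&pos[i], p);
    }

    printf("pos[0]=%f pos[15]=%f\n", pos[0], pos[15]);
    return 0;
}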
 
Same thing I said for Ivy bears repeating for this generation of chips: Haswell and Ivy are targeted at notebooks and AiOs rather than regular desktops. Intel has not been focused on making a better CPU for a while.

Haswell, like Ivy before it, is mainly about trying to lower power consumption a little bit while desperately trying to pull off a massive upgrade for the GPU. The end game is to break the low-end GPU business largely owned by Nvidia and AMD.

Yep. Now that I've read a lot more reviews, it certainly seems like Intel pulled a big fat AMD here: lackluster performance gains with an emphasis on graphics.

If there's a time for AMD to catch up, this would be it.
 
Anyone looking for a reason to move beyond 1366 should take a look at this as an example:

http://software.intel.com/en-us/art...it-intel-advanced-vector-extensions-intel-avx

The tech on show in that demo (you will need access to a Sandy Bridge or newer machine) cannot be implemented on 1366/1156 and offers effects directly comparable to GPU-accelerated physics. I'd be willing to put money down on that kind of physics simulation being used in games going forward, as Intel starts to push harder into PhysX's territory with updated versions of Havok now that the consoles support it (shiny cloth effects for everyone!).

I see that like I see DX 11.1. By the time games tiptoe into that edge of the pool, there will be CPUs that do it and do it better, if it ever even takes off. It's like buying DX11 video cards as soon as they come out because all others will be obsolete: by the time anything uses it, those video cards will be 4-5 revisions/generations old and you'll have to upgrade again anyway.
 
I see that like I see DX 11.1. By the time games tiptoe into that edge of the pool, there will be CPUs that do it and do it better, if it ever even takes off. It's like buying DX11 video cards as soon as they come out because all others will be obsolete: by the time anything uses it, those video cards will be 4-5 revisions/generations old and you'll have to upgrade again anyway.

True, but in this case that tech has been around since the Sandy Bridge launch and has been supported by every AMD CPU from Bulldozer forward. We are at the tipping point for actual usage of the tech, with Jaguar going into the new console platforms.
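
As an aside on how a game can ship an AVX build alongside a legacy one: with GCC or Clang, a launcher can pick the right executable at runtime with a check along these lines. This is my own sketch, not how any particular game actually does it.

/* Sketch of runtime AVX detection (GCC/Clang builtins); the sort of check
   a launcher can use to choose between an AVX build and a legacy build. */
#include <stdio.h>

int main(void)
{
    __builtin_cpu_init();
    if (__builtin_cpu_supports("avx"))
        printf("AVX available: launch the AVX-enabled executable\n");
    else
        printf("No AVX (e.g. 1366/1156 CPUs): launch the legacy executable\n");
    return 0;
}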
 
CORRECTION:
In the Bottom Line on the last page it says:
4.6GHz is sounding kind of dicey without less than $100 invested in a cooling system.

It should read either:
4.6GHz is sounding kind of dicey with less than $100 invested in a cooling system.

Or

4.6GHz is sounding kind of dicey without more than $100 invested in a cooling system.

:)
But great review. I was considering whether I should upgrade my whole system from a 2500K, or just invest in a GPU upgrade. Guess the question was answered.
 
I've been saying it for so long now: Intel needs to start selling six-core CPUs at a low price.

This refresh of the same quad-core CPU year after year is becoming boring as hell.
 
You know, I'm still on an i7 920 that is OC'd to 4GHz and I see so very little reason to upgrade.

That's a good thing I think? lol
 
Updated instruction set support (about to become very important once the new consoles are out in the wild)

The tech on show in that demo (you will need access to a Sandy Bridge or newer machine) cannot be implemented on 1366/1156 and offers effects directly comparable to GPU-accelerated physics.

See, to me these are reasons not to upgrade right now. It usually takes developers 1-2 years to go into full swing with features (we see this with DX all the time).

I figure by the time any of the above matters we will be discussing the "tick" of Broadwell. A shrink to 14nm and some refinement due out next year sounds like good timing for the fall of 2014 gaming season. You won't really see much new tech in fall 2013 as those games have been in dev for a while.
 
Any word on heatsink compatibility? Will the Hyper 212 Evo work with Haswell without issue?
 
Has me wondering if we're truly reaching the end of practical processor design and materials (silicon) gains. There are too many similarities between Sandy Bridge/Ivy Bridge and Haswell, much as could be said for K10 vs. Bulldozer.

Not disappointing, but why put out a "new gen processor" when it's not that much faster than the last three gens?

I still think the lack of large gains on recent Intel CPUs is "marketing-related." Better for Intel to just dribble out the upgrades at this point. Until AMD puts out a "killer" CPU, Intel can save some cash on R&D and just make incremental upgrades. The "un-knowledgeable" 95% of the computer-buying public will see the new shiny and buy it. And the enthusiasts will get the upgrade itch and upgrade.

If you have the itch and are on a 1366/920 platform, Haswell might be a good option, but if you're past 1366, upgrade your GPU. And with 4K monitors on the horizon we are going to NEED GPU upgrades... bad.
 
Gotta love those 920s. LOL, I never did find out how far mine would go.

Great review, as always my go-to one-stop site. Thanks.

And my upgrade path still vaguely exists because it includes DDR4 memory whenever that happens?

A nice fast GPU is all I want right now; I haven't had the time to slow down and enjoy my passions for some time now, except my daily visits to HardOCP :D, the one-stop site that helps you choose.
 
The $199 I paid for my 2600k at Microcenter is looking more and more like an uber-steal as time passes. Thanks for the review, Kyle!
 
Great review as always :)

You know, I'm still on an i7 920 that is OC'd to 4GHz and I see so very little reason to upgrade.

That's a good thing I think? lol

Same here, 930 @ 4.0GHz. The performance leap is just no longer the same as with all my previous upgrades, where I could see a huge difference without needing to compare numerical benchmark values.

I hope the next "tick," bringing Haswell down to 14nm, will offer a further drop in heat dissipation; maybe that would be reason enough to upgrade next year.
 
See, to me these are reasons not to upgrade right now. It usually takes developers 1-2 years to go into full swing with features (we see this with DX all the time).

I figure by the time any of the above matters we will be discussing the "tick" of Broadwell. A shrink to 14nm and some refinement due out next year sounds like good timing for the fall of 2014 gaming season. You won't really see much new tech in fall 2013 as those games have been in dev for a while.

Allow me to provide a very real, very-much-right-now example. GRID 2 came out this week and includes an AVX-supporting executable. The game had a lot of Intel co-marketing and tech work (hence some of the Iris-only GPU effects using PixelSync). However, from what I've seen the enhancements aren't simply limited to the visual side of things.

Below are the XMLs from the benchmark included with the game. The first is from my 920 @ 4GHz with SLI GTX 680s and the second is from my 3570K @ 4.4GHz with a single GTX 680.

i7 920 - https://dl.dropboxusercontent.com/u/44321990/GRID2_Benchmark_14-05-08_on_02-06-2013.xml

i5 3570K - https://dl.dropboxusercontent.com/u/44321990/GRID2_Benchmark_11-11-25_on_02-06-2013.xml

Long story short, the average for the 920/GTX 680 SLI rig was just over 72fps using the non-AVX-enabled build of the game, while the 3570K/single GTX 680 rig scored an average of just shy of 78fps.

That's double the GPU horsepower (and yes, both cards were fully loaded, before you ask :D ) but a lower framerate. In a world where enthusiast PC gamers are constantly bitching about unoptimized games using ancient tech, we should be embracing and pushing for greater adoption of stuff like this; it's ready now and is clearly (relatively speaking) easy to implement.

EDIT: I've since decided that the results on this are probably a bit funky and something went crazy. Re-ran the SLI rig and it coughed up another 50 frames... weird. Still a fairly big gain between the two platforms from a CPU standpoint; I believe Codemasters is offloading tasks they would normally assign to the GPU to the CPU instead.
 
So I'd be lucky if I got a chip that could hit 4.5GHz on air.

That's only around 30% faster than my 4-year-old i7-860.
Disappointing how little speeds have increased over the past 4 years.

Still don't see a reason to upgrade.
 
Awesome review, thank you for taking the time to do it.

Makes me a bit irked that HWL isn't sporting solder under the IHS and that getting a good OC'er is now harder than capturing a golden goose. It is nice to see the seriously low power states it is capable of... despite the fact that one must heavily invest in just the cooling system if even menial OC'ing is desired. You'd think the logic of smaller manufacturing processes and lower power use would allow for smaller cooling devices, but the opposite is ringing true. I think much of that is attributable to the dumb-ass choice to plaster TIM under the IHS instead of a proper material.

I agree that the biggest allure of HWL isn't HWL itself, but the Z87 motherboards. Been a long time coming that Intel finally stopped its long-running solo game of fuck-around and provided more than a skimpy two native SATA3 ports.

OTOH, it will be very interesting to see what kind of added battery life the mobile HWL versions exhibit because of the insanely low power states, provided the laptop manufacturers don't use smaller-capacity batteries as a cost-cutting measure, knowing that they could get away with doing so.
 
Intel's iGPUs put out massive heat with each upgrade, lulz! Screws the overclocking enthusiast population, and ruins any chance of cheap six-core, no-iGPU options.

As an AMD fan I give it....
"heath ledger claps"
 
Sorry for the user whose X58 board died... I have a lifetime-warranty board from EVGA...

Remember that X58 enjoys things the others don't, like full x16 lanes in SLI and CrossFire, and triple-channel memory...
 
Sorry for the user whose X58 board died... I have a lifetime-warranty board from EVGA...

Remember that X58 enjoys things the others don't, like full x16 lanes in SLI and CrossFire, and triple-channel memory...

Yes, x16 on PCI-E 2.0, which is the equivalent of x8 on PCI-E 3.0. And everything since has been dual-channel, except 2011, which is quad-channel.
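
For reference, the x16 Gen2 vs. x8 Gen3 equivalence works out like this (standard per-lane rates; the small remaining gap comes from 8b/10b vs. 128b/130b encoding):

/* Per-lane PCIe bandwidth: 2.0 runs 5 GT/s with 8b/10b encoding,
   3.0 runs 8 GT/s with 128b/130b encoding. */
#include <stdio.h>

int main(void)
{
    double gen2_lane = 5.0 * (8.0 / 10.0) / 8.0;     /* ~0.50 GB/s per lane */
    double gen3_lane = 8.0 * (128.0 / 130.0) / 8.0;  /* ~0.98 GB/s per lane */

    printf("x16 PCIe 2.0: %.2f GB/s\n", 16 * gen2_lane);  /* ~8.0 GB/s */
    printf("x8  PCIe 3.0: %.2f GB/s\n",  8 * gen3_lane);  /* ~7.9 GB/s */
    return 0;
}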
 