Haswell News

I hope they show some more Haswell motherboards at IDF. Has anyone got confirmation on that? I hope I don't have to camp out at Microcenter at release either, lol; rumors say limited quantities at launch, and everyone hates sitting there with a brand new motherboard and no processor...
 
Limited release due to the USB 3.0 bug in Haswell's motherboard chipset.
 
Can anyone here clarify the method of overclocking Haswell? I read somewhere that it's going to be BCLK-based, just like my current i7 920, whereas Ivy Bridge and Sandy Bridge were different. Sorry for sounding like a noob; I'm just gearing up for a Haswell build and I'm a little unsure about which memory I should go with.

More detail: my current i7 920's overclock is indirectly tied to my memory's speed (unless I've been overclocking it completely wrong for the past three years), so what I've done is increase the multiplier to x21 and the BCLK to 200, which puts my memory at 2005 MHz; hence the reason I bought Dominator GTs rated at 2000 MHz, so that everything would be "streamlined". I know I can just change the RAM ratio if I want to go higher on my CPU speed, but I hate doing that because I feel like the computer runs "bottlenecked" that way.
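To put numbers on it, here's a rough sketch of the arithmetic (purely illustrative; the multipliers are just the ones from my build, and real X58 boards express the memory ratio differently):

```python
# Rough sketch of X58-era (i7 920) clock math: the core clock and the
# memory clock are both derived from BCLK, so raising BCLK raises both
# unless you also drop the memory ratio.

def x58_clocks(bclk_mhz, cpu_multiplier, mem_multiplier):
    """Return (core_mhz, ddr_effective_mhz) for a given BCLK and ratios."""
    core_mhz = bclk_mhz * cpu_multiplier
    # "mem_multiplier" here folds DDR's double data rate and the board's
    # memory ratio into one number for simplicity.
    ddr_effective_mhz = bclk_mhz * mem_multiplier
    return core_mhz, ddr_effective_mhz

# The setup described above: BCLK 200, x21 CPU multiplier, x10 memory
core, ram = x58_clocks(200, 21, 10)
print(core, ram)  # 4200 2000
```

That's why BCLK-based overclocking drags the RAM along with it: the only ways to keep the memory in spec are faster-rated RAM or a lower ratio.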

Sooo I've recently helped my cousin put together his PC, an Ivy Bridge i7 3770K on an Asus RIVE, and noticed that when overclocking you just set the memory at whatever its XMP profile says and work on the CPU speed independently. I was thinking, wow, that's great: you can basically get some ultra-low-latency RAM (i.e. 1600 MHz with CL7-ish timings) and just worry about the CPU overclock later.

NOW, I was thinking, man, this is going to be great for Haswell when it comes out. I'll just do the same thing, buy some ultra-low-latency 1600 MHz RAM, and be done with it. BUT I've read rumors that it's going back to being BCLK-based, so now I'm thinking I should get the highest-speed RAM possible so that it won't limit my CPU overclocking headroom.

I know it's a lot but I hope someone understands what I'm talking about here.
 
I'm a little confused :confused:

I was under the impression that LGA2011 is a dead end, but after reading this thread it looks like IB-E and Haswell-E will be 2011?

So, is LGA1150 a dead end?

I guess what I am asking is which socket will have the longest life span? I am getting tired of my SB, and I want to change! hahaha
 
From what I gathered,

LGA2011 was supposedly going to be ready for Ivy Bridge E, which keeps the socket viable. A newer rumor suggests that Intel might skip Ivy Bridge E altogether and go straight to Haswell E. IF, and only IF, that happens, Intel will introduce a completely new socket because DDR4 is involved, making LGA2011 a dead end. IMO, I think Intel will still release Ivy Bridge E because it doesn't look like DDR4 is ready for prime time, but who knows; maybe they'll be able to push it out by the end of this year, because samples of it have been floating around.

LGA1150, the socket for Haswell, is scheduled to be a dead end, though. It's only meant for Haswell as far as I've read.
 
I believe LGA1150 also depends on DDR4. If Broadwell is DDR4, then it will need a new socket. If it's not, then it may use LGA1150 (provided the soldered-to-the-board rumors are not true).
 
I want a very power efficient machine for sound and light video editing. I would prefer one that didn't need a graphics card (part of the reason for going for HD4000), but I'm not sure if it pays to wait a couple months for slightly better integrated graphics.
 
I would like to see this feature make it to the desktop, and have eyefinity/nvsurround capabilities at the same time.

Think about it: completely powering down your graphics cards saves a bunch of power, especially for SLI and CrossFire systems.

Yeah, the feasibility and interoperability required to pull that off in a worthwhile manner probably could never happen, and it would probably end up working only for a one-or-two-monitor setup, which is fine for a lot of people but utterly useless for me.
 
I want a very power efficient machine for sound and light video editing. I would prefer one that didn't need a graphics card (part of the reason for going for HD4000), but I'm not sure if it pays to wait a couple months for slightly better integrated graphics.

Wouldn't you possibly be better served by considering one of AMD's APU line-up? From what everybody seems to be saying, they are beating Intel's lineup in terms of computing power relative to power consumption.
 
This exactly.

Every PC I've built since I purchased my 3930k has been Intel. However, for the last build I did for a friend, we used a 5800K with an SSD.

I was blown away that an integrated GPU was able to run BF3 smoothly enough that he can play without issues. So from now on, any cheaper system I build will use an AMD APU. I suggest looking into them as well.
 
What I don't understand is why Intel even bothers putting an IGP in their mid-to-high-end CPUs when no one seems to use it. Who buys an i7 (whether it's a 2600K, 3770K, 4770K, etc.) and actually uses the integrated graphics? I can see it in mobile chips and i3 processors, but I don't know of anyone with a 2500K or a similar i5 or i7 who even considers integrated as an option. I have tried the IGP on the 2500K and it's quite low-end; it's so cheap to just throw in a GTX 260 or something similar at sub-$50 prices, and it's more than 2x as fast.
 
Hoping we get some solid Haswell info soon. My SATA controller is bugging out and I'm anxious to upgrade. Looks like I'll be holding off awhile longer.
 
Yeah, the feasibility and interoperability required to pull that off in a worthwhile manner probably could never happen, and it would probably end up working only for a one-or-two-monitor setup, which is fine for a lot of people but utterly useless for me.

One can dream, right?
 
So Haswell is out in the wild, but a limited release and overpriced, nice. I just told a friend that it wouldn't be out until summer/fall. He just built a new PC with a 3770K (and I'm about to pick one up too), classic. Well, at least there is no LGA1150 Shuttle yet.
 
What I don't understand is why Intel even bothers putting an IGP in their mid-to-high-end CPUs when no one seems to use it. Who buys an i7 (whether it's a 2600K, 3770K, 4770K, etc.) and actually uses the integrated graphics? I can see it in mobile chips and i3 processors, but I don't know of anyone with a 2500K or a similar i5 or i7 who even considers integrated as an option. I have tried the IGP on the 2500K and it's quite low-end; it's so cheap to just throw in a GTX 260 or something similar at sub-$50 prices, and it's more than 2x as fast.

Powerful workstations?

They simplify machines that need powerful CPUs but not GPUs...
 
Yes, and no.

And at the price that you (whether as a consumer or as a business owner) will quite possibly pay in 2-5 years as a result of this, you might definitely reconsider your "Facebook Like+" stance. But obviously not until you are greeted by Captain Hindsight.
 
From what I gathered,

LGA2011 was supposedly going to be ready for Ivy Bridge E, which keeps the socket viable. A newer rumor suggests that Intel might skip Ivy Bridge E altogether and go straight to Haswell E. IF, and only IF, that happens, Intel will introduce a completely new socket because DDR4 is involved, making LGA2011 a dead end. IMO, I think Intel will still release Ivy Bridge E because it doesn't look like DDR4 is ready for prime time, but who knows; maybe they'll be able to push it out by the end of this year, because samples of it have been floating around.

LGA1150, the socket for Haswell, is scheduled to be a dead end, though. It's only meant for Haswell as far as I've read.

I'm still not sure if Haswell-E will use DDR4. The server variants, Haswell-EP/EN, will use DDR4.

Broadwell will be the first consumer, non-server processor with DDR4, though. So it will most likely use a new socket and will not be a drop-in replacement for Haswell Socket 1150 boards. That should give consumers at least one year, or two years at most, with that platform. As for Haswell-E, I haven't found an article confirming it, or that it will carry over the DDR4 controller from its server counterpart.

It will use a new version of Socket 2011, though, and will not be a drop-in replacement for current SB-E Socket 2011 boards. That was confirmed a while ago already.

At least with AMD, AM3+ chips were on boards with DDR2 and DDR3 memory. That helped a lot with upgradeability and costs. Of course, the performance wasn't there, but the fact that it made upgrades between processors affordable helped.
 
I thought Intel already confirmed that Ivy Bridge-E was going to be released for the X79 platform. I know a lot of people built X79 systems because of this upgrade path. It would be really shoddy if Intel changed this plan.
 
Implying DDR4 will improve much on the desktop side

Why do you think it won't help? The limited improvements in memory speed, compared to steadily increasing CPU core counts and IPC, make the switch to DDR4 very welcome. Plus, the DDR4 standard gets rid of an additional timing bottleneck (tH) used by DDR1-3 by using a point-to-point protocol.

If you mean where it would help, of course it's not going to matter to many users who just browse the web and use other non-demanding applications. However, it will help in certain applications bottlenecked by memory access.
 
I'm kinda clueless in the RAM department, frankly, but there's barely any performance difference between 1333 MHz and 2000 MHz RAM, right? So will we really see an improvement over DDR3 that's worth the upgrade? I mean, DDR4 will cost an arm and a leg.
 
I thought I read somewhere that DDR4 would have frequencies up in the 4000 MHz range. I might be making that up, but surely the differences would start to become apparent at those sorts of speeds?
 
From what I've heard, they're taking the numbers of what's possible based off previous overclocking trends with DDR1, 2, and 3. I think the default DDR4 frequency is nothing less than 2200-2400 MHz, which is absolutely ridiculous. I wouldn't be surprised one bit if in the next 4-5 years you see 4-5 GHz modules.
 
Ridiculously low or high?

I'd consider it a bit on the low side, given that we have DDR3 available now that reaches within striking distance of those speeds.
 
Ridiculously low or high?

I'd consider it a bit on the low side given that we have DDR3 that reaches within striking distance of those speeds available now.

But how many of those modules are actually rated for those speeds and not simply overclocked?

DDR4 is starting at 2133 MHz (remember, DDR3 started at 1066 MHz), but reports say it can reach 4266 MHz. Give it time.
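For a rough sense of what those transfer rates mean, the usual back-of-the-envelope math for a 64-bit (8-byte) memory channel is transfer rate x 8 bytes. A sketch (theoretical peaks only, not measured numbers):

```python
# Theoretical peak bandwidth of one 64-bit DDR channel:
# (mega-transfers per second) x (8 bytes per transfer) / 1000 = GB/s.

def peak_bandwidth_gbs(transfers_mts, bus_bytes=8):
    """Peak GB/s for a single memory channel (theoretical ceiling)."""
    return transfers_mts * bus_bytes / 1000

for name, mts in [("DDR3-1066", 1066), ("DDR4-2133", 2133), ("DDR4-4266", 4266)]:
    print(f"{name}: {peak_bandwidth_gbs(mts):.1f} GB/s per channel")
# DDR3-1066: 8.5 GB/s per channel
# DDR4-2133: 17.1 GB/s per channel
# DDR4-4266: 34.1 GB/s per channel
```

So even the DDR4 baseline doubles the per-channel ceiling of DDR3's baseline; real-world gains depend entirely on whether the workload is bandwidth-bound.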
 
Why do you think it won't help? The limited improvements in memory speed, compared to steadily increasing CPU core counts and IPC, make the switch to DDR4 very welcome. Plus, the DDR4 standard gets rid of an additional timing bottleneck (tH) used by DDR1-3 by using a point-to-point protocol.

If you mean where it would help, of course it's not going to matter to many users who just browse the web and use other non-demanding applications. However, it will help in certain applications bottlenecked by memory access.

In general computing, it's the on-die GPU that makes the most of fast memory, not the processor. In fact, with no on-die GPU, you're almost always better off with more, slower RAM than with less, higher-speed RAM. If Intel decouples its L3 cache (and bypasses adding L4) for their GPUs, then we could potentially see HDXXXX performance improve substantially with the move to DDR4. But considering they've been going in the opposite direction -- Haswell's ULV part with a GT3 GPU and on-package L4 -- he's right to question the benefits.

For servers it makes more sense; higher capacity and faster speeds are never going to be turned down. For desktops and laptops? Well, that depends.
 
I believe this is a good read to get some idea about DDR4:
Bit-Tech article
Original article from PC Watch

The biggest benefits coming to DDR4, in my opinion, are the following:

  1. One DIMM slot per channel (i.e. four slots is quad-channel; eight slots is octa-channel)
  2. Point-to-point protocol
  3. Switched memory banks
  4. Lower voltage than DDR3 -- 1.05 V to 1.2 V
  5. Higher memory bandwidth and transfer rates -- up to 4266 MT/s
As mentioned above, DDR4 will help a lot with integrated graphics. If AMD, for example, switched to DDR4 in the next APU, Kaveri, it'd work out a lot better than using DDR3 memory. Intel, as pelo mentioned, has already gone the opposite direction by integrating eDRAM on certain models of ULV Haswell processors. But it will be costly regardless. When Broadwell is released to the masses and DDR4 has made its way from server to consumer in two years (2015-ish), the eDRAM may possibly be dropped in favor of DDR4. That depends entirely on Intel, though.

Now, the other features, such as the point-to-point protocol and switched memory banks, are interesting. From what I understood when I first read about DDR4 on the PC Watch website, the memory controller has ONE channel connected to a single DIMM slot. In other words, it runs from Point A to Point B, keeping connections simple while maintaining higher transfer rates and memory bandwidth.

As for switched memory banks, we probably won't see these on consumer boards with Broadwell (or possibly Haswell-E, if it does implement DDR4). We'll mostly see them in servers, possibly first in the Haswell-EP/EN server processors. When I read about it, my thinking was that it works like this:

  • A server has 32 memory DIMMs with eight banks of 4 DIMMs each.
  • Memory controller sees eight channels of RAM connected in eight DIMM slots.
  • Memory controller can switch between each DIMM in each bank while maintaining a point-to-point connection with each of them.
In other words, a server could squeeze more RAM into fewer slots while still keeping up a high memory transfer rate and bandwidth. That's how I understood it, unless pelo or pxc has a better explanation.
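Here's a toy sketch of how I picture that switching (purely my mental model of the spec as described above, not how any real memory controller is implemented):

```python
# Toy model of the "switched bank" idea: each channel keeps a
# point-to-point link to exactly one active DIMM at a time, and the
# controller can switch which DIMM in the bank is selected.

class Channel:
    def __init__(self, dimms):
        self.dimms = dimms   # the DIMMs sharing this channel's bank
        self.active = 0      # index of the currently selected DIMM

    def select(self, index):
        """Switch the point-to-point link to another DIMM in the bank."""
        if not 0 <= index < len(self.dimms):
            raise IndexError("no such DIMM in this bank")
        self.active = index
        return self.dimms[index]

# The server example above: 8 channels x 4 DIMMs per bank = 32 DIMMs,
# but the controller still only ever sees 8 point-to-point links.
channels = [Channel([f"ch{c}_dimm{d}" for d in range(4)]) for c in range(8)]
print(sum(len(ch.dimms) for ch in channels))  # 32
print(channels[0].select(3))                  # ch0_dimm3
```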

So, is DDR4 going to be beneficial?

Well, as always, it depends on what you do on your computer and what applications you use. Anything that needs the higher memory transfer rate will benefit greatly from DDR4. The lower voltage of DDR4, combined with a lower-powered processor, should make for a cheaper electric bill.

If you are using your computer primarily for normal productivity suites like Office, surfing the internet, or listening to music, I don't think DDR4 will help you at all. If you are playing a very demanding game, maybe it'll help more than DDR3.

However, if there is an integrated GPU on your processor, using DDR4 should alleviate the bandwidth and memory transfer rate issues from DDR3 especially when you look at AMD's APU lineup.

But, it will be the servers that will always benefit from the faster memory, and its extra features will help a lot there.
 
That article sounds like it was written by a WebProNews editor. Quote some facts and then sum it up with biased opinion stated as fact.
 
Launch date is 27th May, with systems on sale from 2nd June, for the following:

4th Gen Intel® Core™ i7-4770K, i7-4770, i7-4770S, i7-4770T, i7-4765T, i5-4670K, i5-4670, i5-4670S, i5-4670T, i5-4570, i5-4570S, i5-4570T, i5-4430, i5-4430S Processors (Haswell),
Intel® Z87, H87, Q87, Q85, B85 chipsets (Lynx Point)
 
Why do you think it won't help? The limited improvements in memory speed, compared to steadily increasing CPU core counts and IPC, make the switch to DDR4 very welcome. Plus, the DDR4 standard gets rid of an additional timing bottleneck (tH) used by DDR1-3 by using a point-to-point protocol.

Well, I programmed on DDR5. It worked without problems up to 30 cores; 60 cores had some hiccups. Scaling that to a normal overclocked CPU means 6 cores are fine, and 12 cores is the max without quad-channel.

Shouldn't DDR5 be faster than DDR4?
 
Perfect. That gives me enough time for adequate reviews before my new build in August (unless the NCASE M1 comes earlier).
 
For that news update I will kill you with high-fives :D
 
"Systems on sale", so does this mean like Dell, Falcon and all the other OEMs will have access to it first? ORRR does this mean I can run down to my local Microcenter and pick up one on June 2nd? Last but surely not least, source?
 
"Systems on sale", so does this mean like Dell, Falcon and all the other OEMs will have access to it first? ORRR does this mean I can run down to my local Microcenter and pick up one on June 2nd? Last but surely not least, source?

The official Intel roadmap is the source; it was leaked out, and cpu-world managed to get a link to Google's cached version before the PDF was taken off Intel's site. If you want, PM me and I will email you the HTML version that I cut and pasted from Google.

As for "systems on sale", I have no idea -- I just put up what was on the document.
 