Now that Haswell is close to release, can we start talking about Broadwell?

aphexcoil

Does anyone have any details concerning Broadwell? We know that it will be a die shrink to 14nm, but are there any other planned upgrades for Broadwell (e.g. the GPU, power savings, etc.)?

We're usually pretty good at keeping the rumor-mill spinning, so if anyone has any info about what we can expect from Broadwell, then let's hear it.
 
As with everything on Intel's agenda: BETTER INTEGRATED GRAPHICS!!! YAAAAY!!!! BETTER BATTERY LIFE!!! YEAAH!!!!! ADVANCED SLEEP STATES!!!! WEEEE!!!!

Performance? Oh, yeah a little bit.
 
No. We don't even have reliable numbers for Haswell.
 
As with everything on Intel's agenda: BETTER INTEGRATED GRAPHICS!!! YAAAAY!!!! BETTER BATTERY LIFE!!! YEAAH!!!!! ADVANCED SLEEP STATES!!!! WEEEE!!!!

Performance? Oh, yeah a little bit.

Sadly, they don't need to get better performance. It's like a race where the one behind has a broken leg. He won't catch up, and the winner can take it like a walk in the park. Which makes me sad.
 
As with everything on Intel's agenda: BETTER INTEGRATED GRAPHICS!!! YAAAAY!!!! BETTER BATTERY LIFE!!! YEAAH!!!!! ADVANCED SLEEP STATES!!!! WEEEE!!!!

Performance? Oh, yeah a little bit.
Great post, I totally agree with you.
 
So sad that Intel's focus is almost all mobile moving forward, so much that they're not even going to make their own motherboards anymore...
 
So sad that Intel's focus is almost all mobile moving forward, so much that they're not even going to make their own motherboards anymore...

Who actually bought Intel motherboards the last few releases though?...apart from maybe OEMs?
 
They have the best drivers for their boards, and there's not much of a difference compared to the Asus/Gigabyte/MSI boards. They even have special "drivers" that enable stuff like TRIM support for RAID arrays. There are a ton of reasons why people would pick an Intel board, and sure, it doesn't have all the bling bling, but I bet their RMA process is much better than everyone else's :)
 
Performance? Oh, yeah a little bit.
WTH did that meme come from? Haswell, like every new uarch since Conroe, has improved IPC by around 10-15% on average. There are unlocked enthusiast chips available for a small premium that let you run a little toaster oven inside your case if you wish.

I don't blame people for wanting as much performance as you can get in a 125W socket, but the trend for mainstream systems (by far the vast majority of where desktop processors are going) is smaller form factors and better power efficiency. The market generally doesn't want the same processor characteristics that overclockers want.

Intel is focusing much more on energy efficiency nowadays and not really pushing up clock speeds (see AMD if you have a clock speed fetish... that's really working out just like Netburst lol). The K processors are a decent compromise, IMO, or usually are when not hobbled by poor TIM.
 
Who actually bought Intel motherboards the last few releases though?...apart from maybe OEMs?

Agreed. I've used Asus boards (with Intel CPUs) for probably 6 years now. Intel should scrap making boards.

But like many of us, I am worried about the demise of the build-your-own high end desktop.
 
Build your own high-end isn't going anywhere, and Intel is no longer making boards starting with Haswell.

It's the build your own low-end systems that are in danger.
 
I don't think any of us here uses the integrated GPU, amirite!


Haha u rite. I think more of us are pissed that precious die space is being wasted on an integrated graphics processor that we don't want and don't need, hence feeling like we're paying extra for something that is a waste.

I wonder if it will use DDR4.


Well, it might on the enthusiast side. All we know for sure is that Haswell-E will have DDR4 for the server chips, but based on IVB-E we won't see Haswell-E until at least Q3 2014. Wikipedia has an entry saying Skylake will bring widespread DDR4 adoption, so it might be right to think DDR4 will be on at least the Extreme platform, and maybe the i5/i7 equivalents, with Broadwell.
 
I've been waiting for a reason to upgrade my 920 so I can explain to my wife why I need to upgrade, but no Intel revision so far has given me the material for that argument. I'm looking at you, Haswell: make me upgrade, dammit. I've had an itch for 3 years now!
 
Speed wouldn't be a problem if developers started making programs that can use more than 2 cores. Especially games, which need the most power.
 
Speed wouldn't be a problem if developers started making programs that can use more than 2 cores. Especially games, which need the most power.

I think next-gen consoles will make pc games use more cpu cores.
 
Speed wouldn't be a problem if developers started making programs that can use more than 2 cores. Especially games, which need the most power.

Multithreaded programming is much, much harder than single-threaded. Expect even more bugs in software if you think that is the solution. Many programmers don't even know how to detect a deadlock. Hardware vendors could find no other way around the silicon limitations and are now throwing the problem at developers.
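
For anyone who hasn't bumped into one, here's a minimal, textbook sketch of the kind of deadlock being talked about (the lock and function names are just illustrative, not from any real codebase):

```cpp
// Classic lock-ordering deadlock: two threads acquire the same two mutexes
// in opposite orders, so each can end up waiting forever for the other.
#include <mutex>
#include <thread>

std::mutex lock_a, lock_b;

void worker_1()
{
    std::lock_guard<std::mutex> a(lock_a);   // holds A...
    std::lock_guard<std::mutex> b(lock_b);   // ...then waits for B
}

void worker_2()
{
    std::lock_guard<std::mutex> b(lock_b);   // holds B...
    std::lock_guard<std::mutex> a(lock_a);   // ...then waits for A
}

int main()
{
    std::thread t1(worker_1), t2(worker_2);
    t1.join();                               // may never return
    t2.join();
    // Fix: take the locks in one agreed order everywhere,
    // or use std::lock(lock_a, lock_b) to acquire both together.
}
```

It won't hang on every run, which is exactly why these bugs slip through testing.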
 
Multithreaded programming is much, much harder than single-threaded. Expect even more bugs in software if you think that is the solution. Many programmers don't even know how to detect a deadlock. Hardware vendors could find no other way around the silicon limitations and are now throwing the problem at developers.


TSX could alleviate that a little bit. Programming sucks, I'll be the first to admit that, but developers need to step their game up and break out of their shell. I hate the "if it ain't broke, don't fix it" attitude where, as long as something runs, they call it a day.
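
For the curious, here's a rough sketch of what using TSX's RTM intrinsics could look like, assuming a TSX-capable CPU and a compiler that exposes them (e.g. g++ with -mrtm); the shared counter and fallback mutex are just illustrative names:

```cpp
// Minimal RTM (TSX) sketch: try to update shared state inside a hardware
// transaction, and fall back to a conventional mutex if the transaction aborts.
#include <immintrin.h>
#include <mutex>

static long shared_counter = 0;   // illustrative shared state
static std::mutex fallback_lock;  // taken only when the transaction aborts

void increment_counter()
{
    unsigned status = _xbegin();           // start a hardware transaction
    if (status == _XBEGIN_STARTED) {
        ++shared_counter;                  // runs transactionally
        _xend();                           // commit: the update appears atomic
    } else {
        // Transaction aborted (conflict, capacity, etc.): use the plain lock.
        std::lock_guard<std::mutex> guard(fallback_lock);
        ++shared_counter;
    }
}
```

The point is that the hardware detects conflicts on the data the transaction touches, so the common uncontended case never pays for a lock.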
 
I think part of the mentality is: "Why write new code when I can re-use old code, even if that code is out of date and not optimized for new CPUs?"

Another part of the problem is companies cutting back on development time.

We live in a world where BETA is good enough, fix it in the live environment. Not that I agree with that.
 
TSX could alleviate that a little bit. Programming sucks, I'll be the first to admit that, but developers need to step their game up and break out of their shell. I hate the "if it ain't broke, don't fix it" attitude where, as long as something runs, they call it a day.

But nobody wants to pay for software. Or if they do, it should be no more than $0.99 and it better make dinner and do the laundry too or it's getting rated 1 star.
 
I'll be the first to admit that, but developers need to step their game up and break out of their shell. I hate the "if it ain't broke, don't fix it" attitude where, as long as something runs, they call it a day.

You have to realize that some of this is due to management and for good reasons.
 
I don't think any of us here uses the integrated GPU, amirite!

I went with my new-ish IvyBridge build because of the integrated graphics. Made more sense for me to use onboard + 1 nice video card than a Piledriver with 2 video cards. I realize I'm the minority in this regard, but some of us do seek this out.
 
I use the integrated video on my 3570k to run extra monitors...
 
I don't think any of us here uses the integrated GPU, amirite!

I use the i3-2100's integrated GPU for Linux. On the other hand, I just switched my fileserver from the previous i3-2100 to an AMD Athlon II setup, and I had to add in a PCI Matrox Millennium for my ZFS server. Ugh.
 
So sad that Intel's focus is almost all mobile moving forward, so much that they're not even going to make their own motherboards anymore...

They are working toward computer ubiquity. They even talk about it in their roadmaps. We are at the crest of a huge wave of cheap hardware that is going to change the idea of what we wire up to the internet. Power is expensive, and the diminishing returns on cranking up the GHz right now are outweighed by the need for cheap, low-power hardware.
 
According to what Intel stated back in December regarding that rumor, no.
 
Multithreaded programming is much, much harder than single-threaded. Expect even more bugs in software if you think that is the solution. Many programmers don't even know how to detect a deadlock. Hardware vendors could find no other way around the silicon limitations and are now throwing the problem at developers.

In a typical game engine, the AI, physics, sound, game script, etc. all run at different timesteps. Thus the various variables they control are not in clock-per-clock sync with each other to begin with. Splitting these features up into multiple threads has no effect on the number of bugs one will encounter (no more than if the devs designed a single thread to run all these processes, as all programs will have bugs). If the physics thread is ticking at 70 times per second whereas the game script thread is ticking at 125 per second, then so long as the main thread runs at the highest frequency or greater, it can serve and receive the variables that the child threads operate on.

Other programs are not as simple to plan out, but it is nonetheless possible.
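
Here's a rough sketch of that pattern, with made-up tick rates and names: each subsystem thread publishes its latest state at its own rate, and the main loop simply reads whatever is current rather than synchronizing tick-for-tick.

```cpp
// Subsystems ticking at different rates, as described above. Each worker
// publishes its latest result atomically; the main loop, running at least as
// fast as the fastest worker, just reads whatever values are current.
#include <atomic>
#include <chrono>
#include <thread>

std::atomic<bool> running{true};
std::atomic<int>  physics_step{0};   // latest physics tick (illustrative)
std::atomic<int>  script_step{0};    // latest game-script tick (illustrative)

void run_at_hz(std::atomic<int>& counter, int hz)
{
    auto period = std::chrono::microseconds(1000000 / hz);
    while (running) {
        counter.fetch_add(1);                   // "do the work" and publish it
        std::this_thread::sleep_for(period);
    }
}

int main()
{
    std::thread physics(run_at_hz, std::ref(physics_step), 70);   // ~70 ticks/s
    std::thread script (run_at_hz, std::ref(script_step), 125);   // ~125 ticks/s

    for (int frame = 0; frame < 600; ++frame) {                   // main loop
        int p = physics_step.load();            // read latest published values;
        int s = script_step.load();             // no per-tick synchronization
        (void)p; (void)s;                       // render / dispatch with them
        std::this_thread::sleep_for(std::chrono::milliseconds(4)); // ~250 Hz
    }

    running = false;
    physics.join();
    script.join();
}
```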
 
Intel DOES want to move to all-BGA production, so as to lock down the market; however, with AMD still offering even SOMEWHAT competitive chips, they need the illusion of freedom. If they went all-BGA now, a lot of Intel fans might move to AMD and even cause a rise in AMD ownership. As soon as AMD is out of the desktop picture entirely, Intel will go all-BGA: I bet my nuts on it.
 