Intel Unveils 5th Generation Intel Core Processor Family

HardOCP News

[H] News
Joined
Dec 31, 1969
Messages
0
Today, Intel unveiled the 5th Generation Intel® Core™ processor family. Built on Intel’s cutting-edge 14nm manufacturing process, the 5th Gen Intel Core processors deliver premium performance, stunning visuals and improved battery life to take computing to the next level. That performance also provides the foundation for more natural and interactive experiences with Intel® RealSense™ technology, Intel® Wireless Display (Intel® WiDi) and voice assistants. With the move to the “Broadwell” microarchitecture expected to be the fastest mobile transition in company history, offering consumers a broad selection and availability of devices, the 5th Gen Intel Core processors are purpose-built for the next generation of compute devices, delivering a thin, light and more efficient experience across traditional notebooks, 2 in 1s, Ultrabook™ devices, Chromebooks, All-In-One desktop PCs and mini PCs.
 
Targeted toward mobile, so does that mean desktop users are left behind once again? Will this ever turn into something that represents more than maybe a 5% increase in processor speed vs the previous generation?
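For a sense of scale, here's a quick back-of-the-envelope sketch (Python) of how a roughly 5% per-generation gain compounds; the 5% is just the figure from the post above, not a measured benchmark:

# Rough illustration: how a ~5% per-generation gain compounds.
# The 5% figure is an assumption taken from the post above, not a benchmark result.
per_gen_gain = 0.05

for generations in range(1, 6):
    cumulative = (1 + per_gen_gain) ** generations - 1
    print(f"{generations} generation(s) ahead: ~{cumulative:.0%} faster")

# Prints roughly: 1 -> 5%, 2 -> 10%, 3 -> 16%, 4 -> 22%, 5 -> 28%

So even someone several generations back (like the 2500K owners further down the thread) is only giving up somewhere around a quarter of today's per-core throughput, which fits the "it still runs everything" sentiment here.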
 
Still rocking my 2500K @ a healthy overclock... I'm a few generations behind, but it runs everything (and I mean everything) super smooth and sweet.

BF4 @ 1440p 90Hz with all options, smooth as silk.
 
Based on my sales training at Intel Retail Edge, I should try to sell you something based on the latest and slimiest heat-producing chip with cheap TIM.
 
I thought there had already been some Broadwell devices out there. Not that I've paid much attention to the market these are after.
 
We deploy maybe 50 to 100 laptops a year, depending on the hiring situation and how hard staff are on their machines. This is good news for us - better battery life with enough performance to get by.
 
I thought there had already been some Broadwell devices out there. Not that I've paid much attention to the market these are after.

These are the higher-power Broadwell parts, at 15W and 28W TDP. The ones released last year were the low-power 4.5W chips for fanless designs.
 
At this stage I feel Intel is going for smaller process nodes just for the sake of it.
 
Targeted toward mobile, so does that mean desktop users are left behind once again? Will this ever turn into something that represents more than maybe a 5% increase in processor speed vs the previous generation?

As long as we don't have software (other than rendering and encoding) that truly benefits from increases in CPU power, it really isn't a big deal.

Are you really CPU limited in anything?

I'm still on a CPU I bought in 2011, as I see no reason to upgrade. Nothing taxes my CPU.

This becomes even more notable if we broaden it to computer users overall. For what most people do, any existing AMD or Intel 64bit dual core chip made in the last 10 years would be more than fast enough.

It's a far cry from where we were when I was in college (1999 to 2003), when in order to keep up I needed a new CPU and motherboard every year and a new GPU every 6 months.
 
I'm with most people here. My 3930K has little issue doing pretty much anything I ask of it. I was hoping Broadwell would be the key to lowering my power draw... while it might, it doesn't seem like it would be cost-effective.

I guess we'll wait for Skylake and cheaper DDR4.
 
At this stage I feel Intel is going for smaller process nodes just for the sake of it.

And why wouldn't they?

High-end CPUs are simply only relevant to a tiny fraction of the market.

Mobility and low power draw are MUCH more important to anyone in the CPU business than raw power.
 
These are just Broadwell SoCs for notebooks and "2-in-1" desktops, basically Mac-like desktops using mobile processors. Broadwell K (socketed desktop LGA1150) CPUs start dribbling out in the Spring.
 
I agree on the performance, though I would love to see a 14nm high-performance CPU that produces less heat. On warmer days, long hours of gaming can make my room hotter than I would like with all the heat my system dumps out. Same goes for the GPU; it's one of the reasons why I don't do multi-GPU setups anymore.
 
Makes more business sense to dedicate limited 14nm capacity to mobile, laptop and tablet parts first, where there's still a premium.
 
Glad they are using the mobile segment to beta-test their new 14nm process.

That being said, I'll be holding off on a system overhaul until three things happen:
1. Skylake 14nm socketed desktop parts for Z-series chipset have an adequate TIM under the IHS (solder).
2. DDR4 prices are less than half of what they are today.
3. A single GPU with at least 6GB of VRAM that meets or exceeds my 780 SLI performance while offering tremendously less power draw and heat output, be it NVIDIA or AMD.


If Intel still chooses to cheap out and rip off us mainstream desktop customers by using paste under the IHS for their mainstream i3, i5, and i7 parts (hell, I'd be happy with just the K-SKUs having solder), then I may end up springing for the entry or mid-level enthusiast i7 part (X109 platform?)... which should be an 8-core, with 12+ cores for the $1000+ Extreme Edition. I will probably stick with a uATX build, and the added cost of the X-series platform may be worth it since I tend to keep each build for 3-6 years, assuming a higher motherboard price and the $500-600 mid-level i7 part.
 
Targeted toward mobile, so does that mean desktop users are left behind once again? Will this ever turn into something that represents more than maybe a 5% increase in processor speed vs the previous generation?

Who knows. Maybe they eventually combine the tech and put 20 of these small mobile-based processors on a single board for desktop use.
 
Targeted toward mobile, so does that mean desktop users are left behind once again? Will this ever turn into something that represents more than maybe a 5% increase in processor speed vs the previous generation?

Probably not. Wait for Skylake or even Cannonlake.
 
Probably not. Wait for Skylake or even Cannonlake.

I am hoping that 14nm will allow for 5GHz+ overclocks, but I am not holding my breath. I do not think I will replace my i7 970 until Skylake-E; however, the Core 2 Quad that powers my Linux-based HTPC I will probably replace with a quad-core 14nm chip, most likely after I see whether AMD has a massive increase in efficiency with Zen.
 
*pats trusty OC'd i5 2500K*

For my rig, same. I might just grab a 3770K when the time comes and those chips are much cheaper. Who knows how long until I get a whole new build. Maybe we'll be working with layered RAM by that point.
 
For my rig, same. I might just grab a 3770K when the time comes and those chips are much cheaper. Who knows how long until I get a whole new build. Maybe we'll be working with layered RAM by that point.

I don't see the point in making that upgrade at any point in time.
 
Yeah, I just see no need for an upgrade in the near future.

Maybe once the LGA 2011 Xeons fall in price, I'll pop one of those in and see how well it overclocks, but due to the rather limited generation-over-generation improvements in CPUs, and the relative lack of anything that takes advantage of those gains, there really is no reason to buy the latest gens.

My 3930k still feels fresh and new, and is not holding me back in the slightest.
 
At this stage I feel intel is going for lower scale processes just for the sake of it.

This is what my reactionary side says:

Yeah, have to agree. They haven't added any major GPU architectural efficiency improvements since Ivy Bridge (which had only 25% more EUs, but over 50% higher performance). The GT3 is a holdover from Ivy Bridge as well, making the Crystalwell the only new GPU-related product Intel developed in the last 3 years.

And this is what my cool reasoning side of the brain says in retort:

Intel is stuck waiting on DDR4.

Yeah, they could release a more efficient GPU core rev, but until DDR4 makes the rounds, they're going to be bandwidth-constrained in the GT3 part range (even with Crystalwell). Might as well wait for the bandwidth to drive it before you release the New Hotness.

And while you could improve GT2 performance, unless you also have the know-how to improve performance/watt, GT2 will hit up against the 15W TDP brick wall (HD 4600 is not much of an improvement over HD 4000 in 15W parts). Also, for cost and portability reasons, a number of 15W GT2 parts omit the second memory channel, making more performance moot.

Beyond the GPU, core improvements are much harder to do on the already highly-optimized CPU.
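To put rough numbers on the bandwidth point above, here's a quick sketch (Python); peak theoretical figures only, and DDR3-1600 / DDR4-2400 are assumed as representative speeds:

# Peak theoretical memory bandwidth = channels * transfer rate (MT/s) * 8 bytes per transfer.
def peak_bandwidth_gbs(channels: int, mt_per_s: int) -> float:
    return channels * mt_per_s * 8 / 1000  # GB/s

print(f"DDR3-1600, single channel: {peak_bandwidth_gbs(1, 1600):.1f} GB/s")  # 12.8
print(f"DDR3-1600, dual channel:   {peak_bandwidth_gbs(2, 1600):.1f} GB/s")  # 25.6
print(f"DDR4-2400, dual channel:   {peak_bandwidth_gbs(2, 2400):.1f} GB/s")  # 38.4

The single-channel line is also why the point about 15W GT2 parts dropping the second memory channel matters: the iGPU's available bandwidth gets cut in half before Crystalwell or DDR4 even enter the picture.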
 
Beyond the GPU, core improvements are much harder to do on the already highly-optimized CPU.

What if they make the cores bigger? That should help. Fuck more cores, we need bigger, faster ones.
 
What if they make the cores bigger? That should help. Fuck more cores, we need bigger, faster ones.

Making cores with a wider issue width means more overhead to handle that extra throughput, and eventually the overhead gets so high that the actual performance gains are zero. This could change if we have a major breakthrough in core layout, but it won't be anytime soon.
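A commonly cited rule of thumb here is Pollack's rule: single-thread performance tends to scale roughly with the square root of the complexity (area/transistors) spent on the core. A tiny sketch (Python) of what that implies, purely illustrative and not a model of any specific Intel core:

import math

# Pollack's rule of thumb: performance ~ sqrt(core complexity).
# Illustrative only; real designs deviate in both directions.
for area_multiplier in (1, 2, 4, 8):
    speedup = math.sqrt(area_multiplier)
    print(f"{area_multiplier}x the core area -> ~{speedup:.2f}x single-thread performance")

# 2x the area buys ~1.41x, 4x buys ~2x, 8x buys only ~2.83x,
# which is the diminishing-returns wall described above.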
 
Given current-gen games and software, your 3770K will see 2018.

My overclocked i5 3570K eats anything I throw at it paired with a GTX 980 - even @ 2560x1600. While the nerd in me wants to build a new rig (because who doesn't?) - I'm actually OK with things the way they are. I didn't hold back or compromise on anything with that build, and of all the PCs I've put together over the years, I'm most happy with this one. This just gives me more $$$ to invest in other things like stereo equipment and vehicles. :)

I for one welcome chips with the same horsepower for less heat and power consumption so we can get some badarse portable gaming solutions. (I'd love to see a Surface Pro 3 type thingy that can actually game.)
 
I for one welcome chips with the same horsepower for less heat and power consumption so we can get some badarse portable gaming solutions. (I'd love to see a Surface Pro 3 type thingy that can actually game.)

There are 2 scenarios that would make me decide for an upgrade:
- Cram everything in a mini-ITX enclosure with no compromises compared to my current rig: powerful AND quiet with standard GPU/PSU/Cooler. (I know there's NCASE, still researching on that one, here's an interesting article: http://www.silentpcreview.com/Quiet_Mini-ITX_Gaming_Build_Guide_2 )
- Affordable DDR4, because I don't wanna invest in a near EOL DDR3 platform.
 
Making cores with a wider issue width means more overhead to handle that extra throughput, and eventually the overhead gets so high that the actual performance gains are zero. This could change if we have a major breakthrough in core layout, but it won't be anytime soon.

I don't know; it seems to me they stick with cores this size because they're easier to park, which helps save power. Though that's only my gut feeling, I'm not an expert.
 
There are 2 scenarios that would make me decide for an upgrade:
- Cram everything in a mini-ITX enclosure with no compromises compared to my current rig: powerful AND quiet with standard GPU/PSU/Cooler. (I know there's NCASE, still researching on that one, here's an interesting article: http://www.silentpcreview.com/Quiet_Mini-ITX_Gaming_Build_Guide_2 )
- Affordable DDR4, because I don't wanna invest in a near EOL DDR3 platform.

Take a look at the CaseLabs Mercury
 
My overclocked i5 3570K eats anything I throw at it paired with a GTX 980 - even @ 2560x1600. While the nerd in me wants to build a new rig (because who doesn't?) - I'm actually OK with things the way they are. I didn't hold back or compromise on anything with that build, and of all the PCs I've put together over the years, I'm most happy with this one. This just gives me more $$$ to invest in other things like stereo equipment and vehicles. :)

You must run mostly older titles.


Your 980 is a bit faster than my Titan, but not that much.

I find that the ideal gaming experience (never dropping below 60fps with all eye candy on) is still far from achievable at 2560x1600...
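Some rough pixel-count arithmetic (Python) shows why that target is so punishing; illustrative only, since real scaling also depends on where the bottleneck actually sits:

# How many pixels each resolution pushes, relative to 1080p.
resolutions = {"1920x1080": 1920 * 1080, "2560x1440": 2560 * 1440, "2560x1600": 2560 * 1600}
baseline = resolutions["1920x1080"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / baseline:.2f}x of 1080p)")

# 2560x1600 is ~1.98x the pixels of 1080p, so a card that is pixel-bound
# at 60 fps in 1080p has roughly half that headroom at 2560x1600.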
 
Zarathustra[H];1041336648 said:
As long as we don't have software (other than rendering and encoding) that truly benefits from increases in CPU power, it really isn't a big deal.

Are you really CPU limited in anything?

I'm still on a CPU I bought in 2011, as I see no reason to upgrade. Nothing taxes my CPU.

This becomes even more notable if we broaden it to computer users overall. For what most people do, any existing AMD or Intel 64bit dual core chip made in the last 10 years would be more than fast enough.

It's a far cry from where we were when I was in college (1999 to 2003), when in order to keep up I needed a new CPU and motherboard every year and a new GPU every 6 months.

Dragon Age: Inquisition chewed up my [email protected].
 
Zarathustra[H];1041337952 said:
You must run mostly older titles.


Your 980 is a bit faster than my Titan, but not that much.

I find that the ideal gaming experience (never dropping below 60fps with all eye candy on) is still far from achievable at 2560x1600...

That's fair - I'm on the "buy last year's games during Steam sales" plan instead of paying full price on release. There's no Dragon Age: Inquisition or super-demanding FPSs going on.

But it's still more than respectable. The Tomb Raider benchmark pulled an average of 58.8 FPS with a low dip to 52 - everything all the way up at that rez. Also driving a second monitor simultaneously @ 1900x1200.

For what it's worth...
 
Zarathustra[H];1041336648 said:
Are you really CPU limited in anything?

Absolutely. I play World of Warcraft a lot. That game is ridiculously CPU limited. Even with my 2500K @ 5GHz, I'll sit there with my CPU pegged as both my GPUs coast along at ~20% utilization each. Granted, the game is not as multi-threaded as it could be, but there is nothing wrong with wanting more single-thread performance.
 
Absolutely. I play World of Warcraft a lot. That game is ridiculously CPU limited. Even with my 2500K @ 5GHz, I'll sit there with my CPU pegged as both my GPUs coast along at ~20% utilization each. Granted, the game is not as multi-threaded as it could be, but there is nothing wrong with wanting more single-thread performance.

Well, how are your framerates? Doesn't matter how utilized the CPU cores are if it's not actually affecting your performance in-game.
 
Well, how are your framerates? Doesn't matter how utilized the CPU cores are if it's not actually affecting your performance in-game.

I haven't played WoW in several years, but when you're in the middle of a busy city, framerates can and do suffer. I'd imagine the same can be said for the middle of a raid.
 
Well, how are your framerates? Doesn't matter how utilized the CPU cores are if it's not actually affecting your performance in-game.

It does affect my performance in-game. Granted, I run a 120Hz monitor, so situations where others might be content pegged at 60Hz still leave me wanting more. Still, it bugs me that even at a paltry resolution of 1080p, I have to turn certain settings down to maintain my framerate, all due to the CPU being the bottleneck.
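For anyone wondering why 120Hz makes a CPU bottleneck so much more visible: the per-frame budget is just 1000 ms divided by the target framerate. A quick sketch (Python), arithmetic only:

# Frame-time budget at different refresh targets.
for target_fps in (60, 90, 120, 144):
    budget_ms = 1000 / target_fps
    print(f"{target_fps} fps -> {budget_ms:.2f} ms per frame")

# e.g. 60 fps -> 16.67 ms, 120 fps -> 8.33 ms: at 120Hz the game's CPU work
# (draw calls, AI, addons in WoW's case) has half as long to finish before it,
# rather than the GPU, becomes the cap.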
 