Leaked Intel CPU Roadmap For 2016?

I'm in for an 8-core 6900K.


As crazy as it sounds, I'm not all that psyched about more cores. I'm still on a 4-core/8-thread CPU, and not once has an AAA game, video encoding, or a VM appliance given me fits of rage: "OMG NEED MOAR CORES!"

It's my video card that is the culprit (GTX 660 2GB), and once I upgrade that (970 BF, here I come!) I can go back to my 2440x1440 resolution for nearly everything at max and not worry about any problems.

Don't get me wrong, I'll take an 8-core system... eventually.
 
More cores won't get exciting until games start making use of them. Or if reverse hyperthreading ever gets made.
 
Are we ever going to get out of the 3-4 GHz range? It seems we can't pass that barrier. I would have thought we'd be in the 4-5 GHz range by now.
 
Are we ever going to get out of the 3-4 GHz range? It seems we can't pass that barrier. I would have thought we'd be in the 4-5 GHz range by now.

http://ark.intel.com/products/88195/Intel-Core-i7-6700K-Processor-8M-Cache-up-to-4_20-GHz

I think a 4 GHz base with 4.2 GHz turbo counts as 4-5.

I don't care what the base is. We need a CPU that can run 5 GHz+ overclocked without having to cherry-pick chips. I got a pretty good 2600K; getting it to run at 4.9 GHz for nearly 5 years was pretty good. But again, this CPU is almost 5 years old and there has not been another CPU since that clocks as well.
 
I'd say we're nearing the limit of frequency attainable for such complex chip designs using silicon. Not sure what's next to be used; maybe something using a graphene composite down the road.

These new SKUs bode well for us, though. If the 10-core is $1k or so as usual, the others will move down proportionately.

Myself, I hope to find a used 5960X with an X109(?) chipset if they are out next year. Unless the 6800K has some substantial IPC and OC gains. Can't wait to retire from X58.
 
So is it safe to say that my X99 motherboard will be good for this new generation of processors? Not that I have any reason to ditch my 5820K, but I just find it odd that Intel would stick with X99 instead of trying to make money on another chipset.
 
As crazy as it sounds, I'm not all that psyched about more cores. I'm still on a 4-core/8-thread CPU, and not once has an AAA game, video encoding, or a VM appliance given me fits of rage: "OMG NEED MOAR CORES!"

It's my video card that is the culprit (GTX 660 2GB), and once I upgrade that (970 BF, here I come!) I can go back to my 2440x1440 resolution for nearly everything at max and not worry about any problems.

Don't get me wrong, I'll take an 8-core system... eventually.

My position is the opposite. Holding out for 16 cores.
I play the odd game, but most of my time is spent in 3D and rendering, which has me constantly wanting more cores. Granted, there's distributed rendering, which I also make use of, but that ties up the other workstations and is too time-consuming for screen grabs. So local rendering is always better in such cases.

And so it's MOAR CORES for me.

That said, I guess I should just get a dual-socket Xeon and be done with it.
 
Darn... looks like I need to put my plan for an i7 5820K on hold for now...
 
As crazy as it sounds, I'm not all that psyched about more cores. I'm still on a 4-core/8-thread CPU, and not once has an AAA game, video encoding, or a VM appliance given me fits of rage: "OMG NEED MOAR CORES!"

It's my video card that is the culprit (GTX 660 2GB), and once I upgrade that (970 BF, here I come!) I can go back to my 2440x1440 resolution for nearly everything at max and not worry about any problems.

Don't get me wrong, I'll take an 8-core system... eventually.

I dunno, I'll take all the cores I can get. Since Microsoft implemented multi-core support in their .NET compiler, I actually have a use for more cores.
 
Fakest-looking "leak" ever. Also, the officially announced Kaby Lake was delayed to 2017, and in doing so Intel had to seriously suck it up and admit they couldn't keep up tick-tock. If there were any chance in hell of them doing it, they wouldn't have announced that.

Also, the numbers on the fake-looking slides are stupid and not believable. It says Skylake-E will both be out soon and also support socket 2011-v3, among other stupid things. If Skylake-E supports socket 2011-v3, I'll eat my hat.
 
Fakest-looking "leak" ever. Also, the officially announced Kaby Lake was delayed to 2017, and in doing so Intel had to seriously suck it up and admit they couldn't keep up tick-tock. If there were any chance in hell of them doing it, they wouldn't have announced that.

Also, the numbers on the fake-looking slides are stupid and not believable. It says Skylake-E will both be out soon and also support socket 2011-v3, among other stupid things. If Skylake-E supports socket 2011-v3, I'll eat my hat.

Tick-tock wasn't really practical toward the literal end of silicon use.

As of my last conversations with some Intel people I know, graphene, diamond, and gallium arsenide are some options. Intel may have to stall and work on reducing thermal heat generation/dissipation to get a better chip instead of just pushing for smaller gates. They may be forced to go that route. AMD might be able to catch up (if they get incredibly lucky).

I have heard that 12nm is doable. Past that, in an actual manufacturing setting? It's anyone's guess. 10nm MIGHT be doable, but they had to add hafnium to the gates to even get that low.

I still keep wondering if light-based CPUs are dead in the water. There was talk about making CPUs out of actual beams of light, but it sounds like that was never really feasible.
 
As crazy as it sounds, I'm not all that psyched about more cores.
DX12 + improvements in game physics mean you're going to want, not NEED but want, more than 4 cores for gaming over the next few years. 4 cores will still be serviceable, of course, but if you want peak performance out of your video card you'll need more than that.

For internet + YouTube + office, a higher-clocked dual core is still the way to go, and will be for a long time if not forever.
 
I have heard that 12nm is doable. Past that, in an actual manufacturing setting? It's anyone's guess.....There was talk about making CPUs out of actual beams of light, but it sounds like that was never really feasible.
10nm is looking doable but probably delayed while they perfect quantum effect transistors.

http://www.realworldtech.com/intel-10nm-qwfet/

Making CPUs out of light was never going to happen outside a lab. Laser-on-silicon-based logic or buses could maybe still happen in a product you could actually buy, but that is still very much in the R&D stage, and if it ever happens it probably won't be available for many years. Intel demonstrated some of this years ago for CPUs but appears to have turned their photonics focus to inter-CPU/system networking rather than logic.
 
IBM has been able to produce a 7nm test chip and made Intel look like a joke. Hopefully we can get more competition in the CPU market again so prices get lower. I have my eye on that 10-core/20-thread CPU. That should last a decade or two. Now, paying for it is another thing entirely.
 
Test chips aren't mass production, and IBM has pretty much gotten out of the foundry business. They certainly won't be competing with Intel in x86-land, and whatever they make will not be even close to affordable for any but the rich. Sub-10nm will almost certainly become a reality, but don't count on it soon.
 
IBM has been able to produce a 7nm test chip and made Intel look like a joke. Hopefully we can get more competition in the CPU market again so prices get lower. I have my eye on that 10-core/20-thread CPU. That should last a decade or two. Now, paying for it is another thing entirely.

That test chip is sort of like an experiment showing that EUV lithography works. The idea itself isn't new. What they really need is not just to show that it works in theory, but to find a way to make EUV commercially viable. We don't know who will be the first to use EUV commercially.

And by the way, I'd be cautious about saying that IBM has made Intel look like a joke. IBM's 7nm has a fin pitch of 30nm, while Intel's 14nm fin pitch is already 42nm. By that scaling, Intel's 10nm will probably get them to a 30nm fin pitch. It definitely wouldn't be comparable to Intel's 7nm if those figures hold.
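The scaling argument above can be sketched as a back-of-the-envelope calculation. The 42nm and 30nm fin-pitch figures come from the post; the assumption that fin pitch shrinks linearly with the nominal node name is only a rough approximation, since real processes don't track node names exactly:

```python
# Rough linear-scaling estimate of fin pitch versus nominal node name.
# Figures are from the post above; linear scaling is an assumption.
intel_14nm_fin_pitch = 42.0  # nm

def projected_fin_pitch(current_pitch, current_node, target_node):
    """Scale a feature pitch linearly with the nominal node name."""
    return current_pitch * target_node / current_node

intel_10nm_estimate = projected_fin_pitch(intel_14nm_fin_pitch, 14, 10)
print(round(intel_10nm_estimate))  # 30 -- the same fin pitch as IBM's "7nm"
```

Which is the point of the post: by this crude measure, IBM's "7nm" test chip is roughly where Intel's "10nm" would land.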
 
Hmm, it's a bit disappointing that none of the turbo boosts breaks 4 GHz. I was expecting the 6850K to be around 4.2 GHz, as it seemed like that was going to be the gamers' i7.
 
They're still focusing on power reduction over pure performance, since that is where the money is these days, so it's not too surprising. Generally speaking, high clocks = high power usage.

Back in the 90s everyone thought we'd be over 10 GHz by now, FWIW.
 
They're still focusing on power reduction over pure performance, since that is where the money is these days, so it's not too surprising. Generally speaking, high clocks = high power usage.

Back in the 90s everyone thought we'd be over 10 GHz by now, FWIW.

This.

If low power wasn't the name of the game, they could make them faster, but at a higher power draw.
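The "high clocks = high power" point follows from the standard dynamic-power relation P ≈ C·V²·f: because higher clocks usually also require higher core voltage, power grows much faster than linearly with frequency. A minimal sketch; the capacitance and voltage numbers below are purely illustrative, not figures for any real chip:

```python
def dynamic_power(c, v, f):
    """Approximate dynamic CPU power: P = C * V^2 * f
    (switched capacitance, core voltage, clock frequency)."""
    return c * v ** 2 * f

# Illustrative numbers only -- not from any real chip.
base = dynamic_power(c=1e-9, v=1.0, f=4.0e9)    # 4 GHz at 1.0 V
faster = dynamic_power(c=1e-9, v=1.2, f=5.0e9)  # 5 GHz, assuming ~1.2 V needed

# A 25% clock bump that also needs 20% more voltage costs ~80% more power.
print(faster / base)  # 1.8
```

That quadratic voltage term is why vendors chasing perf-per-watt keep clocks modest rather than pushing frequency.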
 
Do you HAVE to buy now or are you just feeling an upgrade itch?

If you have to buy now and want a socket 2011-v3 CPU, I'd rather get the 5820K. There's not much difference performance-wise, core for core, between Broadwell-E and Haswell-E, and there's much better value in the entry-level socket 2011-v3 chips. Unless you've got money to waste, or a specific task that really needs all those cores, the very top-end Intel chips with heaps of cores aren't worth it. Especially if you're overclocking, which I assume you are given the site you're posting on.

If you already have a decent system and just have an upgrade itch, I'd wait for Kaby Lake if you don't mind a less expensive desktop system. If you want a top-end socket 2011 system worth upgrading to, I'd wait for the Purley platform to come out. Kaby Lake will be 2016, maybe mid to late 2016; not sure exactly when. Purley is a long way off though, probably mid to late 2017.
 
I'm doing a new build. I'll be doing gaming/streaming as well as 3D rendering and photo work on it. I won't be pulling the trigger on the CPU purchase for a couple of months.
 
If your current system is the one in your sig, I'm not really sure you'd need to upgrade, honestly.

If you still want to though:

Is it for work and they're paying for everything? Weeeellll... different story in that case; sure, go hog wild and get top-of-the-line everything.

If you're paying for everything and are strapped for cash, then strategic buying is a necessity, and I'd say my previous post's suggestions are still good.
 
The current rig in my sig will be "mothballed". I'm doing the new rig on my own dime (I wish I could get the company I work for to pay for it, but I doubt I could talk them into that, especially since I'm also looking at putting a pair of Titan Xs (or their successors) in the new rig as well).
 