Intel to use integrated AMD graphics in their chips

A day late and a dollar short... But I'm not jumping to conclusions; only time will tell how it pans out.

I feel like with laptops, you either have entry level or high end in terms of gaming (integrated vs. dedicated). Even if you're running a super powerful onboard graphics chip, are you really going to run any of the newer games on anything other than low settings? Especially at higher resolutions.

Then you have the new GeForce 1000 series, which has no problem running any game at high resolutions.

Maybe... just maybe you will be able to run a few games with "some" settings above low. Like I said, only time will tell.
 

LOL!

You are using stocks to justify something?

I guess the Apple Watch is a huge success and Tim Cook is Apple's best CEO ever, since Apple's stock keeps going up and up.
 
You read that backwards. He's using that to justify the stock price bump. I think it's just the normal up and down from AMD, personally. Should help their bottom line, though.

I expect business computers with Intel/AMD to do pretty well (maybe not as well as Intel/Intel), and I expect Dual Graphics mobile systems eventually. It would be even better with AMD/Nvidia graphics and an Intel CPU, but I doubt that will happen.
 
EMIB, HBM2, AMD GPU, KBL/CFL-H? CPU. $2000-$3000 devices?

Must be some premium Apple Macs.

Maybe it's the replacement Mac Pro?

Apple announces the new Mac Pro with integrated everything: now even less power, and even less upgradable! And three times the price!

Cause you know there would have to be two of them, or it's not a gimmicky Apple workstation :D

Also, until this is officially announced by Intel, it's just a rumor. WSJ is a step up from the usual Wccftech crap, but it's still a for-pay site trying to get page views.

And even if it's true, we're still talking EXACTLY THE SAME COST OF IMPLEMENTATION as an external GPU. You still have to supply the HBM and the tightly packed integration of all these parts.

Even Kyle has yet to post an article: he believes so solidly in this rumor that it's only appeared here in the forums.
 
Fair enough.

We'll see if these come anywhere near Max Q in terms of performance/watt and cost. It's tightly integrated, but you still have to pay for HBM2 and cool the beast.

But I can see the appeal for Intel, which doesn't like Nvidia having so much control over their gaming laptop designs. The downside is they have to rely on AMD for their "alternatives."

But I still think this is just a custom chip for Apple. The press release means nothing until there are multiple design wins.
 

I don't see Intel being too concerned about Nvidia or any type of GPU next to their CPU. I do see Intel being concerned about AMD getting more money to survive. But I feel that if there's one company in the world that could force Intel to work with AMD instead of using Nvidia chips, it would be Apple. All it would take is a little hint that Ryzen looks mighty good. Add that to Apple's clear reluctance to have Nvidia back inside their stuff, and I can see how this turned out.
 
Instead of being taken by surprise, you could just have believed Kyle when he wrote about future Intel CPUs using AMD graphics.


I guess Kyle can say "told you so" to lots of people now. :)

That Kyle Bennett fake news rumormonger!
 
The GPU size isn't what's constraining the size of gaming laptops. It's the cooling required to keep those parts running continuously.

This won't open any doors Nvidia has not already opened for themselves. It just gives builders of portable high-end gaming devices another alternative.
 
https://browser.geekbench.com/v4/compute/811174
[Geekbench compute result screenshot]


Looks to be semi-custom Polaris with HBM IP. 24 CUs would be 1536 SPs, correct?
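
Quick sanity check on that math (a minimal sketch, assuming the standard GCN layout of 64 stream processors per compute unit):

```python
# GCN arithmetic: each compute unit (CU) contains 64 stream processors (SPs).
cus = 24
sps_per_cu = 64  # standard GCN configuration

print(cus * sps_per_cu)  # 1536, so yes: 24 CUs = 1536 SPs
```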


[screenshot]

People are also speculating it is Hades Canyon: 100W/66W 4C/8T parts w/dGPU (Polaris MCM) and Optane.

Oh, and here is the supposed performance:

[performance comparison screenshot]

Source

Not bad perf at all if it's real!
 
Probably real, but in an ultra-thin laptop it probably won't reach that score due to insufficient cooling. In a regular laptop with adequate cooling, this performance would be great.
 
Never use the so-called proto boards with unlimited power and cooling for anything. OEM implementations are usually completely different, for obvious reasons. There is also potentially huge variance between OEMs.

When you see the results on notebookreview etc., then it's something worth talking about.

The 100W spec for the NUC isn't an accident. The GPU alone may be able to pull 65-75W.
 
Maybe it's the replacement Mac Pro?

The MacBook Pro line only has two models with an AMD GPU, and I guess these may just replace those and increase the price as well. I can easily see the line go way past $3000. The lowest such model currently starts at $2400; I would expect a new starting point around $2800.

It also raises the question of "why" for this product again. The barebones NUC will also be something like $1000 before you add memory and an SSD. A couple hundred thousand units of potential volume for Macs and NUCs combined? The cost structure is going to be record high.
 
Well yeah, I was just joking about it replacing the currently-dead Mac Pro.

This is obviously aimed at beefing up the MacBook Pro and making it a fraction of a mm thinner, at some ungodly cost premium. Because the fucker is not thin enough?

Now it's clear to me why Apple raised the price of the Pro by hundreds of dollars last time: it's to prepare buyers for an even bigger reaming this time around, all to bump performance up to 1050 levels.

And by the time this can actually be purchased, consumer Vega will be out, making this mostly moot.
 
EMIB to attach the HBM2 to Vega 11 and avoid an interposer, then a regular PCIe x8 interface to the CPU, all on a single package. The CPU is a KBL-H, not even KBL-R or CFL-H.
 
Damn, this will be impressive!

I wonder how this will compare to an Xbox One X in performance. Laptop only, or will there be a desktop chip for VERY SFF systems?
 
Xbox One X is ~6 TFLOPS. This one is 3.3 on a day when you have all the power and cooling you want ;)

Real world in a MacBook Pro it may be something like 2.5 TFLOPS vs. 6 TFLOPS in the Xbox One X. And the Xbox One X has much more bandwidth. Think an RX 560 desktop card with ~180 GB/s of bandwidth instead of 112 GB/s; the Xbox One X has 326 GB/s by comparison.
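
Rough math behind those numbers (a sketch; the ~1.07 GHz clock is an assumption picked to land on the quoted 3.3 TFLOPS):

```python
# FP32 throughput for a GCN-style GPU: 2 FLOPs per SP per clock (FMA).
sps = 1536               # from the Geekbench leak above
clock_ghz = 1.07         # assumed boost clock (not confirmed anywhere)

tflops = 2 * sps * clock_ghz / 1000
print(f"{tflops:.2f} TFLOPS")  # ~3.29, vs ~6.0 for the Xbox One X

# Memory bandwidth side of the comparison, numbers as quoted in this post:
for name, gbps in [("this part (est.)", 180), ("RX 560", 112), ("Xbox One X", 326)]:
    print(f"{name}: {gbps} GB/s")
```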

It's Apple/NUC only. The NUC is a 100W part at the top. Other OEMs won't touch it, for multiple reasons, and expect to pay through the nose for any Apple product with it, or for the NUC.

Seems to be a Q2-ish 2018 product, using KBL-H, which will have been replaced by KBL-R and CFL-H by then. Not to mention Volta.

It's an outdated product today, and even more so when you can actually buy it. Seems to be another product that's 12-18 months behind any relevance curve.

Impressive it is not. It's three chips on a package with only one connection being "new". The chipset didn't even make it onto the package.

HBM2<->(EMIB)<->GPU<->(PCIe x8)<->CPU
 
Guess HBM memory is about to get a helluva lot cheaper.

This will increase production of HBM memory modules, but I don't think the limitation of the tech is the memory so much as the massive interposers used on the top-end HBM products, i.e. Vega and GP100. This implementation should be quite cheap.
 
4GB of HBM2 is still close to $100. EMIB saves some $5-10 versus an interposer, and in this case it saves the more important height, too.
 
Sure, but I'm mostly speaking to the cost (and yields...) of the massive interposers used by GP100 and Vega. This part looks very easy to produce, at a much smaller size, and here the power and size benefits of HBM are realized.
 
There is only size. HBM2 hasn't been able to deliver any power benefits against the newer GDDR5X/GDDR6. Power density is more of an issue since this is a laptop part, not to mention the overall power usage, which will be quite large. It's not a question of whether it will throttle; it's a question of how much.
 
I would still think this would be the better way to go for laptops, as long as the price is right compared to Raven Ridge. It should offer similar CPU performance with much better GPU performance.
 
Well, it is, except that this is a 45W solution, whereas Raven Ridge is a 15W solution. This part won't be going into ultrabooks like the MacBook Air and XPS 13, for example, whereas Raven Ridge will (or should).
 
The CPU being used is a 35-45W part on its own. If the entire package is 45W, it's going to have some serious performance drawbacks. As for the NUCs: 10-15W chips became 65-100W (no, not desktop chips, mobile!).
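
To put rough numbers on it (back-of-the-envelope; the exact split between CPU and GPU is guesswork):

```python
# Power-budget sanity check using the TDP figures quoted in this thread.
package_tdp_w = 45
cpu_tdp_w = (35, 45)  # a KBL-H class CPU on its own

for cpu in cpu_tdp_w:
    gpu_budget = package_tdp_w - cpu
    print(f"CPU at {cpu}W leaves {gpu_budget}W for a 24-CU GPU plus HBM2")
# 0-10W left over is nowhere near enough, hence the throttling concern.
```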
 
Current-gen 15-inch MacBook Pros have a 45W Intel chip and a Polaris chip. This seems like a size reduction on the pair at least, and maybe some power savings to boot. I'm curious whether a lower-wattage version will be made as well.
 
There goes the advantage that Raven Ridge once would have had.

Obviously, this isn't a huge surprise with turncoat Raja in charge until recently.
 
Perhaps the first and last time we see it.

Another demonstration that there is no integration. Next step: Intel CPU + Intel dGPU?

 