From ATI to AMD back to ATI? A Journey in Futility @ [H]

Either way, AMD's stock took another hit. They've got nothing to prop up the stock until the end of June. Yeah, they will reveal Vega at Computex, but the release won't be until the end of June, next to Epyc.
 
Licensing rumor just denied by Lisa Su: http://www.marketwatch.com/story/am...ense-deal-2017-05-22?siteid=yhoof2&yptr=yahoo
So now we have Intel and AMD denying this deal, but "you have it on good authority it's a done deal"?

All that statement says is that AMD understands what advantages they have, and aren't stupid enough to give them away.

In my opinion, this does not preclude a third party asking for a specific product combining both Intel and AMD tech.

Edit: for that 3rd party's exclusive use.
 
All that statement says is that AMD understands what advantages they have, and aren't stupid enough to give them away.

In my opinion, this does not preclude a third party asking for a specific product combining both Intel and AMD tech.

Edit: for that 3rd party's exclusive use.


Was about to say the same thing.
 
Told you this was complete horse shit. The Intel->Nvidia license seems to be ongoing, even if the payments have stopped. And an on-package AMD GPU attached to an Intel CPU is yet another blue-sky pipe dream.

You'd probably have an easier time making a GPU on an M.2 card, if you really wanted small. And THAT would be replaceable, AND would require no cross-licensing.

This pipe dream of a GPU-on-package runs into the same thermal limits as any other high-end APU does. So no serious system could use both simultaneously in a small case, and larger systems would just be better served with a PCIe graphics card.

FINALLY, WHY THE FUCK WOULD AMD HELP INTEL?
A full-on purchase of the GPU business I could see, but anything less was just pure idiocy produced by dreamers who haven't the slightest clue how to run a business.

To quote Lisa:

"We're not looking at enabling a competitor to compete against our products"

That should have been obvious to all these big dreamers. It's pretty clear to me why she's running things, and the rest of you are not.
 
All that statement says is that AMD understands what advantages they have, and aren't stupid enough to give them away.

In my opinion, this does not preclude a third party asking for a specific product combining both Intel and AMD tech.

Edit: for that 3rd party's exclusive use.

IMO people are starting to use conspiracy-theory-type logic to defend this. There is nothing left to stand on.

It's been denied by Intel and AMD.

The Nvidia license that Intel bought was perpetual for all patents issued up until early 2017, so Intel does NOT need a new GPU license.

The notion of putting AMD GPUs inside an Intel CPU package is pretty much toxic to both Intel and AMD:

  • For Intel it would be an admission that they can't design a decent IGP after coming so far.
  • For AMD it would give up their only potential Raven Ridge advantage. Raven Ridge is arguably their most important product. To undercut it by letting Intel use AMD IGP tech would be criminally inept.

There is just nothing left to defend this rumor...
 
Told you this was complete horse shit. The Intel->Nvidia license seems to be ongoing, even if the payments have stopped. And an on-package AMD GPU attached to an Intel CPU is yet another blue-sky pipe dream.

You'd probably have an easier time making a GPU on an M.2 card, if you really wanted small. And THAT would be replaceable, AND would require no cross-licensing.

This pipe dream of a GPU-on-package runs into the same thermal limits as any other high-end APU does. So no serious system could use both simultaneously in a small case, and larger systems would just be better served with a PCIe graphics card.

FINALLY, WHY THE FUCK WOULD AMD HELP INTEL?
A full-on purchase of the GPU business I could see, but anything less was just pure idiocy produced by dreamers who haven't the slightest clue how to run a business.

To quote Lisa:

"We're not looking at enabling a competitor to compete against our products"

That should have been obvious to all these big dreamers. It's pretty clear to me why she's running things, and the rest of you are not.

IMO people are starting to use conspiracy-theory-type logic to defend this. There is nothing left to stand on.

It's been denied by Intel and AMD.

The Nvidia license that Intel bought was perpetual for all patents issued up until early 2017, so Intel does NOT need a new GPU license.

The notion of putting AMD GPUs inside an Intel CPU package is pretty much toxic to both Intel and AMD:

  • For Intel it would be an admission that they can't design a decent IGP after coming so far.
  • For AMD it would give up their only potential Raven Ridge advantage. Raven Ridge is arguably their most important product. To undercut it by letting Intel use AMD IGP tech would be criminally inept.

There is just nothing left to defend this rumor...

I'm not defending the rumor; I'm just looking at it from the perspective of it being true.
 
I'm not defending the rumor; I'm just looking at it from the perspective of it being true.
That is technically a defense. Semantics aside, they allowed this rumor to linger a long time before trying to squash it, so it is possible they were negotiating and it fell through. It's also possible the original source was trusted enough to run with the info; none of this means the reporting was bad. We don't really know what was going on behind closed doors, and speculative reporting always has a chance of falling through, so there's no reason to get up in arms over it.
 
high-end APUs run into thermal limits? What?

You can only extract so much heat from a single IHS, or did you fail basic thermodynamics?

Think of all the fun people are having keeping their 7700K cool. Now make it more complicated because you have a 120W beefcake RX 570 on the same die.

It's a whole lot easier to cool two separate devices. That's how the Surface Book does its magic.

An APU of any appreciable performance level (say, 4x more powerful than AMD's current IGP) will hit thermal limits. Anything less, and we're back to the "why the hell would you do that if it's barely faster than existing IGP" question.
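Quick back-of-envelope in Python on why one IHS is the bottleneck (the TDP figures are my assumptions, not official specs):

# Assumed TDPs: ~91 W for a 7700K-class CPU, ~120 W for an RX 570-class GPU.
cpu_tdp_w = 91
gpu_tdp_w = 120
combined_w = cpu_tdp_w + gpu_tdp_w
print(combined_w)                      # ~211 W pushed through a single IHS
big_air_cooler_w = 250                 # assumed capacity of a large tower cooler
print(big_air_cooler_w - combined_w)   # ~39 W of headroom, and only in a big case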
 
Isn't RX 570 way beyond high-end APU possibilities?

It's way beyond what AMD is going to deliver for Raven Ridge, which is expected to have about 700 stream processors, and obviously a lot less memory bandwidth than even budget GPUs.
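For scale, a rough Python estimate of where ~700 stream processors lands (the clock is an assumption, since Raven Ridge isn't out yet):

# GFLOPS = stream processors * 2 FLOPs per clock (FMA) * clock in GHz
sp_count = 704      # rumored Raven Ridge configuration
clock_ghz = 1.1     # assumed iGPU clock
print(sp_count * 2 * clock_ghz)   # ~1549 GFLOPS: RX 550 territory, nowhere near an RX 570's ~5100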
 
Think of all the fun people are having keeping their 7700K cool.
Because its thermal transfer to the IHS is impeded by glue?
Now make it more complicated because you have a 120W beefcake RX 570 on the same die.
It will actually become much easier, because now you can just solder that 200+ mm² die to the IHS. Yes, the TDP will have to be set pretty high (like 95W minimum for a package with a decent GPU), but that's the price of the GPU; what can you do? Hell, if the GPU was sufficiently energy efficient, you could probably fit something like a 7700HQ + 1050 into that package with plenty of room to spare.
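A minimal sketch of that package math, with both TDPs assumed:

cpu_w = 45             # 7700HQ-class mobile CPU TDP
gpu_w = 50             # assumed board power for a mobile GTX 1050-class GPU
print(cpu_w + gpu_w)   # 95 W, right at the package TDP floated above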
 
Well yeah, but if they were going to deliver on-package graphics for Intel, you bet your ass it should be at least twice as fast as Raven Ridge. Otherwise why bother?

Oh, that's what you meant? I've never really considered the scenario where AMD gives Intel better APUs to be believable.
 
In my opinion, this does not preclude a third party asking for a specific product combining both Intel and AMD tech.
Nor does it preclude integrating two separate dies on the same package, which wouldn't involve IP licensing but rather acquiring another company's product. No licensing deals are involved in putting discrete cards in a system as is. That was always the likely scenario.

This pipe-dream of a GPU-on-package runs-into the same thermal limits as any other high-end APU does. So no serious system could use both simultaneously in a small case. And larger systems would just be better-served with a PCIe graphics card.
Yet we already have CPUs and GPUs capable of far more heat dissipation with even smaller coolers? It's a miracle! Just imagine all the smoke given off by a graphics card over 120W as it struggles to stay under 50°C. It breaks all those laws of physics, and combines the CPU and GPU with double the transistor density of even the most compact fabrication process!

Or a large CPU-sized cooler could dissipate well over 300W, exceeding even discrete-card specifications, if the <85°C operating temperature of most GPUs were the target. Even a Vega 10 plus an 1800X would barely need to pass that limit. Maybe a dual Vega 10 plus Ryzen APU for even better performance at less wattage?

I'm sure the guys with PhDs who write the papers don't know nearly as much about this as the people on a hardware forum.
 
APUs have never made sense from a value perspective... at least for us here at HardForum. A shitty processor and a low-end GPU is cheaper and multiple times better. A few months ago I did an 860K/1050 Ti build for traveling rather than an APU.

Now, if AMD could get their GPUs into Intel's chips, they'd certainly do it. You basically get profit off the majority of the market with no risk. Businesses make deals like that all the time.
 
I think the most generous interpretation at this point is that there was some deal going on, but it's now fallen by the wayside.

So of course it's being denied, and now it's simply the clout of Kyle's word remaining. However, that still weighs heavily for me at least, given I've never seen an article quite like this from him. It felt odd when he first came out with it, like there would be a follow-up joke post or something...
 
APUs have never made sense from a value perspective... at least for us here at HardForum. A shitty processor and a low-end GPU is cheaper and multiple times better. A few months ago I did an 860K/1050 Ti build for traveling rather than an APU.

Now, if AMD could get their GPUs into Intel's chips, they'd certainly do it. You basically get profit off the majority of the market with no risk. Businesses make deals like that all the time.
APUs in a corporate setting are where the value is. They are meant for workplaces with 4,000+ PCs that don't use a lot of electricity. That can be a lot of money saved on energy costs.
 
APUs in a corporate setting are where the value is. They are meant for workplaces with 4,000+ PCs that don't use a lot of electricity. That can be a lot of money saved on energy costs.

For a corporate setting, doesn't a more energy-efficient Intel CPU with an IGP make sense? It's not like they need a GPU for games... and professionals generally use Quadros.
 
For a corporate setting, doesn't a more energy-efficient Intel CPU with an IGP make sense? It's not like they need a GPU for games... and professionals generally use Quadros.
Sometimes. However, some applications don't work well with Intel's embedded GPU, and quite frankly they fail often.
 
For a corporate setting, doesn't a more energy-efficient Intel CPU with an IGP make sense? It's not like they need a GPU for games... and professionals generally use Quadros.

Absolutely for me.

My company cheaped out and went with AMD Trinity dual-cores instead of Intel Ivy Bridge dual-cores. I regret it every day, since it's slower than my old Core 2 Duo.

More powerful APUs are meaningless for work, except for those using CAD. And most of those will pay for a dedicated card, since Quadro/FirePro cards unlock many times the performance.

Sure, I have way more GPU power than the old GMA950, but you wouldn't know it.
 
Absolutely for me.

My company cheaped out and went with AMD Trinity dual-cores instead of Intel Ivy Bridge dual-cores. I regret it every day, since it's slower than my old Core 2 Duo.

More powerful APUs are meaningless for work, except for those using CAD. And most of those will pay for a dedicated card, since Quadro/FirePro cards unlock many times the performance.

Sure, I have way more GPU power than the old GMA950, but you wouldn't know it.
That's funny, because the company I work for went Intel-only, and our hardware team has to be a 24-hour shop to keep things going. I am one of seven who fix Windows and LAN issues for this company's location, and we are business hours only. We have a four-year refresh cycle that should really be a three-year cycle. But hey, corporations are cheap.
 
What's the use case for APUs? I'm struggling to think why one needs an APU in a desktop setting when you can shove a GTX 1080 + i7 7700K into a case the size of a PS4. Even for mobile I don't see a niche for it.

An APU, IMO, is an oxymoron. You pair a powerful CPU and a powerful GPU on the same die for space savings, yet the irony is that to be able to cool it you absolutely need a lot of space for heatpipes, fans and heatsinks. Smaller is not better in this case, since even with an IHS it's difficult to fit many heatpipes over the die.

The only laptops with the beefiest APU (Carrizo/Bristol Ridge) are as heavy as the lightest laptops featuring a 6700HQ/7700HQ + 1070. Both hover around 2.2-2.8kg. And I chose the top-of-the-line APU model, maxing out at 35-45W.

APU laptops: 15.6 inch, 2.2-2.7kg. i7 x700HQ + 1070 laptops: 2.2-2.7kg. TDP-wise the x700HQ + 1070 combo is around 4-5 times higher. So where's the APU advantage?
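The arithmetic behind that 4-5x figure, with assumed mobile TDPs:

apu_w = 35               # Bristol Ridge mobile, low end of its 35-45 W range
cpu_w, gpu_w = 45, 115   # assumed i7-x700HQ and mobile GTX 1070 TDPs
print((cpu_w + gpu_w) / apu_w)   # ~4.6x; against a 45 W APU it's ~3.6x, hence "4-5 times"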
 
What's the use case for APUs? I'm struggling to think why one needs an APU in a desktop setting when you can shove a GTX 1080 + i7 7700K into a case the size of a PS4. Even for mobile I don't see a niche for it.

An APU, IMO, is an oxymoron. You pair a powerful CPU and a powerful GPU on the same die for space savings, yet the irony is that to be able to cool it you absolutely need a lot of space for heatpipes, fans and heatsinks. Smaller is not better in this case, since even with an IHS it's difficult to fit many heatpipes over the die.

The only laptops with the beefiest APU (Carrizo/Bristol Ridge) are as heavy as the lightest laptops featuring a 6700HQ/7700HQ + 1070. Both hover around 2.2-2.8kg. And I chose the top-of-the-line APU model, maxing out at 35-45W.

APU laptops: 15.6 inch, 2.2-2.7kg. i7 x700HQ + 1070 laptops: 2.2-2.7kg. TDP-wise the x700HQ + 1070 combo is around 4-5 times higher. So where's the APU advantage?

Sweet! Send me $1500 so I can build my mother a new email/facebook/web game machine!
 
Sweet! Send me $1500 so I can build my mother a new email/facebook/web game machine!

What's wrong with Core M/Celeron for that use case? Why do you need an APU? All the use cases you mentioned favor the stronger CPU cores of Core M/Atom over AMD APUs.

Your mother can get by with a $250 Celeron-powered notebook. Why does she need an APU when she won't use the GPU?
 
What's wrong with Core M/Celeron for that use case? Why do you need an APU? All the use cases you mentioned favor the stronger CPU cores of Core M/Atom over AMD APUs.

Your mother can get by with a $250 Celeron-powered notebook. Why does she need an APU when she won't use the GPU?

Core M/Celeron are essentially APUs as well.

The new Raven Ridge will just be a better one.

Most PCs sold today are laptops, and AMD needs a good APU primarily for the laptop market.

I am always reading the GOG forums, and the people having the most trouble running games are people using Intel laptops and their integrated graphics.

This market would likely do a lot better with a nice AMD APU.
 
Yeah, current Intel IGPs are all held back by their memory bandwidth. I don't see how AMD will overcome that unless they use something like HBM, or something other than DDR4.
 
If that APU is as memory-limited as Intel IGPs are, then it will struggle just as much.

The reason Intel GPUs are failing in old games on GOG is not performance, but drivers.

Also, AFAIK even AMD's current APUs are faster than Intel's. Intel has to use an expensive eDRAM cache to catch AMD.

It will be interesting to see how it turns out, but it seems unlikely AMD would boost shader units by 50% only to deliver the same performance they have today.
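To put rough numbers on that bandwidth gap (both configurations assumed for illustration):

# Dual-channel DDR4-2400: channels * bus width in bytes * GT/s
ddr4_gbs = 2 * 8 * 2.4        # ~38.4 GB/s, shared with the CPU
# Budget GDDR5 card, e.g. a 128-bit bus at 7 Gbps per pin:
gddr5_gbs = (128 / 8) * 7.0   # ~112 GB/s, roughly 3x more, all for the GPU
print(ddr4_gbs, gddr5_gbs)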
 
The older the game, the lower the advantage, last time I checked. In GTA V AMD is faster than even the eDRAM models, but in old stuff they...


Huh. How old are the games we're talking about, just in case?
No idea, but I can tell you from work that Intel's graphics drivers still have a tough time with hardware acceleration on video and embedded videos in PowerPoint. I wouldn't even try games on those chips.
 
The older the game, the lower the advantage, last time I checked. In GTA V AMD is faster than even the eDRAM models, but in old stuff they...


Huh. How old are the games we're talking about, just in case?

10+ years old. I think there are more problems because back then games were never even tested running on Intel IGPs, so there was zero work done ensuring that they worked, and Intel probably doesn't run regression tests with games that old either...
 
The reason Intel GPUs are failing in old games on GOG is not performance, but drivers.

Also, AFAIK even AMD's current APUs are faster than Intel's. Intel has to use an expensive eDRAM cache to catch AMD.

It will be interesting to see how it turns out, but it seems unlikely AMD would boost shader units by 50% only to deliver the same performance they have today.

This one is pretty simple. A Vega APU should be much more efficient with bandwidth thanks to 3 changes:

1. Better primitive discard/culling steps in their new geometry engine.
2. A binning cache for the rasterizer: less traffic between the APU and system RAM, more on-chip cache traffic.
3. ROPs as clients of the L2: for deferred-rendering games, less traffic to off-chip RAM.

They don't need HBM for a +50% iGPU performance uplift. HBM would only be required if they ever go big-APU, like PS4 Pro or Scorpio class.

One has to wonder whether there's a market for such a strong APU w/ 2-4GB of HBM2, though.
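A hypothetical illustration of how the three reductions listed above could compound; the percentages are my guesses, not AMD numbers:

# Assumed independent reductions in off-chip memory traffic:
savings = {"primitive discard": 0.10, "binning rasterizer": 0.20, "ROPs in L2": 0.10}
traffic = 1.0
for name, s in savings.items():
    traffic *= (1 - s)
print(round(traffic, 2))   # ~0.65x the traffic, i.e. ~1.5x effective bandwidth,
                           # enough in principle to feed ~50% more shaders without HBM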
 
This one is pretty simple. A Vega APU should be much more efficient with bandwidth thanks to 3 changes:

1. Better primitive discard/culling steps in their new geometry engine.
2. A binning cache for the rasterizer: less traffic between the APU and system RAM, more on-chip cache traffic.
3. ROPs as clients of the L2: for deferred-rendering games, less traffic to off-chip RAM.

They don't need HBM for a +50% iGPU performance uplift. HBM would only be required if they ever go big-APU, like PS4 Pro or Scorpio class.

One has to wonder whether there's a market for such a strong APU w/ 2-4GB of HBM2, though.


Can't assume that at all. It will be better, but we have no idea how much better; nV gave their estimates on Pascal, and most of its bandwidth savings came from compression...

Also, Intel has had primitive discard for just about as long as nV.

So we're kinda left with the last 2. If AMD's is as efficient as nV's and can be used at all times (binned rasterizer), that will get around 20%, and nV is on its second generation, so... let's say 10% for AMD sounds fair.

The ROP L2 cache is coherent in Intel's IGP if I remember correctly, and that is what it looks like in the diagram, so yeah.

Sorry, I was looking at the diagram wrong; Intel's IGP uses eDRAM for that, so Intel has a pretty BIG advantage there.
 
Sorry, I was looking at the diagram wrong; Intel's IGP uses eDRAM for that, so Intel has a pretty BIG advantage there.

That would be nice, if Intel had eDRAM on their regular APUs. But Intel figured it was too expensive to keep doing such a design. Their eDRAM-powered Crystal Well APU was ridiculous: an entire second die devoted to eDRAM.

2GB of HBM2 should serve any APU well with Vega's HBCC: it keeps costs low, while its effectiveness as a cache keeps performance high despite low system-RAM bandwidth.
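For reference, the bandwidth even a single small HBM2 stack would bring (pin speed assumed):

# One 1024-bit HBM2 stack at an assumed 1.6 Gbps per pin:
hbm2_gbs = (1024 / 8) * 1.6
print(hbm2_gbs)   # ~205 GB/s, over 5x dual-channel DDR4's ~38 GB/s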
 