From ATI to AMD back to ATI? A Journey in Futility @ [H]

This is probably bad news for us consumers on the CPU side of things and good news on the GPU side. Less competition for Intel, more competition for NVIDIA. AMD survives a few more quarters, gives up on the chance to build a good CPU for the chance to build a good GPU. The perfect storm would be a GPU built on Intel's manufacturing process.

Can you imagine if that was the deal they struck?? lol
AMD: We'll let you use our IP, if you let us use your manufacturing nodes! *wink*

However, with that contract AMD has with GloFo, it wouldn't really be in AMD's best interests given the monetary penalty they'd incur :\
(Details on that can be found here, starting where it says "Meanwhile, although AMD is")

I wonder if Intel would even go for that?
 
RIP Intel graphics engineers.

Though I remember Koduri saying that graphics talent is in high demand among all the mobile players as well as AMD and Nvidia, so maybe a large chunk of them will land on their feet.
 
RIP Intel graphics engineers.

Though I remember Koduri saying that graphics talent is in high demand among all the mobile players as well as AMD and Nvidia, so maybe a large chunk of them will land on their feet.
I don't see any reason to believe they'd be gone.
 
RIP Intel graphics engineers.

Though I remember Koduri saying that graphics talent is in high demand among all the mobile players as well as AMD and Nvidia, so maybe a large chunk of them will land on their feet.

I think they are just moving from the Intel stuff they were working on to the AMD stuff.
 
Multiple small dies on the same process isn't going to help either, lol, because cost increases: now you need a very expensive interconnect, on top of the interposer, on top of extra memory for the two or more GPUs to communicate.

It all comes down to when those other components become cheap enough to manufacture to sustain current price brackets.

The only thing it will help with is yields of bigger chips vs. smaller ones, and right now that is why nV is making the performance chips first before going to the enthusiast chips, or doing the professional cards first, since volume is lower there and the higher prices of those cards can cover the increased cost of lower yields.
AMD indicated that the better yields of a smaller-die GPU outweigh the added cost of hooking them up. An interposer is just another chip, and costs could get cheaper than a board design with 32 GB or more of GDDR6 while maintaining good signal quality for the faster RAM. There is a clear limit to how big you can go on a process - GP100 is rather large, and going larger will be even more difficult. So what do you do for the next 3-6 years? Stall? If costs stay high, AMD could just take the HPC market away from Nvidia with more powerful multi-GPU/HBM configurations that are 2x+ more powerful than Nvidia's greatest. GP100 needed HBM2 because GDDR5(X) and even GDDR6 would be less effective and would probably not work in the long run when stacking hundreds of these GPUs together. HBM is a good solution which allows some rather versatile configurations on the interposer, with multiple processors (same or different types) in a very compact space.

What will actually happen is anyone's guess, but it looks to me like AMD is on a reasonable path. Not so sure about Nvidia.
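The yield argument above can be sketched with a toy Poisson defect model (die yield ≈ e^(-D·A)). All figures below - defect density, wafer cost, die sizes - are made-up illustrative numbers, not real foundry data:

```python
import math

def good_die_cost(die_area_mm2, defect_density_per_mm2=0.001,
                  wafer_cost=6000.0, wafer_diameter_mm=300.0):
    """Cost per *good* die under a simple Poisson yield model."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    dies_per_wafer = int(0.9 * wafer_area / die_area_mm2)  # ~10% lost at the wafer edge
    die_yield = math.exp(-defect_density_per_mm2 * die_area_mm2)  # Poisson: e^(-D*A)
    return wafer_cost / (dies_per_wafer * die_yield)

# One big ~600 mm2 die vs. four ~150 mm2 dies (ignoring interposer/packaging cost)
big = good_die_cost(600.0)
small = 4 * good_die_cost(150.0)
print(f"one big die: ${big:.0f}, four small dies: ${small:.0f}")
```

Because yield falls off exponentially with area while the cost per wafer is fixed, the four small dies come out cheaper in good silicon, which is exactly why the interposer and interconnect overhead becomes the deciding factor.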
 
I cannot believe some of you guys actually think there's a chance Intel will have a Radeon-designed die in their CPUs... I hate to break it to you, but not on this planet!

It's a cross-licensing deal. Its purpose is to let Intel continue its iGPU production without Nvidia or AMD suing the holy mary out of them. It's gonna be business as usual, and that is exactly the point. As for what AMD gets out of it apart from cash, well, that's an actually interesting discussion. My guess would be some kind of HT-related patents so AMD can launch Zen with similar technology.
 
I cannot believe some of you guys actually think there's a chance Intel will have a Radeon-designed die in their CPUs... I hate to break it to you, but not on this planet!

It's a cross-licensing deal. Its purpose is to let Intel continue its iGPU production without Nvidia or AMD suing the holy mary out of them. It's gonna be business as usual, and that is exactly the point. As for what AMD gets out of it apart from cash, well, that's an actually interesting discussion. My guess would be some kind of HT-related patents so AMD can launch Zen with similar technology.
Or to allow China to produce x86/A64 CPUs for servers, or something like that. Ever since AMD sold their fab business, it has been rather shaky as to who can produce x86/A64 CPUs. You would think Intel would rather have AMD produce all of their CPUs on Intel's process than see it propagate to multiple fabs.
 
AMD indicated that the better yields of a smaller-die GPU outweigh the added cost of hooking them up. An interposer is just another chip, and costs could get cheaper than a board design with 32 GB or more of GDDR6 while maintaining good signal quality for the faster RAM. There is a clear limit to how big you can go on a process - GP100 is rather large, and going larger will be even more difficult. So what do you do for the next 3-6 years? Stall? If costs stay high, AMD could just take the HPC market away from Nvidia with more powerful multi-GPU/HBM configurations that are 2x+ more powerful than Nvidia's greatest. GP100 needed HBM2 because GDDR5(X) and even GDDR6 would be less effective and would probably not work in the long run when stacking hundreds of these GPUs together. HBM is a good solution which allows some rather versatile configurations on the interposer, with multiple processors (same or different types) in a very compact space.

What will actually happen is anyone's guess, but it looks to me like AMD is on a reasonable path. Not so sure about Nvidia.


Yet nV has shown they can design and manufacture big-die chips at acceptable yields much better than AMD can, gen after gen, ever since the G80. Why is that? It isn't because of the fab: nV and AMD both used TSMC in the past, and we have seen AMD have supply issues.

It's a matter of perspective: just because AMD can't do it because they are lacking something doesn't mean that is the way of the future for the industry ;) at least not yet.

It doesn't matter if it's one chip or two chips; both will consume similar amounts of power (per mm²) if both are designed well.

And no, the HPC market needs software, something AMD (and Intel, btw) are lagging behind on; this is a hurdle they must overcome before taking on nV. It is also why nV has made big strides into the HPC market, to the dismay of Intel, which is pushing hard to get the next-gen Phi out before nV can get its new generation out. Unfortunately for Intel, 10nm is delayed, and nV has now pushed Volta up to 16/14nm. Intel needs 10nm to push Phi (Knights Landing) further, as right now on 14nm Phi is 647 mm² if I remember correctly, and that is the biggest they can go.

This is why AMD's Boltzmann Initiative hasn't taken off.
 
I cannot believe some of you guys actually think there's a chance Intel will have a Radeon-designed die in their CPUs... I hate to break it to you, but not on this planet!

It's a cross-licensing deal. Its purpose is to let Intel continue its iGPU production without Nvidia or AMD suing the holy mary out of them. It's gonna be business as usual, and that is exactly the point. As for what AMD gets out of it apart from cash, well, that's an actually interesting discussion. My guess would be some kind of HT-related patents so AMD can launch Zen with similar technology.
PC World said:
“To my understanding, Intel has a team of about ~1,000 engineers working on their forward-looking iGPU technology,” Bennett told PCWorld. “Basically, that work will be scrapped and that team and their work will be replaced with AMD teams and technology going forward. There are also Apple implications here as well, and this deal is good for Apple assuredly.”
That doesn't really jibe with Intel's graphics engineers deciding to pursue jobs at other companies. Who knows, maybe Intel designs iGPUs without engineers.
 
Yet nV has shown they can design and manufacture big-die chips at acceptable yields much better than AMD can, gen after gen, ever since the G80. Why is that? It isn't because of the fab: nV and AMD both used TSMC in the past, and we have seen AMD have supply issues.

It's a matter of perspective: just because AMD can't do it because they are lacking something doesn't mean that is the way of the future for the industry ;) at least not yet.

It doesn't matter if it's one chip or two chips; both will consume similar amounts of power (per mm²) if both are designed well.

And no, the HPC market needs software, something AMD (and Intel, btw) are lagging behind on; this is a hurdle they must overcome before taking on nV. It is also why nV has made big strides into the HPC market, to the dismay of Intel, which is pushing hard to get the next-gen Phi out before nV can get its new generation out. Unfortunately for Intel, 10nm is delayed, and nV has now pushed Volta up to 16/14nm. Intel needs 10nm to push Phi (Knights Landing) further, as right now on 14nm Phi is 647 mm² if I remember correctly, and that is the biggest they can go.

This is why AMD's Boltzmann Initiative hasn't taken off.
Well, that holds true today; the question is whether it will tomorrow, when there is no new process tech out. A gamble by AMD, but it looks logical as well.

Vega, depending on how competitive it is, will be AMD's answer near term, but after that? Navi looks like multiple GPUs on an interposer - what will Nvidia have?
 
Just as an example, Intel with the P4 took 5 years to get Nehalem out; how long did it take AMD to go from Phenom to Zen? 10 years.
What is your logic here? I fail to see any. Are you talking arch to arch? If so, you're leaving out a few to make your point. If you have one.
 
In those 5 years the P4 went through quite a few overhauls. The Phenom didn't until they got the II out, but they still had to rest on it because Bulldozer was an overwhelming failure.
 
In those 5 years the P4 went through quite a few overhauls. The Phenom didn't until they got the II out, but they still had to rest on it because Bulldozer was an overwhelming failure.
AMD actually had a number of new Bulldozer cores, mostly for the APUs (they had to if they wanted some sort of presence in the mobile market, or to keep one toe in the door), each with better power characteristics.
 
AMD actually had a number of new Bulldozer cores, mostly for the APUs (they had to if they wanted some sort of presence in the mobile market, or to keep one toe in the door), each with better power characteristics.
Well yeah, Bulldozer has gone through a ton of iterations, but the topic I was talking about was the Phenom and P4. AMD did well in making the Bulldozer family not such a waste.

But that doesn't mean it wasn't still a poorly executed product. The Phenom line had a future, a better one than Bulldozer eventually did. It's what happens when you try to rewrite your entire product stack.
 
Given that damn near all the AMD GPU tech still comes out of "ATi" in Markham, Ontario, this thought has always come to mind: why hasn't NVIDIA or Intel put a GPU engineering office across the street and simply taken all their employees? (Apple, Samsung, or Qualcomm could be added to that list as well if you stretch a bit.) Most of those engineers and programmers would walk across the street for a nickel in a second; they have no loyalty to AMD. NVIDIA and Intel could destroy the entire AMD GPU infrastructure quickly. Then again, maybe they do not want AMD wiped out; maybe they want AMD to remain on the 3rd rung, a very low rung. All that said, I don't think Intel is even slightly concerned with AMD in terms of CPUs. If Intel is threatened by any tech, it is ARM.
 
Yet Intel has offered its fabs to ARM licensees. If Intel were that concerned about a chip with a nearly zilch software stack on the desktop, I doubt they would have.
 
As someone who actually spent a bit of time on an Iris Pro ultrabook, I was pretty impressed. An integrated GPU playing modern games at okay quality settings is quite the contrast to the old "Extreme Graphics" bullshit that was barely better than useless.


I think with generation two and an optimised architecture we'll see HBM shine. We also see this potential in some benchmarks, where Fiji punches well above where it should with 4 GB of VRAM.

It would be interesting if AMD and Intel shared fab space though... never happen.
 
Yet Intel has offered its fabs to ARM licensees. If Intel were that concerned about a chip with a nearly zilch software stack on the desktop, I doubt they would have.

Samsung makes chips for Apple; your argument is invalid. If Intel has excess capacity, it makes sense for them to contract it out. The node advantage alone is not going to make or break ARM processors.

As someone who actually spent a bit of time on an Iris Pro ultrabook, I was pretty impressed. An integrated GPU playing modern games at okay quality settings is quite the contrast to the old "Extreme Graphics" bullshit that was barely better than useless.


I think with generation two and an optimised architecture we'll see HBM shine. We also see this potential in some benchmarks, where Fiji punches well above where it should with 4 GB of VRAM.

It would be interesting if AMD and Intel shared fab space though... never happen.

Which ultrabook did you try? I'm looking for one with at least 60Whr battery. For now I'll just use my Lenovo Flex 2 14 that has the 30Whr built in battery, plus the 42Whr battery I taped to the back of the screen.
 
I cannot believe some of you guys actually think there's a chance Intel will have a Radeon designed die in their CPU's... I hate to break it to you, but not on this planet!

It's a cross-licensing deal. It's purpose is to continue Intel's iGPU production without Nvidia or AMD suing the holy mary out of them. It's gonna be business as usual, and that exactly is the point. As for what AMD gets out of it apart from cash, well that's an actual interesting discussion. My guess would be some kind of HT-related patents so AMD can launch Zen with similar technology.


I don't know if this deal is purely about not being sued or if Intel will actually put AMD GPU tech into its integrated GPUs, but as far as I'm concerned, putting AMD GPU tech into Intel integrated graphics would be a boost for AMD.

It would naturally, vastly increase the install base of GCN (or whatever newer equivalent comes out of AMD and goes into Intel parts). The lowest common denominator of ever more capable integrated GPUs would favor AMD, and that would likely carry over to their higher-end GPUs as well.
 
Which ultrabook did you try? I'm looking for one with at least 60Whr battery. For now I'll just use my Lenovo Flex 2 14 that has the 30Whr built in battery, plus the 42Whr battery I taped to the back of the screen.

When they launched the platform about two years back, it was the top-spec UX301LA? with the dual SSDs. That thing is a beast for anything outside of heavy graphics use. I'm not a heavy gamer when on the go, but what I saw was pretty impressive, especially considering the thermal envelope it works with. With light use I could get around 8-9 hours out of it, and it weighed two tenths of fuck all. Check out YouTube gaming reviews of the Iris Pro; you'll be pretty surprised at what it can do.

Lmfao at taping the battery to the back... it's one way to go about it. No replacement for battery displacement, lol!

Also had some time on the 6700K iGPU; even on a 1440 ultrawide it was fine for normal desktop use, though I didn't get a chance to try gaming.
 
Hmm, Intel and AMD inside. Makes sense for both: Intel can ditch the money and effort spent on a GPU, and AMD gets the massive install base and more cash. It would make it easier for AMD to push their GPU tech and marginalize Nvidia and their hold on developers. If I were an Nvidia stockholder, I would be worried about this.
 
In those 5 years the P4 went through quite a few overhauls. The Phenom didn't until they got the II out, but they still had to rest on it because Bulldozer was an overwhelming failure.
Ridiculous statement. What point are you trying to make? LOL.
Yeah, the P4 went through many overhauls because it sucked ass compared to AMD's tech at the time; Intel had the time and money to do so.
The Phenom II came out relatively quickly after the Phenom I; BD wasn't even in existence then. Are you rewriting history on the fly now?
"Had to rest on it because Bulldozer was an overwhelming failure" - WTF, dude! They didn't rest on anything. The new arch, for better or worse, was the product they produced and sold.
 
AMD has mastered cheap, decent integrated graphics. Intel has, apparently, not really found a way to make this a reality on their end without taking up a ton of silicon real estate and dramatically increasing costs. For MOST of us on here this is irrelevant; we buy discrete cards and don't really care. But take my wife, for example: I bought her a cheap AMD APU laptop and it'll play Dragon Age: Inquisition, Sims 4, etc. at playable framerates, and I paid a little over $300 for it. That's pretty incredible in my mind. My laptop with an i7-6700HQ, which can't do ANY form of gaming on its own (even though the CPU itself costs as much as I paid for my wife's entire laptop), cost $1500 with a decent dedicated graphics card.

So make no mistake... AMD has something to offer Intel. If they could combine intellectual property and come out with a middle-of-the-pack Kaby Lake with a Bristol Ridge-level iGPU, you'd see a revolution in the $500 laptop market. It would practically negate the low-end dedicated GPU market in laptops, decrease manufacturing costs (the extra copper and cooling for a discrete GPU is NOT cheap), and really bridge the gap between the consumer gamer and enthusiast products.

If Intel wanted to develop a better GPU using another company's tech, they wouldn't need to license it from AMD; as it is, Intel already has a cross-licensing agreement with Nvidia. Why would they license AMD tech when they can easily do it under their current license with Nvidia? And yet Intel still isn't "adopting" Nvidia tech in their GPUs, but instead keeps developing their GPUs on their own. That's why some people see this as no different from what Intel and Nvidia are doing right now. Intel probably decided to go with AMD once the current agreement with Nvidia expires next year because AMD is probably charging less than what Intel has been paying Nvidia.
 
If Intel wanted to develop a better GPU using another company's tech, they wouldn't need to license it from AMD; as it is, Intel already has a cross-licensing agreement with Nvidia. Why would they license AMD tech when they can easily do it under their current license with Nvidia? And yet Intel still isn't "adopting" Nvidia tech in their GPUs, but instead keeps developing their GPUs on their own. That's why some people see this as no different from what Intel and Nvidia are doing right now. Intel probably decided to go with AMD once the current agreement with Nvidia expires next year because AMD is probably charging less than what Intel has been paying Nvidia.
The cross-licensing agreement expires this spring; plus, Intel was never too keen on it to begin with.
 
The cross-licensing agreement expires this spring; plus, Intel was never too keen on it to begin with.

My point is that if Intel wants to develop a better GPU using another company's tech, they don't really need to license AMD IP when they already have Nvidia under the current license. The licensing agreement will expire soon, but Intel can always renew it if they want to. Not to mention Intel is chasing power efficiency, and we know which company has the better architecture for that. So if Intel decided to license IP from AMD, it's probably for the same exact reason they licensed Nvidia IP in the first place, not about integrating AMD tech into their iGPU like some people think. But that's only part of the story. Back in May/June, Kyle also talked about how RTG wants to break away from AMD and join Intel instead (a rebellion by RTG, led by Raja, against AMD; doesn't the RX 480 "Rebellion" campaign make more sense now?). If that's true, then I can accept the notion of RTG building GPUs for Intel, though I imagine AMD will probably not let that happen. Imagine if AMD actually had to pay Intel for the graphics tech they use in their APUs because RTG was owned by Intel.
 
If Intel wanted to develop a better GPU using another company's tech, they wouldn't need to license it from AMD; as it is, Intel already has a cross-licensing agreement with Nvidia. Why would they license AMD tech when they can easily do it under their current license with Nvidia? And yet Intel still isn't "adopting" Nvidia tech in their GPUs, but instead keeps developing their GPUs on their own. That's why some people see this as no different from what Intel and Nvidia are doing right now. Intel probably decided to go with AMD once the current agreement with Nvidia expires next year because AMD is probably charging less than what Intel has been paying Nvidia.

Again and again we have to say this: please read up on what happened in 2011. It was a court issue, more like a settlement. Intel didn't want anything to do with Nvidia; this was never something they both mutually agreed on. So Intel paid them off; if they had actually used Nvidia technology, they would have been stuck with them even longer. Intel didn't go to Nvidia and say "we need your technology, please help" - it only happened because of a legal issue.
 
The cross-licensing agreement expires this spring; plus, Intel was never too keen on it to begin with.

And it will get renewed, else it will be hard to sell Kaby Lake, later Coffee Lake, and, for example, Goldmont ;)
 
If Intel wanted to develop a better GPU using another company's tech, they wouldn't need to license it from AMD; as it is, Intel already has a cross-licensing agreement with Nvidia. Why would they license AMD tech when they can easily do it under their current license with Nvidia? And yet Intel still isn't "adopting" Nvidia tech in their GPUs, but instead keeps developing their GPUs on their own. That's why some people see this as no different from what Intel and Nvidia are doing right now. Intel probably decided to go with AMD once the current agreement with Nvidia expires next year because AMD is probably charging less than what Intel has been paying Nvidia.

You also realize that right now, with NVIDIA pushing into HPC and the server market, they're ultimately a bigger threat to Intel's bottom line than AMD, right? I'm assuming Intel doesn't want to keep funding the people working their hardest to put them out of business. AMD and Intel both compete on x86 chips, and as long as software designers keep making software for x86, there is enough business for both of them. If ARM or GPU technologies take off, THAT hurts Intel's bottom line far more than AMD taking a 5% chunk of the x86 server market. AMD plays in the professional space, but their bread and butter is lower-cost consumer-grade hardware. NVIDIA is going directly after a market that Intel has essentially owned 100% for decades, and right now has way more R&D budget to throw at it than AMD. I sure know who I'd think is the bigger threat to my company in that scenario.
 
Hey, remember when people were grilling Nvidia for creating a performance chart with a rather long bar for what were seen as small gains?
Well, this has made me smile.

[Image: Radeon Software Crimson ReLive presentation slide, marked "NDA Only - Confidential", page 40]


Nvidia marketing would be proud :)
Cheers
 
What is your logic here? I fail to see any. Are you talking arch to arch? If so, you're leaving out a few to make your point. If you have one.

Intel is capable of moving faster than AMD because they have multiple teams and multiple processor types in their pipelines. AMD only has one processor per generation being designed, because of the resources they have. How can you not see that? That is why AMD was stuck with Phenom and Bulldozer: once they made a choice, that was it, 5 years per chip.

Intel, on the other hand, has cut down to 3 years per node and 2 gens within that node, that is 1.5 years per gen (they are now going to 3 gens per node, i.e. 4.5 years per node, but still maintain the 1.5 years per gen).

How many people thought AMD was going to maintain their lead with the Athlon 64? I mean, did it even look remotely possible for a $5 billion company to go up against a $150 billion company and think that Intel wouldn't be able to compete? All of AMD's designs came from reverse-engineering Intel's chips and making them better - even the Athlons, which came from AMD's buyout of NexGen, who had reverse-engineered Intel's latest chip at the time, the PIII.

Intel screwed up, and that is why AMD was able to take the lead; it had nothing to do with AMD. On top of that, when Intel went back to their PIII design and updated it, they came out with the Core designs, which were already on the market in laptops a year after the P4s came out. And more updates came with Nehalem once it was ready for desktops.
 
Hey, remember when people were grilling Nvidia for creating a performance chart with a rather long bar for what were seen as small gains?
Well, this has made me smile.

Nvidia marketing would be proud :)
Cheers

Oi... I've always been an AMD homer, and I have an RX 480 in my desktop... but their marketing this generation has been almost too much for me. Really, all they need to market is: 98% of the performance for 10% less! And people will buy the damn things.
 
This is an admission by Intel that they'll never be able to build a good enough GPU on their own. The GPU space is down to two players.

Did you buy or sell the proper options to profit from this, Kyle? Whenever I read something on Seeking Alpha or a similar site, I know they'd love to have the inside knowledge you just had, and they wouldn't hesitate one second to leverage it as much as possible (not that I agree with that).
 
Intel is capable of moving faster than AMD because they have multiple teams and multiple processor types in their pipelines. AMD only has one processor per generation being designed, because of the resources they have. How can you not see that? That is why AMD was stuck with Phenom and Bulldozer: once they made a choice, that was it, 5 years per chip.

Intel, on the other hand, has cut down to 3 years per node and 2 gens within that node, that is 1.5 years per gen (they are now going to 3 gens per node, i.e. 4.5 years per node, but still maintain the 1.5 years per gen).

How many people thought AMD was going to maintain their lead with the Athlon 64? I mean, did it even look remotely possible for a $5 billion company to go up against a $150 billion company and think that Intel wouldn't be able to compete? All of AMD's designs came from reverse-engineering Intel's chips and making them better - even the Athlons, which came from AMD's buyout of NexGen, who had reverse-engineered Intel's latest chip at the time, the PIII.

Intel screwed up, and that is why AMD was able to take the lead; it had nothing to do with AMD. On top of that, when Intel went back to their PIII design and updated it, they came out with the Core designs, which were already on the market in laptops a year after the P4s came out. And more updates came with Nehalem once it was ready for desktops.

Relevant: http://seekingalpha.com/article/4029144-amd-takeover-barrier-golden-prize-x86-license
 
I'll mention one word: Apple.

Intel CPUs, AMD GPUs (in the models that have a discrete GPU). Integrate them, and even the lower-end models get effectively the same GPU as the higher-end models, making the software and drivers simpler; on some models, having a built-in AMD GPU may even be enough to drop the discrete GPU, saving BoM cost and board space.
 