From ATI to AMD back to ATI? A Journey in Futility @ [H]

Intel dumped Imagination Technologies twice, the last time for good.
http://www.standard.co.uk/business/...fter-intel-dumps-cut-price-stake-9567763.html
https://www.theguardian.com/busines...nologies-slips-as-intel-sells-remaining-stake

Apple seems to be working on its own GPU design. Like Intel, they have hired a lot of GPU-specific people.
To clarify: Intel will most likely have many options under consideration, as well as discussions with a number of other companies about buying them out. Whether that happens, only time will tell. At one point last year a rumor circulated about Intel buying out Imagination Technologies, and another about Apple buying them out. It did not pan out in the end, or at least not yet. In other words, what Kyle wrote could be 100% accurate at the time, but until money changes hands under a written agreement, it can evaporate into thin air as if it never existed. Yes, Intel at one point had, I believe, around a 10% investment in IMGTEC, but sold those shares for whatever reason.
 
OK. Got information back on this. Everything I have mentioned here is definitively correct.

Intel is licensing AMD GPU technology. No money has changed hands yet, so there is no financial impact until late in the year, hence nothing in the current earnings report.

The first product AMD is working on for Intel is a Kaby Lake processor variant that will be positioned in the entry-to-mid-level performance segment. It is in fact an MCM (multi-chip module), and will not be on-die with the Kaby Lake CPU. It is scheduled to come to market before the end of the year. I would expect more collaboration between AMD and Intel on the graphics side in the future.

And you can take all that to the bank.
 
Wait this might have been addressed already but if this is true, then what about AMD's APU business?
 

Exactly. There are 117 reasons why this won't happen. I can't think of a single one why it would.

If Intel wanted a gaming GPU, they would go with Nvidia: a truckload better perf/watt, a smaller die, and less memory bandwidth needed.
 
I also think we might be looking at this in totally the wrong way. Intel might be interested in AMD's graphics for their scientific compute power, not for their gaming graphics.
 
It seems clear to me (I may be way off) that this GPU deal may be real, but would largely be intended for the machine learning, AI, and autonomous car markets. Sure, it may trickle into the PC one day, but that is a big undertaking, considering that AMD GPUs are built to integrate with AMD CPUs (Construction/Cat/Zen cores). You can't just plop one next to an Intel CPU without an incredible amount of work.

There is no way this can take place on the current socket; an architecture refresh, coming at 10nm for Intel, would need to happen before this could ever work on the PC side.

That said, Intel could get behind ROCm and/or begin to integrate Radeon compute solutions into the mentioned markets to combat Nvidia's push into data centers and machine learning with their GPUs and ARM cores.
 
Or maybe Intel sees how Nvidia has a tendency to screw over companies who partner with them for IP, and they want a more trustworthy technology partner.
 
Or Intel sees Nvidia as one of their primary competitors in many sectors. They may also be able to take a lot of orders away from AMD's Ryzen and associated APUs after the first 3-4 years, if they pump a lot of cash into this.
I think AMD are insane to do a deal along these lines with Intel, but hey, AMD do seem to have a thing about squeezing Nvidia, or at least get hung up on them often, going by their events.
Cheers
 
The MCM design given to Intel still likely isn't on par with what AMD would have; at the very least, I doubt Kaby Lake works with Infinity Fabric all that well. So AMD should still retain a performance advantage while obviously moving a lot more product. They would likely be left with a rather commanding chunk of the market, considering Intel's current share.
 
I may have missed your recent posts on this, Kyle, but in light of this Intel-AMD deal (or is it Intel-RTG?) and the scenario you suggested 9 months ago: is this the beginning of an impending breakup between RTG and AMD, or has that faded since then?
 
I think our reporting on that caused some pretty big ripples inside AMD and quieted things down a lot on that front. When we published that editorial, licensing was just in the beginning phases of being discussed. I think much of that energy moved into the current licensing deal between Intel and AMD/RTG. It is for a very limited SKU right now, but I think it has a very big chance of getting much wider than it is currently. Some of the folks at Intel right now, fairly high up on the ladder, came from AMD as well. I think this will likely help pave the way for a wider AMD licensing agreement in the future.
 
I'm going to guess this specialized Kaby Lake processor is being designed for Apple, not for an ordinary desktop PC, based on what AMD said at this investor conference:

"Advanced Micro Devices' (AMD) Management Presents At Barclays Global Technology, Media And Telecommunications Conference (Transcript)

Unidentified Company Representative

Okay. On the licensing front, I won’t put you on the spot about specific recent rumors but say at least there has been some talk that you might have a partnership with one of your largest competitors, rather than talking about that specifically, could you just talk about your appetite more broadly for other licensing partnerships and what type of willingness you’d have to do those?

Devinder Kumar

Yes, I think, we've talked about IP monetization, in particular partnerships. So, AMD, with the treasure trove of IP that we have, we have 10,000-plus patents, half of them US-based. And we saw over the last two or three years that we can go ahead and partner with folks, where we don't want to directly enter a market. We will be very careful in terms of who we partner with, where it doesn't come back and directly compete with us in areas that we want to go put products in, and very happy to do on a fair basis a deal with the partner to go ahead and monetize that IP, get cash, benefit the P&L and balance sheet in particular."



For what it's worth.
 
Yeah, even AMD's CFO says they won't do such a deal if it competes with their own products, aka against their APUs in this case.

And Intel is supposed to have taped out a product without a deal signed and money exchanged? That smells big time of fairy tales and unicorns on its own.
 
Yeah, which is why I said 3-4 years from now, with a large investment of cash and resources by Intel.
While that may seem quite far away for the consumer business where AMD competes, it is just one partial generation update away for AMD, and a serious danger.
Cheers
 
While I am not commenting on this directly, I would not disagree with your thoughts on that.
 
Why would AMD sell dGPUs to co-package with Intel CPUs if those co-packaged parts would compete with their own APUs? What customer would be interested in such a part?

This still doesn't add up.
 
As Dannotech suggested, Apple might be interested in these parts. Their high-end models currently use Polaris (if I remember rightly), with Intel's HD Graphics making up the low and mid end. If Apple intends to pursue more graphics performance, perhaps an AMD GPU coupled with an Intel CPU would be a better bet for Apple's purposes. I don't know much about the complex systems involved, but I suppose this multi-chip module could be smaller and fit better than the current approach of discrete high-end Polaris parts inside Mac products, while offering better GPU performance than Intel's current iGPU products?

If these products are intended only for Mac products, that might not be considered a threat to AMD's APU business, and I suppose Intel would be happy that Apple wouldn't be considering an AMD APU in the future. Why not an Intel-Nvidia part for Apple instead? I dunno, price maybe? Cooperation aspects? Maybe an AMD APU could still knock that combo out?
 
Apple is trying to squeeze Nvidia and does not want to grow the user base for CUDA, or so some seem to think.
Apple sees OpenCL as the way forward rather than CUDA, so they have a good reason not to use Nvidia, which then ironically reduces the competitive edge of their upper models from this perspective, and of their consumer models in performance and power demand.
If they keep to this strategy, I have no idea how they will respond to competing products that end up with Volta.
Cheers
 
I see Apple as being less and less relevant in the PC space. They reigned supreme with the iMac, but now there's the Surface Studio. Their iPad line is dying, and the Surface Pro plus various other premium laptops are squeezing the MacBook line. They're really only relevant in the smartphone space, and even there they're losing ground due to all the stupidities that come with owning an iPhone.

I'm pretty sure most professionals have given up on using an Apple product for productivity unless forced. And even then there are Hackintosh solutions that, while not as stable, are many times more powerful.
 

Don't worry, universities still buy iMacs for computer science classes.

My university keeps buying 27-inch iMacs just for people to compile their C code in entry-level computer science classes.

Meanwhile, the computer labs for engineers who have to run SolidWorks come with HP and Dell machines that have a Haswell quad-core Xeon E3 with integrated graphics.
 

Quad-core Xeons with an iGPU sound horrific for engineering apps... You poor buggers.

I always wondered if university IT gets kickbacks or what.

I always thought it was funny Apple puts AMD into their systems considering their target market.
 
Is it that profitable to make special product lines for Apple's diminishing X86 numbers?
 
Considering Intel does not make anything special for the PC market or enthusiasts, it would be rather annoying, to say the least, if they did for Apple :)

Didn't Intel refuse to give Apple access to Kaby Lake until sometime this year?
But they may still do something that Apple can use, though I doubt it would be solely for them, as Intel wants growth opportunities.
Cheers
 
My wild guess is that "not for an ordinary computer" is the correct part, if it's not Apple. Connecting the dots, my vote is for the first consumer version of the Project Alloy "face computer"/mixed reality goggles, said to be coming in 2H 2017 per an article in The Verge from, I think, August '16.

AMD had Radeon graphics and CPUs in the Sulon Q, a similar concept to Project Alloy but from a separate startup company in Canada (not sure if partially or fully owned by AMD). There has been radio silence on Sulon Q's blog and limited media coverage, as far as I can tell, since last year. So one possibility comes to mind, since earlier news articles quoted an Intel VP saying that the consumer Project Alloy would have options for a discrete GPU (which, if you wanted as light a headset as possible, makes no sense, and seems like an inadvertent admission that Nvidia or AMD would be involved). I think it would be engineered as an APU or MCM.

Since VR headsets would be a new market for both Intel and AMD, they may have decided to split the difference and use it as a test project to combine a Radeon iGPU with an Intel CPU in a small form factor. That way it would in no way cannibalize existing markets, but would let both collaborate in new ones currently dominated by ARM phones paired with VR goggle headsets, such as Google's Daydream and Samsung's Gear VR. Voila: a common foe, Radeon and Intel unite, and AMD gets a piece of VR without yet having recovered enough to fund the R&D alone to make an APU-based Sulon Q a success. Intel can try combining an AMD iGPU with their CPUs and DSPs with less risk of disrupting existing business, and incentivize Radeon to bring their A-game IP through per-device royalties. "No Raven Ridges were harmed in the making of this Face Computer..."

It could be that Apple said something like this: so we have an Intel CPU and an AMD GPU in our MacBooks (or other products); how can we integrate both in a more cost- and space-effective way? AMD SoCs like Raven Ridge could be an option (custom Zen designs are not ready before 2018, per AMD), but are probably not fast enough on the GPU side, so they ended up asking for an MCM.
A collab between AMD and Intel for Apple would make perfect sense to me.
 

A shotgun collaboration so both AMD and INTC could save their Apple business would make sense. With news that Apple is, or may be, putting an ARM coprocessor into their MacBooks, both INTC's and AMD's days may be numbered in those products.
 

And it would be a classic case of AMD killing a critical segment for a product line before it launches, which also has longer-term implications, as I mentioned, 3-4 years down the line for AMD, depending on how broad the deal is and what level of investment (cash and resources) Intel puts in.
This means Intel will have a product to compete with Raven Ridge in Apple products, with a foothold even before we see Raven Ridge.
Yes, it may not be as effective as Raven Ridge, but it adds competition that was not there in the first place.
I just cannot see how this deal does not impact AMD's CPU division in some way, now and in the future. If this was done by the Radeon group under Raja, one might expect the CPU division and their VP to fire back and screw Radeon over in some way as well.
For this discussion it is important to know exactly where within AMD this deal originated, if it exists and has been signed off.
Cheers
 

Apple already has ARM coprocessors. Apple isn't replacing x86 CPUs, and there is no valid alternative to x86. Mac sales would also plummet just from losing the ability to run Windows, which quite a lot of people install on their Macs.

Apple's Mac sales are hurting due to the company's innovation drought and its fear of cannibalizing the already dying iPad division. 2-in-1s are the hot thing, and Apple has nothing.

How much of MacBook sales do you reckon are configured with the discrete GPU instead of the IGP?
 
Well, let's see: AMD can have up to 100% of all x86/x64 APUs carrying some of their tech, making money on every one, or have 15%-20% of them carrying their tech at cut-rate pricing and making very little profit. I can see the lure of allowing Intel to use their IP for a price.

This would also swing over to their graphics division: if 80% of all PCs used AMD graphics, that would be a huge win. Also, Intel has the best chance of competing with ARM at the small scale in the future. Some rather hefty volume there.
 

Why would anyone buy an AMD APU then? And will 15W parts now become 35W parts? How many people actually buy an IGP based on graphics performance? And in a way where you don't instantly collapse from laughing while trying to point out the much better and much faster discrete solution.

GCN is anything but power efficient. How did you imagine it would be a worthy ARM competitor against graphics designed for low power? Hell, AMD can't even compete on something as simple as Quick Sync and decode formats.
 

Why would anyone buy an AMD APU *now*? They are great products, but they haven't been driving sales.

Getting their tech into every budget PC and up is a huge win.

As far as power efficiency goes, we don't know what architecture these will use (or where they will be fabbed, for that matter).
 

That would be a great idea if AMD hadn't developed Zen and had just given up on CPUs.


If you wanted perf/watt, give me a good reason why you would pick GCN. Even more so if you want a premium product.
 

Even if Zen utterly stomps Intel in every segment, they'd still be lucky to increase market share into the upper 20% range. Half a loaf is better than no loaf.

Intel iGPUs have never been premium parts and lack on the "perf" portion of perf/watt. At the very least, these will be more power efficient than an Intel CPU plus discrete GPU (for example, the Macs with the RX 460).
 