From ATI to AMD back to ATI? A Journey in Futility @ [H]

I have no love for AMD since I haven't been able to get a fucking business card or any graphics cards for review from them.
But I shudder when I think about what will happen if Nvidia's only competition fades away. Prices will go up and innovation will slow.

If AMD dies, is there anyone that could take their place? Could Foxconn hire the best and brightest from AMD and start designing their own products?
 
I have no love for AMD since I haven't been able to get a fucking business card or any graphics cards for review from them.
But I shudder when I think about what will happen if Nvidia's only competition fades away. Prices will go up and innovation will slow.

If AMD dies, is there anyone that could take their place? Could Foxconn hire the best and brightest from AMD and start designing their own products?


After all of AMD's fuck ups, would they want to?
 
The AMD ship started taking on water back in 2014. I personally broke free from AMD last year. With the 980 Ti and the 1080 I picked up this morning, I am never looking back. I also heard a rumor that Microsoft is going with a Pascal 1060 variant for their Xbox One refresh. Microsoft wants to firmly take the performance crown back.

The reason they lost the performance crown is that they lost track of software solutions for their exotic hardware problem (from DDR3 to eDRAM). It would not have mattered if they had had Nvidia with that hardware solution; they still would not have gotten ahead.
Nice sidegrade, congrats ;)
 
In my world, given the landscape, I would try to do this if I were AMD proper:

1) Sell off the RTG group to Intel. Seriously, if they are pushing so hard to go to the point of making things toxic, better you make a buck and control the deal. Greater control over terms and patents as well.

2) Form a new group, hiring the best engineers away from both NVidia and ATIntel, and whoever else sells GPUs.

3) Partner with Microsoft - yes seriously - in developing a brand new architecture that will fully take advantage of the DX 12 promises from the ground up, keeping in mind the developers as well and how they hate learning new things.

4) Once the card is ready, take your shot, and perhaps you surprise the shit out of everyone with something that actually makes it a 3 horse race again, which we haven't seen since I believe the 90s.
 
It would probably have to be someone like Qualcomm and a desktop version of the Adreno GPU (*hint* ADRENO ~ RADEON)

What a twist that would be...

In reference to another thread on the forums: Whatever happened to PowerVR and S3? They were reasonably big players long, long ago. Are either of them still around in any capacity?
 
Proof that traders don't give a crap about rumours and articles with no sources cited. AMD is up almost 5% currently.
 
Proof that traders don't give a crap about rumours and articles with no sources cited. AMD is up almost 5% currently.
Well, when the stock goes up or down, then you certainly know the story was planted ;)
 
I just want real competition (even duopolies are better than monopolies).

IF, and it's a big IF, ATI became part of Intel, AND assuming they were being starved of investment at AMD while Intel gives them more resources, they might be able to do better. But that's a long chain of ifs and assumptions.
 
I have no love for AMD since I haven't been able to get a fucking business card or any graphics cards for review from them.
But I shudder when I think about what will happen if Nvidia's only competition fades away. Prices will go up and innovation will slow.

If AMD dies, is there anyone that could take their place? Could Foxconn hire the best and brightest from AMD and start designing their own products?

Shouldn't be hard, most of AMD's engineers are in Taiwan now anyway. The ones that were let go a few years back mostly migrated to nVidia.
 
Can't say I'll blame Koduri. He probably believes RTG would be significantly more competitive if it had the resources it needs - resources which AMD can no longer afford to provide, and probably never will be able to.

This is an industry that takes a lot of money to be competitive in; there's no other way around it.
AMD has been in a Catch-22 situation for some time now. They needed a great product to win market share back from Nvidia and make enough money for a great R&D department, but they needed a great R&D department to make a great product.

That is a death spiral.
 
Proof that traders don't give a crap about rumours and articles with no sources cited. AMD is up almost 5% currently.

Well, when the stock goes up or down, then you certainly know the story was planted ;)
I checked before I published the story, and it was not up a full percent at that time. Guess we know the truth now, eh? The Street wants to see RTG get away from AMD's floundering CPU division?
 
The next-gen AMD card promises to be amazing... that's what they've been saying for the past few product refreshes... I don't want ATI/AMD to go away or be bought by Intel... everyone should be pulling for AMD, otherwise Nvidia will start charging $1000 for their high-end single cards.
 
Could this be good for investors and enthusiasts? I have been wracking my brain for weeks trying to figure out why Intel seems to be so quiet about the JV AMD entered into in the China market. Sure, AMD structured it in a way that they had controlling interest of the one entity so as not to break the x86 Intel cross-licensing agreement, but there are other conditions in that same cross-licensing deal that prevent AMD from licensing x86 technologies. Then again, AMD has certain IP in x86 (64-bit extensions etc.) that Intel also needs, as we all know, BUT could Intel just be 'remaining silent' on the whole thing because they already knew they would be purchasing RTG in the near future?

If so, how does this play out for shareholders? I want to know if shares would be given in Intel, or whether some shady crap takes place and, in return for a cash payment (which, I assume, would pretty much alleviate AMD's current debt), AMD shareholders now only have shares of a CPU business. Because of the custom SOCs AMD currently has that are protected in contracts with Sony, Microsoft etc., and presumably Nintendo in the near future, there would still have to be a cross-licensing of IP back to AMD from Intel.

If so, this could work out for gamers/enthusiasts and investors, with Intel having no love lost for Nvidia and a boatload of money to invest in the new graphics division. Not really sure why Intel wouldn't just purchase AMD lock, stock, and barrel at this point and be done with it. Certainly going to be interesting over the next while, but I always get nervous when I see AMD on a run (investment-wise).
 
If AMD falls, someone else will rise; the market is too attractive for just one player.
I dunno, man. Not just finding the money, which would be billions of dollars, but also hiring enough skilled engineers, the production agreements, the sheer time to do all this while competing with Nvidia's NEXT generation... I don't see a new player coming in quite so easily.

I guess maybe a player like Qualcomm or Samsung could step up. I just don't know if they'd be interested.
 
I dunno, man. Not just finding the money, which would be billions of dollars, but also hiring enough skilled engineers, the production agreements, the sheer time to do all this while competing with Nvidia's NEXT generation... I don't see a new player coming in quite so easily.

I guess maybe a player like Qualcomm or Samsung could step up. I just don't know if they'd be interested.

There's also the matter of intellectual property - after buying 3dfx, Nvidia probably owns so many patents for basic 3D acceleration that it would be impossible to make a GPU without them.
 
Kyle

May I get some clarity on what prompted this article's greenlight for posting?

I mean this editorial should have some real sourcing to back it up. Right now what I read is, and I am paraphrasing, I overheard water cooler talk and am reporting it to the masses as gospel.

I mean, come on bro, I love your site for various reasons, but the journalist in me is screaming rumor article. If I walked up to my editor and presented this to him/her it would get tossed in the trash. No sources, no citation. At least in the Nano article you had the evidence of being denied a timely sample model in relation to the rest of the industry to prove your point.

Sure-sure we could go with the 'time will vindicate me' assumption, but I see people in here singing praises for good journalism. This is more like the early stages of someone sticking a finger in the whistleblower pie to say they tasted it first. I do not believe you are the type of person to want that kind of reputation. The National Enquirer is the worst in that ballpark, and CNN/Fox afternoon opinion reporters to a lesser extent. We know how much we respect them.

Sure workplace politics are interesting, and really put a finger on the atmosphere of the work environment. However, I sit here and wonder if you feel any sense of obligation to your readership when articles such as this are posted without any concrete evidence to present.

Point is:
Body language, third and fourth person accounts, and historical bias are not what make an article reportable. Harsh truths, links, names, places, events, and putting all of those pieces of the puzzle together are what make truly defining investigative articles.

Just my 2c from a longtime reader.
 
I dunno, man. Not just finding the money, which would be billions of dollars, but also hiring enough skilled engineers, the production agreements, the sheer time to do all this while competing with Nvidia's NEXT generation... I don't see a new player coming in quite so easily.

I guess maybe a player like Qualcomm or Samsung could step up. I just don't know if they'd be interested.


Samsung buying ATI IP and going after Nvidia? That would be some storm.
 
Kyle, I was told a few weeks ago that while GF's 14nm process is very power efficient (compared to Fiji), it doesn't scale well, if at all (kind of like Fiji). I've also heard that AMD is working double time to increase clock speed (again like Fiji), so maybe that's why it's running hot now.
 
Kyle

May I get some clarity on what prompted this article's greenlight for posting?

I mean this editorial should have some real sourcing to back it up. Right now what I read is, and I am paraphrasing, I overheard water cooler talk and am reporting it to the masses as gospel.

I mean, come on bro, I love your site for various reasons, but the journalist in me is screaming rumor article. If I walked up to my editor and presented this to him/her it would get tossed in the trash. No sources, no citation. At least in the Nano article you had the evidence of being denied a timely sample model in relation to the rest of the industry to prove your point.

Sure-sure we could go with the 'time will vindicate me' assumption, but I see people in here singing praises for good journalism. This is more like the early stages of someone sticking a finger in the whistleblower pie to say they tasted it first. I do not believe you are the type of person to want that kind of reputation. The National Enquirer is the worst in that ballpark, and CNN/Fox afternoon opinion reporters to a lesser extent. We know how much we respect them.

Sure workplace politics are interesting, and really put a finger on the atmosphere of the work environment. However, I sit here and wonder if you feel any sense of obligation to your readership when articles such as this are posted without any concrete evidence to present.

Point is:
Body language, third and fourth person accounts, and historical bias are not what make an article reportable. Harsh truths, links, names, places, events, and putting all of those pieces of the puzzle together are what make truly defining investigative articles.

Just my 2c from a longtime reader.

Yeah, this reminds me of something Charlie would write about nVidia on Semiaccurate.
 
Kyle shows how to whine when you don't get invited to an AMD event. If Kyle is a professional, he should stick to reviewing PC hardware (Intel/AMD/Nvidia) and not writing such rubbish. There are only 2 ways this can go - Kyle is right and AMD has a problem, or he is horribly wrong and will have egg on his face on June 29th. :)
 
Kyle, I was told a few weeks ago that while GF's 14nm process is very power efficient (compared to Fiji), it doesn't scale well, if at all (kind of like Fiji). I've also heard that AMD is working double time to increase clock speed (again like Fiji), so maybe that's why it's running hot now.

Technically it is Samsung's process :) because GF was not getting anywhere with their own before. As I wrote before, AMD has had problems with heat and performance, and FinFET is not going to fix that for them.
So scaling can only work when you are solving those problems rather than just moving to a different process...
 
Kyle shows how to whine when you don't get invited to an AMD event. If Kyle is a professional, he should stick to reviewing PC hardware (Intel/AMD/Nvidia) and not writing such rubbish. There are only 2 ways this can go - Kyle is right and AMD has a problem, or he is horribly wrong and will have egg on his face on June 29th. :)

Such a long time to keep this secret; your average NDA for these events is probably much shorter. Have to ask the question: why?
 
Well, Kyle, thanks for that.
I don't have the time to be engrossed in all things tech anymore, so stories like this are gold for me.

That said, when [H] does a piece of reporting, rare as they are, I pay attention. Almost always spot on, well couched, and valuable.

Oh, and way to go maintaining that whole integrity thing. It's not always easy.
 
All the fanboy remarks on both sides calling bias are ridiculous. I've been reading [H] for many years, even before making my forum handle, and they're my sole source for GPU and CPU reviews for gaming. I've also used those reviews to buy from both sides of the aisle several times over, at least since 3DFX died.

Sure, the reviews' language can be harsh at times, but it is honest, and they do their best to back it in fact as much as possible. And while I can't speak for Kyle, I can't imagine that either as a gamer or a businessman he would ever want AMD to go under and have this market become a monopoly for Nvidia. As a gamer, it drives up prices and lowers innovation, and for [H]... that's fewer reviews, less traffic, less interest. AMD is going full-retard by locking them out, looking like petulant children and doing further damage to their brand instead of less.
 
All the fanboy remarks on both sides calling bias are ridiculous. I've been reading [H] for many years, even before making my forum handle, and they're my sole source for GPU and CPU reviews for gaming. I've also used those reviews to buy from both sides of the aisle several times over, at least since 3DFX died.

Sure, the reviews' language can be harsh at times, but it is honest, and they do their best to back it in fact as much as possible. And while I can't speak for Kyle, I can't imagine that either as a gamer or a businessman he would ever want AMD to go under and have this market become a monopoly for Nvidia. As a gamer, it drives up prices and lowers innovation, and for [H]... that's fewer reviews, less traffic, less interest. AMD is going full-retard by locking them out, looking like petulant children and doing further damage to their brand instead of less.
The only drawback of buying system parts based on [H] reviews is that they tend to last too long, which means fewer upgrades.
 
The only drawback of buying system parts based on [H] reviews is that they tend to last too long, which means fewer upgrades.

Only a problem if you're Nvidia, AMD or a shareholder. And since my dumb ass sold 95% of my Nvidia stock sometime last year, and have since watched it pop ~115%...I'll be bitter and sad over here in my corner, happy with my 970 for a while.
 
Rumor, speculation, disgruntled ex-employees, and body language as sources, with knowledge of Intel's interest in buying. We will see how this pans out in a short period of time. The most concerning thing is when Polaris will hit the streets for real reviews and analysis to take place beyond speculation. Performance/price is one aspect, unique features another - it can start at the low end like Nvidia's 750 Ti and work up. Mid year to me means June, and not a paper launch.

I am in manufacturing, and I see over and over again how ugly a new product line can be at startup yet in the end become extremely successful once the problems are overcome. Anyways, with the lack of any real substantial leaks or images of anything, it makes one wonder as well.
 
Kyle - something you might like to clarify:

Are you referring to issues with Polaris in relation to products we would expect them to compete with, e.g. the supposed 1060, 1070, etc.?

Polaris is clearly not going to compete with the 1080, if all the information available has led me and others here down the right track. If that is what you mean (no 1080 competition / the typical high-end [H] focus), then it might pay to clarify that, as a reasonable number of people in the hardware scene are aware of this. If that turns out to be true (only competitive with the 1070 down), then some will bitch at you even harder for fanboyism etc. if they have a good loss leader for the mid-low range.

The average Joe, though, the often silent majority, has no idea Polaris likely won't be competing with the 1080; it would be nice to at least touch on that as more reliable information comes to hand, either here or in a future article. AMD has sent mixed messages in this regard, though. We know only from shipping manifests that there are 2 cut variations of the Polaris 10 die, with another one assuredly full fat.

I would presume, considering the NDA lifts on the 29th, we'll know more at Ctex and from some PR hype from Macau.
The lack of 'performance' on the day's activity sheet is telling. Architecture and 'tech demo'... we've already had the tech demos, so no performance talk leads me to think Ctex or nothing (along with some information I have been privy to for quite some time)... and maybe even then they'll drip it out... ffs. Not sounding good.
 
In my world, given the landscape, I would try to do this if I were AMD proper:

1) Sell off the RTG group to Intel. Seriously, if they are pushing so hard to go to the point of making things toxic, better you make a buck and control the deal. Greater control over terms and patents as well.

2) Form a new group, hiring the best engineers away from both NVidia and ATIntel, and whoever else sells GPUs.

3) Partner with Microsoft - yes seriously - in developing a brand new architecture that will fully take advantage of the DX 12 promises from the ground up, keeping in mind the developers as well and how they hate learning new things.

4) Once the card is ready, take your shot, and perhaps you surprise the shit out of everyone with something that actually makes it a 3 horse race again, which we haven't seen since I believe the 90s.

1) Is that even possible? Wouldn't the government block such a sale on antitrust grounds?

2) Unless they are totally bereft of talent, it is easier to develop their own folks or give them better focus than to restart from the ground up.

3) This would be a good idea. Surprised that NVidia hasn't done this, actually.
 
I spoke to one of my professors, who is an IEEE senior fellow and who is acquainted with a rather famous AMD CPU architect/engineer...

This is exactly what he said, word for word, translated from Italian:

"all is not well at GlobalFoundries"

In response to this article.

I take it he means in relation to AMD; he would not say anything else.
 
1) Is that even possible? Wouldn't the government block such a sale on antitrust grounds?

2) Unless they are totally bereft of talent, it is easier to develop their own folks or give them better focus than to restart from the ground up.

3) This would be a good idea. Surprised that NVidia hasn't done this, actually.


For 1, I'd think this is possible because it is simply a graphics company going from one chip maker to another. I think where you'd have to worry about monopoly is if AMD tried to sell its chip division to Intel, or its GPU division to NVidia. I see no reason why they'd block RTG being sold off to Intel - really it's no different than AMD having them.
 
I wouldn't worry about lack of competition. There is demand for 2 competitors in the GPU market, and if one of those companies exits the market, someone else will step in. When 3dfx went belly up and left NVidia as the sole high-performance GPU player, ATI stepped up with the Radeon, a GPU with much more performance than anything they had previously released. Within a couple of years they came out with the 9700 Pro, a card much better than anything 3dfx would likely have released if they had stuck around.* Intel would be a much stronger competitor for NVidia than AMD has been lately, with access to more advanced fab tech and much deeper pockets. Against Intel, NVidia might not have the luxury to do things like release the 980 at $600 and then the 980 Ti at $650 the following year. With VR demanding much higher performance (by my back-of-the-envelope calculations, you would need 32K resolution before your eyes would stop noticing improvements) and 3D stacking of transistors just getting started, we could be looking at a new era of amazing advances with 2 well-managed and well-funded companies competing against each other.

*I know everyone was going on about "Rampage" at the time, but if it were anything close to what it was cracked up to be, NVidia would have integrated the tech within a couple of years of buying 3dfx. They didn't, and the GeForce 4 and FX they did release got creamed by the 9700 and 9800.
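
(As a rough illustration of that kind of back-of-the-envelope VR math, here is a minimal Python sketch; the acuity and field-of-view numbers below are my own assumptions for illustration, not the poster's inputs, so the totals won't exactly match the 32K figure.)

```python
# Rough back-of-the-envelope VR resolution estimate.
# All inputs are illustrative assumptions, not the original poster's figures.

def required_pixels(fov_deg: float, pixels_per_degree: float) -> int:
    """Pixels needed along one axis to hit a given angular pixel density."""
    return round(fov_deg * pixels_per_degree)

# ~60 px/deg roughly matches 20/20 acuity (1 arcminute per pixel);
# some argue ~120 px/deg is needed before further gains stop being noticeable.
for ppd in (60, 120):
    h = required_pixels(fov_deg=210, pixels_per_degree=ppd)  # assumed binocular horizontal FOV
    v = required_pixels(fov_deg=135, pixels_per_degree=ppd)  # assumed vertical FOV
    print(f"{ppd} px/deg -> roughly {h} x {v} pixels across the full field of view")
```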
 
I spoke to one of my professors, who is an IEEE senior fellow and who is acquainted with a rather famous AMD CPU architect/engineer...

This is exactly what he said, word for word, translated from Italian:

"all is not well at GlobalFoundries"

In response to this article.

I take it he means in relation to AMD; he would not say anything else.

AMD is tied to GF; they have to get a certain number of wafers from them, otherwise it breaks their contract and the fines are quite hefty. So if GF is having issues with their process, AMD is stuck and has to weather the storm.
 