From ATI to AMD back to ATI? A Journey in Futility @ [H]

Hasn't AMD been saying for quite a long time that they were going to bring VR performance to the mainstream with Polaris? That would hint that this was meant to be a middle-stack part, so how is that spin? I think the people being critical of your piece think the timing is rather poor, considering you posted right as the Macau event you guys weren't invited to is going on. I feel the piece would have more credibility if it had been posted even a week or two ago.

Would you expect AMD to truthfully say, "We wanted this to be the king of all cards, but when we realized it couldn't compete, we needed to market it as a 'mid-range' card, as that's the only place it could possibly compete"? Or would you expect them to simply say, "It was always intended as a 'mid-range' card"?
 
Well, they might have targeted that performance bracket if they thought nV was going to do the same thing they did with the GTX 9xx launch, where they pretty much didn't go beyond the enthusiast-level performance cards...

Which would have been the typical launch: going up one tier of performance for the same price. But taking into consideration the node issues at 20nm, nV went further, which would have thrown AMD off.
 
I wish for more things to be stirred up, lol. Kyle, let the butthurt flow through you! Write more articles!!! ROFL.

The industry needs to be rocked for a bit.
Rock on, brother!

So, not only can you see into the future, but you can travel into the past and attempt to re-write history, TOO?
Yes. I hang out with Dr. Who as well. Bet you are really jealous now.

If AMD is truly cherry picking samples for reviewers,
I do not know that for sure, and since I do not, I pulled that out of the article. But I do have sources that said exactly that.

And when was the last time [H] reported on rumors like so many other rumor sites do? If [H] reports on a rumor, they say it's a rumor, lol, very clearly.
Thanks for noticing. And how many "rumors" have you seen us post that were not true? I cannot recall one off the top of my head.

Would you expect AMD to truthfully say, "We wanted this to be the king of all cards, but when we realized it couldn't compete, we needed to market it as a 'mid-range' card, as that's the only place it could possibly compete"? Or would you expect them to simply say, "It was always intended as a 'mid-range' card"?
You know AMD no longer produces top-end CPUs either! You know why? They were targeting the middle of the product stack. That's the ticket.
 
Not really what I was getting at. AMD has been having problems for the better part of a decade now. But it's only recently we're getting these editorials about how AMD cut you off but you ain't mad bro......AMD is shit and their mom is so fat that when the bitch jump up in the air she gets stuck.....but I ain't mad.

But then if/when their mom is so fat that when the bitch jump up in the air she gets stuck....isn't that just reporting the facts and not bias?

If anything, the only bias resulting from the blacklist was the timing of when this article went up. Kyle said the article was in the pipeline for a while, and the non-invite just meant it was easier to publish it now rather than later. But apparently it was always going to get published, invite or not. If he had gotten an invite and published it later, you people would just be calling him ungrateful instead.
 
Despite AMD calling these (Polaris) midrange parts aimed at the $300 and under/VR segment this entire time?

So, not only can you see into the future, but you can travel into the past and attempt to re-write history, TOO?

And what if a pair of Polaris 10 cards or god forbid even a pair of Polaris 11 cards equal or surpass 1080 performance for LESS MONEY, would that be AMD failing to you? Would that be AMD "not having any answer for 1080" in your eyes?

Your bias is showing.

Originally, they never said this was to be a middle-stack part. It was to counter Pascal.
 
Mr. Bennett, the nature of this article, true or not, is in some ways more than entertaining. I do think you have balls of brass, and this industry needs people like you. (Think the restaurant scene in Scarface.) :)
 
AMD gave up on high-end and server processors back in 2012, mostly due to their high-end 'dozer part having mediocre performance and not having a plan to fix it.
 
Wow...shitty. Really it just sucks for all of us because it allows nVidia to run wild for the foreseeable future. I kind of went "in" on nVidia with a G-Sync monitor purchase, but competition is never a bad thing in this realm.
 
AMD gave up on high-end and server processors back in 2012, mostly due to their high-end 'dozer part having mediocre performance and not having a plan to fix it.


It takes years to create these chips. Once you make such a mistake, they just can't keep going at it, and they can't fix the issues on a dime. It's better to pull out and save that money instead of pushing production and marketing to that degree, because you end up burning more money.
 
Getting rid of Global Failures was actually a good idea; that they did it in a way that cost them billions, and then continued to use them, was the bad part. They really needed to work with TSMC more and partner with some other fab that can actually get yields over 80% after the early bring-up phase of a process.
 
Polaris 10 is rumored to be ~230mm^2 and GP104 correspondingly ~315mm^2, so I don't see how AMD could've imagined Polaris 10 coming close to GP104 (yes, the sizes are not exactly equivalent, as it's 14nm vs 16nm, and performance isn't only about mm^2, but either way the difference is still quite large).

As early as January, they were already saying that Polaris should be a cheaper card.

Good find, but they never mention midrange; they wanted to increase the TAM for VR, which could have meant low-end P10.
 
It takes years to create these chips. Once you make such a mistake, they just can't keep going at it, and they can't fix the issues on a dime. It's better to pull out and save that money instead of pushing production and marketing to that degree, because you end up burning more money.

What it takes is the balls and cash to start over with something that might be able to compete and start on that immediately after you realize the current product is FUBAR. Not bail on one segment and waffle around for over a year before starting on something that might be able to compete.
 
I don't care about performance, but if it is a hot card, then it is a dud. I was really hoping to get an AMD video card this year, but it looks like NVIDIA might be getting my money :( Looking forward to [H]'s review.
 
My guess, and this is pure speculation, is that if Raja Koduri manages to spin off ATI along with all the GPU-related intellectual property, Intel will buy the new spinoff for peanuts and Mr. Koduri will end up with a fat paycheck for making it possible. Intel doesn't need what's left of AMD after that, as their CPU intellectual property is several generations behind Intel's. AMD has already lost their fabs by spinning off Global Foundries. It's not too farfetched that they will lose their GPU business to Intel, in a transaction facilitated by Mr. Koduri. Acquiring GPU technology on the cheap would be good for Intel. Given their resources, Intel could actually build on that intellectual property and make some competitive products. The downside is that AMD will be lost after that. If by a miracle NVIDIA buys out AMD after they lose their GPU business as well, I am pretty sure that NVIDIA will fire most of AMD and only keep those that have actual engineering talent and something meaningful to contribute. This, of course, is just speculation on my part. I'm just a casual observer.
I don't see nvidia buying AMD's CPU IP. Nvidia has been busy promoting the hell out of their GPU products as computing devices for more than just graphics for a while now, and is constantly trying to push that market. I don't see them spending the capital to acquire IP to produce x86 CPUs. If anything, considering their marketing emphasis on efficiency, I suspect they have another mobile processor architecture of their own in the works. Buying AMD's CPU IP to try and compete with Intel would be a costly move, not only for the initial expense but for the manufacturing and even R&D, with nothing to show for it for years. It wouldn't be a repeat of 3dfx, essentially removing any chance for a company to swoop that up and be a competitor in the graphics market; it would be similar to the scenario of AMD buying ATI, which was a massive waste (among other AMD decisions).
 
What it takes is the balls and cash to start over with something that might be able to compete and start on that immediately after you realize the current product is FUBAR. Not bail on one segment and waffle around for over a year before starting on something that might be able to compete.


That is true, and it looks like they did that with Zen....
 
It takes years to create these chips. Once you make such a mistake, they just can't keep going at it, and they can't fix the issues on a dime. It's better to pull out and save that money instead of pushing production and marketing to that degree, because you end up burning more money.

Exactly. It's why NVidia couldn't go back and add async compute to Pascal in hardware - by that point, the parts had already been baked and it was in production.

That being said, they may or may not have had time to factor async into Volta, depending on what stage of the design process it's in. However, if AMD flops hard here on Polaris, and subsequently on Vega, frankly NVidia might not even bother with the implementation.

The thing is, AMD has mindshare right now that it hasn't seen since the 6950/70 days. Whether or not it actually means something for performance, async compute is locked in the brains of many people as an AMD advantage. AMD desperately needs something that can continue to grab that mindshare.

If they fail, they will lose all advantage, people will stop caring, and what was once mindshare will become an in-joke.

As such, the only real way they can recover is to push Polaris as a mid-range card, focus on performance per watt, and sell it as cheap as they possibly can.

Long term, if they are having the problems Kyle indicated with Polaris, then Vega better be an entirely new engineering design, and not based on Polaris.

Otherwise, there is a lot of pain ahead.


Really, the only way out of the shitstorm depends on how quickly AMD can slash prices, empty their graphics card inventory, and then come up with a far better design than the one they have to compete with NVidia. If they are still banking on selling Polaris for $300, let me tell you, in unequivocal terms:


They be FUCKED.
 
Doubt we'll see any confirmation of that for many years to come. But whether it was meant to be a mainstream card or not, if it's priced right, it could do extremely well. If AMD can afford to sell it at that price, that is.

If they price it at >$300, for a GPU with 390x-class performance, they're sunk.

AMD doesn't want to be seen as a price/performance brand. The Fury X was their Titan.
They had to reduce the price because NV shocked them with a similarly performing Ti.

AMD will continue to lower their prices because of their reactionary business practices.
I wouldn't be surprised if NV priced the 1070 at $380 just to mess with AMD in the middle stack.
 
That is true, and it looks like they did that with Zen....


Agreed. Unless Kyle has Zen rumors as well that say their chip sucks, lol.

The latest rumors on Zen are that it is beating internal expectations and might actually compete with Skylake instead of Haswell, as first reported.
 
Kyle

May I get some clarity on what prompted this article's greenlight for posting?

I mean, this editorial should have some real sourcing to back it up. Right now what I read is, and I am paraphrasing, "I overheard water-cooler talk and am reporting it to the masses as gospel."

I mean, come on, bro, I love your site for various reasons, but the journalist in me is screaming "rumor article." If I walked up to my editor and presented this to him/her, it would get tossed in the trash. No sources, no citations. At least in the Nano article you had the evidence of being denied a timely sample in relation to the rest of the industry to prove your point.

Sure, sure, we could go with the 'time will vindicate me' assumption, but I see people in here singing praises for good journalism. This is more like the early stages of someone sticking a finger in the whistleblower pie to say they tasted it first. I do not believe you are the type of person to want that kind of reputation. The National Enquirer is the worst in that ballpark, and CNN/Fox afternoon opinion reporters to a lesser extent. We know how much we respect them.

Sure, workplace politics are interesting, and they really put a finger on the atmosphere of the work environment. However, I sit here and wonder if you feel any sense of obligation to your readership when articles such as this are posted without any concrete evidence to present.

Point is:
Body language, third- and fourth-hand accounts, and historical bias are not what make an article reportable. Harsh truths, links, names, places, events, and putting all of those pieces of the puzzle together are what make truly defining investigative articles.

Just my 2c from a longtime reader.

Yep, comes off more like high school blogging instead of reputable journalism.
 
Good find, but they never mention midrange; they wanted to increase the TAM for VR, which could have meant low-end P10.

I agree that they didn't say "look, we'll only offer this and this level of performance," but they explicitly marked the 290X/970 cards while discussing Polaris as a whole, and they have said that Polaris was created to offer good price/perf (good performance at a lower price) since at least January.

I'm not saying Polaris isn't disappointing (I don't know what AMD initially wanted), just that AMD has been (IMO) quite clear about the expected performance of Polaris since January.
 
I don't see nvidia buying AMD's CPU IP. Nvidia has been busy promoting the hell out of their GPU products as computing devices for more than just graphics for a while now, and is constantly trying to push that market. I don't see them spending the capital to acquire IP to produce x86 CPUs. If anything, considering their marketing emphasis on efficiency, I suspect they have another mobile processor architecture of their own in the works. Buying AMD's CPU IP to try and compete with Intel would be a costly move, not only for the initial expense but for the manufacturing and even R&D, with nothing to show for it for years. It wouldn't be a repeat of 3dfx, essentially removing any chance for a company to swoop that up and be a competitor in the graphics market; it would be similar to the scenario of AMD buying ATI, which was a massive waste (among other AMD decisions).

I think that they would buy AMD's CPU IP if it's cheap enough. Not to compete with Intel, but just to have an x86 license in their arsenal. I'm sure there are a few scenarios where that would come in handy, for example, embedding a few low-clock-speed, low-performance x86 cores inside a GPU just to have something they can boot the OS with on a server. That would make for a more compact compute server, not to mention that it would hurt Intel where it counts: their high-end server business. Intel can still sell all they want to hosting providers and data centers that rely on hosting providers for business. Again, just speculating.

Imagine how these fun discussions and speculations would read if AMD no longer existed. Yuck!

I'm not a fanboy of any company, so I would hate for AMD to go away. AMD is the victim of bad and incompetent management. NVIDIA and Intel have screwed up in the past, but they always bounced back and learned from their mistakes. Not so much AMD.

Someone capable should take over AMD, line up management, and fire and replace everyone who is not competent to fulfil their current position. I hope there are enough folks left who are willing to turn this company around if given the chance.

Saab (born from jets, right?) fired most of their accountants before they went bankrupt because the accountants wouldn't let the engineers sink more money into the 9-3. Problem was, by then they didn't have that many competent engineers in their employ anymore.
 
How is it that on a tech forum some of you don't know the difference between an architecture and a specific SKU? "Polaris can't beat Pascal" needs a tiny bit more granularity to have any meaning.
 
How is it that on a tech forum some of you don't know the difference between an architecture and a specific SKU? "Polaris can't beat Pascal" needs a tiny bit more granularity to have any meaning.

 
I have to admit, I'm a little shocked Cageymaru hasn't posted by now in this thread.

Even if AMD's video card division crumbled tomorrow, as long as they provide driver support until, say, 2019, it wouldn't suck too badly for those heavily invested in their hardware....
 
Yeah, I was just looking that up; it wasn't until March/April that they stated that.
There has been some back and forth on that, but those saying that I am re-writing history are reaching.

That is a pathetic attempt at bribing you.
All this stuff about being "butthurt" over a trip to China is pretty damn funny. If I want to go to Macau, I will go buy a ticket and go. AMD's Roy Taylor calling the reviewers left out of the Nano launch unfair was pretty sleazy, but Hook trying to buy me off was a low blow. I have known Hook for a long time, and the fact that he thinks I can be bought is insulting. That right there shows you how off the rails things are going inside AMD.

Polaris 10 is rumored to be ~230mm^2 and GP104 correspondingly ~315mm^2, so I don't see how AMD could've imagined Polaris 10 coming close to GP104 (yes, the sizes are not exactly equivalent, as it's 14nm vs 16nm, and performance isn't only about mm^2, but either way the difference is still quite large).
230mm^2 x 1.3 gives you approximately 300mm^2. The density jump from 14nm to 16nm is considered to be ~30%, so you tell me what AMD could have imagined.
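
For what it's worth, here is that back-of-the-envelope math as a quick Python sketch. The 230mm^2, 315mm^2, and ~30% density figures are all rumor-level assumptions from this thread, not confirmed numbers:

# Rough cross-node die-area comparison, using rumored figures only.
polaris_10_mm2 = 230   # rumored Polaris 10 die area on 14nm
gp104_mm2 = 315        # rumored GP104 die area on 16nm
density_factor = 1.3   # assumed ~30% area advantage of 14nm over 16nm

equiv_16nm_mm2 = polaris_10_mm2 * density_factor
print(f"Polaris 10, 16nm-equivalent: ~{equiv_16nm_mm2:.0f} mm^2")  # ~299 mm^2
print(f"GP104:                       ~{gp104_mm2} mm^2")           # ~315 mm^2

By that rough math the two dies land in the same ballpark (~299 vs ~315 mm^2); change the assumed density factor and the conclusion changes with it.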

Yep, comes off more like high school blogging instead of reputable journalism.
Sorry you think that, but I am glad you took the time to read it fully and comment here.
 
Frankly, knowing this about the video card division, I would love to know what's going on with regard to the processor division.

On the surface, everything seemed to be going right for AMD: the new Crimson Edition drivers are a FAR improvement over the previous Catalyst ones, and the direction they were going seemed solid. But when you learn that much of that was spin and marketing, you wonder what to trust.

For instance, Zen was originally supposed to bring AMD to within Haswell-level performance, and AMD brought in an engineer who designed their last success, the original Athlon. The latest rumors are that the chips are performing far better than expected and may actually be able to go toe-to-toe with Skylake.

But how much of that is just more marketing, spin, and carefully planted stories?

That's why I'd love to know if the PC division is faring better than the graphics division right now ahead of a Zen release.
 
Question - Hook. Is he marketing for AMD overall, or just the RTG?
Things have changed a lot there in terms of hierarchy lately, and I am not sure. That, and the fact that he will not return my texts or emails, makes it difficult to ask as well.
 
That is a pathetic attempt at bribing you. What are you supposed to write about if they have no half-way decent products to offer, or at least something worth getting excited about in the pipeline? [H]ardOCP thrives on its ability to offer quality and unbiased content. Personally, I trust the reviews here more than anywhere else on the web.

This Chris Hook guy asking you to sell your livelihood and integrity for a short vacation in San Francisco (where you would actually have to work, nonetheless) is not only a low attempt at bribery but also blatantly insulting. It's like someone saying to me: "Hey, here is three grand, now quit your job and... oh, God help you!"

AMD was doomed the day they decided to purchase ATI. They borrowed money, sold stock, and did all kinds of stuff to acquire ATI. That was about the same time that Intel returned from rehab and woke up from their drunken Pentium 4 binge. The only reason AMD was successful from 2000 until mid-2006 was that Intel screwed up and kept manufacturing Pentium 4 chips. And even then Intel was making money. Instead of heavily investing in CPU design, AMD decided that they wanted to make "Fusion" and merge CPU with GPU technology. Uhm, how did that turn out for them?

To make matters worse, they "cancelled" the ATI brand (as in - they killed it). ATI was a brand that carried weight and had a strong following. Why kill it?

To make a long story short, imagine how things would have turned out if AMD hadn't bought ATI. If they really had a snowball's chance in hell of making a decent Fusion APU, then why not partner up with NVIDIA? Oh well, I guess we'll never know.

From far away, it looks like these two guys, Raja Koduri and Murthy Renduchintala, have a relationship on a personal level. Or they both overpromise each other things that neither can deliver. Either way, this won't end well for AMD. Intel has the money to push forward; AMD doesn't.

My guess, and this is pure speculation, is that if Raja Koduri manages to spin off ATI along with all the GPU-related intellectual property, Intel will buy the new spinoff for peanuts and Mr. Koduri will end up with a fat paycheck for making it possible. Intel doesn't need what's left of AMD after that, as their CPU intellectual property is several generations behind Intel's. AMD has already lost their fabs by spinning off Global Foundries. It's not too farfetched that they will lose their GPU business to Intel, in a transaction facilitated by Mr. Koduri. Acquiring GPU technology on the cheap would be good for Intel. Given their resources, Intel could actually build on that intellectual property and make some competitive products. The downside is that AMD will be lost after that. If by a miracle NVIDIA buys out AMD after they lose their GPU business as well, I am pretty sure that NVIDIA will fire most of AMD and only keep those that have actual engineering talent and something meaningful to contribute. This, of course, is just speculation on my part. I'm just a casual observer.

The YouTube video in the article shows some very unexcited folks who are clearly aware of the fact that they don't really have a decent product to sell. It looks more like an infomercial. Just listen to what they say and observe their facial expressions.

It was a fun read :)

AMD would probably be in a better position if they hadn't divested their profitable flash business.
 
OCN -prznar1 said:
I dont care, i want to know polaris 11 specs.
This is my favorite. His posts read like those of a #1 AMD fanboy. This one is a gem, though. It reminds me of the NBA's Philadelphia Sixers. That team's management makes sure the team continues tanking year after year for a chance at a #1 pick. They also trade their best players for future top picks. In fact, there are currently rumors that they are trading away the #3 pick from last year.

This guy's and this NBA team's motto is "Next year/release will be the one!"
 
This site is SO BIASED towards whatever is the best enthusiast gear to buy at any given moment in time! If you happen to be an ignorant brand loyalist, that means at several points in time your team could lose for a while and that is so difficult to accept! LIES! I can't sit down, it hurts SO BAD!

Seriously, this is one of the few tech sites left that doesn't just recommend one brand forever. It just so happens that Intel and nVidia have been on top for a while. When that changes, I'm sure the recommendations here will change too, as they have done from the beginning of [H].

Anyway, about the article:

AMD burns bridge. [H] airs dirty laundry. Awesome. I haven't read tech drama this good since Tom Pabst reported (and Kyle confirmed) issues with the 1.13GHz P!!!, causing Intel to pull production. Good stuff.
 
in the fantasy world you live in. Who is going to measure and state how much competition with themselves is adequate? (Not that "competing with yourself" makes any sense to begin with.)

The shareholders will measure how the company is doing. Look at Intel: no real competition, and they have a product stack second to none. If NV were the only dGPU seller, there is no doubt they would have a full product stack as well. Don't take this as an endorsement of a dGPU monopoly; it's not. However, it wouldn't be the hyperbolic end of the world that you and all the parrots are squawking about.
 
A lot of pages and threads. I'll sum it up for the new people joining us.

Don't buy AMD.... Do buy nVidia. If you're still with AMD, I promise you will enjoy the nVidia experience. Drivers are consistent, performance is king. Excellent end-user experience.
 