Statement from AMD - Raja Koduri Leaves

Anyone else own shares? Been long AMD for a.... long time. Over 15,000 shares now. I'm hoping this is a good move. The direction under Dr. Su has been fantastic.
 
What if Intel starts making GPUs allied with AMD to battle Nvidia? :wacky:
 
Lisa will have to fire his merry band of misfits as well.
Have to get everyone back on the same page. Efficient and fast.

Step 1: Only use HBM in professional products.
Step 2: Stop the lame marketing events.
Step 3: Concentrate on making a top card that battles the Titan line from day 1, trickle down from there.
Step 4: Make enough stock
Step 5: Only implement software/hardware that is needed in the very near future. Speed wins the race.

Pretty much.

AMD needs to move, not adapt.

They need to identify their target market, focus their vision and plainly execute.

They need to prove, not promise.
 
Pretty much.

AMD needs to move, not adapt.

They need to identify their target market, focus their vision and plainly execute.

They need to prove, not promise.

The problem right now though is their target market is Cryptocurrency.

Right now a liquid cooled Vega 64 will outhash a Titan XP, at slightly less wattage and at half the price.

Gamers are pretty much meh when it comes to Vega, but the crypto miners are hot for them, wiping out all of the stock because the card works so well for the wattage and price.
 
The problem right now though is their target market is Cryptocurrency.

Right now a liquid cooled Vega 64 will outhash a Titan XP, at slightly less wattage and at half the price.

Gamers are pretty much meh when it comes to Vega, but the crypto miners are hot for them, wiping out all of the stock because the card works so well for the wattage and price.

Eh? Vega 64 was perfect for me: it performs better than a 1080, I get FreeSync, and I can mine at $6/day on top of that. Along with two free games I gave as bday presents, saving myself $160. Destiny 2 with the 1080 is hardware locked.
Gaming and mining/compute, the complete package.
I'd say their target audience is people that research a lot, or game and mine, but they seem to be anti-mining on the Vega front
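The wattage-and-price argument above can be made concrete with a rough sketch. Every number below is an assumption for illustration (ballpark late-2017 Ethash rates and street prices), not a measurement:

```python
# Hash-per-watt and hash-per-dollar comparison.
# ALL figures are illustrative assumptions, not benchmarks.
vega_mhs, vega_watts, vega_price = 43.0, 230.0, 699.0      # liquid-cooled Vega 64, undervolted
titan_mhs, titan_watts, titan_price = 37.0, 250.0, 1349.0  # Titan Xp

vega_per_watt = vega_mhs / vega_watts      # MH/s per watt
titan_per_watt = titan_mhs / titan_watts
vega_per_dollar = vega_mhs / vega_price    # MH/s per dollar of card
titan_per_dollar = titan_mhs / titan_price

print(f"Vega 64 LC: {vega_per_watt:.3f} MH/s/W, {vega_per_dollar:.4f} MH/s/$")
print(f"Titan Xp:   {titan_per_watt:.3f} MH/s/W, {titan_per_dollar:.4f} MH/s/$")
```

With those assumed figures the Vega comes out ahead on both metrics, which is the miners' whole argument: the card hashes harder per watt and far harder per dollar.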
 
Waiting for them to go the same route with their GPU line as they did with their CPU line in regards to CCXs. Make a GPU core out of multiple GCXs and then scale the number of them based on the tier of the product. The higher yield rates alone would be worth it, let alone the far greater binning potential.
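The yield point can be sketched with a simple Poisson defect model. The die sizes, the "GCX" chiplet concept, and the defect density below are all hypothetical, chosen only to illustrate why several small dies out-yield one big one:

```python
import math

def die_yield(area_cm2, defects_per_cm2):
    """Poisson yield model: probability a die has zero killer defects."""
    return math.exp(-area_cm2 * defects_per_cm2)

D0 = 0.2  # assumed defect density in defects/cm^2 (hypothetical)

mono_area = 5.0      # one big 500 mm^2 monolithic GPU die
chiplet_area = 1.25  # four 125 mm^2 "GCX" chiplets covering the same area

mono_yield = die_yield(mono_area, D0)        # whole die must be defect-free
chiplet_yield = die_yield(chiplet_area, D0)  # each small chiplet survives independently

print(f"monolithic die yield: {mono_yield:.1%}")
print(f"per-chiplet yield:    {chiplet_yield:.1%}")
```

Under these assumptions each small chiplet yields roughly twice as often as the big die, and bad chiplets waste a quarter of the silicon that a bad monolithic die does, which is the same economics that made the CCX approach work for Zen.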
 
Can't have been fun for him at AMD either, when Lisa Su and co. took all the money for CPU development. It was clear there was no money even for an IC design for another SKU.
 
I find it interesting that even though Raja's tenure at AMD was far from perfect, there seems to be such demand for GPU skillsets that you are hard pressed not to find work in that field if you have some skills. What is the unemployment rate in GPU development? Is this kind of near-instant pickup only at the high end, or at the lower-tier engineer level too?
 
I find it interesting that even though Raja's tenure at AMD was far from perfect, there seems to be such demand for GPU skillsets that you are hard pressed not to find work in that field if you have some skills. What is the unemployment rate in GPU development? Is this kind of near-instant pickup only at the high end, or at the lower-tier engineer level too?

It's clear now that Raja left AMD; he didn't get fired. He had been actively seeking something else and got a position at Intel. It's going to be interesting to see how long it takes AMD to find a replacement, and who it will be.
 
I don't get some of the sentiment people write about Raja Koduri. If you work at AMD you have to fight windmills on a shoestring budget, and things don't get any easier when both GPUs you are launching are not clear winners.
The second-guessing of the memory choice is also something of a mystery, since Buildzoid has reported that without HBM, Vega's power footprint would be a lot larger.

Then you have a PR department that is non-functional. That part has to be the worst: they have never managed to dent Nvidia's market share, even when they had superior products.

Where does AMD find people who are built for an uphill battle? They don't come along very often. You can replace people, but there is no guarantee the replacements are an improvement...
 
Sorry to say, but I saw this coming a mile off and got flak for it in this forum...

I personally think that Raja came across as unprofessional, lacking in leadership skills and lacking in focus, at least on GPUs.

AMD's current GPUs are no better than AMD's CPUs from 10 years ago. They are in the same rut of simply performing minor iterations that are sub-par. The current architecture should have been abandoned 4 years ago.

I still maintain that the current "new" GPU is a lie, and not what AMD says it is, and most of the evidence has come out since its launch. I would still like to see a die shot that proves what its actual process node is.

However, AMD now needs to focus on getting its GPU act together, and ditch this outdated GPU architecture for something less Bulldozer.
 
You didn't hear it from me, but ATI is coming back and AMD will only be making CPUs later next year.
 
Who else gets the feeling that Navi is going to be a steaming pile of shit as well? Chances are we won't see anything remotely competitive from AMD/RTG until 2020'ish.
 
Who else gets the feeling that Navi is going to be a steaming pile of shit as well? Chances are we won't see anything remotely competitive from AMD/RTG until 2020'ish.

Reality is possibly going to be even harsher. Navi is a brand-new technology that I don't think they will be able to move off of quickly unless they have the capital now to do so. They couldn't do it with Arctic Islands. They were too hamstrung and needed to make lemonade out of lemons.

Therefore, given the research and development already sunk into Navi, it's going to be hard for them to move off of it for at least another five years from launch, I would assume.
 
The writing was on the wall way back at the event where they unsuccessfully hyped Vega and said, "EVERYONE IS GOING HOME WITH.............. a t-shirt"

He was planning his exit at that time or it was being planned for him.
 
You didn't hear it from me, but ATI is coming back and AMD will only be making CPUs later next year.
Of course we didn't hear it from you, because it isn't going to happen.

AMD owns too many technologies and patents for ATI's architecture.

What would ATI do without GCN, or FreeSync, or many of AMD's open-source initiatives?

Sorry, but I think you are pulling this out of your ass and wishcasting
 
One more thing about the new guy. Or should I say, potential new guy?

The number one goal for AMD and the RTG should be to close the performance gap with Nvidia: develop a halo product that is superior to Nvidia's. I do not believe they have the engineers capable of doing so in-house, at least in terms of design.

Therefore, whoever leads the RTG next needs to hire a chief architect who can design the next great architecture for video cards, just like AMD hired Keller to redo its processors.

Even if they are simply renting someone, like they did with Keller, they need someone who knows what they are doing to design something that can compete with Nvidia.

There is no other way they are going to be able to pull something out of their ass without outside help, in my opinion. They can't simply throw things against the wall and hope something sticks.
 
So many damn questions.

I really want to know what the future holds for RTG.

Their architecture was shit because they had to carry water for Arctic Islands, but their business savvy was excellent.

They were able to make lemonade out of the lemons that were Polaris, and despite issues with Vega, it at least holds a place and in some cases trades blows with Nvidia despite the Arctic Islands architecture limiting it.

They made everything open source to increase adoption of technologies in those cards that would otherwise have been ignored, which directly led to the games we see now in DirectX 12 and Vulkan.

If AMD had not established the partnerships with the Khronos Group and Microsoft, it would have been an Nvidia slaughter up until this point.

So while I am happy to see a video card veteran who is passionate about gaming be the one in charge of the group, he needs to hire the right people around him.

AMD simply cannot keep the status quo with the current engineers.

They need better engineers and better decision makers. They need to land a superstar chief architect.

Even if they rent one, like they did with Keller, to get them right for the next five to seven years, they need to do it.

Kyle, can you provide any more insight? At risk of you saying too much, is there anything, anything at all you can share that wouldn't jeopardize your sources and would shed light on where AMD goes from here? And anything at all on Navi? Are they expecting a Polaris level bomb or could this be something very good?
 
Navi is a brand new technology that I don't think they will be able to move off of quickly unless they have the capital now to do so.

Navi is about as brand new as all the GCN versions. The harsh truth is AMD has nothing post-GCN because the funding was removed. Navi is just another tiny change, with some slides thrown in. Vega was also "brand new", you know, NCU PR slides and all.

There is a reason why AMD needs a 500mm2, 345W, water-cooled die on a denser node with HBM2 against a 300mm2, 180W part using GDDR5X: 73.6% more transistors and 91.6% more power needed to achieve roughly the same.
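Those percentages follow from the commonly cited specs: Vega 10 at ~12.5B transistors and 345 W board power (liquid-cooled Vega 64) versus GP104 at ~7.2B transistors and 180 W TDP (GTX 1080). The power ratio actually works out to ~91.7%; the post truncates it to 91.6%:

```python
# Reproduce the transistor and power deltas from published specs.
vega_transistors, vega_watts = 12.5e9, 345.0   # Vega 10, liquid-cooled Vega 64
gp104_transistors, gp104_watts = 7.2e9, 180.0  # GP104, GTX 1080

transistor_pct = (vega_transistors / gp104_transistors - 1) * 100
power_pct = (vega_watts / gp104_watts - 1) * 100

print(f"{transistor_pct:.1f}% more transistors")  # 73.6%
print(f"{power_pct:.2f}% more power")             # 91.67%
```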
 
AMD always makes forward-looking designs (at least it has for a very long time now); sometimes this has paid off, other times it has not.

Why is Nv doing soooo great with the 1000 series? (I call BS on much of it, because I know how terrible they have been in the past, along with their continued proprietary customer/shareholder-shafting ways.) They trimmed the fat with the current chips and run them at much higher speeds; the end result is faster with less power. But, as per usual with Nv, they have basically gimped the ability to do "advanced things". The most advanced thing in them currently (IMO) is the way they handle clock speeds/boost/voltage to maintain TDP (which is closely monitored in terms of wattage, not just cooling), oh yeah, and GDDR5X.

AMD/Radeon are more or less gimped on raw clock speeds (though that they are still competitive while running at a deficit of ~500MHz or more says much about their tech, which is way fatter), but they are also able to leverage in hardware what Nv is unable or unwilling to do with anything but their top-tier pro cards.

Nv has a very bad habit (again, IMO) of ONLY using things once others have optimized them and shown their value, then coming to the party to one-up the competition in any number of ways, not always for the benefit of consumers/customers. Their 1000 series seems like a similar thing: do the minimal amount possible at wicked clock speeds instead of having the raw grunt. (The 200 all the way to the 700 series were essentially behemoths that could do it all, but paid for it in power/temps/voltages, so for the 1000 series they tore apart the basics to really focus on raw clock speed instead of all the fancy extras.)
-------------------------------------------------------------------------------------------------------------------

Tessellation was in AMD cards way sooner than in Nv cards, yet MSFT in its infinite wisdom shat on AMD and forced them to rebuild the engine for Nv's benefit.
Most of the GDDR/HBM standards have been put forward/built by AMD, not Nv; the sole exception in the longest time seems to be GDDR5X.
Proper soldering methods for the card/core are more or less thanks to AMD when it comes to graphics cards. Nv from the 6000 all the way up to the 9000 series had massive issues with lead solder cracking/failing; AMD used eutectic solder way sooner, and as a result of how well it worked out, Nv to my knowledge started using it from the 200 series through the current 1000 series.

I don't know, to each their own. AMD builds solid, no-compromise products. They might not be as "optimized" as, say, Nv, or as clock-speed/IPC focused as Intel, but their big-pickup-truck designs show they are well engineered; they just need to put more of the $$ into truly optimizing their designs, I suppose.
GCN is a solid uarch. Shame they are hamstrung by being FORCED to use GF or pay out tens to hundreds of millions of $ in wafer fines for using someone else, though GF seems to have been a fine process for the lower-clock-speed parts. Maybe they just need cards that use way more shaders at lower clock speeds to take advantage of the "known": if they cannot compete in a clock-speed war, maybe they can on a raw-grunt level?

------------------------------------------------------------------------------------------------------------------
Anyways, Raja likely left because he was at odds with Lisa Su, or the board was at odds with him over the 400/500/Vega lines not being as "amazing" as they thought they should be. Or maybe his focus all along was Navi and tuning the driver teams to support what was already on the books (like Jim Keller was lead for Zen and beyond, got it all sorted out, and then "left"). Maybe Raja was deciding, in a fancier way, whether he would stay long term, and it was a contractual obligation for him to even be a part of it, whereas Jim's contract had no stay-or-go-at-a-later-point clause.

I.e., be lead of a reformulated Radeon group, and if he felt comfortable with it he would stay; but he was not comfortable for whatever reason, so he "left". It seems awfully fishy to say the least. Either his leaving was warranted by his contract, or he was ousted; only AMD knows. It sure hurts their stock price, though. After the Intel deal I was expecting a solid increase in valuation ($15 minimum, especially after the many, many analyst views that AMD can't compete with Intel no matter how amazing Ryzen turned out, and the back-to-profit gain that bore fruit); it went up from $10-something to $12-something, then back to $11-something this morning.

Anyways, long story short, it's an interesting turn of events, but I saw this coming, because apparently he (Raja) was adamant that he would only be on board if he was lead of the RTG (Radeon). A boss does not just up and quit after a "sabbatical" unless he was owed vacation time or something along those lines. Which, again IMO, seems more like a case of his contract stipulating that he had the right to resign or take permanent tenure IF HE decided to proceed, or be given X paid time at a future date before leaving on good terms.

The RX 300 through Vega were not raw gaming-grunt designs like the Nv 1000 series, but they are still amazing in their own right. As always with AMD designs, software that can leverage what they have under the hood (VLIW5/4, GCN) seems to find a boatload of untapped, hard-to-reach oomph; unfortunately, game devs seem to pursue Nv designs, where the performance is "easier" to get out in a usable state, instead of chasing the lion's share of performance that can be had from Radeon designs.

------------------------------------------------------------------------------------------------
Just saying: for mining/code cracking, Radeons have for YEARS been absolute beasts when you can tap their potential, compared to Nv. Worded an odd way: it's easier to get the Nv card doing highway speeds (100mph), whereas it's harder to tap the Radeon to do race-level speeds while hauling a loaded semi (180mph).
Totally different designs, totally different concepts of what they want/need to do. I suppose Nv having more available $$ to optimize their product offerings for different sections of the GPU market doesn't hurt as much as AMD having less available $$ to support multiple fronts, not only GPU designs; so AMD has been unable to shed the fat and focus without screwing themselves in a different way.
 
Navi is about as brand new as all the GCN versions. The harsh truth is AMD has nothing post-GCN because the funding was removed. Navi is just another tiny change, with some slides thrown in. Vega was also "brand new", you know, NCU and all.

There is a reason why AMD needs a 500mm2, 345W, water-cooled die on a denser node with HBM2 against a 300mm2, 180W part using GDDR5X.

Educate me.

My understanding was that, for the longest time, AMD had hardware that was accessed via GCN that got ignored by game developers.

Vulkan and DX12 make accessing that hardware easier, hence the ridiculous gains in Doom by older AMD cards.

If I am right, why move away from GCN? Why not overhaul and modernize it if they can't afford to replace it? Hasn't Nvidia been on some form of CUDA core architecture forever?
 
Educate me.

My understanding was that, for the longest time, AMD had hardware that was accessed via GCN that got ignored by game developers.

Vulkan and DX12 make accessing that hardware easier, hence the ridiculous gains in Doom by older AMD cards.

If I am right, why move away from GCN? Why not overhaul and modernize it if they can't afford to replace it? Hasn't Nvidia been on some form of CUDA core architecture forever?

GCN is the uarch. And I'm not sure you want to use a highly sponsored game as the showcase; the average performance is far from it.

GCN is simply outdated and unable to deliver in a modern market. Hence the complete failure, and the massive extra transistors and power needed just to compete.

AMD needs something new, plain and simple. Just as Nvidia did with Kepler->Maxwell, for example, or Pascal->Volta.

You can only do so much with small tweaks and node changes to the same thing over and over. Just look at the perf/watt versus Hawaii, for example, which, unlike Vega, could do FP64. Hawaii was the high point of GCN.
 
GCN is the uarch. And I'm not sure you want to use a highly sponsored game as the showcase; the average performance is far from it.

GCN is simply outdated and unable to deliver in a modern market. Hence the complete failure, and the massive extra transistors and power needed just to compete.

AMD needs something new, plain and simple. Just as Nvidia did with Kepler->Maxwell, for example, or Pascal->Volta.

You can only do so much with small tweaks and node changes to the same thing over and over. Just look at the perf/watt versus Hawaii, for example, which, unlike Vega, could do FP64. Hawaii was the high point of GCN.

And your opinion that GCN is outdated is based on what? As far as I know, Vega is the only chipset to implement 100% of the required DX12 spec plus the optional extensions. It's also capable of working with exceptionally large datasets. That is something Nvidia chokes on.

GCN is quite literally more powerful and more versatile. But it isn't faster, because it's not streamlined.

It makes it a pickup truck in a race against a sports car.

Sometimes the things that come out of your mouth just leave me dumbfounded.
 
And your opinion that GCN is outdated is based on what? As far as I know, Vega is the only chipset to implement 100% of the required DX12 spec plus the optional extensions. It's also capable of working with exceptionally large datasets. That is something Nvidia chokes on.

GCN is quite literally more powerful and more versatile. But it isn't faster, because it's not streamlined.

It makes it a pickup truck in a race against a sports car.

Sometimes the things that come out of your mouth just leave me dumbfounded.

If GCN weren't outdated, we would see the Vega 64 actually compete with the 1080.
 
And your opinion that GCN is outdated is based on what? As far as I know, Vega is the only chipset to implement 100% of the required DX12 spec plus the optional extensions. It's also capable of working with exceptionally large datasets. That is something Nvidia chokes on.

GCN is quite literally more powerful and more versatile. But it isn't faster, because it's not streamlined.

It makes it a pickup truck in a race against a sports car.

Sometimes the things that come out of your mouth just leave me dumbfounded.

Vega 64 got about the same perf/watt as Fury X, which in turn was only marginally better than Hawaii. GCN at that stage was done for.

If you think a GPU that needs ~75% more transistors, ~90% more power, and exotic memory just to compete is great, then I don't know what to say. There is a reason why AMD GPUs don't sell to gamers, professionals, or HPC. And then we don't even have to mention GP100 and GP102, or GV100, which are completely out of reach.
 
Vega 64 got about the same perf/watt as Fury X, which in turn was only marginally better than Hawaii. GCN at that stage was done for.

If you think a GPU that needs ~75% more transistors, ~90% more power, and exotic memory just to compete is great, then I don't know what to say. There is a reason why AMD GPUs don't sell to gamers, professionals, or HPC. And then we don't even have to mention GP100 and GP102, or GV100, which are completely out of reach.

Yet they are selling every Vega they can make so far, and I consider Kyle far more informed on their sales than you.
 
Vega 64 got about the same perf/watt as Fury X, which in turn was only marginally better than Hawaii. GCN at that stage was done for.

If you think a GPU that needs ~75% more transistors, ~90% more power, and exotic memory just to compete is great, then I don't know what to say. There is a reason why AMD GPUs don't sell to gamers, professionals, or HPC. And then we don't even have to mention GP100 and GP102, or GV100, which are completely out of reach.

Again you harp on power versus performance. But GCN was designed to do much more than Pascal, hence the transistor overload.

Again: not outdated, just suboptimal with regard to raw 3D game speed
 
Again you harp on power versus performance. But GCN was designed to do much more than Pascal, hence the transistor overload.

Again: not outdated, just suboptimal with regard to raw 3D game speed

So you're saying it's a product that completely missed what the market wanted? It's not an HPC product, it's not a professional product, it's not a gaming product. What is it, then?

Raja left and joined Intel because he couldn't do anything at AMD due to no funding, including moving them beyond GCN, which they desperately need.

Yet they are selling every Vega they can make so far, and I consider Kyle far more informed on their sales than you.

And how many is that? Let's be honest, not that many, and even then you see stock everywhere. Not to mention the lack of interest in custom cards from AIBs.
 
So you're saying it's a product that completely missed what the market wanted? It's not an HPC product, it's not a professional product, it's not a gaming product. What is it, then?



And how many is that? Let's be honest, not that many, and even then you see stock everywhere.

It missed the target for gaming. But it isn't a miserable product there: when properly tuned it keeps up with the best Nvidia has to offer. But no game company wants to invest in that tuning.

It's great for the prosumer and professional market in things like video editing. And mining is spectacular.

We've rehashed these stupid points again and again and again. I can admit faults in the design, but it's certainly not outdated.
 
It missed the target for gaming. But it isn't a miserable product there: when properly tuned it keeps up with the best Nvidia has to offer. But no game company wants to invest in that tuning.

It's great for the prosumer and professional market in things like video editing. And mining is spectacular.

We've rehashed these stupid points again and again and again. I can admit faults in the design, but it's certainly not outdated.

"Properly tuned" meaning some sponsored game, hopefully catching Nvidia without proper drivers or a proper implementation when benched?

But where are those prosumer and professional sales? And mining? I don't think so. You bought the wrong card then.

You keep saying it's not outdated, but here we are. The head of RTG left, sales are abysmal at best, market share is collapsing. And the product needs liquid cooling, ~75% more transistors, ~90% more power, and ~50% more bandwidth to be able to compete. It got no worthwhile perf/watt advantage over Fury X even with a node shrink. AMD even had to remove FP16 from Fury X to avoid slowing it too obviously in the PR cases. I can see the success all over the product. It's about 2-3 generations behind is what it is.
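The ~50% bandwidth figure likewise follows from the published memory specs: 484 GB/s of HBM2 on the Vega 64 versus 320 GB/s of GDDR5X on the GTX 1080:

```python
# Memory bandwidth delta from published specs (GB/s).
vega_bw, gtx1080_bw = 484.0, 320.0

bw_pct = (vega_bw / gtx1080_bw - 1) * 100
print(f"{bw_pct:.2f}% more bandwidth")  # 51.25%
```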
 
It missed the target for gaming. But it isn't a miserable product there: when properly tuned it keeps up with the best Nvidia has to offer. But no game company wants to invest in that tuning.

It's great for the prosumer and professional market in things like video editing. And mining is spectacular.

We've rehashed these stupid points again and again and again. I can admit faults in the design, but it's certainly not outdated.

This is why I wanted him to explain. Nothing about GCN screams "outdated" to me.

There are some architectural flaws and scaling issues.

Did nvidia abandon CUDA?

No. They refined it, updated it, etc.

If Shintai were to say that PART of GCN were out of date, and that AMD needs to rework it to make the next iteration more efficient, then I could agree.

At this point, I have to wonder if he isn't conflating GCN with the Arctic Islands architecture.

GCN is what allows AMD to compete in DX12 and Vulkan games. But the Arctic Islands architecture was a limiting factor.

To discard GCN is for AMD to give up the one advantage it has. Just seems silly to me.

I do agree, though, that it needs to advance and evolve, but not yet be replaced. There are envelopes that could still be pushed in future revisions.
 