Vega Rumors

Can't dump Koduri.

He is a very capable engineer; he got locked into a situation created by the people before him.

But in any case, yeah, the rest of what ya stated was right.

AMD needs to focus on today, not tomorrow; features don't matter when there is no adoption because the product doesn't sell well.

Funny thing about it is, when ya see marketing talking about "new features" before they start talking about performance, the shit has already hit the fan. When ya start seeing "this feature gives this much performance" over some unnamed previous generation, lol, the shit has already hit the fan. That's why I said that word cloud a year back wasn't anything special, cause I thought they were comparing against a pre-Polaris architecture, and they were.
 

Then what does he need to be successful? That's the unknown question, because I feel even if they had 5 billion extra for R&D right now, they would continue to engineer features that don't have anything to do with gaming performance.

Maybe they need to take a page out of Nvidia's book: remove all compute features such as double precision, machine learning, etc. from AMD gaming cards. Strip it down.
I'm not a chip engineer and don't understand thermodynamics, but I would think less heat and less power draw from electrical loads that don't matter for gaming would result in more headroom for performance.
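For what it's worth, the intuition above can be sketched with first-order CMOS power math. The capacitance fraction, voltage, and clock below are made-up illustrative numbers, not real die measurements:

```python
# First-order dynamic power in CMOS: P ~ C * V^2 * f
# (switched capacitance, core voltage, clock). All figures are illustrative.

def dynamic_power(cap_switched, voltage, freq_ghz):
    """Approximate dynamic power, P ~ C * V^2 * f (arbitrary units)."""
    return cap_switched * voltage**2 * freq_ghz

# Hypothetical assumption: 15% of the switched capacitance sits in
# FP64/compute blocks that games never exercise.
full_chip = dynamic_power(cap_switched=1.00, voltage=1.2, freq_ghz=1.5)
stripped = dynamic_power(cap_switched=0.85, voltage=1.2, freq_ghz=1.5)

savings = 1 - stripped / full_chip
print(f"power saved by stripping compute blocks: {savings:.0%}")

# Spend the saved budget on clocks: with P linear in f, clocks can rise by
# roughly 1/0.85 before power returns to the original envelope.
extra_clock = full_chip / stripped
print(f"clock headroom at the same power envelope: +{extra_clock - 1:.1%}")
```

In practice idle blocks are clock-gated and mostly cost area and leakage rather than dynamic power, so the real gain from stripping compute hardware is smaller, but the direction of the argument holds.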
 

That is definitely the first step. I think that is what Navi's scalability is all about: it's not about Infinity Fabric, it's about having the same tech across multiple markets while staying focused on each specific market target.
 
They will with Navi, which is rumored to be an MCM design, and we know that Nvidia is planning their own MCM design post-Volta, though the question will be who leads the way between AMD and Nvidia when it comes to MCM.

Where does anything say Navi won't use the GCN architecture? MCM is just breaking the current design into more chips, which won't improve performance. MCM GCN won't beat Nvidia any more than Crossfire GCN on a single card has.

AMD needs a fundamental leap in its base technology, and it seems like they are out of ideas.
 
While it seems a little foolish to hop on the "wait and see" bandwagon, to be honest anything at 1080 performance and a similar price is just fine for me. Heat and power are a whatever... As a hotrodder it's cool to me that my LS1 can drink the fuel; the performance makes up for it. Are there more efficient platforms out there? Sure! Some of them can even make more power. My FX-9370 and 290 combo did very well for me; it ate power and made heat... It also provided more than enough performance in the vast majority of gaming and medium-workload tasks I asked of it. Bad comparo? Sure! I'm a pro mechanic, not a tech guy. Too bad you can't toss a camshaft and port job into a silicon thinking bucket with billions of switchymabobs.

I'd like to go 4K on the next build, and if Vega gaming can run with the 1080 I may stick on the AMD side of things. If not, a 1080 is a monster of a card and will do just fine. I like AMD, but am not a blind fanboy. I switched to ATI/AMD after 3dfx died during my high school days; at the price I could afford, the ATI gave a bit more horsepower vs. the comparable Nvidia on the shelves at the time. If they can't provide in the higher-end 4K bracket... well, it will be cool to finally try a good Nvidia.
 

Well, that is kinda what AMD has been doing for the past few gens: more is better, more horsepower should beat out the competition. It's not about how efficiently that horsepower is being utilized.

From the looks of it, Vega will not scale better at 4K. There was a specific reason for the Fury X to do this: it had more shader horsepower than the 980 Ti, so the bottleneck shifted toward shaders as resolution went up. Vega doesn't have that advantage against the 1080 Ti, nor is the GTX 1080 shader-bound at 4K, at least with current games.
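The Fury X vs. 980 Ti "shader horsepower" point can be put in rough numbers: peak FP32 throughput is shader count × clock × 2 (one FMA counts as two ops). Clocks below are approximate boost figures, not sustained speeds:

```python
# Peak FP32 throughput: shaders * clock * 2 ops (FMA), in TFLOPS.
# Clocks are approximate rated boost values, not guaranteed sustained speeds.

def peak_tflops(shaders, clock_mhz):
    return shaders * clock_mhz * 2 / 1e6

fury_x = peak_tflops(4096, 1050)     # Fiji: 4096 shaders @ ~1050 MHz
gtx_980ti = peak_tflops(2816, 1075)  # GM200: 2816 shaders @ ~1075 MHz boost

print(f"Fury X : {fury_x:.1f} TFLOPS")     # ~8.6
print(f"980 Ti : {gtx_980ti:.1f} TFLOPS")  # ~6.1
print(f"Fury X paper advantage: +{fury_x / gtx_980ti - 1:.0%}")
```

That roughly 40% paper advantage is why the Fury X closed the gap as resolution (and thus shader load) went up; Vega has no comparable surplus over a 1080 Ti.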
 
Instead of AMD making a chip that is fast for today's games, they want to include features that may matter five years from now.
All that extra crap slows down development and complicates driver support. Dump HBM, GCN, Infinity Fabric, whatever other complicated nonsense Koduri is thinking of, and just make a video card for right now.
Introduce features when they matter. The DX12 optimization on AMD didn't matter because DX12 sucks as a whole. They thought making cheap-ass cards with DX12 optimization was going to save them.

Or just dump Koduri, sell mobile/console/mid-range chips until you can find the right leader. He can't stand in Jensen Huang's shadow.

That's incredibly stupid from a project-management POV. Predicting what features are needed five years ahead is an incredible risk.

Nvidia learned a lot from Fermi. They stagger their product development into timeframes closely fitting a product refresh, gradually introducing new features and letting them get adopted while researching new ones.

Though to be honest, I don't think much of the software Nvidia has developed over the past two product cycles has been well received. At best it's been lukewarm. But it seems to force AMD to play catch-up just to tick enough boxes, while reducing manpower for actual chip development.

The only ones worth mentioning are SMP, tile-based rendering, and color compression. The rest are pretty meh, to be honest. With VR dying down, I don't see a need for AMD to even attempt SMP; just get playable latencies for the 1% that uses VR. Tile-based rendering is a pretty massive advantage that AMD needs to match, as well. They're also quite behind on color compression, which kinda forces them to use bigger buses or higher-cost memory tech just to stay even. And it looks like Vega is quite bandwidth-starved.
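To illustrate the color-compression point: lossless framebuffer compression multiplies the bandwidth the chip effectively has, so a weaker compressor has to be made up for with a wider or faster (pricier) memory system. Only the raw bandwidth figures below (320 GB/s for the GTX 1080, 484 GB/s for Vega 10) are real specs; the traffic split and compression ratios are assumptions for illustration:

```python
# Effective bandwidth when a fraction of logical traffic is compressible.
# Amdahl-style: that fraction shrinks by `ratio` on the memory bus.
# The traffic split and ratios are assumed, for illustration only.

def effective_bw(raw_gbs, compressible_fraction, ratio):
    return raw_gbs / (1 - compressible_fraction + compressible_fraction / ratio)

# Hypothetical: half of all traffic is compressible framebuffer data.
gtx_1080 = effective_bw(320, 0.5, 1.8)  # stronger compression (assumed 1.8:1)
vega_10 = effective_bw(484, 0.5, 1.3)   # weaker compression (assumed 1.3:1)

print(f"GTX 1080: {gtx_1080:.0f} GB/s effective from 320 GB/s raw")
print(f"Vega 10 : {vega_10:.0f} GB/s effective from 484 GB/s raw")
```

Under these made-up ratios, Vega's 51% raw-bandwidth advantage shrinks to about 33% effective, which is the sense in which weaker compression forces AMD to pay for more raw bandwidth just to stay even.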

AMD is lagging quite far behind, but panicking and rushing everything is the worst way to go.
 

They are lagging very far behind: one year just by launch timing of generations, plus another generation based on perf/watt, soon to be two generations with the imminent Volta launch. In total, almost a three-generation lead for nV.

That's why, when people say to wait for the next best thing from AMD and that they will compete: it's pretty much a given they are still playing the catch-up game. Unless nV screws up, it's not in AMD's hands to dictate anything.
 
Kinda curious, but could we be seeing the biggest fail in video card history? There were some real turds in the past, like the FX 5800, but this seems worse. Over a year late, sucking 2x the power, but still slower. The only shoe left to drop is the price.
 
Naw, Nvidia's FX series and AMD's 2900 series are still the biggest failures to me.
 
Yeah, the 2900 XT was bad. It was hyped up to be the most amazing thing ever, yet when it launched, ATI had a big "value" campaign plastered all over it. It was 8800 GTS performance while sucking more power than the 8800 GTX, and it was several months late, IIRC. It's a similar thing they did with Polaris: it was meant to be a much better performer but was hampered by clock speeds, so AMD started all this "disruptive product" bollocks, like that's what it was intended to be from the start.
 
It is certainly true that Navi may still use GCN; it was just a guess on my part, based on the rumor mill, that Navi may use MCM.
 
I think the 2900 XT was more of a flop than the FX 5800. The FX 5800 was salvaged somewhat with the 5950, and its performance, while subpar, still gave you some decent increases over the GF4s. The really great thing about the 5800 was the fan, where the push and pull parts of the shroud were placed right next to each other, so it could recycle its own air.

The 2900 XT was so bad that AMD took it off the shelves and tried to dump the cards on OEMs, most of which, I think, told them to shove off. Remember how long it was? lol, it had to have that extra fan that hung off the end of it, and some of the OEM versions had the grip bar so you could install/uninstall it.
 

Presumably these, xtx iterations apparently.

[attached image: 5h5xqwV.jpg]


The standard heatsink on the retail 2900 XT was OK for what it was; not sure why they went mental with that longer one for the "XTX". I mean, the overall size minus the bracket doesn't look too dissimilar to the 2900 XT.
 
I had a 2900xt. But I won it at a LAN party. I used it for a year and gave it to a friend.
 
I had two in Crossfire... that was an interesting experience, to say the least. I still have one sitting around as a paperweight, which it excels at given how much copper the heatsink was made of, which it badly needed.
 
Sounds similar to Vega really.

On a side note, I had the 256-bit 2900 Pro. After they cut the obscene memory bus it wasn't that bad of a card: bargain-priced at 2/3 of an 8800 GT at the time and still much more powerful than utter trash cards like the 2600 (I think that one was actually the biggest failure ever: all the crap from the 2900 XT without even an ounce of the power the 2900 had).
 
I had an HD 2900 Pro 512-bit. Man, did that card run HOT :D
 
Man, the good old days. Them naked pipes! Old cards just had something to them. I'm sure I'll say the same thing about today's cards in 10 years.
 
Cause there was no higher-cost FE for the 1060, or for the 1080 Ti; it really got some bad press.
The 1060 didn't have an FE, but the Nvidia model was $50 more expensive than the regular retail price, very similar to an FE; I don't believe they called it FE, if I remember correctly. The 1060 launch price was $249.99 for the 6GB model, but the Nvidia edition has been $299.99 to this day. So, sort of like a Founders Edition.
 
Yep, there was an FE for the 1060 with a higher cost; forgot about that. So with the 1080 Ti they dropped the cost.
 
That's because they have a history of showing numbers that were insulting to the IQ of any hardware enthusiast, and their PR is targeted at people who like to visit r/AMD.

Both companies are guilty of doing this. There have been many cases on both sides of quoted numbers being best-case scenarios in very specific situations: the Nvidia "wonder driver" for Kepler, for example, or are you forgetting the 970 fiasco?

Point is, anyone with any IQ shouldn't believe marketing numbers from either side.
 
nV doesn't do things like "poor Volta" ;). They also haven't compared to AMD products for quite some time; well, they did with the GTX 1060, but still, it did come out where they said it would.

The ONLY time nV cherry-picked was when they were down, the same thing AMD does ALL THE TIME. It's easy to read AMD, though: if they aren't chest-pumping, there are problem areas with their products.
 
Well, I did guess that you would reply to defend Nvidia rather than accept my point that you shouldn't believe marketing numbers from any company. Good job.
 
There is a reason why one way is better than the other. When nV stated the GTX 1080 was faster than the 980 Ti, it came out true. When they said the 1080 Ti was more than 20% over the GTX 1080, that came out true too.

Do you see AMD doing that?

Hell no.

They do things like "2x polygon throughput" or "2x the perf per watt" without giving reference points. Shit, they did it with Polaris too.

That is bad marketing, because it raises expectations to unrealistic levels.

Good job lumping AMD's crappy marketing in with nV's excellent marketing. While fanboys say nV's marketing is so much better and that is the only reason nV sells more, you try to equalize and pander to the lowest common denominator.

AMD did what they thought was best, fanboys took it to a whole new level, and this is what we get: Vega at 300 watts, and the only people who will be interested in this card are those who have been waiting for a year and a half and have FreeSync monitors.

The people that don't have a FreeSync monitor and have been waiting will just go elsewhere.

Do you know why they never told us the Polaris figures were based on the R9 270 and R9 280 until a few weeks before launch? Cause they knew how crappy Polaris really was! So instead of giving any hard figures that would let people figure out "oh, it's a GTX 970 equivalent a year later," they threw curve balls at us. While people assumed it was above the 390 series cards, they hyped the shit out of Polaris.

Same shit with Vega, man.

You never see nV doing this "fart cloud." When they say x amount of performance, it's a specific metric and they tell you exactly what that metric is; there is no room for wishy-washy thinking.

If you start seeing nV act like AMD in marketing, start expecting a really bad product.

Did it surprise you that Ryzen lost in games by the margin it did? For me, no, it didn't. That is why they never compared Ryzen to Intel in gaming before launch, at least not with actual frame rates: cause they knew they would lose.

Yes, things like that nV will do when they have an inferior product, which they don't right now and haven't had in many generations.

But on the flip side, AMD showed benchmarks of Vega FE with pro-capable drivers vs. GeForces without any pro optimizations. Do you see nV doing that? Why do you suppose AMD did it? Cause they knew they were going to get creamed in a straight-up benchmark. No doubt about it. RX? Forget it. Gaming drivers? Forget it. Anyone still thinking that really should rethink how their brain works, and those reasons came straight from AMD marketing too, lol. When was the last time we heard nV use the same excuses? The FX, when they had a bad product. We heard AMD say the same things in the past too, with the R600.

It all stems from the crap product. Believe the marketing or don't; it doesn't really matter. Marketing only works when the product is good in this type of market. And it's easy to see when marketing is lying to us: it's never straightforward. When there are layers of BS after BS without anything concrete, it's all crap, man. nV hasn't done that in many years, around 15 years.

AMD has a bad habit of doing it even when their products are decent, because they feel their image is inferior. Which is true: their branding is weaker. They don't know how to brand their products. Epyc, really? What, did they get the same marketing guy from Crytek? Ryzen? Sorry, but all these Y's don't make the name cool.

How many times has nV changed the name of their graphics cards, or the ordering of the name? Three times in the past 15 years? How many times has AMD done this? Oh, many more times than that, more like almost every two generations. How are they going to retain branding and product recognition when they do things like that?

After the A64's success, WTF did they change the name to Phenom for? They killed their brand when they did that, cause now when people see Phenom they will think it's something new and go look up the benchmarks.

IF they had stuck with A64, Phenom would have been more of a success. Again, branding: they screwed themselves over.

I can give so many examples where AMD marketing shot themselves in the foot multiple times in one launch. No other company in tech is so incompetent at marketing. When marketing is that bad, get rid of everyone ya got on both the CPU and GPU side, from the top down, cause if you don't get rid of the top, the lower-level people start thinking with the same f'ed-up ideals as the top.
 
The 970 was more a product flaw than marketing. The 970 still gave the best performance value at that price for most people, and the 3.5GB didn't change the performance or the comparisons until the news came out. Would putting 3.5GB up front have affected sales? Unlikely.
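A quick sketch of why the 970's layout only bit in edge cases: 3.5 GB runs at full speed (~196 GB/s) while the last 0.5 GB sits behind a ~28 GB/s path. Assuming, unrealistically, that accesses are spread evenly across the whole allocation, the harmonic mean gives the effective bandwidth:

```python
# GTX 970 segmented memory: 3.5 GB fast (~196 GB/s) + 0.5 GB slow (~28 GB/s).
# Uniform access across the allocation is a worst-case simplification; the
# driver actually steers hot data into the fast segment.

def effective_bandwidth(alloc_gb, fast_gb=3.5, fast_bw=196.0, slow_bw=28.0):
    """Harmonic-mean bandwidth for accesses spread evenly over alloc_gb."""
    if alloc_gb <= fast_gb:
        return fast_bw
    fast_frac = fast_gb / alloc_gb
    return 1 / (fast_frac / fast_bw + (1 - fast_frac) / slow_bw)

for gb in (3.0, 3.5, 4.0):
    print(f"{gb} GB allocated -> ~{effective_bandwidth(gb):.0f} GB/s")
```

Even in this worst case the cliff only appears past 3.5 GB, which few 2014-era games touched, consistent with reviews not catching it until the spec discrepancy surfaced.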
 

I don't believe any marketing on principle, but in the last few years, when an Nvidia presentation promised a 20% increase, reviews after launch showed around 20% gains across the board, while AMD showed stuff like two RX 480s in Ashes...

The 970 I didn't really care about, since I buy based on what I see in professional tests, not the specification itself; plus I bought it after the 3.5GB issue was known.
 
As much as I want RTG to succeed and compete, there is a clear difference in marketing and product delivery. Now, it might fire up true fanboys for AMD, but Nvidia just brings the card, drops it in front of you, and tells you how fast it is, and it is that fast. No bullshit whatsoever. They don't have to bring babes to a show and go on a tour and shit. They just show you the card, tell you how much it costs and how fast it's gonna be, and profit!
 
The only thing that's going to sell AMD cards at this point is monitors: cheap, high-quality panels with FreeSync. The cards may not sell themselves, but honestly, for the money, some of the FreeSync panels out there make having a lesser card OK.
 
Nvidia once had some kind of driver release that showed something like a 70% bump in performance in the slides for it; from what I remember, it was basically enabling SLI for a game that had no support for it, and that's where they got the number from. So yes, they both talk complete and utter shite about performance: they cherry-pick scenarios that give them the best numbers and then ramble on about it like it's the case for all instances in the title in question.
 
If a game didn't support SLI and Nvidia rolled out a driver that enabled SLI, then what's wrong with showing the gains?

AMD advertises the best-case gains for their drivers all the time.
 
I think both companies suck and lie in their marketing. I mean, come on, people, they lied about the GTX 970 and got caught; they even had a lawsuit and a settlement where I got $60 back for false advertisement.

AMD, to me, is no better at marketing and PR.

The last good marketing AMD did was the 5870 and the Eyefinity launch. That shit had Nvidia dumbfounded when it got released; it was because of that marketing that I bought two 5870s and three 24-inch Dell monitors. Oh, and Kyle and his reviews didn't help either, LOL.
 
What a long post saying absolutely nothing except your total love for Nvidia, and yet it does not change anything or make anything that I have said wrong.

You should not buy a card from AMD or Nvidia based on their marketing. Wait until proper reviews come out. Anybody who buys a card based on marketing slides from either company is an idiot.
 