AMD Chief Confirms New High-End GPU: "We Will Be Competitive in High-End Graphics"

AMD has not released a GPU with meaningful overclocking headroom in a very long time now. You expect that to suddenly change with 7nm Navi? Why exactly?

2080 Ti performance from 7nm Navi is pie in the sky dreaming, IMO.
Not saying AMD will be at 2080ti parity with their release of 7nm Navi, but Vega is a better OCer than you give it credit for.

Vega at 1700 MHz with an unlocked, raised power limit is no slouch.
(GamersNexus testing)
12%-20% OC gains vs stock are pretty worthwhile.

It puts Vega 56 at 2070 levels for the most part.

The major con here being power consumption. That's the price of being competitive though. (And not a big deal to me at least... This isn't 2990WX levels of power consumption.)

I for one wish Nvidia would give users the option to push their cards more inefficiently. Even at the expense of silicon lifespan. At least shunt mods exist. I don't appreciate their hardware hand-holding. If I enter 2.0 vcore on a CPU, I should be able to kill the darn thing. Let me kill a GPU dammit :mad:
 
There's that goal-post move I was waiting for!



Fuck I wish. If AMD had a reasonably-priced product that embarrassed my 1080Ti with hybrid cooler? I'd own it.

I did not move any goalposts. You made the space-heaters comment; sorry to say, you can't back out of that.

Shit, I remember when Vega launched and was such a failure even AMD fans couldn't ignore it; all I heard was, "Raja didn't have enough time to have much effect on Vega. Wait for Navi." So this thread is hilarious to me... now it's all Raja's fault since he left.

From what I saw, it was his fault before he ever left.
 
I did not move any goalposts. You made the space-heaters comment; sorry to say, you can't back out of that.

It's hotter than the competition at stock, significantly hotter at overclocked. Nothing to move away from.
 
AMD has not released a GPU with meaningful overclocking headroom in a very long time now. You expect that to suddenly change with 7nm Navi? Why exactly?

2080 Ti performance from 7nm Navi is pie in the sky dreaming, IMO.

I didn't say to expect 2080Ti performance. But 1080Ti/2080 performance is more than reasonable.
1080 × 1.35 ≈ 1080Ti (the actual gap is about 30%).
AMD has stated multiple times that the 7nm node gives a 1.35x performance uplift.

Throw in some OC and that 1080Ti performance level moves closer to the 2080Ti. Not all the way there, but towards it.
That's IF they release it for consumers.
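For what it's worth, here's a rough back-of-the-envelope sketch (Python) of that arithmetic. Only the 1.35x node claim and the ~30% 1080-to-1080Ti gap come from this thread; the OC multipliers are assumptions, purely for illustration:

```python
# Back-of-the-envelope scaling, using a GTX 1080 as the 1.00 baseline.
# Only the 1.35x node figure and the ~30% 1080 -> 1080 Ti gap come from
# the discussion above; the OC multipliers are hypothetical.

GTX_1080 = 1.00
GTX_1080_TI = GTX_1080 * 1.30        # roughly 30% faster than a 1080
NAVI_7NM_CLAIM = GTX_1080 * 1.35     # AMD's stated 1.35x uplift from 7nm

for oc in (1.00, 1.10, 1.15):        # stock, +10% OC, +15% OC (assumed)
    print(f"7nm Navi at {oc:.2f}x OC: {NAVI_7NM_CLAIM * oc:.2f} "
          f"vs 1080 Ti at {GTX_1080_TI:.2f}")
```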


There's that goal-post move I was waiting for!
You called a 290X a space heater, while it has the same TDP as your 1080Ti. Probably less, since yours is AIO water-cooled.

But yes, I hope they do release something that can beat or equal a 1080Ti. I'm down.
Sweet FreeSync on a 4K HDR VRR TV, yes please.
 
It's hotter than the competition at stock, significantly hotter at overclocked. Nothing to move away from.

GPU temperature has nothing to do with warming a room, so please stop blasting ignorant things as fact. Wattage is what matters, and if you crank those 2000 series cards up they're right there with a stock Vega. It's also the dumbest argument ever. You and Dayaks run around defending Nvidia like it's a job lately, when you should be happy if AMD can compete again in the high end, since that should lower the overpriced 2000 series, or the 3000 series by then. Also, the 290X was a great card. I ran it from launch until about a year ago and it performed fine; a great card to last that long.
 
I didn't say to expect 2080Ti performance. But 1080Ti/2080 performance is more than reasonable.
1080 × 1.35 ≈ 1080Ti (the actual gap is about 30%).
AMD has stated multiple times that the 7nm node gives a 1.35x performance uplift.

Throw in some OC and that 1080Ti performance level moves closer to the 2080Ti. Not all the way there, but towards it.
That's IF they release it for consumers.



You called a 290X a space heater, while it has the same TDP as your 1080Ti. Probably less, since yours is AIO water-cooled.

But yes, I hope they do release something that can beat or equal a 1080Ti. I'm down.
Sweet FreeSync on a 4K HDR VRR TV, yes please.
Apple did not get 35% more performance from the 7nm node. But ok. Point taken. I will believe it when I see it, personally. It will be great news if true.
 
GPU temperature has nothing to do with warming a room, so please stop blasting ignorant things as fact. Wattage is what matters, and if you crank those 2000 series cards up they're right there with a stock Vega. It's also the dumbest argument ever. You and Dayaks run around defending Nvidia like it's a job lately, when you should be happy if AMD can compete again in the high end, since that should lower the overpriced 2000 series, or the 3000 series by then. Also, the 290X was a great card. I ran it from launch until about a year ago and it performed fine; a great card to last that long.

I don’t defend any company. I just state facts. Or at least a fact based opinion I can go back and debate with data.

Apple did not get 35% more performance from the 7nm node. But ok. Point taken. I will believe it when I see it, personally. It will be great news if true.

I have similar skepticism with those types of claims. Same bucket as nVidia's int/fp hybrid pipeline for CUDA cores in the RTX cards. They had some lofty claims that might be true if the stars aligned just right...
 
They're hotter because they consume more wattage :ROFLMAO:

More ignorance. They run hotter due to the weaker cooling on the stock 290X, which is why most people replaced it with water cooling or waited for the aftermarket-cooled cards. At least the card's cooling was not nearly as bad as the leaf-blower Fermi cards. One day you might learn that GPU temperature has nothing to do with how hot or cold the room it's in will be.
 
what we have right now is thanks to that "competition".

Yes it is. What point are you trying to make? That you want AMD out of the market entirely?

Two months from games that still aren't slated for release that were shown at a convention?

Do you bitch that you cannot yet buy a 2021 model year car in 2018 too?

I have always thought that your name fits. Tomb Raider was released September 14th; it's now October 28th, and we are fast approaching two months with no RTX in sight. Do you want to make another false equivalency?

Until ray tracing is in consumers' hands, it's just another promised technological improvement, in a long line of promised improvements that have yet to be seen beyond tech demos and glossy magazines.
 
The 290X launched competing with and easily beating Thermi (it was already beaten by the prior generation - that was my point) and the Titan. Then the 780s came out. For a period of time the 780Ti held a tiny lead, especially over the shitty reference-cooler 290X models. Then the aftermarket cards came out, the drivers got better, and guess what: since then the 290X has been faster, even today, by quite a decent margin, especially in VRAM-limited scenarios. Keep in mind I run the very best model of these cards and am quite aware of its history.
The 290X came out in October 2013. The GTX 780 came out in May of 2013. The GTX 680 had been available for a little more than a year when the 780 came out. In other words: Kepler was out for nearly 2 years by the time the 290X was released. The 290X wasn't competing with Fermi.
Yes it is. What point are you trying to make? That you want AMD out of the market entirely?



I have always thought that your name fits. Tomb Raider was released September 14th; it's now October 28th, and we are fast approaching two months with no RTX in sight. Do you want to make another false equivalency?

Until ray tracing is in consumers' hands, it's just another promised technological improvement, in a long line of promised improvements that have yet to be seen beyond tech demos and glossy magazines.
One month. The RTX cards came out September 27, and the required Windows 10 update that adds DXR to DX12 wasn't released until October 2. Regardless of what the timeline is like for Shadow of the Tomb Raider at this point, DICE has already announced that ray tracing will be available in Battlefield V at launch next month.
 
Bingo. What I don't get is that Nvidia has literally 11 products in the consumer market right now that aren't old stock, for market saturation. You can buy exactly what you need for your budget. If anyone complains that something is too much, I pretty much just see them as an entitled person who demands luxury for a quarter of the price and can't believe a company is trying to make a profit.

There are two ways to look at it, and I think if you look at it right there is a very valid argument from the gamers' perspective that the industry is messed up right now.

Nvidia does have a swath of products out there for every price point. You are right on that. People are complaining things are getting too expensive. They are kind of right on that. The price is going up. However, if there were no market for the price, the top price point wouldn't constantly be getting pushed higher, so someone is buying.

HOWEVER, and that's a big however, there is a huge argument to be made that things are stagnating without competition. Yes, there's a nice $399 part out there, but how much has the performance $399 would buy changed since AMD started moving product development to the package editing team rather than people who design new hardware? Even if you shuffle that $399 price point around a bit to account for inflation, the news isn't great. Which is why I think a lot of people reflexively reference the CPU landscape, where even if it hasn't been huge leaps, the industry has been making an effort to dole out more performance at the same price point rather than just inventing new price points to house more performant SKUs.

For Nvidia or AMD, being in the bitcoin mining or AI processor business may be fine and profitable, but we won't like it as gamers. That path leads to... meh, everyone should just buy a console. And at some point that may be true: if consoles are driving dual 8K displays fast, with real-time ray tracing and all the hard- and soft-body physics calculations, etc., there won't be much practical reason to do otherwise.
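Just to make the inflation point concrete, here's a rough sketch (Python) of how you might compare what a fixed $399 price point buys over time in real dollars. Every number in it is a hypothetical placeholder, purely for illustration:

```python
# Hypothetical comparison of performance per inflation-adjusted dollar at a
# fixed $399 price point. All numbers below are made-up placeholders.

def to_todays_dollars(nominal_price, cpi_then, cpi_now):
    """Scale a past nominal price into today's dollars using a CPI ratio."""
    return nominal_price * (cpi_now / cpi_then)

CPI_NOW = 110.0  # hypothetical current price index

# (label, nominal price, relative performance index, price index at the time)
cards = [
    ("$399 card, a few years ago", 399, 1.00, 100.0),
    ("$399 card, today",           399, 1.10, 110.0),
]

for label, price, perf, cpi_then in cards:
    real_price = to_todays_dollars(price, cpi_then, CPI_NOW)
    print(f"{label}: {perf / real_price * 100:.2f} perf per $100 (real)")
```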
 
I can't see AMD releasing FUD statements at this point in their game. They've been pretty honest lately, and for good reason.
 
There are two ways to look at it, and I think if you look at it right there is a very valid argument from the gamers' perspective that the industry is messed up right now.

Nvidia does have a swath of products out there for every price point. You are right on that. People are complaining things are getting too expensive. They are kind of right on that. The price is going up. However, if there were no market for the price, the top price point wouldn't constantly be getting pushed higher, so someone is buying.

HOWEVER, and that's a big however, there is a huge argument to be made that things are stagnating without competition. Yes, there's a nice $399 part out there, but how much has the performance $399 would buy changed since AMD started moving product development to the package editing team rather than people who design new hardware? Even if you shuffle that $399 price point around a bit to account for inflation, the news isn't great. Which is why I think a lot of people reflexively reference the CPU landscape, where even if it hasn't been huge leaps, the industry has been making an effort to dole out more performance at the same price point rather than just inventing new price points to house more performant SKUs.

For Nvidia or AMD, being in the bitcoin mining or AI processor business may be fine and profitable, but we won't like it as gamers. That path leads to... meh, everyone should just buy a console. And at some point that may be true: if consoles are driving dual 8K displays fast, with real-time ray tracing and all the hard- and soft-body physics calculations, etc., there won't be much practical reason to do otherwise.


I agree with you there. I appreciate you logically putting up a perspective that makes sense. Though in this same situation, I have no idea what Nvidia should do to look better in public for an obviously glaring issue that AMD is having. Did you see the recent post about the 590? It's going to be pretty much a 1070 at $300 new. Sigh. I just bought a 1070 from the forums for $200 a little over a month ago. So AMD is releasing another meh product that they can rightfully charge what they want for, but it sounds like it will just be another card that's two years too late and more expensive than their competitor's current card, which has a very healthy used market.

I would absolutely love it if AMD did in GPUs what they did with their CPUs: release products that perform almost the same for half the price. That would be absolutely wonderful, IMHO. Not smart... but wonderful.

(Though you can get a 580 4GB right now for $150... so that's really damn nice.)
 
The 290X came out in October 2013. The GTX 780 came out in May of 2013. The GTX 680 had been available for a little more than a year when the 780 came out. In other words: Kepler was out for nearly 2 years by the time the 290X was released. The 290X wasn't competing with Fermi.

The 780Ti, which the 290X competed with, came out Nov 13th... after the 290X.
 
The 780Ti, which the 290X competed with, came out Nov 13th... after the 290X.
 
And at some point, that may be true if consoles are driving dual 8k displays fast with real time raytracing and all the hard and soft body physics calauclations, etc., there won't be much practical reason to do otherwise.
It will happen. Eventually everything will be a little SoC, like a flash drive. I guess we'll be driving 360° dome screens by then, and beyond that, tech starts getting a lot weirder ;)

You chose the 780 to make your point; I chose the 780Ti. There is an entire pitch between those two goalposts.
 
The 780Ti, which the 290X competed with, came out Nov 13th... after the 290X.

The 290X was still cheaper than the 780Ti and performed very similarly. Like I said, the 290X was a great card for AMD that got even better with additional cooling. The 780Ti was a good card; it just cost more.
 
The thing I love about my Vega 56 is that although it does have a higher power draw, that only occurs when everything on the card is being fully utilized. Of course, that's true of Nvidia cards as well.
 
Yeah, they were behind for a bit, but the point is it wasn't for long; people only ever remember launch for some weird reason.

If you didn't live the launch and survive the aftermath, the only thing you see when you look back in history is, for the most part, launch. It's documented by multiple sites and most sites don't revisit launch reviews when companies manage to make their products better (Exhibit A: No Man's Sky). nVidia, for the most part, manages to get their products out at close to what each product is going to perform at for most of their useful life whereas AMD/RTG usually releases product below potential (hyped or realized) and improvements over time come into play.

Someone who is oblivious to this scenario (i.e. the majority of consumers) will search for reviews, catch one or two of them, see nVidia at XYZ and AMD/RTG at ZYX and conclude that they should choose the better one, unaware that the other card superseded the prior winner over time.
 
I can't see AMD releasing FUD statements at this point in their game. They've been pretty honest lately, and for good reason.
Well FUD is the wrong term, but she hasn't really committed to much of anything beyond "we'll be competitive in high end graphics again". Easy to be honest when it's overly vague - she's not exactly sticking her neck out.
 
Yes it is. What point are you trying to make? That you want AMD out of the market entirely?



I have always thought that your name fits. Tomb Raider was released September 14th; it's now October 28th, and we are fast approaching two months with no RTX in sight. Do you want to make another false equivalency?

Until ray tracing is in consumers' hands, it's just another promised technological improvement, in a long line of promised improvements that have yet to be seen beyond tech demos and glossy magazines.

My point is that we want competition to give us cheaper options, but this is also what makes all those companies kill and devour each other.
 
... One day you might learn that GPU temperature has nothing to do with how hot or cold the room it's in will be.

This shows a complete misunderstanding of how cooling works.

You can cool any processor or heat source with a heat sink, but you will never go below ambient temperature, without some form of refrigeration.

Even a water cooler with a big radiator is still coupling to ambient.

If the room is 20°F, the heat source will be XX degrees warmer than that, based on how good the cooler is; add 30° to the room, the heat source gets 30° warmer.

That's very basic thermodynamics.
 
Competition is critical to the functioning of a capitalist free market; without it, all you have is corporatism. You can jump through any hoops you want to justify it, but that's reality. AMD needs to compete, so bring it, team red.
I think you missed it by just a tad.

If AMD does not compete with Nvidia and Nvidia's prices continue to grow, then the margins become lucrative to other potential competitors in this space. We don't need AMD for the principles of capitalism to flourish. Actually, it might be better if they dropped the ball again and Nvidia priced cards even higher next generation. That might entice new, energetic competition and new ideas. If you are tired of AMD dropping the ball, maybe it's time for a new competitor, or three. Competition will show up when comfortable profit margins invite it. Razor-thin margins mean continued limping along.
 
This shows a complete misunderstanding of how cooling works.

You can cool any processor or heat source with a heat sink, but you will never go below ambient temperature, without some form of refrigeration.

Even a water cooler with a big radiator is still coupling to ambient.

If the room is 20°F, the heat source will be XX degrees warmer than that, based on how good the cooler is; add 30° to the room, the heat source gets 30° warmer.

That's very basic thermodynamics.
Indeed, which is why it's the delta and not the absolute temperature that matters.
 
This shows a complete misunderstanding of how cooling works.

You can cool any processor or heat source with a heat sink, but you will never go below ambient temperature, without some form of refrigeration.

Even a water cooler with a big radiator is still coupling to ambient.

If the room is 20°F, the heat source will be XX degrees warmer than that, based on how good the cooler is; add 30° to the room, the heat source gets 30° warmer.

That's very basic thermodynamics.

Where the hell did I mention sub-ambient cooling? You managed to miss the point of my comment, which is that the temperature a GPU runs at has nothing to do with how it heats the room it's in. What matters is the amount of wattage the heatsink dumps into the room. So if one GPU is running at 90 degrees Celsius while putting out 300 watts of heat and another is running at 80 degrees Celsius but putting out 350 watts of heat, then the cooler GPU is actually heating the room more. The actual temperature of the GPU has more to do with the efficiency of the heatsink and the overall room temperature.
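A minimal sketch (Python) of that point, reusing the 90°C/300W vs 80°C/350W numbers above with made-up thermal resistances for the coolers: the heat dumped into the room tracks the wattage, while the die temperature is just ambient plus how good the cooler is.

```python
# Minimal sketch: heat into the room is the card's power draw; die temperature
# is ambient plus power times the cooler's effective thermal resistance.
# The thermal-resistance values are made up to reproduce the example above.

def die_temp_c(ambient_c, power_w, r_theta_c_per_w):
    """Die temperature = ambient + power * effective thermal resistance."""
    return ambient_c + power_w * r_theta_c_per_w

AMBIENT_C = 25.0

cards = {
    "hotter-running card": {"power_w": 300, "r_theta": 0.217},  # ~90 C die
    "cooler-running card": {"power_w": 350, "r_theta": 0.157},  # ~80 C die
}

for name, c in cards.items():
    t = die_temp_c(AMBIENT_C, c["power_w"], c["r_theta"])
    print(f"{name}: die ~{t:.0f} C, but dumps {c['power_w']} W into the room")
```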
 
Dude, adding just a person and their gaming rig to a room is roughly a 1000W heater; ever notice how hot the room gets at a LAN party, even with the AC cranked up?

I have a 3-ton HVAC unit, and if I've got 6 or 7 of us with our rigs, it's going to be 80 degrees in here with it maxed out.

That's just the way it works. :zdunno:

That's one reason we tend to do more LAN parties in the winter; 6 roughly-1000W systems, added to 6 pissed-off people (not everyone wins, lol), means we have to have the doors open and fans running.

If a GPU dumps heat AT ALL, it warms the ambient.

The HVAC system removes the heat, but that doesn't mean it doesn't exist.
 
Dug this up and posted it in a more obscure thread a while back, but you might find it interesting after your comment about how the architectures are similar. It goes much further than you realize and is well documented. These are memory-structure block diagrams; R700 is the pre-GCN HD 4870 etc., and very little has changed in basic functionality since then, let alone within GCN itself.

[Attached memory-structure block diagrams: R700/HD4800 series, R9 280/HD7900, R9 200/290X, Vega]


Check out the family block structure as well, quite similar.

[Attached family block diagrams: R700/HD4800, R9 280/HD7900, R9 200/290X, Vega]

What do you expect to have changed here? These are just block diagrams; they don't show changes to the blocks themselves. The common theme is that they show a front end feeding into execution units and from there into a back end. That kind of topology will never really change much unless you're trying to do something pretty radical. Even just going by the change from DX10 to DX11 to DX12, adhering to the spec alone *demands* some night-and-day changes to how data is accessed, stored, and shared. We know that is the case, which means the nitty-gritty implementation details are not something a block diagram will convey.

Nvidia GPUs have much better GPGPU chops, which explains the extra cache levels/SRAM scratchpads, but (pre tensor-core era) those exist for flexibility and developer-friendliness reasons in the compute field. They have only very marginal uses in rasterization, which wholesale ignores memory latency and depends entirely on memory bandwidth.
 
AMD is competitive until you get to the high end. They simply don't have a card to compete with the 1080Ti, let alone the 2080Ti... but unless you are looking for the highest-end cards, AMD will serve you just fine. I have a Vega 64, and it's working VERY well with every game I throw at it at 3440x1440. Is my card the fastest? No. Is it enough to play most games at ultra quality settings at a 75 Hz (the maximum for my monitor) refresh? Sure is!

When you get to the $200-$300 cards, the value proposition is even better on the AMD side, and the VAST majority of graphics cards sold fall into this price range.

The high end is the only part of the market I, and most other enthusiasts, care about. I'm glad your hardware works for you, that's ultimately what matters. Which monitor are you running? I just ordered an ROG Swift PG348 to go with my 2080 Ti. I would have gone 4k but the only ones worth getting are the new, ridiculously-priced HDR monitors that I just can't justify buying. I figure 3440x1440 is a resolution that a 2080 Ti ought to be able to drive at 100 FPS for years to come.
 