AMD's best Big Navi GPU may only be a match for Nvidia Ampere's second tier

I own an RX 480 and I love the card, but when the Bitcoin market boomed the price of RX 580s skyrocketed. That left Nvidia cards like the GTX 1060 as the better bargain during the Bitcoin boom. RX 580s and 570s are starting to creep up in the Steam Hardware Survey, suggesting that people are now starting to pick up these bargains. But as an owner of a Vega 56 and a Fury card, I can say that AMD doesn't deserve the same money as Nvidia. OpenGL performance is still shit on Windows, though perfectly fine on Linux. For a while people had serious stability problems, particularly on the Radeon VII. Until AMD fixes these things, they don't deserve the same money Nvidia is asking for.




The high-end market is not the market. Looking at Steam's Hardware Survey, the market is the GTX 1060, 1050, 1050 Ti, 1070, 1650, and 1660 Ti, in that order from high to low. Meaning anything $250 or less. RDNA2 won't disrupt shit. The RX 480 was a market disruptor. The GTX 970 was a market disruptor. Not the GTX 1080, or the RTX 2080, or the Radeon VII. Unless AMD plans to release an RDNA2-based card at $250 with ray-tracing support, nothing will change. The graphics card that Nvidia and AMD need to beat is the RX 580; as of right now it stands as the single best performance-per-dollar card on the market. The second card to beat is the GTX 1060.
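Just to make the perf-per-dollar point concrete, here's a rough sketch of the kind of math I mean. The relative performance index and street prices are placeholder assumptions for illustration, not benchmark data:

```python
# Rough performance-per-dollar comparison. The relative performance index
# and street prices below are illustrative placeholders, not measured data;
# swap in your own benchmark numbers and current prices.
cards = {
    # name: (relative_performance, price_usd) -- hypothetical figures
    "RX 580 8GB":   (100, 180),
    "GTX 1060 6GB": (102, 200),
    "GTX 1660 Ti":  (135, 280),
    "RTX 2060":     (160, 330),
}

for name, (perf, price) in sorted(cards.items(),
                                  key=lambda kv: kv[1][0] / kv[1][1],
                                  reverse=True):
    print(f"{name:13s} perf/$ = {perf / price:.3f}")
```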

I have more faith that Intel will be a bigger market disruptor than AMD. Intel has to try really hard if they want any market share.


Not sure why you think Intel would have more of a chance, lol. I mean... they completely missed their target already on a low-end discrete GPU, they still don't have their 10nm capacity figured out, and 7nm just got delayed. I doubt they will release a discrete consumer GPU before 2022, if they don't pull the plug by then for bleeding off resources. And now Raja is leaving... Maybe Intel just figured out he wasn't worth what they thought he was, or he figured out Intel is Intel and he can't get his job done. Either way, it doesn't speak well of an incomplete/delayed project. This is all rumors, but I have much more faith in AMD disrupting than Intel at this point (and disruption is a relative term; you can create ripples/disruption in a calm lake with a small rock, it doesn't mean the lake is completely gone).
 
I wholeheartedly agree we need another competitor in this space. nV and AMD have effectively fixed prices and stagnated progress. Whether complicit or incidental, the consumer has been losing out.
Well Intel is not exactly known for having low-cost products, so I'm not sure how that would change the dynamic.

I think it is more likely that AMD will come out swinging with a 2080 Ti competitor at, say, $800 than Intel shaking things up on price.
 
RDNA2 is not just Big Navi. It's an architecture that will apply to all ranges, up and down the tiers. Hence it may actually be a market disruption.
We've seen for years now that AMD has tried multiple times to compete in the high-end market with no technology trickling down to the mid or low end. Have you seen a cheap Vega graphics card? Have you seen a cheap version of the Fury cards? The only thing AMD produced for the past 4 years for the mid-to-low-end market was the RX 5500 XT, and that thing is a total joke compared to the RX 580. At least Nvidia is trying, and failing, in that market range with the 1650s and 1660s. Seriously, those are hardly an upgrade from a GTX 1060.

Whatever AMD releases will not be cheap and "MAY" sadly be competing at the performance level of an RTX 2080 Ti. That would have been impressive 2 years ago, but Nvidia is about to release a new range of products. I expect AMD to rebrand the RX 5700s into RX 6600s, and those are based on RDNA 1.0, not 2.0. I'm going by AMD's history, because rebranding products is something AMD has done often over the past 8 years. If you're right then I'm not going to complain, but I really REALLY doubt we'll see a cheap RDNA2 GPU for at least another year from now. By cheap I mean $400, which isn't mainstream pricing.

Intel hasn't shown anything but a fancy fan shroud. That's about it for Chintel so far.
No doubt Intel's first products will barely be able to compete against something like a GTX 1660. But they'll probably have ray tracing and be priced competitively, because even Intel knows that Nvidia is the king of the hill, despite how much they like to inhale their own farts. Intel will help drive down prices and help disrupt the market.
 
Is it coming out swinging to release a 2080Ti competitor 2 years later, around the same time as the 3080 that will be around the same price?
Good point. Originally I thought AMD might start a price war but I don't think this is likely, especially if the supply chain is constrained with everything going on.

Fact is, most people aren't spending $1,200 bones on a GPU. Yeah, I think the 2080 Ti was worth it for me, but the price is too steep in general.

So yeah, it's 2 years later, but less than 1% of gamers are on a 2080 Ti. Might be 2 years old for us on the forum, but it would be a huge jump for a lot of people at $800, or even $700 if AMD wants to play hard.

I doubt AMD will beat Nvidia's new best, but even if they get 2nd place that is not so bad. It would still be 1440p144 or 4K60 capable and that is a good sweet spot for the market.
 
Fact is, most people aren't spending $1,200 bones on a GPU. Yeah, I think the 2080 Ti was worth it for me, but the price is too steep in general.

If $1200 were actually too steep, the price wouldn't still be $1200 after 2 years on the market. The group that all the "but it's too expensive" complainers should be mad at is all the other people happily paying $1200. This is basic supply and demand.

So yeah, it's 2 years later, but less than 1% of gamers are on a 2080 Ti. Might be 2 years old for us on the forum, but it would be a huge jump for a lot of people at $800, or even $700 if AMD wants to play hard.

There's more to sales and marketing of a product like a GPU than selling lots of low- and mid-tier units. Halo products are worth a million times more than the revenue they bring in directly, despite the small percentage of the installed base they make up. A company becoming known for having the fastest GPUs on the market has a trickle-down effect that influences purchases all the way down the product stack. Thus it's a safe bet that Nvidia will do everything in their power to try to always have the fastest GPU available no matter what, even if they're losing money on it and only low quantities are manufactured. This is why the "but the 2080 Ti is only <1% on Steam stats" argument misses the bigger picture.
 
Fact is, most people aren't spending $1,200 bones on a GPU. Yeah, I think the 2080 Ti was worth it for me, but the price is too steep in general.

So yeah, it's 2 years later, but less than 1% of gamers are on a 2080 Ti. Might be 2 years old for us on the forum, but it would be a huge jump for a lot of people at $800, or even $700 if AMD wants to play hard.
People who buy $800 or $700 graphics cards are like 7% of the market. Steam's Hardware Survey shows that only 1% of Steam users own an RTX 2080, 0.71% own a 2080 Super, 2% own a 2070, and another 2% own a 2070 Super. Radeon VII owners don't even show up on the Steam Hardware Survey, and the 2080 Ti makes up 0.91%. The GTX 1060 on its own makes up 12%.
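If you want to check the rough math, summing those shares gets you to the ballpark figure:

```python
# Sum the Steam Hardware Survey shares cited above to sanity-check the
# "roughly 7% of the market" figure. These are the percentages quoted in
# this post, not freshly pulled survey data.
shares = {
    "RTX 2080":       1.00,
    "RTX 2080 Super": 0.71,
    "RTX 2070":       2.00,
    "RTX 2070 Super": 2.00,
    "RTX 2080 Ti":    0.91,
}
print(f"High-end Turing share: ~{sum(shares.values()):.2f}%")  # ~6.62%
print("GTX 1060 alone: ~12%")
```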
I doubt AMD will beat Nvidia's new best, but even if they get 2nd place that is not so bad. It would still be 1440p144 or 4K60 capable and that is a good sweet spot for the market.
Nvidia isn't competing against whatever GPU AMD is making, but against the console market itself. Nvidia didn't win any contracts with Sony or Microsoft, and they certainly don't want gamers buying these consoles instead of their GPUs. If Nvidia were to simply move the ladder down, with an RTX 3080 performing like an RTX 2080 Ti, then they would lose market share. If Nvidia makes the RTX 3070 perform like an RTX 2080 Ti, then Nvidia can push the console market into the corner where 30 fps will be the norm once again. This won't be won through sheer performance but through ray tracing. My belief is that Nvidia is planning a big performance increase when it comes to ray tracing. Fact is, the RTX line of cards so far is a joke, and Nvidia realizes this due to the lack of sales. The GTX 1060 is still #1 on Steam because Nvidia has failed to make a compelling product that sufficiently surpasses it in performance per dollar. Considering the failure that is the GTX 1650s and 1660s, it doesn't look like Nvidia was even trying. That's why they burped out "SUPER" versions of cards to try and entice consumers with slightly better performance.

When the GTX 970 was released it performed like a GTX 780 Ti: a $330 GPU performing like a $700 GPU. That was no happy accident of engineering. The GTX 970 was released right after the PS4 and XB1 launched. Nvidia could have asked for $400 or more, which was the asking price of the R9 290, but they didn't. Nvidia will likely do the same this time around, and AMD may be caught off guard, like when they priced the RX 5700 and 5700 XT and Nvidia came out lowering prices before AMD even had a chance to release those products. AMD's RDNA 2.0 card may not even be 2nd place.
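For a sense of how big that jump was, here's a quick back-of-the-envelope using the launch prices above (performance parity between the two cards is the post's premise, not something this verifies):

```python
# How much the GTX 970 moved performance per dollar, using the launch
# prices quoted above. Parity with the 780 Ti is assumed as stated.
gtx_780_ti_price = 700
gtx_970_price = 330

ratio = gtx_780_ti_price / gtx_970_price
print(f"Same performance tier for {ratio:.1f}x less money "
      f"({ratio - 1:.0%} better perf per dollar in one generation)")
```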
 
People who buy $800 or $700 graphics cards are like 7% of the market. Steam's Hardware Survey shows that only 1% of Steam users own an RTX 2080, 0.71% own a 2080 Super, 2% own a 2070, and another 2% own a 2070 Super. Radeon VII owners don't even show up on the Steam Hardware Survey, and the 2080 Ti makes up 0.91%. The GTX 1060 on its own makes up 12%.

Nvidia isn't competing against whatever GPU AMD is making, but against the console market itself. Nvidia didn't win any contracts with Sony or Microsoft, and they certainly don't want gamers buying these consoles instead of their GPUs. If Nvidia were to simply move the ladder down, with an RTX 3080 performing like an RTX 2080 Ti, then they would lose market share. If Nvidia makes the RTX 3070 perform like an RTX 2080 Ti, then Nvidia can push the console market into the corner where 30 fps will be the norm once again. This won't be won through sheer performance but through ray tracing. My belief is that Nvidia is planning a big performance increase when it comes to ray tracing. Fact is, the RTX line of cards so far is a joke, and Nvidia realizes this due to the lack of sales. The GTX 1060 is still #1 on Steam because Nvidia has failed to make a compelling product that sufficiently surpasses it in performance per dollar. Considering the failure that is the GTX 1650s and 1660s, it doesn't look like Nvidia was even trying. That's why they burped out "SUPER" versions of cards to try and entice consumers with slightly better performance.

When the GTX 970 was released it performed like a GTX 780 Ti: a $330 GPU performing like a $700 GPU. That was no happy accident of engineering. The GTX 970 was released right after the PS4 and XB1 launched. Nvidia could have asked for $400 or more, which was the asking price of the R9 290, but they didn't. Nvidia will likely do the same this time around, and AMD may be caught off guard, like when they priced the RX 5700 and 5700 XT and Nvidia came out lowering prices before AMD even had a chance to release those products. AMD's RDNA 2.0 card may not even be 2nd place.

People who buy $700+ cards make up only 7% of the market, but the profit margins to nV for that segment are much higher than the mid-range, and these margins increase as you move up the product line. Same goes for AMD, more so since they are likely already cranking out in volume on the node with TSMC. This is why both companies put so much focus on high-end chips/cards. And like you say, nV is thoroughly motivated to push the envelope in the high-end both in traditional rendering and ray tracing so they can once again own the enthusiast segment. On the other side, AMD may be aiming to put out a better-priced second-place card that will sell well.

As for nVidia worrying about enthusiasts (i.e. most of us here) moving to consoles, there is a very low risk of this group moving away from PC -- we're a captive market. It's the mid-range where the battle will be fought. If nV can put out a mid-range card that has both superior performance and unique features, e.g. RT (even with those features meaning next to nothing), they'll keep market share. If they put out a middling mid-range lineup that doesn't beat console performance, they'll rapidly lose market share to consoles.

One thing is clear, nV will once again be slicing and dicing their chips to create different models that they can strategically position against AMD over time. There will almost certainly be Tis and Super models up the wazoo in the coming year.

Edit: I'm loath to post rumors, but a few more pieces of info are looking to solidify: https://wccftech.com/amd-radeon-rx-big-navi-rdna-2-gpu-november-launch-no-hbm2-tsmc-7nm-rumor/

16GB with 512-bit bus. Big Navi may be more expensive than we thought...
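For a sense of what a 512-bit bus would mean, a quick bandwidth estimate. The per-pin data rates are assumed typical GDDR6 speeds, not confirmed Big Navi specs:

```python
# Back-of-the-envelope memory bandwidth for a rumored 512-bit GDDR6 bus.
# The per-pin data rates are assumptions, not confirmed specs for Big Navi.
BUS_WIDTH_BITS = 512

for gbps_per_pin in (14, 16):
    bandwidth_gb_s = BUS_WIDTH_BITS * gbps_per_pin / 8  # GB/s
    print(f"{gbps_per_pin} Gbps GDDR6 on a {BUS_WIDTH_BITS}-bit bus "
          f"-> {bandwidth_gb_s:.0f} GB/s")
# 14 Gbps -> 896 GB/s, 16 Gbps -> 1024 GB/s. A bus that wide also means a
# lot of memory chips and board complexity, part of why it wouldn't be cheap.
```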
 
I wholeheartedly agree we need another competitor in this space. nV and AMD have effectively fixed prices and stagnated progress. Whether complicit or incidental, the consumer has been losing out.

AMD did not fix prices nor stagnate progress, as you put it. They simply had nowhere to go with the position they were in financially and the state that Raja left RTG in. Now, on the other hand, I would say that Nvidia most definitely fixed prices and stagnated progress in the last couple of years, but it does not matter; as long as people buy them, they will sell them.
 
AMD did not fix prices nor stagnate progress, as you put it. They simply had nowhere to go with the position they were in financially and the state that Raja left RTG in. Now, on the other hand, I would say that Nvidia most definitely fixed prices and stagnated progress in the last couple of years, but it does not matter; as long as people buy them, they will sell them.

Not really. You can moan about Nvidia’s prices but lack of progress? How much bigger do you want their chips to be?
 
AMD did not fix prices nor stagnate progress, as you put it. They simply had nowhere to go with the position they were in financially and the state that Raja left RTG in. Now, on the other hand, I would say that Nvidia most definitely fixed prices and stagnated progress in the last couple of years, but it does not matter; as long as people buy them, they will sell them.

That's what I meant by qualifying that they have "effectively" fixed prices, complicitly or incidentally. By "progress" I'm referring to price/performance progress, which has been largely flat over the past 4 years. What we have now is a pricing dynamic where nVidia launches at exorbitant prices at the high end, this trickles down to the mid-range, and AMD follows suit without any real competitive pricing. Two stars rotating in a fixed orbit. Throw another star in there and the fixed predictability inherent to a binary system is no longer possible.
 
That's what I meant by qualifying that they have "effectively" fixed prices, complicitly or incidentally. By "progress" I'm referring to price/performance progress, which has been largely flat over the past 4 years. What we have now is a pricing dynamic where nVidia launches at exorbitant prices at the high end, this trickles down to the mid-range, and AMD follows suit without any real competitive pricing. Two stars rotating in a fixed orbit. Throw another star in there and the fixed predictability inherent to a binary system is no longer possible.

I do not agree, since, at least in my opinion, the 5700 and 5700 XT were priced exactly where they should have been. In fact, you can now get these cards, often on sale, for less than MSRP, and although I have not looked them up in a while, the 5600 XT prices are probably going down as well. Unlike Nvidia card pricing, where prices remain high no matter what. (That is just the way it is and that is what it is.) The fact that folks want AMD to sell their cards significantly cheaper, and therefore miss out on R&D money as a consequence, just so they can hope Nvidia lowers their prices, is disingenuous at best. (You may not be doing so, but many do.)

Eventually, the RX 580 and lower cards will be gone, but for now they are here and they are what's cheaper at this time. (Although they are starting to show their age, as well.)
 
Not really. You can moan about Nvidia’s prices but lack of progress? How much bigger do you want their chips to be?

Their performance, general performance, was insignificantly faster than the previous generation, while costing significantly more. The fact that new tech was added is really the only good reason for the price increase, but that stuff was not at all useful when it was first released. The fact that they have big chips is not my problem nor my concern; I do not exist to make them money.

Edit: But we already know that.
 
One day, Big Navi will be the fastest card ever produced by man, ever. Next day, more like mid-tier. Tomorrow it will be low powered, then the cycle will return back to fastest ever.

It might be nice to see actual product and numbers. Tired of the rumor mill spin cycle.
 
One day, Big Navi will be the fastest card ever produced by man, ever. Next day, more like mid-tier. Tomorrow it will be low powered, then the cycle will return back to fastest ever.

It might be nice to see actual product and numbers. Tired of the rumor mill spin cycle.
It's the new Schrödinger's cat: fastest and slowest at the same time until we open the box.
 
One day, Big Navi will be the fastest card ever produced by man, ever. Next day, more like mid-tier. Tomorrow it will be low powered, then the cycle will return back to fastest ever.

It might be nice to see actual product and numbers. Tired of the rumor mill spin cycle.

It will likely only claim the fastest-ever title if it beats Ampere to market. Last year that looked likely. This year it's starting to look like Ampere will be here first.
 
Their performance, general performance, was insignificantly faster than the previous generation

This is false.

while costing significantly more.

This is true.

The fact that new tech was added is really the only good reason for the price increase, but that stuff was not at all useful when it was first released.

You may not value new tech like DLSS or RT and don't want to pay the early adopter tax but clearly other people do. Objectively, Nvidia made considerable progress in both features and performance with Turing. Certainly far more than Navi did.
 
Is it coming out swinging to release a 2080Ti competitor 2 years later, around the same time as the 3080 that will be around the same price?

You really believe the 3080 is going to be the same price as the current 2080 lineup?

I doubt it... Nvidia has been more than willing to increase the price every cycle. I doubt this time will be different, unless AMD really does have a card that can compete with the 3080.
 
This is false.



This is true.



You may not value new tech like DLSS or RT and don't want to pay the early adopter tax but clearly other people do. Objectively, Nvidia made considerable progress in both features and performance with Turing. Certainly far more than Navi did.

Essentially, Nvidia's parts were, at least in part, renamed and priced significantly higher overall. For example, the 2080 Ti became the new Titan X, but with increased cost. Cost-wise, objectively, the performance was essentially the same. Once again, that is simply the way it is, and if someone wants to spend that money on something that, initially, is unsupported, that is on them.
 
https://mobile.twitter.com/Underfox3/status/1290380141693198345

There's a HUGE patent dump of AMD GPU technology in that thread; it has RDNA+, HPC, and console-related patents.

Ray tracing, enhanced-efficiency matrix multiplication, fine-grained hint-based frequency and voltage boosts, variable rate shading, and 3D stacking related patents.

Not every patented technology will appear in a product, but this still shows us what AMD has been working on.
 
Essentially, Nvidia's parts were, at least in part, renamed and priced significantly higher overall. For example, the 2080 Ti became the new Titan X, but with increased cost. Cost-wise, objectively, the performance was essentially the same. Once again, that is simply the way it is, and if someone wants to spend that money on something that, initially, is unsupported, that is on them.

Exactly. You can either claim prices went up or performance didn't. But you can't claim both in the same argument :)
 
I'm just not sure this would necessarily be a huge problem, even if true. Okay, fine, so Nvidia will hold the ultimate performance crown with their $1300+ 3080 Ti. However, if the next-gen AMD Big Navi RDNA2 card is on the level of the plain 3080 or even the 3070 / 3070 Ti depending on the variant, then all AMD needs to do is trade blows with those cards at a better value. Prices have been jacked up considerably over this past generation (remember when a 1080 Ti was a bloody $700 card, best of the best by far, etc.). If AMD can make the top-end RDNA2 cards come in with comparable raw performance to the 3080 / 3070 Ti / 3070 lineup, provide equal or better support for ray tracing plus things like DLSS (ideally with open tech), and manage to do it with lower pricing, they could have a winner on their hands. Going a bit further on specs in some areas (i.e. perhaps offering more RAM than the comparable NV card at that tier, which makes it even more useful in certain cases) would also help.

Sure, there will be a lot made of whoever has the raw top of the line, but considering that the regular NV "Numbered Ti" top of the line has been priced into the stratosphere where before only TITAN cards would go, that's not an accessible or desirable option for many. The next generation of GPUs will also arrive alongside a new generation of consoles, so it could be a significant step forward as people compare. So long as the next high-end RDNA2 card is comparable and comes out on top in terms of features and/or price in the "standard high end" tier, it's likely to be set up well. AMD has done well in this arena in the past; let's hope they can carry it forward this generation.
 
You really believe the 3080 is going to be the same price as the current 2080 lineup?

I doubt it... Nvidia has been more than willing to increase the price every cycle. I doubt this time will be different, unless AMD really does have a card that can compete with the 3080.

I have no doubt AMD will have a 3080 competitor. Given how they priced the VII and the ongoing 7nm shortages you can be sure it will be priced similarly to the 3080 too.
 
I heard AMD's big navi was gonna DISRUPT THE FUCK out of 4k gaming!! SWeet!!! Will have like a 4 layer chip and like 8x the transistor count of the first navi. 4 gpu's stacked running in sli. So bam! And they are hiding extra ram in the stack too, so if a cheapo oem doesn't put enough ram on it still has its shit going nice. And the 8x transistors means its 8x as fast as nvidias transistors which are only 1x, and the RTX will look really real. Disney is going to make real virtual reality out of it, so you walk into a room and it's like a whole other place. You can climb the virtual reality trees, jump in the water, pretty nuts! That means making lots of money and makes it more disruption! That's because the economy will get a small earthquake from the disruption. Who knows what that really means but some 3rd world countries will go bankrupt because of it. That's the shit right there!

Second tier is all lies, they scared of the disruption when its 8x faster, it's a whole other level.
 
 
Edit: I'm loath to post rumors, but a few more pieces of info are looking to solidify: https://wccftech.com/amd-radeon-rx-big-navi-rdna-2-gpu-november-launch-no-hbm2-tsmc-7nm-rumor/

16GB with 512-bit bus. Big Navi may be more expensive than we thought...

I don't think there will be a workstation or prosumer version of the card as the article suggests. Those will come in the form of the MI100, which is based on CDNA instead of RDNA, and those cards will have 32GB of HBM2e. If there is an HBM2e version of Navi 2, as the Linux drivers indicate, it would very much be a consumer version of that card.
 
I'm just not sure this would necessarily be a huge problem, even if true. Okay, fine, so Nvidia will hold the ultimate performance crown with their $1300+ 3080 Ti. However, if the next-gen AMD Big Navi RDNA2 card is on the level of the plain 3080 or even the 3070 / 3070 Ti depending on the variant, then all AMD needs to do is trade blows with those cards at a better value. Prices have been jacked up considerably over this past generation (remember when a 1080 Ti was a bloody $700 card, best of the best by far, etc.). If AMD can make the top-end RDNA2 cards come in with comparable raw performance to the 3080 / 3070 Ti / 3070 lineup, provide equal or better support for ray tracing plus things like DLSS (ideally with open tech), and manage to do it with lower pricing, they could have a winner on their hands. Going a bit further on specs in some areas (i.e. perhaps offering more RAM than the comparable NV card at that tier, which makes it even more useful in certain cases) would also help.

Sure, there will be a lot made of whoever has the raw top of the line, but considering that the regular NV "Numbered Ti" top of the line has been priced into the stratosphere where before only TITAN cards would go, that's not an accessible or desirable option for many. The next generation of GPUs will also arrive alongside a new generation of consoles, so it could be a significant step forward as people compare. So long as the next high-end RDNA2 card is comparable and comes out on top in terms of features and/or price in the "standard high end" tier, it's likely to be set up well. AMD has done well in this arena in the past; let's hope they can carry it forward this generation.

Pretty much. I couldn't care less if nV takes the performance crown again, or about the accolades they get for it. I'm not buying their absurdly overpriced top tier anyway, and I'm not influenced by halo products. All I care about is who has demonstrably better performance in the $700-800 range, and that this card is notably faster than where this range has been stuck for 4+ years (it needs to perform at least like a 2080 Ti, ideally better). Whoever has best price/performance in this range gets my money.
 
I have no doubt AMD will have a 3080 competitor. Given how they priced the VII and the ongoing 7nm shortages you can be sure it will be priced similarly to the 3080 too.

I agree no one leaves money on the table.

I was just more commenting on the silliness of thinking NV will keep 3080 prices where 2080 prices were. The price is going up even if AMD has some competition for them. They believe they can get away with it and they probably can. What's the saying... fool me once. 3080 Tis are going to be unaffordable for the vast majority of gamers. It was hard enough justifying $700 GPUs that were only on top for a year... and then it was even harder to justify $1000 GPUs even if they stayed on top for a couple of years. I have a feeling this round NV and AMD are both going to be asking us to justify $1100-1200 for the top-of-the-line gaming card. (They will both have even more expensive Titan/pro prosumer creative-type SKUs as well.) I'm sure the tech will be awesome, and performance for both companies is going to be fantastic. But at some point I know I have to tap out and say... so what are you guys offering in a mid-ranger (that will no doubt cost what we used to pay for the top-end cards)? Unless games really take a massive leap up in the IQ dept, I feel like I'll be sticking with my 5700 XT for a few years yet.
 
I agree no one leaves money on the table.

I was just more commenting on the silliness of thinking NV will keep 3080 prices where 2080 prices were. The price is going up even if AMD has some competition for them. They believe they can get away with it and they probably can. What's the saying... fool me once. 3080 Tis are going to be unaffordable for the vast majority of gamers. It was hard enough justifying $700 GPUs that were only on top for a year... and then it was even harder to justify $1000 GPUs even if they stayed on top for a couple of years. I have a feeling this round NV and AMD are both going to be asking us to justify $1100-1200 for the top-of-the-line gaming card. (They will both have even more expensive Titan/pro prosumer creative-type SKUs as well.) I'm sure the tech will be awesome, and performance for both companies is going to be fantastic. But at some point I know I have to tap out and say... so what are you guys offering in a mid-ranger (that will no doubt cost what we used to pay for the top-end cards)? Unless games really take a massive leap up in the IQ dept, I feel like I'll be sticking with my 5700 XT for a few years yet.

End of the day it doesn’t really matter what happens at the top of the market as long as you stick to your budget. I’m pretty sure the upcoming $300-$400 cards will bring considerable perf/$ gains.
 
They believe they can get away with it and they probably can. What's the saying... fool me once. 3080 Tis are going to be unaffordable for the vast majority of gamers.
3080Tis aren't intended for the vast majority of gamers, nor are they needed by the vast majority of gamers.

As for prices seeming like they're going up, I really recommend reading this: https://en.m.wikipedia.org/wiki/Supply_and_demand

So many people have this childlike and emotional understanding of corporations as though they're competing in a high school popularity contest, rather than being legally obligated to make the most profit in service to their shareholders.
 
End of the day it doesn’t really matter what happens at the top of the market as long as you stick to your budget. I’m pretty sure the upcoming $300-$400 cards will bring considerable perf/$ gains.

I bet the perf/$ gains at the lower end of that range (~$300) are tiny. Today that gets you an RX 5600 XT or an RTX 2060 KO. I expect you only get a tiny increment on this.

Maybe $400 cards will get a bigger bump.

The biggest bump will be in $600+ cards.
 
Whoever has best price/performance in this range gets my money.
Biggest issue for AMD here is that there are a lot of moving parts that they're not necessarily tracking on.

One of those is figuring out how to do something similar to DLSS, effectively. When you're talking about 4K, then adding in RT, then adding in HDMI 2.1 / DP 2.0 capable displays that can run >60Hz at 4K, you're now in territory that no extrapolation of currently understood AMD technology would arrive at. But a 3080 (Ti) / 3090 is expected to.
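Rough numbers on why those link standards matter: uncompressed 4K at 120Hz with 10-bit color needs about 30 Gbit/s of pixel data alone, which is beyond HDMI 2.0's ~18 Gbit/s but within HDMI 2.1's 48 Gbit/s. A quick sketch of the arithmetic:

```python
# Approximate uncompressed signal bandwidth for a 4K 120Hz 10-bit (HDR)
# display. Ignores blanking intervals and DSC, so treat it as a lower bound.
width, height, refresh_hz, bits_per_channel = 3840, 2160, 120, 10

gbit_per_s = width * height * refresh_hz * bits_per_channel * 3 / 1e9
print(f"~{gbit_per_s:.1f} Gbit/s of pixel data")  # ~29.9 Gbit/s
print("HDMI 2.0 tops out around 18 Gbit/s; HDMI 2.1 allows up to 48 Gbit/s.")
```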

And then there's RT itself, which the highlight implementation on the upcoming consoles (UE5) ignores. Great, bring that to the PC as well! But they need to figure out RT. AMD half-assing RT on their debut RT product would be typical, but it also might be fatal. And it goes hand in hand with whatever intelligent upscaling technology they've yet to release as well.

But at some point I know I have to tap out and say... so what are you guys offering in a mid-ranger (that will no doubt cost what we used to pay for the top-end cards)? Unless games really take a massive leap up in the IQ dept, I feel like I'll be sticking with my 5700 XT for a few years yet.
I've had my 1080 Ti longer than I've had any GPU in the last decade or so now -- AMD wants ~US$400 for approximately equivalent performance and has nothing better to offer. And honestly, while I might decide at the time of purchase to grab something more expensive, I'm more likely to stick in that range, and there's nothing there today. For the same price, at best I could get a 2080 Super.

So yeah, they're going to have to work to get me to move on, and I do want to move on because I want to get an LG 48CX and run it at full speed :)
 
When you're talking about 4K, then adding in RT, then adding in HDMI 2.1 / DP 2.0 capable displays that can run >60Hz at 4K, you're now in territory that no extrapolation of currently understood AMD technology would arrive at.

I'm not sure that's accurate; AMD had Eyefinity up and running 10 years ago. Those were 8K multi-monitor setups running 10-bit color at 60Hz, and lower resolutions at 120Hz. They were gaming on them, too; I remember seeing an F1 2010 full-size cockpit/wheel/pedals multi-screen racing setup.
 
I'm not sure that's accurate; AMD had Eyefinity up and running 10 years ago. Those were 8K multi-monitor setups running 10-bit color at 60Hz, and lower resolutions at 120Hz. They were gaming on them, too; I remember seeing an F1 2010 full-size cockpit/wheel/pedals multi-screen racing setup.
It's entirely accurate.
What you're talking about is using eight separate display connections instead of one, and using an operating system and driver solution that was / is significantly more convoluted than using a single physical display.

Further, you're talking about using multiple GPUs to get those outputs and also to provide the rendering power for them, a technology that, while not lost, is not currently feasible for modern games.

What I'm talking about (rather clearly) is whether AMD will be able to provide a single GPU that can provide the performance to power a single 4k120 HDR display with ray tracing.

As it stands, that ain't in the cards.
 
What I'm talking about (rather clearly) is whether AMD will be able to provide a single GPU that can provide the performance to power a single 4k120 HDR display with ray tracing.

As it stands, that ain't in the cards.
Well, to be fair, Nvidia can't do that either.
 
What you're talking about is using eight separate display connections instead of one, and using an operating system and driver solution that was / is significantly more convoluted than using a single physical display.

So it was even harder back then? Also AMD Eyefinity included single-card solutions driving up to six displays.

I think it's unrealistic to believe that AMD is unaware of high resolution, high frequency displays.
 
Well, to be fair, Nvidia can't do that either.
Ampere looks like it probably will. Big Navi / Navi 2, mostly doesn't.
So it was even harder back then? Also AMD Eyefinity included single-card solutions driving up to six displays.
Easier, as pre-DX12 technologies handled more of that in the driver stack, game developers had to do less work to use them, and the technologies were heavily promoted and supported by vendors. DX12 / Vulkan have mostly put those initiatives on hiatus.
I think it's unrealistic to believe that AMD is unaware of high resolution, high frequency displays.
Unaware? No. Incapable of addressing?

For reasons that I cannot fathom, yes.
 