6800 / XT Review Round Up

MaZa

2[H]4U
Joined
Sep 21, 2008
Messages
3,099
The answer is "no"


There is no "AMD's version".

DirectML Super Resolution is a feature of DirectX.

It is still technically AMD's version because it is designed in collaboration with Microsoft, AFAIK. Please don't pick my words apart. But thank you anyway.
 

GoodBoy

[H]ard|Gawd
Joined
Nov 29, 2004
Messages
1,886
both AMD and Nvidia are failing (to meet market demand) because TSMC + Samsung cannot supply the market with GPUs.

We are all losing in this.

I think this is the reality.

With only two fabs able to supply chips for (nearly) everything (desktop CPUs, server CPUs, cell phones, GPUs, consoles), there's just not enough capacity to go around. The fabs, if they are smart, will just raise prices. Supply and demand. It means everything in the future is going to cost more, because increased output from smaller dies is not enough to offset increased demand.

So I am really glad AMD is finally competitive, at least in rasterization. That is enough for AMD fans and Fortnite players. They will sell as many cards as they can produce. That should hold Nvidia's prices to reality.

But if demand continues to outpace fab capacities, higher prices for everything is likely in our future.
 

DukenukemX

Supreme [H]ardness
Joined
Jan 30, 2005
Messages
5,072
The situation seems to reverse for the 6800 vanilla vs. the 3070.

The 3070 seems to be running into memory size limitations at 4K, especially with ray tracing on (Shadow of the Tomb Raider, for example).

At lower resolutions, the high overclocking ability of the 6800 gives it a leg up.

Right now DLSS is the only ace in Nvidia's hand. As soon as AMD comes up with its own equivalent, the 6800 should match or better the 3070 in all scenarios, including future-proofing with extra memory.
Battlefield V and Strange Brigade are the two games I've seen where the 6800 vanilla destroys the 3070. In every other game, either the 6800 is slightly faster or the 3070 is slightly faster. In the games the 6800 does win, higher resolutions widen the gap even further. But does a $570 6800 sound more enticing than a $500 3070? Considering that ray tracing is faster and not broken on the 3070, no it doesn't. I don't know why AMD thought $70 more than the 3070 would be better than, say, a $450 price. If the 6800 were $450 then I'd say it might be worth looking into, but as it stands the 3070 is better. AMD seems to be in the mood of raising prices on their new products.
 

ChadD

Supreme [H]ardness
Joined
Feb 8, 2016
Messages
4,918
Battlefield V and Strange Brigade are the two games I've seen where the 6800 vanilla destroys the 3070. In every other game, either the 6800 is slightly faster or the 3070 is slightly faster. In the games the 6800 does win, higher resolutions widen the gap even further. But does a $570 6800 sound more enticing than a $500 3070? Considering that ray tracing is faster and not broken on the 3070, no it doesn't. I don't know why AMD thought $70 more than the 3070 would be better than, say, a $450 price. If the 6800 were $450 then I'd say it might be worth looking into, but as it stands the 3070 is better. AMD seems to be in the mood of raising prices on their new products.

Well, there is an eventual 6700 coming that should be the intended 3070 competitor.

I can only assume the 6800 non-XT exists to salvage XT chips. Basically there is only one card right now: the best chips that are 100% functional have been put aside to fill 6900 XT orders in a month, the 95% functional chips are going into the XT, and what can be salvaged is going into the non-XT. Same PCBs, same RAM config.

It does seem like the 6800 is an odd duck... I mean, why not just spend a little bit more for the proper XT at that point?

Looking forward to the 6700 XT though; no doubt it will still compete well with the 3070, and probably be much cheaper. It may also sell in an 8GB version. As nice as the RAM is, if you were just looking for a screaming 1440p card for the next few years, I bet a 6700 XT 8GB will be a great buy.
 
Joined
Sep 15, 2017
Messages
184
Techspot

We’re looking at a 16% performance advantage going the way of the Radeon RX 6800 over the RTX 3070 in Doom Eternal, though we are talking about 242 fps vs. 281 fps at 1440p which is not very meaningful.

At 4K, the RTX 3070 seems to be running out of VRAM as the game requires 9GB of memory at this resolution and settings. As a result the RX 6800 extends its lead to 31%, going from 120 fps right up to 157 fps.

The 6800 wins in every game in this review vs. the 3070: 14% faster at 1440p and 15% faster at 4K over 18 games.
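If anyone wants to double-check those percentages, the math is just the ratio of the two framerates (a throwaway Python snippet, not anything from the review itself):

```python
# Percentage lead of the faster card over the slower one,
# using the fps figures quoted from Techspot above.
def pct_lead(slower_fps: float, faster_fps: float) -> float:
    """Return how much faster (in percent) the faster card is."""
    return (faster_fps / slower_fps - 1) * 100

# 1440p Doom Eternal: 3070 at 242 fps vs. 6800 at 281 fps
print(round(pct_lead(242, 281)))  # -> 16
# 4K, where the 3070 appears to run out of VRAM: 120 fps vs. 157 fps
print(round(pct_lead(120, 157)))  # -> 31
```

Same ratios Techspot quotes: 16% at 1440p, 31% at 4K.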
 

johnnysd

Limp Gawd
Joined
Oct 6, 2007
Messages
201
I am not sure I would say RT performance is falling behind, per se. All anyone has to test are games optimized for the then-only option, Nvidia. I am not suggesting AMD is going to overtake NV this round in terms of RT performance. However, I do believe there is more to be had with some AMD-specific optimizations. I hope AMD is working with developers of the stuff already on the market and not just new titles. Perhaps that is asking much, though; a lot of the early RT title developers are DEEP in Nvidia's pockets.

Looking at a game like AC Valhalla, it's obvious AMD optimizations make a huge difference, with that game being a clear AMD win where the older AC Odyssey goes the other way. I think revisiting AMD vs. NV ray tracing will be interesting over the next few months, as some newer titles launch for the next-gen consoles and for PC with DX12U support. Frankly, I don't much care about the previous NV crop of RT games; if I cared much about any of those titles and their RT, I would probably already own an RTX card. The only one I have actually played was Control, and frankly it's not a game I found would hold any real replay value for me. I loved it, but I'm not going back this weekend to play it again either. I think AMD RT getting added to WoW, for instance, will move far more cards than Control. If I'm going to shell out a premium specifically for a new thing like RT, I am going to want support in games with max replay value (non-frame-sensitive competitive play), like MMOs.
It's interesting. In the few reviews that used them, the FidelityFX-optimized cards really did seem to be ahead of the 3080. Stock to stock, the 3080 is overall ahead and has much faster RT and DLSS, but I could see that moving toward AMD as time goes on. They say there are already 60 FidelityFX titles coming, and consoles will automatically be optimized for RDNA2.

In terms of ray tracing, even though it is very far behind the 3080, I could see developers targeting ray tracing so that it works well on the 3070, 6800, 6800 XT, and 2080 Ti, as that will likely be a much bigger market segment than the 3080 alone. So I am not sure there will be many titles at all where you "miss out" with the 6800 XT. There will be some, as NVIDIA will partner on games to highlight the 3080 and the upcoming 3080 Ti.

I could make a good argument for either side.

One thing, though, is that there are almost no AIB cards for the 3080 anywhere near its list price. Even AIB partners that have those models are using their stock for the Strix or FTW3 versions and the like. So it's really $750 to get a 3080. It will be very interesting to see 6800 and 6800 XT AIB pricing.
 
Reactions: ChadD

Lumpus

Limp Gawd
Joined
Sep 2, 2005
Messages
346
AMD seems to be in the mood of raising prices on their new products.
My guess is they're leaving available margin. Nvidia will probably do a big across-the-board price drop (after Christmas), and then AMD can follow suit... assuming both parties actually have sufficient stock in retail channels by then.
 

MavericK

Zero Cool
Joined
Sep 2, 2004
Messages
30,242
Techspot

We’re looking at a 16% performance advantage going the way of the Radeon RX 6800 over the RTX 3070 in Doom Eternal, though we are talking about 242 fps vs. 281 fps at 1440p which is not very meaningful.

At 4K, the RTX 3070 seems to be running out of VRAM as the game requires 9GB of memory at this resolution and settings. As a result the RX 6800 extends its lead to 31%, going from 120 fps right up to 157 fps.

The 6800 wins in every game in this review vs. the 3070: 14% faster at 1440p and 15% faster at 4K over 18 games.
It's also 15% more expensive, so... makes sense?
 

exlink

Supreme [H]ardness
Joined
Dec 16, 2006
Messages
5,437
My guess is they're leaving available margin. Nvidia will probably do a big across-the-board price drop (after Christmas), and then AMD can follow suit... assuming both parties actually have sufficient stock in retail channels by then.
Why would they cut prices when demand is through the roof (and will be for quite some time)?
 

Roy2001

n00b
Joined
Nov 18, 2020
Messages
3
Techspot

We’re looking at a 16% performance advantage going the way of the Radeon RX 6800 over the RTX 3070 in Doom Eternal, though we are talking about 242 fps vs. 281 fps at 1440p which is not very meaningful.

At 4K, the RTX 3070 seems to be running out of VRAM as the game requires 9GB of memory at this resolution and settings. As a result the RX 6800 extends its lead to 31%, going from 120 fps right up to 157 fps.

The 6800 wins in every game in this review vs. the 3070: 14% faster at 1440p and 15% faster at 4K over 18 games.
But if you are paying $580, why not add $70 to get the XT?
 

GoodBoy

[H]ard|Gawd
Joined
Nov 29, 2004
Messages
1,886
The 5700 XT is far closer to the 6800 XT in architecture than the 2080 is to the 3080.
He basically means that for Nvidia, the 3xxx series is RTX 2.0 and the 2xxx chips were RTX 1.0. For AMD, the 6xxx series is RTX 1.0. (These are not RTX "versions", just the number of architectures each company has had RTX experience with.)

It really shows in the performance numbers in the GamersNexus 6800 XT review. I haven't had a chance to read others yet, but their review was very thorough and even covered Smart Access Memory performance.
 

LukeTbk

Limp Gawd
Joined
Sep 10, 2020
Messages
402
It's not "AMD's version".
Won't they brand a FidelityFX version (open, but AMD-branded) that uses DirectML (which Microsoft built using Nvidia cards) and adds a little something on top of it, so that a DirectX 12 card won't automatically support FidelityFX even if it fully supports DirectML?
 

DukenukemX

Supreme [H]ardness
Joined
Jan 30, 2005
Messages
5,072
I can only assume the 6800 non XT exists to salvage XT chips. Basically there is only one card right now..... the best chips that are 100% functional have been put aside to fill 6900 XT orders in a month. The 95% functional chips are going into the XT.... and what can be salvaged is going into the non XT. Same PCBs same RAM config.
My prediction is that the prices of the 6800 XT and the 6800 won't be any different after some time. This is what happened with the 5700 and 5700 XT, as the prices of the two products were essentially the same. The lack of demand, and the fact that you could turn a 5700 into a 5700 XT easily, is why this happened. At some point I believe the 6800 XT will be the same price as the 6800, and it won't be $570; it'll be lower.
It does seem like the 6800 is an odd duck.... I mean why not just spend a little bit more for the proper XT at that point.
Same problem for the 5700s, except people just bought the cheaper model for a lower price. I wonder if a 6800 could be flashed into a 6800 XT?
Looking forward to the 6700 XT though, no doubt it will still compete well with the 3070, and probably be much cheaper. Possibly also sell in a 8GB version. As nice as the ram is.... if you where just looking for a screaming 1440 card for the next few years. I bet a 6700 XT 8gb will be a great buy.
Assuming AMD has a sane price. As much as we all love these new cards, they won't realistically sell. Neither AMD nor Nvidia is going to sell these cards very well, and don't pay attention to the lack of supply, as that's not due to demand. Neither the RTX 2000 series nor the 5700 and 5600 XT cards sold well, mostly due to price. People are looking for a $250-or-cheaper graphics card, and I doubt the 6700s will be in that price range; probably $300-$450. So I guess the mainstream cards will be the RTX 3050 or the RX 6600 cards. Really sad.

Why would they cut prices when demand is through the roof (and will be for quite some time)?
Because after Christmas demand will flop and nobody is going to buy them. Same goes for the new consoles. Do you think people really need these graphics cards to play new games? Because they don't. After Christmas is when I expect us to enter a depression, so it really becomes problematic then.
 

Zarathustra[H]

Fully [H]
Joined
Oct 29, 2000
Messages
31,234
Same problem for the 5700s, except people just bought the cheaper model for a lower price. I wonder if a 6800 could be flashed into a 6800 XT?
Might be possible.

Assuming, of course, it isn't binned that way because those 12 CUs were shut off due to being non-functional on the specific chip you get.
 

LukeTbk

Limp Gawd
Joined
Sep 10, 2020
Messages
402
Because after Christmas the demand will flop and nobody is going to buy them. Same goes for new consoles. Do you think people really need these graphic cards to play new games? Because they don't. After Christmas is when I expect us to enter a depression so it really becomes problematic then.
People will buy at the $500-$700 NVIDIA price tags once those become available (maybe that's what you mean by a price cut), and the same goes for the Radeons at their announced prices once they start to become available.

Looking at the RTX 2080:
https://camelcamelcamel.com/product/B07XSLGKG7
https://camelcamelcamel.com/product/B07VDMGYGZ

That was not nearly as hyped a product, and it maintained its price really well.
 

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
51,087
I wonder if a 6800 could be flashed into a 6800XT?
I was informed quite some time ago through the grapevine that flashing non-XT to XT cards will no longer be an option. And before you get up in arms about it, the main reason this is done is to prevent gray market and black market reflashing fraud. Enthusiasts are not the problem here. That number is so small that "correcting" the issue is not worth the engineering investment to fix it. At least that was the understanding that I took from the conversation.
 

sc5mu93

Limp Gawd
Joined
Jul 11, 2018
Messages
331
I was informed quite some time ago through the grapevine that flashing non-XT to XT cards will no longer be an option. And before you get up in arms about it, the main reason this is done is to prevent gray market and black market reflashing fraud. Enthusiasts are not the problem here. That number is so small that "correcting" the issue is not worth the engineering investment to fix it. At least that was the understanding that I took from the conversation.
Thanks for the info. This was something I had wondered from the announcement of XT and non-XT. That said, I am sure someone will try it to verify. :D
 

TheRookie

Limp Gawd
Joined
May 21, 2019
Messages
208
Battlefield V and Strange Brigade are the two games I've seen where the 6800 vanilla destroys the 3070. In every other game, either the 6800 is slightly faster or the 3070 is slightly faster. In the games the 6800 does win, higher resolutions widen the gap even further.
Wrong. The Radeon RX 6800 is faster than the GeForce RTX 3070 in just about every game (I can't even think of one where this is not the case), from barely faster to a lot faster.
Battlefield V and Strange Brigade are the two games I've seen where the 6800 vanilla destroys the 3070. In every other game, either the 6800 is slightly faster or the 3070 is slightly faster. In the games the 6800 does win, higher resolutions widen the gap even further. But does a $570 6800 sound more enticing than a $500 3070? Considering that ray tracing is faster and not broken on the 3070, no it doesn't. I don't know why AMD thought $70 more than the 3070 would be better than, say, a $450 price. If the 6800 were $450 then I'd say it might be worth looking into, but as it stands the 3070 is better.
Why $450? Why not free?

The whining about prices is really annoying.
AMD seems to be in the mood of raising prices on their new products.
What are you talking about?
 
Last edited:

LukeTbk

Limp Gawd
Joined
Sep 10, 2020
Messages
402
If the 6800 were $450 then I'd say it might be worth looking into, but as it stands the 3070 is better
If you have a good use for the proprietary features (like the noise-removal one, if you often take work calls from home with background noise; many could really use that right now), or you value RT (arguably, as of now), sure. Otherwise I am a bit curious why, since the 6800 seems always faster, sometimes a little bit, sometimes a lot.
 

TheRookie

Limp Gawd
Joined
May 21, 2019
Messages
208
Assuming AMD has a sane price. As much as we all love these new cards, they won't realistically sell. Neither AMD nor Nvidia is going to sell these cards very well, and don't pay attention to the lack of supply, as that's not due to demand. Neither the RTX 2000 series nor the 5700 and 5600 XT cards sold well, mostly due to price. People are looking for a $250-or-cheaper graphics card, and I doubt the 6700s will be in that price range; probably $300-$450. So I guess the mainstream cards will be the RTX 3050 or the RX 6600 cards. Really sad.


Because after Christmas demand will flop and nobody is going to buy them. Same goes for the new consoles. Do you think people really need these graphics cards to play new games? Because they don't. After Christmas is when I expect us to enter a depression, so it really becomes problematic then.
Your prediction will go down like a lead balloon.
 

Zarathustra[H]

Fully [H]
Joined
Oct 29, 2000
Messages
31,234
My prediction is that the prices of the 6800 XT and the 6800 won't be any different after some time. This is what happened with the 5700 and 5700 XT, as the prices of the two products were essentially the same. The lack of demand, and the fact that you could turn a 5700 into a 5700 XT easily, is why this happened. At some point I believe the 6800 XT will be the same price as the 6800, and it won't be $570; it'll be lower.

Same problem for the 5700s, except people just bought the cheaper model for a lower price. I wonder if a 6800 could be flashed into a 6800 XT?

Assuming AMD has a sane price. As much as we all love these new cards, they won't realistically sell. Neither AMD nor Nvidia is going to sell these cards very well, and don't pay attention to the lack of supply, as that's not due to demand. Neither the RTX 2000 series nor the 5700 and 5600 XT cards sold well, mostly due to price. People are looking for a $250-or-cheaper graphics card, and I doubt the 6700s will be in that price range; probably $300-$450. So I guess the mainstream cards will be the RTX 3050 or the RX 6600 cards. Really sad.

Because after Christmas demand will flop and nobody is going to buy them. Same goes for the new consoles. Do you think people really need these graphics cards to play new games? Because they don't. After Christmas is when I expect us to enter a depression, so it really becomes problematic then.

Meh. There is certainly a large demand for mid- to low-end GPUs, but there is also greater demand than ever for halo cards.

But I agree with you, pricing has gone up too much too fast.

$250 is unrealistic, but certainly $400 ought to be doable.

The thing is, both AMD and Nvidia are in it to make money, and if no one bought the halo cards they wouldn't make them.

Personally I am a little extreme, but my target in any GPU purchase is all the quality sliders set to max at 4K, with a bottom-1% framerate that never drops below 60 fps, and I'm not alone. Demand sustains this pricing.

If anything, sales of the 20xx generation were slowed by the ample availability of leftover 1080 Tis from the cryptomining bust. As more time passes, those 1080 Tis will become less and less relevant, and demand for new generations will pick up.
 

TheRookie

Limp Gawd
Joined
May 21, 2019
Messages
208
Meh. There is certainly a large demand for mid- to low-end GPUs, but there is also greater demand than ever for halo cards.

But I agree with you, pricing has gone up too much too fast.

$250 is unrealistic, but certainly $400 ought to be doable.

The thing is, both AMD and Nvidia are in it to make money, and if no one bought the halo cards they wouldn't make them.

Personally I am a little extreme, but my target in any GPU purchase is all the quality sliders set to max at 4K, with a bottom-1% framerate that never drops below 60 fps, and I'm not alone. Demand sustains this pricing.

If anything, sales of the 20xx generation were slowed by the ample availability of leftover 1080 Tis from the cryptomining bust. As more time passes, those 1080 Tis will become less and less relevant, and demand for new generations will pick up.
AMD and NVIDIA have different products for people with different budgets.

For people with $250 budget, AMD and NVIDIA have products for that.

For people with $400 budget, AMD and NVIDIA have products for that.

For people with $700 budget, AMD and NVIDIA have products for that.

et cetera

et cetera

__________________________________________________________

It's not as if unless you have $1500, you can't buy a new video card.
 
Last edited:

Zarathustra[H]

Fully [H]
Joined
Oct 29, 2000
Messages
31,234
AMD and NVIDIA have different products for people with different budgets.

For people with $250 budget, AMD and NVIDIA have products for that.

For people with $400 budget, AMD and NVIDIA have products for that.

For people with $700 budget, AMD and NVIDIA have products for that.

et cetera

et cetera

__________________________________________________________

It's not as if unless you have $1500, you can't buy a new video card.

No one claimed it was. I think you missed the point of my post, which was that there is still demand for top-end products, even with today's high prices, and that's why the high prices survive.

$250 buys you a GPU that is pretty difficult to play most modern titles on enjoyably, though, without significant sacrifices in graphics quality and resolution compared to modern norms.

But remember, less than 20 years ago I bought the fastest enthusiast GPU money could buy (GeForce 3 Ti 500) for $350 brand new at retail.

Now, this was a time when AMD (sorry, ATi) and Nvidia were neck and neck with each other, which helped pricing.

Things have also become more complex since then: more advanced cooling, multi-layer boards, and a combination of yield issues and consolidation in the silicon fab industry mean chips are more expensive today than they were then.

$350 for a top-end card is no longer realistic, but IMHO $650 certainly is, if we didn't have this excessive price creep.
 
Last edited:

sphinx99

Gawd
Joined
Dec 23, 2006
Messages
860
No one claimed it was.

$250 buys you a GPU that is pretty difficult to play most modern titles on enjoyably though, without significant sacrifices in graphics quality and resolution compared to modern norms.
Devil's advocate, but is this really true? Right now I've been playing AC Valhalla at 4K, medium settings, ~40 fps on my nigh-geriatric GTX 1080, because I can't get my hands on any of the new cards. To be honest, it's still a perfectly serviceable experience, and one that I think would be well worth paying $250 for. Something like a GTX 1660 Super in that price range seems to get near-ish to a 1080 experience, and I think that's decent value for the dollar. (I am not as familiar with AMD's offerings in that price range, but assume they too have some near-1080 $250 card.)

It's important not to conflate "significant sacrifices in quality" with not having 4K and max settings and frame rates.
 
Last edited:

TheRookie

Limp Gawd
Joined
May 21, 2019
Messages
208
No one claimed it was.

$250 buys you a GPU that is pretty difficult to play most modern titles on enjoyably though, without significant sacrifices in graphics quality and resolution compared to modern norms.
I want to play all the latest titles at 4K with all the settings on high.

How come I can't do that with a $250 video card? /s

But remember, less than 20 years ago, I bought the fastest enthusiast GPU money could buy (GeForce 3 TI 500) for $350 brand new retail.
...and back in the day, a burger was 15 cents

Now, this was a time when AMD (sorry, ATi) and Nvidia were neck and neck with each other, which helped pricing.

Things have also become more complex since then: more advanced cooling, multi-layer boards, and a combination of yield issues and consolidation in the silicon fab industry mean chips are more expensive today than they were then.

$350 for a top-end card is no longer realistic, but IMHO $650 certainly is, if we didn't have this excessive price creep.
I want to buy the most expensive video card.

Why is the most expensive video card so expensive? /s

___________________________________________________

No one is forcing you to buy the most expensive video card.

The option is available for those who can and are willing to buy it.
 

Zarathustra[H]

Fully [H]
Joined
Oct 29, 2000
Messages
31,234
...and back in the day, a burger was 15 cents
Irrelevant comparison.

Burgers were 15 cents in the late '50s and early '60s. There has been a good deal of inflation since then.

15 cents in 1960 dollars is about $1.33 today, and you can still get a basic burger off the dollar menu. By this measure, burgers have actually become cheaper since then. (Though to be fair, I don't know how big a 15-cent burger was in 1960, so I don't know how comparable they are to modern dollar-menu burgers.)

$350 in 2001 dollars is about $520 today, while today's top cards run around $1,500. That puts the top-end GPU price at roughly 300% of the inflation-adjusted 2001 price.
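For anyone who wants to redo that math, the adjustment is just a ratio of CPI values. The CPI numbers below are approximate US annual averages I'm plugging in for illustration, not official figures, so expect small differences from the $1.33 and $520 quoted above depending on the exact series used:

```python
# Rough inflation adjustment via CPI ratios.
# CPI values are approximate annual averages (illustrative only).
CPI = {1960: 29.6, 2001: 177.1, 2020: 258.8}

def adjust(amount: float, from_year: int, to_year: int) -> float:
    """Convert a dollar amount between years using the CPI ratio."""
    return amount * CPI[to_year] / CPI[from_year]

print(round(adjust(0.15, 1960, 2020), 2))  # 15-cent burger -> ~1.31
print(round(adjust(350, 2001, 2020)))      # $350 GPU in 2001 -> ~511
```

Close enough to the figures above either way; the point about relative price creep stands.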
 
Last edited:

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
51,087
Meh. There is certainly a large demand for mid to low end GPU's, but there is also a greater demand than ever for halo cards.

But I agree with you, pricing has gone up too much too fast.
"Low end" GPUs are on their way out the door due to how strong APUs are getting.

You need to update that awesome launch-price graph you did such a great job on!!! :)
 

TheRookie

Limp Gawd
Joined
May 21, 2019
Messages
208
Irrelevant comparison.

Burgers were 15 cents in the late 50's / early 60's. There has been a good deal of inflation since then.

15 cents in 1960 dollars is $1.33 today. You can still get a basic burger off the dollar menu. By this measure burgers have actually become cheaper since then.

$350 in 2001 dollars is $520 today. That represents a top end GPU price increase to ~300% of the 2001 price.
You didn't respond to the rest of my comment.

___________________________________________________

No one is forcing you to buy the most expensive video card.

The option is available for those who can and are willing to buy it.
 

Zarathustra[H]

Fully [H]
Joined
Oct 29, 2000
Messages
31,234
You didn't respond to the rest of my comment.

___________________________________________________

No one is forcing you to buy the most expensive video card.

The option is available for those who can and are willing to buy it.

Agreed.

Still, a 3x price increase for no other reason than "because" should be enough to piss anyone off.

These two statements are not mutually exclusive.

Now, there is a little bit more to it than just "because". Producing a good GPU in 2020 costs more than it did in 2000: more advanced coolers, more advanced circuits on multi-layer boards, silicon fab capacity, development costs, and yield decreases have all conspired to raise silicon chip prices. All these things add up.

And this is where I admit I have no insight whatsoever into the current balance sheets and costs of the GPU makers, but I'm going to go out on a limb and suggest that the price increase is not fully explained by the cost increase. Not close to it.

I'll be the first to tell you that "cost plus" pricing is not the way the world works anymore, but when there is a huge mismatch like this, it is usually due to monopolistic pressures or a lack of competition in a market.
 
Last edited:

Lakados

2[H]4U
Joined
Feb 3, 2014
Messages
3,029
Why would they cut prices when demand is through the roof (and will be for quite some time)?
If anything, both parties may see price creep as components continue to remain scarce. PWM voltage doublers are still in short supply, as are Japan's solid-state capacitors. China still hasn't opened up more than half of its rare-earth mining and refining facilities, and since it maintains something like 80% of the global supply, there are shortages across the board as stockpiles are depleted. Microsoft and Sony are apparently both crawling up AMD's behind over late Xbox and PS5 shipments, which is putting some hurt on their Christmas projections. Global supply chains are a complete disaster right now; I say all the fancy new things are going to be in seriously short supply well into 2021.
 

TheSlySyl

Gawd
Joined
May 30, 2018
Messages
754
It's important not to conflate "significant sacrifices in quality" with not having 4K and max settings and frame rates.
Seriously.
You don't *need* every setting to be at Ultra.
I was gaming decently well at 4K on my damn 1070, and I'm gaming very well at 4K with my 1080 Ti in the vast majority of titles. Yeah, you sacrifice things like 8x AA and super-uber-hyper-sampled clouds, but who gives a fuck about the clouds? The biggest setting I have to turn down to get games running smoothly at 4K is some form of ultra ambient occlusion that I tend to find too aggressive anyway. I don't want shit like vignette, or the crazy intensive and seriously awful-looking fog in MH:World, for example.

Ray tracing and shader effects becoming more common and a bigger part of modern graphics are a larger draw for new cards for me than pure rasterization, which is why I think I'm looking more toward a 3080 than a 6800 XT at the moment.

I'm mostly curious what the hell AMD is gonna pull off next year, and whether they make a Ryzen 2- or Ryzen 3-style jump in improvement with their GPUs.
 

TheRookie

Limp Gawd
Joined
May 21, 2019
Messages
208
Agreed.

Still, a 3x price increase for no other reason than "because" should be enough to piss anyone off.

These two statements are not mutually exclusive.
What do you have against giving choices?

The option is available for those who can and are willing to buy it.

If you don't want most expensive option, there are cheaper options.
 

Zarathustra[H]

Fully [H]
Joined
Oct 29, 2000
Messages
31,234
What do you have against giving choices?

The option is available for those who can and are willing to buy it.

If you don't want most expensive option, there are cheaper options.

I'm all for choices. Multiple price points for differently performing GPUs make perfect sense.

I'm against unreasonable price inflation.

No one wants to pay 3x more for the modern version of the same thing, after correcting for inflation.
 

LukeTbk

Limp Gawd
Joined
Sep 10, 2020
Messages
402
Agreed.

Still. A 3x price increase for no other reason than "because" should be enough to piss anyone off.

These two statements are not mutually exclusive.
When considering that:
Sony sets PlayStation 3 price at $499 and $599 - May. 9, 2006
PlayStation 4 cost $399 at launch
PlayStation 5 launches in November, starting at $399 for PS5 Digital Edition and $499 for PS5 with ultra HD blu-ray

Obviously the value of having Blu-ray and the ability to run Netflix went from a lot to almost nil over that time, and they sell more gaming-type passes now than before, but still, console prices didn't creep up at all.

It would be interesting to track, as a metric of PC GPU inflation, how the price of a video card able to game like the console of its day has evolved. We will need to wait for the launch of the GeForce 3050 or 6700, or whichever AMD card lands closest to Xbox Series performance, and see what it costs versus what the video cards that performed like the PS3/PS4 cost in their day.
 