6800 / XT Review Round Up

No one wants to pay 3x more for the modern version of the same thing, after correcting for inflation.
That's not the feeling I get when I try to buy them on Amazon. There is literally a line of people in front of the store who want to pay 3x more for a product that does ridiculous things, even when $400 consoles that game really well and the 5700 XT exist.

There is a giant group of people who want to pay that price, and that's why they make those products; otherwise we would be gaming with console graphics, console performance, and console prices.

But that said, this part:
The option is available for those who can and are willing to buy it.

If you don't want the most expensive option, there are cheaper options

does not hold up, at least right now. There aren't that many interesting cheap options on Newegg. If there were a $350 RTX 3060 or a cheap 2060/2070/5700 XT, it would be true, but older cards are incredibly expensive right now.
 
I'm all for choices. Multiple price points for different-performing GPUs make perfect sense.

I'm against unreasonable price inflation.

No one wants to pay 3x more for the modern version of the same thing, after correcting for inflation.
Haven't you taken an economics class?

The market sets the price.

If not enough people are buying the products, the prices will come down.
 
Wrong. The Radeon RX 6800 is faster than the GeForce RTX 3070 in just about every game (I can't even think of one where this is not the case), ranging from barely faster to a lot faster.
I'm sure I can dig up some examples.
https://tpucdn.com/review/amd-radeon-rx-6800/images/anno-1800-3840-2160.png
https://tpucdn.com/review/amd-radeon-rx-6800/images/borderlands-3-1920-1080.png
https://tpucdn.com/review/amd-radeon-rx-6800/images/borderlands-3-3840-2160.png
https://tpucdn.com/review/amd-radeon-rx-6800/images/death-stranding-1920-1080.png
https://tpucdn.com/review/amd-radeon-rx-6800/images/detroit-become-human-1920-1080.png
https://tpucdn.com/review/amd-radeon-rx-6800/images/divinity-original-sin-2-2560-1440.png
https://tpucdn.com/review/amd-radeon-rx-6800/images/far-cry-5-2560-1440.png
https://tpucdn.com/review/amd-radeon-rx-6800/images/gears-5-1920-1080.png
https://tpucdn.com/review/amd-radeon-rx-6800/images/gears-5-2560-1440.png
https://tpucdn.com/review/amd-radeon-rx-6800/images/gears-5-3840-2160.png
https://tpucdn.com/review/amd-radeon-rx-6800/images/project-cars-3-1920-1080.png
https://tpucdn.com/review/amd-radeon-rx-6800/images/project-cars-3-2560-1440.png
https://tpucdn.com/review/amd-radeon-rx-6800/images/jedi-fallen-order-1920-1080.png
https://tpucdn.com/review/amd-radeon-rx-6800/images/jedi-fallen-order-3840-2160.png
Why $450? Why not free?

The whining about prices is really annoying.
Because despite what you may say, the 6800 is about the same as the 3070 in performance, maybe slightly ahead, but not when it comes to ray-tracing performance. Also, don't forget that consumers favor Nvidia over AMD. At $570, AMD won't be selling many 6800s. The price should be $450 at best; otherwise it'll end up like the 5600 XT, if not worse. Remember when the 5600 XT was released at $280 while 5700 cards were going for $300? That was a stupid move from AMD, and charging $70 more than Nvidia is even more stupid.
[Attached: average FPS charts at 1920×1080, 2560×1440, and 3840×2160]
What are you talking about?
AMD's CPUs have also risen in price. Their 8-core 5800X is $50 more than the 3700X was at release. Seems AMD has gotten greedy.
 
AMD's CPUs have also risen in price. Their 8-core 5800X is $50 more than the 3700X was at release. Seems AMD has gotten greedy.
I don't know about greedy; their parts are price-competitive with both Nvidia and Intel, and they also have to account for TSMC's rate increases. AMD purchased that extra fab time at a premium to beat out their competitors, and that cost has to be reflected somewhere; since they are contractually obligated to provide the Xbox and PS5 chips at a fixed price, it's not coming from there.

Sure, you can argue that some specific parts aren't ideal given their price, but parts like the 6800 aren't meant to be something they move in volume. It's basically a slightly defective XT model, so they have to provide the same board and components as the XT while keeping it at a price point where it isn't in high demand, because ideally they don't have a lot of parts to bin down; it is a model that is scarce by design. They also have to ensure that the pricing/performance doesn't get too close to their fully functional lower-tier parts, because then they would accidentally end up competing with themselves and have to alter prices later on, which just upsets the AIB partners and makes them look bad.
 
If anything, both parties may see price creep as components continue to remain scarce. PWM voltage doublers are still in short supply, as are Japan's solid-state capacitors. China still hasn't opened up more than half of its rare-earth mining and refining facilities, and since it maintains something like 80% of the global supply, there are shortages across the board as stockpiles are depleted. Microsoft and Sony are also apparently both crawling up AMD's behind over late Xbox and PS5 shipments, which is hurting their Christmas projections. Global supply chains are a complete disaster right now; I'd say all the new fancy things are going to be in seriously short supply well into 2021.
That's true, but that doesn't mean that a lot of people will own them. If the prices for these graphics cards rise, consumers will find another way to game. The worst part is that games won't cater to these AMD 6000 or Nvidia 3000 cards until a sizable number of people have bought them. Cyberpunk 2077 recommends a GTX 1060 6 GB; it doesn't require it. The GTX 1060 is the most popular card on Steam. Surely that can't be a coincidence? Something to think about for anyone about to plop down $570 to $650 for any of these graphics cards: if the adoption rate is low, how many games will utilize these features and this performance? Hopefully AMD has made enough to actually sell.
 
That's true, but that doesn't mean that a lot of people will own them. If the prices for these graphics cards rise, consumers will find another way to game. The worst part is that games won't cater to these AMD 6000 or Nvidia 3000 cards until a sizable number of people have bought them. Cyberpunk 2077 recommends a GTX 1060 6 GB; it doesn't require it. The GTX 1060 is the most popular card on Steam. Surely that can't be a coincidence? Something to think about for anyone about to plop down $570 to $650 for any of these graphics cards: if the adoption rate is low, how many games will utilize these features and this performance? Hopefully AMD has made enough to actually sell.
AAA titles can't afford to develop a game for the top 10%. Sure, they can add features that only those users can really use, but these things are so expensive to make that they must hit as much of the market as they can. And really, more than 50% of all PC gamers are still on 1080p, and there are a lot of cards out there that do that pretty cheaply. Find the lowest common denominator, focus on that, guarantee a minimum acceptable performance profile there, and then scale up; don't try to develop for the top 1% and scale down.
 
I'm sure I can dig up some examples.
[quoted TPU benchmark links and average FPS charts snipped]

I've always found average FPS to be a pretty pointless measure. Minimum or 1% FPS is much more useful for making these kinds of decisions, because the minimum performance you want is usually expressed as "the lowest I want my framerate to fall is X." No one cares if they get 200 fps two-thirds of the time if they spend the other third at 20 fps; the average is still 140 fps.

I find framerate stability is often overlooked. Best would be a range of framerates from a test, but if I can only have one number, I want the minimum/1% low.
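To put some numbers behind that point, here is a minimal sketch (Python, with made-up frame-rate samples rather than data from any review) of how the average hides exactly that behavior:

```python
# Hypothetical fps samples: 200 fps two-thirds of the time, 20 fps the other third.
frame_rates = [200.0] * 200 + [20.0] * 100

# The average looks great...
average = sum(frame_rates) / len(frame_rates)  # (2/3)*200 + (1/3)*20 = 140 fps

# ...but the 1% low (roughly the 1st percentile of the samples) tells the real story.
one_percent_low = sorted(frame_rates)[len(frame_rates) // 100]

print(f"average: {average:.0f} fps, 1% low: {one_percent_low:.0f} fps")
# -> average: 140 fps, 1% low: 20 fps
```

(Real reviews usually derive 1% lows from per-frame frame times rather than fps samples, but the principle is the same.)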
 
I've always found average FPS to be a pretty pointless measure. Minimum or 1% FPS is much more useful for making these kinds of decisions, because the minimum performance you want is usually expressed as "the lowest I want my framerate to fall is X." No one cares if they get 200 fps two-thirds of the time if they spend the other third at 20 fps; the average is still 140 fps.

I find framerate stability is often overlooked. Best would be a range of framerates from a test, but if I can only have one number, I want the minimum/1% low.
Those 1% lows are certainly the thing you notice the most, that's for sure. They are always jarring and always disappointing.
 
I'm sure I can dig up some examples.
https://tpucdn.com/review/amd-radeon-rx-6800/images/gears-5-1920-1080.png
https://tpucdn.com/review/amd-radeon-rx-6800/images/gears-5-2560-1440.png
https://tpucdn.com/review/amd-radeon-rx-6800/images/gears-5-3840-2160.png
[remaining quoted links, charts, and pricing argument snipped]
Something is wrong with that setup.

For example, Hardware Unboxed tested Gears 5.

The Radeon RX 6800 beat the GeForce RTX 3070, and the results weren't even close.
 
Something is wrong with that setup.

For example, Hardware Unboxed tested Gears 5.

The Radeon RX 6800 beat the GeForce RTX 3070, and the results weren't even close.
Yeah, different reviews have different results. You need to look at more than one cherry-picked review.
 
Won't they brand a FidelityFX version (open, but AMD-branded) that uses DirectML (which Microsoft developed using Nvidia cards), adding a little something on top of DirectML, so that a DirectX 12 card won't automatically support FidelityFX even if it fully supports DirectML?
Well, just to be clear... this entire "MS developed DX with NV" stuff is silly.

Do people really think MS just got their hands on the tech in their console this month? Or even this year?

MS has been working with AMD on RT for YEARS.

Nvidia got wind of it and adapted their AI bits to do the job first. There was an entire generation of Nvidia GPUs that skipped the gaming market completely, because gaming markets had zero use for Tensor cores. Why release a card that would have been only 5-10% faster than the 1080 but offer useless Tensor cores? (Especially seeing as they had no real competition to worry about at the time; just keep making the 1000-series cards and profit for a while. Can't blame them.) When they got wind of what AMD and MS/Sony were cooking up for the next-gen consoles, they got in front of it. No doubt MS worked with Nvidia, but to think they were not already working with AMD on their next-gen console tech is silly. They didn't throw together their new Xbox on a whim in 2020; it has been in development probably since 2017, and RT was no doubt part of the picture probably the entire time.
 
Yeah, different reviews have different results. You need to look at more than one cherry-picked review.

There is something really wrong with TechPowerUp's setup though.

As another example...

LegitReviews, Kitguru, and HEXUS all show the Radeon RX 6800 being faster than the GeForce RTX 3070 in Borderlands 3, and again, it wasn't even close.
 
There is something really wrong with TechPowerUp's setup though.

As another example...

LegitReviews, Kitguru, and HEXUS all show the Radeon RX 6800 being faster than the GeForce RTX 3070 in Borderlands 3, and again, it wasn't even close.
Yep, Guru3D used the same CPU as TPU and got different results.
 
Assuming AMD picks a sane price. As much as we all love these new cards, they won't realistically sell. Neither AMD nor Nvidia is going to sell these cards very well, and don't pay attention to the lack of supply, as that's not due to demand. Neither the RTX 2000 cards nor the 5700s and 5600 XT sold well, mostly due to price. People are looking for a graphics card at $250 or cheaper, and I doubt the 6700s will be in that price range; probably $300-$450. So I guess the mainstream cards will be the RTX 3050 or the RX 6600 cards. Really sad.
No doubt; hopefully the 3050/6500-class cards are not dogs. Who knows, perhaps with all the improvements they will be a big bump for that market range. I agree with you, though; chances are the bump will be minor. Although, to be fair, if a 6600 were to perform around the same as the 5700 at the $250 price point, that could be a really popular card.

I splurged a bit more than usual last year and got a 5700 XT, which I don't regret at all; it's been a great card, and I got in before the price of everything tech jumped. I won't be looking for a 6700 XT because, yeah, I expect they will be around the same $500 Canadian I paid for the 5700 XT. It's not going to offer double the performance or anything crazy, and I doubt I could sell the 5700 XT for enough to make it an inexpensive upgrade. I'll be holding out for a 7700 XT to replace it next year or in early 2022. I expect the monitor I'm using now to last me that long anyway, so I really don't need 200 FPS.
 
TPU had issues with their Zen 3 review and testing, which they cleared up in a follow-up article. Looks like something is off with this release as well.
 
Well, just to be clear... this entire "MS developed DX with NV" stuff is silly.

Do people really think MS just got their hands on the tech in their console this month? Or even this year?
I am not sure how that is clearer after reading this. The mention of Nvidia was just to show that when Microsoft publicly started work on DirectML in 2018, they used Nvidia (and Intel) hardware a lot, supporting the point that it will not be an AMD exclusive.

https://www.pcworld.com/article/326...re-to-smarten-up-windows-10-apps-with-ai.html
https://www.pcworld.com/article/3263660/microsoft-winml-directml-machine-learning-games.html

Here:
https://devblogs.microsoft.com/directx/gaming-with-windows-ml/

Nvidia appears seven times; AMD or Radeon, not once.

There is even CUDA configuration in their code:
https://github.com/microsoft/tensor.../directml/third_party/gpus/cuda_configure.bzl
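As a side note, the vendor-agnostic part is easy to check against that repo. Here is a minimal sketch assuming the tensorflow-directml package (the TF 1.15 fork Microsoft publishes from it); the exact device string is an assumption, not something confirmed by the links above:

```python
# Assumes: pip install tensorflow-directml (a TensorFlow 1.15 fork with a DirectML backend).
import tensorflow as tf
from tensorflow.python.client import device_lib

# Any DX12-capable adapter (AMD, Nvidia, or Intel) should enumerate alongside
# the CPU, which is the point: DirectML is not tied to one GPU vendor.
for device in device_lib.list_local_devices():
    print(device.name, device.device_type)

# Place a trivial op on the first DirectML device to confirm it accepts work.
with tf.device("/device:DML:0"):  # device string is an assumption
    doubled = tf.constant([1.0, 2.0, 3.0]) * 2.0

with tf.Session() as sess:
    print(sess.run(doubled))  # [2. 4. 6.]
```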
 
TPU had issues with their Zen 3 review and testing, which they cleared up in a follow-up article. Looks like something is off with this release as well.
TPU went to the wall declaring their results on target, yet here we are asking why their results are different from everybody else's. Huh.
 
TPU went to the wall declaring their results on target, yet here we are asking why their results are different from everybody else's. Huh.
To be fair, their CPU results were lower (than other reviews') because they didn't test with the best possible memory configuration. The specs of the test system that was used are in the results, so even though the results are lower, they are valid. And for full context, they did a follow-up article with four populated RAM slots vs. two, which showed higher performance; those results are valid as well.
Really, what they pointed out with the different results from different memory configurations is good information.

Every other reviewer "only" showing AMD in the best possible light does a disservice IMHO.
 
GoodBoy, yes and no. They want to show the maximum performance from either platform, and to do that you need ideal, properly tuned systems. Sure, most people won't have four memory slots filled, and most people just install their gear, install Windows, and go; they don't want to tinker around, so yes, TPU's articles are more in line with the average consumer. But the high-end people, or those who know enough to want to get the most performance out of their rigs, have a baseline they can work from based on the other reviews.
 
There is something really wrong with TechPowerUp's setup though.

As another example...

LegitReviews, Kitguru, and HEXUS all show the Radeon RX 6800 being faster than the GeForce RTX 3070 in Borderlands 3, and again, it wasn't even close.
I dunno. I saw a few video (YouTube) reviews of the card with their numbers, and the 6800 seemed to have at most 11% more FPS in games, with that margin getting smaller at 4K. That said, who knows what settings were used other than "1440p ultra settings," and to be honest I have review fatigue and have stopped looking at too many sites. Plus, given how difficult it is to get any of these cards, my level of caring about which is faster drops off pretty fast.
 
I'm sure I can dig up some examples.
[quoted TPU RX 6800 vs. RTX 3070 benchmark links, charts, and pricing argument snipped]
Wow, such narrow margins of "victory" in those links. Some of them also show a 2080 Ti faster than a 3090, lol!
Also, you forget that the 6800 has twice the VRAM.
 
I'm sure I can dig up some examples.
[quoted TPU RX 6800 vs. RTX 3070 benchmark links, charts, and pricing argument snipped]

I would remove the 1080p scenarios, as they are usually CPU-limited.

I would also remove the 4K scenarios, as these two cards (the 3070 and 6800) are targeted at 1440p with max settings.

Then we are left with the benchmarks below:


4 games out of the 23 reviewed by TPU.

TL;DR: when we look at a playable-framerates scenario, the 6800 consistently beats the 3070.

EDIT:
By *playable framerates scenario* I mean an [H]-style review where you choose the most optimal settings for the card (the 6800) and then compare all competitors at those same settings.

In such a scenario**, the 6800 consistently beats the 3070.

**Unless, of course, Gears 5 is the only game you want to play, in which case you can go for a 3070.
 
I am glad AMD was not lying in their press release. The RX 6800 XT is kicking ass, and they did not even need SAM to do it, like some people were suggesting. The basic 6800, however, I can't help but feel is overpriced. I mean, it does offer 2080 Ti performance at half the price and would pretty much triple the performance of my GTX 1080, but so would a 3070, and it would do it $80 cheaper. Yes, AMD has 16 GB, which makes it more future-proof, but as of right now I can't help but think the 3070 is a better deal; as an added bonus you get more mature ray tracing too. The RX 6800 XT really does mess with Nvidia's pricing model at the very high end, but I think the vanilla RX 6800 is not putting enough pressure on the lower high-end market.

If AMD released a 12 GB (or even 8 GB) version for cheaper, that would make things really interesting.

Are you forgetting about SAM and that special cache thing they've got going on? That may be the perfect amount of memory to get the best performance. Also, it's best to have the 16 GB of GDDR available since that's what's available on consoles.

And I know what you're gonna say: "well, that is used for system and graphics memory." Yeah, well, you wanna take a chance on running out? Especially when you've got Windows running in the background. And do you really want "console-level performance"? ... pfff ... :flick of the wrist:
 
All you 6XXX owners, just wait for that wine to ripen, boy... ooooweeee.


Launch reviews only tell part of the story. Heck, even with Nvidia: just don't let your card get past a couple of generations before benchmarking with team greed.

Nvidia is taking a play out of Apple's playbook with that one.
 
Also, it's best to have the 16 GB of GDDR available since that's what's available on consoles.

I doubt it ever gets close to that for what is used as VRAM. If, say, 13.5 GB of the total RAM is available to the main game currently playing (i.e., to match that total, a PC with a 10 GB VRAM card would have 26 or 38 GB of combined memory), and the game uses 5-6 GB of RAM for non-VRAM purposes, it is left with 7-8 GB for the VRAM part.

With the consoles in mind, 8 GB of VRAM is potentially quite good (depending on how much of a difference that decompression turns out to make, but in that case the amount of RAM is not much of a clue anyway).
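A back-of-the-envelope version of that split (the per-game numbers are assumptions for illustration, not measurements from any title):

```python
# Unified-memory budget on a 16 GB console: what is left for graphics data
# once the OS reservation and the game's ordinary (non-VRAM) memory come out
# of the shared pool. All figures below are assumed, not measured.
console_total_gb = 16.0   # total unified GDDR6
os_reserved_gb   = 2.5    # assumed OS/background reservation
game_system_gb   = 5.5    # assumed non-graphics use by a heavy AAA title

effective_vram_gb = console_total_gb - os_reserved_gb - game_system_gb
print(f"left for graphics data: {effective_vram_gb:.1f} GB")  # -> 8.0 GB
```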
 
The 6800 and 3070 aren't even in the same class. There's a 10%+ performance gap and double the VRAM. I don't even know how that argument dragged on for so long.
 
I doubt it ever gets close to that for what is used as VRAM. If, say, 13.5 GB of the total RAM is available to the main game currently playing (i.e., to match that total, a PC with a 10 GB VRAM card would have 26 or 38 GB of combined memory), and the game uses 5-6 GB of RAM for non-VRAM purposes, it is left with 7-8 GB for the VRAM part.

With the consoles in mind, 8 GB of VRAM is potentially quite good (depending on how much of a difference that decompression turns out to make, but in that case the amount of RAM is not much of a clue anyway).
Like I said, though, we're just seeing this gen kick off. Even Cyberpunk is gonna run on last-gen hardware. Dude, once these devs start harnessing the horsepower available to them...

Look, I'd rather have too much than not enough. See, when I'm building a rig I'm not looking for something I'm gonna have to upgrade in a year (even if I should).

Reminds me of all the people a little over a year ago saying "all you need is a quad-core for gaming." If you ask me, the last thing you want to be doing is running minimum specs. But hey, I was f'n around in the '90s, so I know how it goes.
 
The 6800 and 3070 aren't even in the same class. There's a 10%+ performance gap and double the VRAM. I don't even know how that argument dragged on for so long.
They are in the same market segment; just because one wins doesn't mean they're not in the same class.

E.g., Mustang vs. Camaro, GT vs. Corvette.
 
Look, I'd rather have too much than not enough. See, when I'm building a rig I'm not looking for something I'm gonna have to upgrade in a year (even if I should).
Yes (depending on the price tag of the wasted capacity and the consequences of coming up short).

like i said tho we're just seeing this gen kicking off. even cyberpunk's gonna run on last get hardware. dude once these dev's start harnessing the horsepower available to them...
Like I said maybe the decompression and ssd speed change the notion of VRAM (but then it is all off to know what to expect and 16 isn<t a reference int hat case either) after all they are able in 2020 to squeeze Cyberpunk on the 2013 PS4, the point is that the 16 gig of ram off the console is total ram and not much of an indication that game will start use that much because of that, because the OS and the games will be using about half of it, 5-6 gig in extreme case where they use little.
 
Yes (depending on the price tag of the wasted capacity and the consequences of coming up short).


Like I said, maybe the decompression and SSD speed change the notion of VRAM (but then it is anyone's guess what to expect, and 16 GB isn't a reference in that case either); after all, they were able, in 2020, to squeeze Cyberpunk onto the 2013 PS4. The point is that the 16 GB of RAM in the console is total RAM and not much of an indication that games will start using that much, because the OS and the game's other needs will be using about half of it, 5-6 GB in the extreme case where they use little.
But think about it: theoretically, how much VRAM could the PS5 address? Let's say that number is 14.5 GB; it's not like they make 15 GB RAM chips.

I mean, it's whatever. Maybe it won't make a difference now, but in two years? You want a cheaper model? Maybe one will come in time. I know after seeing the reviews I'm waiting for custom boards anyway because of the coil whine. But people who want their PCs AT LEAST as good as a console will appreciate the 16 GB as standard.
 
Yes (depending on the price tag of the wasted capacity and the consequences of coming up short).


Like I said, maybe the decompression and SSD speed change the notion of VRAM (but then it is anyone's guess what to expect, and 16 GB isn't a reference in that case either); after all, they were able, in 2020, to squeeze Cyberpunk onto the 2013 PS4. The point is that the 16 GB of RAM in the console is total RAM and not much of an indication that games will start using that much, because the OS and the game's other needs will be using about half of it, 5-6 GB in the extreme case where they use little.
Sony hasn't publicly talked about typical memory usage, but the Series X is set up like this:

Memory config:
10240 MB for GPU memory / 320-bit @ 560 GB/s
+3584 MB for system memory / 192-bit @ 336 GB/s
+2560 MB reserved by the OS / 192-bit @ 336 GB/s

I am certain that, if needed, they can use some of the 10 GB pool for system memory, and probably regularly do; 3584 MB is likely a minimum reservation.
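As a sanity check, those three pools do sum to the full 16 GB, and the quoted bandwidths fall straight out of the bus widths if you assume the published 14 Gbps GDDR6 rate (a sketch of the arithmetic, not an official Microsoft formula):

```python
# bandwidth (GB/s) = per-pin rate (Gbps) * bus width (bits) / 8
gddr6_gbps = 14  # published Series X memory speed; an assumption for this check
pools = [
    ("GPU memory",    10240, 320),  # size in MB, bus width in bits
    ("system memory",  3584, 192),
    ("OS reserved",    2560, 192),
]

total_mb = 0
for name, size_mb, bus_bits in pools:
    print(f"{name}: {size_mb} MB @ {gddr6_gbps * bus_bits / 8:.0f} GB/s")
    total_mb += size_mb

print(f"total: {total_mb} MB = {total_mb / 1024:.0f} GB")  # -> 16384 MB = 16 GB
```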
 
Are you forgetting about SAM and that special cache thing they've got going on? That may be the perfect amount of memory to get the best performance. Also, it's best to have the 16 GB of GDDR available since that's what's available on consoles.

And I know what you're gonna say: "well, that is used for system and graphics memory." Yeah, well, you wanna take a chance on running out? Especially when you've got Windows running in the background. And do you really want "console-level performance"? ... pfff ... :flick of the wrist:

No, I did not. It is just not a feature I am considering, because I am on a B450 rig and, at least at the moment, it does not support SAM. Though who knows what the future brings, now that Nvidia has spilled the beans that the feature is not technically locked to anything and they are also making something similar. But even if it were supported, I would also need to upgrade my CPU, I think.
 
But think about it: theoretically, how much VRAM could the PS5 address? Let's say that number is 14.5 GB; it's not like they make 15 GB RAM chips.

I mean, it's whatever. Maybe it won't make a difference now, but in two years? You want a cheaper model? Maybe one will come in time. I know after seeing the reviews I'm waiting for custom boards anyway because of the coil whine. But people who want their PCs AT LEAST as good as a console will appreciate the 16 GB as standard.
That would leave 1.5 GB of RAM for an incredibly heavy game plus the OS and all the other background affairs; maybe in some 3D tech demo... A AAA video game usually likes to have at least 5 GB of RAM, and a modern OS another 1 GB (the PS4 was reserving at least 2.5 GB of RAM, I think, but this one is apparently more efficient).

But people who want their PCs AT LEAST as good as a console will appreciate the 16 GB as standard.
I think that for "at least as good as a console," 16 GB of VRAM is probably not that relevant a reference point, for the reasons expressed; you didn't need anywhere near 8 GB of VRAM to be at least as good as the original PS4. But the thing is, if you are spending as much as, if not way more than, a complete new console on a video card alone, you almost certainly want your PC to be significantly better than a console (depending on how good keyboard/mouse support is on console systems/games these days). Running Cyberpunk with RT, running games at high settings at 60+ fps instead of 30 fps, being able to play Flight Simulator across multiple monitors: that is what it is often about these days when spending over $500 on a video card.

Some buffer above console level, to be confident it was worth not simply buying a console, is nice peace of mind for many; thus it is nice to have more than 8 GB (or even 10 GB) of VRAM. But I suspect that around 8 GB max of RAM used as VRAM will often be the norm on a console.

What is still up in the air is that as SSD/caching/decompression transfers into memory get faster, the cost of missing VRAM shrinks; on the PS5, for example, maybe some games will effectively behave as if they had 8 GB of fast VRAM plus 50 GB of slow VRAM.
 
So, has anybody else seen any of the hilarious eBay posts from 6800 XT scalpers? I know there was a whole thread dedicated to 3080 scalping, but I didn't think there was a point in starting one for AMD's cards yet. But I thought this was funny...
[Attached image: funny ebay.jpg]
 