Where Gaming Begins: Ep. 2 | AMD Radeon™ RX 6000 Series Graphics Cards - 11am CDT 10/28/20

I want a 6900XT, but I'll probably go for a 6800XT. Those graphs make me feel better about "settling". I don't see $350 worth of performance over the 6800XT, but that's usually how it goes with the top card.

I'm more interested in the minimum FPS - we'll see how that goes with the reviews. I think $650 for the 6800XT plus $150ish for a water block will get me close enough to 6900XT numbers that I won't shed any tears. I'm also planning on a 5800X purchase (taking the lower latency of a single CCX over moar cores and moar $$$) and hopefully an LG OLED to take advantage of that hardware at 4K.
I am interested to see if those perfect 80CU dies are going to clock better than the 6800XT.
 
This launch has been such a wild ride -- not just this launch, but also all the drama with Intel and Nvidia -- a part of me wishes that I was still doing hardware news.
 
I have this sneaking suspicion that once independent reviews of the new 6800XT/6900XT series cards hit and they become available for sale, we are going to start seeing a lot more 3080s/3090s suddenly become "in stock" and available for purchase.... because supply channels will start filling up when no one wants to pay more for less...
I would have to disagree with you. There are people who are die-hard Nvidiots who will never buy an AMD card, and there are still quite a few people who have G-Sync monitors (with the module in the monitor) who can only go with the Nvidia ecosystem.

It's the same with Intel users. I can tell you right now there will be people who won't buy the new AMD 5000 series CPUs just because they aren't Intel..... Fanbois, I tell ya!
 
I think most fanboys are typically people that have already bought something and are just trying to justify their purchase. They wouldn't be buying a new CPU/GPU anyways.
 
Not sure whether it's interesting or not that they chose 3200MHz DDR4 for that 5900X, but I would have expected higher, given the talk a little while ago about it running best at 3800-4000 or whatnot.
3200 is spec IIRC. So that would make sense.
 
I think most fanboys are typically people that have already bought something and are just trying to justify their purchase. They wouldn't be buying a new CPU/GPU anyways.
I feel like this rings more true. It feels to me like the real die-hard fanboys are the ones that don't purchase every cycle and might have 1 or 2 older products. I get this same feeling mostly with Razer fanboys, though.
 
3200 is spec IIRC. So that would make sense.

And it saves them from hearing, "Well, that RAM is optimized for AMD and thus boosting their smart memory trickery. Clearly not optimal for an Nvidia build."

Knowing that there is probably some headroom there yet is nice. Reviewers are going to have fun now having to test RAM with GPUs. lol
 
Yeah, I can see an argument for faster RAM pulling ahead a bit even with slower timings when everything is paired together. But I have also seen some good arguments for the exact opposite; it's a strange time. This is not the month to be trying to build out a system. There are so many things that need reviewing and testing; it's a crazy time.
Big F-U to AMD for rocking the boat and deviating from the status quo.
 
Coreteks hasn't been the most accurate with some stuff surrounding this launch, as well as with Ampere, but if he's right about this, then part of the reason AMD has been ... soft ... on ray-tracing details is that their ray-tracing performance may be tied to CPU performance.

Multi-core/multi-thread processing could be a big determining factor in how Big Navi handles ray-tracing. On the one hand, if this is true, then the GPU doesn't need as much dedicated hardware just sitting there doing nothing if you're not doing any ray-tracing. On the other hand, this means that with AMD, if you want better ray-tracing, you need more cores/threads.

This could explain some of the console APU decisions this gen.
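
Purely as a toy illustration of why core count would matter if that rumor is true (this is my own made-up model, not anything AMD or Microsoft has confirmed about how RDNA2 or DXR actually works): if some per-frame ray-tracing prep, say BVH refits or instance updates, runs on the CPU, that work divides across cores, so the CPU cost per frame shrinks as the core count goes up.

# Toy model only: hypothetical CPU-side per-object RT prep, not AMD's actual design.
def rt_prep_ms(num_objects, ms_per_object, cores):
    # Idealized scaling: the work splits evenly across cores with zero overhead.
    return num_objects * ms_per_object / cores

objects, ms_each = 128, 0.05  # made-up workload: 128 animated objects, 0.05 ms of prep each
for cores in (6, 8, 12, 16):
    used = rt_prep_ms(objects, ms_each, cores)
    print(f"{cores:>2} cores -> {used:.2f} ms of a 16.7 ms (60 fps) frame budget spent on RT prep")

If the rumor holds, that is the shape of the argument: the 12- and 16-core parts leave more of each frame's budget for everything else.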
That sounds like a good excuse to spring for the 5900x or 5950x if true. It would also explain some of the stuff I saw about CPU usage that didn't make complete sense when I was reading up on DXR recently.
 
Yeah, and it really plays to AMD's strengths. Even if they didn't have the IPC with Zen 3, they're smashing it with core counts.

If this is the case, then we're well and truly moving away from single-threaded performance being key to better gaming.

I mean, while there are absolutely exceptions, if someone was making a dual-core, quad-threaded CPU that ran at 9.99GHz, let's face it, it would probably be the must-have CPU for gaming. Maybe we're finally seeing that change.
 
I doubt sheer speed is even very important these days, from a realistic perspective.

When it comes down to it, CPUs are going to be sold to data centers and servers, and there, what will count is cores. Each core would be running its own set of server and database applications, needing to gather as much data as possible, as quickly as possible, and send it along through the wires. In those situations, NVMe-type devices paired with a core for each server instance will be what gives a database its speed. PCIe 4.0 has advantages there when it comes to advanced, ultra-high-speed network cards that utilize NVMe-over-Fabrics type technologies.
 
I'm not talking about data center PCs, I'm talking about gaming PCs. Lots of cores have been a major factor in data centers for ... forever. They haven't been particularly relevant to gamers, though, beyond a couple of cores.
 
Honest question, is there any noticeable difference between 200+ FPS and 150 FPS? or for that matter 100 FPS?
 
Looks like AMD fans finally have something to be excited about!

Almost didn't say anything, don't want to jinx it.
But claiming you don't want to jinx something is totally 100% how you jinx something. So now you owe the whole forum a Coke??? Is that how that works, or must we rhythmically chant "circle, circle, dot, dot" and everybody gets a cootie shot??
 
I was planning on getting a 3090 but just haven't been able to. If the 6900xt is as good as they're saying at only $999 I don't think I could buy a 3090 without feeling like a sucker.

I'm eager to see reviews.
 
Honest question, is there any noticeable difference between 200+ FPS and 150 FPS? or for that matter 100 FPS?
Well, considering 360Hz monitors were recently released, yes.

I'm on 160Hz myself, and I bet most people buying these video cards are on 1440p (or greater), possibly 144Hz+, so yeah, it matters.
 
WTF?! AMD actually releasing new GPUs that can compete with Nvidia's newly released flagship products?! Just when I thought 2020 couldn't get any fucking crazier!!??
 
Honest question, is there any noticeable difference between 200+ FPS and 150 FPS? or for that matter 100 FPS?
If we are talking purely about the refresh rate spec on paper: for most people, 120-144Hz is a big improvement over 60Hz. After that, the actual gain in milliseconds sharply enters diminishing returns.

However, I have watched a few videos on YouTube which attempt to gauge the benefits, using enthusiast gamers, gaming journalists, and also pro gamers as test subjects. Basically all of them show that 240Hz and higher do tend to result in better overall average scores in fast-paced competitive games, compared to 120/144Hz. And compared to 60Hz, it's a large advantage, even with a pro gamer on 60Hz.

The other thing to consider, however, is not simple paper specs, but whether or not your monitor's pixel response can actually keep up with that fast of an input refresh rate. Basically, you probably shouldn't be looking at any current VA monitor for refresh rates over 144Hz (save for the Samsung Odyssey VA monitors). You'll need the very best IPS panels to mostly keep up with a 240Hz refresh, but the current high-performing IPS panels do take a hit to contrast/black level. Higher than 240 is TN panel territory, and TNs have so many image quality downsides that I would only recommend them to people who value gaming scores above all else.
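
The diminishing returns are easy to see if you just convert refresh rates to frame times; this is plain arithmetic, nothing vendor-specific:

# Frame time in milliseconds for a given refresh rate.
def frame_time_ms(hz):
    return 1000.0 / hz

rates = [60, 120, 144, 240, 360]
for prev, cur in zip(rates, rates[1:]):
    saved = frame_time_ms(prev) - frame_time_ms(cur)
    print(f"{prev:>3} Hz -> {cur:>3} Hz: {frame_time_ms(prev):5.2f} ms -> {frame_time_ms(cur):5.2f} ms per frame (saves {saved:.2f} ms)")

Going from 60 to 120Hz saves about 8.3 ms per frame; 240 to 360Hz saves about 1.4 ms, which is why the jumps past 144Hz feel so much smaller on paper.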
 
Also, a higher average frame rate (even above the screen refresh) may come with higher lows, meaning you are less likely to drop below your refresh rate, even in intensive parts of the game.
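
A rough way to picture that, with a completely made-up assumption that your slowest frames take about 30% longer than average:

# Hypothetical illustration: assume worst-case frames take 30% longer than the average frame.
def rough_low_fps(avg_fps, worst_case_slowdown=1.3):
    return avg_fps / worst_case_slowdown

for avg in (150, 200, 240):
    low = rough_low_fps(avg)
    verdict = "stays above" if low >= 144 else "dips below"
    print(f"avg {avg} fps -> lows around {low:.0f} fps ({verdict} a 144 Hz refresh)")

Same spiky game either way, but the higher average gives you the headroom to keep the lows above the refresh rate.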
 
Looks like AMD fans finally have something to be excited about!

Almost didn't say anything, don't want to jinx it.
Not just AMD fans. Many of us who don't pick sides but buy the high-end GPUs are plenty excited too. More competition, more innovation, and better pricing. Can't wait to pick up a 6900XT.
 
Honest question, is there any noticeable difference between 200+ FPS and 150 FPS? or for that matter 100 FPS?


Too long; didn't watch version:

High FPS matters more than high refresh rate
Going from 60fps to 120fps makes a big difference
Going from 120fps to 240fps makes a smaller difference
(Screenshot of the test results attached.)

BTW: Paul and Linus are the non-pro players in this test and they have the largest delta in performance between 120 and 240 fps.
 
I'm not gonna watch an LTT video, so can you explain what "high FPS matters more than high refresh rate" means? How can you display a high framerate without a monitor with an equivalent refresh rate?
 
Yeah, I seem to recall watching one of those videos in the past (maybe this is the same video), and the consensus was that many couldn't tell the difference. Skimming through this video, it makes a difference for "gamers", but for people like me who just want things to look nice, maybe not so much. If anything, 200+ vs 150 isn't going to make any difference in these games for me, but for future-proofing it may matter: if that 200+ comes down to 120 for some future game, the 150 would be down in the 70-or-so range, maybe.
 
I'm not gonna watch an LTT video, so can you explain what "high FPS matters more than high refresh rate" means? How can you display a high framerate without a monitor with an equivalent refresh rate?
My understanding is that running higher than your refresh rate means you'll be able to lock to your refresh rate more easily. With the advent of variable refresh/adaptive sync it's not as big of an issue, but it still helps to avoid problems.

Just think back to enabling V-Sync at 60Hz while your fps is bouncing from 60 down to 43, with the associated judder as the refresh and frames fall out of sync. But if your lowest fps is 65, sure, only 60 frames are displayed, but nothing falls out of sync and there's no screen tearing.
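
You can see the bouncing in plain numbers. A minimal sketch assuming classic double-buffered V-Sync at 60Hz (no triple buffering, no VRR), where a frame that misses a 16.7 ms refresh deadline waits for the next one:

import math

REFRESH_MS = 1000 / 60  # 16.67 ms between refreshes at 60 Hz

def displayed_interval_ms(render_ms):
    # With simple double-buffered V-Sync, a frame can only be swapped on a refresh
    # boundary, so a frame that misses one deadline waits a whole extra refresh.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for fps in (43, 58, 65, 120):
    render = 1000 / fps
    shown = displayed_interval_ms(render)
    print(f"{fps:>3} fps rendered ({render:5.1f} ms/frame) -> new frame every {shown:4.1f} ms (~{1000 / shown:.0f} fps on screen)")

So 58 fps under V-Sync can feel like 30, while 65 fps locks cleanly to 60, which is exactly the judder-vs-smooth split described above.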
 
Well, almost anyone with a monitor over 144Hz is going to be gaming at 1440p max anyway.... and basically any of these cards seems able to deliver darn close to a 144 average with ultra settings in, it seems, most games. Engage the FreeSync and you're golden. There really is zero point in going way over a monitor's refresh rate unless the card has terrible drops, which would mean crazy spikes as well to hit those average numbers.

Anything over 144Hz... is pretty much going to involve 1080p, outside of a handful of very expensive ultrawides with high refresh or something. Many gamers buy very nice monitors; still, the number that spend $500+ on a GPU and $1000+ on their monitor is pretty small.

This generation (NV included) is hitting a nice sweet spot where you should be able to push a 1440p monitor up to its refresh rate, or at least within 15-25 fps, which Free/G-Sync handle with basically no human-perceptible difference. 4K with high refresh is still not really happening in the monitor market... and by the time it does we'll be talking about Navi 4 or 5 and 6080s, lol. 1080p 240Hz is cool if you're looking for every edge for your leet sniping skills in the FPS of your choice... but really it's of use in only 1-2 types of games, at an uber-competitive level. For most people (and for sure most of us old men around [H]), I'm sorry, we may have all been twitch kings in our day, but 240 vs 144 fps is going to make little difference for us. lol

The game changer this gen is being able to play at 144Hz at 1440p with ULTRA settings. No need to turn anything off at all to max out our high-end 1440p gaming monitors.
 
These bench comparisons with "+Smart Access Memory" are useless.

Definitely interested. Want real benches.
Dear god.
I can't even replicate their Call of Duty 1440p RTX 3090 results (I get higher FPS in multiplayer, and my card doesn't even exceed 350W very much, even with a 400W power limit via a maxed power slider), because apparently, according to someone else, AMD ran this in "benchmark mode". However, there is no way to activate benchmark mode in the Blizzard client, so I guess real end users have no access to it. And a +120 core / +600 RAM overclock over stock does not give a +30 FPS boost. No way in hell.
 
I'm not gonna watch an LTT video, so can you explain what "high FPS matters more than high refresh rate" means? How can you display a high framerate without a monitor with an equivalent refresh rate?
They tested 200+ FPS at 60Hz vs 60 FPS at 60Hz, and their reaction times improved. The theory they presented was that input lag is affected by frame rate.
 
I'm not gonna watch an LTT video, so can you explain what "high FPS matters more than high refresh rate" means? How can you display a high framerate without a monitor with an equivalent refresh rate?
The game refreshes its state more often, so what you get to see at the monitor's refresh time is a game state much closer to it. Otherwise you're seeing something from further in the past.
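
Back-of-the-envelope version of that, ignoring engine pipelining, scanout, and the rest of the input chain: the newest frame a fixed 60Hz refresh can grab was started up to one render interval earlier, so a higher render rate means a less stale game state even though the panel still only shows 60 frames per second.

# Worst-case staleness of the frame a 60 Hz panel displays, as a function of render rate.
# Simplified on purpose: ignores engine pipelining, scanout time, input sampling, etc.
def worst_case_staleness_ms(render_fps):
    return 1000.0 / render_fps

for fps in (60, 144, 240):
    print(f"rendering at {fps:>3} fps -> displayed frame is up to {worst_case_staleness_ms(fps):5.1f} ms old")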
 
Well I sold my 3080 this morning. Let’s see if that was a mistake or not!
 
They tested 200+ FPS at 60Hz vs 60 FPS at 60Hz, and their reaction times improved. The theory they presented was that input lag is affected by frame rate.
Yeah, the theory behind the hardware/logic is that you may see an image on the screen, but that state isn't really current in the computer's memory anymore; it's just on screen. That said, having line-of-sight, insta-bullet-travel "sniper skills" in a multiplayer game isn't exactly anything I care about.
 