Where Gaming Begins: Ep. 2 | AMD Radeon™ RX 6000 Series Graphics Cards - 11am CDT 10/28/20

Dear god.
I can't even replicate their Call of Duty 1440p RTX 3090 results (I get higher FPS in multiplayer, and my card rarely exceeds 350 W even with a 400 W power limit via a maxed power slider) because apparently, according to someone else, AMD ran this in "benchmark mode". However, there is no way to activate benchmark mode in the Blizzard client, so real end users have no access to it. And a +120 core / +600 memory overclock over stock does not give a +30 FPS boost. No way in hell.

Frankly I believe AMD over you. I half kid.

One thing no one seems to be taking into account is that AMD lists which API they are using, and I notice they are sometimes using DX11 when we know DX12 is available.

But whatever... if you don't believe AMD's marketing, that is cool; everyone takes marketing with a grain of salt. However, the current people at AMD are not known to fudge anything. They don't overclock the NV cards at all, and they do not overclock their own cards unless they specifically mention they are using one of their auto-overclock features, Rage Mode or whatever.

Reviewers will have the cards soon, and they will test them at stock and tortured, versus NV at stock and tortured. Then you can compare apples to apples from a third party. I tend to believe you probably really, really overpaid for your 3090. But then, most GPUs that are only ever going to be the best of the best for a few months are terrible value propositions in general. Yes, if it's true and the 6900 even trades blows with the 3090, it's really going to suck for people who paid a massive premium for a card that was only top of the top for a few weeks. But such is life.

And for the record... yeah, the 3090s seem, from all accounts from review sites, prone to unit-to-unit differences. They are binned chips, and there is a lot more in play than just the MHz they run at. It seems like the quality of the power feeding those monstrosities matters and can have real impacts. If you run your card at stock and are besting AMD's results, I would count yourself super lucky: not only did you get one of the apparently couple hundred cards end users have in the wild, you also got one with quality silicon from a supplier that also used quality power delivery components. Congrats. (Seriously, winning the silicon lottery with GPUs is nothing new, and it always feels good when you get a card that can clock a little higher and run a little cooler.)
 
The game refreshes its state more often, so what you see at the monitor's refresh time is a game state much closer to that moment. Otherwise you see something further in the past.
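That latency argument can be sketched with simple arithmetic. With v-sync off, the frame on screen at each scanout is, on average, about half a frame time old, so a higher render rate means a fresher game state. This is a toy model of my own (real pipelines add driver and display latency on top):

```python
def frame_time_ms(fps: float) -> float:
    """Time to render one frame, in milliseconds."""
    return 1000.0 / fps

def avg_frame_age_ms(fps: float) -> float:
    """With v-sync off, the newest completed frame at scanout is on
    average half a frame time old (uniform-arrival assumption)."""
    return frame_time_ms(fps) / 2.0

for fps in (60, 144, 360):
    print(f"{fps:>3} fps: frame time {frame_time_ms(fps):6.2f} ms, "
          f"avg staleness {avg_frame_age_ms(fps):5.2f} ms")
```

So going from 60 fps to 360 fps cuts the average staleness of what you see from roughly 8 ms to under 1.5 ms, even on the same 60 Hz monitor.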

Anything that can be measured? Anyone else test this other than LTT?
 
I've never been excited about an AMD GPU like this, at least not in over a decade. I have an ASUS 3080 Strix on the way and I'm just feeling "meh" about it, but the 6900 XT has me [H]ard as a rock. Got a 5700 XT today and this thing is rock solid; I haven't run into a single problem with it, so I think the "AMD drivers are shit" myth is just that, a myth. They might have had black-screen issues before, but I don't see anything like that right now. 12/8 can't come soon enough; my F5 key will be ready and oiled. I think I'm becoming an AMD fanboy at long last.
 
I've never been excited about an AMD GPU like this, at least not in over a decade. I have an ASUS 3080 Strix on the way and I'm just feeling "meh" about it, but the 6900 XT has me [H]ard as a rock. Got a 5700 XT today and this thing is rock solid; I haven't run into a single problem with it, so I think the "AMD drivers are shit" myth is just that, a myth. They might have had black-screen issues before, but I don't see anything like that right now. 12/8 can't come soon enough; my F5 key will be ready and oiled. I think I'm becoming an AMD fanboy at long last.

Someone check the skies for a 50-mile meteor impact; we must be near the end of the world and the end times if joker is excited about AMD GPUs.
 
Anything that can be measured? Anyone else test this other than LTT?
Not really, but "server ticks" are a thing, and those vary based on lag times. Depending on the tick rate and people's input timing, they could be issuing multiple commands between ticks or getting them in after the tick completed. There are a lot of different methods for determining the optimal tick time, and some games just use an arbitrary value. The easiest place to measure it, and where it is most frequently encountered, is in MMOs, where the idea of the GCD (Global Cooldown) comes in: the cooldown rate was fixed to an arbitrary value such as 1.5 s.
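The tick-rate effect is easy to quantify. A server ticking at a fixed rate processes input only at tick boundaries, so a command arriving at a random moment waits, on average, half a tick interval. This is an illustrative model only; real netcode adds network latency and interpolation on top:

```python
def tick_interval_ms(tick_rate_hz: float) -> float:
    """Time between server ticks, in milliseconds."""
    return 1000.0 / tick_rate_hz

def avg_input_wait_ms(tick_rate_hz: float) -> float:
    """A command arriving at a uniformly random moment waits, on
    average, half a tick interval before the server processes it."""
    return tick_interval_ms(tick_rate_hz) / 2.0

for rate in (20, 64, 128):
    print(f"{rate:>3} Hz tick: interval {tick_interval_ms(rate):6.2f} ms, "
          f"avg wait {avg_input_wait_ms(rate):5.2f} ms")
```

A fixed 1.5 s GCD, by contrast, is an interval chosen by the designers rather than derived from the tick rate.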
 
I've never been excited about an AMD GPU like this, at least not in over a decade. I have an ASUS 3080 Strix on the way and I'm just feeling "meh" about it, but the 6900 XT has me [H]ard as a rock. Got a 5700 XT today and this thing is rock solid; I haven't run into a single problem with it, so I think the "AMD drivers are shit" myth is just that, a myth. They might have had black-screen issues before, but I don't see anything like that right now. 12/8 can't come soon enough; my F5 key will be ready and oiled. I think I'm becoming an AMD fanboy at long last.
Been rocking my MSI 5700 XT since last year. Got it for a fantastic deal before all this COVID crap drove prices up ($499 Canadian / $375 US). It's been rock solid the entire time. The black-screen stuff was a problem, but it hardly affected everyone. I have had no real issues with my card at all, and the 570 and 580 I had in my machine before that both ran fine as well (although, to be fair, those later-model GCN cards were dialed in after two or three refreshes). AMD hurt themselves on the drivers with the UI overhauls, imo. There were a lot of people who were very vocal about hating the current UI redesign. I myself found it a bit overdone at first, but it's grown on me. But between that and the actual legit black-screen bug that affected some people for a month or so, they have grown a myth of "AMD drivers bad".

Also, if it matters to you in the slightest: AMD's open source Linux drivers are insanely good (by Linux standards; no lying, gaming on Linux is still mostly slower outside a few outlier Vulkan games that will run faster under Linux than on Windows). If you're new to Linux or just plan to dabble once a year, AMD makes it stupid easy. Basically every distro will just install and go on AMD: no selecting non-free drivers, no bolting things to your kernel, no installing with crap open source drivers and jumping through hoops to switch to the Nvidia closed source driver while blacklisting the free one, etc. Anyway, yeah, AMD + Linux is great.
 
Yeah, Canada's electronics supply chain is a mess right now, so that is/was a good deal. Between COVID, "tanker" bans, forest fires, and other things, getting electronics into Canada is a PITA right now.
 
Yeah, Canada's electronics supply chain is a mess right now, so that is/was a good deal. Between COVID, "tanker" bans, forest fires, and other things, getting electronics into Canada is a PITA right now.
First time I can ever remember seeing a GPU I have owned for a year on a shelf for more than I paid for it. I have been trying not to get excited about Navi 2, because I know what it will be like getting one here. Even if stock is good, I'm not convinced any of our retailers will be priced right. I prefer to buy from the local guys if I can rather than ordering online; it's just nice to go back there if something is funky out of the box. The local Memory Express here has been great to me the last few years, exchanging RAM with errors no questions asked, etc.
 
First time I can ever remember seeing a GPU I have owned for a year on a shelf for more than I paid for it. I have been trying not to get excited about Navi 2, because I know what it will be like getting one here. Even if stock is good, I'm not convinced any of our retailers will be priced right. I prefer to buy from the local guys if I can rather than ordering online; it's just nice to go back there if something is funky out of the box. The local Memory Express here has been great to me the last few years, exchanging RAM with errors no questions asked, etc.
They can't be priced right as it currently stands, because our gov't is full of idiots who can't figure out how the world works. We now have all our electronics delivered to California and New York, trucked to Utah, then taken from there to Vancouver, Calgary, and Ontario for distribution to the rest of Canada.

When they instituted that "tanker" ban to appease the environmentalists, they didn't ban tankers; they banned all boats with greater than X displacement. That covered most of the larger freighters coming from Asia, so instead of docking in Vancouver and Montreal as they could before, they have had to reroute to the US, because the gov't couldn't comprehend that these shipping companies just don't have smaller freighters sitting around waiting to be used. So most of our electronics now carry the added fees of trucking up from the US, as well as all the border handling fees.

And the best part: the tanker ban was supposed to stop or slow down the methane plants going in for export to China, because the plants caused too much CO2 and blah blah blah. Well, the increased trucking basically offset that amount, so all they did was delay the plants and add a huge carbon footprint to just about every device that comes into Canada. And the plants are going ahead anyway; they will just use smaller tankers and make more frequent trips. So the only things those environmentalists and their tanker ban accomplished were delaying the project by about three years, increasing CO2 emissions, and adding something like 20% to the cost of all electronics that come up here.
 
Hopefully with Amazon opening up a distribution center in Winnipeg MB, we'll start to see lower pricing/quicker availability of PC components to Canada.

I'm lucky in that I live near the border and can pick up products shipped to a US destination (at least before the border closure).
 
I would have to disagree with you. There are people who are die-hard Nvidiots who will never buy an AMD card, and there are still quite a few people who have G-Sync monitors (with the module in the monitor) who can only go with the Nvidia ecosystem.

It's the same with Intel users. I can tell you right now there will be people who won't buy the new AMD 5000-series CPUs just because they aren't Intel... fanbois, I tell ya!

I am in the latter camp. I have an X34 G-Sync monitor, so unless Acer releases a firmware update (which is rumored to be in the works) I am pretty much stuck with Nvidia unless I buy a new monitor, something I am not really keen to do. But that's okay, I am happy to buy Nvidia's now-inevitable 20GB 3080 Ti for the same price as the 3080 :D
 
I've never been excited about an AMD GPU like this, at least not in over a decade. I have an ASUS 3080 Strix on the way and I'm just feeling "meh" about it, but the 6900 XT has me [H]ard as a rock. Got a 5700 XT today and this thing is rock solid; I haven't run into a single problem with it, so I think the "AMD drivers are shit" myth is just that, a myth. They might have had black-screen issues before, but I don't see anything like that right now. 12/8 can't come soon enough; my F5 key will be ready and oiled. I think I'm becoming an AMD fanboy at long last.
LOL. I tried so hard to grab a 3080 or 3090 and had zero luck. It wasn't meant to be, and now I am on to the 6900 XT. Nice toys for the next few months. A 5900X or 5950X F5 frenzy on Nov 5th, and then Dec 8th will be another F5 frenzy. Might buy a new keyboard along with it, ROFL.
 
Anything that can be measured? Anyone else test this other than LTT?
https://blurbusters.com/faq/benefits-of-frame-rate-above-refresh-rate/

blur-busters-gsync-101-vsync-off-w-fps-limits-60Hz.png


https://www.nvidia.com/en-us/geforce/news/what-is-fps-and-how-it-helps-you-win-games/

fps-hz-visualization.png
 
Hopefully with Amazon opening up a distribution center in Winnipeg MB, we'll start to see lower pricing/quicker availability of PC components to Canada.

I'm lucky in that I live near the border and can pick up products shipped to a US destination (at least before the border closure).

I'm in Winnipeg... good chance the warehouse will be less than 5 minutes from where I am. Hopefully they keep it well stocked. Been getting next-day delivery on most stuff here anyway, but same-day sounds better. lol
 
1 MORE UNO MAS 1 MORE!

6900XT: $999 !!!!!!!!!!!!!!!!!!!!!!!!!!!!
80 CU
2250 Mhz
128 MB Infinity Cache
16 GB GDDR6
300 W

Trades blows with the 3090!!! And even beats it!!
Look at the fine print. Those 6900 benchmarks were done with the GPU overclocked; that is what Rage Mode is, overclocking the GPU. They did not attempt to overclock the RTX 3090, even though the 3090's overclocking headroom is almost zilch. That is a deceptive practice from AMD. So I think the 6900 XT has little value to me, but a 6800 XT like an MSI Gaming X Trio would definitely interest me. It will probably go for $750. Asus has a ROG Strix 6800 XT that is water-cooled; it will probably go for $850. Too rich for my taste.
 
So I think the 6900XT has little value to me but the 6800XT like an MSI Gaming X Trio 6800XT would definitely interest me. Probably will go for $750...

I do think the 6900 XT, which gives 12% more performance than the 6800 XT for $350 more, is somewhat palatable, especially since it's the upper end (and especially when compared to the absurdly priced 3090).

Thing is, the open market will set the price for these cards for most of us. I have a feeling the divide in price between the 6900 XT and 6800 XT is going to be bigger than $350 at street prices, e.g. $1300 vs. $850, unless you're one of the very lucky few who beats the scalpers on Nov 18. And in that case, the 6900 XT makes even less sense for most people.
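As a rough sanity check on the value argument, assuming the announced MSRPs ($649 for the 6800 XT, $999 for the 6900 XT) and taking the ~12% performance figure at face value, the perf-per-dollar math is easy to run:

```python
def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Relative performance per dollar; only the ratio matters."""
    return relative_perf / price

base = perf_per_dollar(1.00, 649)  # 6800 XT at its $649 MSRP
top = perf_per_dollar(1.12, 999)   # 6900 XT: ~12% faster at $999 MSRP

# The 6900 XT delivers roughly 73% of the 6800 XT's perf per dollar.
print(f"{top / base:.0%}")
```

At the speculated street prices ($850 vs. $1300) the ratio gets even worse, which is the point being made above.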
 
Look at the fine print. Those 6900 benchmarks were done with the GPU overclocked; that is what Rage Mode is, overclocking the GPU. They did not attempt to overclock the RTX 3090, even though the 3090's overclocking headroom is almost zilch. That is a deceptive practice from AMD. So I think the 6900 XT has little value to me, but a 6800 XT like an MSI Gaming X Trio would definitely interest me. It will probably go for $750. Asus has a ROG Strix 6800 XT that is water-cooled; it will probably go for $850. Too rich for my taste.
Both the 6900 XT and the 3090 have no value for gamers, but the former's Rage Mode probably doesn't account for more than 5% on average.
 
Look at the fine print. Those 6900 benchmarks were done with the GPU overclocked; that is what Rage Mode is, overclocking the GPU. They did not attempt to overclock the RTX 3090, even though the 3090's overclocking headroom is almost zilch. That is a deceptive practice from AMD. So I think the 6900 XT has little value to me, but a 6800 XT like an MSI Gaming X Trio would definitely interest me. It will probably go for $750. Asus has a ROG Strix 6800 XT that is water-cooled; it will probably go for $850. Too rich for my taste.

I THINK there is no deceptive practice going on here, and the fine print was not all that fine; it was clearly visible. They are showing their GPUs in the best light, which is exactly what you'd expect AMD to do. After all, we already know that Nvidia showed their cards in the best light as well... oh, wait, they did not show us anything at all in their prerelease event. :D

Edit: The 6900 XT beats the 3090 at 4K, and that can only be a good thing for all of us.
 
It's not really in AMD's best interest to be purposely deceptive in any benchmarks they release; anyone who thinks otherwise is a fool. They are on the cusp of leading or sharing the lead in the market for GPUs and CPUs, and reputation means a lot in that position. If you examine and get a feel for the culture that Lisa Su has bred at AMD, you will realize that they are not releasing bullshit benchmarks.

Unofficial benchmarks? That could be a different ballgame. However, it is clear that even most unofficial benchmarks are telling us the same thing. Be skeptical, but don't tell me that AMD has just launched the biggest misinformation campaign in recent memory. That's fanboyism, paid-troll talk, or simply being delirious.
 
I'm quite interested in seeing third-party, apples-to-apples overclocked comparison benchmarks (with all the stops pulled out on water, no LN2 stuff) for both AMD's 6900 XT and Nvidia's RTX 3090 FE. Hopefully we'll see those shortly after Dec 8th. Ideally, they'd run on a new 5000-series Ryzen (to also take advantage of Smart Access Memory and Rage Mode where the 6900 XT is concerned), since that is the platform where most new "from scratch" builds will most likely be headed.
 
I'm quite interested in seeing third-party, apples-to-apples overclocked comparison benchmarks (with all the stops pulled out on water, no LN2 stuff) for both AMD's 6900 XT and Nvidia's RTX 3090 FE. Hopefully we'll see those shortly after Dec 8th. Ideally, they'd run on a new 5000-series Ryzen (to also take advantage of Smart Access Memory and Rage Mode where the 6900 XT is concerned), since that is the platform where most new "from scratch" builds will most likely be headed.
Yes, and with the CPU really optimized/clocked/cooled as well, with fast memory. Let's get as much of the CPU bottleneck out of the way as we can.
I understand why AMD posted benchmarks with "normal" systems for repeatability, but we'd all like to see what hot rods will do too.
 
The only red flag I got on the AMD benchmarks was the APIs used. They did select DX11 in a few DX12 titles; that is an interesting move. Third-party reviews will be nice. Hopefully they are getting hardware to reviewers very early, because there is a lot to test this generation: overclocking, auto overclocking, RAM speeds, smart memory. Makes for a lot of work, I am sure.

All I know is that this generation, FPS isn't everything. Yes, they are both going to be in the same range (so for most of us, box checked). However, both companies now have very different feature sets. It throws me back a lot of years to ATI vs. Nvidia, when at times Nvidia had the faster cards but I still went ATI almost every generation because ATI had obviously better filtering.

This generation has me excited because ATI, er, I mean AMD, has caught up in the feature war. On FPS I'm fine as long as everyone is close. Nvidia started adding new features first a couple of generations back. AMD tried to add a few cool things like RIS and Radeon Chill, and those were nice driver features, but it was all software based. This generation AMD is bringing some unique hardware features. That has me more excited than 150 fps vs. 175 fps at any resolution.
 
I've never been excited about an AMD GPU like this, at least not in over a decade. I have an ASUS 3080 Strix on the way and I'm just feeling "meh" about it, but the 6900 XT has me [H]ard as a rock. Got a 5700 XT today and this thing is rock solid; I haven't run into a single problem with it, so I think the "AMD drivers are shit" myth is just that, a myth. They might have had black-screen issues before, but I don't see anything like that right now. 12/8 can't come soon enough; my F5 key will be ready and oiled. I think I'm becoming an AMD fanboy at long last.

*checks date

wtf
 
*checks date

wtf

No April fools here. I'm looking forward to getting a 6900 XT on release. I might also F5 for a 5900X.

PS: I convinced a buddy to return his overpriced ASUS TUF 3090 and get the 6900 XT, and another to skip the 3080 and get the 6800 XT. Guess AMD is already picking up new Nvidia customers. Where's my commission, Lisa?? 😂

In case someone doubts it:
651B8326-FF49-478E-9E0F-D454BA4DB173.png
 
The only red flag I got on the AMD benchmarks was the APIs used. They did select DX11 in a few DX12 titles; that is an interesting move. Third-party reviews will be nice. Hopefully they are getting hardware to reviewers very early, because there is a lot to test this generation: overclocking, auto overclocking, RAM speeds, smart memory. Makes for a lot of work, I am sure.

All I know is that this generation, FPS isn't everything. Yes, they are both going to be in the same range (so for most of us, box checked). However, both companies now have very different feature sets. It throws me back a lot of years to ATI vs. Nvidia, when at times Nvidia had the faster cards but I still went ATI almost every generation because ATI had obviously better filtering.

This generation has me excited because ATI, er, I mean AMD, has caught up in the feature war. On FPS I'm fine as long as everyone is close. Nvidia started adding new features first a couple of generations back. AMD tried to add a few cool things like RIS and Radeon Chill, and those were nice driver features, but it was all software based. This generation AMD is bringing some unique hardware features. That has me more excited than 150 fps vs. 175 fps at any resolution.

I got the biggest increases by moving away from DX to Vulkan with my 5700. I would be more concerned if everything had been done with Vulkan games and no DX ones.
 
No April fools here. I'm looking forward to getting a 6900 XT on release. I might also F5 for a 5900X.

PS: I convinced a buddy to return his overpriced ASUS TUF 3090 and get the 6900 XT, and another to skip the 3080 and get the 6800 XT. Guess AMD is already picking up new Nvidia customers. Where's my commission, Lisa?? 😂

In case someone doubts it:
View attachment 294829
Careful what you wish for... all we need now is hardware "influencers". Oh wait, that ship sailed, didn't it.

You need to start the JokerTech YouTube channel to get that cheddar. lol
 
I don't think it is deceptive. AMD was showing the performance in the best possible light, with new features that they want to advertise.

Of course they would show the best case scenario. Any company would do the same.
 
The only red flag I got on the AMD benchmarks where the APIs used. They did select DX11 in a few DX12 titles, that is an interesting move.

AMD stated that results were obtained in the best API for each card, which is why they may pick one version of DirectX over another. This is exactly the opposite of attempting to handicap any card.
 
I don't think it is deceptive. AMD was showing the performance in the best possible light, with new features that they want to advertise.

Of course they would show the best case scenario. Any company would do the same.
Of course it's deceptive: a bunch of big graphs to create the perception that the results would be typical for any game, and then fine print hiding that the games were cherry-picked and required some or all of:

* Rage OC Mode
* Smart Access Memory
* Requires specific CPU
* Requires game to be specifically coded to support Smart Access Mem

And yes, Nvidia does the same to an extent as far as embellishing their presentation slides, but I don't believe they've ever gone as far as putting up a bunch of big graphs showing, let's say, only DLSS-enabled games vs. stock AMD, while trying to hide or downplay the fact that the benches were all run DLSS-enabled.

Don't get me wrong, AMD came out way stronger than expected, and we still haven't seen what kind of OC beasts the AIB cards might be, not to mention they're lifting the whole market and everyone benefits.
 
Of course it's deceptive: a bunch of big graphs to create the perception that the results would be typical for any game, and then fine print hiding that the games were cherry-picked and required some or all of:

* Rage OC Mode
* Smart Access Memory
* Requires specific CPU
* Requires game to be specifically coded to support Smart Access Mem

And yes, Nvidia does the same to an extent as far as embellishing their presentation slides, but I don't believe they've ever gone as far as putting up a bunch of big graphs showing, let's say, only DLSS-enabled games vs. stock AMD, while trying to hide or downplay the fact that the benches were all run DLSS-enabled.

Don't get me wrong, AMD came out way stronger than expected, but I suspect major goalpost shifting when real benches come out.

SAM needs no intervention from the developer as far as we know.
 
Of course it's deceptive: a bunch of big graphs to create the perception that the results would be typical for any game, and then fine print hiding that the games were cherry-picked and required some or all of:

* Rage OC Mode
* Smart Access Memory
* Requires specific CPU
* Requires game to be specifically coded to support Smart Access Mem

And yes, Nvidia does the same to an extent as far as embellishing their presentation slides, but I don't believe they've ever gone as far as putting up a bunch of big graphs showing, let's say, only DLSS-enabled games vs. stock AMD, while trying to hide or downplay the fact that the benches were all run DLSS-enabled.

Don't get me wrong, AMD came out way stronger than expected, but I suspect major goalpost shifting when real benches come out.

The 3000 series' marketed increase vs. the actual one, "it just works", "the more you buy, the more you save", the list of both RTX and DLSS games, many of which had features added a year or more after release or just dropped entirely.

To say Nvidia doesn't go as far: as far as what? Raja? Sure. Current AMD? Seems pretty sedate in comparison to both Nvidia and AMD of yesteryear.
 
You can also use Fast Sync instead of no v-sync. This avoids tearing and works pretty well in many games (some games have noticeable micro-stutter with Fast Sync). Fast Sync isn't quite as responsive as no v-sync at all, but it is still a whole lot faster than v-sync or adaptive sync. And you can even combine Fast Sync and G-Sync.
 
Look at the fine print. Those 6900 benchmarks were done with the GPU overclocked; that is what Rage Mode is, overclocking the GPU. They did not attempt to overclock the RTX 3090, even though the 3090's overclocking headroom is almost zilch. That is a deceptive practice from AMD. So I think the 6900 XT has little value to me, but a 6800 XT like an MSI Gaming X Trio would definitely interest me. It will probably go for $750. Asus has a ROG Strix 6800 XT that is water-cooled; it will probably go for $850. Too rich for my taste.
Rage Mode isn't exactly an overclock. It's an officially supported power mode which runs the GPU at 100% power and probably just provides more consistent/higher average boost behavior. AMD quotes the feature as 1-2% extra performance, and it's officially supported. It also has its own more aggressive fan curve to go along with it.

Lisa Su is not one to mess around. I do not think these performance numbers are misrepresented. I mean, they are telling us that a regular 6800 beats an RTX 3090 in Forza Horizon if you use the Smart Access Memory feature. That is going to be a really easy thing to get blowback on if they are lying, and I don't think they are. I do think their Nvidia numbers are based on "reference" clocks, which is a bit confusing this time around, because the Founders Editions aren't actually reference designs.
 
You can also use Fast Sync instead of no v-sync. This avoids tearing and works pretty well in many games (some games have noticeable micro-stutter with Fast Sync). Fast Sync isn't quite as responsive as no v-sync at all, but it is still a whole lot faster than v-sync or adaptive sync. And you can even combine Fast Sync and G-Sync.

Isn't Fast Sync basically like v-sync + triple buffering but with less input lag, continuing to prevent tearing above the refresh rate? Sort of like a poor man's G-Sync, since it's not as smooth unless you cap the FPS to the refresh rate.
 
Isn't Fast Sync basically like v-sync + triple buffering but with less input lag, continuing to prevent tearing above the refresh rate? Sort of like a poor man's G-Sync, since it's not as smooth unless you cap the FPS to the refresh rate.
I don't know the actual technical details of Fast Sync, but according to the driver panel description, it lets the GPU run unconstrained like no v-sync, but drops frames past the refresh rate. So as long as you can meet or exceed the refresh rate of your monitor, you won't have tearing, and there is no reason to cap under the monitor refresh with Fast Sync. Whatever it's doing to drop the frames does still incur a bit of delay in milliseconds, but it's a lot lower than v-sync.

However, it does tear if you drop under the refresh rate, and I would imagine a framerate cap under the refresh rate would then always tear with Fast Sync. If you are playing something that varies a lot and goes both under and over your monitor refresh, I would suggest trying G-Sync + Fast Sync together.
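That driver-panel description can be sketched as a toy model. This is purely illustrative (it ignores buffer-swap timing and driver queueing, and the function name is made up for the example): the game renders freely, and each monitor refresh displays only the most recently completed frame.

```python
def fast_sync_frames(render_fps, refresh_hz, duration_s=1.0):
    """Toy model of Fast Sync: the game renders into back buffers as
    fast as it can; at each monitor refresh the display shows the most
    recently COMPLETED frame, and older unshown frames are dropped."""
    render_times = [i / render_fps for i in range(int(render_fps * duration_s))]
    shown = []
    for k in range(int(refresh_hz * duration_s)):
        t = k / refresh_hz                 # moment of this refresh
        done = [rt for rt in render_times if rt <= t]
        if done:
            shown.append(done[-1])         # newest completed frame wins
    return shown

frames = fast_sync_frames(render_fps=144, refresh_hz=60)
print(len(frames))  # 60 refreshes, each showing one whole frame
```

Because each refresh takes the newest complete frame, the extra frames rendered between refreshes are simply discarded, which is why the technique avoids tearing above the refresh rate and only behaves well when the render rate meets or exceeds it.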
 
I don't know the actual technical details of Fast Sync, but according to the driver panel description, it lets the GPU run unconstrained like no v-sync, but drops frames past the refresh rate. So as long as you can meet or exceed the refresh rate of your monitor, you won't have tearing, and there is no reason to cap under the monitor refresh with Fast Sync.

However, it does tear if you drop under the refresh rate, and I would imagine a framerate cap under the refresh rate would then always tear with Fast Sync.

Huh?
I don't have any tearing at all under the refresh rate with fast sync.
 