AMD's Radeon RX 7900-series Highlights

Apple did it because Apple wants to pretend that their Metal API matters, so Apple copied the cool kids. Look at DLSS and FSR's image quality: Hardware Unboxed finds that neither reproduces the original image perfectly.
I imagine this is trolling, the idea that if they do not reproduce the image perfectly in the early days it is a bad idea. Just think where it was 3 years ago and where it is now, and imagine where it will be in just 10 years with Apple, Nvidia, Netflix, AMD and Intel putting money into this.

Apple will soon push 2x 4000 x 4000 images with a mobile APU at, I imagine, high FPS, as that is necessary for VR. It will use foveated rendering with eye tracking, upscaling, reprojected frames and every trick in the book.
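
A rough pixel-throughput comparison shows why all those tricks are needed (just a sketch; the 2x 4000x4000 at 90 fps figure is my guess from the numbers above, not an announced spec):

```python
# Back-of-the-envelope pixel throughput for a hypothetical dual 4000x4000 HMD
# versus an ordinary 4K60 flat screen. Purely illustrative numbers.
vr_pixels_per_sec = 2 * 4000 * 4000 * 90      # two eyes at 90 fps -> ~2.9 Gpix/s
flat_4k60_pixels_per_sec = 3840 * 2160 * 60   # ~0.5 Gpix/s

print(f"VR target:   {vr_pixels_per_sec / 1e9:.1f} Gpix/s")
print(f"4K60 screen: {flat_4k60_pixels_per_sec / 1e9:.1f} Gpix/s")
print(f"Ratio:       ~{vr_pixels_per_sec / flat_4k60_pixels_per_sec:.0f}x")
```

Roughly a 6x gap, which is why foveated rendering and upscaling are practically mandatory on a mobile APU.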

The only good thing about DLSS and FSR is that older GPUs can make use of them and get better frame rates. Both have problems with ghosting and blurry textures, though FSR is usually worse. If I wanted worse image quality and better frame rates, I'd just manually lower the game settings.
It also:
1) Can look better than native for elements the upscaler understands well, like text, cables, etc. That list will only get larger and larger over the next 20-50 years.
2) Saves energy on mobile, which becomes a big factor whether the device is the next Switch or a standalone VR headset.
3) Achieves much better visual quality at the same frame rate than your manual settings (which is the point).

An easy way to look at it: console users are pretty much 100% agnostic, usually not aware of the strategy used to get the best image quality at the target FPS, and almost all games choose upscaling over native with lower graphics settings. I imagine they all tested it and found it looked better that way. FSR 2.0 often seems to lead to even better results on consoles, and developers had picked the upscaling route well before that option existed.
 
I imagine this is trolling, the idea that if they do not reproduce the image perfectly in the early days it is a bad idea. Just think where it was 3 years ago and where it is now, and imagine where it will be in just 10 years with Apple, Nvidia, Netflix, AMD and Intel putting money into this.
ATI was crucified when they did this with Quake 3 about 21 years ago. Nvidia was also caught cheating in 3DMark a few years later. In both cases they reduced image quality to increase performance. Today we high-five AMD and Nvidia for doing the same thing with DLSS and FSR.
 
ATI was crucified when they did this with Quake 3 about 21 years ago. Nvidia was also caught cheating in 3DMark a few years later. In both cases they reduced image quality to increase performance. Today we high-five AMD and Nvidia for doing the same thing with DLSS and FSR.
And Unreal, Unity, pretty much everyone does upscaling; the question becomes how. Yes, reducing resolution increases performance, and DLSS/FSR is a non-hidden way to do it. You know very well that having the option to lower the resolution in a game and choose how it gets upscaled (by the game engine, by your monitor, or by something else) is vastly different from the ATI thing.

We have always had the option, and many have upscaled ever since LCDs, with their fixed native resolution (a disadvantage versus CRTs), became the norm, and consoles have done it since 1080p became popular. Upscaling is really not new, and neither is wanting better upscaling or temporal upscaling (TAA). It sounds more like being angry at a video card company (or at people?) back in the day for using anti-aliasing instead of maxing out their CRT resolution first.

What do you suggest happens when someone plugs a Nintendo Switch or a 3060/6070 into a 4K TV, if not using some upscaling at some point? What would have higher image quality than FSR2/DLSS2/Unreal's or another game engine's temporal upscaler?
 
ATI was crucified when they did this with Quake 3 about 21 years ago. Nvidia was also caught cheating in 3DMark a few years later. In both cases they reduced image quality to increase performance. Today we high-five AMD and Nvidia for doing the same thing with DLSS and FSR.
This is a false equivalency. Of course they were crucified for doing it, they were caught cheating to make their cards get higher FPS.

Second problem with your statement is that DLSS and FSR give far better results than the hacks they used back then. And they are improving with each iteration. Already in some games DLSS is better than Native with FXAA or TAA.

I just don't get the arguments against these types of technologies. You don't have to use them. If you don't like the results just turn it off.
 
It's 75% of a 4090 for 63% of the price. A better price per frame, but for fewer frames.

I mean, unless ray tracing is involved. In that case it's like 50% of a 4090 for 63% of the price.
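
To put those percentages in frames-per-dollar terms, here is a quick sketch (launch MSRPs of $999 and $1599 assumed; the 75% raster and ~50% ray-tracing figures are the estimates from the quote above, not measurements):

```python
# Relative value of the RX 7900 XTX vs the RTX 4090, using the rough
# performance fractions quoted above. 1.0 means the same frames per dollar.
MSRP_4090 = 1599
MSRP_7900XTX = 999                      # ~63% of the 4090's price

def frames_per_dollar_ratio(perf_fraction: float) -> float:
    """How the cheaper card's frames-per-dollar compares to the 4090's."""
    return perf_fraction / (MSRP_7900XTX / MSRP_4090)

print(f"Raster:      {frames_per_dollar_ratio(0.75):.2f}x the 4090's frames per dollar")
print(f"Ray tracing: {frames_per_dollar_ratio(0.50):.2f}x the 4090's frames per dollar")
```

So about 1.2x the value in raster, but roughly 0.8x once heavy ray tracing is involved, which matches the "better price per frame, but fewer frames" framing.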
It's not just ray-tracing; VR frame times are noticeably worse under Babel Tech Reviews benchmarking (and seemingly more microjuddery than the NVIDIA plots), to the point that even the RTX 4080 suddenly looks like good value if your goal is max settings VR gameplay on a Valve Index set to 90 Hz, never mind an even higher-res HP Reverb G2, Pimax or Varjo HMD.

(With that said, the RX 7900 XTX is still way better than running VR with an old GTX 980, speaking from personal experience.)

Content creation benchmarks also still heavily favor NVIDIA, and that's when they're using OpenCL or anything that isn't CUDA, which isn't supported outside of NVIDIA GPUs.

I'm of the opinion that once you start looking at four-figure GPU prices, the word "compromise" should not even be in the vocabulary.

That said, NVIDIA has one major compromise of their own: Linux drivers. Those looking to switch over to SteamOS or other Linux flavors may very well opt for AMD just for that reason.
 
It's not just ray-tracing; VR frame times are noticeably worse under Babel Tech Reviews benchmarking (and seemingly more microjuddery than the NVIDIA plots), to the point that even the RTX 4080 suddenly looks like good value if your goal is max settings VR gameplay on a Valve Index set to 90 Hz, never mind an even higher-res HP Reverb G2, Pimax or Varjo HMD.

(With that said, the RX 7900 XTX is still way better than running VR with an old GTX 980, speaking from personal experience.)

Content creation benchmarks also still heavily favor NVIDIA, and that's when they're using OpenCL or anything that isn't CUDA, which isn't supported outside of NVIDIA GPUs.

I'm of the opinion that once you start looking at four-figure GPU prices, the word "compromise" should not even be in the vocabulary.

That said, NVIDIA has one major compromise of their own: Linux drivers. Those looking to switch over to SteamOS or other Linux flavors may very well opt for AMD just for that reason.
I look at it as a question of whether I want the best gaming performance per dollar, i.e. in the area where I will almost exclusively stretch the card's legs.

I have a passing interest in RT, I will never use VR, and I will never do content creation. I may at some point want to dabble in Linux. So for my use case, and many others', the 7900 XTX is where it's at: pure rasterization with passable, taste-tester RT performance.
 
This is a false equivalency. Of course they were crucified for doing it, they were caught cheating to make their cards get higher FPS.

Second problem with your statement is that DLSS and FSR give far better results than the hacks they used back then. And they are improving with each iteration. Already in some games DLSS is better than Native with FXAA or TAA.

I just don't get the arguments against these types of technologies. You don't have to use them. If you don't like the results just turn it off.
I agree. I'd rather deal with the occasional fringing with DLSS or FSR than the consistent vaseline smear of TAA.
 
You misunderstand a few things here. They added it as an optional feature to DisplayPort in 1.2a because the feature already existed for VGA, DVI, and HDMI as optional parts of those specs, in some cases for decades. Variable refresh rates are not new and have been an active part of CRT specifications since the '60s; it was a selling feature of Matrox cards back in the '90s. Prior to NVidia creating their GSync program, people were already doing it using the VESA options, but the standard at the time worked in 10 Hz increments from 10-30 Hz, and on LCD displays it didn't work well: there was flashing and headaches, and it was a generally bad time. There were forums dedicated to sharing which monitors you could do it with and how to either mod them or tweak your display drivers to make it work. Nvidia didn't come up with the idea of Gsync out of the blue, but what they did do was provide a solution to a community of users who had been trying for years and mostly floundering at making it work with LCD displays.

And yes, I do think the AMD FreeSync certification program was and is bad. They let companies slap FreeSync stickers on everything regardless of whether it actually even did it; they later tried to correct this to some degree with the FreeSync Pro and Premium certification processes, but those are basically carbon copies of Nvidia's certification processes. But VESA rolled out the Adaptive Sync certification program because very few were doing the FreeSync certification work. They are also in talks with Nvidia to roll the GSync certification program into the Adaptive Sync process, to simplify their jobs and then only require display vendors to certify against one set of standards. Looking at it, they have also rolled in the Apple ProMotion and Qualcomm Q-Sync certifications as well.

I think you are the one who misunderstands a few things. They didn't add it to the DisplayPort spec because of blah, blah, blah. They added it because AMD submitted a specification change request in November 2013. It was a change to a timing parameter, amongst other things. This was their final proposal, so it's pretty obvious that this was something they had been working on for a while. The proposal was approved in early 2014, and it was added as an optional part of the 1.2a spec in May/June 2014.

Prior to adaptive sync, they weren't using anything like it on desktop monitors. What you were talking about were hacks that people used, and they were crap really. The hacks only worked on some monitors for the same reason that people thought some existing monitors might be suitable for Adaptive sync when it was released in 2015. Some monitors had suitable scalers that could be adjusted.

As for discussing the whole history of VRR and going back to DVI/VGA and CRT's, why? They have absolutely nothing to do with this discussion, they are completely different technologies.

And we know Nvidia didn't come up with Gsync out of the blue. Neither did AMD. They both 'stole' the idea from the panel self-refresh feature in the Embedded DisplayPort standard. That's basically how the Gsync module works, how Freesync/Adaptive Sync works, and how Gsync Compatible works.

Nvidia released Gsync first, but, as I said in a previous post, AMD were definitely working on Adaptive Sync and Freesync long before Nvidia did their demo. They had the hardware needed for Freesync in GPUs they released before the demo. In this instance I don't believe that AMD followed Nvidia, they were just slower to release as they had to go through the open source route. Who came up with the idea first? That we will never know.

The Freesync certification was the way it was because they needed to get monitor manufacturers on board with the minimum of hassle. When the benefits of Adaptive Sync became clear, AMD became more stringent and added the Freesync 2 branding in 2017. This morphed into Freesync Premium and Freesync Pro. And you say these are basically a copy of the Gsync certification program. And you also say that this new certification exists because AMD's Freesync certification is so bad. So which is it? You say Nvidia's certification program is being used in this new VESA certification program, and AMD's current certification is a carbon copy of it but is useless?

Because what you are saying is nonsense. They aren't releasing this because the Freesync certification is bad; they are about 5 years too late for that. They are releasing it for exactly the reason they stated: to avoid confusion and to have a fair and unbiased testing method. And were VESA in talks with Nvidia about setting this up? Of course they were. You seem to be a little confused about what VESA is and how it works. The board of VESA is made up of people from Apple, Nvidia, LG, AMD, Qualcomm, Intel and HP. Hundreds of companies are members. They make changes through discussions with their members. Dozens of companies were involved in setting up this new certification process. So, yes, there were discussions with Nvidia, but there were discussions with Apple, Qualcomm, AMD, LG, etc. as well.
 
I look at it as a question of whether I want the best gaming performance per dollar, i.e. in the area where I will almost exclusively stretch the card's legs.

I have a passing interest in RT, I will never use VR, and I will never do content creation. I may at some point want to dabble in Linux. So for my use case, and many others', the 7900 XTX is where it's at: pure rasterization with passable, taste-tester RT performance.
You know your needs and are buying the product that best meets those needs, as all people should.

I'm much after the same thing - the most frames per dollar spent - but because I'm prioritizing VR performance above all else, I'm hoping I didn't just make a costly mistake with the RX 7900 XTX.

But even if I did, Micro Center's got a good return policy. I've got a month to put this thing through its paces, and first impressions are that it's certainly not a bad VR card in practice (just not the best value unless AMD somehow pulls off a miracle with later drivers) and hilariously overkill for max settings 1080p120 gaming.
 
You know your needs and are buying the product that best meets those needs, as all people should.

I'm much after the same thing - the most frames per dollar spent - but because I'm prioritizing VR performance above all else, I'm hoping I didn't just make a costly mistake with the RX 7900 XTX.

But even if I did, Micro Center's got a good return policy. I've got a month to put this thing through its paces, and first impressions are that it's certainly not a bad VR card in practice (just not the best value unless AMD somehow pulls off a miracle with later drivers) and hilariously overkill for max settings 1080p120 gaming.
I'm quite interested in hearing about your VR experience with the 7900 as I'm considering it for the same reason. My 6900 is just a bit shy of the VR performance I need to max out my headset's resolution plus some extra SS. The 6900 drivers took a while to get to where they weren't bad for VR, but SteamVR was partially at fault there too. I'll be watching this closely.
 
A quick look on Amazon shows a lot of scalping going on. I've seen 7900 XT's going for $1250, and someone was kind enough to write a review that says scalping. XTX's are going for $1700 on Amazon thanks to scalping. Because charging more than an Nvidia 4090 is a smart move. Ebay is where these cards seem to have ended up, as I'm seeing a lot of overpriced listings. The XT's are usually $250 to $300 more than MSRP, but I've seen XTX's for sale at over $2k. You can still find 7900 XT's that haven't been fully scalped, like on Newegg. Probably due to the terrible value the 7900 XT's offer.

Seems the scalpers have gone full retard and just bought as many cards as they could. Probably due to Christmas being 9 days away and hoping to catch some last-minute shoppers. Pricing of these cards on Ebay after New Year's is going to be very interesting.
 
Probably due to the terrible value the 7900 XT's offer.
Apparently they are the vast majority of the initial stock as well, which is somewhat surprising. It is quite refreshing to see an on-the-nose MSRP version of a new card that you can actually purchase on Newegg.com; good on AMD there, and the talk of massive initial stock seems true (for the little we can say; it's hard to know without seeing sales numbers). And this could be where the giant stock of high-end RDNA 2 hurts a bit: without the $780 6950 XT with 2 free games existing, that $900 7900 XT would look better. At least there is zero Ampere competition for them.

Was there ever an MSRP ($1200) 4080? They do not seem to exist anymore; even the Zotac cards are more than $1300.
 
I think you are the one who misunderstands a few things. They didn't add it to the DisplayPort spec because of blah, blah, blah. They added it because AMD submitted a specification change request in November 2013. It was a change to a timing parameter, amongst other things. This was their final proposal, so it's pretty obvious that this was something they had been working on for a while. The proposal was approved in early 2014, and it was added as an optional part of the 1.2a spec in May/June 2014.

Prior to adaptive sync, they weren't using anything like it on desktop monitors. What you were talking about were hacks that people used, and they were crap really. The hacks only worked on some monitors for the same reason that people thought some existing monitors might be suitable for Adaptive sync when it was released in 2015. Some monitors had suitable scalers that could be adjusted.

As for discussing the whole history of VRR and going back to DVI/VGA and CRT's, why? They have absolutely nothing to do with this discussion, they are completely different technologies.

And we know Nvidia didn't come up with Gsync out of the blue. Neither did AMD. They both 'stole' the idea from the panel self-refresh feature in the Embedded DisplayPort standard. That's basically how the Gsync module works, how Freesync/Adaptive Sync works, and how Gsync Compatible works.

Nvidia released Gsync first, but, as I said in a previous post, AMD were definitely working on Adaptive Sync and Freesync long before Nvidia did their demo. They had the hardware needed for Freesync in GPUs they released before the demo. In this instance I don't believe that AMD followed Nvidia, they were just slower to release as they had to go through the open source route. Who came up with the idea first? That we will never know.

The Freesync certification was the way it was because they needed to get monitor manufacturers on board with the minimum of hassle. When the benefits of Adaptive Sync became clear, AMD became more stringent and added the Freesync 2 branding in 2017. This morphed into Freesync Premium and Freesync Pro. And you say these are basically a copy of the Gsync certification program. And you also say that this new certification exists because AMD's Freesync certification is so bad. So which is it? You say Nvidia's certification program is being used in this new VESA certification program, and AMD's current certification is a carbon copy of it but is useless?

Because what you are saying is nonsense. They aren't releasing this because the Freesync certification is bad; they are about 5 years too late for that. They are releasing it for exactly the reason they stated: to avoid confusion and to have a fair and unbiased testing method. And were VESA in talks with Nvidia about setting this up? Of course they were. You seem to be a little confused about what VESA is and how it works. The board of VESA is made up of people from Apple, Nvidia, LG, AMD, Qualcomm, Intel and HP. Hundreds of companies are members. They make changes through discussions with their members. Dozens of companies were involved in setting up this new certification process. So, yes, there were discussions with Nvidia, but there were discussions with Apple, Qualcomm, AMD, LG, etc. as well.
AMD didn't submit anything; it was already there. VESA just added optional parts of their other specifications into the optional parts of their new specifications.
Prior to Adaptive Sync was the '90s; adaptive sync was an optional part of the VESA standards from day 1. They called it different things, but its functionality remained the same.
Matrox was the first to really use the adaptive sync stuff, and in both the ATI and Nvidia forums there were extensive posts on how to replicate the feature on the cards available at the time. Success mostly came down to the panels; I had good luck with BenQ displays at the time by modifying VSync values so that when VSync was enabled the system would change those values on the fly, which is how Nvidia ultimately implemented their version of Gsync.
Adaptive Sync is a thing; Freesync is AMD's branding of Adaptive Sync. Freesync is a logo, not a technology.
In AMD's quest to get as many display manufacturers on board, they gave them free rein and trusted them to do work they didn't do. Many display manufacturers slapped a FreeSync-branded logo on the box of any display with a DP 1.2a or 1.3 connector and called it done, regardless of the actual panel's ability to do the job. This is why Freesync was plagued with people complaining of flashing displays, stuttering, tearing, and all sorts of visual glitches. As you say, AMD fixed this with FreeSync 2, later renamed FreeSync Premium Pro, but that was more to do with the fact that any display capable of passing the tests to verify 120 Hz and HDR required a display driver that was more than capable of handling the adaptive sync specifications in their entirety. It had nothing to do with AMD's enforcement, or lack thereof, of the Freesync branding.
VESA has a few reasons for wanting to supplant the FreeSync and Gsync certification processes, the biggest being that getting GSync and Premium Pro certified is expensive for display manufacturers, and as VESA technically works for the display manufacturers, they want a way to cut costs.
Secondly, it works to remove FreeSync from the market, which for much of Asia is a laughingstock because it is on just about any display made after 2013, regardless of its ability to do the job.
 
I'm quite interested in hearing about your VR experience with the 7900 as I'm considering it for the same reason. My 6900 is just a bit shy of the VR performance I need to max out my headset's resolution plus some extra SS. The 6900 drivers took a while to get to where they weren't bad for VR, but SteamVR was partially at fault there too. I'll be watching this closely.
We might not have the same games and I don't have any setup resembling a standard VR benchmark, but if you could name the games you're looking for results in, I'll go out of my way to test them.

Right now, I'm aiming for DCS World and No Man's Sky performance to start with since they're amongst the most demanding and least optimized titles this side of MSFS 2020, but I have many other titles I could test.

Also worth noting: stuff like DCS and MSFS2020 hits the CPU especially hard. Switching from my 12700K 5.1P/4.0E all-core OC to something else, especially the 5800X3D, might alter the results considerably, and I'm not sure what CPU you have.
 
I'm quite interested in hearing about your VR experience with the 7900 as I'm considering it for the same reason. My 6900 is just a bit shy of the VR performance I need to max out my headset's resolution plus some extra SS. The 6900 drivers took a while to get to where they weren't bad for VR, but SteamVR was partially at fault there too. I'll be watching this closely.
What specific things would need to be benchmarked for VR that aren't demonstrated with standard 4K benchmarks (the ones that include frame time analysis or 1% lows, anyway)? Is timewarp / async reprojection handled by the VR headset, or is that a GPU thing?
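
(For clarity, by "1% lows" I mean the usual statistic computed from the frame-time log; here is a minimal sketch of one common way it's computed, using made-up frame times:)

```python
# One common way to compute "1% lows": average the slowest 1% of frame times
# and convert back to fps. The frame times below are invented for illustration.
frame_times_ms = [11.1] * 990 + [14.0] * 8 + [22.0, 25.0]   # hypothetical 1000-frame capture

def one_percent_low_fps(frame_times):
    worst = sorted(frame_times, reverse=True)       # slowest frames first
    count = max(1, len(worst) // 100)               # the worst 1% of frames
    avg_worst_ms = sum(worst[:count]) / count
    return 1000.0 / avg_worst_ms

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
print(f"Average fps: {avg_fps:.1f}")
print(f"1% low fps:  {one_percent_low_fps(frame_times_ms):.1f}")
```

Some tools use the 99th-percentile frame time instead, but the idea is the same: capture the stutters that an average hides.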
 
I mostly play IL-2 Sturmovik and FS2020; my sig is up to date on current hardware. I just don't play many "modern titles".
 
I was able to score a 7900 XTX off AMD's site yesterday morning. How long does it take them to ship it?
 
I was able to score a 7900 XTX off AMD's site yesterday morning. How long does it take them to ship it?
It's probably slightly different from regularly in-stock cards (i.e. stable volume flowing in, not freshly launched products), but my experience ordering an RX 6000 series card directly was that it shipped within a day or two, usually just at ground speed unless you paid for faster.
 
Was there ever an MSRP ($1200) 4080? They do not seem to exist anymore; even the Zotac cards are more than $1300.
From the looks of it, the RTX 4080s and 4090s are being scalped as well. It seems to be graphics cards in general.
 
It's probably slightly different from regularly in-stock cards (i.e. stable volume flowing in, not freshly launched products), but my experience ordering an RX 6000 series card directly was that it shipped within a day or two, usually just at ground speed unless you paid for faster.

Funny, I just got the shipping notification after you posted this. Should have it next Wednesday via FedEx.
 
A quick look on Amazon shows a lot of scalping going on. I've seen 7900 XT's going for $1250, and someone was kind enough to write a review that says scalping. XTX's are going for $1700 on Amazon thanks to scalping. Because charging more than an Nvidia 4090 is a smart move. Ebay is where these cards seem to have ended up, as I'm seeing a lot of overpriced listings. The XT's are usually $250 to $300 more than MSRP, but I've seen XTX's for sale at over $2k. You can still find 7900 XT's that haven't been fully scalped, like on Newegg. Probably due to the terrible value the 7900 XT's offer.

Seems the scalpers have gone full retard and just bought as many cards as they could. Probably due to Christmas being 9 days away and hoping to catch some last-minute shoppers. Pricing of these cards on Ebay after New Year's is going to be very interesting.
I am very curious about pricing after Xmas and New Year's.
 
It's probably slightly different from regularly in-stock cards (i.e. stable volume flowing in, not freshly launched products), but my experience ordering an RX 6000 series card directly was that it shipped within a day or two, usually just at ground speed unless you paid for faster.
The 6700 XT I got off AMD's site took over a week to start shipping. But that was when supply was really bad. It's my understanding that supply is good right now for the new 7000 cards.
 
The 6700 XT I got off AMD's site took over a week to start shipping. But that was when supply was really bad. It's my understanding that supply is good right now for the new 7000 cards.
I remember ordering a Radeon VII on amd.com; it took about 3-5 days, if I remember correctly.
 
Supply seems good. I was able to score the 7900 XTX on Thursday. I've been following the Falcodrin Discord for stock alerts, and it looks like AMD added stock around 10 AM EST the past two days. Otherwise they have been selling out in seconds.

I see 7900XTs popping up on Amazon all the time, but the delivery dates are after Christmas or early January.
 
Nvidia know how to run a business. AMD, not so much.

Could that possibly be because AMD was almost bankrupt several times in the last decade and dedicated the bulk of its resources to developing a new line of CPUs, while the GPU division for the most part had to make do with whatever it could cobble together? I mean, wasn't Vega essentially a compute GPU pulling double duty as a stopgap in the gaming space? That's certainly what I've seen Vega referred to as in the past, whereas Nvidia had no such monetary issues to deal with.

AMD still put out some decent cards in that time period. The Fury X, while not setting the world on fire, was a solid GPU (even though technically there weren't that many of them produced versus the Nvidia counterparts). It's just a shame for AMD that Nvidia had the 980 Ti waiting in the wings to piss on their thunder.
 
The 7900 XTX is 35-40% faster than the 6950 XT, depending on the review. If you go balls out, you might get that +50%. Just hope electricity is cheap where you are.

The 7900 XTX vs the 4080 seems a lot like the HD 7970 vs the GTX 680. AMD will likely have some fine wine improvements. Also, for those holding onto their 16GB DDR4 systems a bit longer, having the extra VRAM for SAM will likely be beneficial. This is even more true for the 7900 XT vs the 4070 Ti.
 
The 7900 XTX is 35-40% faster than the 6950 XT, depending on the review. If you go balls out, you might get that +50%. Just hope electricity is cheap where you are.

The 7900 XTX vs the 4080 seems a lot like the HD 7970 vs the GTX 680. AMD will likely have some fine wine improvements. Also, for those holding onto their 16GB DDR4 systems a bit longer, having the extra VRAM for SAM will likely be beneficial. This is even more true for the 7900 XT vs the 4070 Ti.
LOL, some people will never get over the fine wine fraud. This very website (Hard|OCP) debunked that garbage. Idiots believed that stuff and new ones still do.
 
AMD and Nvidia both get performance improvements over time. Day 1 vs day 365 for either team shows things get better. But very rarely do the numbers skew much; you might see one improve 12 vs 15 percent, but I'm not counting on the 15-25% some on Reddit are speculating for the 7900 cards. A 7950 series bringing that, sure, I'd believe that, but drivers alone… AMD's driver team leaves something to be desired, but they aren't incompetent.
 
Perhaps not from a purely gaming perspective, but the AMD cards seem to be more desirable years later for a myriad of reasons: 7970 vs 680, RX 580 vs GTX 1060, even RX 5700 vs RTX 2060.
 
Fine wine doesn't exist... in the year both Nvidia and AMD completely rewrote their DX code and increased performance across the board. lol

It's not that "fine wine" as a concept hasn't happened in the past; it's not fake or made up. It's just not something anyone should bank on. What your card can do today is all it's worth. Every card will get fixes and little gains here and there... that is true for AMD, Nvidia and Intel (even Intel iGPUs get updates with performance gains). Just don't bank on it... some generations release more fully baked. Some get leapfrogged fast (such as Vega), and updates stop really benefiting those archs.

Fine wine is real, but unreliable. Which is why it shouldn't be a serious consideration when purchasing.
 
Perhaps not from a purely gaming perspective, but the AMD cards seem to be more desirable years later for a myriad of reasons: 7970 vs 680, RX 580 vs GTX 1060, even RX 5700 vs RTX 2060.
This reminds me of this revisit. Was pretty damn surprised that Vega 64 ended up being decent.

 