Radeon RX 5600 XT Launch Review Round Up

Except your RTX argument makes no sense whatsoever. First of all, if ray tracing is ever absolutely necessary to run a game, it's going to be quite a few years down the line before that happens.
If history repeats itself, it'll start when the new consoles launch. It'll be a few years before most games utilize Ray-Tracing, but some will at launch. And some developers will be lazy and implement a Ray-Tracing-only version of their games.

I'll say it right now: attempting to "future proof" your system with first-generation ray tracing is going to be an exercise in futility. We've already seen that even the top-of-the-line RTX card gets hammered when ray tracing is turned on, and that's with the relatively few effects current games have. It seems like half the games featuring ray tracing have dialed the effects back considerably since release just to get a somewhat playable framerate. Performance simply isn't going to get any better in future games featuring ray tracing on the current hardware.
Considering an RTX 2060 is only $20 more, I would rather get that over the 5600 XT. You aren't losing much other than $20. It'll take a while before the RTX 2060's drop to $300 takes full effect, so if you can't find one for $300, just wait a bit. Nobody knows how ray tracing will even be done by AMD and the 9th-generation consoles, but it'll be done and eventually required in games.
 
80W is a concern if you play games 24/7, but most people don't. Idle power is more of a concern and the Vega 56 is fine there.

It's not a difference in terms of power costs, I agree; it's more that the 5600 XT is better suited to smaller builds, or builds with less airflow or less sound dampening, and so on. In a large enough case with enough airflow, the only real delta is the extra 80W dumped into the room, which could be a deciding factor for some but is more worth noting than holding as a no-go for the Vega part.

I believe that some developers will get lazy and make some games that require Ray-Tracing because they don't want to maintain a version without it. So some Ray-Tracing hardware is better than no Ray-Tracing hardware.

I do feel that this is quite possible. I balance it with the knowledge that most, if not all, of the engines developers will be using will have raster paths available for implementation, but as you say, they could also just be too lazy.

Sony and Microsoft will push for the feature, as will Nvidia. It'll be a good 2-3 years before most games utilize RT, but some will when the PS5 and Xbox Series X launch.

Any real response I'd have here would center around what is meant by 'most', so in general I agree: industry-wide adoption, to the point that the majority of games shipped also ship with RT, is still a few years out. I'm betting that more than a few of the games that ship with the upcoming consoles will be raster-only, just due to the sheer scale of effort required to implement even sparing RT on such limited hardware.
 
We are already seeing counterarguments to your claim.

Developers are getting better at using ray tracing effects with less impact, and DLSS has shown multiple jumps in improvement.

Far from crushing a high-end card: they can run Wolfenstein: Youngblood with RT and DLSS on at the same FPS as with them off.

Here they run RT on with DLSS: a 2060 with RT effects runs about as fast as a 5600 XT with no ray tracing.


He concludes that you would be nuts to play without RT, and that's on the lowest-end RT card in existence. There really is negligible downside to turning it on with the current version of DLSS and RT in this game.

Ultimately, RT hardware is a bonus here. It absolutely will let you play some games with RT effects on; in others maybe it won't be practical. But some is better than NONE.

The argument that this bonus isn't worth the price carries some weight when we are talking about the $100 difference between the 5700 XT and the 2070 Super.

But that argument really doesn't carry weight when the difference in price drops to $20.

Twenty bucks more for ray tracing, better video encoding, and more stable drivers is pretty much the sensible and obvious choice.

AMD did well with the pricing on the 5700 cards, but the 5500/5600 cards really don't deliver any kind of win. The 5500/5600 cards are AMD-fan-only cards.

To be clear, RT is basically a 50% performance hit in Youngblood. Relatively speaking, that's not a general improvement over the hit we've seen in other games so far. Aside from Youngblood's Vulkan implementation being very high performance (so that 50% is still acceptable), DLSS is carrying all of the weight here. And with how long it took Nvidia to get it working well, I'd say they lucked out. However, while DLSS in Youngblood is quite good, it does have some visual issues when you are actually playing it (which is tough to show on YouTube or in screenshots): some haloing and other noise, typical of upscaling of any kind. DLSS could still get even better, indeed, but there tends to be a wall on upscaling/image-reconstruction techniques.
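As a rough back-of-the-envelope illustration of why DLSS is doing the heavy lifting when RT costs ~50% (every number below is a made-up example, not a benchmark):

# Illustrative frame-time math; all figures are hypothetical examples.
native_fps = 120.0                          # RT off, native resolution
rt_fps = native_fps * 0.5                   # ~50% hit with RT on -> 60 FPS
rt_frame_ms = 1000.0 / rt_fps               # ~16.7 ms per frame

pixel_fraction = 0.5                        # DLSS renders roughly half the pixels internally
reconstruct_ms = 1.5                        # assumed fixed cost of the reconstruction pass

dlss_frame_ms = rt_frame_ms * pixel_fraction + reconstruct_ms
print(f"RT on + DLSS: {1000.0 / dlss_frame_ms:.0f} FPS vs {rt_fps:.0f} FPS without DLSS")
# -> roughly 100 FPS vs 60 FPS: most of the RT cost is hidden by shading fewer pixels.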
 
DLSS is a crutch in this case. It's the ONLY thing allowing RT to be remotely playable, and you are sacrificing image quality to make it happen. DLSS is better than it was at launch, but there is still a quality loss when using it. Not only that, but DLSS only works in the handful of games that support it; not every RT game right now supports DLSS. The crutch only works if the game supports it. And with how all over the map DLSS quality is, I don't think it's a feature worth relying on right now.

Also, do you really think RT effects are going to remain as they are now? In the hypothetical scenario where games require it, those effects will be a lot better and more performance-intensive than what we have now, meaning even with the DLSS crutch it still might not be viable on the 2060.

Buying 1st gen hardware for a next gen feature is an exercise in foolishness. NO ONE should be buying any current RTX card for their RT capabilities, especially for the future use of RT.
 

DLSS is heading in the right direction. I think it will be a difference maker down the line, especially for mid range cards looking for higher fps at higher rez. I play WolfYB with a 2070 and it's really nice at 4k.

Never expected much from RT or DLSS, certainly not something I based my purchase decision on. But it's been a really fun ride so far playing thru various games and implementations.

Ent[H]usiast
 
..Buying 1st gen hardware for a next gen feature is an exercise in foolishness. NO ONE should be buying any current RTX card for their RT capabilities, especially for the future use of RT.

I don't think anyone disagrees that future cards will be better at raytracing/DLSS/etc than current cards.

But saying NO ONE should be buying ray tracing (current cards) for ray tracing is itself foolishness.

Even low-end, affordable ray tracing cards like the 2060 are giving gamers enjoyment right now in current games.

Future-proof? Nothing is future-proof, and video cards used to be obsolete at 6 months old. Now we can go 2 years or longer (look at the 1080 Ti), but they also cost more. Cost over time has probably gone up, but it's not as much of an increase as the sticker-shock prices make it feel, since we pay more but replace our cards less often. (Well, maybe it is in the 2080 Ti's case, with a 50% price jump :) )
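To put the "pay more, replace less often" point in numbers, here's a quick cost-per-year sketch; the prices and upgrade intervals are hypothetical examples, not a claim about any particular buyer:

# Hypothetical ownership-cost comparison; prices and lifespans are examples only.
old_habit = {"price": 400, "years": 1.5}    # cheaper card, replaced more often
new_habit = {"price": 700, "years": 3.0}    # pricier card (1080 Ti-style), kept longer

for name, c in (("old habit", old_habit), ("new habit", new_habit)):
    print(f"{name}: ${c['price'] / c['years']:.0f} per year of use")
# The pricier card can actually cost less per year if it really is kept longer.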
 

I disagree. DLSS works well in, what, two games? Deliver Us the Moon and Wolf: YB. Outside of that, DLSS either plain doesn't support 1080p or is a massive detriment to visual quality. So anyone buying a 2060 for RT can play two games with RT effects without royally fucking up their visual experience (the whole point of enabling RT in the first place). Woo, what a great reason to buy a $300 video card. Maybe that number will double in the next 11 months or so. The 2060 is a good card for its raster performance (and video production, in the case of the KO), but I would never recommend people buy it over another card just because it can do RT, and then only when it can lean on a crutch that may or may not get added.

Speaking as someone that has owned a $1200 2080 ti for almost a year: I wouldn't recommend buying any of the RTX cards for their RT capabilities. There are certainly other reasons to buy them, but RT is not one of them.

Edit: And I have been saying the same thing about RT since Nvidia announced the RTX lineup. Nothing in that time has convinced me otherwise.
 
You guys haven't been visited by the angel of ray tracing yet. He should be here shortly to explain to you why you're wrong

Well, he is right -- and wrong.

Mostly, like any new feature (and all of the TVs at Best Buy), 'bright' gets people's attention.

Where ray tracing isn't providing a more accurate lighting solution, the developers are 'wrong'.

That doesn't mean that ray tracing can't be used properly, but it does mean that developers aren't great at it yet.
 
Got tired of trying to find a 5600 XT in Canada, so I splurged on an XFX 5700 for ~$40 CAD more. Figure I'll be able to recoup a good chunk of it selling off my 470 and 580, maybe...
You made the right choice, it's a great card. I am rocking a reference card and have been very happy at 1440p 144Hz.
 
The irony of a mostly panned game like Wolfenstein: Youngblood being a good example of RTX/DLSS performance.
 
You made the right choice, it's a great card. I am rocking a reference card and have been very happy at 1440p 144Hz.
Just got it, the XFX DD Ultra version. Checking [H], hitting "nature," and then I'm gonna plop it in as-is for now. I want to transfer my G12/H60 to it but have seen mixed reviews of that combo...
My first current-gen GPU since the GeForce 2 series :)

Edit: works great, and is a massive bump over my 580, but now I need to figure out what to do with the glaring bright blue power LEDs. The case is black with red lighting, and now these brighter-than-everything-else blues.


It might be a little overkill for what I do and play, but I really couldn't find a 5600 XT anywhere, local or online. Just to kinda stay on topic...
 
DLSS is heading in the right direction. I think it will be a difference maker down the line, especially for mid range cards looking for higher fps at higher rez. I play WolfYB with a 2070 and it's really nice at 4k.

Never expected much from RT or DLSS, certainly not something I based my purchase decision on. But it's been a really fun ride so far playing thru various games and implementations.

Ent[H]usiast
Personally, I think DLSS is just lowering image quality settings with fewer steps. The RTX cards seem to have a bandwidth issue, because if DLSS makes RT work better, then maybe Ray-Tracing hardware needs something like HBM memory to work properly. I personally think there are alternatives to Ray-Tracing other than DXR. I like Crytek's Neon Noir demo, as it shows that Ray-Tracing has room for improvement, and Nvidia's RTX can be utilized to boost its performance. DLSS is just a bandaid that Nvidia is utilizing to make RTX work better, and not something we should look forward to.
 

More info here so as to not clog up this thread.

More DLSS...
 
Personally, I think DLSS is just lowering image quality settings with fewer steps

It's an attempt at using machine-learning to upsample. It may or may not be a good idea in the long run, but the premise is solid.

The RTX cards seem to have a bandwidth issue, because if DLSS makes RT work better, then maybe Ray-Tracing hardware needs something like HBM memory to work properly.

I don't see how memory bandwidth is an issue at all. Does overclocking the memory by itself provide substantial performance improvements?

I personally think there are alternatives to Ray-Tracing other than DXR.

You could use a rendering cluster...

I like Crytek's Neon Noir demo, as it shows that Ray-Tracing has room for improvement

It shows that if you half-ass it enough, you can do any rendering in software ;)

and Nvidia's RTX can be utilized to boost its performance.

...which will be done using DXR.

DLSS is just a bandaid that Nvidia is utilizing to make RTX work better, and not something we should look forward to.

More efficient upsampling, however it's done, is absolutely something we should look forward to. DLSS is just one stab at trying to do it smartly.
 
DLSS doesn't make RT work better. DLSS actually has nothing to do with ray tracing and does not need ray tracing enabled in order to use it.

You run the game internally at a lower resolution (so the game is easier to run), and then an algorithm based on an AI research model upscales the image to a higher resolution. So you might actually be running the game at 900p, but it's upscaled to 1440p (not necessarily actual resolution numbers from any game; I just said this as a quick example).

The "SS" in DLSS stands for Super Sampling, so it's not just upscaling; it's actually reconstructing the image based on higher-resolution information learned by the AI research model. So DLSS can actually have better-resolved details in the distance or on really small stuff, such as fences or wires, even compared to native resolution, because that information is super sampled back into the image. But there are still some artifacts, such as sharpness issues and haloing. And while retention of resolution during motion is way better than, say, DLSS in Control (which isn't true DLSS), it's still got some issues.

However, Youngblood is the only game I have confirmed is using an algorithm based on the AI research model. Deliver Us The Moon might be as well; not sure. But before those two games, other implementations of DLSS were conventional upscaling algorithms, which were tweaked based on some things Nvidia learned from the AI research model.
 

I think it's the other way around.

Previous to the game "Control", DLSS, was using a Deep Learning technique. It had pretty big limitations around where it would work and at what frame rates. Example, you couldn't use it to achieve much higher frame rates, because the neural network also took a specfic amount of time to run, which became a bigger and bigger part of frame time if you tried to use it at lower resolutions to boost frame rates. You could see proof they were using a neural net approach at least for Metro Exodus, because NVidia published a frame activity graph, show a big Tensor core burst at the end of the frame with DLSS enabled.

Starting with "Control". That changed completely, and it was revealed that NVidia was using what they learned from an advanced Neural Network (Too big to run realtime on consumer cards), to create a traditional Cuda algorithm. This blew the doors of the neural network approach in previous games. But it doesn't use Tensor cores, and NVidia still misleadingly calls it DLSS.

Since then there have been better implementation of "DLSS", that look and behave like the Control version, with none of the limitations of the previous actual Neural Net approach running on Tensor cores. I strongly suspect these are Cuda Algorithms, not Tensor cores running Neural Nets.
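Some quick numbers for the frame-time point above (the 2.0 ms pass cost is an assumed figure, purely for illustration): a fixed-cost network pass is cheap at low frame rates but eats a huge share of the frame once you chase high FPS at lower resolutions.

# Why a fixed per-frame neural-net pass limits DLSS at high frame rates.
# The 2.0 ms pass time is a hypothetical figure, not a measured one.
nn_pass_ms = 2.0

for base_fps in (30, 60, 120, 240):
    render_ms = 1000.0 / base_fps
    total_ms = render_ms + nn_pass_ms
    share = nn_pass_ms / total_ms
    print(f"{base_fps:>3} FPS render -> {1000.0 / total_ms:5.1f} FPS, "
          f"pass is {share:.0%} of the frame")
# At 30 FPS the pass is ~6% of the frame; at 240 FPS it is roughly a third.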
 
Do you have any documentation for this? I was not aware of it and would like to read up on the changes, as I am sure others would.
 
I found this post by Nvidia.

https://www.nvidia.com/en-us/geforce/news/dlss-control-and-beyond/

I don't understand it, but some people interpreted it to mean that DLSS doesn't use Tensor cores in Control.


Yes, that is the post where they describe the switch to a hand-crafted image-processing algorithm.

Before that, the behavior was very different, and you can see evidence in the Metro Exodus frame times that DLSS was using Tensor cores, so I expect all the old implementations of DLSS are like this.

So we have three timeframes and behaviors.

1: Before Control, DLSS was running on Tensor cores and was very limited.
2: Control, where they changed to an image-processing algorithm. That seemed better/faster, and with fewer limitations.
3: After Control: this is the speculative part.

IMO, #3 (after Control) looks MUCH more like what is happening in Control than like anything happening in the previous Tensor-core-based DLSS.
 
I always understood that the way DLSS worked was that images from the game had to be pre-analyzed by supercomputers with AI/deep learning, which came up with the algorithms on a per-game basis, and that is what was used by the DLSS (cores?) in the RTX cards to do faster AA. Whether that 'algorithm' was baked into the game or the video drivers, I don't know. But that is how I believe DLSS has always been meant to work. Pretty sure it was described in a review, probably here on [H], on how that was supposed to work.

Over time, how they pre-process the images seems to be improving, which is what that Nvidia news post was about, i.e. "hey, this is our best DLSS support in a game yet," not that this was the first time something had been done a certain way. And it is a process I would expect to get better as they get more experience doing it. It still seems to sacrifice some visual quality, but I do think it is a very clever and new way to think about AA. Definitely an outside-the-box solution. Hopefully it gets better and can be done in a way that doesn't cost any image quality.

Note: the image-quality observations are of course my opinion from looking at the still images in the news post. I haven't played Control yet, so the in-game experience could very well be excellent. The news post mentioned that DLSS is much better at handling in-motion/fast-moving objects. Something we would need to play the game first-hand to properly experience. If Control is on Steam now (I think it should be), I will give it a go.
 

Control is not on Steam yet, but Wolf YB is and has a great DLSS implementation.

More DLSS...
 
Actually, motion is still one of the weak points in Control's DLSS. And it is even kind of exacerbated by the already noisy nature of Control's VFX. Youngblood's DLSS seems to have improved motion handling a lot.
 

Control was a fun game and quite original.
Very linear yet open-world; they did something very hard to pull off. Anyway, DLSS worked well for me on my overpriced 2080 Ti.
 
They said the 14Gbps BIOS is on hold as it needs more engineering validation, so the 12Gbps article was published this morning. So, as of now, XFX is officially at 12Gbps, but that may change.


XFX has stated that it is very likely it will increase the memory to 14GHz in a newer BIOS, but as it stands right now 12GHz is the official specification.

Therefore, in order to get the most from your video card, you will want to manually overclock

it seems that memory overclocking affected performance to a greater degree. This GPU was craving more memory bandwidth, and the overclock gave it just that

https://www.thefpsreview.com/2020/01/29/xfx-radeon-rx-5600-xt-overclocking-review/14/
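For context on why that memory overclock helps so much: the 5600 XT's GDDR6 sits on a 192-bit bus, so peak bandwidth scales directly with the per-pin data rate, and the jump from 12Gbps to 14Gbps is a straight calculation.

# Peak memory bandwidth on the RX 5600 XT's 192-bit GDDR6 bus.
bus_width_bits = 192

for gbps_per_pin in (12, 14):
    bandwidth_gb_s = gbps_per_pin * bus_width_bits / 8    # Gbit/s per pin * pins / 8 = GB/s
    print(f"{gbps_per_pin} Gbps x {bus_width_bits}-bit = {bandwidth_gb_s:.0f} GB/s")
# 12 Gbps -> 288 GB/s, 14 Gbps -> 336 GB/s: roughly a 17% bandwidth uplift.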
 

Via Overclock3D:

https://www.overclock3d.net/news/gp...xt_failure_rates_-_it_s_running_within_spec/1
PowerColor confirms that the memory in the 5600 XT is rated for 14Gbps, but not all boards have the circuitry/quality/setup to handle it.


Further clarification to PC Gamer:

https://www.pcgamer.com/amp/powerco...emory-speeds-differ-on-the-radeon-rx-5600-xt/
"Our Stock model 5600 XT follows AMD's reference specs but we share [the same] memory modules as Red Devil and Red Dragon. You would ask why we don't run at 14Gbps then—the reason is Red Devil and Red Dragon have a higher PCB layer count, 10 versus 8 found on the stock 5600 XT, which means the memory signal is much cleaner and stable with our premium models. Having a 10 layer PCB is important to the higher clocks on the GDDR6 14Gbps memory," PowerColor said.

In other words, PowerColor's stock 5600 XT is in fact using memory chips validated to run at 14Gbps, the same as found on its Red Devil and Red Dragon variants, but still might not be able to run at that speed because of the lower cost and lower quality printed circuit board (PCB) design.
 

This just makes it seem like even more of a mess. So not every AIB got an equal memory allocation? Some got special treatment and were given all 14Gbps memory while others got a mix?
 

I read it like PowerColor is calling MSI a liar.
 
The embargo has been lifted on the Radeon RX 5600 XT, AMD's so-called "Ultimate" 1080p gaming card. There are no reference cards this time around, so you'll see offerings from a variety of manufacturers around the web today. I'll keep adding them here as I see them.

This has been getting a bit more interesting with the whole performance-upgrade-at-launch thing

The FPS Review - XFX

The FPS Review (BIOS Update) - XFX

The FPS Review (Overclocking) - XFX

Guru3d - Sapphire

Guru3d - Gigabyte

Guru3d - Asus

Guru3d - MSI

PCPer - Sapphire

AnandTech - Sapphire

Tom's Hardware - Sapphire

Kitguru - Sapphire

Phoronix - Sapphire

ThinkComputers - MSI

ExtremeTech - Sapphire

PCGamer - Sapphire

TechSpot - MSI

TechPowerUp - Sapphire

Bluesnews - Roundup

PCGamesN - Sapphire

Hexus - Sapphire

VideoCardz - Roundup

The 5600 XT compared to a graphics card released 3 years ago (an RX 580 4GB, reportedly taken from Schro's son's PC):

https://www.thefpsreview.com/2020/02/03/upgrading-radeon-rx-580-to-radeon-rx-5600-xt/
 