AMD Radeon RX 5700 XT and RX 5700 Review Roundup


Well, I mean you're cherry-picking the titles the 2060 excels at. Averaging out the differences between game engines and how they perform on certain architectures across a broader range of games, you get something like this (and this is against the 2070 Super). Hence why I think the 5700 XT is a better buy at $400.
[image: 5700XT_1.png — averaged performance chart vs. the 2070 Super]
 
Interesting... designing the GPU (RDNA) to be backwards compatible with GCN hamstrings their innovation.

Guess we can be glad that Nvidia isn't worrying about that for their GPUs. But that totally sucks for AMD's desktop GPUs. Explains why their GPUs are always 2 years behind Nvidia and have been for the last, what, 8ish years? Approximately the same length of time that they have been doing console GPUs?

Ugh... fucking consoles..
 
Well, I mean you're cherry-picking the titles the 2060 excels at. Averaging out the differences between game engines and how they perform on certain architectures across a broader range of games, you get something like this (and this is against the 2070 Super). Hence why I think the 5700 XT is a better buy at $400.
[image: 5700XT_1.png — averaged performance chart vs. the 2070 Super]

What cherry picking? I was answering the claim about ray tracing being useless. The graph I included is only about ray tracing performance on the RTX 2060.
 
What cherry picking? I was answering the claim about ray tracing being useless. The graph I included is only about ray tracing performance on the RTX 2060.

I missed the DXR part in the title of the graph (I'm on my phone and it's small).

Either way, I still think the extra performance of the 5700 XT is of more value to me than the handful of DXR titles.
 
In my eyes, more options is nothing but good for PC gaming as a whole. Especially in the mid-range, which is the vast majority of the market.

Cherry picking ... just for fun, here's one beating out the RTX 2080 at 1080p by a fair margin (Hexus review). Running a 2080 Ti at 4K is nice, but that still represents a small fraction of consumers. Even at 4K the 5700 XT runs neck and neck with the 2070 Super in this particular title. If I were a fan of games that happen to have a strong Vulkan presence, I'd be giving these new AMD cards a hard look. With that in mind, it should be interesting when the most anticipated game of the year is released November 22nd. One we know is Vulkan-only and well optimized for it.

[images: Hexus review benchmark charts — 1080p and 4K results]
 
In my eyes, more options is nothing but good for PC gaming as a whole. Especially in the mid-range, which is the vast majority of the market.

Cherry picking ... just for fun, here's one beating out the RTX 2080 at 1080p by a fair margin (Hexus review). Running a 2080 Ti at 4K is nice, but that still represents a small fraction of consumers. Even at 4K the 5700 XT runs neck and neck with the 2070 Super in this particular title. If I were a fan of games that happen to have a strong Vulkan presence, I'd be giving these new AMD cards a hard look. With that in mind, it should be interesting when the most anticipated game of the year is released November 22nd. One we know is Vulkan-only and well optimized for it.

[images: Hexus review benchmark charts — 1080p and 4K results]


Not bad results in that game for the 5700 XT. With a little overclocking, I bet it could match or even exceed stock 2080 results at 4K.
 
In my eyes, more options is nothing but good for PC gaming as a whole. Especially in the mid-range, which is the vast majority of the market.

Cherry picking ... just for fun, here's one beating out the RTX 2080 at 1080p by a fair margin (Hexus review). Running a 2080 Ti at 4K is nice, but that still represents a small fraction of consumers. Even at 4K the 5700 XT runs neck and neck with the 2070 Super in this particular title. If I were a fan of games that happen to have a strong Vulkan presence, I'd be giving these new AMD cards a hard look. With that in mind, it should be interesting when the most anticipated game of the year is released November 22nd. One we know is Vulkan-only and well optimized for it.

Doom Eternal? Isn't that also an RTX-enabled title? DLSS as well?

If so, maybe AMD will have some kind of RT tech enabled by then...?
 
Doom Eternal? Isn't that also an RTX-enabled title? DLSS as well?

If so, maybe AMD will have some kind of RT tech enabled by then...?
Yes. It is my understanding that RT in Doom Eternal will be available to Nvidia owners through Vulkan. Forthcoming benchmarks should be fun.
 
Yes. It is my understanding that RT in Doom Eternal will be available to Nvidia owners through Vulkan. Forthcoming benchmarks should be fun.


Supposedly the Google streaming version is going to be balls-to-the-wall crazy compared to what us lowly "PC gamers" are going to be able to run locally. It will be crazy if it turns out to be true. Can't wait either way. I just do not see how you can play a twitch shooter with streaming and input lag.
 
Supposedly the Google streaming version is going to be balls-to-the-wall crazy compared to what us lowly "PC gamers" are going to be able to run locally. It will be crazy if it turns out to be true. Can't wait either way. I just do not see how you can play a twitch shooter with streaming and input lag.

There will be special features, but those are more likely to be related to the game actually running on the servers (zero-overhead Twitch streaming and things like that), not like it will be a visual upgrade, since it appears you only get one Vega 56 to run it on with this generation of Stadia.
 
There will be special features, but those are more likely to be related to the game actually running on the servers (zero-overhead Twitch streaming and things like that), not like it will be a visual upgrade, since it appears you only get one Vega 56 to run it on with this generation of Stadia.


Carmack was quoted (loosely) as saying "the Stadia version will have visuals not possible on PC... it's pretty nuts"... something along those lines. I am never going to use it, so I am too lazy to Google it, but his actual quotes are out there. /shrug
 
Carmack was quoted (loosely) as saying "the Stadia version will have visuals not possible on PC... it's pretty nuts"... something along those lines. I am never going to use it, so I am too lazy to Google it, but his actual quotes are out there. /shrug

Didn't Carmack leave to go work for Oculus (Facebook) about 5 years ago? Presumably he has nothing to do with building Doom franchise games anymore.

Here is an actual id quote:
https://arstechnica.com/gaming/2019...-to-overcome-ids-stadia-streaming-skepticism/

Land also teased that id was busy working on ways to differentiate the Stadia version of Doom Eternal in ways that aren't possible on other platforms. "That is all I'm allowed to say on the subject" for the time being, he added.

I expect this is mostly around streaming, game save states, etc. Not superior visuals.
 
Can we stick to 1440p charts (for parity's sake)?

Face it, hardly anybody is spending $350+ to game at 1080p or 4k. These cards are aimed squarely at 2k, so that is where we should define their parity. Yes, 1080p and 4k help tell a story, but for charts and comparisons, 1440p is the legit choice.
 
Obviously not; normal consumers don't care about the node process. And I never said anything to suggest a typical consumer would, so no idea why you are bringing this up.



Plus or minus a few points, or beating particular card X, is not the point and isn't relevant. The fact is, Navi is a 7nm part that consumes approximately the same power as a 12nm part.

The primary reasons/benefits of doing a node shrink:
1) Decreased power drain for the same performance (as the prior process) [or] increased performance for the same power (as the prior process). Historically both were achieved, but no longer.
2) Smaller area = cheaper (since you get more chips per wafer, the overall per-chip cost goes down; see the rough sketch after this list).
3) Larger transistor count in the available area, allows for more complex and powerful chips.
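
To put rough numbers on point 2, here is a back-of-the-envelope sketch. The die sizes are approximate public figures, and the math ignores edge loss and defect yield, so treat it purely as an illustration:

```python
import math

WAFER_DIAMETER_MM = 300  # standard modern wafer size

def gross_dies_per_wafer(die_area_mm2: float) -> int:
    """Naive upper bound: wafer area / die area (ignores edge loss and defect yield)."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2)

# Approximate die areas from public sources (treat as illustrative):
navi_10 = 251  # RX 5700 XT, 7nm, mm^2
tu106 = 445    # RTX 2070, 12nm, mm^2

print(gross_dies_per_wafer(navi_10))  # ~281 gross dies per wafer
print(gross_dies_per_wafer(tu106))    # ~158 gross dies per wafer
```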

I am focusing on the first reason. Since the power drain is basically the same as Nvidia's 12nm GPUs, it means they opted for "increased performance" at the same power level.
We can infer from this that at the lowered power level achievable with a die shrink, the performance was too low to be competitive. So they had to boost the power.
It means the GPU's layout/design is not as good as their competitor's. It also means that when the competitor drops to a 7nm process and gets all of those same advantages, this part will be at an even larger deficit.

Maybe better cooling will help Navi; remains to be seen. Someone said they have a water block on order, looking forward to those results. But that adds to the cost and changes its place in the price/performance matchups... putting it closer to or at the 2070 Super's price. Is it going to make up enough performance? Hopefully he reports the results so we get an idea.

I suspect the silicon is already at its limits, but with a shitty blower-type cooler it might be throttling, and maybe he can get more out of it. But judging from past Radeons, including one that required factory water cooling, I am not holding my breath.



Now you just sound like a salesman and/or fanboy. Doesn't help your credibility.


Do you really think actual buyers are concerned about a 16nm node process, or a 7nm node process..? Or do you think you are just being long-winded and talking about history and the past... because you do not want to talk about today and the future? You said that consumers were not. If consumers are not concerned about the node process, then why are you sitting here trying to tell us techies about something we know but don't care about, when price/performance on Navi is remarkable compared to Nvidia's newly released SUPER? Navi is the clear winner...

Many people have already given you illustrations, yet you keep pretending that Navi on water will cost too much. You've never explained your rationale behind that, as if water cooling on Navi costs more than water cooling on a SUPER..?

Funny thing is, Navi doesn't need water to compete with Nvidia, but it is really close to competing with Nvidia's $499 card... AIBs will have cheap liquid-cooled 5700 and 5700 XT cards soon. And they will not cost $499.





Also:

Help Navi do what..? Overclock, or run cooler... because you don't seem to understand how Navi works (it overclocks itself until it hits a thermal wall; if you remove that wall, it keeps clocking), or have not bothered to read how it reaches 2100MHz on the stock blower... o_O
 
Can we stick to 1440p charts (for parity's sake)?

Face it, hardly anybody is spending $350+ to game at 1080p or 4k. These cards are aimed squarely at 2k, so that is where we should define their parity. Yes, 1080p and 4k help tell a story, but for charts and comparisons, 1440p is the legit choice.

Hardly anybody? The vast majority of gamers are at 1080p, not 2k. The majority of people buying these cards will be gaming at 1080p.
 
No they won't. The 1080p gamer will be buying up all the used cards from the Navi/Super stampede.

27" 1080p monitors are for little kids, they don't own $350 GPUs.
 
Hardly anybody? The vast majority of gamers are at 1080p, not 2k. The majority of people buying these cards will be gaming at 1080p.
Agree. Steam hardware survey, June 2019. 1440p is not even a close second. A great many people are gaming on TVs in their living rooms these days. And that is 1080p.

[image: Steam hardware survey, June 2019 — primary display resolutions]
 
Hardly anybody? The vast majority of gamers are at 1080p, not 2k. The majority of people buying these cards will be gaming at 1080p.

I think his point was that 1080p gamers are not spending 350 bucks on a video card; they are more in the $200-and-under range. Personally, I think there is a bit of a mix of big spenders and penny pinchers at that resolution.
 
Can we stick to 1440p charts (for parity's sake)?

Face it, hardly anybody is spending $350+ to game at 1080p or 4k. These cards are aimed squarely at 2k, so that is where we should define their parity. Yes, 1080p and 4k help tell a story, but for charts and comparisons, 1440p is the legit choice.

You might want to look at what level 1080p players are forced to buy at for quality/longevity (aka more than 2 years), adaptive sync, and 144Hz, with an actual RAM upgrade over the 1060 or better price/perf than a 970 (or Polaris cards). It's $279+ for a 1660 Ti that some reviewers even sold as capable at 1440p. I want something that will handle most games at high refresh rates at 1080p, yet both Nvidia and AMD have me stuck buying $350+ cards to be safe (with 6GB on a 192-bit bus, no less). That, or $280-320 for a quality custom 1660 Ti. I might as well go 5700 or 2060 at that point. I watched one tech YouTuber the other day rant that Navi wasn't a Vega replacement or a Polaris replacement and thus prices were fine. This GPU market is just fubar.

Heck, even if you want Nvidia and decent 1080p 60Hz performance, you're stuck buying a 1660 at $230 or so. Though I'm seeing Nvidia/AIBs drop prices on their 1660 Tis by $30-50 this week. I happened to catch this at the Egg:
https://www.newegg.com/asus-geforce-gtx-1660-ti-ph-gtx1660ti-o6g/p/N82E16814126299

$285 for that single-fan card... looks like they took it off sale too. It was down to $240 AR. An utter joke this industry has become.
 
No they won't. The 1080p gamer will be buying up all the used cards from the Navi/Super stampede.

27" 1080p monitors are for little kids, they don't own $350 GPUs.

They're also for us middle-aged old guys.

I don't like gaming on anything more than 30 or so inches... man, I'm forced to wear a thousand-dollar pair of progressives these days... any larger than 30 and I look like a rooster bobbing my head up and down trying to keep everything in focus.

The same old-man eyes also mean that I honestly can barely make out the difference between a GOOD 1080p monitor and 1440p. So when I went monitor shopping last, I found a good high-refresh 1080p monitor. Don't regret it at all. You are not wrong though... I don't have much need of a super high-end GPU to push 1080p. Only upside of having to spend so much on my glasses these days... I haven't felt the need for GPU speed. lol
 
Can we stick to 1440p charts (for parity's sake)?

Face it, hardly anybody is spending $350+ to game at 1080p or 4k. These cards are aimed squarely at 2k, so that is where we should define their parity. Yes, 1080p and 4k help tell a story, but for charts and comparisons, 1440p is the legit choice.


1080p and 2K are the same thing.


1920 x 1080 is 2K.

Double that and you have:
3840 x 2160 (AKA 4K).

Double that and you have:
7680 x 4320 (AKA 8K)

The nomenclature comes from the horizontal width being approximately 2000, 4000, and 8000 pixels.

By that logic 2560 x 1440 would be 2.5K.

Better yet, just stick to the actual resolutions, since there are so many odd aspect ratios these days.

If we wanted one number to describe the monitor driven by a GPU, we should probably use megapixels, as with digital cameras.

1920 x 1080 = 2.1 MP
1920 x 1200 = 2.3 MP
2560 x 1080 = 2.8 MP
2560 x 1440 = 3.7 MP
2560 x 1600 = 4.1 MP
3440 x 1440 = 5.0 MP
3840 x 1600 = 6.1 MP
3840 x 2160 = 8.3 MP

Now you can directly compare the impact on the GPU, and it's easy to see that a 4K monitor needs 4 times the GPU power that a 2K monitor does.
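
If you want to reproduce that table, a few lines of Python will do it (a throwaway sketch using only the resolutions listed above):

```python
# Megapixels and GPU load relative to 1080p for the resolutions above.
resolutions = [
    (1920, 1080), (1920, 1200), (2560, 1080), (2560, 1440),
    (2560, 1600), (3440, 1440), (3840, 1600), (3840, 2160),
]

base = 1920 * 1080
for w, h in resolutions:
    px = w * h
    print(f"{w} x {h} = {px / 1e6:.1f} MP ({px / base:.2f}x the pixels of 1080p)")
```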

FWIW, I have a lovely old NEC Pro monitor with 1920x1200 resolution, that I will use until it fails. I don't see upgrading video cards as a reason to buy a new monitor. It isn't like having a bit of GPU overkill is a bad thing.
 
Agree. Steam hardware survey, June 2019. 1440p is not even a close second. A great many people are gaming on TVs in their living rooms these days. And that is 1080p.

You can't agree... when his comments aren't true for the scenario given.

Nobody is disputing 1080p has more use today... but those people already have video cards and are gaming at 1080p ALREADY! I said those buying $350+ video cards are doing it for 1440p. Nobody is buying Navi or Super for 1080p... they are buying these new cards for their brand-new 1440p monitor. I suspect he knows what I said, but likes to mock the thread with play arguments.
 
They're also for us middle-aged old guys.

I don't like gaming on anything more than 30 or so inches... man, I'm forced to wear a thousand-dollar pair of progressives these days... any larger than 30 and I look like a rooster bobbing my head up and down trying to keep everything in focus.

The same old-man eyes also mean that I honestly can barely make out the difference between a GOOD 1080p monitor and 1440p. So when I went monitor shopping last, I found a good high-refresh 1080p monitor. Don't regret it at all. You are not wrong though... I don't have much need of a super high-end GPU to push 1080p. Only upside of having to spend so much on my glasses these days... I haven't felt the need for GPU speed. lol

Old men have money...
Old men know what they want...
Old men don't compromise...
Old men can afford a LARGER monitor than 27 inches...

Hence, only little kids game on 27" monitors... and at 1080p, because it is all they can afford. People who can afford a larger monitor (i.e., 32" 1440p or 34" 1440p) also have the money to afford a $350 gaming card. Again, find someone who is upgrading their video card with Navi or SUPER to game at 1080p. K thanks.
 
Old men have money...
Old men know what they want...
Old men don't compromise...
Old men can afford a LARGER monitor than 27 inches...

Hence, only little kids game on 27" monitors... and at 1080p, because it is all they can afford. People who can afford a larger monitor (i.e., 32" 1440p or 34" 1440p) also have the money to afford a $350 gaming card. Again, find someone who is upgrading their video card with Navi or SUPER to game at 1080p. K thanks.
Maybe I'm just missing it, but I don't see the third option. Or are you suggesting "little kids" transition directly into "old men" overnight, without a gradual transition? :oldman:
 
Old men have money...
Old men know what they want...
Old men don't compromise...
Old men can afford a LARGER monitor than 27 inches...

Hence, only little kids game on 27" monitors... and at 1080p, because it is all they can afford. People who can afford a larger monitor (i.e., 32" 1440p or 34" 1440p) also have the money to afford a $350 gaming card. Again, find someone who is upgrading their video card with Navi or SUPER to game at 1080p. K thanks.

Dude, I had no issue with your take on Navi compared to the Nvidia counterparts. However, your comment above about gaming only on monitors larger than 27 inches is just ridiculous. Affording something does not make a person better, nor does what they game on make a person better. I now also own a 24-inch 1080p 144Hz monitor, a 27-inch 1080p 144Hz monitor (both curved), and a 43-inch 4K Samsung TV/monitor. By your take, a person who has only one machine or one really good video card is better.
 
Can we stick to 1440p charts (for parity's sake)?

No.

Face it, hardly anybody is spending $350+ to game at 1080p or 4k. These cards are aimed squarely at 2k, so that is where we should define their parity. Yes, 1080p and 4k help tell a story, but for charts and comparisons, 1440p is the legit choice.

Sure they are. 1080p at 240Hz would be a good match for an RX 5700. Further, the RX 5700 can do fairly well at 4K60 if settings are kept in check; keep your eye on the 0.1% lows. With a FreeSync monitor this is a fairly versatile GPU waiting for a decent cooler.
 
You can't agree... when his comments aren't true for the scenario given.

Nobody is disputing 1080p has more use today... but those people already have video cards and are gaming at 1080p ALREADY! I said those buying $350+ video cards are doing it for 1440p. Nobody is buying Navi or Super for 1080p... they are buying these new cards for their brand-new 1440p monitor. I suspect he knows what I said, but likes to mock the thread with play arguments.

Nah, you said gaming on a 27-inch monitor is for kids. Also, the idea that you would only buy a Navi if you have a 1440p monitor is ludicrous. Navi would also work great with high-refresh-rate 1080p monitors.
 
Sweet! I'm in the 0.85% gaming at 1920x1200.

lol, so you are that one other person!
Still got my Dell 2408FPW. I don't like 16:9; too bad there are so few 16:10s around.
The next monitor will eventually be an ultrawide or a 16:10 27".
 
1080p and 2K are the same thing.
1920 x 1080 is 2K.
Double that and you have:
3840 x 2160 (AKA 4K).
Double that and you have:
7680 x 4320 (AKA 8K)
The nomenclature comes from the horizontal width being approximately 2000, 4000, and 8000 pixels.
By that logic 2560 x 1440 would be 2.5K.
Better yet, just stick to the actual resolutions, since there are so many odd aspect ratios these days.
If we wanted one number to describe the monitor driven by a GPU, we should probably use megapixels, as with digital cameras.
1920 x 1080 = 2.1 MP
1920 x 1200 = 2.3 MP
2560 x 1080 = 2.8 MP
2560 x 1440 = 3.7 MP
2560 x 1600 = 4.1 MP
3440 x 1440 = 5.0 MP
3840 x 1600 = 6.1 MP
3840 x 2160 = 8.3 MP
Now you can directly compare the impact on the GPU, and it's easy to see that a 4K monitor needs 4 times the GPU power that a 2K monitor does.
FWIW, I have a lovely old NEC Pro monitor with 1920x1200 resolution, that I will use until it fails. I don't see upgrading video cards as a reason to buy a new monitor. It isn't like having a bit of GPU overkill is a bad thing.


4k resolution is 4x that of 1080p. <---- Fact!
2k is universally understood as anything between Full HD (1080p) and 4k, which is essentially every 1440p and ultrawide panel. As such, any gamer would understand what another gamer meant by sub-4k gaming... as 2k. Even on stage at events, speakers often gloss over things and use 2k and 4k in gaming references...

Thanks for the Wikipedia lessons, but probably not needed for those who understand 2k gaming. (Add up the total pixels and divide by 2 = approximate K)




You: -"I don't see upgrading video cards as a reason to buy a new monitor."-

Brilliant twisting of words there, but is that enough to say that nobody would ever upgrade their video card... at all? Otherwise they would always have the same monitor. But I do know that everyone who buys a new monitor wants a new video card to push their new pixels, since their 1080p video card really doesn't push the pixels on their new 1440p monitor.

Hence, people upgrade their video card after buying, or when buying, a new monitor. Your video card must match your monitor; they are an established pair.


And since 1440p monitors are now mainstream and relatively cheap... all those people stuck on 1080p can now move to bigger monitors with higher resolutions, at dirt-cheap prices = mainstream. The 5700 series is right there for all those millions and millions of young kids and adults who are ready to move up to mainstream technologies.

That is what is pushing the sales of ~$350 GPUs, not 1080p.
 
4k resolution is 4x that of 1080p. <---- Fact!

But 8k resolution is not 8x that of 1080p. Nor is 5k 5x, etc. That comparison only works for 4k.

2k is universally understood as anything between Full HD (1080p) and 4k.

The nK nomenclature, as in 4K, is universally understood to be a close approximation of the number of columns of pixels, whereas the np nomenclature, as in 1080p, is universally understood to be the number of rows.

Since 1080p is 1920x1080, it is also 2k. 2560x1440 would be 2.5k, but is usually understood as '1440p'.
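
For what it's worth, the loose convention is easy to mimic in code (a toy helper of my own, not any standard; note the convention isn't even self-consistent, since marketing rounds 7680 up to "8K"):

```python
def k_label(width: int) -> str:
    """Approximate 'K' name from horizontal pixels, rounded to the nearest half."""
    k = round(width / 500) / 2
    return f"{k:g}K"

for w in (1920, 2560, 3440, 3840):
    print(w, "->", k_label(w))  # 1920 -> 2K, 2560 -> 2.5K, 3440 -> 3.5K, 3840 -> 4K
```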
 
Old men have money...
Old men know what they want...
Old men don't compromise...
Old men can afford a LARGER monitor than 27 inches...

Hence, only little kids game on 27" monitors... and at 1080p, because it is all they can afford. People who can afford a larger monitor (i.e., 32" 1440p or 34" 1440p) also have the money to afford a $350 gaming card. Again, find someone who is upgrading their video card with Navi or SUPER to game at 1080p. K thanks.
I am an old man with a 27" 4K monitor, a 27" 1440p monitor, plus a 34" 3440x1440. Not sure if I know what I want; just buy everything if in doubt ;)
 
But 8k resolution is not 8x that of 1080p. Nor is 5k 5x, etc. That comparison only works for 4k.



The nK nomenclature, as in 4K, is universally understood to be a close approximation of the number of columns of pixels, whereas the np nomenclature, as in 1080p, is universally understood to be the number of rows.

Since 1080p is 1920x1080, it is also 2k. 2560x1440 would be 2.5k, but is usually understood as '1440p'.


Yes, I think everybody knows this.
So, if someone mentions they game at 2k, would you know what they mean?

Same as when someone says their car has 480 horses... do you know what they mean, even if it technically isn't the truth?



Do you see how these AMD naysayers have made this thread all about someone mentioning 2k, harping on that instead of discussing AMD's 5700 series? That is a technique known as threadcrapping... and that is what these characters are trying to do. They don't want an in-depth Radeon 5700 thread; they want to disrupt the conversation, because they have stakes in doing so.
 
4k resolution is 4x that of 1080p. <---- Fact!
2k is universally understood as anything between Full HD (1080p) and 4k, which is essentially every 1440p and ultrawide panel. As such, any gamer would understand what another gamer meant by sub-4k gaming... as 2k. Even on stage at events, speakers often gloss over things and use 2k and 4k in gaming references...

Thanks for the Wikipedia lessons, but probably not needed for those who understand 2k gaming. (Add up the total pixels and divide by 2 = approximate K)

Your posts are a constant reminder of the Dunning–Kruger effect.

The resolution of 4K monitors is not 4X the resolution of 1080p monitors; resolution is linear. Also, the 4K nomenclature has nothing to do with being 4 times anything, either.

4K has DOUBLE the linear resolution of 2K, as in, 1920 (AKA 2K) doubles to become 3840 (AKA 4K).

What quadruples is the total pixel count, which is an AREA measurement, not a linear measurement.
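
The arithmetic is easy to check; here is a two-line Python sanity check using only the resolutions already named:

```python
w1, h1 = 1920, 1080  # 1080p, a.k.a. 2K
w2, h2 = 3840, 2160  # 4K: each linear dimension doubles

print(w2 / w1, h2 / h1)        # 2.0 2.0 -> linear resolution doubles
print((w2 * h2) / (w1 * h1))   # 4.0     -> pixel count (area) quadruples
```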

So, if someone mentions they game at 2k, would you know what they mean?

Depends on whether they know what they are talking about or not. If they know what they are talking about, it would mean they have approximately 2000 columns of horizontal resolution.

If they don't know what they are talking about, then I have no idea what is in their head. ;)
 