AMD RDNA 5700 or

Sure, if you already have an RTX 2080 Ti to do ray tracing, a 5700 XT can make sense for a second machine.

But if it's going to be your only GPU and AMD doesn't offer significantly better perf/$, I would take the "free" RT hardware to try in some games.
Ray tracing below a 2080 is basically a no-go, though. It's like having a strap-on for your lawnmower.
 
They got it in Vulkan? That will be interesting to see, if so. If anyone can do it, it'll be Carmack, and a tunnel shooter is the easiest way. I look forward to the day when it can be done open world, not just partial-scene RT like now, and with frame rates north of 60 fps at 4K. Right now is a generation or two too early.
 
As an RTX 2080 Ti owner I absolutely agree! I'm gonna switch my alternate machine from a 1070 Ti to a 5700 XT.

As far as lightning goes, go and buy a lightning arrestor kit for your house.
And don't forget about the network cables!
 
lol, but at the same time 99.9% of games have no RT and will never have RT this generation. Something to consider.
Dunno man. I've used RT on an overclocked Pascal GPU without RTX hardware support, at lower resolutions in BFV, and I could definitely understand why people would want it.
On the other hand, the price you pay for that perf is steep as fuck, which has always led me to dislike the RTX series of GPUs.
I'm waiting another gen myself before buying into anything; 4K rasterization for another year or so.
 
Dunno man. I've used RT on an overclocked Pascal GPU without RTX hardware support, at lower resolutions in BFV, and I could definitely understand why people would want it.
On the other hand, the price you pay for that perf is steep as fuck, which has always led me to dislike the RTX series of GPUs.
I'm waiting another gen myself before buying into anything; 4K rasterization for another year or so.

I don't think anyone debates that it is a neat feature. I think the performance is lacking on anything except the highest-end cards, and there aren't a lot of titles. It seems like the early adopters are paying a steep price for very little return except more powerful rasterization performance. And you could make the argument that the Pascal generation is a better bang for the buck right now for rasterization unless you are buying a 2080 Ti: the 1080 Ti is within 10% of a 2080, a 1080 within 10% of a 2070, etc. That RTX tax is steep on the 2080 Ti.
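
Just to put rough numbers on that bang-for-the-buck argument, a quick back-of-the-envelope sketch. The relative performance figures follow the "within 10%" claims above (normalized to a GTX 1080 = 100), and the prices are made-up ballpark street prices, not anything quoted in this thread:

Code:
# Rough perf-per-dollar sketch. Performance is normalized to GTX 1080 = 100;
# prices are hypothetical launch-era street prices, purely for illustration.
cards = {
    "GTX 1080":    {"perf": 100, "price": 500},
    "RTX 2070":    {"perf": 108, "price": 550},
    "GTX 1080 Ti": {"perf": 130, "price": 700},
    "RTX 2080":    {"perf": 140, "price": 800},
    "RTX 2080 Ti": {"perf": 175, "price": 1200},
}

for name, c in cards.items():
    print(f"{name:12s} perf/$ = {c['perf'] / c['price']:.3f}")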
 
Dunno man. I've used RT on an overclocked Pascal GPU without RTX hardware support, at lower resolutions in BFV, and I could definitely understand why people would want it.
On the other hand, the price you pay for that perf is steep as fuck, which has always led me to dislike the RTX series of GPUs.
I'm waiting another gen myself before buying into anything; 4K rasterization for another year or so.
I guess I need to try that out. I played some Quake II RTX, and that was somewhat impressive but sad as well. I'll SLI it and see if it makes a significant IQ difference in BFV.
 
While I am not a huge fan of Nvidia's anti-consumer business practices, I do not deny their hardware is amazing. I use AMD stuff for a reason, and that reason has paid off handsomely.


But one thing everyone is forgetting is that you are paying for a HUGE 600+ mm² die that still has very mediocre yields a year later.

I mean, Nvidia did this to themselves. They figured they would offset the R&D by bringing their big-die scientific cores to the gaming market, all while making a huge amount on them. It's a good idea from a business perspective if you can get enough people to fall for it and basically pay $1200 for a die that would otherwise have a great chance of hitting the scrap pile (depending on its defects it *could* get cut down into a lower SKU, but a lot do not make that cut).

I am interested in the 5700 series because I want to see if AMD was finally able to uncork Raja's screw-up. I rewarded AMD by buying a bunch of Vegas and VIIs since they paid me in spades, but I am still pissed about the lies. Glad Raja is gone and Lisa Su has things on course.
 
But one thing everyone is forgetting is that you are paying for a HUGE 600+ mm² die that still has very mediocre yields a year later.

What are the yields on the big Turing 12nm parts? That process is basically 16nm and is as old as dirt by now.
 
What are the yields on the big Turing 12nm parts? That process is basically 16nm and is as old as dirt by now.
The 2080 Ti is 754 mm²; I don't ever remember a die near that big for a mainstream part.
I think that says enough.

I'd love one acid-bathed and in a glass case to see the deliciousness.
 
The 2080 Ti is 754 mm²; I don't ever remember a die near that big for a mainstream part.
I think that says enough.

I'd love one acid-bathed and in a glass case to see the deliciousness.

Yeah, it's stupidly massive, but size isn't the only factor that determines yield. The process should be really cheap and mature by now. We'll probably never know what these things actually cost.
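
For anyone curious why die size alone doesn't tell the story, a minimal sketch of the classic Poisson yield model, Y = exp(-A * D0). The die areas are the published figures, but the defect densities are illustrative guesses; nobody outside the foundries knows the real numbers:

Code:
import math

def poisson_yield(area_mm2, defects_per_cm2):
    """Classic Poisson yield model: Y = exp(-A * D0)."""
    return math.exp(-(area_mm2 / 100.0) * defects_per_cm2)

dies = {"TU102 (2080 Ti)": 754, "Navi 10 (5700 XT)": 251}

# D0 values are illustrative guesses, not published foundry numbers.
for name, area in dies.items():
    for d0 in (0.4, 0.1):  # defects/cm^2: immature vs. mature process
        print(f"{name} @ D0={d0}: ~{poisson_yield(area, d0):.0%} perfect dies")

On a mature process even a huge die yields tolerably, which is the point: size and defect density both matter.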
 
I'm in the same boat, sitting on a GTX 1070 at 3440x1440 with a 144 Hz monitor. Gonna see how the 5700 XT vs. RTX Super shakes out (if they REALLY exist).
 
I'm in the same boat, sitting on a GTX 1070 at 3440x1440 with a 144 Hz monitor. Gonna see how the 5700 XT vs. RTX Super shakes out (if they REALLY exist).
RTX Super is just a Super Expensive version is all, a new deal from Nvidia ;). Rebadge a card with slight modifications: maybe faster memory that performs no faster than the previous memory overclocked, a faster rated boost clock that previous cards could already hit, a new sticker, and charge the same price as before. Just joking; it should have somewhat higher performance overall if Nvidia actually has higher-CUDA-core-count versions (however achieved) as rumored.

I would be interested in a game that actually has an optimized path for wave32 on RDNA. Maybe Doom Eternal will have that; there are some top-notch, state-of-the-art programmers at id Software. Whether AMD could or would bake some of those shaders into the drivers to run at wave32, I have no idea. Some major performance increases for those operations can be achieved; how much that affects overall performance remains to be seen. I would not be too surprised if a game using wave32 outperforms a 2080 or Radeon VII in certain aspects or areas.
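
To illustrate one reason wave32 can matter, here's a toy Monte Carlo sketch of branch divergence. The tile size and branch probability are completely made up; the point is just that a divergent branch drags down a 64-wide wavefront more often than a 32-wide one:

Code:
import random

random.seed(0)
THREADS = 1 << 16
TILE = 16  # assumption: threads in the same screen tile branch together

# One coin flip per 16-thread tile models spatially coherent branching,
# as in a pixel shader where neighbouring pixels take the same path.
takes = []
for _ in range(THREADS // TILE):
    takes += [random.random() < 0.5] * TILE

def wasted(wave_size):
    """If any lane in a wave diverges, the wave executes both branch paths,
    with the inactive lanes masked off; count those masked (wasted) slots."""
    total = useful = 0
    for i in range(0, THREADS, wave_size):
        wave = takes[i:i + wave_size]
        paths = len(set(wave))       # 1 if coherent, 2 if divergent
        total += paths * wave_size   # lane-slots issued
        useful += wave_size          # each lane only needed one pass
    return 1 - useful / total

for w in (32, 64):
    print(f"wave{w}: ~{wasted(w):.0%} of issued lane-slots wasted to divergence")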
 
I would be interested in a game that actually has an optimized path for wave32 on RDNA. Maybe Doom Eternal will have that; there are some top-notch, state-of-the-art programmers at id Software. Whether AMD could or would bake some of those shaders into the drivers to run at wave32, I have no idea. Some major performance increases for those operations can be achieved; how much that affects overall performance remains to be seen. I would not be too surprised if a game using wave32 outperforms a 2080 or Radeon VII in certain aspects or areas.

Better optimizations in paths that use DX11 or OpenGL?
 
I would be interested in a game that actually has an optimized path for wave32 on RDNA. Maybe Doom Eternal will have that; there are some top-notch, state-of-the-art programmers at id Software. Whether AMD could or would bake some of those shaders into the drivers to run at wave32, I have no idea. Some major performance increases for those operations can be achieved; how much that affects overall performance remains to be seen. I would not be too surprised if a game using wave32 outperforms a 2080 or Radeon VII in certain aspects or areas.

I will, again, throw out the warning about putting faith in special AMD GPU features. There is a long history of these not panning out, even when they had a commanding share of the market (back when they were ATi...), and they've done it with every recent release to zero effect.
 
I will, again, throw out the warning about putting faith in special AMD GPU features. There is a long history of these not panning out, even when they had a commanding share of the market (back when they were ATi...), and they've done it with every recent release to zero effect.

Well, it is just something that can speed things up more than anything else, and it seems that it will be supported in newer generations as well:

[Attached image: AMD Navi presentation slide (Navi-Slide-10)]


If you go by the same principle as with GCN, where optimized code benefits each successor, then eventually you will get there.
No, I am not talking about fine wine ;)
 
I will, again, throw out the warning about putting faith in special AMD GPU features. There is a long history of these not panning out, even when they had a commanding share of the market (back when they were ATi...), and they've done it with every recent release to zero effect.


While you are mostly correct, Far Cry 5 had Rapid Packed Math enabled, which gave Vega a nice boost.

I'm still a bit bitter over the PSR that Raja promised was going to be enabled "in a future driver release"... I think now that he is gone, things should be a bit better. Time will tell.
 
If you go by the same principle as with GCN, where optimized code benefits each successor, then eventually you will get there.
No, I am not talking about fine wine ;)

Well, fine wine comes from AMD's legendarily terrible release drivers improving over time; in this case, we'd be talking about developers taking advantage of new AMD-specific features over time.

I'll submit that it's entirely possible now that DX12 and Vulkan essentially force developers to deal directly with GPUs, but it does depend on how well AMD's equally legendary lack of developer support 'gets the word out' and helps get the features productively implemented.

I'm always happy when AMD moves the industry forward; it's just very, very rare, and counting on developers to implement something given AMD's market share and level of support is seriously reaching.
 
Well, now that I own a 5700 XT, I am very impressed with everything but the cooling solution.

It's a really fast card; to be honest, it's super [fucking] smooth (emphasis added). When AMD advertised the card at Computex and boasted about how smooth it was, they weren't lying.

While my water-cooled 2080 Ti can basically shit all over every other GPU, in some games the RDNA card feels smoother even at a lower frame rate.

I can't explain why; it just seems to feel that way to my eyes.
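
One plausible explanation is frame-time consistency: a steady, slightly lower frame rate can feel smoother than a higher average with spikes. A toy comparison with made-up frame-time traces:

Code:
import statistics

# Two made-up frame-time traces in milliseconds.
trace_a = [8, 8, 8, 8, 30] * 40  # ~81 fps average, 30 ms hitch every 5th frame
trace_b = [13] * 200             # locked ~77 fps, perfectly even pacing

for name, trace in (("A (faster, spiky)", trace_a), ("B (slower, steady)", trace_b)):
    avg_fps = 1000 * len(trace) / sum(trace)
    p99 = sorted(trace)[int(len(trace) * 0.99)]  # 99th-percentile frame time
    print(f"{name}: avg {avg_fps:.0f} fps, "
          f"stdev {statistics.pstdev(trace):.1f} ms, 99th pct {p99} ms")

Trace A wins on average fps but loses badly on frame-time variance, which is what the eye tends to register as "smoothness".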
 
I will, again, throw out the warning about putting faith in special AMD GPU features. There is a long history of these not panning out, even when they had a commanding share of the market (back when they were ATi...), and they've done it with every recent release to zero effect.

So, in other words, like RTX :D.
 
So, in other words, like RTX :D.

;)

In terms of AAA games that support DXR / Vulkan RT well and show a difference that uninformed users would notice and appreciate, they're similar; in terms of market acceptance and developer effort, they couldn't be more different.
 
Well, now that I own a 5700 XT, I am very impressed with everything but the cooling solution.

It's a really fast card; to be honest, it's super [fucking] smooth (emphasis added). When AMD advertised the card at Computex and boasted about how smooth it was, they weren't lying.

While my water-cooled 2080 Ti can basically shit all over every other GPU, in some games the RDNA card feels smoother even at a lower frame rate.

I can't explain why; it just seems to feel that way to my eyes.


So they really got that anti-lag thing to work, and it isn't entirely placebo? (I've been betting on placebo myself.)

Btw, if you can, do some tests with Radeon Image Sharpening and share your thoughts on it; it's always great to hear firsthand accounts.
 
Well, now that I own a 5700 XT, I am very impressed with everything but the cooling solution.

It's a really fast card; to be honest, it's super [fucking] smooth (emphasis added). When AMD advertised the card at Computex and boasted about how smooth it was, they weren't lying.

While my water-cooled 2080 Ti can basically shit all over every other GPU, in some games the RDNA card feels smoother even at a lower frame rate.

I can't explain why; it just seems to feel that way to my eyes.

Now you are almost making me wish I had picked up a 5700 on Monday anyway, even though I do not need one. (I have a Vega 56 with the 64 BIOS flashed to it.)
 
Hardware Unboxed tested Radeon Image Sharpening, and their conclusion was that for the small performance hit it was better than Nvidia's DLSS.



The purpose of DLSS is mainly to increase speed, which is useful, where possible, to offset the cost of RT.

DLSS 2X would have been IQ-oriented, but we're still waiting...
 
Hardware Unboxed tested Radeon Image Sharpening, and their conclusion was that for the small performance hit it was better than Nvidia's DLSS.



When it comes to those videos, if it is not Steve, I do not bother watching. (Not sure why, just the way I am.)
 
The purpose of DLSS is mainly to increase speed, which is useful, where possible, to offset the cost of RT.

DLSS 2X would have been IQ-oriented, but we're still waiting...

Well, the point is that Radeon Image Sharpening gives you increased speed at similar or better image quality, and in the cases where DLSS does look better, its speed gain is smaller than RIS's; that lets you raise the RIS render resolution (from 1440p to 1800p) and open the gap back up in RIS's favor.

There are also the various points raised about DLSS being resolution-locked, game-locked, and image-quality-locked, while RIS is only limited to DX12, Vulkan, and DX9, and that only because DX11 support is in the works. Any image quality/resolution/game should work within those parameters.
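
For context on what RIS is doing under the hood, here's a rough numpy sketch of the contrast-adaptive idea: sharpen less where local contrast is already high. This is an illustration only, not AMD's actual CAS shader (the real one works per RGB channel with different weighting):

Code:
import numpy as np

def contrast_adaptive_sharpen(img, strength=0.8):
    """Rough sketch of a CAS-style filter (NOT AMD's actual shader):
    a cross-shaped sharpen whose amount backs off where the local
    min/max contrast is already high. img: float32 grayscale in [0, 1]."""
    p = np.pad(img, 1, mode="edge")
    n = p[:-2, 1:-1]; s = p[2:, 1:-1]; w = p[1:-1, :-2]; e = p[1:-1, 2:]
    lo = np.minimum.reduce([img, n, s, w, e])
    hi = np.maximum.reduce([img, n, s, w, e])
    amount = strength * (1.0 - (hi - lo))  # adapt: less sharpening on strong edges
    sharpened = img + amount * (4 * img - (n + s + w + e)) * 0.25
    return np.clip(sharpened, 0.0, 1.0)

# Tiny usage example on a soft gradient:
img = np.linspace(0, 1, 16, dtype=np.float32).reshape(4, 4)
print(contrast_adaptive_sharpen(img))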
 
DLSS isn't the right comparison; Freestyle is the competitor to FidelityFX, get that right. RIS is only one part of FidelityFX, and Nvidia is free to grab the Contrast Adaptive Sharpening shader and add it to Freestyle.
 
DLSS isn't the right comparison; Freestyle is the competitor to FidelityFX, get that right. RIS is only one part of FidelityFX, and Nvidia is free to grab the Contrast Adaptive Sharpening shader and add it to Freestyle.

Many are over-reacting to what is merely a sharpening filter. You could add a sharpening filter for years with ReShade, and for some time with Freestyle from Nvidia.

So for years you could drop your resolution and run a sharpening filter if you wanted to, and pretty much no one did.

This is NOTHING new.

It would not be a big deal either, except that people are comparing it to DLSS, a technology that was almost universally panned as a failure.

Before RIS, HWUB had already concluded that standard upscaling was better than DLSS.

So really this is just the same argument again, with a bit of image sharpening thrown on top, and sharpening has ALWAYS been used as a crutch for higher resolution in video and still images.

So yeah, a little sharpening can help an image bridge the gap, but then again, you could apply a little sharpening to the higher-resolution image and have that gap return...

I am not defending DLSS. It's a failure.

But excitement about a sharpening filter, like it's some kind of new idea, is absurd. I have seen sharpening filters ad nauseam since the early days of digital cameras, and then video.
 
Many are over-reacting to what is merely a sharpening filter. You could add a sharpening filter for years with ReShade, and for some time with Freestyle from Nvidia.

So for years you could drop your resolution and run a sharpening filter if you wanted to, and pretty much no one did.

This is NOTHING new.

It would not be a big deal either, except that people are comparing it to DLSS, a technology that was almost universally panned as a failure.

Before RIS, HWUB had already concluded that standard upscaling was better than DLSS.

So really this is just the same argument again, with a bit of image sharpening thrown on top, and sharpening has ALWAYS been used as a crutch for higher resolution in video and still images.

So yeah, a little sharpening can help an image bridge the gap, but then again, you could apply a little sharpening to the higher-resolution image and have that gap return...

I am not defending DLSS. It's a failure.

But excitement about a sharpening filter, like it's some kind of new idea, is absurd. I have seen sharpening filters ad nauseam since the early days of digital cameras, and then video.


I'm not a big defender of DLSS or RTX; in fact, I've been pretty critical of the price and availability. But I wouldn't say DLSS (or RTX) is a failure yet, for the simple reason that it may be much more widely used in the coming year's games. If you simply meant it's a failure as a selling point, I'd 100% agree with that statement.
 
FWIW, done in Nvidia Freestyle.

1440p sharpened: [screenshot]

4K, no filters: [screenshot]