RTX 2060 Super or RX 5700 XT or RX 5700 ?

Swagata

I am confused between the Zotac RTX 2060 Super, the Sapphire Pulse RX 5700, and the Sapphire Nitro+ RX 5700 XT. The Nvidia card is cooler and more power efficient, which is a big yes for me, and I would also get Call of Duty: Modern Warfare free with the Zotac one. At the same price I also have the Sapphire Nitro+ 5700 XT. This is faster and cooler but consumes very high power; not to mention the horrible driver support. And there is the Sapphire Pulse 5700 non-XT version, much cheaper than those cards, with decent cooling and performance. I am looking for opinions. I am on 1080p now but I will be upgrading to 1440p in the near future. I currently use a Ryzen 3600, Cryorig H7, MSI B450 Tomahawk Max, 16GB G.Skill 3200MHz DDR4, Zotac GTX 1060 3GB, and a Cooler Master V700.
 
1060 3GB? May as well get a new card by the time you upgrade to 1440p. You will probably have better options then. In the meantime, the 1060 should be able to do well at 1080p?
 
1060 3GB? May as well get a new card by the time you upgrade to 1440p. You will probably have better options then. In the meantime, the 1060 should be able to do well at 1080p?
The 3GB version doesn't do well at 1080p. I will upgrade to 1440p within a month. Can you suggest a GPU now, based on what I said? Thanks.
 
I'd wait for the Navi 5090 and the next-gen Nvidia Ampere, first half of 2020. They may bring mid-range prices down.
 
Not sure what you have now, but those choices you're looking at would be good for 1080p and beyond.

Personally, I like the 5700 XT a bunch, but the 2060 Super is also a decent card.

AMD driver support is not as bad as people claim. There are sometimes bugs and issues but they do eventually get fixed.
 
not to mention the horrible driver support.

First-hand experience or word of mouth? They have no more issues than Nvidia, keeping in mind Navi is brand new; when Nvidia's Turing was first released, just Google the driver issues from the first few weeks, or the dying RTX 2080s. I have a Navi 5700 XT and have been running it issue-free on the September drivers, so not even the latest. And no, I'm not AMD-biased; I also run a 2070S and haven't experienced any driver issues with it. Unless you have a Navi card and have experienced issues with the drivers first hand (and even then it could be system specific), you really shouldn't be shitting on their drivers. Any time NV or AMD releases a card, the first batch or two of drivers usually has some issues, but that's in no way a clear representation of the drivers moving forward after the initial release ones. Unless I hear from a reputable site or YouTube channel that NV or AMD has a wide-scale issue with a specific driver, you have to assume first that if someone is complaining of an issue, it "could" be their own system configuration. And again, that's not to say AMD and Nvidia don't both have issues periodically, but you can't lump all their drivers into a "they're all shit" category. K, rant done.
 
I read people are having problems with Radeon Adrenalin so I would always go Nvidia.
 
Get the 2060 Super. It's got COD included and has Nvidia drivers, which historically have been better. It's a better value.
 
For the price, I'd get a 5700 and then flash the XT BIOS to it. I bought one new for under $300 AR, and at that price point nothing compares. The drivers are better than the launch drivers; I haven't had any issues.


I dunno, I killed my PowerColor Red Devil 5700 XT about six hours of gaming into the BIOS flash.
 
Yeah, everything was in check. The GPU just wasn't up to the task, is my guess. It's the same cooler as the XT.
 
Killed as in totally dead? Could you revive it, or return it to the store? I won't look down on you.
 
I vote 5700 XT unless you really care about DXR ray tracing specifically. Generally it looks to be the faster card at 1440p; the 2060S is really more of a 5700 competitor, but at $100+ more. Not that it's a bad card, but those RT/Tensor cores aren't free. Even if there have been Navi driver issues at launch, it'll be fine long-term; at least that's been the case in my experience with 7850 CF, a 290X, and a Vega 64.
 
not to mention the horrible driver support.

My last upgrade was from an R9 280X to a GTX 1070 Ti. Honestly, my experience is that AMD and Nvidia are pretty much the same with regard to driver support. As a matter of fact, I would contend that's pretty much been true since ATi released the Catalyst driver. People love to bitch constantly about AMD and their driver support like they're the only ones who ever had any problems, because their Rage 128 card was kind of crap and they've used Nvidia ever since, seemingly forgetting that Nvidia has had plenty of their own issues over the years as well. Take it from me, driver support is not a deciding factor between those cards. I typically go back and forth between the brands and have never noticed any real difference.

If you can wait, upgrade when the new cards come out next year. If you can’t, just get whatever has the best deal, because RTX ray tracing on a 2060 at 1440p won’t be playable anyway.

On power consumption, your main considerations should be whether your PSU supports the card, and heat output. Otherwise, the difference in power cost might be $5-10 per year depending on the number of hours you game at full bore. People talk about this like they're buying a new A/C or a refrigerator. Nvidia's cards are the power efficiency kings, but they also often come at a higher price point, and you probably won't make that money back in power savings before your next upgrade, depending on your gaming habits.
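
A quick back-of-the-envelope in Python to show why (the wattage gap, hours, and electricity rate below are just illustrative assumptions, not measured numbers):

# Rough annual cost of a GPU power-draw gap (e.g. 5700 XT vs 2060S).
# All inputs are illustrative assumptions; plug in your own numbers.
extra_watts = 60          # assumed extra board power under load (W)
hours_per_day = 2.5       # assumed gaming hours per day at full load
rate_usd_per_kwh = 0.13   # assumed electricity rate ($/kWh)

extra_kwh_per_year = extra_watts * hours_per_day * 365 / 1000
annual_cost = extra_kwh_per_year * rate_usd_per_kwh
print(f"{extra_kwh_per_year:.0f} kWh/year -> ${annual_cost:.2f}/year")

That works out to roughly 55 kWh and about $7 per year, squarely in the $5-10 ballpark.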
 
All of the 5700 driver issues have been fixed. The 5700 can easily hit 5700 XT performance and has regularly been available for near $300. At that price point it's a no-brainer.
 

AMD's GPU driver issues are legendary at this point; historically, a release without issues has been the exception.


Of course I'll throw in my own anecdote: my most recent experience has been pretty good, though I don't like their 'slick' new driver interface much.

I don't hesitate to recommend them where they make sense, and that's pretty much every price point short of a 2060, with up to 2070 levels being a tossup.

I see the 1660 only really being relevant for the Turing video transcode block which might be a killer feature for some, but not many.
 
My last upgrade was from an R9 280X to a GTX 1070 Ti. Honestly, my experience is that AMD and Nvidia are pretty much the same with regard to driver support. As a matter of fact, I would contend that's pretty much been true since ATi released the Catalyst driver. People love to bitch constantly about AMD and their driver support like they're the only ones who ever had any problems, because their Rage 128 card was kind of crap and they've used Nvidia ever since, seemingly forgetting that Nvidia has had plenty of their own issues over the years as well. Take it from me, driver support is not a deciding factor between those cards. I typically go back and forth between the brands and have never noticed any real difference.

If you can wait, upgrade when the new cards come out next year. If you can’t, just get whatever has the best deal, because RTX ray tracing on a 2060 at 1440p won’t be playable anyway.

On power consumption, your main considerations should be whether your PSU supports the card, and heat output. Otherwise, the difference in power cost might be $5-10 per year depending on the number of hours you game at full bore. People talk about this like they're buying a new A/C or a refrigerator. Nvidia's cards are the power efficiency kings, but they also often come at a higher price point, and you probably won't make that money back in power savings before your next upgrade, depending on your gaming habits.

Exactly the point of my post above. You'd think people were scarred from the Rage 128 driver days... :rolleyes: Nvidia has arguably had worse driver issues.
 
I'd argue AMD has worse driver issues with the launch of a new product, but after a few months they're pretty much on a level playing field in regards to drivers. I've had issues with both.
 
So I am guessing the 5700: cheapest, and very good performance. I could also afford an RTX 2070 Super, but is it worth the extra money? I want 60 fps at 1440p for at least 2-3 years.
 
So I am guessing the 5700: cheapest, and very good performance. I could also afford an RTX 2070 Super, but is it worth the extra money? I want 60 fps at 1440p for at least 2-3 years.

The 5700 is definitely the best value. I imagine there will be good Cyber Monday deals too. In some games you might have to run "high" instead of "ultra" (and I have trouble telling the difference). There are tons of 1440p reviews; go have a looksie. Besides ray tracing, where a viable card costs about twice the budget of a 5700 and the tech is in its infancy, the graphics demands of games have been pretty stagnant IMO.
 
If you're after absolute best value and can put in the time to stretch it even further, RX 5700 with a BIOS flash will kick everything's ass. Pound for pound, the best in this segment.

I personally do prefer Nvidia drivers, though. I have an RX 5700 XT rig right next to my 2070S rig, and I can confirm what many say about AMD drivers being relatively rough around the edges.
 
If you're after absolute best value and can put in the time to stretch it even further, RX 5700 with a BIOS flash will kick everything's ass. Pound for pound, the best in this segment.

I personally do prefer Nvidia drivers, though. I have an RX 5700 XT rig right next to my 2070S rig, and I can confirm what many say about AMD drivers being relatively rough around the edges.
A BIOS flash is something I am not interested in at all. For the Nvidia card, which one do you suggest: the 2060 Super or the 2070 Super?
 
I'd recommend the card that best fits your budget, resolution, and requirements.

The RTX 2060S is the better relative value, being $100 cheaper (~25% less) and maybe 10-15% slower on average, but that's just one metric to consider. It should do fine at 1440p @ 60Hz or 1080p @ 120+ Hz. If you have a 1440p 120+ Hz display (or higher), then I'd stretch for the 2070S.
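
For a quick value comparison (a rough sketch in Python; the prices below are assumed ~MSRP figures, and the performance gap is the 10-15% average mentioned above):

# Perf-per-dollar sanity check: 2060S vs 2070S.
# Prices and performance ratios are assumptions for illustration only.
price_2060s = 399.0            # assumed price of the 2060S (USD)
price_2070s = 499.0            # assumed price of the 2070S (USD)

for perf_gap in (0.10, 0.15):  # 2070S assumed 10-15% faster on average
    value_2060s = 1.0 / price_2060s              # 2060S perf normalized to 1.0
    value_2070s = (1.0 + perf_gap) / price_2070s
    ratio = value_2070s / value_2060s
    print(f"gap {perf_gap:.0%}: 2070S delivers {ratio:.0%} of the 2060S perf per dollar")

Even at the optimistic 15% gap, the 2070S gives you roughly 8% less performance per dollar: you pay ~25% more for ~15% more speed.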
 
I did just read today that they are putting the vanilla 2070 back into production. It will probably slot in between the 2060 Super and the 2070 Super.

I think both the 2060 Super and the 2070 Super are overpriced for what they are. The 2060 Super should be a $329 card.
 
Watch any reviewer's YouTube video. They almost all recommend the 5700 XT if ray tracing isn't a die-hard requirement for you. All those folks do this day in and day out; I doubt they are all wrong.
 
I did just read today that they are putting the vanilla 2070 back into production. It will probably slot in between the 2060 Super and the 2070 Super.

I think both the 2060 Super and the 2070 Super are overpriced for what they are. The 2060 Super should be a $329 card.

That's very interesting if true. Source by any chance?

On paper, the 2060S and 2070 are nearly identical, and the performance reflects that. Even if they're priced appropriately, I cannot see how one will not cannibalize the other.

 
I went back and forth on the 5700 XT, 2060S, and 2070S myself. I ended up landing on the EVGA 2060S https://www.evga.com/products/product.aspx?pn=08G-P4-3067-KR ($419 list; got mine open-box for less) and couldn't be happier. At the time the 2070S cards were scarce, and ones close to MSRP even more so.

Running a 3900X and the 2060S at 1440p. Everything I play runs over 60 fps with high details, with many titles hitting the 144Hz refresh rate of my monitor.

If you plan on keeping this next GPU for longer, the 5700 XT or 2070S may have more legs, BUT I'll take dumping a few settings and saving the 100 bucks myself vs. the 2070S. The 2070S is just too much more money for the performance bump, and diminishing returns start to kick in IMO. The 5700 XT is a much better value for the price vs. both, really.

The mid range is called a sweet spot for a reason.

Happy Hunting!
 
The latest leaks from Nvidia themselves about the Ampere cards launching around March/April 2020 suggest they will be much faster in regular use (like much more than 2 times faster), extremely much faster in ray tracing (like much more than 10 times faster than RTX), less expensive than Turing and much less expensive at the high end, and with double the VRAM (think a 3080 Ti with 24GB of GDDR6 at $1,000).
So just buy an RX 570 8GB and wait for the new Nvidia cards that will blow away the current Turing and the not-yet-revealed RDNA2 AMD cards by a huge, unprecedented gap.
Any graphics card you buy today will be completely obsolete in 6 months, in a way that has never happened before between generations.
This is because Nvidia's graphics cards are so much more efficient than AMD's, and because they worked hard on these cards for nearly 3 years, waiting for AMD to catch up, which AMD still hasn't done even while using a much better chip technology (7nm vs 12nm). Nvidia is jumping over 3 generations of chip technology, from 16nm+ (aka 12nm) past 7nm and 7nm+ to a 6nm EUV equivalent of TSMC's, but made by Samsung (who calls it 7nm+ EUV).
Furthermore, to replace Turing, the difference will be so huge that Nvidia will have to organize some kind of nearly-free Turing trade-in, as they did with Kepler when Maxwell came out.
So don't be a loser: stay tuned and hold your money until after Christmas, because anything you buy before then is money lost.
The new Ampere chips will be huge even on 7nm+ EUV, and they will have very few defects thanks to the EUV technology. To keep power consumption down, the voltage will be much lower than usual, a never-before-used low voltage that won't permit overclocking, but the clocks will be much higher and the TDP will still be lower. Ampere cards may allow some games to run fully ray traced at Full HD without even needing rasterization. So sometimes it will be non-hybrid ray tracing, and that is new too. It was expected around the year 2025, but Nvidia is bringing it to market now, 5 years early.
 
The latest leaks from Nvidia themselves about the Ampere cards launching around March/April 2020 suggest they will be much faster in regular use (like much more than 2 times faster), extremely much faster in ray tracing (like much more than 10 times faster than RTX), less expensive than Turing and much less expensive at the high end, and with double the VRAM (think a 3080 Ti with 24GB of GDDR6 at $1,000).
So just buy an RX 570 8GB and wait for the new Nvidia cards that will blow away the current Turing and the not-yet-revealed RDNA2 AMD cards by a huge, unprecedented gap.
Any graphics card you buy today will be completely obsolete in 6 months, in a way that has never happened before between generations.
This is because Nvidia's graphics cards are so much more efficient than AMD's, and because they worked hard on these cards for nearly 3 years, waiting for AMD to catch up, which AMD still hasn't done even while using a much better chip technology (7nm vs 12nm). Nvidia is jumping over 3 generations of chip technology, from 16nm+ (aka 12nm) past 7nm and 7nm+ to a 6nm EUV equivalent of TSMC's, but made by Samsung (who calls it 7nm+ EUV).
Furthermore, to replace Turing, the difference will be so huge that Nvidia will have to organize some kind of nearly-free Turing trade-in, as they did with Kepler when Maxwell came out.
So don't be a loser: stay tuned and hold your money until after Christmas, because anything you buy before then is money lost.
The new Ampere chips will be huge even on 7nm+ EUV, and they will have very few defects thanks to the EUV technology. To keep power consumption down, the voltage will be much lower than usual, a never-before-used low voltage that won't permit overclocking, but the clocks will be much higher and the TDP will still be lower. Ampere cards may allow some games to run fully ray traced at Full HD without even needing rasterization. So sometimes it will be non-hybrid ray tracing, and that is new too. It was expected around the year 2025, but Nvidia is bringing it to market now, 5 years early.

2x the raster performance and 10x the RT? WTF are you smoking? That's as bad as the 5GHz 16-core AMD Zen 2 at $499 BS.
 
2x the raster performance and 10x the RT? WTF are you smoking? That's as bad as the 5GHz 16-core AMD Zen 2 at $499 BS.

It's not impossible... but it is fairly unlikely. Let's say Nvidia could do that and double the raster performance: why would they?

Wouldn't they just shrink the GPU sizes to increase their margins instead, with, say, a 25% to 30% raster performance increase?

[The increase in RT performance is... plausible, but only if we're talking about it in a vacuum, or in, say, the Quake II RTX demo and not something like Battlefield V.]
 
How ready will Samsung be to make big EUV 7nm node chips??? And how can the 3080 Ti have 16GB of RAM - a 512-bit bus??? I don't see that; 384-bit with faster GDDR6 would be my guess. I won't worry or think too much more about Ampere at this time.
 

Sorry, but WCCFtech is NOT a news site. It ranges from questionable rumors to completely made-up internal extrapolations.

Tip for the future: if you want a site that has a track record of credible leaks, it's videocardz.com. Just about the only credible leak source, IMO.

It's not impossible... but it is fairly unlikely. Let's say Nvidia could do that and double the raster performance: why would they?

Wouldn't they just shrink the GPU sizes to increase their margins instead, with, say, a 25% to 30% raster performance increase?

[The increase in RT performance is... plausible, but only if we're talking about it in a vacuum, or in, say, the Quake II RTX demo and not something like Battlefield V.]


It might not be physically impossible, but it is economically impossible. Transistor costs are stagnant, and a massive increase in RT and raster performance would require a massive increase in transistor count. That entails a massive increase in build cost, which is not going to lead to cheaper prices.

Likewise, the RT performance increase is NOT plausible, even on something like Q2 RTX.

Just because Quake II uses RT for all effects doesn't mean the shader cores are idle. Far from it.

All the RT cores do is calculate intersections. You still have to shade the pixels that result from those intersections, and you still have to denoise those effects.

From the frame time breakdown shown for Control, a typical pure RT effect might use the HW like this:

40% RT intersection testing, 30% shading those results, 30% denoising those results.

Even if you built 10x as many RT cores, that only reduces your frame time from 100% to 64%, and that is on a pure-RT-path game like Q2.

So 10x the RT cores only gives you a 56% improvement in frame rate on pure RT titles.

There is no quick fix for improving RT performance. You need balanced improvement in all unit counts, not just RT cores.
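
To sanity-check that arithmetic, here's a tiny Amdahl's-law-style sketch in Python, using the 40/30/30 frame-time split assumed above:

# Speed up only the RT-intersection slice of a frame and see
# what happens to the overall frame rate (Amdahl's law).
rt, shade, denoise = 0.40, 0.30, 0.30   # frame-time split from the post above

def frame_speedup(rt_multiplier: float) -> float:
    """Overall frame-rate speedup when only RT intersection testing scales."""
    new_frame_time = rt / rt_multiplier + shade + denoise
    return 1.0 / new_frame_time

print(f"10x RT cores -> {frame_speedup(10):.2f}x frame rate")      # ~1.56x
print(f"Unlimited RT cores -> {1 / (shade + denoise):.2f}x cap")   # ~1.67x

Even unlimited RT cores cap out around 1.67x with this split, which is exactly why balanced scaling across all the units matters.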
 