Poll: AMD RX 5700 Series vs NVidia RTX Super. Which gets your money?

AMD RX 5700 Series vs NVidia RTX Super. Which gets your money?

  • NVidia RTX 2060 (old 6GB version)

    Votes: 0 0.0%
  • AMD RX 5700

    Votes: 18 7.2%
  • NVidia RTX 2060 Super

    Votes: 12 4.8%
  • AMD RX 5700 XT

    Votes: 81 32.5%
  • NVidia RTX 2070 Super

    Votes: 30 12.0%
  • Already have close enough/better AMD card for now, waiting on next big thing

    Votes: 19 7.6%
  • Already have close enough/better NVidia card for now, waiting on next big thing

    Votes: 89 35.7%

  • Total voters
    249
I wanted a 5700 XT reference board so I could add either a Morpheus II or an Arctic Accelero Xtreme III VGA cooler to it. Wouldn't that be better than most custom boards anyway?
What Dan_D said earlier.
AMD reference boards are very good quality and more than overbuilt for most OCing. I've also heard from others that they are better quality than Nvidia boards. They also fit the widest range of coolers, like the ones you listed.
I also have an Arctic Xtreme III that I'm wanting to add to my V64 to further OC it, or to a big Navi/next-gen card. It's harder to mount on the V64 than on the non-HBM cards.
 
I'll wait and see what the AIB cards have to offer. I'm holding off until around November/December, but I'd like to have a new card.
 
Huh? You have no clue about my work. You are looking at it from the outside in. I've banned more red than green, actually; that's all I will say.

You need to quit putting things on a personal level. That's what will get you banned. It's not the Red or Green status, it's the context of the post.

Sorry, mods, if this is off topic. Feel free to delete.


I'm stating my personal experience on those forums, sorry if it hurts your feelings.


You've got to stop with the wounded Nvidia fan stick. Nvidia has more than earned its criticism over the years, and AMD has been roasted for its failures as well. You are of course always free to buy whatever you feel is best for you, but you've got to stop thinking you can force everyone to see the world your way. I have owned cards from both companies and have been happy and disappointed with both. But this whole "it has to have RTX or it's a failure" thing is getting old. Ray tracing in its current form is very weak and barely worth the effort for the massive performance hit you take. Will we get there one day, and will everyone care? Sure, but not a soul will be using these cards when that day happens.

So I have a "wounded NVIDIA fan stick (sp)," yet I'm the one getting personal? Nice logic there. I presented facts to back up what I said, especially to dispute the misinformation put out by OnceSetThisCannotChange. If you take it as some kind of fanboying or NVIDIA shtick, that's on you. I'm not going to praise a card that is 3 years too late to the party, with nothing new to tout other than "hurr durr, we're on 7 nm and don't have ray tracing," and try to market it on these forums as a good thing.

P.S. Contrary to the picture you and Rvenger are trying to paint, I'm not an NVIDIA fanboy. I've got no emotional or monetary stake in them. I buy what is best, and they certainly excel in that department; AMD is lacking, and I call them out for it. I personally look forward to the day the GPU market has real competition via Intel, because AMD has clearly failed.

[Attachment: NVIDIASTICK.png]
 
Last edited:
With regard to your bolded statement: if you're the same guy who mods the AT forums, then I can see why, because you guys ban anyone over there who says anything positive about NVIDIA or contrary to the pro-AMD status quo. Very slanted and biased forums, and why I never went back.

Ah....now things make a lot more sense...thanks for pointing this out ;)
 
I like ray tracing but not DLSS, as I think tensor cores were a mistake to include in a consumer card, and I'm sure Ampere will rectify that.

I have a sneaking suspicion that DLSS is just the start of what NVIDIA intends to use their Tensor cores for.
Deep learning/A.I. is making huge strides these days...and I think the Tensor cores are here to stay.
I could be mistaken though /shrugs
 
I have a sneaking suspicion that DLSS is just the start of what NVIDIA intends to use their Tensor cores for.
Deep learning/A.I. is making huge strides these days...and I think the Tensor cores are here to stay.
I could be mistaken though /shrugs

True, AI is the future of pretty much everything, so it will be interesting to see what direction NVIDIA decides to go. With regard to DLSS, it definitely has a lot of room to improve, as the current implementation is lacking in fidelity. However, one could always couple NVIDIA's game filter with DLSS to improve the picture, and I'm surprised this hasn't been brought up in reviews.
 
Why would someone be stupid enough to buy the 5700 XT and spend an extra $100+ on it just to match the 2070S in ONE game (which is all you have shown it can do with the heavy modifications + WC OC) and still lack RTX? Again, you provide shit advice. If you go back and look at that Tom's graph, the 2070S on the stock NVIDIA cooler gained around 7-8 fps with the OC they used without ANY mods, and the 5700XT with a BIOS mod + water cooling managed to gain 10-12 fps. Yeah, really great value there, buddy.

Not to mention the first part of your statement is blatantly false/misleading, as you don't do "better in most games" with the 5700XT against a 2070S; rather, it falls behind by a decently large margin. Don't try to shift the narrative by saying "similarly priced competition," because you originally tried to pair it against a 2070S and proclaim it was a better value and equal performance with the caveat of BIOS editing (warranty void) and $100+ in mods, and now you're backtracking.

Even if you used an AIB card, it wouldn't achieve the same results as the WC + BIOS-edited card that Tom's showed.

P.S. Games that people actually play every day by the millions:

[Attached benchmark charts: attachments 175035 and 175036]

Sorry, but AMD's latest marvel still can't keep up with 3-year-old Pascal in games that matter.

I'm stating my personal experience on those forums, sorry if it hurts your feelings.




So I have a "wounded NVIDIA fan stick (sp)," yet I'm the one getting personal? Nice logic there. I presented facts to back up what I said, especially to dispute the misinformation put out by OnceSetThisCannotChange. If you take it as some kind of fanboying or NVIDIA shtick, that's on you. I'm not going to praise a card that is 3 years too late to the party, with nothing new to tout other than "hurr durr, we're on 7 nm and don't have ray tracing," and try to market it on these forums as a good thing.

P.S. Contrary to the picture you and Rvenger are trying to paint, I'm not an NVIDIA fanboy. I've got no emotional or monetary stake in them. I buy what is best, and they certainly excel in that department; AMD is lacking, and I call them out for it. I personally look forward to the day the GPU market has real competition via Intel, because AMD has clearly failed.

So calling people stupid isn't making it personal? Then you proceed to present your opinion as fact, rage some more, and make a wild-ass guess that an AIB card can't do the same despite not having a clue whether that's true or not. One would expect a water-cooled card to do better, but I have seen air-cooled cards hold just as high clocks on the Nvidia side. People mod cards all the time because this is an enthusiast site, and most of the general public never run a card beyond stock, ever.

Then you pick Fortnite, which is a multiplayer game that is almost impossible to get a reliable bench on, which is why almost no one benches it. Plus, most times when you get outliers like that, it just means the game needs a bit of tweaking for the new hardware, which is not uncommon. The complaint about Rainbow Six I don't get, as the minimum frames are over 100, so who cares what the maximums are. The 2070 Super should be better than the 5700 XT, yet the 5700 XT can match it in certain titles with a healthy overclock. I don't see how, on an enthusiast site, it's a bad thing to discuss the value of how well a card performs overclocked.

Also, if you're getting banned on other forums, then yeah, you're making it personal. And if you think Intel is actually going to be even close to competitive with what they have, then I can't wait to laugh at you when they launch and we see what they can do. Intel isn't going to have something you want on their initial try. You even bash the [H] on The FPS Review forums as well and then rage on about AMD and argue with a mod. https://forums.thefpsreview.com/threads/amd-radeon-rx-5700-xt-and-rx-5700-video-card-review.498/ Picking on the GPU side I can get, because they don't have a top competitor to the 2080 Ti, but bashing Ryzen shows your true colors, as they are quite competitive there. I also don't get talking up RTX when you're all into multiplayer first-person shooters, where it's going to be the first thing you turn off to keep the fps up.
 
So calling people stupid isn't making it personal? Then you proceed to present your opinion as fact, rage some more, and make a wild-ass guess that an AIB card can't do the same despite not having a clue whether that's true or not. One would expect a water-cooled card to do better, but I have seen air-cooled cards hold just as high clocks on the Nvidia side. People mod cards all the time because this is an enthusiast site, and most of the general public never run a card beyond stock, ever.

Then you pick Fortnite, which is a multiplayer game that is almost impossible to get a reliable bench on, which is why almost no one benches it. Plus, most times when you get outliers like that, it just means the game needs a bit of tweaking for the new hardware, which is not uncommon. The complaint about Rainbow Six I don't get, as the minimum frames are over 100, so who cares what the maximums are. The 2070 Super should be better than the 5700 XT, yet the 5700 XT can match it in certain titles with a healthy overclock. I don't see how, on an enthusiast site, it's a bad thing to discuss the value of how well a card performs overclocked.

Also, if you're getting banned on other forums, then yeah, you're making it personal. And if you think Intel is actually going to be even close to competitive with what they have, then I can't wait to laugh at you when they launch and we see what they can do. Intel isn't going to have something you want on their initial try. You even bash the [H] on The FPS Review forums as well and then rage on about AMD and argue with a mod. https://forums.thefpsreview.com/threads/amd-radeon-rx-5700-xt-and-rx-5700-video-card-review.498/ Picking on the GPU side I can get, because they don't have a top competitor to the 2080 Ti, but bashing Ryzen shows your true colors, as they are quite competitive there. I also don't get talking up RTX when you're all into multiplayer first-person shooters, where it's going to be the first thing you turn off to keep the fps up.

The post you quoted was edited, so nice try, but even then, I said his advice was shit (that's not an attack on him; I didn't say he was a shit person). And yes, I can confidently say an AIB air-cooled 5700XT card will not match a BIOS-modded card under a water block; I have years and years of experience with both BIOS modding (and hardware shunt mods) and building custom WC systems to know better. You cite NVIDIA cards doing well under air in the past vs WC, but what about AMD? Because that is who we are discussing here; try not to confuse the subject matter.

Modding cards and overclocking them is fine; I love it and encourage it. What I find disingenuous is when people come in and try to market it as a positive against a stock rival card and claim it is better when that is far from the case. As for Fortnite (the biggest and arguably most important PC game right now), obviously Techspot/HWUB is capable of benchmarking it, so it must not be that difficult, huh? The same holds true for any BR; they can all be benchmarked if you aren't lazy (which most reviewers are). BTW, R6:S numbers matter to those running high-refresh monitors (which most competitive gamers do).

I don't think Intel will initially be competitive with NVIDIA at the high end but I do expect them to release refreshes more frequently than AMD and to achieve parity with NVIDIA sooner than AMD ever could or will.

To address your final point, I used to complain right here on [H] about their reviews not having MP games mixed in; that's nothing new. I took the same complaint to Brent's new forum in hopes they'd listen and change their ways, but apparently they're satisfied with doing the same old. As for bashing Ryzen, you obviously didn't read my posts and decided to take them out of context; I disputed the assertion that AMD was beating Intel in market share, as it clearly is not and never will be.
 
Last edited:
Why would someone be stupid enough to buy the 5700 XT and spend an extra $100+ on it just to match the 2070S in ONE game (which is all you have shown it can do with the heavy modifications + WC OC) and still lack RTX? Again, you're providing terrible advice to gamers who don't know better. If you go back and look at that Tom's graph, the 2070S on the stock NVIDIA cooler gained around 7-8 fps with the OC they used without ANY mods, and the 5700XT with a BIOS mod + water cooling managed to gain 10-12 fps. Yeah, really great value there, buddy.

Not to mention the first part of your statement is blatantly false/misleading, as you don't do "better in most games" with the 5700XT against a 2070S; rather, it falls behind by a decently large margin. Don't try to shift the narrative by saying "similarly priced competition," because you originally tried to pair it against a 2070S and proclaim it was a better value and equal performance with the caveat of BIOS editing (warranty void) and $100+ in mods, and now you're backtracking.

Even if you used an AIB card, it wouldn't achieve the same results as the WC + BIOS-edited card that Tom's showed.

P.S. Games that people actually play every day by the millions:

[Attached benchmark charts: attachments 175035 and 175036]

Sorry, but AMD's latest marvel still can't keep up with 3-year-old Pascal in games that matter.

The thread is about which card gets my money, and if I was buying today, it would be the 5700XT, waterblock or not.

Without a BIOS mod it looks better than the 2060S, which is priced the same at $400. With the WB, it costs the same as the 2070S ($500), and at least in one game (which generally favours the green team) it is still faster. I certainly did not expect that before the launch, and well, that's enough for me.

As for saying that buying the 5700XT over the 2060S is poor advice, I disagree. At this price point it makes sense to buy the XT, overclocked or not. Waiting a month and getting an AIB card would be better still.

Here is HBU's take on the XT.



The plain 5700 is equal to the 2060S, and the XT version is 2% slower at stock with the blower fan vs the 2070S. You can also OC the XT about 20% with the blower fan :D. Noise will surely be better once AIB cards are out.

Which one would you buy for a $50 difference in the first case, or $100 in the second?

Even in the two games you posted, the 2060S, which is the actual price match to the 5700XT, is equal in Fortnite, another game heavily favouring Nvidia, and slower in R6.

As far as I am concerned, this round in that price range it's pretty clear, surprisingly enough. Nvidia is surely going to sell enough due to brand recognition, even though RTX at that price level is just a marketing gimmick.
 
1080 Ti here, so maybe in 2020. Which side? Well, it depends on who releases the best card first; it doesn't matter. I am honestly expecting it to be NVidia, with AMD six months to a year behind.
 
The thread is about which card gets my money, and if I was buying today, it would be the 5700XT, waterblock or not.

Without a BIOS mod it looks better than the 2060S, which is priced the same at $400. With the WB, it costs the same as the 2070S ($500), and at least in one game (which generally favours the green team) it is still faster. I certainly did not expect that before the launch, and well, that's enough for me.

As for saying that buying the 5700XT over the 2060S is poor advice, I disagree. At this price point it makes sense to buy the XT, overclocked or not. Waiting a month and getting an AIB card would be better still.

Here is HBU's take on the XT.



The plain 5700 is equal to the 2060S, and the XT version is 2% slower at stock with the blower fan vs the 2070S. You can also OC the XT about 20% with the blower fan :D. Noise will surely be better once AIB cards are out.

Which one would you buy for a $50 difference in the first case, or $100 in the second?

Even in the two games you posted, the 2060S, which is the actual price match to the 5700XT, is equal in Fortnite, another game heavily favouring Nvidia, and slower in R6.

As far as I am concerned, this round in that price range it's pretty clear, surprisingly enough. Nvidia is surely going to sell enough due to brand recognition, even though RTX at that price level is just a marketing gimmick.


I agree the 5700 is a better buy than the 2060 if you don't care about RT, and similarly the 5700XT over the 2060S. However, that wasn't what you originally talked about; you specifically said the 5700XT would be a better pick over the 2070S because of Tom's article.

We still have to see how AIB cards will be priced though.
 
AMD tends to over-design and over-build its cards. The VRMs, PCBs, etc. always feel higher quality to me than any NVIDIA reference design. When you look into the VRMs, that's generally the case. That said, I think you're right that they went with a shitty cooler design. I'm not sure if they left 50% performance on the table. That seems unlikely. If AMD could have competed with NVIDIA in higher performance brackets, it would.
Cards with higher power draw and heat require more robust designs, no surprise there, e.g. a 300W Vega vs a 180W GTX 1080, or an R9 290X vs a GTX 980, etc.
 
Here is HBU's take on the XT.

The plain 5700 is equal to the 2060S, and the XT version is 2% slower at stock with the blower fan vs the 2070S. You can also OC the XT about 20% with the blower fan :D. Noise will surely be better once AIB cards are out.

Sure, when you only test about 10 games and you have a massive outlier like Forza 4 tilting the results.

But then you look at TPU, which tested about 20 games, and the picture changes quite a bit, putting the 5700XT closer to 2060S performance than to 2070S performance.
https://tpucdn.com/review/amd-radeon-rx-5700-xt/images/relative-performance_2560-1440.png

Here the 2060S has 95% of the performance of the 5700XT. Priced about equal, performance very close, and you get RTX for free...

It really depends so much these days on what games you want to play. Forza 4 fans, AMD has your card.
 
With regard to your bolded statement: if you're the same guy who mods the AT forums, then I can see why, because you guys ban anyone over there who says anything positive about NVIDIA or contrary to the pro-AMD status quo. Very slanted and biased forums, and why I never went back.

AT = Ars or Anand?

Anand forums are... odd. The super moderator is a massive AMD fan and gets into arguments with everyone who says anything positive about Intel/NVidia. Kind of hard when the mod is so blatantly one-sided.

That, and they have a very strange "profanity rule" that includes saying something is BS. Just that abbreviation is considered profane...

Ars, I really haven't noticed anything in the moderation (Except that reviewer who hates Tesla).
 
The 5700 XT blower cooler is not even really that bad. A lot of the reviews make it seem like it's a hot, loud mess, but it's really not. You can't hear the fan at idle, and under load it's a quiet whisper. The default fan curve from AMD is very conservative and results in the higher temps you see in reviews. I'm going to try the washer mod and a repaste; that seems to drop temps quite a bit.
 
I'm not sure if they left 50% performance on the table. That seems unlikely. If AMD could have competed with NVIDIA in higher performance brackets, it would.

Call 50% optimistic, but between cooling and power-table optimizations, I think that a properly raced-out 5700XT might get close.
 
The 5700 XT blower cooler is not even really that bad. A lot of the reviews make it seem like it's a hot, loud mess, but it's really not. You can't hear the fan at idle, and under load it's a quiet whisper. The default fan curve from AMD is very conservative and results in the higher temps you see in reviews. I'm going to try the washer mod and a repaste; that seems to drop temps quite a bit.

Just, you know, don't increase the power target and try to overclock it. You have to leave that potential alone if you want it to be even remotely quiet.
 
AT = Ars or Anand?

Anand forums are... odd. The super moderator is a massive AMD fan and gets into arguments with everyone who says anything positive about Intel/NVidia. Kind of hard when the mod is so blatantly one-sided.

That, and they have a very strange "profanity rule" that includes saying something is BS. Just that abbreviation is considered profane...

Ars, I really haven't noticed anything in the moderation (Except that reviewer who hates Tesla).

Anandtech...biased forum/mod.
 
This is astroturfing at its finest. Considering your join date, your ferocious post frequency, and the content and ad-like nature of your posts, it is highly suspect. Just saying you own an Nvidia card in your main computer is not enough. Someone could say they own 4 RTX Titans with a slice of ham cooking in their main rig; it is the internet, posts are cheap. Even a pic of an RTX card coming from you would be suspect because of how ingrained in marketing you appear to be.

You're acting like 7nm cards are not coming from Nvidia and Nvidia's current products are not competitive with AMD. This is absolutely false. The wafer cost of 7nm negates much of the production-cost advantage AMD would otherwise have over Nvidia's large dies, and the performance-per-watt similarity is mostly a result of the process node rather than the architecture.

Navi is a competitive product, which is great, and the 5700 XT is faster at the $400 price point than the RTX 2060 Super, but it isn't quite enough to own the market, because, as the market has shown, the general public is willing to pay more for Nvidia because of the brand, even when there is a performance deficit. AMD needs to be aggressive to win market share, and the crappiness of AMD's game bundle shows they are not being aggressive.

Add in that the more titles you include, the smaller the advantage becomes, as TechPowerUp's or BabelTech's reviews show, and there is a good chance the RTX 2060 Super will outsell the 5700 XT. The public takes branding much more into account than forums do, and add two games that are significantly more valuable than a 3-month game pass, and there is a good chance the average Joe will still pass over AMD. AMD did what it needed to compete but left Nvidia with a lot of cards to play. It can still drop the price further, and from what I have seen, there are partner cards with little markup over reference cards. AMD will have to ensure that partner cards do not come with high markups to compete well with Nvidia AIB cards. E.g., a $450 AIB card will struggle to look good to the general public vs something like the MSI RTX 2070 Super Gaming X, which was selling at $499.99 recently.

AMD will need to be aggressive to compete against Nvidia because they are not the premium brand, no matter how much you shout it. Too much damage was done during 28nm and 16/14nm for it to become the premium brand overnight. Without a flagship that outperforms Nvidia's fastest, they will continue to play the value angle, and the 5700 cards slotting beneath the pricing of Nvidia's offerings is proof of this.

No matter how much you try to discount it either, RTX ray tracing does carry some weight because it is the first ray tracing implementation and it is coming to some of the biggest games, if not the biggest games. MechWarrior 5 and Cyberpunk 2077 are huge games, particularly the latter, which will help sell Nvidia-based cards. Big games in the future will still use Unreal Engine 4, including the biggest game in the world right now, Fortnite.

AMD has a tough battle ahead, and next year will not be as easy as you think with Nvidia moving to 7nm. You don't think a 100% improvement in transistor density, a 30% improvement in performance, or a 60% improvement in efficiency (the improvements TSMC cites for 7nm vs 16nm FinFET) had anything to do with AMD catching up to Nvidia? That is what you're implying with your posts and talk of next year. Apply these improvements to Turing and you have a situation that is not much different from Polaris vs Pascal. I.e., apply a die shrink to the RTX 2070 core and all of a sudden you have a chip that is 30% faster than an RTX 2070, or something with RTX 2070 performance at about 90 watts in roughly 220mm², or a balance between the two, like 15% faster than an RTX 2070 at 120 watts. The same applies to the whole RTX lineup today.

The nodal advantage AMD currently possesses for CPUs is not remotely as exclusive in the GPU space. Nvidia is much more aggressive than Intel when it comes to execution. It is best not to underestimate them if you're AMD. Additionally, if Nvidia decides to price their 220mm² dies in the $250 to $300 range, expect them to wipe out much of AMD's goodwill, because of AMD's actions today of selling Polaris replacements at $350 to $400, much like Nvidia lost goodwill when it initially priced Turing so high. AMD was expected to save the market, not simply be competitive with Nvidia's cards. 250mm² dies selling at $350/$400 is merely the underdog brand trying to be competitive against the market-share/brand leader and making a heavy buck doing it.


But who are you?

Attack the messenger, then give your opinion on broken Turing cards as proof of how Nvidia is going to dominate consoles, cloud, and gaming? You're saying this with a straight face & you're a techie..? Sounds to me like you've blown so much smoke over your posting career that it's also become your life?

In respect, I will respond since you attempted to form a rebuttal. (Even though you actually didn't look at my post cadence, or frequency.)




I am not acting like Nvidia is coming out with 7nm, because they aren't.

Nvidia's 7nm GPU is about a year away from shelves. And Nvidia's answer to AMD's 7nm 2nd-generation GPU was Turing SUPER. That is all Nvidia has... just more Turing. Turing isn't any more powerful than Pascal; Turing doesn't do much more in Battlefield than Pascal for an $800 upgrade. As such, Turing doesn't do anything more for the gamer than Pascal. Turing and RTX are flops for gamers, because ad-hoc ray tracing is not what any gamer wanted for their hard-earned dollars. Three years after Pascal's high end, gamers were looking for more performance, not more pseudo fluff & marketing.

What I am saying is mirrored in nearly every review and everywhere else in the industry. Nvidia's ray tracing is an ad-hoc attempt which didn't pan out. Whereas Microsoft's DirectX Ray Tracing is the future, and the next GPUs to be released will have real ray tracing implemented. (And not using Nvidia's pseudo "RTX On" shenanigans.)



Subsequently, it seems like what you are actually trying to defend is that Nvidia used to be the premium brand.

And having to argue those has-been facts angers you: AMD's RDNA was kept so secret, and was so well developed around so many game developers' needs/wants, that it simply outclasses and outshines anything else. The entire gaming industry is on board with RDNA. So the tides have changed... and as such Nvidia's DLSS isn't the best (just more of Jensen's marketing), while AMD's FidelityFX will be standardized in all games soon. Because Nvidia is no longer the leading force in the gaming industry, AMD is..!!


Gamers today are looking forward to all the games coming out. And it is obvious RDNA is a better choice.
 
Just, you know, don't increase the power target and try to overclock it. You have to leave that potential alone if you want it to be even remotely quiet.
The same applies to my card with its reference cooler. I'm not endorsing the AMD design, but I think we are all used to going AIB for a better cooler.
 
But who are you?



What I am saying is mirrored in nearly every review and everywhere else in the industry. Nvidia's ray tracing is an ad-hoc attempt which didn't pan out. Whereas Microsoft's DirectX Ray Tracing is the future, and the next GPUs to be released will have real ray tracing implemented. (And not using Nvidia's pseudo "RTX On" shenanigans.)



How do you think ray tracing is currently implemented with Turing in DX12 games?

DXR

JESUS CHRIST.

P.S. I ACTUALLY OWN A 5700XT, I DON'T WANT YOU ON MY TEAM.
 
How do you think ray tracing is currently implemented with Turing in DX12 games?
DXR
JESUS CHRIST.
P.S. I ACTUALLY OWN A 5700XT, I DON'T WANT YOU ON MY TEAM.

Nvidia's solution didn't work (how is ray tracing implemented in Battlefield..? o_O...), so Nvidia is now backtracking away from their own proprietary pre-baked AI-learning effects and is now showing Turing trying to use a limited set of the industry-standard DirectX RT. (Do you understand this?)

And... how well do you think Turing implements DXR? (Which was NOT designed around the Microsoft DirectX Ray Tracing API.) Turing coughs, hiccups, and slows down your gameplay, because Turing uses Nvidia's ad-hoc ray tracing solution. It doesn't matter what games come out in the future, the current crop of RTX cards will always choke when using ray tracing. As such, Nvidia's marketing gimmick "RTX On" doesn't just work...

Subsequently... until the next generation of 7nm ray-tracing GPUs comes, the term "ray tracing" is nothing other than a catchphrase for cheerleaders.
 
The 5700 XT blower cooler is not even really that bad. A lot of the reviews make it seem like it's a hot, loud mess, but it's really not. You can't hear the fan at idle, and under load it's a quiet whisper. The default fan curve from AMD is very conservative and results in the higher temps you see in reviews. I'm going to try the washer mod and a repaste; that seems to drop temps quite a bit.

[Attached chart: temperatures.png]


https://www.techpowerup.com/review/amd-radeon-rx-5700-xt/32.html

Some sites have lower numbers, but generally temps are all over the place. It's this inconsistency that should worry potential buyers of reference-cooler cards.
 
Nvidia's solution didn't work (how is ray tracing implemented in Battlefield..? o_O...), so Nvidia is now backtracking away from their own proprietary pre-baked AI-learning effects and is now showing Turing trying to use a limited set of the industry-standard DirectX RT. (Do you understand this?)

We understand that Nvidia's solution works great. DICE's implementation is about average... for DICE. They don't even have DX12 working well yet, and they've been working on it for five years.

And... how well do you think Turing implements DXR? (Which was NOT designed around the Microsoft DirectX Ray Tracing API.) Turing coughs, hiccups, and slows down your gameplay, because Turing uses Nvidia's ad-hoc ray tracing solution. It doesn't matter what games come out in the future, the current crop of RTX cards will always choke when using ray tracing. As such, Nvidia's marketing gimmick "RTX On" doesn't just work...

Turing implements DXR.

Subsequently... until the next generation of 7nm ray-tracing GPUs comes, the term "ray tracing" is nothing other than a catchphrase for cheerleaders.

Patently false.
 
Nvidia's solution didn't work (how is ray tracing implemented in Battlefield..? o_O...), so Nvidia is now backtracking away from their own proprietary pre-baked AI-learning effects and is now showing Turing trying to use a limited set of the industry-standard DirectX RT. (Do you understand this?)

And... how well do you think Turing implements DXR? (Which was NOT designed around the Microsoft DirectX Ray Tracing API.) Turing coughs, hiccups, and slows down your gameplay, because Turing uses Nvidia's ad-hoc ray tracing solution. It doesn't matter what games come out in the future, the current crop of RTX cards will always choke when using ray tracing. As such, Nvidia's marketing gimmick "RTX On" doesn't just work...

Subsequently... until the next generation of 7nm ray-tracing GPUs comes, the term "ray tracing" is nothing other than a catchphrase for cheerleaders.

My God, you are so ignorant it's not even funny.

I'm going to oversimplify here, but RTX is Nvidia's hardware-accelerated implementation of DXR, and it uses shaders + tensor cores + RT cores. DXR can also be done with just shaders (this is how it's implemented on Pascal, and theoretically on Vega/Navi, maybe even Polaris), with shaders + tensor cores (like Volta), and with whatever Intel comes up with.

And it's completely DXR, just hardware-accelerated. Done in collaboration with MS.

The thing with RT is that it's a monumental achievement to get it running anywhere near real time. "Unfortunately," we are now used to games running at over 60 fps at 4K ultra and see anything below that as a failure. Having games run at 4K/60 at max settings and suddenly drop to under 30 fps at 2160p with RTX sounds terrible. And having an AI upscaler that just blurs the image (DLSS) doesn't really make it any better.


And I will not even reply to your previous post as it just doesn't make sense.
 
I really prefer Nvidia, but the problem with that is I use macOS mostly with an eGPU, and they do not work with that.
 
I really prefer Nvidia, but the problem with that is I use macOS mostly with an eGPU, and they do not work with that.
I think a 5700XT would be a nice upgrade for a Mac; I don't know if there are drivers for it yet, though.
 
True, AI is the future of pretty much everything, so it will be interesting to see what direction NVIDIA decides to go. With regard to DLSS, it definitely has a lot of room to improve, as the current implementation is lacking in fidelity. However, one could always couple NVIDIA's game filter with DLSS to improve the picture, and I'm surprised this hasn't been brought up in reviews.

Most reviews are sadly also going the "we don't test with PhysX" route with regard to DXR... it's like they are trying to create an "equal scenario"... but that is bending the facts: they are benching lower image quality (I.Q.) in a misguided attempt to be "fair"... and with that goes the objectivity, and it becomes BIASED.
Could you imagine if NVIDIA had "banned" asynchronous compute benchmarks before Turing?
But somehow AMD is not held to the same standard... really weird.

E.g., when I played The Witcher 3 I had HairWorks at MAX... because not having it on was a real I.Q. downgrade... and still most reviews tested TW3 without HairWorks... epic fail in a single-player game.

But I doubt that will happen with Cyberpunk 2077... I am hoping for that game to bring sanity back into reviews ;)
 

How do you think ray tracing is currently implemented with Turing in DX12 games?

DXR

JESUS CHRIST.

P.S. I ACTUALLY OWN A 5700XT, I DON'T WANT YOU ON MY TEAM.

It's even worse.
https://en.wikipedia.org/wiki/DirectX_Raytracing
DirectX 12 now supports two tiers of raytracing via DXR.
D3D12_RAYTRACING_TIER_1_0 (Turing via dedicated hardware, Pascal via compute)
D3D12_RAYTRACING_TIER_1_1 (no architecture supports this ATM)

Oh yeah...almost forgot:
D3D12_RAYTRACING_TIER_NOT_SUPPORTED (That would be NAVI)
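For anyone who wants to see what their own card reports, here's a rough sketch of that capability query. This is not DmitryKo's tool, just a minimal example I'm assuming builds against a recent Windows 10 SDK: it creates a throwaway device on each adapter, asks for D3D12_FEATURE_D3D12_OPTIONS5, and prints the RaytracingTier. Tier 1_1 only shows up with the newer preview SDK/OS builds mentioned below, so the sketch just lumps anything above 1_0 together.

```cpp
// dxr_tier_check.cpp - rough sketch, needs the Windows 10 SDK with D3D12 headers.
// Build from a VS developer prompt: cl /EHsc dxr_tier_check.cpp d3d12.lib dxgi.lib
#include <windows.h>
#include <d3d12.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);

        // Create a minimal device just to query caps; skip adapters without D3D12 support.
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
            continue;

        // OPTIONS5 is the feature block that carries the DXR tier.
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5{};
        const wchar_t* tier = L"query failed";
        if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                  &opts5, sizeof(opts5)))) {
            if (opts5.RaytracingTier == D3D12_RAYTRACING_TIER_NOT_SUPPORTED)
                tier = L"DXR not supported";   // per the list above, Navi on current drivers
            else if (opts5.RaytracingTier == D3D12_RAYTRACING_TIER_1_0)
                tier = L"DXR tier 1.0";        // Turing in hardware, Pascal via compute fallback
            else
                tier = L"DXR tier above 1.0";  // tier 1_1 needs the newer SDK/OS build
        }
        wprintf(L"%s : %s\n", desc.Description, tier);
    }
    return 0;
}
```

If the list above is right, a Turing card should print tier 1.0, Pascal should print 1.0 on the drivers that enabled the compute fallback, and Navi should print not supported.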
 
Most reviews are sadly also going the "we don't test with PhysX" route with regard to DXR... it's like they are trying to create an "equal scenario"... but that is bending the facts: they are benching lower image quality (I.Q.) in a misguided attempt to be "fair"... and with that goes the objectivity, and it becomes BIASED.
Could you imagine if NVIDIA had "banned" asynchronous compute benchmarks before Turing?
But somehow AMD is not held to the same standard... really weird.

E.g., when I played The Witcher 3 I had HairWorks at MAX... because not having it on was a real I.Q. downgrade... and still most reviews tested TW3 without HairWorks... epic fail in a single-player game.

But I doubt that will happen with Cyberpunk 2077... I am hoping for that game to bring sanity back into reviews ;)

To be "fair" most games with RTX don't really look that much better (if at all) than traditional rendering.
 
It's even worse.
https://en.wikipedia.org/wiki/DirectX_Raytracing
DirectX 12 now supports two tiers of raytracing via DXR.
D3D12_RAYTRACING_TIER_1_0 (Turing via dedicated hardware, Pascal via compute)
D3D12_RAYTRACING_TIER_1_1 (no architecture supports this ATM)

Oh yeah...almost forgot:
D3D12_RAYTRACING_TIER_NOT_SUPPORTED (That would be NAVI)

Care to share the TIER_1_1 link? I was only aware of 1_0 and supposedly it supports all current raytracing features and functionality.

And D3D12_RAYTRACING_TIER_NOT_SUPPORTED basically means it doesn't support raytracing.
 
But who are you?

Attack the messenger, then give your opinion on broken Turing cards as proof of how Nvidia is going to dominate consoles, cloud, and gaming? You're saying this with a straight face & you're a techie..? Sounds to me like you've blown so much smoke over your posting career that it's also become your life?

In respect, I will respond since you attempted to form a rebuttal. (Even though you actually didn't look at my post cadence, or frequency.)




I am not acting like Nvidia is coming out with 7nm, because they aren't.

Nvidia's 7nm GPU is about a year away from shelves. And Nvidia's answer to AMD's 7nm 2nd-generation GPU was Turing SUPER. That is all Nvidia has... just more Turing. Turing isn't any more powerful than Pascal; Turing doesn't do much more in Battlefield than Pascal for an $800 upgrade. As such, Turing doesn't do anything more for the gamer than Pascal. Turing and RTX are flops for gamers, because ad-hoc ray tracing is not what any gamer wanted for their hard-earned dollars. Three years after Pascal's high end, gamers were looking for more performance, not more pseudo fluff & marketing.

What I am saying is mirrored in nearly every review and everywhere else in the industry. Nvidia's ray tracing is an ad-hoc attempt which didn't pan out. Whereas Microsoft's DirectX Ray Tracing is the future, and the next GPUs to be released will have real ray tracing implemented. (And not using Nvidia's pseudo "RTX On" shenanigans.)



Subsequently, it seems like what you are actually trying to defend is that Nvidia used to be the premium brand.

And having to argue those has-been facts angers you: AMD's RDNA was kept so secret, and was so well developed around so many game developers' needs/wants, that it simply outclasses and outshines anything else. The entire gaming industry is on board with RDNA. So the tides have changed... and as such Nvidia's DLSS isn't the best (just more of Jensen's marketing), while AMD's FidelityFX will be standardized in all games soon. Because Nvidia is no longer the leading force in the gaming industry, AMD is..!!


Gamers today are looking forward to all the games coming out. And it is obvious RDNA is a better choice.

"You People may get as angry as you want, but you will still be gaming on RDNA next year"

This part of your quote implies Nvidia is not coming out with 7nm products even next year, that the future is set in stone, and that Nvidia's product stack will not move forward.

It reeks of extreme arrogance, because you're implying you know the future without any metrics to back it up. Actually, your entire post is full of it, because you are making predictions which currently do not have enough evidence to suggest they are true.

As for attacking the messenger: I hate when guerrilla marketers come onto forums and try to manipulate their content and tone; it's a cowardly form of marketing. It's a pet peeve of mine. If you want to market for a company, pay Hardocp.com and advertise with them. Don't try to get free marketing with this underhanded bullshit, which is meant to deceive people and lets the company be hypocrites.

E.g., get gamers to complain about high prices of midrange products through YouTubers and guerrilla posters, and try to set a tone where AMD is your friend and will launch a midrange card at $250 and save the mainstream gaming market. Bark "game bundle, game bundle" when AMD doesn't have competitive products on the playing field.

Then have new posters like yourself come on who say AMD is the premium brand and can charge what they want because they are the premium brand. All of a sudden $400 is okay to charge for a Polaris replacement, which contradicts the previous tone the guerrilla marketers were going for. Now game bundles don't matter, nor do heat and noise metrics. Replacing coolers isn't a con anymore, it's a benefit, because we are the elite gaming master race and will replace these coolers anyway.

AMD used to post officially out front, but when those rumors and hype mills didn't pan out, it lost face, particularly with Bulldozer.

Now they have guerrilla marketers come on who join just after Nvidia launches a product. They say they have inside information and hype products up, and then people wait and wait until they are disappointed at launch. These forum posters disappear shortly after, and the cycle repeats itself.

I can't see you as anything but a guerrilla marketing poster, because the content and slant of your posts is so entrenched in AMD marketing. Look at your rebuttals in this thread; they are getting no likes. That is not because you're some hyper genius, it's because they lack substance and even make AMD fans cringe, because it is nothing but AMD marketing psychobabble. No substance. I am not going to bother doing a direct rebuttal of the horrible points in your latest post.

Ray tracing is in its infancy; look at the initial titles which implemented Mantle, and they were not that hot either. Battlefield was notoriously buggy with Mantle, and DICE co-developed that API. Getting to a point where ray tracing can be implemented in real time in a game is going to take time, as it takes a tremendous amount of computing power. It has to start somewhere so the industry has a starting point to move forward from. If we abandoned every technology due to its initial performance, we would not have the electric car, RISC-based CPUs, SSDs (storage costs), etc. Each newer game that implements RTX gets better: Battlefield was the worst and Metro Exodus is the best. It takes time and maturity for technologies to show their true potential. Until AMD shows its ray tracing implementation, they cannot bark and say Nvidia's version is bad. I know they will through guerrilla marketing, and people who don't own RTX cards (and some that do) will parrot the same thing due to tribalism.

When AMD launches some form of ray tracing, these same people (under different user names) will harp on how ray tracing is the future. I hate this type of behavior. Show me you're not a marketer by being more balanced in your posts. Everyone here agrees you look like a shill, and looking at this thread, it is hard not to see it.
 
Last edited:
To be "fair" most games with RTX don't really look that much better (if at all) than traditional rendering.

Well, they don't look different, which is entirely on purpose. As seen with Metro, the developers enhanced the raster pre-renders with the assistance of having RT available. Yet only with RT could they do dynamic lighting with a real atmospheric feel.

You can argue the subjective side (where you can't be wrong), but objectively ray tracing looks significantly better when applied well.
 
To be "fair" most games with RTX don't really look that much better (if at all) than traditional rendering.

The Metro devs made a deliberate choice not to have the visuals be too far apart with DXR vs normal rendering... to keep the experience as alike as possible... so you have to take things like that into account.
 
Care to share the TIER_1_1 link? I was only aware of 1_0 and supposedly it supports all current raytracing features and functionality.

And D3D12_RAYTRACING_TIER_NOT_SUPPORTED basically means it doesn't support raytracing.

https://forum.beyond3d.com/threads/direct3d-feature-levels-discussion.56575/page-9#post-1840641

Direct3D 12 feature checker (July 2019) by DmitryKo

D3D12CheckFeatureSupport.exe is a simple Windows 10 console app which calls IDXGIAdapter2::GetDesc2 and ID3D12Device::CheckFeatureSupport interfaces to check the supported Direct3D 12 options for every graphics adapter in the system.

July 15, 2019
Metacommand parameters, shader model 6_6, raytracing tier 1_1 in version 20H1 (build 18936); skip specified adapter.
 
I am going to be trying a 5700XT in an eGPU; the Vega 20 just doesn't cut it.

My understanding is that the next macOS (Catalina) won't have any support for nVidia cards, so I'd get an AMD card for that reason alone. (I would use an AMD card over nVidia now, but that's for personal and not technical reasons.)
 
The Metro devs made a deliberate choice not to have the visuals be too far apart with DXR vs normal rendering... to keep the experience as alike as possible... so you have to take things like that into account.

So what's the point then? Other than paying more for video card features that aren't being taken advantage of 11 months after launch and when they are, it's made to look like what you already had and were paying less for.
 
So what's the point then? Other than paying more for video card features that aren't being taken advantage of 11 months after launch and when they are, it's made to look like what you already had and were paying less for.

Getting to know the tech from their side... and it saves the devs a LOT of time doing DXR like they did in Metro.
Use the sun as global raytraced illumination... done!
To make the rasterized version, an artist had to manually place lights, which takes time (= $$$).
You should really start following devs on Twitter... DXR is the new black to them.
 
Last edited: