Struggling to decide - 2070S vs 5700XT?

By your own words you are proclaiming an undocumented vapourware product the king and winner of the performance wars, yet you ask what criteria you should base a purchase on? Imagination? Once the product actually exists, it can be tested, and everyone will see how it compares to Nvidia's two-year-old tech. Hopefully they'll be able to compete; it's been a while.

The market needs competition, which will hopefully drive prices down. I also doubt Nvidia is sitting idly by without working on something of their own. Again, I'm hoping it's more of a competitive market a year or so from now.
Agree, except when will Nvidia be ready for a 7nm card? When will Samsung be ready to make very complex, large 7nm chips? Nvidia may concentrate on small-die 7nm chips first before the large ones. So we have AMD vapourware with wishful thinking against Nvidia vapourware.

Since the 5800 is not normally the top of the stack, that leaves room for a 5900 series as well: maybe 80 CU, HBM2+, still under a 300 W card. Will Nvidia have anything that can remotely compete against that?
 
Agree, except when will Nvidia be ready for a 7nm card? When will Samsung be ready to make very complex, large 7nm chips? Nvidia may concentrate on small-die 7nm chips first before the large ones. So we have AMD vapourware with wishful thinking against Nvidia vapourware.

Since the 5800 is not normally the top of the stack, that leaves room for a 5900 series as well: maybe 80 CU, HBM2+, still under a 300 W card. Will Nvidia have anything that can remotely compete against that?

Apparently there is this:

[attached image]


taken from here:
Looks like the 5950XT is the top end part, but we'll have to wait and see what actually appears and with what specifications/performance.

Manufacturing process isn't the be-all and end-all of performance. Intel still has the performance edge over AMD without going 7nm; granted, Ryzen has higher IPC, but still lower clocks. My posts almost sound like I'm against AMD, but I'm not. I think this will be my first completely AMD system build I've ever done...... :D
 
DLSS is exactly like RTX... a hoax.


Hoax? That's sensationalizing RTX a bit IMO. I mean, it certainly can be argued as a marketing gimmick due to the lack of maturity of the tech in the current gen, especially the way NVidia execs have been talking about it like it's some fully refined "game changer" (pardon the pun), but I fully expect it to be much more refined in terms of performance impact in the next few iterations, and pretty significant once that's the case. I also would agree that it's pretty shit timing to introduce something so taxing and unrefined in terms of performance hit, at a time when flagship products are barely able to utilize 3k (UW, et al.) and 4k resolutions, let alone beyond, when higher resolutions, pixel densities, and ever increasing refresh rates are clearly here to stay and outpacing the GPUs.


I'd also argue that DLSS is sort of evidence that they're aware of it, considering the timing, since it's effectively a band-aid for it, but still, hoax? Cheeky, arrogant, smug, or even misguided, I'd agree with, but clearly, it's not a hoax. It's real and they're probably going to defend it into the ground. If they can bring the performance of RTX in line, it'll be a big selling point for developers long term, and I expect we'll see an equivalent solution from AMD in the not too distant future; in fact, they've stated they're deep in development of a "hybrid" hardware/software solution for ray tracing themselves, which sounds exactly like what RTX and DLSS are, to me. Even if DLSS wasn't specifically branded as a companion meant to accommodate the performance hit of RTX ray tracing, it surely seems obvious that's what it really is/was to me, and they're practically mutually exclusive in real use.


I was actually really excited to see a ray tracing implementation when the RTX series was announced; it's been an elusive fantasy for graphics and games for decades, one that seemed like it would never actually materialize in any meaningful way. I was just really amazed they chose to implement it, and focus so heavily on it from a marketing/pitch standpoint, this early, considering the performance. In the past I would have expected them to wait one more round of architecture progression and performance before trying to push it mainstream, which admittedly does make me think they were struggling to justify the price structure of this gen, especially when comparing the 20xx's to the 1080ti's performance. That said, the miners are still responsible for the current pricing of the high-end models, and NVidia would have been fools to leave money lying on the table from a business perspective. I do think they probably wouldn't have released ray tracing with such a performance hit, or such a mediocre performance jump between iterations, if it weren't coming off the shit-hot, can't-keep-cards-in-stock mining market. I think they decided to go all in with the ray tracing to try and add some perceived value to justify the state of the market and the price jump, and are just "gonna own" the decision now.


I'm personally looking forward to the future of ray tracing in games though, and it's clear to me that AMD wouldn't even be bothering with it if NVidia hadn't dropped it on us, immature or otherwise. So in the end, I think we'll be grateful once it's fully realized. That's my pragmatic (and potentially naive) view on it at least.
 
Hoax? That's sensationalizing RTX a bit IMO. I mean, it certainly can be argued as a marketing gimmick due to the lack of maturity of the tech in the current gen, especially the way NVidia execs have been talking about it like it's some fully refined "game changer" (pardon the pun), but I fully expect it to be much more refined in terms of performance impact in the next few iterations, and pretty significant once that's the case. I also would agree that it's pretty shit timing to introduce something so taxing and unrefined in terms of performance hit, at a time when flagship products are barely able to utilize 3k (UW, et al.) and 4k resolutions, let alone beyond, when higher resolutions, pixel densities, and ever increasing refresh rates are clearly here to stay and outpacing the GPUs.


I'd also argue that DLSS is sort of evidence that they're aware of it, considering the timing, since it's effectively a band-aid for it, but still, hoax? Cheeky, arrogant, smug, or even misguided, I'd agree with, but clearly, it's not a hoax. It's real and they're probably going to defend it into the ground. If they can bring the performance of RTX in line, it'll be a big selling point for developers long term, and I expect we'll see an equivalent solution from AMD in the not too distant future; in fact, they've stated they're deep in development of a "hybrid" hardware/software solution for ray tracing themselves, which sounds exactly like what RTX and DLSS are, to me. Even if DLSS wasn't specifically branded as a companion meant to accommodate the performance hit of RTX ray tracing, it surely seems obvious that's what it really is/was to me, and they're practically mutually exclusive in real use.


I was actually really excited to see a ray tracing implementation when the RTX series was announced; it's been an elusive fantasy for graphics and games for decades, one that seemed like it would never actually materialize in any meaningful way. I was just really amazed they chose to implement it, and focus so heavily on it from a marketing/pitch standpoint, this early, considering the performance. In the past I would have expected them to wait one more round of architecture progression and performance before trying to push it mainstream, which admittedly does make me think they were struggling to justify the price structure of this gen, especially when comparing the 20xx's to the 1080ti's performance. That said, the miners are still responsible for the current pricing of the high-end models, and NVidia would have been fools to leave money lying on the table from a business perspective. I do think they probably wouldn't have released ray tracing with such a performance hit, or such a mediocre performance jump between iterations, if it weren't coming off the shit-hot, can't-keep-cards-in-stock mining market. I think they decided to go all in with the ray tracing to try and add some perceived value to justify the state of the market and the price jump, and are just "gonna own" the decision now.


I'm personally looking forward to the future of ray tracing in games though, and it's clear to me that AMD wouldn't even be bothering with it if NVidia hadn't dropped it on us, immature or otherwise. So in the end, I think we'll be grateful once it's fully realized. That's my pragmatic (and potentially naive) view on it at least.



Control is the first game that uses both DLSS and RTX correctly.
 
Control is the first game that uses both DLSS and RTX correctly.


I'm not disputing this statement, but do you mind clarifying? I ask because, admittedly, I haven't played it, but the one review I watched (ACG fwiw) mentioned that it still had some pretty obvious (noticeable, maybe distracting, but I only watched it once and don't wanna put words in anyone's mouth) graphical fidelity compromises with DLSS enabled.

Regardless, do you disagree with or have thoughts about my conjecture concerning DLSS and RTX being implemented at the same time, strategically aimed as a stopgap for RTX implementation being a bit premature in terms of performance? I wholly admit that it's pure speculation, just a hypothetical scenario my mind conjured recently. I just never personally thought of Nvidia as the type of shop to release half-baked features en masse, let alone so vehemently defend and promote them. Perhaps I just haven't been paying attention lately, but I always had this idea of them being the type of outfit that would drop a bomb like RTX, being fully evolved, with minimal performance hit, after they'd been secretly working on and refining it for years. Maybe in fact they have, and it's taken that long to get where it's at; it just doesn't "feel" like that to me. To be clear though, I'm not deriding RTX or DLSS at all; I like the idea of both and want both to work, and progress, personally. I'm definitely hoping to get a chance to play with both systems soon.



As I said though, just an imaginative leap of hypothesizing I had on the situation, which I readily admit I'm likely dead wrong about. ;)
 
From what I've seen it's supposed to do 4K at 60 fps with DLSS, and apparently it's the best implementation of DLSS so far.
That would be great. But I am skeptical here. PC Gamer did a review and showed a solid 70-90 fps @ 1080p with a 2080ti. If DLSS can do what you say then I will retract all my negativity in regard to this.
So 4K-60 with DLSS or 1080p-60 without. Looking forward to the links to anything real-world.
So Stolv, let's see what you have seen, please.
 
Perhaps I just haven't been paying attention lately, but I always had this idea of them being the type of outfit that would drop a bomb like RTX, being fully evolved, with minimal performance hit, after they'd been secretly working on and refining it for years.

Raytracing and “minimal performance hit” don’t belong in the same sentence. There’s no magic. It’s just math and RT math is hard on hardware.

The only way you can make a valid comparison is if devs implement rasterized versions of RT features but they’re not doing that. That’s partially because there are some things that rasterization simply cannot do and also because it doesn’t make sense to implement the very hacky rasterized version of something when you have a less hacky RT option available.

For things like global illumination and shadows it’s likely devs will choose to not implement two high quality versions and just stick with RT. So you will never be comparing apples to apples. This will certainly be the case in the next console generation if all hardware has acceptable RT performance.
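To put a very rough number on it (back-of-envelope only; the per-pixel ray counts below are my own assumptions, not what any particular game actually shoots):

# Illustrative only: how quickly ray counts add up at 1440p60.
width, height, fps = 2560, 1440, 60
rays_per_pixel = 2  # assumed: e.g. one reflection ray + one shadow ray per pixel
rays_per_second = width * height * fps * rays_per_pixel
print(f"{rays_per_second / 1e9:.2f} billion rays per second")  # ~0.44
# Every one of those rays means a BVH traversal plus triangle intersection
# tests, on top of all the normal rasterization work for the frame.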
 
Raytracing and “minimal performance hit” don’t belong in the same sentence. There’s no magic. It’s just math and RT math is hard on hardware.

The only way you can make a valid comparison is if devs implement rasterized versions of RT features but they’re not doing that. That’s partially because there are some things that rasterization simply cannot do and also because it doesn’t make sense to implement the very hacky rasterized version of something when you have a less hacky RT option available.

For things like global illumination and shadows it’s likely devs will choose to not implement two high quality versions and just stick with RT. So you will never be comparing apples to apples. This will certainly be the case in the next console generation if all hardware has acceptable RT performance.

The hit actually is pretty minimal, surprisingly so for what it's doing. 77 to 64 FPS at 1440p (DLSS off) for RTX medium on a 2080. DLSS is way more efficient than in other games too.

They used a custom run through of the first level: https://www.pcgamesn.com/control/nv...formance-benchmarks#nn-graph-sheet-1440p-dlss

Too bad it’s not my kind of game. Reflections is also my least favorite implementation, although I appreciate what is happening.
 
The hit actually is pretty minimal, surprisingly so for what it's doing. 77 to 64 FPS at 1440p (DLSS off) for RTX medium on a 2080. DLSS is way more efficient than in other games too.

They used a custom run through of the first level: https://www.pcgamesn.com/control/nv...formance-benchmarks#nn-graph-sheet-1440p-dlss

Too bad it’s not my kind of game. Reflections is also my least favorite implementation, although I appreciate what is happening.

Yeah it’s certainly getting better with each game and it’s impressive to the people who can appreciate what’s going on. There seems to be an expectation from some folks though that RT should be cheap or free like aniso filtering. Which of course is kinda insane.
 
Raytracing and “minimal performance hit” don’t belong in the same sentence. There’s no magic. It’s just math and RT math is hard on hardware.

The only way you can make a valid comparison is if devs implement rasterized versions of RT features but they’re not doing that. That’s partially because there are some things that rasterization simply cannot do and also because it doesn’t make sense to implement the very hacky rasterized version of something when you have a less hacky RT option available.

For things like global illumination and shadows it’s likely devs will choose to not implement two high quality versions and just stick with RT. So you will never be comparing apples to apples. This will certainly be the case in the next console generation if all hardware has acceptable RT performance.


Yeah, I'm not disputing that, but I mean, all of this space is "hard on hardware" math, and the solution is typically specific hardware dedicated to, or hopefully optimized for, doing that math. Whether it gets leveraged in its first release cycle with one level of performance or another is all about when they decide to push it mainstream, and how it's implemented, no? Of course, that's true of anything; it's the entire evolution of 3D graphics processing. I remember the early SGI and 3dfx solutions, etc., and yes, relative to now, performance was extremely iterative, but it's a little different when you're piling a highly specific piece of tech on top of the overall ethos of 3D graphics processing, versus creating that ethos in the first place. It's more significant to be sure, but analogous to hardware decoders for video compression, etc.: it's adding a very important feature and freeing up the primary processors from having to handle that tedium, but it's not a completely new architecture or a fully independent system. Hell, ray tracing is very old tech in the scheme of the space, although yes, it's always been known to be very costly in terms of compute; that said, this is relative.

It's not a critique though, just an observation, and one day, I'm sure, dedicated ray tracing chips will be standard, and old hat even.
 
The hit actually is pretty minimal, surprisingly so for what it's doing. 77 to 64 FPS at 1440p (DLSS off) for RTX medium on a 2080. DLSS is way more efficient than in other games too.

They used a custom run through of the first level: https://www.pcgamesn.com/control/nv...formance-benchmarks#nn-graph-sheet-1440p-dlss

Too bad it’s not my kind of game. Reflections is also my least favorite implementation, although I appreciate what is happening.


I just keep hearing that you'd have to be blind to not notice the fidelity hits with certain stuff on screen with DLSS, even if the frame rate boost is big. That said, this tech seems to have huge room for refinement, and so I'm happy to see it being utilized and fostered.
 
I just keep hearing that you'd have to be blind to not notice the fidelity hits with certain stuff on screen with DLSS, even if the frame rate boost is big. That said, this tech seems to have huge room for refinement, and so I'm happy to see it being utilized and fostered.

Unfortunately it’s game to game and requires nVidia to use their AI supercomputers to create the algorithm for. Every game is different in performance game and IQ. I personally hate implementations of tech that can’t globally turned on and also require certain initial conditions (resolutions). DLSS is a very innovative idea but the real life implementation appears to be way more difficult than nVidia imagined.

I have not tried Control, but from what I've heard it's a much better implementation of DLSS. This game looks like you get, for example, 1080p frame rates at 1440p resolution but with IQ in between the two.

This review has pictures.
https://www.dsogaming.com/news/cont...tings-with-60fps-on-nvidia-geforce-rtx2080ti/
 
Unfortunately it’s game to game and requires nVidia to use their AI supercomputers to create the algorithm for. Every game is different in performance game and IQ. I personally hate implementations of tech that can’t globally turned on and also require certain initial conditions (resolutions). DLSS is a very innovative idea but the real life implementation appears to be way more difficult than nVidia imagined.

Yeah, DLSS sounded great initially, until the details were revealed.

If it requires per-game training, that makes it what I consider a "brittle" technology, prone to cracking with any change. That might be alright for consoles, where you can't mod, but on PCs, where visual mods are common, I expect that if you make a significant visual mod, DLSS will break, because the training doesn't really apply anymore.

And that is before you get to the other issue, which is that it has a hard time matching traditional scaling that just works everywhere.
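To make the "brittle vs. works everywhere" point concrete, here's a toy sketch (my own illustration, not Nvidia's actual pipeline; the "learned" model is a stand-in with made-up per-game weights):

import numpy as np

def bilinear_upscale(img, scale=2):
    # Traditional scaling: a fixed formula, content-agnostic, works on anything.
    h, w = img.shape
    ys = np.linspace(0, h - 1, h * scale)
    xs = np.linspace(0, w - 1, w * scale)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

def learned_upscale(img, per_game_weights):
    # DLSS-style idea, grossly simplified: the output depends on weights fitted
    # offline to one game's frames (here just a made-up gain/bias). The learned
    # path is tied to whatever it was trained on; the bilinear path above is a
    # fixed formula that behaves the same on any content, mods included.
    base = bilinear_upscale(img)
    return np.clip(base * per_game_weights["gain"] + per_game_weights["bias"], 0.0, 1.0)

frame = np.random.rand(4, 4)              # stand-in for a low-res rendered frame
game_a = {"gain": 1.1, "bias": -0.02}     # hypothetical weights "trained" on Game A
print(bilinear_upscale(frame).shape)      # (8, 8)
print(learned_upscale(frame, game_a).shape)  # (8, 8), but only as good as its training data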
 
Yeah, I'm not disputing that, but I mean, all of this space is "hard on hardware" math, and the solution is typically specific hardware dedicated to, or hopefully optimized for, doing that math. Whether it gets leveraged in its first release cycle with one level of performance or another is all about when they decide to push it mainstream, and how it's implemented, no? Of course, that's true of anything; it's the entire evolution of 3D graphics processing. I remember the early SGI and 3dfx solutions, etc., and yes, relative to now, performance was extremely iterative, but it's a little different when you're piling a highly specific piece of tech on top of the overall ethos of 3D graphics processing, versus creating that ethos in the first place. It's more significant to be sure, but analogous to hardware decoders for video compression, etc.: it's adding a very important feature and freeing up the primary processors from having to handle that tedium, but it's not a completely new architecture or a fully independent system. Hell, ray tracing is very old tech in the scheme of the space, although yes, it's always been known to be very costly in terms of compute; that said, this is relative.

It's not a critique though, just an observation, and one day, I'm sure, dedicated ray tracing chips will be standard, and old hat even.

Yeah in the PC space there really is no other option but to introduce features very conservatively as it takes several years for new tech to gain significant market share. People hate on Nvidia’s marketing but without it we would likely be waiting much longer for games to try new things. e.g. TrueAudio has been out since 2013. How many games use it?

Consoles used to be the place where disruptive new tech could be introduced since there was no need for backwards compatibility and developers could focus on a single hardware config. That’s changing now with consoles essentially running PC hardware and a lot more cross platform titles.

I really hope both Sony and Microsoft go all in with RT otherwise it’ll be another Gameworks scenario for the next 8 years with devs tacking on raytracing onto console ports.
 
Yeah it’s certainly getting better with each game and it’s impressive to the people who can appreciate what’s going on. There seems to be an expectation from some folks though that RT should be cheap or free like aniso filtering. Which of course is kinda insane.

To think that aniso was a performance hog when it was first introduced. :):)
 
If it requires per-game training, that makes it what I consider a "brittle" technology, prone to cracking with any change. That might be alright for consoles, where you can't mod, but on PCs, where visual mods are common, I expect that if you make a significant visual mod, DLSS will break, because the training doesn't really apply anymore.


That's a good point re: modding that I hadn't really considered, but is really obvious now that you've mentioned it. I'm pretty huge into mods, so definitely worth considering. Although I wonder if there aren't long-term goals to compensate for this, either user-submitted code/whatever that'll automatically spit out DLSS profiles, or some long-term solution implementing circuit-level processing for this. It kind of reinforces my view about them dropping a couple of half-baked features in an effort to increase the perceived value of a pretty major generation, at least in terms of the model number's subconscious importance (rolling over from the 1xxx series to 2xxx has a certain "significance" from a PR/hype/legacy standpoint), when it really wasn't much of a leap in terms of raw performance. I mean, if you take away RTX and DLSS from the on-paper specs and look at the benchmarks, the whole 20xx series would be a pretty difficult leap to rationalize in terms of consumer value, especially before the "S" models, which really adds another layer to the mystery of what the thought process, or focus, is over at nvidia.

I mean, frankly, the "Super" series should have just been the standard 20xx models, but I don't doubt for one minute that the shit-hot GPU market from mining was hugely responsible for the versions and prices being released when the 20xx series was dropped.
 
Yeah it’s certainly getting better with each game and it’s impressive to the people who can appreciate what’s going on. There seems to be an expectation from some folks though that RT should be cheap or free like aniso filtering. Which of course is kinda insane.


Yeah in the PC space there really is no other option but to introduce features very conservatively as it takes several years for new tech to gain significant market share. People hate on Nvidia’s marketing but without it we would likely be waiting much longer for games to try new things. e.g. TrueAudio has been out since 2013. How many games use it?

Consoles used to be the place where disruptive new tech could be introduced since there was no need for backwards compatibility and developers could focus on a single hardware config. That’s changing now with consoles essentially running PC hardware and a lot more cross platform titles.

I really hope both Sony and Microsoft go all in with RT otherwise it’ll be another Gameworks scenario for the next 8 years with devs tacking on raytracing onto console ports.


Yeah, I agree, and I do think people's expectations about RTX performance are unrealistic. Personally, though, I wouldn't even be "considering" the question of what the appropriate level of performance for a big new feature like this is, if it weren't for the NVidia CEO's statements and their hype-machine push with it, implying that you'd be insane to buy a non-RTX-capable card when the performance hit is currently so huge. I think, without that, which I admit is really just marketing hype, I'd never have thought twice about it, but it's a pretty bold claim considering the actual performance cost IMO.

I mean, 10-20%, OK, especially if the raw GPU power were easily outpacing the state of the market in terms of resolution and refresh, but when you're seeing a 50% loss of frames, and the best card they have can't really push 4K 120 FPS reliably (and that paints an unfair picture honestly, since it's priced like the Titan series cards used to be, which were never aimed at the consumer, or even enthusiast, PC gaming market), the idea that RTX is mandatory or you're a dumbass is kind of laughable. I dig RTX and want it to succeed, but they're gonna need to show some big performance improvements with the 21xx series IMO, in raw FPS at 4K and in RTX costs, before I think it's a fair assertion to say that it's dumb to not have an RTX-capable GPU.


Anyway, this is a fun discussion guys, I'm enjoying the discourse, and it's definitely got me thinking more about what's going on in the GPU market than I had previously.
 
Yeah, I agree, and I do think people's expectations about RTX performance are unrealistic. Personally, though, I wouldn't even be "considering" the question of what the appropriate level of performance for a big new feature like this is, if it weren't for the NVidia CEO's statements and their hype-machine push with it, implying that you'd be insane to buy a non-RTX-capable card when the performance hit is currently so huge. I think, without that, which I admit is really just marketing hype, I'd never have thought twice about it, but it's a pretty bold claim considering the actual performance cost IMO.

I mean, 10-20%, OK, especially if the raw GPU power were easily outpacing the state of the market in terms of resolution and refresh, but when you're seeing a 50% loss of frames, and the best card they have can't really push 4K 120 FPS reliably (and that paints an unfair picture honestly, since it's priced like the Titan series cards used to be, which were never aimed at the consumer, or even enthusiast, PC gaming market), the idea that RTX is mandatory or you're a dumbass is kind of laughable. I dig RTX and want it to succeed, but they're gonna need to show some big performance improvements with the 21xx series IMO, in raw FPS at 4K and in RTX costs, before I think it's a fair assertion to say that it's dumb to not have an RTX-capable GPU.

Anyway, this is a fun discussion guys, I'm enjoying the discourse, and it's definitely got me thinking more about what's going on in the GPU market than I had previously.

For sure. Huang’s statements are just CEO puffery and people really should ignore them. RTX isn’t even close to a must have feature in 2019.

I don’t see anyone on these forums claiming that though.
 
For sure. Huang’s statements are just CEO puffery and people really should ignore them. RTX isn’t even close to a must have feature in 2019.

I don’t see anyone on these forums claiming that though.


Nah, I haven't seen anybody else, period (except maybe a bought-and-paid-for "news" site or two), parrot that, but for some reason it was hard for me to ignore. ;)
 
I'm not disputing this statement, but do you mind clarifying? I ask because, admittedly, I haven't played it, but the one review I watched (ACG fwiw) mentioned that it still had some pretty obvious (noticeable, maybe distracting, but I only watched it once and don't wanna put words in anyone's mouth) graphical fidelity compromises with DLSS enabled.

Regardless, do you disagree with or have thoughts about my conjecture concerning DLSS and RTX being implemented at the same time, strategically aimed as a stopgap for RTX implementation being a bit premature in terms of performance? I wholly admit that it's pure speculation, just a hypothetical scenario my mind conjured recently. I just never personally thought of Nvidia as the type of shop to release half-baked features en masse, let alone so vehemently defend and promote them. Perhaps I just haven't been paying attention lately, but I always had this idea of them being the type of outfit that would drop a bomb like RTX, being fully evolved, with minimal performance hit, after they'd been secretly working on and refining it for years. Maybe in fact they have, and it's taken that long to get where it's at; it just doesn't "feel" like that to me. To be clear though, I'm not deriding RTX or DLSS at all; I like the idea of both and want both to work, and progress, personally. I'm definitely hoping to get a chance to play with both systems soon.



As I said though, just an imaginative leap of hypothesizing I had on the situation, which I readily admit I'm likely dead wrong about. ;)


First off, the game itself is very good. The game is the first example of full-featured RTX effects and a really good DLSS implementation. I think prior games were either half-assed or just done incorrectly, and that made the features appear poor. I think we are now starting to see games that actually use these features correctly. I own the game and have played about an hour, and I was switching between RTX on/off and DLSS on/off and combinations of both. If you put a gun to my head I don't think I could accurately tell the difference between 1440p native with RTX vs RTX + DLSS.
 
Hoax? That's sensationalizing RTX a bit IMO. I mean, it certainly can be argued as a marketing gimmick due to the lack of maturity of the tech in the current gen, especially the way NVidia execs have been talking about it like it's some fully refined "game changer" (pardon the pun), but I fully expect it to be much more refined in terms of performance impact in the next few iterations, and pretty significant once that's the case. I also would agree that it's pretty shit timing to introduce something so taxing and unrefined in terms of performance hit, at a time when flagship products are barely able to utilize 3k (UW, et al.) and 4k resolutions, let alone beyond, when higher resolutions, pixel densities, and ever increasing refresh rates are clearly here to stay and outpacing the GPUs.


I'd also argue that DLSS is sort of evidence that they're aware of it, considering the timing, since it's effectively a band-aid for it, but still, hoax? Cheeky, arrogant, smug, or even misguided, I'd agree with, but clearly, it's not a hoax. It's real and they're probably going to defend it into the ground. If they can bring the performance of RTX in line, it'll be a big selling point for developers long term, and I expect we'll see an equivalent solution from AMD in the not too distant future; in fact, they've stated they're deep in development of a "hybrid" hardware/software solution for ray tracing themselves, which sounds exactly like what RTX and DLSS are, to me. Even if DLSS wasn't specifically branded as a companion meant to accommodate the performance hit of RTX ray tracing, it surely seems obvious that's what it really is/was to me, and they're practically mutually exclusive in real use.


I was actually really excited to see a ray tracing implementation when the RTX series was announced; it's been an elusive fantasy for graphics and games for decades, one that seemed like it would never actually materialize in any meaningful way. I was just really amazed they chose to implement it, and focus so heavily on it from a marketing/pitch standpoint, this early, considering the performance. In the past I would have expected them to wait one more round of architecture progression and performance before trying to push it mainstream, which admittedly does make me think they were struggling to justify the price structure of this gen, especially when comparing the 20xx's to the 1080ti's performance. That said, the miners are still responsible for the current pricing of the high-end models, and NVidia would have been fools to leave money lying on the table from a business perspective. I do think they probably wouldn't have released ray tracing with such a performance hit, or such a mediocre performance jump between iterations, if it weren't coming off the shit-hot, can't-keep-cards-in-stock mining market. I think they decided to go all in with the ray tracing to try and add some perceived value to justify the state of the market and the price jump, and are just "gonna own" the decision now.


I'm personally looking forward to the future of ray tracing in games though, and it's clear to me that AMD wouldn't even be bothering with it if NVidia hadn't dropped it on us, immature or otherwise. So in the end, I think we'll be grateful once it's fully realized. That's my pragmatic (and potentially naive) view on it at least.


Yes RTX is a hoax as it pertains to RTX-Turing.

Ampere is a different card and will not use Turing's architecture. Turing is EOL. You know this, yet went on a rambling, long-winded story about an Nvidia architecture... that is EOL. Turing is a dead end and Nvidia is not making any more Turing chips; they are developing something better. And the RTX 2000 Series RTX-Turing cards will never get better at ray tracing; they will always be broken. Yet you keep transitioning to the future Ampere chip, like that will somehow make Turing chips better. You are incredible at manipulating the RTX mantra... but we all know Turing is a dead end for ray tracing. It would be crazy to buy Turing for ray tracing.

I suspect that is what you tried to say?




I mean, everyone by now understands and knows that Turing can't do real-time ray tracing and is a hoax.

Also, please don't waste anyone's time with trying to give credit to Nvidia for marketing their DLSS tensor cores BS to you. It is a hoax, to help make up for the fact that Jensen tries to sell his customers GPUs not designed for games, but instead for AI/enterprise, with burnt-out chips marketed as gamer cards. Tensor cores are what we are paying for in Turing's "RTX tax", dude... they do nothing for people's gaming. It is a marketing ploy and you fell for it!

Pascal is just as fast...
 
Yes RTX is a hoax as it pertains to RTX-Turing.

Ampere is a different card and will not use Turing's architecture. Turing is EOL. You know this, yet went on a rambling, long-winded story about an Nvidia architecture... that is EOL. Turing is a dead end and Nvidia is not making any more Turing chips; they are developing something better. And the RTX 2000 Series RTX-Turing cards will never get better at ray tracing; they will always be broken. Yet you keep transitioning to the future Ampere chip, like that will somehow make Turing chips better. You are incredible at manipulating the RTX mantra... but we all know Turing is a dead end for ray tracing. It would be crazy to buy Turing for ray tracing.

I suspect that is what you tried to say?




I mean, everyone by now understands and knows that Turing can't do real-time ray tracing and is a hoax.

Also, please don't waste anyone's time with trying to give credit to Nvidia for marketing their DLSS tensor cores BS to you. It is a hoax, to help make up for the fact that Jensen tries to sell his customers GPUs not designed for games, but instead for AI/enterprise, with burnt-out chips marketed as gamer cards. Tensor cores are what we are paying for in Turing's "RTX tax", dude... they do nothing for people's gaming. It is a marketing ploy and you fell for it!

Pascal is just as fast...


I'm not really sure what to respond to here, as your deft blurring of the target and/or person of focus reminds me of a particular National Book Award-winning, and highly controversial Pulitzer selection (but not awarded), work of fiction circa 1974: a surrealistic literary acid trip, aka novel, by Thomas Pynchon.


Although, I will ask one rhetorical question, since esotericism seems to be the flavor of the moment here: Isn't everything (architecture, iteration, whatever) dead, and isn't everyone working on the next thing already? No wait, one more: Do you actually know the meaning of the word hoax? Again, rhetorical.



I did just realize that your account is only a little over a month and a half old and you've got almost 4x as many posts as me (mine is nearly 2 years old), and they're mostly in the same semi-manic/incoherent vein, so I'm just gonna grab some popcorn and enjoy the show. Although I'm still not really certain whether you were responding to me particularly, or god/infinity/reality/google/whatever... Regardless, carry on! =D
 
I think very few people have bought RTX cards for RT. Most bought them because they had no adequate upgrade path from their last cards. Love my RTX card, couldn't give a crap about RT.
 
I'm not really sure what to respond to here, as your deft blurring of the target and/or person of focus reminds me of a particular National Book Award-winning, and highly controversial Pulitzer selection (but not awarded), work of fiction circa 1974: a surrealistic literary acid trip, aka novel, by Thomas Pynchon.

Although, I will ask one rhetorical question, since esotericism seems to be the flavor of the moment here: Isn't everything (architecture, iteration, whatever) dead, and isn't everyone working on the next thing already? No wait, one more: Do you actually know the meaning of the word hoax? Again, rhetorical.

I did just realize that your account is only a little over a month and a half old and you've got almost 4x as many posts as me (mine is nearly 2 years old), and they're mostly in the same semi-manic/incoherent vein, so I'm just gonna grab some popcorn and enjoy the show. Although I'm still not really certain whether you were responding to me particularly, or god/infinity/reality/google/whatever... Regardless, carry on! =D


You should read every one of my posts then...

You'll find that you have already painted the wrong picture of me. You are already starting to character assassinate (i.e. kill the messenger) instead of confronting, or refuting, what I posted.



You posted 7 blathering posts to my 1.
And you are still unable to overcome the fact that DLSS is a hoax. It is a hoax that Jensen sold you, to sell you his tensor cores and other non-gaming features in the Turing architecture. You bought into his marketing BS, hook, line & sinker. Big FISH ON!
 
You should read every one of my posts then...

You'll find that you have already painted the wrong picture of me. You are already starting to character assassinate (i.e. kill the messenger) instead of confronting, or refuting, what I posted.



You posted 7 blathering posts to my 1.
And you are still unable to overcome the fact that DLSS is a hoax. It is a hoax that Jensen sold you, to sell you his tensor cores and other non-gaming features in the Turing architecture. You bought into his marketing BS, hook, line & sinker. Big FISH ON!
In fact, YOU bought into his marketing BS. You do have an RTX 2080, don't you? :D:D:rolleyes::rolleyes:
 
You should read every one of my posts then...

You'll find that you have already painted the wrong picture of me. You are already starting to character assassinate (i.e. kill the messenger) instead of confronting, or refuting, what I posted.



You posted 7 blathering posts to my 1.
And you are still unable to overcome the fact that DLSS is a hoax. It is a hoax that Jensen sold you, to sell you his tensor cores and other non-gaming features in the Turing architecture. You bought into his marketing BS, hook, line & sinker. Big FISH ON!


No, I'm unable to overcome your misuse, or misunderstanding, of the word hoax, that's all.

I don't have an RTX card, haven't even used one, and am not likely to buy one until the next architecture iteration, and I think you'll see from my "blathering" posts that I called into question the wisdom of releasing RTX and DLSS at such an early state of development. I think they were tacked on just to add perceived value in a lackluster cycle from a raw performance standpoint, and to justify the price increase that was driven by mining and not by advancement in overall tech. However, I'm open to these technologies, though skeptical. Clearly the tech does actually exist, and whether it's smart business remains to be seen.


I agree that in its current state it's definitely not worth moving to from any 10xx series card, and while my 1080ti recently bit the dust and I need a new card, I'm going to be replacing it with another 1080ti, since nothing in the 20xx series can compete in terms of performance value at the current used prices for a 1080ti.
 
So, I've installed the 5700XT, waterblock fitted, all up and running. Temps are a little high at full load and OC'd, but not hitting thermal limits yet.

I miss my Nvidia 980ti already. I mean, honestly, does AMD's driver team have a clue what they are doing? I'll do a clean install soon just to make sure it's not some leftover portion of the Nvidia binaries, but my system is running like shit! Maximum frame rates are decent, but there's plenty of stuttering and slowdown for no good reason. I used DDU prior to installing the new card, but still, not a good experience.

Not an impressive experience so far. If AMD can't get their driver act together, it won't matter what amazing silicon they bring to the table; it won't put them on top if it doesn't work properly.
 
So, I've installed the 5700XT, waterblock fitted, all up and running. Temps are a little high at full load and OC'd, but not hitting thermal limits yet.

I miss my Nvidia 980ti already. I mean, honestly, does AMD's driver team have a clue what they are doing? I'll do a clean install soon just to make sure it's not some leftover portion of the Nvidia binaries, but my system is running like shit! Maximum frame rates are decent, but there's plenty of stuttering and slowdown for no good reason. I used DDU prior to installing the new card, but still, not a good experience.

Not an impressive experience so far. If AMD can't get their driver act together, it won't matter what amazing silicon they bring to the table; it won't put them on top if it doesn't work properly.

AMD have just released 19.9.1 drivers. Maybe give those a try.
 
kaigame, that doesn't sound normal. Does this happen in all games or just some?

See if triple buffering is enabled in the game settings; that may help.

Also make sure that Chill is disabled in AMD global gaming settings.
 
AMD have just released 19.9.1 drivers. Maybe give those a try.

Installed, and not much better. Wattman crashes all the time also. Still not impressed!!!

kaigame, that doesn't sound normal. Does this happen in all games or just some?

Haven't really given it a proper go yet; it happened in the couple of games I played (Blade and Soul & GTAV). Blade and Soul seems to be a shadow issue; once I turn that down it's much better. Still miss my 980ti.

See if triple buffering is enabled in the game settings; that may help.

Triple buffering is an OpenGL thing, no?

Also make sure that Chill is disabled in AMD global gaming settings.

Chill is a hotkey toggle; I never pressed it, so I don't think it's enabled.
 
So, I've installed the 5700XT, waterblock fitted, all up and running. Temps are a little high at full load and OC'd, but not hitting thermal limits yet.

I miss my Nvidia 980ti already. I mean, honestly, does AMD's driver team have a clue what they are doing? I'll do a clean install soon just to make sure it's not some leftover portion of the Nvidia binaries, but my system is running like shit! Maximum frame rates are decent, but there's plenty of stuttering and slowdown for no good reason. I used DDU prior to installing the new card, but still, not a good experience.

Not an impressive experience so far. If AMD can't get their driver act together, it won't matter what amazing silicon they bring to the table; it won't put them on top if it doesn't work properly.

Sounds like a normal AMD experience to me. Devil May Cry 5 came bundled with my Radeon VII. Fresh install, newest drivers for everything, runs at 1/4 speed.

Was really strange to me since it was bundled... now I’d blame the game more than AMD. Bbutttt it was bundled and I had everything default per AMD lol.
 
Sounds like a normal AMD experience to me. Devil May Cry 5 came bundled with my Radeon VII. Fresh install, newest drivers for everything, runs at 1/4 speed.

Sounds like a normal no-help response from you, as per standard fare.
 
Installed, and not much better. Wattman crashes all the time also. Still not impressed!!!



Chill is a hotkey toggle; I never pressed it, so I don't think it's enabled.

Then it sounds like you have a physical hardware issue. How did the card run before you disassembled it?
 
It’s helpful in that expect wonky drivers / peformance with AMD. It’s no secret.

Hey, you own a Dodge; that is your problem, right there. Happy I could help. :rolleyes:

Then it sounds like you have a physical hardware issue. How did the card run before you disassembled it?

This is what I posted above, and what is actually considered helpful, or at least a question that can track things down. Ah, forget it, you never actually wanted to help anyway. Enjoy. :)
 
So, I've installed the 5700XT, waterblock fitted, all up and running. Temps are a little high at full load and OC'd, but not hitting thermal limits yet.

I miss my Nvidia 980ti already. I mean, honestly, does AMD's driver team have a clue what they are doing? I'll do a clean install soon just to make sure it's not some leftover portion of the Nvidia binaries, but my system is running like shit! Maximum frame rates are decent, but there's plenty of stuttering and slowdown for no good reason. I used DDU prior to installing the new card, but still, not a good experience.

Not an impressive experience so far. If AMD can't get their driver act together, it won't matter what amazing silicon they bring to the table; it won't put them on top if it doesn't work properly.

Which game(s) do you have these issues with? Maybe compare it against a review with frame times?

Also, a quick Time Spy run to make sure your score is on par with reviews never hurts.
 
MSI Afterburner, RivaTuner, or similar, do not like Navi. You should close/disable them and use the AMD overlay.

When you say temps are high, how high are you talking about? The temp reported will be higher naturally since it is hotspot temperature, but you may be throttling depending on the heat and cooling.
 