RTX 3xxx performance speculation

Last I checked, the best version of "DLSS" was in Control, and it turns out it really isn't DLSS, in that it isn't running a deep learning network. Instead they are using a compute shader program to upscale.

Think about that. The best DLSS isn't even using Tensor cores.
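(For anyone curious what a non-ML upscale even looks like: Control's actual upscaler isn't public, so the following is only a toy sketch of the general idea, a bilinear resample plus a cheap sharpen, with every name and number made up for illustration.)

```python
# Toy illustration only -- NOT Control's actual upscaler. Just a plain
# "compute-style" upscale (bilinear resample + cheap unsharp mask) that
# needs no neural network and no Tensor cores.
import numpy as np

def bilinear_upscale(img: np.ndarray, scale: int = 2) -> np.ndarray:
    """Upscale an HxWxC float image by an integer factor with bilinear filtering."""
    h, w, _ = img.shape
    ys = (np.arange(h * scale) + 0.5) / scale - 0.5   # sample positions in source space
    xs = (np.arange(w * scale) + 0.5) / scale - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 1)
    y1 = np.clip(y0 + 1, 0, h - 1)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 1)
    x1 = np.clip(x0 + 1, 0, w - 1)
    wy = np.clip(ys - y0, 0.0, 1.0)[:, None, None]
    wx = np.clip(xs - x0, 0.0, 1.0)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def sharpen(img: np.ndarray, amount: float = 0.5) -> np.ndarray:
    """Cheap unsharp mask: push the image away from a box-blurred copy."""
    blurred = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
               np.roll(img, 1, 1) + np.roll(img, -1, 1) + img) / 5.0
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

low_res = np.random.rand(540, 960, 3).astype(np.float32)   # stand-in for a 960x540 frame
high_res = sharpen(bilinear_upscale(low_res, 2))            # "1080p" output, no ML involved
```

The point is just that a pass like this runs on the ordinary shader/compute units; no Tensor cores needed.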

I would say that is a fairly damning indictment against how well it's going.

It's almost as if the tensor cores weren't actually made for DLSS, but for something more crypto related. Since, as everyone knows, Turing was famous for his work on ray tracing and AI deep learning during WWII. :confused:
 
Even in this video, the brief side-by-side comparison they discussed was at 60 Hz using a side-scrolling game. If they had used something like my monitor (Aorus KD25F) or a BenQ with DyAc vs the CRT in a modern game pushing 200+ Hz and fps, they wouldn't be crowing so loudly about CRTs. The other points they made are accurate: CRTs could do almost any resolution and look good, and they had a built-in softness to them which smoothed out harsh scenes in a game. But both technologies are a compromise; OLED has both beat for the near future, while long term microLED should be king.

Most of us here grew up with CRTs and used to haul them around to LAN parties to play Quake, so we know exactly what CRT offers and the compromises it comes with. I would never go back; my 240 Hz 1080p TN panel delivers exactly what I need for smooth motion handling.

Doesn't change the fact that image quality has not surpassed good CRTs... every A/B comparison I have seen in person still validates this.
 
I had both sitting on my desk for about a year; it was no contest. LCD is better. I had both at work, again no contest.

When we were switching over at work, everyone was begging to get upgraded to LCD ASAP.

Not one person wanted to keep using CRT, and these were 21" Trinitrons.

This nostalgic CRT worship is much like the few people who swear Vinyl records sound better than CDs.

Not nostalgia, different technology... and I suspect you are talking about Word... not e.g. Control...
 
That would be a mostly useless test, as CRTs are no longer made and are obsolete.

Waste of time and resources. Also they sucked for image editing. Sucked.

I thought we were talking about gaming?
(Movies also look SO much better on CRTs, FYI.)
 
Nope. The same applies to the transition from CRT TVs to LCD TVs.

LCD TVs are just so amazingly sharp compared to the old CRTs.



Your opinion is disputed... go figure.
But I guess if you cannot see true black, LCDs are an option for you.
 
Steam survey is about as good as you'll get for those numbers. Last time I looked, the 2080 Ti was barely a percent.

This says enough: 1080 Ti 1.6%, 2080 Ti 0.6%
Just buy it!
Highlights the irrelevancy of halo stuff in real-world use.

Steam isn't very accurate though and has had issues classifying AMD cards for a long time. But it's the best we've got.

You might want to add all the Turing cards, so you can see the "halo" effect:

NVIDIA GeForce GTX 1650 1.08%
NVIDIA GeForce GTX 1660 2.12%
NVIDIA GeForce GTX 1660 Ti 1.54%
NVIDIA GeForce RTX 2060 3.07%
NVIDIA GeForce RTX 2060 SUPER 0.41%
NVIDIA GeForce RTX 2070 1.96%
NVIDIA GeForce RTX 2070 SUPER 0.63%
NVIDIA GeForce RTX 2080 1.03%
NVIDIA GeForce RTX 2080 SUPER 0.33%
NVIDIA GeForce RTX 2080 Ti 0.50%
Total 12.67%
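(Quick arithmetic check on the quoted shares, in case anyone wants to reproduce the total and the 2080 Ti's slice of it; card names are as listed in the survey.)

```python
# Sum the Steam survey shares quoted above (percent of all surveyed GPUs).
turing_shares = {
    "GTX 1650": 1.08, "GTX 1660": 2.12, "GTX 1660 Ti": 1.54,
    "RTX 2060": 3.07, "RTX 2060 SUPER": 0.41,
    "RTX 2070": 1.96, "RTX 2070 SUPER": 0.63,
    "RTX 2080": 1.03, "RTX 2080 SUPER": 0.33, "RTX 2080 Ti": 0.50,
}
total = sum(turing_shares.values())
print(f"Turing total: {total:.2f}%")                                        # 12.67%
print(f"2080 Ti share of all Turing: {turing_shares['RTX 2080 Ti'] / total:.1%}")  # ~3.9%
```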

If Turing was a failure... then AMD's cards have been a disaster... you cannot have your cake and eat it too...

EDIT:
NVIDIA is competing with themselves... and in that regard they are doing well:
[attached image: PR-Imgae-GBP.png]
 
Your opinion is disputed... go figure.
But I guess if you cannot see true black, LCDs are an option for you.

Yay, more goofy YT links. No thanks. I really don't care about the opinions of click seekers.

CRT had terrible blacks in real usage. The tube picks up and bounces light internally all over the place.

Actual ANSI measured contrast on CRT was about 300:1

Worse than even the poorest LCD.
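(For anyone unfamiliar with the figure: ANSI contrast is measured on a 4x4 black/white checkerboard, average white-patch luminance over average black-patch luminance. The readings below are invented purely to show the arithmetic, not real measurements.)

```python
# ANSI checkerboard contrast: mean of the 8 white patches / mean of the 8 black patches.
# These luminance values (in nits) are made up for illustration only.
white_patches = [95.0, 97.0, 96.0, 94.0, 98.0, 95.0, 96.0, 97.0]
black_patches = [0.30, 0.33, 0.31, 0.35, 0.32, 0.30, 0.34, 0.31]

ansi_contrast = (sum(white_patches) / len(white_patches)) / (sum(black_patches) / len(black_patches))
print(f"ANSI contrast = {ansi_contrast:.0f}:1")   # 300:1 with these example numbers
```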
 
At some point we all just realize that whatever new thing comes out results in a slight bump. We're just buying new because of the charred-ness of the backside of our existing boards.
 
The market, through people's buying choices and decisions, views and understanding, clearly chose what was preferred and wanted. Sorry, CRTs are just about as viable as extinct dinosaurs. If gamers really saw the first round of affordable LCDs as better than the latest CRT technology of the time, then the hammer sounded, judgement rendered.

I have a QLED, Quantum Dot, HDR600 monitor, 144 Hz. No way in hell I would look back at CRT technology, which could not support HDR, was naturally blurry or soft, ran at lower resolutions, and the list goes on. Yes, motion in an A/B comparison was better with CRTs, but that is about the only thing besides lag (which for most of us is not significant). Everything else that is important, CRT tech is not there.

Let's see what company will start producing high-end CRT gaming monitors and how well that sells over time; most likely no company will bother.
 
Yay, more goofy YT links. No thanks. I really don't care about the opinions of click seekers.

CRT had terrible blacks in real usage. The tube picks up and bounces light internally all over the place.

Actual ANSI measured contrast on CRT was about 300:1

Worse than even the poorest LCD.

I prefer YT links over a random forum dude...
 
What about a random forum dude posting YT links?

Nice fallacy...got more? :)
By that "argumentation", you state that a random forum dude with no data (except fuzzy warm feelings) has the same "value" s a link to a YT video from a review site with data...and we all know that is not how the world works....

Post data if you have it; your personal opinion... useless.
 
Nice fallacy...got more? :)
By that "argumentation", you state that a random forum dude with no data (except fuzzy warm feelings) has the same "value" s a link to a YT video from a review site with data...and we all know that is not how the world works....

Post data if you have it; your personal opinion... useless.

Yes, I have more. I know a guy who thinks opinions are useless until they're posted on YouTube, where they magically become data.
 
Should keep in mind that an Ampere core is probably faster per clock than a Turing core. The bigger question is how much faster ;).

Yeah, we don't know. Even if it's a Maxwell-like 30% jump though, it will only roughly match the 2080 Ti without a clock speed boost. Of course the core counts could be completely different.
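(Napkin math on that, nothing more: the 2080 Ti and 2080 SUPER core counts are real, but the next-gen core count, the 30% per-clock gain and the clocks are pure guesses just to show how the scaling works out.)

```python
# Rough relative-throughput estimate: cores * per-clock gain * clock gain.
cores_2080_ti  = 4352        # TU102, RTX 2080 Ti
cores_next_gen = 3072        # GUESS: a "3070-class" part with a 2080 SUPER-like core count
per_clock_gain = 1.30        # the hypothetical Maxwell-like 30% jump discussed above
clock_gain     = 1.00        # assume no clock speed boost

relative = cores_next_gen * per_clock_gain * clock_gain / cores_2080_ti
print(f"~{relative:.0%} of a 2080 Ti")                       # ~92% under these assumptions
print(f"needs ~{1 / relative - 1:.0%} more clock to match")  # ~9%
```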
 
Sure!

Then prices can come down :D

Used 2080 Tis are going for $850-900 now, so it's a good time to buy one. I paid $800 for my Strix and it was new. I used my Titan X Pascal for nearly 4 years, got $430 back on it, and spent less than $400 to get the 2080 Ti; I think I got a very cheap upgrade.
 
I'm mostly betting on a die shrink and cores moved down a SKU, i.e. the 3070 is just a 2080 Super on 7 nm with a 5% clock speed bump and more RAM.
 
What method of DLSS does Wolfenstein Youngblood use now that RT finally came out on it?

I got the game for free back in September and loaded it up the other day to see how it looked (haven't played it yet). The ray tracing looked nice, and when I used DLSS at 4K, it looked better than with it off using the AA options. If that game truly uses DLSS, I'd vote that as the best example. FPS was in the 100s after DLSS (~60ish FPS without it) at 4K with RT on, and the DLSS looked beautiful.
The Wolfenstein update does finally use their deep learning system, rather than a shader algorithm based on some data points from the deep learning data. And yeah, it looks really good.
 
The Wolfenstein update does finally use their deep learning system, rather than a shader algorithm based on some data points from the deep learning data. And yeah, it looks really good.
Best RTX usage game I've seen (at least from video). First time I would consider/want an RTX-featured card. Second playthrough of the game now; I actually love this game due to the battle engagements, options and strategies one can take. To me a very underrated title, but then I got momentarily bored with RDR2 and started this for the 2nd time. Yes indeed! DLSS not only improved performance but the actual quality of the image at the same time -> That is what was expected, and it looks like it is very much possible with it -> Great news. Doom Eternal is going to have this as well, which should be just as good if not better, maybe a must-have feature or close to it for that title, we will see. Ampere, I am sure hoping/betting, will improve all of this; developers are more experienced, and a breakout in using RTX has appeared in both Control and Youngblood. More should follow.



For enthusiasts who play the latest new-tech games with savagery, RTX or RT (if AMD supports it well) may be one of the key features to look for. I am also wanting and looking for proper next generation monitor support (HDMI 2.1 and DP 2.0). I think I would prefer DP 2.0 over HDMI 2.1 if it only comes with one, due to increased capability, as long as there are suitable/workable adaptors for HDMI 2.1 for current high end and next gen HDTVs.

edit: Corrected DLSS
 
Best RTX usage game I've seen (at least from video). First time I would consider/want an RTX-featured card. Second playthrough of the game now; I actually love this game due to the battle engagements, options and strategies one can take. To me a very underrated title, but then I got momentarily bored with RDR2 and started this for the 2nd time. Yes indeed! DLSS not only improved performance but the actual quality of the image at the same time -> That is what was expected, and it looks like it is very much possible with it -> Great news. Doom Eternal is going to have this as well, which should be just as good if not better, maybe a must-have feature or close to it for that title, we will see. Ampere, I am sure hoping/betting, will improve all of this; developers are more experienced, and a breakout in using RTX has appeared in both Control and Youngblood. More should follow.



For enthusiasts who play the latest new-tech games with savagery, RTX or RT (if AMD supports it well) may be one of the key features to look for. I am also wanting and looking for proper next generation monitor support (HDMI 2.1 and DP 2.0). I think I would prefer DP 2.0 over HDMI 2.1 if it only comes with one, due to increased capability, as long as there are suitable/workable adaptors for HDMI 2.1 for current high end and next gen HDTVs.

Now that their Deep Learning system is actually being used: I would like to see a driver level setting. That way, we could have DLSS in almost any game. Of course, game specific tweaks will look a little better. But I am certain a generic setting would still be pretty great for a lot of games without specific DLSS features in game.
 
Now that their Deep Learning system is actually being used: I would like to see a driver level setting. That way, we could have DLSS in almost any game. Of course, game specific tweaks will look a little better. But I am certain a generic setting would still be pretty great for a lot of games without specific DLSS features in game.
Well, if Nvidia effectively gets AI working well, which for this title it is (see the native 4K compared to the lower-resolution DLSS version; DLSS properly renders many more items right, I'd say it blows it away), then broadly speaking AMD will have a very hard time, since they are not active in dedicated AI hardware. AI rendering has just begun, and it has huge potential to make all the previous rendering techniques antiques.
 
Best RTX usage game I've seen (at least from video). First time I would consider/want an RTX-featured card. Second playthrough of the game now; I actually love this game due to the battle engagements, options and strategies one can take. To me a very underrated title, but then I got momentarily bored with RDR2 and started this for the 2nd time. Yes indeed! DLSS not only improved performance but the actual quality of the image at the same time -> That is what was expected, and it looks like it is very much possible with it -> Great news. Doom Eternal is going to have this as well, which should be just as good if not better, maybe a must-have feature or close to it for that title, we will see. Ampere, I am sure hoping/betting, will improve all of this; developers are more experienced, and a breakout in using RTX has appeared in both Control and Youngblood. More should follow.



For enthusiasts who play the latest new-tech games with savagery, RTX or RT (if AMD supports it well) may be one of the key features to look for. I am also wanting and looking for proper next generation monitor support (HDMI 2.1 and DP 2.0). I think I would prefer DP 2.0 over HDMI 2.1 if it only comes with one, due to increased capability, as long as there are suitable/workable adaptors for HDMI 2.1 for current high end and next gen HDTVs.

edit: Corrected DLSS


Looks like the RTX 2060 Super can pull off 1440p native (no DLSS) ~60 fps ray tracing in this game while in combat on ultra settings. Hopefully more games are optimized like this with RTX in the future so it becomes more viable for mid-range cards like the 2060. Ampere/Hopper/whatever it's going to be called will probably have some really good RTX performance, I bet.

Anyway, the use of DLSS here is what all of us expected the first time they introduced it so hopefully they go back and optimize other DLSS titles like this one so it actually does what we want:

[attachments: dlssbetterthannative.JPG, rtdlssperf.JPG]
 
Looks like the RTX 2060 Super can pull off 1440p native (no DLSS) ~60 fps ray tracing in this game while in combat on ultra settings. Hopefully more games are optimized like this with RTX in the future so it becomes more viable for mid-range cards like the 2060. Ampere/Hopper/whatever it's going to be called will probably have some really good RTX performance, I bet.

Anyway, the use of DLSS here is what all of us expected the first time they introduced it so hopefully they go back and optimize other DLSS titles like this one so it actually does what we want:

View attachment 218189, View attachment 218190
Yes, I saw that; seems like he was running out of memory as well, but very doable. So how did id get this one right and others not so much? When DLSS is properly working it sure can be a game changer, and a rather big one at that. Agree that other titles should be upgraded if they can be. I just find this very strange; probably the first time I am actually excited about Turing, and more so about the prospects of Ampere.
 
Yes, I saw that; seems like he was running out of memory as well, but very doable. So how did id get this one right and others not so much? When DLSS is properly working it sure can be a game changer, and a rather big one at that. Agree that other titles should be upgraded if they can be. I just find this very strange; probably the first time I am actually excited about Turing, and more so about the prospects of Ampere.
It's nothing to do with id. Despite advertising DLSS as a deep learning feature, Nvidia didn't have their AI research model ready for primetime. It has taken this long for them to optimize it enough that it not only delivers great image quality, but also great performance. Control, for example, used an algorithm which approximated some of the key points from the research model, which still wasn't giving good performance when Control was in development. But because Control's DLSS is not actually derived from the research model, it has more noise and resolution loss during fast motion.

There's no reason AMD could not also do something like DLSS, and I expect they will. Either that, or bring over checkerboard rendering from the console side, which would also be pretty great.
 
Please for the love of all that's good take your CRT vs LCD discussion elsewhere - both sides. The rest of us don't care, it's off topic, we came here for RTX 3xxx speculation. If you feel like trading barbs, PM each other or start another thread.

On topic, I'm wondering whether the new Samsung process has matured to the point that we can see really high clocks on the 3xxx series GPUs. What do you guys think stock boost clocks will be? Over 2.1 GHz?
 
Please for the love of all that's good take your CRT vs LCD discussion elsewhere - both sides. The rest of us don't care, it's off topic, we came here for RTX 3xxx speculation. If you feel like trading barbs, PM each other or start another thread.

On topic, I'm wondering whether the new Samsung process has matured to the point that we can see really high clocks on the 3xxx series GPUs. What do you guys think stock boost clocks will be? Over 2.1 GHz?

Last word from Nvidia is that TSMC is still looking like their primary fab, next generation included.
 