RTX 3xxx performance speculation

CRTs are inferior for reading text and not as sharp as LCDs. Also too small (24" max), bulky and heavy, not to mention power hungry. And good luck finding one in good working order that hasn't deteriorated over time (discoloration, phosphor decay, etc.).
 
CRTs are inferior for reading text and not as sharp as LCDs. Also too small (24" max), bulky and heavy, not to mention power hungry. And good luck finding one in good working order that hasn't deteriorated over time (discoloration, phosphor decay, etc.).
I honestly think motion clarity these days is even superior to CRT. People just refuse to take off the rose-colored glasses. The only thing they will always have an advantage in is zero latency and zero input lag.
 
I honestly think motion clarity these days is even superior to CRT. People just refuse to take off the rose-colored glasses. The only thing they will always have an advantage in is zero latency and zero input lag.

I wonder if there is any overlap between the people praising CRTs and the people criticizing OLED for burn-in and durability issues.

They would be forgetting that CRTs typically peaked at 140 nits, and people ran them at 80 nits to preserve the fragile phosphors. Anyone who worked in a big org with CRTs will likely remember seeing lots of burned-in CRTs.
 
85Hz bruh
60 Hz always assaulted my eyes on CRTs. Most CRTs could do at least 85 Hz.
Ahhh, in my basement I still have my old Samsung SyncMaster! It would go up to 1280x1024 @ 75Hz... something like that anyway. Not sure why I still have it, other than in case I need a VGA monitor someday or feel like building an old-ass PC out of the parts I have lying around from the early 2000s. :) Maybe I keep it around because of all the amazing games like Half-Life, Tribes and UT I had fun with on that thing!
 
85Hz bruh
60 Hz always assaulted my eyes on CRTs. Most CRTs could do at least 85 Hz.
The higher the res, the blurrier it got, to the point of being terrible.
My Vision Master Pro 210 was not much use over 1600x1200.
It could do 2048x1536, but it wasn't usable.

And higher Hz made it blurrier too.
Hz be damned.
 
The higher the res, the blurrier it got, to the point of being terrible.
My Vision Master Pro 210 was not much use over 1600x1200.
It could do 2048x1536, but it wasn't usable.

And higher Hz made it blurrier too.
Hz be damned.

Most monitors had a 'sweet spot', like the 19" Trinitron / Diamondtron series that liked 1280x1024@120Hz, I believe. They could also do 1600x1200@85Hz, but sacrificed sharpness to do so.
 
Most monitors had a 'sweet spot', like the 19" Trinitron / Diamondtron series that liked 1280x1024@120Hz, I believe. They could also do 1600x1200@85Hz, but sacrificed sharpness to do so.
Makes sense. I remember when my dad got a fancy flat-screen ViewSonic 19" and it could run in the 2K range, but yeah, it was a bit blurry and shitty looking.
 
Most monitors had a 'sweet spot', like the 19" Trinitron / Diamondtron series that liked 1280x1024@120Hz, I believe. They could also do 1600x1200@85Hz, but sacrificed sharpness to do so.

85 Hz or above on a CRT was unfathomably smooth. I can't even compare it to modern monitors; maybe it's what a 360 Hz panel feels like? I'd love it if we could compare this with one of those Blur Busters-style YouTube test videos.

If my FW900 still worked I'd do it myself. Alas, the only repair guy in my state is like 2+ hours away each way, and the last quote I got was near a grand to service it.
 
Yeah, nothing is quite as smooth as the placebo effect, boosted by nostalgia.

Lol, so true. The CRT motion handling was nice, but honestly, when I'm pushing 240 fps on my 240 Hz panel with strobing, the motion is super buttery smooth. I don't miss bulky, power-hungry CRTs at all.
 
CRTs are inferior for reading text and not as sharp as LCDs. Also too small (24" max), bulky and heavy, not to mention power hungry. And good luck finding one in good working order that hasn't deteriorated over time (discoloration, phosphor decay, etc.).

Oh, for sure, I'd say modern LCD VA/IPS panels look better in basically every way these days. And black levels/contrast ratio are a consideration now. My old Sony 19" Trinitron was laughably bad, and after the five years or so of ownership, it could no longer do black, just a bluish grey! :LOL:

Every res was native though!
 
Those aren't facts, they are opinions.

Opinions in an industry where views = $. Controversial videos stand out and get more views.

And they get repeatedly linked when fans of something want something to back them up.

Your post invalidates every post you make as an "opinion"... but you keep posting like you post facts... nice own goal.
 
Oh, for sure, I'd say modern LCD VA/IPS panels look better in basically every way these days. And black levels/contrast ratio are a consideration now. My old Sony 19" Trinitron was laughably bad, and after the five years or so of ownership, it could no longer do black, just a bluish grey! :LOL:

Every res was native though!

I have yet to see a flatscreen in an A/B test that provided better image quality than a CRT. Care to link to such reviews?
 
I have yet to see a flatscreen in an A/B test that provided better image quality than a CRT. Care to link to such reviews?

Lol, look who's back. Welcome to 2020, Factum! :rolleyes:

I said, "looks better" and, "I'd say".

You can go back to your little cave now.
 
I have yet to see a flatscreen in an A/B test that provided better image quality than a CRT. Care to link to such reviews?

That would be a mostly useless test, as CRTs are no longer made and are obsolete.

Waste of time and resources. Also they sucked for image editing. Sucked.
 
I have yet to see a flatscreen in an A/B test that provided better image quality than a CRT. Care to link to such reviews?


I had both sitting on my desk for about a year; it was no contest. LCD is better. I had both at work; again, no contest.

When we were switching over at work, everyone was begging to get upgraded to LCD ASAP.

Not one person wanted to keep using CRT, and these were 21" Trinitrons.

This nostalgic CRT worship is much like the few people who swear vinyl records sound better than CDs.
 
Expanded more:


Even in this video, the brief side-by-side comparison they discussed was at 60 Hz using a side-scrolling game. If they had used something like my monitor (Aorus KD25F) or a BenQ with DyAc against the CRT in a modern game pushing 200+ Hz and fps, they wouldn't be crowing so loudly about CRTs. The other points they made are accurate: CRTs could do almost any resolution and look good, and they had a built-in softness that smoothed out harsh scenes in a game. But both technologies are a compromise; OLED has both beat for the near future, while long term MicroLED should be king.

Most of us here grew up with CRTs and used to haul them around to LAN parties to play Quake, so we know exactly what CRT offers and the compromises it comes with. I would never go back; my 240 Hz 1080p TN panel delivers exactly what I need for smooth motion handling.
 
I had both sitting on my desk for about a year; it was no contest. LCD is better. I had both at work; again, no contest.

When we were switching over at work, everyone was begging to get upgraded to LCD ASAP.

Not one person wanted to keep using CRT, and these were 21" Trinitrons.

This nostalgic CRT worship is much like the few people who swear vinyl records sound better than CDs.

I think it is part nostalgia/rose-coloured glasses, in that people tend to remember the good things in the past and forget the bad. I also think part of it is that we notice the problems with whatever we have now, so if something doesn't have those particular problems it can look better because we don't consider its downsides. CRTs did have some legit advantages over LCDs. Until recently, variable/high refresh rates were something you needed a CRT for; the way they draw their picture, with only a small part illuminated at any moment, makes for very crisp frames; and they are good at doing multiple resolutions near their max (LCDs have trouble being clear if you are slightly under their max res). However, people forget all the issues they had: spending hours messing around trying to get the geometry square and still having it only "mostly" right, their utter inability to get bright without blooming/loss of saturation, how shitty low resolutions could look when scan lines started appearing because there weren't enough lines for the guns, etc.
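For what it's worth, the "crisp frames" point mostly comes down to persistence: a sample-and-hold LCD keeps each frame lit for the whole refresh interval, while a CRT phosphor flashes for only a millisecond or two. A rough back-of-envelope sketch (the persistence figures and scroll speed below are illustrative assumptions, not measurements):

```python
# Rough motion-blur comparison: perceived blur width (in pixels) is roughly
# pixel persistence multiplied by on-screen motion speed.
# Persistence figures below are illustrative assumptions, not measurements.

def blur_px(persistence_ms: float, speed_px_per_s: float) -> float:
    """Approximate perceived blur width in pixels for a moving object."""
    return persistence_ms / 1000.0 * speed_px_per_s

speed = 1920  # an object crossing a 1080p screen in one second

displays = {
    "60 Hz sample-and-hold LCD (16.7 ms)": 16.7,
    "240 Hz sample-and-hold LCD (4.2 ms)": 4.2,
    "240 Hz LCD with ~1 ms strobe": 1.0,
    "CRT phosphor (~1.5 ms effective decay)": 1.5,
}

for name, persistence in displays.items():
    print(f"{name}: ~{blur_px(persistence, speed):.0f} px of smear")
```

By that crude math, a strobed high-refresh panel lands in the same ballpark as a CRT, while plain sample-and-hold at 60 Hz is an order of magnitude behind.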
 
I don't know how this devolved into a CRT discussion, but I'm old enough to have used quite a few CRTs, and the things people forget are geometry issues, reflections, GPUs having varying-quality D/A converters (so GPU model X might give you worse image quality than GPU model Y), not being able to run high enough Hz on some, making them headache-inducing pieces of crap, and yes, they did have a practical "native" resolution, which aligns with the pitch of the shadow mask I think. You could go higher, but it would make everything too small in an age before DPI scaling was a thing.
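On the practical "native" resolution point, the mask/grille pitch really does cap how much detail the tube can resolve, and the arithmetic is quick (the pitch and viewable width below are assumptions for a typical 19" tube, not measured specs):

```python
# Rough ceiling on useful CRT resolution implied by the mask/grille pitch.
# Numbers are illustrative assumptions for a typical 19" tube, not specs.

viewable_width_mm = 365   # ~18" viewable diagonal at 4:3 gives ~365 mm width
dot_pitch_mm = 0.25       # a common aperture-grille pitch of the era

# Each addressable pixel needs roughly one phosphor triad/stripe,
# so the pitch caps the horizontal detail the tube can resolve.
max_useful_horizontal = viewable_width_mm / dot_pitch_mm
print(f"~{max_useful_horizontal:.0f} horizontal pixels worth of detail")
# -> ~1460, which is why 1600x1200 and above started to look soft.
```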

One thing CRTs did do well is make low resolutions look really good. I have a couple of old CRT monitors from a TV studio and those are just amazing for things like PlayStation 2 games. Those old games look much better on these, and the developers of said games took good advantage of that. CRTs are not pin-sharp, so they soften the image just a bit, which makes it look like it's higher res than it really is.

But getting back to the 30xx series, where they land will be interesting. Will Nvidia release a 3080 Ti right off the bat or save it for later? Will they manage to squeeze out a significant improvement in both raster and raytracing performance to make 2080 Ti owners like me upgrade? Will DisplayPort 2.0 support be included despite no displays using it having been released or announced? RT performance is slowly but surely getting better thanks to improved software solutions too.
 
Here's one for the thread, big speculation: Nvidia works with a number of others and comes up with a whole new rendering technique that is neither rasterization nor RT. AI rendering, where the state of each object and how it reacts to light is defined; the AI takes all objects in the scene, with lighting information for each object, and comes up with a solution for any light source introduced: changes in lights, movement of lights, material emission for color bleeding, shadowing. A totally different way to calculate what to render, so to speak. In a way, Nvidia's denoising and DLSS are already doing this, but still using RT and rasterization for rendering with modifications by AI. SPECULATION ONLY.
 
They were accomplishing the real-time results with the capability of the Titan V, so I do not think that ability is a decade out. It is most likely much more refined now. I could see GTA VII having a city where everyone is unique, and each character could even be transformed to a different culture depending on the location of the user. Except that will most likely be developed around console ability, cough cough, AMD. Nvidia has already proven they can do portraits of believable people who don't exist:

https://www.theverge.com/tldr/2019/...ple-portraits-thispersondoesnotexist-stylegan

Why would Nvidia be doing this R&D unless they see a benefit as a company? Most if not all of the software is open sourced, and others are using it. Hardware better refined to accomplish those types of tasks would need to come out for developers to go further into this. RT, as in RTX, can also be a way to interpret traditional games, assets and game engines; I see it as possibly a stepping stone toward using AI. AI is hands down giving results that are utterly remarkable and much better than traditional methods of 3D work and rendering. The numbers for real-time raytracing in games do not add up if you force the calculations through a traditional RT approach. Even 10x the hardware RT capability will fall short of complete real-time, full-effect RT (really not even close).
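To put a rough number on "the math doesn't add up", here's a back-of-envelope ray budget for brute-force path tracing at 4K/60 (every figure below is an illustrative assumption, not a measured spec):

```python
# Back-of-envelope ray budget for brute-force path tracing at 4K/60.
# All figures are illustrative assumptions, not measured specs.

pixels = 3840 * 2160          # one 4K frame
fps = 60
samples_per_pixel = 1000      # roughly what offline renders need for a clean image
rays_per_sample = 4           # a few bounces plus shadow rays per sample

rays_needed = pixels * fps * samples_per_pixel * rays_per_sample
claimed_rays_per_s = 10e9     # ballpark "10 Gigarays/s" marketing figure for Turing

print(f"Rays needed per second: {rays_needed / 1e9:,.0f} Gigarays")
print(f"Shortfall vs ~10 Gigarays/s: ~{rays_needed / claimed_rays_per_s:,.0f}x")
# Even a 10x faster GPU is still orders of magnitude short, which is why
# denoising, upscaling and other AI/heuristic shortcuts carry the load.
```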

Another avenue for big bucks would be the entertainment industry; all these unique capabilities with dedicated hardware would go a long way toward making movies, shows and special effects at a lower cost. I do believe that would also come to gaming as more and more people get proficient with it, and as more movies become big game development opportunities where the assets could be shared. Maybe a whole new uber-ecosystem for entertainment/gaming.

I believe Ampere is going to be much more than a traditional upgrade; it has the potential, I do believe, to be the biggest step Nvidia has ever made.
 
They were accomplishing the real-time results with the capability of the Titan V, so I do not think that ability is a decade out. It is most likely much more refined now. I could see GTA VII having a city where everyone is unique, and each character could even be transformed to a different culture depending on the location of the user. Except that will most likely be developed around console ability, cough cough, AMD. Nvidia has already proven they can do portraits of believable people who don't exist:

https://www.theverge.com/tldr/2019/...ple-portraits-thispersondoesnotexist-stylegan

Why would Nvidia be doing this R&D unless they see a benefit as a company? Most if not all of the software is open sourced, and others are using it. Hardware better refined to accomplish those types of tasks would need to come out for developers to go further into this. RT, as in RTX, can also be a way to interpret traditional games, assets and game engines; I see it as possibly a stepping stone toward using AI. AI is hands down giving results that are utterly remarkable and much better than traditional methods of 3D work and rendering. The numbers for real-time raytracing in games do not add up if you force the calculations through a traditional RT approach. Even 10x the hardware RT capability will fall short of complete real-time, full-effect RT (really not even close).

Another avenue for big bucks would be the entertainment industry; all these unique capabilities with dedicated hardware would go a long way toward making movies, shows and special effects at a lower cost. I do believe that would also come to gaming as more and more people get proficient with it, and as more movies become big game development opportunities where the assets could be shared. Maybe a whole new uber-ecosystem for entertainment/gaming.

I believe Ampere is going to be much more than a traditional upgrade; it has the potential, I do believe, to be the biggest step Nvidia has ever made.
Whoa, now, let's not start building up expectations like an AMD fanboy. We all know how that ends.
 
Yeah, that looks like an order of magnitude harder than DLSS, and we all know how that went...

It's going slowly... last I checked.

Where they've gotten it solidly working, it's delivering in ways that less sophisticated solutions aren't capable of.


Of course, they could have started with those ;)
 
It's going slowly... last I checked.

Where they've gotten it solidly working, it's delivering in ways that less sophisticated solutions aren't capable of.


Of course, they could have started with those ;)

Last I checked, the best version of "DLSS" was in Control, and it turns out it really isn't DLSS, in that it isn't running a deep learning network. Instead they are using a program on the compute cores to upscale.

Think about that. The best DLSS isn't even using tensor cores.

I would say that is a fairly damning indictment of how well it's going.
 
Whoa, now, let's not start building up expectations like an AMD fanboy. We all know how that ends.
Really? Hey, it's just a speculation thread; do you have any relevant facts? Nvidia has been working on this for a while, that's real info/facts. RTX is not magically going to 10x its calculations or its ray casting for more accuracy and usability. It is so limited now for real time that it will remain limited, even if somewhat useful for some things. The increase in RTX may have much more to do with AI than anything else. Then again, game developers will most likely be looking at consoles more than just PCs and would be more limited by their abilities; next gen may have RT, but how good will it really be?
 
Last I checked, the best version of "DLSS" was in Control, and it turns out it really isn't DLSS, in that it isn't running a deep learning network. Instead they are using a program on the compute cores to upscale.

Think about that. The best DLSS isn't even using tensor cores.

I would say that is a fairly damning indictment of how well it's going.
Those tensor cores may be used for some other interesting gaming-side uses: more YouTube-type streaming, videos, etc. Maybe they will be incorporated into GeForce Experience in the future.

https://blogs.nvidia.com/blog/2019/09/26/nvidia-rtx-broadcast-engine-twitch-livestream-ai/
 
Last I checked, the best version of "DLSS" was in Control, and it turns out it really isn't DLSS, in that it isn't running a deep learning network. Instead they are using a program on the compute cores to upscale.

Think about that. The best DLSS isn't even using tensor cores.

I would say that is a fairly damning indictment of how well it's going.
What method of DLSS does Wolfenstein: Youngblood use now that RT finally came out for it?

I got the game for free back in September and loaded it up the other day to see how it looked (haven't played it yet). The ray tracing looked nice, and when I used DLSS at 4K it looked better than with it off using the AA options. If that game truly uses DLSS, I'd vote it the best example. FPS was in the 100s with DLSS (~60ish FPS without it) at 4K with RT on, and the DLSS looked beautiful.
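The fps jump lines up with the pixel math: DLSS renders internally at a lower resolution and upscales, so the shading work drops roughly with the pixel count. A minimal sketch, assuming generic per-axis scale factors rather than the game's exact modes:

```python
# Why DLSS roughly doubles fps at 4K: the GPU shades far fewer pixels and
# the upscale pass is comparatively cheap.
# Scale factors below are generic assumptions, not the game's exact modes.

output_w, output_h = 3840, 2160

def internal_res(scale: float) -> tuple:
    """Internal render resolution for a given per-axis scale factor."""
    return round(output_w * scale), round(output_h * scale)

for label, scale in [("quality-style (~0.67x per axis)", 2 / 3),
                     ("performance-style (0.5x per axis)", 0.5)]:
    w, h = internal_res(scale)
    saved = 1 - (w * h) / (output_w * output_h)
    print(f"{label}: renders {w}x{h}, ~{saved:.0%} fewer pixels shaded")
```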
 
Last I checked, the best version of "DLSS" was in Control, and it turns out it really isn't DLSS, in that it isn't running a deep learning network. Instead they are using a program on the compute cores to upscale.

Think about that. The best DLSS isn't even using tensor cores.

I would say that is a fairly damning indictment of how well it's going.
I know they had a demo showing off bits of a new space/moon game. I can't say that I've really kept up with it; however, what they've shown has been pretty impressive.

And RTSS is indicative of where rendering is going, as in using some form of machine learning to 'cull' the work that needs to be done in order to get more performance out of the available hardware. We're looking at 8K outputs as well as VR on the one hand, while hitting die-shrink and reticle-size limits on the other. Perhaps chiplets will help, as they have with CPUs as AMD has demonstrated, but there needs to be more and more focus on optimizing code and hardware in parallel versus simply throwing more resources at the problem.
 
The big advantage I see Nvidia having for Ampere, particularly regarding RTX, is the feedback they have received from developers working with their hardware. How much Ampere could be improved using that feedback is another thing, but I am sure Nvidia had a number of contingencies for Ampere. My speculation for Ampere is a 4th-quarter launch this year.
 