I honestly think motion clarity these days is even superior to CRT. People just refuse to take off the rose-colored glasses. The only thing CRTs will always have an advantage in is zero latency and zero input lag.
CRTs are inferior for reading text and not as sharp as LCDs. They're also too small (24" max), bulky, heavy, and power hungry. And good luck finding one in good working order that hasn't deteriorated over time (discoloration, phosphor decay, etc.).
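For anyone who wants rough numbers on the motion-clarity point, here's a back-of-the-envelope persistence calculation; the panning speed and persistence values are illustrative assumptions, not measurements of any particular display:

```python
# Rough eye-tracking motion-blur estimate: the perceived smear is roughly
# (panning speed) x (time each frame stays lit on screen).
# All numbers below are illustrative assumptions, not measurements.

def smear_px(pan_speed_px_per_s: float, persistence_ms: float) -> float:
    """Approximate eye-tracking motion blur, in pixels."""
    return pan_speed_px_per_s * (persistence_ms / 1000.0)

pan_speed = 960.0  # px/s, a brisk pan across a 1080p-wide image

displays = {
    "60 Hz sample-and-hold LCD (~16.7 ms persistence)": 16.7,
    "240 Hz sample-and-hold LCD (~4.2 ms persistence)": 4.2,
    "Strobed/BFI LCD or OLED (~1 ms persistence)": 1.0,
    "CRT phosphor (~1 ms or less)": 1.0,
}

for name, persistence in displays.items():
    print(f"{name}: ~{smear_px(pan_speed, persistence):.1f} px of smear")
```

Which is basically why a strobed or very high refresh panel can, at least on paper, land in CRT territory for motion clarity.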
Ahhh, in my basement I still have my old Samsung SyncMaster! It would go up to 1280x1024 @ 75Hz... something like that, anyway. Not sure why I still have it, other than in case I need a VGA monitor someday or feel like building an old-ass PC out of the early-2000s parts I have lying around. Maybe I keep it around because of all the amazing games like Half-Life, Tribes and UT I had fun with on that thing!
85Hz bruh
60Hz always assaulted my eyes on CRTs. Most CRTs could do at least 85Hz.
The higher the res, the more blurry, to the point of terrible.
My Vision Master Pro 210 was not much use over 1600x1200.
It could do 2048x1536, but it wasn't usable.
And higher Hz was more blurry too.
Hz be damned.
Makes sense. I remember when my dad got a fancy flat-screen ViewSonic 19" and it could run in the 2K range, but yeah, a bit blurry and shitty looking. Makes sense.
Most monitors had a 'sweet spot', like the 19" Trinitron / Diamondtron series that liked 1280x1024@120Hz, I believe. They could also do 1600x1200@85Hz, but sacrificed sharpness to do so.
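That sweet-spot behavior is mostly a video-bandwidth thing: the pixel clock the tube's amplifier has to push. Rough math below; the ~30% blanking overhead and the bandwidth figure are ballpark assumptions, not specs for any particular monitor:

```python
# Approximate pixel clock needed for a given CRT mode.
# The blanking overhead (~30%) and the amplifier bandwidth are rough
# assumptions for illustration, not specs for a specific model.

def pixel_clock_mhz(width: int, height: int, refresh_hz: float,
                    blanking_overhead: float = 0.30) -> float:
    """Pixel clock in MHz, including horizontal/vertical blanking time."""
    return width * height * refresh_hz * (1.0 + blanking_overhead) / 1e6

video_bandwidth_mhz = 210.0  # assumed amplifier bandwidth for a good 19" tube

for w, h, hz in [(1280, 1024, 120), (1600, 1200, 85), (2048, 1536, 60)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "comfortable" if clk <= video_bandwidth_mhz else "past the limit (soft image)"
    print(f"{w}x{h}@{hz}Hz -> ~{clk:.0f} MHz pixel clock: {verdict}")
```

Once the required pixel clock climbs past what the electronics can actually resolve, the dots just smear together, which is the blur people remember at the top resolutions.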
85hz or above on a CRT was unfathomably smooth.
Yeah, nothing is quite as smooth as the placebo effect, boosted by Nostalgia.
Why bother with facts?
Ignorance is the new black...
Every res was native though!
Ah yeah, I miss the good ol' days of taking my PC and CRT to LAN parties back in the early 00's.
CRTs had their pros and cons just like LCDs, but a high-quality CRT made a big difference in image quality. God, it sucked dragging them to a LAN party, though.
Those aren't facts, they are opinions.
Opinions in an industry where views = $. Controversial videos stand out and get more views.
And get repeatedly linked when fans of something want something to back them up.
Oh, for sure, I'd say modern LCD VA/IPS panels look better in basically every way these days. And black levels / contrast ratio are an actual consideration now. My old Sony 19" Trinitron was laughably bad; after five years or so of ownership it could no longer do black, just a blueish grey!
I have yet to see a flatscreen in an A/B test that provided better image quality than a CRT; care to link to such reviews?
Expanded more:
I had both sitting on my desk for about a year; it was no contest, LCD is better. I had both at work; again, no contest.
When we were switching over at work, everyone was begging to get upgraded to LCD ASAP.
Not one person wanted to keep using CRT, and these were 21" Trinitrons.
This nostalgic CRT worship is much like the people who swear vinyl records sound better than CDs.
https://www.tomshardware.com/news/nvidia-ai-research-render-graphics,38185.html
Nvidia may have some very exciting tech for us to play with.
Maybe, in a decade or so, but not anytime soon.
Whoa, now, let's not start building up expectations like an AMD fanboy. We all know how that ends.
They were accomplishing the real-time results with the capability of a Titan V, so don't think that ability is a decade out; it is most likely much more refined now. I could see GTA VII having a city where everyone is unique, and each character could even be transformed to suit a different culture depending on the user's location. Except it will most likely be developed around console capability, cough cough AMD. Portraits of believable people who don't exist, Nvidia has already proven they can do now:
https://www.theverge.com/tldr/2019/...ple-portraits-thispersondoesnotexist-stylegan
Why would Nvidia be doing this R&D unless they saw a benefit as a company? Most if not all of the software is open sourced, and others are already using it. Hardware better refined to accomplish these kinds of tasks would need to come out for developers to go further with it. RT, as in RTX, can also be a way to interpret traditional games, assets and game engines; I see it as possibly a stepping stone toward using AI. AI is hands down giving results that are utterly remarkable and much better than traditional methods of 3D work and rendering. The numbers for real-time ray tracing in games simply do not add up if you force the calculations in a traditional RT environment: even 10x the current hardware capability falls short of complete real-time, full-effect RT (really, not even close; see the rough ray-budget math below).
Another avenue for big bucks would be the entertainment industry: all these unique capabilities, backed by dedicated hardware, would go a long way toward making movies, shows and special effects at a lower cost. I do believe that would also come to gaming as more people get proficient with it, and as more movies become big game-development opportunities where the assets could be shared. Maybe a whole new uber-ecosystem for entertainment/gaming.
I believe Ampere is going to be much more than a traditional upgrade; it has the potential to be the biggest step Nvidia has ever made.
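Rough ray-budget math on that point, purely for illustration: the resolution, sample count, bounce count and hardware figure below are assumptions for the sake of argument, not benchmarks.

```python
# Back-of-the-envelope ray budget for "full effect" path tracing vs. today's RT hardware.
# Every figure here is an assumption for illustration, not a measured spec.

width, height = 3840, 2160     # 4K
fps = 60
samples_per_pixel = 256        # film-style sample count for a clean image without heavy denoising
rays_per_sample = 4            # primary ray plus a few bounces

rays_needed = width * height * fps * samples_per_pixel * rays_per_sample
hardware_rays_per_sec = 10e9   # ~10 gigarays/s, the ballpark figure quoted for a 2080 Ti-class GPU

print(f"Rays/sec for 'full' path tracing at 4K60: ~{rays_needed / 1e9:.0f} gigarays/s")
print(f"Assumed hardware budget:                  ~{hardware_rays_per_sec / 1e9:.0f} gigarays/s")
print(f"Shortfall:                                ~{rays_needed / hardware_rays_per_sec:.0f}x")
print(f"Even with a 10x faster GPU:               ~{rays_needed / (10 * hardware_rays_per_sec):.0f}x short")
```

Denoising and AI reconstruction are exactly how the hybrid approach tries to paper over that gap, which is why the AI side may matter more than the raw RT numbers.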
Yeah, that looks an order of magnitude harder than DLSS, and we all know how that went...
It's going slowly... last I checked.
Where they've gotten it solidly working, it's delivering in ways that less sophisticated solutions aren't capable of.
Of course, they could have started with those.
Really? Hey, it's just a speculation thread; do you have any relevant facts? Nvidia has been working on this for a while, with real info/facts out there. RTX is not magically going to 10x its calculations or increase ray casting enough for real accuracy and usability; it is so limited for real time now that it will remain limited, even if somewhat useful for some things. The increase in RTX may have much more to do with AI than anything else. Then again, game developers will most likely be looking at consoles more than just PCs and will be constrained by their abilities; even if next gen has RT, how good will it really be?
Those tensor cores may be used for some other interesting gaming-side uses, more YouTube-y streaming, videos, etc. Maybe it will be incorporated into GeForce Experience in the future.
Last I checked, the best version of "DLSS" was in Control, and it turns out it really isn't DLSS, in that it isn't running a deep-learning network. Instead they are using a program on the regular compute cores to upscale.
Think about that. The best DLSS isn't even using Tensor cores.
I would say that is a fairly damning indictment of how well it's going.
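For the curious, the kind of non-neural upscale being described is conceptually just a resample-plus-sharpen pass. Here's a minimal NumPy sketch of that idea (bilinear resample plus an unsharp mask); it's only an illustration of the concept, not Control's actual shader:

```python
import numpy as np

# Minimal "compute-style" upscaler: bilinear resample + unsharp mask.
# Purely an illustration of a non-neural upscale pass, not any game's real code.

def bilinear_upscale(img: np.ndarray, scale: float) -> np.ndarray:
    h, w = img.shape[:2]
    new_h, new_w = int(h * scale), int(w * scale)
    ys = np.linspace(0, h - 1, new_h)
    xs = np.linspace(0, w - 1, new_w)
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None, None]
    wx = (xs - x0)[None, :, None]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

def unsharp_mask(img: np.ndarray, amount: float = 0.5) -> np.ndarray:
    # Cheap 3x3 box blur as the low-pass, then push the lost detail back in.
    blurred = img.copy()
    for axis in (0, 1):
        blurred = (np.roll(blurred, 1, axis) + blurred + np.roll(blurred, -1, axis)) / 3.0
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

# Example: take a 720p-ish frame up to ~1080p and sharpen it.
frame = np.random.rand(720, 1280, 3).astype(np.float32)
upscaled = unsharp_mask(bilinear_upscale(frame, 1.5))
print(upscaled.shape)  # (1080, 1920, 3)
```

A real DLSS pass would swap that hand-tuned filter for a trained network running on the tensor cores, which is the whole point of the complaint above.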
What method of DLSS does Wolfenstein: Youngblood use now that RT has finally come out for it?
I know they had a demo showing off bits of a new space / moon game. I can't say that I've really kept up with it; however, what they've shown has been pretty impressive.