RTX 3xxx performance speculation

Discussion in 'nVidia Flavor' started by Nebell, Oct 25, 2019.

  1. amenx

    amenx Limp Gawd

    Messages:
    323
    Joined:
    Dec 17, 2005
    CRTs are inferior for reading text, not as sharp as LCDs. They're also too small (24" max) and bulky and heavy, not to mention power hungry. And good luck finding one in good working order that hasn't deteriorated over time (discoloration, phosphor decay, etc.).
     
    Armenius likes this.
  2. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    20,005
    Joined:
    Jan 28, 2014
    I honestly think motion clarity these days is even superior to CRT. People just refuse to take off the rose-colored glasses. The only things they will always have an advantage in are zero latency and zero input lag.
     
  3. Algrim

    Algrim [H]ard|Gawd

    Messages:
    1,611
    Joined:
    Jun 1, 2016
    I, for one, welcome our LCD overlords. The days of wrecking my back picking up a massive NEC monitor are behind me. :eek:
     
    5150Joker, Auer and Armenius like this.
  4. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    9,907
    Joined:
    Apr 22, 2006
    I wonder if there is any overlap between the people praising CRTs and those criticizing OLED for burn-in and durability issues.

    They would be forgetting that CRTs typically peaked at around 140 nits, and people ran them at 80 nits to preserve the fragile phosphors. Anyone who spent time in a big org full of CRTs will likely remember seeing lots of burned-in screens.
     
    spine and Armenius like this.
  5. III_Slyflyer_III

    III_Slyflyer_III n00b

    Messages:
    49
    Joined:
    Sep 17, 2019
    Ahhh, in my basement I still have my old Samsung SyncMaster! It would go up to 1280x1024 @ 75Hz... something like that, anyway. Not sure why I still have it, other than in case I need a VGA monitor someday or feel like building an old-ass PC out of the early-2000s parts I have laying around. :) Maybe I keep it around because of all the amazing games like Half-Life, Tribes, and UT I had fun with on that thing!
     
  6. Nenu

    Nenu [H]ardened

    Messages:
    19,091
    Joined:
    Apr 28, 2007
    The higher the res, the blurrier it got, to the point of being terrible.
    My Vision Master Pro 210 was not much use over 1600x1200.
    It could do 2048x1536, but that wasn't usable.

    And higher Hz was blurrier too.
    Hz be damned.
     
    Last edited: Jan 10, 2020
  7. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,891
    Joined:
    Jun 13, 2003
    Most monitors had a 'sweet spot', like the 19" Trinitron/Diamondtron series that liked 1280x1024@120Hz, I believe. They could also do 1600x1200@85Hz, but sacrificed sharpness to do so.
     
  8. N4CR

    N4CR [H]ardness Supreme

    Messages:
    4,367
    Joined:
    Oct 17, 2011
    Makes sense. I remember when my dad got a fancy flat-screen ViewSonic 19" that could run in the 2K range, but yeah, it looked a bit blurry and shitty up there.
     
    IdiotInCharge likes this.
  9. oldmanbal

    oldmanbal 2[H]4U

    Messages:
    2,121
    Joined:
    Aug 27, 2010
    85Hz or above on a CRT was unfathomably smooth. I can't even compare it to modern monitors; maybe it's what a 360Hz panel feels like? Would love it if we could compare this with some Blur Busters big-balls YouTube video.

    If my FW-900 still worked I'd do it myself. Alas, the only repair guy in my state is 2+ hours away each way, and the last quote I got was near a grand to service it.
     
  10. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    9,907
    Joined:
    Apr 22, 2006
    Yeah, nothing is quite as smooth as the placebo effect, boosted by nostalgia.
     
    grambo, Armenius, PhaseNoise and 4 others like this.
  11. 5150Joker

    5150Joker 2[H]4U

    Messages:
    3,437
    Joined:
    Aug 1, 2005
    Lol, so true. The CRT motion handling was nice, but honestly, when I'm pushing 240 fps on my 240Hz panel with strobing, the motion is super buttery smooth. I don't miss bulky, power-hungry CRTs at all.
     
    Armenius likes this.
  12. oldmanbal

    oldmanbal 2[H]4U

    Messages:
    2,121
    Joined:
    Aug 27, 2010
    Well, I'd take a Lewinsky scandal over aidgate any day.
     
  13. Factum

    Factum [H]ard|Gawd

    Messages:
    1,880
    Joined:
    Dec 24, 2014
    Why bother with facts?

    Ignorance is the new black...
     
    sabrewolf732 likes this.
  14. Factum

    Factum [H]ard|Gawd

    Messages:
    1,880
    Joined:
    Dec 24, 2014
  15. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    9,907
    Joined:
    Apr 22, 2006
    Those aren't facts; they're opinions.

    Opinions in an industry where views = $. Controversial videos stand out and get more views.

    And get repeatedly linked when fans of something want something to back them up.
     
  16. spine

    spine 2[H]4U

    Messages:
    2,626
    Joined:
    Feb 4, 2003
    Oh, for sure, I'd say modern LCD VA/IPS panels look better in basically every way these days. And black levels/contrast ratio is a real consideration now. My old Sony 19" Trinitron was laughably bad; after five years or so of ownership it could no longer do black, just a blueish grey! :LOL:

    Every res was native, though!
     
  17. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    9,907
    Joined:
    Apr 22, 2006
    More like no res was native. :D

    Ever put up a 1-pixel checkerboard on a CRT? You have to drop quite low in resolution before they stop blurring it out to grey.
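
    If anyone wants to repeat the test, here's a minimal sketch for generating the pattern (Pillow is an assumption, any image library works; set the size to your display's resolution and view the file full screen at 1:1 zoom):

    ```python
    # Generate a 1-pixel checkerboard test pattern.
    from PIL import Image

    W, H = 1920, 1080  # example size; match your display

    img = Image.new("L", (W, H))
    # (x + y) parity alternates black/white on every single pixel.
    img.putdata([255 if (x + y) % 2 else 0 for y in range(H) for x in range(W)])
    img.save("checkerboard.png")
    ```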
     
    Nenu, defaultluser, noko and 2 others like this.
  18. Gideon

    Gideon 2[H]4U

    Messages:
    2,439
    Joined:
    Apr 13, 2006
    CRTs had their pros and cons, just like LCDs, but a high-quality CRT made a big difference in image quality. God, it sucked dragging them to a LAN party, though.
     
  19. Geforcepat

    Geforcepat Gawd

    Messages:
    924
    Joined:
    Jun 2, 2012
    Ah, yeah, I miss the good ol' days of taking my PC and CRT to LAN parties back in the early '00s. :)
     
    N4CR, Gideon, Armenius and 1 other person like this.
  20. Factum

    Factum [H]ard|Gawd

    Messages:
    1,880
    Joined:
    Dec 24, 2014
    Your post invalidates every post you make as an "opinion"... but you keep posting as if you're posting facts... nice own goal.
     
  21. Factum

    Factum [H]ard|Gawd

    Messages:
    1,880
    Joined:
    Dec 24, 2014
    I have yet to see a flat screen in an A/B test that provided better image quality than a CRT; care to link to such reviews?
     
  22. spine

    spine 2[H]4U

    Messages:
    2,626
    Joined:
    Feb 4, 2003
    Lol, look who's back. Welcome to 2020, Factum! :rolleyes:

    I said "looks better" and "I'd say".

    You can go back to your little cave now.
     
    5150Joker likes this.
  23. Auer

    Auer [H]ard|Gawd

    Messages:
    1,219
    Joined:
    Nov 2, 2018
    That would be a mostly useless test, as CRTs are obsolete and no longer made.

    A waste of time and resources. Also, they sucked for image editing. Sucked.
     
    Armenius and 5150Joker like this.
  24. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    9,907
    Joined:
    Apr 22, 2006

    I had both sitting on my desk for about a year; it was no contest. LCD is better. I had both at work; again, no contest.

    When we were switching over at work, everyone was begging to get upgraded to LCD ASAP.

    Not one person wanted to keep using a CRT, and these were 21" Trinitrons.

    This nostalgic CRT worship is much like the few people who swear vinyl records sound better than CDs.
     
  25. 5150Joker

    5150Joker 2[H]4U

    Messages:
    3,437
    Joined:
    Aug 1, 2005
    Even in this video, the brief side-by-side comparison they discussed was at 60Hz using a side-scrolling game. If they had used something like my monitor (AORUS KD25F) or a BenQ with DyAc vs. the CRT in a modern game pushing 200+ Hz and fps, they wouldn't be crowing so loudly about CRTs. The other points they made are accurate: CRTs could do almost any resolution and look good, and they had a built-in softness that smoothed out harsh scenes in a game. But both technologies are a compromise; OLED has both beat for the near future, while long term, MicroLED should be king.

    Most of us here grew up with CRTs and used to haul them to LAN parties to play Quake, so we know exactly what a CRT offers and the compromises it comes with. I would never go back; my 240Hz 1080p TN panel delivers exactly what I need for smooth motion handling.
     
    Last edited: Jan 13, 2020
    Maddness, Chimpee and Armenius like this.
  26. Sycraft

    Sycraft [H]ardness Supreme

    Messages:
    4,518
    Joined:
    Nov 9, 2006
    I think it is partly nostalgia/rose-coloured glasses, in that people tend to remember the good things in the past and forget the bad. I also think part of it is that we notice the problems with whatever we have now, so if something doesn't have those particular problems, it can look better because we don't consider its downsides.

    CRTs did have some legitimate advantages over LCDs. Until recently, variable/high refresh rates were something you needed a CRT for; the way they draw their picture, with only a small part illuminated at any instant, makes for very crisp frames; and they are good at doing multiple resolutions near their max (LCDs have trouble staying clear if you are slightly under their max res).

    However, people forget all the issues they had: spending hours messing around trying to get the geometry square and still having it only "mostly" right, their utter inability to get bright without blooming/loss of saturation, how shitty low resolutions could look when visible scan lines started appearing because there weren't enough lines for the guns, etc.
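
    The "small part illuminated" point can be put in rough numbers. For an eye-tracked moving object, perceived blur is roughly pixel persistence times on-screen speed. A back-of-the-envelope sketch (the ~1 ms phosphor persistence and full-frame-time hold for sample-and-hold LCDs are illustrative assumptions, not measurements):

    ```python
    # Rough perceived motion blur: persistence (s) * object speed (px/s).
    def blur_px(persistence_s: float, speed_px_per_s: float) -> float:
        """Approximate blur trail length in pixels for a tracked object."""
        return persistence_s * speed_px_per_s

    speed = 1000.0  # object crossing the screen at 1000 px/s

    # A CRT phosphor glows ~1 ms per refresh, regardless of refresh rate.
    print(f"CRT (~1 ms persistence): {blur_px(0.001, speed):.0f} px")
    # A sample-and-hold LCD holds each frame for the full frame time.
    print(f"LCD 60 Hz (16.7 ms hold): {blur_px(1/60, speed):.0f} px")
    print(f"LCD 360 Hz (2.8 ms hold): {blur_px(1/360, speed):.0f} px")
    ```

    That's why very high refresh rates (or strobed backlights) are what finally let LCDs approach CRT motion clarity.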
     
    5150Joker, Snowdog and Armenius like this.
  27. kasakka

    kasakka [H]ard|Gawd

    Messages:
    1,367
    Joined:
    Aug 25, 2008
    I don't know how this devolved into a CRT discussion, but I'm old enough to have used quite a few CRTs, and the things people forget are: geometry issues; reflections; GPUs having D/A converters of varying quality, so GPU model X might give you worse image quality than GPU model Y; some not being able to run high enough Hz, making them headache-inducing pieces of crap; and yes, they did have a practical "native" resolution, which I think aligns with the pitch of the shadow mask. You could go higher, but everything would be too small in an age before DPI scaling was a thing.

    One thing CRTs did do well is make low resolutions look really good. I have a couple of old CRT monitors from a TV studio, and they are just amazing for things like PlayStation 2 games. Those old games look much better on them, and the developers of said games took good advantage of that. CRTs are not pin-sharp, so they soften the image just a bit, making it look higher res than it really is.

    But getting back to the 30xx series: where they land will be interesting. Will Nvidia release a 3080 Ti right off the bat or save it for later? Will they manage to squeeze out a significant enough improvement in both raster and ray-tracing performance to make 2080 Ti owners like me upgrade? Will DisplayPort 2.0 support be included despite no displays that use it having been released or announced? RT performance is slowly but surely getting better thanks to improved software solutions, too.
     
  28. noko

    noko [H]ardness Supreme

    Messages:
    4,553
    Joined:
    Apr 14, 2010
    Here's some big speculation for the thread: Nvidia works with a number of others and comes up with a whole new rendering technique that is neither rasterization nor RT: AI rendering. The way each object reacts to light is defined up front; AI takes all the objects in the scene, along with their lighting information, and computes a solution for any light source introduced: changes in lights, movement of lights, material emission for color bleeding, shadowing. A totally different way to calculate what to render, so to speak. In a way, Nvidia's denoising and DLSS already do this, but they still use RT and rasterization for rendering, with modifications by AI. SPECULATION ONLY.
     
  29. noko

    noko [H]ardness Supreme

    Messages:
    4,553
    Joined:
    Apr 14, 2010
  30. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    9,907
    Joined:
    Apr 22, 2006
  31. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    7,469
    Joined:
    Feb 22, 2012
    Yeah, that looks like an order of magnitude harder than DLSS, and we all know how that went...
     
  32. noko

    noko [H]ardness Supreme

    Messages:
    4,553
    Joined:
    Apr 14, 2010
    They were accomplishing the real-time part with the capability of a Titan V, so I don't think that ability is a decade out; it is most likely much more refined now. I could see GTA VII having a city where everyone is unique, and each character could even be transformed to fit a different culture depending on the user's location. Except that will most likely be developed around console capability, cough cough, AMD. Portraits of believable people who don't exist are something Nvidia has already proven it can do:

    https://www.theverge.com/tldr/2019/...ple-portraits-thispersondoesnotexist-stylegan

    Why would Nvidia spend on R&D unless it saw a benefit as a company? Most if not all of the software is open sourced, and others are using it. Hardware better refined to accomplish those types of tasks would need to come out for developers to go further with this. RT as in RTX can also be a way to interpret traditional games, assets, and game engines; I see it as a possible stepping stone toward using AI. AI is hands down giving results that are utterly remarkable and much better than traditional methods of 3D work and rendering. The numbers for real-time ray tracing in games do not add up if you force the calculations in a traditional RT environment; even 10x the hardware RT capability will fall short of complete, full-effect real-time RT (really not even close).

    Another avenue for big bucks would be the entertainment industry: all these unique capabilities with dedicated hardware would go a long way toward making movies, shows, and special effects at lower cost. I believe that would also come to gaming as more and more people get proficient with it, and as more movies become big game-development opportunities where assets can be shared. Maybe a whole new uber ecosystem for entertainment/gaming.

    I believe Ampere is going to be much more than a traditional upgrade. It has the potential, I believe, to be the biggest step Nvidia has ever made.
     
    Last edited: Jan 15, 2020 at 12:38 PM
  33. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    20,005
    Joined:
    Jan 28, 2014
    Whoa, now, let's not start building up expectations like an AMD fanboy. We all know how that ends.
     
    GoldenTiger likes this.
  34. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,891
    Joined:
    Jun 13, 2003
    It's going slowly... last I checked.

    Where they've gotten it solidly working, it's delivering in ways that less sophisticated solutions aren't capable of.

    Of course, they could have started with those ;)
     
    GoldenTiger and Dayaks like this.
  35. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    9,907
    Joined:
    Apr 22, 2006
    Last I checked, the best version of "DLSS" was in Control, and it turns out it isn't really DLSS, in that it isn't running a deep-learning network. Instead they are using a compute-shader program to upscale.

    Think about that: the best "DLSS" isn't even using the Tensor cores.

    I would say that is a fairly damning indictment of how well it's going.
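
    To make the distinction concrete, here's a toy sketch of a non-ML upscale: plain resampling plus a sharpening pass, the kind of work ordinary compute/shader cores handle without any Tensor hardware. This is not Remedy's actual algorithm, just an illustration; Pillow and the file names are assumptions.

    ```python
    # Toy non-ML upscale: bicubic resample + unsharp mask.
    # NOT Control's real shader -- just showing that upscaling without
    # a neural network needs no Tensor cores.
    from PIL import Image, ImageFilter

    def simple_upscale(src: Image.Image, scale: int = 2) -> Image.Image:
        """Bicubic upscale, then an unsharp mask to restore edge contrast."""
        w, h = src.size
        up = src.resize((w * scale, h * scale), Image.BICUBIC)
        return up.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=2))

    # Hypothetical example: render at 720p internally, present at 1440p.
    frame = Image.open("frame_720p.png")  # placeholder input frame
    simple_upscale(frame).save("frame_1440p.png")
    ```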
     
    spine, N4CR, Maddness and 2 others like this.
  36. noko

    noko [H]ardness Supreme

    Messages:
    4,553
    Joined:
    Apr 14, 2010
    Really? Hey, it's just a speculation thread; do you have any relevant facts? Nvidia has been working on this for a while, real info/facts. RTX is not magically going to gain 10x the calculations or ray-casting for more accuracy and usability. It does so little now in real time that it will remain limited, even if somewhat useful for some things. The increase in RTX may have much more to do with AI than anything else. Then again, game developers will most likely be looking at consoles more than just PCs and would be more limited by their abilities; next gen may have RT, but how good will it really be?
     
  37. noko

    noko [H]ardness Supreme

    Messages:
    4,553
    Joined:
    Apr 14, 2010
    Those Tensor cores may be used for some other interesting gaming-side uses: more YouTuber-type streaming, videos, etc. Maybe they will be incorporated into GeForce Experience in the future.

    https://blogs.nvidia.com/blog/2019/09/26/nvidia-rtx-broadcast-engine-twitch-livestream-ai/
     
  38. III_Slyflyer_III

    III_Slyflyer_III n00b

    Messages:
    49
    Joined:
    Sep 17, 2019
    What method of DLSS does Wolfenstein: Youngblood use now that RT has finally come out for it?

    I got the game for free back in September and loaded it up the other day to see how it looked (haven't played it yet). The ray tracing looked nice, and when I used DLSS at 4K it looked better than with it off using the AA options. If that game truly uses DLSS, I'd vote it the best example. FPS was in the 100s with DLSS (~60ish without it) at 4K with RT on, and DLSS looked beautiful.
     
    Maddness, GoldenTiger and noko like this.
  39. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,891
    Joined:
    Jun 13, 2003
    I know they had a demo showing off bits of a new space/moon game. I can't say that I've really kept up with it; however, what they've shown has been pretty impressive.

    And RTSS is indicative of where rendering is going: using some form of machine learning to 'cull' the work that needs to be done in order to get more performance out of the available hardware. We're looking at 8K output as well as VR on the one hand, while hitting die-shrink and reticle-size limits on the other. Perhaps chiplets will help, as AMD has demonstrated with CPUs, but there needs to be more and more focus on optimizing code and hardware in parallel rather than simply throwing more resources at the problem.
     
    Maddness and N4CR like this.
  40. noko

    noko [H]ardness Supreme

    Messages:
    4,553
    Joined:
    Apr 14, 2010
    The big advantage I see Nvidia having with Ampere, particularly regarding RTX, is the feedback they have received from developers working with their hardware. How much Ampere could be upgraded based on that feedback is another matter, but I am sure Nvidia had a number of contingencies for Ampere. My speculation for Ampere: fourth quarter of this year.