NVIDIA CEO Jensen Huang hints at ‘exciting’ next-generation GPU update on September 20th Tuesday

Is anybody else sort of excited about what Shader Execution Reordering could bring to the table?
I know a lot of people still call ray tracing a gimmick, but all the animation tools, texturing tools, and game-engine preparation methods are leaning into it heavily; it's coming. In the past few years I have seen students using these tools cut scene and model prep from days of monotonous work to a button push that takes a few hours to calculate, with far more consistent results, far more consistent frame rates, and more visually appealing output, especially on newer screens.
The supposed FPS increase on ray-traced elements with SER could be a pretty big deal, and I am honestly excited about what it could bring.
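For anyone wondering why reordering helps at all: GPUs run rays in lockstep groups of 32 (warps), and a warp effectively pays for every distinct hit shader its rays diverge into. Here's a toy Python model of that cost; purely illustrative, nothing to do with Nvidia's actual SER implementation, and the shader count is made up.

Code:
# Toy model of SIMT divergence: a warp of 32 threads pays once for every
# distinct shader its rays need, because divergent branches serialize.
# Purely illustrative -- not Nvidia's actual SER implementation.
import random

WARP = 32
NUM_RAYS = 1 << 16
NUM_SHADERS = 24  # hypothetical number of distinct hit shaders in a scene

rays = [random.randrange(NUM_SHADERS) for _ in range(NUM_RAYS)]

def warp_cost(shader_ids):
    """Sum, over each group of 32 rays, the number of distinct shaders."""
    return sum(len(set(shader_ids[i:i + WARP]))
               for i in range(0, len(shader_ids), WARP))

incoherent = warp_cost(rays)          # rays in arrival order
reordered = warp_cost(sorted(rays))   # rays grouped by shader ID
print(f"incoherent cost: {incoherent}, reordered cost: {reordered}, "
      f"ratio: {incoherent / reordered:.1f}x")

Real-world gains are obviously far smaller than the toy ratio, since only part of a frame is divergent and the reordering itself isn't free; Nvidia's own claim is up to ~25% on RT-heavy workloads.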
 
Read this...

https://hardforum.com/threads/nvidi...eptember-20th-tuesday.2021544/post-1045457136

Then read it again. Doesn't matter how indignant you get about it, the naming convention is misleading unless accompanied by spec details making it clear that the differences between the models go beyond memory configuration.

LOL, I am not indignant at all; you should be talking to the people with all the outrage. And I would love for you to quote the regulations and laws Nvidia is breaking. How come they weren't fined for the 3080 12GB and the 3080 10GB, or, going back further, the 1060 6GB and 1060 3GB? The difference between those cards wasn't just VRAM: different numbers of CUDA cores and different memory bandwidth. And how come neither AMD nor Nvidia has ever been fined for rebranding old cards and releasing them as new ones?

The specs of both GPUs are on the site. You don't even have to leave the shop these days; just check on your phone.

And it's not about being snooty, as you call it. If you don't have a clue what you are buying, you should ask for advice or do some research. If someone asked you about a new card coming out, would you tell them to buy it based on the marketing blurb from the manufacturer, or would you tell them to wait for reviews?

The naming scheme isn't the problem, the price is. I bet nobody would give a damn about naming schemes if the 4080 12GB was $399 and the 16GB was $499.

Serious question: do you think there would have been more or less outrage if they had called the card the 4070? That's almost a doubling of the price from the 3070. The reaction would probably have been worse.
 
I'm always excited about new technology. Ray tracing was labeled a gimmick when it first hit consumer GPUs, but the fact is it's progress. The issue with this generation is price and naming. Nvidia did something similar with Turing, but not to this extreme.
 
LOL, I am not indignant at all; you should be talking to the people with all the outrage. And I would love for you to quote the regulations and laws Nvidia is breaking. How come they weren't fined for the 3080 12GB and the 3080 10GB, or, going back further, the 1060 6GB and 1060 3GB? The difference between those cards wasn't just VRAM: different numbers of CUDA cores and different memory bandwidth. And how come neither AMD nor Nvidia has ever been fined for rebranding old cards and releasing them as new ones?

The specs of both GPUs are on the site. You don't even have to leave the shop these days; just check on your phone.

And it's not about being snooty, as you call it. If you don't have a clue what you are buying, you should ask for advice or do some research. If someone asked you about a new card coming out, would you tell them to buy it based on the marketing blurb from the manufacturer, or would you tell them to wait for reviews?

The naming scheme isn't the problem, the price is. I bet nobody would give a damn about naming schemes if the 4080 12GB was $399 and the 16GB was $499.

Serious question: do you think there would have been more or less outrage if they had called the card the 4070? That's almost a doubling of the price from the 3070. The reaction would probably have been worse.

Defending bad business-to-consumer practices, when it is so simple to be for good business practices, is hilarious.

Projecting outrage is amazing; some probably are. I'm just amused that, yet again, the Nvidia defense force is out trying to gaslight and muckrake legitimate criticisms.

All this 'outrage' is great, because it creates more threads across more sites that people can find by googling the 4080 and get informed. So be outraged, people.
 
Well, tbf, 1080p is pretty low-res at this point in time. You could easily get a 1440p 144Hz monitor and it would be a really nice upgrade. With G-Sync/FreeSync the 2070 would handle it just fine.
You can find good 4K monitors these days for $400.
 
So yeah, maybe some sort of retention mechanism is needed. Anyway, ITX case designers... you've got a new challenge ahead of you!
Yeah... I am looking at these cards and thinking HTF am I going to do this. Custom closed loops are the only reasonable thing I can come up with: something with a very small reservoir and a small pump. But that is still a lot of energy to dissipate, and I am really stretching to find cases that can physically fit the amount of radiator this reasonably needs. With the new power specs for the AM5 platform, it seriously needs at least 4x 120mm radiators, possibly 5 depending on the ambient temperature and humidity where you live. This generation is going to make me work to fit this into a shoebox.
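For a rough sanity check on the radiator count, here's the back-of-the-envelope math using the usual enthusiast rule of thumb of roughly 100-150 W of quiet dissipation per 120 mm radiator section; the wattages and W-per-section figures are assumptions, not specs.

Code:
# Rough radiator sizing: total watts vs. 120 mm radiator sections.
# 100-150 W per section is a common enthusiast rule of thumb for
# quiet fan speeds, not a measured spec; adjust for your fans/ambient.
gpu_watts = 450   # announced RTX 4090-class board power
cpu_watts = 230   # high-end AM5 package power (e.g. 7950X PPT)
total = gpu_watts + cpu_watts

for per_section in (100, 125, 150):
    sections = -(-total // per_section)   # ceiling division
    print(f"{per_section} W/section -> {sections} x 120 mm")
# Prints 7, 6, and 5 sections respectively -- so 4-5 radiators only
# works out if you run the fans harder or accept bigger deltas.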
 
Yeah... I am looking at these cards and thinking HTF am I going to do this, custom closed loops are the only reasonable thing I can come up with something with a very small reservoir and using a small pump but that is still going to be a lot of energy to dissipate and I am really stretching on what cases can physically fit the amount of radiator that is reasonably needed to do this. With the new power specs for the AM5 platform, it seriously needs at least 4x 120mm radiators, possibly 5 depending on ambient temperatures and humidity where you live. This generation is going to make me work to fit this into a shoebox..
Run everything on a Primochill Praxis WetBench SX and connect it to a Watercool MO-RA3; that's what I'm about to do, but not for a 4080, just for a 5950X and a 3080.

ETA: Put an acrylic box over top of it and call it a case. ;-)
 
Run everything on a Primochill Praxis WetBench SX and connect it to a Watercool MO-RA3; that's what I'm about to do, but not for a 4080, just for a 5950X and a 3080.
My EKWB loop popped. The Cryofuel went chunky, then it broke an impeller blade on the pump, and it just squirted everywhere and made a big mess. I am personally off liquid cooling for a little bit, at least until the wife forgets the mess it made; finding a replacement for the rug, so there isn't a purple blob on it to remind her, could help I suppose.
 
Yeah, even the "cheap" 4K panels in the smaller TVs are remarkably good.
I personally am looking at the LG OLED C1 series; the G-Sync reviews for it are remarkably good.

I've had a C1 for a little over a year and absolutely love it. Great picture, G-Sync/FreeSync/VRR, etc.
I love that it remembers your brightness/color settings based on the signal. That means you don't have to go in and tinker with your color/brightness for PC signals vs. TV signals vs. console signals (along with HDR, Dolby Vision, etc.). You just set things up once and it will automatically switch to whatever you set from then on. Not every TV does that. The only knock on it is that it's not that bright compared to some other models. I keep mine cranked pretty high most of the time since I bought the Best Buy extended warranty; I figure I'll intentionally let something burn in just as my warranty is about to expire and they'll have to swap it out for the latest model.
 
I've had a C1 for a little over a year and absolutely love it. Great picture, G-Sync/FreeSync/VRR, etc.
I love that it remembers your brightness/color settings based on the signal. That means you don't have to go in and tinker with your color/brightness for PC signals vs. TV signals vs. console signals (along with HDR, Dolby Vision, etc.). You just set things up once and it will automatically switch to whatever you set from then on. Not every TV does that. The only knock on it is that it's not that bright compared to some other models. I keep mine cranked pretty high most of the time since I bought the Best Buy extended warranty; I figure I'll intentionally let something burn in just as my warranty is about to expire and they'll have to swap it out for the latest model.
My desk is in a pretty dark place with little natural light, so I'm not too worried about the brightness. I am told the C2 is brighter; I haven't physically seen one yet, but everybody is discounting the C1s pretty hard right now because they are overstocked.
 
My EKWB loop popped. The Cryofuel went chunky, then it broke an impeller blade on the pump, and it just squirted everywhere and made a big mess. I am personally off liquid cooling for a little bit, at least until the wife forgets the mess it made; finding a replacement for the rug, so there isn't a purple blob on it to remind her, could help I suppose.


Leakshield



I thought about this myself, but I don't like the idea of false alarms with an 85 dB siren.
 
This is nothing like the 970 situation, where they materially misrepresented the specs of the card, and I don't see any grounds to sue. However, the fact that what they're doing is perfectly legal doesn't mean they shouldn't be called out for it, so the less informed are more likely to hear about it.
Do you even GPU stick, bro? :)
Why would you GPU stick when you can dark obelisk and harness that dark power?
 
I'm always excited about new technology. Ray tracing was labeled a gimmick when it first hit consumer GPUs, but the fact is it's progress. The issue with this generation is price and naming. Nvidia did something similar with Turing, but not to this extreme.
If DLSS 3.0 and its tricks are any indicator, ray tracing is hard, really hard. I don't mind the de-noising magic, but building frames outside the timing and context of the game engine is a big no-no for me.
 
If DLSS 3.0 and its tricks are any indicator, ray tracing is hard, really hard. I don't mind the de-noising magic, but building frames outside the timing and context of the game engine is a big no-no for me.
Ray tracing is VERY hard. Lights in most games are either projected or painted on, which is easy for today's GPUs to calculate. Ray tracing is a completely different lighting model, where each individual light ray is traced from the eye to its termination point. Each time a ray hits something, it absorbs some of the properties of the object it intersects and bounces away. Depending on how many bounces the engine specifies, this can be incredibly taxing for a GPU, even with specialized hardware.

Just remember, there was a time when calculating vertices and textures was hard, and painted/projected light sources were stupidly difficult. RT will get easier over time as developers learn tricks to make it more seamless.
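To put rough numbers on "incredibly taxing", here's the multiplication spelled out; all of these settings are made up but representative.

Code:
# Why per-ray lighting adds up: even modest settings multiply fast.
width, height = 2560, 1440
rays_per_pixel = 2    # e.g. one shadow ray + one reflection ray
max_bounces = 3
fps = 60

rays_per_frame = width * height * rays_per_pixel
bounce_tests_per_frame = rays_per_frame * max_bounces
print(f"{rays_per_frame:,} rays per frame")
print(f"{bounce_tests_per_frame:,} potential bounce evaluations per frame")
print(f"{bounce_tests_per_frame * fps:,} per second")
# ~7.4 million rays/frame, ~22 million bounce tests/frame,
# ~1.3 billion per second at 60 FPS -- before any denoising.

And that's before denoising, which exists precisely because nobody can afford enough rays per pixel.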
 
I liked his post because my Nvidia stock needs people like him 👍

So it seems people are most upset about the price: $1,600 for a flagship card. I am in Vegas this week for a conference, where two people can easily spend $1,600 in a single night on a good restaurant and a show. Now that's a colossal waste of money. $1,600 for a top GPU that will last four years equates to less than $40 per month, plus resale value. Would I love to pay $800 for a top-end GPU? Yes, of course. But have you all noticed inflation? I would love to pay half the price for gas like I did two years ago, or get my steak at half the price like I did two years ago. Inflation sucks; so do expensive new nodes and supply-chain issues. But I thought this was a place where people are excited about tech. I would have shit my pants back in the 1990s, when I had my first Voodoo card, if someone had told me we would have this level of real-time ray tracing, or explained the tech behind DLSS, or showed me the graphics in Cyberpunk running in real time. We've come a long way, and the road ahead is exciting as hell. I am just super happy to have lived through computer history from the C64 to now to whatever comes over the next few decades.
 
If DLSS 3.0 and its tricks are any indicator, ray tracing is hard, really hard. I don't mind the de-noising magic, but building frames outside the timing and context of the game engine is a big no-no for me.
I understand the hesitation, but out-of-order execution exists for a reason, and the graphics pipeline already does something similar, just really inefficiently. With textures, developers have all sorts of tricks and tools at their disposal to make sure things are rendered in order, because any scene has a manageable number of things going on. Ray tracing has millions of calculations in flight at once, and it's not something a developer can manage by hand; automating it so the work can reorder itself for the fastest completion time is about the best option going.
 
The problem with these "smartest man in the room" guys is that they are wrong so often. 1000-series GPUs: "wow, sales doubled in one month; at this rate we will sell 16 times that in 4 months!" Then came the mining crash. 3000-series GPUs: "wow, sales doubled in one month; at this rate we will sell 16 times that in 4 months!" Then came the mining crash.
Ty, tyvm, but the sarcasm went whizzing right by, apparently...
 
My EKWB loop popped. The Cryofuel went chunky, then it broke an impeller blade on the pump, and it just squirted everywhere and made a big mess. I am personally off liquid cooling for a little bit, at least until the wife forgets the mess it made; finding a replacement for the rug, so there isn't a purple blob on it to remind her, could help I suppose.
Cryofuel is junk; people should stop using it. Mayhems, Koolance, Aqua Computer, or, if you are in the US, ModMyMods are all better coolants.

I've been using my 48" C1 since early August and have been loving it. My only complaint is the auto-dimming on some screens (usually when there's a lot of white); it can be very off-putting.
 
So it seems people are most upset about the price: $1,600 for a flagship card. I am in Vegas this week for a conference, where two people can easily spend $1,600 in a single night on a good restaurant and a show. Now that's a colossal waste of money. $1,600 for a top GPU that will last four years equates to less than $40 per month, plus resale value. Would I love to pay $800 for a top-end GPU? Yes, of course. But have you all noticed inflation? I would love to pay half the price for gas like I did two years ago, or get my steak at half the price like I did two years ago. Inflation sucks; so do expensive new nodes and supply-chain issues. But I thought this was a place where people are excited about tech. I would have shit my pants back in the 1990s, when I had my first Voodoo card, if someone had told me we would have this level of real-time ray tracing, or explained the tech behind DLSS, or showed me the graphics in Cyberpunk running in real time. We've come a long way, and the road ahead is exciting as hell. I am just super happy to have lived through computer history from the C64 to now to whatever comes over the next few decades.

I dunno why you're trying to sell me on it lol I'm already supportive of it one way or another, go for it 👍
 
Cryofuel is junk; people should stop using it. Mayhems, Koolance, Aqua Computer, or, if you are in the US, ModMyMods are all better coolants.

I've been using my 48" C1 since early August and have been loving it. My only complaint is the auto-dimming on some screens (usually when there's a lot of white); it can be very off-putting.
Lesson learned too late on Cryofuel...
The auto-dimming is one of the things specifically mentioned in all the reviews I've read, usually alongside links to YouTube videos on how to turn it off, because the menu structure on the C1s looks like absolute garbage.
 
I understand the hesitation, but out-of-order execution exists for a reason, and the graphics pipeline already does something similar, just really inefficiently. With textures, developers have all sorts of tricks and tools at their disposal to make sure things are rendered in order, because any scene has a manageable number of things going on. Ray tracing has millions of calculations in flight at once, and it's not something a developer can manage by hand; automating it so the work can reorder itself for the fastest completion time is about the best option going.
I actually appreciate SER in Ada. We need more of that; unfortunately it's only up to a 1.25x uplift. My issue is with DLSS 3.0 being marketed as a ray-tracing improvement. It does not advance ray tracing in any way; rather, it detracts from it quite destructively. What would justify the pricing is more actual R&D into ray tracing, giving us more RT hardware.

Even the updated real-time denoiser is worthy of more attention (if it's used in DLSS 3.0): https://github.com/NVIDIAGameWorks/RayTracingDenoiser
 
Moore's law is about doubling SPEED:
"the principle that the speed and capability of computers can be expected to double every two years, as a result of increases in the number of transistors a microchip can contain."
When I google it, the definitions I read seem quite focused on transistor density:

Moore's Law refers to Gordon Moore's perception that the number of transistors on a microchip doubles every two years, though the cost of computers is halved. Moore's Law states that we can expect the speed and capability of our computers to increase every couple of years, and we will pay less for them.
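Either way, the "doubles every two years" compounding is easy to sanity-check; here's a quick sketch starting from the usual textbook data point, the 1971 Intel 4004 at ~2,300 transistors.

Code:
# Compound "doubles every two years" from the Intel 4004 onward.
start_year, transistors = 1971, 2300   # Intel 4004: ~2,300 transistors
for year in (1981, 1991, 2001, 2011, 2021):
    doublings = (year - start_year) / 2
    print(f"{year}: ~{transistors * 2 ** doublings:,.0f} transistors")
# 2021 lands around 77 billion -- the right ballpark for the biggest
# current chips (Nvidia quotes ~76 billion for the AD102 die), so
# density, at least, has roughly kept pace. Speed and price haven't.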
 
When I google it, the definitions I read seem quite focused on transistor density:

Moore's Law refers to Gordon Moore's perception that the number of transistors on a microchip doubles every two years, though the cost of computers is halved. Moore's Law states that we can expect the speed and capability of our computers to increase every couple of years, and we will pay less for them.
Somebody forgot to enforce it, once it became law.
 
I actually appreciate SER in Ada. We need more of that; unfortunately it's only up to a 1.25x uplift. My issue is with DLSS 3.0 being marketed as a ray-tracing improvement. It does not advance ray tracing in any way; rather, it detracts from it quite destructively. What would justify the pricing is more actual R&D into ray tracing, giving us more RT hardware.

Even the updated real-time denoiser is worthy of more attention (if it's used in DLSS 3.0): https://github.com/NVIDIAGameWorks/RayTracingDenoiser
Do you mean that stuff with the Optical Flow Accelerator? That does look interesting, and I don't fully understand how it works; I understand what its output goals are, but the tidbits in the middle sound too good to be true, so I am probably wildly misunderstanding it.
From the little I can see, the Optical Flow SDK has been around since Turing, but it ran on the GPU as a software workflow; this time around they are dedicating some hardware on the GPU to accelerate it, and they have greatly eased the process of implementing it in a program.
If I understand its end goal, it is to reduce stuttering and sudden FPS drops by generating the missing frames when the CPU is bound and can't keep up, or by dropping a frame or two, say rendering the odd frames and quickly extrapolating the even ones with a very fast calculation so you get smooth motion.
I am sure it's way more complicated than that, but it at least sounds good in theory. If you have to choose between your screen stuttering and artifacting because it's suddenly CPU- or GPU-bound, or the GPU dropping some potential accuracy on an explosion or laser effect to keep things visually steady and clear, isn't that ultimately good?
Neither situation is one you ideally want to be in, but at least one doesn't break immersion. I think we can all agree that your machine suddenly chunking out is jarring, and anything that smooths that over is ultimately good. Paired with G-Sync it could be pretty awesome for machines that can't quite keep up, and might add some longevity to a system you might otherwise be somewhat unhappy with.
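For what it's worth, here's a crude way to picture the optical-flow part; this is a toy interpolation sketch, certainly not how DLSS 3.0 actually works, just the core idea of advecting pixels along motion vectors.

Code:
# Toy frame generation: build a "middle" frame by moving each pixel
# halfway along its motion vector. Real optical-flow frame generation
# also has to fill disocclusion holes, which is the genuinely hard part.
import numpy as np

H, W = 4, 8
frame = np.arange(H * W, dtype=float).reshape(H, W)  # stand-in image
flow_x = np.full((H, W), 2.0)   # every pixel moving 2 px right per frame

mid = np.zeros_like(frame)
for y in range(H):
    for x in range(W):
        nx = int(round(x + flow_x[y, x] * 0.5)) % W  # half-step advection
        mid[y, nx] = frame[y, x]
print(mid)

The "too good to be true" bit is the holes: pixels that were hidden in one frame and visible in the next have to be invented, which is presumably where the AI model earns its keep.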
 
I find it harder and harder to get excited about features that aren't actually supported by any of the games that I play.
I get that, and I don't know what you are playing, and it doesn't really matter, but at least Nvidia is trying with the feature set. They have a pretty decent list of titles ready for DLSS 3.0 at launch, and Unity and Epic have supposedly already patched it in, ready to go on Oct 12.
Raytracing-wise, it looks like the RTX list is upwards of 200 games now (launched and in production combined), so the list is growing rather quickly.
 
Lesson learned too late on Cryofuel...
The auto-dimming is one of the things specifically mentioned in all the reviews I've read, usually alongside links to YouTube videos on how to turn it off, because the menu structure on the C1s looks like absolute garbage.
Hmm I'll have to look that up. To my knowledge it couldn't be turned off.
 
Pretty much. Whatever happened to being excited about hitting native resolution targets in all games, without requiring proprietary solutions to do so?
Game feature creep. We could always collectively ask developers to tone down their games: weaker AI, fewer NPCs, smaller playable environments, less detailed models and textures; the list goes on. We could do it, and they would be more than happy to oblige, because it's the flashy stuff that is complicated and gets expensive. They might even be able to charge less for the titles and spend time actually testing them before launch; that would be nice. The sad thing is, given what games currently look like with the details and settings we like, Nvidia and AMD just can't reasonably deliver hardware capable of that without requiring a dedicated refrigeration unit and a NEMA 10-30R outlet. I mean, I have that in my workspace already, so I am good to go!
 
Hmm I'll have to look that up. To my knowledge it couldn't be turned off.
From what I understand, they put it under the most unintuitive name possible: it's the "Adjust Logo Brightness" setting under the OLED Care menu that controls the auto-dimming. There you can set it to off, though for anti-burn-in reasons they recommend setting it to High, which makes it dim faster.
 
Raytracing-wise, it looks like the RTX list is upwards of 200 games now (launched and in production combined), so the list is growing rather quickly.

Yeah, ray-tracing adoption is getting pretty good, but in many cases it's not even worth it. In World of Warcraft, for example, you get shadows that look almost exactly the same as with ray tracing disabled, yet suffer a ~30% hit to FPS. That makes it basically worthless. I thought one of the key features of ray tracing was that it would be handled by special hardware on the card, so if anything, enabling ray tracing should have improved performance, since the shadows would be handled by dedicated hardware, leaving the regular GPU cores free to focus on other tasks. But obviously that's not how it works in practice, at least not with my 5900X + RTX 2080 config. Maybe a 4xxx GPU would change that? I doubt it, but who knows.
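For scale, that ~30% FPS drop looks even worse in frame-time terms; the 100 FPS baseline below is just a hypothetical number for the arithmetic.

Code:
# Translate a ~30% FPS drop into frame time to see the added RT cost.
fps_off = 100.0            # hypothetical FPS with RT shadows off
fps_on = fps_off * 0.7     # ~30% drop with RT shadows on

ms_off = 1000.0 / fps_off  # 10.0 ms per frame
ms_on = 1000.0 / fps_on    # ~14.3 ms per frame
print(f"RT shadows add ~{ms_on - ms_off:.1f} ms/frame "
      f"(+{(ms_on / ms_off - 1) * 100:.0f}% frame time)")
# ~4.3 ms extra, i.e. +43% frame time: the RT cores accelerate the ray
# work, but the rays are *added* work, not offloaded raster work.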
 
Do you mean that stuff with the Optical Flow Accelerator? That does look interesting, and I don't fully understand how it works; I understand what its output goals are, but the tidbits in the middle sound too good to be true, so I am probably wildly misunderstanding it.
From the little I can see, the Optical Flow SDK has been around since Turing, but it ran on the GPU as a software workflow; this time around they are dedicating some hardware on the GPU to accelerate it, and they have greatly eased the process of implementing it in a program.
If I understand its end goal, it is to reduce stuttering and sudden FPS drops by generating the missing frames when the CPU is bound and can't keep up, or by dropping a frame or two, say rendering the odd frames and quickly extrapolating the even ones with a very fast calculation so you get smooth motion.
I am sure it's way more complicated than that, but it at least sounds good in theory. If you have to choose between your screen stuttering and artifacting because it's suddenly CPU- or GPU-bound, or the GPU dropping some potential accuracy on an explosion or laser effect to keep things visually steady and clear, isn't that ultimately good?
Neither situation is one you ideally want to be in, but at least one doesn't break immersion. I think we can all agree that your machine suddenly chunking out is jarring, and anything that smooths that over is ultimately good. Paired with G-Sync it could be pretty awesome for machines that can't quite keep up, and might add some longevity to a system you might otherwise be somewhat unhappy with.
Optical Flow is not Ray Tracing. That's all people need to know to pick the claims apart.
 
I get that, and I don't know what you are playing, and it doesn't really matter, but at least Nvidia is trying with the feature set. They have a pretty decent list of titles ready for DLSS 3.0 at launch, and Unity and Epic have supposedly already patched it in, ready to go on Oct 12.
Raytracing-wise, it looks like the RTX list is upwards of 200 games now (launched and in production combined), so the list is growing rather quickly.
200 games? (Perhaps there is a third-party list somewhere I don't know about.)
https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/
78 games
11 remasters
3 double counts (Bright Memory / Bright Memory: Infinite, Minecraft RTX / Minecraft RTX China, The Riftbreaker / The Riftbreaker: Prologue)

So if I'm being generous, I would say 89 titles... a handful of which (I didn't go and really count them all) are in Early Access on Steam.

RT is still having a hard go at adoption, hence the need to talk about mod tools for gamers at a 4000-series launch. Developers are just not biting on RT, at least not in the numbers Nvidia needs/wants them to. The majority of games using RT right now are from indie developers whose use of RT is questionable at best, and actually bad and crash-heavy at worst.

Going through Nvidia's list of ray-traced games...

The Orville Interactive Fan Experience (this is cool and all, but it's not a game)
The Persistence (a two-year-old game with 100 reviews on Steam; it's not good)
The Fabled Woods (another two-year-old game with 68 reviews on Steam; not good either)
Soulmate (an Early Access Steam game that has been up over a year... FOUR reviews on Steam. One says "This game is not even close to being finished enough for even early Alpha.")
Redout: Space Assault (an almost two-year-old game with 68 not-so-great reviews on Steam)
RAZE 2070 (been on Steam 1.5 years... 2 reviews. One claims the game crashes all the time)
Loopmancer (actually looks like a decent little game, but it's small: 364 Steam reviews, mostly positive. It's also a platformer that doesn't look all that ray-traced to me)
Hell Pie (also looks pretty fun... 461 Steam reviews. However, it's also a 3D platformer; I'm not seeing anything that couldn't be done with light maps)
Helios (listed as a game, but it's actually "a social VR and Desktop framework for you to create your dream worlds and avatars.") OK, I guess
Aron's Adventure (150 reviews on Steam, and not bad reviews... however, to my eyes this looks like an indie game designed years ago. RT, OK)
Poker Club (hey, 256 reviews on Steam, so it's not the worst game listed)

I'll be fair... there are some cool titles in the list as well, though most of them, I think, are pretty light on actual RT implementation.

Anyway, personally I see ZERO reason to shell out a couple thousand dollars for the current list of RT titles. Unless, six months from now, the mod tool really works and tons of old games I loved have RT mods... so who knows, perhaps Nvidia's mod-tool play isn't that crazy. It does, however, look sort of desperate. (If it's awesome and I get ray-traced old Elder Scrolls games etc., I'll eat my words happily.)
 
Yeah, ray-tracing adoption is getting pretty good, but in many cases it's not even worth it. In World of Warcraft, for example, you get shadows that look almost exactly the same as with ray tracing disabled, yet suffer a ~30% hit to FPS. That makes it basically worthless. I thought one of the key features of ray tracing was that it would be handled by special hardware on the card, so if anything, enabling ray tracing should have improved performance, since the shadows would be handled by dedicated hardware, leaving the regular GPU cores free to focus on other tasks. But obviously that's not how it works in practice, at least not with my 5900X + RTX 2080 config. Maybe a 4xxx GPU would change that? I doubt it, but who knows.
Yeah, WoW crammed it in there for shadows and lighting in the laziest way they could, maybe for bragging rights? In new titles on new engines developed with the new tools, it's much faster and easier. Epic has been spending huge amounts of money and resources on its toolsets to make working with ray-traced assets easy and seamless, to the degree where trying to avoid it is just too expensive. Ray tracing saves too much development time to be ignored, and it usually leads to results that are more consistent in their performance.
 