Nvidia Says Native Resolution Gaming is Out, DLSS is Here to Stay

Fake frames: You (everyone) do realize that every single frame in a video game is a fake frame? When it's all rendered in real time (the exception being actual video, if that's used for a cutscene), it's 100% fake. And rasterization has to fake the lighting, baking in lightmaps so it approximates reality. Being upset over frame gen is silly when the frames were 100% computer generated to begin with.
This is certainly one of the takes of all time.

Video games, on the other hand, already provide nothing but a fully fabricated experience. Acting like the source textures are a "source of truth" for the scene and being a purist is fine; that's your opinion... but a game is a game. Developers and people will be happy as long as it subjectively looks good and plays well. If something subjectively looks "better than native" in a blind A/B test, it's just better, period, because the overall experience is better. Being a purist is fine, but I'll note that purists are also extremely rare in the audiophile space. If it hits the neurons better, monkey will be happier.
Not using a dynamic picture preset and SOE for movie watching, not adding neon colors to realist paintings, or not adding bass beat tracks to Beethoven's symphonies is also being a purist.
 
Not satisfied with the current market? That’s fair, I don’t think anyone is. But we know that cards are stacking up in warehouses. And there is nothing forcing basically anyone to have to buy anything new. Vote with your wallet really does work. Be patient. And don’t care what everyone else has.
We literally saw this work with the 4060 Ti 16GB and the 4070 a few weeks ago. So the consumer can force a price drop; it really just has to stop buying cards that are way overpriced.

 
So long as DLSS keeps improving, the statement might be true; at the very least, it will be a great tool to smooth out play in some games on weaker hardware.

That's nVidia though, and most likely Intel as well. A world where software solutions are the way to constantly improve graphical fidelity is a world where AMD is a (very) distant third in consumer GPUs. Unless they suddenly get their shit together in software, of course.

I'm still not a huge fan of DLSS as of now; I find it feels "weird" to play with. I can't put my finger on it, but it feels and looks muddy sometimes.
 
Fake frames: You (everyone) do realize that every single frame in a video game is a fake frame? When it's all rendered in real time (the exception being actual video, if that's used for a cutscene), it's 100% fake. And rasterization has to fake the lighting, baking in lightmaps so it approximates reality. Being upset over frame gen is silly when the frames were 100% computer generated to begin with. Of course, if it is adding visible artifacting, it needs some work. I tried it in Jedi Survivor and the image was still perfect. Again, this may not yet be true for all games, but I have no doubt that it will get there.

I agree. This crying about fake frames and resolution is quite hilarious, especially from a developer standpoint I guess.

In terms of path tracing vs. rasterization: does somebody here know how much time it would save to implement path tracing + an AI denoiser instead of rasterization in a game like The Last of Us Part II? Naughty Dog is pretty impressive with faked lighting and classic rasterization, but I imagine they need much more time to reach that level of quality across the game's various scenes compared to a path-traced game with an AI denoiser, because they need to manually place all the different shaders, etc.
 
That's a pretty weird interpretation of that video. His conclusion at the end of it was that Ray Reconstruction mostly lowered ghosting and smoothed out artifacts, and was (for the majority of the time) at least as good or better than the input. It just had a few (like one or two?) areas where it increased ghosting. Namely that tower lighting. Occasionally it also got rid of things like dirt and grit because it saw it as noise.
There was the part where the woman walking away started to ghost like mad. Right here at the 13-minute mark.
 
Ghosting comes mostly from an object that is hidden in one frame and exposed in the next, combined with either a generated frame or the AI's interpretation of the two frames. It's a limit of upsampling, since the data for how it should actually look is missing.
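If it helps to see the mechanism, here's a minimal toy sketch in Python/NumPy (my own simplification, not anything DLSS-specific): the upsampler reprojects last frame's pixels with motion vectors, and wherever the scene was just disoccluded there is no valid history, so blending stale history is exactly what shows up as a ghost trail.

```python
# Toy model of temporal reuse and disocclusion (illustrative only).
import numpy as np

H, W = 4, 8
prev = np.zeros((H, W)); prev[:, 2:4] = 1.0   # last frame: bright object at columns 2-3
curr = np.zeros((H, W)); curr[:, 4:6] = 1.0   # this frame: object has moved to columns 4-5

motion = np.zeros(W, dtype=int)
motion[4:6] = 2                               # only the object's pixels report motion (+2 columns)

cols = np.arange(W)
src = np.clip(cols - motion, 0, W - 1)        # where each current pixel came from last frame
history = prev[:, src]                        # reprojected history

# Columns 2-3 were covered by the object last frame and are background now:
# the history there is stale, which is the disocclusion case.
disoccluded = np.abs(history - curr) > 0.5

alpha = 0.1                                   # how much of the new frame is trusted per frame
naive = alpha * curr + (1 - alpha) * history  # leaves a bright ghost trail at columns 2-3
safer = np.where(disoccluded, curr, naive)    # rejecting invalid history removes the trail

print("naive blend:", np.round(naive[0], 2))
print("with history rejection:", np.round(safer[0], 2))
```

Real upscalers obviously use far more signals (depth, jitter, confidence), but that stale-history blend is the basic reason a trailing silhouette shows up behind moving objects.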

Take Cyberpunk 2077, maxed 4K settings with a theoretical card (Nvidia), path tracing, Ray Reconstruction:

What looks the best?

4K rendering, no upscaling, 120 fps, DLAA (not an option now, but for the sake of argument)

4K upscaled, DLSS Quality, 120 fps

4K upscaled, DLSS Balanced, 120 fps using frame generation

I'm pretty sure I know the answer for myself. Each one above would be dramatically noticeable for me in image quality and feel, with frame generation and its increased lag by far the worst. One perspective or view.

Personally, Cyberpunk rendering at native 4K resolution with rasterization or with low RT gives me the best gaming experience: feeling part of the game, the least lag, least artifacting, least blurring, etc., except I don't like the game that much. RT does not improve the gaming experience enough for the performance hit.

The one game where RT improved the gaming experience was Control. DLSS only became usable for me at DLSS 2.2 and above; it's a trade-off between IQ and performance, with the DLSS option coming out slightly ahead for the overall gaming experience.

Anyway, it comes down to a personal choice about what works best for folks. Funny that some think what works for them will or must work for you, and argue over and over about how wrong you are. 😁
 
Not using a dynamic picture preset and SOE for movie watching, not adding neon colors to realist paintings, or not adding bass beat tracks to Beethoven's symphonies is also being a purist.

Sure? Those are completely out of the scope of what we were talking about because they would change the fundamental characteristics of the work to begin with (except dynamic picture preset; that is arguable and a matter of preference). That's a strawman argument.
It’s part of Unreal Engine 5, so it’s a safe bet we’ll be seeing a lot of Lumen use in the future. From what I’ve seen, it looks more or less as good as RT without nearly as much of a performance hit, so I expect we’ll be seeing lots of it.

If that is actually the case, I'm certainly on board. I hope they can push for it to be in more games, and that AMD does actually perform just as well on it in those games as Nvidia does. Again, more competition is never bad. One of the reasons I do not consider an AMD card is that they have bad RT performance. That would at least solve that issue; the other issues (terrible for AI workloads comparatively, FSR being much worse, etc) are a bit harder to solve. But honestly if I saw proof that Lumen was being used in more games, that it looks just as good as RT, and that AMD is doing just as well with Lumen, I would probably heavily consider going AMD just for the price savings. I've seen 7900XTX cards for $850 open box at Microcenter before, which is pretty much half the price of a 4090.
 
The technology in and of itself is good as it offers more options to gamers.

The implied concern that not too many are expressing outright is that it will be used as an excuse to write poor titles, or, as Nvidia has done with the sub-4090 4000-series lineup, to offer lackluster generation-over-generation GPU improvements, because "we have DLSS anyway, why would we spend the time, effort, or money actually improving things?"

We have definitely already seen this in Nvidia's offerings. All the performance gains they trumpet are while using DLSS. Without DLSS, anything in the 4000 series below the 4090 is kind of a huge turd when compared to the 3000 series.

I also think we have already seen this developer behavior in Starfield. The game runs like absolute shit for the way it looks. It has the graphics of yesteryear, yet the unjustified system requirements of some sort of modern-day Crysis. They could have fixed their shitty engine, or put other additional work into it to make it run well, but why do that when they can just say "FSR is required" and build it into all of the presets?

So, what many people are concerned about when we talk about scaling is not that they don't like that users have an additional option (which generally is a GOOD thing) but rather that it will be a huge enabler for industry to keep giving us shovel-ware all while charging increasingly ridiculous sums for it.
Again, I said this outright 2 months ago:
The people who worried that, instead of upscaling just being used to get better performance on older hardware, it would instead be used to push the sale of more expensive GPUs to get the 'real visuals', or worse, that devs would get lazy and let upscaling cover for badly optimized games, might be correct.

https://hardforum.com/threads/remnant-2-devs-designed-game-with-upscaling-in-mind.2029556/
 
Why put effort into further enhancing hardware performance when we can just work on software upscalers now?
-Nvidia

At some point, and perhaps we are close to being there, stuffing more transistors onto a package to brute-force it isn't the answer. It will be a combination of brute force and AI upscalers. The trained AI upscalers/image enhancers are extremely efficient. Are they perfect? Nope. Could they be one day? There is plenty of weird, weird pr0n out there that says we are getting very close.
 
We literally saw this work with the 4060 Ti 16GB and the 4070 a few weeks ago. So the consumer can force a price drop; it really just has to stop buying cards that are way overpriced.

Yet the higher-end GPUs, the ones that generally see fewer sales, remain priced the same. Does that mean more people are moving up to the high end and ditching the midrange? Or maybe Nvidia is making enough money off its top three GPUs to keep them priced the same?

I don't think it has to do with the 4060/4070 being weak sellers; I think it has to do with the 7700 XT and 7800 XT coming out and Nvidia reacting to compete in that segment.

The point of my post was: enough people will buy the hardware at that inflated price to justify Nvidia and AMD releasing these cards at ridiculous prices. It may not be you, or anyone in this forum, but people will pay the prices and AMD and Nvidia both will gladly oblige, and eventually people will upgrade and not everyone is going to hit up the second hand market.
 
Or, as I've also pointed out - pricing your cards high just gives you plenty of breathing room if you want to or need to get into a price cut war with your competitors - and if you don't have to - oh no we didn't have to slash prices and made more money - the horror 👻💀🦇
 
Sure? Those are completely out of the scope of what we were talking about because they would change the fundamental characteristics of the work to begin with (except dynamic picture preset; that is arguable and a matter of preference). That's a strawman argument.


If that is actually the case, I'm certainly on board. I hope they can push for it to be in more games, and that AMD does actually perform just as well on it in those games as Nvidia does. Again, more competition is never bad. One of the reasons I do not consider an AMD card is that they have bad RT performance. That would at least solve that issue; the other issues (terrible for AI workloads comparatively, FSR being much worse, etc) are a bit harder to solve. But honestly if I saw proof that Lumen was being used in more games, that it looks just as good as RT, and that AMD is doing just as well with Lumen, I would probably heavily consider going AMD just for the price savings. I've seen 7900XTX cards for $850 open box at Microcenter before, which is pretty much half the price of a 4090.

It depends a lot on the use case, in my opinion. I bought a 7900XTX because it was on sale and I needed something to pair with my 4K monitor. The 4080 (only on sale, not at its ridiculous MSRP) would have been my first choice because I was interested in playing with RTX features, but after looking at benchmarks I decided it wasn't worth the performance hit and was more interested in better 4K raster performance, which the 7900XTX delivers for a lower price. At 1440p, though, I may have still paid up for the 4080. Thus far, I'm perfectly happy with my decision, and perhaps I'll consider an RTX card in the future once the horsepower catches up with the demands of 4K. I suspect an affordable option for that is still a few years down the road though.

Regarding Lumen, the benchmarks I’ve seen have shown AMD and Nvidia working with Lumen at around the same scale as they do with standard rasterization, whereas RT is dependent on RT cores that AMD lacks, so they take a greater performance hit. Which technology will dominate? Don’t know, but UE5’s Lumen implementation gives much better bang for the buck, it’s a popular engine, and the consoles are all using AMD hardware. None of that might matter, we’ll have to wait and see, but either way, the only viable 4K RT card right now is the 4090 and I wasn’t willing to pay those prices considering I was able to buy a high refresh rate 4K Freesync monitor and a 7900XTX for less than a 4090 would have cost me.
 
I hope regular performance is increased. DLSS is nice but has a long way to go and will probably never reach parity with native. It is a good solution for things like ray tracing and those instances where your frame rates are just not high enough. But I still want to see performance increased elsewhere and not just in up scaling.
 
Bingo. The 4060 Ti 16GB and some 4070s dropped in price shortly after the release of the 7800 XT and 7700 XT.
AMD outperforms Nvidia in Fortnite with Nanite and Lumen, and in Immortals of Aveum with Nanite and Lumen.

However, Fortnite's Lumen has an extra toggle to hardware-assist the ray tracing, and this is where Nvidia regains ground. It very slightly improves the lighting quality and also enables player character reflections, if you find a mirror to stand in front of in a house or whatever. In Fortnite, though, it is such a teensy visual addition that it's not worth turning on.
 


Also, in Fortnite’s case, most players will not be willing to sacrifice frame rate for ray tracing. It’s not the type of game where you want to do that.

In any case, the repeated claim that AMD can't do ray tracing is somewhat erroneous, because it really depends on how it's done. Their RT implementation is inferior, but they can do ray tracing. Nvidia's marketing department has succeeded in making ray tracing synonymous with RTX, and hats off to them for doing that. Side by side, though, comparing RTX to Lumen, I would take Lumen every single time, even on a 4090, because as far as I can tell you're getting like 95% of the visuals with 5x the FPS.
 
Yeah, Lumen is great. Can't wait until UE5 gets more adoption. I'm hoping that FF7 Remake part 3 steps up to UE5...

And yes, RDNA3 is pretty good at hybrid RT in most games. The 7900 XTX especially scales really well.
 

I'm a bit skeptical. I've read some online accounts of Lumen in that Eternals game and a chunk of people seem to quite dislike it and think it looks pretty bad. Also as I understand it, it isn't actually leveraging the RT processing units on the Nvidia GPUs to begin with (at least not by default), because it wants to maintain hardware agnosticism. I think it's a bit disingenuous to compare the two in that manner when Nvidia especially does dedicate a decent amount of the processing power to RT (though apparently still not quite enough).

That said...
The problem with Nvidia GPUs imo... if you ignore all of the other benefits of Nvidia (which there are many, especially if you do ANY AI workloads, and again DLSS is just straight up better than FSR in every way, nvenc, etc)... is that the only Nvidia card that it feels like there is really any worth in purchasing is basically a 4090. Rarely does a flagship card actually equal its ridiculous price difference in terms of actual performance, but a 4090 actually outperforms the price difference between it and the 4080, and even it and the 4070 Ti. It's kind of dumb. Nvidia essentially has no midrange or budget space. All of its midrange and budget options suck. Not only does it equal out its gains in price, but then you consider its VRAM advantage and yeah. They really fucked up with how they released the 4070 Ti. I don't know what they were thinking with that garbage bus width (or VRAM for that matter; would it have killed them to put 16GB on it? Apparently yes it would have).

I have enough savings to buy... let's say a decent car or two. But I have a lot of trouble justifying a 4090. You can see my topic about it in the GPU section of the forum. I tried a few of them out and I just couldn't get past the price tag, and how it still ended up CPU bottlenecked. I'm still looking at them now, but even for a financially secure person it just feels... bad spending that much on something that will probably lose all prominence in 1-2 years. I've since upgraded to a 7800X3D because I got some good deals (high-end motherboard, CPU, and RAM for <$700). Theoretically I could use a 4090 to more of its max potential now, but... eh. I could also buy a 7900XTX, and the total price with the AM5 platform I purchased would be less than a 4090, lol. But there are issues that keep me from choosing that.

Then there are all of the issues with the 4090s to begin with: PCBs breaking at the clip-in, that stupid fucking power connector they're using (that thing looks like a ticking time bomb to me; I don't care if it keeps running well for a long time), and the way the GPU manufacturers just deny all warranty claims. With EVGA gone, there's just no one offering good terms to buy a 4090 from. When it costs as much as a nice refrigerator (or actually, you can get a nice treadmill for less than that), I need some assurance in my purchase. But the 7900XTX doesn't necessarily look like a worthy upgrade path from my 3080 Ti if more games keep using RT (which it's basically a sidegrade at). It's no wonder people are getting cynical and pissed; this market is just garbage. Can't really win.
 
Nvidia did a tech demo of a game where all of the visuals were AI generated, back in like 2018 or something. It of course looked weird and bad at the time. But it was also amazing.

Now, I'm sure it would look a lot better. But being able to do it on an affordable GPU is probably still 5-10 years away.

Also forgot to respond to this, but I think we're closer than most people realize to this (or at least the limited application that I was talking about up there, or some of the other limited scenarios I discussed) being a reality. The thing is, though, you'd be hard pressed to find a studio that would actually want to transition into it too quickly, at least one that's old and established. I think if we actually pooled really talented people into it, a lot of jobs would be lost very quickly. But those talented people are probably hard at work in other, more profitable areas. As far as I'm aware, the video game industry fucking sucks from a standpoint of "do I actually want to work here from a mental health standpoint". Plus, AI has to be approached kind of carefully in terms of creativity. AI can create a good baseline of anything trivially, but you start noticing patterns in it very quickly. My Stable Diffusion works have a very distinct look and feel to them unless I load up some very outlandish Lora/Lyco combos that overtake the art style significantly. And even then a pattern begins developing to those, too. AI is already great at interpolation, but really needs to be set up properly...
 

Put a bunch of business ties on some servers and make an AI-run company that owns an AI-run game studio to build your AI-made game.

Then you go "AI did it, don't blame me" 🤷‍♂️
 
Gigabyte's lowest 4070 is a good buy now that it's priced at $550. It has solid cooling, a full-coverage backplate, and uses a standard 8-pin power connector, so you don't have to spend extra money on a cable which will actually fit in your case.
Also forgot to respond to this, but I think we're closer than most people realize to this (or at least the limited application that I was talking about up there, or some of the other limited scenarios I discussed) being a reality. The thing is, though, you'd be hard pressed to find a studio that would actually want to transition into it too quickly, at least one that's old and established. I think if we actually pooled really talented people into it, a lot of jobs would be lost very quickly. But those talented people are probably hard at work in other, more profitable areas. As far as I'm aware, the video game industry fucking sucks from a standpoint of "do I actually want to work here from a mental health standpoint". Plus, AI has to be approached kind of carefully in terms of creativity. AI can create a good baseline of anything trivially, but you start noticing patterns in it very quickly. My Stable Diffusion works have a very distinct look and feel to them unless I load up some very outlandish Lora/Lyco combos that overtake the art style significantly. And even then a pattern begins developing to those, too. AI is already great at interpolation, but really needs to be set up properly...
IIRC, that Catanzaro guy from Nvidia said that a game would still have a full art team and a team for physics, UI, etc. But it would all be input to the AI, rather than a raster engine. So the AI wouldn't be working like those image generators, where you just feed it keywords, descriptors, etc., and it just pulls something out of a hat based on its training data. Instead, it would take all of the input from the art teams, etc., and use its intelligence to display it.
 
still have a full art team and a team for physics, UI, etc

For now

(attached gif: neural-net-processor-terminator2.gif)


It already knows most of that stuff. Because it's technically possible, it's a matter of not if but when, IMO; just see how long the social aspect slows it down.

AI can code you an engine, write you a script, generate your textures and your models, generate your voices and sounds...
 
a world where AMD is a (very) distant third in consumer GPUs.

Where did you see / read that AMD is (right now) a (very) distant third in consumer GPUs?
Last time I checked, AMD's current flagship is only behind Nvidia's best, and the market share figures posted by JP Morgan also show AMD is still no. 2, albeit with declining discrete GPU market share.
 

The team needed to create training data is much smaller than the team needed to create all of the assets normally. I can basically guarantee you that. No one's just gonna come out and say, "We'll help your company figure out where the lay--err "cost reductions/resource reallocations" are gonna be lol!"

Gigabyte's lowest 4070 is a good buy now that it's priced at $550. It has solid cooling, a full-coverage backplate, and uses a standard 8-pin power connector, so you don't have to spend extra money on a cable which will actually fit in your case.

The 4070 basically has worse performance than last gen's 3080. Actually (I think; at least I see some FE listings for it), you could also just go on eBay right now and buy a 3090 for less money, and it will have better performance and twice the VRAM. On the other hand, you would miss out on the new DLSS revision, which is the crappy part... but still, I don't think Nvidia has anything terribly compelling at the low end or midrange right now.
 
No one's just gonna come out and say, "We'll help your company figure out where the lay--err "cost reductions/resource reallocations" are gonna be lol!"

Plus, anything done to slow it or ban it is not really gonna stop things, because that will just entice others to do it faster (or at all) elsewhere. Gonna ban AI-made products, etc.? Then get jealous watching them 'bloom'/make money elsewhere? Someone's gonna do it just to say it was done, no matter what.
 
One of the more reasonable takes I’ve seen on here. Upscaling is good to have as an option to squeeze out the desired fps if your GPU is lacking. I won’t use frame gen personally due to the input lag. They are both good options to have, but I don’t get the amount of people evangelizing them as the new gold standard over native resolution. My guess is that most of them are owners of a non-4090 Ada card trying to justify the inflated price to themselves and others.

I have a 4090 and use DLSS whenever it's available. I've been playing the new Cyberpunk update and it's pretty legit. Personally, I never really could tell the difference between DLSS/DLAA and native resolution. I play BF 2042 and use DLSS and can't tell the difference either.
Anecdotal evidence for sure, but it's still good tech.
 
I also have a 4090 and use DLSS if available, unless the game can basically max my refresh rate. I can tell the difference, depending on the game, but it tends to be VERY minor in Quality mode, and often it's a case where it improves the fine detail of some things. I'll take the extra FPS where I can get them; I like 120Hz.
 
I'm a bit skeptical. I've read some online accounts of Lumen in that Eternals game and a chunk of people seem to quite dislike it and think it looks pretty bad. Also as I understand it, it isn't actually leveraging the RT processing units on the Nvidia GPUs to begin with (at least not by default), because it wants to maintain hardware agnosticism. I think it's a bit disingenuous to compare the two in that manner when Nvidia especially does dedicate a decent amount of the processing power to RT (though apparently still not quite enough).

It looks bad because they did a bad job. For Lumen to work, you need to do your mapping in Nanite and you must use dynamic lighting; static lighting does not interact with the shadow maps and will not interact with any ray-traced or dynamic elements.
Dynamic lighting is relatively expensive, and in the game they mixed and matched the lighting in a poor attempt to keep FPS usable on the consoles.
As a result, shadows and light don't work correctly; it's a mess.
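For anyone who wants to poke at this in their own UE5 project, the usual knobs look roughly like the sketch below; I'm going from memory here, so treat the exact console variables and values as my assumption and double-check them against the UE5 docs before relying on them.

```ini
[/Script/Engine.RendererSettings]
; 1 = Lumen for dynamic global illumination and reflections
r.DynamicGlobalIlluminationMethod=1
r.ReflectionMethod=1
; Mesh distance fields feed Lumen's software ray tracing path
r.GenerateMeshDistanceFields=True
; Optional: let Lumen use hardware ray tracing where the GPU supports it
r.Lumen.HardwareRayTracing=1
```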
 
I also have a 4090 and use DLSS if available, unless the game can basically max my refresh rate. I can tell the difference, depending on the game, but it tends to be VERY minor in Quality mode, and often it's a case where it improves the fine detail of some things. I'll take the extra FPS where I can get them; I like 120Hz.
Here's a benchmark with DLSS at 4K. What settings are you using?
(attached: 4K DLSS benchmark chart)
 
Haven't played Cyberpunk yet, just talking games in general. When I get to it, not sure what settings I'll end up using.
 
Where did you see / read that AMD is (right now) a (very) distant third in consumer GPUs?
Last time I checked, AMD's current flagship is only behind Nvidia's best, and the market share figures posted by JP Morgan also show AMD is still no. 2, albeit with declining discrete GPU market share.

It's a hypothetical; we don't live in a world where software tricks are how we improve graphical fidelity. If that really is the future of graphics, AMD is deeply screwed without serious changes to how they support their hardware.
 

I don't think the game is over; I genuinely believe that AMD will get it together in terms of their software. The company is still pretty early into what almost feels like a rebirth under Su, and this is only the second full generation of their improving GPU business.

The question I keep asking is: How did they let it get so bad to begin with?
 
Money, plain and simple. nVidia has acquired A LOT of companies, far more than AMD could ever hope to dream of. The companies are not even comparable in terms of size. This also affects marketing capability. nVidia, to me, is in the league of Apple when it comes to that. It's even better than Intel. I mean, look at what they did with DLSS. Overnight, a few press releases and "not all cards support DLSS 3" turned into "they all do, they just don't all support frame generation," even though you can still find marketing slicks from nVidia itself claiming the former.
 
It started long before that (the current state of things); that's how Nvidia got as big as it is now. Nvidia has been at this since ~2008 with CUDA.
 
One of the more reasonable takes I’ve seen on here. Upscaling is good to have as an option to squeeze out the desired fps if your GPU is lacking. I won’t use frame gen personally due to the input lag. They are both good options to have, but I don’t get the amount of people evangelizing them as the new gold standard over native resolution. My guess is that most of them are owners of a non-4090 Ada card trying to justify the inflated price to themselves and others.
In terms of Frame Gen, have you actually used it? Every time I hear "input lag," I laugh and assume said comment has come from someone who has never used FG before. I have a 4090 and have no trouble driving 4K, but I have used FG in FH5, Portal RTX, CP2077, and SW Jedi Survivor... zero input lag noticeable in any game. Feels smooth as butter. One key is that you need to turn vSync on in the NVCP for that specific game when using frame gen. I've never had a single issue, and the solid frames are amazing, as it helps with 1% lows as well.
 
I use it in Darktide, and whatever input lag it's generating there is less than the latency from me to the game's servers, so it doesn't matter.
Once Reflex is turned on, it adds like 2 ms.
My busted-ass hands and tired old eyes can't notice 2 ms.
And the guys I play with haven't ever complained about me not pulling my weight, well, except for the occasional rando who is always dead because somehow they always end up alone and that's somehow our fault…

But yeah, the input lag isn't really a thing.
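To put some rough numbers on that, here's a quick back-of-the-envelope sketch (my own simplified model, assuming interpolation has to hold back roughly one rendered frame before it can show the in-between frame; real pipelines with Reflex in the mix are more complicated):

```python
# Rough, simplified model (my assumptions, not measured data): to interpolate
# between two rendered frames, the newest frame has to be held back roughly one
# render interval before the generated frame can be shown.
def added_latency_ms(base_fps: float) -> float:
    """Approximate extra delay from holding back one rendered frame."""
    return 1000.0 / base_fps

for fps in (30, 60, 90, 120):
    extra = added_latency_ms(fps)
    # ~20-50 ms is just a ballpark for a typical online game's server round trip
    print(f"{fps:>3} rendered fps -> ~{extra:4.1f} ms extra (vs ~20-50 ms typical server round trip)")
```

Which lines up with the point above: at a decent base frame rate the added delay is small next to a typical online round trip, while at low base frame rates it becomes much easier to feel.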
 
Seems like one of those things that people perceive differently. I mean some people claim 30 fps is just fine, after all.
 
I've still only played with it in Portal RTX (mainly because I'm cheap and wait for games to hit $10 before I buy them), which was fine; I didn't notice anything. One game/demo from Nvidia isn't really enough for me to judge it for myself overall, but again, the general consensus seems to be 'it does what they say and is good at it', not 'almost' or 'within 20%'.

Stuff like that/excuses can only cut it for so long, as we now see.
 
I bet that most people who claim FG has too much input lag couldn't reliably determine this in a blind test. Provided the hardware and settings are correct. A GSYNC/VRR display is mandatory.
Yeah, all my displays at this point are G-Sync certified; I've never used it on a display that isn't, so I can't say if that would make it noticeably worse or not. I assume it would, but I'm not going to go about testing that.
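Just to put a number on "reliably determine": here's a quick sketch (my own arithmetic, not any particular test protocol) of how many correct calls a blind FG-on/FG-off test would need before it beats pure guessing.

```python
# If someone is purely guessing whether frame gen is on, each trial is a coin
# flip. How many correct calls out of 20 would beat chance at the 5% level?
from math import comb

def p_at_least(k: int, n: int) -> float:
    """One-sided probability of getting k or more out of n right at 50/50."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

n = 20
for k in range(12, n + 1):
    p = p_at_least(k, n)
    note = "  <- better than chance (p < 0.05)" if p < 0.05 else ""
    print(f"{k:2d}/{n} correct: p = {p:.3f}{note}")
```

In other words, "I can feel it" only really means something once someone is calling it right about 15 times out of 20 or better under blinded conditions.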
 