• Some users have recently had their accounts hijacked. It seems the now-defunct EVGA forums may have been compromised, exposing passwords used there, and many members appear to be reusing the same password here. We would suggest you UPDATE YOUR PASSWORD and TURN ON 2FA for your account here to further secure it. None of the compromised accounts had 2FA turned on.
    Once you have enabled 2FA, your account will be updated soon to show a badge, letting other members know that you use 2FA to protect your account. This should be beneficial for everyone who uses FSFT.

RTX 5090 - $2000 - 2 Slot Design - available on Jan. 30 - 12VHPWR still

I think that's what the TI model will be IMO, a 24GB card with maybe a bump in cores since the gap between the 80 and 90 models is so big.
Only way it gets a bump in cores is a cutdown GB202 from the 5090. I just don't see that happening.

I could be wrong, but AFAIK the 5080 is using the full GB203.
 
So, I have been using Frame Generation in Stalker 2, with DLAA, at 4K, max settings. There are absolutely no artifacts or anything I can see with my eyes while playing the game (I have tried). Does that mean they aren't there? No, of course not, but unless I am missing something, this feels like a mountain-out-of-a-molehill type issue. In terms of "feel", I dunno, it feels great to me and extremely smooth. I do not notice any input lag whatsoever in that game. It has either improved, or it was implemented well in this particular game. I can admit CP2077 did feel like it had SOME input lag, but the game was still very enjoyable with it.

I still don't understand the argument of "fake frames" either... it's all fake because it's all PC-generated. You have to start somewhere in order to advance how games are rendered; people can't expect rasterization to be the end-all-be-all until the end of time, can they?

Haven't played STALKER 2 yet, but there is always a bit more noise with DLSS on things like hair, fences and whatnot. There is also trailing/ghosting for fast objects in motion, though it does vary from game to game. On things like red dot sights, or when using a flashlight, you may see some trailing.

DLSS 4 further improves upon this. DLSS frame generation adds some more ghosting. Generally I find frame generation ghosting to be more noticeable; DLSS 3 largely minimized it, and the main issues that remain are noise and blurriness, which DLSS 4 does seem to further improve.

Example of trailing/ghosting (enlarge & check the plane behind me):

Screen_241028_005331.jpg

Of course the technology is getting better. But it has downsides, and that is why it seems kind of underwhelming. Most of the gains are with DLSS and frame generation. I assume DLSS frame gen with 3 frames will cause more issues. The tech is nice, but IMO it should be reserved for cases where your frame rate is just not high enough; it shouldn't be the main source of improvements and shouldn't be considered something that is used by default.

Also, EU prices are out. Almost 1000€ for the 5070 Ti!? Are you fucking kidding me!? With these kinds of prices PC gaming is going to die. As a teen I could buy myself a nice gaming PC with what I earned working during summer breaks. Now as an adult I have to really ponder whether I'll keep up with my hobby or switch to console gaming instead.

Keep in mind that includes VAT, which is typically something like 20%. That would still be about $100 over the US MSRP, but you get what you pay for. Stores are required to handle longer return periods, so of course that costs them more, and as a result prices have to go up for everyone. You get some benefits of course; it just comes down to what you prefer.

TLDR: You'll only pay $750 in the US if you have no sales tax in your state. Or you get some great sales/cash back.
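For anyone who wants to sanity-check that, here's the rough back-of-envelope math (the 20% VAT and the ~1.03 USD/EUR rate are illustrative assumptions, not exact figures):

```python
# Rough sketch of the VAT comparison above. Both the VAT rate and the
# exchange rate are assumed, illustrative values.
eu_price_eur = 1000.0       # EU shelf price for the 5070 Ti, VAT included
vat_rate = 0.20             # typical EU VAT; actual rates vary by country (19-27%)
usd_per_eur = 1.03          # assumed exchange rate
us_msrp_usd = 750.0         # US MSRP, before any state sales tax

pre_vat_eur = eu_price_eur / (1 + vat_rate)      # ~833 EUR before tax
pre_vat_usd = pre_vat_eur * usd_per_eur          # ~858 USD equivalent
print(f"Pre-VAT EU price: ~${pre_vat_usd:.0f}, "
      f"or ~${pre_vat_usd - us_msrp_usd:.0f} over the US MSRP")
```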
 
Not an expert in any of this GPU technology, but how accurate is my thinking that we're basically at the inflection point where rasterization has kinda hit the wall in terms of its development, and DLSS 4/MFG is just now becoming a somewhat/vaguely adequate replacement - which is highlighted by Zarathustra[H]'s point that DLSS 4/MFG can currently be the icing on the cake (taking a good experience and making it better), but if the underlying raster isn't adequate (i.e., the framerate to start is low), DLSS 4/MFG is putting lipstick on a pig?
 
Not an expert in any of this GPU technology, but how accurate is my thinking that we're basically at the inflection point where rasterization has kinda hit the wall in terms of its development, and DLSS 4/MFG is just now becoming a somewhat/vaguely adequate replacement - which is highlighted by Zarathustra[H]'s point that DLSS 4/MFG can currently be the icing on the cake (taking a good experience and making it better), but if the underlying raster isn't adequate (i.e., the framerate to start is low), DLSS 4/MFG is putting lipstick on a pig?

I think you mostly got it, but not 100%.

You wouldn't compare DLSS to Raster. Those are two different topics.

You might - however - compare raster to RT. Those are different ways of rendering the pixels.

DLSS falls into the upscaling and/or frame generation category

So you might compare DLSS vs native resolution (if DLSS2) or native frames (if DLSS3 or DLSS4)

DLSS can be used for content rendered in either raster or RT.
 
this is the sad truth, but gamers are no longer the customer base, not for a long time now.

Oh gosh no.

Gamers have not been Nvidia's target demographic since at least 2008, probably even earlier, when they figured out how much more they could charge for data center products and enterprise/workstation GPUs.

Way back, Nvidia's GPUs were purpose-designed for people who play games. That hasn't been the case for over 15 years. For over 15 years we have just gotten hand-me-downs: downscaled, binned, disabled-core versions of products designed for the data center.

Gaming for over 15 years now has been an afterthought for them, not a priority.
 
And I for one am perfectly content with this. After all, their raster HW is already the best in the world, and the AI fakery actually happens to.... you know... work! It looks good, and dramatically bumps frame rates on 4K+max settings content. They are doing exactly the right things, and as a 4090 owner I am rather excited by the 5090.
Well said. I don't care what's "fake pixels" or "fake frames" so long as they look good and bump the overall image quality thanks to being able to enable higher settings/resolutions/etc. BRING IT ON!
 
At what point are the fake frames actually fake, though? Changes to DX12 and upcoming changes to Vulkan are going to provide the GPU with even more data with which it can more accurately generate those frames:
https://devblogs.microsoft.com/dire...rectx-cooperative-vector-support-coming-soon/
What is currently AI fakery is getting baked in at an even lower level, which will lead to better quality results with less overhead. Nvidia has been laying the groundwork for this for years now.
I know this is an unpopular opinion, but everything being displayed on the screen is fake. And I honestly could not give a rat's ass what they do to get the graphics in a game to look amazing while still providing a smooth experience. My enjoyment is paramount to my gaming experience, and I'm not going to nitpick how they get the frames onto the screen.

DLSS, frame gen, ray tracing, RTX, path tracing. I don't really care how they work, only that they do and provide an enjoyable gaming experience. The first iterations needed work, but it's been getting better, and it seems like it will continue to get better.
 
"to get to a similar (or superior) end result."

Aye, but there's the rub.

If you are already at 90fps, adding frame gen to max out your 144hz monitor looks very smooth and doesn't suffer playability-wise. It may not add much, but it doesn't hurt.

But try using frame gen to bring a 45fps framerate up to 90fps. You aren't going to like it very much.

Sure it looks smooth to a bystander, but as soon as you grab that mouse it feels just as laggy and terrible as playing at 45fps does.

If Frame Gen and Upscaling could deliver the goods it would be one thing, but they very much can't. Upscaling is better than frame gen at lower frame rates. You get a tiny IQ hit in exchange for a pretty substantial performance increase, but there is an IQ hit.

But frame gen is really rather useless unless you already have an acceptable frame rate to begin with, and if that is the case, the motivation to even bother with it is relatively limited, because the benefits are so tiny. Going from DLSS3 to DLSS4 with multiframe seems even more useless.
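To put rough numbers on that, here's a toy sketch assuming interpolation-style frame gen that has to hold the latest real frame until the next one exists (real pipelines add more overhead on top, and the later posts debate whether it interpolates or extrapolates):

```python
# Toy latency comparison: with interpolation-style frame generation, the latest
# real frame is held back until the next one exists, so responsiveness tracks
# the *base* frame time, not the displayed framerate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for base_fps in (90, 45):
    displayed_fps = base_fps * 2            # 2x frame generation
    base_ms = frame_time_ms(base_fps)
    print(f"{base_fps} fps base -> {displayed_fps} fps displayed, "
          f"~{base_ms:.1f} ms base frame time (plus roughly that much extra hold-back)")
```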
To some of us 60 isn't acceptable. That's about where framegen starts to really work well as a base framerate, anyway :).
 
I know this is an unpopular opinion, but everything being displayed on the screen is fake. And I honestly could not give a rat's ass what they do to get the graphics in a game to look amazing while still providing a smooth experience. My enjoyment is paramount to my gaming experience, and I'm not going to nitpick how they get the frames onto the screen.

DLSS, frame gen, ray tracing, RTX, path tracing. I don't really care how they work, only that they do and provide an enjoyable gaming experience. The first iterations needed work, but it's been getting better, and it seems like it will continue to get better.
Don't need to know how the sausage gets made, I just want it to taste good :D
 
To some of us 60 isn't acceptable. That's about where framegen starts to really work well as a base framerate, anyway :).
The problem with that, though, is you can have 200 frames and all, but if the input latency is terrible it will still feel like 60. That's why framegen needs more work, but it is going down the right path.

There is also ghosting with frame generation; even Linus said it's still present on the 5090 playing Cyberpunk.

Let's wait and see the reviews first though; there's a lot of time before launch day, when all these conversations about DLSS/frame gen etc. will be put to rest.
 
First look at DLSS4. Most of these features are also compatible with the RTX 4000 cards and earlier!


https://www.youtube.com/watch?v=xpzufsxtZpA

I guess I am too old to really see the problem with DLSS (from that video anyway). Reduced ghosting and greatly improved image quality. Really, most images we see use some math operations to optimize things; e.g., JPEG does something similar to a singular value decomposition with an optimized rank to keep the majority of the image data while greatly reducing the amount of data. The fact is, I see all GPU makers going this route and I see the models getting better and better.
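If anyone wants to play with that idea, here's a tiny sketch of the low-rank approximation being gestured at (note that JPEG itself actually uses a block DCT plus quantization rather than SVD, but the keep-most-of-the-image-with-less-data intuition is the same):

```python
# Low-rank SVD approximation of a synthetic "image": keep only the top-k
# singular components and see how much of the image survives.
import numpy as np

rng = np.random.default_rng(0)
x, y = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
img = np.sin(6 * x) * np.cos(4 * y) + 0.05 * rng.standard_normal((256, 256))

U, s, Vt = np.linalg.svd(img, full_matrices=False)
for rank in (5, 20, 50):
    approx = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]
    err = np.linalg.norm(img - approx) / np.linalg.norm(img)
    kept = rank * (256 + 256 + 1) / img.size   # fraction of values actually stored
    print(f"rank {rank:>2}: ~{kept:.1%} of the data, relative error {err:.3f}")
```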
 
Well said. I don't care what's "fake pixels" or "fake frames" so long as they look good and bump the overall image quality thanks to being able to enable higher settings/resolutions/etc. BRING IT ON!

I mean this is kind of the crux of it.

You cope with path tracing at 20 fps, suffer lower settings/resolution, or wait 3 more generations for ray tracing performance to improve far enough.

...Or deal with some visual artifacts and _possibly_ some input lag and mostly eat your cake.

I think it's obvious what most people are going to pick. The end result is superior quality, whether it's real or not. Path tracing is straight up transformative in some games like Cyberpunk. It's a literal generational difference in visuals.

I've flipped between DLAA and DLSS a hundred times in various games. I couldn't tell you that DLAA definitively looks better in most cases. I've seen it stationary, I've seen it in motion, I probably couldn't tell you which is which in most cases unless I'm spending an obnoxious amount of effort analyzing the image. And I might still get it wrong.

The whole point of optimizing my settings is to get the best bang for the buck visually. And if I can't reliably even see a fucking difference, I'll take the "free" win.
 
Sure...and the images you see are not raw data either. There's always a good enough setting and it is never good enough for 100% of the users no matter where you set it.
I agree, it's good enough for most people. But some people just don't like it, myself included. The ghosting is from frame generation, though, not from using DLSS. The problem is Nvidia lumps both together and calls it DLSS.

So, to clear up the misconception I think some people have: the DLSS Quality setting is actually really good, with very minimal ghosting and damn good IQ when using it. It causes some input latency.

Frame Generation (also marketed under DLSS) works by producing more frames on screen. But it causes ghosting because of the frames being generated, and a pretty big hit to input latency.

I for one can easily tell if my monitor/game is over 15ms of input latency, and it drives me nuts. This is why I went with an OLED monitor, to reduce the amount of input latency while gaming. But this will affect everyone differently. I know some people who can't tell the difference between 60 and 120fps, while I can't handle anything lower than 55-60fps...

Either way I do plan to get a 5090.. Looking forward to it :)
 
I think you mostly got it, but not 100%.

You wouldn't compare DLSS to Raster. Those are two different topics.

You might - however - compare raster to RT. Those are different ways of rendering the pixels.

DLSS falls into the upscaling and/or frame generation category

So you might compare DLSS vs native resolution (if DLSS2) or native frames (if DLSS3 or DLSS4)

DLSS can be used for content rendered in either raster or RT.

Exactly. Just because raster is reaching its limits doesn't mean DLSS & frame generation are the only way to improve performance. Ray & path tracing are coming, and will be the future of game development. I do like these technologies, and they are improving. But they should not be the primary source of performance increases. They are extras to turn on if necessary. Development of ray/path tracing can continue, and ideally at some point up-scaling and frame generation won't be required to use ray tracing with good performance.

I always thought of these as great technologies to help boost frame rates, especially for those on lower end GPUs. But when the major improvements across a lineup are DLSS/FG-based, we have an issue.

Maybe we'll see bigger jumps in raster or without DLSS/FG in the upcoming benchmarks, but Nvidia shying away from showing anything without it seems to imply outside of DLSS/FG the performance jump isn't going to be that big.
 
I agree, it's good enough for most people. But some people just don't like it, myself included. The ghosting is from frame generation, though, not from using DLSS. The problem is Nvidia lumps both together and calls it DLSS.

So, to clear up the misconception I think some people have: the DLSS Quality setting is actually really good, with very minimal ghosting and damn good IQ when using it. It causes some input latency.

Frame Generation (also marketed under DLSS) works by producing more frames on screen. But it causes ghosting because of the frames being generated, and a pretty big hit to input latency.

I for one can easily tell if my monitor/game is over 15ms of input latency, and it drives me nuts. This is why I went with an OLED monitor, to reduce the amount of input latency while gaming. But this will affect everyone differently. I know some people who can't tell the difference between 60 and 120fps, while I can't handle anything lower than 55-60fps...

Either way I do plan to get a 5090.. Looking forward to it :)
Anything over 40 fps is crazy good to me...haha. And literally no matter where they put the cutoff, for some it would not be good enough. What do you do? You cover 3 sigma and sell it. The Terence Taos of the gaming world will just have to suffer with us normies, haha.

I am glad at least some of you gifted people put up with us in stride 😆.
 
Anything over 40 fps is crazy good to me...haha. And literally no matter where they put the cutoff, for some it would not be good enough. What do you do? You cover 3 sigma and sell it. The Terence Taos of the gaming world will just have to suffer with us normies, haha.

I am glad at least some of you take it in stride 😆.
just getting old....I feel you man on that one.

It was just a lot of money spent on gaming over the decades to figure out for myself the difference between framerate and input latency..... Yeah.... a lot of wasted money :( lol
 
I agree, it's good enough for most people. But some people just don't like it, myself included. The ghosting is from frame generation, though, not from using DLSS. The problem is Nvidia lumps both together and calls it DLSS.

So, to clear up the misconception I think some people have: the DLSS Quality setting is actually really good, with very minimal ghosting and damn good IQ when using it. It causes some input latency.

Frame Generation (also marketed under DLSS) works by producing more frames on screen. But it causes ghosting because of the frames being generated, and a pretty big hit to input latency.

I for one can easily tell if my monitor/game is over 15ms of input latency, and it drives me nuts. This is why I went with an OLED monitor, to reduce the amount of input latency while gaming. But this will affect everyone differently. I know some people who can't tell the difference between 60 and 120fps, while I can't handle anything lower than 55-60fps...

Either way I do plan to get a 5090.. Looking forward to it :)

DLSS will IMPROVE latency; the game loop is literally running in less time (unless you're so horribly CPU-bound that it did nothing, I guess).

It has nothing to wait on until frame gen comes into play - it's pure savings. You've won those milliseconds back.
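A back-of-the-envelope illustration of why that tends to be the case, assuming GPU frame time scales roughly with rendered pixel count (only approximately true, and it ignores the fixed cost of the upscaling pass itself); the render-scale values are the commonly cited DLSS preset scales:

```python
# Estimated GPU time saved by rendering at a lower internal resolution before
# upscaling. Assumes frame time scales with pixel count -- a simplification.
presets = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}
native_frame_ms = 16.7   # e.g. a game running at ~60 fps at native resolution

for name, axis_scale in presets.items():
    pixel_fraction = axis_scale ** 2
    est_ms = native_frame_ms * pixel_fraction   # ignores the upscaler's own cost
    print(f"DLSS {name}: ~{pixel_fraction:.0%} of native pixels, "
          f"~{est_ms:.1f} ms GPU time vs {native_frame_ms} ms native")
```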
 
It'll be interesting to see how many people acting like they'd never get one in a million years will be posting that they bought one in a month or two. Happened with both the 3090 and 4090.
Sadly, you're only given 2 kidneys. Drat. Are gall bladders worth anything?
 
Did they test with the fastest CPU, the Ryzen 9800X3D? Could the performance with pure rasterization be much higher than Nvidia claims?
 
It'll be interesting to see how many people acting like they'd never get one in a million years will be posting that they bought one in a month or two. Happened with both the 3090 and 4090.
They won't, because they won't be able to acquire one, and the fake-frames grievance will transition into actual frustration, expressed in multi-paragraph freakouts about everyone being sheep, AI being the downfall of humanity, and the world being unfair.

There's an asymmetry people seem to be missing if they're viewing this product launch through the lens of a bygone era where everything still revolved around gaming: in every figurative "line at Microcenter" there is a 500ft Godzilla, and he is the collective AI/LLM/ML/CUDA demand for more of these chips at the professional/industrial/government level, his gaze fixed on that memory bandwidth uplift in particular. He is the reason the 4090 has been perpetually sold out at retail for its entire production life, and he'll be the reason the 5090 plays out exactly the same.

"It uses too much power", "wait for reviews", "it's overpriced", "raster isn't as big an uplift as I hoped", "pointy elbows" etc. are all valid arguments. But none of it matters, because Godzilla's gonna eat, and he demands more calories than the combined output of every foundry on earth can even provide.
 
DLSS will IMPROVE latency; the game loop is literally running in less time (unless you're so horribly CPU-bound that it did nothing, I guess).

It has nothing to wait on until frame gen comes into play - it's pure savings. You've won those milliseconds back.
Imagine I put a dot in the game, and the dot randomly chooses a direction and acceleration vector every, say, .001 seconds. And maybe the game is to shoot the dot, and it is bound to stay on your screen... you get the idea.

If the dot's movement is truly random, that by definition means it's not predictable. No AI or anything else can predict it. And its position and movement are crazy fast and erratic, so the generated frame will necessarily be wrong. You're going to have latency issues with frame generation on that dot. You have to. What exactly that looks like I don't know... it would be an interesting experiment for someone to do.

Now, for most games that's not a problem. But for competitive gamers, the next new frame of actual enemy position is critical, and this tech won't satisfy them.
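A crude version of that experiment, as a sketch: a dot takes a truly random step each real frame, and we measure how far a naive motion-vector extrapolation lands from where the dot actually went. Purely illustrative; real frame generation has far more information (and smarter models) to work with.

```python
# Random-dot thought experiment: extrapolate the next position from the last
# observed velocity and compare against the truly random actual move.
import random

random.seed(42)
pos, prev_pos = 0.0, 0.0
errors = []
for _ in range(10_000):
    velocity = pos - prev_pos                 # last observed "motion vector"
    predicted = pos + velocity                # extrapolated (generated) position
    prev_pos = pos
    pos += random.uniform(-1.0, 1.0)          # truly random next move
    errors.append(abs(predicted - pos))

print(f"mean prediction error: {sum(errors) / len(errors):.2f} "
      f"(vs. a random step of at most 1.0 per frame)")
```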
 
Think I am going to get one if I can. I definitely need one with an AIO, but man I hope it's not as bad as the crap I went through with the 3090. This time I won't have a great company like EVGA that will actually make a list so people don't have to stalk sites and hope a bot doesn't beat them at checkout. 😩
 
Imagine I put a dot in the game, and the dot randomly chooses a direction and acceleration vector every, say, .001 seconds. And maybe the game is to shoot the dot, and it is bound to stay on your screen... you get the idea.

If the dot's movement is truly random, that by definition means it's not predictable. No AI or anything else can predict it. And its position and movement are crazy fast and erratic, so the generated frame will necessarily be wrong. You're going to have latency issues with frame generation on that dot. You have to. What exactly that looks like I don't know... it would be an interesting experiment for someone to do.

Now, for most games that's not a problem. But for competitive gamers, the next new frame of actual enemy position is critical, and this tech won't satisfy them.
For all those competitive gamers with a 1ms ping to the server, I'm heart-broken.
 
WTF is an AI TOP?

I mean, I expected them to go AI crazy, as that is what they do these days, but I don't give a rat's ass about the card's AI capabilities. I don't want them.

Dual slot only makes me suspicious that this won't be a very big performance leap over last gen.

If they required that much power and heat to make the 4090 work, there is no way dual slot is going to cut it in a card that is supposed to be faster than the 4090.

Yes, next-gen fab process and all that, but the difference between TSMC's 4N process and the 4NP this is supposed to use is minor at best. A few percent. I'm not expecting that they will be able to radically improve perf/watt to the extent that this works.

Which probably means this is some "replace even more real performance with AI generative bullshit" generation, and if that is the case, I'm out. I'll keep my 4090 indefinitely. No more fake pixels and frames.
This is one of the dumbest opinions of all time. Cards have been dual slot since forever. The 8800 Ultra was a fire waiting to happen, along with many ATI cards of the time, and they were all dual slot. And what do you know? The next gen came out, also dual slot, with more performance!

If your way of thinking was true, we would be at 10 slot GPUs by now.

Relax.
 
Yeh, keeping my 4080s seems more likely, except that I am bored and want something new. The only upgrade that I have done since Dec of '24 has been to replace my webcam with an Insta360. Well, at least I had fun determining why it was crashing my rig. It turns out that the Insta360 needs power directly off the mobo or it risks system instability, but I digress. As a visual person, I may end up buying the 5 series if I can get my hands on one of the sexy FE models. :)
 
Imagine I put a dot in the game, and the dot randomly chooses a direction and acceleration vector every, say, .001 seconds. And maybe the game is to shoot the dot, and it is bound to stay on your screen... you get the idea.

If the dot's movement is truly random, that by definition means it's not predictable. No AI or anything else can predict it. And its position and movement are crazy fast and erratic, so the generated frame will necessarily be wrong. You're going to have latency issues with frame generation on that dot. You have to. What exactly that looks like I don't know... it would be an interesting experiment for someone to do.

Now, for most games that's not a problem. But for competitive gamers, the next new frame of actual enemy position is critical, and this tech won't satisfy them.

With just DLSS, as in just the upscaling, frame gen not included, you will win on latency. The engine just straight up spends less time.

With frame generation you will spend time because you need another frame as the interpolation end point, plus whatever the cost of generating the frames is.

At 60 fps you've added 16.67ms + generation cost at minimum. Plus probably some other small noise. Display will always be behind the game by at least this overhead. It isn't pushing NEW frames faster, you've just crammed more frames into a single update loop.

Actually, there's probably an amusing breakpoint where you'd technically spend more time synthesizing a frame than actually rendering a new one. Going by the Digital Foundry video, I guess you can extrapolate that it pulls a frame out of its ass somewhere around every 2.5ms, since it looked like about a 6-7ms difference when you move up to 4x frame gen from 2x. It's probably a relatively static cost.

So I guess somewhere in the ballpark of 333-400 base fps, it would theoretically and somewhat comically become more expensive to infer a fake frame on a 5080 than render an actual one.
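Spelling out that breakpoint estimate (the ~2.5-3 ms per generated frame is eyeballed from the Digital Foundry footage, so treat it as an assumption):

```python
# Break-even base framerate: where rendering a real frame takes about as long
# as synthesizing one. The per-frame generation cost is an assumed figure.
for gen_cost_ms in (2.5, 3.0):
    breakeven_fps = 1000.0 / gen_cost_ms
    print(f"~{gen_cost_ms} ms per generated frame -> "
          f"break-even at ~{breakeven_fps:.0f} fps base")
```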
 
With just DLSS, as in just the upscaling, frame gen not included, you will win on latency. The engine just straight up spends less time.

With frame generation you will spend time because you need another frame as the interpolation end point, plus whatever the cost of generating the frames is.

At 60 fps you've added 16.67ms + generation cost at minimum. Plus probably some other small noise. Display will always be behind the game by at least this overhead. It isn't pushing NEW frames faster, you've just crammed more frames into a single update loop.

Actually, there's probably an amusing breakpoint where you'd technically spend more time synthesizing a frame than actually rendering a new one. Going by the Digital Foundry video, I guess you can extrapolate that it pulls a frame out of its ass somewhere around every 2.5ms, since it looked like about a 6-7ms difference when you move up to 4x frame gen from 2x. It's probably a relatively static cost.

So I guess somewhere in the ballpark of 333-400 base fps, it would theoretically and somewhat comically become more expensive to infer a fake frame on a 5080 than render an actual one.
Generating wrong data (which has to be the case here) is literally the issue I'm describing. Or are you suggesting that this will speed up the real image processing pipeline?
 
I don’t know what they are saying but apparently Valorant with DLSS4 and the new Reflex 2 can see upwards of 800 fps, with only 2ms of latency….

So if that’s accurate that’s astounding.
If you're generating 4 frames per actual frame, you're getting 5 frames per GPU-rendered frame (the ground truth frame), so you're really talking about 160 fps of ground truth frames.

2ms of latency is only giving me more questions.
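The arithmetic, spelled out under this post's reading that "4x" means four generated frames per rendered one (Nvidia's 4x mode may count differently, e.g. three generated plus one rendered per group of four displayed):

```python
# Recover the "ground truth" framerate from a displayed framerate under two
# possible readings of the 4x multiplier.
displayed_fps = 800
for generated_per_real in (4, 3):
    group = generated_per_real + 1            # generated frames + the real one
    real_fps = displayed_fps / group
    print(f"{generated_per_real} generated per real frame -> "
          f"{real_fps:.0f} fps of ground-truth frames out of {displayed_fps} displayed")
```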
 
To some of us 60 isn't acceptable. That's about where framegen starts to really work well as a base framerate, anyway :).

But the joke is that even above 60, frame gen gives you a smoother appearance but still the same old input lag you always had, and reduction of input lag has always been the prime motivator for increasing frame rates higher and higher.
 
But the joke is that even above 60, frame gen gives you a smoother appearance but still the same old input lag you always had, and reduction of input lag has always been the prime motivator for increasing frame rates higher and higher.
It's part of it but also increased smoothness, more chances to see movement, and reduced motion blur. It's a positive so long as it doesn't significantly increase input lag, which from my experience, it does not.
 
Did they test with the fastest CPU, the Ryzen 9800X3D? Could the performance with pure rasterization be much higher than Nvidia claims?
The footnotes said games were tested on a 9800X3D and apps were on an Intel 14-something...

I agree, it's good enough for most people. But some people just don't like it, myself included. The ghosting is from frame generation, though, not from using DLSS. The problem is Nvidia lumps both together and calls it DLSS.
So, to clear up the misconception I think some people have: the DLSS Quality setting is actually really good, with very minimal ghosting and damn good IQ when using it. It causes some input latency.
I'm with ya on this one, I can feel/see it all and can't stand it.
possibly...



these are way out of my justifiable range, maybe a lower end one someday...
I am impressed with how much they shrunk the PCB.
 
This is one of the dumbest opinions of all time. Cards have been dual slot since forever. The 8800 Ultra was a fire waiting to happen, along with many ATI cards of the time, and they were all dual slot. And what do you know? The next gen came out, also dual slot, with more performance!

If your way of thinking was true, we would be at 10 slot GPUs by now.

Relax.

Lol.

The 8800 Ultra was a ... (wait, looking it up) ... 175w card. That's less power than a 4070 which clocks in at 200w, and just slightly more than the 160w 4060ti.

The reason you kept getting faster and faster cards over the years without getting bigger and bulkier coolers was simply because die shrinks kept improving the perf/watt.

Even in 2018, the 2080 was only up to 215W, and the 2080ti was 250w.

This worked up until the 2020-2022 time frame, when the 3090 was 350w and the 3090 Ti was 450w. This is when coolers started to grow out of control on the higher end models, and it continued with the 4000 series, with the 4090s having some of the largest and most impractical coolers we have ever seen.

The 4090 is a 450w card (up to 600 with overclocking depending on the model).

The only reason they were able to make that amount of heat dissipation work was by giving them the largest and beefiest coolers video cards have ever seen.

The days of huge die shrinks are over. The 4000 series is on TSMC's 4N process, and the new 5000 series on TSMC's new 4NP process. They are both reworked old processes, derivatives of the 5nm process node. There isn't much in gate density and related power savings (or perf/watt) to be had there. Those days are over.

The GeForce 8800 Ultra was on a 90nm process, with gates more than 20 times as large as current processes. Its predecessor, the 7000 series, started out at 110nm. Its successor, the 9000 series, launched at 65nm. Things were changing rapidly.

Now we are going from a slightly improved 5nm process to a slightly more improved 5nm process.

The days of cranking up performance by shrinking dies are mostly over. Now that performance has to come from somewhere else. So you pretty much crank up the power and get a 600w monster, OR you resort to some sort of trickery.

Now the 5090 is supposed to up the power by another almost 30% over the 4090, and the cooler is getting smaller, and going back to a petite double width design?

It is at the very least suspicious.


Now, Nvidia has partially addressed this (by replacing the TIM with liquid metal to improve heat removal). I'm sure that helps, but does it make a 575w 5090 run as cool as the 200-250w cards that we used to have with petite dual width coolers? I think not. That is just not possible. The laws of thermodynamics get in the way there somehow.

So, either it is going to sound like a freaking jetliner when its two fans spin up to 18krpm, or some trickery is going on inside artificially limiting performance below the 575w level they are claiming. (or maybe it is a little of each?)

Something is off here. Maybe some sort of rampant thermal throttling above like 40C core temps or something. I don't know.

I bet beefy overkill water loops and other extreme cooling will become popular again this gen.
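For reference, the rough arithmetic behind the numbers above (process "nm" names are closer to marketing labels than literal gate sizes, so take the ratio loosely):

```python
# Power uplift claimed for the 5090 vs the 4090
power_4090_w, power_5090_w = 450, 575
print(f"Power: +{(power_5090_w / power_4090_w - 1) * 100:.0f}% over the 4090")

# Linear feature-size ratio between the 8800 Ultra era and today's nodes
node_8800_ultra_nm, node_current_nm = 90, 4   # 4N/4NP are nominally "4nm", 5nm-class derivatives
print(f"Feature size: ~{node_8800_ultra_nm / node_current_nm:.0f}x linear shrink since the 8800 Ultra")
```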
 
With frame generation you will spend time because you need another frame as the interpolation end point, plus whatever the cost of generating the frames is.

I'm pretty sure the reason why DLSS3 (and now 4) can be as low latency as it is, is that the AI works by guessing future outcomes solely on past frames, not on a future interpolation endpoint.

I believe that's why they call it frame gen, and not interpolation.

I've never noticed an increase in input lag when enabling DLSS3 (but I haven't noticed an improvement when enabling it either, and that makes it useless in most cases to me)
 
I'm pretty sure the reason why DLSS3 (and now 4) can be as low latency as it is, is that the AI works by guessing future outcomes solely on past frames, not on a future interpolation endpoint.
Yep, it uses motion vectors to estimate future frames. It isn't grandpa's TV frame interpolation.
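For what it's worth, a toy sketch of the structural difference the two posts are describing (purely illustrative, not a claim about what DLSS actually does internally): interpolation needs the next real frame and has to hold it back, while extrapolation only uses past frames.

```python
# Toy 1-D "frames": an object's position per rendered frame.
def interpolate(frame_a: float, frame_b: float, t: float = 0.5) -> float:
    """Midpoint between two real frames; requires frame_b to already exist."""
    return frame_a + t * (frame_b - frame_a)

def extrapolate(frame_prev: float, frame_curr: float) -> float:
    """Guess the next frame from past motion only; no waiting, but it can be wrong."""
    return frame_curr + (frame_curr - frame_prev)

frames = [0.0, 1.0, 2.5]   # real rendered positions
print("interpolated between frames 2 and 3:", interpolate(frames[1], frames[2]))
print("extrapolated after frames 1 and 2:  ", extrapolate(frames[0], frames[1]))
```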
 