RTX 5090 - $2000 - 2 Slot Design - available on Jan. 30 - 12VHPWR still

Indeed.

However, some games have frame pacing or hitching issues, or tearing, with framegen. One or more pieces of the chain seem to be failing.

Point being, this stuff is far from perfect, despite what some of its more bullish fans claim.

Frame generation automatically enables Nvidia's Reflex, which caps FPS a few frames below your refresh rate.
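
For what it's worth, the cap values people have measured line up with a community-derived formula, refresh - refresh^2/3600. That's reverse-engineered observation, not anything Nvidia documents, so treat it as an approximation:

Code:
# Community-measured approximation of the Reflex auto-cap
# (refresh - refresh^2 / 3600); NOT an official Nvidia formula.
def reflex_cap(refresh_hz: float) -> float:
    return refresh_hz - (refresh_hz * refresh_hz) / 3600.0

for hz in (120, 144, 240):
    print(f"{hz} Hz -> ~{reflex_cap(hz):.0f} fps")
# 120 Hz -> ~116 fps, 144 Hz -> ~138 fps, 240 Hz -> ~224 fps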

Interesting, I will have to re-check that. In my experience it never seems to work. In every game I use frame generation in, the FPS seems to be unlocked and regularly goes above my refresh rate (144); I can see it go up to 175 or so. Setting an FPS cap in game is usually disabled, and if one is enabled in the NV control panel it does not seem to work when frame generation is on.
 
I love my eye candy too, but frames/smoothness has its own unique benefit to the quality of the image. Like I'm not going to go from 144 FPS to 65 FPS just for slightly better reflections in limited areas.

I have a 120hz monitor these days (LG C3), and while I do appreciate a little higher framerate than I used to, I am still an "eye candy first" aficionado. There is just something about good eye candy that can make a good story title even more immersive.

But when it comes to the old 60fps rule of thumb, there is something people forget. Back then, sure, we targeted an average of 60fps, but that was because we had v-sync on. The target was an "average" of 60fps only because every single frame was exactly 1/60 of a second (unless you dropped below 60fps, which we didn't want to do).

That is not the same as targeting an average of 60 fps today. An average framerate of 60fps in the G-Sync/Free-Sync/VRR era is MUCH worse than a flat 60fps.

By modern standards the equivalent is more like targeting a 0.1% minimum of 60fps.

And that is completely different. If you target a 0.1% minimum of 60fps, your average tends to wind up in the 90-100fps range, and that I actually tend to find quite acceptable. I'm not going to scoff at more if I can get it, but I am also not going to sacrifice eye candy and immersion in order to do so.
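
For reference, here is a minimal sketch of one common way those "0.1% low" numbers are computed from a frametime log (the function name and the convention of averaging the slowest slice of frames are my assumptions; capture tools differ in the exact method):

Code:
# Minimal sketch: deriving a "0.1% low" fps figure from frame times.
# percentile_low_fps and the averaging convention are illustrative.
import statistics

def percentile_low_fps(frame_times_ms, pct=0.1):
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, round(len(worst) * pct / 100))
    return 1000.0 / statistics.mean(worst[:n])

# Hypothetical log: mostly 10 ms frames (100 fps) with occasional 16.7 ms dips.
frames = [10.0] * 990 + [16.7] * 10
print(round(percentile_low_fps(frames, 0.1)))   # ~60 (0.1% low)
print(round(1000.0 / statistics.mean(frames)))  # ~99 (average fps)

Which is exactly the pattern described above: a run that averages near 100 fps can still have its 0.1% low sitting right at 60.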
 
Interesting, I will have to re-check that. In my experience it never seems to work. In every game I use frame generation in, the FPS seems to be unlocked and regularly goes above my refresh rate (144); I can see it go up to 175 or so. Setting an FPS cap in game is usually disabled, and if one is enabled in the NV control panel it does not seem to work when frame generation is on.

Turn on v-sync? That seems to cap things at your display's max refresh (on Nvidia, but not on AMD).
 
I have a 120hz monitor these days (LG C3), and while I do appreciate a little higher framerate than I used to, I am still an "eye candy first" aficionado. There is just something about good eye candy that can make a good story title even more immersive.

But when it comes to the old 60fps rule of thumb, there is something people forget. Back then, sure, we targeted an average of 60fps, but that was because we had v-sync on. The target was an "average" of 60fps only because every single frame was exactly 1/60 of a second (unless you dropped below 60fps, which we didn't want to do).

That is not the same as targeting an average of 60 fps today. An average framerate of 60fps in the G-Sync/Free-Sync/VRR era is MUCH worse than a flat 60fps.

By modern standards the equivalent is more like targeting a 0.1% minimum of 60fps.

And that is completely different. If you target a 0.1% minimum of 60fps, your average tends to wind up in the 90-100fps range, and that I actually tend to find quite acceptable. I'm not going to scoff at more if I can get it, but I am also not going to sacrifice eye candy and immersion in order to do so.

I never used vsync, ever, until I started using VRR.
 
Interesting, I will have to re-check that. In my experience it never seems to work. In every game I use frame generation in, the FPS seems to be unlocked and regularly goes above my refresh rate (144); I can see it go up to 175 or so. Setting an FPS cap in game is usually disabled, and if one is enabled in the NV control panel it does not seem to work when frame generation is on.
Do you have Vsync on in the NV control panel? You should. Gsync works best that way, and so does Reflex.
 
I never used vsync, ever, until I started using VRR.

Ugh. Back in the fixed refresh monitor days, I had it permanently on. Even with competitive multiplayer games.

Tearing is vomit-worthy. Could not play without v-sync at all in those days.
 
Wait, I read the RTX 5080's specifications somewhere, and it has fewer ROPs, CUDA cores, and Tensor cores than the RTX 4090. Is this true? If yes, will the 4090 be faster?
Yes, it was clearly announced.
Yes. It makes up for it with software trickery.
You better get the 5090 or you'll always doubt if you made the right choice. Or better, wait for the 6090 'cause it will be even better! ;)
 
Yes, it was clearly announced.
Yes. It makes up for it with software trickery.
You better get the 5090 or you'll always doubt if you made the right choice. Or better, wait for the 6090 'cause it will be even better! ;)
OK, thanks. I think I will still buy the 5080, but I must wait for prices. The RTX 5080 is enough for me at 1440p.
 
One thing I did not see mentioned, and I assume has not changed, is that you cannot set a frame rate cap with frame generation. That is a downside of the technology. Oftentimes when I turn it on, it may make scenes where I am getting 80 fps look better at 110 or so, but in other scenes it pushes above my refresh rate/G-Sync limit, and then I see screen tearing. This will probably be a bigger issue with this new iteration of frame generation for many people. I know some monitors have higher refresh rates, but those are often still limited to higher-end OLED panels.

If I missed that and somehow they got frame rate caps to work with frame generation, then that would be excellent news. But it is another reason why I can't get too excited over increases in frame generation performance, and why I prefer seeing improvements outside of DLSS/frame generation at native resolution.
You can frame cap with frame generation. RTSS does it, and IIRC Nvidia's limiter was updated to do it too.

That and Reflex + g-sync + v-sync = automatic capping below refresh rate.
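
Conceptually, an external limiter like RTSS just paces the render loop to a fixed frame time. A toy sketch of the idea (the 138 fps target and the hybrid sleep/spin wait are illustrative assumptions; the real thing hooks the game's present call and is far more precise):

Code:
import time

TARGET_FPS = 138               # e.g. a few fps under a 144 Hz panel
FRAME_TIME = 1.0 / TARGET_FPS

def limited_loop(render_frame):
    deadline = time.perf_counter()
    while True:
        render_frame()
        deadline += FRAME_TIME
        # sleep off most of the slack, then spin briefly for precision
        slack = deadline - time.perf_counter()
        if slack > 0.002:
            time.sleep(slack - 0.002)
        while time.perf_counter() < deadline:
            pass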
 
I just leave the NVCP on automatic for everything, and have G-Sync on. The games have always done what they need to for me. If I'm not using FG, frames cap at my refresh rate. If I am using FG, they seem to cap at 5~6 FPS under my refresh rate. I only use frame limiters when a game does not have one built in, but most these days do. If you are using FG, Vsync should be set to auto and frame caps should be off.
 
Would Nvidia knowingly leave inefficiencies in its firmware or drivers that it could later fix to unlock additional performance, giving itself a PR / competition boost if the market isn't behaving how Nvidia wants?
 
Would Nvidia knowingly leave inefficiencies in its firmware or drivers that it could later fix to unlock additional performance, giving itself a PR / competition boost if the market isn't behaving how Nvidia wants?

I don't see what they would have to gain by doing that.

They want to put the most competitive product on the market they can, so it sells. If there are driver inefficiencies, the product either doesn't sell as well, OR they have to spend more on beefier chips to get the same performance. Either way, they make less money, and gaming is already much less of a moneymaker than their other markets.

More likely they just have a much smaller software team than is necessary to get the job done right the first time, and they use agile-like methodologies to beta test on the customer.

Talented programmers are in short supply and expensive, and they want to use as many of them as they can to improve their AI product lines, because that is where the money is right now.
 
No.

They have no need or reason to do something like that, their stuff is already king of the hill.
 
So on January 30th my local shops in Poland start selling the RTX 5080/5090. I wonder if I will be able to buy a card without problems that day, or will scalpers be faster?
 
No.

They have no need or reason to do something like that, their stuff is already king of the hill.

On the top end yes, but for all the attention it gets, the top end is a small minority of overall gaming sales. Most come in the mid to low end, and there they are seeing increasing pressure from both AMD and Intel.

They have every bit of incentive to maximize the internal cost-to-performance ratio, at least without spending huge amounts on development resources. They have no incentive to intentionally sabotage their launch.

The rumor mill is pretty strong that AMD will be releasing a very competitive mid range offering on January 22nd, just a week before the 50 series is set to go live.
 
But the ATX 3.1 / PCIe 5.1 Prime models are not in shops.

Interesting. Did not know that.

I've been using my two old 1200W Prime Platinum PSUs I bought about 6 years ago.

They are still working perfectly, so I have had no reason to replace them.

I've just been using CableMod adapter cables for the new-style Nvidia connectors. They appear to work perfectly.
 
Required System Power (W) (5): 1000
Supplementary Power Connectors: 4x PCIe 8-pin cables (adapter in box) OR 1x 600 W PCIe Gen 5 cable
(5) Minimum is based on a PC configured with a Ryzen 9 9950X processor. A PCIe CEM 5.1 compliant PSU is recommended. Power requirements can differ depending on system configuration.

Many AIB partners will probably supply a 4x 8-pin adapter with the card.
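
Rough math on why the 4x 8-pin adapter works out: 4 × 150 W (the 8-pin spec limit) + 75 W (PCIe slot) = 675 W available, which covers the card's announced 575 W TGP with some headroom.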
 

Unofficial roundup




https://youtu.be/EAceREYg-Qc?si=8uCAGm4dhke0GWqn

https://youtu.be/XWZN6DuG7sk?si=cCgm-p550F29AsPG

https://youtu.be/6IH0YgZKSwM?si=60P--2MV7lzwQwWA
 
Of course I'm stuck in the office and unable to watch these vids today. Totally forgot they were rolling this stuff out.
 
Reviewers scoffed that the new card would be like a radiator, and in the end it turned out to be 2 slots. Jensen surprised everyone again.
He's the best tech CEO out there and continues to prove it. Don't listen to the biased clickbait idiots.
 
Reviewers scoffed that the new card would be like a radiator, and in the end it turned out to be 2 slots. Jensen surprised everyone again.
We shall see how the 600W card does, throwing all that heat toward the CPU and memory. Top-mounted case fans will be key in dumping all that heat out of the case as fast as possible.
I am also wondering whether that main board, with everything packed in so tight, will cause problems we are not aware of yet but will see over time. That is a lot of heat in a small area.
 