$2,500 RTX 5090 (60% faster than 4090)

Once you start playing games at 150+ fps on high refresh rate monitors, you can never go back. I had to swap monitors around, and mine defaulted to 60 Hz when I reconnected it. I started playing a game and thought something was broken; I thought my GPU was failing or something. The 60 Hz refresh rate looked so bad to me.

Once you see 150+, you can't go back.
I don't mind 60 Hz as long as it never drops below 60 fps. Obviously crazy high refresh rate is another experience entirely, but as long as I can get a consistent 60 fps, it's very smooth. Lag can die a fiery death.
 
Tbh I haven't had a gaming experience I would consider "smooth" since a cheap CRT. Even my current 120Hz LG C3 has noticeable blur to me. It's better than an LED-backlit LCD, and I'm keeping it around for the blacks, but at the end of the day it's still not perfect for motion clarity. The 240Hz 3440x1440 40-something-inch panel was definitely noticeably better. What was even better than that, surprisingly, is my old ASUS ROG Swift in ULMB mode (iirc; it's been off my desk for a while).

The problem is just sample and hold. I think it's called "persistence". We need like 1k refresh rate OLED before we can overcome that. I don't think the GPU will actually have to pass 1k FPS into it, though. Just the monitor being able to refresh that quickly will do the trick.
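
To put rough numbers on the sample-and-hold point (just a back-of-the-envelope sketch, not anything from the posts above): on an idealized sample-and-hold display each frame is held for the full refresh period, so the hold time shrinks in direct proportion to the refresh rate.

```python
# Hold (persistence) time per frame on an idealized sample-and-hold display.
# Illustrative only; real panels add pixel response time on top of this.
for hz in (60, 120, 240, 1000):
    hold_ms = 1000 / hz  # milliseconds each frame stays on screen
    print(f"{hz:>4} Hz -> frame held for ~{hold_ms:.2f} ms")
```

At around 1000 Hz the hold time is down to roughly 1 ms per frame, which is in the same ballpark as the strobe pulse of ULMB-style backlights, which is presumably where the "1k refresh rate" figure comes from.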
 
Once you see 150+, you can't go back.

For my eyes the diminishing returns really begin at around 110 fps. I always tailor settings with a 110 min framerate in mind. I love eye candy and with VRR don't necessarily need to maintain a constant 144. Dropping below triple digits too often is where I begin making cuts in graphical fidelity.

That said I can't wait for big screen OLED TVs to support 240Hz and beyond. More frames is ALWAYS better. I'd salivate to have a TV as sweet as my S95C running at 1000 FPS :D
 
Leaker Kopite is saying that the RTX 5000 series is going back to more sane profiles: 2-slot, 2-fan max, even for the 5090. I hope this is true. 3-4 slot wide behemoths are too much GPU for little ol' me.
 
Leaker Kopite is saying that the RTX 5000 series is going back to more sane profiles: 2-slot, 2-fan max, even for the 5090. I hope this is true. 3-4 slot wide behemoths are too much GPU for little ol' me.
Didn't I just see a leak by him a week ago saying how it was going to be huge like the rumored 4090 ti hsf?
 
It really seems to depend from person to person, a bit like CRT flicker: for me it started to look OK at around 85 Hz, and I could use 75 Hz in exchange for a higher resolution when I needed it for an application. Some people had no issue with 60 Hz, and some couldn't go below 100.

Personally, for something like games, my 170 Hz monitor, like yours, at some point went back to 60 for some reason and... I wasn't sure; I had to go check in the settings to see if that was the case.
My biggest issue with CRTs back in the day was that oftentimes we could do 85+ Hz at 1600x1200, but if we weren't using BNC inputs the text would get fuzzy, which was annoying to my (then) 20/15 close vision! 60Hz on a CRT, particularly with a white background, was utterly murderous to my vision: a migraine in a few minutes!

60Hz on LCD is fine for (most) desktop work. 120Hz on my LG C3 is pure bliss, just wish the uniformity was better. ;-)
 
That's just what leakers do: pretty much say everything possible, so that no matter what the outcome is they can be like, HA, see, I was right!
Yep exactly.

It's anywhere from 2 to 4 slots for the cooler.

It's either one or two chips.

It's either 384-bit, 448-bit, or 512-bit.

Anywhere from 24GB to 32GB.

May walk on water.
 
It's either one or two chips.
Double the performance! Even more in RT when using the new reconstruction. But also not a good generational upgrade: no node gains, and it won't be bigger than the 4090 to keep cost down, so expect Turing-over-Pascal levels of generational performance.

Will come out somewhere between late summer 2024 and the Christmas season of 2025, and cost between $1,000 and $3,000.
 
Didn't I just see a leak by him a week ago saying how it was going to be huge like the rumored 4090 ti hsf?
I thought that leak came from Chiphell? Eh, doesn't matter. We'll all know what's up when Jensen gets on stage and announces cards and pricing this fall. I for one will reiterate that I hope the rumor is true. Power consumption and form factor (read: size and weight) have gotten out of hand.

While speculation is fun and I don't mind occasionally engaging with it, I won't argue against the point that, at the end of the day, all this talk is mental masturbation. Trying to read the digital tea leaves is still fun, though :)
 
I had an ATi Radeon 9600XT back then and it really struggled with Doom 3 and Far Cry at 640x480 resolution on a CRT monitor. Only Half-Life 2 ran decently.

It was only when I got a GeForce 7600GT a few years later that I was able to finally play Doom 3 and Far Cry properly, and even at a higher resolution on a 1024x768 LCD.
I got Doom 3 to run on my Ti 4200 pretty well at 1024x768 with a highly customized autoconfig file. I may have it saved somewhere.
PC gaming is actually a relatively cheap hobby, unless you're into racing sims then it can get CRAZY expensive. There are way more expensive hobbies out there. One of my friends tried to clown on me for spending 4 figures on a GPU while he had no issue spending 4 figures on a set of tires for his car.
Racing sims used to be relatively cheap and they still can be. You can get a decent wheel and pedal setup for $300-400. Considering inflation that isn't much more than the $150 I spent on a Logitech Driving Force Pro back in 2005.
From my understanding the 4090 is memory-bandwidth starved; swapping to GDDR7 and a larger bus should ameliorate that problem. I think the two-slot thing might be true because there ought to be fewer memory modules at 2 GB per module, right? I think GDDR7 is also more efficient.
The memory chips on the 4090 are already 16 Gb / 2 GB apiece. All 12 of them reside on the front of the PCB.
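
For a rough sense of the bandwidth being discussed, here's a quick sketch. The 4090's 384-bit bus and 21 Gbps GDDR6X are real figures; the 512-bit GDDR7 data rate below is purely an assumed placeholder, not a confirmed spec.

```python
# GB/s = (bus width in bits / 8) * per-pin data rate in Gbps
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(384, 21))  # 4090, GDDR6X at 21 Gbps -> 1008.0 GB/s
print(bandwidth_gb_s(512, 28))  # hypothetical 512-bit GDDR7 at 28 Gbps -> 1792.0 GB/s
```

Also worth noting on the module count: each GDDR chip has a 32-bit interface, so a wider bus means more chips (384-bit -> 12, 512-bit -> 16), not fewer; it's capacity per chip, not the bus width, that would cut the count.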
9xx was 28nm while 10xx was 16nm.
28nm was used for four generations: both Kepler gens and both Maxwell gens. 16nm was used for both Pascal and Turing (12nm was just an improvement of the 16nm process with no real improvement in transistor density).
I thought that leak came from Chiphell? Eh, doesn't matter. We'll all know what's up when Jensen gets on stage and announces cards and pricing this fall. I for one will reiterate that I hope the rumor is true. Power consumption and form factor (read: size and weight) have gotten out of hand.

While speculation is fun and I don't mind occasionally engaging with it, I won't argue against the point that, at the end of the day, all this talk is mental masturbation. Trying to read the digital tea leaves is still fun, though :)
Is it going to be real Jensen, CGI Jensen, or AI Jensen this time?
 
28nm was used for four generations: both Kepler gens and both Maxwell gens. 16nm was used for both Pascal and Turing (12nm was just an improvement of the 16nm process with no real improvement in transistor density).
Yeah, 28nm was really kind of a long-lived node, and it's kind of crazy we got 3 (kind of 4) gens of cards out of it. And then what's funny is you can almost say Pascal was Maxwell gen 3, because it was basically just Maxwell on speed due to the drop to 16nm.
 
NVIDIA is now the second most valuable public company in the US, ahead of Apple. Who would have thought!
 
For my eyes the diminishing returns really begin at around 110 fps. I always tailor settings with a 110 min framerate in mind. I love eye candy and with VRR don't necessarily need to maintain a constant 144. Dropping below triple digits too often is where I begin making cuts in graphical fidelity.

That said I can't wait for big screen OLED TVs to support 240Hz and beyond. More frames is ALWAYS better. I'd salivate to have a TV as sweet as my S95C running at 1000 FPS :D
120 for me. 120 vs 240 is indistinguishable for me.
Interesting, everyone is different, and honest about it too.


Anyone here with laser eyes notice the difference between 360 and 480? :0
 
Well the last time 2 generations were made on the same node was 9xx and 10xx, and the 1080s were a huge boost in performance, and are still good cards nearly 8 years on. So the 5090 might surprise us if we get a big performance uptick despite being on the same node.

Edit: Nope I remembered wrong:

So we can only guess on the performance from gen to gen based on things like core counts, clockspeed, and memory bandwidth.
1080 still plays DOTA2 on max settings with a 2600K.
 
One of the best-run companies in the world. Jensen is a visionary. Best GPUs in the world.

The fact that Jensen, an engineer, is one of the co-founders and he's been right there from the start, all the while navigating the ship through the high stakes and brutally competitive tech industry and being able to take it to where it is now. Quite extraordinary, I have to say.
 
The fact that Jensen, an engineer, is one of the co-founders and he's been right there from the start, all the while navigating the ship through the high stakes and brutally competitive tech industry and being able to take it to where it is now. Quite extraordinary, I have to say.
It is just hilarious to think of what the boxes for their first GPUs looked like. Scantily clad buxom armored women with blasters. They were marketing their products to like a 14 year old. Now their GPUs look like stately high fi components, they are the number 2 most valuable company in the world (for now) and their target customer are the biggest companies in the world.

I feel like NVIDIA grew up, but thanks to their products, I didn’t! 😂

I hope they don’t forget about us!
 
It is just hilarious to think of what the boxes for their first GPUs looked like. Scantily clad buxom armored women with blasters. They were marketing their products to like a 14 year old. Now their GPUs look like stately high fi components, they are the number 2 most valuable company in the world (for now) and their target customer are the biggest companies in the world.

I feel like NVIDIA grew up, but thanks to their products, I didn’t! 😂
So you’re saying that even after all this time you still look at scantily clad buxom armored women with blasters decoded by NVIDIA hardware-based real-time codec acceleration?
 
It is just hilarious to think of what the boxes for their first GPUs looked like. Scantily clad buxom armored women with blasters. They were marketing their products to like a 14 year old. Now their GPUs look like stately high fi components, they are the number 2 most valuable company in the world (for now) and their target customer are the biggest companies in the world.

I feel like NVIDIA grew up, but thanks to their products, I didn’t! 😂

I hope they don’t forget about us!
Nvidia's investments in GPGPU and CUDA way back in the mid 2000's was the turning point for the company, I think.
 
So you’re saying that even after all this time you still look at scantily clad buxom armored women with blasters decoded by NVIDIA hardware-based real-time codec acceleration?
No, I didn't really pay much attention to the box art; I thought it was ridiculous back then lol. The GPUs have been a distraction, however.
 
Nvidia's investments in GPGPU and CUDA way back in the mid 2000's was the turning point for the company, I think.
If I understand correctly, it is CUDA that gives them their competitive “moat.”

I just find it fascinating that these GPUs we’ve been obsessing about for decades are now considered like the holy grail of computing. We actually have been putting them in display cases in our houses for years, showing them off, making these crazy looking machines. And now those machines are taking over the economy.
 
I hope they don’t forget about us!
Look at their pricing these days compared to historical norms on consumer GPUs... Listen to what Jensen said on stage... nVIDIA is an AI company now, not a GPU company. They haven't forgotten about us; they just don't care about giving us value anymore after the scalper/crypto boom and with Big Iron/AI money on the table.

For AI Moore's Law is running at 2x. For us gamers Moore's Law is dead.

Vote with your wallet. I am (by buying used).
 
I just found out a guy here locally charges people $150 for a haircut before a tip. If that can happen, this can also happen.
 
I just miss the days when Creative Labs was the top dog. Then Hercules. Then EVGA. Sigh.
 
For AI Moore's Law is running at 2x. For us gamers Moore's Law is dead.

The 294 mm² 4070 Ti has an estimated 35,800 million transistors; the 276 mm² Ampere chip (3060) had an estimated 12,000 million. That seems comfortably within the doubling-every-2-years trend.

The jump was so big that the 3060 is often not even present in 4070 Ti reviews; at 1440p the 4070 Ti shows a 134% performance jump, more than doubling it.

The price did hide how good of a leap Lovelace was tech-wise (for the power used, die size, bandwidth, bus width, etc.), and Ampere's transistor density had almost doubled Turing's just before that.

We will have to see, but if Blackwell's jump is close (we can doubt that's possible too), Turing -> Ampere -> Lovelace -> Blackwell would be quite something, and not that far off Moore's law.

If, instead of the number of transistors on a chip doubling every 2 years, we use the price per transistor being cut in half every 2 years, then yes, it has been dead.
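
Quick sanity check of those figures in code (a sketch using only the numbers from the post above, which are TechPowerUp-style estimates, not official counts):

```python
# Transistor density and gen-over-gen count ratio from the die sizes and
# transistor counts quoted above (GA106 "3060" vs AD104 "4070 Ti").
chips = {
    "3060 (GA106, 2020)":    (12_000, 276),  # millions of transistors, die area in mm^2
    "4070 Ti (AD104, 2022)": (35_800, 294),
}
for name, (mtrans, area_mm2) in chips.items():
    print(f"{name}: {mtrans / area_mm2:.1f} M transistors / mm²")

print(f"Count ratio over 2 years: ~{35_800 / 12_000:.1f}x (a straight doubling would be 2.0x)")
```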
 
The 294 mm² 4070 Ti has an estimated 35,800 million transistors; the 276 mm² Ampere chip (3060) had an estimated 12,000 million. That seems comfortably within the doubling-every-2-years trend.

The jump was so big that the 3060 is often not even present in 4070 Ti reviews; at 1440p the 4070 Ti shows a 134% performance jump, more than doubling it.

The price did hide how good of a leap Lovelace was tech-wise (for the power used, die size, bandwidth, bus width, etc.), and Ampere's transistor density had almost doubled Turing's just before that.

We will have to see, but if Blackwell's jump is close (we can doubt that's possible too), Turing -> Ampere -> Lovelace -> Blackwell would be quite something, and not that far off Moore's law.

If, instead of the number of transistors on a chip doubling every 2 years, we use the price per transistor being cut in half every 2 years, then yes, it has been dead.
And the 284mm² 1660 Ti had 6,600 million transistors.
 
And the 284mm² 1660 Ti had 6,600 million transistors.
Yeah, almost doubling, if TPU's estimates are close enough:


2018: 24.3M / mm² (Turing)
2020: 44.4M / mm² (Ampere)
2022: 125.3M / mm² (Lovelace)

Doubling every 2 years, for comparison:
2018: 24.3
2020: 48.6
2022: 97.2
2024: 194.4

Maybe Blackwell will slow down, but had they been TSMC N3E chips like Apple's and released this year, they would still be ahead of the curve (around 215 M / mm²).
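
Same doubling curve, computed rather than listed by hand (the measured densities are the TPU-style estimates above; the 2024 row is only the projection, there is no measured Blackwell figure in it):

```python
# Strict doubling-every-2-years projection from the 2018 Turing density,
# next to the estimated densities listed above (M transistors / mm²).
measured = {2018: 24.3, 2020: 44.4, 2022: 125.3}

for year in (2018, 2020, 2022, 2024):
    projected = 24.3 * 2 ** ((year - 2018) / 2)
    actual = measured.get(year)
    print(f"{year}: projected {projected:6.1f}   measured {actual if actual is not None else 'n/a'}")
```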
 
Lovelace is freaking incredible. It's kind of a shame that the pricing on it was pretty rough for certain SKUs, and people were led to believe that the 4090 was going to be some ridiculous 600-watt monster when in reality it only pulls around 360 watts in gaming, which is barely any more than a 3090.
 
Lovelace is freaking incredible. It's kind of a shame that the pricing on it was pretty rough for certain SKUs, and people were led to believe that the 4090 was going to be some ridiculous 600-watt monster when in reality it only pulls around 360 watts in gaming, which is barely any more than a 3090.
Yep, and if you power limit it to 80% you lose very little performance yet drop the usage under 300 W.

I got a notebook with an RTX 4080 mobile (around 4070 super perf) and absolutely love it. It's quiet for a notebook and efficient on its balanced mode, and if I need a little more it is reasonable on its performance mode.
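
For anyone who prefers doing the power limiting from a script instead of a vendor tool, here's a minimal sketch via nvidia-smi (needs admin/root; the 360 W value is just 80% of a stock 4090's 450 W limit, used purely as an illustration, so pick whatever matches your card):

```python
# Sketch: query and cap GPU board power through nvidia-smi (requires elevated rights).
import subprocess

# Show the current, default, and max enforceable power limits.
subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)

# Cap board power at 360 W (~80% of a 450 W stock 4090 limit).
# Tools like MSI Afterburner expose the same control as a percentage slider.
subprocess.run(["nvidia-smi", "-pl", "360"], check=True)
```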
 
The wording of those is so strange.

Rumours of a 2025 launch mean that a 2024 launch has been postponed....

They not only take the new rumours as true, but also treat the previous rumours as having been true (so they weren't wrong; they were true and the launch got postponed). I'm not saying that isn't the case, but it's written and said as if it were all factual, out of nothing.

Like when specs move, they 'changed'; the way it's announced, the previous leakers weren't wrong about their rumours.
 
Lovelace is freaking incredible. It's kind of a shame that the pricing on it was pretty rough for certain SKUs, and people were led to believe that the 4090 was going to be some ridiculous 600-watt monster when in reality it only pulls around 360 watts in gaming, which is barely any more than a 3090.
Even AIBs were led to believe that. Have you seen the size of their coolers? With the 4090 GPU only using about 1 V and the amount of thermal headroom the cards have, I'm convinced Nvidia had a 4090 Ti in the pipeline; they just had no reason to release it.
 
I'm convinced Nvidia had a 4090 Ti in the pipeline; they just had no reason to release it.
And full AD102 (L40) cards selling for like $10,000 were sold out with a 6-week-or-more waiting list, which was a strong reason not to release it (combined with scaling that doesn't seem all that impressive).
 