NVIDIA GeForce RTX 3090 Ti is real

Running an OC'd 3900X and 2080 Ti in a small office that gets direct sunlight, that room heats up fast.
You can help that with the right kind of window tint and blinds, and perhaps a new window if yours isn't double or triple pane already...
 
Yeah, it needs a new window; it's a single pane in a wooden frame, and the house was built in the '20s. It's a bit of a project, to say the least.
 
A window, some nice tint, and blinds would not be ruinously expensive and would reduce the amount of time your air conditioner needs to run. You could just delete the window entirely if you wanted... If you lived there long enough it might pay for itself eventually, but our lives are short; I'd do it regardless of the monetary savings, on the general principle that I value my family's and our pets' comfort.

The hottest it's ever gotten in my town during my lifespan to date was 111°F with 99% relative humidity. The coldest was -1°F. Those days and nights convinced me to buy the house I did and gut it a little before fitting it out with multiroom sound, LAN and AP jacks in the ceilings, spray foam insulation in the exterior walls, Rockwool in the interior walls, new much-higher-efficiency air conditioners upsized a little, and new same-efficiency gas furnaces. The whole thing is 3x the size of the house we used to rent yet uses 20~25% less energy.

It also means I can game on my ridiculous 7.2 system without bothering anyone, since the walls and the region between the floor and the downstairs ceiling are insulated. To achieve the following I had to replace the door and door frame with a custom solid hardwood door and exterior frame: I have to run the receiver up beyond -22dB to generate human complaints, and -26dB to wake up the dogs on the other end of the house (German Shepherds with gigantic ears). It also means we can lightly roughhouse with the dogs and get them all riled up indoors without bothering our neighbors. That's nice when the weather outside is frightful (tornado warnings, etc.); it's easier to get your dogs into the bunker when they're tired, I find.

The squiggly bit is because it never got that cold at the rental house, so I'm not sure how much natural gas it would have burned.
 
The difference is that in the '40s that was reasonable provisioning for power. Today it's crappy companies trying to save a penny even if it screws you. I'd also expect them to have undersized HVAC: slightly cheaper up front, but significantly higher operational costs for the sucker who bought it.
Yep, bought a new house and the AC compressor is way undersized, among other issues. So great, you have this "super efficient" undersized unit that has to run three times as long because the differential is like 10 degrees... Awesome.
 
AC... what kind of magic is that? But seriously though, yeah, it's a huge reno project (for me): windows, siding, insulation, plumbing, and electrical are all needed, then the kitchen and bathroom. It's planned for this spring/fall. No gas though, all wood heat, but there are no neighbors to bother for a fair way in all directions.
 
We have a property similar to that out on Lake Ouachita/out in the country, but it was heavily renovated during the Great Recession to stimulate the community's out-of-work contractors. That was probably my favorite way to spend dead folks' money. We have wood stoves at my house, the lake, and deer/duck camp, but they're not the primary means of heat at any of them. The lake runs on propane and wood, camp runs on a large pond and geothermal loop plus wood, with a propane furnace for emergencies, and the house runs on natural gas and wood. I highly recommend central A/C though. Ours keeps it between 65 and 75°F without any human intervention except monthly HVAC filter changes (366 lbs of German Shepherds). We use the wood stoves to burn the deadfall from our forests. Stoves are great, but using them as primary heat necessitates a whole lot of wood and cleaning up the ashes :/ Instead we clean the stoves monthly (in the winter) and the chimneys annually.
 
So you live in an alternate world where SLI works great in all modern games?
What I was thinking. NVIDIA stopped supporting implicit SLI a couple years ago, and you can't enable SLI bits for other games without running into rendering issues nearly 100% of the time.
 
There's a small handful of games it still works with: GTA 5, Witcher 3, For Honor, I think Fallout 4. But it's a pretty sad list. I'm pretty sure that unless it's a game with a specified profile, it just doesn't actually run in SLI and one of the cards sits idle.
 
The last time I used SLI was in the 8800GT days, but at least back then there were regular driver updates with SLI profiles. I can't imagine trying to use it now, since it's basically no longer supported in driver updates, and it doesn't work the same way 3090s in NVLink do.
 
A 7.7% bump in perf for a 50% increase in scalpage over the 3090.

I'm predicting $3000+ for this.
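For what it's worth, a quick sanity check on those two numbers, taking both the claimed 7.7% and the claimed 50% at face value (a sketch, not a measurement):

```python
# Back-of-envelope value check using the numbers in the post above.
perf_ratio = 1.077   # 3090 Ti vs. 3090 performance, as claimed
price_ratio = 1.50   # scalped price vs. 3090, as claimed

value_ratio = perf_ratio / price_ratio
print(f"Perf per dollar vs. the 3090: {value_ratio:.2f}x "
      f"(~{(1 - value_ratio) * 100:.0f}% worse)")
# -> Perf per dollar vs. the 3090: 0.72x (~28% worse)
```

In other words, if those figures hold, each frame costs about 1.39x what it does on a 3090.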
 
I'm still running my 2x GTX 1070 Ti's in SLI and rocking 4K (2160p) gaming. I see no need to upgrade for some time still.
Unless you're only playing old games or turning settings down significantly in some cases, I disagree. I mean, you might truly be satisfied, but I'm telling you that the performance given by a pair of GTX 1070 Ti's at 4K is less than stellar. I had two GTX 1080 Ti's in SLI, and though they did work well in a few titles, the vast majority of them didn't touch the second card at all. When the RTX 2080 Ti came out I jumped ship for a single card and saw a reduction in performance in the few titles where SLI worked, but often my experience was a little better because it was more consistent, with better minimum frame rates. In other cases where SLI did nothing, the RTX 2080 Ti was significantly faster than what was effectively a single GTX 1080 Ti. Even then, I'd hesitate to call it a good 4K gaming card. It was simply the best available at the time.

I've been doing 4K gaming since the beginning. The RTX 3090 is the first card I've seen that really has the power to deliver a solid 4K gaming experience in most titles. Even then, there are some titles where performance could be better. Now, the RTX 3080, 3080 Ti, 6800 XT/6900 XT, etc. are all fine as well. While I haven't used them myself, the numbers are generally good. A regular RTX 3080 would be my minimum for 4K gaming now. Even then, its 10GB of VRAM is limiting in a few scenarios and certainly will be as time passes.

I'm running an RTX 3090 now and if there were a card that provided a significant uplift in performance at 4K, I'd buy it on the spot. From the sound of it, the RTX 3090 Ti isn't going to be worth it over what I have.
 
With the 3080 12GB there is basically no reason for the 3080 Ti/3090 for 4K gaming, IMO. Both yield marginally higher FPS, but the major flaw of the 3080 was resolved now that it has 12GB of VRAM. The only cases where I've seen the 3080 Ti/3090 be noticeably better are when things get ray-tracing heavy and the raw compute of those cards wins out.

Of course, at the time, the 3080 10GB and 3090 were the only choices anyway...

I agree though, the 1080 Ti was barely passable for 4K gaming. The 2080 Ti was a decent step forward, but not enough, IMO. It wasn't until I got my 3080 Ti that I finally got acceptable FPS in newer 4K titles. I can't imagine using a 1070 Ti for 4K gaming... and I say a single 1070 Ti, because SLI blows. Been there, done that. The last time I ever used it was with the 8800GT. It was such a pain in the ass, and there were quite a few games where it just outright didn't work.
 
Well, as I said, the RTX 3090 was the first card I thought was truly capable of solid 4K gaming. It does predate the RTX 3080 Ti. Were I buying today, the 3080 Ti would make more sense. However, I got my RTX 3090 FE a few weeks after the product launch. And again, the AMD cards are fine for it as I understand it, barring some ray tracing limitations and, early on, the lack of a DLSS-like solution.

I actually had great luck with SLI over the years. It wasn't until the GTX 1080 Ti's that support had gotten so bad that it wasn't really viable anymore.
 
My OC'd 2080 Ti does great at 1440p; I wouldn't want to use it at 4K 120Hz though. Maybe if I had a G-Sync monitor, but I don't, and I could get the card or the screen, but not both.
 
The 2080 Ti was tolerable at 4K, but yes, I have a G-Sync monitor, which really helps when you're talking ~40 FPS.
 
OK, well then, this definitely won't work with an SF750 and a 5950X. Looking at the specs, it's probably less than 10% faster, right? More like 5%?
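For a rough sanity check, here's a back-of-envelope power budget, assuming this is the Corsair SF750 and using NVIDIA's published 450W TBP and AMD's 142W PPT; the rest-of-system draw and the transient multiplier are guesses, not measurements:

```python
# Rough steady-state and transient power budget for an SF750 build.
PSU_WATTS      = 750   # SF750 rating
GPU_TBP        = 450   # RTX 3090 Ti total board power (NVIDIA spec)
CPU_PPT        = 142   # Ryzen 9 5950X package power limit (AMD spec)
REST_OF_SYSTEM = 75    # fans, drives, RAM, motherboard -- rough guess

steady = GPU_TBP + CPU_PPT + REST_OF_SYSTEM
print(f"Steady-state estimate: {steady} W ({steady / PSU_WATTS:.0%} of PSU)")

# Ampere cards are known for millisecond-scale spikes well above TBP;
# 1.5x here is a commonly cited ballpark, not a measurement of this card.
spike = 1.5 * GPU_TBP + CPU_PPT + REST_OF_SYSTEM
print(f"With a 1.5x GPU transient: {spike:.0f} W")
# -> roughly 667 W steady (89% of the PSU) and ~892 W on spikes
```

Steady state alone leaves almost no margin, and transients are what actually trip a small PSU's protection, so "definitely won't work" looks about right.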
 
Based on the specs it should be 10% faster than an FE 3090. With an OC you could probably put them on par. I doubt there is much headroom left on the Ti vs. the non-Ti for clock speeds.
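A minimal sketch of that spec math, using NVIDIA's published shader counts and boost clocks (theoretical FP32 throughput only, so treat it as a ceiling rather than an in-game number):

```python
# Theoretical throughput ratio from published specs (cores x boost clock).
cores_3090,   boost_3090   = 10496, 1.695  # RTX 3090 FE, GHz
cores_3090ti, boost_3090ti = 10752, 1.860  # RTX 3090 Ti, GHz

ratio = (cores_3090ti * boost_3090ti) / (cores_3090 * boost_3090)
print(f"On paper: {ratio:.3f}x (~{(ratio - 1) * 100:.0f}% faster)")
# -> about 1.124x; real games typically land below the paper number,
#    which squares with the ~10% estimate above.
```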
 
5-10%, for sure. Like the Titan X and Titan Xp, it's not worth the money if you already have a 3090.
 
The EVGA 3090 Ti is about 4 inches thick in the pic on Videocardz. Thicker doesn't mean more substance, like a pack of hot dogs.
 
I just don't see the point of releasing it for a whole 10% more performance, especially with Lovelace coming in, what, 6-8 months?

They probably just want to go up against the 6950 XT. But eh, it's very late to the party if you ask me.
 
Steve from Gamers Nexus said it best in his review of the 3090 Ti: "The maximum card you need to be looking at for a gaming computer really stops at a 3080 or a 6800 XT"
 
This is definitely the "I want the fastest GPU you have, and I can't afford to wait" card for people who are either building their dream PC or a creative workstation. Virtually everyone else would be better-served by buying a much cheaper placeholder GPU (say, an RTX 3070/3070 Ti) and buying a nice RTX 40 series card when they're available.

Also, fun thought: this card not only costs as much as a Mac Studio, an entire computer, but consumes more power and probably occupies as much or more total volume. You wouldn't buy a Mac Studio for gaming, of course, but you also can't use the 3090 Ti as a full-fledged PC.
 
TPU reviewed the FTW3 Ultra, which is still in stock at Newegg right now. 12% faster than the 3090 at 4K, up to 20% faster when ray tracing is added. Considering the current price of the standard 3090 this is a no-brainer.

Umm... am I the only one kinda upset at that statement? I totally blame the consumers for paying these prices...
 