MSI GeForce RTX 3090 Ti SUPRIM X Specs Leak Out: 1900 MHz Extreme Mode Boost Clock & 1000W PSU Recommended

Went ahead and purchased the EVGA 3090 Ti FTW. Yeah, it's not worth it price-wise, especially since I already have a 3090, but I figured: I just upgraded my PC to the absolute latest and greatest, so why not go all the way? I'm probably going to give my nephew my 3090. (Or, if there's enough space, I can always use it as my dedicated PhysX card!)
 
Nvidia is just whacked out of their minds with this shit. Never liked Nvidia. What is the purpose of this crap? To get 300 fps? Whoever buys this crap needs to be shot.
Counterpoint: the games I play (e.g., DCS World in VR) actually would benefit perceptibly from this. Even with a 3090, it's possible to get reprojection in DCS, and this card appears to be 5-15% faster, depending on the game. Reprojection sucks. I can imagine it being worth it for some folks, beyond just the compulsive need to have "the best." I'm not that committed, but to the person who built himself a model of an F-16 cockpit in his basement, this would probably be a no-brainer, especially if he were upgrading from something older, like a 2080 Ti.

Also, I can imagine some business applications where this would be considered a bargain. For instance, we have a workstation we use to process our collected LIDAR data in the field. I'm not sure the boost over the 3090 would make that much of a difference, but the 24GB of VRAM would allow us to load an entire flight's worth of data at once, which would provide a very real benefit in the time it takes to make sure we didn't miss any spots.
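
For a rough sense of scale, here's some back-of-the-envelope math in Python (the per-point size and flight size are illustrative assumptions, not our actual survey numbers):

# Back-of-the-envelope VRAM budget for one LIDAR flight (illustrative numbers).
BYTES_PER_POINT = 16                # x/y/z as float32 plus intensity (an assumption)
POINTS_PER_FLIGHT = 1_200_000_000   # ~1.2 billion returns per flight (an assumption)
VRAM_BYTES = 24 * 1024**3           # 24GB of VRAM

cloud_bytes = BYTES_PER_POINT * POINTS_PER_FLIGHT
print(f"point cloud: {cloud_bytes / 1024**3:.1f} GiB of {VRAM_BYTES / 1024**3:.0f} GiB VRAM")
# -> point cloud: 17.9 GiB of 24 GiB VRAM, i.e. a whole flight fits at once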
 
Went ahead and purchased the EVGA 3090 Ti FTW. Yeah, it's not worth it price-wise, especially since I already have a 3090, but I figured: I just upgraded my PC to the absolute latest and greatest, so why not go all the way? I'm probably going to give my nephew my 3090. (Or, if there's enough space, I can always use it as my dedicated PhysX card!)
I am sorry, but in 6-8 months your card will be blown away by Lovelace. With both AMD and Nvidia expected to deliver a huge jump in performance, 80-120% faster than the current top cards... you should just wait it out.

Just my $0.02
 
I am sorry, but in 6-8 months your card will be blown away by Lovelace. With both AMD and Nvidia expected to deliver a huge jump in performance, 80-120% faster than the current top cards... you should just wait it out.

Just my $0.02
Sounds like he has the money. I bet he buys Lovelace too. More power to him, I say.
 
Once we start getting some credible 40-series leaks, current-gen sales are going to plummet.
There are already leaks out about the new cards. The hacker group released it all, and numerous people have verified that information.

Lovelace is going to be one monster of a card with a massive jump in performance, and so are AMD's GPUs.
 
Counterpoint: the games I play (e.g., DCS World in VR) actually would benefit perceptibly from this. Even with a 3090, it's possible to get reprojection in DCS, and this card appears to be 5-15% faster, depending on the game. Reprojection sucks. I can imagine it being worth it for some folks, beyond just the compulsive need to have "the best." I'm not that committed, but to the person who built himself a model of an F-16 cockpit in his basement, this would probably be a no-brainer, especially if he were upgrading from something older, like a 2080 Ti.

Also, I can imagine some business applications where this would be considered a bargain. For instance, we have a workstation we use to process our collected LIDAR data in the field. I'm not sure the boost over the 3090 would make that much of a difference, but the 24GB of VRAM would allow us to load an entire flight's worth of data at once, which would provide a very real benefit in the time it takes to make sure we didn't miss any spots.
Ehh, for the price that 5-10% is not worth it. Wait for the next gen; maybe you can get in and out at no cost depending on the used market, but if you can't, you're tossing money away.
 
This card is definitely huge. Much bigger than the 3090 FTW. And you'll need a massive case in order to NVLink. It really is a 4-slot card, and sadly, my case only supports 3 slots for the bottom PCI-Express slot.

Anyway:

https://www.3dmark.com/3dm/73666196?

But Quake 2 RTX now runs at 50+ fps at 4K with all options maxed (100+ fps at 1440p).
 
I think I'll wait for the 4xxx series to upgrade from my 3090. I don't see any sense in buying a 3090 Ti, especially since I'm actually poor right now. Have loans.
 
I think I'll wait for the 4xxx series to upgrade from my 3090. I don't see any sense in buying a 3090 Ti, especially since I'm actually poor right now. Have loans.
There are two kinds of people who have 3090s.

1: people who are quite wealthy and can spend their disposable income on a 3090

2: people who are not quite wealthy BECAUSE they spent their disposable income on a 3090
 
Man, this decision to buy the TUF card is TUF. Dammit!

I just ordered a TUF because of how TUF it looks (it even has military-grade hardware; says so right on the box!)

Now the question is... will my 1000W PSU be enough to power this freak along with the 12900K? I will keep the fire extinguisher at the ready for any nuclear meltdown I inadvertently trigger.
 
I just ordered a TUF because of how TUF it looks (it even has military-grade hardware; says so right on the box!)

Now the question is... will my 1000W PSU be enough to power this freak along with the 12900K? I will keep the fire extinguisher at the ready for any nuclear meltdown I inadvertently trigger.
It's fine.
 
I just ordered a TUF because of how TUF it looks (it even has military-grade hardware; says so right on the box!)

Now the question is... will my 1000W PSU be enough to power this freak along with the 12900K? I will keep the fire extinguisher at the ready for any nuclear meltdown I inadvertently trigger.
Also, I hear it has bitching fans that are as quiet as a military assassin.
 
I just ordered a TUF because of how TUF it looks (it even has military-grade hardware; says so right on the box!)

Now the question is... will my 1000W PSU be enough to power this freak along with the 12900K? I will keep the fire extinguisher at the ready for any nuclear meltdown I inadvertently trigger.
Depends: do you plan to overclock the CPU and GPU?

I say that tongue-in-cheek; you should be good. Though it should be noted the FTW3 version can consume a little over 500 watts. Not sure about the SUPRIM.
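
Quick napkin math on the 1000W question, using that FTW3 figure and rough guesses for everything else (the CPU and platform numbers are assumptions, not measurements):

# Rough steady-state power budget (CPU/platform figures are assumptions).
gpu_w = 520    # 3090 Ti with the power slider maxed, per FTW3 reports
cpu_w = 250    # 12900K under a heavy all-core load (assumption)
rest_w = 75    # motherboard, RAM, drives, fans (assumption)
psu_w = 1000

steady = gpu_w + cpu_w + rest_w    # 845 W
print(f"steady state: {steady} W -> {steady / psu_w:.0%} of a {psu_w} W PSU")
# ~845 W sustained leaves only ~150 W of headroom for millisecond GPU spikes,
# which is exactly where marginal units trip their protection.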
 
I am sorry, but in 6-8 months your card will be blown away by Lovelace. With both AMD and Nvidia expected to deliver a huge jump in performance, 80-120% faster than the current top cards... you should just wait it out.

Just my $0.02
He appears to have enough disposable income to upgrade to Ada when it's available, so I don't think he's too concerned.
 
I am sorry, but in 6-8 months your card will be blown away by Lovelace. With both AMD and Nvidia expected to deliver a huge jump in performance, 80-120% faster than the current top cards... you should just wait it out.

Just my $0.02
Um, they say that at every release: double the performance! Then reality hits and it's more like 20-40%, depending on the scenario. :)
 
Um, they say that at every release: double the performance! Then reality hits and it's more like 20-40%, depending on the scenario. :)
I at least expect it with AMD. An MCM design will bring huge gains; think of two 6900 XT chiplets in one package. That's why I'm thinking Nvidia has to bring something huge in performance to keep up. Next-gen GPUs are going to be super interesting.
 
I at least expect it with AMD. An MCM design will bring huge gains; think of two 6900 XT chiplets in one package. That's why I'm thinking Nvidia has to bring something huge in performance to keep up. Next-gen GPUs are going to be super interesting.

How is AMD going to spin it? It would be a 550-600W card.

I don't think we are going to see double anything this generation without a radical improvement in the tech being used. Maybe a big improvement in ray-tracing performance and DLSS-type workloads.
 
How is AMD going to spin it? It would be a 550-600W card.

I don't think we are going to see double anything this generation without a radical improvement in the tech being used. Maybe a big improvement in ray-tracing performance and DLSS-type workloads.
Who knows? Right now it's all speculation. The only thing we know for sure is that they are going with an MCM design on either 4nm or 5nm. Nvidia is going with a monolithic design and is FINALLY going back to TSMC.

Too many questions right now. But being able to put two chiplets in one package is a huge leap in technology when it comes to GPUs. Interesting times ahead for both companies.
 
Depends: do you plan to overclock the CPU and GPU?

I say that tongue-in-cheek; you should be good. Though it should be noted the FTW3 version can consume a little over 500 watts. Not sure about the SUPRIM.

No OC needed for the CPU, but I will want to max the power slider in Afterburner. That is where I had issues with the FE 3090/8700K and an 850W Seasonic unit: I had total system shutdowns while gaming with the GPU power target set to +114%. 850W *should* have been fine according to spec...

Anyway, I'll find out tomorrow when the TUF arrives. Got the whole weekend to play with it 🤘
 
No OC needed for the CPU, but I will want to max the power slider in Afterburner. That is where I had issues with the FE 3090/8700K and an 850W Seasonic unit: I had total system shutdowns while gaming with the GPU power target set to +114%. 850W *should* have been fine according to spec...

Anyway, I'll find out tomorrow when the TUF arrives. Got the whole weekend to play with it 🤘
Transient loads. Ampere cards can spike well above their average draw for a few milliseconds, which can trip a PSU's protection even when the sustained load looks fine on paper.
 
No OC needed for the CPU, but I will want to max the power slider in Afterburner. That is where I had issues with the FE 3090/8700K and an 850W Seasonic unit: I had total system shutdowns while gaming with the GPU power target set to +114%. 850W *should* have been fine according to spec...

Anyway, I'll find out tomorrow when the TUF arrives. Got the whole weekend to play with it 🤘
Watch your metrics. Raising the power limit lets the card draw more and run hotter, and GPU Boost drops bins as temperature climbs, so you can actually end up slower than leaving it at 100%. The best thing you can do to keep the card in the highest boost bin most of the time is do anything and everything to keep the temperatures down under load. Ideally under 50C.
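
If you want hard numbers while you test, a quick way is to poll nvidia-smi and log power draw, temperature, and SM clock once a second. A minimal Python sketch (assumes nvidia-smi is on your PATH; the query fields are standard nvidia-smi ones):

# Print power draw, GPU temperature, and SM clock every second.
import subprocess

subprocess.run([
    "nvidia-smi",
    "--query-gpu=power.draw,temperature.gpu,clocks.sm",
    "--format=csv",
    "-l", "1",    # loop once per second; Ctrl+C to stop
])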
 
Also, I can imagine some business applications where this would be considered a bargain. For instance, we have a workstation we use to process our collected LIDAR data in the field. I'm not sure the boost over the 3090 would make that much of a difference, but the 24GB of VRAM would allow us to load an entire flight's worth of data at once, which would provide a very real benefit in the time it takes to make sure we didn't miss any spots.
The 3090 has the same amount of VRAM. The boost is minimal and would in no logical sense ever be considered a "bargain".
 