The rumored RTX 4090Ti/TITAN

Specs seem insane tbh, not a small bump

"According to the latest rumors, the TITAN ADA graphics card could feature as many as 18176 CUDA cores and possibly even 48GB of memory. In fact, this could be NVIDIA’s first card to launch with 24 Gbps GDDR6X modules.

The 800W TDP that is being mentioned is the maximum board power, which is not comparable to RTX 4090’s 450W TDP, but rather its max power limit of 600W.

Thus far there is no indication that this GPU might be released soon. However, with so many photographs now leaked, there is no doubt that the quad-slot RTX 40 design was planned for a long time."


https://videocardz.com/newz/nvidias...ti-titan-800w-graphics-card-has-been-pictured
 
Maybe there will be a bigger supply of these. I would never buy one; I'd rather have a mid-range card and an OLED monitor before this 2.5K nightmare.
The 800 watts would knock out my radio and anything else I have connected in the house. Figure you also have to have a monitor and speakers connected,
which most people do, and the wattage is really getting out of hand for most households. Steve Burke will laugh at it, along with Linus Tech Tips.
 
Man, good thing I have a dedicated 30-amp breaker for my UPS in the office. Between this and a 13900K you might trip a breaker at full load, unless you move to 240V.
 
not against it having 48 GB of memory...because that's just pure awesome. but....what the hell would use that much?
 
not against it having 48 GB of memory...because that's just pure awesome. but....what the hell would use that much?
AI models, mainly. But even then, jesus. The last folks I worked with using consumer kit heavily for that tended to go multi-GPU to scale up compute along with the RAM... because models don't care about SLI.

For those, two 4090s > this monstrosity. And this is too big to run two of them.
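A quick back-of-envelope sketch of why AI models chew through VRAM this fast. The parameter counts, precision, and overhead factor below are illustrative assumptions, not figures from the article:

```python
# Rough VRAM estimate for holding a transformer model's weights at a given precision.
# Illustrative only: real usage adds activations, KV cache, and framework overhead,
# which the flat 20% overhead factor here only crudely approximates.

def model_vram_gb(params_billions: float, bytes_per_param: int, overhead: float = 1.2) -> float:
    """Approximate VRAM in GB: parameter count * bytes per parameter, plus ~20% overhead."""
    return params_billions * 1e9 * bytes_per_param * overhead / 1e9

# A 30B-parameter model in fp16 (2 bytes/param) already blows past even a 48GB card:
print(round(model_vram_gb(30, 2), 1))  # 72.0
```

Which is why multi-GPU setups that pool memory across cards end up cheaper per gigabyte than one giant card.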
 
not against it having 48 GB of memory...because that's just pure awesome. but....what the hell would use that much?

A lot of the typical use cases would not fit within this card's design anyway (80GB cards have been available for a good amount of time, plus memory pooling to make that much larger).

AI stuff can use a lot; for example, a workload peaking at 72 GB of VRAM:
 
Tell me why the 4090 cooler couldn't cool this, given how cool it keeps the 4090, and it shouldn't be that hard to cool a 4090 Ti given it's not even that much bigger a chip than the 4090, lmao.
der8auer is going to laugh at this card. Nvidia is clearly pushing the limits again. This is going to be another exercise in pulling an additional 200-300W for a measly additional 10%.
Even my 4090 starts to get inefficient over 70% power, being most efficient at about 55%: it uses almost half the power but still nets you 80-90% of the full performance, lol.
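The efficiency claim above is easy to put in perspective. The (power %, performance %) pairs below are the poster's anecdotal figures, not benchmarks:

```python
# Perf-per-watt at different power limits, using the poster's rough 4090 numbers.
# The pairs below (power limit % -> performance %) are anecdotal guesses, not measured data.
points = {100: 100, 70: 95, 55: 85}

for power_pct, perf_pct in points.items():
    efficiency = perf_pct / power_pct
    print(f"{power_pct}% power -> {perf_pct}% perf ({efficiency:.2f}x perf per unit power)")
```

Running it shows perf-per-watt climbing as the power limit drops, which is exactly the diminishing-returns curve an 800W card would be sitting at the far end of.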
 
Maybe there will be a bigger supply of these. I would never buy one; I'd rather have a mid-range card and an OLED monitor before this 2.5K nightmare.
The 800 watts would knock out my radio and anything else I have connected in the house. Figure you also have to have a monitor and speakers connected,
which most people do, and the wattage is really getting out of hand for most households. Steve Burke will laugh at it, along with Linus Tech Tips.
What if midrange cards start to pull over 600W in the near future? How hard is it to upgrade the electrical service in our homes? lol
 
der8auer is going to laugh at this card. Nvidia is clearly pushing the limits again. This is going to be another exercise in pulling an additional 200-300W for a measly additional 10%.
Even my 4090 starts to get inefficient over 70% power, being most efficient at about 55%: it uses almost half the power but still nets you 80-90% of the full performance, lol.
If 40 of those extra 200 watts go to the RAM, and 10 to beefier caps, MOSFETs, fans, and higher losses,

that leaves around 150 watts for the GPU itself for 11% more cores, which should push the diminishing-returns curve a bit higher.

That's the core count of the RTX 6000 Ada, which went in the other direction: less heat, noise, and size, slower ECC RAM for workstation use, and the ability to rack 2-3 of them at around 300W each.
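The power-budget split sketched above works out roughly like this. Every figure here is speculative, taken from the rumor and the post, not from any measurement:

```python
# Splitting the rumored extra board power (800W max vs. the 4090's 600W max power limit).
# All numbers are speculative guesses from the post above, not measured values.
extra_board_power = 800 - 600        # 200 W of additional headroom
ram_w = 40                           # guessed budget for the 24 Gbps GDDR6X modules
vrm_fan_losses_w = 10                # guessed budget for beefier caps, MOSFETs, fans, losses
gpu_core_extra_w = extra_board_power - ram_w - vrm_fan_losses_w

extra_cores = 18176 / 16384 - 1      # rumored core count vs. the 4090's 16384 CUDA cores
print(gpu_core_extra_w)              # 150
print(round(extra_cores * 100, 1))   # 10.9
```

So roughly 150 extra watts chasing about 11% more cores, which is what pushing further up the diminishing-returns curve looks like.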
 
not against it having 48 GB of memory...because that's just pure awesome. but....what the hell would use that much?
In addition to AI workloads, there's also 3D rendering. I encounter plenty of models which use 21GB and this often doesn't leave enough room on a 24GB card for the AI denoiser to be used. That creates a gigantic performance hit compared to the same model being able to use AI denoising on a card with even just 2GB more VRAM. We're talking a change of like 3-5x in many scenes just by having enough available VRAM to run that denoiser.
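The headroom problem described above can be written as a trivial check. The 2GB denoiser working-set figure is loosely taken from the example in the post; real denoiser requirements vary by renderer:

```python
# Will the AI denoiser fit next to the scene on a given card?
# The 2.0 GB denoiser working set is an illustrative assumption based on the post above.

def denoiser_fits(card_gb: float, scene_gb: float, denoiser_gb: float = 2.0) -> bool:
    """True if the scene plus the denoiser's working set fit in VRAM."""
    return scene_gb + denoiser_gb <= card_gb

print(denoiser_fits(24, 21))  # True: 21 + 2 fits in 24 GB
print(denoiser_fits(24, 23))  # False: forced to fall back to a much slower path
```

The cliff is the point: crossing that boundary by even 1GB costs a 3-5x slowdown, which is why the jump from 24GB to 48GB matters more than it looks.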
 
An 800W TDP GPU means the PC's peak draw is probably going to be 1.2 kW or something close to that. I mean, that is 10A on a 120V circuit: that's got to be coming close to tripping a breaker in most households.
American circuits are so weird. Why half a phase? It's also the reason your kettles take twice as long to boil as anywhere else on the planet.

On the upside... it is actually probably one of the design limits on GPUs now :p
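The breaker math in the post above checks out. The 15A circuit rating and the 80% continuous-load rule are common US residential values, not anything from the article:

```python
# Does a ~1.2 kW PC threaten a typical US 15A/120V branch circuit?
system_peak_w = 1200                   # estimated whole-system peak from the post above
volts = 120
amps = system_peak_w / volts           # current drawn at peak

breaker_amps = 15                      # common US bedroom/office circuit rating
continuous_limit = breaker_amps * 0.8  # NEC sizes continuous loads at 80% of the breaker

print(amps)              # 10.0
print(continuous_limit)  # 12.0
```

10A against a 12A continuous budget leaves almost nothing for the monitor, speakers, and everything else sharing the circuit, which is the complaint several posters are making.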
 
American circuits are so weird. Why half a phase? It's also the reason your kettles take twice as long to boil as anywhere else on the planet.

On the upside... it is actually probably one of the design limits on GPUs now :p
Too many folks licking light sockets. Gotta keep them from frying quite so easily.
 
American circuits are so weird. Why half a phase? It's also the reason your kettles take twice as long to boil as anywhere else on the planet.

On the upside... it is actually probably one of the design limits on GPUs now :p
It's a bit complicated, but the reason is largely that light bulbs were designed to handle 110V with their carbon filaments. Later on, when metal filaments were introduced, the standards were already in place in homes, so changing would have been infeasible, whereas European countries had the perfect opportunity when a world war decimated them.

That said, it's not like the US is the only country that uses 120V as the standard for internal wiring.

Plus, 240V does come into homes; if you really wanted, you could wire it up that way, as is done for heavy electrical loads like electric stoves, electric water heaters, and electric clothes dryers... and soon, computer outlets?
 
not against it having 48 GB of memory...because that's just pure awesome. but....what the hell would use that much?
I can easily hit 20+ GB VRAM usage working with terrain in Unreal Engine 5; reason I went with a 3090 instead of a 3080Ti. As far as any game hitting that? Really depends on assets, renderer efficiency, and resolution.
 
I can easily hit 20+ GB VRAM usage working with terrain in Unreal Engine 5; reason I went with a 3090 instead of a 3080Ti. As far as any game hitting that? Really depends on assets, renderer efficiency, and resolution.
I'm salivating at the OptiX render capabilities of this beast...
 
This is H

It all comes down to performance. If it performs like a monster, no problem being the size of a Tarrasque (and the hunger of one too).
 
But will it play Crysis? 🤣

It should, if Crysis works on Windows 10/11. The 4090 Ti doesn't support 7/Vista, so compatibility with old software will vary depending on the game.
 
Will it play Forspoken?
Will anybody play Forspoken?
But I mean, Nvidia has to do something with all those OAM cards they can't sell; datacenter sales tanked across the board. AMD might have a bit of a runaway hit with the MI300 datacenter APU because it offers a little of both at a "reasonable" price, and it's a surprisingly good fit for a lot of projects.
 
LOL

[embedded video]

Chroma version

[embedded video]
This needs to be updated in 2023.
Except make it un-tethered.
A few kilowatt-hours' worth of 21700s would be good for decent runtime.
Put rubber around it so it can bounce its way down the road!
Imagine enjoying your lunch, hearing something reminiscent of King Kong on a pogo stick, only to see a washing machine making its way down the avenue! Wouldn't want to park on the street that day!
 
I wonder if the vertical PCB would also hold weight better by potentially mounting into a PCIe slot that it would otherwise cover up?

Kinda surprised we haven't seen vertical PCBs before now that I think about it. Must be more issues with it, but there are no shortage of people who are vertically mounting their GPUs now.
 
I wonder if the vertical PCB would also hold weight better by potentially mounting into a PCIe slot that it would otherwise cover up?

Kinda surprised we haven't seen vertical PCBs before now that I think about it. Must be more issues with it, but there are no shortage of people who are vertically mounting their GPUs now.
When the 4000s originally released, I said we're going to start seeing separate cases for GPU. Surprised that hasn't taken off.
 
When the 4000s originally released, I said we're going to start seeing separate cases for GPU. Surprised that hasn't taken off.
CXL and the like aren't ready yet, and as fast as Thunderbolt is, it ain't THAT fast yet.
 
It's pretty unclear what they mean by a "vertical" board. If they mean it's parallel to the motherboard, I'll believe it when I see it.
 