RTX 5090 - $2000 - 2-slot design - available Jan. 30 - still uses 12VHPWR

ASUS ROG Astral 5090 has four fans. Don't think I've seen one like this before; usually there is just a cutout for the 3rd fan.



Is that intake or exhaust? If that top fan is intake, that's kind of stupid. I can't really see it doing anything valuable. If anything, I think it would have the opposite effect. It probably won't matter with how dumb oversized that heat brick (sorry, heat sink) is. I think if there is a generation where one should opt to go full water, this is probably it. Might as well also invest in super long tubing and put the radiator in another room or something. Maybe pipe it up to the attic (though you would need a pretty strong pump to get it that high)... or hell just put the entire computer in the attic.

I actually had an idea for a while about making an isolated chamber with some cheap piping or tubing leading up to the attic or something. Basically mount a bathroom fan above your computer in an isolated chamber, so all of the heat funnels straight outside while the machine still draws in air at indoor temps. Might lower AC costs enough to be worth it.

As far as SLI goes, I ran SLI GTX 760s and GTX 780s for a while. It wasn't ever quite smooth. It was certainly playable, but not really smooth. There was also the issue of heat buildup between the cards.
 
Let me start with this, I've used Nvidia in the past, and probably will at some point in the future. I also consider myself pragmatic and the fact that this thing is only two slots is fan-friggin-tastic. I'm personally quite tired of giant ass video cards. I'm all for more performance, but not if we need a big block in a Chevy II type situation to get there all the time.
100%. When my Antec 1100 was suddenly too 'small' because it can only accommodate a 338 mm GPU, something didn't feel right. Glad I didn't replace it last month with the 9800X3D...
 
WTF is an AI TOP?

I mean, I expected them to go AI crazy, as that is what they do these days, but I don't give a rat's ass about the card's AI capabilities. I don't want them.

Dual slot only makes me suspicious that this won't be a very big performance leap over last gen.

If they required that much power and heat to make the 4090 work, there is no way dual slot is going to cut it in a card that is supposed to be faster than the 4090.

Yes, next-gen fab process and all that, but the difference between TSMC's 4N process and the 4NP this is supposed to use is minor at best. A few percent. I'm not expecting that they will be able to radically improve perf/watt to the extent that this works.

Which probably means this is some "replace even more real performance with AI generative bullshit" generation, and if that is the case, I'm out. I'll keep my 4090 indefinitely. No more fake pixels and frames.
TOPS: trillions of operations per second.
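For a sense of scale, a back-of-the-envelope sketch; the 1,000 TOPS figure below is purely illustrative, not a quoted spec for any card:

```python
# 1 TOP = 1e12 (one trillion) low-precision operations per second.
tops = 1000                       # hypothetical marketing number
ops_per_second = tops * 1e12

# Spread across a 4K frame at 120 fps, that budget works out to roughly
# a million operations per pixel per frame.
pixels = 3840 * 2160
fps = 120
ops_per_pixel_per_frame = ops_per_second / (pixels * fps)
print(f"{ops_per_pixel_per_frame:,.0f}")  # ≈ 1,004,694
```

Whether any of that budget matters for your games is, of course, a separate question.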
 
I have a 4090 and I think the most demanding game I have played on it is Hitman 3. I'm still tempted by the 5090. Maybe I'll be "sensible" and get a 5080 :D. Then I can move the 4090 into another system to replace my 6900XT.
bro...
 
Changing out aluminum for copper and tightening up the fins to increase surface area, most likely, paired with a better thermal compound.
I suspect the cooling solution on that card cost a pretty penny.
Newton's law of cooling says the rate of cooling is proportional to the temperature difference. My guess is they have a much higher thermal limit. All cooling solutions are more efficient at higher temperatures.
 
Newton's law of cooling says the rate of cooling is proportional to the temperature difference.
But a copper radiator is going to have a much higher saturation point, so it all comes down to ensuring they can wick the energy away from the GPU core as fast as possible.

Either way, it looks like an impressive engineering feat.
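For reference, the relationship being invoked is Newton's law of cooling: the heat a surface sheds is proportional to the gap between its temperature and the surrounding air, not to its absolute temperature:

```latex
\dot{Q} = h A \,(T_{\text{surface}} - T_{\text{air}})
```

So a heatsink allowed to run hotter pushes more watts through the same fin area, which is one way a slimmer cooler can still handle a 500 W+ card, at the cost of higher die temperatures.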
 
I really heard nothing in the entire presentation about the cooler other than "look how slim this is; it has vapor chambers and two flow-through fans."

But there was no explanation as to what wizardry of thermodynamics was making that little cooler handle over 500 W.

The show and tell and explanation seemed more like hand-wavium where the cooling was concerned. I did a triple take when Jensen said it was the 5090 cooler. If they can cool a 500 W+ 5090 with that, shouldn't we be able to do exactly the same thing with a similar low-profile cooler for desktop CPUs like the 14900K? That would only be around 250-300 W heavily OC'd.

A Noctua NH-D15 G2 can barely dissipate 250 W.

So what's the secret sauce I wonder?

GPUs use direct-die cooling, for one thing.
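The direct-die point is mostly about heat flux. A rough sketch, where the wattages and die areas are ballpark assumptions rather than measured figures:

```python
# Watts per square millimetre of silicon the cooler has to pull out.
gpu_watts, gpu_die_mm2 = 550, 750   # big monolithic GPU die, cooled direct-die
cpu_watts, cpu_die_mm2 = 250, 70    # CPU compute silicon is far smaller, under an IHS

gpu_flux = gpu_watts / gpu_die_mm2
cpu_flux = cpu_watts / cpu_die_mm2
print(round(gpu_flux, 2), round(cpu_flux, 2))  # 0.73 3.57
```

Even at less than half the total wattage, the CPU concentrates several times the heat per unit area, with an IHS and TIM in the path besides, so a slim GPU-style cooler doesn't transfer over directly.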

Second, the design of the card allows them to have a bunch of heatsink material spanning the entire thickness of the card, front to back.
Flow-through also improves cooling a bit, and it's flow-through on both sides.

They also probably spent an extra couple of bucks per unit to have a relatively better-crafted heatsink with better tolerances.

The overall amount of heatsink material, and overall cooling performance, is probably comparable to or just shy of a conventional 3-slot design.

I think it will probably be like a 33 dB card at full load. So, not whisper-quiet, but not bad.
Targeting ease of fitment for workstations.

With a side effect of being interesting for ITX. Although the popular ITX sandwich designs with a riser cable usually struggle, since there is nowhere for the flow-through air to go. It will be fun to hear how various ITX builds work out. The Fractal Ridge will probably be great, as it has a custom, hard riser board, which doesn't really block airflow like most riser-cable designs in other console-style cases.
As well as any case which allows the GPU to slot directly into the mobo: Hyte Revolt 3, NR200, Meshroom D, etc.
 
Hi guys. If I am planning on 1440p, will a 5080 be fine?

Also, last question: if I choose an RTX 5080 and a 9800X3D, will a Focus GX 1000W (PCIe 5.1, ATX 3.1) be enough?
 
will 5080 be fine?
A 7800 XT (or the upcoming 9070 XT) would be more than fine to play video games at 1440p. A $1000 new GPU is not just 'fine'; it is an ultra-deluxe option.

Really depends on what game, settings, and framerate you target. If you plan to play with path tracing set to max and want 90 fps in every new game without DLSS, maybe not; otherwise, of course.

I would not choose a 5080 before third-party reviews; I doubt you need to decide and pre-order now. A 1000 W PSU would be enough; the 9800X3D is not a big power hog while gaming.
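As a sanity check on the PSU question, a rough budget; every wattage here is an assumption for illustration, not a measured draw:

```python
gpu = 360          # assumed RTX 5080 board power
cpu = 120          # 9800X3D gaming draw is typically well below this
rest = 100         # board, RAM, drives, fans, USB peripherals
transient = 1.5    # factor for short GPU power spikes

steady = gpu + cpu + rest
worst = gpu * transient + cpu + rest
psu = 1000
print(steady, worst, f"{worst / psu:.0%}")  # 580 760.0 76%
```

Even budgeting pessimistic transients, a 1000 W ATX 3.1 unit sits around three-quarters loaded, so there is plenty of headroom.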
 
A 7800 XT (or the upcoming 9070 XT) would be more than fine to play video games at 1440p.
Only games at 1440p. Wanna use DLSS and full path tracing. :) Wanna play Call of Duty, Cyberpunk, Stalker 2, Doom: The Dark Ages, ah, and the new Dragon Age: The Veilguard. Those are the games that interest me.
 
One thing I did not see mentioned, and I assume has not changed, is that you cannot set a frame rate cap with frame generation. That is a downside of the technology. Oftentimes when I do turn it on, it may make scenes where I am getting 80 fps look better at 110 or so, but in other scenes it pushes above my refresh rate/G-Sync limit, and then I see screen tearing. This will probably be a bigger issue with this new iteration of frame generation for many people. I know some monitors have higher refresh rates, but those are often still limited to higher-end OLED panels.

If I missed that and somehow they got frame rate caps to work with frame generation, then that would be excellent news. But it is another reason why I can't get too excited over increases in frame-generation performance and why I prefer seeing improvements outside of DLSS/frame generation at native resolution.
 
One thing I did not see mentioned, and I assume has not changed, is that you cannot set a frame rate cap with frame generation.
Frame generation automatically enables Nvidia's reflex, which caps fps at a few fps below your refresh rate.
 
I just bought a PlayStation 5 Pro; I only have to plug it in, on Monday. And I think that with an RTX 5080 and a 9800X3D I will be happy with the performance. For me, anything above 30-45 fps is fine.
 
One thing I did not see mentioned, and I assume has not changed, is that you cannot set a frame rate cap with frame generation.

Well, DLSS Quality is pretty nice, and you don't need frame generation. The upscaling on most titles is pretty sweet.
I've been using it on a handful of titles since Final Fantasy XV (not a shining example of the tech, but it got playable framerates at higher resolution), such as the Diablo 2: Resurrected beta and Cyberpunk with RT. But I've always viewed DLSS as a hardware life-extending feature more than anything else.
 
Well, DLSS Quality is pretty nice, and you don't need frame generation.
In Lords of the Fallen, I loved FG.
 
But I've always viewed DLSS as a hardware life-extending feature more than anything else.
It's an image-quality booster, letting you run higher graphics settings at higher framerates. The impact of DLSS Quality on the image is tiny, but it boosts your capabilities immensely.
 
One thing I did not see mentioned, and I assume has not changed, is that you cannot set a frame rate cap with frame generation.

Frame generation automatically enables Nvidia's reflex, which caps fps at a few fps below your refresh rate.
Indeed.

However, some games have frame pacing or hitching issues, or tearing, with framegen. One or more of the pieces of the chain are failing, it seems.

Point being, this stuff is far from perfect. Despite some of its bullish fans.
 