NVIDIA's Next-Gen Reference Cooler Costs $150 By Itself, to Feature in Three SKUs

erek

This has been circulating around lately. Surprised that it's amounting to real news coverage.

"The next SKU, the SK20, which is the RTX 2080 Ti successor, will be cut down from SKU10. It will feature 11 GB of GDDR6X memory across a 352-bit wide memory interface, and have a 320 W typical board power rating. This board will likely feature the RTX 3080 Ti branding. Lastly, there's the SKU30, which is further cut-down, features 10 GB of GDDR6X memory across a 320-bit wide memory interface, and it bears the RTX 3080 model number, succeeding the RTX 2080 / RTX 2080 Super.

When launched, "Ampere" could be the first implementation of the new GDDR6X memory standard, which could come with data-rates above even the 16 Gbps of today's GDDR6, likely in the 18-20 Gbps range, if not more. Lesser SKUs could use current-gen GDDR6 memory at data-rates of up to 16 Gbps."



https://www.techpowerup.com/268278/...ts-usd-150-by-itself-to-feature-in-three-skus
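If those rumored data rates hold, the bandwidth math is easy to sanity-check. A quick back-of-the-envelope sketch (the 18-20 Gbps figures are just the rumored range, nothing confirmed):

```python
# Peak memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte
def peak_bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

for rate in (18, 19, 20):          # rumored GDDR6X data rates
    for bus in (320, 352):         # rumored SKU30 / SKU20 bus widths
        print(f"{rate} Gbps x {bus}-bit = {peak_bandwidth_gb_s(rate, bus):.0f} GB/s")
# For comparison, the 2080 Ti's 14 Gbps x 352-bit works out to 616 GB/s.
```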
 
I'll throw gas on the fire: software/gaming has always been cyclical, with game-release peaks and valleys as the software catches up to the hardware and vice versa. With last year's AAA titles (and the few years before that) mostly falling flat, and today's "AAA" titles now on version....4, 6, 8+........I'm less concerned about where hardware is going (Ray Tracing, we get it) and more concerned that there won't be a damned game worth playing on all this $1000 hardware. Forza Horizon 5, some-racing-game-name-here, Super-Rebooted-Shooter-Same-Campaign-If-Any-At-All........other than having to spend a grand just to get "even bettah!" shadows and lighting, why are we supposed to be excited again?
 
Either that is a flaming hot gpu they are making or someone made a mistake on their napkin math.
 
I was hoping for more VRAM.
Yeah, if that's all the VRAM we're getting on the top-end cards, then on something like the 3070 and 3060 we'll be looking at 8 gigs at best. That is sufficient for now, but it's right at the limit for some games; in Wolfenstein: Youngblood it actually is not enough, and you have to turn down textures one notch with 8 gigs of VRAM and ray tracing on. I think it would be asinine to have upper mid-range cards released at the end of 2020 at $400 plus that are clearly going to be VRAM limited not long after release. I mean, even the day you take it out of the damn box you have to turn down textures one notch in a current game.
 
Either that is a flaming hot gpu they are making or someone made a mistake on their napkin math.
320W TDP is what the rumor says for the Ti-equivalent, which equals heat. If true they clearly knew conventional design wasn't going to be efficient. Look how hot the reference 300W 290X ran with a blower.
 
320W TDP is what the rumor says for the Ti-equivalent, which equals heat. If true they clearly knew conventional design wasn't going to be efficient. Look how hot the reference 300W 290X ran with a blower.

Don't have to look, I owned a 290X with a reference design ;) Ran toasty no doubt, but that's why I slapped a water block on it and it was just fine. The blower did a decent job though, so I'm not sure 20 more watts requires a super expensive cooling system. But we'll see; there are so many rumors floating around right now and most of them are likely to be untrue.
 
Probably $15 since it'll be made by the finest Chinese child labor. If Nvidia were to pay $150 then it would be for water cooling.
 
4K gaming at 120 FPS mins and HDMI 2.1

That’s nice to have, but how much better is your gaming experience really at 4K 120 fps vs 1440p 120 fps? I run either 1440p, 1800p or 4K depending on the game, and in general the jump to 4K is barely noticeable. It’s a huge waste of hardware performance that would be better spent on more advanced graphics (or physics or AI).
 
That’s nice to have, but how much better is your gaming experience really at 4K 120 fps vs 1440p 120 fps? I run either 1440p, 1800p or 4K depending on the game, and in general the jump to 4K is barely noticeable. It’s a huge waste of hardware performance that would be better spent on more advanced graphics (or physics or AI).
Well, for one thing, there is no way a 3080 or even a 3080 Ti is going to be remotely capable of averaging 120 FPS, much less holding 120 FPS minimums, in most upcoming demanding games at 4K maxed. People forget that a 2080 Ti cannot even average 60 FPS in lots of games right now at 4K, even without ray tracing or MSAA/SSAA. That means many of those games actually have minimums in the 40s and even 30s. So, bottom line, even in some current games the 3080 Ti would have to be way more than twice as fast as a 2080 Ti just to average 120, never mind maintain it. Throw in ray tracing and next-gen graphics and you're probably looking at 60 FPS. Hell, if it's only 50% faster you'd be looking at barely over 60 FPS in some current games with ray tracing. I guess my point is there's no such thing as too much GPU power if you want to crank the settings in every single game. Heck, if I cranked every single setting in Red Dead Redemption 2 I'd probably be in the 30s even at 1080p with an RTX 2080 Super.
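To put rough numbers on that, the required uplift is just the ratio of target to current frame rate. A quick sketch (the 45 fps baseline is only an illustrative figure, and it assumes frame rate scales linearly with GPU performance, which is a best case):

```python
# Speedup needed over the current GPU to hit a target frame rate,
# assuming frame rate scales linearly with GPU performance (best case).
def required_speedup(current_fps, target_fps):
    return target_fps / current_fps

print(f"{required_speedup(45, 60):.2f}x")    # ~1.33x just to hold 60 fps
print(f"{required_speedup(45, 120):.2f}x")   # ~2.67x to reach 120 fps
```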
 
That’s nice to have, but how much better is your gaming experience really at 4K 120 fps vs 1440p 120 fps? I run either 1440p, 1800p or 4K depending on the game, and in general the jump to 4K is barely noticeable. It’s a huge waste of hardware performance that would be better spent on more advanced graphics (or physics or AI).

You do realize that 4K is technically more advanced graphics when compared to 1440p ;p. That aside, there are differences, and it depends on your attention to detail as well as how well the game was designed. You can tell the difference between 1440p and 4K, and some people prefer 4K. It's no different than people who prefer high-end sound cards. If a person can tell the difference and it matters to them, it's not a waste of hardware performance.
 
You do realize that 4K is technically more advanced graphics when compared to 1440p ;p.

Yep resolution is certainly very important to the end result. But we’re already at the point of extremely diminishing returns and there are other more important things that can benefit from more powerful graphics hardware. 4K 120fps should not be an immediate goal for developers when 4K 60fps is still not achievable even with mediocre graphics.

That aside, there are differences, and it depends on your attention to detail as well as how well the game was designed. You can tell the difference between 1440p and 4K, and some people prefer 4K. It's no different than people who prefer high-end sound cards. If a person can tell the difference and it matters to them, it's not a waste of hardware performance.

4K doesn’t fix low poly geometry, muddy textures, fuzzy reflections or fake volumetric effects. It certainly doesn’t add global illumination, more detailed environments or better physics.

The biggest winners from 4K are nvidia and the monitor manufacturers because they get to sell more expensive hardware without waiting for software to catch up. As consumers we’re still waiting for actual advances in graphics rendering.
 
Yep resolution is certainly very important to the end result. But we’re already at the point of extremely diminishing returns and there are other more important things that can benefit from more powerful graphics hardware. 4K 120fps should not be an immediate goal for developers when 4K 60fps is still not achievable even with mediocre graphics.



4K doesn’t fix low poly geometry, muddy textures, fuzzy reflections or fake volumetric effects. It certainly doesn’t add global illumination, more detailed environments or better physics.

The biggest winners from 4K are nvidia and the monitor manufacturers because they get to sell more expensive hardware without waiting for software to catch up. As consumers we’re still waiting for actual advances in graphics rendering.

Even games with mediocre graphics benefit from the sharpness increase you get from 4K. It's easy to see, or at least for me it is.
No argument here, just a personal preference.
 
Even games with mediocre graphics benefit from the sharpness increase you get from 4K. It's easy to see, or at least for me it is.
No argument here, just a personal preference.

Depends on how close you're sitting to the screen and how big the screen is. Unless I am within two feet of the screen or so, I have a hard time seeing much difference on a 4K screen.
 
Even games with mediocre graphics benefit from the sharpness increase you get from 4K. It's easy to see, or at least for me it is.
No argument here, just a personal preference.

I agree that it helps. Just saying that the focus for next gen should be better graphics at 4K 60, not the same old at 120 fps.
 
Depends on how close you're sitting to the screen and how big the screen is. Unless I am within two feet of the screen or so, I have a hard time seeing much difference on a 4K screen.
It depends on pixels per degree. Around 60 PPD is considered optimal, the point where the pixel structure disappears and the image appears sharpest. At 4K on a 27" screen, the critical viewing distance for 60 PPD is 1.76 feet; for 2560x1440 on the same size screen it's 2.63 feet. In other words, you should be able to start telling the difference within 3 feet of the screen, which is the typical viewing distance for a person sitting at a desk.
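If anyone wants to check the numbers for their own setup, here's a quick sketch of that calculation (the 27" 16:9 panel is just the example size from above):

```python
import math

# Distance at which a panel reaches a target pixels-per-degree (PPD) value.
def critical_distance_ft(horizontal_px, diagonal_in, target_ppd=60, aspect=16 / 9):
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)  # panel width from its diagonal
    pixel_in = width_in / horizontal_px                           # physical size of one pixel
    angle_per_px_deg = 1 / target_ppd                             # angle one pixel must subtend
    return pixel_in / math.tan(math.radians(angle_per_px_deg)) / 12

print(f"{critical_distance_ft(3840, 27):.2f} ft")  # ~1.76 ft for 4K at 27"
print(f"{critical_distance_ft(2560, 27):.2f} ft")  # ~2.63 ft for 1440p at 27"
```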
 
Yep resolution is certainly very important to the end result. But we’re already at the point of extremely diminishing returns and there are other more important things that can benefit from more powerful graphics hardware. 4K 120fps should not be an immediate goal for developers when 4K 60fps is still not achievable even with mediocre graphics.



4K doesn’t fix low poly geometry, muddy textures, fuzzy reflections or fake volumetric effects. It certainly doesn’t add global illumination, more detailed environments or better physics.

The biggest winners from 4K are nvidia and the monitor manufacturers because they get to sell more expensive hardware without waiting for software to catch up. As consumers we’re still waiting for actual advances in graphics rendering.

Again, if it matters to an individual then it's worth it. No different than those who claim they can tell the difference between two high-end audio systems.
 
Again, if it matters to an individual then it's worth it. No different than those who claim they can tell the difference between two high-end audio systems.

That’s fine on an individual level but let’s not pretend this is purely a subjective issue. There are obvious limitations to simply increasing resolution.
 
I just wish we could buy the video cards naked and not spend money on expensive coolers we don't plan on using.

If we could specify binned ASIC quality tiers at different price points that would be great too.

I'd spend more for 98% ASIC quality without a cooler, than 85% with a silly cooler that I won't use.
 
I just wish we could buy the video cards naked and not spend money on expensive coolers we don't plan on using.

If we could specify binned ASIC quality tiers at different price points that would be great too.

I'd spend more for 98% ASIC quality without a cooler, than 85% with a silly cooler that I won't use.
While I agree with the naked card idea, I greatly disagree with ASIC. ASIC quality just indicates the amount of power leakage, which goes to how far you can overclock. With the recent generation of video cards literally all overclocking to the same clock speed with conventional cooling methods, this will just be an excuse for even more price inflation with no real benefit unless you're going for world records.

With the naked card idea, I think you should just be able to order a card and then freely pick a cooler from those available to have installed, or none at all. I don't know what that would look like at retail, though.
 
While I agree with the naked card idea, I greatly disagree with ASIC. ASIC quality just indicates the amount of power leakage, which goes to how far you can overclock. With the recent generation of video cards literally all overclocking to the same clock speed with conventional cooling methods, this will just be an excuse for even more price inflation with no real benefit unless you're going for world records.

With the naked card idea, I think you should just be able to order a card and then freely pick a cooler from those available to have installed, or none at all. I don't know what that would look like at retail, though.

It would probably be more expensive, because now they wouldn't be able to run them all down the same factory line. Unless the majority of people don't want the card to come with a heatsink, I think you will continue to see them come with stock cooling. Let's be honest here: changing out the cooler is an enthusiast thing. Most people who buy these cards probably don't do that.
 
I like the idea of a naked card. Cards with pre-installed blocks tend to be greatly overpriced and might not be as good as an aftermarket water block. Also, it seems people run into RMA problems on those cards because companies lack proper stock of them for RMA purposes.
 
It would probably be more expensive, because now they wouldn't be able to run them all down the same factory line. Unless the majority of people don't want the card to come with a heatsink, I think you will continue to see them come with stock cooling. Let's be honest here: changing out the cooler is an enthusiast thing. Most people who buy these cards probably don't do that.

Same factory line wouldn't be too bad; they just need different packaging. They would just take the naked ones off the main line early and have them go to the other packaging line. But yeah, I doubt there is enough demand for naked cards. I think they would already sell them like that if there was.
 
If anything, the actual construction looks less expensive than the current FE cooler, with its massive vapor chambers and large amount of extraneous aluminum:


So I guess, following the logic in the original story, that means cheaper cards. Right? ;)
 
If anything, the actual construction looks less expensive than the current FE cooler, with its massive vapor chambers and large amount of extraneous aluminum:

So I guess, following the logic in the original story, that means cheaper cards. Right? ;)
I was thinking the same about the construction, unless they went with dual vapor chambers, even though it seems that wouldn't be necessary.
 
Yeah, if that's all the VRAM we're getting on the top-end cards, then on something like the 3070 and 3060 we'll be looking at 8 gigs at best. That is sufficient for now, but it's right at the limit for some games; in Wolfenstein: Youngblood it actually is not enough, and you have to turn down textures one notch with 8 gigs of VRAM and ray tracing on. I think it would be asinine to have upper mid-range cards released at the end of 2020 at $400 plus that are clearly going to be VRAM limited not long after release. I mean, even the day you take it out of the damn box you have to turn down textures one notch in a current game.

The Titan card that costs $$$$ is rumored to have 24 GB.

Someone commented that GDDR6 has a hard time supporting a 512-bit wide memory bus, which is why you are unlikely to see 16 GB cards. (10 GB, 11 GB, 12 GB, and 24 GB are the rumors as of now.)
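That follows from the chip math: each GDDR6/GDDR6X package has a 32-bit interface, so the bus width fixes the chip count, and the per-chip density (1 GB or 2 GB today) fixes the possible capacities. A quick sketch of that relationship:

```python
# GDDR6/GDDR6X chips use a 32-bit interface, so bus width determines chip count,
# and per-chip density (1 GB or 2 GB) determines the possible capacities.
def vram_options_gb(bus_width_bits, densities_gb=(1, 2)):
    chips = bus_width_bits // 32
    return [chips * d for d in densities_gb]

for bus in (320, 352, 384):
    print(f"{bus}-bit -> {vram_options_gb(bus)} GB")
# 320-bit -> [10, 20], 352-bit -> [11, 22], 384-bit -> [12, 24]
# 16 GB would need a 512-bit bus with 1 GB chips (or a 256-bit bus with 2 GB chips).
```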
 
It would probably be more expensive, because now they wouldn't be able to run them all down the same factory line. Unless the majority of people don't want the card to come with a heatsink, I think you will continue to see them come with stock cooling. Let's be honest here: changing out the cooler is an enthusiast thing. Most people who buy these cards probably don't do that.

Doesn't have to make a big difference. You can pull a minority of cards for those who don't want the stock cooler off the line at random before the cooler is mounted.

Sure, having two SKUs instead of one will raise costs a little bit, but nowhere near $150. We are talking a couple of bucks.
 
God, I'd love the idea of naked cards for purchase, but I'd imagine it would make RMA and warranty more difficult since you'd have to account for mis-mounted or improperly sized coolers.

Still, it would be awesome not to have to buy a considerable chunk of hardware that you aren't going to use.
 