3080 - What's the difference? Asus ROG vs TUF?

DarkSideA8

Limp Gawd
Joined
Apr 13, 2005
Messages
375
The ROG is listed at $850 while the TUF is the same price as the FE ($700). But when I look at the 'compare' page on NVidia's site, I'm not seeing much there. Is there some guarantee that the ROG uses the better capacitors?

Also - does anyone know if the AIB cards using the cheap capacitors have been 'fixed' with a driver release? If so, any links to what impact that's had on performance?

Thanks
 

Whach

[H]ard|Gawd
Joined
Dec 22, 2011
Messages
1,064
As far as I can tell, the Strix cards have more features aimed at extreme overclocking (voltage monitoring points on the PCB for LN2 runs, 3x 8-pin power connectors, and more power-delivery headroom), plus some "gamer" vibe and design aesthetics.

The TUF line used to be about high-grade, durable components, but because Ampere has so little overclocking headroom, there's not much difference in normal use cases. The Strix has a higher default clock, IIRC. I think the TUF is the better value in that sense.

The capacitor issue has largely been solved by driver updates that reined in the overly aggressive auto-boost clocks, with negligible performance loss and sometimes even small gains.
 

exlink

Supreme [H]ardness
Joined
Dec 16, 2006
Messages
5,153
The STRIX has a much higher power limit and might use pre-binned chips, so it will likely overclock higher. The TUF has a slightly higher power limit than the Founders Edition; it still has the potential to hit decently high clock speeds, just less reliably than the STRIX.

Interestingly I read that the STRIX has a plastic shroud while the TUF is all metal. Kind of odd that the lower tier card has better quality materials for the shroud.
 

Meeho

Supreme [H]ardness
Joined
Aug 16, 2010
Messages
5,159
These cards are pretty much maxed out. You'll see no tangible benefit over the TUF. Decide based on noise, thermals, and warranty instead.
 

HockeyJon

[H]ard|Gawd
Joined
Dec 14, 2014
Messages
1,281

This. I don’t see the point of paying an extra $150 for the Strix when overclocking these cards only yields about 3% extra performance, and the TUF is unlocked anyway and has good thermal performance. Save your money and just get the TUF. The $700 TUF looks like one of the best values among the 3080s reviewed so far, and it’s the card I’ll buy if I decide to upgrade to a 3080 this round.
 

x509

2[H]4U
Joined
Sep 20, 2009
Messages
2,209

Does this difference in design also apply to the 3090? Or, in my case, to a 3070?
 

HockeyJon

[H]ard|Gawd
Joined
Dec 14, 2014
Messages
1,281

It does for the 3090, but we won’t be able to say for the 3070 until it launches. My understanding is that the power limit for the 3080 and 3090 is set in the video BIOS, which is what’s capping overclocking gains on those cards. Whether the same design decision is made for the 3070 is something we won’t know until reviews come out.

Seriously though, if you’re just gaming, don’t buy a 3090. It’s a waste of money: you’ll spend 100% more for a 10% gain. It’s marketed as a gaming card, but it really isn’t one. The 3080 gives excellent value on a cost-per-frame basis, but it has almost zero overclocking potential, so I wouldn’t pay an additional $150 for almost the same cooling performance and a 3% uplift in frames; you won’t notice the difference. If you’re in the market for the 3070, just wait. You can’t find cards anywhere right now anyway, and AMD is launching Big Navi the day before the 3070 arrives, so you’ll be able to compare options.
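That value argument is easy to sanity-check with some back-of-the-envelope math. A quick sketch below; the $700/$850 prices and the ~3%/~10% gains are the rough numbers from this thread, and the $1,500 figure for the 3090 is an assumption based on its launch MSRP, not a benchmark result:

```python
# Rough price-per-performance comparison for the cards discussed above.
# rel_perf is performance relative to a stock 3080 (1.00 = baseline);
# the ~3% Strix OC gain and ~10% 3090 gain are ballpark figures from
# this thread, not measured benchmarks.
cards = {
    "3080 TUF (stock)":  {"price": 700, "rel_perf": 1.00},
    "3080 Strix (OC'd)": {"price": 850, "rel_perf": 1.03},
    "3090":              {"price": 1500, "rel_perf": 1.10},
}

def dollars_per_perf(card):
    """Price divided by relative performance: lower means better value."""
    return card["price"] / card["rel_perf"]

# Print the cards from best value to worst.
for name, card in sorted(cards.items(), key=lambda kv: dollars_per_perf(kv[1])):
    print(f"{name:<20} ${dollars_per_perf(card):,.0f} per unit of 3080 performance")
```

Even giving the Strix its full overclock, you pay roughly $825 per unit of 3080 performance versus $700 for the stock TUF, and the 3090 is far worse at around $1,364.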
 

x509

2[H]4U
Joined
Sep 20, 2009
Messages
2,209
I'm not a gamer, but my main GPU-intensive applications are Adobe Lightroom and Adobe Photoshop, both of which can run plug-ins that are very GPU-intensive. In a photography forum, there's now a thread about how much the speed of cleaning up the "noise" in a digital photo depends on the power of the GPU. That's why I'm thinking 3070: it fits my budget. A 3080 would be nice, but it's too pricey for me.
 

vegeta535

Supreme [H]ardness
Joined
Jul 19, 2013
Messages
4,831
ROG has all the fancy lights and a beefier cooler. The TUF uses all the better caps too. Unless you want all that, the TUF is the better buy.

If it's for professional work, then spring for the 3080. Is $200 really that far out of range?
 

x509

2[H]4U
Joined
Sep 20, 2009
Messages
2,209
But it's not for professional work, and I also need to buy a photo printer and a 4K IPS monitor for photo use. Plus some specialist photo-editing software.
 

DarkSideA8

Limp Gawd
Joined
Apr 13, 2005
Messages
375

People have been doing amazing photography work on 1080p for years. You only really need 4K at 32 inches or more.
 

x509

2[H]4U
Joined
Sep 20, 2009
Messages
2,209
Forgot to mention that in 2021 I will probably replace my current camera body with the rumored Nikon D860; that's $3K+ right there.
 

VirtualMirage

Limp Gawd
Joined
Nov 29, 2011
Messages
357
I've been doing photo editing on a 27" 4K for the past 4 years, and it was a huge upgrade over 1080p, even at 27".

Right next to it I have my work PC with a 32" 4K monitor that I use for design work. The extra few inches are nice, no doubt about it, but I don't think the extra size made much of a difference in how effective the 4K display is. Just my opinion.

If I were buying a 4K monitor right now, sure, I'd try to get a larger one, because everyone wants a bigger screen. But as close as I sit to my monitor, I could probably be just as happy with another 27".
 