
5080 Reviews

I will wait and see next month's 5070 Ti review and whether it is much better than the 4070 Ti Super. Hope it is not like this 5080 vs 4080S situation again. Don't have high hopes though, just looking at the difference in CUDA core counts (8960 vs 8448). 😞
 
If AMD's new cards turn out to be duds, I am looking forward to this year's Black Friday. Maybe there will be a decently priced 4080/Super somewhere by then. I would not lose ANYTHING by going with a last-gen graphics card over these new 5000-series cards. Even this new and fancy multi-frame generation is only useful if you have a 200Hz+ monitor. On my 120Hz 4K TV, single frame generation can provide all the (questionable) benefits this technology has to offer.
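Back-of-the-envelope math behind that claim (my own illustrative numbers, not from any review): generated frames above the panel's refresh rate just get discarded, so the useful generation factor is capped by the display.

```python
def max_useful_factor(base_fps: float, display_hz: float) -> int:
    # A factor of N shows 1 real + (N-1) generated frames per real frame,
    # so output fps = base_fps * N; anything above display_hz is wasted.
    return max(1, int(display_hz // base_fps))

# 120 Hz TV with a 60 fps base: 2x (single frame gen) already fills it.
print(max_useful_factor(60, 120))   # -> 2
# 4x MFG only pays off at ~240 Hz for the same 60 fps base.
print(max_useful_factor(60, 240))   # -> 4
```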
 
LOL, not in the least bit surprised. Nvidia doesn't want to sell these cards. Why waste all that precious silicon on low-profit gaming GPUs?
 
Well poo.

<looks at 3080> Sorry old gal, your request for a discharge was not approved.
:D
 
And remember, with Nvidia having mostly, if not entirely, ended 4080 Super production months ago, along with yet more squeezing of AIBs, most of those new 5080s are going to be ever-so-slightly tweaked departures from reference designs with a $200-$500 markup (or in Asus's case $200-$800) over the $999 MSRP, and there will be little stock of the older Supers to buy for much longer... not that the 4080 or 4080 Super were even decent value to begin with.

(edit for typo)
 
I snagged one recently for a price I'm not complaining about, but damn, I really expected the 5080 to be better. I get leaving room for the 5080S or whatnot, but Jesus tap-dancing Christ, that's just nuts.
I don't see how they left room for a 5080 Super that isn't just the same 5080 with 24GB. The 5080 is the full GB203 die, and GB202 is already cut down significantly for the 5090. I don't see any situation where they'd further cut down a 750mm² die to sell as a cheaper 5080 Super / Ti.
 
There are already people claiming that the 5080 is "technically" faster due to Multi-Frame Generation. Do these people not realize that FG and MFG actually hurt performance in a competitive gaming scenario? They also hurt the base framerate compared to a game which only renders the base frames, not the generated ones.

Are people actually buying into Nvidia's BS? I think DLSS is a fabulous technology for upscaling and resolving finer details, but frame generation is literally frame smoothing and generates artifacts in fast-moving scenes. It also makes the game feel like mashed potatoes at lower framerates.
 
I don't see how they left room for a 5080 Super that isn't just the same 5080 with 24GB. The 5080 is the full GB203 die, and GB202 is already cut down significantly for the 5090. I don't see any situation where they'd further cut down a 750mm² die to sell as a cheaper 5080 Super / Ti.
You take the non-qualifying, semi-broken GB202s and laser off the non-functional parts of the die until you have something that splits the difference between the 5080 and the 5090, along with a bunch of empty/dead silicon. After that, restrict its bus width to 384-bit and use 2GB modules, or restrict it further to 256-bit and use 3GB modules.
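Both of those configs land on the same 24GB, for what it's worth. A quick sanity check, assuming the usual one 32-bit GDDR7 module per memory channel:

```python
def vram_gb(bus_width_bits: int, module_gb: int) -> int:
    channels = bus_width_bits // 32   # one module per 32-bit channel
    return channels * module_gb

print(vram_gb(384, 2))  # 384-bit bus, 2GB modules -> 24GB
print(vram_gb(256, 3))  # 256-bit bus, 3GB modules -> 24GB
print(vram_gb(512, 2))  # full 512-bit GB202 with 2GB modules -> 32GB (5090)
```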
 
I don't see how they left room for a 5080 Super that isn't just the same 5080 with 24GB. The 5080 is the full GB203 die, and GB202 is already cut down significantly for the 5090. I don't see any situation where they'd further cut down a 750mm² die to sell as a cheaper 5080 Super / Ti.
It was more of an "I see the desire" thing, but the execution of it....
Because yeah, I don't see how they grow it out, but holy hell, the 5080 is disappointing to say the least.

Looking at it, going from N5 to N4P they somehow lost transistor density... Yeah, it's got better power efficiency, and maybe that means a Super version could clock significantly higher, but WTF.
Maybe Blackwell doesn't actually get a Super refresh and they just have a whole new architecture this time next year, or maybe they have a GB201-B that is 500mm² or something, but man, this is just a letdown.
There are a great number of people at Nvidia who should be hanging their heads in shame right now, standing in a corner getting whipped with rubber hoses.
 
There are already people claiming that the 5080 is "technically" faster due to Multi-Frame Generation. Do these people not realize that FG and MFG actually hurt performance in a competitive gaming scenario? They also hurt the base framerate compared to a game which only renders the base frames, not the generated ones.
Like with flat earthers, we will never really know. Maybe competitive games are the one place where MFG could avoid hurting, and even slightly help, performance in some ways: that is where Reflex 2 will tend to be implemented and where the base frame rate tends to be high enough for it to work well. But I am not sure that is what they have in mind.

You take the non-qualifying, semi-broken GB202s
Things can change fast, and heavily non-functional AD102 dies ended up in some 4070 Ti Supers, so everything is possible. But a die with enough working cores to be significantly better than the 5080, plus a working memory controller, will be tempting to put in an RTX 5880 / x20 type of product, with an RTX 5000 equivalent after that.

The RTX 5000 of the Ada generation started at ~$4,000, and used ones on eBay still seem able to fetch $3,000 right now; the 5880 seems to have been quite popular in China:
https://www.ebay.com/sch/i.html?_nk...sacat=0&_from=R40&_trksid=p4432023.m570.l1313
 
There are already people claiming that the 5080 is "technically" faster due to Multi-Frame Generation. Do these people not realize that FG and MFG actually hurt performance in a competitive gaming scenario? They also hurt the base framerate compared to a game which only renders the base frames, not the generated ones.

Are people actually buying into Nvidia's BS? I think DLSS is a fabulous technology for upscaling and resolving finer details, but frame generation is literally frame smoothing and generates artifacts in fast-moving scenes. It also makes the game feel like mashed potatoes at lower framerates.
How can MFG hurt performance if the fps is higher?
 
AMD is probably rethinking their original $900 price for the 7090 XT, but we all know nobody would buy them still. The RTX 4060 wasn't even faster than the RTX 3060 and it's #1 on Steam. Keep the price low and AMD could have the equivalent of their Ryzen moment. Nvidia clearly screwed up.
 
How can MFG hurt performance if the fps is higher?
To generate an interpolated frame, MFG has to wait for the next real frame to exist before rendering its generated frames.

Real frame 1 -> interpolated 1 -> interpolated 2 -> interpolated 3 -> real frame 2. So it delays the time before real frame 2 shows up by the time it takes to generate the frames and pace them out (keeping good pacing rather than displaying them as fast as it can), which will tend to reduce the real baseline frame rate and add latency.
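To put rough numbers on that (a toy model with my own assumed figures, ignoring the generation cost itself and anything Reflex does): real frame k+1 can only be shown after the interpolated frames that precede it have been paced out.

```python
# Toy timeline for 4x MFG at a 60 fps base: each real frame is held back
# while the 3 interpolated frames before it are displayed at even pacing.
base_fps = 60
factor = 4                      # 4x MFG: 1 real + 3 interpolated frames
T = 1000 / base_fps             # ms between real frames

for k in (1, 2, 3):
    rendered = k * T            # when real frame k finishes rendering
    displayed = rendered + T * (factor - 1) / factor
    print(f"real frame {k}: rendered {rendered:.1f} ms, "
          f"shown {displayed:.1f} ms (+{displayed - rendered:.1f} ms)")
```

Even in this best case that is ~12.5 ms of extra latency per real frame at a 60 fps base, before counting the cost of generating the frames at all.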
 
Sucks; this was one of the few cards I was interested in, not really anymore. The 5090 will be the only card that's a meaningful upgrade, and it costs 2 grand. RIP. Hopefully the 9070 XT isn't priced to the moon, or the 5070 Ti is really good.
 
AMD is probably rethinking their original $900 price for the 7090 XT, but we all know nobody would buy them still. The RTX 4060 wasn't even faster than the RTX 3060 and it's #1 on Steam. Keep the price low and AMD could have the equivalent of their Ryzen moment. Nvidia clearly screwed up.

7090xt?
 
I know why the 4000 series is hard to find: Nvidia was busy putting a 5 on all the 4000-series boxes. Worst part is, even the 5090 feels more like a 4090 Ti.

I questioned my 4090 purchase at the time; I remembered how the Ampere release made my 2080 Ti seem antiquated overnight. Not so much this time around. These new cards make that 4090 seem like a reasonable value looking back.

At least the 5090 has 32GB as a selling point. If you need that VRAM, the price can be justified. The 5080, on the other hand… it's just a "4080 Super 2", a pretty weak showing considering all the GPU R&D and advancements from the AI gold rush of the past 3 years.
 
Nvidia will continue to do this because #1, they are currently uncontested and have virtually no competition, and they know it. And #2, they will sell regardless. This is what every publicly traded company wishes it could do.

Additionally, you could also say they are allowing AMD to stay in the race so they avoid becoming a monopoly, even though we know they don't really have any competition.

If people are finding this Nvidia release disappointing and boring, I hope they keep this same energy for the Radeon XX70 series release.
 
You take the non-qualifying, semi-broken GB202s and laser off the non-functional parts of the die until you have something that splits the difference between the 5080 and the 5090, along with a bunch of empty/dead silicon. After that, restrict its bus width to 384-bit and use 2GB modules, or restrict it further to 256-bit and use 3GB modules.
I would say yes, but we can't forget that Nvidia hasn't yet announced their RTX Quadro lineup. Do they use those cut-down 5090s for a 5080 Ti, or for a Quadro RTX 5000 Blackwell card instead, with the full die going to the Quadro RTX 6000 Blackwell?

The problem with cut-down chips is that Nvidia doesn't want them; they are the result of a manufacturing problem, and the quantity produced goes down over time, unless they start artificially gimping silicon to fill a market need, but then that is a design problem.
 
Nvidia will continue to do this because #1, they are currently uncontested and have virtually no competition, and they know it. And #2, they will sell regardless. This is what every publicly traded company wishes it could do.

Additionally, you could also say they are allowing AMD to stay in the race so they avoid becoming a monopoly, even though we know they don't really have any competition.

If people are finding this Nvidia release disappointing and boring, I hope they keep this same energy for the Radeon XX70 series release.
Honestly, in my head, the "throwing AMD a bone and keeping regulators off their back" angle tracks, but how much of that is me trying to rationalize some shit decisions on Nvidia's part, I couldn't tell you. No matter how I look at it, something looks off, so it's as good a reason as any other right now.
 
Good news for everyone who wanted their 4090 to retain some of its value? 😂
Honestly, if this thing had 20GB of VRAM it'd be far more interesting as a long-term solution. Some games at 4K are already running out of VRAM.
AMD has a huge opportunity to garner good value with their next releases.
They'll squander it like they always do
 
You know what this 5XXX generation from Nvidia feels like? Intel from like the 4XXX series to the 11XXX or 12XXX series: just checking the box of a minimal performance increase and charging more money. Now look where Intel is.

Who the fuck at Nvidia saw that downfall and thought, hey, that's a really great idea, let's do that?

Absolutely pathetic. I thought you had to be smart to work there.
 
You know what this 5XXX generation from Nvidia feels like? Intel from like the 4XXX series to the 11XXX or 12XXX series: just checking the box of a minimal performance increase and charging more money. Now look where Intel is.

Who the fuck at Nvidia saw that downfall and thought, hey, that's a really great idea, let's do that?

Absolutely pathetic. I thought you had to be smart to work there.
I'm not so sure; I think the easy gains are just gone, and they're on practically the same node. Short of moving to a new node, it's going to be difficult going forward, but I'm not a chip designer and I don't know shit; it's just speculation. We probably won't see big gains until they're on 2nm.
 