"Absolutely" would be my answer, you're just not getting a great value.But for 1440P will 5080 be ok?
> But for 1440P will 5080 be ok?
I mean, there isn't anything better aside from the 4090 and 5090, but you aren't going to feel good about paying for one.
> I do wonder how many more times nvidia will do this until gamers take a real stand and vote with their wallet.
Right after they stop buying buggy video games.
> I am sure the nVidia Defence Force will be along shortly to tell you it is amazing.
Tom's Hardware "just buy it" article in 3... 2... 1...
> But for 1440P will 5080 be ok?
you better be buying that 5090 you bugged us about for months...
Well poo.
<looks at 3080> Sorry old gal, your request for a discharge was not approved.
> I snagged one recently for a price I’m not complaining about but damned I really expected the 5080 to be better. I get leaving room for the 5080S or what not but Jesus tap dancing Christ that’s just nuts.
I don't see how they left room for a 5080 Super that isn't just the same 5080 with 24GB. The 5080 is the full GB203 die, and GB202 is already cut down significantly for the 5090. I don't see any situation where they'd further cut down a 750mm² die to sell as a cheaper 5080 Super / Ti.
> I don't see how they left room for a 5080 Super that isn't just the same 5080 with 24GB. 5080 is the full GB203 die. GB202 already cut down significantly for the 5090. I don't see any situation where they'd further cut down a 750mm2 die to sell as a cheaper 5080 Super / Ti.
You take the non-qualifying, semi-broken GB202s and laser off the non-functional parts of the die until you have something that splits the difference between the 5080 and the 5090, along with a bunch of empty/dead silicon. After that, restrict the bus width to 384-bit with 2GB memory modules, or restrict it further to 256-bit with 3GB modules.
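For what it's worth, the capacity math lands in the same place either way. A quick back-of-the-envelope check, assuming the usual one GDDR7 chip per 32-bit channel (the function name is just for illustration):

```python
# VRAM capacity = (bus width / 32 bits per channel) chips x capacity per chip.
def vram_gb(bus_width_bits: int, chip_gb: int) -> int:
    chips = bus_width_bits // 32  # one GDDR7 chip per 32-bit channel
    return chips * chip_gb

print(vram_gb(384, 2))  # 24 GB: cut-down 384-bit bus with 2GB chips
print(vram_gb(256, 3))  # 24 GB: 5080-style 256-bit bus with 3GB chips
print(vram_gb(512, 2))  # 32 GB: the full 5090 configuration, for comparison
```

Either route ends up at 24GB, which is why a harvested GB202 card and a "5080 with 24GB" would look nearly identical on the spec sheet.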
> I don't see how they left room for a 5080 Super that isn't just the same 5080 with 24GB. 5080 is the full GB203 die. GB202 already cut down significantly for the 5090. I don't see any situation where they'd further cut down a 750mm2 die to sell as a cheaper 5080 Super / Ti.
It was more of an "I see the desire" thing, but the execution of it....
> Are people actually buying into Nvidia's BS?
Yes. https://www.tomshardware.com/news/nvidia-rtx-gpus-worth-the-money,37689.html
> There's already people claiming that the 5080 is "technically" faster due to Multi-Frame Generation. Do these people not realize that FG and MFG actually hurt performance in a competitive game scenario? It also hurts the base framerate compared to a game which only renders the base frames, not the generated ones.
As with flat earthers, we will never really know. Maybe. Competitive games are actually one place where MFG might not hurt, and could even help performance a bit in some ways: that's where Reflex 2 will tend to be implemented, and the base frame rate tends to be high enough for it to work well. But I'm not sure that's what they have in mind.
> You take the non-qualifying, semi-broken GB202's
Things can change fast, and heavily non-functional AD102 dies ended up in some 4070 Ti Supers, so anything is possible. But a GB202 with enough working cores to be significantly better than the 5080, plus a working memory controller, will be tempting to put into an RTX 5800 / x20-type product, with an RTX 5000 after that.
> There's already people claiming that the 5080 is "technically" faster due to Multi-Frame Generation. Do these people not realize that FG and MFG actually hurt performance in a competitive game scenario? It also hurts the base framerate compared to a game which only renders the base frames, not the generated ones.
How can MFG hurt performance if the fps is higher?
Are people actually buying into Nvidia's BS? I think DLSS is a fabulous technology for upscaling and resolving finer details, but frame generation is literally frame smoothing and generates artifacts in fast-moving scenes. It also makes the game feel like mashed potatoes in lower-framerate scenarios.
> How MFG can hurt performance if fps is higher ?
Input-to-display latency. If MFG actually increases that latency, or leaves it unchanged, then the higher FPS doesn't matter.
> How MFG can hurt performance if fps is higher ?
To generate an interpolated frame, MFG has to wait for the next real frame to exist before it can render the generated one.
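Rough numbers make the tradeoff concrete. A simplified sketch with illustrative figures and function names, ignoring render queue, Reflex, and display scanout (which shift the numbers but not the shape of the problem):

```python
# Simplified model: with interpolation, real frame N can't be shown until
# real frame N+1 has rendered, so input latency carries at least one extra
# base frame even though the displayed frame rate goes up.

def native(base_fps: float) -> tuple[float, float]:
    frame_ms = 1000.0 / base_fps
    return base_fps, frame_ms  # (displayed fps, approx input delay in ms)

def with_mfg(base_fps: float, generated_per_real: int) -> tuple[float, float]:
    frame_ms = 1000.0 / base_fps
    displayed_fps = base_fps * (1 + generated_per_real)
    latency_ms = frame_ms * 2  # held back one base frame for interpolation
    return displayed_fps, latency_ms

print(native(60))       # (60, ~16.7 ms)
print(with_mfg(60, 3))  # 4x MFG: (240.0, ~33.3 ms) - smoother, not faster
```

The fps counter quadruples, but the time between your input and what you see gets worse, which is exactly the complaint in a competitive game.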
> Y'all suckers if you buy this generation.
That was my first actual laugh today.
AMD is probably rethinking their original $900 price for the 9070 XT, but we all know nobody would buy them still. The RTX 4060 wasn't even faster than the RTX 3060 and it's #1 on Steam. Keep the price low and AMD could have the equivalent of their Ryzen moment. Nvidia clearly screwed up.
I know why the 4000 series is hard to find: Nvidia was busy putting a 5 on all the 4000 series boxes. Worst part is, even the 5090 feels more like a 4090 Ti.
> You take the non-qualifying, semi-broken GB202's and laser off the non-functional parts of the die until you have something that splits the difference between the 5080 and the 5090 along with a bunch of empty/dead silicon. After that, restrict its bus width to 384 and use 2GB RAM or further restrict it to 256 and use 3GB.
I would say yes, but we can't forget that Nvidia hasn't yet announced their RTX Quadro lineup. They could use those cut-down 5090s for a 5080 Ti, or do they use them for a Quadro RTX 5000 Blackwell card instead, with the full die going to the Quadro RTX 6000 Blackwell?
> Nvidia will continue to do this because #1 They currently are uncontested and have virtually no competition, and they know it. And #2 they will sell regardless. This is what every publicly traded company wishes they could do.
Honestly, in my head, the "throwing AMD a bone and keeping regulators off their back" angle tracks, but how much of that is me trying to rationalize some shit decisions on Nvidia's part, I couldn't tell you. No matter how I look at it, something looks off, so it's as good a reason as any right now.
Additionally, you could say they are allowing AMD to stay in the race so they avoid becoming a monopoly, even though we know they don't really have any competition.
If people are finding this Nvidia release disappointing and boring, I hope they keep this same energy for the Radeon XX70 series release.
> Good news for everyone who wanted their 4090 to retain some of its value?
Honestly, if this thing had 20GB of VRAM it'd be far more interesting as a long-term solution. Some games at 4K are already running out of VRAM.
> AMD has a huge opportunity to garner good value with their next releases.
They'll squander it like they always do.
> Good news for everyone who wanted their 4090 to retain some of its value?
I would have never guessed that a GPU I purchased for $1600 on launch day would still be worth $1600 after 2+ years. This is epic!
> I would have never guessed that a GPU I purchased for $1600 on launch day would still be worth $1600 after 2+ years. This is epic!
You must have missed out on that previous crypto bubble.
Thanks Nvidia!
> You know what this 5XXX generation from Nvidia feels like? Intel from like the 4XXX series to the 11XXX or 12XXX series. Just checking the box of minimal performance increase and charging more money. Now look where Intel is.
I'm not so sure. I think the easy gains are just gone, and they're on practically the same node. Aside from moving to a new node, it's going to be difficult going forward, but I'm not a chip designer and I don't know shit; it's just speculation. We probably won't see big gains until they're on 2nm.
Who the fuck at Nvidia saw that downfall and thought, "Hey, that's a really great idea, let's do that"?
Absolutely pathetic. I thought you had to be smart to work there.
> You must have missed out on that previous crypto bubble.
Sadly. However, I have some friends who lost everything when the bubble burst, so I'm not terribly unhappy in that respect.