
So I got a 5080....

Sure, in SP games where you don't really care about framerate. If you're playing any type of FPS or competitive game, you turn it off.

Even with my 4090, I tend to lean toward getting frames as high as possible on my 240Hz 4K monitor over using RT.
I don't spend $2,500+ on a video card just to run games in potato mode. I don't give a shit what kind of game it is.
 
I ordered a prebuilt from a deal at Costco: an MSI Aegis with a 5080. Super clean build with some minuses, like the mobo only having 2 memory slots! That aside, it's a Ryzen 9 9900X, 32 GB DDR5-6000, and a 2 TB drive.

I play Marvel Rivals on a 4K OLED LG monitor. My old rig is a 7800X3D with a 7900 XT. I am seriously considering returning the MSI to Costco. I am not sure what I was expecting, but the 5080 is not a generational shift from the 7900 XT. At Ultra settings it is probably 20-30 fps faster than my old 7900 XT. That feels.... disappointing. I am questioning why I spent almost $3,000 for 20-30 fps lol.

Am I missing something?
Try the latest drivers; they're supposed to give up to 8% more performance on a 5080:

"NVIDIA's Latest 576.02 WHQL Driver Add Up to 8% Performance Bump in Synthetic Benchmarks
by AleksandarK Today, 04:21 Discuss (23 Comments)
NVIDIA's latest GeForce 576.02 WHQL driver, released on April 16, which brought support for the GeForce RTX 5060 Ti, appears to deliver unexpected performance gains for several "Blackwell" GeForce RTX 50 series GPUs in UL's 3DMark Steel Nomad benchmark. ComputerBase community users with RTX 5070, 5070 Ti, and 5080 GPUs reported three to eight percent score uplifts after upgrading, prompting ComputerBase to verify these claims under controlled conditions. Their independent testing confirms that the RTX 5080 jumps from 8,094 to 8,550 points (a 5.6 percent boost), the RTX 5070 Ti climbs from 6,463 to 6,932 points (7.3 percent), and the RTX 5070 improves from 4,838 to 5,242 points (8.4 percent)."
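
For anyone who wants to sanity-check those numbers, the uplift is just (new - old) / old. A quick Python sketch using the ComputerBase scores quoted above:

# Percent uplift from the 576.02 driver, per the quoted ComputerBase scores
scores = {
    "RTX 5080":    (8094, 8550),
    "RTX 5070 Ti": (6463, 6932),
    "RTX 5070":    (4838, 5242),
}
for gpu, (old, new) in scores.items():
    print(f"{gpu}: {100 * (new - old) / old:.1f}% uplift")
# Prints 5.6%, 7.3%, and 8.4%, matching the article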
 
I don't spend $2,500+ on a video card just to run games in potato mode. I don't give a shit what kind of game it is.
Considering there is no newer-generation game with RT that a 5090 can max out at 240Hz at 4K, we are all playing in potato mode lol. And frame gen doesn't count.
 
Considering there is no newer-generation game with RT that a 5090 can max out at 240Hz at 4K, we are all playing in potato mode lol. And frame gen doesn't count.
I wouldn't agree with that. Sub-60 FPS, sub-100 FPS, sure, maybe. But I don't think you need to hit 240 FPS for it not to be potato mode. Especially not if the settings are maxed out.
 
I wouldn't agree with that. Sub-60 FPS, sub-100 FPS, sure, maybe. But I don't think you need to hit 240 FPS for it not to be potato mode. Especially not if the settings are maxed out.
Yeah, either you're not fully utilizing your GPU or your monitor, pick one. Lol

IMO, graphical settings take priority, and I'll accept lower FPS and/or resolution to reach my desired IQ settings.
 
I don't recall seeing any cases where a 5080 was slower than the 4080 or 4080 Super. However, the uplift is small at around 10-12% max as I recall. That's certainly not worth upgrading to given the cost increase. However, if you are on an older card such as something from the 20 or 30 series, the 5080 isn't a bad choice.
At stock speeds the 5080 is not worth upgrading to if you have a 4080; both support DLSS 4, but there are those who will say MFG is the reason to buy the 5xxx series. I'm from the old school where I go by native performance when buying a GPU, not trickery. Also, what isn't mentioned is that to use MFG 2x or 4x you need a minimum of a 240Hz monitor, and for 4x a 360Hz one; that's going to set one back around $1,000 give or take. I happen to have an Alienware 360Hz OLED, but I still won't buy the 5080 because it's stock speeds I judge by, not DLSS or MFG.
 
At stock speeds the 5080 is not worth upgrading to if you have a 4080; both support DLSS 4, but there are those who will say MFG is the reason to buy the 5xxx series. I'm from the old school where I go by native performance when buying a GPU, not trickery.
Native performance of the 5080 is higher than that of the 4080. However, it's by only 10-12% at most. That certainly isn't worth the price of admission given how small a jump in performance that is and how expensive the card is. You are talking about RTX 3090 to RTX 4090 money for what is essentially 4080 performance. That's not a good deal. It's true that multi-frame generation is the sole new feature of the 50 series. As for the trickery part, it's a stupid complaint. You want to complain about CPU branch prediction or speculative execution while you're at it? Security concerns aside, trickery to get more performance is as old as the industry itself. More is more, period. Now, in cases where there is a significant or even noticeable hit to input latency, I wouldn't use frame generation of any kind. So far I haven't seen an appreciable hit on my 5090. That's not to say I can't tell it's there, but it's so slight that I can deal with it. That said, only in competitive twitch-type games would I consider turning it off to reduce that tiny bit of latency.

Your mileage may vary and this is certainly something I'd take on a game by game basis.
Also, what isn't mentioned is that to use MFG 2x or 4x you need a minimum of a 240Hz monitor, and for 4x a 360Hz one; that's going to set one back around $1,000 give or take. I happen to have an Alienware 360Hz OLED, but I still won't buy the 5080 because it's stock speeds I judge by, not DLSS or MFG.
Where on Earth did you come up with that idea? I've been able to enjoy the benefits of that feature on a 120Hz 4K display. Also, not judging a product on its full feature set and capabilities doesn't make sense to me.
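
The napkin math backs this up. Assuming MFG simply multiplies the rendered frame rate and the display caps what you can actually see (a simplification; the numbers below are hypothetical), a quick Python sketch:

# Rough sketch: MFG output rate vs. display refresh (illustrative only)
def mfg_output(base_fps: float, multiplier: int, refresh_hz: float) -> float:
    """Generated frame rate, clamped to what the display can show."""
    return min(base_fps * multiplier, refresh_hz)

print(mfg_output(55, 2, 120))   # 110.0 -- 2x fits fine on a 120Hz panel
print(mfg_output(55, 4, 120))   # 120.0 -- capped; the extra frames are wasted
print(mfg_output(90, 4, 360))   # 360.0 -- where a 360Hz panel actually pays off

So a 240Hz+ monitor lets you use all of 4x, but it's hardly a minimum requirement for the feature.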
 
So my nephew let me keep his 9800X3D and CL30 memory over the weekend. I tuned my 5080 system using DLSS Performance, auto frame gen, and Ultra quality on everything else, and I get about 180 fps, peaking at about 200. Did the same test with my 7800X3D and 7900 XT with AMD FSR 3 auto and Ultra quality on the rest: 160 fps, peaking at 180. Long story short... taking it back to Costco. No need to spend that much if my system can carry me a few more years. Time to get the Gen5 x4 NVMe lol.
 
So you got a GPU to test a CPU-intensive game and don't like what you see?
The whole PC was $2,299. A 5080 anywhere in my area is $1,600+ if you can find one. For $600 more I got a full-on system: nice case, 850W Gold PSU, 32 GB DDR5-6000 RAM, 2 TB NVMe, and the 5080. Yes, it has a 9900X in it, but I could always put in my 7800X3D and sell the 9900X, or just part out my old system, and it would have cost me considerably less. That was my thinking... that and I had NVIDIA brain fever. BUT... I did it knowing I have a 90-day return policy with Costco, so if it didn't work out I could return it, and since I paid with my CC I'm not really out any money if I return it in less than 30 days.
 
FG or 2X has been around for a while. I use it in all games, including competitive ones, if the latency is under 30 ms (like BO6).

4X is only usable in Cyberpunk 2077, IMO.
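
For anyone curious where a figure like 30 ms comes from: frame gen can only interpolate between real frames, so the base (rendered) frame time sets the latency floor. A rough Python sketch under that simplifying assumption (not measured Reflex latency):

# Time between real rendered frames is what you feel, not the output rate
def base_frame_time_ms(output_fps: float, multiplier: int) -> float:
    """Milliseconds between real frames behind a frame-gen multiplier."""
    return 1000 / (output_fps / multiplier)

print(base_frame_time_ms(120, 2))   # 16.7 ms -- comfortable
print(base_frame_time_ms(120, 4))   # 33.3 ms -- already past a 30 ms comfort line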
 
I think the problem with MFG right now is that it's hard to tell if it's the game's implementation or the current terrible NVIDIA drivers.
 
Been playing Control the last few days on my 9070 XT to torture-test its RT abilities. I'm using Anti-Lag, FSR borderless upscaling, and Fluid Motion Frames 2.1... all implemented outside the game engine.

All settings maxed, including RT. 1440p upscaled to 4K. FreeSync Premium and HDR on, on an OLED.

It's taking a 60-70 fps base and pumping it up to around 120 fps, which is the limit of my screen anyway.

The experience is so good I'm hooked on finishing the game now.

Very, very good for a little over $700.

The point here is, if you aren't hyper-focused on the couple of full path-tracing titles that a 5090 can barely handle well... there are other great choices, even for 4K play.
 
FG or 2X has been around for a while. I use it in all games, including competitive ones, if the latency is under 30 ms (like BO6).

4X is only usable in Cyberpunk 2077, IMO.
4x looks like shit in Cyberpunk. Really not a good example from what I've seen. Overall, 4x MFG has been junk.
 
That's okay. It's new. It will improve without a doubt.
PhysX and HairWorks want a word with you in the hall. Not everything improves, and sometimes NVIDIA either ends support or changes how a "feature" operates. I wouldn't be surprised if they back down from 4x MFG and focus on improving 2x with more performance and features.
 
4x looks like shit in Cyberpunk. Really not a good example from what I've seen. Overall, 4x MFG has been junk.
It also looks horrifying in Spider-Man Remastered. It's unbearably bad; I just turn it completely off. And that's not even the new frame gen, it's the previous version on the 4090. It's unplayable, such a messy experience.
 
4x frame gen is unplayable in Half-Life 2 RTX. Literally looks like you're tripping on LSD.

4x frame gen right now seems like a hot bag of wet dook.
 