
5080 Reviews

Was maybe a pipe dream
But little of that card's die has dedicated hardware for it. What would it look like on a GPU with, say, 3000 RT TFLOPS and a giant denoiser, with just enough raster power in a small raster-core section to build enough of the scene for the rays to know the color of what they are bouncing off?

If we went back to 1994 and told people we would get close to real-time rendering with OptiX and a 5090 in a Blender viewport, it would sound like quite something.

Back in the day it took me minutes to render three simple white spheres stacked on each other with a very simple texture on a 486. The idea that we are particularly far along right now... maybe, maybe not; we certainly were not in 2002, ten years into OpenGL graphics cards.

A lot of things could happen. Just add eye tracking and only care about high-quality fidelity and resolution around where the eyes are currently looking, like in VR, plus other tricks; how many fewer rays would be needed with that single, relatively simple change?
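A rough back-of-the-envelope sketch of that last point, in Python; the resolution, FOV, foveal size, and sampling densities below are purely my own assumptions, not measurements from any hardware or headset:

```python
# Back-of-the-envelope: how many rays a foveated ray budget might need,
# compared with shading every pixel at full density.
# All numbers below are hypothetical assumptions, not measurements.
import math

width, height = 3840, 2160          # 4K render target
fov_h_deg = 100.0                   # assumed horizontal field of view
fovea_deg = 5.0                     # assumed high-acuity region (~5 degrees)
rays_per_pixel_full = 2             # assumed samples per pixel at full quality
periphery_density = 1.0 / 16.0      # assumed sparse sampling outside the fovea

# Approximate the fovea as a circle of pixels around the gaze point.
pixels_per_degree = width / fov_h_deg
fovea_radius_px = (fovea_deg / 2.0) * pixels_per_degree
fovea_pixels = math.pi * fovea_radius_px ** 2
total_pixels = width * height
periphery_pixels = total_pixels - fovea_pixels

full_rays = total_pixels * rays_per_pixel_full
foveated_rays = (fovea_pixels * rays_per_pixel_full
                 + periphery_pixels * rays_per_pixel_full * periphery_density)

print(f"full-density rays per frame: {full_rays / 1e6:6.1f} M")
print(f"foveated rays per frame:     {foveated_rays / 1e6:6.1f} M")
print(f"reduction factor:            {full_rays / foveated_rays:4.1f}x")
```

With those made-up numbers the foveated budget works out to roughly a 15x cut in rays per frame, which is the kind of headroom the trick is after.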
 
All true. The advancement is very impressive. I am just not convinced we get to a point where game developers don't have to mix in raster tricks for almost everything, at least not for a long time. Before Nvidia's RTX marketing, what were the guesses for real-time ray tracing, usually 20-30 years? :) I think that is still actually the case. Real full ray tracing is not technically impossible, but it's probably 20 years off. lol
In some ways I think the RTX and path tracing silliness has been a diversion that has maybe sent a few things down the wrong roads.
Epic's Lumen is on the right track imo. Most games should be aiming for a mix of traced lighting in places with mostly raster + software map tricks. Use the more expensive RT calculations for specific things where it will actually be noticeably better, and bake the rest into maps and raster. It looks to me like a game like Indiana Jones is sort of getting it right; it runs pretty well on AMD hardware because it isn't doing a ton of RT, but they seem to have mostly used it in the right places.

Doing path tracing, setting bounce and occlusion rules so it doesn't run at 2 fps... and calling it a day has really just made developers a bit lazy. :) The result is games that can look cool in the right scenes, while in other scenes they don't look much better but run at 1/8 the performance. I know some people think RT is the future because it makes developers' lives easier. Well sure, that is true, but I think most people can admit at this point that it might be quicker, yet the few games we have so far that were developed that way don't really look much better, and they are noisy. Sometimes the only real solution is to put in the work. Game development is supposed to be an art, not a spreadsheet of light ray calculations. I don't actually want my games to mimic real life perfectly.

IMO if Nvidia hadn't gone on the 6-year RTX marketing push, developers might have gotten to things like Lumen earlier, as well as actual competition for that system from other game engines. RT hardware has a place. Path tracing... ain't it. imho anyway
 
Of course that isn't possible.
We certainly managed to do so during the 3D accelerator years.

Admittedly, it'd require a rejiggering of things now so that motherboards had sufficient slots & PCIe lanes.
 
it's not thousands and thousands...you need to look at AAA games released in the past few years (since the 3080)...you can't count indie games...almost every major AAA title has featured some sort of RT support...not to mention the older games like Quake 2, Half Life, Witcher 3 getting it as well

Indeed.

And when you disable RT today, the games often look worse than they did back before RT was a thing, as game devs no longer want to waste time and effort manually creating shadow maps, fake reflections, and all the other stuff necessary for rasterization to look good, when RT rendering does it for them automatically by just placing light sources and cameras.

In other words, for AAA games in the last 5 years or so, if you are not using RT you are getting a subpar product.

And we have even seen a move towards titles that require RT in the last few months. Indiana Jones and the Great Circle is one of them. There was one more too, but I forget what it was now.

These are games where there is no option to run without RT, and they do not run on GPUs that do not support RT.

This IMHO is the future. Not because I want it to be, mind you, but because development is easier and cheaper for game developers. Raster graphics require so many manual workarounds to get things looking good that RT can accomplish just by dropping in light sources and cameras.
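To make the "just drop in a light" point concrete, here is a toy sketch in Python (my own illustration, not code from any actual engine or API): with ray tracing, a shadow is a single visibility query from the shaded point toward the light, with no shadow map to bake, size, or tune:

```python
# A toy illustration (not any engine's actual code) of why ray-traced shadows
# are conceptually "just place a light": visibility is a single ray query,
# with no shadow map to bake or tune.
from dataclasses import dataclass
import math

@dataclass
class Sphere:
    center: tuple
    radius: float

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def ray_hits_sphere(origin, direction, sphere, max_t):
    """Return True if the ray hits the sphere before reaching max_t."""
    oc = sub(origin, sphere.center)
    a = dot(direction, direction)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - sphere.radius ** 2
    disc = b * b - 4 * a * c
    if disc < 0:
        return False
    t = (-b - math.sqrt(disc)) / (2 * a)   # nearest intersection distance
    return 1e-4 < t < max_t

def in_shadow(point, light_pos, occluders):
    """Shoot one shadow ray from the shaded point toward the light."""
    to_light = sub(light_pos, point)
    dist = math.sqrt(dot(to_light, to_light))
    direction = tuple(c / dist for c in to_light)
    return any(ray_hits_sphere(point, direction, s, dist) for s in occluders)

# Hypothetical scene: one blocker sitting between the shaded point and the light.
scene = [Sphere(center=(0.0, 1.0, 0.0), radius=0.5)]
print(in_shadow((0.0, 0.0, 0.0), (0.0, 3.0, 0.0), scene))   # True  (blocked)
print(in_shadow((2.0, 0.0, 0.0), (0.0, 3.0, 0.0), scene))   # False (lit)
```

The raster equivalent of that one function is a whole pipeline of shadow map rendering, resolution and bias tuning, and per-scene artist cleanup, which is the cost-saving argument in a nutshell.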

I think it sucks, because rasterization can look 99% as good as RT at a fraction of the GPU render load if a dev really wants it to, but the truth is that whatever is easier for developers usually wins.

So, I suspect that as time goes on, titles with mandatory RT are going to become the norm, possibly even the majority, as the cost savings (paying fewer artists to manually tweak shadow maps, etc.) and the time savings (all the hours that work takes) are huge.

Let's be real about RT. When Nvidia dropped it with the 20 series, it was never a feature for the gamer. They pushed RT because:

1.) They knew it would make developers lives easier, and they could get huge adoption relatively quickly.
2.) They knew they had an advantage over the competition with RT, and knew that if game devs started using it, even reluctant gamers would feel forced to buy into it, or get a subpar experience.

So it was a way to make devs happy, and a way to stick it to AMD. And that has pretty much come true. All reports are that AMD has mostly caught up with the new 90 series Radeons, at least in the mid range now, and that going forward, devs that were titillated by the massive cost and time savings they could get out of RT, but feared implementing it because they didn't want to alienate mainstream gamers, are going to have less and less reason not to push RT and make their titles "mandatory RT".
 
lol I was. It's NOT "the RT golden age" or whatever was claimed...
there are vastly more games released without it; it's not there yet. Half the time the "better looks" are just the sharpening that's included with the scaling/frame gen....
Vastly more games? Maybe. But these must be little crappy indie games that no one plays, not the big games that get most of the interest from our community.

I can't remember the last mainstream AAA game that didn't have RT. Ever since Cyberpunk they pretty much all have it. Not to the same extent as Cyberpunk, sure, but at least at some level. I don't think I've bought a game without RT support in 5 years (except for old remasters like System Shock).

Actually, no. There was one. Starfield. But that's it.

And it's not as if I have been looking for RT enabled games.

I tend to like big open world single player FPS titles, and RT has been in almost all of them lately, to an extent where the game just does not look the same without it. (Cyberpunk was a little bit of a letdown on my 6900 XT without RT, because enabling even the lowest RT setting turned it into a slide show, but amazing on my 4090.)

I don't know what games you are finding that are launched without RT. Casual flash games? Lightweight indie side scrollers? Your comment about more games being launched without RT just doesn't ring true to me, unless you count titles that aren't typically considered "real games" in our circles.

As mentioned, the only exception I can think of that I have bought in the last 5 years is Starfield.

I was a critic of RT when it first came out, and I still am. I would prefer if games were still made with every raster trick in the book to make raster look better, as it is much easier on the GPU. But in 2025 I consider good RT performance to be an absolute necessity in order to enjoy the games I like, and I consider that a bit disappointing.
 
There were over 15,000 games released on Steam; "there are vastly more games with X than Y" can feel a bit empty when 97% of the games are extremely small indie affairs.

And even among the ones big enough to have a Wikipedia page, many are still 2D or PS4-supported titles.

Actually, no. There was one. Starfield. But that's it.
Giant-budget ones can tend to avoid it. Call of Duty, for example: those guys are really good and have managed to ship releases with no RT. I am sure there is a good list.

Very console-centric titles that got a PC port from really good studios, God of War Ragnarok, The Last of Us Part II, Horizon Forbidden West, do not need/use it as far as I can tell, and if they do, it would be under the hood in some way we do not know about, at least.
 
lol I was. It's NOT "the RT golden age" or whatever was claimed...
there are vastly more games released without it; it's not there yet. Half the time the "better looks" are just the sharpening that's included with the scaling/frame gen....

you're changing what I said...what I did say was that the vast majority of AAA games being released nowadays support RT features of some kind...and GPU manufacturers (mainly Nvidia) are now making RT performance the main selling point, as rasterization performance has tapped out and any card from the last 2 generations of GPUs (Nvidia and AMD) will do excellently in that respect

so we are in the Age of Ray Tracing, as that's the technology driving graphics in video games and GPUs for the next decade+
 
You know what this 5XXX generation from Nvidia feels like? Intel from like the 4XXX series to the 11XXX or 12XXX series. Just checking the box with a minimal performance increase and charging more money. Now look where Intel is.

Who the fuck at Nvidia saw that downfall and thought, hey that's a really great idea, lets do that.

Absolutely pathetic. I thought you had to be smart to work there.
They're phoning it in because gaming GPUs are a tiny fraction of their business.
 
Actually, no. There was one. Starfield. But that's it.
Horizon Zero Dawn Series
God of War Series
Spider-Man Series (RT was added later)
Banishers: Ghosts of New Eden
Atomic Heart (RT added later)
Tales of Arise
Scarlet Nexus
Final Fantasy 16
Death Stranding
Star Wars Jedi: Fallen Order
There are actually quite a few AAA titles that not only didn't have it, but look substantially better than games with RT. The main thing is that in order to turn it on you have to immediately sacrifice quality, or turn on RT low. At that point you're defeating the purpose, because you're not getting enhanced graphics, you're getting accurate lighting. Fidelity and accurate lighting are NOT the same thing. RT more than anything helps out developers, but there's a huge trade-off with everything else. Indiana Jones has found probably the best balance I've seen. Black Myth: Wukong, OTOH, is a noisy nightmare that really would have been better without it.
 
Vastly more games? Maybe. But these must be little crappy indie games that no one plays, not the big games that get most of the interest from our community.

I can't remember the last mainstream AAA game that didn't have RT. Ever since Cyberpunk they pretty much all have it. Not to the same extent as Cyberpunk, sure, but at least at some level. I don't think I've bought a game without RT support in 5 years (except for old remasters like System Shock)

exactly...he's trying to focus on the fact that there are more non-RT games to make some wild claim about RT as a feature not being relevant...that's totally wrong...of course there are more non-RT games which is why I emphasized the AAA part...very rare to find a new AAA release without some sort of RT...and most of the time it can be disabled so you don't have to use it
 
AAA wasn't in your original claim.

yes it was...it didn't need to be said because it's obvious...no one is claiming that RT is taking over indie games...I brought it up after you made your ludicrous statement that there were maybe only 40-50 total games with RT support and there were 'thousands and thousands' of games without it...that's common sense
 
They're phoning it in because gaming GPUs are a tiny fraction of their business.

No shit, we all know that. What's your point?

Gaming CPUs were a tiny fraction of Intel's business as well. How are they doing?
 
Do you realize how long it will be before anyone could go "full RT"? At the current pace it would take decades, cost tens of thousands of dollars, and require a small nuclear reactor to power it. To have any chance of going "full RT" at this point would regress gaming back at least twenty years. The hardware isn't here and won't be for a very, very long time.

This has been my point, and the point of many others, since the release of the Nvidia 2xxx series. RT was never going to be anything but window dressing for the foreseeable future.
Pretty fucking expensive window dressing.
 
yes it was...it didn't need to be said because it's obvious...no one is claiming that RT is taking over indie games...I brought it up after you made your ludicrous statement that there were maybe only 40-50 total games with RT support and there were 'thousands and thousands' of games without it...that's common sense

Look, I understand gaming has changed over the years, but this is the [H]. AAA IS gaming. Nothing else need apply.
 
Indeed.

And when you disable RT today, the games often look worse than they did back before RT was a thing, as game devs no longer want to waste time and effort manually creating shadow maps, fake reflections, and all the other stuff necessary for rasterization to look good, when RT rendering does it for them automatically by just placing light sources and cameras.
So you're saying everyone's being forced to shell out a shitload of extra money so cushy game developers can have an easier time at their already easy life?
 
Though to be fair, I hear that working for a game developer is pretty rough, with long hours and frequent layoffs between projects.

It's a chosen profession with a known rep....
 
This is such a fucking paper launch.....my local sold all stock and pre-order allocations for 5090s within 30 to 60 seconds of their website going live. Now I have to wait until the end of Feb to get a god damn GPU.
 
So the 5080 is for people who didn't already buy a 4080 (Super) or 4090, with barely any performance uplift for roughly the same price point.

That's a letdown - kind of the RTX 20 Series all over again, while the RTX 4090 seems to be the new GTX 1080 Ti despite the lofty price tag the way things are going.
 
Multi Frame Generation only really applies to people with 240hz or higher refresh rate displays...so it makes the main 5000 series exclusive feature meaningless for me (I currently have a 165hz display)...plus it's still on 5nm with the same 16GB VRAM as my ASUS TUF 4080 Super
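The refresh-rate point is really just arithmetic. Here is a quick sketch of it (the 4x factor is Multi Frame Generation's maximum multiplier; the panel numbers are just illustration): the display's refresh rate caps the generated output, so it also caps the rendered frame rate at which 4x generation still has room to do anything.

```python
# Quick arithmetic behind the "needs a 240 Hz+ display" point.
# 4x is Multi Frame Generation's top multiplier; the panel refresh rates
# below are just illustrative examples, not a recommendation.
def base_fps_to_fill(display_hz: float, mfg_factor: int = 4) -> float:
    """Rendered (pre-generation) fps that already saturates the display."""
    return display_hz / mfg_factor

for hz in (165, 240, 360):
    print(f"{hz:3d} Hz panel, 4x MFG -> saturated at only "
          f"{base_fps_to_fill(hz):5.1f} rendered fps")
```

On a 165 Hz panel the output is full at roughly 41 rendered fps, so anything the card renders above that is wasted on 4x mode; at 240 Hz or 360 Hz the feature has a 60-90 fps base to work with.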
 
Multi Frame Generation only really applies to people with 240hz or higher refresh rate displays...so it makes the main 5000 series exclusive feature meaningless for me (I currently have a 165hz display)...plus it's still on 5nm with the same 16GB VRAM as my ASUS TUF 4080 Super
Well, it isn't really any faster than your 4080 Super, so you're golden.
 
I will say that seeing the overclocking results on the 5080.....it might actually not be a bad card. Need to see more reviews, but so far the ones I have seen show it getting super close to a 4090.

One of the overclocking results netted around 15% extra performance. It has been a while since we had any form of overclocking headroom...

Here's one example:


View: https://www.youtube.com/watch?v=D_sVNuOg74c
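A quick sanity check on what a ~15% overclock means relative to a 4090; the 20% stock gap used below is a placeholder assumption for illustration only, not a benchmark figure from that video or any review:

```python
# Placeholder arithmetic: how far a ~15% overclock closes an assumed gap.
# Both numbers are illustrative assumptions, not measured results.
oc_gain = 0.15            # ~15% from the overclocking result above
assumed_4090_lead = 0.20  # hypothetical stock 4090 advantage over a stock 5080

oc_5080 = 1.0 + oc_gain
stock_4090 = 1.0 + assumed_4090_lead
print(f"OC'd 5080 reaches ~{oc_5080 / stock_4090:.0%} of the assumed 4090 level")
```

Under that assumption the overclocked card lands at roughly 96% of the 4090, which is why "super close" is plausible even if the stock gap is sizable.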
 
Pretty much yeah.

Though to be fair, I hear that working for a game developer is pretty rough, with long hours and frequent layoffs between projects.
I like to distinguish between game developers (the people) and game developer studios/publishers (the companies).

They always get mixed up in discourse, which I imagine is precisely what the leadership of the latter group likes to see, so they can piggyback on the sympathies for the first group.
 
Man, those videos of people lining up around the Micro Center.
Nvidia, Nvidia... you would hope more people would have had enough of their BS by now. Apparently they still line up around the building. Seems like they upset a lot of people today. Imagine lining up in Minnesota in the cold at a store that got ZERO stock. lol
Now waiting for the first post complimenting amd on "their genius strategy baiting nvidia into releasing cards too early, while amd produces the numbers before they release."
 
I will say that seeing the overclocking results on the 5080.....it might actually not be a bad card.
Some people say stuff like "there are no bad cards, only bad prices." I'm not going to dispute that today but I'm going to suggest that a card that *needs* to be overclocked probably isn't, in fact, good. Just my opinion.

I've got a 4070, I won't be upgrading this generation, because it meets my needs.
 
Now waiting for the first post complimenting amd on "their genius strategy baiting nvidia into releasing cards too early, while amd produces the numbers before they release."
Yes, AMD is going to rent a stage and have Lisa come out... and say haha, suckers, got ya. It's not a 9070 XT, it's a 9700 Pro, and it's the large core... bait driver deleted, real driver go. It's the full core. mhahaha. lol :)

I joke. To be honest, I'll be the first to say it: I think AMD has heard 20 different things from various partners with loose tongues and other sources. They probably heard about the fake frames; they probably also heard some people say oh damn, this thing is only going to be a few % faster, and other people saying wooo, this thing goes. I think they were wise to just wait for NV to drop theirs first. They really should have figured that out before sending stock to stores, though, and having partners show up to CES with Radeon cards they wanted to show off but had to sort of tease instead. I think it's true, though, that there was zero benefit in AMD being first this time. It wouldn't have mattered if AMD dropped a $500 GPU that was 20% faster than a 4070. The majority of the market would have said, that's nice, let's wait and see what the 5070 is.

Waiting for what looks like an epic face plant was probably a long-shot hope for them. Looks like that might be what is about to happen. If a 5070 compares to the 4000 series the way the 80 has, combined with terrible stock issues, AMD may look like heroes. I will say, if that comes to pass, it was probably 90% luck more than their doing. Admit it though, Lisa coming out and saying she baited them would be hilarious.

I just hope the 9070 is decent. Gamers not wanting to spend 2 or 3 months' rent on a GPU should be able to feel like they are getting at least a minor, if not substantial, upgrade from 3-5 year old cards. If the Intel B580 ends up being the deal of the generation... that is fucking sad. lol
 
I will say that seeing the overclocking results on the 5080.....it might actually not be a bad card. Need to see more reviews, but so far the ones I have seen show it getting super close to a 4090.



They just unlocked the 5080 Super that is going to be released in a year, when 5080s sit on the shelves gathering dust.
 
AMD has a huge opportunity to garner good value with their next releases.
I believe that is why they haven't launched yet. I bet they have a pretty good idea of what the 5070 and 5070 Ti will look like, and they are polishing drivers. If those cards are already with retailers as reported, they can't really do much for the physical cards other than a day 0 BIOS update, but that would have to be coordinated with all the AIBs to be worthwhile.
 
the real 5080 will be the 5080 Super...Nvidia has nerfed all their release models except for the 90 cards in the hopes that people will be desperate enough to pay $2000+
 
The biggest issue IMO is that there is a massive gulf between the 5080 and 5090. 16 -> 32GB, double the cores, double the price, etc. The 4090 would do amazingly well right in the middle, but Nvidia has stopped production of it. I just don't get their strategy.
 
The biggest issue IMO is that there is a massive gulf between the 5080 and 5090. 16 -> 32GB, double the cores, double the price, etc. The 4090 would do amazingly well right in the middle, but Nvidia has stopped production of it. I just don't get their strategy.
They stopped production because they probably needed the wafer capacity at TSMC for the 5000 series.

The problem is the real 5080 should have been a cut-down die of the 5090. But then it would cost way more than $1000.....But instead Nvidia is trying to sell you a 5070 for $1000....OOPS, I mean 5080.
 