
5080 Reviews

it's not thousands and thousands...you need to look at AAA games released in the past few years (since the 3080)...you can't count indie games...almost every major AAA title has featured some sort of RT support...not to mention the older games like Quake 2, Half Life, Witcher 3 getting it as well
Of course AAA studios are interested when it'll drastically cut their costs. Why not, if those games are being bought and played?
 
AMD hasn't been able to release a decent high end RT card, that's been their Achilles heel...in the age of ray tracing they are stuck with rasterization
I have it on very good authority from prominent members of this very forum that rasterization is the only thing that matters because the fake images on the screen don't have any real rays to trace, and it gets even more ridiculous when you try to fake frames from the fake image on the screen, creating an exponential fake² event horizon that will swallow reality into its fakeness. At its culmination, you get fake³, which is the end of the universe.

Oh, forgot to mention that AMD can't really compete at raster either, were it not for Nvidia's choice to produce cards that some consider not good enough. Which honestly confuses me: how do people bash one company for producing anemic cards while praising another company for producing anemic cards?
 
This is essentially an RTX 5070. Priced at $550 it would be a good card, though with a high power draw for a mid-range card.

The actual "RTX 5070" may only perform as good as an RTX 4070 Ti, or be between the 4070 Super and 4070 Ti. Would be kind of sad if the 4070 Ti Super was faster than the 5070.
5070 ti 8,960 cuda cores
5070 6,144 cuda cores
4070 ti 7,680 cuda cores
4070 super 7,168

Prepare to be sad. 7,168 vs 8,960 on the ti... and only 6,144 on the non ti.
5080 10,752 cores vs 4080 super 10,240 cores.
We know how that shakes out now: 4-8%. 5% more cores, 5% more performance.
The 5070 is going to be a sad product. AMD was wise to hold their powder. (hopefully)
 
it's not thousands and thousands...you need to look at AAA games released in the past few years (since the 3080)...you can't count indie games...almost every major AAA title has featured some sort of RT support...not to mention the older games like Quake 2, Half Life, Witcher 3 getting it as well
PCGamingWiki lists 254 titles with some form of RT. It's been a little over 6 years now since the first RTX card, an average of about 3 new titles a month. I know there have been more lately. Really, RT has been a non-factor until very recently. It's going to matter more now than it did 5 years ago. Nvidia has been hard-selling basically a tech demo for half a decade.
 
At those speeds, a driver overhead issue can have a big impact on FPS; 0.2 milliseconds becomes a lot at 600-something frames per second.
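Quick arithmetic on that point (a rough sketch; 600 fps and 0.2 ms are just the ballpark numbers above, not measured values):

# Impact of a fixed per-frame overhead at very high frame rates.
# 600 fps and 0.2 ms are illustrative figures from the comment above.
base_fps = 600
overhead_ms = 0.2

frame_time_ms = 1000 / base_fps                  # ~1.67 ms per frame
new_fps = 1000 / (frame_time_ms + overhead_ms)   # ~536 fps

print(f"frame time: {frame_time_ms:.2f} ms")
print(f"with +{overhead_ms} ms overhead: {new_fps:.0f} fps "
      f"({1 - new_fps / base_fps:.0%} drop)")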

Hardware Canucks' results were quite different, with the more expected 5090 > 4090:

Hardware Unboxed tested the 5090 with the 566.36 drivers, it seems; Canucks with the 571.86.
I wasn't even looking at the 5090-4090 differential, but that is also awkward. The fact that the 4080 Super is resoundingly beating the 5080 in both maximum and minimum frame rates is eyebrow raising.
https://www.techpowerup.com/review/nvidia-geforce-rtx-5080-founders-edition/46.html

Probably driver related. They were using press drivers, but then CS2 has been out for ages. Maybe they figured they didn't have to do anything for CS2?
Maybe, but still utterly embarrassing IMHO. Having a new flagship GPU lose to a two-year-old, now-superseded model in one of the most played games in the world is not a good look.
 
PCGamingWiki lists 254 titles with some form of RT. It's been a little over 6 years now since the first RTX card, an average of about 3 new titles a month. I know there have been more lately. Really, RT has been a non-factor until very recently. It's going to matter more now than it did 5 years ago. Nvidia has been hard-selling basically a tech demo for half a decade.

the 3080 was the first true RT card where you could really use it...the 2000 series was the tech demo...everything from the 3080 has been a good RT gaming GPU...almost every major AAA game over the past 2 years has had some form of RT...even path tracing titles are becoming more prevalent for the ultimate RT experience

and 254 RT titles is almost 7 times the number that pendragon listed in his exaggerated post
 
the 3080 was the first true RT card where you could really use it...the 2000 series was the tech demo...everything from the 3080 has been a good RT gaming GPU...almost every major AAA game over the past 2 years has had some form of RT...even path tracing titles are becoming more prevalent for the ultimate RT experience
Depends on what you consider an acceptable number of frames. I would counter that the RT hardware on the 3000s was completely useless. Path tracing type stuff is almost unusable even on a 4090... almost. AMD was able to provide essentially the same experience without any dedicated hardware. Where NV leaves them in the dust is crazy path tracing settings that are not exactly the most playable settings on anything but the latest flagships. NV was selling datacenter tensor silicon to gamers. I have said it a few times: kudos to them for making some of their crazy software ideas stick. Clearly DLSS and tech like it has mostly panned out. RT has as well, sort of. Obviously it's going to become more important. That doesn't change that basically 3 1/2 gens of Nvidia hardware was sold and bought on a promise of widespread RT everywhere. That just hasn't happened... until the back half of '24, 6 years after the tech launched. Even today, 250-some games are all there is... and of those, let's all be honest, maybe 40-50 of them get any sort of widespread play.
 
The 5070 is going to be a sad product. AMD was wise to hold their powder. (hopefully)
At some point down the line, with a 192-bit bus, the massive memory bandwidth could start to help in games and not just in non-gaming scenarios....

Because on raw core count and clocks alone:

4070: 5,888 at 2,475mhz
5070: 6,144 at 2,510mhz

5080: 10,752 at 2,617mhz
4080: 9,728 at 2,505 mhz

10,752/9,728 * 2,617/2,505 = 15.5% faster; at 4K the 5080 was 14.9% faster than the 4080.

If the 5070 follows the same track:
6,144/5,888 * 2,510/2,475 is only about 6% faster... under the same logic as above, 5.x% faster.

That would be significantly lower than the 4070 Super's performance, which was $600; that would be why they cut the price. (Unless the 68% higher memory bandwidth than the 4070/4070 Super is so much of a boost that it ends up closer to 10% than 6%... but that would still be below the 4070 Super, roughly a 6900 XT, just above the 7800 XT, which has been $500 since 2023...)
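For anyone who wants to play with the cores-times-clocks estimate above, here's a quick sketch (just the arithmetic from this post; the 14.9% figure is the 4K result from the 5080 reviews, and linear scaling with cores * clock is only a rough assumption):

# Rough cores * clock scaling estimate using the numbers quoted above.
# Assumes performance scales ~linearly with (CUDA cores * boost clock),
# which the 5080 vs 4080 case (~15% predicted vs ~14.9% measured at 4K)
# only loosely supports.
def scaling(cores_new, clock_new, cores_old, clock_old):
    return (cores_new / cores_old) * (clock_new / clock_old) - 1.0

print(f"5080 vs 4080: {scaling(10752, 2617, 9728, 2505):+.1%}")   # about +15.5%
print(f"5070 vs 4070: {scaling(6144, 2510, 5888, 2475):+.1%}")    # about +5.8%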

That doesn't change that basically 3 1/2 gens of Nvidia hardware was sold and bought on a promise of widespread RT everywhere.
On that note, Nvidia and the media did not really push the new neural shaders this time. They could be harder to explain and market than rays being traced for lighting/shadows/reflections, or it could just be that there isn't even a little playable Portal-style demo of the tech yet and one is still upcoming. But after how long ray tracing took, and the buzz around DirectStorage (do games use it now? do we know if they do? some do, some have instant loading without using that word, etc...), maybe there will be more "let's wait and see" before saying you should buy something now because it supports X.
 
Depends on what you consider an acceptable number of frames. I would counter that the RT hardware on the 3000s was completely useless. Path tracing type stuff is almost unusable even on a 4090... almost. AMD was able to provide essentially the same experience without any dedicated hardware. Where NV leaves them in the dust is crazy path tracing settings that are not exactly the most playable settings on anything but the latest flagships. NV was selling datacenter tensor silicon to gamers. I have said it a few times: kudos to them for making some of their crazy software ideas stick. Clearly DLSS and tech like it has mostly panned out. RT has as well, sort of. Obviously it's going to become more important. That doesn't change that basically 3 1/2 gens of Nvidia hardware was sold and bought on a promise of widespread RT everywhere. That just hasn't happened... until the back half of '24, 6 years after the tech launched.

it was understood when RT was first launched that it was in its infancy...you needed the cards first before developers could adopt it in games...RT is still in its early stages and will only become more prominent over the next 5 years...path tracing is not the same as 'regular' RT, that is for the ultimate GPU melting experience so it's understandable that not even the highest end GPU's can handle it (even with DLSS)
 
5070 ti 8,960 cuda cores
5070 6,144 cuda cores
4070 ti 7,680 cuda cores
4070 super 7,168

Prepare to be sad. 7,168 vs 8,960 on the ti... and only 6,144 on the non ti.
5080 10,752 cores vs 4080 super 10,240 cores.
We know how that shakes out now: 4-8%. 5% more cores, 5% more performance.
The 5070 is going to be a sad product. AMD was wise to hold their powder. (hopefully)
so 5070 ti will be similarly trading blows with 4070 ti super... what a nothingburger
 
wasn't the 4080 the same?...it's only the 4080 Super where it became worthy of the 80 series name
No, just before launch they renamed the 4080 12GB to the 4070 Ti. And we got the real 4080 16GB at launch.

If the same happened now, the 5080 would be called a 5070 right now, and a true 5080 with 20GB of memory would have been released.
 
path tracing is not the same as 'regular' RT,
The line between path tracing and regular RT does not really exist (or I just do not understand the terminology): after how many bounces, and with bouncing enabled on how many surfaces, does RT become PT?
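As a toy illustration of that bounce-count question (not any particular engine's implementation, just a sketch where max_bounces is the knob people informally use to separate "RT effects" from path tracing):

import random

# "Regular RT" effects typically trace 1-2 bounces for a specific effect
# (reflections, shadows); path tracing keeps bouncing stochastically until
# the ray escapes, is absorbed, or hits a depth cap, with many samples/pixel.
def shade_path(max_bounces, absorb_prob=0.3, albedo=0.7):
    """Follow one toy light path; return its accumulated throughput."""
    throughput = 1.0
    for _ in range(max_bounces):
        if random.random() < absorb_prob:   # ray absorbed or leaves the scene
            break
        throughput *= albedo                # energy lost at each bounce
    return throughput

for depth in (1, 2, 8, 32):                 # from "RT effect" to "path tracing"
    samples = [shade_path(depth) for _ in range(10_000)]
    print(f"max_bounces={depth:>2}: mean throughput ~ {sum(samples)/len(samples):.3f}")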
and will only become more prominent over the next 5 years.
I feel 2024 turned the corner,
- Avatar (December '23) / Star Wars Outlaws having a heavy version of it always on,
- Indiana Jones and the upcoming Doom (which is a fast-paced type of game, not the usual suspect for always-on RT),
- Black Myth: Wukong / Senua's Saga: Hellblade II, those Unreal Engine 5 titles that have it always on.

That's three sets of big titles with ray tracing always on, and a big part of why it feels like a corner was turned: no Nvidia pushing it at all, the biggest game devs deciding by themselves (on AMD-sponsored titles in some of those) to use ray tracing without even leaning on it in their marketing.

Now that many of the biggest engines have it, Assassin's Creed and other Ubisoft titles will have it; that could kind of force the PS6 to have strong RT performance, and once the PS6 has strong RT you turn the second corner, with only one left after that to have made your full circle around the block.
 
5070 ti 8,960 cuda cores
5070 6,144 cuda cores
4070 ti 7,680 cuda cores
4070 super 7,168

Prepare to be sad. 7,168 vs 8,960 on the ti... and only 6,144 on the non ti.
5080 10,752 cores vs 4080 super 10,240 cores.
We know how that shakes out now: 4-8%. 5% more cores, 5% more performance.
The 5070 is going to be a sad product. AMD was wise to hold their powder. (hopefully)

I don't disagree. The 4070 Super having more CUDA cores than the 5070 is pathetic. And the other issue is these GPUs don't bring anything good to the table: no real gain in ray tracing performance, and no new technology either. Multi-frame generation has more visual issues than single-frame generation and is useless where it would be helpful (sub-50 FPS, lower-end GPUs). Going from the videos I've seen of it, the visual downsides are too noticeable to use. Thankfully DLSS 4 with the transformer model works on RTX 4*** GPUs, because that generally improves on the older DLSS upscaling, though it also has some minor issues that the CNN model did not have.
 
it was understood when RT was first launched that it was in its infancy...you needed the cards first before developers could adopt it in games...RT is still in its early stages and will only become more prominent over the next 5 years...path tracing is not the same as 'regular' RT, that is for the ultimate GPU melting experience so it's understandable that not even the highest end GPU's can handle it (even with DLSS)
I agree it will grow. I disagree that NV has made it plain and well understood that RT wasn't a thing yet 6 years ago. Or for the now 3 generations since. I would argue this is the first generation where you don't have to argue really that it matters. It matters, Now.

Plenty of people bought 3060s, 3070s, 4060s, 4070s... and sure, even the 2000s, believing almost without question the NV marketing hype and that they were somehow future-proofed. Sure, I doubt that would be many people around here on [H]. I mean, I know more than one NV purchaser that bought into the idea they neeeeded to be ready for RT. Those same friends will also tell you today that they have a handful of games that support RT, and in almost every one of them they turn it off. It's really only now we are getting games where even the low settings use a bit of RT, and in those cases AMD's tensor-less hardware does just fine. I mean, even cards like the 6700 XT are playing games like Indiana Jones and the Great Circle at 1080p ultra or 1440p high.

It's the future, sure, agreed. The future has been now for Nvidia marketing for 6 years. :)
 
I agree it will grow. I disagree that NV has made it plain and well understood that RT wasn't a thing yet 6 years ago. Or for the now 3 generations since. I would argue this is the first generation where you don't have to argue really that it matters. It matters, Now.

Plenty of people bought 3060s, 3070s, 4060s, 4070s... and sure, even the 2000s, believing almost without question the NV marketing hype and that they were somehow future-proofed. Sure, I doubt that would be many people around here on [H]. I mean, I know more than one NV purchaser that bought into the idea they neeeeded to be ready for RT. Those same friends will also tell you today that they have a handful of games that support RT, and in almost every one of them they turn it off. It's really only now we are getting games where even the low settings use a bit of RT, and in those cases AMD's tensor-less hardware does just fine. I mean, even cards like the 6700 XT are playing games like Indiana Jones and the Great Circle at 1080p ultra or 1440p high.

It's the future, sure, agreed. The future has been now for Nvidia marketing for 6 years. :)

I think the issue is that games have become more complex with their RT implementations over the past 6 years...it's not so much that the 3080 could not handle RT games...they could, for games released at that time...it's just that games have evolved over that same time period...the most complex RT games during the 3000 series days were Cyberpunk 2077 and Metro Exodus Enhanced Edition which the 3080 could handle pretty well...but now there is a new generation of titles like Alan Wake 2, Doom: The Dark Ages, STALKER 2 etc (STALKER 2 will be adding hardware RT in a future patch)
 
I think the issue is that games have become more complex with their RT implementations over the past 6 years...it's not so much that the 3080 could not handle RT games...they could, for games released at that time...it's just that games have evolved over that same time period...the most complex RT games during the 3000 series days were Cyberpunk 2077 and Metro Exodus Enhanced Edition which the 3080 could handle pretty well...but now there is a new generation of titles like Alan Wake 2, Doom: The Dark Ages, STALKER 2 etc (STALKER 2 will be adding hardware RT in a future patch)
Agreed, that is for sure the reality. I think Nvidia marketing, though, was happy to let the average gamer kid believe that buying a 3070 was the only real option, instead of saving a couple hundred bucks on an AMD card that was faster in 99% of the games they played at the time, as RT games could be counted on your hands plus a few toes. I also agree with you that the 3080, being a bit higher-end card, might have had a bit more breathing room.
I can see why AMD would be frustrated at this point, and a bit gun-shy when it comes to Nvidia marketing. Pulling their announcement when they heard tales of multi-frame gen and BS neural rendering, when no doubt they had heard on the sly from partners months ago what we now know: that the 5000 cards are often only single-digit percent faster. AMD has had the better raster options for 3 generations and damn near every game has been pure raster, and for the few with RT, nothing short of the 80 class really counted. Almost everyone with lower 70 and 60 class cards turned RT off other than to take screenshots. (Even x80 owners mostly game with it off.)
 
Ahh, remember the days when x60 series cards competed with x80 series cards of the prior gen? And then Nvidia changed it so the x70 series cards competed with the x80 series cards of the prior gen.

And we ate it because that's all we could get?

Now they have given us an x80 card that competes with, and is barely faster than, the x80 series card of the prior gen.

This card to me feels like it should be an x60 card, or hell, even a 'more modern' x70 card. Not an x80 card.

So in short. Waste. Of. Sand.
 

https://www.youtube.com/watch?v=0L1Uyw22UAw
 
I would argue this is the first generation where you don't have to argue really that it matters. It matters, Now.
After all these years, it's still not worth the computational cost to me.

As far as I am aware, there was one major release (Indiana Jones) that required rt hardware, and I am seeing videos on youtube of non-supported video cards running that game quite well with a software layer in between.

Imo, RT hardware has been the single most damaging thing to happen to video game rendering ever. All that effort in the name of realism wasted for little gain in esthetics.
 
MLID is claiming, from sources, that the 9070 XT may well match the RTX 4080.


https://www.youtube.com/watch?v=udG8y64A8eo

If true, AMD may not fuck this up if they actually price it at $599, especially if supply of 5080s is going to be limited until Feb.


Was AMD not expecting such lackluster performance from Nvidia? If this matches RTX 5080 performance, AMD will probably price them at $800 in a best-case scenario. It would still be $200 less. Can't see them being 40% cheaper. As much as I would want them to gain market share, I assume if they dropped them at $600, Nvidia would price match at $700 even if they were only breaking even on cost.
 
If the GDDR7 ends up being ~useless in gaming, and 256-bit GDDR6 was enough for that performance class after all, it could be AMD's chance here.

The 9070 XT / 9070 are on about the same node and about the same die size as the 5080, I think; the big difference was a giant memory bandwidth advantage for Nvidia from buying quite expensive GDDR7.

AMD could be in that position where their card performs about the same without costing more to make this time around, RDNA 2 style, where Nvidia cannot easily just underprice them the moment they want to if they need to.
 
I'd say somewhere between the GTX 4xx and 2xxx series, yeah. Not great (or even all that good), but the new software features are a nice look at future things to come.
If you're an RTX 3xxx or RX 6xxx user tho, RTX 5080/90 looks to be a decent upgrade.
Probably the worst when you consider that Nvidia is banking on their AI features like DLSS and frame generation instead of producing better hardware. Again, proprietary software tools which nobody else can use. This is why nobody should be happy about Nvidia's DLSS+FG technologies as it's an excuse to sell inferior hardware.
Why would they bother when everyone is going to slurp up 4x frame gen like it's the ambrosia of the gods? Nobody plays at native anymore.
Don't tell that to the r/FuckTAA guys.
 
All they have to do is hit the same 7900xtx performance level they had before and sell it for 1/3 less and they win it all. It's close enough to the 5080 that no one will care as long as it's $300-$400 cheaper.
And if you add a nice boost to RT on top of that that's a winner right there
 
That doesn't change that basically 3 1/2 gens of Nvidia hardware was sold and bought on a promise of widespread RT everywhere. That just hasn't happened... until the back half of '24, 6 years after the tech launched. Even today, 250-some games are all there is... and of those, let's all be honest, maybe 40-50 of them get any sort of widespread play.
Well the issue is that RT will never get into a state where it's as good as raster in terms of price/performance. Every generation there's an expectation for an effects and resolution uplift. Every time that happens RT performance will take a hit, which means you need the next latest card to be able to run it. There's really no way around it. The more accurate the RT effect the bigger the hardware cost. Personally from a gamer perspective I don't see a benefit. We are already seeing games with mandatory RT that look no better than games released 5 years ago.
 
They're getting better at RT, just not quickly enough.

The PROBLEM is that there's no end in sight to having two video cards in one and we pay for both. The raster engine and the RT engine. Too much die and cost being taken up doing both when it should be one or the other.

We need the transition to RT to complete far more quickly than it is. I think we are out of time based on this launch. There is no way to keep everyone happy on this path any longer.

Maybe separate raster and RT into separate cards (old-school daughter cards or co-processors) that are upgraded separately until RT reaches supremacy, then freeze raster development and start to deprecate it. I don't know. But this can't continue. The cards are insane monsters that are somehow jacks of all trades but masters of none, which isn't what you pay $800-2000+ for. Do one thing, and do it well.

Even the mighty 5090 isn't a master of RT. It should be, at $2000-3000. Otherwise it should just be 100%+ faster at raster and call it a day. Or it should have the same pure raster as, say, a 5070 Ti, but 3x the RT performance of the 4090 and be able to hit nearly 100 fps in RT Overdrive Cyberpunk at 4K. THAT would convince people to want the next gen of 80-class card to stop increasing raster and provide that level of RT instead. After that it would trickle down to the rest of the stack quickly.

But none of the cards in this gen actually smash the RT barrier completely. Since none of them do, everyone rightly has only one question:

"Since the RT still sucks, how's the raster performance?" Well, it also overall SUCKS because once again they try to do everything on one card.

I'd rather have raster performance freeze entirely or even regress for two generations if it means we can break the back of the RT/PT performance problem and just get on with it. Either that or abandon RT. But gaming is going to die if we have to linearly pay for every performance improvement from here on out, and power draw is nearing the absolute limits... again.
 
They're getting better at RT, just not quickly enough.

The PROBLEM is that there's no end in sight to having two video cards in one and we pay for both. The raster engine and the RT engine. Too much die and cost being taken up doing both when it should be one or the other.

We need the transition to RT to complete far more quickly than it is. I think we are out of time based on this launch. There is no way to keep everyone happy on this path any longer.

Maybe separate raster and RT into separate cards (old-school daughter cards or co-processors) that are upgraded separately until RT reaches supremacy, then freeze raster development and start to deprecate it. I don't know. But this can't continue. The cards are insane monsters that are somehow jacks of all trades but masters of none, which isn't what you pay $800-2000+ for. Do one thing, and do it well.

Even the mighty 5090 isn't a master of RT. It should be, at $2000-3000. Otherwise it should just be 100%+ faster at raster and call it a day. Or it should have the same pure raster as, say, a 5070 Ti, but 3x the RT performance of the 4090 and be able to hit nearly 100 fps in RT Overdrive Cyberpunk at 4K. THAT would convince people to want the next gen of 80-class card to stop increasing raster and provide that level of RT instead. After that it would trickle down to the rest of the stack quickly.

But none of the cards in this gen actually smash the RT barrier completely. Since none of them do, everyone rightly has only one question:

"Since the RT still sucks, how's the raster performance?" Well, it also overall SUCKS because once again they try to do everything on one card.

I'd rather have raster performance freeze entirely or even regress for two generations if it means we can break the back of the RT/PT performance problem and just get on with it. Either that or abandon RT. But gaming is going to die if we have to linearly pay for every performance improvement from here on out, and power draw is nearing the absolute limits... again.
Do you realize how long it will be before anyone could go "full RT"? At the current pace it would take decades, cost 10s of thousands of dollars and require a small nuclear reactor to power it. To have any chance of going "full RT" at this point would regress gaming back at least twenty years. The hardware isn't here and won't be for a very, very long time.

This has been my point and the point of many others since the release of the 2xxx nVidia series. RT was never going to be anything but window dressing for the foreseeable future.
 
I wasn't even looking at the 5090-4090 differential, but that is also awkward. The fact that the 4080 Super is resoundingly beating the 5080 in both maximum and minimum frame rates is eyebrow raising.

Maybe, but still utterly embarrassing IMHO. Having a new flagship GPU lose to a two-year-old, now-superseded model in one of the most played games in the world is not a good look.
Ha! Timewise cs2 is my #1 game. Now I'm definitely not upgrading! Downgradeeee ftl. What a joke, they should be embarrassed
 
I have it on very good authority from prominent members of this very forum that rasterization is the only thing that matters because the fake images on the screen don't have any real rays to trace, and it gets even more ridiculous when you try to fake frames from the fake image on the screen, creating an exponential fake² event horizon that will swallow reality into its fakeness. At its culmination, you get fake³, which is the end of the universe.

Oh, forgot to mention that AMD can't really compete at raster either, were it not for Nvidia's choice to produce cards that some consider not good enough. Which honestly confuses me: how do people bash one company for producing anemic cards while praising another company for producing anemic cards?
Who is praising AMD? Seriously, who?
 
Do you realize how long it will be before anyone could go "full RT"? At the current pace it would take decades, cost 10s of thousands of dollars and require a small nuclear reactor to power it. To have any chance of going "full RT" at this point would regress gaming back at least twenty years. The hardware isn't here and won't be for a very, very long time.

This has been my point and the point of many others since the release of the 2xxx nVidia series. RT was never going to be anything but window dressing for the foreseeable future.
There are already games that do "full RT" like Minecraft and Quake 2 but it's not worth it. What Advil is suggesting is that the industry dump Raster and use the silicon space for RT only. This might actually make modern games go full RT, but the industry would need to back this up entirely because whoever does this will obviously lose financially due to all the legacy games. As it stands the window dressing we get from Ray-Tracing is not making games look better than before. Especially with all the ghosting from upscalers and blur from frame generation.
 
There are already games that do "full RT" like Minecraft and Quake 2 but it's not worth it. What Advil is suggesting is that the industry dump Raster and use the silicon space for RT only. This might actually make modern games go full RT, but the industry would need to back this up entirely because whoever does this will obviously lose financially due to all the legacy games. As it stands the window dressing we get from Ray-Tracing is not making games look better than before. Especially with all the ghosting from upscalers and blur from frame generation.
And that's not nearly enough RT hardware to go full RT. Unless the game industry regresses about 20 years. Even the light amount of RT in games now brings cards down to their knees and it's only affecting lighting. The vast majority of the rendering is still raster. RT is very inefficient compared to raster and unless some almost magical breakthrough happens it's going to stay that way for the foreseeable future.
 
There are already games that do "full RT" like Minecraft and Quake 2 but it's not worth it. What Advil is suggesting is that the industry dump Raster and use the silicon space for RT only. This might actually make modern games go full RT, but the industry would need to back this up entirely because whoever does this will obviously lose financially due to all the legacy games. As it stands the window dressing we get from Ray-Tracing is not making games look better than before. Especially with all the ghosting from upscalers and blur from frame generation.
The only way that works is if we basically get two different bits of hardware: a GPU which does raster... and some separate accelerator card.
Of course that isn't possible.
Even if it were feasible, RT would die off pretty fast. It would be expensive extra hardware no one needs, because game studios would just keep making raster games. Install base. Same reason they don't make VR games, or why the PhysX add-on cards were a fail.
 
The only way that works is if we basically get two different bits of hardware: a GPU which does raster... and some separate accelerator card.
Of course that isn't possible.
Even if it were feasible, RT would die off pretty fast. It would be expensive extra hardware no one needs, because game studios would just keep making raster games. Install base. Same reason they don't make VR games, or why the PhysX add-on cards were a fail.
yaaas! Bring back the 3d accelerator cards and phys-x cards!
 
And that's not nearly enough RT hardware to go full RT. Unless the game industry regresses about 20 years. Even the light amount of RT in games now brings cards down to their knees and it's only affecting lighting. The vast majority of the rendering is still raster. RT is very inefficient compared to raster and unless some almost magical breakthrough happens it's going to stay that way for the foreseeable future.
The truth is... at some point people might just have to square up with the idea that full "real time" ray tracing was maybe a pipe dream (or a marketing strategy...). Path tracing is close to full-on ray tracing, with two catches. First, ALL implementations severely restrict how much is really being ray traced: they aren't calculating every single pixel, they aren't accounting for multiple bounces, and they put a ton of limitations on the work being done, otherwise it would run at less than 1 fps. Second, even with all the limitations developers use to make the majority of what matters trace... even $2k+ flagship cards, four generations of hardware in, can't provide acceptable frame rates.

If a game were to go actually 100% RT for all lighting, even a 5090 is going to run that at 2 FPS at most. lol
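A very rough back-of-the-envelope for that claim (every number here is an illustrative assumption, not a measured spec):

# Toy ray budget for brute-forcing "100% RT" lighting at 4K.
# All figures below are assumptions for illustration, not hardware specs.
pixels         = 3840 * 2160   # 4K
samples_per_px = 256           # assumed samples for low-noise lighting without heavy denoising
bounces        = 8             # assumed average path length
target_fps     = 60

rays_per_frame  = pixels * samples_per_px * bounces
rays_per_second = rays_per_frame * target_fps

print(f"rays per frame : {rays_per_frame/1e9:.1f} billion")
print(f"rays per second: {rays_per_second/1e12:.1f} trillion at {target_fps} fps")

# If a hypothetical GPU could fully trace *and shade* 50 billion rays/s
# (again, an assumed round number), the sustained frame rate would be about:
assumed_gpu_rays_per_s = 50e9
print(f"sustained fps  : {assumed_gpu_rays_per_s / rays_per_frame:.1f}")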
 