PC Gamer - AMD RX 6800 XT runs at almost half the frame rate of Nvidia's RTX 3080 in Vulkan Ray Tracing tests

Fuck that. I don't buy $1,500 video cards so I can play games in potato mode.
Yeah, same here. I tried to get CP2077 running at high refresh, and I did it, but at the cost of all the graphical quality.

I've since just maxed out all settings (but put DLSS on ultra performance), and it looks amazing, but I have to live with around 45 fps in the city, or 75 fps indoors. I'll take it.
 
fuk the eyecandy - I want FPS and raster is good enough.
Sure, but someone buying a $600 USD video card probably cares a lot about eye candy (so much that they want above-1080p-resolution eye candy, play with high details, and so on)
 
Yeah, same here. I tried to get CP2077 running at high refresh, and I did it, but at the cost of all the graphical quality.

I've since just maxed out all settings (but put DLSS on ultra performance), and it looks amazing, but I have to live with around 45 fps in the city, or 75 fps indoors. I'll take it.

I'm getting about the same with a Ryzen 9 3950X, overclocked RTX 2080 Ti and DLSS on Quality. This is at 3440x1440.
 
Yeah, same here. I tried to get CP2077 running at high refresh, and I did it, but at the cost of all the graphical quality.

I've since just maxed out all settings (but put DLSS on ultra performance), and it looks amazing, but I have to live with around 45 fps in the city, or 75 fps indoors. I'll take it.
Fuck that. I don't buy $1,500 video cards so I can play games in potato mode.
Fuk that, I don't spend $1500 on a video card.

Sure, but someone buying a $600 USD video card probably cares a lot about eye candy (so much that they want above-1080p-resolution eye candy, play with high details, and so on)

Maybe they do, maybe they don't. Who's to say? Graphics Cards have always been a balance between candy, FPS and price.

FACT - Make no mistake - RT is EYECANDY! Jensen wants everyone to believe it's all raytraced now. FAR FROM IT!
FACT - Everything is still Raster Based. Imagine that.
FACT - Nvidia is lining their pockets with your cash on this effect. How does that 1080 play with RTX now, or ever? Like shit. Without DOWNSAMPLING with DLSS the latest cards on 4k can't cut it either.
FACT - SUPER LIMITED Selection of Games available for RTX and DLSS
OPINION - RTX "enhancement" is the most overrated and overhyped graphic effect ever. Nvidia has pulled off the greatest BS marketing hype scheme ever. Congrats Jensen.
DLSS is a CRUTCH for Nvidia; 2.0 may be worthy, but everyone raved about garbage-looking 1.0. That was in everyone's head and not reality.

FOR ME - Until we get to FULL Raytracing EVERYTHING..
RT OFF - GIMMICK

Enjoy your $1500 GPUs with limited-selection eyecandy; I'll wait until RT is the real thing and not some "after-effect" with an absolute HUGE performance hit.
 
FACT - Make no mistake - RT is EYECANDY! Jensen wants everyone to believe it's all raytraced now. FAR FROM IT!
Ya, but that is what all of graphics are: eye candy. And when properly implemented to enhance games, it can be nice eye candy. Control looks pretty damn cool with RT. I wouldn't use it if I had a card that couldn't handle it; it doesn't look cool enough to play at 30fps. But while it DOES impact frame rate, I can still get 80-100fps (at 3840x1600), which is quite smooth.

It's understandable if you don't want to buy a new card to get it, but acting like it is a gimmick that people shouldn't care about is kinda silly. It is eye candy that does look good when well done, and that is why we have GPUs in the first place.
 
Fuk that, I don't spend $1500 on a video card.



Maybe they do, maybe they don't. Who's to say? Graphics Cards have always been a balance between candy, FPS and price.

FACT - Make no mistake - RT is EYECANDY! Jensen wants everyone to believe it's all raytraced now. FAR FROM IT!
FACT - Everything is still Raster Based. Imagine that.
FACT - Nvidia is lining their pockets with your cash on this effect. How does that 1080 play with RTX now, or ever? Like shit. Without DOWNSAMPLING with DLSS the latest cards on 4k can't cut it either.
FACT - SUPER LIMITED Selection of Games available for RTX and DLSS
OPINION - RTX "enhancement" is the most overrated and overhyped graphic effect ever. Nvidia has pulled off the greatest BS marketing hype scheme ever. Congrats Jensen.
DLSS is a CRUTCH for Nvidia; 2.0 may be worthy, but everyone raved about garbage-looking 1.0. That was in everyone's head and not reality.

FOR ME - Until we get to FULL Raytracing EVERYTHING..
RT OFF - GIMMICK

Enjoy your $1500 GPUs with limited-selection eyecandy; I'll wait until RT is the real thing and not some "after-effect" with an absolute HUGE performance hit.
That's about where I am at on it currently. RT is a checkbox, not really returning anything for the huge performance trade-off. DLSS has not impressed either, with its in-game artifacting.

Ray tracing is coming but we are years away.
 
Ya, but that is what all of graphics are: eye candy. And when properly implemented to enhance games, it can be nice eye candy. Control looks pretty damn cool with RT. I wouldn't use it if I had a card that couldn't handle it; it doesn't look cool enough to play at 30fps. But while it DOES impact frame rate, I can still get 80-100fps (at 3840x1600), which is quite smooth.

It's understandable if you don't want to buy a new card to get it, but acting like it is a gimmick that people shouldn't care about is kinda silly. It is eye candy that does look good when well done, and that is why we have GPUs in the first place.
Minimum FPS is what counts, not max.

Not all features are eyecandy - but RTX is. It's an add-on to a very limited number of games; candy. It's a glossy feature to enhance what is already there. You pay out the ass for a little twinkle, and your wallet kicks you in the ass to remind you how good it is and how worth it it was.
 
Fuk that, I don't spend $1500 on a video card.



Maybe they do, maybe they don't. Who's to say? Graphics Cards have always been a balance between candy, FPS and price.

FACT - Make no mistake - RT is EYECANDY! Jensen wants everyone to believe it's all raytraced now. FAR FROM IT!
FACT - Everything is still Raster Based. Imagine that.
FACT - Nvidia is lining their pockets with your cash on this effect. How does that 1080 play with RTX now, or ever? Like shit. Without DOWNSAMPLING with DLSS the latest cards on 4k can't cut it either.
FACT - SUPER LIMITED Selection of Games available for RTX and DLSS
OPINION - RTX "enhancement" is the most overrated and overhyped graphic effect ever. Nvidia has pulled off the greatest BS marketing hype scheme ever. Congrats Jensen.
DLSS is a CRUTCH for Nvidia; 2.0 may be worthy, but everyone raved about garbage-looking 1.0. That was in everyone's head and not reality.

FOR ME - Until we get to FULL Raytracing EVERYTHING..
RT OFF - GIMMICK

Enjoy your $1500 GPUs with limited-selection eyecandy; I'll wait until RT is the real thing and not some "after-effect" with an absolute HUGE performance hit.
My thoughts exactly.

I foresee Ray Tracing going the way of PhysX & HairWorks. The technology is there, sure, but how many developers are willing to implement it correctly, and at what cost to them in the form of time and deadlines?
 
Fuck that. I don't buy $1,500 video cards so I can play games in potato mode.
Ohhhh man, that sucks, cause if you want full RT non-potato mode on a 3090... you're still going to be doing that at 1080p. Welcome to the RT future.

Not that there is anything really wrong with gaming at 1080p... and if you don't mind mini-potato mode you can probably jump up to 1440 and select medium on whichever other full non-potato settings you can stand downgrading on your $1500 card. :)

Half kidding... but I mean really, that is the RT reality today. It's the future, sure... but current hardware is still forcing choices. No one is selling the loaded-baked-potato version quite yet.
 
Raytracing was never going to be a big thing this generation: consoles will struggle past basic delivery, Nvidia can only do it with quality hits from DLSS, and AMD just implemented their first attempt at it.

Raytracing is only for marketers and those that guzzle marketing.

Give it 4-6 years and yes, ray tracing will be big, but that's a ways off, and certainly not on this hardware.
 
Ohhhh man, that sucks, cause if you want full RT non-potato mode on a 3090... you're still going to be doing that at 1080p. Welcome to the RT future.

Not that there is anything really wrong with gaming at 1080p... and if you don't mind mini-potato mode you can probably jump up to 1440 and select medium on whichever other full non-potato settings you can stand downgrading on your $1500 card. :)

Half kidding... but I mean really, that is the RT reality today. It's the future, sure... but current hardware is still forcing choices. No one is selling the loaded-baked-potato version quite yet.
Yeah, you're pretty much paying $1,500 for a "loaded" baked potato, but with Bacos bacon bits instead of real bacon. lol
 
My thoughts exactly.

I foresee Ray Tracing going the way of PhysX & HairWorks. The technology is there, sure, but how many developers are willing to implement it correctly, and at what cost to them in the form of time and deadlines?
I said it in another thread... but the future of RT is AMD Navi 2. Is it as powerful as Nvidia's? Drivers and software optimizations notwithstanding, it's probably not. However, it is the target for the next 3-4 years. But that might not be that bad... it's powerful enough that games developed with it in mind from day one should still be able to pull off some very cool effects. I think the key for developers will be understanding when RT is going to make for those eye-catching shadows and reflections... and when the action is high and a light map will achieve 95% of the same effect and no one will really notice. Right now basically every RT game we have gotten was designed with non-RT hardware in mind... and for the most part RT was added after the fact by a couple of programmers (and probably an Nvidia staff programmer), where they basically just drop an RT lighting engine in as a replacement. Hopefully with newer games developed with AMD's consoles in mind, artists earlier in the design pipeline can make intelligent decisions that give us the best of both worlds: fast light maps when the visual impact of RT is just not great enough for the performance cost... and RT effects when the player will really notice the IQ.

IMO that is what it's going to come down to... developers. They're going to have to learn how to use RT wisely... and in fairness, I really don't think many game artists have had that chance yet. I look forward to seeing what games hit next Xmas... by then developers should have had a few years to have their artistic types figuring out how to use RT properly without just trying to replace every light map with rays. Cause hardware isn't going to be able to handle that with full ultra settings for a few years.

PS EDIT... I am also personally convinced, after reading how both companies' RT implementations work, that AMD's version is in fact superior. Yes, Nvidia was out first and has more optimization right now... and probably more brute calculation force. However, IMO the future isn't full RT; it's a hybrid use of light maps and old raster tricks... with a sprinkle of ray calculation where it makes the most sense. AMD's all-in-one compute core solution is more elegant and should suffer from less latency. I also have a feeling some smart developers are going to figure out how to properly share the resources in those CUs. The future will tell... but I have a feeling games released in the next few years developed with AMD's solution in mind may run like crap on Nvidia's dual core solution.
 
My thoughts exactly.

I foresee Ray Tracing going the way of PhysX & HairWorks. The technology is there, sure, but how many developers are willing to implement it correctly, and at what cost to them in the form of time and deadlines?
I think we should not mix up Nvidia's specific ray tracing implementation with RT in general.

Physics simulation in games is going as strong as ever, and non-graphics computation on the GPU is also more popular than it has ever been. Lara Croft's hair in the latest Tomb Raider is the best it has ever been, and hair/fur simulation is still fully going on:
https://www.altchar.com/game-news/a...s-with-unreal-engine-integration-aNgzp8P3BQTX

Ray tracing, according to most (like John Carmack), is almost certain to win eventually. There is a reason it is massively used when people have the time and electricity to spend on it: it makes everything much better and simpler. Also, unlike rasterization, its cost scales logarithmically instead of linearly, so the higher the resolution and scene complexity get, the smaller rasterization's speed advantage over ray tracing becomes:
https://www.researchgate.net/figure...ng-with-input-complexity-We-also_fig8_2574984

I.e. maybe by, say, 12-16K resolution with extremely small triangles, the relative cost of ray tracing vs rasterization becomes quite small.
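To make that log-vs-linear point concrete, here is a toy cost model (a minimal sketch; the constants are invented for illustration, not measured from any GPU): rasterization touches every triangle, while each ray in a BVH-accelerated tracer does roughly log2(N) work, so past some scene complexity the curves cross.

```python
import math

# Invented, illustrative constants (not measurements).
RAYS = 3840 * 2160                 # one primary ray per 4K pixel
COST_PER_TRIANGLE = 1.0            # raster: cost roughly linear in triangle count
COST_PER_BVH_STEP = 1.0            # RT: cost roughly RAYS * log2(N) BVH steps

def raster_cost(n_triangles):
    return COST_PER_TRIANGLE * n_triangles

def rt_cost(n_triangles):
    return COST_PER_BVH_STEP * RAYS * math.log2(n_triangles)

for n in (10**6, 10**7, 10**8, 10**9, 10**10):
    r, t = raster_cost(n), rt_cost(n)
    winner = "RT cheaper" if t < r else "raster cheaper"
    print(f"{n:>14,} triangles: raster={r:.2e}  rt={t:.2e}  -> {winner}")
```

With these made-up constants the crossover lands somewhere past a hundred million triangles; the exact point is meaningless, the shape of the curves is the point.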

Whether real-time ray tracing in 2200 will look like, or have anything to do with, Nvidia's version is a different question, but I doubt RT is going away. It is almost certain that it will be used in real-time work, and so will PhysX-like and HairWorks-like features.

At a certain point of complexity RT should be faster, and that should make it win easily (combined with the much simpler and more realistic results). And because that is what non-real-time renderers will continue to use, development and hardware for it will almost certainly continue to be made, even if its use in real-time games is just a 2018-2021 fad that takes a break until 2040 or something like that.
 
Fuk that, I don't spend $1500 on a video card.



Maybe they do, maybe they don't. Who's to say? Graphics Cards have always been a balance between candy, FPS and price.

FACT - Make no mistake - RT is EYECANDY! Jensen wants everyone to believe it's all raytraced now. FAR FROM IT!
FACT - Everything is still Raster Based. Imagine that.
FACT - Nvidia is lining their pockets with your cash on this effect. How does that 1080 play with RTX now, or ever? Like shit. Without DOWNSAMPLING with DLSS the latest cards on 4k can't cut it either.
FACT - SUPER LIMITED Selection of Games available for RTX and DLSS
OPINION - RTX "enhancement" is the most overrated and overhyped graphic effect ever. Nvidia has pulled off the greatest BS marketing hype scheme ever. Congrats Jensen.
DLSS is a CRUTCH for Nvidia; 2.0 may be worthy, but everyone raved about garbage-looking 1.0. That was in everyone's head and not reality.

FOR ME - Until we get to FULL Raytracing EVERYTHING..
RT OFF - GIMMICK

Enjoy your $1500 GPUs with limited-selection eyecandy; I'll wait until RT is the real thing and not some "after-effect" with an absolute HUGE performance hit.
You don't understand very well how new technologies come about and are refined over time.

I don't mean to say you should run out and buy a $1500 video card to use RT (I don't either), but you sure seem to be trying to say that nobody else should (or should be unable to, because it shouldn't be released until it's flawless) because you don't find value in it currently. Then you expect it to still get refined for years with no ROI until it's flawless. Imagine the price of a card if they had 10+ years of R&D tied to it for a feature that never had any ROI during that time. That might work for some commercial and industrial markets, but the consumer market not so much.

That's all without even considering the implications of the software developers' role in it. I guess it shouldn't be released 'til it's just a single variable for them to flip and it magically inserts flawless RT into their game with minimal performance loss.
 
You don't understand very well how new technologies come about and are refined over time.

I don't mean to say you should run out and buy a $1500 video card to use RT (I don't either), but you sure seem to be trying to say that nobody else should (or should be unable to, because it shouldn't be released until it's flawless) because you don't find value in it currently. Then you expect it to still get refined for years with no ROI until it's flawless. Imagine the price of a card if they had 10+ years of R&D tied to it for a feature that never had any ROI during that time. That might work for some commercial and industrial markets, but the consumer market not so much.

That's all without even considering the implications of the software developers' role in it. I guess it shouldn't be released 'til it's just a single variable for them to flip and it magically inserts flawless RT into their game with minimal performance loss.
Having several technical degrees, I fully understand technology refinement, thank you. Yes, RT was the past and is the future; however, it's obviously NOT READY for real-time games. As Kyle said, eloquently simplifying the current RT debate to its base: "paying too much for too little". I guess everyone has their opinion; I've stated mine and exactly why.
 
Maybe they do, maybe they don't. Who's to say? Graphics Cards have always been a balance between candy, FPS and price.

Do you know one person who buys those types of cards and plays only unimpressive graphics at the lowest possible details? You have the pro type of players wanting 300 fps on their 240Hz+ monitors, yes, but that's an exception. Like you said, candy was always in the balance of graphics cards, almost by definition (i.e. you seem to fully agree that it is quite probable, but not certain, that people who buy a 6800 XT instead of a 5700 XT/6800 care in some way about graphical quality, be it resolution above 720p or 1080p, not playing the game at the lowest settings and so on, and not just shadows because they help 3D perception and make them better at the game).
 
Do you know one person who buys those types of cards and plays only unimpressive graphics at the lowest possible details? You have the pro type of players wanting 300 fps on their 240Hz+ monitors, yes, but that's an exception. Like you said, candy was always in the balance of graphics cards, almost by definition (i.e. you seem to fully agree that it is quite probable, but not certain, that people who buy a 6800 XT instead of a 5700 XT/6800 care in some way about graphical quality, be it resolution above 720p or 1080p, not playing the game at the lowest settings and so on, and not just shadows because they help 3D perception and make them better at the game).
I'm not sure what you are asking here, if anything. You assume too much from my response. I will say I prefer shadows that don't cut my framerates in half and bring my min fps below 60. If the fps hit is larger than that threshold, shadows get turned off, as they are not that crucial to any game.
 
I'm not sure what you are asking here, if anything. You assume too much from my response. I will say I prefer shadows that don't cut my framerates in half and bring my min fps below 60. If the fps hit is larger than that threshold, shadows get turned off, as they are not that crucial to any game.
Well yes, and that is completely different from saying, in a thread comparing graphics cards sold for over $700 USD, that candy does not matter at all. Everyone always makes a compromise between candy and performance; everyone is like you (just drawing the line a little bit differently, but in that very similar ballpark).
 
And this is why Nvidia wants reviewers to focus on raytracing despite the vanishingly few titles that actually use it...

Of course Nvidia wants reviewers to cover it! It is impressive tech.

AMD fans do NOT want to hear about it, as they are loudly proclaiming in the thread. Salt in the wound I suppose.

I played Quake2 RTX last summer, but it's an ancient game. Cyberpunk is the first game I have played that it really makes a difference.

I suppose, but it was kind of a false hope to begin with. It was either make a competitive card in rasterization or make a competitive card for RT or do both and be mediocre in both.

Like the 20xx series it will take AMD a generation or two to shine with RT (hopefully)

Playing Cyberpunk on a 2080Ti and the FPS is good. All you gotta do is turn the cascaded shadows setting down one notch and performance practically doubles. It's not 30fps as the haters in the thread proclaim.
 
Playing Cyberpunk on a 2080Ti and the FPS is good. All you gotta do is turn the cascaded shadows setting down one notch and performance practically doubles. It's not 30fps as the haters in the thread proclaim.
Is that at native resolution or using DLSS?
 
Of course Nvidia wants reviewers to cover it! It is impressive tech.

AMD fans do NOT want to hear about it, as they are loudly proclaiming in the thread. Salt in the wound I suppose.

I played Quake2 RTX last summer, but it's an ancient game. Cyberpunk is the first game I have played that it really makes a difference.



Playing Cyberpunk on a 2080Ti and the FPS is good. All you gotta do is turn the cascaded shadows setting down one notch and performance practically doubles. It's not 30fps as the haters in the thread proclaim.
So Nvidia fans proclaim "salt in the wound from AMD fans" over one game (Cyberpunk) and declare validation and victory for RTX?
 
AMD will work with game developers to get their RT working better in their engines. AMD being in all of the consoles will definitely help get their RT worked into game engines.
 
Of course Nvidia wants reviewers to cover it! It is impressive tech.

AMD fans do NOT want to hear about it, as they are loudly proclaiming in the thread. Salt in the wound I suppose.

I played Quake2 RTX last summer, but it's an ancient game. Cyberpunk is the first game I have played that it really makes a difference.



Playing Cyberpunk on a 2080Ti and the FPS is good. All you gotta do is turn the cascaded shadows setting down one notch and performance practically doubles. It's not 30fps as the haters in the thread proclaim.

I run a 2080Ti.

Lol, it absolutely does not double. The best I got was to lower cascaded shadows by one and tweak a few other settings, and I went from 45fps to 53fps; at times I top out at 60fps, and I don't dare turn my monitor's OD on because I need to be in G-Sync range. Both with DLSS on auto. If DLSS is off then yes, it's a 20-30fps slideshow outside of 1080p.

There are so many lies floating around about RTX and DLSS, it's hilarious. I agree it can look good, but the performance cost is tremendous.
 
Having played some RT titles for a couple of weeks now with a 3080, I think it represents a good initial tipping point toward RT readiness. I think of it as similar to something between the GeForce 256 and GeForce 2 and their hardware T&L. It's good enough now to derive some enjoyment from software that takes advantage of it, but it is most certainly something of a luxury feature. So no one should buy today because it's essential, but if you have the economic means to enjoy it as a perk, then by all means do so! If not, that's fine, but don't begrudge those who are kicking the tires on it any more than we did in the early days of 2D acceleration, 3D T&L, HW vertex/pixel shaders, etc.

CP2077 and SoTR haven't impressed me so much (the former in particular because turning on RT almost adds too much eye candy, causing distraction for me), but I just started Control and it really is amazing.
 
Well yes, and that is completely different from saying, in a thread comparing graphics cards sold for over $700 USD, that candy does not matter at all. Everyone always makes a compromise between candy and performance; everyone is like you (just drawing the line a little bit differently, but in that very similar ballpark).
If we were squabbling about a difference of 120 FPS vs 100 FPS... I think I would be on team RT Now.
However, we are not talking about that... we are talking about 200+ FPS vs 60-70 FPS (and probably a resolution drop) on $1500 hardware... we have halo cards that can run ultra settings at over 200 FPS, but you flip RT on and all of a sudden you can't hit the refresh rate on a $200 monitor (never mind that $1000+ gaming beast you're likely to have if you spring for a halo card).
If you're in for the real flagship 3080/6800 class cards... then you're talking more like 200 FPS vs 50-60 FPS, and no doubt a resolution bump at least one step down (unless you really are running a flagship card on a 1080p monitor). Perhaps you can bump that RT frame rate back up over 100 FPS IF... IF you are willing to drop things back from ultra/high settings into medium-setting territory.

I don't understand the argument to flip RT on at the expense of more game-impacting IQ settings... are you really going to drop texture filtering, or AA, or tessellation, or any number of other settings we have gotten used to being able to set to ultra on flagship hardware, just so you can enable RT lighting and shadows? The shadows don't look any better 9 times out of 10... and the lighting CAN be very cool, but most of the time it's hard to see the difference cause you're busy playing a game, and the old raster way is pretty damn convincing when you're running in an FPS, or racing, etc.

We had this exact same argument years back with anti-aliasing... the first 3-4 generations could use it to varying degrees of quality and performance impact. One generation could perhaps flip 2x on... the next 4x... the next had some new-fangled way to do it better. Point is, everyone could see the potential, and almost everyone agreed AA was going to be a future de facto feature. We even bought cards where we said yes, everything else is equal but X has a bit better AA performance or can use a different mode that performs better with the same IQ. No one however said I'm going to buy this card... cause it can do 8x MSAA; I mean sure it drops me down to single-digit frames, but the other guys just don't have it, so I'm sold. RT is there today... Nvidia can do some cooler stuff right now, no argument from me; however it's still a feature that runs like crap even on the halo and flagship cards, and on the 3060-class cards it might as well not be included, it's unusable.

Nvidia's solution, DLSS, is a shit joke... and I really, really hope AMD doesn't just release a me-too version that sucks just as bad. I am hoping AMD comes up with something that might not promise the silly "4k quality 1080p performance" junk that Nvidia is peddling (and I suspect, based on recent reviewer stories... NV is probably writing reviewers' DLSS copy). Frankly DLSS makes me shake my head... it's garbage; every game I have seen it in (the few that support it), to my eye it looks like shit. I would rather run at a proper resolution and slide a few things down to medium than see graphic glitches or Vaseline-like clarity every time I get into a corner or something where the AI didn't get enough data or whatever the issue was. Nvidia selling DLSS as the answer to RT now... is lacking, and I am pretty sure that the people that bought into their last generation are mostly coming to the same conclusion I did. I'll be glad to enable RT when the hardware can push it at native resolution without forcing me to choose which other IQ features to step down to retain at least a G-Sync/FreeSync frame rate range. ;)
 
I don't understand the argument to flip RT on at the expense of more game-impacting IQ settings... are you really going to drop texture filtering, or AA, or tessellation, or any number of other settings we have gotten used to being able to set to ultra on flagship hardware, just so you can enable RT lighting and shadows? The shadows don't look any better 9 times out of 10... and the lighting CAN be very cool, but most of the time it's hard to see the difference cause you're busy playing a game, and the old raster way is pretty damn convincing when you're running in an FPS, or racing, etc.
This (the cost in eye candy lost to get the RT eye candy being too big) is closer to what is going on than the original message that spawned the conversation. Take the new Dark Souls: what sacrifice in its really impressive presentation would it have needed to make to get some RT effects in there?

I am hoping AMD comes up with something that might not promise the silly "4k quality 1080p performance"
That's close to what the people buying those new consoles are getting; what they are doing there will probably become available on the PC side as well.
 
Ray tracing is worth it for me. I don't mind DLSS, I actually like the look it gives. More like watching a movie.
 
While I agree that for now 10GB is enough, I keep my cards for at least 3 years and I skip generation(s). My current card is an EVGA 1080 Ti FTW and the warranty is up in 4 months. It is also hard for me to give up a card with 11GB for one with 10GB.

The game that I play the most, Age of Empires 3: Definitive Edition, shows that it allocates/uses over 10GB from time to time, and yes, I understand it could be allocated and not used, but for my peace of mind I prefer a card with a minimum of 12GB, since I plan to keep it until the warranty is near its end.

I really think Nvidia missed the boat this go-around, thinking the people that upgrade to the 3080 are 1080 owners. Or maybe they are really trying to push 1080 Ti owners into buying the 3090. I don't know, but the 3090 is too much money for me. I paid $800 with tax for my 1080 Ti; at most I would pay $1k this time around, so I am hoping for the 3080 Ti if that ever comes out.
I went from a 1080 Ti to a (thankfully free) 2080 Super (8GB) for a bit. I didn't have any issues at 4K in anything, not even in doom eternal with ultra nightmare (likely because I had a very high memory OC). It was a total side grade as they were both under AiO watercooling and heavily overclocked and they were both very close in performance (other than raytracing).

If you were upgrading to an 8GB card for 4K today, I'd say yeah not so smart. 10GB with the compression and memory streaming that is coming to games? Really shouldn't be an issue unless you mod games for ultra realistic (and unoptimized!) textures.

With that all said, I'm using a 3090 right now because it's over twice as fast as two 1080 Tis in SLI. I've got the beta Afterburner/RTSS which allows you to configure it to see actual in-use VRAM, and even CP2077 at 4K ultra RT preset with DLSS set to balanced only uses around 6 to 7GB (with up to 9GB allocated).
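If you want a scriptable way to watch something similar, here is a rough sketch using pynvml (the NVIDIA NVML Python bindings). This is my own addition, not the Afterburner/RTSS beta mentioned above, and NVML's per-process figure is closer to "allocated" than the true "in use" number that beta exposes, so treat it as an upper bound.

```python
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)       # first GPU
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"Total VRAM: {mem.total / 2**30:.1f} GiB, used: {mem.used / 2**30:.1f} GiB")
    # Per-process allocations for graphics (D3D/Vulkan/OpenGL) clients.
    for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
        used = proc.usedGpuMemory
        gib = "n/a" if used is None else f"{used / 2**30:.1f} GiB"
        print(f"  pid {proc.pid}: {gib}")
finally:
    pynvml.nvmlShutdown()
```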
 
If we were squabbling about a difference of 120 FPS vs 100 FPS... I think I would be on team RT Now.
However, we are not talking about that... we are talking about 200+ FPS vs 60-70 FPS (and probably a resolution drop) on $1500 hardware... we have halo cards that can run ultra settings at over 200 FPS, but you flip RT on and all of a sudden you can't hit the refresh rate on a $200 monitor (never mind that $1000+ gaming beast you're likely to have if you spring for a halo card).
If you're in for the real flagship 3080/6800 class cards... then you're talking more like 200 FPS vs 50-60 FPS, and no doubt a resolution bump at least one step down (unless you really are running a flagship card on a 1080p monitor). Perhaps you can bump that RT frame rate back up over 100 FPS IF... IF you are willing to drop things back from ultra/high settings into medium-setting territory.

No man, you are really overstating it. It does have a big hit, though how big depends on the game and how much RT it does, but this idea that it is like 1/3rd the FPS with a rez drop is not supported by data. According to Techspot's benchmarks, Control runs at 109fps average with no RT and 67fps average with RT at 1440p on a 3080. That's a bit under a 40% drop. Metro Exodus at 1440p Ultra quality goes from 153fps to 98fps average. Now on the other hand, Fortnite gets destroyed if you turn RT all the way up, going from 170fps down to 46fps.
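For reference, the percentages worked out from those quoted figures (nothing assumed beyond the numbers above):

```python
# Percent FPS drop with RT enabled, using the figures quoted above.
cases = {
    "Control, 1440p, RTX 3080": (109, 67),
    "Metro Exodus, 1440p Ultra": (153, 98),
    "Fortnite, RT maxed": (170, 46),
}
for name, (no_rt, rt) in cases.items():
    drop = 100 * (no_rt - rt) / no_rt
    print(f"{name}: {no_rt} -> {rt} fps ({drop:.0f}% drop)")
```

That works out to roughly a 39%, 36%, and 73% drop respectively.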

So it depends on the game and the rez, but the idea that all games just drop to unplayable rates is not the case. Also, the absolute drop doesn't necessarily matter, depending on your preferences and your screen. What I mean is that if you have a VRR screen, going from very high fps to just high fps may not feel like much of a difference. To me the difference between 60 and 90 fps is pretty noticeable, but the difference between 90 and 120 is a lot more subtle. Depending on the game, I'll take the lower fps for better visuals.

RT is certainly not the be-all, end-all, and gaming will continue to be very enjoyable without it, but it is rather silly that people are just attacking it and trying to pretend it is unusable. On the 3080 (and thus also the 3090) at least, you can play games at reasonable fps with RT.

It's a nice option to have, if nothing else.
 
I went from a 1080 Ti to a (thankfully free) 2080 Super (8GB) for a bit. I didn't have any issues at 4K in anything, not even in doom eternal with ultra nightmare (likely because I had a very high memory OC). It was a total side grade as they were both under AiO watercooling and heavily overclocked and they were both very close in performance (other than raytracing).

If you were upgrading to an 8GB card for 4K today, I'd say yeah not so smart. 10GB with the compression and memory streaming that is coming to games? Really shouldn't be an issue unless you mod games for ultra realistic (and unoptimized!) textures.

With that all said, I'm using a 3090 right now because it's over twice as fast as two 1080 Tis in SLI. I've got the beta Afterburner/RTSS which allows you to configure it to see actual in-use VRAM, and even CP2077 at 4K ultra RT preset with DLSS set to balanced only uses around 6 to 7GB (with up to 9GB allocated).

I am just not sure if, 3 years from now, 10GB will still be enough, and I am sure no one is willing to guarantee that for me. Hey, do you have a link for that Afterburner beta that I can download and try out?
 
Having several technical degrees, I fully understand technology refinement, thank you. Yes, RT was the past and is the future; however, it's obviously NOT READY for real-time games. As Kyle said, eloquently simplifying the current RT debate to its base: "paying too much for too little". I guess everyone has their opinion; I've stated mine and exactly why.
Yes, you fully understand, followed right up with "it's the future but not ready." Again, how do you expect to get there from here? It requires involvement from both the hardware devs and the software devs, and lots of R&D that costs money. Are the hardware devs supposed to make secret hardware for the software devs to use with their secret software, then both somehow recoup the cost of several generations of R&D in one generation of product once it's Ready™ by whoever's standards, without consumers falling over at the price? Automakers started putting (mechanical) fuel injection on cars in the 50s; they didn't just somehow arrive at modern EFI in the 2000s without all the previous iterations, or secretly work on it for 50+ years with no ROI in that time.

I agree with the sentiment "paying too much for too little" for me personally, so I don't. That doesn't mean I think nobody else should, or that it shouldn't be offered until it clears that bar, though. It's not as if it's completely broken and/or unusable.
 
I am just not sure if, 3 years from now, 10GB will still be enough, and I am sure no one is willing to guarantee that for me. Hey, do you have a link for that Afterburner beta that I can download and try out?
I can't guarantee that it will, but it sure as hell looks like it. Or I could say that I guarantee it, but why not read up on it yourself?

https://www.resetera.com/threads/vram-in-2020-2024-why-10gb-is-enough.280976/ is where I got the link for AB 4.6.3 beta 3 and the beta RTSS, and my confidence that 10GB won't be what holds a 10GB card back, performance-wise.
 
I'm getting more than 30 FPS in CP2077 with RT for sure, it's pretty stable around 50-60. I guess to me it shines, it works well and is playable.
Fair... you are getting around 60 fps... with a flagship card and a brand-spanking-new Ryzen. I am sure that is playable... for my money though, if I'm buying the fastest card on offer (we can forget the 3090 I think), I am not sure I'm happy with 60 fps. Sure, with a decent G-Sync monitor that is probably still fairly smooth looking. Still just seems painful... I know Cyberpunk might just be a mess optimization-wise. Still, it would make me wonder what the next big game would run like down the line. Which is why I think RT is just a nice-to-have at best right now... of course, as I have mentioned a few times now, I expect game developers will probably find a lighter touch with RT anyway. So perhaps this generation will age better than it would seem out of the gate. Out of the gate I would be worried the next big game would run at 30FPS... but ya, if developers design with Navi 2 in mind first, perhaps the 3080 RT performance in the next big game will be acceptable.

Who knows; right now the tech is just in so much flux that I don't think anyone should be basing purchasing decisions on RT anything. Or reading too much into things like "the 6800 sucks in Quake II RT"... the next game could go the opposite direction. I don't think Nvidia has this race locked up as tight as it looks right now. To each their own though... we have had plenty of cool new tech, from AA to tessellation, that people would choose to crank on early cards cause they were willing to trade the FPS for IQ. To each their own... and no doubt Cyberpunk does look great with RT; hopefully the devs get the patches out that make that game run better all round.
 
My thoughts exactly.

I foresee Ray Tracing going the way of PhysX & HairWorks. The technology is there, sure, but how many developers are willing to implement it correctly, and at what cost to them in the form of time and deadlines?
Ray tracing has the potential to cut back on a lot of developer work. It does things automatically, on the fly, that usually require months of "optimization" with teams of people checking textures and lighting and all the little things. In a fully ray-traced environment they don't do any of that: they import the objects, set material properties, and the engine handles the rest. It takes months of work and cuts it back to hours. I can assure you development studios want it to be the new normal.
The automation processes for ray tracing and its texture tools are currently one of the selling points for the Unreal 5 engine, which they are working on very closely with Disney.
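To illustrate the workflow difference being described (a purely conceptual toy, not engine code; every name here is made up): the baked path precomputes lighting offline and ships the result, while the traced path only ships material/light definitions and estimates lighting per hit at runtime.

```python
import random

# Toy 1D "scene": surface points lit by point lights with a made-up falloff.
LIGHTS = [2.0, 9.0]
POINTS = [0.0, 1.0, 3.0, 5.0, 8.0]

def falloff(p, light):
    return 1.0 / (1.0 + abs(p - light))   # stand-in for a real BRDF/visibility term

# Baked workflow: an offline tool precomputes lighting into a lightmap that
# ships with the game; the runtime only does a lookup.
def bake_lightmap(points, lights):
    return {p: sum(falloff(p, l) for l in lights) for p in points}

LIGHTMAP = bake_lightmap(POINTS, LIGHTS)   # the offline step the post says takes months

def shade_baked(p):
    return LIGHTMAP[p]

# Traced workflow: no bake step; lighting is estimated per hit at runtime by
# sampling the declared lights, so artists only author materials and lights.
def shade_traced(p, samples=16):
    total = 0.0
    for _ in range(samples):
        light = random.choice(LIGHTS)      # stochastic light sampling
        total += falloff(p, light) * len(LIGHTS)
    return total / samples

for p in POINTS:
    print(f"x={p}: baked={shade_baked(p):.3f}  traced~={shade_traced(p):.3f}")
```

The trade the thread keeps circling is visible even here: the baked result is free at runtime but frozen, while the traced result costs samples every frame but needs no precomputation.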
 
Ray tracing has the potential to cut back on a lot of developer work. It does things automatically, on the fly, that usually require months of "optimization" with teams of people checking textures and lighting and all the little things. In a fully ray-traced environment they don't do any of that: they import the objects, set material properties, and the engine handles the rest. It takes months of work and cuts it back to hours. I can assure you development studios want it to be the new normal.
The automation processes for ray tracing and its texture tools are currently one of the selling points for the Unreal 5 engine, which they are working on very closely with Disney.

That of course assumes we can get to 100% raytraced scenes any time soon, instead of 90+% raster with a tiny sprinkling of RT effects, like we have today.

Not even the 3090 is more than like 10% there from a performance perspective.
 
I'm getting more than 30 FPS in CP2077 with RT for sure, it's pretty stable around 50-60. I guess to me it shines, it works well and is playable.
With my RTX 3090 I'm getting 60-70 FPS average @ 3840x1600 (about 25% less pixels than 4K) with maxed settings including RT, and DLSS set to Balanced.

Looks amazing and plays great. People can shit on RT all they want, but when it is implemented well in a game it can really improve visuals despite the performance hit. And in a single-player, story driven game like CyberPunk 2077 I'm completely satisfied with 60 FPS+ with these types of visuals.
 