fuk the eyecandy - I want FPS and raster is good enough.
Fuck that. I don't buy $1,500 video cards so I can play games in potato mode.
Yeah, same here. I tried to get CP2077 running at high refresh, and I did it, but at the cost of all the graphical quality.
I've since just maxed out all settings (but put DLSS on ultra performance), and it looks amazing, but I have to live with around 45 fps in the city, or 75 fps indoors. I'll take it.
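For reference, "ultra performance" implies a much lower internal render resolution that DLSS then upscales. Here is a rough sketch of the math (Python; the per-axis scale factors are the commonly quoted DLSS 2.x values, so treat them as approximations rather than official numbers):

```python
# Rough sketch: approximate internal render resolution for DLSS 2.x quality modes.
# Scale factors are the commonly cited per-axis values (Quality ~0.667,
# Balanced ~0.58, Performance 0.5, Ultra Performance ~0.333); treat them as
# estimates, not an official spec.

DLSS_SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str):
    """Return the approximate resolution the GPU renders before upscaling."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

if __name__ == "__main__":
    for mode in DLSS_SCALE:
        w, h = internal_resolution(3840, 2160, mode)
        print(f"4K output, {mode:>17}: renders ~{w}x{h}")
```

At 4K output, ultra performance is rendering roughly 1280x720 internally, which is why the frame rate recovers so much even with everything else maxed.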
Fuk that I don't spend $1500 on a video card.
Sure, but someone buying a $600 USD video card probably cares a lot about eye candy (so much that they want above-1080p resolution and play with high details and so on).
Ya but that is what all of graphics are: eye candy. And when properly implemented to enhance games, it can be nice eye candy. Control looks pretty damn cool with RT. I wouldn't use it on a card that couldn't handle it; it doesn't look cool enough to play at 30fps. But while it DOES impact frame rate, I can still get 80-100fps (at 3840x1600), which is quite smooth.
Good is where I am at on it currently. RT is a checkbox not really returning anything for the huge performance trade-off. DLSS has not impressed either with its in-game artifacting.
Maybe they do, maybe they don't. Who's to say? Graphics Cards have always been a balance between candy, FPS and price.
FACT - Make no mistake - RT is EYECANDY! Jensen wants everyone to believe it's all raytraced now. FAR FROM IT!
FACT - Everything is still Raster Based. Imagine that.
FACT - Nvidia is lining their pockets with your cash on this effect. How does that 1080 play with RTX now, or ever? Like shit. And without dropping the internal resolution via DLSS upscaling, the latest cards can't cut it at 4K either.
FACT - SUPER LIMITED Selection of Games available for RTX and DLSS
OPINION - RTX "enhancement" is the most overrated and overhyped graphic effect ever. Nvidia has pulled off the greatest BS marketing hype scheme ever. Congrats, Jensen.
DLSS is a CRUTCH for Nvidia. 2.0 may be worthy, but everyone raved about the garbage-looking 1.0. That was in everyone's head, not reality.
FOR ME - Until we get to FULL raytracing of EVERYTHING...
RT stays OFF - GIMMICK.
Enjoy your $1500 GPUs with their limited selection of eye candy. I'll wait until RT is the real thing and not some "after-effect" with an absolute HUGE performance hit.
Low FPS frames are what count, not max.
It's understandable if you don't want to buy a new card to get it, but acting like it is a gimmick that people shouldn't care about is kinda silly. It is eye candy that does look good when well done, and that is why we have GPUs in the first place.
My thoughts exactly.
Ohhhh man, that sucks, because if you want full RT non-potato mode on a 3090.... you're still going to be doing that at 1080p. Welcome to the RT future.
Yeah, you're pretty much paying $1,500 for a "loaded" baked potato, but with Bacos bacon bits instead of real bacon. lol
Not that there is anything really wrong with gaming at 1080p.... and if you don't mind mini-potato mode, you can probably jump up to 1440p and select medium on whichever other non-potato settings you can stand downgrading on your $1,500 card.
Half kidding... but I mean, really, that is the RT reality today. It's the future, sure.... but current hardware is still forcing choices. No one is selling the loaded baked potato version quite yet.
I said it in another thread... but the future of RT is AMD Navi 2. Is it as powerful as Nvidia's? Drivers and software optimizations notwithstanding, probably not. However, it is the target for the next 3-4 years, and that might not be that bad.... it's powerful enough that games developed with it in mind from day one should still be able to pull off some very cool effects.
I think the key for developers will be understanding when RT is going to make for those eye-catching shadows and reflections.... and when the action is high and a light map will achieve 95% of the same effect and no one will really notice. Right now, basically every RT game we have gotten was designed with non-RT hardware in mind.... and for the most part RT was added after the fact by a couple of programmers (and probably an Nvidia staff programmer), where they basically just drop an RT lighting engine in as a replacement. Hopefully, with newer games developed with AMD's consoles in mind, artists earlier in the design pipeline can make intelligent decisions that give us the best of both worlds: fast light maps when the visual impact of RT is just not great enough for the performance cost.... and RT effects when the player will really notice the IQ.
I foresee Ray Tracing going the way of Physx & Hairworks. The technology is there sure, but how many developers are willing to implement it correctly, and at what cost to them in the form of time and deadlines?
I think we should not mix up Nvidia's specific ray tracing implementation with RT in general.
You don't understand very well how new technologies come about and are refined over time.
Having several technical degrees, I fully understand technology refinement, thank you. Yes, RT was the past and is the future; however, it's obviously NOT READY for real-time games. As Kyle eloquently said, boiling the current RT debate down to its base: "paying too much for too little". I guess everyone has their opinion; I've stated mine and exactly why.
I don't mean to say you should run out and buy a $1500 video card to use RT (I don't either), but you sure seem to be saying that nobody else should (or should be unable to, because it shouldn't be released until it's flawless) just because you don't find value in it currently. Then you expect it to still get refined for years with no ROI until it's flawless. Imagine the price of a card if they had 10+ years of R&D tied to it for a feature that never had any ROI during that time. That might work for some commercial and industrial markets, but not so much for the consumer market.
That's all without even considering the implications of the software developers' role in it. I guess it shouldn't be released 'til it's just a single variable for them to flip that magically inserts flawless RT into their game with minimal performance loss.
Do you know one person who only plays at the lowest possible details who buys this type of card? You have the pro-type players wanting 300 fps on their 240Hz+ monitors, yes, but that's an exception. Like you said, candy has always been part of the balance of graphics cards, almost by definition (i.e. you seem to fully agree that it is quite probable, though not certain, that people buying a 6800 XT instead of a 5700 XT/6800 care in some way about graphical quality, whether that's resolution above 720p or 1080p, not playing the game at the lowest settings, and so on, and not just shadows because they help 3D perception and make them better at the game).
I'm not sure what you are asking here, if anything. You assume too much from my response. I will say I prefer shadows that don't cut my framerates in half and bring my min fps below 60. If the fps hit is larger than that threshold, shadows get turned off, as they are not that crucial to any game.
Well, yes, and that's completely different from saying, in a thread comparing graphics cards sold for over $700 USD, that candy does not matter at all. Everyone always makes a compromise between candy and performance; everyone is like you (just drawing the line a little differently, but in a very similar ballpark).
And this is why Nvidia wants reviewers to focus on raytracing despite the vanishingly few titles that actually use it...
I suppose, but it was kind of a false hope to begin with. It was either make a competitive card in rasterization, make a competitive card for RT, or do both and be mediocre at both.
Like the 20xx series, it will take AMD a generation or two to shine with RT (hopefully).
Is that at native resolution or with DLSS?
So Nvidia fans proclaim "salt in the wound from AMD fans" over one game (Cyberpunk) and declare validation and victory for RTX?
Of course Nvidia wants reviewers to cover it! It is impressive tech.
AMD fans do NOT want to hear about it, as they are loudly proclaiming in the thread. Salt in the wound I suppose.
I played Quake2 RTX last summer, but it's an ancient game. Cyberpunk is the first game I have played where it really makes a difference.
Playing Cyberpunk on a 2080 Ti and the FPS is good. All you gotta do is take the cascaded shadows setting down one notch and performance practically doubles. It's not 30fps like the haters in the thread proclaim.
I don't understand the argument to flip RT on at the expense of more game-impacting IQ settings.... are you really going to drop texture filtering, AA, tessellation, or any number of other settings we have gotten used to being able to set to ultra on flagship hardware, just so you can enable RT lighting and shadows? The shadows don't look any better 9 times out of 10... and the lighting CAN be very cool, but most of the time it's hard to see the difference because you're busy playing a game, and the old raster way is pretty damn convincing when you're running in an FPS, or racing, etc.
This, that the cost in eye candy lost to get the RT eye candy is too big, is closer to what is going on than the original message that spawned the conversation. Take the new Dark Souls: what sacrifice in its really impressive presentation would it have needed to make to get some RT effects in there?
I am hoping AMD comes up with something that might not promise the silly "4k quality 1080p performance"
That's close to what people buying those new consoles are getting, and what they are doing there will probably become available on the PC side as well.
Ray tracing is worth it for me. I don't mind DLSS, I actually like the look it gives. More like watching a movie.
While I agree that for now 10GB is enough, I keep my cards for at least 3 years and I skip generation(s). My current card is an EVGA 1080 Ti FTW and the warranty is up in 4 months. It is also hard for me to give up a card with 11GB for one with 10GB.
The game that I play the most, Age of Empires 3: Definitive Edition, shows that it allocates/uses over 10GB from time to time, and yes, I understand it could be allocating and not using it, but for my peace of mind I prefer a card with a minimum of 12GB, since I plan to keep it until the warranty is near its end.
I really think Nvidia missed the boat this go-around, thinking the people that upgrade to the 3080 are 1080 owners. Or maybe they are really trying to push 1080 Ti owners into buying the 3090. I don't know, but the 3090 is too much money for me. I paid $800 with tax for my 1080 Ti; at most I would pay $1k this time around, so I am hoping for a 3080 Ti if that ever comes out.
If we were squabbling about a difference of 120 FPS vs 100 FPS.... I think I would be on team RT Now.
However, we are not talking about that.... we are talking about 200+ FPS vs 60-70 FPS (and probably a resolution drop) on $1,500 hardware.... we have halo cards that can run ultra settings at over 200 FPS, but you flip RT on and all of a sudden you can't hit the refresh rate on a $200 monitor (never mind that $1,000+ gaming beast you're likely to have if you spring for a halo card).
If you're in for the real flagship 3080/6800-class cards.... then you're talking more like 200 FPS vs 50-60 FPS, and no doubt a resolution bump at least one step down (unless you really are running a flagship card on a 1080p monitor). Perhaps you can bump that RT frame rate back up over 100 FPS IF... IF you are willing to drop things back from ultra/high settings into medium-setting territory.
I went from a 1080 Ti to a (thankfully free) 2080 Super (8GB) for a bit. I didn't have any issues at 4K in anything, not even in doom eternal with ultra nightmare (likely because I had a very high memory OC). It was a total side grade as they were both under AiO watercooling and heavily overclocked and they were both very close in performance (other than raytracing).
If you were upgrading to an 8GB card for 4K today, I'd say yeah not so smart. 10GB with the compression and memory streaming that is coming to games? Really shouldn't be an issue unless you mod games for ultra realistic (and unoptimized!) textures.
With that all said, I'm using a 3090 right now because it's over twice as fast as two 1080 Tis in SLI. I've got the beta Afterburner/RTSS which allows you to configure it to see actual in-use VRAM, and even CP2077 at 4K ultra RT preset with DLSS set to balanced only uses around 6 to 7GB (with up to 9GB allocated).
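For anyone without that RTSS beta, a quick way to watch the headline number is to poll nvidia-smi while the game runs. A minimal sketch, assuming the nvidia-smi CLI is on your PATH; note it reports allocated VRAM, so it will read closer to the 9GB figure than the 6-7GB actually in use:

```python
# Minimal sketch: poll nvidia-smi for total VRAM numbers while a game runs.
# Caveat: nvidia-smi reports *allocated* memory per GPU, so it will read higher
# than the "actual in-use" figure the beta Afterburner/RTSS build shows.
import subprocess
import time

def vram_used() -> list[str]:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip().splitlines()

if __name__ == "__main__":
    try:
        while True:
            for i, line in enumerate(vram_used()):
                print(f"GPU {i}: {line}")   # e.g. "7423 MiB, 24576 MiB"
            time.sleep(5)
    except KeyboardInterrupt:
        pass
```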
"Yes, you fully understand" followed right up with "it's the future but not ready." Again, how do you expect to get there from here? It requires involvement from both the hardware devs and software devs, plus lots of R&D that costs money. Are the hardware devs supposed to make secret hardware for the software devs to use with their secret software, and then both somehow recoup the cost of several generations of R&D in one generation of product once it's Ready™ by whoever's standards, without consumers falling over at the price? Automakers started putting (mechanical) fuel injection on cars in the '50s; they didn't just somehow arrive at modern EFI in the 2000s without all the previous iterations, or secretly work on it for 50+ years with no ROI in that time.
The 3000 NV cards don't shine with RT either. Let's all just face facts: RT is a 30-60 FPS technology today, and it probably will be for at least another generation.
I'm getting more than 30 FPS in CP2077 with RT for sure; it's pretty stable around 50-60. I guess to me it shines: it works well and is playable.
I am just not sure 10GB will still be enough 3 years from now, and I am sure no one is willing to guarantee that for me. Hey, do you have a link for that Afterburner beta so I can download and try it out?
I can't guarantee that it will, but it sure as hell looks like it. Or I could say that I guarantee it, but why not read up on it yourself?
Fair... you are getting around 60 fps.... with a flagship card and a brand-spanking-new Ryzen. I am sure that is playable... for my money, though, if I'm buying the fastest card on offer (we can forget the 3090, I think), I am not sure I'm happy with 60 fps. Sure, with a decent G-Sync monitor that is probably still fairly smooth looking. Still just seems painful.... I know Cyberpunk might just be a mess optimization-wise, but it still makes me wonder what the next big game will run like down the line. Which is why I think RT is just a nice-to-have at best right now.... of course, I have mentioned a few times now that I expect game developers will probably find a lighter touch with RT anyway. So perhaps this generation will age better than it would seem out of the gate. Out of the gate I would be worried the next big game would run at 30 FPS.... but yeah, if developers design with Navi 2 in mind first, perhaps the 3080's RT performance in the next big game will be acceptable.
Ray tracing has the potential to cut back on a lot of developer work. It does things automatically on the fly that usually require months of "optimization" with teams of people checking textures and lighting and all the little things. In a fully ray-traced environment they don't do any of that: they import the objects, set material properties, and the engine handles the rest. It takes months of work and cuts it back to hours. I can assure you development studios want it to be the new normal.
The automation that raytracing and its texture tools bring is currently one of the selling points of the Unreal 5 engine, which they are developing in close collaboration with Disney.
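To make the "import the objects, set materials, and the engine handles the rest" point concrete, here is a toy sketch (illustrative only, not any real engine's API): with baked lighting, an offline pass has to precompute and store light visibility per texel, while a ray-traced renderer just casts a shadow ray whenever it needs the answer.

```python
# Toy illustration (not any real engine's API): direct-light visibility for points
# on a ground plane, with a single sphere casting a shadow.
# - "Baked" path: an offline pass precomputes visibility into a lightmap grid.
# - "Ray-traced" path: at runtime we simply cast a shadow ray toward the light.
import math

LIGHT = (0.0, 5.0, 0.0)                       # point light above the scene
SPHERE_C, SPHERE_R = (0.0, 1.5, 0.0), 1.0     # occluding sphere

def shadow_ray_blocked(point, light, center, radius) -> bool:
    """Cast a ray from `point` toward `light`; True if the sphere blocks it."""
    d = tuple(light[i] - point[i] for i in range(3))   # un-normalised direction
    o = tuple(point[i] - center[i] for i in range(3))
    a = sum(di * di for di in d)
    b = 2 * sum(oi * di for oi, di in zip(o, d))
    c = sum(oi * oi for oi in o) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return False
    t = (-b - math.sqrt(disc)) / (2 * a)
    return 1e-4 < t < 1.0   # hit must lie between the surface point and the light

def lit_by_ray_tracing(point) -> float:
    """Runtime query: 1.0 if the light is visible from `point`, else 0.0."""
    return 0.0 if shadow_ray_blocked(point, LIGHT, SPHERE_C, SPHERE_R) else 1.0

def bake_lightmap(resolution=8, extent=4.0):
    """Offline pass: precompute and store visibility for a grid of ground texels.
    (Baking is essentially the same visibility test, run ahead of time and stored.)"""
    texel = 2 * extent / resolution
    return {
        (ix, iz): lit_by_ray_tracing((-extent + (ix + 0.5) * texel, 0.0,
                                      -extent + (iz + 0.5) * texel))
        for ix in range(resolution) for iz in range(resolution)
    }

if __name__ == "__main__":
    baked = bake_lightmap()   # the build-time work a fully ray-traced pipeline skips
    shadowed = sum(1 for v in baked.values() if v == 0.0)
    print(f"baked lightmap: {shadowed} of {len(baked)} texels in shadow")
    print("runtime ray-traced query at (0.3, 0, 0.1):", lit_by_ray_tracing((0.3, 0.0, 0.1)))
```

The offline bake is the same visibility test run ahead of time and stored, which is exactly the artist and build-time work a fully ray-traced pipeline gets to skip.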
With my RTX 3090 I'm getting 60-70 FPS average @ 3840x1600 (about 25% fewer pixels than 4K) with maxed settings including RT, and DLSS set to Balanced.