demondrops
Limp Gawd
Joined Jul 7, 2016 · Messages: 422
Someone posted a video on reddit showcasing the 2060's RT capabilities at 1080p, calling it a true 60 fps avg. card.
I flat out said RT is a scam and only the 2080 Ti is a true RT card, and even then I'd have some issues calling it that, since no game really uses all the RT features, and a console game like Red Dead 2 can showcase that kind of graphics even on console. And the downvotes are flowing.
I can barely maintain a 70 fps average in Metro Exodus maxed out at 1440p, and they want to say the 2060 is a good RT card. Am I really that wrong? Now I think I'm asking the wrong crowd. The vibe I got here is that RT is a bit of hype: however good it can look in something like Metro Exodus, you still pay a premium in fps. It doesn't add RT as a feature, it scavenges non-RT performance, so when you use RT you pay for both, where those RT cores should add it as a feature and not at a cost to non-RT performance (or take something away from RT performance, for that matter). Is it really wrong of me to think this way?
It's like resolution doesn't matter anymore. RT doesn't increase the graphics in any way, it just makes them look better. So you shouldn't really lose non-RT performance and still pay more for it. You get RT that scavenges non-RT performance just to run RT.
They pretty much say cannibalizing rasterization performance to achieve RT performance is a good deal... it would be, if you didn't have to.