Developer sees no reason to add promised RTX support

The problem with your argument is you're looking at the best-case scenario for Nvidia. The 1080 was never really $700 unless you wanted the Founders Edition. The AIB versions started at $599 and dropped further after the 1080 Ti launch. Likewise, the 2080 non-Super hit in between with an asinine $799 price point when the 1080 Ti was $699. It's not until recently that we've had the performance boost you're talking about.

I paid $700 for my current Gigabyte 1080 in August 2016.

Yes, the Turing price situation was much worse last year and only recently became somewhat palatable. My point is that RTX is not the only reason and probably not the main reason Turing is so expensive.
 
I have to agree. I know a good portion of users here love the tech behind ray tracing and understand that it's the way of the future, but that's not enough for me as a consumer, where overall performance and price-to-performance are the main concerns. Next in priority is maybe AA performance, as it benefits and is visually noticeable in all games. In most of the games I play I want high enough FPS to match my monitor (120 Hz 1440p or 60 Hz 4K); if that is met, then I want to increase AA. Oftentimes I will turn off or lower any feature that affects FPS (ambient occlusion, shadows, lighting), because in fast-moving games I just don't see the difference with those features on, versus AA where I do see a difference. If I have FPS to spare, then that stuff gets turned back on.

TLDR: I'm not paying extra for a feature I'm gonna use in maybe 10-20% of the games I play (adventure/RTS games where 45 fps is very acceptable). If I did have an RTX card, I'd turn RTX on in a game to see it in action, think "very cool," and then turn it back off for normal gameplay.
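For reference, the prioritization described above (hit the refresh target first, then spend leftover headroom on AA, then the extras) is basically a greedy pass over a per-feature cost model. Here is a toy sketch in C++; every setting name, cost percentage, and threshold is made up purely to illustrate the order of decisions, not taken from any real game:

```cpp
#include <cstdio>

struct Settings {
    int  aaLevel = 0;              // 0 = off, higher = better AA
    bool ambientOcclusion = false;
    bool softShadows = false;
};

// Pretend cost model: each feature eats a slice of the base frame rate.
double estimatedFps(const Settings& s, double baseFps)
{
    double fps = baseFps;
    fps *= 1.0 - 0.05 * s.aaLevel;           // ~5% per AA step (made up)
    if (s.ambientOcclusion) fps *= 0.90;     // ~10% (made up)
    if (s.softShadows)      fps *= 0.88;     // ~12% (made up)
    return fps;
}

Settings tune(double baseFps, double targetFps)
{
    Settings s;
    // 1) Raise AA while the FPS target still holds.
    while (s.aaLevel < 4) {
        Settings next = s;
        ++next.aaLevel;
        if (estimatedFps(next, baseFps) < targetFps) break;
        s = next;
    }
    // 2) Re-enable the extras only with spare headroom.
    Settings ao = s; ao.ambientOcclusion = true;
    if (estimatedFps(ao, baseFps) >= targetFps) s = ao;
    Settings sh = s; sh.softShadows = true;
    if (estimatedFps(sh, baseFps) >= targetFps) s = sh;
    return s;
}

int main()
{
    Settings s = tune(/*baseFps=*/170.0, /*targetFps=*/120.0);
    std::printf("AA=%d AO=%d shadows=%d\n", s.aaLevel, s.ambientOcclusion, s.softShadows);
}
```

With those invented numbers it maxes AA, re-enables AO, and leaves shadows off, which matches the priority order described.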
 
I wouldn't be shocked if they agreed to implement it before finding out the performance implications, or that only the two top-tier GPUs in the RTX lineup would be capable of doing even a mediocre job of running it, lol.

I appreciate the dev's honesty. Essentially, "We have bugs to fix and content to add. We may add this eventually, but it's at the bottom of our list."

Small game developer... gets a visit from some reps with Volta development platforms and a bag of money. "Here, use these. They won't be fast, but it will work... our next card is going to pump up the volume."

Then they get actual RTX cards 6 months to a year later... and they realize only the top of the food chain RTX card is going to be able to even turn RT on with their implementation.

I find this very easy to believe. A AAA house probably would know the game better and keep their implementation super conservative until they saw what NV was cooking for end users. A smaller developer could easily get swept up in the hype and swing for an aggressive implementation.
 
TLDR: I'm not paying extra for a feature I'm gonna use in maybe 10-20% of the games I play (adventure/RTS games where 45 fps is very acceptable). If I did have an RTX card, I'd turn RTX on in a game to see it in action, think "very cool," and then turn it back off for normal gameplay.

That is my biggest concern with RT tech: game types. All the games people seem to be getting excited about are not games where I would be willing to drop FPS (shooters and competitive stuff). As much as I want to play Cyberpunk like everyone else, I am wondering, even if we are able to get 100+ fps with RTX on, is it even going to be that noticeably better than a light map unless you're taking screenshots?

The other games that take up my time are the adventure/puzzle/strategy games. The developers don't see the value in RTX (and neither do I, really); in those types of games the visuals tend to skew to artsy, unrealistic, fantastical styles, in which case realistic lighting and reflections are probably completely counter to what they are trying to achieve anyway.

I do think this tech will be around for a while and get better... I am just not convinced developers in general feel it's a major feature to implement until the hardware really improves. I am not holding my breath for the next gen either. I would expect the next-gen consoles will be the bar developers shoot for... for years. So I'm thinking at most we will get a few areas with RT reflections, and some smaller bits with global lighting. I doubt we get a ton of the big, massive, hyper-realistic open-world games with everything traced that NV's marketing dept would have everyone buying an RTX card believe are weeks/months away.

RTX will be a decent ambient-type effect used sparingly by devs for a long time yet, IMO.
 
I don't disagree with their statement in the least. I'm glad they are focusing on the actual game rather than tacking on graphical eye candy that very few have the ability to run, and even fewer can run at acceptable performance.
 
That is my biggest concern with RT tech: game types. All the games people seem to be getting excited about are not games where I would be willing to drop FPS (shooters and competitive stuff). As much as I want to play Cyberpunk like everyone else, I am wondering, even if we are able to get 100+ fps with RTX on, is it even going to be that noticeably better than a light map unless you're taking screenshots?

The other games that take up my time are the adventure/puzzle/strategy games. The developers don't see the value in RTX (and neither do I, really); in those types of games the visuals tend to skew to artsy, unrealistic, fantastical styles, in which case realistic lighting and reflections are probably completely counter to what they are trying to achieve anyway.

I do think this tech will be around for a while and get better... I am just not convinced developers in general feel it's a major feature to implement until the hardware really improves. I am not holding my breath for the next gen either. I would expect the next-gen consoles will be the bar developers shoot for... for years. So I'm thinking at most we will get a few areas with RT reflections, and some smaller bits with global lighting. I doubt we get a ton of the big, massive, hyper-realistic open-world games with everything traced that NV's marketing dept would have everyone buying an RTX card believe are weeks/months away.

RTX will be a decent ambient-type effect used sparingly by devs for a long time yet, IMO.
This is why I won't be playing Cyberpunk on release.
There won't be a card good enough to play it at its best; that will appear later in the year, hopefully.
It might even need longer.
At least most of the bugs will be ironed out and it will be cheaper by then.
 
That is my biggest concern with RT tech: game types. All the games people seem to be getting excited about are not games where I would be willing to drop FPS (shooters and competitive stuff). As much as I want to play Cyberpunk like everyone else, I am wondering, even if we are able to get 100+ fps with RTX on, is it even going to be that noticeably better than a light map unless you're taking screenshots?

The other games that take up my time are the adventure/puzzle/strategy games. The developers don't see the value in RTX (and neither do I, really); in those types of games the visuals tend to skew to artsy, unrealistic, fantastical styles, in which case realistic lighting and reflections are probably completely counter to what they are trying to achieve anyway.

I do think this tech will be around for a while and get better... I am just not convinced developers in general feel it's a major feature to implement until the hardware really improves. I am not holding my breath for the next gen either. I would expect the next-gen consoles will be the bar developers shoot for... for years. So I'm thinking at most we will get a few areas with RT reflections, and some smaller bits with global lighting. I doubt we get a ton of the big, massive, hyper-realistic open-world games with everything traced that NV's marketing dept would have everyone buying an RTX card believe are weeks/months away.

RTX will be a decent ambient-type effect used sparingly by devs for a long time yet, IMO.
But you can always admire the pretty reflections as you are lying dead on the ground? :)

I agree that RTX would be good for slow games, like the puzzle-solving type.
 
But you can always admire the pretty reflections as you are lying dead on the ground? :)

I agree that RTX would be good for slow games, like the puzzle-solving type.

lol. As stupid as it sounds... yeah, give me a 3D version of Bejeweled or something where the reflections bounce off the shiny game pieces. Give me the option to load custom backgrounds that reflect off the shiny 3D gems as I match them. Better yet, take webcam input and calculate reflections. Hmmm, okay, seriously, that might be cool. lol
 
Right, but for the FIRST few years it took one of your SLI cards, or a dedicated PhysX card, to run it properly and keep it from bogging your system down. It's only when it was developed and trimmed to run as a CPU task that it got fully implemented in the last couple of years.

Nvidia fucked up majorly with the 20 series.

#1 Little to no SLI support. When you have a new hardware-based implementation in its first gen, you should push SLI and help developers support it. Sell more cards, like when PhysX came out. Duh, more money, and you'd be able to run RTX the way it's meant to be run!

#2 Pricing: Well, you just couldn't help but be first to market, and sales were and still are stagnant in the home PC market because of the higher cost to develop this card, which really should have been on the 7nm or 7nm+ process. But hey, the We Were First mentality. $1200 for the Ti version. Have you lost your fucking mind? That's a nice house payment or 3 to 4 car payments. Bullshit. If it wasn't for PayPal and the credit-spending gen, that card would still be sitting on shelves, full stock. And don't tell me you spent cash. You either live in mom's house or are rich with nothing better to do.

#3 Piss-poor management that should have gotten Huang a fine or a reprimand from his peers/board members. When your top-tier card, the 2080, comes out and is on par with a 1080 Ti, that equals lost sales. Look at how many of us bought 1080 Tis at $650 or less when the 2080 dropped. And I still have no reason to buy anything new. Yes, I know they have other areas of interest like AI and the automobile industry, but don't treat the gaming community as an afterthought when it built your company. Yes, we did.

#4 Should have concentrated on gaming now, not later. The 20 series should all have been able to run any game at 4K and 60 fps, with the higher-tier cards pushing 100 to 120 fps. They merged their compute cards with gaming, which was the wrong decision; the two should have stayed separate till 7nm or better was achieved. A la a PhysX-like card, or like numbers 1 and 2, where you could afford two cards.

Well, that's my opinion and my overall observation. Disagree? Cool. That's just how I feel about why Nvidia failed with gamers.
Good rant... but you completely avoided my point that PhysX didn't die; they eventually went open source and it's still in use. I didn't comment on anything else your rant entailed, thanks for staying on topic. Nvidia did a crap thing with PhysX trying to lock consumers in, disabling it if it detected an AMD card, etc. I'm not agreeing with their tactics, because they were crappy then, just like a lot of choices they made. You bring up using their compute cards for gaming like it matters. You do realize people buy cards for graphics, right? Nvidia on 12nm are still on par in performance per transistor; it's not like they are slow cards. I think Nvidia is a crappy company and I own all AMD cards, so don't take this as Nvidia fanboyism, but just use logic instead of throwing random stuff out; it makes you less credible when you can't hold a real conversation.

PS: I'm a grown man with 4 kids, a wife, and 2 mortgage payments... and I could go out tomorrow and buy a 2080 Ti if I wanted to... I choose not to because the price is stupid for what you get and not worth it to me. There are plenty who think it is worth it; some don't even care about ray tracing but wanted it for the highest frame rates, which it has. I don't look down on anyone for buying it; it's their money and their choice what they want. Stop being so judgmental; you make all the AMD fans look crazy.
 
Good rant... but you completely avoided my point that PhysX didn't die; they eventually went open source and it's still in use. I didn't comment on anything else your rant entailed, thanks for staying on topic. Nvidia did a crap thing with PhysX trying to lock consumers in, disabling it if it detected an AMD card, etc. I'm not agreeing with their tactics, because they were crappy then, just like a lot of choices they made. You bring up using their compute cards for gaming like it matters. You do realize people buy cards for graphics, right? Nvidia on 12nm are still on par in performance per transistor; it's not like they are slow cards. I think Nvidia is a crappy company and I own all AMD cards, so don't take this as Nvidia fanboyism, but just use logic instead of throwing random stuff out; it makes you less credible when you can't hold a real conversation.

PS: I'm a grown man with 4 kids, a wife, and 2 mortgage payments... and I could go out tomorrow and buy a 2080 Ti if I wanted to... I choose not to because the price is stupid for what you get and not worth it to me. There are plenty who think it is worth it; some don't even care about ray tracing but wanted it for the highest frame rates, which it has. I don't look down on anyone for buying it; it's their money and their choice what they want. Stop being so judgmental; you make all the AMD fans look crazy.
Hardware-level PhysX DID die. It was a poorly optimized joke from the beginning. They made it look like you could not have waving flags, flying debris, or even sparks without hardware PhysX, which of course is laughable. They simply gimped effects on purpose unless you enabled hardware PhysX. And most of the effects looked unnatural, with the same chunks used for debris in pretty much any scene. And it runs like garbage even today, as GPU usage actually drops when doing effects, so it does not matter how much GPU power you throw at it. My GTX 1080 Ti still dips into the 40s at 1080p during heavy PhysX scenes in Borderlands: The Pre-Sequel, and it runs no better than it did on my GTX 780 from 2013.
 
Hardware-level PhysX DID die. It was a poorly optimized joke from the beginning. They made it look like you could not have waving flags, flying debris, or even sparks without hardware PhysX, which of course is laughable. They simply gimped effects on purpose unless you enabled hardware PhysX. And most of the effects looked unnatural, with the same chunks used for debris in pretty much any scene. And it runs like garbage even today, as GPU usage actually drops when doing effects, so it does not matter how much GPU power you throw at it. My GTX 1080 Ti still dips into the 40s at 1080p during heavy PhysX scenes in Borderlands: The Pre-Sequel, and it runs no better than it did on my GTX 780 from 2013.

My opinion:

Replace PhysX with Raytracing and come back to this post 6 years from now.
 
Hardware-level PhysX DID die. It was a poorly optimized joke from the beginning. They made it look like you could not have waving flags, flying debris, or even sparks without hardware PhysX, which of course is laughable. They simply gimped effects on purpose unless you enabled hardware PhysX. And most of the effects looked unnatural, with the same chunks used for debris in pretty much any scene. And it runs like garbage even today, as GPU usage actually drops when doing effects, so it does not matter how much GPU power you throw at it. My GTX 1080 Ti still dips into the 40s at 1080p during heavy PhysX scenes in Borderlands: The Pre-Sequel, and it runs no better than it did on my GTX 780 from 2013.
Yes, hardware PhysX isn't what it used to be; it's around because Nvidia finally open-sourced it, and it's one of the most-used physics engines in games. There are plenty of first attempts that got refined, redone, and reused in other fashions. Again, I'm not a fan of how Nvidia did the whole PhysX thing, but that doesn't make it dead just because you or I didn't like it. The major version of PhysX runs on shaders on both Nvidia and AMD hardware, although developers are still able to utilize hardware PhysX if they want.
 
If they used RTX as a selling feature of the game then they need to add it. If it's just something they talked about after the game was released then I get why priorities change.
 
My opinion:

Replace PhysX with Raytracing and come back to this post 6 years from now.
I give it less time because it's not locked into specific hardware, being a DX12 specification, so we most likely won't have to wait as long. It won't be dead; it will be DXR, and Nvidia may move on and stop calling it RTX, or they may not. I also doubt it'll move to software given the necessary power required.
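For what it's worth, "it's a DX12 specification, not specific hardware" is exactly how DXR shows up to a programmer: you ask D3D12 for a raytracing feature tier rather than checking for an Nvidia card. A minimal sketch, assuming a Windows SDK recent enough to define the OPTIONS5 feature struct and linking against d3d12.lib:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter; any D3D12-capable GPU will do here.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    // DXR support is reported as a raytracing tier in the OPTIONS5 feature data,
    // regardless of which vendor made the GPU or what the feature is branded as.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5)))
        && opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::printf("DXR supported (raytracing tier value %d).\n",
                    static_cast<int>(opts5.RaytracingTier));
    } else {
        std::printf("D3D12 device present, but no DXR support.\n");
    }
    return 0;
}
```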
 
My opinion:

Replace PhysX with Raytracing and come back to this post 6 years from now.

No question that RT is the imminent future. Next year both new consoles get HW RT support, and no doubt some AMD GPUs will as well. So it will be everywhere.

As for the original post: I have no problem with developers waiting before they get on board.

It's just that they shouldn't promise features (any feature, not just RT) they don't really have any intention of supporting in the near future.

Better to under-promise and over-deliver than the other way around.
 
Hardware-level PhysX DID die. It was a poorly optimized joke from the beginning. They made it look like you could not have waving flags, flying debris, or even sparks without hardware PhysX, which of course is laughable. They simply gimped effects on purpose unless you enabled hardware PhysX. And most of the effects looked unnatural, with the same chunks used for debris in pretty much any scene. And it runs like garbage even today, as GPU usage actually drops when doing effects, so it does not matter how much GPU power you throw at it. My GTX 1080 Ti still dips into the 40s at 1080p during heavy PhysX scenes in Borderlands: The Pre-Sequel, and it runs no better than it did on my GTX 780 from 2013.

It's not dead, it's just breathing funny.
 
No question that RT is the imminent future. Next year both new consoles get HW RT support, and no doubt some AMD GPUs will as well. So it will be everywhere.

As for the original post: I have no problem with developers waiting before they get on board.

It's just that they shouldn't promise features (any feature, not just RT) they don't really have any intention of supporting in the near future.

Better to under-promise and over-deliver than the other way around.
From the sounds of it, they had the intention of supporting it until they saw how much effort it was and how bad the frame rates were, and then decided to divert resources to more useful things.
 
They simply gimped effects on purpose unless you enabled hardware PhysX. And most of the effects looked unnatural, with the same chunks used for debris in pretty much any scene.

Yeah, there were some really cheesy PhysX effects. I think some of it is now rolled into GameWorks, but Nvidia isn't pushing hardware PhysX like they did a few years ago.

We shouldn't pretend physics is a solved problem though. Proper cloth and fluid dynamics are basically non-existent in games. Yes, we have a few flags waving in the wind, but clothes etc. are still baked on as static textures. Hair is still a mess. It seems people have given up on physics acceleration for now, but it's almost as important as proper lighting.

Hopefully with 12- and 16-thread CPUs becoming commonplace we will make progress there soon.
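On the "more CPU threads could help physics" point, cloth is the kind of workload that splits across cores easily: a mass-spring grid where each particle integrates independently. A bare-bones sketch, not taken from any shipping engine; the constants are illustrative and the constraint-relaxation pass a real solver needs is only noted in a comment:

```cpp
#include <vector>
#include <thread>
#include <algorithm>

struct Vec3 { float x, y, z; };

struct Cloth {
    int w, h;                    // grid resolution
    std::vector<Vec3> pos, prev; // current and previous positions (Verlet state)
};

static void integrateRows(Cloth& c, int row0, int row1, float dt)
{
    const Vec3 gravity{0.0f, -9.81f, 0.0f};
    for (int y = row0; y < row1; ++y)
        for (int x = 0; x < c.w; ++x) {
            int i = y * c.w + x;
            Vec3 p = c.pos[i], q = c.prev[i];
            // Verlet: x' = 2x - x_prev + a*dt^2 (the pinned top row stays fixed).
            if (y > 0) {
                c.pos[i] = { 2*p.x - q.x + gravity.x*dt*dt,
                             2*p.y - q.y + gravity.y*dt*dt,
                             2*p.z - q.z + gravity.z*dt*dt };
            }
            c.prev[i] = p;
        }
}

void step(Cloth& c, float dt, unsigned threads)
{
    // Split the grid into horizontal bands, one per worker thread; each thread
    // only touches its own rows, so no synchronization is needed here.
    std::vector<std::thread> pool;
    int band = (c.h + static_cast<int>(threads) - 1) / static_cast<int>(threads);
    for (unsigned t = 0; t < threads; ++t) {
        int r0 = static_cast<int>(t) * band;
        int r1 = std::min(c.h, r0 + band);
        if (r0 < r1) pool.emplace_back(integrateRows, std::ref(c), r0, r1, dt);
    }
    for (auto& th : pool) th.join();
    // A real solver would follow this with several constraint-relaxation passes
    // to keep neighboring particles at rest length; omitted here for brevity.
}
```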
 
Meanwhile, a lot of other developers are adding RTX support: https://babeltechreviews.com/ray-tracing-news-from-gamescom-2019/

I can also understand a developer concentrating on something else if they can't get it running fast enough, or if it is taking too many resources from more important development. RTX at this point is a high-end niche feature. I do hope that going forward, RTX support is not something patched in later but available right at release. Performance issues will be resolved by future GPUs; right now only the 2080 Ti is really good enough for ray tracing, but it's important to get support for it as early as possible, as that leads to further development, optimization, and usage.

People have gotten so used to being able to get massive framerates from games that when a feature equivalent to Crysis 1 comes along and totally tanks performance, they cry. We need more software that truly pushes the hardware beyond what is feasible at the time of release. Not everyone needs to play at ultra graphics, and coming back to an older game a few years down the line and seeing it running well on your RTX 4080 Ti is pretty satisfying. Since ray tracing can be turned off like any graphics feature, you can make the decision based on what you prefer and what gives you a satisfying gaming experience.
 
Meanwhile, a lot of other developers are adding RTX support: https://babeltechreviews.com/ray-tracing-news-from-gamescom-2019/

I can also understand a developer concentrating on something else if they can't get it running fast enough, or if it is taking too many resources from more important development. RTX at this point is a high-end niche feature. I do hope that going forward, RTX support is not something patched in later but available right at release. Performance issues will be resolved by future GPUs; right now only the 2080 Ti is really good enough for ray tracing, but it's important to get support for it as early as possible, as that leads to further development, optimization, and usage.

People have gotten so used to being able to get massive framerates from games that when a feature equivalent to Crysis 1 comes along and totally tanks performance, they cry. We need more software that truly pushes the hardware beyond what is feasible at the time of release. Not everyone needs to play at ultra graphics, and coming back to an older game a few years down the line and seeing it running well on your RTX 4080 Ti is pretty satisfying. Since ray tracing can be turned off like any graphics feature, you can make the decision based on what you prefer and what gives you a satisfying gaming experience.
Ironic that you bring up Crysis, as that game is far from optimized. Most modern games look orders of magnitude better and run better to boot. Crysis can't even always maintain 60 fps with the fastest modern CPUs available, as it does not really scale well past 2 cores and it was way too CPU-dependent in spots.

And this notion of going back and playing older games on newer hardware just to try to run them on the settings you wanted years ago seems silly. Games are usually so outdated that it seems ridiculous, and many modern games will still run better while looking miles better at the same time. And really, many games today that run poorly are actually some of the ugliest games, using outdated engines.
 
Meanwhile, a lot of other developers are adding RTX support: https://babeltechreviews.com/ray-tracing-news-from-gamescom-2019/

I can also understand a developer concentrating on something else if they can't get it running fast enough, or if it is taking too many resources from more important development. RTX at this point is a high-end niche feature. I do hope that going forward, RTX support is not something patched in later but available right at release. Performance issues will be resolved by future GPUs; right now only the 2080 Ti is really good enough for ray tracing, but it's important to get support for it as early as possible, as that leads to further development, optimization, and usage.

People have gotten so used to being able to get massive framerates from games that when a feature equivalent to Crysis 1 comes along and totally tanks performance, they cry. We need more software that truly pushes the hardware beyond what is feasible at the time of release. Not everyone needs to play at ultra graphics, and coming back to an older game a few years down the line and seeing it running well on your RTX 4080 Ti is pretty satisfying. Since ray tracing can be turned off like any graphics feature, you can make the decision based on what you prefer and what gives you a satisfying gaming experience.
All the games with RTX reflections look way off when it's on; the off version looks more realistic, and the RTX version looks like overdone mirrors placed in the world. Walking around in the real world, reflections are way less bright in most cases, with more noise and subtle distortions and inconsistencies; RTX makes it look like a house of mirrors instead. I would also like to see the on/off versions with the resolution and FPS as well :D. Nvidia could be more honest about RTX. It's good that developers are using DXR; maybe by the time hardware can really support it well, they will be good at it.
 
Ironic that you bring up Crysis, as that game is far from optimized. Most modern games look orders of magnitude better and run better to boot. Crysis can't even always maintain 60 fps with the fastest modern CPUs available, as it does not really scale well past 2 cores and it was way too CPU-dependent in spots.

And this notion of going back and playing older games on newer hardware just to try to run them on the settings you wanted years ago seems silly. Games are usually so outdated that it seems ridiculous, and many modern games will still run better while looking miles better at the same time. And really, many games today that run poorly are actually some of the ugliest games, using outdated engines.

I've gone back to a lot of older games to play them again with better hardware. Crysis was made to work with the hardware available at the time, which wasn't 4+ cores, but it still took years for GPUs to catch up and be able to run the game well. It was for a long time a benchmark in visuals too.
 
Meanwhile, a lot of other developers are adding RTX support: https://babeltechreviews.com/ray-tracing-news-from-gamescom-2019/

I can also understand a developer concentrating on something else if they can't get it running fast enough, or if it is taking too many resources from more important development. RTX at this point is a high-end niche feature. I do hope that going forward, RTX support is not something patched in later but available right at release. Performance issues will be resolved by future GPUs; right now only the 2080 Ti is really good enough for ray tracing, but it's important to get support for it as early as possible, as that leads to further development, optimization, and usage.

People have gotten so used to being able to get massive framerates from games that when a feature equivalent to Crysis 1 comes along and totally tanks performance, they cry. We need more software that truly pushes the hardware beyond what is feasible at the time of release. Not everyone needs to play at ultra graphics, and coming back to an older game a few years down the line and seeing it running well on your RTX 4080 Ti is pretty satisfying. Since ray tracing can be turned off like any graphics feature, you can make the decision based on what you prefer and what gives you a satisfying gaming experience.
Play at ultra graphics? Naw, 8-bit is fine...

Why would you not want to play at ultra graphics if you are capable of it? I guess if you want to reduce it and have RTX on. Like it has been said before, RTX is good for slow-moving games. Maybe it's just me. I grew up with a VIC-20 and went from there. I want my awesome graphics!
 
Play at ultra graphics? Naw, 8-bit is fine...

Why would you not want to play at ultra graphics if you are capable of it? I guess if you want to reduce it and have RTX on. Like it has been said before, RTX is good for slow-moving games. Maybe it's just me. I grew up with a VIC-20 and went from there. I want my awesome graphics!

Here's what NV should have done:
Sponsored Ultima Underworld 3, with RT. You're welcome. Start hammering out checks.

RT in its current incarnations will be too big a performance hit for twitch games, but we're also in a world where that's the genre which spawns next-gen graphics. So we're in this weird badlands right now.

I too love awesome graphics, but what that means depends upon the title. For competitive online shooters, I'll turn dang near everything off already for FPS. But for pretty much anything else, I'd sacrifice FPS for fidelity. And I do right now, turning ambient occlusion, shadows, lighting, and other RT "fakes" to their max. A few more frames don't matter, but I like a great-looking world.

UU2 ran like poo on the PC I had at the time; arguably, we're already used to slow FPS... ;)
 
https://www.guru3d.com/news-story/n...r-lots-of-rtx-on-based-games-at-gamescom.html

I was going to post this separately, but RTX threads always turn into a trash fest, so I'm just adding it to this one.
OK, what settings did they disable to make RTX look better? Call of Duty has shadows turned off with RTX off... like there are no other alternatives for shadows that have been done for years. I wish they would stop making claims and just legitimately show it. How does it perform, and how is the quality compared to the game at high quality with RTX off?
Could you imagine if AMD put images of RIS up, compared against a gimped version of the game, and then wouldn't show frame rates?
The funny part is, I am actually a fan of RT and think it's great that Nvidia is pushing these boundaries. I just hate their marketing. Everyone knows new features that push boundaries are going to incur a performance hit. Some are OK with it in some games, others not so much. They have a single card that can maybe play some games with it on, yet the CEO and all his followers act like you can't live without it. It's a neat feature for early adopters who can afford it and can use it in games they like. That's it. It's not currently for everyone, and if you're shopping for anything less than a 2080 Super, RTX is mostly useless at the moment. I can't comment on the games coming out; no frame rates were given. You can't say something is the best thing since sliced bread based on games that have yet to be released or even tested. I hope these games are half as good as all the Nvidia fans keep telling us they will be, but forgive me if I don't just take the word of someone who hasn't seen the game, gotten hands-on, or helped in any development.
 
Sounds more like they want a bigger paycheck from nVidia
I didn't get that at all; it sounded more like they were promised something that turned out not to be as good as advertised... With limited resources, they have more important things to focus on before spending them on a feature that less than 1% of their user base can use. I'm sure they were promised it would be super easy to do, with tons of support, and would run great on anything from a 2060 up. Then reality hit... Sure, it could run on a 2060, at 1080p and 20 fps... and it took way more effort than initially thought.
 
The problem with your argument is you're looking at the best-case scenario for Nvidia. The 1080 was never really $700 unless you wanted the Founders Edition. The AIB versions started at $599 and dropped further after the 1080 Ti launch. Likewise, the 2080 non-Super hit in between with an asinine $799 price point when the 1080 Ti was $699. It's not until recently that we've had the performance boost you're talking about.
Not only that, but the 1080 Ti and 2080 were very close in performance at launch, so you're talking a pretty big price difference for almost no gain. Sure, the 2080 Super 3 years later is an improvement, but... 3 years, and not the follow-up architecture but the refresh after that. The 2080 Super is what the 2080 should have been, or the 2080 should have been priced lower. But hey, if people were still throwing money away, why wouldn't they try to charge as much as they think they can? It's a business, not a charity.
 
Even if they all had RTX, the 2080ti is pretty much the only one that is going to give the best playable fps?

2070 = 1080p
2080 = 1440p
I use my 2080 Ti at 3440x1440: 80 fps in BFV and 100 fps in Metro.

You should get similar fps on those other cards at those resolutions.

A racing game makes about as much sense for RT as BFV does. Many don't want the frame hits in games that center around split-second decisions.
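The "similar fps at those resolutions" claim is mostly pixel-count arithmetic, assuming the game is GPU-bound. A quick back-of-the-envelope check; the 80 fps figure is the one reported above, while the relative-throughput fractions for the 2070/2080 are rough assumptions, not benchmark results:

```cpp
#include <cstdio>

int main()
{
    // If a frame is mostly pixel-bound, cost scales roughly with resolution, so a
    // slower card at a proportionally smaller resolution lands near the same fps.
    const double uw    = 3440.0 * 1440.0;  // ~4.95 Mpx, the 2080 Ti case above
    const double qhd   = 2560.0 * 1440.0;  // ~3.69 Mpx, suggested 2080 resolution
    const double fhd   = 1920.0 * 1080.0;  // ~2.07 Mpx, suggested 2070 resolution
    const double fpsTi = 80.0;             // reported BFV figure from the post

    const double relPerf2080 = 0.75;       // assumed fraction of a 2080 Ti
    const double relPerf2070 = 0.60;       // assumed fraction of a 2080 Ti

    // Real scaling is flatter than pure pixel count (CPU, geometry, and BVH work
    // don't shrink with resolution), so treat these as ballpark numbers only.
    std::printf("2080 @ 1440p estimate: ~%.0f fps\n", fpsTi * relPerf2080 * uw / qhd);
    std::printf("2070 @ 1080p estimate: ~%.0f fps\n", fpsTi * relPerf2070 * uw / fhd);
    return 0;
}
```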
 
2070 = 1080p
2080 = 1440p
I use my 2080 Ti at 3440x1440: 80 fps in BFV and 100 fps in Metro.

You should get similar fps on those other cards at those resolutions.

A racing game makes about as much sense for RT as BFV does. Many don't want the frame hits in games that center around split-second decisions.
Seems good for having everything maxed out on the 2080ti.
 
Seems good for having everything maxed out on the 2080ti.

I set RTX to the lowest setting. I can't tell much of a difference between low and high (or ultra), but it looks way better than off to me. BFV, for example, is ultra / RTX low.

But again, in a racing game I question what you gain and at what cost. Frame drops are really noticeable.
 
I set RTX to the lowest setting. I can't tell much of a difference between low and high (or ultra), but it looks way better than off to me. BFV, for example, is ultra / RTX low.

But again, in a racing game I question what you gain and at what cost. Frame drops are really noticeable.


Honest question: over the next couple of years, when more ray tracing games come out... do you think your broken RTX Titan will unbreak... and start playing future ray tracing games unfettered?
Or are you happy with "compromising" as the criterion for your $1300 top-dollar purchase?


No matter how I tried, I could not find the logic in buying a 2080 Ti, and instead bought a 2080, because I know it will end up in a drawer in a few short years. I'd rather spend $800 twice, 2 years apart, than big bucks on a card that's essentially obsolete in 4 years.
 
Honest question: over the next couple of years, when more ray tracing games come out... do you think your broken RTX Titan will unbreak... and start playing future ray tracing games unfettered?
Or are you happy with "compromising" as the criterion for your $1300 top-dollar purchase?


No matter how I tried, I could not find the logic in buying a 2080 Ti, and instead bought a 2080, because I know it will end up in a drawer in a few short years. I'd rather spend $800 twice, 2 years apart, than big bucks on a card that's essentially obsolete in 4 years.

I bought it because it's ~45% faster at around 60 Hz for rasterized games. I did not factor in RTX at all.

I do actually believe it’ll do fine though. It’s going to be tough to beat a 754mm^2 12nm chip for some time.

Besides, if something vastly better does come out, I could buy that and put the 2080 Ti into a different rig. It's not a big deal. It's not like it'd just stop working; it'll be a strong card for years. Personally, I feel like we're hitting diminishing returns on video cards, kind of like we did on CPUs a few years ago (especially since I play at 60/90 Hz). For the first time ever I am content with a video card.

I suppose I am not as optimistic about technological advancement as you, lol. I expect a slowdown, or at best the same rate as now (~30% every ~two years). I am skeptical of chiplets working for GPUs.
 
Personally, I feel like we're hitting diminishing returns on video cards, kind of like we did on CPUs a few years ago (especially since I play at 60/90 Hz). For the first time ever I am content with a video card.

I suppose I am not as optimistic about technological advancement as you, lol. I expect a slowdown, or at best the same rate as now (~30% every ~two years). I am skeptical of chiplets working for GPUs.

Exactly. This has been mentioned for years, and notably AMD is making a big deal about it this year. Just this week at Hot Chips, this appeared, attributed to Lisa Su, in her presentation:

https://www.anandtech.com/show/1476...ay-1-dr-lisa-su-ceo-of-amd-live-blog-145pm-pt

  • 04:54PM EDT - Cost per transistor increases with the newest processes

That is very notable, because most GPU performance gains have historically come from throwing ever more transistors at the problem. That is a reasonable thing to do when cost per transistor is regularly and significantly decreasing with each new process step. But for a while that cost has been stagnating, just trickling lower. Now we have a CEO stating it is actually going up with the newest processes.

AMD has also been showing this slide everywhere, further telegraphing that they are unlikely to build a monster to compete with the 2080 Ti anytime soon. This slide was also in Lisa Su's Hot Chips presentation linked above:

[Attached image: slide from Lisa Su's Hot Chips presentation]
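To make the cost-per-transistor point concrete: the transistor counts below are the public figures for the 1080 Ti-class (GP102) and 2080 Ti-class (TU102) dies, but the per-transistor costs are invented placeholders, used only to show how the economics of a big GPU flip once that cost stops falling:

```cpp
#include <cstdio>

int main()
{
    const double gp102_transistors = 11.8e9;   // 1080 Ti class die
    const double tu102_transistors = 18.6e9;   // 2080 Ti class die

    // Hypothetical cost per transistor (arbitrary units). Historically each new
    // node cut this sharply; the Hot Chips claim is that it now flattens or rises.
    const double old_node_cost            = 1.00e-9;
    const double new_node_cost_shrinking  = 0.60e-9;  // "old normal": ~40% cheaper
    const double new_node_cost_flat       = 1.00e-9;  // stagnation
    const double new_node_cost_rising     = 1.10e-9;  // what the slide implies

    const double base = gp102_transistors * old_node_cost;
    std::printf("Relative die cost, 18.6B-transistor part vs 11.8B baseline:\n");
    std::printf("  cost/transistor drops 40%%: %.2fx\n",
                tu102_transistors * new_node_cost_shrinking / base);
    std::printf("  cost/transistor flat:       %.2fx\n",
                tu102_transistors * new_node_cost_flat / base);
    std::printf("  cost/transistor up 10%%:     %.2fx\n",
                tu102_transistors * new_node_cost_rising / base);
    return 0;
}
```

Under those made-up numbers, adding ~60% more transistors used to be roughly cost-neutral after a node shrink; with flat or rising cost per transistor, the same jump makes the die roughly 1.6x to 1.7x more expensive, which is the core of the slide's argument.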
 