Initial RTX performance...

psyclist | Gawd | Joined: Jan 25, 2005 | Messages: 844
Hmm, so what do we think of this... I think dumpster fire is accurate, but I honestly want to know: is the community going to accept this level of performance loss? And is it going to kill more cards?
 
Hmm, so what do we think of this... I think dumpster fire is accurate, but I honestly want to know: is the community going to accept this level of performance loss? And is it going to kill more cards?

It's not a dumpster fire yet; it's still smoldering and surrounded by cardboard.
 
Hopefully Nvidia will work with DICE to get SLI enabled. It's only fair that people who bought two 2080 Tis should be able to play the game with RTX on at a playable framerate.
 
Not sure what people expected. Anyone buying these cards to use this feature is pretty stupid. The selling point of the 2080 Ti was being able to play 4K games at a decent FPS. Ray tracing itself I always considered a nice-to-have novelty that would run like ass, as most new technologies do.

Not to mention, the Frostbite engine runs like shit on both AMD and Nvidia in DX12 mode, so ray tracing is an instant no-go anyway.
 
Nearly a third of the performance! Hilarious. Why did they bother with ray tracing when you can't even hit a constant 1080peasant 60Hz with max eye candy?
I find it funny hearing some of the 9000Hz crowd suddenly change tack on 1080peasant ray tracing, but I bet most with 2080 Tis don't even have 1080p screens lmao; it must be lovely paying that much for non-native sub-60Hz drops. The other part is HDR... is it present with ray tracing? Is it working on G-Sync yet?
 
I really really hope these numbers are accurate.

70% More Expensive; 50% Less Performance.

nVidia...The way it's meant to be paid
 
Yeah, DX12 on my setup isn't smooth, that's for sure. Hopefully DXR gives us something. Are cards now going to be certified for DXR, or is it just a feature added on to DX12? (See the capability check sketched below.)

Maybe they can work on it and boost performance over the next couple of months. It's not good out of the gate, though... looking forward to how things shake out in the coming months.
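On the certification question above: DXR is not a separate certification; it is an optional feature of a regular D3D12 device that software can query at runtime. A minimal sketch of that check, assuming the Windows 10 October 2018 SDK (or newer) headers; this is generic D3D12 code, not anything Battlefield V-specific:

```cpp
// Query whether the default adapter exposes DirectX Raytracing (DXR).
// Generic D3D12 feature check; needs d3d12.h from the 10.0.17763+ SDK.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))) &&
        opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0) {
        std::printf("DXR (raytracing tier 1.0 or higher) is supported.\n");
    } else {
        std::printf("D3D12 works, but DXR is not supported on this GPU/driver.\n");
    }
    return 0;
}
```

In other words, a "DXR card" is just a DX12 card whose driver reports a raytracing tier; games fall back to rasterization when the tier comes back as not supported.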
 
Nearly a third of the performance! Hilarious. Why did they bother with ray tracing when you can't even hit a constant 1080peasant 60Hz with max eye candy?
I find it funny hearing some of the 9000Hz crowd suddenly change tack on 1080peasant ray tracing, but I bet most with 2080 Tis don't even have 1080p screens lmao; it must be lovely paying that much for non-native sub-60Hz drops. The other part is HDR... is it present with ray tracing? Is it working on G-Sync yet?

Yes - I just enabled it and all the features are working just fine.

The performance drop with all Ultra settings at 4K is pretty bad, although I believe a lot of this is made worse by how bad DX12 mode is to begin with in this game.

Maybe the drivers that came out today helped a lot, but I was getting frame rates above 30 FPS on the 'Low' setting at 4K. Not worth it to me, but some may consider that a playable frame rate.

At the end of the day this is just not the type of game where you want to sacrifice frametimes/FPS for something like ray tracing. Tomb Raider or that new Metro, maybe, since those aren't twitchy multiplayer shooters.
 
Come on guys, you shouldn't feel bad about shelling out $1,200 for some amazing screenshots. :p

I was going to get a 2080 Ti for 4K, but now I might as well wait.
 
Yes - I just enabled it and all the features are working just fine.

The performance drop with all Ultra settings at 4K is pretty bad, although I believe a lot of this is made worse by how bad DX12 mode is to begin with in this game.

Maybe the drivers that came out today helped a lot, but I was getting frame rates above 30 FPS on the 'Low' setting at 4K. Not worth it to me, but some may consider that a playable frame rate.

At the end of the day this is just not the type of game where you want to sacrifice frametimes/FPS for something like ray tracing. Tomb Raider or that new Metro, maybe, since those aren't twitchy multiplayer shooters.

Thanks for that; I hadn't seen any mention of it when googling around.
And a great point. In the right game this will be amazing to see and use, maybe even racing games (easier to get higher FPS), or a corridor shooter like Doom if it added ray tracing in an expansion or similar. Don't get me wrong, I'm excited for RT, but now doesn't seem to be the time for it to be implemented well... nearly. Maybe 2-4 years.
 
How did some of you not see this coming? Of course ray tracing performance is going to be terrible. Ray tracing is extremely computationally expensive, and this is first-gen tech.

You have the option of disabling it, so I don't really see the problem. Someone had to get the ball rolling eventually. This is just the start.
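To put a rough number on "computationally expensive", here is a back-of-the-envelope sketch. The resolution, frame rate, and rays-per-pixel figures are illustrative assumptions, not measurements from Battlefield V or any RTX title:

```cpp
// Back-of-the-envelope ray budget for real-time ray tracing.
// All inputs are illustrative assumptions, not measured values.
#include <cstdio>

int main() {
    const double width        = 1920.0;  // assumed render resolution
    const double height       = 1080.0;
    const double fps          = 60.0;    // target frame rate
    const double raysPerPixel = 3.0;     // assumption: 1 primary + ~2 secondary rays

    const double raysPerFrame  = width * height * raysPerPixel;   // ~6.2 million
    const double raysPerSecond = raysPerFrame * fps;              // ~373 million

    std::printf("Rays per frame : %.1f million\n", raysPerFrame / 1e6);
    std::printf("Rays per second: %.0f million\n", raysPerSecond / 1e6);
    return 0;
}
```

Even at 1080p60 with only a few rays per pixel that is hundreds of millions of rays every second, each needing acceleration-structure traversal and shading on top, which is why first-gen hardware saturates so quickly.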
 
This is so insane; looking at RTX on/off screenshots, you can barely tell the difference to begin with. I can see why this has become such a meme.
 
I'm not sure what my framerates were... but with RTX on I was in huge firefights etc. and everything was fine in 64-man MP. I'm playing on Ultra settings at 1440p on a 165Hz G-Sync monitor.
 
Yepper, glad I skipped this gen and went in on a second 1080 Ti instead (used, both added up to less than a 2080 Ti no less).
 
That looks pretty bad, actually. Worse than I was expecting.

I mean, I knew it would take a hit, but I guess I was hoping they could optimize it before launch.

I even "upgraded" to a 1080p ultrawide in preparation; looks like the performance won't be there unless SLI support comes quickly.
 
Hopefully Nvidia will work with DICE to get SLI enabled. It's only fair that people who bought two 2080 Tis should be able to play the game with RTX on at a playable framerate.

ROFL! Nvidia got you on that one!
 
That looks pretty bad, actually. Worse than I was expecting.

I mean, I knew it would take a hit, but I guess I was hoping they could optimize it before launch.

I even "upgraded" to a 1080p ultrawide in preparation; looks like the performance won't be there unless SLI support comes quickly.

Those are the results of their optimizations. It was slower before. And their optimizations are basically turning off as much ray tracing as they can to get kind-of-playable frame rates.

I don't know why people expected it to be better. Real-time ray tracing is tough, and the evidence was there from the start when developers were saying they were hoping to get it running right at 1080p.

From my own point of view, I think it's actually better than I expected. I thought it might be a few generations before we were seeing any decent performance in ray tracing, but this gives me hope that the second generation of RTX hardware will be worth purchasing.
 
Those are the results of their optimizations. It was slower before. And their optimizations are basically turning off as much ray tracing as they can to get kind-of-playable frame rates.

I don't know why people expected it to be better. Real-time ray tracing is tough, and the evidence was there from the start when developers were saying they were hoping to get it running right at 1080p.

From my own point of view, I think it's actually better than I expected. I thought it might be a few generations before we were seeing any decent performance in ray tracing, but this gives me hope that the second generation of RTX hardware will be worth purchasing.

I was playing this morning at 3440x1440, Ultra with low RTX, and couldn't tell an FPS difference (60Hz monitor). I thought it was great for single-player. In multiplayer I'll have to choose between crazy resolution scaling and RTX, which is TBD (at 100% scaling I only use 60% utilization). Haven't had a chance to compare the two.

I just played quickly because I was interested in power usage; I only hit 80% of the power limit. Seems it's RT-limited.

DICE said only RTX Low is decently optimized so far. Most review sites say Low is similar to the other settings IQ-wise, but report 60-73-ish FPS at 1440p.

What they really need is RT + DLSS....

For me this is what I hoped for: something playable to dabble with at 3440x1440, which is roughly 2.4x the pixels of 1080p, so I am somewhat impressed.
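For reference, the pixel math behind these resolution comparisons is plain arithmetic; a quick sketch with no game-specific assumptions:

```cpp
// Megapixel counts and ratios versus 1920x1080.
#include <cstdio>

int main() {
    struct Res { const char* name; int w, h; };
    const Res base = {"1920x1080", 1920, 1080};
    const Res others[] = {
        {"2560x1440", 2560, 1440},
        {"3440x1440", 3440, 1440},
        {"3840x2160 (4K)", 3840, 2160},
    };

    const double basePx = double(base.w) * base.h;  // ~2.07 MP
    for (const Res& r : others) {
        const double px = double(r.w) * r.h;
        std::printf("%-16s %.2f MP, %.2fx the pixels of 1080p\n",
                    r.name, px / 1e6, px / basePx);
    }
    return 0;
}
```

That puts 3440x1440 at roughly 4.95 MP, about 2.4x 1080p and about 34% more than 2560x1440, while 4K is about 8.3 MP, or 4x 1080p.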
 
On Low I really couldn't see any difference. But with Ultra, that's a different story. I'll try to get some screen captures later on. Sadly, we're at the point where video isn't that great for demonstrating stuff like this anymore; YouTube is just terrible with compression.
 
It's funny how 60 FPS is now intolerable. I assume most of the people saying that were born after Y2K.

Real-time ray tracing is hard. Like any tech, it starts somewhere and matures with time. Why are people acting like this is some new phenomenon?
 
It's funny how 60 FPS is now intolerable. I assume most of the people saying that were born after Y2K.

Real-time ray tracing is hard. Like any tech, it starts somewhere and matures with time. Why are people acting like this is some new phenomenon?

I think it's a certain sect of people: the ones that played Quake 3 with zero graphics settings on their 240Hz CRTs. More power to them, but I agree with you. While I prefer 60+ FPS for some games, I have found joy in 30 FPS in games like BOTW, but I also enjoy 165Hz on my G-Sync monitor for FPS and PC games. But it's not the end of the world if I have to play at 60 FPS. Hell, after a short period I really can't tell the difference.

I think people don't separate stuttering from consistent FPS. I can play at 24-30 FPS as long as there isn't performance stuttering. For example, Bloodborne on the PS4: I have a MAJOR disconnect from the game because it stutters more than Colin Firth did in The King's Speech. It just isn't fun. But then I'm playing some games that are 30 FPS on it and smooth, and while it doesn't look silky, if it's consistent it doesn't bother me.
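The stutter-versus-consistency point is easy to see in frame times rather than average FPS; the two traces below are made-up illustrative samples, not captured data:

```cpp
// Average FPS versus worst-case frame time for two hypothetical traces.
#include <algorithm>
#include <cstdio>
#include <vector>

static void report(const char* name, const std::vector<double>& frameTimesMs) {
    double totalMs = 0.0;
    for (double ms : frameTimesMs) totalMs += ms;
    const double avgFps  = 1000.0 * frameTimesMs.size() / totalMs;
    const double worstMs = *std::max_element(frameTimesMs.begin(), frameTimesMs.end());

    std::printf("%-9s avg %.1f FPS, worst frame %.0f ms (~%.0f FPS at the hitch)\n",
                name, avgFps, worstMs, 1000.0 / worstMs);
}

int main() {
    // Steady ~30 FPS: every frame lands around 33 ms.
    const std::vector<double> steady   = {33, 33, 34, 33, 33, 34, 33, 33, 34, 33};
    // Higher average FPS, but with two large spikes (the "stutter").
    const std::vector<double> stuttery = {12, 13, 12, 80, 12, 13, 12, 95, 13, 12};

    report("steady",   steady);    // ~30 FPS average, worst frame 34 ms
    report("stuttery", stuttery);  // ~36 FPS average, worst frame 95 ms
    return 0;
}
```

The second trace wins on average FPS but is the one that feels broken: the 80-95 ms spikes are the hitches being described, while the locked ~33 ms trace never feels worse than about 30 FPS.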
 
It's funny how 60 FPS is now intolerable. I assume most of the people saying that were born after Y2K.

Real-time ray tracing is hard. Like any tech, it starts somewhere and matures with time. Why are people acting like this is some new phenomenon?

I think a lot of people here knew the first generation wouldn't be good enough and that it will take a generation or two before performance reaches an acceptable level.
 
It's funny how 60 FPS is now intolerable. I assume most of the people saying that were born after Y2K.

Real-time ray tracing is hard. Like any tech, it starts somewhere and matures with time. Why are people acting like this is some new phenomenon?

I think people understand it's hard and aren't really surprised at the performance hit. What I'm concerned about is where they go from here. It appears that only the 2080 Ti has the horsepower to really drive ray tracing at the quality and resolutions people want. The issue with that is that the 2080 Ti is a massive chip at 18.6B transistors. I don't think Nvidia is just abusing their market-leader position; I think they are charging a lot for this chip because it is very expensive to manufacture. And for what, better puddles? We're just touching the surface of what ray tracing really requires, and already you need 18.6B transistors to do it at TSMC 12nm. What's the future for this? How many more transistors are required before everything is traced? How do you plan on fitting those transistors on a chip that also needs to perform rasterization and a myriad of other tasks? I doubt tracing will make much headway by the time we start hitting incredibly hard scaling issues beyond TSMC 7nm and EUV.

In short, ray tracing has been around a long time, but it's not done in real time simply because of the horsepower it would take. Unless there's a major breakthrough, I'm just not sure Nvidia makes this more than a novelty before we run into hard limits on how many transistors it's reasonable to fit on a consumer GPU.
 
Honestly it makes more sense to me to maybe make multi-GPU more relevant again by basically selling cards that are stripped down just to do compute for ray tracing.
 
Does anyone know if enabling ray tracing is possible with SLI? Or does it require you to disable it?
 
I think people understand it's hard and aren't really surprised at the performance hit. What I'm concerned about is where they go from here. It appears that only the 2080 Ti has the horsepower to really drive ray tracing at the quality and resolutions people want. The issue with that is that the 2080 Ti is a massive chip at 18.6B transistors. I don't think Nvidia is just abusing their market-leader position; I think they are charging a lot for this chip because it is very expensive to manufacture. And for what, better puddles? We're just touching the surface of what ray tracing really requires, and already you need 18.6B transistors to do it at TSMC 12nm. What's the future for this? How many more transistors are required before everything is traced? How do you plan on fitting those transistors on a chip that also needs to perform rasterization and a myriad of other tasks? I doubt tracing will make much headway by the time we start hitting incredibly hard scaling issues beyond TSMC 7nm and EUV.

In short, ray tracing has been around a long time, but it's not done in real time simply because of the horsepower it would take. Unless there's a major breakthrough, I'm just not sure Nvidia makes this more than a novelty before we run into hard limits on how many transistors it's reasonable to fit on a consumer GPU.

A 2080 can definitely do 1440p 60Hz locked in BFV multiplayer with Ultra/RTX Low. My 2080 Ti runs around 85% utilization at 3440x1440, which is about 34% more pixels than 2560x1440.

The 2080 is still a big chip and not exactly a value; IIRC it's 25% larger than the Titan Xp. But I guess you have to start somewhere.

My thought is that if Nvidia released an all-CUDA chip, it'd be the last card people bought for years. Without ray tracing I am only at 60% utilization at 3440x1440 60Hz. An all-CUDA chip would have been what, 40% utilization? There'd be no reason for me to upgrade for years and years. It's also a market differentiator for them. So from their perspective it's a smart move, if only they hadn't mauled the launch.

I would expect that next gen, 1080p ray tracing should be ~$300 and 1440p ~$500.
 
I think people understand it's hard and aren't really surprised at the performance hit. What I'm concerned about is where they go from here. It appears that only the 2080 Ti has the horsepower to really drive ray tracing at the quality and resolutions people want. The issue with that is that the 2080 Ti is a massive chip at 18.6B transistors. I don't think Nvidia is just abusing their market-leader position; I think they are charging a lot for this chip because it is very expensive to manufacture. And for what, better puddles? We're just touching the surface of what ray tracing really requires, and already you need 18.6B transistors to do it at TSMC 12nm. What's the future for this? How many more transistors are required before everything is traced? How do you plan on fitting those transistors on a chip that also needs to perform rasterization and a myriad of other tasks? I doubt tracing will make much headway by the time we start hitting incredibly hard scaling issues beyond TSMC 7nm and EUV.

In short, ray tracing has been around a long time, but it's not done in real time simply because of the horsepower it would take. Unless there's a major breakthrough, I'm just not sure Nvidia makes this more than a novelty before we run into hard limits on how many transistors it's reasonable to fit on a consumer GPU.

I think it's far too early to predict where we go from here. First off, it's hella impressive that any sort of RT runs at 60 FPS at 1440p on a 2080 Ti. There are a lot of people still happily gaming on 60Hz monitors, so I don't get the hate. Future software and hardware implementations will be more efficient.

The other thing to consider is what to do with new transistors going forward. Throwing them at rasterization hacks isn't going to result in higher IQ or performance. Those hacks are getting very expensive and obviously don't look as good as proper RT. I'd rather they were spent on higher-detail geometry or better lighting.

I don’t agree with people that prefer to spend that transistor budget on going from 4K 60fps to 4K 144fps with no actual IQ improvement.
 
Anyone tried this? It was posted on the Nvidia forums. It's now playable for me in 4K with DXR on. Runs smoothly at a consistent 50-60 FPS. Looks great too.

“We recommend in this first release of DXR that “DXR Raytraced Reflections Quality” be set to “Low” due to Battlefield V’s Known Issues when using “Medium”, “High”, and “Ultra” settings. EA, DICE, and NVIDIA will also continue to optimize this implementation and deliver regular updates.

Although preferences will vary, we recommend disabling the following settings from Video-->Basic when DXR is enabled for the best overall experience:

Chromatic Aberration
Film Grain
Vignette
Lens Distortion”

https://forums.geforce.com/default/topic/1080010/geforce-rtx-20-series/battlefield-v-dxr-faq/
 
I asked earlier but nobody has answered...can you use SLI with DXR? Or is it disabled in SLI mode?
 
Anyone tried this? It was posted on the Nvidia forums. It's now playable for me in 4K with DXR on. Runs smoothly at a consistent 50-60 FPS. Looks great too.

“We recommend in this first release of DXR that “DXR Raytraced Reflections Quality” be set to “Low” due to Battlefield V’s Known Issues when using “Medium”, “High”, and “Ultra” settings. EA, DICE, and NVIDIA will also continue to optimize this implementation and deliver regular updates.

Although preferences will vary, we recommend disabling the following settings from Video-->Basic when DXR is enabled for the best overall experience:

Chromatic Aberration
Film Grain
Vignette
Lens Distortion”

https://forums.geforce.com/default/topic/1080010/geforce-rtx-20-series/battlefield-v-dxr-faq/

I run 3440x1440 60Hz Ultra/RTX Low and it's a solid 60 FPS at around 85% GPU utilization. Given that's ~5 MP vs 4K's ~8.3 MP, it sounds about right. Pretty amazing.
 
I think it's far too early to predict where we go from here. First off, it's hella impressive that any sort of RT runs at 60 FPS at 1440p on a 2080 Ti. There are a lot of people still happily gaming on 60Hz monitors, so I don't get the hate. Future software and hardware implementations will be more efficient.

I don't find it that impressive, to be honest. And that's because we've known how to do ray tracing for a very long time, and we've had the option of just throwing horsepower at it to get it done in real time. The challenge is doing it in such a way that it will work with consumer GPUs. A $1000 or $1200 GPU is *not* a GPU that's going to drive adoption. Nvidia did what we've always known would need to be done: throw a massive number of transistors at the problem. I just don't find it that interesting that Nvidia made a massive GPU die that only the very upper tier of enthusiasts can afford or are willing to buy. The major impediment to adoption now will be getting the capability into a GPU that's small enough to impact the market. This is my point; I'm not sure that's realistic. If Nvidia comes up with a novel optimization process or makes a major breakthrough in transistor density to make this more realistic, I'll be more impressed.

The other thing to consider is what to do with new transistors going forward. Throwing them at rasterization hacks isn't going to result in higher IQ or performance. Those hacks are getting very expensive and obviously don't look as good as proper RT. I'd rather they were spent on higher-detail geometry or better lighting.

Or better AI, or better physics, or a bunch of other things that would make the game more immersive. If given a choice, I'd probably choose for the transistor budget to be spent on better AI in a more immersive world rather than better-looking reflections. But that's preference.

I don’t agree with people that prefer to spend that transistor budget on going from 4K 60fps to 4K 144fps with no actual IQ improvement.

Again, there are lots of ways to spend your transistor budget. I don't personally think better reflections warrant the cost or are the best way to spend the transistors if game *immersion* is the goal. But again, that's my playstyle.
 
How did some of you not see this coming? Of course ray tracing performance is going to be terrible. Ray tracing is extremely computationally expensive, and this is first-gen tech.

You have the option of disabling it, so I don't really see the problem. Someone had to get the ball rolling eventually. This is just the start.

The problem is that we have to pay extra for useless tech.
 