Initial RTX performance...

Why is Nvidia selling two RTX cards (2070, 2080) that can't even play raytracing games at a decent framerate? Raytracing is a joke right now. Nvidia is asking folks to pay now for tech that isn't ready for primetime, and it doesn't look like it ever will be until we have cards twice as fast as the RTX 2080 Ti.
 
Why is Nvidia selling two RTX cards (2070, 2080) that can't even play raytracing games at a decent framerate? Raytracing is a joke right now. Nvidia is asking folks to pay now for tech that isn't ready for primetime, and it doesn't look like it ever will be until we have cards twice as fast as the RTX 2080 Ti.

What? In BFV multiplayer the 2080 can do >60 fps at 1440p and the 2070 >60 fps at 1080p. And they've had what, a few months to implement and optimize?
 
SLI is not supported in Battlefield V with DirectX 12. You don't need to disable it; it just won't work. (With DX11 you can hack in support, but you don't get DXR.)

Got it, thanks. Yeah I used Nvidia Inspector to change the compatibility bits for my 1080Ti's and SLI works great. Was wondering what the DX12 implementation was like, or if SLI could help DXR perform better. Guess not.
 
What? In BFV multiplayer the 2080 can do >60 fps at 1440p and the 2070 >60 fps at 1080p. And they've had what, a few months to implement and optimize?

Where did you get those numbers from? This guy tested BFV with RT, and this is what he got. 1440p is a no-go for anything but the 2080ti, and he didn't even bother testing 4K.
 
What? In BFV multiplayer the 2080 can do >60 fps at 1440p and the 2070 >60 fps at 1080p. And they've had what, a few months to implement and optimize?

A few months to implement and optimize? This is starting to sound like AMD.
 
"Consumers shouldn't have to pay more for what is essentially a tech demo on your PC"
 
I'm getting 50-60 FPS in 4K with DXR on using one 2080ti. The gameplay is smoother than with DXR off. Can't wait until they get SLI working and optimize DXR even more. This is just the first release of DXR. And we're still waiting on DLSS + DXR.

No complaints here.
 
Where did you get those numbers from? This guy tested BFV with RT, and this is what he got. 1440p is a no-go for anything but the 2080ti, and he didn't even bother testing 4K.


A few months to implement and optimize? This is starting to sound like AMD.

Because I have the actual card and game, and I said in multiplayer. Multiplayer is less strenuous than singleplayer ... and it's BFV... I play the singleplayer for a few hours, multi for hundreds.

In multiplayer, which is what the vast majority of players buy the game for, the 2080 has no problem pushing 1440p. I run a 2080ti at 3440x1440 and it's around 85% utilization with 60fps locked. Absolutely beautiful.

Singleplayer is still fine for me, but it’s pegged a lot more often.

I do find it impressive how quickly they were able to implement and optimize a brand new tech as far as they did.
 
Where did you get those numbers from? This guy tested BFV with RT, and this is what he got. 1440p is a no-go for anything but the 2080ti, and he didn't even bother testing 4K.

I watched the video, really sad. I'm still excited to try it myself, but I can't help but feel Nvidia pulled a fast one here.

Ray tracing would be really amazing if they were doing all of the lighting: global illumination, ray-traced shadows, reflections, etc. But in BFV all you are getting is slightly better-looking puddles at 1/3 of the fps. Not a good deal in any way.
 
I totally agree; it made me feel OK with downgrading to the 2080 Strix after my faulty 2080 Ti. I just plan on using it when it's an option now. Sad thing is, I would still buy one to tinker with just because of the newness. If they announce 7nm cards next summer I will buy one of those too, or if AMD shows up with a winner, one of theirs.
 
Hmm, so what do we think of THIS... I think dumpster fire is accurate, but I honestly wanna know: is the community gonna accept this level of performance loss? And is it gonna kill more cards?

It may be terrible for fps games where framerate is all that matters, but what about racing, RPG/MMORPG, action, etc. games where visuals trump super high fps? That is where RTX will shine, and as long as you can get near 60 fps it's all good. It would increase immersion and be worth it in those scenarios. People are being short-sighted and dramatic calling this a failure.

I do think 20 Gigarays on a 3080 Ti at 7nm would have impressed everyone a lot more, but this still has its uses, just not in competitive fps games.
 
It may be terrible for fps games where framerate is all that matters, but what about racing, RPG/MMORPG, action, etc. games where visuals trump super high fps? That is where RTX will shine, and as long as you can get near 60 fps it's all good. It would increase immersion and be worth it in those scenarios. People are being short-sighted and dramatic calling this a failure.

I totally agree. Competitive first-person shooters with wide-open spaces, where a stable, high framerate is very important, are the worst place to use RTX at the moment. Use it in something like Alien: Isolation, where you are moving slowly through narrow corridors most of the time, and it would work well for improving the visuals while still offering sufficient performance. Just imagine seeing the reflection of the alien in a puddle of water or on a big pipe, etc.
 
Getting 60 fps locked at 3440x1440 (5MP) Ultra / RTX low in multiplayer. Pretty awesome. I can max out my IQ plus some RT and my monitor Hz... I'll try medium when DICE fixes it.

You're definitely in the minority if you think paying $1200 to play a new feature on "Low" is acceptable.
 
Low looks the same as ultra to me and is vastly better than without it. If they called it “high” would that make you feel better?

Review sites say the whole RTX thing doesn't make it look that much different if you want to use subjective points of reference.
 
Review sites say the whole RTX thing doesn't make it look that much different if you want to use subjective points of reference.

I have actually played it, and the ray-traced flashes off surfaces help with situational awareness. Plus it looks awesome. And I am impressed they have it working so well: a solid 60fps at 3440x1440, roughly 2.4x the pixels of 1080p.

Plus turn it off and you still have 30-45% higher performance than a 1080ti. Even more if they implement DLSS.
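For anyone who wants to sanity-check the pixel-count comparison above, here is a minimal Python sketch; the resolution list is just an assumption covering the common panels mentioned in this thread, not anything from the game or drivers:

```python
# Back-of-envelope pixel counts for the resolutions discussed in this thread.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "1440p (2560x1440)": (2560, 1440),
    "Ultrawide (3440x1440)": (3440, 1440),
    "4K (3840x2160)": (3840, 2160),
}

base = 1920 * 1080  # 1080p as the reference point
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP, {pixels / base:.2f}x 1080p")
```

That puts 3440x1440 at about 2.4x the pixels of 1080p and 4K at 4x, which is roughly why a card that holds 60 fps on the ultrawide has a much harder time at 4K with DXR on.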
 
I have actually played it, and the ray-traced flashes off surfaces help with situational awareness. Plus it looks awesome. And I am impressed they have it working so well: a solid 60fps at 3440x1440, roughly 2.4x the pixels of 1080p.

Plus turn it off and you still have 30-45% higher performance than a 1080ti. Even more if they implement DLSS.

"The Low version looks almost more like a DX11 god ray effect, while the Ultra ray tracing shows a much more realistic lighting model."

Quote from a review.

And I'm definitely not paying 70% more money for 35% performance gain by disabling features. Charge $800 and I'd buy it.
 
I wonder if a separate chip for the RT cores would be a better way to go? NVLink, but for a dedicated raytracing chip? Not sure if latency would be too high. But as it stands, the 2070 is DOA for raytracing as its performance is embarrassing. Well, they all are, but at least you can still kinda play a game on the 2080 and 2080ti.

All in all, a huge letdown... 7nm will hopefully fix a lot of this. Don't buy the 20XX series; I'd say wait, or grab a 1080ti and wait for compelling ray tracing options.

Performance is garbage, and Nvidia knew this, but they didn't let on how much performance would tank; they just pushed that preorder narrative as hard as they could. I'm guessing to try and save the Q3 earnings, which they still shit the bed on.
 
"The Low version looks almost more like a DX11 god ray effect, while the Ultra ray tracing shows a much more realistic lighting model."

Quote from a review.

And I'm definitely not paying 70% more money for 35% performance gain by disabling features. Charge $800 and I'd buy it.

“Quote from a review.” How are god rays even applicable to BFV RT?

35% by disabling features? That's apples to apples against a 1080ti. For me it's probably closer to 45% since I target 60FPS. I won't argue value, since for some people $1200 is a lot; to others it's like pissing in the Susquehanna.
 
Saying that the 2080ti should cost the same as the 2080 is just stupid. Even if it can't do raytracing as well as you would have hoped, it's still 30-40% faster across the board compared to a 2080/1080ti. That's the main reason people are buying it. It's the only card that finally gives decent framerates at 4K.

Now, if you aren't using 4k - I agree - why waste the money, but some folks here are saying some really stupid stuff. Anyone who bought a 2080ti bought it for 4k performance. Raytracing is merely a novelty/added benefit.
 
Saying that the 2080ti should cost the same as the 2080 is just stupid. Even if it can't do raytracing as well as you would have hoped, it's still 30-40% faster across the board compared to a 2080/1080ti. That's the main reason people are buying it. It's the only card that finally gives decent framerates at 4K.

Now, if you aren't using 4k - I agree - why waste the money, but some folks here are saying some really stupid stuff.

This is simple: 780Ti to 980Ti to 1080Ti were all in the same ballpark price-wise, and each gained about 35-45%. Suddenly there's a 70% increase in price for the same performance jump, on the promise of new features that don't work fast enough to be useful. Why not strip out all those RTX features and offer a GTX 2080Ti for $800? As it stands now, in the opinion of many, you're getting your performance bump at too high a cost for what you're getting.
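Here is the rough math behind that argument as a small Python sketch, assuming launch prices of about $699 for the 1080 Ti and $1,199 for the 2080 Ti Founders Edition and a ~35% raster uplift; these are ballpark assumptions for illustration, not measured figures:

```python
# Rough price/performance comparison for the generational jump discussed above.
# Prices and the uplift figure are assumptions for illustration only.
old_price = 699     # GTX 1080 Ti launch price (approx.)
new_price = 1199    # RTX 2080 Ti Founders Edition launch price (approx.)
perf_uplift = 0.35  # assumed average rasterized performance gain

price_increase = new_price / old_price - 1
perf_per_dollar_change = (1 + perf_uplift) * old_price / new_price - 1

print(f"Price increase: {price_increase:+.0%}")                        # about +72%
print(f"Performance per dollar vs. the 1080 Ti: {perf_per_dollar_change:+.0%}")  # about -21%
```

By that math the rasterized frames per dollar actually went down this generation, unless the RT and DLSS features carry their own value for you.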
 
I think the crux of the issue is that people spending $1,200+ on a video card are not running 60Hz 1080p monitors. Most likely at least 1440p and probably high refresh at that.

So we have RT in BFV that is just making the cut on 1080p, you can see why this would be a disappointment for anyone with a high-end gaming monitor. Especially since the visual quality is not even that big of a difference compared to the performance cost.

You can say this is an early implementation, and that might be true, but DICE are on the top end of AAA developers. If they can't pull it off, it's not a good sign but I guess we'll see.
 
I think the crux of the issue is that people spending $1,200+ on a video card are not running 60Hz 1080p monitors. Most likely at least 1440p and probably high refresh at that.

So we have RT in BFV that is just making the cut on 1080p, you can see why this would be a disappointment for anyone with a high-end gaming monitor. Especially since the visual quality is not even that big of a difference compared to the performance cost.

You can say this is an early implementation, and that might be true, but DICE are on the top end of AAA developers. If they can't pull it off, it's not a good sign but I guess we'll see.

I don't think it was the right type of game for such a demanding new technology, not to mention the Frostbite engine has always run like shit in DX12. You lose a good 5-10% average FPS at least, and get pretty substantial frame drops under DX12. It's been broken forever.

I think Shadow of the Tomb Raider would have been a better game to reveal the technology in, or the new Metro game, since you aren't talking about such extreme rendering demands as in BF5.

I also think patching DXR into older titles might have been a better idea. Deus Ex: Mankind Divided with raytracing would look incredible but still probably run great.

This is why I'm waiting for a few more games before I completely write off the feature at this point.

Although ultimately raytracing had 0% to do with why I bought a 2080ti. I bought a 2080ti to get 60Hz+ at 4K. The 'RTX cores' were just an added novelty. I do think it would be cool to see developers use the RTX cores for more than just raytracing. I'm kind of confused why they can't just tap into the Turing/AI/RTX/whatever-you-want-to-call-them cores for additional standard raster performance. Even if it's only another 5-10% FPS it would totally be worth it.
 
Oh, I'm not writing off the feature. I'm really excited about it actually, and hope maybe future titles will pull it off.

And I still want to try BFV when the full release comes to see for myself. Just can't help but be disappointed by these initial reviews.
 
This is simple: 780Ti to 980Ti to 1080Ti were all in the same ballpark price-wise, and each gained about 35-45%. Suddenly there's a 70% increase in price for the same performance jump, on the promise of new features that don't work fast enough to be useful. Why not strip out all those RTX features and offer a GTX 2080Ti for $800? As it stands now, in the opinion of many, you're getting your performance bump at too high a cost for what you're getting.

I think two reasons:

1.) Games are beginning to flatline in graphics demand with rasterized rendering, creating no reason to upgrade (somewhat similar to processors and games), which makes demand plummet. RT resets that whole curve. If it's adopted, it reboots it all.

2.) It’s a market discriminator from AMD and Intel, especially if they successfully couple it with their tensor cores and AI knowledge.

So from a 10,000 ft level it makes tons of sense for Nvidia: your products synergize, and you have market discriminators and advantages.

When the rubber hits the road will it make sense? Considering the masses don’t seem to care about RT I think it’s a great opportunity for a competitor to jump in without RT, much lower cost, and the same or better rasterized performance for at least a gen or two.

Kind of feels like how Intel shot themselves in the foot and AMD had a recovery.
 
Oh, I'm not writing off the feature. I'm really excited about it actually, and hope maybe future titles will pull it off.

And I still want to try BFV when the full release comes to see for myself. Just can't help but be disappointed by these initial reviews.

I really think it comes down to what you're expecting from DXR. If you're expecting millions of extra polygons, of course it will be a disappointment. For me, it's about the lighting. Lighting in video games has been dreadful since time immemorial. When I see something explode in Battlefield and see it reflected, it looks miles better in my opinion. I understand it might be small to some, but for me it's fairly large.
 
I think the crux of the issue is that people spending $1,200+ on a video card are not running 60Hz 1080p monitors. Most likely at least 1440p and probably high refresh at that.

So we have RT in BFV that is just making the cut on 1080p, you can see why this would be a disappointment for anyone with a high-end gaming monitor. Especially since the visual quality is not even that big of a difference compared to the performance cost.

You can say this is an early implementation, and that might be true, but DICE are on the top end of AAA developers. If they can't pull it off, it's not a good sign but I guess we'll see.

When I first saw some of the benchmarks from the review sites, I was very optimistic about the future of Ray Tracing and how quickly it would become mainstream. But I had time this weekend to sit down and read the reviews, and I have changed my mind. I didn't realise that they were only using one aspect of Ray Tracing (reflections), and only on some surfaces. That information changes things completely for me. I knew it was going to be tough to do Ray Tracing, I just didn't realise how tough. Think of how far away we are from full-scene Ray Tracing. Some people are happy with the performance and think the visual improvements are well worth it even on Low, but others are saying it isn't worth it. People I know and trust say there is very little difference between Low and off but a massive performance hit. They all seem to agree that on Ultra the reflections are amazing but the performance tanks.

It's still just the start, but it's gone from not great to worse for me. Everybody seems to be waiting for DLSS to save the situation, but I am seriously beginning to think that's just another red herring and it's not going to be all that it's been made out to be. I always worry when there is a long delay in new tech. Tensor cores have been around since last year. We had a Ray Tracing demo of Metro Exodus in March, and that was before the RT cores were available. At launch we had slides from Nvidia showing DLSS improvements in games, but now, three months later, we still have no games using DLSS. They talked about how easy it was for developers to implement, that the Nvidia supercomputer does the training work, but still no games? Did they just use theoretical numbers for their slides? If not, then what's the problem?
 
Right. BFV is only doing RT for reflections. If the whole scene were using RT lighting and shadowing, maybe that would be worth the performance hit. But they are using it for a very limited effect, and performance still tanks to 1/2 or 1/3 of having it off.

I haven't tried it yet, but looking at the review videos honestly it didn't look all that much better. It's a value proposition, and I'm not sure the value is there.
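To put that "1/2 or 1/3" figure in frame-time terms, a minimal Python sketch; the 90 fps baseline is just an assumed example, not a measured number from BFV:

```python
# Illustrates what halving or thirding the framerate costs in frame time.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

baseline_fps = 90.0  # hypothetical framerate with DXR off
for factor, label in ((1 / 2, "1/2"), (1 / 3, "1/3")):
    rt_fps = baseline_fps * factor
    added_ms = frame_time_ms(rt_fps) - frame_time_ms(baseline_fps)
    print(f"{baseline_fps:.0f} fps -> {rt_fps:.0f} fps ({label}): "
          f"+{added_ms:.1f} ms per frame spent on reflections")
```

Going from 90 to 45 fps means roughly 11 extra ms per frame, and down to 30 fps roughly 22 extra ms, all spent on one limited effect, which is why a reflections-only implementation feels so expensive.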
 
I think two reasons:

1.) Games are beginning to flatline in graphics demand with rasterized rendering, creating no reason to upgrade (somewhat similar to processors and games), which makes demand plummet. RT resets that whole curve. If it's adopted, it reboots it all.

2.) It’s a market discriminator from AMD and Intel, especially if they successfully couple it with their tensor cores and AI knowledge.

So from a 10,000 ft level it makes tons of sense for Nvidia: your products synergize, and you have market discriminators and advantages.

When the rubber hits the road will it make sense? Considering the masses don’t seem to care about RT I think it’s a great opportunity for a competitor to jump in without RT, much lower cost, and the same or better rasterized performance for at least a gen or two.

Kind of feels like how Intel shot themselves in the foot and AMD had a recovery.

You make a good point; however, I think the artificial demand from crypto mining gave them rose-colored glasses about what they thought they could charge this generation. When a non-essential luxury item jumps 70% in cost in a year, it makes you question how much you really need it. Obviously, there are people here who would buy at any price, and there are others (like myself) who would buy it if it were cheaper because they don't think it's worth the inflated Nvidia price. If Nvidia gave me a compelling reason why it was worth $1200 other than simply ~35% faster performance in rasterized games, maybe I'd be more inclined to open my wallet. I just haven't seen a good reason yet.

Personally, I got rid of my 4k TV/monitor and went back to a 27" 144hz panel for kicks, but I played all the way through AC:Odyssey with my 1080Ti and 4k TV and never felt like I was missing $700 worth of extra visuals from the 2080Ti.
 
I'm getting 50-60 FPS in 4K with DXR on using one 2080ti. The gameplay is smoother than with DXR off. Can't wait until they get SLI working and optimize DXR even more. This is just the first release of DXR. And we're still waiting on DLSS + DXR.

No complaints here.

Same, and if I enable HDR it looks amazing on an OLED.
 
Same, and if I enable HDR it looks amazing on an OLED.
So you two have some magic version of the game that performs better at 4K than everybody else is showing us it runs at 1080p? Every single video review shows performance dips into the 40s and 30s just at 1080p in areas where there is actually ray tracing going on. I think some of you are reaching really, really hard.
 
I'm not sure what settings they are using, but with the Nvidia profile at 4K Ultra with low RTX I haven't noticed any major issues other than an occasional stutter. The cutscenes are slow, but that is the case on Ultra with RTX on or off.
 
Right. BFV is only doing RT for reflections. If the whole scene were using RT lighting and shadowing, maybe that would be worth the performance hit. But they are using it for a very limited effect, and performance still tanks to 1/2 or 1/3 of having it off.

I haven't tried it yet, but looking at the review videos honestly it didn't look all that much better. It's a value proposition, and I'm not sure the value is there.

Exactly this. There is no way I'm paying Nvidia $700 after selling my 1080Ti just for shiny water reflections that kill performance.
 
I think the crux of the issue is that people spending $1,200+ on a video card are not running 60Hz 1080p monitors. Most likely at least 1440p and probably high refresh at that.

So we have RT in BFV that is just making the cut on 1080p, you can see why this would be a disappointment for anyone with a high-end gaming monitor. Especially since the visual quality is not even that big of a difference compared to the performance cost.

You can say this is an early implementation, and that might be true, but DICE are on the top end of AAA developers. If they can't pull it off, it's not a good sign but I guess we'll see.
Except DICE has never gotten DX12 working right so far. Not all AAA developers are made equal. In my opinion, if anyone is going to get ray tracing right, it's going to be id.
Right. BFV is only doing RT for reflections. If the whole scene were using RT lighting and shadowing, maybe that would be worth the performance hit. But they are using it for a very limited effect, and performance still tanks to 1/2 or 1/3 of having it off.

I haven't tried it yet, but looking at the review videos honestly it didn't look all that much better. It's a value proposition, and I'm not sure the value is there.
So much this. Real-time global illumination is going to be the real game changer. Once a game comes out that is completely ray traced instead of a mix of rasterization and ray tracing, I bet people will change their tune.
 