Initial RTX performance...

You don't "have" to do anything.

The RTX series came out and I wasn't impressed, so I bought a 1080 Ti for my second system instead. If you don't feel like it's worth the money, don't buy it. NVIDIA doesn't have a gun to anyone's head.

So you are against attractively priced products? You also seem to be against Nvidia following their past high-end video card pricing, and seem to be quite fine with Nvidia doing whatever the hell they want. No gun to my head, but we can damn sure call them out for what they are right now. And that would be a money-gouging, greedy corporate monopoly.
 

His point is that you are not obligated to buy an nVidia graphics card, let alone an insanely overpriced one.
 
Once a game comes out that is completely ray traced instead of a mix of rasterization and ray tracing I bet people will change their tune.


And what FPS will a completely ray-traced game run at? 3-5 fps. All they're ray tracing now is puddles, and that alone seriously compromises FPS.
 
His point is that you are not obligated to buy an nVidia graphics card, let alone an insanely overpriced one.

Where did I say I was obligated? We're allowed to voice our displeasure, aren't we?
 

'Voicing' your displeasure can also include not buying the latest nVidia cards. The term 'vote with your wallet' becomes very meaningful here. I sure am...
 
Why not strip out all those RTX features and offer a GTX 2080 Ti for $800? As it stands now, you're getting your performance bump at too high a cost, in the opinion of many.

Exactly!!! But most folks here seem to take on this "it's the Nvidia way or the highway" type of attitude. "Nobody is pointing a gun to your head." Just get the cheaper last-gen card that isn't even being produced anymore and be happy. lol!!
 
Except DICE has never gotten DX12 working right. Not all AAA developers are made equal. In my opinion, if anyone is going to get ray tracing right, it's id.

So much this. Real time global illumination is going to be the real game changer. Once a game comes out that is completely ray traced instead of a mix of rasterization and ray tracing I bet people will change their tune.

They actually fixed the vast majority of DX12 issues with the last patch. It runs smooth for me with RTX on.
 
So you are against attractively priced products? You also seem to be against Nvidia following their past high-end video card pricing, and seem to be quite fine with Nvidia doing whatever the hell they want. No gun to my head, but we can damn sure call them out for what they are right now. And that would be a money-gouging, greedy corporate monopoly.
I'm not sure how you inferred this from my post. I have commented elsewhere that I think the RTX series is overpriced and underperforming, and I'm in the 1% of NVIDIA customers who are fine dropping $1000-$1200 on a video card. These aren't worth the money, in my opinion.

However, what you said was "The problem is that we have to pay extra for useless tech." That's not what is going on. You're paying extra because AMD has been asleep at the wheel for like three years on GPUs, and the only competition NVIDIA has is its own previous generation; if you follow NVIDIA as a business, you know their supply channels are flooded with mid-range 1000-series cards they're having trouble moving. NVIDIA is simply charging more for performance, partly because they can and partly because they have to. The RTX effects have nothing to do with it. Yeah, RTX performance blows, but the 2070/2080/2080 Ti are still the fastest GPUs you can buy, so they are charging accordingly.
 
So you two have some magic version of the game that performs better at 4K than everybody else is showing us it runs at 1080p? Every single video review shows performance dips into the 40s and 30s just at 1080p, in areas where there is actually ray tracing going on. I think some of you are reaching really, really hard.

Multiplayer runs way better than single player. I run 3440x1440 at 60Hz and never drop below 60 fps, at about 85% GPU utilization.

The reviews are all single player, which makes no sense at all.
 
Interesting that the VAST majority of people who are salty about the performance of the 2080 Ti are people who don't own one.


I'd say the VAST majority of gamers would not spend this kind of money on a GPU, so yes, there are many more voices on this side of the fence. But man, I'd be REALLY salty if I coughed up $1200 for a card only for it to be rendered a potato when its game-changing feature is enabled. Then with the card failures/fires on top of all this, "a dumpster fire of a launch" is accurate, wouldn't you say?

First-gen RTX looks to be a flop (if they can't get performance up), sorry to say. I really hope they add more RT cores to the 30XX series; with the impending die shrink, that should bring a good performance uplift for the next-gen cards.

I'm pumped for ray tracing and was hopeful when they announced it. But it looks like it's more of a tech demo this gen than a usable feature. Always wait for reviews; never buy on a company's presentations and promises.
 
Meh, to put it in perspective: five years ago, when people said 4K gaming was pointless and impossible, I played through Tomb Raider at 4K 60fps on a 65-inch Panasonic TV over DisplayPort (since HDMI 2.0 wasn't around yet) using dual 7970 GHz Edition cards in CrossFire. Now you can game at 4K on a laptop, lol, and NVIDIA might finally release the BFGD next year. The early adopters always pay a huge premium; give it a few years and ray tracing might be everywhere. Six years from now we might even have RTX on mobile phone games.
 
I'm pre-loading BFV now. Hope my experience is similar to those in this thread rather than the poor results from reviews. I'll reserve judgment until I see it myself.

Awesome! First thing I'd do is see if you can tell the difference between RTX low and ultra. I couldn't, and RTX low has way higher fps, and is certainly better than off.

Also, multiplayer runs way better for me than single player. I haven't even touched single player for more than five minutes.
 
It's almost like you're not even paying attention. People aren't talking about AMD. They are talking about the relative value or lack thereof of a $1200 video card when its signature feature cuts performance in half.

Exactly, it's not a red vs. green argument. It's an asshole move no matter which camp tried to pull it off. I'm glad we're at the gates of ray tracing, but it's got a ways to go before it's viable. Nvidia didn't bother to disclose the performance hit at all... just put maximum effort into the preorder-now narrative, all while jacking up prices and, by the looks of it, rushing the build of these cards.

It all points to them trying to save their quarterly earnings report, which shit the bed because of the crypto implosion. Overpricing and rushing the cards out to save themselves... well, we can now see how it all turned out.

Red, green, blue... I don't care about the color; just don't be an anti-competitive, anti-consumer asshole of a company, or a slave solely to shareholders. Speaking of which, they probably aren't too pleased at the moment...

Looking forward to the next couple of months to see how performance improves; hopefully they'll have remedied the 2080 Ti failures by then as well.
 
Awesome! First thing I'd do is see if you can tell the difference between RTX low and ultra. I couldn't, and RTX low has way higher fps, and is certainly better than off.

In the 3rd War Story, walk close to a fire and look at your gun. Ultra vs. Low DXR becomes apparent quickly.
 
...man, I'd be REALLY salty if I coughed up $1200 for a card only for it to be rendered a potato when its game-changing feature is enabled.

I play on a 2560x1440 165Hz monitor. With Ray Tracing set to ultra and every other option also set to ultra... I average between 60-80 FPS in single player BF:V with my 2080Ti overclocked to 2010MHz on the core and 14800MHz on the memory. It has occasionally dipped into the 40s, but that's what G-Sync is for, so I haven't noticed. I haven't used Ray Tracing in multiplayer yet, but performance is supposed to be even better.

Then with the card failures/fires on top of all this

My FE 2080Ti has been running just fine since day one. I'm definitely keeping an eye on the situation, but it's too early for me to draw much of a conclusion with this whole RMA situation. We'll see how it goes.

a dumpster fire of a launch is accurate wouldnt you say?

No I would not.

I was pretty frustrated with the price... But it was exactly what I expected considering there is no high-end competition. If you were expecting more performance with Ray Tracing enabled, I don't know what to tell you. Performance with Ray Tracing is on par with or better than I expected, to be honest. And in games without Ray Tracing (or with it disabled), the performance is phenomenal. Tell me this... what kind of performance does a 1080Ti get with Ray Tracing enabled?
 
Played BFV for 2 hours and tested RTX. Short answer: it's a bust (at least in my opinion).

Started single player with DX12, RTX, and all Ultra settings. Performance was barely breaking 60 fps at 2560x1080, not a high resolution. It was often in the 50 fps range and felt barely playable.

Played the Norway snow mission, this was slightly better but still not great. I set DXR to low, and I got in the 75 fps range but it still felt sluggish and I didn't even see any reflections in the level.

Then I switched DXR off, went back to DX11, and enabled the BF1 SLI profile. Now I was getting 164 fps locked (my frame limit for my 166Hz monitor). A few parts dipped into the 140-150 range, but it mostly stayed at 164.

Just from that, I can see no reason to legitimately want to run DXR. With it on, on my SLI rig, I was literally getting 1/2 the fps (in the best case) or less than 1/3 the fps at worst (or potentially less, if I disabled frame limit). And in the levels I tried, there was no noticeable difference in picture quality. It looked the same.

In multiplayer, it was a different story, but not a different conclusion. Basically, with DXR low, I was getting in the 75 - 85 fps range. It was playable, and in the Rotterdam map, you could see the effect actually doing something. So that was nice.

With DX11 SLI, I was getting 90 - 125 fps, so better, but short of single player performance. For a competitive shooter, though, I would take the higher fps any day. Also, although the RT was nice, when you are running fast trying to survive, I doubt you will have time to stop and look at the glass in a window.

So, overall, it did look good. I'll give them that. At least in the maps where it worked. But was it "worth losing 100fps and becoming unplayable" good? No.

Maybe if they release a patch to support DX12 mGPU, that would be something. Some optimization (running ray tracing at a lower resolution, etc.) would help as well. As-is, it's not worth the performance cost to me.
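
To make that last idea concrete, here's roughly what I mean by running the ray-traced effect at a lower resolution: trace at half res and upsample before compositing. This is a toy Python sketch with a stand-in trace pass, purely illustrative; it has nothing to do with how Frostbite actually implements DXR.

```python
# Toy sketch: trace reflections at half resolution, then bilinearly
# upsample to full resolution before compositing. Half res in each
# dimension means roughly 1/4 the rays per frame.
import numpy as np

def trace_reflections(w, h):
    # Stand-in for the expensive DXR pass; its cost scales with w*h.
    return np.random.rand(h, w).astype(np.float32)

def upsample_bilinear(img, new_w, new_h):
    h, w = img.shape
    ys = np.linspace(0, h - 1, new_h)
    xs = np.linspace(0, w - 1, new_w)
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    wy, wx = (ys - y0)[:, None], (xs - x0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

full_w, full_h = 2560, 1080
half = trace_reflections(full_w // 2, full_h // 2)   # ~1/4 the ray cost
reflections = upsample_bilinear(half, full_w, full_h)
print(reflections.shape)  # (1080, 2560), ready to composite over the raster pass
```

You'd trade some sharpness in the reflections for a big chunk of the fps back, which in a shooter seems like an easy trade.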
 
Played BFV for 2 hours and tested RTX. Short answer: it's a bust (at least in my opinion). ...

I get about 75 fps at 3440x1440 with DXR low. Was SLI on and maybe hurting perf slightly? Not that it matters much.

My monitor is only 60Hz, so in my use case I lose 0% performance turning DXR on. As a medic you start to notice small things, like the next room lighting up, that help. lol
 
Played BFV for 2 hours and tested RTX. Short answer: it's a bust (at least in my opinion). ...


I wonder where the difference in our framerates is coming from, considering I'm playing at a higher resolution (2560x1440). My 8700K @ 5GHz?

I came to a somewhat similar conclusion to you though. In multiplayer, while it did look nice, I'd rather have the extra frames per second as I'm too focused on movement and whatnot to pay attention to the pretty reflections.

Single player was where I really enjoyed it. As I said before... With my GPU overclocked, I average between 60 - 80fps with drops into the 40s on single player. I found this more than acceptable and the slower pace of the campaign allowed me to look around and enjoy the sights a bit.

That being said, I'm still not wowed by the Ray Tracing effects and just wanted the extra horsepower a 2080Ti gives you over a 1080Ti. In that regard, I'm very satisfied.
 

Pretty much what I expected to happen. In videos the RTX does look better, but not half-the-framerate better, and in a game like this, by the time you are moving and focusing on spotting enemies, you will most likely miss the effect altogether. I think Metro Exodus, with its RTX global illumination, will look far more impressive, and the slower pace of the game will suit it better.
 
And what FPS will a completely ray-traced game run at? 3-5 fps. All they're ray tracing now is puddles, and that alone seriously compromises FPS.
That's the thing. In order to get more ray tracing performance, more silicon area will need to be dedicated to it. It's just a question of how quickly the game industry will shift its rendering engines to accommodate such a change, if at all. Such a drastic change in rendering practices won't happen overnight, and it will certainly be painful in the interim.
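
To put rough numbers behind that "3-5 fps" guess above, here's a back-of-envelope sketch. The samples-per-pixel and bounce counts are illustrative assumptions (offline path tracers typically need hundreds of samples per pixel to converge), and the ~10 gigarays/s figure is NVIDIA's own marketing number for the 2080 Ti, which is surely an optimistic upper bound.

```python
# Back-of-envelope: frame rate for a fully path-traced game at 1080p.
# Every input here is an illustrative assumption, not a measurement.
width, height = 1920, 1080
pixels = width * height

samples_per_pixel = 500   # assumed; offline renderers often use more
rays_per_sample = 4       # primary ray + shadow/reflection/GI rays (assumed)

rays_per_frame = pixels * samples_per_pixel * rays_per_sample

budget = 10e9             # ~10 gigarays/s, NVIDIA's 2080 Ti marketing figure

print(f"Rays per frame: {rays_per_frame / 1e9:.1f} G")    # ~4.1 G
print(f"Theoretical fps: {budget / rays_per_frame:.1f}")  # ~2.4 fps
```

Even granting the peak marketing figure, you land in the low single digits. Games only stay playable today by tracing a small subset of effects at far lower sample counts and denoising the result.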
 

I like the pretty popular idea of RT having an add-on card: minimal RT on the main card (for adoption/cost purposes), and then slap on another card if you're into it. I remember reading that parallel processing of RT is easy to make efficient (god knows where I read that, though); each pixel's rays are independent, as the toy sketch below illustrates.

I wonder how many tensor cores sit idle... I know they do the denoise portion of RT. Well DLSS should put them to good use if it’s ever implemented. I was always more excited about that than RT.
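
On the parallelism point: each pixel's rays can be computed without reading any other pixel's result, which is why the work splits so cleanly across cores, or in principle across cards. A toy Python sketch, where trace_pixel is a made-up stand-in and nothing like a real renderer:

```python
# Every pixel is independent, so a ray-traced frame is embarrassingly
# parallel: just split the pixels across workers.
import os
from multiprocessing import Pool

WIDTH, HEIGHT = 320, 180

def trace_pixel(xy):
    x, y = xy
    # Stand-in for real ray-scene intersection: shade by distance from
    # the image center so the output is deterministic.
    r = ((x - WIDTH / 2) ** 2 + (y - HEIGHT / 2) ** 2) ** 0.5
    return int(255 * max(0.0, 1.0 - r / (WIDTH / 2)))

if __name__ == "__main__":
    coords = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
    with Pool() as pool:
        # No pixel reads another pixel's result, so map() needs no locks.
        framebuffer = pool.map(trace_pixel, coords)
    print(f"Traced {len(framebuffer)} pixels across ~{os.cpu_count()} workers")
```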
 
A specialized card is a non-starter as it will fragment the market even further and destroy any hope of RT catching on. Best case they let you dedicate one of your regular graphics cards to RT like they do with PhysX.
 
Not running smooth for me... Still runs worse compared to DX11. It's better, but still not as good.

I heard DX12 runs smoother with RTX on than with it off (not sure if you're using that). I haven't personally tried it with it off. I might try later with scaling, though, just to make sure I'm not missing out on scaling fidelity > DXR.

I use 60Hz, though. Plus the swaying might cover it up for me. If you're at a higher refresh rate you could be more sensitive.
 

That's part of the problem. The frame drops are just so noticeable when DX11 is so smooth at 120Hz/120fps. It's just not worth it.
 

Being honest, I'd pick high Hz over DXR if that were an option for me! Maybe in two years, when the new ultrawide Predator has been out for a while.
 
Yeah, for twitch games like BO4 I've lately been going with 120Hz 1080p on the OLED vs. 4K 60Hz. Still happy with the 2080 Ti for that; with all settings on ultra it does 120Hz easily. I could have just gone with a 1080, but since I was upgrading from a 9-series I figure the 2080 is worth it for the NVLink SLI improvements when the price drops in a year or two.
 

1080p does benefit greatly from supersampling. If the game doesn't have a scaling bar, I enable DSR in the nVidia control panel and render at higher resolutions when I have extra GPU headroom to burn.

Deep down inside I wish they'd gone all CUDA cores instead of RTX, heh.
 
All that said, I think if you have a single 1080p 60Hz monitor, you could run DXR and everything on Ultra with a 2080 Ti and get 60+ fps. So that's something.

For me, I just got this 166Hz ultrawide monitor, and seeing how smooth it is, it's kind of hard to drop to even 75Hz.

I'm still really excited about ray tracing. I mean, in the parts where it worked, it did really *look* next-gen. So I could see this being great if developers can figure it out; maybe they just didn't have enough time to do it right.
 
I FINALLY snagged an EVGA XC Ultra 2080 Ti from Newegg. Almost got the Sea Hawk, but I need two of them eventually and mounting the radiators would be a pain. Just went with a single for now to see how it goes (and I'm keeping my SLI 1080 Tis as backups). If things go well and it doesn't explode, I'll add a second one. Excited to try Ray Tracing and see how it performs otherwise.
 
This dude is claiming to have gotten 2080 Ti SLI working with DXR. What does he know?

 
So I don't know what I was smoking last night; SLI is not really working, even in DX11. I was sure I was getting good performance, but maybe it was just the level I was testing.

I tried again tonight. With DX11 SLI I was getting activity on both cards, but usage was staying under 40% on both. Tested in the France SP level, getting in the 90-100 fps range, not as impressive. I also noticed bad blur artifacts that seem SLI-related.

Tested DX12 DXR in the France map; at DXR Ultra I was getting 30-40 fps, completely unplayable. With DXR Low it was hitting the 60-70 fps range, so just making the cut (though still choppy for my taste). Putting everything else on Low did not improve fps at all.

Then I tested DX12 with DXR off, and this was the most playable for me: around 135 fps average, give or take. Lowering settings helped a bit, but even with everything on Low it was still only around 155 fps; not worth sacrificing the image quality for 20 fps.

I'm going to try DXR with the frame limit at 60 fps. Maybe that will be playable.
 
A 60 fps frame cap (in BFV) did help somewhat with DXR low. At least when it was staying above 60 fps, the feel was consistent.

However, once I got into a firefight the fps quickly dropped into the 40s and even the 30s. Really sad.

When you pay $1,200 for a video card, you would hope for better than 30 fps at 1080p with the setting on low. Nvidia played us.
 
Honestly, I'd have liked to see the GPU offered in both RTX and GTX versions... RTX like what we have, and a non-ray-tracing GTX version... They could in theory have three classes: GTX, RTX, and Titan... Dang, what would the Titan cost?!? My wallet is scared...
 

Based on the many posts I've read, I think you're in the majority. I would have preferred if nVidia could have used standard compute functions instead of dedicated cores so AMD could join the party. Then again, since AMD's cards tend to be better at compute, that might have been a very bad move on nVidia's part, for nVidia. :whistle:
 