cageymaru

Fully [H]
Joined
Apr 10, 2003
Messages
22,076
Real-time ray tracing is the new buzzword in PC gaming now that Nvidia has made it the centerpiece of the RTX 2000 series reveal. Digital Foundry has been privy to new information about how the tech runs under the hood of Battlefield V from their discussions with DICE employees. For example, DICE is looking into an option that lowers the detail and resolution of the real-time ray tracing while keeping everything else at a higher resolution such as 4K. Right now, running on beta drivers, enabling real-time ray tracing on an RTX 2080 Ti at 4K creates an undesirable sub-30 fps situation. 1440p was better, but still sub-50 fps at times.

If you're interested in more Battlefield V information, there is an interview with the developers that touches on subjects like monetization, and of course there's the announcement that the release has been delayed to give the game more polish.

The optimisations are many, but the fact remains that ray tracing like this is still vastly expensive from a computational perspective, even with dedicated hardware acceleration. The RTX implementation as it stands right now is designed to keep you above 1080p60 on an RTX 2080 Ti. The demo stations were locked to the 1080p resolution of the attached high refresh rate display, but the internal scaler allowed us to simulate 1440p and 4K resolutions. On the former, we eyeballed frame-rates in the 40-50fps area but 4K plummets into sub-30 territory. Chatting with DICE later, there was surprise that the game was running at all with 4K and RTX enabled. It really is early days with the implementation and final hardware, so even the developers are not entirely sure of how far things can be pushed.
 
Why really bother implementing ray tracing in a multiplayer platform where most competitive players kill the video settings for more lag-free responsiveness and more frags?
 
Why really bother implementing ray tracing in a multiplayer platform where most competitive players kill the video settings for more lag-free responsiveness and more frags?

Why bother making graphics better at ALL then? Because EVERY multiplayer gamer turns off ALL the settings ALL THE TIME, so NOBODY EVER uses high-graphics settings, and it by no means acts as a driving force to sell higher-end PC hardware...
 
Why really bother implementing ray tracing in a multiplayer platform where most competitive players kill the video settings for more lag-free responsiveness and more frags?

I agree with your sentiment, but I can think of at least two reasons: one, you can actually have an environment where being able to see things others can't will be to a skilled player's advantage; and two, BF has never been truly "competitive" in that sense, and even if it were, this can be a way to move copies. I'm down in the 1080p max-FPS camp, and when I hear people say "this RTX will help competitive players utilize reflections," I laugh pretty hard... it won't happen this gen or next gen, but at some point it will. For the time being, it's just some "cool" effects you can sell to people who think 60 FPS is king... or that 30 FPS is "cinematic."
 
Why bother making graphics better at ALL then?
Good question. It's not like it makes for a better game; it costs more money, more time, and more hardware to run, and it probably takes dev time away from making a better game. Really good question indeed.

Personally, I'll pay 60+ for a good game, and 0 for a game that's just pretty.
 
I'm an eye-candy-on-ultra kinda guy. I'm going to sit this generation out, but I'm looking forward to Gen 2.
 
Why really bother implementing ray tracing in a multiplayer platform where most competitive players kill the video settings for more lag-free responsiveness and more frags?

Because most players aren't really competitive?

Apart from that, Battlefield V will also have a single player campaign.
 
Because most players aren't really competitive?

Apart from that, Battlefield V will also have a single player campaign.
I'm not competitive at all, but I make absolutely sure that I maintain my fps to at least always match the refresh rate of my display. I'm not going to play to die on purpose. Ha ha! :)
 
Why bother making graphics better at ALL then? Because EVERY multiplayer gamer turns off ALL the settings ALL THE TIME, so NOBODY EVER uses high-graphics settings, and it by no means acts as a driving force to sell higher-end PC hardware...

Because a majority of the community would love to see this level of ray tracing (me specifically) and that is where we're heading.

buddha.jpg
 
I'm still confused here on the dedicated hardware. If you enable a feature that doesn't touch the normal compute cores why does the fps tank? Isn't it supposed to be asynchronous according to the slide on Turing frames?
 
This is shaping up to be very close to my projections!

I figured optimizations and the workload would be tailored to make all the RTX cards useful. Plus, the scaling between cards just matched the various resolutions/frame rates too closely.
 
Why really bother implementing ray tracing in a multiplayer platform where most competitive players kill the video settings for more lag-free responsiveness and more frags?

I know right? Competitive players make up 100% of the player base so why even bother! Let's make everything look like CS 1.6 so that things can be 10000% lag free and competitive player focused!

This is my campaign for Shitty Graphics President of 2018, for the players by the players, and I stand by these statements.
 
4K at sub-30 fps at times KILLS me.

I was hoping the new Titan card could do it...

Maybe Gen 2?
 
4K at sub-30 fps at times KILLS me.

I was hoping the new Titan card could do it...

Maybe Gen 2?

The card is rumored by the PR machine to kill it at 4K, but when you throw a gen 1 feature like ray tracing into the mix, what do you expect? I suspect that even gen 2 won't manage a fully fluid ray tracing implementation at 1440p, much less 4K/60.
 
I still have seen nothing that makes me regret my purchase of a Vega 64. The Nvidia fans can keep this $1200 turd. I will wait for something better.
 
It's going to look so nice in a smaller-scale game where they use the render power on more detail. I just built an all-AMD desktop with a Vega 64; it will be interesting to see if AMD has any answer for ray tracing in their next hardware.
 
Why really bother implementing ray tracing in a multiplayer platform where most competitive players kill the video settings for more lag-free responsiveness and more frags?

Countdown until someone finds a car bumper or some shit angled just right that reflects towards a window that reflects motherfuckers coming from 80 feet and 2 corners away
 
I love how someone who has been in the business for decades warned us that we aren't going to see real performance numbers until later because of an NDA, yet people are freaking out over numbers from BETA drivers.

Man, some of you would fuck up a wet dream. All of this debate over something that isn't even "tangible" yet.
 
Why bother making graphics better at ALL then? Because EVERY multiplayer gamer turns off ALL the settings ALL THE TIME, so NOBODY EVER uses high-graphics settings, and it by no means acts as a driving force to sell higher-end PC hardware...


Well... Sh!t. Guess I've been playing games wrong all along... 'cause I'm apparently one of the few who plays Battlefield maxed out @ 1440p.

What bothers me is the fact that with the 2080 Ti at 4K... it can't even get past 30 FPS...

WHY??? ...you know what, never mind.
 
I'm still confused here on the dedicated hardware. If you enable a feature that doesn't touch the normal compute cores why does the fps tank? Isn't it supposed to be asynchronous according to the slide on Turing frames?

Because the GPU can't calculate all those rays in a vacuum.

The main issue with real-time ray tracing has always been that games tend to cheat a lot in order to render only what you can actually see. In a game you don't need what is behind you rendered; the engine will keep track of those objects, but it has no need to calculate much else. For ray tracing to work, the engine needs to know what each ray is bouncing off, and you can't know that if you aren't calculating exact size and surface info for items outside the field of view (those objects may well be in the FOV of the ray being calculated).

Forward rendering (the old way of doing things) may well end up being the best way to apply ray tracing tech. Unreal used forward rendering for years and didn't add a deferred rendering pipe until Unreal 4. I imagine a lot of developers are toying with using forward rendering again.

My point is that turning ray tracing on means the engine has to calculate a lot of things developers had been using tricks to avoid calculating before. Nvidia can talk about AI to reduce that a bit, but AI can't fill in scene physics; it will be great for reducing RT noise, yet a scene using RT lighting is still going to need to calculate a bunch of stuff developers had the engine skipping before.

The main issue for developers right now is that most of the major engines default to deferred rendering, and most developers with in-house engines are using deferred rendering as well. I'm not an expert, but considering how deferred rendering calculates lighting sources, etc., I can't imagine RT would work well at all with that method of rendering.
https://en.wikipedia.org/wiki/Deferred_shading
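To make the "rays need the whole scene" point concrete, here's a rough C++ sketch I threw together (toy names and a toy sphere test, not anything from DICE or Nvidia): the raster pass can frustum-cull down to what the camera sees, but the ray pass has to keep every object available, because a reflection ray can hit something behind the camera.

#include <cstdio>
#include <vector>

// Toy scene: spheres with a position and a radius.
struct Sphere { float x, y, z, r; };

// Raster pass: only submit what the camera can see (crude "in front of the camera" cull).
std::vector<Sphere> frustumCull(const std::vector<Sphere>& scene) {
    std::vector<Sphere> visible;
    for (const Sphere& s : scene)
        if (s.z > 0.0f)
            visible.push_back(s);
    return visible;
}

// Ray pass: test a ray against the FULL scene, not the culled list.
bool rayHitsAnything(const std::vector<Sphere>& scene,
                     float ox, float oy, float oz,    // ray origin
                     float dx, float dy, float dz) {  // ray direction (unit length)
    for (const Sphere& s : scene) {
        float lx = s.x - ox, ly = s.y - oy, lz = s.z - oz;
        float t  = lx * dx + ly * dy + lz * dz;          // projection onto the ray
        float d2 = lx * lx + ly * ly + lz * lz - t * t;  // squared distance to the ray
        if (t > 0.0f && d2 < s.r * s.r) return true;     // standard ray/sphere test
    }
    return false;
}

int main() {
    // One sphere in front of the camera, one behind it.
    std::vector<Sphere> scene = { { 0, 0,  5, 1 },    // visible to the raster pass
                                  { 0, 0, -5, 1 } };  // behind the camera, culled

    std::printf("raster pass submits %zu of %zu objects\n",
                frustumCull(scene).size(), scene.size());

    // A reflection ray bouncing off the visible sphere back toward the camera
    // still needs the off-screen sphere to shade the reflection correctly.
    bool hit = rayHitsAnything(scene, 0, 0, 4, 0, 0, -1);
    std::printf("reflection ray hits off-screen geometry: %s\n", hit ? "yes" : "no");
    return 0;
}

Silly example, but it shows why the engine can't just hand the RT pass the same culled object list the raster pass gets.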
 
Countdown until someone finds a car bumper or some shit angled just right that reflects towards a window that reflects motherfuckers coming from 80 feet and 2 corners away
Car bumper? With all the shiny reflective surfaces in the presented implementations of ray-tracing, you could use a cardboard box!
 
So, RT enabled and getting 50 FPS? That's just fine for me. I'm not a competitive MP guy.

Well... Sh!t. Guess I've been playing games wrong all along... 'cause I'm apparently one of the few who plays Battlefield maxed out @ 1440p.

What bothers me is the fact that with the 2080 Ti at 4K... it can't even get past 30 FPS...

WHY??? ...you know what, never mind.

If you watched the video, they expect to get another 30% out of it, so that'd be 65 FPS at 1440p. They also want to possibly separate the ray tracing resolution from the rasterized resolution, so you could have your normal high-FPS 4K and render the ray tracing at a lower resolution, which is still way better than the crap they have now.

It's truly impressive, especially how much they did so quickly.
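Back-of-the-envelope on what separating the ray tracing resolution from the rasterized resolution could buy (my own toy C++ illustration of the idea, not DICE's code; I'm assuming roughly one RT sample per RT-buffer pixel): the ray workload scales with the RT buffer's pixel count, so dropping the RT scale cuts the ray work quadratically while the raster target stays at native 4K.

#include <cstdio>

int main() {
    const int outW = 3840, outH = 2160;              // native raster resolution (4K)
    const float rtScales[] = { 1.0f, 0.75f, 0.5f };  // hypothetical RT resolution options

    for (float s : rtScales) {
        int rtW = static_cast<int>(outW * s);
        int rtH = static_cast<int>(outH * s);
        long long rays = 1LL * rtW * rtH;            // ~1 ray sample per RT pixel (assumption)
        std::printf("RT scale %.2f -> %dx%d RT buffer, %.1f%% of the full-res ray work\n",
                    s, rtW, rtH, 100.0 * rays / (1LL * outW * outH));
    }
    return 0;
}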
 
Why bother making graphics better at ALL then? Because EVERY multiplayer gamer turns off ALL the settings ALL THE TIME, so NOBODY EVER uses high-graphics settings, and it by no means acts as a driving force to sell higher-end PC hardware...

This is what I find so confusing about some PC gamers... among other things, there are people who are just obsessed with FPS in games even though they aren't pro gamers. That always confused me, because the vast majority of us aren't pro gamers, so sacrificing resolution for frames just doesn't make sense, especially if you just built a brand new rig. Playing games at 4K on my KS8000 at 60 fps is more than adequate, and yes, I have owned a 34" ultrawide 3440x1440 100 Hz monitor before, and while it was nice, I preferred the higher resolution. In short, I'd rather have 4K at a solid 60 fps than 1440p at 100 Hz+. I know this is entirely a preference on my part, but if you have the latest hardware then you might as well have the best video quality.
 
I'm still confused here on the dedicated hardware. If you enable a feature that doesn't touch the normal compute cores why does the fps tank? Isn't it supposed to be asynchronous according to the slide on Turing frames?

Basically, everything else has to sit around and wait for the ray-traced lighting to finish and get applied to the frame before it can be displayed.

Right now, it's a massive bottleneck for the GPU when turned on, no different than having too little RAM in your system. There just aren't enough RT cores to keep up with everything else.
 
I'm still confused here on the dedicated hardware. If you enable a feature that doesn't touch the normal compute cores why does the fps tank? Isn't it supposed to be asynchronous according to the slide on Turing frames?

The Turing slides showed RT being computed in parallel, not asynchronously. This is a big difference. When doing computations in parallel, the frame is not complete until ALL elements are rendered. In this case we are seeing the RT elements take much more time than the rasterized elements, so RT limits the frame rate regardless of how fast everything else gets done. This is why doing RT at lower resolutions increases fps: since RT is the bottleneck, reducing its resolution reduces the time it takes.

If RT were actually being done asynchronously, there would be RT elements from previous frames showing up in the current frame, which would create goofy artifacts and irregularities.
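To put rough numbers on that parallel-but-not-asynchronous point (the timings below are made up purely for illustration, not measurements): the frame can't present until both the raster work and the RT work for that frame are finished, so the slower of the two sets the frame rate, and shrinking the RT resolution only shrinks the RT term.

#include <algorithm>
#include <cstdio>

// Parallel, not asynchronous: the frame presents only after BOTH passes finish,
// so the slower pass dictates the frame time.
double frameMs(double rasterMs, double rtMs) {
    return std::max(rasterMs, rtMs);
}

int main() {
    const double rasterMs = 10.0;   // made-up raster time at native resolution
    const double rt4kMs   = 36.0;   // made-up RT time at 4K
    const double rt1440Ms = rt4kMs * (2560.0 * 1440.0) / (3840.0 * 2160.0);  // ~44% of the pixels

    std::printf("RT at 4K:    %.1f fps\n", 1000.0 / frameMs(rasterMs, rt4kMs));
    std::printf("RT at 1440p: %.1f fps\n", 1000.0 / frameMs(rasterMs, rt1440Ms));
    return 0;
}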
 
Basically, everything else has to sit around and wait for the ray-traced lighting to finish and get applied to the frame before it can be displayed.

Right now, it's a massive bottleneck for the GPU when turned on, no different than having too little RAM in your system. There just aren't enough RT cores to keep up with everything else.
I've been skeptical that 10 gigarays is sufficient for RT. This basically confirms my suspicions. However, I've been rather hopeful and optimistic that there are optimizations which will enable "good enough" RT with any of the RTX cards, including the 2070. When using those optimizations or shortcuts, I expect the performance projections I've posted a few times.
 
I've been skeptical that 10 gigarays is sufficient for RT. This basically confirms my suspicions. However, I've been rather hopeful and optimistic that there are optimizations which will enable "good enough" RT with any of the RTX cards, including the 2070. When using those optimizations or shortcuts, I expect the performance projections I've posted a few times.


Honestly, I think DLSS is gonna end up being a much bigger deal for the 2000 series GPUs than ray tracing, especially if it ends up being something you can apply universally from the drivers. If I'm understanding it correctly, with DLSS the anti-aliasing is handled by the Tensor cores, so we get decent AA visuals at essentially no cost to the shader cores (sans an even bigger hit to performance while also using ray tracing, which anyone who wants decent resolutions or fps will have disabled anyway).
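Rough math behind why that would be a big deal (the 1440p internal resolution is just an assumed example; the actual upscaling network is Nvidia's and isn't sketched here): shading cost scales roughly with pixel count, so rendering internally at 1440p and upscaling to 4K skips more than half the shading work.

#include <cstdio>

int main() {
    const long long native4k = 3840LL * 2160;   // output resolution
    const long long internal = 2560LL * 1440;   // assumed lower internal render resolution

    std::printf("internal render is %.0f%% of the 4K pixel count\n",
                100.0 * internal / native4k);
    std::printf("roughly a %.2fx reduction in shading work before upscaling\n",
                static_cast<double>(native4k) / internal);
    return 0;
}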
 
Can I say that's a stupid question without it possibly being a personal attack? Guess I will not quote the question...
 
Has anyone verified CPU usage with RT enabled?

A DICE video said they targeted 12-thread CPUs, so my 2700X should be fine. Kappa. I haven't seen charts yet.

I think we are getting BFV for Christmas along with a 2080 Ti if they deliver on 1440p ray tracing at 60 fps+, especially if they make it possible to choose separate resolutions for rasterized and RT so you can get 4K at 60 fps+.
 