DICE and NVIDIA Collaborate to Increase DXR Performance in Battlefield V by up to 50%

cageymaru

DICE and NVIDIA are collaborating to fix bugs and optimize Battlefield V to meet their performance goal of running DXR, real-time ray tracing, at 60+ FPS. This work has created performance uplifts of up to 50% in some problematic areas of the game. The companies are expected to continue this partnership into the future.

We've been working closely with EA and DICE to rapidly optimize DXR Ray Tracing in Battlefield V, improving performance up to 50% and enabling real-time ray tracing at 60+ FPS. Learn about all the DXR performance optimizations in Battlefield V Tides of War Chapter 1: Overture from two of the talented developers at NVIDIA and DICE. Chapter 1: Overture launches on December 4.
 
Like everything new in 3D graphics, it takes time to optimize, fix, and change. Looking forward to more of this in the future. And I'm very curious how AMD is going to address this.
Until ray tracing becomes mainstream, AMD won't be doing anything about it. While, IIRC, they support ray tracing through an extension of their own, they've already stated they won't be introducing a mainstream part that truly supports it.

I expect this to be a gimmick to sell video cards until it's no longer a gimmick. Big publishers can fork out the dev time to support it, but not everyone can. It's a cool-looking addition to games, though it impacts nothing I play.
 
This is great news on many levels. First, real usable performance; second, real visual improvements - that's significantly better than an AA tweak. Third and more: we're seeing it in current/new titles, it's gaining adoption, and it's giving hardcore PC gamers something to benefit from besides just added FPS or AA bumps. Our $$ is getting us a better visual experience, and the fact that we're seeing a solid 60+ in only the first few weeks means we should see some great optimization in the future.

The next-gen cards in 18 months or so should be a substantial ray tracing leap too, driving this further. I wouldn't be surprised to see this fully take off in 2019 as a legit next-gen graphics leap. Can't wait for others to start taking advantage of the AI cores to offload further GPU cycles.
 
If this story is actually true, a 50% increase is huge. The game isn't that playable at 4K with DXR enabled on a 2080 Ti, but another 20 FPS or so, if the claim is true, will make it usable.
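For a rough sanity check on those numbers: a 50% uplift on a roughly 40 FPS baseline lands exactly at the 60 FPS target, i.e. the "+20 FPS" mentioned above. The 40 FPS 4K DXR baseline is an assumption for illustration, not a measured figure. A minimal sketch of the arithmetic:

```python
def fps_after_uplift(baseline_fps: float, uplift: float) -> float:
    """FPS after a fractional performance uplift (0.5 == +50%)."""
    return baseline_fps * (1.0 + uplift)

def frame_time_ms(fps: float) -> float:
    """Per-frame budget in milliseconds at a given frame rate."""
    return 1000.0 / fps

baseline = 40.0                              # assumed 4K DXR baseline, for illustration
boosted = fps_after_uplift(baseline, 0.5)    # +50% uplift

print(boosted)                  # 60.0 FPS, i.e. "+20 FPS" over the baseline
print(frame_time_ms(baseline))  # 25.0 ms per frame
print(frame_time_ms(boosted))   # ~16.7 ms per frame
```

Note that a 50% FPS increase corresponds to cutting frame time by a third (25 ms down to ~16.7 ms), which is why "up to 50%" in the worst spots is such a big deal.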
 

I think you are looking at 2020 or so for it to be something decent. For me 60 FPS is okay, but if you have a 1080 Ti it's probably best to drop to 1080p anyway. Even then, it looks like you would need a $1,200 card to play RTX at 1080p in either case, which is probably two generations behind. Until it comes to the mainstream, it won't really take off. It will be just a thing of luxury for now.

As I always mentioned, Turing is okay, but by the time they get all the features up to task, it's just an early adopter's tax. 2nd and 3rd gen RTX is where it will take off, and hopefully by then it's mainstream. Otherwise it will still be a thing of luxury.
 

For clarity, my reference point is that I don't care much for the mainstream. It's been somewhat dry up here in the leading-edge group of PC gaming, and for those who do have the latest and greatest, it's nice to see something genuine we get for that investment. So my 2019 is in reference to those with $1,300 GPUs and $3,000-ish rigs. I agree it could be more years and generations for it to make its way down. And the fact that there's actually a new technology in PC gaming is fantastic.
 
If this story is actually true, a 50% increase is huge. The game isn't that playable at 4K with DXR enabled on a 2080 Ti, but another 20 FPS or so, if the claim is true, will make it usable.

The wording makes me doubt it will be that good. But we'll see, I suppose.
 
For clarity, my reference point is that I don't care much for the mainstream. It's been somewhat dry up here in the leading-edge group of PC gaming, and for those who do have the latest and greatest, it's nice to see something genuine we get for that investment. So my 2019 is in reference to those with $1,300 GPUs and $3,000-ish rigs. I agree it could be more years and generations for it to make its way down. And the fact that there's actually a new technology in PC gaming is fantastic.

I am all for the upper end. But NVIDIA is going to have to suck my left nut for me to pay them double for the Ti lol. To me it isn't about money - I could buy the Titan in a second. It's more about the principle of the thing. I'll wait for Ti prices to come to their senses or I will stick with AMD. Honestly, NVIDIA doesn't give two shits about RTX adoption; they just want to sell their cards and make that money. I'll patiently wait until I don't have to adjust sliders to get decent frames with ray tracing before I jump on it. The ray tracing value isn't there right now at the price they are asking.
 
They turned down the ray tracing, changed the textures, and called it a performance improvement?

2:14 shows it perfectly on the gun: the barrel and bipod leg are still shiny metal, but the top of the gun is now matte? Pretty sure they weren't painting mass-produced guns. Looks like they added a new model with some really low-res texturing on it.
 

Of course - they know that as long as it shines, people won't notice. I can bet they toned it down quite a bit. Maybe ultra is the new medium and medium is the new low lol.
 

And in the video, didn't they say they didn't do anything to the graphics? lol
 

Honestly, it's one of those things that is in the eye of the beholder lol. They probably know people will fight over quality and there won't be a definitive conclusion, so they just roll with it.
 
It's good news. I will still wait for a few more titles, and for a refined card (less fire, heat, and space invaders). I notice now how good some of the pre-baked lighting in games is after seeing RTX. Some of it is pretty well done - see Doom, Lichdom: Battlemage, and about ten thousand other games.
 
Proof of what companies will have to go through to properly support this - all the extra money and time it will cost to optimize it and do it right. It's just another PhysX: works great when it's actually implemented, but it rarely is.
 
DICE and NVIDIA are collaborating to fix bugs and optimize Battlefield V to meet their performance goal of running DXR, real-time ray tracing, at 60+ FPS.

"Translation": NVIDIA pays DICE millions to X.
 
Awesome, so now with these new drivers and patch, instead of a slideshow BF5 will be more like a projector film from the 1930s.
 
Until ray tracing becomes mainstream, AMD won't be doing anything about it. While, IIRC, they support ray tracing through an extension of their own, they've already stated they won't be introducing a mainstream part that truly supports it.

It'll be utter stupidity if they wait for the first mover to become entrenched again. The time for AMD to do something about it is yesterday.

I expect this to be a gimmick to sell video cards until it's no longer a gimmick. Big publishers can fork out the dev time to support it, but not everyone can. It's a cool-looking addition to games, though it impacts nothing I play.

Hairworks, PhysX, and the like were arguably gimmicks. But ray tracing has always been the holy grail; it's always been where graphics were going to go once processing became powerful enough. This is what all the "lol raytracing suxx right now though cuz I need 144FPS" posts are missing - the holy fucking grail of graphics has begun iterating.
 

Yeah, but ray tracing is not exclusive to NVIDIA - it's part of DirectX; what's NVIDIA's is just their implementation of it. Developers will have no issue enabling it on AMD hardware, as it doesn't require a lot of work. I highly doubt developers are going to sit there and waste time on ray tracing if it only runs on NVIDIA. NVIDIA worked with Microsoft on it, so it's not exclusive.

NVIDIA is still using DXR, so no issue here - there is no such thing as falling behind. When AMD implements something similar, it will work just fine. For now, NVIDIA will lead the way, which is good, cuz they are the ones spending the money to get it going. AMD doesn't need to do shit except wait until they release a card with enough horsepower and compute power to do something similar. This won't be a case where games that do ray tracing won't work on AMD; it will be an easy thing to do.
 
No disagreement, but I wasn't talking about API proliferation - I was talking about hardware. I'd even hope AMD would say to hell with the artificial DX12/Windows 10 ray tracing lock-in and embrace the agnostic Vulkan solution, which would support Win7/8/10 + Linux + whatever ChromeOS is reinvented as. I'd dump my 1080 Ti and go AMD without a second thought.
 

NVIDIA has cash, so they pretty much do what they want. If they want a game to support their particular tech, they will make that happen. That's just the way they are. But honestly, I see no point in ray tracing when you need a $1,200 card to get 60 FPS at 1080p or above. The only time I'll see the benefit of it will be when it comes to the mainstream.
 
They turned down the ray tracing, changed the textures, and called it a performance improvement?

2:14 shows it perfectly on the gun: the barrel and bipod leg are still shiny metal, but the top of the gun is now matte? Pretty sure they weren't painting mass-produced guns. Looks like they added a new model with some really low-res texturing on it.

I mean, it's clearly not a perfect comparison, since there's mud on one gun and not the other...
 
No disagreement, but I wasn't talking about API proliferation - I was talking about hardware. I'd even hope AMD would say to hell with the artificial DX12/Windows 10 ray tracing lock-in and embrace the agnostic Vulkan solution, which would support Win7/8/10 + Linux + whatever ChromeOS is reinvented as. I'd dump my 1080 Ti and go AMD without a second thought.

Nvidia is already supporting ray tracing in Vulkan... and AMD will support it in DX12.
 
NVIDIA has cash, so they pretty much do what they want. If they want a game to support their particular tech, they will make that happen. That's just the way they are. But honestly, I see no point in ray tracing when you need a $1,200 card to get 60 FPS at 1080p or above. The only time I'll see the benefit of it will be when it comes to the mainstream.
Perhaps you didn't read the post, but you DON'T need a $1,200 card to get 60 FPS at 1080p, and you didn't before the latest optimizations were made. I guess many people are blinded by the anti-NVIDIA herd mentality here at HardOCP, but I for one am very glad I opted for an RTX 2080 and not a 1080 Ti. Looking forward to lots of fun with ray tracing in Battlefield and other titles going forward. Good luck with that on 10-series cards.
 

Was I actually responding to you in my post? You're right, I didn't read your post. So I guess you need a $700+ card to play at 1080p. I wasn't pointing at anyone or saying anyone is wasting money. Maybe you didn't read my post - I wasn't saying you shouldn't buy what you want; it was my opinion in general. Enjoy your game. So it's a $700 card to play a game, I guess. Look at my post: I was talking about the mainstream. I'm not sure why people feel the need to justify their purchase when I never asked them to. I don't remember saying you aren't enjoying your games. Let me know when it can do 144 Hz at 1440p at $700 and I'll be interested. Happy?
 
Why didn't this happen before the game was released?

Enjoy making your premium customers beta test your $1,200 product?
 

I think the Yanks call it double jeopardy. Now everything will focus on the previous patch versus the new and improved NVIDIA patch.
More drama to come, for sure.
And if you read closely:
This work has created performance uplifts of up to 50% in some problematic areas
Remember how Crysis had so many polygons in objects that tessellation crippled performance on AMD cards?

I'm guessing that they're removing properties from landscapes/objects that their hardware does not seem to handle correctly. If it were software, it would have been fixed already...
 
It'll be utter stupidity if they wait for the first mover to become entrenched again. The time for AMD to do something about it is yesterday.

It might be the holy grail of graphics processing; however, AR/VR was the holy grail of graphics processing not so long ago too. Ray tracing is something we've seen in rendered films, and yeah, it's cool. But it doesn't really affect most of us. 99% of the games on the planet don't support it. EA has the money to toss at it, and that's all fine and good, but the mass consumer doesn't have the money to drop on a 500-buck video card (the cards at the lower end that support it will suck donkey balls, by the way) in addition to their $500-700 computer/laptop.

So, for now, access to the tech is being gated by Ngreedia and video cards that range from 500-1500 bucks - the 2070, IIRC, being the lowest RTX card to support it, with a buy-in price around 500 bucks... and not really gonna push that 60 FPS target that EA and Ngreedia are working on in BF5.

I have a 2080 Ti, but I don't know many people outside these forums who actually own one (one person).

I think AMD has it right. They just don't have the stability, spare cash, or need to back it and dev a costly accelerator to push a lighting trick that looks incredible but is not widely adopted. The sheer requirements are staggering, and as I recall, the games that implement it have to hack it down to make it run fast enough that it doesn't screw the FPS into the dirt.

It's a cool tech, but we high-end users will be the only ones to see it this generation of cards. Maybe next gen, or the one after, or... in 3-5 years it might be in enough games to give a shit about.
 
Why didn't this happen before the game was released?

Enjoy making your premium customers beta test your $1,200 product?

Maybe because it's new tech, and lots of developers are still testing and optimizing it? Apparently, you do not know the R&D world.
 