Real-Time Ray Tracing Is Enabled in Battlefield V

It’s on ultra settings with RTX low, which they said looks the same as the higher RTX levels.

And if someone spent $1200 to just run RTX features, something not even out yet, they are an idiot.



DX12 has some issues with that game that they hopefully resolve soon. They did mention that.

Fair enough, I guess I'll fess up to only skimming the article; I don't really give canned benchmarks more than that, as I know Kyle and Co. will always deliver the goods.
 
Let's hope it's software and not hardware limitations, because it's currently a joke. Good on 'em for innovating, but I'm thinking this bun should have stayed in the oven a little longer. Yay for not pre-ordering anything!
 
https://www.techpowerup.com/reviews/Performance_Analysis/Battlefield_V_RTX_DXR_Raytracing/

Wow, Vega 56/64 runs neck and neck with the 2070 with no RT @ 4K.

I don't remember where I saw it, but a Disney animator discussed why the single-card approach doesn't work to render and display high-FPS ray tracing; the studio's GPU cluster had 100 GB of VRAM, definitely not a 2080 Ti rig.
Yes, my Vega 64 performs decently in most games. That 2080 Ti looks tasty though. I personally wouldn't entertain spending $1,200 on a video card, but I see why some would.
 
Let's hope it's software and not hardware limitations, because it's currently a joke. Good on 'em for innovating, but I'm thinking this bun should have stayed in the oven a little longer. Yay for not pre-ordering anything!
Well to be honest, the real-time ray tracing is a separate toggle in the options. So you can enable all of the features in the game options like usual and then choose to use low, medium, high, or ultra ray tracing. That's what the guys streaming the game on Twitch showed us when they opened their options panel to enable real-time ray tracing. I don't think that it is worth going back to 1080p gaming for shiny floors, but some may certainly find it enjoyable and pleasing to the eye. It sure is taxing on the GPU.
 
It’s on ultra settings with RTX low, which they said looks the same as the higher RTX levels.

And if someone spent $1200 to just run RTX features, something not even out yet, they are an idiot.



DX12 has some issues with that game that they hopefully resolve soon. They did mention that.
On low it looks like you lose global illumination, unless they're just not performing ray pathing on distant surfaces.

LOW:
upload_2018-11-14_12-45-12.png


ULTRA:
upload_2018-11-14_12-46-13.png
 
I still don't like raytracing for games. There needs to be something done with the geometry and lighting to fix it.

The surfaces look too smooth/shiny.

And really... a mirror finish floor in a building.. in a war no less. Yeah.. not happening.

That being said, I haven't played BFV so maybe some of it just has to do with the game design.

The person who crafted that level made a shiny surface and thus it was rendered that way. If you want to make a point...at least make the right one.
 
Ignoring the shiny floor, the light scattering from the lamps inside the building looks amazingly realistic to me. Can somebody dig up a screenshot in that same building with ray tracing disabled?

If it looks nearly as good with it off, I'd be surprised. Also, the shading on the chair and the reflections of neighboring buildings on the floor look insane. Oh, and the way the light interacts with the player's jacket. When the power is there to do this at high res and high framerates, nobody will complain. This gen doesn't have enough horsepower to mainstream the technology yet, ofc.

Even if the scene isn't fully ray traced and this is only 1080p, i doubt rtx is going anywhere. This kind of thing in real time has been a dream for decades.

I'm sure some special forces were assigned floor waxing duty special for this map to show off the power of RTX :p
 
What is the deal with DX12 running like crap? Is it just me, or does BFV have funky lighting effects, plus soldiers that look washed out and too bright in certain areas?

I would hope that will be going the way of the dodo soon. We should be seeing the last wave of games built on DX11 and retrofitted for DX12 in the next year or so, I think. DX12 has been out for what, 3 or 4 years now, and the development cycle is usually 5 years? That, combined with the slow death of Windows 7, should polish off ever seeing an AAA game with DX11 running better than DX12.
 
Wow, so I can pay $1200 to have worse performance at 1440p with raytracing at LOW than my $700 1080 Ti without any RT tech, in a largely online competitive game. IT JUST WORKS (n), what a fucking joke, sucks to be the early adopters who got suckered by this BS.
 
Wow, so I can pay $1200 to have worse performance at 1440p with raytracing at LOW than my $700 1080 Ti without any RT tech, in a largely online competitive game. IT JUST WORKS (n), what a fucking joke, sucks to be the early adopters who got suckered by this BS.
I think it would work really well in puzzle type games where you are searching your surroundings and having it look pretty without worrying about a fast moving pace.
 
On low it looks like you lose global illumination, unless they're just not performing ray pathing on distant surfaces.

LOW:
View attachment 120018

ULTRA:
View attachment 120019

I’d still take that over off. It really does it for me, even in pictures; I imagine real-time play is way better. I am going to order BFV tonight.

I run 3440x1440 60Hz so ultra + rtx low seems like a perfect fit.
 
I still don't like raytracing for games. There needs to be something done with the geometry and lighting to fix it.

The surfaces look too smooth/shiny.

And really... a mirror finish floor in a building.. in a war no less. Yeah.. not happening.

Does the mirror finish of a high-end apartment magically disappear when a war breaks out? The war was fought inside regular houses, some of which were owned by very wealthy people.
 
Does the mirror finish of a high-end apartment magically disappear when a war breaks out? The war was fought inside regular houses, some of which were owned by very wealthy people.
If it hasn't been cleaned, I think dust might settle on it? Just a guess.
 
I'm running BFV at 3440x1440 ultra with medium DXR and it's playable.

That’s great! I have my card BIOS and shunt modded. I am interested to see what happens to my power limit with RTX on vs. off. I wonder how much fps is gimped by the PL... or where the bottleneck is.
 
I think it would work really well in puzzle type games where you are searching your surroundings and having it look pretty without worrying about a fast moving pace.
No need for hardware RT then surely?
 
So, a little over 60 fps average at 1080p with a 2080 Ti, which of course means minimums well below 60 at times. Some of you were living in denial when I told you that 60 fps was going to be the target at 1080p with a 2080 Ti, but at least now here's a reality check for you people. This is also going to be the same result for every single triple-A game that has decent graphics, so get used to it. There is one clown on here that seems to live in a fantasy land thinking ray tracing is going to work at 4K and way, way over 60 fps.
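The averages-vs-minimums point is easier to see as frame-time arithmetic. A quick sketch (the fps figures here are illustrative, not measured benchmark results):

```python
# Convert an fps figure into its per-frame time budget in milliseconds.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

# A ~60 fps average leaves roughly a 16.7 ms budget per frame, while a
# minimum dipping to 45 fps means some frames take about 22.2 ms instead.
print(round(frame_time_ms(60), 1))
print(round(frame_time_ms(45), 1))
```

This is why a 60 fps *average* still feels rough: the slow frames cost disproportionately more time than the fast ones save.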
 
In the second pic, the lamp on the far wall is casting duplicate reflections on the floor right next to each other. Does that seem normal? It looks a little weird and not accurate for ray tracing, I would think. I don't see another light source that would cause that.

*edit: actually, both the low and the ultra shots show that, along with other abnormalities I wouldn't expect with ray tracing.

*edit: there is a blue team box covering the other light. I guess that's where it's coming from.
 
Yes, on the low settings.

With a 50% hit in an fps game, lol. Ray tracing was foisted on us gamers because of nVidia's professional market ambitions. If they had left out the RT and tensor cores, we would have had a faster traditional raster chip, and it probably would have cost less. Well, maybe Intel will disrupt the market in 2020; I've given up on AMD.

Also, imo, real-time destructible environments and physics are far more important than ray tracing in games. Imagine how much more fun games like Blackout would have been if they could handle it. Too bad nVidia killed Ageia and PhysX.
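As a back-of-the-envelope check of that "50% hit" figure, the fractional cost of enabling a feature falls out of two fps numbers (the 130/65 values below are made up for illustration, not measured results):

```python
# Fractional performance hit from enabling a feature, given fps with it
# off and with it on. Returns 0.5 for a 50% hit.
def perf_hit(fps_off: float, fps_on: float) -> float:
    return 1.0 - fps_on / fps_off

# Hypothetical example: 130 fps raster-only vs 65 fps with DXR enabled.
print(perf_hit(130.0, 65.0))
```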
 
Who knew this was nothing more than a gimmick... I paid $1,100 for my EVGA Titan, which lasted me over 6 years till I was gifted a 1080.
 
With a 50% hit in an fps game, lol. Ray tracing was foisted on us gamers because of nVidia's professional market ambitions. If they had left out the RT and tensor cores, we would have had a faster traditional raster chip, and it probably would have cost less. Well, maybe Intel will disrupt the market in 2020; I've given up on AMD.

Also, imo, real-time destructible environments and physics are far more important than ray tracing in games. Imagine how much more fun games like Blackout would have been if they could handle it. Too bad nVidia killed Ageia and PhysX.

How about some proper AI bots in games? How many times have I slaughtered a room of enemies with all kinds of weapons, and never once do they react in any realistic way; they just blindly pathfind to me, making the slaughter easier.

It would be nice if they ran, or panicked, or used any semblance of tactics. Honestly, this one thing would take immersion to a whole new level. Instead, we really haven't advanced AI much since the 2000s.
 
If you like playing at DXR low. A 2080 Ti is at 49.6 fps at Ultra; that, my friend, blows chunks, and you can't even get 30 fps with it on in 4K unless it's on low.

Considering low looks about the same as ultra I don’t see the issue with trying it, especially if it’s 73 fps at 1440p.

If people have trouble seeing the difference on a still frame/video there basically is no difference imo.
 
Considering low looks about the same as ultra I don’t see the issue with trying it, especially if it’s 73 fps at 1440p.

If people have trouble seeing the difference on a still frame/video there basically is no difference imo.
There are sometimes settings where motion will bring out the differences.
 
There are sometimes settings where motion will bring out the differences.

I’ll dabble with it. I am interested to see how much power it uses and if that’s a contributor to the performance hit.

Just think if it is. What if you could get 30% higher RTX performance by modding/voiding the warranty on a card known to fail, bahahahahah.
 
ROFL...

NICE. A $1,200 GPU can't hold its own with RT enabled. Called it, which is why I'll hang onto my two 1080 Tis till the next gen, or the gen after that, comes out at a more practical price.

All those who bought the 2080 Ti got jacked for their money.
 
Imagine what the Resident Evil 2 remake is gonna look like with ray tracing.
 
I can't believe all the misguided children in here who must be crying because they can't or won't pay for the 2080 Ti. This is an enthusiast site, and you should be ashamed of yourselves. Everyone here should be excited about new tech, even if they don't buy this generation. I can't help but feel people have really gone off the deep end with jealousy to act as immature and childish as they are.

I, for one, still applaud Nvidia for creating the most exciting step forward in generations of the same video tech. I want the future, not to be stuck in the past, regardless of whether it takes a couple of generations to become affordable.
 
nVidia is telling people that only RTX "low" is optimized right now. They are working on fixing the higher settings. They are recommending using RTX "low" at this time.

From nVidia:

DXR Preset Performance Impact

We recommend in this first release of DXR that “DXR Raytraced Reflections Quality” be set to “Low” due to Battlefield V’s known issues when using “Medium”, “High”, and “Ultra” settings. EA, DICE, and NVIDIA will continue to optimize this implementation and deliver regular updates.


For more information from EA, check the links below:

Reddit – Forums – https://forums.battlefield.com/en-us/discussion/161023/battlefield-vs-known-issues-list
 
https://www.techpowerup.com/reviews/Performance_Analysis/Battlefield_V_RTX_DXR_Raytracing/

Wow, Vega 56/64 runs neck and neck with the 2070 with no RT @ 4K.

I don't remember where I saw it, but a Disney animator discussed why the single-card approach doesn't work to render and display high-FPS ray tracing; the studio's GPU cluster had 100 GB of VRAM, definitely not a 2080 Ti rig.
It's also impressive how, at 4K, the Fury X is right on the heels of the GTX 1070 too. DICE certainly knows how to develop a game that looks great and actually runs efficiently on both AMD and Nvidia cards.
 
You want to talk about misguided? How about your ignorant assumption of jealousy? Maybe pay attention to the comments, as many people here have high-end systems and can easily afford a 2080 Ti. Now, if you remove your head from your rear end, you will see what most people are talking about here. The issue is that people who can afford a 2080 Ti sure as hell do not want to game at 1080p just to turn on a ray-tracing effect. Most of those people have 4K or 1440p screens, so running well below native resolution and degrading the whole image just to turn on one effect is idiotic and asinine. But hey, maybe there are a few people silly enough to have a native 1080p screen that will buy a 2080 Ti.

That said, I do agree that we had to start somewhere with ray tracing. If Nvidia didn't come out with cards that support it, then developers sure as hell wouldn't work on it for future games. But, like most people, I sure as hell am not paying $1,200 to be an early adopter and play at 1080p without even being able to maintain 60 fps. I am thankful for those that are willing to make the sacrifice, though...

You can play 1440p Ultra / low RTX at 73 fps that multiple sites say is nearly identical to RTX high/ultra. That is better than I expected for the first go around.
 