Battlefield V: GeForce RTX 2080 Ti Rotterdam Gameplay - Guru3d

killroy67

[H]ard|Gawd
Joined
Oct 16, 2006
Messages
1,583
I don't get all of the hate. Yeah the cards are expensive, that's a valid complaint but you do realize this is real-time raytracing? Playing a game with real-time raytracing has been a fantasy for decades. Now, we have cards that can do it (albeit with mixed results) and people complain that it can only do it at 1080P at 60FPS. Personally, I'm pleasantly surprised that it can do ray tracing at those frame rates.

I mean, how long have we had video cards that can play rasterized games at 4K at reasonable frame rates? 2 years? Maybe not even that long. To expect a technology like this to come out and to be playable at 4K frame rates is unrealistic IMO. I think we are probably at least two video card generations away from playing fully ray-traced games at 4K. Which in itself is amazing.
 

It should be obvious why there's hate. They raised prices by a significant margin. If you're paying ~50% more for a ~50% performance gain over the 10 series, even if you get to beta test the new ray tracing features, it's a really hard sell for lots of people. But I do think everyone should just take it easy until we get the actual reviews, because maybe the performance is much better than expected.
 
The point is to release it when it's ready to run at all three of today's gaming resolutions. But it's still early, and devs have only had access to it for a month at best, same with the cards.
 

Do you hate Ferrari for making cars you cannot afford? They could raise the price on the cards 1000% - if it's not worth it to you, don't buy it. They are releasing other, less expensive cards.
 

?? Pretty terrible analogy. And yes, if I expected to buy a Ferrari at a similar price to the previous year's model and they suddenly raised prices 50%, I'd consider buying a McLaren, Pagani, Lamborghini, etc. instead.
 
I think the "add some sort of realistic mirror effect to water on the ground" option that was applied to BF5 through the ray tracing tech is cool and is a good start.
 
Drivers might not be ready for a game and card that isn't even out. You could turn off Ray tracing in game maybe?
 

Yes, of course you can turn it off. I posted the below in a different thread, but it's very applicable here:


https://www.hardocp.com/news/2018/0...altime_ray_tracing_graphics_options_explained

If you watch the video in there, DICE expects at least another 30% as they optimize it further. So that'll bring them to 60-70 FPS at 1440p with RT on.

They also want to try and make it so you can render at different resolutions - e.g., rasterize at 4K but run the RT at 1080p or 1440p, which is still VASTLY better than the current method in both labor and visuals. Then you can have high 4K framerates.
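As a back-of-envelope illustration of why that decoupling helps (my own numbers, assuming one reflection ray per pixel, which real engines vary), dropping the RT resolution shrinks the ray budget fast:

```python
# Back-of-envelope ray-budget comparison for decoupled RT resolution.
# Assumes one reflection ray per pixel; real engines vary this per effect.

def rays_per_frame(width, height, rays_per_pixel=1):
    """Rays needed per frame at a given ray-tracing resolution."""
    return width * height * rays_per_pixel

rt_at_4k = rays_per_frame(3840, 2160)     # RT at native 4K
rt_at_1080p = rays_per_frame(1920, 1080)  # raster at 4K, RT at 1080p

print(rt_at_4k)                # 8294400
print(rt_at_1080p)             # 2073600
print(rt_at_4k / rt_at_1080p)  # 4.0 -> a quarter of the rays to trace
```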
 
I don't know... The amount of raytracing it looks like they've implemented in BFV reminds me of the early days of bloom -- totally overdone. As it is, it looks like it's been slapped on like a bad paint job. For example, there's a part in the video where the player walks between two train cars and the sides look like mirrors - quite unrealistic.

I'm sure raytracing will be a great feature when applied properly and with discretion, but for the first batch of games that leverage it, I'm betting it's going to be kids with crayolas and a bare wall.

As for the performance hit (if accurate) and the price increases... Not even in the realm of being worth it. I'll stick to 4K@60Hz without raytracing on my 1080 Ti.
 

I think the reason it looks so terrible is that it's not proper ray tracing. They are having to use tricks to get the performance to a playable level; it's a mixed bag of low resolution, part rasterized and part ray traced. It wouldn't look so much like a slapped-on paint job if the GPU had the power to ray trace everything.
 
Does the 60 FPS need to be stable on a g-sync monitor?

No, that's the point of gsync/freesync. Essentially, you want to keep the FPS above 30-35 but below your monitor's refresh rate limit. For my monitor, the sweet spot is 35-70 FPS which is just about perfect for a 1080 ti at 4k in most newer games.
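A minimal sketch of that logic (the 35-70 window is just this poster's monitor; substitute your own panel's VRR range):

```python
# Check whether a frame rate sits inside a variable-refresh (G-Sync/FreeSync)
# window. The 35-70 FPS range below is one specific monitor, not a standard.

def in_vrr_window(fps, low=35, high=70):
    """True if the frame rate is inside the monitor's VRR range."""
    return low <= fps <= high

print(in_vrr_window(60))  # True  - variable refresh keeps it smooth and tear-free
print(in_vrr_window(30))  # False - below the window
print(in_vrr_window(90))  # False - above the monitor's refresh limit
```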
 
The 2080 Ti's rendering is a hybrid when ray tracing is turned on - not all lighting will be ray traced; in other words, rasterization is still present. I thought the AA in the video for 1080p was outstanding, though the lower resolution took a hit on texture clarity. I find it amusing that 60fps is now the thing even at 1080p :ROFLMAO:. Quality over quantity, maybe. I am sure the Turing generation will be great-performing cards, but I'm not sure they are remotely worth the asking price. Using AI to enhance the experience is not being talked about much - that is probably the bigger gain at this point, those Tensor cores.

I will probably just invest in a quality G-Sync monitor for the two 1080 Ti's.

I can see upcoming reviews being all over the place; the new stuff will probably befuddle many reviewers on how to approach it. Only [H]ardOCP has recent relevant data for apples-to-apples comparisons from generation to generation. I do like seeing Nvidia pushing the envelope with new technology, but I'll most likely hate how Nvidia (to be determined) shoves it down everyone's throats. This generation may just be new tech looking for an application that uses it. As for ray tracing, I've been playing around with Modo (beta), which has AMD ProRender (AMD's CPU/GPU renderer), using the two Vega FE's with the real-time rendering window - except it is nowhere close to being real time. Fast, yes, but not real time. Still, it is 100% ray tracing, well, with some OpenGL shading thrown in.
 

DICE has 1440p RT at 40-50 fps and thinks another 30% is easily doable (52-65 fps).
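The arithmetic behind that range, assuming the ~30% gain applies uniformly to both ends:

```python
# DICE's reported 1440p RT frame rates, plus the hoped-for ~30% optimization.
current_fps = (40, 50)
optimized_fps = tuple(round(fps * 1.3) for fps in current_fps)
print(optimized_fps)  # (52, 65) - the 52-65 fps range quoted above
```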

They are also working on rendering RT at a different resolution. So you can render at 4K and just do RT at 1080p and have your 90 Hz, etc. It'll still be vastly better than the alternative.

 
I can't tell any difference between that video and one without ray tracing. I can see the examples in the Nvidia demos, but to me it seems like a colossal waste of money and effort for little to no visual change - and a downgrade in resolution at that.
 
Raytracing will be the future.

Yet with tech like this, I am not going in on the first generation. It will be 4X faster next gen, I will be able to play at 4K like I want to, and it should be cheaper unless Nvidia gouges us more for a Ti.

Or, raytracing flops, no game devs use it, and I don't need to worry about 40% of my GPU die space being useless, because I didn't go all in on the first gen of a brand-new hardware feature that requires third-party software support to use.
 
Ray tracing is definitely the future. We now just need AMD to get on board and show us what they can do - and I think we might be quite surprised at how good that could be, considering how good their cards are at compute.
 
BF5 is probably not the best showcase of the tech. They are only using ray-tracing for reflections, with RTX off they do screen-space reflections and cube maps (popular in most games today) so the difference may not be obvious to an untrained eye. Developers have the opportunity to do other stuff with ray-tracing, such as shadows or even the entire lighting system (global illumination), in which case you would see a more obvious improvement.
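A toy illustration of why the techniques differ (hypothetical scene data, nothing like real engine code): screen-space reflections can only sample what is already in the frame buffer, while a traced ray can hit geometry anywhere in the scene.

```python
# Toy contrast: screen-space reflections (SSR) vs. ray-traced reflections.
# SSR can only reuse colors already rendered on screen; a traced ray queries
# the full scene, so off-screen objects still show up in reflections.

# Hypothetical scene objects, flagged by whether they are in the camera's view.
scene = {
    "train_car": {"on_screen": True, "color": "grey"},
    "plane_overhead": {"on_screen": False, "color": "silver"},  # off-screen
}

def ssr_reflection(obj):
    # SSR: no frame-buffer data exists for off-screen geometry.
    return obj["color"] if obj["on_screen"] else None

def rt_reflection(obj):
    # RT: the ray intersects scene geometry regardless of screen visibility.
    return obj["color"]

plane = scene["plane_overhead"]
print(ssr_reflection(plane))  # None   - SSR misses it
print(rt_reflection(plane))   # silver - RT resolves it
```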
 
I have a solution for 4k. Turn off the feature you got charged extra for. LOL RTX OFF.
 
Well, you will still probably have the fastest video card in the history of mankind and can happily play in 4K without ray tracing.
 
Honestly, I can barely tell the difference between the "raytrace"-enabled Battlefield and a regular "rasterized" version. Seems like a lot of fuss about nothing.

Now, if the Star Wars demo had been part of a playable game - Star Wars Battlefront 3 with raytracing sounds like a killer app - that would have been something.
 

Haha, you are right. The Star Wars demo was much better; in Battlefield it's hard to tell. It's a very fast-paced game, and in those types of games it will be hard to see shadows and stuff. That's why they were slowing the shit out of it during the Nvidia presentation.
 
IMO if you are playing MP, the last thing you are worried about is ray tracing. Your main goal is a high refresh rate at whatever resolution you play at.

Not saying ray tracing sucks, just that it will be wasted in a fast-paced FPS type of game.
 
I think Digital Foundry stated that the RT component can run at arbitrary resolutions, so you could run the raster layer at 4K and the RT at 1080p and theoretically hit 60 FPS.

That's IF this feature is made available. This would also make the 2070 and 2080 more viable.
 