Battlefield V May Support Real-Time Ray Tracing via NVIDIA’s RTX Technology

Megalith

PCGamesHardware has found potential evidence of ray tracing support in EA/DICE’s Battlefield V, which is due out in October: entering the console command Render.DrawScreenInfo 1 in the alpha version brings up a variety of technical details, one of which references the hyped technology (NVIDIA GeForce GTX 1080 Ti [Raytrace: Off]).

Battlefield 5 will be powered by the latest version of the Frostbite engine, and all the previous BF games worked great in DX11. Ray tracing, however, requires DX12 or Vulkan, and the DX12 renderer was significantly slower in all previous Battlefield games. As such, we are really looking forward to seeing whether DICE has optimized DX12 so it can be as fast as DX11, and whether these real-time ray tracing effects will be worth the additional performance hit.
 
It's great that nVidia is pushing new technologies and that games are already supporting them.

I just hope this doesn't penalize the competition for little visual gain, like HairWorks did.
 
Who cares..... This only means interest is actually low and they're pushing the tech (which is usually AMD's thing) to be alluring to gamers who would buy a supporting GPU and go OOoooooo; then those gamers will bitch that the game sucks and that this is its only selling point.
 
My guess is that the engine supports it, but it won't be available in the actual game outside of tech demos. Nvidia likely assisted (shared costs / development) to get it into Unity/UE4/Frostbite/any-major-engine.

Nvidia has been pushing the ray-tracing tech towards render/VFX workflows; they probably know it'll take a few generations for it to make its way into games and be playable. Game companies won't start utilizing it properly until there's a significant supporting userbase that warrants the development costs.
 
You can mix raytracing for a few specific areas, such as reflection. And that's probably what it's going to be at first. You'll have raytracing support for a few key objects (most likely reflective objects). Just a little bit of extra graphical flair, but nothing too extravagant.
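
To make the hybrid idea concrete, here is a toy C++ sketch of that kind of compositing: a raster-shaded pixel blended with a ray-traced reflection colour by the material's reflectivity. The colours and the reflectivity value are made up for illustration; a real engine does this per material inside its shaders, not on the CPU.

Code:
// Toy sketch (not engine code): compositing a raster-shaded pixel with a
// ray-traced reflection colour, weighted by the material's reflectivity.
#include <cstdio>

struct Color { float r, g, b; };

// Linear blend between the rasterized result and the ray-traced reflection.
Color blend(Color raster, Color reflection, float reflectivity) {
    return {
        raster.r * (1.0f - reflectivity) + reflection.r * reflectivity,
        raster.g * (1.0f - reflectivity) + reflection.g * reflectivity,
        raster.b * (1.0f - reflectivity) + reflection.b * reflectivity
    };
}

int main() {
    Color raster     = {0.40f, 0.35f, 0.30f};  // what the normal raster pass produced
    Color reflection = {0.10f, 0.60f, 0.90f};  // what a reflection ray hit (made-up value)
    float reflectivity = 0.25f;                // rough, mostly diffuse surface

    Color out = blend(raster, reflection, reflectivity);
    std::printf("final pixel: %.2f %.2f %.2f\n", out.r, out.g, out.b);
    return 0;
}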
 
One big mystery I've been wondering about with NVIDIA's ray tracing is how they made a graphics card do what the industry couldn't for a long time. The industry uses CPUs to do ray tracing because the CPU does it better than a GPU, because of recursion. Ray tracing does a lot of recursion, and GPUs are terrible at recursion. GPUs are good at processing math and that's it. In order to do logic processing well, a GPU needs out-of-order execution, which helps a lot with recursion. So what's in the RTX cores in NVIDIA's new GPUs? What could be using up a quarter of the silicon on the GPU just to do ray tracing? An ARM CPU is my guess. It has out-of-order execution, you can pack a lot of them into a confined space, they use little power, and we know NVIDIA has made ARM chips. I believe NVIDIA put an ARM CPU in their RTX cards to do the ray tracing.
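
For reference, here is a toy Whitted-style trace in plain C++ that shows the recursion being described: each reflective bounce calls trace() again, up to a fixed depth. It is only meant to illustrate what "recursive ray tracing" looks like in code; it says nothing about how NVIDIA's hardware actually implements it.

Code:
// Toy Whitted-style tracer fragment: the point is only to show the recursion --
// a reflection ray spawns another trace() call, up to a fixed depth.
// Scene, materials and shading are deliberately trivial.
#include <cmath>
#include <cstdio>

struct Vec3 {
    double x, y, z;
    Vec3 operator+(Vec3 o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(Vec3 o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
};
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 normalize(Vec3 v) { double l = std::sqrt(dot(v, v)); return v * (1.0 / l); }

struct Sphere { Vec3 center; double radius; double brightness; };

// Ray/sphere intersection for a normalized direction; returns hit distance or -1.
double intersect(Vec3 origin, Vec3 dir, const Sphere& s) {
    Vec3 oc = origin - s.center;
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - s.radius * s.radius;
    double disc = b * b - 4.0 * c;
    if (disc < 0.0) return -1.0;
    double t = (-b - std::sqrt(disc)) / 2.0;
    return t > 1e-4 ? t : -1.0;
}

const Sphere scene[2] = {{{0, 0, 5}, 1.0, 0.8}, {{2, 0, 6}, 1.0, 0.3}};

// Recursive trace: each bounce calls trace() again until maxDepth is reached.
double trace(Vec3 origin, Vec3 dir, int depth, int maxDepth) {
    if (depth >= maxDepth) return 0.0;
    double bestT = 1e30; const Sphere* hit = nullptr;
    for (const Sphere& s : scene) {
        double t = intersect(origin, dir, s);
        if (t > 0.0 && t < bestT) { bestT = t; hit = &s; }
    }
    if (!hit) return 0.1;  // background term
    Vec3 p = origin + dir * bestT;
    Vec3 n = normalize(p - hit->center);
    Vec3 reflected = dir - n * (2.0 * dot(dir, n));
    // Local term plus the recursive reflection term.
    return hit->brightness + 0.5 * trace(p, normalize(reflected), depth + 1, maxDepth);
}

int main() {
    Vec3 eye = {0, 0, 0};
    Vec3 dir = normalize({0.1, 0.0, 1.0});
    std::printf("radiance along one ray: %.3f\n", trace(eye, dir, 0, 4));
    return 0;
}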
 
AI is the trick
https://blogs.nvidia.com/blog/2017/07/31/nvidia-research-brings-ai-to-computer-graphics/
the ray-traced render is VERY low res, but with "Deep Learning AA" they can get a usable image out of it
if you watch this video at 5m45s you can see what's going on before the AI denoising


[Attached image: Noise_2.png — the noisy ray-traced output before denoising]
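
As a very crude stand-in for that idea (too few samples per pixel gives a noisy estimate, which a filter then cleans up), here is a toy C++ example. NVIDIA's denoiser is a trained neural network; the neighbour-averaging filter below is only a placeholder to show the concept.

Code:
// Crude stand-in: "render" with very few samples per pixel (noisy), then
// reconstruct with a filter. The box filter here is only a placeholder for
// the learned denoiser in the real pipeline.
#include <cstdio>
#include <cstdlib>
#include <vector>

int main() {
    const int width = 16;          // tiny 1D "image"
    const int samplesPerPixel = 2; // deliberately far too few -> noise
    std::srand(42);

    // "Ground truth" signal we are trying to estimate: a smooth ramp.
    auto truth = [](int x) { return x / 15.0; };

    // Noisy low-sample estimate: each sample is the truth plus random error.
    std::vector<double> noisy(width);
    for (int x = 0; x < width; ++x) {
        double sum = 0.0;
        for (int s = 0; s < samplesPerPixel; ++s)
            sum += truth(x) + (std::rand() / (double)RAND_MAX - 0.5);
        noisy[x] = sum / samplesPerPixel;
    }

    // "Denoise": average each pixel with its neighbours.
    std::vector<double> filtered(width);
    for (int x = 0; x < width; ++x) {
        int lo = x > 0 ? x - 1 : x, hi = x < width - 1 ? x + 1 : x;
        filtered[x] = (noisy[lo] + noisy[x] + noisy[hi]) / 3.0;
    }

    for (int x = 0; x < width; ++x)
        std::printf("pixel %2d  truth %.2f  noisy %+.2f  filtered %+.2f\n",
                    x, truth(x), noisy[x], filtered[x]);
    return 0;
}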
 
We will know more in February when Metro Exodus comes out; it does support ray tracing. Things may have been in the works between DX12 and NVIDIA for a long time, so it may not take as long as usual for this new tech to be used.
 
So NVIDIA does DX12 now? LOL, I guess that's why AMD owned DX12 for a while. Now NVIDIA is beating AMD again. Sorry guys...
 
Doesn't explain much. It just shows that NVIDIA can make fewer ray casts do more, but what's casting the rays? What is doing the actual ray tracing?

That is in the DX/Vulkan ray casting API and the GPU hardware for it; you would have to dig in more, but the bottom line is that you can now blend ray-traced lighting with rasterized rendering in real time at 60fps+.
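
Some back-of-the-envelope numbers for what "60fps+" leaves to work with; the ray-throughput figure below is an assumption for illustration, not a measured spec of any card.

Code:
// Back-of-the-envelope numbers for "real time at 60fps+". The ray throughput
// figure below is a hypothetical assumption, not a measured RTX spec.
#include <cstdio>

int main() {
    const double fps = 60.0;
    const double frameBudgetMs = 1000.0 / fps;             // ~16.7 ms per frame
    const double assumedGigaraysPerSec = 10.0;              // hypothetical throughput
    const double raysPerFrame = assumedGigaraysPerSec * 1e9 / fps;

    std::printf("frame budget: %.1f ms\n", frameBudgetMs);
    std::printf("rays available per frame at %.0f Grays/s: %.0f million\n",
                assumedGigaraysPerSec, raysPerFrame / 1e6);
    // Idealized rays per pixel at 1920x1080 (~2.07M pixels).
    std::printf("rays per pixel at 1080p: %.0f\n", raysPerFrame / (1920.0 * 1080.0));
    return 0;
}

The theoretical peak is never reached once shading, BVH updates, and the raster work itself are counted, which is why real implementations spend far fewer rays per pixel than this.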
 
The current flavour of real-time stuff reminds me a lot of early post-processing over 10 years ago, when screen-space effects were often rendered at 1/4 of the native resolution of the image (a.k.a. the age of the vaseline camera).
 
Yeah, but for reflections and shadows it's fine for now, and it will only get better.
 

Does this mean a multi-core CPU will be able to process this if a GPU doesn't have ReTardX cores, or is this another proprietary ngreedia gimmick like physux that will only work with their cards? Why not just use the extra cores on a multi-core CPU to process this, if it does it better and at full resolution, instead of the cheap low-res NVIDIA implementation?

If game engines look for proprietary cores to process ray tracing, this will die fast because of AMD. Though NVIDIA rules hardware on PC, AMD rules gaming as a whole because of consoles.

Still don't buy the ray-tracing hype train; seems gimmicky and fragile.
 
OK then, drinking the Kool-Aid a bit there, are we... This is a pretty big change in how stuff is rendered. It may not seem like that big of a deal to the end user yet, but it's a step toward full real-time ray tracing.
 
Try generating a BVH tree and traversing it on a CPU in real time, fast enough to comfortably play a video game, all while the CPU is still managing AI, physics, and the other threads that are not handled by the GPU (yet).
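
For anyone curious what "generate a BVH and traverse it" actually involves, here is a minimal CPU-side C++ sketch: bounding boxes around spheres, a median split on one axis, and a traversal that skips any subtree whose box the ray misses. Production BVHs (surface-area-heuristic builds, per-frame refits for animated geometry) are far more involved.

Code:
// Minimal CPU sketch of "build and traverse a BVH": axis-aligned boxes around
// spheres, split by median x, and a ray walk that skips whole subtrees whose
// bounding box the ray misses.
#include <algorithm>
#include <cstdio>
#include <memory>
#include <vector>

struct Sphere { double x, y, z, r; };
struct Box { double lo[3], hi[3]; };

Box boxOf(const Sphere& s) {
    return {{s.x - s.r, s.y - s.r, s.z - s.r}, {s.x + s.r, s.y + s.r, s.z + s.r}};
}
Box merge(const Box& a, const Box& b) {
    Box m;
    for (int i = 0; i < 3; ++i) {
        m.lo[i] = std::min(a.lo[i], b.lo[i]);
        m.hi[i] = std::max(a.hi[i], b.hi[i]);
    }
    return m;
}

// Slab test: does a ray starting at o with direction d hit the box?
bool hitBox(const Box& b, const double o[3], const double d[3]) {
    double tmin = 0.0, tmax = 1e30;
    for (int i = 0; i < 3; ++i) {
        double inv = 1.0 / d[i];
        double t0 = (b.lo[i] - o[i]) * inv, t1 = (b.hi[i] - o[i]) * inv;
        if (t0 > t1) std::swap(t0, t1);
        tmin = std::max(tmin, t0);
        tmax = std::min(tmax, t1);
    }
    return tmin <= tmax;
}

struct Node {
    Box box;
    std::unique_ptr<Node> left, right;
    int sphereIndex = -1;  // leaf if >= 0
};

std::unique_ptr<Node> build(std::vector<int>& idx, int lo, int hi,
                            const std::vector<Sphere>& spheres) {
    auto node = std::make_unique<Node>();
    if (hi - lo == 1) {                       // leaf: one sphere
        node->sphereIndex = idx[lo];
        node->box = boxOf(spheres[idx[lo]]);
        return node;
    }
    // Median split on x so each subtree covers roughly half the spheres.
    std::sort(idx.begin() + lo, idx.begin() + hi,
              [&](int a, int b) { return spheres[a].x < spheres[b].x; });
    int mid = (lo + hi) / 2;
    node->left = build(idx, lo, mid, spheres);
    node->right = build(idx, mid, hi, spheres);
    node->box = merge(node->left->box, node->right->box);
    return node;
}

// Traversal: descend into a subtree only if the ray hits its bounding box.
void traverse(const Node* n, const double o[3], const double d[3], int& visited) {
    ++visited;
    if (!hitBox(n->box, o, d)) return;
    if (n->sphereIndex >= 0) {
        std::printf("ray reached leaf for sphere %d\n", n->sphereIndex);
        return;
    }
    traverse(n->left.get(), o, d, visited);
    traverse(n->right.get(), o, d, visited);
}

int main() {
    std::vector<Sphere> spheres = {
        {-4, 0, 10, 1}, {-2, 1, 12, 1}, {0, -1, 9, 1}, {2, 0, 11, 1}, {4, 1, 10, 1}};
    std::vector<int> idx = {0, 1, 2, 3, 4};
    auto root = build(idx, 0, (int)idx.size(), spheres);

    double origin[3] = {0, 0, 0}, dir[3] = {0.35, 0.05, 1.0};
    int visited = 0;
    traverse(root.get(), origin, dir, visited);
    std::printf("nodes visited: %d (out of %d)\n", visited, 2 * (int)spheres.size() - 1);
    return 0;
}

The point of the structure is that a miss against an internal node's box culls everything underneath it, so the number of nodes visited grows roughly logarithmically with scene size rather than linearly.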
 
No problem! There are CPUs with cores to spare, and the tech won't be PhysX 2.0 proprietary poop.
 
You do know there is more than just the ray tracing that makes this work, notably the deep-learning image completion.
 
Like I said, proprietary poop.

Ray tracing isn't new; only NVIDIA is pretending it is.
 
You're missing the fact that it doesn't work at real-time speed without that.

It's using deep learning to fill in the missing pixels. It's not ray tracing everything, maybe 40% at best; then the AI kicks in and fills in the gaps.
 
In that case, if it were really open source it would allow a CPU, a GPU, or an additional physxBSpu to perform the marketing-BS "deep learning" computation.

Gimmick at best.
 
You could, but it's simply not fast enough. Look at what's needed to run Google's Go AI; it's the same thing, but this needs to be even faster, since it has to do it in 1/10 of the time it takes to kick out a frame.
 
Don't feed the troll, dude.
 
Nah, CPUs would do it better; they can execute out-of-order instructions for ray tracing better than GPUs.
 