Unreal Engine 5.2 - Next-Gen Graphics Tech Demo

Sadly, it's more scalable at the expense of storage space. But yes, they certainly are now.

It's a problem, but there are solutions for it. They can just make separate downloads for graphical settings. Games already do this now with "4k texture packs", etc.
 
Unreal Engine 5.2 Out Now – Adds Improvements to Anti-Stuttering System, Enhances Lumen and Nanite

Today, Epic Games announced the public release of Unreal Engine 5.2...the previous release, UE 5.1, had introduced an experimental PSO precaching system to reduce hitching in DirectX 12 games...in UE 5.2, performance and stability have been improved, and the system now supports skipping the drawing of objects altogether if their PSOs aren't ready yet...while the goal is to have them ready, there is no guarantee they will be...with the new support for skipping, stuttering shouldn't happen even if a PSO hasn't been compiled yet

Epic also reduced the number of caches to compile in Unreal Engine 5.2 thanks to improved logic that smartly finds those that would never actually be used...lastly the old manual caching system can now be used alongside the automated precaching one...Unreal Engine 5.2 also comes with native support for Apple Silicon Macs for the first time...

https://www.unrealengine.com/en-US/..._medium=ue_twitter&utm_campaign=ue_launch_5_2
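As a rough mental model of that skip-draw behaviour (an illustrative stand-in, not Epic's actual implementation; the type and function names below are made up):

#include <atomic>
#include <chrono>
#include <thread>

// Hypothetical stand-in for a pipeline state object whose shaders are
// compiled asynchronously on a background thread rather than on demand.
struct FFakePipelineState
{
    std::atomic<bool> bCompiled{false};

    void RequestPrecache()
    {
        // Kick the compile off-thread so the game/render thread never blocks on it.
        std::thread([this]
        {
            std::this_thread::sleep_for(std::chrono::milliseconds(50)); // pretend driver compile
            bCompiled = true;
        }).detach();
    }
};

// Per-frame decision: instead of stalling the frame while the driver compiles
// (the classic shader-compilation stutter), the object is simply not drawn
// until its PSO is ready, so it appears a frame or two late instead of hitching.
bool ShouldDrawThisFrame(const FFakePipelineState& Pso)
{
    return Pso.bCompiled.load();
}

int main()
{
    FFakePipelineState Pso;
    Pso.RequestPrecache();
    while (!ShouldDrawThisFrame(Pso)) { /* draw everything else this frame */ }
    return 0;
}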
 
Unreal Engine 5.2 Out Now – Adds Improvements to Anti-Stuttering System, Enhances Lumen and Nanite
It's still dependent on the developer not being lazy, unfortunately.
 
Too bad games never actually live up to these tech demos. I mean games are barely at the level of UE3 demos from 10 years ago now.
Even to this day, Aliens: Colonial Marines continues to frustrate me. It's a game I wish I could erase from my memory; it was an utter disappointment compared to the promising gameplay that was initially showcased. It fell far short of expectations and was nothing less than a complete failure.
 
Seems Devs want to really put in work recently though. Look at Jedi Survivor, the work put in to remove DLSS is just astounding. :facepalm:

You know I can buy you a beer to cry in if you like.

As for the tech demo, it looks nice like all tech demos, but it rarely ends up looking the same in games. Guess we'll see what developers end up doing with it.
 
Seems Devs want to really put in work recently though. Look at Jedi Survivor, the work put in to remove DLSS is just astounding. :facepalm:
https://www.unrealengine.com/marketplace/en-US/product/nvidia-dlss/questions

It is not just a switch that magically makes it easy for a developer. Maybe it will be put into Jedi Survivor later. It seems like they had enough issues with the PC port already; piling on more issues by incorporating DLSS would have been foolish. Who knows for sure except the developers themselves. Since it's cross-platform, with consoles making up a huge share of sales, I don't see why they would spend extra time, when they were already running out of time to get this published, incorporating a proprietary standard that only works on one particular vendor's GPUs, unless they were helped and paid to do it. AMD may have helped the developers deliver a better console experience; should they then push for DLSS, a closed standard that would not work with the consoles and most of the hardware being used for gaming in the PC world? That would be stupid.

Is DLSS really that much better than FSR? I think it is better, but nothing earth-shattering. I think I would be able to tell which is which due to motion stability, but that is about it. When standing still they are virtually indistinguishable; one can be sharper than the other, but that depends on the game and the settings available.

Having DLSS 3 frame generation only work on 40-series cards, which are sitting on shelves, makes it an obvious waste of time to incorporate unless Nvidia supports that work directly, and maybe they will. I would think Nvidia is the one responsible for getting games to incorporate its own tech for its hardware, not expecting others to automatically foot the bill to promote their limited or exclusive access. Yes, it would be nice, but that is about it.
 
You know I can buy you a beer to cry in if you like.
I don't cry about petty things like this. I laugh at all the Muppets defending the move.

It is not just a switch that magically makes it easy for a developer.
It has already been said a million times. It takes more effort to remove. The cope is real.
 
I don't cry about petty things like this. I laugh at all the Muppets defending the move.
It has already been said a million times. It takes more effort to remove. The cope is real.
Really? Where did you get that from, the Internet? It still has to be worked in, plus this game started development several years back, so that is not even remotely true.

Edit:
ChatGPT response:
Q What steps are needed to incorporate DLSS into a game? How much time in general for coding and testing this will add to the development cycle?
A To incorporate DLSS into a game, you can install the DLSS plugin by extracting the “dxgi.dll” and “dlsstweaks.ini” files into the same folder as a game’s executable (the one in the “Binaries” folder for Unreal Engine 4 games that have two). You can tweak the settings by editing the ini file. The time it takes to code and test this will depend on the complexity of the game and how much optimization is required. However, enabling DLSS is generally a straightforward process that should not take too much time.
 
was this covered already?


https://www.engadget.com/apple-silicon-macs-now-natively-support-unreal-engine-5-124257710.html
 
Remove what? It's not part of the core engine.

Really? Where did you get that from, the Internet? It still has to be worked in, plus this game started development several years back, so that is not even remotely true.
It seems you guys haven't been paying attention. The fact that there is a mod to re-enable DLSS means it was in the game and ripped out at some point.
 
It seems you guys haven't been paying attention. The fact that there is a mod to re-enable DLSS means it was in the game and ripped out at some point.

That doesn't guarantee that at all - for example, the old FSR mod for Cyberpunk.

It was a renderer hook to try and wrap FSR around the DLSS functions. The game literally thought it was still running DLSS.
 
Unreal 5 continues to get more impressive. While these tools can be used for games, Epic is really pushing further into cinema and other non-game rendering uses where real-time or near real-time performance is a consideration, but not the primary driver. Given the performance impact of many of the features, I suspect most studios will target the medium to high range settings in terms of the assets they use; the ultra/cinematic quality assets from Quixel devour storage extremely quickly. I had to get a 3090 for a project just to have enough VRAM to handle a terrain texture without it hard swapping and dropping into single-digit (or decimal) FPS in the viewport.

Don't get me wrong, I'd love to see a game maker push the envelope at max settings, I just doubt it will happen.
 
So the big takeaways seem to be:

  • 5.2 helps shader stuttering significantly, but isn't absolutely perfect.
  • Hardware ray tracing performance has improved and now can seemingly handily beat software while looking better at the same time.
  • Loading new areas can still hitch.
  • The engine is CPU bound as shit and doesn't scale that well on the CPU.

That last one is pretty concerning, because this has been a thing with Unreal for a while. The renderer is wonderful, but I think at one point I saw their 4090 at 50% GPU usage on a 12900K. That's pretty bad.
 
That said, those demos don't have gameplay, physics, NPCs, or much else going on (not sure about 3D sound), and the dev team doesn't necessarily put the same effort into a demo as a good game studio would into a game, so maybe it will scale a bit more.
 
The engine is CPU bound as shit and doesn't scale that well on the CPU.
I wonder if the CPU issues are because of Blueprints vs. bespoke code. You can build things very quickly with Blueprints, and it's a lower cost/barrier to entry in many respects, but for max performance you need people who know how to code writing the game logic.

For example, the abandoned Unreal Tournament project is a mix of Blueprints and code that needs to be compiled.
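If Blueprint-heavy game logic ever does show up as the hotspot, the usual fix is to move just that piece into native code and keep Blueprint as the glue. A minimal sketch of what that looks like, assuming a standard UE C++ game module (the class and function names here are invented for illustration):

// Hypothetical example; assumes a UE C++ project with a generated header.
#pragma once

#include "CoreMinimal.h"
#include "Kismet/BlueprintFunctionLibrary.h"
#include "MyScoringLibrary.generated.h"

UCLASS()
class UMyScoringLibrary : public UBlueprintFunctionLibrary
{
    GENERATED_BODY()

public:
    // Hot loop moved out of Blueprint: still callable from graphs, but runs as native code.
    UFUNCTION(BlueprintCallable, Category = "Scoring")
    static float SumDistances(const TArray<FVector>& Points, const FVector& Origin)
    {
        float Total = 0.f;
        for (const FVector& P : Points)
        {
            Total += FVector::Dist(P, Origin);
        }
        return Total;
    }
};

The Blueprint keeps orchestrating events and calls into this node, so you get the iteration speed of graphs without paying the VM cost inside the loop.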
 
The engine is CPU bound as shit and doesn't scale that well on the CPU.

I wonder if the CPU issues are because of Blueprints vs. bespoke code. You can build things very quickly with Blueprints, and it's a lower cost/barrier to entry in many respects, but for max performance you need people who know how to code writing the game logic.

In the Unreal developer documentation, they use the AMD Ryzen 5 3600 as the default target system.
When assigning a thread you must have at least 1GB of dedicated system memory for every thread you create, while being cognisant to leave system RAM remaining for the CPU/GPU swap space and the operating system itself. It also mentions that you must leave at least 2 CPU threads free for the host OS to function correctly.
So if you are targeting the default baseline of a 3600, which is a 6-core/12-thread part, you can at most schedule 10 threads, requiring 10GB of dedicated RAM for that application and leaving the remainder for the OS to do what it needs to, so a 16GB system in all likelihood.

Those values of course can be modified in the BuildConfiguration.xml file
<?xml version="1.0" encoding="utf-8" ?>
<Configuration xmlns="https://www.unrealengine.com/BuildConfiguration">
    <ParallelExecutor>
        <ProcessorCountMultiplier>2</ProcessorCountMultiplier>
        <MaxProcessorCount>10</MaxProcessorCount>
    </ParallelExecutor>
</Configuration>

Where ProcessorCountMultiplier refers to the state of HT and SMT and MaxProcessorCount is the number of threads generated, so in the above config it creates 10 threads spread over 5 physical cores, leaving 1 physical core (2 threads) for the host OS to do its background tasks.
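If I'm reading those two parameters the way they're described above (a later reply notes this file configures the build rather than the shipped game), the arithmetic works out roughly like the sketch below. This is a hypothetical illustration of that reading, not actual UnrealBuildTool code, and the helper name is made up:

#include <algorithm>
#include <cstdio>

// Hypothetical helper mirroring the description above: scale physical cores by
// the SMT multiplier, then clamp to the configured maximum worker count.
static int WorkerCount(int PhysicalCores, int SmtMultiplier, int MaxProcessorCount)
{
    return std::min(PhysicalCores * SmtMultiplier, MaxProcessorCount);
}

int main()
{
    // Ryzen 5 3600 baseline from the post: 6 physical cores, SMT on (x2), cap of 10.
    const int Workers = WorkerCount(6, 2, 10);   // min(12, 10) = 10 worker threads
    const int RamGB   = Workers * 1;             // ~1GB per thread -> ~10GB for the app
    std::printf("workers=%d, ram budget=%dGB (plus OS overhead -> ~16GB system)\n",
                Workers, RamGB);
    return 0;
}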

So you can change that to scale up for more cores, but it also means you have to change your requirements accordingly: the next configuration up, 2/14, would require a system with 16 threads and realistically 32GB of RAM as your minimum spec, which removes it from the console market as well as most of the PCs out there, so it severely limits the target audience.
Hopefully Epic can find a way to make that auto-detect and scale accordingly, but they haven't yet.

Honestly though in the video above the demo wasn't pinning any of the threads it was allocating, so that would indicate that the bottlenecks were elsewhere and not with the CPU itself.
 
I wonder if the CPU issues are because of Blueprints vs. bespoke code. You can build things very quickly with Blueprints, and it's a lower cost/barrier to entry in many respects, but for max performance you need people who know how to code writing the game logic.

I doubt it, and I don't think it's particularly worrisome unless you're doing something absurd with it. It's there to be gameplay glue - if you need obnoxious amounts of compute then you're probably doing it wrong.

Blueprint is battle tested as shit at this point and is used everywhere, including Fortnite.

Quake ran its game code in a virtual machine 2.5 decades ago on a Pentium.
 
Those values of course can be modified in the BuildConfiguration.xml file

This is for building the engine.
 
Unreal Engine 5.2 - Next-Gen Evolves - New Features + Tech Tested - And A 'Cure' For Stutter?



Key Transcript:

https://www.eurogamer.net/digitalfo...nalysed-is-this-the-answer-to-stutterstruggle


With UE 5.1 and a matching Fortnite update, Epic added an asynchronous shader compilation scheme which worked in real time, pre-compiling shaders in the background on the CPU during play to hopefully prevent stutter. This technique isn't quite perfect: if a shader needed to be drawn but wasn't ready, the game would stutter.


In UE5.2, this asynchronous system is more accurate and critically adds the ability for the developer to delay the shader display until it fully compiles, thus potentially eliminating all shader related stutter completely - but with the potential that a visual effect or material could display a bit later than it might do otherwise.

This improved asynchronous shader pre-caching and the new skipdraw feature from 5.2 have a transformative effect based on my testing, eliminating the biggest (~500ms) stutters and dramatically improving fluidity. However, this doesn't completely eliminate stutters, with some 30-50ms examples persisting that aren't found in a completely 'warm' cache. Some of these could be attributable to traversal stutter, which UE5 has inherited from UE4 - and can still be found in the latest version of Fortnite running Unreal Engine 5.2.

In terms of stutter, Unreal Engine 5.2 is certainly an improvement then - but traversal stutter needs work and even the new asynchronous shader caching system is not a silver bullet that developers can wholly rely on for a smooth player experience. For one, it doesn't seem to be on by default, which some developers might miss, and secondly it produces some stutters that are fixed by the more traditional shader cache method.

Therefore, it probably makes sense to combine this new asynchronous system with the older offline precaching system to produce the smoothest experience on PC.
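To make that combination concrete, here is a rough sketch of the idea under some obvious assumptions (the types and function names are invented for illustration and are not Unreal Engine API): warm everything recorded in an offline/manual cache during the load screen, then let runtime async precaching plus skip-draw cover whatever that list missed.

#include <string>
#include <unordered_set>
#include <vector>

// Hypothetical PSO identifier; real engines key PSOs off shaders, vertex
// formats and render state, and compile them through the RHI/driver.
struct FPsoKey { std::string Hash; };

// Stand-in for the (slow, driver-side) pipeline compile.
static void CompilePso(const FPsoKey& Key) { /* pretend this takes milliseconds */ }

// 1) Offline/manual path: during the load screen, warm every PSO recorded in
//    earlier playthroughs so the known-hot ones can never miss at runtime.
static void WarmRecordedCache(const std::vector<FPsoKey>& Recorded,
                              std::unordered_set<std::string>& Ready)
{
    for (const FPsoKey& Key : Recorded)
    {
        CompilePso(Key);
        Ready.insert(Key.Hash);
    }
}

// 2) Runtime path: anything the recorded list missed gets precompiled
//    asynchronously when its material/mesh loads; until it's ready the
//    renderer skips (or delays) that draw instead of stalling the frame.
static bool ShouldDraw(const FPsoKey& Key,
                       const std::unordered_set<std::string>& Ready)
{
    return Ready.count(Key.Hash) != 0;
}

int main()
{
    std::unordered_set<std::string> Ready;
    WarmRecordedCache({ FPsoKey{"grass"}, FPsoKey{"water"} }, Ready); // load-screen warm-up
    // A PSO nobody recorded ("lava") would be skipped until its async compile lands.
    return ShouldDraw(FPsoKey{"lava"}, Ready) ? 1 : 0;
}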
 
Are there even any games using UE5?

that new Layers of Fear game which came out last month uses it...but it's more of an indie title

Immortals of Aveum will be the first big 'real' UE5 game (releasing August 22)...Lords of the Fallen will also be using it (and it looks stunning) - coming in October

 
Are there even any games using UE5?
Lots in development and a bunch of AAA studios are also transitioning from their own engines to UE5.
Epic has been putting that Fortnite money into engine development and most other studios can't keep pace with that sort of advancement.
Awesome for us because UE5 is a really solid engine, while also a little sad because it decreases engine diversity and the variety of things different engines do well.
 
Awesome for us because UE5 is a really solid engine, while also a little sad because it decreases engine diversity and the variety of things different engines do well.

It's actually very good, at least in the short term, that studios stop wasting time working on inferior engines. Games are much better for it.
But it could definitely be bad in the long term if Epic gets complacent or starts jacking up prices.

It would be great if there was a real competitor for AAA games. Unity is the closest there is to competition for UE5. Unity is getting better all the time, but honestly it just isn't good enough for AAA games. So AAA studios either build their own engine or use UE.
 
It's actually very good, at least in the short term, that studios stop wasting time working on inferior engines. Games are much better for it.
My concern/worry is that UE titles look and feel like UE titles; yeah, the games are different, but they are all very similar, which is good for consistency and all, but it doesn't really lend itself well to gameplay diversity.
 
It's actually very good, at least in the short term, that studios stop wasting time working on inferior engines.

That can be true in some ways. For example, Nanite looks great, and once we start seeing games use it, other games will feel dated if they don't offer something similar. ArmA 4 will be on a new in-house engine, but if it doesn't have a Nanite-like feature it will suffer from one of the biggest issues ArmA 1/2/3 did: terrain pop-in and LOD problems. When you're dealing with long distances and realistic weapon ranges, that starts to become a huge problem. Trees and grass change shape and either disappear or completely obscure enemies at long distances. This makes target ID hard, and it makes it impossible to balance the AI, because the AI doesn't suffer from LOD/pop-in the way humans do. At a distance in the ArmA games, the grass layer is rendered as solid green cover, so a prone enemy disappears into it. That is why it feels like you're often being killed by enemies you can't see. Due to LOD issues, you literally cannot see them.

And then you have games like Witcher 2 which had grass that essentially grew 20 feet in front of you.

I think once we see how Nanite works in open-world games, any engine that doesn't use it will look awful no matter how much ray tracing or whatever they put into it.
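For what it's worth, the pop-in comes from traditional LOD being a hard distance threshold; here's a toy sketch of that idea (hypothetical numbers, not any particular engine's code), whereas Nanite streams cluster detail more or less continuously so there's no single switch point:

#include <cstdio>

// Toy distance-based LOD pick: each threshold crossing swaps the whole mesh,
// which is the visible "pop" (and why distant grass can flatten into a solid
// blob that hides a prone soldier while the AI still "sees" full geometry).
static int SelectLod(float DistanceMeters)
{
    const float Thresholds[] = { 50.f, 150.f, 400.f };  // hypothetical switch distances
    int Lod = 0;
    for (float T : Thresholds)
    {
        if (DistanceMeters > T) { ++Lod; }              // 0 = full detail ... 3 = billboard/blob
    }
    return Lod;
}

int main()
{
    std::printf("LOD at 60m: %d, at 500m: %d\n", SelectLod(60.f), SelectLod(500.f)); // 1 and 3
    return 0;
}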

My concern/worry is that UE titles look and feel like UE titles; yeah, the games are different, but they are all very similar, which is good for consistency and all, but it doesn't really lend itself well to gameplay diversity.

That depends on the developer really, but yes, a lot of games will feel a bit more similar if they borrow heavily from the default features.
 
“What better way to go off-roading than in a Rivian…..” oh, maybe a Ram Rebel or TRX, hell, even a Tundra. lol
 
My concern/worry is that UE titles look and feel like UE titles; yeah, the games are different, but they are all very similar, which is good for consistency and all, but it doesn't really lend itself well to gameplay diversity.
I had the same thought when Direct3D was unveiled.

Still a ton of good stuff in art and gameplay that the engine doesn't affect.
 
My concern/worry is that UE titles look and feel like UE titles; yeah, the games are different, but they are all very similar, which is good for consistency and all, but it doesn't really lend itself well to gameplay diversity.
I think that's a 100% valid concern; the medium can be the message, and I imagine what an engine is good at can influence development and orient both the type of game and what it looks and feels like. That could be overstated, but would Doom feel exactly the same if it were an Unreal Engine title?

Does it help that much if you want to make Flight Simulator? And do you make a Flight Sim if you're already good with Unreal, etc...?
 
Yeah. It's a pretty amazing toolchain. It is still totally up to the game devs to make it work and look good, though. But the skill cap has been raised.
 