Unreal Engine 5 Feature Showcase

polonyc2

Fully [H]
Joined
Oct 25, 2004
Messages
25,779
Epic Games has provided a fresh look at Unreal Engine 5 and the many groundbreaking tools it will bring to game makers...the video is largely aimed at developers, showing off UE5's actual UI and new tech like Nanite, Lumen, adaptable animation systems, and open-world streaming solutions, but regular gamers will still want to check it out, as Epic showcases all of this via an impressive new "practical sample project" entitled 'Valley of the Ancient'...

 
Pretty incredible. You could see the frames, so it's pushing pretty hard.
 
The demo is out on PC:
"Valley of the Ancient is a separate download of around 100 GB. If you want to run the full demo, the minimum system requirements are an NVIDIA GTX 1080 or AMD RX Vega 64 graphics card or higher, with 8 GB of VRAM and 32 GB of system RAM. For 30 FPS, we recommend a 12-core CPU at 3.4 GHz, with an NVIDIA RTX 2080 or AMD Radeon 5700 XT graphics card or higher, and 64 GB of system RAM. We have successfully run the demo on PlayStation 5 and Xbox Series X consoles at full performance."

Like I said long ago and was mocked for, the PS5 wasn't magic, it was just very low latency/overhead, and PC would be able to brute force the same while Microsoft gets its act together, with, mostly, a lot more RAM. And there it is.

This demo is STILL a stress test; don't use its size to extrapolate anything about what UE5 games will be. Yes, the assets were pared back a bit from the movie-quality 8K textures and 16K shadows used in the PS5 demo, but notice that the size is still a whopping 100 GB for the demo and, let's be honest, still absolutely overkill.

The demo has Epic's and AMD's answer to DLSS 2.0: Temporal Super Resolution. The quality is very, very good, and on a friend's machine, 4K upscaled from 1080p ran at 43 fps in this demo, while native 4K gave him 18 fps.

All in all, it is pretty amazing how the technology is advancing, the full public UE5 release won't come until next year, but the future is looking very bright _for everyone_.
 
I downloaded the engine and the demo.

Nanite indeed works, it's very cool, and the Quixel stuff is extremely seamless. Literally just a couple clicks to open Quixel and dump whatever into the engine - it's fantastic. The dev experience is exceptional. They really crushed it here. The asset quality is generally immaculate, in both mesh and texture detail. When you export a mesh to the engine, you can pick from a few options, like Low, Medium, High, and Nanite. I haven't tried the others, but I'm guessing Nanite is the happy medium they came up with: not so utterly overdetailed that the extra detail is completely imperceptible (and basically wasting disk space), but still wildly above any mesh you'd have used in a game prior. The mesh I imported was still well into the millions of triangles.

I picked the craziest-looking mesh I could find and threw it in the scene. It was multiple millions of triangles, and I dropped a couple in. That instantly cleaved my framerate from like 60 to 25 with just a couple of them. A couple minutes later, I figured out how to convert them to Nanite. It takes a moment to process the mesh, then bam, done. Framerate is instantly back up. So I keep adding more and more. Eventually I have like a hundred of these rock formations covering every inch of my view, and the framerate is still smooth and barely seems to budge at all no matter how many I throw into the scene. It's wild. Literally individual pebbles and rocks are modeled; the detail was astoundingly high. It definitely does what it says on the tin, it's an incredible piece of tech - I've never seen anything quite like it. Nanite seems very cheap and fast from a performance point of view. So yeah, their claim that you can practically throw Nanite meshes around with near impunity seems to be pretty much true, even on older hardware like mine.
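For anyone poking at this themselves, Nanite can be toggled at runtime through console variables, which makes the before/after comparison above easy to reproduce. A minimal sketch (cvar names as of the early UE5 releases; worth double-checking in your build):

```ini
; Typed into the editor/game console (` key):
r.Nanite 0     ; disable Nanite and fall back to the regular mesh path
r.Nanite 1     ; re-enable Nanite rendering (the default)
NaniteStats    ; print Nanite statistics to the log
```

Flipping r.Nanite off mid-scene is the quickest way to see how much work it's actually saving.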

I'm not sure what they did to dynamic shadows, but it looks like Nanite plays a role in rendering them. They're EXTREMELY sharp up close, seem to have no real issues catching even very tiny bits of detail and geometry, and filter out to a nice soft penumbra. Honestly, they're probably among the highest quality I've seen in a game. They're excellent.

Lumen is pretty heavy, but doable. Didn't screw with it much. Looks like it can accelerate certain things if you have hardware raytracing available (I don't, GTX 1080). It's definitely usable on this card. I'm sure I could dial a few things back a little and make the performance more tolerable. The quality looks very good.

The new temporal upsampling definitely works well. It's very clean when you're sitting still. Without much motion, even going to half res was good quality. As the resolution falls, you'll start seeing artifacts in motion around edges, but it resolves to a very clean image. I have a 1440p monitor, and I think I could definitely live with whatever it was pumping out at 75% screen resolution. I don't know if it's as good as DLSS since I haven't used it before, but it definitely works very well as it stands, and I'd presume will continue getting better.
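The resolution scaling described above is driven by ordinary console variables, so it's easy to experiment with yourself. A sketch using the standard UE5 cvars (defaults can vary per build, so treat these as assumptions to verify):

```ini
; Editor/game console:
r.ScreenPercentage 75     ; render at 75% of output resolution, then upscale
r.AntiAliasingMethod 4    ; 4 selects Temporal Super Resolution in UE5
```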


This definitely feels next gen - the entire thing is amazingly impressive.
 
If you're just a gamer and have no programming or developer experience, is it hard to just install the demo and run around in it? Or does it require a bunch of Klingon pain-stick torture to get running?
 
Like I said long ago and was mocked for, the PS5 wasn't magic, it was just very low latency/overhead...
That statement on the PS5 has an "apart from that, Mrs. Lincoln, how was the play?" vibe to it.

I don't recall people disputing that you could technically replicate the PS5's performance with a reasonably modern PC. The problem is that you currently need a lot of PC to replicate that performance, and that the PS5's low overhead and latency (along with adopting PCIe Gen 4 SSDs quickly) go a long way toward making UE5 sing on relatively affordable hardware. That and it'll be years before PC game developers can assume a large chunk of their customer base has all the necessary components in place.
 
If you're just a gamer and have no programming or developer experience, is it hard to just install the demo and run around in it? Or does it require a bunch of Klingon pain-stick torture to get running?

No, just a lot of disk space.

Basically just install the 5.0 engine release from the Epic launcher, then install the project and open it. You can run the demo straight in the editor.

My feeble GTX 1080 is definitely feeling the heat, dropping to about 25 fps on high settings at 1440p. Resolution scaling recovers a lot of performance and it still looks very good. A more modern card should have no problem.
 
This definitely feels next gen - the entire thing is amazingly impressive.
100% agree.
I'm messing around in the UE5 editor now.
I think this will fix the biggest gripe I have with UE4 games: they are a pain to optimize, so you get games like ARK. Amazing idea, and super fun, but so, so horribly optimized that you hate it.
Nanite literally takes LOD authoring away from the developer, which was so time-consuming to do well.
My RX 5700 never felt so great, and I bet the performance is even better if you package the demo/game (doing that now)

This will save SO much development time to focus on other things like bug fixing
 
Been waiting for this to be publicly announced for a bit, and was the main reason why I wanted to get my hands on a 3090. Expecting my 980ti to burst into flames. Going to spend some time this weekend and get some assets through the system and see how it is.

Gamers are complaining about disk space and download sizes now...
 
Making LODs is trivial for modern games with tools like Simplygon integrated into the engine pipeline as a pre-process at object import. It's very rare that I have to make an LOD by hand, and it's usually a fast (if annoying) process. Most of my time is spent on the initial high-poly model.

Until I go through the pipeline 100%, I can't say what has been saved in terms of asset creation. But I think I can see where artists will spend more time: modeling and texturing/fixing 3D scans. We may be able to nix the need to make a low-poly mesh, but the need to make more detailed models will increase (depending on game type).
 
Making LODs is trivial for modern games with tools like Simplygon integrated into the engine pipeline...
I guess I'm just inexperienced with UE4.
I've been using assets from the marketplace and some of them are horribly optimized
 
I just realized that I only just meet the "recommended" requirements:
64GB of RAM..... DANG

 
I guess I'm just inexperienced with UE4.
I've been using assets from the marketplace and some of them are horribly optimized
UE4 has a built-in LOD system. It works. But if it mangles an asset, that's when you'll need to make one manually. I'm sure it works for 95% of cases, especially level geometry.

Marketplace assets are a gamble I am sure.
 
Remember when Euclideon came out with their Unlimited Detail voxel-based engine demos, and people pointed out that it only worked with non-animated objects and said there'd never be 'unlimited' detail? Euclideon were savaged for even claiming the concept was attainable.

But have Epic Games now accomplished that very thing?




 
I went to grab this and had space on my drives for it, but it insisted on having 100 GB on my C: drive for cache even though I was installing it on a different drive, so I said nah.
 
Can't wait to try this. Got the engine installed, just taking a while to compile shaders. Looks like this is the real next gen.

Surprised that 64GB of RAM is recommended for 30fps, lol. Might have to buy some RAM right now.
 
Mobile data averages 500 kbps here; how long a download would that be? Can it run off a hard drive?

UE4 took forever to load off an SSD. I am going to assume an HDD is really not going to work well.
 
So with Lumen, no more "building lighting"?

You can still use static lighting if you really want, but the engine defaults are now purely dynamic lighting.

You actually need to explicitly opt into static lighting now - it's literally an engine configuration setting (that requires a full engine restart, no less), and unless you have it toggled, the build lighting setting is grayed out.
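If you do want static lighting back, the opt-in lives in the renderer settings. A minimal DefaultEngine.ini sketch (the same setting the Project Settings UI writes; an editor restart is still required):

```ini
; Config/DefaultEngine.ini
[/Script/Engine.RendererSettings]
r.AllowStaticLighting=True
```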
 
Also I've tried a few UE4 projects and they convert without issue.

Just create the project for UE 4.26, and try to open it in UE5. That's basically it.

I got the Infiltrator demo loaded without much fuss, then fucked around with converting the geometry to Nanite, which accomplished a whole lot of nothing, really. You can easily convert any asset to Nanite, but there doesn't really seem to be much point outside of extremely detailed meshes. It seems like there's a little bit of overhead to turning Nanite on, and if you're just feeding it pussy little bitch meshes and not zillion-triangle bad motherfucker meshes that strike fear into GPUs, it doesn't look like there will be any performance wins to find. If anything, it might be a little slower.

Interestingly, Lumen actually looks kind of fucked up at times in these demos because they placed a bunch of lights to fake lighting.

edit: also the engine runs DX12 by default now, haven't tried to see if the DX11 renderer is still alive yet.

Also also it loads MUCH faster than UE4. Even the gigantic demo project in its 100gb glory loads very quickly.

Also also also, this thing devours CPUs. My 6700k is absolutely getting its cheeks CLAPPED.
 
Finally got everything downloaded and installed.

The demo is friggin nuts. Real next gen. Sadly it was struggling to get 30 fps and my machine is no slouch.

I only have 16GB of RAM so maybe that is the issue. I ordered another 16GB so hoping that will fix the stutter.
 
I don't recall people disputing that you could technically replicate the PS5's performance with a reasonably modern PC...


I was mocked because I pointed out that PC was going to require more RAM, which it did. Still, that requirement is for the editor with raw assets; if you build the demo, it compresses all the way down to 25 GB, which is still hefty but more manageable, and the RAM requirement shrinks too. This has only been possible after a year of polish and the inclusion of the RAD Game Tools compression technology (which Epic bought). It is really amazing how far they have gone in one year, so I have really high expectations for the next.


(The mocking came specifically from PCMR folks who mocked the PS5's I/O, saying that anything on PC at that moment could run it. That wasn't the case, because it was just an initial rough test build, and I also had to point out that on PC at the time, data had to take a much longer detour, which hurt latency; the Windows I/O stack at that point was hot garbage. It still mostly is, which is why we are desperately waiting for the rewrite Microsoft promised. That "Windows I/O stack is hot garbage" lesson hit Linus Sebastian like a sack of rocks when he tried to build his own ultra-high-speed, low-latency SSD server stack for his staff's constant video editing, and it forced him to do the apology video.)
 
The demo is friggin nuts. Real next gen. Sadly it was struggling to get 30 fps and my machine is no slouch...


At the start it is in editor mode, which uses raw everything. I don't know the specifics, but you've got to look up how to tell it to "build" the demo with Nanite; you'll know it worked because it creates a 25 GB package that's much less stressful on system resources.
 
Well, I think the demo has high requirements because Epic wanted to go all out and show what the engine could do.

In a real game, you wouldn't necessarily be loading super detailed pebbles to the point where 1 scene takes 100GB.

I tried some of my UE4 projects in UE5 and performance looked similar, still have to mess with the new features, but it seems like the new engine is performant. It's just the massive assets in the demo.
 
I was mocked because I pointed out that PC was going to require more RAM, which it did...
Ah, I getcha. Sorry, it's been a while since then!

On that note, I did an informal PC config to see what you'd need to hit the UE5 demo's 30 FPS performance figures, and you could easily pay triple what a PS5 (disc edition, even) or Xbox Series X costs, at least if you're not scraping the bottom of the barrel for components. Yeah, the PC will be useful for productivity, but it's going to be a while before the price comes down to the point where it's a better value than just buying a PS5/XSX and a cheap laptop.
 
I was getting around 30 fps with a 2080 Ti, 8700K, and 16GB of RAM.

The issue I was having, though, is that it would be smooth and nice for 10 - 15 seconds, then stall out and stutter at like 1 fps.

I believe this may have been a memory paging issue when I ran out of RAM. Because when it was working, the 30 fps was smooth and playable. So I bought 16GB more RAM, hoping that will fix it.

Also, someone on Reddit said there was a way to compress and rebuild the demo down to 25GB and then it would work better on "slower" machines. I will look into this.
 
Packaging the game with defaults made it about 25 GB.
Do we need to toggle anything to package it with Nanite?
 
Seems like threaded rendering is disabled by default for some reason? Enabling it can give a solid performance benefit in some scenes.

r.RHIThread.Enable 1 in console.
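To avoid retyping that every session, the same cvar can be persisted in the project config. A sketch assuming the standard SystemSettings mechanism in the project's ini:

```ini
; Config/DefaultEngine.ini
[SystemSettings]
r.RHIThread.Enable=1
```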
 