Unreal Engine 5 Feature Showcase

Ugh I hate this so hard... It makes me want to support Unreal more because I can actually deploy this to the students in a manner they can use and that's relevant... The bastards, the tie-in with their tools, assets, and Nvidia with their cheap motion capture... I can deploy this pretty cheaply and have huge student uptake with it.

I hate it more because I am gonna need a new PC; I slightly beat out the minimum requirements, but not by much. July 20th, 2023 is when I am scheduled for my next replacement, and I look forward to what's available then.
 
The integration isn't great if you're a fan of choices in developer tools, but this really is one of those instances where folding so many features into one editor could have a meaningful effect on content creation. I like the thought that a relatively inexperienced developer could make something decent without spending a small fortune or wasting hours on some tasks.
 
That's the thing: I could deploy this to our high schools right now with minimal effort on my part and basically give them everything they need to get started in an afternoon. In my case, too many tools is a huge problem, because teachers aren't going to learn 10 new tools to give kids choice, but if I say here's one that does basically everything to an OK degree, they'll get to work on that shit.
 
I'd be sorely tempted to roll it out, then. The goal is to get teens creating something and learning skills they can translate elsewhere — I'd rather they work in one tool now than have to wait until college, especially when it's a tool they might use in their careers.
 
Literally on the phone with 3 principals and the guy who handles grant applications to see if we can get this funded for next school year. This is everything I would have wanted when I was in HS and I want them to have access to it.
 
What should an inexperienced developer start with: Blueprints, or learning C++?
If you have any kind of coding experience (HTML does not count, sorry), I would say C++ of course.

If you have zero coding experience, you can still start with Blueprints to see what you want to do visually, then start replacing that functionality with C++.
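
For anyone weighing the two, the usual path is to prototype in Blueprints and then move the hot or complex logic into C++. As a rough sketch (the class name and MYPROJECT_API module macro below are placeholders, not anything from the showcase), a value exposed with UPROPERTY stays editable from the editor and from Blueprint subclasses, so the designer-facing workflow survives the port:

```cpp
// Minimal sketch of the C++ side of a Blueprint-to-C++ port (illustrative only).
// "MYPROJECT_API" and the class name are placeholders for your own module/project.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "SpinningProp.generated.h"

UCLASS()
class MYPROJECT_API ASpinningProp : public AActor
{
    GENERATED_BODY()

public:
    // Still tweakable in the editor or from a Blueprint subclass after the port.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Spin")
    float DegreesPerSecond = 90.0f;

    ASpinningProp()
    {
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        // The same behaviour you would wire up with Event Tick in a Blueprint graph.
        AddActorLocalRotation(FRotator(0.0f, DegreesPerSecond * DeltaSeconds, 0.0f));
    }
};
```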
 
So photogrammetry and marketplace-based asset sharing finally make it into game development. The two things I've been saying game development needed since about 2012. They took their time.
I am looking forward to trying out some scans using their phone app. Depending on how well it works, it could also prove useful for 3D printing.
 
Marketplace asset sharing has been common for Unity since 2008-9, and for Unreal Engine since around 2015, once the UE4 marketplace got critical mass...
 
I am looking forward to trying out some scans using their phone app. Depending on how well it works, it could also prove useful for 3D printing.
I have used Reality Capture before; it was hit and miss, definitely not the best photo reconstruction app. And that was the paid version; this is some free derivative of it, so I'd not expect much from it. The first thing that seems idealized in the video is where they show taking photos of a sofa and then a perfect sofa model as the result. In reality you'd get a ton of background noise with such a model unless you meticulously masked out the background in all the images. And even then the algorithm might decide to fold the model in on itself, sending you into a frenzy trying to change processing parameters so you get a relatively clean model. And emphasis on relatively: with photogrammetry the models look nice at a distance thanks to the high-res texture, but if you look closer there is not a single flat surface on them; every surface is a mountain range of tiny ridges and valleys.

But maybe they have vastly improved the software since I last checked. Either way I'd recommend trying 3DF Zephyr, which was the best software for scanning objects like this the last time I compared them. That also has a free version, with a limit on how many photos you can use in one model, but it's not cloud based, so you need to use your own computer for processing (which I prefer anyway).
 
I'm pre-coffee and tried to figure out what Phantasea was for far too long... 😱 You are probably right, Studio FOW has got to be chomping at the bit...
 
kissing a robot version of yourself would be strange, but I'll watch that one go at it
 
As extremely impressive as UE5 is, it still looks very static. This is a question for those of you who have used it: how difficult is it to add proper physics to these scanned objects?

One thing that always bugs me about our current state of graphics fidelity is that the physics of the game world and the objects in it rarely match the quality of their appearance, which I find very jarring. The more realistic the graphics, the easier it is to notice the things that are off.
 
That's thanks to a minority (amdrones) rejecting the idea of GPU PhysX... It's now done very roughly, slowly, and simply on the CPU, which dramatically limits the capabilities we devs have. Nvidia shouldn't have restricted AMD users from using a second Nvidia card for GPU PhysX, though... That definitely hurt adoption.
 
They've improved the physics a lot. There were some videos on it earlier. For example, instead of static walking animations they apply physics, so characters can walk naturally over rough terrain, on slopes, going up stairs, etc. A bunch of different things react to physics, which not only looks better but cuts way down on development time, because animators don't have to create a ton of different animations and tweak a ton of stuff by hand to make it look good.

Also, if you watch videos for some of the games being made in UE5, they do some pretty cool things. There is that Wukong game where they're fighting in waist-high snow and all the snow is getting pushed around.
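
To the original question about giving scanned objects proper physics: in stock UE it's mostly a matter of giving the mesh simplified collision and flipping simulation on; the fiddly part is authoring sane collision for a messy photogrammetry mesh. A minimal sketch (the class name and MYPROJECT_API macro are placeholders, same caveat as above):

```cpp
// Minimal sketch: a scanned static mesh set up as a simulated rigid body (illustrative only).
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "PhysicsProp.generated.h"

UCLASS()
class MYPROJECT_API APhysicsProp : public AActor
{
    GENERATED_BODY()

public:
    UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "Prop")
    UStaticMeshComponent* Mesh;

    APhysicsProp()
    {
        Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
        RootComponent = Mesh;

        // A raw photogrammetry scan ships with no usable collision; generate a convex hull
        // or simple primitives for the mesh asset in the editor or none of this does much.
        Mesh->SetSimulatePhysics(true);
        Mesh->SetCollisionEnabled(ECollisionEnabled::QueryAndPhysics);
        Mesh->SetMassOverrideInKg(NAME_None, 20.0f, true);
    }
};
```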
 
Can you build this into a package? My UE5 doesn't have the packager options.
 
That's thanks to a minority (amdrones) rejecting the idea of GPU PhysX... It's now done very roughly, slowly, and simply on the CPU, which dramatically limits the capabilities we devs have. Nvidia shouldn't have restricted AMD users from using a second Nvidia card for GPU PhysX, though... That definitely hurt adoption.
Because the PhysX code for CPU is ancient and is not really supported by Nvidia.
 
Great breakdown, hopefully they can get core usage normalized since most enthusiast machines these days have really started to pack in the cores...
 
Sadly that has more to do with branch prediction and the current state of CPU-based physics and AI pathing than anything else. You can only throw so many cores at a situation before you end up with a too-many-cooks problem, where they sit around waiting either for something else to finish calculating or for some resource to become available.
In reality, we need to move more of these things off the CPUs and onto something specific, either a secondary card capable of good OpenCL performance or a dedicated FPGA. Things are getting more complex and general computing devices are falling behind.
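
A quick back-of-the-envelope illustration of that too-many-cooks point (the numbers are made up, it's just Amdahl's law): if only part of the per-frame physics/AI work actually scales across threads, extra cores flatten out fast.

```cpp
// Amdahl's law sketch: speedup = 1 / ((1 - p) + p / n), where p is the parallel fraction.
#include <cstdio>

int main()
{
    const double p = 0.8; // assume 80% of the work parallelizes cleanly (illustrative)
    for (int cores : {4, 8, 16, 32, 64})
    {
        const double speedup = 1.0 / ((1.0 - p) + p / cores);
        std::printf("%2d cores -> %.2fx\n", cores, speedup);
    }
    return 0;
}
// Even 64 cores top out below 5x with a 20% serial share, which is why idle cores
// don't disappear just by packing more of them into the box.
```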
 
That's thanks to a minority (amdrones) rejecting the idea of GPU PhysX... It's now done very roughly, slowly, and simply on the CPU, which dramatically limits the capabilities we devs have. Nvidia shouldn't have restricted AMD users from using a second Nvidia card for GPU PhysX, though... That definitely hurt adoption.
Not AMD's fault, that's heavily misplaced. They were working on it with Havok. The two biggest reasons it died, if you're going to point fingers at companies, are maybe Intel and Nvidia??? Intel for buying and essentially killing Havok going the GPU route, and Nvidia for never opening PhysX up and for switching focus to CPU acceleration so it could be used more widely (it's the default in both UE4 and Unity).

Because the PhysX code for CPU is ancient and is not really supported by Nvidia.
From what I can remember, it's actually the other way around. The CPU side continues to get development from Nvidia to this very day and is used a lot. The GPU acceleration side is essentially dead (even though Nvidia could probably make amazing use of the Tensor cores to do some insane physics stuff).
 
Because the PhysX code for CPU is ancient and is not really supported by Nvidia.
They have announced PhysX 5.0 and it was scheduled to be out already with its new and shiny features, but <insert COVID delay story here>, so it's not out yet. They say soon, though, and they are making some pretty grand promises about it so far; if half of them are true, it should at long last give Havok a serious contender.
 
Not AMD's fault, that's heavily misplaced. They were working on it with Havok. The two biggest reasons it died, if you're going to point fingers at companies, are maybe Intel and Nvidia??? Intel for buying and essentially killing Havok going the GPU route, and Nvidia for never opening PhysX up and for switching focus to CPU acceleration so it could be used more widely (it's the default in both UE4 and Unity).


From what I can remember, it's actually the other way around. The CPU side continues to get development from Nvidia to this very day and is used a lot. The GPU acceleration side is essentially dead (even though Nvidia could probably make amazing use of the Tensor cores to do some insanely amazing physics stuff).
https://github.com/NVIDIAGameWorks/PhysX
More or less, yes, but when they announced that they were working on 5.0, most of the development outside of bug patching for 4.1 stopped. It is lacking features and generally is not as good as Havok for hard-body physics, but it's better for visual effects, particles, water, weather, that sort of stuff.
But yes, the CPU side gets more work than the GPU side, because more often than not the GPU doesn't have the resources to spare while the CPU has cores sitting idle. The problem is that good CPU-based physics is memory-channel intensive and is more often than not the cause of those nasty 1% and 5% lows.
 
Not AMD's fault, that's heavily misplaced. They were working on it with Havok. The two biggest reasons it died, if you're going to point fingers at companies, are maybe Intel and Nvidia??? Intel for buying and essentially killing Havok going the GPU route, and Nvidia for never opening PhysX up and for switching focus to CPU acceleration so it could be used more widely (it's the default in both UE4 and Unity).

Didn't Microsoft buy Havok?
 