Unreal Engine 5 Revealed! Running in real time on PlayStation 5

If you have the time, I recommend checking out Digital Foundry’s video on the reveal of the PS5’s specs. Starting at around the 13:30 mark, they begin talking about the SSD and its controller.

It’s all over my head, so I’m not going to attempt to summarize what they or Cerny say on the matter.

Actually, yeah, that is what I was talking about. What they are doing is having 12 channels with which to call on the storage. How they make sure the channels all have access to the same data is interesting, but in theory they don't all need access to the same data. So what they are doing is assigning specific calls to specific channels based on priority.

This is very similar to how larger servers do storage. You have multiple channels to access the storage space, load data into memory, and serve it to the application. You will still need memory caching, but the fact that the video controller can directly access the storage is interesting.

Package this with a PC and you're talking about loading entire regions into RAM very, very fast instead of loading smaller segments.

This will require a different way of accessing the storage.
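To picture the priority-per-channel idea from the posts above, here's a toy sketch. The channel split and the priority class names are invented purely for illustration; this is not how the actual controller works:

```python
class ChannelDispatcher:
    """Toy model of 'specific calls get specific channels based on priority':
    the 12 channels are split into pools reserved for priority classes."""

    def __init__(self):
        # Hypothetical split of the 12 channels into priority pools.
        self.pools = {
            "critical": list(range(0, 4)),    # e.g. geometry needed next frame
            "normal":   list(range(4, 10)),   # regular streaming
            "bulk":     list(range(10, 12)),  # background prefetch
        }
        self.next_slot = {name: 0 for name in self.pools}

    def assign(self, priority):
        # Round-robin within the pool reserved for this priority class,
        # so low-priority bulk reads can never starve critical ones.
        pool = self.pools[priority]
        channel = pool[self.next_slot[priority] % len(pool)]
        self.next_slot[priority] += 1
        return channel

dispatcher = ChannelDispatcher()
dispatcher.assign("critical")  # -> 0
dispatcher.assign("bulk")      # -> 10
```

The point of the split is exactly what the post describes: a big background load can't monopolize the channels that latency-sensitive calls depend on.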

I wonder how this will compare to the new Xbox.
 
Other than a few maps that have limited interesting detail in them, UT3 did look pretty unappealing when it released, IMO. I wanted to like it, but it feels dead and lost compared to the other games in the series. And most of it looks like crap, too - I agree with the "washed-out" comment.

It looks, sounds, and plays like a dissociated half-cooked console take on UT.
 
This is very similar to how larger servers do storage. You have multiple channels to access the storage space, load data into memory, and serve it to the application. You will still need memory caching, but the fact that the video controller can directly access the storage is interesting.
Yeah, it's also related to how levels are structured and to the visibility/occlusion methods (VIS in Source and other engines): you basically mark areas of the world that are seen before others, and that data gets stored in a way that lets it be retrieved optimally. They do this when authoring CDs, and some defrag programs do this on your hard drive/storage. This isn't automatic, though, so it's still not as if an artist can just do whatever they want and expect the same performance; levels will still be made with walls/barriers to help manage and direct these things. A player will likely never notice any of this, except of course for the biggest annoyance: LOD pop-in and viewing distances. Some games also limit FOV to help save performance.

Anyway, that's my attempt to put it into layman's terms. AWS and other network-storage approaches obviously have a big influence here as well... and as much as I hate to say it, folks like Mojang and Euclideon at least helped get more people thinking about how best to optimize these sorts of things, as they are a bit less conventional. These things might all sound unrelated to each other, but when creating games there are a lot of dependencies across a wide array of assets/pipelines.
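The CD-authoring/defrag trick mentioned above, ordering data on disk by which area sees it first, can be sketched like this (asset and region names are made up for illustration):

```python
def layout_assets(assets, visibility_order):
    """Order assets on disk so data for the areas the player sees first
    is laid out first (and contiguously) for fast sequential reads."""
    rank = {region: i for i, region in enumerate(visibility_order)}
    # Sort each asset by the earliest region that can see it.
    return sorted(assets, key=lambda a: min(rank[r] for r in a["visible_from"]))

assets = [
    {"name": "castle_mesh", "visible_from": ["courtyard"]},
    {"name": "spawn_props", "visible_from": ["spawn_room"]},
    {"name": "skybox",      "visible_from": ["spawn_room", "courtyard"]},
]
layout = layout_assets(assets, ["spawn_room", "courtyard"])
# spawn-area data ends up first on disk: spawn_props, skybox, castle_mesh
```

This is also why the walls/barriers matter: they define the visibility order that the layout is optimized against.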
 
I'm really excited for Unreal Engine 5. There's nothing I want more than to watch as fully raytraced cheaters make games unplayable due to a nearly complete lack of basic rule validation in the engine. So much fun!
 
While many think that Unreal Engine is limited to gaming, that is not the case. According to Quentin Staes-Polet, General Manager for India and SEA at Epic Games, the Unreal Engine is transforming many industries, from movie production to broadcasting and automobiles. Quentin says the technology is no longer there purely to support game developers.



https://indianexpress.com/article/technology/gaming/epic-games-unreal-engine-5-interview-6421867/

https://hardforum.com/threads/unreal-engine-v.1996579/post-1044597515

“Unreal Engine and the main push in India for us was actually not gaming but the use of the engine by moviemakers, broadcasters, animation and VFX houses, architects to do simulation and 3D visualisation of data. The features we are adding in the engine are going to change both the film and broadcasting industry very dramatically.”
 
Most of us here understand what makes UE5 worthy of that hype and it's not just the graphics.
Most of it seems to be people overly impressed with photogrammetry - "whoa next gen graphics" - which isn't something new. We all remember the Infiltrator demo and that ended up not really meaning much.

But if it creates dev tools that dissolve technical barriers and speed implementation time then that's impressive for sure. Anything to make games look better.
 
Actually yea that is what I was talking about. what they are doing is having 12 channels to call on the storage with.
If they're using a 12-channel controller instead of the common 8-channel controller... that's not really that impressive IMO.

Consumer SSDs could be significantly faster today, but the costs of getting there now are far higher than the benefits of getting there now (which are basically none).

You'll find very different approaches in the enterprise sector.
 
Most of it seems to be people overly impressed with photogrammetry - "whoa next gen graphics" - which isn't something new. We all remember the Infiltrator demo and that ended up not really meaning much.

But if it creates dev tools that dissolve technical barriers and speed implementation time then that's impressive for sure. Anything to make games look better.

That is what the hype is about. The graphics shown didn't look much better than Shadow of the Tomb Raider. What was noteworthy is that it will be much easier to create such environments, and they can be much more expansive. So: less time to get that level of detail, which is a great thing. Obviously it looked good, but the looks weren't the impressive part. It sounds like development time as well as optimization will be greatly improved.
 
Interesting take from one dev. He’s a fan of Lumen and thinks it’s good enough with a lower performance hit than RT. We first saw SVOGI in early UE4 demos but it never actually made it into the engine. There are more games with RT than SVOGI today.

Time will tell whether UE5 games actually ship with SVOGI, or whether future RT hardware will be fast enough to make it irrelevant.

https://simonmajar.com/blog/mNG8/the-need-for-lumen

In my opinion, raytraced GI is not ready for games on a large scale yet, because of its performance cost. Few games have been released with it, and I expect the same pace for the next few years. The hardware just isn’t powerful enough yet.

A sparse voxel octree is a simplified representation of the geometry, in which the lighting calculations are cheaper. Combined with more precise screen-space methods, the results are amazing. That’s the path Epic chose for Lumen, and I’m very happy with that decision.

Graphical features in video games always trade off accuracy against performance, and with GI being particularly expensive, brute-force raytracing doesn’t seem to be the right solution yet.
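For reference, a sparse voxel octree like the one the blog post mentions is just an octree that only allocates children where geometry actually exists. A textbook sketch follows; this is not Lumen's actual implementation, just an illustration of the data structure:

```python
class SVONode:
    """One cell of a sparse voxel octree: children exist only where needed."""
    def __init__(self):
        self.children = {}   # octant index (0-7) -> SVONode
        self.radiance = 0.0  # lighting value cached at this voxel

def insert(root, x, y, z, max_depth, radiance):
    """Insert a point in [0,1)^3, subdividing down to max_depth."""
    node = root
    cx = cy = cz = 0.5   # centre of the current cell
    half = 0.25          # half-size of the next level's cells
    for _ in range(max_depth):
        # Pick one of 8 octants from the point's position vs. the centre.
        octant = (x >= cx) + 2 * (y >= cy) + 4 * (z >= cz)
        node = node.children.setdefault(octant, SVONode())
        cx += half if x >= cx else -half
        cy += half if y >= cy else -half
        cz += half if z >= cz else -half
        half *= 0.5
    node.radiance = radiance
```

Lighting queries then walk this much smaller structure instead of the full triangle soup, which is why the calculations are cheaper than brute-force raytracing against the real geometry.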
 
Interesting take from one dev. He’s a fan of Lumen and thinks it’s good enough with a lower performance hit than RT. We first saw SVOGI in early UE4 demos but it never actually made it into the engine. There are more games with RT than SVOGI today.

Time will tell whether UE5 games actually ship with SVOGI, or whether future RT hardware will be fast enough to make it irrelevant.

https://simonmajar.com/blog/mNG8/the-need-for-lumen

In my opinion, raytraced GI is not ready for games on a large scale yet, because of its performance cost. Few games have been released with it, and I expect the same pace for the next few years. The hardware just isn’t powerful enough yet.

A sparse voxel octree is a simplified representation of the geometry, in which the lighting calculations are cheaper. Combined with more precise screen-space methods, the results are amazing. That’s the path Epic chose for Lumen, and I’m very happy with that decision.

Graphical features in video games always trade off accuracy against performance, and with GI being particularly expensive, brute-force raytracing doesn’t seem to be the right solution yet.

I have a feeling he won't be the only developer to feel that way, either.
 
Interesting take from one dev. He’s a fan of Lumen and thinks it’s good enough with a lower performance hit than RT. We first saw SVOGI in early UE4 demos but it never actually made it into the engine. There are more games with RT than SVOGI today.

Time will tell whether UE5 games actually ship with SVOGI, or whether future RT hardware will be fast enough to make it irrelevant.

https://simonmajar.com/blog/mNG8/the-need-for-lumen

In my opinion, raytraced GI is not ready for games on a large scale yet, because of its performance cost. Few games have been released with it, and I expect the same pace for the next few years. The hardware just isn’t powerful enough yet.

A sparse voxel octree is a simplified representation of the geometry, in which the lighting calculations are cheaper. Combined with more precise screen-space methods, the results are amazing. That’s the path Epic chose for Lumen, and I’m very happy with that decision.

Graphical features in video games always trade off accuracy against performance, and with GI being particularly expensive, brute-force raytracing doesn’t seem to be the right solution yet.

Jarrod Walton, until recently an editor at PC Gamer and now at Tom's Hardware, seems to agree:

Unreal Engine 5 uses a combination of techniques, including a library of objects that can be imported into games as hundreds of millions of polygons, and a hierarchy of detail that treats large and small objects differently to save on processor resources.

For most game makers such "hacks and tricks" will be good enough, says Mr Walton.

https://www.bbc.com/news/business-52541218
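That "hierarchy of detail" bit can be illustrated with a toy LOD pick based on projected screen size. The thresholds here are invented for illustration; Nanite's actual selection is far finer-grained than whole-object LODs:

```python
def select_lod(object_radius, distance, fov_scale=1.0):
    """Pick a level of detail from how big the object appears on screen:
    big/close objects get full detail, tiny/distant ones a coarse mesh."""
    screen_size = fov_scale * object_radius / max(distance, 1e-6)
    if screen_size > 0.5:
        return 0   # fills much of the screen: full-detail mesh
    if screen_size > 0.1:
        return 1
    if screen_size > 0.02:
        return 2
    return 3       # tiny on screen: coarsest mesh

select_lod(object_radius=2.0, distance=3.0)    # -> 0 (large and close)
select_lod(object_radius=2.0, distance=200.0)  # -> 3 (speck in the distance)
```

The "hacks and tricks" framing fits: nobody can tell a 300-triangle rock from a 3-million-triangle rock when it covers four pixels.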
 
Jacob Ridley of PCGamer asked some of today's top developers what Unreal Engine 5 means to them and the future of videogame development:


"It is quite impressive to actually see a new technology that works on an upcoming console and that looks like it could become a standard for the upcoming generations.
... we have been a partner/licensee with Epic since 2010 and have published almost all our games with their tech
... This brings back memories of the tech demos showcasing Sparse Voxel Octree of many years back"
~ Carlos Bordeu, one of three co-founders of studio ACE Team


"...Nanite is the feature that I find most exciting;
Near unlimited detail and faster art production is a win-win!
... The only drawback I see is exponentially increasing the install size of the game..."
~ Stephen Kick, CEO of Nightdive Studios


"Artists, Designers, Programmers; All of the different disciplines that make up the development team are bogged down by redundant tasks that could otherwise be optimized out of the development pipeline.
This leaves the remaining time for creative solutions to bigger problems, or simply more time spent on the part of game development that we all got into this industry for."
~ Sean McBride, creative director at Tripwire Interactive


"Seeing the work that Epic is doing in UE5 will give more studios the confidence that they can quickly take assets to a game ready state empowering them to be creative, take risks, and make a product they may not have been able to previously.
... I am excited to see how the gap between large and small studios shifts!"
~ Jon Carr, technical director at Tripwire Interactive
 
Jacob Ridley of PCGamer asked some of today's top developers what Unreal Engine 5 means to them and the future of videogame development:

Jacob also feels UE5's Lumen will replace the need for using Ray Tracing to achieve Global Illumination in games.
Global illumination is a key driver behind ray tracing adoption as it stands today, wherein tracing the path of indirect and diffused lighting can serve to generate physics-based lighting that is entirely dynamic.
Yet UE5, and its proprietary built-in solution to the same end, is proving popular among developers as an avenue to achieve that goal sooner rather than later.
 
Well UE4 can already take several minutes to open a blank project.

I don't want to know how long you'll wait if you have million poly assets in there unoptimized.
 
Well UE4 can already take several minutes to open a blank project.

I don't want to know how long you'll wait if you have million poly assets in there unoptimized.

Maybe with Microsoft's DirectStorage & Xbox Velocity Architecture, time is not an issue. Only disk size (& cost)?

As of today, Microsoft & Seagate have announced 1TB cartridges (but no price).

In the future they can market 2TB, 4TB, or 8TB cartridges too. Unlike the console, the accessories (especially the cartridges) will not be discounted, unless they're part of a bundle deal when buying the console!
 
Well UE4 can already take several minutes to open a blank project.

I don't want to know how long you'll wait if you have million poly assets in there unoptimized.

Yeah, I noticed this too, even on an SSD. I tried to open the Insurgency Sandstorm map editor, which is UE4-based. It took very long to load and then froze. They claim 16GB is okay as a minimum requirement, but I think by "minimum" they mean it will more than likely crash due to lack of RAM. :(
 
The more I look at this, the more I can't help but think the PC gaming industry will be in trouble for a while as it adapts to the PS5 and Xbox Series X.

You'd realistically need at least an NVMe SSD in a computer to keep up with what UE5 and similar technologies can do on the newer consoles, but many people aren't about to drop hundreds of dollars on upgrading storage (or on higher-end configs for new computers) even if they can afford it. Yeah, you can lower the detail levels for people stuck with SATA SSDs and spinning disks, but you can't do that if the issue is a very large world or other inherent gameplay mechanics.

As such, there may be a year or two where PC-only gamers will have to look on with some envy as certain kinds of games simply look and/or run nicer on the PS5 and XSX.
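Some rough numbers behind that worry, using approximate sequential-read speeds (ballpark figures; real drives vary) and a hypothetical 2 GB world region to stream in:

```python
# Approximate sequential read speeds in GB/s (ballpark figures).
SPEEDS_GB_S = {
    "PS5 SSD (raw)": 5.5,   # Sony's quoted raw figure
    "PCIe 3.0 NVMe": 3.5,
    "SATA SSD":      0.55,  # SATA III interface ceiling
    "7200rpm HDD":   0.15,
}

REGION_GB = 2.0  # hypothetical world region to stream in

for name, speed in SPEEDS_GB_S.items():
    print(f"{name:>14}: {REGION_GB / speed:5.2f} s")
```

A SATA SSD takes roughly 10x longer than the PS5's quoted raw rate, and a hard drive over 35x longer. That's the kind of gap you can't paper over with detail settings if the game design itself assumes the faster drive.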
 
The more I look at this, the more I can't help but think the PC gaming industry will be in trouble for a while as it adapts to the PS5 and Xbox Series X.

You'd realistically need at least an NVMe SSD in a computer to keep up with what UE5 and similar technologies can do on the newer consoles, but many people aren't about to drop hundreds of dollars on upgrading storage (or on higher-end configs for new computers) even if they can afford it. Yeah, you can lower the detail levels for people stuck with SATA SSDs and spinning disks, but you can't do that if the issue is a very large world or other inherent gameplay mechanics.

As such, there may be a year or two where PC-only gamers will have to look on with some envy as certain kinds of games simply look and/or run nicer on the PS5 and XSX.

I'm sure it'll be fine. Developers certainly know how to handle load times and will continue to make games for PS4/X1X for a few more years. It's a very good thing that consoles are forcing a PC upgrade cycle and it also helps that NVMe drives are pretty cheap now.
 
In an interview with VG24/7, Vice President of Engineering Nick Penwarden claimed that his team rewrote Unreal Engine 5’s I/O code after witnessing what the PlayStation 5 and its oft-advertised SSD could do.

“The ability to stream in content at extreme speeds enables developers to create denser and more detailed environments, changing how we think about streaming content. It’s so impactful that we’ve rewritten our core I/O subsystems for Unreal Engine with the PlayStation 5 in mind.”

https://www.thefpsreview.com/2020/0...alizing-what-the-playstation-5s-ssd-could-do/

Yesterday, Tim Sweeney clarified that while the PS5 was a significant motivator, the work that’s been accomplished for Unreal Engine 5 thus far will benefit all platforms.

There has been a massive effort to upgrade Unreal Engine loading and streaming to ensure CPU doesn’t become the bottleneck. PS5 has provided much of the impetus, but the work will benefit all platforms.
https://twitter.com/TimSweeneyEpic/...20/06/03/unreal-engine-rewrite-ps5-ssd-speed/
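The gist of such an I/O rewrite is keeping many requests in flight instead of blocking on one read at a time, so the CPU isn't stuck waiting on each request. A toy thread-pool version is below; a real engine would sit on platform APIs (io_uring, DirectStorage, the PS5's hardware queues) rather than Python threads:

```python
from concurrent.futures import ThreadPoolExecutor

def read_chunk(path, offset, size):
    """One blocking read of a byte range from a pack file."""
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(size)

def read_batch(path, requests, workers=8):
    """Issue many (offset, size) reads concurrently; results
    come back in request order rather than completion order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda r: read_chunk(path, *r), requests))
```

With fast SSDs the drive can complete requests faster than a single thread can issue them, which is exactly the "CPU becomes the bottleneck" problem Sweeney describes.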
 
Here's the full video of the Epic Games China livestream in which an engineer said the demo, or at least a part of it, gets 40 FPS running on their notebook. It's in Chinese without subtitles, though.

 
I do not understand; Red Dead Redemption 2 looked far and away better than this demo. I just do not get it.
 
Here's the full video of the Epic Games China livestream in which an engineer said the demo, or at least a part of it, gets 40 FPS running on their notebook. It's in Chinese without subtitles, though.



Did they mean that the demo, when running unlocked on the PS5, can do 40 FPS but was locked to 30 FPS?

I think Tim Sweeney clarified that the laptop was showing a video of PS5 running the demo.
 
Did they mean that the demo, when running unlocked on the PS5, can do 40 FPS but was locked to 30 FPS?

I think Tim Sweeney clarified that the laptop was showing a video of PS5 running the demo.

Tim was initially shown a screenshot of the EG China demonstration with the PS5 demo running on a laptop, and he didn't realize that the question he was asked was based not on that screenshot but on the comments an EG China engineer made during the livestream. The engineer's comments were apparently a separate matter from the video of the PS5 demo. When that was pointed out to Tim, he declined to comment on them, saying he doesn't know what the correct translation is.

The EG China engineer apparently said that he can run the demo in the UE5 editor on his notebook in real time at 40 FPS. His comment apparently wasn't related to the video of the PS5 demo shown during the livestream event.
 
I do not understand; Red Dead Redemption 2 looked far and away better than this demo. I just do not get it.
I think the idea is that you can make a great-looking game without a third of a billion dollars, 1,600 developers, and 8 years of development time. (thanks, Google)
 
So the PS5 is already beaten by around 25% by a laptop using an RTX 2080 and a 970 EVO Plus.

So all the fluff was just that...fluff.
Just like any other console launch...the consoles start with a performance deficit.

Guess this is a "told you so" moment...
https://www.pcgamer.com/unreal-engine-5-tech-demo-pc-performance/

Tim Sweeney did point out that the PS5 demo was running at a constant 30 FPS, which means it was running at 30 FPS and up - and Tim said often much higher than 30 FPS.
 
Tim Sweeney did point out that the PS5 demo was running at a constant 30 FPS, which means it was running at 30 FPS and up - and Tim said often much higher than 30 FPS.

Yeah, let's see how that plays out..."much higher" is a fluff metric, the "often" being the devil in the details.
It was "often", not "the majority"...don't get your hopes up.
 
I think the idea is that you can make a great-looking game without a third of a billion dollars, 1,600 developers, and 8 years of development time. (thanks, Google)

Except that RDR2 is far and away not only the best-looking game but also the best single-player game to come out in a very long time. Tim Sweeney is delusional if he thinks a game engine like that will make everything cheaper and easier to develop, unless all you want is Call of Duty 56: Same as Before. :D
 
Except that RDR2 is far and away not only the best-looking game but also the best single-player game to come out in a very long time. Tim Sweeney is delusional if he thinks a game engine like that will make everything cheaper and easier to develop, unless all you want is Call of Duty 56: Same as Before. :D

I think Global Illumination implemented in Unreal Engine is a big deal. It should work on all GPUs too, irrespective of SSD speed or whether the GPU supports Ray Tracing!


Jacob also feels UE5's Lumen will replace the need for using Ray Tracing to achieve Global Illumination in games.
 
Jacob also feels UE5's Lumen will replace the need for using Ray Tracing to achieve Global Illumination in games.
Of course he's not going to sell their tech short...but it is still a hack trying to do what raytracing does...keyword being "trying".
Raytracing is not going anywhere, like it or not.
 
So the PS5 is already beaten by around 25% by a laptop using an RTX 2080 and a 970 EVO Plus.

So all the fluff was just that...fluff.
Just like any other console launch...the consoles start with a performance deficit.

Guess this is a "told you so" moment...
https://www.pcgamer.com/unreal-engine-5-tech-demo-pc-performance/

Yeah, but cost though; you're looking at $2k for a 2080 laptop. If we're thinking a ~$500 PS5, we just gotta find a way to 4-way CrossFire some PS5s :rolleyes: and then we're talking apples to apples.
 
Yeah, but cost though; you're looking at $2k for a 2080 laptop. If we're thinking a ~$500 PS5, we just gotta find a way to 4-way CrossFire some PS5s :rolleyes: and then we're talking apples to apples.

No one ever mentioned "apples to apples", but thanks for the fallacy.
This is relevant because every single console launch has had the same BS.
This is no different.
Good luck vs. Ampere, and thanks for holding gaming back 6-7 years due to a locked platform...history repeating itself...once again.
 
No one ever mentioned "apples to apples", but thanks for the fallacy.
This is relevant because every single console launch has had the same BS.
This is no different.
Good luck vs. Ampere, and thanks for holding gaming back 6-7 years due to a locked platform...history repeating itself...once again.
I never know who Sony & MS are trying to impress when they want to get into a spec debate with the 600lb discrete-GPU gorilla. It's not even trying to punch above their weight, but more like a kid with chocolate all over its face pulling on Nvidia's pantleg to get attention.

People that care about specs aren't really focused on consoles; people that are focused on consoles don't tend to know what specs even are, or how it compares to dedicated PC hardware. Mostly people care about the exclusives. So it's always lots of bluster leading up to a new gen launch, then the peashooter hardware is exposed for what it is soon after.
 
I never know who Sony & MS are trying to impress when they want to get into a spec debate with the 600lb discrete-GPU gorilla. It's not even trying to punch above their weight, but more like a kid with chocolate all over its face pulling on Nvidia's pantleg to get attention.

People that care about specs aren't really focused on consoles; people that are focused on consoles don't tend to know what specs even are, or how it compares to dedicated PC hardware. Mostly people care about the exclusives. So it's always lots of bluster leading up to a new gen launch, then the peashooter hardware is exposed for what it is soon after.

I think you are wrong this time, since the next-gen console hardware is far and away better than what came before and easily the equal of an upper-mid-range PC at release.
 
Yeah, I noticed this too, even on an SSD. I tried to open the Insurgency Sandstorm map editor, which is UE4-based. It took very long to load and then froze. They claim 16GB is okay as a minimum requirement, but I think by "minimum" they mean it will more than likely crash due to lack of RAM. :(

The editor usually needs to spend some time compiling shaders on first launch. Opening the Infiltrator demo on my machine takes like a solid fucking hour the first time. Open Task Manager and you'll probably see a bunch of ShaderCompileWorker processes.

After that it's pretty fast.
 
I do not understand; Red Dead Redemption 2 looked far and away better than this demo. I just do not get it.

I haven't played it, but that demo looked a lot better than anything I've seen in RDR2 videos. Still, the idea is always judged from a development perspective: will this cut down development time and man-hours? Then good. A lot of times you'll see indie games that look pretty good but whose environments feel a bit stale. If this lets them get closer to RDR2 quality with limited manpower and time, then that is good.
 
The geometry generation stuff is one thing; not using RT hardware is another.

I'm skeptical about the lack of hardware leverage here; this sounds quite a bit like a copout for slow AMD RT tech. Granted, that's what most of us have been expecting out of a console APU, but still, I was kind of hoping developers would figure out how to make it work.

And obviously, we don't know what the limits of Epic's technology are. And we won't, really, until games start coming out with it, because up until games are released and analyzed, Epic is in marketing mode!
 
The geometry generation stuff is one thing; not using RT hardware is another.

I'm skeptical about the lack of hardware leverage here; this sounds quite a bit like a copout for slow AMD RT tech. Granted, that's what most of us have been expecting out of a console APU, but still, I was kind of hoping developers would figure out how to make it work.

And obviously, we don't know what the limits of Epic's technology are. And we won't, really, until games start coming out with it, because up until games are released and analyzed, Epic is in marketing mode!
You'd think they'd seize the opportunity to come up with their own first-party title to showcase how it's done. Like Crytek did with Crysis to show everyone what CryEngine was capable of - that's three Cry's in one sentence.
 
Like Crytek did with Crysis to show everyone what CryEngine was capable of - that's three Cry's in one sentence.

Half-cry 3 confirmed!

Though, didn’t they try to do that with UE4 and Unreal Tournament? I don’t recall what happened with that. I played it a few times, but I got tired of needing to run another launcher app just to play CTF with bora occasionally - I didn’t know anyone who wanted to actually spend a lot of time playing it. The last one I spent any real time with was UT2004, I think.
 