Unreal Engine 5 Revealed! Running in real time on PlayStation 5

I think you are wrong this time, since the next-gen console hardware is far and away better than what came before, and easily equal to what an upper-mid-range PC will be at release.
Correct. People drastically underestimate the optimisation that happens on consoles versus PC. Having a 2080 Ti does absolutely nothing if a game isn't built around it. Yes, you'll get higher frame rates, but it doesn't change IQ.

How many console ports have absolutely terrible performance? Tons, and there's a reason for it. PC titles are built around the lowest common denominator; this isn't the case for consoles. Underestimating this woefully disrespects the engineers who create this hardware and the software developers who build the games.
 
I don't think this generation will have much in the way of graphics that a PC couldn't do on day 1. What is fascinating is all the discussion from Sweeney about the I/O and SSD pipelines. Linus has already apologized; for at least a while, the PS5 (and maybe the Xbox Series X) will be capable of games that are impossible on PC.
 
I think you are wrong this time, since the next-gen console hardware is far and away better than what came before, and easily equal to what an upper-mid-range PC will be at release.

TU106 has 7.5 TFLOPS and 448 GB/s of bandwidth. Looking at GA100, there's a good chance GA106 will be north of 10 TFLOPS for $350 at ~250 mm². That should give the PS5 a run for its money.
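For reference, the TFLOPS figures thrown around here come from simple arithmetic: shader cores × 2 FP32 ops per clock (fused multiply-add) × clock speed. A quick sketch; the core counts and clocks below are approximate public figures, not official measurements:

```python
# Theoretical FP32 throughput: shader cores x 2 ops per clock (FMA) x clock.
def tflops(shader_cores: int, boost_clock_ghz: float) -> float:
    """FP32 TFLOPS = cores * 2 (fused multiply-add) * clock in GHz / 1000."""
    return shader_cores * 2 * boost_clock_ghz / 1000

# TU106 (RTX 2070): 2304 shaders at ~1.62 GHz boost
print(round(tflops(2304, 1.62), 1))   # ~7.5
# PS5 GPU: 36 CUs x 64 lanes = 2304 shaders at up to ~2.23 GHz
print(round(tflops(2304, 2.23), 1))   # ~10.3
```

Same shader count, different clocks, which is why the two land within a few TFLOPS of each other despite very different packaging.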
 
I don't think this generation will have much in the way of graphics that a PC couldn't do on day 1. What is fascinating is all the discussion from Sweeney about the I/O and SSD pipelines. Linus has already apologized; for at least a while, the PS5 (and maybe the Xbox Series X) will be capable of games that are impossible on PC.

Meh. Linus didn't have much to apologize for, given Sweeney's past. He wasn't the only one to jump to the conclusion that it was hype nonsense. It all sounds good, but I'll wait to see it in action.
 
TU106 has 7.5 TFLOPS and 448 GB/s of bandwidth. Looking at GA100, there's a good chance GA106 will be north of 10 TFLOPS for $350 at ~250 mm². That should give the PS5 a run for its money.
It's not going to be optimized for, so no. That's the difference. If you think developers are going to optimize for a card that maybe 1% of gamers have, it's just not going to happen. Nvidia can sponsor some of them, probably a lot, but not all. Then you have exclusives. I'm primarily a PC gamer, but I have a console too for a reason. There are titles that won't be released on PC, and some that, even when ported, aren't all that great.
 
It's not going to be optimized for, so no. That's the difference. If you think developers are going to optimize for a card that maybe 1% of gamers have, it's just not going to happen. Nvidia can sponsor some of them, probably a lot, but not all. Then you have exclusives. I'm primarily a PC gamer, but I have a console too for a reason. There are titles that won't be released on PC, and some that, even when ported, aren't all that great.

Name one time consoles have had better I.Q. and resolution than the PC.
(This launch will be no different from previous launches.)
 
I actually think the PS4 had some games that rivaled PCs; the developers were able to do a lot even with weaker hardware.

Like Until Dawn, Horizon, and Detroit (I know Detroit is on PC now and looks better, but that is several years later and with much more expensive hardware).

View attachment 252012

View attachment 252013

View attachment 252014
 
No one ever mentioned "apples to apples", but thanks for the fallacy.
This is relevant, because EVERY single console launch has had the same BS.
This is no different.
Good luck vs. Ampere, and thanks for holding gaming back 6-7 years due to a locked platform...history repeating itself...once again.
I think consoles become more of a lowest common denominator for game developers. Basically, a PC a year or so after the consoles come out really needs to be close to the console in performance to still run newer games well. Consoles sure aren't the top end of gaming, ever.

That said, I don't think they hold back gaming (at least not much more than would happen otherwise). Without consoles and their game sales, it would take much longer to recoup costs, and game developers would still have to make sure things ran OK on older hardware, which, as I mentioned, consoles give you a good baseline for. Sure, if everyone could always afford to upgrade to the latest GPU and devs only had to program for one or a few high-end cards, we could possibly see amazing advances. In reality, people are still rocking GTX 770s and R7 270s on low quality. Devs are always going to have to program for older hardware, regardless of consoles. If consoles didn't exist, people would buy $400 PCs to run games instead, which wouldn't run any better than an optimized console.

Guess I'm just trying to say we can't expect games to take advantage of new hardware and not old hardware, although I do get annoyed at some of the games that are converted from console to PC horribly. Maybe with newer consoles having so many cores, FreeSync, and a bunch of other PC features, the games won't seem like they were just badly copied from a console.
 
The geometry generation stuff is one thing; not using RT hardware is another.

I'm skeptical about the lack of hardware leverage here; this sounds quite a bit like a copout for slow AMD RT tech. Granted, that's what most of us have been expecting out of a console APU, but still, I was kind of hoping developers would figure out how to make it work.

And obviously, we don't know what the limits of Epic's technology are. And we won't, really, until games start coming out with it, because up until games are released and analyzed, Epic is in marketing mode!
Seeing as Unreal started going this direction years ago, I don't think they had a clue what AMD RT performance would be at the time... possibly no clue what NV RT performance would be yet, either. I think it's a copout for not wanting to re-implement something they spent a lot of time on, but seeing as it's going to be widely supported even on older hardware, I can understand not wanting to write an entire new rendering path at this point (or at least for now), especially if the results are close enough not to matter (and performance is as good or better). With how long this has been in development, I find it hard to support your correlation between AMD's future card/platform and a feature that's been in the works for over six years.
 
Seeing as Unreal started going this direction years ago, I don't think they had a clue what AMD RT performance would be at the time... possibly no clue what NV RT performance would be yet, either. I think it's a copout for not wanting to re-implement something they spent a lot of time on, but seeing as it's going to be widely supported even on older hardware, I can understand not wanting to write an entire new rendering path at this point (or at least for now), especially if the results are close enough not to matter (and performance is as good or better). With how long this has been in development, I find it hard to support your correlation between AMD's future card/platform and a feature that's been in the works for over six years.

That's right.

Also, Unreal Engine supports DXR as well, for developers who want to take advantage of RT hardware.
 
I think consoles become more of a lowest common denominator for game developers. Basically, a PC a year or so after the consoles come out really needs to be close to the console in performance to still run newer games well. Consoles sure aren't the top end of gaming, ever.

That said, I don't think they hold back gaming (at least not much more than would happen otherwise). Without consoles and their game sales, it would take much longer to recoup costs, and game developers would still have to make sure things ran OK on older hardware, which, as I mentioned, consoles give you a good baseline for. Sure, if everyone could always afford to upgrade to the latest GPU and devs only had to program for one or a few high-end cards, we could possibly see amazing advances. In reality, people are still rocking GTX 770s and R7 270s on low quality. Devs are always going to have to program for older hardware, regardless of consoles. If consoles didn't exist, people would buy $400 PCs to run games instead, which wouldn't run any better than an optimized console.

Guess I'm just trying to say we can't expect games to take advantage of new hardware and not old hardware, although I do get annoyed at some of the games that are converted from console to PC horribly. Maybe with newer consoles having so many cores, FreeSync, and a bunch of other PC features, the games won't seem like they were just badly copied from a console.

Hell, I'd say a PS4 or Xbox One S is still better than a $400 PC, since they at least have dedicated GPUs.

I'm not actually convinced consoles will be the lowest common denominator this time around, not for a while at least. The approach to storage is so fundamentally different from what you see in much of the PC market that games which truly take advantage of the PS5 or XSX won't run properly on most PCs, even if they have fast CPUs and GPUs. Can you imagine a PC port of a PS5 game in 2021 with minimum requirements that list "NVMe solid-state drive with at least 3 GB/s transfer speed?"

I'm sure enthusiasts will adapt quickly if they haven't already, but it's going to be a long while before somebody's $600 Best Buy clearance special can run a PS5 or XSX game properly.
 
Trying to tie an exclusive into "better I.Q. and frame rate" is a ridiculous fallacy ;)

The numbers don't support your notion though:
https://www.polygon.com/2018/4/12/17227946/god-of-war-ps4-pro-60fps-4k-1080p

Strike 1!
I think it would have made more sense for him to say consoles generally outperform comparable hardware in a PC due to the optimizations that can take place on consoles. Obviously, this isn't what he said, but reading between the lines of the conversation, I think that's what they were trying (and failing) to say. Using an example of a console exclusive is not useful in any way. Heck, even some console games ported to PC end up like crap. This is because it's typically easier to target one hardware config than to support many different configs. General-purpose code typically ends up less optimized, but with a 4-5-year-old console vs. a current CPU/GPU, that's easy to overcome. Typically, the first games out on a console aren't as well optimised as games that come later, so even from the get-go you typically won't see as good quality on consoles (although they can come close for a short period of time).
 
Hell, I'd say a PS4 or Xbox One S is still better than a $400 PC, since they at least have dedicated GPUs.

I'm not actually convinced consoles will be the lowest common denominator this time around, not for a while at least. The approach to storage is so fundamentally different from what you see in much of the PC market that games which truly take advantage of the PS5 or XSX won't run properly on most PCs, even if they have fast CPUs and GPUs. Can you imagine a PC port of a PS5 game in 2021 with minimum requirements that list "NVMe solid-state drive with at least 3 GB/s transfer speed?"

I'm sure enthusiasts will adapt quickly if they haven't already, but it's going to be a long while before somebody's $600 Best Buy clearance special can run a PS5 or XSX game properly.

If that NVMe were such "vital" tech for performance...a laptop with an RTX 2070 should not have been getting 40 FPS...those 40 FPS on a laptop ruin the "special sauce" argument...
 
I actually think the PS4 had some games that rivaled PCs; the developers were able to do a lot even with weaker hardware.

Like Until Dawn, Horizon, and Detroit (I know Detroit is on PC now and looks better, but that is several years later and with much more expensive hardware).

View attachment 252012

View attachment 252013

View attachment 252014

Yet, when you observe/analyze videos of games that are cross-platform (screenshots, in 2020, please?), at no point in history have the consoles offered the same performance/I.Q.
There are always cheats:

- Lower texture quality
- Lower Framerate
- Dynamic resolution
- LOD levels (including pop-in objects)

The list is long.
 
Yet, when you observe/analyze videos of games that are cross-platform (screenshots, in 2020, please?), at no point in history have the consoles offered the same performance/I.Q.
There are always cheats:

- Lower texture quality
- Lower Framerate
- Dynamic resolution
- LOD levels (including pop-in objects)

The list is long.
Well, now you're being as disingenuous as the person you're arguing with. Obviously, games from a 6-7-year-old console won't be as good as a current PC. Screenshots from 2013/2014 make more sense than screenshots from 2020, so asking for 2020 screenshots to prove the PS4 didn't have as good quality as a PC when it came out is just as silly as thinking the console can have better quality for any length of time (at least going by past history) than the PC platform, which gets far more upgrades over time.
 
I actually think the PS4 had some games that rivaled PCs; the developers were able to do a lot even with weaker hardware.

Like Until Dawn, Horizon, and Detroit (I know Detroit is on PC now and looks better, but that is several years later and with much more expensive hardware).

View attachment 252012

View attachment 252013

View attachment 252014

You're looking at screenshots from a cutscene zoomed in on a single character. It's a trick marketing has been doing for decades to make the graphics seem better.
  • It's a screenshot so you don't see it's running at a low framerate.
  • It's a super tiny field of view so only the object you're looking at needs to be rendered. Instead of rendering the details of an entire interactive environment they pump all of that detail into a single character.
  • It's a cutscene so they can cut the detail you would see at other angles and bake effects to look good for a single angle. Things you can't do if a player controlled the camera and was able to view from other angles.
 
If that NVMe were such "vital" tech for performance...a laptop with an RTX 2070 should not have been getting 40 FPS...those 40 FPS on a laptop ruin the "special sauce" argument...

It's not about frame rates, it's about the ability to process large volumes of data without choking.

To start, that laptop likely has an NVMe SSD in it. Most desktops in use today, meanwhile, are still using spinning hard drives or SATA SSDs. How is a 7,200RPM hard disk going to handle movie-quality assets in an expansive, seamless game world? Well... it won't, really. Unless that PC has gobs of RAM to cache large swaths of the game world, it's going to suffer.

I don't expect some of the first games to absolutely depend on a fast SSD to play well on PCs, and there will be titles that underuse the consoles to reach a wider audience. But with UE5, and as developers get used to what the PS5 and XSX can do, there may be a tipping point where certain games either aren't feasible for PCs or will carry steep storage requirements to work.
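To put rough numbers on the storage argument above: at 60 fps a game gets about 16.7 ms per frame, and whatever the drive can deliver in that window is the per-frame streaming budget. A quick sketch; the throughput figures are ballpark sequential-read numbers I'm assuming for illustration, not benchmarks:

```python
# Rough per-frame asset-streaming budget at 60 fps for common storage tiers.
# Throughput values are ballpark sequential-read figures, not measurements.
storage_mb_s = {
    "7200rpm HDD": 150,
    "SATA SSD": 550,
    "NVMe SSD (PCIe 3)": 3500,
    "PS5 SSD (raw)": 5500,
}

frame_time_s = 1 / 60  # ~16.7 ms per frame
for name, mb_s in storage_mb_s.items():
    per_frame_mb = mb_s * frame_time_s
    print(f"{name}: ~{per_frame_mb:.1f} MB of fresh assets per frame")
```

A hard drive delivers a couple of megabytes per frame at best (before seek penalties, which make it far worse for scattered reads); a PS5-class SSD delivers closer to ninety. That gap is the whole argument.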
 
You're looking at screenshots from a cutscene zoomed in on a single character. It's a trick marketing has been doing for decades to make the graphics seem better.
  • It's a screenshot so you don't see it's running at a low framerate.
  • It's a super tiny field of view so only the object you're looking at needs to be rendered. Instead of rendering the details of an entire interactive environment they pump all of that detail into a single character.
  • It's a cutscene so they can cut the detail you would see at other angles and bake effects to look good for a single angle. Things you can't do if a player controlled the camera and was able to view from other angles.

I started a console screenshot thread:
https://hardforum.com/threads/the-screenshot-thread.1963797/

No photo mode in these games:
Detroit: Become Human
Ratchet and Clank
The Evil Within 2 (all those screenshots were taken using the first-person mode)
Here They Lie
The Shadow of the Colossus
Infamous Second Son
Flower
Nothing Ever Remains Obscure


Games with photo mode
Uncharted Series
God of War
The Last of Us
Mad Max
Horizon: Zero Dawn
The Order 1886
No Man's Sky

I'm not sure about the screenshots I didn't take.

The point is, console games can look amazing when designed right.
PC is still superior, but I'm excited to see what can be done with the next generation, both PC and console.

All of my screenshots are from a regular PS4.
The PS4 Pro will obviously look a little better, and potentially run at a higher res.
 
It's not about frame rates, it's about the ability to process large volumes of data without choking.

To start, that laptop likely has an NVMe SSD in it. Most desktops in use today, meanwhile, are still using spinning hard drives or SATA SSDs. How is a 7,200RPM hard disk going to handle movie-quality assets in an expansive, seamless game world? Well... it won't, really. Unless that PC has gobs of RAM to cache large swaths of the game world, it's going to suffer.

I don't expect some of the first games to absolutely depend on a fast SSD to play well on PCs, and there will be titles that underuse the consoles to reach a wider audience. But with UE5, and as developers get used to what the PS5 and XSX can do, there may be a tipping point where certain games either aren't feasible for PCs or will carry steep storage requirements to work.

The only data set we have so far is for the Unreal Engine 5 demo...and that doesn't give any support to the PR fluff...I doubt Sweeney was very happy about one of the Chinese engineers "leaking" that his laptop got ~40 FPS....that kinda revealed the marketing hype for what it is...marketing hype.

Anyone remember this?
https://www.eurogamer.net/articles/...t-was-actually-real-in-the-e3-2005-ps3-reveal

That was SONY and NVIDIA...doing the same PR FUD as we have seen in EVERY SINGLE console launch since.

And people still fall for it...lulz
 
The only data set we have so far is for the Unreal Engine 5 demo...and that doesn't give any support to the PR fluff...I doubt Sweeney was very happy about one of the Chinese engineers "leaking" that his laptop got ~40 FPS....that kinda revealed the marketing hype for what it is...marketing hype.

Anyone remember this?
https://www.eurogamer.net/articles/...t-was-actually-real-in-the-e3-2005-ps3-reveal

That was SONY and NVIDIA...doing the same PR FUD as we have seen in EVERY SINGLE console launch since.

And people still fall for it...lulz

It was still running on a system that likely had an NVMe SSD.

You can get back to me when you can show that same demo running just as smoothly, without any loading hitches, on a system with a spinning hard drive or SATA SSD. I know early console tech demos don't tell the whole story, but Epic hasn't been shy about stating that the demo depends on very fast storage. And it's only going to get tougher when full games, not just tech demos, start depending on that kind of storage.

Listen: it's okay if gaming PCs aren't the best at absolutely everything. Your self-worth will not take a hit if a highly optimized PS5 offers better performance (at least, for the money) in key areas for a couple of years. I know the PS5 won't have raw CPU and GPU power going for it, and there will clearly be areas where PCs will be better... but I don't need to have the ultimate gaming machine to have a good time. I'm just intrigued and excited that the usual presumptions about console performance may be up-ended for a while.
 
It was still running on a system that likely had an NVMe SSD.

You can get back to me when you can show that same demo running just as smoothly, without any loading hitches, on a system with a spinning hard drive or SATA SSD. I know early console tech demos don't tell the whole story, but Epic hasn't been shy about stating that the demo depends on very fast storage. And it's only going to get tougher when full games, not just tech demos, start depending on that kind of storage.

Listen: it's okay if gaming PCs aren't the best at absolutely everything. Your self-worth will not take a hit if a highly optimized PS5 offers better performance (at least, for the money) in key areas for a couple of years. I know the PS5 won't have raw CPU and GPU power going for it, and there will clearly be areas where PCs will be better... but I don't need to have the ultimate gaming machine to have a good time. I'm just intrigued and excited that the usual presumptions about console performance may be up-ended for a while.
The idea that storage speed is going to make up for peashooter CPU/GPU power is pure fantasy. Storage speed isn't going to increase FPS. It wasn't a bottleneck before, and developers are notorious for lowest-common-denominator development. I don't foresee a rush, or even interest, in developing console games simply for the sake of giving the SSD lots to do.

The most powerful a new console ever is, is before it's launched.
 
The idea that storage speed is going to make up for peashooter CPU/GPU power is pure fantasy.

The most powerful a new console ever is, is before it launches.

HDD speeds are the biggest bottleneck in the whole system, so yes, faster data speeds can in fact increase performance, as you then don't lose cycles waiting for data. But you and Factum have no interest in the consoles, so why sit here and argue about something you don't care about?
 
It was still running on a system that likely had an NVMe SSD.

You can get back to me when you can show that same demo running just as smoothly, without any loading hitches, on a system with a spinning hard drive or SATA SSD. I know early console tech demos don't tell the whole story, but Epic hasn't been shy about stating that the demo depends on very fast storage. And it's only going to get tougher when full games, not just tech demos, start depending on that kind of storage.

Listen: it's okay if gaming PCs aren't the best at absolutely everything. Your self-worth will not take a hit if a highly optimized PS5 offers better performance (at least, for the money) in key areas for a couple of years. I know the PS5 won't have raw CPU and GPU power going for it, and there will clearly be areas where PCs will be better... but I don't need to have the ultimate gaming machine to have a good time. I'm just intrigued and excited that the usual presumptions about console performance may be up-ended for a while.

You seem to overestimate the performance benefits:


It's okay, you just got caught up in console favoritism and chose to ignore real-world data. It is okay...it has happened at every single console launch, so no wonder you fell into that hole.
Wake me up when the PC needs to do "console tricks" to keep up with the consoles ;)
 
HDD speeds are the biggest bottleneck in the whole system, so yes, faster data speeds can in fact increase performance, as you then don't lose cycles waiting for data. But you and Factum have no interest in the consoles, so why sit here and argue about something you don't care about?
My point was that storage speed was not an FPS bottleneck. No kidding load times sucked, and it will be great if those are cut down going forward.

"You & Factum"; lol, I haven't even read his posts. I just see the same wishful thinking before a console release, like what happens every time.

Again, developers aren't going to turn their development pipelines back-asswards and start designing games just for the sake of having huge data and texture sets so the SSD has lots to do. It's going to be business as usual, where they'll simply bump the graphical settings up one small notch, from PS4/X1 Medium-Low to PS5/X2 Medium, so it hits minimum framerates. Enjoy.
 
My point was that storage speed was not an FPS bottleneck. No kidding load times sucked, and it will be great if those are cut down going forward.

"You & Factum"; lol, I haven't even read his posts. I just see the same wishful thinking before a console release, like what happens every time.

Again, developers aren't going to turn their development pipelines back-asswards and start designing games just for the sake of having huge data and texture sets so the SSD has lots to do. It's going to be business as usual, where they'll simply bump the graphical settings up one small notch, from PS4/X1 Medium-Low to PS5/X2 Medium, so it hits minimum framerates. Enjoy.

If you have the bandwidth, you're going to use it, and simply put, it looks like the PS5 will have a superior setup for the NVMe-style drive. The whole point is the design team can just scan assets in and go, saving tons of man-hours, so yes, they will use it. It remains to be seen if this will cause issues on the PC side, but my guess is they can overcome it a bit by leveraging more power on the PC side. We have only a bit of info from a demo to go by, and some passing words, but so far it seems many developers were quite impressed, so I expect it will be used, especially if it saves lots of man-hours. But once again, I don't get why you want to argue about consoles when you have no intention of using one. Also, I hate to break it to you, but if video card prices stay high, it will drive console sales massively.
 
Trying to tie an exclusive into "better I.Q. and frame rate" is a ridiculous fallacy ;)

The numbers don't support your notion though:
https://www.polygon.com/2018/4/12/17227946/god-of-war-ps4-pro-60fps-4k-1080p

Strike 1!
Not really. Not being able to play a game at all is far worse than being locked at 30. That's the reality, like it or not.

To address your original question: first off, better I.Q. has ABSOLUTELY NOTHING to do with the video card. If the textures are 1080p, then that's what you're going to get, period. It's not guaranteed. It's entirely up to the developers if they want higher resolution on the texture front. That leaves particles and shadows, which, while important, aren't exactly some super-high-quality feature. You're talking as though the difference in I.Q. is like comparing an NES with an N64. That's not the case. Not at all.

As for framerate, I said that before. Of course you'll have a higher framerate. Is it needed? No. Will you notice it? Of course you will. Note that it doesn't necessarily add to realism; I've played many games on my PC that at 60 FPS look like the animations are too fast. It's great for FPS games, but that's not all I play.

In addition, optimizations in development have allowed relatively "weak" hardware to do quite a bit more than one can experience on the PC. The PS4 is five years old. As an experiment, take a five-year-old video card and run a relatively new console port. I can guarantee it will run faster on the console and look better.
 
I think it would have made more sense for him to say consoles generally outperform comparable hardware in a PC due to optimizations that can take place in consoles. Obviously, this isn't what he said, but reading between the lines of the conversation I think that's what they were trying (and failed) to say. Using an example of a console exclusive is not useful in any way. Heck, even some console games ported to PC sometimes end up like crap. This is because it's typically easier to target one hardware config than support many different configs. General purpose typically ends up less optimized, but with 4-5 year old console vs. current CPU/GPU it's easy to overcome. Typically first games out on console aren't as well optimised as games that come later, so even from the get to you typically won't see as good of quality on consoles (although they can come close for a short period of time).
I was actually joking but completely serious simultaneously. Not being able to play the game at all is an issue. In addition, you can't compare a one-year-old video card with a five-year-old console. In my previous post, I said to compare a five-year-old video card with the PS4. It won't do much better, if at all. The overhead on the PC is pretty large.
 
I'm just looking forward to SNES/Genesis cartridge-style load times again. It took us a while, but it appears we're on the cusp of catching up.
 
I actually think the PS4 had some games that rivaled PCs; the developers were able to do a lot even with weaker hardware.

Like Until Dawn, Horizon, and Detroit (I know Detroit is on PC now and looks better, but that is several years later and with much more expensive hardware).

View attachment 252012

View attachment 252013

View attachment 252014

All those titles do things to make it work: checkerboard rendering, limited FOV, etc. I have always noticed it when I'm playing on a console, particularly the restrictive FOV in 90% of all titles, followed by the lower FPS.

Every time it's the same arguments from both sides. The reality is consoles will never compete with an HEDT; they can't, and frankly don't have to. They only have to deliver a good-enough experience to the masses for +/- $500.

Optimization of games for consoles over PC is becoming a thing of the past, especially with Microsoft launching Xbox titles on PC as well, and with consoles now largely running the same architecture as PCs (no more Cell processors or proprietary hardware; it's all generic video cards and CPUs). Yes, there are still some bad console ports, but honestly, most of them these days aren't great games anyway.
 
The most powerful a new console ever is, is before it's launched.

This...why do people insist on throwing out real-world performance data (there is no "magical sauce")?
Never have the consoles had better I.Q.+performance against the PC....but it gets claimed EVERY single launch...it is like people choose to ignore the past...and hope that everyone else does too.
Not really. Not being able to play a game at all is far worse than being locked at 30. That's the reality, like it or not.

To address your original question: first off, better I.Q. has ABSOLUTELY NOTHING to do with the video card. If the textures are 1080p, then that's what you're going to get, period. It's not guaranteed. It's entirely up to the developers if they want higher resolution on the texture front. That leaves particles and shadows, which, while important, aren't exactly some super-high-quality feature. You're talking as though the difference in I.Q. is like comparing an NES with an N64. That's not the case. Not at all.

As for framerate, I said that before. Of course you'll have a higher framerate. Is it needed? No. Will you notice it? Of course you will. Note that it doesn't necessarily add to realism; I've played many games on my PC that at 60 FPS look like the animations are too fast. It's great for FPS games, but that's not all I play.

In addition, optimizations in development have allowed relatively "weak" hardware to do quite a bit more than one can experience on the PC. The PS4 is five years old. As an experiment, take a five-year-old video card and run a relatively new console port. I can guarantee it will run faster on the console and look better.

I.Q.+performance are indeed linked to the performance of the GPU.

If not, there would be no need for "settings".
If not, there would be no need for different SKUs.
If not, there would be no need for developers to downgrade game visuals due to consoles.
(Yes, devs do state this in public, like it or not.)
Textures do impact GPU RAM load, btw.

You are always shifting the bottleneck around.

And again, no console has offered better I.Q+performance at launch against the PC ever...and this launch is no different.
 
The idea that storage speed is going to make up for peashooter CPU/GPU power is pure fantasy. Storage speed isn't going to increase FPS. It wasn't a bottleneck before, and developers are notorious for lowest-common-denominator development. I don't foresee a rush, or even interest, in developing console games simply for the sake of giving the SSD lots to do.

The most powerful a new console ever is, is before it's launched.

Never said it did, but it does open up possibilities for visuals and game design that weren't there before. When you aren't significantly constrained in terms of the complexity and scale of the geometry you can load, that lets you do a whole lot more. That UE5 demo is only a hint of what's possible. You can have truly complex open worlds, for example, instead of the relatively barren ones you tend to see today.

No good developer will create titles purely to give an SSD "lots to do." But a good developer does take advantage of new hardware features to provide experiences and tell stories that weren't viable before. I'm sure there will be developers who'll just produce cookie-cutter FPS games and action RPGs, but there's a good chance you'll see some games that use the SSD in next-gen consoles to do fresh things... I wish you'd open your mind and realize that.
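To make the streaming idea above concrete, here is a minimal, purely illustrative sketch of the pattern being described: instead of pre-loading a whole level, a background I/O thread services asset requests the moment gameplay asks for them, which is only viable when storage is fast enough to answer mid-play. All names here (`AssetStreamer`, `request`, `get`) are invented for this sketch and don't correspond to any real engine API.

```python
import queue
import threading

class AssetStreamer:
    """Toy on-demand asset loader: requests go to a background I/O thread."""

    def __init__(self, read_asset):
        self._read_asset = read_asset       # callable: asset_id -> bytes
        self._requests = queue.Queue()
        self._loaded = {}
        self._lock = threading.Lock()
        self._worker = threading.Thread(target=self._run, daemon=True)
        self._worker.start()

    def request(self, asset_id):
        """Queue an asset for loading; with fast storage this can happen mid-frame."""
        self._requests.put(asset_id)

    def _run(self):
        # Background thread: pull requests and read them into the resident set.
        while True:
            asset_id = self._requests.get()
            if asset_id is None:            # shutdown sentinel
                break
            data = self._read_asset(asset_id)
            with self._lock:
                self._loaded[asset_id] = data

    def get(self, asset_id):
        """Return asset bytes if resident, else None (still streaming in)."""
        with self._lock:
            return self._loaded.get(asset_id)

    def stop(self):
        self._requests.put(None)
        self._worker.join()
```

The point of the pattern is that the renderer never blocks on I/O: it asks for what the camera will need next, draws whatever is already resident, and swaps in higher-detail data as it arrives. Slow storage forces the working set to be loaded far in advance; fast storage shrinks that lead time toward a few frames.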
 
This... why do people insist on throwing out real-world performance data? (There is no "magic sauce".)
Never have consoles had better IQ + performance against the PC... but it gets claimed EVERY single launch. It is like people choose to ignore the past... and hope that everyone else does too.


IQ + performance are indeed linked to the performance of the GPU.

If not, there would be no need for "settings"
If not, there would be no need for different SKU's.
If not, there would be no need for developers to downgrade game visuals due to consoles.
(Yes, devs do state this in public, like it or not)
Textures do impact GPU RAM load, btw.

You are always shifting the bottleneck around.

And again, no console has ever offered better IQ + performance at launch against the PC... and this launch is no different.
No, it's not. You actually think that if you buy a new video card of any caliber, it immediately improves IQ on its own? No need for the developers to do anything? It's just magic? It does not work that way.

I really hope you don't believe that. Now, developers will offer options that improve IQ; they do that, not the video card. The video card just needs to be powerful enough to do it. That's it. I must say that someone telling me how minimum specs (or maximum specs, for that matter) are determined within a game is quite amazing, considering I did that and more for 5 years. But please, go ahead.

I'm not shifting anything. I'm speaking about the reality. Developers determine IQ more than video cards do, hell, even FPS. The PC doesn't have the same titles as consoles; that's part of the reason people buy them, including myself. You really can't compare a 5-year-old console with a video card that's a year old or less. It's mental to do so. These are realities, like it or not. I'm not playing a game here, really.
 
No, it's not. You actually think that if you buy a new video card of any caliber, it immediately improves IQ on its own? No need for the developers to do anything? It's just magic? It does not work that way.
I really hope you don't believe that. Now, developers will offer options that improve IQ; they do that, not the video card. The video card just needs to be powerful enough to do it. That's it.

I'm not shifting anything. I'm speaking about the reality. Developers determine IQ more than video cards do. The PC doesn't have the same titles as consoles; that's part of the reason people buy them, including myself. You really can't compare a 5-year-old console with a video card that's a year old or less. It's mental to do so. These are realities, like it or not. I'm not playing a game here, really.

First, the xBone and PS4 were already behind their comparable PCs when they came out; it's the nature of console development cycles, with parts being lined up many months to years in advance, and the need to hit that sweet price range for a console to sell, meaning you never get the high-end specs of the day anyway.

Second, being already behind at launch and unable to upgrade critical components, consoles fall further behind as the generation goes on. It is a testament to developers that they push every last ounce of power out of every generation by the end of the cycle.

Third, most titles, and I would bet well into the 90th percentile now, are launched on every platform. In fact, I can only think of a handful of exclusives in the last few years, all on the PS: Horizon, TLOU, Spider-Man, Death Stranding. Now Death Stranding and Horizon are coming to PC, and I think the age of exclusivity is dying (particularly with everyone on x86 systems).

Fourth, yes, a developer sets the IQ options, but the video card dictates to the end user what they can use. A console is largely locked; PCs, for the most part, get all the goodies to play with.
 
You seem to overestimate the performance benefits:


It's okay, you just got caught up in console favoritism and chose to ignore real-world data. It's okay... it has happened at every single console launch, so no wonder you fell into that hole.
Wake me up when the PC needs to do "console tricks" to keep up with the consoles ;)


You do realize that video is irrelevant to the discussion at hand, right? I love real-world data, but it has to be applicable... and this isn't.

LTT is running existing games that are tuned to be playable on spinning hard drives. Gee, I wonder why the SSD's performance is frequently wasted? It's like arguing that a race car is inferior to a sports car because you can't test its agility on the highway. We're talking about games optimized for high-speed SSDs, that know they can fetch massive amounts of data at a moment's notice.

So yes, a by-the-numbers FPS title with the exact same level structure and detail levels won't gain as much as you'd think. But that's not what we're talking about, and it's disheartening that you can't imagine anything beyond a faster-loading Call of Duty.
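One way to see why games tuned for spinning drives can't exploit an SSD is a back-of-envelope budget: how much unique data can storage deliver per frame at 60 fps? The bandwidth figures below are rough, illustrative numbers chosen for this sketch, not measurements.

```python
def bytes_per_frame(bandwidth_mb_s, fps=60):
    """Sustained read budget per frame, in megabytes, at the given bandwidth."""
    return bandwidth_mb_s / fps

# Roughly HDD-class vs PS5-class NVMe sequential throughput (illustrative):
hdd_budget = bytes_per_frame(100)     # ~100 MB/s spinning disk
nvme_budget = bytes_per_frame(5500)   # ~5.5 GB/s NVMe

print(f"HDD:  {hdd_budget:.2f} MB/frame")   # ~1.67 MB/frame
print(f"NVMe: {nvme_budget:.2f} MB/frame")  # ~91.67 MB/frame
```

A game authored around the ~1.7 MB/frame budget simply never issues enough I/O to use the other ~90 MB/frame, which is why benchmarking HDD-era titles on an SSD mostly measures load screens, not what SSD-native level design could do.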
 
You do realize that video is irrelevant to the discussion at hand, right? I love real-world data, but it has to be applicable... and this isn't.

LTT is running existing games that are tuned to be playable on spinning hard drives. Gee, I wonder why the SSD's performance is frequently wasted? It's like arguing that a race car is inferior to a sports car because you can't test its agility on the highway. We're talking about games optimized for high-speed SSDs, that know they can fetch massive amounts of data at a moment's notice.

So yes, a by-the-numbers FPS title with the exact same level structure and detail levels won't gain as much as you'd think. But that's not what we're talking about, and it's disheartening that you can't imagine anything beyond a faster-loading Call of Duty.

I would argue imagining has nothing to do with it. I've been imagining what computers could do since the 80s, but the reality is it doesn't matter what I imagine they could do; it only matters what they do. In that vein, SSDs are only a tool for faster loading at this point. I don't know a single game that uses them beyond that, and until there is one, I don't buy the imagined.
 
First, the xBone and PS4 were already behind their comparable PCs when they came out; it's the nature of console development cycles, with parts being lined up many months to years in advance, and the need to hit that sweet price range for a console to sell, meaning you never get the high-end specs of the day anyway.

Second, being already behind at launch and unable to upgrade critical components, consoles fall further behind as the generation goes on. It is a testament to developers that they push every last ounce of power out of every generation by the end of the cycle.

Third, most titles, and I would bet well into the 90th percentile now, are launched on every platform. In fact, I can only think of a handful of exclusives in the last few years, all on the PS: Horizon, TLOU, Spider-Man, Death Stranding. Now Death Stranding and Horizon are coming to PC, and I think the age of exclusivity is dying (particularly with everyone on x86 systems).

Fourth, yes, a developer sets the IQ options, but the video card dictates to the end user what they can use. A console is largely locked; PCs, for the most part, get all the goodies to play with.
Behind? Not really. Those are 8-core machines with 8GB of RAM, neither of which was common, and they use unified GDDR for both system and video memory, which no PC had, or even has today.

Titles are launched on every platform? Nope. Try again.

Not being upgradable is a double-edged sword, which is the point, in case you missed it.

The video card does not dictate in any way, unless you are talking about video-card-specific APIs and features, which en masse are hardly used. Compliance with the API for the OS dictates far more. You need the API and video card compliance to work together for that to happen. A video card alone just isn't doing it. You need both.
 
I would argue imagining has nothing to do with it. I've been imagining what computers could do since the 80s, but the reality is it doesn't matter what I imagine they could do; it only matters what they do. In that vein, SSDs are only a tool for faster loading at this point. I don't know a single game that uses them beyond that, and until there is one, I don't buy the imagined.

Given that the UE5 demo offered a practical demonstration of what's possible by optimizing for SSDs, and that numerous games will likely make substantial use of that optimization, wouldn't you at least want to be open to what's coming? I'm not so naive as to think that every PS5 game will shatter expectations (many won't, I'm sure), but if you didn't see the UE5 demo and realize what doors it opened, you weren't paying attention.
 