PS5 - The Future of Gaming | The Power of V

What do you think of the console design?

  • Yay - Very futuristic — 20 votes (13.5%)
  • Nay - Looks like a router — 36 votes (24.3%)
  • Don't Care — 67 votes (45.3%)
  • I prefer the refrigerator box — 25 votes (16.9%)

  Total voters: 148
  Poll closed.
It is funny to see some hardcore PC gamers' insecurity on display here. They can't concede that a console might have a hardware advantage over a PC, even a partial one that might not last more than a couple of years, because that would mean admitting their PC isn't always the best and that a console might have more expansive and seamless game worlds for a while. Look, I'm sure your pricey gaming PC is awesome and can do things a PS5 or XSX can't, but you don't need it to be better at everything, all the time, to feel good about yourself. You're allowed to prefer one platform while recognizing that another has its perks.

A walled garden does have its advantages. Consoles can focus on very specific use cases, and the Sony SSD tech is a good example of this. That doesn't mean it's better than PC capabilities... it's just better at the thing it's focused on. Anyway, PCs are focused on general computing, and that's fine.

Regardless, I will probably buy an Xbox SeX because of the beefier graphics card. Not to say a PS5 might not end up in my future. Why not have the best of both worlds?
 
A walled garden does have its advantages. Consoles can focus on very specific use cases, and the Sony SSD tech is a good example of this. That doesn't mean it's better than PC capabilities... it's just better at the thing it's focused on. Anyway, PCs are focused on general computing, and that's fine.

Regardless, I will probably buy an Xbox SeX because of the beefier graphics card. Not to say a PS5 might not end up in my future. Why not have the best of both worlds?

Exactly. The PS5 is a tightly integrated, highly optimized set of hardware that will excel at games that can take advantage of extremely fast storage, whether it's adding more visual detail to a given scene (see UE5) or just creating uninterrupted open game worlds with more complexity than you'd normally expect at that scale. PCs are wonderful generalized devices, but that means no one thing lives up to its theoretical potential.

I'm leaning PS5 due to the games more than the hardware potential, but here's hoping Microsoft and other exclusive devs will make use of the XSX's beefier GPU.
 
A walled garden does have its advantages. Consoles can focus on very specific use cases, and the Sony SSD tech is a good example of this. That doesn't mean it's better than PC capabilities... it's just better at the thing it's focused on. Anyway, PCs are focused on general computing, and that's fine.

Regardless, I will probably buy an Xbox SeX because of the beefier graphics card. Not to say a PS5 might not end up in my future. Why not have the best of both worlds?

I think for its budget the PS5 has been designed and customized well, and first-party games should shine on it.

PCs may not be able to use the same custom I/O hardware, but they can perhaps compensate by going for larger system RAM and GPU VRAM.
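(A rough, purely illustrative sketch of what that compensation could look like: preload the assets a level is likely to need into spare RAM ahead of time, so the slower generic storage path hides behind a cache. The names here are made up for illustration; real engines use their own streaming systems, not a Python dict.)

Code:
# Hypothetical sketch: hide slower generic storage behind RAM by prefetching
# assets on background threads. AssetCache/prefetch are illustrative names only.
from concurrent.futures import ThreadPoolExecutor, Future

class AssetCache:
    def __init__(self, workers: int = 4):
        self._pool = ThreadPoolExecutor(max_workers=workers)
        self._cache: dict[str, Future] = {}

    def prefetch(self, paths):
        # Kick off background reads for everything the next area might need.
        for path in paths:
            if path not in self._cache:
                self._cache[path] = self._pool.submit(self._read, path)

    def get(self, path: str) -> bytes:
        # If prefetch ran early enough, this returns instantly from RAM.
        if path not in self._cache:
            self._cache[path] = self._pool.submit(self._read, path)
        return self._cache[path].result()

    @staticmethod
    def _read(path: str) -> bytes:
        with open(path, "rb") as f:
            return f.read()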
 
If I were to go back to console, I'd probably choose Xbox. From keeping an eye on them and from using friends' Xbox 360s/Xbox Ones and PlayStation 3s/PlayStation 4s, Microsoft (understandably) has the better software/OS, and thus UX, in my opinion. I've never been a fan of the PlayStation OS/UI.
 
The standard SSD in the PS5 isn't an ordinary NVMe drive. It's a fully custom solution using techniques that everyone knows about but just doesn't implement, due to the nature of the PC.
So you can't just upgrade the PS5 SSD to something like 2TB? I can see that being a huge problem for storage. Games are getting big.

Yes, those things are indeed coming to PC, but I don't understand how you think it's all BS or marketing. If they had not done all of the custom work, you'd have the Series X solution. It really is that simple, and the Series X solution is what PCs are already doing.
Do you see benchmarks? Do you see anything beyond their word on how well this will work? Also, this is Sony, and they've lied about console performance in the past. Remember Cell? Remember the Emotion Engine? Those things were super hyped up and weren't all that good: Cell was supposedly a supercomputer CPU, and the PS2 was supposed to push 50 million polygons per second. It's easy to make claims about console hardware when the console doesn't run Windows or Linux to test it. Remember what Gabe Newell said about the PS3? Show me benchmarks and then we'll talk.
 
A walled garden does have its advantages. Consoles can focus on very specific use cases, and the Sony SSD tech is a good example of this. That doesn't mean it's better than PC capabilities... it's just better at the thing it's focused on. Anyway, PCs are focused on general computing, and that's fine.
It doesn't work like that. A walled garden has no advantages for the consumer, just the developers. It's the same problem with Apple, who are doing the same thing, and we all agree it's shitty for consumers. Sony's SSD tech is probably a good idea, but the claims they make about its performance are bold. If they're using an M.2 SSD, then they're full of it. If they're using something soldered to the circuit board that doesn't allow replacement of the SSD, then I might believe them. But soldering your NAND chips to the circuit board of the PS5 is probably an even worse idea, since all flash memory wears out and needs to be replaced. Plus, games are getting bigger, and I can't imagine Sony making 2TB PS5s for cheap. Without an upgradable SSD, the PS5 is doomed to fail.

Regardless, I will probably buy an Xbox SeX because of the beefier graphics card. Not to say a PS5 might not end up in my future. Why not have the best of both worlds?
Wish I had endless amounts of money to buy anything and everything. Where's that second Trump check?
 
If I were to go back to console, I'd probably choose Xbox. From keeping an eye on them and from using friends' Xbox 360s/Xbox Ones and PlayStation 3s/PlayStation 4s, Microsoft (understandably) has the better software/OS, and thus UX, in my opinion. I've never been a fan of the PlayStation OS/UI.
I couldn’t care less about the UI. The PlayStation has always had better exclusives.
 
I couldn’t care less about the UI. The PlayStation has always had better exclusives.

Seeing as I'm not playing them now anyway without a PlayStation, the exclusives wouldn't matter to me. The UI/UX of the device I'd actually be using would.
 
Seeing as I'm not playing them now anyway without a PlayStation, the exclusives wouldn't matter to me. The UI/UX of the device I'd actually be using would.
I guess I just play games because I like playing games. The Nintendo Switch has the worst interface in the history of mankind, but it still has amazing games I play all the time.
 
I guess I just play games because I like playing games. The Nintendo Switch has the worst interface in the history of mankind, but it still has amazing games I play all the time.

Yeah, I'm not obsessive about playing every game ever released across all platforms or anything. I would just want whatever would give me the most enjoyable overall experience whenever I decided to play a game (or, nowadays, when also using it as a Roku equivalent, etc.).

That's why my SHIELD TV is my current 'console': it gives me all my Steam/PC games on my TV via GameStream, along with my entertainment apps via Android TV, and emulators for older games as well (PS1, for example, via ePSXe). Good UI (subjective) and all controllable with either a remote control, game controller, or mouse and keyboard.
 
lol, I'm not swallowing their BS. "A few tweaks here and there" is not at all what I get from the Road to PS5 brief and the game dev statements. I can see that for the Xbox SeX, namely their own software stack being the "tweaks".

It's not earth-shattering, no, and whether it will make a huge difference I have no idea. Maybe for first-party games.

But quite literally everything about the drive configuration was changed: custom controller, custom lane configuration, custom software stack. They effectively removed all the little bottlenecks across the board. That's why you can't just put a 970 EVO Plus in it and expect it to work right. Cerny said a replacement drive will need to be 7+ GB/s to match the 5.5 GB/s of their custom solution, because of the inherent overhead of the SSD manufacturers' own controllers.

It's OK if Sony has a cool SSD solution that PC doesn't have right now. Y'all are acting like the world is ending or something.
A few more channels? A 'custom' controller? A different software stack?

Those are tweaks. Earth-shattering would be using something Optane-class that still peaks at PCIe 4.0 bandwidth.

I'm not saying that it isn't useful stuff nor that it won't make a difference; I'm saying that Sony essentially jumped the normal tech cycle gun by about six months.
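(For what it's worth, the 5.5 vs. 7+ GB/s figures in the quoted post imply roughly how much the generic controller/software path is assumed to cost. A quick back-of-envelope using only the numbers quoted above, nothing more:)

Code:
# Back-of-envelope on the figures quoted above: 5.5 GB/s from the custom
# solution vs. a 7+ GB/s off-the-shelf drive said to be needed to match it.
ps5_effective = 5.5      # GB/s delivered by the custom solution, per the talk
stock_raw_needed = 7.0   # GB/s of raw drive throughput quoted as required
overhead = 1 - ps5_effective / stock_raw_needed
print(f"Implied loss through a generic controller/stack: {overhead:.0%}")  # ~21%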
 
Yeah, but the NVMe storage they're using is broken up into 5 sections (if I remember correctly, I think he said 5 channels) and is accessed in a RAID 0 or RAID 5 fashion to give them the higher bandwidth. That's why they were saying the drives are proprietary and there are currently no PC drives with that kind of bandwidth. Probably just five 256GB NVMe SSDs with a custom RAID controller to equal a 1TB RAID 5 array, basically (or something like that?).
I still don't understand why I should be impressed by that. It just doesn't matter at all.
When I went from spinning drives to my first SATA SSD and loading went from 60 seconds to 5 seconds for some games, I was impressed.
When I went from a SATA SSD to an NVMe SSD and loading time went from 5 to 3 seconds, I couldn't care less. I would still buy a SATA SSD today without reservation, because it just doesn't matter.
So why should I care that a PS5 will be able to take loading times from 3 seconds to 1.5 seconds? It's just not a feature that matters. And if this is the key selling point of the console, then something is rotten in the state of Denmark.
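(The diminishing returns described above fall straight out of load time = data read / throughput, at least when storage is the only bottleneck; in practice CPU-side decompression and setup usually dominate well before that. A toy calculation with a made-up asset size:)

Code:
# Toy numbers: how long reading a fixed amount of level data takes at various
# throughputs, ignoring CPU-side decompression and setup (which usually dominate
# long before storage does). The 3.0 GB figure is made up for illustration.
level_data_gb = 3.0
for name, gb_per_s in [("HDD", 0.1), ("SATA SSD", 0.5), ("NVMe SSD", 3.0), ("PS5-class", 5.5)]:
    print(f"{name:10s} {level_data_gb / gb_per_s:6.2f} s")
# HDD 30.00 s, SATA SSD 6.00 s, NVMe SSD 1.00 s, PS5-class 0.55 s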
 
The enclosure looks cool... I can sit there and mindlessly stare at it for an hour, like I used to not do with RGB lighting.

RGB lighting, how "cool" something looks... it all fits in with the same voice that said, "Buy a case of toilet paper to combat COVID-19."
 
As I mentioned elsewhere, it's only around 2007/2008 that PCs started to become graphically equal to or better than the consoles of that time (PC games like Doom 3 and HL2 that came out in 2004 and 2005 looked better than the then-current 6th gen consoles at the end of their life, but then the 7th gen consoles came out in 2005/2006 and briefly reset the status quo), and PCs have pretty much held that ground since (8th gen really was lackluster and felt uneventful from the PC side). Like I said, that's a long time for old-timers to forget and for newbies to never have known to begin with: PC wasn't always on top.

Even if it's in a new area aside from resolution/framerate, PCs having to catch up to consoles would be a return to the norm in my eyes, rather than what I still consider 'a new norm'. I don't know why people feel the need to take it so personally, like all of a sudden THEY don't matter anymore if a console can beat their PC in some way, in one specific benchmark for one specific thing.
Consoles have been trailing PCs graphics-wise ever since the original Voodoo was released in 1996.
 
I'm a PC gamer and have a PS4 for "exclusives"; the problem for me is I can't finish them due to bad frame pacing, or 30fps with bad motion blur.

I'm excited for the PS5 for the BC and for catching up on all the ones I bought and never played or finished. I'm also excited to see what happens when SSDs become the norm in game design. How each company differs in that aspect will be fun.

My main PC is a 9900K and 2080 Ti. I mainly play Halo atm, but always go back to Total War or other first-person shooters. That's one thing consoles will never be able to touch.

I'm a PC gamer with a 2080/3700X and a 3440x1440 120 Hz G-Sync monitor and have no issues playing the PS4 exclusives, as pretty much none of them have any frame pacing issues. The only ones I've played that do are Bloodborne and Until Dawn; all the others have pretty much perfect frame pacing at 30-60 FPS, as expected of a first-party title. They also have a pretty good implementation of motion blur, which helps considerably in smoothing out fast camera pans and movement.
 
I think it has been more like a game of leapfrog since the Voodoo cards were released, but there's a certain point where it's subjective. I certainly don't recall anything on PC looking nearly as good as Dreamcast games when it launched in 1999, Voodoo or not. Hardware T&L was a big leap, but GF3 cards launched a year after the PS2. By the time we had games like Half-Life 2 and other big titles of 2004-2005 that generally looked better than a lot of late-gen PS2 games like God of War or FF12, the PS3 and Xbox 360 were launching.

That generation and this most recent one had longer lifespans than the earlier ones, and I think PC games had generally surpassed those by... 2009 or so?

I would have called it even at the PS4 / Xbox One launch, with the systems performing like a higher-end gaming PC of the era, which obviously were outperformed by the Maxwell GPUs and Haswell processors shortly after.

I expect this generation will be largely the same, with overall performance surpassed by next-gen gaming PCs, but in terms of the games, they're all ports now anyway, so they'll all be about the same.
 
Do you see benchmarks? Do you see anything beyond their word on how well this will work? Also, this is Sony, and they've lied about console performance in the past. Remember Cell? Remember the Emotion Engine? Those things were super hyped up and weren't all that good: Cell was supposedly a supercomputer CPU, and the PS2 was supposed to push 50 million polygons per second. It's easy to make claims about console hardware when the console doesn't run Windows or Linux to test it. Remember what Gabe Newell said about the PS3? Show me benchmarks and then we'll talk.

That's hilarious, considering I'm running YDL 6.2 on a 4-node PS3 Beowulf cluster right now, and Linux can now be run on PS4 consoles (with a bit of work) successfully, with many benchmarks being released; the PS2 could also natively run Linux as well.
The Cell was extremely powerful in floating-point operations (though not so much in integer operations) compared to contemporary x86 and the then-new dual-core x86-64 CPUs in the mid-2000s, and if taken advantage of, had far greater performance capabilities.

You can read all about it in this thread from December of last year (the article is BS, but the thread itself is great):
https://hardforum.com/threads/ps3s-cell-cpu-is-by-far-stronger-than-new-intel-cpus.1989892/

Also, the 360, being released in 2005, was far more powerful than any single-GPU gaming PC on the market, and legit could only be topped with a then-new SLI or Crossfire PC, which at the time with high-end GPUs would have cost $2000-3000 - far more costly than the 360 console itself.
GPUs did catch up within the next year with NVIDIA's release of the G80, but still, for its time, the 360 was the fastest gaming system on the market.

As for Gabe Newell talking about programming on the Cell, I will agree with him that the lack of debugging output was total bullshit.
Relying on expert programmers with the Cell (at the beginning, there were very few) without generalized tools did make things an uphill battle - that one is squarely on both IBM and Sony.

As far as him complaining about 'learning something new', aka moving away from single-threaded applications, he was in the wrong and needed to up his game, especially since SMP applications on x86-64 were starting to emerge anyway; granted, the IBM Cell is not using SMP with its SPE units, but still, new hardware means new software and programming.
It's always like that with any new technology, be it hardware or software, and that is hardly an issue strictly limited to game consoles; heck, if anything, consoles now are more standardized than they have ever been in the past 40+ years.
 
I think it has been more like a game of leapfrog since the Voodoo cards were released, but there's a certain point where it's subjective. I certainly don't recall anything on PC looking nearly as good as Dreamcast games when it launched in 1999, Voodoo or not. Hardware T&L was a big leap, but GF3 cards launched a year after the PS2. By the time we had games like Half-Life 2 and other big titles of 2004-2005 that generally looked better than a lot of late-gen PS2 games like God of War or FF12, the PS3 and Xbox 360 were launching.
After the first Voodoo graphics card, consoles were a huge step behind PC gaming up until the release of the Xbox 360 and PS3. The reason is that this was the first time consoles used PC hardware to gain performance. Up until that point, Sony had used custom hardware and MIPS CPUs for the PlayStation consoles. Nintendo had also used custom hardware up until the Wii U. When the 360 and PS3 were released, it pushed a lot of PC-exclusive developers to jump onto console with games that used to be available only on PC. The PS4/XB1 as well as PS5/Xbox Series X cemented this by continuing to use PC hardware.
That generation and this most recent one had longer lifespans than the earlier ones, and I think PC games had generally surpassed those by... 2009 or so?
Consoles are lasting longer because technology isn't evolving as quickly and consumers are more reluctant to spend money on an upgrade.

I would have called it even at the PS4 / Xbox One launch, with the systems performing like a higher-end gaming PC of the era, which obviously were outperformed by the Maxwell GPUs and Haswell processors shortly after.
The reason the PS4 and Xbox One look slow and dated is Nvidia's Maxwell hardware. Specifically, the GTX 970 is what set higher standards for PC gaming. The GTX 780 Ti was $700, and then Nvidia releases a GTX 970 for $330 that performs the same or better? Nvidia wasn't being generous; they knew they had to set a high bar over consoles, and they did it with the GTX 970. You think Nvidia was happy when AMD got both the PS4 and XB1? Nvidia fought back and won the graphics card market, because AMD didn't want to rock the Sony and Microsoft boat. How do you think Sony and Microsoft would react if AMD released a graphics card right now for $400 that performs much faster than an Xbox Series X or PS5? I can assure you that Nvidia is working on something faster than these consoles, and it'll be less than $400.
I expect this generation will be largely the same, with overall performance surpassed by next-gen gaming PCs, but in terms of the games, they're all ports now anyway, so they'll all be about the same.
My prediction sees things differently. I think this console generation is going nowhere, because I'm pretty sure these consoles will cost more than $400. COVID-19 screwed up the economy, so nobody is buying anything this holiday season. The same applies to new PC graphics cards from Nvidia and AMD. Nobody is going to buy anything for some time.
 
A few more channels? A 'custom' controller? A different software stack?

Those are tweaks. Earth-shattering would be using something Optane-class that still peaks at PCIe 4.0 bandwidth.

I'm not saying that it isn't useful stuff nor that it won't make a difference; I'm saying that Sony essentially jumped the normal tech cycle gun by about six months.

We have different opinions on what tweaks are, then. In my opinion, Microsoft made a "few" tweaks to their industry-standard implementation of what PCs do.

There’s almost nothing Sony didn’t change. The only thing they didn’t do was invent new memory types and compression techniques.

The new PCIe 4.0 controllers coming out later this year still won't solve some of the issues Sony has addressed. This is all relative, since we'd need to see Sony exclusives on PC using that hardware to make a meaningful comparison. But it's not just a PC with a semi-custom SoC running a cheap ADATA PCIe 4.0 SSD.
 
That's hilarious, considering I'm running YDL 6.2 on a 4-node PS3 Beowulf cluster right now, and Linux can now be run on PS4 consoles (with a bit of work) successfully, with many benchmarks being released; the PS2 could also natively run Linux as well.
The Cell was extremely powerful in floating-point operations (though not so much in integer operations) compared to contemporary x86 and the then-new dual-core x86-64 CPUs in the mid-2000s, and if taken advantage of, had far greater performance capabilities.

You can read all about it in this thread from December of last year (the article is BS, but the thread itself is great):
https://hardforum.com/threads/ps3s-cell-cpu-is-by-far-stronger-than-new-intel-cpus.1989892/

Also, the 360, being released in 2005, was far more powerful than any single-GPU gaming PC on the market, and legit could only be topped with a then-new SLI or Crossfire PC, which at the time with high-end GPUs would have cost $2000-3000 - far more costly than the 360 console itself.
GPUs did catch up within the next year with NVIDIA's release of the G80, but still, for its time, the 360 was the fastest gaming system on the market.

As for Gabe Newell talking about programming on the Cell, I will agree with him that the lack of debugging output was total bullshit.
Relying on expert programmers with the Cell (at the beginning, there were very few) without generalized tools did make things an uphill battle - that one is squarely on both IBM and Sony.

As far as him complaining about 'learning something new', aka moving away from single-threaded applications, he was in the wrong and needed to up his game, especially since SMP applications on x86-64 were starting to emerge anyway; granted, the IBM Cell is not using SMP with its SPE units, but still, new hardware means new software and programming.
It's always like that with any new technology, be it hardware or software, and that is hardly an issue strictly limited to game consoles; heck, if anything, consoles now are more standardized than they have ever been in the past 40+ years.

I've been on this forum for almost two decades, and he was the first person I've ever ignored. I just can't anymore. I'm totally open to criticism and even heated discussion, but yeah, no.
 
After the first Voodoo graphics card, consoles were a huge step behind PC gaming up until the release of the Xbox 360 and PS3. The reason is that this was the first time consoles used PC hardware to gain performance. Up until that point, Sony had used custom hardware and MIPS CPUs for the PlayStation consoles. Nintendo had also used custom hardware up until the Wii U.
Does "custom hardware" to you mean "anything not x86-based", because if that is the case, you need to up your game with knowledge on these subjects and get away from your Wintel mentality, because it is holding you back, massively.
The original XBox used a x86 Pentium III Celeron with a DDR1 memory controller on it's chipset, and a GeForce 3-based GPU.

The GameCube, Wii, and Wii U all used an ATI/AMD Radeon GPU.
So the CPU isn't always an x86 or x86-64 CPU, so what?

Through the 1970s to the 1990s, x86 CPUs were hardly the gold standard when it came to performance, and PowerPC and MIPS both wiped the floor with them in overall performance clock-for-clock, and did so even into the early-2000s; where x86 won out was due to the price-performance ratio compared to those other ISAs, and this is now the argument being made against consoles, with you saying PCs (x86-64) is better than the consoles - well yeah, of course, when you spend a couple grand on the PC over the console, much like dropping a couple grand on those other ISAs back in the 1990s and early compared to x86 PCs of the time.
It really wasn't until the AMD Athlon 64 (Socket 754) was released in 2003, and with the AMD Athlon 64 X2 (Socket 939) in 2005, did x86-64 start to become a true contender with other CPU ISAs.

When the 360 and PS3 were released, it pushed a lot of PC-exclusive developers to jump onto console with games that used to be available only on PC.
That is total bullshit. I remember a ton of games that were on both PC and console - released on one and ported to the other - as far back as the 1980s, and definitely in the 1990s and 2000s.
You are claiming things that are pure fiction, at best, to purely fit whatever bullshit narrative you are literally making up at this point.

The PS4/XB1 as well as PS5/Xbox Series X cemented this by continuing to use PC hardware.
The PS4 and XBone both use a unified memory architecture, and the CPUs themselves have a memory controller designed specifically to utilize and communicate with that unified pool (GDDR5 in the PS4's case; the original Xbox One paired DDR3 with ESRAM).
Aside from the CPU itself, neither of them is using "PC" hardware, and neither console is an IBM-compatible PC; this video, at the start, describes this in great detail:

 
I'm a PC gamer with a 2080/3700X and a 3440x1440 120 Hz G-Sync monitor and have no issues playing the PS4 exclusives, as pretty much none of them have any frame pacing issues. The only ones I've played that do are Bloodborne and Until Dawn; all the others have pretty much perfect frame pacing at 30-60 FPS, as expected of a first-party title. They also have a pretty good implementation of motion blur, which helps considerably in smoothing out fast camera pans and movement.

God of War feels bad, man, and Horizon is just off, imo. Days Gone also didn't have a good feel.

Bloodborne was unplayable.

Uncharted wasn't bad. Feel is subjective; one of my GOATs from the PS3 was TLoU, but it was a chore to get through. The 60fps remaster was far superior.

Spider-Man felt fine. I dunno, I'm just holding out till PS5. I'm praying for a 60fps patch for most of them.

Like, for instance, I have Doom for PS4 and haven't played past the first level or so. But that's a dead-zone/joystick issue, I think, not a frame pacing/frame rate issue. I got it because I wanted to try it out in the living room and just couldn't get it feeling right.
 
Bloodborne was unplayable.
If Bloodborne were re-released, or at least optimized or allowed to run at 60fps on the PS5, then the PS5 would be a day-one purchase, hands down.
Even on the PS4 Pro, some areas would make the console work so hard that it could not maintain even a semi-consistent 30fps - outside of that, the game was damn near perfect overall.
 
Then don't, and buy one of the plethora of pre-built PCs. Buy a laptop that fits your needs. What we do here is clearly not for everyone.
No shit.

When the original PlayStation was launched - September 1995 here, not the 1994 Japanese launch - gaming was very different: there was no such thing as a gaming PC. The first Voodoo graphics card was released in November 1996. It wasn't cheap to own any sort of PC in 1995, let alone one specifically built for gaming, as a PC back then was at least $1k in 1995 money. That's why consoles existed: they were far cheaper than a PC, plus nobody wanted to learn DOS commands to launch a game.
There were gaming PCs, and other non-x86 and even x86 computers that weren't PCs, which could be used for gaming and were used as such.

Arguably, the Apple II computers from the late 1970s were considered to be both general purpose and gaming computers.
As far as PCs themselves go, having an 80386 PC in 1985 or later was quite the treat for gamers of the time.

So, if what you mean is computers (not just PCs) with 3D accelerators, then yes, that would have been from around 1995/1996 or later.
SGI MIPS-based workstations had that back in the 1980s, though, and could be used for developing, compiling, and playing games (which was done back then - for a price), and that was far before any contemporary PC had a 3D accelerator in it.

As for the cost of a 'gaming PC' back in 1995, it was more like $2000 (not adjusted for inflation) - $1000, like you are claiming, would not have bought a new Packard Bell budget PC.
Once again, you are making claims that are pure fiction - I know, I was there, I remember the cost of computers at that time.

Hell, the cost of the original IBM PC 5150 in 1981 was well over $2000 - adjust that for inflation and have a heart attack at the cost of the base-model 5150.
There is a reason the console market existed back then, even before the Atari 2600 in the 1970s - full computers were expensive, let alone high-performance models.

Ray-Tracing isn't amazing, but when consoles have it, PC games will require it. Will it be playable? That depends on your definition of "playable".

The only thing extending a 1080 Ti's value is console hardware limiting PC gaming. A GTX 970 owner would still enjoy today's games at 1080p 60fps with medium to high settings. A 1080 Ti would allow you to run games at max settings at 1440p, depending on the game. When Ray-Tracing hardware is a requirement, the 1080 Ti will be equally as incapable as a GTX 970.
Ray-Tracing isn't a requirement at this point in time, much like NVIDIA PhysX - it is an optional feature, and one with currently minimal games and minimal gains.
It would appear that its importance, at least at this point, is more from the market and sales point of 'having the feature' itself, rather than a product actually needing it for true functionality.

Yeah, actually it does. There are some games that required DX10 support, though not many, because DX9 was the choice for most PC games - mostly because the PS3 and 360 ran on DX9-like hardware. Going back a bit further, there was a time when you either had DX9.0c or you didn't get to play certain games. Remember when BioShock required SM3.0 and couldn't run on ATI's SM2.0 cards? This guy remembers. This guy also remembers when my Radeon 8500 could play games that GeForce 4 Ti owners couldn't, because DX8.1 became the minimum needed.
The only game that "required" DX10 to play was Halo 2: Vista Edition, and it was less a true requirement than Microsoft's own marketing bullshit to push Windows Vista at the time.
Turns out, the game could be hacked to run on Windows XP with DX10 enabled on it, so it was a bullshit requirement forced by Microsoft.

As for the rest, I agree, those were true technical requirements which were needed natively in hardware, and I remember that all as well.
I'm glad you finally remembered something that actually happened and was true from that era.
 
There’s almost nothing Sony didn’t change. The only thing they didn’t do was invent new memory types and compression techniques.

The new PCIe 4.0 controllers coming out later this year still won't solve some of the issues Sony has addressed. This is all relative, since we'd need to see Sony exclusives on PC using that hardware to make a meaningful comparison. But it's not just a PC with a semi-custom SoC running a cheap ADATA PCIe 4.0 SSD.
Some inline storage compression logic? Trim of the software stack? Prioritization of accesses?

Really, the compression logic is about it, and the biggest reason it makes a difference is that the other hardware is fixed. When you're running your cores at ~2.0GHz and only have so much RAM, CPU cache, etc., yeah, you probably do need to put some logic in and trim the software stack to implement online level loading to that degree. But the comparison raises the question: since faster NVMe SSDs are really a product of cost optimization, and compression schemes are really not that difficult, what's stopping a 'near-enough' tweak to current and near-future desktop hardware from easily providing the same capability?

And will games actually use it to the point that it'd be desirable as more than a bullet-point for marketing?
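(To illustrate the "not that difficult" part: the software-side equivalent on PC is just storing assets compressed and expanding them on load, which costs CPU time that a dedicated decompression block would hide. A minimal sketch, with zlib standing in for whatever codec a real pipeline would license - e.g. Kraken or BCPack on the consoles:)

Code:
# Minimal sketch of software-side asset decompression on load. zlib is only a
# stand-in codec for illustration; the CPU cost of decompress() is exactly what
# a dedicated hardware block next to the I/O path would take off the cores.
import zlib

def pack_asset(src_path: str, dst_path: str) -> None:
    with open(src_path, "rb") as f:
        data = f.read()
    with open(dst_path, "wb") as f:
        f.write(zlib.compress(data, level=9))   # done offline, at build time

def load_asset(path: str) -> bytes:
    with open(path, "rb") as f:
        compressed = f.read()                   # fewer bytes read from disk
    return zlib.decompress(compressed)          # CPU-side cost at load time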
 
Ray-Tracing isn't a requirement at this point in time, much like NVIDIA PhysX - it is an optional feature, and one with currently minimal games and minimal gains.
It would appear that its importance, at least at this point, is more from the market and sales point of 'having the feature' itself, rather than a product actually needing it for true functionality.
I really, really dislike this comparison. PhysX as a technology took off, just not quite in the direction Nvidia had intended.

The approach here is very different. Ray-tracing isn't a proprietary push; rather, it's more of a race that every GPU vendor is participating in. So it's not a requirement yet. But it will be.

Is it annoying that Nvidia 'got there first'? Perhaps; obviously they're quite happy to charge for the privilege, and at the same time, we haven't seen an end-user implementation that really shines. So I guess that's similar to PhysX, but again, it's a matter of 'when'.
 
I really, really dislike this comparison. PhysX as a technology took off, just not quite in the direction Nvidia had intended.

The approach here is very different. Ray-tracing isn't a proprietary push; rather, it's more of a race that every GPU vendor is participating in. So it's not a requirement yet. But it will be.

Is it annoying that Nvidia 'got there first'? Perhaps; obviously they're quite happy to charge for the privilege, and at the same time, we haven't seen an end-user implementation that really shines. So I guess that's similar to PhysX, but again, it's a matter of 'when'.
You know, that is a good point that I didn't consider, and while Ray-Tracing does require its own hardware, it isn't proprietary to NVIDIA, and it does seem like everyone is racing to get it.
Perhaps NVIDIA HairWorks would have been a better example of a feature that is being pushed, but is not a true requirement for basic functionality.
 
The launch titles look weak imo
Console launch lineups have historically been weak, especially in the last 15 years with games "needing" to also play on the previous console. It's worse when you have gimmicks that studios are trying to integrate into their games, and the PS5 is no different. Usually takes at least a year before they get over it and get back to making real games.
 
Console launch lineups have historically been weak, especially in the last 15 years with games "needing" to also play on the previous console. It's worse when you have gimmicks that studios are trying to integrate into their games, and the PS5 is no different. Usually takes at least a year before they get over it and get back to making real games.
At least so far, the PS5 launch seems to be less "cinematic" than the PS4 launch. :D
I don't know too many people who are going to miss 30fps.
 
God of War feels bad, man, and Horizon is just off, imo. Days Gone also didn't have a good feel.

Bloodborne was unplayable.

Uncharted wasn't bad. Feel is subjective; one of my GOATs from the PS3 was TLoU, but it was a chore to get through. The 60fps remaster was far superior.

Spider-Man felt fine. I dunno, I'm just holding out till PS5. I'm praying for a 60fps patch for most of them.

Like, for instance, I have Doom for PS4 and haven't played past the first level or so. But that's a dead-zone/joystick issue, I think, not a frame pacing/frame rate issue. I got it because I wanted to try it out in the living room and just couldn't get it feeling right.

Are you playing on a base PS4 or a Pro? I'm on a Pro, and for God of War I think I ended up mostly playing in the performance mode, which is basically an unlocked frame rate that usually stayed in the 40-60 range. I usually don't like that and prefer the locked 30 that the resolution mode gives you (which, from what I remember of Digital Foundry's analysis, it held pretty well), but the unlocked frame rate mode in that game is the only one I've been able to tolerate and actually prefer over a locked 30.

Horizon is probably the best example of perfect frame pacing on PS4 that I've come across, on both the base and Pro consoles when I had them both: I didn't notice a dropped frame throughout the 100+ hours I put into it over a couple of replays, and Digital Foundry's newest video (after a patch soon after launch) backs that up pretty well, so I dunno how it felt "off" to you at all. I also thought it had a great implementation of per-object motion blur (as opposed to the full-scene motion blur that most people associate with the term and that is universally hated), which makes it seem smoother than most 30 FPS games.

I agree on Bloodborne, it really needed at least a Pro patch and I'm surprised it never got one, as even with Boost mode (which barely does much on base PS4 games really) on my Pro it was pretty bad. I'm really hoping BC on the PS5 can fix it or they at least port/patch it for PS5, because it definitely deserves more love and attention than it got.

I just recently played a few hours of Days Gone, so it's got a bunch of patches since launch. I remember performance and bug issues in reviews when it came out, but it seems they've ironed most of them out since then, at least in the early game I've played so far.

Pretty much all the PS4 Naughty Dog games play perfectly for me as well, with perfect frame pacing at a locked 30 - or 60 in TLoU Remastered's case (which I just finished playing again last night in preparation for TLoU Part 2 coming out in a few days) - and they have a pretty good motion blur implementation too, which Digital Foundry's tech analysis also backs up. Spider-Man is more of the same.

As for Doom: I typically reserve FPS and cross-platform games for my PC, but I did the same as you and played it on my base PS4 not long after it came out, and I didn't have any issues with it personally, though of course I much preferred it on PC just for the KB&M controls and the better performance.

The PS5 will just give me an excuse to replay all of the above games at 60 FPS, hopefully, even if they have to run at 1440p to do it - which they shouldn't have to, given the significant hardware leap from the PS4. I guess it depends on whether they do a proper PS5 patch for these games to take better advantage of the hardware, or rely on BC/emulation to do it, which would still bode well for games like God of War with an unlocked frame rate mode, since it could then sit mostly at a locked 60 FPS, as I'm sure that's its target/cap. Horizon will be on PC soon though, so I can't wait to replay that on my ultrawide at 100+ FPS, but I may stick to using my controller for that game, since I think most 3rd-person games like that still play better with a controller.
 
You can have a good implementation of motion blur, but it's still motion blur, and I have a problem with motion blur, period.

Most people don't know the difference between full-scene motion blur and per-object motion blur, and don't realize that a lot of games use the latter without them even noticing it (or having the option to toggle it, in some PC games), because regardless of frame rate it generally improves gameplay and image quality, at least in 3rd-person action/adventure games like most of the PS4's exclusives. It also matters what input you're playing with: with a controller you're looking around with slower, more consistent camera movement, as opposed to a mouse, which is faster and has more jarring movement/input that can clash with certain types and amounts of motion blur.

I'd link to it if I could get to Youtube at the moment, but Digital Foundry did a good video about a year ago explaining the different motion blur methods, where the stigma came from and why people like you insist it's all bad, and why it's still needed and used even on PC games with 60+ FPS.
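(For anyone curious about the distinction being drawn here, a toy sketch of the per-object idea. Everything in it - the NumPy approach, the mask, the fixed sample count - is a simplification for illustration; real engines do this per pixel in a shader using a velocity buffer.)

Code:
# Toy per-object motion blur: smear only the pixels belonging to a moving object,
# along that object's screen-space velocity, and leave the rest of the frame
# (and camera motion) untouched - the distinction from blurring the whole frame.
import numpy as np

def per_object_blur(frame: np.ndarray, object_mask: np.ndarray,
                    velocity: tuple, samples: int = 8) -> np.ndarray:
    """frame: HxWx3 float image, object_mask: HxW bool, velocity: (dx, dy) pixels/frame."""
    acc = np.zeros_like(frame)
    for i in range(samples):
        t = i / (samples - 1) - 0.5                     # spread samples over -v/2 .. +v/2
        dx, dy = int(round(t * velocity[0])), int(round(t * velocity[1]))
        acc += np.roll(frame, shift=(dy, dx), axis=(0, 1))
    blurred = acc / samples
    out = frame.copy()
    out[object_mask] = blurred[object_mask]             # only the moving object gets blurred
    return out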
 
I'd link to it if I could get to Youtube at the moment, but Digital Foundry did a good video about a year ago explaining the different motion blur methods, where the stigma came from and why people like you insist it's all bad, and why it's still needed and used even on PC games with 60+ FPS.
Motion blur is trying to mimic a flaw of real-world cinematography in games. Why would you want a flaw implemented in a game? It's in the same category as film grain, chromatic aberration, and lens flare; some are just more annoying than others.
There are better and worse implementations of it, but as soon as I see motion blur in a game, I'm scrambling to turn it off. The only good implementation is one I don't even notice.

I really don't appreciate condescending videos like that which try to tell you what you should like.
 
Motion blur is trying to mimic a flaw of real-world cinematography in games. Why would you want a flaw implemented in a game? It's in the same category as film grain, chromatic aberration, and lens flare; some are just more annoying than others.
There are better and worse implementations of it, but as soon as I see motion blur in a game, I'm scrambling to turn it off. The only good implementation is one I don't even notice.

I really don't appreciate condescending videos like that which try to tell you what you should like.

It's not trying to mimic a flaw of anything; it's trying to simulate real-life vision and make it easier for your eyes to follow on-screen movement by making it appear smoother. Turn your head and/or eyes fast enough and you see everything blur. It's not in the same category as those other effects either: those are purely artistic effects, while blur has a practical use, especially in lower frame rate games at 30-60 FPS.

The video isn't condescending in the slightest, nor is it telling you what you should like. I recommend watching it before making a snap judgement like that and getting offended by an objective take on the subject.
 
Looks stackable to me. That curve in the middle starts and ends at the same vertical, so anything flat and longer than that curve can be stacked on top.

The bottom looks flat, so it should have no problem being stacked on top of others.
There's no way this could be stacked.
 
Slim towers just look sexier in press releases. Nobody is actually going to leave anything more ungainly than the Wii in a vertical orientation, unless you have it nailed to the floor! The cooling is typically better horizontal (just physics).

The Xbox Series X is such a chunker that there's no avoiding its orientation, but I will give Sony credit for creating something with the option of EITHER in its design.
Rectangles can easily be rotated 90 degrees; not sure what you are smoking.
 