PS5 - The Future of Gaming | The Power of V

What do you think of the console design? (Poll closed - 148 voters)

  • Yay - Very futuristic: 20 votes (13.5%)
  • Nay - Looks like a router: 36 votes (24.3%)
  • Don't Care: 67 votes (45.3%)
  • I prefer the refrigerator box: 25 votes (16.9%)
It's not trying to mimic a flaw of anything, it's trying to simulate real-life vision and make it easier for your eyes to follow on-screen movement by making it appear smoother. Turn your head and/or eyes fast enough and you see everything blur. It's not in the same category as those other effects either, as those are just artistic effects, while blur has a practical use, esp. in lower frame rate games at 30-60 FPS.

The video isn't condescending in the slightest, nor is it telling you what you should like. I recommend watching it before making a snap judgement like that and getting offended by an objective take on the subject.
I for one like motion blur in certain games. As long as it doesn't introduce lag.
In a game like ARK? Sure!
In a game like Quake Live? NO!
 
GPUs cost like $1000 and here are people crying about a $500 console. Stop being poor. Either pony up and buy one, or go back to your PS3.
 
GPUs cost like $1000 and here are people crying about a $500 console. Stop being poor. Either pony up and buy one, or go back to your PS3.

I don't think anyone is crying about a $500 console, if anything people here are crying about a $500 console not being as "good" as a $2000+ PC with one of those $1000 GPUs, which to me is missing the point of gaming in general and is why I game on all platforms including my $2000 PC.
 
I'm just wondering how long they will release PS4/XBONE games after the launch of these. I remember being shocked when they were still releasing games for the 360 in like either 2018 or 2019. If AAA titles still factor those in, it may be at least 2 years before seeing what these new consoles can do.
 
I'm just wondering how long they will release PS4/XBONE games after the launch of these. I remember being shocked when they were still releasing games for the 360 in like either 2018 or 2019. If AAA titles still factor those in, it may be at least 2 years before seeing what these new consoles can do.

Microsoft said they will release their exclusive titles for Xbox One for two years after the launch of the Series X if I recall correctly. Sony has stated that their PS5 exclusives will be PS5 only. I'm sure we'll get third party games released on PS4/Xbox One for quite a while.
 
Does "custom hardware" to you mean "anything not x86-based", because if that is the case, you need to up your game with knowledge on these subjects and get away from your Wintel mentality, because it is holding you back, massively.
By custom I mean hardware you won't find on PC or anyplace else. It is unique to that machine.
The original XBox used an x86 Pentium III-based Celeron with a DDR1 memory controller on its chipset, and a GeForce 3-based GPU.
With the exception of the Xbox and the Dreamcast, since the PowerVR chip in the Dreamcast was sold on PC for a while.
The GameCube, Wii, and Wii U all used an ATI/AMD Radeon GPU.
So the CPU isn't always an x86 or x86-64 CPU, so what?
GameCube and Wii used a GPU chip developed by ArtX, which was bought by ATI before the release of the GameCube. ArtX also developed the GPU for the N64. The Wii U did use a DX10 GPU from AMD, but that's more of the generation of the PS4 and XB1.

That is total bullshit, and I remember a ton of games that were on both PC and console, being released on one or the other and being ported to one or the other, as far back as the 1980s and definitely so in the 1990s and 2000s.
You are claiming things that are pure fiction, at best, to purely fit whatever bullshit narrative you are literally making up at this point.
Consoles did get PC games, but it usually took the generation afterwards. Like HL2 and Doom 3 were on Xbox because it was literally a PC, but not on the PS2 or GameCube. It took until the PS3 before Sony got these games, and since the Wii was an overclocked GameCube it never saw them either. The Wii U could have, but that console financially died before it had the chance. American McGee's Alice was released on PC in 2000 but never saw a console port until Xbox 360 and PS3. The 1998 Thief was on the original Xbox but didn't see a PlayStation release until PS3. The original Doom was ported to the SNES with the help of the Super FX chip, and even then most of the consoles that received Doom had some really bad ports. There was always a flaw like missing levels, low frame rates, bad music, or no music.

The PS4 and XBone both use a unified memory architecture, and the CPUs themselves have a memory controller designed specifically to utilize and communicate with GDDR5.
Aside from the CPU itself, neither of them are using "PC" hardware, and neither console is an IBM-compatible PC; this video, at the start, describes this in great detail:


The video you linked has the presenter say it isn't a PC, but then he goes on to say he's called it a legacy-free PC. It isn't a PC in the sense that you can't just install Windows or Linux on it, but it is a legacy-free PC in that you can make something like Linux work so long as you put in a lot of effort to get things going. I've linked that video before and watched it, and the guy had to find out what GPU the PS4 uses, found a match, and was able to get the AMD open-source drivers working on the PS4 because the GPU is mostly copied and pasted from other AMD PC hardware.

At this point what makes a "PC" a "PC" for you is how much legacy crap is involved, while my definition is that it uses x86 and GCN, so therefore it must be a PC. The fact that Linux runs on it at all, and can even play Steam games, shows that I'm more correct than you are.
 
So, if what you mean is computers (not just PCs) with 3D accelerators, then yes, that would have been from around 1995/1996 or later.
Before 3D accelerators, the thing that defined whether your PC was a gaming PC was the sound card. So... yea.
As for the cost of a 'gaming PC' back in 1995, it was more like $2000 (not adjusted for inflation) - $1000, like you are claiming, would not have bought a new Packard Bell budget PC.
Once again, you are making claims that are pure fiction - I know, I was there, I remember the cost of computers at that time.

Hell, the cost of the original IBM PC 5150 in 1981 was well over $2000 - adjust that for inflation and have a heart attack at the cost of the base-model 5150.
There is a reason the console market existed back then, even before the Atari 2600 in the 1970s - full computers were expensive, let alone high-performance models.
Either way a PC was expensive, and buying one for gaming was not a good idea at the time.

Ray-Tracing isn't a requirement at this point in time, much like NVIDIA PhysX - it is an optional feature, and one with currently minimal games and minimal gains.
It would appear that its importance, at least at this point, is more from the market and sales point of 'having the feature' itself, rather than a product actually needing it for true functionality.
Right now it's optional, but we're talking about after the PS5 and Xbox Series X are released.
 
By custom I mean hardware you won't find on PC or anyplace else. It is unique to that machine.
Most of the consoles used CPUs, graphics processors, controllers, etc. which were used in other computer systems, servers, embedded systems, etc.
The Genesis/Mega Drive used a Motorola 68000 (m68k) CPU, which saw mainstream use in many workstations throughout the 1980s and early 1990s.

With the exception of the Xbox and the Dreamcast, since the PowerVR chip in the Dreamcast was sold on PC for a while.
The Cell CPU and the NVIDIA GPU were used in other computer systems outside of the PS3 and were not unique - does that mean the PS3 is a "PC" as well?

GameCube and Wii used a GPU chip developed by ArtX, which was bought by ATI before the release of the GameCube. ArtX also developed the GPU for the N64. The Wii U did use a DX10 GPU from AMD, but that's more of the generation of the PS4 and XB1.
The PowerPC CPUs used in the GameCube and Wii were variants of the Motorola PowerPC CPUs used in many Apple Macintosh laptops, all-in-ones, workstations, etc.
What's your point?

Consoles did get PC games, but it usually took the generation afterwards. Like HL2 and Doom 3 were on Xbox because it was literally a PC, but not on the PS2 or GameCube. It took until the PS3 before Sony got these games, and since the Wii was an overclocked GameCube it never saw them either. The Wii U could have, but that console financially died before it had the chance. American McGee's Alice was released on PC in 2000 but never saw a console port until Xbox 360 and PS3. The 1998 Thief was on the original Xbox but didn't see a PlayStation release until PS3. The original Doom was ported to the SNES with the help of the Super FX chip, and even then most of the consoles that received Doom had some really bad ports. There was always a flaw like missing levels, low frame rates, bad music, or no music.
So basically, if you had thousands of dollars to spend in the 1990s, you could get the best system for these games, aka a PC.
For those who didn't have $3000+ to blow on an early 90s gaming PC, an affordable SNES got the job done - each has its pros and cons, so again, what is your point in all of this?

Half-Life 2 had high system requirements, and the original XBox had 64MB of RAM, allowing it to be ported properly.
The PS2 only had 32MB of RAM (and no HDD natively) and the GameCube only had 16MB of RAM, so getting a then-large game like Half-Life 2 running on each of those, without massively cutting corners/graphics/functionality, would have been a stretch.

It wasn't just because the original XBox "was a PC", and the game still needed to be ported and optimized for it.
Nice videos on the Doom ports, though - I have seen them before, and that YouTube channel makes great content.

The video you linked has the presenter say it isn't a PC, but then he goes on to say he's called it a legacy-free PC. It isn't a PC in the sense that you can't just install Windows or Linux on it, but it is a legacy-free PC in that you can make something like Linux work so long as you put in a lot of effort to get things going. I've linked that video before and watched it, and the guy had to find out what GPU the PS4 uses, found a match, and was able to get the AMD open-source drivers working on the PS4 because the GPU is mostly copied and pasted from other AMD PC hardware.

At this point what makes a "PC" a "PC" for you is how much legacy crap is involved, while my definition is that it uses x86 and GCN, so therefore it must be a PC. The fact that Linux runs on it at all, and can even play Steam games, shows that I'm more correct than you are.
Seems like we are arguing anachronistic semantics at this point - "PC" means IBM-compatible PC, which is the term I am going with, and what these game consoles all are not, despite them being full computers.
However, if you mean "PC" as "personal computer in general", then everything down to a smartphone or calculator could easily fall under that category, which I can agree with since it is a generalized term that blankets any computer below a mainframe.

So which is it with you?
 
Before 3D accelerators, the thing that defined whether your PC was a gaming PC was the sound card. So... yea.
That statement is total bullshit, and counters what you just said above - so now you have flopped from a PC needing a 3D accelerator to be defined as a "gaming PC" to now needing a sound card to be defined as a "gaming PC" - give me a break. :meh:
Sound cards were optional in the 1980s and early 1990s (and technically still are from a requirement standpoint), and in the early 90s I gamed on an 80386 with Wolfenstein 3D without a sound card - that didn't make the PC any less of a gaming PC.

Either way a PC was expensive, and buying one for gaming was not a good idea at the time.
Either way, you don't have a clue about either the history of any of this, or what you are talking about.
Again, this is a hypocritical statement with you previously saying "PC gaming has always been the best" and now you are saying "buying it was not a good idea at the time" - the fuck are you smoking???

Right now it's optional, but we're talking about after the PS5 and Xbox Series X are released.
That has yet to be seen, and is pure speculation on your part.
 
I still don't understand why I should be impressed by that. It just doesn't matter, at all.
When I went from spinner drives to the first SATA SSD and loading went from 60 seconds to 5 seconds for some games, I was impressed.
When I went from SATA SSD to NVMe SSD and loading time went from 5 to 3 seconds, I couldn't care less. I would still buy a SATA SSD today without reservation, because it just doesn't matter.
So why should I care that a PS5 will be able to take loading times from 3 seconds to 1.5 seconds? It's just not a feature that matters. And if this is the key selling point of the console, then something is rotten in the state of Denmark.
Because at this point all the files on the SSD can be accessed basically as fast as if they were in RAM, so now devs can use massive amounts of textures, make much larger worlds with no loads, and basically do away with loading screens altogether.
 
Because at this point all the files on the SSD can be accessed basically as fast as if they were in RAM, so now devs can use massive amounts of textures, make much larger worlds with no loads, and basically do away with loading screens altogether.
That is not how it works. All assets for every possible next frame need to be in VRAM, otherwise you get stuttering, or even a pause in the gameplay while they are being swapped in. It might help streaming, but there have been streaming implementations that work very well without ultra-fast drives; you don't need a huge amount of bandwidth to stream a game world seamlessly. The limiting factor is still the size of VRAM. Just because you have a very fast SSD does not mean you can access the data as if it were already in VRAM, unless the GPU has direct access to the drive to use assets directly from there. But then you would need the same bandwidth as the VRAM.

This is not a game changer; this is a hyped-up solution in search of a problem.
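
To put rough numbers on why an SSD, however fast, isn't a substitute for VRAM, here's a back-of-the-envelope sketch. The 5.5 GB/s figure is the raw SSD rate quoted for the PS5; the 448 GB/s VRAM figure is an assumed GDDR6 bandwidth I'm using purely for illustration:

```python
# Per-frame data budgets at 60 FPS (illustrative peak numbers, not measurements).
FRAME_TIME = 1 / 60        # seconds per frame at 60 FPS

ssd_rate = 5.5e9           # bytes/s - raw SSD throughput quoted for the PS5
vram_rate = 448e9          # bytes/s - assumed GDDR6 memory bandwidth

ssd_per_frame = ssd_rate * FRAME_TIME      # ~92 MB of fresh data per frame
vram_per_frame = vram_rate * FRAME_TIME    # ~7.5 GB the GPU can touch per frame

print(f"SSD per frame:  {ssd_per_frame / 1e6:.0f} MB")
print(f"VRAM per frame: {vram_per_frame / 1e9:.1f} GB")
print(f"VRAM is ~{vram_rate / ssd_rate:.0f}x faster than the SSD")
```

So the SSD can stream in maybe ~90 MB of new assets per frame, which is plenty for aggressive world streaming but nowhere near letting the GPU treat the drive as if it were VRAM.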
 
That is not how it works. All assets for every possible next frame need to be in VRAM, otherwise you get stuttering, or even a pause in the gameplay while they are being swapped in. It might help streaming, but there have been streaming implementations that work very well without ultra-fast drives; you don't need a huge amount of bandwidth to stream a game world seamlessly. The limiting factor is still the size of VRAM. Just because you have a very fast SSD does not mean you can access the data as if it were already in VRAM, unless the GPU has direct access to the drive to use assets directly from there. But then you would need the same bandwidth as the VRAM.

This is not a game changer; this is a hyped-up solution in search of a problem.
You can see that in the Ratchet & Clank video. Severe stuttering is present whenever the player pulls another world in. They try to mask it with the framebuffer effect. This is why I am calling this a gimmick.
 
It's not trying to mimic a flaw of anything, it's trying to simulate real-life vision and make it easier for your eyes to follow on-screen movement by making it appear smoother. Turn your head and/or eyes fast enough and you see everything blur. It's not in the same category as those other effects either, as those are just artistic effects, while blur has a practical use, esp. in lower frame rate games at 30-60 FPS.

The video isn't condescending in the slightest, nor is it telling you what your should like. I recommend watching it before making a snap judgement like that and seeming offended by an objective take on the subject.
I don't know what video you are talking about then, because the one I saw was exactly about how everyone should come over the fence and embrace motion blur.
Chromatic aberration, lens flare, and film grain are not artistic effects; they are past and present limitations of camera technology that get used as artistic effects in games, effects that I don't approve of. At least they don't use black and white in games very often, as some filmmakers/photographers think that's artistic as well.

As for motion blur, it does exactly what I said: it tries to mimic a camera shutter in games, so instead of being a single point in time, a frame captures the blur/motion of objects for its entire duration. If it is implemented well it is very taxing on the hardware; if it is implemented badly it looks awful. And I don't like motion blur even when it is implemented well, as I don't like excessive amounts of motion blur on film either. I prefer crisp frames; that's why I advocate for HFR movies instead of the archaic 24 fps nightmare that some filmmakers think is artistic. No, it's not artistic, it's a tradition, and we all know what traditions are, as explained by Rick.
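
To spell out what "capturing the entire duration of a frame" means, here's the shutter idea as a sketch in one formula, where T is the shutter-open interval and I(x, t) is the instantaneous scene at pixel x:

$$ F(x) = \frac{1}{T} \int_{t_0}^{t_0+T} I(x, t)\, dt $$

Game motion blur approximates this time average by accumulating or smearing a few samples along each pixel's (or object's) motion vector, which is why a good implementation costs real GPU time.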
 
That is not how it works. All assets for every possible next frame need to be in VRAM, otherwise you get stuttering, or even a pause in the gameplay while they are being swapped in. It might help streaming, but there have been streaming implementations that work very well without ultra-fast drives; you don't need a huge amount of bandwidth to stream a game world seamlessly. The limiting factor is still the size of VRAM. Just because you have a very fast SSD does not mean you can access the data as if it were already in VRAM, unless the GPU has direct access to the drive to use assets directly from there. But then you would need the same bandwidth as the VRAM.

This is not a game changer; this is a hyped-up solution in search of a problem.

Either way you will now be able to swap things in and out of RAM at a much higher rate of speed, including, like I said, textures or maps and geometry, without any stutters, and it will reduce loads to nearly zero. I would say GTA V and Red Dead are pretty much pushing current technology to the max, and what about GTA 6, when they will want to have a lot more random vehicles, people, buildings, and trees with twice the detail? They are gonna need guaranteed high-speed access to those files to keep from having stuttering or loads, especially with the higher detail. And there are some rumors that maybe the GPU WILL have direct access to the SSD, because I was reading somewhere that some of the professional Radeon cards have been using onboard NVMe for some time now.
 
I don't know what video you are talking about then, because the one I saw was exactly about how everyone should come over the fence and embrace motion blur.
Chromatic aberration, lens flare, and film grain are not artistic effects; they are past and present limitations of camera technology that get used as artistic effects in games, effects that I don't approve of. At least they don't use black and white in games very often, as some filmmakers/photographers think that's artistic as well.

As for motion blur, it does exactly what I said: it tries to mimic a camera shutter in games, so instead of being a single point in time, a frame captures the blur/motion of objects for its entire duration. If it is implemented well it is very taxing on the hardware; if it is implemented badly it looks awful. And I don't like motion blur even when it is implemented well, as I don't like excessive amounts of motion blur on film either. I prefer crisp frames; that's why I advocate for HFR movies instead of the archaic 24 fps nightmare that some filmmakers think is artistic. No, it's not artistic, it's a tradition, and we all know what traditions are, as explained by Rick.
Well, as far as games go I'm with you, but gross, you LIKE the soap opera effect for movies? Can't say I'm with you on that. It makes everything look cheap, fake, and less believable.
 
Either way you will now be able to swap things in and out of RAM at a much higher rate of speed, including, like I said, textures or maps and geometry, without any stutters, and it will reduce loads to nearly zero. I would say GTA V and Red Dead are pretty much pushing current technology to the max, and what about GTA 6, when they will want to have a lot more random vehicles, people, buildings, and trees with twice the detail? They are gonna need guaranteed high-speed access to those files to keep from having stuttering or loads, especially with the higher detail. And there are some rumors that maybe the GPU WILL have direct access to the SSD, because I was reading somewhere that some of the professional Radeon cards have been using onboard NVMe for some time now.
Repeating the same thing is the definition of insanity. You can't swap in assets without stuttering even from RAM to VRAM, so how could you from disk? I just explained that. I also explained that the limitation for higher-resolution objects in the game world is not drive speed but the amount of VRAM.

You act like a paid shill repeating the sales pitch.

Professional usage is very different from gaming: for pro usage you need huge amounts of memory, and an SSD is cheaper than RAM, assuming what you say is true and there really are pro cards with onboard SSDs. Stuttering also is not a deal breaker in pro usage. I'm happy if my card can just move a CAD scene/point cloud with reasonable latency.
 
Well, as far as games go I'm with you, but gross, you LIKE the soap opera effect for movies? Can't say I'm with you on that. It makes everything look cheap, fake, and less believable.
The so-called soap opera effect is a myth; it has more to do with bad videography than the actual frame rate. Have you seen 96 FPS HFR footage? It is brilliant, crisp; you can actually see details during motion and not just a blurry mess (although some movies do benefit from hiding their fight choreography or stunt work behind motion blur). So it was fake all along; it's just that with high FPS you can notice fake fights more easily.
 
Repeating the same thing is the definition of insanity. You can't swap in assets without stuttering even from RAM to VRAM, so how could you from disk? I just explained that. I also explained that the limitation for higher-resolution objects in the game world is not drive speed but the amount of VRAM.

You act like a paid shill repeating the sales pitch.

Professional usage is very different from gaming: for pro usage you need huge amounts of memory, and an SSD is cheaper than RAM, assuming what you say is true and there really are pro cards with onboard SSDs. Stuttering also is not a deal breaker in pro usage. I'm happy if my card can just move a CAD scene/point cloud with reasonable latency.

OK, but there are already videos showing higher framerates in games like GTA V, Witcher 3, and Red Dead using an SSD vs. an HDD, and that's current gen. I'm guessing they are trying to lay the groundwork for pushing things forward, and especially with ray tracing coming into play I'm guessing it's gonna be nice for devs to have the extra bandwidth. I'm assuming they will be able to put it to use; if it wasn't a benefit I don't think they would have spent so much to include it. Oh, and it's supposed to help audio too somehow.
 
Are you playing on a base PS4 or a Pro? I'm on a Pro, and for God of War I think I ended up mostly playing on the performance mode, which is basically an unlocked frame rate that usually stayed in the 40-60 range. I usually don't like that and prefer the locked 30 the resolution mode gives you (and from what I remember of Digital Foundry's analysis it did pretty well holding it), but the unlocked frame rate mode on that game is the only one I've been able to tolerate and actually prefer over a locked 30.

Horizon is probably the best example of perfect frame pacing on PS4 that I've come across on both the base and Pro consoles when I had them both, as I didn't notice a dropped frame throughout the 100+ hours I put into it over a couple of replays, and Digital Foundry's newest vid (after a patch soon after launch) backs that up pretty well, so I dunno how it felt "off" to you at all. I also thought it had a great implementation of per-object motion blur (as opposed to the full-scene motion blur that most people associate with the term and universally hate) to make it seem smoother than most 30 FPS games.

I agree on Bloodborne, it really needed at least a Pro patch and I'm surprised it never got one, as even with Boost mode (which barely does much on base PS4 games really) on my Pro it was pretty bad. I'm really hoping BC on the PS5 can fix it or they at least port/patch it for PS5, because it definitely deserves more love and attention than it got.

I just recently played a few hours of Days Gone, so it's got a bunch of patches since launch. I remember performance and bug issues in reviews when it came out, but it seems they've ironed most of them out since then, at least in the early game I've played so far.

Pretty much all the PS4 Naughty Dog games play perfectly to me as well, with perfect frame pacing at a locked 30, or 60 in TLoU Remastered's case (which I just finished playing again last night in preparation for TLoU Part 2 coming out in a few days), and they have a pretty good motion blur implementation as well, which Digital Foundry's tech analysis also backs up. Spider-Man is more of the same.

As for Doom: I typically reserve FPS and cross-platform games for my PC, but I did the same as you and played it on my base PS4 not long after it came out and didn't have any issues with it personally, though of course I much preferred it on PC, just for KB&M controls and also better performance.

PS5 will just give me an excuse to replay all of the above games at 60 FPS hopefully, even if they have to run at 1440p to do it, which they shouldn't have to with the significant hardware leap from PS4. I guess it depends on whether they do a proper PS5 patch for these games to take better advantage of the hardware or rely on BC/emulation, which would still bode well for games like God of War with an unlocked frame rate mode, since that could run at a mostly locked 60 FPS, as I'm sure that's its target/cap. Horizon will be on PC soon though, so I can't wait to replay that on my ultrawide at 100+ FPS, but I may stick to using my controller for that game, since I think most 3rd-person games like that still play better with a controller.

Yep, I'm on a Pro. If the motion blur is top notch I might be able to do 30 FPS, but if not it's just terrible. PC gaming literally ruined consoles for me, haha. My wife says she doesn't understand, but we do. The struggle is real.
 
OK, but there are already videos showing higher framerates in games like GTA V, Witcher 3, and Red Dead using an SSD vs. an HDD, and that's current gen. I'm guessing they are trying to lay the groundwork for pushing things forward, and especially with ray tracing coming into play I'm guessing it's gonna be nice for devs to have the extra bandwidth. I'm assuming they will be able to put it to use; if it wasn't a benefit I don't think they would have spent so much to include it. Oh, and it's supposed to help audio too somehow.
Framerate has nothing to do with loading speed.
 
Framerate has nothing to do with loading speed.
I know the SSDs, especially the newer implementations, will benefit both, depending on the title. As it is now, titles being required to run off of HDDs is holding devs back from doing certain things. And if 560 MB/s can make that HUGE of a difference, just think what 5.5-9.6 GB/s will be able to achieve. That's basically the bandwidth of DDR2.
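
For what it's worth, the DDR2 comparison roughly checks out on paper, at least for peak sequential throughput. A quick sketch using peak theoretical numbers only (latency is a separate issue, which comes up later in the thread):

```python
# Peak theoretical bandwidth of a single DDR2 channel:
# transfers per second x bus width in bytes.
ddr2_400 = 400e6 * 8 / 1e9    # DDR2-400 -> 3.2 GB/s
ddr2_800 = 800e6 * 8 / 1e9    # DDR2-800 -> 6.4 GB/s

ssd_raw = 5.5                 # GB/s - lower end of the range quoted above
ssd_compressed = 9.6          # GB/s - upper end of that range (with compression)

print(f"DDR2-400: {ddr2_400:.1f} GB/s, DDR2-800: {ddr2_800:.1f} GB/s")
print(f"PS5 SSD:  {ssd_raw:.1f} GB/s raw, up to {ssd_compressed:.1f} GB/s compressed")
```

So in raw streaming terms the quoted range really does bracket DDR2-era peak bandwidth; what it doesn't match is DRAM's access latency.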
 
That statement is total bullshit, and counters what you just said above - so now you have flopped from a PC needing a 3D accelerator to be defined as a "gaming PC" to now needing a sound card to be defined as a "gaming PC" - give me a break. :meh:
Before 3D accelerators, that's what you bought if you wanted to play games. Sound cards came with game ports that you would use to plug in your joystick. Even then it wasn't a gaming PC as we know it today; it was somebody who wanted to play games on his work PC.
Sound cards were optional in the 1980s and early 1990s (and technically still are from a requirement standpoint), and in the early 90s I gamed on an 80386 with Wolfenstein 3D without a sound card - that didn't make the PC any less of a gaming PC.
Nor does not having a graphics card make it not a gaming PC. Let's be honest here, hearing bleeps and bloops for sound sucked in the early 90's. Not having a graphics card today sucks.

Either way, you don't have a clue about either the history of any of this, or what you are talking about.
Again, this is a hypocritical statement with you previously saying "PC gaming has always been the best" and now you are saying "buying it was not a good idea at the time" - the fuck are you smoking???
Buying a sports car is always better than not buying one, but it doesn't mean it's a good financial decision. Today if I want a fast car I don't have to spend sports-car prices. PC gaming has changed a lot since the early 90's. Today I can easily buy an $800 gaming PC, while in the early 90's you couldn't even buy an IBM-compatible PC for $1000. Even before the Voodoo graphics card, PC gaming was already ahead of consoles. The Voodoo graphics card plus id Software games kick-started PC gaming into the mainstream. Prices came down thanks to competition from AMD and ATI. Things changed.
That has yet to be seen, and is pure speculation on your part.
It is pure speculation, but you know I'm right. Probably for the next 1-2 years most PC games will have ray tracing as optional, but afterwards it'll become required. It'll be required because developers don't want to maintain a separate version of their game just because you didn't upgrade, and because AMD and Nvidia will push them to, so they can make money from hardware sales.
 
Before 3D accelerators, that's what you bought if you wanted to play games. Sound cards came with game ports that you would use to plug in your joystick. Even then it wasn't a gaming PC as we know it today; it was somebody who wanted to play games on his work PC.

Nor does not having a graphics card make it not a gaming PC. Let's be honest here, hearing bleeps and bloops for sound sucked in the early 90's. Not having a graphics card today sucks.


Buying a sports car is always better than not buying one, but it doesn't mean it's a good financial decision. Today if I want a fast car I don't have to spend sports-car prices. PC gaming has changed a lot since the early 90's. Today I can easily buy an $800 gaming PC, while in the early 90's you couldn't even buy an IBM-compatible PC for $1000. Even before the Voodoo graphics card, PC gaming was already ahead of consoles. The Voodoo graphics card plus id Software games kick-started PC gaming into the mainstream. Prices came down thanks to competition from AMD and ATI. Things changed.

It is pure speculation, but you know I'm right. Probably for the next 1-2 years most PC games will have ray tracing as optional, but afterwards it'll become required. It'll be required because developers don't want to maintain a separate version of their game just because you didn't upgrade, and because AMD and Nvidia will push them to, so they can make money from hardware sales.
So, out of all of this dog and pony show you keep taking us all through - what's your point?
 
The so-called soap opera effect is a myth; it has more to do with bad videography than the actual frame rate. Have you seen 96 FPS HFR footage? It is brilliant, crisp; you can actually see details during motion and not just a blurry mess (although some movies do benefit from hiding their fight choreography or stunt work behind motion blur). So it was fake all along; it's just that with high FPS you can notice fake fights more easily.
I kind of get both sides; though many will argue that 'cinematic' capture is at 24FPS.

Personally, I prefer the clarity. When the 'smooth motion' or whatever the tech is marketed as is turned on at its lowest setting, I find it balanced.

Bigger issue is that's for playback of pre-recorded, non-interactive content: if it's interactive, turn off the fucking motion blur. That basically just creates a 'trail' behind object or whole screen movement.
 
So, out of all of this dog and pony show you keep taking us all through - what's your point?
Point is that consoles are no longer needed. Their purpose was to offer cheap computing power for gaming because PCs were expensive. I really doubt that the PS5 and Xbox Series X are going to be cheap. If these new consoles are $500 or higher, then I can't see why someone wouldn't get a PC instead for a few hundred more. Consequently, if the PS5 was $400 then PC gaming is in for a shit storm, especially when AMD and Nvidia have been offering more expensive graphics cards lately. Let's be honest here, the RX 5700 should have been $250 from release. I expect Nvidia to release some fast new GPUs, but will they be affordable? Even I have to admit that if the Nvidia RTX 3070 was $500 and performed like a PS5, then why wouldn't I buy a PS5? Console prices can make or break their future success.
 
I know the SSDs, especially the newer implementations, will benefit both, depending on the title. As it is now, titles being required to run off of HDDs is holding devs back from doing certain things. And if 560 MB/s can make that HUGE of a difference, just think what 5.5-9.6 GB/s will be able to achieve. That's basically the bandwidth of DDR2.

Just a comment on this, since this seems to be a very common misconception: the speed advantage of SATA SSDs over HDDs on the PC, especially for typical application loading (e.g. games), is not really due to the sequential transfer rate difference.

The real advantage of SSDs is that they have something like 100x lower access latency and faster random read times compared to HDDs. While NVMe-based SSDs have potentially much higher sequential transfer rates (especially at higher queue depths), in some cases (implementations) they aren't actually any faster than SATA SSDs in those two metrics.

This is also important in terms of why bandwidth (throughput) is not the only consideration, for example in a comparison to system memory. While they can potentially match DDR2 bandwidth in sequential transfers, they are orders of magnitude slower in latency and random access metrics.

So it then also becomes an interesting software question of how much the workload, in this case games, will actually be able to adapt to exploit simply the sequential transfer speed differences.
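
If anyone wants to see the latency-vs-throughput gap on their own machine, here's a rough sketch. The file path is made up; point it at any multi-GB file on the drive under test, and note the OS page cache will flatter a second run:

```python
import os
import random
import time

PATH = "big_test_file.bin"   # hypothetical: any multi-GB file on the drive under test
BLOCK = 4096                 # 4 KiB, a typical small random-read size
N = 2000                     # number of reads to sample for each pattern

size = os.path.getsize(PATH)

# Sequential pattern: read N blocks back to back from the start of the file.
start = time.perf_counter()
with open(PATH, "rb", buffering=0) as f:
    for _ in range(N):
        f.read(BLOCK)
seq_time = time.perf_counter() - start

# Random pattern: seek to N random offsets and read one block at each.
start = time.perf_counter()
with open(PATH, "rb", buffering=0) as f:
    for _ in range(N):
        f.seek(random.randrange(0, size - BLOCK))
        f.read(BLOCK)
rand_time = time.perf_counter() - start

print(f"sequential: {seq_time:.3f}s, random: {rand_time:.3f}s, "
      f"random is {rand_time / seq_time:.1f}x slower")
```

On an HDD the random pattern is typically orders of magnitude slower than the sequential one; on a decent SSD the gap shrinks dramatically, which is exactly the access-latency advantage described above.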
 
Point is that consoles are no longer needed. Their purpose was to offer cheap computing power for gaming because PCs were expensive. I really doubt that the PS5 and Xbox Series X are going to be cheap. If these new consoles are $500 or higher, then I can't see why someone wouldn't get a PC instead for a few hundred more. Consequently, if the PS5 was $400 then PC gaming is in for a shit storm, especially when AMD and Nvidia have been offering more expensive graphics cards lately. Let's be honest here, the RX 5700 should have been $250 from release. I expect Nvidia to release some fast new GPUs, but will they be affordable? Even I have to admit that if the Nvidia RTX 3070 was $500 and performed like a PS5, then why wouldn't I buy a PS5? Console prices can make or break their future success.
Again, what I posted previously:

What a lot of individuals here are forgetting is that many people don't want to build systems - be it lack of knowledge, too much work, lack of time, etc. - many of them just want to get a console, connect it into their TV and Wi-Fi, sit down, and start playing games with minimal hassle; they aren't in it for the hardware, they are in it for the games, period.
Convenience and ease-of-use are both massive factors with the average gamer.


If we go about it logically, yes, PCs are most likely going to be better overall for everything, depending on the cost, and I am not denying you that.
However, the average gamer may not want that hassle, as I stated above, and simply wants an easy-to-use plug-n-play system that they can easily and quickly enjoy their games on, with minimal hassle.

Not to mention, exclusives for a console can be the killer app for a specific platform.
I swore I would never get another console until I realized Bloodborne was a PS4 exclusive, and then lucky me, Spider-Man released not long after I got one, and that was another fantastic PS4 exclusive; both titles plus the console were absolutely worth the cost.

So, "ease of use" and "exclusive titles", which is a large market share, pretty much defeats your argument on the business aspect.
It isn't that consoles are 'better' (obviously in hardware, they are not), but they do offer exclusive perks that many individuals find enticing outside of price-performance, and Sony (and MS and Nintendo) are going to capitalize on that.

You keep looking at this purely from the price-performance aspect, and are completely missing the business and marketing aspects.
I really recommend you start wrapping your head around those aspects before going any further with your argument.
 
Just a comment on this, since this seems to be a very common misconception: the speed advantage of SATA SSDs over HDDs on the PC, especially for typical application loading (e.g. games), is not really due to the sequential transfer rate difference.

The real advantage of SSDs is that they have something like 100x lower access latency and faster random read times compared to HDDs. While NVMe-based SSDs have potentially much higher sequential transfer rates (especially at higher queue depths), in some cases (implementations) they aren't actually any faster than SATA SSDs in those two metrics.

This is also important in terms of why bandwidth (throughput) is not the only consideration, for example in a comparison to system memory. While they can potentially match DDR2 bandwidth in sequential transfers, they are orders of magnitude slower in latency and random access metrics.

So it then also becomes an interesting software question of how much the workload, in this case games, will actually be able to adapt to exploit simply the sequential transfer speed differences.
you say "in some instances" right in a porly designed implementation. this is a custom 5 channel design which was designed to give very low access times and was designed specifically for this use case. Not like they are using a bottom of the barrel single channel sata over m.2 thing. So when it comes to smaller files then we really want to see the 4k speeds and iops as well as access times. and i'm sure whatever it is it blows sata ssd's out the water.

But really, for most cross-platform games, the most you'll have to worry about is whatever Xbox is doing, as that will be the minimum spec. The only games that will be able to take advantage of the tech will be PS exclusives anyway, but I haven't heard any devs coming out complaining about or mocking the tech - only people on this board. And actually, rumor on the street is that devs are saying the PS5 is the better console. So time will tell.
 
You keep looking at this purely from the price-performance aspect, and are completely missing the business and marketing aspects.
I really recommend you start wrapping your head around those aspects before going any further with your argument.

He's not going to; myself and others have made these same points ad nauseam to him in other threads, and that's when he calls console users stupid and console exclusives illegal with some bogus/irrelevant movie theater deal. Again, I called this happening on the first page from his first post in the thread, and I recommend ignoring him outright from now on.
 
I know it sucks that some of you are gonna have to upgrade your rig to play the latest games, but that's how it goes when it comes to PC. It's been a long time since it's happened, but it's nothing new. Everyone was complaining about how consoles were holding PC gaming back, and now that they are pushing things forward the same people are complaining because their Sandy Bridge quad-core and 1st-gen SSD from 2011 may not be good enough in 2021. Time to go [H]
 
So it then also becomes an interesting software question of how much the workload, in this case games, will actually be able to adapt to exploit simply the sequential transfer speed differences.
Games were never limited by the sequential transfer rate of the disk they are loaded from.

I know it sucks that some of you are gonna have to upgrade your rig to play the latest games, but that's how it goes when it comes to PC. It's been a long time since it's happened, but it's nothing new. Everyone was complaining about how consoles were holding PC gaming back, and now that they are pushing things forward the same people are complaining because their Sandy Bridge quad-core and 1st-gen SSD from 2011 may not be good enough in 2021. Time to go [H]
When I pointed out that games are not limited by disk transfer rate you ignored it, and now pages later you still spout the same marketing line as if nothing has happened. Please just stop for a moment. Or do you think if you make enough shilling posts you can bury actual arguments?
 
Games were never limited by the sequential transfer rate of the disk they are loaded from.

When I pointed out that games are not limited by disk transfer rate you ignored it, and now pages later you still spout the same marketing line as if nothing has happened. Please just stop for a moment. Or do you think if you make enough shilling posts you can bury actual arguments?
And you also ignored that there are games now that are getting higher framerates and better frame times/pacing with current SSD tech, so what makes you think a high-bandwidth SSD won't be a requirement for a smooth experience in the future? Look man, I don't even own a console and haven't since a release PS3 that I sold 2 years later. Who am I shilling for?
 
Games were never limited by the sequential transfer rate of the disk they are loaded from.


When I pointed out that games are not limited by disk transfer rate you ignored it, and now pages later you still spout the same marketing line as if nothing has happened. Please just stop for a moment. Or do you think if you make enough shilling posts you can bury actual arguments?
Oh, and I just remembered from the PS5 reveal that you can use an external drive with the PS5, you just won't be able to run "next gen" games from it. So take that as you will.
 
I don't really keep track of the 'game', but didn't Star Citizen start requiring or recommending an SSD a while back?
 
What a lot of individuals here are forgetting is that many people don't want to build systems
We've been through this: you don't have to build anything. Most people who game on PC are probably doing it on a laptop, and last I checked you can't build a laptop, though it would be amazing if you could.
If we go about it logically, yes, PCs are most likely going to be better overall for everything, depending on the cost, and I am not denying you that.
However, the average gamer may not want that hassle, as I stated above, and simply wants an easy-to-use plug-n-play system that they can easily and quickly enjoy their games on, with minimal hassle.
Most people who game on PC or console aren't worried about difficulty of setting it up and using it. The main issue for gamers is if they can play with their friends. If your friends are using PS4 to play GTAV online and you're on PC then you're compelled to buy a PS4 as well, or risk losing your friends. This is the reason why Sony was and maybe still is reluctant to let PS4 gamers play with PC and Xbox gamers. The lack of crossplay is what keeps people using consoles, as well as familiarity. You'd be surprised how many people don't think they can play first person shooters with a keyboard+mouse.
Not to mention, exclusives for a console can be the killer app for a specific platform.
I swore I would never get another console until I realized Bloodborne was a PS4 exclusive, and then lucky me, Spider-Man released not long after I got one, and that was another fantastic PS4 exclusive; both titles plus the console were absolutely worth the cost.
Yea, I don't support the business practice of exclusives. If you bought a PS4 to play Bloodborne, then congratulations for buying outdated hardware to play one single game. No, sorry, 2 games, because Spider-Man. This is why I support emulators, though it seems Bloodborne might be making its way to PC. If Sony keeps this up then I won't have to wait for a PS4 emulator to play these exclusives. Though I guess we can't call them exclusives anymore.
So, "ease of use" and "exclusive titles", which is a large market share, pretty much defeats your argument on the business aspect.
The former is debatable but the latter is true. But again, exclusive titles are anti-consumer. Good for PS4 business model, but bad for me who just wants to game on my PC.
 
We've been through this: you don't have to build anything. Most people who game on PC are probably doing it on a laptop, and last I checked you can't build a laptop, though it would be amazing if you could.
Even if a pre-built system is bought, many people don't want the hassle of dealing with the OS and/or setup/maintenance of it all.
Consoles are much easier to manage overall for the average person - if you don't believe this, then you are giving the average person way too much credit; that isn't elitist thinking, it just is what it is from decades of experience.

Most people who game on PC or console aren't worried about difficulty of setting it up and using it. The main issue for gamers is if they can play with their friends. If your friends are using PS4 to play GTAV online and you're on PC then you're compelled to buy a PS4 as well, or risk losing your friends. This is the reason why Sony was and maybe still is reluctant to let PS4 gamers play with PC and Xbox gamers. The lack of crossplay is what keeps people using consoles, as well as familiarity. You'd be surprised how many people don't think they can play first person shooters with a keyboard+mouse.
That argument goes both ways, and there are games on PC that are not crossplay-capable with consoles, and vice-versa; that isn't an issue with the systems themselves, but more so the developers/producers/license holders of the games.
This is quite "anti-consumer" as well, and again, this argument goes both ways; not disagreeing with you.

Yea, I don't support the business practice of exclusives. If you bought a PS4 to play Bloodborne, then congratulations for buying outdated hardware to play one single game. No, sorry, 2 games, because Spider-Man. This is why I support emulators, though it seems Bloodborne might be making its way to PC. If Sony keeps this up then I won't have to wait for a PS4 emulator to play these exclusives. Though I guess we can't call them exclusives anymore.
Exclusives do tend to be annoying, but the silver lining is that exclusives tend to be optimized for one platform; Spider-Man on the PS4 is the pinnacle of optimization, and Bloodborne is a close second (its only flaw is the low-performing Jaguar CPU causing frame dips).
As for the "outdated hardware", I have no regrets playing two absolutely fantastic games on such a platform.

Granted, playing NES games on a console from the 1980s (with a CPU from the 1970s) is still, to this day, a fantastic experience, despite the games running on "outdated hardware".
If we were arguing if console or PC is more powerful, then I would agree with you on the "outdated (or inferior) hardware" aspect, but we are talking about exclusives.

Emulators are great, and I have been using them since the mid-1990s, and for those who only have a PC or Raspberry Pi or etc. then go for it - use what works.
Given the choice between an emulator and original hardware, if possible, I will take the original hardware any day, CRT SDTV included; this is a personal choice, though, and hardly dictates that one way to enjoy a game is better than another.

So really, it doesn't matter if the hardware is outdated as much as it is that the games themselves are fun.
While I don't care much for the fact that the PS4's CPU holds so many high-end titles back to 30fps, I will say for those few exclusives (until they are officially ported or re-released) it is absolutely worth it.

I would also rather have an exclusive title which is highly optimized for a single platform than a title which is ported to everything and barely optimized for anything; there are way too many examples of the latter.
There are good titles that are multi-platform, and still highly optimized, like Alien: Isolation; again, the argument can go both ways, and it really comes down to the devs and publishers.

But again, exclusive titles are anti-consumer. Good for PS4 business model, but bad for me who just wants to game on my PC.
They are, but it is the business world we live in, and exclusives are hardly a new thing, as they date back into the 1970s with both consoles and computer systems, years before the IBM PC was even a thing.
If you can't afford it, then emulate the hell out of it. :cool:
 
[Attached image: leaked Amazon listing for the PS5]

Looks like $500 for the one with the disc drive, $400 for the digital edition, if these leaks from Amazon are to be believed.
 