PlayStation 5 Rumored to Sport Ryzen 8-Core CPU, Cost $500

Actually, VRAM isn't a major bottleneck anymore. Modern consoles have a heap of VRAM for what they need compared to before. Back in the Xbox 360/PS3 days, VRAM was THE limiting factor, as consoles really only had 256MB; a single terrain file can take up that much space now. It's not infinite, but it isn't limiting much. The real bottleneck is CPU power. Moving to an 8-core Ryzen (even with SMT disabled) would more than quadruple the CPU power at hand, and be a near 8-times improvement with SMT on. By giving developers that much more CPU headroom, far bigger and more complex gameplay spaces can emerge.

Next-gen consoles will most likely have 10-16GB of RAM, but I doubt that will really affect visual quality much; the CPU power is what will completely change the industry.
Yes, exactly. (y)
I couldn't have said it better myself.
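For a rough sense of how a multiplier like that gets estimated, here is a toy calculation; the clock, per-clock throughput, and SMT figures below are my own assumptions for illustration, not official numbers, and the result depends entirely on them:

```python
# Toy estimate: aggregate CPU throughput ~ cores * clock * per-clock throughput (* SMT uplift).
# All factors below are illustrative assumptions, not benchmarks.
def throughput(cores, clock_ghz, perf_per_clock, smt_uplift=1.0):
    return cores * clock_ghz * perf_per_clock * smt_uplift

jaguar_8c = throughput(8, 1.6, 1.0)                  # PS4/X1-class Jaguar baseline (assumed)
ryzen_8c  = throughput(8, 3.2, 2.0)                  # assumed ~2x clock and ~2x per-clock work
ryzen_smt = throughput(8, 3.2, 2.0, smt_uplift=1.3)  # assumed ~30% gain from SMT

print(f"No SMT: {ryzen_8c / jaguar_8c:.1f}x  With SMT: {ryzen_smt / jaguar_8c:.1f}x")
# Roughly 4x and 5x with these particular assumptions; a figure closer to 8x would need
# larger assumed per-clock or SMT gains.
```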
 
>90% of console buyers have no idea how many cores they have nor do they care.
I think you underestimate the number of console sites and X1-vs-PS4 debates they read.
Or maybe I'm giving them too much credit for information retention... lol. It could honestly be either one at this point! :sour:
 

I suppose my original claim was that consoles were NOT the primary reason PC gaming has been stagnant, but most of these last posts have been responses about vRAM. I am not sure I have the energy for the CPU argument.
In many cases you are correct: consoles are not able to hit above 30 FPS at 1080p in some games due to CPU constraints. However, that is in no way holding back PCs from hitting higher FPS. I also don't see how a faster CPU in consoles would make PC game developers want to crank up the visuals; I always thought that came from GPU power and vRAM, including bandwidth (at the texture level).

edited for Peace
 
So come on guys, assuming that consoles are truly holding back better graphics on PC, which is it: CPU power or vRAM capacity that stops PC ports from being developed with a higher level of graphics?
Also, even if the PS5 and next Xbox get released next year, it will still be at least another year before they have the majority of the purchasing power. So sadly, it will still be at least two years before PC gamers can truly get the visual quality they demand from that shiny new RTX 2080 Ti. Maybe by that time, PC gamers will be able to play Red Dead 2.
 
Next Gen consoles will most likely have 10-16GB of RAM, but I doubt that will really effect much in terms of visual quality, the CPU power will be what completely changes the industry.
If we think about it from a Sony/M$ perspective, and factor in their love for re-releasing with improvements... given where process nodes are now, it's going to be cost- and thermal-prohibitive to touch the CPU much in a mid-gen refresh. So I think you may be on to something: they are going to aim for a huge jump in CPU performance up front, in order to give themselves WAY WAY more futureproofing than they got from the 8x Jaguar cores. Back then, they needed as much CPU horsepower for as little power consumption as possible, which at that point AMD just wasn't that good at. Had they been Intel cores (not Atom), it'd have been 4 cores, heh.

With Ryzen, though, and the nodes near their feasible limits, that leaves two avenues for mid-gen upgrades: more or faster RAM, and faster or bigger GPUs.

In time there might be a CPU revision that adds some optimizations, but overall I wouldn't see it contributing in any meaningful way. Though I suppose if there are big enough improvements on the uncore side, with latency, it may help. Still, doubling down on CPU power now will let them milk the next gen longer and make more money, is how I see it.

----------

As far as the other debate going on, I don't remember who is claiming what, but my personal opinion has been that since about the middle of the PS3/X360 product cycle, PC games have been very much held back by consoles, and that still holds true.

The GOOD news for us is that with the PS4/X1, we have consoles with 8 CPU cores, which means devs don't get to hide behind the excuse of "we just can't put in the time to code multi-threading into our titles", because now they HAVE to leverage all the cores :D

If this gen were to drag on for 4 more years, we'd be right back in that same stagnated situation... but it has VERY LITTLE to do with the capabilities of PC hardware, which has always been more than up to the task! No, it's because the number of console owners far outweighs the number of high-end PC gamers, so developers are more or less "doing us a favor" by porting their games to PC :( Sad, really, since the fidelity in gaming today has the PC to thank for getting us where we are. Otherwise we'd still be playing platformers... and I won't even touch on how sad it is that we ARE, due to the influx of indie games :shifty:

(Typo correction: "fedelity"... that'll teach me to post at 2am... *sigh*)
 
Again, could not have said this better myself. (y)
 
Yes, exactly. (y)
I couldn't have said it better myself.

Again, could not have said this better myself. (y)

Ok, so now you are saying that console CPUs are holding back PC ports. I guess the vRAM argument wasn't working out. The only thing the CPUs affect in modern titles is fps (AI too, but we are talking about graphics here). Despite consoles never having enough CPU to hit much above 60 fps, and usually being limited to 30 fps, 144 fps gaming has managed to become a thing on PC. Amazing what these developers can do!
 
I told you, I am done with you, and you need to stop.
You are putting words into my mouth and their mouths, and I do not appreciate it.

Ok I will put your own words in your own mouth....

Are you for real?
Console game development and PC ports have been almost directly tethered to one another since the PS3 and 360 - did the last 10+ years not happen for you?

PC games, in terms of graphical advancements, were absolutely stagnating towards the end of the PS3 and 360's life cycles, and once the PS4 and XBone were released, there was a dramatic jump in graphics on console-to-PC ports.
Look at the difference between Far Cry 3: Blood Dragon (2013) and Far Cry 4 (2014) - they are night-and-day different graphically.

The PS3 had 256MB of VRAM - compare that to the PS4 with ~6GB of unified RAM available to games (the other 2GB of the 8GB total was dedicated to the OS/background processes).
When the new consoles released, suddenly GPUs with "only" 1-1.5GB of VRAM weren't enough, and games, even at 1080p, were now requiring 2-3GB or more of VRAM at higher settings - and all of that happened within a year, circa 2014.

Unless you were blind to PC and console ports for the last 10-15 years, you need to seriously read up on some recent tech and game development history...

First, your claims about the vRAM revolution, despite the fact that developers still needed to sell a shit-ton of those games to older PS3 and Xbox 360 owners.

Actually VRAM isn't a major bottleneck anymore....

Next Gen consoles will most likely have 10-16GB of RAM, but I doubt that will really effect much in terms of visual quality, the CPU power will be what completely changes the industry.

Yes, exactly. (y)
I couldn't have said it better myself.

You then played cheerleader and shifted focus to CPU power, yet the fact remains that most games are still developed to play far above 60 fps despite what the consoles can do. You also continue to dismiss the fact that almost all games ship with multiple presets, including ultra, which is always more advanced than even the newest consoles can achieve. This right here completely negates your entire "consoles are holding back PC ports" argument.
 
Good lord - the last generation's weakness was lack of VRAM (this was my original point to you), and the current generation's weakness is lack of CPU processing power (the 30fps limitation).
Is that so hard to understand???

Many games are developed and optimized for PC and run wonderfully, like DOOM (2016) and Alien: Isolation (2014) at 4K and well above 60fps - I never denied this.
Where you are putting words into my mouth is by stating that I am supposedly saying "consoles are holding back all PC games across the board", which simply is not true, nor did I ever state it.

However, the consoles are the bar-setting standard for all PC games (that are not PC exclusives).
It has been this way since the 2000s and will continue to be into the future - I'm not making this up for my health, I'm just going off the history of the last 10-15+ years.
 

> You are older than I am - how do you not know this?!


Real mature.

Boom, there it is in bold. So PC ports can be optimized for the PC. Then the blame should be on developers being lazy and publishers being cheap and NOT the current specs of the consoles. Period. W1zzard came to this conclusion with Hitman 2 as well at TPU.
 
edit: Thank you
You know, you seem like a cool guy, and I didn't mean for this conversation to take such a downhill path.
Look, if you want to legitimately discuss this like adults and gentlemen (I'm including myself in that statement, too) then I'm willing to give this another go if you are, sincerely. :)
 
You know, I thought long and hard about this and the point you are trying to make.
But, using the two examples I threw out (DOOM and Alien: Isolation), it is true that games can be fully optimized for PC hardware *and* consoles simultaneously; that doesn't mean they always will be, but it is certainly possible.

While the consoles themselves are currently held back by their respective CPUs, that does not (or should not) hold PC ports/titles back from being optimized to run better - you were right, and you opened my eyes to this in a new way, thank you for that.
Sorry for the harsh comments, too; I stand corrected.



EDIT:
Thinking back to our examples, you did state that Far Cry 3 used more than 512MB of VRAM on PC, yet the PS3 version could not have used more than 256MB of VRAM since that was its hard limit.
That would make Far Cry 3 a "good" PC game, whether ported or developed natively.

The opposite end of that spectrum would be Rage, which was supposed to be developed specifically for PC back in 2011 but was changed at the last minute to be developed directly for the consoles.
The end result was a lazy (for real) port to PC with minimal VRAM usage, even at max settings and high resolutions, and tons of texture popping - this would be a prime example of what you are talking about with "lazy devs" and/or "cheap publishers", which at this point I would agree with.

Again, you were right and I was wrong.
Thanks again for pointing this out, and sorry I didn't understand sooner; it would have saved us both the headache over what you were trying to get across!
 

Likewise, enjoy your Thanksgiving.
 
I wish Xbox would just give up lol. Sony will kill them again. The reason Xbox can't get any exclusive titles is that no one is going to give up the hundred million dollars they can make with Sony. And the Japanese will never ever show loyalty to Xbox over the Japanese PlayStation.
Then you can look forward to the $1200 PlayStation 6.
 

I disagree. Devs are lazy, they always have been, and that is why consoles hold back PC development. It was never about vRAM, or CPU, or anything else; it has always been about devs developing for the largest demographic with the lowest average performance (consoles, not iGPUs), then sloppily porting it to PC. By raising that lowest average, things get pushed forward.

Yeah, some studios won't be lazy, but the reality is many are and will be. So moving that bar moves quality forward.

People get twisted up in specifications and lost in the weeds. They lose sight of how business operates: with a limited production budget, you optimize for the largest audience and the rest gets what they get.

Edit: how many games today do we still see with 420p or 720p textures for objects? Why make everything 1080p when consoles will need to checkerboard-render anyway just to hit 30fps?

I can't expect devs to stop being lazy, because lazy is the wrong word; they are businesses and operate to make a profit. I can't fault console manufacturers because they aren't devs. It's not anyone's fault, it's just a reality of business, so when consoles improve, the rising tide lifts all ships.
 
The only thing the CPUs affect in modern titles is fps (AI too but we are talking about graphics here).
Actually, that's not entirely true! :p Well, the result of it ends up affecting FPS in turn, but I digress; the CPU does more than that now, and that's THANKS* to consoles. *Not a thanks of appreciation, either.

Take Fallout 4, for example... There's AI and physics running off the CPU, as one would expect, but ALSO shadows! :\ These were once handled by the GPU in games, since it could calculate them much faster. Except with the consoles having less GPU horsepower, that meant leveraging the now-available abundance of CPU cores... sooo, they offloaded shadows onto the CPU *palm* It meant my much older (first-gen, Phenom II-based, Llano) AMD APU with an R9 390 8GB wasn't able to play at super high frame rates due to being CPU limited. As such, I ran it with Virtual Super Res (nigh-4K down to 1080p) with no change in FPS heh

Either way, the point is that a derpy thing done to accommodate the consoles impacted PCs, and we had no say in it (as in, no game option to use hardware shadows). I don't recall Skyrim doing shadows that way, as I used that same CPU with dual HD 5770s with no issues (besides the early Crossfire issue that a workaround was cooked up for, and that was eventually fixed).

Apologies if my post sorta went sideways... I had to interrupt typing it for 30 mins so my train of thought was kinda lost :(
 

Loads of assumptions about what can be done with a better CPU, but to be blunt, if it is all true, then your $500 PS5 cannot have both a full CPU (65W) and a full GPU (180W) and still only cost that much. If it is still an APU, it would be hard to cool as well, which would mean extra hardware for cooling.
 

After what Nightfire has pointed out in this thread, I am starting to agree with this more and more.
Don't get me wrong, I do still believe (for non-PC exclusive titles) that the consoles (of any generation) set the standard for games, game engines, and processing capabilities of CPU/GPU/RAM/etc.

However, I no longer believe consoles are what strictly hold back console-to-PC ports, or even games that are developed simultaneously for both.
DOOM (2016) is the best example of this, since it runs flawlessly on PC and scales wonderfully on the consoles and even, to a lesser extent, on the ARM-based Nintendo Switch.

Games like Rage (2011), though, were purely a case of lazy devs (John Carmack included, sadly) porting the game to PC, and this backs what you are saying 100%.
 

It's business, not hardware, always has been.
 
The next Xbox and PlayStation should technically be able to do 144Hz. The Xbox One already has adaptive sync and 120Hz support, but so far, 60Hz is the cap for the current consoles.

It is nice that some games on console have the ability to run with an uncapped framerate. Turning VSYNC off on the X1X and PS4 Pro in R6 Siege can yield nearly 100fps at times. Entirely CPU limited.

Once consoles are off Jaguar, I really hope 30fps dies in a fire.
 

Keep dreaming lol. 30 fps isn't going anywhere. Most people are casuals and don't care. It's the reality we live in, but feel free to keep dreaming and fantasizing.
 

We shall see. A lot of console friends of mine are aware, especially on the R6 scene. That is a 60fps game though.
 
I'd be thrilled if they just added a 30/60 fps toggle for 1080p where the graphical settings changed accordingly...

Next-gen consoles should be more than capable of this.

I'm really tired of "hardware limitations" being an excuse for not offering split-screen co-op in many games.

Like all the new Halo ones.

Same with Halo Wars 2, with their ridiculous unit cap.

Because the hardware can't handle rendering that many objects, supposedly.
 
We can all agree that CPU power needs to go up in order to really surpass the One X, but I was sort of curious about how much shared memory is really needed.

Techspot did a great article on how much system memory is needed in some cases: https://www.techspot.com/article/1535-how-much-ram-do-you-need-for-gaming/page2.html
A few things first:
1. Even on a clean system, Windows is chewing through at least 2 GB of system memory.
2. System memory looks to supplement video memory... up to a certain point.
3. To simplify the math, let's call the 1060's GDDR5 180 GB/s and the DDR4 system memory 60 GB/s.

[Image: b1.PNG]

The above example shows the 6 GB GTX 1060 seeing no penalty with 8 GB of system RAM, but a small penalty for the 3 GB model using the same 8 GB of system memory.
We can conclude that the game here is using less than 12 GB (6 GB + 8 GB - 2 GB for Windows) but more than 9 GB (3 GB + 8 GB - 2 GB for Windows) of COMBINED memory.

[Image: c1.PNG]

The above example is the extreme one. Here, the 3 GB 1060 takes a penalty on the minimums no matter how much system memory is there to "assist". We also see that both cards take a penalty with 8 GB of system RAM.
So we can say that CoD: WW2 is using less than 17 GB (3 GB + 16 GB - 2 GB for Windows) but more than 12 GB (6 GB + 8 GB - 2 GB for Windows).

At first this looks bleak for a One X with "only" 12 GB of shared memory, but it will most likely do better than a PC setup with 6 GB of vRAM and 8 GB of system RAM, since the system memory in the PC setup swaps textures at 1/3 the rate of the video card's memory.
In other words, the PC example is really only getting 6 GB of vRAM with maybe 2 GB [ (8 GB - 2 GB) / 3 ] of "help" from the system memory. And since the test setup with only 4 GB of system RAM runs single-channel (only 30 GB/s!), it essentially gives the GPU no help at all.
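For anyone who wants to play with these numbers, here is a rough sketch of the arithmetic above. The 2 GB Windows overhead, the 180/60/30 GB/s bandwidth figures, and the resulting 1/3 "help" ratio are the simplified assumptions from this post, not measured values:

```python
# Rough sketch of the combined-memory estimate above.
# Assumptions from this post (not measurements): Windows eats ~2 GB,
# the 1060's GDDR5 does ~180 GB/s, dual-channel DDR4 ~60 GB/s, single-channel ~30 GB/s.

WINDOWS_OVERHEAD_GB = 2

def combined_memory_gb(vram_gb, system_ram_gb):
    """Upper bound on the memory a game could be touching on a given setup."""
    return vram_gb + system_ram_gb - WINDOWS_OVERHEAD_GB

def effective_memory_gb(vram_gb, system_ram_gb, sys_bw_gbs, vram_bw_gbs=180):
    """VRAM plus system RAM discounted by its bandwidth relative to VRAM."""
    usable_sys = max(system_ram_gb - WINDOWS_OVERHEAD_GB, 0)
    return vram_gb + usable_sys * (sys_bw_gbs / vram_bw_gbs)

# The bounds quoted above:
print(combined_memory_gb(6, 8))   # 12 GB -> the game uses less than this (no penalty seen)
print(combined_memory_gb(3, 8))   # 9 GB  -> the game uses more than this (penalty seen)

# The "effective" memory comparison for the 6 GB card:
print(effective_memory_gb(6, 8, sys_bw_gbs=60))  # ~8 GB: 6 GB of vRAM plus ~2 GB of "help"
print(effective_memory_gb(6, 4, sys_bw_gbs=30))  # ~6.3 GB: single-channel 4 GB barely helps
```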
 

The PS5 doesn't really need to be that much more powerful than the PS4 Pro to sell if they have the exclusives lineup for it; after all, the Switch sold record numbers despite having less power than even the PS4/Xbox One, let alone their upgraded versions.

Besides, I find arguing Xbox One vs PS4 hardware specifications is an extremely fruitless exercise, because ultimately games dictate sales, not hardware. Your run-of-the-mill consumer who buys consoles generally couldn't care less about what's under the hood; they just want games that look pretty and play nice.
 
It matters if even casual gamers ever want those exclusives to get beyond 30fps.
Bloodborne and Spider-Man are two of my favorite games from this decade, and both are PS4 exclusives that run at 30fps.

While 30fps does not ruin the games by any means, having them at 60fps would have perfected the experience.
Even the original Dark Souls ran at 30fps on the PS3, but the remastered version on the PS4 runs at 60fps and looks quite a bit better than the original, even though they are both the same game.

You are definitely right about what you are saying, but that Jaguar CPU is just too underpowered to run the types of games that are being pushed on the PS4 and XBone.
The Switch is in a bit of a different scenario, since graphics take a back seat to gameplay - that isn't a bad thing, just a different approach, but it is one that the devs and publishers do not really take in most cases with the PS4 and XBone; instead they push graphics to the forefront, so we need a better CPU if we want more than 30fps at any resolution.
 
I will say this:

We went to Best Buy the other night and got the $199 PS4 Spider-Man bundle because of the other thread.

I am very impressed with how well it performs. Game ran very smoothly.

Definitely better than the base Xbox One, which I have slowly started to despise.
 

I mean, you're at the end of the product cycle. It was built using a several-year-old CPU and video card. What do you really expect? You don't play modern games at 4K on an 8-core FX chip with a Radeon HD 7850 GPU, do you (which is what I think the PS4/XB1 were compared to back in the day)?
 
I wouldn't say that all developers are lazy, but they do have to triage with whatever little time, manpower, and budget has been assigned to them.

If they aren't given enough of any one of those resources, some stuff will be left aside in favor of whatever is considered "top priority".

The blame in these cases I would put higher up, imho: either on upper management of the production or directly on the overseers/executives of the publishing company. The problem is that we as customers are unable to really know at the end of the day; we just feel, more or less obviously, that it could have been better.
 
Remember, the PlayStation 3 was supposed to be 8 cores, and how amazing that was at the time. They weren't even full general-purpose cores. A 16-core/32-thread Threadripper costs $410 on eBay today: 4.0GHz under full boost with 32 threads for ~$400.

It'd be fun to see a $1000 console built with that heart and a Vega 64-level GPU, and see what game devs could do with it over the course of a couple of years!
 
I don't know what exclusives you're playing. The only games designed for the lowest common denominator that I can think of are Call of Duty, Battlefield, and MOBA/Arena games. Those aren't console exclusives though.

Games like The Last of Us, Uncharted, God of War, Detroit: Become Human, Until Dawn, Bloodborne, and Hellblade: Senua's Sacrifice are just a handful that have deep plots and are difficult and challenging.

And some of the worst trends started on PC. Just to list a few:
- Pay to win
- Free to play
- Social games (Valve, Blizzard)
- MOBA
- Battle royale

And garbage concepts like matchmaking have been embraced by PC-oriented developers like Valve & Blizzard.

To be fair, the best trends and niches are also found on PC. But historically, PC has been where the innovation happens. There are certainly a few console exclusives that pushed the industry forward, but most of the innovation, good and bad, happens on PC.
 
kirbyrj and misterbobby are spot on.
The Jaguar was a low-power CPU meant for embedded systems and thin clients back in 2013, let alone in 2018+, so it was hardly ever a high-performance CPU!
 