PlayStation 5 Rumored to Sport Ryzen 8-Core CPU, Cost $500

Until you can show me something made by AMD that can compete with a GTX 1080 without consuming close to 500 watts, I call full on shenanigans to your post. You want to compare a 30w APU to a full blown next gen console GPU? LOL.

Fenghuang is a Ryzen-powered APU and is VERY impressive for the power draw. The One X is not Ryzen-powered but still does damn good for the power, landing somewhere between a GTX 1060 and a GTX 1070. Having a Vega 56-like GPU on one of these APUs seems very doable.

edit - if they just stick with Polaris and make it faster and bigger on 7nm with GDDR5X, it would make things MUCH easier for game developers. It is important to look at it from all points of view.
 
Personally I find consoles have had a negative impact on games. They used to have more depth, more plot, and be more intellectually challenging.

Now they seem designed for children with the attention span of a gnat, and I blame consoles for that.

Because of this I have absolutely no interest in playing any console exclusive. They are designed for the lowest common denominator.
Back in the day, the scope and difficulty of a game was what kept someone playing, as there just wasn't the graphical horsepower to make them visually complex. These days the immersion is a huge part of the game and takes a lot of talent to pull off, and talent = money. Money that now can't be put into other areas.

Not to say that there aren't some great examples of those still, but on average games are easier. This is my console-gaming friend's opinion as well.


I would expect them to go with more cores at lower clocks, as they can bump the clocks up every time they have a die shrink.
We'll assume you mean "node advancements" and not "shrinks". I think it's a solid bet that whatever PS5 has is going to be all 7nm, and I don't know how soon 5nm will come or how affordable it'll be for a simple half-gen refresh.
 
There is zero chance of maintaining 60fps true 4K in next gen games. The PS4 Pro can't even maintain 30fps at 1440p in some games right now. That means even if graphics didn't improve at all, you would be looking at needing a GPU two-and-a-half to three times faster, and that doesn't look likely at all. Factor in improved graphics and you're right back to struggling at 4K 30fps.
The real issue and the reason many games won't go past 30fps, be it at 720p or 4K, is because of the weak CPU more so than even the GPU (high-end consoles only).
The new CPU, and hopefully better GPU plus more unified RAM, will all help tremendously, but the CPU is the main bottleneck right now, at least with the current generation.
 
Well what would be the point in Sony releasing the PS5 with anything less powerful than a GTX 1080? The Xbox One X is already about as powerful as a GTX 1070.
I thought the GPU in the XBoneX was about as powerful as an RX 580, perhaps slightly faster, give or take.
That is quite a ways from a GTX 1070 (or GTX 980Ti) - not saying you are wrong or anything, I just didn't remember it being quite that powerful. :)
 
What rubbish. Maybe you should check out their Ryzen-powered APUs before making such ridiculous claims.
The current-gen consoles do not use off-the-shelf APU parts, as the GPUs offered in the consoles are far more powerful than what is offered in standard desktop/OEM APUs - assuming that is the point you were trying to make.
As for the next-gen consoles, though, it will be interesting to see what GPU will officially be used in the final product.
 
Bone and PS4 both went ~30W for the CPU and 60-90W for the GPU - half the power consumption of their predecessors at launch (so you could stand to be in the same room as them).

Expect the same for the PS5, so either 8 cores at 2.2 GHz, or 4 cores at 3 GHz. I would expect them to go with more cores at lower clocks, as they can bump the clocks up every time they have a die shrink.
That's exactly what I was originally thinking for the specs of the CPU in the PS5, but for some reason I thought the TDP was way higher for an 8-core Ryzen - must have been thinking of 8-core Bulldozer, heh.
As for the 8-core Jaguar in the original PS4 and Xbone, I thought the TDP was closer to ~50 watts?

Only reason I am saying this is that the 4-core Jaguar @ 2GHz in my sig has a TDP of 25 watts, so I would assume 8-cores @ 1.6/1.75GHz would be closer to a 40-50 watt TDP, and the higher-end consoles with 8-cores @ 2.13/2.3GHz would be around a 50-60 watt TDP.
 
Oh, I forgot to quote the person who said they figure it's going to be an APU...

But that does sorta track that it may be, as even AMD has gone on record saying that their APUs in (pretty sure it was said as 2019) would be as powerful as a console. Now that could very well be implied as "as powerful as" this gen, but it does kinda seem like foreshadowing given the circumstances...

Just a thought.

Personally I envision a low-to-mid 2GHz 8C no-HT part in the 25-35W range, with a discrete-on-package GPU and probably HBM2, though GDDR6 (or 5X at the least) sounds more feasible to hit budget. But if the PS4 launched at $400, I suppose it's very possible that at $500, it has HBM2.
 
Until you can show me something made by AMD that can compete with a GTX 1080 without consuming close to 500 watts, I call full on shenanigans to your post. You want to compare a 30w APU to a full blown next gen console GPU? LOL.

It's really apples and oranges. The XBone X can do 4K. It may be upscaled, but it's pretty smooth. The original XBone had local RAM on the chip package. Now with the interposer, you have unified memory management in the middle, HBM on one package, CUs on another, and 2 small Ryzen cores. I think you will be surprised how powerful this combination will be. They have been marching in this direction for a while and now the pieces are coming together.
 
The real issue and the reason many games won't go past 30fps, be it at 720p or 4K, is because of the weak CPU more so than even the GPU (high-end consoles only).
The new CPU, and hopefully better GPU plus more unified RAM, will all help tremendously, but the CPU is the main bottleneck right now, at least with the current generation.
The CPU is weak but it is NOT that weak. It has only a little less IPC than the Phenom II CPUs, so really, 7 cores at 2.1GHz dedicated to games is as fast or faster than most Phenom X4 CPUs and plenty of i3 and older i5 CPUs. To say it is the reason games will not go past 30 fps is silly.
 
That's exactly what I was originally thinking for the specs of the CPU in the PS5, but for some reason I thought the TDP was way higher for an 8-core Ryzen - must have been thinking of 8-core Bulldozer, heh.
As for the 8-core Jaguar in the original PS4 and Xbone, I thought the TDP was closer to ~50 watts?

Only reason I am saying this is that the 4-core Jaguar @ 2GHz in my sig has a TDP of 25 watts, so I would assume 8-cores @ 1.6/1.75GHz would be closer to a 40-50 watt TDP, and the higher-end consoles with 8-cores @ 2.13/2.3GHz would be around a 50-60 watt TDP.

https://www.notebookcheck.net/AMD-A-Series-A4-5000-Notebook-Processor.92867.0.html

15W at 1.5GHz, and that includes the integrated GPU. So 8 cores @ 1.6GHz without the GPU = 30W.

The dynamic power consumption of a CMOS circuit is proportional to frequency × number of cores × voltage squared. A higher frequency usually also requires an increase in voltage, so power consumption can go up VERY fast with frequency.

Doubling the frequency of a part can often mean doubling the voltage. That means 2x the frequency × (1.2V² / 0.6V²) = eight times the power (using the 22nm gate numbers, starting at the lowest activation voltage).

[Attachment: transistor shmoo plot (frequency vs. voltage)]


That's why adding cores is preferred over bumping frequency: it's pure linear increase in power.
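
To put some rough numbers on that, here's a quick Python sketch of the P ∝ cores × frequency × V² relationship - the constant and the 0.6V/1.2V figures are just the illustrative values from above, not measurements of any real chip:

```python
# Back-of-envelope dynamic power scaling: P ~ C * cores * f * V^2
# C and the voltages are illustrative placeholders, not measured silicon values.

def dynamic_power(cores, freq_ghz, volts, c=1.0):
    """Relative dynamic power of a CMOS part (arbitrary units)."""
    return c * cores * freq_ghz * volts ** 2

baseline = dynamic_power(cores=4, freq_ghz=1.6, volts=0.6)

# Doubling the cores at the same frequency/voltage: power scales linearly.
more_cores = dynamic_power(cores=8, freq_ghz=1.6, volts=0.6)

# Doubling the frequency, which (per the above) also needs ~2x the voltage.
more_clock = dynamic_power(cores=4, freq_ghz=3.2, volts=1.2)

print(more_cores / baseline)  # 2.0 -> linear cost of adding cores
print(more_clock / baseline)  # 8.0 -> 2x clock * (1.2/0.6)^2 = 8x the power
```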
 
The CPU is weak but it is NOT that weak. It has only a little less IPC than the Phenom II CPUs, so really, 7 cores at 2.1GHz dedicated to games is as fast or faster than most Phenom X4 CPUs and plenty of i3 and older i5 CPUs.
Well, an AMD 8-core Jaguar @ 2.3GHz is very similar in performance to an Intel Haswell i3 dual core @ 3.1GHz (with SMT enabled).
I even did the math on it a while back comparing the Jaguar directly to Ryzen:

I wouldn't just compare the GFLOPS rating, as floating point operations is only one part of the CPU (and GPU), and does not directly correlate with CPU performance in total, as integer functions and IPC both play a large factor.
This is one area where a synthetic benchmark can give a better rough guesstimate or ballpark of where CPUs and GPUs fit next to one another performance-wise in general.

The best comparison I can attempt to give between Jaguar and Ryzen is this:
https://www.cpubenchmark.net/compare/AMD-Ryzen-5-1500X-vs-AMD-GX-420CA-SOC/3001vs2121

Both the Ryzen 5 1500X and AMD GX-420CA are quad-cores, so we can do a nearly identical comparison of them, at least with IPC.
Please keep in mind that the synthetic benchmark's scores in the link above by itself doesn't mean anything specifically other than a placeholder for where each CPU falls performance-wise to one another.

The Ryzen 5 1500X quad-core @ 3.5GHz scored 10685.
The AMD GX-420CA (Jaguar) quad-core @ 2.0GHz scored 2299.

Now let's do the math on this!


So, if we want an apples-to-apples GHz to GHz comparison, we want to bring the Ryzen 5 1500X down to 2.0GHz as well, along with the score itself:
2.0 ÷ 3.5 = ~0.571428571
(2.0GHz is roughly 57% of 3.5GHz)

Now, we want to take the score of the Ryzen 5 1500X and bring it down in the same manner:
~0.571428571 x 10685 = 6106 (rounding up from a long decimal)
(scaling the Ryzen CPU's score down to ~57% of its original, matching the equally reduced clock speed, to allow a direct comparison)

So, a Ryzen 5 1500X quad-core @ 2.0GHz would have a score of 6106.
How does this compare in IPC in general performance-wise to the AMD GX-420CA?

6106 ÷ 2299 = ~2.655937364

Thus, the general performance core-for-core and clock-for-clock for the Ryzen 5 1500X quad-core @ 2.0GHz would be roughly 2.66 times faster than the AMD GX-420CA quad-core @ 2.0GHz.



Now, I should state that these scores do not specifically compare Whetstone (floating point) or Dhrystone (integer) operations or performance, and this is just a performance-in-general comparison.
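
If anyone wants to sanity-check the arithmetic, here's the same clock-normalization as a few lines of Python (the scores are the two benchmark figures quoted above; the assumption is that performance scales roughly linearly with clock speed):

```python
# Clock-for-clock comparison of the two quad-cores, using the scores above.
ryzen_score, ryzen_clock = 10685, 3.5    # Ryzen 5 1500X @ 3.5GHz
jaguar_score, jaguar_clock = 2299, 2.0   # AMD GX-420CA (Jaguar) @ 2.0GHz

# Scale the Ryzen score down as if it ran at the Jaguar's 2.0GHz,
# assuming performance scales roughly linearly with clock speed.
ryzen_at_2ghz = ryzen_score * (jaguar_clock / ryzen_clock)

print(round(ryzen_at_2ghz))                    # ~6106
print(round(ryzen_at_2ghz / jaguar_score, 2))  # ~2.66x, core-for-core and clock-for-clock
```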
Hope this helps! :D


To say it is the reason games will not go past 30 fps is silly.
Well, about that...
https://www.tweaktown.com/news/55032/ps4-pro-held-back-jaguar-cpu-heres-proof/index.html
https://www.forbes.com/sites/insert...-of-ps4-exclusivity-way-too-far/#331941a739ad
https://gearnuke.com/destiny-2-project-lead-confirms-30-fps-due-cpu-limits-evaluating-4k-xbox-one-x/

Direct from Bungie's Mark Noseworthy:
"We would never hold back game performance on a platform to appease a partner. No partners asked us to either. We optimize for each platform's tech to deliver the best social action game experience we can. Period. All consoles will run at 30fps to deliver D2's AI counts, environment sizes, and # of players. They are all CPU-bound. We are currently evaluating 4K for Xbox One X. It launches after D2 and so we are focused on the launch consoles right now."

This is the same with both Bloodborne and Spider-Man on the PS4 - regardless of PS4/Slim/Pro and resolution, the games will both only run at 30fps.
Since the devs forced this, and especially since these two games are super-optimized for, and are both exclusives for, the PS4/Slim/Pro, the CPU is primarily to blame.

I'm sure you understand the differences between "CPU bound" and "GPU bound", so this isn't directed at you and I'm just throwing the following info out for reference. :)


GPU bound:

720p - 100fps
1080p - 60fps
4k - 30fps
(As the resolution gets bigger, the GPU does not have the processing power to handle what is happening at larger resolutions without lowering the frame rate.)


CPU bound:
720p - 30fps
1080p - 30fps
4k - 30fps
(As the resolution gets bigger, regardless of how powerful the GPU is, the CPU can only provide it so much data since that is literally the limit of the CPU, regardless of the resolution.)
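
A crude way to picture the difference in code (the numbers are the made-up examples from above, purely illustrative): the frame rate you actually get is the minimum of what the CPU can feed and what the GPU can render at a given resolution.

```python
# Toy model: delivered fps = min(CPU-limited fps, GPU-limited fps at that resolution).
# All figures are the illustrative numbers from the post above, not real measurements.

cpu_fps_limit = 30                                     # how fast the CPU can feed frames
gpu_fps_by_res = {"720p": 100, "1080p": 60, "4k": 30}  # what the GPU can render

for res, gpu_fps in gpu_fps_by_res.items():
    delivered = min(cpu_fps_limit, gpu_fps)
    print(f"{res}: {delivered}fps")  # 30fps everywhere -> CPU bound

# Swap cpu_fps_limit for something like 120 and the GPU numbers become the
# limit instead (100/60/30) -> GPU bound, exactly the first table above.
```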
 
To say it is the reason games will not go past 30 fps is silly.
I want to also add that obviously there are quite a few games running on the PS4/Slim/Pro that run at 60fps, regardless of resolution.
Take Dark Souls Remastered - regardless of resolution, it runs at 60fps and the CPU is far from being a bottleneck with this game.

Bloodborne, however, is limited to 30fps regardless of resolution, and the CPU is not powerful enough to drive (send the GPU data) at more than 30fps with everything that is being processed.
Both games were developed by From Software as well, who are excellent programmers and developers.
 
https://www.notebookcheck.net/AMD-A-Series-A4-5000-Notebook-Processor.92867.0.html

15W at 1.5GHz, and that includes the integrated GPU. So 8 cores @ 1.6GHz without the GPU = 30W.

The dynamic power consumption of a CMOS circuit is proportional to frequency × number of cores × voltage squared. A higher frequency usually also requires an increase in voltage, so power consumption can go up VERY fast with frequency.

Doubling the frequency of a part can often mean doubling the voltage. That means 2x the frequency × (1.2V² / 0.6V²) = eight times the power (using the 22nm gate numbers, starting at the lowest activation voltage).

[Attachment: transistor shmoo plot (frequency vs. voltage)]

That's why adding cores is preferred over bumping frequency: it's pure linear increase in power.
You're correct - I was including the GPU within my CPU TDP estimate, and removing that, the TDP would be much lower, just as you stated.
Thanks for pointing that out to me and for the reference, I stand corrected!
 
The real issue and the reason many games won't go past 30fps, be it at 720p or 4K, is because of the weak CPU more so than even the GPU (high-end consoles only).
The new CPU, and hopefully better GPU plus more unified RAM, will all help tremendously, but the CPU is the main bottleneck right now, at least with the current generation.

No, it's because they are trying to control stutter. The game may run perfectly fine at 50Hz, maybe even 60Hz, but some interrupt or texture miss causes a massive delay on the draw call/frame render, and that creates a sudden frame rate drop.

Uneven frame rates actually look worse than 30fps frame rates. So a lot of developers cap it at 30.
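
For what it's worth, a 30fps cap is basically just a fixed frame budget with a sleep to pad out the fast frames - a hypothetical sketch (not from any real engine), just to show why the cadence ends up so even:

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3ms per frame

def run_capped(update_and_render, frames=300):
    """Cap a render loop at TARGET_FPS: fast frames sleep out the rest of the
    budget, so the cadence stays a steady ~33ms instead of swinging between,
    say, 16ms and 40ms (which reads as stutter)."""
    for _ in range(frames):
        start = time.perf_counter()
        update_and_render()                     # whatever the game does per frame
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # pad fast frames up to the budget

# Example: even a do-nothing frame gets paced at ~30fps.
# run_capped(lambda: None, frames=90)
```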
 
No, it's because they are trying to control stutter. The game may run perfectly fine at 50Hz, maybe even 60Hz, but some interrupt or texture miss causes a massive delay on the draw call/frame render, and that creates a sudden frame rate drop.

Uneven frame rates actually look worse than 30fps frame rates. So a lot of developers cap it at 30.
That is also a big factor in the 30fps cap.
Would definitely make sense for games like Spider-Man with such a massive draw distance and thousands of models/animations on screen at once.

For games like Bloodborne, though, it is because the CPU is not powerful enough (which probably causes more stuttering as per your point) - I will have to find From Software's statement again on this one as well.
 
It would have to be a true 8-core Ryzen to give a decent performance upgrade from a PS4 Pro or XBone X. They will need to target 4K 60fps for the PS5. It will be a laughing stock if it can't achieve that when the XBone X already does it in some games. A 4-core chip would be seen as a step backwards.

A 4-core Ryzen would easily outperform an 8-core Jaguar. Remember that the Jaguar is equivalent to an Intel Atom.

The Jaguar is 2-wide, the Ryzen is 6. That means the Ryzen can dispatch 4 more instructions per clock than the Jaguar.
 
A 4-core Ryzen would easily outperform an 8-core Jaguar. Remember that the Jaguar is equivalent to an Intel Atom.

The Jaguar is 2-wide, the Ryzen is 6. That means the Ryzen can dispatch 4 more instructions per clock than the Jaguar.
On paper, and even in reality, the numbers support a 4C/8T Ryzen being sufficient, I'm sure, but --and this really is a factor, I'd think-- when you consider the current console gamer's level of techno-knowledge... is it going to move enough units?
"Powered by 4x Ryzen Cores, capable of twice the power of PS4's 8x Jaguar Cores!"

They know what Cores are, and that "more is better"... they also know that marketing jargon doesn't always relate to real-world performance. So to them there's 1/2 as many cores, with lying marketing speak, so "it's going to be slower, or only as good as the PS4 Pro... and they want $500 for this?!"

That's why I think they're going to appeal to the masses with 8 real cores.
There's also the case that I think (and I thought this has been shown) a chip with 8 cores at a slower clock will still be more capable, draw less power, and more importantly produce less heat than a 4C/8T part that is running a bit faster (but still well under the speed-equals-more-voltage threshold) while having its cores run at far higher capacity due to utilizing HT. A situation of having 8 fit people pulling something heavy vs 4 macho people. They're up to the task, but who is going to want to go straight out to dinner with the macho guys who are now all sweaty from working way harder? lol (really bad analogy, but it's midnight... :LOL:)
 
I wish Xbox would just give up lol. Sony will kill them again. The reason Xbox can't get any exclusive titles is because no one is going to give up the hundred million dollars they can make with Sony. And the Japanese will never ever show loyalty to Xbox over the Japanese Playstation

Why on earth would any consumer want there to be less competition?
 
Personally I find consoles have had a negative impact on games. They used to have more depth, more plot, and be more intellectually challenging.

Now they seem designed for children with the attention span of a gnat, and I blame consoles for that.

Because of this I have absolutely no interest in playing any console exclusive. They are designed for the lowest common denominator.

Personally, I disagree entirely, but we're all entitled to our opinions. I play games that have great gameplay, art, stories, sound, etc. Games are meant to be fun and provide entertainment value, and perhaps challenge too. They can't all be masterpieces, nor are they all great, but there's enough variety out there to please everyone. There are plenty of challenging games, if you're into that; there is no shortage of them on the various platforms. It's all moot if you're set in your ways and unwilling to try, however.

I don't know what exclusives you're playing. The only games designed for the lowest common denominator that I can think of are Call of Duty, Battlefield, and MOBA/Arena games. Those aren't console exclusives though.

Games like The Last of Us, Uncharted, God of War, Detroit: Become Human, Until Dawn, Bloodborne, and Hellblade: Senua's sacrifice are just a handful that have deep plots, are difficult, and challenging.

Well said, and couldn’t agree more. We have access to an incredible assortment of games out there, if one can’t find something to play, perhaps he or she has outgrown the hobby.
 
The current-gen consoles do not use off-the-shelf APU parts, as the GPUs offered in the consoles are far more powerful than what is offered in standard desktop/OEM APUs - assuming that is the point you were trying to make.
As for the next-gen consoles, though, it will be interesting to see what GPU will officially be used in the final product.

Point is, their APUs are pretty great. As you pointed out, consoles don't use off-the-shelf parts, meaning their new SoC is probably going to rock. The person I replied to is basically a troll.
 
Stronger consoles can't come fast enough; being the lowest performer with the highest adoption, they hold everything else back.

I would like to see any sort of evidence of this. PC gaming didn't surge ahead when the Pro and One X were released. Let's ignore that PC hardware has been stagnant the last 2 years.

I suppose Smart phones are really holding PC gaming back as even more people game on those.

Please stop with this ridiculous talking point.
 
I would like to see any sort of evidence of this. PC gaming didn't surge ahead when the Pro and One X were released. Let's ignore that PC hardware has been stagnant the last 2 years.

I suppose Smart phones are really holding PC gaming back as even more people game on those.

Please stop with this ridiculous talking point.

No, I want hardware to move forward. When games are made for the mass market, they are optimized for what most users have, which is consoles. I get it, they are cheaper and provide a good-enough experience. Games look markedly better today than they did in 2012/13/14, at the EOL of the 360/PS3 and the beginning of the One/PS4.

It takes time for the developers to catch up; they caught up in 2015 with the ceiling of the One/PS4. So once this new generation comes out, we still have two years until we get those better titles - the sooner the better, imho.

PC hardware did a quantum leap past consoles with Pascal, so there has been an utter lack of pressure to move forward.

Leave mobiles out of it, they aren't a factor until the big titles also release on phones.
 
On paper, and even in reality, the numbers support a 4C/8T Ryzen being sufficient, I'm sure, but --and this really is a factor, I'd think-- when you consider the current console gamer's level of techno-knowledge... is it going to move enough units?
"Powered by 4x Ryzen Cores, capable of twice the power of PS4's 8x Jaguar Cores!"

They know what Cores are, and that "more is better"... they also know that marketing jargon doesn't always relate to real-world performance. So to them there's 1/2 as many cores, with lying marketing speak, so "it's going to be slower, or only as good as the PS4 Pro... and they want $500 for this?!"
>90% of console buyers have no idea how many cores they have nor do they care.
 
I would like to see any sort of evidence of this. PC gaming didn't surge ahead when the Pro and One X were released. Let's ignore that PC hardware has been stagnant the last 2 years.
Are you for real?
Console game development and ports have been almost directly tethered to one another since the PS3 and 360 - did the last 10+ years not happen with you?

PC games, in terms of graphical advancements, were absolutely stagnating towards the end of the PS3 and 360's life cycles, and once the PS4 and XBone were released, there was a dramatic jump in graphics on console-to-PC ports.
Look at the difference between Far Cry 3: Blood Dragon (2013) and Far Cry 4 (2014) - they are night and day levels of different graphically.

The PS3 had 256MB of VRAM - compare that to the PS4 with ~6GB of unified RAM (the other 2GB of unified RAM was dedicated to the OS/background processes - total of 8GB).
When the new consoles released, suddenly GPUs with "only" 1-1.5GB of VRAM weren't enough and graphics, even at 1080p, were now requiring 2-3GB or more VRAM at higher settings, and all of that happened within a year circa 2014.


I suppose Smart phones are really holding PC gaming back as even more people game on those.
That has nothing to do with anything other than the mobile market.
Unless we are getting PC ports of mobile games, which we don't, this is completely irrelevant.


Please stop with this ridiculous talking point.
Unless you were blind to PC and console ports for the last 10-15 years, you need to seriously read up on some recent tech and game development history...
 
Are you for real?
Console game development and ports have been almost directly tethered to one another since the PS3 and 360 - did the last 10+ years not happen with you?

PC games, in terms of graphical advancements, were absolutely stagnating towards the end of the PS3 and 360's life cycles, and once the PS4 and XBone were released, there was a dramatic jump in graphics on console-to-PC ports.
Look at the difference between Far Cry 3: Blood Dragon (2013) and Far Cry 4 (2014) - they are night and day levels of different graphically.

The PS3 had 256MB of VRAM - compare that to the PS4 with ~6GB of unified RAM.
When the new consoles released, suddenly GPUs with "only" 1-1.5GB of VRAM weren't enough and graphics, even at 1080p, were now requiring 2-3GB or more VRAM at higher settings, and all of that happened within a year circa 2014.

That has nothing to do with anything other than the mobile market.
Unless we are getting PC ports of mobile games, which we don't, this is completely irrelevant.

Unless you were blind to PC and console ports for the last 10-15 years, you need to seriously read up on some recent tech and game development history...

First off, Far Cry 3 could use way more than 512 MB of VRAM on the PC, so I am not sure what you are getting at there. Also, there are plenty of amazing looking PC games that came out during the PS3/360 era, such as CoD: MW, Bioshock Infinite and Crysis 3.
In addition, Battlefield Hardline was nothing special compared to BF4. Fallout 4 and ME: Andromeda were not exactly earth shattering. All of those games were released well into the PS4/XB1 era.

How about putting the blame on lazy developers? Explain to me why Hitman 2 does not even bother with DX12. It is basically a reskinned version of the 2016 game. Just because a game uses 8+ GB of VRAM does not mean it looks good.

Most importantly, does having games developed for 8GB of VRAM make the screwing over from developers like EA any better after playing Battlefront 2? Did that 8GB of VRAM in the new consoles get ya Red Dead 2?
 
Does that mean Ryzen CPUs will be better in the future when games are going to be designed around an 8-core CPU?
 
First off, Far Cry 3 could use way more than 512 MB of VRAM on the PC, so I am not sure what you are getting at there. Also, there are plenty of amazing looking PC games that came out during the PS3/360 era, such as CoD: MW, Bioshock Infinite and Crysis 3.
You completely missed the point of the original statement you made and the statement I made - that point being that consoles (of that era) severely held back development of higher-end graphics.
Bioshock Infinite wasn't a console-to-PC port, and was primarily developed for PC and consoles simultaneously.

I never said games didn't look good back then, I said the consoles held PC ports of said games back - remember Rage (2011) and all of the horrid texture popping?
You claim it is because of "lazy developers", but fail to realize that the real issue is due to companies wanting to pander to the masses with the lowest common denominator while making the highest sales, i.e. developing for consoles first and porting to PCs (Windows / OS X / Linux) as an afterthought, thus forcing those devs to comply.

Also, Crysis 3 looked like crap compared to the original - the original was specifically designed around high-end PCs (of that era) and later ported to consoles, which made it an exception, and a damn good one.

In addition, Battlefield Hardline was nothing special compared to BF4. Fallout 4 and ME: Andromeda were not exactly earth shattering. All of those games were released well into the PS4/XB1 era.
Battlefield Hardline was developed with the PS3 and 360 in mind, so there were still severe limits on what they could do with it compared to the PS4/XBone and PCs - aka a chain is only as strong as its weakest link, which is basically EA's motto, lame as they are.

Explain to me why Hitman 2 does not even bother with DX12. It is basically a reskinned version of the 2016 game. Just because a game uses 8+ GB of VRAM does not mean it looks good.
Um, pretty sure Hitman 2 does use DX12:


More VRAM normally means higher resolution textures (Fallout 4 HD texture pack, DOOM 2016 nightmare settings), so yes, in general, higher VRAM usage does equate to better-looking textures (not always, though).

Most importantly, does having games developed for 8GB of VRAM make the screwing over from developers like EA any better after playing Battlefront 2?
I don't care for EA either, but that sentence has nothing to do with anything we are discussing...

Did that 8GB of VRAM in the new consoles get ya Red Dead 2?
Other than the XBoneX, the consoles do not have 8GB of VRAM, they have a unified memory architecture with around 6GB dedicated (PS4 at least) to the game for RAM/VRAM/unified RAM.
 
That video is of Hitman 1. Hitman 2 uses DX11 only. The point of the BF2 and RDR comment is that console hardware is the least of PC gaming's issues. You are right about Crysis 3, but that game could chew through WAY more VRAM than the original, so where does that fit in? You claim Hardline was developed with the 360 in mind. That's funny since it can still use well over 2 GB. How about Advanced Warfare? That game was released well before Hardline and uses even more VRAM. Was that developed for the Xbox 360 as well??
 
Just because you are giving random examples does not prove your point. There is no correlation in your examples, and other examples can be used to disprove your theory.
 
That video is of Hitman 1. Hitman 2 uses DX11 only. The point of the BF2 and RDR comment is that console hardware is the least of PC gaming's issues. You are right about Crysis 3, but that game could chew through WAY more VRAM than the original, so where does that fit in? You claim Hardline was developed with the 360 in mind. That's funny since it can still use well over 2 GB. How about Advanced Warfare? That game was released well before Hardline and uses even more VRAM. Was that developed for the Xbox 360 as well??
Hitman: Sniper Assassin is part of Hitman 2 - get your facts straight.
From Wiki:

The game's announcement was accompanied by the release of a cooperative multiplayer mode titled Sniper Assassin, available immediately to those who pre-order Hitman 2. This mode is also bundled with all copies of Hitman 2 when the game releases.
Source: https://www.theverge.com/2018/6/7/1...ve-agent-47-announced-e3-2018-xbox-one-ps4-pc
Also: https://www.ign.com/articles/2018/06/07/hitman-2-sniper-assassin-is-deep-fun-and-full-of-potential

So yes, Hitman 2 does use DX12 - not saying it doesn't have issues, but it does utilize it, as evidenced by the video.
https://pc-mac-help.com/blog/fix-for-hitman-2-crashing-freezing-issue-on-pc
(recommendation to disable it to fix issues)

You are right about Crysis 3, but that game could chew through WAY more VRAM than the original, so where does that fit in?
Crysis 3 used a completely different engine, so textures and memory-usage differed greatly - that's where it fits in.
Do you not know how game engines work or can differ from one another???

For someone who touts to know so much about games and VRAM usage, you sure do know little about all of it...

Just because you are giving random examples does not prove your point. There is no correlation in your examples, and other examples can be used to disprove your theory.
I didn't give "random examples", I used the same exact examples of games you posted about. :meh:
Again, get your facts straight - much of what you have posted is totally incorrect or total bullshit.

You see, when I'm wrong, I can at least act like an adult and admit to my mistakes, or at a bare minimum, will state that I will look into it further to learn more, or find sources/links to back my claims.
Let's see if you can do the same thing, or just keep beating that dead horse.
 
Just because you are giving random examples does not prove your point. There is no correlation in your examples, and other examples can be used to disprove your theory.

I don't think you get it; as a game developer (a game artist, at least), this is not some hypothesis: this is fact. Console development targets hold back PC games. Game engines are designed around platform limitations. If you develop a game on, say, UE4, you first and foremost have to scope out what platforms you want to cater to. If you want to hit consoles + PC, there are a lot of PC-specific features that won't be easily available unless you have a team specifically in charge of getting those features working. This also goes for different consoles: if you're targeting both PS4 and XBone, there are a lot of features exclusive to one console that you'll need to spend extra time to specifically develop for if you want to include them.

Right now, the modern consoles are using a CPU that is effectively slower than a 2014 smartphone. Any computation that requires more single-threaded power than a 2.0GHz Jaguar core will cause slowdown on the console's part, and thus they are usually shaved down or removed. If that computation is highly scalable, sure, you can just give PC users the ability to 'turn it up' (this goes for things like cloth physics, crowd densities, foliage densities, cosmetic physics objects, etc.), but if that feature is either not scalable or is in any way crucial to the gameplay, then don't expect that task to be re-written just so PC users can have more of their CPU used.

In other words, PC games will always look better than console games, but it is usually by a fixed ratio. When console hardware improves, the graphics quality of PC games improve as well. This is not a hypothesis, this is not a guess or an idea, this is fact.
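
Just to illustrate the 'turn it up on PC' point from above, a purely hypothetical scalability table - the platform names, setting names and values are invented for the example, not taken from UE4 or any real engine:

```python
# Hypothetical scalability table: cosmetic, highly scalable features get
# per-platform values, while gameplay-critical work has to fit the weakest target.
SCALABILITY = {
    "console_base": {"crowd_density": 0.5, "foliage_density": 0.5, "cosmetic_cloth": False},
    "console_pro":  {"crowd_density": 0.7, "foliage_density": 0.8, "cosmetic_cloth": True},
    "pc_ultra":     {"crowd_density": 1.0, "foliage_density": 1.0, "cosmetic_cloth": True},
}

def settings_for(platform):
    # Unknown platform? Fall back to the lowest common denominator.
    return SCALABILITY.get(platform, SCALABILITY["console_base"])

print(settings_for("pc_ultra"))
print(settings_for("console_base"))
```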
 
Hitman: Sniper Assassin is part of Hitman 2 - get your facts straight.
From Wiki:


Source: https://www.theverge.com/2018/6/7/1...ve-agent-47-announced-e3-2018-xbox-one-ps4-pc
Also: https://www.ign.com/articles/2018/06/07/hitman-2-sniper-assassin-is-deep-fun-and-full-of-potential

So yes, Hitman 2 does use DX12 - not saying it doesn't have issues, but it does utilize it, as evidenced by the video.
https://pc-mac-help.com/blog/fix-for-hitman-2-crashing-freezing-issue-on-pc
(recommendation to disable it to fix issues)


Crysis 3 used a completely different engine, so textures and memory-usage differed greatly - that's where it fits in.
Do you not know how game engines work or can differ from one another???

For someone who touts to know so much about games and VRAM usage, you sure do know little about all of it...


I didn't give "random examples", I used the same exact examples of games you posted about. :meh:
Again, get your facts straight - much of what you have posted is totally incorrect or total bullshit.

You see, when I'm wrong, I can at least act like an adult and admit to my mistakes, or at a bare minimum, will state that I will look into it further to learn more, or find sources/links to back my claims.
Let's see if you can do the same thing, or just keep beating that dead horse.

Alright Mr. Factman. Hitman: Sniper Assassin, which you showed, had DX12. Hitman 2: Sniper Assassin will not have DX12. For Christ's sake, the pc-mac-help link that you gave shows DX11 only. Could you at least read your own links?!
Your argument about consoles holding back PC ports is old hat, and no one takes it seriously anymore. Ever notice that most games have 4 or 5 graphics presets? Would it be that much harder to make one more for last-gen consoles? Or easier yet, just have the 'low' settings mirror last-gen hardware, and have 'high' or 'very high' mirror current-gen hardware. VRAM requirements drop drastically with settings, even more so than with resolution. If a game uses 5 GB on ultra, it may use 1 GB or less on low.

Now you want to take the high road and "act like an adult", yet you are having a hissy fit and calling everything I posted bullshit. All that I claimed is that consoles have NOT held back PC ports from using more VRAM, and for some reason you can't seem to handle that.
 
Alright Mr. Factman. Hitman: Sniper Assassin, which you showed, had DX12. Hitman 2: Sniper Assassin will not have DX12. For Christ's sake, the pc-mac-help link that you gave shows DX11 only. Could you at least read your own links?!
I did read it, maybe you should, too - from that link: https://pc-mac-help.com/blog/fix-for-hitman-2-crashing-freezing-issue-on-pc
Solution4: Disable Direct X 12
As Hitman 2 recommended system requirement says to use DX11 so please disable DX12 to fix low FPS, graphics Issue and Stuttering.
FFS, it uses DX12 - this isn't even an argument, you are just flat out in denial.


Your argument about consoles holding back PC ports is old hat, and no one takes it seriously anymore.
Even though KazeoHin, a game developer, just posted that they do...

Ever notice that most games have 4 or 5 graphics presets? Would it be that much harder to make one more for last-gen consoles? Or easier yet, just have the 'low' settings mirror last-gen hardware, and have 'high' or 'very high' mirror current-gen hardware. VRAM requirements drop drastically with settings, even more so than with resolution. If a game uses 5 GB on ultra, it may use 1 GB or less on low.
You are going off on a tangent that has nothing to do with anything here, and those settings really depend on the game and engine used - your statement is not universally, or even generally, correct.

Now you want to take the high road and "act like an adult", yet you are having a hissy fit and calling everything I posted bullshit. All that I claimed is that consoles have NOT held back PC ports from using more VRAM, and for some reason you can't seem to handle that.
Everything you have posted is total bullshit, and a legitimate game developer is also calling you out on your bullshit as well.
Yes, consoles have always set the standard for game development, especially within the last 10-15 years more so than ever.

Your claims are totally invalid - I was there and lived the 2000s and remember very well how consoles held PC games and ports back severely during those times.
 
I did read it, maybe you should, too - from that link: https://pc-mac-help.com/blog/fix-for-hitman-2-crashing-freezing-issue-on-pc

FFS, it uses DX12 - this isn't even an argument, you are just flat out in denial.



Even though KazeoHin, a game developer, just posted that they do...


You are going off on a tangent that has nothing to do with anything here, and those settings really depend on the game and engine used - your statement is not universally, or even generally, correct.


Everything you have posted is total bullshit, and a legitimate game developer is also calling you out on your bullshit as well.
Yes, consoles have always set the standard for game development, especially within the last 10-15 years more so than ever.

Your claims are totally invalid - I was there and lived the 2000s and remember very well how consoles held PC games and ports back severely during those times.

From a real source: https://www.techpowerup.com/reviews/Performance_Analysis/Hitman_2/5.html
"Visual fidelity is good, certainly not the best we've ever seen from a PC title, slightly improved over the previous Hitman, but not in a dramatic way. What's a huge change is that unlike the 2016 version, Hitman 2 no longer supports DirectX 12. It looks like the developer didn't want to waste time and money on supporting both rendering APIs at the same time (they are fundamentally different in terms of development concepts). While it's certainly sad to see DX12 go, it looks like that's where the industry is heading. Most new titles just use the tested and well understood DirectX 11 API, probably to reduce development costs—there seems to be no place for a somewhat romantic notion of supporting the latest and greatest tech. No, game publishers want their titles out quickly so they can get return on their investment—nothing else matters."

Our Game 'Artist' friend is now making the argument that Jaguar is holding back visuals which is a different argument altogether.

Show me a single AAA game that does not have at least 4 presets. In pretty much all of them, a Chinese Teenager can play that game on Low settings with his Intel iGpu and 4 GB of shared DDR3 because that is the lowest common denominator.

I am sorry about your traumatic experience in the 2000s. However, the fact that there are no great PC exclusives like Crysis anymore and other games are being delayed shows that PC gaming is broken anyhow. This has nothing to do with consoles lacking vRam.
 
From a real source: https://www.techpowerup.com/reviews/Performance_Analysis/Hitman_2/5.html
"Visual fidelity is good, certainly not the best we've ever seen from a PC title, slightly improved over the previous Hitman, but not in a dramatic way. What's a huge change is that unlike the 2016 version, Hitman 2 no longer supports DirectX 12. It looks like the developer didn't want to waste time and money on supporting both rendering APIs at the same time (they are fundamentally different in terms of development concepts). While it's certainly sad to see DX12 go, it looks like that's where the industry is heading. Most new titles just use the tested and well understood DirectX 11 API, probably to reduce development costs—there seems to be no place for a somewhat romantic notion of supporting the latest and greatest tech. No, game publishers want their titles out quickly so they can get return on their investment—nothing else matters."

Our Game 'Artist' friend is now making the argument that Jaguar is holding back visuals which is a different argument altogether.

Show me a single AAA game that does not have at least 4 presets. In pretty much all of them, a Chinese Teenager can play that game on Low settings with his Intel iGpu and 4 GB of shared DDR3 because that is the lowest common denominator.

I am sorry about your traumatic experience in the 2000s. However, the fact that there are no great PC exclusives like Crysis anymore and other games are being delayed shows that PC gaming is broken anyhow. This has nothing to do with consoles lacking vRam.

Actually, VRAM isn't a major bottleneck anymore. Modern consoles have a heap of VRAM for what they need compared to before. Back in the Xbox 360/PS3 days, VRAM was THE limiting factor, as consoles really only had 256MB; a single terrain file can take up that amount of space now. It's not infinite, but it isn't limiting much. The real bottleneck is CPU power. Moving to an 8-core Ryzen (even with SMT disabled) would more than quadruple the CPU power at hand, and be a near 8-times improvement with SMT on. By giving developers that much more CPU headroom, huge amounts of gameplay space can emerge.

Next-gen consoles will most likely have 10-16GB of RAM, but I doubt that will really affect much in terms of visual quality; the CPU power will be what completely changes the industry.
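
Rough, speculative back-of-envelope on that "quadruple" figure, reusing the ~2.66x clock-for-clock number quoted earlier in the thread and assuming a hypothetical 3.0GHz clock for an 8-core Zen part (the clock is a guess, not a spec):

```python
# Speculative uplift estimate: 8 Jaguar cores @ 2.1GHz vs a hypothetical
# 8-core Zen @ 3.0GHz, reusing the ~2.66x per-clock figure from earlier.
jaguar_cores, jaguar_clock = 8, 2.1
zen_cores, zen_clock = 8, 3.0   # assumed clock, not a confirmed spec
ipc_ratio = 2.66                # Zen vs Jaguar, clock-for-clock (from the benchmark math)

uplift = (zen_cores * zen_clock * ipc_ratio) / (jaguar_cores * jaguar_clock)
print(round(uplift, 1))  # ~3.8x before SMT - right in the ballpark of "quadruple"
```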
 
From a real source: https://www.techpowerup.com/reviews/Performance_Analysis/Hitman_2/5.html
"Visual fidelity is good, certainly not the best we've ever seen from a PC title, slightly improved over the previous Hitman, but not in a dramatic way. What's a huge change is that unlike the 2016 version, Hitman 2 no longer supports DirectX 12. It looks like the developer didn't want to waste time and money on supporting both rendering APIs at the same time (they are fundamentally different in terms of development concepts). While it's certainly sad to see DX12 go, it looks like that's where the industry is heading. Most new titles just use the tested and well understood DirectX 11 API, probably to reduce development costs—there seems to be no place for a somewhat romantic notion of supporting the latest and greatest tech. No, game publishers want their titles out quickly so they can get return on their investment—nothing else matters."
I will give that to you, it does indeed look like Hitman 2 did at one point use DX12, as evidenced by the video and links I posted, but your links do show and state that it does not any longer - thanks for pointing that out.


Our Game 'Artist' friend is now making the argument that Jaguar is holding back visuals which is a different argument altogether.
KazeoHin never even mentioned the word "Jaguar" - you are putting words into his mouth and are denying facts from someone who works directly in the industry itself.
He expressly stated:
When console hardware improves, the graphics quality of PC games improve as well. This is not a hypothesis, this is not a guess or an idea, this is fact.
Please re-read his post again.

Show me a single AAA game that does not have at least 4 presets. In pretty much all of them, a Chinese Teenager can play that game on Low settings with his Intel iGpu and 4 GB of shared DDR3 because that is the lowest common denominator.
This has nothing to do with anything, please stop bringing up game graphic presets.

I am sorry about your traumatic experience in the 2000s.
LOL!

However, the fact that there are no great PC exclusives like Crysis anymore and other games are being delayed shows that PC gaming is broken anyhow. This has nothing to do with consoles lacking vRam.
Wat.
 