AMD: Native 4K Resolution at 30fps with PS4 Image Quality Requires 7.4 TFLOPS GPU

I'll stick with Diablo 3; the engine is different, thus the result is different. It is major, not minor: one is single-threaded, the other is not. How difficult is that to understand? You want me to send you the source code?
You know what, to hell with it, I'm going to answer my own question, the one I asked you earlier in this thread that you couldn't, and still cannot, answer:
https://hardforum.com/threads/sony-...-playstation-5.1960852/page-5#post-1043648885

You said:
You prove that what you say is not true: everyone that has run Diablo 3 can tell you it won't run on any Jaguar-core-based solution, certainly not at 1080p, and unless you use very low settings, hardly even at 720p.
You also fail to understand that, in your own chain of thought, it is CPU bound. Guess which game is CPU bound: Diablo 3. It will run like crap on Bulldozer and even Thuban; it does not like anything other than raw single-thread power.
You list theoretical specs, but you never have any idea of what is going on in the real world.

Then I asked:
Also, I wanted to ask, how do you KNOW that Diablo 3 won't play on a Jaguar-based system (other than the consoles) - have you actually tried it yourself?


Obviously you have not; however, I have, so here is your answer:
Yes, Diablo 3 will run on a system with a Jaguar CPU and a GT 1030 2GB GPU (GDDR5 version), specifically the one in my sig, which is a lowly HP T620 Thin Client, hardly a "gaming beast" and a far weaker system than the consoles, at around 35-45fps consistently at 2560x1080.

Not to mention I had Firefox and a few other apps running in the background, as you can clearly see, and in a dual-monitor setup.
You can also clearly see that the game, again running at 2560x1080, is running at medium-to-high settings and getting 35-45fps (the FRAPS counter is in the upper-left corner of the game) - not 720p at very low settings like you stated.

So, apparently, a quad-core Jaguar CPU @ 2.0GHz is indeed capable of running Diablo 3, and at a playable frame rate to boot.
 
This is a simple math problem, and I am quite sure Nvidia would come up with a similar result if you asked them.
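For what it's worth, the back-of-the-envelope version of that math is easy to reproduce if you assume the base PS4's roughly 1.84 TFLOPS GPU as the 1080p/30fps baseline and scale by pixel count (that baseline is my assumption, not AMD's published workings):

```python
# Rough GPU scaling estimate, assuming compute requirements scale linearly with pixel count.
BASE_TFLOPS = 1.84            # base PS4 GPU, taken here as the "PS4 image quality" baseline
BASE_PIXELS = 1920 * 1080     # 1080p
TARGET_PIXELS = 3840 * 2160   # native 4K, exactly 4x the pixels

required_tflops = BASE_TFLOPS * (TARGET_PIXELS / BASE_PIXELS)
print(f"Estimated GPU for native 4K at PS4 image quality: {required_tflops:.2f} TFLOPS")
# Prints about 7.36 TFLOPS, which lines up with AMD's 7.4 TFLOPS headline figure.
```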

Also, to think that developers are not using every last trick known to man to achieve their results is foolhardy at best... you can see for yourself when you look at screen captures side by side between the consoles and PC.


Or a Vega 56. Consoles don't use Nvidia...

Nintendo Switch does... Targa to be exact
 
The industry has set console gamer expectations way too high. 4K gaming on consoles with current tech is a joke. Why would I ever want to settle for 30fps when 120fps/Hz is just so much better? Fuck that.

They're trying to sell the market on a subpar experience.

I don't play fps. 120hz is nice to have but not needed.
For my gaming needs, 4k@60hz is far superior to 1080p@144hz
 
You know what, to hell with it, I'm going to answer my own question, the one I asked you earlier in this thread that you couldn't, and still cannot, answer:
https://hardforum.com/threads/sony-...-playstation-5.1960852/page-5#post-1043648885

You said:
You prove that what you say is not true: everyone that has run Diablo 3 can tell you it won't run on any Jaguar-core-based solution, certainly not at 1080p, and unless you use very low settings, hardly even at 720p.

Then I asked:
Also, I wanted to ask, how do you KNOW that Diablo 3 won't play on a Jaguar-based system (other than the consoles) - have you actually tried it yourself?

Obviously you have not; however, I have, so here is your answer:
Yes, Diablo 3 will run on a system with a Jaguar CPU and a GT 1030 2GB GPU (GDDR5 version), specifically the one in my sig, which is a lowly HP T620 Thin Client, hardly a "gaming beast" and a far weaker system than the consoles, at around 35-45fps consistently at 2560x1080.

https://orig00.deviantart.net/bb70/f/2018/196/f/5/diablo3_jaguar_cpu_by_redfalcon696-dchclxt.png

Not to mention I had Firefox and a few other apps running in the background, as you can clearly see, and in a dual-monitor setup.
You can also clearly see that the game, again running at 2560x1080, is running at medium-to-high settings and getting 35-45fps (the FRAPS counter is in the upper-left corner of the game) - not 720p at very low settings like you stated.

So, apparently, a quad-core Jaguar CPU @ 2.0GHz is indeed capable of running Diablo 3, and at a playable frame rate to boot, whodathunkit?!
Best $9.99 I have ever spent proving someone's arrogant ass wrong. :ROFLMAO:
Diablo 3 is a very forgiving game load-wise. If you zoom in, you can see they have sprites all over the place. It's mostly the art department and keeping a locked camera angle making it look more refined than it is. I'd be curious what its polygon count is for a typical scene. It looks like you can hit 60fps with as low as a GeForce GTX 460 on high quality settings.
 
The industry has set console gamer expectations way too high. 4K gaming on consoles with current tech is a joke. Why would I ever want to settle for 30fps when 120fps/Hz is just so much better? Fuck that.

They're trying to sell the market on a subpar experience.

For 99.1% of people it is good enough. Those who want frame rates above 60fps are truly a minority. I'd take added resolution and detail every time over more frames, unless it's a fighting game we're talking about, or a game where the extra responsiveness makes sense. Most people I know hook up their consoles with the analog A/V leads still, ffs.
 
For 99.1% of people it is good enough. Those who want frame rates above 60fps are truly a minority. I'd take added resolution and detail every time over more frames, unless it's a fighting game we're talking about, or a game where the extra responsiveness makes sense. Most people I know hook up their consoles with the analog A/V leads still, ffs.
I just wish those who want frame rates above 30fps weren't the minority.
 
I just wish those who want frame rates above 30fps weren't the minority.
I understand. There are those out there that want the option to push their expensive hardware (and monitors!) in the way they want to - more frames per second. You guys (and gals) just want the option to.
 
I understand. There are those out there that want the option to push their expensive hardware (and monitors!) in the way they want to - more frames per second. You guys (and gals) just want the option to.
It's more like the difference between 30 and 60 is pretty dramatic. I mean hell, even the NES was pushing 60fps in a lot of titles. It would be nice if 60fps was the standard to hit.
 
You know what, to hell with it, I'm going to answer my own question, the one I asked you earlier in this thread that you couldn't, and still cannot, answer:
https://hardforum.com/threads/sony-...-playstation-5.1960852/page-5#post-1043648885

You said:
You prove that what you say is not true: everyone that has run Diablo 3 can tell you it won't run on any Jaguar-core-based solution, certainly not at 1080p, and unless you use very low settings, hardly even at 720p.

Then I asked:
Also, I wanted to ask, how do you KNOW that Diablo 3 won't play on a Jaguar-based system (other than the consoles) - have you actually tried it yourself?

Obviously you have not; however, I have, so here is your answer:
Yes, Diablo 3 will run on a system with a Jaguar CPU and a GT 1030 2GB GPU (GDDR5 version), specifically the one in my sig, which is a lowly HP T620 Thin Client, hardly a "gaming beast" and a far weaker system than the consoles, at around 35-45fps consistently at 2560x1080.

https://orig00.deviantart.net/bb70/f/2018/196/f/5/diablo3_jaguar_cpu_by_redfalcon696-dchclxt.png

Not to mention I had Firefox and a few other apps running in the background, as you can clearly see, and in a dual-monitor setup.
You can also clearly see that the game, again running at 2560x1080, is running at medium-to-high settings and getting 35-45fps (the FRAPS counter is in the upper-left corner of the game) - not 720p at very low settings like you stated.

So, apparently, a quad-core Jaguar CPU @ 2.0GHz is indeed capable of running Diablo 3, and at a playable frame rate to boot, whodathunkit?!
Best $9.99 I have ever spent proving someone's arrogant ass wrong. :ROFLMAO:

Yeah, I was able to run it on a Kabini-based system as well :).

Then you start farming: when you have 40 to 50 mobs on screen, not in story mode, it tanks. We're not talking anything above 10 fps; we're talking about not moving off 1 fps.
Guess how I know this? Even on Kabini the framerate in story mode is good.
To compare: my desktop system, based on a Piledriver at 4.3 GHz, will drop frames like crazy; unless everything is on low you will not get a stable frame rate.
It wasn't until I got a Ryzen 1800X that I was able to turn settings up. And the video card I used is an R9 290X (in both systems).


And now I realize that I never explained how much more intensive the gameplay is when you do serious farming. I linked a video which only shows a bit of mild gameplay, which does not demonstrate it.

This one is more intense than the previous one.

That is where you will find that the game engine is CPU limited. For good measure, here's a PS4 YouTube video of someone farming.

And if I did not explain what scenario I was talking about when describing Diablo 3 gameplay, then I am sorry.

Diablo 3 used to have a system where it would display full damage numbers; it started to bog down the game so much that you can now only get abbreviated damage numbers, so they don't have an impact.
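For anyone wondering what "abbreviated damage" looks like in practice: instead of drawing every digit of a huge number, the game shows something like 1.2M. A hypothetical sketch of that kind of formatting (not Blizzard's actual code, just an illustration):

```python
def abbreviate_damage(value: int) -> str:
    """Shorten a damage number the way ARPG floating combat text typically does."""
    for threshold, suffix in ((1_000_000_000, "B"), (1_000_000, "M"), (1_000, "K")):
        if value >= threshold:
            return f"{value / threshold:.1f}{suffix}"
    return str(value)

print(abbreviate_damage(1_234_567))  # -> 1.2M
print(abbreviate_damage(845))        # -> 845
```

Fewer glyphs to draw per hit means less work when dozens of mobs are dying at once.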
 
It's not true 4K60. No current console title is. They're all cheating with checkerboarding/uprezzing. Racers are also among the least graphically demanding.

1080Ti remains the only way to get legit 4K60 across the board. Anything less powerful and the developer has to cheat and cut corners to create a false perception - exceptions being low poly cartoon games like Overwatch/Fortnite. But most console buyers don't really care - the psychological contentment of 4K printed on the box is good enough.
Forza 7 is native 4K60. They can get away with it because they are not rendering anything beyond the boundaries of the track, and the trackside detail is very low.

Not true about the 1080 Ti being the only card to do 4K60. You don't need to have all the details turned up to ultra with 8xSSAA to play a game.
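To put rough numbers on the "cheating" described above: with the common 2x1 checkerboard pattern, only about half of the 4K pixels are actually shaded each frame and the rest are reconstructed (the exact split varies per title, so treat this as a ballpark sketch):

```python
# Approximate per-frame pixel budgets; the checkerboard figure is the commonly cited ~50% approximation.
native_4k = 3840 * 2160               # ~8.3 million pixels shaded per frame
checkerboard_4k = (3840 // 2) * 2160  # ~4.1 million shaded, the rest reconstructed from previous frames
native_1080p = 1920 * 1080            # ~2.1 million pixels

print(f"native 4K:       {native_4k:,} px")
print(f"checkerboard 4K: {checkerboard_4k:,} px ({checkerboard_4k / native_4k:.0%} of native)")
print(f"native 1080p:    {native_1080p:,} px")
```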
 
The PS4 Pro and the Xbox 4K-whatever don't actually render at 4K though.

Console games are made to be highly threaded and very efficient. That still doesn't come close to making up for having only 6-7 Jaguar cores at 1.6GHz (with 1-2 cores reserved for the OS/other functions).

But they do render an image with 4K output. That they are using techniques such as upscaling to achieve this is a trivial point, because when you are chasing performance, no one is going to make an exception and simply not deliver a visually good representation.

https://en.wikipedia.org/wiki/PlayStation_4_technical_specifications#APU

The difference between the two systems is clear, and it is not the 400MHz-per-core CPU improvement on the PS4 Pro that makes the difference.
 
This is a simple math problem, and I am quite sure Nvidia would come up with a similar result if you asked them.

Also, to think that developers are not using every last trick known to man to achieve their results is foolhardy at best... you can see for yourself when you look at screen captures side by side between the consoles and PC.




Nintendo Switch does... Targa to be exact
Tegra to be exact exact
 
But they do render an image with 4K output. That they are using techniques such as upscaling to achieve this is a trivial point, because when you are chasing performance, no one is going to make an exception and simply not deliver a visually good representation.

https://en.wikipedia.org/wiki/PlayStation_4_technical_specifications#APU

The difference between the two systems is clear, and it is not the 400MHz-per-core CPU improvement on the PS4 Pro that makes the difference.

Ah, no, wait: they are using low-overhead hardware and software features to produce an *equivalent* to 4K. It remains that the 4K here is still not actual 4K. Not a million miles away, but not exactly close either.

The upgraded PS4 has a clock speed bump of 25%. That's not insignificant. PS4 Pro games tend to run at a slightly higher, more stable frame rate. The relationship here is hard to deny.

The biggest change from PS4 to PS4 Pro is the graphics power, and the biggest change in the games is the resolution. So far so good.

However, what doesn't really change is the framerate. It's better, sure, but it's not transformed, which is spookily similar to the CPU power increase.

The question, then, is: did they trade framerate for more resolution? Highly likely. Could they get a much higher frame rate at a lower resolution with the faster PS4 Pro hardware? Probably, but the consensus is that it would never be that much higher, because the CPU can't prepare tasks for the GPU quickly enough.

The signs are there in the differences in framerate and frame consistency between the base consoles and their upgraded counterparts: the game engines haven't changed in any meaningful way between versions, the performance profiles are the same, and just a few settings have been bumped up. Between the original and '4K' versions, the frame rates are highest in the same parts and struggle and splutter in the same parts, but where they tank in the latter the frametimes are more stable, which is the classic sign of reduced CPU limitation. In the rare instances in PC gaming these days where you are CPU limited (our CPUs are so much better than the console ones!), the game will 'feel' slower than FRAPS is indicating to you.

Going back to PS4 Pro games: if you want to hit that 60fps target, you have to significantly chop back the effort involved in advancing the engine one more frame before sending *anything* to the renderer. At some point, something a keen-eyed observer can notice gets the chop.

It's something you can explore yourself. Find a high-framerate console game and, if you don't own it yourself, load up a gameplay run-through of it on YouTube in high definition. There you will see the telltale signs: low interactivity with the environment (static objects and scenery), a narrow field of view, a limited number of AI instances, limited physics, and NPC dead bodies, decals, and debris that fade out after 30 seconds. For example, racing games are, technically speaking, typically just glossy tunnels. There's car physics going on, a few alpha effects, maybe a damage model, but everything outside the track is completely static and sparsely detailed. That's how the low time-to-draw-one-more-frame is achieved.
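To make the frametime-consistency point above concrete, here's a minimal sketch with made-up frame time samples: two runs can have the same average fps while the spread, which is what you actually feel, is completely different.

```python
from statistics import mean, pstdev

def summarize(frametimes_ms):
    """Return average fps, worst-case fps, and frametime spread for a list of frame times."""
    return 1000 / mean(frametimes_ms), 1000 / max(frametimes_ms), pstdev(frametimes_ms)

smooth = [33, 34, 33, 34, 33, 34]   # evenly paced ~30fps (typical GPU-limited behaviour)
spiky  = [20, 55, 22, 60, 21, 23]   # same average, big spikes (typical CPU-limited stutter)

for label, samples in (("smooth", smooth), ("spiky", spiky)):
    avg_fps, worst_fps, spread = summarize(samples)
    print(f"{label}: {avg_fps:.1f} fps avg, {worst_fps:.1f} fps worst, ±{spread:.1f} ms spread")
```

Both lists average just under 30fps, but the second one is the run that 'feels' slower than the fps counter suggests.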
 
Ah, no, wait: they are using low-overhead hardware and software features to produce an *equivalent* to 4K. It remains that the 4K here is still not actual 4K. Not a million miles away, but not exactly close either.

The upgraded PS4 has a clock speed bump of 25%. That's not insignificant. PS4 Pro games tend to run at a slightly higher, more stable frame rate. The relationship here is hard to deny.

The biggest change from PS4 to PS4 Pro is the graphics power, and the biggest change in the games is the resolution. So far so good.

However, what doesn't really change is the framerate. It's better, sure, but it's not transformed, which is spookily similar to the CPU power increase.

The question, then, is: did they trade framerate for more resolution? Highly likely. Could they get a much higher frame rate at a lower resolution with the faster PS4 Pro hardware? Probably, but the consensus is that it would never be that much higher, because the CPU can't prepare tasks for the GPU quickly enough.

The signs are there in the differences in framerate and frame consistency between the base consoles and their upgraded counterparts: the game engines haven't changed in any meaningful way between versions, the performance profiles are the same, and just a few settings have been bumped up. Between the original and '4K' versions, the frame rates are highest in the same parts and struggle and splutter in the same parts, but where they tank in the latter the frametimes are more stable, which is the classic sign of reduced CPU limitation. In the rare instances in PC gaming these days where you are CPU limited (our CPUs are so much better than the console ones!), the game will 'feel' slower than FRAPS is indicating to you.

Going back to PS4 Pro games: if you want to hit that 60fps target, you have to significantly chop back the effort involved in advancing the engine one more frame before sending *anything* to the renderer. At some point, something a keen-eyed observer can notice gets the chop.

It's something you can explore yourself. Find a high-framerate console game and, if you don't own it yourself, load up a gameplay run-through of it on YouTube in high definition. There you will see the telltale signs: low interactivity with the environment (static objects and scenery), a narrow field of view, a limited number of AI instances, limited physics, and NPC dead bodies, decals, and debris that fade out after 30 seconds. For example, racing games are, technically speaking, typically just glossy tunnels. There's car physics going on, a few alpha effects, maybe a damage model, but everything outside the track is completely static and sparsely detailed. That's how the low time-to-draw-one-more-frame is achieved.

But those are the trade-offs per engine and/or per game in some cases. You cannot make the hardware do more; if you can't manage that as a developer, you are missing a target that is attainable. DICE made the same kind of trade-offs with Battlefield 4; for each version of the game it is different: https://www.eurogamer.net/articles/digitalfoundry-battlefield-4-current-gen-face-off

There are graphical or performance limits per title on different hardware. Some games do not show a distinction, but you can guess that optimizations were not considered necessary, because just releasing something on both platforms with minimal effort is perceived as cost-effective.

And it comes down to this: if you have about 400 watts of power for the console to play with, giving the CPU a bigger percentage share is not going to increase your performance. If you took 8 desktop Zen cores at 65 watts out of that 400-watt total, you would get worse performance; with a smaller CPU share, the GPU would be able to do more. And if you decrease the total wattage available for your console, you can see that the CPU is (more or less) not going to be the deciding factor anyway.
I'm not saying that Zen cores can't be useful, but in the grand scheme of things they are not going to be at the same performance level as we have seen on the desktop (lower in frequency, maybe other things altered).
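A toy version of that zero-sum power-budget argument, using the 400 W total and 65 W Zen figure from above (the other wattages are made up purely for illustration):

```python
# Every watt the CPU takes out of a fixed console power budget is a watt the GPU can't use.
TOTAL_BUDGET_W = 400   # total envelope quoted above
OTHER_W = 60           # RAM, storage, fans, etc. (made-up figure)

for label, cpu_w in (("cut-down console CPU", 25), ("desktop 8-core Zen @ 65 W", 65)):
    gpu_w = TOTAL_BUDGET_W - OTHER_W - cpu_w
    print(f"{label}: {cpu_w} W CPU leaves {gpu_w} W for the GPU")
```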
 
Wait, this is an odd example to pick out and I'm not sure why it's here. But, well, OK. The 360 and PS3 are exotic, fully custom hardware with no similarities to the PS4. It's nowhere near where I was going and outside of all the points I was making. Coming back to this planet, albeit apparently briefly: the general crux of the issue with the PS4 and Xbox One is that the CPU desperately limits the depth and complexity of the game worlds. AI, physics, interactive objects and NPCs, 60fps, and the depth of simulation all suffer as it stands. Looking ahead, any current AMD or Intel desktop core will do more than Jaguar does now on the same manufacturing process at a given wattage. If it's Ryzen in the PS5, it'll do a lot more with its 50 watts than Jaguar would on 50 watts on the same process node, because Jaguar is outdated, old-hat, yesterday's tech, and yada yada yada. If you want to spend a bigger share of your power budget on the CPU in the next generation than in this generation, it's because it will allow far, far richer game worlds and 60fps at *the same time*.
 
Then don't fucking sell 4K if it's going to be subpar. Again, it's setting gamers up for a subpar experience. And there are 120Hz 4K displays out there; the line between "TV" and "display" doesn't matter anymore. They are the same tech now, just a different label.

It is simple: they can't bankroll hardware at consumer prices that does 60Hz or better at 4K. Samsung's 2018 sets are the only TVs that support a faster framerate.

Faster hardware needs more cooling and more power, and that will spiral the cost. If no one can afford your console, you are not going to sell games.
 
But 4K console gaming is 30fps, not 60. And even then, you don't need to play FPS games to tell the difference with 120Hz/144Hz and 120fps/144fps; it's tangibly smoother and a provably better experience.

But that still doesn't matter, because current consoles STRUGGLE to do 4K @ 30FPS already, let alone 60...

SUB. PAR.

I don't play fps. 120hz is nice to have but not needed.
For my gaming needs, 4k@60hz is far superior to 1080p@144hz
 
Console systems currently STRUGGLE to do even 30fps at 4K, never mind 60fps. It is a SUB PAR experience, and that's not even considering that 120fps at 4K is superior.

People who are okay with "good enough" _don't care about 4K_; they think 1080p, or hell, 720p, is _good enough_, as you say, and that is exactly the case.

But for 4K on consoles with the current gen, it's 30fps, and that is just shit for games.

For 99.1% of people it is good enough. Those who want frame rates above 60fps are truly a minority. I'd take added resolution and detail every time over more frames, unless it's a fighting game we're talking about, or a game where the extra responsiveness makes sense. Most people I know hook up their consoles with the analog A/V leads still, ffs.
 
Yeah, I was able to run it on a Kabini-based system as well :).

Then you start farming: when you have 40 to 50 mobs on screen, not in story mode, it tanks. We're not talking anything above 10 fps; we're talking about not moving off 1 fps.
Guess how I know this? Even on Kabini the framerate in story mode is good.
To compare: my desktop system, based on a Piledriver at 4.3 GHz, will drop frames like crazy; unless everything is on low you will not get a stable frame rate.
It wasn't until I got a Ryzen 1800X that I was able to turn settings up. And the video card I used is an R9 290X (in both systems).


And now I realize that I never explained how much more intensive the gameplay is when you do serious farming. I linked a video which only shows a bit of mild gameplay, which does not demonstrate it.

This one is more intense than the previous one.

That is where you will find that the game engine is CPU limited. For good measure, here's a PS4 YouTube video of someone farming.

And if I did not explain what scenario I was talking about when describing Diablo 3 gameplay, then I am sorry.

Diablo 3 used to have a system where it would display full damage numbers; it started to bog down the game so much that you can now only get abbreviated damage numbers, so they don't have an impact.


That makes a bit more sense, and I appreciate you being honest about it, really.
I can totally see how a Jaguar quad-core could be overwhelmed in that specific scenario, and most RTS or RTS-like games are heavily CPU-bound - the more cores, the better.

I will have to play through a bit more to see if I can get the fps to drop with numerous enemies/animations on screen with that system.
Thanks for the videos as well, that does help to explain what you were trying to get at. (y)
 
But 4K console gaming is 30fps, not 60. And even then, you don't need to play FPS games to tell the difference with 120Hz/144Hz and 120fps/144fps; it's tangibly smoother and a provably better experience.

But that still doesn't matter, because current consoles STRUGGLE to do 4K @ 30FPS already, let alone 60...

SUB. PAR.

I don't think current consoles are even meant to play 4K. It's just a bonus.
Also, in those 4K titles I played on my PS4 Pro (Horizon, The Last of Us), 30fps did not bother me.
 
I think the recent findings with VR, where dropping below 90fps causes motion sickness, are a pretty damn good argument for FPS > resolution.
 