AMD: Native 4K Resolution at 30fps with PS4 Image Quality Requires 7.4 TFLOPS GPU

Megalith

According to AMD developer Timothy Lottes, a 7.4 TFLOPS GPU would be required to run the average PS4 title at 4K 30fps. Interestingly, both the Xbox One X and PS4 Pro fall below that spec, offering 6 TFLOPS and 4.2 TFLOPS respectively. Xbox fans are quick to point out that their console runs numerous titles at native 4K 30fps, such as Forza Horizon 4 and Gears of War 4.

Lottes did, however, note that this brief description oversimplifies a very complicated and layered process. The complication comes in the form of techniques like supersampling, checkerboarding, and temporal reconstruction, which developers in some cases use to render fewer than 4K's worth of pixels and then fill in the gaps.
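The 7.4 TFLOPS figure presumably comes from simple napkin math: the base PS4 delivers roughly 1.84 TFLOPS, and 4K (3840x2160) carries exactly four times the pixels of 1080p. A minimal sketch of that scaling, assuming (as the headline figure itself does) that GPU work scales linearly with pixel count:

```python
# Napkin math behind the 7.4 TFLOPS figure, assuming GPU work
# scales linearly with pixel count (an oversimplification, per Lottes).
PS4_TFLOPS = 1.84                    # base PS4 GPU compute
PIXELS_1080P = 1920 * 1080
PIXELS_4K = 3840 * 2160

scale = PIXELS_4K / PIXELS_1080P     # exactly 4.0
required = PS4_TFLOPS * scale        # 7.36, i.e. ~7.4 TFLOPS

print(f"4K needs {scale:.1f}x the pixels -> ~{required:.2f} TFLOPS")
```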
 
So, they'd need something on the order of a 1070 or 1070 Ti to pull it off.
 
Timothy Lottes? That guy was working at NVIDIA and was the main guy behind FXAA. If I remember correctly he then moved to Epic. Now apparently AMD...
 
Timothy Lottes? That guy was working at NVIDIA and was the main guy behind FXAA. If I remember correctly he then moved to Epic. Now apparently AMD...

2011-2013: NVIDIA
2013-2015: Epic
2015-present: AMD

Timothy Lottes is an optimization-obsessed developer working as part of the AMD Game Engineering team, with prior experience at Epic, NVIDIA, Human Head Studios, and ILM. He is based at AMD's Orlando office, working with the GPU hardware team, the Vulkan driver team, and directly with game developers.
 
Forza 7 is 4K60. So something is off here. And it's gorgeous.

I can imagine they might use some of the other techniques. Racing games traditionally have a blurry image near the sides as objects pass by at speed; with that in mind, you don't really need to render those portions of the screen at full resolution and can subsample them. So you're rendering some parts of the image at 720p, some at 1080p, and some at 4K.

The same way some VR goggles (in production, not sure if it's mainstream yet) use eye tracking to render the portions of the image you're looking at with higher resolution.
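For the curious, here's a minimal sketch of that idea, with made-up region boundaries and scale factors (real engines tune these per title): pick a render scale for each screen region based on its distance from the point of focus.

```python
import math

# Toy foveated/variable-resolution picker: regions near the focus
# point render at full resolution, regions near the edges at less.
# The distance thresholds and scale factors are invented for illustration.
def render_scale(tile_x, tile_y, focus_x, focus_y):
    dist = math.hypot(tile_x - focus_x, tile_y - focus_y)
    if dist < 0.25:   # central region: native 4K
        return 1.0
    elif dist < 0.5:  # mid region: roughly 1080p-equivalent
        return 0.5
    else:             # periphery: roughly 720p-equivalent
        return 0.33

# Screen coordinates normalized to [0, 1]; focus at the center.
for tx, ty in [(0.5, 0.5), (0.7, 0.5), (0.05, 0.9)]:
    print((tx, ty), "->", render_scale(tx, ty, 0.5, 0.5))
```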
 
The industry has set console gamer expectations way too high. 4K gaming on consoles with current tech is a joke. Why would I ever want to compromise with 30fps when 120fps/Hz is just so much better? Fuck that.

They're trying to sell the market on a subpar experience.
 
The industry has set console gamer expectations way too high. 4K gaming on consoles with current tech is a joke. Why would I ever want to compromise with 30fps when 120fps/Hz is just so much better? Fuck that.

They're trying to sell the market on a subpar experience.
It is simple: they can't bankroll hardware at consumer prices that does 60Hz or better at 4K. Samsung's 2018 sets are the only TVs that support a faster refresh rate.

Faster hardware needs more cooling and more power, and that will spiral the cost. If no one can afford your console, you are not going to sell games.
 
Forza 7 is 4k60. So something is off here. And its gorgeous.

It's not true 4K60. No current console title is. They're all cheating with checkerboarding/uprezzing. Racers are also among the least graphically demanding.

A 1080 Ti remains the only way to get legit 4K60 across the board. Anything less powerful and the developer has to cheat and cut corners to create a false perception - exceptions being low-poly cartoon games like Overwatch/Fortnite. But most console buyers don't really care - the psychological contentment of 4K printed on the box is good enough.
 
I could never see myself playing at 30 frames per second, no matter how good it looked. It wouldn't be a very fun experience for me.
 
I could never see myself playing at 30 frames per second, no matter how good it looked. It wouldn't be a very fun experience for me.

Yep. The first few hours of God of War at 4K30 were migraine-inducing. Switching over to 1080/60 was an incredible relief.
 
According to AMD developer Timothy Lottes, a 7.4 TFLOPS GPU would be required to run the average PS4 title at 4K 30fps. Interestingly, both the Xbox One X and PS4 Pro fall below that spec, offering 6 TFLOPS and 4.2 TFLOPS respectively. Xbox fans are quick to point out that their console runs numerous titles at native 4K 30fps, such as Forza Horizon 4 and Gears of War 4.

Lottes did, however, note that this brief description oversimplifies a very complicated and layered process. The complication comes in the form of techniques like supersampling, checkerboarding, and temporal reconstruction, which developers in some cases use to render fewer than 4K's worth of pixels and then fill in the gaps.
gaming at 30fps
lol
60 minimum or go home
 
There are some caveats that I have:

For single-player games I am reasonably OK with 60 FPS. When I play single-player games I'd rather play them in 3D Vision than any other way; it truly is that revolutionary.
For multiplayer FPS games I am happy with 99% of frames over 100 (see the sketch at the end of this post).

99% of frames over 100 with TAA on > 2K or 4K gaming at 60 or 30 frames

Even in the PC space people are struggling to run 2K titles at a consistent 100+ FPS.

Seems the console developers want to keep that user base in the 30-60 FPS range they are accustomed to.
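That "99% of frames over 100" criterion is just a 99th-percentile frame-time check. A minimal sketch, with invented frame times:

```python
# Check a "99% of frames over target fps" criterion: the 99th-percentile
# frame time must stay under the target budget (10 ms for 100 fps).
# The sample frame times below are invented for illustration.
def meets_target(frame_times_ms, target_fps=100, percentile=99):
    budget_ms = 1000.0 / target_fps
    ordered = sorted(frame_times_ms)
    worst = ordered[int(len(ordered) * percentile / 100) - 1]
    return worst <= budget_ms

sample = [7.5] * 990 + [12.0] * 10  # 1% of frames spike to 12 ms
print(meets_target(sample))         # True: the 99th percentile is 7.5 ms
```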
 
Your Xbox is about as powerful as that iGPU that comes in Intel processors. Pretty sure this is true.
The standard Xbox, kinda. But the One X's Polaris GPU carries more compute units than an RX 580 (40 on the Scorpio vs. 36 on the RX 580); it's basically a down-clocked "RX 590," were AMD ever to produce such a thing. Native 4K on the platform isn't plentiful, but there are a number of titles that do it. We are talking about graphical levels that hit a mix of medium and high on the PC, with better optimization. Track-focused driving games like Forza have an easier time hitting 60 because all the detail is in the car models and some of the more immediate track details; the rest of the environment really doesn't hold up to close scrutiny.
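For what it's worth, the compute-unit comparison checks out with GCN napkin math: each GCN compute unit has 64 shader lanes doing 2 FLOPs per clock (fused multiply-add). A sketch using the commonly cited clocks (the One X runs its CUs slower than a desktop RX 580, which is how 40 CUs still land at 6.0 TFLOPS):

```python
# GCN single-precision throughput: CUs x 64 lanes x 2 FLOPs/clock x clock.
def gcn_tflops(compute_units, clock_ghz):
    return compute_units * 64 * 2 * clock_ghz / 1000.0

print(f"Xbox One X: {gcn_tflops(40, 1.172):.2f} TFLOPS")  # ~6.0
print(f"RX 580:     {gcn_tflops(36, 1.257):.2f} TFLOPS")  # ~5.8 at base clock
print(f"PS4 Pro:    {gcn_tflops(36, 0.911):.2f} TFLOPS")  # ~4.2
```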
 
I only get 30fps or less in 4K when I do silly things...

 
Yep. The first few hours of God of War at 4K30 were migraine-inducing. Switching over to 1080/60 was an incredible relief.
Honestly I played the whole game at 30 because I didn't care for the highs and lows of the unlocked framerate. It didn't jump a lot, but it jumped enough to bug me.

Ironically I recently went back to Bloodborne, a game I’ve put hundreds of hours into, after playing hours of Dark Souls Remastered, and the much lower framerate was absolutely painful.

So, I dunno.
 
Forza 7 is 4K60. So something is off here. And it's gorgeous.

Have you seen the crowds? Racing games are almost 100% focused on the cars. Everything else lacks detail. Racing games look so "good" because of how much the developers control what gamers focus on. Even big open-world racers like the Horizon games do it. Cars are super detailed while everything else is not. Cars are also the only items in the game that need advanced physics calculations. Crowds are low-poly models with incredibly limited animation. Racing games are never as demanding as they look like they should be, due to the shortcuts developers are able to take.
 
Eurogamer is always keen to say that consoles are woefully CPU-bound. It definitely seems like the weak side of the equation.

I always see their YouTube (Digital Foundry) content and then shake my head at some of the opinions they vent about the console business and how the hardware could be improved by doing X or Y. The funny part is where they project what their improvements would gain, and all I'm thinking is: just say no to drugs, because it does not work like that. But then again, that never stopped anyone from posting stuff on YouTube...

Sad to see people who do have some talent waste it on speculation about how future hardware will perform.
 
It's not true 4K60. No current console title is. They're all cheating with checkerboarding/uprezzing. Racers are also among the least graphically demanding.

A 1080 Ti remains the only way to get legit 4K60 across the board. Anything less powerful and the developer has to cheat and cut corners to create a false perception - exceptions being low-poly cartoon games like Overwatch/Fortnite. But most console buyers don't really care - the psychological contentment of 4K printed on the box is good enough.
Even a 1080 Ti isn't going to be "across the board", but granted, it will get a lot of titles.
 
There are some caveats that I have:

For single-player games I am reasonably OK with 60 FPS. When I play single-player games I'd rather play them in 3D Vision than any other way; it truly is that revolutionary.
For multiplayer FPS games I am happy with 99% of frames over 100.
99% of frames over 100 with TAA on > 2K or 4K gaming at 60 or 30 frames
Even in the PC space people are struggling to run 2K titles at a consistent 100+ FPS.
Seems the console developers want to keep that user base in the 30-60 FPS range they are accustomed to.

Read the article, then realize how much more money it would cost to get hardware to that many TFLOPS. I'll spoil it for you: it is not $5.
And when 95% of your consumer base has a 60Hz TV, why would they suddenly do 120Hz?
Btw, if you go to Digital Foundry you can find some material on why a locked framerate is good at either 30 or 60...
 
Read the article, then realize how much more money it would cost to get hardware to that many TFLOPS. I'll spoil it for you: it is not $5.
And when 95% of your consumer base has a 60Hz TV, why would they suddenly do 120Hz?
Btw, if you go to Digital Foundry you can find some material on why a locked framerate is good at either 30 or 60...

A locked framerate is not universally good for every type of content. For sure that's the case with high-quality video, because it's not meant to be generated in real time. Games are completely different, at least the games I play and the way I experience them.

Shooters feel horrible below 90 fps, and even that is far from optimal when things get real hectic. There are Doom and Quake, and there's Uncharted. There's Barry White, and there's Jeff Mills.
 
It's not true 4K60. No current console title is. They're all cheating with checkerboarding/uprezzing. Racers are also among the least graphically demanding.

A 1080 Ti remains the only way to get legit 4K60 across the board. Anything less powerful and the developer has to cheat and cut corners to create a false perception - exceptions being low-poly cartoon games like Overwatch/Fortnite. But most console buyers don't really care - the psychological contentment of 4K printed on the box is good enough.
I think you are wrong on that. I think Forza 7 is native 4K60. Others have pointed out that they used a bit of clever development to get there (such as low-quality out-of-sightline textures), but either way I'm fairly certain it's native and, like I stated, it's beautiful. I could be wrong, but I'm kinda confident I read it's legit 4K with no uprezzing or checkerboarding. The PS4 Pro, on the other hand, uses those techniques for EVERY game - the XoneX is quite a bit more powerful, though.
 
I always see their YouTube (Digital Foundry) content and then shake my head at some of the opinions they vent about the console business and how the hardware could be improved by doing X or Y. The funny part is where they project what their improvements would gain, and all I'm thinking is: just say no to drugs, because it does not work like that. But then again, that never stopped anyone from posting stuff on YouTube...

Sad to see people who do have some talent waste it on speculation about how future hardware will perform.

Any particular points you remember? The occasional things I've seen I haven't found particularly disagreeable. I will say moving from eight circa-2013 tablet-class cores to eight desktop-class ones would be a huge leap.
 
Any particular points you remember? The occasional things I've seen I haven't found particularly disagreeable. I will say moving from eight circa-2013 tablet-class cores to eight desktop-class ones would be a huge leap.
Because you are thinking about a desktop CPU, and all the console CPU does now is send data to the GPU; that is where the horsepower is.
Why would that drastically change?

The dynamic is that you have a power-sufficient/efficient system geared toward the goal of the hardware. Nothing changes in that respect.
 
Because you are thinking about a desktop CPU, and all the console CPU does now is send data to the GPU; that is where the horsepower is.
Why would that drastically change?

The dynamic is that you have a power-sufficient/efficient system geared toward the goal of the hardware. Nothing changes in that respect.

Um, no, I am not going to allow you to continue with your completely unproven bullshit opinions on this topic, not again.
Nearly everyone, including myself, proved your ass wrong in this thread about this very topic:

https://hardforum.com/threads/sony-...rovements-possibly-for-playstation-5.1960852/

You had absolutely zero response and no retort other than your "look, this ONE game runs at 1080p @ 60fps, so obviously the system can do it" YouTube video, and then you became curiously silent when your claims had no backing and you had no evidence to support them.
You had absolutely nothing to base your claims on and did nothing but spew bullshit and insults - so I am telling you again, start providing some real evidence for your claims before going forward with this total bullshit you continuously talk about, especially before you embarrass yourself again.

The current consoles are very much CPU-bound, and if you need a refresher, please re-read that thread, specifically the last few pages, before we start going around in circles again.
No, the GPU does not do "everything" as you claim; the CPU must be sufficient to pass data to it, and without being able to do so, the system becomes CPU-bound. This is hardly a new concept and is very old hat at this point.

As I mentioned in that previous thread, and as others and I demonstrated with real-world evidence numerous times, try playing a modern AAA game on a GTX 1080 Ti paired with a Pentium 4, and watch the game run at 5-10fps, because that is as fast as that CPU can feed data to the GPU, never mind also having to run the game engine/AI/etc.
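The bottleneck logic is simple enough to sketch as a toy model (an illustration with invented frame times, not anything from that thread): the frame rate is capped by whichever of the CPU or GPU takes longer per frame, so a fast GPU behind a slow CPU is wasted.

```python
# Toy frame-time model: a frame can't complete faster than the slower
# of the two processors, so fps = 1000 / max(cpu_ms, gpu_ms).
# The millisecond figures below are invented for illustration.
def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

print(fps(cpu_ms_per_frame=8, gpu_ms_per_frame=10))    # 100.0 - GPU-bound
print(fps(cpu_ms_per_frame=150, gpu_ms_per_frame=10))  # ~6.7 - CPU-bound, a la Pentium 4 + 1080 Ti
```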
Either start providing real evidence for your claims or seriously just STFU.
 
Because you are thinking about a desktop CPU, and all the console CPU does now is send data to the GPU; that is where the horsepower is.
Why would that drastically change?

The dynamic is that you have a power-sufficient/efficient system geared toward the goal of the hardware. Nothing changes in that respect.

I'll concur with the poster above: you have got the CPU's role a bit wrong. It's far, far more than just bumping things to the GPU. When a console game is 4K60, that means it's not native 4K, and to get 60fps a lot of things were chopped back.
 
Um, no, I am not going to allow you to continue with your completely unproven bullshit opinions on this topic, not again.
Nearly everyone, including myself, proved your ass wrong in this thread about this very topic:

https://hardforum.com/threads/sony-...rovements-possibly-for-playstation-5.1960852/

You had absolutely zero response and no retort other than your "look, this ONE game runs at 1080p @ 60fps, so obviously the system can do it" YouTube video, and then you became curiously silent when your claims had no backing and you had no evidence to support them.
You had absolutely nothing to base your claims on and did nothing but spew bullshit and insults - so I am telling you again, start providing some real evidence for your claims before going forward with this total bullshit you continuously talk about, especially before you embarrass yourself again.

The current consoles are very much CPU-bound, and if you need a refresher, please re-read that thread, specifically the last few pages, before we start going around in circles again.
No, the GPU does not do "everything" as you claim; the CPU must be sufficient to pass data to it, and without being able to do so, the system becomes CPU-bound. This is hardly a new concept and is very old hat at this point.

As I mentioned in that previous thread, and as others and I demonstrated with real-world evidence numerous times, try playing a modern AAA game on a GTX 1080 Ti paired with a Pentium 4, and watch the game run at 5-10fps, because that is as fast as that CPU can feed data to the GPU, never mind also having to run the game engine/AI/etc.
Either start providing real evidence for your claims or seriously just STFU.
But he shakes his head at Digital Foundry, must be legit.
 
Um, no, I am not going to allow you to continue with your completely unproven bullshit opinions on this topic, not again.
Nearly everyone, including myself, proved your ass wrong in this thread about this very topic:

https://hardforum.com/threads/sony-...rovements-possibly-for-playstation-5.1960852/

You had absolutely zero response and no retort other than your "look, this ONE game runs at 1080p @ 60fps, so obviously the system can do it" YouTube video, and then you became curiously silent when your claims had no backing and you had no evidence to support them.
You had absolutely nothing to base your claims on and did nothing but spew bullshit and insults - so I am telling you again, start providing some real evidence for your claims before going forward with this total bullshit you continuously talk about, especially before you embarrass yourself again.

The current consoles are very much CPU-bound, and if you need a refresher, please re-read that thread, specifically the last few pages, before we start going around in circles again.
No, the GPU does not do "everything" as you claim; the CPU must be sufficient to pass data to it, and without being able to do so, the system becomes CPU-bound. This is hardly a new concept and is very old hat at this point.

As I mentioned in that previous thread, and as others and I demonstrated with real-world evidence numerous times, try playing a modern AAA game on a GTX 1080 Ti paired with a Pentium 4, and watch the game run at 5-10fps, because that is as fast as that CPU can feed data to the GPU, never mind also having to run the game engine/AI/etc.
Either start providing real evidence for your claims or seriously just STFU.
It ended with you not understanding what CPU-bound is. I told you about Diablo 3 and how that engine is not able to push anything but a single thread. You wanted me to prove what I said; you posted the specs for the game, but you still don't know anything, and you replied with "have you tried running Diablo 3 yourself on Jaguar?" I told you that the engine on the PS4 is drastically different.
https://hardforum.com/threads/sony-...-playstation-5.1960852/page-5#post-1043648885

There is a major difference in how console games and PC games are programmed. You can visit the D3 thread in the games forum and ask, instead of telling me to STFU.

I'll stick with Diablo 3: the engine is different, thus the result is different. It is major, not minor; one is single-threaded, the other is not. How difficult is that to understand? Do you want me to send you the source code?

Another reason for me not replying is that you ignore the painfully obvious fact that both console revisions have the same CPU, albeit slightly higher clocked, this generation. You also ignore that the PS4 Pro and Xbox One X do 4K, and sometimes the Xbox does it a little better, but that is not due to the CPU - or is it? Have you figured it out yet?

Maybe you need to read the topic; it spells it out for you...
 
It ended with you not understanding what CPU-bound is. I told you about Diablo 3 and how that engine is not able to push anything but a single thread. You wanted me to prove what I said; you posted the specs for the game, but you still don't know anything, and you replied with "have you tried running Diablo 3 yourself on Jaguar?" I told you that the engine on the PS4 is drastically different.
https://hardforum.com/threads/sony-...-playstation-5.1960852/page-5#post-1043648885

There is a major difference in how console games and PC games are programmed. You can visit the D3 thread in the games forum and ask, instead of telling me to STFU.

I'll stick with Diablo 3: the engine is different, thus the result is different. It is major, not minor; one is single-threaded, the other is not. How difficult is that to understand? Do you want me to send you the source code?

Another reason for me not replying is that you ignore the painfully obvious fact that both console revisions have the same CPU, albeit slightly higher clocked, this generation. You also ignore that the PS4 Pro and Xbox One X do 4K, and sometimes the Xbox does it a little better, but that is not due to the CPU - or is it? Have you figured it out yet?

Maybe you need to read the topic; it spells it out for you...
I get that games and software are programmed differently for different platforms; that is a given for any system/CPU/GPU/ISA/architecture/etc. I don't need to ask the game devs about this, as it is fairly common knowledge for any multi-platform/architecture software and has been this way since the 1960s - I work with software devs who can also verify this, and this is one point where I am not in disagreement with you.
However, you didn't answer my question in the post you quoted: "Also, I wanted to ask, how do you KNOW that Diablo 3 won't play on a Jaguar-based system (other than the consoles) - have you actually tried it yourself?"

You never answered that question, and earlier you vehemently denied that the Jaguar was an inferior CPU that couldn't run it on any platform other than the consoles - which also contradicts your opinion that the GPU does everything and the CPU simply passes information to it, so why wouldn't the Jaguar CPU be good enough on other platforms?
So again, how the hell do you KNOW this???

I really would sincerely like your answer on this one, and for brevity's sake, let's just stick with the Jaguar CPU and Windows as the example for this question.



EDIT: The point of the title (of the thread/article) is that the existing GPUs in the XBone X and PS4 Pro are not capable of rendering today's console games at a native 1:1 4K @ 60fps without graphical techniques/tricks.
The CPU will still hold most games back from that anyway, but the point this time is more focused on the GPUs themselves being the bottleneck for that specific task - this doesn't change the fact that the biggest bottleneck in those systems is the CPU, though.
 
It ended with you not understanding what CPU-bound is. I told you about Diablo 3 and how that engine is not able to push anything but a single thread. You wanted me to prove what I said; you posted the specs for the game, but you still don't know anything, and you replied with "have you tried running Diablo 3 yourself on Jaguar?" I told you that the engine on the PS4 is drastically different.
https://hardforum.com/threads/sony-...-playstation-5.1960852/page-5#post-1043648885

There is a major difference in how console games and PC games are programmed. You can visit the D3 thread in the games forum and ask, instead of telling me to STFU.

I'll stick with Diablo 3: the engine is different, thus the result is different. It is major, not minor; one is single-threaded, the other is not. How difficult is that to understand? Do you want me to send you the source code?

Another reason for me not replying is that you ignore the painfully obvious fact that both console revisions have the same CPU, albeit slightly higher clocked, this generation. You also ignore that the PS4 Pro and Xbox One X do 4K, and sometimes the Xbox does it a little better, but that is not due to the CPU - or is it? Have you figured it out yet?

Maybe you need to read the topic; it spells it out for you...
The PS4 Pro and Xbox 4K-whatever don't actually render at 4K, though.

Console games are made to be highly threaded and very efficient. That still doesn't come close to making up for having 6/7 Jaguar cores at 1.6GHz (1-2 are reserved for the OS/other functions).
 
The PS4 Pro and Xbox 4K-whatever don't actually render at 4K, though.

Console games are made to be highly threaded and very efficient. That still doesn't come close to making up for having 6/7 Jaguar cores at 1.6GHz (1-2 are reserved for the OS/other functions).
I'm sure they can actually render games at 1:1 4K @ 60fps, assuming it is a much older/retro/2D game, though.
But for the vast majority of existing console games, you are 100% correct. (y)

I do want to reiterate that I am not trying to knock the consoles themselves or the technology used within them, but simply pointing out the bottlenecks that currently exist in said technology.
 
I'm sure they can actually render games at 1:1 4K @ 60fps, assuming it is a much older/retro/2D game, though.
But for the vast majority of existing console games, you are 100% correct. (y)

I do want to reiterate that I am not trying to knock the consoles themselves or the technology used within them, but simply pointing out the bottlenecks that currently exist in said technology.

They're 2013-era tech with something like a 150W power budget. They do as much as that implies. Here's hoping the next wave goes balls-to-the-wall 300W like the PS3 and Xbox 360.
 
They're 2013-era tech with something like a 150W power budget. They do as much as that implies. Here's hoping the next wave goes balls-to-the-wall 300W like the PS3 and Xbox 360.
I certainly hope so. While the original consoles' CPUs were clocked at a lowly 1.6GHz (PS4) and 1.75GHz (XBone), the newer versions have increased the clock speeds marginally to 2.13GHz (PS4 Pro) and 2.3GHz (XBone X).
The increased clock speeds certainly help, but for the most part, especially on a lot of more limited titles, the most the higher clock speed allows for (taking the upgraded GPUs into account) is a more consistent 30fps with less stutter and/or lag.

In the previous thread, I mentioned FromSoftware's Bloodborne, a PS4-exclusive title that is about as highly optimized for that specific console as possible, and even on the PS4 Pro with Boost Mode enabled, it still only manages (a more consistent) 30fps @ 1080p.
Now, not all games are like this, and quite a few did get bumped up from 30fps to 60fps with Boost Mode and patches on the PS4 Pro, but 4K @ 60fps is another task entirely.

The TDP is certainly a limitation, but there is a cost envelope that both Microsoft and Sony have to stay within without going back to selling the consoles at a loss (loss-leader) again.
If the next generation can even hit 4K @ 60fps semi-consistently per title, I would call that a win, but the biggest win will be a fully consistent 1080p @ 60fps for all titles, at least as a minimum bar to set. :)
 