Xbox Series X 4K 60 FPS Frame Rate "Standard," not "Guaranteed"

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
10,894
Trust it?

"Microsoft's next-generation Xbox Series X entertainment system has some pretty serious hardware specs, to support its lofty design goals of 4K UHD gaming at 60 Hz, with ray-tracing, and yet have 8K capability. It turns out that Microsoft isn't holding game developers to that 60 FPS number at 4K UHD, and that the minimum frame-rate is still 30 FPS. At lower resolutions such as Full HD, the console could offer high refresh-rate gaming. Apparently, the console natively displays 4K UHD at 60 Hz, and uses VESA adaptive-sync on TVs and monitors that support it; but game developers are free to cram in enough eye-candy to drive performance down to 30 FPS. This came to light when Ubisoft confirmed that "Assassin's Creed Valhalla" will run at 30 FPS on the Xbox Series X. "Developers always have flexibility in how they use the power, so a standard or common 60 FPS is not a mandate," said Aaron Greenberg, Xbox marketing head, in a Tweet Tuesday night."

https://www.techpowerup.com/267022/xbox-series-x-4k-60-fps-frame-rate-standard-not-guaranteed
 
I feel like everyone should have expected this to be the case. Developers and publishers are never going to prioritize framerate over graphics and shiny effects. Let's face it, the majority of people buying consoles don't care if a game is 60 or 30fps so studios are not going to make it their priority. Even when studios do focus on getting 60fps you can bet it will be done with dynamic resolution in most cases so they can avoid sacrificing graphics to reach it. The best we can hope for is that we'll see more stable frame rates since these systems will not have all the same bottlenecks as current consoles.
 
Ubisoft said the next AC is already 30fps.. Nice try Microsoft...
 
I don't buy consoles for fps, that's why I got my PC; on my console I expect a seamless experience, a complete ecosystem for gaming, and hassle-free time.
All I know is that MS is going all in with the specs, and it will be a nice upgrade from my current Xbox One X for less than the price of a mid-high end GPU.
 
Ubisoft said the next AC is already 30fps.. Nice try Microsoft...
People were here not long ago swearing up and down that Series X was going to be the 8K 120FPS 2080Ti killer because of RDNA2 magic dust.

Are you suggesting they should let the mushroom hallucinations wear off, then maybe take another look?
 
I feel like everyone should have expected this to be the case. Developers and publishers are never going to prioritize framerate over graphics and shiny effects. Let's face it, the majority of people buying consoles don't care if a game is 60 or 30fps so studios are not going to make it their priority. Even when studios do focus on getting 60fps you can bet it will be done with dynamic resolution in most cases so they can avoid sacrificing graphics to reach it. The best we can hope for is that we'll see more stable frame rates since these systems will not have all the same bottlenecks as current consoles.
This reminded me of a Digital Foundry video I watched just the other day of a guy that got Bloodborne to run at (nearly) locked 60fps by dropping the resolution to 720p and using boost mode on the Pro.

Visually it looked so much better even with the drop in resolution. Utilizing a dynamic resolution to hit a solid 60 is a worthwhile trade, I think. (Depending on how aggressively the resolution has to drop, of course.)
 
I personally feel that motion clarity matters more than resolution, and if some games have to use a dynamic resolution to achieve stable motion clarity then it's a worthy sacrifice.
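For anyone curious how that trade actually gets made at runtime, here's a minimal sketch of the feedback loop a dynamic resolution system uses. All the names and numbers are illustrative, not any engine's real API; real implementations (e.g. Unreal's dynamic resolution) use smoothed GPU timings and more careful headroom targets, but the core idea is just this:

```python
# Illustrative sketch of a frame-time-driven dynamic resolution controller.
# Nothing here is a real engine API; it just shows the core feedback loop.

TARGET_MS = 1000.0 / 60.0          # frame budget for 60 fps (~16.7 ms)
MIN_SCALE, MAX_SCALE = 0.6, 1.0    # e.g. roughly 1296p..2160p on a 4K output

def next_render_scale(scale, gpu_ms, headroom=0.9, step=0.05):
    """Nudge the resolution scale so GPU time stays under ~90% of budget."""
    if gpu_ms > TARGET_MS * headroom:
        scale -= step              # over budget: render fewer pixels
    elif gpu_ms < TARGET_MS * headroom * 0.8:
        scale += step              # comfortable headroom: claw quality back
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Simulate a GPU whose cost tracks pixel count: a heavy scene forces the
# scale down until the frame fits back inside the 16.7 ms budget.
scale, full_res_cost_ms = 1.0, 22.0
for _ in range(20):
    gpu_ms = full_res_cost_ms * scale * scale   # cost ~ pixels ~ scale^2
    scale = next_render_scale(scale, gpu_ms)
print(round(scale, 2))             # settles around 0.8 of native resolution
```

The point of the loop is that the player keeps a locked 60 while the resolution quietly absorbs the spikes, which is exactly the "motion clarity over resolution" trade described above.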
 
Feels like the Vista Ready & Vista Capable fiasco all over again

https://www.thefpsreview.com/2020/0...valhalla-will-run-at-60-fps-on-xbox-series-x/

As suggested by some Xbox Series X fans, Microsoft may want to put their foot down and mandate 60 FPS (or higher) for titles that flaunt the “Optimized for Series X” badge. “Games featuring the Optimized for Xbox Series X badge will showcase unparalleled load-times, heightened visuals, and steadier framerates at up to 120 FPS,” wrote the company – but that doesn’t mean much when a developer can slack off and still earn the privilege.
 
While 60 fps has remained elusive for a number of games on console, it's much more common on PC. Destiny 2 is locked at 30 fps even on the powerful Xbox One X, while players with higher-end PCs have been playing it at 60 fps for years.

And while Assassin’s Creed games have been traditionally capped at 30 fps on consoles, including even remasters of older ones, on PC they haven’t been, allowing players to decide on their own set of trade-offs between higher resolution and higher frame-rates.

https://kotaku.com/the-xbox-series-x-has-60-fps-as-standard-but-its-not-g-1843425714
 
I don't understand why they can't give us more control over graphics settings on consoles. I'm not saying open it up completely like a PC game, but at least let people choose with presets what to prioritize. Some games do this already, but it should become more commonplace.
 
People were here not long ago swearing up and down that Series X was going to be the 8K 120FPS 2080Ti killer because of RDNA2 magic dust.

Are you suggesting they should let the mushroom hallucinations wear off, then maybe take another look?

No, they were not, although I am sure you were claiming they were, back then.
 
I don't understand why they can't give us more control over graphics settings on consoles. I'm not saying open it up completely like a PC game, but at least let people choose with presets what to prioritize. Some games do this already, but it should become more commonplace.

It does seem odd in the age of dynamic resolution scaling that we don't have this happening, but if you've ever worked in software development you know that just getting one version of something working, properly, all the time is near impossible. That's why you have patches and fixes and so forth.

Consoles are fixed hardware, but asking developers to have a 30fps mode and a 60fps mode can effectively mean "make two games for the price of one"; it's not like you can say "ok, for 60fps run at this resolution and remove 1/4 of the trees in the game". Ok, which trees? What about if your guy/car/whatever comes over this hill, how many trees are rendered? You can't have the game chug to 15fps or 30fps because on Xbox anyhow you're vsync locked; that would make the system look bad, and Microsoft will kick your developing ass to the curb if you make their system look like a potato.....so most devs go "console users are fine with 30, 60 is a nice to have".

One thing also worth mentioning: If you are wholly accustomed to gaming at 30fps, 60fps can almost seem "wrong"......usually because on a console you'll have to limit more than just resolution to go from 30 to 60fps. Forza Horizon 4 is a good example: on Xbox One X you can go 1080p/60 or 4k/30, and I think those resolutions are locked, not dynamic.....but the 1080p/60 game has, I think, a simplification of either the lighting engine/shadows and a lack of some post processing....the game just looks...different. I have the choice and, even with my wheel, I usually stick to 4k/30 mode....1080p/60 is nice....buuuuut......I don't *need it* for this game. Funny thing is I can run this on my PC at 4k/60 easily......but running that long HDMI cable sucks so I just deal with 30, it's 'good enough'. That is really what you're up against here: console users are just used to 30fps, and if it's locked, it's always smooth enough. It's only when you see 60fps that you go "oh wait, I want that".

PS: By "wrong at 60fps" I mean people buy with their eyes......prettier at 30 will sell more games than faster but uglier at 60fps. If you are USED to 30fps gaming, sometimes seeing that same game at 60fps can feel like it's 'running too fast' at first. Trading up from 30 to 60fps is nice.... only trading *down* from 60fps to 30 would be regarded as "eww I don't like this, this sucks", so basically don't give them 60fps and let them live with 30. 30fps buys more pretty pictures = longer life for your system. We know all games will be at least 30fps locked, we know some will be 60fps......hopefully most...but we also know that ray tracing, the new elephant in the room, is going to phuck all that up because it's new and it's going to be one more thing that has to get patched/tested to hell and back.....for....ahem, what I expect to be minimal visual gains in many (most?) implementations.

I'm still laughing at the way people went batsheet over Battlefield V......in the middle of a warzone I don't want ultra-clean non-deformable puddles just so they can reflect properly, I don't need ultra-reflective marble floors when everything should be covered in dust and dirt, and I still can't even see footprints in the dust, etc, etc. Modern Warfare showed me that RT was 'barely distinguishable' from baked-in lighting when your time of day was fixed and your environment is non-deforming anyhow....
 
I don't understand why they can't give us more control over graphics settings on consoles. I'm not saying open it up completely like a PC game, but at least let people choose with presets what to prioritize. Some games do this already, but it should become more commonplace.

Some of it might be CPU issues on the current gen. Some games seem to not be fillrate limited, as you'd expect, but rather CPU limited. You see that with some of the ones that'll let you choose a lower rez and uncap, like FF15, where the FPS doesn't go up very much. That indicates it isn't fillrate but something else, and the CPU is a prime suspect, since it is not a strong CPU, and insofar as it is strong, you need good multi-threading to use it, something games don't tend to be great at.
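That diagnostic (FPS barely moves when resolution drops = not GPU-bound) can be shown with a toy model. The numbers below are made up purely for illustration; the only assumption is that a frame takes roughly the longer of the CPU's and GPU's work, and GPU cost scales with pixel count:

```python
# Toy model of why dropping resolution barely helps a CPU-bound game:
# frame time is roughly the max of CPU time and GPU time per frame.

def fps(cpu_ms, gpu_ms_at_native, res_scale):
    gpu_ms = gpu_ms_at_native * res_scale * res_scale  # GPU cost ~ pixel count
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound game: cutting pixels roughly doubles the framerate.
print(round(fps(10.0, 33.3, 1.0)))   # ~30 fps at native
print(round(fps(10.0, 33.3, 0.7)))   # ~61 fps at about half the pixels

# CPU-bound game (weak Jaguar-class cores): same resolution drop does nothing.
print(round(fps(33.3, 20.0, 1.0)))   # ~30 fps
print(round(fps(33.3, 20.0, 0.7)))   # still ~30 fps
```

Which is exactly the FF15-style symptom: uncapping and lowering rez only helps if the GPU was the thing holding the frame back.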
 
One thing also worth mentioning: If you are wholly accustomed to gaming at 30fps, 60fps can almost seem "wrong"......usually because on a console you'll have to limit more than just resolution to go from 30 to 60fps. Forza Horizon 4 is a good example: on Xbox One X you can go 1080p/60 or 4k/30, and I think those resolutions are locked, not dynamic.....but the 1080p/60 game has, I think, a simplification of either the lighting engine/shadows and a lack of some post processing....the game just looks...different. I have the choice and, even with my wheel, I usually stick to 4k/30 mode....1080p/60 is nice....buuuuut......I don't *need it* for this game.

PS: By "wrong at 60fps" I mean people buy with their eyes......prettier at 30 will sell more games than faster but uglier at 60fps. If you are USED to 30fps gaming, sometimes seeing that same game at 60fps can feel like it's 'running too fast'. Trading up from 30 to 60fps is nice; only trading *down* from 60fps to 30 would be regarded as "eww I don't like this", so basically "don't give them 60fps and let them live with 30" = the mantra, and 30fps buys more pretty pictures = longer life for your system.
No
 
People were here not long ago swearing up and down that Series X was going to be the 8K 120FPS 2080Ti killer because of RDNA2 magic dust.

Are you suggesting they should let the mushroom hallucinations wear off, then maybe take another look?
No
People were here not long ago swearing up and down that Series X was going to be the 8K 120FPS 2080Ti killer because of RDNA2 magic dust.

Are you suggesting they should let the mushroom hallucinations wear off, then maybe take another look?
Only fanbois were claiming it. It may be possible in simple indie games, but not AAA. I don't even believe HDMI 2.1 can do 8K/120, even leaving out HDR and going down to 8-bit.
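For what it's worth, the back-of-envelope math backs that up for uncompressed signals. The figures below count active pixels only (real video timings add blanking on top, so the true link rate is higher still):

```python
# Back-of-envelope bandwidth check for uncompressed 8K/120 over HDMI 2.1.
# Active pixels only; real CTA-861 timings add blanking, so actual link
# rates are higher still.

width, height, fps = 7680, 4320, 120
bpp = 8 * 3          # 8-bit RGB/4:4:4, no HDR metadata counted

data_gbps = width * height * fps * bpp / 1e9
print(round(data_gbps, 1))   # ~95.6 Gbit/s of raw pixel data

# HDMI 2.1's FRL link tops out at 48 Gbit/s raw (~42.6 Gbit/s effective
# after 16b/18b encoding), so uncompressed 8K/120 doesn't fit even at
# 8-bit; the spec only reaches 8K/120 by using DSC stream compression.
print(data_gbps > 48)        # True
```

So strictly speaking, HDMI 2.1 can carry 8K/120, but only compressed (DSC); uncompressed, it doesn't fit by a factor of about two.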
 
...to the part that devs stick to 30fps because it's easier to hit 30 consistently with more visual eye candy and thus get more life out of a given console generation? 'Cause that's indisputable. No to the part where, if you are used to 30fps, 60fps can seem weird or wrong? That's debatable, but I've had people express that to me. Sometimes seeing things at 60fps, depending on how the animation was done in a game or the pacing of the game itself, can make things feel artificial...the Soap Opera Effect is a good way to put it, but in gaming form. It has a lot to do with tertiary animations as well. For example, if rear-view mirrors lock to a lower-than-60 framerate in a driving game, or if animations of reflections are at 30.....things seem 'off'. If you're in an FPS where vegetation animations or the like are fixed to 15 or 30fps updates, things seem off....that's where I was going with it.
 
...to the part that devs stick to 30fps because it's easier to hit 30 consistently with more visual eye candy and thus get more life out of a given console generation? 'Cause that's indisputable. No to the part where, if you are used to 30fps, 60fps can seem weird or wrong? That's debatable, but I've had people express that to me. Sometimes seeing things at 60fps, depending on how the animation was done in a game or the pacing of the game itself, can make things feel artificial...the Soap Opera Effect is a good way to put it, but in gaming form. It has a lot to do with tertiary animations as well. For example, if rear-view mirrors lock to a lower-than-60 framerate in a driving game, or if animations of reflections are at 30.....things seem 'off'. If you're in an FPS where vegetation animations or the like are fixed to 15 or 30fps updates, things seem off....that's where I was going with it.
I think it's a combination of a lot of things. The most common is probably that they're used to it. The "soap opera" feeling could also come from the continued use of motion blur as framerate increases, when it's no longer needed. LOD can be an issue as well, but if the game was designed for an unlocked framerate the "tertiary" animations are usually at least interpolated to look smooth. The amount of interpolation needed could exacerbate the effect, though, like going from 15 FPS to 120.

I don't personally understand it, so I cannot say definitively if any of this is true.
 