Xbox 360 Frame Rates VARY when using different output resolutions.

Archaea
How many people thought the Xbox 360 internally rendered at 720p, and that regardless of the selected output resolution everyone should get the same frame rates?

I thought I'd read that in several places, but that doesn't seem to be the case, at least not in the Tomb Raider demo. I've heard rumors about possible differences, read a bit of info here, and decided to test it myself with the TR Legends demo for the Xbox 360. The differences are quite obvious: at least in this demo, the output resolution very much affects the frames per second.

Read about it here.

http://www.hardforum.com/showthread.php?t=1040374

Download the demo and test for yourself. The default resolution of 720p (1280x720) runs much slower and stutters, but if you run your Xbox 360 at 640x480 (selectable if you have the VGA adapter) it runs incredibly smoothly. It makes sense, but it goes against what I've read, and it's also disappointing because the native resolution of my HDTV projector is 1280x720, and now I will forever be itching to run my Xbox 360 at a lower, non-native resolution in order to get higher frame rates and smoother gameplay :(
 
720p is really low res. The 360 should be able to handle anything at a better res than that.
 
I thought the 360 did HD? I thought that was a big selling point and that is why everyone hates the Revolution? Now you're telling me it can't do 720p without stuttering? LMFAO!!!

If it can do HD it should be able to run every game at 1920x1080 perfectly with no stutter. What happened to not having to worry about framerate on consoles? I mean, the developers KNOW the EXACT hardware they have to work with, why is it so difficult to make it run properly?
 
Codex,

720P is HDTV

and 720P = 1280x720

and when I say stuttering I don't mean running unplayably, I mean it runs a bit slower than 30 FPS. It's still quite playable, and still looks good, but it doesn't run as smoothly at 1280x720 as it does at 640x480.


Your blanket Xbox 360 put-down, lack of valuable input, and misinformation aren't helpful, nor on topic.
 
Archaea said:
Codex,

720P is HDTV

and 720P = 1280x720

and when I say stuttering I don't mean running unplayably, I mean it runs a bit slower than 30 FPS. It's still quite playable, and still looks good, but it doesn't run as smoothly at 1280x720 as it does at 640x480.

I know it is, but so is 1080i, which I assume would be even more taxing. Also, you say it runs slower than 30fps? HD is 60fps...

SO, what I was saying was, at even the lowest HD resolution it still can't handle some games... apparently (according to what you just said) dropping to as much as half the native frame rate. Pretty sad.
 
CodeX said:
I know it is, but so is 1080i, which I assume would be even more taxing. Also, you say it runs slower than 30fps? HD is 60fps...

SO, what I was saying was, at even the lowest HD resolution it still can't handle some games... apparently (according to what you just said) dropping to as much as half the native frame rate. Pretty sad.

HD is not 60fps. You have interlaced and progressive output.

I wouldn't be so quick to blame the Xbox for failure to have acceptable frame rates in a demo.

It could very well be that the build of the TR demo is older and has a lot of debug code in it slowing it down...
 
Obi_Kwiet said:
720p is really low res. The 360 should be able to handle anything at a better res than that.

Don't get confused, there are definite limits to the 360's power. A lot of the games out now don't even have AF applied.


To the opening poster: I was reading through a thread about that issue earlier today (this one), and went on Google for some info. I was almost certain Microsoft said it was rendered at 720p internally, but, meh. I found this interview kind of odd:

http://www.microsoft.com/presspass/exec/rbach/05-16-05E3.mspx
PETER MOORE: So to create experiences that transform reality, well, that's really a lot to ask of any medium. To ensure that the next generation delivers, every game is built on what we call Essentials, a foundation of requirements every game must have before it can ship on the Xbox 360 platform. These essentials include a minimum of 720p high definition resolution to provide gamers with jaw-dropping visual clarity and fidelity. Many developers are already working on games in 1080i.

"Working on games in 1080i"? That seems to imply that the 1080i mode isn't just upscaling being done by the console itself. It's possible I'm reading too much into this, but take it for what it's worth.
 
Ok, 1080i IS 60fps, and the 360 can't even do full frame rate in 720p, let alone 1080i, happy :rolleyes:

My point remains the same, at even the lowest HD res and framerate the 360 fails to deliver, either that or the developers of most of the games out for it screwed up.
 
Strange, I do not notice any frame rate decrease in 720p; nearly everything is at least 30+ FPS.
 
Would it not depend on the HDTV that it is being played on? Many HDTVs are LCDs with different response times.

When I had my 360 hooked to my Dell 2405, I was getting low frame rates @ 720p playing Condemned and NFS:MW. I chalked it up to the panel response time which is a little high.

I am now playing on a 19" crt (while I wait for my fw900 to arrive :D ) and I have not noticed the slowdown on any of the resolutions offered through the VGA adapter.
 
CodeX said:
Ok, 1080i IS 60fps, and the 360 can't even do full frame rate in 720p, let alone 1080i, happy :rolleyes:

My point remains the same, at even the lowest HD res and framerate the 360 fails to deliver, either that or the developers of most of the games out for it screwed up.

1080i is 30 full frames per second (60 interlaced fields), and 1080p is 60 full frames per second.

http://en.wikipedia.org/wiki/1080i
 
CodeX said:
Ok, 1080i IS 60fps, and the 360 can't even do full frame rate in 720p, let alone 1080i, happy :rolleyes:

My point remains the same, at even the lowest HD res and framerate the 360 fails to deliver, either that or the developers of most of the games out for it screwed up.
I think you need to visit the AVS forum, or an actual audio/video section of some forums. 1080i will NEVER give you 60 frames per second, because each frame of animation is split into 2 fields, odd and even. Only 1 field can be displayed at a time, and it takes 2 field passes to display the whole picture, whereas in a progressive picture each frame of animation is shown as a whole, with both fields shown at once. Think of it this way:

Each - represents an odd or even line of data. Each number represents 1/60th of a second.

720p (progressive): both lines of data are shown at once.
Frame 1
1--
2--
3--

and so on... as you can see, each number (each 1/60th of a second) has BOTH lines, odd and even, displayed at once.

1080i (interlaced): only one line of data can be shown every 1/60th of a second, ultimately causing a 30 frames per second cap.
Frame 1
1-
2-
3-

As you can see, each 1/60th of a second contains only one line of data. So, in essence, a whole frame is created every 1/30th of a second, ultimately maxing out at 30 frames per second.
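If it helps, here's the same idea as a quick back-of-the-envelope Python snippet (purely my own illustration, assuming a 60 Hz refresh for both modes):

Code:
# Rough illustration only: both modes refresh 60 times a second, but an
# interlaced signal delivers only half of the picture (one field) per refresh.
def full_pictures_per_second(refresh_hz, interlaced):
    passes_per_picture = 2 if interlaced else 1  # odd + even fields vs. a whole frame
    return refresh_hz / passes_per_picture

print(full_pictures_per_second(60, interlaced=False))  # 720p  -> 60.0
print(full_pictures_per_second(60, interlaced=True))   # 1080i -> 30.0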
 
This issue may not be pertinent to all games, and I hope it's not... but it is obvious playing the TR Legends demo (I've not tried others yet). Download the demo and try this particular game against this theory. It's very noticeable. Try 1280x720 (or any other higher resolution) vs. 640x480. It's quite a big difference, and it has nothing to do with the LCD.

It doesn't matter what LCD you are using, only the output resolution... Your Xbox 360 doesn't know or care what display you are using; it only knows the output resolution.
 
Well it stands to reason that running a higher res will cause a performance hit. I see what you're saying though, supposedly all 360 games are rendered in HD and then downscaled for lower res displays? Maybe the devs found they couldn't get the performance they wanted in HD and modified the engine to allow 640x480 native output knowing full well that 90% of the market would never know the difference.

On the other hand, maybe it's just a buggy demo?
 
skittzle said:
Would it not depend on the HDTV that it is being played on? Many HDTVs are LCDs with different response times.

That doesn't make sense. By that logic, you'd expect to see frame rate drops on broadcast television as well. The frame rate being bad is because of the source, not the display.
 
skittzle said:
Would it not depend on the HDTV that it is being played on? Many HDTVs are LCDs with different response times.

When I had my 360 hooked to my Dell 2405, I was getting low frame rates @ 720p playing Condemned and NFS:MW. I chalked it up to the panel response time which is a little high.

I am now playing on a 19" crt (while I wait for my fw900 to arrive :D ) and I have not noticed the slowdown on any of the resolutions offered through the VGA adapter.
I see what you're saying, but that's just not possible. An LCD response time has nothing to do with the frames per second you're getting in a game. Now, you may experience control/sound lag due to scaling, and it may seem slightly blurrier than on another monitor, but it will not affect the frames per second you are getting.
 
Archaea said:
I mean it runs a bit slower than 30 FPS.
Very few things truly piss me off, and anything under 30 FPS is definitely one of them.
 
I see what you're saying, but that's just not possible. An LCD response time has nothing to do with the frames per second you're getting in a game. Now, you may experience control/sound lag due to scaling, and it may seem slightly blurrier than on another monitor, but it will not affect the frames per second you are getting.

I see.

Well now I know. ;)
 
So here's a strange one for you... I just tested it all tonight on my native-720p Panasonic HDTV projector.

With the VGA cable adapter, the resolution matters with regard to frame rates in this demo: 640x480 is super smooth, while 1280x720 has a bit of slowdown.

Granted, this is only the demo and not the real game, BUT with component video at 480p vs. 720p I couldn't tell a difference in frame rates.
480p is roughly 720x480, whereas 720p is 1280x720.

Why would the resolution matter with the VGA adapter but not with the component output? I must say 720p over component ran much better than 1280x720 over the VGA adapter.
 
Here is a question. Are there any games on the X360 that use v-sync to synchronize the game with the display to eliminate any tearing? If the game were synchronizing with the display, couldn't it have serious frame rate issues if there was a lot of different stuff going on in the game? The only game where I have seen some graphical tearing on my X360 is Madden 06. It also has really smooth frame rates in 720p. I don't know much about this, so I'm asking you guys who know about this stuff.
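Roughly what I'm picturing, as a quick sketch (just my guess at the math, nothing 360-specific): with v-sync on a 60 Hz display, a frame that misses the ~16.7 ms refresh deadline has to wait for the next refresh, so the frame rate would drop in steps instead of sliding smoothly.

Code:
import math

# Guess at how v-sync would quantize frame rate on a 60 Hz display (illustration only).
REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh

def vsynced_fps(render_time_ms):
    # The buffer swap waits for the next vertical blank, so the effective rate
    # snaps to 60, 30, 20, 15... depending on how many refreshes a frame takes.
    refreshes_used = max(1, math.ceil(render_time_ms / REFRESH_INTERVAL_MS))
    return REFRESH_HZ / refreshes_used

for t in (10.0, 20.0, 40.0):  # hypothetical render times in milliseconds
    print(f"{t:.0f} ms/frame -> {vsynced_fps(t):.0f} FPS with v-sync")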
 
Tomb Raider is a poor example. It wasn't optimized properly; I think it was rushed out the door.
 
junehhan said:
Here is a question. Are there any games on the X360 that use v-sync to synchronize the game with the display to eliminate any tearing? If the game were synchronizing with the display, couldn't it have serious frame rate issues if there was a lot of different stuff going on in the game? The only game where I have seen some graphical tearing on my X360 is Madden 06. It also has really smooth frame rates in 720p. I don't know much about this, so I'm asking you guys who know about this stuff.

I noticed with GRAW that the Ubisoft logo when you first turn on the game has some horrific tearing. Made me wanna cry.

After reading this thread, I'm glad I'm not very sensitive to frame rate changes. Maybe it's just cuz my eyes are bad, but I can't tell the difference between 30 and 40 FPS like some people can.
 
beanman101283 said:
I noticed with GRAW that the Ubisoft logo when you first turn on the game has some horrific tearing. Made me wanna cry.

After reading this thread, I'm glad I'm not very sensitive to frame rate changes. Maybe it's just cuz my eyes are bad, but I can't tell the difference between 30 and 40 FPS like some people can.

Interesting you noticed that.

On GRAW I too see the tearing on the ubi logo at the start. BUT on Far Cry they have the same ubi logo animation and there is no tearing. I wonder what the difference is....
 
Filter said:
Tomb Raider is a poor example. It wasn't optimized properly; I think it was rushed out the door.
I agree that TR was not optimized properly to run with the higher DX9 graphical settings. It was a PS2 and Xbox game they added DX9 features to and rushed out the door. The reason I say this is that I have not had frame rate issues with any other Xbox 360 games. Plus, the game runs at or above 30 FPS 95-99%+ of the time. The game runs worse on a PC with DX9 features. It's only when entering large areas with lots of new DX9 features (running water and reflections in large environments) that the game slows down for a few seconds. It doesn't really affect gameplay, because these are usually scenes that are just showing off the beautiful landscape. I just finished the game with 100% completion and this is my take. Fun game, BTW :).
 
I like how all these people seem to think their own personal Mark 1 Eyeball is the most highly calibrated visual testing implement available. "My Mark 1 Eyeball can determine FPS within 0.01% Accuracy! Therefore the 360 is crap!"

I am sorry to burst your bubble guys, but the human eye is not an accurate way to measure FPS. You can notice and perceive differences in FPS, but you can't plug in a game, watch it run, and say "That game is running at 35 FPS, with slight dips down to 25 FPS." You just can't perceive the video that accurately; what you can perceive are changes in the frame rate as the game progresses.

So please stop saying "The game can't keep 30 FPS" or something like that; instead just say that you notice slowdown at times.



P.S. I am not trying to say that the game doesn't slow down, or what you think is choppiness isn't really there. I am saying that you can't say what FPS the game is running at with any degree of accuracy. It's like sticking your finger on the heatsink and telling me what the CPU temp is, not worth shit.
 
Syphon Filter said:
Interesting you noticed that.

On GRAW I too see the tearing on the ubi logo at the start. BUT on Far Cry they have the same ubi logo animation and there is no tearing. I wonder what the difference is....

I noticed this on Far Cry for my PC when it came out.
 
Archaea said:
BUT with component video at 480p vs. 720p I couldn't tell a difference in frame rates.
480p is roughly 720x480, whereas 720p is 1280x720.
It sounds like when using the component cables it renders everything in 720p and then scales to whatever res you choose, while with the VGA cable it just directly renders the chosen res. That would make for the clearest picture on a monitor.
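Something like this, if I had to sketch that guess out (function names and behavior are made up, purely to show the two paths I mean):

Code:
# Purely a sketch of the guess above; nothing here is the 360's actual pipeline.
def component_path(output_res):
    # Guess: always render internally at 720p, then let a scaler produce the output mode.
    render_res = (1280, 720)
    return render_res[0] * render_res[1]  # GPU pixel workload per frame

def vga_path(output_res):
    # Guess: render directly at whatever resolution the VGA adapter is set to.
    return output_res[0] * output_res[1]  # GPU pixel workload per frame

for res in [(640, 480), (1280, 720)]:
    print(res, "component:", component_path(res), "vga:", vga_path(res))
# If this guess is right, the component path costs the same GPU work at 480p and 720p,
# while the VGA path gets much cheaper (and therefore smoother) at 640x480.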
 
Wish I could test, but I only have a regular TV and haven't picked up a VGA adapter for the Xbox.
 
I feel much dumber after reading some of this thread.

Can I just point out that there is no such thing as frames per second when you're talking about displays? Displays have refresh rates and response times. There's a big difference.

PC gamers have been playing at higher-than-HD resolutions for years, and we all know that when we run a game at those resolutions we expect a performance hit. Now whether that's a 2 FPS or a 40 FPS decrease depends on how well the game performs overall. In the case of all Xbox 360 titles, you will almost always get a drop in frame rates when going from SD to HD. When you increase the resolution you increase the number of pixels, and the more pixels there are to render, the more your GPU has to work, so you get a drop in performance. There is really nothing you can do about it. You can't blame Microsoft for poor performance in an Xbox 360 title that's played in HD. They have no control over developers not designing games to perform well in HD. The simple fact is most people that own a 360 probably don't have HD displays yet, so it's not cost-effective for developers to spend extra time over-optimizing their games; they will conform to the SD standard until they absolutely have to move to HD.
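Just to put numbers on the pixel argument (rough math only, not measured 360 performance):

Code:
# Pixels per frame at each output resolution, relative to 640x480.
resolutions = {
    "640x480":   (640, 480),
    "1280x720":  (1280, 720),
    "1920x1080": (1920, 1080),
}
base = 640 * 480
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.1f}x the 640x480 workload)")
# 640x480   ->   307,200 (1.0x)
# 1280x720  ->   921,600 (3.0x)
# 1920x1080 -> 2,073,600 (6.8x)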
 
pistola said:
I feel much dumber after reading some of this thread.

Can I just point out that there is no such thing as frames per second when you're talking about displays? Displays have refresh rates and response times. There's a big difference.

PC gamers have been playing at higher-than-HD resolutions for years, and we all know that when we run a game at those resolutions we expect a performance hit. Now whether that's a 2 FPS or a 40 FPS decrease depends on how well the game performs overall. In the case of all Xbox 360 titles, you will almost always get a drop in frame rates when going from SD to HD. When you increase the resolution you increase the number of pixels, and the more pixels there are to render, the more your GPU has to work, so you get a drop in performance. There is really nothing you can do about it. You can't blame Microsoft for poor performance in an Xbox 360 title that's played in HD. They have no control over developers not designing games to perform well in HD. The simple fact is most people that own a 360 probably don't have HD displays yet, so it's not cost-effective for developers to spend extra time over-optimizing their games; they will conform to the SD standard until they absolutely have to move to HD.

Well, they absolutely have to conform to HD right now, and always have had to. Microsoft controls which games can be released for the system, and Microsoft has mandated that all games must run in 720p at 30 FPS. They used to require 2xAA as well. Supposedly the 360 rendered everything internally at 720p and then a scaler chip scaled that image to the desired output resolution; that is what this discussion is about... whether or not everything is actually rendered at 720p.
 
I'm not an expert...but I think there's a CLEAR visual difference between games running at different resolutions. Tomb Raider and Full Auto *immediately* come to mind. I don't know their exact framerates, but there's a difference. You'd be blind not to notice one.
I dunno if it's 25FPS vs. 50fps or 10 vs. 20...but it's there.

As for 1080i only being 30 FPS, isn't that only half true? Other posts I've read indicate it's 60 half-frames (fields) per second, which can cheat to produce the equivalent of 60 FPS.

I'm not an expert or really even all that knowledgeable about this, but I know that when people are claiming that a game on the 360 is locked at 30 FPS... it's running a LOT smoother than a game doing that on my PC. I have no clue if it's the connection, the medium, or what... but it's definitely different. It might just be that it's LOCKED and doesn't fluctuate, too?

The last item, tearing (skipping v-sync), is a trick to get more FPS in console games that more and more companies are using. Of the 360 release games, about half of them abused it. God of War is another notorious offender; however, it at least REALLY pushed the PS2.
 
I feel much dumber after reading some of this thread.

Nice attitude. Here's an idea: instead of insulting the posters in this thread, go to a different thread that meets your elitist standards.

Problem solved.
 
I believe 720p is spec'd at 60 (fps/scans/refreshes, etc.) and 1080i is 60 but interlaced, making it effectively 30 real screen updates.

In the case of the Xbox, I'm sure it's trying to hold 60 in every game, but it's more or less just shooting for the 720p resolution at whatever frame rate it can get (be it 25, 30, or up to 60); it's not a fixed rate like a TV/video stream.
 
skittzle said:
Nice attitude. Here's an idea: instead of insulting the posters in this thread, go to a different thread that meets your elitist standards.

Problem solved.

If you're insulted by that, then you have extremely thin skin... lighten up. Not everything someone says like that is directed as an insult; I was merely breaking balls.
 
Blame the software devs and the effort put into their titles, not the hardware. We've seen crappy jobs done on every system ever released, and on the flip side we've seen the opposite.
 
Erasmus354 said:
I like how all these people seem to think their own personal Mark 1 Eyeball is the most highly calibrated visual testing implement available. "My Mark 1 Eyeball can determine FPS within 0.01% Accuracy! Therefore the 360 is crap!"

I am sorry to burst your bubble guys, but the human eye is not an accurate way to measure FPS. You can notice and perceive differences in FPS, but you can't plug in a game, watch it run, and say "That game is running at 35 FPS, with slight dips down to 25 FPS." You just can't perceive the video that accurately; what you can perceive are changes in the frame rate as the game progresses.

So please stop saying "The game can't keep 30 FPS" or something like that; instead just say that you notice slowdown at times.



P.S. I am not trying to say that the game doesn't slow down, or what you think is choppiness isn't really there. I am saying that you can't say what FPS the game is running at with any degree of accuracy. It's like sticking your finger on the heatsink and telling me what the CPU temp is, not worth shit.


Maybe not exactly spot on... but nobody said spot on "25 fps". I've been playing PC games since 1991, starting with my Tandy 1000, have had a dozen machines since, am a PC technician by occupation at a national business, and have run more 3DMark demos and frame rate benchmarks than I care to admit. I can guess within 5-10 FPS of what a game is running at up until about the 60 FPS mark, and then it's anyone's guess. The lower the frame rate, the easier it is to tell.
 
outsida said:
Blame the software devs and the effort put into their titles, not the hardware. We've seen crappy jobs done on every system ever released, and on the flip side we've seen the opposite.

Very true.

I don't think that it's the console that is incapable; more that the devs are too rushed or just can't be bothered.
 