Halo 3 actually runs 640p no matter what.

I honestly don't know enough about this type of stuff to back up what this guy is saying, but I believe it.

http://forum.beyond3d.com/showpost.php?p=1070774&postcount=276

What he's saying is that the game renders at 640p and the console itself upscales it. He's tested a range of games and found that Perfect Dark Zero and Halo 3 actually render at 640p, while almost every other game renders at 720p (even if it supports 1080p output). The two exceptions were "Virtua Tennis 3" and "NBA Street," which rendered true 1080p. Some were even worse, rendering at 600p: Project Gotham Racing 3 and Tomb Raider.

If it turns out to be true, it clears up a lot of questions I had about a console's true ability to run games at 1080p. It just didn't click with me that performance supposedly wouldn't change going from 720p to 1080p.

On a side note: I still think Halo 3 is one of the most fun games I have ever played and I love my Xbox 360.
 
If you play it co-op on the same console, you even get 2 massive black bars on either side of the screen, where they have sacrificed pixels to keep the fps up. As soon as you jump to a cutscene, it fills the screen again.......pathetic
 
If you play it co-op on the same console, you even get 2 massive black bars on either side of the screen, where they have sacrificed pixels to keep the fps up. As soon as you jump to a cutscene, it fills the screen again.......pathetic

No, that's not because of the frame rate. They need to preserve the aspect ratio.
 
Not terribly surprising. I said it long ago -- the X360's limit is being scraped -- and it seems that statement was accurate. Will developers continue to push the pixel/vertex/polygon barrier by sacrificing true output resolution? If so, it's very unfortunate for X360 owners, even if they don't realize it or fully grasp the consequences.

It seems the consensus is that this guy's method is on the level. My understanding tells me also that this is a perfectly valid method.

That's a shame. Makes me wonder about the PS3 and whether or not this also occurs on that platform.
 
The testing this guy has done (if you expand out his thread) states that he has indeed done this testing with the PS3 and found the same results. Most games render at 720p even if they're claiming 1080p.

By the way, the amount of "jaggies" has a lot more to do with the absence of anti-aliasing than with the resolution.
 
I'm also curious about the PS3. I can tell you that the two fighting games (VF5, Tekken 5) both look significantly better with each resolution upgrade. I don't know if all games do, but those two DEFINITELY look a lot better. With Halo I played around with it and the difference is minimal. I think it might be a 360 thing.

(EDIT: after seeing the above post, I guess that's the answer. Either way, there's still a definite visual upgrade as you increase the resolution on the two games I mentioned)
 
Upscaling does not mean worse quality, in fact it can seem to improve the quality with good upscaling.

The CPU in the 360 has a ton of power so it can do very good upscaling and provide very good results.

And before saying the 360 is crippled, as said in this thread the PS3 does the same thing. There was an interview with some developers on the subject a year or so ago and it turns out that to make a game in full 1080p is pretty useless because it takes so much more time to do it and so much more disk space that the games would have to be very short. The pixels go up exponentially and with video games people have to make those pixels, hence the time of work goes up exponentially as well.

Think that Halo 3 took around 4 years to make...now think about it taking 4x as long to get full 1080p!

I cannot find the link to the interview, but they had some comparisons, and it was amazing just how close upscaling came to true 1080p, and how much less time it takes to create games that upscale.

Halo 3 IMO does not have that good of graphics...if you want to show the true balls of the 360 take a look at GOW or Forza 2. Technically the CPU in the PS3 is more powerful but the 360 has a more powerful graphics chip. They tend to cancel each other out from everything I have seen (except for in folding :p ). The difference in the graphics of the games is pretty much up to the devs behind the game between the two consoles...not the hardware itself. Sony was big into marketing and comparing useless numbers, but the truth behind it is that the 360 graphics chip is better and PS3 cpu is better.
 
Upscaling does not mean worse quality, in fact it can seem to improve the quality with good upscaling.
An upscaler does nothing but take an image and stretch it to fit a given resolution. This is the same process involved when you use the Image Size feature in Photoshop. Non-integral scaling involves a process called interpolation (for images, interpolation is typically used whether the ratio is integral or not). Unless the level of precision involved is infinite (or the scaling is an even multiple, which it clearly would not be in this case), the result of upscaling is image degradation.

Resample 44.1kHz audio to 48kHz, for example, and the result is degradation -- always. The level of degradation is dependent on the level of precision.
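To make the interpolation point concrete, here's a minimal sketch (hypothetical example values, not taken from any actual frame buffer) of linear interpolation on one row of pixels, stretched at the non-integral 8:9 ratio a 640-to-720 upscale implies:

```python
def upscale_linear(row, new_len):
    """Stretch a 1-D list of pixel values to new_len samples."""
    old_len = len(row)
    out = []
    for i in range(new_len):
        # Map each output position back into source coordinates.
        pos = i * (old_len - 1) / (new_len - 1)
        lo = int(pos)
        hi = min(lo + 1, old_len - 1)
        frac = pos - lo
        # Blend the two nearest source pixels.
        out.append(row[lo] * (1 - frac) + row[hi] * frac)
    return out

# A hard black/white edge in the source...
edge = [0, 0, 0, 0, 255, 255, 255, 255]
# ...picks up in-between values after an 8 -> 9 stretch, which is
# exactly the softening that shows up as blur on a hard edge.
print(upscale_linear(edge, 9))
```

The interesting part is the blended value that appears where the source had a clean transition: that blend is the degradation, and no amount of precision removes it for a non-integral ratio.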

And before saying the 360 is crippled, as said in this thread the PS3 does the same thing.
If you're referring to my post, I said the X360's limit is being scraped. I never said it was crippled.

Think that Halo 3 took around 4 years to make...now think about it taking 4x as long to get full 1080p!
1080 is a screen resolution -- nothing more. The amount of time needed to generate content for a title that renders at 1920x1080 is identical to the amount of time needed to generate content for a title that renders at 640x480, 1280x720, 320x240, or any other resolution imaginable.
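To put numbers on the resolutions being argued about in this thread, a quick bit of arithmetic (nothing more than multiplication):

```python
# Pixel counts for the resolutions discussed in this thread.
resolutions = {
    "640x480":   (640, 480),
    "1152x640":  (1152, 640),   # Halo 3's actual buffer size
    "1280x720":  (1280, 720),
    "1920x1080": (1920, 1080),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

# 1080p vs 720p: pixel count grows with the square of the linear
# resolution increase -- a factor of 2.25 here, not exponentially.
print(1920 * 1080 / (1280 * 720))  # 2.25
```

So a 1080p frame has 2.25 times the pixels of a 720p frame: a real fill-rate cost for the GPU, but not something that multiplies the artists' workload.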
 
Millions of people disagree.

Seems to me the higher the resolution, the less you need to worry about it. I know I don't care that much. In older games I'll turn it on since my card can handle it, but it's just extra eye candy that's not required.
 
1152x640. That's a lower pixel count than a game running at 1024x768. I think that if someone had said when the Xbox 360 launched that Halo 3 would run at a lower pixel count than 1024x768, with no anti-aliasing or bilinear filtering, and would still only reach 30 frames per second, they would have been laughed at pretty hard.

It seems the graphics power of these "next-gen" consoles is a bit overstated when comparing it to the PC side of things.

Both Sony and Microsoft have managed to pull a huge wool blanket over a lot of people's eyes, myself included.

Then again, I think 360 games look fantastic and I love kicking back on the couch with a platform I know will be developed on for another few years.

It's interesting to me, and I'm surprised how little I knew on the subject. It's also not a huge deal to me in the overall scheme of things. I won't stop playing and enjoying a game because its rendered resolution isn't as high as I thought it was.
 
Pardon my ignorance of the proper term (I think it was fill rate?), but I'll take a hit in resolution if it means amping up the textures and particle effects and other various things that make for good eye candy. It's nice if your game can crank out 1080p native, but if you just offer sterile environments and models is it really worth it?

Seems like you have a toss-up between resolution, visual effects, and framerates - pick two out of the three. For Halo 3, it's a shame it couldn't do 720p natively, but from the little I've played so far there seems to be enough visual quality to make up for it. 720p looks to be the sweet spot for consoles in terms of achieving all three of the factors mentioned above*

*Note I said CONSOLES, PCs are obviously excluded for obvious reasons...
 
1080 is a screen resolution -- nothing more. The amount of time needed to generate content for a title that renders at 1920x1080 is identical to the amount of time needed to generate content for a title that renders at 640x480, 1280x720, 320x240, or any other resolution imaginable.

While it's true that rendering a game at 1920x1080 doesn't take any more time to generate content for than 1280x720, that's only part of the story.

If you render a game at 640x480 and you KNOW that the resolution will be 640x480 when you output, you won't need to put the time into the textures to display it any higher.

A perfect example of this is Super Street Fighter II Turbo HD Remix, coming to the Xbox Live Marketplace. All of the sprites are being totally redrawn; because the original game output such a small resolution, the sprites didn't need to be as detailed.

And while we're talking about 3D environments and not sprites, the textures on rocks, weapons, and everything else will look different when you render it at 1920x1080 vs. 640x480.
 
Microsoft claimed a long time ago games were being developed for 720p internal rendering and would be scaled for your display? Gamespot ran a big article on it.

What's the news here?

I doubt that Halo 3 is anything other than 720p....somebody took a screen print on his particular TV setup with so many random variables and unscientific testing methods, and you guys are all getting crazy --- this doesn't mean a thing.
 
Microsoft claimed a long time ago games were being developed for 720p internal rendering and would be scaled for your display? Gamespot ran a big article on it.

What's the news here?

I doubt that Halo 3 is anything other than 720p....somebody took a screen print on his particular TV setup with so many random variables and unscientific testing methods, and you guys are all getting crazy --- this doesn't mean a thing.

It does mean something when he lays out his methods and is able to visually show you the difference across titles, and show you that 640p is a unique oddity, and one that exists on another game.

Does it matter in the grand scheme of things in terms of visual quality? Yes.

Does it make the game any less fun? No.
 
here's a VERY easy observation of the scaling that takes place on the 360.

Download the 360 Tombraider demo.

Play it with the VGA adapter so you can choose the resolution.

Play it at 640x480 (oh so ultra buttery smooth)
Play it at 1280x720 (plays standard - a smidgeon of lag, but still playable)
Play it at the max resolution 13xx by 9xx or whatever it is (feels choppy)


Why?

Because it doesn't scale using the VGA adapter. It is what it is. The Tombraider game could never play at 1080p (if that were an option) using the VGA adapter, because VGA doesn't rescale the image.


NOW play Tombraider using component.

480p, 720p, and 1080p all play identically, because the game renders at 720p internally and rescales to whatever your display requests.

I credit myself with finding this; as far as I know, I was the first person to report it here. I made a thread about it when the Tombraider demo first came out, and I've not read about it elsewhere either.

This same thing happens on Full Auto demo - though I've not tested more than those two demos.
 
here's a VERY easy observation of the scaling that takes place on the 360.

Download the 360 Tombraider demo.

Play it with the VGA adapter so you can choose the resolution.

Play it at 640x480 (oh so ultra buttery smooth)
Play it at 1280x720 (plays standard - a smidgeon of lag, but still playable)
Play it at the max resolution 13xx by 9xx or whatever it is (feels choppy)


Why?

Because it doesn't scale using the VGA adapter. It is what it is. The Tombraider game could never play at 1080p (if that were an option) using the VGA adapter, because VGA doesn't rescale the image.


NOW play Tombraider using component.

480p, 720p, and 1080p all play identically, because the game renders at 720p internally and rescales to whatever your display requests.

I credit myself with finding this; as far as I know, I was the first person to report it here. I made a thread about it when the Tombraider demo first came out, and I've not read about it elsewhere either.

This same thing happens on Full Auto demo - though I've not tested more than those two demos.

Do the test with Halo 3 and report back with details. It would be extremely interesting to find out whether a VGA cable prevents the internal scaling and forces true rendering, versus component and/or HDMI.
 
And I thought I was the only lunatic wondering what was up with all the jaggies. This isn't really a flame on the game itself though. I picked up the uber edition and I've been having fun with it but there really are a ton of jaggies (1080i here) more so in some spots. :confused:
 
And I thought I was the only lunatic wondering what was up with all the jaggies. This isn't really a flame on the game itself though. I picked up the uber edition and I've been having fun with it but there really are a ton of jaggies (1080i here) more so in some spots. :confused:

Bungie isn't exactly known for their graphics engines. They did a decent job technically with the game (lots of impressive lighting and texturing), but they did a fantastic job with gameplay.

It does explain why it appears to have more jaggies than other 720p titles with no AA.
 
I think it's only going to get worse in the future. As games get more complex, they'll keep needing better hardware to run them, so resolution will get turned down further and further, along with texture settings and all that. Just like a PC: as the hardware gets older, the graphics always have to be scaled down. Consoles don't get optimized much either, because of cost constraints, more complex hardware, and longer development.

To me we've already practically hit a wall in graphics with Gears of War. If you're that much of a graphics junkie, you should just have a PC anyway. It looks pretty good to me right now, though to me the Wii hasn't really reached its potential yet because of shoddy ports and such. But both the 360 and PS3 have hit a wall already. I've heard UT2007 has had problems running at 60 frames on the PS3, so they had to lock it at 30fps. More games will likely be like this in the future.
 
Wii has hit the same wall. Take a look at Metroid Prime 3: Corruption. That's about as much as you'll see done with the limited power of the Wii. That console is legitimately underpowered.

PS3 apparently has some tricks up its sleeve, when a game like Heavenly Sword can do 4xAA/8xAF with HDR lighting. It'll be interesting to see if anyone can figure out what the heck Ninja Theory did to pull off what they did in that game. Heavenly Sword is about as pretty as games will get on the PS3, without a doubt; the question is whether someone will figure out how to pull it off on the Xbox 360 (they should, considering the hardware is tit-for-tat comparable).
 
To me we've already practically hit a wall in graphics with Gears of War.

Epic already said UT3 has a lot more going on in a typical scene than GoW.

I've heard UT2007 has had problems running at 60 frames on the PS3, so they had to lock it at 30fps. More games will likely be like this in the future.

I don't believe they ever targeted 60fps for UT3. The interview I heard said it could run at 60fps but would have drops in the heaviest scenes, so to keep it as smooth as possible and not worry about overhead, they stuck with 30fps. Anyone worth their salt will buy UT3 for PC anyway. Consoles just seem to have lower standards for frames per second in shooters. At least we know Gran Turismo 5 will render 1080p at 60fps natively. :p
 
The 360 scales games to 1080p over VGA exactly the same way it does over Component.

I'm on my third HDTV since I got the 360 and there's been no variance in any game from one resolution display to another, nor is there on the 19" computer monitor I played it on many a time at a friend's house.

The only way his mentioned performance issues are real is if it's only those games that somehow cause it as I've not played those particular ones.

I can verify that Halo 3 runs the same in 1080p over VGA as it does in 1080p over Component, though.
 
I just want to point out that anyone who thinks you can calculate a video game's resolution based on the number of steps in a random diagonal lines on your TV screen is a complete and total idiot who has absolutely no clue what he is talking about.

Ahh, what does it matter? This is the Internet. Sorry for the reply. Continue with the Halo hate.
 
The 360 scales games to 1080p over VGA exactly the same way it does over Component.

I'm on my third HDTV since I got the 360 and there's been no variance in any game from one resolution display to another, nor is there on the 19" computer monitor I played it on many a time at a friend's house.

The only way his mentioned performance issues are real is if it's only those games that somehow cause it as I've not played those particular ones.

I can verify that Halo 3 runs the same in 1080p over VGA as it does in 1080p over Component, though.


Well, I've not tested more, but I DID test Tombraider and Full Auto. Download those two demos and see if your results match mine. My observations were verified by several people who saw exactly the same thing when they tried it (it's a very distinct difference, not something subjective: literally buttery smooth at 640x480 in Tombraider, to really herky-jerky at the highest 360 resolution over VGA). This was a long time ago though, when the 360 was young. Maybe they've changed things in some of the firmware updates.
 
I'll download the demos and give'em a shot on my own older PC monitor and my TV in 1080p over both connection types and see what I come up with.
 
I just want to point out that anyone who thinks you can calculate a video game's resolution based on the number of steps in a random diagonal lines on your TV screen is a complete and total idiot who has absolutely no clue what he is talking about.
Then enlighten us. Explain why that isn't a legitimate method.
 
PS3 apparently has some tricks up its sleeve, when a game like Heavenly Sword can do 4xAA/8xAF with HDR lighting. It'll be interesting to see if anyone can figure out what the heck Ninja Theory did to pull off what they did in that game.

When you beat Heavenly Sword and unlock the "Making Of" video, they go into how they created their own technology and tools to program the game, since when they began, the PS3 wasn't close to being released and they had no specifications for it (they were banking on Moore's Law)...not to mention that the dev kits from Sony were nowhere near complete.

I believe they said in the video the game took 5 (or 6?) years to make, so they began working on it at least 4 years before the release of the PS3.

They did a fantastic job.
 
When you beat Heavenly Sword and unlock the "Making Of" video, they go into how they created their own technology and tools to program the game, since when they began, the PS3 wasn't close to being released and they had no specifications for it (they were banking on Moore's Law)...not to mention that the dev kits from Sony were nowhere near complete.

I believe they said in the video the game took 5 (or 6?) years to make, so they began working on it at least 4 years before the release of the PS3.

They did a fantastic job.

The graphics are beyond anything else out there. The length of the game is unfortunate. I am hoping for a 10+ hour Heavenly Sword 2 by the time I buy a PS3.
 
I just want to point out that anyone who thinks you can calculate a video game's resolution based on the number of steps in a random diagonal lines on your TV screen is a complete and total idiot who has absolutely no clue what he is talking about.

Why not? The picture quite clearly shows both the individual pixels of the LCD screen and the stepping along the edge of the line in the upscaled image. It's rather easy to discern the true resolution from this.
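For the curious, the arithmetic behind the counting trick fits in a few lines. The sketch below is only an illustration; the 45-steps-over-76-pixels figures are the ones the B3D tester reports for a 1080p screen:

```python
# A diagonal edge drawn at the native resolution produces one "stair
# step" per native pixel; after upscaling, those steps are spread over
# more display pixels, so the step-to-pixel ratio reveals the scale.
def estimate_native_height(display_height, steps, display_pixels):
    """Estimate native vertical resolution from an aliasing-step count."""
    return display_height * steps / display_pixels

# Numbers from the B3D post: 45 steps spanning 76 pixels on a 1080p set.
est = estimate_native_height(1080, 45, 76)
print(round(est, 2))  # 639.47 -> snap to the nearest clean ratio: 640
```

The raw estimate is never exact (the count has a margin of error), which is why the tester then snaps to the simplest nearby ratio (8/9 of 720) and checks for the recurring step pattern as confirmation.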
 
I skimmed the thread at B3D and didn't see any explanation of why this was true. I saw an explanation of the math involved, but no proof that it's true or explanation of why it's true.

Yes, exactly: find the recurring pattern (or a clear step at which to start and stop counting), or, after a first approximate result, deduce the real ratio by proximity and find the recurring pattern for a second analysis.
For example, in Halo 3 my first count found 45 steps across 76 pixels, giving 1080 × 45/76 ≈ 639.47. I deduce that the real native resolution is 640p, because that is the simplest ratio for a 720p upscale in this case (8/9). After this assumption I can easily find the recurring pattern (every 8 steps on a 720p display and every 16 steps on a 1080p display, for this example; check that your screen has no overscan) and note the recurrence for confirmation.


The way I read the quote above is that he's making a rough guess then fitting the information he finds to that guess. Not exactly scientific. Also isn't aliasing worse on high contrast edges? If the screenshot used was lower contrast would the resolution become higher?

Whether this is true or not, it doesn't affect my enjoyment of this or any other game. That said, the methods seem suspect to me.
 
1080 is a screen resolution -- nothing more. The amount of time needed to generate content for a title that renders at 1920x1080 is identical to the amount of time needed to generate content for a title that renders at 640x480, 1280x720, 320x240, or any other resolution imaginable.


I don't play flame fights, but stop saying things you don't know.

The higher the res, the more pixels in said resolution. Thus the creators have to put more time into making more pixels in their work. This is everything from textures to player models and weapon models.

The extra pixels take a ton of extra time because there are exponentially more. What you are describing is rendering, which is thus upscaling, which is what they do, which is what this thread is crying about! It does not take the same amount of time to make them native to the resolutions.

You can flame all you want, but I am done with these kinds of threads in the games forum....people bashing with info they think are facts but are not :rolleyes:
 
flames and arguments aside, Bungie has officially cleared up the story on the 640p issue.

in short, it's 640p.

http://www.bungie.net/News/content.aspx?type=topnews&cid=12821

You Owe me 80p!

One item making the interwebs rounds this week was the scandalous revelation that Halo 3 runs at “640p” which isn’t even technically a resolution. However, the interweb detectives did notice that Halo 3’s vertical resolution, when captured from a frame buffer, is indeed 640 pixels. So what gives? Did we short change you 80 pixels?

Naturally it’s more complicated than that. In fact, you could argue we gave you 1280 pixels of vertical resolution, since Halo 3 uses not one, but two frame buffers – both of which render at 1152x640 pixels. The reason we chose this slightly unorthodox resolution and this very complex use of two buffers is simple enough to see – lighting. We wanted to preserve as much dynamic range as possible – so we use one for the high dynamic range and one for the low dynamic range values. Both are combined to create the finished on screen image.

This ability to display a full range of HDR, combined with our advanced lighting, material and postprocessing engine, gives our scenes, large and small, a compelling, convincing and ultimately “real” feeling, and at a steady and smooth frame rate, which in the end was far more important to us than the ability to display a few extra pixels. Making this decision simpler still is the fact that the 360 scales the “almost-720p” image effortlessly all the way up to 1080p if you so desire.

In fact, if you do a comparison shot between the native 1152x640 image and the scaled 1280x720, it’s practically impossible to discern the difference. We would ignore it entirely were it not for the internet’s propensity for drama where none exists. In fact the reason we haven’t mentioned this before in weekly updates, is the simple fact that it would have distracted conversation away from more important aspects of the game, and given tinfoil hats some new gristle to chew on as they catalogued their toenail clippings.
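As an aside, the two-buffer idea Bungie describes can be sketched roughly like this. The way the range is split and the Reinhard-style tone map below are illustrative guesses, not Bungie's actual pipeline:

```python
def combine_hdr(low_buf, high_buf, exposure=1.0):
    """Combine a low-range and a high-range lighting buffer into
    displayable 0-255 pixel values (hypothetical sketch)."""
    out = []
    for low, high in zip(low_buf, high_buf):
        # Recombine the two ranges into one HDR intensity...
        hdr = low + high
        # ...then compress it into the displayable range with a
        # simple Reinhard-style tone-mapping operator.
        mapped = hdr * exposure / (1.0 + hdr * exposure)
        out.append(round(mapped * 255))
    return out

# A dim pixel, a mid pixel, and a very bright highlight: the bright
# value survives instead of clipping to pure white.
print(combine_hdr([0.1, 0.5, 1.0], [0.0, 0.2, 8.0]))
```

The point of carrying two buffers is exactly what the Bungie post says: intensities far above 1.0 are preserved until the final combine, so bright highlights tone-map smoothly instead of clipping.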
 
I love how Bungie responds to stuff like this... They don't ever seem to take themselves too seriously and any reference to tinfoil hats in a press release is golden.
 
I had a great time playing through the singleplayer campaign and am currently having a blast playing multiplayer. I don't care what the details are in how the game gets displayed on my screen. The only thing that matters is that it is, in fact, displayed and that it's fun to play.

I do like Bungie's response though. That was pretty funny.
 
I don't play flame fights, but stop saying things you don't know.

The higher the res, the more pixels in said resolution. Thus the creators have to put more time into making more pixels in their work. This is everything from textures to player models and weapon models.

The extra pixels take a ton of extra time because there are exponentially more. What you are describing is rendering, which is thus upscaling, which is what they do, which is what this thread is crying about! It does not take the same amount of time to make them native to the resolutions.

You can flame all you want, but I am done with these kinds of threads in the games forum....people bashing with info they think are facts but are not :rolleyes:

What are you talking about? Your entire post makes no sense. You tell Phide not to talk about things he doesn't know, but in reality it is what you're talking about that makes absolutely no sense. I don't like flame wars either, and this isn't intended to be one, but what Phide said I completely agree with.

Developers don't have to work harder to make "extra pixels." I think your understanding of what a resolution is is fundamentally lacking. Developers might spend extra time making better textures if they think higher resolutions demand it for better picture quality, but even then texture filtering can mostly compensate unless the textures are an insanely low-resolution mess. And models don't have to be upgraded at all; a model with a relatively low triangle count will actually look much better at higher resolutions, because aliasing becomes less and less noticeable. Halo 2 upscaled to 1280x720 through the Xbox 360's emulator didn't require any modifications to the actual game, nor did it take insanely long to do.

And what about PC games? Crysis will be able to run at resolutions a good deal greater than 1080p. They have been spending a long time on it, but by your logic they would need to spend years and years on the game just so it could display at 2560x1600, which is something you can get Tribes to do: a game released in 1998 that took less than a year to develop.

On another note, Bungie's response was pretty hilarious, but I will agree that the difference between 720p and 1152x640 is almost nil.
 