Why do video game graphics have such low quality?

In the simplest terms, because the high-quality graphics you see in rendered video were not rendered in real time.

Most CG films, when rendered, produce maybe one frame every ten minutes. Games run at 30 frames per second or more. So for people to run a game on slower hardware at playable speeds, the amount of graphical information has to be many, many times smaller: the lighting, model complexity and everything else are all greatly reduced.
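As a very rough back-of-envelope comparison (using the illustrative figures above, not measurements of any particular film or game):

[code]
# Rough per-frame time budgets (illustrative numbers only).
offline_seconds_per_frame = 10 * 60   # ~10 minutes per frame on a render farm
realtime_seconds_per_frame = 1 / 30   # a 30 fps game gets ~33 ms per frame

ratio = offline_seconds_per_frame / realtime_seconds_per_frame
print(f"Offline rendering spends roughly {ratio:,.0f}x longer on each frame")
# -> roughly 18,000x longer per frame
[/code]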
 
When we play a movie on the same hardware, aren't the graphics rendered at a high FPS?

I'm not sure how rendering actually works, but why not drop rendering and create video games out of small movie parts, each played according to the user's actions...
 
When we play a movie on the same hardware, aren't the graphics rendered at a high FPS?

I'm not sure how rendering actually works, but why not drop rendering and create video games out of small movie parts, each played according to the user's actions...

In simple terms, this is how games work today: small parts, polygons with textures on them.
 
In simple terms, this is how games work today: small parts, polygons with textures on them.

I think he meant using pre-rendered movies as part of the environment, and maybe even as interactive objects. The problem with that is it'll be confined to 2D. In order to have the same freedom of perspective that 3D has, you'd have to pre-render every possible angle, which is impractical. Anyway, old arcade games like Killer Instinct used movie files for the backdrop, which shifted perspective according to where the players were on the field.

It's a pretty neat concept that I'd like to see make a return at modern high-budget development companies. Not exactly using movie files, but highly polished, pre-rendered 2D. Limited as a 2D game, but with virtually limitless, timeless visual quality in terms of art and environment design. [rant]Kinda sick of seeing "retro" 2D games being overly simplistic. One thing devs need to understand while making a retro game is to have the same mindset as the devs two decades ago: squeeze out the best-looking stuff within the limitations you have.[/rant]
 
I think he meant using pre-rendered movies as part of the environment, and maybe even as interactive objects. The problem with that is it'll be confined to 2D. In order to have the same freedom of perspective that 3D has, you'd have to pre-render every possible angle, which is impractical. Anyway, old arcade games like Killer Instinct used movie files for the backdrop, which shifted perspective according to where the players were on the field.

It's a pretty neat concept that I'd like to see make a return at modern high-budget development companies. Not exactly using movie files, but highly polished, pre-rendered 2D. Limited as a 2D game, but with virtually limitless, timeless visual quality in terms of art and environment design. [rant]Kinda sick of seeing "retro" 2D games being overly simplistic. One thing devs need to understand while making a retro game is to have the same mindset as the devs two decades ago: squeeze out the best-looking stuff within the limitations you have.[/rant]

That's how the early Resident Evil games looked as good as they did - all of the backgrounds were pre-rendered.

My answer to the OP is basically one word: consoles. Crysis was a huge step forward in graphics back in 2007, and if developers had kept up with that level of advancement, we would have games with much better graphics than we have now. At present you have to run most PC games at very high resolutions, like with Eyefinity, to push today's hardware at all.
 
When we play a movie on the same hardware, aren't the graphics rendered at a high FPS?

I'm not sure how rendering actually works, but why not drop rendering and create video games out of small movie parts, each played according to the user's actions...

A movie is just a series of pictures, so to play it the computer only needs to load that set of images one after the other.

The 3D used in games is different. For every object, the computer has to render a 3D model made up of many thousands of 2D shapes, with another layer of 2D images mapped over this for texture. Then it has to decide how the different sources of light (like the sun or street lamps) create shadows or highlights. On top of this come the physical interactions of all the elements, like how snow settles and how bullets hit glass, and how all of that will affect the movement of objects into the next frame.
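To make that concrete, here's a deliberately over-simplified sketch of the two workloads. None of this is a real engine, just the shape of the work involved:

[code]
# Deliberately over-simplified sketch; not any real engine, just the shape of the work.

# A movie stores finished pictures; "playing" frame N is just showing stored pixels.
movie_frames = ["<finished image 1>", "<finished image 2>", "<finished image 3>"]

def play_movie_frame(n):
    return movie_frames[n]          # nothing to compute, just hand back the picture

# A game stores ingredients (3D points with textures) and must build each frame itself.
scene = [((1.0, 2.0, 6.0), "brick"), ((-0.5, 1.0, 4.0), "glass")]
light = (0.0, 10.0, 0.0)

def render_game_frame(camera_z=0.0):
    frame = []
    for (x, y, z), texture in scene:
        depth = z - camera_z
        sx, sy = x / depth, y / depth              # project the 3D point onto the screen
        shade = 1.0 / (1.0 + (light[1] - y) ** 2)  # crude stand-in for the lighting maths
        frame.append((sx, sy, texture, shade))
    return frame

print(play_movie_frame(0))
print(render_game_frame())
[/code]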

Back in the '90s (and even the '80s), and into the early 2000s, there was a genre of game that did exactly that, mainly because of the graphical limitations of the platforms. They were called FMV adventures or interactive movies. Famous examples are Night Trap, Sewer Shark and an X-Files game. All are pretty much known to be awful, mainly because you spend your time watching a movie with very little effect on the action happening before you, apart from decisions every minute or two. Also, being pre-determined means that after clearing the game there is very little opportunity to have new experiences. The games tended to be pretty short, too.

There have been a few games that combined pre-rendered movies and 3D models quite well (Fear Effect was one), but generally the difference in appearance between the two media makes it look quite jarring.
 
That's how the early Resident Evil games looked as good as they did - all of the backgrounds were pre-rendered.

My answer to the OP is basically one word: consoles. Crysis was a huge step forward in graphics back in 2007, and if developers had kept up with that level of advancement, we would have games with much better graphics than we have now. At present you have to run most PC games at very high resolutions, like with Eyefinity, to push today's hardware at all.

Not just consoles. It's catering to the masses with aging hardware. Why won't a developer make a PC-only game just for overclocked i7s and the latest graphics cards? Because it would sell at most a few thousand copies. Ten years ago, when development was more independent, people made games for the higher-end machines they owned; loads of games wouldn't run on most people's computers, which wasn't exactly good for business. Now it's corporations making games for the low-spec masses.
 
When we play a movie on the same hardware, aren't the graphics rendered at a high FPS?

I'm not sure how rendering actually works, but why not drop rendering and create video games out of small movie parts, each played according to the user's actions...

A movie, again, is pre-rendered (usually a process taking months on an entire farm of computers) and recorded from only one perspective, so all your home computer (or DVD player) has to do is display the video. In a video game, every 3D object is rendered in real time, so you can view the world from any perspective. As for using pre-rendered clips for gameplay, it doesn't work well in a 3D game (2D games aren't bad; the original Mortal Kombat used a similar technique for animating the characters), other than for cut-scenes where you have no control. It's just not feasible to pre-record a video clip for each and every possible action and perspective in a 3D world.

They tried out pre-rendered video for gameplay in the early '90s (it was called Full Motion Video). It was really only used because computers weren't yet able to render convincing 3D visuals in real time. As a result, you were extremely limited in what you could do in the games, because you were restricted to actions that had been pre-recorded. This is an example of what they looked like: as you can see, you could only move and look in certain directions, because you were limited to what they had video of. The games took the phrase "on rails" to a whole new level. Doing a true 3D game with full range of motion and the ability to do any action at any time would require an insane amount of time and effort to pre-record. You would have to record every single possible outcome for every single second of the game.

What if your character looks left and catches a glimpse of an enemy running down the side of the map? Gotta record that possibility. What if an enemy jumps at you while you're crouching in a bush looking south-west? Gotta record that. What if you look at the ground while running to a specific location, then jump to get a better view of the battlefield? Gotta record that. It would be nearly impossible to do in a way that wasn't incredibly constricting to gameplay. And even if they were somehow able to pull it off, it would take your entire hard drive to store all those HD video clips... probably more.
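A quick back-of-envelope shows why (all the numbers below are assumptions I'm making up for illustration, not anything measured):

[code]
# Illustrative numbers only: how much video a fully pre-recorded 3D game would need.
bitrate_mbit_s = 5        # assume modest compressed video, ~5 Mbit/s
clip_length_s = 2         # a short 2-second clip after every choice
branches = 4              # only 4 possible player actions per clip (very generous)
depth = 15                # 15 choices deep, i.e. about 30 seconds of play

clips = sum(branches ** d for d in range(1, depth + 1))
terabytes = clips * clip_length_s * bitrate_mbit_s / 8 / 1024 / 1024
print(f"{clips:,} clips, roughly {terabytes:,.0f} TB of video")
# -> over 1.4 billion clips and around 1,700 TB, for ~30 seconds of branching gameplay
[/code]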

It's just not realistic.
 
That's how the early Resident Evil games looked as good as they did - all of the backgrounds were pre-rendered.

My answer to the OP is basically one word: consoles.

Oh c'mon, anyone who's seen me post on these subjects will tell you I'm not one to take the blame off consoles, but you're talking about differences measured in percentages, while what the OP is asking about is differences of orders of magnitude.

Also, it's tough to make the argument that modern graphics are ugly when games already released look as good as this.

To the OP, Yoshiyuki Blade's explanation is exactly correct. The non-technical explanation, as best I can make it, is that in the case of a movie, every pixel, every last detail, is known ahead of time, meaning each frame and the frame after it can be stacked up one after the other and simply run through. A graphics system must pull in the resources on the fly, modify them (lighting, shading, etc.), and then send that frame off to get drawn. Turns out that's a fair bit harder.

I think the biggest problem with attempting a modern pre-rendered game is disk access. Any way I can think of to make such a game with any kind of fidelity would require a huge amount of disk space. But considering 2 TB drives, that might not be the issue; the issue would be locating that data and moving it into memory. That's where I think you'd run into problems.
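For a sense of scale (the numbers here are my own rough assumptions, not measurements): streaming finished frames off disk means moving a lot of data every second.

[code]
# Illustrative only: disk bandwidth needed to stream uncompressed pre-rendered frames.
width, height = 1920, 1080
bytes_per_pixel = 3                 # 24-bit colour, no compression
fps = 30

mb_per_s = width * height * bytes_per_pixel * fps / (1024 ** 2)
print(f"~{mb_per_s:.0f} MB/s of raw frame data")    # ~178 MB/s
# More than a typical hard drive of the time could sustain, and that's for a single
# fixed camera path; compression reduces it, but then the CPU pays to decode.
[/code]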

Nonetheless, it would be an interesting experiment, and experiments haven't been picked up by publishers since the collapse of October 2008.
 
Oh c'mon, anyone who's seen me post on these subjects will tell you I'm not one to take the blame off consoles, but you're talking about differences measured in percentages, while what the OP is asking about is differences of orders of magnitude.

Also, it's tough to make the argument that modern graphics are ugly when games already released look as good as this.

Bad example you picked there; only the buildings look good in Assassin's Creed 2. Everything else - in particular the character models - looks like arse.
 
Not just consoles. It's catering to the masses with aging hardware. Why won't a developer make a PC-only game just for overclocked i7s and the latest graphics cards? Because it would sell at most a few thousand copies. Ten years ago, when development was more independent, people made games for the higher-end machines they owned; loads of games wouldn't run on most people's computers, which wasn't exactly good for business. Now it's corporations making games for the low-spec masses.

That would be fine if they were able to make scalable graphics... which they can. The problem is that, for some reason, most people feel they need to run a game at max settings. If they can't, then they won't buy it. They would rather have a crappy-looking game that they can run at max settings than a game that looks fantastic but has to run at medium settings. It's a weird mindset.

Because of this, developers have little incentive to pour money into developing graphically intensive games. They do better business releasing games that four-year-old hardware can run at high graphical settings, and those games cost less to develop. It's a win-win for them. Then you add in the console market, and it's a trifecta. There just isn't any motivation to push technology at the moment.
 
That would be fine if they were able to make scalable graphics... which they can. The problem is that, for some reason, most people feel they need to run a game at max settings. If they can't, then they won't buy it. They would rather have a crappy-looking game that they can run at max settings than a game that looks fantastic but has to run at medium settings. It's a weird mindset.

I don't think that's the mindset, to be honest. It's more that people don't want to have to fuck around with settings and things, and quite often graphics in high-end games don't scale well (they look good maxed, but at lower settings they don't match other titles available at the same time).

That was my problem with Crysis: not that I couldn't run it maxed on my 8800 GTS, but that I spent more time playing with settings and looking at framerate numbers than actually enjoying the game. What was smooth on one level wasn't smooth on the next, and by the time you lowered the settings to something smooth from start to finish, the game looked worse than other games available at the time. Warhead scaled better and actually looked pretty good at not-maxed settings, but then they compromised by making the maxed settings look worse.

Why even bother putting in extra development effort to make a game stress high-end systems when it's just gonna create headaches for those with mainstream systems and only benefit maybe 2.5% of the PC gaming community, which is probably less than 0.5% of your customer base when developing a multiplatform game?
 
For games to take the next step in approaching CG movies, you would need some sort of physics standard and a move to a hybrid rasterisation/ray-tracing method of rendering. Otherwise the animation and lighting gap will always be there. Designing a game for an overclocked i7 2600K paired with three GTX 580s is not the issue.
 
This is what they used to render Avatar.

Thirty-four racks, each made up of 32 machines, comprise the computing core, with around 40,000 processors and 104 terabytes of memory in total.

This is why we are not getting pre-rendered quality in our 3D games yet. Even the mightiest ATI and Nvidia cards don't hold a candle to these monsters.

It isn't stated how long they needed to render each frame. I don't know when we will be getting the equivalent horsepower in our desktops, but I hope Nvidia and ATI keep competing against each other so that we can reach that level faster. Twenty years, maybe?
 
I don't know when we will be getting the equivalent horsepower in our desktops, but I hope Nvidia and ATI keep competing against each other so that we can reach that level faster. Twenty years, maybe?

Moore's law... 20 years from now we will have reached the singularity. Avatar in real time within 5 years.
 
When we play a movie on the same hardware, aren't the graphics rendered at a high FPS?

I'm not sure how rendering actually works, but why not drop rendering and create video games out of small movie parts, each played according to the user's actions...

They had a few arcade games like this back in the '80s.
One was Dragon's Lair and another was Mach 3, in which you piloted a fighter plane or bomber.
The video was played off of a LaserDisc. Pretty neat at the time, too.

Firefox was another LaserDisc-based game as well.
 
LOL, it's amazing how quickly we forget where we have been in terms of graphics. I remember when the only place we could get "decent" graphics was the arcade. Who here still visits an arcade when you can get the same graphics, if not better, at home?

If you want to see how far we have come in PC graphics, pick up an old Unreal Tournament demo or even Lara Croft. The difference is night and day. We live in interesting times, folks. Ten years from now we might have Avatar-like graphics in real time.
 
LOL, it's amazing how quickly we forget where we have been in terms of graphics. I remember when the only place we could get "decent" graphics was the arcade. Who here still visits an arcade when you can get the same graphics, if not better, at home?

If you want to see how far we have come in PC graphics, pick up an old Unreal Tournament demo or even Lara Croft. The difference is night and day. We live in interesting times, folks. Ten years from now we might have Avatar-like graphics in real time.

Most arcades around me have closed. Some movie theaters here still have a dozen machines.
I was playing an older game recently and the graphics were terrible compared to today's games, but I do remember that when I first played it, it was the best thing since sliced bread.
 
Most arcades around me have closed. Some movie theaters here still have a dozen machines.
I was playing an older game recently and the graphics were terrible compared to today's games, but I do remember that when I first played it, it was the best thing since sliced bread.

Yeah, same here. I remember my mom taking me to the mall and giving me two quarters to play. I remember savoring every minute of game time.

I remember thinking that when I grew up I would love to take my son or daughter to the arcade. Well, now that I have a son, I guess I'll take him to the movies and let him play there, or take him to DisneyQuest.
 
I wish they would use high-resolution, detailed textures. Fine, put it on two discs, I don't care.
 
When we play a movie on the same hardware, aren't the graphics rendered at a high FPS?

I'm not sure how rendering actually works, but why not drop rendering and create video games out of small movie parts, each played according to the user's actions...

Because the rendering isn't done in real time for movies. They're pre-rendered on huge render farms.

The difference between a video game and a movie is that the video game doesn't know what it needs to render beforehand, so everything has to be rendered in real time.

Both produce frames to show the audience or player, but a movie's frames aren't rendered in real time.
 
Moore's law... 20 years from now we will have reached the singularity. Avatar in real time within 5 years.

Give me a call when they perfect sub-nm transistor tech, ok?
 
Moore's law... 20 years from now we will have reached the singularity. Avatar in real time within 5 years.

And yet we have seen little advancement in gaming graphics in the last four years, even though in that time (and true to Moore's Law) we have seen the number of transistors that can be placed on an integrated circuit double twice. Just because we have the ability and capacity to push gaming graphics and gameplay doesn't mean we do.


I wish they would use high-resolution, detailed textures. Fine, put it on two discs, I don't care.

Well, the storage capacity of the media (or your hard drive, for that matter) isn't really the issue. It's the ability of your computer to bring those HD textures into active memory. The more detailed and high-resolution those textures are, the more RAM they take. So, since most people have 1-2 GB of RAM, developers try to keep the HD textures to a level those computers can handle.

As far as consoles go, limitations of the media may be an issue, as the game is essentially loaded directly from the disc. However, the amount of RAM available becomes a problem long before the limitations of the media come into play where HD textures are concerned. So basically, it all comes back to memory.
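For a rough sense of the numbers (ignoring compression and mipmaps, which change the details but not the point):

[code]
# Illustrative only: RAM cost of a single uncompressed texture.
def texture_mb(width, height, bytes_per_pixel=4):    # 4 bytes per pixel = RGBA
    return width * height * bytes_per_pixel / (1024 ** 2)

print(texture_mb(1024, 1024))   # 4.0  -> a 1K texture is ~4 MB
print(texture_mb(4096, 4096))   # 64.0 -> a 4K texture is ~64 MB; a few dozen of those
                                #         and a 1-2 GB RAM budget is already gone
[/code]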
 