Console hardware vs PC hardware and Carmack

armandh01 | Weaksauce | Joined: Dec 31, 2009 | Messages: 123
I watched an interview and saw some quotes from Carmack about his beef with drivers and APIs, and why consoles have some specific advantages over PCs when it comes to programming: this infamous "close to the metal" stuff.

It's always vaguely touched upon and rarely fully explained (unless someone here has a good video).

Anyway, is there in fact an advantage of console hardware vs pc for gaming?

Now when I look at the specs of the Xenos, although it has 48 shaders like the X1900, it has half the bandwidth (comparable to an X1650 XT) and half the ROPs. Yet people incessantly compare the two, same as RSX vs. 7800 GTX. Is it unfair to compare these? I see it happening all the time.

They seem very gimped.

I understand consoles often run below 720p, sometimes with poor textures, but I am often amazed that they can play some games (Crysis 2) at all.


I would appreciate if someone could give some insight into this and if PCs are at any disadvantage.
 
That was his quakecon speech this past year, right? Last year I noted in a RAGE thread several of the things he said during the speech.

But yes, with consoles you don't have to worry about how drivers interact with the game. Drivers are meant to be a layer between games and the hardware, because hardware changes over time. The driver takes what the game is trying to do to the hardware and in some cases changes it, or fools the game into thinking something about the hardware. Any time you add a layer, you lose performance. The programming API makes programming for the hardware easier. The Xbox uses DX and the PS3 uses OpenGL, both of which are supported on GPUs.

The reality is there is no standardization across GPUs and drivers, so programming for them is always changing.
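A toy Python sketch of that layering (every name here is invented for illustration, not real driver code): each layer between the game and the "hardware" does a little bookkeeping per call, and that per-call work is exactly the overhead being described.

```python
import timeit

# Toy model: the "hardware" is a plain function; the "driver" wraps it
# with validation and translation, and the "API" wraps the driver.
# Each layer adds per-call work -- the overhead the post describes.

def hardware_draw(cmd):              # direct-to-metal path (console-style)
    return cmd * 2                   # stand-in for actual GPU work

def driver_draw(cmd):                # driver layer: validate, translate
    if not isinstance(cmd, int):
        raise TypeError("bad command")
    return hardware_draw(cmd ^ 0)    # pretend translation step

def api_draw(cmd):                   # API layer sitting on the driver
    state = {"cmd": cmd}             # bookkeeping the game never sees
    return driver_draw(state["cmd"])

direct = timeit.timeit(lambda: hardware_draw(7), number=100_000)
layered = timeit.timeit(lambda: api_draw(7), number=100_000)
print(f"direct: {direct:.4f}s  layered: {layered:.4f}s")
```

The result is the same either way; the layered path just pays extra CPU time per call, which is the point Carmack keeps making.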
 
So if you had a console vs. a PC with the exact same hardware specs, is there more headroom on the console? Is it significant?


Also, how do you figure an X1800 is more powerful than a Xenos? I mentioned that the Xenos has less bandwidth (22 GB/s vs. 32 GB/s) and fewer ROPs (8 vs. 16), but does it make up for that somehow with its 48 unified shaders?

I got into a debate with someone on this topic, and the argument was that an X1900 was able to play Crysis 2 at 720p quite a bit better than the 360. I was claiming that's unfair, because the 360's Xenos is probably weaker than even an X1800, although I'm not sure I'm necessarily correct.

Anyway, he was claiming there is no difference or advantage between programming for console hardware and for a PC. I thought there is, mostly based on Carmack's words.
 
Well, in the case of the Xbox, it is a PC. It runs Windows, and it uses DX9. The real difference is simply that a console is one base configuration that stays pretty static for years, so people learn how to optimize for it, and it's pretty stable. PC hardware is anything but static or stable, and it's much more diverse, which adds to the complexity.
 
So if you had a console vs. a PC with the exact same hardware specs, is there more headroom on the console? Is it significant?
There is more overhead on PCs, but consoles do have an OS. These consoles only have 512MB of RAM and must fit the OS, the game, and GPU memory all within that, so there's not much room for extras.

Also, how do you figure an X1800 is more powerful than a Xenos? I mentioned that the Xenos has less bandwidth (22 GB/s vs. 32 GB/s) and fewer ROPs (8 vs. 16), but does it make up for that somehow with its 48 unified shaders?

I got into a debate with someone on this topic, and the argument was that an X1900 was able to play Crysis 2 at 720p quite a bit better than the 360. I was claiming that's unfair, because the 360's Xenos is probably weaker than even an X1800, although I'm not sure I'm necessarily correct.

Anyway, he was claiming there is no difference or advantage between programming for console hardware and for a PC. I thought there is, mostly based on Carmack's words.
I don't know the specifics of the hardware you are talking about, but consoles have significantly less overhead than a PC in terms of what sits between the game and the hardware. Try to run Windows XP with a 7-year-old dual core, 256MB of RAM, and a 256MB X1800 GPU, and tell me if you can maintain 30 FPS at 1280x720 in Crysis 2 with similar-looking graphics to the consoles. I doubt it.

The point I'm trying to make is that if the Xbox 720 came out today with an Intel 2500K CPU, a GeForce GTX 680 GPU, and 8GB of memory, and you compared it to an identical Windows PC, the Xbox would win in gaming performance.
 
The programming API makes programming for the hardware easier. The Xbox uses DX and the PS3 uses OpenGL, both of which are supported on GPUs.

The 360 is DX-ish, and the PS3 uses a low-level library called libGCM. Both consoles can bypass the API and do all sorts of hackery: they can just dump data into the GPU command buffer, they don't need the driver trying to make sure you don't bring the system down in flames, etc.
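As a rough illustration of that difference (a toy model, not real console or driver code; all names are made up): a console title can append packets straight into the command buffer, while the PC-style path has the driver inspect every packet on the way in.

```python
# Toy command buffer: push_raw is the console-style "dump data straight
# into the ring buffer" path; push_validated is the PC-style path where
# the driver gatekeeps so you can't bring the system down in flames.

class CommandBuffer:
    def __init__(self):
        self.packets = []

    def push_raw(self, packet):
        # Trusted path: no checks, straight to "hardware".
        self.packets.append(packet)

    def push_validated(self, packet):
        # Driver path: reject anything that isn't a known-safe operation.
        op, payload = packet
        if op not in {"DRAW", "SET_STATE", "COPY"}:
            raise ValueError(f"rejected op {op!r}")
        self.push_raw(packet)

cb = CommandBuffer()
cb.push_raw(("DRAW", 123))                  # console-style, no questions asked
cb.push_validated(("SET_STATE", "blend"))   # checked on the way in
print(len(cb.packets))  # 2
```

The validated path does the same work in the end; it just spends extra cycles per packet and refuses anything it doesn't recognize.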


Is there overhead? Yes, but the layer of abstraction on PC isn't really the end of the world. Freeing CPU time is not going to save you if you're completely and utterly GPU-bound to begin with. The bad part is hoping for driver support in the first place, like the multi-threaded rendering that DX11 was supposed to bring to the table. Great feature, totally attacks a major problem on the PC side... in theory -- but it might be a performance loss in the end because support is still flaky on the driver side. I think it's overblown, though; shit port jobs will run badly, and that's not surprising. If you don't treat the platform right, it won't work well. Even 360-to-PS3 ports are shit sometimes. COD occasionally runs at like half the framerate of the 360 version.

I imagine the PC side will benefit too if the next consoles are x86 machines and developers are writing optimized, vectorized assembly and such to try and milk the most out of the platform.
 
VERY SIMPLE EXPLANATION:

Advantage of consoles: A console is one fixed hardware configuration that never (or rarely) changes during its life cycle. Writing software for fixed hardware is easier because you know exactly what to expect from the console. On the other hand, PCs have millions of hardware and software configuration possibilities so writing games for them requires much more testing and the possibility of incompatibilities is much higher.

Advantage of a PC: Constantly evolving hardware; more hardware resources (CPU speed; memory) available.

These are pretty much the advantages from a developer's viewpoint.
 
Here's a brief explanation of one area where consoles are superior to their PC brethren:

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/2

On consoles, you can draw maybe 10,000 or 20,000 chunks of geometry in a frame, and you can do that at 30-60fps. On a PC, you can't typically draw more than 2-3,000 without getting into trouble with performance, and that's quite surprising - the PC can actually show you only a tenth of the performance if you need a separate batch for each draw call.

Now the PC software architecture – DirectX – has been kind of bent into shape to try to accommodate more and more of the batch calls in a sneaky kind of way. There are the multi-threaded display lists, which come up in DirectX 11 – that helps, but unsurprisingly it only gives you a factor of two at the very best, from what we've seen. And we also support instancing, which means that if you're going to draw a crate, you can actually draw ten crates just as fast as far as DirectX is concerned.

But it's still very hard to throw tremendous variety into a PC game. If you want each of your draw calls to be a bit different, then you can't get over about 2-3,000 draw calls typically - and certainly a maximum amount of 5,000. Games developers definitely have a need for that. Console games often use 10-20,000 draw calls per frame, and that's an easier way to let the artist's vision shine through.
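The batch-count argument in that quote can be sketched with a toy cost model. The per-draw overhead numbers below are invented; only their ratio matters, mirroring the roughly 10x gap the article describes.

```python
# Toy frame-cost model for the draw-call argument. Numbers are made up:
# the PC path pays far more CPU time per draw call than a console
# talking (nearly) straight to the hardware.

PC_OVERHEAD_US = 30        # hypothetical CPU cost per draw call on PC
CONSOLE_OVERHEAD_US = 3    # hypothetical cost on a console

def frame_cpu_ms(draw_calls, overhead_us):
    """CPU milliseconds spent just issuing draw calls for one frame."""
    return draw_calls * overhead_us / 1000.0

# 10,000 unique draws: fits a 33 ms (30fps) budget in the console model,
# blows it completely in the PC model.
print(frame_cpu_ms(10_000, CONSOLE_OVERHEAD_US))  # 30.0 (ms)
print(frame_cpu_ms(10_000, PC_OVERHEAD_US))       # 300.0 (ms)

# Instancing: ten identical crates cost one draw call as far as the
# API is concerned, so the PC-side overhead collapses.
instanced_calls = 1
print(frame_cpu_ms(instanced_calls, PC_OVERHEAD_US))  # 0.03 (ms)
```

This is also why instancing and DX11's multi-threaded display lists only help so much: they reduce or parallelize the per-call cost, but they don't remove the layer itself.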
 
The 360 is DX-ish, and the PS3 uses a low-level library called libGCM. Both consoles can bypass the API and do all sorts of hackery: they can just dump data into the GPU command buffer, they don't need the driver trying to make sure you don't bring the system down in flames, etc.


Is there overhead? Yes, but the layer of abstraction on PC isn't really the end of the world. Freeing CPU time is not going to save you if you're completely and utterly GPU-bound to begin with. The bad part is hoping for driver support in the first place, like the multi-threaded rendering that DX11 was supposed to bring to the table. Great feature, totally attacks a major problem on the PC side... in theory -- but it might be a performance loss in the end because support is still flaky on the driver side. I think it's overblown, though; shit port jobs will run badly, and that's not surprising. If you don't treat the platform right, it won't work well. Even 360-to-PS3 ports are shit sometimes. COD occasionally runs at like half the framerate of the 360 version.
OK, so the 360 programming library is 'based on DX' and the PS3 programming library is 'based on OpenGL' :D

But yeah, dumping right onto the hardware is a huge advantage, and it's something Carmack has said he wishes he could do more, but GPU makers don't allow it. That's the whole point of this thread.

I imagine PC side will benefit too if the next consoles are x86 machines and developers are writing optimized, vectorized assembly and stuff to try and milk the most out of the platform.
I totally see that. The speculation even says that both the 720 and PS4 are using Hybrid CrossFire. I can see console-ported PC games taking full advantage of that, which may mean next-gen games ported to PC won't be nearly as bad as the current gen. All console games/ports will be natively designed to support multi-GPU, which is great for PC gamers receiving those ports.
 
OK, so the 360 programming library is 'based on DX' and the PS3 programming library is 'based on OpenGL' :D

But yeah, dumping right onto the hardware is a huge advantage, and it's something Carmack has said he wishes he could do more, but GPU makers don't allow it. That's the whole point of this thread.

I totally see that. The speculation even says that both the 720 and PS4 are using Hybrid CrossFire. I can see console-ported PC games taking full advantage of that, which may mean next-gen games ported to PC won't be nearly as bad as the current gen. All console games/ports will be natively designed to support multi-GPU, which is great for PC gamers receiving those ports.

libGCM isn't OpenGL, it's just a thin wrapper over the hardware. :)

But calling it a HUGE advantage is perhaps overselling it. DX10+ has helped fix some of the worst offenders; in DX9, for example, certain operations required a user/kernel mode switch, which is partly why they were very expensive, and there are other good features to help drive the overhead even lower. Still, it's not going to be a game changer if that isn't where you're bottlenecked anyway.
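The user/kernel switch cost is easy to get a feel for in any language. Here's a rough Python comparison between a plain user-mode function call and os.getpid(), which on most platforms performs a real system call; absolute numbers will vary wildly by machine, so only the gap is interesting.

```python
import os
import timeit

# Compare a call that never leaves user mode with one that (on most
# platforms) crosses into the kernel. The kernel crossing is the kind
# of per-call cost DX9-era drivers paid on certain operations.

def user_mode_only():
    return 42

n = 200_000
user_time = timeit.timeit(user_mode_only, number=n)
kernel_time = timeit.timeit(os.getpid, number=n)
print(f"user-mode call: {user_time:.4f}s  syscall: {kernel_time:.4f}s")
```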
 
I understand consoles often run below 720p, sometimes with poor textures, but I am often amazed that they can play some games (Crysis 2) at all.


I would appreciate if someone could give some insight into this and if PCs are at any disadvantage.

I think you underestimate just how shitty they actually play those games. You can play Crysis 2 on an 8600GT as well, which from what I understand has similar performance to the console GPUs.

I don't know a lot about interpreting GPU specifications... but looking at these numbers...

http://www.gpureview.com/GeForce-8600-GT-card-513.html
http://en.wikipedia.org/wiki/RSX_'Reality_Synthesizer'
http://en.wikipedia.org/wiki/Xenos_(graphics_chip)

8600GT
FLOPS: 113.28 GFLOPS
Texture Fill Rate: 8640 MTexels/sec
Pixel Fill Rate: 4320 MPixels/sec
Memory Bandwidth: 22.4 GB/sec

RSX
Floating Point Operations: 400.4 Gigaflops (24 * 27 Flops * 550 + 8 * 10 Flops * 550)
Maximum texel fillrate: 13.2 GigaTexels per second (24 textures * 550 MHz)
Peak pixel fillrate (theoretical): 4.4 Gigapixel per second
22.4 GB/s read and write bandwidth

Xenos
240 GFLOPS
Maximum texel fillrate: 8 gigatexels per second (16 textures × 500 MHz)
4 gigapixels per second without MSAA (8 ROPs × 500 MHz)
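Those theoretical numbers are just clock rate times unit count, so the arithmetic is easy to sanity-check. The unit counts below are taken from the pages linked above; the 8600GT figures assume a 540 MHz core clock and 1180 MHz shader clock, and the Xenos figure assumes 10 FLOPS per shader ALU.

```python
# Re-deriving the quoted theoretical throughput numbers.

def gflops(alus, flops_per_alu, mhz):
    """Theoretical GFLOPS = ALU count * FLOPS per ALU per clock * clock."""
    return alus * flops_per_alu * mhz / 1000.0

def fillrate(units, mhz):
    """Theoretical fillrate in Gtexels/s or Gpixels/s."""
    return units * mhz / 1000.0

# RSX: 24 pixel pipes * 27 FLOPS + 8 vertex pipes * 10 FLOPS, at 550 MHz
rsx = gflops(24, 27, 550) + gflops(8, 10, 550)
print(rsx)                              # 400.4

# Xenos: 48 unified shaders * 10 FLOPS at 500 MHz
print(gflops(48, 10, 500))              # 240.0
print(fillrate(16, 500))                # 8.0  Gtexels/s (16 TMUs)
print(fillrate(8, 500))                 # 4.0  Gpixels/s (8 ROPs)

# 8600GT: 32 stream processors * 3 FLOPS at the 1180 MHz shader clock
print(round(gflops(32, 3, 1180), 2))    # 113.28
print(fillrate(16, 540))                # 8.64 Gtexels/s = 8640 MTexels/s
print(fillrate(8, 540))                 # 4.32 Gpixels/s = 4320 MPixels/s
```

So the quoted figures are internally consistent; the caveat, as always, is that peak numbers say nothing about how much of that throughput a real game reaches.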

Now if someone wants to tell me I'm wildly wrong in my interpretation of these specifications, that's cool; I'm no GPU architect :p But to me it sounds like the 8600GT is the underdog of those GPUs... yet it will still play Crysis 2 on low settings at 1024x768 (the same resolution as the consoles). Just YouTube "crysis 2 8600GT" and you'll find lots of videos of people claiming around 25-30fps.

http://www.youtube.com/watch?v=btd07E-lC3E

My personal opinion is that PC gaming performance, relative to the hardware available, is actually similar to console performance relative to its hardware. People talk a lot about consoles being able to take better advantage of their hardware, but I haven't actually seen much evidence for that, other than the consoles' limited memory being better utilised. I'm happy to change that opinion if people can provide some evidence, though.

The most I think you can say is that game developers lower their graphical settings to the point where the game is optimised for a console, whereas if you have an extremely low-end PC of similar performance, the developers may not have included options to go to such shitty low settings to account for your computer.
 
I think you underestimate just how shitty they actually play those games. You can play Crysis 2 on an 8600GT as well, which from what I understand has similar performance to the console GPUs.

I don't know a lot about interpreting GPU specifications... but looking at these numbers...

http://www.gpureview.com/GeForce-8600-GT-card-513.html
http://en.wikipedia.org/wiki/RSX_'Reality_Synthesizer'
http://en.wikipedia.org/wiki/Xenos_(graphics_chip)

8600GT
FLOPS: 113.28 GFLOPS
Texture Fill Rate: 8640 MTexels/sec
Pixel Fill Rate: 4320 MPixels/sec
Memory Bandwidth: 22.4 GB/sec

RSX
Floating Point Operations: 400.4 Gigaflops (24 * 27 Flops * 550 + 8 * 10 Flops * 550)
Maximum texel fillrate: 13.2 GigaTexels per second (24 textures * 550 MHz)
Peak pixel fillrate (theoretical): 4.4 Gigapixel per second
22.4 GB/s read and write bandwidth

Xenos
240 GFLOPS
Maximum texel fillrate: 8 gigatexels per second (16 textures × 500 MHz)
4 gigapixels per second without MSAA (8 ROPs × 500 MHz)

Now if someone wants to tell me I'm wildly wrong in my interpretation of these specifications, that's cool; I'm no GPU architect :p But to me it sounds like the 8600GT is the underdog of those GPUs... yet it will still play Crysis 2 on low settings at 1024x768 (the same resolution as the consoles). Just YouTube "crysis 2 8600GT" and you'll find lots of videos of people claiming around 25-30fps.

http://www.youtube.com/watch?v=btd07E-lC3E

My personal opinion is that PC gaming performance, relative to the hardware available, is actually similar to console performance relative to its hardware. People talk a lot about consoles being able to take better advantage of their hardware, but I haven't actually seen much evidence for that, other than the consoles' limited memory being better utilised. I'm happy to change that opinion if people can provide some evidence, though.

The most I think you can say is that game developers lower their graphical settings to the point where the game is optimised for a console, whereas if you have an extremely low-end PC of similar performance, the developers may not have included options to go to such shitty low settings to account for your computer.


As one of the previous guys pointed out, is the video card the 256MB version? Also, is the total system memory 512MB, not accounting for what the operating system etc. takes out of that? Does the system they ran Crysis 2 on have a better processor for the task? I don't know if that stuff ultimately means much, but I would think all variables need to be taken into account. Is Crysis 2 CPU-heavy at all? Etc.
 
Armhand, writing to the metal means writing a program that sends instructions directly to the hardware. Console programmers can do this because they know exactly what hardware is there (which graphics chip, which sound chip, which CPU, etc.). On the PC, the programmer sends instructions to a software layer (the OS plus a hardware driver), which in turn sends instructions to the hardware. This overhead slows things down.

Another advantage of knowing exactly what the hardware is, is being able to optimize the game for that hardware. Suppose they make a game that runs at 60fps, but at one point it dives to 20fps on that hardware. They can rewrite that part of the game to do something else that the specific hardware handles better, removing the dip. On a PC, though, optimizing for one machine (e.g. fast CPU, slow graphics card) might hurt performance on another (e.g. slow CPU, fast graphics card) whose hardware does different things well.
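That tradeoff can be sketched with a toy model where frame time is the maximum of CPU and GPU time. All the numbers below are invented; they just show how shifting the same work between the two units helps one configuration and hurts another.

```python
# Toy model: a frame takes as long as its slower half, so moving work
# from the GPU to the CPU helps a fast-CPU/slow-GPU machine and hurts
# a slow-CPU/fast-GPU machine. All numbers are arbitrary units.

def frame_ms(cpu_work, gpu_work, cpu_speed, gpu_speed):
    return max(cpu_work / cpu_speed, gpu_work / gpu_speed)

baseline = (10.0, 20.0)          # (cpu_work, gpu_work) per frame
shifted = (18.0, 12.0)           # same scene, work moved onto the CPU

fast_cpu_slow_gpu = (2.0, 0.5)   # (cpu_speed, gpu_speed)
slow_cpu_fast_gpu = (0.5, 2.0)

for label, machine in [("fast CPU / slow GPU", fast_cpu_slow_gpu),
                       ("slow CPU / fast GPU", slow_cpu_fast_gpu)]:
    before = frame_ms(*baseline, *machine)
    after = frame_ms(*shifted, *machine)
    print(f"{label}: {before:.1f} ms -> {after:.1f} ms")
```

On the first machine the "optimization" cuts the frame time; on the second it nearly doubles it, which is exactly why a console tweak that targets one fixed configuration doesn't translate cleanly to PCs.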

Finally, they just work a whole lot harder on console programming. They program to the lowest common denominator and they have to work very hard to get something nice out of the puny consoles. Then with little effort, they can just port it to a PC. The PC's greater power means they don't have to spend the time optimizing it, or doing tricks to get every last ounce of performance out of a PC. And, if you don't want your game dipping to 20fps, get a faster graphics card.
 
As one of the previous guys pointed out, is the video card the 256MB version? Also, is the total system memory 512MB, not accounting for what the operating system etc. takes out of that? Does the system they ran Crysis 2 on have a better processor for the task? I don't know if that stuff ultimately means much, but I would think all variables need to be taken into account. Is Crysis 2 CPU-heavy at all? Etc.

That video I linked was running the 512mb version, but there's other similar videos of people running the 256mb version.

The CPUs being used in those videos, again, are pretty shit. In one video the guy is running an Athlon 64 X2 4400+; in the one I posted above, an E2180 at 2.00 GHz. I'm honestly not sure how they compare to the console CPUs (the PS3 CPU has a different architecture, so I don't know if it's easy to compare at all), but yeah, those are pretty low-end CPUs.

http://www.cpubenchmark.net/cpu_lookup.php?cpu=Intel+Pentium+Dual+E2180+@+2.00GHz

System memory I'm not sure about... I did say "other than in the memory limitation of the consoles being better utilised". ;) That's the one thing I think PCs have to take into account: even though Windows doesn't consume much performance while gaming, it still has to be held in memory, so the minimum memory requirement for PCs would be higher. Once you're up and gaming, I think Windows takes far fewer system resources than people make out (unless it's horribly bloated with lots of background tasks running), though it does still have to be held in memory.

Basically, I hear a lot of stuff like what Stone Cold is saying, with the overhead and blah blah blah, but not so much evidence that it's actually affecting performance. Both consoles and PCs will have SOME overhead; I don't think I've ever heard of someone actually trying to quantify it, and from my experience, it really doesn't seem as big as people make it out to be.
 
I watched an interview and saw some quotes from Carmack about his beef with drivers and APIs, and why consoles have some specific advantages over PCs when it comes to programming: this infamous "close to the metal" stuff.

It's always vaguely touched upon and rarely fully explained (unless someone here has a good video).

Anyway, is there in fact an advantage of console hardware vs pc for gaming?

Now when I look at the specs of the Xenos, although it has 48 shaders like the X1900, it has half the bandwidth (comparable to an X1650 XT) and half the ROPs. Yet people incessantly compare the two, same as RSX vs. 7800 GTX. Is it unfair to compare these? I see it happening all the time.

They seem very gimped.

I understand consoles often run below 720p, sometimes with poor textures, but I am often amazed that they can play some games (Crysis 2) at all.


I would appreciate if someone could give some insight into this and if PCs are at any disadvantage.

Carmack is an idiot when he talks... His glory days are long gone because he lost FOCUS on what is important to gamers and instead got greedy and went after the money...

BTW, I did this comparison in the other threads on this subject, and the 360 cannot even keep up with an ATI X1800 XL...
 
My understanding of it, feel free to disagree:

It's more of a challenge to optimize a game to run on PC across all its varieties of configurations. As things get more and more advanced, it will become even more of a challenge, unless someone continually simplifies drivers enough that game developers can focus on the game rather than on compatibility issues.

I'm sure many of us would be fine if Carmack just focused on simplifying drivers and whatnot; he seems better at that kind of stuff than at working with game designers to produce a fun game. Of course, he'll do what he wants, and that's fine too.
 
he lost FOCUS on what is important to gamers and instead got greedy and went after the money.....
He's never come across to me as somehow "for the gamers", but rather "for the game designers". I also don't believe he's greedy, because why would he continue to take risks by trying new things?
 
^trying new things? like what? Rage? that was a joke on the PC......

Doom 2 and Quake 1-3 were all fun games to play; Doom 3 was even OK once they ironed out the bugs.

Just think back to all of the AWESOME games that used those engines....
 
It's more of a challenge to optimize a game to run on PC across all its varieties of configurations. As things get more and more advanced, it will become even more of a challenge, unless someone continually simplifies drivers enough that game developers can focus on the game rather than on compatibility issues.

Probably not nearly as much as you might think.

There are some gotchas here and there, like maybe AMD favors one approach while NV favors another, but it's not like you're laying out 12312838 pieces of hardware and going FUCK, GOTTA OPTIMIZE FOR EVERY SINGLE COMBINATION EVER; that's what the driver and the layer of abstraction save us from.
 
^trying new things? like what? Rage? that was a joke on the PC......

Doom 2 and Quake 1-3 were all fun games to play; Doom 3 was even OK once they ironed out the bugs.

Just think back to all of the AWESOME games that used those engines....


Does he even make hte games any more? or just build the tools for the development team to use?
 
it's not like you're laying out 12312838 pieces of hardware and going FUCK, GOTTA OPTIMIZE FOR EVERY SINGLE COMBINATION EVER; that's what the driver and the layer of abstraction save us from.
Yeah, but only to an extent; beyond that you're dealing with diminishing returns. What makes matters even worse is dealing with graphics techniques that are fundamentally different from those in almost any game out there.

The fact that ports typically turn out so shoddy is a glimpse into why drivers don't auto-magically make everything work smoothly. Ever notice UE3's flashy cross-platform development demos? They wouldn't stress that so much if cross-platform work weren't typically a huge headache for game programmers.

I'm not sure how much time Carmack spent on the various aspects of the engine or tools, but he sure did talk a lot about cross-platform delivery strategies. Like it or not, many of the top programming minds are spending more and more time trying to figure out how to simplify working with the evolving complexities of new hardware. You have to set priorities, though, and not focus so much on the technology that you lose sight of what's really important: the game. That's arguably what happened with id and other companies. Things have a way of balancing out, though, and some companies die in the process. I'm excited to see what's around the corner, and if it's Carmack leading the way, great; if not, someone else will, and I just hope they get recognized for it.
 
^trying new things? like what? Rage? that was a joke on the PC......

Doom 2 and Quake 1-3 were all fun games to play; Doom 3 was even OK once they ironed out the bugs.

Just think back to all of the AWESOME games that used those engines....

I think part of the problem with id Software is that they have had a little too much internal drama. John Carmack is still the resident genius, but for whatever reason he lost his passion for PC programming. That, and the fact that they fired, for one reason or another, so many of the original founders/crew who made the Doom and Quake series.

It could be partly PC gamers' fault; Quake 3 and Doom 3 were both lambasted pretty hard. Now, with easy access to internet forums and ways to post from almost every device, the masses seem meaner and fussier about PC games.
 
I think part of the problem with id Software is that they have had a little too much internal drama. John Carmack is still the resident genius, but for whatever reason he lost his passion for PC programming. That, and the fact that they fired, for one reason or another, so many of the original founders/crew who made the Doom and Quake series.

It could be partly PC gamers' fault; Quake 3 and Doom 3 were both lambasted pretty hard. Now, with easy access to internet forums and ways to post from almost every device, the masses seem meaner and fussier about PC games.

I actually enjoyed Doom 3 once I got hardware that could run it and they fixed the dang flashlight.

Quake III got ripped for not having a story and for just being a UT clone...
 
quake 3 and doom 3 both were lambasted pretty hard

IMO it's only by a small percentage of people who yelled and screamed about it.

For the most part, when people enjoy a game, they just go about playing it and enjoying life, feeling no need to go on forums giving it praise. It's the depressed, butt-hurt ones you mostly see on forums, complaining about how life is being unfair to them. They're the ones who typically need an outlet.
 
IMO it's only by a small percentage of people who yelled and screamed about it.

For the most part, when people enjoy a game, they just go about playing it and enjoying life, feeling no need to go on forums giving it praise. It's the depressed, butt-hurt ones you mostly see on forums, complaining about how life is being unfair to them. They're the ones who typically need an outlet.

Like Mass Effect 3 (puts on flame suit).

I forgot developers were making games to make money. I guess that's easier to do on consoles. Call me crazy.
 