Next-gen consoles: why the low specs?

dave343

Why are these next-gen consoles packed with low-end hardware?

While I do think it was good to go with the AMD 8-core, why are they running at 1.6(ish) GHz and not 3 GHz, which seems like it wouldn't have been a problem to achieve? The default clock on CPUs these days seems to kick around 3 GHz+... and although I'm aware GHz isn't everything, it sure would have given these new machines a boost since they'll be around for a good 6+ years. I can't see heat being a big issue; after all, pretty much all new CPUs these days are very low-powered, and 45nm or less.

2nd, the video card. When the 360 and PS3 were released, they used ATI's and NV's top-line chips, the X1800 XT and the 7800 GTX, which I think is why they've been able to squeeze so many years out of the current consoles. The new One (correct me if I'm wrong) has the equivalent of a 7700, and the PS4 slightly better with an equivalent of a 7850. Why didn't they try for something higher, as in a 7870 for both consoles, or something between a 7870 and 7950...? The GPU in the One isn't exactly top-end in the PC world, and yet it has to do for a good 6+ years again... I can't see the cost being THAT huge. They expect these consoles to last how long?

I do praise Sony for using GDDR5, but why MS went with DDR3 is beyond me :rolleyes:
 
It does come down to cost. They both lost a considerable sum of money in their first years, and that is also the reason they had to stretch the "life" of their consoles for so long; this time around they preferred to make a buck from the first year on.

And well, even though for us PC guys they both are kinda sucky, we have to admit that on raw numbers even the One represents an increase of around 5x over Xenos (which was the most powerful GPU in the last consoles). Add to that their apparent target of around 720p/60 or 1080p/30 on the One, and you could argue that it is enough for that.

It is kinda hard to compare the Cell and Xenon CPUs to the upcoming Jaguar-based ones though; the tech is quite different between all three. But one thing is for certain: in the current performance-per-watt arena it is hard to beat, and it also gave both Sony and MS the benefit of a widely known and "safe" base in the form of x86 (x64, actually).
http://www.anandtech.com/show/6976/...wering-xbox-one-playstation-4-kabini-temash/5

So, yeah, it is all about cost... and, well, the memory. Sony admittedly lucked out on the price of GDDR5 coming down (they first toyed with 2GB, then 4GB, and finally secured the deal for 8GB). MS tried to mimic their Xenos success in a way that I consider unbalanced myself: Xenos had 10MB of eDRAM, so this time around they upped the ante to 32MB of better eSRAM, which arguably may not be enough to deliver enough "free" effects at 1080p. Time will tell, but don't hope for a miracle on this front; anyone who does is delusional.
 
Note that DDR3 and GDDR3 are not the same thing. GDDR5 is based on DDR3, so knowing just the types and nothing else, is it not possible that both give the same performance?
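For what it's worth, the type name alone doesn't settle it: peak bandwidth is roughly bus width times effective data rate. Here's a quick back-of-the-envelope sketch in Python using the commonly reported figures for both consoles (treat the widths and data rates as assumptions, not official specs):

Code:
# Rough peak-bandwidth comparison of the two consoles' main memory.
# Bus widths and effective data rates are the widely reported figures;
# treat them as assumptions rather than official specs.

def peak_bandwidth_gbs(bus_bits, mega_transfers_per_s):
    # Peak GB/s = (bus width in bytes) * (mega-transfers per second) / 1000
    return (bus_bits / 8) * mega_transfers_per_s / 1000

ps4_gddr5 = peak_bandwidth_gbs(256, 5500)  # 256-bit GDDR5 @ 5.5 GT/s
one_ddr3  = peak_bandwidth_gbs(256, 2133)  # 256-bit DDR3-2133

print(f"PS4 GDDR5: ~{ps4_gddr5:.0f} GB/s")  # ~176 GB/s
print(f"One DDR3:  ~{one_ddr3:.0f} GB/s")   # ~68 GB/s

By that arithmetic the PS4's main memory has roughly 2.5x the peak bandwidth, though the One's 32MB eSRAM is a separate fast pool that narrows the gap for whatever fits in it. So no, the same underlying DRAM family does not mean the same performance.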

Also keep in mind that when you are developing for a single CPU, single motherboard, and single video card, you can get better results than you can for the average PC. For the average PC game you have to allow for a wide range of hardware, which means you have to make operations more generic so that they work on various cards from various makers. By targeting a single card you can access 100% of the power of that card by optimizing the code for just that card.

There is also cost. Sure, they could go for top-of-the-line hardware, and then the consoles would cost $2000+ each and nobody would buy them. Remember that a console and a PC are two different things.
 
We can hope that console prices will drop earlier and further than in previous gens as the cost of the hardware decreases.
 
Console specs are decided years before release, so they chose the components for this gen back in 2010. Then you add the desire to reduce cost, achieve yields, etc. Specs in the end don't matter; it's only because of Internet forums that people even care. The average consumer doesn't give a rat's ass what the specs are; it's like an appliance. Do you care about the clock speed of your TV, which also has a CPU?
 
While I do think it was good to go with the AMD 8-core, why are they running at 1.6(ish) GHz and not 3 GHz, which seems like it wouldn't have been a problem to achieve? ...
There are several factors that go into the CPU choice.

Without getting too complicated, AMD's 8-core chip is not an 8-core in the sense that Intel's 8-core is. An Intel 8-core chip is 8 integer units and 8 FPUs, where AMD's is 8 integer units and only 4 FPUs. PS4 and One games will likely be primarily 4 threads to take the most advantage of the AMD CPU. AMD ships these chips at 3GHz-plus on PC to try to compete with Intel. On a console, where you know everyone has the same hardware, they can ship you a slower chip to save power, reduce heat, and increase yields (saving production cost). Since everyone has identical hardware, it's easier to code for, so there is efficiency to be had. But the three factors I listed are the primary reason.
2nd, the video card... Why didn't they try for something higher, as in a 7870 for both consoles, or something between a 7870 and 7950? ... They expect these consoles to last how long?
On PS3 and 360, at least initially, the GPU was a separate chip from the CPU. This meant each could have a bigger die area and its own cooling. Eventually the 360 merged its CPU and GPU into one chip (like what is in the PS4/One).

Because the PS4 and One use an SoC, they have to consider that there are only so many transistors that can fit on the same die. Getting 7000-series AMD graphics onto the same die as an 8-core CPU is very impressive. Some PS4 games are shipping at 1080p @ 60 FPS (KZ4, for example), so the chips are by no means underpowered.

I do praise Sony for using GDDR5, but why MS went with DDR3 is beyond me :rolleyes:
Microsoft went with DDR3 to save money. Their SoC is very complex and expensive, just like the Kinect is. If they had gone with GDDR5, they would have had to cut elsewhere to fit the $499 price. That might have meant removing many of the other SoC components that will be used to enhance the TV features and such. Also, I think the original leak of the PS4 only having 4GB played into this decision as well: Microsoft could slap 8GB of cheap memory on the console, proclaim "we have more memory than PS4," and non-technical people would think the One was better because of it.
 
Yeah, it's essentially down to cost and viability.
They want to make a profit on each sale of this console, and it looks like they will.

Temps have to be kept down if the console is to last; a bad rep for reliability will harm a company unless it spends a lot to remedy it.
Take a look at the 360!
Better cooling either costs more, takes more space, or uses more power, so the easiest solution is to use the minimum amount of power/heat you can get away with, keeping the console's footprint small and costs down.

Slower parts cost a lot less, and you can die-shrink older designs too.
They produce less heat and create fewer design headaches in utilising the throughput and keeping things stable.
They are also tried and tested, so there are few surprises to cope with.

They have done the bare minimum to produce gaming at 1080p while promoting their online software business.
 
Why are these next-gen consoles packed with low-end hardware? ... They expect these consoles to last how long?
The short answer is that both Sony and MS lost billions of dollars on hardware costs and warranties trying to cram high-end hardware into a set-top box that retails for <$500. The PS3 cost around $800 to build at launch and sold for $599, meaning Sony took a roughly $200 hit on every console sold. They don't have the financial clout to do that now (realize they just had to sell their US headquarters building to pay off debt).

Also, they're still targeting 1080p or less for the new consoles, which you don't need an expensive GPU to do these days. 4K is still years away from being mainstream, and even if you put the most expensive single GPU in the new consoles, it still wouldn't have enough power to handle 4K. So they can target 1080p, sell a cheaper console, and still offer competitive visuals.

These consoles probably won't age as well as the PS3/360, but I don't think MS or Sony care.

Also, RE: the clock speed of the AMD Jaguar: it's a low-end CPU, designed for low-power / low-heat applications, and I don't think they sell any Jaguar parts over 2GHz at this time. It's not a design comparable to any mainstream Intel or AMD processor, so you can't really say "well, my i7 has been at 3GHz for years, why isn't this CPU?" It wasn't designed that way.
 
I wouldn't be too worried about the CPU. If the programmers properly thread their games, they should be able to balance out the load across the available cores even though they are ~1.6GHz each. I'd be more concerned about the GPU power.
 
I wouldn't be too worried about the CPU. If the programmers properly thread their games, they should be able to balance out the load across the available cores even though they are ~1.6GHz each. I'd be more concerned about the GPU power.

Totally agree. People underestimate how much you can do with this much CPU power and optimized programming.
 
Totally agree. People underestimate how much you can do with this much CPU power and optimized programming.

I wouldn't be so quick on this one myself. Though the CPU, number-wise, seems pretty decent, the actual processing capability of the cores is weak. Like, super weak.

Based on benchmarks of the Kabini APU (which uses the same Jaguar cores as the consoles), an 8-core 1.6GHz Jaguar-based APU (e.g. the next-gen consoles) is about on par with a 3GHz DUAL-core Intel i3.

http://www.anandtech.com/show/6974/amd-kabini-review/5

The AMD Kabini in question is the A4-5000 @ 1.5GHz per core. The Intel Pentium 2020M, ya know, the one that kicks the A4-5000's ass, is a 2.4GHz dual-core, dual-thread CPU.

So though the 8 threads might be able to do some interesting things, the fact of the matter is that an Intel i3 is faster. I'll repeat that... an Intel i3 is faster than the 8-core CPU in the consoles.

That's pathetic. Truly. The nearly 7-year-old Intel Q6600 quad-core would absolutely wipe the floor with these "next-gen" CPUs. It's crazy how slow the next-gen APUs are CPU-wise (well, and GPU-wise if you think about it).

1080p/60 not good enough for first-gen games (Killzone)?

That's fine... the problem is that most games won't do this. We'll see down the road, but I see 1080p/60fps becoming a rarity pretty quickly as developers push for higher-end graphics. Give it a couple of years and I bet 720p/30fps will become fairly normal. This generation is certainly not "NEXT-GEN" but more "update-gen". True next-gen, in this day and age, would for me be a consistent 1080p/60fps as a MINIMUM.
 
So you think the new consoles, with many times the GPU power of the current gen, are going to be *barely* better with resolution and frame rate?
 
After some time, when people want prettier graphics? Yes, I do.

The current consoles show this perfectly. To run a game like Halo, for instance, they had to drop the resolution down to something like ~600p and 30fps. You think the new consoles are going to have graphics that consistently increase in fidelity while 1080p/60fps is maintained? I don't. I think this time around we certainly won't see lower than 720p; that, at minimum, will be maintained. Possibly 900p. However, 1080p/60fps will be dropped sooner or later. Not for ALL games, but a lot.

Let's say 1080p/60fps is the current going rate, right? Well, if developers go 1080p/30fps, that's AT LEAST 30-40% more prettiness they can add to games. If they go to 720p/60fps, that's less than half the pixels of 1080p/60fps... so again, you're looking at 30-40% just from changing the resolution.

If you go 720p/30fps, however... you're looking at nearly a 100% increase in what could be done graphically and effects-wise (particles, physics, A.I., etc.) over 1080p/60fps. Considering these new consoles will probably be around for the next 7-8 years... yeah. I totally see 720p/30fps becoming a reality. It would have to if devs want to increase their graphics.
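The raw pixel arithmetic behind that is easy to check. A quick sketch (pure pixel throughput; real rendering cost doesn't scale linearly with pixel count, so this is only an illustration):

Code:
# Raw pixels-per-second at different resolution/framerate targets,
# relative to 1080p/60. Pure arithmetic, not a rendering benchmark.

targets = {
    "1080p/60": (1920 * 1080, 60),
    "1080p/30": (1920 * 1080, 30),
    "720p/60":  (1280 * 720,  60),
    "720p/30":  (1280 * 720,  30),
}

baseline = 1920 * 1080 * 60
for name, (pixels, fps) in targets.items():
    rate = pixels * fps
    print(f"{name}: {rate / 1e6:6.1f} Mpix/s ({rate / baseline:.0%} of 1080p/60)")

720p/30 pushes only ~22% of the pixels of 1080p/60, i.e. roughly 4.5x the per-pixel budget, which is even more headroom than the rough percentages above suggest.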
 
I think you should go play Halo 3 and Halo Reach. Halo 4 looks better and seems to run better with all that extra shit going on.

Halo 3 ran at 640p on the 360. By your logic, Halo 4 should be running at something like 480p because people are stupid and they want pretty games, right? So obviously Halo 4 would need to drop the resolution?

Halo 4 runs at 720p native. Years of development improvements led to a game that looks much better and runs at a higher resolution, not the other way around. There is no reason to think otherwise for the next-gen platforms.

Also, things like particles, physics, and AI can be done independent of screen resolution. If that code takes up 10% of the CPU at 720p/30, it uses the same 10% at 1080p/30.
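A toy example of that split (the numbers are invented purely to illustrate): simulation work is per-frame, shading work is per-pixel, so dropping the resolution frees GPU time while the CPU share stays put.

Code:
# Invented budgets: CPU-side simulation (AI, physics, particles) is
# per-frame work; GPU-side shading scales with pixel count.

FRAME_MS = 33.3  # frame budget at 30 fps
SIM_MS = 3.3     # hypothetical CPU simulation cost per frame

def gpu_shading_ms(pixels, ns_per_pixel=10):  # hypothetical shading cost
    return pixels * ns_per_pixel / 1e6

for name, pixels in [("720p/30", 1280 * 720), ("1080p/30", 1920 * 1080)]:
    print(f"{name}: CPU sim {SIM_MS / FRAME_MS:.0%} of budget, "
          f"GPU shading {gpu_shading_ms(pixels):.1f} ms")

The simulation's ~10% of the frame is identical at both resolutions; only the GPU load moves.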
 
Halo 4 runs at 720p native. Years of development improvements led to a game that looks much better and runs at a higher resolution, not the other way around.

Well, if I'm wrong I'm wrong; if I'm right I'm right.

Either way, I can tell you that I think what I said will be the case. I mean, the Xbox One version of BF4 is said to be running at 900p/60 as it is. So who knows? I know that I personally would love to see the graphics a system like the PS4 could push at 720p/30 if properly utilized. Look at a game like the new Killzone... that's 1080p/60... could you imagine what kind of graphics they could push at 720p/30? :eek:
 
I think Joe is somewhat correct, but I doubt all games will do this. It will likely come down to developers and what they want to do. Racing and fighting games will hopefully always be 1080p/60, while FPS games will be half and half, and platform games 720p/30 unless they choose to do something different.

Though I don't believe the CPU is underpowered at all, given the GPU and the nature of consoles.
 
I wouldn't be so quick on this one myself. Though the CPU, number-wise, seems pretty decent, the actual processing capability of the cores is weak. Like, super weak.

Ehh, you are looking at this from the wrong perspective: you are looking at it from a PC perspective.

You gotta put yourself in the shoes of the console designers and compare against the current generation:

http://www.eurogamer.net/articles/digitalfoundry-tech-interview-metro-2033?page=4

"Oles Shishkovstov: You can calculate it like this: each 360 CPU core is approximately a quarter of the same-frequency Nehalem (i7) core. Add in approximately 1.5 times better performance because of the second, shared thread for 360 and around 1.3 times for Nehalem, multiply by three cores and you get around 70 to 85 per cent of a single modern CPU core on generic (but multi-threaded) code."

So a full 360 is ~70-85% of a single Nehalem i7 core; that is remarkably weak, don't you think? With that perspective in mind you may understand why even the weak Jaguar cores are a massive improvement, especially considering that this generation isn't expected to last as long as the last one.
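His arithmetic is easy to reproduce; the per-core ratio and the SMT scaling factors below are his estimates from the quote, not measurements:

Code:
# Reproducing Shishkovstov's estimate from the quote above.
per_core_vs_nehalem = 0.25         # one Xenon core ~ 1/4 of a Nehalem core
xenon_smt, nehalem_smt = 1.5, 1.3  # benefit of the second hardware thread
cores = 3

whole_360 = per_core_vs_nehalem * (xenon_smt / nehalem_smt) * cores
print(f"Whole Xbox 360 CPU ~ {whole_360:.0%} of one Nehalem core")  # ~87%

That lands right around the top of his "70 to 85 per cent" range.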
 
So a full 360 is ~70-85% of a single Nehalem i7 core... even the weak Jaguar cores are a massive improvement.

OK, so the 360 CPU is weak as hell? What does that have to do with the Jaguar being slow as hell as well? That it's slightly better? That it's comparable to two Nehalem cores instead of one? Either way it's still shit, especially considering it's 2013 and these are supposed to be "next-gen".
 
OK, so the 360 CPU is weak as hell? What does that have to do with the Jaguar being slow as hell as well?

Yeap, the point is that for them it is "strong enough for the job", and "good design" from an economic point of view, for someone who wants to maximize earnings, means spending the least money for the best return.

Even with all the extras these new consoles will do this time around, a Jaguar with an extra coprocessor to handle the "video share" function would be more than enough for them. And it is more than two Nehalem cores, with software that will be properly tailored to it.

And the "it's 2013" doesn't fly for them, as they have to answer to investors and neither wanted to be in a position where the RRoD or YLoD could happen, so again, lowest cost with best performance per watt won the day. Just as it has done in the past when it comes to consoles, heck the last generation could be argued to have been an anomaly that ended up biting them both in the ass.
 
And "it's 2013" doesn't fly for them... lowest cost with the best performance per watt won the day.

With the last generation they were losing more money on the consoles than they normally would, by the sound of it. That might explain why they waited longer to come out with another round of consoles.
 
4K TVs aren't going to be mainstream for at least 2 years, and in 5 years the number of people with 4K TVs is still going to be really tiny. It makes no sense to build a console capable of that resolution.

The much bigger memory, unified memory architecture, and on-die GPU make a big difference. There is plenty of room for optimization, as we shall see. Look at the increase in graphics on the 360 from launch to something like Halo 4/GTA5 and you can guess what to expect in 5 years from the next-gen consoles.
 
What you should take from this is:

6 cores dedicated to the game! That should give devs a good reason to use, well... 6 cores. Current games mostly use 1-2 and some use more, but this will really advance the state of game engines using multi-core CPUs efficiently.
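As a sketch of what "using 6 cores well" might look like, here's a minimal job-system pattern that fans independent per-frame systems out to a fixed pool of workers. The system names and worker count are hypothetical, and a real engine would do this in C++ with much finer-grained jobs:

Code:
# Minimal job-system sketch: fan independent per-frame tasks out to a
# pool sized for the cores the game is allowed to use. Python stands in
# for what would be C++ in a real engine.
from concurrent.futures import ThreadPoolExecutor

GAME_CORES = 6  # cores reserved for the game, per the post above

def run_system(name):
    # Placeholder for real per-frame work: AI, physics, animation, etc.
    return f"{name} updated"

systems = ["ai", "physics", "animation", "audio", "particles", "streaming"]

with ThreadPoolExecutor(max_workers=GAME_CORES) as pool:
    for result in pool.map(run_system, systems):
        print(result)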
 
4K TVs aren't going to be mainstream for at least 2 years... It makes no sense to build a console capable of that resolution.

Yea.

A console capable of pushing 4K at 60fps would look like a desktop PC and cost $1600+. 4K isn't going to become THE standard; it's going to remain niche for years. 1080p is going to be the "it" resolution for quite a while still.
 
Why are these next-gen consoles packed with low-end hardware? ... I do praise Sony for using GDDR5, but why MS went with DDR3 is beyond me :rolleyes:

Ugh... this kind of thread again. It's not a fucking computer, dude. When will you idiots stop comparing consoles to computers? Consoles are about sheer convenience and delivering a product that will play games. Go do some research on how much the 3DO, PS3, and the SNK Neo Geo cost at launch in the U.S. If you want high specs comparable to a computer, then go buy a computer. If you want sit-in-front-of-your-TV convenience to play games, and still plenty of gaming capability, then buy a console.

Quite frankly, these threads are starting to annoy the shit out of me. Consoles aren't computers and should stop being compared to them spec-to-spec. It makes no sense. It's like asking why every car can't just be a Ferrari. Well, not everyone around the world can afford a Ferrari, some people don't want to pay a premium for one, and/or it's more than what they need. Some people like driving a Ford Focus: it lasts a long time, is fun to use, and is affordable. Not to mention that the specs you read are ALL dedicated to gaming. You aren't multitasking Chrome, AOL Messenger, VLC media player, downloading apps, updating Steam games, surfing Facebook... The system is a specialized gaming system.

The fact is these systems have to perform an incredible balancing act, which the people who keep bringing up this stupid spec debate can't grasp. The console designers have to take a lot of things into consideration: being affordable enough for people around the planet, being accessible enough to developers that it's easy to make games for, and being high-end enough to deliver a positive leap forward in graphical fidelity... and all of this has to be done while also making a profit. Sony didn't see profits from the PS3 hardware until over 4 years into its lifespan, and at the time the hardware inside the PS3 was revolutionary. However, it cost $500. The Xbox One costs $500, but it includes more than a console.

The graphics are gonna look good; there's no doubt about it. Start focusing on the value of the system rather than the numbers. At the end of the day you're not paying for the specs... you're investing in a console to play some great games and have some quality experiences. I'll leave you with some ACTUAL screenshots from the PS4 so you can judge for yourself how good or bad the graphics are for a $400 machine. It all comes down to the games you want to play. Also, everyone knows that console games get better and better the longer into the generation, so if this is only the beginning... then I think the games will look fantastic 4 years from now.

[Attached PS4 screenshots, including Killzone: Shadow Fall]
 
OK, so the 360 CPU is weak as hell? What does that have to do with the Jaguar being slow as hell as well?

The point is that you can't compare a video game running on a console with a given CPU/GPU combo to one running on a PC with the same CPU/GPU combo.

That slow CPU will give you far better results in a console than it would in a PC.

So yes, it is slow and weak, but it will still play games much better inside that console than it would inside a PC.

When you code something for a PC, you can assume only a few things. You don't know how many cores there are, so you write your code generically: if they have 2 cores then it does stuff with two cores, if they have 6 it uses 6. With the console you know it has 8, so you can spawn threads that each own a single core, and all that core does is process a certain part of the game. Same for the video card: you don't know how many shading units you have, you don't know how many render output processors you have... there are a lot of unknowns. On the console you know exactly.

So you can get rid of all the generic code, which gives you better results. You also don't have to run a full OS meant to work on different hardware; instead you have a streamlined OS doing just the base functionality.
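To illustrate the difference (hypothetical structure, not real engine code): the PC build has to discover its resources at runtime, while the console build can bake the topology in as constants and dedicate cores to specific subsystems.

Code:
# PC build: generic, must adapt to whatever hardware is present.
import os

def pc_worker_count():
    cores = os.cpu_count() or 2  # unknown until runtime
    return max(1, cores - 1)     # leave headroom for the OS, drivers, etc.

# Console build: the topology is a known constant, so work can be
# planned per core up front (assignments here are hypothetical).
CONSOLE_CORES = 8
CORE_ASSIGNMENT = {
    0: "render submit", 1: "physics", 2: "ai", 3: "audio",
    4: "streaming", 5: "particles", 6: "gameplay", 7: "gameplay",
}

print("PC workers:", pc_worker_count())
print("Console plan:", CORE_ASSIGNMENT)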

That is what he is trying to get at. The Xbox 360 didn't have anything great, and yet games designed for the platform looked fine because it is able to do more with less. Sure, some games did look like complete crap and the PC version looked better; Fallout: New Vegas is one I didn't like on the 360, so I bought it for the PC for better graphics. Not going to argue with that. But games like Halo 4 still looked good on the Xbox 360, and the PS3 had similar games that looked just fine. So given that you are still making a decent improvement over the previous gen, even if it's not the best hardware out there, you are still making a jump, which means the quality of games will increase too.
 
Nobody used to care about specs in the early days of consoles, when all the really genre-defining games came out. Now people couldn't care less about gameplay and spend all their time nitpicking graphics, AA, and resolutions.
 
This is a thread about hardware specs; I would expect there to be a discussion of the capabilities of the hardware :p
There are threads discussing gameplay, and many games get slated there. We do care.
 
Nobody used to care about specs in the early days of consoles when all the really genre defining games came out. Now people couldn't care less about gameplay and spend all their time nitpicking about graphics, AA and resolutions.

That is why iOS games never sell, right? Your statement is incorrect. People that play console games don't care about graphics. People that play iOS games don't care about graphics. A small group of people care about graphics above all else; hence the never-ending PC vs. console debate.

If graphics meant everything, then Minecraft wouldn't be such a hugely successful game.

To some people graphics are everything. Others play games to have fun, and that is it. It is like the person who can't watch a movie unless it is 1080p quality or better on an 80-inch-plus screen, or the people who don't listen to music to just enjoy the music, but bitch about the audio not being perfect. Some people aren't going to look at whether a game is fun or not; they will only look at special effects and texture res, and even then will find something to bitch about.

You can't use a site full of PC enthusiasts as the standard for the entire world; you are looking at a group who makes up a very small piece of the gaming market.
 
One thing that's changed a lot since the launch of the 360 and PS3 is the power consumption of high-end GPUs. I bought a 7900 GTX back in 2006; at the time it was top of the line, at least for a single-GPU card, and it had a TDP of 84W. Current high-end cards draw much more: the 7970 and the 780 are both rated at 250W. Along with the added cost, I figure that kind of power draw is more than MS and Sony are willing to accept. So basically, consoles are significantly less powerful than high-end gaming PCs this time around because high-end gaming PCs use way too much power. Plus they cost too much.
 
That slow CPU will give you far better results in a console than it would in a PC... you can get rid of all the generic code, which gives you better results.
People like to say that console games are tailored specifically to the hardware, but I think that is largely untrue for most games. Sure, some system-exclusive titles are optimized very heavily (The Last of Us, for example), but a big reason console games run so well is that they are running sub-720p at ~30fps. You can play games on a PC at 720p and 30fps using an integrated GPU these days.

Yes, you can get better performance writing console-specific code and using low-level tricks, etc. But it's not going to be nearly as much of a performance boost as a decade ago. Modern OSes and APIs on computers are much leaner than they used to be.
 
The fact of the matter is there's no practical benefit to the average person from 4K resolution on their LCD TV unless they are sitting 4 feet away from the set. At a typical 6-8 foot viewing distance there just isn't a difference to be seen unless you have better than 20/20 vision (and I'm sure some do).

Getting cable and satellite providers to even do 1080i properly without showing compression artifacts is still a challenge. Getting movie studios to do proper remastering of original film material for good-quality Blu-ray releases is still a challenge, and even if they did, very few films would see obvious enhancement from 4K resolution over 1080p... maybe Pixar films, Transformers, or other effects-heavy films, and the differences would be quite subtle.

Face the facts... you might be able to buy 4K sets in a couple of years for under $2000, but the amount of 4K content that actually shows appreciable quality gains will be a good 5 years off yet. 4K TV programming from a cable or satellite provider? More than 5 years away; the bandwidth to do it without compression artifacts just isn't there. 4K streaming from Netflix? Good luck unless you have FiOS (and I'm sure some here do), and even then it will have compression artifacts.

Ultra HD resolutions will be limited to PC monitors and users running expensive multi-GPU setups for years yet (at least where a benefit will be seen from these resolutions).

Video content will be somewhat like trying to get good-quality audio by subscribing to Sirius: the compression artifacts will just kill any potential benefit, with too many channels crammed into too little bandwidth, not to mention the bandwidth wasted still supporting the SD channels and, god forbid, the analog versions on basic cable.

Have some comfort in knowing that PC ports of these console games will come more easily, and there will definitely be a quality jump in PC gaming from the new consoles regardless of how powerful they are. Any large-scale upgrade of this aging hardware is a good thing, and the life cycle will probably not be as long on these either. Stick with your 30-inch Dells and multi-GPU setups and sit 2 feet away if you really want to "notice" the benefits of ultra-HD resolutions. :)
 
People like to say that console games are tailored specifically to the hardware, but I think that is largely untrue for most games... Modern OSes and APIs on computers are much leaner than they used to be.

This is VERY TRUE! I hear the "consoles are better optimized" argument all the time, but the fact of the matter is that if you build a PC using closely related hardware, there is a good chance the PC ports are still perfectly playable.

I have no doubt that an Unreal Engine 3-based game could run perfectly fine at 600-720p @ 30fps on medium settings on a dual-core PC with an X1800 GPU and MAYBE 1GB or less of RAM. I don't think consoles are as highly optimized as people give them credit for...
 
Considering the CPU and GPU are from the same vendor and use the same designs across both consoles (just in different quantities), I would expect games to be more optimized this time around.
 
Quite frankly, these threads are starting to annoy the shit out of me... I'll leave you with some ACTUAL screenshots from the PS4 so you can judge for yourself. [Attached PS4 screenshots]

Is that last screenshot (The Witcher 3, I presume?) really supposed to be from a next-generation console game? Those are some really flat textures, especially on the building on the right. Can't we get some bump-mapping over here?
 