Are next-gen consoles already behind the times?

JoeUser

So a buddy of mine and I were talking tech, as we often do, and we got onto next-gen consoles.

We all know the specs, software, and supported features, and as we talked it became painfully obvious how far behind, technology-wise, the next-gen consoles already are...and they haven't even been released yet! This generation is supposedly going to last 10 years...TEN YEARS...yet speed-wise they are already behind MOST computers on this forum.

Ya know, when the PS3 and 360 came out they had some pretty nice state-of-the-art technology. The 360's GPU was actually ahead of, and FASTER than, any desktop GPU at the time. Now we look at the new consoles and they have CPUs that are slower than a ~3GHz i3 (based on Kabini reviews) and GPUs that are lower-mid to mid range compared to desktops.

HOW IS THIS GOING TO LAST 10 YEARS?! I mean, you've got games like DRIVECLUB on the PS4 running at 1080p at 30 FRAMES PER SECOND. In a time when we are moving into the 4K realm, isn't 1080p dying? Isn't a next-gen console running at 30fps kind of ridiculous?

Now I hear you screaming, "BUT THEY WILL GET BETTER CAUSE OF UNTAPPED POWER!!!". Well, you know what? I'm not buying that this time.

The current-gen consoles really were pretty unique at the time. Unified shaders, PowerPC cores, the Cell processor, eDRAM, etc. meant that developers genuinely had to LEARN how to code for those platforms. This generation though...yeah...that simply isn't the case. We're dealing with x86 processors, known GPU architecture, known software and OSes (in the case of the Xbox One it's using Windows 8 for fuck's sake).

So the whole "games will look a lot better 3-4 years down the road" thing...I'm just not too sure about that. I mean, developers are already having a seemingly hard time hitting 1080p at 30fps...you REALLY think there is THAT much room to improve? Improve over what? Take a game like BF4, which was said to run at 1080p at 60fps. How exactly do they plan to achieve that while still having crisp visuals and truly next-gen graphics? How do they plan to keep it up in the future when engines evolve further and even low-end gaming PCs have blown past the next-gen consoles (which it could be argued they already have)?

So in my opinion, as PC gamers we will benefit for a little while, but I see the whole "consoles are holding PCs back" era coming A LOT sooner than we think. Now I know Sony and Microsoft couldn't afford to pump Intel 4770Ks and Nvidia Titans into the new consoles but...come on! My nearly two-year-old PC can ALREADY play next-gen games without a problem (Metro Last Light, Crysis 3, etc). What does this say about the state of next-gen console ports?

I don't know, y'all...I am excited for next-gen because it's about fucking time and it'll give us PC gamers some proper DX11 graphics and features, but damn...next-gen is already last-gen tech-wise.

Am I the only one that finds this sort of unsettling?
 
To the vast majority, the current 8-year-old consoles are just fine. It's only PC gamers who think 1080p on a single monitor is a joke who'd find the new consoles inadequate. 4K monitors are still going to be fairly niche in 10 years; it wouldn't make sense to try to plan for that on a $400 box.

You could be right about optimization of x86 consoles, but on the other hand, there hasn't really been any optimization of the x86 architecture for gaming, so it's not like it's already optimized. A lot of optimization is just knowing exactly what your hardware specs are, which PC game developers never do.

Finally, I remember hype about the current gen being better than PCs at the time, but it turned out to be just hype. The 7800GTX in the 360 hit stores June 2005, one month after the 360.
 
This discussion/argument has been going on since the beginning of time.

The short answer is that consoles have always been behind PCs, even during the 360/PS3 generation at launch. It's a misconception to ever believe otherwise. So if that unsettles you, you should've been unsettled for a while. There isn't much point in comparing PCs to consoles, although for some reason everyone always wants to draw that comparison. The markets are different, and what consoles are trying to do in terms of gaming is vastly different.

Consoles of this new gen only have to produce visuals at 1080p at between 30-60fps. That doesn't take a lot of power, even with so-called 'next-gen' graphics engines. 1080p is nothing compared to 4K or someone with triple 2560x1600 monitors; it's not going to take a lot to drive it. As a side point, what are 'next-gen' graphics anyway? We're already in an era where graphically everything is starting to look the same, where it's becoming difficult to see any 'fold' increase in visual fidelity. 4K might have been nice for the PS4 and the XB1, but it wouldn't have been realistically attainable. They would have had to sell systems that cost $700 at launch, more or less a guarantee of console failure. For the general public, 4K probably won't have any real penetration for at least 5-7 years.

I was in two threads recently, one for a Crytek tech demo and another for an upcoming game called Titanfall. In both threads people complained about the graphics: "Oh, nothing looks that different/special/cool, I'm not impressed. It's rehashed, blah blah blah." It's pretty much impossible even now to create a game that will graphically wow people; not because there aren't improvements all the time, but because the level of improvement required for an easily perceivable difference is too high. This effect is called "change blindness". You can read a clinical report or Wikipedia.

I still remember people clearly perceiving graphics from Gears of War to be at the same level as, or as good as, PCs. Console graphics just have to be "good enough" for the masses. HardForum isn't the general public; 30fps doesn't matter to most people. If you're freaked out and this bothers you, then simply don't game on consoles. If, on the other hand, your interest shifts from 'graphics' to 'gameplay', you might find a more compelling reason to play on consoles in the first place.
 
Stopped reading.

Why? Because he used the term "monitor"? He is right for the most part. 1080p took well over a decade to reach decent penetration; 4K will take at least as long. Heck, even here on the [H] most people are using 1080p as their standard. Few are using 2560x1440 or 1600.
 
I don't remember the 360/PS3 being any different, and they are supposed to get 10 yrs.
 
I don't know a lot about this subject, but I don't know if eight 1.6GHz cores can even push those 7XXX cards effectively. Not because they shouldn't be able to, but simply because the next-gen consoles are soooo similar to a PC's DNA.

Heck, almost nobody recommends AMD CPUs because of low single-core benchmark scores. Well guess what? The BD and PD cores no one is recommending on this site (elitism d-baggery most likely, or bad skills, take your pick) come stock at 3.5GHz and above per core. People on this site and others say those AMD CPUs, running over 2x as fast per core at stock, won't push a modern-ish GPU as well as an Intel will, let alone a 1.6GHz PS4/Xbox One core. Ouch!

So these newer CPUs in the next-gen consoles are basically 3x worse per core than the PC market's current overclocked Intels. AMD just released a 5GHz 8-core and every site basically says it's crap or whatever (lulz), but these 2 new consoles run at 1.6GHz per core, so how will they feed that GPU? :eek: The good thing is I think developers will now be pushed to multi-threaded titles for performance because of the hardware limitations, but will they? History says F NO BRO! I think BF4 will struggle on PS4/Xbox One because that 1.6GHz single-core speed can't feed the GPU its full potential; it can't process the net code, the sound, the game world, the polygons and shit quick enough. I even bet the resolution will be scaled up to 1080p from some low-ass resolution, 720p or lower, just to make it feel decent, but it's not. IMO, hell no, I don't see native 1080p 60fps in BF4. I hope I'm completely wrong here though. :) I hope the PS4 is amazing, cuz we're all getting one :D Origin and Steam suck :cool:
 
Part of me wants to wait for a console capable of producing games at 4K, once the TVs are at a reasonable price. Let's be honest here, the HD (kinda) resolution of the current consoles made the biggest difference this gen. If you didn't have an HD TV, you were missing out on a lot of the advancements. Get an HD TV, and anything that doesn't support those resolutions looks like ass.

If the new consoles supported 4K for gaming, then I'd know I'd have to get one when the price of 4K TVs went down. The resolution bump makes a huge difference to me.
 
While still a far cry from more powerful PCs, remember that you program more to the metal on consoles, without ten layers of this and that between your game and the hardware. That makes them faster than the PC with similar components. But they will always be slower than better PCs and more so as years pass by.
 
It's behind [H] PCs, equal to the average PC in the Steam hardware survey, and better than the average PC globally :p
 
While still a far cry from more powerful PCs, remember that you program more to the metal on consoles, without ten layers of this and that between your game and the hardware. That makes them faster than the PC with similar components. But they will always be slower than better PCs and more so as years pass by.

This is it entirely. AMD, Carmack, and others who know what they are talking about have repeatedly pointed out that PCs get a fraction of the performance out of the parts in them. Consoles lack an OS, do not suffer from DirectX, and can go straight to the metal. Combined with a static box instead of varying components, this lets them extract every bit of performance from the hardware instead of just a portion of it. Windows is more to blame for PC gaming issues than consoles are.

Most people here also forget a key fact about the rest of the world when it comes to PCs: what people actually buy. Desktops are '90s/early-2000s; they're about as current as a typewriter or a fax machine to most people, and make about as much sense to buy. People go out and buy a laptop, which is more than good enough for everyone who isn't gaming or doing CAD/CAM. At that point form factor and battery life are much higher priorities than raw power. The end result is that their computers are never going to catch up to consoles, and their resolution is going to be 768/900/1080p anyway. A lot of people spent the last generation with computers that could never game better than the consoles they had.
 
This is it entirely. AMD, Carmack, and others who know what they are talking about have repeatedly pointed out that PCs get a fraction of the performance out of the parts in them. Consoles lack an OS, do not suffer from DirectX, and can go straight to the metal. Combined with a static box instead of varying components, this lets them extract every bit of performance from the hardware instead of just a portion of it. Windows is more to blame for PC gaming issues than consoles are.

Most people here also forget a key fact about the rest of the world when it comes to PCs: what people actually buy. Desktops are '90s/early-2000s; they're about as current as a typewriter or a fax machine to most people, and make about as much sense to buy. People go out and buy a laptop, which is more than good enough for everyone who isn't gaming or doing CAD/CAM. At that point form factor and battery life are much higher priorities than raw power. The end result is that their computers are never going to catch up to consoles, and their resolution is going to be 768/900/1080p anyway. A lot of people spent the last generation with computers that could never game better than the consoles they had.

This is actually not really true. Did you forget that the OSes on the PS4 and Xbox One use 2.5 to 3.5 gigs of the consoles' RAM? That the consoles let you suspend a game, do something else, and come back to it later? All consoles have an OS, so to speak, but before this new generation the OS was fairly minimal. The OS gets a bigger and bigger footprint with each new generation, for better or for worse.
 
This is actually not really true. Did you forget that the OSes on the PS4 and Xbox One use 2.5 to 3.5 gigs of the consoles' RAM? That the consoles let you suspend a game, do something else, and come back to it later? All consoles have an OS, so to speak, but before this new generation the OS was fairly minimal. The OS gets a bigger and bigger footprint with each new generation, for better or for worse.

Unfortunately they are becoming more than just game consoles... At least the PS4 isn't as bad (I'm an Xbox fanboy but I know they screwed up bad)
 
This is actually not really true. Did you forget that the OSes on the PS4 and Xbox One use 2.5 to 3.5 gigs of the consoles' RAM? That the consoles let you suspend a game, do something else, and come back to it later? All consoles have an OS, so to speak, but before this new generation the OS was fairly minimal. The OS gets a bigger and bigger footprint with each new generation, for better or for worse.

I should have said a full-blown PC OS. The console OS is still smaller, they still don't have the same number of APIs to go through, and they still allow to-the-metal coding. It's a vastly more efficient and elegant way of doing things compared to the PC.

Unfortunately they are becoming more than just game consoles... At least the PS4 isn't as bad

They're entertainment centers, and even though they've grown they still lack the hassles and problems of the PC, so it's not that bad yet.
 
I should have said a full-blown PC OS. The console OS is still smaller, they still don't have the same number of APIs to go through, and they still allow to-the-metal coding. It's a vastly more efficient and elegant way of doing things compared to the PC.



They're entertainment centers, and even though they've grown they still lack the hassles and problems of the PC, so it's not that bad yet.

It depends on what you mean by "problems of a PC". Viruses? Yeah, those aren't common on consoles. But overheating is there, and lag, freezing, and hardware failure are all there too.

I say the main plus any console has is that the game is designed for it (most of the time) and should run well, and you don't have to wonder whether your console will run the next game coming out because your hardware is 6 months old.
 
I would hope this gen lasts 5 years or less since they aren't taking massive hits this time around. There's no reason to hold on for so long as they don't need to recoup all that money.

If they think they'll be relevant in 10 years, that's a big mistake and someone will steal market share (Valve).
 
I would hope this gen lasts 5 years or less since they aren't taking massive hits this time around. There's no reason to hold on for so long as they don't need to recoup all that money.

If they think they'll be relevant in 10 years, that's a big mistake and someone will steal market share (Valve).

I think the deciding factor will be the adoption rate of 4K TVs. If they sell fast, someone will come out with a new console to take advantage of gaming at that resolution.
 
I say the main plus any console has is that the game is designed for it (most of the time) and should run well, and you don't have to wonder whether your console will run the next game coming out because your hardware is 6 months old.

You can play new games on your old PC too, just at the same graphics quality as before, not the new max. The future's medium is today's ultra high. So it is not inferior to consoles in that regard.
 
It depends on what you mean by "problems of a PC". Viruses? Yeah, those aren't common on consoles. But overheating is there, and lag, freezing, and hardware failure are all there too.

I say the main plus any console has is that the game is designed for it (most of the time) and should run well, and you don't have to wonder whether your console will run the next game coming out because your hardware is 6 months old.

PCs often have overheating issues, lag, and hardware failure as well, plus the issues that are PC-centric, like viruses, drivers, games not working properly, and all the other hassles involved.

Keep in mind that a lot of people don't like tinkering around with and messing with computers. I work in IT and I enjoy tinkering with computers... however, over the past generation, between having to bake video cards, SSD firmware issues, motherboard issues, and other stuff, I've had vastly more PC issues than console issues. I just don't mind sorting them out. That's not a bad thing; IT people are vastly overpaid in this country already because of government inflation of salaries that has screwed up the free market, and because people don't enjoy having to sort through these issues. If it weren't for these two reasons we'd all be paid what we are worth, a LOT less.

But crap that I consider minor, like hosed SSD firmware, virus infections, or the SATA port fiasco on Intel P67 chipsets, drives other people up the fucking wall and makes things a waste of their damn time.

Oh well, that and government jobs have vastly inflated my salary.
 
PCs will always be better than consoles on raw performance. Sorry, but "optimization" doesn't suddenly give a console a bajillion extra FPS. And guess what, during that optimization time PCs are just getting better and better.

But a lot of people just want a turn-it-on-and-play box, and that is what a console gives them. And the publishers see it as an easier box to fully control.
 
These next-gen systems are all betting on stupid social features, like Roman gladiatorial contests to keep the masses distracted for years.
 
The "times" are subjective. In terms of PC power, yeah - the next generation is definitely behind the times. Further than the previous generation to be sure; but still - a Radeon HD-7870 is a huge leap over the Xenos 48-shader, 16 TMU GPU used in Xbox 360. And so is a Radeon HD-7790, for that matter. Remember that the consoles only need to do 1080p. Hell - most games on the current generation don't even do 720p, but a slightly lower resolution that's scaled up to 1080p at max; and the current gaming public doesn't seem to mind. PS4 and Xbox One will be decent jumps from the previous generation.

But we really have gotten to the age of graphics where everything is about refinement now. With the Xbox 360 and PS3 we saw PS3.0 jump into the living room and take off (hence the "sheen" that games in the earlier part of that generation tend to have). With this generation we'll pretty much have more of the same, except it's going to be more refined and subtle (which I like).

As far as 1080p lasting for a long time: heh, I think it's here to stay a lot longer than people would like. Face it, [H] users are different from most, and even here 1080p/1200p seems to be the common resolution. Probably the generation after the PS4 and Xbox One will see 4K (if we still game in the living room; who knows what 10 years will bring?). I mean, I'm using an FW900 that was made in 2001. What graphics card back then could play "next-gen" games at its max resolution of 2304x1440 with all settings turned up? Hell, that resolution is pretty tough for most current graphics cards in the most demanding titles today.

EDIT: The connection I'm trying to draw with the FW900 is that after 10 years of that resolution, powerful single graphics cards have finally caught up enough to play at it with most of the settings turned up. Personally, my GTX 560 keeps me at 1920x1200 or 1600x1024 for the more intense games that I have.
 
I completely agree with the OP. I think these consoles will be pretty underwhelming, especially if they try to stretch this generation out like the PS3/Xbox 360. The best thing we can hope for is another console in 5 years... I think that's likely to happen, and for it to be completely backwards compatible, making the jump not as bad.
 
PCs will always be better than consoles on raw performance. Sorry, but "optimization" doesn't suddenly give a console a bajillion extra FPS. And guess what, during that optimization time PCs are just getting better and better.

But a lot of people just want a turn-it-on-and-play box, and that is what a console gives them. And the publishers see it as an easier box to fully control.

PCs get a fraction of the power of the hardware in them; consoles can generally extract every last drop. AMD and Carmack have commented several times on how, despite all the extra hardware power, DirectX, Windows, and everything else limit PCs so much that the hardware is effectively gimped. It's not just optimization; it's PCs actively hindering performance as well.

It's an "oh well" issue, because we are comparing apples and oranges. The only thing to remember is that it's insane to judge a console's performance by what that hardware could do in a PC; it's just stupid.
 
Finally, I remember hype about the current gen being better than PCs at the time, but it turned out to be just hype. The 7800GTX in the 360 hit stores June 2005, one month after the 360.

This discussion/argument has been going on since the beginning of time.

The short answer is that consoles have always been behind PCs, even during the 360/PS3 generation at launch. It's a misconception to ever believe otherwise. So if that unsettles you, you should've been unsettled for a while.

I don't remember the 360/PS3 being any different, and they are supposed to get 10 yrs.

These statements are totally false.

The 360 when it initially launched was faster than a gaming PC. It had an X1800/X1900-class GPU (not a 7800; they used ATI, not Nvidia :rolleyes:), which hadn't even launched on the PC yet. The 360's GPU also supported things PCs didn't, like limited tessellation (look at the water in Halo Reach), since it was still a custom chip.

Not only that, but the CPU was a triple-core 3.2GHz chip, again at a time when the PC was still stuck on dual cores.

The PS3 had Blu-ray and the Cell processor, which, if properly utilized at the time, blew away ANY CPU on the PC.

So to claim that consoles have always been slower than PCs simply isn't true.

The gap in power between PCs and these next-gen consoles is leagues beyond anything the current-gen consoles ever dreamed of.

This is actually not really true. Did you forget that the OSes on the PS4 and Xbox One use 2.5 to 3.5 gigs of the RAM in the consoles? That the consoles allow the game to freeze, and you can do something and go back to it later? All consoles have an OS, so to speak, but before this new generation, the OS has been fairly minimal. The OS is getting a bigger and bigger footprint each time a new generation comes out, for better or for worse.

As was pointed out, this isn't the case anymore. A third of the RAM and TWO CPU cores in the next-gen consoles are dedicated to the OS. Also, the Xbox One is said to have THREE OSes running AT THE SAME TIME. So the argument that the consoles have a very limited OS is false this time around. Sure, they won't have all the background processes of a PC, but is that really a problem with PCs now? When sitting idle my PC is using like 1-2% of its power, especially if you keep your system clean.

So to me this really can't be argued anymore.

This is it entirely. AMD, Carmac, and others who know what they are talking about have repeatedly pointed out that PC's get a fraction of the performance out of the parts in them. Consoles lack an OS, do not suffer from DirectX, and can go straight to the metal. Combined with a static box instead of varying components this enables them to extract every bit of performance from the hardware in them instead of just a portion of it. Windows is more to blame for PC gaming issues than consoles.

Again, not true. Consoles have a lot of resources going towards the OS, and the Xbox One is specifically using DX11.1+. As far as to-the-metal coding goes, I just don't see it making THAT big a difference. Not like it did with the current-gen consoles.

Also keep in mind the hardware of these consoles...even with direct access to the hardware, the CPUs (of which only 6 cores are available to games) are like SUPER slow...have you seen benchmarks of the Kabini APU (which uses Jaguar cores)? These console CPUs are overall slower than the processors found in PCs 5-6 years ago. Compared to something like even a stock-speed 2500K they get absolutely crushed in EVERY possible way.

Sure there are six of them, and sure they can be programmed for parallel processing, but considering how weak they are you can only do so much with them. Why do you think Sony/MS and devs are saying you can offload things like physics and A.I. to the GPU? BECAUSE YOU'LL HAVE TO! THE CPU SIMPLY ISN'T GOOD ENOUGH!
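To illustrate what "offload it to the GPU" actually means, here's a rough sketch in plain C++ (NOT any console SDK; C++17's parallel for_each just stands in for a real GPU compute dispatch). Only embarrassingly parallel work like this maps well onto GPU threads:

Code:
#include <algorithm>
#include <execution>
#include <vector>

struct Particle { float px, py, pz, vx, vy, vz; };

// Every particle is independent of every other, so this kind of update
// can spread across thousands of GPU threads. std::execution::par_unseq
// is only a stand-in here for a real compute dispatch on the console GPU.
void integrate(std::vector<Particle>& particles, float dt)
{
    std::for_each(std::execution::par_unseq,
                  particles.begin(), particles.end(),
                  [dt](Particle& p) {
                      p.vy -= 9.81f * dt;  // gravity
                      p.px += p.vx * dt;   // advance position
                      p.py += p.vy * dt;
                      p.pz += p.vz * dt;
                  });
}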

While still a far cry from more powerful PCs, remember that you program more to the metal on consoles, without ten layers of this and that between your game and the hardware. That makes them faster than the PC with similar components. But they will always be slower than better PCs and more so as years pass by.

Read above. The effects won't be as strong this time around. Go look at that Unreal Engine 4 tech demo that Epic ported to the PS4. You can tell a noticeable difference between the PC version (which used an i7 and a SINGLE GTX 680) and the PS4 version...like, a major difference. So even with bare-metal access you still can't just make magic happen. Next-gen has severe limitations, and it's becoming more apparent (DRIVECLUB on the PS4 seems to have a hard time maintaining 30fps at 1080p, BF4 supposedly being 720p, etc).

PCs will always be better than consoles on raw performance. Sorry, but "optimization" doesn't suddenly give a console a bajillion extra FPS. And guess what, during that optimization time PCs are just getting better and better.

Exactly. This "optimization" stuff simply won't be as major as it was with the current gen. Devs are ALREADY maxing these consoles out.

PCs get a fraction of the power of the hardware in them; consoles can generally extract every last drop. AMD and Carmack have commented several times on how, despite all the extra hardware power, DirectX, Windows, and everything else limit PCs so much that the hardware is effectively gimped. It's not just optimization; it's PCs actively hindering performance as well.

I don't think PCs are using as little of their power as you think.

When gaming, for example, clocking my GPU from 950MHz to 1200MHz increases my FPS by 10-20 depending on the game. So games are obviously using a lot of the hardware.
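Quick back-of-the-envelope on that (toy C++; the 60fps baseline is just an assumed example, not from any benchmark):

Code:
#include <cstdio>

// If FPS scaled perfectly with GPU clock, a 950MHz -> 1200MHz overclock
// (~26%) on a game running 60fps would give roughly 76fps, i.e. +16.
// Seeing +10-20 FPS in practice means the game really is GPU-bound --
// the hardware is nowhere near idle.
int main()
{
    const float base_clock = 950.0f, oc_clock = 1200.0f; // MHz, from above
    const float base_fps = 60.0f;                        // assumed baseline
    const float ratio = oc_clock / base_clock;           // ~1.26
    std::printf("ideal scaled fps: %.1f (+%.1f)\n",
                base_fps * ratio, base_fps * (ratio - 1.0f));
    return 0;
}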

So yes, PCs aren't as efficiently utilized as consoles are, but to say they're only using a "fraction" of their power I don't think is entirely accurate.

Also, with consoles now being x86 and using current technology (unified GPU shaders, etc.), I see PC power being utilized even better than it is now.
 
As was pointed out, this isn't the case anymore. A third of the RAM and TWO CPU cores in the next-gen consoles are dedicated to the OS. Also, the Xbox One is said to have THREE OSes running AT THE SAME TIME. So the argument that the consoles have a very limited OS is false this time around. Sure, they won't have all the background processes of a PC, but is that really a problem with PCs now? When sitting idle my PC is using like 1-2% of its power, especially if you keep your system clean.

So to me this really can't be argued anymore.

Don't take this the wrong way, but you seem to be aggressively agreeing with me, or you misread my statement.
 
I'm interested to see the benefit of hUMA on these consoles (UMA for the Xbox). Not having to copy data from one memory pool to another, and just reading it in place, should give some benefits.
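Roughly the difference, as I understand it (toy C++ sketch, NOT any real graphics API):

Code:
#include <cstring>
#include <vector>

// Toy model of the copy hUMA removes -- not a real graphics API.
// Discrete PC GPU: data the CPU wrote gets copied into GPU-visible
// memory (then DMA'd over PCIe) before the GPU can read it.
void pc_style_upload(const std::vector<float>& cpu_data,
                     std::vector<float>& gpu_staging)
{
    gpu_staging.resize(cpu_data.size());
    std::memcpy(gpu_staging.data(), cpu_data.data(),
                cpu_data.size() * sizeof(float)); // the copy in question
}

// Unified memory: CPU and GPU share one pool and one address space,
// so the "upload" is just handing the GPU a pointer.
const float* huma_style_upload(const std::vector<float>& shared_data)
{
    return shared_data.data(); // GPU reads the same bytes the CPU wrote
}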

Tiled resources should keep texture fidelity instead of blurring far textures.

CPU-wise, we've seen on the PC over the years that when games are unoptimized, a dual core at a higher frequency is better. But once a game is optimized for a quad core, the quad can do better at a lower frequency than the dual core. (Yes, I'm saying PS4/Xbox One games will be better multithreaded.)
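A sketch of what "optimized for more cores" means (plain C++ threads, nothing console-specific; the core counts in the comment are just illustrative):

Code:
#include <cstddef>
#include <thread>
#include <vector>

// Split one frame's entity updates across N worker threads. A single
// 1.6GHz Jaguar core is slow, but if the engine spreads independent
// work like this over the 6+ cores games get, total throughput can
// beat a faster dual core running the same loop on one thread.
void update_entities(std::vector<float>& pos, const std::vector<float>& vel,
                     float dt, unsigned cores)
{
    std::vector<std::thread> workers;
    const std::size_t chunk = pos.size() / cores;
    for (unsigned t = 0; t < cores; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = (t == cores - 1) ? pos.size() : begin + chunk;
        workers.emplace_back([&, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                pos[i] += vel[i] * dt; // disjoint ranges, so no locks needed
        });
    }
    for (auto& w : workers) w.join();
}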

Compared to PCs, where you might upgrade from 1080p to 1440p, the consoles have a fixed 1080p target and that won't change. So they'll be able to focus on what matters: gameplay and storyline. *rant* How many game developers have been chasing the rainbow that is better graphics and churning out mediocre games for the past 10 years? It's one thing to make tech demos, another to make games. *rant over*

Saying games are better with better graphics is like saying J.J. Abrams' Star Trek is better because it's got more lens flare. (I enjoyed quite a few of the low-visual-fidelity indie games of the past year.)
 
I like how every single PC gamer completely misses the point.

Of course your Ferrari is going to be faster than a Honda Civic.

Take the $400 price point: no $400 PC will match the PS4.

My 4770K, 16GB RAM, GTX 780 PC will run circles around the PS4... I also paid far more for it.
 
ugh!

can we just sticky these threads every new console release and merge them? Same crap every time a new console comes out...

PC vs Console blah blah blah...

A $600 PC could match a console to some degree, but only with a well-coded PC game and not some cheap port.

A cheaper PC could run games better, but many developers don't optimize, or are forced to leave their game wide open because of the endless configs PCs can have.
 
I like how every single PC gamer completely misses the point.

Of course your Ferrari is going to be faster than a Honda Civic.

Take the $400 price point: no $400 PC will match the PS4.

My 4770K, 16GB RAM, GTX 780 PC will run circles around the PS4... I also paid far more for it.

Wrong comparison. A PC is like throwing a V8 in an SUV: not an efficient use, but brute power. A console is like throwing a V4 in a crotch rocket: not that much power, but very efficient.

Nothing wrong with it, you can own both!
 
I like how every single PC gamer completely misses the point.

Of course your Ferrari is going to be faster than a Honda Civic.

Take the $400 price point: no $400 PC will match the PS4.

My 4770K, 16GB RAM, GTX 780 PC will run circles around the PS4... I also paid far more for it.

Not really the point I'm trying to make with this thread.

I'm just saying the current-gen consoles were Ferraris when they came out, whereas the next-gen ones are Honda Civics. That's what worries me...why are the new consoles HONDA CIVICS? We'd be much better off in the long run if they were at least Acuras or something...see?

ugh!

can we just sticky these threads every new console release and merge them? Same crap every time a new console comes out...

PC vs Console blah blah blah...

A $600 PC could match a console to some degree, but only with a well-coded PC game and not some cheap port.

A cheaper PC could run games better, but many developers don't optimize, or are forced to leave their game wide open because of the endless configs PCs can have.

This goes to both of you. This was NOT made as a PC vs. console thread...that would be dumb and pointless, and I'm not dumb.

This was more to point out just how slow these consoles ALREADY are in the grand scheme of things; how they are already severely outclassed and they haven't even been released yet.
 
Not really the point I'm trying to make with this thread.

I'm just saying the current-gen consoles were Ferraris when they came out, whereas the next-gen ones are Honda Civics. That's what worries me...why are the new consoles HONDA CIVICS? We'd be much better off in the long run if they were at least Acuras or something...see?



This goes to both of you. This was NOT made as a PC vs. console thread...that would be dumb and pointless, and I'm not dumb.

This was more to point out just how slow these consoles ALREADY are in the grand scheme of things; how they are already severely outclassed and they haven't even been released yet.

I understand the thread. I feel like I'm still waiting for the real next gen, or something.
 
I like how every single PC gamer completely misses the point.

Of course your Ferrari is going to be faster than a Honda Civic.

Take the $400 price point: no $400 PC will match the PS4.

My 4770K, 16GB RAM, GTX 780 PC will run circles around the PS4... I also paid far more for it.

Not what we are comparing here. We are comparing the performance of the last gen in comparison to the average high end pc available at the time to what we are getting this time around. It's kind of a joke this time and the way integrated graphics like amd's apus are progressing we could have a $400 pc matching consoles faster than you think.
 
They weren't really Ferraris though. The use of custom parts in consoles often had to do with importing tech from arcade hardware the company was pushing (Sega was huge on this, Sony as well) and hoping to get more exclusives for that console, rather than having something that was easy to port to.

Outside of the Xbox 360's GPU, nothing in them was really "better". The Cell and the fancy RAM were there to push that tech into TVs, Blu-ray players, and supercomputers; outside of some supercomputers, that flopped. It was also meant to lock developers into Cell technology, and the 7800 in it was gimped. The arcade hardware side did work out rather nicely for Sony, though.

There's nothing at all to freak out about. If anything, having x86 as the underlying tech will make cross-platform development easier and more cost-effective than ever, and emulation will come much faster.
 
They weren't really Ferraris though. The use of custom parts in consoles often had to do with importing tech from arcade hardware the company was pushing (Sega was huge on this, Sony as well) and hoping to get more exclusives for that console, rather than having something that was easy to port to.

Outside of the Xbox 360's GPU, nothing in them was really "better". The Cell and the fancy RAM were there to push that tech into TVs, Blu-ray players, and supercomputers; outside of some supercomputers, that flopped. It was also meant to lock developers into Cell technology, and the 7800 in it was gimped. The arcade hardware side did work out rather nicely for Sony, though.

There's nothing at all to freak out about. If anything, having x86 as the underlying tech will make cross-platform development easier and more cost-effective than ever, and emulation will come much faster.

I don't know...maybe not Ferraris, but they were at least a shit-ton more powerful at launch compared to PCs, and especially compared, tech-wise, to these new consoles. At least with a 360 or PS3 early on you had some truly nice technology. The new consoles are more "meh"...which is how I feel about them. I think that's how the engineers who made them felt too..."meh, 8GB sounds nice...faster GPU is a given, and eh, OK, let's stick some cheap x86 processor in there... :sigh: :yawn: ...XBOX ONE AND PS4 ARE BORN!"

In my opinion, calling these next-gen only signifies that they are the next iteration of consoles, not truly next generation in terms of power or technology.
 
I'm pretty sure these consoles are designed for a 5-10 year lifecycle. Possibly they could be a stopgap release while a more advanced one is being prepared (like the original Xbox to the 360), but really, all of them have plenty of power for immediate use and for years to come.

Developers will learn to use the hardware better as time goes on, and squeeze ever more performance out of the available hardware, which only needs to run at a fraction of the speeds found in a PC because it's not bogged down by Windows and layers of code.

I think all the next-gen consoles have the best components possible without being hideously expensive both to manufacture and to sell. They can always put out new revisions that upgrade the components, like die shrinks or cooling improvements, and extend viability even more.
 
Developers will learn to use the hardware better as time goes on, and squeeze ever more performance out of the available hardware, which only needs to run at a fraction of the speeds found in a PC because it's not bogged down by Windows and layers of code.

See, I agree with this, but only to a certain extent.

With the current consoles, the custom-made hardware really had to be learned before it could be best used.

Next-gen, though, is a lot more straightforward and uses a lot more common, standard hardware: a basic AMD GPU and an x86 processor.

I just don't see console games 3-5 years down the road looking THAT much more impressive than what we're seeing now...at least nowhere near the jump we saw with the current consoles.
 
See, I agree with this, but only to a certain extent.

With the current consoles, the custom-made hardware really had to be learned before it could be best used.

Next-gen, though, is a lot more straightforward and uses a lot more common, standard hardware: a basic AMD GPU and an x86 processor.

I just don't see console games 3-5 years down the road looking THAT much more impressive than what we're seeing now...at least nowhere near the jump we saw with the current consoles.

They'll still be able to extract several times more performance from the same hardware than a PC could, and that process will take time. It's best to pretend you don't know what's in the box at all rather than trying to compare it to a PC.
 
They'll still be able to extract several times more performance from the same hardware than a PC could, and that process will take time. It's best to pretend you don't know what's in the box at all rather than trying to compare it to a PC.

But is that true though? Can a comparable PC not already run these games the same as a console can? You think some lowly quad-core PC with a 7870 and 8GB of RAM will have a hard time running next-gen console ports? I don't. Not at all.

The next-gen consoles aren't some magical fairy box full of untapped power from a parallel dimension. They're off-the-shelf PC hardware stuffed in a little plastic box. Nothing special.

That is the topic of the thread, in case you didn't notice. Next-gen consoles are already behind the times, and severely so. Not just slightly, not just "oh, some PCs are faster"...no, these are basically current-gen consoles with a RAM increase and a better GPU...the processor I really wouldn't even consider THAT big a change, as it's so super slow.
 
No, you're only talking about a 15-20% processing gain when you code directly to the metal, at most.

Why doesn't the new console hardware really matter? Because software is improving very slowly. Shiny new graphics on par with a top-of-the-line PC from 2011 are still a huge improvement over 2005 hardware. I think the console manufacturers want to make money this time around, too.
 