GPU knowledge wanted: How is the PS4 NOT 30-50% faster than Xbox One?

JoeUser

With the GPU knowledge here I'd like to get down to brass tacks and tackle this debate once and for all.


From Microsoft (various sources):
'No way' PS4 has a 30% power advantage over Xbox One
Difference between two consoles 'not as great as consumers believe'; Major Nelson 'looking forward to the truth coming out' about Xbox One.

The power difference between PlayStation 4 and Xbox One "is not as great as the raw numbers lead the average consumer to believe," Xbox's senior director of marketing and planning Albert Penello has stated, leading colleague Larry Hyrb to say that he is "very much looking forward to the next few months (and beyond) as the truth comes out" about the power of Xbox One.

So with all this talk going on recently about the Xbox One not being that much slower than the PS4 and that the 30-50% figures being thrown around are "exaggerated" it got me thinking...How is this NOT a lie?

I mean, the PS4 has 384 more GPU cores (which is HALF the total number in the Xbox One - so you're looking at 150% for the PS4 vs. 100% for the Xbox One), it has GDDR5 RAM for CPU and GPU, double the ROPs, hUMA (which still hasn't been confirmed to be in the Xbox One, right?), bare metal access, etc...so how, purely based on hardware specs, could the PS4 not be 30-50% faster? I would think the extra 384 cores ALONE would automatically equal at least 30% faster. How could it not?
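To put rough numbers on that, here's a quick back-of-the-envelope calculation. The GPU clocks are assumptions on my part (the commonly reported ~800MHz for the PS4 and ~853MHz for the Xbox One after its upclock), not confirmed specs, and 2 FLOPs per core per clock is just the usual multiply-add figure:

Code:
# Rough theoretical shader throughput. The clocks below are the commonly
# reported figures (assumptions, not confirmed specs); 2 FLOPs/core/clock
# is the usual fused multiply-add rate.
def gpu_tflops(cores, clock_ghz, flops_per_clock=2):
    return cores * clock_ghz * flops_per_clock / 1000.0

ps4 = gpu_tflops(1152, 0.800)   # ~1.84 TFLOPS
xb1 = gpu_tflops(768, 0.853)    # ~1.31 TFLOPS
print(f"PS4 ~{ps4:.2f} TFLOPS, Xbox One ~{xb1:.2f} TFLOPS, ratio ~{ps4 / xb1:.2f}x")

On paper that's roughly 40% more raw shader throughput for the PS4 before you even get to the ROP and bandwidth differences, which is presumably where the 30-50% figures come from.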

Is MS just talking out of their butts or does the Xbox One contain some magical powers that we aren't aware of? Because from what I can see on the technology side of things the PS4 isn't just faster, it's clearly 30-50% faster EASILY.

Thoughts?
 
The PS4 also has 3x the memory bandwidth.

Microsoft also claims the cloud will allow their games to be more advanced than just the console hardware itself allows.

It all sounds like marketing BS to me. The consoles are basically the same architecture; if the PS4 has a hardware edge (and it does have a big one just looking at the specs), we should see it in multi-platform games within a year.
 
The PS4 also has 3x the memory bandwidth.


That's not true at all. If you count on developers not taking advantage of eDRAM (which they obviously have with the 360) then I'd agree with that basic statement. Unfortunately, that is not the case. Developers have made such great use of the super fast 10MB eDRAM on the 360 that in the early years it was considered superior to the PS3, which was said to be the superior hardware. Even now I'd say the 360 ekes out a solid win in comparison tests, thanks in no small part to Sony's crappy developer tools, a Cell architecture that nobody understands, and Microsoft's straight-up DirectX support.

The 360 could sit back and relax while developers had to fight tooth and nail to get projects done on the PS3.

It isn't a fair assumption to say 30% more shaders will make the PS4 superior; time will tell. There is a high possibility that console-specific titles will look better on the PS4, but it remains to be seen. For cross-platform games I doubt the difference will be noticeable, given developers will always go for the lowest common denominator if both platforms are equal in all other respects.

I do concur the Cloud smells like snake oil. I mean, you're basically talking about streaming video games OnLive-style, although to a lesser extent most likely. Until ISPs start knocking the cap bullcrap off I don't see this becoming very useful even in the U.S., where caps are higher. Hardcore gamers could be looking at drastically higher monthly bandwidth usage.
 
With the GPU knowledge here I'd like to get down to brass tacks and tackle this debate once and for all.


So with all this talk going on recently about the Xbox One not being that much slower than the PS4 and that the 30-50% figures being thrown around are "exaggerated" it got me thinking...How is this NOT a lie?

I mean, the PS4 has 384 more GPU cores (which is HALF the total number in the Xbox One - so you're looking at 150% for the PS4 vs. 100% for the Xbox One), it has GDDR5 RAM for CPU and GPU, double the ROPs, hUMA (which still hasn't been confirmed to be in the Xbox One, right?), bare metal access, etc...so how, purely based on hardware specs, could the PS4 not be 30-50% faster? I would think the extra 384 cores ALONE would automatically equal at least 30% faster. How could it not?

Is MS just talking out of their butts or does the Xbox One contain some magical powers that we aren't aware of? Because from what I can see on the technology side of things the PS4 isn't just faster, it's clearly 30-50% faster EASILY.

Thoughts?

Where the hell did you find that the PS4 has 384 more GPU cores than the XBXO? Where did you find that it has double the ROPs? Post proof here and stop talking BS.

The only real advantage the PS4 has over the XBXO is the GDDR5 memory and the 2.0GHz PS4 CPU vs. the 1.75GHz XBXO CPU.

Overall the PS4 will have 7%-10% more brute power, and the same thing will happen as with the PS3: you won't notice it until the end of the console's life.

The PS4 also has 3x the memory bandwidth.

Microsoft also claims the cloud will allow their games to be more advanced than just the console hardware itself allows.

It all sounds like marketing BS to me. The consoles are basically the same architecture; if the PS4 has a hardware edge (and it does have a big one just looking at the specs), we should see it in multi-platform games within a year.

They are nearly the same overall. As you said, it's all pure marketing BS. That 3x memory bandwidth will be heavily capped due to the GDDR5 being shared between the whole system and not fully used as vRAM. Why all that extra bandwidth with a lack of CPU power to feed it? It just doesn't make sense to me.
 
Where the hell did you find that the PS4 has 384 more GPU cores than the XBXO? Where did you find that it has double the ROPs? Post proof here and stop talking BS.

The only real advantage the PS4 has over the XBXO is the GDDR5 memory and the 2.0GHz PS4 CPU vs. the 1.75GHz XBXO CPU.

Overall the PS4 will have 7%-10% more brute power, and the same thing will happen as with the PS3: you won't notice it until the end of the console's life.


What? Huh? The Xbox One has 768 cores and 16 ROPs. This is confirmed. The PS4 has 1152 cores and 32 ROPs. This is also confirmed. I'm not talking BS. Research it yourself.
 
That's not true at all. If you count on developers not taking advantage of eDRAM (which they obviously have with the 360) then I'd agree with that basic statement. Unfortunately, that is not the case. Developers have made such great use of the super fast 10MB eDRAM on the 360 that in the early years it was considered superior to the PS3, which was said to be the superior hardware. Even now I'd say the 360 ekes out a solid win in comparison tests, thanks in no small part to Sony's crappy developer tools, a Cell architecture that nobody understands, and Microsoft's straight-up DirectX support.

The 360 could sit back and relax while developers had to fight tooth and nail to get projects done on the PS3.

It isn't a fair assumption to say 30% more shaders will make the PS4 superior; time will tell. There is a high possibility that console-specific titles will look better on the PS4, but it remains to be seen. For cross-platform games I doubt the difference will be noticeable, given developers will always go for the lowest common denominator if both platforms are equal in all other respects.

I do concur the Cloud smells like snake oil. I mean, you're basically talking about streaming video games OnLive-style, although to a lesser extent most likely. Until ISPs start knocking the cap bullcrap off I don't see this becoming very useful even in the U.S., where caps are higher. Hardcore gamers could be looking at drastically higher monthly bandwidth usage.

Most devs are going to use the 32MB eSRAM as a frame-buffer.
Killzone Shadow Fall is already using 32MB of the PS4's ram (this was when they thought they had 4 gigs to work with) for a frame-buffer. (They are also using a 6MB Particle Buffer, but I do not know if that is being used as a frame buffer)

If there is future tech that needs a larger frame buffer, those features will only be able to work on the PS4. If future tech needs a lot of fast ram, the PS4 will have an edge. The PS2 actually had more memory bandwidth than the PS3 or Xbox 360, which is what allowed Shadow of the Colossus to have those awesome fur effects.

Memory bandwidth is important for a lot of visual effects; it's not something that should be overlooked.
 
Not only that, but even with the eSRAM doesn't the information still have to be taken off the DDR3 RAM at DDR3 speeds?

I'm not going to act like I know what I'm talking about here, but I would think that having a single pool of GDDR5 RAM @ 176GB/s would just be easier to use and have more consistent performance than DDR3 RAM @ 68GB/s PLUS a 32MB pool of eSRAM.

Like I said, I don't know anything about the technicals here, but even with the eSRAM supposedly having ~200GB/s of bandwidth the GDDR5 RAM of the PS4 isn't far behind, and it's 8GB not 32MB.
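For what it's worth, those headline numbers fall straight out of bus width times effective data rate. A quick sketch; the bus widths and data rates are the commonly reported ones, so treat them as assumptions rather than official specs:

Code:
# Peak bandwidth = effective data rate (MT/s) x bus width (bits) / 8, in GB/s.
# Bus widths and data rates are the commonly reported figures, not official specs.
def peak_gb_per_s(mega_transfers, bus_width_bits):
    return mega_transfers * 1e6 * (bus_width_bits / 8) / 1e9

print(f"Xbox One DDR3-2133, 256-bit bus:    ~{peak_gb_per_s(2133, 256):.0f} GB/s")  # ~68
print(f"PS4 GDDR5 @ 5500 MT/s, 256-bit bus: ~{peak_gb_per_s(5500, 256):.0f} GB/s")  # ~176
# The eSRAM's quoted ~100-200 GB/s only applies to whatever fits in its 32MB;
# anything that spills out of it comes in at DDR3 speed.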
 
How is this NOT a lie?
Performance does not scale 100% with a corresponding increase in hardware resources; at some point diminishing returns kick in, even for embarrassingly parallel operations. And that is assuming there are no other bottlenecks in the system, which is unlikely. Google Amdahl's Law if you want to know waaay more on this subject than you probably care about.
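For anyone who doesn't want to Google it: Amdahl's Law says that if only a fraction p of the work benefits from the extra hardware, and that part gets s times faster, the overall speedup is 1 / ((1 - p) + p/s). A toy example (the 50% figure below is purely illustrative, not a measurement of either console):

Code:
# Amdahl's Law: overall speedup when a fraction p of the workload is sped up
# by a factor s and the remaining (1 - p) is unchanged.
def amdahl_speedup(p, s):
    return 1.0 / ((1.0 - p) + p / s)

# Toy example: even if half of each frame were pure shader work and the PS4's
# GPU ran that part 1.5x faster, the whole frame would only be ~1.2x faster.
print(f"{amdahl_speedup(0.5, 1.5):.2f}x")  # ~1.20x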

<snip various hardware features> Is MS just talking out of their butts or does the Xbox One contain some magical powers that we aren't aware of?
These things are all important but like many out there you're downplaying the Xb1's huge eSRAM LSU as well as the various DSPs/task-dedicated processors. Xb1's sound chip alone is incredibly powerful...the early dev kits required another 8 Jaguar cores just to emulate it in software, which is where the rumor of 16 cores was coming from. MS's hardware decisions might look strange now but when they start doing die shrinks MS will beat Sony hands down on hardware costs. MS has more of a PR problem with Xb1 at this point.

hUMA (which still hasn't been confirmed to be in the Xbox One, right?)
hUMA is just AMD's marketing term for a particular version of a unified memory access scheme between the CPU and GPU memory systems (and registers too IIRC). The GPU in the Xb1 might not meet the requirements to be called hUMA but it does have a form of it already since it's based on current GCN architecture GPUs. How effective it is vs hUMA depends on MS's software tools, but MS is pretty much the best at making tools.

bare metal access
Developers have already said the API is close enough to "bare metal" to not matter. But then no one does "bare metal" ASM these days. It's been too labor intensive and expensive since at least 2 console gens ago. Even on the PSX/N64 it wasn't all that common. The SNES era was probably the console ASM "golden years".
 
Most devs are going to use the 32MB eSRAM as a frame-buffer. Killzone Shadow Fall is already using 32MB of the PS4's ram (this was when they thought they had 4 gigs to work with) for a frame-buffer....If there is future tech that needs a larger frame buffer
Comparing how games are coded on the PS4 vs the Xb1 is apples to oranges. The real limitation here is going to be resolution for the Xb1 FYI; "future tech" does not play into it since neither console has the performance to make 4K resolution games viable. 1080p native resolution games will likely be somewhat faster on the PS4 vs the Xb1. For 720p games I wouldn't be shocked if the Xb1 was faster than the PS4. And it's likely developers will target 720p over 1080p. Developers know that few people have TVs big enough or sit close enough to their TVs to see the difference between 720p and 1080p content.* Hell, current consoles upscale games from sub-720p resolutions all the time up to 720p or 1080p and most people still can't see the difference in their living rooms.

Memory bandwidth is important for a lot of visual effects; it's not something that should be overlooked.
It's also not the be-all and end-all of performance though. Xbox games often looked better than PS2 games too.

*
What the chart shows is that, for a 50-inch screen, the benefits of 720p vs. 480p start to become apparent at viewing distances closer than 14.6 feet and become fully apparent at 9.8 feet. For the same screen size, the benefits of 1080p vs. 720p start to become apparent when closer than 9.8 feet and become fully apparent at 6.5 feet. In my opinion, 6.5 feet is closer than most people will sit to their 50" plasma TV (even though the THX recommended viewing distance for a 50" screen is 5.6 ft). So, most consumers will not be able to see the full benefit of their 1080p TV.

Pretty much no one sits 6' away from their TV. Most people sit around 12-15' from their TVs, usually in poor lighting conditions on top of that too. You'd be amazed at how much even just a little glare messes with your eyes' ability to discern any difference in resolution at those sitting distances.
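Those chart numbers are easy to sanity-check yourself. A rough sketch, assuming the standard rule of thumb that ~20/20 vision resolves about one arcminute per pixel (the acuity figure is my assumption, not something from the article):

Code:
import math

# Distance (in feet) at which the pixels of a 16:9 screen shrink to one
# arcminute of visual angle, i.e. where the full resolution becomes visible.
# Assumes the standard ~1 arcminute (roughly 20/20) acuity rule of thumb.
def full_benefit_distance_ft(diagonal_in, vertical_pixels):
    height_in = diagonal_in * 9 / math.hypot(16, 9)  # screen height in inches
    pixel_in = height_in / vertical_pixels           # pixel pitch in inches
    one_arcmin = math.radians(1 / 60)                # one arcminute in radians
    return pixel_in / math.tan(one_arcmin) / 12      # inches -> feet

print(f"50-inch 720p:  ~{full_benefit_distance_ft(50, 720):.1f} ft")   # ~9.8 ft
print(f"50-inch 1080p: ~{full_benefit_distance_ft(50, 1080):.1f} ft")  # ~6.5 ft

That reproduces the 9.8 and 6.5 foot figures quoted above, which is why at a typical 12-15' couch distance the 720p vs 1080p difference mostly washes out.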
 
Not only that, but even with the eSRAM doesn't the information still have to be taken off the DDR3 RAM at DDR3 speeds?
If you want something from the DDR3 RAM then yes. But the CPU and GPU can also access the eSRAM, both at the same time in certain conditions, and that is where the very low latency and high bandwidth it can offer really shine. It's there to act primarily as a "scratch pad" for the CPU and GPU while stuff (i.e. textures and sound, both of which are going to be chewing up heaps of RAM this console gen) is streamed from the DDR3 main memory. Basically if you're a developer and you're spending most of the time streaming stuff from main RAM to eSRAM you're doing it wrong. But then no developer is going to do that; they've all had years to work with small/high speed "scratch pad" RAM in the X360. And the X360's eDRAM cache was relatively gimped too compared to the eSRAM LSU in the Xb1, even though it did have some nifty on-die hardware that gave it nearly "free" but crappy FSAA.
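To put a number on the "scratch pad" point: even when something does have to be shuffled between DDR3 and the eSRAM, the copy itself is cheap next to a 33ms frame. A rough sketch; the ~68 GB/s figure is the commonly reported DDR3 peak and the buffer sizes are just illustrative:

Code:
# Rough cost of moving a buffer over the DDR3 bus, assuming the commonly
# reported ~68 GB/s peak. Buffer sizes are illustrative examples only.
def copy_time_ms(size_mb, bandwidth_gb_per_s=68.0):
    return size_mb / 1024.0 / bandwidth_gb_per_s * 1000.0

colour_1080p_mb = 1920 * 1080 * 4 / 1024**2   # ~7.9 MB, 32-bit colour target
print(f"1080p colour target: ~{copy_time_ms(colour_1080p_mb):.2f} ms per copy")
print(f"Entire 32MB eSRAM:   ~{copy_time_ms(32):.2f} ms per copy")
# Against a ~33 ms frame budget at 30fps the copies aren't the real cost;
# the question is whether your render targets fit in 32MB in the first place.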

but I would think that having a single pool of GDDR5 RAM @ 176GB/s would just be easier to use and have more consistent performance than DDR3 RAM @ 68GB/s PLUS a 32MB pool of eSRAM.
Easier? Yes. More consistent performance? Nope. Enough easier to make a big difference vs Xb1 in development costs/time? Nope. Remember: architecturally Xb1 is pretty much a refined and improved X360, and developers had nothing but good things to say about working on the X360.
 
MS's hardware decisions might look strange now but when they start doing die shrinks MS will beat Sony hands down on hardware costs. MS has more of a PR problem with Xb1 at this point.

Well said. They know what they are doing; that GDDR5 is going to bite Sony in the butt a few years down the line, since it doesn't scale as well as DDR3 with die shrinks.
 
The bigger problem is that their GDDR5 requires special custom packaging that no one but Sony is using. In a few years Sony will also be about the only ones still using GDDR5; it's already fairly boutique stuff as is, which is why it's so expensive. It'll be the XDR RAM fiasco all over again for them.
 
With how similar the consoles end up being, whether or not there is x% more power available is not as relevant as who gets programmed for. Many cross-platform games are written for the Xbox 360 then ported to the PS3 (many times precisely because of the DirectX support). But that aside, do people know about the power of all of the cores? It is not as simple as more cores=better performance. Certain cores are better than others at doing the calculations used in gaming. This reminds me of the PS3 vs Xbox 360 debate.
As a PC gamer it is really cute to hear all of these arguments about how much power this or that system has. It will never matter: no matter how new the system, you can generally build a PC that is just as powerful two years after it has released for the same price. If you build with used parts you can usually do it one year after release. In that two year period devs are NEVER able to use the full power of the system anyway, making it even easier to match/beat the performance of a console. And nearly any computer you build will be more reliable. Of course this is an oversimplification since you have to put in a lot more effort to build your own than just buying a box. But when you look at game prices it becomes obvious that if you build a budget rig there is essentially no way to come out losing if you are a person who buys more than 10 or so games a year.
 
The talk of peasants does not concern me.
Let them squabble over their crumbs!

Developers have already said the API is close enough to "bare metal" to not matter. But then no one does "bare metal" ASM these days.
"Metal" doesn't necessarily denote assembly. In the context of game consoles, it tends to mean 'raw' or very close and relatively unfettered access to the underlying hardware.
 
But that aside, do people know about the power of all of the cores?
AFAIK it's not public knowledge yet just what all of them do in detail.

It is not as simple as more cores=better performance.
True, but most of them in the Xb1 are task-dedicated background CPUs/DSPs that work "automagically". The developers probably won't even have to touch them most of the time unless they're trying to eke out some extra performance or a novel game feature. The software tools will take care of most of the dirty work for them.
 
"Metal" doesn't necessarily denote assembly. In the context of game consoles, it tends to mean 'raw' or very close and relatively unfettered access to the underlying hardware.
That is what ASM is; you literally cannot get any more unfettered than that.

If you want to quibble or stretch the meaning of words or phrases until they mean something else then fine I guess, but don't be surprised when you get nothing but confused or frustrated responses since no one else can communicate with you.
 
See: relatively.
It's ASM and not RASM for a reason. What you mean to say is "a high performance API that does little or nothing to obfuscate the underlying hardware from the developer" but there are no nifty 3 letter acronyms for that.
 
Mesyn, so what you're saying is that the dev tools are so much better for the xbox1 that it will be easier to get performance out of the cores? Also, I wholeheartedly agree with the facts you shared about high def. It is a wasteful numbers game, much like this arguing about which console has more power, when the real win is whether people can develop cost- and resource-efficiently for the system.
 
It's ASM and not RASM for a reason. What you mean to say is "a high performance API that does little or nothing to obfuscate the underlying hardware from the developer" but there are no nifty 3 letter acronyms for that.
No, I'm not asserting that a "lightweight" heavy abstraction (an API) is "to-the-metal". I'm asserting that no API is "to-the-metal". You'll invariably wind up in assembly, but it doesn't mean you're writing it.
 
Mesyn, so what you're saying is that the dev tools are so much better for the xbox1 that it will be easier to get performance out of the cores?
It's not so much that the tools will be so much better for the Xb1 vs PS4 to offset the PS4's brute force performance advantage, it's that the developers need not even worry about or mess with the task-dedicated CPUs/DSPs most of the time. And those task-dedicated CPUs/DSPs will free up processing resources on the Jaguar CPUs and/or the GPU too.
 
No, I'm not asserting that a "lightweight" heavy abstraction (an API) is "to-the-metal". I'm asserting that no API is "to-the-metal". You'll invariably wind up in assembly, but it doesn't mean you're writing it.
...
mesyn191 said:
If you want to quibble or stretch the meaning of words or phrases until they mean something else then fine I guess, but don't be surprised when you get nothing but confused or frustrated responses since no one else can communicate with you.
 
Take it up with Carmack.
This sentence: "If we were programming that hardware directly on the metal the same way we do consoles, it would be significantly more powerful.” is the only one in the article that vaguely applies (in that it mentions "bare metal" programming) to what we're discussing, and yet it does not contain the word "relative" in it at all, so I don't know what you're talking about.

Do you need me to quote myself again?
 
The PS2 actually had more memory bandwidth than the PS3 or Xbox 360

No, the PS2 did not have more memory bandwidth than the PS3 or 360.

PS2 (RDRAM): 3.2GB/s
PS3 (XDR DRAM): 25.6GB/s
360 (GDDR3): 22.4GB/s


Sorry, but you're totally wrong about your statements.
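For reference, those peak figures fall straight out of effective data rate times bus width. A quick sketch; the bus configurations are the commonly cited ones (my assumptions, not datasheet values):

Code:
# Peak main-memory bandwidth = effective data rate x bus width (in bytes).
# Bus configurations below are the commonly cited ones, not datasheet values.
configs = {
    "PS2 RDRAM (800 MT/s, two 16-bit channels)": 800e6 * 2 * 2,
    "PS3 XDR (3200 MT/s, 64-bit bus)": 3200e6 * 8,
    "360 GDDR3 (1400 MT/s, 128-bit bus)": 1400e6 * 16,
}
for name, bytes_per_s in configs.items():
    print(f"{name}: {bytes_per_s / 1e9:.1f} GB/s")   # 3.2, 25.6 and 22.4 GB/s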
 
It's not so much that the tools will be so much better for the Xb1 vs PS4 to offset the PS4's brute force performance advantage, it's that the developers need not even worry about or mess with the task-dedicated CPUs/DSPs most of the time. And those task-dedicated CPUs/DSPs will free up processing resources on the Jaguar CPUs and/or the GPU too.

What are the dedicated CPUs/DSPs doing that the PS4 isn't though? The two new consoles are already said to have two CPU cores unavailable for games. So there's what the PS4 needs for its behind-the-scenes processing.

Given that the Xbox One will be using only 6 CPU cores for games, I fail to see how its extra DSPs and whatnot really equal better performance. Plus from what I've read the extra DSPs/CPUs really only do things like video compression and whatnot...nothing that relates specifically to running games.

So basically:

PS4 = 2 Jaguar Cores for "stuff" other than games = OS management, streaming game play, etc.

Xbox One = 2 Jaguar Cores PLUS DSP's for "stuff" other than games = OS management, streaming game play, etc.

You see what I mean? How does the Xbox One having other little bits and bobs really matter to RUNNING GAMES? The CPU resources are ALREADY allocated and can't be altered. Both consoles already have only six cores for games...everything else is trivial.

Also, the Xbox's audio: I fail to see how that has anything to do with the actual graphical performance of the system. OK, so the Xbox One has some nice sound...so what? What is the Xbox One going to do sound-wise that the PS4 can't or, at the very least, what is it going to do that will MATTER? I don't see either console going past 5.1 or 7.1 sound, nor do I see them going past ~128 sound channels...as any more simply isn't needed. So as I said, the Xbox One has some super powerful audio chip/processing...why does that matter at all?
 
No, the PS2 did not have more memory bandwidth than the PS3 or 360.
He meant the eDRAM. It had around 40GB/s of bandwidth IIRC, which for the time was very impressive. The problem was it was too small, and for some reason I can't remember exactly, it was actually pretty hard to get anywhere near peak bandwidth out of the eDRAM. It had something to do with the bus to the eDRAM actually being a whole bunch of smaller independent buses that required a bunch of developers' blood, sweat, and tears to corral properly, I think.
 
What are the dedicated CPUs/DSPs doing that the PS4 isn't though?
It's more about the degree to which they do things for the most part. AFAIK the stuff the Xb1 has that the PS4 doesn't mostly relates to Kinect 2.0, video "planes", and video scaling. None of which by itself is anything to get worked up about: it's when you factor in all of them working in the background that the end result is impressive. That is a hell of a lot of crap that the CPU/GPU no longer have to deal with, some of which chewed up quite a bit of the X360's performance.

The two new consoles are already said to have two CPU cores unavailable for games. So there's what the PS4 needs for its behind-the-scenes processing.
No, they're both reserving CPU cores, GPU resources, and memory for apps/OS. The Xb1's task-dedicated CPUs/DSPs have their own caches and memory, though they can also use the main memory and possibly the eSRAM if necessary.

Plus from what I've read the extra DSPs/CPUs really only do things like video compression and whatnot...nothing that relates specifically to running games.
They do a lot more than that, and the degree to which they do it in some cases is very impressive. The SHAPE audio subsystem alone is ridiculous. Anything that frees up the CPU means you can do more stuff with it in game, and they're freeing up lots of CPU resources with all these task-dedicated CPUs/DSPs.

The CPU resources are ALREADY allocated and can't be altered. Both consoles already have only six cores for games...everything else is trivial.
Hundreds of GFLOPs worth of performance usually isn't considered trivial by most people. That BTW is just for SHAPE. The move engines, Kinect 2.0 dedicated CPU, and video scalers are just icing on the cake but they all add up too.

Also, the Xbox's audio: I fail to see how that has anything to do with the actual graphical performance of the system.
Well you seem to be failing to see an awful lot of things here. Just where do you think the computing power has to come from for high quality reverb through multiple rooms if your sound chip isn't up to snuff? This is not some minor corner case issue: accurate high quality sound reproduction can be just as important as accurate high quality physics simulation to a game.

I don't see either console going past 5.1 or 7.1 sound, nor do I see them going past ~128 sound channels...as any more simply isn't needed.
It's less about making use of someone's 5.1-whatever sound system (which only a tiny percentage of people have available) or sound channels and more about trying to make your average console gamer's crappy built-into-TV speakers sound as if they were a 4.1 or maybe even a 5.1 sound system. That is really hard to do.

They might even be able to pull this sort of thing off in game. You'd need headphones for the effect to work though so I doubt many would bother with it, but still pretty damn cool.
 
This sentence: "If we were programming that hardware directly on the metal the same way we do consoles, it would be significantly more powerful.” is the only one in the article that vaguely applies (in that it mentions "bare metal" programming) to what we're discussing...
That's probably why I linked it. I know of no one in the developer community who refers to "metal" or "bare metal" development, as it pertains to game development, as necessarily denoting hand-written assembly. You seem to be the exception. Carmack, specifically, has on numerous occasions used the term to describe exactly as I've suggested.
 
Carmack, specifically, has on numerous occasions used the term to describe exactly as I've suggested.
Except he doesn't and isn't. In fact he very specifically mentions "bare metal" in the context of removing overhead in programming for the PC which is what any API will introduce. So we're back to square one, looks like I need to quote myself again:
mesyn191 said:
If you want to quibble or stretch the meaning of words or phrases until they mean something else then fine I guess, but don't be surprised when you get nothing but confused or frustrated responses since no one else can communicate with you.
 
In fact he very specifically mentions "bare metal" in the context of removing overhead in programming for the PC which is what any API will introduce.
I'm not asserting that a "lightweight" heavy abstraction (an API) is "to-the-metal". I'm asserting that no API is "to-the-metal".
Where's the confusion? Are you perhaps conflating APIs and higher-level programming languages?
 

I've seen this graph before, and I think we need to be careful when looking at charts like this when talking about the distances from televisions (displays) at which you can tell the difference in resolution.

This chart is probably talking about live action video or film: captured with motion cameras, with lenses that focus to varying depths based on an ever-changing focal point of the scene, capturing real light scattered through the atmosphere at probably 25 to 30 frames per second, with "shutter" speeds open long enough for each frame to also capture varying amounts of motion blur.

We, however, are talking about real-time rendered rasterized computer graphics, which are not captured by a camera but instead generated on a flat 2D screen. Broken down to their basic form they are just triangles of varying shapes and sizes displayed on the screen to give the illusion of other shapes and depth, but at the end of the day they are all just flat triangles on the screen and any depth is an optical illusion. Modern 3D engines add artificial camera depth of field, motion blur and lighting after the polygons are all rendered and textured.

For live action video or film, the lower the resolution at any point from capture to display, the softer (or blurrier) the image gets, which in that context still looks acceptable.

With rendered graphics however, at lower pixels per inch, instead of just getting blurry (blur will still occur if running below the display's native resolution) you get jaggies, aliasing and other kinds of image distortion.

Because of this, resolution and pixels per inch are very important. There are still very noticeable aliasing and "jaggy" effects that, even though I have terrible eyesight, I can still see on my 27" 1440p monitor from pretty far away, far enough away that it's just a limitation of my eyesight, because I am aware of what display-based picture distortions look like.

So in short, I disagree with the graphic in the context of this discussion. Even though I haven't seen anything close to it, I don't think we'll be done with needing to increase resolution until displays are utilizing photon or equivalent sized "pixels", and don't have a color limit either.

With that in mind I think both consoles will be able to handle 4K upscaling at 30fps, kinda like they upscale to 1080p today. Even if they could do 4K60 I don't think they have HDMI 2.0 ports since that standard just recently got finalized, so with HDMI 1.4 we're limited to 3840×2160p30 and 4:2:2 colorspace with audio.

4K is going to get pushed hard way before these consoles' life cycle is up, and unlike 3D, 4K will sell. And so will 8K and 16K and....
 
Pretty much no one sits 6' away from their TV. Most people sit around 12-15' from their TVs, usually in poor lighting conditions on top of that too. You'd be amazed at how much even just a little glare messes with your eyes' ability to discern any difference in resolution at those sitting distances.

pfft, hahahahahahaahaahahahaha

I know a lot of people who fit into your statement, but I know quite a lot that DO sit close. Personally I sit about 8-10' from my 50" plasma while watching TV / a movie, and about 4-6' away when gaming usually. In fact almost anyone I am / have been friends with that games does it close to the TV.

Also, Avatar82 is spot on. 720p is not acceptable for games. It still results in significant amounts of jaggies, moire and other issues. I'm not sure if I'm buying the PS4 (definitely not an XB1) but if (most of) the games run at 720p that is a deciding factor against it. Wouldn't even consider it if that were the case. Why would I - or anyone - buy a system that's hardly even an upgrade?
 
With rendered graphics however, at lower pixels per inch, instead of just getting blurry (blur will still occur if running below the display's native resolution) you get jaggies, aliasing and other kinds of image distortion.
Current and future consoles already use FSAA. Granted it's a crappy edge-detect FSAA that has also been upscaled, but jaggies aren't really the issue with the current gen. It's the crappy textures + low fps that are much more of an issue.

I don't think we'll be done with needing to increase resolution until displays are utilizing photon or equivalent sized "pixels", and don't have a color limit either...with that in mind I think both consoles will be able to handle 4K upscaling at 30fps
ooook now you're just trolling me, good job getting me I guess.
 
Sorry, not trying to troll, just letting my imagination get the best of me with the idea that we don't really know how far we need to take technology to be satisfied until we see it. Especially when it comes to immersive entertainment.
 
but I know quite a lot that DO sit close....In fact almost anyone I am / have been friends with that games does it close to the TV.
Congrats, you're in a tiny and insignificant minority that may actually see some benefit from higher resolutions. Good on you I guess, but it's also irrelevant to what is actually being discussed or to the market at which console developers/manufacturers target their products.

720p is not acceptable for games.
The X360 and PS3 both render many if not most of their games at sub-720p resolutions, and yet tens of millions have bought both of those consoles and enjoy them even now! Now it may certainly be true that 720p is unacceptable to you, but the post of mine you're responding to was addressing things on a general/market-wide scale and not the individual/personal level you're discussing. Me=apples, you=oranges.
 