Xbox One’s Unlocked 7th Core Isn’t Much Of A Boost

The thing with the One is there's a chunk of unused GPU real estate that's always switched off, due to initial chip yields when it was first manufactured.

Shame they couldn't just switch that on now and replace the early units it wouldn't work on.

If that's possible.
 
That's what happens when you limit the power draw like console manufacturers do. That's why I have no interest in owning a console. Why play the same games at crappier resolutions with fewer controller options? If you want to sit on your couch, get a SteamBox or just hook up a controller to your Windows PC and call it a day.
 
That is just silly. Solid 1080p @ 60FPS doing what? They can both do that now, depending on what they are displaying.

What is your benchmark for making such a silly statement? Should a game that has 100+ AI characters in a city with moving cars, explosions, etc. be 1080p @ 60FPS? What do you sacrifice to hit 1080p, since that is the most important thing to you? Texture quality, the amount of things going on at once, offloading processing to the cloud? What?

1080p is nice, but making it "the benchmark" means you will lose something else to get there.

The problem is, if a "Next Gen" console can't maintain 60FPS @ 1080p with new titles at release, or shortly thereafter, how well do you think they will do in 3 or 4 years?

It seems that this generation of consoles made too many compromises, hardware-wise, and ended up severely underpowered.

Consider for a minute that PC gaming is moving from 1080p to 1440p and even 4k, meanwhile the newest consoles are still having trouble with 1080p. WTF?!?!
 
Were they not touting 4K gaming for this generation? I think I remember Sony saying it was going to happen.
 
99% of the people that game on consoles have never seen a video game on a high end gaming rig. They have nothing to compare it to, therefore Microsoft most certainly DID hit their target market at a good price point. And the end consumers are none the wiser. They see "oooooh, shiny" and think it's good.

We're enthusiasts on an enthusiasts' forum discussing low-end garbage consoles.
 
The thing with the One is there's a chunk of unused GPU real estate that's always switched off, due to initial chip yields when it was first manufactured.

Shame they couldn't just switch that on now and replace the early units it wouldn't work on.

If that's possible.

What? no.

The xbone was a 1.3 TF GPU that had 10% reserved for Kinect; that reservation was handed back a couple of months post-launch, at the discretion of the developer.

There is nothing else switched off on the GPU unless you believe in "MisterXMedia" and if you do believe in him, come over here so i can show you this shiny new bridge that could be yours for a really *really* cheap price!
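For what it's worth, the 1.3 TF number falls out of simple GCN-style arithmetic. A quick sketch (the lane count, ops per cycle, and 853 MHz clock are the commonly cited figures, used here as assumptions rather than an official spec sheet):

```python
# Back-of-the-envelope peak FLOPS for a GCN-style GPU: 64 lanes per CU,
# 2 ops per lane per cycle (fused multiply-add), times the clock.
# Illustrative figures for the Xbox One: 12 active CUs at 853 MHz.
def peak_tflops(compute_units, lanes_per_cu=64, ops_per_cycle=2, clock_ghz=0.853):
    return compute_units * lanes_per_cu * ops_per_cycle * clock_ghz / 1000.0

print(round(peak_tflops(12), 2))  # -> 1.31, matching the ~1.3 TF figure
```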
 
99% of the people that game on consoles have never seen a video game on a high end gaming rig. They have nothing to compare it to, therefore Microsoft most certainly DID hit their target market at a good price point. And the end consumers are none the wiser. They see "oooooh, shiny" and think it's good.

We're enthusiasts on an enthusiasts' forum discussing low-end garbage consoles.

Citations please?

Get off your soapbox. I game on both console and PC; I game more on console because it is simpler and cheaper to play games with the wife. I can buy one game for my Xbox and they allow you to share it to play multiplayer. No, it doesn't have the massive detail the PC can do, but that isn't at all what makes a game good or bad. My Xbox was a decent upgrade from the 360; will it game better than a PC? No, of course not, but we all know that, so let the dead horse be. I grew tired over the years (I started PC gaming on Wolfenstein shareware) of having to tinker with video card drivers or waiting for drivers to be released to fix bugs, different versions of DRM software on versions of games I paid for, and just the overall arrogance and poor community that PC gaming culture has to offer.

The hacks are a huge annoyance as well. People modding textures to be translucent so they can see through walls, aimbots, lag switches, etc.

My point really is, why hate on the console gaming community? Is that group somehow a threat to your lifestyle? Maybe if the PC gaming crowd weren't such elitist assholes, more of us would come back to play and you could see a shift back to PC gaming being the primary platform again.
 
What is also interesting is that the eDRAM cache strategy was lauded at the time for the 360 but ridiculed this time round.

It wasn't. It sounded great in the press releases, but really it was about finding a way to use cheap 128-bit GDDR3 as unified system memory, by providing a small, fast frame buffer for the GPU. It was cheaper than sourcing 256-bit GDDR3, as they would never be able to reduce the cost of mounting those extra chips on the PCB!

It was a good long-term strategy because the eDRAM could eventually be integrated on one die, whereas external GDDR3 would never see as much cost reduction.

But on release it was criticized for being too small for a complex frame buffer, and that criticism was borne out once games started shipping at 640p:

http://www.gamespot.com/forums/syst...dram-benefit-turns-into-a-liability-26116823/

The problem was they limited the size of the eDRAM to reduce costs, and it turned out to be too small!

The 32MB ESRAM on the One was similarly panned, but this time for even more important reasons:

1. Still too small to fit a complex 1080p frame buffer in. It's 3x the capacity of the Xbox 360's eDRAM, but the pixel-count increase from 640p to 1080p is roughly 2.8x. So, MORE memory per pixel rendered, but not by much.

2. The Xbox 360 put its eDRAM on a separate die from the GPU, on the same package (until the Slim). The transistors used for the eDRAM did not directly limit the number of transistors available for the GPU when it was designed. But the One uses a combined APU, so the space taken up by the 32MB ESRAM definitely cuts into the budget for things like shaders and ROPs! From what I've read, the Microsoft and Sony SoCs have roughly the same die size!

3. Back when the Xbox 360 shipped, eDRAM was the best cost compromise they could have made at the time. But the One had the option of GDDR5, by then a well-established memory standard! So they could have saved that die space for more GPU power, like Sony did! They didn't even save themselves board complexity like they did with the 360 - the One shipped with a 256-bit external bus, just like the PS4!
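The sizing argument in point 1 is easy to sanity-check. A rough sketch, assuming a simple 32-bit color + 32-bit depth/stencil render target (real games often use fatter G-buffers or MSAA, which only makes the squeeze worse):

```python
# Frame-buffer footprint in MiB, assuming 8 bytes/pixel
# (32-bit color + 32-bit depth/stencil; real render targets vary).
def framebuffer_mb(width, height, bytes_per_pixel=8):
    return width * height * bytes_per_pixel / (1024 * 1024)

print(round(framebuffer_mb(1138, 640), 1))   # ~5.6 MB vs the 360's 10 MB eDRAM
print(round(framebuffer_mb(1920, 1080), 1))  # ~15.8 MB vs the One's 32 MB
```

A plain target fits, but add MSAA or deferred-rendering G-buffer channels and the budget evaporates quickly, which is what the 640p shipping resolutions reflected.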
 
Citations please?

Get off your soapbox. I game on both console and PC; I game more on console because it is simpler and cheaper to play games with the wife. I can buy one game for my Xbox and they allow you to share it to play multiplayer. No, it doesn't have the massive detail the PC can do, but that isn't at all what makes a game good or bad. My Xbox was a decent upgrade from the 360; will it game better than a PC? No, of course not, but we all know that, so let the dead horse be. I grew tired over the years (I started PC gaming on Wolfenstein shareware) of having to tinker with video card drivers or waiting for drivers to be released to fix bugs, different versions of DRM software on versions of games I paid for, and just the overall arrogance and poor community that PC gaming culture has to offer.

The hacks are a huge annoyance as well. People modding textures to be translucent so they can see through walls, aimbots, lag switches, etc.

My point really is, why hate on the console gaming community? Is that group somehow a threat to your lifestyle? Maybe if the PC gaming crowd weren't such elitist assholes, more of us would come back to play and you could see a shift back to PC gaming being the primary platform again.

lolwut? I'm not hating. I'm pointing out that Microsoft and Sony did exactly as they set out to do. Put to market a mediocre device that delivers mediocre performance for a price most people can swallow. And 99% of those people have nothing else to compare their experience to. So, all is good in the hood with them.

Ask your average XBL kiddie what 1080/60 means to them and they probably won't know wtf you are talking about. The only people that means anything to are us, the enthusiasts. We make up a pretty small demographic of the M$/Sony client base.
 
Citations please?

Get off your soapbox. I game on both console and PC; I game more on console because it is simpler and cheaper to play games with the wife. I can buy one game for my Xbox and they allow you to share it to play multiplayer. No, it doesn't have the massive detail the PC can do, but that isn't at all what makes a game good or bad. My Xbox was a decent upgrade from the 360; will it game better than a PC? No, of course not, but we all know that, so let the dead horse be. I grew tired over the years (I started PC gaming on Wolfenstein shareware) of having to tinker with video card drivers or waiting for drivers to be released to fix bugs, different versions of DRM software on versions of games I paid for, and just the overall arrogance and poor community that PC gaming culture has to offer.

The hacks are a huge annoyance as well. People modding textures to be translucent so they can see through walls, aimbots, lag switches, etc.

My point really is, why hate on the console gaming community? Is that group somehow a threat to your lifestyle? Maybe if the PC gaming crowd weren't such elitist assholes, more of us would come back to play and you could see a shift back to PC gaming being the primary platform again.

EOT

The ownage in the last sentence is nails. Also, the poster talking about "good enough" earlier in the thread is nails too.

The outcry on here to justify their $1500 gaming PCs is ridiculous. I play a lot on PC, but most of my friends purchase a PS4 and go. I wouldn't have very many friends in the real world if I insulted them for playing games on consoles. I also sure as hell do not want to play the majority of my games with elitist l33t PC gamer assholes.

BUT THE BIGGER WIFIS, FASTER GBS, AND MORE PIXALSS RWAR!!!
 
lolwut? I'm not hating. I'm pointing out that Microsoft and Sony did exactly as they set out to do. Put to market a mediocre device that delivers mediocre performance for a price most people can swallow. And 99% of those people have nothing else to compare their experience to. So, all is good in the hood with them.

Ask your average XBL kiddie what 1080/60 means to them and they probably won't know wtf you are talking about. The only people that means anything to are us, the enthusiasts. We make up a pretty small demographic of the M$/Sony client base.

Yeah, you are. The bolded part is evidence of a subtle jab at the very console players you say you aren't hating on.
 
lolwut? I'm not hating. I'm pointing out that Microsoft and Sony did exactly as they set out to do. Put to market a mediocre device that delivers mediocre performance for a price most people can swallow. And 99% of those people have nothing else to compare their experience to. So, all is good in the hood with them.

Ask your average XBL kiddie what 1080/60 means to them and they probably won't know wtf you are talking about. The only people that means anything to are us, the enthusiasts. We make up a pretty small demographic of the M$/Sony client base.

Fair point. I apologize; the way I read your post, it sounded very condescending. As that was not your intent, I pull back on that.
 
man oh man i miss them olden days where console hardware was specialized and markets were largely segregated.

it wasnt without great flaws, but i liked them days better. where the PC market had PC games, and playstation had playstation games, and nintendo had nintendo games. each device had its own style and character, and hardware differences were distinct and an experience. now everything is extremely similar to everything else, and crossplatform is everywhere.

crossplatform is great for marketability, but youve got stuff like xbox's forced parity so development suffers. crossplatform also means youre appealing to a much wider market, which means youre considering more people's "tastes", which often leads to middle of the road development decisions. i.e., boring status quo stuff.

consoles are just low end PCs now :( and the differences are boring. difference between x1 and ps4? slower RAM, and a slower GPU. yawn. the differences in rendering techniques are so small too. yawn. same console basically, just one is slower.

at least nintendo is keeping the old dream alive, but the wii sucks and so does the wii u. they got a bit lazy and obsessed with profitability. good on em for trying new things, but low end hardware + game rehashes is not a long term solution for the nintendo faithful.

sorry for the rant but GET OFF MY LAWN (cant remember who im yelling at)
 
Were they not touting 4K gaming for this generation? I think I remember Sony saying it was going to happen.

They never touted games at 4K. I do remember them mentioning 4K video content... specifically Blu-rays and possibly some streaming options.
 
man oh man i miss them olden days where console hardware was specialized and markets were largely segregated.

it wasnt without great flaws, but i liked them days better. where the PC market had PC games, and playstation had playstation games, and nintendo had nintendo games. each device had its own style and character, and hardware differences were distinct and an experience. now everything is extremely similar to everything else, and crossplatform is everywhere.

crossplatform is great for marketability, but youve got stuff like xbox's forced parity so development suffers. crossplatform also means youre appealing to a much wider market, which means youre considering more people's "tastes", which often leads to middle of the road development decisions. i.e., boring status quo stuff.

consoles are just low end PCs now :( and the differences are boring. difference between x1 and ps4? slower RAM, and a slower GPU. yawn. the differences in rendering techniques are so small too. yawn. same console basically, just one is slower.

at least nintendo is keeping the old dream alive, but the wii sucks and so does the wii u. they got a bit lazy and obsessed with profitability. good on em for trying new things, but low end hardware + game rehashes is not a long term solution for the nintendo faithful.

sorry for the rant but GET OFF MY LAWN (cant remember who im yelling at)



Yes! I miss wanting all consoles (but could only afford one) because you had to pick between entire libraries of games, instead of a few exclusives. Back then you had a couple games that were crossplatform and that was it. Where you were excited when a friend had a different system than you so you could go play it.
 
Going by multiple tests by Digital Foundry, an i3/750 Ti combo beats the PS4 in most multi-plats in framerate & image quality (settings).

Every time I fire up Destiny I think to myself how great this could be with MKB & 4k...
 
In the age of 4K consoles are still struggling with 1080p lol.
 
What? no.

The xbone was a 1.3 TF GPU that had 10% reserved for Kinect; that reservation was handed back a couple of months post-launch, at the discretion of the developer.

There is nothing else switched off on the GPU unless you believe in "MisterXMedia" and if you do believe in him, come over here so i can show you this shiny new bridge that could be yours for a really *really* cheap price!

It has 14 CUs, of which only 12 are switched on, for yield reasons. By now I would have thought all 14 CUs would be perfectly viable. The PS4 has a similar situation.

Apparently it can be done, but MS decided to go with a clock increase instead. Why not just enable the extra two CUs now that we are two-plus years down the line?

http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects
 
at least nintendo is keeping the old dream alive, but the wii sucks and so does the wii u. they got a bit lazy and obsessed with profitability. good on em for trying new things, but low end hardware + game rehashes is not a long term solution for the nintendo faithful.

You're describing what Nintendo has always done. They have a vast IP library, and they've always relied on it to drive their lineup, making new iterations of their IPs. EVERY COMPANY DOES THAT.

So, it's odd when people hold Nintendo to some odd standard but don't do it to others. Xbox is Halo 5, Gears 5, Forza 6, etc... Sony is Uncharted 4, Killzone 3, LBP 3, etc...

Blaming Nintendo for sequels is kinda one-sided and odd. I mean, it's not like Nintendo really pumps out the same game. The only one I can blame them for on that front is New Super Mario Bros. However, they are still solid games. Probably stronger sequels than some of the ones above.
 
Sad that everyone is buying 4K TVs and the cutting-edge consoles can barely do 1080p @ 60fps.
Interesting point about scaling being used in Halo 5 to achieve 1080p. Crazy.

To be fair, it's very difficult even for a very high-end PC to do 4K above 30fps in a modern game today.

Still, I think it was ridiculous for them to release a generation of consoles as underpowered as they did. The systems can barely hit 1080p in an era where everybody is already chomping at the bit for 4K.
 
Yes! I miss wanting all consoles (but could only afford one) because you had to pick between entire libraries of games, instead of a few exclusives. Back then you had a couple games that were crossplatform and that was it. Where you were excited when a friend had a different system than you so you could go play it.

oh man thats the stuff. back then, a specific console represented a library of games unique to that platform. N64 had exclusives by RARE (blast corps, goldeneye, etc.), 1080 snowboarding, the best version of SF Rush, etc. PS had Metal Gear, some weird fighting games (battle arena toshinden), front mission, fear effect, etc. Saturn had virtua everything, die hard arcade, etc.

interesting times.

twonunpackmule said:
it's odd when people hold Nintendo to some odd standard, but don't do it to others. Xbox is Halo 5, Gears 5, Forza 6, and etc... Sony is Uncharted 4, Killzone 3, LBP 3, etc...

nintendo has earned a higher standard due to its games. rehashing candy crush 1000 times isnt the same as a mario game stagnating; the latter is worse :(
 
To be fair, it's very difficult even for a very high-end PC to do 4K above 30fps in a modern game today.

Still, I think it was ridiculous for them to release a generation of consoles as underpowered as they did. The systems can barely hit 1080p in an era where everybody is already chomping at the bit for 4K.

You are wrong because the new consoles can hit 60 fps easily.
 
It has 14 CUs, of which only 12 are switched on, for yield reasons. By now I would have thought all 14 CUs would be perfectly viable. The PS4 has a similar situation.

Apparently it can be done, but MS decided to go with a clock increase instead. Why not just enable the extra two CUs now that we are two-plus years down the line?

http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects

Every GPU is like that. The PS4's APU also has 128 stream processors disabled, making it a "real" 20 CU APU by your reasoning, but the industry doesn't produce *full chips* except in small batches for niche products at a high premium, since they must be binned specifically for it.

The difference in yield thanks to those disabled CUs is massive. Remember that these chips have billions of transistors, so the tiniest slip-up in the fabrication process can skyrocket the chance of a faulty set of stream processors. Those two CUs set aside as a "just in case" ensure that production quotas are met, lowering the overall per-chip production cost and increasing profits that way.
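The yield effect can be sketched with a toy Poisson defect model. The defect density, die area, and sparable fraction below are made-up illustrative numbers, not real fab data:

```python
import math

# Toy Poisson yield model: P(a die has zero defects) = exp(-D * A),
# where D = defect density (defects/cm^2) and A = die area (cm^2).
def yield_no_spares(defect_density, die_area):
    return math.exp(-defect_density * die_area)

# With spare CUs, a die with exactly one defect still passes if the
# defect lands in a sparable region (a fraction f of the die area).
def yield_with_spares(defect_density, die_area, sparable_fraction):
    lam = defect_density * die_area
    p_zero = math.exp(-lam)
    p_one_in_spare = lam * math.exp(-lam) * sparable_fraction
    return p_zero + p_one_in_spare

base = yield_no_spares(0.5, 3.6)           # ~16.5% of dies fully clean
spared = yield_with_spares(0.5, 3.6, 0.1)  # ~19.5% usable with spare CUs
print(round(base, 3), round(spared, 3))
```

Even with a small sparable fraction, the redundancy buys a meaningful bump in usable dies per wafer, which is the whole point of shipping with CUs fused off.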
 
nintendo has earned a higher standard due to its games. rehashing candy crush 1000 times isnt the same as a mario game stagnating; the latter is worse :(

How has Mario stagnated? New Super Mario Bros. WiiU was probably the second best one. Then, they released probably the best 3D Platformer in years with Super Mario 3D World. I hardly consider that "going stagnant." If anything, Nintendo went stagnant with Mario Sunshine. The worst 3d Mario title.
 
I can't seem to edit. But, I would have corrected, "New Super Mario Bros. WiiU was probably the second best one in that series."
 
Every GPU is like that. The PS4's APU also has 128 stream processors disabled, making it a "real" 20 CU APU by your reasoning, but the industry doesn't produce *full chips* except in small batches for niche products at a high premium, since they must be binned specifically for it.

The difference in yield thanks to those disabled CUs is massive. Remember that these chips have billions of transistors, so the tiniest slip-up in the fabrication process can skyrocket the chance of a faulty set of stream processors. Those two CUs set aside as a "just in case" ensure that production quotas are met, lowering the overall per-chip production cost and increasing profits that way.

Yeah, I know that, but that redundancy is usually more for the early runs of the chips. Once production is under way, reliability/quality at the fab stage improves to such a state that the redundancy is moot.

At the end of the day though, it doesn't matter, as both consoles are crap bits of kit. :(
 
They have their uses; I would never have expected them to replace desktops... KB+mouse, plus just really being in control of my system to choose IQ vs. performance at the point that matters most to me.

That said, I do expect this to be a short generation, which is good.
 
They have their uses; I would never have expected them to replace desktops... KB+mouse, plus just really being in control of my system to choose IQ vs. performance at the point that matters most to me.

That said, I do expect this to be a short generation, which is good.

Yeah I reckon new consoles by end of 2017 perhaps.
 
You are wrong because the new consoles can hit 60 fps easily.

Sure, if they're displaying a blank screen. I have a PS4, and a lot of the games are capped at 30fps or struggle to hit 60 in complex scenes. And that's on a 720p TV; I can't imagine it's any better for people running at 1080p.

There are some good games for the new consoles, and I like consoles on the whole for their simplicity, but the hardware this generation isn't that great. The PS4's unified GDDR5 is really the only interesting advantage it has, and that lead is shrinking fast with DDR4 (GDDR5 is hopped-up DDR3, after all).
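The GDDR5 advantage is just bus-width-times-transfer-rate arithmetic. A quick sketch (256-bit at 5.5 GT/s is the commonly quoted PS4 configuration; the DDR3 comparison figure is assumed for illustration):

```python
# Peak memory bandwidth in GB/s: bus width in bytes * effective GT/s.
def bandwidth_gbs(bus_bits, gigatransfers_per_s):
    return bus_bits / 8 * gigatransfers_per_s

print(bandwidth_gbs(256, 5.5))    # 176.0 -> the PS4's quoted GDDR5 figure
print(bandwidth_gbs(128, 2.133))  # ~34.1 -> dual-channel DDR3-2133, for scale
```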
 