Xbox One’s Unlocked 7th Core Isn’t Much Of A Boost

Megalith

The fact that certain Xbox One games have to run in “dynamic resolution” to maintain 60 frames per second makes me sad that some of them will never be released on PC.

Right now Larian Studios is working on a PS4 and Xbox One console port of the cRPG Divinity: Original Sin, and during the dev cycle Larian has spent a lot of time getting intimately acquainted with current-gen console architecture. The bump in processing speed granted by the Xbox One's seventh core didn't seem to help very much, despite the speculation that the extra core could be responsible for some Xbox One games hitting PS4-level performance.
 
Isn't the main reason the Xbox One can't handle 1080p well its weaker GPU? I don't think an unlocked core or better optimization of its ESRAM can help it much if the GPU is weak.
 
The PS4 has more shader power, double the ROP count and full access to GDDR5. The Xbox One will NEVER be as powerful. Who would ever claim that an extra CPU core would help with resolution?
 
Not surprising; GPUs are the primary bottleneck. That isn't changing anytime soon.
 
Isn't the main reason the Xbox One can't handle 1080p well its weaker GPU? I don't think an unlocked core or better optimization of its ESRAM can help it much if the GPU is weak.

Sony was having issues with 1080p and keeping up the frame rate as well. Have they gotten around it? Really, both systems were underpowered. Kinda sad.
 
Sony was having issues with 1080p and keeping up the frame rate as well. Have they gotten around it? Really, both systems were underpowered. Kinda sad.

Depends on the game, but generally speaking the PS4 tends to run at a bit higher resolution and/or framerate than the Xbone, which isn't surprising considering it has a better GPU. They are still pretty weak compared to most gaming PCs, though.

http://www.ign.com/wikis/xbox-one/PS4_vs._Xbox_One_Native_Resolutions_and_Framerates

A lot of games on that list that run at 30FPS on consoles run over 60FPS on even a mid-range PC graphics card.
 
People thought freeing up one of those extremely weak cores would yield substantial performance gains? Ha.
 
Depends on the game, but generally speaking the PS4 tends to run at a bit higher resolution and/or framerate than the Xbone, which isn't surprising considering it has a better GPU. They are still pretty weak compared to most gaming PCs, though.

http://www.ign.com/wikis/xbox-one/PS4_vs._Xbox_One_Native_Resolutions_and_Framerates

A lot of games on that list that run at 30FPS on consoles run over 60FPS on even a mid-range PC graphics card.

The reason I asked was that I remember at launch they were patching games to get to 1080p on the PS4, but they were going to drop the frame rate to get there. It's been a while though, so I don't remember the exact details. Really, the power of both systems was disappointing, with the graphics power on the Xbone being a little more disappointing.
 
When you are deciding on hardware for your next console and you aim for what was then midrange PC performance, well...

:rolleyes:
 
When you are deciding on hardware for your next console and you aim for what was then midrange PC performance, well...

:rolleyes:

Midrange for 2010. The hardware they went with was low-end for 2013 when the consoles launched.
 
They really needed to go with GTX 780/R9 290 level hardware for these things. They're already horribly dated just two years on.
 
I'm amazed that during one of the brainstorming sessions for both the One and PS4, the No. 1 priority on the flip chart wasn't "Solid 1080p 60FPS".

Or at least solid 1080p/50 or even 1080p/40...

Maybe next gen?

I guess, going with AMD, the only way to have gotten that would have been consoles twice the size, with the cooling to match.
 
Hmm... if only they made these devices where you could swap out the CPU and GPU when they get too old/slow, and add more RAM if/when it's needed... oh wait, those are called PCs. My bad.
 
I'm amazed that during one of the brainstorming sessions for both the One and PS4, the No. 1 priority on the flip chart wasn't "Solid 1080p 60FPS".

Or at least solid 1080p/50 or even 1080p/40...

Maybe next gen?

I guess, going with AMD, the only way to have gotten that would have been consoles twice the size, with the cooling to match.

That is just silly. Solid 1080p @ 60FPS doing what? They can both do that now depending on what they are displaying.

What is your benchmark for making such a silly statement? Should a game that has 100+ AI characters in a city with moving cars, explosions, etc. be 1080p @ 60FPS? What do you sacrifice to hit 1080p, since that is the most important thing to you? Texture quality, the amount of things going on at once, offloading processing to the cloud? What?

1080p is nice, but making it "the benchmark" means you will lose something else to get there.
 
Sad that everyone is buying 4K TVs and the cutting-edge consoles can barely do 1080p @ 60fps.
Interesting point about scaling being used in Halo 5 to achieve 1080p. Crazy.
 
"60fps gameplay is supported by a new progressive resolution system that allows us to dynamically scale the resolution at which we render the game (up to 1080p) based on the needs of the scene. This enables us to deploy resources where they have most impact across a diverse series of experiences throughout the game while delivering the most visually stunning Halo game ever".

Sooo... it runs the game at 720p or lower and scales it to phoney 1080p to ensure 60FakesPerSecond.

BASED on what is on the screen at the time. Just sad.
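
For what it's worth, the core idea behind a dynamic/progressive resolution system like that is simple enough to sketch. Here's a rough, hypothetical Python illustration (not 343's actual implementation): watch how long the GPU takes per frame and nudge the render resolution up or down to stay inside the ~16.6 ms budget for 60FPS, then upscale the output to 1080p.

    TARGET_FRAME_MS = 1000.0 / 60.0    # ~16.6 ms per frame for 60FPS
    MIN_SCALE, MAX_SCALE = 0.7, 1.0    # e.g. never drop below ~70% of 1080p, never exceed native

    def update_resolution_scale(scale, gpu_frame_ms):
        # Running hot: render fewer pixels next frame to protect the frame rate.
        if gpu_frame_ms > TARGET_FRAME_MS * 0.95:
            scale *= 0.97
        # Plenty of headroom: claw image quality back toward native 1080p.
        elif gpu_frame_ms < TARGET_FRAME_MS * 0.80:
            scale *= 1.02
        return max(MIN_SCALE, min(MAX_SCALE, scale))

    # Each frame renders at (1920*scale) x (1080*scale) and is upscaled to the 1080p output.
    scale = 1.0
    for gpu_ms in [14.0, 17.5, 18.2, 16.0, 13.5]:    # made-up GPU timings
        scale = update_resolution_scale(scale, gpu_ms)
        print("render at %dx%d" % (int(1920 * scale), int(1080 * scale)))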
 
That is just silly. Solid 1080p @ 60FPS doing what?

Hmm, well let's see... at least in most games?

That would be a start.

Well, let's even make it easier for them... full 1080p at 30FPS+ in ANY game.

But we don't even have that in most cases.

Piss poor console generation.
 
I feel they should have squeezed one more year out of the previous gen to give more time for development/cost reduction, so we could have received a console that felt much more powerful than the previous one. Though with the group of muggles out there shouting "1080p/60fps" as the only spec that matters, there is no way to win. Remember texture size, polygon count per model, shadows, some of that good stuff that makes a game look better?

I would like it if there were graphics sliders in console games; let people decide what is more important. Plus, the feedback the software would provide would give better insight into whether people prefer quality over framerate or vice versa.
 
Have video consoles EVER been on a level with, or better than, PC hardware from the same timeframe?

Possibly the Neo Geo in 1990, but it was very costly.
 
Have video consoles EVER been on a level with, or better than, PC hardware from the same timeframe?

Possibly the Neo Geo in 1990, but it was very costly.

The Xbox 360 was fairly close at launch. Its specs were probably those of a mid- to high-end PC. It had less RAM than PCs at the time, but then again it loaded a lot less into RAM than PCs did at the time.
 
I think at the current pace, ARM will catch up with the Xbone and PS4 in a year or two.
 
That is just silly. Solid 1080p @ 60FPS doing what? They can both do that now depending on what they are displaying.

What is your benchmark for making such a silly statement? Should a game that has 100+ AI characters in a city with moving cars, explosions, etc. be 1080p @ 60FPS? What do you sacrifice to hit 1080p, since that is the most important thing to you? Texture quality, the amount of things going on at once, offloading processing to the cloud? What?

1080p is nice, but making it "the benchmark" means you will lose something else to get there.
What can you lose to get there? Oh, I don't know... TV support, Kinect, fake passthrough, bloated OS...

If the hardware were there, then you could do everything you listed. The point is the console manufacturers aimed low with the hardware to keep costs down while shoving social and entertainment features down our throats.

Have video consoles EVER been on a level with, or better than, PC hardware from the same timeframe?

Possibly the Neo Geo in 1990, but it was very costly.
Funnily enough, the 6th (PS2, Xbox, GameCube) and 7th (PS3, Xbox 360) generation consoles were on par with mid- to high-end PC hardware at the time they were released. Of course, a new generation of PC hardware is never more than 18 months around the corner, so the consoles were quickly overshadowed by it. The 8th generation consoles were low-end or worse at the time of their release.
 
Isn't the main reason the Xbox One can't handle 1080p well its weaker GPU? I don't think an unlocked core or better optimization of its ESRAM can help it much if the GPU is weak.

No, the primary issue is that the Xbox One uses DDR3 for its main RAM. Most devs program their games to fit the framebuffer into the ESRAM cache, because otherwise graphics performance falls off a cliff. You can reduce details all you like, but unless your framebuffer fits into that ESRAM along with whatever else you need in there, you're not going to get decent performance out of the Xbox One. The PS4, with all of its GDDR5, doesn't have this problem. The PS4 also has more execution units, which would allow for better FPS or more detail, but the framebuffer issue is the Xbox One's one totally insurmountable issue.
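
To put rough numbers on that, here's a back-of-the-envelope sketch (a hypothetical illustration, assuming the commonly cited 32 MB ESRAM figure and plain 32-bit color/depth targets; real engines juggle more buffers than this):

    ESRAM_BYTES = 32 * 1024 * 1024    # commonly cited 32 MB of ESRAM on the Xbox One

    def target_bytes(width, height, bytes_per_pixel=4):
        # Size of one uncompressed render target (e.g. RGBA8 color or 32-bit depth).
        return width * height * bytes_per_pixel

    def fits_in_esram(width, height, num_targets):
        total = num_targets * target_bytes(width, height)
        return total, total <= ESRAM_BYTES

    print(fits_in_esram(1920, 1080, 2))   # color + depth at 1080p: ~16 MB, fits with little to spare
    print(fits_in_esram(1920, 1080, 5))   # 1080p deferred G-buffer (4 targets + depth): ~41 MB, does not fit
    print(fits_in_esram(1600, 900, 5))    # the same 5 targets at 900p: ~29 MB, squeezes in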
 
Honestly, they'd be better off spending more of their TDP on the GPU than on that 7th CPU core.
 
The Xbox 360 was fairly close at launch. Its specs were probably those of a mid- to high-end PC. It had less RAM than PCs at the time, but then again it loaded a lot less into RAM than PCs did at the time.

I remember the 360 being able to run Call of Duty 2 a little better than my PC at the time, which was no slouch. Also, when Gears of War launched, it looked better than any game that was available, in my opinion. I heard the 360's GPU was on the level of a Radeon X1900, which was a high-end GPU back then. The 360 was actually worth getting for its power, but the current consoles today aren't. They are worth getting for their exclusives if they appeal to you.
 
Another Microsoft fail.

I'm not sure it's a fail so much as an empty attention grab by MSFT. Anyone with a basic level of gaming hardware knowledge will know this will do absolutely nothing for in-game performance, but many people (especially Xbox fanboys) will eat it up.

At the end of the day, there's really nothing Microsoft can do to make up for a 25-35% GPU performance deficiency and a slower overall memory pool vs. the PS4. In any case, we're talking about 4-year-old GPUs (cut-down Pitcairn) on both consoles, so it doesn't matter much to PC master racers like us :)
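
That 25-35% figure roughly matches the publicly quoted shader specs (12 CUs at 853 MHz on the Xbox One vs. 18 CUs at 800 MHz on the PS4, with 64 shader cores per CU). A quick theoretical-throughput calculation, as a sketch:

    # Peak FP32 throughput = shader cores x 2 ops/clock (fused multiply-add) x clock speed
    xb1_tflops = 12 * 64 * 2 * 0.853e9 / 1e12    # ~1.31 TFLOPS
    ps4_tflops = 18 * 64 * 2 * 0.800e9 / 1e12    # ~1.84 TFLOPS
    print("Xbox One deficit: %.0f%%" % ((1 - xb1_tflops / ps4_tflops) * 100))    # ~29%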
 
I'm amazed that during one of the brainstorming sessions for both the One and PS4, the No. 1 priority on the flip chart wasn't "Solid 1080p 60FPS".

Or at least solid 1080p/50 or even 1080p/40...

Maybe next gen?

I guess, going with AMD, the only way to have gotten that would have been consoles twice the size, with the cooling to match.

You forget that when the systems were designed, Nvidia wasn't really any better off in terms of GPU power usage... AND Nvidia doesn't make CPUs.

The GPUs are also very low-end compared to what was available for PCs at the time.
 
I remember the 360 being able to run Call of Duty 2 a little better than my PC at the time, which was no slouch. Also, when Gears of War launched, it looked better than any game that was available, in my opinion. I heard the 360's GPU was on the level of a Radeon X1900, which was a high-end GPU back then. The 360 was actually worth getting for its power, but the current consoles today aren't. They are worth getting for their exclusives if they appeal to you.

Correct. The GPU ("Xenos") was somewhere in between an X1800 and X1900 in performance, which was rather high-end at the time. I remember thinking it was great that my $450 purchase was almost as fast as my $2000 PC at the time. The PS3, on the other hand, was a tad dated in GPU terms (essentially a 7800 GT) at the time of its release the following year. Both consoles were eclipsed significantly shortly after with the release of the G80/8800 GTX.
 
Both the PS4 and X1 are weak. They were weak when they launched, and the X1 is weaker than the PS4. However, saying 1080p@60FPS or bust is silly. It just means they will limit other aspects of the game to meet 1080p@60FPS. Both consoles can easily do 1080p@60FPS if you limit what they have to do within the game.

So which is more important: 1080p, or huge firefights, lots of physics, lots of AI, etc.?
 
I can remember when the 360/PS3 came out, most of the articles were really keen on the 'cutting edge' tech in them. It was almost like they only had a 6-month development period till market.

However, this gen is like a mid-range 'laptop in a box'.

What is also interesting is that the eDRAM cache strategy was lauded at the time for the 360 but ridiculed this time round.

Ah well, that's what happens when you put playboy multi-millionaire project managers on win/win contracts in charge of major parts of the corporation.
 
Both the PS4 and X1 are weak. They were weak when they launched, and the X1 is weaker than the PS4. However, saying 1080p@60FPS or bust is silly. It just means they will limit other aspects of the game to meet 1080p@60FPS. Both consoles can easily do 1080p@60FPS if you limit what they have to do within the game.

So which is more important: 1080p, or huge firefights, lots of physics, lots of AI, etc.?

Okay, whilst I agree that a locked 1080p 60FPS is a very tall order, it's certainly something they should have tried far harder to aim at. It's not like they were totally unaware of this expectation.

More was expected. We had a great leap with the 360/PS3; this gen is like a minor refresh.
 
The Xbox One costs $350. On what planet should we expect it to have the same capability as a full-on gaming rig?

All this boils down to is another iteration of "Good Enough" technology. E.g.:
Blu-ray vs. Video on Demand. VOD will never be as high quality as Blu-ray, but it's good enough.
FLAC/Opus vs. MP3. MP3 will never be as high quality, but it's good enough.
Xbox One (or even PS4) vs. a dedicated gaming rig. A console will never run games at 4K @ 60+ fps, but they are good enough.

Good Enough will always win because it's cheaper to implement, cheaper to buy, and more convenient for the masses to use. That doesn't mean we have to like it, just that we pretty much have to accept that "Good Enough" probably isn't going away, ever.
 
So the article doesn't say where this other core was taken from. Was it reserved for Kinect processing, or what?
 