Assassin's Creed: Parity, Ubisoft gimps PS4 version to match XBone

How has the latest console controversy not made the rounds here yet? The NeoGAF thread raged to 100 pages in a matter of hours today.

It all started here: http://www.videogamer.com/ps4/assas...y_is_900p_30fps_on_both_ps4_and_xbox_one.html

Assassin's Creed Unity will run at 900p/30fps on both PlayStation 4 and Xbox One, Ubisoft has confirmed, with the publisher opting to aim for platform parity to avoid discussion over the differences in performance.

"We decided to lock them at the same specs to avoid all the debates and stuff," senior producer Vincent Pontbriand told VideoGamer.com while explaining that it's the consoles' CPUs – not the GPU – that prevents Ubisoft Montreal from improving the game's performance.

"Technically we're CPU-bound," he said. "The GPUs are really powerful, obviously the graphics look pretty good, but it's the CPU [that] has to process the AI, the number of NPCs we have on screen, all these systems running in parallel.

"We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise, and we realised it was going to be pretty hard. It's not the number of polygons that affect the framerate. We could be running at 100fps if it was just graphics, but because of AI, we're still limited to 30 frames per second."

Last year's Assassin's Creed 4: Black Flag also shipped at 900p/30fps on both PS4 and Xbox One. A post-release patch, however, bumped the PS4 version to 1080p. Ubisoft has given no indication that it has plans to do the same for Unity.

http://www.gamespot.com/articles/assassins-creed-unity-runs-at-900p-30fps-on-xbox-o/1100-6422771/
http://kotaku.com/ac-unity-will-have-same-specs-on-xbox-one-and-ps4-to-av-1643054770
http://www.neogaf.com/forum/showthread.php?t=907457
http://www.reddit.com/r/PS4/comments/2igmzq/assassins_creed_parity_unity_is_900p30fps_on_both/
 
is there really any noticeable difference in image quality between 900p and 1080p?...I understand where Ubisoft is coming from...PS4 shouldn't have been gimped but overall I don't think anyone on consoles will notice...
 
is there really any noticeable difference in image quality between 900p and 1080p?...I understand where Ubisoft is coming from...PS4 shouldn't have been gimped but overall I don't think anyone on consoles will notice...

Well, the controversy definitely isn't about whether the difference between 900p and 1080p can be seen by most people. Instead of just making the game look the best it can on each respective platform, the revolt is over a developer spilling the beans that they are holding back one platform's version to make the holder of a competing platform happy, in part due to a co-marketing arrangement.

I'll be playing it on PC myself so me no care, but after the beating Ubisoft took over Watch_Dogs and gimping the PC graphics to be more in line with the console versions, they really don't need another controversy. Ubisoft is in damage-control mode now and has released a polished PR statement claiming the producer who spoke out of turn was "misinterpreted". They claim they didn't "lower" the specs on either version of the game, but carefully avoided addressing whether they intentionally limited what one console's version could do.
 
Do all topics at NeoGAF need to be discussed here as well?

For what it's worth, you can't quite achieve absolute parity on these two machines by running the same resolution and locking to the same frame rate. All else being equal, and unless the game is completely CPU-bottlenecked, the machine with the faster GPU will generally have more consistent frame times. So, the PS4 might have the occasional (and occasionally imperceptible) edge over the X1.

If you care very deeply about the difference 180 lines of resolution makes, though, other platforms are available to you. If you just want to enjoy what's probably a pretty good-looking game, I don't think this 'controversy' is anything to worry about.
 
Do all topics at NeoGAF need to be discussed here as well?

If it were only a 'Gaf topic, then sure, who cares. But this was on every gaming site today, the CEO of Sony made a few public statements about it, and it's going to rage on from the looks of it.

If you care very deeply about the difference 180 lines of resolution makes, though, other platforms are available to you. If you just want to enjoy what's probably a pretty good-looking game, I don't think this 'controversy' is anything to worry about.

Again, the heart of the issue isn't the difference in resolution and/or whether it's perceptible to most people. It's about Microsoft moneyhatting developers to artificially hold back the competing platform's version of the game.
 
Where is the evidence of that? You can hardly base a controversy around a supposition.
 
is there really any noticeable difference in image quality between 900p and 1080p?...I understand where Ubisoft is coming from...PS4 shouldn't have been gimped but overall I don't think anyone on consoles will notice...

I get tired of people saying this. SOMEWHERE, SOMETHING was pulled out. If there is no difference, then it means they either locked the FPS lower than it could have gone or they turned down graphical quality. If you have a machine with X power and another with X+Y, and you run them at the same settings and the same FPS, there is only one explanation: X+Y is wasting potential for no good reason. Why wouldn't anyone who owns X+Y be pissed?

Second, anyone out there who cares about gaming should be telling these companies to cut this out. It's not good for anyone, because it hides competitive disadvantages and teaches companies like MS, in this case, that they can under-deliver and then strong-arm game devs into gimping a completely independent company's platform. How would you like it if you owned a PC and some game company said, "well, due to the many people running Intel graphics, we're locking your game to 30 fps, because it's really important to keep the gaming experience consistent"?

What's even more disturbing about all this is that in almost all of these games people cannot play against or with players on other platforms, so any argument about an unfair advantage doesn't matter. PC gamers play with PC gamers, Xbone gamers with Xbone gamers, and PS4 gamers with themselves. Consistency has no meaning here at all. This is simply MS paying a company off to hinder the performance of a game on a system they don't even have rights to. If this were worded any other way in any other industry, it would probably fall under anti-competitive practice.
 
is there really any noticeable difference in image quality between 900p and 1080p?...I understand where Ubisoft is coming from...PS4 shouldn't have been gimped but overall I don't think anyone on consoles will notice...

I don't understand the mental gymnastics people go through to justify a regression in resolution and framerate in 2014. We have been striving to improve them since the advent of gaming, regressed a bit with LCDs, and finally started moving to 120/144 Hz and 4K, but now we are wondering whether 720/900 upscaled to a measly 1080p is really that important? Sigh...
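For scale, here are the raw pixel counts (just arithmetic, assuming the usual 1600x900 for "900p"):

Code:
# Raw pixel counts per frame -- 1080p pushes ~44% more pixels than 900p.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

print(f"1080p vs 900p: {pixels['1080p'] / pixels['900p'] - 1:.0%} more pixels")
print(f"1080p vs 720p: {pixels['1080p'] / pixels['720p'] - 1:.0%} more pixels")

Whether ~44% more pixels is visible from a couch is a separate argument, but it isn't nothing.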
 
Nice try. If it really were CPU-limited, it would be running at 1080p on both.
 
Not buying this game at all. 900p? Black Flag was 1080p. This is ridiculous. And yes, I do notice the difference between 900p and 1080p.
 
is there really any noticeable difference in image quality between 900p and 1080p?...I understand where Ubisoft is coming from...PS4 shouldn't have been gimped but overall I don't think anyone on consoles will notice...

I don't think there is for the average person. I've played a bunch of games on both platforms to see if there was something I could notice, and in many cases I had a hard time really telling the difference. If you're the person sitting a foot away from the screen, then you may notice it.

In the Destiny beta, I could only notice slight aliasing in some areas.

Hell, I even played Ground Zeroes and couldn't notice that big a difference, and that was 720p versus 1080p.
 
Looks like the dirty word of this generation will be 'console parity'. Hopefully this isn't some behind-closed-doors dealing, but it's hard not to believe the rumours with the way Ubi's PR is spinning it.
 
PC has had this for YEARS... where's the news? We just get bad ports on PC.
 
Looks like the dirty word of this generation will be 'console parity'. Hopefully this isn't some behind-closed-doors dealing, but it's hard not to believe the rumours with the way Ubi's PR is spinning it.

I have a feeling it's behind-the-door dealings. Developers didn't seem to have a problem with not gimping the 360's version compared to the PS3's last gen for any multiplat. Why now? I smell something fishy.

Either behind-the-door dealings, or they're just being very lazy and trying to cut costs.
 
I have a feeling it's behind-the-door dealings. Developers didn't seem to have a problem with not gimping the 360's version compared to the PS3's last gen for any multiplat. Why now? I smell something fishy.

Either behind-the-door dealings, or they're just being very lazy and trying to cut costs.

Pretty sure Microsoft paid up for it. These back-door dealings ruin the gaming industry. No wonder PC gamers hate consoles so much; it ruins it for everyone.
 
I plan to buy this on the PC, but the last two have been littered with technical issues...some of which were never resolved. With the lack of triple buffering, it's as though they wanted you to play AC4 at 30fps on the PC with most setups.
I'll probably wait a few days to make sure it isn't a steaming pile. AC4 was such a great game, it's not like I can just ignore this one.
 
I have a feeling it's behind-the-door dealings. Developers didn't seem to have a problem with not gimping the 360's version compared to the PS3's last gen for any multiplat. Why now? I smell something fishy.

Either behind-the-door dealings, or they're just being very lazy and trying to cut costs.

I can understand the 360 vs PS3 differences (the PS3 was apparently a bit of a donkey to develop for), but yeah, something feels a bit off here.

I think you're right that it's either backdoor politics or laziness/cost savings. The whole CPU-limiting explanation smells a bit desperate to me, and combine that with the non-answer to the allegation - blah.

Hopefully, if there's enough of a stink caused every time something like this happens, we'll see less of it. And I won't be surprised if we see a 1080p post-launch patch for both versions later on.
 
I have a feeling it's behind-the-door dealings. Developers didn't seem to have a problem with not gimping the 360's version compared to the PS3's last gen for any multiplat. Why now? I smell something fishy.

Either behind-the-door dealings, or they're just being very lazy and trying to cut costs.
I have the same feeling. I think a major contributing factor is how the PS4 has been killing the XBONE in monthly sales. I said very early on that the XBONE could potentially hold back progress this generation of consoles, and so far, for the major publishers, that has turned out to be true.

I say buy whichever console makes you happy, but don't bitch about the differences in hardware. I'm sure there are more people like myself who purchased a PS4 for the console-exclusives because it is stronger hardware-wise, hoping for a better experience. I'm kind of lying, though... because I usually buy every console :p. But this gen I bought the PS4 for the multiplats and the XBONE for the exclusives, while last gen it was the other way around.
 
I just don't understand it. If this were the other way around (the PS4 gimping the XBONE in a world where the PS4 is less powerful), you would hear a BIG uproar and see a lot more articles about it.

I hate this generation purely because companies (like Bungie and now Ubisoft) use the damn "consistency" line as an argument for mediocrity.

If there is no cross-gen play, there is no damn reason to be limiting a game because of weaker consoles. We will NEVER have to play against them. This is purely to drive sales so they don't lose sales elsewhere on the pretense that one game is inferior to another due to lower resolution and/or framerate.

It's sickening. Publishers will not drop the previous generation because of the install base, and that, coupled with shady back-door dealings, will hold back this generation for years to come.
 
I hate to throw a wrench in this argument (or give XBONE any props) but if what he's saying is true about being CPU bound, they gimped the One version to match PS4.

Xbox One's CPU is marginally faster than PS4's CPU (1.75GHz vs 1.6GHz).
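For what the clock gap alone is worth (just the ratio; it says nothing about the ESRAM, memory latency, or how many cores the OS reserves):

Code:
# Clock ratio only -- roughly a 9% advantage on paper.
xbone_ghz, ps4_ghz = 1.75, 1.6
print(f"Xbox One clock advantage: {xbone_ghz / ps4_ghz - 1:.1%}")  # ~9.4%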
 
I hate to throw a wrench in this argument (or give XBONE any props) but if what he's saying is true about being CPU bound, they gimped the One version to match PS4.

Xbox One's CPU is marginally faster than PS4's CPU (1.75GHz vs 1.6GHz).

Ha, I like that. I wish I were at home so I could go check and see if anyone's posted that in the NeoGAF thread. If not, you should go post that information there so I can read all of the rage when I get home. :D
 
I hate to throw a wrench in this argument (or give XBONE any props) but if what he's saying is true about being CPU bound, they gimped the One version to match PS4.

Xbox One's CPU is marginally faster than PS4's CPU (1.75GHz vs 1.6GHz).
I feel dumb for not having thought of that. The X1's CPU also has a nice chunk of developer-addressable SRAM and lower-latency DDR3 (latency being quite important for typical CPU ops).
 
I hate to throw a wrench in this argument (or give XBONE any props) but if what he's saying is true about being CPU bound, they gimped the One version to match PS4.

Xbox One's CPU is marginally faster than PS4's CPU (1.75GHz vs 1.6GHz).

Sony has never confirmed or stated the PS4's CPU clock, and judging from this thread, it might actually be faster: http://www.neogaf.com/forum/showthread.php?t=737629

Of course, it's not solid proof, but it's the only benchmark that's ever been done which only takes into account CPU and not GPU.
 
Again, if it were the CPU holding it back they could crank up the resolution on both consoles without any impact to the framerate.
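That's the crux of the skepticism. As a rough sketch with made-up numbers (nothing from Ubisoft), if the simulation really is the bottleneck, the resolution knob barely touches the frame time:

Code:
# Toy model of a pipelined game loop with invented timings (not real Unity numbers).
# When CPU simulation and GPU rendering overlap, the slower of the two sets the pace.
CPU_SIM_MS = 28.0              # hypothetical AI/NPC/crowd simulation cost per frame
GPU_RENDER_MS = {"900p": 9.0,  # hypothetical render cost at each resolution
                 "1080p": 12.0}

def frame_time_ms(resolution):
    return max(CPU_SIM_MS, GPU_RENDER_MS[resolution])

for res in ("900p", "1080p"):
    ft = frame_time_ms(res)
    print(f"{res}: ~{ft:.0f} ms/frame -> ~{1000 / ft:.0f} fps cap")

Both resolutions land on the same ~28 ms frame in that toy model, which is exactly why "we're CPU-bound" and "so why not 1080p?" are both reasonable things to say.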
 
is there really any noticeable difference in image quality between 900p and 1080p?...I understand where Ubisoft is coming from...PS4 shouldn't have been gimped but overall I don't think anyone on consoles will notice...


I don't own a new console, but doesn't running at a non-native resolution (on a 1080p TV, for example) make it look bad? I know it does on PC monitors, as the pixels get distorted/blurred.

It's very disappointing that these new consoles can't hit 1080p. That should have been the baseline. 1080p with good graphics.
 
No, running non-native on a TV isn't nearly as jarring as it is on a monitor. The TV does a bunch of post-processing, plus you're not sitting as close, so it's really not that bad until you get into 60"+ territory.
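The scaling math also isn't brutal (simple arithmetic, assuming 1600x900 stretched to a 1920x1080 panel):

Code:
# 900p -> 1080p is a 1.2x non-integer upscale on each axis, so the TV's scaler has
# to interpolate between source pixels -- that's the slight softness, not a blocky mess.
src_w, src_h = 1600, 900
dst_w, dst_h = 1920, 1080
print(f"Scale factor: {dst_w / src_w:.1f}x horizontal, {dst_h / src_h:.1f}x vertical")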
 
"We decided to lock them at the same specs to avoid all the debates and stuff,"

You... don't... say...
Ubisoft is truly the most incompetent dev on the planet. I remember back in the day when developers were all about pushing hardware to the limit, regardless of platform, not making concessions and excuses... This is more embarrassing than anything. Pathetic, even. It'd be acceptable if it were a small indie shop that couldn't afford to push things to the limit because of, you know, money... Ubisoft is really something else...
 
I'll be playing it on PC myself so me no care, but after the beating Ubisoft took over Watch_Dogs and gimping the PC graphics to be more in line with the console versions, they really don't need another controversy. Ubisoft is in damage-control mode now and has released a polished PR statement claiming the producer who spoke out of turn was "misinterpreted". They claim they didn't "lower" the specs on either version of the game, but carefully avoided addressing whether they intentionally limited what one console's version could do.

the difference between the Watch Dogs controversy and this is that there was a noticeable difference in IQ between the earlier presentations and the final product...I get why people are upset about 900p, but it's more about principle than actual substance...they are gimping the console versions to align more closely with each other when there wasn't a huge gap to begin with...they are not touching the PC version (hopefully)
 
the crazy part about this 'controversy' is that almost every game that's come out on Xbox One has been less than 1080p while the PS4 versions have all been 1080p...there was never any major uproar over this until now...point being that Ubisoft, in trying to quell any graphics controversy, created an even bigger one...if Ubisoft had left things alone then this wouldn't have even been an issue for most
 
This is not a PS4 vs XB1 issue, this is about every platform hitting their potential. That is how competition and progress happen.
I suspect Microsoft is the driving force behind this bullshit, but Ubi can go eat shit forever as far as I am concerned.
 
Sony has not been shy at sharing how their system is superior. I see no reason why they would hide the fact that their CPU is faster, if it truly was.
 
Loved the image showing the game's name changed to Assassin's Creed Parity. lol.

Multiplats: play them on PC. Enough said.
 
Sony has never confirmed or stated the PS4's CPU clock, and judging from this thread, it might actually be faster: http://www.neogaf.com/forum/showthread.php?t=737629

Of course, it's not solid proof, but it's the only benchmark that's ever been done which only takes into account CPU and not GPU.
Is it? Dynamic image generation is something exceptionally well-suited to GPU computation. I'd be very surprised if the Substance Engine is entirely CPU-based.
 
Dunno if a random PR account is enough proof. And it still doesn't explain why the PS4 would score higher on that benchmark.
Your post said this:
Sony has never confirmed or stated the PS4's CPU clock...
It is an official Sony account, meaning a Sony employee stated the speed of the PS4's APU in that Google+ post. Whether or not that figure is accurate is not the issue in the context of your post. The higher bandwidth of the GDDR5 in the PS4, compared to the DDR3 in the XBONE, could contribute to higher scores in certain benchmarks where 0.3ns of latency wouldn't make much of an impact.
 
So you're just arguing semantics then?

I think it's fairly obvious what I meant. Almost every Twitter/G+/Facebook account is handled by an intern or a low-level marketing guy. Hell, the post you linked flat-out says the PS4 is 43 PS2s put together, which isn't even remotely accurate.

edit: Not to mention from that thread, this guy develops for both platforms: http://www.neogaf.com/forum/showpost.php?p=94264594&postcount=50

Yes, you can get more out of the PS4's CPU than you can the Xbox's.
 
Don't believe what Sony says — believe only what anonymous NeoGAF posters say.
 
Lol @ PR trying to save face.

UPDATE: Ubisoft has released a statement claiming that it "did not lower the specs for Assassin's Creed Unity to account for any one system over the other".

"We understand how Senior Producer Vincent Pontbriand's quotes have been misinterpreted," Ubisoft said in a statement provided to Kotaku. "To set the record straight, we did not lower the specs for Assassin's Creed Unity to account for any one system over the other.
 