Shadow of Mordor Graphics Compared: Xbox One, PS4 and PC

When it was revealed that the suggested PC specs for Ultra settings called for 6GB of VRAM, I thought there was a huge disparity between that number and the quality of the textures we've seen in-game.

That's because the "6GB for Ultra" figure has been completely overblown, and the developer did a poor job communicating what started out as good intentions. The game was created on machines with 6GB Titans, and when development was finished they decided to give people the option to use the source textures from development - and that became Ultra. Up to that point, what is now "High" in the game -was- the Ultra preset. They really should've just called the 6GB textures "Ultra Extreme" or something, since everyone's getting bent out of shape thinking they're missing out if they don't run Ultra. SoM's High >= Ultra in other games.

Compare it to Metro / Crysis / The Witcher 2 / etc., and it's not an especially attractive PC game.

Only someone who hasn't played it could think that. I love Crysis/Witcher 2/Metro 2033 & LL, but there are some areas in SoM that blow other games away. The rock faces, cliffs, stones of building ruins, and mud in particular are pretty spectacular, and you have to see them for yourself on a decent PC. The motion capture and fluidity of movement in SoM is also some of the best I've seen. Run around in SoM for a while, then load up Witcher 2 as I did and notice how much more archaic it looks and feels. Night and day.

And Crysis 3 takes gorgeous screenshots, but it's the gameplay in SoM that ultimately steals the show.
 
As usual, Eurogamer is much better at this. Check out their comparison article and video.

The Ultra textures on the PC do look a little better. However, the difference is not drastic at 1080p. If I weren't watching a comparison video, I'd be hard-pressed to tell the difference between PS4 and Ultra PC.
 
36 GB download at 420 kbs = sad panda
Hurry! Hurry! Hurry!
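For what it's worth, a quick back-of-envelope calculation (my own, assuming those figures mean 36 GB at 420 kilobytes per second) shows why that's a sad panda:

```python
# Rough download-time estimate for 36 GB at 420 KB/s
# (assuming binary units: 1 GB = 1024**2 KB).
size_kb = 36 * 1024**2   # total size in KB
rate_kb_s = 420          # transfer rate in KB/s
hours = size_kb / rate_kb_s / 3600
print(f"{hours:.1f} hours")  # roughly a full day
```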
 
If I had to guess the reason for the washed-out look in the PC capture, it's likely that they captured the footage with an external HDMI capture unit, unaware that Nvidia defaults to limited-range RGB over HDMI without a registry hack. That can flatten the picture in games, since they normally use full range on the PC.
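For anyone wondering why a limited-range mismatch flattens the image, here's a minimal sketch (my own illustration, not from the article): full-range PC output spans 0-255, and remapping it into the limited 16-235 range lifts the blacks and dulls the whites, which reads as "washed out".

```python
def full_to_limited(value):
    """Map a full-range 8-bit level (0-255) into limited range (16-235),
    as happens when a capture chain assumes TV-style limited RGB."""
    return round(16 + value * (235 - 16) / 255)

# True black and white are lost, compressing the contrast:
print(full_to_limited(0))    # 16 -- black is lifted to dark grey
print(full_to_limited(255))  # 235 -- white is dulled
```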
 
The PS4 looks best to me in the comparison. No way should the PC version look that washed out. Seems like something fishy is going on, or they didn't set up the PC version properly.

Yeah, my opinion is that the PC looked the worst of them all, largely because of how washed out it was, which killed detail, especially behind the player character.
 
The PC should have been tweaked for at least 1920x1200 and higher... You can't do that with either console. Another stupid console-game ad trying to stimulate console sales by deliberately reducing the quality of the PC version to as close to the limits of the consoles as they could manage. Sad thing is there are lots of really confused console customers out there... ;) I mean, I guess it's "sad." It's also pretty funny, too, when you think about it...
 
As usual, Eurogamer is much better at this. Check out their comparison article and video.

The Ultra textures on the PC do look a little better. However, the difference is not drastic at 1080p. If I weren't watching a comparison video, I'd be hard-pressed to tell the difference between PS4 and Ultra PC.

Trouble is, though...you can do worlds better than 1920x1080 with a PC...but that's not true with these consoles. Some poor souls will look at this and think the PS4 runs games at the same resolution of a PC....;) Imagine if they'd run the comparison using a PC running at 2560x1600...and we haven't even begun to discuss 4k, etc., cpus, gpus--the PC completely outclasses the consoles in *every* category--so the entire point of this demo is only to show what poor PC programmers the developers are...it's almost embarrassing!
 
A previous response mentioned a fog effect on the PC that's not found in the console versions. It might account for some or all of the brightness or gamma difference.

Has anyone asked the author to account for it?
 
The PC obviously has more fog/DOF. It's not washed out if you look at the foreground.

Hard to tell from the video whether I like the fog/DOF effect or it's overdone, but it definitely does not lend itself to a good comparison.
 
SweetFX would have completely changed the look of the PC version. I use it with every game I play, and it's always by far the most dramatic image-enhancing change I can make.

SweetFX turns most games into a liquid-rainbow-looking piece of shit.
 
I've seen that washed-out look on the PC when Digital Foundry does its PC vs. console game comparisons. I notice it, but I don't think it's a big enough deal. It's like comparing against someone who sets their monitor brightness higher than yours, and it's easily fixable anyway. You need to play the game to really notice the difference. Showing it on YouTube just makes it worse - loss of image quality from compression and streaming.
 
"But unlike previous console generations, the Xbox One and PS4 versions of these games are really starting to hold their own against the mighty PC gaming rig." (from the Kotaku article linked above)

Sure, the PS4 and Xbone hold their own as of right now. Just wait a few years to see how they're doing after a few more generations of graphics cards from Nvidia and AMD.
 
PC version looks like crap in this video. My game doesn't look nearly that washed out.

Thanks. This is what I wanted to know from those who already own the game. Does that video accurately represent the actual graphics?

Here's a version with a PC using the HD texture pack.
Shadow Of Mordor Looks Incredible On Maxed-Out PC Settings

Speaking of which, does the HD texture pack really require 6GB of graphics memory? Has anyone tried running it on a 4GB card?
 
So far, in my case, Ultra will use almost all of the 6GB on my video card. I have made some videos on YouTube trying to compare the HD content with the non-HD content. It doesn't help that YouTube drops the video bitrate from 42k to 8k, but with both on Ultra, I am getting basically the same video quality. In the second video I didn't play as long as I did in the first, so it didn't hit 6GB, but the non-HD content did. In the first video, it hits 6041MB used around the 6:10 mark. :eek: The second video is with the HD content installed.

First Video
https://www.youtube.com/watch?v=Qh6GYo6351U

Second Video
https://www.youtube.com/watch?v=-N2ZkmQ_J0Q

Rig is in my Sig
 
The only thing wrong with the PC version is the way this comparison shows it.
I'm currently playing it on a 7950 with the Ultra textures installed at 2560x1440, and it looks a whole lot better than their comparison. But even bringing it down to 1080p with textures on High, it's not all washed out like this video.

It's just a very poor job done by this site. It's not a true comparison. That said, none of the versions look bad, and it's a great game regardless.
 
It's just a very poor job done by this site. It's not a true comparison. That said, none of the versions look bad, and it's a great game regardless.
So I guess the only question is whether it's straight-up incompetence or an actual sponsored agenda to make the console versions look better. In this particular case, I'd lean towards incompetence.
 
So you all are trying to compare a $400 machine and experience to machines with $300+ video cards alone?

Someone mentioned that in a year's time you can get new hardware to run the game better?

Out of one side of your mouths you claim that PC gaming is cheaper, but then you want to compare the experience to $1500 gaming machines that you spend $300+ a year minimum on just to update the GFX card. You can't have both. A $400 PC probably wouldn't touch the consoles' performance in this game.

In reality, people find extreme value in consoles and will continue to. There is no experience to be gained on the PC at the price point the consoles are at. No matter how much you beat your chest about console gamers paying full price, yada yada yada.

The cheap-sale mentality of the PC gaming crowd only hurts its chances of getting games that tax hardware for the 1% who own it even more.

It is just not profitable to put in the funding to try and make a niche market happy.
 
Only someone who hasn't played it could think that. I love Crysis/Witcher 2/Metro 2033 & LL, but there are some areas in SoM that blow other games away. The rock faces, cliffs, stones of building ruins, and mud in particular are pretty spectacular, and you have to see them for yourself on a decent PC. The motion capture and fluidity of movement in SoM is also some of the best I've seen. Run around in SoM for a while, then load up Witcher 2 as I did and notice how much more archaic it looks and feels. Night and day.

And Crysis 3 takes gorgeous screenshots, but it's the gameplay in SoM that ultimately steals the show.

I haven't played SoM yet, so you're correct - my only point of reference is the videos I've seen online. I know it looks good, but it's not revolutionizing PC graphics technology. I don't think it's even nudging the envelope. My issue is this: it could have. It probably would have if there weren't an imperative to maintain parity between the console and PC ports.

This is just speculation on my part however. I'm wearing a tinfoil hat and I want to believe.
 
I haven't played SoM yet.


Let me stop you there. Play it and then reassess. Gameplay videos of this game were kinda boring to me pre-release. Then I played... Holy crap, the gameplay turned out to be like crack. Addictive as hell.

Graphics wise there are some really amazing eye popping textures, like rock and stone and mud and some of the fabrics. You can't play this game in its full PC glory without thinking it looks great.

This game came out of nowhere and stole my tears ((nohomo))
 
This comparison video shows the PC version in a poor light; I'm running the game maxed out without the Ultra texture pack and it looks amazing at 1080p. The game can get repetitive, but so can every other game - the combat keeps me coming back because you feel like a badass with 20 orc corpses at your feet.
 
Oh, the irony of using LossyTube to try and compare graphics.
 
One of the most informed responses of the bunch. I would think that any new GFX card would let the PC excel over the consoles, but do we even know what they were using in the samples? I'm considering an Xbox One to keep in touch with family, etc., but PC gaming is where I'm rooted for now, especially if I go 2560x1440. 'Nuff said...
 
This sure doesn't look like 6GB worth of VRAM quality. All that tells me is that they didn't optimize it for PC worth a flip. Amazing graphics and combat for the character, sure, but the terrain is still 2D and looks worse than a decent Skyrim texture mod.
 
So you all are trying to compare a $400 machine and experience to machines with $300+ video cards alone?

Someone mentioned that in a year's time you can get new hardware to run the game better?

Out of one side of your mouths you claim that PC gaming is cheaper, but then you want to compare the experience to $1500 gaming machines that you spend $300+ a year minimum on just to update the GFX card. You can't have both. A $400 PC probably wouldn't touch the consoles' performance in this game.

In reality, people find extreme value in consoles and will continue to. There is no experience to be gained on the PC at the price point the consoles are at. No matter how much you beat your chest about console gamers paying full price, yada yada yada.

The cheap-sale mentality of the PC gaming crowd only hurts its chances of getting games that tax hardware for the 1% who own it even more.

It is just not profitable to put in the funding to try and make a niche market happy.

I have not upgraded my computer in at least 4-5 years, and when I did, I certainly did not spend $1500 on it. My 7950 cost me $180, and I am playing this game and pretty much every other release at maxed-out settings.
 
Here's an interesting related story: the new Assassin's Creed game is already limited by console CPUs, and it's not even the Xbox One hobbling the PS4 before the PC comes into the picture:

http://techreport.com/news/27168/assassin-creed-unity-is-too-much-for-console-cpus

the title will be limited to 1600x900 resolution and 30 frames per second on both consoles "to avoid all the debates and stuff."

So frustrating. This is interesting, though:

GPU horsepower doesn't appear to be the most important limiting factor in this case, though

Well, if that's the case, then why limit resolution? Someone else will have to explain that one; I'm kinda confused why a bottleneck in AI code affects resolution.

Assassin's Creed Unity is coming to the PC, and that version shouldn't be locked to a specific resolution and frame rate

Let's hope this is the case and not another Watch Dogs.
 
http://www.eurogamer.net/articles/2...nity-graphics-lock-for-parity-on-ps4-xbox-one

"We're a bit puzzled by the ACU situation and had a chat about it amongst ourselves. Internal bandwidth is shared between CPU and GPU on both machines which can result in a battle for resources, and if ACU is as CPU-heavy as Ubisoft says it is, that could potentially have ramifications - the only problem with this theory is that there's very little evidence that other titles have experienced the same issue, certainly not judged by Ubisoft's own output.

Ok, so it's not the CPUs but rather an interconnect limitation? Sounds like they ran out of bandwidth with the CPU and GPU sharing resources. I guess that explains why the resolution was bumped down; I was confused about that for a minute.
 
PC definitely had some washed out colors. Everything else looks basically even by my estimation.

This makes me think there's some shenanigans going on here, possibly the dev being forced to create false parity between ports.

I agree. Even the Ultra 6GB pack did not, at least to my eyes, appear much better than the PC at 2GB.

I think there is a movement to keep performance nearly identical on all platforms. So if you want the BEST graphics, you will have to look at exclusives.
 
http://www.eurogamer.net/articles/2...nity-graphics-lock-for-parity-on-ps4-xbox-one

Ok, so it's not the CPUs but rather an interconnect limitation? Sounds like they ran out of bandwidth with the CPU and GPU sharing resources. I guess that explains why the resolution was bumped down; I was confused about that for a minute.
The PS4 has an extra bus specifically for this. Additionally, the CPU and GPU don't have to go through traditional wait states to communicate. So this doesn't make sense unless they're using traditional-style code and ignoring the new features. Wouldn't surprise me...
 
The PS4 has an extra bus specifically for this. Additionally, the CPU and GPU don't have to go through traditional wait states to communicate. So this doesn't make sense unless they're using traditional-style code and ignoring the new features. Wouldn't surprise me...
It's more likely that they ran into a limitation on one console, then didn't want to add the extra effort and expense to fix it on the other, so rather than modify their engine, they decided to keep both versions as-is at a reduced resolution. The same thing happened with the original Xbox 360 and PS3: games generally ran better on the 360 because the devs didn't want to rework their engines for the PS3 due to budget/time constraints.
 
Can't comment on this game specifically, but it's real enough. From an Ubisoft dev leak:

"Currently as it stands, there is definitely a lot of push coming from publishers to not make the experience so different on consoles as to alienate people into thinking that next generation is not as powerful as PC."

"while ‘Yes’ the lead platform is the PC, we simply cannot have such a big gap. "

But the consoles are NOT as powerful as PCs... unless I missed the point completely. I'm tired.
 
It's more likely that they ran into a limitation on one console, then didn't want to add the extra effort and expense to fix it on the other, so rather than modify their engine, they decided to keep both versions as-is at a reduced resolution. The same thing happened with the original Xbox 360 and PS3: games generally ran better on the 360 because the devs didn't want to rework their engines for the PS3 due to budget/time constraints.

Yes that's exactly what they did, and they specifically said so:

the title will be limited to 1600x900 resolution and 30 frames per second on both consoles "to avoid all the debates and stuff."

Quoted verbatim.
 
But the consoles are NOT as powerful as PCs... unless I missed the point completely. I'm tired.

Yes, the point is for business reasons they decided to hobble PC ports artificially due to fear of cannibalizing console sales. They need to make consoles to look not so far behind because that's where they make big bucks - closed garden with less competition so prices stay higher.
 
Yes, the point is for business reasons they decided to hobble PC ports artificially due to fear of cannibalizing console sales. They need to make consoles to look not so far behind because that's where they make big bucks - closed garden with less competition so prices stay higher.

to make consoles look*

God no edit button is a pain.
 
But the consoles are NOT as powerful as PCs... unless I missed the point completely. I'm tired.
Yeah, that's the irony to all this. They're not as powerful, but there's a powerful marketing incentive to make it appear otherwise, which in turn, has a direct impact on how good the PC ports look and/or run.
 
Boy, do we live in a wonderful place where those who pay more money don't get full value in return... shame, really. I mean, this is blatantly obvious now that my brain is working with coffee in my system, but it's a shame through and through (I worked in the game industry for a few years, so I'm not completely blindsided by it, just irritated).
 
Yeah, that's the irony to all this. They're not as powerful, but there's a powerful marketing incentive to make it appear otherwise, which in turn, has a direct impact on how good the PC ports look and/or run.

You'd think someone who worked in the games industry and spent hours a day coding wouldn't have to ask what I did, but I hadn't had any coffee yet... don't hate.

On a serious note though, the reasoning is clear and it makes complete sense why they're doing it, but it still sucks wall to wall.
 