Wolfenstein: The New Order Performance Review @ [H]

All of us here say save your money as the new Wolf is only suitable as a bargain bin purchase.
Burn.

I am disappoint. I mean, I didn't hate Rage. I thought it was a fun take on a new universe with interesting graphics, even if they were technically flawed. I ASSUMED that the next game would learn. Id is no slouch when it comes to games, and no stranger to adapting to the gaming world. But they didn't. The same mistakes were made, and my description of Rage seems to fit the new Wolfenstein to a T.

Thing is, I was thinking back on Id's game catalog, and it's depressing.
I mean, when was the last time Id made a really good game? Wolfenstein is solid, but technically still broken. Rage too. Quake Wars was a bust, generally. Do we really need to go back to Doom 3 in 2004 to find a really good Id title? It's just sad. Id used to be THE studio that set the tone for PC gaming and graphics in general. Now...
 
I can't really see how id has much of a future with the way things are. I know they didn't make this game but the engine has been a big disappointment. They went "all in" on consoles and gave a giant middle finger to PC gamers. Clearly they forgot who their fans were. If we never see Doom 4 I won't be surprised. Maybe it's justice.
 
OpenGL 4.4 is available to only 67% of Steam's customers. If you want to know why Wolfenstein targets OpenGL 3.2, that's why: 3.2 is available on greater than 96% of surveyed systems.

I doubt that's why they chose 3.2. The recommended specs cover what, 4% of Steam users? I think it was just laziness or a lack of funds/time to program in multi-version support.
 
Does anyone still play games for fun? Technical issues exist but a fun game trumps all of those (within reason, of course). The game is fun. Technically it isn't amazing. But it's a great game IMO.

I get the feeling that a lot of people play just for graphics, and while great graphics are nice, they don't make a game. Putting lipstick on a pig doesn't a good game make. However, I really do feel that Wolf: TNO is a good and fun game.

Game companies need to stop releasing half-assed crap. Maybe it is fun, but that doesn't mean you should ignore all the faults or accept them because that's just how things are. They made a bunch of the same mistakes they made with their previous game. That should be unacceptable. If they had added all the PC graphics settings and had no fps cap or VRAM-limited settings, I think [H]ard would have gone a bit easier. But they didn't, and so they deserve the criticism.
 

Ahh yes, the everything must be perfect or else it fails scenario.
 

I want to see the advancement of PC gaming. I want to see games push PCs to their limits and create a need to upgrade and want better hardware, better video cards, all to improve the gameplay experience. I want to see the PC gaming segment be forward thinking, and prove how and why PC gaming can be massively superior to console gaming. The potential is there, now use it.
 
Also I can name a few games that share these faults but still manage to be incredible games.

-Mass Effect
-Skyrim
-Borderlands

Yeah, it sucks and should be fixed, but that is the fault of consolitis. Until consoles are on an even playing field with PCs ("LOLOLOL") it's going to continue to happen, or until we wipe out consoles altogether. That doesn't mean we should bash the game as completely shit or half-assed.
 


Did that make your list before or after the tremendous community outcry over the way the series was wrapped up and subsequent 'well, guess we fucked up and have to finish it now for all the whiners' result?

PC gamers have some very simple requirements. Not fulfilling them automatically puts you in the half assed category.

Ability to run >60FPS
Robust settings adjustments, FOV sliders, graphical settings, multiple input support
SLI/Crossfire support

Meet these three criteria and you are WELL on your way to getting the support of PC gamers. Make the active decision not to, and you will rue the day, I'm afraid.
 

-_-

For all the flak the game gets for its ending, 99% of the game was still great and the journey was worth it. I've put the past behind me and still love every bit of the story, even if they did butcher the outlook with some weird lazy bullshit. Fuck, I am starting to remember...
 
I think the article is highly flawed in that it doesn't examine any bottlenecks. Throwing the highest-end PC at the game and expecting it to work flawlessly at max settings, without actually checking what those settings do, is not particularly scientific.

Running with uncompressed textures, for example, with MaxPPF set at 64 introduces a ton of streaming artifacts on my system and balloons video memory usage. I also imagine that it increases stress on the hard drive, which is another confound. Reducing those to compressed and 32 pretty much eliminates both issues without any noticeable reduction in quality. The article also disregards how my HD 6850 running on 'low' pegs GPU utilization at around 50%, while my 660 Ti running on 'ultra' becomes GPU limited at around 95% utilization. And the former system runs at a full 1080p while the latter only runs at a lower 1680x1050.

The fact that the game is limited to 60hz doesn't mean that it isn't performance intensive or worth further examining. You just need to reevaluate the performance metrics.
 
It's a 3-4 year old engine that was outdated when it launched. There are no bottlenecks in reasonable scenarios. Sure, if someone has a terribly unbalanced system (say, the CPU from the article paired with a Radeon X300, or an X300 driving a very high-res screen), you're going to run into issues. But on the PC gaming scale, mid-range systems are going to run this very well (your 660 Ti example is a good one), and [H] isn't about mid-range systems. It's just irrelevant to this site and the purpose of the article.
And none of that touches the issues with the engine and tech that Id has apparently refused to address.
 

My thoughts, as well. This isn't [M]edi|OCR
 
I've been playing it for a couple hours and having a great time. Still trying to figure out the settings that will keep the texture stream artifacts to a minimum, but the game itself is a blast to play.

I think, though, that for people like me that are really enjoying the game - we should cut [H] some slack on the review. They are reviewing the game from the perspective of how well (or poorly) it utilizes cutting-edge hardware. This game doesn't do that (except for VRAM size - maybe). It's like Bioshock:Infinite - no tesselation even though it used UE3 for the graphics. You didn't play it to look at how well your GPU does tesselation and soft shadows combined - you played it for the world building stuff, the art, the storyline, the shooting mechanics, and so on. Wolf:NewOrder is the same way - older engine, yeah. But what you're paying for is the world building, the story, the mechanics, the audacity, etc..

But [H] isn't a game review site. It's a hardware site. This game is just not a good title for measuring new hardware, hence their disappointment.

For those of us enjoying the game, we're doing so regardless of how useful it is as a hardware benchmark.

But Bethesda should fix the glitches anyway. Mildly annoying, but needs fixing. :)
 
Granted, [H] is a hardware review site, so I can understand the performance review in that context. I wouldn't fault Brent for this at all, because he reviews PC hardware, not PC games. Yet seeing people dismiss the game based on this? Whatever. It's a blast to play. If you want to sperg over graphics all day long, I hope nobody played Mass Effect 1 or 2, The Walking Dead (great game series), Dragon Age: Origins, or Borderlands. I could go on here. These are all great games which did not push graphical boundaries. Many of them even had 60 fps limits. Does that make a game automatically bad? No. It's not ideal, but it doesn't make a good game into a bad one. I think Wolf: TNO is a great game.

I get it. You want to push your $1000 GPUs. I understand this, and I like good graphics as well. But good graphics don't make a game; putting lipstick on a shit game doesn't make it a good game. So I'm very much enjoying this despite the engine.

I'd also add that some of the "flaws" people bring up are completely overblown. Watch some of the YouTube reviews, such as the one by Total Biscuit. There is certainly nothing game-breaking. What there is, is a very fun game if you LIKE old-school games with great gun mechanics. This game is not Rage: while it uses the same engine, Rage was a complete letdown, and I have not felt let down by this game at all. Was it worth $60? Probably not, but it's a damn fun game nonetheless in my mind.
 

But that's somewhat my point. My 660 Ti machine is also an Intel SRT system where I have the HDD cached to a 20 GB Intel 313, which no doubt helps with the streaming artifacts. I remember Rage having somewhere north of 50% hard drive active time on the same system, while Wolfenstein carries 3x the amount of textures and is thus more dependent on hard disk seek times. Especially when using uncompressed textures, because we don't know where in the pipeline the compression happens.
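The disk math here is worth sketching out. A back-of-envelope estimate (all numbers below are hypothetical, since the engine's actual page size and cache behavior aren't public) shows how quickly uncompressed streaming can exceed what a mechanical hard drive delivers:

```python
def streaming_bandwidth_mb_s(pages_per_frame, page_kb, fps):
    """Worst-case texture streaming throughput if every requested
    page misses the resident cache (illustrative numbers only)."""
    return pages_per_frame * page_kb * fps / 1024.0

# e.g. MaxPPF=64 pages/frame, an assumed 128 KB uncompressed page, 60 fps:
worst_case = streaming_bandwidth_mb_s(64, 128, 60)
print(round(worst_case))  # 480 (MB/s)
```

Even if only a fraction of requested pages actually miss per frame, a mechanical drive's seek latency eats into that budget fast, which is consistent with SSD caching reducing the artifacts.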

The texture streaming artifacts they experienced could be due to the hard drive being the bottleneck. Finding the bottleneck is what an [H] article should be about, in my opinion. There are more bottlenecks in any given system than just the CPU and GPU. And if 1680x1050 can bottleneck a 660 Ti, then that's a GPU-intensive game however you slice it.

The 60 Hz limitation is similarly a pretty big factor in this situation. Just because both GPUs manage 60 fps doesn't mean that one GPU isn't running at 60% capacity while the other is at 90% capacity. Not getting a 90 fps vs. 60 fps benchmark doesn't mean that the relative difference isn't there. The 60 Hz limitation just changes the performance metric.
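That metric shift can be made concrete. Assuming you can log GPU utilization while the game sits at the cap (the 50% and 95% figures are the ones quoted earlier in the thread; the arithmetic is a rough sketch, not a rigorous benchmark):

```python
def estimated_uncapped_fps(capped_fps, gpu_utilization):
    """Estimate the frame rate a GPU could reach without a cap.

    If the GPU is only busy a fraction of each vsync-padded frame,
    busy time per frame is (1/capped_fps) * utilization, and the
    uncapped rate is the inverse of that busy time.
    """
    busy_time = (1.0 / capped_fps) * gpu_utilization
    return 1.0 / busy_time

# Both cards report 60 fps, but the headroom differs enormously:
print(round(estimated_uncapped_fps(60, 0.50)))  # 120 (lots of headroom)
print(round(estimated_uncapped_fps(60, 0.95)))  # 63  (nearly GPU limited)
```

So two systems that both "benchmark at 60 fps" can still be compared, just on utilization headroom rather than raw frame rate.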
 

The title of the article is "Wolfenstein: The New Order Performance Review."
It is NOT "Wolfenstein: The New Order Gameplay Review."

The game might be good, but the game's graphics are donkey dung.
 
It's been a while since I've said "ooooohhhh" out loud when picking up a new gun.

This game is extremely fun.
 
Played it for a few hours. It suffers heavily from consolitis. It's not very good looking and kinda reminds me of CoD (Call of Wolfenstein?) graphics-wise. Sound is lacking... heck, I cranked my speakers just to hear gunshots. It just doesn't have the sound quality of modern AAA titles. Kinda like it was designed for a TV set.

I'm enjoying the story, but the game feels heavily dated. That's the major drawback. As a note, I'm playing on ultra at 1920x1200 with texture compression off. Seems to run fine on my 290. Some textures look like they may be 64x64 resolution. I mean, holy hell, they're bad. Others look like they have low-res normal maps applied. Really hurts the fidelity. I haven't experienced much texture pop-in.

What's good?
- Levels aren't 100% linear. Some open areas.
- They kept the Wolfenstein style of armor/health, although it does get annoying spamming E.
- There are secret passages and such, but I'm not sure I'll replay to find everything.

This is one of those titles I'll beat once and uninstall.
 
Using the latest beta drivers:

SLI not functioning, the lack of AA sucks, and the graphics are mediocre. Gun sounds are lame. Gameplay is decent though, and I am really liking the story!

5/10 for the story alone.
 

Wasn't 3.2 the highest supported version of OpenGL on the Mac for a very long time? If you want to target that platform, most of what I've read still suggests that you not target above 3.2. That said, most of the changes since 3.2 have been around compute, tessellation, and geometry shaders, none of which are used in these games.
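For illustration, gating on a reported GL version is trivial once you have a context. This is a hedged sketch only (a real renderer would query glGetIntegerv with GL_MAJOR_VERSION/GL_MINOR_VERSION on a 3.x+ context rather than parse strings, and the driver strings below are made up):

```python
def parse_gl_version(version_string):
    """Parse the leading 'major.minor' from a GL_VERSION-style string.

    Drivers return strings like '3.2.0 NVIDIA 337.88'; only the
    leading numbers matter for a version check.
    """
    major, minor = version_string.split()[0].split(".")[:2]
    return int(major), int(minor)

def meets_requirement(version_string, required=(3, 2)):
    """True if the reported context version is at least `required`."""
    return parse_gl_version(version_string) >= required

print(meets_requirement("3.2.0 NVIDIA 337.88"))  # True
print(meets_requirement("2.1 Mesa 10.1"))        # False
```

Targeting 3.2 as the checked baseline, then enabling newer features via extensions where present, is one plausible reading of why the stated requirement stays low.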

Id Tech 5 was built around megatexture and data streaming, the idea being that you can just throw gobs of data at a scene rather than trying to simulate effects with shaders. If you take it to its logical conclusion, even lighting can be fully encapsulated in the game world data rather than being calculated at runtime. Carmack made statements about this in the mid-2000s, saying that the holy grail of realism would rely on infinite memory and fully voxelized digital worlds. I think id Tech 5 was his attempt to move the industry in this direction: minimize the computational requirements while taking full advantage of available storage. Tessellation and geometry shaders fell outside the 'problems' the engine was designed to tackle.
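The streaming side of that design can be sketched as a fixed-budget page cache: texture pages become resident on demand and the least-recently-used page is evicted when the budget is full. This is a toy model only and makes no claim about id Tech 5's real data structures:

```python
from collections import OrderedDict

class PageCache:
    """Toy model of megatexture streaming: a fixed budget of resident
    texture pages, filled on demand, evicted least-recently-used.
    Names and sizes are illustrative, not the engine's internals."""

    def __init__(self, budget_pages):
        self.budget = budget_pages
        self.resident = OrderedDict()  # (mip, x, y) -> page data
        self.misses = 0                # each miss = one disk read

    def request(self, mip, x, y):
        key = (mip, x, y)
        if key in self.resident:
            self.resident.move_to_end(key)     # mark as recently used
            return self.resident[key]
        self.misses += 1                       # "stream" page from disk
        if len(self.resident) >= self.budget:
            self.resident.popitem(last=False)  # evict the LRU page
        self.resident[key] = f"page{key}"
        return self.resident[key]

cache = PageCache(budget_pages=2)
cache.request(0, 0, 0)   # miss: streamed in
cache.request(0, 0, 0)   # hit: already resident
cache.request(0, 1, 0)   # miss
cache.request(0, 2, 0)   # miss: evicts (0, 0, 0)
print(cache.misses)      # 3
```

The pop-in everyone complains about is what a cache miss looks like on screen: the low-mip page is shown until the detailed one arrives from disk.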

 
The game LOOKS like shit. Very depressing to see how far they haven't come.
 
I've been enjoying the game, quite a lot actually, but I agree it is a disappointment as far as tech goes.
 
Something I forgot to add in the conclusion: I think the game is overpriced as well. It cost me $60, I pre-ordered, and that just seems like way too much for what we got. Steam doesn't have a return policy, unfortunately; if this were on Origin I'd be getting my money back right now.


For the record, it was HardOCP's money....but yeah, we still want our Rage money back too! :p
 
Tech / engine issues aside I'm having a pretty damn good time with this game.
 
God how pathetic. Was seriously considering it; not anymore. Thanks. Saved me the money.
 
eh, despite the hate here I am playing it and enjoying it. Picked it up for $40 at GMG and worth it IMO.
 

...in Australia it's US$80.00 - minus .05 cents...
On a side note - should be able to play this on my new GF4!! Might even try it on my SLI voodoo 2s...no, what? It's x64? No win 95 or XP64?
 
Really good review, thank you very much.
I completely agree with you. I'm tired of old engines and old APIs, bad console ports, and a lack of texture quality and all these issues (pop-in in 2014? it's a shame), and after all this absurdity, even framerate drops? Are they kidding us? :-D
These devs have to understand that it's time to change. Stop teasing customers, please. I will never give my money to this trash; this is garbage by now.
I have a high-end card ATM (R9 290) and I want it to be useful and adequately exploited, a real "NEXT-GEN." THANKS.
 
I remember paying full price for RAGE and not even being able to play it; the stutters were horrible and there were texture pop-ups everywhere.

Remember the 10/10 graphics reviews many sites gave RAGE? I lol'd and was baffled after seeing those scores, then playing the game for a few minutes and seeing the horrible pop-in and the vast amount of horrible textures. I beat RAGE twice on my 360 FTR, and I own the PC version (Steam sale FTW).

There's no point in fixing the engine, since most of the sales will likely come from console users who sit 10 ft away from their TVs and won't notice the pop-in. I have tons of games backlogged, so I won't fret about the pop-in, but I am disappointed since the game looked cool. I could always replay the 2009 semi-open-world Wolfenstein, which has tons of shiny textures.
 
Where is this 'old API' nonsense coming from? OpenGL 3.2 needs DX10-level hardware, not DX9 as the article states. And the game still requires an OpenGL 4.x driver anyway, effectively making it a 4.x game.

It's no different from the average DX11 game in that regard. BF4 is a DX11 game with a DX10-level minimum hardware requirement.
 
I agree with you, Xoleras, that graphics don't make the game. If I were hung up on that idea, I would not have bought so many indie games in the past few years. And I have many more indie games on my Steam wishlist that I'd like to get as well. I just saw this game called "Max: The Curse of the Brotherhood" pop up on Steam and I'm considering buying it. It isn't the greatest-looking game, but it looks incredibly fun and entertaining.

However, the point I'm trying to make with my previous comments is that we're talking about a large, well-known development studio here, with years upon years of experience in the gaming industry. If CD Projekt RED released The Witcher 3 with graphical limitations and technical issues, or stuck to the DirectX 9 API in a way that distracted from the gameplay, then I'm sure the same criticisms would be leveled at it as well. Many gamers expect a level of polish in their games, especially if they spent $50 or $60 and the game came from a well-known development house. The same happened with Battlefield 4: not over graphics, but over a very buggy launch, and it came from big-name companies in DICE and EA.

We can pretty much forgive this with smaller studios with smaller budgets, but how much can we forgive or are allowed to forgive when it comes to a larger company and larger budgets?

If the gameplay in the game is fun from beginning to end and I get my $60 worth out of it, then maybe, just maybe, I'll forgive the technical issues in the game. However, if the technical and graphical issues distract me from the gameplay to the point of it being nearly unplayable or unenjoyable then I will not consider it money well spent.

We all expect good games coming from larger companies, and we expect them to be fun and enjoyable. Just how far can we forgive or how much can we forgive a larger company for not fixing their game or their game engine after a certain period of time? id Tech 5 had five years to fix itself since RAGE was released and that never happened. Why? We will probably never know.

I have a lot of hopes in OpenGL and that API being a good alternative to DirectX. It's just really unfortunate that the "best looking" games using OpenGL are using older versions of it, or are having issues of delivering AAA-quality games that are on par or better than DirectX-based games.

There has to be a reason and explanation for why developers aren't utilizing the best of OpenGL and showcasing that it can be a good alternative to DirectX. Think about it: you could feasibly play Wolfenstein on Windows XP (64-bit, mind you) because the API isn't locked to an OS the way DirectX is.

I have said this before in another forum post here:
We need forward momentum in PC gaming, not backwards momentum.
 

Is MachineGames known for huge AAA titles? Bethesda might be publishing, but all that means is they're cracking the whip at MachineGames to get Wolfenstein out by the release date.
 
I've been playing this at 1080p on my PC equipped with a dual HD6970 setup and I'm having problems trying to get the optimal graphics/performance.

I'm getting massive tearing with Vsync off and turning it on introduces SIGNIFICANT cursor lag.
I've been trying to force some decent AA as well but I haven't been able to get it to work.

Does anyone have a similar setup and has mastered these settings?


P.S. I'm not really sure what Mr. Bennett means with the old OpenGL thing though.
 