Wolfenstein: The New Order Performance Review @ [H]

I don't know; for some of the options to not exist, such as a 2 GB GTX 770 not being allowed to use Ultra settings even though it's capable of them, seems like shoddy programming or, for whatever reason, intentional crippling.

Because a 2 GB card is not capable of keeping all those textures in video memory. The game can consume over 3.5 GB of video memory when using uncompressed textures on Ultra. Using compressed textures brings that usage down into the 2.5 GB range, so you already need a 3 GB card just for that.

Otherwise you'd just have more people complaining about streaming issues without understanding the reason why it happens. We're seeing it in this very thread.
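
Quick back-of-the-envelope check using those numbers, if it helps anyone decide whether their card is in trouble. Just a rough sketch; the figures above are approximate and vary by scene and resolution:

Code:
# Rough sanity check: does the game's observed texture memory footprint
# fit in a given card's VRAM? Numbers are the approximate ones quoted above.
OBSERVED_USAGE_GB = {"uncompressed": 3.5, "compressed": 2.5}

def expect_streaming_trouble(vram_gb, texture_mode):
    return vram_gb < OBSERVED_USAGE_GB[texture_mode]

for vram in (2, 3, 4):
    for mode in ("compressed", "uncompressed"):
        verdict = "expect pop-in/streaming" if expect_streaming_trouble(vram, mode) else "should fit"
        print(f"{vram} GB card, {mode} textures: {verdict}")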
 
I played this on a GTX 460 with 1 GB of VRAM. Resolution 1440x900 + FXAA, medium settings.

While not Crysis 3, the graphics were fine. There was hardly any noticeable texture pop-in, and the framerate hit 60 FPS.

The game itself is one of the best I've played. I've got no real complaints. There were even a few spots where I stopped to admire the much-lamented graphics. It's better than, say, "The Last of Us", and plenty of folks managed to enjoy that game's look.
 
Eh, methinks the review is misleading in the way it puts people off. Yes, a lot of the points about the graphics are valid. But the game is a lot more than that. It is fun to play, the story is excellent, and a lot of the mechanics are well thought out and just work.

This is a performance website & a performance review. Story and gameplay are not important.
 
This is really disappointing. I know many of us want id to be better than this, to develop outstanding games, and they just keep failing us over and over. There are plenty of other companies who can produce high-quality engines without all of the issues inherent in idTech 5. Oh well.
 
This is a performance website & a performance review. Story and gameplay are not important.

"This website is dedicated to performance automobiles. Ride comfort, fuel economy, interior, and other such topics are irrelevant." I'm not saying you're wrong, but declaring that something's worthwhile based solely on some number crunching & technical criteria's a pretty lousy approach to life.
 
That's not at all the conclusion of the article. The article is saying that, though those might be perfectly adequate, the INadequacy of the graphics, and their distractingly poor nature, overwhelms and overshadows any ability of the story and gameplay to save the game. Rage was a very passable game, if you ignored the graphics issues. The weapon leveling/upgrading was nice, the vehicle sections were well done, and the shooting was responsive. But between the horrible ending and the terrible graphics, it was a bust.

Here? Same story, different jacket.
 
Yeah, I agree. With all due respect ... HardOCP seems to have a chip on its shoulder regarding Rage that's made its way into this preview/review.

I've been playing this game, and on my system (see sig) it plays like a dream. The experience is one of the best I've had in terms of action, art, gameplay, exploration, puzzles, etc. It's a freaking blast to play. Voice acting, cinematography, cutscenes ... all FREAKING amazing. And who doesn't love the dark war-tech of the Nazis?

The art direction in this game is amazing ...

As for some of the comments on here, I disagree with what I'm reading ... some of it doesn't make sense. Play the game, totally forget this preview/review, and start out with a clean slate. The game is that good. Form your own opinion. Even HardOCP will tell you that you have to play the game and decide for yourself. I would hate to see people get tainted into not playing this game. That would be a crime.
 
I think that HardOCP has done a disservice to computer gaming by blasting this game. I've been gaming since before the first Wolfenstein, and the current game meets my expectations of a fun game. It uses 70% of my R9 280X and about 40% of each of the 8 cores on my 8350, which is much better hardware usage than other games. I know it's not perfectly tuned for high-end PCs (which most HardOCP folks have). But it is a fun game and runs very well on my system, much better than RTCW did on my 800 MHz Pentium 3.
 
and the current game meets my expectations of a fun game.

We aren't reviewing if the game is fun or not. This is not the scope of our evaluation, so you've missed the point.

It uses 70% of my R9 280X

That is not a good thing. If it is only using 70% of an R9 280X, then it shows this game is more CPU-limited/oriented than GPU-limited. This is not the type of game that sells video cards, or makes one want to upgrade. It also does not have any innovative, forward-thinking PC gaming graphics features. It is very much, in terms of graphics, a last-generation game.

and about 40% of each of the 8 cores on my 8350

Further proving the CPU-limited/oriented nature of the game.

The bolded bits in the conclusion show that it lacks basic PC gaming features.

-------------------------------------

I don't know how else to stress to others in this thread that we aren't reviewing whether the game is fun. That is not what we evaluate. We evaluate from a more technical basis in terms of graphics image quality, performance, and PC gaming features. We evaluate with the video card as the primary centerpiece of technology accelerating the game, and the benefits that said game receives from video cards. We evaluate what current or forward-thinking graphics features are exploited in the game and taken advantage of by the video card. We find out what level of video card is required to enjoy the game, what settings are playable, and whether you should upgrade your video card.

We want PC gaming to move forward, and technology to move with it. Games sell gaming video cards, games have the power to make gamers upgrade if the game demands it. Otherwise, no one would ever upgrade their hardware for gaming if games never demanded more. Gaming technology and features are very important for AMD and NVIDIA, and we evaluate whether said game is helping or hurting the industry, in our opinion of course.

You want to find out if the game is fun? Go read another PC gaming, storyline-centered review. You want to find out if the game will run on your video cards, and how well, and if you need to upgrade or not, and if the game is forward-thinking or stuck in the past in terms of technology? Then read our evaluation. We will tell you how it is. Period.
 
When people call the game a turd based on the review, they are getting a false view of the game. I have played a lot of turds with great graphics, and this game is not a turd.
 
That is not a good thing. If it is only using 70% of an R9 280X, then it shows this game is more CPU-limited/oriented.
What is the measure of reliability of any "percent usage" figure, and how are these figures even derived? I've seen a lot of emphasis on these statistics over the past couple years — and more people throwing them around as though they're gospel — but I've not seen much to suggest their usefulness. Can "70%" be conclusively and consistently proven to be worse from a performance/efficiency perspective than "100%"? Or "90%"?

If it can't, how can you draw good conclusions as to whether your assertions about the engine's efficiencies are sound? If it can, where is good data on that?
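
As far as I can tell, those percentages are just the driver's utilization counter sampled at some interval and averaged, which is exactly why I question how much they tell you about efficiency. A rough sketch of how they're typically gathered (assuming an NVIDIA card with nvidia-smi on the PATH; AMD's tooling is different):

Code:
# Sample the GPU's reported utilization once a second and average it.
# This is roughly what monitoring tools report as "GPU usage %": a
# time-averaged busy flag, not a measure of how efficiently the GPU is used.
import subprocess
import time

def average_gpu_utilization(samples=30, interval=1.0):
    readings = []
    for _ in range(samples):
        out = subprocess.check_output([
            "nvidia-smi",
            "--query-gpu=utilization.gpu",
            "--format=csv,noheader,nounits",
        ])
        readings.append(int(out.decode().splitlines()[0].strip()))
        time.sleep(interval)
    return sum(readings) / len(readings)

if __name__ == "__main__":
    print(f"Average GPU utilization: {average_gpu_utilization():.0f}%")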

We want PC gaming to move forward, and technology to move with it. Games sell gaming video cards, games have the power to make gamers upgrade if the game demands it. Otherwise, no one would ever upgrade their hardware for gaming if games never demanded more.
And you'd be out of a job. Do you feel that plays a part in the opinion-based components of your evaluations?
 
Makes me wonder if a hack or workaround will surface eventually that enables Ultra settings on a 2 GB card. If so, I might reconsider. An AA fix would help persuade me as well.
 
More info I just found out: the game doesn't support SLI or CrossFire, because of idTech 5. Forcing AFR gives negative performance scaling. Also, the GPU Transcode option, which can be turned on via console command, is incompatible with SLI, though it's disabled by default; some say it helps with texture pop-in on GeForce cards. Anyway, SLI and CrossFire are not supported, which is a shame.

I'm playing through the game on a 6GB TITAN and still getting texture pop-in badly, so VRAM capacity is not the cure for it.
 
I had to downgrade to 335.23 to get the game playable on custom maxed settings. I am going to try the new WHQL released for Watch_Dogs and see if that also works (both 337.50 and 337.81 gave me 1 fps on custom maxed settings).

WHQL 337.88 works perfectly for me.
 
With regards to the review,

Can I ask why there wasn't a CPU benchmark/comparison done, please? With an i7 stipulated as the minimum, it would be nice to see what performance differences there are between an i5 and an i7.

I hope that if there is going to be a Watch Dogs performance review, a CPU benchmark is included. It would be greatly appreciated.
 
In the other thread there's a screencap which shows over 5.8 GB VRAM being used, so I'm not sure that limited VRAM capacity isn't a problem. The question then arises: how much VRAM do you actually need?
 
The Guru3D Watch Dogs performance review had a Titan using 6GB of VRAM.

Despite this, the reviewer mentioned that the game doesn't look very good. Think about this for a second. That's pretty damn hilarious. A game that looks worse than Witcher 2 but uses 5 times more VRAM. Yeah. Okay.

Now when a game looks like shit but takes up 2.5-6GB of VRAM, what the fuck good is that? Clearly the PC port situation seems to be getting all the worse with this nonsense.

That said, if the game is fun, that's what matters the most to me. I will simply wait for performance drivers and for the game to be fucking fixed, and then I'll happily enjoy it.
 
You guys are seriously missing out on an Epic Adventure ... Trains, Prison Camps, Massive Bridges, The Moon, U-Boats, Helicopters, and that's only 37% of the way in ... WOW

Sure, I saw a few textures that were not up to standard, but I didn't dwell on it or let it taint the experience.
 
I'm playing through the game on a 6GB TITAN and still getting texture pop-in badly, so VRAM capacity is not the cure for it.

You're presumably running with uncompressed textures on Ultra? I'd be curious to know how much VRAM it consumes. If you're running dual displays then I'd also be curious to know how much disk active time you're seeing in-game. I see spikes upwards of 90% active time on my own system and that's with an Intel SRT cache disk attached.

People need to bear in mind that there are more bottlenecks in a system than just the CPU and GPU. A MegaTexture is basically just a database file, and that puts additional stress on other subsystems. PCIe 3.0 x16 only allows for 273 megabytes per frame at 60 FPS, for example, and running the game with uncompressed textures could conceivably exceed that in some instances and be your cause of texture pop-in.
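
For anyone who wants to check that per-frame figure, the math is simple. A rough sketch, treating PCIe 3.0 x16 as roughly 16 GB/s of one-way bandwidth (the ideal number; real throughput is lower once protocol overhead is counted):

Code:
# Back-of-the-envelope PCIe transfer budget per frame.
# Assumes ~16 GB/s one-way for PCIe 3.0 x16 (ideal; 128b/130b encoding and
# protocol overhead push the real-world number lower).
PCIE3_X16_MB_PER_SECOND = 16 * 1024  # ~16384 MB/s
FPS = 60

budget_mb_per_frame = PCIE3_X16_MB_PER_SECOND / FPS
print(f"Per-frame transfer budget at {FPS} FPS: {budget_mb_per_frame:.0f} MB")
# -> roughly 273 MB per frame; a burst of uncompressed texture pages larger
#    than that cannot arrive within a single frame and shows up as pop-in.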

It's these sorts of things I'd want to know more about. MTs being built like databases isn't an inherently bad idea but databases are typically I/O bound and need to be considered as such.
 
More info I just found out: the game doesn't support SLI or CrossFire, because of idTech 5. Forcing AFR gives negative performance scaling. Also, the GPU Transcode option, which can be turned on via console command, is incompatible with SLI, though it's disabled by default; some say it helps with texture pop-in on GeForce cards. Anyway, SLI and CrossFire are not supported, which is a shame.

I'm playing through the game on a 6GB TITAN and still getting texture pop-in badly, so VRAM capacity is not the cure for it.



I'm playing this game in Eyefinity on a forced CF profile.
I don't know what game you're playing, but I rarely see any texture pop-in/out. Maybe out of the corner of my eye, but nothing like we saw in Rage. :eek:
 
You guys seem to have a chip on your shoulder left over from Rage, and it has extended to this game.

Yes, there are some graphical weaknesses here for a game released in 2014.
Does it affect the gameplay? Not at all.
Does it change the story? Nope.

I was thinking about what you said here, so last night I loaded up Rage ... again, and maybe for the last time, I was thinking.

I only had the texture pop issue happen one time. That issue, along with previous experiences, was a full dealbreaker for me before, as it just killed the immersion of the experience.

I played for about 3 or 4 hours in the wee morning hours and had a good time with it, getting much further in the campaign than I had before.

Still, it feels very linear and on rails, but I did enjoy it, though I'd still call it a bargain-bin purchase. The graphics go from awesome to plain shitty, as they did before. But I am finally glad to be playing my $60 game without constant game-killing texture pop distractions.

As for Watch Dogs, I cannot get it to run here on my personal system. This seems to be happening on NV and AMD cards. Brent is working through some things now. Look for our preview tomorrow on WD.
 
In regards to being biased against RAGE: for the record, I am coming at Wolfenstein from a clean slate, as I never played RAGE at all. Mark did our evaluation on it; I personally never spent the money on it and never loaded it up. So I have no bias against RAGE, and I came at Wolfenstein with a completely clean slate.

I did not know what to expect; my expectations were positive at first, while I was downloading the game. I thought it would be the next-gen game we were looking for. Then I played it.

I'm not downing its fun factor, not at all. I'm still playing the game in my free time, as I've expressed; I'm going to play it to the end, and I'm having quite a bit of fun.

Our evaluation doesn't evaluate fun, as discussed. Fun and graphics don't necessarily go hand in hand; one of my favorite games of all time is a 2D game that has no 3D acceleration at all and looks like crap compared to modern games. Of course we get that. But our review doesn't evaluate that; it's not the scope of the review. This game is not worthy of being added to our gaming suite lineup, simple as that.

As for texture pop-in, I've played the game on 2 different systems now, and it's bad on both. One system was with an SSD, so that's not the fix, and the other with a 6GB TITAN, so that's not the fix either. The fact that it's doing it so badly here means it could be for others as well, and it is very, very distracting to gameplay. I've kind of been trying to just ignore it, but I do notice it all the time.

I've looked at CPU usage on an i7 with HT enabled and the CPU will max in the 70% range. I'll note VRAM usage on my next play session.

BTW, I found Jimi Hendrix :p
 
Because a 2 GB card is not capable of keeping all those textures in video memory. The game can consume over 3.5 GB of video memory when using uncompressed textures on Ultra. Using compressed textures brings that usage down into the 2.5 GB range, so you already need a 3 GB card just for that.

Otherwise you'd just have more people complaining about streaming issues without understanding the reason why it happens. We're seeing it in this very thread.

It also depends on the resolution. I mean, at 800x600 you probably won't even hit 2 GB, but it should be my choice what resolution to game at. Not that anyone plays at 800x600; it was just an example.
 
As for texture pop-in, I've played the game on 2 different systems now, and it's bad on both. One system was with an SSD, so that's not the fix, and the other with a 6GB TITAN, so that's not the fix either. The fact that it's doing it so badly here means it could be for others as well, and it is very, very distracting to gameplay. I've kind of been trying to just ignore it, but I do notice it all the time.

Any chance I could trick you into testing it with compressed textures as well?

I just did a quick test on my own system using a 2 GB 660 Ti, and its VRAM is already getting maxed out on Medium when the textures are set to be uncompressed. Sysinternals' Process Explorer shows a full 1.6 GB differential in memory consumption between compressed/uncompressed at that setting.

At its highest, I've seen the game commit a full 5.5 GB of memory (2 GB VRAM and 3.5 GB system RAM) when using uncompressed textures. That number dropped to 3.3 GB (1.6 GB VRAM and 1.7 GB system RAM) when setting the textures to be compressed. I'd expect that number to rise significantly with more VRAM and at higher settings.
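
If anyone wants to log this over a whole play session rather than eyeballing Process Explorer, here's a rough sketch. It assumes Python with psutil installed, the executable name below is a guess (check Task Manager for the real one), and it only sees system RAM, not VRAM:

Code:
# Rough sketch: log the game process's RAM usage once a second while playing.
# Assumes psutil is installed (pip install psutil). The executable name is
# an assumption; check Task Manager for the actual one.
import time
import psutil

GAME_EXE = "WolfNewOrder_x64.exe"  # assumption; adjust to the real name

def find_game_process(name):
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == name.lower():
            return proc
    return None

game = find_game_process(GAME_EXE)
if game is None:
    raise SystemExit(f"{GAME_EXE} is not running")

try:
    while game.is_running():
        mem = game.memory_info()
        print(f"working set: {mem.rss / 2**30:.2f} GB, "
              f"committed (approx): {mem.vms / 2**30:.2f} GB")
        time.sleep(1.0)
except psutil.NoSuchProcess:
    pass  # the game exited while we were sampling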
 
:D The game sounds like a blast to play; it does not really push hardware capability except maybe VRAM usage and hard drive speeds. I would have thought HardOCP would be using an SSD with this game :confused:. Does it actually use more than two CPU threads? Maybe more useful for CPU testing (if the 60 fps limit is overcome) and not GPUs.

Sounds like MSAA as well as SSAA is easily obtained with a console command and driver settings. Not everything needs to be in the game settings; we are the ones that push things, including custom settings and using other programs. :rolleyes:

In id's other games, weren't aspect ratios also configurable via the console? What about using compressed textures with the 2 GB cards? What quality do you get, and can FPS be maintained at Ultra settings? The review, in my opinion, did not fully research all the available options, and some of them are very easy to try.

Still, I am not willing to shell out $60 on this title - under $40 to kill some Nazis, hell yeah!
 
As for texture pop-in, I've played the game on 2 different systems now, and it's bad on both. One system was with an SSD, so that's not the fix, and the other with a 6GB TITAN, so that's not the fix either. The fact that it's doing it so badly here means it could be for others as well, and it is very, very distracting to gameplay. I've kind of been trying to just ignore it, but I do notice it all the time.

I've looked at CPU usage on an i7 with HT enabled and the CPU will max in the 70% range. I'll note VRAM usage on my next play session.

I've been playing it on my home system with a 4 GHz i5-3570, 32 GB of RAM, RAID 0 across 4 Intel SSDs, and a Titan. I haven't had any issues with texture pop that I have noticed. I'm not saying I have zero texture pop, but I certainly haven't noticed any. It's a given my system is oddball for a home user, and if that's what it takes to not get texture pop, then the engine needs some major optimization.

Forgot to mention: my settings are Ultra, and then I upped the "Max PPF" setting and the "Screen-Space Reflections". My res is only 1080p though. I am waiting for a nice G-Sync monitor to get to a higher res.

I have also been having a blast playing it. Been a long time since I really enjoyed a single player FPS.
 
It is funny, because the game is one of the best PC (keyboard and mouse) FPSes of recent years, with very good gameplay and gunplay with a mouse. Much better than any previous multi-platform game such as Rage, Wolfenstein 2009, or Quake 4, or any other FPS any developer has made in recent years. The mouse control is great.

And PC gamers are criticizing the graphics ... which are good except for the resolution of the textures. :rolleyes:
 
And PC gamers are criticizing the graphics ... which are good except for the resolution of the textures. :rolleyes:
Isn't that, arguably, the most important aspect of graphics? It doesn't matter how much AA, tessellation, HBAO+, or physics simulation your game has if the textures all look like mud.
 
Isn't that, arguably, the most important aspect of graphics? It doesn't matter how much AA, tessellation, HBAO+, or physics simulation your game has if the textures all look like mud.

The textures are good (many unique textures) ... the problem is the resolution ;) But it's far better resolution than games like Doom 3, Quake 4, or Return to Castle Wolfenstein ...

To me, Return to Castle Wolfenstein from 2001 is a better game than Doom 3, Quake 4, Wolfenstein 2009, Rage, or Crysis 2 and 3 ... because it is an old FPS with better gunplay and gameplay for the mouse.

And The New Order is even better, with good mouse control even at 25/30 fps, classic PC FPS things such as medkits, armor, secrets, and exploration, and new things from modern FPSes such as cover and limited regeneration (a very balanced choice: there are medkits, and we need to explore the world because regeneration is limited). And puzzles, stealth, etc. ...
 
The textures are good (many unique textures) ... the problem is the resolution ;) But it's far better resolution than games like Doom 3, Quake 4, or Return to Castle Wolfenstein ...

To me, Return to Castle Wolfenstein from 2001 is a better game than Doom 3, Quake 4, Wolfenstein 2009, Rage, or Crysis 2 and 3 ... because it is an old FPS with better gunplay and gameplay for the mouse.

And The New Order is even better, with good mouse control even at 25/30 fps, classic PC FPS things such as medkits, armor, secrets, and exploration, and new things from modern FPSes such as cover and limited regeneration (a very balanced choice: there are medkits, and we need to explore the world because regeneration is limited). And puzzles, stealth, etc. ...
It's been said before and I'll say it again: This wasn't a game review. It was a graphics/performance review.
 
It's been said before and I'll say it again: This wasn't a game review. It was a graphics/performance review.

I agree with the review in graphics terms. Underwhelming is my feeling on it; thankfully it has good gameplay to make up for it. If they add Co-Op or DM, it will be very popular.
 
we are

nothing about our test systems has changed

Good grief, I am blind, sorry about that.

Some say they have texture popping, others do not, which I find interesting: what is the difference, the hardware or the user? As in, maybe some users are more keen on seeing it, while for others it just blends in more.
 
Good grief, I am blind, sorry about that.

Some say they have texture popping, others do not, which I find interesting: what is the difference, the hardware or the user? As in, maybe some users are more keen on seeing it, while for others it just blends in more.

This is a tricky (but valuable) question.

It depends on:

1. The detail of the environment
2. The speed of the drive
3. The speed of the memory
4. The amount of VRAM
5. Engine settings
6. The amount of action
7. Personal taste

Some areas are likely to be more problematic than others. For example, the mission headquarters area is full of interesting details and world-building stuff. It's also explored slowly, as you look around the environment for clues and items that push the storyline forward. There, the texture popping is more noticeable AND more likely to be present. It is the worst of all combinations - lots of environment detail, low action, high exploration. Other areas, like some of the bunkers or outdoor areas, are much better - not a ton of environment detail, high action, low exploration (lots of sprinting and/or sniping). Those areas have very little texture pop (even when I'm trying to find it), but even if it were there, you'd hardly notice unless you're extraordinarily sensitive to that kind of thing.

I have very mixed feelings about the MegaTexture approach. Compare the mission HQ in Wolf:TNO with ANY of the indoor areas in BioShock Infinite or BF4 (or even Crysis 3), and the amount of interesting detail and architectural variety in the idTech5 game is just amazing in comparison. You never get the feeling that the level designers were constrained by considering how a surface might need to have a repeating texture tiled across it. Everything looks so ... unique. So "lived in." It's really amazing. But then other times you are next to a wall trying to line up a shot, and the wall has a sign/decal on it that is so low-res and pixelated it just looks trashy. I never get that from Crysis 3.

I wonder if this is simply an optimization issue. Perhaps the tools are not able to compress textures based on how closely they would be viewed, so they just compress everything to the same level regardless of how big each texture would "need" to be, given the player's desire to have things stay sharp at eye level.

Anyway - a controversial graphics engine, to be sure. I almost wish they could mix and match. Have hyper-detailed non-repeating megatextures for some areas, and a more traditional tiled/layered/shader renderer for other areas. There's probably no way to have that look consistent though.
 
I have an i7 Sandy @ 4.8 GHz/1.43 V with dual PWM Delta fans on the heatsink, and when I run around at full speed through the game, it actually heats up the CPU to the point that the Deltas start to ramp up like it is running Prime95/IntelBurnTest (a high-thread benchmark). I have not seen this effect to this extent in a game before. Task Manager shows the game using a good percentage of all 8 (virtual) cores.
 
Anyone know a console command that would skip a level or two? I lost my save game. Thanks.
 
Aw man, I'm getting horrible performance in this game. I have a 4770K, 290X Lightnings in CF, 14.6 betas, and Win 7 64-bit @ 1600p. With all settings maxed out, I struggle to hold a flat 60 fps. Currently, Hyper-Threading is ON - should I turn it off? Disabling CF seemed to result in worse performance. Where is this smoothness that other AMD users are talking about? :confused:
 