Call of Duty: Black Ops II Performance & IQ Review @ [H]

FrgMstr
Call of Duty: Black Ops II Performance & IQ Review - Call of Duty: Black Ops II is the first Call of Duty game on the PC to support DX11 and new graphical features. Hopefully the improvements to the IW Engine will be enough to push the CoD franchise near the top graphics-wise. We also examine NVIDIA's TXAA technology, which combines shader-based antialiasing and traditional multisampling AA.
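For anyone wondering what "combines shader-based antialiasing and traditional multisampling AA" means in practice: conceptually, a hybrid/temporal resolve like this averages the hardware MSAA samples in a shader and then blends the result with last frame's reprojected pixel. The sketch below only illustrates that idea and is not NVIDIA's actual TXAA code; the 4x sample count and the 50/50 history weight are assumptions made for brevity.

Code:
#include <array>

struct Color { float r, g, b; };

// Box-resolve the hardware MSAA samples for one pixel. (TXAA uses a wider,
// custom filter in a pixel shader; a plain average is used here for brevity.)
Color resolveMsaa(const std::array<Color, 4>& samples)
{
    Color sum{0.0f, 0.0f, 0.0f};
    for (const Color& s : samples) {
        sum.r += s.r;
        sum.g += s.g;
        sum.b += s.b;
    }
    return {sum.r * 0.25f, sum.g * 0.25f, sum.b * 0.25f};
}

// Blend the spatially resolved pixel with last frame's reprojected pixel.
// The 0.5f history weight is a made-up constant; real temporal AA also clamps
// or rejects stale history to avoid ghosting, which is omitted here.
Color temporalBlend(const Color& current, const Color& history, float historyWeight = 0.5f)
{
    return {current.r + (history.r - current.r) * historyWeight,
            current.g + (history.g - current.g) * historyWeight,
            current.b + (history.b - current.b) * historyWeight};
}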
 
So are they using a totally different graphics engine, or still using the old one and just making it run in DirectX 11?
 
You forgot to mention in the article that this game still doesn't support Multi-Monitor gaming out of the box. BF3 and MoH support it but Call of Duty with a weaker engine can't? Stupid Treyarch.

Yes, I know you can use the Widescreen Fixer to get it working in Eyefinity/Surround, but you run the risk of getting banned, and if I do get banned this will be the last Call of Duty I ever buy. It might be the last one anyway, but we will see.
 
The Secret World uses TXAA, but I don't know what it looks like as I'm running a 7950.
 
So what's your take on TXAA? IMO it doesn't improve image quality much over FXAA + 4xMSAA.
 
This is about what I expected from the game. Sadly, we can blame an overly extended console life cycle that's forcing devs to scale their games to 6+ year old tech. Sure, there's the argument that they're putting PC gamers on the back burner (and they are), but if they worked in a new engine they'd be cutting out most of the features for the console release, where the majority of their players sit. There's no way to justify that investment (hence the same old tech/engine with a bit more flair).
 
So what's your take on TXAA? IMO it doesn't improve image quality much over FXAA + 4xMSAA.

I've only used it briefly, but for its use in CoD I don't see the advantage. That doesn't mean it won't have benefits in other games; I just haven't tested or looked at it in those other games yet, and there aren't a lot that use it yet. For CoD, though, it's kind of pointless.
 
Very nice review. Call of Duty: Black Ops II proves that developers are more interested in the consoles. The PC version is not given its due importance, so we don't see the kind of graphical breakthroughs on the PC like Far Cry in 2004 or Crysis in late 2007. Hoping that Far Cry 3, Crysis 3 and Metro: Last Light can bring much better graphics to the PC and push the GPUs really hard. Is there a plan to review the PC version of Far Cry 3? :)
 
The conclusion pretty much sums up why I am still okay with a 5850 ;p

Yeah, you really don't need much if you game at 1080p or less. My unlocked 6950 still handles things quite well at 1080p. The only downside to the 58xx series is lower tessellation performance. So far, I'm pleasantly surprised with the graphics in Hitman and Assassin's Creed III. Both games look great on the PC, although I had to lower or disable anti-aliasing to get smooth fps in Hitman.
 
I think TXAA is closer to SSAA (or even better) in terms of image quality; I was really impressed with it. This is clearly the future of AA. It's like playing a CG movie.

Having said that, what I'd really like to see is a driver implementation from AMD that allows custom resolutions higher than what your monitor supports (the GPU would downscale the rendered image to the maximum resolution the monitor supports or, better yet, to one selected by the user); there is a rough sketch of the idea at the end of this post. This would really be useful for all those games in which SSAA does not work (or does not work with software like TriDef 3D). I do not even understand why SSAA exists at all; downsampling would be so much easier for everybody.

I do not know if I am alone in this, but 8xMSAA/FXAA/MLAA does not cut it for me; I much prefer SSAA (or downsampling). It is not about eliminating jaggies anymore, but about rendering an extremely detailed scene, and this "shows" greatly even if your monitor has a lower resolution (it almost looks as if you are gaming at the rendered resolution).

@Brent: This need for SSAA, plus the need to play in 3D (via TriDef), is what drives my urge for a more powerful graphics card already! A single card's performance struggles in these situations, and I do not like all the problems that come with a CrossFire/SLI configuration.
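Since downsampling comes up a couple of times in this thread, here is a minimal sketch of what the downscale step amounts to: render at twice the native width and height, then average each 2x2 block back down to the monitor's resolution. The 2x2 box filter and the plain RGB float framebuffer are assumptions for illustration; a real driver or game would do this on the GPU with a better filter.

Code:
#include <cstddef>
#include <vector>

struct Pixel { float r, g, b; };

// Downscale a framebuffer rendered at 2x the native width/height back to
// native resolution by averaging each 2x2 block (a simple box filter).
std::vector<Pixel> downsample2x(const std::vector<Pixel>& hi,
                                std::size_t nativeW, std::size_t nativeH)
{
    std::vector<Pixel> out(nativeW * nativeH);
    const std::size_t hiW = nativeW * 2;
    for (std::size_t y = 0; y < nativeH; ++y) {
        for (std::size_t x = 0; x < nativeW; ++x) {
            Pixel acc{0.0f, 0.0f, 0.0f};
            for (std::size_t dy = 0; dy < 2; ++dy) {
                for (std::size_t dx = 0; dx < 2; ++dx) {
                    const Pixel& p = hi[(y * 2 + dy) * hiW + (x * 2 + dx)];
                    acc.r += p.r;
                    acc.g += p.g;
                    acc.b += p.b;
                }
            }
            out[y * nativeW + x] = {acc.r * 0.25f, acc.g * 0.25f, acc.b * 0.25f};
        }
    }
    return out;
}

Rendering at 2x in each dimension means shading 4x the pixels, which is exactly why a single card struggles with it, as noted above.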
 
Very nice review. Call of Duty: Black Ops II proves that developers are more interested in the consoles. The PC version is not given its due importance, so we don't see the kind of graphical breakthroughs on the PC like Far Cry in 2004 or Crysis in late 2007. Hoping that Far Cry 3, Crysis 3 and Metro: Last Light can bring much better graphics to the PC and push the GPUs really hard. Is there a plan to review the PC version of Far Cry 3? :)

Having played BO2 on a console, I'm going to disagree...
In this case the devs didn't appear too interested in giving the console versions their due importance either.
 
The conclusion pretty much sums up why I am still okay with a 5850 ;p

+1. Been playing it over the weekend on my 5870 @ 1080p; looks like I'm OK for the moment. :D Would like a 7950 or 670 though. :cool:

PS: lol at the response to chimi :p
 
Thanks for the top-notch review. Looks like another game for my "wait till it's $19.99 or less" list. One good thing: I'm sure it will play well on my old build.
 
Yes, another reason I stayed away from Black Ops 2.

Still rocking Battlefield 3...

I was kinda expecting a Hitman: Absolution gameplay rundown. I guess that will be next. You won't be disappointed; it's a graphically demanding title ;-)
 
So are they using a totally different graphics engine, or still using the old one and just making it run in DirectX 11?

Although you've already been answered: it's clear from the screenshots that this isn't going to win any graphics awards or push any barriers. They slapped lipstick (DX11) on an old pig (a six-year-old engine) and expect us to think it's a beauty queen.

*edit*
... and that could be a Honey Boo Boo reference, if you want it to be.
 
Unfortunately, not everyone lives in the USA and can afford GPU upgrades every now and then.

Fortunately, CoD developers know this and will not push the graphics too far, meaning that people in countries where GPUs are freaking goddamned expensive will get to enjoy the game using 9800GTs and such (which, by the way, are still sold in retail stores around here - that's how expensive GPUs are). No wonder people LOVE consoles.
 
Unfortunately, not everyone lives in the USA and can afford GPU upgrades every now and then.

Fortunately, CoD developers know this and will not push the graphics too far, meaning that people in countries where GPUs are freaking goddamned expensive will get to enjoy the game using 9800GTs and such (which, by the way, are still sold in retail stores around here - that's how expensive GPUs are). No wonder people LOVE consoles.

You can still have a game that runs on a 9800GT (cough, 8800GT) and still pushes the performance of a 7970 or 680, so that excuse doesn't work. I mean, hell, I can play BF3 on my 8800GTs, yet the game still makes the 680 and 7970 cry if you want to crank up all the settings.
 
I used to keep my PC bleeding edge; now, with this economy and a lack of necessity, I only upgrade when necessary, i.e. after frying a video card. My XFX 9800 GTX Black Edition played this game very well cranked pretty high, until the fans on my Accelero cooler died and smoked my card. I dropped in an EVGA 660 Ti, and yes it runs BO better, but not $200 better, and it sure doesn't look $200 better either. The best upgrade I made in a long time is a pair of OCZ Vertex 4 256GB drives.
 
Though this game utilizes DX11, it doesn't use any advanced features we'd like to see like tessellation or more advanced shadowing and lighting. What has happened to these DX11 features being used?

If it doesn't make use of DX11 to its fullest extent, it isn't utilizing DX11, but merely using it.

Anyway, the article's conclusions don't surprise me any, but I'm glad to have my assumptions proven.
 
If it doesn't make use of DX11 to its fullest extent, it isn't utilizing DX11, but merely using it.

Anyway, the article's conclusions don't surprise me any, but I'm glad to have my assumptions proven.

Splitting hairs for sure; hell, WoW 'uses' DX11, and it came from a DX8-based engine that first ran on a GF3/8500. Done right, DX11 should be 'more efficient' than its predecessors, which is a good thing all around.

So sad to see this done to CoD, though; my community was once dominated by CoD players, and now they've all moved on. We don't even have a section for it anymore; it's all BF or CS:S/GO for the shooters.
 
If it doesn't make use of DX11 to its fullest extent, it isn't utilizing DX11, but merely using it.

Anyway, the article's conclusions don't surprise me any, but I'm glad to have my assumptions proven.

There are two ways to "use" DX11:

You can simply enable and run under the DX11 code path and take advantage of DX11's better efficiencies to improve performance. This provides more performance, but no change in graphics.

Or, you can take that added performance and add unique DX11 graphics effects to make your game look better, such as tessellation, global illumination, advanced HDAO or HBAO, and DirectCompute-accelerated effects like depth of field and motion blur.

CoD chose option 1: simply running under the DX11 code path and doing nothing more with it.
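For what it's worth, "option 1" really is about as simple as asking for the DX11 runtime and feature level when the device is created; the frame looks identical until the renderer also adds DX11-only work such as hull/domain shaders for tessellation or DirectCompute post-processing. A minimal sketch (Windows/C++, error handling trimmed; this is an illustration, not the game's code):

Code:
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Create a hardware device on the DX11 code path. This alone gives you the
// API's efficiency improvements ("option 1"); the image only changes once the
// renderer also submits DX11-specific work ("option 2").
bool createDx11Device(ID3D11Device** device, ID3D11DeviceContext** context)
{
    const D3D_FEATURE_LEVEL requested = D3D_FEATURE_LEVEL_11_0;
    D3D_FEATURE_LEVEL obtained = {};

    HRESULT hr = D3D11CreateDevice(
        nullptr,                    // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr,                    // no software rasterizer module
        0,                          // no creation flags
        &requested, 1,              // feature levels to try
        D3D11_SDK_VERSION,
        device, &obtained, context);

    return SUCCEEDED(hr) && obtained == D3D_FEATURE_LEVEL_11_0;
}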
 
At least we're seemingly at the point where DX10/11 titles don't just mysteriously run WORSE and look the same, which you can easily do if you don't give a shit at all about your port job. There were lots of changes under the hood with the DX10 clean break: a real paradigm shift that you can definitely use to shoot yourself in the foot if you pay no heed.
 
Other sites have shown virtually no difference between the two for gaming

Thank you, I've seen those. I should have said: in this particular game, and specifically whether there is any difference in feel between the two.
 
Thank you, I've seen those. I should have said: in this particular game, and specifically whether there is any difference in feel between the two.

Testing this particular game would be pointless. A seven-year-old engine that's learned a few new tricks is still a seven-year-old engine.
 
Kyle, in your careful review of polygon edges in the graphics, did you or anybody else happen to come across a Von-Schweetz cameo? If not, then I see no reason to try out this game.
 
The conclusion pretty much sums up why I am still okay with a 5850 ;p

Thanks for the top-notch review. Looks like another game for my "wait till it's $19.99 or less" list. One good thing: I'm sure it will play well on my old build.

Steam sale for sure. I picked up Max Payne 3 for 14 bucks, and it runs silky smooth on my 5870. I have tessellation on with all the bells and whistles. I just hope GTA 5 runs as well.
I was trolled on a different site for saying that today's games don't need a new $500 video card for gaming at 1080p, and that upgrading wouldn't be necessary until the new consoles come out. The kiddies came out in droves.
 
You forgot to mention in the article that this game still doesn't support Multi-Monitor gaming out of the box. BF3 and MoH support it but Call of Duty with a weaker engine can't? Stupid Treyarch.

Yes, I know you can use the Widescreen Fixer to get it working in Eyefinity/Surround, but you run the risk of getting banned, and if I do get banned this will be the last Call of Duty I ever buy. It might be the last one anyway, but we will see.

I had no idea that was the case. Thanks for pointing this out! I was thinking of getting BO2, but I still love BF3 as far as FPS games go... the vehicles are what do it.
 
I had no idea that was the case. Thanks for pointing this out! I was thinking of getting BO2, but I still love BF3 as far as FPS games go... the vehicles are what do it.

The vehicles, the destruction, and the maps to accommodate them. CoD can stay on consoles :).
 
I have been playing Assassin's Creed III, which has TXAA. The game runs very nicely and looks great at 2560x1600. Yes, it is a console port and is best played with an Xbox controller. But this is also the first game that has caused my 680's fan to spin up, as I pretty much have it running at default.
 
Yeesh, with those framerates you should have tried testing it at triple-monitor resolutions, you know, to actually stress something on your test PC.

Might as well be playing Pong.
 