Max Payne 3 Performance and IQ Review @ [H]

FrgMstr (Just Plain Mean)
Staff member, joined May 18, 1997, 55,534 messages
Max Payne 3 Performance and IQ Review - Complete performance and image quality coverage in Max Payne 3! GPU performance, AA performance, and feature performance. FXAA vs. MSAA, tessellation, and high quality textures. If you are interested in this game, we give you a good look at what you need to play it, and how to get the best gameplay experience. Bullet Time is back!
 
Great write-up as usual. I have been having a blast with this game, and the graphics options are great to see. The fluid enemy animations are top notch as well. Just very well done all around, way to go Rockstar!!
 
It was a pleasure reading the review, and I hope other developers take note and go that extra mile for PC gamers.
 
So much for "bu bu bu bu 2 GB won't be enough for triple display"; the haters got owned.
 
Why no tests with FXAA + MSAA?

 
Great review.:D Must have taken a bunch of time to get this done.:eek:
I really enjoy this game.
I have played it all the way through on a CF 7970 setup and I am about 2/3 through on SLI GTX 670s, both systems at 5760x1200.

I've used the same settings for both systems, all maximum with SSAO, and both systems are nearly identical in FRAPS and in the overall gameplay experience.
So I do disagree a little bit with your conclusion: the 670 in SLI will give you a great gaming experience, as good as or sometimes better than the HD 7970.

I'm sure GTX 680 SLI would be awesome as well.

One question: did you experience any graphical oddities in the cut scenes at any time? I occasionally see grainy textures, flashing colors, weird lines... but not in game. I saw this on both systems, maybe a little more on the AMD system.
 
Really can't wait to get back home in the next few days to play this game. Figures a good game finally comes out when I decide to go on a vacation, and not when I'm sitting bored at home.
 
On the FXAA vs MSAA bias issue, as it applies to this game, I stand corrected.

Kyle, since all of these cards were tested at stock, did it seem like the Nvidia cards generally utilized most or all of the extra boost speed, or did they tend to remain near their base clock speeds?
 
On the FXAA vs MSAA bias issue, as it applies to this game, I stand corrected.

Kyle, since all of these cards were tested at stock, did it seem like the Nvidia cards generally utilized most or all of the extra boost speed, or did they tend to remain near their base clock speeds?

At stock, all cards have the same max boost right?
 
Thank you for doing this review.

I currently own a pair of Gigabyte GTX 670s and had been thinking about returning them to get the 4 GB version. But this review changed my mind; I will just save the money and stay with the 2 GB version instead, as I am only using three 22" displays (5240*1050), not even the three 1080p monitors this review used.
 
Good write-up and a good game. Not quite as good as the first two games, but still very good. I also wish they had given Max a more hefty health bar. I get that they were trying to balance the cover mechanic, but making Max more fragile hurts the bullet time mechanic a lot. Unlike in Max Payne 1 and 2, I can't dive head first into a room in slow-mo, mow down half the crowd, and recover; by the time Max is back on his feet, he's dead.
 
Great review.:D Must have taken a bunch of time to get this done.:eek:
I really enjoy this game.
I have played it all the way through on a CF 7970 setup and I am about 2/3 through on SLI GTX 670s, both systems at 5760x1200.

I've used the same settings for both systems, all maximum with SSAO, and both systems are nearly identical in FRAPS and in the overall gameplay experience.
So I do disagree a little bit with your conclusion: the 670 in SLI will give you a great gaming experience, as good as or sometimes better than the HD 7970.

I'm sure GTX 680 SLI would be awesome as well.

One question: did you experience any graphical oddities in the cut scenes at any time? I occasionally see grainy textures, flashing colors, weird lines... but not in game. I saw this on both systems, maybe a little more on the AMD system.

Ahhh but did you use the same drivers for your 7970s as per this review?
 
At stock, all cards have the same max boost right?

This is not what I was asking about. A given Nvidia card could have a max boost of 1,000,001 THz, but the software and hardware on the Nvidia chip/card determine how much of that boost is actually used at any given moment.

If not much boost is being used, then the game is computationally/thermally exhausting. Conversely, if the Nvidia card(s) are generally using all the available boost on the reference card, then we know the processor load is not too bad. This sort of information will help team green overclockers dial in the most likely best settings to squeeze out as many frames or image quality enhancers as possible.
 
If only we could get the PC team behind this one to drop one final GTA4 PC patch.
 
Happy to see Rockstar didn't give us a crappy console port! Even happier to see that AMD didn't drop the ball (sort of, with the 12.6 Beta CAP) with Xfire support for a big release!
 
"When you set MSAA settings the game tells you how much VRAM is required compared to what you have available. If it exceeds that amount you cannot enable that setting, it locks you out of it."

Is it possible to verify that the amount of VRAM the game tells you it will use is, in fact, how much it uses?
 
Nice review.

Just for giggles, no 670 SLI or 7870 CrossFire/7950?

Seems not many reviewers are doing 7870 CrossFire reviews. They have come down in price, with some available for around $309-350.

So for a few dollars more you can have a 7870 CrossFire setup vs. a single 680 or 7970.

Heck, a 7850 CrossFire setup can be had for the same price as the 680/7970.
 
Kyle
A very good review. Also one thing which would give even more insight into game performance is overclocked performance on these cards. I am thinking highest playable settings would be better when we overclock the cards. People who have custom HD 7970 at 1200+ Mhz would be able to definitely get close to 60 fps with MSAA 2x and stay above 30 fps for min fps . Similarly for GTX 680.

2560 X 1600 2X MSAA + FXAA
GTX 680 - avg 46.9 min 26
HD 7970 - avg 47.6 min 25

You have started including maximum playable settings when overclocked in your custom card reviews. It would be great if that were made a standard feature across all reviews. just an opinion. :)
 
Kyle,
A very good review. One thing which would give even more insight into game performance is overclocked performance on these cards. I am thinking highest playable settings would be better when the cards are overclocked. People who have a custom HD 7970 at 1200+ MHz should be able to get close to 60 fps with 2X MSAA and stay above 30 fps minimum. Similarly for the GTX 680.

2560x1600, 2X MSAA + FXAA
GTX 680 - avg 46.9, min 26
HD 7970 - avg 47.6, min 25

You have started including maximum playable settings when overclocked in your custom card reviews. It would be great if that were made a standard feature across all reviews. Just an opinion. :)

Since when has overclocking improved performance? What a novel idea! :p Just ribbin' ya!

Seriously, what you are asking for basically DOUBLES the workload to produce the article. We have addressed overclocking and scaling with new GPUs, and I do not think it would be any different with this game, as we saw very good scaling. We could run so many different variables on this game and hardware alone that it would easily take a year of data testing. Keep in mind that we used a full 10 minutes of playtime, not a 90-second cut scene as other sites have done. If you want quality, you have to sacrifice other avenues of coverage. And quite simply put, there are a plethora of sites that cover the easy stuff. :)
 
A couple of things I'm curious about:
1. Renderer differences. Any performance/visual differences when they're all at the same settings?
2. A little more in-depth look at the AO settings. What was the visual/performance difference between the settings? It's mentioned at the start that HDAO is surprisingly cheap, but how major is the visual jump?

Anyway, I ran the game on a GTX 260 in DX10 @ 1680x1050 with settings as high as it would let me go (besides AA, where I just used FXAA) and it was smooth the whole way through. RAGE is definitely in better form than it was for GTA 4; someone pretty obviously cared about the PC this time around.
 
There's another way to estimate overclocked performance: just add a static percentage to the GPU's results. If, for example, you are wondering how a 7970 at 1110/1650 will perform: the increase in both core and memory is 20%, so add 20% to the average FPS the card got in the review. In this example, if the 7970 gets 50 FPS, you could estimate it would get 60 FPS with that overclock.

Of course this isn't foolproof, in the sense that different games benefit more or less from core/memory overclocks, and some games don't respond as well to increased clocks in general. You could try to adjust for this in some way too.

But take this whole thing with a grain of salt. Just my ideas. ;)
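As a minimal sketch of that estimate in Python (the numbers match the example above: a reference 7970 runs a 925 MHz core at stock, so 1110 MHz is a 20% overclock; the linear-scaling assumption is the poster's rough rule of thumb, not a measured result):

```python
# Estimate overclocked FPS under a simple linear-scaling assumption.
# Numbers are illustrative, matching the 7970 example above.

def estimated_fps(stock_fps, stock_clock_mhz, oc_clock_mhz):
    """Scale the stock frame rate by the relative clock increase."""
    return stock_fps * (oc_clock_mhz / stock_clock_mhz)

# 50 FPS at the stock 925 MHz core, overclocked to 1110 MHz (+20%):
print(estimated_fps(50.0, 925.0, 1110.0))  # 60.0
```

As the post notes, real games rarely scale perfectly with clocks, so treat this as an upper-bound guess rather than a prediction.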
 
I like the gameplay, but Max constantly moping around gets old pretty quick. Yes, your life sucks, so drink yourself to death, and hurry up with it.
 
Why no tests with FXAA + MSAA?

We did one test with FXAA + 2X MSAA at 5760x1200 with SLI and CFX.

Otherwise, it is kind of pointless. Given how great a performance hit MSAA takes, it doesn't make sense to use MSAA on a single card. Therefore, adding in FXAA and taking it down even a few more percentage points in performance is just pointless to test. Take the percentage hit you saw from MSAA, add the percentage FXAA costs, and there you go: FXAA+MSAA results.
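To make that back-of-the-envelope arithmetic concrete, here is a quick sketch (the percentages are hypothetical, not figures from the review, and compounding the two hits multiplicatively is a slightly more careful version of simply adding them):

```python
# Estimate the frame rate left after stacking several performance hits.
# The percentage values are hypothetical, for illustration only.

def fps_after_hits(base_fps, *hit_percents):
    """Apply each percentage hit in turn to the base frame rate."""
    fps = base_fps
    for hit in hit_percents:
        fps *= 1.0 - hit / 100.0
    return fps

# e.g. a hypothetical 40% hit from MSAA plus a further 5% from FXAA,
# starting from a 60 FPS no-AA baseline:
print(round(fps_after_hits(60.0, 40.0, 5.0), 1))  # 34.2
```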

Great review.:D Must have taken a bunch of time to get this done.:eek:
I really enjoy this game.
I have played it all the way through on a CF 7970 setup and I am about 2/3 through on SLI GTX 670s, both systems at 5760x1200.

I've used the same settings for both systems, all maximum with SSAO, and both systems are nearly identical in FRAPS and in the overall gameplay experience.
So I do disagree a little bit with your conclusion: the 670 in SLI will give you a great gaming experience, as good as or sometimes better than the HD 7970.

I'm sure GTX 680 SLI would be awesome as well.

One question: did you experience any graphical oddities in the cut scenes at any time? I occasionally see grainy textures, flashing colors, weird lines... but not in game. I saw this on both systems, maybe a little more on the AMD system.

I did not, but the game has a lot of flashy graphics, with colors flashing and so forth. I didn't see any grainy textures. The game has a fair bit of what I call "fluff" graphics, to give it a cinematic feel.

did you try to run triple monitor on a single gpu?

While not timed, I did load up the level on a single card just to see if it would be playable or not. Hence the comment in the review that it was borderline playable; pretty much, if you want to play on triple displays you will need dual GPUs.

On the FXAA vs MSAA bias issue, as it applies to this game, I stand corrected.

Kyle, since all of these cards were tested at stock, did it seem like the Nvidia cards generally utilized most or all of the extra boost speed, or did they tend to remain near their base clock speeds?

At stock, all cards have the same max boost right?

All cards, in every game I've tested so far, exceed the GPU Boost clock and achieve what we call the TOP GPU Boost clock speed. I haven't encountered a game yet where the cards stay close to their base clock, or even the GPU Boost speed; they are always higher.
 
If not much boost is being used, then the game is computationally/thermally exhausting. Conversely, if the Nvidia card(s) are generally using all the available boost on the reference card, then we know the processor load is not too bad. This sort of information will help team green overclockers dial in the most likely best settings to squeeze out as many frames or image quality enhancers as possible.

Generally speaking, that's not the way boost works. As long as the card is under the power and temp targets, it will boost to its maximum frequency (unless it is a very light load). The card won't boost to different frequencies based on game complexity or anything like that. For example, my card boosts to 1275, and as long as I am under 132% power and 70C it will always be at that speed in any game that puts any kind of load on the card (not cutscenes, for example). The card doesn't flip around between 1115, 1200, 1235, 1066, etc. during games.
 
"When you set MSAA settings the game tells you how much VRAM is required compared to what you have available. If it exceeds that amount you cannot enable that setting, it locks you out of it."

Is it possible to verify that the amount of VRAM the game tells you it will use is, in fact, how much it uses?

Actually, yes: you can use a third-party utility that displays an OSD and can show video card RAM in use. Afterburner does this, EVGA Precision X does this, and maybe a couple of others.

Nice review.

Just for giggles, no 670 SLI or 7870 CrossFire/7950?

Seems not many reviewers are doing 7870 CrossFire reviews. They have come down in price, with some available for around $309-350.

So for a few dollars more you can have a 7870 CrossFire setup vs. a single 680 or 7970.

Heck, a 7850 CrossFire setup can be had for the same price as the 680/7970.

The price of the 7870 is so out of whack currently, it really needs to drop. You can get a 7950 for the same price as a 7870, so why even do 7870 CFX? 7950 CFX would be more appropriate. But of course, adding in more cards and comparisons = more time, and this already took over a week to complete.

Kyle,
A very good review. One thing which would give even more insight into game performance is overclocked performance on these cards. I am thinking highest playable settings would be better when the cards are overclocked. People who have a custom HD 7970 at 1200+ MHz should be able to get close to 60 fps with 2X MSAA and stay above 30 fps minimum. Similarly for the GTX 680.

2560x1600, 2X MSAA + FXAA
GTX 680 - avg 46.9, min 26
HD 7970 - avg 47.6, min 25

You have started including maximum playable settings when overclocked in your custom card reviews. It would be great if that were made a standard feature across all reviews. Just an opinion. :)

Kyle said it well. I'll also add: what other settings would we improve upon? The game was already maxed out all the way down to the 7870, so there are no higher in-game settings to set. Therefore, overclocking will give you more performance, but the same visual experience.
 
A couple of things I'm curious about:
1. Renderer differences. Any performance/visual differences when they're all at the same settings?
2. A little more in-depth look at the AO settings. What was the visual/performance difference between the settings? It's mentioned at the start that HDAO is surprisingly cheap, but how major is the visual jump?

Anyway, I ran the game on a GTX 260 in DX10 @ 1680x1050 with settings as high as it would let me go (besides AA, where I just used FXAA) and it was smooth the whole way through. RAGE is definitely in better form than it was for GTA 4; someone pretty obviously cared about the PC this time around.

1.) Not sure what you are asking. The game looked identical between AMD and NV in DX11. I haven't tested DX10 or DX9, but yes, those would change the visuals, because some features aren't supported at those lower API levels, like tessellation for example.

2.) The performance difference was under 10% between HDAO and SSAO, so there is no reason not to run HDAO. It doesn't take as big a hit as, say, HBAO in Battlefield 3. HDAO was visually better, if you know what to look for. AO, as always, is a subtle thing: you have to take in the scene as a whole, and your brain just knows it looks better.

I love how great the textures look in this game. RAGE was supposed to have these awesome streaming textures but failed miserably in texture quality. Seriously, Rockstar showed that high-resolution textures can be done without a massive performance hit, and no MegaTexture streaming required.
 
1.) Not sure what you are asking. The game looked identical between AMD and NV in DX11. I haven't tested DX10 or DX9, but yes, those would change the visuals, because some features aren't supported at those lower API levels, like tessellation for example.

2.) The performance difference was under 10% between HDAO and SSAO, so there is no reason not to run HDAO. It doesn't take as big a hit as, say, HBAO in Battlefield 3. HDAO was visually better, if you know what to look for. AO, as always, is a subtle thing: you have to take in the scene as a whole, and your brain just knows it looks better.

I love how great the textures look in this game. RAGE was supposed to have these awesome streaming textures but failed miserably in texture quality. Seriously, Rockstar showed that high-resolution textures can be done without a massive performance hit, and no MegaTexture streaming required.

I guess HDAO is a full-res version of the effect rather than the typical half-res SSAO, so it's probably done in a compute shader. Really nice that it's so fast. I'm still stuck on a DX10 card though, so I can't enable it for myself. :(

RAGE had a pretty good idea, but it didn't work too well in practice; I guess they were too worried about the potentially enormous amount of disk space it would take. Ignoring space, the tech itself has neat potential: http://www.youtube.com/watch?v=p49vOX-_LyU
 
RAGE had a pretty good idea, but it didn't work too well in practice; I guess they were too worried about the potentially enormous amount of disk space it would take. Ignoring space, the tech itself has neat potential: http://www.youtube.com/watch?v=p49vOX-_LyU

That, along with the inconsistency of the visual quality. The game looked amazing in some parts, but then you'd come across a texture that belonged on a PlayStation 2.

They also seemed to forget to make a game with good gameplay (which isn't the engine's fault, of course).
 
It seems like the small delay for the PC version has paid off. Again, the visuals are great, but you can certainly tell that the true capabilities of the game's engine are being held back, especially in the levels of detail.

It looks great, but I think it could have looked even better if it hadn't been developed simultaneously alongside the console counterparts.
 
Generally speaking, that's not the way boost works. As long as the card is under the power and temp targets, it will boost to its maximum frequency (unless it is a very light load). The card won't boost to different frequencies based on game complexity or anything like that. For example, my card boosts to 1275, and as long as I am under 132% power and 70C it will always be at that speed in any game that puts any kind of load on the card (not cutscenes, for example). The card doesn't flip around between 1115, 1200, 1235, 1066, etc. during games.

I only meant within a few dozen MHz. As an example, Brent had a 14 MHz range when testing the maximum manual boost frequency in BF3. You are taking the variability to as much of a hyperbolic extreme as I was when indicating that the max potential boost could be set pointlessly high. I understand that the actual frequency is based on thermal load; however (and this applies to a GTX 580 or anything from team red), if I run Civ 5 with VSync off and a fixed fan speed, I will get a peak temperature X for a given clock speed. If I run Batman, that peak temperature will be slightly higher than X. If I run a benchmarking suite, the temperature will be higher still. Same level of GPU usage (outside of cut scenes), yet different temperatures based on how the chip/card is stressed.

If Max Payne 3 stressed the 680 in a very bad way, then the card (when set to the factory automatic fan profile with no power adjustment) could conceivably hit the thermal choke threshold at a GPU frequency closer to the base clock rather than at, or above, the stock boost profile. Fortunately, this is not a concern for this game, as Brent answered my question with, "[a]ll cards, in every game I've tested so far, exceed the GPU Boost clock and achieve what we call TOP GPU Boost clock speed." More fortunately, this will probably not be a concern unless there is a major change in game engines toward something this architecture is only marginally suited for, like ATI 58xx cards and heavy tessellation.
 
Damn, anybody notice AMD is now getting close to or even beating Nvidia, even in games Nvidia paid for (TWIMTBP)? That's a little sad.
 
Great article, but I'm a little disappointed at how poor MSAA looks in this game. As you can see in my lossless PNG comparison *HERE*, MSAA barely offers any anti-aliasing at all compared with FXAA.

What is even more puzzling, at least on my setup (the issue may be specific to my PC), is that 4xMSAA + FXAA=Very High actually looks WORSE than 0xMSAA + FXAA=Very High! Now, my understanding is that MSAA is applied during rendering and FXAA is a full-screen post-process effect applied to the image AFTER it has been rendered, so why does combining FXAA with MSAA appear to break FXAA? At the very least, even with MSAA being as poor as it is, FXAA + MSAA should look the same as FXAA on its own, surely? On my PC it doesn't: FXAA + MSAA appears identical to MSAA with *no* FXAA. That can't be right... can it? :confused:

Has anyone else noticed this? I contacted Rockstar about this issue, but so far they have failed to understand what I'm trying to tell them, even requesting images and a DXDiag file when I'd already given them both!
 
P.S. I should add that Max Payne 3 is a great game. Apart from the embarrassing custom install issue, which caused the game not to load if there were spaces in the pathname (thankfully fixed in the patch released within a week of the game's launch), and the apparently broken MSAA thing, this is otherwise an outstanding example of how all PC games should be made. The amount of customisation on offer, and the fact that the graphics options are *gasp* WITHIN the game, with it restarting itself when needed, makes it very easy to play around with different settings. Even Batman: Arkham City, The Witcher 2 and Skyrim, terrific as they all are, had the audacity to put their graphics options outside the game, making testing different settings a needless chore. Why PC developers do that, I really don't know.
 
"When you set MSAA settings the game tells you how much VRAM is required compared to what you have available. If it exceeds that amount you cannot enable that setting, it locks you out of it."

Is it possible to verify that the amount of VRAM the game tells you it will use is, in fact, how much it uses?

Since the patch, the retail and non-Steam versions now work with MSI Afterburner's and EVGA Precision X's OSD, and the VRAM usage appears correct from my testing: with 0xMSAA + FXAA=Very High, the menu shows around 1.9 GB of memory required out of 4 GB available (the VRAM is doubled because I'm using SLI). In game I see around 1 GB per GPU on those settings, and when I briefly ran the game with 4xMSAA + FXAA=Very High (before I turned MSAA off when I saw it wasn't working correctly), I saw around 2 GB per GPU. So it looks like the VRAM estimates are correct.
 
Dear [H] - thanks for another great review

Dear Games Industry - No one has ever gotten $60 out of me for a game until now. Thank you Rockstar for actually making a great PC game.

Dear Comcast - Sorry for the 35 GB download, I'll be done soon.
 
2. A little more in-depth look at the AO settings. What was the visual/performance difference between the settings? It's mentioned at the start that HDAO is surprisingly cheap, but how major is the visual jump?

I saw one post on the Guru3D forum (I think) that claimed SSAO looked better than HDAO, and sure enough, when I tested it I found that it did. HDAO has very thin, faint shadowing, whereas SSAO looks bolder and more obvious. Maybe HDAO is more realistic in terms of ambient occlusion, but SSAO looks more visually pleasing IMO, plus it has a lower performance hit.
 
Damn, anybody notice AMD is now getting close to or even beating Nvidia, even in games Nvidia paid for (TWIMTBP)? That's a little sad.

Erm... why is that? NVIDIA cards perform better than AMD cards in DiRT 2 and 3, both AMD Gaming Evolved games, but that doesn't mean it's sad, just that their drivers are better optimised for those games. :confused:
 