PC Specs for Metro: Last Light

HardOCP News
Publisher Deep Silver and developer 4A Games today announced the official minimum, recommended and optimum PC specs for the upcoming first-person shooter Metro: Last Light. The highly anticipated sequel to Metro 2033 will be available from May 14, 2013 in North America and May 17, 2013 across Europe for PlayStation®3, the Xbox 360® video game and entertainment system from Microsoft, and Windows PC.

Metro: Last Light is powered by 4A’s proprietary technology, the 4A Engine. Designed to take full advantage of the latest, most powerful PC hardware and DirectX 11 graphics cards, the 4A Engine also supports a host of bespoke NVIDIA® features including NVIDIA PhysX® and NVIDIA 3D Vision®.


Official Metro: Last Light PC specifications:

Minimum
  • Windows: XP (32-Bit), Vista, 7 or 8
  • CPU: 2.2 GHz Dual Core e.g. Intel Core 2 Duo
  • RAM: 2GB
  • DirectX: 9.0c
  • Graphics Card: DirectX 9, Shader Model 3 compliant e.g. NVIDIA GTS 250 (or AMD equivalent e.g. Radeon HD 4000 series) or higher

For 3D Vision Support:
  • NVIDIA GTX 275 or higher
  • 120Hz Monitor
  • NVIDIA 3D Vision kit for Windows Vista, 7 or 8

Recommended
  • Windows: Vista, 7 or 8
  • CPU: 2.6 GHz Quad Core e.g. Intel Core i5
  • RAM: 4GB
  • DirectX: 11
  • Graphics Card: NVIDIA GTX 580/660 Ti (or AMD equivalent e.g. Radeon HD 7870) or higher

For 3D Vision Support:
  • NVIDIA GTX 580/660 Ti or higher
  • 120Hz Monitor
  • NVIDIA 3D Vision kit for Windows Vista, 7 or 8

Optimum
  • Windows: Vista, 7 or 8
  • CPU: 3.4 GHz Multi-Core e.g. Intel Core i7
  • RAM: 8GB
  • DirectX: 11
  • Graphics Card: NVIDIA GTX 690 / NVIDIA Titan

For 3D Vision Support:
  • NVIDIA GTX 690
  • 120Hz Monitor
  • NVIDIA 3D Vision kit for Windows Vista, 7 or 8
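
For anyone who wants to sanity-check their own machine against these tiers, here is a rough, unofficial C++ sketch using stock Windows APIs. It only covers the RAM and core-count lines above; the tier names are taken from the list, and GPU and clock speed would need separate checks:

```cpp
// spec_tier.cpp -- rough, unofficial CPU/RAM tier check against the
// Metro: Last Light spec list above. Build (MSVC): cl /EHsc spec_tier.cpp
#include <windows.h>
#include <iostream>

int main() {
    // Physical RAM in kilobytes (this API needs Vista SP1 or later;
    // the XP machines in the Minimum tier predate it).
    ULONGLONG ramKB = 0;
    if (!GetPhysicallyInstalledSystemMemory(&ramKB)) {
        std::cout << "Could not query installed RAM.\n";
        return 1;
    }
    const ULONGLONG ramGB = ramKB / (1024ULL * 1024ULL);

    // Logical processor count, used here as a stand-in for core count.
    SYSTEM_INFO si;
    GetSystemInfo(&si);
    const DWORD cores = si.dwNumberOfProcessors;

    // Thresholds transcribed from the official list:
    // 2GB/dual core, 4GB/quad core, 8GB/multi-core.
    const char* tier = "Below Minimum";
    if (ramGB >= 8 && cores >= 4)      tier = "Optimum";
    else if (ramGB >= 4 && cores >= 4) tier = "Recommended";
    else if (ramGB >= 2 && cores >= 2) tier = "Minimum";

    std::cout << "RAM: " << ramGB << " GB, logical cores: " << cores
              << " -> " << tier
              << " (CPU/RAM only; GPU and clock speed not checked)\n";
    return 0;
}
```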
 
They just went straight to Titan, huh? Haha. Get a Titan for Optimum gameplay. Nice.
 
Well, that's about what I expected. Support for those clinging to XP, but scalability up to those of us in the self-labeled "enthusiast" crowd.
 
Sooo... basically "Optimum" means "get the best thing out there"? Why even list it...
 
I expected this, though I was hoping my 560 Ti SLI would last me a bit longer. Unfortunately, I think the 1GB is not going to be enough to enable much eye candy.
 
I'd like to see this creak along on the minimum spec... lol
 
Under Optimum, how come there is no "(or AMD equivalent...)" posted beside the NVIDIA GTX 690 / NVIDIA Titan? :D
You'd think the 7990 would be OK, but I guess since the official launch hasn't occurred yet, it just missed out.
 
I expected this, though I was hoping my 560 Ti SLI would last me a bit longer. Unfortunately, I think the 1GB is not going to be enough to enable much eye candy.

This is exactly why I went with the 2GB variant of my 5870s when I bought them in 2010. I knew that 1GB wasn't going to be enough for 1080p+ resolutions (I play at 1200p) with the graphics turned up.

All my PC buddies mocked me back then with "you only need 1GB, man, don't be a noob," but here I am. lol
 
It isn't helping PC gamers make their case against the console crowd when the marketing team just decides to throw out the most expensive enthusiast card out there, as if somehow we're dumb enough to buy into their bullshit.
 
It isn't helping PC gamers make their case against the console crowd when the marketing team just decides to throw out the most expensive enthusiast card out there, as if somehow we're dumb enough to buy into their bullshit.

Given the way 2033 ran on top-of-the-line graphics cards, even to this day, I'd say the Optimum specs are quite fitting. If you want the best possible performance on the market, well, there it is.
 
This is exactly why I went with the 2GB variant of my 5870s when I bought them in 2010. I knew that 1GB wasn't going to be enough for 1080p+ resolutions (I play at 1200p) with the graphics turned up.

All my PC buddies mocked me back then with "you only need 1GB, man, don't be a noob," but here I am. lol

Yeah, I was an early adopter; there were no 2GB versions of the 560 Ti when I purchased my first one. I was on the fence at the time; part of me wanted to wait because, like you, I thought 1GB might not be enough for 1080p down the road.
 
Given the way 2033 ran on top-of-the-line graphics cards, even to this day, I'd say the Optimum specs are quite fitting. If you want the best possible performance on the market, well, there it is.

And if the Radeon 7990 were available, it'd probably be under Optimum as well.

They should rename Minimum to "Don't make me laugh, I don't think it'll run well." :D

And if it listed Intel HD Graphics, that tier should be called "That's not a GPU."
 
If they can get the game to run on a PS3, it will surely run on anything somewhat modern with enough tweaking.
 
No wonder PC sales are shit these days.

You can still play today's games on super old PCs.
 
What was the game that was thoroughly and completely trashed by Nvidia owners for being optimized for AMD, with them saying that no game should be optimized like that?

Any chance they still feel the same?

I for one am sick and tired of developers caring about one and only one damned GPU vendor. It means they gimped portions of the game to ensure Nvidia comes out on top.
 
I hope it looks really nice on max settings. From the preview videos so far, it doesn't seem there will be any "wow" factor.
 
No wonder PC sales are shit these days.

You can still play today's games on super old PCs.

It's just as Brent or Steve said in another thread, and I agree: I can't wait for the day that DirectX 11 becomes the baseline for games.

As you can see, for the majority of PC games the baseline has been DirectX 9.0c. There is no pushing of PC hardware, no innovation, no major use of the DX11 or even DX10 feature set. Developers are sticking to DX9, not just for older PC hardware, but because of console ports. At the same time, it makes a lot of developers, in my opinion, lazy and ignorant of PC gamers, as well as assholes for taking the cheap route to PC gaming.

I bet their thinking is: "If the game can run on the 360, then it'll run on the PC; let's not push it any further. It saves us money."

Not only have the current 360 and PS3 ruined and limited PC gaming, but the GPUs in those consoles cannot support the newer extensions and features found in DX11 or higher versions of OpenGL (going by the PS3 games). Sticking to their level is, in a way, meant to ease the development process and porting from consoles to PC, or vice versa.

I'm hoping that will change with the 720 and PS4 release, and that we get something like the following for PC games: a minimum of 2 or 4 cores at 2.0GHz, a Radeon 5000/Nvidia 400 series card at minimum, and DirectX 11 capability (or OpenGL 4.0 compatibility). A quick sketch of what that kind of baseline check could look like is below.
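
To make that DX11-baseline idea concrete, here's a minimal C++ sketch (my own illustration, not anything from 4A) that asks the stock D3D11CreateDevice call for the highest Direct3D feature level the installed GPU supports; a DX11-baseline game could run this at startup and bail out below 11_0:

```cpp
// featurelevel_probe.cpp -- report the highest Direct3D feature level the
// default adapter supports. Build (MSVC): cl featurelevel_probe.cpp d3d11.lib
#include <windows.h>
#include <d3d11.h>
#include <cstdio>

int main() {
    // Ask for 11.0 first, then fall back toward the DX9-class levels.
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3,
    };
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;

    // Passing null device/context pointers is the documented way to
    // query the feature level without actually creating a device.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        wanted, sizeof(wanted) / sizeof(wanted[0]),
        D3D11_SDK_VERSION, nullptr, &got, nullptr);

    if (FAILED(hr)) {
        std::printf("No hardware D3D device at any requested level.\n");
        return 1;
    }
    std::printf("Highest supported feature level: 0x%04X (%s)\n",
                static_cast<unsigned>(got),
                got >= D3D_FEATURE_LEVEL_11_0 ? "DX11-class"
                                              : "DX9/10-class");
    return 0;
}
```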
 
Microsoft should end DX9 support in Windows; that would get developers to finally get off their asses and utilize the better features of the later versions a bit more.
 
Sweet. Another 5-8 hour linear FPS... can't wait! I hope it runs and looks as awesome as the first one......
 
Awesome! I have $1000 for a Titan lying around. Oh wait, I used it to pay off my car sooner...

Shux! I guess I won't play it (until it's $5 on Steam).
 
If this game is as bad as Crysis 3 you can count me out. I didn't even pay for my copy of that. Generic shooters are boring.
 
If this game is as bad as Crysis 3 you can count me out. I didn't even pay for my copy of that. Generic shooters are boring.

I don't think it will be that bad. They really fucked up with that game.
 
There is a thread in the Video Card Forums talking about how Nvidia announced you would need a Titan to "max out" Metro. This thread here is much more... even-tempered....

Still, I can't wait for the game. IPs like Metro and Witcher and Stalker were trying to push graphics forward, more so than Western companies were doing (though that's changing now with things like TressFX).
 
There is a thread in the Video Card Forums talking about how Nvidia announced you would need a Titan to "max out" Metro. This thread here is much more... even-tempered....

Still, I can't wait for the game. IPs like Metro and Witcher and Stalker were trying to push graphics forward, more so than Western companies were doing (though that's changing now with things like TressFX).

I agree completely. I just hope it isn't as stale as 2033. I finished it, but I felt it was linear and aspired to an audience it never understood.
 
The 1st Metro scaled very well with additional graphics cards from both teams.
I hope this one does too. :)
 
It isn't helping PC gamers make their case against the console crowd when the marketing team just decides to throw out the most expensive enthusiast card out there, as if somehow we're dumb enough to buy into their bullshit.

lol, good luck trying to max the game out with anything else. You need a lot of horsepower to turn on the eye candy settings. The Titan, the 690, and a couple of AMD cards are really the only ones out there with the appropriate hardware on board to do so.

Sure, the game will be playable with other cards; you just won't be able to turn things up all the way, and that's what they mean by "Optimum".
 
Developers release games on DX9 that everyone's PC can easily run and max out without spending tons of money = OMG CONSOLE PORT WTF WHERE ARE MY FEATURES

Developers release a DX11 game with every possible feature that will bring even bleeding-edge computers to their knees = OMG UNOPTIMIZED WTF I CANT RUN THIS GUESS I WILL WAIT FOR IT TO BE $5

Damned if you do, damned if you don't.
 
Developers release games on DX9 that everyone's PC can easily run and max out without spending tons of money = OMG CONSOLE PORT WTF WHERE ARE MY FEATURES

Developers release a DX11 game with every possible feature that will bring even bleeding-edge computers to their knees = OMG UNOPTIMIZED WTF I CANT RUN THIS GUESS I WILL WAIT FOR IT TO BE $5

Damned if you do, damned if you don't.

It's impossible to please the PC gaming crowd.
 
It's just hilarious to me. Growing up, I used to play FPS games like Quake 2 and Quake 3 on a shitty computer I built in high school that pushed maybe 25fps, and I had fun even if the graphics weren't maxed out. These days, if you can't max out a game on release, it must mean it's shit. Heaven forbid you upgrade your computer.
 
It's just hilarious to me. Growing up, I used to play FPS games like Quake 2 and Quake 3 on a shitty computer I built in high school that pushed maybe 25fps, and I had fun even if the graphics weren't maxed out. These days, if you can't max out a game on release, it must mean it's shit. Heaven forbid you upgrade your computer.

Generally, I only get pissed when I can't max out a game that doesn't even look good. Skyrim is a really bad offender here: for the performance it needs, it looks like shit in a lot of places.

That said, if the game is fun I can generally overlook problems like that.
 
Developers release games on DX9 that everyone's PC can easily run and max out without spending tons of money = OMG CONSOLE PORT WTF WHERE ARE MY FEATURES

Developers release a DX11 game with every possible feature that will bring even bleeding-edge computers to their knees = OMG UNOPTIMIZED WTF I CANT RUN THIS GUESS I WILL WAIT FOR IT TO BE $5

Damned if you do, damned if you don't.

Not really... there is a vast gulf between a truly shitty console port and a title that bites off more than it can chew.
 