Cyberpunk 2077's Minimum & Recommended Specs Revealed

SPARTAN VI

[H]F Junkie
Joined
Jun 12, 2004
Messages
8,748

Source: https://twitter.com/CyberpunkGame/status/1306991768387321856?s=20

These specs are all over the place. I understand, at least for the "Minimum Requirements," that they want to set expectations for people who have older hardware. I just don't agree with the decision to use older hardware for the recommended specs. For example, I've already had a person contact me in a panic because his Core i5 isn't "recommended" but a Core i7-4790 is; this person has an i5-10600K, which we know is more than enough. And why would they select a Ryzen APU as the recommended AMD processor? I'd rather see an Intel Core i3-10300 (or better) or AMD Ryzen 3 3300X (or better) represented here so net-new PC buyers and DIYers know which hardware they should consider upgrading to.

Then moving along to the graphics card, I'd rather see current-generation hardware represented here - again - to help with informed upgrade decisions (e.g. GTX 1650 SUPER, RX 5500 XT, or RX 580 8GB). I can't complain about recommending a GTX 1060 6GB, but an R9 Fury makes no sense... virtually non-existent market share, and I'd hope normies don't go out and actively try to downgrade to meet these specs. That said, these recommended specs are probably barely enough to play this game at 1080p.
 

I blame the CPU makers for using product naming schemes that don't clearly identify the better/newer product and relative performance between models.

It often seems like the model names are intentionally obfuscated to confuse typical consumers as much as possible, and then it becomes next to impossible for developers to come up with required specifications that are easy to interpret.

I wish we could go back to descriptive product names.

Gen-Corecount-Clockspeed

That would make it a little bit easier to follow.
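
Something like this toy sketch (the name format and the comparison rule below are invented, purely to illustrate the idea):

```python
# Hypothetical "Gen-Corecount-Clockspeed" name, e.g. "10-6-4.1"
# = 10th gen, 6 cores, 4.1 GHz base clock. The format is made up.

def parse_name(name: str) -> tuple[int, int, float]:
    gen, cores, ghz = name.split("-")
    return int(gen), int(cores), float(ghz)

def meets_spec(mine: str, required: str) -> bool:
    """Crude check: newer-or-equal generation and at least as many cores."""
    my_gen, my_cores, _ = parse_name(mine)
    req_gen, req_cores, _ = parse_name(required)
    return my_gen >= req_gen and my_cores >= req_cores

# "Does my i5-10600K beat the recommended i7-4790?" becomes obvious:
print(meets_spec("10-6-4.1", "4-4-3.6"))  # True
```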
 
What I would like to see is what settings the recommended specs get you. Publishers would do everyone a huge service if they added, say, "(1080p, Medium, 60fps)" as a description; otherwise recommendations are kind of meaningless. Would love at least three categories: (720p, Low, 60fps), (1080p, High, 60fps), (1440p, Ultra, target framerate >60).
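
A publisher could even ship this as structured data next to the spec table; a minimal sketch, with tier names and targets that are invented here (not CDPR's):

```python
# Illustrative only - tier names and targets are made up, not CDPR's.
SPEC_TIERS = {
    "minimum":     {"resolution": "720p",  "preset": "Low",   "target_fps": 60},
    "recommended": {"resolution": "1080p", "preset": "High",  "target_fps": 60},
    "enthusiast":  {"resolution": "1440p", "preset": "Ultra", "target_fps": 60},
}

for tier, goal in SPEC_TIERS.items():
    print(f'{tier}: {goal["resolution"]}, {goal["preset"]}, {goal["target_fps"]}fps')
```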
 
Apparently I didn't click through to the link and read the post itself. They do give that info, just not in the graphic on the stream. Should have known CDPR would be doing things mostly the right way.

Please note that the game is both graphics- and processor-intensive, so make sure these components meet or exceed the minimum requirements. Also note that the minimum is created with Low settings and 1080p gaming in mind and Recommended with High and 1080p.

Doesn't appear in the original graphic/link but the info can be found here: https://support.cdprojektred.com/en...issue/1556/cyberpunk-2077-system-requirements
 
Apparently I didn't click through to the link and read the post itself. They do give that info, just not in the graphic on the stream. Should have known CDPR would be doing things mostly the right way.

Oh...1080p. Hmm.
 
It's also with no raytracing, obviously. Would love to have a recommended (1440p, DLSS, Ultra, Raytracing, 60fps minimum) spec. People need to know if a 2060 or 2070 will cut it or if a 3000 series card is needed for the raytraced experience.
 
Seems a bit low, even for 1080p and medium. And 70GB is a pleasant surprise. Thought it would be more like 100-120GB.

Although I will say, I am not finding it to be graphically impressive. Not bad, but just average with plastic looking people.

Are there Ultra settings that are much more demanding than High?
 
Apparently I didn't click through to the link and read the post itself. They do give that info, just not in the graphic on the stream. Should have known CDPR would be doing things mostly the right way.

Doesn't appear in the original graphic/link but the info can be found here: https://support.cdprojektred.com/en...issue/1556/cyberpunk-2077-system-requirements

The recommendation to find some sort of hierarchy chart to compare one's CPU with the minimum spec is a sound one for non-enthusiasts.
 
Wow, I was pretty much only upgrading to a 3080 because I figured Cyberpunk would bring my PC to its knees, but by the looks of things I will be more than ready even without a 3080. That's $800 I should probably put towards my OLED and one of the consoles.
 
They need to start adding recommended specs including resolutions.
 
Careful what you wish for, the recommended specs might just say Xbox Series X :p
 
Looking at those specs, I bet the potato I use (in my sig) will run this at 1366x768 30FPS, which is perfectly acceptable for me to have a good experience. I've come to the conclusion, like others, that published specs need to be taken with a grain of salt. There are many newer games that run just fine on my relic of a system, and yet I don't meet the minimum specs (Metro Exodus, Far Cry 5/New Dawn, Doom Eternal, and Wolfenstein II: The New Colossus are a few I've played lately).

This makes me really look forward to this game now, as I bet it will run fine for me. Thanks for posting these up!
 
Although I think CDPR is spending a good amount of time optimizing, I'm sure this game will still require a lot to run maxed out (no one is going to be able to touch id levels of engine performance - though I wish companies would consider performance as a feature in itself).
You'll note that the recommended specs don't even suggest an RTX card, even though, as we all know, this game has raytracing and RTX features.

I'm assuming the recommended spec is probably "medium" settings at 1080p at roughly 60fps. If you want 4K 60-120fps you're going to need a monster system, especially if you want to play on Ultra with every RTX and visual fidelity feature on. I think a 3080 will get you there. I think RDNA2 will also get you there. I actually think a 2080 Ti will also get you most of the way there (it's been known that all of the demos they've shown have been running on a 2080 Ti). But we're talking about top-end 1%er parts. Still, it does make me wonder how much performance I can squeeze out of my Radeon VII.
 
Gotta give them respect that a lowly 780 can play this game. Sure, it's probably 720p with everything on low at maybe 30fps. I fully expect that if I go on the Steam boards there will be people crying about why their Athlon X2 is not supported.
 
I'm more baffled that they say it supports Win7 64-bit but requires DX12. So are they going to go the WoW route and enable DX12 on Win7 for this game?
 

It's highly likely this is exactly what they are doing. The reason is that Windows 7 still has ~20% of the OS market, with Windows 10 at around ~75%. The remaining ~5% is Vista, 8.1, and XP, which is so insignificant relatively speaking that no one ever supports 8.1 despite Win7 being older. On Win7, DX12 only works on 64-bit, so that's consistent with the requirements.
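
As a rough illustration of how that works (a sketch, not CDPR's actual code): on Windows 10, d3d12.dll ships with the OS, while on Windows 7 it is only present if the game bundles Microsoft's "D3D12 on Windows 7" package - the same mechanism WoW used. A launcher can simply probe for it:

```python
import ctypes

def d3d12_available() -> bool:
    """Return True if the D3D12 runtime can be loaded (Windows only)."""
    try:
        # Ships with Win10; on Win7 it only exists if the game installed
        # the "D3D12 on Windows 7" redistributable alongside itself.
        d3d12 = ctypes.WinDLL("d3d12")
    except OSError:
        return False
    # D3D12CreateDevice is the entry point used to create a device;
    # hasattr() performs a GetProcAddress lookup under the hood.
    return hasattr(d3d12, "D3D12CreateDevice")

if __name__ == "__main__":
    print("D3D12 runtime found" if d3d12_available() else "no D3D12 runtime")
```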
 
I'm not surprised at this. Part of the appeal of Witcher 3 was how little video card you needed. I could max out the game at 1080p on my GTX 960 (even with Hairworks on).

Until we have the new consoles out for a bit, multi-platform games are going to improve more slowly.
 
Wow. I was expecting more... or will we see a Watch Dogs-level graphical downgrade at launch?
 
I'm not surprised at this. Part of the appeal of Witcher 3 was how little video card you needed. I could max out the game at 1080p on my GTX 960 (even with Hairworks on).

Until we have the new consoles out for a bit, multi-platform games are going to improve more slowly.

People have gotten so used to AAA developers' half-baked efforts at a skeletal framework for DLC that they expect every game to need a machine from ten years in the future to run.
 
Seems a bit low, even for 1080p and medium. And 70GB is a pleasant surprise. Thought it would be more like 100-120GB.

Are there Ultra settings that are much more demanding than High?


Yes,

Low
Medium
High
Very High
Overkill

based on the leaked graphics menu from this spring. And next spring it's getting a next-gen GRAPHICS upgrade on all platforms, including the PC.
 
Unless a resolution and framerate are stated, min/rec specs generally mean anything from 2 to 20 FPS at 1920x1080 and 480x240, respectively.
 
I'm absolutely expecting these "recommended" specs to mean 1080p30, which is console parity.

So, you've gotta do roughly 8x that to hit 4K60, and that might take quite a bit of GPU power.
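
That 8x is just the pixel-count ratio times the frame-rate ratio:

```latex
\frac{3840 \times 2160}{1920 \times 1080} \times \frac{60}{30} = 4 \times 2 = 8
```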
 
Those specs are a lot lower than I anticipated. Hopefully my Vega 64 can do 1080p with everything maxed - 30fps is OK for me, I prefer IQ over fps anyway. I'm a little concerned about my FX-8320 though, lol. Maybe I can scrounge a cheap FX-9370 on eBay - gotta heat my apt somehow.
 
Wow. I was expecting more... or will we see a Watch Dogs-level graphical downgrade at launch?

This isn't me being a contrarian, but I don't think it ever looked like the next Crysis-level game that would bring PCs to their knees, as many seemed to be expecting (not sure based on what, exactly, since the game seems rather honest graphically about its level of world and character fidelity - which hasn't wowed me, but is helped by the visual UI/RTX flourishes and the high-tech future aesthetic).

RTX will take its toll, though, like in any game with it enabled.
 
Aren't we looking at a game with varying locations with different densities of people and detail? From what I've seen, it appears that some areas will be much more graphically demanding than others. I'm sure we are going to see scaling of what details and population are rendered.
And anyway, who's going to sell a game that only 100 people can play at minimum settings and that plays like a slideshow for everyone else?
Also, look at the flexibility and graphical improvements of aged franchises like Skyrim and GTA 5. I know eventually we'll be able to max out all settings on this game with the 6080 or 7080 we get in 8 years.
 

Source: https://twitter.com/CyberpunkGame/status/1306991768387321856?s=20

These specs are all over the place. I understand, at least for the "Minimum Requirements," that they want to set expectations for people who have older hardware. I just don't agree with the decision to use older hardware for the recommended specs. For example, I've already had a person contact me in a panic because his Core i5 isn't "recommended" but a Core i7-4790 is; this person has an i5-10600K, which we know is more than enough. And why would they select a Ryzen APU as the recommended AMD processor? I'd rather see an Intel Core i3-10300 (or better) or AMD Ryzen 3 3300X (or better) represented here so net-new PC buyers and DIYers know which hardware they should consider upgrading to.

Then moving along to the graphics card, I'd rather see current-generation hardware represented here - again - to help with informed upgrade decisions (e.g. GTX 1650 SUPER, RX 5500 XT, or RX 580 8GB). I can't complain about recommending a GTX 1060 6GB, but an R9 Fury makes no sense... virtually non-existent market share, and I'd hope normies don't go out and actively try to downgrade to meet these specs. That said, these recommended specs are probably barely enough to play this game at 1080p.

Not to sound rude, but it really isn’t CDPR’s responsibility if people don’t understand basic computer hardware. They can’t write an entire guide explaining the difference between a 4th gen i7 and a 10th gen i5 when telling people the minimum specs required to run their game.
 

Didn't you get the memo?
People have to be spoon-fed these days.

I would argue if the min and rec specs confuse you...sell your PC and buy a console.

Problem solved.
 

I'll always be going for the top-quality PC experience myself, but it's honestly not a bad idea. The new consoles are damned powerful (2080 + 3700X levels of performance) for $499. Even with the new 3000 series release, I doubt you can build something that powerful for the same price right now. The Steam survey shows that the most common GPU is a 1060, so a console would be a significant bump up for most.
 
Those specs are a lot lower than I anticipated. Hopefully my Vega 64 can do 1080p with everything maxed - 30fps is OK for me, I prefer IQ over fps anyway. I'm a little concerned about my FX-8320 though, lol. Maybe I can scrounge a cheap FX-9370 on eBay - gotta heat my apt somehow.
If you're staying on FX, grab an FX-8370 if you can. I still have mine as my "backup" PC, and it's Prime-stable at 4.82GHz @ 1.35v, 2600 HT / 2200 NB, on a custom 240 liquid loop. Good luck hitting that voltage on the FX-9XXX series. It would even game @ 5GHz, but I had to crank the voltage too high.
 
I can't complain about recommending a GTX 1060 6GB, but an R9 Fury makes no sense... virtually non-existent market share
Maybe they're pointing out that it's more about memory bandwidth than GB? The Fury uses HBM, but only 4GB.
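
For reference, the bandwidth gap is real - HBM1's wide bus gives the Fury well over twice the 1060's throughput:

```latex
% R9 Fury: 4096-bit HBM1 at 1 Gbps effective per pin
\frac{4096 \times 1\ \text{Gbps}}{8} = 512\ \text{GB/s}
\qquad
% GTX 1060: 192-bit GDDR5 at 8 Gbps per pin
\frac{192 \times 8\ \text{Gbps}}{8} = 192\ \text{GB/s}
```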
 