Cyberpunk 2077’s first major update coming in the next 10 days

The top-down approach can work, but you're talking about the best of the best doing that. It's not impossible, but it's not typical. When the PS3 / Xbox 360 era hit there was a major shift, because overall system bandwidth was much higher than on PCs and VRAM was at least comparable. Back in the day I saw the switch to console-first, and it was damn near everyone, and still is. While you have Cyberpunk being the #1 PC release, you have God of War and Horizon Zero Dawn, games that sold 10 and 50 million on one console alone. Doom 3? 3.5 million. Half-Life 2? It reached 10 million, but it took 13 years.
God of War (2018) sold 12 million copies. 50 million is total sales for the entire series.
 
They also need to make money... making a game PC-exclusive would kill off a major revenue stream.
That's BS and everybody knows it, yet somehow everyone acts as if you need consoles to make money. It's the same kind of taboo as DRM: everyone acts as if it's necessary, while everyone knows a good game will sell with or without DRM.
Heck, you don't even need a good game; does Star effing Citizen ring a bell? Virtually a money-manufacturing machine built on a promise.

At least here, don't pretend that consoles are a "necessary evil." No, parallel releases are a product of greed, not necessity. There is absolutely no need to release a game on all platforms at the same time. Of course it's easier to do the shit version tailored to console specs and then just push it to PC. And why? Because making bank is no longer enough. They want all the money at once, and sooner rather than later. Customer satisfaction be damned.
 
That's BS and everybody knows it, yet somehow everyone acts as if you need consoles to make money. It's the same kind of taboo as DRM: everyone acts as if it's necessary, while everyone knows a good game will sell with or without DRM.
Heck, you don't even need a good game; does Star effing Citizen ring a bell? Virtually a money-manufacturing machine built on a promise.

At least here, don't pretend that consoles are a "necessary evil." No, parallel releases are a product of greed, not necessity. There is absolutely no need to release a game on all platforms at the same time. Of course it's easier to do the shit version tailored to console specs and then just push it to PC. And why? Because making bank is no longer enough. They want all the money at once, and sooner rather than later. Customer satisfaction be damned.

When was the last major PC exclusive? ...If it was so easy to make $$, then why aren't more developers doing it?
 
The top-down approach can work, but you're talking about the best of the best doing that. It's not impossible, but it's not typical. When the PS3 / Xbox 360 era hit there was a major shift, because overall system bandwidth was much higher than on PCs and VRAM was at least comparable. Back in the day I saw the switch to console-first, and it was damn near everyone, and still is. While you have Cyberpunk being the #1 PC release, you have God of War and Horizon Zero Dawn, games that sold 10 and 50 million on one console alone. Doom 3? 3.5 million. Half-Life 2? It reached 10 million, but it took 13 years.
Yeah, disregard linear time, why don't ya? Why not go back to the Stone Age for PC releases? HL2 and Doom 3 are from 2004. God of War is the only valid comparison, and the one that came out in 2005 sold 4.5 million copies. And yeah, franchise sales don't count for anything, unless you compare franchises that both have the same number of sequels.
 
When was the last major PC exclusive? ...If it was so easy to make $$, then why aren't more developers doing it?
Easy? Nothing is easy. But not making proper PC releases (not exclusives, just games that utilize the potential of PCs) has everything to do with greed, as I've already explained.
 
They really should have, especially with what the game eventually became.
The CPUs in the last-gen consoles just aren't enough to run such a complex game without optimizing specifically for those platforms.
The Jaguar that ran the last-generation games, while considerably meager, allowed the PS4/Xbox One to survive far longer than would otherwise have been possible. The entire point of those systems was to provide the multithreading capability of something like Cell on an x86 platform while maintaining easier development and backward compatibility. AMD, Sony, and Microsoft weren't stupid. The consoles were meant as a way to stress multithreading beyond 1 or 2 cores. You can play Cyberpunk on 4 cores, but the best experience is on an 8-core processor (minimizing the impact of clock-speed binning). You can pick up a six-core, but an 8-core will perform best.

The problem with Cyberpunk, as the developer itself says, is that the focus on including every feature available to PCs (ray tracing included) made the lift to get the game running on consoles far harder. They could have delayed the console versions, but they didn't, because of the economic implications. (After Xmas?) No publisher is going to do that. So they pushed it out.

Preferring PC over the consoles has huge implications for compatibility testing. People need to remember that at the time of development, ray tracing in particular was a separate API altogether; DX12 provides the link/hook. But you've also got all the other features they decided to include that fall outside of that. This game started development five years ago, or at least in that ballpark, so it was most likely developed around a 1080/Ti in terms of performance. There are huge implications to this.

Many games nearly tanked or ran into serious problems going top-down. Off the top of my head: the later Batman games, the later Lara Croft games, some Witcher games, etc. Simply put, using exclusive APIs has huge consequences for portability. Going console-first is closer to set-it-and-forget-it; the reverse is not. People can pick apart the statements, but all I'm saying is that's the way it goes.
 
Yeah, disregard linear time, why don't ya? Why not go back to the Stone Age for PC releases? HL2 and Doom 3 are from 2004. God of War is the only valid comparison, and the one that came out in 2005 sold 4.5 million copies. And yeah, franchise sales don't count for anything, unless you compare franchises that both have the same number of sequels.
I'm not going to start a console vs. PC discussion on merely personal grounds. Whatever you believe is best is fine with me. Although God of War and Horizon Zero Dawn easily stack up against the best of what PC has to offer, if you aren't stressing resolution/framerate.
 
"Top-down" development is completely normal and considered best practice for graphics. It's easy to lower the resolution of textures and to simplify models and animations.

It is not done for things like AI, collision, and other CPU-heavy systems that can't easily be scaled down, which is what it sounds like they did here. Best practice is to do the opposite for those things.
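As a rough illustration of why texture scaling is the "easy" part: most engines already store textures as mipmap chains, so a lower quality preset can simply discard the top mip levels rather than reauthoring assets. A minimal sketch, with preset names and `drop` values made up for illustration:

```python
# Sketch: scaling texture quality down by dropping mipmap levels.
# Each dropped level halves the resolution on each axis, so "medium"
# (drop=2) turns a 2048x2048 base texture into 512x512 for free.
PRESET_MIP_DROP = {"ultra": 0, "high": 1, "medium": 2, "low": 3}

def texture_resolution(base_size: int, preset: str) -> int:
    """Effective texture size after discarding the top mip levels."""
    drop = PRESET_MIP_DROP[preset]
    return max(1, base_size >> drop)  # never shrink below 1 texel

if __name__ == "__main__":
    for preset in PRESET_MIP_DROP:
        print(preset, texture_resolution(2048, preset))
```

There is no equivalent knob for AI or collision: halving the number of pathfinding agents or physics steps changes gameplay, not just image quality, which is why those systems have to be budgeted for the weakest target platform up front.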
 
The Jaguar that ran the last-generation games, while considerably meager, allowed the PS4/Xbox One to survive far longer than would otherwise have been possible. The entire point of those systems was to provide the multithreading capability of something like Cell on an x86 platform while maintaining easier development and backward compatibility. AMD, Sony, and Microsoft weren't stupid. The consoles were meant as a way to stress multithreading beyond 1 or 2 cores.
The issue isn't the lack of threads, it's how weak each of those Jaguar cores is - I know, because I'm typing to you from a Jaguar system. ;)
A single Jaguar core at 2GHz is roughly equivalent to a Pentium 4 at 3GHz, and all 8 cores at 1.6GHz in full SMP are roughly equivalent to a dual-core Intel Haswell i3 at 3GHz (assuming that CPU could run 8 threads instead of 4).

The reasoning behind Microsoft and Sony choosing a semi-custom Jaguar-based APU for the last-gen consoles: IBM's PowerPC CPUs were low-yield (bad for IBM); the Cell was going EOL in 2012 with no true successor; ARM CPUs were still 32-bit at the time, limiting overall CPU/GPU memory addressing to 4GB; any Intel CPU would have been costly and required a separate GPU; and AMD had the best and most cost-effective tech at the time with their APUs.
The backward compatibility and ease of development, while good bonuses, were simply side effects.

The previous-generation consoles ran far more than 1 or 2 threads.
The PS3's Cell you are already aware of (2 general-purpose threads and 6 available vector-threads for the game/software), and the 360's PowerPC Xenon was a tri-core with SMT (6-threads).

Aside from that, I agree with the rest of your post and how it all relates to Cyberpunk 2077, especially with 8 threads being the sweet spot for its game engine.
Again, the PS4 and XBone have the right number of threads, but the CPU cores themselves are nowhere near powerful enough to run the game engine, the AI, texture and asset loading, etc. quickly enough without further heavy optimization from the developers.
 
The issue isn't the lack of threads, it's how weak each of those Jaguar cores is - I know, because I'm typing to you from a Jaguar system. ;)
A single Jaguar core at 2GHz is roughly equivalent to a Pentium 4 at 3GHz, and all 8 cores at 1.6GHz in full SMP are roughly equivalent to a dual-core Intel Haswell i3 at 3GHz (assuming that CPU could run 8 threads instead of 4).

The reasoning behind Microsoft and Sony choosing a semi-custom Jaguar-based APU for the last-gen consoles: IBM's PowerPC CPUs were low-yield (bad for IBM); the Cell was going EOL in 2012 with no true successor; ARM CPUs were still 32-bit at the time, limiting overall CPU/GPU memory addressing to 4GB; any Intel CPU would have been costly and required a separate GPU; and AMD had the best and most cost-effective tech at the time with their APUs.
The backward compatibility and ease of development, while good bonuses, were simply side effects.

The previous-generation consoles ran far more than 1 or 2 threads.
The PS3's Cell you are already aware of (2 general-purpose threads and 6 available vector-threads for the game/software), and the 360's PowerPC Xenon was a tri-core with SMT (6-threads).

Aside from that, I agree with the rest of your post and how it all relates to Cyberpunk 2077, especially with 8 threads being the sweet spot for its game engine.
Again, the PS4 and XBone have the right number of threads, but the CPU cores themselves are nowhere near powerful enough to run the game engine, the AI, texture and asset loading, etc. quickly enough without further heavy optimization from the developers.
I can agree with this. The 1-or-2-core part I meant in relation to PC gaming, though. It's a slightly different take, but also true.
 
I hear people saying Cyberpunk was too ambitious graphically to run on last gen consoles, but I don't get this.

There are tons of PS4 games (especially exclusives) that look and run pretty nice. Sure, they use lower render scales, and checkerboarding, and other tricks, and you're still talking 30 fps, but they work (and no crashes either).

I don't really see any technical reason Cyberpunk couldn't be fine on the PS4. Yeah, reduce the number of NPCs or cars or whatever, run at a lower res, etc. It's not impossible. Hopefully the upcoming patches will fix it; we'll see.
 
I hear people saying Cyberpunk was too ambitious graphically to run on last gen consoles, but I don't get this.

There are tons of PS4 games (especially exclusives) that look and run pretty nice. Sure, they use lower render scales, and checkerboarding, and other tricks, and you're still talking 30 fps, but they work (and no crashes either).
The game was sold in large part on its ambitious graphics. Running it at low settings is contrary to the draw.
 
I hear people saying Cyberpunk was too ambitious graphically to run on last gen consoles, but I don't get this.

There are tons of PS4 games (especially exclusives) that look and run pretty nice. Sure, they use lower render scales, and checkerboarding, and other tricks, and you're still talking 30 fps, but they work (and no crashes either).

I don't really see any technical reason Cyberpunk couldn't be fine on PS4. Yeah, reduce the number of NPCs or cars or whatever, run at a lower res, etc. it's not impossible. Hopefully the upcoming patches will fix it, we will see.

It's not just a matter of graphics settings. The way the world was designed, and how the engine renders the cityscape as you traverse it, come into play. And at some point you can scale settings back too far, and the game will look like Minecraft and GTA had a kid. It's not a good look.
 
I'm not going to start a console vs. PC discussion on merely personal grounds. Whatever you believe is best is fine with me. Although God of War and Horizon Zero Dawn easily stack up against the best of what PC has to offer, if you aren't stressing resolution/framerate.

I personally never found a reason for the "versus" part as I've gotten older; it was easier to just buy and enjoy both. I got a PS4 and Switch three years ago, after being mainly PC for the 10 years before that, and have REALLY enjoyed it. With how much most of us at [H] spend on PC hardware, buying a console during something like a Black Friday sale basically makes it the equivalent of adding one more part to our hot rod of a PC.

On the note about Cyberpunk, I've had a blast with it on PC, but it does suck for the people having issues on console. I had a few friends stream it on their PS4s, and despite the crashes they were enthralled and loved the game and its content. Here's hoping the patches are able to resolve the issues.
 
I personally never found a reason for the "versus" part as I've gotten older; it was easier to just buy and enjoy both. I got a PS4 and Switch three years ago, after being mainly PC for the 10 years before that, and have REALLY enjoyed it. With how much most of us at [H] spend on PC hardware, buying a console during something like a Black Friday sale basically makes it the equivalent of adding one more part to our hot rod of a PC.

On the note about Cyberpunk, I've had a blast with it on PC, but it does suck for the people having issues on console. I had a few friends stream it on their PS4s, and despite the crashes they were enthralled and loved the game and its content. Here's hoping the patches are able to resolve the issues.
This. same.
 
Cross-platform games like Far Cry 5 I absolutely play on PC. But the console exclusives I still have a lot of fun with. I have a PS4 and a Switch Lite; they are great systems.

I honestly feel bad for people with last-gen consoles trying to play Cyberpunk. I played for a few hours on base PS4. I mean, I guess you could consider it "playable" but it was really raw.

I do understand that CDPR really pushed things too far for console. I believe they are using all real-time lighting (not baked lightmaps), so that takes a huge toll. And the number of NPCs and cars is higher than in maybe any other game.

So yeah, they went too far, which is great if you have a high-end PC, but for most people (most of the customers), not so much. I still think it's technically possible, and there is a lot of pressure now, so let's cross our fingers that they pull it off.
 
Honestly, this is why I like indie gaming on PC now. They develop on and for PCs only, or at least primarily.
Sure, the graphics aren't the main show like in triple-A releases, but their art, direction, and gameplay are top-notch, because they don't bow to these blood- and money-sucking executives!

I remember when I was last in the PC gaming scene, toward the late 2000s before I quit until now: I thought indie games were boring, glorified equivalents of the then-popular Flash games and the like. I was always looking forward to the triple-A releases. Then those big studios just kept disappointing more and more, be it because of the rise of consoles or other reasons.

Now the smaller studios are where the good PC games are!

//////////

Fascinating reads from some [H] members on graphics history here! :) Great stuff, really enjoyed the dive on this one.
 
That's BS and everybody knows it yet somehow everyone acts as if you need consoles to make money.

You can, but it will be a lot less money. You'd be cutting your revenue by around 50-60%, maybe more.

Generally the more money you have on hand the more time and leeway you have with the follow up product. Most studios don't just want to turn a profit, but have enough money to fund the next 3-5 years of work. Limiting your platforms makes that much harder.
 
 
You can, but it will be a lot less money. You'd be cutting your revenue by around 50-60%, maybe more.

Generally the more money you have on hand the more time and leeway you have with the follow up product. Most studios don't just want to turn a profit, but have enough money to fund the next 3-5 years of work. Limiting your platforms makes that much harder.
Dammit, why is everyone deliberately obtuse when it comes to console vs. PC? I'm not saying don't release games on consoles at all. I'm saying release them at different times, leaving time for a proper port; whichever version you release first doesn't matter, as long as you make the effort for the other platform. In multiplatform releases it's usually the PC version that is compromised, but PC gamers are so used to that by now that they expect it and even think of it as the norm. This is the first time, as far as I can remember, that the console version of a release is the compromised one. And we can all see how tolerant console users are compared to PC users.
 
It's not just a matter of graphics settings. The way the world was designed, and how the engine renders the cityscape as you traverse it, come into play. And at some point you can scale settings back too far, and the game will look like Minecraft and GTA had a kid. It's not a good look.
Well, the way it renders things is terrible even on PC at maximum settings. You literally cannot see enemies 100m away because they haven't been rendered yet. I could make 400m shots in Ghost Recon, but in Cyberpunk I have to walk up to enemies just to see them.
What's even worse is the fake traffic made of sprites drawn in the distance, which disappears as you get closer. It would be better to have nothing there at all.
 
I think CDPR should have disregarded the last-generation consoles entirely. The problem is that preorders had been taken going back years, if I'm not mistaken. It was pretty clear those consoles were going to be a no-go, or at least that they would have to be scaled back and delayed longer than the PC and next-generation console versions.
As I mentioned: rock and hard place (or catch-22, as it were). They should have disregarded last-gen consoles completely, but without those console preorders they wouldn't have had the backing to continue development for 7 (?) years. Maybe the best solution would have been, as some here have suggested, to develop for PC first, then create a separate branch/project for the older consoles (like Crysis did). But then I guess you would have had PS4 and Xbox One folks complaining they didn't get the same experience as next-gen and PC (...like the Xbox and PS3 folks did with Crysis). And again, as with Crysis, maybe down the line we could have had an "enhanced" version for PC based on the inferior console version :)
 
WoW ring a bell?

Yes, the game that launched in 2004, that has outlived numerous other MMOs and that no other MMO has come close to matching in playerbase.

The question was "if it was so easy why aren't more developers doing it" and unless you count the rapid rise and fall of generic MMOs that come and go in popularity, I don't know that WoW is really a good example of it.
 
I never pay attention to pedestrians/side characters in these kinds of games.
All they do is get in your way (wow, I guess it mimics real life a lot); they serve no purpose other than to walk into you and be obstacles, or to be something to kill on a rampage, if you're into that.

They (all NPCs in pretty much every game I can think of) serve as nothing more than meaningless, brainless props, prone to doing stupid things that break any immersion you may have gotten sucked into momentarily.

Something about AI and non-player-controlled characters needs to change drastically in video games. It honestly seems we're getting a regression from the much earlier triple-A titles of years past. Not that their AI was anything spectacular, but it seems like things are getting worse in this department as time goes on, rather than better.

The devs want you to believe they're dynamic, that they have their own lives or whatever. But all they are is ghost people who disappear as soon as you're out of view, with randomized clothing and physical characteristics (fat, slim, black, homeless, CEO) to give a facade of lifelikeness. It's all so blatantly fake, and "why bother if you don't make anything meaningful or special out of it" is kind of the attitude I've always had about them =/

Sounds like CP2077 is no better and in fact worse (if it were possible, lol).

I guess it's better than a literal ghost town with cutscenes to drive the story? But maybe instead of quantity, focus on quality. Or at least make them like in the Deadly Premonition games, where there are at least *SOME* unique regular NPCs in the midst of the crowds, whom you can sometimes spot doing really unique things and join in a quest with, etc.

Give up on peds and other meaningless "NPCs" in games, please, at least until there are some AI and programming breakthroughs to justify their place, instead of their just being more things to go wrong, frustrate players, and take the immersion and fun out.

AI, reliable pathing, and meaningful features for non-major quest givers and other NPCs really need to be better here. There's PhysX, the Havok engine, ray tracing, vehicle collisions, sweet gun animations, good stories in games... but NPCs are always brainless morons who just stand there doing stupid, unbelievable shit, and it's just annoying more than anything.

You don't need countless randomized-clothing zombie pedestrians in other games to enjoy their stories... why they keep shoving these buggy crash-test dummies down gamers' throats is beyond me.

Everything is just so cutthroat and copy-pasted, seen-a-thousand-times-before, except now it's even lower quality than ever: low-effort garbage.

I am in my early 30s; I shouldn't be this cynical and distrustful of video games... Come on, people, just put some effort into all aspects of the video game you create. Stop throwing shit features around just so your marketing can tout them as something amazing, for gamers to then be disappointed... only to forget this type of crap until the next hype comes along and the cycle repeats.

To clarify my above post: I have not played CP2077. After seeing and hearing reports of it, I lost all interest in the game; I don't think I'd last more than a couple of hours. May as well play GTA5. I am just going on other people's reports, my experiences with previous games of the genre, and what I've seen with my own eyes in streams and reviews of this game and others.
 
Aside from that, I agree with the rest of your post and how it all relates to Cyberpunk 2077, especially with 8 threads being the sweet spot for its game engine.
Again, the PS4 and XBone have the right number of threads, but the CPU cores themselves are nowhere near powerful enough to run the game engine, the AI, texture and asset loading, etc. quickly enough without further heavy optimization from the developers.
I'm going to zero in on the optimization part of this, because I think people don't completely realize the level of optimization that occurs on consoles. It still happens even today. Imagine not only recompiling your software to use every instruction offered, but also optimizing for cache and bandwidth, system-wide. That's what happens on consoles. Core utilization does not equal efficiency. Those throughput numbers that manufacturers spit out? Well, your PC is hitting like half that. On PCs there's a ton of overhead: generic drivers and software everywhere. Everything that drives your average PC is compiled against baseline minimums; that's why everything is so compatible. Games on consoles, however, especially near EOL, are heavily optimized. This is why console ports to PC haven't been that great lately. You think to yourself, "why is this port to PC so shitty?" Well, on consoles everything is accounted for. When you port back to PC, yes, you have crazy power, but a very large amount of it goes into the ether.

I wouldn't necessarily say that Jaguar was meager in the console context, per se. The ease of development was no side feature, but one of many features that drove that decision. It's crazy balanced from the jump. It's meager compared to what's available on PC and the new consoles, but out of the door it was starting at like 50-60% utilization, and connected to total system bandwidth higher than a PC's (the entire system ran at GDDR5 speeds, not just the VRAM). Cell? In the beginning? Like 20%. It took many developers almost 10 years to hit its theoretical output of 3-4 times that of Jaguar; no one saw those numbers until the mid-2010s, when official support for it ended. Jaguar hit 90% by like the second or third year.
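To make the "compiled on minimums" point concrete: a PC build typically targets the baseline x86-64 instruction set so it runs on every 64-bit machine, while a console build can target the exact core in the box. A sketch using GCC flags (the source file name is made up; `btver2` is GCC's architecture name for the AMD Jaguar family):

```shell
# Generic PC build: baseline x86-64, runs on any 64-bit CPU,
# but leaves newer SIMD instructions unused.
gcc -O2 -march=x86-64 -mtune=generic -o game_pc game.c

# Console-style build: target the exact core (btver2 = AMD Jaguar),
# so the compiler can schedule for its pipelines and emit every
# instruction the chip actually supports (SSE4.2, AVX, F16C, ...).
gcc -O2 -march=btver2 -o game_console game.c
```

And instruction selection is only part of it: console builds can also lay out data for known cache sizes and a fixed memory bandwidth budget, which no generic PC build can assume.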
 
I'm going to zero in on the optimization part of this, because I think people don't completely realize the level of optimization that occurs on consoles. It still happens even today. Imagine not only recompiling your software to use every instruction offered, but also optimizing for cache and bandwidth, system-wide. That's what happens on consoles. Core utilization does not equal efficiency. Those throughput numbers that manufacturers spit out? Well, your PC is hitting like half that. On PCs there's a ton of overhead: generic drivers and software everywhere. Everything that drives your average PC is compiled against baseline minimums; that's why everything is so compatible. Games on consoles, however, especially near EOL, are heavily optimized. This is why console ports to PC haven't been that great lately. You think to yourself, "why is this port to PC so shitty?" Well, on consoles everything is accounted for. When you port back to PC, yes, you have crazy power, but a very large amount of it goes into the ether.
The made-for (read: optimized-for) consoles angle is a major appeal, for sure.

I remember when smartphones were new and I owned both iPhones and Androids (as well as the QNX-based BlackBerry 10, which was my favorite by far, and I hadn't even used BlackBerrys once before then). I always ended up returning to iPhones after a year or two with Android, because the software was made for iPhone first and then ported over to Android (sometimes it went the other way, but that was rare, at least with the popular apps I was using at the time). Things just worked better, and as expected, on iPhones, while being merely serviceable but clearly unoptimized on Android for a very long time. I haven't used iPhones in quite a few years now, but judging by the quality of Android apps today, things seem to have balanced out.

But, as someone who was a PC gamer first, then made a soft switch to PS3 and Xbox 360, and finally gave up on PC gaming completely (due to a lack of good-quality/interesting games in the later 2000s) and moved to PS4... the grass is always greener on the other side. Using consoles was annoying and subpar compared to everything I recall about PC gaming. The low FPS was the worst of it all; 30 FPS is just disgusting to a former PC gamer. I have no idea how people were able to tolerate it. Perhaps a majority never had the opportunity to play games at 60+ FPS before? It was nauseating. And contrary to popular belief, crashes, hang-ups, etc. still happen on consoles.

Console OS GUIs (I haven't used the PS5 or XSX) always start quick and fresh early in their cycle, then end up dogshit slow. Formatting/reinstalling the system doesn't fix it, because they keep adding bloat and code to the OS as time goes on, to the point that even the hardware doesn't seem able to handle it.

There are tons of downsides to consoles (I'm not PCMR or a fanboy in any capacity; I go between Xbox and PS, PC and consoles, Linux and Windows, different phone OSes, etc.). I am now back to playing on PC, and holy shit, games are actually effing enjoyable again, even the same games as before. (God, even just playing Subnautica on PC compared to the PS4 I was playing on until a month ago is a night-and-day difference. It feels like a different game, and a different, so much more enjoyable experience.)

And on consoles, games seem to get less and less unique as time goes on, utilize the console-specific exclusive features less and less, and become more and more poorly optimized, forgotten about, and left unupdated. (Oh, this was another huge plus of PC gaming I missed dearly when I didn't have a gaming PC for the past decade: games often get only 1-2 patches on consoles, at most, shortly after launch, and are then left in that state more or less permanently. There are exceptions, but in my experience they would often be so far behind the equivalent PC game's patches it was absurd.)

And there were far too many (indie-studio/smaller-team) games that were PC-first and either don't get released on consoles, or do so wayyyyyyy later, and then in quite a suboptimal state.
Not to mention the sometimes ridiculous load times on the previous-generation consoles...

Console gaming ends up more gimped than PC gaming, by quite a bit.

(And while it has nothing to do with the optimization topic at hand, console gaming often actually ends up quite a bit more expensive to own and game on than PC. Yes, it might be $500 for the hardware, but consider that console games incredibly rarely, if ever, go on sale. Triple-A games get just mediocre discounts on holidays and the like, and you'd see some random sales now and then for other games, but they were far less frequent, and games kept their launch MSRP for WAYYYYY too long post-release. On the PC side of things you get so many sales, all the time, that I'm just overwhelmed. It feels like PC gaming costs pennies on the dollar compared to buying games over the course of a console's life. I would argue console gaming, for a non-casual gamer whose main interest isn't Peggle, is actually more costly than buying a gaming PC outright and saving substantially over time thanks to PC game sales.)

Grass is always greener on the other side, though.
The XSX and PS5 may sound great now, but give it 6 months, when the wow factor wears off for both users and developers and sales begin to stabilize, and it'll probably be back to being quite an inferior gaming experience compared to PC again. I may eat my words, because this generation seems perhaps different; but that could be said of any past console generation at launch. They just don't seem to age well.

My biggest fear with the new generation is that while devs seem to target 60 FPS for now, I'm afraid they'll start getting greedy and up the graphical detail or effects, or new, more demanding engines will come out that can't keep running at 60 FPS on the consoles, and developers will default back to 30 FPS as the standard all over again.

So yes, I guess there is merit in the POTENTIAL for consoles to be a superior experience to PC, since their standard hardware config lets you optimize all drivers and hardware and build games around that exact software and hardware... but in reality, it just doesn't work that way. Developers don't give a shit. They target the lowest common denominator, do the least amount of work they can to get something that just barely runs on the thing, put it on sale, call it a day, and move on to their next project, often actually dedicating more time and resources to the PC crowd than the console one.

I think I need a break from internet participation, I'm not sure why I am writing high school level essays on these random topics :barefoot: it's cool to have a place to share our niche and geeky thoughts with each other I suppose!

Crossing my fingers that this patch will be something really cool!

[H] on!
 
Something with AI and non-player-controlled characters needs to change drastically in video games. It honestly seems like we're getting a regression from the Triple-A titles of years past. Not that their AI was anything spectacular, but things seem to be getting worse in this department as time goes on rather than better.
Agree with this so much!
I may eat my words because this generation seems perhaps different. But that can be said for any past generation of consoles when they launched. They just don't seem to age well.
Disagree with this part, from my experience Xbox/PS2 and X360/PS3 aged fairly well imo. I did not own either of the XOne or PS4 but the hardware seemed rather mediocre for that generation. For this new gen the hardware seems far more capable for the asking price, time will tell how that plays out ¯\_(ツ)_/¯
 
Just to clarify what I meant by not aging well: I definitely didn't mean how games play or look when you revisit or replay them years later, but how the console UI performance degrades over time (kind of like a non-defragged disk in the olden days, before defragging was automated and more frequent in Windows; everything in the console OS just feels bogged down after all the updates they pile onto it), and how the attention and effort developers give the console declines as it ages, i.e. amazing releases at the start of the gen, then a slow decline in the quality of many aspects: graphics, ingenuity, going back and patching games, and the like.

And definitely, the price tag for the hardware you get this gen is just INSANE, especially compared to the price of a PC graphics card alone these days... it is unbelievable how big the value disparity is, now more than ever, when you combine the standalone value of the consoles with the post-COVID/mining resurgence that is causing a huge markup on some PC hardware prices.
But I still think that over the life of the console, the year-round high cost of console games makes the value proposition much lower than building a gaming PC instead and enjoying incredibly cheap, high-quality games on the PC platform 💸🤑
 

I can't speak for the Xbox side, but on the PlayStation I didn't really have UI issues with my PS4 Slim and PS4 Pro. Mine were upgraded with SSDs, and the shared memory between the system and the GPU never really caused any major issues either. That's obviously only one perspective, and other results may vary, but up until the end the PS4 still felt fine as a system.

Contrary to what you posted, I find devs actually pull off some great magic as they finally learn the ins and outs of the hardware, and this was the case for the PS4. The PS4 had a really great last set of games for me. From the last three years I've really, really loved God of War, Ghost of Tsushima, Nioh 2, Monster Hunter World, and Shadow of the Colossus remastered.

There have been other games, such as The Last of Us Part II and Days Gone, that made the tail end of the PS4's life pretty great. Mind you, I feel my PS5 is better, but the PS4 has some very solid games. Ghost of Tsushima in particular made me super happy.

When you say the high year-round cost, could you explain your position, by the way? Because from my experience, after the entry cost of buying a console, it's way cheaper than a PC. If you don't buy console games new, you can get them super cheap. This last month I got Code Vein and Nioh 2 for $10, and a spare copy of Ghost of Tsushima for $20. A PlayStation Plus membership was only $29 from CDKeys, and since I'm a PS5 owner I got like 20 games for free. Outside of the $500 for the PS5 I bought a couple months ago, my total console spend over the last two years has been lower than the cost of the two heatsinks I bought for my Ryzen systems.
 
I'm going to zero in on the optimization part of this because I think people don't completely realize the level of optimization that occurs on consoles. It still happens even today. Imagine not only recompiling your software to use every instruction the CPU offers, but also optimizing for cache and bandwidth system-wide. That's what happens on consoles. Core utilization does not equal efficiency. Those throughput numbers that manufacturers spit out? Your PC is hitting maybe half of that. On PCs there's a ton of overhead: generic drivers and software everywhere. Everything that drives your average PC is compiled against a minimum baseline, which is why everything is so compatible. Games on consoles, however, especially near end-of-life, are heavily optimized. This is also why console ports to PC haven't been that great lately. You think to yourself, "why is this port to PC so shitty?" Well, on the console everything is accounted for. When you port back to PC, yes, you have crazy power, but a very large amount of it goes into the ether.
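To make the "compiled on minimums" point concrete, here's a minimal sketch of the difference in build flags. This is a hypothetical illustration, not any studio's actual build setup; it assumes GCC on x86-64, where `btver2` happens to be GCC's name for the AMD Jaguar core family used in the PS4/Xbox One:

```shell
# Typical PC release build: targets the x86-64 baseline (SSE2 only),
# so the binary must run on any 64-bit x86 CPU sold since ~2003.
CFLAGS_PC="-O2 -march=x86-64 -mtune=generic"

# Console-style build: the exact CPU is known at ship time, so the
# compiler may assume AVX, F16C, BMI, etc., and schedule instructions
# around Jaguar's specific pipelines and cache sizes.
CFLAGS_CONSOLE="-O2 -march=btver2"

echo "PC:      $CFLAGS_PC"
echo "Console: $CFLAGS_CONSOLE"
```

The PC build leaves a decade-plus of instruction set extensions on the table because it can't assume they exist; the console build gets them all for free, before anyone even hand-tunes anything.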

I wouldn't necessarily say that Jaguar was meager in the console context, per se. The ease of development was no side feature but one of many features that drove that decision. It's remarkably balanced from the jump. It's meager compared to what's available on PC, but out of the door it started at something like 50-60% utilization, connected to total system bandwidth higher than a PC's (the entire system ran at GDDR5 speeds, not just the VRAM). Cell? In the beginning? Like 20%. It took many developers almost 10 years to hit its theoretical output of 3-4 times that of Jaguar; no one saw those numbers until the mid-2010s, when official support for it ended. Jaguar hit 90% by the second or third year.

We know the Jaguar cores are slower than today's processor cores, but comparing them to the desktop version does an injustice to what bare-metal coding and optimization can do on consoles.
 
I for one am glad they put PC first with Cyberpunk. No idea how well it has sold, but if it's half price and you don't yet own it, I would recommend it. It is adult rated so if nudity/violence offends your sensibilities, then you might want to pass.

I've run into a few minor bugs, but nothing game-breaking (PC).

It has been a very fun game; over 200 hours played. I think I'm getting close to the end, with still 4 or 5 side quests to finish before doing Nocturne.
 
We know the Jaguar cores are slower than today's processor cores, but comparing them to the desktop version does an injustice to what bare-metal coding and optimization can do on consoles.
I agree that's the point I was making.
 