Jedi Survivor is the best showcase of a looming problem for PC players

From what I've gathered from reading, implementing DLSS is just about as easy as implementing FSR, so why not include it? What feasible explanation could they have for omitting something that's just as easy to implement?

DLSS has Frame Gen, which of course requires a 40-series card. That feature has already been added to this game through mods and has been something of a panacea for those wanting to enjoy the game at high frame rates, and to my understanding the modder who implemented it has gotten it to the point where plain DLSS is stable and better than FSR. So again, why omit it? I'm currently playing Redfall, a game being lambasted for poor performance, visuals, bugs, etc., yet having DLSS and Frame Gen has alleviated the performance issues, and with DLAA the game doesn't look nearly as bad as people make it out to be. Bugs will be bugs, but at least two of the three issues are alleviated, which makes the game quite enjoyable on my 4070 Ti, and from what I've read, FSR in Redfall is seriously inferior to DLSS in every way. AMD has no offering that competes with DLSS and/or Frame Gen, yet FSR is chosen while DLSS is omitted in Jedi Survivor... again, why? Why omit something that could be a game changer for people who invested? Why sacrifice image quality and potentially better performance for the people who have the hardware, in favor of something inferior?

To the people arguing about cost-benefit... I honestly believe there are more 20-, 30-, and 40-series cards in use than AMD cards in total. Even if that's just a measly 20% of the market, that's still more than AMD, so again, why use only FSR when there are so many GPUs out there, in use, that can utilize DLSS? Why ignore a user base that, on its own, is larger than AMD's entire GPU user base? To appease the GTX 900/10-series and RX 400/500-series owners who probably couldn't even run this game, or who would be taking a massive hit graphically just to reach near-playable frame rates, even with FSR/DLSS? Yes, you can argue that a 1080 Ti might manage it, but that card is almost seven years old and probably makes up less than half a percent of the total GPU user base. So again, what purpose does omitting DLSS serve here?

There really isn't any solid reason other than AMD either paying, or "kindly" asking, EA to keep it out of the game because DLSS is superior and AMD knows it. Essentially, AMD is pulling an "Nvidia of old" kind of move, except that when Nvidia did it, at least Nvidia's feature was superior to AMD's offerings, if AMD had any.

So, as I see it, EA/Respawn, by giving in to AMD, are essentially kneecapping a group of PC gamers bigger than AMD's entire market share, because money talks and AMD is paying. Honestly, I'm glad I went back to Nvidia cards, because I'd rather be on the bleeding edge than trailing behind waiting for seconds when it comes to things like Frame Gen, DLSS, G-Sync, Reflex, etc. And if AMD is willing to resort to coercing developers into delivering an objectively worse gaming experience because it just can't compete, why bother supporting a company like that?

Please, if anyone wants to step in and correct me, I'm all ears.
 
There seems to be a mod now that allows DLSS Frame Generation to run, making the game much smoother.

https://www.reddit.com/r/pcmasterrace/comments/13amc0s/doubled_fps_on_star_wars_with_1_single_mod/

Now imagine if the developer didn't actively try to gimp a specific manufacturer's performance. Better yet, imagine if this forum treated this behavior equally regardless of manufacturer affected. :coffee:
That's PureDark's mod. He demoed it on launch day but was having problems because of Denuvo; it looks like that's been worked out. But the fact that he had it moderately functional within hours of the game coming out shows how crudely they removed it from the UE4 engine.
 
It uses about 3GB for the OS, then still needs to hold the regular engine and game code before any of it goes to VRAM. So staknhalo and LukeTbk are right: you get maybe 10GB for VRAM typically, if that. Plus the beautiful 1000p 20fps!
I think people get confused by the shared RAM on consoles and assume it can all be allocated to a single purpose, like VRAM, but it's really different from how PCs work. Sure, in theory that could happen, but in practice it's how you note: the OS needs RAM, and while that's probably less than Windows needs on a desktop, it's still going to be non-trivial. Then there's the game code and data itself. Some of that may end up being smaller than on a PC, since on PC you often have assets sitting in both system RAM and VRAM and that doesn't need to happen on a console, but a non-trivial amount still goes to non-visual things.

So no, you aren't getting the equivalent of 16GB of VRAM on a console. The 10GB or less estimate is probably a pretty good one. MAYBE a little more if a game is extremely lean on everything but visual assets.
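Rough back-of-the-napkin math of that budget (a minimal sketch; the OS reserve and the CPU-side game figure are assumed numbers for illustration, not published ones):

```python
# Rough console memory budget sketch. The OS reserve and the CPU-side game
# allocation are assumed figures for illustration, not official numbers.
total_unified_gb = 16.0   # shared RAM pool on current consoles
os_reserve_gb    = 3.0    # roughly what the OS keeps for itself (assumed)
game_cpu_side_gb = 3.0    # engine code, game logic, audio, streaming buffers (assumed)

left_for_gpu_assets_gb = total_unified_gb - os_reserve_gb - game_cpu_side_gb
print(f"Left over for 'VRAM-like' use: ~{left_for_gpu_assets_gb:.0f} GB")
# -> Left over for 'VRAM-like' use: ~10 GB
```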

The VRAM whining is also silly to me: you have a LOT of options with modern cards, and most games work just fine with more limited amounts if you turn down some details. So either:

1) Turn down your shit. If you cannot (or choose not to) afford a card with lots of VRAM, that's fine; it just means you need to decrease your settings a bit, particularly textures. That's life, and the last-gen consoles had to do the same.

2) Buy a card with more VRAM. You can have 16GB, you can even have 24GB. If you demand max settings all the time, you need to have lots of VRAM.

When did [H] become a place where a console that renders at 720p or less and has to upscale to reach 1080p is somehow a better way to play games than a PC at native 1440p or 2160p?

We are seeing the same issue I remember from the Xbox 360 days: people expecting PCs to do something far more demanding than the consoles and calling them worse when they can't. I remember when Oblivion came out, it ran at 720p 30fps on the 360 and could dip below that. PC gamers, however, were trying to run it at 1080p or 1920x1200, wanting 60fps, and pitching a fit when nothing short of an 8800 GTX could manage it, holding that up as an example of a "bad port".

So yeah, if you're trying to run Jedi Survivor at 1440p or 2160p, don't be surprised if it needs a chonk of a GPU to come anywhere near holding 60fps, because it looks like it runs at a way lower resolution on the consoles. Turn that shit down to 720p, use FSR to upscale it, and see how it runs.

I'm not trying to defend the bad coding; clearly this game should make much better use of the hardware it's given, particularly the CPU. But the people getting worked up because the consoles manage a decent job by running it at low resolution and upscaling need to chill. The PC can do the same, if you want it to.

It seems like some people just expect developers to magically make PC ports that can outperform console ports, even on lower end hardware.
 
Yes they did.
https://github.com/intel/xess

It's part of the oneAPI initiative
https://github.com/oneapi-src
XᵉSS is an open standard, not open source. The GitHub link is just the SDK; it doesn't contain the source code. oneAPI itself is an initiative for open standards, and not every project under its umbrella is open source.
My first thought is to play console games on consoles. If a port isn't optimized well for PC, play it on its original platform. Owning a PS5 is a good idea simply because of all the extraordinary exclusives.
Many games that don't run well on PC are also having issues on consoles. Jedi Survivor, in particular, is not even running at 1080p 30 FPS in quality mode.
No, more like selecting the performance option.
If you like running a game at 720p in 2023, sure.
So now DLSS/FSR is bad? Make up your minds. That's literally how DLSS and FSR work: both render below native resolution. Remember?
There are not many games using a reconstruction algorithm on consoles yet. Most are still doing tile rendering and then upscaling the image to your output resolution. Ignoring that fact, you're still starting from 1080p at worst in DLSS Performance mode when outputting to 4K. As mentioned above, this game is starting from around 600p in performance mode and 864p in quality mode.
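For reference, here is roughly how the standard DLSS 2 / FSR 2 quality modes map output resolution to internal render resolution (a quick sketch using the published per-axis scale factors; the game's dynamic-resolution numbers above obviously don't follow these fixed ratios):

```python
# Per-axis render-scale factors for the standard DLSS 2 / FSR 2 quality modes.
MODES = {
    "Quality":           1 / 1.5,   # ~66.7% per axis
    "Balanced":          1 / 1.72,  # ~58.1%
    "Performance":       1 / 2.0,   # 50%
    "Ultra Performance": 1 / 3.0,   # ~33.3%
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "Performance"))  # (1920, 1080): 4K output renders at 1080p
print(internal_res(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_res(2560, 1440, "Quality"))      # (1707, 960)
```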
 
I would have to agree. I am HAPPY game devs are finally using VRAM. I can't count how many times a new game has come out that barely uses any VRAM at 4K... I have a freaking 24GB of VRAM... PLEASE use more textures.

With Jedi Survivor at 4K I use between 18-20GB depending on where I am. I could not imagine trying to play this with a 12GB card at 4K and expecting smooth performance.
 
I hit 22GB last night playing... I think the only other game that uses that much for me is COD: MW2, but I think that game just allocates it and doesn't actually use all of it... dunno though, that game is smooth as butter at 4K.
 
Yeah I have a feeling it allocates a lot but may not be using that much at all.
 
Brackle & III_Slyflyer_III I may be in the minority of gamers, but I'm also happy something is leveraging the VRAM on my graphics card. It has been nice to fire up games at absolute maximum everything and see em run, almost out of the gate, flawlessly. I'm hoping we see more of this in the future. Though, after the backlash on this title, I suspect we will see less demanding games, way worse graphics, and in general lots of shit titles in the future. The way people exploded on the devs for releasing a game that hammers the consoles and PCs unless you have appropriate RAM and VRAM is gonna set the gaming industry back years. No one will push the envelope of game development until 16+ gigs is mainstream on video cards and 24-32 gigs of shared memory is standard on consoles.
How will this look?
Runs ok, looks like shit. I saw the downscaled / upscaled quality version on the PS5 and it looks like ass. I cannot imagine the 600P downscale for performance looks pretty.
 
Please, people were happy to have Crysis. This isn't Crysis, though. It's just a fuck up.
 
The funny part is, if the game would have released with DLSS 3.0 and Frame Gen, things would have been received way better on the higher end. Plus, all the tweaking I did to make it look and feel better should just be part of the game really. I don't know if it will set things back. The game does look good (after some tweaks), but it does not look "groundbreaking" to me anyway. It does not look bad either, I'm really enjoying the game, the only issue left for me is the loading micro stutters which may never go away unless I snag that DLSS 3.0 mod that is out now.

I'm no stranger to hardware being pushed either... I'm a max settings at 4K gamer and in the past that has hammered even the best of cards. This is probably the first game that makes the 4090 struggle, but it is also missing critical features like DLSS and better CPU usage to make things a bit smoother.
 
I bet a "Crysis" would get a totally different reaction nowadays.

If it can't be maxed out on 8GB of VRAM, the torches come out.
They were pleased!! Finally, a game that made people want to upgrade and pushed GPU and CPU companies to produce faster products.

The same should be said about Jedi Survivor. It really is next gen on the PC.
 
The funny part is, if the game would have released with DLSS 3.0 and Frame Gen, things would have been received way better on the higher end. Plus, all the tweaking I did to make it look and feel better should just be part of the game really. I don't know if it will set things back. The game does look good (after some tweaks), but it does not look "groundbreaking" to me anyway. It does not look bad either, I'm really enjoying the game, the only issue left for me is the loading micro stutters which may never go away unless I snag that DLSS 3.0 mod that is out now.

I'm no stranger to hardware being pushed either... I'm a max settings at 4K gamer and in the past that has hammered even the best of cards. This is probably the first game that makes the 4090 struggle, but it is also missing critical features like DLSS and better CPU usage to make things a bit smoother.
Well, DLSS is an Nvidia tech. It is not needed, as you can use FSR.

Now, DLSS is better, I agree. But it is not needed since FSR is out there and is open source for any dev to use... AND it works with any GPU vendor.
 
Is that the only thing going on with this title?
Probably not, but come on, people just shout "optimization" without even knowing what it means most of the time.

Like Hogwarts getting "optimized" aka popping in lower res textures to reduce the burden.

Now we've got people unironically thinking PCs should be as optimized as consoles. What is going on? Did spending $600+ for 8GB break people this badly?
 
To me "next gen" would need to be on the UE5 engine. This game is very last gen imo.
I do agree, but IMO there is no other game that looks and feels as good as Jedi Survivor. Maybe The Last of Us on PC, which would be a close second. Both games look fantastic at 4K maxed out; I don't think any other games come close to looking next gen. Hogwarts would be the only other one I can think of, and all three games use UE4 if I'm not mistaken.
 
Probably not, but come on, people just shout "optimization" without even knowing what it means most of the time.

Like Hogwarts getting "optimized" aka popping in lower res textures to reduce the burden.

Now we've got people unironically thinking PCs should be as optimized as consoles. What is going on? Did spending $600+ for 8GB break people this badly?
This is what happens when one GPU vendor controls the market and makes people think that 8GB for $600 was good value... Just like people think of ray tracing as RTX... It's not; RTX is Nvidia's technology for ray tracing.
 
Probably not, but come on, people just shout "optimization" without even knowing what it means most of the time.

Like Hogwarts getting "optimized" aka popping in lower res textures to reduce the burden.

Now we've got people unironically thinking PCs should be as optimized as consoles. What is going on? Did spending $600+ for 8GB break people this badly?
Yeah, optimization is usually the art of finding ways to turn stuff down in areas so it looks the same while using less. Like, in a grey hallway, changing the grey from a blackish white to a blackish blue to simplify the bounce lighting and improve performance. Or turning down the resolution of something in the distance with a lower-quality texture or a simpler model. For consoles it may just mean rendering fewer details: maybe some pipes or lights are not there, less debris littering the level, etc.
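As a toy illustration of that kind of distance-based detail scaling (purely a sketch, not how any particular engine implements it; the thresholds and names are made up):

```python
# Toy distance-based level-of-detail selection: objects farther from the camera
# get a cheaper mesh and a lower-resolution texture mip. Thresholds are made up.
def pick_lod(distance_m: float) -> dict:
    if distance_m < 10:
        return {"mesh": "high_poly", "texture_mip": 0}  # full-res texture
    elif distance_m < 40:
        return {"mesh": "mid_poly", "texture_mip": 1}   # half-res texture
    else:
        return {"mesh": "low_poly", "texture_mip": 2}   # quarter-res texture

for d in (5, 25, 120):
    print(f"{d} m -> {pick_lod(d)}")
```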

Very rarely are the optimizations code-based, as that's usually an engine thing; they can make small tweaks, but they aren't going to do anything drastic there.

This game just hurt on launch. You have an 8-core/16-thread CPU here? Let's load one core and the one thread associated with it and leave the remaining 14 idle, like it's 2010.
Don't want to use FSR? Well, we broke TAA, and you have to find and modify an ini file to fix it. Want to turn on FSR to improve performance? Whoops, we hard-coded it to performance mode, so it's back to that ini file to change a 50 to a 100. Want to change a game setting? Nope, we forgot to put that one in the menu, our bad.
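For anyone curious, that ini fiddling is roughly this kind of thing: appending console-variable overrides to the game's Engine.ini. This is only a sketch; the config path is hypothetical and r.ScreenPercentage is my guess at the "50 to 100" value being described, so check an actual tweak guide before using it:

```python
# Sketch: append cvar overrides to a UE4 game's Engine.ini under [SystemSettings].
# The path below is hypothetical and r.ScreenPercentage=100 is an assumed stand-in
# for the "change a 50 to a 100" tweak people describe; verify against a real guide.
from pathlib import Path

engine_ini = Path.home() / "AppData/Local/SwGame/Saved/Config/WindowsNoEditor/Engine.ini"  # hypothetical

overrides = "\n[SystemSettings]\nr.ScreenPercentage=100\n"

with engine_ini.open("a", encoding="utf-8") as f:
    f.write(overrides)
print("Appended overrides to", engine_ini)
```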
You can get caught up on FSR 2 vs. DLSS, but that's a distraction. The big issue is that they took the same game engine as the predecessor, managed to remove settings, and simultaneously reintroduced the same game-breaking bugs they had already patched out of the previous game.
It's like they just forgot all the things they learned from making and patching Fallen Order. I know they've had some turnover, but it's as if they swapped out the entire development team halfway through.
 
Nah man just download more VRAM, or buy a console lol
 
This is what happens when one GPU vendor controls the market and makes people think that 8GB for $600 was good value... Just like people think of ray tracing as RTX... It's not; RTX is Nvidia's technology for ray tracing.
The other problem is we only really have 1 GPU vendor. AMD seems more than happy to see their 7% market share slowly slip away almost like they decided dealing with the AIB’s was no longer worth it… And Intel is barely a plucky upstart in the graphics market.
 
Just turn down settings and don't overpay for less hardware next time.
Overpaying is about the only option we've had for the past 5 years. But it runs like ass across the board; the game just chugs on the processor side, almost like there is some sort of hard-coded limiter in there somewhere.
 
The other problem is we only really have 1 GPU vendor. AMD seems more than happy to see their 7% market share slowly slip away almost like they decided dealing with the AIB’s was no longer worth it… And Intel is barely a plucky upstart in the graphics market.
I do agree with you that AMD doesn't seem like it's trying very hard. But AMD also has the most GPUs in households around the world... consoles.
 
And soon handhelds too, as they are launching a line of silicon specifically for things like the Steam Deck. AMD is just looking over the consumer desktop market and peacing out.

I openly hope that Intel or Nvidia takes one of the major consoles away from them next gen so AMD has to start trying again.
 
I guess they've given up competing in retail/business dGPU and have just resigned themselves to 'cheapest bidder for GPUs in other people's products'

That's the 'competition' we're supposed to root for

Edit: And be happy when that 'try not' company sponsors games with superior features from their competitor removed. Root for them people 👍
 
Well, DLSS is an Nvidia tech. It is not needed, as you can use FSR.

Now, DLSS is better, I agree. But it is not needed since FSR is out there and is open source for any dev to use... AND it works with any GPU vendor.
Except FSR in this game is absolute poop, and DLSS (the overall consensus across games) looks and performs better, plus you get Frame Generation. I have no clue how UE4 works, but everyone is saying it's as "simple" as a checkbox, in which case there's no reason not to include it unless someone is actually paying the dev not to include it as part of a "sponsorship". It's a moot point now, though, as a mod is out to add Frame Gen, and likely DLSS shortly. It just makes the developer look bad at this point. There's no reason DLSS, FSR, and XeSS shouldn't be in all games in 2023.
 
It's not quite as easy as just a checkbox.
You also need to go to Epic Games\UE_engineversion\Engine\Plugins\Runtime\Nvidia\DLSS\ and extract the plugin zip into that directory.
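Once the files are in place, the plugin still has to be enabled for the project, which in UE4 is normally that checkbox in the editor's Plugins window; under the hood it just adds an entry to the .uproject file. A rough sketch of that step (the project filename and the "DLSS" plugin name are assumptions on my part):

```python
# Sketch: enable a DLSS plugin entry in a UE4 project descriptor (.uproject is plain JSON).
# The project filename and the plugin name "DLSS" are assumptions; the editor's
# Plugins window checkbox does the same thing for you.
import json
from pathlib import Path

uproject = Path("MyGame.uproject")  # hypothetical project file
desc = json.loads(uproject.read_text(encoding="utf-8"))

plugins = desc.setdefault("Plugins", [])
if not any(p.get("Name") == "DLSS" for p in plugins):
    plugins.append({"Name": "DLSS", "Enabled": True})

uproject.write_text(json.dumps(desc, indent="\t"), encoding="utf-8")
print("DLSS plugin enabled in", uproject)
```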
 
I guess they've given up competing in retail/business dGPU and have just resigned themselves to 'cheapest bidder for GPUs in other people's products'

That's the 'competition' we're supposed to root for

Edit: And be happy when that 'try not' company sponsors games with superior features from their competitor removed. Root for them people 👍
Personally, I think AMD's future on PC is APUs. They could make some REALLY good APUs that could easily kill the low end, which seems like a smart business choice.

Nvidia seems to be moving toward AI, which seems like the smart business choice as well.
 
Well, we are about to see it with the whole Strix Halo stuff coming out next year (a 40 CU part). Allegedly, Intel's on-die graphics is supposed to give most of AMD's upcoming product stack a run for its money as well on the low- to mid-range parts.

I don't expect it to be cheap, but it should be cheaper than a discrete card in some instances and way more energy efficient.
 
Those are laptop / NUC solutions from AMD.

Desktop will only have the 2 CU 6nm I/O chip, which isn't sufficient for gaming.

The 6500/6400/6300 will continue to sell for at least 2 more years because they are on 6nm.

I am hoping there will be a 6nm 7600 XT 16GB for $350 from AMD soon.

The best option for 1080p gamers would be to buy the RX 6800 / RX 6800 XT on discount. No new card from AMD/Nvidia is expected to match those. Maybe Intel Battlemage...
 
AMD will start selling desktop versions of the laptop parts with high-end APUs when they start getting threatened by Intel's graphics division. I doubt that any graphics AMD integrates on-die will be cheap. Effective on-die graphics is gonna cost some serious $$$.

As for the lower end parts like the 7600XT... AMD needs to release it with 16GB but we're only likely to see 8GB for that 350 buck price point.

Best option for 1080p gaming? Honestly, you can get acceptable FPS on nearly anything at 1080p...

https://www.pcmag.com/picks/the-best-graphics-cards-for-1080p-gaming (A bit old on the pricing)

https://www.amazon.com/XFX-Speedste...g=p00935-20&ascsubtag=00LxQa3f0EXnoTzK3l25bao 289 Bucks
 
How will this look?
Like the console, which is to say not great. My point was just that people shouldn't say "Well, the consoles can play it well!" unless they're using console settings and expecting console res/FPS.

Basically, a game is a bad port if an adequately equipped PC can't match console performance. We've seen some of these: the console holds 1080p60 while similar PC hardware stutters and drops frames. That's a bad port. However, if the game can (roughly) match what it does on the console, then it isn't a bad port; it might be bad coding all the way around, but not a bad port. If you demand more on the PC, that's great, that's part of the reason to game on a PC IMO, but you need to provide better hardware to do it. You can't come in with an older and/or lower-end GPU and then expect top-notch performance.

Mostly this was targeted at the people crying about it and saying that we should just get consoles and PCs suck for gaming because companies don't care and blah blah. It is clear that this game doesn't run fast on ANYTHING. Consoles get performance by cutting rez hard. Well, that's a valid solution on the PC too.
 
Brackle & III_Slyflyer_III I may be in the minority of gamers, but I'm also happy something is leveraging the VRAM on my graphics card. It has been nice to fire up games at absolute maximum everything and see em run, almost out of the gate, flawlessly. I'm hoping we see more of this in the future. Though, after the backlash on this title, I suspect we will see less demanding games, way worse graphics, and in general lots of shit titles in the future. The way people exploded on the devs for releasing a game that hammers the consoles and PCs unless you have appropriate RAM and VRAM is gonna set the gaming industry back years. No one will push the envelope of game development until 16+ gigs is mainstream on video cards and 24-32 gigs of shared memory is standard on consoles.

Runs ok, looks like shit. I saw the downscaled / upscaled quality version on the PS5 and it looks like ass. I cannot imagine the 600P downscale for performance looks pretty.
I'd agree with high VRAM usage if it's "optimized" VRAM usage, in other words no bugs, memory leaks, etc. Unfortunately, we're not at a point where devs are just going to start churning out games using 12GB+ of VRAM, and as you said, not until 16GB is mainstream, which will be a while since consoles pretty much dictate where gaming is at and where it's going.

I don't think the backlash on this title will set us back; if anything, it'll hopefully push devs to properly optimize their games prior to release. The issue with AAA games on PC this year hasn't been graphics or gameplay so much as optimization. Just look at games like Cyberpunk 2077, Control, and The Witcher 3 as examples of games that look good, really good, but don't require 12GB+ to run, then look at Jedi Survivor and ask how in the hell this game can use 20GB+ of VRAM. Hogwarts Legacy is another title that looks great, but definitely not 12GB+ great; it's not above CP2077 or The Witcher 3. Games are just being released broken, and the problem is that people are putting more emphasis on VRAM than is really, truly needed. 16GB of VRAM is quite a bit, an amount we probably won't see used for at least another six or seven years, unless of course you're into modding and adding high-res textures and all that jazz. And the kicker is that a majority of gamers are still running 1080p, not 1440p or 2160p where more VRAM becomes more important, and not so much for the amount of it as for the wider memory bus that comes with it.

At the end of the day, I'm in the group that firmly believes 12GB is sufficient for 1440p and below, and 16GB is sufficient (for memory-bus reasons) for 4K. We won't see many games using more than 12GB naturally unless they're poorly optimized, and I'm not saying this as a 4070 Ti owner, I'm saying this as a realist: games aren't looking better enough to justify the drastic uptick in VRAM usage, and if CP2077 is anything to go by, true usage of anything near or above 12GB would bring even a 4090 to its knees. If games are coming out looking worse than CP2077 or The Witcher 3 while requiring more resources, then it's on the game devs to fix, not Nvidia or AMD; otherwise devs would just release broken game after broken game with little to no optimization or patching. Asking Nvidia and AMD to keep band-aiding the issue with more VRAM or faster GPUs every time a batch of broken games comes out would get very expensive very quickly, still wouldn't fix the issue at its root, and that right there would be the downfall of PC gaming.
 