GoldenTiger (Fully [H])
Joined: Dec 2, 2004 | Messages: 29,783
LOL at complaining about system ram usage... 32gb of ddr4 is around $60. No reason not to have that nowadays for a gamer.
> LOL at complaining about system ram usage... 32gb of ddr4 is around $60. No reason not to have that nowadays for a gamer.

LOL at the person who doesn't realize that system memory is slower than VRAM and how that can cause a host of issues. But you do you.

> LOL at the person who doesn't realize that system memory is slower than VRAM and how that can cause a host of issues. But you do you.

I said nothing about VRAM. Are you confused?

> While that's true there's something to be said for the added value of extra VRAM. If you plan on keeping the card for a fairly long time it would make sense to bias your decision in favor of AMD.

Only if you hate resale value.

> You sure you want to go there? Hardware Unboxed did a video about what happens in VRAM constrained scenarios on Nvidia cards. It wasn't pretty.

Again though, you didn't address my original comment: how come the 3070 images looked better than the 6700XT images?
There's no such thing as 10GB driver overhead.
We are looking at the memory, not the performance, in the last picture.
The benchmarks literally don't agree with you. You keep answering with "some unknown reason" even when the numbers are staring you in the face. This has already been benchmarked; I was making a statement, not asking a question we already had an answer to.
> Again though, you didn't address my original comment: how come the 3070 images looked better than the 6700XT images?

Easy: they don't.

> As for the VRAM... look, in the initial three, the only one coming close to hitting the VRAM cap was Forza Horizon 5. The rest were at 4GB and 5GB respectively. Last I checked, the 3070 has 8GB of VRAM, so again, why would the Nvidia-only system require the use of so much system memory when it's not near the cap? It has a bigger memory bus than the 6700XT, so memory bandwidth couldn't be the issue, and the graphics settings are the "same," so why would the Nvidia card use 2x the system memory? There's absolutely nothing staring me in the face with those screenshots. I don't get the point of your screenshots, truthfully.

The "cap" can be set to anything. It's not just the 3070 that would benefit.

> As for the FH5 results, after watching six different videos, yes it uses a lot, but doesn't stutter or have 1% low issues.

Pretty darn sure the 3070 ain't on top in that benchmark.

> Honestly, I think you went and cherry picked these screenshots to try and make a point, but what point were you exactly trying to make?

You could use the same video; the problem persists in every game. Huge amounts of system memory being used to cover up a lack of VRAM.

> While I will agree the 3070 got the shaft with 8GB of VRAM, it was released at a time when games weren't normally using that much VRAM, and considering the card is targeted at 1440p, that only reinforces that. Even then, not a lot of games are actually "using" over 8GB of VRAM unless they're not optimized. Hell, from what I've seen through all these different conversations, videos, research, etc., GPUs with higher VRAM pools tend to allocate more VRAM, showing higher VRAM usage, while cards like the 4070Ti, which I have, show very different results.

Literally every AAA game being released today is going over 8GB.

> As for this whole VRAM debacle... why is it a 4070Ti walks all over a 6950XT, or lands in between a 3090/3090Ti, when it has a 192-bit bus/12GB of VRAM versus their 16GB/256-bit and 24GB/384-bit buses, even at 4K?

Because it's got quite a bit more cache than a 3090.

> Why is it only a couple of games actually post better 1% lows, while still averaging less FPS?

Have we regressed to the point that we have forgotten that FPS only tells part of the story? You can have higher FPS and crappy lows, and that is something I wouldn't want.

> Lastly, take a look at the most played games on Steam/Epic and tell me how many of those games actually use anything close to 8GB. Using 2023 games at Ultra settings on a mid-range GPU released in 2021 isn't going to give you the performance you'd expect, and that's been the case with GPUs as far back as I can remember: mid-range GPUs tend to give great performance in AAA games for a good year or two before you need to start adjusting the settings.

AAA? Most of them, and definitely at 4K. I mean, I can run through my library, but it's quite a bit.
> I said nothing about VRAM. Are you confused?

Nope.

> Only if you hate resale value.

They're all under $100 eventually.

> Only if you hate resale value.

If you're keeping your card for a long time, resale value probably isn't a big concern.

> If you're keeping your card for a long time, resale value probably isn't a big concern.

It's only relevant if you're upgrading every cycle, or maybe every other. Used cards are rapidly dropping in value now that mining isn't vacuuming up the supply.

> If you're keeping your card for a long time, resale value probably isn't a big concern.

I think it depends on what constitutes a "long time". My 2080ti is almost 4 years old and I'm seeing that people are still buying them for $350! That's about 40% of what I bought it for all those years ago (open box at MC). I'm tempted to sell it for that while I can and grab a 7900/4080.

> I think it depends on what constitutes a "long time". My 2080ti is almost 4 years old and I'm seeing that people are still buying them for $350! ...

I'm thinking 5+ years. My brother just upgraded; he was still using a 970 4GB. Can't expect to get much for that. You also got a deal on open box, which is not typical; normally you would have paid ~$1200 for that new.

> I'm thinking 5+ years. My brother just upgraded; he was still using a 970 4GB. Can't expect to get much for that. ...

Yeah, I realize that, but at the same time I've seen a few open-box 7900XTXs show up at MC for $830-860 over the past few weeks...

> Easy they don't.

Actually, no. Check SteamDB to see the games with the highest player bases, most played.
> Easy they don't.

Agree, averages mean crap; you can have high averages and an utter stutter fest. 1% lows don't affect the average much, since they're only 1% of the frames, but if that 1% is in the sub-30fps range (or even radically departing frame rates, 100fps to 50fps, i.e. large frame-time differences), it can wreak havoc on smoothness and the timing of shots or movement.
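Since the thread keeps circling back to averages vs 1% lows, here's a minimal sketch (not from any benchmarking tool; the method and numbers are illustrative, and some tools use the 99th-percentile frame time instead) of how both figures are typically derived from a frame-time capture, showing how a single long hitch tanks the 1% low while barely moving the average:

```python
def fps_stats(frame_times_ms):
    """Average FPS and 1%-low FPS from a list of frame times (ms).

    "1% low" here means the FPS implied by the mean of the slowest 1%
    of frames; other tools use the 99th-percentile frame time, so
    published numbers vary slightly by methodology.
    """
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # Take the slowest 1% of frames (at least one frame).
    worst = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps

# 99 smooth frames (10 ms each) plus one 40 ms hitch:
# the average barely drops, but the 1% low collapses to 25 FPS.
avg, low = fps_stats([10.0] * 99 + [40.0])
```

One 40 ms frame out of a hundred still leaves the average around 97 FPS, which is exactly why an average-only chart hides the stutter the posts above are arguing about.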
> I think it depends on what constitutes a "long time". My 2080ti is almost 4 years old and I'm seeing that people are still buying them for $350! ...

I think what is keeping the 2080ti around $350 is that it's, for better or worse, the best 1440p Nvidia GPU under $400 right now. It has enough VRAM and compares to the 6700XT, which is similar in price. So if someone has a max $300-350 budget and wants Nvidia, the used 2080ti is the one to get. I almost bought one last week, but I'm looking at more like $300, not $350.

> Actually, no. Check SteamDB to see the games with the highest player bases, most played.

So your whole thing is that the most popular games are below 8GB? Errr, OK. I don't know why that matters, since your WoWs and Dota 2s are always going to hit the top spots because they're either cheap to play or, in the case of Dota, completely free. We're enthusiasts, and most of us aren't broke. So that's who I'm talking about.
https://steamdb.info/
Reread what I put about games that use 8GB: I said not a lot of them do. You think maybe 10 games constitutes a lot? Even then, once you drop their settings from Ultra to High, that usage drops.
> Average frame rate is just as important as 1% lows; if your average frame rate is garbage, what good are better 1% lows?

Nope. Now, it has to be within reason, but if you gave me a game that starts at 30 and never goes below 30 vs a game that starts at 60 and frequently drops, I'm going to pick the 30-locked option. It's going to give a much better experience than if the FPS is all over the place.

> Explain how it makes sense that a card that isn't even being VRAM-capped to begin with is using 2x the system memory of a card that has only 4GB more. Compensating for the lack of VRAM indicates that the games are actually using more VRAM than what the card has, but in your screenshot, using 5GB out of 8GB wouldn't require phenomenally more system memory. So, again, how is it compensating for something it's not even fully using? In FH5 I could understand, but the next two... again, what you're trying to say doesn't make sense.

As I stated before, where things get stored, and how much, is completely programmable. If I had a lineup that included cards like the GTX 1050, for instance, which only has 4GB, would I set the cap at 4GB and then stream in textures as needed from system memory to make up the deficit? Yup. What's the benefit? The game is going to run better than if it hits a brick wall and then has to immediately go out to storage to load in the next batch of textures. When Hogwarts released, for instance, this is exactly what you saw. Doing this ensures that all of the cards within your lineup will be able to play the game without having a horrible experience. The reason to do this seems pretty obvious to me.

> So your whole thing is that the most popular games are below 8GB? Errr, OK. ...

Yes, and that's why AMD and Nvidia are still going to release 8GB options, and why they're going to still sell like gangbusters. I'm not referring to myself, you, or a good portion of [H]. Until those games start seeing better requirements, 8GB is going to be a mainstream option for the mainstream masses who game on PC.
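The fixed-cap-plus-spillover scheme described above can be sketched as a toy model (hypothetical class and names, not any real engine's API): textures evicted from a VRAM budget stay staged in system RAM, so only the first touch of a texture ever hits storage. This is one plausible reason a small-VRAM card can show very high system-memory usage without hitching:

```python
from collections import OrderedDict

class TextureStreamer:
    """Toy model of a VRAM budget ("cap") with LRU eviction.

    Textures evicted from VRAM fall back to a system-RAM staging
    cache, so a re-request is a cheap RAM-to-VRAM copy instead of a
    slow disk load. Sizes and behavior are illustrative only.
    """
    def __init__(self, vram_budget_mb):
        self.budget = vram_budget_mb
        self.vram = OrderedDict()   # texture -> size (MB), LRU order
        self.sysram = set()         # textures staged in system RAM
        self.disk_loads = 0

    def request(self, tex, size_mb):
        if tex in self.vram:        # already resident: free
            self.vram.move_to_end(tex)
            return
        if tex not in self.sysram:  # cold: must hit storage once
            self.disk_loads += 1
            self.sysram.add(tex)
        # Evict least-recently-used textures until the new one fits.
        while self.vram and sum(self.vram.values()) + size_mb > self.budget:
            evicted, _ = self.vram.popitem(last=False)
            self.sysram.add(evicted)  # spill to system RAM, not disk
        self.vram[tex] = size_mb

# Stream the same three 4 MB textures twice through an 8 MB cap:
# every texture hits disk exactly once, then lives in RAM/VRAM.
s = TextureStreamer(vram_budget_mb=8)
for _ in range(2):
    for tex in ("a", "b", "c"):
        s.request(tex, 4)
```

The trade is explicit: the smaller the cap, the more of the working set sits in system RAM, which is the pattern the screenshots in this thread were arguing over.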
> I'd like NV to add more VRAM. Yes! Can be used to improve everything. Please more, sir. That's one thing.
> I'd also like developers to return to the time when they did better asset analysis and working-set management.
> These "AAA" titles don't actually look better than things which existed years ago. Random example: The Division 1. Still looks amazing compared to current titles. 2016.

Now with the release of Tears of the Kingdom, it's even more apparent how lazy developers have gotten. While both Zelda games are not without issues, with minor frame drops here and there, the fact that such a gargantuan game exists and plays on hardware like Nvidia's Tegra X1 (an SoC released in 2015, no less) is a testament to true optimization.
> Now with the release of Tears of the Kingdom, it's even more apparent how lazy developers have gotten. While both Zelda games are not without issues, the fact that such a gargantuan game exists and plays on hardware like Nvidia's Tegra X1 (an SoC released in 2015, no less) is a testament to true optimization.

I mean, it is a well-optimized game (my girlfriend has been playing it). However, there are things you notice about it right away:

1) It is locked to 30fps max, and it drops down to 20fps not infrequently.
2) The game runs at 1600x900 max, and frequently lowers the resolution dynamically to 1280x720 (maybe lower) depending on what is happening.
3) Texture resolution is quite low; many things are solid/cel-shaded, and the things that are textured have pretty low-res textures overall.

Some of these are things that gamers scream about, a lot, at least when it isn't on a Switch. Some of these, like FPS, are things people are screaming about for Jedi Survivor, and they're mad it isn't doing 60, not 30.

When you compare it to Jedi Survivor, the graphics, the world detail, the mob detail, the animation, etc. are not even in the same ballpark. So sure, Jedi Survivor is not nearly as optimized as it could or should be, but you're also comparing apples to steak.

> I mean, it is a well-optimized game (my girlfriend has been playing it). However, there are things you notice about it right away: ...

We know all that. Did you miss the fact that I clearly stated it runs on a Tegra X1? A mobile chip from 2015; in other words, a 7-year-old mobile chip. Of course there are going to be sacrifices, but there is absolutely no doubt that TotK is insanely more optimized than Jedi Survivor. The mere fact you can jump from a Sky Island all the way down to the Depths with absolutely no loading screen on a geriatric mobile chip says it all.
> Yes, and that's why AMD and Nvidia are still going to release 8GB options, and why they're going to still sell like gangbusters. ... Until those games start seeing better requirements, 8GB is going to be a mainstream option for the mainstream masses who game on PC.

For mainstream, I can see it. But people are just going to have to realize that the latest engines mean they're going to have to turn down settings, even at 1080p in some instances.

> In all fairness, IF your GPU is only managing 30 FPS on any title, chances are it's not a locked 30 FPS, unless you intentionally dropped your FPS from something higher to maintain stability. So, at 30 FPS you'll most likely see 1% lows in the teens to mid-20s.

1% lows are not directly linked to FPS.

> Now, that makes sense. Although, why would a card with only 50% more VRAM require that much less system memory? If games are being programmed to store, let's say, 20GB of texture data in memory due to VRAM limitations on an 8GB GPU, why would the 12GB GPU only require half that? In the case of FH5, it's using 22GB of system memory with the 3070, while the 6700XT only uses 10GB; then Hitman, where, again, the 6700XT is using only half the system memory of the 3070. Then you go down the list to Plague Tale Requiem and the difference is only 2GB of system memory. The math there doesn't add up. It's either over-compensating, or there's something in the game's code that is telling it to pull more than it actually needs for the 3070, since system memory plus VRAM usage equals 30GB, while the 6700XT is only at 22GB in Forza Horizon 5. Either way, if games are programmed to have memory at the ready to prevent the "brick-wall" effect, then even people with 12GB cards should be fine for quite some time. The 3070 doesn't seem to be having too many issues with high quality settings in newer AAA games; it's only at the games' highest settings. TLOU and Jedi Survivor are examples of this.

Every type of game has a different performance profile. One of the toughest games to do from a performance perspective is a racing game. LOD is far higher in racing games because the draw distance is so far. Games that are not open and have a bunch of corridors, like Plague Tale, RE, and Hitman, will require far less LOD at a distance, so there's less to load. Obscuring what gets loaded in corridor games is far easier to control. FH5 is incredibly open, with tons of roads you can take, so it's at the extreme of pre-caching.

> Again though, I'm not advocating for more 8GB cards, or for Nvidia to keep releasing cards with bare-minimum VRAM. I'm just saying that more VRAM doesn't always equal better performance; otherwise the 4070Ti wouldn't consistently beat a 3090, or 6950XT, even at 4K resolutions.

Never said always. "Always" is a word that never should be used in anything technical.

> 1% lows will always happen; games tend to be coded differently, and stuttering, unfortunately, will always seem to be a part of PC gaming as long as developers keep on being lazy. It doesn't matter how much VRAM you have; a poorly coded game will always stutter.

This is a misnomer. They aren't really being lazy. Time frames in game development are controlled by producers, not development. No one wants to release a crappy game from a development and testing POV. Secondarily, with ports, consoles drive development. This has been the case for a very long time. The only pet peeve I have with consoles is the previous generation, as it was not nearly as much of a jump as it should have been. But this current generation isn't something to be sneezed at. All of them are walking around with power equivalent to a 3070 or a 6700 non-XT, but in a unified setup. Honestly, you can't recreate that setup on PC. The only way to do it is to create a unified space LIKE VRAM and have everything come out of that. So if a video card is moving about the cabin with 8GB of VRAM, which, argue if you want, is less than what is possible via console, then expect to see problems.

> Addendum: I honestly wish we weren't getting games looking straight out of 2018-2019 in 2023. It only makes matters worse when they're releasing with graphics from 2018 while using vastly more resources than they should, but I know they do get optimized eventually, hence why I'm not concerned about my 4070Ti's 12GB of VRAM.

What game is that?
Are you serious?? A) There are FPS drops EVERYWHERE in Zelda: Tears of the Kingdom. The draw-in distance is really close, and when it's not, everything is covered in mist. B) The level of detail is... just come the fuck on, not even 2016. The color palette might be 16-bit, but I'm almost doubting that. It's definitely restricted to preserve bandwidth. You're comparing this 720p game (because if Nintendo drops the resolution, then there's no problem at all) with Virtua Fighter 2 shadows:
View attachment 570122
With this (they rendered arm hair) there's also ray-tracing on the PS5 and Xbox versions:
View attachment 570121
I have the Zelda game; there's loading in tons of places. You're just not being honest here, not at all, but it shows how you're approaching this, that's for sure. There's also the issue of there being NO PC PORTS. From a development standpoint it's not even an apples-to-oranges comparison. It's more like comparing balloons to grape jelly.
Oh wait, I had to check Digital Foundry (nVidia homebase) because your summary just seemed like marketing, and wouldn't you know, you're repeating Digital Foundry word for word (I literally found the exact spot from the nVidia marketing slick in this video):
So I'm dealing with nVidia marketing directly. Hi, nVidia? How are sales?
EA can't go after nVidia for defamation, but other developers should for their "everything is so unoptimized" marketing nonsense.
> We know all that. Did you miss the fact that I clearly stated it runs on a Tegra X1? A mobile chip from 2015; in other words, a 7-year-old mobile chip. ... The mere fact you can jump from a Sky Island all the way down to the Depths with absolutely no loading screen on a geriatric mobile chip says it all.

Yes... and it LOOKS LIKE IT. It looks like a game that was designed for an old, low-power mobile chip. It's good for what it is, and that's great, but there were a lot of tradeoffs made, not least of which being low res (lower than the Switch's native) and low FPS. Jedi Survivor is the opposite. It is VERY modern, VERY detailed, VERY shiny, and unsurprisingly demanding because of that. Now, it also clearly is not as well coded as it should be, the biggest evidence being that it doesn't make effective use of all the cores that modern processors offer. However, the fact that a game that looks like this noms up VRAM like mad and needs a chonk to run at 1440p or above is not a surprise to me.
Holy shit, the fact that I have to keep saying this is mind-boggling. A mobile SoC FROM 2015. How is this going over your heads?
One last time for the haters in the back: a 7-year-old mobile SoC. Still fewer issues than the turds being released in 2023, because of proper optimization from their devs.
From here on out, I will be ignoring the obvious trolls.
> It's kind of hard to be impressed when it looks like a 7 year old mobile game.

I mean, I will say it is impressive for what it is. The graphics are good for an open-world Switch game. But it for sure looks like it. It looks like a Switch game; it runs like a Switch game. It is not some amazing feat of coding that makes me go "This can't possibly be a Switch!"

> It is apparent to me now how little most people know of actual game development. My mind is blown at just how oblivious most gamers actually are.

What area of Jedi Survivor does this comment relate to?
View attachment 570130
> It is apparent to me now how little most people know of actual game development. My mind is blown at just how oblivious most gamers actually are.

Why would a gamer know a ton about game development, any more than a driver would know about automobile engineering?

> What game is that?

Any number of games running the UE4 engine. Just look at Batman: Arkham Knight; that game was released in 2015 and gives games released 8 years later a run for their money. A lot of these games aren't living up to the hardware cost they demand just to run them optimally. Doom Eternal is an example of a non-UE4 game that looks phenomenal, yet runs on a potato. Then there's both Witcher 3 and CP2077, running on the RED engine, that just blow every UE4 title out of the water. Then you've got RE4, which is running on Capcom's engine, and that game looks and runs great on all manner of hardware. UE4 games just look so dated, hence my remark about games looking like they belong back in 2018.

> LOL at complaining about system ram usage... 32gb of ddr4 is around $60. No reason not to have that nowadays for a gamer.

Nope. Link? And on AM4, speed does matter. So let's see your shit RAM for $60.

> More to the point: What, specifically, needs to be done to optimize Jedi Survivor? What are the specific things they need to do to improve the framerate, or lower the VRAM usage, or fix other issues you have with it?

Apparently just pop in lower-res textures all over the place, like they did with Hogwarts. "Look guys, it uses less VRAM now!" This argument would never be happening if Nvidia would just be a little bit more forward-thinking in how much they include for the price points they're demanding. Like, just thinking one generation ahead.

> Nope. Link? And on AM4, speed does matter. So let's see your shit RAM for $60.

Nice try, tough guy: https://slickdeals.net/f/16624250-3...dr4-3600-cl18-desktop-memory-66-free-shipping

> Why would a gamer know a ton about game development, any more than a driver would know about automobile engineering?

Hogwarts, Jedi Survivor, Callisto, etc. all suffer from UE4 largely being an ancient engine at this point. There is a lot that needs to be done for optimization on PC. The games frequently end up being CPU-bound, even on the highest-end CPUs. This is largely because the entire engine slows to a crawl when a single thread is being hammered to render more complicated scenes.

However, since you imply that you DO know so much, what are your credentials? What training and experience give you extensive insight into game development?

More to the point: What, specifically, needs to be done to optimize Jedi Survivor? What are the specific things they need to do to improve the framerate, or lower the VRAM usage, or fix other issues you have with it? Because, personally, I feel like "this needs more optimization" gets thrown around by a lot of gamers a lot of the time as a generic term, as though "optimization" is a magic wand that, if just waved, would make everything better. That's not how it works, of course. So if you are well informed about how it's done, let's hear what specific things they needed to do but didn't, or how they need to reimplement things.
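The single-thread bottleneck mentioned above puts a hard ceiling on what extra cores can buy; Amdahl's law makes that concrete. The fractions below are made up for illustration, not measured from any UE4 title:

```python
def amdahl_speedup(serial_fraction, cores):
    """Upper bound on overall speedup when `serial_fraction` of the
    per-frame CPU work is pinned to one thread and the remainder
    parallelizes perfectly across `cores` (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# If half of each frame's CPU work sits on one render thread, even an
# 8-core CPU cannot double the frame rate, and infinite cores top out at 2x.
speedup_8 = amdahl_speedup(0.5, 8)        # ~1.78x
speedup_limit = amdahl_speedup(0.5, 10**9)  # approaches 2x
```

This is why "CPU-bound on a high-end CPU" usually means one hot thread, not an underpowered chip: adding cores barely moves the ceiling until the serial work itself shrinks.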
> Tears of the Kingdom runs on an ancient relic like the Tegra X1, in a system that has 4GB of RAM, and still manages to feel smooth as butter when playing it. Does the game look bad by today's standards? Depends on who you ask, but to me it looks very good considering the hardware running it, and the fact that Nintendo managed to squeeze all of it onto an 18GB cartridge of all things, while these newer games are requiring 4-5x that amount... it's astounding. This is part of the reason I respect Nintendo: they don't fuck around with their first-party, or even second-party, titles when it comes to optimization, squeezing as much as they can from what they have available and managing to make their games entertaining despite having dated graphics.

Probably because it's a 720p game, not 1080. Also, it uses FSR. Yep, that FSR. So is FSR still bad, or nah?

> Any number of games running the UE4 engine. Just look at Batman: Arkham Knight; that game was released in 2015 and gives games released 8 years later a run for their money.

? That game was notorious for how bad the PC port ran. I'm surprised you mentioned it.

> A lot of these games aren't living up to the hardware cost they demand just to run them optimally. Doom Eternal is an example of a non-UE4 game that looks phenomenal, yet runs on a potato.

Also? Doom Eternal requires more than 8GB to run at the higher resolutions. Hardware Unboxed did a write-up on it, specifically how the 3080 was the better card largely because it had more VRAM than the 3070.

> Then there's both Witcher 3...

That game had low-quality textures and required an HD pack to even bring it up to the level of what we're talking about.

> ...and CP2077, running on the RED engine, that just blow every UE4 title out of the water.

Yup, and it doesn't "run on a potato" either, and it had one of the most troubled launches that I can remember.

> Then you've got RE4, which is running on Capcom's engine, and that game looks and runs great on all manner of hardware. UE4 games just look so dated, hence my remark about games looking like they belong back in 2018.

RE4 also received complaints, so I'm not sure what you're talking about.

> Tears of the Kingdom runs on an ancient relic like the Tegra X1, in a system that has 4GB of RAM, and still manages to feel smooth as butter when playing it. ...

Arkham Knight was a UE3 game, albeit a heavily modified one. #rocksteadymagic
> Probably because it's a 720p game, not 1080. Also, it uses FSR. Yep, that FSR. So is FSR still bad, or nah?

Never said FSR was bad; point it out and I'll gladly own up to it, but AFAIK I've never said it was bad. Just not as good as DLSS.

> ? That game was notorious for how bad the PC port ran. I'm surprised you mentioned it.

Still looked good, both on PC and consoles. Never said anything about its PC port. It did get optimized eventually, and it does look better than 3/4 of the UE4 games available.

> Also? Doom Eternal requires more than 8GB to run at the higher resolutions. Hardware Unboxed did a write-up on it, specifically how the 3080 was the better card largely because it had more VRAM than the 3070.

You left out a minor detail: at the highest in-game settings with ray tracing enabled. At the next level of detail, it can run even at 4K on a 3070.

> Also, it's running on an engine developed by easily the best game developer of our lifetime.

I agree, and more publishers should be asking them for advice.

> That game had low-quality textures and required an HD pack to even bring it up to the level of what we're talking about.

Yes, and even with a texture pack it didn't require 15-20GB of VRAM. Now we have the updated next-gen version, which, while still a bit buggy, isn't demanding 15-20GB of VRAM just to be playable.

> Yup, and it doesn't "run on a potato" either, and it had one of the most troubled launches that I can remember.

I didn't say CP2077 could be played on a potato; that was Doom Eternal. Yeah, CP2077 was troubled at launch due to investors getting impatient and gamers sending death threats to CD Projekt Red, so they just released it as it was, but they are still continuously fixing it; again, something that should be noted by game publishers.

> RE4 also received complaints, so I'm not sure what you're talking about.

What complaints? The only complaint I'm aware of is from the VRAM Doomer club. If the game had any issues, Capcom was super quick to fix them, unlike EA or Naughty Dog.

> Never said FSR was bad; point it out and I'll gladly own up to it, but AFAIK I've never said it was bad. Just not as good as DLSS.

Well, if you can't use DLSS on your own hardware, then I don't even agree with the last part.

> Still looked good, both on PC and consoles. Never said anything about its PC port. It did get optimized eventually, and it does look better than 3/4 of the UE4 games available.

We've been talking about the PC ports this entire time. Literally, it's in the title of this thread, and no, it does not look better than 3/4 of the UE4 games available.

> You left out a minor detail: at the highest in-game settings with ray tracing enabled. At the next level of detail, it can run even at 4K on a 3070.

Yup, just slower because of the lack of VRAM.

> Yes, and even with a texture pack it didn't require 15-20GB of VRAM. Now we have the updated next-gen version, which, while still a bit buggy, isn't demanding 15-20GB of VRAM just to be playable.

It requires more than 8GB at high settings, though.

> I didn't say CP2077 could be played on a potato; that was Doom Eternal. ... they are still continuously fixing it; again, something that should be noted by game publishers.

And EA is as well, with Jedi Survivor.

> What complaints? The only complaint I'm aware of is from the VRAM Doomer club. If the game had any issues, Capcom was super quick to fix them, unlike EA or Naughty Dog.

Nope. Digital Foundry complained about it not being optimized and how it used too much VRAM.