Hogwarts Legacy

So far it's generally running well with max settings @ 1440p for me. There is some marginal stutter when you first enter through doors / new areas, but it's really not that bad.

And lol @ the 1/10 Wired review. This game is amazing from what I've played so far.
 
It's probably not 100% VRAM, but that is definitely a big factor. Sorry to 3070 users, but everyone should have seen that 8GB was going to be very limiting going forward.
 
It's probably not 100% VRAM, but that is definitely a big factor. Sorry to 3070 users, but everyone should have seen that 8GB was going to be very limiting going forward.
The 10GB of VRAM was the reason I didn't want to "downgrade" to a 3080. I knew it would be a problem; I just didn't imagine it would be a problem this soon.
 
The 10GB of VRAM was the reason I didn't want to "downgrade" to a 3080. I knew it would be a problem; I just didn't imagine it would be a problem this soon.
Yup. Never go down on an upgrade. Idc how fast the bus is. I'm not going to be caught with the next Fury X.
 
Nvidia driver overhead?
Doubt it. That driver overhead doesn't seem to exist on the 3080 Ti even though it's the same driver.

There's a software issue at play here. It could be drivers or the game, perhaps both. It's clear to me it isn't a hardware issue.
 
Definitely can't get away with 16GB of RAM in this title; total RAM usage for me is around 22GB. Should've upgraded to 64GB instead of 32.
 
Doubt it. That driver overhead doesn't seem to exist on the 3080 Ti even though it's the same driver.

There's a software issue at play here. It could be drivers or the game, perhaps both. It's clear to me it isn't a hardware issue.
The 3080 Ti has more VRAM, so it fits the picture; pretty sure the game just wants all the VRAM for RT.
 
I dunno - my 3080 with 10 gigs of VRAM has barely gone over 9 gigs of usage yet.
Our experiences all seem very sporadic.
Okay, I think some people are not realizing what is going on in some newer games. In the past we could look at VRAM usage and think "oh, that's just what the game allocates, not what it really needs," but things have absolutely changed in some newer games. What is going on now is that some games will start spilling into system RAM without even allocating all of the VRAM. For instance, if you have a 10-gig card, a game may stop allocating at just over 9 gigs and go ahead and start using system RAM when it's running out of VRAM. At least a couple of other games have been doing this, because you can see system RAM usage go way up compared to a card that has more VRAM. For example, if you compared a 3080 12GB and a 3080 10GB, you would see more system RAM usage on the 10GB model when it's VRAM limited.
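In case it helps, here's a minimal sketch of how you could watch this happen yourself, assuming an Nvidia card with the nvidia-smi CLI on the PATH and Python with the psutil package installed (the script and its names are just illustrative, not anything the game itself exposes):

```python
# Minimal sketch: poll dedicated VRAM vs. system RAM while the game runs.
# Assumes nvidia-smi is on the PATH and the psutil package is installed.
import subprocess
import time

import psutil


def vram_used_mib() -> int:
    """Return used dedicated VRAM in MiB, as reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])  # first GPU only


if __name__ == "__main__":
    while True:
        ram = psutil.virtual_memory()
        print(
            f"VRAM used: {vram_used_mib()} MiB | "
            f"system RAM used: {ram.used / 2**30:.1f} GiB"
        )
        # If VRAM plateaus just under the card's capacity while system RAM
        # keeps climbing, the game is likely spilling assets into system memory.
        time.sleep(2)
```

If VRAM flattens out at, say, just over 9 gigs on a 10-gig card while system RAM keeps growing, that's the spillover behavior described above.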
 
 
It's probably not 100% VRAM, but that is definitely a big factor. Sorry to 3070 users, but everyone should have seen that 8GB was going to be very limiting going forward.

Far Cry 6 and Watch Dogs: Legion at max settings with the HD texture packs have frame rate drop issues on an RTX 3070. The HD textures don't look that much better, but still, it's a setting that might have to be turned down. 8GB is becoming a bit limiting as of now.
 
Far Cry 6 and Watch Dogs: Legion at max settings with the HD texture packs have frame rate drop issues on an RTX 3070. The HD textures don't look that much better, but still, it's a setting that might have to be turned down. 8GB is becoming a bit limiting as of now.
Don't even know why it's a surprise. Did people really think 8GB was going to last forever? Yeah, sorry people overpaid $500-600+ for that paltry amount, but the writing was on the wall already.
 
The 3080Ti has more VRAM so it fits in the picture, pretty sure the game just wants All the VRAM for RT.
I was responding to the driver overhead point. Also, why does the RX 6650, with less VRAM and a weaker GPU, perform better than a 3080? The numbers both of these are producing are nowhere near being CPU limited, so it's not an overhead issue. And the 8GB card performing better than the 10GB card suggests it's not a VRAM issue either.
 
Don't even know why it's a surprise. Did people really think 8GB was going to last forever? Yeah, sorry people overpaid $500-600+ for that paltry amount, but the writing was on the wall already.
I had a 3070 and a 2080 Ti, and even 2 years ago there were games where I had to lower the texture settings on the 3070, in a couple of cases even at 1440p with ray tracing, whereas the 2080 Ti had no such issues. Really, even the 2070 and 2080 should have had 10 or 12 gigs, as games back then were already pushing close to 8 gigs at playable settings.
 
Okay, I think some people are not realizing what is going on in some newer games. In the past we could look at VRAM usage and think "oh, that's just what the game allocates, not what it really needs," but things have absolutely changed in some newer games. What is going on now is that some games will start spilling into system RAM without even allocating all of the VRAM. For instance, if you have a 10-gig card, a game may stop allocating at just over 9 gigs and go ahead and start using system RAM when it's running out of VRAM. At least a couple of other games have been doing this, because you can see system RAM usage go way up compared to a card that has more VRAM. For example, if you compared a 3080 12GB and a 3080 10GB, you would see more system RAM usage on the 10GB model when it's VRAM limited.
I've seen this happen with my 2060 in Shadow of the Tomb Raider: it ran out of VRAM, the game started pulling around 16GB of system RAM, and it still ran fine.
 
Okay, I think some people are not realizing what is going on in some newer games. In the past we could look at VRAM usage and think "oh, that's just what the game allocates, not what it really needs," but things have absolutely changed in some newer games. What is going on now is that some games will start spilling into system RAM without even allocating all of the VRAM. For instance, if you have a 10-gig card, a game may stop allocating at just over 9 gigs and go ahead and start using system RAM when it's running out of VRAM. At least a couple of other games have been doing this, because you can see system RAM usage go way up compared to a card that has more VRAM. For example, if you compared a 3080 12GB and a 3080 10GB, you would see more system RAM usage on the 10GB model when it's VRAM limited.
Well, to be fair, I've cached at least 64 gigs from my enterprise NVMe to be used in game or wherever Windows needs it. Maybe that's helping as well.
 
The 10GB of VRAM was the reason I didn't want to "downgrade" to a 3080. I knew it would be a problem; I just didn't imagine it would be a problem this soon.

10GB of VRAM is still fine (especially if you're playing at 1440p like I am)...this game is an outlier and looks to be poorly optimized...give it a few patches and performance will be better.
 
10GB of VRAM is still fine (especially if you're playing at 1440p like I am)...this game is an outlier and looks to be poorly optimized...give it a few patches and performance will be better.
I don't play at 1440p, plus what's an outlier today will be the norm tomorrow. People are too quick to jump on the "it's just badly optimized" bandwagon. The RT effects do look awesome in this game; it's probably the biggest visual difference between RT off and maxed out that I've seen in any game.
 
I don't play at 1440p, plus what's an outlier today will be the norm tomorrow. People are too quick to jump on the "it's just badly optimized" bandwagon. The RT effects do look awesome in this game; it's probably the biggest visual difference between RT off and maxed out that I've seen in any game.
Except the evidence here suggests that it IS a badly optimized game. Again, unless you can explain why a weaker GPU with less VRAM that isn't CPU limited is performing better than a more powerful GPU with more VRAM, it's not only silly but downright stupid to ignore optimization and jump straight to "VRAM! herp derp".
 
I don't play at 1440p, plus what's an outlier today will be the norm tomorrow. People are too quick to jump on the "it's just badly optimized" bandwagon. The RT effects do look awesome in this game; it's probably the biggest visual difference between RT off and maxed out that I've seen in any game.

Did they fix the RT bug?...there was a hack on Reddit which got it working properly, but I hear RT AO and shadows were basically broken during the Early Access period.
 
Hardware Unboxed said that 8GB is not enough even at 1080p and that for 1440p 12GB is the minimum you really need.
 
Hardware Unboxed said that 8GB is not enough even at 1080p and that for 1440p 12GB is the minimum you really need.

The game doesn't look good enough to warrant that VRAM usage. But 8GB at 1440p, and especially ultrawide, is becoming a problem.

It could very well be both - the game is unoptimized in some way, but may still require more than 8GB of VRAM for 1440p.
 
The game doesn't look good enough to warrant that VRAM usage. But 8GB at 1440p, and especially ultrawide, is becoming a problem.

It could very well be both - the game is unoptimized in some way, but may still require more than 8GB of VRAM for 1440p.
I largely agree with some prior comments on here - expecting 8GB to be enough for a 2023 AAA title is just absurd, although Nvidia is somewhat to blame here for being stingy with VRAM amounts. The 3080 should have been 12GB to begin with, and the 3070 should have been 10GB.
 
And just think, the 4060 is only going to be an 8-gig card. That is essentially DOA in the VRAM department for some games that it could otherwise handle. Really, even the next-gen 50-class cards need to be 10 or 12 gigs at this point.
 
And just think, the 4060 is only going to be an 8-gig card. That is essentially DOA in the VRAM department for some games that it could otherwise handle.
I think some of this is just many years of stagnation; people largely didn't need to worry about VRAM unless they were gaming at 4K. For most of GPU history, VRAM was the #1 upgrade factor whenever new games came out, and GPUs were very short-lived, mostly due to the rapid expansion of VRAM requirements.
 
I'll ask you personally for the 3rd time... Why is an RX 6650, which is weaker than a 3080 and has 2GB less VRAM, performing better than said 3080 in a non-CPU-limited scenario?
Probably because AMD's implementation of RT uses slightly less VRAM? Who knows; you can't compare apples to oranges. The fact remains that the game needs more VRAM for RT than any other. If it wasn't a VRAM issue, why would a 3060 12GB perform infinitely better than a 3080 10GB? Same architecture, same driver.
 
Did they fix the RT bug?...there was a hack on Reddit which got it working properly, but I hear RT AO and shadows were basically broken during the Early Access period.
IDK what bug that was. I tried the game with RT AO and shadows off, with only reflections on, and it was still unplayable on a 2080 Ti.
 
Just d/ling now; servers are a little slow, I'm only getting ~70Mbps.

IDK what bug that was.
IDK if it was a bug, or if people can't come to terms with their card not being able to handle it; the game is still in its infancy, IMO.
 
And just think, the 4060 is only going to be an 8-gig card. That is essentially DOA in the VRAM department for some games that it could otherwise handle. Really, even the next-gen 50-class cards need to be 10 or 12 gigs at this point.
Not really... It will need to run lower-than-max settings due to the GPU anyway, reducing VRAM requirements.
 
Probably because AMD's implementation of RT uses slightly less VRAM? Who knows; you can't compare apples to oranges. The fact remains that the game needs more VRAM for RT than any other. If it wasn't a VRAM issue, why would a 3060 12GB perform infinitely better than a 3080 10GB? Same architecture, same driver.
What do you think is more likely: AMD managed to be better at RT in this one scenario despite being WELL behind Nvidia in everything RT-related… or there's a glaring optimization issue? Do you have a source for AMD's more efficient use of RT? Or was that made up?

Have you ever run out of VRAM? Can you point me to a scenario where going from 10 to 12GB yields a 1200% performance increase? Bet you cannot.

It seems like you're willing to put in all the baseless guesswork necessary to convince yourself this is nothing more than a VRAM issue.
 
What do you think is more likely: AMD managed to be better at RT in this one scenario despite being WELL behind Nvidia in everything RT-related… or there's a glaring optimization issue?

Have you ever run out of VRAM? Can you point me to a scenario where going from 10 to 12GB yields a 1200% performance increase? Bet you cannot.

It seems like you're willing to put in all the baseless guesswork necessary to convince yourself this is nothing more than a VRAM issue.
No, unless we're looking at different tests. The benchmarks on the prior page still show the Nvidia hardware being ahead in ray tracing.
 
No, unless we're looking at different tests. The benchmarks on the prior page still show the Nvidia hardware being ahead in ray tracing.
That's kind of my point. M76 was saying it's probable that AMD uses less VRAM with RT on, hence the better performance on a 6650 vs. a 3080 even though it's a weaker GPU with 2GB less VRAM. There's a 1440p chart with RT on where the 8GB 6650 was matching the 10GB 3080 in 1% lows and beating it in average fps.
 
The game doesn't look good enough to warrant that VRAM usage. But 8GB at 1440p, and especially ultrawide, is becoming a problem.

It could very well be both - the game is unoptimized in some way, but may still require more than 8GB of VRAM for 1440p.
The dev studio on this game is new. It's likely they aren't the best at telling DX12 what to do and how to do it ;)

Clearly, they do have some talent for art and game design.
 
Not really... It will need to run lower-than-max settings due to the GPU anyway, reducing VRAM requirements.
Sorry, but that is not necessarily the case, because it will be faster than previous cards that could already run settings close to tapping out the VRAM. I mean, look, right now you can take a slow 3060 and run higher VRAM-impacting settings than on a 3070 or 3070 Ti. Stuff like textures has essentially zero impact on GPU performance other than VRAM.
 