Hogwarts Legacy

Hmm, noticing that when the stuttering and bad framerate happens... GPU usage dips. No bottleneck, no other maxed-out resource. In menus the GPU runs near 100% and 200+ frames, so something fucky is going on lol
 
The dev studio on this game is new. It's likely they aren't the best at telling DX12 what to do and how to do it ;)

Clearly, they do have some talent for art and game design.
I agree about the DX12 sentiment, but to be fair, Avalanche has been responsible for a shit load of games back when Disney owned them.

But then again, we are talking about Hannah Montana: Spotlight World Tour and Disney Infinity with Chicken Little :ROFLMAO:
 
Looks like I got a 300 MB patch... Hopefully it's better.

Edit: WAY better.... No more choppiness.
 
Tinkering with settings a bit - Nvidia's NIS actually runs better than DLSS for me...
 
played through the intro section until you wake up, i think i like it. ran pretty good on my sig rig, imo. worked the hell out of it though, temps on both cpu/gpu got to ~66c but my vram got to a pretty toasty 92c. it auto'd to 4k high w/fsr 2 quality but i knocked it down to balanced, looks pretty good. was getting 45-75 (mostly 60-75) with some dips at scene transitions, no choppiness other than that and vrr smoothed it all out. it used all of my 8gb vram, not sure about system ram, forgot to look. might need to play with settings, dial 'em down a bit to get fps lows up and temps down a bit, especially the vram. i even have heatsinks on the chips and a fan mounted to the side of the card to help...
 
I noticed 17GB of system RAM being used throughout my short play tonight. Crazy... Only 8-9GB of VRAM though (3080 Ti). Runs fine for me otherwise on max with the RT stuff. My PSU restarts with this game though, due to the insane power draw it seems to pull through the hardware; I thought a 750 watt Phanteks AMP would be enough, until now. So I have an HX1000i coming to replace it, lmao.

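For what it's worth, a rough sanity check on why a 750 W unit can be marginal with a 3080 Ti class card. The figures below are assumed ballpark numbers (rated board power, typical transient behaviour, a loaded CPU), not measurements from this particular system:

# Back-of-the-envelope PSU headroom estimate; every figure here is an
# assumed ballpark value, not a measurement.
gpu_board_power_w   = 350   # rated board power of a 3080 Ti class card
gpu_transient_ratio = 1.8   # millisecond spikes can land well above the rating
cpu_power_w         = 200   # heavily loaded desktop CPU during gameplay + RT
rest_of_system_w    = 75    # motherboard, RAM, drives, fans, peripherals

sustained  = gpu_board_power_w + cpu_power_w + rest_of_system_w
spike_peak = gpu_board_power_w * gpu_transient_ratio + cpu_power_w + rest_of_system_w

print(f"Sustained estimate: ~{sustained} W")       # ~625 W: fine on paper for a 750 W unit
print(f"Transient estimate: ~{spike_peak:.0f} W")  # ~905 W: can trip over-current protection

A transient tripping the PSU's protection would look exactly like the spontaneous restart described above, even though the sustained draw is well within spec.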


 
I noticed 17GB of system RAM being used throughout my short play tonight. Crazy... Only 8-9GB of VRAM though (3080 Ti). Runs fine for me otherwise on max with the RT stuff. My PSU restarts with this game though, due to the insane power draw it seems to pull through the hardware; I thought a 750 watt Phanteks AMP would be enough, until now. So I have an HX1000i coming to replace it, lmao.



Interesting. My system in sig is running on 750w PSU. Haven’t played the game yet though. It’ll be interesting to see if it holds up.
 
Just to add to that, a member on the OcUK forums also has a 750 watt PSU (though with a 7900 XTX), and playing at the same settings they have had the exact same PSU reboot when playing this game!
 
I largely agree with some prior comments on here - expecting 8GB to be enough for a 2023 AAA title is just absurd, although Nvidia is somewhat to blame here for being stingy with their VRAM amounts. The 3080 should have been 12GB to begin with, and the 3070 should have been 10GB.

The RTX 4070 had better have 12GB; 10GB wouldn't be enough. Sure, it would get by fine in most games, but newer upcoming games may well have issues even with 10GB. Nvidia likes to pretend DLSS is the solution for everything, and I can see them arguing that you can use less VRAM if you enable DLSS. The problem is that not all games offer DLSS - Far Cry 6 is one example.
 
Been able to for a long time...although originally you couldn't download it...it came on 3.5in floppies.

Actually bought one of those back in the mid-90s. Contrary to popular belief, it wasn't snake oil - it just compressed your usable RAM. I don't think it doubled it in practice, but it did free up a significant amount.

Keep in mind this was back when RAM was counted in Megabytes.

Seems silly now, but back then I suppose RAM was more precious than CPU cycles.
 
TechPowerUp measured VRAM usage, but I just about always see higher usage than what they show in their reviews. For instance, they showed 8400 MB at 4K for a 4090 in their Dead Space review, yet I see over 11000 MB at times with a 3080 Ti at that resolution.

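For anyone who wants to log this themselves rather than eyeball an overlay, here is a minimal sketch (assumes an NVIDIA card with nvidia-smi on the PATH; the one-second interval is arbitrary). Note that, like the overlay numbers quoted above, this reports memory that is allocated, not necessarily memory that is actively in use:

# Log VRAM allocation and GPU utilization once per second while playing.
# Assumes an NVIDIA GPU and that nvidia-smi is available on the PATH.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=memory.used,memory.total,utilization.gpu",
         "--format=csv,noheader,nounits"]

def sample():
    line = subprocess.check_output(QUERY, text=True).splitlines()[0]
    used, total, util = (int(x) for x in line.split(","))
    return used, total, util

if __name__ == "__main__":
    peak = 0
    try:
        while True:
            used, total, util = sample()
            peak = max(peak, used)
            print(f"VRAM {used}/{total} MiB (peak {peak} MiB), GPU {util}%")
            time.sleep(1.0)
    except KeyboardInterrupt:
        print(f"Peak VRAM allocation observed: {peak} MiB")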
 
That 4K number matches pretty closely what I saw. My 6800 XT was reporting around 12700 MB at native 4K today. Turned FSR 2 on and it dropped to 11900 MB or so.
 
What do you think is more likely: AMD managed to be better at RT in this one scenario despite being WELL behind Nvidia in everything RT related, or there's a glaring optimization issue? Do you have a source for AMD's efficient use of RT? Or was that made up?
This has nothing to do with how efficient the RT implementation is on AMD, but with how efficient it is with video RAM usage. A different architecture from a different manufacturer performing differently can be explained by a million causes; I see no point in guessing why. However, two cards from the same architecture (3080 vs 3060) performing wildly differently narrows down the possibilities, and the deciding variable there seems to be RAM size.
Have you ever run out of VRAM? Can you point me to a scenario where going from 10 to 12GB yields a 1200% performance increase? Bet you cannot.
Yes, many times. Running out of VRAM is a hard threshold: it runs fine while you are within memory limits, but performance nosedives as soon as you go even slightly over and the GPU has to start swapping into system RAM. It always results in unplayable performance. The first time I experienced it was running the original Deus Ex on a TNT2; then I experienced it with a 9600 XT in F.E.A.R. Imagine the cache running out on a QLC SSD: it performs great, then it drops to a crawl. A similar thing happens when a scene in a video game does not fit in the memory of your GPU.

It seems like you're willing to put in all the baseless guesswork necessary to convince yourself this is nothing more than a VRAM issue.
Sigh. It is exactly because I don't want to do baseless guesswork that I'm not speculating on why AMD is affected less. You think I don't wish this was fixable by a simple patch? I do, as it is in my best interest too. However, I don't let wishful thinking get in the way of hard evidence.
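The "hard threshold" behaviour is easy to see with a toy bandwidth model. The numbers below are illustrative assumptions (roughly a 3080-class card on PCIe 4.0 x16), not measurements: once even a small fraction of each frame's memory traffic has to spill over the PCIe bus instead of staying in VRAM, effective bandwidth collapses, which is why FPS falls off a cliff instead of degrading gracefully.

# Toy model: effective memory bandwidth when part of the working set no
# longer fits in VRAM and must be fetched over PCIe every frame.
# All numbers are illustrative assumptions, not measurements.
VRAM_BW_GBPS = 900.0   # roughly GDDR6X bandwidth on a 3080-class card
PCIE_BW_GBPS = 32.0    # roughly PCIe 4.0 x16 in one direction

def effective_bandwidth(spill_fraction: float) -> float:
    """Harmonic mix: spill_fraction of traffic is served at PCIe speed."""
    in_vram = 1.0 - spill_fraction
    return 1.0 / (in_vram / VRAM_BW_GBPS + spill_fraction / PCIE_BW_GBPS)

for spill in (0.0, 0.02, 0.05, 0.10):
    print(f"{spill:4.0%} spilled -> ~{effective_bandwidth(spill):.0f} GB/s effective")
# Spilling 5% of traffic already cuts effective bandwidth by more than half,
# and 10% cuts it to roughly a quarter.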
 
IDK what bug that was. I tried the game with AO and shadows off, only with reflections, and it was still unplayable on a 2080 Ti.

Here is how you can significantly improve the quality of Ray Tracing Shadows, Ambient Occlusion & Reflections in Hogwarts Legacy

The view distance for the Ray Tracing Shadows is a bit low, even on Ultra Settings...as a result of that, there are some truly awful pop-in issues when exploring the game’s open-world environments...furthermore, Ray Tracing Ambient Occlusion appeared to be broken in our initial tests...for whatever reason, Avalanche has used some really low-quality settings for it. Hell, you can even see the screen-space AO artifacts, even when enabling RTAO...

https://www.dsogaming.com/articles/...ent-occlusion-reflections-in-hogwarts-legacy/
 
I'm running the game with all settings on High and all the RTX settings on and set to Ultra, with the DLSS Performance option. I never drop below 60fps. Using a 12700K @ 5GHz + 2080 at 2560x1080 resolution. Very happy with the performance.
 
Here is how you can significantly improve the quality of Ray Tracing Shadows, Ambient Occlusion & Reflections in Hogwarts Legacy

The view distance for the Ray Tracing Shadows is a bit low, even on Ultra Settings...as a result of that, there are some truly awful pop-in issues when exploring the game’s open-world environments...furthermore, Ray Tracing Ambient Occlusion appeared to be broken in our initial tests...for whatever reason, Avalanche has used some really low-quality settings for it. Hell, you can even see the screen-space AO artifacts, even when enabling RTAO...

https://www.dsogaming.com/articles/...ent-occlusion-reflections-in-hogwarts-legacy/
I did not notice any of these issues. I don't even know what I am supposed to be looking at in the ambient occlusion example video. What is broken there? I don't see anything.
 
This has nothing to do with how efficient the RT implementation is on AMD, but with how efficient it is with video RAM usage. A different architecture from a different manufacturer performing differently can be explained by a million causes; I see no point in guessing why. However, two cards from the same architecture (3080 vs 3060) performing wildly differently narrows down the possibilities, and the deciding variable there seems to be RAM size.

Yes, many times. Running out of VRAM is a hard threshold: it runs fine while you are within memory limits, but performance nosedives as soon as you go even slightly over and the GPU has to start swapping into system RAM. It always results in unplayable performance. The first time I experienced it was running the original Deus Ex on a TNT2; then I experienced it with a 9600 XT in F.E.A.R. Imagine the cache running out on a QLC SSD: it performs great, then it drops to a crawl. A similar thing happens when a scene in a video game does not fit in the memory of your GPU.


Sigh. It is exactly because I don't want to do baseless guesswork that I'm not speculating on why AMD is affected less. You think I don't wish this was fixable by a simple patch? I do, as it is in my best interest too. However, I don't let wishful thinking get in the way of hard evidence.
Please point me to where you read that AMD requires less VRAM when ray tracing. You said the last thing you want to do is baseless guesswork, so where's your source for this? Because it's the first I've heard of it.
 
I have a [email protected] w/2080Ti and FV43U monitor. I guess we’ve gotten to the point where I need to upgrade my setup to play the latest games at higher resolutions/refresh rates.

I can afford to upgrade my CPU/motherboard/RAM but not the GPU. I was waiting on the 7800X3D first. I guess I'll be waiting a bit longer - lots of older games to play, I guess.
 
I have nothing to add to this discussion other than my impressions after about 8 hour in. Still really digging it. I turned RT stuff off and left everything on ultra. I'm on a 10th gen i7 and 3090. Girlfriend is running a 12th gen i5 and a 3080. We're both gaming at 1440p. Both smooth, but hers is even smoother than mine.
 
Ok, this fixed my issues. Fucking Epic and their fucky ass unreal engine.

In "Graphics settings" in Windows (it varies on how to get there, but a Windows search for "gpu" should take you close), make sure this is on:

[Screenshot of the Windows Graphics settings toggle to enable]


Go to the NVIDIA Control Panel and change the shader cache size to 10GB.



Go to %localappdata%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini

Modify it by pasting these values at the end - you guys should be familiar with this for the most part :)

**NOTE THE POOL SIZE SETTING SHOULD BE HALF OF YOUR VRAM**

[SystemSettings]
r.bForceCPUAccessToGPUSkinVerts=True
r.GTSyncType=1
r.OneFrameThreadLag=1
r.FinishCurrentFrame=0
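; Texture streaming settings below: PoolSize is in MB - set it to roughly half your VRAM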
r.TextureStreaming=1
r.Streaming.PoolSize=6144
r.Streaming.LimitPoolSizeToVRAM=1

[ConsoleVariables]
AllowAsyncRenderThreadUpdates=1
AllowAsyncRenderThreadUpdatesDuringGamethreadUpdates=1
AllowAsyncRenderThreadUpdatesEditor=1

My r.Streaming.PoolSize is set for a 3080ti. Do the calculation that fits your graphics card.
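If you'd rather not do the math by hand, here is a minimal sketch of that calculation, assuming an NVIDIA card with nvidia-smi available on the PATH. The "half of total VRAM" figure is just the rule of thumb from the post above, not an official recommendation:

# Print an r.Streaming.PoolSize value equal to half of total VRAM in MB,
# following the rule of thumb quoted above. Assumes nvidia-smi is on the
# PATH; a 12 GB 3080 Ti (12288 MiB) yields 6144.
import subprocess

total_mb = int(subprocess.check_output(
    ["nvidia-smi", "--query-gpu=memory.total", "--format=csv,noheader,nounits"],
    text=True).splitlines()[0])

print(f"r.Streaming.PoolSize={total_mb // 2}")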

This fixed all of my issues, and my friend's issues. Hopefully it works for you all!
 
This game is magic. I generally prefer grittier and darker, but the way they captured the universe is truly remarkable. Zillions of little touches that really make it Harry Potter abound.

I've been playing it with one of my kids. He was watching with eyes open in wonder. When we got to Hogsmeade, I just turned him loose to explore. Watching him laugh with glee, grinning ear to ear as he discovered all sorts of little interactive things - pure gold. I was going to say "you can't put a price on that", but I guess 60 bucks does cover it.
 
First Harry Potter game for me and I'm enjoying it greatly; so is the GF. I haven't had any issues with performance, though I'm running a 6900 XT.
 
I'm 13 hours in now. This game is great and just keeps delivering. Best AAA game I've seen, maybe ever. Zero crashes on our two systems and, much like Phases' thoughts, magical comes to mind. If you're a Harry Potter fan and don't have this... I don't know what to say.
 
Ok, this fixed my issues. Fucking Epic and their fucky ass unreal engine.

In "Graphics settings" in Windows (it varies on how to get there, but a Windows search for "gpu" should take you close), make sure this is on:


Go to the NVIDIA Control Panel and change the shader cache size to 10GB.


Go to %localappdata%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini

Modify it by pasting these values at the end - you guys should be familiar with this for the most part :)

**NOTE THE POOL SIZE SETTING SHOULD BE HALF OF YOUR VRAM**

My r.Streaming.PoolSize is set for a 3080ti. Do the calculation that fits your graphics card.

This fixed all of my issues, and my friend's issues. Hopefully it works for you all!
Man! What a beautiful fix - the game is running so much smoother with this.

Thank you!
 
Finally got around to playing the game and got to Hogsmeade

Ultra settings. 3440x1440 (1440p ultrawide) with DLSS enabled as well as all the RT options ON. HDR On as well

Up until Hogsmeade, game ran pretty well and did not seem all that demanding. My GPU usage was very rarely pegged. VRAM was around 8GB most of the time but got up to 13GB later in the game.

Once I entered Hogsmeade, FPS was between 40-60. GPU utilization was about 40%, CPU utilization about 30%, VRAM around 13.5GB, and system RAM usage around 18GB (I have 32GB).

So poor performance, even though nothing in particular is being taxed hard at all.

Exited the game and applied the settings changes Lifelite made here - well, all except changing the shader cache size, which I left on Auto:

Ok, this fixed my issues. Fucking Epic and their fucky ass unreal engine.

In "Graphics settings" in Windows (it varies on how to get there, but a Windows search for "gpu" should take you close), make sure this is on:


Go to the NVIDIA Control Panel and change the shader cache size to 10GB.


Go to %localappdata%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini

Modify it by pasting these values at the end - you guys should be familiar with this for the most part :)

**NOTE THE POOL SIZE SETTING SHOULD BE HALF OF YOUR VRAM**

My r.Streaming.PoolSize is set for a 3080ti. Do the calculation that fits your graphics card.

This fixed all of my issues, and my friend's issues. Hopefully it works for you all!


Loaded the game back up and was now getting 60-110 fps. GPU usage was now hovering in the low 50s. CPU usage was for the most part unchanged, maybe a 5% hike. VRAM usage was the same as before; I forgot to glance at my system RAM usage. Frame times were still a bit rough though. It seems there's still a TON of performance left on the table here.

My personal conclusion is that this game really isn't as demanding as some people make it out to be. My PSU didn't shut off. CPU usage, GPU usage, VRAM and system RAM allocation were all lower in this game than they are in Warzone 2. I know allocation isn't the same as what's actually being used, and I have no way to tell how much of the allocated RAM in WZ2 and Hogwarts is actually in use, but that's the raw data I have.

For those that argue that most of the performance issues come down to "not enough VRAM" rather than "optimization", well... community-based discoveries like an ini file modification quite literally doubling the performance are a big middle finger to that argument.
 
This game is magic. I generally prefer grittier and darker, but the way they captured the universe is truly remarkable. Zillions of little touches that really make it Harry Potter abound.

I've been playing it with one of my kids. He was watching with eyes open in wonder. When we got to Hogsmeade, I just turned him loose to explore. Watching him laugh with glee, grinning ear to ear as he discovered all sorts of little interactive things - pure gold. I was going to say "you can't put a price on that", but I guess 60 bucks does cover it.

the crazy part is that this is the first really big open world AAA game from this developer...amazing what they were able to do...and just like the Batman Arkham games the developer really seems to know their Harry Potter history and lore
 
Graphically, this game really doesn't look that good to me. But I am only watching it on YouTube, which always compresses videos. It certainly doesn't look good enough to justify bringing down top-of-the-line systems without ray tracing enabled.

But I will certainly look into this one because it seems like the reviews are decent.
 
the crazy part is that this is the first really big open world AAA game from this developer...amazing what they were able to do...and just like the Batman Arkham games the developer really seems to know their Harry Potter history and lore
I was trying to think of other successful examples of "we really nailed the feel of this IP".

Yours is thankfully much more timely, because what I immediately thought of was TIE Fighter and Dark Forces.
 
Please point me to where you read that AMD requires less VRAM when ray tracing. You said the last thing you want to do is baseless guesswork, so where's your source for this? Because it's the first I've heard of it.
The evidence is that the game runs better with more VRAM than with less. I mean, what other evidence do you want? You are just looking silly now. Sure, there is probably more optimization possible, but optimization is not a digital on/off switch between unoptimized and optimized. I don't even know why the hell I am trying to explain anything to you when you just ignore everything and answer with "where is the evidence bro?"
 
The evidence is that the game runs better with more VRAM than with less. I mean, what other evidence do you want? You are just looking silly now. Sure, there is probably more optimization possible, but optimization is not a digital on/off switch between unoptimized and optimized. I don't even know why the hell I am trying to explain anything to you when you just ignore everything and answer with "where is the evidence bro?"

But it doesn't. 8GB is less than 10GB, yet the 8GB 6650 is outperforming the 10GB 3080. So in other words, you're pulling shit out of your ass and don't actually know what you're talking about. Want more evidence you don't know what you're talking about? Just read the last few posts: INI mods increasing performance drastically. Also, no one said anything about magic or on/off... those are your words.
 
The performance issues are... weird, in observation and experimentation. Even with VRAM use well within my GPU's capacity, there are sections which drop to strangely low framerates (with RT) but then return to high ones - in the exact same section.
Really smells like either a game code-path issue or a driver issue.
 