Hogwarts Legacy

You mean spells, like a wizard or a witch would use? That would be the point... it's also explained that you usually have to use one of the spells first before normal attacks do damage. It also teaches you how to chain it all together...

I fully understand how it works; it just makes battles particularly long because you're always waiting around for your special attacks to charge up, just like in AC: Odyssey where you're running around waiting for abilities to recharge. That also ties into the whole leveling system: I've been fighting the same spiders since I was level 3, and both of our attacks do the same amount of damage at level 25.
 
The combat is rather simple, and if you are familiar with combat games it is trivial. Mobs needed more skills and more variety as they leveled. I had fun playing it, but once I finished the main story I had zero desire to go back and finish everything up. I was mostly done anyway. Feels like zero replay value for me.

With some open-world games, like Witcher 3, RDR2, and Spider-Man, as soon as I finished I planned a replay in 2-3 months with ideas on how to play differently. This was a fun experience and I didn't have any performance issues, so that's not a complaint. But if it wasn't for Hogwarts Castle being such a cool building and connected to Harry Potter, I don't think I could have made it through the game. The castle is really cool; I spent hours just exploring and admiring it.

Half of the game's mechanics feel really dated and cheap; it didn't quite tie it all together well. Even in the newer AC games, I can run around exploring all the "?" markers on the map when I don't feel like jumping into a quest, and have fun. I found none of the open-world stuff here fun. The only fun I had was pretending it was an AC game and trying to stealth-take-down whole camps without being detected.

Gear is so random. I played for hours and had crap gear, then one super simple puzzle handed me a legendary item. Gear never felt rewarding. There is zero negative consequence for walking into a store, looting a chest, and taking money off the counter. There are 100 little things like that which take away from the good they did do.
 
I ended up beating the game over the weekend. I feel like things got better after you get most of the core spells. For the first half of the game they throw a thousand things at you that you can't do anything with, and by the time you finally can, you've long forgotten where everything was. Luckily, almost nothing matters when it comes to items. It's all random, and later in the game I found myself totally skipping most chests because I didn't need money, more clothes, potions, etc. I only went for the rare ones.
The level scaling annoyed me all the way to the end of the game. You never really get a leg up on anything short of the monsters walking around near Hogsmeade. You get a shitload more spells that you can cycle through rapid-fire, but things like your HP, armor score, and how many hits enemies take never really change. They honestly could have skipped that whole component and just made clothing cosmetic. After all, the clothing scores (and ALL items) are completely random anyway.
Not sure if everyone's experience was like this, but I went from level 1 to level 20 almost instantly. Like, before I even left the castle. After that, it was pretty slow leveling up.
I did enjoy the world and amount of detail the devs put into everything, though. Compared to most games like this, I feel like they did a really good job of making you feel like you were somewhere. Kinda like GTA4, which was also a flawed game in an excellent world.
 
Levels don't matter at all. Irrespective of your level, most single-player missions will have enemies 1-2 levels higher than yours. Same even for side quests. As for the game, it was a lot of busy work for nothing. I finished it but have no intention of going back to it. There was even a stretch where I lost interest because of the quests with artificial gates: learn a spell, or do random assignments to unlock a spell for the next main mission. That royally pissed me off.

Still a decent game overall, but with extremely poor design choices that killed the enjoyment for me.
 
Glad to see they finally got an update out the door. Hopefully they fixed the Nvidia rBAR issues that were giving me fits. Still, with no DLC on the horizon I don't see a reason to re-install the game. I'm not about to go back and complete all of those repetitious Merlin trials.
 
Another game which took around 3 months to get into much better shape on PC in terms of performance and visuals... 3 months seems to be the average amount of time it takes to fix PC ports: Gotham Knights, Hogwarts Legacy, Callisto Protocol, etc. The Last of Us, Returnal, etc. are all improving with each new patch.
 
Yup. I don't get the rush to play day 1 on a singleplayer game. There's no advantage to be had doing that and it costs a ton more money.
 
I didn't have any performance issues with this game. I had issues with the stupidity of the game design, though.
 
A new Hogwarts Legacy patch is now live on PC and consoles, introducing considerable performance improvements and fixes for a variety of issues...on PC, overall lighting has been optimized, traversal stuttering has been minimized, memory optimization has been improved, and VRAM memory leaking has been resolved...

https://store.steampowered.com/news/app/990080/view/3695812631676589787
 
I can say that if anyone was waiting to buy this game because of performance issues, this patch basically solved all of them, at least for me.
 
Which GPU are you using? Ray tracing on or off?
12700K and an Nvidia 2080 @ 1080p. Everything set to High and all RT features on, DLSS set to Balanced. I never drop below 60 fps and usually hover in the 80-90 fps range.
 
That is great news. I have been hesitant to get the title because it wouldn't play properly on a 3080 10GB. Sounds like it's good to go now.
 
Basically, after each patch I would just load up the game and run through Hogsmeade, because that place was filled with frame hitching and fps slowdowns. This is the first time I could run through it clean, so I'm going with "good enough, fixed." Also using the Nvidia 535.98 drivers, if that matters.
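
If you ever want hard numbers for that Hogsmeade run instead of eyeballing it, you can capture frametimes with something like PresentMon or CapFrameX and crunch the log afterwards. Rough Python sketch below; the file name and the "MsBetweenPresents" column are assumptions on my part (check what your capture tool actually writes), and this is just one common way of computing 1% lows:

```python
# Rough sketch: turn a frametime capture (e.g. from PresentMon/CapFrameX) into
# average FPS and 1% lows for a repeatable Hogsmeade run. The path and column
# name are assumptions -- use whatever your capture tool actually writes.
import csv
import statistics

LOG_PATH = "hogsmeade_run.csv"          # hypothetical capture file
FRAMETIME_COLUMN = "MsBetweenPresents"  # assumed column name; check your tool's CSV header

def summarize(path: str) -> None:
    with open(path, newline="") as f:
        frametimes_ms = [float(row[FRAMETIME_COLUMN]) for row in csv.DictReader(f)]
    # Convert each frametime (ms) to an instantaneous FPS value, sorted ascending.
    fps = sorted(1000.0 / ft for ft in frametimes_ms if ft > 0)
    # 1% low here = mean of the slowest 1% of frames (one common definition).
    one_percent_low = statistics.mean(fps[: max(1, len(fps) // 100)])
    print(f"frames: {len(fps)}  avg: {statistics.mean(fps):.1f} fps  1% low: {one_percent_low:.1f} fps")

if __name__ == "__main__":
    summarize(LOG_PATH)
```

Run the same route before and after a patch and the comparison is a lot less hand-wavy than "it felt smoother."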
 
The things I was using Nvidia Profile Inspector to fix (the 3 rBAR functions) were supposedly patched with the last update. I guess they made the non-maxed-out RT settings feasible, too. Before, if you used anything except the highest setting you'd get crazy blinding godrays inside buildings at random.
 
I purchased this just a few days ago to test with my new 4090, and because I kind of wanted to try it out anyway. If this is what the game is like when "optimized", I'm afraid to know what it was like when it wasn't. Frankly, it runs like garbage. I set everything to Ultra, roughly as high as it goes, including ray tracing. DLSS is off, though DLAA is on...

I get frame dips in multiple areas (and they don't make sense, considering other areas that look more demanding are fine), and then I look at the GPU usage and it's at like 50%. Wtf? 50%? Sometimes lower? It's not even using half of my GPU. My CPU isn't being taxed much either, and it's a 5950X. I look at its load and none of the cores are even that high.
[screenshots: trashgame1.jpg, trashgame2.jpg]


There is one place where I do get 100% usage on my GPU... the menu.

[screenshot: 1692560475867.png]


This is kind of pathetic.
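
If anyone wants to reproduce this without squinting at overlays and Task Manager, here's a rough Python sketch that logs GPU utilization and the busiest CPU cores once a second (it assumes psutil is installed and nvidia-smi is on your PATH; nothing about it is specific to this game):

```python
# Rough sketch: print GPU utilization and the busiest CPU cores once a second,
# so the "50% GPU with only a couple of busy cores" pattern shows up as data
# rather than screenshots. Assumes psutil is installed and nvidia-smi is on PATH.
import subprocess

import psutil

def gpu_utilization_percent() -> str:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
        capture_output=True, text=True,
    )
    return result.stdout.strip()

if __name__ == "__main__":
    while True:
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # blocks ~1 second
        busiest = sorted(per_core, reverse=True)[:4]
        print(f"GPU: {gpu_utilization_percent()}%  busiest cores: {busiest}")
```

Leave it running in a second window while you walk the same route and the bottleneck pattern is pretty obvious in the output.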
 
And then some areas at least use 80% (though they don't go past 80%)... for no conceivable reason. At least the FPS looks better there.

[screenshot: 1692564411692.png]
 
Turn off ray tracing; it's not worth the performance hit for the minimal difference in looks. Also, some of those dips happen because the game is loading an area while you're running or walking, which can cause an fps dip, so use an NVMe drive. It plays just fine on my 6900 XT at 1440p that way, with a 132 fps average. Also, your 5950X is pretty much wasted in this game; it only leverages 2 cores from what I have seen.
 
I noticed quite a bit of difference in the looks with Raytracing on, personally. The game looked much better to me.

Well, not only that, but turning it off for me hardly did anything when I tried it anyway. The game still capped out at like 80% GPU usage at most. As you can see in my pictures, whichever cores the game is using, it's certainly not using very much of either of them. The highest CPU core usage I see is just 50%. That's not terribly high.

Point is, it's not even using the GPU (or CPU). Turning down graphics because the game isn't even leveraging your system is kind of an insult...

Edit: Also, I am already running it on an NVMe drive, an SK hynix Platinum P41.
 
The 4090 is CPU limited with a 5950X, sometimes even at 4K. (Nvidia's drivers have a lot of CPU overhead, and the 4090 is fast enough that it shows on last-gen CPUs.)

Try turning off SMT for your 5950x. You should get a small framerate boost.
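
If you'd rather not flip SMT off in the BIOS for one game, you could try approximating it per-process by pinning the game to one logical CPU per physical core. Rough psutil sketch below; the executable name and the assumption that sibling threads are numbered (0,1), (2,3), ... are mine, so double-check both on your system:

```python
# Rough sketch: pin a running game to one logical CPU per physical core, as a
# per-process stand-in for disabling SMT. Assumes psutil is installed and that
# sibling threads are numbered (0,1), (2,3), ... -- check your own topology.
# May need to run from an elevated/admin prompt to touch another process.
import psutil

GAME_EXE = "HogwartsLegacy.exe"  # assumed process name; use whatever Task Manager shows

def pin_one_thread_per_core(exe_name: str) -> None:
    # Every second logical CPU, i.e. one thread per core under the pairing assumption above.
    one_per_core = list(range(0, psutil.cpu_count(logical=True), 2))
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == exe_name:
            proc.cpu_affinity(one_per_core)
            print(f"Pinned PID {proc.pid} to logical CPUs {one_per_core}")

if __name__ == "__main__":
    pin_one_thread_per_core(GAME_EXE)
```

It's not a perfect stand-in for disabling SMT (other processes can still land on the sibling threads), but it's a quick way to test the theory without rebooting.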
 
But it's not doing this in Cyberpunk 2077 at all. With everything turned up, on Psycho settings where applicable (and DLSS off), everything runs smoothly and consistently uses 98-99% of the GPU.

Or are you saying that Hogwarts being limited to 2 cores means the only way you're going to use most of your GPU is with a 7800X3D? Because that's still kind of pathetic lol...

I guess I'll just deal with it. I considered upgrading to a 7800X3D with this GPU, but it seems like a massive waste for one game with terrible coding.
 
Hogwarts has weird performance issues. I'm not sure if it's a CPU thing or a GPU thing, exactly. With AMD's recent drivers, their cards now perform a lot better in this game than Nvidia's.

But in general, the 4090 can often show CPU limitations with last-gen CPUs. I don't know about Hogwarts specifically, but Hardware Unboxed just released a video with updated performance of the 4090 with a 13900K and a 7800X3D, and in the top left corner they show some footage of where they tested performance. I timestamped Hogwarts for you; you could probably use that to compare your performance.


https://youtu.be/H9guEsBly0I?t=178
 
So they state that they're getting into purely GPU-limited territory with those two CPUs by around 1440p. I'm running at 3440x1440, which is about 34% more pixels than 16:9 1440p (quick math below). Even if those CPUs were to deliver 30% more performance, I should be GPU limited in this scenario, especially considering I have every single graphics setting maxed out. Yet they're getting around twice my average FPS, or more.
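
For reference, the quick pixel math (nothing assumed beyond the resolutions themselves):

```python
# Pixel-count comparison: 3440x1440 ultrawide vs. 2560x1440 and 3840x2160.
uw_1440 = 3440 * 1440   # 4,953,600 pixels
qhd     = 2560 * 1440   # 3,686,400 pixels
uhd_4k  = 3840 * 2160   # 8,294,400 pixels

print(uw_1440 / qhd)    # ~1.34 -> about 34% more pixels than 16:9 1440p
print(uw_1440 / uhd_4k) # ~0.60 -> about 60% of 4K
```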

From monitoring my CPU during this game, I see two cores at exactly 50% usage. What this tells me is that not only does Hogwarts Legacy not use more than two cores, it's not even using two full cores. It's using two full threads and that's it. It's essentially capping itself at HALF of each core it touches, so it basically might as well be a single-core game.

It would basically run better on a high-clocked i3 than on any other processor. It's just a broken game.

This is before we get into the really messed up stuff that you were noting with their FPS graphs with AMD vs Nvidia cards...
 
Well, it would be best if you tested at 1440p for a bit, to do a good comparison. 1440p ultrawide is a lot more pixels.

That said, your results support a probable CPU limitation with your 5950X. The 13900K has a lot more single-core performance. The 7800X3D also has a lot more (though a bit less than the 13900K when benchmarked with Cinebench R23), and it's also helped by the big V-Cache.

And as I said, try turning off SMT.
 
Here you can see a comparison of CPUs at a few different performance levels, with the 7800X3D being the fastest possible for Hogwarts, compared to the Ryzen 7700 non-X, the i7-13700K, and last gen's best gaming CPU, the 5800X3D.

https://youtu.be/7gEVy3aW-_s?t=142

All of them get less performance out of the 4090 than the 7800X3D, even at 4K (although the 13700K is close). I think Hogwarts is a CPU-limited game, but it also exacerbates the CPU-limitation issues with Nvidia's drivers, which is what they showed in the previous video comparing the 7900 XTX to the 4090. AMD's drivers have notably less CPU limitation.

So even though the XTX is a less powerful GPU, it can show more performance in scenarios which are heavily CPU limited. And that's what happens when you combine a CPU-limited game with a 4090. Double limitation!
 
My point is that it's kind of pointless for me to go down to 1440p. You'll notice that in almost every result in your video (for any game that isn't coded like garbage), basically every CPU is at parity (that is, using the full capacity provided by the card at that graphical detail) by the time you get to 4K. There are usually fluctuations, but they're minor, comparatively speaking. Then you get to games like Hogwarts and the difference is much bigger, and hell, mine is bigger than any of those, granted I'm closer to the 1440p load than the 4K load.

Anyway here are my results with SMT disabled:

[screenshot: 1692600459116.png]


This is going to be hard to read, but FYI I'm getting 22% more performance in this location (44 FPS in the original screenshot, 54 FPS in this one). 22%. Why? I'm pretty sure it's because they're using two threads. Not cores. Threads. With SMT disabled (afaik, I'm just conjecturing), instead of the load being spread across two threads that each use half of a core, it's actually being spread across two full cores.

Obviously that's not at every location:

[screenshot: 1692600833025.png]


This one only showed a 14.9% boost. But that's still a ~15% boost... by disabling multithreading. Lmao.

Again, hard to read, but just look at both screenshots and notice that there are two cores each at 70-80%+ now, rather than being capped at around 50% each. Granted, I have no idea why it still isn't using 100% of both, but I'm going to chalk that up to more trash coding. And the areas where I experience slowdowns... still don't make any sense:
[screenshot: 1692600139617.png]


I get about 80 FPS here, and it has reflections, particle effects, plenty of shadows, etc... yet apparently me staring at a wall and a statue is more graphically demanding than all of this. Okay.

I'm getting close to using some French here, and I'm going to sound pissed, probably because I am pissed. You'll have to excuse me; none of it is directed at you, you're just trying to help. I'm pissed at these garbage devs. There's no other way to state this: this is BS. There is no excuse for it. They're using TWO THREADS for this game on your CPU. TWO THREADS. That's not cores, that's THREADS... and they're not even fully using those threads! What the hell? So we need to upgrade our CPU AND GPU every gen now if we want high-end performance? Because to play these games with trash code, you need a CPU to brute-force it?

This game doesn't even seem to care much about the extra cache on the 7800X3D. Near as I can tell, it's literally just bound by whatever your highest core boost is (and maybe by IPC from architecture advancements). The point of this rant is that my instinctive response was "if games are going to be like this going forward, maybe returning my 4090 to Microcenter and just waiting until I'm ready to do a full platform upgrade is in order, by which time stronger GPUs and CPUs will be out and I can actually justify the purchase." I don't think I've seen a time where it's been quite this bad. I'm hoping Hogwarts Legacy is just a standout bad example of poor optimization, but I'm not holding my breath. This is literal regression. We're not advancing our code in any way; we're regressing to freaking apes on typewriters.

Okay, rant done. Anyway, the game is mostly better now, with ~50-80+ FPS in most areas (and the stuttering has gone down quite a bit), so I guess that's good enough. I don't actually find this game that compelling, but the presentation is great and so is the eye candy. I'm just kind of annoyed that I spent $1600 to experience this level of BS in a new title. I grew up reading Harry Potter.
 
Wow that's a huge improvement with SMT off! Cool!!

Elden Ring is similar, in that it really only uses like 3 threads or something. The SpecialK mod injector splits some of the work off onto other threads, and it can also implement its own, better frame cap. It removes nearly all of the general microstutters and little hitches for me in Elden Ring, as well as smoothing out some spots which are notorious for otherwise inexplicable major stutters and frame drops.

Looks like SpecialK doesn't have anything for Hogwarts.
 
Well, I'm back to playing it on my 3080 Ti, and I've been sort of enjoying the game after just turning down a few settings. That being said, it's really pretty, but it kind of feels skin deep? I don't even know what "basic cast" is, lore-wise lol. It's basically Hogwarts Creed: Legacy... it took the AC formula, added some spells in place of all the various tools Ezio or whoever would use, and kind of called it a day? I think that actually makes it pretty fun, though; it's an interesting take.

I have this bad habit of completely ignoring the main quest and running around exploring and killing anyone I come across, so I've basically uncovered most of the known map and killed a bunch of wizards, goblins, trolls, etc... and some named mobs (the Infamous Foes or whatever). The hardest so far has been a level 30 troll, because I was fighting him at level 14 and he had a bunch of annoying adds; I had to kite him out of the add spawn range. Well, that, and I've had to skip any area involving undead. Since I don't have anything but the pull, levitate, and basic spells (and 2 bars of the ancient spells), I basically can't do anything to them. They're even immune to fall damage. Only Ancient Magic works, but it's too expensive to use on a mob of them... guess I'll need to go back and get the fire spell sometime. I'm about level 21 now.

One thing that makes it kind of easy, though, is that enemies seem to have a maximum "leash" range, and at the edge of it they just kind of... sit there... and look menacing. The trolls just sit there and toss rocks, which makes them stupidly easy to kill because that's free chain stuns and damage. You can also just go up on a five-foot ledge, pull enemies to you, and that drop will literally kill them somehow. Go up a little higher and trolls just sit there under you and let you kill them...

Also, NPC conversations are just hilariously disjointed. Some guy goes up to a woman and cheerfully says hello, and then she just breaks into a politics monologue...
 
Just putting my 2c in here... this game is incredible. As a fan of the HP books and movies, I think they created something very special; it really is like being in that world. I will say the broom-flying mechanics on PC are kind of jacked, but I am getting used to them... If they turned something like this into an MMO, it could be the next EverQuest. That said, I am playing it on 13th-gen Intel hardware with a 4090 (4K, max IQ), a 42" OLED, and a formidable 2.1 sound setup for a PC. It's very immersive. I look forward to finishing this game eventually.
 
The inside of Hogwarts alone sets it apart from any other game I can think of. Every hanging picture is animated, statues move at random, there are props to mess with, flying artifacts, hidden dungeons, etc... truly a beautiful game with a lot of detail.
 
so much yes.
 