
Hogwarts Legacy PC Performance Thread

JeloSWE

Member
13700k + 4090 + 32 gig ram



All RT on, max settings, and DLSS Quality with frame generation, and it's holding 4K/120 near flawlessly. I haven't even bothered to see what it can do without frame generation because it seems to work so well.

Haven't noticed any stutters at all.
13900K + 4090 + 32 gig ram + M2 NVME
All settings Ultra, RT max, Frame Generation, DLSS Performance. Vsync enabled in the NV control panel (the in-game option is greyed out with Frame Gen enabled) and framerate capped at 120 fps in-game. I use BFI on my display, so any dips or stutters become quite apparent.

I almost always get a locked 120 fps inside Hogwarts, but in the few segments I've spent outside it's no longer flawless, due to the massive amount of foliage when set to Ultra.


I haven't gotten far as I've mostly dabbled with settings to find a stable FPS, but the game is very promising so far.
 

JeloSWE

Member
I hope they add a better setting for RT reflections. When using Nvidia Ansel as a photo mode (ALT+F2; add an empty profile for the game in the NV control panel), you can force "High quality" in the Engine settings and the reflections are so much better, yet performance seems the same on a 4090 (DLSS Quality + FG, all maxed out, 116 FPS VRR).

Frame Generation is so good (I've swapped in Cyberpunk's DLL, version 1.0.5; Hogwarts ships with 1.0.3). Latency isn't even an issue anymore and I didn't notice artifacts (except in some scrolling menus, where it isn't disabled).
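
For anyone wanting to try the same DLL swap, here's a minimal sketch of how you could script it. The Steam library path and the "Phoenix" binaries folder are assumptions about a default install, so verify both before running, and keep the backup so you can roll back:

# Rough sketch of the nvngx_dlssg.dll swap described above.
# Both paths are assumptions (default Steam library, "Phoenix" project folder) - adjust for your install.
import shutil
from pathlib import Path

steam = Path(r"C:\Program Files (x86)\Steam\steamapps\common")
src = steam / "Cyberpunk 2077" / "bin" / "x64" / "nvngx_dlssg.dll"                      # newer DLL (1.0.5)
dst = steam / "Hogwarts Legacy" / "Phoenix" / "Binaries" / "Win64" / "nvngx_dlssg.dll"  # shipped DLL (1.0.3)

shutil.copy2(dst, dst.with_name(dst.name + ".bak"))  # back up the original so you can revert
shutil.copy2(src, dst)
print(f"Replaced {dst.name} with the copy from Cyberpunk 2077")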

RT AO is definitely broken; hopefully a patch or a Game Ready driver fixes that. The default AO is still there and there's no visible RT AO.
Where/how do I need to add a line in the Engine.ini file?
 

GHG

Member
I'm running a 4090 with an Intel 13900K, but I'm getting stutters. Also, in the castle at least, my frame rate is oscillating between 65-90 fps. I turned frame generation off because I couldn't use vsync with it on, and I was getting some pretty nasty tearing without vsync. Everything is set to Ultra, using DLSS Quality mode.

Thanks, I'll try that later. Still new to PC gaming, so the help is definitely appreciated.

I posted about this in the OT; it should help you get things running smoothly with frame gen turned on:

https://www.neogaf.com/threads/hogwarts-legacy-ot-the-prisoners-of-askforban.1651647/post-267510015
 

phant0m

Member
How's it running on Steam Deck?
Honestly, pretty alright. The out-of-the-box config will get you a pretty solid (and decent-looking for the form factor) 30 fps experience. With some tweaking though (change FSR from Quality to Balanced, change the hardware refresh rate and cap to 40), I was able to get a very consistent 40 fps. It stutters when loading in/out of areas, which some may find annoying/immersion-breaking, but it otherwise doesn't affect gameplay.

some tips:
  • FSR 1.0 runs better than FSR 2.0 (hopefully this will change after some updates)
  • crank up the FSR sharpness to 0.4 or 0.5 to clean up the image
  • disable vsync in-game
  • disabling in-game FSR and enabling the SteamOS FSR did not result in good performance
I'm sure (hoping?) DF or another tech YouTuber will do all the fiddling and drop a recommended config.
 

Xcell Miguel

Gold Member
Where/how do I need to add a line in the Engine.ini file?
Maybe that's something that can be forced in the ini files, but I'm talking about a setting in the Nvidia Ansel UI: in the left panel when you enable Ansel (ALT+F2), there's a dropdown with "Engine" in the name, then a setting like "High quality" or something like that.
It seems to force higher LODs and better RT reflections when the camera is static (you can still see some RT shimmering sometimes); when you move the camera it turns off temporarily.
I don't know if it's just an Nvidia tweak or if it can be forced in gameplay.
 

Captn

Member
It's using Unreal 4, right? Maybe some tweaks can be done to the config files, like with The Witcher 3, to raise the ray tracing quality or any other graphics settings for that matter.

Gonna have to test and see.
 

Topher

Gold Member
I keep wanting to try this game on Steam Deck but for some reason cloud save is not syncing. Anyone else have this issue?
 

KyoZz

Tag, you're it.
It's using Unreal 4, right? Maybe some tweaks can be done to the config files, like with The Witcher 3, to raise the ray tracing quality or any other graphics settings for that matter.

Gonna have to test and see.
If it can help, you can use ALT+F2 to get into the Nvidia photo mode. Then under "Engine" set the high quality mode. It adds better ray-traced reflections & shadows and a bigger draw distance for the vegetation, among other things.
 

SlimySnake

Flashless at the Golden Globes
Preloaded. Have a 3080 paired with an i7-11700KF. Am I looking at 1440p 60 fps with RT and DLSS 2.0 at Quality, or can I go 4K DLSS Quality?

I'm willing to give up RT for shadows and AO. Reflections I can't do without.
 
Won't be able to play for another 7 hours or so, but I'll post how my 4080 runs at 1440p. I don't care for 4K as I'm more interested in high frame rates.

Is it even worth enabling RTX? Seems like it's broken and/or doesn't look any better.

Has Nvidia released drivers yet? Is the "day 1" patch out?
 

lefty1117

Gold Member
On my 2080 Super I noticed a big performance pickup when I turned off ray-traced ambient occlusion but left the other RT options on. I also noticed the same thing in Midnight Suns. It just seems like too big a performance hit to use RT AO at this time. I'm running Ultra settings on everything except RT, which is on High; 1440p with HDR on, getting 80-100 fps in Hogwarts. Another thing I did was use Balanced on DLSS instead of Quality ... a significant performance pickup for a negligible visual quality hit, at least to my eyes.
 

ACESHIGH

Banned
Anyone tried with 8GB RAM just for shits and giggles? Allocation seems to hover around 13 and 14 GB on 16 GB systems, and that's with the game maxed out.
 
Anyone tried with 8GB RAM just for shits and giggles? Allocation seems to hover around 13 and 14 GB on 16 GB systems, and that's with the game maxed out.
Don't know about 8GB. The game can use about 18-19GB if you have 32GB of RAM.

Sounds like a lot, but I think this is going to be normal going forward. Fortnite uses 19-20GB these days after the UE 5.1 update if you max everything out.
 

JayK47

Member
The game kept wanting me to run on medium settings. I switched to ultra and it seems fine to me. Pretty consistently runs at 60fps.
 


As the title states, discuss any fixes, improvements, or even mods.

Runs well on a 4090/5900x/64gb DDR4/NVME

Averaging 120fps with DLSS+frame generation @4k completely maxed.

Regarding DLSS
Lol dude
 

The Cockatrice

Gold Member
First post "Plays well on a 4090".
This thread is over before it started.

GAF is a small community and most of the PC users here are elitists that will always tell you to buy the most expensive GPU. Don't worry about it. Check other places, imo. I mean, just take a look at some of the replies: even when they know RT in this game is broken, they still play with it on. RT shadows are much worse than non-RT, and AO without RTGI is useless; they're just wasting frames.

I'm an elitist as well, but I'll never play games on max just because I have a high-end PC. Optimizing your games and lowering some idiotic graphical settings that offer almost no visual difference is what I love. That being said, fuck these devs for nerfing SSR so badly that it makes RT reflections look good. I've seen SSR in old-ass games that looks better than the half-assed RT implementation in this game. Not even the water has RT.
 

b0uncyfr0

Member
Also curious about 6600 XT performance. There was that 3060 Ti video hitting 100+ fps outside and 60+ indoors at 1440p Medium. That's not bad at all.
 
Playing in Ultra, 4k, FSR 2 Quality, No Ray Tracing, so far 60 fps locked.
RX 6800 XT Taichi + Ryzen 7 5800X + 32GB Ram
 

rofif

Can’t Git Gud
"Runs well on a 4090/5900x/64gb DDR4/NVME"
And listing fps numbers with frame generation is like measuring dick length from your feet. It's doubling the fps artificially and adding input lag. The same people who argue over every ms of input lag are so easily fooled by a fake framerate that adds lag.

Dog Eye Roll GIF by Rover.com
 

GymWolf

Member
GAF is the only place that makes me feel like a toothless hobo while rocking a 4080/13600K combo...

Anyone interested in a 30-euro Standard Edition key?
 

Mister Wolf

Member
Is it amazing? Aren't you feeling like a fool for having to fix simple things like this on an expensive PC while playing a new game you purchased?
Sure, it's good that you can do it, but wtf

Not at all. It's not fixing anything. The developers simply chose not to make ray tracing as heavy as it could have been, by casting fewer rays and lowering the resolution. When developers do that on consoles, like those shitty reflections in Spider-Man, it's called "optimization". We are simply bypassing their cutoff point because we can afford the fps hit. It's a luxury afforded to us PC gamers and one of the reasons I'm willing to pay for premium hardware.
 

nikos

Member
"Runs well on a 4090/5900x/64gb DDR4/NVME"
And listing fps numbers with frame generation is like measuring dick length from your feet. It's doubling the fps artificially and adding input lag. The same people who argue over every ms of input lag are so easily fooled by a fake framerate that adds lag.

Dog Eye Roll GIF by Rover.com
Sounds like you’ve never actually experienced Frame Generation.
 

DareDaniel

Banned
Booted it up on my system last night.

No DualSense support, or even PS controller support (n)
Runs great, mostly 120 fps at 4K (y)

I wouldn't say there is a discernible difference between it and the PS5 version in its Balanced or Fidelity modes.
DualShock 4 works with DS4Windows and the controller icons also show up during gameplay.
 

rofif

Can’t Git Gud
Sounds like you’ve never actually experienced Frame Generation.
I haven't. I'm using a 3080.
It's doubled fake frames with artifacts, plus added input latency.
It's all measured and factual. I don't doubt that it feels fine and that you can't see the artifacts, but we're talking performance comparisons here. Real fps, not this.
 
Ok so I’ve played a bit more and it’s a great game.

A few issues though. I had to turn off FreeSync because the shimmering was fucking awful. Also, the sound mix is way off. The voices are so quiet at times, and characters can sound like they're on another plane of existence even though they're standing 5 ft away.

Lack of DualSense features is painful too!
 

ANDS

King of Gaslighting
Anyone with a 6700 XT playing it? I'm curious.

6800 XT, but it runs well enough on Quality mode with a few dips and crashes so far (it's only crashed three times, and two of those were while transitioning to a heavy scene). The game definitely needs the patch, and the RAM utilization probably needs explaining, but if you're not trying to run native 4K with everything on Ultra, you'll get excellent performance (I'm running mostly High with Ultra textures and some dumb stuff turned down, like the sky).
 

winjer

Gold Member
It's using Unreal 4, right? Maybe some tweaks can be done to the config files, like with The Witcher 3, to raise the ray tracing quality or any other graphics settings for that matter.

Gonna have to test and see.

Add this to the Engine.ini:

[SystemSettings]
r.RayTracing.Reflections.ScreenPercentage=100
r.RayTracing.Reflections.SamplesPerPixel=1
r.RayTracing.Reflections.MaxRoughness=0.7
r.RayTracing.AmbientOcclusion.Intensity=1
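
If you're wondering where that file lives, it should be the usual UE4 per-user config location, %LOCALAPPDATA%\Hogwarts Legacy\Saved\Config\WindowsNoEditor\Engine.ini (that path is an assumption, so double-check it exists on your machine). A minimal sketch that appends the tweaks there:

# Minimal sketch: append the [SystemSettings] tweaks above to Engine.ini.
# The config path is the standard UE4 location and is an assumption - verify it before running.
import os
from pathlib import Path

engine_ini = (Path(os.environ["LOCALAPPDATA"]) / "Hogwarts Legacy" / "Saved"
              / "Config" / "WindowsNoEditor" / "Engine.ini")

tweaks = """
[SystemSettings]
r.RayTracing.Reflections.ScreenPercentage=100
r.RayTracing.Reflections.SamplesPerPixel=1
r.RayTracing.Reflections.MaxRoughness=0.7
r.RayTracing.AmbientOcclusion.Intensity=1
"""

with engine_ini.open("a", encoding="utf-8") as f:  # append so existing settings stay intact
    f.write(tweaks)
print(f"Appended RT tweaks to {engine_ini}")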
 

GymWolf

Member
If you've got your hardware then just get it set up correctly, try it and form your own opinion.

Too many jealous cunts chatting shit about it when they don't know what they're talking about.
I have to wait until Friday, unfortunately...
 