
Starfield PC Performance Thread

Pagusas

Elden Member
I'm on a 4070 Ti and an i7-13700K and it's extremely smooth for me. Using the DLSS mod at 66% render scale with minimal loss of quality. No frame gen or DLSS 3 and I'm still getting 80-120 in most parts. Occasional dips into the high 60s, but the point is it's very smooth.

I don't think it's people on current gen hardware that are complaining. If you have a 4-5 year old computer this won't run very well, at least not without big compromises.
Yeah I agree with this based off what I’m seeing. Friend has a 1080 complaining about having to play at a capped 30 and medium settings. Like dude, what do you expect? They make settings tiered like this for that very reason! I’m sorry you can’t play every game at Ultra 4k 120, but if you want that you best start actually investing in your hardware.
 
Bizarrely, I'm having a better experience after turning preset from low to high and render resolution from 70 to 90%.

The FPS, while low, is actually stable for some reason.
 

SF Kosmo

Al Jazeera Special Reporter
Yeah I agree with this based off what I’m seeing. Friend has a 1080 complaining about having to play at a capped 30 and medium settings. Like dude, what do you expect? They make settings tiered like this for that very reason! I’m sorry you can’t play every game at Ultra 4k 120, but if you want that you best start actually investing in your hardware.
I get it in the sense that we've had a very long cross gen period and people have been able to hold on to those GTX 1080s and i7-8700ks for a long time and still have a decent time.

But those days are finally coming to an end. Remnant 2, Immortals of Aveum, Starfield... these games are extremely heavy on PC and console alike and they aren't going to run on last gen. Unfortunately that means last gen PCs are also going to get left behind.

It also means that if you're expecting better-than-console premium performance targets in these games like 4K/60 you are going to need something on the higher end of current gen too.
 

TIGERCOOL

Member
These may have been shared, but a couple tips for Nvidia users that have helped me big time:
(3080 12gb, 5800x3d)

1. If you're using the DLSS mod on quality mode, use Profile Inspector and set Antialiasing - Transparency Supersampling to 0x0....8 (AA_REPLAY, etc) and Texture Filtering - LOD Bias (DX) to -0.5000. Anything lower than that introduces weird shadow artifacts. Cleaned up the image a lot. I think balanced would be good at -1.0000, performance mode at -1.5000. Just check for flickering on planets in orbit and adjust accordingly.

DLSS 3.5 and Luke's mod (latest update), and I find having Nvidia scaling completely OFF is much nicer.

2. Driver-forced ReBAR. This one's huge. Before, the Akila market area was low 60s for me (ultra, other than crowd density and shadows, which are both one step down); after applying ReBAR I'm getting mid 70s. The upstairs bar in The Rock (strangely demanding on higher settings) also went from low 60s to 80fps. Haven't tested New Atlantis.

I've never noticed a difference with ReBAR like this before. Complete game changer if you can't use fancy-pants frame gen.
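For what it's worth, the usual rule of thumb in the upscaler-modding community is to set the mip/LOD bias to log2 of the internal render scale, so textures are sampled as if you were rendering at full output resolution. A quick sketch (the per-mode scale factors below are assumed DLSS-style ratios; note the -0.5/-1.0/-1.5 values posted above are a bit more aggressive than this formula suggests):

```python
import math

def mip_bias(render_scale: float) -> float:
    """Suggested (negative) texture LOD bias for an internal render scale in (0, 1]."""
    # log2 of the scale: 100% scale -> 0.0, 50% scale -> -1.0, etc.
    return math.log2(render_scale)

# Assumed DLSS-style scale factors, for illustration only:
for mode, scale in [("quality", 0.667), ("balanced", 0.58), ("performance", 0.50)]:
    print(f"{mode}: scale {scale:.0%} -> bias {mip_bias(scale):+.3f}")
```

By this formula quality mode (~67% scale) lands around -0.58 and performance mode (50%) at exactly -1.0; going more negative trades shimmering/artifact risk for sharper distant textures, which matches the flickering warning above.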
 

SF Kosmo

Al Jazeera Special Reporter
13600K / 32GB / EVGA 3060 Ti / 1080p 144Hz. Starfield's Nvidia optimization needs some help; I'm using Hardware Unboxed's optimized settings and still see 45fps at its lowest (outdoors in New Atlantis) and 60-100fps indoors. No upscaling being used (native 1080p).
On a 3060Ti you probably want to install the DLSS mod and use some upscaling.
 

The Cockatrice

Gold Member
Atlantis is 70FPS for me at 1440p DLSS, no FG, on a 4070 Ti, which is ok considering you mostly run around. But god damn, wasn't expecting to run around for so many hours; I've spent almost 8 hours in the city and I'm prolly going to need another 4-ish to finish the side quests that take place there.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
These may have been shared, but a couple tips for Nvidia users that have helped me big time:
(3080 12gb, 5800x3d)

1. If you're using the DLSS mod on quality mode, use Profile Inspector and set Antialiasing - Transparency Supersampling to 0x0....8 (AA_REPLAY, etc) and Texture Filtering - LOD Bias (DX) to -0.5000. Anything lower than that introduces weird shadow artifacts. Cleaned up the image a lot. I think balanced would be good at -1.0000, performance mode at -1.5000. Just check for flickering on planets in orbit and adjust accordingly.

DLSS 3.5 and Luke's mod (latest update), and I find having Nvidia scaling completely OFF is much nicer.

2. Driver-forced ReBAR. This one's huge. Before, the Akila market area was low 60s for me (ultra, other than crowd density and shadows, which are both one step down); after applying ReBAR I'm getting mid 70s. The upstairs bar in The Rock (strangely demanding on higher settings) also went from low 60s to 80fps. Haven't tested New Atlantis.

I've never noticed a difference with ReBAR like this before. Complete game changer if you can't use fancy-pants frame gen.
i also have a 3080, but the 10 gb model. i enabled rebar from the nvidia inspector but didnt see a massive difference. is that what you mean by driver forced rebar?

i downloaded this luke mod yesterday and it just kept crashing. is this the luke mod?


do i need the dlss 3 framegen dll in there as well? since my card is a 3080 series card, could this framegen dll be causing issues?
 

TIGERCOOL

Member
i also have a 3080, but the 10 gb model. i enabled rebar from the nvidia inspector but didnt see a massive difference. is that what you mean by driver forced rebar?

i downloaded this luke mod yesterday and it just kept crashing. is this the luke mod?


do i need the dlss 3 framegen dll in there as well? since my card is a 3080 series card, could this framegen dll be causing issues?
Wrong mod. It's the non-framegen version you need. Called fsr bridge on nexus. Did you turn on all 3 options for rebar in the inspector?
 

amigastar

Member
Idk man, but the graphics in Starfield look fine to me. Coming from Fallout 3 they're really enhanced (as they should be, considering how old Fallout 3 is, but still).
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Wrong mod. It's the non-framegen version you need. Called fsr bridge on nexus. Did you turn on all 3 options for rebar in the inspector?
just found the other mod. im an idiot.

and yes, all three. i will try your AA tips after i get it running. im running this on a 4k tv so hoping dlss performance looks as good as fsr quality.

lf3MQYp.jpg
 

SlimySnake

Flashless at the Golden Globes
This channel is really good. As good as Alex at Digital Foundry. Found him earlier this year when TLOU and HP came out all fucked up.

Anyone give these a try?



If not, i will report back later today.

EDIT: Really good settings. I couldn't notice a visual downgrade and I was getting 60 fps in New Atlantis most of the time, with drops to 57 fps. DLSS set to Quality at 4k. Akila was averaging 55 fps.
 
Last edited:
I have an i9-10850K and RTX 3090 trying to run it at 4k (LG B series OLED). 32gb ram, game installed on an ssd. I set everything to ultra but kept resolution rendering at the default, which was 75%. I get 35-50fps but assumed it would be better…
 

Ovech-King

Gold Member
I have an i9-10850K and RTX 3090 trying to run it at 4k (LG B series OLED). 32gb ram, game installed on an ssd. I set everything to ultra but kept resolution rendering at the default, which was 75%. I get 35-50fps but assumed it would be better…
4K output is brutal for any GPU. Since you don't have access to frame generation, if you're using the FSR the game offers, I found that even at a lower resolution scale with the sharpness cranked up the game still looks very clean. I'd say drop to 60% (or maybe even 50%) and get the FPS closer to 60.
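As a rough sanity check on how much a lower render scale can buy (a naive GPU-bound model where cost tracks shaded pixel count, not a measurement):

```python
def estimated_fps(current_fps: float, current_scale: float, new_scale: float) -> float:
    """Naive GPU-bound estimate: FPS scales with 1/scale^2 (pixel count)."""
    return current_fps * (current_scale / new_scale) ** 2

# Dropping from the default 75% scale to 60% at ~40 fps:
print(round(estimated_fps(40, 0.75, 0.60)))  # ~62 under this naive model
```

Real gains will be smaller whenever the CPU is the bottleneck (e.g. in the cities), which is why scale drops help less in New Atlantis than out in the wilds.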
 

SlimySnake

Flashless at the Golden Globes
I have an i9-10850K and RTX 3090 trying to run it at 4k (LG B series OLED). 32gb ram, game installed on an ssd. I set everything to ultra but kept resolution rendering at the default, which was 75%. I get 35-50fps but assumed it would be better…
Ultra is bs. you dont need it and unfortunately, the 3090 is no longer the ultra card anymore. thats the 4090 so time to settle for high.

Change resolution rendering to 67% and download the dlss mod. you will easily get 60 fps.
 

Sleepwalker

Member
7800X3D + 4080 and 32gb DDR5 6000MHz ram, im thinking i should be ok for 4k. Have it downloaded but havent bothered starting yet; now that im done with AC6, the time is near
 
Thanks - I will try those suggestions and see how it looks. At this stage I still value visual fidelity over fps so will see what high/cranking down resolutions render looks like.

I do think the game could be optimized better. I run RDR2 on ultra at 4K and that's a pretty consistent 50-60 fps. Btw, what software do you all use for FPS monitoring? I'm basing this on the Xbox game bar haha
 

SlimySnake

Flashless at the Golden Globes
These may have been shared, but a couple tips for Nvidia users that have helped me big time:
(3080 12gb, 5800x3d)

1. If you're using the DLSS mod on quality mode, use Profile Inspector and set Antialiasing - Transparency Supersampling to 0x0....8 (AA_REPLAY, etc) and Texture Filtering - LOD Bias (DX) to -0.5000. Anything lower than that introduces weird shadow artifacts. Cleaned up the image a lot. I think balanced would be good at -1.0000, performance mode at -1.5000. Just check for flickering on planets in orbit and adjust accordingly.

DLSS 3.5 and Luke's mod (latest update), and I find having Nvidia scaling completely OFF is much nicer.

2. Driver-forced ReBAR. This one's huge. Before, the Akila market area was low 60s for me (ultra, other than crowd density and shadows, which are both one step down); after applying ReBAR I'm getting mid 70s. The upstairs bar in The Rock (strangely demanding on higher settings) also went from low 60s to 80fps. Haven't tested New Atlantis.

I've never noticed a difference with ReBAR like this before. Complete game changer if you can't use fancy-pants frame gen.
did you mean these two settings? LOD Bias is under Texture Filtering.
KiGCJ57.jpg


DLSS mod works fine. im never going back to FSR. Pretty sure im seeing a performance gain too. 2-3 fps but its definitely pushing my CPU harder. saw temps and wattage go up really high in Atlantis.
 

SlimySnake

Flashless at the Golden Globes
Thanks - I will try those suggestions and see how it looks. At this stage I still value visual fidelity over fps so will see what high/cranking down resolutions render looks like.

I do think the game could be optimized better. I run RDR2 on ultra at 4K and that's a pretty consistent 50-60 fps. Btw, what software do you all use for FPS monitoring? I'm basing this on the Xbox game bar haha
MSI Afterburner with RivaTuner Statistics Server; the two work together.

The best feature is that they let you see the 1% and 0.1% lows, so you can catch stutters that don't show up on average framerate counters.

Something like this.

F5eGSVbXwAETHF4
 

R6Rider

Gold Member
I'm still getting random GPU usage drops to single digits (and sometimes 0). Usually indoors and during conversations.

Super annoying, especially after trying multiple things to fix it.
 
Oof. I'll just wait for this shit to buff out. Optimizing for a small minority of people isn't the best thing to do.

The AMD optimization announcement saved me whatever the GMG rates for the premium edition were.
 

TIGERCOOL

Member
did you mean these two settings? LOD Bias is under Texture Filtering.
KiGCJ57.jpg


DLSS mod works fine. im never going back to FSR. Pretty sure im seeing a performance gain too. 2-3 fps but its definitely pushing my CPU harder. saw temps and wattage go up really high in Atlantis.
That's the one!
 

TIGERCOOL

Member
What exactly does that setting do? I have noticed some lod popin since i switched to dlss that i don’t remember being there before. Mostly on akila.
Essentially gives more texture data to DLSS/FSR. It should be set by the devs but sometimes isn't, and it's been discovered that Bethesda didn't set it in this case. That made for some ugly distant textures and increased shimmering. I've noticed about the same amount of pop-in with it on and off... though it's probably loading more detail at a distance, which some may notice. I preferred -1.5000, but it started to introduce shadow artefacts beyond -1.0000 in quality mode (67% render scale) at 1440p. As always, YMMV. Try messing around with the setting a bit.
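A sketch of the underlying mechanic (standard GPU mip selection, nothing Starfield-specific): the hardware computes a mip level from how many texels fall under one pixel, and the LOD bias is added to that before clamping, so a negative bias selects a sharper (lower-numbered) mip than it otherwise would.

```python
import math

def selected_mip(texels_per_pixel: float, bias: float, max_mip: int = 12) -> int:
    """Simplified mip selection: level ~ log2(texel density) + LOD bias, clamped."""
    level = math.log2(max(texels_per_pixel, 1e-6)) + bias
    return min(max(round(level), 0), max_mip)

print(selected_mip(4.0, 0.0))    # 2: four texels per pixel -> mip 2
print(selected_mip(4.0, -1.0))   # 1: a -1.0 bias pulls in the next sharper mip
```

That's also why too much negative bias causes shimmering: the sharper mips are under-filtered at that pixel density, which the temporal upscaler then has to clean up.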
 

Bojji

Member
GTAO used in Starfield looks worse than the HBAO+ used in Fallout 4

HBAO+ is still the best non RT form of AO (there is also VXAO but that's voxel based).

But of course it's AMD sponsored, etc. We can't have nice things.

I remember far cry 4 with amazing tessellated god rays on pc and Nvidia fur tech, then they went with AMD and Primal (and FC5 partially) looked way worse than that game.
 
Last edited:

GymWolf

Member
The game overall runs worse than most other pc games we had this year on my end.

Sometimes it's fluid but i have constant hiccups all the time, it just feels bad to play.

If only the game looked much better than stuff like atomic heart or dead island 2 i could understand, but overall, it doesn't...

And no, I'm not gonna study how to fix a game with fucking mods when I paid big money for it. Todd needs to get his head out of his stupid ass for once.
 

winjer

Gold Member
HBAO+ is still the best non RT form of AO (there is also VXAO but that's voxel based).

But of course AMD sponsored etc. we can't have nic things.

I remember far cry 4 with amazing tessellated god rays on pc and Nvidia fur tech, then they went with AMD and Primal (and FC5 partially) looked way worse than that game.

GTAO is the most advanced form of AO, if used properly by a competent studio. Something that Bethesda is far from ever being.
It does what other AO forms do, but it is aware of GI, so its shadows will match the scene lighting.

AMD's sponsored AO is not GTAO. It's CACAO.
So no, AMD did not sponsor the AO in this game.

The company that has its own implementation of GTAO is Intel, called XeGTAO. But they didn't sponsor this game.
Few games have implemented XeGTAO. One of them was Ratchet and Clank.
 

GymWolf

Member
There are like 3 ReBAR entries in NV Inspector, each with multiple options to choose from. Can you people be more specific about what to turn on/set?
 

winjer

Gold Member

Starfield holds significant importance in the gaming landscape for 2023, and Intel recognizes the need to provide gamers with a great gaming experience. Regrettably, Intel’s GPU drivers initially fell short of ensuring a seamless gaming experience during the early access phase. However, the company promptly addressed this issue by releasing the first driver specifically optimized for Starfield.

Nonetheless, that release did not constitute the full ‘Game On’ driver, nor does the latest driver, recently unveiled. While Intel has resolved certain bugs and introduced further optimizations, the roster of known issues remains more extensive than the fixes applied. Nevertheless, Intel’s consistent stream of driver updates demonstrates their commitment to enhancing the gaming experience, considering the circumstances.

Regrettably, Intel was unable to deliver a fully optimized driver for the game’s launch on September 6th, potentially necessitating Alchemist GPU series gamers to exercise patience.

But some gamers are losing that patience, like the one who reached out to Bethesda for support. A gamer who sought assistance from Bethesda's consumer support was informed that the Arc A770 GPU does not meet the minimum requirement, which specifies a Radeon RX 5700 or GTX 1070 Ti:





Despite the Arc A770 GPU’s better performance compared to the aforementioned cards, Bethesda’s generic response suggests a lack of comprehensive guidance on how to address or optimize the game for their hardware. Consequently, Arc gamers may have to rely on Intel to resolve issues that Bethesda appears unwilling to tackle independently. One can only hope that Intel and Bethesda are working on Starfield optimizations together and Intel will have its “Game On” fully optimized driver for Starfield soon.

Intel's newest driver:

 
Last edited:

mansoor1980

Gold Member



Intel's newest driver:

looks like ARC users need an "UPGRADE"
 

SlimySnake

Flashless at the Golden Globes
Essentially gives more texture data to DLSS/fsr. It should be set by devs but sometimes isn't, and has been discovered not to have been by bathesda in this case. It made for some ugly distance textures and increased shimmering. I've noticed about the same amount of pop-in with it on and off... though it's probably loading more detail at a distance which some may notice. I preferred -1.5000 but it started to introduce shadow artefacts at -1.000 in quality mode (67% render scale) at 1440p. As always, YMMV. Try to mess around with the setting a bit.
Thanks. I’m at 4k dlss quality so 1440p internal.
 

Bojji

Member
GTAO is the most advanced form of AO, if used properly by a competent studio. Something that Bethesda is far from ever being.
It does what other AO forms do, but it is aware of GI, so its shadows will match the scene lighting.

AMD's sponsored AO is not GTAO. It's CACAO.
So no, AMD did not sponsor the AO in this game.

The company that has its own implementation of GTAO is Intel, called XeGTAO. But they didn't sponsor this game.
Few games have implemented XeGTAO. One of them was Ratchet and Clank.

This game is the first time I heard about it so I thought it was something AMD related.

AO quality is one of the best parts of this game graphics.
 

tronied

Member
RTX 3090 with a 12900K. Max everything @ 4k gives 45-85fps. It varies from place to place, but it feels very smooth overall.

Haven't downloaded any mods or anything, but yeah pretty impressed with it.
 

winjer

Gold Member

SlimySnake

Flashless at the Golden Globes
This game is the first time I heard about it so I thought it was something AMD related.

AO quality is one of the best parts of this game graphics.
Remember to set it to Ultra. very low performance hit and it looks best at that setting.

cpu utilization looks extremely poor on Ryzen cpu, amd only sponsored the GPU it looks like
nah, it just prefers higher clocks. Higher-clocked AMD CPUs fare well against both Intel and other AMD CPUs. And CPU utilization is very high in this game, especially in CPU-bound cities. The game prefers AMD GPUs though. A 6800 XT trades blows with a 3080 in most games, but here it is on par with a 3090 Ti.


S7metgG.jpg
 

peish

Member
Remember to set it to Ultra. very low performance hit and it looks best at that setting.


nah, it just prefers higher clocks. Higher-clocked AMD CPUs fare well against both Intel and other AMD CPUs. And CPU utilization is very high in this game, especially in CPU-bound cities. The game prefers AMD GPUs though. A 6800 XT trades blows with a 3080 in most games, but here it is on par with a 3090 Ti.


S7metgG.jpg

nope, I have a 7950X3D and the 3D cores aren't even stressed to 3GHz.
 

SlimySnake

Flashless at the Golden Globes
Which is so strange considering the amd based consoles it runs on, I mean, you would think they would optimize for that…
but they did. What people don't realize is that the consoles are no longer rocking weak 8-core 1.6 GHz Jaguar CPUs. Bethesda used up all 8 cores and 16 threads at 3.5 GHz on the XSX just to get it running at 30 fps at 1440p. Obviously, in order to run this CPU-heavy game at double the framerate you need not just a 2x faster GPU, but also a 2x faster CPU. The AMD 3000 and 5000 series only run around 4.1-4.45 GHz. That's what, 15-20% faster? They have more cache, ok, let's say another 10-20% better performance. You're still only at 25-40% faster CPUs. You need 100% more performance to get 100% more framerate.

Besides, some of these CPUs are only 6 cores and 12 threads. The XSX reserves 1 core and 2 threads for the OS, so it starts off with an advantage against the 3600 and 5600X, both VERY popular CPUs that top out at 4.2 GHz. That's just not enough power to do the same CPU work in half the time. Meanwhile, my Intel CPU consistently hits 5.0 GHz, causing my PC and myself to sweat. Thankfully it doesn't hit that high when doing regular missions indoors and out in the open world, but that just means the game IS optimized, because when it does get to cities with a lot of NPCs, shops and vendors, it uses up all the CPU power it can get.

The 7000 series AMD CPUs scale just fine, outperforming last gen Intel CPUs just like you would expect them to.
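The back-of-envelope math above can be sketched as follows (the clock and IPC numbers are the post's own assumptions, not benchmarks, and the perfect-scaling model is obviously optimistic):

```python
def relative_throughput(clock_ghz: float, ipc_uplift: float = 1.0,
                        console_clock_ghz: float = 3.5) -> float:
    """Naive model: throughput vs the console CPU ~ clock ratio x IPC/cache uplift."""
    return (clock_ghz / console_clock_ghz) * ipc_uplift

# Ryzen 5000-class part at 4.4 GHz with an assumed ~15% IPC/cache edge,
# against a console CPU target of 30 fps:
r = relative_throughput(4.4, ipc_uplift=1.15)
print(f"{r:.2f}x console CPU -> ~{30 * r:.0f} fps if the game scaled perfectly")
```

Under this model a ~1.45x CPU lands in the mid 40s, not 60, which is the post's point: doubling a CPU-bound 30 fps target needs roughly 2x the console's CPU throughput.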
 
Last edited:
3060 6GB with an i7 11800H:

The game runs “okay”, but it looks awful. The texture quality seems so incredibly low.

I get 60-70fps on medium settings, and 40-50 fps on high. The game looks significantly better on high.
 

Pagusas

Elden Member
I'm hoping we get a mod that lets us separate camera motion blur from per-object motion blur. Object motion blur is a GOOD thing, and it looks very good in Starfield. But the camera blur does not look good. I hope they can be separated.
 