
Starfield PC Performance Thread

TIGERCOOL

Member
https://www.nexusmods.com/starfield/mods/595 - Using this too which does make the best looking parts of the game look worse (the clinical looking indoor bits) but makes some of the worst looking bits look way better (caves and other dark places) - A good trade-off for me personally as this game has very uneven visuals but it will always be hard to get everything looking right when the colour of scenes is changed using filters rather than RT.
If you have an HDR monitor I recommend using Special K's HDR settings. You can fix just about everything wrong with the game's gamma and saturation without altering the developers' intent too drastically.
 

winjer

Member
Good CPU scaling. Time to upgrade CPUs.



It's clear that Bethesda is relying on gamers to upgrade their machines, instead of Bethesda doing some proper optimization. So they are shifting the cost onto gamers.
There are several games on the market that look better and have much better AI and physics than Scamfield, yet aren't as heavy as this.
 

Xcell Miguel

Gold Member
The guy who made the DLSS Bridge mod (which some say gives better performance than the PureDark one) has published the free DLSS 3 version. I can't try it right now:


Also, there is a new LUTs mod with reduced color and better contrast:

I'm using the 25% one for now, waiting for the 25% or 50% with better contrast (for now only the 75% has it).

EDIT: another LUT mod, 14% this time. I'll try it, as the 25% one doesn't have pure blacks for now:
 

Rossco EZ

Member
guys, i watched a tutorial on how to do the dlss2 mod and followed it correctly, but to bring up the menu to turn dlss on it says to hit the End key. my keyboard doesn't have one, is there another way to bring it up?

pure dark mod btw
 

Virex

Banned
guys, i watched a tutorial on how to do the dlss2 mod and followed it correctly, but to bring up the menu to turn dlss on it says to hit the End key. my keyboard doesn't have one, is there another way to bring it up?

pure dark mod btw
If you can get a reservation at Dorsia I might tell you exactly how to do it.

P.S. Do you like Phil Collins?
 

SlimySnake

Flashless at the Golden Globes
It's clear that Bethesda is relying on gamers to upgrade their machines, instead of Bethesda doing some proper optimization. So they are shifting the cost onto gamers.
There are several games on the market that look better and have much better AI and physics than Scamfield, yet aren't as heavy as this.
Eh. This game is probably the only game that taxes my CPU beyond 50%. It is properly multithreaded. Each core and each thread is up there around 70-80% in stress test areas. My CPU was actually hitting 70 degrees at one point with over 120 watts of power usage. Not even Cyberpunk, which is another well-multithreaded game, hits that high.

I see people rocking 9900ks with their $2000 4090s. Come on. you dont need to go get a $400 cpu like the 7800x3d. There are plenty of $200-300 CPUs on that list. i bought my CPU for just $250 down from $299 after packaging it up with a mobo. it is not a bottleneck in starfield.

IIRC, the 9900k doesn't even have 2 threads per core. it's just 8 cores. AMD's Zen 2 lineup just stinks. You can see why the XSX can't do 60 fps by looking at the 3600. we've seen similar things with Sony exclusives like Spider-Man, where going from a 3600 to a 13900k doubles the framerate on a 4090.
 

draliko

Member
I'm still torn between keeping my 7900xt or trying to get a 4070ti and keeping 100€ in my pocket... Or going even lower with a 4070 but moving to a 5800x3d... Damn... FSR3 is really keeping us waiting...😅
 

Xcell Miguel

Gold Member
guys, i watched a tutorial on how to do the dlss2 mod and followed it correctly, but to bring up the menu to turn dlss on it says to hit the End key. my keyboard doesn't have one, is there another way to bring it up?

pure dark mod btw
Use the other mod, easier to install, better performance, and not made by a modder putting stuff behind a paywall and DRM.


And if you have a 4000 series and want Frame Generation:

 

Buggy Loop

Member
The budget was apparently just over USD$200m, so given that's seemingly your sole justification, you'll halve the strength of your criticism, correct?


It's unclear as of yet how much Starfield will cost in total with all other parameters combined, but speculation says that the final figure should be somewhere around $300-400 million, considering the stature of the title at hand and the amount of time it's been under development. Moreover, if the 500-member development team part is correct, costs are certain to be considerably higher than $200 million.


As for "defending", you're posting non-sense and you're getting pushback. Did you expect to shit post without pushback? If you're not happy, post less non-sense.
It definitely underperforms - especially on NVidia hardware - but graphics and tech seem acceptable, given the RPG nature and scale of the game. You're struggling to describe anything that "deserves a bit of shaming".

I'm not posting nonsense, I'm saying that it performs worse than the fucking HOG that is the Star Citizen engine and that IS saying something.

There's nothing in what Creation Engine 2 is presenting that requires the hardware it does, outside of bad optimization if you want to go there. The RPG nature, with a dialogue tree like we've had since the dawn of cRPGs, has nothing to do with it, especially since the NPCs, outside of a select few, are now dumb as fuck with a scripted pool of replies compared to the NPCs with routines found in Skyrim.

Like i said: super instanced worlds, jpeg planets in space, dumb AI, and it somehow performs worse than a raw space physics simulator with modeled solar systems running on a single-threaded engine. Bravo Bethesda, bravo.

But apparently i can't describe the bit of shaming, as if i give a shit how you gatekeep critique of an engine that's clearly outdated/lacking for the task.

I have the game pre-ordered and the Xbox Starfield-themed controller ready to go tomorrow btw. i simply do not believe in tech excuses for such a slog of a performance crawl on $3000 rigs in 2023, with APIs such as DX12 for multi-threading and the backing of a trillion-dollar company that controls the fucking API. If the game is actually multi-threaded, then it's an even bigger mystery why it performs like it does. If this is excusable, then Microsoft has a fucking problem on the horizon for all their studios: too many engines, too many R&D costs to keep these shitcans running for little to no visual return.

JPEG images for planets in 2023 for that kind of budget, just stop a minute and let that sink in.

Even Lego Star Wars tried to do better than that

 

winjer

Member
Eh. This game is probably the only game that taxes my CPU beyond 50%. It is properly multithreaded. Each core and each thread is up there around 70-80% in stress test areas. My CPU was actually hitting 70 degrees at one point with over 120 watts of power usage. Not even Cyberpunk, which is another well-multithreaded game, hits that high.

I see people rocking 9900ks with their $2000 4090s. Come on. you dont need to go get a $400 cpu like the 7800x3d. There are plenty of $200-300 CPUs on that list. i bought my CPU for just $250 down from $299 after packaging it up with a mobo. it is not a bottleneck in starfield.

IIRC, the 9900k doesn't even have 2 threads per core. it's just 8 cores. AMD's Zen 2 lineup just stinks. You can see why the XSX can't do 60 fps by looking at the 3600. we've seen similar things with Sony exclusives like Spider-Man, where going from a 3600 to a 13900k doubles the framerate on a 4090.

Optimization is not just about having several threads.
This game is probably thrashing caches with un-deduplicated code.
Or it doesn't have proper culling systems, so it's rendering hidden geometry and pixels at a huge rate.
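To illustrate the culling point: before issuing draw calls, engines normally discard objects the camera can't see so the GPU never touches them. This is a toy 2D sketch of that idea (hypothetical code, not Bethesda's actual pipeline) — keep only objects inside a view cone and within draw distance:

```python
import math

def cull_by_view_cone(objects, cam_pos, cam_dir, fov_deg=90.0, max_dist=100.0):
    """Return only the objects worth sending to the GPU.

    A crude stand-in for real frustum/occlusion culling: keep an object
    only if it is within max_dist of the camera and inside the view cone.
    objects: list of (x, y) positions; cam_dir: unit direction (dx, dy).
    """
    half_fov = math.radians(fov_deg) / 2.0
    visible = []
    for (x, y) in objects:
        vx, vy = x - cam_pos[0], y - cam_pos[1]
        dist = math.hypot(vx, vy)
        if dist == 0.0 or dist > max_dist:
            continue  # too far (or on top of the camera): skip
        # angle between the camera direction and the direction to the object
        cos_angle = (vx * cam_dir[0] + vy * cam_dir[1]) / dist
        if math.acos(max(-1.0, min(1.0, cos_angle))) <= half_fov:
            visible.append((x, y))
    return visible

# Camera at the origin looking along +x: only the first object survives
objs = [(10, 0), (0, 10), (-10, 0), (200, 0)]
print(cull_by_view_cone(objs, (0, 0), (1, 0)))  # [(10, 0)]
```

A renderer that skips this step pays full vertex and pixel cost for geometry behind walls or behind the camera, which is one way a visually average scene can still be GPU-heavy.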
 

SlimySnake

Flashless at the Golden Globes
Intel 11700K beating the AMD 5800X3D at 1080p on some AMD sponsored title

An 11700k beating the 5800X3D

Wait

Triple check, yup

11700k beating the 5800X3D

This Is Fine GIF


GG Bethesda
Blame it on AMD. I have been saying this for a while, but the AMD Zen 2 and Zen 3 CPUs were only beating out these Intel CPUs in last gen games. As soon as we started seeing current gen games come out favoring higher clocked CPUs, these AMD CPUs with their low wattage, some with 65-watt caps, struggled to keep up.

That 11700k gave me a lot of headache early on. It was too hot. It was consuming something crazy like 128 watts in cyberpunk on lower resolutions. All youtubers were like its trash. Microcenter was selling it for a $50-100 discount compared to the AMD equivalent. I think it was their 3700x iirc. But time has proven everyone wrong. it doesnt matter how hot something is or how much wattage it takes, the performance is king.

Optimization is not just about having several threads.
This game is probably thrashing caches with un-deduplicated code.
Or it doesn't have proper culling systems, so it's rendering hidden geometry and pixels at a huge rate.
well, we dont know whats under the hood so i can only look at what i see, and what i see is better multithreading than just about any other game on the market.

the game is only cpu bound in big cities with lots of NPCs. only comparable game is mass effect andromeda and it doesnt have cities nearly as big or as many NPCs. something like gta, spiderman and cyberpunk dont count because those are basically open world games. this is a different kind of city.
 

draliko

Member
Game has problems for sure, but I think we should wait for the first patch to see if they're going in the right direction. Anyway, it's time for Bethesda to get id Software to rework the engine.
 

winjer

Member
well, we dont know whats under the hood so i can only look at what i see, and what i see is better multithreading than just about any other game on the market.

the game is only cpu bound in big cities with lots of NPCs. only comparable game is mass effect andromeda and it doesnt have cities nearly as big or as many NPCs. something like gta, spiderman and cyberpunk dont count because those are basically open world games. this is a different kind of city.

What we know is that there is nothing special about what this game is rendering. Not detail, geometry, complex AI, physics, etc.
This is purely bad optimization, as usual from Bethesda and their shit engine.
 

Bojji

Member
Intel 11700K beating the AMD 5800X3D at 1080p on some AMD sponsored title

An 11700k beating the 5800X3D

Wait

Triple check, yup

11700k beating the 5800X3D

This Is Fine GIF


GG Bethesda

Blame it on AMD. I have been saying this for a while, but the AMD Zen 2 and Zen 3 CPUs were only beating out these Intel CPUs in last gen games. As soon as we started seeing current gen games come out favoring higher clocked CPUs, these AMD CPUs with their low wattage, some with 65-watt caps, struggled to keep up.

That 11700k gave me a lot of headache early on. It was too hot. It was consuming something crazy like 128 watts in cyberpunk on lower resolutions. All youtubers were like its trash. Microcenter was selling it for a $50-100 discount compared to the AMD equivalent. I think it was their 3700x iirc. But time has proven everyone wrong. it doesnt matter how hot something is or how much wattage it takes, the performance is king.


well, we dont know whats under the hood so i can only look at what i see, and what i see is better multithreading than just about any other game on the market.

the game is only cpu bound in big cities with lots of NPCs. only comparable game is mass effect andromeda and it doesnt have cities nearly as big or as many NPCs. something like gta, spiderman and cyberpunk dont count because those are basically open world games. this is a different kind of city.

In theory newer engines should favor higher core counts, not just higher clocks.

Problem is, this engine is old as fuck and favors Intel architecture heavily.

6-core CPUs perform the same as 8, 12, and 16-core CPUs here, so multithreading is quite fucking poor. The 13900k is much faster than lower models thanks to higher clocks and more cache, not more cores.
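That core-count observation is what Amdahl's law predicts: if a meaningful chunk of each frame runs serially, extra cores stop helping very quickly. A quick sketch (the 50% parallel fraction is an illustrative assumption, not a measured figure for this game):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when only `parallel_fraction`
    of the work can be spread across `cores` in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# If half the frame time is serial, 16 cores barely beat 6:
for cores in (6, 8, 12, 16):
    print(f"{cores:2d} cores -> {amdahl_speedup(0.5, cores):.2f}x")
```

With a 50% serial portion the gap between 6 and 16 cores is under 0.2x, which would look exactly like "6 cores = 16 cores" in benchmarks, while higher clocks and cache speed up the serial part directly.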
 

SlimySnake

Flashless at the Golden Globes
What we know is that there is nothing special about what this game is rendering. Not detail, geometry, complex AI, physics, etc.
This is purely bad optimization, as usual from Bethesda and their shit engine.
I dont know if I agree. There are some stunning looking interiors. There is a great lighting system here, at least indoors. I concede it looks like shit on the open world planets. Every single object in the indoor areas is immaculately modeled. While the open world might not have lots of geometry, i can see why the GPU hit indoors is so massive. This game's interiors remind me of The Order, where everything feels like its own object and not just a painted-on texture.

The NPC count is massive compared to other similar RPGs, and a huge upgrade over Fallout 4 and Skyrim. Looking at last gen RPGs like Outer Worlds and Andromeda, it's clear that there is a massive leap in NPC count. I was in that Neon city yesterday and there were so many NPCs that I couldn't get around them in alleys and stairs. I can definitely see why they have such high CPU requirements.

Where I will agree is that they need to ditch this engine, not because of the visuals or CPU issues, but because of the loading and lack of proper streaming. The game has way too many loading screens that break immersion. I don't need to land on a planet from orbit, but i should be able to go inside a ship or a building or between city zones without fucking loading in 2023. my fucking ssd is so fast it loads most of these levels in a second. so why am i seeing this? just implement a better fucking streaming system.
 

Buggy Loop

Member
Blame it on AMD. I have been saying this for a while, but the AMD Zen 2 and Zen 3 CPUs were only beating out these intel CPUs in last gen games. As soon as we started seeing current gen games come out favoring higher clocked CPUs, these AMD CPUs with their low wattage some with 65 watt caps struggled to keep up.

Kenan Thompson Reaction GIF


Are those "current gen" games in the room with us? Which ones? Especially which ones are even that much of an outlier as Starfield. I'm not talking about a 13900k just having an hair above AMD in performances for 100W more, i'm talking about multiple gens behind beating AMD's bests.

That 11700k gave me a lot of headache early on. It was too hot. It was consuming something crazy like 128 watts in cyberpunk on lower resolutions. All youtubers were like its trash. Microcenter was selling it for a $50-100 discount compared to the AMD equivalent. I think it was their 3700x iirc. But time has proven everyone wrong. it doesnt matter how hot something is or how much wattage it takes, the performance is king.

Again, "time has proven everyone wrong" based on what titles?

well, we dont know whats under the hood so i can only look at what i see, and what i see is better multithreading than just about any other game on the market.

the game is only cpu bound in big cities with lots of NPCs. only comparable game is mass effect andromeda and it doesnt have cities nearly as big or as many NPCs. something like gta, spiderman and cyberpunk dont count because those are basically open world games. this is a different kind of city.

How would the open world games not compare? They're even more complex systems with massive cities.
 

Denton

Member
They're worlds apart in tech



Don't get me wrong, Star Citizen single player is god knows when. But for the budget and resources Bethesda had for this game, it's mind-boggling that it runs worse than one of the toughest, most complex space simulators, running on a fucking old engine.


Three things about that video are masterpieces:

- the soundtrack
- the art
- the tech

Now, I just hope I will get to play a singleplayer storydriven game that actually utilizes these, preferably before I die of old age.
 

Gaiff

SBI’s Resident Gaslighter
IIRC, 9900k doesnt even have 2 threads per cores. its just 8 cores. AMD's zen 2 lineup just stinks. You can see why the xsx cant do 60 fps by looking at the 3600. we've seen similar things with Sony exclusives like spiderman where going from a 3600 to 13900k doubles the framerate on a 4090.
Of course the 9900K has 2 threads per core. It's an i9. You're thinking of the 9600K and 9700K, which is when Intel dropped hyperthreading from even their i7 models like a bad habit. Hell, i9 didn't even exist until the 9900K if I remember. It was i7, i5, and i3, and i5 and above had hyperthreading before Intel unceremoniously removed it.
 

winjer

Member
I dont know if I agree. There are some stunning looking interiors. There is a great lighting system here, at least indoors. I concede it looks like shit on the open world planets. Every single object in the indoor areas is immaculately modeled. While the open world might not have lots of geometry, i can see why the GPU hit indoors is so massive. This game's interiors remind me of The Order, where everything feels like its own object and not just a painted-on texture.

The NPC count is massive compared to other similar RPGs, and a huge upgrade over Fallout 4 and Skyrim. Looking at last gen RPGs like Outer Worlds and Andromeda, it's clear that there is a massive leap in NPC count. I was in that Neon city yesterday and there were so many NPCs that I couldn't get around them in alleys and stairs. I can definitely see why they have such high CPU requirements.

Where I will agree is that they need to ditch this engine, not because of the visuals or CPU issues, but because of the loading and lack of proper streaming. The game has way too many loading screens that break immersion. I don't need to land on a planet from orbit, but i should be able to go inside a ship or a building or between city zones without fucking loading in 2023. my fucking ssd is so fast it loads most of these levels in a second. so why am i seeing this? just implement a better fucking streaming system.

Indoor areas are GPU bound.
Outdoor areas are where the CPU bound scenarios are more evident, and they look rather average compared to other open world games.
 

SlimySnake

Flashless at the Golden Globes
Kenan Thompson Reaction GIF


Are those "current gen" games in the room with us? Which ones? Especially which ones are even that much of an outlier as Starfield. I'm not talking about a 13900k just having an hair above AMD in performances for 100W more, i'm talking about multiple gens behind beating AMD's bests.



Again, "time has proven everyone wrong" based on what titles?

Matrix, Star Wars, Spider-Man, Gotham Knights, Witcher 3 RT edition, UE5 titles, and most games with ray tracing that hit the CPU just as hard as the GPU all favor Intel CPUs. And not just because their engines favor Intel CPUs; it's because they have higher clocks and more wattage to push the CPUs. AMD was winning on low TDP and lower cooling requirements, but now not so much.
How would the open world games not compare? They're even more complex systems with massive cities.
Because something like Spider-Man does not have any of the underlying CPU-heavy systems that a massive RPG like Starfield, Skyrim or Fallout 4 is running constantly behind the scenes. Everyone brings up sandwiches, but Bethesda RPGs are doing more than just tracking your inventory. Insomniac doesn't have to worry about any of that when rendering all those NPCs, thus the lower CPU requirements. Same goes for virtually every open world game last gen, including RDR2, Watch Dogs, and Mafia.
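The "systems running behind the scenes" point can be made concrete with a toy tick loop: a persistence-heavy RPG pays CPU every frame for every tracked object and every scheduled NPC, whether or not they are on screen. This is purely an illustrative sketch, not how either engine actually works:

```python
def tick_world(tracked_items, npcs):
    """One simulation tick of a toy persistent world.
    Every tracked item gets bookkeeping work each tick;
    every NPC advances its daily schedule. Returns the
    number of per-tick work units performed."""
    work = 0
    for item in tracked_items:
        item["age"] += 1  # ownership/position/physics bookkeeping
        work += 1
    for npc in npcs:
        # advance a 24-slot daily routine, Skyrim-style
        npc["schedule_step"] = (npc["schedule_step"] + 1) % 24
        work += 1
    return work

# A city with 1,000 persistent items and 200 scheduled NPCs
items = [{"age": 0} for _ in range(1000)]
npcs = [{"schedule_step": 0} for _ in range(200)]
print(tick_world(items, npcs))  # 1200 work units every tick
```

A game that only simulates what the camera sees skips almost all of this, which is one plausible reason two open worlds of similar visual complexity can have very different CPU requirements.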
 

SlimySnake

Flashless at the Golden Globes
Of course the 9900K has 2 threads per core. It's an i9. You're thinking of the 9600K and 9700K, which is when Intel dropped hyperthreading from even their i7 models like a bad habit. Hell, i9 didn't even exist until the 9900K if I remember. It was i7, i5, and i3, and i5 and above had hyperthreading before Intel unceremoniously removed it.
I stand corrected.

Indoor areas are GPU bound.
Outdoor areas are where the CPU bound scenarios are more evident, and they look rather average compared to other open world games.
There are two types of outdoor areas though. New Atlantis, Neon and Akila City are all outdoors, but they are CPU bound because of all the NPCs. Indoor areas, where most of the combat takes place, are indeed GPU bound because of the lack of NPCs.

Once you step out into the real open world, the CPU usage drops significantly.
 

twilo99

Member
IIRC, the 9900k doesn't even have 2 threads per core. it's just 8 cores. AMD's Zen 2 lineup just stinks. You can see why the XSX can't do 60 fps by looking at the 3600. we've seen similar things with Sony exclusives like Spider-Man, where going from a 3600 to a 13900k doubles the framerate on a 4090.

That design choice is really hurting the consoles but I don’t think they could’ve done any better at that price point..

Bethesda have done a good job getting the game to run on xss considering how heavy it seems to be on PC hardware.
 

SlimySnake

Flashless at the Golden Globes
That design choice is really hurting the consoles but I don’t think they could’ve done any better at that price point..

Bethesda have done a good job getting the game to run on xss considering how heavy it seems to be on PC hardware.
Yeah, it is what it is. $500 can only buy you so much.

I just hope for the mid-gen refreshes they don't skimp out on the CPU like they did last gen. Target $600 if you have to.
 

Admerer

Member
Can't wait to see Bethesda release a performance patch that brings CPU performance to where it needs to be, because at this point the benchmarks seem way off.

Has a day zero or day one patch been released yet? Don't we usually get one on these big releases?
 

T4keD0wN

Member
After playing for a few hours, the game has suddenly started to stutter.
I found out that the game apparently stutters when switching between epic/legendary weapons for some reason. I now have to use only common weapons if i want the game to remain playable. Great optimization.
 

winjer

Member
Let me guess, AMD shat the bed on an AMD sponsored title again ?

No. It's just Bethesda, sheer incompetence and their decade old engine, screwing up. Again.
Even Astyanax says this one is on Bethesda. Not on AMD. And you know how much of an nvidia fanboy he is.
 

analog_future

Resident Crybaby
Found a very interesting discovery --

For PC players w/ AMD hardware: in graphics settings, turn on FSR 2 but keep the render resolution scale at 100%. When FSR 2.2 is enabled at 100% render scale, the game runs at native resolution without an upscaling component, similar to NVIDIA's Deep Learning Anti-Aliasing, and produces a significantly sharper picture than native 4K w/ TAA applied.

https://www.techpowerup.com/review/starfield-fsr-2-2/







Works for nVidia GPUs as well of course, but I thought AMD players would find this the most interesting.
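The arithmetic behind this trick is simple: at 100% scale the upscaler's internal render target equals the output resolution, so only its anti-aliasing pass remains (the DLAA-like case), while lower scales cut shaded pixels quadratically. A quick sketch (illustrative helper names, not the game's actual settings API):

```python
def internal_resolution(out_w, out_h, scale):
    """Internal render target size for a given render scale (0-1]."""
    return round(out_w * scale), round(out_h * scale)

def pixel_fraction(scale):
    """Fraction of output pixels actually shaded; scales quadratically."""
    return scale * scale

print(internal_resolution(3840, 2160, 1.0))   # native + AA pass only
print(internal_resolution(3840, 2160, 0.62))  # the "medium" 62% scale
print(f"{pixel_fraction(0.62):.0%} of the pixel work at 62% scale")
```

This is also why the 62%-vs-50% debate later in the thread matters more than it looks: 62% shades about 38% of the output pixels, while 50% shades only 25%, a big jump in GPU headroom for a modest drop in clarity.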
 

MikeM

Member
Ladies and gentlemen:

Its Time Vegas GIF by BPONGofficial


Clicked on the game- 116GB update. Lol come on…
 

MikeM

Member
From the Windows Store ?
It's not an update, it's just decrypting files, just like Steam does, it's just badly worded.

Same with the gamepass version which uses the Xbox app on PC, it looks like an update, but it’s just unpacking.
That's the version i'm playing (GP). Starting area is in the 80-95fps range on a 5600. Not bad so far… awaiting the upcoming CPU beatings.
 

dEvAnGeL

Member
Struggles to maintain 60 fps during cutscenes. During gameplay i have no issues so far, but i've only played for about 7 hours.

Native 4K
All ultra
Motion blur off
Grain filter 0
Depth of field off

13700k
4090
32GB DDR5 at 6200 CL32
Seagate FireCuda 530
 

analog_future

Resident Crybaby
Struggles to maintain 60 fps during cutscenes. During gameplay i have no issues so far, but i've only played for about 7 hours.

Native 4K
All ultra
Motion blur off
Grain filter 0
Depth of field off

13700k
4090
32GB DDR5 at 6200 CL32
Seagate FireCuda 530

If it's a concern, try DLSS or FSR 2.2
 

dEvAnGeL

Member
If it's a concern, try DLSS or FSR 2.2
not really a concern, just leaving the info here in case someone with a similar setup is looking into the thread. According to Gamers Nexus, AMD drivers are more mature for this game, so there is still performance to be gained in future drivers. Overall i am pleased. Not a stellar game on the optimization side, but definitely not broken like most of the ports we have been getting lately.
 

Ovech-King

Gold Member
Here are my recommendations for 4K if you have a laptop similar to mine:

I9 12900hx 16C / 24T
4080 mobile
Running on my living room 4k tv

- I use the Hardware Unboxed (YouTube) performance settings recommendations
- I use the medium resolution scale, so 62%
- I've put sharpness to max

I get 60 fps or more everywhere except, of course, in New Atlantis, where it dips below, but that's not where you'll be most of the time. The solution to stay mostly over 60 fps there too would be a 50% resolution scale instead, but the image clarity is not as clean as 62% IMO, so i stick with 62.

Otherwise, very nice clean graphics; some people may complain that DLSS might be better, but i think FSR does a wonderful job in this particular game. One thing though: OMG, the French lip sync is so off on some parts of sentences haha
 

SlimySnake

Flashless at the Golden Globes
Alright, whats the best DLSS mod to use? i downloaded the Pure Dark mod hack that was posted here earlier today but i dont have a 40 series card so I am good with any DLSS 2 mods.

Also, i dont want to use the Neutral lighting mod, but whats the best HDR mod?
 

Yoda

Member
4090 FE, 7950X3D, 32GB RAM, Samsung 990 Pro

Maxed-out settings; the range is quite large depending on what's going on, 60-120 FPS. FSR and dynamic res are both on.
 

bbeach123

Member
Both DLSS mods gave me some weird glitches (when switching from menu to game), so I switched back to FSR. I have to say, this game has to be the best FSR implementation I've seen.

Outside of some shimmering, the game has minimal ghosting and minimal broken grainy moving objects of the kind we usually see with FSR 2 (fuck you, Jedi Survivor).
The DLSS mod completely removes the shimmering though (even the shimmering that native resolution has).
 
If you can get a reservation at Dorsia I might tell you exactly how to do it.

P.S. Do you like Phil Collins?
Dorsia?

Nobody goes there anymore.

By the way, next time you are at Tex-Aracana try the pork loin with lime jello... it's to die for.
 

Hydroxy

Member
Getting around 30fps in the first few hours on my low-end laptop at 1080p on low settings, with FSR 2 on and 80% render resolution.
Ryzen 5500U
GTX 1650 4GB
16GB DDR4 3200MHz
 