
Todd Howard says Starfield is already optimized: "You might need to upgrade your PC."

Crayon

Member
I've been staying blind on this game till I can play it. For people playing it, does it seem like it should be crushing CPUs like it is?
 

K2D

Banned
Translation: We did (half-heartedly) optimize it.

(DO YOU SEE WHAT [ENGINE] WE'RE WORKING WITH HERE?)

*Maybe it'll run as intended if you throw a rig with 32 GB of RAM and a 13th-gen Intel processor at it..!*
 
LOL at Phil Spencer pulling his head back when Todd said we did optimize. This is the funniest shit I've ever seen.
Someone might've already mentioned it, but notice Phil and Todd react at the exact same time. Likely because of the audio delay: Phil was probably reacting to the perceived audacity of the question, not Todd's response.

I don't think they did a good job optimizing for PC, btw.
 
lol exactly. People bought $1,649-2,000 4090s, then paired them with cheap $300 CPUs that max out at 4.45 GHz and 65 watts, then wonder why their game is bottlenecked. Yes, the 5800X3D was great when they were running games designed around a 1.6 GHz Jaguar CPU from 2013. But everyone who shat on mid-range Intel CPUs maxing out at 5.0 to 5.1 GHz while consuming over 120 watts is now wondering why those CPUs are running the game better.

This is on those idiots who run PC review channels and continuously played up AMD's low power consumption and cooling requirements over actual performance. Hell, it got so bad that I was able to buy Intel CPUs at a $100 discount over the equivalent AMD products because everyone wanted the low-TDP, cooler CPU.

Well, now you are fucked. Still, if you can afford a $1,200 4080 or a $1,650 4090, you should be able to go out there and upgrade to a $400 Ryzen 7800X3D, which should give you a massive 40% performance increase over the 5800X3D.

Hell, even AMD's latest $299 CPUs ensure they don't bottleneck your GPU. I get 98% GPU utilization in every city, town, ship and outdoors, and I have a $300 Intel CPU from 3 years ago.
Show us where the 5800X3D touched you; you need to be very brave now.
 
I'm running it on a Legion 7 Pro with an i9-13900HX / 4090 and getting around 40-50 fps without FSR. With FSR I get around 60-80, and after installing the DLSS 3.5 mod I'm now getting well over 100 pretty much everywhere. Not sure why they didn't include the full DLSS suite in the game.. kinda dumb.
 

SlimySnake

Flashless at the Golden Globes
You're the one displaying a full-blown meltdown over several pages. What's the story behind it?
Meltdown? It's been pretty level-headed responses to anyone who has replied to me so far, full of benchmarks, specs and other observations. Basically what forum posts are supposed to be. If you view those posts as meltdowns, then maybe a reality check is in order. But then again, you implied I was molested by a CPU, called an industry legend like Todd Howard an incompetent buffoon for essentially saying buy a next-gen GPU and CPU, and have an unhealthy need to defend a silicon chip, so you are probably beyond help.
 

winjer

Gold Member
Thing is, here's why Raptor Lake is ahead of Zen 4... Raptor Lake, being a wider and deeper core, feeds that bandwidth very well into its core.

I don't know what you mean by "a wider and deeper core feeds that bandwidth very well into its core."

But a 12900K only gains 4% going from DDR5 5200 to 7200.
Zen4 gains 13% going from the same 5200 to 7200.

13th Gen probably gains even less, because it has double the L2 cache, so it has fewer cache misses and accesses to system memory.
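To put those percentages in perspective: 5200 → 7200 MT/s is roughly a 38% raw bandwidth increase, so you can estimate how much of that extra bandwidth each architecture actually converts into frames. A quick back-of-the-envelope sketch, using only the numbers quoted above (illustrative arithmetic, not a benchmark):

```python
# Rough check of the memory-scaling numbers quoted above.

def scaling_efficiency(fps_gain, mt_old, mt_new):
    """Fraction of the raw bandwidth increase that shows up as fps."""
    bandwidth_gain = mt_new / mt_old - 1   # 5200 -> 7200 is ~+38%
    return fps_gain / bandwidth_gain

raptor = scaling_efficiency(0.04, 5200, 7200)   # 12900K: +4% fps
zen4   = scaling_efficiency(0.13, 5200, 7200)   # Zen 4:  +13% fps

print(f"12900K converts ~{raptor:.0%} of the extra bandwidth into fps")  # ~10%
print(f"Zen 4 converts ~{zen4:.0%}")                                     # ~34%
```

Which is the point being made: Raptor Lake barely cares about the extra bandwidth, Zen 4 cares noticeably more.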

 

XesqueVara

Member
I don't know what you mean by "a wider and deeper core feeds that bandwidth very well into its core."

But a 12900K only gains 4% going from DDR5 5200 to 7200.
Zen4 gains 13% going from the same 5200 to 7200.

13th Gen probably gains even less, because it has double the L2 cache, so it has fewer cache misses and accesses to system memory.

Raptor Lake, lad. Raptor Lake is very good in games which love bandwidth (Spider-Man and Starfield being the most notable ones).
 

sendit

Member
Lol.
I've been staying blind on this game till I can play it. For people playing it, does it seem like it should be crushing CPUs like it is?
No, especially New Atlantis, which isn't that dense but is a performance killer, and probably the ugliest city in the entire game.

Cyberpunk 2077, while looking better, provides much denser AI/NPC areas and doesn't tank the FPS like this game does.
 

winjer

Gold Member
Raptor Lake, lad. Raptor Lake is very good in games which love bandwidth (Spider-Man and Starfield being the most notable ones).

Starfield doesn't scale that well with memory. Especially on 13th gen.
Spider-Man has better memory scaling, but it scales more with PCIe bandwidth.

What tests have found about 13th gen memory scaling is that above 6000MT/s, the performance gains are very limited.
A similar thing for Zen4. And remember that it can also do 8000 MT/s.

But what did you mean by "a wider and deeper core feeds that bandwidth very well into its core"?
 
Yeah, let's make a game that only works well with DDR5 and a Ryzen 3D processor. Fucking genius, Todd. Good thing you have that Microsoft check.
 
Just done the intro on a 5800X and 3070 at 1440p, and the performance is appalling.

Around 35 fps during that fight with the pirates, and that's with the AMD upscaling thing on. Considering how shit the game looks, this performance is terrible.
How is this acceptable?

Creation engine garbage strikes again.
 

XesqueVara

Member
Starfield doesn't scale that well with memory. Especially on 13th gen.
Spider-Man has better memory scaling, but it scales more with PCIe bandwidth.

What tests have found about 13th gen memory scaling is that above 6000MT/s, the performance gains are very limited.
A similar thing for Zen4. And remember that it can also do 8000 MT/s.

But what did you mean by "a wider and deeper core feeds that bandwidth very well into its core"?
Wider and deeper cores manage resources very well within the core.
Raptor is a 6-wide and deeper core than Zen 4, so that explains the advantage it has in games over Zen 4.
 
I think it's the first game where I've swapped to the Xbox version because of hardware; it seems to be doing better than my 8700K and 3060 Ti.
 
Sure Jan GIF
 
I am running this game at 4K max settings with over 100 fps, but sure.
Even in New Atlantis? I'm assuming you have a beast of a PC.

I can get around 100 fps, too, at 1440p with the DLSS mod in some areas, but it drops to sub-30 in New Atlantis with less going on than in Cyberpunk, where I get a better framerate. It doesn't make sense to me that it would be so CPU-bound unless it was poorly optimized.
 

winjer

Gold Member
Wider and deeper cores manage resources very well within the core.
Raptor is a 6-wide and deeper core than Zen 4, so that explains the advantage it has in games over Zen 4.

It's true that RL has a wider pipeline than Zen 4. But that does not affect how memory is accessed.
And I suppose that when you say "deeper" you mean the length of the execution pipeline. But in that case, longer is worse for IPC. It also does not affect memory bandwidth or latency.
What will affect memory is the CPU fetch stage of the execution pipeline. Of course, the front end, with its branch prediction and caches, can help to hide part of the latency and bandwidth limitations.

There is one thing that Golden Cove cores do better than Zen 4 cores, and that is support for "same-bank refresh".
This is a new feature in DDR5 that reduces the cost of refreshing memory, since banks can be refreshed individually. People who are used to doing memory OC will know of tREFI and why it matters.
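For anyone unfamiliar with tREFI: while a rank is executing a refresh (which takes tRFC), it can't serve requests, so the share of time lost to refresh is roughly tRFC divided by tREFI. A small sketch with ballpark timing values (the numbers are illustrative assumptions, not a spec quote):

```python
# Refresh overhead estimate: during tRFC the rank is busy refreshing,
# and one refresh is issued every tREFI, so the lost-time fraction is
# roughly tRFC / tREFI. Timings below are illustrative ballparks.

def refresh_overhead(trfc_ns, trefi_ns):
    return trfc_ns / trefi_ns

stock = refresh_overhead(295, 3_900)    # default-ish DDR5 timings
tuned = refresh_overhead(295, 65_000)   # aggressive tREFI increase

print(f"stock: ~{stock:.1%} of time lost to refresh")   # ~7.6%
print(f"tuned: ~{tuned:.2%}")                           # well under 1%
```

That's why raising tREFI (or refreshing per bank instead of the whole rank) is one of the better-value memory tweaks.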
 

Edmund

Member
Do you all think Todd Howard will sound like Simply Red when he sings? He's got this high tenor voice.

 

SlimySnake

Flashless at the Golden Globes
Even in New Atlantis? I'm assuming you have a beast of a PC.

I can get around 100 fps, too, at 1440p with the DLSS mod in some areas, but it drops to sub-30 in New Atlantis with less going on than in Cyberpunk, where I get a better framerate. It doesn't make sense to me that it would be so CPU-bound unless it was poorly optimized.
What CPU/GPU combo do you have again? That's way too low in my experience.

How deep are you in the game? I remember the PS3 0 fps bug only struck after around 40 hours, when you had a certain number of quests and inventory messing with the PS3's split RAM architecture. I wonder if the game keeps adding shit on top, and by the 50th hour New Atlantis is a mess because the CPU has to keep track of everything you have accumulated.
 

hinch7

Member
Just done the intro on a 5800X and 3070 at 1440p, and the performance is appalling.

Around 35 fps during that fight with the pirates, and that's with the AMD upscaling thing on. Considering how shit the game looks, this performance is terrible.
How is this acceptable?

Creation engine garbage strikes again.
Use Hardware Unboxed's GPU optimisation guide and download the DLSS 3.5 mod from Nexus Mods.

Turn off frame gen by editing the config file (edit the file with Notepad and save). You can get away with lower resolution scaling, like 66%, with some sharpening, and it still looks good.
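If you'd rather script the "edit the config file with Notepad" step, a minimal sketch like the following works on simple key=value ini files. The file name (`DLSSTweaks.ini`) and the setting key (`EnableFrameGeneration`) in the example are assumptions for illustration only; check the mod's own readme for the actual file and key it uses.

```python
# Minimal sketch: rewrite a `key=value` line in an ini-style file.
# File name and key names in the example usage are hypothetical.
from pathlib import Path

def set_ini_value(path: Path, key: str, value: str) -> None:
    """Set `key=value` in a simple ini-style file, appending if absent."""
    lines = path.read_text().splitlines()
    out, found = [], False
    for line in lines:
        # Compare the part before '=' case-insensitively.
        if line.split("=")[0].strip().lower() == key.lower():
            out.append(f"{key}={value}")
            found = True
        else:
            out.append(line)
    if not found:
        out.append(f"{key}={value}")
    path.write_text("\n".join(out) + "\n")

# Hypothetical example: disable frame generation in the mod's ini.
# set_ini_value(Path("DLSSTweaks.ini"), "EnableFrameGeneration", "false")
```

Keep a backup of the original file first; patches or mod updates may overwrite or re-add keys.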
 

JohnnyFootball

GerAlt-Right. Ciriously.
So if Neil Druckmann or Cory comes out and says that their next game is next-gen only and tells PS4 owners to upgrade, would you say they are hacks?

This is a next-gen game, designed around the XSX's 12 TFLOPs specs.


lmao. why?

A 3060 is the same as a 2070. The XSX, which runs this game at 1440p 30 fps, is equivalent to a 2080. Why in the world would a 3060 double the framerate despite being 20% less powerful? What logic is this?

And you CAN run this at 60 fps using a 2080. My friend is doing that right now. You just need a powerful CPU like the 7800X3D. I think he's using medium settings at 1080p, which makes perfect sense since the XSX is running it at almost double the resolution at 30 fps.
This game does not have anywhere close to the visual fidelity to justify running this shitty.

I'm fortunate in that I can run it reasonably well on my 4090 and 7800X3D, but it shouldn't take those kinds of specs to be fluid. I'll try the DLSS3 mod later.
 
What CPU/GPU combo do you have again? That's way too low in my experience.

How deep are you in the game? I remember the PS3 0 fps bug only struck after around 40 hours, when you had a certain number of quests and inventory messing with the PS3's split RAM architecture. I wonder if the game keeps adding shit on top, and by the 50th hour New Atlantis is a mess because the CPU has to keep track of everything you have accumulated.


A hybrid HDD could run Skyrim on PS3 decently, with patches.
 
What CPU/GPU combo do you have again? That's way too low in my experience.

How deep are you in the game? I remember the PS3 0 fps bug only struck after around 40 hours, when you had a certain number of quests and inventory messing with the PS3's split RAM architecture. I wonder if the game keeps adding shit on top, and by the 50th hour New Atlantis is a mess because the CPU has to keep track of everything you have accumulated.
Nvidia 3080 GPU and AMD Ryzen 5 5600X CPU. I'm using optimized settings off of Nexus, which drop the shadows to low, for instance.

It's not constantly sub-30 in New Atlantis, but I'd prefer it never go that low. I just haven't heard a compelling argument from Todd Howard or otherwise as to what this game is doing that other games couldn't, and why it's so CPU-dependent.

Then again, I just looked up what the Xbox Series X's equivalent-ish PC CPU would be, and it's a Ryzen 7 3700X, which is actually better than the 5600X. It's starting to make more sense now.
 

SlimySnake

Flashless at the Golden Globes
With a 3090 and an 11th-gen I can barely maintain 60-70 fps on ultra at 1440p. What the fuck, Todd.
So your GPU, which is 2x more powerful than the Xbox Series X, is able to run the game at 2x the framerate with way better settings than the XSX, and that's bad because?

I'm fortunate in that I can run it reasonably well on my 4090 and 7800X3D, but it shouldn't take those kinds of specs to be fluid. I'll try the DLSS3 mod later.
It doesn't need a 4090 to be fluid. Or a 7800X3D. I am running the game on a 3-year-old Intel processor, on a 3-year-old 3080, at mostly 60 fps except in cities, where it drops to the 40s and 50s. Though that's not a big deal, since it's just a hub world with no combat.

Your 4090 is 3x more powerful than the XSX's 2080 equivalent, and it's letting you double the resolution AND the framerate. That's 4x the pixel throughput. What more do you want?

Nvidia 3080 GPU and AMD Ryzen 5 5600X CPU. I'm using optimized settings off of Nexus, which drop the shadows to low, for instance.

It's not constantly sub-30 in New Atlantis, but I'd prefer it never go that low. I just haven't heard a compelling argument from Todd Howard or otherwise as to what this game is doing that other games couldn't, and why it's so CPU-dependent.

Then again, I just looked up what the Xbox Series X's equivalent-ish PC CPU would be, and it's a Ryzen 7 3700X, which is actually better than the 5600X. It's starting to make more sense now.
Yeah, unfortunately that 6-core/12-thread CPU was always going to become a bottleneck on PCs the moment developers started utilizing it on consoles. It's actually roughly on par with the XSX and PS5 CPUs, if not more powerful, because it has higher clocks and more cache despite having fewer cores and threads (XSX and PS5 reserve 1 core for the OS). But the problem is that you need to double the framerate in a CPU-heavy game like this on PC, since settling for 30 fps on PC is not an option.

The catch is that the 7000 series CPUs use the new AM5 socket, so upgrading does mean a new motherboard and DDR5 as well. Still, for around $300 you should be able to afford a CPU that will not bottleneck your 3080.

Try one of these 7000 series AMD CPUs; you don't have to spend $399 on the 7800X3D to get 60 fps in Atlantis or any other city. I am running high settings at FSR Quality and get around 45-55 fps in Atlantis and 45-50 fps in Akila City. Same GPU as you, but my CPU is the 11700K, which shows up at 72 fps in this benchmark here. The $300 7700X is 10 fps more than that, and a whopping 26 fps more than your 5600X. The difference might be smaller at higher resolutions, but you should be at a consistent 50-60 fps in cities and a locked 60 fps during combat if you buy the 7700X with your settings.


[benchmark images: DfTte92.jpg, BHsBqdQ.jpg]
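For what it's worth, plugging the deltas quoted above into simple arithmetic (the 11700K at 72 fps, the 7700X about 10 fps above it, the 5600X about 26 fps below the 7700X) puts the 7700X roughly 46% ahead of the 5600X in that benchmark:

```python
# Sanity-checking the benchmark deltas quoted in the post above.

i11700k = 72                 # fps cited for the 11700K
r7700x  = i11700k + 10       # "10 fps more than that" -> 82 fps
r5600x  = r7700x - 26        # "26 fps more than your 5600X" -> 56 fps

uplift = (r7700x - r5600x) / r5600x
print(f"7700X over 5600X: ~{uplift:.0%}")   # roughly +46%
```

As the post notes, that CPU-bound gap shrinks at higher resolutions, where the GPU becomes the limiter instead.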
 
So your GPU, which is 2x more powerful than the Xbox Series X, is able to run the game at 2x the framerate with way better settings than the XSX, and that's bad because?


It doesn't need a 4090 to be fluid. Or a 7800X3D. I am running the game on a 3-year-old Intel processor, on a 3-year-old 3080, at mostly 60 fps except in cities, where it drops to the 40s and 50s. Though that's not a big deal, since it's just a hub world with no combat.

Your 4090 is 3x more powerful than the XSX's 2080 equivalent, and it's letting you double the resolution AND the framerate. That's 4x the pixel throughput. What more do you want?


Yeah, unfortunately that 6-core/12-thread CPU was always going to become a bottleneck on PCs the moment developers started utilizing it on consoles. It's actually roughly on par with the XSX and PS5 CPUs, if not more powerful, because it has higher clocks and more cache despite having fewer cores and threads (XSX and PS5 reserve 1 core for the OS). But the problem is that you need to double the framerate in a CPU-heavy game like this on PC, since settling for 30 fps on PC is not an option.

The catch is that the 7000 series CPUs use the new AM5 socket, so upgrading does mean a new motherboard and DDR5 as well. Still, for around $300 you should be able to afford a CPU that will not bottleneck your 3080.

Try one of these 7000 series AMD CPUs; you don't have to spend $399 on the 7800X3D to get 60 fps in Atlantis or any other city. I am running high settings at FSR Quality and get around 45-55 fps in Atlantis and 45-50 fps in Akila City. Same GPU as you, but my CPU is the 11700K, which shows up at 72 fps in this benchmark here. The $300 7700X is 10 fps more than that, and a whopping 26 fps more than your 5600X. The difference might be smaller at higher resolutions, but you should be at a consistent 50-60 fps in cities and a locked 60 fps during combat if you buy the 7700X with your settings.


[benchmark images: DfTte92.jpg, BHsBqdQ.jpg]
Great info. Thank you!
 

artsi

Member
The game is addicting and I love it, but I booted up CP2077 and sighed at how it's on a whole different level in both visuals and performance compared to Bethesda's magnum opus.

I mean, I can go from my apartment downtown all the way to the desert without a single loading screen, and it boggles my mind how Bethesda couldn't get even one town to fit inside the same cell.
 

StueyDuck

Member
A lot of gaslighting going on in this thread. Not sure why we are cheering on objectively badly performing PC games.

The biggest takeaway from these comments from Todd, however, is that it seems they aren't going to put any resources into making the game perform better, which is very disappointing because, again... the game objectively runs like horse shit.
 

JayK47

Member
Lower your expectations, settings, or both :messenger_winking:

I am sure new patches and drivers will improve it a little bit. For me, I am fine with how it looks and runs, but I am not looking for 240 fps on my 5-year-old PC.
 