
Todd Howard says Starfield is already optimized: "You might need to upgrade your PC."

I love Todd Howard because he's so shameless about everything. He reminds me of Elon Musk, except Todd is not a genius who builds rockets and electric cars.
 

recursive

Member
Pat yourself on the back, "hack"; we're all waiting on the game that you're directing from the development studio that you've been running for the past 25 years.

Why would you get so mad about what he said? Are you playing Starfield? Are you a fan of any Bethesda game? Why??
You sound so mentally unhealthy. Remember, it's a video game. Entertainment purposes only.
Calm down, Phil.
 

Facism

Member
Todd is historically an honest man, so I believe him.

 
A lot of gaslighting going on in this thread. Not sure why we are cheering on objectively badly performing PC games.

The biggest takeaway from these comments from Todd, however, is that it seems they aren't going to put any resources into making the game perform better, which is very disappointing because, again... the game objectively runs like horse shit.

Of course they're not going to try and fix the performance issues. Once again modders will have to pick up the slack from these asswipes.
 

sendit

Member
Just finished the intro on a 5800X and a 3070 at 1440p, and the performance is appalling.

Around 35fps during that fight with the pirates, and that's with AMD's upscaling (FSR) on. Considering how shit the game looks, this performance is terrible.
How is this acceptable?

Creation engine garbage strikes again.
The game picks up after the first three main missions. However, I do agree. The engine is absolute ass.

 

ZehDon

Gold Member
Yeah, no.

With nothing but setting the appropriate profile settings in the Nvidia Profiler, an Nvidia card user can get an extra 5-20% performance for free. With nothing but a generic DLSS implementation, the game can run and look better on Nvidia hardware. This shit could be patched in today and the game would be immediately better optimised.

I'm fortunate enough to have the PC to run this at 60FPS on Ultra settings regardless, but considering I can also run Cyberpunk 2077 on Psycho, complete with psycho ray-tracing, at 60FPS, it's clear this game is leaving a lot of performance on the table. I hope we see some improvements over time.
 
If I had a dollar for every Starfield thread, I’d buy my own island.

On topic: it's kind of BS to have to upgrade; I thought he said it's optimized to run on several CPUs.
Do we have a thread yet about how the women in Starfield have awkward heads, are wearing too much clothing, and have been ruined by wokeness? That could be another dollar!
 

sendit

Member
This has nothing to do with the 5800X3D. You are the only one getting triggered by me simply posting benchmarks showing this CPU getting trounced by AMD's own CPUs.

I had no idea the CPU had a spouse on this board taking everything so personally.
Someone should tell SlimySnake that PC gamers don’t game at 1080p. You know, the resolution where having the latest CPU actually matters.
 

proandrad

Member
Nvidia 3080 GPU and AMD Ryzen 5 5600X CPU. I'm using optimized settings off of Nexus, which drops the shadows to low, for instance.

It's not constant sub-30 in New Atlantis, but I'd prefer it never go that low. I just haven't heard a compelling argument from Todd Howard or otherwise as to what this game is doing that other games couldn't and why it's so CPU dependent.

Then again, I just looked up what the Xbox Series X's equivalent-ish PC CPU would be, and it's a Ryzen 7 3700X, which actually has two more cores than the 5600X. It's starting to make more sense now.
No offense, but a 6-core processor paired with a 3080 isn't exactly a balanced build.
 

VideoHideo

Neo Member
As somebody that has a 3080 and an i9… to any of my Intel bros struggling with FPS, enter the BIOS and make sure your CPU is running in turbo mode. Makes a huge difference. It's CPU heavy; I rarely ever drop below 60. Indoors I'm around 120fps.
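If you want to sanity-check that turbo is actually engaging without rebooting into the BIOS, something along these lines works. This is a minimal sketch, assuming Python with the third-party psutil package installed; the base-clock constant is an illustrative placeholder you'd swap for your own chip's spec.

```python
# Sanity-check that turbo/boost clocks engage under load.
# Minimal sketch using the third-party psutil library (pip install psutil).
# BASE_CLOCK_MHZ is an illustrative assumption - use your CPU's spec sheet.
import time
import psutil

BASE_CLOCK_MHZ = 3600

# Generate a few seconds of load so the CPU has a reason to boost.
end = time.time() + 5
while time.time() < end:
    _ = sum(i * i for i in range(10_000))

freq = psutil.cpu_freq()
if freq is None:
    raise SystemExit("cpu_freq() is not supported on this platform")

print(f"current: {freq.current:.0f} MHz (reported max: {freq.max:.0f} MHz)")
if freq.current <= BASE_CLOCK_MHZ:
    print("Stuck at or below base clock under load - check BIOS turbo settings.")
else:
    print("Boost clocks appear to be engaging.")
```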
 
No offense, but a 6-core processor paired with a 3080 isn't exactly a balanced build.
First I'm hearing of it. Really, only BG3 and Starfield have been disappointing, performance-wise, on it. I was plenty happy with Cyberpunk and all the other games I've played on it over the past 2 years.
 

SlimySnake

Flashless at the Golden Globes
As somebody that has a 3080 and an i9… to any of my Intel bros struggling with FPS, enter the BIOS and make sure your CPU is running in turbo mode. Makes a huge difference. It's CPU heavy; I rarely ever drop below 60. Indoors I'm around 120fps.
Just downloaded the DLSS mod and for some reason it is making my CPU go into overdrive. Indoors, outdoors, doesn't matter; I'm hitting 140 watts on my i7-11700K and temps are jumping up to 75C. I actually saw it jump to 80 degrees at one point. I'm legit scared. If I play this game for around 100 hours, am I killing my CPU if it stays around 75 degrees for dozens of hours?

I have no idea why FSR was holding back my CPU, but switching to DLSS has made me gain roughly 10% more performance. Sadly, it comes with more wattage and higher temps.
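For what it's worth, 75-80C under sustained load is well within spec for an 11700K (Intel's rated TjMax is 100C). If you want to log where temps actually settle over a long session, here's a minimal sketch; note that psutil's sensors_temperatures() only exists on Linux, so on Windows you'd watch this in a tool like HWiNFO instead.

```python
# Periodically log CPU package/core temperatures during a play session.
# Sketch using psutil's sensors_temperatures(), which is Linux-only;
# the "coretemp" driver name and the sampling numbers are assumptions.
import time
import psutil

def log_cpu_temps(interval_s: float = 5.0, samples: int = 12) -> None:
    if not hasattr(psutil, "sensors_temperatures"):
        raise SystemExit("sensors_temperatures() unavailable on this platform")
    for _ in range(samples):
        readings = psutil.sensors_temperatures().get("coretemp", [])
        for entry in readings:
            print(f"{entry.label or 'cpu'}: {entry.current:.1f}C")
        time.sleep(interval_s)

log_cpu_temps()
```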
 

Eotheod

Member
I really wonder what CPU people are running with a 4070 at 1080p to only hit 60fps. I'm sorry, but that's some whacked-out component upgrade pathing if I've ever seen any.

I have a 6800 XT and a Ryzen 3800XT; playing the game at 1440p/144Hz, I'm getting 100-140 fps on average with everything on ultra. I've only turned motion blur off and dynamic resolution off while pushing to 100% render resolution. How am I able to achieve this while others are struggling with what is technically a better card?

For once we have a game actually using the CPU instead of four cores max and going GPU heavy. That is a far better balance in my opinion, and what next-gen development should be focusing on.
 

proandrad

Member
First I'm hearing of it. Really, only BG3 and Starfield have been disappointing, performance-wise, on it. I was plenty happy with Cyberpunk and all the other games I've played on it over the past 2 years.
Cyberpunk is heavy on the CPU, especially with ray tracing on, so you are leaving a ton of performance on the table. A better CPU won't always show its worth in benchmarks, since those graphs tend to only show average framerate on a PC with nothing else running in the background. Stutters and/or strange low frame drops with a 3080 are either a CPU issue or you are running out of VRAM. Starfield seems to be pretty good at managing VRAM, so your CPU would most likely be the issue.
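A rough way to triage this yourself while the game is running: sample per-core CPU load and VRAM at the same time. A sketch assuming the third-party psutil and GPUtil packages (GPUtil reads NVIDIA GPUs only); the 95%/90% thresholds are illustrative, not hard rules.

```python
# Rough bottleneck triage: a pinned core with an under-utilized GPU
# suggests a CPU limit; nearly full VRAM suggests memory pressure.
# Requires `pip install psutil gputil`; run while the game is active.
import psutil
import GPUtil

per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # one-second sample
gpu = GPUtil.getGPUs()[0]  # first NVIDIA GPU

print(f"busiest core: {max(per_core):.0f}%, GPU load: {gpu.load * 100:.0f}%")
print(f"VRAM: {gpu.memoryUsed:.0f}/{gpu.memoryTotal:.0f} MB")

if max(per_core) > 95 and gpu.load < 0.9:
    print("Looks CPU-bound.")
elif gpu.memoryUsed / gpu.memoryTotal > 0.95:
    print("VRAM nearly exhausted - likely memory pressure.")
else:
    print("No obvious single bottleneck in this sample.")
```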
 

hinch7

Member
I really wonder what CPU people are running with a 4070 at 1080p to only hit 60fps. I'm sorry, but that's some whacked-out component upgrade pathing if I've ever seen any.

I have a 6800 XT and a Ryzen 3800XT; playing the game at 1440p/144Hz, I'm getting 100-140 fps on average with everything on ultra. I've only turned motion blur off and dynamic resolution off while pushing to 100% render resolution. How am I able to achieve this while others are struggling with what is technically a better card?

For once we have a game actually using the CPU instead of four cores max and going GPU heavy. That is a far better balance in my opinion, and what next-gen development should be focusing on.
Willing to bet a lot of it has to do with driver overhead on Nvidia GPUs affecting performance. And as far as we can see from all the CPU testing online, Starfield is a very CPU-dependent game. That, and this is an AMD-sponsored title.

Hardware Unboxed covered this a couple of years ago.
 

Roni

Gold Member
Just because a game is top-down view doesn't mean it stops rendering 3D stuff.
Both games are rendering 3D worlds, with the big difference that open areas in BG3 look good, while Starfield looks like crap while running like crap.
You can cull 80% of the game world in a top-down view. There's a limit to what you can optimize for draw distance in a true first-person perspective. C'mon now...
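A toy illustration of the point, with made-up numbers that belong to neither game: a fixed overhead camera only ever needs the objects inside a screen-sized box, while a free first-person camera has to consider everything out to its draw distance.

```python
# Toy comparison of visible-object counts: top-down box vs. free camera.
# All world sizes and distances here are illustrative assumptions.
import random

random.seed(0)
world = [(random.uniform(0, 4000), random.uniform(0, 4000))
         for _ in range(100_000)]

def visible_top_down(cx, cy, half_extent=200):
    # Fixed overhead view: only a screen-sized box around the camera matters.
    return sum(abs(x - cx) < half_extent and abs(y - cy) < half_extent
               for x, y in world)

def visible_perspective(cx, cy, draw_distance=3000):
    # Free camera: anything within draw distance is potentially on screen.
    return sum((x - cx) ** 2 + (y - cy) ** 2 < draw_distance ** 2
               for x, y in world)

td, fp = visible_top_down(2000, 2000), visible_perspective(2000, 2000)
print(f"top-down: {td} objects, perspective: {fp} objects "
      f"({100 * (1 - td / fp):.0f}% fewer to render)")
```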
 

SF Kosmo

Al Jazeera Special Reporter
Get a better engine, Todd. This game isn't a visual masterpiece.
Engines are more than just rendering. I don't entirely disagree that the engine is being stretched to its limits, but it isn't just about graphics.
 

Bojji

Member
I really wonder what CPU people are running with a 4070 at 1080p to only hit 60fps. I'm sorry, but that's some whacked-out component upgrade pathing if I've ever seen any.

I have a 6800 XT and a Ryzen 3800XT; playing the game at 1440p/144Hz, I'm getting 100-140 fps on average with everything on ultra. I've only turned motion blur off and dynamic resolution off while pushing to 100% render resolution. How am I able to achieve this while others are struggling with what is technically a better card?

For once we have a game actually using the CPU instead of four cores max and going GPU heavy. That is a far better balance in my opinion, and what next-gen development should be focusing on.

You must have the best 3800X on the planet to get those numbers. Tell me, are you getting the same numbers in New Atlantis?

A 5800X3D tops out at 70-something fps when you get off the train in New Atlantis, and you're telling me a 3800X is getting over 100?
 

SlimySnake

Flashless at the Golden Globes
I really wonder what CPU people are running with a 4070 at 1080p to only hit 60fps. I'm sorry, but that's some whacked-out component upgrade pathing if I've ever seen any.

I have a 6800 XT and a Ryzen 3800XT; playing the game at 1440p/144Hz, I'm getting 100-140 fps on average with everything on ultra. I've only turned motion blur off and dynamic resolution off while pushing to 100% render resolution. How am I able to achieve this while others are struggling with what is technically a better card?

For once we have a game actually using the CPU instead of four cores max and going GPU heavy. That is a far better balance in my opinion, and what next-gen development should be focusing on.
The 4070 performs way worse in this game, which seems to prefer AMD cards. Your 6800 XT is performing more like a 3090 Ti.

People are also testing the game in CPU-bound areas like New Atlantis, even though the game is mostly fine in the open world and in the indoor areas where the majority of the combat takes place.

 

VideoHideo

Neo Member
Just downloaded the DLSS mod and for some reason it is making my CPU go into overdrive. Indoors, outdoors, doesn't matter; I'm hitting 140 watts on my i7-11700K and temps are jumping up to 75C. I actually saw it jump to 80 degrees at one point. I'm legit scared. If I play this game for around 100 hours, am I killing my CPU if it stays around 75 degrees for dozens of hours?

I have no idea why FSR was holding back my CPU, but switching to DLSS has made me gain roughly 10% more performance. Sadly, it comes with more wattage and higher temps.
I would say you are probably fine, but somebody more knowledgeable on this can probably clarify.
 
Cyberpunk is heavy on the CPU, especially with ray tracing on, so you are leaving a ton of performance on the table. A better CPU won't always show its worth in benchmarks, since those graphs tend to only show average framerate on a PC with nothing else running in the background. Stutters and/or strange low frame drops with a 3080 are either a CPU issue or you are running out of VRAM. Starfield seems to be pretty good at managing VRAM, so your CPU would most likely be the issue.
Google "3080" and "5600X" and see what people say about the combo. The 5600X bottlenecking the 3080 is a new phenomena.
 

DryvBy

Member
Engines are more than just rendering. I don't entirely disagree that the engine is being stretched to its limits, but it isn't just about graphics.
Of course not, but this game mechanically isn't doing anything new that older systems can't do with other engines. I don't think Creation is horrible, but it's dated and old. There's a reason they shouldn't be using it, and it boils down to 30fps on Series X and unstable frames on mid-tier systems.
 

bbeach123

Member
Just downloaded the DLSS mod and for some reason it is making my CPU go into overdrive. Indoors, outdoors, doesn't matter; I'm hitting 140 watts on my i7-11700K and temps are jumping up to 75C. I actually saw it jump to 80 degrees at one point. I'm legit scared. If I play this game for around 100 hours, am I killing my CPU if it stays around 75 degrees for dozens of hours?

I have no idea why FSR was holding back my CPU, but switching to DLSS has made me gain roughly 10% more performance. Sadly, it comes with more wattage and higher temps.
75? Dude, my i5-2500K stayed at 90+ for like 5 years, and it's still working in my nephew's PC; it could be over 10 years old already.
 

StereoVsn

Member
Of course not, but this game mechanically isn't doing anything new that older systems can't do with other engines. I don't think Creation is horrible, but it's dated and old. There's a reason they shouldn't be using it, and it boils down to 30fps on Series X and unstable frames on mid-tier systems.
It's doing two things that nobody else really does. One is object persistence and the other is mod support.

But overall it's super dated, especially with the constant loading and the animations.
 

SlimySnake

Flashless at the Golden Globes
I love Todd Howard because he's so shameless about everything. He reminds me of Elon Musk, except Todd is not a genius who builds rockets and electric cars.
Todd Howard's resume: game director and creator on Starfield, Fallout, and Elder Scrolls for the last 30 years.

Elon Musk's resume: bought PayPal. Bought Tesla. Bought SpaceX. Bought Twitter. He's the Phil Spencer of his industry. Todd Howard is the real deal.
 

DeepEnigma

Gold Member
Todd Howard's resume: game director and creator on Starfield, Fallout, and Elder Scrolls for the last 30 years.

Elon Musk's resume: bought PayPal. Bought Tesla. Bought SpaceX. Bought Twitter. He's the Phil Spencer of his industry. Todd Howard is the real deal.
He didn't buy PayPal; he was just the largest shareholder when it sold. He owned X.com, which merged with Confinity and soon after became PayPal.
 

Knightime_X

Member
A game from 2004?
You've got to be kidding me; Half-Life 2 is the benchmark of game optimization.

The GeForce2 MX 400 was a freaking potato by late 2004 and it managed to run it.

You know there are people out there who think a game should run well on their system despite obvious minimum requirements.

You can only optimize so much before you HAVE to upgrade.
The ones complaining have potatoes for PCs.
Much less a baked potato.
 

ZiriusOne

Member
I can't wait until the first modder comes around the corner and releases a mod out of the blue that gives everyone 30fps more because he deleted all of the redundant shit code from the game, or something like that.

There is no way the game should run like this. It's a nice-looking game, but it's certainly not a generational leap forward that warrants a big PC upgrade.
 

Krathoon

Member
I guess I should have got a 4070 Ti. Still, it seems fine with a 4070. I will have to actually set up my computer to see the frame rate stats.

It was definitely time to upgrade from my GTX 1080.
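One common way to get those stats is to capture frame times with a tool like Intel's PresentMon and summarize the CSV afterwards. A minimal sketch, assuming the standard MsBetweenPresents column; "starfield.csv" is a hypothetical capture file name.

```python
# Summarize a PresentMon frame-time capture into average FPS and 1% lows.
# Assumes the standard "MsBetweenPresents" column; the file name is made up.
import csv
import statistics

with open("starfield.csv", newline="") as f:
    frame_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

frame_ms.sort()
avg_fps = 1000 / statistics.mean(frame_ms)
# "1% low" = average FPS over the slowest 1% of frames.
worst = frame_ms[-max(1, len(frame_ms) // 100):]
low_fps = 1000 / statistics.mean(worst)

print(f"avg: {avg_fps:.1f} fps, 1% low: {low_fps:.1f} fps")
```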
 

Reallink

Member
Just downloaded the DLSS mod and for some reason it is making my CPU go into overdrive. Indoors, outdoors, doesn't matter; I'm hitting 140 watts on my i7-11700K and temps are jumping up to 75C. I actually saw it jump to 80 degrees at one point. I'm legit scared. If I play this game for around 100 hours, am I killing my CPU if it stays around 75 degrees for dozens of hours?

I have no idea why FSR was holding back my CPU, but switching to DLSS has made me gain roughly 10% more performance. Sadly, it comes with more wattage and higher temps.
The Ryzen 7XXXs and Intel 13XXXs by default intentionally rev themselves up to 95C and stay there full-time at load. The 11700 is recent enough that I would assume it's spec'ed to target at least 85C, if not 95. Modern CPUs are spec'ed to run much hotter than older ones.
 
Well, you've got a high-end GPU with a lower-end CPU. Now that we are getting games that demand more power, the lower-end part isn't keeping up with the higher-end part.
I don't entirely disagree. The missing piece of info for me is what this game is doing, computationally on the CPU, that other games haven't done or couldn't do with a "low-end" CPU if Bethesda had optimized better.
 