
Star Wars Jedi Survivor PC Performance Discussion OT: May The Port Be With You

SlimySnake

Flashless at the Golden Globes
Update: The launch-day patch fixes most issues. The game is still demanding but runs 30+ fps on a GTX 1060 and 50-60 fps on a 3060 at 1080p; a 3070 or 3080 can run at 50-60 fps at 1440p. The game is still CPU-bound in some scenarios, but it is definitely not as bad as TLOU.

______________________________________________________________________________________________
Game is phenomenal, GOTY contender, but performance even on high end PCs is awful.

Saw this review posted in the review thread, and figured I'd post a PSA here since a lot of us got burned on TLOU and Hogwarts recently. Also, it's probably a good idea to keep this discussion out of the OT/Review threads. The game is truly great, and it's a shame that the performance discussion might drown out everything else the game is doing right.

In short, Jedi: Survivor not only joins the list of poorly optimized titles on PC, it is one of the worst so far this year, much worse than The Last of Us Part I: at 4K with a Ryzen 9 5950X and an RTX 3080 Ti, the latter never dropped below 40 FPS and averaged around 60, staying at 70-80 FPS in many sections. Broadly speaking, Jedi: Survivor is closer to the optimization of Hogwarts Legacy or The Callisto Protocol WITH ray tracing. While there is shader preloading, there is traversal stutter (consistent stuttering when entering or leaving certain zones) and, as I mentioned previously, the big issue with VRAM, which causes highly inconsistent frametimes, distant texture pop-in, and late or erratic texture loading.

At 1080p with everything maxed out, the game can eat up to 11GB of VRAM in the most open spots, and the bottleneck in some graphics-heavy spots doesn't help either. To put it simply, if you don't want VRAM issues at 1080p, you'll need a card with 12GB of VRAM, whereas 16GB would be ideal for 1440p and 4K (and I don't think more than 16GB will be required in the future, unless a game lacks DLSS 2 or FSR 2, both of which help reduce VRAM usage by 1-1.5GB). While it runs at 4K at 30 FPS and 1440p at 60 FPS on consoles, in its quality and performance modes respectively, on PC there will probably be a lot of complaints about its performance, and Nvidia will once again be the focus of contention for skimping on the amount of memory on mid-range cards like the RTX 3060 Ti and RTX 3070.

Source: https://www.pcmrace.com/2023/04/26/star-wars-jedi-survivor-review/

Summary:
- No DLSS support since this is an AMD-sponsored title.
- VRAM issues plaguing even the 12GB 3080 Ti
- Prolonged stuttering and plentiful pop-in

EDIT: The Fextralife guy destroys the game's performance on PC. Calls it abysmal. Does not recommend getting it on PC. Xbox version is fine. Single-digit dips throughout the game's 20-25 hour campaign on a 3080 Ti at 1440p high settings.



Here is a 4090 running the game at just 1440p and experiencing prolonged frame drops below 40 fps.



Seeing as how TLOU ran rather well on 16GB cards, let alone the 24GB 4090, this is arguably even worse.

P.S. Reports say the Xbox 60 fps mode got a big boost with the day-one patch, but there are no such reports for PC. So buy at your own risk!
 

ripjoel

Member
[reaction GIF]
 

SmokedMeat

Gamer™
Lazy fucking developers. That's what we're up against. Don't give me that horseshit about them being focused on consoles either. If you're releasing a $70 PC version, it had better work.

Even the Dead Island 2 developers, who were like the third or fourth team on that train wreck, managed to do better than all of these so-called AAA teams.
 

Drizzlehell

Banned
Game: runs okay on epic settings and 4k resolution, definitely better than whatever consoles would be capable of

PC gamers who watched too many Digital Fucktradry videos: "iTsAfUkKeNmEsS! uNpLaYabLe gArBaGe!!!11"
 

Mattdaddy

Gold Member
I don't know tech at all so maybe I'm talking out my ass, but I don't get how a 4090 couldn't brute force this thing past 45fps no matter how poorly optimized it is, especially at 1440. I feel like there's gotta be some context missing or something.

I've got EA Pro and a 4090, I'll give this thing a spin tomorrow for free and report back how mine performs. I'll be at 4K too; I'd be mind blown if it's 45fps or less.

Edit - The Fextralife guy said it too, fuck, maybe it is that bad!
 

SlimySnake

Flashless at the Golden Globes
Game: runs okay on epic settings and 4k resolution, definitely better than whatever consoles would be capable of

PC gamers who watched too many Digital Fucktradry videos: "iTsAfUkKeNmEsS! uNpLaYabLe gArBaGe!!!11"
Nonsense. There is footage in the OP that shows high end $1200-1600 cards dropping into single digits with bugs galore.
 

Denton

Member
Let's wait for the actual release before doom and glooming, shall we?

Maybe it will run terribly, maybe not. Gotham Knights ran badly, but got fixed and runs perfectly now. Maybe Respawn manages it in the day-one patch, or maybe a few weeks later; we'll see.

Personally, I am occupied right now with the perfectly running Ghostwire, Dead Island 2, and Dead Space, so Respawn will have a few weeks to fix it if necessary before I get to it.
 

Drizzlehell

Banned
Nonsense. There is footage in the OP that shows high end $1200-1600 cards dropping into single digits with bugs galore.
I'm not surprised considering that modern GPUs are trash. People expect these to perform magic solely based on the price but the reality is that Nvidia sold everyone snake oil this generation.

Also, I honestly can't take these videos seriously when the guy shows footage of a game that seems to be running smoothly, with an occasional stutter here and there, and then the voiceover says "it's absolutely abysmal." Like, seriously, are these people blind? If it was the same situation as with Callisto Protocol then sure, that game had some fucked up stuttering. But this? This is nothing.
 

SmokedMeat

Gamer™
Game: runs okay on epic settings and 4k resolution, definitely better than whatever consoles would be capable of

PC gamers who watched too many Digital Fucktradry videos: "iTsAfUkKeNmEsS! uNpLaYabLe gArBaGe!!!11"

The fastest GPU on the planet, at $1,600, shouldn't be getting 40fps at 1440p.

Don’t play it off like PC gamers are overreacting. Especially after all the lazy ports we’ve been handed.
 

GymWolf

Gold Member
GOTY contender :messenger_tears_of_joy:, nice one Slimy.

I'm gonna wait for some patches before trying this one; hopefully Redfall is not gonna be another turd in the making.
 

SlimySnake

Flashless at the Golden Globes
I'm not surprised considering that modern GPUs are trash. People expect these to perform magic solely based on the price but the reality is that Nvidia sold everyone snake oil this generation.

Also, I honestly can't take these videos seriously when the guy shows footage of a game that seems to be running smoothly, with an occasional stutter here and there, and then the voiceover says "it's absolutely abysmal." Like, seriously, are these people blind? If it was the same situation as with Callisto Protocol then sure, that game had some fucked up stuttering. But this? This is nothing.
Occasional? He specifically says the game constantly drops performance, sometimes down to single digits.

Did you see the 4090 video? It has framerate counters, and you can see just how much it drops on THE best card on the market at a mere 1440p.

And that's just the performance. The audio and visual bugs are insane.
 

Drizzlehell

Banned
Occasional? He specifically says the game constantly drops performance, sometimes down to single digits.

Did you see the 4090 video? It has framerate counters, and you can see just how much it drops on THE best card on the market at a mere 1440p.

And that's just the performance. The audio and visual bugs are insane.
Then maybe you'd have to provide a timestamp because I'm watching the video and waiting for the insanity to unfold before my eyes.

I wonder how many games it will take before PC gamers realize that maybe the reason consoles don't do 4K ray tracing at 60 FPS is that current-day hardware is simply incapable of such feats at these levels of graphical fidelity, and PC isn't as far ahead as it used to be.
 

Gaiff

SBI’s Resident Gaslighter
Then maybe you'd have to provide a timestamp because I'm watching the video and waiting for the insanity to unfold before my eyes.

I wonder how many games it will take before PC gamers realize that maybe the reason consoles don't do 4K ray tracing at 60 FPS is that current-day hardware is simply incapable of such feats at these levels of graphical fidelity, and PC isn't as far ahead as it used to be.
Not sure what you mean. The 4090 is over 3x the performance of the consoles' GPUs and high-end PC CPUs are like twice as fast (forgot the exact number).
 

diffusionx

Gold Member
Every gen there is this turnover where PC games suddenly become noticeably more demanding, with far worse performance than ones that came out just prior. Every gen people complain and whine (understandably), but it just seems to come with coding for a higher hardware floor. Normally this happens around the time the new consoles come out, but obviously we've been getting cross-platform games for years. Still, I can't help but notice that TLOUP1 and Jedi Survivor are both current-gen exclusives, and we've only just started getting more games like that.
 

Drizzlehell

Banned
Not sure what you mean. The 4090 is over 3x the performance of the consoles' GPUs and high-end PC CPUs are like twice as fast (forgot the exact number).
But didn't console hardware always have the advantage of being a closed platform that's just easier to optimize games for? Not to mention that console hardware works a bit differently, which makes any direct comparisons with PCs kinda pointless.
 

Gaiff

SBI’s Resident Gaslighter
And the custom i/o?
I hope you're joking.
But didn't console hardware always have the advantage of being a closed platform that's just easier to optimize games for? Not to mention that console hardware works a bit differently, which makes any direct comparisons with PCs kinda pointless.
Which is cool for equivalently specced PCs. Consoles should run better than an equivalent PC. The thing is, there are PCs far more powerful than these consoles that run the game poorly. Clearly, there is something else at play here.
 

Zathalus

Member
And the custom i/o?
With DirectStorage it's about equal to a PS5. Then again, I doubt this game is taking advantage of any of that on consoles, as basic textures take over 10 seconds to load on PS5 in some areas. Just seems to be a bit of a technical mess.
 

//DEVIL//

Member
I was talking about this in my previous posts. Yes, console owners don't get full ray tracing and the like, but they get an optimized game that works without technical issues.

I recently downgraded, trading my 4090 FE to a friend for a 4080 Strix plus $850 Canadian from his side, just because I'm starting to feel like PC is becoming a waste of money.

But even then, I feel like downgrading again and just keeping the PC for old games till devs get their shit together. I'm not giving them my money; they can fuck off.

The FE 4090 retails for $2,100 (MSRP); the FE 4080 retails for $1,700.

This means I got my 4080 Strix for $1,250 Canadian (and I still feel ripped off with these ports). Such a shame.
 

StereoVsn

Member
God dammit. It's every single damn AAA port now that has major issues. It's beyond frustrating.

I got the game "free" with AMD purchase, but because of that there is no DLSS. I think I will just resell the code (didn't register it yet), and come back to this game in like 6 months.
 

Fabieter

Member
With DirectStorage it's about equal to a PS5. Then again, I doubt this game is taking advantage of any of that on consoles, as basic textures take over 10 seconds to load on PS5 in some areas. Just seems to be a bit of a technical mess.

I think it's more that those consoles don't have a lot of bottlenecks. Sure, ray tracing isn't great, but besides that... PC gaming is different, but bad optimization is still a factor, sure.
 

SlimySnake

Flashless at the Golden Globes
Then maybe you'd have to provide a timestamp because I'm watching the video and waiting for the insanity to unfold before my eyes.
lol are you serious? There are several instances in this video where the 4090 drops below 40 fps. Hell, 90% of this video is the 4090 running the game in the 40s at 1440p!!

Here is a timestamp that shows drops to 30 fps, going all the way down to 27 fps. This is a $1,600 4090. The 3080 Ti is roughly 40% slower, so the regular dips below 20 fps Fextralife mentioned make sense (around 0.6 × 27 ≈ 16 fps).



1:23 shows a drop to low 30s.
3:00 shows a drop to low 30s.
6:30 shows a drop to low 30s.

Again, this is 1440p. THE most powerful card in the world. Unable to run the game at anywhere close to 60 fps. Even TLOU wasn't that bad. Just admit you got it wrong and move on.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
With DirectStorage it's about equal to a PS5. Then again, I doubt this game is taking advantage of any of that on consoles, as basic textures take over 10 seconds to load on PS5 in some areas. Just seems to be a bit of a technical mess.
With DirectStorage about equal?
With DirectStorage PCs are hitting 20GB/s.




We just need games to actually use the damn API.
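For anyone curious what "using the API" actually involves, here's a rough, untested C++ sketch of a single DirectStorage read that streams a file straight into a GPU buffer. It assumes the D3D12 device, a destination buffer big enough for the file, and a fence were created elsewhere; the function name and "asset.bin" are just placeholders, not anything from a real engine.

#include <d3d12.h>
#include <dstorage.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Enqueue one file -> GPU buffer read via DirectStorage.
// Assumes device, destBuffer, and fence were created by the caller.
void LoadAssetWithDirectStorage(ID3D12Device* device,
                                ID3D12Resource* destBuffer,
                                ID3D12Fence* fence,
                                UINT64 fenceValue,
                                UINT32 fileSize)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    // A GPU-targeted queue that reads from files.
    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    // Open the asset file (placeholder path).
    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"asset.bin", IID_PPV_ARGS(&file));

    // Describe the read: file source -> GPU buffer destination, no CPU staging copy.
    DSTORAGE_REQUEST request{};
    request.Options.SourceType          = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType     = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Source.File.Source          = file.Get();
    request.Source.File.Offset          = 0;
    request.Source.File.Size            = fileSize;
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = fileSize;
    request.UncompressedSize            = fileSize;

    queue->EnqueueRequest(&request);
    queue->EnqueueSignal(fence, fenceValue);  // fence signals once the data has landed
    queue->Submit();
}

The point being that the storage stack handles the batching and scheduling (and, with newer versions, GPU decompression) instead of a pile of CPU threads, which is where throughput figures like the 20GB/s above come from. Most shipping PC ports just aren't wired up to it yet.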
 

Drizzlehell

Banned
Which is cool for equivalently specced PCs. Consoles should run better than an equivalent PC. The thing is, there are PCs far more powerful than these consoles that run the game poorly. Clearly, there is something else at play here.
That something most likely being the fact that it's a lot harder to optimize games for PCs. It's a tale as old as cross-platform video games.

Anyway, I sympathize, because I'd probably be annoyed too if a PC that I spent a bajillion dollars on still wouldn't play all the games maxed out, but if I were that person I'd probably just drop the resolution to 1080p and move on with my life, because I just don't care. I'm only interested in the games, and I leave it to my brother to agonize over framerates.
 