
The Last of Us Part 1 on PC: another casualty added to the list of bad ports?

The VRAM usage makes no sense. The PS5 has a total of 13 GB reserved for games (13.5 GB if we go by the XSX's allocation). My PC has 10 GB of VRAM and 32 GB of system RAM. They should be able to stream data in and out of that smaller VRAM pool, which should be way better than reading it from the SSD like they do on PS5.
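The idea of treating a smaller VRAM pool as a cache fed from system RAM can be sketched as a simple LRU residency budget. Everything below (asset names, sizes, the 10 GB budget) is a made-up illustration, not how any real engine or this port actually manages memory:

```python
from collections import OrderedDict

# Toy sketch of streaming assets through a fixed VRAM budget (all sizes
# hypothetical). Assets are paged in from system RAM; when the budget is
# exceeded, the least-recently-used asset is evicted.
class VramCache:
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.resident = OrderedDict()  # asset name -> size in MB

    def used(self):
        return sum(self.resident.values())

    def request(self, asset, size_mb):
        if asset in self.resident:            # already resident: mark as hot
            self.resident.move_to_end(asset)
            return "hit"
        while self.used() + size_mb > self.budget and self.resident:
            self.resident.popitem(last=False)  # evict the coldest asset
        self.resident[asset] = size_mb
        return "miss"

cache = VramCache(budget_mb=10_000)            # e.g. a 10 GB card
cache.request("env_boston", 6_000)
cache.request("chars", 3_000)
cache.request("env_indoors", 4_000)            # forces eviction of env_boston
```

The point of the sketch is that a 10 GB pool backed by 32 GB of RAM only needs to hold the working set at any moment, which is why the quoted VRAM numbers look so out of line for a linear game.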

The game doesn't even use RT like Gotham Knights, Dead Space, Callisto, Hogwarts, and RE4 do, so I have no idea why VRAM usage is so high here.

If my 3080, which is 2x more powerful than the PS5, runs the game at medium settings because of VRAM limitations, then that's just fucked up.
Yes, but Sony clearly didn't spend effort optimizing this game for PC, or perhaps Naughty Dog is too inexperienced at development outside of PlayStation. Compare TLOU to Doom and it's night and day; even Cyberpunk ran better performance-wise than this on PC.
 

Schmendrick

Member
So Sony bought PC porting specialists and then goes on to port a game like this via an in-house team that can barely spell "PC".
What could go wrong....
 

DeepEnigma

Gold Member
Alright, now I'm confused. 4K maxed, no FSR/DLSS, and he's using 11-12 GB. While that's still high for a linear game, how are people with 8 GB having issues with it? I don't think it's a VRAM issue.



This guy barely hits 7 GB of allocated VRAM even at native 1440p.

As always, GAF loves to scream VRAM left and right.

VEEEEEEEERRRRAAAAAMMMMMMMMUH
 
I find it funny that prior to these consoles launching, some PC users were saying, "just run 128GB of RAM to get around the SSD/IO/Decompression advantages on the PS5. Problem solved." VFX guy was notorious for this narrative.

Now, it doesn't seem to make sense anymore. I feel like I'm taking crazy pills with every narrative shift on this forum.

Meanwhile, there were those of us saying we like that these new consoles will raise the min-spec bar in the PC arena once the current-gen built games get ported.
Well, that is true, PC can do that; the question is whether devs will code their games to leverage that much RAM. The solution has already been devised, though: it's called DirectStorage.
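The DirectStorage point can be illustrated with a toy load-time model. Every number below (compressed size, disk and decompression bandwidths) is an assumption for illustration, not a measured figure; the sketch only shows why moving decompression off the CPU shrinks the dominant term:

```python
# Toy model: total load time as serial read + decompress stages
# (real pipelines overlap these; all numbers are hypothetical).
def load_time_s(compressed_gb, disk_gbps, decompress_gbps):
    return compressed_gb / disk_gbps + compressed_gb / decompress_gbps

level_gb = 8.0  # hypothetical compressed asset payload for a level

# Traditional path: NVMe read, then CPU-side inflate at a modest rate.
cpu_path = load_time_s(level_gb, disk_gbps=3.5, decompress_gbps=2.0)

# DirectStorage-style path: same disk, GPU-side decompression is much faster.
gpu_path = load_time_s(level_gb, disk_gbps=3.5, decompress_gbps=20.0)
```

Under these assumed numbers the CPU-decompress path takes about 6.3 s versus about 2.7 s for the GPU path, which is the kind of gap the console I/O comparisons in this thread are gesturing at.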
 

Buggy Loop

Member
Where did you get this data?
On the official Steam survey page; the most recent data is from February 2023.

[screenshot: Steam Hardware Survey results, February 2023]


The most popular GPU is the 1650, with 16 GB of RAM, and CPUs from 2.3 GHz to 2.7 GHz.

Open the tab and sum the series cumulatively; it's been done a lot in the past.
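"Cumulating the series" amounts to summing the per-tier percentage shares up to a cutoff. The VRAM tiers and percentages below are made-up stand-ins for the real survey table, just to show the arithmetic:

```python
# Hypothetical Steam-survey-style VRAM shares (percent of respondents per
# VRAM tier in GB); the real breakdown lives on the survey page.
vram_shares = {4: 21.0, 6: 18.5, 8: 30.0, 10: 5.0, 12: 12.0, 16: 8.0, 24: 5.5}

def share_at_or_below(cutoff_gb):
    """Cumulative share of users at or below a given VRAM tier."""
    return sum(pct for gb, pct in vram_shares.items() if gb <= cutoff_gb)

eight_gb_or_less = share_at_or_below(8)  # the bulk of the market in this sketch
```

With these stand-in numbers, roughly 70% of users would sit at 8 GB of VRAM or less, which is why the thread keeps circling back to 8 GB cards.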
 

StereoVsn

Member
Ehh, sounds like most PC games at launch these days.

Here's a guide on how not to run into these issues:

1) Wait 6 months
2) Buy the game
3) ??????
4) Profit.

People keep buying games day one and then make a surprised Pikachu face.
Yep, I will wait 6 months to a year nowadays after the Cyberpunk fiasco.
 

T4keD0wN

Member
They were so afraid of Microsoft potentially releasing inferior, buggier versions of Call of Duty on other platforms that they forgot to look in the mirror when releasing this. LMAO
(Yes, I know it was ported by Iron Galaxy, but they had to OK this.)
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Alright. Played some, and it is not good.

Drops from 45-55 outdoors all the way down to the mid-20s indoors. Makes no sense, because surely the outdoor areas would be much more taxing, no?

The opening ran between 35-45 fps for me, mostly without stutters, on high settings at native 4K. But it was when I got to Boston that things really started to go bad. Simply turning the camera around would cause stutters and huge framerate drops. Then I got into firefights, and that caused things to straight up drop into the teens, including one big stutter that lasted a few seconds and went down to 4 fps!

The game took an hour to install shaders, TWICE. Why am I getting these stutters? Why is the framerate so inconsistent?

I switched to 1800p and am getting a locked 60 fps for now. DLSS Quality at 4K is roughly 55 fps, but I haven't gotten into any more firefights. DLSS looked a bit worse than 1800p. I might just play at 1800p if they can't sort out the issues.

I don't know if the PS5 is using these high settings, but man, the game looks just like TLOU2, only at native 4K. Cutscenes look phenomenal, though. Wish the whole game looked like that, with the hero lighting and insane-looking character models. That opening shot of Sarah sleeping: OMG, NEXT GEN. Just give me that quality of graphics in gameplay.
 

Lokaum D+

Member
I have an RTX 2060, 16 GB DDR4, and a Ryzen 5 3600X, and the game did not even open here; I got "not enough RAM or VRAM" and it crashed. That's why I'm tired of PC gaming: every release there's something new that you have to go onto the Internet to fix.

Refunded
 
Last edited:

JRW

Member
Haven't played this yet, but yeah, 8 GB GPUs are getting screwed lately. I had to disable RT in the RE4 Remake in order to prevent crashing due to VRAM limits... at 1080p (3060 Ti).
EDIT: Apparently it's crashing for some people with 3080s, 4090s, etc., so maybe it's a bug they need to patch.
 
Last edited:

Corndog

Banned
Currently sitting at 40% on Steam user reviews. Ouch!

[screenshot: Steam user review score]

Among the most commonly cited problems are frequent crashes, jittery camera movement with the mouse, excessive VRAM usage, and overall poor performance on high-end hardware.

Apparently, Naughty Dog handled this one in-house, but I wonder how much they were involved, given that the system requirements page has Iron Galaxy's logo on it. Iron Galaxy's Uncharted 4 PC port was alright; not great, but passable, with the only issues being jittery mouse movement and extreme demands compared to the PS5 version.
I imagine the demands are higher because they are using some middleware to translate Sony's SDK calls to DirectX or Vulkan. If that's what they are doing, it could add quite a bit of overhead.
 

Corndog

Banned
I find it funny that prior to these consoles launching, some PC users were saying, "just run 128GB of RAM to get around the SSD/IO/Decompression advantages on the PS5. Problem solved." VFX guy was notorious for this narrative.

Now, it doesn't seem to make sense anymore. I feel like I'm taking crazy pills with every narrative shift on this forum.

Meanwhile, there were those of us saying we like that these new consoles will raise the min-spec bar in the PC arena once the current-gen built games get ported.
You don’t have a clue what you are talking about. I suggest you either educate yourself or quit spouting nonsense.
 

Corndog

Banned
Haven't played this yet, but yeah, 8 GB GPUs are getting screwed lately. I had to disable RT in the RE4 Remake in order to prevent crashing due to VRAM limits... at 1080p (3060 Ti).
Nvidia should have gone with at least 10 to 12 GB but cheaped out.
 

SlimySnake

Flashless at the Golden Globes
Why did Sony buy Nixxes if they barely do anything
lol, what? Dude, they released two games in one year last year: Spider-Man and Miles.

or a game and a half if you want to be petty.

This is literally Naughty Dog's PC port. Everyone is blaming Iron Galaxy, but they had very little to do with this. This is on ND, and ND is leagues ahead of Nixxes.

To the people playing with controllers: are you getting stuttering when turning the camera? I thought this was a mouse-and-keyboard issue.
 

Agent_4Seven

Tears of Nintendo
Looks like I'm good with the PS5 version for a while if not forever.


Fuck this BS PC port from IG; absolute idiots who don't know what they're doing. Steam reviews are as expected, no surprises here.
 

SlimySnake

Flashless at the Golden Globes
Native 4K results. Ouch for the 2080, which is supposedly on par with the PS5. These are ultra settings, though; the PS5 might be using high, we don't know.

What's interesting is the 2080 Ti vs. 3070 comparison. The two cards normally offer identical performance, but the 3070 has only 8 GB of VRAM vs. 11 GB on the 2080 Ti, and here the 2080 Ti is over 25% faster just from that extra 3 GB.

[benchmark chart: native 4K results]



The 1440p results are poor for the 10.7-TFLOPS 6600 XT. Again, we don't know whether the PS5 is using high or ultra, but I doubt going from ultra to high will more than double that card's framerate. You basically need a 20-TFLOPS 6800 XT to reach PS5 performance. I will try to get a benchmark of the delta between high and ultra settings and see if it is really 2x.

[benchmark chart: 1440p results]
 

Gaiff

SBI’s Resident Gaslighter
Native 4K results. Ouch for the 2080, which is supposedly on par with the PS5. These are ultra settings, though; the PS5 might be using high, we don't know.

What's interesting is the 2080 Ti vs. 3070 comparison. The two cards normally offer identical performance, but the 3070 has only 8 GB of VRAM vs. 11 GB on the 2080 Ti, and here the 2080 Ti is over 25% faster just from that extra 3 GB.

[benchmark chart: native 4K results]



The 1440p results are poor for the 10.7-TFLOPS 6600 XT. Again, we don't know whether the PS5 is using high or ultra, but I doubt going from ultra to high will more than double that card's framerate. You basically need a 20-TFLOPS 6800 XT to reach PS5 performance. I will try to get a benchmark of the delta between high and ultra settings and see if it is really 2x.

[benchmark chart: 1440p results]
Yeah, that's brutal. I can't imagine the PS5 using anything less than the High preset, so this would put it on the level of a 6800. That's even worse than Uncharted 4, where it was on the level of a 2080 Ti/3070.
 

Yerd

Member
...too silly for me.
Oh, really? Sounds like you need more proof. This is serious shit only.

Your protagonist is human. Both games are available on Steam and PlayStation. There are monsters populating the world. I don't know how you can tell them apart, honestly.

You can basically screenshot your F4 game and say "hey look, I'm playing LoU1".
 

SlimySnake

Flashless at the Golden Globes
lol, 10 GB of VRAM usage at 1080p. Sorry, but wtf. No wonder 4K is borked.



EDIT: Skip to 9:12 to see the framerate drop to 4 fps. Exactly what I saw during combat sections on my PC.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Yeah, that's brutal. I can't imagine the PS5 using anything less than the High preset so this would put it on the level of a 6800. That's even worse than with Uncharted 4 where it was on the level of a 2080 Ti/3070.
Just checked. There is a lot of weirdness going on if you switch from high to ultra and back without leaving the game. Sometimes the framerate would remain the same when switching from high to ultra; sometimes it would drop well below 30 fps for a good minute or two before coming back up and settling around 40 fps on ultra. One time, high went from 45 fps all the way down to 32 fps without any stutter. I'm like, I didn't change anything! I was just moving around the level.

But loading with ultra settings, then quitting out and loading back in with high settings, gave me the following results at native 4K:
54 fps high
44 fps ultra

Not all scenes are the same; simply turning the camera around brought the FPS down by about 6 fps on both. But the delta was roughly 25%.

What we are seeing between the PS5 and the 6800 XT / 6600 XT is a delta of 100%. I'm a big believer in coding to the metal, but that doesn't get you 2x more performance. This is just a bad port.
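The percentage deltas quoted above are easy to sanity-check. The fps figures are the ones from this post; the 60-vs-30 pair is just an illustrative stand-in for what a 100% delta means:

```python
# Percentage gain of a faster result over a slower one.
def pct_gain(faster_fps, slower_fps):
    return (faster_fps - slower_fps) / slower_fps * 100

# High vs ultra at native 4K, from the numbers above: ~23%,
# in line with the "roughly 25%" delta quoted.
high_vs_ultra = pct_gain(54, 44)

# A 100% delta means literally double the framerate, e.g. 60 vs 30 fps.
doubled = pct_gain(60, 30)
```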

P.S. Camera stutter is really bad at times, but I think it might be related to motion blur and depth of field, both of which kick in when you turn the camera and aim at the same time. I wonder if turning them off would improve performance like it did in RE4.
 

Buggy Loop

Member
The game reserving 5 GB of VRAM for Windows plus apps is completely bonkers too. Windows at 4K won't even take more than 1 GB. That other 4 GB could literally run a decent game on its own.
This, combined with the memory leak, should be an easy fix.
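Back-of-envelope arithmetic for the reservation complaint above. The 10 GB card is an assumed example; the 5 GB reservation and ~1 GB desktop footprint are the figures from the post:

```python
# Assumed example card and the reservation figures quoted in the post.
total_vram_gb = 10
reserved_by_game = 5     # what the game appears to set aside for Windows+apps
reserved_realistic = 1   # roughly what the desktop actually needs at 4K

budget_observed = total_vram_gb - reserved_by_game      # what the game gets now
budget_realistic = total_vram_gb - reserved_realistic   # what it could get
```

In this sketch the game is leaving 4 GB of a 10 GB card on the table, which is the gap the post is complaining about.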
 

rodrigolfp

Haptic Gamepads 4 Life
DLSS saves the game a little so far. Plus, no more clunky, deficient controls/gameplay now that we have M+KB.
 

yurinka

Member
LOL at unoptimized ports thinking games actually should require so much VRAM
Maybe what's unoptimized is the PC hardware compared to the PS5; since PC has a worse I/O system for streaming data into VRAM, games that take advantage of it need more VRAM on PC.

Or maybe you should tone down your visual settings in the game to make it run better. Maybe your PC can't run it at 4K, Ultra, etc.

Why did Sony buy Nixxes if they barely do anything
What are you talking about? They already released two ported games in slightly more than a year after being acquired.

2080 which is supposedly on par with the PS5.
It isn't. And the I/O of any PC also isn't on par with the PS5's. In games that take full advantage of the PS5 hardware (I have no idea if that's the case here; I assume not), you'll need way higher specs to play with the same settings and performance, and even more to play at 4K Ultra with a decent framerate.

Instead of playing on Ultra, play on whatever preset the PS5 uses (high or very high?), drop the resolution to whatever the PS5 uses (1440p?), and use a seriously fast SSD (the PS5 can read at up to 22 GB/s of compressed data). Then you should get somewhat similar performance with a 2080 Super (not a 2080), but with more RAM than a PS5 has, to compensate for the extra bottlenecks of PC hardware.
 
Last edited:
I'm on a 3090, so it's not bad when we take VRAM into consideration, but I still got frame drops with everything maxed and DLSS on Quality. I actually had to compromise on a couple of settings, but with all the texture/shadow stuff maxed out I didn't care. Then I ran into issues while playing: spontaneous frame drops after spawning, and certain places where I'd get frame drops... like rotating the camera while in a CORNER. Lol.

It's a solid try, but they need to fix this. Gonna go check if Nvidia has a driver, then do some more testing if I don't have it (gonna DL it, duh).

EDIT:
No Last of Us Game Ready driver as of now?
 
Last edited:

Fredrik

Member
Glad I didn’t jump in day 1; I’ll wait for some patches. Sucks that they stumbled so badly, as this could’ve broken all kinds of Steam records.
 