
[DF]: The Last of Us Part 1 PC vs PS5 - A Disappointing Port With Big Problems To Address

ChiefDada

Gold Member


Key Takeaways:

1. PC background streaming thrashes the 3600X's CPU performance; as a result, the PS5 at native 1440p with high/ultra settings performs ~30% faster than a 2070S at 1440p DLSS with medium textures and high settings. This applies to all CPUs (albeit with differing performance impact), as the PS5's decompression hardware alleviates this issue on console.

2. PS5 shadow quality appears to scale higher than PC Ultra. (Alex gets really frustrated here, as this was unexpected for him lol).


3. Per Alex, there is no meaningful difference between high and ultra settings (curious that he/they call PS5 textures high vs PC Ultra even though they don't perceive any difference).

4. If you have a powerful enough PC setup you can have a good experience playing TLOU Pt. 1; however, the texture and overall performance degradation, particularly for 8GB GPUs, is disheartening.
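Takeaway 1 can be sketched in miniature: on PC the port burns CPU worker threads on asset decompression that the PS5 hands to dedicated hardware, so every streaming burst steals cores from the game loop. A toy Python simulation (zlib stands in for Oodle; all sizes and thread counts are made up for illustration):

```python
import concurrent.futures
import os
import time
import zlib

# A 1 MiB "asset" chunk; random data stands in for compressed game data.
CHUNK = os.urandom(1 << 20)
COMPRESSED = zlib.compress(CHUNK, 1)

def decompress_chunk(_):
    """Decompress one streamed chunk on the CPU; returns decompressed size."""
    return len(zlib.decompress(COMPRESSED))

def stream_textures(n_chunks, n_workers):
    """Decompress n_chunks on n_workers threads; return wall-clock seconds."""
    start = time.perf_counter()
    with concurrent.futures.ThreadPoolExecutor(n_workers) as pool:
        list(pool.map(decompress_chunk, range(n_chunks)))
    return time.perf_counter() - start

# More worker threads finish the same streaming burst sooner, but each
# worker occupies a core the renderer and game logic cannot use that frame.
t1 = stream_textures(64, 1)
t4 = stream_textures(64, 4)
print(f"1 worker: {t1:.3f}s, 4 workers: {t4:.3f}s")
```

On a 6-core part like the R5 3600 there is simply less slack: the same burst either takes longer (few workers) or starves the frame (many workers), which is exactly the trade-off hardware decompression sidesteps.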
 

Buggy Loop

Member
Anticipation Popcorn GIF
 

Saber

Gold Member
It's not glitched, just low resolution textures for certain hardware that's not up to the task.

From what I saw during the long hours of looking at the ground, I'm pretty sure there's a part that is glitched, besides the PS5 version.
But there are some other weird occasions as well, like the lamp light.
 

Gaiff

SBI’s Resident Gaslighter
Not interested in the Twitch version of DF. I'm watching for their analysis. I don't care about them wandering around for an hour cracking jokes. Time is precious and I won't give an hour of it to these three guys zooming and pausing to scrutinize textures.

They should throw that out and come back with a concise 20-30 minute video with a wealth of information. No need for this garbage.
 

Bartski

Gold Member
Running the comparison with PS5 performance mode really shows how much closer it is to PC ultra than to the mid/high rig, but then there are all those other issues on the top-end rig… very interesting
 
These problems are not going away anytime soon and seem to be getting worse. PC development is complex due to all the possible hardware arrangements. If you are a PC gamer, you must simply accept these issues and I guess be grateful that stuff like TLOU is even coming out for your platform to begin with.
 

Zathalus

Member
These problems are not going away anytime soon and seem to be getting worse. PC development is complex due to all the possible hardware arrangements. If you are a PC gamer, you must simply accept these issues and I guess be grateful that stuff like TLOU is even coming out for your platform to begin with.
It's odd how exceptions like this are suddenly the norm and to be expected. For every game like this you have numerous others that are just flat out better on PC.
 

Gaiff

SBI’s Resident Gaslighter
These problems are not going away anytime soon and seem to be getting worse. PC development is complex due to all the possible hardware arrangements. If you are a PC gamer, you must simply accept these issues and I guess be grateful that stuff like TLOU is even coming out for your platform to begin with.
Lol no. This is a transaction, not a charity. Naughty Dog isn't doing this out of goodwill. Sony wants the PC money and is selling to PC gamers. Nobody needs to be grateful to anybody.
 
It's odd how exceptions like this are suddenly the norm and to be expected. For every game like this you have numerous others that are just flat out better on PC.

This may be a more extreme example, but it's certainly not that far out of the norm now.

"flat out better on PC" is of course relative to the hardware, and you would hope that they would be, presumably that's what you're paying a premium for.
 

Gaiff

SBI’s Resident Gaslighter
Also, I wouldn't hold my breath for the port to get much better. Naughty Dog will probably manage to fix the crashes and bugs but not much more. Nixxes had to redesign the entire memory management system for the PC port of Spider-Man, and they're far better PC developers than Naughty Dog or Iron Galaxy. I'm not expecting the performance profile to get much better, but I do expect the game to get more stable and playable. Those running 10GB cards or less will have to make do with PS3-era textures regardless. I don't see that changing unless Naughty Dog redraws the textures for Medium settings. Of course, there's always the chance that modders come to the rescue and remake textures that look better than Medium while fitting in a smaller memory budget.
 

GHG

Member
Crying that an older 6-core CPU and an 8GB VRAM GPU are struggling to keep up when faced with a next-gen port that was made with a next-gen console in mind.

Equal parts sad and hilarious. More to come as the generation goes on. I'm so sick and tired of PC gamers with overinflated ideas about what their hardware is capable of. Either put the money down for the best hardware and have a good time or accept that your mid-low tier hardware is exactly that and adjust your expectations accordingly.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Mate....medium?

Those look like they are textures meant for the switch port.
Nah, nope, nada. ND took some serious shortcuts doing this when authoring down, because that is absolutely trash.

And why would they reserve that much VRAM then force people to use such ugly textures?
On a 4090 they reserve almost 5GB of VRAM.......WTF?


P.S.
Funny that the PC community had the Oodle library DLL working better than launch.......at launch.
How did ND not pick that up? It's absolutely hammering that poor R5 3600.
And even with the "hacked" DLL the game is still unnecessarily heavy on the CPU.
The 5000 and 12000 series should get through it a lot better, but forcing people to upgrade their CPUs because you're using a more CPU-intensive comp/decomp algorithm is another mistake I just don't understand.

I guess we wait for another patch......we're on patch 3, right?
 

Buggy Loop

Member
Crying that an older 6-core CPU and an 8GB VRAM GPU are struggling to keep up when faced with a next-gen port that was made with a next-gen console in mind.

high quality GIF


It's basically PS4 The Last of Us Part II redux, and even worse looking. You can literally play Part I on an external HDD with the 60fps patch. Magic sauce? Cerny sauce? I don't believe for a minute that this is defining a hardware generation.

A Plague Tale: Requiem is a next-gen exclusive and trumps this graphically by a wide margin. It uses ~6.2GB of VRAM and doesn't go insane with CPU utilization.
The Last of Us' LOW textures at 1080p, which don't even look good by PS3 standards, are using 6GB of VRAM!

WOW, PS5 power!

Sexy Horse GIF


Excusing Naughty Dog for their shit port because "yOu HaVE tO uPgRAde yOur pc" ? No fucking way.
 

Gaiff

SBI’s Resident Gaslighter
I am curious to read more about this. Do you have any source to share?

Of course.

Roza confirmed that Nixxes built an entire system to accommodate players who ignore Spider-Man's recommendations for graphics cards (GPUs)—specifically, anyone who plays the game with settings that exceed the amount of video memory (VRAM) available in their GPU. Though Windows includes a memory management system that can shift memory-related calls from VRAM to general-purpose system RAM, this is unoptimized by default, and video games need quicker memory for specific tasks.

Nixxes' solution was to create its own non-unified memory management system, which ranked and tracked a hierarchy of calls' importance in the code, then measured them by "distance to 512 MB" in order to prioritize which large chunks of memory would make the most sense in a shift from VRAM to system RAM. So if you've ever decided to push your Spider-Man gameplay on a GPU that wasn't quite up to the task, you can thank Nixxes for implementing an invisible reduced-stutter option.

Source
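For the curious, the "distance to 512 MB" heuristic described in that quote could look something like the rough Python sketch below. The class names, scoring rule, and allocation sizes are my own guesses for illustration, not Nixxes' actual code:

```python
from dataclasses import dataclass

MB = 1 << 20
TARGET = 512 * MB  # the "distance to 512 MB" yardstick from the interview

@dataclass
class Allocation:
    name: str
    size: int        # bytes currently resident in VRAM
    importance: int  # lower = safer to demote to system RAM

def pick_evictions(allocs, bytes_needed):
    """Choose low-importance allocations whose size is closest to the
    512 MB target until enough VRAM has been freed for the new request."""
    ranked = sorted(allocs, key=lambda a: (a.importance, abs(a.size - TARGET)))
    evicted, freed = [], 0
    for a in ranked:
        if freed >= bytes_needed:
            break
        evicted.append(a.name)
        freed += a.size
    return evicted, freed

# Hypothetical VRAM contents on an over-budget card.
pool = [
    Allocation("shadow_atlas",  256 * MB, importance=3),
    Allocation("streamed_mips", 600 * MB, importance=1),
    Allocation("ui_textures",    64 * MB, importance=1),
    Allocation("gbuffer",       512 * MB, importance=5),
]
names, freed = pick_evictions(pool, 700 * MB)
print(names, freed // MB, "MB freed")
```

The point of the heuristic is that demoting a few large, low-importance chunks costs fewer individual transfers (and fewer stutters) than demoting many small ones, which is why the list is ranked before anything moves.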

Is Hardware Unboxed a Sony Warrior as well?


There's an enormous difference between their posts and this:

Nah, bringing 8GB GPUS into the current generation is a ridiculously bad idea.
How fucking moronic is this? So every 8GB GPU and less is obsolete now? PC scales all the way down to 1080p. It's completely unreasonable to expect 8GB to be enough at 4K, but 1080p? A resolution that's been the standard on PC for what, 15 years now? And hilariously, the DF video touches on this too.

ulq6uiV.png


The fact that the textures look this bad, that the game is full of glitches, and has missing effects on PC tells you that this is a job of incompetence and carelessness but your troll ass can't help it, you have to use it as ammo for system warring.

And then, you once again show your true colors.

I mean, just a few years ago PC gamers were complaining that the consoles were holding gaming back, now we have complaints that a current gen game isn't scaling well enough to accommodate PC cards with limited memory.
You were basically waiting for something like this to strike back at the PC warriors.

Done wasting my time with a pathetic troll like you.
 

GHG

Member
high quality GIF


It's basically PS4 The Last of Us Part II redux, and even worse looking. You can literally play Part I on an external HDD with the 60fps patch. Magic sauce? Cerny sauce? I don't believe for a minute that this is defining a hardware generation.

A Plague Tale: Requiem is a next-gen exclusive and trumps this graphically by a wide margin. It uses ~6.2GB of VRAM and doesn't go insane with CPU utilization.

Excusing Naughty Dog for their shit port because "yOu HaVE tO uPgRAde yOur pc" ? No fucking way.

Well, they designed the remake with PS5 hardware in mind and then ported it over to PC retrospectively.

It was never originally made with PCs in mind and the port isn't great, so brute force is required. This is nothing new; we've all been here before.

Nobody has to upgrade their PC; even the Steam Deck will run it at appropriate settings. There's an options menu for a reason. People should put their egos to the side and learn to use it instead of crying "unoptimised" without having any idea what's going on under the hood and why things are running the way they are.
 

Lysandros

Member
The RTX 2070S isn't at the PS5's level spec-wise to begin with (RT hardware aside, which is irrelevant here), so why the hell do they insist it should be? That is baffling. Just to mention some basics: the RTX 2070S is at 9 TF, 113 Gpixel/s, 283 Gtexel/s, not to mention its significantly lower geometry throughput and lower-bandwidth caches. Now compare that with the PS5: 10.28 TF, 142 Gpixel/s, 320 Gtexel/s, plus customizations such as cache scrubbers. Are they that clueless?
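Plugging the quoted figures into a quick script puts the gap in percentage terms (the numbers are from the post above; the rounding is mine):

```python
# Raw throughput figures quoted above: (RTX 2070 Super, PS5).
specs = {
    "compute (TFLOPS)":   (9.0, 10.28),
    "fill (Gpixel/s)":    (113, 142),
    "texture (Gtexel/s)": (283, 320),
}

for metric, (rtx, ps5) in specs.items():
    # Relative advantage of the PS5 figure over the 2070S figure.
    print(f"{metric}: PS5 is {100 * (ps5 / rtx - 1):.0f}% higher")
```

So on paper the PS5 has roughly a 13-26% edge in these three metrics before any of the console-specific customizations are counted.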
 

Stuart360

Member
If you have an 8GB card I'd just recommend playing at 1080p.

I'm playing at 1080p with a mix of High and Ultra settings, with Environment Textures set to High and all the other texture settings set to Ultra, and I'm averaging only around 0.5GB more VRAM usage compared to Alex -

fcdYQtD.jpg


Imo textures are arguably one of the biggest factors, if not the biggest, in how good a game looks. So yeah, I'd rather play at 1080p with good textures, to be honest.
 