
[DF]: The Last of Us Part 1 PC vs PS5 - A Disappointing Port With Big Problems To Address

Buggy Loop

Member
The resolution there is already 1080p, though. Didn't people have much better luck once they dropped the resolution down to 1080p?

Panajev

[screenshot: The Last of Us Part 1 at low texture settings, 1080p]


Imagine this: 6.2 GB of VRAM at low texture settings @ 1080p, if we can even call those "textures". In fact, it's not much lower in VRAM usage than the ultra/high texture settings at 1080p, which only add roughly 1 GB.

This alone should raise a big fat red flag that something's fundamentally broken.

For reference, you could literally take a 6 GB card, turn the VRAM into a virtual disk, install Crysis on it and play it from there. Hell, Crysis looks better than the screenshot above and ran on 256-512 MB of VRAM.

In fact, that screenshot's textures are so bad it looks like the interns couldn't find the PS5 assets, so they grabbed PS3-era textures for the low setting, the kind you saw on consoles with 256 MB of VRAM. Even then I think I'm being too generous; I think PS3 games did better.

Here's 6.2 GB of VRAM (at 4K, not 1080p) used by a "next-gen" exclusive title:

[screenshots: A Plague Tale: Requiem at 4K]


 

01011001

Banned
Nope, Cyberpunk had the same shit at launch. I played it on a 2070 Super and it would seize up and crash at random intervals after playing for 30 minutes.

It was a widespread issue at the time, and the ultimate cause of the crashes? VRAM being maxed out due to a memory leak.



Google and you will find plenty more examples of people talking about this.


That's a very obvious bug, though. It's not so obviously a bug in RE4 or TLOU; in those two cases it simply seems like badly optimised ports that don't look or run as well as you'd expect from the hardware.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
These problems are not going away anytime soon and seem to be getting worse. PC development is complex due to all the possible hardware configurations. If you are a PC gamer, you simply have to accept these issues and, I guess, be grateful that stuff like TLOU is even coming to your platform to begin with.

Sony needs to realize that PC ports need to come out one year or more after their PS5 releases..............or when THEY ARE READY! No need to push something on the secondary hardware markets that's not ready.
 

Senua

Gold Member
Sony needs to realize that PC ports need to come out one year or more after their PS5 releases..............or when THEY ARE READY! No need to push something on the secondary hardware markets that's not ready.
It was clearly rushed out to capitalise on the TV show hype.
 

rofif

Banned
Panajev

Imagine this: 6.2 GB of VRAM at low texture settings @ 1080p, if we can even call those "textures"...
Plague Tale is fantastic. I don't like stealth so it's not my fav game but the graphics and story are so good.
And I disliked the first one
 
Very interesting video regarding the TLOU problems on PC. It seems I was correct about this whole decompression chip: without it, developers are forced to hammer the CPU and increase VRAM usage. I wonder if anything can be done to improve the requirements; maybe GPU decompression will help.
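To put a rough number on the "hammer the CPU" point, here's a minimal sketch (not Naughty Dog's actual pipeline; zlib is just a stand-in for whatever codec the game really uses) that times single-core decompression of a synthetic asset. Compare the MB/s it prints against the multiple GB/s quoted for the PS5's hardware decompressor or for GPU decompression, and the CPU cost of software-only streaming becomes obvious.

```python
# Hypothetical illustration: time CPU-side decompression of a synthetic "asset"
# to show why streaming without a hardware decompressor eats CPU cores.
import time
import zlib

raw = b"rock_albedo_mip0 " * 4_000_000      # ~65 MB of compressible dummy data
blob = zlib.compress(raw, 6)                 # what would live on disk

start = time.perf_counter()
assert zlib.decompress(blob) == raw          # the work a console offloads to dedicated hardware
elapsed = time.perf_counter() - start

mb = len(raw) / (1024 * 1024)
print(f"decompressed {mb:.0f} MB in {elapsed:.3f}s -> ~{mb / elapsed:.0f} MB/s on one core")
```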

 

Tqaulity

Member
Wait, what was wrong with Spider-Man, Atomic Heart and Returnal? Atomic Heart was missing the advertised RTX, which is a piss-take, but it runs and looks great.
Seriously? I'm not going to rehash it all in detail, but if you've been paying attention (it may not have affected you personally, which is great):

Spider-Man: poor CPU utilization and CPU-boundedness (especially with RT on), VRAM limitations/performance degradation
Atomic Heart: stutter, crashing, FPS drops (at launch)
Returnal: poor CPU utilization and CPU-boundedness (especially with RT on), streaming stutters when loading (again, at launch)

Every one of the issues you bring up can be addressed in ways that make the user experience not just ok, but far superior to the console experience. Stuttering is a shader comp issue that can be addressed by precompiling shaders, it’s a stupid developer choice to not do so. APIs and drivers and what have you have never been better than they are now on PC. I don’t seem to have any issues with GamePass releases, which kinda says to me Microsoft’s quality control is pretty good compared to any rando developer on Steam.
I'm so confused. So PC can offer a "far superior" experience compared to consoles, but having to wait anywhere from a few minutes to an hour to precompile shaders before startup is somehow superior to just loading the game and playing? (And having to redo that every time you make a GPU driver or hardware change, by the way.) Then you say APIs and drivers have never been better... but does that mean they are easier to develop for than a console, even in their current state? Going from "pain in the ass" to less of a "pain in the ass" is still a... pain in the ass. Also, you do realize that much of the pain with shader compilation stutter is down to how DX12 works and what developers now have to do at the software level, right? No, it's not just a UE4 issue.

Almost all of the games you mentioned were better on PC despite having issues. Let's not pretend console games have not had issues either. Callisto Protocol on Xbox was a mess, Hogwarts Legacy RT mode is basically unusable on consoles and the image quality for Dead Space, Forspoken and RE4 for PS5 at launch was terrible (in RE4 case it still is).
I never said console games don't have their issues. S**t, making video games is difficult, PERIOD. But you're missing my point. Using The Last of Us Part 1 as a recent example:

To this point, the legacy of The Last of Us was simply that it was an exceptional game that many held up as one of the best gaming experiences ever. It was a shining example of what the medium is capable of, and its quality helped define the reputations of both PlayStation Studios and Naughty Dog, even resulting in one of the best shows in HBO's history. Keep in mind that this game has now released across three console generations, with various amounts of remastering and remaking along the way, and while there were of course patches to improve things, the discussion around the game never wavered from it being outstanding.

Now, with the PC port, what is the discussion and legacy? It's about how it doesn't run on this or that SKU, how poor the port is, how badly it performs, how it's crashing for many users, how it can't hit this resolution and FPS on that system. No matter what they do from this point on to fix it, these impressions will stick whenever people think back to the PC version. The legacy is now a bunch of silly memes with soaking-wet characters and black artifacts all over them. I haven't seen a single forum thread or media article discussing just how great a game (at its core) The Last of Us really is in the context of this PC release. Why? Because, often, the nature of the PC and the issues that come with it have made it increasingly difficult for gamers to just enjoy the games they are playing. It's hard to simply enjoy a game when you have glitching, stuttering, inconsistent performance, crashing, and other issues you have to debug just to get it working properly. Sure, not EVERYONE has these issues, but that's the whole point: there isn't just a single PC to consider. Hell, there isn't even a single series or tier of PCs to consider, with hardware spanning a decade or more at times.

Everyone loves to act like "the PC" is a single monolithic beast of a computer. How many of you realize that most dev studios barely have any RTX 4000-series cards to test their games on (never mind the RX 7000 series)? There may be one to three Ada cards at a studio, typically reserved for a few key devs on the team rather than the QA lab. At the same time, there may not be any non-RTX cards, DX9-era cards, or Windows 10 systems available to test on either. There's so much trust in and reliance on external parties doing their jobs properly and addressing new bugs just to ensure a game works correctly. You hope Microsoft does its job with the Windows OS and the APIs your game relies on. You hope that Nvidia's and AMD's drivers behave as expected across the full range of SKUs out in the market (beyond what you can test). You hope that Steam, Epic, Xbox, Origin, Battle.net, Uplay and the rest work properly for every user regardless of what they're running on their machine. You hope that OBS, GeForce Experience, Logitech, Razer, Norton and whatever other software is running on the user's machine doesn't mess with the game's functionality and performance. Most importantly, you HOPE that the user will follow instructions, heed warnings, and use common sense, and not try things that may break the game (e.g. running ultra settings and RT on a five-year-old non-RT GPU with only 8 GB, or pairing an old Intel 6000-series Skylake quad-core with an RTX 4090 and wondering why the performance is so bad).

Yes, game development in general is a b*t**, but PC development has its own challenges that make it even worse in many ways. When you really think about it, it's a minor miracle that something as complex as a modern video game ever ships without egregious issues that prevent the player from simply enjoying it.
 

Guilty_AI

Member
Seriously? I'm not going to rehash it all in detail, but if you've been paying attention (it may not have affected you personally, which is great)...
[reaction image]
 

GHG

Gold Member
Plague Tale is fantastic. I don't like stealth so it's not my fav game but the graphics and story are so good.
And I disliked the first one

It's the difference between a game built for PC first (Plague Tale, Atomic Heart, etc.) and then ported to consoles, versus a game built for a console first and then ported to PC.

It's no coincidence that recent third-party releases like Wild Hearts and Wo Long have exhibited similar issues (high VRAM usage along with higher-than-usual CPU usage).
 

Stooky

Member
because Crash wasn't impressive in any area.
Spyro was.

Crash, in terms of game design, could literally have been done on an SNES.
Having good-looking graphics while running down a corridor isn't impressive.
What a load of crap. Go do your research on what it took to get that quality of graphics and animation running on the PS1. It had never been done before. That game was a technical masterpiece when it came out.
 

poppabk

Cheeks Spread for Digital Only Future
I'm so confused. So PC can offer a "far superior" experience compared to consoles, but having to wait anywhere from a few secs to a few minutes (or up to an hour for this one game) to precompile shaders before startup is somehow a superior experience to just loading the game and playing? (And having to redo that every time you make a GPU driver or HW change BTW).
Fixed that for you. Shader compilation is usually a few minutes at most, and a blip relative to downloading 100+ gig of data and having it install. TLOUp1 taking so long is one of the reasons people were immediately wary of the quality of the port.
 

Guilty_AI

Member
Fixed that for you. Shader compilation is usually a few minutes at most, and a blip relative to downloading 100+ gig of data and having it install. TLOUp1 taking so long is one of the reasons people were immediately wary of the quality of the port.
Oh, right. You should know some people here are under the misconception that shader compilation is something that needs to be done every time you play the game.
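For anyone picturing how that works: a compiled shader/PSO cache is typically keyed on the shader plus the GPU and driver identity, so it survives normal restarts but is invalidated when you swap the card or update the driver. A minimal sketch of the idea; the key fields, on-disk layout, and the compile_shader placeholder are illustrative, not how any particular engine or driver actually does it.

```python
# Illustrative shader cache: reused across launches, rebuilt after a GPU/driver change.
import hashlib
import json
from pathlib import Path

CACHE_DIR = Path("shader_cache")

def cache_key(shader_source: str, gpu_name: str, driver_version: str) -> str:
    # Bake hardware and driver identity into the key, so a new driver misses the cache.
    ident = json.dumps({"src": shader_source, "gpu": gpu_name, "driver": driver_version})
    return hashlib.sha256(ident.encode()).hexdigest()

def compile_shader(src: str) -> bytes:
    # Placeholder for the real (slow) driver-side compile.
    return hashlib.md5(src.encode()).digest()

def get_or_compile(shader_source: str, gpu_name: str, driver_version: str) -> bytes:
    CACHE_DIR.mkdir(exist_ok=True)
    entry = CACHE_DIR / cache_key(shader_source, gpu_name, driver_version)
    if entry.exists():                          # second launch onwards: instant
        return entry.read_bytes()
    compiled = compile_shader(shader_source)    # first launch: the precompile wait
    entry.write_bytes(compiled)
    return compiled

# Example call with made-up GPU/driver identifiers.
get_or_compile("float4 main() : SV_Target { return 1; }", "RTX 3080", "531.41")
```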
 
My 3080 kept crashing while playing this.

I reverted my mildly overclocked 3080 back to default clocks and I'm playing on a mixture of High and Ultra (with DLSS set to Quality) on a 1440p ultrawide monitor. No more crashing.

It's running stable now at around 80 fps. I have textures set to High and every other setting on Ultra.
 

SlimySnake

Flashless at the Golden Globes
I honestly don't care; what I'm trying to prove is that the 1.6 GB figure is fake and only promotes FUD. They should remove it ASAP. RDR2 was similar: it would claim "other apps" were using some weirdly high amount of VRAM, but the game would use all the free available VRAM regardless.

What I'm trying to show and prove is that you can perfectly well run high-quality textures within an 8 GB budget, even at 1440p, given you have the performance headroom to do so. 1080p is a cakewalk regardless. You can push textures to 7.2-7.3 GB of VRAM usage as a casual user, and if you disable Steam's hardware acceleration you can push that to 7.4-7.5 GB instead.

It's just a readout; it has no effect on how the game ACTUALLY uses memory. The game actually USES free memory if there's free memory.
I tried your suggestions, but even after telling DWM.exe, the system search bar, and 10 other Windows apps not to use the GPU for high performance (in the Windows graphics settings), they still consume well over 800 MB in total.

In game I can now get up to 9.1 GB, up from 8.5 GB, after making those changes, but I'm not able to get the background usage down to 300-400 MB like you suggested.

At 1440p internal (4K DLSS), I'm able to use high textures, but I'm on a 10 GB card with twice the power of a 2080 or 2070 Super. I don't know how people can run the game at high settings at 1080p when the game only gives you about 1 GB back for dropping from 1440p to 1080p.

Is it possible that you have an integrated GPU? My CPU doesn't have one, which could be why DWM uses almost 600 MB here and three other Windows apps use 100 MB each; the rest add up to another 100 MB.

When I googled it, it turns out DWM can run on the integrated graphics chip, so that's probably something DF and other tech sites should mention going forward. 1 GB is A LOT when you only have 8-10 GB of VRAM available; it's the difference between 1080p and 1440p in this case.
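If anyone wants to watch this themselves, here's a minimal sketch for logging total VRAM use over a session, assuming an NVIDIA card with nvidia-smi on the PATH. It reports the whole card rather than per process like Task Manager's GPU tab, but it's enough to see DWM/background creep or a leak that keeps growing.

```python
# Poll total VRAM usage every 30 seconds and report growth over the idle baseline.
import subprocess
import time

def vram_used_mib() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])    # first GPU only

baseline = vram_used_mib()
print(f"idle/desktop usage: {baseline} MiB")
while True:
    time.sleep(30)
    now = vram_used_mib()
    print(f"used: {now} MiB ({now - baseline:+d} MiB vs idle)")
```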
 

Hoddi

Member
I tried your suggestions, but even after telling DWM.exe, the system search bar, and 10 other Windows apps not to use the GPU for high performance (in the Windows graphics settings), they still consume well over 800 MB in total...
DWM at 4K eats ~700 MB more than it does at 1440p in some cases. It might be worth testing at native 1440p.

I also see a pretty big performance difference between native 1440p and using DLSS to upscale to 4K; it was something like 45 fps vs 53 fps in the scene I tested.
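Some back-of-the-envelope numbers on why the compositor costs more at 4K: every surface DWM composites scales with the output resolution. The buffer counts and formats below are assumptions for illustration, not measurements of what DWM actually allocates.

```python
# Rough surface-size math for a composited desktop at two resolutions.
def buffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / (1024 ** 2)

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    one = buffer_mib(w, h)
    print(f"{name}: {one:.0f} MiB per 8-bit RGBA surface, "
          f"{one * 3:.0f} MiB for a triple-buffered swap chain, "
          f"{one * 3 * 2:.0f} MiB if HDR doubles it to FP16")
```

That alone doesn't explain a ~700 MB gap, but multiply it across the desktop, the game's swap chain copies, overlays and a few cached app windows, and the 4K penalty adds up quickly.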
 

SlimySnake

Flashless at the Golden Globes
DWM at 4k eats ~700MB more than it does at 1440p. It might be worth testing at native 1440p.

I also see a pretty big perf difference between native 1440p and using DLSS to get that up to 4k. It was like 45fps vs 53fps in the scene I tested.
Interesting, I will try that. I also heard HDR increases DWM usage, but it fluctuates so much I wasn't able to verify it.

DLSS has a cost IIRC, especially at Quality. Native 1440p is definitely going to get you better performance than 4K DLSS, but I saw that VRAM usage went down when I turned on DLSS, so I figured it would at least solve my VRAM bottleneck. (It did, mostly.)
 

yamaci17

Member
I tried your suggestions, but even after telling DWM.exe, the system search bar, and 10 other Windows apps not to use the GPU for high performance (in the Windows graphics settings), they still consume well over 800 MB in total...
No, I don't have an iGPU. Do you have a single screen or multiple? Do you have any active windows? Do you kill dwm.exe? (It's not actually killable; it will restart, but with less VRAM.)

4K: [screenshot]

1440p: [screenshot]


What are those 100 MB Windows apps? Are they necessary? They're most likely what's bloating the DWM VRAM usage; try killing them.

For the high-settings case, I should note that "visual effects" and "volumetrics" are set to low on my end. Those two give some VRAM back that you can spend elsewhere; that's how I can run high character and environment textures at 4K DLSS Performance. This settings combo gives me around 7.5 GB of VRAM usage, and with 500 MB of idle usage I'm barely scraping by. At 1440p DLSS Quality my settings use 7.1-7.2 GB and run easily (1440p DWM VRAM usage is also much lower).

You cannot get rid of DWM or its VRAM usage entirely, but you can reduce it by killing some of those background apps and killing dwm.exe on top; that resets dwm.exe back to its baseline idle VRAM usage. 9.1 GB is plenty anyway... you don't need any more tweaking! If you have 9.1 GB of VRAM free, you can safely push the "game application" VRAM usage to 9 GB.

This is what I would do if I were you:

[screenshot of suggested settings]

Visual effects at medium, overall preset "high".

But the real question is: do ultra textures provide any advantage over high ones? You could, I guess, also push "ultra" environment textures; that should put you at around 9.1 GB of usage. If not, you can at least scale "environment" textures back to high, and once you do that your game application usage will drop back to 8.3 GB, leaving you plenty of VRAM headroom. Tweak around; you can push something near 9 GB if you have 9.1 GB free, and as I said, 9.1 is plenty at 4K DLSS Quality from what I'm seeing.

How do I get to 7.6 GB? High preset + DLSS Performance + low volumetrics + low visual effects texture quality, with the other textures set to high. You actually have the luxury of running some of them at Ultra (worth it or not, I don't know). If I reduce dynamic object quality to medium I get around 7,350 MB instead, which most 8 GB users should have as free VRAM... 1440p is easier since you can further reduce the DWM impact.

4K DLSS Performance VRAM cost = native 1440p VRAM cost, by the way. Those 4K LODs and textures are costly, but they're worth it.

Maybe it's HDR. I don't have an HDR screen, sorry. This is why I will miss exclusive fullscreen modes in games: you could leave the desktop at 1080p, run games exclusively at 4K, and save VRAM that way.

dwm.exe is weird; over time it can get bloated. Killing it and restarting explorer.exe tames these processes and makes them use less VRAM. Once the game has allocated that free VRAM, they can't step in and claim it for themselves anymore (unless you alt-tab a lot; at some point they will take VRAM back from the game and performance may tank).
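On the "4K DLSS Performance VRAM cost = native 1440p" point, the scale factors below are the commonly cited DLSS 2.x ones; the extra cost over the internal resolution comes from the game still allocating full 4K output targets and choosing 4K-level mips/LODs. Treat this as a rough sketch, not measured values.

```python
# Commonly cited DLSS 2.x per-axis scale factors and the internal resolutions they imply.
DLSS_AXIS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

output_w, output_h = 3840, 2160
for mode, s in DLSS_AXIS_SCALE.items():
    print(f"4K DLSS {mode}: renders internally at ~{round(output_w * s)}x{round(output_h * s)}")
print("Native 1440p: renders at 2560x1440 with 1440p output targets")
```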
 

Hoddi

Member
Interesting, I will try that. I also heard HDR increases DWM usage, but it fluctuates so much I wasn't able to verify it.

DLSS has a cost IIRC, especially at Quality. Native 1440p is definitely going to get you better performance than 4K DLSS, but I saw that VRAM usage went down when I turned on DLSS, so I figured it would at least solve my VRAM bottleneck. (It did, mostly.)
I should have said 'in some cases', and I've edited my post. I just did a quick test after killing DWM.exe and got 1 GB with my 1440p monitor enabled, 1.5 GB with my 4K monitor, and 1.7 GB with both enabled.

It's going to vary depending on your content and all that, but it's worth trying.
 

rofif

Banned
I tried your suggestions, but even after telling DWM.exe, the system search bar, and 10 other Windows apps not to use the GPU for high performance (in the Windows graphics settings), they still consume well over 800 MB in total...
I would honestly feel a bit like a tool doing this Windows 98-era crap with a game I've just bought. This is so far removed from just playing the game, lol.

It reminds me of doing exactly that many times: for Doom 3, for Half-Life 2, Doom 2016, The Witcher 3 and many others. But I usually had a mid-range PC and cared too much about performance. I remember it kind of fondly, but I was a kid, and it was a great way to learn a lot about the software and hardware around a PC. It really helped me learn English, and that type of PC tinkering is a skill you use your whole life.
 

Thaedolus

Gold Member
I'm so confused. So PC can offer a "far superior" experience compared to consoles, but having to wait anywhere from a few minutes to an hour to precompile shaders before startup is somehow superior to just loading the game and playing?

I haven't actually had to wait for shaders to compile in anything I've played except Warzone 2; it's far from the norm to have to sit there and wait or put up with constant stuttering. Like I said, Game Pass games haven't had issues at all for me; they've run great on day one.

As for my "far superior" experience: yes, I've been playing on PC monitors at 144 Hz with VRR for the last nine years or so, while consoles are only now supporting higher refresh rates and VRR. If you're interested in smooth, responsive gameplay, PC is always going to win. And with a Steam Deck I've basically got a portable PS4-level PC when I want to play on the go, and I can seamlessly pick it up again on my desktop from a cloud save.
 

SlimySnake

Flashless at the Golden Globes
No, I don't have an iGPU. Do you have a single screen or multiple? Do you have any active windows? Do you kill dwm.exe?...
It's system settings. Yesterday the search bar was also over 100 MB; today it's under that. DWM is also only 460 MB today; yesterday it went up to 700 MB, but it seems to be dynamic.




I'm currently at a very taxing part of the game. So far it's been smooth sailing as long as I don't quit to the menu, restart, or fail encounters; that's when performance really goes to shit. But for whatever reason these new areas in Pittsburgh, which look stunning by the way, are just taxing my GPU to its limits: down to 40 fps after a locked 60 with just 80% GPU utilization all game at 4K DLSS Quality. Could be a taxing area, could be a VRAM issue, could be a memory leak. I'm just going to wait until they release more patches.
 

SlimySnake

Flashless at the Golden Globes
Now this is the definition of cherry-picking, as it's famously very likely the best-looking texture ever on PS3.
Yeah, that's what makes it funny ;p

A more accurate comparison would be what DF posted in their video of The Last of Us PS3 version versus medium settings on PC, where the PS3 version, despite its worse lighting and assets, actually has more detailed textures. It's an hour-long video so I can't remember the timestamp, but medium textures in this game look worse than low does in most games.

I will say this: looking at the new Zelda trailer made me realize just how incredible some of those linear PS3 games looked. The Switch was supposed to be more advanced than the PS360, but at the end of the day it runs into the same VRAM and GPU TFLOPS limitations that PS360 open world games like AC Odyssey and Infamous ran into, where the detail simply wasn't there in comparison to linear games like Uncharted, The Last of Us, Call of Duty, RE5, Mass Effect 2, and Killzone. I think GTA5 was the only open world game that gen that looked on par with the best-looking linear games of that gen. I don't know wtf Rockstar did, but it was a huge improvement over GTA4 in terms of texture and asset quality.
 

SlimySnake

Flashless at the Golden Globes
Another point for our "in-house tech specialists" to analyse. There's all this talk that the game just needs that much VRAM, or that it's totally normal behaviour for a game to run with worse textures than its PS3 counterpart while using 10x more memory because "it's next-gen", as if that excuses any shitty look and performance.

Did you all forget this?

[image: The Last of Us Part 1 official PC specs chart]

Naughty Dog themselves recommended 8 GB cards for high presets. That can only mean:
A - They lied
B - They messed something up and/or didn't have time to port the game properly
C - They underestimated the necessary amount of VRAM, which would imply they are so incompetent they don't know how their own in-house engine works.

I'm going with B.
ND lies all the time. Remember the Uncharted 4 teaser that was supposedly running at 1080p 60 fps in real time? Nonsense. Remember when they said they didn't force out Amy Hennig? Turns out they made her sign an NDA, and later on Jason found out that it was indeed creative differences after Neil and Bruce tried to take over Uncharted 4. More recently they showed off a clearly BS version of TLOU2 that they said was running on a PS4 Pro, and they made zero mention of the fact that this game was initially developed by VSG before they took over development; none of the dev diaries give VSG any credit whatsoever.

They also had dynamic resolution scaling literally documented in the PS5 version's settings, but it was patched out after DF found that it wasn't working. They removed the settings text and everything.

DF also said that they were contacted weeks before the game came out about a review copy, but then never heard back until they got the review codes the moment the game went live. There is definitely some shady shit going on. I'm 100% sure they got Iron Galaxy involved at the very end just so they could throw them under the bus; they even name-dropped them in a tweet about how they are working with Iron Galaxy to fix the issues, as if they themselves didn't develop the game.

ND is shady as fuck. Talented, but shady.
Shit/subpar ports happen; it's part and parcel of PC gaming. However, expecting an 8 GB GPU to carry you through next gen when the main consoles have 16 GB to work with is a fool's errand.
Slight correction: the XSX only has 13.5 GB of RAM available to games, and the PS5, according to DF's sources, only has 12.5 GB. Sony never announced it, probably because they wanted to avoid more bad PR after losing the teraflops PR battle.

That said, I agree that it's likely down to Cerny's secret sauce that devs are finally using on consoles, and when it comes time to port this stuff to PC they just fill up VRAM and system RAM and hope it all works. Hogwarts had the same issue. My 16 GB of system RAM was causing all kinds of stutters; I switched to 32 GB and the stutters were pretty much all gone. I didn't get a performance boost, and RT was still a mess, but that's probably because I couldn't increase my VRAM. What was shocking was that the game took 25 GB of system RAM alone with RT on, 33 GB if you also count the VRAM. That's half the fucking game in memory at all times.

Everyone was wondering why the Xbox version was performing so poorly compared to the PS5 version despite an identical CPU, a better GPU and higher RAM bandwidth; well, it's the same thing I saw on PC. It's most likely I/O related, just like Gotham Knights.
 

Guilty_AI

Member
ND lies all the time. Remember the Uncharted 4 teaser that was supposedly running at 1080p 60 fps in real time? Nonsense...
Really? I had heard some of this stuff before, but didn't know it ran so deep. Quite honestly, even C might be a possibility now: they might still have great artists, but perhaps they aren't quite the tech wizards they once were, considering how many key people left the studio between UC4 and TLOU2. It's much easier to hide incompetent software engineering when working with fixed hardware, after all. Curious to see how their MP game will turn out.
 

Loxus

Member
Panajev

Imagine this: 6.2 GB of VRAM at low texture settings @ 1080p, if we can even call those "textures"...
To be honest, I don't think you guys should be comparing older games to newer games in this manner without looking at the texture file sizes.

According to Digital Foundry, that smudged look is the result of high-res textures failing to load because of insufficient VRAM.


We can see this in GTA 5 on consoles when driving fast: the LOD takes a while to switch to the higher-res version.

It's the result of the needed texture not being in VRAM and taking a while to get from storage into VRAM, so pop-in occurs.

We know TLOU Part 1 is using higher-res textures than TLOU Remastered based on install sizes: about 79 GB for TLOU Part 1 versus about 48 GB for TLOU Remastered on PS4.

Assuming TLOU Part 1 is using a mixture of 2K, 4K and 8K textures for larger objects: an uncompressed 4K texture is ~67 MB, and an uncompressed 8K texture is ~269 MB. If the VRAM is already full, it's easy to see why the LOD doesn't switch to the high-res textures.
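Those figures line up with plain RGBA8 math; the sketch below also adds mip chains (about +33%) and BC7 block compression (a common GPU texture format at 1 byte per texel), which is one reason on-disk size and in-VRAM size don't map one-to-one. The format choices are assumptions for illustration, not what TLOU Part 1 actually ships.

```python
# Uncompressed vs block-compressed texture sizes, with and without mip chains.
def texture_mb(size: int, bytes_per_texel: float, with_mips: bool = True) -> float:
    base = size * size * bytes_per_texel
    return base * (4 / 3 if with_mips else 1) / 1e6   # decimal MB

for size in (2048, 4096, 8192):
    print(f"{size}x{size}: uncompressed ~{texture_mb(size, 4, False):.0f} MB "
          f"({texture_mb(size, 4):.0f} MB with mips), "
          f"BC7 ~{texture_mb(size, 1):.0f} MB with mips")
```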

The Lumen in the Land of Nanite UE5 demo, running at 1440p on PS5, was using 8K textures.

And given that TLOU Part 1 has to keep VRAM allocated for the next ~30 seconds of gameplay, it's not surprising that 8 GB of VRAM isn't enough for games built around the PS5 storage architecture.

Modding Skyrim for example shows you how much VRAM higher res textures eat.
 

Hoddi

Member
Another point for our "in-house tech specialists" to analyse...

[image: The Last of Us Part 1 official PC specs chart]

Naughty Dog themselves recommended 8 GB cards for high presets.
Wait, that spec sheet specifies 1080p high for 8 GB GPUs. Aren't many of the complaints from gamers that they can't run those settings at 1440p?
 

Braag

Member
Reality is what it is

Can’t have your cake and eat it too. You can’t highly optimize for hardware that’s not fixed
So should ND just stop patching the game for the next year or so? You know that takes precious resources and manpower from their staff, which they could use to make other games for your PS5.
 

Panajev2001a

GAF's Pleasant Genius
So should ND just stop patching the game for the next year or so? You know that takes precious resources and man power from their staff which they could use to make other games on your PS5.
They will do what makes business sense: they want to keep a decent reputation, and this is a new SKU they get money for; it is not a single purchase that gets you a multiplatform game running on both PC and PS5.

Some of the most egregious bugs you see now are, 99.8% certain, bugs that QA flagged and that are sitting in a backlog somewhere (I do not think they will pull a FROM and never bother with the performance ones).

With that said, the points made in the post you are answering are valid: PC has a lot of great attributes, but making it cheap and easy to optimise games is not one of them (however much of that you pass on to gamers by giving them tons of sliders to balance for a smooth frame rate or the desired quality settings).

It is also true that, with their current business model, GaaS games aside, they still have an incentive to make PS5 the best place to play, because it is where they get a cut of third-party sales (games, DLC, and microtransactions). It is not in their best interest to encourage players to move from PS consoles to PC. That does not mean they will sabotage their PC ports intentionally, but it may mean their PC ports are not day-and-date, and that they may not spend billions to deeply optimise them for every possible HW, OS, and driver version combination (a matrix which is very, very complex on PC… it is just the truth).
 

k_trout

Member
What I really don't understand is: if this is an insufficient-hardware issue like some are saying, then how was my 8 GB of VRAM enough for Cyberpunk with RT but not for TLOU?
 

SABRE220

Member
Reality is what it is

Can’t have your cake and eat it too. You can’t highly optimize for hardware that’s not fixed
My friend, be a little more objective here and hold the developer accountable. Naughty Dog are amazing developers, and this effort does them a disservice. They are not doing charity work here, handing poor PC gamers free games; they are charging full price for a PS4-generation remaster. The game has a last-gen pipeline that would run comfortably on a PS4; there is no major new tech over TLOU2 that even takes advantage of the PS5.

Even average developers have done far better porting their games to PC; there is no excuse here. Having variable specs does not mean you literally release a mess of a port... we have had games shipping on both consoles and PC for two decades now with little issue migrating to PC. It's OK to question your favorite developer when they drop the ball... they aren't saints, they prioritize money like everyone else, and if they aren't held accountable they will go down the path of the lazy money-grabber.
 

Panajev2001a

GAF's Pleasant Genius
Really? I heard some of this stuff before but didn't knew it ran so deep. Quite honestly, even C might be a possibility now, they might still have great artists but perhaps they aren't quite the tech wizards they once were anymore, considering many key people of the studio left between UC4 and TLOU2. It’s much easier to hide incompetent software engineering when working with fixed hardware after all. Curious to see how their MP game will turn out.
I think Slimy is all over the place with his criticism of shady things inside ND (politics abound at any company; read Masters of Doom to see how Carmack behaved, for example, and yes, I am as pissed as anyone about Amy H being pushed out). I am not sure how the crediting of people from VSG (were the people themselves credited but not the studio? Did anyone really comb through this, or is it an urban myth like "the UK does not teach about colonialism and slavery in schools"?) or the conspiracy theory about Iron Galaxy (who had similar issues on a console-to-PC port before) being used by ND to deflect blame, etc., has anything to do with the skills of ND's developers and artists.

I do not think working on fixed specs hides incompetence; you need to be highly skilled at what you do to make that HW sing, and ND still houses one of the advanced technology groups (the ICE team) doing shared R&D for SIE WWS. But working on PC is a different ball game: you have an exponential level of complexity, with much more abstracted APIs you do not control, and a huge matrix of OS versions * HW combinations (CPU, GPU, RAM and VRAM speeds and sizes, HDDs and SSDs of different shapes and characteristics, etc.) * driver versions (each with its own performance and support issues). On top of that, PC players' recurring anger every time the console generation starts moving the baseline collides with their overinflated expectations of what their HW is supposed to do (it is easier to have PS5+++++ hardware powering through games designed for the base PS4).

Doing the same level of optimisation for PC titles as for console titles is a mountain of extra work that needs to be paid for; I would not be surprised if Sony ended up buying Iron Galaxy or another support studio on top of Nixxes to help with PC ports.

It is quite certain ND will want to be day one on PC with their Factions (GaaS) title, and that will influence development from the beginning, unlike these late ports that take an optimised console version to a myriad of PC devices and OS/driver combos.
 

Panajev2001a

GAF's Pleasant Genius
My friend, be a little more objective here and hold the developer accountable. Naughty Dog are amazing developers, and this effort does them a disservice...
Sure, the port has issues, but rather than the dev, take it up with the parent company (Sony) and the constraints around the release of the TV show and how much time they had to make a viable port. (Actually respecting the spec recommendations they put out yields far fewer problems: some people having texture issues are playing at 1440p or above on 8 GB cards, and maybe customising settings further, while the recommendation for those cards is 1080p max.)

Downplaying their tech stack, not knowing what use it makes of the PS5 (it was not pitched by ND as PS5-only, so unless you are on the inside, "it could run comfortably on a base PS4" is a bit baseless… for all we know they had to brute-force it to run acceptably, and it would not have run OK on PS4 without a lot more reworking, which again is not free), downplaying the effort required to port an optimised single-console version to the PC platform, downplaying what the PC platform's strengths and weaknesses are, etc… is not doing your argument any favours.

Very often you had PCs with a 4-10x performance advantage over consoles brute-forcing games still designed around the previous generation during a cross-gen phase (or think about a very long generation like the PS3's: PCs at the end of that generation were so far ahead of a PS3 that had launched many years earlier that it is not even funny), and some of the games you mention were multiplatform to begin with… Every generation we have this period of adjustment as developers adapt to the shifting baseline. But do not worry: PCs will keep growing in power while the console baseline will not move for at least another 3-4 years, and then maybe some more, before we have these problems again, etc…
 