
[DF]: The Last of Us Part 1 PC vs PS5 - A Disappointing Port With Big Problems To Address

bender

What time is it?
I'd like to apologize to the PS3 for dragging it into this unjustly

Blame Senua, as he started it.

Because of this thread, PS3 be like:

EqualMisguidedIchthyostega-size_restricted.gif
 

Panajev2001a

GAF's Pleasant Genius
what I really don't understand is, if this is an insufficient hardware issue like some are saying, then how was my 8gbvram enough for cyberpunk with RT but not for Tlou?
Almost like different games with different engines doing different work have something to do with it (btw, ask PS4 and Xbox One players how much they enjoyed CP2077). You also took a curious example, considering the crazy number of bugs it had to sort out for over a year after launch (some of which they are still sorting out). Also, I'm not sure what your experience is beyond the subjective fact that you like it, so it is not an apples-to-apples comparison… which is the same discussion we have with the Year of Linux on the Desktop (vs Windows/macOS) people when they say their experience is "just fine".
 

Panajev2001a

GAF's Pleasant Genius
Seriously? I'm not going to rehash it all in detail, but if you've been paying attention (it may not have affected you personally, which is great):

Spider-Man: Poor CPU utilization and boundedness (especially with RT on), VRAM limitations/perf degradation
Atomic Heart: Stutter, crashing, FPS drops (at launch)
Returnal: Poor CPU utilization and boundedness (especially with RT on), streaming stutters when loading (again, at launch)

I'm so confused. So PC can offer a "far superior" experience compared to consoles, but having to wait anywhere from a few minutes to an hour to precompile shaders before startup is somehow a superior experience to just loading the game and playing? (And having to redo that every time you make a GPU driver or hardware change, BTW.) Then you say APIs and drivers have never been better... but does that mean they are easier to develop for than consoles, even in their current state? Going from "pain in the ass" to less of a "pain in the ass" is still a... "pain in the ass". Also, you do realize that much of the pain with shader compilation stutter is due to how DX12 works and what developers now have to do at the software level, right? No, it's not just a UE4 issue.
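To make the recompile-on-driver-change point concrete: compiled shader/pipeline caches are (conceptually) keyed on the shader plus the GPU and driver that compiled it, so changing either invalidates every entry and forces the whole precompile step again. A simplified sketch, not how any particular driver actually keys its cache:

```python
import hashlib

def pso_cache_key(shader_bytecode: bytes, gpu_model: str, driver_version: str) -> str:
    """Toy cache key for a compiled pipeline-state object.

    Because the key folds in the GPU model and driver version, a driver
    update or hardware swap produces a different key for identical shader
    bytecode -- every cached entry misses, and everything recompiles.
    """
    h = hashlib.sha256()
    for part in (shader_bytecode, gpu_model.encode(), driver_version.encode()):
        h.update(part)
    return h.hexdigest()
```

Same shader, new driver version: different key, cache miss, full recompile.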

I never said console games don't have their issues. S**t, making video games is difficult... PERIOD. But you're missing my point. Using The Last of Us Part 1 as a recent example:

To this point, the legacy of The Last of Us was simply that it was an exceptional game that many put up as one of the best gaming experiences ever. It was a shining example of what the video game medium is capable of, and its quality helped define the reputations of both PlayStation Studios and Naughty Dog as a whole, even resulting in one of the best shows in HBO history. Keep in mind that this game has literally released on three console generations to this point, with various amounts of remastering and remaking of the original... and while there were of course patches to improve things, the discussion and legacy of the game never wavered from it just being an outstanding game.

Now, with the PC port, what is the discussion and legacy? It's about how it doesn't run on this and that SKU. It's about how poor of a port it is, how badly it performs, how it's crashing for many users, how it can't hit this resolution and FPS on this system, etc. No matter what they do from this point on to fix it, these impressions will forever stick when thinking back to the PC version of this game. The legacy is now a bunch of silly memes with super wet characters and black artifacts all over them. I haven't seen a single forum thread or media article discussing just how great a game (at its core) The Last of Us really is with this PC release. Why? (Here it is...) Because oftentimes, the nature of the PC and the recent issues that come with it have made it increasingly difficult for gamers to just enjoy the games they are playing. Admittedly, it's hard to just enjoy a game when you have glitching, stuttering, inconsistent performance, crashing, and other issues you have to debug just to get it to work properly. Sure, not EVERYONE has these issues, but that's the whole point: there isn't just a single PC to consider. Hell, there isn't even a single series or tier of PCs to consider, with hardware spanning a decade or more at times.

Everyone loves to act like "the PC" is just this single monolithic beast of a computer... how many of you realize that most dev studios barely have any RTX 4000-series cards to test their games on (never mind the RX 7000 series)? There may be 1-3 Ada cards at a studio, typically reserved for a few key devs on the team and not the QA lab (for example). At the same time, there may not be any non-RTX cards, DX9-era cards, or Windows 10 systems available to test on either. There's so much trust in and reliance on so many external entities doing their jobs properly and addressing any new bugs just to ensure a game will work correctly.

You hope Microsoft does its job with the Windows OS and the commands your game relies on. You hope that Nvidia's and AMD's drivers function as expected for the full range of SKUs out in the market (beyond what you can test). You hope that Steam, Epic, Xbox, Origin, Battle.net, Uplay, etc. work properly for every user, despite whatever they may be running on their machine. You hope that OBS, GeForce Experience, Logitech, Razer, Norton, and whatever other software is running on the user's machine doesn't mess with the game's functionality and performance. Most importantly, you HOPE that the user will follow instructions, heed warnings, and use "common sense" not to do things that may break the game (e.g. attempt to run ultra settings and RT on a 5-year-old non-RT GPU with only 8GB, or pair their old Intel 6000-series Skylake quad-core CPU with an RTX 4090 and wonder why the perf is so bad :messenger_grinning_smiling:)

Yes, game development in general is a b*t**, but PC development has its own challenges, making it even worse in many ways. When you really think about it, it's literally a miracle that something as complex as a modern video game could ever ship without any egregious issues that would prevent the player from simply enjoying the game.
Well said :).
 
Last edited:

k_trout

Member
Almost like different games with different engines doing different work have something to do with it (btw, ask PS4 and Xbox One players how much they enjoyed CP2077). You also took a curious example, considering the crazy number of bugs it had to sort out for over a year after launch (some of which they are still sorting out). Also, I'm not sure what your experience is beyond the subjective fact that you like it, so it is not an apples-to-apples comparison… which is the same discussion we have with the Year of Linux on the Desktop (vs Windows/macOS) people when they say their experience is "just fine".
I still don't fully understand, so it's an engine issue, not a hardware issue?

edit: just wanted to add, I am genuinely confused about this. I'm not a fanboy for anything (except for maybe Kate Bush lol). I think TLOU is a better game, but Cyberpunk seems like it is technically more advanced, and I had way fewer issues with it than with TLOU. I think it crashed twice in 80 hours, whereas TLOU crashes more than twice every time I play it, and that's on medium settings, well below the VRAM limit.
 
Last edited:

Braag

Member
They will do what makes business sense: they want a decent reputation, and this is a new SKU they get money for; it is not a single purchase that gets you a multiplatform game running on both PC and PS5.

Some of the most egregious bugs you see now are 99.8% bugs that QA flagged and that are in a backlog somewhere (I do not think they will pull a FROM and never bother with the performance ones).

With that said, the points made in the post you are answering are valid: PC has a lot of great attributes, but making it cheap and easy to optimise games is not one of them (despite how much you want to pass that on to gamers by giving them tons of sliders to balance to get a smooth frame rate or the desired quality settings).

It is also true that with their current business model, GaaS games aside, they still have an incentive to make PS5 the best place to play, because it is where they get a cut of third-party sales (games, DLC, and microtransactions). It is not in their best interest to encourage players to move from PS consoles to PC. That does not mean they will endanger their PC ports intentionally, but it may mean their PC ports are not day-and-date, and that they may not spend billions to deeply optimise them for every possible HW, OS, and driver version combination (the matrix of which is very, very complex on PC… it is just the truth).

I asked that because he clearly wanted to state that either Sony doesn't care about the state of this release or Naughty Dog is simply incapable of releasing a working PC version of their game. Either way, PC gamers should just accept that, because his lord and saviour Sony has graced the PC platform with one of their games.
But if they're gonna spend the time and money to fix the PC version anyway, why not do it before release, save some face, and not become the laughing stock of the internet for the coming weeks?

Sony even bought the single best PC porting studio, Nixxes, who could teach their first-party devs a thing or two about making PC games so they could smooth out their releases.
But saying "deal with it" is nothing more than hurt fanboyism, because we all know Sony won't leave the game as it is, since most people would stop buying their barely functioning ports in the future.
 

Filben

Member
I remember when people argued for over a decade that your GPU is too slow anyway by the time you run out of VRAM.

Although they might have been right for a long time, this doesn't seem that accurate anymore this generation.
 

winjer

Gold Member
what I really don't understand is, if this is an insufficient hardware issue like some are saying, then how was my 8gbvram enough for cyberpunk with RT but not for Tlou?

CP2077 has a ton of low-res textures. Just compare the Reworked Project to the original textures.
A lot of them look like something out of the PS2 era.


But the thing is, when using quality textures, CP2077 can use up the full 24GB of VRAM on a 4090.
 

k_trout

Member
CP2077 has a ton of low-res textures. Just compare the Reworked Project to the original textures.
A lot of them look like something out of the PS2 era.


But the thing is, when using quality textures, CP2077 can use up the full 24GB of VRAM on a 4090.

ok thanks, I know very little about these kinds of things and want to filter out all the noise around this issue before I commit to an unwanted (but potentially necessary) hardware upgrade
 
Last edited:

winjer

Gold Member
ok thanks, I know very little about these kinds of things and want to filter out all the noise around this issue before I commit to an unwanted hardware upgrade

Mind you, CP2077 does a lot of impressive graphical stuff. That is why it's so highly praised for its visuals.
But the texture work is a bit shoddy. Not only are the textures a bit low-res, but a lot of them even have compression artifacts.
If you are going to play CP2077 in the near future, the Reworked Project is a must.
It was made by the same guy who improved the textures and meshes in the Witcher 3 next-gen update.
 
Last edited:

Guilty_AI

Member
Wait, that spec sheet specifies 1080p high for 8GB GPUs. Aren't many of the complaints from gamers because they can't run those at 1440p?
I'm seeing complaints all across the spectrum. You don't get a mostly negative score on Steam with problems at one resolution setting alone.
 
Last edited:

Corndog

Banned
Last edited:

Marlenus

Member
No, this time they are actually right.

The Oodle library that came packaged with the game is broken or something and unnecessarily hammers the CPU.
There's an unofficial fix for that at least, and it eases the strain on the CPU (proof it's not the CPU's fault)... but I believe patch 2 already kinda fixed that, so you might not need the unofficial fix depending on your system... you'll have to test yourself.
The fact the community got that fixed early tells you that it's something ND missed in the port, not that the 3600 is ancient and couldn't possibly handle this game.
Because, if you watch the video, you will see that once the "loading" is done, the game runs "well" on the 3600.


And as for 8GB of VRAM:
Locking players to medium is fine... I'm all for it even (not really, let me suffer)... if you want Ultra textures, build a machine that can handle Ultra settings... but what they call medium is a farce.








fMF4ThX.png


There is no excuse for this level of texture quality and they should feel bad.

It is easier to have higher quality textures when there is a lot of repetition. Those concrete blocks above the P share the same texture. There is a repeating pattern in the brickwork, and the floor texture looks pretty simple.

Compared to TLOU, where the three floor slabs are all different, the brickwork does not repeat, plus there's the other stuff in shot and all the stuff immediately around the camera.

That is the trade-off. If you want to optimise TLOU and keep the variety of assets, then you need to lower the quality of the assets.

Also, an 8GB card running a PS5 game, which itself has 10-12GB of usable VRAM, seems a bit of an odd choice. A 3060 12GB or a downclocked 6700XT would be far better choices to ensure you are not hitting an issue with the game trying to keep VRAM usage below the available limit.
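A back-of-the-envelope sketch of why asset variety costs VRAM: a repeated texture is stored once no matter how many surfaces reference it, so only the unique textures count toward the budget. The texture size, format, and mip overhead below are generic assumptions, not figures from either game:

```python
def texture_memory_mb(width, height, bytes_per_texel=4, mip_chain=4/3):
    """Approximate VRAM footprint of one texture (uncompressed RGBA8 by
    default); a full mip chain adds roughly one third on top."""
    return width * height * bytes_per_texel * mip_chain / 2**20

def scene_texture_memory_mb(surface_count, unique_fraction, tex_w=2048, tex_h=2048):
    """Only unique textures are resident: a scene that reuses one brick
    texture everywhere pays for it once, a scene of one-off assets pays
    for every surface."""
    unique_textures = max(1, round(surface_count * unique_fraction))
    return unique_textures * texture_memory_mb(tex_w, tex_h)
```

With these assumptions, a scene where every surface has its own texture costs five times the VRAM of one with 80% reuse, which is the trade-off described above.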
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
It is easier to have higher quality textures when there is a lot of repetition. Those concrete blocks above the P share the same texture. There is a repeating pattern in the brickwork, and the floor texture looks pretty simple.

Compared to TLOU, where the three floor slabs are all different, the brickwork does not repeat, plus there's the other stuff in shot and all the stuff immediately around the camera.

That is the trade-off. If you want to optimise TLOU and keep the variety of assets, then you need to lower the quality of the assets.

Also, an 8GB card running a PS5 game, which itself has 10-12GB of usable VRAM, seems a bit of an odd choice. A 3060 12GB or a downclocked 6700XT would be far better choices to ensure you are not hitting an issue with the game trying to keep VRAM usage below the available limit.
TLOU2 runs on a base PS4 with bigger, better environments and better, more complex textures than Part 1... and the texture quality absolutely is NOT what Medium environment textures look like.
This is clearly some bullshit; no one in their right mind actually thinks this is what Medium textures should look like.
 
It is easier to have higher quality textures when there is a lot of repetition. Those concrete blocks above the P share the same texture. There is a repeating pattern in the brickwork, and the floor texture looks pretty simple.

Compared to TLOU, where the three floor slabs are all different, the brickwork does not repeat, plus there's the other stuff in shot and all the stuff immediately around the camera.

That is the trade-off. If you want to optimise TLOU and keep the variety of assets, then you need to lower the quality of the assets.

Also, an 8GB card running a PS5 game, which itself has 10-12GB of usable VRAM, seems a bit of an odd choice. A 3060 12GB or a downclocked 6700XT would be far better choices to ensure you are not hitting an issue with the game trying to keep VRAM usage below the available limit.
On the PS5, there's only 12.5GB available to developers. Do you really think developers are allocating 12GB just for the GPU?

Just to let you know, on PS4 developers were allocating around 50-60% of available memory just to the GPU.
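The arithmetic behind that point is simple; the ~12.5 GB figure and the 50-60% PS4-era split come from the posts above, and applying one to the other is purely illustrative, not an official breakdown:

```python
def gpu_vram_budget_gb(usable_gb=12.5, gpu_share=(0.5, 0.6)):
    """Range of unified memory a console game might dedicate to GPU data,
    given the memory available to developers and a rough GPU/CPU split."""
    lo, hi = gpu_share
    return usable_gb * lo, usable_gb * hi
```

That lands at roughly 6.25-7.5 GB of GPU-side data, which is why an 8GB card can be marginal even against a 12.5 GB console budget.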
 

winjer

Gold Member
If this isn't proof they're leaning heavily on I/O, I don't know what else can be said.

The I/O can save some memory on the PS5, mostly by shifting cache data in and out of memory faster.
But let's remember that the SSD is orders of magnitude slower than GDDR6. It cannot keep up with a rendering engine and its buffers.
And to make things worse, the PS5 does not support sampler feedback.
 

SlimySnake

Flashless at the Golden Globes
They have been heavily relying on streaming since Crash 1 on the PS1. It checks out.
Then how did the PS4 run TLOU Part 2 so well? It didn't have Cerny's I/O.

Why did Uncharted 4, which is based on the same engine, work fine on PC?

I really don't understand what's so special about TLOU's textures anyway. The buildings still have a really flat look to them. I don't think they are using any kind of tessellation. Demon's Souls and RE4 have way better looking assets, though to be fair, they both feature older buildings with hand-placed bricks.

I fully expected to be let down by RE4's visuals after coming from TLOU1, but to my surprise it looks better in pretty much every way: better looking character models, way better lighting, and more object detail. TLOU's interiors are way too empty. It might be by design, but it just looks cheap and incomplete. Every tunnel and indoor area in RE4 is brimming with detail. That game should be hitting VRAM a lot harder with all its volumetrics and other baked lighting effects in every single area.

P.S. TLOU's 1440p presentation is really blurry and has a smudgy look. If that's what you guys got on consoles, I feel bad, because DLSS at 4K looks way better than this.
 

rofif

Banned
TLOU2 runs on a base PS4 with bigger, better environments and better, more complex textures than Part 1... and the texture quality absolutely is NOT what Medium environment textures look like.
This is clearly some bullshit; no one in their right mind actually thinks this is what Medium textures should look like.
These are distant-LOD quality textures, not low or medium. Seriously, there should not be textures like that in there at all.
 

SmokedMeat

Gamer™


Apparently ND’s releasing a quick fix today and a patch Friday. They’ve said they’re going to focus on fixing the PC version before moving onto the Deck.

I know AMD just released an Adrenaline update with optimizations for Last of Us, but I won’t be able to try that out until later.
 

yamaci17

Member
It's System Settings. Yesterday the search bar was also over 100 MB; today it's under. DWM is also only 460 MB today. Yesterday it went up to 700 MB, but it seems to be dynamic.

Xyjv5VR.jpg



I am currently at a very taxing part of the game. So far it's been smooth sailing as long as I don't quit to the menu, restart, or fail encounters; that's when performance really goes to shit. But for whatever reason, these new areas in Pittsburgh (which look stunning, btw) are just taxing my GPU to its limits. Down to 40 fps after being locked at 60 with just 80% GPU utilization all game at 4K DLSS Quality. Could be a taxing area, could be a VRAM issue, could be a memory leak. I'm just going to wait until they release more patches.

The game is about 30% more taxing relative to console power.

4K DLSS Quality looks better than native 4K, so I wouldn't sweat it (also, 4K DLSS Quality has a rendering cost around native 1700p, and even 4K DLSS Performance has a cost around native 1440p. Let's not forget that, and don't say you get 1440p-level performance! That's important. You can check benchmarks; DLSS upscaling is pretty costly because some buffers stay at native resolution, which hurts performance compared to native 1440p).

Restarting or failing encounters, hmm, that's a bit odd. I didn't get any problems playing the regular game. Other fps drops could be related to the CPU, and yeah, the game is pretty horrible on that front as well. I did notice that if I restart the same map, it will take a long time to reload for some reason.

Since you're on Windows 10, you can disable "background applications". That way, programs will never sit in VRAM when they're turned off.

taskkill /f /im systemsettings.exe
taskkill /f /im dwm.exe

You can put commands like these in a batch file and run it automatically before playing a game. Most of these applications will restart, just with lower VRAM usage.

If you want true peak idle VRAM savings... you can kill explorer.exe and the stuff that's bundled with it :messenger_tears_of_joy: You can use Task Manager to open Windows Explorer again later.

taskkill /f /im explorer.exe
taskkill /f /im textinputhost.exe
taskkill /f /im searchhost.exe
taskkill /f /im startmenuexperiencehost.exe
taskkill /f /im shellexperiencehost.exe
taskkill /f /im lockapp.exe
taskkill /f /im systemsettings.exe
taskkill /f /im dwm.exe

When you do this, most of the stuff that restarts will stay dead. With this I can achieve 300 MB of idle usage; without it, 500 MB (hence I played at 4K DLSS Performance with the in-game bar at 7.5 GB VRAM).

I may try testing Pittsburgh later; I'm way ahead of there. Sadly I play at a locked 40, as my Ryzen 2700 is not up to the task of a locked 50/60. What was your CPU again? If it's a Zen 2 counterpart... sadly, yes... attaining a locked 60 fps in terms of CPU-boundness with Zen 2 CPUs is near impossible, which is why I settled on a locked 40 fps at 4K DLSS Performance.
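If you'd rather not paste the commands by hand, the taskkill list above can be wrapped in a small Python script. The process list is copied from the post; this is Windows-only, and whether killing dwm.exe is safe on your setup is your call:

```python
import subprocess
import sys

# Background processes from the post above that idle with a little VRAM each.
VRAM_HOLDERS = [
    "textinputhost.exe",
    "searchhost.exe",
    "startmenuexperiencehost.exe",
    "shellexperiencehost.exe",
    "lockapp.exe",
    "systemsettings.exe",
    "dwm.exe",
]

def build_taskkill_commands(names):
    """One `taskkill /f /im <name>` invocation per process name."""
    return [["taskkill", "/f", "/im", name] for name in names]

def free_idle_vram(dry_run=True):
    """Kill the listed processes; with dry_run (or off Windows) just print."""
    for cmd in build_taskkill_commands(VRAM_HOLDERS):
        if dry_run or sys.platform != "win32":
            print(" ".join(cmd))
        else:
            subprocess.run(cmd, check=False)  # ignore already-dead processes
```

Run `free_idle_vram(dry_run=False)` before launching a game; as noted above, most of these restart on their own with a smaller footprint.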
 
Last edited:

yamaci17

Member
SlimySnake

here's an anecdotal comparison


4K DLSS Performance vs native 1440p

4K DLSS Performance has better image quality at the same performance. Do not corner yourself into a psychological box by thinking that it is rendering at 1080p; it is leagues above what 1080p/1440p can muster in terms of image quality.

and a movement comparison


Notice how 1440p native TAA turns the grass into mush, whereas even 4K DLSS Performance keeps it crisp. People sure as hell sleep on 4K DLSS Performance a lot.

4K DLSS Quality of course is another beast. It will be much, much costlier than native 1440p while looking much, much better at the same time.

I hate seeing people online say "4K DLSS Quality is 1440p anywaysss". I hate that.

3060 Ti, native 1440p: 81 fps



3060 Ti, 4K DLSS Quality: 64 fps... 4K DLSS Performance: 81 fps


so:

4K DLSS Performance's rendering cost equals, or is near, native 1440p's rendering cost while LOOKING BETTER!

4K DLSS Quality is 20% costlier than native 1440p while also looking MUCH better!
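For reference, the internal render resolutions behind those presets are easy to compute from the commonly documented per-axis scale factors (Quality ≈ 66.7%, Balanced ≈ 58%, Performance 50%, Ultra Performance ≈ 33.3%). The upscale pass then adds a roughly fixed cost on top, which is why 4K DLSS Performance costs about as much as native 1440p rather than native 1080p:

```python
# Per-axis render-scale factors for the standard DLSS presets
# (as commonly documented; exact values can vary by title).
DLSS_SCALE = {
    "quality": 2 / 3,
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Resolution the GPU shades before DLSS upscales to out_w x out_h."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)
```

So 4K DLSS Performance shades 1920x1080 pixels and 4K Quality shades 2560x1440, but both still pay native-4K cost for any buffers kept at output resolution.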
 
Last edited:

SlimySnake

Flashless at the Golden Globes
The game is about 30% more taxing relative to console power.

4K DLSS Quality looks better than native 4K, so I wouldn't sweat it (also, 4K DLSS Quality has a rendering cost around native 1700p, and even 4K DLSS Performance has a cost around native 1440p. Let's not forget that, and don't say you get 1440p-level performance! That's important. You can check benchmarks; DLSS upscaling is pretty costly because some buffers stay at native resolution, which hurts performance compared to native 1440p).

Restarting or failing encounters, hmm, that's a bit odd. I didn't get any problems playing the regular game. Other fps drops could be related to the CPU, and yeah, the game is pretty horrible on that front as well. I did notice that if I restart the same map, it will take a long time to reload for some reason.

Since you're on Windows 10, you can disable "background applications". That way, programs will never sit in VRAM when they're turned off.

taskkill /f /im systemsettings.exe
taskkill /f /im dwm.exe

You can put commands like these in a batch file and run it automatically before playing a game. Most of these applications will restart, just with lower VRAM usage.

If you want true peak idle VRAM savings... you can kill explorer.exe and the stuff that's bundled with it :messenger_tears_of_joy: You can use Task Manager to open Windows Explorer again later.

taskkill /f /im explorer.exe
taskkill /f /im textinputhost.exe
taskkill /f /im searchhost.exe
taskkill /f /im startmenuexperiencehost.exe
taskkill /f /im shellexperiencehost.exe
taskkill /f /im lockapp.exe
taskkill /f /im systemsettings.exe
taskkill /f /im dwm.exe

When you do this, most of the stuff that restarts will stay dead. With this I can achieve 300 MB of idle usage; without it, 500 MB (hence I played at 4K DLSS Performance with the in-game bar at 7.5 GB VRAM).

I may try testing Pittsburgh later; I'm way ahead of there. Sadly I play at a locked 40, as my Ryzen 2700 is not up to the task of a locked 50/60. What was your CPU again? If it's a Zen 2 counterpart... sadly, yes... attaining a locked 60 fps in terms of CPU-boundness with Zen 2 CPUs is near impossible, which is why I settled on a locked 40 fps at 4K DLSS Performance.
It was a memory leak. I loaded that same section last night and it was a locked 60 fps at 4K DLSS with room to spare; no drops to 40 fps that I could find. I had been playing for a few hours that day, so the game must have freaked out.

I did change the volumetrics to medium, among some other settings. I didn't realize I had everything set to ultra aside from textures. It dropped my VRAM usage from 9.1 to 8.5 GB. That said, at native 4K, even though I was getting 50+ fps, just turning the camera after walking around causes a massive stutter. That could be a VRAM bottleneck, because otherwise my GPU has enough power to get close to 55 fps at native 4K.

I am guessing they are doing that horizon-culling thing that Jason Schreier pointed out. If I do a quick turn, they must be loading data from system RAM into VRAM. I had figured that data would always be in VRAM and the GPU would simply not render it, but I guess this game is different.
 

yamaci17

Member
It was a memory leak. I loaded that same section last night and it was a locked 60 fps at 4K DLSS with room to spare; no drops to 40 fps that I could find. I had been playing for a few hours that day, so the game must have freaked out.

I did change the volumetrics to medium, among some other settings. I didn't realize I had everything set to ultra aside from textures. It dropped my VRAM usage from 9.1 to 8.5 GB. That said, at native 4K, even though I was getting 50+ fps, just turning the camera after walking around causes a massive stutter. That could be a VRAM bottleneck, because otherwise my GPU has enough power to get close to 55 fps at native 4K.

I am guessing they are doing that horizon-culling thing that Jason Schreier pointed out. If I do a quick turn, they must be loading data from system RAM into VRAM. I had figured that data would always be in VRAM and the GPU would simply not render it, but I guess this game is different.
Are those physical stutters that you see on the frametime graph, or visual ones that you see on screen? Are you playing with a mouse or gamepad? Have you tried Nvidia's frame limiter cap? This game has some camera hitch issues with the mouse.

I've been having smooth sailing with a gamepad, but slow analog movement maybe helps. I will do a quick-turn test on my end and see what happens (I have to admit I never use quick turning :D )

edit: I'm dumb. As you use quick turn, you most likely use a gamepad too, hopefully a DualSense or DualShock?
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Gonna try on my old trusty 2080 Ti when I have a chance. So far on the 4090, it runs pretty much flawlessly. To think that this card is already almost 5 years old. It aged unbelievably well, especially for one that was maligned at launch.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Are those physical stutters that you see on the frametime graph, or visual ones that you see on screen? Are you playing with a mouse or gamepad? Have you tried Nvidia's frame limiter cap?

I've been having smooth sailing with a gamepad, but slow analog movement maybe helps. I will do a quick-turn test on my end and see what happens (I have to admit I never use quick turning :D )
It wasn't even a quick turn. I just walked to one end of that level and turned around. Massive drops from 55 fps to 5 fps. I don't have frametime display enabled. No idea how to get MSI Afterburner to display the 1% lows; I only see average framerate (the D3D one).

I play with a gamepad. This only happens at native settings. DLSS Quality pushes my VRAM requirements to 10 GB, which is just at the game's requirement. Native pushes it to 11.3 GB, and that's when I get stutters like crazy, randomly. It has to be a VRAM thing, because honestly I wouldn't mind playing this at 40 fps maxed out; my GPU actually ran ultra settings at around 35-40 fps. With a few downgrades here and there, I should be able to run it at native 4K and a locked 40 fps if it were not for my VRAM usage.
 

yamaci17

Member
It wasn't even a quick turn. I just walked to one end of that level and turned around. Massive drops from 55 fps to 5 fps. I don't have frametime display enabled. No idea how to get MSI Afterburner to display the 1% lows; I only see average framerate (the D3D one).

I play with a gamepad. This only happens at native settings. DLSS Quality pushes my VRAM requirements to 10 GB, which is just at the game's requirement. Native pushes it to 11.3 GB, and that's when I get stutters like crazy, randomly. It has to be a VRAM thing, because honestly I wouldn't mind playing this at 40 fps maxed out; my GPU actually ran ultra settings at around 35-40 fps. With a few downgrades here and there, I should be able to run it at native 4K and a locked 40 fps if it were not for my VRAM usage.
Yeah, interesting. Can you name the exact encounter so I can give it a try at 115% VRAM usage and see if it reproduces the problem on my end?

Help me understand: is the game application at 9.1 GB usage for you? So you only get no frame drops with the game application's VRAM at 8 GB?
 
Last edited:

yamaci17

Member
It's the Financial District.
Okay, so for starters I had to reduce my VRAM usage to 7.3 GB, because Game Bar recording takes around 200-250 MB. So I had to use native 1440p.

I made a lot of erratic turns with the mouse and couldn't trigger hitches or stalls with 7.3 GB VRAM usage (7.5 GB in-game, though).

Is it happening after you play for a while, or should I try a more prolonged test? Mind you, I'm on 16 GB of RAM too, so my RAM is also stressed. The 45 fps cap is because... my 2700 craps out with anything above that. Even 45 fps is pushing it, but luckily this map ran all right.


So native 1440p with high textures is possible while recording a video (but I'm sure recording the video will sooner or later break the game).

Try 11 GB settings maybe?

I'm starting to believe what I'm seeing is an extreme outlier. Maybe I should just keep it to myself and just enjoy the games from now on. Dunno.
 
Last edited:

Hoddi

Member
It wasn't even a quick turn. I just walked to one end of that level and turned around. Massive drops from 55 fps to 5 fps. I don't have frametime display enabled. No idea how to get MSI Afterburner to display the 1% lows; I only see average framerate (the D3D one).

I play with a gamepad. This only happens at native settings. DLSS Quality pushes my VRAM requirements to 10 GB, which is just at the game's requirement. Native pushes it to 11.3 GB, and that's when I get stutters like crazy, randomly. It has to be a VRAM thing, because honestly I wouldn't mind playing this at 40 fps maxed out; my GPU actually ran ultra settings at around 35-40 fps. With a few downgrades here and there, I should be able to run it at native 4K and a locked 40 fps if it were not for my VRAM usage.
It's 100% a VRAM issue. Running at native 4K on my 11GB card often completely saturates the PCIe bus, which never happens at 1440p. Instead of just uploading data to the GPU, it will now start copying data back into system RAM, which is what causes performance to plummet. You can also see how it raises the memory used by the game below.

C2iWJ4l.png
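That explanation can be captured in a toy model: once the working set no longer fits in VRAM, the spilled fraction of each frame's memory traffic crosses the PCIe bus at a small fraction of GDDR6 speed. The bandwidth and traffic figures below are round-number assumptions for illustration, not measurements:

```python
def frame_time_ms(working_set_gb, vram_gb,
                  traffic_gb_per_frame=2.0,   # assumed per-frame memory traffic
                  vram_bw_gbs=600.0,          # assumed on-card GDDR6 bandwidth
                  pcie_bw_gbs=25.0):          # assumed practical PCIe 4.0 x16 rate
    """Toy model: the fraction of the working set that overflows VRAM is
    serviced over PCIe (~25x slower), so even a small spill dominates the
    frame time."""
    spill = max(0.0, (working_set_gb - vram_gb) / working_set_gb)
    seconds = traffic_gb_per_frame * ((1 - spill) / vram_bw_gbs + spill / pcie_bw_gbs)
    return seconds * 1000.0
```

With these numbers, an 11 GB card whose working set grows from 10 GB to 12 GB roughly triples its frame time, matching the "55 fps to 5 fps" cliff described above in spirit if not in magnitude.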
 
Last edited:
How the hell didn't Sony put Nixxes on this way before? Like... they bought them for this, and they didn't think it would be good to have their premium porting studio taking care of TLOU? It's honestly baffling to me.

I know Naughty Dog wanted to do it... but at what cost?
 

bender

What time is it?
How the hell didn't Sony put Nixxes on this way before? Like... they bought them for this, and they didn't think it would be good to have their premium porting studio taking care of TLOU? It's honestly baffling to me.

I know Naughty Dog wanted to do it... but at what cost?

Probably a case of giving ND too big of a leash, unless the porting studios they acquired were busy working on something else; but like you mentioned, you'd figure this franchise would take priority. I do buy the theory that it was rushed to take advantage of the wave of excitement generated by the show.
 
Probably a case of giving ND too big of a leash, unless the porting studios they acquired were busy working on something else; but like you mentioned, you'd figure this franchise would take priority. I do buy the theory that it was rushed to take advantage of the wave of excitement generated by the show.
Supposedly (and I have no idea where I read this) Nixxes is working on Ratchet & Clank after both Spider-Man games.
Sony probably trusted Naughty Dog way too much with this.

I'm sure Naughty Dog themselves must feel like crap right now... but still, it's an unfortunate situation, and once the hype dies, the game won't do great numbers on PC anymore. It's literally at #43 right now, while the Resident Evil 4 remake that came out a few days before is still at #2.
 

kingyala

Banned
Someone should forward this to the great Alex to show him how clueless he is in the context of "VRAM explosion". But he would of course choose to stay in denial due to his fanaticism anyway.
It doesn't matter if you link him this video. He obviously knows what's going on, as he speaks to devs and they tell him this, but he chooses to ignore it, because he denied the PS5's decompression and I/O advantage ever since UE5 was demoed with billions of polygons and 8K textures on PS5. He also compared the PS5 and Series X to the 2070 Super and convinced his followers that a midrange PC with a 2070S would cover PC gamers for the generation... and now, as next-gen games are coming out, he has nowhere to run or hide. All he can do now is refuse to agree and keep lying as much as he can to cover his fanaticism and obvious idiocy.
 

bender

What time is it?
Supposedly (and I have no idea where I read this) Nixxes is working on Ratchet & Clank after both Spider-Man games.
Sony probably trusted Naughty Dog way too much with this.

I'm sure Naughty Dog themselves must feel like crap right now... but still, it's an unfortunate situation, and once the hype dies, the game won't do great numbers on PC anymore. It's literally at #43 right now, while the Resident Evil 4 remake that came out a few days before is still at #2.

You'd hope they can get a fix pumped out quickly, but I guess it boils down to how fundamentally flawed the port is. I'd also assume it's all hands on deck to assess this. If the fix is going to take considerable time, Sony could do themselves a favor on the PR front by working with storefronts to extend refund windows and offering purchasers deep discount coupons to repurchase once it's fixed.
 

Senua

Gold Member
Supposedly (and I have no idea where I read this) Nixxes is working on Ratchet & Clank after both Spider-Man games.
Sony probably trusted Naughty Dog way too much with this.

I'm sure Naughty Dog themselves must feel like crap right now... but still, it's an unfortunate situation, and once the hype dies, the game won't do great numbers on PC anymore. It's literally at #43 right now, while the Resident Evil 4 remake that came out a few days before is still at #2.
Hopefully they can do a Horizon Zero Dawn and properly turn it around; they seem very active and are working hard on the updates. Just a shame it wasn't done before they sold it for fifty fuckin' squid, init!
 