
The Last of Us Part 1 on PC: another casualty added to the list of bad ports?

Mephisto40

Member
All I'm getting from this thread is people trying to run the game at 4K ultra on rigs that can't handle it, then complaining about it and blaming the game lol
 

ShadowNate

Member
I guess I'll wait until they fix the issues and allow it to perform well on somewhat weaker hardware. Which is to say, I'm not upgrading for this title alone, and I shouldn't have to.

Too bad, I was looking forward to this on PC and was pretty sure I'd buy it day one. Up until the delay happened, followed by the news of Iron Galaxy's involvement and the lack of early reviews. All reliable red flags, it seems.
 

Bojji

Member


Fuck 8GB of VRAM!
 

DrFigs

Member
My comment was for you PlayStation fanboys who like to come into every PC thread, acting as if the benchmarks you're seeing on PC are equivalent to what your PS5 is running.
I think you're right. 4K ultra on PC will look better than the PS5 version, which apparently also barely gets above 30fps. But from an outsider, non-PC-gamer perspective, I guess I'm confused as to whether these issues are just people trying to run the game at settings above their system's capability. Naughty Dog did post the recommended specs, and it seems like people think they should be able to ignore them?
 
Last edited:

SmokedMeat

Gamer™
I think you're right. 4K ultra on PC will look better than the PS5 version, which apparently also barely gets above 30fps. But from an outsider, non-PC-gamer perspective, I guess I'm confused as to whether these issues are just people trying to run the game at settings above their system's capability. Naughty Dog did post the recommended specs, and it seems like people think they should be able to ignore them?

It’s a mix.

There are legitimate issues with the port that need to be fixed. There seems to be a memory leak, not to mention pauses in the middle of gameplay while the game loads, which is unheard of. Not really a huge deal, as I'm sure it'll all be handled pretty quickly.
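For anyone who wants to check the leak claim on their own machine rather than take it on faith, here's a minimal watcher sketch; the process name is an assumption, so check Task Manager for the real executable name on your install:

```python
import time
import psutil  # third-party: pip install psutil

# Hypothetical process name -- verify against Task Manager.
GAME = "tlou-i.exe"

proc = next((p for p in psutil.process_iter(["name"])
             if (p.info["name"] or "").lower() == GAME), None)
if proc is None:
    raise SystemExit(f"{GAME} not running")

# Log the working set every 10 s; steady growth while idling in one
# spot in-game is a decent hint of a leak.
try:
    while True:
        rss_mb = proc.memory_info().rss / 2**20
        print(f"{time.strftime('%H:%M:%S')}  RSS: {rss_mb:,.0f} MB")
        time.sleep(10)
except psutil.NoSuchProcess:
    print("game exited")
```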

Then there’s the folks trying to run the game at all ultra settings (and there’s a lot of settings that can be tuned) and finding they can’t. A big part of that is running out of VRAM, and it’s quite possible that can be optimized better.

This is all my layman’s view lol. There’s way more knowledgeable PC folks around here.
 

Agent_4Seven

Tears of Nintendo
Exactly this.

This basically summarises PC vs console right now.
I'll go even further and say this: the PC platform is a giant factory producing overpriced-af hardware for people who can't stand 30FPS. Instead of squeezing everything out of their hardware and playing PC ports with a controller (which is ideal for 30FPS), they're buying overpriced shit every 2 years just to get a miserable 30FPS bump (if that) to minimum framerates (which are way more important than average and max), now for $1,000+. And until these people wake the fuck up and start voting with their wallets, nothing will ever change, and prices will get even higher for even less than the bare minimum.

I've got a Z390 Aorus Master, an 8700K, and 32 gigs of RAM, which I bought to replace my outdated 4790K platform after almost five solid years of service. My current rig with a 3070 is more than enough for 99% of games at 4K and even higher, be it old games or some of the new ones. I couldn't care less if 60 isn't an option (30 is also fine by me if it's 100% stable); I plug in one of my controllers and play at a higher resolution if I can, while others spend thousands of dollars and one day run into a wall of shit like this TLOU PC port, or something else, which is inevitable.

I don't care about BS PC ports with RAM and VRAM leaks and 0% CPU utilization; I just don't play them if they're 100% unplayable. I also don't need to play absolutely every new game, let alone at launch. I know that I'm in the minority, but I don't care, because I'm voting with my wallet, and that matters more: greedy fucks only understand the language of money, and nothing else will ever change their decisions or make our lives and wellbeing better.
 
Last edited:

dotomomo

Neo Member
I'm feeling pretty validated about my decision to return my 3060ti that I bought a few weeks ago in favour of a 6700XT.

Shout out to Daniel Owen's YouTube channel for his benchmarking analysis videos; he's been sounding the alarm about changing VRAM standards for over a year.

We've seen several high-profile games in 2023 that need 12GB+ of VRAM as a minimum. This is the new norm, not some outlier anomaly. This version of TLOU has some seriously detailed assets; do people really think every developer has the budget and resources to build *highly* scalable LOD/mipmap management for every performance tier?
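For a rough sense of the arithmetic behind those budgets, here's a back-of-the-envelope sketch; the sizes and formats are illustrative assumptions, not TLOU's actual asset data:

```python
def mip_chain_bytes(width: int, height: int, bytes_per_texel: float) -> int:
    """Memory for a texture plus its full mip chain (down to 1x1)."""
    total = 0
    while True:
        total += int(width * height * bytes_per_texel)
        if width == 1 and height == 1:
            return total
        width, height = max(1, width // 2), max(1, height // 2)

# One 4096x4096 map, uncompressed RGBA8 (4 B/texel) vs BC7 (1 B/texel).
# The full mip chain adds roughly a third on top of the base level.
print(f"RGBA8: {mip_chain_bytes(4096, 4096, 4) / 2**20:.1f} MiB")  # ~85 MiB
print(f"BC7:   {mip_chain_bytes(4096, 4096, 1) / 2**20:.1f} MiB")  # ~21 MiB
# A single material often stacks several such maps (albedo, normal,
# roughness...), so VRAM budgets climb quickly at high texture tiers.
```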
 
Last edited:

Raphael

Member
Nvidia must be shitting bricks right now with their laughable 8GB of VRAM on mid-range 40-series cards.
After Hogwarts Legacy and this, people might finally reconsider Nvidia in favour of AMD. Now if AMD is a little less greedy with the 7600-7700 XT cards, they might actually convert people and win a lot of market share.
Looking at Nvidia's offerings, though, I'm sure it's still going to be pricey.
 
Last edited:

Stuart360

Member
We've seen several high-profile games in 2023 that need 12GB+ of VRAM as a minimum. This is the new norm, not some outlier anomaly.
No we haven't, and no it's not.
Sure, maybe if you're playing at native 4K, ultra settings, with full ray tracing on.
This game, for me, uses 7GB of VRAM at 1080p on High settings, including High textures.
And I still haven't seen a game use more than 8GB of VRAM at 1080p.
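For anyone who wants to sanity-check numbers like that on their own machine, nvidia-smi (bundled with NVIDIA's drivers) can poll them. A minimal sketch, with the caveat that it reports VRAM allocated GPU-wide, which isn't necessarily what the game strictly needs:

```python
import subprocess
import time

# Polls total VRAM allocation once per second via nvidia-smi.
# NVIDIA-only; AMD users would need a different tool.
while True:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # First line covers GPU 0; values are in MiB.
    used, total = (int(v) for v in out.strip().splitlines()[0].split(","))
    print(f"VRAM: {used} / {total} MiB")
    time.sleep(1)
```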
 

GHG

Member
Shout out to Daniel Owen's YouTube channel for his benchmarking analysis videos; he's been sounding the alarm about changing VRAM standards for over a year.

This is the only guy doing good work in the PC gaming space on YouTube at the moment. He actually takes the time to benchmark games at various settings on various cards and then analyse what's being used and what isn't instead of just looking at benchmark graphs and saying "this is fine".

Anyone sensible knew that both system RAM and VRAM requirements were going to go up as soon as next-gen-only games started to arrive, and that's not even accounting for games that run poorly. The writing has been on the wall for a very long time, but we still had people recommending 8GB VRAM cards, small amounts of system RAM, and low core/thread-count CPUs, saying "it's enough" to max games out because "on paper" that's more powerful than the "next gen" consoles. That's not how it works, and that's never how it's worked.

It's a shame we need to do this song and dance at the start of every new console generation; it's just been delayed this time around by the extended cross-gen period.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Nvidia must be shitting bricks right now with their laughable 8GB of VRAM on mid-range 40-series cards.
After Hogwarts Legacy and this, people might finally reconsider Nvidia in favour of AMD. Now if AMD is a little less greedy with the 7600-7700 XT cards, they might actually convert people and win a lot of market share.
Looking at Nvidia's offerings, though, I'm sure it's still going to be pricey.
Not gonna happen. By and large, 1080p and 1440p are the most common resolutions, and 8-12GB is enough 99% of the time. Those who are really feeling the blowback of the VRAM limitations are the ones gaming at 4K or using RT, and those guys had 3080s and above. x80-class GPUs are now 16GB+. All that happened is that NVIDIA will force enthusiasts to upgrade faster. People will sell their 3080s, and mid-range gamers will snap them up for 1080p-1440p. Once the 4060, 4060 Ti, and 4070 are out, mid-range gamers will grab them as well. They'll be 8-12GB, which again is enough for their needs.

Not much will change.
 

Raphael

Member
Not gonna happen. By and large, 1080p and 1440p are the most common resolutions, and 8-12GB is enough 99% of the time. Those who are really feeling the blowback of the VRAM limitations are the ones gaming at 4K or using RT, and those guys had 3080s and above. x80-class GPUs are now 16GB+. All that happened is that NVIDIA will force enthusiasts to upgrade faster. People will sell their 3080s, and mid-range gamers will snap them up for 1080p-1440p. Once the 4060, 4060 Ti, and 4070 are out, mid-range gamers will grab them as well. They'll be 8-12GB, which again is enough for their needs.

Not much will change.
Hardware Unboxed is saying you can't max TLOU at 1080p (!) with 8GB of VRAM. You get a lot of stuttering.

12GB will most likely be fine, but the optics of 8GB struggling, even in isolated cases, might make more people avoid those cards (3070/3070 Ti/4060/4060 Ti) from now on.
 
Last edited:

Stuart360

Member
Hardware Unboxed is saying you can't max TLOU at 1080p (!) with 8GB of VRAM. You get a lot of stuttering.
Then people need to turn the textures down a level, or shadows, or volumetrics, etc.
That's what PC gaming is for; that's the beauty of PC gaming.
 

Raphael

Member
Then people need to turn the textures down a level, or shadows, or volumetrics, etc.
That's what PC gaming is for; that's the beauty of PC gaming.
Of course. The difference is probably barely even noticeable. But I'm talking about consumer behaviour and the optics this brings to the market. You'll be paying a pretty penny for a 4060 Ti; for that price, you'll probably want a card that maxes everything at 1080p/1440p. AMD will deliver that at the same price point.

How many more games will come out that use more than 8GB? That will probably start to become more common now. Nvidia will have a tougher sell the more this happens, imo.
 
Last edited:

Stuart360

Member
Of course. The difference is probably barely even noticeable. But I'm talking about consumer behaviour and the optics this brings to the market. You'll be paying a pretty penny for a 4060 Ti; for that price, you'll probably want a card that maxes everything at 1080p/1440p. AMD will deliver that at the same price point.
I know, it's just annoying seeing people on here, Steam, etc. say stuff like 'you need 16GB of VRAM minimum to run games these days' or 'you need at least a 30-series Nvidia card to get 60fps at 1080p in modern games', etc.
All bollocks.
PC users, including people with 'high-end' systems, sometimes need to swallow their pride and turn down settings or resolution, or play at 60fps if it's a single-player game, etc.
 

Thaedolus

Member
PC Gaming is dead in the water, just buy a damn PS5 and be done with it...
lmfao, I have a PS5 and probably haven't turned it on in over a month. The GT7 VRR and 120Hz update has piqued my interest, but TLOU Part I was the last game I bought on it. Meanwhile, my desktop PC + Steam Deck combo gets near-daily use. The PS5 is fine for what it is, but I'd much rather be playing RE4 at native 4K without any of the jank it has on PS5 right now, and also have the option to play it handheld... which I have been.

I'm sure TLOU PC will get fixed eventually; it's a shame it was apparently pushed out the door in a poor state. They should've delayed it longer if this was the best they could do. But the fact that a company made a business decision to put out a subpar port is not an indictment of PC as a platform; it's just a company deciding to prioritize getting it out the door over doing quality work.
 

GreatnessRD

Member
Sounds like Nixxes gotta handle the job for all the PC ports going forward, yikes. Of all the games to be fucked up on the port side, you'd think this would be the last one that Sony would want to shit the bed at launch, lol.
 

Gaiff

SBI’s Resident Gaslighter
Hardware Unboxed is saying you can't max TLOU at 1080p (!) with 8GB of VRAM. You get a lot of stuttering.

12GB will most likely be fine, but the optics of 8GB struggling, even in isolated cases, might make more people avoid those cards (3070/3070 Ti/4060/4060 Ti) from now on.
Sure, but for one, there's apparently a memory leak, and for two, TLOU Part 1 is the exception, not the rule. Gamers with mid-tier rigs will worry about the next Monster Hunter or whatever From or CD Projekt Red is cooking up, and it's extremely unlikely that those games will eat up over 8GB without RT at 1080p. Those people playing on mid-range machines won't mind 1080p.

The VRAM problem chiefly affects 3070 and 3080 owners who bought, relatively speaking, fairly high-end products, but those guys won't upgrade to an 8GB GPU. Most will probably wait until the RTX 50 series, at which point I suspect the 5070 will have at the very least 12GB, if not 16GB. As for those buying second-hand 30-series GPUs, well, they won't worry about 4K; otherwise, they wouldn't be getting 30-series GPUs.
 
Last edited:

Stuart360

Member
Sounds like Nixxes gotta handle the job for all the PC ports going forward, yikes. Of all the games to be fucked up on the port side, you'd think this would be the last one that Sony would want to shit the bed at launch, lol.
It is a bit weird they used Iron Galaxy for something as iconic as 'The Last of Us', a name that probably comes to mind first when I hear 'PlayStation'.
I have to guess that Nixxes is maybe working on Ghosts, and maybe a couple of other more modern games.
 
Naughty Dog should have bought fewer motion capture rigs and hired a few competent programmers instead.

This absolutely looks like a PS4 game with PS3-level animations in-game. The PS5 isn't doing any crazy shit here that a bog-standard PC shouldn't be able to replicate. It looks marginally better than TLOU2 in some areas and much worse in others, and while that game pushed the PS4 to the limit, it's still a PS4 game.

The levels are smaller and more linear than in TLOU2. Lighting is pre-baked outside of the flashlight. Textures are really nice now, to be fair, but if even cards with 24GB of VRAM are getting underwhelming performance, it's not the hardware.

Just play Dead Space or RE4 and buy this for £8.99 from CD-Keys in a few months.
 
My PC is an i5-13600KF with 32 GB DDR 3600 memory and a 10 GB RTX 3080 GPU running Windows 11 Pro 22H2. I am using the latest 531.41 driver for this game and playing the game installed to my 2 TB Samsung 980 PRO NVMe SSD at the recommended 1440p High preset, as disclosed in the system requirements sheet, but with DLSS set to Quality.

The game took around 30-35 minutes to compile the shaders, which is even more ludicrous when you realise that they total 11.5 GB in size, but once done I was able to start the game and play up to the sequence with Joel and Tess at the checkpoint before I got my first crash. At this point, system memory usage was at 29 GB, which is insane for a console port. I have tried to play the game twice since then but it will always crash, sometimes after a couple of minutes, other times after 10-15 minutes. When the game works, it runs fine at 90-120 fps (capped at 120 fps) with no stuttering.

I don't have any issues with other games crashing like this (Hogwarts Legacy, The Witcher 3 DX12, Resident Evil 4 Remake, etc.), so to say I am disappointed in the quality of this port is an understatement. I have bought a 4080 but will not be installing it until the weekend, as I want to also move some SSDs around and give my PC a spring clean. I only bought my RTX 3080 last August, but it became clear to me that 10 GB was going to be insufficient for playing games at 1440p on max settings, especially with RT. Sure, I can lower settings, but the whole point of my new PC, which I bought last December, was so that I could continue playing games at 1440p at 60-120 fps (anything higher is pointless for me as a single-player-only gamer) with all the eye candy enabled. Otherwise I would just have stuck with the PS5 and Xbox Series X, which admittedly do offer terrific value for money considering how much PC gaming costs these days.

Hopefully the developers can fix the issues, and quickly. However, the damage is done; people are understandably annoyed at yet another sub-par PC port in a very long list of sub-par PC ports. If something isn't done about this, and soon, then I fear for the future of the PC as a gaming platform, because right now, in March 2023, it is not in a very good place. At all.
 
Last edited:

GreatnessRD

Member
It is a bit weird they used Iron Galaxy for something as iconic as 'The Last of Us', a name that probably comes to mind first when I hear 'PlayStation'.
I have to guess that Nixxes is maybe working on Ghosts, and maybe a couple of other more modern games.
Truly pathetic they put out something like this.
 

Stuart360

Member
By the way, that Oodle DLL trick helped reduce my system RAM usage from 14-15GB to 11GB.


I didn't really notice better performance like some users are reporting, but it definitely reduced my RAM usage significantly.

Just download it and drop it into the game's main folder, replacing the one already there.

No one has reported any problems from doing this, and my game works fine, but obviously do it at your own risk.
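If you do try it, it's worth backing up the stock DLL first so you can roll back. A minimal sketch, where the install path and the exact Oodle DLL filename are assumptions you should check against your own game folder:

```python
import shutil
from pathlib import Path

# Both the install path and the DLL name below are assumptions --
# verify them against your own Steam library before running.
game_dir = Path(r"C:\Program Files (x86)\Steam\steamapps\common"
                r"\The Last of Us Part I")
dll = game_dir / "oo2core_9_win64.dll"
backup = dll.with_name(dll.name + ".bak")

if dll.exists() and not backup.exists():
    shutil.copy2(dll, backup)  # keep the stock copy for rollback
    print(f"Backed up {dll.name} -> {backup.name}")
# Then drop the replacement DLL into game_dir by hand, and restore
# from the .bak if anything misbehaves.
```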
 

SmokedMeat

Gamer™
Not gonna happen. By and large, 1080p and 1440p are the most common resolutions, and 8-12GB is enough 99% of the time. Those who are really feeling the blowback of the VRAM limitations are the ones gaming at 4K or using RT, and those guys had 3080s and above. x80-class GPUs are now 16GB+. All that happened is that NVIDIA will force enthusiasts to upgrade faster. People will sell their 3080s, and mid-range gamers will snap them up for 1080p-1440p. Once the 4060, 4060 Ti, and 4070 are out, mid-range gamers will grab them as well. They'll be 8-12GB, which again is enough for their needs.

Not much will change.

But Nvidia isn't offering 12GB cards in the mid range, which is what you'd want for solid 1440p with high textures.
I don't feel it's right to excuse Nvidia for skimping on VRAM in their cards. The 4060 Ti will be a repackaged 3070 Ti selling at the same price the 3070 Ti did two years ago. Who wants to spend $600-$650 on a brand-new "mid-range" GPU and find they have to drop textures to medium because Nvidia's being stingy?

People seriously need to look past Nvidia for low to mid range cards. Jensen doesn’t deserve our money.
 

Gaiff

SBI’s Resident Gaslighter
But Nvidia isn't offering 12GB cards in the mid range, which is what you'd want for solid 1440p with high textures.
I don't feel it's right to excuse Nvidia for skimping on VRAM in their cards. The 4060 Ti will be a repackaged 3070 Ti selling at the same price the 3070 Ti did two years ago. Who wants to spend $600-$650 on a brand-new "mid-range" GPU and find they have to drop textures to medium because Nvidia's being stingy?

People seriously need to look past Nvidia for low to mid range cards. Jensen doesn’t deserve our money.
I'm not excusing them at all. I'm simply saying how I think it is. The VRAM issue affects a small subset of their user base, chiefly those who got conned into buying a 10GB 3080. And yes, the 4060 Ti is replacing the 3070 Ti price-wise, but most people on a 3070 Ti will not upgrade to a 4060 Ti. They'll either get a 4080 or a 5070. Those going for a 4060 Ti will be coming from a 3060 or lower and will aim for 1080p or 1440p.

The point I'm making is that this tactic of skimping on VRAM won't deter NVIDIA customers, because it's been carefully targeted at 3070/3080 owners, who are upper-mid-range or high-end buyers. Those guys 100% won't go AMD and won't go for a lower-tier 40-series card. They'll go for a high-end 40-series card or a mid-range 50-series card.
 

lefty1117

Gold Member
I think it's as simple as: the PC and console architectures are pretty different, and they just didn't spend enough on the port to maximize PC performance. But I also have to chuckle at the Sony doom cries about Microsoft nerfing CoD on PlayStation... and yet...
 

SlimySnake

Flashless at the Golden Globes
Naughty Dog should have bought fewer motion capture rigs and hired a few competent programmers instead.

This absolutely looks like a PS4 game with PS3-level animations in-game. The PS5 isn't doing any crazy shit here that a bog-standard PC shouldn't be able to replicate. It looks marginally better than TLOU2 in some areas and much worse in others, and while that game pushed the PS4 to the limit, it's still a PS4 game.

The levels are smaller and more linear than in TLOU2. Lighting is pre-baked outside of the flashlight. Textures are really nice now, to be fair, but if even cards with 24GB of VRAM are getting underwhelming performance, it's not the hardware.

Just play Dead Space or RE4 and buy this for £8.99 from CD-Keys in a few months.
I have had people telling me that this game looks amazing when you actually play it. That YouTube doesn't do it justice. That it was marketed poorly. I'm sorry, but this is a PS4 game through and through.

You are bang on about the animations. I was shocked at how bad this feels to play compared to TLOU2. Joel feels like a tank instead of a nimble and efficient John Wick like Abby. Demon's Souls reworked every single animation besides the visuals; it looks like ND kept every animation the same. I could swear they said they added motion matching to this, but it could be that they added motion matching to the existing animations instead of reworking them like Bluepoint did.

This is something I brought up last year. The real reason for the lack of enhanced melee, dodge, and prone is that they did not want to reanimate Joel. Not because of some stupid gameplay reason; it was because it was out of the scope of this remake. Plain and simple. Obviously they couldn't use Ellie's or even Abby's animations, or Joel would've moved like a woman.

The game looks marvelous in cutscenes. Absolutely stunning lighting and texture work. Character models look straight-up next gen. Then you switch to gameplay and it's like switching from a pre-rendered cutscene to gameplay: a huge downgrade. Way bigger than in TLOU2 and Uncharted 4, which already had pretty significant downgrades in gameplay.

And yeah, the levels being identical really hurts, but what hurts more is the fact that they tried to stay true to their PS3-era vision of what it should look like. The lighting in TLOU2 is baked too, but it doesn't feel baked because it looks fairly realistic. This game has a very PS3-era look to it despite having some pretty extensive texture rework. It just goes to show how much lighting can really enhance a game. Bill's town on the PS3 had this UE3 look, and for some bizarre reason, they kept that look even though the game is on the PS4 engine, which can do that dusk lighting very well. This is the first time I've seen ND artists make the wrong choice. They should've done what BP did and reimagined the lighting instead.
 

sainraja

Member
Playing PlayStation games on PC is like ordering a steak at a Mexican restaurant.

Sure, it's passable, but if you want the best experience, go to where they're known for steak. Similarly, don't order the enchiladas at the steak restaurant.

We've seen this over and over. Sony makes games for PlayStation hardware; later, they get them to run on PC. But it's clearly not the focus.
They'll get better now that they've started porting games over.
 

SlimySnake

Flashless at the Golden Globes
I'm not excusing them at all. I'm simply saying how I think it is. The VRAM issue affects a small subset of their user base, chiefly those who got conned into buying a 10GB 3080. And yes, the 4060 Ti is replacing the 3070 Ti price-wise, but most people on a 3070 Ti will not upgrade to a 4060 Ti. They'll either get a 4080 or a 5070. Those going for a 4060 Ti will be coming from a 3060 or lower and will aim for 1080p or 1440p.

The point I'm making is that this tactic of skimping on VRAM won't deter NVIDIA customers, because it's been carefully targeted at 3070/3080 owners, who are upper-mid-range or high-end buyers. Those guys 100% won't go AMD and won't go for a lower-tier 40-series card. They'll go for a high-end 40-series card or a mid-range 50-series card.
Yeah, but the 3080 was by far the most popular high-end card last gen. It was always sold out. I'll have to look at the Steam hardware survey splits, but I wouldn't be surprised if it's up there after the 1060s and 2060s.
 

Gaiff

SBI’s Resident Gaslighter
I have had people telling me that this game looks amazing when you actually play it. That YouTube doesn't do it justice. That it was marketed poorly. I'm sorry, but this is a PS4 game through and through.
Having them both, I'm honestly much more impressed by RE4R. I was actually scratching my head and had to go into the menus to make sure everything was on Ultra. The game still looks alright, I'm not saying it looks bad at all, but for some reason I thought it would look better than TLOU2. It's pretty easily outdone by modern AAA lookers, which is surprising given that it's ND and all they had to do was rework the assets; you'd think they'd go balls to the wall since they didn't have to worry about script or gameplay changes, but nah.

This could have been ported to PS4 and maintained 1080p/30 easily, but then who would buy it when you can already get TLOU Remastered for a fraction of the price?
 

Gaiff

SBI’s Resident Gaslighter
Yeah, but the 3080 was by far the most popular high-end card last gen. It was always sold out. I'll have to look at the Steam hardware survey splits, but I wouldn't be surprised if it's up there after the 1060s and 2060s.
Very likely, yeah. People will ride it out for a few years, but a good chunk will ditch it for a 5070 or 5080 due to the lack of VRAM for 4K gaming. I was never a big fan of that card because it had 1GB less than the 1080 Ti, and I suspected that would become a problem, but I honestly didn't think it would happen so soon. I thought that by the time it was an issue, the card would be almost a relic, so it wouldn't have mattered.
 

Thaedolus

Member
PC gaming is too expensive, inconsistent, and a pain in the ass.

I have one, and still prefer to avoid the stress of poor performance in ALL games by playing on console, even if I have to pay $10-20 more.
The entire point of playing on PC is the better performance, which the platform absolutely provides given competent developers. Naughty Dog is already scrambling to patch TLOU per their Twitter account. That shows it’s a software problem, not a platform problem.

The delay and subsequent poor quality show they were struggling. Again, look at RE4's great release from a developer that has had experience with RE Engine games on PC for the last several years... I'm getting >80FPS maxed out at native 4K, and it can scale to run on a handheld. There's your actual PC performance difference when someone knows what they're doing. Naughty Dog, or whoever handled the port, just sucked at it.
 

DeepEnigma

Gold Member
I'm feeling pretty validated about my decision to return my 3060ti that I bought a few weeks ago in favour of a 6700XT.

Shout out to Daniel Owen's YouTube channel for his benchmarking analysis videos; he's been sounding the alarm about changing VRAM standards for over a year.

We've seen several high-profile games in 2023 that need 12GB+ of VRAM as a minimum. This is the new norm, not some outlier anomaly. This version of TLOU has some seriously detailed assets; do people really think every developer has the budget and resources to build *highly* scalable LOD/mipmap management for every performance tier?
The Office Thank You GIF


It has its issues, which the developers have acknowledged to the community, but this will be the "new normal" for RAM usage. We have been saying this as far back as 2020, after the next-gen console specs were announced.
 

SlimySnake

Flashless at the Golden Globes
Having them both, I'm honestly much more impressed by RE4R. I was actually scratching my head and had to go into the menus to make sure everything was on Ultra. The game still looks alright, I'm not saying it looks bad at all, but for some reason I thought it would look better than TLOU2. It's pretty easily outdone by modern AAA lookers, which is surprising given that it's ND and all they had to do was rework the assets; you'd think they'd go balls to the wall since they didn't have to worry about script or gameplay changes, but nah.

This could have been ported to PS4 and maintained 1080p/30 easily, but then who would buy it when you can already get TLOU Remastered for a fraction of the price?
Yeah, same thing. I went back to RE4R and was immediately struck by how visually pleasing it felt compared to TLOU Part 1. Though tbh, I think that has more to do with the art direction and the incredible use of lighting in the RE games; in terms of textures and models, it's not really doing anything too special compared to TLOU. That's why I think the main reason this game looks so underwhelming is their decision to stick close to the original visual design.

Very likely, yeah. People will ride it out for a few years, but a good chunk will ditch it for a 5070 or 5080 due to the lack of VRAM for 4K gaming. I was never a big fan of that card because it had 1GB less than the 1080 Ti, and I suspected that would become a problem, but I honestly didn't think it would happen so soon. I thought that by the time it was an issue, the card would be almost a relic, so it wouldn't have mattered.
Everyone on PC GAF told me to get the 12GB card. I think we all knew. As a GTX 570 owner whose card became outdated the moment next-gen games launched last gen, I knew too. I just figured 10GB of VRAM vs the PS5's ~13GB of usable RAM wouldn't make such a big difference. I just forgot to take into account that I would likely want to target 4K with RT.
 

TGO

Hype Train conductor. Works harder than it steams.
The entire point of playing on PC is the better performance, which the platform absolutely provides given competent developers.
This can be said about every platform.
Many games release on consoles with poor performance and without the graphics to back it up.
Tecmo's/FromSoft's games have no excuse for their resolution or performance issues on any platform.
 