
Marvel’s Spider-Man Remastered has released on Steam/Steam Deck | Digital Foundry Breakdown Review Released

ChiefDada

Gold Member
Dude, even the Matrix demo, the most 'next gen' thing we have so far, only streams around 500 MB/sec according to the devs, and will work on a SATA SSD, and it does, as I tried it.

This will only end in tears, dude. I thought we were past all this by now.

No, you're not getting it. From a streaming standpoint, the Matrix demo was not impressive because it didn't rely on heavy streaming. Console-exclusive games will be more dependent on streaming because of the choice not to make a generational leap in RAM budget. This is in direct contrast to how traditional PC practice handles increased data requirements, which is simply to require more RAM, so it shouldn't be a shock to you that performance is sacrificed. Nixxes themselves stated that asset decompression on the CPU is bottlenecking performance. Keep in mind the game only streams ~20 MB/s most of the time.
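To put those rates side by side, here's a rough sketch using only the figures quoted in this thread (the SATA read speed is a typical spec figure, not a measurement):

```python
# Streaming headroom sketch. All rates are quotes/assumptions, not measurements.

SATA_SSD_READ = 550        # MB/s, typical SATA III sequential read
MATRIX_DEMO_STREAM = 500   # MB/s, the figure quoted for the Matrix demo
SPIDERMAN_STREAM = 20      # MB/s, the rate reported for Spider-Man most of the time

def headroom(drive_mbps, demand_mbps):
    """How many times over the drive covers the streaming demand."""
    return drive_mbps / demand_mbps

print(f"SATA vs Matrix demo: {headroom(SATA_SSD_READ, MATRIX_DEMO_STREAM):.2f}x")
print(f"SATA vs Spider-Man:  {headroom(SATA_SSD_READ, SPIDERMAN_STREAM):.1f}x")
```

So a SATA drive only barely covers the Matrix demo's quoted rate, while Spider-Man's typical demand is tiny by comparison; the bottleneck being argued about here is CPU decompression, not raw drive speed.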
 

bbr1979

Banned
Dude, even the Matrix demo, the most 'next gen' thing we have so far, only streams around 500 MB/sec according to the devs, and will work on a SATA SSD, and it does, as I tried it.

This will only end in tears, dude. I thought we were past all this by now.
You are absolutely wrong. On consoles, developers can max out the streaming capabilities. As I said, the Avatar developers plan to stream up to 2 GB of textures per second on consoles. So sorry, but PC can't compete in resource management when high-class developers are involved.
 

SmokedMeat

Gamer™
I don't even think most cards are underperforming now; most likely, most of the GPUs out there are under heavy VRAM pressure, the 3080 included.


Check this ray tracing benchmark.

At 1440p, the RTX 3080 outperforms the 2080 Ti by 20% (it should be 30-35% in normal conditions).

But looking at the 4K benchmarks, the 3080 outperforms the 2080 Ti by only a puny 7% margin.

And look at that: the 2080 Ti outperforms the 3070 by 25% too. Guess what? VRAM.

And suddenly the 3070 is near 6700 XT levels of performance at 4K.

I'm sure we would run into a similar performance penalty at 1440p too, after some play time. I just opened the game at 1440p, and I still get the 6.2 GB VRAM cap, and the game casually offloads 2.3 GB worth of GPU data into normal RAM.

The fact that 4K+DLSS pushes the 3080 down close to the 2080 Ti is just bad. The game simply refuses to use all available VRAM.
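For what it's worth, the ~80% cap claimed later in this thread lines up arithmetically with those numbers. A minimal sketch, assuming the 80% cap is real and that background apps hold roughly 0.2 GB (both are this thread's observations/assumptions, not documented engine behaviour):

```python
# VRAM budget under the claimed ~80% allocation cap.
# The cap and the background reservation are assumptions, not documented values.

def usable_vram(total_gb, cap=0.80, background_gb=0.2):
    """Budget the game will actually touch: a fraction of total VRAM,
    minus what background apps already hold."""
    return total_gb * cap - background_gb

for card, vram in [("RTX 3070", 8), ("RTX 3080", 10), ("RTX 2080 Ti", 11)]:
    print(f"{card}: ~{usable_vram(vram):.1f} GB usable of {vram} GB")
```

Under these assumptions, an 8 GB card lands at ~6.2 GB, matching the ceiling reported above, and an 11 GB 2080 Ti gets ~2.4 GB more headroom than a 3070, which would explain the 4K results.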

Where are you getting these benchmarks from? You’re way off.

https://www.techspot.com/photos/article/2518-spiderman-remastered-benchmarks/#4K-high

https://www.techspot.com/article/2518-spiderman-remastered-benchmarks/
 
Last edited:

SmokedMeat

Gamer™
Sorry, but the PS5 plays this game substantially better than your PC.

Even if it does, your PS5 edition is frozen at that performance.

His PC will only run the game better and better every time he upgrades it.

Hell, his next upgrade will blow well past what your PS5 does.
 
Last edited:

Guilty_AI

Member
That's just ignorant.
I also didn't give a shit about shadows in 2003, when my GPU couldn't handle them.
You still don't need shadows as long as the game is designed around not having them.

Technically we'll only really need ray tracing on games that make it mandatory (directly or indirectly)
 
Last edited:

Stuart360

Member
That's just ignorant.
I also didn't give a shit about shadows in 2003 when my gpu couldn't afford it
How is it ignorant? I don't care about RT. You can't even tell RT is on in most games, outside of reflection RT. Most people wouldn't even know RT is on in most games if they weren't told. In fact, this is the only game where I feel it is slightly worth it, due to the amount of glass and windows. Still, when you're actually playing the game normally, you don't see it anyway.
And it has nothing to do with my GPU not supporting it. I upgraded to a 1080 Ti over an RTX card on purpose, because I knew I would be turning RT off, and the RTX cards I could afford at the time (2060/2070) pulled in worse benchmarks than the 1080 Ti while being $100-200 more.
 
Last edited:

rofif

Can’t Git Gud
You still don't need shadows as long as the game is designed around not having them.

Technically we'll only really need ray tracing on games that make it mandatory (directly or indirectly)
Technically you don't need shadows or ray tracing if a game has everything baked... kinda like Uncharted 4 or TLOU2.
But there is still a lot of dynamic stuff.
 

rofif

Can’t Git Gud
How is it ignorant? I don't care about RT. You can't even tell RT is on in most games, outside of reflection RT. Most people wouldn't even know RT is on in most games if they weren't told. In fact, this is the only game where I feel it is slightly worth it, due to the amount of glass and windows. Still, when you're actually playing the game normally, you don't see it anyway.
And it has nothing to do with my GPU not supporting it. I upgraded to a 1080 Ti over an RTX card on purpose, because I knew I would be turning RT off, and the RTX cards I could afford at the time (2060/2070) pulled in worse benchmarks than the 1080 Ti while being $100-200 more.
You made a bad choice, that's it.
Now you are missing out. Not massively, but still.
 

Stuart360

Member
You made a bad choice, that's it.
Now you are missing out. Not massively, but still.
I don't feel like I'm missing out, though, or I wouldn't have bought a 1080 Ti, would I?
Maybe in 10 years, when the tech isn't so demanding, I'll feel different. But there's no point in getting a worse card raw-power-wise, and spending considerably more money doing it, just for a feature I won't turn on anyway.
 
Last edited:

Guilty_AI

Member
Technically you don't need shadows or ray tracing if a game has everything baked... kinda like Uncharted 4 or TLOU2.
But there is still a lot of dynamic stuff.
Exactly, it's all about the artistic and development approach. You can make an art style without dynamic shadow maps (and even with plenty of dynamic objects) look good.

The only reason modern games will look ugly with shadows disabled is that their development assumed dynamic shadows would be used, so nobody bothered making the game look good without them.

RT cards will only be truly necessary when game development reaches that stage.
 
Last edited:
The problem goes beyond a simple bottleneck:
8 GB GPUs run into all sorts of VRAM/RAM limitations at 1440p and beyond with ray tracing enabled and Very High textures.

And High textures at times look worse than PS4's, so they're not an option either.


For now I settled on maxed-out ray tracing with Very High textures at native 1080p. Playing at 1440p and above causes huge VRAM-bound bottlenecks. Setting textures to High is a fix, but you get worse-than-PS4 textures most of the time. And since DLSS/FSR still use 1440p/4K assets, they still incur heavy memory consumption.


Also, High textures + High reflections will give you weird reflections...



qhlVtSF.png
x2lA9kj.png


Pa5B3MT.png



Using Very High geometry fixes the problem, EVEN with Low textures:

Sk0OE2d.png


so, takeaways:

1) Very High geometry has better reflections than PS5 regardless of texture setting, whether textures are at Low or Very High

2) High textures + High geometry has worse reflections than PS5 (by a huge margin); High geometry practically needs Very High textures to create reflections properly

3) High geometry only produces PS5-equivalent reflections when paired with Very High textures

And at 1440p and beyond, it is impossible (with how the game manages memory allocation) to fit High geometry + Very High textures. You can fit Very High geometry + High textures at 1440p, but then you will have to live with these textures:

ps4 textures

YkDnRns.png


very high textures

EwOoU3C.jpg



The "so-called" High textures, which Alex suggests for "6 GB GPUs". You also need High textures at 1440p and above with 8 GB cards to enable ray tracing without VRAM-bound performance drops.

J9gJJHM.jpg





TL;DR
- The game only uses a maximum of 80% of VRAM (the rest is left for background apps)
- You need Very High textures to get the high-quality textures intended for PS4 and PS5. High textures are NOT a match for PS4 textures: they bundled the improved PS5 textures and the old-gen PS4 textures into the Very High setting. If you use High textures, you're getting textures even worse than PS4's. Be careful.
- 8 GB of VRAM, with how the game handles memory allocation, is not enough at 1440p and above to run both ray tracing and Very High textures. Only at 1080p did I manage to run ray tracing with Very High textures without enormous slowdowns.
- Slowdowns are caused by excess memory spilling into normal RAM, which forces constant data transfer over PCIe. This does not happen when you have enough VRAM or play at lower resolutions.
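A toy model of how that spill turns into frame time. The PCIe throughput and the fraction of spilled data touched per frame are illustrative assumptions, not measurements:

```python
# Frame-time cost of VRAM spilling into system RAM over PCIe.
# Both constants below are assumptions for illustration only.

PCIE_BW_GBPS = 12.0  # assumed practical PCIe 3.0 x16 throughput, GB/s

def spill_cost_ms(spilled_gb, touched_per_frame=0.05):
    """Per-frame cost if a fraction of the spilled data must cross PCIe
    every frame instead of residing in VRAM."""
    traffic_gb = spilled_gb * touched_per_frame
    return traffic_gb / PCIE_BW_GBPS * 1000.0

budget_ms = 1000 / 60  # 16.7 ms frame budget at 60 FPS
print(f"~{spill_cost_ms(2.3):.1f} ms of a {budget_ms:.1f} ms budget")  # 2.3 GB spill from earlier
```

Even touching a twentieth of a 2.3 GB spill each frame would eat over half a 60 FPS frame budget under these assumptions, which is why the slowdowns are so severe.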

In short, it is impossible to match PS5-equivalent performance with the VRAM bottlenecks 8 GB cards experience at 1440p and above, given that the PS5 constantly runs at around 1440p in its 60 FPS modes.

The VRAM bottleneck is real: the RTX 3060 can almost match PS5 ray tracing performance at native 4K; the 3070/3070 Ti cannot. At native 4K, the VRAM situation becomes so drastic that the 3070/3070 Ti drops below 30 FPS with Very High textures. The only way to play at native 4K with a 3070 like the PS5 does is to use High textures, which, as shown above, degrades overall texture quality quite heavily.



Here, at native 4K with nearly PS5-equivalent RT settings, the 3060 is able to get 40+ FPS (until it too is affected by the VRAM bottleneck, and yes, it is too).

And here a 3070 with similar RT settings:


drops below 40 and underperforms, even compared to the 3060.

I know this has been a long post, but due to how the game handles memory allocation, and due to how bad High textures look, it is for now impossible to enjoy this game at maximum fidelity at 1440p with an 8 GB card. All such cards severely underperform, with performance slowdowns and excess PCIe transfers.

If that settings list comes from Alex, I wouldn't trust it. He himself literally says he couldn't figure out what two of the settings did, and on several games, like Doom or the RE games, he's been wrong about settings and has had to slip an apology or correction into things like tweets or DF Directs (or he does what he usually does and ignores the true findings instead). The only settings that should be higher on PC than the PS5 fidelity mode are the shadows, AF, RT, and debatably the object draw distance in reflections; anything else is likely a suspect finding.
 

yamaci17

Member
First link is without RT.
This is with RT at "high":
4K-RT-p.webp
"High quality" puts textures at High, and High textures cannot even match PS4 textures.

These benchmarks are invalid. He should retest with Very High textures. All benchmarks done with High textures are invalid, since they look worse than PS4 textures. It's not worth enabling ray tracing if you have to go below PS4 textures to free up budget.

PCGH is a renowned German benchmarking website. They know their stuff.

If I play with High textures, it's all fine and dandy; I can get away with 4K max ray tracing. Those textures, however, simply suck.

Here: Very High textures, 37 FPS, heavy VRAM bottleneck.
High textures: magically back to 55 FPS, VRAM bottleneck removed.


Jkpr3Xe.jpg

Lz4Cb9v.jpg

As I said, don't say "but it's not maxed out!". The game artificially limits maximum VRAM allocation to 80% of total available VRAM. I reported this to them, but they didn't care. If you have contacts with them, you're free to notify them just like I did.

You can also clearly see how degraded the textures are.



44:26. This user also made the same discovery.
 
Last edited:

SmokedMeat

Gamer™
"High quality" puts textures at High, and High textures cannot even match PS4 textures.

These benchmarks are invalid. He should retest with Very High textures. All benchmarks done with High textures are invalid, since they look worse than PS4 textures. It's not worth enabling ray tracing if you have to go below PS4 textures to free up budget.


They’re absolutely valid.

Reflections, image quality, textures, geometry, shadows, etc are all superior on PC.

The PS5 version also uses dynamic resolution.
 

yamaci17

Member
They’re absolutely valid.

Reflections, image quality, textures, geometry, shadows, etc are all superior on PC.

The PS5 version also uses dynamic resolution.
Nope, they're not.
High textures look really bad.
Nice job ignoring my solid findings.
 
Last edited:
As expected, high-end PC gamers are pounding their chests: games play best on PC. They fiddle with the settings for a while and then keep adjusting them in-game.
But mid-range PC gamers are happy the game runs on their hardware at all, and low-end PC gamers don't even bother.
Console gamers just start the game, pick Performance mode or Graphics mode, and game... for $500. 😉
 
Last edited:

SmokedMeat

Gamer™
Nope, they're not.
High textures look really bad.
Nice job ignoring my solid findings.

You have no solid findings. Some no-name YouTubers mean nothing. I'm going by reputable sources.
If the textures were worse than PS4 textures, as you claim, people would be ripping the shit out of this port.

Your sources are junk.
 

yamaci17

Member
You have no solid findings. Some no-name YouTubers mean nothing. I'm going by reputable sources.
If the textures were worse than PS4 textures, as you claim, people would be ripping the shit out of this port.

Your sources are junk.
High textures are worse than PS4 textures.
Very High textures are fine.
You have a reading comprehension problem; go fix it before you talk nonsense on forums.


I have solid findings. PCGH is a solid benchmarking website. It is a reputable source, whether you accept it or not.

No one sits and compares High textures to PS4 textures, because no one would expect them to look worse, but they do. There's nothing I can do about that. I have solid proof that High textures look worse than PS4 textures, and I've shared all the relevant proof here.


Read my post again, thoroughly and carefully. You will see that High textures are worse than PS4 textures. Simple as that. I made these discoveries, not "no-name YouTubers". Go run the game, set it to High textures, then cross-match cutscenes and textures with the PS4 version. You're free to disprove my findings. All you can do is "claim" I have no solid findings, whereas I provided three solid proofs that High textures look worse than PS4's.

Here's another thing for you: High textures in RDR2 also look worse than PS4 textures, and you need Ultra textures to match PS4. Do you want me to prove that to you as well? Do you see anyone ripping the shit out of RDR2?

I discovered that High textures are visibly worse than what the PS4 offers because I originally played the game on PS4. So when I came across super-jagged, low-res textures with the game set to High textures, I naturally had the instinct to draw comparisons. My hunch was correct: High textures are not a match for PS4. You practically need Very High textures to match not only the PS5 but also the PS4.

This is not uncommon; they just packed the new improved textures and the PS4 textures together into the Very High setting. The lower texture settings are compressed versions of the Very High textures, not hand-crafted textures, so naturally they look worse than Very High, but at a point they also look worse than PS4. Most games do this, and in practice you either run them at the highest texture setting or you're subjected to horrible texturing. This game is no exception in this regard.

I've decided to block you, since you have no aim of adding anything useful to the discussion. I bring solid evidence, as a person who bought the game and is playing it, and as a person who played it before on its original platform. I don't have any more time to prove anything else to you; I provided solid evidence and backed up every single thing I've claimed. The same cannot be said for you.

That Hardware Unboxed benchmark with High textures is just useless. What High textures get you is proven in my post above and many other posts.

No one in this industry has the guts to test this game with RT enabled and Very High textures at 4K right now. It would produce disastrous results for all current GPUs and a PR nightmare. Since I'm being shut down even here, by casual users who cannot accept the possibility that the game has a huge problem in this regard, naturally all outlets will try to stay quiet about it.

Just because something is labeled "high" does not mean it will look good or be suited for high-end hardware. No, there are no such rules. PS5 RT geometry is also labeled "high", but clearly it produces mostly N64-level reflections.
 
Last edited:

ChiefDada

Gold Member
You have no solid findings. Some no-name YouTubers mean nothing. I'm going by reputable sources.
If the textures were worse than PS4 textures, as you claim, people would be ripping the shit out of this port.

Your sources are junk.

I don't agree with some of what he's saying, but why do you deem his sources junk? He has presented video evidence of how certain presets affect VRAM usage and performance. Are you disregarding the proof just because the related sources aren't renowned?

Btw, I never use the LOL emoji in a condescending manner like many here do, but I literally laughed out loud when I read your post, it was so ridiculous.
 
Deathloop doesn't count; that's a ZeniMax game, and Sony had no say in its PC version.
Yep, so I don't think they've really done anything to block their stuff. I bought each of them so far, even though I had them on PlayStation or whatever... I'm satisfied :messenger_sunglasses:

I was thinking they could throw in some PC-exclusive extra with the launch of these ports, but judging from how social media reacts to that kind of thing, I guess they're going to play it safe and just do clean ports, and I'm okay with that.
 

ArtHands

Thinks buying more servers can fix a bad patch
How can Sony always make such good ports?
When Sony says "Greatness Awaits", it really is greatness.

They know PC is a very important market, hence they put every effort and investment they have into enhancing the remasters here.
 
Also, DirectStorage currently supports only software mode.

Yes, but this won't be the case for long. Both Nvidia and AMD have been working with Microsoft on DirectStorage to bring high-bandwidth, low-latency streaming to PC discrete GPUs, in the form of DirectStorage, RTX IO, and Smart Storage from AMD.

I suspect they'll have fully implemented fast GPU-based decompression through these mediums by the end of next year, or even much earlier. That would be right around the time we get our first wave of next-gen games, both first and third party.

Also, the developers of Avatar said they plan to stream up to 2 GB per second of textures alone on the new consoles.

This will be handled comfortably by consoles, but I don't see it being a bottleneck for PCs either, especially those with an SSD, most of which have a minimum read speed of 1 GB/s. As I said originally, that becomes 2 GB/s after decompression. Keep in mind that PCs usually have an advantage in VRAM pool size, which can compensate.

Developers will take the variability of the hardware into account, as they always have, but I strongly suspect games will require an SSD as a base requirement, and that makes sense. It's not just about data throughput: SSDs are significantly better at I/O requests and priority data access, with significantly lower latency than an HDD.

Let's also address the fact that Sony is releasing more and more of its first-party exclusives on PC, and we'd be naive to think this won't affect their first-party studios' game design. I do see them pushing the SSD, but saturating 5.5 GB/s of bandwidth for a game? Not likely, as it would create more headaches for developers when porting titles to PC. That aside, it's difficult even on a technical level to saturate that much bandwidth.
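A quick sanity check of that argument. The 2:1 compression ratio is an assumption, and the drive speeds are typical spec figures rather than measurements:

```python
# Effective (post-decompression) streaming throughput vs the quoted
# 2 GB/s texture demand. Ratio and drive speeds are assumptions.

COMPRESSION_RATIO = 2.0  # assumed average compression for game assets
AVATAR_DEMAND = 2.0      # GB/s of textures, per the Avatar devs' claim

drives = {
    "SATA SSD": 0.55,      # GB/s raw sequential read
    "entry NVMe": 1.0,
    "PS5 internal": 5.5,
}

for name, raw in drives.items():
    effective = raw * COMPRESSION_RATIO  # throughput after decompression
    verdict = "meets" if effective >= AVATAR_DEMAND else "falls short of"
    print(f"{name}: ~{effective:.1f} GB/s effective, {verdict} {AVATAR_DEMAND} GB/s")
```

Under these assumptions, even an entry-level NVMe drive just clears the quoted texture demand, while SATA does not; that's the argument for an SSD as a base requirement.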
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
You are absolutely wrong. On consoles, developers can max out the streaming capabilities. As I said, the Avatar developers plan to stream up to 2 GB of textures per second on consoles. So sorry, but PC can't compete in resource management when high-class developers are involved.
Just wait for GPU decompression.
Yes, right now, without devs using GPU decompression, we are pretty screwed... unless you have a substantial CPU to handle it.

But with GPU decompression, there's going to be very, very little holding these games back.
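A back-of-the-envelope for why the CPU path is the wall here. The per-core decompression throughput below is purely an assumption for illustration; real figures vary a lot by codec and CPU:

```python
import math

# Whole CPU cores needed to sustain a given decompression rate.
# PER_CORE_DECOMP_GBPS is an assumed figure, not a benchmark result.

PER_CORE_DECOMP_GBPS = 0.4  # GB/s one core can decompress (assumption)

def cores_needed(target_gbps):
    """Cores required to decompress target_gbps on the CPU alone."""
    return math.ceil(target_gbps / PER_CORE_DECOMP_GBPS)

for target in (0.5, 2.0, 5.5):
    print(f"{target} GB/s decompressed -> ~{cores_needed(target)} cores")
```

Under these assumptions, console-class streaming rates would monopolise most of a desktop CPU, which is exactly the work GPU decompression is meant to take over.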
 

Dampf

Member
High textures are worse than PS4 textures.
Very High textures are fine.

Strange that you get frame drops with an 8 GB GPU on Very High textures.

Here's a 6 GB 2060 at Very High textures with RT, and it performs very well.

 
Last edited:

Dampf

Member
This is 1080p DLSS Quality, with a 720p render baseline.
Is the mip bias set correctly in this game? If so, the textures behave like native 1080p.

Anyway, 1080p to 1440p should add around 1.3 GB of VRAM (probably more like 800 MB) at most, and you have an 8 GB card. So given this isn't an issue on the 2060, I'm not sure what's going on here.

Perhaps it's a memory leak issue? How long have you been playing before you get issues with VRAM?
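For reference, one way to land on a figure in that ballpark. The share of the VRAM budget that scales with resolution is an assumption, not a measured value:

```python
# VRAM growth from 1080p to 1440p if only the resolution-scaled part of
# the budget (render targets, streaming pool) grows with pixel count.

def extra_vram_gb(res_dependent_gb, w1, h1, w2, h2):
    """Added VRAM when the resolution-dependent share scales with pixels."""
    ratio = (w2 * h2) / (w1 * h1)
    return res_dependent_gb * (ratio - 1)

# assume ~1.7 GB of the 1080p budget is resolution-dependent
print(f"+{extra_vram_gb(1.7, 1920, 1080, 2560, 1440):.2f} GB")
```

1440p has ~1.78x the pixels of 1080p, so ~1.7 GB of resolution-scaled allocations would grow by roughly 1.3 GB, consistent with the estimate above.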
 
Last edited:

yamaci17

Member
Is the mip bias set correctly in this game? If so, the textures behave like native 1080p.

Anyway, 1080p to 1440p should add around 1.3 GB of VRAM at most, and you have an 8 GB card. So given this isn't an issue on the 2060, I'm not sure what's going on here.

Perhaps it's a memory leak issue? How long have you been playing before you get issues with VRAM?
I can replicate the issue instantly when I swing downtown and come down to the streets.
 

bbr1979

Banned
Yes, but this won't be the case for long. Both Nvidia and AMD have been working with Microsoft on DirectStorage to bring high-bandwidth, low-latency streaming to PC discrete GPUs, in the form of DirectStorage, RTX IO, and Smart Storage from AMD.

I suspect they'll have fully implemented fast GPU-based decompression through these mediums by the end of next year, or even much earlier. That would be right around the time we get our first wave of next-gen games, both first and third party.



This will be handled comfortably by consoles, but I don't see it being a bottleneck for PCs either, especially those with an SSD, most of which have a minimum read speed of 1 GB/s. As I said originally, that becomes 2 GB/s after decompression. Keep in mind that PCs usually have an advantage in VRAM pool size, which can compensate.

Developers will take the variability of the hardware into account, as they always have, but I strongly suspect games will require an SSD as a base requirement, and that makes sense. It's not just about data throughput: SSDs are significantly better at I/O requests and priority data access, with significantly lower latency than an HDD.

Let's also address the fact that Sony is releasing more and more of its first-party exclusives on PC, and we'd be naive to think this won't affect their first-party studios' game design. I do see them pushing the SSD, but saturating 5.5 GB/s of bandwidth for a game? Not likely, as it would create more headaches for developers when porting titles to PC. That aside, it's difficult even on a technical level to saturate that much bandwidth.
It speaks to how much better the new consoles are than any PC config (including future ones). For games that really push graphics, consoles will be the main target for developers, or the PC will hold things back.
 

bbr1979

Banned
Even if it does, your PS5 edition is frozen at that performance.

His PC will only run the game better and better every time he upgrades it.

Hell, his next upgrade will blow well past what your PS5 does.
Some tremendous PC master race bullshit. Go to PC-only forums.
 

winjer

Gold Member
It speaks to how much better the new consoles are than any PC config (including future ones). For games that really push graphics, consoles will be the main target for developers, or the PC will hold things back.

Aside from the matter of SSD file systems on PC, there is nothing a current-gen console can do better than a modern PC. And even that is just a matter of game devs starting to use DirectStorage.
Ray tracing is much better on PC, especially on Nvidia's side, where even the old RTX 2060 can match the PS5. CPUs are also much more powerful on PC than the cut-down Zen 2 in the consoles.

Some tremendous PC master race bullshit. Go to PC-only forums.

Might I remind you that you are in a thread specifically made for talking about the PC version of a game.
The hypocrisy is very strong with you...
 
Last edited:

gypsygib

Member
I think Sony is the best publisher in the industry at this point. Their games and ports are so good. Even the average games in their line-up are great.
 

Guilty_AI

Member
Is the mip bias set correctly in this game? If so, the textures behave like native 1080p.

Anyway, 1080p to 1440p should add around 1.3 GB of VRAM (probably more like 800 MB) at most, and you have an 8 GB card. So given this isn't an issue on the 2060, I'm not sure what's going on here.

Perhaps it's a memory leak issue? How long have you been playing before you get issues with VRAM?
It seems the game caps VRAM usage at around 80% of the GPU's capacity; it's something yamaci17 himself verified (and I also saw people on the Steam forums commenting about it).

So basically, while the game has acceptable performance on mid-range cards, it won't perform as well as it should on higher-end cards, since it leaves a good amount of VRAM unused.
 
Last edited:

yamaci17

Member
It seems the game caps VRAM usage at around 80% of the GPU's capacity; it's something yamaci17 himself verified (and I also saw people on the Steam forums commenting about it).

So basically, while the game has acceptable performance on mid-range cards, it won't perform as well as it should on higher-end cards, since it leaves a good amount of VRAM unused.
Yeah, speaking of which, most mip bias values at 1080p/1440p are also broken. Various textures do not load properly, most notably cloth and mesh textures.



This is how Otto's sweater will look on most 1440p/1080p systems. I've reported all these issues to them; they don't seem to care at all, constantly asking me for dxdiag reports and whether I use an HDD or SSD. lmao. I link them 10 separate YouTube gameplay videos where the issue is present, yet they still ask *me* to share dxdiag reports.

LOD biases at 1080p/1440p are semi-broken: some things load correctly, some do not. Sometimes you can see that face textures are not properly loaded until some time has passed.

There are also huge "loading" lags PCs experience. Alex and DF also failed to mention that.

here:



Look at 13:13: the game fails to load MJ's clothes for 7 seconds.

(This is a rig with an RTX 3090; I don't know the specifics.)

The PS5 has no problem:



0:56

I managed to detect the same texture loading lag in DF's own video (they themselves failed to notice). On paper, the PC loads very fast, but it takes nearly 7-9 seconds for building textures to load. This is not an isolated issue either; if you swing at fast speeds, most of the buildings will load with ugly, low-res textures overall.

RL8jnU7.jpg
6PHbxma.jpg



Let's hope DirectStorage arrives and actually does something useful. Otherwise, the situation looks dreadful for the future of PC ports.
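The 7-second pop-in above is consistent with a simple streaming model. The asset size here is illustrative, reusing the ~20 MB/s streaming rate quoted earlier in the thread:

```python
# Frames an asset stays at its low-res fallback mip while the high-res
# version streams in. Sizes and rates are illustrative assumptions.

def frames_at_low_res(asset_mb, stream_rate_mbps, fps=60):
    """Frames spent at the fallback mip before the full texture arrives."""
    seconds = asset_mb / stream_rate_mbps
    return round(seconds * fps)

# a ~140 MB texture set at the ~20 MB/s rate quoted earlier: 7 s of blur
print(frames_at_low_res(140, 20), "frames at the low-res fallback")
```

So a streamer limited to tens of MB/s simply cannot keep up when the camera moves fast, and the visible symptom is exactly this kind of multi-second low-res fallback.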
 
Last edited:

bbr1979

Banned
Aside from the matter of SSD file systems on PC, there is nothing a current-gen console can do better than a modern PC. And even that is just a matter of game devs starting to use DirectStorage.
Ray tracing is much better on PC, especially on Nvidia's side, where even the old RTX 2060 can match the PS5. CPUs are also much more powerful on PC than the cut-down Zen 2 in the consoles.



Might I remind you that you are in a thread specifically made for talking about the PC version of a game.
The hypocrisy is very strong with you...
You can't beat the effectiveness of consoles. The main graphics features are geometry, textures, and shaders, not ray tracing at all. Currently, Demon's Souls, the new Ratchet, and Horizon are the best games on those factors. Also, in new games with ray tracing, the 2060 is absolutely no match for the PS5 and Series X. Look at the Matrix demo. New engines like Unreal Engine 5 are heavily optimised towards consoles.
 

Guilty_AI

Member
Yeah, speaking of which, most mip bias values at 1080p/1440p are also broken. Various textures do not load properly, most notably cloth and mesh textures.



This is how Otto's sweater will look on most 1440p/1080p systems. I've reported all these issues to them; they don't seem to care at all, constantly asking me for dxdiag reports and whether I use an HDD or SSD. lmao. I link them 10 separate YouTube gameplay videos where the issue is present, yet they still ask *me* to share dxdiag reports.

LOD biases at 1080p/1440p are semi-broken: some things load correctly, some do not. Sometimes you can see that face textures are not properly loaded until some time has passed.

There are also huge "loading" lags PCs experience. Alex and DF also failed to mention that.

here:



Look at 13:13: the game fails to load MJ's clothes for 7 seconds.

(This is a rig with an RTX 3090; I don't know the specifics.)

The PS5 has no problem:



0:56

I managed to detect the same texture loading lag in DF's own video (they themselves failed to notice). On paper, the PC loads very fast, but it takes nearly 7-9 seconds for building textures to load. This is not an isolated issue either; if you swing at fast speeds, most of the buildings will load with ugly, low-res textures overall.

RL8jnU7.jpg
6PHbxma.jpg



Let's hope DirectStorage arrives and actually does something useful. Otherwise, the situation looks dreadful for the future of PC ports.

I don't really think this is a DirectStorage issue or anything like that. These engines were all developed with console architectures in mind, to make the most of them. It's very likely that porting them to PC isn't an easy task; Spider-Man isn't the first Sony first-party game to have similar issues either (the best port so far was, unsurprisingly, Days Gone, which uses UE4).

Hopefully, with Sony's PC approach from now on, they'll improve their in-house engines to work better with PC architectures.
 
Last edited:

rofif

Can’t Git Gud
Strange that you get frame drops with an 8 GB GPU on Very High textures.

Here's a 6 GB 2060 at Very High textures with RT, and it performs very well.


It's great and all, but that's 1080p...
The problem is that x060 cards were always the standard for the mainstream resolution at 60 FPS.
But 1440p is the target nowadays on PC, and sometimes 4K.
 
Last edited:

rodrigolfp

Haptic Gamepads 4 Life
It speaks to how much better the new consoles are than any PC config (including future ones). For games that really push graphics, consoles will be the main target for developers, or the PC will hold things back.
Like the "N64" reflections, 4x AF, and LOW traffic and crowd density seen here, right???
 

bender

What time is it?
Even if it does, your PS5 edition is frozen at that performance.

His PC will only run the game better and better every time he upgrades it.

Hell, his next upgrade will blow well past what your PS5 does.

Wait until you see the PS7's backwards compatibility implementation.
 