
Digital Foundry: Dead Space Remake PC - DF Tech Review - The #StutterStruggle Continues

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Welp that didn’t age well
SbiPfLU.jpg
How so?
The game has a shader compilation run before you get in-game.
It indeed doesn't have shader compilation stutter.
 

calistan

Member
RTX 2060S and R5 3600 still putting in work.

Imagine someone who built the exact system in 2019?
They'd be swimming in it right now.
Yet people told me way back when 6 cores was gonna mean death.
Could probably upgrade to a R5 5600/5800X3D and float the generation.

Yet people will come and say you need a 2000 dollar computer to play PC games.
I'm glad DF still uses this old PC as their low-end/mid-range machine.
Yes, some games will beat it up, but man, if you've had this CPU/GPU since 2019, they have earned their keep.
Even a small bump up to a 3060 Ti would make 1440p DLSS easy work going forward.




Odd the game has loading stutter even on PS5.
Beat poetry! Somebody should set this to music, preferably jazz.

MrOrwk1.jpg
 

Vick

Member
Difference is I deal with facts. Other people deal in special emotions: 900p, potato gunplay on dual sticks with aim assist and claw grip, higher lag, lower frame rates, narrower aspect ratios and FoV, higher prices for the games, no mods, etc., etc...
I think every single post from you I have read in three years, every single one, felt driven by fragility and emotion. Having a meltdown in this very thread because a couple of PC guys dared to say they currently prefer the console experience for this particular game...

The only retard here is you, for telling people what they prefer.
A controller is very comfortable: all the buttons right there, analog movement and triggers. That, plus haptics and other features.
You can take your precision mouse and put it up your fucking ass. My goal in gaming is not to be precision 360 no-scoping all the time. Precision aiming is not gameplay... but if you really need to know, people play very precisely with controllers too.
People like him are why PCMR gets hate. Ever wonder why the PC enthusiasts on GAF who also own consoles never act like this?

Powered by Retard Mediumstation 900p™ Performance Mode:

Dead-Space-20230205004157.png


Dead-Space-20230207202956.png


Dead-Space-20230209010618.png


Dead-Space-20230207173055.png


Dead-Space-20230207161759.png
 

Mindman

Member
If you have a 3080 10GB and have stutter issues, try playing at 1440p instead of 4K. Honestly, this fixed 90% of the stutters for me and the game is much better now. Seems like the VRAM issue is real.
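For a rough sense of why dropping the output resolution frees VRAM, here's a back-of-the-envelope sketch; the bytes-per-pixel figure and render-target count are illustrative assumptions, not measurements from the game:

```python
# Approximate VRAM consumed by full-resolution render targets.
# 8 bytes/pixel and 10 targets are illustrative guesses; real engines vary widely.
def render_target_gb(width, height, bytes_per_pixel=8, num_targets=10):
    return width * height * bytes_per_pixel * num_targets / 2**30

print(round(render_target_gb(3840, 2160), 2))  # 0.62 (GB at 4K)
print(round(render_target_gb(2560, 1440), 2))  # 0.27 (GB at 1440p)
```

Even a few hundred MB of headroom can be the difference between smooth streaming and hitching on a 10GB card that is already near its limit with 4K texture streaming on top.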
 

rofif

Can’t Git Gud
I think every single post from you I have read in three years, every single one, felt driven by fragility and emotion. Having a meltdown in this very thread because a couple of PC guys dared to say they currently prefer the console experience for this particular game...


People like him are why PCMR gets hate. Ever wonder why the PC enthusiasts on GAF who also own consoles never act like this?

Powered by Retard Mediumstation 900p™ Performance Mode:

Dead-Space-20230205004157.png


Dead-Space-20230207202956.png


Dead-Space-20230209010618.png


Dead-Space-20230207173055.png


Dead-Space-20230207161759.png
These are brilliant!
If Vick says it looks good, then it's good. He is an asshat when it comes to details nobody else can see hahah :D
I am happy you didn't disable film grain. It adds some detail and dithers the image to avoid banding. TLOU Part I looks plastic without it!
OK, now I know that I am getting Dead Space for PS5 too. That's a lot of spending in just two months!
 

Vick

Member
These are brilliant!
If Vick says it looks good, then it's good. He is an asshat when it comes to details nobody else can see hahah :D
I am happy you didn't disable film grain. It adds some detail and dithers the image to avoid banding. TLOU Part I looks plastic without it!
OK, now I know that I am getting Dead Space for PS5 too. That's a lot of spending in just two months!
Thanks buddy, posted a couple more on the previous page.

Grain adds a lot in motion on my set (largely designed to faithfully handle film grain in movies), but it's maybe a bit on the heavy side in these screenshots. The game looks really good: some really neat SSS, and probably the best flame-related lighting I've ever seen.
Compared to Part I, the GI bounce lighting is a couple of gens behind, barely even there really, but the lighting as a whole manages to impress in other areas anyway. Volumetrics and particles are insane.
IQ is maybe softer than Part I and less temporally stable on a couple of occasions, and FSR creates some artifacts around sparks and behind glass, but it's pretty similar to that game in overall presentation/IQ.
And it's fluid. In later runs with the Hand Cannon you're constantly running from one room to another and back to the Store to sell stuff, so being fluid and consistent is a huge plus. Invaluable, imo.

Dead-Space-20230206173333.png


Dead-Space-20230206175548.png


Dead-Space-20230204223515.png
 
I posted this in the OT, but the game is now completely broken for me.

I went to start my New Game+ run, and the game opened in windowed mode for some reason. It refuses to go full screen, even though the in-game settings say it's in full-screen mode. The resolution cannot be changed in-game either; it's locked at 1080p and there is no option to select another resolution. The game now only runs at like 10 FPS in windowed mode, and this is after beating the entire game with little to no issues at all. None of my other games have this issue.

I've seen like four or so posts online from others that sound like they have my issue: the game refusing to use the main GPU and falling back to the integrated GPU instead. Not sure if this is really the case, but either way I'm super sad right now.
 

GametimeUK

Member
People like him are why PCMR gets hate. Ever wonder why the PC enthusiasts on GAF who also own consoles never act like this?
It's absolutely crazy, mate. I've spent so much money and invested so much time into PC over the past 10 years. It's my preferred platform, and obviously I want PC to have the absolute best versions of games; that's the whole point of paying so much for one. I wouldn't jump on a message board and downplay the issues to justify my purchase. I come here to discuss games, and at the moment to vent about some issues, because my 3080 should be providing a vastly smoother experience than the consoles in terms of stutters, but that isn't the case.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
It's going to be a $500 card. The naming doesn't matter anymore. 8GB for a new $500 card in 2023 is laughable. It will be barely enough for 1440p gaming until the end of the year.
And it'll likely be weaker than the RTX 3070... which had a $500 MSRP.
Comparable to the 3060 Ti G6X, which had a $400 MSRP.

Doesn't change the fact that if you're buying the smallest chip of a generation and expecting it to run well at the highest general resolution*, you deserve all the suffering that comes your way.


*
General resolution at the time of release: no one bought a GTS 450 with 512MB of VRAM and played at 2560x1600 (the highest general resolution at the time of its release).
So saying the RTX 4060 is obsolete because it will run out of VRAM at 4K is pretty much moot; no one would be buying that chip to play at 4K.
 

TrebleShot

Member
Hahaha, some genuinely funny shit in this thread. I'm convinced rodrigolfp is taking the piss and having a laugh; no one is that stupid.

Look man, I spent a shitload on my gaming PC. It's really nice, and I like to use it to test things out and see what I'm "missing".

Turns out in this specific instance the experience is worse than playing on PS5. As many have mentioned, the PS5 performance mode looks very, very nice. Sure, it's not as natively sharp as PC and you miss a bit of that depth in the colours etc., BUT it's all moot when you're shuffling around and the game is hitching every so often, and the worst thing is that it's persistent and constant.

When you combine that with the incredible haptics of the DualSense and its triggers (personal opinion, but fantastic) and the game actually looking fantastic on PS5, I'm quite happy.

There are other things as well, like trophy support (which leads to getting PSN vouchers), quick resume, and the system integration. I'm not sure why so many PCMR weirdos are so defensive.

Wilfully ignorant; as they say, a fool and his cash are easily parted.
 
Not sure why, but my framerate goes from a rock-solid 4K @ 60fps to the teens every time that tentacle grabs Isaac (and in the final boss fight) on my 3080.

Edit: lol, didn't realize we had someone named 60fps here. Didn't mean to @ you, son
 

lukilladog

Member
Anyone planning on getting an RTX 4060 and running games at 4K even with DLSS deserves the choppy framerates they get fed.

4K DLSS Performance is actually 1080p, lol. If this card releases with 8GB it's DOA.

...Also, budget entry-level cards used to be fine for resolutions up to 50% higher than the consoles of their time; don't let Jen-Hsun manipulate you.

...And you probably were not around when budget cards used to come out with actually too much VRAM.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
4K DLSS Performance is actually 1080p, lol. If this card releases with 8GB it's DOA.

...Also, budget entry-level cards used to be fine for resolutions up to 50% higher than the consoles of their time; don't let Jen-Hsun manipulate you.

...And you probably were not around when budget cards used to come out with actually too much VRAM.
DLSS 4K may render at 1080p, but the mipmaps loaded are the 4K ones, so your memory usage is effectively the same as playing at native 4K; what you're saving with DLSS is raw pixel pushing.
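The internal-resolution side of this is easy to check. A quick sketch using the commonly cited per-axis DLSS scale factors (assumed here; Nvidia can tune them per title):

```python
# Internal render resolution for a given DLSS mode at a 3840x2160 output.
def dlss_internal(width, height, scale):
    return round(width * scale), round(height * scale)

modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}
for name, scale in modes.items():
    w, h = dlss_internal(3840, 2160, scale)
    print(f"{name}: {w}x{h}")
# Performance mode comes out to 1920x1080, i.e. "4K DLSS Performance is 1080p".
```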


Entry-level cards were fine for resolutions 50% higher than consoles??
Mate, what planet are you from?
The 8400, hell, even the 8600 weren't pushing 50% more pixels than the Xbox 360/PS3.
I had a frikken 8800 GTS to play at 1440x900 (roughly 40% more pixels)... and I ended up needing to upgrade to a GTX 260 Core 216 to keep stable frame rates.
They had too much VRAM... what the fuck am I reading?
The 8400 had 128MB... yes, MB; the highest-end variant had 512MB.
Too much VRAM.
Mate, you are misremembering what the world was like. Yes, we were eating well, but consoles weren't just taking on entry-level GPUs, they were wiping the floor with them.
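For what it's worth, the pixel-count comparisons in this exchange can be checked directly (resolutions as given in the posts); 1440x900 actually works out closer to 40% more pixels than 50%:

```python
# Ratio of total pixel counts between two resolutions.
def pixel_ratio(res_a, res_b):
    return (res_a[0] * res_a[1]) / (res_b[0] * res_b[1])

print(pixel_ratio((1440, 900), (1280, 720)))   # 1.40625, i.e. ~41% more pixels than 720p
print(pixel_ratio((2560, 1600), (1280, 720)))  # ~4.44x, the "highest general resolution" case
```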


As for the RTX 4060 being DOA with 8GB of VRAM:
It will be 8GB.
The alternative is actually worse, because that would mean 12GB on a 96-bit bus.
Its memory would be much slower than a Series S's.
The 4060 Ti is already at ~288GB/s.
O1x8W86.png
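The bus-width point can be made concrete with the standard GDDR arithmetic; the 18 GT/s data rate is an assumption for illustration, and 224 GB/s is the Series S's quoted figure for its fast memory pool:

```python
# Peak GDDR bandwidth in GB/s: (bus width in bits / 8) bytes * per-pin data rate (GT/s).
def gddr_bandwidth_gbs(bus_bits, data_rate_gtps):
    return bus_bits / 8 * data_rate_gtps

print(gddr_bandwidth_gbs(128, 18))  # 288.0, the ~288GB/s 4060 Ti figure cited above
print(gddr_bandwidth_gbs(96, 18))   # 216.0, below the Series S's 224GB/s fast pool
```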
 

lukilladog

Member
DLSS 4K may render at 1080p, but the mipmaps loaded are the 4K ones, so your memory usage is effectively the same as playing at native 4K; what you're saving with DLSS is raw pixel pushing.

I've been using 4K textures in my racing games since my GTX 660... even 8K ones with my current 8GB card in Skyrim... but you deserve stuttering for using those on a 4060? Is that your argument?

Entry-level cards were fine for resolutions 50% higher than consoles??
Mate, what planet are you from?
The 8400, hell, even the 8600 weren't pushing 50% more pixels than the Xbox 360/PS3.
I had a frikken 8800 GTS to play at 1440x900 (roughly 40% more pixels)... and I ended up needing to upgrade to a GTX 260 Core 216 to keep stable frame rates.
They had too much VRAM... what the fuck am I reading?
The 8400 had 128MB... yes, MB; the highest-end variant had 512MB.
Too much VRAM.
Mate, you are misremembering what the world was like. Yes, we were eating well, but consoles weren't just taking on entry-level GPUs, they were wiping the floor with them.

Too much VRAM for the cards to take advantage of, that's what I meant, at least on the variants with double the VRAM, an option we no longer have. And yes, when those consoles arrived, entry-level and even mid-range cards struggled. But in comparable time frames, 2.5 to 3 years after the consoles launched, we had cards like the $90 HD 4670 which could run games at 1280x1024 no problem, a little higher than 1440x900 and the consoles' 1280x720, and which also didn't need to sacrifice texture quality. The reason many of us upgraded 8800 GT-class cards to GTX 260s had more to do with quality and speed: MSAA, 60fps, superior PC-specific textures, supersampling, Crysis... not because we were struggling to match the consoles' presentation. Remember that at that time it was the consoles choking at 10fps in games like Borderlands and having a hard time with Frostbite. But now it is the other way around, and people like you think we deserve crap for buying $400-500 cards and intending to play upscaled fake 4K? Bro.

As for the RTX4060 being DOA with 8GB of VRAM.
It will be 8GB.
The alternative is actually worse cuz that would mean its 12GB with a 96bit bus.
Its memory would be much slower than a Series S.
The 4060Ti is already at ~288GB/s
O1x8W86.png

Junk card; it had better come out with 16GB.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I've been using 4K textures in my racing games since my GTX 660... even 8K ones with my current 8GB card in Skyrim... but you deserve stuttering for using those on a 4060? Is that your argument?



Too much VRAM for the cards to take advantage of, that's what I meant, at least on the variants with double the VRAM, an option we no longer have. And yes, when those consoles arrived, entry-level and even mid-range cards struggled. But in comparable time frames, 2.5 to 3 years after the consoles launched, we had cards like the $90 HD 4670 which could run games at 1280x1024 no problem, a little higher than 1440x900 and the consoles' 1280x720, and which also didn't need to sacrifice texture quality. The reason many of us upgraded 8800 GT-class cards to GTX 260s had more to do with quality and speed: MSAA, 60fps, superior PC-specific textures, supersampling, Crysis... not because we were struggling to match the consoles' presentation. Remember that at that time it was the consoles choking at 10fps in games like Borderlands and having a hard time with Frostbite. But now it is the other way around, and people like you think we deserve crap for buying $400-500 cards and intending to play upscaled fake 4K? Bro.



Junk card; it had better come out with 16GB.
We all like to fantasize.

But some of us live in the real world.
You think Nvidia is going to skimp with a 128-bit bus, yet pony up for 16GB of VRAM?
A wider bus is cheaper than more VRAM.
Whatever you're smoking, I need some of it, cuz that psychosis sounds kinda fun.


If you are buying a 4060 to play games at 4K DLSS with any settings above a low LOD bias, you deserve... yes, deserve... any and all VRAM and chip stutter that you get.


My point is that entry-level Nvidia cards of yore weren't pushing 50% more resolution at similar settings.
The 8400 GS didn't hold a candle to the X360/PS3.


And this generation, an entry-level RTX 4060 will likely at least match the consoles in settings and resolution, but it certainly won't be pushing 50% more.
I'm not sure why you would even think that's a thing, as if we had historical evidence to support it.
 

lukilladog

Member
We all like to fantasize.

But some of us live in the real world.
You think Nvidia is going to skimp with a 128-bit bus, yet pony up for 16GB of VRAM?

If they have to change the VRAM size last minute in order to protect the brand and move sales of the thing, yeah, they will.

A wider bus is cheaper than more VRAM.

A wider bus won't hold textures, so that is irrelevant.

If you are buying a 4060 to play games at 4K DLSS with any settings above a low LOD bias, you deserve... yes, deserve... any and all VRAM and chip stutter that you get.
My point is that entry-level Nvidia cards of yore weren't pushing 50% more resolution at similar settings.
The 8400 GS didn't hold a candle to the X360/PS3.

And my point is that they did later on. By the third year of each console we had cards slamming them good: the 6600 GT vs the Xbox, the HD 4670 vs the PS3, and the 1060 vs the PS4... but if the 4060 comes with only 8GB, it will be laughable to see it choking on the three-year-old PS5's 1800p (4K DLSS Quality on PC) and console textures. Forget deserving shit for playing at 4K-whatever with it; you deserve shit for buying a card with only 8GB at this point in time.
 

yamaci17

Member
If they release a 16GB 4070 Ti at 600 bucks, that'd be a day-one purchase for me,
but honestly I don't see them doing it.
I simply refuse to buy a 12GB card at this point. I don't want to. Discord, Chrome, Spotify, Steam, Epic: all of them use VRAM. The Windows compositor uses VRAM. Twitch Studio/OBS Studio needs VRAM to operate and run a stream. Game Bar recording needs a bit of VRAM. Everything needs VRAM. 12GB on paper is only borderline enough for future-gen games, and that doesn't take any of these into account.

I've had my fill of disabling hardware acceleration in various software just to save a crucial 600MB to 1GB here and there. I don't want to live that experience again. I simply do not want to compromise on texture quality just because.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
If they have to change the VRAM size last minute in order to protect the brand and move sales of the thing, yeah, they will.
Protect the brand????... Move sales?????... It's a frikken 4060. Even the truly trash RTX 3050 sold.
The lower-tier cards are almost entirely price-sensitive, not performance-sensitive; there is no logical reason to buy an RTX 3050... but people still bought that piece of shit.
The RTX 3060 8GB... yup, it sold too. Not because of its performance, but because it's an Nvidia card and it's cheap.
No one with even a modicum of hardware knowledge would get or recommend the 3050 or the 3060 8GB.
Nvidia jumping to 16GB for the RTX 4060 on a 128-bit bus?
Whatever drugs you are on are hectic, my guy.

A wider bus won't hold textures, so that is irrelevant.
I'm pointing out that Nvidia already cheaped out on the bus width; it's illogical to think they would then suddenly pony up for more VRAM.
So even wishful thinking for more VRAM is borderline delusional.

And my point is that they did later on. By the third year of each console we had cards slamming them good: the 6600 GT vs the Xbox, the HD 4670 vs the PS3, and the 1060 vs the PS4... but if the 4060 comes with only 8GB, it will be laughable to see it choking on the three-year-old PS5's 1800p (4K DLSS Quality on PC) and console textures. Forget deserving shit for playing at 4K-whatever with it; you deserve shit for buying a card with only 8GB at this point in time.
The 6600 GT wasn't an entry-level GPU.
The 6200GS was an entry-level GPU.

The GTX 960 would be the analog to the RTX 4060,
not the GTX 1060.

Forget deserving shit for playing at 4K-whatever with it; you deserve shit for buying a card with only 8GB at this point in time.
If you are buying hardware that can't do the job you want it to do, yes, you deserve all the shit.
If you are buying an RTX 4060 to play at 1080p balanced settings... you do you, bro.

If they release a 16GB 4070 Ti at 600 bucks, that'd be a day-one purchase for me,
but honestly I don't see them doing it.
I simply refuse to buy a 12GB card at this point.
The 4070 Ti is already 800 dollars with 12GB of VRAM.
You expect them to release a 16GB variant for 200 dollars less????
That would be a bigger bus and more VRAM at a 200-dollar discount.
If they drop down a bus size to accommodate the VRAM, the card still chokes at higher resolutions due to how slow the memory is.
 

ratburger

Member
I've seen like four or so posts online from others that sound like they have my issue: the game refusing to use the main GPU and falling back to the integrated GPU instead. Not sure if this is really the case, but either way I'm super sad right now.
Have you tried disabling your integrated GPU in Device Manager?
 
I think every single post from you I have read in three years, every single one, felt driven by fragility and emotion. Having a meltdown in this very thread because a couple of PC guys dared to say they currently prefer the console experience for this particular game...


People like him are why PCMR gets hate. Ever wonder why the PC enthusiasts on GAF who also own consoles never act like this?

Powered by Retard Mediumstation 900p™ Performance Mode:

Dead-Space-20230205004157.png


Dead-Space-20230207202956.png


Dead-Space-20230209010618.png


Dead-Space-20230207173055.png


Dead-Space-20230207161759.png
While not quite as good as a maxed-out PC, it's good enough. Very sharp. The only problem is fidelity + FPS; that is where I break away from console and prefer PC.
I think had the availability of the PS5 been more in line with traditional launches (I tried for over a year), I would lean more into the conversation.
But, anyway, this looks solid.
I posted this in the OT, but the game is now completely broken for me.

I went to start my New Game+ run, and the game opened in windowed mode for some reason. It refuses to go full screen, even though the in-game settings say it's in full-screen mode. The resolution cannot be changed in-game either; it's locked at 1080p and there is no option to select another resolution. The game now only runs at like 10 FPS in windowed mode, and this is after beating the entire game with little to no issues at all. None of my other games have this issue.

I've seen like four or so posts online from others that sound like they have my issue: the game refusing to use the main GPU and falling back to the integrated GPU instead. Not sure if this is really the case, but either way I'm super sad right now.
Try going into your BIOS and turning the integrated GPU off.
 

SlimySnake

Flashless at the Golden Globes
Hahaha, some genuinely funny shit in this thread. I'm convinced rodrigolfp is taking the piss and having a laugh; no one is that stupid.

Look man, I spent a shitload on my gaming PC. It's really nice, and I like to use it to test things out and see what I'm "missing".

Turns out in this specific instance the experience is worse than playing on PS5. As many have mentioned, the PS5 performance mode looks very, very nice. Sure, it's not as natively sharp as PC and you miss a bit of that depth in the colours etc., BUT it's all moot when you're shuffling around and the game is hitching every so often, and the worst thing is that it's persistent and constant.

When you combine that with the incredible haptics of the DualSense and its triggers (personal opinion, but fantastic) and the game actually looking fantastic on PS5, I'm quite happy.

There are other things as well, like trophy support (which leads to getting PSN vouchers), quick resume, and the system integration. I'm not sure why so many PCMR weirdos are so defensive.

Wilfully ignorant; as they say, a fool and his cash are easily parted.
Dude, Hogwarts is the same. I've spent over $4K on PC gaming since 2019 and I can't even run it with RT on. Half of my GPU is going unused in non-RT mode, crashes galore, insane stuttering despite it building shaders for a minute every fucking time I boot up the game, like it's GTA4.

Shadows and AO look completely missing on Ultra settings. I don't know why everything looks washed out.

I mean, wtf is this door.

ZzyMWLe.jpg
n6ZEqbd.jpg



Edit: there is one thing it does well. Dolby Atmos support is amazing. Can't believe the PS5 and XSX only support it for movies.
 

MikeM

Member
Dude, Hogwarts is the same. I've spent over $4K on PC gaming since 2019 and I can't even run it with RT on. Half of my GPU is going unused in non-RT mode, crashes galore, insane stuttering despite it building shaders for a minute every fucking time I boot up the game, like it's GTA4.

Shadows and AO look completely missing on Ultra settings. I don't know why everything looks washed out.

I mean, wtf is this door.

ZzyMWLe.jpg
n6ZEqbd.jpg



Edit: there is one thing it does well. Dolby Atmos support is amazing. Can't believe the PS5 and XSX only support it for movies.
PCMasterRace is in a bad spot overall right now. Thanks, devs.

FYI: Atmos can be used in games on Xbox. Buy the app.
 

MikeM

Member
If they release a 16GB 4070 Ti at 600 bucks, that'd be a day-one purchase for me,
but honestly I don't see them doing it.
I simply refuse to buy a 12GB card at this point. I don't want to. Discord, Chrome, Spotify, Steam, Epic: all of them use VRAM. The Windows compositor uses VRAM. Twitch Studio/OBS Studio needs VRAM to operate and run a stream. Game Bar recording needs a bit of VRAM. Everything needs VRAM. 12GB on paper is only borderline enough for future-gen games, and that doesn't take any of these into account.

I've had my fill of disabling hardware acceleration in various software just to save a crucial 600MB to 1GB here and there. I don't want to live that experience again. I simply do not want to compromise on texture quality just because.
It's why I went with a 7900 XT. 20GB of VRAM. Yummy.
 

Buggy Loop

Member
I think every single post from you I have read in three years, every single one, felt driven by fragility and emotion. Having a meltdown in this very thread because a couple of PC guys dared to say they currently prefer the console experience for this particular game...


People like him are why PCMR gets hate. Ever wonder why the PC enthusiasts on GAF who also own consoles never act like this?

Powered by Retard Mediumstation 900p™ Performance Mode:


Dead-Space-20230207202956.png



Dead-Space-20230207173055.png


What is that? Anisotropic filtering x2 or even trilinear filtering?

 

TrebleShot

Member
Dude, Hogwarts is the same. I've spent over $4K on PC gaming since 2019 and I can't even run it with RT on. Half of my GPU is going unused in non-RT mode, crashes galore, insane stuttering despite it building shaders for a minute every fucking time I boot up the game, like it's GTA4.

Shadows and AO look completely missing on Ultra settings. I don't know why everything looks washed out.

I mean, wtf is this door.

ZzyMWLe.jpg
n6ZEqbd.jpg



Edit: there is one thing it does well. Dolby Atmos support is amazing. Can't believe the PS5 and XSX only support it for movies.
This is my exact experience on the Steam version also. It's just not worth it: you spend a shitload on high-end components and then spend hours messing about with INI files and controller configurations, etc.
 

Fredrik

Member
Are they not going to do any Hogwarts Legacy comparisons?
I have all platforms and I don’t trust elanalista or whatever it’s called. I want to know where to play it!
 

MikeM

Member
This is my exact experience on the Steam version also. It's just not worth it: you spend a shitload on high-end components and then spend hours messing about with INI files and controller configurations, etc.
Some people call that the "fun" of PC gaming.

Sure as hell not for me. Broken is broken.
 

dotnotbot

Member
Thanks buddy, posted a couple more on the previous page.

Grain adds a lot in motion on my set (largely designed to faithfully handle film grain in movies), but it's maybe a bit on the heavy side in these screenshots. The game looks really good: some really neat SSS, and probably the best flame-related lighting I've ever seen.
Compared to Part I, the GI bounce lighting is a couple of gens behind, barely even there really, but the lighting as a whole manages to impress in other areas anyway. Volumetrics and particles are insane.
IQ is maybe softer than Part I and less temporally stable on a couple of occasions, and FSR creates some artifacts around sparks and behind glass, but it's pretty similar to that game in overall presentation/IQ.
And it's fluid. In later runs with the Hand Cannon you're constantly running from one room to another and back to the Store to sell stuff, so being fluid and consistent is a huge plus. Invaluable, imo.

Dead-Space-20230206173333.png


Dead-Space-20230206175548.png


Dead-Space-20230204223515.png

Those screens are a good example of why I love moderate amounts of film grain. They would look much more artificial and flatter without it.
 

rofif

Can’t Git Gud
Thanks buddy, posted a couple more on the previous page.

Grain adds a lot in motion on my set (largely designed to faithfully handle film grain in movies), but it's maybe a bit on the heavy side in these screenshots. The game looks really good: some really neat SSS, and probably the best flame-related lighting I've ever seen.
Compared to Part I, the GI bounce lighting is a couple of gens behind, barely even there really, but the lighting as a whole manages to impress in other areas anyway. Volumetrics and particles are insane.
IQ is maybe softer than Part I and less temporally stable on a couple of occasions, and FSR creates some artifacts around sparks and behind glass, but it's pretty similar to that game in overall presentation/IQ.
And it's fluid. In later runs with the Hand Cannon you're constantly running from one room to another and back to the Store to sell stuff, so being fluid and consistent is a huge plus. Invaluable, imo.

Dead-Space-20230206173333.png


Dead-Space-20230206175548.png


Dead-Space-20230204223515.png
Lighting in that volumetric fog looks great. Is it dynamic like in Returnal, or not reactive?
That couch looks bad for your spine, 0/10.
 

Vick

Member
Who cares? It's not 2004 anymore. You don't get long flat texture plains.
4x or 16x? I don't care anymore.
Remember this period of PC gaming?




Now those were indeed completely different experiences. Between resolution, framerates and actual settings, games like Arkham Asylum, Dead Space, Arkham City, Max Payne 3, etc. all felt like something else entirely. What a glorious time; I'll never forget it.
So glorious that many people are still stuck there, unwilling to acknowledge how drastically times have changed.

Lighting in that volumetric fog looks great. Is it dynamic like in Returnal, or not reactive?
That couch looks bad for your spine, 0/10.
It's not static, but I haven't noticed how reactive it actually is. The devs for sure oversold it here:




But when it randomly appears in the final game, it still never fails to impress.
 

rofif

Can’t Git Gud
Remember this period of PC gaming?




Now those were indeed completely different experiences. Between resolution, framerates and actual settings, games like Arkham Asylum, Dead Space, Arkham City, Max Payne 3, etc. all felt like something else entirely. What a glorious time; I'll never forget it.
So glorious that many people are still stuck there, unwilling to acknowledge how drastically times have changed.


It's not static, but I haven't noticed how reactive it actually is. The devs for sure oversold it here:




But when it randomly appears in the final game, it still never fails to impress.

Yeah, the differences used to be much bigger in the 360/PS3 days.
Games usually had more MSAA and better fps on the 360.
The PS3 had better video quality and other stuff. It was interesting.
 

Buggy Loop

Member
Who cares? It's not 2004 anymore. You don't get long flat texture plains.
4x or 16x? I don't care anymore.

Oh yeah, only for flat plains





D2SJXT1.jpg



XgpZQaE.png


You guys have incredible eyes when it comes to shitting on VRS, but can't see shit when it comes to texture filtering, texture quality, or sub-1080p res for gaining "maybe" 60 fps.
 

Freeza93

Banned
Oh yeah, only for flat plains





D2SJXT1.jpg



XgpZQaE.png


You guys have incredible eyes when it comes to shitting on VRS, but can't see shit when it comes to texture filtering, texture quality, or sub-1080p res for gaining "maybe" 60 fps.
Cherry-picking goes both ways, my friend.
 

SmokedMeat

Gamer™
5600/7900 XT PC and PS5 here.

Bought this on PS5 because new games on PC seem to be broken. Looks like this confirms it. Another one?

Generally, I've been disappointed since rejoining the PC master race thus far. Flame shield engaged!


The PS5 version isn't stutter-free either. It's even mentioned in the video.
 

Vick

Member
Yeah, the differences used to be much bigger in the 360/PS3 days.
Games usually had more MSAA and better fps on the 360.
The PS3 had better video quality and other stuff. It was interesting.
If I recall correctly, Dead Space was actually the first AAA game where the PS3 version didn't suck compared to the 360's.
It unfortunately had superior audio to any other version.

Oh yeah, only for flat plains





D2SJXT1.jpg



XgpZQaE.png


You guys have incredible eyes when it comes to shitting on VRS, but can't see shit when it comes to texture filtering, texture quality, or sub-1080p res for gaining "maybe" 60 fps.
That "PS5" version of Horizon is actually the PS4 one, as there's no PS5 version. But I'm sure you already knew that.

The Dead Space comparison you posted was made before the VRS patch, when the game looked like this.

MYQquIp.jpg
 

rofif

Can’t Git Gud
Oh yeah, only for flat plains





D2SJXT1.jpg



XgpZQaE.png


You guys have incredible eyes when it comes to shitting on VRS, but can't see shit when it comes to texture filtering, texture quality, or sub-1080p res for gaining "maybe" 60 fps.
No difference. This doesn't bother me.
 

Buggy Loop

Member
If I recall correctly, Dead Space was actually the first AAA game where the PS3 version didn't suck compared to the 360's.
It unfortunately had superior audio to any other version.


That "PS5" version of Horizon is actually the PS4 one, as there's no PS5 version. But I'm sure you already knew that.

The Dead Space comparison you posted was made before the VRS patch, when the game looked like this.

MYQquIp.jpg

No, I didn't bother to include the Xbox Series X in the screenshot; maybe I should? It's the same showcase of filtering. PC also didn't have the option to disable VRS, although it didn't have the PS5's problem.

Checkerboard pattern with the RT mode too 🤷‍♂️

No difference. This doesn't bother me.

Stop spending days on PCGamingWiki then; you're blind.
 
I didn't really play at all before the patch that disabled VRS, but I played some last night and it looks really, really low-res on console. When you move the camera, anything with specular highlights just crawls. DLSS definitely has issues and looks awful in some games, but it's still absolutely light years ahead of FSR. Anyone claiming the console version has a clean image in performance mode needs to hit the opticians ASAP.
 

Vick

Member
Looks like octiny is having an emotional insecurity meltdown as well, as usual in these cases.

Yes, it was. The comparison was posted 27 January; the patch released 31 January.

I didn't bother to include the Xbox Series X in the screenshot; maybe I should? It's the same showcase of filtering. PC also didn't have the option to disable VRS, although it didn't have the PS5's problem.
The Series X, while not as affected as the PS5 (which was absolutely disgusting, borderline broken compared to Series X and PC), was similarly lacking in this regard. It looked much more like "2x AF" in fact (PS5 left, Series X right):

CQGtM33.jpg


Any more disingenuousness?
 