
Digital Foundry: Dead Space Remake PC - DF Tech Review - The #StutterStruggle Continues

Freeza93

Banned
maybe but just maybe

wzgxjgn.jpg



Resi 8 was much better without it, and EA uses it in every game?
 

Vick

Member
I didn't really play at all before the patch that disabled VRS but played some last night and it looks really, really low res on console. When you move the camera anything with specular highlights just crawls. DLSS definitely has issues and looks awful in some games but is still absolutely light years ahead of FSR. Anyone claiming the console version has a clean image in performance mode needs to hit the opticians ASAP.
Now this is one fat lie.

FSR's temporal coverage actually does a good job here. Not 100% perfect, obviously, but what's stated in this post couldn't be further from the truth.

Dead-Space-20230208041609.png


There's more aliasing in John's DF analysis of the native resolution PC version than in my four entire playthroughs combined, for instance.
Any walkthrough on YT would suffice to show that's indeed the case.
 
Based on YT videos I had the impression that this game looks like some typical PS4 game, but after finally playing it I'm still impressed by how good the remake looks. The developers of the Dead Space remake said in one of their videos that they are using real-time GI, and it really shows, because the lighting in this game looks simply amazing. In the DS remake, character models during gameplay FINALLY look like they belong to the scene, because their entire body is lit correctly. That wasn't the case in the PS4 era, where you could see beautiful lighting on character models during cutscenes, but during gameplay it looked extremely flat. The beautiful lighting in the remake also takes the atmosphere and realism to the next level, especially in HDR. Dead Space Remake is an extremely dark game, and in SDR there's black crush pretty much everywhere, but in HDR you don't even need to use the flashlight, because you can see amazing shadow detail and depth. This game looks more like a real film to me.

The lighting realistically lights Hammond from the front, while his back has realistic indirect shadows. He really looks like part of the scene, and that was never the case in PS4-era games.

a2.jpg



In this screenshot Hailey is mainly lit from the front, but her face is also picking up the red lighting from the left. It looks extremely realistic.


a7.jpg



Some more screenshots. There's clear black crush because these are SDR screenshots, but in HDR I can see EVERYTHING clearly even in these dark areas; the shadow depth in this game is simply amazing.



a3.jpg



a4.jpg



And a screenshot for people who say the original game still holds up and looks very good. Who needed a remake when the original game already looked like this? :D


a1.jpg



And that's how lighting looks in Uncharted 4, the best-looking PS4 game. During cutscenes character models are lit manually, so they look very realistic, but when gameplay starts the lighting on character models just looks flat.


fhojXHq.jpg


aqx30pi1k95.png
 

rofif

Can’t Git Gud
No, I didn't bother to include the Xbox Series X in the screenshot, maybe I should? Same showcase of filtering. PC also didn't have the option to disable VRS, although it didn't have the PS5's problem.

Checkerboard pattern with RT mode too 🤷‍♂️



Stop spending days on pcgamingwiki then, you're blind
Not blind.
Ignorant and happy. The differences really are not worth dwelling over.
 

lukilladog

Member
Protect the brand????.....move sales?????.......its a frikken 4060....even the true trash RTX 3050 sold.

The 3050 had what used to be the minimum recommendable amount of VRAM, so it's a flawed analogy.

The lower-tier cards are almost entirely price sensitive and not actually performance sensitive, there is no logical reason to buy an RTX 3050....but people still bought that piece of shit.

You are contradicting yourself: it's a price-sensitive segment, but it was overpriced AF and still sold. Maybe people saw that it was an actual gaming card that had the recommended minimum amount of VRAM at the time... maybe they saw it could run Cyberpunk at 60fps by turning RT off and using DLSS. So why is that illogical?... If it was choking on Cyberpunk at 10fps no matter what and still sold, I would say that was stupid.

The RTX 3060 8G.....yup, it sold too....not because of its performance, but because it's an Nvidia card and is cheap.
No one with even a modicum of hardware knowledge would get or recommend the 3050 or 3060 8G.
Nvidia jumping to 16GB for the RTX4060 on a 128bit bus?
Whatever drugs you are on are hectic my guy.

If they can make 8GB at 14Gbps work on a 128-bit bus just fine, maybe they can do the same with 16GB at 24Gbps on 128-bit? What is so crazy about that exactly? The last 30% of the memory could be sluggish, but we are talking about Nvidia here, sales first.
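
Back-of-the-envelope, the bandwidth side of that works out as follows. A minimal sketch in Python, using the standard peak-bandwidth formula (the configurations are just the ones mentioned above, not confirmed specs):

# Peak memory bandwidth = bus width in bytes x effective data rate per pin
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(128, 14))  # the 8GB / 14Gbps / 128-bit config: 224.0 GB/s
print(bandwidth_gbs(128, 24))  # a hypothetical 16GB / 24Gbps / 128-bit config: 384.0 GB/s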

I'm pointing out that Nvidia cheaped out on the bus width already; it's illogical to even think they would then suddenly pony up on more VRAM.
So even wishful thinking for more VRAM is borderline delusional.

It's not wishful thinking, I already pointed out how that is problematic for Nvidia and how they can patch their thing and keep it going... it's that, or release a legit lemon, or cancel it and rename it a 4050... which would still have worse lasting value than a 3050, DOA out of the box.

The 6600GT wasn't an entry-level GPU.
The 6200GS was an entry-level GPU.

The GTX 960 would be the analog to the RTX 4060.
Not the GTX 1060.

Not if you want to get rigorous, but we used to call them entry level in the context of gaming; the 6200GS, 8400 and GT 710 were the stuff you would get for your HTPC or a second monitor. Even you yourself called the 4060 entry level in this thread.

If you are buying hardware that can't do the job you want it to do, yes, you deserve all the shit.
If you are buying an RTX 4060 to play at 1080p balanced settings......do you, bro.

No you don't. You have been shitting on the 3050 but think a 4060 is fine for 1080p with only 8GB?... Obviously you have not used the 3050 or seen the 3060 Ti trying to play Hogwarts Legacy and Dead Space. 8GB is dead, bro, don't screw people over with that recommendation.

The 4070Ti is already 800 dollars with 12GB of VRAM.
You expect them to release a 16GB variant for 200 dollars less????
That would be a bigger bus and more VRAM for a 200 dollar discount.
If they go down a bus to accommodate the VRAM, the card still chokes at higher resolutions due to how slow the memory is.

We have talked about this, this is not the first time manufacturers sell lower-end cards with more VRAM, it's a selling point.
 

Vick

Member
Based on YT videos I had the impression that this game looks like some typical PS4 game, but after finally playing it I'm still impressed by how good the remake looks. The developers of the Dead Space remake said in one of their videos that they are using real-time GI, and it really shows, because the lighting in this game looks simply amazing. In the DS remake, character models during gameplay FINALLY look like they belong to the scene, because their entire body is lit correctly. That wasn't the case in the PS4 era, where you could see beautiful lighting on character models during cutscenes, but during gameplay it looked extremely flat. The beautiful lighting in the remake also takes the atmosphere and realism to the next level, especially in HDR. Dead Space Remake is an extremely dark game, and in SDR there's black crush pretty much everywhere, but in HDR you don't even need to use the flashlight, because you can see amazing shadow detail and depth. This game looks more like a real film to me.

And that's how lighting looks in Uncharted 4, the best-looking PS4 game. During cutscenes character models are lit manually, so they look very realistic, but when gameplay starts the lighting on character models just looks flat.


fhojXHq.jpg


aqx30pi1k95.png
Definitely agree, the game manages to impress visually on multiple occasions.
The SSS is probably the best I've ever seen (the organic stuff in general looked so life-like), and the fire effects and related lighting are much, much more impressive than those in TLOU Part I, for instance. Even particles like sparks reacted realistically to Isaac's body. Great stuff, made even more pleasant by the basically non-existent downgrade between Fidelity and Performance.

People complained about the facial expressions, and it's certainly true, but it's definitely worth mentioning that, unlike many other games, whatever language you select the game is lip-synced accordingly!
I only wish the aforementioned GI, as applied to the flashlight, were even remotely close to ND games, especially considering the dark nature of the game. Motive's solution is unfortunately far from on par with the PS3 TLOU implementation from 2013, or even the simplified Alien: Isolation solution.

The Naughty Dog implementation seen in Part II, Uncharted 4, Lost Legacy (PS4 Pro versions especially, as later ports including PC have been downgraded in this regard) and Part I is literally generations ahead, and something like it would have done wonders in this Dead Space setting.

Maybe for the sequel, which I hope is currently planned despite the game not setting the world on fire.
I need more Dead Space; I don't think I had ever done four playthroughs back to back before this remake.
 

rofif

Can’t Git Gud
Definitely agree, the game manages to impress visually on multiple occasions.
The SSS is probably the best I've ever seen (the organic stuff in general looked so life-like), and the fire effects and related lighting are much, much more impressive than those in TLOU Part I, for instance. Even particles like sparks reacted realistically to Isaac's body. Great stuff, made even more pleasant by the basically non-existent downgrade between Fidelity and Performance.

People complained about the facial expressions, and it's certainly true, but it's definitely worth mentioning that, unlike many other games, whatever language you select the game is lip-synced accordingly!
I only wish the aforementioned GI, as applied to the flashlight, were even remotely close to ND games, especially considering the dark nature of the game. Motive's solution is unfortunately far from on par with the PS3 TLOU implementation from 2013, or even the simplified Alien: Isolation solution.

The Naughty Dog implementation seen in Part II, Uncharted 4, Lost Legacy (PS4 Pro versions especially, as later ports including PC have been downgraded in this regard) and Part I is literally generations ahead, and something like it would have done wonders in this Dead Space setting.

Maybe for the sequel, which I hope is currently planned despite the game not setting the world on fire.
I need more Dead Space; I don't think I had ever done four playthroughs back to back before this remake.

I am up for another UC4 replay this year. The last version I replayed was the PS5 port at 4K/30 and really, the game is still amazing.
I've of course seen all your comparisons and I still think the higher resolution is probably worth the weird downgrades.... but maybe instead of playing 4K/40 this time.... I'll replay the PS4 version again :p
 
Have you tried disabling your integrated GPU in Device Manager?
While not quite as good as a maxed out PC, it's good enough. Very sharp. The only problem is Fidelity + FPS. That is where I break away from console and prefer PC.
I think had the availability of PS5 been more in line with traditional launches (tried for over a year) I would lean more into the conversation.
But, anyway, this looks Solid.

Try going into your BIOS and turning it off.
I can't disable it. I'm on a laptop, and it uses both.
 
RTX 2060S and R5 3600 still putting in work.

Imagine someone who built that exact system in 2019?
They'd be swimming in it right now.
Yet people told me way back when that 6 cores was gonna mean death.
Could probably upgrade to an R5 5600/5800X3D and coast through the generation.

Yet people will come and say you need a 2000-dollar computer to play PC games.
I'm glad DF still uses this old PC as their low-end/mid-range machine.
Yes, some games will beat it up, but man, if you've had this CPU/GPU since 2019, they have earned their keep.
Even a small bump up to the 3060 Ti would make 1440p DLSS easy work going forward.




Odd the game has loading stutter even on PS5.
6 cores/12 threads still kicks ass. It will be interesting to see when the high-end stuff of today (20-core/32-thread, 16-core/32-thread) stops being enough in the future. I remember when a 2500K was massively overkill as a 4c/4t CPU, but now 4t is terrible for gaming and 8t is bad for consistent fps and high 1% lows.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The 3050 had what used to be the minimum recommendable amount of VRAM, so it's a flawed analogy.
You are contradicting yourself: it's a price-sensitive segment, but it was overpriced AF and still sold. Maybe people saw that it was an actual gaming card that had the recommended minimum amount of VRAM at the time... maybe they saw it could run Cyberpunk at 60fps by turning RT off and using DLSS. So why is that illogical?... If it was choking on Cyberpunk at 10fps no matter what and still sold, I would say that was stupid.
The 3050 is literally failed 3060s that Nvidia produced simply to fill out demand.
It wasn't actually supposed to come out originally.
The 3060 was supposed to be the lowest-tier RTX 30 card.
To put things into context, the RTX 3050 is using a mobile configuration.... no one should have bought it, but people who only know Nvidia as "the" GPU maker and were desperate during the shortage ended up buying it, cuz it was in stock and wasn't scalped to death, so prices were "good"; the performance, not so much.
A bad purchase and I would never recommend it to anyone.

If they can make 8GB at 14Gbps work on a 128-bit bus just fine, maybe they can do the same with 16GB at 24Gbps on 128-bit? What is so crazy about that exactly? The last 30% of the memory could be sluggish, but we are talking about Nvidia here, sales first.

It's not wishful thinking, I already pointed out how that is problematic for Nvidia and how they can patch their thing and keep it going... it's that, or release a legit lemon, or cancel it and rename it a 4050... which would still have worse lasting value than a 3050, DOA out of the box.
Making it work in the sense that it is doable... yes, it's totally doable; with 2GB GDDR modules you can easily make a 16GB 128-bit card.
But in reality Nvidia has no incentive to do so when they are already cutting costs as is.
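
As a rough illustration of the module math (assuming standard 32-bit GDDR6 channels and a clamshell layout for the 16GB case; purely illustrative, not a confirmed board design):

BUS_WIDTH_BITS = 128
CHANNEL_WIDTH_BITS = 32   # one GDDR6 module per channel in the normal layout
MODULE_CAPACITY_GB = 2    # common GDDR6 module size

channels = BUS_WIDTH_BITS // CHANNEL_WIDTH_BITS   # 4 channels
print(channels * MODULE_CAPACITY_GB)              # 8  -> the normal 8GB config
print(channels * 2 * MODULE_CAPACITY_GB)          # 16 -> clamshell, two modules per channel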

Not if you want to get rigorous, but we used to call them entry level in the context of gaming; the 6200GS, 8400 and GT 710 were the stuff you would get for your HTPC or a second monitor. Even you yourself called the 4060 entry level in this thread.
Absolutely not.
The 6200GS was a straight-up gaming card, entry level but still fully touted as a gaming card.
The 6600GT and co. would be analogues to the xx70-class cards, which I doubt you would ever call entry level.

No you don't. You have been shitting on the 3050 but think a 4060 is fine for 1080p with only 8GB?... Obviously you have not used the 3050 or seen the 3060 Ti trying to play Hogwarts Legacy and Dead Space. 8GB is dead, bro, don't screw people over with that recommendation.

The RTX 3050 is a horrible card through and through; it's vastly underpowered, starved of bandwidth and came way too late to be worth its own weight in shit.
A750s and RX 6600s do everything better than it.... that's why I shit on the RTX 3050.
Its VRAM count isn't something I'm too worried about, cuz that chip belongs in a low-end gaming laptop anyway.

We have talked about this, this is not the first time manufacturers sell lower-end cards with more VRAM, it's a selling point.
Did you read what I was quoting?
How does it make sense to sell a 12GB 4070 Ti for 800 dollars, then sell a 16GB 4070 Ti for 600 dollars?



Now understand that I'm not condoning Nvidia's practices this generation, hell, I'm very much against them; every card is overpriced.
But seeing the landscape as it is, I'm not sitting in some dreamland thinking Nvidia is going to throw customers a bone and make the 4060 16GB.
Not in the climate we are in.
I'm being a realist here.

Best, best, best, best-case scenario is it gets 12GB on a small bus, and that's pretty damn wishful thinking.......... there was the engineering-sample 160-bit bus 10GB card, but I don't know if Nvidia is actually gonna send that to market...... so most likely the 4060 is going to be 8GB and people will still buy it, cuz it will be priced in the range that people can actually afford, pretty much regardless of its actual performance.
 

adamsapple

Or is it just one of Phil's balls in my throat?
I've seen like 4 or so posts online from others that sound like they have my issue - the game refusing to use the main GPU and using the integrated GPU instead. Not sure if this is really the case, but either way I'm super sad right now.

You can manually choose which GPU an app runs on, in the Nvidia control panel at least. I'm sure there's gotta be an equivalent for AMD as well.
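
Windows 10/11 also has its own per-app GPU preference (Settings > System > Display > Graphics). A minimal sketch of setting it programmatically, assuming it's stored under the UserGpuPreferences registry key; the exe path here is purely hypothetical:

import winreg

# Hypothetical path to the game's executable; adjust to your install.
EXE_PATH = r"C:\Program Files\EA Games\Dead Space\Dead Space.exe"

key = winreg.CreateKey(
    winreg.HKEY_CURRENT_USER,
    r"Software\Microsoft\DirectX\UserGpuPreferences",
)
# "GpuPreference=2;" = high-performance GPU, "1" = power saving, "0" = let Windows decide
winreg.SetValueEx(key, EXE_PATH, 0, winreg.REG_SZ, "GpuPreference=2;")
winreg.CloseKey(key)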

So is the DualSense implementation worth an extra 10% over just going with the XSX version?


PS5 and SX are priced the same, no ?
 

Thebonehead

Banned
You can manually choose which GPU an app runs on, in the Nvidia control panel at least. I'm sure there's gotta be an equivalent for AMD as well.




PS5 and SX are priced the same, no ?

Probably the 10 percent discount you get on EA titles if you are subscribed to Game Pass.

Logically, if you are subbed to GP though, the most reasonable thing to do is wait 6-12 months for it to hit the EA Vault and then play it via that.
 

GametimeUK

Member
Just picked it up on my PS5. I got too annoyed at the stuttering on PC. It still has extremely slight stutters and the visuals are a bit softer, BUT in comparison it runs like an absolute dream and doesn't break my immersion. I'm happy with how this performs on console. Finally I get to just enjoy the damn game.
 

lukilladog

Member
The 3050 is literally failed 3060s that Nvidia produced simply to fill out demand.
It wasn't actually supposed to come out originally.
The 3060 was supposed to be the lowest-tier RTX 30 card.
To put things into context, the RTX 3050 is using a mobile configuration.... no one should have bought it, but people who only know Nvidia as "the" GPU maker and were desperate during the shortage ended up buying it, cuz it was in stock and wasn't scalped to death, so prices were "good"; the performance, not so much.
A bad purchase and I would never recommend it to anyone.

All those are meaningless arguments; what matters at the end of the day is how it plays games, and the price. It had a bad price for what it was, but if you could find one second-hand at a good price, it was fine for high and max settings at 1080p, with even some ray tracing here and there. Doom Eternal at 100+ fps with RT reflections, no problem; Metro Enhanced with forced RT at 80-100fps; the RE remakes with RT at 40-60fps, or 120+ without it; The Ascent with ray-traced shadows at 100+ fps; Cyberpunk with medium RT at 40-50fps on high settings, or 60-80fps without RT; Crysis 2 and 3 remasters at 60-80 with RT; Forza Horizon 5 must have one spot where it drops to 55 fps under rain with ultra settings, but the rest is 80-120fps 99% of the time... but it had just about the minimum amount of VRAM to deliver that, and the 4060 won't even have that "privilege" if Nvidia goes full retard.

Making it work in the sense that it is doable... yes, it's totally doable; with 2GB GDDR modules you can easily make a 16GB 128-bit card.
But in reality Nvidia has no incentive to do so when they are already cutting costs as is.

Cutting costs or maximizing profit? You don't know, but knowing Nvidia, it's most likely the second. Nvidia will put in 16GB if that maximizes profit by way of selling millions of this POS for $450, and that is not even controversial.

Absolutely not.
The 6200GS was a straight-up gaming card, entry level but still fully touted as a gaming card.
The 6600GT and co. would be analogues to the xx70-class cards, which I doubt you would ever call entry level.

Nope, who said that? When most review sites ignore these types of cards, it's because the gaming community didn't give a shit.

The RTX 3050 is a horrible card through and through; it's vastly underpowered, starved of bandwidth and came way too late to be worth its own weight in shit.
A750s and RX 6600s do everything better than it.... that's why I shit on the RTX 3050.
Its VRAM count isn't something I'm too worried about, cuz that chip belongs in a low-end gaming laptop anyway.

Do you know how that works? How is 68fps at 6821MB bandwidth-starved? It seems very well balanced to me; I'm certain I can pull some Forza shots doing 7000+ MB at over 100fps.

dRd71Ie.jpg

Did you read what I was quoting?
How does it make sense to sell a 12GB 4070 Ti for 800 dollars, then sell a 16GB 4070 Ti for 600 dollars?

It's a compromise, Nvidia will throw some customers under the bus if they have to.

Now understand that I'm not condoning Nvidia's practices this generation, hell, I'm very much against them; every card is overpriced.
But seeing the landscape as it is, I'm not sitting in some dreamland thinking Nvidia is going to throw customers a bone and make the 4060 16GB.
Not in the climate we are in.
I'm being a realist here.

Best, best, best, best-case scenario is it gets 12GB on a small bus, and that's pretty damn wishful thinking.......... there was the engineering-sample 160-bit bus 10GB card, but I don't know if Nvidia is actually gonna send that to market...... so most likely the 4060 is going to be 8GB and people will still buy it, cuz it will be priced in the range that people can actually afford, pretty much regardless of its actual performance.

If people buy that, they are gonna be as screwed as the poor saps that paid a premium for a shitty 4GB 6500 XT not long ago. At least we agree they deserve all the stuttering they are gonna get (after being warned, of course).
 
I am up for another UC4 replay this year. The last version I replayed was the PS5 port at 4K/30 and really, the game is still amazing.
I've of course seen all your comparisons and I still think the higher resolution is probably worth the weird downgrades.... but maybe instead of playing 4K/40 this time.... I'll replay the PS4 version again :p
UC4 at 4K/40 on PS5 is so much nicer than 1440p/60. I don't understand why ND got rid of the 4K/40 mode in the TLOU remake. It seems much less stable now that they've uncapped it. I finished it last night and the final section in the hospital CHUGS now, probably due to the dynamic lighting that is so intense in that section.
 

yamaci17

Member
UC4 at 4K/40 on PS5 is so much nicer than 1440p/60. I don't understand why ND got rid of the 4K/40 mode in the TLOU remake. It seems much less stable now that they've uncapped it. I finished it last night and the final section in the hospital CHUGS now, probably due to the dynamic lighting that is so intense in that section.
1440p looks like shit. Both me and Rofif heavily agree on this one, but people are mostly in denial, since 1440p is a safe spot for hardware requirements and practically a must for 60 fps.
But 4K upscaled is also better than 1440p. A 1200p internal resolution being upscaled to 4K will destroy and demolish so-called "native 1440p" rendering. So few people are aware of the difference. Those who are, and who see what 4K LODs/assets do to a game and can discern it like me and Rofif do, are unable to accept or be happy with "1440p" image quality.

It is both a blessing and a curse.

Even at 1200p internal, 4K upscaled is much, much heavier to run than native 1440p in most cases, because it still loads 4K hints, LODs and assets. 1440p, however, is plain 1440p: 1440p LODs and assets, which makes games look weird/blurry compared to 4K upscaled.
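
To make the LOD point concrete, here's a minimal, purely illustrative sketch of the usual mip-selection idea: detail is chosen from how many screen pixels something covers, so a 4K output target can request a sharper mip than a 1440p one even at the same internal resolution (engines differ; this isn't any specific engine's code):

import math

def mip_level(texture_size_texels: int, screen_coverage_px: float) -> int:
    """Lower mip index = more detailed texture level."""
    ratio = texture_size_texels / max(screen_coverage_px, 1.0)
    return max(0, math.floor(math.log2(ratio))) if ratio > 1 else 0

TEXTURE = 4096  # texels along one axis
# The same object covering ~40% of the screen width at each output target:
for label, width_px in (("1440p", 2560), ("4K", 3840)):
    print(label, mip_level(TEXTURE, width_px * 0.40))  # 1440p -> mip 2, 4K -> mip 1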
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
If people buy that, they are gonna be as screwed as the poor saps that paid a premium for a shitty 4GB 6500 XT not long ago. At least we agree they deserve all the stuttering they are gonna get (after being warned, of course).
Look, I fully agree Nvidia is fucking us over this generation.
There is no argument there from me.
People buying their entry-level cards are fooling themselves; even with Frame Generation, Ampere cards on the low end would be better investments.

My whole real point is that expecting goodwill from Nvidia at this point is fruitless.
We will have to wait for the RTX 5000s for them to rebuild "trust" in consumers.
They've done this before already: one gen people complain, and the next generation things "normalize".
I'm just being a realist so you and others don't get your hopes up for Nvidia to give us "good" entry-level GPUs.
They aren't going to do that this generation..... they didn't even do that last generation, and the RTX 30s in general were brilliant cards.

The RTX 4060 and 4060 Ti aren't even x16 cards.
Hell, the 4060 is running on AD107.
Assuming it's going to be approx 400 dollars..... a 3060 Ti G6X would likely beat it in actual gaming for the same price.
ftornBh.png
 

winjer

Gold Member
Those 4060 and 4050 Ti are going to be a disaster.
They will probably cost around $500, but only bring 8GB of VRAM, which will be too little for games released this year and onwards, especially when using RT.
And to make it worse, only 8 PCIe lanes. So the VRAM will be used up rapidly, because it's only 8GB, and then the GPU will need to fetch data from system RAM, but it will be bottlenecked by the lower PCIe bandwidth.
And to make things even worse, only a 128-bit bus for the VRAM.
So people will have to lower texture detail, ray tracing and some effects to be able to play games smoothly on a brand-new GPU.

This is the kind of crap one would expect from a sub-$200 GPU, not from something costing more than double.
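
To put the x8 complaint into rough numbers, a minimal sketch (the per-lane figures are the usual approximate peaks, not measured throughput):

# Approximate peak PCIe bandwidth per lane, in GB/s
PCIE_GBS_PER_LANE = {3.0: 0.985, 4.0: 1.969}

def pcie_bandwidth_gbs(gen: float, lanes: int) -> float:
    return PCIE_GBS_PER_LANE[gen] * lanes

print(pcie_bandwidth_gbs(4.0, 16))  # ~31.5 GB/s, a full x16 slot
print(pcie_bandwidth_gbs(4.0, 8))   # ~15.8 GB/s, an x8 card on PCIe 4.0
print(pcie_bandwidth_gbs(3.0, 8))   # ~7.9 GB/s, the same x8 card on an older PCIe 3.0 board

So once the 8GB buffer spills into system RAM, an x8 card sitting in an older PCIe 3.0 board has roughly a quarter of the link bandwidth of a 4.0 x16 card.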
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Those 4060 and 4050 Ti are going to be a disaster.
They will probably cost around $500, but only bring 8GB of VRAM, which will be too little for games released this year and onwards, especially when using RT.
And to make it worse, only 8 PCIe lanes. So the VRAM will be used up rapidly, because it's only 8GB, and then the GPU will need to fetch data from system RAM, but it will be bottlenecked by the lower PCIe bandwidth.
And to make things even worse, only a 128-bit bus for the VRAM.
So people will have to lower texture detail, ray tracing and some effects to be able to play games smoothly on a brand-new GPU.

This is the kind of crap one would expect from a sub-$200 GPU, not from something costing more than double.
The 4060 is a hard sell to anyone in the know.
But it will still do numbers, cuz people just need an Nvidia card.
AMD and Intel need to launch their new-gen midrangers sooner rather than later.
If AMD doesn't have a 7700 and/or 7600 by the end of the year, they are taking the piss.
Intel needs to either refresh the Alchemists or have Battlemage at 3080-level performance by the end of the year too.
 
Welp, I finally fixed the game after 4 fucking days of trying everything.

FUCK EA - this has apparently been an issue with their games for years, yet there is no official solution from them. After digging through HUNDREDS of posts, I finally found the answer. Apparently, EA games have an issue with iGPU RAM allocation when it comes to multi-GPU laptops. When I doubled my laptop's RAM from 16 GB to 32 GB, it doubled the allocation of my iGPU's RAM from 2 GB to 4 GB. This caused Dead Space to refuse to switch over to my dedicated GPU - ever.

Obviously, this is a big fucking problem for laptops like mine that rely on multi-GPU to function - I can't just turn off the damn iGPU or the laptop won't work at all. So I went into the BIOS, dropped the iGPU allocation back down to 2 GB and boom, everything is working fine again. What a piece of shit company. I need to cool off for a few days, then I'll start my New Game+ run. I'm so pissed right now.
 
1440p looks like shit. Both me and Rofif heavily agree on this one, but people are mostly in denial, since 1440p is a safe spot for hardware requirements and practically a must for 60 fps.
But 4K upscaled is also better than 1440p. A 1200p internal resolution being upscaled to 4K will destroy and demolish so-called "native 1440p" rendering. So few people are aware of the difference. Those who are, and who see what 4K LODs/assets do to a game and can discern it like me and Rofif do, are unable to accept or be happy with "1440p" image quality.

It is both a blessing and a curse.

Even at 1200p internal, 4K upscaled is much, much heavier to run than native 1440p in most cases, because it still loads 4K hints, LODs and assets. 1440p, however, is plain 1440p: 1440p LODs and assets, which makes games look weird/blurry compared to 4K upscaled.
4K displays are overrated. The thing is, such an insanely high resolution is not very universal, so for example 1920x1080 BD movies, or even older games (games with low-quality assets and textures), will look much worse when displayed at 4K. What's more, even if you want to play 4K content on your 4K TV/monitor, you still need to take into account the size of the display and the viewing distance, because if your eyes can't see more than, let's say, 1920x1080 pixels from where you are sitting, why even bother with a 4K display?

https://stari.co/tv-monitor-viewing-distance-calculator
According to this calculator, people with perfect eyesight need to sit at a distance of about 1 metre from a 55-inch 4K screen in order to really see what 4K has to offer (I'm talking about visual acuity distance). I have never seen anyone sit that close, and most people watch this kind of TV from about 2-3 metres, from which even someone with perfect eyesight cannot resolve more pixels than good old 1920x1080.
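
The maths behind calculators like that one is straightforward. A minimal sketch, assuming roughly 1 arcminute of resolving power for 20/20 vision and a 16:9 panel (approximations only):

import math

DIAGONAL_IN = 55
H_PIXELS = 3840
ACUITY_ARCMIN = 1.0  # roughly 20/20 vision

# 16:9 panel width from the diagonal, converted to metres
width_m = DIAGONAL_IN * 0.0254 * 16 / math.hypot(16, 9)
pixel_pitch_m = width_m / H_PIXELS

# Farthest distance at which one pixel still subtends 1 arcminute
acuity_rad = math.radians(ACUITY_ARCMIN / 60)
print(round(pixel_pitch_m / math.tan(acuity_rad), 2))  # ~1.09 m for a 55" 4K panel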

Although 4K displays are overrated, I still try to run my games at 4K resolution if my GPU has enough resources, because aliasing and shimmering aren't pretty, and only downscaling can make the game look sharp and aliasing/shimmering-free at the same time. Even a 1920x1080 display will show way more fine detail if the game runs at 4K downscaled to 1080p than standard 1080p with crappy TAA on the same TV. IMO that's why people think 4K makes a difference. It's not because their display has 8M pixels, but because the picture itself has far superior AA compared to extremely blurry TAA.

1440p/1080p looks bad to me, but only when upscaled to 4K. Edge contrast is ruined by upscaling, and that's the only reason the picture looks soft. We humans perceive sharpness as contrast, and that's also why sharpening masks can make even a blurry picture look sharp.

I have a total of 4 displays in my house:
55-inch 4K LCD TV for modern games with HDR
42-inch 1024x768 plasma for PS3/PS2/original Xbox/GC games
32-inch 1920x1080 LCD for PS4 games
27-inch 2560x1440 LCD with HDR for my PC

Without upscaling, even my 1024x768 plasma has a very sharp picture from a normal viewing distance, and downscaling from 1080p is already enough to make even a blurry TAA game look sharp. On my 32-inch 1920x1080 TV I need to downscale from 4K to make a TAA game look sharp. On my 1440p monitor I need 6K, and on my 4K TV I don't even bother with downscaling, because 8K is too demanding and 4K with TAA looks acceptable even without it, especially with good sharpening.
 
I own a 4K OLED TV and agree that 4K is very much overrated, mostly because very little content actually displays at native 4K. Most 4K movies and TV shows, streamed or on disc, are actually upscaled from lower resolutions, as are most games. Even on PC, you need upscaling technology like DLSS and FSR if you want to play most modern games at 4K, and the situation is even worse on consoles, where you actually have to choose to play a game at 30 fps if you want it at 4K or near 4K. Maybe they should have marketed 1440p TVs instead for a more realistic resolution target, then focused on 4K instead of 8K? lol

8K is even more redundant due to the ludicrous lack of 8K content anywhere except for the odd nature or tech video on YouTube, and even Sony made a gaffe by promoting it on every PS5 box even though the console still has no support for 8K over 2 years after it released. 8K streaming for movies and TV content is years away due to the bandwidth requirements, and the likelihood of 8K discs is very, very low at this point.

Ultimately, it isn't the resolution that is important but the image quality in my opinion. This is why I am more than happy to continue playing games on my PC at 1440p where I can actually play the vast majority of games at native resolution and still have all the eye-candy and decent anti-aliasing (sometimes using NVIDIA's DLAA). It will be a long time before I upgrade my PC monitor to 4K.
 