
DF - Death Stranding PC Tech Review: The Upgrade We've Been Waiting For

The Cockatrice

Gold Member
1. No Dev is forcing it upon you.
2. It's an extra feature that isn't used to mask performance optimization
3. "Not saying DS runs bad without it" (it runs great, so what are you telling us exactly?)
4. They call it the future because it's a huge performance boost at the cost of a slight loss in image quality. We are aware of how punishing 4K is and with RT, it's gonna be worse. No optimization will save you
5. It's more than a sharpening filter. It's a reconstruction algorithm and it shits on FidelityFX CAS
6. It's doubtful devs will ever use it to excuse poor performance but if one day it becomes so good it's effectively indiscernible from native 4K all the while offering a huge performance boost, who cares?
If it was just about sharpening then why is AMD's Contrast Adaptive Sharpening in this game inferior to DLSS in a side by side comparison?

I never said it's just a sharpening filter. I know what it does. I used it on my 2080. And the differences are not indiscernible at all. It's easier to hide the differences in a shitty YouTube video, but actually playing is a whole different thing. You have too much blind faith that it won't be abused as an excuse for poor performance, and I'm pretty sure Watch Dogs Legion will be the second example after Control in regards to its use. We can talk more about this in 2022 when it's more common.
 

01011001

Banned
It doesn't, unless the TAA implementation is absolute shit, which is very common. We need better AA solutions, but DLSS 2.0 better looking than native? Absolute lie.

well then I have yet to see a TAA implementation that doesn't have issues like ghosting and flicker in small, high-detail parts of the image.
DLSS2.0 meanwhile loses almost no detail compared to native resolutions and it has no flicker and no ghosting. it looks better.

also, DLSS2.0 is only ever talked about as a way to increase your resolution from a lower resolution, which is a shame.
because DLSS2.0 can also be used purely as AA.
you could run Death Stranding on a 1440p monitor, set the resolution to 4K-DLSS, and have the game render at your native monitor resolution, internally reconstruct it to 4K, and then supersample it down to 1440p again. given that the resulting image would look really good even on a 4K screen, imagine what this would look like on a 1440p screen instead!

and for 4K screen users, this could be pushed higher, with a native resolution of 4K which is then turned into an even higher resolution reconstruction by DLSS and then downsampled again.

so basically what I am saying is that DLSS could potentially simply replace TAA entirely not only as a way to up-res an image but also to just smooth edges at native resolution.
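The pixel budget behind the idea above is easy to sanity-check. This is a rough sketch with illustrative numbers only (not any actual driver API): render near the monitor's native 1440p, let DLSS reconstruct a 4K image, then downsample that back to 1440p.

```python
# Rough pixel-budget sketch of the "DLSS as anti-aliasing" pipeline
# described above. All numbers are illustrative assumptions.
monitor = (2560, 1440)       # native 1440p display
dlss_target = (3840, 2160)   # 4K image reconstructed by DLSS

monitor_px = monitor[0] * monitor[1]
dlss_px = dlss_target[0] * dlss_target[1]

# Downsampling the 4K reconstruction to 1440p gives each output pixel
# roughly this many reconstructed samples, i.e. supersampling on top.
ssaa_factor = dlss_px / monitor_px
print(ssaa_factor)  # 2.25
```

So even before the reconstruction itself, the downsample step alone behaves like 2.25x supersampling per output pixel.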
 
I never said it's just a sharpening filter. I know what it does. I used it on my 2080. And the differences are not indiscernible at all. It's easier to hide the differences in a shitty YouTube video, but actually playing is a whole different thing. You have too much blind faith that it won't be abused as an excuse for poor performance, and I'm pretty sure Watch Dogs Legion will be the second example after Control in regards to its use. We can talk more about this in 2022 when it's more common.
Control runs well enough if you turn off RTX and the DLSS on my 2080 Ti looks very good but Control overall has dogshit filters and film grain that is nearly impossible to get rid of. Blame it on the game's filmic look with the shitty post-processing effects.

As for Watch_Dogs Legion, jury is out. Game isn't there yet.
 

Siri

Banned
DLSS 2.0 is simply incredible.

In Control, at 4K, I was able to go from 45 FPS native, to 60 FPS (DLSS 2.0) with better image quality - yes, better. It was like installing a brand new next-gen gpu.

I haven’t installed Death Stranding yet, but it sounds like DLSS 2.0 is doing its thing again.
 

A.Romero

Member
I'll double dip and check it out for the technical feats, but I don't think I have another playthrough in me. I liked it, just not that much.

Unless the mod scene takes off.

And yes, every game should get DLSS. Not as good PQ as native, but pretty close, and it allows for 4K/60 FPS on current GPUs. What's not to like?
 

Mister Wolf

Member
[Chart: Death Stranding Nvidia GeForce RTX DLSS 2.0 performance, 3840x2160, DLSS Quality mode]
 

GHG

Member
I'm sorry for ever doubting you DLSS.

This shit is a fucking game changer. Mentioned it in another thread, but I tried it out at 4K with RTX on in Wolfenstein: Youngblood today on a 2070 Super and I'm blown away. It shouldn't be possible, and yet it is; it's like black magic.

If this becomes widespread and AMD don't have a similar solution of their own then they are finished in the GPU market.

No graphical upgrades can fix this bland, empty world walking sim, boring repetitive gameplay with little actual combat.

Yawn.
 

TheSHEEEP

Gold Member
No graphical upgrades can fix this bland, empty world walking sim, boring repetitive gameplay with little actual combat.
Well, yes, but now you can look at nothing happening on the screen in even higher resolution, with even more details and even higher fps!
And with PC, who knows, maybe someone will make a mod to add gameplay.
 

GHG

Member
People who doubt DLSS can look better than native haven't seen it with their own eyes.

Don't worry, that was also me once.
 

Guilty_AI

Member
People who doubt DLSS can look better than native haven't seen it with their own eyes.

Don't worry, that was also me once.
Yeah, I didn't realize DLSS 2.0 could actually look better than native in some areas. It's impressive.
 

Skyr

Member
Seriously, how the fuck is this possible? I understand the framerate but how can it be that the PQ is better with DLSS than native?
You'd have to ask an Nvidia engineer. Basically A.I. algorithm magic. I'm sure native can look better than DLSS with proper AA, but add that and performance tanks even more.
 

Mister Wolf

Member
For me the artifacts in DLSS are not a bother. I always use the control panel's sharpen filter when I use TAA anyway. I despise TAA, but it's better than having no AA. As it stands, DLSS 2.0 is superior to TAA in all forms.
 

Holammer

Member


Some of the posts in this thread man, wtf is wrong with you guys.
Anyway, Decima is looking good and it bodes well for Horizon: Zero Dawn. Hopefully it'll come with DLSS2.0 magic sauce support despite the AMD marketing stuff. If they include it, don't expect them to shout it from the rooftops.
 

Andodalf

Banned
Seriously, how the fuck is this possible? I understand the framerate but how can it be that the PQ is better with DLSS than native?

DLSS is basically extrapolating data from super high res renderings of the game, 16K iirc. Those 16K shots have a ton of detail that 4K misses. So even though DLSS renders lower than true 4K, it has more fine detail. It's kinda as if 16K footage was downsampled to your target resolution and then intelligently reconstructed to 4K. That's not really what happens, but it's kinda what the ML is trying to achieve.


Now, how the ML actually works from those high-res renderings I have no clue, but ML is basically magic
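A toy numeric illustration of why the training against very-high-res ground truth matters (plain numpy here; this has nothing to do with Nvidia's actual network): once fine detail is lost by rendering at a lower resolution, naive interpolation cannot bring it back. That gap is exactly what a network trained on the high-res ground truth tries to fill in.

```python
import numpy as np

# A detailed 1-D "image": a high-frequency signal standing in for fine detail.
hi_res = np.sin(np.linspace(0, 40 * np.pi, 1024))

# "Render" at quarter resolution, then upscale naively with linear interpolation.
lo_res = hi_res[::4]
naive_up = np.interp(np.arange(1024), np.arange(0, 1024, 4), lo_res)

# Interpolation alone cannot restore the lost high-frequency detail.
max_err = np.abs(hi_res - naive_up).max()
print(max_err > 0.01)  # True: a measurable reconstruction error remains
```

A learned upscaler attacks that residual error using patterns memorized from high-res training data (plus, in DLSS 2.0's case, motion vectors from previous frames).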
 
Seems to run fine during gameplay at native 4K on my 5700XT with the settings up, cutscenes have noticeable frame drops tho.

Only downside for me is the lack of a 3840x1620 21:9 option. Maxes out at 3360x1440 which still looks decent enough on my 4K set I guess when used in conjunction with the FidelityFX CAS. Funnily enough there is a 1680x720 option for Ultrawide :messenger_grinning_sweat:
 

01011001

Banned
Seriously, how the fuck is this possible? I understand the framerate but how can it be that the PQ is better with DLSS than native?

modern anti aliasing methods are complete ass, that's how.

TAA is also just an algorithm that tries to estimate pixels in order to smooth edges... it just doesn't do it well.

DLSS uses a way more sophisticated form of the same idea. it uses machine learning to estimate more pixel detail and in the process also produces smooth edges.

this basically makes it the best modern form of anti-aliasing, and it adds detail that increases the preserved resolution
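A bare-bones sketch of the temporal idea both techniques share (toy code, not any shipping implementation): blend slightly jittered samples over time so an aliased edge converges toward its true coverage. TAA uses a fixed blend like this; DLSS swaps the hand-tuned heuristics for a trained network.

```python
def taa_accumulate(history, current, alpha=0.1):
    """Toy temporal accumulation: exponential moving average of frames.
    Real TAA also reprojects the history buffer with motion vectors and
    clamps it against the current frame to limit ghosting."""
    return (1 - alpha) * history + alpha * current

# A pixel sitting on an edge is half-covered: successive jittered samples
# land alternately inside (1.0) and outside (0.0) the geometry.
samples = [0.0, 1.0] * 100
history = samples[0]
for s in samples[1:]:
    history = taa_accumulate(history, s)

print(round(history, 1))  # 0.5: converges near the true edge coverage
```

The estimation step ("what should this pixel really be, given the history?") is where the quality lives, and it's the part DLSS replaces with ML.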
 

A.Romero

Member
modern anti aliasing methods are complete ass, that's how.

TAA is also just an algorithm that tries to estimate pixels in order to smooth edges... it just doesn't do it well.

DLSS uses a way more sophisticated form of the same idea. it uses machine learning to estimate more pixel detail and in the process also produces smooth edges.

this basically makes it the best modern form of anti-aliasing, and it adds detail that increases the preserved resolution

I just thought regular AA would see a direct benefit from native resolution rendering. I'm really surprised at ML beating native even with shitty AA.
 

Stuart360

Member
Can we stop praising DLSS 2.0? That tech should be a solution for weaker hardware, not something devs should force upon us to cover for shitty performance optimization. Not saying DS runs bad without it, but I cringe every time someone calls DLSS the future. And if your argument is that it looks sharper, that's because of its filter. You can easily add sharpness to the original image via filters, and I'm saying this because a lot of Nvidia shills around here post that Control comparison where DLSS 2.0 looks much sharper than without, but they ignore the fact that the original with sharpening will still look superior. Again, DLSS 2.0 is a nice feature, but let's not praise it so much that it becomes a tool for devs to excuse poor performance.
Are you really complaining about a fantastic new tech that is going to allow mid-range and low-range PC users to achieve results beyond what they would usually be able to on their hardware?
 
Do they cover how the game runs on an rx570 or 580 (similar to a PS4 pro)?

I'm just curious to see how these cards handle the game in its default settings (on a very weak CPU while we are at it).
 

Krappadizzle

Gold Member
Rain drops are gone with DLSS (26:13)

"DLSS looks better than native"

Mmmkay. We're gonna pretend like Nvidia doesn't sponsor DF & they haven't been shilling it rabidly even in its 1.0 incarnation.

Or you could just look at the evidence, which would challenge the bullshit you just typed up. DLSS 1.0 was never praised by DF or really any outlet. It was almost exclusively looked at like, "what's the point?" 2.0 is a massive, massive upgrade and actually does what they said 1.0 was going to do. There's plenty of evidence from multiple outlets showing how DLSS 2.0 can be, and often is, better than 4K native.

But you do you booboo:

 
Do they cover how the game runs on an rx570 or 580 (similar to a PS4 pro)?

I'm just curious to see how these cards handle the game in its default settings (on a very weak CPU while we are at it).
  • Performance on older GPU is quite good but RX 580/GTX 1060 can't do 30fps at 4K
  • RX580/GTX 1060 hover in the low 40's to low 50's at 1440p. RX 580 around 14% faster overall
  • They can do 1080p/60fps Highest Settings rather easily
 

reksveks

Member
1) love Sony fanboys criticising PC hardware for not running at native 4K60 whilst completely ignoring the fact that the settings within the benchmarks aren't the same
2) love the 'backseat devs' complaining about devs using DLSS with the line of thought that devs should just properly optimise the game. 95% of these people wouldn't have a clue where to start, and the other 5% are totally ignorant of the context of the software in comparison to time/financial resources.

now back to the point, quite interesting that they are using DirectX 12 as the underlying API and not Vulkan. So now we have had Watch Dogs Legion, Cyberpunk and Death Stranding being DX12-first, and two of those will have needed to be ported to Vulkan/OpenGL/GNMX.
 

reksveks

Member
Do they cover how the game runs on an rx570 or 580 (similar to a PS4 pro)?

I'm just curious to see how these cards handle the game in its default settings (on a very weak CPU while we are at it).

‘the RX 560 is effectively more powerful than the base PS4, which runs the game at native 1080p. In our tests, we found that the Radeon RX 580 proved effective at delivering 1080p at 60 frames per second’ from the article but unsure of the settings
 

CrustyBritches

Gold Member
-It's cool how they've redone both the pre-rendered and real-time cutscenes for 60fps on PC.
-That 4xAF looks like shit on PS4 Pro, now I see why they don't allow the option on PC.
-DLSS really helps out with hair and plant quality, TAA even at 4K is gonna give me a seizure.
-DLSS Quality looks better in general than Native 4K TAA.
-DLSS Quality looks better than AMD CAS by a good amount due to the extensive shimmering on CAS.
-All RTX cards can run the game at default console settings + 16xAF at 4K DLSS Quality/60fps, with the exception of the 2060 that required 4K DLSS Performance.
-RX 580 runs the game 20-25fps at native 4K which is twice the resolution PS4 Pro is running the game at with 30fps.
-RX 580 runs the game 45-50fps at 1440p, 11% lower res than PS4 Pro, but with 50% higher performance.
-Guru3D has GTX 1060 better than RX 580, with 59fps and 54fps avg, respectively, at 1440p.
-Guru3D has RX 470 stock at 44fps avg at 1440p with 'Very High' settings.
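For what it's worth, the resolution math in the bullet points above checks out, assuming the PS4 Pro uses 2160p checkerboard rendering that shades roughly half the pixels of native 4K per frame (a common approximation; the exact cost depends on the implementation):

```python
# Pixel-count arithmetic behind the RX 580 comparisons above.
native_4k = 3840 * 2160            # 8,294,400 pixels
qhd_1440p = 2560 * 1440            # 3,686,400 pixels
pro_checkerboard = native_4k // 2  # assumed: ~half the pixels shaded per frame

print(native_4k / pro_checkerboard)                 # 2.0: "twice the resolution"
print(round(1 - qhd_1440p / pro_checkerboard, 2))   # 0.11: "~11% lower res"
```

So native 4K is about twice the Pro's shaded pixel count, and 1440p is about 11% below it, matching the claims in the list.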
 

Tschumi

Member
Besides cyberpunk, this is my last chance to retain my hitherto lifelong love for gaming.

Part of me wants it to fail and let me devote all of my attention to my life ^_^
 

ZywyPL

Banned
The game looks and runs great, but I have to say, what a lazy-ass port. The options in the settings menu don't do shit. What the hell have they been doing all that time?
 
Just watched the video. Imo it was a fine video, but what always aggravates me about comparison videos like these is that they never show an exact comparison. They show GPUs at lower resolution hitting higher fps, and then more expensive ones hitting the same pseudo-resolution with higher fps. I want to see, as close as you can get, a setup with the same pseudo-resolution and same fps as the PS4 Pro, plus the parts breakdown and cost. It would answer a lot of questions and put things into perspective imo.
Well, why do you think they didn't do that? That wouldn't go well with their agenda. The aim of DF is not to promote the PlayStation consoles in any way, and it never has been since its inception. So if the PS4 punches way above its weight, well, they won't show it.

Why do you think the article is titled "The upgrade we've been waiting for"? Because of better AO, AF and 4K60fps that's not even stable on a 2080 Ti? That's what they were waiting for?

But when they talk about a Switch port with 2x less framerate, 4x less resolution, and all effects reduced significantly (PS2 levels of assets and effects), suddenly the port is a technical achievement and the Switch a miracle of hardware. DF are so transparent.
 
Are you really complaining about a fantastic new tech that is going to allow mid-range and low-range PC users to achieve results beyond what they would usually be able to on their hardware?
I kind of understand where he is coming from. In the future devs could be like: "Here is a poorly optimized game, but it plays fine with DLSS on."
 
I'm disappointed that no one criticising/reviewing this game has pointed out how jarring and annoying the 'amazing animations' are, because to me they're just absolutely not amazing. In general the animations look great, sure, but the character looks like he's floating around the world, just sliding along the ground, and it doesn't look natural. In the end, animations that are really good in a vacuum end up giving the character models a really floaty look.

I mean, in this video, look at his head as he moves across terrain of different heights. Just look at the footage of him running across the rocks at 21:40. How does this not get more criticism? It's SO jarring and looks completely horrible to me. Head and body just sliding up and down so unnaturally.

Does this bother anyone else or is it just me?
 
1.9 was shit as well, but you can go through their old videos. Compared to everyone else they couldn't stop licking Huang's vaseline off his fingers quick enough. And ofc were downplaying CAS etc



There are some advantages in motion, but there are many things Alex doesn't mention for fear of upsetting his daddy, such as how, when he's comparing eyelashes, DLSS Quality shows evident painterly lines instead of trying to approximate the hairs as long, stringy strands (aka like hair actually is), the rain drops get lost as detail, etc. And ofc he never mentions that DLSS has sharpening built in, and a lot of what he raves about as "reconstructing" or showing more detail owes more to the sharpening than to the reconstruction itself. There are a lot of subtleties that he chooses to gloss over (because he does know about them) while he simply assigns the praise to DLSS.

That's why I say they're shills. There's no objectivity & context to what they're showing, they just want to sell you on DLSS (and hopefully bag more sponsored videos from Nvidia).

Scumbags.

DLSS is for the kind of people who thought those early DNR-scrubbed Blu-rays and SweetFX/ENB fuckfest profiles were acceptable.
 