
Death Stranding - In-depth PS5 vs Nvidia & AMD PC - Performance Deep Dive

Same thing, basically. It says "Model Detail" in the settings menu; it's what controls the LOD/draw distance in the game.
They say PC has higher LOD options. You said PS5 is using max LOD on PC. Those two things don't agree.
Can anyone confirm?
Edit
It might have zero effect on the bench scene. I'm just curious.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Was quite spot on when I said this:


Check this out Black_Stride

Is your game up to date?

Some Death Stranding cutscenes have softlocks to 60fps.
Double check you aren't soft-lock limited.
Set everything to low and run the scene again.

While your 3070 might drop frames as well, I guarantee it pulls much higher highs and will average out higher.
 

Md Ray

Member
Is your game up to date?

Some Death Stranding cutscenes have softlocks to 60fps.
Double check you aren't soft-lock limited.
Set everything to low and run the scene again.

While your 3070 might drop frames as well, I guarantee it pulls much higher highs and will average out higher.
Yep, I always keep my games, drivers, and OS up to date. The cutscene does seem to be limited to 60fps. I tried 1440p and it's locked at 60 there, but as you can see, at PS5 settings and res it has similar frame rate dips to the PS5 as the scene progresses.
They say PC has higher LOD options. You said PS5 is using max LOD on PC. Those two things don't agree.
Can anyone confirm?
Edit
It might have zero effect on the bench scene. I'm just curious.
DF's article is slightly contradictory. They say PS5 has higher LOD than the Pro. When you look at the Pro, its draw distance is equivalent to PC's 'Default'; one notch above Default is the max 'Very High' setting. So according to them, PS5 should be using Very High, and that's what I used in the above screenshots.

I also separately benchmarked using the Pro's 'Default' LOD and, as I said, the perf is identical to the max 'Very High' LOD.
 
Last edited:
But that's what we're seeing here... See post #86.
Your test is with Vsync on, yes? And on max details? According to NXG, PS5 still runs at PC default settings. He also says there might be a ~10% performance penalty due to Vsync on PS5, which would put its average at around 62fps. Run your 3070 with Vsync off and I believe you will get close to 80fps. Comparing performance when both machines are hard-capped to 60fps honestly isn't a true measure. A non-Vsynced 3070 might have much, much higher highs, translating to much higher averages than the PS5.
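To make that back-of-the-envelope math explicit (just a sketch; the ~56fps capped average is an assumed figure for illustration, not something measured):

/* Rough estimate of an uncapped average from a Vsync-limited one,
   assuming a flat ~10% Vsync cost as NXG suggests. */
#include <stdio.h>

int main(void) {
    double capped_avg = 56.0;  /* hypothetical PS5 average under Vsync */
    double vsync_cost = 0.10;  /* assumed ~10% penalty */
    printf("~%.0f fps uncapped\n", capped_avg / (1.0 - vsync_cost)); /* ~62 fps */
    return 0;
}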
 

rnlval

Member
Like you're doing now...

At least I have data to back up my claims.

Here's 2700X vs top-tier CPUs w/ RX 6900 XT in Death Stranding (benchmark by: HUB/TechSpot)

Like I've been saying, a Zen+ Ryzen is plenty capable of running this game at well over 60fps; look at the 1080p result: it's averaging 150+fps. Now look at the 4K results.

DS.png


Sure...
FYI, Death Stranding was designed for Jaguar's 128-bit SIMD (SSE4.x, or the AVX v1 128-bit subset), not 256-bit AVX2.

256-bit AVX2 usage needs to be programmer-targeted. Death Stranding is not running large-scale PhysX-style effects.
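As a minimal sketch of what "programmer targeted" means here (illustrative only, not code from the game): the 128-bit path below builds for any SSE2-class CPU such as Jaguar, while the 256-bit path only exists if the programmer or the build flags explicitly opt into AVX2 (e.g. -mavx2).

#include <immintrin.h>

/* 128-bit integer add: 4 lanes, runs on SSE2-era hardware like Jaguar. */
__m128i add_4_ints(__m128i a, __m128i b) {
    return _mm_add_epi32(a, b);
}

/* 256-bit integer add: 8 lanes, only compiled when AVX2 is targeted. */
#ifdef __AVX2__
__m256i add_8_ints(__m256i a, __m256i b) {
    return _mm256_add_epi32(a, b);
}
#endif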
 

rnlval

Member
In terms of CU count, there isn't a direct equivalent.
The PS5 has 36 CUs. The 6600 XT has 30. And the 6700 XT has 40.
All have 64 ROPs, although they are different.
The 6700 XT has more TMUs, but it would be clocked lower, so it would better match the PS5.

There are many more differences, so finding an exact match is impossible.
But an underclocked 6700 XT would be a much better match than a 2070 or a 5600 XT.
FYI, 6600 XT has 32 CUs (or 16 DCUs) https://www.techpowerup.com/gpu-specs/radeon-rx-6600-xt.c3774
 

kyliethicc

Member
Love this game. Nice to see that Kojima Productions is actually using the PS5 hardware to deliver a legitimate leap from the PS4 Pro version, not just a lazy port.

And the whole PS5 vs. PC GPU question varies game by game. Easy enough to say that the PS5 delivers gaming performance similar to cards like the 3060 Ti, 2080, 2070 Super, 6600 XT, 5700 XT, etc.
 
Last edited:

Md Ray

Member
FYI, Death Stranding was designed for Jaguar's 128-bit SIMD (SSE4.x, or the AVX v1 128-bit subset), not 256-bit AVX2.

256-bit AVX2 usage needs to be programmer-targeted. Death Stranding is not running large-scale PhysX-style effects.
I know... I don't recall saying it was designed for any specific microarchitecture.
 
Last edited:

Md Ray

Member
Your test is with Vsync on, yes? And on max details? According to NXG, PS5 still runs at PC default settings. He also says there might be a ~10% performance penalty due to Vsync on PS5, which would put its average at around 62fps. Run your 3070 with Vsync off and I believe you will get close to 80fps. Comparing performance when both machines are hard-capped to 60fps honestly isn't a true measure. A non-Vsynced 3070 might have much, much higher highs, translating to much higher averages than the PS5.
Nah, it's Vsync off; I did not engage any external or in-game fps cap either. I also ran using PC default settings separately and saw identical perf to the screenshots I posted here.
 

Md Ray

Member
Ok. Your pictures say 2070.

So do you have any idea why it's doing that in this game? Is it a VRAM limitation?
One is NX Gamer's (at the top), which has the 2070 and PS5 in it; the other is my own screenshot (at the bottom), which has the MSI Afterburner OSD saying "3070". I thought it was obvious?

Anyway, you can see VRAM consumption there as well. It's under 6GB, so that's not the issue. As NXG said, it's a combination of the engine favoring AMD's architecture, the driver cost of the DX12 layer, and all of that overhead affecting the PC, including Vsync, though I had turned that off in my test.

The API and driver overhead on consoles is very lean in comparison, so devs are able to get the most out of their HW.
 
^Ah. Now it makes sense. Then I guess cutscenes really aren't a good gauge for measuring performance. The better the graphics card, the bigger the handicap. A 3090 might be running the cutscene at a consistent 90-plus fps, but the reading will still show 60fps.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
60fps. The cutscene is limited to 60fps in that instance; as soon as you enter gameplay it's fully unlocked up to 240fps.
^Ah. Now it makes sense. Then I guess cutscenes really aren't a good gauge for measuring performance. The better the graphics card, the bigger the handicap. A 3090 might be running the cutscene at a consistent 90-plus fps, but the reading will still show 60fps.


It's a purely academic test to see where the GPU drops frames, since you are still softlocked to 60fps.
Naturally your average will be around 60fps, with no way to actually average out above 60fps.
Any GPU that can handle 60+ will average out to 60fps with the softlock engaged.
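A toy illustration of why the cap clamps the average (made-up frame numbers, not from any of these benchmarks):

#include <stdio.h>

int main(void) {
    /* Hypothetical per-second fps readings a faster GPU could hit uncapped. */
    double uncapped[] = { 90.0, 110.0, 75.0, 130.0 };
    int n = sizeof(uncapped) / sizeof(uncapped[0]);
    double sum_uncapped = 0.0, sum_capped = 0.0;

    for (int i = 0; i < n; i++) {
        sum_uncapped += uncapped[i];
        /* The 60fps soft lock throws all the headroom away. */
        sum_capped += (uncapped[i] > 60.0) ? 60.0 : uncapped[i];
    }
    printf("uncapped avg: %.1f fps\n", sum_uncapped / n); /* ~101 fps */
    printf("capped avg:   %.1f fps\n", sum_capped / n);   /* 60 fps  */
    return 0;
}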

Did you try the TAB trick during the cutscene? It works during long cutscenes, but I don't know if it works on every cutscene.
Press TAB a few times when the cutscene starts and see if it'll let the scene go above 60fps.
 

Md Ray

Member
Did you try the TAB trick during the cutscene? It works during long cutscenes, but I don't know if it works on every cutscene.
Press TAB a few times when the cutscene starts and see if it'll let the scene go above 60fps.
I just gave that a try, nope. It's still locked at 60.

This is right after that cutscene ends:

ivLERHA.jpg
 
Last edited:
NXG is testing the Director's Cut on PS5, right? PC didn't get that update. It's possible that the DC has the benefit of some engine optimizations which haven't yet trickled down to PC.

But in any case, what's clear is that Decima favours AMD's architecture over Nvidia's. It's very apparent when a 5600 XT is keeping up with a 2070S-level card. But then, Nvidia can earn image quality and performance back by simply enabling DLSS. Drawing a conclusion is pretty difficult. But I think we can safely say DS is a high point for PS5 performance.
 

Md Ray

Member
NXGamer delivers his aperitif after the meal, thus making it a digestif... but we drink it either way.




Might want to edit the OP with NXG's aperitif video so people entering the thread already have that knowledge.

I hope SlimySnake is watching this.
He's pairing the Nvidia GPU with a notoriously bad CPU. If he had paired it with the 3600 CPU he used for the 5600 XT comparisons, the 2070 would perform better.
The video just proves how wrong this statement is, for this particular title.

I agree that when evaluating perf between multiple PC GPUs you should stick with a single powerful CPU, but as I said in my initial post, at 4K the CPU doesn't really matter much for this game -- especially when the graphics card you're using is something like a 2070/5600 XT at 4K, which is already dipping into the mid-40s to 30fps. It was very much a GPU-bound stress test at that point.
 
Last edited:

nemiroff

Gold Member
NXG is testing the Director's Cut on PS5, right? PC didn't get that update. It's possible that the DC has the benefit of some engine optimizations which haven't yet trickled down to PC.

But in any case, what's clear is that Decima favours AMD's architecture over Nvidia's. It's very apparent when a 5600 XT is keeping up with a 2070S-level card. But then, Nvidia can earn image quality and performance back by simply enabling DLSS. Drawing a conclusion is pretty difficult. But I think we can safely say DS is a high point for PS5 performance.

Wait wat, is that confirmed, why would he do that?
 

Dream-Knife

Banned
One is NX Gamer's (at the top) which has 2070 and PS5 in it, one is my own screenshot (at the bottom) which has MSI Afterburner OSD saying "3070", I thought it was obvious?

Anyways, you can see VRAM consumption as well there. It's under 6GB, so that's not the issue. As NXG said, it's a combination of the engine favoring AMD architecture, driver cost to DX12 layer and all of that overhead affecting the PC including Vsync, but I had turned that off in my test.

The APIs and driver overhead cost on consoles are very lean, in comparison, so devs are able to get the most out of their HW.
You should RMA your card then.

3070 gets mid-70s in Death Stranding.


Also, from this video review, only Performance mode is locked at 60. Quality mode is max settings at 4K, but has dips into the low 40s.
 
Last edited:

Clear

CliffyB's Cock Holster
NXG is testing the Director's Cut on PS5, right? PC didn't get that update. It's possible that the DC has the benefit of some engine optimizations which haven't yet trickled down to PC.

DC also does more work visually than the original release. For example, the water simulation is much higher precision, and there are several new player-buildable structures available, making the landscape generally busier.

NX does the right thing by testing primarily in areas/situations where these changes don't come into play.
 
DC also does more work visually than the original release. For example, the water simulation is much higher precision, and there are several new player-buildable structures available, making the landscape generally busier.

NX does the right thing by testing primarily in areas/situations where these changes don't come into play.
All true. I'm just saying that a comparison is best done on a like-for-like basis as much as possible. Different versions, frame caps, etc. hinder comparability.
 

vkbest

Member
I'm surprised people are still comparing Vsync off on PC vs. Vsync on on consoles, and then are surprised by videos like this. Vsync penalizes performance.

Even capping the frame rate penalizes performance compared to leaving it unlocked (you will get bigger drops capped vs. uncapped).
 

Md Ray

Member
You should RMA your card then.

3070 gets mid-70s in Death Stranding.
Lol, you're overreacting a bit there. The card is perfectly fine; that's how it performs in certain GPU-heavy sections.

I literally just posted this above. It IS in the mid-70s here and even goes beyond 80fps in other areas of the map, as the avg. numbers from Guru3D and other outlets have shown:
I just gave that a try, nope. It's still locked at 60.

This is right after that cutscene ends:

ivLERHA.jpg
 
Last edited:
All good stuff, and that's even without proper PS5 code. DS is, in its intrinsics, still a ported-over cross-gen title...
People jump on the "consoles are weak" narrative at the beginning of every gen.
Look at the early and infamous DF comparisons of PS4 vs a GTX 750 Ti...
The PS4 tended to lose those battles, although everyone knew it was the early code and lack of proper API usage that made the GTX 750 Ti pull ahead of the PS4.
The advantage of 8GB of GDDR5 RAM, and the need for it, was also dismissed in forums.
PC folklore was "2GB of VRAM will be fine".
Look now at how you play Far Cry 6 or HZD on such cards...

PS5 will pull away from even cards like an RTX 3070 on a regular basis in a few years.
 
Last edited:

ACESHIGH

Banned
Apples to oranges comparison. Two different versions of the game. We should compare them if the Director's Cut edition is released on PC.
 
So we can conclude that the PS5 punches above its weight when compared to Nvidia GPUs in games where Nvidia GPUs punch below their weight.
It also seems to perform right around its weight when compared to AMD GPUs.
Water is wet?
 

yamaci17

Member
Perhaps it's a combination of faster GPU (faster than 2070, I mean), little to no driver overhead, and lower-level GNM API allowing it to punch above its weight.

lit. just lit bro! this. so much this.

Whether others accept it or not, the PC is just like emulation itself. Games are completely written for console architectures where the focus is on unified memory, a unified CPU+GPU with minimal latency between them, and a superior API written directly to extract performance out of fixed hardware.

What does Ampere have? 850 MB of Nvidia driver bloat. Yes, that's about it... and a wonky Windows to back it up, which has no focus on gaming whatsoever. I, like you, have a 3070 myself and have no problem with these facts. I had a literal PS4-equivalent GPU back then. It performed nothing like the PS4 in games beyond 2016; up to that point, I can even say I got better perf compared to the PS4, but then it all went downhill when newer cards came out. I used to run W3 and GTA 5 a bit above PS4 level, but magically that got worse later on. Consoles are wizardry, and some people will still keep claiming they have hidden lower-than-low settings, which makes them go "but I cannot see the differences when using console-approximate settings compared to actual consoles, they just look the same".
 

yamaci17

Member
So we can conclude that the PS5 punches above its weight when compared to Nvidia GPUs in games where Nvidia GPUs punch below their weight.
It also seems to perform right around its weight when compared to AMD GPUs.
Water is wet?

Ampere's architecture is not gaming-oriented like RDNA 2 is. It can't even punch "its weight" unless the developer has actively optimized and tweaked their game for the arch. The same applied to Kepler: it had "driver codepaths" made by Nvidia to keep the "performance level" it had. Once those code paths perished, incidents like these happened:


You see, Nvidia fixed the problem by making Maxwell GCN-like. That way, devs could translate the GCN benefits of consoles to Maxwell/Pascal easily. But Nvidia made another turn with Turing/Ampere: these two archs are actually oriented towards various other workloads like Blender, compute and such, just like Kepler. This is why they cannot even punch above their weight in freaking Nvidia-sponsored titles. Check out WD: Legion, an Nvidia-sponsored title, yet the 6700 XT performs a tiny bit above the 3070 at 1080p and 1440p and ties with it at 4K.

RDNA 2 is a purely gaming-focused architecture that just works. I'm pretty sure Nvidia will pull another Maxwell with Lovelace and make it gaming-oriented once again.

The only areas where this generation of NV GPUs is ahead:

- Ray tracing. But specifically, Nvidia-sponsored ray tracing. And that seems to be waning.
- DLSS. But that also seems to be waning. More than 75% of the AAA titles released in 2021 did not have DLSS, which makes the DLSS point moot. Nvidia made some deals in 2020 and made it seem like DLSS was going to be huge and in a lot of games, however their DLSS game was pretty weak in 2021. If it ends up like this, it will be in the same class as PhysX, G-Sync and many other Nvidia-only features. I myself have the card, and I feel like I got shafted by Nvidia. Oh, new DLSS games. Oh no, indie Unreal Engine 4 games. Oh, a new AAA DLSS implementation. Oh no, Shadow of the Tomb Raider, which already runs at 4K 60fps on my 3070.
 

Md Ray

Member
All good stuff, and that's even without proper PS5 code. DS is, in its intrinsics, still a ported-over cross-gen title...
People jump on the "consoles are weak" narrative at the beginning of every gen.
Look at the early and infamous DF comparisons of PS4 vs a GTX 750 Ti...
The PS4 tended to lose those battles, although everyone knew it was the early code and lack of proper API usage that made the GTX 750 Ti pull ahead of the PS4.
The advantage of 8GB of GDDR5 RAM, and the need for it, was also dismissed in forums.
PC folklore was "2GB of VRAM will be fine".
Look now at how you play Far Cry 6 or HZD on such cards...

PS5 will pull away from even cards like an RTX 3070 on a regular basis in a few years.
Man, the abuse that would get hurled at me for saying the 750 Ti was a weaker GPU than the PS4's GPU at the time, lol.
It also seems to perform right around its weight when compared to AMD GPUs.
Much better when you take PS5's shared bandwidth into account. The 5700 XT gets all 448GB/s to itself; the PS5 doesn't, yet it has a higher avg. fps than a PC with a 5700 XT.
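Rough illustrative arithmetic only (the CPU traffic figure below is purely an assumption, not a measured number): on a unified-memory console the GPU only gets whatever bandwidth the CPU and I/O aren't using.

#include <stdio.h>

int main(void) {
    double total_bw    = 448.0; /* GB/s: PS5's shared GDDR6 pool */
    double cpu_traffic = 40.0;  /* GB/s: hypothetical CPU/IO share */
    printf("effective GPU bandwidth: ~%.0f GB/s, vs the 5700 XT's dedicated 448 GB/s\n",
           total_bw - cpu_traffic);
    return 0;
}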
 
Last edited:

yamaci17

Member
He's pairing the Nvidia GPU with a notoriously bad CPU. If he had paired it with the 3600 CPU he used for the 5600 XT comparisons, the 2070 would perform better.

Not really sure of the point of comparing the 5600 XT with a different CPU. I can somewhat see him justifying using the 2700 since it's the closest performance-wise to the CPU present in the PS5, which could potentially be holding the PS5 GPU back, but if you are doing comparisons between two "PC" GPUs, you have to keep everything else constant. That includes RAM and the CPU. He needs to be consistent. Either use the 3600 CPU with both cards or use the 2700 with both cards. You can't pick and choose. I have asked him to re-test his 2070 results with the 3600, but he just refuses to. I don't know why he can't just take the GPU out of one system and plug it into another. He is clearly willing to use the 3600 in tests, so why not use it with all GPUs?

The 3070 should perform like a 2080 Ti with a better CPU since they are pretty much on par with one another.

death-stranding-3840-2160.png

You can't simply go by these tables. They are not indicative of actual game experience.

I would like to share an observation I made about Hardware Unboxed and their tests and conclusions:



According to him, an R5 3600 is fine for an RTX 3080 in AC Odyssey: the 3600 pushes an average of 100 frames in AC Odyssey, and a 10900K equals the 3600 since the performance is GPU-bound, even at 1080p.

but let's take a look at the reality, shall we?



Oh. A huge bottleneck. Near 64-70 fps. Practically, the 3700X (not even a 3600) here is struggling to push a 6700 XT to its maximum at 1080p.

But Steve from HU showed me a graph where the 10900K and the 3600 both feed the 3080 at 1080p with an average of 100 frames... right...

Practically, if you don't listen to HU and get a 10900K for your 6700 XT at 1080p, you will actually get 85+ fps in that particular scene, whereas you are getting 64-65 fps and lower utilization with the 3700X.
fQxTHpz.png


It's only when you're climbing, looking at the ground, looking at the sky, or in a completely empty field that you will actually see a 90+ fps average with a Ryzen 3600 or 3700X in maxed-out AC Odyssey. I'm not saying HU has done wrong; I don't know where they tested in-game. But it's clear that their numbers are a bit "bloated" for both CPUs and GPUs. So where am I going with this? You can take these benchmarks seriously when you're comparing performance between GPUs and CPUs, but you should not take them seriously when you want to learn how the game actually performs.

So when you see an RTX 3070 averaging 85 fps at 4K in Death Stranding in any graph you encounter, you can bet they did not do an extensive test of actual gameplay. Most likely they played the first 5-10 minutes and found a comfortable area where they could easily benchmark.

Whereas in the NX Gamer video I'm watching, there's action and stuff. And when there's not, the 2070 performs better, like the graphs would suggest.

I hope I made my point clear, have a good day.
 

ClosBSAS

Member
All good stuff, and that's even without proper PS5 code. DS is, in its intrinsics, still a ported-over cross-gen title...
People jump on the "consoles are weak" narrative at the beginning of every gen.
Look at the early and infamous DF comparisons of PS4 vs a GTX 750 Ti...
The PS4 tended to lose those battles, although everyone knew it was the early code and lack of proper API usage that made the GTX 750 Ti pull ahead of the PS4.
The advantage of 8GB of GDDR5 RAM, and the need for it, was also dismissed in forums.
PC folklore was "2GB of VRAM will be fine".
Look now at how you play Far Cry 6 or HZD on such cards...

PS5 will pull away from even cards like an RTX 3070 on a regular basis in a few years.
But it won't... Far Cry 6 is not a very optimized game; it's an open-world Ubisoft game which will have like seven 20GB patches, like AC Valhalla still does months after release, and it will still be badly optimized. BF 2042 runs better than Far Cry 6 and looks better on PC.

2GB was enough for last gen if you think about it. AC Unity, which didn't even run properly on consoles, you could hit 60 fps in with an Nvidia 680 at the time; Watch Dogs as well.

PS4 games like Uncharted and Horizon Zero Dawn run, and will keep running, much better on PC with low specs. PS5 is a great machine, don't get me wrong, I have them all, but it's just nowhere near a gaming PC with all the bells and whistles.

Also, Far Cry 6 uses RT, which the consoles don't, so that's a big performance hit. PS5 is not weak now, but it will be in a year or so when the 4xxx series comes out. So I disagree where you say that in a few years the PS5 will be up there. I think the longer you go into a console generation, the more obsolete the consoles become, as history has always told us.
 
About his rant that DLSS is dog shit:

Here you go (it also proves the insanely good optimisation Death Stranding has on PC; they can't even be bothered to update their own game, yet they can build an entire new version of the game for the PS5 alone). Yeah, gg.


Not using DLSS on a 2070, even while it has hardware specifically catered towards it, is idiotic and useless whether the PS5 pushes native or not. Hell, the PS5 should have used CB; it can't hold 60 for shit. Imagine the tearing.

The experience with both solutions is nearly identical, and I bet the PC version is a lot less optimized, probably features higher settings, and runs on a GPU that's not built for 4K to start with and a CPU that's inferior to the PS5's.

Also, why even bench anything with Vsync on? Nobody uses it on PC, for good reasons.

Fucking lol.




Put the new DLSS in and there are no more trailing artifacts; not too hard, that. This is why the DF boys are pros at what they do and the rest are only trying to catch up... while some have different agendas entirely.

And Vsync? Seriously, what year is it, 2010? VRR displays are old news already.

Anyway, it's surprising to see the PS5 dropping to 40 fps, wow, talk about disappointment. Should have used CB for a locked 60 indeed.

Tried that place and nothing out of the ordinary, as per usual roughly a 2x perf uplift; not sure why the 2070 drops so low. [maybe he didn't even enable triple buffering lol]

ps5issuchbeastwowza.png


ehmmaybenot.png
 

yamaci17

Member
But it won't... Far Cry 6 is not a very optimized game; it's an open-world Ubisoft game which will have like seven 20GB patches, like AC Valhalla still does months after release, and it will still be badly optimized. BF 2042 runs better than Far Cry 6 and looks better on PC.

2GB was enough for last gen if you think about it. AC Unity, which didn't even run properly on consoles, you could hit 60 fps in with an Nvidia 680 at the time; Watch Dogs as well.

PS4 games like Uncharted and Horizon Zero Dawn run, and will keep running, much better on PC with low specs. PS5 is a great machine, don't get me wrong, I have them all, but it's just nowhere near a gaming PC with all the bells and whistles.

Also, Far Cry 6 uses RT, which the consoles don't, so that's a big performance hit. PS5 is not weak now, but it will be in a year or so when the 4xxx series comes out. So I disagree where you say that in a few years the PS5 will be up there. I think the longer you go into a console generation, the more obsolete the consoles become, as history has always told us.

No, not enough if you actually, really think about it.

A 2GB GTX 770 has the grunt to run RDR 2 and Horizon Zero Dawn at or a bit above PS4 level, yet it has to run low textures. It's just not a pleasant picture quality. You get the full intended textures on PS4/Xbox One.


I don't know why the low textures had to look that bad, but it is what it is, sadly.

AC Origins, a 2017 title, already made 2GB obsolete, at least if you want "acceptable" texture quality;


2GB went kinda EOL at 1080p after 2017. You can guess how 2018-2019-2020 games went. Yeah, all those games ran at 1080p 30 fps, but at least with high textures, thanks to that 8GB total buffer.

On a second note: the GTX 770 doesn't seem to have the grunt to run RDR 2 at PS4 visuals at all. In the video all settings are at their lowest and it still can't hold a consistent 30 fps. A bit tough, considering the GTX 770 was supposed to be 2x faster than a PS4, or at least that's what PC users would have led you to believe in 2014 :)
 
Last edited:

Md Ray

Member
Wait... so PS5 is not even running at max settings? So why are people even comparing it to a 3070?
It's using PC's max for everything but Model Detail (LOD). It's one notch below PC's max according to NXG; water quality on PS5 is higher than on PC, though.

It's well documented, even by DF, that LOD has zero perf impact between Default and Very High, and that's what I'm seeing on my PC as well. Avg. fps with and without max LOD were the same.

All things considered, PS5's perf profile in this game at native 4K is between a 3060 Ti and a 3070 (closer to the 3070), in terms of avg fps.
 

Clear

CliffyB's Cock Holster
I'm surprised people are still comparing Vsync off on PC vs. Vsync on on consoles, and then are surprised by videos like this. Vsync penalizes performance.

Even capping the frame rate penalizes performance compared to leaving it unlocked (you will get bigger drops capped vs. uncapped).

Well, some folks apparently are unaware that you sync otherwise asynchronous processes by making one wait for the other!

It's that fundamental.
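A toy sketch of that wait (hypothetical frame times at a 60Hz refresh, not measured from this game): with double-buffered Vsync, any frame that misses the 16.7ms vblank deadline sits idle until the next one, which is where the penalty comes from.

#include <stdio.h>

int main(void) {
    const double refresh_ms = 1000.0 / 60.0;          /* ~16.7 ms per vblank at 60Hz */
    double render_ms[] = { 14.0, 18.0, 16.0, 20.0 };  /* hypothetical frame times */
    int n = sizeof(render_ms) / sizeof(render_ms[0]);

    for (int i = 0; i < n; i++) {
        /* With Vsync, presentation waits for the next vblank boundary. */
        int intervals = (int)(render_ms[i] / refresh_ms) + 1;
        double synced_ms = intervals * refresh_ms;
        printf("render %.1f ms -> shown at %.1f ms (%.0f fps synced vs %.0f fps unsynced)\n",
               render_ms[i], synced_ms, 1000.0 / synced_ms, 1000.0 / render_ms[i]);
    }
    return 0;
}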
 

ClosBSAS

Member
No, not enough if you actually, really think about it.

A 2GB GTX 770 has the grunt to run RDR 2 and Horizon Zero Dawn at or a bit above PS4 level, yet it has to run low textures. It's just not a pleasant picture quality. You get the full intended textures on PS4/Xbox One.


I don't know why the low textures had to look that bad, but it is what it is, sadly.

AC Origins, a 2017 title, already made 2GB obsolete, at least if you want "acceptable" texture quality;


2GB went kinda EOL at 1080p after 2017. You can guess how 2018-2019-2020 games went. Yeah, all those games ran at 1080p 30 fps, but at least with high textures, thanks to that 8GB total buffer.

On a second note: the GTX 770 doesn't seem to have the grunt to run RDR 2 at PS4 visuals at all. In the video all settings are at their lowest and it still can't hold a consistent 30 fps. A bit tough, considering the GTX 770 was supposed to be 2x faster than a PS4, or at least that's what PC users would have led you to believe in 2014 :)

Wasn't it equivalent to another 700-series card with 3GB? Maybe 2GB is pushing it a bit, but 3GB was definitely enough to run PS4 games.
 
Ampere's architecture is not gaming-oriented like RDNA 2 is. It can't even punch "its weight" unless the developer has actively optimized and tweaked their game for the arch. The same applied to Kepler: it had "driver codepaths" made by Nvidia to keep the "performance level" it had. Once those code paths perished, incidents like these happened:


You see, Nvidia fixed the problem by making Maxwell GCN-like. That way, devs could translate the GCN benefits of consoles to Maxwell/Pascal easily. But Nvidia made another turn with Turing/Ampere: these two archs are actually oriented towards various other workloads like Blender, compute and such, just like Kepler. This is why they cannot even punch above their weight in freaking Nvidia-sponsored titles. Check out WD: Legion, an Nvidia-sponsored title, yet the 6700 XT performs a tiny bit above the 3070 at 1080p and 1440p and ties with it at 4K.

RDNA 2 is a purely gaming-focused architecture that just works. I'm pretty sure Nvidia will pull another Maxwell with Lovelace and make it gaming-oriented once again.

The only areas where this generation of NV GPUs is ahead:

- Ray tracing. But specifically, Nvidia-sponsored ray tracing. And that seems to be waning.
- DLSS. But that also seems to be waning. More than 75% of the AAA titles released in 2021 did not have DLSS, which makes the DLSS point moot. Nvidia made some deals in 2020 and made it seem like DLSS was going to be huge and in a lot of games, however their DLSS game was pretty weak in 2021. If it ends up like this, it will be in the same class as PhysX, G-Sync and many other Nvidia-only features. I myself have the card, and I feel like I got shafted by Nvidia. Oh, new DLSS games. Oh no, indie Unreal Engine 4 games. Oh, a new AAA DLSS implementation. Oh no, Shadow of the Tomb Raider, which already runs at 4K 60fps on my 3070.
To me, a 3070 should be roughly equal to a 6700xt without heavy RT.
 