
Digital Foundry: Resident Evil 4 Remake First Look: A Classic in the Making? PS5, PS4, Xbox Series X and PC Tested

kingyala

Banned
It's dynamic res. Without RT and Hair Strands, the resolution mode is around 1800p CB, exactly like Horizon Forbidden West pre-patch, before they fed the reconstruction more data to get rid of the shimmering. That's basically what Capcom needs to do: implement checkerboarding more like FSR 2.0. Otherwise, 2.8 million pixels checkerboarded will always look like trash in motion, like HFW did pre-patch. 4K CB is 4.1 million pixels, about 44% more, which is why we didn't see these issues in OG HZD, RE8 and other 4K CB games on the Pro.
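For reference, here's the back-of-the-envelope pixel math behind those numbers (a rough Python sketch; checkerboarding here just means shading half the output grid each frame):

# Checkerboard rendering shades half the pixels of the output grid per frame.
def cb_pixels(width, height):
    return width * height // 2

for name, (w, h) in {"1440p": (2560, 1440), "1800p": (3200, 1800), "4K": (3840, 2160)}.items():
    print(f"{name} CB: {cb_pixels(w, h) / 1e6:.2f}M pixels/frame")

# 1440p CB: 1.84M, 1800p CB: 2.88M, 4K CB: 4.15M
# -> 4K CB shades roughly 44% more pixels per frame than 1800p CB.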

As for why this game is so demanding: to me, it definitely looks better than RE2 and RE3. I played RE3 like 20 times last month, and it simply doesn't look as good in any area. This game has better lighting, better textures, and way bigger environments. RE2 and RE3 both feature very small indoor levels, with outdoor levels essentially designed as larger corridors. They also don't have to worry about rendering foliage, which is way heavier on the GPU than city buildings.


It's 1440p checkerboard, and then your TV takes that 1440p image and upscales it to 4K.

And why doesn't it make sense that it's somewhere between 900p and 1080p? 60 fps is hard to do with ray tracing. PS5 and XSX struggled to run Guardians at 1080p 60 fps. Returnal is 1080p native before they use different upscaling techniques to get to 4K CB.

My RTX 2080 couldn't run Control with ray tracing at 1080p 60, so I had to use DLSS at 1440p, which set the internal resolution to 960p, which is basically what 1440p CB is. The PS5 is basically an RTX 2080 in standard rasterization.

Capcom simply needs to offer a locked 30 fps mode with all the bells and whistles. Even on PS4, they have left the framerate unlocked at 900p. It goes up to 45 fps and averages around 40 fps. That's 30-50% of the GPU power left on the table. They should instead lock it at 1080p 30 fps and use DRS whenever it begins to drop below 30 fps, and only then go to 960p or 900p instead of being there 100% of the time.

On the PS5, they can easily do 4K CB 30 fps with RT and hair strands on.
60 fps isn't hard to do with ray tracing. Ratchet and Clank has done it, Spider-Man has done it, and when unlocked they even hit 80 fps. Compared to this PS4 game, the Series X and PS5 should be comfortable running at 1440p 60 with some shadow or puddle reflections. There aren't even any noticeable reflections or shadows here, so where's the performance cost? And the hair strands don't make any sense: Ratchet and Clank uses strand rendering for fur and has 100,000s of strands that are also affected by light, not a dozen like here. It can't be 900p, since the PS4 is already 900p.
 

DenchDeckard

Moderated wildly
it's the same deadzone size.

the difference in RE2 is that its deadzone is a perfect circle I think, while the one in RE4 is a weird square shape.
and the reaction curve is different between the two.

it still is crap in RE2 as well but the game itself is way slower with slower enemies too, so the bad aiming doesn't stand out as much.

RE3, made by a different team, has the same deadzone size on all systems, which is equivalent to the size of RE4 on PS5
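For anyone wondering what a circular vs. square deadzone actually means in code, here's a rough Python sketch (the 0.25 threshold is made up; none of this is Capcom's actual implementation):

import math

def radial_deadzone(x, y, dz=0.25):
    # Circular deadzone: ignore stick vectors shorter than dz,
    # then rescale so output still reaches 1.0 at full deflection.
    mag = math.hypot(x, y)
    if mag < dz:
        return 0.0, 0.0
    scale = (mag - dz) / (1.0 - dz)
    return x / mag * scale, y / mag * scale

def axial_deadzone(x, y, dz=0.25):
    # Per-axis ("square") deadzone: each axis is gated independently,
    # so a circle traced on the stick comes out distorted on screen.
    def gate(v):
        return 0.0 if abs(v) < dz else math.copysign((abs(v) - dz) / (1.0 - dz), v)
    return gate(x), gate(y)

print(radial_deadzone(0.3, 0.2))  # small diagonal survives, direction preserved
print(axial_deadzone(0.3, 0.2))   # y axis zeroed: the input snaps to horizontal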

Oh man, well, I completed RE2 multiple times on Xbox, but I'll just have to see with this. Supposedly the PC version is more or less the same as the Xbox version control-wise, so I'm gonna hang fire on ordering that and wait for launch day. I can't play with the Vaseline filter, so PS5 is out for this.

Did they never patch RE2 on Xbox? As I don't really have a problem with that. :/
 

GymWolf

Member
I use a global 120 fps cap through NVCP.

If I want to play at specific framerates such as 36/40/45, I use NVCP as well; for some reason it feels smoother than RivaTuner in most cases (completely subjective personal experience).
If I'm certain I can hit a locked 60 or 120, I tend to use in-game caps (and if the in-game cap also offers specific values such as 40/45, I tend to use it). In-game caps usually offer the least possible input lag and the most responsive gameplay. If I want to play at 30 fps I usually use in-game caps too, unless the frame cap is busted.
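(As a side note, here's a quick way to sanity-check which caps pace evenly on a fixed-refresh display; a rough Python sketch assuming a 120 Hz panel with VRR off:)

# A cap paces cleanly only when the refresh rate is an integer multiple of it.
refresh = 120
for cap in (30, 36, 40, 45, 60, 117, 120):
    ratio = refresh / cap
    note = "even" if ratio == int(ratio) else "uneven -> judders without VRR/G-Sync"
    print(f"{cap:>3} fps: {ratio:.2f} refreshes per frame ({note})")

# 30/40/60/120 divide 120 Hz cleanly; 36/45/117 depend on VRR for smooth pacing.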

I still use RivaTuner from time to time, but usually in older games. For newer games I tend to go with NVCP.

By the way, NVIDIA's frame cap is very GPU-power aware. If you do not care about input lag, it can severely reduce core clocks and save power. In most cases, the NVCP frame cap will force the GPU to work at 85-90% with reduced clocks. This is an interesting behaviour that can be observed with 2000/3000 cards. I find the input lag bearable with a gamepad; for FPS games, I find it unbearable.

If you're put off by the input lag this behaviour produces, you can force "prefer maximum performance" alongside the cap.

Example: [screenshots showing GPU clocks being reduced under the NVCP frame cap]



It's practically a smart cap that will reduce clocks EXTREMELY aggressively; it will always push clocks just enough to maintain that 36 fps cap. Usually, reduced clocks only happen at certain thresholds of GPU usage; this, however, is not tied to GPU usage, it will always aggressively reduce clocks. I actually find this a very good and overlooked feature of the NVCP frame cap. It is weird that I've seen no one mentioning it. Could be that most people do not use such low caps for any game whatsoever (I'm a weird niche user, so yeah).
I'm a bit confused.

What should I use with a 4080 and an OLED 4K 120 Hz TV if I don't care about power usage and I just want the most stable and input-lag-free experience?

Right now I have vsync forced on and a 117 fps cap from NVCP, and they say it's the best combo, except when games have DLSS3 frame gen; in that case I should only have vsync on, because frame gen + Reflex automatically lowers the max fps or some obscure shit. But now you say that the frame caps from NVCP introduce input lag and lower my GPU's core clocks? Fuck that (or maybe I just didn't understand at all).
 

damidu

Member
Still will wait for day 1 comparisons, but:

Did they ever fix the aim issue for Village on Xbox? (If you did a circle on the gamepad, you would basically get a rectangle motion on screen.)

and how does this one compare to that?
 

kingyala

Banned
You know he's a hard-core PC guy right? He doesn't really care about either console.
He doesn't normally care about consoles, but that's not the issue... the problem started years ago, before the consoles came out. He first claimed SSDs won't improve graphics when the world heard about Mark Cerny's patents on the PS5 I/O. He kept going even after UE5 was demoed on PS5: he made a whole analysis video of that UE5 presentation and never mentioned the SSD once, despite developers saying UE5 was invented with SSDs in mind and that the demo wouldn't be possible without it...

Then he also claimed the Series X's 12 TF would make it perform better than the PS5, and used Control's photo mode as an example... which was educational, but he adamantly claimed that the results of that photo-mode test proved the Series X would always outperform the PS5, about which he was miserably wrong, as the more games came out, the more we saw both consoles performing similarly. So ever since then he's been trying to save face from all the fraudulent assumptions he made by making more doctored analysis videos in favour of the Series X. We even saw recently, when DF were discussing why the PS5 performs better than the Series X most of the time, that John, who spoke to developers, said devs preferred the PS5 API and find it easier to design for, to which Alex jumped in and tried to doctor John's findings by again saying devs don't optimize for Xbox because it doesn't sell more... I've tried my best to follow the guy, but he's lost all credibility. You can go over to the Beyond3D forums and check: any statement made there that discredits the PS5 will have a thumbs up from Alex. Check it yourself... as a media person he should be impartial, but he lies every time, like Fox News or MSNBC.
 

adamsapple

Or is it just one of Phil's balls in my throat?
it's the same deadzone size.

the difference in RE2 is that its deadzone is a perfect circle I think, while the one in RE4 is a weird square shape.
and the reaction curve is different between the two.

it still is crap in RE2 as well but the game itself is way slower with slower enemies too, so the bad aiming doesn't stand out as much.

RE3, made by a different team, has the same deadzone size on all systems, which is equivalent to the size of RE4 on PS5

I literally just beat RE2 remake yesterday in anticipation of 4make, and the aiming and deadzones are nothing like RE4's demo, with the same controller in my hands at least.
 

DenchDeckard

Moderated wildly
I literally just beat RE2 remake yesterday in anticipation of 4make, and the aiming and deadzones are nothing like RE4's demo, with the same controller in my hands at least.

Yeah, I didn't struggle with RE2 on Xbox at all... this kinda sucks... Hopefully there is a chance for a fix ASAP.
 

SlimySnake

Flashless at the Golden Globes
60 fps isn't hard to do with ray tracing. Ratchet and Clank has done it, Spider-Man has done it, and when unlocked they even hit 80 fps. Compared to this PS4 game, the Series X and PS5 should be comfortable running at 1440p 60 with some shadow or puddle reflections. There aren't even any noticeable reflections or shadows here, so where's the performance cost? And the hair strands don't make any sense: Ratchet and Clank uses strand rendering for fur and has 100,000s of strands that are also affected by light, not a dozen like here. It can't be 900p, since the PS4 is already 900p.
Every game engine is different. Every game is different. You can't compare Ratchet to RE4. Would you compare Ratchet to TLOU Part 1 or Demon's Souls? They run at 1440p 60 fps with no ray tracing, while Ratchet is doing native 4K 50 fps with ray tracing and way higher quality visuals.

I am just going by their pixel counts. If they counted 1440p checkerboard, then the internal resolution is 960p. Plain and simple. Others have counted differently, but DF was purposefully using the worst-case scenario with hair strands and RT on. Like I said, if you turn off those GPU-hungry things, you might get up to 1800p CB, but even then it's just 2.8 million pixels, just like HFW's 60 fps mode. And we all know how awful that looked at launch.

There is a reason why there is so much shimmering in the PS5 version. It's what happens when you use checkerboarding at lower resolutions. We've seen it before. There is a fix. GG did it, though it took them 3 months and they had to essentially redesign their checkerboarding solution to be more like FSR 2.0 and DLSS 2.0. Capcom hasn't done that yet.

And the PS4 version is not the same as the PS5 version, even with RT and hair turned off. Anyone who has gamed on Ultra settings on PC knows just how much of a performance hog those settings are, even if they don't provide a noticeable upgrade in visual fidelity.
 

skneogaf

Member
The framerate mode with hair and ray tracing off is pretty much a locked 60 fps on Xbox and PlayStation, so people should just go with whichever version they want and play an amazing game.
 
60 fps isn't hard to do with ray tracing. Ratchet and Clank has done it, Spider-Man has done it, and when unlocked they even hit 80 fps. Compared to this PS4 game, the Series X and PS5 should be comfortable running at 1440p 60 with some shadow or puddle reflections. There aren't even any noticeable reflections or shadows here, so where's the performance cost? And the hair strands don't make any sense: Ratchet and Clank uses strand rendering for fur and has 100,000s of strands that are also affected by light, not a dozen like here. It can't be 900p, since the PS4 is already 900p.
Those games are highly optimized for a single piece of hardware. It's very different when you have 5 versions to make and optimize. It's already impressive to have some RT (even limited) at 60 fps on the twin consoles.
 

kingyala

Banned
Every game engine is different. Every game is different. You can't compare Ratchet to RE4. Would you compare Ratchet to TLOU Part 1 or Demon's Souls? They run at 1440p 60 fps with no ray tracing, while Ratchet is doing native 4K 50 fps with ray tracing and way higher quality visuals.

I am just going by their pixel counts. If they counted 1440p checkerboard, then the internal resolution is 960p. Plain and simple. Others have counted differently, but DF was purposefully using the worst-case scenario with hair strands and RT on. Like I said, if you turn off those GPU-hungry things, you might get up to 1800p CB, but even then it's just 2.8 million pixels, just like HFW's 60 fps mode. And we all know how awful that looked at launch.

There is a reason why there is so much shimmering in the PS5 version. It's what happens when you use checkerboarding at lower resolutions. We've seen it before. There is a fix. GG did it, though it took them 3 months and they had to essentially redesign their checkerboarding solution to be more like FSR 2.0 and DLSS 2.0. Capcom hasn't done that yet.

And the PS4 version is not the same as the PS5 version, even with RT and hair turned off. Anyone who has gamed on Ultra settings on PC knows just how much of a performance hog those settings are, even if they don't provide a noticeable upgrade in visual fidelity.
VGTech found 1932p. The shimmering is caused by chromatic aberration and other screen effects. Checkerboarding isn't always broken, it's the implementation; as you saw, Guerrilla patched Forbidden West with a better implementation and image quality plus graphics were improved. What I don't understand is how this PS4 game doesn't manage to hit 1440p 60 with puddle reflections, when the RE Engine is known to perform very well even on PS4. So what's really the issue here? What is it rendering that makes it 900p 60, really? I'm confused, and even on PC it performs inconsistently.
 

kingyala

Banned
Those games are highly optimized for a single piece of hardware. It's very different when you have 5 versions to make and optimize. It's already impressive to have some RT (even limited) at 60 fps on the twin consoles.
That's all alright, but an inconsistent 1440p 60 checkerboard on current consoles with puddle RT reflections doesn't impress me. It feels like they don't even try to fine-tune their work these days; they just ship a game with no special optimization and rely on VRR, reconstruction and DRS to save the day... even though such reconstruction technologies help, they also make developers lazy. What's confusing is that this is a PS4 game anyway.
 

SlimySnake

Flashless at the Golden Globes
VGTech found 1932p. The shimmering is caused by chromatic aberration and other screen effects. Checkerboarding isn't always broken, it's the implementation; as you saw, Guerrilla patched Forbidden West with a better implementation and image quality plus graphics were improved. What I don't understand is how this PS4 game doesn't manage to hit 1440p 60 with puddle reflections, when the RE Engine is known to perform very well even on PS4. So what's really the issue here? What is it rendering that makes it 900p 60, really? I'm confused, and even on PC it performs inconsistently.
DF mentioned that while CA and other effects exacerbate the shimmering, it remains even after you disable them. You can test this today on your PS5. Yes, disabling them helps clean the image, but the main reason is the low base resolution. And yes, HFW improved it with a better implementation, which is exactly what they need to do here, in both the performance mode, which is 1800p CB (same as HFW), and the RT fidelity mode, which drops all the way down to 1440p CB, or a 960p base resolution.

I see this on PC all the time. Every game is different, but performance scales with resolution. My 3080 runs this game at native 4K 60 fps with some drops to 55 fps, but the moment I turn on RT and hair strands, the performance takes a nosedive. Just play the game in Performance mode, turn off RT, hair strands and all post-processing effects (depth of field has a sizeable hit to performance) and you will get a higher-resolution image, somewhere around 1872p CB, which is what El Analista de Bits counted. Way higher than the 1440p CB image of the RT mode, but still not perfect until they implement the enhancements GG added for HFW's performance mode.

Bottom line is that if they could do 4K CB like RE8, they would've. The game is clearly more taxing, with far more enemies on screen, way more foliage, and way bigger environments. All extremely taxing not just on the GPU but also on RAM bandwidth, which is a huge bottleneck on both the PS5 and XSX, as we've seen in several games that struggle to run at 60 fps despite the fact that those same games run fine on similar PC hardware. Just look at Horizon. If their native 4K 30 fps version runs locked at 8.2 million pixels, they SHOULD be able to get 4K CB's 4.1 million pixels working at 60 fps, but nope: their 40 fps version runs at 4K CB and their 60 fps version runs at 1800p CB, or just 2.8 million pixels. There seems to be an extra cost on consoles to get these games running at 60 fps; simply halving the resolution doesn't seem to double the framerate like it does on PC and in other games on consoles. Every game is different.
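A toy model of why halving pixel count doesn't halve frame time (Python; all numbers are made up purely for illustration): part of every frame is resolution-independent (CPU, sim, fixed-resolution passes), so only the remainder scales with pixels.

def fps(mpixels, fixed_ms=8.0, ms_per_mpixel=2.0):
    # frame time = fixed cost + cost proportional to megapixels shaded
    return 1000.0 / (fixed_ms + ms_per_mpixel * mpixels)

print(f"{fps(8.3):.0f} fps")   # native 4K, ~8.3M pixels -> ~41 fps
print(f"{fps(4.15):.0f} fps")  # 4K CB, half the pixels  -> ~61 fps, not 82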
 

intbal

Member
Either DF/VG/NX need to stop doing "performance analysis" videos of pre-release code, or GAF just needs to stop making threads about those videos.


...yeah, I know neither of those things is going to happen.
 

kingyala

Banned
DF mentioned that while CA and other effects exacerbate the shimmering, it remains even after you disable them. You can test this today on your PS5. Yes, disabling them helps clean the image, but the main reason is the low base resolution. And yes, HFW improved it with a better implementation, which is exactly what they need to do here, in both the performance mode, which is 1800p CB (same as HFW), and the RT fidelity mode, which drops all the way down to 1440p CB, or a 960p base resolution.

I see this on PC all the time. Every game is different, but performance scales with resolution. My 3080 runs this game at native 4K 60 fps with some drops to 55 fps, but the moment I turn on RT and hair strands, the performance takes a nosedive. Just play the game in Performance mode, turn off RT, hair strands and all post-processing effects (depth of field has a sizeable hit to performance) and you will get a higher-resolution image, somewhere around 1872p CB, which is what El Analista de Bits counted. Way higher than the 1440p CB image of the RT mode, but still not perfect until they implement the enhancements GG added for HFW's performance mode.

Bottom line is that if they could do 4K CB like RE8, they would've. The game is clearly more taxing, with far more enemies on screen, way more foliage, and way bigger environments. All extremely taxing not just on the GPU but also on RAM bandwidth, which is a huge bottleneck on both the PS5 and XSX, as we've seen in several games that struggle to run at 60 fps despite the fact that those same games run fine on similar PC hardware. Just look at Horizon. If their native 4K 30 fps version runs locked at 8.2 million pixels, they SHOULD be able to get 4K CB's 4.1 million pixels working at 60 fps, but nope: their 40 fps version runs at 4K CB and their 60 fps version runs at 1800p CB, or just 2.8 million pixels. There seems to be an extra cost on consoles to get these games running at 60 fps; simply halving the resolution doesn't seem to double the framerate like it does on PC and in other games on consoles. Every game is different.
"The game is clearly more taxing with far more enemies on screen, way more foliage, and way bigger environments"... this is false. It has nothing to do with memory bandwidth or processing power; it's simply bad optimization. The game has the same number of enemies and effects on PS4; it's inherently a last-gen game just up-resed to run on current machines... The problem is devs nowadays don't optimize enough; they don't take time with these machines. It's why we have cross-gen games like this performing the way they do, and we keep seeing this trend: Gotham Knights, Wo Long, Wild Hearts and plenty of games that look no better than an early 2014 PS4 game but simply perform like shit...

It's got nothing to do with the consoles or hardware; it's the software that's the problem, and it's only first-party studios like PlayStation Studios that seem to take their time to support 4K 60 and 1440p 120 consistently on PS5 and even handle PC ports well. Other studios simply ship unoptimized code and rely on reconstruction, VRR, VRS and DRS to save the day. It's cheating and it isn't healthy; it seems games now are made with reconstruction techniques in mind rather than reaching the preferred performance levels natively first... Back in the day devs had to make sure their game was natively consistent at 30 or 60 fps at 1080p, because reconstruction was almost nonexistent; you had to push yourself, and now they just don't care. This generation brought reconstruction techniques and faster storage to help and simplify developers' work, but in effect it has turned some developers lazy.
 

SlimySnake

Flashless at the Golden Globes
"The game is clearly more taxing with far more enemies on screen, way more foliage, and way bigger environments"... this is false. It has nothing to do with memory bandwidth or processing power; it's simply bad optimization. The game has the same number of enemies and effects on PS4; it's inherently a last-gen game just up-resed to run on current machines... The problem is devs nowadays don't optimize enough; they don't take time with these machines. It's why we have cross-gen games like this performing the way they do, and we keep seeing this trend: Gotham Knights, Wo Long, Wild Hearts and plenty of games that look no better than an early 2014 PS4 game but simply perform like shit...

It's got nothing to do with the consoles or hardware; it's the software that's the problem, and it's only first-party studios like PlayStation Studios that seem to take their time to support 4K 60 and 1440p 120 consistently on PS5 and even handle PC ports well. Other studios simply ship unoptimized code and rely on reconstruction, VRR, VRS and DRS to save the day. It's cheating and it isn't healthy; it seems games now are made with reconstruction techniques in mind rather than reaching the preferred performance levels natively first... Back in the day devs had to make sure their game was natively consistent at 30 or 60 fps at 1080p, because reconstruction was almost nonexistent; you had to push yourself, and now they just don't care. This generation brought reconstruction techniques and faster storage to help and simplify developers' work, but in effect it has turned some developers lazy.
What are you talking about, "it's false"? Did you even play RE8? How many times did they have that many enemies on screen at once?

And you keep repeating the PS4 comparisons; it's NOT running at the same settings. Hell, even the PS4 version of RE8 ran at 1080p while RE4 runs at 900p. What does that tell you? It is MORE TAXING than RE8. Otherwise it would've run at 1080p on the PS4, just like RE8 did.

Yes, third-party devs won't put in the same effort as first-party devs do, but that's not even the norm anymore. Several games released in the last 6 months perform better on PS5 than on XSX and even equivalent PCs because devs optimize for PS5 first. Where have you been the last few months? It's been a huge story. Dead Space, Hogwarts, Callisto; even Gotham Knights, which runs at just 30 fps on PS5, has no issues running its RT stuff, while my PC straight up drops to 0 fps every 30 minutes whenever I turn RT on.

Besides, I literally gave you an example of a FIRST-party game, HFW, which runs at the same exact 1800p CB resolution and had the same shimmering problems. Reconstruction is hardly a new thing; it's been around since 2016, last gen. Yes, Capcom needs to do better, but it's insane how you are just handwaving ALL possible explanations and resorting to "devs are lazy".
 

01011001

Banned
Either DF/VG/NX need to stop doing "performance analysis" videos of pre-release code, or GAF just needs to stop making threads about those videos.


...yeah, I know neither of those things is going to happen.

well from what they said about the day 1 version, almost nothing changed from this demo to the full version.

so videos like these are helpful to make sure you're not getting the wrong version.

For example, the controls on Xbox are abysmal, and according to John from DF the final game doesn't fix that, so that makes the decision very easy as to which version to get.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Either DF/VG/NX need to stop doing "performance analysis" videos of pre-release code, or GAF just needs to stop making threads about those videos.


...yeah, I know neither of those things is going to happen.


I don't know what's prompting this. Doesn't look like there's any/many changes between the demo and final aside from the PS5 version fixing the missing RT in puddles.

The IQ is the same, the deadzone issues are the same.

DF/VG/NX have in almost all cases come up with different pixel counts in most games; that's... what DRS does.
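That's also why counted resolutions legitimately differ from shot to shot. A rough sketch of what a dynamic-resolution loop does (Python, illustrative numbers only, not Capcom's actual controller):

TARGET_MS = 16.7     # 60 fps frame budget
LO, HI = 0.5, 1.0    # min/max render scale the game allows

def next_scale(scale, gpu_ms):
    # Damped correction: nudge the render scale so GPU time converges
    # on the budget without oscillating wildly.
    scale *= (TARGET_MS / gpu_ms) ** 0.5
    return max(LO, min(HI, scale))

scale = 1.0
for gpu_ms in (21.0, 19.5, 18.0, 17.0):   # hypothetical heavy scene
    scale = next_scale(scale, gpu_ms)
    print(f"render scale {scale:.2f} -> roughly {int(2160 * scale)}p")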
 

yamaci17

Member
I'm a bit confused.

What should I use with a 4080 and an OLED 4K 120 Hz TV if I don't care about power usage and I just want the most stable and input-lag-free experience?

Right now I have vsync forced on and a 117 fps cap from NVCP, and they say it's the best combo, except when games have DLSS3 frame gen; in that case I should only have vsync on, because frame gen + Reflex automatically lowers the max fps or some obscure shit. But now you say that the frame caps from NVCP introduce input lag and lower my GPU's core clocks? Fuck that (or maybe I just didn't understand at all).


I have no experience or knowledge regarding DLSS3/Frame gen and how it functions

The NVCP frame cap introduces input lag IF your GPU is going to have a massive amount of free resources. With a 117 fps cap at a high resolution, that is unlikely to happen (unless you play something like Overwatch). With a 36 fps cap, half of my GPU is unutilized, hence the GPU declocks from 1900 MHz to 900 MHz to save power. This is an extremely niche example (do not take it seriously).

Oh, also, since you mention Reflex: Reflex + NVCP cap is input-lag free, because Reflex On+Boost practically forces the GPU to run at maximum clocks regardless of the frame cap / free GPU headroom. Reflex stops the NVCP cap from reducing the GPU clock, so you don't have to worry about input lag with it (On+Boost).

For games without Reflex, you can set a per-game profile to "prefer maximum performance" to ensure your GPU boosts to its maximum. Do not enable this globally, because it really is wasteful when you're on the desktop!
 

GymWolf

Member


I have no experience or knowledge regarding DLSS3/Frame gen and how it functions

The NVCP frame cap introduces input lag IF your GPU is going to have a massive amount of free resources. With a 117 fps cap at a high resolution, that is unlikely to happen (unless you play something like Overwatch). With a 36 fps cap, half of my GPU is unutilized, hence the GPU declocks from 1900 MHz to 900 MHz to save power. This is an extremely niche example (do not take it seriously).

Oh, also, since you mention Reflex: Reflex + NVCP cap is input-lag free, because Reflex On+Boost practically forces the GPU to run at maximum clocks regardless of the frame cap / free GPU headroom. Reflex stops the NVCP cap from reducing the GPU clock, so you don't have to worry about input lag with it (On+Boost).
I have the Italian NVCP; is Reflex the low latency mode that you can set to On or Ultra? Or another setting?

Because the only Reflex setting I saw was inside games, under the frame gen setting; nothing in the NVCP. But I use the normal NVCP; some people use Inspector, which has way more settings.
 

yamaci17

Member
I have the Italian NVCP; is Reflex the low latency mode that you can set to On or Ultra? Or another setting?

Because the only Reflex setting I saw was inside games, under the frame gen setting; nothing in the NVCP. But I use the normal NVCP; some people use Inspector, which has way more settings.

No, Reflex is not a driver toggle. Low latency mode does some of the stuff Reflex does, but not quite, and low latency mode only works with DX9/DX11 games. I'd say do not meddle with that setting unless you play a DX11 title.

In NVCP, you should be looking for something related to "power management". As I said, do not globally enable "prefer maximum performance", as that will stop the GPU from reducing clocks on the desktop and needlessly waste power + spin the fans + make the GPU work when you watch Twitch and stuff.

Frame cap + prefer maximum performance for older games should be enough to maintain good input lag. You can adjust both the frame cap and the power management mode per game.

And do not worry, every DLSS3 game is forced to have Reflex from now on, so we're in for a treat in that regard! With Reflex you don't have to meddle with any NVCP setting aside from the global cap and vsync.
 

TonyK

Member
For example, the controls on Xbox are abysmal, and according to John from DF the final game doesn't fix that, so that makes the decision very easy as to which version to get.
But they said the day one patch solves it, so that decision is not so easy.
 
I am going with the PC version, as that demo has the fewest issues. It looks by far the crispest of the three versions I've played and runs great on my new system with RT and Hair Strands at maxed-out 1440p settings and 120 fps.

I have also played the PS5 and Xbox Series X demos, but the PS5 looks oddly soft and shimmery versus the Xbox version, and because of the console's barebones VRR support the game has instances of stutter as it dips below 48 fps in its only 60 Hz mode when RT is enabled. The Xbox Series X version doesn't have this issue, as it runs at 120 Hz and so supports low framerate compensation (LFC) and thus smoother VRR, but it has laggy controls. I would have gone with the PS5 version otherwise, as it is getting a free VR update at some point.
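For reference, this is all LFC is doing; a rough Python sketch (the 48-120 Hz window is a typical VRR range, not an RE4-specific figure):

def lfc(fps, vrr_min=48, vrr_max=120):
    # Repeat each frame until the effective refresh re-enters the VRR window.
    mult = 1
    while fps * mult < vrr_min:
        mult += 1
    refresh = fps * mult
    return refresh if refresh <= vrr_max else None

print(lfc(40))              # 80 Hz: each frame shown twice, still smooth
print(lfc(40, vrr_max=60))  # None: a 60 Hz-only mode has no room to double, hence the stutter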
 

GymWolf

Member
no, reflex is not a driver toggle. low latency mode is doing some stuff that reflex does. but not quite. and low latency mode only works with dx9/dx11 games. i'd say do not meddle with that setting unless you play a dx11 title

in NVCP, you should be looking for something related to "power management". as I said again, do not globally enable "prefer maximum performance" as that will stop GPU from reducing clocks on desktop and needlessly waste power + spin the fans + make the GPU work when you watch twitch and stuff

frame cap + prefer maximum performance for older games should be enough to maintain good input lag. you can adjust both the frame cap and power management mode for per game

and do not worry, every dlss3 game is being forced to have reflex from now on. so we're in for a treat in that regard! with reflex u dont have to meddle with any nvcp setting aside from the global cap and vsync
Shit, I have prefer maximum perf on global, so I should only use it on a per-game basis, huh?

If I want to lock to 60 because I know that a game is heavy and 120 is off the table, I usually just switch the type of resolution.

I have 2 sets of resolutions in the NVCP: TV resolutions that go as high as 4K60 and PC resolutions that go up to 4K120. That way I don't have to use any frame caps, because the game reads the TV as a 4K60 monitor.

I only use the 117 cap for games (without DLSS3) where I can actually stay at those high framerates most of the time without wild fluctuations (because G-Sync can only hide hiccups to a certain degree).
 

01011001

Banned
But they said day one patch solve it, so that decision is not so easy.

DF is testing on day 1 patch afaik, and John said the controls are still shit on Xbox.

this is the same team that made RE2, and RE2 has the same gigantic deadzone on Xbox and it never got fixed even tho they made a full next gen version.
 

yazenov

Member
Honestly, I would classify VRR as a damn near essential feature for this current gen, so far.

It's great and helps so much with stability and response.

VRR and the Series X are similar in that few people own these products.

The vast majority of Xbox users own a Series S console, since MS is pushing those instead of the Series X, and most people don't own VRR-capable monitors/TVs.

Digital Foundry should focus their comparisons on the features and products most people are actually using, instead of niche products and features that are not relevant to most of the general public.

I bet most of these potential RE4 remake owners do not even own a VRR TV or a series X console 🤷‍♂️
 

yamaci17

Member
Shit, I have prefer maximum perf on global, so I should only use it on a per-game basis, huh?

If I want to lock to 60 because I know that a game is heavy and 120 is off the table, I usually just switch the type of resolution.

I have 2 sets of resolutions in the NVCP: TV resolutions that go as high as 4K60 and PC resolutions that go up to 4K120. That way I don't have to use any frame caps, because the game reads the TV as a 4K60 monitor.

I only use the 117 cap for games (without DLSS3) where I can actually stay at those high framerates most of the time without wild fluctuations (because G-Sync can only hide hiccups to a certain degree).
In my opinion, yes. No need to waste power / pump heat into the case at idle.

For the 4K/60 thing it is up to you; I'd prefer the screen to stay at 4K/120 and cap the game directly to 60 per game (for that specific game).
 

adamsapple

Or is it just one of Phil's balls in my throat?
VRR and the Series X are similar in that few people own these products.

The vast majority of Xbox users own a Series S console, since MS is pushing those instead of the Series X, and most people don't own VRR-capable monitors/TVs.

Digital Foundry should focus their comparisons on the features and products most people are actually using, instead of niche products and features that are not relevant to most of the general public.

I bet most of these potential RE4 remake owners do not even own a VRR TV or a series X console 🤷‍♂️


What a bizarre, fanboyish post. VRR is a hardware-level feature on all the new-gen consoles; why shouldn't it be mentioned or covered? That makes absolutely no sense.

They'll cover the game in more depth at launch; this was just a demo video, and they do mention non-VRR performance and their suggested options for playing if one doesn't have a VRR display.

Similarly, the closest equivalents among the next-gen versions are naturally going to be the PS5 and Series X, so they are the ones directly compared. You want them to do PS5 vs Series S comparisons for arbitrary reasons?
 

damidu

Member
VRR and the Series X are similar in that few people own these products.

The vast majority of Xbox users own a Series S console, since MS is pushing those instead of the Series X, and most people don't own VRR-capable monitors/TVs.

Digital Foundry should focus their comparisons on the features and products most people are actually using, instead of niche products and features that are not relevant to most of the general public.

I bet most of these potential RE4 remake owners do not even own a VRR TV or a series X console 🤷‍♂️
Yeah, I don't mind them pointing out stuff, but saying things like "VRR saves the day" doesn't give developers any incentive to fix their shitty framerate, while the majority of users still don't have access to the tech.

As if it's totally normal for a game targeting 60 to dip into the 40s. No amount of VRR will make that drop feel seamless anyway; low fps is low fps.
 

yazenov

Member
What a bizarre, fanboyish post. VRR is a hardware-level feature on all the new-gen consoles; why shouldn't it be mentioned or covered? That makes absolutely no sense.

They'll cover the game in more depth at launch; this was just a demo video, and they do mention non-VRR performance and their suggested options for playing if one doesn't have a VRR display.

Similarly, the closest equivalents among the next-gen versions are naturally going to be the PS5 and Series X, so they are the ones directly compared. You want them to do PS5 vs Series S comparisons for arbitrary reasons?

Well, good businesses should in theory cater to most of their audience rather than the 0.01% minority. So to answer your question, yes the focus should be PS5 vs. Series S :)
 

adamsapple

Or is it just one of Phil's balls in my throat?
Well, good businesses should in theory cater to most of their audience rather than the 0.01% minority. So to answer your question, yes the focus should be PS5 vs. Series S :)

DF is a tech channel; they don't cover what's "good business" or most widely available, otherwise all their PC-focused content would be led by 1650 GPUs :)
 

kingyala

Banned
What are you talking about, "it's false"? Did you even play RE8? How many times did they have that many enemies on screen at once?

And you keep repeating the PS4 comparisons; it's NOT running at the same settings. Hell, even the PS4 version of RE8 ran at 1080p while RE4 runs at 900p. What does that tell you? It is MORE TAXING than RE8. Otherwise it would've run at 1080p on the PS4, just like RE8 did.

Yes, third-party devs won't put in the same effort as first-party devs do, but that's not even the norm anymore. Several games released in the last 6 months perform better on PS5 than on XSX and even equivalent PCs because devs optimize for PS5 first. Where have you been the last few months? It's been a huge story. Dead Space, Hogwarts, Callisto; even Gotham Knights, which runs at just 30 fps on PS5, has no issues running its RT stuff, while my PC straight up drops to 0 fps every 30 minutes whenever I turn RT on.

Besides, I literally gave you an example of a FIRST-party game, HFW, which runs at the same exact 1800p CB resolution and had the same shimmering problems. Reconstruction is hardly a new thing; it's been around since 2016, last gen. Yes, Capcom needs to do better, but it's insane how you are just handwaving ALL possible explanations and resorting to "devs are lazy".
There's nothing taxing here, mate... Spider-Man uses ray tracing plus strand hair rendering all over the place and manages RT at 1440p 60. Really, what is taxing about this corridor game about shooting farmers in village alleyways? All it's doing is rendering some puddle RT reflections and countable hair strands... optimization and software engineering is the problem, not hardware. Even with the preferable PS5 dev environment and tools, devs still don't put enough work into optimizing games nowadays as they used to, for a lot of reasons, as I explained earlier. There's nothing at all taxing in this game; I don't see anything special it's doing compared to last-gen games, and that's exactly why I keep putting the PS4 in the discussion: this is a PS4 game that's just up-resed on PS5 with silly visual tricks like subsurface scattering (an old technique used even in Xbox 360 games), high-res textures that are not even high res, a dozen strands of hair and puddle reflections, and you think it's taxing the hardware... Where have you been all these years? This game doesn't hold a candle to some PS4 games, so the fact that it runs at 900p on a PS5 and still can't hit 60 is a dev failure, not hardware.
 

SlimySnake

Flashless at the Golden Globes
Yeah, I don't mind them pointing out stuff, but saying things like "VRR saves the day" doesn't give developers any incentive to fix their shitty framerate, while the majority of users still don't have access to the tech.

As if it's totally normal for a game targeting 60 to dip into the 40s. No amount of VRR will make that drop feel seamless anyway; low fps is low fps.
I think it's valid to criticize the PS5 version for not supporting LFC, but yes, it's frustrating to see them sing VRR's praises for games dropping below 48 fps. I have a G-Sync-capable TV and I still notice drops below 50 fps. 55-60 fps, I agree, is very easily masked by G-Sync/VRR, but anything that drops 10-20 fps is going to be noticeable no matter what kind of VRR tech you are using.

DF's only job is to offer valid criticism to devs who do not offer proper 60 fps modes. I did like John's suggestion that they perhaps should've offered a locked 30 fps mode with RT, native 4K resolution and hair strands. That to me is DF doing their job. Excusing drops below 50 fps just because the rich few can afford VRR TVs is just making excuses for the developer, and it will not push the developer to do anything about it. We saw this play out last year when From shipped PS5 and XSX versions that had an insanely high resolution and framerates that dropped below 50 fps on the XSX. Why? Just reduce the goddamn resolution. But nope, DF was like, the XSX has VRR, so go buy the XSX version. Nah, you hound and criticize the developer until they fix the framerate. The entire point of 60 fps is to get a more responsive game anyway. Even if VRR masks the tearing and stuttering, you will still notice it in the input lag.
 

Neo_game

Member
VRR is like G-Sync and FreeSync, right? On PC nobody talks about it. It is all about how the GPU, or in some cases the CPU, keeps up with the fps. In fact some people are even obsessed with 1% lows. Personally I prefer graphics over some variable fps, but devs should do the optimization and not rely on VRR.
 

GymWolf

Member
In my opinion, yes. No need to waste power / pump heat into the case at idle.

For the 4K/60 thing it is up to you; I'd prefer the screen to stay at 4K/120 and cap the game directly to 60 per game (for that specific game).
I mean, do I have any downsides by choosing the TV resolution, so I don't have to cap anything, instead of introducing another artificial frame cap that halves the framerate?
 

yamaci17

Member
I mean, do I have any downsides by choosing the TV resolution, so I don't have to cap anything, instead of introducing another artificial frame cap that halves the framerate?
No, most of the time vsync is the most natural frame cap you can get, actually!

But a 60 fps cap + 120 Hz can have reduced input lag, just a note. 60 fps + vsync should be a bit smoother at the expense of input lag, so it's a trade-off: the downside is the input lag, the upside is the smoothness.
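A simplified model of where that input-lag difference comes from (Python; back-of-the-envelope only, ignores render queues and display processing):

def avg_vblank_wait_ms(refresh_hz):
    # A finished frame waits for the next vblank; on average that's
    # half a refresh period, and at 120 Hz vblanks arrive twice as often.
    return (1000.0 / refresh_hz) / 2

print(f"{avg_vblank_wait_ms(60):.1f} ms")   # ~8.3 ms average wait at 60 Hz
print(f"{avg_vblank_wait_ms(120):.1f} ms")  # ~4.2 ms at 120 Hz, same 60 fps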
 

GymWolf

Member
VRR is like G-Sync and FreeSync, right? On PC nobody talks about it. It is all about how the GPU, or in some cases the CPU, keeps up with the fps. In fact some people are even obsessed with 1% lows. Personally I prefer graphics over some variable fps, but devs should do the optimization and not rely on VRR.
We are just realistic people, and we know that a 100% locked framerate on console is extremely hard to achieve, especially at 60 fps.

Last gen I think only 3 games had rock-solid 60 fps: Halo 5, Gears 5 and maybe Doom, and I'm pretty sure even those lost some frames with absolute hell on screen.

Is it better to have VRR, or to hope that after 30 years devs are gonna start delivering 100% locked framerates on console?

We both know the answer.

P.S. DF saying that some games have a locked framerate is kinda bullshit, because by their own admission they usually test the first 10-15 hours of the game, where you don't have the biggest enemies, nor the most of them on screen, nor the most bombastic setpieces, etc.
 

SlimySnake

Flashless at the Golden Globes
VRR is like G-Sync and FreeSync, right? On PC nobody talks about it. It is all about how the GPU, or in some cases the CPU, keeps up with the fps. In fact some people are even obsessed with 1% lows. Personally I prefer graphics over some variable fps, but devs should do the optimization and not rely on VRR.
Yes. VRR/G-Sync are great for masking some 1% drops, but no one on PC is running games at a variable 40-50 fps lmao. That's nonsense. We just reduce resolution or settings to get as close to 60 fps as possible and let G-Sync handle any additional drops that may occur during high-intensity shootouts.

I recently tried playing Cyberpunk at an average of 50 fps. Just not smooth enough. Locked it to 40 fps and it felt way better.
 

GymWolf

Member
No, most of the time vsync is the most natural frame cap you can get, actually!

But a 60 fps cap + 120 Hz can have reduced input lag, just a note. 60 fps + vsync should be a bit smoother at the expense of input lag, so it's a trade-off: the downside is the input lag, the upside is the smoothness.
But what I use is not really a frame cap; it's literally turning the monitor into a 60 Hz device, so that's the max it can go. It doesn't need a lock, because the PC reads the monitor as a 60 Hz screen (I think).

Do you use a TV to play? Because maybe you don't even have a second set of TV resolutions if you use a PC monitor (I don't know, really).
 

01011001

Banned
VRR is like G-Sync and FreeSync, right? On PC nobody talks about it. It is all about how the GPU, or in some cases the CPU, keeps up with the fps. In fact some people are even obsessed with 1% lows. Personally I prefer graphics over some variable fps, but devs should do the optimization and not rely on VRR.

Well, no one talks about it on PC because it has been the default state there for years already.
On PC you are expected to just run the game unlocked and then use G-Sync and Fast Sync in combination to run the game as well as it can at all times.

On console, VRR is now starting to get more and more use. The obvious use is having a smooth presentation when performance is suboptimal, and another is just running unlocked framerates up to 120 fps, like on PC.

Additionally, regarding the 1% lows: that is way more of an actual issue and topic on PC due to #stutterstruggle, IO stutters, shader comp stutters, weird CPU issues, driver issues, etc.
That's mainly a PC thing, and it's why the 1% and 5% lows on PC get so much attention.

This becomes even more of a concern if you run high framerates at 120 fps and above.
If you want locked and steady performance you need to know where to lock the framerate; maybe you have a low-end screen with no FreeSync/G-Sync, so you need to know whether you can, for example, hit 120 fps reliably in a game.
 

yamaci17

Member
But what I use is not really a frame cap; it's literally turning the monitor into a 60 Hz device, so that's the max it can go. It doesn't need a lock, because the PC reads the monitor as a 60 Hz screen (I think).

Do you use a TV to play? Because maybe you don't even have a second set of TV resolutions if you use a PC monitor (I don't know, really).
I use a monitor, but no, I too have TV resolutions; actually, I can even choose 24/30/50/60 Hz.

To understand what I mean, try any FPS game with 60 Hz + vsync and compare it to a 60 fps cap + 120 Hz + VRR; you will see how much of a difference it can actually make. I'm speaking from experience, as I can set my screen to 60 Hz as well.
 

GymWolf

Member
I use a monitor, but no, I too have TV resolutions; actually, I can even choose 24/30/50/60 Hz.

To understand what I mean, try any FPS game with 60 Hz + vsync and compare it to a 60 fps cap + 120 Hz + VRR; you will see how much of a difference it can actually make. I'm speaking from experience, as I can set my screen to 60 Hz as well.
Do you keep the VRR option IN WINDOWS 11 turned on or off? Or does Nvidia just override it with G-Sync anyway, so it doesn't matter?
 

SlimySnake

Flashless at the Golden Globes

There's nothing taxing here, mate... Spider-Man uses ray tracing plus strand hair rendering all over the place and manages RT at 1440p 60. Really, what is taxing about this corridor game about shooting farmers in village alleyways? All it's doing is rendering some puddle RT reflections and countable hair strands... optimization and software engineering is the problem, not hardware. Even with the preferable PS5 dev environment and tools, devs still don't put enough work into optimizing games nowadays as they used to, for a lot of reasons, as I explained earlier. There's nothing at all taxing in this game; I don't see anything special it's doing compared to last-gen games, and that's exactly why I keep putting the PS4 in the discussion: this is a PS4 game that's just up-resed on PS5 with silly visual tricks like subsurface scattering (an old technique used even in Xbox 360 games), high-res textures that are not even high res, a dozen strands of hair and puddle reflections, and you think it's taxing the hardware... Where have you been all these years? This game doesn't hold a candle to some PS4 games, so the fact that it runs at 900p on a PS5 and still can't hit 60 is a dev failure, not hardware.
Taxing COMPARED TO RE8 VILLAGE. Spider-Man is a completely different game. You wouldn't trash Horizon FW or GG for not doing 4K 50 fps with ray tracing like Spider-Man, would you?

Now, compared to RE8, it is doing things that are clearly better. You keep bringing up the PS4 version when I've seen you in the VGTech thread, which clearly lists all the upgrades over the PS4 version:
PS5 has some graphical improvements over the PS4 consoles such as: improvements to mesh quality, improved volumetric lighting quality, improvements to texture quality and additional dynamic shadows. PS5 also seems to have subsurface scattering, destructible small objects and terrain tessellation all of which don't appear to be present on the PS4 systems. On the PS4 consoles enemies can have a reduced animation rate which wasn't seen on PS5.

They are literally using half-rate animations on the PS4. DF pointed out several areas that have textures so low-res they might as well be missing. The PS5 version is doing so much more, and all of that extra work is taxing on this engine.

Yes, devs need to leave last gen behind. Yes, devs need to optimize their games better. Yes, they could've implemented a better checkerboarding technique. But it is silly to suggest that this game should perform the same as RE8 when there are clearly graphics upgrades here. If it was identical to RE8, it would've run at 4K CB 60 fps locked like RE8. Clearly they are pushing things here, which is why they had to settle for 1800p. Their RT mode is 1440p CB because the RE8 RT mode regularly dipped below 60 and was pretty much a 45 fps mode on PS5 in action scenes, according to Capcom alone. So clearly 4K CB wasn't enough for RE8 either.
 

yamaci17

Member
Do you keep the vrr option IN WINDOWS11 turned on or off? or nvidia just override with his gsync anyway so it doens't matter?
The VRR setting in the Windows 10/11 graphics options is only relevant for niche UWP games that used to be somewhat incompatible with VRR (I personally never played or came across them). Most recent Store/Game Pass games thankfully do not use UWP. You can leave it on; it won't meddle with usual VRR, it just allows those UWP games to use VRR like normal (whatever they are).

thank you :messenger_open_mouth:
 