
Ratchet & Clank: Rift Apart 40FPS!!??


Ratchet & Clank: Rift Apart Patch 1.002 Adds 40FPS Mode for 120Hz TVs




Here's an interesting addition to Ratchet & Clank: Rift Apart found in patch 1.002: if you own a TV capable of running at 120Hz, the PlayStation 5 exclusive will increase the frame rate of its Fidelity Mode to 40 frames-per-second. The patch is available to download now and enables the boost on television sets that can handle it. With the higher frame rate, the game runs even better whilst maintaining the visual quality of the normal Fidelity Mode. This particular frame rate was chosen because it divides evenly into the 120Hz refresh rate of high-end TVs, so frame pacing remains even.
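To make the frame-pacing point concrete, here is the arithmetic as a small illustrative sketch (Python; the function is just for exposition):

# A frame rate paces evenly on a fixed-refresh display only when each
# frame covers a whole number of refresh cycles.
def refreshes_per_frame(fps, refresh_hz):
    return refresh_hz / fps

for fps in (30, 40, 60):
    for hz in (60, 120):
        r = refreshes_per_frame(fps, hz)
        print(f"{fps} fps on {hz} Hz: {r:.2f} refreshes/frame, even: {r == int(r)}")

# 40 fps on 120 Hz -> 3.00 refreshes/frame (even pacing)
# 40 fps on 60 Hz  -> 1.50 refreshes/frame (judder)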


 
This is a god damn travesty. Console players have assured me that 30fps is very cinematic. Being 30fps is a virtue in itself. 40fps is closer to that fucking Hobbit movie, which as we know was not cinematic at all. So what, now people are going to have to toggle 120hz off every time they want to switch between multiplayer and cinematic?!?
 

TrueLegend

Member
Interesting add-on. But the game is only about 15 hours long. Most people who have it are probably done with it by now. Of course it's better for those waiting for a price drop. But here's the thing: I don't want any, and I mean any, framerate below 60 to exist. I don't care if it's 4K30, 8K30, or 100K30, I simply, very simply, want it DEAD. Begone, 30, 40, whatever FPS.
 

TheDreadBaron

Gold Member
Interesting add-on. But the game is only about 15 hours long. Most people who have it are probably done with it by now. Of course it's better for those waiting for a price drop. But here's the thing: I don't want any, and I mean any, framerate below 60 to exist. I don't care if it's 4K30, 8K30, or 100K30, I simply, very simply, want it DEAD. Begone, 30, 40, whatever FPS.
I think I prefer Dark Souls 1 at a steady 30 FPS. 🤭
 
This is a god damn travesty. Console players have assured me that 30fps is very cinematic. Being 30fps is a virtue in itself. 40fps is closer to that fucking Hobbit movie, which as we know was not cinematic at all. So what, now people are going to have to toggle 120hz off every time they want to switch between multiplayer and cinematic?!?
Is everything OK back home, buddy?
 

Thief1987

Member
Interesting add-on. But the game is only about 15 hours long. Most people who have it are probably done with it by now. Of course it's better for those waiting for a price drop. But here's the thing: I don't want any, and I mean any, framerate below 60 to exist. I don't care if it's 4K30, 8K30, or 100K30, I simply, very simply, want it DEAD. Begone, 30, 40, whatever FPS.
Do options for others, not as entitled as you, really hurt you so much?
 

hoplie

Member
From the first post:

This particular frame rate was chosen because it divides evenly into the 120Hz refresh rate of high-end TVs, so frame pacing remains even.

120 / 40 = 3, while 60 / 40 = 1.5.
 

bender

What time is it?
Why are the 40 frames limited to 120Hz TVs?

Why can't people run 40 on 60Hz 4K TVs?
 

Golgo 13

The Man With The Golden Dong
Why are the 40 frames limited to 120Hz TVs?

Why can't people run 40 on 60Hz 4K TVs?
It has to do with how frames are divided and presented. Since 40 is exactly one-third of 120Hz, it's a compatible framerate for those displays. The big gain here is in frame time, though, which makes the game feel much smoother than 30FPS.
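For reference, the frame-time arithmetic behind that claim (illustrative Python):

# Frame time in milliseconds for common targets; lower is smoother.
for fps in (30, 40, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")

# 30 -> 33.3 ms, 40 -> 25.0 ms, 60 -> 16.7 ms, 120 -> 8.3 ms.
# 25.0 ms is the midpoint of 33.3 and 16.7, so in frame-time terms
# 40 fps sits exactly halfway between 30 and 60.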
 

jigglet

Banned
Why are the 40 frames limited to 120Hz TVs?

Why can't people run 40 on 60Hz 4K TVs?

Because VRR only became a thing relatively recently. Before that, the refresh rate had to be evenly divisible by the framerate, so 60Hz monitors could only do 30 or 15fps, etc.

VRR now means they could pick literally any number below the refresh rate. It could be 43fps if they wanted. They could choose the highest framerate that stays stable. It's highly unlikely 40fps just happened to be the optimal number. The problem is that if they found 43fps was the best (for example), people would be very confused, so they just went with 40.
 

DenchDeckard

Moderated wildly
Yep, developers continue to know that 30fps is gash. I'm already done with the game, but I would have loved to test this.
 

quazy

Neo Member
Because VRR only became a thing relatively recently. Before that, the refresh rate had to be evenly divisible by the framerate, so 60Hz monitors could only do 30 or 15fps, etc.

VRR now means they could pick literally any number below the refresh rate. It could be 43fps if they wanted. They could choose the highest framerate that stays stable. It's highly unlikely 40fps just happened to be the optimal number. The problem is that if they found 43fps was the best (for example), people would be very confused, so they just went with 40.
PS5 doesn't support VRR.
 

Thief1987

Member
PS5 doesn't support VRR.
Yes, that's why they locked it at 40; with VRR you don't need to lock the framerate at all. One caveat: I think VRR doesn't work below 40fps, so for a good experience you should make sure the game stays above that most of the time.
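(For context: consumer VRR windows typically bottom out somewhere around 40-48Hz. Below that, displays supporting Low Framerate Compensation repeat each frame to stay inside the window. A rough sketch of the idea in illustrative Python; the 48Hz floor is an assumed example, not a PS5 or TV spec value:)

def lfc_refresh(fps, vrr_min_hz=48, vrr_max_hz=120):
    # Repeat each frame until the effective refresh rate enters the VRR window.
    multiplier = 1
    while fps * multiplier < vrr_min_hz:
        multiplier += 1
    refresh = fps * multiplier
    if refresh > vrr_max_hz:
        raise ValueError("framerate cannot be mapped into the VRR window")
    return refresh

print(lfc_refresh(35))  # each frame shown twice -> 70 Hz
print(lfc_refresh(55))  # already inside the window -> 55 Hz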
 

Tchu-Espresso

likes mayo on everthing and can't dance
Because VRR only became a thing relatively recently. Before that, the refresh rate had to be evenly divisible by the framerate, so 60Hz monitors could only do 30 or 15fps, etc.

VRR now means they could pick literally any number below the refresh rate. It could be 43fps if they wanted. They could choose the highest framerate that stays stable. It's highly unlikely 40fps just happened to be the optimal number. The problem is that if they found 43fps was the best (for example), people would be very confused, so they just went with 40.
This has absolutely nothing to do with VRR and everything to do with simple math: 120Hz is a multiple of 40Hz.
 

BigTnaples

Todd Howard's Secret GAF Account
Good point, why is that? So weird.

I mean, I thought it would be here by now.

It's not on their flagship TVs right now either.

Both the PS5 and the A90J/A80J 2021 OLED TVs are promising VRR in a "future update".

But they are silent as to when, and haven't really updated us on the status at all, which is unacceptable. I mean, the TVs are sold with VRR on the box, and they've been saying the PS5 would get VRR since before it launched.
 

vkbest

Member
Because VRR only became a thing relatively recently. Before that, the refresh rate had to be evenly divisible by the framerate, so 60Hz monitors could only do 30 or 15fps, etc.

VRR now means they could pick literally any number below the refresh rate. It could be 43fps if they wanted. They could choose the highest framerate that stays stable. It's highly unlikely 40fps just happened to be the optimal number. The problem is that if they found 43fps was the best (for example), people would be very confused, so they just went with 40.

Nope, this is not VRR. It's simply maths:

60 / 2 = 30 (no judder)
120 / 3 = 40 (no judder)
60 / 1.5 = 40 (judder)

You can't show the TV a frame for 1.5 refresh cycles, so you get judder or stuttering, with frames repeated inconsistently.
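To make the judder concrete, here's a small cadence sketch (illustrative Python): with plain vsync, 40fps on a 60Hz panel has to alternate frames between 2 and 1 refresh cycles, while on 120Hz every frame gets exactly 3.

import math

def cadence(fps, refresh_hz, frames=8):
    # How many refresh cycles each frame stays on screen under plain vsync.
    counts, shown = [], 0
    for i in range(1, frames + 1):
        total = math.ceil(i * refresh_hz / fps)  # refreshes elapsed by frame i
        counts.append(total - shown)
        shown = total
    return counts

print(cadence(40, 120))  # [3, 3, 3, 3, 3, 3, 3, 3] -> even pacing
print(cadence(40, 60))   # [2, 1, 2, 1, 2, 1, 2, 1] -> uneven pacing = judder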
 

Bankai

Member
Because VRR only became a thing relatively recently. Before that, the refresh rate had to be evenly divisible by the framerate, so 60Hz monitors could only do 30 or 15fps, etc.

VRR now means they could pick literally any number below the refresh rate. It could be 43fps if they wanted. They could choose the highest framerate that stays stable. It's highly unlikely 40fps just happened to be the optimal number. The problem is that if they found 43fps was the best (for example), people would be very confused, so they just went with 40.

Nope, this isn't about VRR. It's about a framerate (40) that is spread out evenly on a 120Hz display (3 × 40 = 120) and is therefore free of judder.
 

TrebleShot

Member
It's very impressive and makes for a much smoother presentation when playing in Fidelity mode. I only wish I'd had it on my first run through.
 

ZywyPL

Banned
40FPS on 120Hz sounds like a really neat idea, has anyone here tested it already? How does it feel compared to the 30 and 60FPS modes?

Although I'd personally take 4K + unlocked framerate + VRR any time of the day, it's sad that in 2021 console games are still mostly limited to either a locked 30 or 60 FPS, with very few exceptions like GoW, which runs at 40-50. The more options the better, as always.
 

rofif

Banned
40FPS on 120Hz sounds like a really neat idea, has anyone here tested it already? How does it feel compared to the 30 and 60FPS modes?

Although I'd personally take 4K + unlocked framerate + VRR any time of the day, it's sad that in 2021 console games are still mostly limited to either a locked 30 or 60 FPS, with very few exceptions like GoW, which runs at 40-50. The more options the better, as always.
There are benefits to doing it this way.
VRR on TVs can suck ass sometimes; it can flicker or make gamma look more washed out.
With 40fps on 120Hz, you lock the TV to a nice, fixed 120Hz.
VRR is still ideal, assuming it works well enough on your TV, of course.
 

rofif

Banned
This has absolutely nothing to do with VRR and everything to do with simple math: 120Hz is a multiple of 40Hz.
Yep, integer divisors of 120Hz. It means that, vsynced, these locked framerates would be smooth without judder:
1, 2, 3, 4, 5, 6, 8, 10, 12, 15, 20, 24, 30, 40, 60, 120
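That list is exactly the divisors of 120, which a one-liner can confirm (illustrative Python):

# Framerates that divide 120 evenly can each hold every frame for a
# fixed whole number of refresh cycles (no judder).
print([n for n in range(1, 121) if 120 % n == 0])
# [1, 2, 3, 4, 5, 6, 8, 10, 12, 15, 20, 24, 30, 40, 60, 120]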
 

Raonak

Banned
Very cool option. For a showpiece game like Ratchet, I'd rather have maximum graphics, so 40fps is a nice middle ground.

Although, as a graphics whore, I wonder if they could offer a Fidelity++ mode to boost RT or effects up a notch, since they do have an extra 10 frames of headroom.
 

Inviusx

Member
OK, I've just played for the last 10 minutes in 120Hz mode on an LG C9 OLED.

Honestly, it feels substantially smoother compared to Fidelity. To me, Fidelity feels almost unplayable compared to Performance RT. I could actually see some people preferring this to Performance RT if they want nothing but the absolute best quality and highest resolution at all times.

Here's my issue, though: Performance RT looks and performs so fucking good that I'm not sure I can really see much difference between Fidelity and Performance once in motion. I'm sure if I sat here and squinted at some comparisons I might be able to spot something, but honestly Performance RT is just a no-brainer.

This mode is cool, and it might be an alternative for some, but for me Performance RT is the king.

Insomniac are fucking wizards.
 

rofif

Banned
OK, I've just played for the last 10 minutes in 120Hz mode on an LG C9 OLED.

Honestly, it feels substantially smoother compared to Fidelity. To me, Fidelity feels almost unplayable compared to Performance RT. I could actually see some people preferring this to Performance RT if they want nothing but the absolute best quality and highest resolution at all times.

Here's my issue, though: Performance RT looks and performs so fucking good that I'm not sure I can really see much difference between Fidelity and Performance once in motion. I'm sure if I sat here and squinted at some comparisons I might be able to spot something, but honestly Performance RT is just a no-brainer.

This mode is cool, and it might be an alternative for some, but for me Performance RT is the king.

Insomniac are fucking wizards.
For me it's the other way around: the old 30fps Fidelity felt pretty good for a 30fps game.
Performance RT looks low-res and blurry. It feels much better, but this game is more about the graphics.
The new 40fps Fidelity should be great, as long as 4:2:2 chroma looks OK.
 

Ammogeddon

Member
For me it's the other way around: the old 30fps Fidelity felt pretty good for a 30fps game.
Performance RT looks low-res and blurry. It feels much better, but this game is more about the graphics.
The new 40fps Fidelity should be great, as long as 4:2:2 chroma looks OK.
I've seen chroma mentioned a couple of times. What are the issues with it? Mine is set to 4:4:4 in the PC mode of my TV.
 

Inviusx

Member
I've seen chroma mentioned a couple of times. What are the issues with it? Mine is set to 4:4:4 in the PC mode of my TV.

This is not the right thread for it. If I go into it again, it's just going to start an off-topic argument. Go check out the OLED thread.
 

rofif

Banned
This is not the right thread for it. If I go into it again, it's just going to start an off-topic argument. Go check out the OLED thread.
This is a good topic for it. We're discussing a game that switches the PS5 into a 120Hz 4:2:2 chroma mode.
Ammogeddon, supposedly it is a lower-quality image.
 

Inviusx

Member
This is a good topic for it. We're discussing a game that switches the PS5 into a 120Hz 4:2:2 chroma mode.
Ammogeddon, supposedly it is a lower-quality image.

OK, well, RnC is a pretty good test bed for chroma, because switching between 120Hz mode (4:2:2) and normal 4:4:4 (if you have an HDMI 2.1 compliant TV) is more or less instant. And I can tell you that right now, as I type this, I am switching between the two modes and cannot tell the difference.

As I've said in the OLED thread, there is this weird FOMO reaction around full 4:4:4 chroma, with people believing that it offers a noticeably improved picture compared to 4:2:2. People might think they are missing out on something if they can't achieve this with their set, but I think it's all BS.

If you use your TV as a monitor, having access to full 4:4:4 chroma can make an impact when trying to discern the edges of small text, and this has been demonstrated; you can find comparisons on YouTube. However, its application in gaming is redundant from what I can tell.

Some say that it improves colour banding, but I assure you that a properly configured display at 4:2:2 on PS5 exhibits no banding issues at all for me. I can only assume that those banding issues come from a TV that hasn't been configured correctly.

Did anyone complain about banding issues on the previous consoles, which couldn't achieve 4:4:4? No, they didn't, but now all of a sudden these issues come up because people perceive 4:4:4 as this holy grail that was previously unattainable. It's false.

So don't get the feeling that you're missing out by not using 4:4:4; to me, the difference is either so small that I cannot even see it, or there is no difference at all.
 
OK, well, RnC is a pretty good test bed for chroma, because switching between 120Hz mode (4:2:2) and normal 4:4:4 (if you have an HDMI 2.1 compliant TV) is more or less instant. And I can tell you that right now, as I type this, I am switching between the two modes and cannot tell the difference.

As I've said in the OLED thread, there is this weird FOMO reaction around full 4:4:4 chroma, with people believing that it offers a noticeably improved picture compared to 4:2:2. People might think they are missing out on something if they can't achieve this with their set, but I think it's all BS.

If you use your TV as a monitor, having access to full 4:4:4 chroma can make an impact when trying to discern the edges of small text, and this has been demonstrated; you can find comparisons on YouTube. However, its application in gaming is redundant from what I can tell.

Some say that it improves colour banding, but I assure you that a properly configured display at 4:2:2 on PS5 exhibits no banding issues at all for me. I can only assume that those banding issues come from a TV that hasn't been configured correctly.

Did anyone complain about banding issues on the previous consoles, which couldn't achieve 4:4:4? No, they didn't, but now all of a sudden these issues come up because people perceive 4:4:4 as this holy grail that was previously unattainable. It's false.

So don't get the feeling that you're missing out by not using 4:4:4; to me, the difference is either so small that I cannot even see it, or there is no difference at all.

I mostly agree with this. As an owner of an LG CX, I think a lot of my anxiety around settings is about missing out on the most optimised configuration, without really caring what the minor differences actually are. I have 4:4:4 set for PC gaming but haven't messed around with it at all yet for the PS5.
 
I've read a lot that says in theory 4:4:4 shouldn't make any difference to gaming, and I've not tried Ratchet at 4:2:2, but I get obvious banding in some games at 4:2:2 that completely disappears at 4:4:4. It's really obvious in something dark like RE8 or the caves in AC Valhalla. You can hide the banding by crushing blacks, but it's better to have 4:4:4. Maybe it's the fault of the console or the TV, but 4:4:4 looks perfect with zero banding.

I think the HDMI chip in the PS5 can handle 40Gbps, so I'm not sure why it is capped.
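For what it's worth, the usual explanation is raw bandwidth. Rough uncompressed data-rate arithmetic (illustrative Python; this ignores HDMI blanking intervals and FRL encoding overhead, which push the real link requirements higher):

# Approximate uncompressed video data rate: pixels/s x bits per sample x samples per pixel.
# 4:4:4 carries 3 full samples per pixel; 4:2:2 averages 2 per pixel.
def data_rate_gbps(width, height, hz, bit_depth, samples_per_pixel):
    return width * height * hz * bit_depth * samples_per_pixel / 1e9

print(data_rate_gbps(3840, 2160, 120, 10, 3))  # 4K120 10-bit 4:4:4 -> ~29.9 Gbps
print(data_rate_gbps(3840, 2160, 120, 10, 2))  # 4K120 10-bit 4:2:2 -> ~19.9 Gbps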
 