
Ratchet & Clank: Rift Apart 40FPS!!??

Status
Not open for further replies.

rickybooby87

Member
Nov 2, 2020
458
1,532
380

Ratchet & Clank: Rift Apart Patch 1.002 Adds 40FPS Mode for 120Hz TVs




Here's an interesting addition to Ratchet & Clank: Rift Apart found in patch 1.002: if you own a TV capable of running at 120Hz, the PlayStation 5 exclusive will increase the frame rate of its Fidelity Mode to 40 frames-per-second. The patch is available to download now and works to increase the frame rate for television sets that can handle the boost. With a higher frame rate, the game will run even better whilst maintaining the visual quality of the normal Fidelity Mode. This particular frame rate has been chosen because it divides equally into the 120hz of high-end TVs, meaning frame pacing continues to be even.
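The arithmetic behind that choice is easy to verify. As an illustration only (a simple Python sketch, not anything from the patch itself), here are the frame times for each mode and how many refreshes of a 120Hz panel each frame occupies:

```python
# Frame time for each mode, and how many refreshes of a 120Hz panel
# each frame is held for. 40fps occupies a whole number of refreshes
# (3), so frame pacing stays even; it also lands exactly halfway
# between 30fps and 60fps in frame time (33.3ms -> 25ms -> 16.7ms).
def frame_time_ms(fps):
    return 1000 / fps

for fps in (30, 40, 60):
    refreshes = 120 / fps  # refreshes per frame on a 120Hz display
    print(f"{fps}fps -> {frame_time_ms(fps):.1f}ms per frame, "
          f"held for {refreshes:g} refreshes")
```

Note that although 40fps is only a third of the way from 30 to 60 in framerate, it is half of the way there in frame time, which is why the jump feels bigger than the number suggests.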


 
Oct 5, 2013
2,964
772
710
This is a god damn travesty. Console players have assured me that 30fps is very cinematic. Being 30fps is a virtue in itself. 40fps is closer to that fucking Hobbit movie, which as we know was not cinematic at all. So what, now people are going to have to toggle 120hz off every time they want to switch between multiplayer and cinematic?!?
 

TrueLegend

Member
Jun 7, 2021
650
1,564
620
Interesting addon. But the game is like 15 hours long. Most people who have it are probably done with it now. Ofc it's better for those who are waiting for the price drops. But here is the thing. I don't want any and I mean any framerate below 60 to exist. I don't care if it's 4k30, 8K30, or 100K30 I simply, very simply want it DEAD. Begone 30,40 whatever FPS.
 

TheDreadBaron

Member
Oct 27, 2019
320
386
405
Interesting addon. But the game is like 15 hours long. Most people who have it are probably done with it now. Ofc it's better for those who are waiting for the price drops. But here is the thing. I don't want any and I mean any framerate below 60 to exist. I don't care if it's 4k30, 8K30, or 100K30 I simply, very simply want it DEAD. Begone 30,40 whatever FPS.
I think I prefer dark souls 1 at a steady 30 FPS. 🤭
 
Jun 28, 2013
3,479
3,863
895
This is a god damn travesty. Console players have assured me that 30fps is very cinematic. Being 30fps is a virtue in itself. 40fps is closer to that fucking Hobbit movie, which as we know was not cinematic at all. So what, now people are going to have to toggle 120hz off every time they want to switch between multiplayer and cinematic?!?
Is everything ok back home buddy?
 

Thief1987

Member
Jan 8, 2018
812
2,186
460
Interesting addon. But the game is like 15 hours long. Most people who have it are probably done with it now. Ofc it's better for those who are waiting for the price drops. But here is the thing. I don't want any and I mean any framerate below 60 to exist. I don't care if it's 4k30, 8K30, or 100K30 I simply, very simply want it DEAD. Begone 30,40 whatever FPS.
Do options for others, not as entitled as you, really hurt you that much?
 

//DEVIL//

Member
May 28, 2014
2,404
1,856
690
Why is 40fps limited to 120Hz TVs?

Why can't people run 40fps on 60Hz 4K TVs?
 

hoplie

Member
Dec 8, 2020
50
112
260
From the first post:

This particular frame rate has been chosen because it divides equally into the 120hz of high-end TVs, meaning frame pacing continues to be even.

120/40 = 3, while 60/40 = 1.5.
 

bender

Candy Corn Aficionado
Apr 12, 2010
12,382
22,100
1,480
Why is 40fps limited to 120Hz TVs?

Why can't people run 40fps on 60Hz 4K TVs?
 

Golgo 13

The Man With The Golden Dong
Jun 14, 2014
5,306
3,720
915
Why is 40fps limited to 120Hz TVs?

Why can't people run 40fps on 60Hz 4K TVs?
Has something to do with the division and presentation of frames. Since 40 is exactly 1/3rd of 120Hz it's a compatible framerate with those displays. The big gain here is the shorter frame time, though, which makes the game feel much smoother than 30FPS.
 

jigglet

Member
May 18, 2020
3,412
6,141
630
Why is 40fps limited to 120Hz TVs?

Why can't people run 40fps on 60Hz 4K TVs?

Because VRR only became a thing relatively recently. Before that, the refresh rate had to be evenly divisible by the framerate, so 60Hz monitors could only do 30 or 15fps, etc.

VRR now means they could pick literally any number below the refresh rate. It could be 43 frames if they wanted it to. They could choose the top most frame that is stable. It's highly unlikely 40fps just happened to be the optimal number. The problem is if they found 43fps was the best (for example) people would be very confused so they just went with 40.
 

DenchDeckard

Member
Feb 28, 2021
2,528
4,550
395
Yep, developers continue to know that 30fps is gash. Already done with the game but would have loved to test this.
 

quazy

Neo Member
Oct 25, 2015
33
31
320
Because VRR only became a thing relatively recently. Before that, the refresh rate had to be evenly divisible by the framerate, so 60Hz monitors could only do 30 or 15fps, etc.

VRR now means they could pick literally any number below the refresh rate. It could be 43 frames if they wanted it to. They could choose the top most frame that is stable. It's highly unlikely 40fps just happened to be the optimal number. The problem is if they found 43fps was the best (for example) people would be very confused so they just went with 40.
PS5 doesn't support VRR.
 

Thief1987

Member
Jan 8, 2018
812
2,186
460
PS5 doesn't support VRR.
Yes, that's why they locked it at 40; with VRR you don't need to lock the framerate at all. One caveat: I think VRR doesn't work below 40fps, so for a good experience you should make sure the game exceeds it most of the time at least.
 

Tchu-Espresso

likes mayo on everthing and can't dance
Apr 14, 2006
4,758
1,646
1,700
Because VRR only became a thing relatively recently. Before that, the refresh rate had to be evenly divisible by the framerate, so 60Hz monitors could only do 30 or 15fps, etc.

VRR now means they could pick literally any number below the refresh rate. It could be 43 frames if they wanted it to. They could choose the top most frame that is stable. It's highly unlikely 40fps just happened to be the optimal number. The problem is if they found 43fps was the best (for example) people would be very confused so they just went with 40.
This has absolutely nothing to do with VRR and more to do with simple math. 120hz is a multiple of 40hz.
 

BigTnaples

Todd Howard's Secret GAF Account
Feb 10, 2011
13,652
416
1,010
Good point, why is that? So weird

I mean. I thought it would be here by now.

It’s not on their flagship TVs right now either.

Both the PS5 and A90J/A80J OLED 2021 TVs are promising VRR in a “future update”.

But they are silent as to when, haven’t updated us at all on the status really. Which is really unacceptable. I mean. The TVs are sold with VRR on the box. And the PS5 they have said was getting VRR since before it launched.
 

vkbest

Member
Jan 23, 2017
1,729
1,687
500
Because VRR only became a thing relatively recently. Before that, the refresh rate had to be evenly divisible by the framerate, so 60Hz monitors could only do 30 or 15fps, etc.

VRR now means they could pick literally any number below the refresh rate. It could be 43 frames if they wanted it to. They could choose the top most frame that is stable. It's highly unlikely 40fps just happened to be the optimal number. The problem is if they found 43fps was the best (for example) people would be very confused so they just went with 40.

Nope, this is not VRR. It's simply maths:

60/2 = 30 (no judder)
120/3 = 40 (no judder)
60/1.5 = 40 (judder)

You can't hold an image on the TV for 1.5 refreshes, so you get judder or stuttering: frames end up repeated inconsistently.
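The inconsistent repetition described above can be sketched in a few lines of Python (a simplified vsync model for illustration, not how the console actually presents frames):

```python
# How many display refreshes each game frame stays on screen, assuming
# plain vsync. Even hold counts mean smooth pacing; a mix of 1s and 2s
# is the inconsistent frame repetition that reads as judder.
def hold_pattern(refresh_hz, fps, frames=8):
    step = refresh_hz / fps  # refreshes per game frame (may be fractional)
    return [int((i + 1) * step) - int(i * step) for i in range(frames)]

print(hold_pattern(120, 40))  # [3, 3, 3, 3, 3, 3, 3, 3] -> even, no judder
print(hold_pattern(60, 30))   # [2, 2, 2, 2, 2, 2, 2, 2] -> even, no judder
print(hold_pattern(60, 40))   # [1, 2, 1, 2, 1, 2, 1, 2] -> uneven, judder
```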
 

Bankai

Member
Jul 14, 2015
1,081
1,689
625
Because VRR only became a thing relatively recently. Before that, the refresh rate had to be evenly divisible by the framerate, so 60Hz monitors could only do 30 or 15fps, etc.

VRR now means they could pick literally any number below the refresh rate. It could be 43 frames if they wanted it to. They could choose the top most frame that is stable. It's highly unlikely 40fps just happened to be the optimal number. The problem is if they found 43fps was the best (for example) people would be very confused so they just went with 40.

Nope, this isn't about VRR. It's about a framerate (40) which is "spread out evenly" on a 120Hz display (3x40=120) and is therefore free of judder.
 

TrebleShot

Member
Sep 30, 2020
468
944
340
It's very impressive and makes for a much smoother presentation when playing in fidelity mode. I only wish I'd had it on my first run through.
 

ZywyPL

Member
Nov 27, 2018
6,049
10,761
755
40FPS on 120Hz sounds like a really neat idea, has anyone here tested it already? How does it feel compared to 30 and 60FPS modes?

Although I'd personally take 4K+unlocked framerate+VRR any time of the day, it's sad that in 2021 games on consoles are still limited mostly to just either locked 30 or 60 FPS, with very few exceptions like GoW that runs at 40-50. The more option the better, as always.
 

rofif

Member
Sep 13, 2019
7,819
11,028
670
40FPS on 120Hz sounds like a really neat idea, has anyone here tested it already? How does it feel compared to 30 and 60FPS modes?

Although I'd personally take 4K+unlocked framerate+VRR any time of the day, it's sad that in 2021 games on consoles are still limited mostly to just either locked 30 or 60 FPS, with very few exceptions like GoW that runs at 40-50. The more option the better, as always.
There are benefits to this way of doing it.
VRR on TVs can suck ass sometimes. It can flicker or make gamma look more washed out.
With 40fps on 120hz, you lock tv to nice 120hz.
VRR is still ideal assuming it works well enough on your tv of course
 

rofif

Member
Sep 13, 2019
7,819
11,028
670
This has absolutely nothing to do with VRR and more to do with simple math. 120hz is a multiple of 40hz.
Yep. 120Hz integer divisors. Meaning, vsynced, these locked framerates would be smooth without judder:
1, 2, 3, 4, 5, 6, 8, 10, 12, 15, 20, 24, 30, 40, 60, 120
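That list checks out; a one-liner (Python, just for illustration) reproduces every framerate that divides 120 evenly:

```python
# Every framerate that fits a whole number of times into 120Hz, i.e.
# every vsync-friendly framerate lock on a 120Hz display.
divisors = [fps for fps in range(1, 121) if 120 % fps == 0]
print(divisors)
# [1, 2, 3, 4, 5, 6, 8, 10, 12, 15, 20, 24, 30, 40, 60, 120]
```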
 

Raonak

Member
Aug 19, 2010
8,618
1,993
1,065
30
New Zealand
dreammodule.com
Very cool option. For a showpiece game like Ratchet, I'd rather have maximum graphics, so 40fps is a nice middle ground.

Although... as a graphics whore, I wonder if they could offer a Fidelity++ mode to boost RT or effects up a notch, since they do have an extra 10fps of headroom.
 

Inviusx

Member
Jan 4, 2016
2,199
2,634
540
Ok I've just played for the last 10mins in 120hz mode on an LG C9 OLED.

Honestly it feels substantially smoother compared to Fidelity. To me, Fidelity feels almost unplayable compared to Performance RT. I could actually see this being something that some people may prefer to Performance RT if they want nothing but the absolute best quality and highest resolution at all times.

Here's my issue though, Performance RT looks and performs so fucking good that I'm not sure I can really see much difference between Fidelity and Performance once in motion. I'm sure if I sat here and squinted at some comparisons I might be able to spot something but honestly Performance RT is just a no brainer.

This mode is cool and it might be an alternative for some but for me Performance RT is the king.

Insomniac are fucking wizards.
 

rofif

Member
Sep 13, 2019
7,819
11,028
670
Ok I've just played for the last 10mins in 120hz mode on an LG C9 OLED.

Honestly it feels substantially smoother compared to Fidelity. To me, Fidelity feels almost unplayable compared to Performance RT. I could actually see this being something that some people may prefer to Performance RT if they want nothing but the absolute best quality and highest resolution at all times.

Here's my issue though, Performance RT looks and performs so fucking good that I'm not sure I can really see much difference between Fidelity and Performance. I'm sure if I sat here and squinted at some comparisons I might be able to spot something but honestly Performance RT is just a no brainer.

This mode is cool and it might be an alternative for some but for me Performance RT is the king.

Insomniac are fucking wizards.
For me it's the other way around. The old 30fps fidelity felt pretty good for a 30fps game.
Performance RT looks low-res and blurry. It feels much better, but this game is more about the graphics.
The new 40fps fidelity should be great as long as 4:2:2 chroma looks OK.
 

Ammogeddon

Member
Jul 14, 2014
734
314
535
For me it's the other way around. The old 30fps fidelity felt pretty good for a 30fps game.
Performance RT looks low-res and blurry. It feels much better, but this game is more about the graphics.
The new 40fps fidelity should be great as long as 4:2:2 chroma looks OK.
I’ve seen chroma mentioned a couple of times. What are the issues with it? Mine is set to 444 in the pc mode of my TV.
 

Inviusx

Member
Jan 4, 2016
2,199
2,634
540
I’ve seen chroma mentioned a couple of times. What are the issues with it? Mine is set to 444 in the pc mode of my TV.

This is not the right thread for it. If I go into it again it's just going to start an off topic argument. Go check out the OLED thread.
 

rofif

Member
Sep 13, 2019
7,819
11,028
670
This is not the right thread for it. If I go into it again it's just going to start an off topic argument. Go check out the OLED thread.
This is a good topic for it. We're discussing a game that switches the PS5 into 120Hz 4:2:2 chroma mode.
@Ammogeddon supposedly it is a lower-quality image.
 

Inviusx

Member
Jan 4, 2016
2,199
2,634
540
This is a good topic for it. We're discussing a game that switches the PS5 into 120Hz 4:2:2 chroma mode.
@Ammogeddon supposedly it is a lower-quality image.

Ok well, RnC is a pretty good test bed for chroma because switching between 120hz mode (4:2:2) and normal 4:4:4 (if you have a HDMI 2.1 compliant TV) is more or less instant. And I can tell you that right now as I type this I am switching between each mode and cannot tell the difference.

As I've said in the OLED thread, there is this weird FOMO reaction around full 4:4:4 chroma with people believing that it offers a noticeably improved picture compared to 4:2:2. People might think they are missing out on something if they can't achieve this with their set but I think it's all BS.

If you use your TV as monitor, having access to full 4:4:4 chroma can make an impact when trying to discern edges on small areas of text and this has been proven, you can find this on YouTube. However its application in gaming is redundant from what I can tell.

Some say that it improves colour banding but I assure you that a properly configured display at 4:2:2 on PS5 exhibits no banding issues at all for me. I can assume that these banding issues would come from a TV set that hasn't been configured correctly.

Did anyone complain about banding issues on the previous consoles that couldn't achieve 4:4:4? No, they didn't, but now all of a sudden these issues come up because people perceive 4:4:4 as this holy grail that was previously unattainable. It's false.

So don't get this feeling that you're missing out by not using 4:4:4, to me the difference is either so small that I cannot even see it or there is no difference at all.
 
Jul 2, 2014
929
744
600
Ok well, RnC is a pretty good test bed for chroma because switching between 120hz mode (4:2:2) and normal 4:4:4 (if you have a HDMI 2.1 compliant TV) is more or less instant. And I can tell you that right now as I type this I am switching between each mode and cannot tell the difference.

As I've said in the OLED thread, there is this weird FOMO reaction around full 4:4:4 chroma with people believing that it offers a noticeably improved picture compared to 4:2:2. People might think they are missing out on something if they can't achieve this with their set but I think it's all BS.

If you use your TV as monitor, having access to full 4:4:4 chroma can make an impact when trying to discern edges on small areas of text and this has been proven, you can find this on YouTube. However its application in gaming is redundant from what I can tell.

Some say that it improves colour banding but I assure you that a properly configured display at 4:2:2 on PS5 exhibits no banding issues at all for me. I can assume that these banding issues would come from a TV set that hasn't been configured correctly.

Did anyone complain about banding issues on the previous consoles that couldn't achieve 4:4:4? No, they didn't, but now all of a sudden these issues come up because people perceive 4:4:4 as this holy grail that was previously unattainable. It's false.

So don't get this feeling that you're missing out by not using 4:4:4, to me the difference is either so small that I cannot even see it or there is no difference at all.

I mostly agree with this. As an owner of an LG CX I think a lot of my anxiety around settings is that I am missing out on the most optimised settings, not even really caring about what the minor differences are. I have 4:4:4 for PC gaming but not messed around at all yet for the PS5.
 
Sep 22, 2020
681
1,058
385
I've read a lot that says in theory 4:4:4 shouldn't make any difference to gaming, and I've not tried Ratchet at 4:2:2, but I get obvious banding in some games at 4:2:2 that completely disappears at 4:4:4. Really obvious in something dark like RE8 or the caves in AC Valhalla. You can hide the banding by crushing blacks, but it's better to have 4:4:4. Maybe the fault of the console or the TV, but 4:4:4 looks perfect with zero banding.

I think the HDMI chip in the PS5 can handle 40Gbps so not sure why it is capped.
 