It's a cool feature, but if 60fps is available I will always pick that option.
If 120fps is an option I will always pick that over 60fps
So would I, if I had a TV that supports 120fps. How many games even have a 120fps mode?
Thinking the same thing. 24fps would mean ~42ms lag, but would allow for more graphical bells and whistles. It would be interesting to see a dev make a game with that framerate and see how the input lag would affect the experience.
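Rough math behind that number (this is just frame duration; real input lag adds display and pipeline latency on top):

```python
# Frame duration in milliseconds for a given framerate.
def frametime_ms(fps: float) -> float:
    return 1000.0 / fps

# 24fps -> ~41.7ms per frame, which is where the "~42ms" figure comes from.
for fps in (24, 30, 40, 60, 120):
    print(f"{fps:>3} fps -> {frametime_ms(fps):.1f} ms per frame")
```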
It's a number of things. But first of all, the PS5's bottleneck is its bandwidth, so it definitely could be a case where the system has more breathing room at 30fps. Second, Insomniac's engine has been focused on 30fps for a long time, to the point where the 60fps mode on Ratchet wasn't even on the disc unless you forced it in the PS5's menu. Ratchet on PS5 was a 30fps game first.
Here is something I don't understand. They are able to do a locked 40fps at native 4K with some drops to 1800p. That's roughly 8.3 million pixels, with the low end being about 5.8 million pixels.
But then for the 60fps mode, they not only drop the resolution to 1440p (3.6 million pixels) with the low end being 1080p, or 2.1 million pixels, but they also have to downgrade the visuals: reducing the crowds, the skybox detail, the lighting, and other VFX. Why? Typically, just cutting the resolution in half frees up enough of the GPU to get the same visual quality at 2x the framerate. For some reason, cutting the resolution by half or even two thirds is not good enough here when it comes to maintaining the same visual quality.
The GPU can handle it at 40fps and native 4K, so clearly that's not the bottleneck. Is it the memory bandwidth? Or the CPU? I can attribute the downgrade in crowds and flying ships to the CPU, but the other lighting downgrades? Insomniac just needs to release a 1440p fidelity mode so we can see where the bottleneck is.
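Quick sketch of the pixel counts being thrown around (assuming "1800p" means 3200x1800 here):

```python
# Pixel counts for the resolutions mentioned above.
resolutions = {
    "4K":    (3840, 2160),
    "1800p": (3200, 1800),  # assumed 16:9 1800p
    "1440p": (2560, 1440),
    "1080p": (1920, 1080),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count / 1e6:.2f} million pixels")

# Fidelity mode pushes 2.25x the pixels of performance mode, while a
# 40fps frame only gets 1.5x the frametime of a 60fps frame.
print(pixels["4K"] / pixels["1440p"])  # -> 2.25
```

That mismatch (2.25x pixels vs 1.5x frametime) is part of why resolution alone can't buy back the whole frame budget.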
You would think 448 GBps would be enough though. The rtx 2080 cards are easily able to do 1440p 60 fps in games like Metro that run at native 4k 30 fps. No need to change settings to medium or low. Everything just scales down.
We take for granted perfect framerate in multiple modes these days, but it's a ton of work. Think back to the fps we got in 5th gen.
Not just any 120Hz TVs. Most 120Hz TVs only have 1080p 120Hz modes. Only the new HDMI 2.1 compatible TVs, like the LG CX and Sony's own X900H, have 120Hz at 4K. I believe a couple of the new LG and Sony OLEDs have HDMI 2.1 capabilities, but they cost thousands of dollars. Still, as an LG CX owner, I am happy they added it.
Awesome, the 3 people who have a 120Hz display are going to use it.
If you really want 120Hz, PC is better. The point of high refresh rates is being able to be faster, so KB&M is natural for it.
This doesn't work on a 120hz screen without HDMI 2.1?
40 fps is the new 30 for 120hz/HDMI 2.1 TV owners.
Same consistent fidelity as 30fps, but reduced input lag and a smoother gaming experience.
It's not enough to let the gpu perform at 100% at all times in all scenarios. Ditto ps4, but the limitation is even bigger on ps5, though not as bad as ps4 pro's bottleneck.
Maybe the CPU is using up a lot of the memory bandwidth?
Either way, I think devs need to start thinking ahead and leave unlocked modes in their games. That way, when the PS5 Pro does come out, they won't need to go back and patch in 60fps fidelity modes.
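Back-of-the-envelope on the bandwidth theory (448 GB/s is the PS5's quoted peak; treat this as an upper bound, since real workloads never hit peak and the CPU shares the same bus):

```python
PEAK_BANDWIDTH_GB_S = 448  # PS5's quoted peak memory bandwidth, shared by CPU and GPU

# Upper bound on memory traffic available per frame at each framerate.
# A 30fps frame can touch roughly twice as much memory as a 60fps frame,
# and anything the CPU reads or writes comes out of the same budget.
for fps in (30, 40, 60, 120):
    print(f"{fps:>3} fps -> ~{PEAK_BANDWIDTH_GB_S / fps:.1f} GB of bus traffic per frame")
```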
The point of high refresh rates is being able to be faster
It's a pretty big list at this point.
Ratchet should absolutely be played at 120fps.
They said in terms of frametime it's a midpoint, and yeah 40fps is the closest we can get to a midpoint given display refresh rates.
Midway point between 30 and 60fps is actually 45fps, not 40.
And yes, I know that 45 is not a multiple of 120; that still doesn't mean you can blatantly lie.
It is when you look at frametime. 16.6ms -> 25ms -> 33.3ms
Tomatoes, tomatoes.
No, you can play at 40 FPS on a 1080p device at 120Hz, but you need HDMI 2.1 with 4K 120Hz to get the full-fat native 4K 40fps.
You need 120hz to play at 40 fps. You don't need it to be 4K 120Hz.
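The divisibility point, sketched out (hypothetical helper name; the idea is that a framerate paces evenly only when the refresh rate is an exact multiple of it):

```python
def paces_evenly(fps: int, refresh_hz: int) -> bool:
    # Each frame must be held on screen for a whole number of refresh cycles.
    return refresh_hz % fps == 0

for hz in (60, 120):
    ok = [fps for fps in (24, 30, 40, 45, 60) if paces_evenly(fps, hz)]
    print(f"{hz}Hz display: even pacing at {ok}")
```

Note that 45fps fits neither grid, which is why 40fps gets used instead of the "true" 45fps midpoint.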
It's not a matter of "why is it like this"; it's just a matter of I don't really care for this fidelity. 60fps is fine.
Watch the video so you can see why that is. The reason is not that they couldn't do it; it's that without a TV that supports it, it would look and feel wrong because of the uneven frame pacing on a normal 60Hz TV.
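A small simulation of what that unevenness looks like (a sketch assuming a simple vsync model: each finished frame is shown at the next refresh boundary):

```python
import math

def presented_intervals_ms(fps: float, refresh_hz: float, frames: int = 6):
    # Show each finished frame at the next vsync boundary, then measure
    # how long each frame actually stays on screen.
    vsync = 1000.0 / refresh_hz
    frame = 1000.0 / fps
    shown = [math.ceil((i + 1) * frame / vsync) * vsync for i in range(frames)]
    return [round(b - a, 1) for a, b in zip([0.0] + shown, shown)]

# 40fps on a 60Hz panel alternates long and short holds -> visible judder.
print(presented_intervals_ms(40, 60))
# 40fps on a 120Hz panel holds every frame for the same duration.
print(presented_intervals_ms(40, 120))
```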
I mean?
Someone explain it like I'm five. How is 40 the midway point of 30 and 60?
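The arithmetic, spelled out: it's the midpoint in frame time, not frame rate.

```python
t30 = 1000 / 30                 # a 30fps frame takes ~33.3ms
t60 = 1000 / 60                 # a 60fps frame takes ~16.7ms
midpoint_ms = (t30 + t60) / 2   # 25ms, halfway between the two frametimes
print(round(1000 / midpoint_ms, 1))  # -> 40.0 fps
```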
Fixed. When displays with HDMI 2.1 become the norm, 40fps will be the new 30fps. But it will be when the PS6 comes out in 2028 that this becomes an industry-wide standard, imo.
40fps is the new 30fps
I'll take VRR any day of the week.