
PS5: VRR Update Is Reportedly Coming In December

If online TV discussions are anything to go by, people will be in for a bit of a shock when their image quality goes to crap with VRR enabled, because TVs just don't behave properly with varying refresh rates.
It's a non-issue 99.9% of the time. Can't remember the last time I saw any black flickering.
Most of the fanfare is drummed up only because PS5 doesn’t have it. Once it does, watch the conversation dry up into nothing.
Nope, it's a legit good feature. Imagine having screen tearing and stutter in the year of our Lord 2021.
 

ParaSeoul

Member
You must not have played many games then lol.
Noticed it more with PS3 games, haven't noticed any in what I've played recently. Except Saints Row IV on PS4, but then again that's rough in all aspects.
VRR is better than vsync. It's not even debatable. Lower input lag while allowing higher framerates. How we still have people like you defending Sony for not having VRR is wild. Absolute madness.
Never said any of this lmao.
 

MrFunSocks

Banned
Noticed it more with PS3 games, haven't noticed any in what I've played recently. Except Saints Row IV on PS4, but then again that's rough in all aspects.

Never said any of this lmao.
I'm sure you were just pretending that screen tearing isn't a thing anymore for some other reason then...
 

Chukhopops

Member
If you need a perfect example of how useful VRR is, try Tales of Arise or any other game with a 45-60 FPS quality mode. Quality mode looks massively better than Perf mode but is unplayable without VRR.

The flicker argument is ridiculous btw: if you are in a hypothetical, rare situation where you have instantaneous variation of framerate from 120 to 60 then your experience will STILL be dramatically worse without VRR than with it.

But I remain hopeful that Sony will implement it and suddenly:
- its usefulness will never be questioned again;
- VRR adoption rate will jump from <1% to 100% overnight;
- all DF/VG tech comparisons will be meaningless because of VRR.
 

Panajev2001a

GAF's Pleasant Genius
If you need a perfect example of how useful VRR is, try Tales of Arise or any other game with a 45-60 FPS quality mode. Quality mode looks massively better than Perf mode but is unplayable without VRR.

The flicker argument is ridiculous btw: if you are in a hypothetical, rare situation where you have instantaneous variation of framerate from 120 to 60 then your experience will STILL be dramatically worse without VRR than with it.

But I remain hopeful that Sony will implement it and suddenly:
- its usefulness will never be questioned again;
- VRR adoption rate will jump from <1% to 100% overnight;
- all DF/VG tech comparisons will be meaningless because of VRR.

VRR is useful, but 1) its gamma shift and flickering issues on OLED panels, for example, are ignored by some, and 2) its effect in games is sometimes oversold (just because you are not going to see stuttering or tearing, a game constantly moving between 40 and 60 FPS will still appear to accelerate and decelerate).

The day PS5 gets it, you will see it used as a club a lot less and it will fade in prominence (the number of people who get to use it is a fraction of the whole user base) as it cannot be used for warring anymore.
 

mitchman

Gold Member
LG has had it in their TVs for years and it still has problems; see a few posts above yours.
I have a B9 with an RTX 3080 PC, XSX and PS5 connected to it, and I can honestly say I've never seen any of this flickering some talk about. I run it with VRR/FreeSync/G-Sync on for the devices supporting it. I also have a desktop 48" CX I run off an RX 6900XT with FreeSync Premium enabled at 120 Hz, and I can't say it's something I've noticed there either. This seems to have been blown out of proportion, kinda like the burn-in risks.
 

Chukhopops

Member
VRR is useful, but 1) its gamma shift and flickering issues on OLED panels, for example, are ignored by some, and 2) its effect in games is sometimes oversold (just because you are not going to see stuttering or tearing, a game constantly moving between 40 and 60 FPS will still appear to accelerate and decelerate).

The day PS5 gets it, you will see it used as a club a lot less and it will fade in prominence (the number of people who get to use it is a fraction of the whole user base) as it cannot be used for warring anymore.
I disagree with all of that, again in the scenario where you’d see flicker (dark scene, fast variation) things would look even worse without it (tearing and/or frame drops all over the place). Also I’ve never experienced it personally.

Locking the frame rate to 60 at all times to remove any frame drop means leaving some performance on the table, and with a variable-rate TV you won't notice the difference between 60 fps (16.7 ms frame time) and 50 fps (20 ms, so 3.3 ms more) or even 45 fps (22.2 ms).

There’s a reason why variable rates are everywhere in the PC world: it’s because they are flat out better than the alternatives (V-sync, locking a frame rate, etc). Nothing to do with console war.
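For reference, the frame-time figures above are just 1000/fps. A minimal sketch of that arithmetic (hypothetical Python, not from the post):

```python
# Frame time in milliseconds is 1000 / fps; the delta vs a locked 60 shows
# how little is actually lost at 50 or 45 fps.
for fps in (60, 50, 45, 40, 30):
    frame_time = 1000 / fps              # ms spent on each frame
    delta = frame_time - 1000 / 60       # extra ms per frame vs 60 fps
    print(f"{fps:>3} fps = {frame_time:5.1f} ms/frame (+{delta:4.1f} ms vs 60)")
```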
 

Panajev2001a

GAF's Pleasant Genius
in the scenario where you’d see flicker (dark scene, fast variation) things would look even worse without it (tearing and/or frame drops all over the place)
That… is like your opinion man ;). Flicker and generally raised gamma still feel worse to me.
 

Panajev2001a

GAF's Pleasant Genius
I disagree with all of that, again in the scenario where you’d see flicker (dark scene, fast variation) things would look even worse without it (tearing and/or frame drops all over the place). Also I’ve never experienced it personally.

Locking the frame rate to 60 at all times to remove any frame drop means leaving some performance on the table, and with a variable-rate TV you won't notice the difference between 60 fps (16.7 ms frame time) and 50 fps (20 ms, so 3.3 ms more) or even 45 fps (22.2 ms).

There’s a reason why variable rates are everywhere in the PC world: it’s because they are flat out better than the alternatives (V-sync, locking a frame rate, etc). Nothing to do with console war.
No difference between 45 FPS and 60 FPS? With VRR? Good thing on most of those TVs you can get 40 FPS with the TV set in 120 Hz mode, like Ratchet does on PS5… surely that is also quite good ;).
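(As an aside, the arithmetic behind that 40 FPS mode is worth spelling out; a minimal sketch, assuming the usual 1000/fps math:)

```python
# Why a 40 fps cap paces evenly inside a 120 Hz output mode: each game frame
# spans exactly three refresh cycles, so even pacing needs no VRR.
refresh_ms = 1000 / 120          # one refresh cycle: ~8.33 ms
frame_ms = 1000 / 40             # one game frame: 25.0 ms
print(frame_ms / refresh_ms)     # 3.0 -> a whole number of refreshes per frame
```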

There is a whole bunch of warring behind the VRR debacle on consoles; yes, some of it will remain in PCMR debates too.

I am not saying that VRR and FreeSync/G-Sync are not good to have (animation speed fluctuations are still here to stay and will make some people lock the framerate, or ask for it to be locked), just that right now it is a bit oversold in the console space (people gaming on VRR-enabled PC monitors and HDR-enabled TVs where this is available and artefact-free are a minority… I would not be surprised if the VRR-enabled PC monitors people report no gamma issues with do not support HDR either… what I am asking for is OLED-like black levels and contrast, VRR without gamma issues, and HDR).
 

DenchDeckard

Moderated wildly
It's a non-issue 99.9% of the time. Can't remember the last time I saw any black flickering.

Nope, it's a legit good feature. Imagine having screen tearing and stutter in the year of our Lord 2021.

I only ever see a flicker on loading screens and some start up screens. Once I'm in gameplay I can't see it.
 

Panajev2001a

GAF's Pleasant Genius
I have a B9 with an RTX 3080 PC, XSX and PS5 connected to it, and I can honestly say I've never seen any of this flickering some talk about. I run it with VRR/FreeSync/G-Sync on for the devices supporting it. I also have a desktop 48" CX I run off an RX 6900XT with FreeSync Premium enabled at 120 Hz, and I can't say it's something I've noticed there either. This seems to have been blown out of proportion, kinda like the burn-in risks.
I assume you did some calibration on the TV (like the rtings.com suggestions). Have you spent time enabling and disabling VRR in games where the framerate fluctuates and checked gamma levels in the image? That and flickers are what tend to annoy me.
 

8BiTw0LF

Banned
[For Real reaction GIF]
 

Chukhopops

Member
No difference between 45 FPS and 60 FPS? With VRR? Good thing on most of those TVs you can get 40 FPS with the TV set in 120 Hz mode, like Ratchet does on PS5… surely that is also quite good ;).

There is a whole bunch of warring behind the VRR debacle on consoles; yes, some of it will remain in PCMR debates too.

I am not saying that VRR and FreeSync/G-Sync are not good to have (animation speed fluctuations are still here to stay and will make some people lock the framerate, or ask for it to be locked), just that right now it is a bit oversold in the console space (people gaming on VRR-enabled PC monitors and HDR-enabled TVs where this is available and artefact-free are a minority… I would not be surprised if the VRR-enabled PC monitors people report no gamma issues with do not support HDR either… what I am asking for is OLED-like black levels and contrast, VRR without gamma issues, and HDR).
40 FPS is better than 30, 50 is better than 40, etc in terms of frame time. I think the 40 FPS mode is actually a great idea.

Once you remove the obligation to sync the source frame rate with the display refresh rate, frame time simply follows 1000/fps, a curve with diminishing returns (the higher the frame rate goes, the less marginal improvement is visible in the frame time).
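Those diminishing returns are easy to put numbers on (a minimal sketch, same 1000/fps math as above):

```python
# Each +10 fps step shaves less off the frame time than the previous one.
rates = [30, 40, 50, 60, 70, 80, 90, 100, 110, 120]
for lo, hi in zip(rates, rates[1:]):
    saved = 1000 / lo - 1000 / hi    # ms saved per frame by this step
    print(f"{lo:>3} -> {hi:>3} fps: {saved:4.1f} ms less per frame")
```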

But like you said, it’s a matter of preference lol. If you prefer frame rate desync and drops I respect that.
 

Panajev2001a

GAF's Pleasant Genius
40 FPS is better than 30, 50 is better than 40, etc in terms of frame time. I think the 40 FPS mode is actually a great idea.

Once you remove the obligation to sync the source frame rate with the display refresh rate, frame time simply follows 1000/fps, a curve with diminishing returns (the higher the frame rate goes, the less marginal improvement is visible in the frame time).

But like you said, it’s a matter of preference lol. If you prefer frame rate desync and drops I respect that.
I am not sure why ending a reasonable post with a jab makes much sense… “if you prefer messed up gamma and flickers on OLED / HDR panels I respect that 😂”.

For now I will take devs focusing on locking down the framerate, and enjoy my TV panel as is, thank you very much (hence why I disabled VRR on my XSX: I am not willing to mistreat my C9 for a feature LG itself does not have a good solution for. They have stated that their VRR implementation has issues with their OLED panels… with the C2 they added the ability to manually tweak the gamma with VRR on…).
 
I am not sure why ending a reasonable post with a jab makes much sense… “if you prefer messed up gamma and flickers on OLED / HDR panels I respect that 😂”.

For now I will take devs focusing on locking down the framerate, and enjoy my TV panel as is, thank you very much (hence why I disabled VRR on my XSX: I am not willing to mistreat my C9 for a feature LG itself does not have a good solution for. They have stated that their VRR implementation has issues with their OLED panels… with the C2 they added the ability to manually tweak the gamma with VRR on…).
Man has VRR and doesn't use it.

[Feelings reaction GIF]
 
No difference between 45 FPS and 60 FPS? With VRR? Good thing on most of those TVs you can get 40 FPS with the TV set in 120 Hz mode, like Ratchet does on PS5… surely that is also quite good ;).

There is a whole bunch of warring behind the VRR debacle on consoles; yes, some of it will remain in PCMR debates too.

I am not saying that VRR and FreeSync/G-Sync are not good to have (animation speed fluctuations are still here to stay and will make some people lock the framerate, or ask for it to be locked), just that right now it is a bit oversold in the console space (people gaming on VRR-enabled PC monitors and HDR-enabled TVs where this is available and artefact-free are a minority… I would not be surprised if the VRR-enabled PC monitors people report no gamma issues with do not support HDR either… what I am asking for is OLED-like black levels and contrast, VRR without gamma issues, and HDR).

Man spent thousands on a TV just to not use it appropriately.
 

Chukhopops

Member
I am not sure why ending a reasonable post with a jab makes much sense… “if you prefer messed up gamma and flickers on OLED / HDR panels I respect that 😂”.

For now I will take devs focusing on locking down the framerate, and enjoy my TV panel as is, thank you very much (hence why I disabled VRR on my XSX: I am not willing to mistreat my C9 for a feature LG itself does not have a good solution for. They have stated that their VRR implementation has issues with their OLED panels… with the C2 they added the ability to manually tweak the gamma with VRR on…).
But you're the one who said it was personal preference between the two options; there's not much to discuss after that, honestly. I'm not going to argue about what you prefer.

I’m honestly just curious which games you’ve seen on your C9/XSX with the flicker issue. I’ve literally never seen it, even when I try to force a frequency change from 60 to 120 fps.
 

Fafalada

Fafracer forever
There’s a reason why variable rates are everywhere in the PC world: it’s because they are flat out better than the alternatives (V-sync, locking a frame rate, etc). Nothing to do with console war.
No - the reason variable rates (variable everything, really) are everywhere on PC is that it's not a platform. It's millions of different hardware targets, which makes it impossible to target any one metric reliably. It's also true that the console space has been gradually fragmenting to become more like that over the past 10 years, but we're not quite there yet.
 

omegasc

Member
The gamma issues with VRR are on a per-game basis, since they come with fluctuating frame rates, if I got it right. And it has to be something going very low (50s? 40s?) and back to 60-120 all the time.
I played Borderlands 3 on PC with VRR on and barely noticed it... framerates 80+. I basically only notice it on dark loading screens.
I am not saying it is a non-issue, but there have to be a lot of 'special' conditions for it to be a real problem for me in the games I played.
I recall from a DF video that Devil May Cry V at 120fps fluctuates a lot, and I've never seen it mentioned in VRR threads... What are people playing that makes the gamma issue so noticeable?
 

Soodanim

Member
If you prefer frame rate desync and drops I respect that.
if you prefer messed up gamma and flickers on OLED / HDR panels I respect that
I don't know what you're all arguing about after this; it covers mostly everything you lot are going on about.

Panajev's point about 40-60 feeling like a speed-up is interesting, because I've only ever used G-Sync/FreeSync on PC, where it (mostly) stays above 60, and I never experienced that. But going from 30 to 60 does feel like a speed-up when it's a straight jump, so I can see how that would happen. That said, if a game goes from 40 to 60 that often, I think they fucked up.
 
If you need a perfect example of how useful VRR is, try Tales of Arise or any other game with a 45-60 FPS quality mode. Quality mode looks massively better than Perf mode but is unplayable without VRR.

The flicker argument is ridiculous btw: if you are in a hypothetical, rare situation where you have instantaneous variation of framerate from 120 to 60 then your experience will STILL be dramatically worse without VRR than with it.

But I remain hopeful that Sony will implement it and suddenly:
- its usefulness will never be questioned again;
- VRR adoption rate will jump from <1% to 100% overnight;
- all DF/VG tech comparisons will be meaningless because of VRR.
How is it going to go to 100% overnight when only a small fraction of people even own a VRR TV?
 

Chukhopops

Member
How is it going to go to 100% overnight when only a small fraction of people even own a VRR TV?
Because it's a made-up narrative that only 1% has access to it today. It's been on TVs since 2018; all mid- and high-range Samsung / LG / Vizio / TCL sets have been compatible with it since 2020 at least.

The only exception is Sony, which has a 4% market share in TVs anyway.
 

dotnotbot

Member
Because it's a made-up narrative that only 1% has access to it today. It's been on TVs since 2018; all mid- and high-range Samsung / LG / Vizio / TCL sets have been compatible with it since 2020 at least.

The only exception is Sony, which has a 4% market share in TVs anyway.

Certainly not all; for example, the newest, cheapest and probably most popular OLED model from LG right now (priced like mid-range) is 60 Hz only, with no VRR. This is what most non-enthusiasts will pair with their gaming console.

Oh, and most people don't buy mid- and high-range TVs. Look up the average $$$ spent on a TV.
 

JackSparr0w

Banned
You must not have played many games then lol.

VRR is better than vsync. It's not even debatable. Lower input lag while allowing higher framerates. How we still have people like you defending Sony for not having VRR is wild. Absolute madness.
You still need V-sync for the tearing unless your FPS never reaches the refresh rate of the display. There are so many misconceptions about V-sync, VRR and what they do.

Personally I found VRR useless and turned it off, as it introduces flickering at below-refresh-rate FPS and doesn't do anything when the FPS matches the refresh rate.
 

SF Kosmo

Al Jazeera Special Reporter
Great, but most games are 60 or 30 anyway.
And VRR only makes some sense above 40, and ideally above 60.
Otherwise it causes some gamma change or flicker on my TV (LG OLED). But that's only an issue if there is a drastic FPS change.
I wish the 120 Hz mode could be used for 60fps games too, because of the lower lag, and these TVs only have proper gamma at 120 Hz.

As far as 4K monitors go, most have 40-60 Hz VRR and it disables below 40.
And the PS5 does not support 4K @ 120 Hz full color.

Beyond flicker, VRR only really looks good if there's a relatively small pocket of variance. Going from 70fps down to 50 looks fine with VRR, but if you go from 80 down to 40 you're still gonna notice the drops big time, just because your eye can see that difference really easily.

That's less true the higher you go, though. Like you can go from 120 down to 70 and it's not that bad.
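Putting rough numbers on that (a minimal sketch; the fps pairs are the examples above, not measured data):

```python
# Drops read as jarring roughly in proportion to the frame-time delta,
# not the raw fps delta.
def delta_ms(hi_fps, lo_fps):
    return 1000 / lo_fps - 1000 / hi_fps

print(f"{delta_ms(70, 50):.1f} ms")    # ~5.7 ms  -- fine with VRR
print(f"{delta_ms(80, 40):.1f} ms")    # 12.5 ms  -- still clearly visible
print(f"{delta_ms(120, 70):.1f} ms")   # ~6.0 ms  -- not that bad
```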
 

Chukhopops

Member
Certainly not all; for example, the newest, cheapest and probably most popular OLED model from LG right now (priced like mid-range) is 60 Hz only, with no VRR. This is what most non-enthusiasts will pair with their gaming console.
The A1 is literally the entry-level OLED in LG's catalog (hence the name); that's why I said mid- and high-range. B1 and higher have it.

(I also think you’d need to be really badly informed to buy it when the CX is basically the same price at the moment)
 

dotnotbot

Member
The A1 is literally the entry-level OLED in LG's catalog (hence the name); that's why I said mid- and high-range. B1 and higher have it.

(I also think you’d need to be really badly informed to buy it when the CX is basically the same price at the moment)
Edited my comment above to add the most important thing. And yes, most people buying TVs are badly informed and won't do much research.

Maybe it's the same as the CX in the US; in Poland, for example, it's been significantly cheaper than any other OLED in the past.

Oh, and most people don't buy mid- and high-range TVs. Look up the average $$$ spent on a TV.
 

ethomaz

Banned
It has nothing to do with RDNA2... it all has to do with the HDMI chipset Sony put in the PS5. From what we know, it is a MediaTek chipset, which is the same one in the 2021 OLED screens that still do not have VRR either. From what we know, Sony has been working with MediaTek to figure out how to do it with the chipset they put into the PS5. But it has NOTHING to do with RDNA2 or any other component of the PS5... it all comes down to the HDMI chipset.
It is a Panasonic chipset (Nuvoton subsidiary).
MN864739.

Where did you get the MediaTek reference?
 

sankt-Antonio

:^)--?-<
I'm sure you were just pretending that screen tearing isn't a thing anymore for some other reason then...
On PS5, for me, it isn't a thing anymore. 30fps has not been a thing for me on PS5 either.

I was very sad that my TV didn't have VRR from the start when I got it. Turns out I don't really care about VRR anymore, as the games I've played thus far have all been tear-free and 60fps.

This could change, and I hope VRR is implemented by then (Sony TV & PS5). But...since most of the games next year are still crossgen titles, I think it'll stay the way it is now for longer (not a thing).
 

ethomaz

Banned
On PS5, for me, it isn't a thing anymore. 30fps has not been a thing for me on PS5 either.

I was very sad that my TV didn't have VRR from the start when I got it. Turns out I don't really care about VRR anymore, as the games I've played thus far have all been tear-free and 60fps.

This could change, and I hope VRR is implemented by then (Sony TV & PS5). But...since most of the games next year are still crossgen titles, I think it'll stay the way it is now for longer (not a thing).
People forget most games on PS5 are V-synced... so they don't have tearing.
 

ToTTenTranz

Banned
It is a Panasonic chipset (Nuvoton subsidiary).
MN864739.

Apparently no one knows for sure what that chipset does, since the PS5 is the only device using it so far, but the previous chips with similar model numbers in the PS4 and PS4 Pro were DisplayPort->HDMI converters, meaning the original source is still the SoC outputting DP. Hence the console depends on a firmware update to achieve an output with higher bandwidth.
 