
VRR: does it help frame rate or just alleviate tearing?

NinjaBoiX

Member
Really simple question, this thread will likely be done and dusted in a few replies but this seems like the best place to ask:

From my understanding VRR basically allows the display to dictate when to display a frame rather than relying on the source. So it eliminates tearing by simply waiting until a full frame is ready to display regardless of what the source says.

Yet I’m watching the next gen Destiny 2 DF video and they seem to be suggesting that VRR will mitigate the minor frame drops on Series X, which has nothing to do with tearing.

So what’s the deal?
 

NinjaBoiX

Member
They probably mean if it targets 60fps but can only do 58 fps you likely won't notice the difference if you are the one in 1000 people who have a TV that supports VRR.
But why “specifically” if you have VRR?

From my understanding VRR has zero effect on frame rate, simply on how the produced frames are presented by the display.
 

Magog.

Banned
But why “specifically” if you have VRR?

From my understanding VRR has zero effect on frame rate, simply on how the produced frames are presented by the display.

Like I said it wouldn't magically get you to 60fps but likely you couldn't tell the difference between 58fps and 60fps since it wouldn't lead to stuttering with a VRR display.
 

NinjaBoiX

Member
Like I said it wouldn't magically get you to 60fps but likely you couldn't tell the difference between 58fps and 60fps since it wouldn't lead to stuttering with a VRR display.
See this is where my understanding gets hazy, you can have stuttering even with a steady frame rate correct?

Someone come in here and clear things up, haha!
 
From my understanding VRR basically allows the display to dictate when to display a frame rather than relying on the source. So it eliminates tearing by simply waiting until a full frame is ready to display regardless of what the source says.
It's the opposite; what you described is basically the old paradigm: "I'm a 60Hz display, I want a frame every 16.67 milliseconds, and if you fail to provide that, it's gonna get ugly."

With VRR the display accepts frames across a range of frametimes and displays them as they come in, so essentially the PC/console is dictating when to display frames. With VRR, if a frame takes 17.5ms instead of the 16.67ms (and thus the previous frame is shown for that period of time), it's a very minute difference and thus feels very smooth. Whereas if you are living, say, a 60Hz vsync kind of life, the lateness of that frame is going to cause the previous frame to be shown for two complete refresh cycles. You can still feel frametime variance with VRR, especially the lower the frame rate actually gets, but it's a lot better than non-VRR.

Think of it this way: 60fps that drops to, say, 57fps on VRR feels exactly like that. 60fps that drops to 57fps on normal vsync feels more like dropping from 60fps to 30fps, because over the course of a couple of frames that's exactly what ends up happening.
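To put rough numbers on that, here's a quick sketch in Python (my own simplified model of the two behaviours, not how any real driver or panel actually schedules scanout):

```python
# Rough sketch of the idea above: for each frame's render time, how long
# the previously shown frame stays on screen under 60 Hz vsync vs. VRR.
import math

REFRESH_MS = 1000 / 60  # one 60 Hz refresh interval, ~16.67 ms

def hold_times(frametimes_ms, vrr):
    """For each render time, how long the prior frame persists on screen."""
    held = []
    for ft in frametimes_ms:
        if vrr:
            # VRR: the panel refreshes the moment the new frame arrives,
            # so the prior frame is held for the actual render time.
            held.append(round(ft, 1))
        else:
            # vsync: swaps only happen on refresh boundaries, so a late
            # frame forces the prior one to persist a whole extra refresh.
            held.append(round(math.ceil(ft / REFRESH_MS) * REFRESH_MS, 1))
    return held

frames = [16.6, 16.6, 17.5, 16.6]        # one slightly late frame
print(hold_times(frames, vrr=True))      # [16.6, 16.6, 17.5, 16.6]
print(hold_times(frames, vrr=False))     # [16.7, 16.7, 33.3, 16.7]
```

Same input, but under vsync the one late frame turns into a full 33ms hitch, which is what you feel as stutter.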
 

Magog.

Banned
See this is where my understanding gets hazy, you can have stuttering even with a steady frame rate correct?

Someone come in here and clear things up, haha!

No, you only notice the stuttering because on a normal TV the refresh rate is stuck at 60Hz, so you're getting duplicate or dropped frames if the input is 58fps. With VRR the display will refresh at 58Hz if that's what the input is sending, so it will still look smooth.
 

Riky

$MSFT
It basically matches your display's refresh rate to your GPU output, which removes tearing and judder. If you get prolonged low framerates then you will still feel some control latency, but since in most console games the frame rate only drops for a few seconds at a time, I've never found that to be an issue.
Destiny 2 is pretty much perfect with VRR.
 

rofif

Can’t Git Gud
It's fantastic when it works.
Minimizes lag
Removes tearing
Greatly helps with frame pacing
On real G-Sync monitors it also controls overdrive to minimize ghosting, but that's not guaranteed.

I use a 4K 40-60Hz FreeSync monitor now and it has made 4K gaming much better. I lock global fps to 58 to always stay in the low-lag VRR range.
 

NinjaBoiX

Member
Thanks for the input guys, so just to see if I've got this right: with VRR, when a frame is delayed, the previous frame is only held for as long as it takes the source to finish producing the new one, rather than until the next time the display expects to refresh?

So for example, if we take the third frame in a given sequence of five as the late one, delayed by 10% of the normal time, it would go:

Non-VRR:

1st frame - 2nd frame - 3rd frame - 3rd frame (again) - 5th frame

VRR:

1st frame - 2nd frame - 3rd frame (displayed for 110% of normal time) - 4th frame (90% of normal time) - 5th frame

Is that right?
 
See this is where my understanding gets hazy, you can have stuttering even with a steady frame rate correct?

Someone come in here and clear things up, haha!
Basically, if you drop to 58fps on a non-VRR display, then in the span of one second you'll have 56 frames that stay on screen for 16.7ms (60fps) and 2 frames that persist for 33.3ms (30fps), which results in a noticeable judder twice a second. On a VRR display, however, each frame will persist for 17.2ms (58fps)... and you'll be hard-pressed to tell the difference between a frame that persists for 16.7ms and one that persists for 17.2ms, even back to back.
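If you want to sanity-check that arithmetic, a couple of lines do it (simplified model, ignoring scanout details):

```python
# Sanity check of the arithmetic above.
refresh_ms = 1000 / 60            # ~16.7 ms per refresh on a 60 Hz panel
# Non-VRR at 58 fps: 56 frames held one refresh, 2 frames held two refreshes.
print(56 * refresh_ms + 2 * (2 * refresh_ms))   # ~1000 ms, i.e. one second
# VRR at 58 fps: every frame persists for the same amount of time.
print(1000 / 58)                                # ~17.2 ms per frame
```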
 
Last edited:

NinjaBoiX

Member
Basically, if you drop to 58fps on a non-VRR display, then in the span of one second you'll have 56 frames that stay on screen for 16.7ms (60fps) and 2 frames that persist for 33.3ms (30fps), which results in a noticeable judder twice a second. On a VRR display, however, each frame will persist for 17.2ms (58fps)... and you'll be hard-pressed to tell the difference between a frame that persists for 16.7ms and one that persists for 17.2ms, even back to back.
This is perfect, thank you!
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.
I understood that VRR helps with frame pacing. Rather than being constrained to the locked frame pacing of a traditional TV, you can get the frame as soon as the console is done rendering it.
 

NinjaBoiX

Member
I understood that VRR helps with frame pacing. Rather than being constrained to the locked frame pacing of a traditional TV, you can get the frame as soon as the console is done rendering it.
It seems like a super clever version of v-sync, the logical extension of that tech.
 

RaySoft

Member
Really simple question, this thread will likely be done and dusted in a few replies but this seems like the best place to ask:

From my understanding VRR basically allows the display to dictate when to display a frame rather than relying on the source. So it eliminates tearing by simply waiting until a full frame is ready to display regardless of what the source says.

Yet I’m watching the next gen Destiny 2 DF video and they seem to be suggesting that VRR will mitigate the minor frame drops on Series X, which has nothing to do with tearing.

So what’s the deal?
VRR syncs the display's refresh rate to the GPU's frame output, thus eliminating frame tearing.
 
Not only does VRR eliminate tearing, even when your system can't keep a consistent framerate, it also greatly reduces input lag compared to vsync.
 

JohnnyFootball

GerAlt-Right. Ciriously.
One thing I see people say about VRR that is false is that it eliminates tearing. Screen tearing is the result of frames being sent faster than the monitor can keep up with. It's a major issue on 60Hz monitors, but much less noticeable at 120Hz. VSync is still often used with G-Sync or FreeSync.
 
One thing I see people say about VRR that is false is that it eliminates tearing. Screen tearing is the result of frames being sent faster than the monitor can keep up with. It's a major issue on 60Hz monitors, but much less noticeable at 120Hz. VSync is still often used with G-Sync or FreeSync.

But it does. You only need to cap your frames below the monitor's maximum refresh rate. I enable G-Sync and force vsync off in the Nvidia Control Panel, so that I never have to worry about turning off vsync in every game, as the control panel setting overrides the in-game setting.
 
They probably mean if it targets 60fps but can only do 58 fps you likely won't notice the difference if you are the one in 1000 people who have a TV that supports VRR.
If it's this close nobody would notice.

Games that run in the mid 30s to mid 40s could be much improved by VRR support... but even without VRR it's still better than locking them at 30fps.
 
But it does. You only need to cap your frames below the monitor's maximum refresh rate. I enable G-Sync and force vsync off in the Nvidia Control Panel, so that I never have to worry about turning off vsync in every game, as the control panel setting overrides the in-game setting.
To use G-Sync properly you're supposed to turn on vsync in the Nvidia Control Panel and off in games.
 

Zeeed

Member
I've been lucky to get a TV that supports VRR/FreeSync and an Nvidia card that has G-Sync.
And I mostly understand what people have said in this thread.
What I don't know is if I need to keep v-sync on or if I can turn it off in the game settings.

In some games I leave v-sync off and I don't get any screen tearing, so I figured VRR is working.

Then in some other game, I need to turn v-sync on or I get tearing even with VRR. Does that mean the FPS fluctuates so much from high to low that VRR can't stop the screen tearing?
 
To use G-Sync properly you're supposed to turn on vsync in the Nvidia Control Panel and off in games.

It's enough to force vsync off in the control panel and never worry about the in-game setting. I always have Afterburner running, or I cap fps in game.

But I guess there are uneducated G-Sync owners who are unaware that they're getting regular vsync lag whenever the game hits max refresh. And if you have a 60Hz G-Sync display and a good system, then in your usage example they will constantly be playing as if they didn't have a G-Sync display at all. Guess they have to thank all the noob sites out there.
 
I've been lucky to get a TV that supports VRR/FreeSync and an Nvidia card that has G-Sync.
And I mostly understand what people have said in this thread.
What I don't know is if I need to keep v-sync on or if I can turn it off in the game settings.

In some games I leave v-sync off and I don't get any screen tearing, so I figured VRR is working.

Then in some other game, I need to turn v-sync on or I get tearing even with VRR. Does that mean the FPS fluctuates so much from high to low that VRR can't stop the screen tearing?
Most TVs have a horrible VRR range: 48-60Hz. [Only the latest HDMI 2.1 models are capable of up to 120Hz VRR.] So, if you have a non-2.1 TV, then every time your fps drops below 48 you get tearing with vsync disabled, or no tearing but terrible stutter and lag with vsync enabled.
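In other words (a deliberately simplified picture; real TVs can behave differently right at the edges of the window):

```python
# Simplified illustration of the VRR window point above.
def vrr_engaged(fps, vrr_min=48, vrr_max=60):
    """True if the framerate falls inside the panel's VRR window."""
    return vrr_min <= fps <= vrr_max

for fps in (58, 50, 45):
    state = "VRR engaged" if vrr_engaged(fps) else "fixed-refresh fallback (tear or stutter)"
    print(fps, "->", state)
```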
 

Zeeed

Member
Most TVs have a horrible VRR range: 48-60Hz. [Only the latest HDMI 2.1 models are capable of up to 120Hz VRR.] So, if you have a non-2.1 TV, then every time your fps drops below 48 you get tearing with vsync disabled, or no tearing but terrible stutter and lag with vsync enabled.
Yes, I made sure my first TV purchase in 6 years would be as future-facing as I could get without upsetting the wife ;)

So I made sure the TV is:
-120Hz (real 120Hz, not the fake one, thank gawd I researched that or I would have been very upset)
-HDMI 2.1
-VRR
-4K
-HDR

Hopefully I can restrain myself for another 6 years ;)

And the game where I had to turn on v-sync to stop tearing was Cyberpunk 2077. Man, it's a freakin' demanding game.
 
Last edited:
Yes, I made sure my first TV purchase in 6 years would be as future-facing as I could get without upsetting the wife ;)

So I made sure the TV is:
-120Hz (real 120Hz, not the fake one, thank gawd I researched that or I would have been very upset)
-HDMI 2.1
-VRR
-4K
-HDR

Hopefully I can restrain myself for another 6 years ;)

And the game where I had to turn on v-sync to stop tearing was Cyberpunk 2077. Man, it's a freakin' demanding game.

It could have 40-120Hz VRR then. [That's the latest Samsung range.] Disable vsync in NCP, cap fps below 120, and never worry about tearing in games, unless frames drop below 40.

You can cap it at 100 or 60 or wherever your framerate usually hovers in the game. The most important thing is to avoid huge fluctuations, as those will be noticeable even with VRR. If you see frames dropping to 60 very frequently, then a cap at 100 would feel less smooth than capping at 70, for example.
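If you wanted to pick a cap from actual performance numbers, a hypothetical little helper like this illustrates the idea (the function name, sample numbers, and percentile choice are all mine, purely for illustration):

```python
# Hypothetical sketch of the capping advice above: pick a cap near where
# the game actually spends its time, not at max refresh, so frametime
# swings stay small under VRR.
def suggest_cap(fps_samples, vrr_min=40, max_refresh=120):
    # Aim near the ~10th percentile so the game can hold the cap most of the time.
    low_typical = sorted(fps_samples)[len(fps_samples) // 10]
    # Stay a few fps under max refresh so VRR (not vsync) handles the top end.
    return max(vrr_min, min(low_typical, max_refresh - 3))

samples = [72, 95, 88, 101, 64, 90, 97, 70, 85, 92]   # hypothetical benchmark run
print(suggest_cap(samples))   # 70 -> cap near where the dips land, per the advice above
```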
 
D

Deleted member 17706

Unconfirmed Member
One thing I see people say about VRR that is false is that it eliminates tearing. Screen tearing is the result of frames being sent faster than the monitor can keep up with. It's a major issue on 60Hz monitors, but much less noticeable at 120Hz. VSync is still often used with G-Sync or FreeSync.

I don't get any tearing on my 60Hz 4K monitor with FreeSync when it's working.
 

Zeeed

Member
It could have 40-120Hz VRR then. [That's the latest Samsung range.] Disable vsync in NCP, cap fps below 120, and never worry about tearing in games, unless frames drop below 40.

You can cap it at 100 or 60 or wherever your framerate usually hovers in the game. The most important thing is to avoid huge fluctuations, as those will be noticeable even with VRR. If you see frames dropping to 60 very frequently, then a cap at 100 would feel less smooth than capping at 70, for example.
Thanks! I'll give the NCP thing a try.
 
VRR makes a huge difference because the screen is updating at the exact rate the PC (or device) is rendering the frames. This does affect the feeling of framerate smoothness even when the framerate is inconsistent.

This is because you're not seeing some frames two or three times depending on the refresh rate of the monitor without even going into tearing.

I've recently upgraded my PC to a 3090 and my TV is an LG CX running at 120Hz (although with VRR it's running at the refresh rate of the game). I've been playing Cyberpunk with all settings maxed and DLSS on balanced mode, and I'm getting roughly 45-49 fps most of the time. There are areas where the game dips to 30, and I think my minimum is 29 fps, but that's very rare. I can notice and feel it slow down to 30 very clearly, but to my eye 45 frames and above feels sufficient. Of course, it's much better at 60 frames; if I switch to performance mode I can get closer to 60 (it's more like 54-63), but I prefer the slight clarity benefit of DLSS balanced.

In the past, playing games on non-VRR displays where the frame rate fluctuates up and down the way Cyberpunk's does would have made the game unplayable for me, as the frame pacing fluctuations give me motion sickness. So I would usually try to lower the settings and lock the frame rate to 30 rather than allow the fluctuations. I can honestly say that VRR has completely eliminated my motion sickness caused by fluctuations. I can still notice the frame rate change, but the fluctuations don't feel so bad anymore.
 
Last edited:

Rentahamster

Rodent Whores
Yet I’m watching the next gen Destiny 2 DF video and they seem to be suggesting that VRR will mitigate the minor frame drops on Series X, which has nothing to do with tearing.
Frame drops cause a normally in-sync frame rate to stop matching the display's refresh rate. What VRR mitigates is the perceived choppiness for the end user. It looks and feels smoother.
 

Lister

Banned
Really simple question, this thread will likely be done and dusted in a few replies but this seems like the best place to ask:

From my understanding VRR basically allows the display to dictate when to display a frame rather than relying on the source. So it eliminates tearing by simply waiting until a full frame is ready to display regardless of what the source says.

Yet I’m watching the next gen Destiny 2 DF video and they seem to be suggesting that VRR will mitigate the minor frame drops on Series X, which has nothing to do with tearing.

So what’s the deal?

It's actually the opposite.

Normally the display scans the source every x milliseconds, regardless of whether the source has a frame ready or not. So it always just scans whatever the source has available in its buffer. Without any sort of vsync, what happens is that sometimes the source will change the contents of its buffer WHILE the display is scanning. This will cause a tear if the source's data is changing (i.e. the camera or objects in the scene are moving).

With Vsync the display and the source are synced together. The source essentially promises never to write to the buffer while the display is scanning.

The problem here is that if the source is slow to deliver a frame, the display may scan the same frame more than once, creating what we call judder. Movement of the camera and objects on screen appears to warp around, and motion doesn't look smooth but stuttery. The overall increase in latency and motion judder means it's very noticeable when the source isn't matching the display.

With VRR such as G-Sync, the display doesn't just scan the source at its own pace. Instead it waits for the source to tell it that there's a new frame ready.

This does two things. One, it removes screen tearing, since the display and source are always in sync. And two, judder and latency overhead are removed as well. This means that it's nowhere near as noticeable when the frame rate drops as it was before. You probably won't notice unless the drop is significant or the framerate is fluctuating wildly.

One thing to keep in mind though is that not all VRR is equal. G-Sync has some very specific requirements so that the display is able to:

1. Allow for a longer time before it needs to be refreshed (image retention).
2. Allow for a wide range of supported framerates.

Not all displays will support the same range of frame rates and reduce blurring while running VRR.

EDIT: man, beat by like 20 people.

On my G-Sync monitor I can do things that would not feel good on my non-VRR panel, like locking the frame rate of a demanding game to 45. This isn't as good as 60, but with VRR it still feels fantastically better than 30, with zero judder and lower latency.
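For anyone who likes seeing the model in code, here's a toy version of that scan-out description above (my own simplification, not real display or driver behaviour):

```python
# Toy model of the scan-out description above: without any sync, a tear
# shows up whenever the source swaps its buffer while the panel is mid-scan.
def count_tears(swap_times_ms, refresh_ms=1000 / 60, scan_fraction=0.9):
    """Count buffer swaps that land inside the active scan-out window."""
    tears = 0
    for t in swap_times_ms:
        phase = t % refresh_ms
        # Assume the panel spends ~90% of each refresh actively scanning.
        if phase < refresh_ms * scan_fraction:
            tears += 1   # swap happened mid-scan -> visible tear
    return tears

# An unsynced source finishing a frame every 14.2 ms for about one second:
swaps = [i * 14.2 for i in range(1, 71)]
print(count_tears(swaps))   # most swaps land mid-scan, hence near-constant tearing
# With vsync or VRR the swap waits for (or triggers) the refresh,
# so the equivalent count would be zero.
```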
 
Last edited:

NinjaBoiX

Member
Great stuff guys, question well and truly answered!

I’ll shop very carefully when I get an updated TV in a couple of years, by which point all this tech will be commonplace.
 
It's enough to force vsync off in the control panel and never worry about the in-game setting. I always have Afterburner running, or I cap fps in game.

But I guess there are uneducated G-Sync owners who are unaware that they're getting regular vsync lag whenever the game hits max refresh. And if you have a 60Hz G-Sync display and a good system, then in your usage example they will constantly be playing as if they didn't have a G-Sync display at all. Guess they have to thank all the noob sites out there.
Nvidia themselves say to turn on vsync in their panel. Tbh I have a 144Hz monitor but rarely game at that fps; it's usually capped.
 
Nvidia themselves say to turn on vsync in their panel. Tbh I have a 144Hz monitor but rarely game at that fps; it's usually capped.
VSync should be enabled, really; there are edge cases where tearing can happen with G-Sync/FreeSync even well below max refresh, and VSync fixes that... then you should cap your framerate in RTSS to ensure that you never hit the VSync limit and introduce the additional latency caused by the extra buffers. Best of both worlds.
 

Fredrik

Member
Without VRR/gsync/freesync you either get tearing or stutter when you don’t quite reach the targeted framerate.

With VRR/gsync/freesync you get no tearing and no noticeable stutter when you don’t quite reach the targeted framerate.
 

Panajev2001a

GAF's Pleasant Genius
See this is where my understanding gets hazy, you can have stuttering even with a steady frame rate correct?

Someone come in here and clear things up, haha!
A game running at 48 FPS with VRR will feel slower than the same game running at 60 FPS on a different machine.

What VRR does is change the refresh rate of the TV to match the rate the console/GPU is rendering at, thus removing both stuttering (think VSync with framerate halving for a split second or so) and screen tearing.

The idea would be to have adaptive VSync and let VRR cover the framerate fluctuations, but ensure you do not go above the maximum refresh rate of the TV, or it will essentially go back to having those problems.

Due to the way they are made, OLED TVs may be susceptible to contrast and brightness fluctuations if you enable VRR (LG stated it plainly), so YMMV. For now I have VRR disabled on my XSX and PS5 (LG C9 here). See:

Update: according to the video I linked, you can see similar problems of gamma shifting while VRR mode is engaged even on LED LCDs, and he measured it on Samsung panels.
 
Last edited:
A game running at 48 FPS with VRR will feel slower than the same game running at 60 FPS on a different machine.
This is true... however, 40+ FPS is a LOT closer in feel to 60 than it is to 30. It's diminishing returns: 40 is closer in feel to 60 than 30, 70 is closer in feel to 90 than 60, 100 is closer in feel to 120 than 90, and so on.
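One way to see the diminishing returns is to look at frametimes instead of framerates:

```python
# Diminishing returns in frametime terms: equal fps steps buy
# progressively smaller reductions in time-per-frame.
for fps in (30, 60, 90, 120):
    print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
# 30 -> 33.3 ms, 60 -> 16.7 ms, 90 -> 11.1 ms, 120 -> 8.3 ms:
# each extra 30 fps shaves off less frametime than the step before it.
```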
 

Shmunter

Member
VRR is a wheelchair for poor performance.

Jokes aside, according to detailed TV guys like HDTVTest or Stop the fomo, VRR is broken to high heaven on all existing TVs.

Excessive inverse ghosting, poor black levels. You name it, not sure how anyone can possibly be enjoying it atm.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Upgrading from C7 to CX was a bigger deal than upgrading from 2080Ti to 3080.

It's amazing.
I am more sensitive to brightness changes than to very minor framerate or tearing issues, and I am a bit afraid of damaging the panel or accelerating wear and tear; see the video linked before.
 
Good for you hehe. Still, you will get varying gamma, which becomes more of an issue the better your TV is calibrated and the better the TV gets at representing true black.
It's good enough for me. I'm not some egghead with a calibration device staring at the settings for hours. The picture looks great, colors and everything.

Very happy with my CX.
 

NinjaBoiX

Member
VRR is a wheelchair for poor performance.

Jokes aside, according to detailed TV guys like HDTVTest or Stop the fomo, VRR is broken to high heaven on all existing TVs.

Excessive inverse ghosting, poor black levels. You name it, not sure how anyone can possibly be enjoying it atm.
Yeah, I feel like the tech is so new that it's still in the "teething problems" stage. I'm happy to wait a year or two until all the manufacturers are comfortable with it and have ironed out any kinks.

By that point the PS5’s library will be blossoming, so I’ll splurge on a new TV and a PS5 and be blown away by the improvement over my current PS4 Pro/1080p combo.
 

kraspkibble

Permabanned.
VRR is a game changer. I can never go back to a display stuck at a single refresh rate. I have a 165Hz monitor for my PC and an LG CX for consoles.

With VRR you don't need to worry about your framerate. If you have a game that can only manage 50fps then it's no problem. On the LG CX the VRR window is 40-120Hz and on my monitor it's 40-165Hz. If a game bounces between 40-90fps then it feels good, whereas on a non-VRR screen it'd be better to rip my eyeballs out.
 

Md Ray

Member
Riky
Serious question. I'm considering a decent VRR display (preferably a TV) for my PC. Do you mind if I ask what model TV you've got, and what's the VRR range on it? Is it 40-120Hz? I have no clue about the VRR range on TVs.
 
Last edited: