Really simple question, this thread will likely be done and dusted in a few replies but this seems like the best place to ask:
From my understanding, VRR basically allows the display to dictate when to display a frame rather than relying on the source. So it eliminates tearing by simply waiting until a full frame is ready to display, regardless of what the source says.
Yet I’m watching the next gen Destiny 2 DF video and they seem to be suggesting that VRR will mitigate the minor frame drops on Series X, which has nothing to do with tearing.
So what’s the deal?
It's actually the opposite.
Normally the display scans the source every x milliseconds, regardless of whether the source has a frame ready or not. So it always just scans whatever the source has available in its buffer. Without any sort of vsync, what happens is that sometimes the source will change the contents of its buffer WHILE the display is scanning. This will cause a tear if the source's data is changing (i.e. the camera or objects in the scene are moving).
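If it helps, here's a toy Python sketch of that single-buffer race (the names and numbers are made up for illustration, it's not real graphics code):

```python
# Toy model of tearing: one buffer being read and written at the same time.

buffer = {"frame_id": 0}  # the single buffer the display scans from

def source_render(frame_id):
    # without vsync, the source overwrites the buffer whenever it finishes
    # a frame, even if the display is mid-scan
    buffer["frame_id"] = frame_id

def display_scanout():
    # the display reads top to bottom; if the source writes during the
    # read, the two halves of the screen come from different frames
    top_half = buffer["frame_id"]
    source_render(top_half + 1)  # source finishes a new frame mid-scan
    bottom_half = buffer["frame_id"]
    if top_half != bottom_half:
        print(f"tear: top is frame {top_half}, bottom is frame {bottom_half}")

display_scanout()  # -> tear: top is frame 0, bottom is frame 1
```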
With Vsync the display and the source are synced together. The source essentially promises never to write to the buffer while the display is scanning.
The problem here is that if the source is slow to deliver a frame, the display may scan the same frame more than once, creating what we call judder. Movement of the camera and objects on screen appears to warp around the screen, and motion doesn't look smooth but stuttery. The overall increase in latency and motion judder means it's very noticeable when the source isn't matching the display.
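Same toy model with vsync added, just to show where the repeated frame comes from (the frame times are made up):

```python
# Vsync in the toy model: the display still scans on a fixed ~60 Hz clock,
# but only ever shows frames that were fully finished before the scan
# started. A slow frame (40 ms here) forces a repeat: that's the judder.

REFRESH_MS = 16.7
frame_ready_times = [0, 15, 40, 55]  # ms at which the source finishes frames

shown = []
last = 0
for scan in range(4):
    scan_time = scan * REFRESH_MS
    # pick the newest frame completed before this scan started; if nothing
    # new arrived in time, the display has to repeat the last frame
    done = [i for i, t in enumerate(frame_ready_times) if t <= scan_time]
    last = done[-1] if done else last
    shown.append(last)

print(shown)  # [0, 1, 1, 2] -> frame 1 shown twice: the repeat is the judder
```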
With VRR such as GSYNC, the display doesn't just scan the source at its own pace. Instead, it waits for the source to tell it that there's a new frame ready.
This does two things. One, it removes screen tearing, since the display and source are always in sync. And two, judder and latency overhead are removed as well. This means it's nowhere near as noticeable when the frame rate drops as it was before. You probably won't notice unless the drop is significant, or the frame rate is fluctuating wildly.
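And the VRR version of the same made-up model, where the display just follows the source:

```python
# VRR in the toy model: no fixed clock. The display refreshes when the
# source signals a frame is ready, so every frame is shown exactly once,
# with no tear and no forced repeats.

frame_ready_times = [0, 15, 40, 55]  # the same uneven frame times as before

for frame_id, t in enumerate(frame_ready_times):
    # the refresh happens the moment the frame is done (as long as the
    # interval stays inside the panel's supported VRR range)
    print(f"{t:5.1f} ms: refresh with frame {frame_id}")
```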
One thing to keep in mind though is that not all VRR is equal. GSYNC has some very specific requirements so that the display is able to:
1. Allow for a longer time before it needs to be refreshed (image retention).
2. Allow for a wide range of supported frame rates.
Not all displays will support the same range of frame rates, or reduce motion blur equally well, while running VRR.
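Rough sketch of what a limited range means in practice (the 40-120 Hz window below is just an example I made up, not any specific panel's spec):

```python
# Outside its supported VRR window a panel can't track the source one-to-one,
# so it has to refresh on its own and repeat frames, and judder can return.

VRR_MIN_HZ, VRR_MAX_HZ = 40, 120  # example range only, varies per display

def panel_tracks_source(source_fps):
    # true only when the panel's refresh can follow the source exactly
    return VRR_MIN_HZ <= source_fps <= VRR_MAX_HZ

for fps in (30, 45, 90):
    print(f"{fps} fps inside VRR range: {panel_tracks_source(fps)}")
```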
EDIT: man, beat by like 20 people.
On my GSYNC monitor I can do things that would not feel good on my non-VRR panel, like locking the frame rate of a demanding game to 45. This isn't as good as 60, but with VRR it still feels fantastically better than 30, with zero judder and lower latency.
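The quick frame-time math behind that, in case anyone wants it:

```python
# With VRR a 45 fps cap gives every frame an even ~22.2 ms, sitting between
# 60 fps (16.7 ms) and 30 fps (33.3 ms). On a fixed 60 Hz vsync'd display,
# a 45 fps average instead comes out as an uneven mix of 16.7 and 33.3 ms
# frames, which is why it feels so much worse there.

for fps in (60, 45, 30):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
```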