One day I'll learn what this is and why everyone cares about it.
... but not today!
VRR = Variable Refresh Rate.
displays have a refresh rate (60, 120, and 144hz are the most common at the moment), which is not the same thing as frame rate, but the two are related. say you have a 60hz TV: for a 1:1 ratio you'd want content at 60fps, so that every time the TV refreshes there's a new frame ready at that exact moment. if the framerate is too high or too low relative to the refresh rate, you're either wasting processing power or giving yourself issues like judder, stutter, input lag (when it's lower), or screen tearing. for example, say you're watching a movie shot at ~24fps on a 60hz display: because 24 doesn't divide evenly into 60, some frames have to be held on screen longer than others, which causes a stuttering effect.
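to see that mismatch in actual numbers, here's a tiny python sketch. the alternating cadence it prints is the classic "3:2 pulldown" used to fit film onto 60hz displays:

```python
# sketch: why 24fps content judders on a 60hz display.
# 60 / 24 = 2.5 refreshes per frame -- not a whole number, so the
# display ends up holding frames for alternating 3 and 2 refreshes
refresh_hz = 60
content_fps = 24
print(refresh_hz / content_fps)  # 2.5 refreshes per frame on average

holds = []
shown = 0
for frame in range(8):
    # frame n ideally disappears at refresh (n + 1) * 2.5, but the
    # display can only swap images on whole refreshes, so truncate
    end = int((frame + 1) * refresh_hz / content_fps)
    holds.append(end - shown)
    shown = end
print(holds)  # [2, 3, 2, 3, 2, 3, 2, 3] -- uneven hold times = judder
```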
anyway, VRR is aimed at games, because games target common framerates like 30, 60, or 120fps but they don't always hold them. games that target 30fps might drop to 28, 25, or 23fps; games that target 60fps might drop to 55, 50, or 45fps; games that target 120fps might drop to 110, 90, or 80fps. if vsync is enabled, you'll experience input lag and stuttering as frames are held over multiple refresh cycles. if vsync isn't enabled and the framerate doesn't match the refresh rate, you'll experience screen tearing: if your display is 60hz and a game runs at 40 or 70fps, a refresh can catch the GPU partway through drawing, so you see parts of two frames on screen at once (that's the tear).
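here's a rough sketch of that vsync stutter. the numbers are a made-up example (a 45fps game on a 60hz display); measuring time in 1/180-second ticks keeps the arithmetic exact:

```python
import math

# sketch: why vsync stutters when fps doesn't divide the refresh rate.
# a 60hz display refreshes every 1/60s and a 45fps game finishes a frame
# every 1/45s. measure time in 1/180s "ticks" so everything is an integer.
refresh_ticks = 180 // 60   # 3 ticks between refreshes
frame_ticks = 180 // 45     # 4 ticks per rendered frame

starts = []
for n in range(8):
    ready = n * frame_ticks
    # vsync: a finished frame can only appear on the next refresh boundary
    starts.append(math.ceil(ready / refresh_ticks))

holds = [b - a for a, b in zip(starts, starts[1:])]
print(holds)  # [2, 1, 1, 2, 1, 1, 2] -- some frames shown twice as long = stutter
```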
VRR eliminates the input lag, stuttering, and screen tearing that games suffer as a result of vsync or mismatched refresh rates: the display adjusts its refresh rate to match exactly the fps the game is producing. example: i have a 144hz monitor. if i can run a game at 144fps then there's a 1:1 ratio, but it's not realistic to expect every game to hold 144fps consistently; it will sometimes drop to 110, 100, 90, or 80fps. so when a game is struggling and can only manage 103fps, the display drops from 144hz to 103hz to keep the 1:1 ratio. if the game drops to 68fps, the display goes to 68hz.
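the same per-frame arithmetic shows why this is the fix: whatever the fps lands on, every frame is held exactly once (numbers from the example above):

```python
# sketch: with VRR the display refreshes when a frame arrives, so every
# frame is held for exactly one frame time -- pacing stays perfectly even
for fps in (144, 103, 68):
    hold_ms = 1000 / fps   # display runs at `fps` hz: one refresh per frame
    print(f"{fps}fps on a {fps}hz display: every frame held {hold_ms:.1f}ms")
```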
some displays have a VRR window with a lower limit, usually 20, 40, or 48hz. what that means is if the framerate goes below that, the display can't match it directly anymore, but there's a feature called LFC (low framerate compensation) which repeats each frame (usually doubling it, sometimes more) to bring the refresh rate back inside the window. so if the framerate is 10fps the display runs at 20hz, showing each frame twice; if the framerate is 42fps then the TV goes to 84hz.
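as a sketch, LFC boils down to "multiply the framerate until it's back inside the window." the 48hz lower bound here is an assumption (real windows vary by display), and note that with a bound that high, very low framerates get repeated more than twice:

```python
VRR_MIN, VRR_MAX = 48, 144   # assumed VRR window; varies by display

def display_hz(fps):
    """refresh rate the display runs at for a given game framerate."""
    if fps >= VRR_MIN:
        return fps           # inside the window: match 1:1
    # below the window: repeat each frame 2x, 3x, ... until back in range
    repeats = 2
    while fps * repeats < VRR_MIN:
        repeats += 1
    return fps * repeats

print(display_hz(103))  # 103 -- inside the window, matched directly
print(display_hz(42))   # 84  -- each frame shown twice
print(display_hz(10))   # 50  -- each frame shown five times with this window
```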
VRR works by having the GPU and the monitor sync with each other (hence G-Sync, FreeSync). in practice this happens frame by frame, not once a second: the display holds its current image until the GPU delivers the next frame. but roughly speaking, your GPU says "here are 51 frames for ya" and the monitor runs at 51hz. the next second the GPU delivers 48 frames, so the monitor runs at 48hz. the next second it delivers 60 frames, so the monitor goes to 60hz.
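per frame, that handshake looks something like this (the frame times are made up for illustration):

```python
# sketch: the monitor holds its current image until the GPU delivers the
# next frame, so the effective refresh rate just follows the frame times
frame_times_ms = [16.7, 19.6, 21.3, 16.7, 14.5]   # a game wobbling around 60fps

for t in frame_times_ms:
    hz = 1000 / t   # the instantaneous refresh rate the display runs at
    print(f"frame took {t}ms -> display refreshes at {hz:.0f}hz")
```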