
G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

Water

Member
The demonstrations at the Montreal event showed this off as well. It's difficult to tell whether this will outperform "Lightboost" in eliminating motion blur, but it is similar to it.
What AndyBNV said:
"We have a superior, low-persistence mode that should outperform that unofficial implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date."

The implementation in question is a hack to activate Lightboost without being in the stereoscopic 3D mode for which it was originally intended. So all he's said so far is that you can now officially activate Lightboost at will, not that you can activate it and GSync at the same time. I'm not even sure if the part about it "being available on every GSync monitor" is correct, as that would mean every upcoming GSync display must be a Lightboost display (be equipped with a strobing backlight), including the 1440p and 4k displays that aren't going to have TN-style refresh rates.

Carmack tweet referenced here says Lightboost and GSync are mutually exclusive at this time:
http://www.blurbusters.com/confirme...es-a-strobe-backlight-upgrade/comment-page-1/

I certainly hope Nvidia is figuring out how to extend the use of backlight strobing (Lightboost) to lower refresh rates than 100Hz, but from what I understand about the tech, it just isn't going to do much good at 30Hz. It might do some good at 60Hz or 75Hz, though, and I'd love to see what can be accomplished with a strobing backlight on a 1440p IPS display.
 

mdrejhon

Member
So does someone want to explain to us plebes exactly what this is? Is it basically a monitor that does Vsync on its own without the card having to take the hit?
There are a lot of explanations already.
However, I'll explain it from the perspective of human vision behavior:

It simply allows the computer monitor to refresh at a variable refresh rate.
Your monitor refresh rate changes on the fly to exactly match the framerate.
This eliminates the erratic stutters of variable-framerate gaming.

Here are some good diagrams that help explain how nVidia G-Sync eliminates erratic stutters.
As you track your eyes on moving objects, your eyes are always moving, while the frames on the monitor are static.
So the positions of objects in frames aren't always in sync with your eye position.

Without G-Sync:
[image: fps-vs-hz-1024x576.png]


With G-Sync:
[image: fps-vs-hz-gsync.png]


Your eyes are continuously tracking moving objects on the screen (regardless of whether there are stutters or not).
The rendered game world is now finally fully in sync with real-world time, not hampered by fixed refresh intervals.
Erratic framerates cause erratic stutters on fixed-refresh-rate displays, creating jerky motion.
G-Sync allows what's rendered on the GPU to be displayed on the screen in sync with your eye tracking.
So variable framerates stop looking jerky.

You do get a minor side effect: more motion blur at lower framerates, less motion blur at higher framerates. Motion blur becomes inversely proportional to frame rate (i.e. proportional to frame time).

At lower framerates, you may see edge strobing, like the 30fps object at www.testufo.com -- 30fps@30Hz will still look the same as 30fps@60Hz on an LCD. Conceptually, one way of viewing G-Sync is that it makes 40fps run at 40Hz, 50fps at 50Hz, 75fps at 75Hz, 97fps at 97Hz. That's smoother than 40fps@60Hz or 50fps@60Hz. The monitor's refresh rate dynamically changes on the fly to match the frame rate, so all framerates look smoother because it's always running at framerate = Hz. No more erratic stutters.

At very high framerates (e.g. framerates fluctuating from 75fps to 144fps), you won't see edge strobing/edge flicker effects -- just a continually varying amount of motion blur as the framerate goes up and down. Higher refresh rate, less motion blur. Motion will no longer look randomly herky-jerky (as long as the game engine works properly with G-Sync).

It's pretty neat when motion always looks like it's sync'd to the refresh rate, no matter what the framerate is.
It means you never see random erratic stutters (e.g. the type of stutter that "47fps@60Hz" or "78fps@120Hz" produces)
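If it helps, here's a rough Python sketch of the timing math (hypothetical frame times, purely illustrative, not anything from nVidia): even a perfectly steady 50fps feed gets shown at uneven intervals on a fixed 60Hz monitor, while a variable-refresh monitor shows every frame the moment it's finished.

# When does each rendered frame actually appear on screen?
render_ms = [20.0] * 8                      # a steady 50fps game (hypothetical)

def fixed_refresh(render_ms, hz=60):
    slot = 1000.0 / hz
    t, shown = 0.0, []
    for r in render_ms:
        t += r                              # frame finishes rendering at time t...
        shown.append((int(t // slot) + 1) * slot)   # ...then waits for the next refresh slot
    return shown

def variable_refresh(render_ms):
    t, shown = 0.0, []
    for r in render_ms:
        t += r
        shown.append(t)                     # displayed the moment it's finished
    return shown

def gaps(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

print("fixed 60Hz gaps:", gaps(fixed_refresh(render_ms)))       # 16.7, 16.7, 16.7, 16.7, 33.3, 16.7, 16.7 (erratic)
print("variable refresh gaps:", gaps(variable_refresh(render_ms)))  # 20.0 every time (smooth)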
 

mdrejhon

Member
The demonstrations at the Montreal event showed this off as well. It's difficult to tell whether this will outperform "Lightboost" in eliminating motion blur, but it is similar to it.
LightBoost can be improved on in many ways:
-- Better colors. Avoid the LightBoost gamma bleaching effect
-- Easier to enable, such as simple checkbox in nVidia software
-- Configurable & Shorter strobe flash lengths. Hopefully adjustable to as small as 0.25ms or 0.5ms
-- Ability to run at all refresh rates, including 60Hz and 144Hz

I am hoping it includes shorter strobe flash lengths. It may be very dim without a more powerful backlight/backlight boost, but it will outperform a lot of CRT's.
For strobe backlight displays, motion blur is directly proportional to persistence (strobe flash length).

LCD motion clarity potential is now completely unbounded: It's only limited by how brief you can flash a backlight.
LCD motion blur is no longer limited by GtG, because GtG is kept in total darkness (unseen by the eye) and the backlight is strobed on fully refreshed frames. GtG transitions (1-2ms) last far less than the duration of a refresh (8.3ms @ 120Hz), which is what made LightBoost possible.
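To put rough numbers on "motion blur is proportional to persistence" (my own illustrative figures, not measurements):

# Eye-tracking motion blur is roughly (panning speed) x (persistence).
def blur_px(speed_px_per_sec, persistence_ms):
    return speed_px_per_sec * persistence_ms / 1000.0

speed = 1000  # pixels/second panning, e.g. a fast scrolling scene
print(blur_px(speed, 8.3))   # ~8.3 px smear: 120Hz sample-and-hold (frame lit for the whole refresh)
print(blur_px(speed, 1.4))   # ~1.4 px: a LightBoost-style strobe flash (approximate length)
print(blur_px(speed, 0.5))   # ~0.5 px: the shorter 0.5ms strobe hoped for above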
 

TheExodu5

Banned
Have we had an explanation yet of how well this will work if we're rendering faster than the monitor can display? There are a few options on the hardware side, as far as I can tell:

1) Ignore the frame that was rendered too quickly and simply display the next one. This will introduce a bit of lag and also lower the effective framerate (e.g. if you're rendering at 145Hz and your monitor only does 144Hz, you'd be displaying only every second frame, with the exception of displaying two consecutive frames every 144 refreshes...cutting your framerate down to about 77Hz). This, to me, is the worst case scenario, and not really acceptable.

2) Implement an overflow buffer, whereby every frame that is rendered too fast is put into a queue and displayed as soon as the monitor is ready. You'd still need to drop frames, but you wouldn't lower the effective framerate. This would introduce a little bit of input lag, though, as you wouldn't be displaying the frames as soon as they're ready. With a fast monitor, this input lag might be considered pretty negligible.

3) The GPU could downclock itself in real-time to ensure it never renders faster than the refresh rate of the monitor. This might be limiting though if it's in the game's interest to render faster than that (say, if the netcode of the game depends on you having a faster framerate, for example). I'm also not sure how well this works in practice, and if it can introduce any performance issues.

Any solutions I'm missing?
 

Easy_D

never left the stone age
Have we had an explanation yet of how well this will work if we're rendering faster than the monitor can display? There are a few options on the hardware side, as far as I can tell:

1) Ignore the frame that was rendered too quickly and simply display the next one. This will introduce a bit of lag and also lower the effective framerate (e.g. if you're rendering at 145Hz and your monitor only does 144Hz, you'd be displaying only every second frame, with the exception of displaying two consecutive frames every 144 refreshes...cutting your framerate down to about 77Hz). This, to me, is the worst case scenario, and not really acceptable.

2) Implement an overflow buffer, whereby every frame that is rendered too fast is put into a queue and displayed as soon as the monitor is ready. You'd still need to drop frames, but you wouldn't lower the effective framerate. This would introduce a little bit of input lag, though, as you wouldn't be displaying the frames as soon as they're ready. With a fast monitor, this input lag might be considered pretty negligible.

3) The GPU could downclock itself in real-time to ensure it never renders faster than the refresh rate of the monitor. This might be limiting though if it's in the game's interest to render faster than that (say, if the netcode of the game depends on you having a faster framerate, for example). I'm also not sure how well this works in practice, and if it can introduce any performance issues.

Any solutions I'm missing?

Wouldn't a software solution like a frame limiter solve those without introducing any lag? I mean, just force the game to spit out a maximum 120 frames. No need to throw away or stall any frames that didn't exist to begin with.

Edit: Unless that comes with its own set of problems?
 

TheExodu5

Banned
Wouldn't a software solution like a frame limiter solve those without introducing any lag? I mean, just force the game to spit out a maximum 120 frames. No need to throw away or stall any frames that didn't exist to begin with.

Edit: Unless that comes with its own set of problems?

Not all games will allow you to limit the framerate. A hardware or driver-level solution is definitely preferable since it would work for anything.
 

TheExodu5

Banned
Nvidia's drivers have a driver-level frame limiting option. Is that not good enough?

Could be, actually. I haven't tried it much, so I'm not sure if it has any major downsides. And again, if you set it to 144Hz for example, is there still a chance of a frame rendering too fast? It depends on whether it just limits the average framerate or actually tightly controls frame latency. If frame latency ever drops below 1/144 of a second, a frame will have been rendered faster than the monitor can accept it.
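For what it's worth, here's a rough sketch of that distinction (not nVidia's actual limiter, just the general idea): a limiter that caps per-frame latency sleeps before presenting whenever the previous present was less than 1/144 of a second ago, so no frame is ever handed to the monitor too early, as opposed to merely keeping the average near 144fps.

import time

def limited_loop(render_frame, max_hz=144, num_frames=1000):
    """Latency-capping frame limiter sketch (render_frame is a placeholder)."""
    min_frame_time = 1.0 / max_hz
    last_present = time.perf_counter()
    for _ in range(num_frames):
        render_frame()                        # the game's rendering work
        wait = min_frame_time - (time.perf_counter() - last_present)
        if wait > 0:
            time.sleep(wait)                  # stall: no two presents closer than 1/144s
        last_present = time.perf_counter()    # present the frame to the display here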
 

mdrejhon

Member
There will still be arbitrary times between frames but the differences will be measured in single digit numbers of milliseconds
If you can predict accurate frame rendering times at the beginning of rendering the frame (so object positions are correct at the moment the frame begins to be delivered to the monitor), you could get the positioning error far below 1ms for a very good game engine, in the situation of fluctuating framerates completely within the G-SYNC range (e.g. 30fps through 144fps), where the frame never goes beyond G-SYNC's minimum or maximum frame intervals.
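A tiny sketch of what that kind of prediction could look like (my own illustration, not how any particular engine does it; game_state.advance and measure_ms are hypothetical placeholders): estimate the upcoming frame's render time from recent history, and simulate object positions for the moment the frame will actually reach the monitor.

from collections import deque

history = deque(maxlen=8)                     # last few measured frame times, in ms

def predicted_frame_time_ms(default_ms=16.7):
    # Guess the next frame's render time from a short moving average.
    return sum(history) / len(history) if history else default_ms

def simulate_and_render(game_state, render, measure_ms):
    dt = predicted_frame_time_ms()            # expected time until this frame is displayed
    game_state.advance(dt / 1000.0)           # place objects where they will be at display time
    actual = measure_ms(render, game_state)   # render, measuring how long it really took
    history.append(actual)                    # refine the prediction for the next frame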
 

xJavonta

Banned
I don't know if it's my age or what, but I almost throw up at the way some keyboards, monitors, and mice are styled. In my late teens and early 20s I would have been all over that crap.
20 years old, there's one blue LED on a fan in my case and it drives me fucking nuts
 

TheExodu5

Banned
Isn't this great news for emulation?

Also, will this work with any videocard?

It could be huge for emulation, as video/audio syncing is a very difficult thing to do (you either have to live with video frame drops or audio clicks). Though, thinking about emulators like BSNES, I'm not sure if some of them even bypass the Desktop Window Manager at this point...which might still pose an issue. I'm guessing games running in windowed mode will not be able to take advantage of this.
 

slapnuts

Junior Member
Great improvement for monitors.

Next up is no motion blur and no ghosting without having to use lightboost, right?
right? ;-;

Screw it... while we're at it, why not add some kind of chip to offload AA duties from the GPU so that the monitor handles such tasks? Wishful thinking, I know, but you never know.
 

Durante

Member
Any solutions I'm missing?
By far the simplest and most effective way would appear to be falling back to V-sync behavior at 144 FPS. (That is, below 144 Hz flip/present calls in the graphics API invoke G-sync and return immediately, while at >144 Hz they stall until 6.9 ms frame time is reached)

I'm guessing games running in windowed mode will not be able to take advantage of this.
Yes, that's very likely. It's one small problem I see with the tech, since I really like the utility of playing with borderless fullscreen windowed mode. But it's a very small price to pay.
 

Tarin02543

Member
Will this be a viable alternative for Framemeister scalers?

I'm really interested in retro gaming but I cannot afford the expensive scalers.
 

Arulan

Member
By far the simplest and most effective way would appear to be falling back to V-sync behavior at 144 FPS. (That is, below 144 Hz flip/present calls in the graphics API invoke G-sync and return immediately, while at >144 Hz they stall until 6.9 ms frame time is reached)

Isn't GPU stalling one of the major contributors, among other factors, to input lag with traditional V-sync?
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
I think looking at G-Sync/V-Sync as a this-or-that kinda thing is a bit misguided. I don't see it as an alternative so much as a successor. Supposedly if frames drop below the G-Sync limitations (eg: 30Hz) you'll get typical V-Sync images and thus lag/stuttering. I would assume the peak is similarly V-Sync locked, in that if you can theoretically eclipse 144Hz/fps your GPU simply won't. I would think that much like how V-Sync locks you to the hertz rate of the monitor, so too will G-Sync. Only in the case of G-Sync, the hertz fluctuate with the GPU.

In practice G-Sync has an awful lot in common with V-Sync. V-Sync has always had to fight with your monitor's hertz. Now that issue is removed, V-Sync can do its business properly. But it's not really "vertical synchronization" now, it's GPU synchronisation. Same theory, different application.
 
I think looking at G-Sync/V-Sync as a this-or-that kinda thing is a bit misguided. I don't see it as an alternative so much as a successor. Supposedly if frames drop below the G-Sync limitations (eg: 30Hz) you'll get typical V-Sync images and thus lag/stuttering. I would assume the peak is similarly V-Sync locked, in that if you can theoretically eclipse 144Hz/fps your GPU simply won't. I would think that much like how V-Sync locks you to the hertz rate of the monitor, so too will G-Sync. Only in the case of G-Sync, the hertz fluctuate with the GPU.

In practice G-Sync has an awful lot in common with V-Sync. V-Sync has always had to fight with your monitor's hertz. Now that issue is removed, V-Sync can do its business properly. But it's not really "vertical synchronization" now, it's GPU synchronisation. Same theory, different application.

I'd say it's more like

V-sync -> Adaptive v-sync -> G-sync
And G-sync seems closer to adaptive in idea.
 

dr_rus

Member
I think looking at G-Sync/V-Sync as a this-or-that kinda thing is a bit misguided. I don't see it as an alternative so much as a successor. Supposedly if frames drop below the G-Sync limitations (eg: 30Hz) you'll get typical V-Sync images and thus lag/stuttering. I would assume the peak is similarly V-Sync locked, in that if you can theoretically eclipse 144Hz/fps your GPU simply won't. I would think that much like how V-Sync locks you to the hertz rate of the monitor, so too will G-Sync. Only in the case of G-Sync, the hertz fluctuate with the GPU.

In practice G-Sync has an awful lot in common with V-Sync. V-Sync has always had to fight with your monitor's hertz. Now that issue is removed, V-Sync can do its business properly. But it's not really "vertical synchronization" now, it's GPU synchronisation. Same theory, different application.
This isn't really synchronization anymore; the G-Sync term is inaccurate. The GPU simply drives the monitor's refreshes, so there is nothing to synchronize here. Thus v-sync isn't needed on a "g-sync" monitor.
It's a very nice tech but for the love of god, industry, bring it to proper IPS panels and not TN "gaming" crap only. I may get a TN g-sync monitor just to try it out but it won't stay with me for long because of, well, colors and angles.
 

-SD-

Banned
Anyone here have anything to say about this stuff spotted here?

1.
Gazzer96: This is just bull.
HWGuyEG: As someone who works with LCDs, it's hilarious bull. Their solution is to add a spare buffer to the LCD so you don't get tearing without vsync, which increases cost of the LCD. All they had to do is direct drive the LCD panel, would have lower latency, lower cost, and would solve tearing. Mobile GPUs already support this. All LCDs panels use FPD-Link, HDMI/DP/DVI is just a pointless layer ontop that operates at a fixed speed.

2.
MrTacticious: John Carmack is so passionate about what he talks, it's so interesting to listen
HWGuyEG: He's being paid well to talk about something which he doesn't understand. gsync is solution to a problem which does not need to exist, you can direct-drive LCD panels, no tearing or lag as long as you keep above ~10fps and under their max input frequency which is 144fps for modern TNs.
 

TheExodu5

Banned
By far the simplest and most effective way would appear to be falling back to V-sync behavior at 144 FPS. (That is, below 144 Hz flip/present calls in the graphics API invoke G-sync and return immediately, while at >144 Hz they stall until 6.9 ms frame time is reached).

Can GPUs actually do that, though? I assumed they couldn't since NVidia accomplishes frame limiting by downclocking cards. Though, I suppose that could be for power efficiency reasons, rather than the inability to do so.
 
Anyone here have anything to say about this stuff spotted here?

1.
Gazzer96: This is just bull.
HWGuyEG: As someone who works with LCDs, it's hilarious bull. Their solution is to add a spare buffer to the LCD so you don't get tearing without vsync, which increases cost of the LCD. All they had to do is direct drive the LCD panel, would have lower latency, lower cost, and would solve tearing. Mobile GPUs already support this. All LCDs panels use FPD-Link, HDMI/DP/DVI is just a pointless layer ontop that operates at a fixed speed.

2.
MrTacticious: John Carmack is so passionate about what he talks, it's so interesting to listen
HWGuyEG: He's being paid well to talk about something which he doesn't understand. gsync is solution to a problem which does not need to exist, you can direct-drive LCD panels, no tearing or lag as long as you keep above ~10fps and under their max input frequency which is 144fps for modern TNs.

Then the competition is free to make a simpler, cheaper solution ;)
 

kartu

Banned
With G-Sync:
[image: fps-vs-hz-gsync.png]

I'm struggling to understand this one.
The ideal line was when the refresh rate of the monitor matched the frames rendered by the GPU.
Now the gap between points 2 and 3 on this chart looks narrower, which cannot be.

I think there is a common misconception about that. The monitor cannot deliver frames faster than its max rate, with G-Sync or without. With G-Sync it can pace them better, but not the way shown on this chart.

Correct me if I'm wrong.



Anyone here have anything to say about this stuff spotted here?

1.
Gazzer96: This is just bull.
HWGuyEG: As someone who works with LCDs, it's hilarious bull. Their solution is to add a spare buffer to the LCD so you don't get tearing without vsync, which increases cost of the LCD. All they had to do is direct drive the LCD panel, would have lower latency, lower cost, and would solve tearing. Mobile GPUs already support this. All LCDs panels use FPD-Link, HDMI/DP/DVI is just a pointless layer ontop that operates at a fixed speed.

2.
MrTacticious: John Carmack is so passionate about what he talks, it's so interesting to listen
HWGuyEG: He's being paid well to talk about something which he doesn't understand. gsync is solution to a problem which does not need to exist, you can direct-drive LCD panels, no tearing or lag as long as you keep above ~10fps and under their max input frequency which is 144fps for modern TNs.


And how do you "direct drive" an LCD panel?
 

bro1

Banned
At the end of the day, this is going to add significant cost to the monitors and only work with Nvidia, right? I would rather have a normal high resolution monitor or an inexpensive 144Hz monitor than deal with proprietary technology.
 

muddream

Banned
LightBoost can be improved on in many ways:
-- Better colors. Avoid the LightBoost gamma bleaching effect
-- Easier to enable, such as simple checkbox in nVidia software
-- Configurable & Shorter strobe flash lengths. Hopefully adjustable to as small as 0.25ms or 0.5ms
-- Ability to run at all refresh rates, including 60Hz and 144Hz

I am hoping it includes shorter strobe flash lengths. It may be very dim without a more powerful backlight/backlight boost, but it will outperform a lot of CRT's.
For strobe backlight displays, motion blur is directly proportional to persistence (strobe flash length).

LCD motion clarity potential is now completely unbounded: It's only limited by how brief you can flash a backlight.
LCD motion blur is no longer limited by GtG, because GtG is kept in total darkness (unseen by the eye) and the backlight is strobed on fully refreshed frames. GtG transitions (1-2ms) last far less than the duration of a refresh (8.3ms @ 120Hz), which is what made LightBoost possible.

Can anyone other than Nvidia make these improvements to LightBoost? The motion clarity is very cool, but the colors and flicker are unbearable for me at this stage (even after trying out the "fixes").
 

Arulan

Member
Can anyone other than Nvidia make these improvements to LightBoost? The motion clarity is very cool, but the colors and flicker are unbearable for me at this stage (even after trying out the "fixes").

Lightboost was implemented for 3D, it wasn't meant for 2D. While the "unofficial hack" is quite amazing, it has some severe downsides.

I would imagine any other display manufacturer could make improvements to Lightboost for 2D if they actually set out with that goal knowing how popular it has become.
 

-SD-

Banned
Anyone here have anything to say about this stuff spotted here?

1.
Gazzer96: This is just bull.
HWGuyEG: As someone who works with LCDs, it's hilarious bull. Their solution is to add a spare buffer to the LCD so you don't get tearing without vsync, which increases cost of the LCD. All they had to do is direct drive the LCD panel, would have lower latency, lower cost, and would solve tearing. Mobile GPUs already support this. All LCDs panels use FPD-Link, HDMI/DP/DVI is just a pointless layer ontop that operates at a fixed speed.

2.
MrTacticious: John Carmack is so passionate about what he talks, it's so interesting to listen
HWGuyEG: He's being paid well to talk about something which he doesn't understand. gsync is solution to a problem which does not need to exist, you can direct-drive LCD panels, no tearing or lag as long as you keep above ~10fps and under their max input frequency which is 144fps for modern TNs.

Then the competition is free to make a simpler, cheaper solution ;)
...and a universal one, is my hope.
 

Lucis

Member
Anyone here have anything to say about this stuff spotted here?

1.
Gazzer96: This is just bull.
HWGuyEG: As someone who works with LCDs, it's hilarious bull. Their solution is to add a spare buffer to the LCD so you don't get tearing without vsync, which increases cost of the LCD. All they had to do is direct drive the LCD panel, would have lower latency, lower cost, and would solve tearing. Mobile GPUs already support this. All LCDs panels use FPD-Link, HDMI/DP/DVI is just a pointless layer ontop that operates at a fixed speed.

I can't detail how it works, but this is definitely not how GSYNC works
 

mdrejhon

Member
I'm struggling to understand this one.
The ideal line was when the refresh rate of the monitor matched the frames rendered by the GPU.
Now the gap between points 2 and 3 on this chart looks narrower, which cannot be.

I think there is a common misconception about that. The monitor cannot deliver frames faster than its max rate, with G-Sync or without. With G-Sync it can pace them better, but not the way shown on this chart.
Technically, you're correct about G-SYNC's maximum 144fps, however:
-- The chart is correct if you visualize that the first fps=Hz diagram is 60fps@60Hz.

I forgot to clarify that this was the case, because I was recycling an old 60fps@60Hz diagram I created (it also applies to 120fps@120Hz and any other fps=Hz situation). That said, a correction will be made to future versions of the images. The specific charts are refresh-rate independent, and none of the charts need to be assigned the same refresh rate.

Also, G-Sync can be improved in the future to have a higher maximum frame rate. However, the graphics are useful to help visually understand the relationship between the human eye tracking & perception of stutters.

Thanks to your reader comment, I have posted an update, to clarify this.
http://www.blurbusters.com/how-does-nvidia-gsync-fix-stutters/comment-page-1/#comment-1774
Hope this clarifies things!
 

mdrejhon

Member
Can anyone other than Nvidia make these improvements to LightBoost? The motion clarity is very cool, but the colors and flicker are unbearable for me at this stage (even after trying out the "fixes").
Yes. There are now multiple strobe backlights that are as effective as LightBoost.

1. nVidia LightBoost (the one that started the strobe backlight craze, thanks to Blur Busters)
2. Sony's "Motionflow Impulse" (Sony's version of LightBoost)
3. Samsung's 3D Mode is a strobe backlight. (Samsung 120Hz instructions)
4. EIZO FDF2405's 240Hz monitor uses LightBoost-style strobing.
5. G-SYNC's now-official low-persistence mode is a LightBoost sequel.
6. Viewpixx's 5-figure priced IPS 1920x1200 with high quality LightBoost-like mode.

LightBoost (albeit unofficial) is just the first one that became popular.
Even OLEDs have to use optional strobing to eliminate motion blur. Also, TFTCentral explains why strobing eliminates motion blur. The animations at www.testufo.com/eyetracking and www.testufo.com/blackframes are also very educational in explaining the human vision behavior behind strobing.

The future is looking promising. Strobing is necessary for CRT-quality motion until we have 1000fps@1000Hz flicker-free displays (John Carmack and Michael Abrash agree, and the animation at www.testufo.com/eyetracking is good proof of this, too). As long as strobing remains optional and we have a PWM-free mode, everyone's pretty much well covered. Strobing actually reduces eyestrain for me, because I get more motion-blur eyestrain than flicker eyestrain, but other people are different. Some people will never get used to strobing until it goes to a higher frequency and/or gets "softened" (more closely emulating CRT phosphor decay, going line-sequential, etc).
 

mdrejhon

Member
Gazzer96: This is just bull.
HWGuyEG: As someone who works with LCDs, it's hilarious bull. Their solution is to add a spare buffer to the LCD so you don't get tearing without vsync, which increases cost of the LCD. All they had to do is direct drive the LCD panel, would have lower latency, lower cost, and would solve tearing. Mobile GPUs already support this. All LCDs panels use FPD-Link, HDMI/DP/DVI is just a pointless layer ontop that operates at a fixed speed.
That's not how GSYNC works, either.
All the good 120Hz BENQ and 120Hz ASUS LCD panels have already been direct-driven since 2011 (just at granular intervals, rather than timed with the GPU).
I have high-speed camera proof (watch the non-LightBoost section), and the upcoming Blur Busters Input Lag Tester (release date: early 2014; supports any refresh rate, and 4K) confirms this too, showing only a few milliseconds of input lag.
Now G-SYNC direct-drives *AND* allows variable refresh rates.
 

Durante

Member
Thanks for all the in-depth posts Mark, really interesting.

Tangentially (at best) related question: did you ever do your high-speed camera/diode measurements on a DLP projector (e.g. I use a 120 Hz Acer H5630)? It's the only "modern" display technology which feels CRT-like to me.
 

mdrejhon

Member
Tangentially (at best) related question: did you ever do your high-speed camera/diode measurements on a DLP projector (e.g. I use a 120 Hz Acer H5630)? It's the only "modern" display technology which feels CRT-like to me.
Not yet, but DLP projectors capable of 120Hz+BFI behave very similarly to LightBoost.
The BENQ GT720 supports combining 120Hz and BFI.

Basically, many DLP projectors running in 3D mode (120Hz) add forced black frame insertion between refreshes. This is to prevent crosstalk when using 3D glasses, giving the glasses time to switch shutters. But this helps 2D too! As a result, 120Hz DLPs in 3D mode are very CRT-like, with excellent motion resolution for 2D 120Hz gaming, where fast panning at www.testufo.com/photo looks as clear as stationary images (or almost). It's like having LightBoost, but in a projector.

I can confirm that LightBoost and strobe backlights actually outperform these DLPs, provided you run at 120fps@120Hz (LightBoost behaves like a 120Hz CRT). I'm able to read the street labels in the TestUFO Panning Map Test at 1920 pixels/second on a LightBoost LCD configured to LightBoost=10%. (If you have never tried LightBoost=10%, you've missed the further improvements.) This panning map is so fast that I have to turn my head while trying to read the map labels. You cannot do this on regular LCDs; it's only readable on CRTs or LightBoost (and very probably the upcoming G-SYNC's optional strobed mode). This produces less motion blur than a Sony GDM-W900 CRT, and less motion blur than 120Hz DLPs. The map looks like a paper map being moved sideways in front of my face, with no motion blur. However, LightBoost displays have much worse color than 120Hz DLPs, so that may ruin part of the CRT illusion for you.

Motion blur is directly proportional to persistence, as John Carmack and others say (aka visible frame length -- aka strobe length -- aka phosphor decay).
Black frame 50%; illumination 50% = 1:1 motion blur reduction -- 50% less motion blur
Black frame 75%; illumination 25% = 3:1 motion blur reduction -- 75% less motion blur
Black frame 90%; illumination 10% = 9:1 motion blur reduction -- 90% less motion blur
This holds regardless of which strobe-capable display technology is used (LightBoost, DLP, OLED, etc), and even applies to CRT/plasma (though those 'fade away' their strobes more gradually, aka phosphor decay). A quick numeric restatement follows below.
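Restating those ratios as a quick calculation (same numbers as the list above, just computed):

# Blur reduction versus a full-persistence (sample-and-hold) frame.
for lit in (0.50, 0.25, 0.10):                # fraction of the refresh the backlight is lit
    dark = 1.0 - lit
    reduction = 1.0 - lit                     # e.g. lit 25% of the time => 75% less blur
    print(f"black {dark:.0%} / lit {lit:.0%}: {reduction:.0%} less motion blur")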
 
So I understand how this is better than normal, double-bufferred vsync... but I almost never use that. I use Radeon Pro or DXOverrider to force triple buffering.

The way I understand triple buffering is this: The video card creates new frames as fast as it wants to, but doesn't immediately send them to the monitor. The monitor reads frames from the "3rd buffer". After the monitor is finished drawing a complete frame, the contents of the 3rd buffer are updated to the most recent frame the video card has rendered, which the monitor reads from again whenever it is ready. This keeps tearing in check without slowing down the video card. Input latency would always be, at most, one frame behind.

This seems like a win win solution. Is my understanding correct, and how is Gsync substantially better?
 

mdrejhon

Member
This seems like a win win solution. Is my understanding correct, and how is Gsync substantially better?
They're not necessarily mutually exclusive.
G-SYNC can probably be combined with triple buffering to gain the best of both worlds. That way, games that run above 144fps can automatically use triple buffering, making G-SYNC even better than triple buffering alone. That's because when your framerate dips below 144fps, your frames aren't forced to wait for the next granular scheduled refresh before being displayed on the monitor. Then when you hit 144fps@144Hz and get bottlenecked, you're triple buffering to reduce 144Hz latency.

That said, it MAY distort frame timings relative to refresh timings (divergence of object positions from the ideal eye-tracking line, re-introducing ultra-minor judder at 144fps@144Hz in exchange for lower input lag).
This should be an optional setting, at least for professional game players, such as a Control Panel option:

[X] Enable Triple Buffering with G-SYNC When Above 144fps (lower lag but may add microstutters)
 

Durante

Member
So I understand how this is better than normal, double-bufferred vsync... but I almost never use that. I use Radeon Pro or DXOverrider to force triple buffering.

The way I understand triple buffering is this: The video card creates new frames as fast as it wants to, but doesn't immediately send them to the monitor. The monitor reads frames from the "3rd buffer". After the monitor is finished drawing a complete frame, the contents of the 3rd buffer are updated to the most recent frame the video card has rendered, which the monitor reads from again whenever it is ready. This keeps tearing in check without slowing down the video card. Input latency would always be, at most, one frame behind.

This seems like a win win solution. Is my understanding correct, and how is Gsync substantially better?
Triple buffering absolutely is the best currently possible mitigation strategy (in my opinion, since any form of tearing is unacceptable to me), and should always be used. However, G-sync doesn't try to mitigate the issue (of incoherent refresh rates between GPU and display), it fixes it. It is better in 3 ways:
  • Sadly, games rarely implement the correct method of triple buffering as you describe, and rather go with a FIFO frame queue instead of discarding old frames. In many games, even forcing it can add more than the expected amount of input lag.
  • Triple buffering does nothing to fix (or even reduce) the judder produced by the mismatch of "frame rendering time" and "frame display time". Since both devices (GPU and display) still produce/show frames at uneven intervals, you get judder.
  • Triple buffering requires additional GPU memory to store the third buffer (though this is mostly negligible in terms of size on modern GPUs).
Of those, judder is the most relevant issue, and basically impossible to fix with a fixed refresh rate display.

And yeah, as Mark says above, you could actually start to use triple buffering at above 144 FPS (for G-sync displays with a maximum frequency of 144 Hz), though personally I'm not hardcore enough to care about those 6ms.
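To make the first bullet concrete, here's a toy sketch (my own simplification, not any real driver) of the two behaviours: "proper" triple buffering always hands the display the newest completed frame, while a FIFO render queue hands over the oldest one, so input lag grows with queue depth.

from collections import deque

class LatestFrameBuffer:
    # "Proper" triple buffering: the pending slot is overwritten by the newest frame.
    def __init__(self):
        self.pending = None
    def submit(self, frame):
        self.pending = frame                  # an older undisplayed frame is simply discarded
    def flip(self):
        frame, self.pending = self.pending, None
        return frame                          # the display always gets the most recent frame

class FifoQueue:
    # What many games actually do: show frames in submission order.
    # (A real driver caps the queue depth and stalls the GPU when it is full.)
    def __init__(self):
        self.q = deque()
    def submit(self, frame):
        self.q.append(frame)
    def flip(self):
        return self.q.popleft() if self.q else None   # oldest frame first

# GPU renders frames 1..5 while the display refreshes only once:
latest, fifo = LatestFrameBuffer(), FifoQueue()
for f in range(1, 6):
    latest.submit(f)
    fifo.submit(f)
print(latest.flip())   # 5: the newest frame, at most one frame behind the GPU
print(fifo.flip())     # 1: the oldest queued frame; lag grows as the queue fills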
 

mdrejhon

Member
Well, looking at the Internet as a professional analyst: Good news for people who want "LightBoost" style strobing at other refresh rates, to reduce GPU requirements (85fps @ 85Hz) or to reduce input lag (144fps @ 144Hz). G-SYNC's optional superior sequel to LightBoost (optional fixed-rate strobe mode) actually supports strobing at 85Hz and at 144Hz (at least), in addition to existing LightBoost modes (100Hz and 120Hz).

Clues:

  1. The G-SYNC upgrade datasheet has 85Hz added.
  2. AndyBNV suggested on NeoGAF the low-persistence mode is superior to LightBoost.
  3. The YouTube video of John Carmack at G-SYNC launch, was very suggestive.
  4. Many articles mention 85Hz as a CRT frequency that stops flickering for many people.
  5. The pcper.com livestream suggests a very high fixed refresh in low-persistence mode.
Upon analysis, both 85Hz and 144Hz are available strobed modes with G-SYNC, in addition to 100Hz and 120Hz. 60Hz has too much flicker, so that mode isn't fully confirmable. Beyond 144Hz is beyond ASUS VG248QE's bandwidth. Other strobed modes might also be available. You heard it first from me. :)
 

LCGeek

formerly sane
Well, looking at the Internet as a professional analyst: Good news for people who want "LightBoost" style strobing at other refresh rates, to reduce GPU requirements (85fps @ 85Hz) or to reduce input lag (144fps @ 144Hz). G-SYNC's optional superior sequel to LightBoost (optional fixed-rate strobe mode) actually supports strobing at 85Hz and at 144Hz (at least), in addition to existing LightBoost modes (100Hz and 120Hz).

Clues:

  1. The G-SYNC upgrade datasheet has 85Hz added.
  2. AndyBNV suggested on NeoGAF the low-persistence mode is superior to LightBoost.
  3. The YouTube video of John Carmack at G-SYNC launch, was very suggestive.
  4. Many articles mention 85Hz as a CRT frequency that stops flickering for many people.
  5. The pcper.com livestream suggests a very high fixed refresh in low-persistence mode.

Upon analysis, both 85Hz and 144Hz are available strobed modes with G-SYNC, in addition to 100Hz and 120Hz. Other strobed modes (e.g. 60Hz) might also be available, but would have very uncomfortable flicker.

You heard it first from me. :)

That's huge: I need fewer frames to get the smooth, clear effect, fuck yeah. Shame I just bought a 7950, but when I come back to nVidia for G-Sync, some things will be extremely welcome.
 

Rur0ni

Member
Well, looking at the Internet as a professional analyst: Good news for people who want "LightBoost" style strobing at other refresh rates, to reduce GPU requirements (85fps @ 85Hz) or to reduce input lag (144fps @ 144Hz). G-SYNC's optional superior sequel to LightBoost (optional fixed-rate strobe mode) actually supports strobing at 85Hz and at 144Hz (at least), in addition to existing LightBoost modes (100Hz and 120Hz).

Clues:

  1. The G-SYNC upgrade datasheet has 85Hz added.
  2. AndyBNV suggested on NeoGAF the low-persistence mode is superior to LightBoost.
  3. The YouTube video of John Carmack at G-SYNC launch, was very suggestive.
  4. Many articles mention 85Hz as a CRT frequency that stops flickering for many people.
  5. The pcper.com livestream suggests a very high fixed refresh in low-persistence mode.
Upon analysis, both 85Hz and 144Hz are available strobed modes with G-SYNC, in addition to 100Hz and 120Hz. 60Hz has too much flicker, so that mode isn't fully confirmable. Beyond 144Hz is beyond ASUS VG248QE's bandwidth. Other strobed modes might also be available. You heard it first from me. :)
This is NeoGAF.

Great posts.
 

TheExodu5

Banned
Triple buffering absolutely is the best currently possible mitigation strategy (in my opinion, since any form of tearing is unacceptable to me), and should always be used. However, G-sync doesn't try to mitigate the issue (of incoherent refresh rates between GPU and display), it fixes it. It is better in 3 ways:
  • Sadly, games rarely implement the correct method of triple buffering as you describe, and rather go with a FIFO frame queue instead of discarding old frames. In many games, even forcing it can add more than the expected amount of input lag.
  • Triple buffering does nothing to fix (or even reduce) the judder produced by the mismatch of "frame rendering time" and "frame display time". Since both devices (GPU and display) still produce/show frames at uneven intervals, you get judder.
  • Triple buffering requires additional GPU memory to store the third buffer (though this is mostly negligible in terms of size on modern GPUs).
Of those, judder is the most relevant issue, and basically impossible to fix with a fixed refresh rate display.

And yeah, as Mark says above, you could actually start to use triple buffering at above 144 FPS (for G-sync displays with a maximum frequency of 144 Hz), though personally I'm not hardcore enough to care about those 6ms.

This was my biggest issue with triple buffering, and why I often didn't use it (also, why my old tag wasn't really correct :p). I would say that most games suffer from severe input lag with triple buffering enabled. With some competitive games it became completely unacceptable (Street Fighter, Counter-Strike).
 
Just finished reading up on this technology and if it works as advertised, it's a game changer for me. All I care about are my games running smoothly and an enjoyable experience. Graphic fidelity isn't the highest priority. Can't wait to try one out.
 
Hey can anyone help me understand this issue? I posted about it in the TechReport thread but have yet to get an answer:

Link to the segment in question

Right at the 38 minute mark the conversation is about how nothing (vsync on/off) fixes stuttering and how the industry missed a chance to innovate in moving from CRT to LCD. Tom mentions the need (back with CRT monitors) to sync the monitor with the GPU.

I'm just wondering if someone can explain what Gsync does that a CRT monitor doesn't? If CRT monitors had to be synced to the GPU, what am I missing? Isn't that what Gsync does?


I also never used a CRT monitor so I'm admitting to being out of my element here.
 