
G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

rothgar

Member
Nice, my wife got me the VG248QE last month, but I'm running a 7870. Gonna have to wait a few months to swap to an nvidia gpu and the monitor module.
 

HariKari

Member
Honest question, why would you need 120+ fps on a 4k monitor to max them out? Would a solid 60 fps not have the same effect?

You can feel the difference between 60 fps and 120fps. Your eyes may not discern it, but your brain can sense that something is different. 60fps+ at 60hz is smooth. 120fps on a 144hz monitor is just buttery smooth. It's the same sensation you get when you go from playing consoles all of your life to a PC gaming setup on a nice monitor. It's a revelation.

More frames will still always be better. That's just extra slack in the system that can be used to drive the display or drive visual goodies. But you no longer have the 60 or 120 fps goalposts to shoot for. If it works as advertised (which it does, according to everyone who has witnessed it) then all you really need is a system that never really dips below 30 or 40 fps.

Even if this is still an enthusiast thing it seems like a thing that should be done

This technology means the most to entry level cards. The 'no more tearing' thing is nice, but what it does under 60 is just as important.
 

chaosblade

Unconfirmed Member
It's actual hardware. On both ends of the connection. I don't see what similarities it has with programmable shaders.

I meant it's on that level. Imagine if one company had programmable shaders and no one else did.

And this is something that needs to be reasonably standardized. The idea of paying $500+ for a midrange 21" 1080p monitor with G-sync, AMD's implementation, and Intel's implementation isn't very appealing either.

Being proprietary also means this will probably never reach people beyond a tiny enthusiast niche, when it would be very beneficial for everyone. Hopefully it gets licensed out reasonably.
 

Arulan

Member
If it works as advertised (which it does, according to everyone who has witnessed it) then all you really need is a system that never really dips below 30 or 40 fps.

Frame rate will continue to be important though, even if you need "less" to achieve a similar perceived motion smoothness. Input lag will continue to be affected by your frame rate, so even if 40 fps might "look" as smooth as 60 fps with this technology, you'll still want to achieve higher frame rates for lower input lag on the control side of things.
 

HariKari

Member
Frame rate will continue to be important though, even if you need "less" to achieve a similar perceived motion smoothness. Input lag will continue to be affected by your frame rate, so even if 40 fps might "look" as smooth as 60 fps with this technology, you'll still want to achieve higher frame rates for lower input lag on the control side of things.

G-Sync doesn't eliminate input lag, but significantly reduces it. Frames will always matter to the type of person who demands the absolute best competitive edge. But eliminating a good chunk of the lag is a major improvement for entry level and mid range cards/systems that want to run higher resolutions, more visual enhancements, or both.
 

mdrejhon

Member
@AndyBNV

What does this mean for Lightboost monitors and specifically the ToastyX implementation? How does G-Sync compare to LightBoost strobing? Do they work together, like with 3D Vision?

The VG248QE is one of the top monitors in this category. However, Lightboost uses DVI for a connection, and last time I checked, Displayport does not support this feature.
There are some technical challenges to combining G-Sync with LightBoost strobing, specifically flicker caused by variable strobing.

However, there's a variable-rate strobing algorithm that I've come up with, which should make flicker-free combining of G-Sync and LightBoost possible:
http://www.blurbusters.com/faq/creating-strobe-backlight/#variablerefresh

Essentially, low framerates run in PWM-free mode, and the flat backlight DC begins to undulate more and more the higher the refresh rate goes, until it resembles a LightBoost square wave at triple-digit frame rates. So you don't get flicker caused by variable-refresh-rate undulations. Strobe duty cycle is also maintained.
 

Arulan

Member
G-Sync doesn't eliminate input lag, but significantly reduces it. Frames will always matter to the type of person who demands the absolute best competitive edge. But eliminating a good chunk of the lag is a major improvement for entry level and mid range cards/systems that want to run higher resolutions, more visual enhancements, or both.

True on all counts, but I just want to make clear: G-Sync should eliminate all input lag present with current V-sync, and reduce it overall a bit through direct communication with the GPU. However, the input lag associated with the frame rate the game is rendering at will continue to be just as important as it is now (not talking about V-sync at all). I agree though, for a lot of people, especially those who cannot always afford the highest-end setups, this will improve their experience dramatically.
 

mdrejhon

Member
Honest question, why would you need 120+ fps on a 4k monitor to max them out? Would a solid 60 fps not have the same effect?
No. You still get the motion blur problem.
Contrary to popular myth, there is no limit to the benefits of even higher refresh rates, because of motion blur. As you track moving objects on a screen, your eyes are in a different position at the beginning of a refresh than at the end of a refresh. That causes frames to be blurred across your vision. Mathematically, 1ms of static refresh time translates to 1 pixel of motion blur during 1000 pixels/second motion. This is already well documented as "sample-and-hold" in scientific references, and already talked about by John Carmack (id Software) and Michael Abrash (Valve Software).

You have heard people raving about LightBoost motion clarity in some of the high-end PC gamer forums lately. Motion blur on strobe-free LCDs only gradually declines as you go higher in refresh rate. Fixing this motion blur limit is easier to do by strobing. LightBoost is already equivalent to 400fps@400Hz motion clarity via 1/400sec strobe flash lengths. If you use LightBoost=10%, you'd need at least 1000fps@1000Hz G-Sync to get less motion blur during non-strobed G-Sync than with strobed LightBoost. Yes, strobing is a bandaid until we have the magic 1000fps@1000Hz display. But Rube Goldberg vacuum-filled glass balloons (aka CRTs) are a bandaid too, in the scientific progress toward Holodeck perfection.

Good animation of how motion blur works -- www.testufo.com/eyetracking
Good animation of how strobing fixes blur -- www.testufo.com/blackframes

The strobe effect is why CRT 60fps@60Hz still has less motion blur than non-strobed LCD 120fps@120Hz, as noticed by extreme gamers who are used to CRT motion clarity.
LightBoost 100% uses 2.4ms strobe flash lengths. (1/400sec)
LightBoost 10% uses 1.4ms strobe flash lengths. (1/700sec)
The motion clarity difference between the two (using ToastyX Strobelight) is actually noticeable in fast panning motion, such as TestUFO: Panning Map Test (blurry mess on your monitor; perfectly sharp on CRT and LightBoost) when toggling between LightBoost Strobe Brightness 10% versus 100%. It's like having an adjustable-phosphor-persistence CRT. Mathematically, based on science papers, 1ms of persistence (visible static frame duration time) translates to 1 pixel of motion blur for every 1000 pixels/second of eye-tracking motion.
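If you want to plug in your own numbers, here's a tiny sketch of that arithmetic (illustrative only; it just applies the 1ms-per-1000px/s rule of thumb above, and the function names are mine, not from any tool):

def motion_blur_px(persistence_ms, speed_px_per_s):
    # Rule of thumb above: blur width scales linearly with visible frame/strobe time.
    return persistence_ms / 1000.0 * speed_px_per_s

def equivalent_sample_and_hold_hz(persistence_ms):
    # Refresh rate a non-strobed display would need for the same persistence.
    return 1000.0 / persistence_ms

for label, strobe_ms in [("LightBoost 100%", 2.4), ("LightBoost 10%", 1.4)]:
    print(label,
          "blur at 1000 px/s:", round(motion_blur_px(strobe_ms, 1000), 1), "px,",
          "sample-and-hold equivalent:", round(equivalent_sample_and_hold_hz(strobe_ms)), "Hz")

# Non-strobed 120Hz for comparison: persistence ~ 1000/120 ~ 8.3ms -> ~8 px of blur.
print("120Hz sample-and-hold blur at 1000 px/s:",
      round(motion_blur_px(1000 / 120, 1000), 1), "px")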

(For those living under a rock, LightBoost (for 2D) is a strobe backlight that eliminates motion blur in a CRT-style fashion. It eliminates crosstalk for 3D, but also eliminates motion blur (For 2D too). LightBoost is now more popular for 2D. Just google "lightboost". See the "It's like a CRT" testimonials, the LightBoost media coverage (AnandTech, ArsTechnica, TFTCentral, etc), the improved Battlefield 3 scores from LightBoost, the photos of 60Hz vs 120Hz vs LightBoost, the science behind strobe backlights, and the LightBoost instructions for existing LightBoost-compatible 120Hz monitors. It is truly an amazing technology that allows LCD to have less motion blur than plasma/CRT. John Carmack uses a LightBoost monitor, and Valve Software talked about strobing solutions too. Now you're no longer living under a rock!)
 

Dawg

Member
Oh cool, didn't know Blurbusters had an account on NeoGAF!

Keep up the good work, enjoy reading the site :)
 

HariKari

Member
However, the input lag associated with the frame rate the game is rendering at will continue to be just as important as it is now

Just as tearing occurs when the GPU outpaces the monitor, stutter and lag occur when the monitor outpaces the GPU. G-Sync will be in monitors that can downclock themselves all the way down to 30hz in order to match the GPU. This significantly reduces input lag at FPS rates lower than 60 but >30.
 

Durante

Member
This technology means the most to entry level cards. The 'no more tearing' thing is nice, but what it does under 60 is just as important.
Yeah, I think people are missing this. The largest visible improvement from G-sync (and one that should be immediately obvious to most gamers, not just enthusiasts) is likely to occur in the range between 30 and 60 FPS.

This doesn't change the fact that higher framerates are always better, but it will greatly improve the fluidity of motion whenever you are not reaching exactly 60 (or 120) FPS. (In addition to latency benefits, of course)
 

kartu

Banned
Not exactly.

Yes, people who already game on 120hz monitors and/or beastly GPU configurations probably already have much less tearing than people trying to eke out performance from aging configurations.
I might be missing something, why is there any tearing if V-Sync is on?
 

Arulan

Member
Just as tearing occurs when the GPU outpaces the monitor, stutter and lag occur when the monitor outpaces the GPU. G-Sync will be in monitors that can downclock themselves all the way down to 30hz in order to match the GPU. This significantly reduces input lag at FPS rates lower than 60 but >30.

I think we may be talking about two different things, heh. I believe you're referring to pull-down judder? Yes, G-sync does eliminate that, just as V-sync does, but without all the downsides.

I'm simply referring to, ignoring all other sources of input lag, if you're running the game at 30 fps, at the very minimum you're still getting 33.33ms of input lag, at 40 fps 25ms, at 60 fps 16.66ms, at 120 fps 8.33ms, etc.
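(To reproduce those numbers yourself, it's just one frame time, ignoring every other source of latency; a quick sketch:)

# Minimum input lag contributed by the render rate alone: one frame time = 1000/fps ms.
for fps in (30, 40, 60, 120):
    print(fps, "fps ->", round(1000 / fps, 2), "ms minimum")
# 30 fps -> 33.33 ms, 40 -> 25.0 ms, 60 -> 16.67 ms, 120 -> 8.33 ms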
 

dark10x

Digital Foundry pixel pusher
(For those living under a rock, LightBoost (for 2D) is a strobe backlight that eliminates motion blur in a CRT-style fashion. It eliminates crosstalk for 3D, but also eliminates motion blur (For 2D too). LightBoost is now more popular for 2D. Just google "lightboost". See the "It's like a CRT" testimonials, the LightBoost media coverage (AnandTech, ArsTechnica, TFTCentral, etc), the improved Battlefield 3 scores from LightBoost, the photos of 60Hz vs 120Hz vs LightBoost, the science behind strobe backlights, and the LightBoost instructions for existing LightBoost-compatible 120Hz monitors. It is truly an amazing technology that allows LCD to have less motion blur than plasma/CRT. John Carmack uses a LightBoost monitor, and Valve Software talked about strobing solutions too. Now you're no longer living under a rock!)
I like Lightboost, but it still has a ways to go. It impacts image quality in other areas and is dependent upon high performance from your PC in order to function properly. The motion you can achieve on a CRT is independent of the source.
 

mdrejhon

Member
I like Lightboost, but it still has a ways to go. It impacts image quality in other areas and is dependent upon high performance from your PC in order to function properly. The motion you can achieve on a CRT is independent of the source.
Strobing isn't limited to TN anymore. Sony's interpolation-free low-lag "Motionflow Impulse" does it (Sony's equivalent of LightBoost), and the Eizo professional FDF2405W 240Hz VA monitor uses a strobe backlight (see page 15 of the PDF manual for strobing information). You must have missed the news of the new, additional strobe backlights under brand names other than "LightBoost". 2014 shall be an interesting year.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
Yeah, I was going to say, even though the tech seems enthusiast level, it's actually going to be hugely beneficial to everyone, including those who average between 30 - 60fps. In fact, it's no less beneficial to them than to a 100+fps system, perhaps even more so, as the differences in drops would be more significant.

This is partially why I'm pretty pumped to get a monitor. I don't have the money to throw around on significant SLI/Titan systems that crank triple digit frames, but I loathe the input lag of sub-60fps on a 60Hz monitor. If this improves response time even at lower framerates, it will be amazing.
 

dark10x

Digital Foundry pixel pusher
Strobing isn't limited to TN anymore. Sony Motionflow Impulse does it (Sony's version of LightBoost), and the Eizo professional FDF2405W 240Hz VA monitor uses a strobe backlight. You must have missed the news of the new, additional strobe backlights.
I'd like to see those for myself before judging, as I wasn't made aware of the technology's limitations until I used it.

Again, though, all of these technologies are being used to solve problems inherent in LCDs when companies should be focusing on different types of displays altogether.
 

Durante

Member
What I don't get about strobing is why it has to be bound to FPS. Can't you strobe the same frame many times if you keep the strobing frequency extremely high? (Like 1000 Hz) Or is that not technically possible?
 

dark10x

Digital Foundry pixel pusher
What I don't get about strobing is why it has to be bound to FPS. Can't you strobe the same frame many times if you keep the strobing frequency extremely high? (Like 1000 Hz) Or is that not technically possible?
Yes, I'd like to know this as well.

It seems like it would be necessary for g-sync as well if you want to achieve the same effect.
 

mdrejhon

Member
Alas, the Eizo FDF2405W is a professional/commercial monitor costing four figures, but it shows what can finally be done with VA panels.

240Hz panels have been around for two years now in high-end HDTVs, for interpolation. Apparently, the Eizo FDF2405W shows that such 240Hz LCD panels are now possible in computer monitor sizes, and drivable at closer to native refresh rates without interpolation.
 

mdrejhon

Member
Yes, I'd like to know this as well.
It seems like it would be necessary for g-sync as well if you want to achieve the same effect.
Yes, but variable-rate strobing can create flicker problems.
It's a very tough engineering problem.

Motion blur reduction strobing must only occur once per refresh, so if your framerates drop to only 30fps, you will get very bad and uncomfortable flicker.
I have come up with an idea for a strobe curve algorithm to fix this. Basically, it blends from PWM-free operation at lower refresh rates to full strobing at high refresh rates, roughly like this:

Example:
10fps@10Hz — PWM-free backlight
30fps@30Hz — PWM-free backlight
45fps@45Hz — PWM-free backlight
60fps@60Hz — Minor backlight brightness undulations (bright / dim / bright / dim)
80fps@80Hz — Sharper backlight brightness undulations (very bright / very dim)
100fps@100Hz — Starts to resemble rounded-square-wave (fullbright / fulloff)
120fps@120Hz and up — Nearly square-wave strobing like original LightBoost

This would be a dynamically variable continuum all the way in between too, much like an automobile CVT instead of discrete gears in a transmission. You avoid flicker at lower frame rates, and you get full strobing benefits at higher frame rates.

Simpler algorithm variations are also possible (e.g. keeping a square wave, and using only pulsewidth / pulseheight manipulation to achieve the blending effect, but without curve-softening). This is included as part of my general idea of blending from PWM-free at lower refresh rates, to strobing at higher refresh rates. The trigger framerates may be different from the example above (or may even be adjustable via a user flicker-threshold setting), but the concept is the same.

Strobe duty cycle is maintained, to achieve constant average brightness at all times, so that changes in strobe rates aren't noticeable.
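For the curious, here's a rough sketch of how such a blend could be expressed. It's purely illustrative of my idea above, not any shipping firmware; the thresholds, the 20% strobe duty, and the function names are my own placeholders:

def modulation_depth(refresh_hz, pwm_free_below=45.0, full_strobe_above=120.0):
    # 0.0 = flat DC backlight, 1.0 = full on/off strobing.
    if refresh_hz <= pwm_free_below:
        return 0.0
    if refresh_hz >= full_strobe_above:
        return 1.0
    return (refresh_hz - pwm_free_below) / (full_strobe_above - pwm_free_below)

def backlight_level(phase, refresh_hz, avg_brightness=0.5, strobe_duty=0.2):
    # Backlight drive level at a given phase (0..1) within one refresh cycle.
    depth = modulation_depth(refresh_hz)
    # Full-strobe waveform: a short pulse whose height keeps the average
    # output equal to avg_brightness (so average brightness is maintained).
    strobe = (avg_brightness / strobe_duty) if phase < strobe_duty else 0.0
    # Blend between flat DC and the strobe pulse according to the depth.
    return (1.0 - depth) * avg_brightness + depth * strobe

# At 30Hz the backlight stays flat; by 120Hz it is nearly a square-wave pulse.
for hz in (30, 60, 80, 120):
    waveform = [round(backlight_level(p / 10.0, hz), 2) for p in range(10)]
    print(hz, "Hz:", waveform)

The average of every waveform above stays at 0.5, so changes in strobe rate shouldn't show up as brightness changes.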
 

Dawg

Member
So, the motion blur won't be reduced to CRT levels or anything and you'll still need LightBoost, but what tech did they use for this screenshot:

DSC_3611.JPG


Again, this screenshot is one of the coolest things they showed (for me).
 

mdrejhon

Member
So, the motion blur won't be reduced to CRT levels or anything and you'll still need LightBoost, but what tech did they use for this screenshot:
Some comments:

(1) Static photography does not accurately capture display motion blur. You need a pursuit camera.
HOWTO: Using a Pursuit Camera with TestUFO Motion Tests
For example, you can only photographically capture the checkerboard pattern at http://www.testufo.com/eyetracking#pattern=checkerboard if you pan the camera along with the UFO while taking a picture.

(2) G-Sync definitely does decrease motion blur, but not to LightBoost levels (for locked 120fps@120Hz motion, for panning similar to www.testufo.com/photo fluidity)
G-Sync is better for variable frame rate situations, but if you've got a Titan and max out at 144fps@144Hz, it still has more motion blur than LightBoost 120fps@120Hz.

Also, it is possible they may have already come up with a method of variable-rate strobing, but I'd like to hear a confirmation about this. Also, is this station running G-Sync, or running LightBoost? John Carmack has mentioned that they've enhanced LightBoost in the ASUS G-Sync upgrade board, but he says it's an "either-or" option.

John Carmack said:
John Carmack ‏@ID_AA_Carmack 10h
@GuerrillaDawg the didn’t talk about it, but this includes an improved lightboost driver, but it is currently a choice — gsync or flashed.
 

2San

Member
Strobing isn't limited to TN anymore. Sony's interpolation-free low-lag "Motionflow Impulse" does it (Sony's equivalent of LightBoost), and the Eizo professional FDF2405W 240Hz VA monitor uses a strobe backlight (see page 15 of the PDF manual for strobing information). You must have missed the news of the new, additional strobe backlights under brand names other than "LightBoost". 2014 shall be an interesting year.
Wow, I didn't know that. Seems like Sony is doing good work as well.

The problem I have with LightBoost is that it gives me a mild headache. Do you get used to it after a while?
 
This technology is really interesting. I hope IPS monitors can have this as well, I'm not sure I can go back to TN panels.

Yesterday I played Arkham Asylum with PhysX on, and this thing would be a godsend: FPS drops into the 40s and 30s and the game stutters like crazy for me; with this, no more stutter.

Hopefully more people and companies will take notice and this will become widespread.
 

mdrejhon

Member
Wow, I didn't know that. Seems like Sony is doing good work as well.
The problem I have with LightBoost is that it gives me a mild headache. Do you get used to it after a while?
No... It depends on the human.
For me specifically, it actually reduces eyestrain, because I get more headaches from fast gaming motion blur than from strobing. Eliminating the motion blur fixes it for me. Others such as CallSignVega agree. Also, Eizo's FDF2405W brochure says the 240Hz mode (strobe mode) can reduce eyestrain during map panning, so the claim has solid credibility. But it is a tradeoff if you are more sensitive to flicker than to motion blur. (I'm VERY sensitive to motion blur.) Fortunately, ToastyX Strobelight makes it easy to turn off, so you can turn it on only during games.

LightBoost is good for people who are used to CRTs. If you never got CRT eyestrain, then it's easy to get used to LightBoost, since LightBoost behaves like a 120Hz CRT. However, it's not for people who get headaches with plasmas/CRTs.

Adjusting strobe brightness downwards and increasing ambient lighting in the room helps greatly -- common recommendations from back in the CRT tube days, too.
 

Datschge

Member
What I don't get about strobing is why it has to be bound to FPS. Can't you strobe the same frame many times if you keep the strobing frequency extremely high? (Like 1000 Hz) Or is that not technically possible?

Strobing ideally hides whatever minimal ghosting happens whenever a pixel color changes, and only shows the finished state. And that happens at the screen refresh rate, or at G-Sync's rate respectively.
 

Dawg

Member
Some comments:

(1) Static photography does not accurately capture display motion blur. You need a pursuit camera.
HOWTO: Using a Pursuit Camera with TestUFO Motion Tests
For example, you can only photographically capture the checkerboard pattern at http://www.testufo.com/eyetracking#pattern=checkerboard if you pan the camera along with the UFO while taking a picture.

(2) G-Sync definitely does decrease motion blur, but not to LightBoost levels (for locked 120fps@120Hz motion, for panning similar to www.testufo.com/photo fluidity)
G-Sync is better for variable frame rate situations, but if you've got a Titan and max out at 144fps@144Hz, it still has more motion blur than LightBoost 120fps@120Hz.

Also, it is possible they may have already come up with a method of variable-rate strobing, but I'd like to hear a confirmation about this. Also, is this station running G-Sync, or running LightBoost? John Carmack has mentioned that they've enhanced LightBoost in the ASUS G-Sync upgrade board, but he says it's an "either-or" option.

I see, thanks!
 

2San

Member
No... It depends on the human.
For me specifically, it actually reduces eyestrain, because I get more headaches from fast gaming motion blur than from strobing. Eliminating the motion blur fixes it for me. Others such as CallSignVega agree. Also, Eizo's FDF2405W brochure says the 240Hz mode (strobe mode) can reduce eyestrain during map panning, so the claim has solid credibility. But it is a tradeoff if you are more sensitive to flicker than to motion blur. (I'm VERY sensitive to motion blur.) Fortunately, ToastyX Strobelight makes it easy to turn off, so you can turn it on only during games.

LightBoost is good for people who are used to CRTs. If you never got CRT eyestrain, then it's easy to get used to LightBoost, since LightBoost behaves like a 120Hz CRT. However, it's not for people who get headaches with plasmas/CRTs.

Adjusting strobe brightness downwards and increasing ambient lighting in the room helps greatly -- common recommendations from back in the CRT tube days, too.
Hmm I don't recall having problems with CRT when I was younger. I haven't owned a CRT screen in a while though.

Yeah switching lightboost on and off is pretty straightforward. I'll give it a go again with your additional advice after my exams, thanks.
 
For those saying "most people won't notice this", it's the same argument as "30/60 FPS are the same". That some can't notice the difference doesn't mean that "most" other people won't, consciously or otherwise; if that were the case, all TVs and monitors would have a 30 Hz refresh rate and all games would be locked at 30 FPS. In fact, any argument you make regarding this can be substituted with 30/60 FPS to see how ridiculous it is.
 

Dawg

Member
My parents and I actually owned a CRT TV till about 4 years ago.

The upgrade from CRT to LCD was so big. Colors, HD and all was fantastic, but the motion blur was such a huge change. Still hard to adapt to it.
 

Datschge

Member
It's nice, but I do not see this as a huge improvement compared to a triple-buffered and V-synced image. The average person is not going to notice a significant difference.

The difference would be way bigger on the lower end (slower GPUs, lower framerates), and there I'd disagree that average people are not going to see a significant difference. Alas this technology is restricted to high end only for now...
 

Rooster

Member
Can anyone explain exactly why this needs an Nvidia (and a specific series onwards at that) card to function? It's built into the monitor so will only be receiving what the graphics card sends over HDMI/DVI/Displayport right? Is there something that's been built into the hardware of these cards for a while now that tells the GSync hardware what to do? Or is it just dependent on driver support, and Nvidia no longer supports driver updates on these older cards?

If it's hardware based then AMD users are screwed unless a licensing agreement is reached.

If it's driver based, it's similar on the licensing front, but there is the possibility of third-party mods.

If the Gsync module is doing things on its own without special signals from the card/drivers, then there is no reason AMD cards (or even games consoles?) wouldn't work with Gsync monitors. This would be good for everyone. Unlikely this will be the case.
 
Can anyone explain exactly why this needs an Nvidia (and a specific series onwards at that) card to function? It's built into the monitor so will only be receiving what the graphics card sends over HDMI/DVI/Displayport right? Is there something that's been built into the hardware of these cards for a while now that tells the GSync hardware what to do? Or is it just dependent on driver support, and Nvidia no longer supports driver updates on these older cards?

If it's hardware based then AMD users are screwed unless a licensing agreement is reached.

If it's driver based, it's similar on the licensing front, but there is the possibility of third-party mods.

If the Gsync module is doing things on its own without special signals from the card/drivers, then there is no reason AMD cards (or even games consoles?) wouldn't work with Gsync monitors. This would be good for everyone. Unlikely this will be the case.

nvidia said their cards indeed have special HW in them which is used to make G-Sync work, which is why the feature isn't supported by, e.g., older nvidia cards.
 

Datschge

Member
Isn't there a reason why monitors can't just change their refresh rate to whatever? I mean, maybe this is just nVidia being confusing by using terms like MHz, but they're making it sound like they can change the monitor's Hz to any figure they need to match the frame rate, and for some reason I just don't think that's the case.

The reason is the display controller in monitors (the thing that added all the superfluous and lag-inducing image post-processing), which nowadays is only prepared for the normal use case. That's why Nvidia needs to mod existing monitors or sell specifically licensed monitors which are prepared to receive irregular frames and show them in an irregular pattern. Ideally the video protocol and display controllers would evolve into something that can offer this kind of access/use for all possible vendor-independent hardware combinations (think e.g. true 24/48Hz movie playback).
 

Durante

Member
Can anyone explain exactly why this needs an Nvidia (and a specific series onwards at that) card to function? It's built into the monitor so will only be receiving what the graphics card sends over HDMI/DVI/Displayport right?
Because what is sent over the Displayport cable for G-sync is not a normal Displayport signal.
 

mdrejhon

Member
Because what is sent over the Displayport cable for G-sync is not a normal Displayport signal.
I suspect it's simply a variable-length vertical blanking interval, possibly with additional signalling added within. I have a good understanding of front porch, back porch, and sync pixels, in both the horizontal and vertical dimensions. DisplayPort still maintains the legacy blanking intervals. (You can witness this by playing with the nVidia Custom Resolution utility, or ToastyX Custom Resolution Utility, or even the very old PowerStrip utility popular in the late 90's/early 00's.)

The pixel dotclock runs at a constant rate (equivalent to 144Hz scanout), to keep frame transmission times very quick. (Future faster dotclocks could be done, such as 240Hz or 480Hz, later this decade.)
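A rough illustration of how that could work (the timing numbers below are only plausible placeholders of mine, not actual G-Sync values): the panel is always scanned out at full speed, and the effective refresh rate comes from how long the link idles in vertical blanking before the next frame.

# Rough numbers only: assumed 1080p-class timings, not actual G-Sync timings.
PIXEL_CLOCK_HZ = 330_000_000   # assumed fixed dotclock
H_TOTAL = 2080                 # 1920 active pixels + assumed horizontal blanking
V_ACTIVE = 1080
BASE_VBLANK = 22               # assumed minimum vertical blanking lines

def effective_refresh_hz(extra_vblank_lines):
    # Scanout speed never changes; only the idle time in VBLANK grows.
    v_total = V_ACTIVE + BASE_VBLANK + extra_vblank_lines
    return PIXEL_CLOCK_HZ / (H_TOTAL * v_total)

print(round(effective_refresh_hz(0), 1), "Hz with minimum blanking (the ~144Hz ceiling)")
print(round(effective_refresh_hz(1000), 1), "Hz after holding ~1000 extra blank lines")
print(round(effective_refresh_hz(4200), 1), "Hz after a long hold (down toward ~30Hz)")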

Old CRTs (that didn't maintain sync memories) could tolerate a slowly-varying blanking interval, so you could gradually increase and decrease refresh rates on them continuously in small increments (e.g. moving from 50Hz through 100Hz). If you changed the blanking interval too fast, it would go out of sync. Different CRTs had different sensitivities. This was clearly demonstrated by adjusting the refresh rate in Entech Taiwan's PowerStrip utility; on certain CRTs, the image stayed perfectly stable while you changed the refresh rate! But this never had applications until today. In practice, it was only done on vector displays, the 1980's variable-refresh-rate displays -- they slowed down when you had more vectors!

I used to design home theater equipment for a living, so I know all of this. I owned and maintained a CRT projector (an NEC XG135, on a 92" screen) long before digital projectors became good.
 

Datschge

Member
Can anyone explain exactly why this needs an Nvidia (and a specific series onwards at that) card to function? It's built into the monitor so will only be receiving what the graphics card sends over HDMI/DVI/Displayport right?

To expand a little, today's GPUs always adapt their video output to the frequency the monitor is capable of (which you set in the monitor settings), regardless of how many frames they actually calculated. That means the video data already includes all the tearing, v-syncing etc. as it leaves the GPU. G-Sync essentially removes that last step and moves the final processing of the video data into the monitor. Only this allows the monitor to display frames at irregular, variable frequencies. Whether the GPU's display output can be adapted to such an unprecedented use case after the fact depends on its programmability; until now, the target resolutions and frequencies were highly standardized.

So in a simplified model there are three parts to it: the GPU must be able to send frames as soon as they are rendered, the video transport protocol needs to be capable of delivering them (packet-based DisplayPort can be abused for it already, HDMI cannot), and the display needs both to display frames as they arrive and to cache frames for repeated display in case the next frame takes too long.
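As a toy model of that monitor-side behaviour (not NVIDIA's actual logic; the ~30Hz self-refresh floor and the function name are my assumptions):

MAX_HOLD_MS = 1000 / 30   # assumed ~30Hz floor before the panel must be refreshed again

def display_events(frame_arrivals_ms):
    # New-frame scans, plus cached re-scans whenever the GPU stalls too long.
    events, last_draw = [], 0.0
    for arrival in frame_arrivals_ms:
        # If the next frame is late, keep re-scanning the cached frame.
        while arrival - last_draw > MAX_HOLD_MS:
            last_draw += MAX_HOLD_MS
            events.append((round(last_draw, 1), "re-scan cached frame"))
        events.append((arrival, "new frame"))
        last_draw = arrival
    return events

# Irregular GPU output: a frame at 25ms, a long stall, then frames at 115ms and 135ms.
print(display_events([25.0, 115.0, 135.0]))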
 

Durante

Member
I suspect it's simply a variable-length vertical blanking interval, possibly with additional signalling added within. I have a good understanding of front porch, back porch, and sync pixels, in both the horizontal and vertical dimensions. DisplayPort still maintains the legacy blanking intervals. (You can witness this by playing with the nVidia Custom Resolution utility, or ToastyX Custom Resolution Utility, or even the very old PowerStrip utility popular in the late 90's/early 00's.)

The pixel dotclock runs at a constant rate (equivalent to 144Hz scanout), to keep frame transmission times very quick. (Future faster dotclocks could be done, such as 240Hz or 480Hz, later this decade.)
This makes sense. And I do remember tweaking front/back porch and other signalling parameters to get the most out of my 22" CRT in powerstrip way back when. On a Matrox GPU. Those were the times.

If this is true, then the latency of G-sync should basically always be the best-case latency of 144 Hz without V-sync. Pretty awesome.
 

mdrejhon

Member
This makes sense. And I do remember tweaking front/back porch and other signalling parameters to get the most out of my 22" CRT in powerstrip way back when. On a Matrox GPU. Those were the times.

If this is true, then the latency of G-sync should basically always be the best-case latency of 144 Hz without V-sync. Pretty awesome.
Exactly. Until frames start getting delayed by frame transmission times. (e.g. trying to do more than 144fps). Fortunately G-Sync can gain higher refresh rate caps -- on tomorrow's 200Hz, 240Hz, 500Hz, etc monitors.

I'd love to see manufacturers come out with a 1000fps@1000Hz capable G-Sync display, so we can get perfect flicker-free CRT motion clarity. Until then, it's scientifically impossible to get CRT motion clarity without interpolation, unless you use some form of strobing (which produces flicker under high speed camera).
Michael Abrash of Valve Software would like that 1000Hz display already; he mentioned it on his Valve Software blog:
http://blogs.valvesoftware.com/abrash/down-the-vr-rabbit-hole-fixing-judder

Before responding ("1000fps? wut"): check your scientific homework first, such as at least "Why Do Some OLEDs Have Motion Blur?" or the dozens of scientific papers already created by display companies (some linked at the bottom of that article). Human eyes can't count the frames, but human eyes DO see the motion blur side effects and wagonwheel/mouse-dropping effects of finite-framerate displays, and problems still occur all the way to 1000Hz non-strobed.

Study this statement: "One millisecond of visible static frame length still creates 1 pixel of motion blur during 1000 pixels/second panning motion." That means roughly 4 pixels of motion blur during fast one-screen-width-per-second panning on a non-strobed 4K display (3840 pixels/sec ≈ 3.8 pixels of motion blurring even on a non-strobed 1000fps@1000Hz display, alas!). Finite framerates are going to be a motion blur bottleneck for a long time. As you continuously track moving objects on a flicker-free screen, your eyes are in a different position at the end of a refresh than at the beginning of a refresh. That blurs the refresh across your eyes. An educational animation that demos this is www.testufo.com/eyetracking.

Real life isn't composed of discrete frames. On flicker-free, non-strobed display technologies, ultrahigh discrete frame rates (even with instant 0ms GtG pixel transitions) will still create human-noticeable motion blur, so flicker-free 1000fps@1000Hz non-strobed still has more motion blur than real life. It also creates mouse-dropping effects (wave a mouse in a circle), and the wagonwheel/stroboscopic effect still remains. You could add GPU-artificial motion blur effects to hide these limitations, but a lot of us hate that.

Until we successfully achieve the 1000fps@1000Hz display (or well beyond), we aren't going to achieve Holodeck perfection, where 100% of motion blur comes from human brain limitations and is not enforced upon your eyes by the display persistence (sample-and-hold effect) of finite-framerate non-strobed displays. We're stuck with strobing as a solution to motion blur for a long time; it's much easier than four-digit refresh rates/frame rates. We dream of Holodeck perfection, and discrete framerates have got to disappear sometime this century.
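The 4K arithmetic above, written out (illustrative only; it just restates the persistence rule of thumb):

def sample_and_hold_blur_px(refresh_hz, speed_px_per_s):
    # Persistence of one full refresh (sample-and-hold) times eye-tracking speed.
    return (1.0 / refresh_hz) * speed_px_per_s

# One screen width per second on a 3840-pixel-wide panel:
for hz in (60, 120, 1000):
    print(hz, "Hz non-strobed:", round(sample_and_hold_blur_px(hz, 3840), 1), "px of blur")
# 1000Hz still leaves ~3.8 px, matching the figure above.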
 

mdrejhon

Member
Hmm I don't recall having problems with CRT when I was younger. I haven't owned a CRT screen in a while though.

Yeah switching lightboost on and off is pretty straightforward. I'll give it a go again with your additional advice after my exams, thanks.
Also, other possible causes of LightBoost discomfort (other than an intolerance to 120Hz CRT's):

-- LED spectrum. It's not always very eye-friendly (LED can be superior or inferior to CCFL or CRT phosphor)

-- LightBoost stutters. LightBoost eliminates so much motion blur that stutters/tearing become easier to see without the veil of motion blur. (That makes it an ideal candidate to be combined with G-Sync!) LightBoost only looks very pleasing at triple-digit framerates, since it behaves like a CRT forced to 120Hz. If you can only do 60fps@120Hz, LightBoost begins to create ugly double-image effects similar to 30fps@60Hz on a CRT. (See 30fps vs 60fps at www.testufo.com -- especially on CRT/LightBoost versus regular LCD -- it becomes very obvious that LightBoost motion looks pleasant only when framerate=Hz.) Framerate/refresh-rate mismatches with LightBoost make stutters very visible, and can be uncomfortable to the eyes.

People who loved 120Hz CRTs are almost always able to get used to LightBoost, as it behaves like a 120Hz CRT otherwise. People who never got headaches on CRT occasionally get temporary headaches when first switching back from an LCD to a CRT, and then get used to it again (fix lighting, don't use too much brightness, play for short periods at first, don't sit too close to the screen, etc). Back in the day, you never sat close to very large, superbright CRTs, which helped mitigate CRT headaches. Common sense applies here. However, not everyone likes CRT, and I can understand...

Hopefully, LightBoost + G-Sync can be done with at least one of the upcoming G-Sync monitors (using a good variable-strobe-rate algorithm that goes PWM-free at lower refresh rates to prevent flicker).
And, additionally, I'd love to see G-Sync at 240fps; frame transmission time to the monitor becomes only ~4ms! (DisplayPort 2.0 can do 240Hz at 1080p via 2-channel mode, hint, hint).
 