
G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

Also, other possible causes of LightBoost discomfort (other than an intolerance to 120Hz CRT's):

-- LED spectrum. It's not always very eye-friendly (LED can be superior or inferior to CCFL or CRT phosphor)

-- LightBoost stutters. LightBoost eliminates so much motion blur that stutters/tearing become easier to see without the veil of motion blur. (That makes it an ideal candidate to be combined with G-Sync!) LightBoost only looks very pleasing at triple-digit framerates, since it behaves like a CRT forced to 120Hz. If you can only do 60fps@120Hz, LightBoost begins to create ugly double-image effects similar to 30fps@60Hz on a CRT. (See 30fps vs 60fps at www.testufo.com -- especially on CRT/LightBoost versus a regular LCD -- it becomes very obvious that LightBoost motion looks pleasant only when framerate = Hz.) Framerate/refresh-rate mismatches with LightBoost make stutters very visible, and can be uncomfortable to the eyes.
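
To put rough numbers on the double-image effect (back-of-envelope arithmetic; the 960 px/s panning speed is just an example, not an official figure): the eye tracks the moving object continuously, but each unique frame gets flashed more than once at the same pixel position, so the repeat flashes land a few pixels apart on the retina.

```python
# Back-of-envelope: duplicate-image separation on a strobed (LightBoost-style)
# display when the frame rate is below the strobe/refresh rate.  Assumes perfect
# eye tracking; each unique frame is flashed at the same pixel position while
# the eye keeps moving, so repeat flashes appear displaced on the retina.

def duplicate_image_separation_px(speed_px_per_sec, framerate, refresh_hz):
    """Distance between successive flashes of the SAME frame, in pixels."""
    if framerate >= refresh_hz:
        return 0.0                      # one flash per unique frame: no doubling
    return speed_px_per_sec / refresh_hz

speed = 960  # px/s, an example panning speed
for fps, hz in [(120, 120), (60, 120), (30, 60)]:
    sep = duplicate_image_separation_px(speed, fps, hz)
    print(f"{fps} fps @ {hz} Hz strobed/CRT -> duplicate images ~{sep:.0f} px apart")
```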

People who loved 120Hz CRTs are almost always able to get used to LightBoost, as it behaves like a 120Hz CRT otherwise. People who never got headaches on CRT occasionally get temporary headaches when first switching from an LCD back to a CRT, and then get used to it again (fix the lighting, don't use too much brightness, play for short periods at first, don't sit too close to the screen, etc.). Back in the old days, you never sat close to very huge, super-bright CRTs, and that helped mitigate the CRT headaches. Common sense applies here. However, not everyone likes CRT, and I can understand...

Hopefully, LightBoost + G-Sync can be done with at least one of the upcoming G-Sync monitors (using a good variable-strobe-rate algorithm that goes PWM-free at lower refresh rates to prevent flicker).
And, additionally, I'd love to see G-Sync at 240fps; frame transmission time to the monitor becomes only ~4ms! (DisplayPort 2.0 can do 240Hz at 1080p via 2-channel mode, hint, hint).
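
For reference, the transmission-time arithmetic is simple (back-of-envelope, 24 bits per pixel, ignoring blanking and link-encoding overhead):

```python
# Back-of-envelope frame transmission time and raw pixel data rate for 1080p.
# 24 bits per pixel, no blanking or link-encoding overhead included.

def frame_transmission_ms(refresh_hz):
    return 1000.0 / refresh_hz          # full link utilisation assumed

def raw_pixel_rate_gbps(width, height, refresh_hz, bpp=24):
    return width * height * refresh_hz * bpp / 1e9

for hz in (60, 120, 144, 240):
    print(f"{hz:>3} Hz: one frame in ~{frame_transmission_ms(hz):.2f} ms, "
          f"1080p needs ~{raw_pixel_rate_gbps(1920, 1080, hz):.1f} Gbit/s")
```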

Andy, send this commentary along with some of the other LightBoost stuff over to the engineering team.
 
Exactly. Until frames start getting delayed by frame transmission times (e.g. trying to do more than 144fps). Fortunately G-Sync can gain higher refresh rate caps -- on tomorrow's 200Hz, 240Hz, 500Hz, etc. monitors.
One interesting aspect is that with G-sync a monitor wouldn't necessarily have to be capable of displaying e.g. 400 different images per second to benefit from such a high transmission rate. The only thing that counts is how quickly one image can be transferred from the GPU to the monitor. Of course, we are talking about a difference of a few milliseconds compared to 144Hz here.
 
Giant abuse of DisplayPort 1.2. It fucks with the VBLANK periods on the fly, and the G-SYNC module allows the display to be driven on an as-needed basis.

Probably requires Kepler because 1.1a doesn't have the bandwidth to drive past 1080p @ 144Hz.
It's limited to 650Ti Boost and above; the rest of the DisplayPort 1.2-capable Kepler family doesn't get this.

The % of Kepler-based cards above 650Ti Boost. Isn't that what you were implying?

It's just ~$130. In a monitor, which you generally keep longer than most components. For enthusiasts, it's a drop in the bucket.

And now that it's finally on the table, it will probably trickle down eventually. Not that I care much about that, I wouldn't mind it remaining a high-end only feature either. Just as long as I can get it.

That's actually the greatest thing about this for me. It's incredibly rare these days that a positive hardware development takes me completely by surprise.
It's 175.

 
People who loved 120Hz CRTs are almost always able to get used to LightBoost, as it behaves like a 120Hz CRT otherwise. People who never got headaches on CRT occasionally get temporary headaches when first switching from an LCD back to a CRT, and then get used to it again (fix the lighting, don't use too much brightness, play for short periods at first, don't sit too close to the screen, etc.). Back in the old days, you never sat close to very huge, super-bright CRTs, and that helped mitigate the CRT headaches. Common sense applies here. However, not everyone likes CRT, and I can understand...
I haven't extensively tested this yet; while testufo did show an improvement with LightBoost, it wasn't worth the discomfort, so I just gave up. I do remember cranking up the brightness. I want to give this another shot, though, taking these factors into account.
 
Some comments:

(1) Static photography does not accurately capture display motion blur. You need a pursuit camera.
HOWTO: Using a Pursuit Camera with TestUFO Motion Tests
For example, you can only photographically capture the checkerboard pattern at http://www.testufo.com/eyetracking#pattern=checkerboard if you pan the camera along with the UFO while taking a picture.

(2) G-Sync definitely does decrease motion blur, but not to LightBoost levels (for locked 120fps@120Hz motion, for panning similar to www.testufo.com/photo fluidity).
G-Sync is better for variable frame rate situations, but if you've got a Titan and max out at 144fps@144Hz, it still has more motion blur than LightBoost 120fps@120Hz.
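
To put approximate numbers on that comparison (the persistence values below are assumptions for illustration, not measured panel specs): perceived blur for eye-tracked motion is roughly pixel speed times the time each frame stays lit, which is why a short strobe beats even a fast sample-and-hold refresh.

```python
# Rough eye-tracked motion blur: blur (px) ~= panning speed (px/s) * time the
# frame stays lit (s).  The persistence numbers below are assumptions chosen
# for illustration, not measured panel specs.

def motion_blur_px(speed_px_per_sec, persistence_ms):
    return speed_px_per_sec * persistence_ms / 1000.0

speed = 960  # px/s panning speed, e.g. a testufo-style test
cases = {
    "60 Hz sample-and-hold":                      1000 / 60,   # ~16.7 ms per frame
    "144 fps @ 144 Hz, G-Sync (sample-and-hold)": 1000 / 144,  # ~6.9 ms per frame
    "LightBoost 120 Hz (~2 ms strobe, assumed)":  2.0,
}
for name, persistence_ms in cases.items():
    print(f"{name:<45} ~{motion_blur_px(speed, persistence_ms):5.1f} px of blur")
```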

Also, it is possible they may have already come up with a method of variable-rate strobing, but I'd like to hear a confirmation about this. Also, is this station running G-Sync, or running LightBoost? John Carmack has mentioned that they've enhanced LightBoost in the ASUS G-Sync upgrade board, but he says it's an "either-or" option.
John Carmack @ID_AA_Carmack 10h
@GuerrillaDawg they didn't talk about it, but this includes an improved lightboost driver, but it is currently a choice -- gsync or flashed.

Alright, I stand corrected in my belief that this is going to be better than LightBoost in every way. But this tech does make LightBoost a lot less attractive, because LightBoost demands that your games run at a much higher frame rate, while G-Sync in the future can offer benefits even to people on a gaming notebook with an integrated GPU.

LightBoost does have you take a hit to colors. As good as the work you and Vega have done on addressing this, it was still a sacrifice I was barely willing to tolerate. I'm hoping G-Sync can be applied to IPS monitors that already exist, like the Catleap.

Regardless, it is very cool that people who are still interested in the LightBoost trick will benefit even more from G-Sync if they don't care as much about the caveats I mentioned.
 
175 is fine. 130 would be better. 100 would be fucking fantastic. At 100 I'd buy two more of them and two extra panels to swap out the VS247H-Ps that I'm currently using alongside the VG248QE.
 
This is what's wrong with the industry. A multinational company like Nvidia spends millions of dollars in R&D dreaming up some new outstanding technology and manufacturing it, and then they decide to keep it in-house and turn a profit from it? It's ludicrous. Give the tech to AMD and whoever else wants it. As a matter of fact, Nvidia should just stop making GPUs altogether and become a charity, spending all of their money to develop technologies for everyone else. For fuck's sake, Nvidia.
 
I read a lot about it and just watched the live demo. Let me see if I got it right:

So with this my GPU would be free from V-sync and a 30-45 FPS game would feel (lol) like 60FPS to me. Is that right?
 
Why does Nvidia always give me a reason to keep buying their GPUs every time I want to switch back to AMD? Lol.

This will be great in the future though.
 
I read a lot about it and just watched the live demo. Let me see if I got it right:

So with this my GPU would be free from V-sync and a 30-45 FPS game would feel (lol) like 60FPS to me. Is that right?
I doubt that 30-45 FPS would feel like 60 (though some who have seen it claimed that). However, any variable frame rate will feel much better than on any current display.

The more important part is that:
  • You will not get any tearing but also not get any additional input lag from traditional v-sync or triple buffering
  • There will be no judder associated with fitting a variable frame rate to a fixed display refresh rate
  • You will get the absolute minimum possible input lag from a frame being rendered to it being displayed, in any situation
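
To make the judder and latency difference concrete, here is a toy simulation (not NVIDIA's implementation; the frame times are made up) of when each finished frame actually reaches the screen on a fixed 60Hz v-synced display versus a display that refreshes the moment a frame is ready:

```python
import itertools
import math

# Toy model (not NVIDIA's implementation): when does a finished frame reach the
# screen?  With fixed-refresh v-sync it waits for the next refresh boundary;
# with a G-Sync-style display it is scanned out as soon as it is ready
# (minimum-refresh limits ignored).  Frame times below are made up.

REFRESH_MS = 1000.0 / 60.0

def vsync_display_times(frame_times_ms):
    t, shown = 0.0, []
    for ft in frame_times_ms:
        t += ft                                               # GPU finishes frame here
        shown.append(math.ceil(t / REFRESH_MS) * REFRESH_MS)  # wait for next refresh
    return shown

def gsync_display_times(frame_times_ms):
    return list(itertools.accumulate(frame_times_ms))         # displayed when ready

frames = [18.0, 21.0, 17.0, 25.0, 19.0]                       # hypothetical ~45-55 fps
for ready, vs in zip(gsync_display_times(frames), vsync_display_times(frames)):
    print(f"frame ready {ready:6.1f} ms | g-sync shows it at {ready:6.1f} ms | "
          f"v-sync shows it at {vs:6.1f} ms (+{vs - ready:4.1f} ms)")
```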
 
This is what's wrong with the industry. A multinational company like Nvidia spends millions of dollars in R&D dreaming up some new outstanding technology and manufacturing it, and then they decide to keep it in-house and turn a profit from it? It's ludicrous. Give the tech to AMD and whoever else wants it. As a matter of fact, Nvidia should just stop making GPUs altogether and become a charity, spending all of their money to develop technologies for everyone else. For fuck's sake, Nvidia.

Well, there's a decent argument to be had here to counter this. The gaming industry is not so large that it can afford to cut the throats of major players. Just the same, a company should not have a stranglehold on a game changing technology like G-sync just because they got there first. They should reap their first mover benefits, then open the technology up or at least license it for a minimal fee, because that's what is best for the industry in the long run. This is why Mantle won't be 'open' to Nvidia until AMD has used up its sales potential. The same goes for G-sync.

The whole idea is to break free from any one company dictating the future of gaming. All four guys on the panel expressed that desire. How terrible would it be if Valve solely owned the ability to digitally distribute games? The industry would never move forward. It's the same with tech that is important to advancing the art. Both companies placed different bets. It just so happens that G-Sync requires zero effort on the part of devs, and is flat out the more important technology. Sweeney and Carmack more or less implied that Mantle was not a good idea in a world where Microsoft is still heavily invested in gaming and won't relinquish DirectX. They were also excited about SteamOS which is a move by Valve to move gaming to neutral ground.

AMD and Nvidia have every right to reap the benefits of R&D. But similar to sharing tech in the automotive industry, companies should not sit on something that can help the industry as a whole.
 
This is what's wrong with the industry. A multinational company like Nvidia spends millions of dollars in R&D dreaming up some new outstanding technology and manufacturing it, and then they decide to keep it in-house and turn a profit from it? It's ludicrous. Give the tech to AMD and whoever else wants it. As a matter of fact, Nvidia should just stop making GPUs altogether and become a charity, spending all of their money to develop technologies for everyone else. For fuck's sake, Nvidia.

People aren't looking for Nvidia not to turn a profit from it, they're looking for Nvidia to charge AMD ten or twenty bucks a card for it for the next 20 years rather than just boosting their turnover for a year or two until someone else creates an incompatible alternative and -does- license it out.
 
The difference would be way bigger on the lower end (slower GPUs, lower framerates), and there I'd disagree that average people are not going to see a significant difference. Alas, this technology is restricted to the high end only for now...
On the lower end (games running below 30 fps or between 35 and 55 fps) the problem could more effectively be solved by investing $50 - $100 more in a graphics card, which not only solves most of the sync issue, but also gives you more frames per second.
 
On the lower end (games running below 30 fps or between 35 and 55 fps) the problem could more effectively be solved by investing $50 - $100 more in a graphics card, which not only solves most of the sync issue, but also gives you more frames per second.
That really depends -- sub-60 FPS is not necessarily always on the low end. I often play games at 45-60 FPS, but it's not for lack of (cheap) GPU power. It's because I value the IQ more than a constant 60 FPS. G-sync would allow me to push IQ even further without compromising fluidity and responsiveness too much.
 
Durante said:
• You will get the absolute minimum possible input lag from a frame being rendered to it being displayed, in any situation
IMO that's one thing that's being overstated - for the vast majority of software, the VSync period (16ms, or less if you're on a high-end monitor) is a small fraction of the input latency, and when a frame does stall on VBlank it's almost always by less than the whole period. Once you add network latency on top of that, the VBlank period is really just minuscule in comparison, so the impact on human reaction times is minimal.

I'd argue that stutter probably has more impact on your reaction speeds because it fucks with brain processing (not directly changing the input latency).
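
As a rough illustration of that point (all numbers below are illustrative assumptions, not measurements), here is what an input-to-photon budget might look like, with the average v-sync wait as just one slice of the chain:

```python
# Illustrative input-to-photon budget (all numbers are rough assumptions, not
# measurements) showing the average v-sync wait as one slice of a larger chain.

budget_ms = {
    "input sampling":                                1.0,
    "game simulation + render queue":               20.0,
    "average v-sync wait (60 Hz, ~half a refresh)":  8.3,
    "scanout + panel response":                      8.0,
    "network round trip (online play)":             40.0,
}
total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:<48} {ms:5.1f} ms  ({100 * ms / total:4.1f} %)")
print(f"{'total':<48} {total:5.1f} ms")
```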
 
I doubt that 30-45 FPS would feel like 60 (though some who have seen it claimed that). However, any variable frame rate will feel much better than on any current display.

The more important part is that:
  • You will not get any tearing but also not get any additional input lag from traditional v-sync or triple buffering
  • There will be no judder associated with fitting a variable frame rate to a fixed display refresh rate
  • You will get the absolute minimum possible input lag from a frame being rendered to it being displayed, in any situation

Since I began PC gaming, I've done everything to keep my FPS at 60. The first time I saw how fluid everything could be, I couldn't go back, even if that meant lowering my IQ. I never had a problem with input lag, though (never used the D3DOverrider thing), but like I said, I sacrificed everything.

This tech apparently solves that constant dilemma. I am still having a hard time believing a 45FPS, or even a 50, can go unnoticed, but the few videos I've seen show how clear the difference is.
Damn, this is too much. It fixes some of the Oculus Rift problems too.

I have to see it with my own eyes; it's too good to be true.

Edit: Reading the DF article, apparently there is some ghosting when it's below 60. I can accept that trade; this thing is mine.
 
IMO that's one thing that's being overstated - for the vast majority of software, the VSync period (16ms, or less if you're on a high-end monitor) is a small fraction of the input latency, and when a frame does stall on VBlank it's almost always by less than the whole period. Once you add network latency on top of that, the VBlank period is really just minuscule in comparison, so the impact on human reaction times is minimal.
This is true, particularly compared to network latency (though not every game has that). However, the best we can do is eliminate one source of latency at a time, and that's what G-sync does for V-sync stalls.

A bit of a silly article title (I'd rather call it an end to judder or forced 60 FPS locks, since most people I know always played with V-sync on), but sounds good otherwise.
 
I just realized that G-sync was Mark Rein's "most amazing thing". (From the tweet a while ago)

First time a vague tweet actually delivered for me!
 
I just realized that G-sync was Mark Rein's "most amazing thing". (From the tweet a while ago)

First time a vague tweet actually delivered for me!
Nvidia mentioned it right on their G-Sync page:
Mark, co-founder of Epic Games, was so enthused he couldn’t resist a teaser, telling his followers that he “saw the most amazing thing made by @nvidia. No, it's not a GPU, but gamers will love it.”
http://www.geforce.com/whats-new/ar...evolutionary-ultra-smooth-stutter-free-gaming
 
Is G-sync "useless" or a "buzzword" in any way shape or form? Can any current methods do a good enough job? Just wondering if people calling it as such have any good basis.
 
I'm really curious to see what the input lag situation is at 144Hz with this. Even at 120Hz or 144Hz, vsync adds a small, but noticeable amount of input lag. I'm not sure if this will address that or not.
 
Is G-sync "useless" or a "buzzword" in any way shape or form?
No, it's a genuine solution to a problem that has plagued gaming for a very long time now, and which could only be mitigated by various trade-offs previously.

How important the problems it solves are in the grand scheme of things is debatable, but it being a genuinely innovative and eminently useful technology is not -- just look at the glowing endorsements by industry luminaries such as Sweeney or Carmack.
 
Can any current methods do a good enough job?

To benefit from it you should be experiencing problems first. If you already have silky-smooth fps, you don't need it.

Getting a faster GPU so you don't have the "sometimes FPS drops to 30" problem is (imo) a much better option.

One point about G-Sync is rather strange: it doesn't work with some games (I can't quite imagine why).
 
Getting a faster GPU so you don't have the "sometimes FPS drops to 30" problem is (imo) a much better option.
It's not about dropping to 30 FPS. It's about being able to drop to 56 or 49 FPS for a while and having them feel like 56 or 49 FPS, rather than like 30 FPS -- or perhaps worse due to temporal judder! -- like on current technology.
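
A quick sketch of that temporal judder (simple arithmetic, nothing vendor-specific): a constant 49 fps source on a v-synced 60Hz display gets shown for alternating 1- and 2-refresh intervals, while a variable-refresh display keeps a steady ~20.4ms cadence.

```python
import math

# Simple arithmetic behind the judder point: a constant 49 fps source shown on
# a v-synced 60 Hz display lands on alternating 1- and 2-refresh intervals,
# while a variable-refresh display keeps a steady ~20.4 ms cadence.

def presented_intervals_ms(fps, refresh_hz, n_frames=12):
    refresh = 1000.0 / refresh_hz
    ready = [i * 1000.0 / fps for i in range(1, n_frames + 1)]
    shown = [math.ceil(t / refresh) * refresh for t in ready]   # wait for refresh
    return [round(b - a, 1) for a, b in zip(shown, shown[1:])]

print("49 fps, 60 Hz + v-sync  :", presented_intervals_ms(49, 60))
print("49 fps, variable refresh:", [round(1000 / 49, 1)] * 11)
```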

One point about G-Sync is rather strange: it doesn't work with some games (I can't quite imagine why).
I would assume this applies to those games which try to do their own frame pacing in unusual/weird ways (e.g. double v-sync, sleep timers for the V-blank period, or some such silliness). This is rather rare, but it happens from time to time (and often already causes other issues on some setups).
 
Durante said:
How important the problems it solves are in the grand scheme of things is debatable.
Well, it has the potential to disrupt the display market for interactive software, so I'd consider it potentially very important; but to achieve that, it will need to work with more than a subset of one display tech on a subset of one type of device.
 
I doubt that 30-45 FPS would feel like 60 (though some who have seen it claimed that). However, any variable frame rate will feel much better than on any current display.

The more important part is that:
  • You will not get any tearing but also not get any additional input lag from traditional v-sync or triple buffering
  • There will be no judder associated with fitting a variable frame rate to a fixed display refresh rate
  • You will get the absolute minimum possible input lag from a frame being rendered to it being displayed, in any situation

It'll be nice to finally have a definite way of playing Dead Space
 
You will get the absolute minimum possible input lag from a frame being rendered to it being displayed, in any situation.

Nitpicking here, but nah: the solution to micro-stuttering is exactly that, adding lag to the "faster" frames ("frame pacing").
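
For illustration, a minimal sketch of that kind of frame pacing (the timings are made up and this is not AMD's actual algorithm): frames that arrive too early are held back so presentation intervals come out even, which is exactly added latency on the fast frames.

```python
# Minimal frame-pacing sketch (made-up timings, not AMD's actual algorithm):
# frames that finish "too early" are held until their slot so presentation
# intervals come out even -- i.e. latency is deliberately added to fast frames.

def paced_present_times(ready_times_ms, target_interval_ms):
    shown, next_allowed = [], 0.0
    for t in ready_times_ms:
        s = max(t, next_allowed)        # hold an early frame until its slot
        shown.append(s)
        next_allowed = s + target_interval_ms
    return shown

ready = [0.0, 5.0, 33.0, 38.0, 66.0, 71.0]   # AFR-style: short gap, long gap
for r, s in zip(ready, paced_present_times(ready, 16.5)):
    print(f"frame ready {r:5.1f} ms -> presented {s:5.1f} ms (added lag {s - r:4.1f} ms)")
```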
 
Remij said:
Can't you just disable vsync in the game and force it through D3DOverrider and have 60fps vsync'd gameplay?
Yea I'm pretty sure it works - still have to do the same on DS3 even (actually most EA console->PC ports now that I think about it).
 
Yea I'm pretty sure it works - still have to do the same on DS3 even (actually most EA console->PC ports now that I think about it).
Yeah, it definitely works, though unless they did something funny this should still make it better (but what game wouldn't that be true for?)
 
Can't you just disable vsync in the game and force it through D3DOverrider and have 60fps vsync'd gameplay?

Of course, but then you get jittering and mouse lag. G-Sync will be the definitive solution :P. I knew about the D3DOverrider trick, fortunately. All those strobe lights demand it.

Didn't the ingame Vsync work in 2? Been a longass while since I played. I think I'm due for a replay of the first, actually :).
 
In the demo NVIDIA gave us a couple weeks ago my eyes felt like they were "seeing" the smoothness of 60hz all the way down to about 35-40fps. It was at that point I thought the NVIDIA guys were lying and made them turn on an out-of-demo framecounter like FRAPs. It was at precisely that point I went and grabbed every person at Respawn that I could find. So many jaws on the floor that day, lol.

This sounds fucking fantastic. It'll be so great to not have to worry about D3DOverrider, or V-sync lag ever again, just to rid myself of screen tearing. Hahahaha, the feel of 60fps@35-40fps, delicious.
 
Really happy with the announcements, and very happy that Nvidia is being pushed to be creative and to try to develop and improve their products.

With this, the Shield, and the Oculus Rift, I feel that PC gaming is at its best...
 
I hope this fixes the stuttering I get with Tomb Raider and Far Cry 3. I stopped playing those games halfway because I couldn't take it anymore.
 
But there is no 100% way to prove at this point that this will not add additional latency, right?
The latency mathematics is rather simple (to me, as an amateur vision researcher, being Chief Blur Buster). Generally, it will have less input lag than VSYNC ON, less input lag than LightBoost, but slightly more input lag than >144fps during VSYNC OFF (since G-Sync will cap out at 144fps), based on the mathematics I've successfully gleaned from the diagrams given already in nVidia's presentations.

The impression I'm getting is that G-Sync doesn't appear to add any framebuffer delays beyond existing electronics. It will still be real-time top-to-bottom scanout of the refresh (example high speed video, non-LightBoosted mode), just no longer forced to wait till the next scheduled refresh.
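
A rough version of that latency math (my own approximations for illustration, not Blur Busters' figures): the average extra wait between a frame finishing and its pixels starting to scan out, per sync mode. The top-to-bottom scanout time itself is common to all modes and left out.

```python
# Rough per-sync-mode "extra wait" between a frame finishing and its scanout
# starting.  Approximations for illustration, not Blur Busters' figures;
# the top-to-bottom scanout time itself is common to all modes and ignored.

def avg_added_delay_ms(mode, fps, max_hz=144):
    refresh = 1000.0 / max_hz
    if mode == "vsync_on":
        return refresh / 2.0      # on average, wait about half a refresh period
    if mode == "gsync":
        # scans out immediately, unless frames arrive faster than the panel's
        # max refresh, in which case it waits much like v-sync would
        return 0.0 if fps <= max_hz else refresh / 2.0
    if mode == "vsync_off":
        return 0.0                # new frame spliced into the current scanout (tearing)
    raise ValueError(mode)

for mode in ("vsync_on", "gsync", "vsync_off"):
    print(mode, [round(avg_added_delay_ms(mode, fps), 2) for fps in (60, 144, 300)])
```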
 
but slightly more input lag than >144fps during VSYNC OFF (since G-Sync will cap out at 144fps), based on the mathematics I've successfully gleaned from the diagrams given already in nVidia's presentations.

To clarify this a bit, shouldn't it be >*monitor's max refresh rate* during VSync off?

The current numbers are only 144Hz because that's the one monitor they have it implemented in thus far, but in the future it'll be whatever monitor a person is using.
 
Are they going to jack up the price of monitors with it built in? Not sure whether I should wait for a monitor with it built in or just grab one of the DIY kits. But then they mention crazy 4K models and ugh, so much money!

They said next year for the DIY kits, but are the built-in ones coming sooner?
 