
G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

Septimius

Junior Member
I get what that means. I also get that this wouldn't have happened without Nvidia.

Could you explain why? Right now, I'm imagining a signal from the GPU, being the message "frame ready" or however it should be worded, which then triggers a refresh of the screen. Aside from needing the ability to hold back a new refresh cycle, is there something I'm missing? I don't want to come in here all "herp derp, this should be easy", I'm just imagining that if you built your own screen and card, this would be a straightforward enough thing to do. Just the other day I was pondering fancy ways to eliminate the tearing problem, since Dark Souls is doing a thing where it's at a stable 60 fps but somehow gets out of sync and half the frame is torn. But I was imagining that when some tech like this came out, it would be more advanced than what this seems to be. So it must be something I'm missing.
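To make that concrete, here's a little Python toy I'd sketch it as (all of the numbers and the structure are made up by me, just to illustrate the "frame ready" handshake I'm picturing, not how the real hardware works):

import random

def simulate(frame_ready_driven, n_frames=5, refresh_ms=16.7):
    # Toy model: the GPU finishes frames at irregular times; the display
    # either waits for its next fixed tick or refreshes on "frame ready".
    t_gpu = 0.0        # time the GPU finishes each frame
    t_display = 0.0    # time of the last scan-out
    for _ in range(n_frames):
        t_gpu += random.uniform(12.0, 25.0)    # made-up variable render time
        if frame_ready_driven:
            t_display = t_gpu                  # refresh triggered by the GPU's signal
        else:
            t_display += refresh_ms            # wait at least until the next fixed tick...
            while t_display < t_gpu:           # ...or a later one if the frame wasn't ready
                t_display += refresh_ms
        print(f"frame ready at {t_gpu:6.1f} ms, shown at {t_display:6.1f} ms "
              f"(waited {t_display - t_gpu:5.1f} ms)")

simulate(frame_ready_driven=False)   # fixed refresh: frames wait for the next tick
simulate(frame_ready_driven=True)    # what I imagine the "frame ready" signal buys you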
 
This is huge, but proprietary of course. I hope this at least does better than PhysX did. :/

That said, it's not worth $300 of higher GPU costs and a monitor mod, so I'll stay on the red side of the fence. Tearing doesn't bother me anyway. Hate Vsync with a passion, but tearing is okay by me.
 

Durante

Member
Not sure you're able to answer this, but did this force some of those companies to make monitors they otherwise wouldn't have made? Especially given that a lot of the 4K monitors/TVs we've seen so far are 30 Hz, with some 60 Hz, but if I understand correctly G-Sync will be able to make even 4K monitors go up to 144 Hz.

We haven't got a lot of 120 Hz monitors that are above 1080p, so this will be a huge jump.
We haven't got any (consumer) monitor above 1080p that is officially over 60 Hz, so that alone is a huge jump.

Of course, many of the benefits of high refresh rates (lower input lag, more frame slots reducing stuttering) are made obsolete by G-sync.

Could you explain why? Right now, I'm imagining a signal from the GPU, being the message "frame ready" or however it should be worded, which then triggers a refresh of the screen. Aside from needing the ability to hold back a new refresh cycle, is there something I'm missing?
I'm not saying it wouldn't have happened for technical reasons; technically, it could have happened at any point in the past decade. I'm saying that it took a large company making a truly innovative hardware product (which requires the cooperation of even more companies) primarily for enthusiasts, and there aren't many that do such things.
 

Datschge

Member
I'm thinking again and I still don't see it. When a content format fails on the market, at some point no further content will be released for it. If I have a G-sync monitor, it will always provide G-sync functionality, independent of how many other such monitors are sold.

Why do you count Betamax as purely a content format, but not count a G-Sync-compatible monitor/GPU as a limited product selection? Of course, if you have a G-Sync-compatible monitor and GPU you get all the advantages, just like with any other licensed Nvidia feature while a compatible Nvidia card is installed. Same with Betamax: while VHS recordings cut your SD quality in half, Betamax recordings of your own preserved it in a superior (though still analogue) way. You profit from it while you have it. But general adoption of the higher standard didn't spread; quite the contrary, actually.
 

coldfoot

Banned
This is such a great idea and such a simple concept at the same time. I wonder why it wasn't implemented in HDMI/DP in the first place. Digital displays should never need to refresh every x milliseconds; frame refresh should be triggered by the input source. Maybe HDMI 2.1 will implement it? I think this stands a far better chance of mass-market acceptance as part of the HDMI standard than as an Nvidia exclusive. It would also give people a reason to replace their perfectly working TVs and monitors, so the CE industry would love this as well.
NV should stop their proprietary BS and just push to make it part of the next HDMI standard.
 

BPoole

Member
This sounds pretty awesome, but of course Nvidia makes it proprietary. I also have one of those Yamakasi Catleap 1440p monitors, so getting this would be way too pricey for me, since I would have to buy a different 1440p monitor plus an Nvidia GPU.
 

TronLight

Everybody is Mikkelsexual
This sounds incredible. I'd love to have one; it must be a godsend for those games like Bioshock or Batman that have massive amounts of stuttering.
It must make your life easier if you can't hold 60 fps in a game, too.

I just wish it could be compatible with every monitor.
 
This sounds incredible. I'd love to have one; it must be a godsend for those games like Bioshock or Batman that have massive amounts of stuttering.
It must make your life easier if you can't hold 60 fps in a game, too.

Only if they don't have simple solutions, like Batman, where you just set it to DX9 and turn off the useless DX11 and Nvidia features crammed into it.
 
Add these quotes to the OP

"The huge gains in GPU rendering power over the past decade have enabled developers and artists to create increasingly complex 3D scenes and worlds. But even on the highest end PC, the illusion of reality is hampered by tearing and stutter. NVIDIA G-SYNC elegantly solves this longstanding problem. Images on a G-SYNC display are stunningly stable and lifelike. G-SYNC literally makes everything look better."
-- Tim Sweeney, founder, EPIC Games

"NVIDIA's G-SYNC technology is a truly innovative solution to an ancient legacy restriction with computer graphics, and it enables one to finally see perfect tear-free pictures with the absolute lowest latency possible. The resulting output really allows your mind to interpret and see it as a true continuous moving picture which looks and feels fantastic. It's something that has to be seen to be believed!"
-- Johan Andersson, technical director, DICE

"With G-SYNC, you can finally have your cake and eat it too -- make every bit of the GPU power at your disposal contribute to a significantly better visual experience without the drawbacks of tear and stutter."
-- John Carmack, co-founder, id Software
 

Septimius

Junior Member
I'm not saying it wouldn't have happened for technical reasons; technically, it could have happened at any point in the past decade. I'm saying that it took a large company making a truly innovative hardware product (which requires the cooperation of even more companies) primarily for enthusiasts, and there aren't many that do such things.

Yeah, I can see that. But is my assessment in the ballpark of what it is? We really are lagging, even for film content, when we have yet to sync our monitors with variable-fps sources, so I'm stoked about the news. But that this is years and years of work by more than a hundred engineers? I'm just imagining Nvidia sitting down with a screen manufacturer, saying "could you hold back the refresh of the screen until a message comes through?", and not knowing the pin-out of a DVI plug, I'd almost imagine there would be an available pin for this, too.

Like, what does the huge die on the card do? The PCB doesn't look that dense. I'm just contemplating how it's solved :)
 

Ahmed360

Member
I genuinely hate screen tearing!

Well Done Nvidia!

Now please implement it into Laptops (I prefer laptops over desktops), and then...TAKE MY MONEY!
 

Elsolar

Member
Sooo, what makes this better than VSync+Triple Buffering+hardware mouse input? With those 3 I can already get tight input, no tearing, and non-standard framerates together. What makes this worth the extra investment?
 

Septimius

Junior Member
Sooo, what makes this better than VSync+Triple Buffering+hardware mouse input? With those 3 I can already get tight input, no tearing, and non-standard framerates together. What makes this worth the extra investment?

Doesn't that hold back several frames, and as such, create a visual delay of x/60ths of a second?
 
@AndyBNV

What does this mean for Lightboost monitors and specifically the ToastyX implementation? How does G-Sync compare to Lightboost strobing? Do they work together, like with 3D Vision?

The VG248QE is one of the top monitors in this category. However, Lightboost uses DVI for a connection, and last time I checked, DisplayPort does not support this feature.
 
The chances of being able to hack the firmware of this device to speak to other monitors and GPUs are pretty high. Coupled with Lightboost, we could well see a paradigm shift in display technology in the next five years, when UHD, Lightboost and G-Sync-like tech is incorporated into monitors, televisions, phones and tablets.

There is nothing stopping other companies from implementing this, and the first company to implement it to work with any display and GPU will make a killing selling it as a licence. Nvidia should think big picture with this.

How will movies look on this? Will this eliminate pulldown and jitter? Will 3D gaming suddenly have a renaissance due to a lower likelihood of crosstalk? The only thing I worry about is what happens when a game goes below 24 fps. What will it look like? A slideshow? A flip book?
 

Durante

Member
Sooo, what makes this better than VSync+Triple Buffering+hardware mouse input? With those 3 I can already get tight input, no tearing, and non-standard framerates together. What makes this worth the extra investment?
Look at the pictures in the OP.
 

Arulan

Member
Sooo, what makes this better than VSync+Triple Buffering+hardware mouse input? With those 3 I can already get tight input, no tearing, and non-standard framerates together. What makes this worth the extra investment?

V-sync creates input lag depending on the "actual frame rate" of the renderer. This is a quote from an older Anandtech article, but it should give you an idea of best case input lag scenarios:

Here's a better break down of what we recommend:

Above 60FPS:

triple buffering > no vsync > vsync > flip queue (any at all)

Always EXACTLY 60 FPS (very unlikely to be perfectly consistent naturally)

triple buffering == vsync == flip queue (1 frame) > no vsync

Below 60FPS (non-twitch shooter):

triple buffering == flip queue (1 frame) > no vsync > vsync

Below 60FPS (twitch shooter or where lag is a big issue):

no vsync >= triple buffering == flip queue (1 frame) > vsync

http://www.anandtech.com/show/2803 (The actual quote is in the comments section, page 4)

Despite this, even in the best-case scenario with V-sync, where your frame rate is in the ideal range, with D3DOverrider-enforced triple buffering, and on a 120 Hz+ display, input lag is still present. It may be considered "good enough" for a lot of people, but personally I prefer "No V-sync" because of it. Screen tearing on a 120 Hz monitor is also less noticeable, and I believe less frequent as well. Currently this is my preferred setup, but they're claiming G-sync has the same, or even lower, input lag than "No V-sync" with all the benefits of no screen tearing, pulldown judder, etc. This is quite amazing.
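To put a rough number on just the waiting that fixed refresh slots add (a little Python sketch of my own, not a benchmark; it only looks at the geometry of a 60 Hz interval and ignores buffering and engine latency):

REFRESH_HZ = 60
interval_ms = 1000.0 / REFRESH_HZ

# frames finishing at evenly spaced points inside one refresh interval
finish_times = [i / 1000.0 * interval_ms for i in range(1000)]
waits = [interval_ms - t for t in finish_times]       # time until the next tick

print(f"refresh interval:            {interval_ms:5.2f} ms")
print(f"average wait for next tick:  {sum(waits) / len(waits):5.2f} ms")
print(f"worst-case wait:             {max(waits):5.2f} ms")
print("wait with per-frame refresh:  ~0 ms (scan-out starts when the frame is ready)")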
 
@AndyBNV

What does this mean for Lightboost monitors and specifically the ToastyX implementation? How does G-Sync compare to Lightboost strobing? Do they work together, like with 3D Vision?

The VG248QE is one of the top monitors in this category. However, Lightboost uses DVI for a connection, and last time I checked, DisplayPort does not support this feature.

Presumably they would work together. The primary reason strobing works is that the image isn't continuously backlit, so with G-Sync the monitor could strobe at the exact time a new frame is displayed. Possibly even eliminating the need for high frame rates to match a 120 Hz refresh rate.

Also, DisplayPort works for Lightboost, but again I would not recommend the VG248QE, as it gave me significant headaches any time I was using it with Lightboost.

Look at the pictures in the OP.
The pictures don't show the case where it's running at a high stable frame rate.
 

Miguel81

Member
Sooo, what makes this better than VSync+Triple Buffering+hardware mouse input? With those 3 I can already get tight input, no tearing, and non-standard framerates together. What makes this worth the extra investment?

I get terrible lag with triple buffering; I don't know how gamers can stand it.
 

Ocho

Member
It's still more "coarse" than what G-Sync does. At 144 Hz, each frame is going to be onscreen for some multiple of 1/144th of a second, or 6.944 ms. If it takes 7.1 ms to render a new frame, for example, whatever's on the screen has to stay there for a whole 13.89 ms, because the new frame missed its chance and has to wait for the display to refresh again. With G-Sync, you can keep something onscreen for exactly 7.1 ms and refresh as soon as a new frame is ready.

Oh ok, got it! Thanks.
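For anyone who wants the numbers spelled out, here's a tiny Python sketch that just redoes the arithmetic from the quote above (the 144 Hz slot length and the 7.1 ms render time are the quoted example's own figures):

import math

refresh_hz = 144
slot_ms = 1000.0 / refresh_hz                  # ≈ 6.944 ms per refresh slot
render_ms = 7.1                                # example render time from the quote

slots_needed = math.ceil(render_ms / slot_ms)  # slots the old frame has to stay up
print(f"slot length:           {slot_ms:.3f} ms")
print(f"fixed 144 Hz refresh:  old frame held for {slots_needed * slot_ms:.2f} ms")
print(f"G-Sync-style refresh:  old frame held for {render_ms:.2f} ms")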
 

Datschge

Member
You know what's funny: this solution could fix so many things at the lower end. For example, finally displaying movies at their native 24/48 Hz, or allowing weaker GPUs to push frames as they finish them. Imagine weaker-than-desktop portable systems where the display by default is set to reflect the actual fill rate. The higher-end the GPU and the higher the display's refresh rate, the relatively smaller the benefit of the technology (ironically, AMD with their far more unstable framerates would profit more from it than Nvidia does).
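For instance, the 24 fps film case in numbers (a rough Python sketch of my own; 3:2 pulldown is the usual way 24 fps film is mapped onto a fixed 60 Hz display):

tick_ms = 1000.0 / 60                          # one 60 Hz refresh ≈ 16.67 ms

# 3:2 pulldown: successive film frames get 3 ticks, then 2, then 3, ...
pulldown = [3, 2, 3, 2]
holds = [n * tick_ms for n in pulldown]

print("fixed 60 Hz (3:2 pulldown):", ", ".join(f"{h:.1f} ms" for h in holds))
print(f"per-frame refresh:          every frame held {1000.0 / 24:.1f} ms")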
 
My god. They need to get this in TVs too, could be the end of judder. The PAL/NTSC distinction could be no more. (If I understand this properly.)
 
Presumably they would work together. The primary reason strobing works is that the image isn't continuously backlit, so with G-Sync the monitor could strobe at the exact time a new frame is displayed. Possibly even eliminating the need for high frame rates to match a 120 Hz refresh rate.

Also, DisplayPort works for Lightboost, but again I would not recommend the VG248QE, as it gave me significant headaches any time I was using it with Lightboost.


The pictures don't show the case where it's running at a high stable frame rate.

Really? That sucks. I strobe mine constantly.

Have you tried switching between 100hz, 110hz, and 120hz? Maybe your brightness settings aren't ideal.
 

Miguel81

Member
Of the four partners so far, Philips is the only one that makes HDTVs, so there is a chance for Philips to do it, but hopefully Nvidia can get more partners who can build this into their HDTVs.

If they can get Samsung to implement it, it would be great. AMD would have to license this for their GPUs, though, for it to really kick off.
 

coldfoot

Banned
Combine this with Lightboost and the monitor would only turn off the backlight when the LCD pixels were being refreshed; you'd get a PERFECT image with pretty much no blur, even at 20 fps.
Simple code would be: wait for the new-frame marker, turn off the backlight and display the new frame, then turn the backlight back on 1-2 ms after the frame marker, once all the LCD pixels have responded to the new input.
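In pseudo-Python, something like the following (every function name here is hypothetical; wait_for_frame_marker(), scan_out_frame() and set_backlight() just stand in for whatever hooks the scaler firmware actually exposes):

import time

PIXEL_RESPONSE_MS = 2.0    # assumed worst-case LCD pixel response time

def strobed_variable_refresh_loop(display):
    while True:
        frame = display.wait_for_frame_marker()  # block until the GPU signals a new frame
        display.set_backlight(False)             # keep the pixel transition in the dark
        display.scan_out_frame(frame)            # drive the new frame onto the panel
        time.sleep(PIXEL_RESPONSE_MS / 1000.0)   # give the pixels time to settle
        display.set_backlight(True)              # flash the fully formed frame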
 

pottuvoi

Banned
The tech is very impressive.
Now we can actually watch The Hobbit at a proper 48 fps, if it ever becomes available in such a format for home users.

For gameplay..
Well, we can enjoy any framerate, not just constant frame rates that 'fit into a frame' or two...
 

Minsc

Gold Member
Great news, I really had no desire to upgrade my monitor, until now.

[Image: DSC_3611.JPG]


Oh my god.

This is big. I'm really OCD about stuff like blurry moving text.

AWESOME

Definitely needs to be quoted at least once for anyone who missed it (open in a new tab at 100% to see the difference).
 

AndyBNV

Nvidia
Just saw this:

fuck, that's almost the cost of buying the panel itself

Nvidia knowing we simply can't refuse

smh

We hope to get the cost down to $130.

Will the DIY kit be available before 2014?

We are working towards that goal.

@AndyBNV

What does this mean for Lightboost monitors and specifically the ToastyX implementation? How does G-Sync compare to Lightboost strobing? Do they work together, like with 3D Vision?

The VG248QE is one of the top monitors in this category. However, Lightboost uses DVI for a connection, and last time I checked, DisplayPort does not support this feature.

We have a superior, low-persistence mode that should outperform that unofficial implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date.
 

Dawg

Member
Just uploaded G-SYNC module shots to http://www.geforce.com/hardware/technology/g-sync/images



The kit is only compatible with the ASUS VG248QE at this time. I am asking engineering about 'unofficial' compatibility and kits for other monitors.

Let me know if you hear anything about the XL2411T from BenQ. G-SYNC is the only thing keeping me from buying it :p It's pretty much the European 120 Hz monitor of choice (the main competitor of the Asus VG248QE).
 

evlcookie

but ever so delicious
I guess this is what Kyle from HardOCP was hinting at when he said, "Wait until October before buying a new monitor."

Really interesting and impressive stuff.

Do wish this was a more open piece of tech, so every PC gamer, regardless of card choice, could enjoy it.

I do wonder if this would be a really good fit with an IPS monitor more so than a TN.
 

coldfoot

Banned
We hope to get the cost down to $130.
1. What happens to audio with this implementation? How do you maintain AV Sync?
2. Do you have any plans for a business model that will push the HDMI consortium to adopt this method and then you can license the tech to TV manufacturers?
 

AndyBNV

Nvidia
1. What happens to audio with this implementation? How do you maintain AV Sync?
2. Do you have any plans for a business model that will push the HDMI consortium to adopt this method and then you can license the tech to TV manufacturers?

1) That's a question for someone like TAP.
2) That's a question for Ujesh or Jensen.
 

CheesecakeRecipe

Stormy Grey
John Carmack ‏@ID_AA_Carmack 3m
G-Sync won't work on any of the display panels Oculus is considering right now, but it will probably be influencing many future panels.
 
Definitely needs to be quoted at least once for anyone who missed it (open in a new tab at 100% to see the difference).

This definitely looks like what the guys at http://www.blurbusters.com/ are getting. G-Sync is probably controlling the backlight to some degree to drastically decrease the motion blur on LCD displays.

Although the camera angle isn't perfectly straight and there could be some focus issues with that photograph.
 