G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

Dawg

Member
This definitely looks like what the guys at http://www.blurbusters.com/ are getting. G-Sync is probably controlling the backlight to some degree to drastically decrease motion blur on LCD displays.

Although the camera angle isn't perfectly straight, and there could be some focus issues with that photograph.

It's one of the biggest reasons I want G-Sync. The reduced motion blur is probably the most important factor for me.

No tearing and no stutter are fantastic too, though.
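
A rough way to see why the backlight matters: on a sample-and-hold LCD the eye tracks motion while each frame stays lit, so perceived blur is roughly tracking speed multiplied by how long the pixels stay illuminated. A minimal sketch of that arithmetic (the speed and persistence values below are illustrative assumptions, not measurements):

```python
# Back-of-envelope estimate of sample-and-hold motion blur. Assumes the
# eye tracks a moving object perfectly, so each frame smears across the
# retina for as long as the pixels stay lit.

def blur_width_px(velocity_px_per_s: float, persistence_ms: float) -> float:
    """Approximate width of the perceived blur trail, in pixels."""
    return velocity_px_per_s * (persistence_ms / 1000.0)

velocity = 960.0  # object crossing a 1920 px screen in ~2 s (assumed)

for label, persistence_ms in [
    ("60 Hz sample-and-hold", 16.7),
    ("120 Hz sample-and-hold", 8.3),
    ("~1.4 ms strobed backlight (LightBoost-like)", 1.4),
]:
    print(f"{label}: ~{blur_width_px(velocity, persistence_ms):.1f} px of blur")
```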
 

forrest

formerly nacire
Really awesome innovation. Unfortunately, it's another proprietary feature that I'll only adopt if Nvidia has the best bang for the buck card in my price range come build time. I do need a new monitor as well, so if pricing is fair and third party options are abundant, then maybe it will influence my card purchase. Either way I'll be watching with interest as I do hate tearing/stutter.
 

AmyS

Member
I haven't read much of this thread.

Can someone please explain to me how (or if) this will make frame rates more consistent (be it 30, 60, 120 etc) by having a variable refresh rate?
 

coldfoot

Banned
Another awesome feature of this technology is that a display device with G-Sync built in would be able to show your current frame rate on the screen; no need for capture equipment to measure it.
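
For intuition: since a G-SYNC panel refreshes exactly when the GPU delivers a frame, the instantaneous frame rate falls directly out of the refresh intervals. A minimal sketch of that idea (the timestamps are made up, and this is not any real monitor API):

```python
# Hypothetical refresh timestamps (in ms) as a G-SYNC scaler might see
# them; with variable refresh, each refresh IS a new frame, so the
# interval between refreshes gives the instantaneous frame rate.
refresh_timestamps_ms = [0.0, 16.9, 31.2, 52.0, 68.5, 80.1]

for prev, cur in zip(refresh_timestamps_ms, refresh_timestamps_ms[1:]):
    interval_ms = cur - prev
    print(f"frame held {interval_ms:5.1f} ms -> {1000.0 / interval_ms:5.1f} fps")
```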
 
We have a superior, low-persistence mode that should outperform that unofficial implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date.


Excellent!

 

Elsolar

Member
I get terrible lag with triple buffering; I don't know how gamers can stand it.

I've found that triple buffering is basically identical to disabled vsync in games with raw mouse input (CryEngine games, DICE games, and Blizzard games are some of the most popular examples of games with raw mouse input).
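
For intuition on why "triple buffering" can feel so different in different games: a FIFO render-ahead queue adds up to several refreshes of latency, while newest-frame-wins triple buffering behaves much more like vsync off. A back-of-envelope sketch under assumed timings (not any real driver's queueing policy):

```python
REFRESH_MS = 16.7  # 60 Hz panel (assumed)

# FIFO render-ahead "triple buffering": when the GPU outpaces the
# display, the queue stays full, so the frame being scanned out was
# rendered roughly queue_depth refreshes ago.
for depth in (1, 2, 3):
    print(f"FIFO queue depth {depth}: ~{depth * REFRESH_MS:.0f} ms input-to-photon")

# Newest-frame-wins triple buffering: the display always grabs the most
# recently completed frame, so the added latency stays below one refresh,
# which is why it can feel nearly identical to vsync off.
print(f"newest-frame-wins: < {REFRESH_MS:.0f} ms added latency")
```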
 

Dawg

Member
This was probably posted before, but I asked the Carmackinator how they will reduce blur and all with G-SYNC.

John Carmack @ID_AA_Carmack 3m
@GuerrillaDawg they didn't talk about it, but this includes an improved lightboost driver, but it is currently a choice -- gsync or flashed.
 

SuoGrey

Member
Holy crap, so essentially does this mean every game will have that silky-smooth feel of high fps even if, say, it's around 30 fps? :O
 
It's one of the biggest reasons I want G-Sync. The reduced motion blur is probably the most important factor for me.

No tearing and no stutter are fantastic too, though.

Yep, this definitely explains why people would be incredibly excited after seeing it. It's incredible how much motion blur is introduced by a constant backlight, and seeing those two displays side by side is certainly striking.
 

SapientWolf

Trucker Sexologist
I'd have to see this first hand, because I never tear at 144hz and I don't know if G-SYNC is going to do more for perceptual smoothness than Light Boost. Maybe it would at lower framerates.

We have a superior, low-persistence mode that should outperform that unofficial implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date.
Ok, I missed this post. It's great that the tech has the potential to be mainstream. It's a game changer for sure.
 

coldfoot

Banned
I'd expect NV to get a call from Apple with a buyout offer for this tech, to put it in their new line of expensive LCD TVs :(
 

aeolist

Banned
I'd have to see this first hand, because I never tear at 144hz and I don't know if G-SYNC is going to do more for perceptual smoothness than Light Boost. Maybe it would at lower framerates.

if you never tear at 144hz you are running with vsync on or at extremely low resolution/quality settings

or maybe all the games you play are really old
 

hesido

Member
This is genius, but they should have pushed it into the HDMI standard (e.g., HDMI 1.5-supporting monitors and TVs would all communicate with the input device).

Then the world would be a rosy place.
 

LordCanti

Member
I've got the Asus monitor they say will have a retrofit kit, so here's hoping that the retrofit kit is fairly cheap.

Honestly, in 120 Hz/LightBoost mode, I don't notice tearing in games at all, so I don't know how much I'd really be willing to pay for this.

Edit: DisplayPort only? Balls. My 580 doesn't have that. Are there converters?
 
This is genius, but they should have pushed it into the HDMI standard (e.g., HDMI 1.5-supporting monitors and TVs would all communicate with the input device).

Then the world would be a rosy place.

They're going to include a chipset that can sync a GPU to a panel in an HDMI cable?

Please tell me more about this magical solution of yours.
 

aeolist

Banned
They're going to include a chipset that can sync a GPU to a panel in an HDMI cable?

Please tell me more about this magical solution of yours.

if they included it in an industry standard then whatever tvs support that version of the standard would have to have display controllers that can handle it. it's not that hard.

don't know if it'll ever happen but it really is the best solution for computer graphics
 

AndyBNV

Nvidia
I've got the Asus monitor they say will have a retrofit kit, so here's hoping that the retrofit kit is fairly cheap.

Honestly, in 120 Hz/LightBoost mode, I don't notice tearing in games at all, so I don't know how much I'd really be willing to pay for this.

Edit: DisplayPort only? Balls. My 580 doesn't have that. Are there converters?

G-SYNC requires a GTX 650 Ti Boost or better, so you'd need to buy a new GPU, too. This isn't some arbitrary BS to force folks to upgrade, by the way - Kepler and up have the tech to make G-SYNC work.
 

Arulan

Member
G-SYNC requires a GTX 650 Ti Boost or better, so you'd need to buy a new GPU, too. This isn't some arbitrary BS to force folks to upgrade, by the way - Kepler and up have the tech to make G-SYNC work.

It sounds like it's taking advantage of the same technology necessary for Adaptive V-sync, judging from the minimum required GPU.
 
so new checklist for the perfect monitor

IPS
UHD
1ms gtg
<8.33 ms of display lag
144 hz (will this even be needed anymore, with g-sync?)
lightboost
g-sync
no bleeding
good contrast
accurate colours
anamorphic widescreen

benefits

movies will now be great. no jitter, no pulldown (see the sketch after this post)

stutter eliminated

tearing eliminated

mouse needs less interpolation, which may lead to better accuracy.

will give less powerful gpus a better experience

emulation issues solved for old games as timings can now match.

introduced motion blur will not be needed as the eyes will do it themselves when all frames are matched.

image quality vastly improved

multi gpu setups vastly improved. less frame variance.

this is a holy grail of display innovation right here. pc gaming is going to be even more glorious.

Brothers and sisters, today we shall walk out of the wilderness. The dark age is at an end, the sacrifices of our old king the CRT will finally prevail, and a decade of poor, misguided progress is now slain. A new beacon lights our path and it is glorious. It can only be a sign that gaming heaven is upon us, that display paradise is finally at hand! Lord Gaben, let your light bestow upon us the beauty and the terror of Half-Life 3 as we, your humble servants of the PC master race, offer our beloved HD CRTs as a willing sacrifice while we await the coming of our display messiah.

AAAAAAAAMEEEEEEHHHHHHAAAAN.
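
To make the movie point from the list above concrete: 24 fps film on a fixed 60 Hz display needs 3:2 pulldown, so frames alternate between three and two refreshes on screen, which is the judder people notice; variable refresh can hold every frame for its true duration. A quick sketch of the arithmetic (illustrative only):

```python
FILM_FPS = 24.0
REFRESH_MS = 1000.0 / 60.0  # fixed 60 Hz panel

# 3:2 pulldown: 24 film frames fill 60 refreshes by alternating between
# 3 refreshes (~50 ms) and 2 refreshes (~33 ms) on screen.
pulldown = [3, 2] * 12  # one second of film
on_screen_ms = sorted({round(n * REFRESH_MS, 1) for n in pulldown})
print(f"fixed 60 Hz: frames alternate between {on_screen_ms} ms")

# Variable refresh: every film frame is simply held for its true duration.
print(f"variable refresh: every frame held {1000.0 / FILM_FPS:.1f} ms")
```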
 
We have a superior, low-persistence mode that should outperform that unofficial implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date.

I thought this day couldn't get any better. I was wrong.
 
John Carmack @ID_AA_Carmack 19m

@jimrayvaughn @renderpipeline laptop and mobile screens will come later. I tried to convince Apple to do this years ago.

some people here asked about laptop screens
 

Tablo

Member
Nvidia bringing their A+++ game!! Makes me happy to have a GTX 670, will stick with Nvidia in the future, can't pass up this sort of stuff :)

Will definitely grab a G-Sync monitor next year and stay on Team Green!
 

Dr. Kaos

Banned
Wow. This is so huge for PC gaming in general (and VR in particular).

A game that runs between 30 and 60 FPS using gsync will still look and play great. This is a total gamechanger.

Your move, AMD.
 

hesido

Member
They're going to include a chipset that can sync a GPU to a panel in an HDMI cable?

Please tell me more about this magical solution of yours.

Here:
if they included it in an industry standard then whatever tvs support that version of the standard would have to have display controllers that can handle it. it's not that hard.

don't know if it'll ever happen but it really is the best solution for computer graphics



Maybe I should have said HDMI 2.0 to better illustrate my point, but I just said 1.5 as the latest I've heard is 1.4.

Anyway, displays that support this future standard would come with this functionality, and I'm 100% sure it would take far less effort than a hack/mod like this to build it straight into the display, since the TV/monitor software can already control the refresh rate while sending the necessary signalling/handshaking information through the HDMI cable. Many TVs can already change their refresh rates (24p, 50p, 60p); this would just be done dynamically.
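
As a purely hypothetical illustration of the kind of handshake being described here: the display advertises its supported refresh range once, and the source then times refreshes to frame delivery. None of these message names exist in HDMI 1.4 or any announced spec; this is an invented sketch of the concept:

```python
from dataclasses import dataclass

# Hypothetical capability exchange for dynamic refresh over a display
# link. These names and fields are invented for illustration; nothing
# like this exists in the actual HDMI 1.4 spec.

@dataclass
class CapabilityReport:  # display -> source, sent once at hot-plug
    min_refresh_hz: float
    max_refresh_hz: float

def present_frame(caps: CapabilityReport, frame_interval_ms: float) -> str:
    """Source-side decision: can the panel scan this frame out on arrival?"""
    hz = 1000.0 / frame_interval_ms
    if caps.min_refresh_hz <= hz <= caps.max_refresh_hz:
        return f"signal refresh now ({hz:.1f} Hz)"
    # Below the panel's minimum rate: repeat the previous frame once,
    # then present the new one (what a display controller might do).
    return "repeat previous frame, then present"

caps = CapabilityReport(min_refresh_hz=30.0, max_refresh_hz=144.0)
print(present_frame(caps, 13.9))  # ~72 fps frame: scanned out immediately
print(present_frame(caps, 40.0))  # 25 fps frame: below the panel minimum
```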
 

Miguel81

Member
Here:

Maybe I should have said HDMI 2.0 to better illustrate my point, but I just said 1.5 as the latest I've heard is 1.4.

Anyway, displays that support this future standard would come with this functionality, and I'm 100% sure it would take far less effort than a hack/mod like this to build it straight into the display, since the TV/monitor software can already control the refresh rate while sending the necessary signalling/handshaking information through the HDMI cable.

Shouldn't this thing bypass the HDMI standard?
 

patientx

Member
Nice, and so great for PC gaming, but people won't rush to change monitors and GPUs just for this one thing. This is only suitable for high-end gaming. Small market, IMO.
 
Here:

Maybe I should have said HDMI 2.0 to better illustrate my point, but I just said 1.5 as the latest I've heard is 1.4.

Anyway, displays that support this future standard would come with this functionality, and I'm 100% sure it would take far less effort than a hack/mod like this to build it straight into the display, since the TV/monitor software can already control the refresh rate while sending the necessary signalling/handshaking information through the HDMI cable. Many TVs can already change their refresh rates (24p, 50p, 60p); this would just be done dynamically.

No one would increase the price of a TV by $100 just to make a few gamers happy anyway.
 

Crisco

Banned
It's wild that it's taken this long for this tech to come along. It should have been part of the DVI standard from day one. It's such an obvious solution to an archaic problem. AMD is basically fucked if DisplayPort doesn't eventually adopt something like it into the official standard.
 

hesido

Member
No one would increase the price of a TV by $100 just to make a few gamers happy anyway.

I don't think a native solution would cost anything close to that; I don't even think it would cost more than a software solution. HDMI already has communication functions so the TV can talk to the source device, and higher-end TVs already have the circuitry to handle this. Some even output audio through HDMI via the Audio Return Channel. If they can pass audio, which has to stay totally in sync with the visuals, they can easily output the necessary signals, including signals to refresh.

LG's SimpLink
http://www.ask.com/question/what-is-simplink

Audio Return channel
http://www.hdmi.org/manufacturer/hdmi_1_4/arc.aspx
 
Nice, and so great for PC gaming, but people won't rush to change monitors and GPUs just for this one thing. This is only suitable for high-end gaming. Small market, IMO.

Actually it's the exact opposite: this is much more significant for lower-end performance, where frame rates are not consistently high.
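
The arithmetic behind that point, under assumed frame times: with vsync on a fixed 60 Hz panel, any frame that misses the 16.7 ms deadline waits for the next refresh, so a GPU capable of 50 fps gets quantized down to 30 fps, while a variable-refresh panel just displays the frame when it's done. A minimal sketch:

```python
import math

REFRESH_MS = 16.7  # fixed 60 Hz panel

def vsync_fps(frame_ms: float) -> float:
    # With vsync, a finished frame waits for the next refresh boundary,
    # so the effective rate is quantized to 60/1, 60/2, 60/3, ...
    refreshes_waited = math.ceil(frame_ms / REFRESH_MS)
    return 1000.0 / (refreshes_waited * REFRESH_MS)

def variable_refresh_fps(frame_ms: float) -> float:
    # The panel refreshes the moment the frame is ready (ignoring the
    # panel's own maximum refresh rate for simplicity).
    return 1000.0 / frame_ms

for frame_ms in (14.0, 20.0, 28.0):  # assumed GPU frame times
    print(f"{frame_ms:4.1f} ms frames: vsync {vsync_fps(frame_ms):4.1f} fps, "
          f"variable {variable_refresh_fps(frame_ms):4.1f} fps")
```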
 
We hope to get the cost down to $130.

We are working towards that goal.

We have a superior, low-persistence mode that should outperform that unofficial implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date.


Thanks Andy, you've been a good sport

Regarding the bolded, my god @ the potential of LightBoost without the color and brightness degradation you get from the current mod in 2D mode.

throwing my money at the screen, just hoping that the installation doesn't result in ihavenoideawhatimdoing.jpeg
 
Regarding the bolded, my god @ the potential of LightBoost without the color and brightness degradation you get from the current mod in 2D mode.
That would be physically impossible, I imagine, since eliminating motion blur depends on making sure the backlight is not on during the image transition, so the overall light output of the monitor would have to be lower no matter what. However, with this the GPU will also have more control over the display's colour and gamma, which would make it easier to fix the colour degradation caused by strobing the backlight.
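
A rough way to see why some brightness loss is unavoidable with a strobed backlight (illustrative numbers, assuming an idealized backlight): the panel is only lit for the strobe's fraction of each refresh, so average light output scales with that duty cycle unless the strobe itself is driven brighter:

```python
def strobed_brightness(strobe_ms: float, refresh_hz: float,
                       boost: float = 1.0) -> float:
    """Average light output relative to an always-on backlight.

    boost models driving the backlight harder during the strobe
    (LightBoost-style); 1.0 means no boost. Values are illustrative.
    """
    period_ms = 1000.0 / refresh_hz
    return (strobe_ms / period_ms) * boost

# 1.4 ms strobe at 120 Hz with no boost: ~17% of full brightness.
print(f"no boost: {strobed_brightness(1.4, 120.0):.0%}")
# Driving the strobe ~4x harder recovers much of it (assumed factor).
print(f"4x boost: {strobed_brightness(1.4, 120.0, boost=4.0):.0%}")
```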
 