
G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

Alcahest

Member
I can't picture how, with such technology, the in-game mouse cursor isn't going to jump or disappear all over the place, since I don't think the GPU renders the game and the mouse cursor in exactly the same way.
Can anyone shed some light on this?
 
The more I read, the more I want to buy a G-sync capable monitor. Next gen belongs to the PC. Now we just need a way to sneak this into the Oculus Rift and we're set.

 

LordCanti

Member
G-SYNC requires GTX 650 Ti Boost or better, so you'd need to buy a new GPU, too. This isn't some arbitrary BS to force folks to upgrade, by the way - Kepler and up has tech in it to make G-SYNC work.

Hmm...not sure I'm ready to let go of my 580 yet. Hopefully reviewers will A/B 120hz/LightBoost (in 2D mode) vs G-Sync so I can get a real world picture of how much of a difference it would make.
 

Fafalada

Fafracer forever
dark10x said:
I'm glad companies are looking into this refresh rate business at last but I just cannot fathom why they continue to stick with LCD.
The amount of brainwashing done by the industry already has most people convinced LCD is the second coming (that's been coming for 15 years), so why change?

Durante said:
This solves the problem at its core.
Strictly speaking I still don't care for variable framerates, no matter how they're displayed, but this at least allows to control the variance better. I wonder what chances are to see this in handheld devices someday, as outside of the dying dedicated handhelds, they universally suffer from unstable shit framerate, and you don't have the PC option of upgrading them away.
 

Eknots

Member
I'm a little confused about how it can go past the monitor's 60Hz. I thought the monitor's hardware itself can't handle going past that? Is this removing a lock that monitors have, so most can go past 60 up to, say, 110Hz if my GPU hits 110fps?
 

DKo5

Respawn Entertainment
Holy crap, so essentially does this mean every game will have that silky-smooth feel of high fps even if, say, it's around 30fps? :O

In the demo NVIDIA gave us a couple weeks ago my eyes felt like they were "seeing" the smoothness of 60hz all the way down to about 35-40fps. It was at that point I thought the NVIDIA guys were lying and made them turn on an out-of-demo framecounter like FRAPs. It was at precisely that point I went and grabbed every person at Respawn that I could find. So many jaws on the floor that day, lol.
 

TheExodu5

Banned
How does it deal with framerates that are higher than the display's refresh rate? You'd have to either drop frames (resulting in slight desync - and therefore slight input lag) or limit the framerate to your display's maximum refresh rate.
 

Zeth

Member
I'm a little confused about how it can go past the monitor's 60Hz. I thought the monitor's hardware itself can't handle going past that? Is this removing a lock that monitors have, so most can go past 60 up to, say, 110Hz if my GPU hits 110fps?

These are 144hz displays as far as I know.
 

Dawg

Member
"I'm a little confused about how it can go past the monitor's 60Hz. I thought the monitor's hardware itself can't handle going past that? Is this removing a lock that monitors have, so most can go past 60 up to, say, 110Hz if my GPU hits 110fps?"


The monitors they're talking about are not 60hz monitors, but 120/144hz monitors.

Don't 120hz monitors already have "no tearing" with v-sync off though?
 

Crisco

Banned
How does it deal with framerates that are higher than the display's refresh rate? You'd have to either drop frames (resulting in slight desync - and therefore slight input lag) or limit the framerate to your display's maximum refresh rate.

Almost definitely that. The thing is, you can now go nuts with AA/downsampling or other performance-crippling features, because a variable/inconsistent frame rate no longer matters.
 
How does it deal with framerates that are higher than the display's refresh rate? You'd have to either drop frames (resulting in slight desync - and therefore slight input lag) or limit the framerate to your display's maximum refresh rate.

@ID_AA_Carmack What if the GPU frames are faster than the monitor refresh?

John Carmack ‏@ID_AA_Carmack 1h

@VegetableBread the monitor goes up to 144hz. Presumably you would want to stall then, but tear might be a (bad) option.
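Carmack's "stall" option above amounts to pacing frame delivery at the panel's maximum rate. A hypothetical sketch (the 144Hz cap comes from the thread; `next_display_time` and the GPU timings are illustrative, not anything NVIDIA has specified):

```python
MAX_HZ = 144
MIN_FRAME_MS = 1000.0 / MAX_HZ  # ~6.94 ms: fastest the panel can accept a new frame

def next_display_time(prev_display_ms, render_done_ms):
    # Stall: if the GPU finishes early, wait until the panel can take
    # another frame; otherwise show the frame as soon as it's done.
    return max(render_done_ms, prev_display_ms + MIN_FRAME_MS)

# GPU spits out frames every 4 ms (250 fps); the panel paces them to 144 Hz.
t = 0.0
shown = []
for done in (4, 8, 12, 16, 20):
    t = next_display_time(t, done)
    shown.append(round(t, 2))
print(shown)  # [6.94, 13.89, 20.83, 27.78, 34.72]
```

Each displayed frame lands one refresh period apart, which is why stalling above the cap looks like a 144fps lock rather than tearing.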
 

TheExodu5

Banned
It should be noted that frame latency is going to be an even more important metric for GPU reviews now. Inconsistent frame latency can still introduce stutter with this tech.
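The point above can be illustrated with a quick sketch: even when every frame is shown the instant it's done, uneven frame times still read as stutter. The frame times and `max_step_jump` helper below are hypothetical, chosen to match the "22ms to 50ms" kind of swing discussed in the thread:

```python
# Two GPUs with the same average frame time (25 ms, ~40 fps) but very
# different consistency. Variable refresh removes tearing and v-sync
# quantization, but animation still advances in uneven steps when
# frame times swing around.

steady = [25, 25, 25, 25, 25, 25]   # smooth, consistent 40 fps
spiky  = [22, 22, 50, 22, 22, 12]   # same 150 ms total, very uneven

def max_step_jump(frame_times_ms):
    # Largest change between consecutive frame times: a rough stutter proxy.
    return max(abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:]))

print(max_step_jump(steady))  # 0  -> perceived as smooth
print(max_step_jump(spiky))   # 28 -> perceived as stutter, G-SYNC or not
```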
 

Durante

Member
In the demo NVIDIA gave us a couple weeks ago my eyes felt like they were "seeing" the smoothness of 60hz all the way down to about 35-40fps. It was at that point I thought the NVIDIA guys were lying and made them turn on an out-of-demo framecounter like FRAPs. It was at precisely that point I went and grabbed every person at Respawn that I could find. So many jaws on the floor that day, lol.
Sounds just as amazing as I expected. I want one now. But 1440p...
 
It should be noted that frame latency is going to be an even more important metric for GPU reviews now. Inconsistent frame latency can still introduce stutter with this tech.

Namely, extreme differences in frame latency.

Aka, jumping from 22ms to 50ms all of a sudden.

In the demo NVIDIA gave us a couple weeks ago my eyes felt like they were "seeing" the smoothness of 60hz all the way down to about 35-40fps. It was at that point I thought the NVIDIA guys were lying and made them turn on an out-of-demo framecounter like FRAPs. It was at precisely that point I went and grabbed every person at Respawn that I could find. So many jaws on the floor that day, lol.

Oh my.
 

Tain

Member
If you aren't using any kind of vsync you are getting tearing, period. Over the refresh rate, under, doesn't matter.

So let me get this straight: if my games are locked at a solid 60 FPS, I would see little to no change with G-Sync technology, correct?

If they are currently locked with vsync you'll be able to disable vsync and get noticeably less lag.

If they aren't currently locked with vsync you are currently getting screen tearing and with gsync you will no longer be getting screen tearing.
 
This sounds amazing. Sucks I bought my first monitor earlier this year, though. :(

It would be cool if this technology were available the way Samsung upgrades their HDTVs (Samsung Evolution Kit). It'd be awesome if TVs and monitors could be upgraded by simply buying an add-on (assuming the price is reasonable).
 
If you aren't using any kind of vsync you are getting tearing, period. Over the refresh rate, under, doesn't matter.



If they are currently locked with vsync you'll be able to disable vsync and get noticeably less lag.

If they aren't currently locked with vsync you are currently getting screen tearing and with gsync you will no longer be getting screen tearing.

Which means if you have G-Sync, there's no reason to ever turn v-sync on right?
 

Septimius

Junior Member
Which means if you have G-Sync, there's no reason to ever turn v-sync on right?

Yes. Because v-sync means holding back your game only to draw a frame when the display needs one. If you miss your chance to give a new frame, you wait for the next opportunity. With this, you make the frame, and when it's done, you show it.
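The behavior described above can be sketched as a toy timeline, assuming an idealized 60Hz panel; `vsync_display_time` and `gsync_display_time` are hypothetical helpers for illustration only:

```python
import math

REFRESH_HZ = 60
PERIOD_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms between refresh opportunities

def vsync_display_time(render_done_ms):
    # V-sync: a finished frame waits for the next refresh boundary.
    return math.ceil(render_done_ms / PERIOD_MS) * PERIOD_MS

def gsync_display_time(render_done_ms):
    # G-SYNC-style: the monitor refreshes the moment the frame is ready.
    return render_done_ms

# A frame that finishes at 17 ms just misses the ~16.67 ms refresh,
# so v-sync holds it a whole extra refresh period.
done = 17.0
print(f"v-sync shows it at {vsync_display_time(done):.2f} ms")  # 33.33 ms
print(f"g-sync shows it at {gsync_display_time(done):.2f} ms")  # 17.00 ms
```

Missing the boundary by a fraction of a millisecond costs a full 16.67ms under v-sync; with the display driven by the GPU, that penalty disappears.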
 

hesido

Member
These are 144hz displays as far as I know.

Actually, higher-Hz monitors are less affected by v-sync in the first place; this tech would have been most beneficial for 60Hz monitors. When the frame rate drops to 50fps with v-sync on at 60Hz, certain frames have to be shown for 2 × 1/60s = 1/30 of a second instead of 1/60, adding ~16ms to that frame. If the game runs at 50fps on a 120Hz monitor, some frames are shown for 1/60s and some for 1/40s, adding at most ~8ms per frame. At 144Hz it's only about 7ms of waiting for the next refresh.

The higher you go, the less you'd wait for the v-sync, so a 1000hz monitor wouldn't even need this tech :)
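The arithmetic above boils down to one line: under v-sync, the worst-case extra wait a finished frame can suffer is one full refresh period. A quick check (`worst_case_vsync_wait_ms` is a hypothetical helper, using the refresh rates from the post):

```python
# Under v-sync, a frame that just misses a refresh waits up to one
# full refresh period before it can be shown.

def worst_case_vsync_wait_ms(refresh_hz):
    return 1000.0 / refresh_hz

for hz in (60, 120, 144, 1000):
    print(f"{hz:4d} Hz: up to {worst_case_vsync_wait_ms(hz):.1f} ms added per frame")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms, 1000 Hz -> 1.0 ms
```

Which matches the ~16ms / ~8ms / ~7ms figures above, and shows why an absurdly fast panel would make the penalty negligible.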
 
Yup. So beautiful!

Yes. Because v-sync means holding back your game only to draw a frame when the display needs one. If you miss your chance to give a new frame, you wait for the next opportunity. With this, you make the frame, and when it's done, you show it.

This is why patience pays off.

(Let's hope it pays off for another two more years since it's not like I can build my comp yet anyway :V)
 

AndyBNV

Nvidia
For everyone coming to the thread, here's the high-level breakdown of what G-SYNC does:

  • No tearing
  • No VSync input lag
  • No VSync stutter
  • Improved input latency compared to VSync Off mode on 'standard' monitors
  • Clearer, smoother movement (tech demo shows readable text on a moving object)
  • DIY modkit available before the end of the year for the ASUS VG248QE (~$175, aiming to reduce price to ~$130)
  • Modded ASUS VG248QE versions available from partners before the end of the year
  • Off-the-shelf models available from retailers next year. ASUS G-SYNC VG248QE MSRP is $399.
  • Will be in monitors of every resolution and size, right up to 3840x2160 - selection determined by manufacturers like ASUS, BenQ, Philips and ViewSonic
  • Requires GTX 650 Ti Boost or better

But I just bought my VG248QE...

I'll buy a 2nd monitor anyway!

Mod it with the DIY kit, or have someone do it for you. Takes 20-30 minutes.
 

Reallink

Member
Neat development, but I don't game on 24" 1000:1 LCDs and never will. I guess I'll get excited when/if it's moddable into home-theater-quality displays/FPJs.
 

scogoth

Member
Thanks, Andy, for being a great community rep! It's great to see Nvidia, and you in particular, being involved with gamers directly.

And now I need to buy 3 new monitors =/
 

XAL

Member
For everyone coming to the thread. Here's the high-level breakdown of what G-SYNC does:

  • No tearing
  • No VSync input lag
  • No VSync stutter
  • Improved input latency compared to VSync Off mode on 'standard' monitors
  • Clearer, smoother movement (tech demo shows readable text on a moving object)
  • DIY modkit available before the end of the year for the ASUS VG248QE (~$175, aiming to reduce price to ~$130)
  • Modded ASUS VG248QE versions available from partners before the end of the year
  • Off-the-shelf models available from retailers next year. ASUS G-SYNC VG248QE MSRP is $399.
  • Will be in monitors of every resolution and size, right up to 3840x2160 - selection determined by manufacturers like ASUS, BenQ, Philips and ViewSonic
  • Requires GTX 650 Ti Boost or better



Mod it with the DIY kit, or have someone do it for you. Takes 20-30 minutes.

Good god.

PRODUCT PAGES NOW
 
This sounds pretty awesome, but of course Nvidia makes it proprietary.

Yes, of course a big company chooses to make proprietary a technology which they independently researched and coordinated in order to actually profit on all the work they put into it. That shouldn't surprise or really even bother anybody.

If this technology plays out as well as expected, it will become a standard eventually, but a proprietary API paired with select partners is the way for someone to actually make the effort of developing this technology commercially viable.

You know what's funny: this solution could fix so many things at the lower end. For example, finally displaying movies at their native 24/48Hz, or letting weaker GPUs push frames as they finish them.

Yeah the implications of this technology are pretty significant.

We have a superior, low-persistence mode that should outperform that unofficial implementation, and importantly, it will be available on every G-SYNC monitor.



I can't believe you guys are gonna make me buy another monitor already. ;_;
 

Miguel81

Member
Yes, of course a big company chooses to make proprietary a technology which they independently researched and coordinated in order to actually profit on all the work they put into it. That shouldn't surprise or really even bother anybody.

If this technology plays out as well as expected, it will become a standard eventually, but a proprietary API paired with select partners is the way for someone to actually make the effort of developing this technology commercially viable.



Yeah the implications of this technology are pretty significant.





I can't believe you guys are gonna make me buy another monitor already. ;_;

Could AMD license it, if it becomes widespread?
 
For everyone coming to the thread, here's the high-level breakdown of what G-SYNC does:

  • No tearing
  • No VSync input lag
  • No VSync stutter
  • Improved input latency compared to VSync Off mode on 'standard' monitors
  • Clearer, smoother movement (tech demo shows readable text on a moving object)
  • DIY modkit available before the end of the year for the ASUS VG248QE (~$175, aiming to reduce price to ~$130)
  • Modded ASUS VG248QE versions available from partners before the end of the year
  • Off-the-shelf models available from retailers next year. ASUS G-SYNC VG248QE MSRP is $399.
  • Will be in monitors of every resolution and size, right up to 3840x2160 - selection determined by manufacturers like ASUS, BenQ, Philips and ViewSonic
  • Requires GTX 650 Ti Boost or better



Mod it with the DIY kit, or have someone do it for you. Takes 20-30 minutes.

Are there going to be 60hz g-sync monitors? Or will the g-sync brand guarantee at least a sexy 120hz?
 