
G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

Dmented

Banned
Didn't see if anyone asked this yet, and tbh I don't know why I am... it's Nvidia after all, but... this is exclusive to Nvidia GPUs, isn't it?
 

Durante

Member
Carmack working on a secret project at Oculus + he's talking more about G-Sync.
G-Sync would seem ideal for the Oculus goal of minimizing input lag.


Thing is, it will be beneficial to everyone. This seems like such a common-sense idea, to have whatever is providing the image control the timing, that I could see it being standard for all sorts of displays and hardware in the future.
I absolutely agree that it's a common sense idea, but I don't necessarily think it will happen everywhere quickly. The thing is, it was already a common sense idea 10 years ago as well, and it probably wouldn't have happened for another 10 if it weren't for Nvidia.

The big roadblock is that it's only relevant for games, and display communication standards are built for TV/movies.
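
To put the idea in concrete terms, here's a rough timing sketch (purely illustrative; the numbers are made up, not anything Nvidia has published). With vsync on a fixed 60Hz panel, a finished frame has to wait for the next scheduled refresh; with a variable-refresh display, scanout can start the moment the frame is done:

```python
# Illustrative sketch only: when frames become visible on a fixed 60 Hz panel
# with vsync, versus on a variable-refresh (G-Sync-style) display.
import math

SCANOUT_MS = 1000.0 / 60.0  # ~16.7 ms between refreshes on a fixed 60 Hz panel

def vsync_display_times(render_done_ms):
    """Vsync on a fixed panel: each finished frame waits for the next refresh,
    so display times snap up to multiples of ~16.7 ms."""
    return [math.ceil(t / SCANOUT_MS) * SCANOUT_MS for t in render_done_ms]

def gsync_display_times(render_done_ms):
    """Variable refresh: the panel refreshes when the frame is ready (within
    its supported range), so display time tracks render time directly."""
    return list(render_done_ms)

# GPU steadily delivers a frame every 20 ms (a solid 50 fps it can actually hold):
renders = [20.0 * i for i in range(1, 8)]
print([round(t, 1) for t in vsync_display_times(renders)])  # mostly ~16.7 ms steps with a 33.3 ms hitch -> judder
print([round(t, 1) for t in gsync_display_times(renders)])  # even 20 ms steps -> consistent pacing
```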
 
D

Deleted member 22576

Unconfirmed Member
This sounds super cool, I hope my GTX 680 is compatible.
 

Rur0ni

Member
That would be the difference between the new and older models of VG248QE, assuming there is no price mark-up.
I'm going to settle on $130. By the time this is more ubiquitous, you'll be buying a monitor with it already installed instead of springing for a kit that's finally under $100.
 
Well, not necessarily. They said the $400 Asus monitor with the kit pre-installed is going to be sold at a premium because of the extra work involved. A new model designed to use G-Sync from the start would almost certainly be cheaper.

"I'm hearing that NVIDIA wants to try and get the module down to below $100 eventually."

Unless I'm reading this wrong, the module is the kit itself. "Below $100 eventually" sounds like it will be around $130-$150 at launch.
 
I need to see the specs of the NVIDIA monitors with it built in. I'm interested in a 1440p monitor for sure. I'd love 120hz.
I'd kill for IPS and 27''
 

Calen

Member
OHMYGOD OHMYGOD OHMYGOD

is this really what I think it is

If so,
OHMYFUCKINGGOD SOMEONE DID IT

BEST PC TECH INNOVATION OF THE PAST 2 DECADES!

Edit:
IT IS
OH MY FUCKING GOD

SOME MOD ADD "OH MY GOD" TO THE TITLE STAT

This is hilarious but it was almost exactly my reaction.

Then I saw it, and I think my head exploded. I can't begin to express how much of a difference this makes and how GOOD it actually looks.
 

NervousXtian

Thought Emoji Movie was good. Take that as you will.
Over $100 for the DIY kit? Yeesh, it's gonna have a hard time breaking out of the enthusiast market.

Pretty sure this, right now, is made for that market.

The longer-term plan is to license it out to third parties so it becomes more of a standard over the next few years.
 

Zeth

Member
Crazy. This is how displays would've always worked if they were designed for games, and not video/film. This stuff will be vital to stuff like VR in the near future. I'm excited.
 

Rur0ni

Member
I need to see the specs of the NVIDIA monitors with it built in. I'm interested in a 1440p monitor for sure. I'd love 120hz.
I'd kill for IPS and 27''
Well, CES 2014 is less than three months away. I would expect something at least by then from the partner manufacturers that are launching in Q1, if not before.
 
Well, CES 2014 is less than three months away. I would expect something at least by then from the partner manufacturers that are launching in Q1, if not before.

That's a fine amount of time, for me at least. I gotta start watching GPUs closely now since I'll have to upgrade.
 

Sixfortyfive

He who pursues two rabbits gets two rabbits.
Vsync was always a cure that was worse than the disease.

This, on the other hand, is some genuinely interesting tech. Makes you feel like an idiot when you sit and think about how it seems like such an obvious step forward, too.
 
I think a lot of people aren't going to notice this. For example, in iRacing you are recommended to play with v-sync off, because input lag has a massive effect on perceived car handling. There is a framerate lock in the menu to stop your GPU going nuts. By default, this is 84fps. Apparently, this is the ideal framerate for keeping the frames permanently out of sync with a 60Hz monitor, so the tear line never sits in the same spot and you don't notice it.

Loads of users run iRacing at 84fps and claim that it looks really smooth and without tearing. To me, it looks like a stuttery mess. It looks as bad as, if not worse than, 30fps. I lock the game to either 60fps or 120fps. Those are the only framerates that actually look smooth on my 60Hz monitor. Anything else looks stuttery to me, but apparently not to many people.
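
For what it's worth, the "tear never sits still" effect is easy to sanity-check with a little arithmetic (a rough sketch, not tied to anything iRacing actually does internally): the tear position each frame is just the fraction of the scanout that has elapsed when the new frame gets flipped in.

```python
# Rough sanity check (not from iRacing): where the tear line lands on a 60 Hz
# scanout when the game renders at a given fps with vsync off.
REFRESH_HZ = 60.0

def tear_positions(render_fps, frames=8):
    """Fraction of the way down the screen (0.0-1.0) at which each new frame
    arrives mid-scanout -- roughly where the tear appears."""
    return [round((i * REFRESH_HZ / render_fps) % 1.0, 2) for i in range(1, frames + 1)]

print(tear_positions(84))   # 0.71, 0.43, 0.14, 0.86, ... -> the tear wanders every frame
print(tear_positions(120))  # 0.5, 0.0, 0.5, 0.0, ...     -> fixed positions, even pacing
```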
 

Kintaro

Worships the porcelain goddess
Is this Megaton worthy for the PC space? Because I think this is worth a "megaton" personally.
 

ElFly

Member
G-Sync would seem ideal for the Oculus goal of minimizing input lag.


I absolutely agree that it's a common sense idea, but I don't necessarily think it will happen everywhere quickly. The thing is, it was already a common sense idea 10 years ago as well, and it probably wouldn't have happened for another 10 if it weren't for Nvidia.

The big roadblock is that it's only relevant for games, and display communication standards are built for TV/movies.

Movies could benefit a little too: films that run at 24fps could play at their native framerate instead of the weird telecine pulldown we put up with today to make them fit 60Hz displays.

Of course this isn't as important as the framerate problem for videogames, so few people care or even notice.
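
A rough sketch of the arithmetic behind that judder: 60Hz divided by 24fps is 2.5, so on a fixed 60Hz display alternate film frames have to be held for 2 and then 3 refreshes, while a display whose rate is an exact multiple of 24 (or a variable-refresh one) can hold every frame for the same length of time.

```python
# Back-of-the-envelope sketch: how long each 24 fps film frame stays on screen.
FILM_FPS = 24.0

def frame_hold_times_ms(display_hz, frames=6):
    """Hold each film frame for a whole number of refreshes (pulldown-style
    when display_hz / FILM_FPS isn't an integer)."""
    refreshes_per_frame = display_hz / FILM_FPS           # 2.5 on a 60 Hz panel
    refresh_ms = 1000.0 / display_hz
    holds, carried = [], 0.0
    for _ in range(frames):
        carried += refreshes_per_frame
        whole = int(carried)                              # refreshes spent on this frame
        carried -= whole
        holds.append(round(whole * refresh_ms, 1))
    return holds

print(frame_hold_times_ms(60))   # [33.3, 50.0, 33.3, 50.0, ...] -> uneven pulldown judder
print(frame_hold_times_ms(72))   # [41.7, 41.7, ...] -> 3 refreshes per frame, no judder
# A variable-refresh display could simply hold every frame for 1000/24 ~= 41.7 ms.
```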
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
I'd like to see support for this in MAME. Tons of arcade games run at odd refresh rates.

Emulation could benefit tremendously if G-Sync is controllable at a driver level, like OpenGL. All odd-refresh-rate emulators could, in theory, replicate their true, original refresh rates.
 

dark10x

Digital Foundry pixel pusher
Since this is still a ways away I wonder what the odds of an OLED monitor would be? OLED display + this would be heavenly.

Movies could benefit a little too: films that run at 24fps could play at their native framerate instead of the weird telecine pulldown we put up with today to make them fit 60Hz displays.

Of course this isn't as important as the framerate problem for videogames, so few people care or even notice.
Have you seen the 72Hz mode on a Kuro? Perfect display of 24 fps content without any judder or uneven frames. Hard to watch movies on anything else for me.

it was even cooler still discussing creating an IPS 1440 120Hz G-Sync enabled display for the hardcore gaming community.
Wow, an LCD for the "hardcore". How generous.

I'm glad companies are looking into this refresh rate business at last but I just cannot fathom why they continue to stick with LCD.
 

Tain

Member
I'd like to see support for this in MAME. Tons of arcade games run at odd refresh rates.

Emulation could benefit tremendously if G-Sync is controllable at a driver level, like OpenGL. All odd-refresh-rate emulators could, in theory, replicate their true, original refresh rates.

You shouldn't even need to make changes to MAME for it to support it. You probably just tell your nVidia drivers to use variable refresh rates for a given app, and bam. When MAME runs R-Type at 55fps with the D3D renderer, you are playing arcade-perfect R-Type with your G-Sync setup.

I would think that any emulator you can get screen tearing with will be pretty much perfect.
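
If it really does work that way, the emulator side stays trivial: pace presentation to the original machine's rate and let the display follow. A minimal sketch of that idea (hypothetical helper callbacks, not MAME's actual code):

```python
# Minimal frame-pacing sketch (hypothetical helpers, not MAME's actual code):
# with a variable-refresh display the emulator just presents at the original
# machine's rate instead of rounding everything to a fixed 60 Hz grid.
import time

NATIVE_HZ = 55.0                  # e.g. the arcade board's refresh rate
FRAME_S = 1.0 / NATIVE_HZ

def run(emulate_frame, present, frames=550):
    deadline = time.perf_counter()
    for _ in range(frames):
        image = emulate_frame()   # advance the emulated machine by one video frame
        present(image)            # on a G-Sync panel, scanout can start right now
        deadline += FRAME_S
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining) # hold 55 fps pacing; no pulldown or tearing needed
```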
 

K.Jack

Knowledge is power, guard it well
motherofgodcrew.gif


This is amazing. Wow Nvidia strikes again.
 

ToD_

Member
Thing is, you shouldn't even need support. You probably just tell your nVidia drivers to use variable refresh rates for a given app, and bam. When MAME runs R-Type at 55fps, you are playing arcade-perfect R-Type with your G-Sync setup.

I hope it works this way, with all refresh rates being selectable by applications: 30Hz through 144Hz for this particular monitor. In that case, yes, MAME would automatically select the appropriate refresh rate for each game/driver.
 
Really cool, but the business side of it concerns me. Will it take off, or will it end up like 3D monitors? Will enough brands pick it up, and will it be available in prosumer equipment or just in cheap, overpriced gamer-styled hardware?
 

AndyBNV

Nvidia
This is a detailed thread discussing input lag results with various forms of V-sync, flip queues, triple buffering, etc.

http://isiforums.net/f/showthread.php/9888-Input-lag-measurements

I'm not familiar with the game in question, but I believe the general information is valid to take away from it.

If there is an NVIDIA rep around, are there any measurements comparing G-sync input lag to no V-sync at all yet?

Tech isn't 100% finalized so I can't share numbers at this time, but I can say that 144Hz G-SYNC input latency is faster than 144Hz stock-monitor 'No VSync' in almost all cases, and in others identical.
 

Arulan

Member
Tech isn't 100% finalized so I can't share numbers at this time, but I can say that 144Hz G-SYNC input latency is faster than 144Hz stock-monitor 'No VSync' in almost all cases, and at worst identical.

Thank you. That would be very impressive.
 
Without making this overly trite, can someone explain to me why they're so angered that Nvidia has made this proprietary?

They invested the R&D to develop an amazing technology and you want them to share it instead of cashing in on their success? Is that right?
 

luoapp

Member
Really cool, but the business side of it concerns me. Will it take off, or will it end up like 3D monitors? Will enough brands pick it up, and will it be available in prosumer equipment or just in cheap, overpriced gamer-styled hardware?

Let's hope Nvidia will open up their standard, or merge with the inevitable AMD solutions.

Without making this overly trite, can someone explain to me why they're so angered that Nvidia has made this proprietary?

They invested the R&D to develop an amazing technology and you want them to share it instead of cashing in on their success? Is that right?

It can be open and profitable at the same time.
 

Alo81

Low Poly Gynecologist
Tech isn't 100% finalized so I can't share numbers at this time, but I can say that 144Hz G-SYNC input latency is faster than 144Hz stock-monitor 'No VSync' in almost all cases, and in others identical.

Wait, so G-sync can be faster than no V-sync at all? Huh.

Is there any word on if DIY modules are being attempted for other monitors? I'm sporting a BenQ XL2420TX and would love to buy and add a module if it'll ever be possible.
 

mephixto

Banned
Tech isn't 100% finalized so I can't share numbers at this time, but I can say that 144Hz G-SYNC input latency is faster than 144Hz stock-monitor 'No VSync' in almost all cases, and in others identical.

Andy, are you guys working with BenQ on a similar DIY mod, or is it Asus only at the moment?
 

Trogdor1123

Member
Silly question, but will this make its way into TVs eventually as well? Would be nice to have no screen tearing in some games that way.
 

AndyBNV

Nvidia
Wait, so G-sync can be faster than no V-sync at all? Huh.

Is there any word on if DIY modules are being attempted for other monitors? I'm sporting a BenQ XL2420TX and would love to buy and add a module if it'll ever be possible.

"Wait, so G-sync can be faster than no V-sync at all?" Yes, it's another benefit of building brand new hardware that we can hook directly to the GPU.

"Is there any word on if DIY modules are being attempted for other monitors?" I will be asking engineering about this when they return from Montreal.
 