
G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

Durante

Member
Why refresh screen independently at all then? Isn't THE PERFECT way to render frames as soon as they are available?
That's exactly what G-sync does. The all-caps PERFECT way.

nVidia style.
Looks like most gafers don't get what that means.
I get what that means. I also get that this wouldn't have happened without Nvidia.

Rolf NB said:
Why did this take so long?
Because companies suck, standardization processes for hardware interconnects suck more and no one cares about enthusiasts.

Why would I need G-Sync if my monitor runs at 144 Hz and newer games usually net me between 60 and 100 frames? Isn't my monitor refreshing faster than the game renders, so there's almost no tearing?
Look at the pictures in the OP.

Q: What are the resolutions of G-SYNC monitors?

A: NVIDIA G-SYNC enabled monitors will be available in a variety of resolutions, from 1920x1080 to 2560x1440 to 4Kx2K. The ASUS VG248QE NVIDIA G-SYNC enabled monitor has a max resolution of 1920x1080.
When, Andy?
 
As a living-room PC gamer, this solves nothing for me :( Is there a technical reason why this couldn't be a box sitting between the monitor and PC rather than integrated inside the display?

Maybe I'll spring for one of these and an Nvidia 3D setup
 

Tain

Member
A lot of them already can with an RGB monitor and some custom hardware like the ArcadeVGA, right? The MAMERes tool is also good for getting MAME games to use the proper refresh rates, or very close to them.

Groovy tech for games moving forward, and for a future where analog displays will be hard to get.

ToD_ said:
Yes, this has been possible with CRTs for a long time. I used to do this years ago with Powerstrip. I felt it was quite a hassle, though.

I'm hopeful this will be the beginning of flexible/variable refresh rates. Like you said, especially important now with analog displays becoming a thing of the past.

Actually, this is a little different.

While you can create custom resolutions with very precise frequencies to match MAME games, like with Powerstrip, you still ultimately have to choose vsync or no vsync. If you choose vsync, you'll get lag. If you choose no vsync, the game will run at the same frequency as your monitor (or very close to it), but the buffer flips won't line up with the refresh, causing an awfully steady screen tear line (there's a quick sketch of this below).

At this level, CRTs don't really have anything to do with it aside from being able to support a range of vertical refresh rates. Many modern displays do, too.
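
To put a number on that tear line point, here's a rough sketch (the 480-line panel is my own illustrative assumption, and the game rate is just the one MAME reports for CPS-3): with vsync off, the flip point slips a tiny bit each refresh, so the tear crawls slowly through the picture rather than sitting perfectly still.

```python
# Rough illustration with assumed numbers (not measurements): how fast a
# tear line drifts when the game rate nearly matches the monitor rate.
MONITOR_HZ = 60.0
GAME_HZ = 59.633333   # the rate MAME reports for CPS-3, as an example
LINES = 480           # assumed visible scanlines on the display

# Each monitor refresh, the buffer flip slips by this fraction of a frame.
slip_fraction = (1 / GAME_HZ - 1 / MONITOR_HZ) * MONITOR_HZ

print(f"tear line moves ~{slip_fraction * LINES:.1f} scanlines per refresh")
print(f"one full sweep of the screen every ~{1 / (slip_fraction * MONITOR_HZ):.1f} s")
```

With these numbers the tear line takes about 2.7 seconds to wander through the whole screen, which is why it reads as a steady line instead of random tearing.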
 

Orayn

Member
Why would I need G-Sync if my monitor runs at 144 Hz and newer games usually net me between 60 and 100 frames? Isn't my monitor refreshing faster than the game renders, so there's almost no tearing?

It's still more "coarse" than what G-Sync does. At 144 Hz, each frame is going to be onscreen for some multiple of 1/144th of a second, or 6.944 ms. If it takes 7.1 ms to render a new frame, for example, whatever's on the screen has to stay there for a whole 13.89 ms because the new frame missed its chance and has to wait for the display to refresh again. With G-Sync, you can keep something onscreen for exactly 7.1 ms and refresh as soon as the new frame is ready.
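
If it helps, here's the same arithmetic as a quick script (the render times in the loop are made-up examples, and the G-Sync line assumes only that the panel can't refresh faster than its maximum rate):

```python
import math

REFRESH_HZ = 144
INTERVAL_MS = 1000 / REFRESH_HZ   # one refresh interval: ~6.944 ms

def onscreen_fixed(render_ms):
    """Fixed refresh: a finished frame waits for the next refresh boundary."""
    return math.ceil(render_ms / INTERVAL_MS) * INTERVAL_MS

def onscreen_gsync(render_ms):
    """Variable refresh: the display refreshes as soon as the frame is ready,
    limited only by the panel's maximum rate."""
    return max(render_ms, INTERVAL_MS)

for ms in (6.0, 7.1, 13.0, 16.7):  # hypothetical render times
    print(f"{ms:5.1f} ms render -> fixed: {onscreen_fixed(ms):6.2f} ms, "
          f"g-sync: {onscreen_gsync(ms):6.2f} ms")
```

The 7.1 ms case is the one above: fixed refresh holds the old frame for 13.89 ms, while G-Sync holds it for just 7.1 ms.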
 

Durante

Member
As a living-room PC gamer, this solves nothing for me :( Is there a technical reason why this couldn't be a box sitting between the monitor and PC rather than integrated inside the display?
Because then the box would still have to talk with the monitor using a traditional protocol.

The tech needs to be in the display.
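
A little sketch of why that's a dead end (all numbers assumed; the "box" here is hypothetical): whatever such a box did internally, its output side would still have to deliver frames on the fixed schedule of a traditional link, so the waiting would just move into the box.

```python
# Sketch of the point above: a hypothetical pass-through box with an
# assumed 60 Hz fixed-refresh link to the monitor re-quantizes frame
# times anyway, reintroducing the stutter/lag G-Sync removes.
import math

LINK_HZ = 60
SLOT_MS = 1000 / LINK_HZ         # fixed refresh slot on the monitor link

def box_output_time(arrival_ms):
    """The box can only forward a frame at the next fixed refresh slot."""
    return math.ceil(arrival_ms / SLOT_MS) * SLOT_MS

for t in (5.0, 21.0, 38.5):      # made-up frame arrival times, in ms
    print(f"frame arrives {t:5.1f} ms -> leaves box at {box_output_time(t):5.1f} ms")
```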
 

ToD_

Member
Oh really? I could have sworn I've heard an anecdote about Japanese players considering the console version incomplete because it can't replicate the frame issue present in the arcade version, which they're all used to after a decade of playing it.

The framerate I mentioned is simply what MAME uses. MAME previously used 60Hz, which users verified to be incorrect by recording actual CPS-3 hardware with a DV cam. I'm not sure how they determined that 59.633333Hz is correct, however; it could simply be an estimate. It would be pretty odd for CPS-3, or any other hardware designed for CRT displays, to have a variable framerate, though.
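
For a sense of scale, here's a quick sketch of why that small mismatch is even noticeable (assuming a plain 60Hz display with vsync on, which is my own example setup):

```python
GAME_HZ = 59.633333   # the rate MAME uses for CPS-3
DISPLAY_HZ = 60.0     # assumed fixed-refresh display with vsync on

# The display completes one extra refresh relative to the game this often,
# which shows up as a periodically duplicated frame (a visible hitch).
seconds_per_hiccup = 1 / (DISPLAY_HZ - GAME_HZ)
print(f"one duplicated frame every ~{seconds_per_hiccup:.2f} s "
      f"(every ~{seconds_per_hiccup * GAME_HZ:.0f} game frames)")
```

That's a duplicated frame roughly every 2.7 seconds, which is exactly the kind of thing a variable-refresh display would make a non-issue.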

Hopefully I won't have to think about this stuff anymore with G-Sync.
 
Because then the box would still have to talk with the monitor using a traditional protocol.

The tech needs to be in the display.

And just as with 120 Hz active 3D displays, and way before that high-definition displays, we're seeing the tech first in PC monitors.

It will quite possibly start showing up in TVs within a few years.
 

kartu

Banned
That's exactly what G-sync does. The all-caps PERFECT way.
The "changing frequence" part and "it doesn't work with all games" leaves some doubts about that.

I get what that means. I also get that this wouldn't have happened without Nvidia.
Progress happens, with this company or without it.

Because companies suck, standardization processes for hardware interconnects suck more and no one cares about enthusiasts.
Right, because you need a special PCI-E slot or power plug for AMD's GPUs and another one for nVidia's...
 

dark10x

Digital Foundry pixel pusher
Because then the box would still have to talk with the monitor using a traditional protocol.

The tech needs to be in the display.
That is precisely why this news has me bummed out. I love the idea, but it's nVidia, which means a focus on PC gaming on LCD monitors. We can only hope that this will push display manufacturers in some way, but I doubt it. It will be looked at as a PC thing and very likely won't end up in anything besides small PC LCD monitors. Would be a waste of a great thing.
 

Datschge

Member
Excellent technology, but...

Damn you nvidia for making this proprietary.

...without making it into a more generic extension of the HDMI standard that any monitor or TV manufacturer can implement, this will be the superior Betamax technology of 2013.

The way Nvidia handles such stuff (CUDA, 3D Vision etc.), they set the bar high within their tiny closed-off universe while the rest of the industry struggles to catch up in quality and adoption. Not really a nice outlook.
 
That is precisely why this news has me bummed out. I love the idea, but it's nVidia, which means a focus on PC gaming on LCD monitors. We can only hope that this will push display manufacturers in some way, but I doubt it. It will be looked at as a PC thing and very likely won't end up in anything besides small PC LCD monitors. Would be a waste of a great thing.

I could see SCEE pushing for Sony to put something like this in their TVs that would work with either a new revision of the PS4, or further down the line, the PS5.
 

Tain

Member
Usually when some hardware feature is proprietary I worry about a lack of software supporting it, but that isn't really an issue here.
 

Durante

Member
Progress happens, with this company or without it.
This would have made sense for over a decade. Why didn't it happen?


...without making it into a more generic extension of the HDMI standard that any monitor or TV manufacturer can implement, this will be the superior Betamax technology of 2013.
This makes no sense. When Betamax failed in the market, it rendered your investment useless, since it was a content standard. If you buy a G-sync monitor, it will work perfectly regardless of how many other people do so.


Usually when some hardware feature is proprietary I worry about a lack of software supporting it, but that isn't really an issue here.
Exactly.
 

J-Rzez

Member
Wow. OK. The 780 Ti was an interesting product unveiling and I'm excited to hear more about it, but Maxwell is still looming overhead, so that kind of killed my excitement a little. But this? This is incredibly interesting to me. I was hell-bent on the Rift, but now I'm not so sure, because the visual quality gained from this may trump the VR effect, unless they get the Rift's visual quality up to these standards.

Interesting times.
 

chaosblade

Unconfirmed Member
Of all the things to be proprietary. This is way bigger than stuff like Mantle, which is never going to see serious use on PC anyway.

I'll probably never use it unless it becomes standardized in monitors, and we all know that won't happen. Not spending extra on a monitor only to lock myself into NVidia cards.
 

Demon Ice

Banned
Holy shit. I will buy a monitor at whatever price Nvidia sets if it means I never again have to worry about tearing or vsync or input lag or triple buffering or anything. WOW.

This is fucking huge.


PLEASE tell me this is fully compatible with all video cards. If not, fuck it, I'll probably end up sticking with a Geforce GTX. WOW
 

AndyBNV

Nvidia
Holy shit. I will buy a monitor at whatever price Nvidia sets if it means I never again have to worry about tearing or vsync or input lag or triple buffering or anything. WOW.

This is fucking huge.


PLEASE tell me this is fully compatible with all video cards. If not, fuck it, I'll probably end up sticking with a Geforce GTX. WOW

G-SYNC monitors work with any GPU, but you need a 650 Ti Boost or better to use the G-SYNC functionality.


Off-the-shelf monitors are made and shipped by ASUS, BenQ, etc., so I cannot give you a precise timeframe.
 

dejan

Member
[image: Ice Cube "it was a good day" reaction]
 
G-SYNC monitors work with any GPU, but you need a 650 Ti Boost or better to use the G-SYNC functionality.

Wait a minute. My 570s in SLI aren't supported on my upgraded G-Sync monitor?! (DESPAIR)

That would put the upgrade pressure at near 1 million kilopascals.
 

AndyBNV

Nvidia
Wait a minute. My 570s in SLI aren't supported on my upgraded G-Sync monitor?! (DESPAIR)

That would put the upgrade pressure at near 1 million kilopascals.

Like ShadowPlay and the H.264 encoder on GTX 600 and 700 Series, G-SYNC requires GPU tech that we first integrated with Kepler.
 

Datschge

Member
This makes no sense. When Betamax failed in the market, it rendered your investment useless, since it was a content standard. If you buy a G-sync monitor, it will work perfectly regardless of how many other people do so.

Think again. Betamax worked perfectly fine on its own as well (recording videos), and it was the superior technology at its price right up until DVD-RAM finally replaced VHS (at which point HDD recorders were also feasible).
 

Tagyhag

Member
This is very interesting, but I'm not fully grasping it. I'll have to wait and see what people say about it once it's out, and whether there are any compatibility problems and the like.

So, will monitors still be classified as 60/120/144 Hz etc., or is that not needed anymore? I was thinking of getting a 120 Hz monitor but now I'm not so sure.
 

J-Rzez

Member
Nvidia FAQ said:
In addition to cutting-edge changes to the viewing experience, multiplayer gamers will receive a significant competitive advantage when G-SYNC is paired with a fast GeForce GTX GPU and low-lag input devices, something that’ll surely pique the interest of shooter aficionados. For eSports players, NVIDIA G-SYNC is an essential upgrade. With G-SYNC’s removal of input lag, successes and failures are squarely in the hands of players, differentiating the pros from the amateurs.

This thing is going to sell like crazy... I hope some VR headsets take advantage of this. I really wanted to go that route this time, but as much of a "true next-gen leap in gaming" as VR is, its visual quality needs to match this as well.
 

Nethaniah

Member
Off-the-shelf monitors are made and shipped by ASUS, BenQ, etc., so I cannot give you a precise timeframe.

Not sure you're able to answer this, but did this force some of those companies to make monitors they otherwise wouldn't have made? Especially given that a lot of the 4K monitors/TVs we've seen so far are 30 Hz, with some at 60 Hz, while if I understand correctly G-SYNC will allow even 4K monitors to go up to 144 Hz.

We haven't got a lot of 120 Hz monitors above 1080p, so this would be a huge jump.
 

Crisco

Banned
This is pretty awesome, even more so because it was demo'd on the monitor I own. Will definitely purchase the mod when it comes out, especially if it's $100 or less. That's basically impulse buy territory for what looks like a massive improvement in user experience.
 

Totobeni

An blind dancing ho
Hope they can expand this tech to HDTVs. It would be a great thing for those of us who switch back and forth.

Of the four partners so far, Philips is the only one that makes HDTVs, so there's a chance Philips will do it, but hopefully Nvidia can get more partners who can build this into their HDTVs.
 

Durante

Member
Think again. Betamax worked perfectly fine on its own as well (recording videos) and at that was the superior technology at its price until DVD-RAM finally replaced VHS (at which point HDD recorders were also feasible).
I'm thinking again and I still don't see it. When a content format fails on the market, at some point no further content will be released for it. If I have a G-sync monitor, it will always provide G-sync functionality, independent of how many other such monitors are sold.
 

Mogwai

Member
This is interesting. However, if you're required to buy an NVIDIA monitor, it's going to be too expensive compared to console gaming. Their cards are expensive already.

But the promise of no stutter and tearing sounds so good.
 