
G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

People stating that they're unhappy this is currently a proprietary thing should bear in mind that if one manufacturer hadn't actually stepped up and put money into this technology (which has been possible in theory for quite a while now), we might have ended up waiting far longer for it than necessary.

The point is that if this catches on, the tech is now presumably going to be pushed by lots of different manufacturers over the next 5 or so years in competition.

Yes. That said, I'm worried that for the next few years we're going to get a Wild West scenario similar to the SLI wars. It used to be that you had to check whether your motherboard supported CrossFire or nVidia's SLI if you wanted to take advantage of either. Now, assuming AMD (and Intel) hop on board and there's no open standard, you'll also have to check whether your monitor supports G-Sync, AMD's solution, or Intel's solution. Considering the initial cost of the extra silicon, no manufacturer is going to build more than one solution into their panels. Have fun being locked into one vendor's cards for a few years!
 

d0g_bear

Member
nvidia lists Windows 7/8/8.1 as required for this to work. Is there a technical reason Linux wouldn't work? I'm thinking of SteamOS in particular (yes, on a monitor).
 

Nethaniah

Member
nvidia lists Windows 7/8/8.1 as required for this to work. Is there a technical reason Linux wouldn't work? I'm thinking of SteamOS in particular (yes, on a monitor).

Maybe they just aren't working on making this compatible with the Linux drivers yet.

Absolutely 0 interest until they start building monitors with this tech that are IPS displays and at least 1440p. Until then it is DOA for me; I'm never going back to 1080p, and you couldn't pay me to own another TN panel :).

Well, they are going to release 4K monitors with this, so that's one down; let's hope for the other.
 

jrcbandit

Member
Absolutely 0 interest until they start building monitors with this tech that are IPS displays and at least 1440p. Until then it is DOA for me; I'm never going back to 1080p, and you couldn't pay me to own another TN panel :). Did they give any info on when we'll see better/larger displays than the typical 1080p TN panels?
 

Minsc

Gold Member
Seems like a perfect fit for the whole improved-latency thing Valve's going for with SteamOS. I'd be surprised if it didn't find its way in there sooner or later.
 
When did tearing become such a big deal?

I would say it has always been a big deal; the input latency from v-sync just made that fix not an option, so tearing was something people dealt with. This sounds like a decent fix for it, finally. If it also reduces the appearance of stuttering, that's great too, though I don't think it will eliminate stuttering entirely.
 

Miguel81

Member
Every time I think I'll go AMD, Nvidia pulls me back in.

nvidia-2.jpg
al_pacino_godfather_3_pull_me_back_in.gif
 

Daingurse

Member
Wow, this is pretty fucking awesome! No idea whether my monitor is moddable, but I know exactly what to wait for now! Guess I'll be sticking with the Green Team for the foreseeable future.
 
"I know you guys are content with it becoming a niche thing for enthusiasts, but this is a big deal if it works out and it's a bloody shame that it most likely won't reach the vast majority of gamers."


Why wouldn't it reach the vast majority of gamers?

Because it's proprietary? Monitor manufacturers will need to come up with a standard that works with both GPU vendors, and maybe even Intel's integrated graphics, for it to reach the vast majority of gamers.
 

artist

Banned
Limited to... the 62% of people (give or take) who buy their discrete GPUs from nVidia? Proprietary tech hasn't stopped people from building monitors that support Lightboost and this is going to be a noticeably bigger deal than that.
It's actually much, much lower than 62%.

This is specific to Kepler, and even then limited to the 650 Ti Boost and above. I know there was a similar discussion in another thread (wrt Mantle), and Mantle works with the entire GCN architecture. I wish Nvidia dropped a few more details on the requirements for G-SYNC: is it a dedicated IP block, a DSP, or something else present in GK106 and above? Why not extend the goodness to the rest of the lower-end GPUs? I would love, really love, for this to go mainstream, ideally with as few (artificial) constraints as possible.
 

Reallink

Member
Because it's proprietary? Monitor manufacturers will need to come up with a standard that works with both GPU vendors, and maybe even Intel's integrated graphics, for it to reach the vast majority of gamers.

If Valve has their way, will the majority of PC gamers even be playing on "monitors" in the first place?
 

Hazaro

relies on auto-aim
Too bad I missed posting earlier, but this is seriously fantastic.

As ghst, Durante, artist, and others have already pointed out, this is a fantastic FIRST leap forward for smoother games at a lower performance cost. It remains to be seen how much input lag this induces versus no v-sync, but I'm very hopeful it'll work nicely with 120Hz / 144Hz panels.

There is no reason others can't look to integrate this tech, get the costs down, and possibly even make it standard on most mid-to-high-end panels in just a few years.
For that, I award nVidia a Flash .gif

hfAUReF.gif
 

coldfoot

Banned
New 4K TVs look like the best fit for this tech, as an extra $100 doesn't mean much at those prices.
I really wish there were a way to put this into consoles, either via a firmware update or a hardware revision with a different HDMI chip. The PS4 is powerful enough for 30 fps but not for 60 with all the bling, so 45 fps with full bling would be perfect for it.
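That 45 fps case is exactly where a fixed 60 Hz refresh hurts. With v-sync, each finished frame waits for the next refresh tick, so a steady 45 fps comes out as an uneven 1-1-2 tick cadence, while a variable-refresh panel shows every frame for the same interval. A small sketch (illustrative only; the helper functions and numbers are my own, not from any NVIDIA material):

```python
RENDER_FPS = 45      # assumed steady render rate
FIXED_HZ = 60        # conventional fixed monitor refresh

def fixed_refresh_gaps(n_frames):
    """V-sync on a fixed panel: each finished frame appears at the next
    refresh tick, so 45 fps becomes an uneven 1-1-2 tick cadence (judder)."""
    # Refresh-tick index at which frame i appears (exact integer ceiling).
    ticks = [-(-(i + 1) * FIXED_HZ // RENDER_FPS) for i in range(n_frames)]
    return [round((b - a) * 1000 / FIXED_HZ, 1) for a, b in zip(ticks, ticks[1:])]

def variable_refresh_gaps(n_frames):
    """Variable refresh: the panel updates the moment each frame is ready,
    so every on-screen gap equals the render interval."""
    times = [(i + 1) / RENDER_FPS for i in range(n_frames)]
    return [round((b - a) * 1000, 1) for a, b in zip(times, times[1:])]

print("fixed 60 Hz gaps (ms):", fixed_refresh_gaps(8))
print("variable gaps (ms):   ", variable_refresh_gaps(8))
```

The fixed-refresh list alternates 16.7 ms and 33.3 ms holds, which is the stutter people perceive; the variable-refresh list is a uniform 22.2 ms.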
 

Hip Hop

Member
This is literally one of the biggest revolutions I've seen recently.

When I first got into PC gaming last year, after building my rig, I literally thought something was wrong with my GPU or CPU. I even exchanged my motherboard at one point.

It turned out the refresh rate was the problem. That is how bad 60Hz looks when the frame rate goes above 60FPS. I couldn't believe my eyes; I had never experienced anything like it, since I had never been a PC gamer before.

I literally had no clue until someone suggested the refresh rate. Got myself a 144Hz monitor and the difference is night and day. It's like you're playing on the wrong monitor when you use a 60Hz one. 120Hz and above is a must for a PC gamer.

It's like the old debate about whether people can distinguish 30 vs 60 fps: the difference between 60Hz and 120Hz is just as plainly noticeable as a varying FPS is.

I can see that screen tearing is still present on a 144Hz monitor. It's not as horrific as on a 60Hz monitor, but it is still there. If G-Sync works the way it has been presented, it will be one of the greatest recent achievements in PC gaming.
 

Reallink

Member
I doubt Valve has any plan to make people switch to TVs; they wouldn't even be able to, I reckon.

Valve doesn't want people to move from monitors to TV, they want people who are already on TVs to move to their Steamboxes.

Sorry if it wasn't clear; that wasn't what I was implying. Operating under the assumption that their goal is expanding PC gaming to the console/living-room masses, the majority could then conceivably be on TVs.
 

Hip Hop

Member
When did tearing become such a big deal?

Ever since frame rates started going above the refresh rate.


I literally did not want to use my gaming computer because of the tearing. It was so bad it made a 90FPS game look like sub-par 30FPS. I didn't know it was all about the refresh rate, as I had only just gotten into PC gaming.

Got myself a 144Hz monitor and that almost completely fixed the issue. Screen tearing doesn't happen on as big a scale as with a 60Hz monitor, but it is still there. More than playable at 144Hz, but it could still be perfected. Hope G-Sync delivers that.
 

Eusis

Member
Ever since frame rates started going above the refresh rate.


I literally did not want to use my gaming computer because of the tearing. It was so bad it made a 90FPS game look like sub-par 30FPS. I didn't know it was all about the refresh rate, as I had only just gotten into PC gaming.

Got myself a 144Hz monitor and that almost completely fixed the issue. Screen tearing doesn't happen on as big a scale as with a 60Hz monitor, but it is still there. More than playable at 144Hz, but it could still be perfected. Hope G-Sync delivers that.
Any reason you couldn't use v-sync? It's not a perfect solution, which is why we're getting this, but it DOES eliminate tearing.
 
"Any reason you couldn't use v-sync? It's not a perfect solution, which is why we're getting this, but it DOES eliminate tearing."


I dunno why he didn't enable v-sync, but the difference between 60 and 120/144 is as big as the difference between 30 and 60.
 

Eusis

Member
"Any reason you couldn't use v-sync? It's not a perfect solution, which is why we're getting this, but it DOES eliminate tearing."


I dunno why he didn't enable v-sync, but the difference between 60 and 120/144 is as big as the difference between 30 and 60.
Oh, I think I'd rather have tearing at 120/144Hz than at 60Hz, but it's still worth bringing up, since v-sync DOES eliminate tearing and is effectively a near-perfect solution for games that don't dip below 60 FPS on your setup anyway.
 

Hip Hop

Member
Any reason you couldn't use v-sync? It's not a perfect solution, which is why we're getting this, but it DOES eliminate tearing.

I really dislike the input lag. For games that require precision, like BF3 and Counter-Strike: GO, v-sync was not an option. There were a few games where it worked great, but at other times the stutter would occur. You're right, though; sometimes I just had to make do with v-sync. It was never something I enjoyed using. The choice was either input lag or screen tearing. Tough decision. Hope better solutions get implemented in the future; G-Sync is a start.
 
Too bad I missed posting earlier, but this is seriously fantastic.

As ghst, Durante, artist, and others have already pointed out, this is a fantastic FIRST leap forward for smoother games at a lower performance cost. It remains to be seen how much input lag this induces versus no v-sync, but I'm very hopeful it'll work nicely with 120Hz / 144Hz panels.

There is no reason others can't look to integrate this tech, get the costs down, and possibly even make it standard on most mid-to-high-end panels in just a few years.
For that, I award nVidia a Flash .gif

hfAUReF.gif

Blame LCD technology.

CRT/OLED: what's input lag?

Nvidia has nothing to stay relevant with for the next few years, so they throw out a placebo effect.

Yes, I want an OLED monitor to replace my zero-problems CRT.

Backlight bleeding, IPS glow, input lag, multi-ms response times: what the hell is all this? LCD is the unkillable cockroach of panel technology.

I play at 85Hz with vertical sync enabled and I don't notice input lag. Year-2000 tech is so awesome...
 

Alo81

Low Poly Gynecologist
Why not extend the goodness to the rest of the lower-end GPUs? I would love, really love, for this to go mainstream, ideally with as few (artificial) constraints as possible.

According to Nvidia's Andy, the functionality depends on something introduced in the Kepler cards, which of course the 5xx series and below aren't sporting.
 

Rubius

Member
So, the end of ghosting? For some reason I can see ghosting on every monitor, even the high-performance 120Hz stuff.
 
I just hate that SED/FED never got an actual chance and we ended up with this crappy LCD tech, where every option has its share of defects.

So, the end of ghosting? For some reason I can see ghosting on every monitor, even the high-performance 120Hz stuff.

With LCD tech, you'll never get rid of that.

CRT/Plasma/OLED is the way to go.

Super-fast TN LCD: 1-2ms grey-to-grey
Panasonic Plasma: 0.1ms black-to-black
OLED: somewhere in between
CRT: what's a ms?

http://www.youtube.com/watch?v=hxv7mmKHRhs
Don't watch if you use an LCD monitor (it won't render the SED example accurately); sorry that you buy obsolete technology, even if it's "new".
 
Let me get this straight: Does this somehow mean that if a game is running at 60 frames per second, and then it drops to 40 frames per second, or even as low as 25FPS, we won't notice due to G-sync, because the GPU render rate and the monitor refresh rate will be practically identical, making the game, as well as input response still feel flawless?

Is that what this is? Because if so, that's fucking unbelievable.
 
Let me get this straight: Does this somehow mean that if a game is running at 60 frames per second, and then it drops to 40 frames per second, or even as low as 25FPS, we won't notice due to G-sync, because the GPU render rate and the monitor refresh rate will be practically identical, making the game, as well as input response still feel flawless?

Is that what this is? Because if so, that's fucking unbelievable.

You will still notice the frame-rate difference; you just won't experience tearing and lag.
 

Tain

Member
This tech would also benefit CRT displays if any had it. I don't know why multiple posters assume otherwise.

Vsync stutter, vsync lag, and screen tearing all happen on CRTs.
 
Let me get this straight: Does this somehow mean that if a game is running at 60 frames per second, and then it drops to 40 frames per second, or even as low as 25FPS, we won't notice due to G-sync, because the GPU render rate and the monitor refresh rate will be practically identical, making the game, as well as input response still feel flawless?

Is that what this is? Because if so, that's fucking unbelievable.

If you notice changes in framerate now, you'll still notice them. The benefit of G-Sync is that your input stays synced with the framerate, so frame drops won't affect your input, which is the real problem behind what gets called "stuttering".

Adaptive v-sync is an easy way to think of it.
 
Tested had a much better explanation than I could articulate:

Because monitors refresh at 60Hz, there's typically a trade-off involved in using V-sync. It eliminates screen tearing, but you have to be able to render at a rock-solid 60 fps or above, or you get stutter. Without V-sync, you get screen tearing, because the monitor and GPU may be updating the frame at different times. With G-Sync, an Nvidia Kepler GPU will now control the timing of frame updates.
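The "updating the frame at different times" part is what produces the visible tear line: scanout is still marching down the screen when the new frame swaps in. A toy model of where the tear lands (illustrative only; the 90 fps figure and the `tear_line` helper are my own assumptions, not anything from NVIDIA or Tested):

```python
# Toy model: without v-sync, a buffer swap lands mid-scanout, and the
# scanline being drawn at that instant becomes the visible tear line.
from fractions import Fraction  # exact rationals avoid float rounding here

SCANOUT_MS = Fraction(1000, 60)  # a 60 Hz panel scans out one frame in ~16.7 ms
LINES = 1080                     # vertical resolution of the panel

def tear_line(swap_ms):
    """Scanline the panel is drawing at the moment of the buffer swap."""
    phase = (Fraction(swap_ms) % SCANOUT_MS) / SCANOUT_MS
    return int(phase * LINES)

# A game rendering a steady 90 fps swaps every ~11.1 ms, so the tear
# cycles through a few fixed positions; with real-world frame-time
# jitter it wanders around the screen instead.
positions = [tear_line(Fraction(1000, 90) * i) for i in range(1, 5)]
print(positions)
```

With exact 90 fps on an exact 60 Hz scanout, the tear repeats a three-position cycle (720, 360, 0); a variable-refresh panel avoids the problem entirely by starting scanout only when a complete frame is ready.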
 

Arulan

Member
This is specific to Kepler, and even then limited to the 650 Ti Boost and above. I know there was a similar discussion in another thread (wrt Mantle), and Mantle works with the entire GCN architecture. I wish Nvidia dropped a few more details on the requirements for G-SYNC: is it a dedicated IP block, a DSP, or something else present in GK106 and above?

It may have something to do with the technology in Kepler they used to enable Adaptive V-sync. This however is purely speculation on my part.
 

HariKari

Member
It may have something to do with the technology in Kepler they used to enable Adaptive V-sync. This however is purely speculation on my part.

Adaptive V-sync works with other cards, like Fermi. Just marketing on their part. Also, this is nothing like adaptive V-sync, which is essentially V-sync that turns itself on over 60fps. G-Sync is far better than that.
 

Arulan

Member
Adaptive V-sync works with other cards, like Fermi. Just marketing on their part. Also, this is nothing like adaptive V-sync, which is essentially V-sync that turns itself on over 60fps. G-Sync is far better than that.

I was simply speculating that there's some type of handshaking in Kepler that they can point the G-Sync module to.

I'm more curious about what specifics the monitor needs in order to be compatible with this module.
 