
G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

Morzak

Member
Wait, so G-sync can be faster than no V-sync at all? Huh.

Is there any word on whether DIY modules are being attempted for other monitors? I'm sporting a BenQ XL2420TX and would love to buy and add a module if that ever becomes possible.

It makes sense though, as long as the G-Sync module doesn't introduce more input lag than the original scaler. The solution simply matches the monitor's refresh rate to your FPS, so there is no additional waiting for a frame to be picked up.

As for the second question, I asked the same thing; they don't know yet.

http://www.neogaf.com/forum/showpost.php?p=86539078&postcount=1183
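(To put the input-lag point in numbers: a minimal Python sketch, my own illustration with an idealised display and made-up frame times, not anything from NVIDIA, comparing the worst-case wait between a frame finishing and being scanned out under fixed 60Hz V-Sync versus a variable-refresh scheme.)

    # Illustrative only: worst-case scanout delay, fixed refresh vs. variable.
    REFRESH_MS = 1000.0 / 60.0  # one fixed 60Hz refresh interval, ~16.7 ms

    def vsync_wait_ms(finish_ms: float) -> float:
        """V-Sync: a finished frame waits for the next fixed refresh tick."""
        next_tick = (finish_ms // REFRESH_MS + 1) * REFRESH_MS
        return next_tick - finish_ms

    def gsync_wait_ms(finish_ms: float) -> float:
        """Variable refresh: the monitor refreshes when the frame is ready."""
        return 0.0  # idealised; the real module still needs scanout time

    # A frame finishing 1 ms after a refresh tick waits almost a full cycle:
    print(round(vsync_wait_ms(17.7), 1))  # 15.6 ms added latency with V-Sync
    print(gsync_wait_ms(17.7))            # 0.0 with variable refresh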
 

AndyBNV

Nvidia
Andy, are you guys working with BenQ on a similar DIY mod, or is it just Asus at the moment?

We are working with ASUS, BenQ, Philips and ViewSonic on off-the-shelf G-SYNC monitors, so there is a possibility that there'll be mods for their existing models also. That is pure speculation on my part, so please do not count on that happening. Presumably, it'd be more likely to happen if people expressed their interest directly to the companies in question.

You can learn a bit more about the tech in our FAQ: http://www.geforce.com/hardware/technology/g-sync/faq (stuff not mentioned on the stream or in the article)
 
Let's hope Nvidia will open up their standard, or merge it with the inevitable AMD solutions.



It can be open and profitable at the same time.

And? It could be closed and be profitable as well, perhaps even more so. What's the fuss? It's a calculated business move, which many businesses do, and they get shit on because they beat everyone to the punch and decide to control and protect it? I find the hate unfounded.
 
And? It could be closed and be profitable as well, perhaps even more so. What's the fuss? It's a calculated business move, which many businesses do, and they get shit on because they beat everyone to the punch and decide to control and protect it? I find the hate unfounded.
The big picture here is PC gaming as a whole. And Nvidia might be making short-term gains at the cost of a long-term bust. Someone, somewhere at Nvidia must remember their SGI roots.

edit: I mean, just look at how they're diversifying their range of products... fucking Shield.
 

Totobeni

An blind dancing ho
VG248E will be G-Sync enabled next year for $399. I'm hearing that NVIDIA wants to try and get the module down to below $100 eventually.

I don't think I'll be able to wait for cheaper monitors; I will get the first Asus G-Sync monitor. $399 is a-ok, and I don't really care since it solves the evil stuttering I've had issues with since forever.
 

Rur0ni

Member
We are working with ASUS, BenQ, Philips and ViewSonic on off-the-shelf G-SYNC monitors, so there is a possibility that there'll be mods for their existing models also. That is pure speculation on my part, so please do not count on that happening. Presumably, it'd be more likely to happen if people expressed their interest directly to the companies in question.

You can learn a bit more about the tech in our FAQ: http://www.geforce.com/hardware/technology/g-sync/faq (stuff not mentioned on the stream or in the article)
From your link:

Q: How much more does G-SYNC add to the cost of a monitor?
A: The NVIDIA G-SYNC Do-it-yourself kit will cost approximately $175.
Higher than I expected, guys.
 

Qassim

Member
This sounds great. It kicks up a similar conflict I have with Mantle, though. It's a great innovation by Nvidia, but if it does work as it's supposed to, then it could change the industry - AMD and Intel aren't going to leave that unanswered.

This is potentially a bit more annoying because:

1) We'll have AMD-supported and NVIDIA-supported monitors? AMD will answer this with their own solution, perhaps Intel too (although their graphics generally address a different market). Perhaps most monitors will have both the hardware required for G-Sync and an AMD solution; if so:

2) Would that drive up prices? Perhaps for a while, but I suspect this hardware will come down in price fairly quickly. But even then:

3) This has a potential lock-out effect on any newcomers to the market. It seems very unlikely at the moment that anyone would attempt to break into the PC graphics market, at least at the end we're interested in, but it's an uncomfortable thought regardless.

I'm super excited for this, I have a 780, and plan to go Nvidia for my next lot too (after 4 years with AMD and bad experiences prior to this upgrade cycle), but I'm slightly concerned about the effect of this sort of stuff on the industry - fragmentation hurts PC gaming; we need vendor-neutral solutions.

It's really difficult, because who will develop these vendor-neutral solutions? Companies like Valve, Oculus, etc. seem like the best candidates, but I'm not sure how set up they are to do that sort of stuff.

NVIDIA and AMD aren't going to spend money developing stuff for the "good of the industry" if it isn't going to drive people to buy their cards as a result. Completely understandable and justifiable, especially in cases like this - it isn't as if they're extending existing functionality; this is brand-new hardware that is required for this to work.
 
You can learn a bit more about the tech in our FAQ: http://www.geforce.com/hardware/technology/g-sync/faq (stuff not mentioned on the stream or in the article)

Oooh $175 is pricey.


Also this bit is interesting

Q: Does NVIDIA G-SYNC work for all games?

A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver.

I would be interested to know what these games are and what the issues are. Frame rate depends a lot on the engine the game is running on, and it's not always GPU performance that ends up limiting it.
 
And? It could be closed and be profitable as well, perhaps even more so. What's the fuss? It's a calculated business move, which many businesses do, and they get shit on because they beat everyone to the punch and decide to control and protect it? I find the hate unfounded.

I'm not necessarily angry, but from a consumer point of view it's obviously better for these innovations not to be locked down to very specific hardware.
 

Nethaniah

Member
I would be interested to know what these games are and the issues are. Frame rate depends a lot on the engine the game is running on and it's not always the GPU performance that ends up limiting it.

Probably older games that freak the fuck out when you run them above a certain framerate or on newer hardware.
 

AndyBNV

Nvidia
From your link:

Higher than I expected guys.

Pricing isn't final, and we hope to drive it down over time. The price stated on AnandTech, via our staff in Montreal, is $130. Going by ASUS's quoted $399 price for an off-the-shelf modded monitor, and the current cost of the non-modded version, off the shelf appears to be the cheapest route right now.
 

Rur0ni

Member
Pricing isn't final, and we hope to drive it down over time. The price stated on AnandTech, via our staff in Montreal, is $130. Going by ASUS's quoted $399 price for an off-the-shelf modded monitor, and the current cost of the non-modded version, off the shelf appears to be the cheapest route right now.
Ah I see. $130 was definitely what I thought it would come in at.
 

ToD_

Member
Is the refresh rate then completely unlimited? What about older games where you can push it to 300 and beyond? It'd be super cool if that were the case.

No, not unlimited. From the AnandTech article below:

"NVIDIA demonstrated the technology on 144Hz ASUS panels, which obviously caps the max GPU present rate at 144 fps although that's not a limit of G-Sync. There's a lower bound of 30Hz as well, since anything below that and you'll begin to run into issues with flickering. If the frame rate drops below 30 fps, the display will present duplicates of each frame."

http://www.anandtech.com/show/7436/nvidias-gsync-attempting-to-revolutionize-gaming-via-smoothness
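(A rough sketch of those bounds as AnandTech describes them; my own illustration in Python, not actual module logic. The panel flickers if a frame is held longer than ~33 ms, so below 30 fps the module re-presents the same frame.)

    import math

    # Illustrative only: the 30Hz floor / 144Hz ceiling described above.
    MAX_FRAME_MS = 1000.0 / 30.0   # ~33.3 ms: longest a frame can be held
    MIN_FRAME_MS = 1000.0 / 144.0  # ~6.94 ms: fastest the panel can refresh

    def presentations_per_frame(frame_time_ms: float) -> int:
        """How many times one rendered frame gets drawn to avoid flicker."""
        if frame_time_ms <= MAX_FRAME_MS:
            return 1  # normal case: one refresh per rendered frame
        # Below 30 fps the same frame is re-presented until the next one
        # arrives, e.g. a 40 ms frame (25 fps) is drawn twice.
        return math.ceil(frame_time_ms / MAX_FRAME_MS)

    print(presentations_per_frame(40.0))  # 2 -> duplicated, as quoted above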
 
TLDR explanation:

Instead of the monitor refreshing its frames (aka Hz) irrespective of how fast/slow the GPU is rendering frames, G-Sync fully synchronises a dynamic display refresh rate with the GPU's frame rendering.

When GPU frame rendering and monitor refreshes are not synchronised, games with fluctuating framerates suffer from issues like tearing.

In the past, the solution to this problem was V-Sync, which forced the GPU to hold each completed frame until the monitor's next refresh, but this causes stuttering and lag because the monitor's refresh is faster than the GPU's frame rendering.

G-Sync is a hardware-level synchronisation of GPU frame rendering and monitor frame refreshing, meaning the monitor's Hz/refresh rate changes dynamically depending on the GPU's rate of frame rendering. You get each frame fully rendered (no tearing), and when the framerate drops the monitor doesn't have to fight it with a locked refresh rate, instead changing its refresh rate to fit the framerate (eliminating stuttering and input lag).
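(A toy simulation of that difference, with made-up frame times; illustrative Python, not vendor code. Under fixed 60Hz V-Sync, on-screen frame durations snap to multiples of 16.7 ms; with variable refresh they track the render times directly.)

    import math

    # Illustrative only: how long each frame stays on screen in each model.
    REFRESH_MS = 1000.0 / 60.0            # fixed 60Hz refresh interval
    render_ms = [12.0, 18.0, 15.0, 22.0]  # hypothetical GPU frame times

    # V-Sync: each frame occupies a whole number of refresh intervals.
    vsync = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in render_ms]

    # G-Sync-style: the panel refreshes when the frame is ready, so display
    # time simply equals render time (within the panel's supported range).
    gsync = list(render_ms)

    print([round(t, 1) for t in vsync])  # [16.7, 33.3, 16.7, 33.3] -> uneven
    print(gsync)                         # [12.0, 18.0, 15.0, 22.0] -> smooth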

Thanks for the explanation. Will this get messed up if you have multiple monitors up?
 

Orayn

Member
Also, holy hell, I didn't realize how amazing this would be for emulation. Somebody get Byuu on the line!

What is 'stutter' compared to an unstable, low framerate?

Repeated frames, basically. If some frames take even a tiny bit longer to render than the time it takes for your display to refresh, you miss the window of opportunity and have to wait for the monitor to refresh again to display the new frame. If this happens consistently, it can make a game feel really jerky and not smooth even if the average framerate is good.
 

kartu

Banned
Why refresh the screen independently at all, then? Isn't THE PERFECT way to render frames as soon as they are available?

Proprietary? What the fuck?

Hoping another develops the same tech and undercuts them.

nVidia style.
Looks like most gafers don't get what that means.
 

zou

Member
Hopefully they'll offer some decent monitors, none of those crappy 24" TN panels with HD only. I'm not gonna hold my breath, but hopefully they'll have some non-widescreen ones too.

20" 1600x1200 IPS and I could finally retire my Dell monitors.
 

galvatron

Member
Emulation could benefit tremendously if G-Sync is controllable at a driver level, like OpenGL. All odd-refresh-rate emulators could, in theory, replicate their true, original refresh rates.

I'd like to see support for this in MAME. Tons of arcade games run at odd refresh rates.

A lot of them already can with an RGB monitor and some custom firmware like ArcadeVGA, right? The MAMERes tool is also good for getting MAME games to use the proper refresh rates, or very close to them.

Groovy tech for games moving forward, and for a future where analog displays will be hard to get.
 

UrbanRats

Member
I'm not sure I understand the full ins and outs of this innovation, but tearing is the most annoying shit, so if this is a better and more efficient way to deal with it, I'm a happy guy!
 

Rolf NB

Member
Why did this take so long? Ever since TVs started specifically supporting 24p, getting rid of constant refresh cycles altogether seemed like an inevitability.
 

Ocho

Member
Why would I need G-Sync if my monitor is running at 144Hz and newer games usually net me between 60 and 100 frames? Isn't my monitor refreshing faster than the rendering, so it has almost no tearing?
 

Alo81

Low Poly Gynecologist
Is the refresh rate then completely unlimited? What about older games where you can push it to 300 and beyond? It'd be super cool if that were the case.

The upper bound is still your monitor's max refresh rate; the big benefit seems to be that you're not just limited to even divisors of it if you want it to look good.

Rather than 60 and 30 being the two most viable options, 60 and everything below (or 120 and below for 120Hz monitors) are all equally viable.
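(The divisor point in numbers, assuming a steady framerate with V-Sync on; a minimal sketch of my own, not anything official.)

    # Illustrative only: steady framerates a fixed 60Hz panel can present
    # evenly with V-Sync, holding each frame for a whole number of refreshes.
    MAX_HZ = 60
    viable = [MAX_HZ // n for n in range(1, 5)]  # hold each frame n refreshes
    print(viable)  # [60, 30, 20, 15] -> big jumps; anything between judders

    # With variable refresh there is no such quantisation: any rate between
    # the panel's floor (~30Hz) and its cap (60/120/144Hz) displays evenly.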
 
Why did this take so long? Ever since TVs started specifically supporting 24p, getting rid of constant refresh cycles altogether seemed like an inevitability.
Race to the bottom price-wise, with ever-increasing display sizes. I guess it was never marketable enough?
 

Orayn

Member
Why refresh the screen independently at all, then? Isn't THE PERFECT way to render frames as soon as they are available?

My new understanding is that it does render frames as soon as they're available, by refreshing the screen on demand.

e.g. if you have three sequential frames that took 10, 15, and 20 ms to render, the refresh "rate" effectively goes from 100 to 66.67 to 50 Hz as they're displayed.

This works up to the monitor's maximum refresh rate (144 Hz here, i.e. 6.944 ms per refresh), at which point it behaves like regular V-Sync.
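(Checking those numbers: the instantaneous refresh rate is just the reciprocal of the frame time, capped at the panel's maximum. A quick sketch:)

    # Illustrative only: frame time -> instantaneous refresh rate, 144Hz cap.
    PANEL_CAP_HZ = 144.0

    def effective_hz(frame_ms: float) -> float:
        return min(1000.0 / frame_ms, PANEL_CAP_HZ)

    for ms in (10.0, 15.0, 20.0, 5.0):
        print(ms, "->", round(effective_hz(ms), 2), "Hz")
    # 10 -> 100.0, 15 -> 66.67, 20 -> 50.0; a 5 ms frame is capped at 144.0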
 

ToD_

Member
A lot of them already can with an RGB monitor and some custom firmware like ArcadeVGA, right? The MAMERes tool is also good for getting arcade games to use the proper refresh rates, or very close to them.

Groovy tech for games moving forward and the future where analog displays will be hard to get.

Yes, this has been possible with CRTs for a long time. I used to do it years ago with PowerStrip. I felt it was quite a hassle, though.

I'm hopeful this will be the beginning of flexible/variable refresh rates. Like you said, especially important now with analog displays becoming a thing of the past.
 

Nokterian

Member
Why would I need G-Sync if my monitor is running at 144Hz and newer games usually net me between 60 and 100 frames? Isn't my monitor refreshing faster than the rendering, so it has almost no tearing?

It removes stuttering, lag, and screen tearing, since the monitor speaks directly to the GPU instead of relying on V-Sync or triple buffering.
 

Baleoce

Member
CPS-3 games, to my knowledge, run at a constant 59.633333Hz.

Oh really? I could have sworn I've heard an anecdote about Japanese players considering the console version incomplete because it's incapable of replicating a frame quirk present in the arcade version, which they're all used to after a decade of playing it.
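(For anyone wondering why that fractional rate matters: a back-of-the-envelope estimate, assuming an idealised fixed 60Hz panel, of how often a 59.633Hz source has to repeat a frame. Variable refresh would avoid this entirely by running the panel at 59.633Hz.)

    # Back-of-the-envelope: a 59.633Hz source on a fixed 60Hz panel.
    source_hz = 59.633333  # CPS-3 rate quoted above
    panel_hz = 60.0

    surplus = panel_hz - source_hz  # refreshes per second with no new frame
    print(round(1.0 / surplus, 1))  # ~2.7 s between duplicated frames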
 