
G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

I'm trying to get how this affects me. Does that mean I can uninstall d3doverrider? None of that stuff's been much of a problem for me (although I do notice some stuttering sometimes, always thought it was all in my head).

To give an example, you could run the BF4 beta without any stuttering or tearing below 60fps.
 

RulkezX

Member
It's probably better for the average dude, assuming you're talking an average dude with the same eye for detail but a smaller wallet. Most of the excesses of enthusiast PC gaming are based on pegging 60 (or 120) in worst-case scenarios to avoid all of the equally bad solutions to framerate drops; this changes the ugly-ass jerkiness, input lag, and cascading wasted frames that you get on a midrange rig into a slight loss of temporal resolution.

I haven't gone green since my TNT2 because of that midrange weakness every time I've upgraded, but I'm pretty sure it's time now.

What I meant (and no disrespect to the hardcore) is: how much does it matter to those who don't freak out when they can't run a game at 30/60, or declare games unplayable over the most minor of things?

Eliminating something as minor as screen tearing (something I very rarely even see) doesn't seem to be a big enough deal to warrant some of the reactions on page 1.

I still don't understand what it would mean for my upper mid range PC when the things it's meant to eliminate have never been that much of an issue.

Input lag etc. is already pretty minimal on a PC, is it not?

To give an example, you could run the BF4 beta without any stuttering or tearing below 60fps.

What stuttering and tearing *confused*
 
It's probably better for the average dude, assuming you're talking an average dude with the same eye for detail but a smaller wallet. Most of the excesses of enthusiast PC gaming are based on pegging 60 (or 120) in worst-case scenarios to avoid all of the equally bad solutions to framerate drops; this changes the ugly-ass jerkiness, input lag, and cascading wasted frames that you get on a midrange rig into a slight loss of temporal resolution.

I hadn't even thought about that. This actually makes PC gaming even more desirable, because yeah, previously I'd never have taken a racing game running at 40 fps over a locked 30. so I'd have throttled it to 30 and not been getting the full value out of my hardware.

now, there's no reason to lock to 30, ever. it'll be smooth at any framerate. so you'll always enjoy the full power of your system, and being sub 60 is going to be much more tolerable, so you'll get a longer lifetime out of your hardware too.
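a rough numeric sketch of that 40-vs-locked-30 trade-off (assumed numbers, not measurements from any real panel): a steady 40 fps game on a fixed 60 Hz vsync'd display alternates between one-tick and two-tick frames, which is the judder that used to make people lock to 30 instead.

```python
import math

REFRESH = 1 / 60   # fixed panel refresh interval (s), assumed 60 Hz
RENDER = 1 / 40    # steady 40 fps render time per frame (s)

def vsync_display_times(n_frames):
    """Each frame appears at the first refresh tick after it finishes."""
    times, t = [], 0.0
    for _ in range(n_frames):
        t += RENDER                            # frame finishes rendering
        ticks = math.ceil(t / REFRESH - 1e-9)  # wait for the next tick
        times.append(ticks * REFRESH)
    return times

def gsync_display_times(n_frames):
    """The panel refreshes the moment each frame is ready."""
    return [(i + 1) * RENDER for i in range(n_frames)]

v = vsync_display_times(6)
g = gsync_display_times(6)
v_gaps = [round(b - a, 4) for a, b in zip(v, v[1:])]
g_gaps = [round(b - a, 4) for a, b in zip(g, g[1:])]
print("vsync gaps: ", v_gaps)   # alternating ~16.7 ms / ~33.3 ms: judder
print("g-sync gaps:", g_gaps)   # uniform 25 ms: smooth 40 fps
```

same average framerate, but only the variable-refresh case delivers it as evenly spaced frames.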

hype hype hype.
 

Mandoric

Banned
So does the TV do the vsync on its own? People won't have to enable vsync in game options anymore?

It's the anti-vsync: with vsync the game had to do the hard work of syncing to the monitor, but with this the monitor does the hard work of syncing to the game, using extra logic on its controller board.
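A hypothetical sketch of that inversion (assumed timings, not NVIDIA's actual implementation): under vsync the game stalls waiting on the monitor's fixed clock, while under G-SYNC the monitor's controller holds the old image until the game delivers, so the game side never waits.

```python
REFRESH_HZ = 60  # assumed fixed panel rate for the vsync case

def vsync_wait(frame_done_at):
    """Game side under vsync: the GPU stalls until the monitor's next
    fixed refresh tick before it can present."""
    tick = 1 / REFRESH_HZ
    next_tick = (int(frame_done_at / tick) + 1) * tick
    return next_tick - frame_done_at      # time the game sits idle

def gsync_wait(frame_done_at, prev_frame_at):
    """Monitor side under G-SYNC: the controller holds the old image
    until the new frame arrives, then scans it out immediately."""
    return frame_done_at - prev_frame_at  # the monitor waits, not the game

# A frame finishing 20 ms into a cycle stalls the game ~13.3 ms on a
# vsync'd 60 Hz panel; under G-SYNC that stall is zero by construction.
stall_ms = round(vsync_wait(0.020) * 1000, 1)
print(stall_ms)
```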
 

burney

Neo Member
Have they said anything about panels other than TN working with this technology? Don't really want to go back to a TN panel, although the VG248QE looks pretty good.
 

mephixto

Banned
Indeed. Only problem is I play all my games (including PC) on a 40" LCD TV.

Bargh.

Hopefully similar tech is developed in TVs in the near future.

If the Nvidia solution sells well, I'm pretty sure every TV and monitor brand will eventually develop its own solution.
 

Meia

Member
I can't decide if this is amazing timing, or incredibly shitty timing.


I literally had a new monitor in the process of checkout before this broke at Newegg (VG278HE), but decided to wait til I found all the stuff I had a bit ago for 3d vision still around, and now I have a dilemma.


Do I pull the trigger on this for the bigger screen, or for the cheaper monitor talked about here for this new tech? GAH. Need to know how easy this would be to mod yourself, going to guess not very. :\


Maybe I'm better off buying the monitor I was going to, and just waiting for the G-Sync monitors next year instead.
 

Baliis

Member
Sounds interesting, but as someone who doesn't notice stuff like that too often I guess I'm not all that excited.
 
What I meant (and no disrespect to the hardcore) is: how much does it matter to those who don't freak out when they can't run a game at 30/60, or declare games unplayable over the most minor of things?

Eliminating something as minor as screen tearing (something I very rarely even see) doesn't seem to be a big enough deal to warrant some of the reactions on page 1.

I still don't understand what it would mean for my upper mid range PC when the things it's meant to eliminate have never been that much of an issue.

Input lag etc. is already pretty minimal on a PC, is it not?

whether you freak out about it or not, this ensures that your hardware is being used to its full potential. this makes every framerate feel as stutter-free as 30 or 60, even if you could tolerate it before.

you very rarely see frame tearing... but wouldn't you rather it wasn't ever a thing? it's an ugly visual defect where the only solutions to it created other defects. until now.
 

Karak

Member
This is huge for 4k gaming, because you can run @4k at less than 60fps and still have a great experience. It effectively moves the hardware goalposts forward a year or two.
Honestly this was the first and main thing that came up when I watched the presentation. Many PC gamers in that midrange segment lower everything to get to 60; while that may still be somewhat the case, I think a VAST majority will basically be getting a sudden boost when installing this.

It is great. Syncing like this is going to impact pretty much every level of gamer on Nvidia systems (with those cards and above, of course).
 

Zeth

Member
This is so huge for 4K. Now you can play lots of games, even multiplatform next-gen titles, with a modest video card and get 30-45fps that is much smoother.
 

MaLDo

Member
I guess G means Game, so G-Sync is Game-sync, because the game controls the monitor's Hz.


I will buy a big big monitor with gsync soon. Very soon. Andy, when?
 

dark10x

Digital Foundry pixel pusher
Again, do we have ANY idea if anything like this could be applied to TVs and non-LCD displays? That's the key here.

I love the idea, but it just doesn't mean anything to me in PC monitor form. :\ I'm glad someone is doing something about this, but I'm disappointed that it seems to be aimed at PC gaming on an LCD.

and... damn Dark... really? I thought you'd be all over this!
Oh, it's brilliant and exactly what I'm looking for...but not in an LCD monitor. It's being pioneered in a type of display that I don't play games on, basically.

I'm simply afraid that it won't be applied to TVs.
 
Thanks for the very easy to understand explanation in the OP.

Sounds like a useful thing to have; I always sacrifice v-sync for a higher framerate.
 

mephixto

Banned
fuck ... want


but also want mantle :(




AMD needs this. Fuck nVidia for making it proprietary.

I'd be somewhat pissed if this were a proprietary software solution, but it's hardware.

I think AMD was caught with their pants down on this; I don't expect them to come up with something similar for a very, very long time.
 

ToD_

Member
Again, do we have ANY idea if anything like this could be applied to TVs and non-LCD displays? That's the key here.

I love the idea, but it just doesn't mean anything to me in PC monitor form. :\ I'm glad someone is doing something about this, but I'm disappointed that it seems to be aimed at PC gaming on an LCD.


Oh, it's brilliant and exactly what I'm looking for...but not in an LCD monitor. It's being pioneered in a type of display that I don't play games on, basically.

I'm simply afraid that it won't be applied to TVs.

I think this may be very difficult to get working on some other display technologies. Plasma and CRTs, for example, really show the image based on your refresh rate. As in, they flicker at the refresh speed. It would become unbearable at lower refresh rates like <50Hz.
 

Mandoric

Banned
I think this may be very difficult to get working on some other display technologies. Plasma and CRTs, for example, really show the image based on your refresh rate. As in, they flicker at the refresh speed. It would become unbearable at lower refresh rates like <50Hz.

Doesn't plasma refresh each pixel several times per frame? It may be possible to play with the timing/refresh count there.
 

Raistlin

Post Count: 9999
I'd be somewhat pissed if this were a proprietary software solution, but it's hardware.

I think AMD was caught with their pants down on this; I don't expect them to come up with something similar for a very, very long time.
It doesn't matter that it's HW. They could make an API to communicate with it and make that public.

Hell, AMD is making Mantle public (not that I expect other HW vendors to use it). In reality, this isn't rocket science. It needs to send some sync parameters; that's basically it in terms of communication with the GPU.
 

ToD_

Member
Doesn't plasma refresh each pixel several times per frame? It may be possible to play with the timing/refresh count there.

Maybe, but this may just be easier to accomplish on LCD tech. I'm hoping to be proven wrong as I would prefer it on something else as well.
 

creyas

Member
Nice. Just upgraded to a 7970, but I can probably be ready to upgrade again this time next year. Was getting a bit tired of looking at my current 120Hz monitor; glad I didn't act on it yet.
 

ss_lemonade

Member
So this gets you dynamic refresh rates. Looking at the product details, G-SYNC refresh rates range from 30 to 144Hz. What happens if I drop below 30fps?
 

Orayn

Member
Have they announced any actual monitors yet, or just the DIY kit?

No specific models other than the Asus VG248QE with a kit pre-installed.

Will supposedly run the gamut from budget monitors up to 4k displays. I assume most of them will be 120Hz panels or higher.

So this gets you dynamic refresh rates. Looking at the product details, G-SYNC refresh rates range from 30 to 144Hz. What happens if I drop below 30fps?

I assume it would work like normal vsync then.
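A sketch of why "like normal vsync" is a reasonable guess (my assumption, not anything NVIDIA has confirmed): the panel has to refresh at least every 1/30 s regardless, so below 30fps the module would have to repeat the previous frame. The 30/144 limits are the ones from the product details quoted above.

```python
MIN_HZ, MAX_HZ = 30, 144  # G-SYNC range from the product details

def panel_refresh_interval(render_fps):
    """Interval the panel actually refreshes at for a given render rate,
    clamped to the module's supported range."""
    clamped = max(MIN_HZ, min(render_fps, MAX_HZ))
    return 1.0 / clamped

print(panel_refresh_interval(40))   # 0.025: panel tracks the game exactly
print(panel_refresh_interval(20))   # 1/30: frames repeat at 30 Hz
print(panel_refresh_interval(200))  # 1/144: capped at the panel maximum
```

Inside the 30-144 window the panel tracks the game one-to-one; outside it you're back to repeated (or torn) frames, just like a fixed-refresh display.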
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
So this gets you dynamic refresh rates. Looking at the product details, G-SYNC refresh rates range from 30 to 144Hz. What happens if I drop below 30fps?

One would assume tearing, or with synchronisation stutter/lag.

If you're consistently dropping below 30fps you should avoid spending that money on a monitor and instead get a new GPU.
 
Big news. If AMD can't make something compatible or equivalent, they could be in a major bind with some of the enthusiast crowd for the next few years. Mantle has nothing on this...
 

Rur0ni

Member
One would assume tearing, or with synchronisation stutter/lag.

If you're consistently dropping below 30fps you should avoid spending that money on a monitor and instead get a new GPU.
This was my thought as well. If you're really dipping below 30 fps, you've got to adjust something (graphics quality, or upgrade).
 
Welcome to next-gen lads! This is gonna be amazinggggggggg.



But I play my PC games on my big TV in the living room. So when can I expect this technology to be available on that?

Probably 2-5 years. Tech always comes to monitors and enthusiast/professional displays before trickling down to TVs: things like HD resolutions, 120Hz (which still hasn't hit TVs), etc.
 