
G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

Pagusas

Elden Member
I think it means no judder since it effectively makes the monitor's refresh rate the same as the game's framerate up to some maximum.

A game rendering 45 frames per second is taking 22.22 ms to make each new frame; I *think* G-Sync can tell the monitor to just keep each frame onscreen for that amount of time. A traditional refresh rate of 60Hz means that the monitor HAS to refresh every 16.67 ms, so you're left with a choice between tearing where you give it two partial frames at once, or vsync where each frame has to be displayed for some multiple of 16.67 ms.

Yep, my understanding is the monitor will just hold the frame until a new frame is received.
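To put rough numbers on that, here's a back-of-the-envelope sketch in Python (purely illustrative; the function names are made up and this is not how the hardware actually works, just the arithmetic from the post above):

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ  # 16.67 ms between fixed refreshes

def vsync_hold_ms(render_ms):
    """Vsync on a fixed 60Hz panel: a frame can only be swapped on a
    refresh boundary, so it stays on screen for the next whole multiple
    of 16.67 ms at or above its render time."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def gsync_hold_ms(render_ms):
    """Variable refresh: the monitor just holds the old frame until the
    new one arrives, so the hold time matches the render time."""
    return render_ms

frame_ms = 1000 / 45  # the 45 fps example: 22.22 ms per frame
print(f"vsync:  held {vsync_hold_ms(frame_ms):.2f} ms")   # 33.33 ms
print(f"G-Sync: held {gsync_hold_ms(frame_ms):.2f} ms")   # 22.22 ms
```

So under vsync that steady 45 fps game gets quantized down to a 30 fps cadence (or judders between 16.67 ms and 33.33 ms holds), while with variable refresh every frame is shown for exactly as long as it took to render.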
 

potam

Banned
I'm still trying to figure something out:

Will this be that amazing for someone like myself who doesn't find screen tearing to be the worst thing in the world? I suppose removing frame stutter would be nice.

Either way, I'm glad I haven't bought a new monitor yet.
 

2San

Member
I'm still trying to figure something out:

Will this be that amazing for someone like myself who doesn't find screen tearing to be the worst thing in the world? I suppose removing frame stutter would be nice.

Either way, I'm glad I haven't bought a new monitor yet.
I think it will be good for input lag as well. I think we still have a long way to go on input lag.
 

Durante

Member
Tim Sweeney, creator of Epic’s industry-dominating Unreal Engine, called G-SYNC “the biggest leap forward in gaming monitors since we went from standard definition to high-def.” He added, “If you care about gaming, G-SYNC is going to make a huge difference in the experience.” The legendary John Carmack, architect of id Software’s engine, was similarly excited, saying “Once you play on a G-SYNC capable monitor, you’ll never go back.” Coming from a pioneer of the gaming industry, who’s also a bona fide rocket scientist, that’s high praise indeed.

We call our tech Variable Refresh Rate (VRR).
Good job man. Good fucking job.
 

Antonio's a PC gamer and he is quite pleased with this news.
So am I.
 

2San

Member
So let me get this straight: ZERO input lag? Meaning just like a CRT? And will this work at 120Hz?
Well the VG248QE is one of the first supported screens and that's a 144Hz screen. I dunno about input lag, but there is always some input lag.
 

xenist

Member
I'm still trying to figure something out:

Will this be that amazing for someone like myself who doesn't find screen tearing to be the worst thing in the world? I suppose removing frame stutter would be nice.

Either way, I'm glad I haven't bought a new monitor yet.

Someone more knowledgeable about graphics may correct me, but I cannot think of an IQ improvement like this (excluding stuff like more pixels and more colors, which are the norm) in a very long time. It won't make any numbers higher, but it's going to be one of those things that you cannot give up once you have it.
 

Shambles

Member
Normally I rail against PhysX and Mantle, but I'm glad nVidia is doing this. Monitor manufacturers have been sitting on their collective lazy asses for so long that it was clear the industry wasn't going to improve ANYTHING unless it was dragged into progress kicking and screaming. I hope a standardized system emerges down the road, but that would never happen if nVidia weren't doing this now. It's not like ATI/AMD never had a chance to address this problem, and these monitors will work with AMD cards as well; you just won't get the variable refresh rate.
 

graywolf323

Member
well I have two questions before I let the excitement take me

will this work with my 780M?

also what does this mean for frames per second as we know it?
 
Nice. Hopefully, this will be adopted in a more universal, less proprietary way to speed adoption across the board instead of dying off due to exclusivity limitations. I was still thinking that frequency-based rendering/refresh techniques would be a generally more useful solution to issues like this, though.
 

RulkezX

Member
This is something for the enthusiast PC gamer, right? It's not something that's going to make a massive difference for the average dude?
 

ghst

thanks for the laugh
carmack says it right. we have twenty-page threads over minor frame variances, lacking AO effects or more jagged shadows, but here is a solution to giant fucking tears that rip the whole screen in half, without compromising on response times or making the whole image stutter, and we're only at page two.

you won't find a more staunchly vendor agnostic gamer than me, but you really have to give it up for nvidia here. they've kicked down the door and we're all going to benefit.
 

Good job man. Good fucking job.

Seriously, we need Dark10X in here to join the party. I think the three of us are the most vocal on these NOW DEFEATED issues.

I was following the other thread and doing all my dancing over there, but your posts best sum up how I feel about this.

This is a huge day. I am still going to wait until the end of next year to do that PC upgrade (saving saving saving already...) but that build is going to sing like I can't even imagine right now.

Carmack is totally going to make sure Rift supports this too, I'd bet. He's always been a massive campaigner against frame latency in all its forms.

No more tearing. No more stuttering. No more image lag. Frames beginning to draw on the monitor right after the graphics card finishes drawing.

It's a very, VERY, happy day.

And from my stereoscopic 3D perspective, I'm very happy to hear it helps with 3D Vision too.

Well the VG248QE is one of the first supported screens and that's a 144Hz screen. I dunno about input lag, but there is always some input lag.
It'll mean that input lag is much closer to how it works with vsync turned off, rather than now, where vsync increases it for various reasons (mainly buffering).
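To make the buffering point concrete, here's a tiny sketch (again illustrative Python with made-up names; it models only the wait for the next refresh boundary and ignores the panel's own processing delay and deeper render-ahead queues):

```python
REFRESH_MS = 1000 / 60  # fixed 60 Hz panel

def vsync_extra_lag_ms(finish_ms):
    """Double-buffered vsync: a finished frame sits in the back buffer
    until the next refresh boundary before scanout can even begin."""
    next_refresh = (finish_ms // REFRESH_MS + 1) * REFRESH_MS
    return next_refresh - finish_ms

def vrr_extra_lag_ms(finish_ms):
    """Variable refresh: scanout starts as soon as the frame is handed
    over, so this particular wait is (ideally) zero."""
    return 0.0

# A frame finishing 5 ms after a refresh waits ~11.7 ms under vsync.
print(f"vsync: +{vsync_extra_lag_ms(5.0):.2f} ms, VRR: +{vrr_extra_lag_ms(5.0):.2f} ms")
```

On average that's about half a refresh of extra delay, and deeper buffering stacks more on top; variable refresh sidesteps that wait entirely.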
 

HariKari

Member
also what does this mean for frames per second as we know it?

Carmack said it will matter less, because you can free up horsepower for pretty visual things without having to worry about staying within +/- 10 of 60 frames per second. Witness how hard devs are working to hit a specific framerate on next-gen consoles.

He absolutely loved it and said it's a must buy for him. I'd trust the man.
 

Seanspeed

Banned
I'm still trying to figure something out:

Will this be that amazing for someone like myself who doesn't find screen tearing to be the worst thing in the world? I suppose removing frame stutter would be nice.

Either way, I'm glad I haven't bought a new monitor yet.
If you don't mind screen tearing, then this probably won't matter much to you.

But obviously no screen tearing is preferable to having to put up with it. It's objectively a superior experience.

For me, it's the input lag I don't really notice or have any noticeable issues with. It's not even that I don't mind it; it's that I honestly can't tell the difference when I turn on v-sync, so I don't know what there is to mind. So for me, I'm not exactly excited about this, but I think it's cool the tech is there now.
 
This is something for the enthusiast PC gamer, right? It's not something that's going to make a massive difference for the average dude?

Eyes trained to notice smooth images, crisp text, high resolutions, and FPS fluctuations will definitely be interested in this tech.
 

xenist

Member
carmack says it right. we have twenty-page threads over minor frame variances, lacking AO effects or more jagged shadows, but here is a solution to giant fucking tears that rip the whole screen in half, without compromising on response times or making the whole image stutter, and we're only at page two.

A lot of people don't realize how crazy big this is for quality.
 

Dawg

Member
I've missed a few pieces of info.

Can I just order my BenQ XL2411T now and buy the solution later and DIY?
 

HariKari

Member
This is something for the enthusiast PC gamer, right? It's not something that's going to make a massive difference for the average dude?

Eyes trained to notice smooth images, crisp text, high resolutions, and FPS fluctuations will definitely be interested in this tech.

I imagine this tech or its approach will eventually filter down to consoles. It makes 30/60 a moot point, according to everyone who witnessed the demo. Good shit, Nvidia.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Hard to believe we could be looking at a screen tear-less future for PC gamers at long last. It's pretty amazing technology.
 

potam

Banned
If you don't mind screen tearing, then this probably won't matter much to you.

But obviously no screen tearing is preferable to having to put up with it. It's objectively a superior experience.

For me, it's the input lag I don't really notice or have any noticeable issues with. It's not even that I don't mind it; it's that I honestly can't tell the difference when I turn on v-sync, so I don't know what there is to mind. So for me, I'm not exactly excited about this, but I think it's cool the tech is there now.

Yeah, I mean obviously getting rid of tearing without using Vsync will be nice... I guess I'm just not that picky about my graphics, aside from enjoying maxing them out.

Either way, when it seems that everyone who has seen it is saying it's the second coming, I will take their word for it.
 

dark10x

Digital Foundry pixel pusher
I like the idea but I use a TV for PC gaming. Furthermore, this seems to be limited to LCDs right now which makes it worthless to me.
 
Theoretically, this tech could significantly increase the time between upgrades.

As much as I prefer the smoothness of 60fps, I can tolerate 30fps that's locked and smooth. Problem is, you just don't get that without having to resort to vsync, and even that's not guaranteed if it dips below 30. Input lag becomes a significant issue here.

With this tech, lower framerates should be infinitely more tolerable.
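A quick illustration of why a dip below 30 hurts so much on a fixed 60Hz panel with vsync, versus variable refresh (same disclaimers as the sketches earlier in the thread: toy Python, double buffering assumed, numbers illustrative):

```python
import math

REFRESH_MS = 1000 / 60  # fixed 60 Hz panel

def vsync_hold_ms(render_ms):
    # The frame must stay up for a whole number of 16.67 ms refreshes.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (33.3, 34.0, 40.0):
    print(f"{render_ms:4.1f} ms frame -> vsync holds {vsync_hold_ms(render_ms):5.2f} ms, "
          f"VRR holds {render_ms:5.2f} ms")
# A 33.3 ms frame stays 33.33 ms either way, but a 34.0 ms frame jumps
# to a 50 ms hitch under vsync; with variable refresh it's just 34 ms,
# which is barely perceptible.
```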
 