
G-Sync is the god-level gaming upgrade.

Qassim

Member
Editing this out completely because I'm still being quoted despite my clarification on what I meant:

The point is there's no need to aim for the two extremes from a game development perspective (on a fixed console platform). A game running at an average of 45fps would be better than a game running at 30fps but not as good as a game running at 80fps.

The debate has two definite sides because you're forced to pick one if you want a decent experience by hitting that (fixed) refresh sync. That would no longer be the case with adaptive refresh. You get a tear-free, stutter-free (free of refresh-related stutters) image at any frame rate above that minimum. That should give developers a decent amount more freedom in their prioritisation of resources.

I don't really care about what people argue about on forums, that's not my point. I'm talking about the internal debate that many developers will have.

GSync is currently not compatible with any IPS display, so no sadly.

That's not true. G-Sync is compatible with IPS; it's just that we haven't seen any G-Sync IPS displays yet. Tom Petersen from NVIDIA confirmed this.
 

b0bbyJ03

Member
If you're playing on a TV the benefits are probably lost on you anyway. Sorry, but gaming on a TV is total garbage after playing on any high-quality, 1ms monitor.

Panasonic plasmas say hello. Bought one right before Panasonic stopped making them and there is no going back for me. The motion resolution, colors, contrast, deep blacks, and size (55") are too good to give up. Unless you're a competitive gamer of course, which I am not. I'm a snob now when I see my friends' displays.
 

Ulysses 31

Member
You know the 30fps vs 60fps debate? Adaptive refresh (G-Sync, FreeSync) makes that pretty much irrelevant.



That's not true. G-Sync is compatible with IPS; it's just that we haven't seen any G-Sync IPS displays yet.

How so?

You'll still want to play at 60 or higher rather than 30.
 
What causes screen tearing? Is that usually when your FPS exceeds your refresh rate?

If so, wouldn't it be fine to just have a 144Hz monitor and cap your FPS at 100 or something? I'm new to all this, so I have no idea how anything works.
 
You know the 30fps vs 60fps debate? Adaptive refresh (G-Sync, FreeSync) makes that pretty much irrelevant.

No it doesn't.

While I haven't used a GSync monitor, it apparently does not provide any benefits below 40fps, so you would still be able to see the difference between 30 and 60.

However, if a dev pushed to get to 60fps but still had dips to 50 it wouldn't be noticeable with GSync/FreeSync. That will be very much welcome.
 

Seanspeed

Banned
You know the 30fps vs 60fps debate? Adaptive refresh (G-Sync, FreeSync) makes that pretty much irrelevant.
Not irrelevant, no. Higher framerates are still going to result in smoother gaming. There just won't be any need to cap/lock a game at 30fps anymore.
 

Qassim

Member
How so?

You'll still want to play at 60 or higher rather than 30.

The 30 vs 60 debate only exists because of the fixed refresh rate of displays. When your display refreshes at exactly whatever your framerate is at that moment, 60fps becomes an arbitrary benchmark.

It could be 30fps (as that remains a good minimum, regardless) vs 54fps or 87fps or 72fps or 101fps. The debate would shift to "minimum frame rate" vs higher frame rate.
 

Qassim

Member
Not irrelevant, no. Higher framerates are still going to result in smoother gaming. There just won't be any need to cap/lock a game at 30fps anymore.

No it doesn't.

While I haven't used a GSync monitor, it apparently does not provide any benefits below 40fps, so you would still be able to see the difference between 30 and 60.

However, if a dev pushed to get to 60fps but still had dips to 50 it wouldn't be noticeable with GSync/FreeSync. That will be very much welcome.

This is blatantly false, apples and oranges really.

I think I'm being misunderstood (probably my fault). I'm not saying higher fps is no longer relevant. I'm saying the specific target of '60fps' is no longer relevant. 60fps would be but an arbitrary number.
 

Demigod Mac

Member
Really want an IPS G-Sync monitor but I doubt that will ever happen at this rate.

Best bet is to wait for FreeSync in DisplayPort 1.2a / 1.3 to become available, then get a video card and IPS display that support it.
 

Seanspeed

Banned
The 30 vs 60 debate only exists because of the fixed refresh rate of displays. When your display refreshes at exactly whatever your framerate is at that moment, 60fps becomes an arbitrary benchmark.

It could be 30fps (as that remains a good minimum, regardless) vs 54fps or 87fps or 72fps or 101fps. The debate would shift to "minimum frame rate" vs higher frame rate.
Well ok yea. It'll turn from 30fps vs 60fps to 30fps vs higher framerates, then. You'll still get the same people arguing the same things! ;)
 

Nzyme32

Member
I hear nothing but good things about G-Sync all the time. I can't wait for prices to eventually drop so I can properly afford to dive in!
 

Bricky

Member
I still don't really know what G-Sync does, without being able to see it in person myself

Technical stuff aside, gameplay will feel a lot smoother without requiring a locked 60FPS to do so. Also: no tearing, ever.

On top of that, you'll get higher framerates or better IQ, since achieving this same smoothness on a non-G-Sync setup usually has a big performance cost.

When the prices come down it will be a mandatory feature for gamers looking to buy a new monitor.
 

Qassim

Member
Well ok yea. It'll turn from 30fps vs 60fps to 30fps vs higher framerates, then. You'll still get the same people arguing the same things, though. ;)

The point is there's no need to aim for the two extremes from a game development perspective (on a fixed console platform). A game running at an average of 45fps would be better than a game running at 30fps but not as good as a game running at 80fps.

The debate has two definite sides because you're forced to pick one if you want a decent experience by hitting that (fixed) refresh sync. That would no longer be the case with adaptive refresh. You get a tear-free, stutter-free (free of refresh-related stutters) image at any frame rate above that minimum. That should give developers a decent amount more freedom in their prioritisation of resources.

I don't really care about what people argue about on forums, that's not my point. I'm talking about the internal debate that many developers will have.
 

Levyne

Banned
I can't wait to get a better, G-Sync capable monitor, but I'm hoping to get a 120Hz display and a general CPU/GPU upgrade at the same time (to be able to push those sorts of framerates), so I'm trying to save for all that. Maybe once I'm out of grad school, as a gift to myself. Prices should be down by then too.
 

Krejlooc

Banned
Doesn't play nice with VR. The future of display technology is rolling raster interlaced asynchronous time warp, which renders stuff like G-Sync pointless in the process.

It's a stopgap technology at the moment.
 

Damerman

Member
Really want an IPS G-Sync monitor but I doubt that will ever happen at this rate.

If I could get this, I would never buy a new monitor for like 5-7 years... no doubt. Unless of course high-resolution holograms become mainstream.

Doesn't play nice with VR. The future of display technology is rolling raster interlaced asynchronous time warp, which renders stuff like G-Sync pointless in the process.

It's a stopgap technology at the moment.

Pshh, VR is a stopgap technology. No one is going to wear a helmet for 8 hours on end to play an RPG.
 

Damerman

Member
No it doesn't.

While I haven't used a GSync monitor, it apparently does not provide any benefits below 40fps, so you would still be able to see the difference between 30 and 60.

However, if a dev pushed to get to 60fps but still had dips to 50 it wouldn't be noticeable with GSync/FreeSync. That will be very much welcome.

Well, I own a G-Sync monitor, and the only time frame dips/hikes are extremely noticeable is in the 20s...

Going from 30 to 60 is not as jarring as going from 20 to 30.
 
I still kind of have a hard time wrapping my mind around how it all works and feels; I wish a place around where I live had a display set up, or that I had a friend with one. From my understanding it takes low frame rates and fluctuating frame rates and matches that with the monitor's refresh rate, which eliminates perceivable lag or controller unresponsiveness? Am I right? But how does it look when a game runs at 20fps? Will it still look chunky and rough or does G-Sync smooth that out? That's the type of stuff I need to see in person to fully understand.
 
If only those displays weren't so damn rare and expensive... :( Which one do you own btw? Just curious.
The ROG Swift PG278Q. By this time next year I might have a kid, and if that happens I doubt I will have the disposable income.
https://www.youtube.com/watch?v=do-UqHBCJkg

You have to download the video, because the version embedded here is 30fps (YouTube limitation).
This is a great explanation, but you really can't do it justice in a video.
I can't say this enough: they need to add variable refresh rate to the HDMI spec, and most TV sets already have the capability to adjust their refresh rate (multiples of 24, 25, 30), so some sets could get FreeSync-like functionality with firmware upgrades.
That's a limitation of HDMI. It's a branch of DVI, which is really just a digital version of VGA. It's all refresh-timing based. DisplayPort is packet-based, though it was still linked to display timing, just not as strongly.

G-Sync and FreeSync are basically hacks to have the GPU send the frame as soon as it's rendered, and have the display show the frame as soon as it arrives.

There's a lot more history out there, but historically it's all based around linking TV cameras to 60Hz power so that the TV on the other end could synchronize to the same 60Hz power. Some of those limitations were overcome quickly, and semi-abandoned with the NTSC color hack, but it continued to make sense to have TVs synced to a steady framerate, since film was always a steady framerate and in analog broadcasts a steady framerate means steady bandwidth.

As a Nintendo fan, I hope Nintendo jumps on the bandwagon next generation if Sony and/or Microsoft do. It's amazingly transformative.
 

Spazznid

Member
It totally blows me away. With the way things currently run, almost every frame is either late or missed. It's taken my Q6600/750ti/4GB setup and made it feel relevant again. I'm not even really sure why I'm building a new system anymore.

If Sony were smart, they'd have this or FreeSync in the PS5 and future gaming TVs.

Shit, if they're really lucky, they'll be able to add FreeSync to the PS4 and release a series of Gaming TVs.

Holy Fuck.

EDIT: TV

No more TVs from Sony, though. And why Sony and not Microsoft or Nintendo? You biased?

Kidding aside, I'd love to try it on my monitor, but I've yet to save up any money, and when I looked for a unit, they all had to be assembled by whoever was selling them or something, which hiked the price up.
 
I think I'm being misunderstood (probably my fault). I'm not saying higher fps is no longer relevant. I'm saying the specific target of '60fps' is no longer relevant. 60fps would be but an arbitrary number.

Not exactly, but I understand what you're saying. When G-Sync is applied, the difference between, let's say, 45 and 60 fps is negligible to all but the most discerning eye.

http://m.newegg.com/Product/index?itemnumber=N82E16824009658
This is a pretty impressive price for this, minus the fact that it's a TN panel and only 28", which may render 4K pointless without insane pixel density.
 
Funny you posted this, I've been camping on the Amazon listing for the Asus ROG Swift
for days and one popped up last night; grabbed it and it's getting delivered tomorrow. I'm going to film some Far Cry 4, AC: Unity, and DA: Inquisition comparison videos. I'll throw them up in this thread when I post them. I haven't been this excited for a piece of tech since the Voodoo 5 3dfx video card
(I'm old.)

I am concerned about some of the quality control issues they seem to have with the monitor, but I'm just hoping for no dead pixels.

Let me know if you guys want something specific posted or compared.
 

Syncytia

Member
G-Sync sounds interesting, but I can't really justify the cost of a monitor vs upgrading my GPU. If NVIDIA had kept their previous pricing trends it might be a different story, but with the 970 at $330, it's just not on the cards for me now. I haven't had issues like some have with Far Cry 4 and I'm basically above 60 fps on ultra, so I'm good for now. I would love to try it out in person though; I really want to see what it's like.
 
I still kind of have a hard time wrapping my mind around how it all works and feels; I wish a place around where I live had a display set up, or that I had a friend with one. From my understanding it takes low frame rates and fluctuating frame rates and matches that with the monitor's refresh rate, which eliminates perceivable lag or controller unresponsiveness? Am I right? But how does it look when a game runs at 20fps? Will it still look chunky and rough or does G-Sync smooth that out? That's the type of stuff I need to see in person to fully understand.

3D rendering natively happens at a variable frame rate; no two frames have identical deltas. What this does is cause the display to vary the LCD's refresh to match the frame rate of the GPU.

The flow is like this:

GPU renders a frame -> Frame is transmitted to the monitor immediately -> Monitor displays the frame as soon as it's arrived (or as it's being transmitted - I'm not sure exactly which).

In older solutions the flow is like this:

GPU renders a frame -> GPU waits for the VBLANK period -> GPU transmits the frame over the course of the next 15.25ms (if you're at 60Hz) and the frame is displayed as it arrives over the course of about 15.25ms (if we assume 1.45ms of VBLANK).

Modern TVs are worse. They look like this:

GPU renders a frame -> GPU waits for the VBLANK period -> GPU transmits the frame over the course of the next 15.25ms (if you're at 60Hz) and the TV stores it. The TV processes the frame for an indeterminate amount of time, then displays the frames it has collected at 60Hz.
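
To make the difference concrete, here's a rough Python sketch of when a finished frame actually reaches the screen in the adaptive flow versus the fixed 60Hz flow. This is my own simplified illustration, not how the actual hardware is implemented; the 15.25ms scanout figure is from above, and the render-completion times are made up.

# Simplified model: when does a finished frame reach the screen?
REFRESH_MS = 1000.0 / 60   # one refresh interval on a fixed 60Hz display (~16.67ms)
SCANOUT_MS = 15.25         # time to transmit/scan out one frame (figure from above)

def fixed_refresh(render_done_ms):
    """Fixed 60Hz + VSync: wait for the next VBLANK, then scan the frame out."""
    next_vblank = (int(render_done_ms / REFRESH_MS) + 1) * REFRESH_MS
    return next_vblank + SCANOUT_MS

def adaptive_refresh(render_done_ms):
    """G-Sync/FreeSync-style: send the frame immediately and show it as it arrives."""
    return render_done_ms + SCANOUT_MS

for t in (5.0, 18.0, 40.0):   # made-up render-completion times in ms
    print(t, round(adaptive_refresh(t), 2), round(fixed_refresh(t), 2))

The TV case would add its processing delay on top of the fixed-refresh numbers.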
 

Tain

Member
I love it. I want televisions with it. It's a huge advancement for emulation. It lets me push games further and still get smooth results.

Variable refresh rate technology is unquestionably in the future of all display tech. It's too good not to be.
 
No more TVs from Sony, though. And why Sony and not Microsoft or Nintendo? You biased?

Not as many new TVs from Sony, but they're not stopping completely. I'm a Nintendo fan. Sony is my example because they're the only one of the three manufacturing displays right now.
 
Funny you posted this, I've been camping on the Amazon listing for the Asus ROG Swift
for days and one popped up last night; grabbed it and it's getting delivered tomorrow. I'm going to film some Far Cry 4, AC: Unity, and DA: Inquisition comparison videos. I'll throw them up in this thread when I post them. I haven't been this excited for a piece of tech since the Voodoo 5 3dfx video card
(I'm old.)

I was very excited for an NES when it came out in 1986 across the US, and then the next thing that really excited me was the Monster 3D with a Voodoo chip.
 

Qassim

Member
I still kind of have a hard time wrapping my mind around how it all works and feels; I wish a place around where I live had a display set up, or that I had a friend with one. From my understanding it takes low frame rates and fluctuating frame rates and matches that with the monitor's refresh rate, which eliminates perceivable lag or controller unresponsiveness? Am I right? But how does it look when a game runs at 20fps? Will it still look chunky and rough or does G-Sync smooth that out? That's the type of stuff I need to see in person to fully understand.

No, the benefits of G-Sync really stop once you hit below 30fps.

To understand the benefits of adaptive refresh (G-Sync), you need to understand the problems with traditional fixed refresh displays.

Your GPU will render frames as fast as it can and then send them to the monitor, but the monitor will only refresh at set intervals. This can lead to screen tearing. The solution was VSync, but that introduces its own issues: your GPU is forced to wait until the current refresh on the display is finished before sending the next frame, which creates input lag, as the visual information displayed to you is potentially old. It's also the case that if your GPU cannot render frames fast enough to match the refresh rate and drops behind, the game begins to stutter.

G-Sync solves this by letting the GPU control when the monitor refreshes: when a frame has been rendered and sent to the monitor, the monitor refreshes at that moment. That solves these issues and brings some other advantages, in that a game running at 54fps (for example) will look pretty much as good as a game running at 60fps, and on a higher refresh rate display you get the visual advantage of a game running at 70fps vs 60fps, or 80fps vs 60fps, or above.

60Hz becomes an arbitrary line and not a benchmark you need to hit in order to get that refresh sync (to avoid tearing, etc). On PCs, G-Sync monitors (outside of 4K displays) are coupled with high refresh rate panels (e.g. 144Hz), so there are two effects at work for people in this case: you get a higher-refreshing monitor that you can take advantage of, plus the benefits of G-Sync explained above, so it adds up into something rather good.
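
If it helps to picture the tearing part, here's a toy illustration (my own, not from any spec) of why it happens on a fixed-refresh display with VSync off: the buffer swap lands partway through a scanout, so the scanline being drawn at that moment becomes the visible tear.

# Toy illustration of tearing on a fixed-refresh display with VSync off.
LINES = 1080                  # vertical resolution of the panel
REFRESH_MS = 1000.0 / 60      # one refresh takes ~16.67ms at 60Hz

def tear_line(swap_time_ms):
    """Scanline being drawn when the new frame is swapped in; everything above it
    is still the old frame, everything below is the new one -> a visible tear."""
    progress = (swap_time_ms % REFRESH_MS) / REFRESH_MS
    return int(progress * LINES)

print(tear_line(4.0))    # swap early in the refresh -> tear near the top (~line 259)
print(tear_line(12.0))   # swap late in the refresh -> tear near the bottom (~line 777)

VSync avoids this by holding the swap until the next refresh (hence the input lag and stutter mentioned above), while G-Sync simply starts the refresh when the frame is ready.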
 
AOC 24" LED G-Sync g2460Pg

that one?

Yeah, that's what I'm using at the moment.

The image itself isn't the greatest ever, but it's pretty affordable by G-Sync standards. As long as you're cool with 1080p (which you should be if you're playing EUIV :p)

No, the benefits of G-Sync really stop once you hit below 30fps.

According to those AMD slides the other day, G-Sync monitor refresh rates bottom out at 30Hz. So it's a standard monitor below 30, presumably. Although theoretically, if it's running at 144Hz, all sorts of funky low framerates will still look fine.
 

wildfire

Banned
You know the 30fps vs 60fps debate? Adaptive refresh (G-Sync, FreeSync) makes that pretty much irrelevant.
^ Poorly worded.

No it doesn't.

While I haven't used a GSync monitor, it apparently does not provide any benefits below 40fps, so you would still be able to see the difference between 30 and 60.

However, if a dev pushed to get to 60fps but still had dips to 50 it wouldn't be noticeable with GSync/FreeSync. That will be very much welcome.

Iced Eagle wasn't corrected yet but he is mistaken. Gsync stops working below 30 FPS, not 40.
 

Boogdud

Member
You know the 30fps vs 60fps debate? Adaptive refresh (G-Sync, FreeSync) makes that pretty much irrelevant.

Actually, if anything, G-Sync makes sub-35fps look worse. It syncs every frame, so you end up seeing a very jerky picture when it gets down to the ~30fps range; below 30, it's worth turning VSync off.

I've been using G-Sync monitors since the first DIY kit. I absolutely love it, but it doesn't make 30fps look smooth by any means.
 

lmpaler

Member
If you're playing on a TV the benefits are probably lost on you anyway. Sorry, but gaming on a TV is total garbage after playing on any high-quality, 1ms monitor.

I love gaming on my HDTV, buuut I am building a desk with a great monitor come tax season.
 
Iced Eagle wasn't corrected yet but he is mistaken. Gsync stops working below 30 FPS, not 40.

It's a really interesting statement that it stops working below 30fps. My understanding is that it repeats the last frame if it's waited 1/30th of a second, but shouldn't the next frame still be displayed super quickly? Maybe at 1/30th of a second it just sends the partial frame anyway and we get a tear?

I've probably not read enough documentation.
 

Qassim

Member
Actually, if anything, G-Sync makes sub-35fps look worse. It syncs every frame, so you end up seeing a very jerky picture when it gets down to the ~30fps range; below 30, it's worth turning VSync off.

I've been using G-Sync monitors since the first DIY kit. I absolutely love it, but it doesn't make 30fps look smooth by any means.

Okay, that wasn't really my argument though, as I clarified.
 
It's a really interesting statement that it stops working below 30fps. My understanding is that it repeats the last frame if it's waited 1/30th of a second, but shouldn't the next frame still be displayed super quickly? Maybe at 1/30th of a second it just sends the partial frame anyway and we get a tear?

I've probably not read enough documentation.

I don't get why not going below 30 is a problem anyway, since it can just do arbitrary multiples of any frame rate. If the GPU is pulling 25, it can do 50Hz and be just as good as native 25Hz.

I don't know if it does work like that, but conceptually it seems feasible.
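
Something like this, conceptually (just a sketch of my guess, not how NVIDIA or AMD actually implement it):

# Conceptual sketch only: repeat each frame enough times that the panel's
# refresh rate stays inside its supported window (assumed 30-144Hz here).
PANEL_MIN_HZ = 30
PANEL_MAX_HZ = 144

def refresh_for(fps):
    """Return (repeats_per_frame, effective_panel_refresh) for a given frame rate."""
    repeats = 1
    while fps * repeats < PANEL_MIN_HZ:
        repeats += 1
    return repeats, min(fps * repeats, PANEL_MAX_HZ)

print(refresh_for(25))   # (2, 50): show each frame twice, panel runs at 50Hz
print(refresh_for(12))   # (3, 36): three repeats keeps it above 30Hz
print(refresh_for(45))   # (1, 45): no repetition needed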
 
So if I was playing something like Dead Rising 3, with the framerate sometimes dipping to around 40-45 FPS, Gsync would eliminate the judder that you usually see?
 
So if I was playing something like Dead Rising 3, with the framerate sometimes dipping to around 40-45 FPS, Gsync would eliminate the judder that you usually see?

Yes. No tearing and no stuttering. Frame rate is still lower but it's not as big of a problem as it would normally be - it makes capping at 30 for smoothness an irrelevant option.
 

bodine1231

Member
I recommend getting one card if you're using G-Sync. I really should have gone with a single 980 instead of 2x 970s. I'm finding that G-Sync works better with smaller frame drops. When I run SLI I get higher framerates but bigger drops, which causes a fair bit of stutter.
It seems that G-Sync can't keep up when my frames go from 80 to 50, but when I run a single card my frames stay in the 40-60 range and it works great.

It's nice, but I don't think it was worth $800+ for the monitor.
 

Grief.exe

Member
I'm really hoping that it'll be the magic bullet I'm looking for to help me with the motion sickness I get with unstable framerates, but it seems like it'll be a while before I can get such a monitor, since I'm not paying $700+ for a monitor. At least not at the moment.

Hopefully prices come down after the market gets saturated and FreeSync starts competing.

So if I was playing something like Dead Rising 3, with the framerate sometimes dipping to around 40-45 FPS, Gsync would eliminate the judder that you usually see?

Yep. Your display refreshes at 60Hz, and 45 FPS doesn't divide evenly into that.

G-Sync will match the refresh rate to your frame rate.
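
To put rough numbers on it (my own simplified arithmetic): at 45fps on a fixed 60Hz display, frames can only change on a refresh, so they get held for an uneven mix of one and two refreshes, and that unevenness is the judder. With G-Sync every frame is simply held for ~22.2ms.

# Rough arithmetic: 45fps on a fixed 60Hz display vs. on an adaptive display.
REFRESH_MS = 1000.0 / 60   # ~16.7ms per refresh at 60Hz
FRAME_MS = 1000.0 / 45     # ~22.2ms per rendered frame at 45fps

# Fixed 60Hz + VSync: 3 frames have to fit into 4 refreshes, so the hold times
# repeat as two refreshes, one refresh, one refresh -> 33.3ms, 16.7ms, 16.7ms.
fixed_holds = [2 * REFRESH_MS, 1 * REFRESH_MS, 1 * REFRESH_MS]

# Adaptive refresh: the panel refreshes whenever each frame is ready,
# so every frame is on screen for the same ~22.2ms.
adaptive_holds = [FRAME_MS] * 3

print([round(t, 1) for t in fixed_holds])     # [33.3, 16.7, 16.7]
print([round(t, 1) for t in adaptive_holds])  # [22.2, 22.2, 22.2]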
 

rav

Member
I don't get why not going below 30 is a problem anyway, since it can just do arbitrary multiples of any frame rate. If the GPU is pulling 25, it can do 50Hz and be just as good as native 25Hz.

I don't know if it does work like that, but conceptually it seems feasible.

I think AMD's FreeSync does exactly that. I'm not sure why NVIDIA chose 30Hz as a hard limit, unless there's something in the DisplayPort spec.
 