
G-Sync is the god-level gaming upgrade.

Trojita

Rapid Response Threadmaker
If you're playing on a TV the benefits are probably lost on you anyway. Sorry, but gaming on a TV is total garbage after playing on any high-quality, 1ms monitor.

What makes you come to this conclusion?

Most monitors with extremely low response times are TN panels, which are going to be the worst-looking of all the monitor panels used today. Unfortunately, most manufacturer-sold 120Hz panels are going to be TN, with only a few overclocked "off-brand" IPS panels able to reach a somewhat stable 96Hz and maybe 120Hz. G-Sync is only available on TN.

If anything, games look best on an IPS panel when it comes to color reproduction and overall picture quality.
 
I couldn't give less of a shit about G-Sync. Watched the presentation, seen the press and shit about it, seen NeoGAF fap over it.

I still don't care.

What's the draw?

Every so often a line doesn't tear on the screen.

WOW.
 

SliChillax

Member
I couldn't give less of a shit about G-Sync. Watched the presentation, seen the press and shit about it, seen NeoGAF fap over it.

I still don't care.

What's the draw?

Every so often a line doesn't tear on the screen.

WOW.

Have you seen it in person? It's not about a single line not tearing, it's about the whole image being consistently smooth without stuttering, tearing and lag. It's something that you have to see in person to understand.
 

riflen

Member
I couldn't give less of a shit about G-Sync. Watched the presentation, seen the press and shit about it, seen NeoGAF fap over it.

I still don't care.

What's the draw?

Every so often a line doesn't tear on the screen.

WOW.

Fair enough. Please remember to be equally unimpressed by all other variable-refresh technology as it arrives.
 

Luigiv

Member
I couldn't give less of a shit about G-Sync. Watched the presentation, seen the press and shit about it, seen NeoGAF fap over it.

I still don't care.

What's the draw?

Every so often a line doesn't tear on the screen.

WOW.

Tearing is the devil and needs to be eliminated forever. Hell yeah to the death of tearing.

Anyway, no tearing, no stutter, no framerate halving like you get with a double buffer, but less input lag than a triple buffer. What's there not to love? Well, maybe the price of entry, I'll give you that. But that will come down in time, hopefully to the point where all monitor manufacturers will include it as a standard feature, because there's no reason not to.
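To put rough numbers on the buffering point above, here's a small Python sketch (purely illustrative, not how any driver or scaler actually works) of why a double-buffered vsync frame that misses its 16.7ms deadline gets held for two refreshes, while a variable-refresh panel just scans out when the frame is ready. The fixed 60Hz rate and the assumed 30-144Hz variable range are example numbers, not any particular monitor's spec.

import math

REFRESH_MS = 1000 / 60            # fixed 60 Hz scanout interval (~16.7 ms)

def present_vsync(render_ms):
    # Double-buffered vsync: a finished frame can only be shown on the next
    # refresh boundary, so a 17 ms frame ends up on screen for ~33 ms.
    refreshes_waited = math.ceil(render_ms / REFRESH_MS)
    return refreshes_waited * REFRESH_MS

def present_variable_refresh(render_ms, min_ms=1000 / 144, max_ms=1000 / 30):
    # Variable refresh: the panel scans out as soon as the frame is done,
    # clamped to the refresh range it supports (assumed 30-144 Hz here).
    return min(max(render_ms, min_ms), max_ms)

for render_ms in (10, 17, 25, 40):
    print(f"{render_ms:>4.1f} ms render -> vsync: {present_vsync(render_ms):4.1f} ms on screen, "
          f"variable refresh: {present_variable_refresh(render_ms):4.1f} ms")

The 40ms case also shows the limit raised later in the thread: once a frame takes longer than the panel's slowest supported refresh, variable sync can't hide the hitch either.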
 

Datschge

Member
The tech is brand new. By the time next-gen is starting, I'm sure G-Sync displays won't have the $100 premium that they do today. It just takes time for the cost to get driven down. FreeSync is exactly that for manufacturers... free. That competition will also drive down the price of G-Sync displays.
The tech is from 2009; every screen panel with eDP 1.0 supports it. The application to external displays and this use case are new: the display controller (hardware and firmware) needs to support it. Nvidia sells its own hardware solution for G-Sync, while support for Adaptive-Sync needs to be implemented by the manufacturers of display controllers. Samsung appears to be including support in its next generation of display controllers for UHD displays with DisplayPort. So the cost of FreeSync depends on demand and economies of scale, while the cost of G-Sync depends entirely on Nvidia.

My expectation is that once FreeSync/Adaptive-Sync displays become available, Nvidia will update drivers to support it. G-Sync itself will fade away as a transitional standard. Sort of how there's currently a fight between dual-link DVI and HDMI 2.0. One will gain the upper hand and the other will fade.
Not really happening, as one comes with licensing fees but is backed by a party invested in it (and unlikely to drop it), while the other is a standard without any fee (so unlikely to fade until a better fee-free solution exists).
 

M3d10n

Member
Rolling raster interlaced asynchronous time warp has benefits beyond VR. It is absolutely the future.

This thread is literally the first and only relevant result for "Rolling raster interlaced asynchronous time warp" on Google. Do you have a link to whatever you're talking about?
 
What I hate the most about this is that it's nothing but a band-aid, an anomaly, a shameless pitiful cop-out for all the developers that can't be assed to optimize their games properly. All it will do is allow them to get even lazier.

Games made by top-of-the-line AAA developers don't need G-sync. All the games made by Nintendo don't need G-sync. All of the games on WiiU made by Nintendo are absolutely amazingly stable and responsive, with not a single hiccup or line of tearing to be seen. Why can't others do the same?
 
This thread is literally the first and only relevant result for "Rolling raster interlaced asynchronous time warp" on Google. Do you have a link to whatever you're talking about?

I'm pretty sure it's an amalgamation of a bunch of image processing terms.
 
What I hate the most about this is that it's nothing but a band-aid, an anomaly, a shameless pitiful cop-out for all the developers that can't be assed to optimize their games properly. All it will do is allow them to get even lazier.

Games made by top-of-the-line AAA developers don't need G-sync. All the games made by Nintendo don't need G-sync. All of the games on WiiU made by Nintendo are absolutely amazingly stable and responsive, with not a single hiccup or line of tearing to be seen. Why can't others do the same?

Bringing Nintendo into this equation and referring to consoles here isn't really relevant, especially since one could argue the homogenization of consoles lets developers be far "lazier" than any other tech.
 

Momentary

Banned
What I hate the most about this is that it's nothing but a band-aid, an anomaly, a shameless pitiful cop-out for all the developers that can't be assed to optimize their games properly. All it will do is allow them to get even lazier.

Games made by top-of-the-line AAA developers don't need G-sync. All the games made by Nintendo don't need G-sync. All of the games on WiiU made by Nintendo are absolutely amazingly stable and responsive, with not a single hiccup or line of tearing to be seen. Why can't others do the same?

What are you on about? This has way more to do with a company's "optimization" of a game. Damn, I hate when people use the "O" word. When a game is released with new graphics tech, console owners and people with old hardware always come out of the woodwork about this crap. Yes, 5% of the time it is justified, but the other 95% is just people using it as a bandwagon buzzword to avoid admitting that they have weak hardware.
 

Raticus79

Seek victory, not fairness
What I hate the most about this is that it's nothing but a band-aid, an anomaly, a shameless pitiful cop-out for all the developers that can't be assed to optimize their games properly. All it will do is allow them to get even lazier.

I see this as the biggest driver for it to have any hope of hitting the mass market. It's quite an angle - "don't let your HDTV hold your console back. Get WheneverSync."

For PC, I actually prefer a strobing backlight to variable refresh, playing on lower settings to keep the framerate maxed. The problem with this on the ROG Swift is that for some reason the brightness is much lower. I'm not sure why; the LightBoost hack on my VG278H is perfectly fine, but if I wanted to use ULMB on my Swift I would need to dim my room and turn off the other monitor.

Also this:

WOW, you guys are talking as if the experience using a G-sync monitor is as good as an Oculus Rift or something.

Rift = the *ultimate* upgrade.

Sync-on-demand is nice, but it's relatively subtle. On the Swift thread some people have had trouble even seeing a difference without knowing what to look for.

---

This thread is literally the first and only relevant result for "Rolling raster interlaced asynchronous time warp" on Google. Do you have a link to whatever you're talking about?

Sounds like NVIDIA's "VR Direct" - http://www.slashgear.com/nvidia-vr-direct-oculus-optimized-18346717/
"Asynchronous Warp: This starts with the last scene rendered, and lets the GPU update it based on head position information. By warping the image later in the rendering pipeline, Maxwell cuts discontinuities between head movement and action on screen. And by doing it asynchronously, it avoids stalling the GPU, robbing it of performance."
Going to take a wild guess and say someone's looking at doing the same thing based on mouse or camera movement mid-frame on a regular display.
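For what it's worth, here's a tiny, hypothetical Python sketch of the general timewarp idea being guessed at here: just before scanout, reuse the last rendered frame and nudge it by whatever rotation happened since it was rendered. Real implementations reproject with the full camera transform; this yaw-only pixel shift (the function name, angles and 90° FOV are all made up for illustration) only shows the principle.

import numpy as np

def timewarp_yaw(last_frame, yaw_at_render_deg, yaw_now_deg, horizontal_fov_deg=90.0):
    # Approximate a small yaw change by shifting the old frame horizontally.
    # last_frame is an (H, W, C) image array; angles are in degrees.
    height, width = last_frame.shape[:2]
    delta = yaw_now_deg - yaw_at_render_deg           # rotation since the frame was rendered
    shift_px = int(round(delta / horizontal_fov_deg * width))
    return np.roll(last_frame, -shift_px, axis=1)     # crude wrap-around shift

# Example: the last frame is ~16 ms old and the head/mouse turned 1 degree since then.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
warped = timewarp_yaw(frame, yaw_at_render_deg=10.0, yaw_now_deg=11.0)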
 

isamu

OMFG HOLY MOTHER OF MARY IN HEAVEN I CANT BELIEVE IT WTF WHERE ARE MY SEDATIVES AAAAHHH
WOW, you guys are talking as if the experience using a G-sync monitor is as good as an Oculus Rift or something.

Rift = the *ultimate* upgrade.
 
What makes you come to this conclusion?

Most monitors with extremely low response times are TN panels, which are going to be the worst-looking of all the monitor panels used today. Unfortunately, most manufacturer-sold 120Hz panels are going to be TN, with only a few overclocked "off-brand" IPS panels able to reach a somewhat stable 96Hz and maybe 120Hz. G-Sync is only available on TN.

If anything, games look best on an IPS panel when it comes to color reproduction and overall picture quality.

Latency matters more to me (and others) than minor color reproduction problems.

My IPS TV makes playing Super Meat Boy basically impossible.
 

dani_dc

Member
What I hate the most about this is that it's nothing but a band-aid, an anomaly, a shameless pitiful cop-out for all the developers that can't be assed to optimize their games properly. All it will do is allow them to get even lazier.

Games made by top-of-the-line AAA developers don't need G-sync. All the games made by Nintendo don't need G-sync. All of the games on WiiU made by Nintendo are absolutely amazingly stable and responsive, with not a single hiccup or line of tearing to be seen. Why can't others do the same?

You can't guarantee a game is going to run at 60fps on a variety of hardware with a variety of graphical choices made by the player. And even Nintendo games have slowdowns; see Xenoblade.

Not to mention that people also play games at over 60fps.

WOW, you guys are talking as if the experience using a G-sync monitor is as good as an Oculus Rift or something.

Rift = the *ultimate* upgrade.

I've used an Oculus Rift and I'm interested in one in the future, but they're different experiences. Right now, as a consumer, I'm more interested in G-Sync than Oculus. In the future we'll see. Plus, technologies like G-Sync/FreeSync would also improve the Oculus experience.
 

mephixto

Banned
What makes you come to this conclusion?

Most monitors with extremely low response times are TN panels, which are going to be the worst-looking of all the monitor panels used today. Unfortunately, most manufacturer-sold 120Hz panels are going to be TN, with only a few overclocked "off-brand" IPS panels able to reach a somewhat stable 96Hz and maybe 120Hz. G-Sync is only available on TN.

If anything, games look best on an IPS panel when it comes to color reproduction and overall picture quality.

I had two monitors sitting side by side on my desk: my old Samsung IPS and my BenQ XL2420T. I remember being disappointed with the BenQ the first time I tried it out; the colors were horrid compared with my old Samsung. I searched the web, calibrated it and enabled LightBoost (for whatever reason this BenQ has horrible purple bleeding without LightBoost), and finally it was somewhat decent, but still not on par with my old monitor.

Colors look best on IPS, but games perform better on TN. I stopped noticing the color difference after a few days.

Despite the color problems, gaming on it was amazing and responsive. I can't go back to playing on my old monitor; it's like having permanent motion blur and it feels sluggish.
 

rickyson1

Member
I have a hard time caring about G-Sync personally because it only works in full screen.

I use multiple monitors and vastly prefer to play in borderless windowed mode whenever possible.
 

Luigiv

Member
What I hate the most about this is that it's nothing but a band-aid, an anomaly, a shameless pitiful cop-out for all the developers that can't be assed to optimize their games properly. All it will do is allow them to get even lazier.

Games made by top-of-the-line AAA developers don't need G-sync. All the games made by Nintendo don't need G-sync. All of the games on WiiU made by Nintendo are absolutely amazingly stable and responsive, with not a single hiccup or line of tearing to be seen. Why can't others do the same?

That's all well and good for consoles running on TVs capped at 60Hz, but for PCs (which are currently the only devices that can run G-Sync, mind you), expecting all games to run a solid 60fps (let alone 144fps) on all possible setups is literally impossible. This doesn't excuse poorly optimised PC code, but let's put it this way: for someone like me, who's running an old GPU (GTX 670) that's capable of running most games at a framerate better than 60 but not necessarily at the rock-solid 144Hz my monitor is capable of, G-Sync is the perfect solution. It allows me to run my games at better than 60fps without having to deal with tearing, stutter or lag when my system can't keep up. As much as I do love the thing, my Wii U can't do that.

And what happens if, in the future, 120fps+ TVs become a reality? Do you think Nintendo or any other dev would be willing to start making their games a rock-solid 120fps? I doubt it. With limited console resources, the trade-offs for a framerate that high are just too big. However, what if these 120fps TVs and next-gen consoles are adaptive-sync compatible? All of a sudden this opens the door to seeing better-than-60fps games on consoles. If a game runs at a solid 60fps, that means internally the game must actually be running faster than 60fps most of the time, with 60fps as the targeted minimum. A good example of this is Red Steel 2, where the director went on record saying that the game was actually running at around 80fps internally. So in our hypothetical scenario, Nintendo could still design like they do now, with a 60fps target minimum, but allow the framerate to creep higher when it can without the artefacts that would normally entail. At the end of the day, a framerate that fluctuates between 60 and 80fps is still better than a framerate stuck at 60. It doesn't have to encourage laziness.
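Just to put numbers on the headroom argument (the ~80fps figure is taken from the post above; the rest is plain arithmetic, nothing official):

display_hz = 60
internal_fps = 80                      # roughly what Red Steel 2 reportedly sustained internally

frame_budget_ms = 1000 / display_hz    # 16.7 ms: how long a fixed 60 Hz panel shows each frame
actual_frame_ms = 1000 / internal_fps  # 12.5 ms: how long the engine actually needed

print(f"Headroom thrown away per frame on a fixed 60 Hz display: "
      f"{frame_budget_ms - actual_frame_ms:.1f} ms")
# An adaptive-sync display covering the 60-80 fps range could scan each frame
# out as soon as it's ready instead of holding it to the next 60 Hz boundary.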
 

Seanspeed

Banned
G-Sync/Adaptive-Sync/FreeSync + OR = godly?
I think it might be beneficial to VR as a sort of backup measure. Like, if your rig truly can't keep up with something in a certain part of a game/experience and your framerate drops, a variable refresh rate might limit the damage, but really, you shouldn't be dropping any frames if at all possible. And of course, at the framerates the consumer Rift will be running (90fps), the benefits of something like G-Sync are not immaterial, but definitely diminished somewhat.

So I wouldn't think it's really going to be anything huge for VR, but it might still be worth having for those situations where things aren't running ideally and G-Sync steps in to hopefully mitigate any resulting unease as much as possible.
 

Paganmoon

Member
I think it might be beneficial to VR as a sort of a backup measure. Like, if your rig truly cant keep up with something in a certain part of a game/experience and your framerate drops, a variable refresh rate might limit the damage of this, but really, you shouldn't be dropping any frames if at all possible. And of course at the framerates that consumer Rift will be running(90fps), the benefits of something like Gsync are not immaterial, but definitely diminished somewhat.

So I wouldn't think its really going to be anything huge for VR, but it might still be worth having for those situations where things aren't running ideally and Gsync steps in to hopefully mitigate any resulting unease as much as possible.

I think it'll not only be "material" but a downright requirement for VR if it's going to catch on for high-end, AAA-type games. For high-budget, high-graphics games you will always have frame drops unless you're far above your framelock, be it 60, 90 or 144fps. And in a VR situation, drops will be much more noticeable.
And honestly, if OR/VR is only targeted at people with rigs that can run "high quality" games at 90fps locked, it's bound to fail right at the gates, as those rigs will not be able to maintain 90fps locked (at max graphics) for very long anyway.

IMO, a variable refresh rate is a must for VR to succeed.

But what the fuck do I know anyway...
 

Seanspeed

Banned
I think it'll not only be "material" but a downright requirement for VR, if it's going to catch on for high-end AAA type games. For high budget, high graphics games, you will always have frame drops, unless you're far above your framelock, be it 60 or 90 or 144 fps. And for a VR situation, drops will be much more noticeable.
And honestly, if OR/VR is only targeted at people with rigs who can run "high quality" games at 90fps locked, it's bound to fail right at the gates, as those rigs will not be able to maintain 90fps locked (at max graphics) for very long anyway.

Imo variable refresh rate is a must for VR to succeed.

But what the fuck do I know anyway...
VR isn't gonna work like the current high-budget, high graphics industry works. VR *needs* a consistent, high framerate. Anything less and you're liable to make people uneasy or even sick. And nothing is going to put people off more than the mainstream complaining that doing shit in VR makes them sick, literally.

This is another big reason why games/experiences need to be built with VR in mind from the get-go. Right now, many people have the mindset of, "OK, I can't get 60fps, so I'll settle for 45-50fps" or whatever. That won't work in VR. It's max framerate or bust, basically. G-Sync will only go so far in eliminating the painfulness of fluctuating framerates. You absolutely need a solid, consistent, fast framerate in order to break the flicker threshold and give a person an experience they can play for a length of time. That is not an optional nicety, but a requirement.

So developers will have to keep this in mind and not design anything that is going to severely limit CPUs. That's the big thing. And of course, building an experience that takes it easy on GPUs is also good, but options can reduce the demand quite drastically in many cases. VR will absolutely require a step back in terms of graphics. Both the resolution and the design of the games cannot keep neck-and-neck with cutting-edge 2D gaming for the most part. Some will try, like Project Cars and Star Citizen, but these will be the extremely high-end experiences that only some will be capable of using in the short term. But graphics cards will get better and VR tech will get better, and even things like DX12 should help a lot, so there will be somewhat of a convergence on this stuff. Bottom line: if you want great VR, lower your expectations on the graphics fidelity aspect right now. That may sound disappointing, I know, but seriously, quality VR is such a revelatory experience that it won't matter. Great graphics might be something worth pining over, but I don't think anybody is going to be complaining when they're completely immersed in the most exciting thing in gaming since... well... ever, probably.
 

Paganmoon

Member
VR isn't gonna work like the current high-budget, high graphics industry works. VR *needs* a consistent, high framerate. Anything less and you're liable to make people uneasy or even sick. And nothing is going to put people off more than the mainstream complaining that doing shit in VR makes them sick, literally.

This is another big reason why games/experiences need to be built with VR in mind from the get-go. Right now, many people have the mindset that, "Ok, I cant get 60fps, so I'll settle for 45-50fps" or whatever. That wont work in VR. Its max framerate or bust, basically. Gsync will only go so far in eliminating the painfulness of fluctuating framerates. You absolutely need a solid, consistent, fast framerate in order to break the flicker threshold and give a person an experience they can play for a length of time. That is not an optional nicety, but a requirement.

So developers will have to keep this in mind and not design anything that is going to severely limit CPU's. That's the big thing. And of course, building an experience that takes it easy on GPU's is also good, but options can tend to reduce the demand quite drastically in many cases. VR will absolutely require a step back in terms of graphics. Both the resolution and the design of the games cannot keep neck-and-neck with cutting edge 2D gaming for the most part. Some will try, like Project Cars and Star Citizen, but these will be the extremely high end experiences that only some will be capable of using in the short term. But graphics cards will get better and VR tech will get better, and even things like DX12 should help a lot, so there will be somewhat of a convergence on this stuff, but bottom line - if you want great VR, lower your expectations on the graphics fidelity aspect right now. That may seem disappointing sounding, I know, but seriously, quality VR is such a revelatory experience that it wont matter. Great graphics might be something worth pining over, but I don't think anybody is going to be complaining when they're completely immersed in the most exciting thing in gaming since.......well......ever, probably.

I see your points, and you are most likely correct. I concede that. But expecting AAA devs to be able to guarantee 90fps in their games without drops (which can happen even at lower settings unless, as I stated earlier, you're far above your framelock, and honestly, you can still experience drops then) just tells me we won't be seeing AAA on VR for a long time. Or we will see AAA games on VR, but it won't be a very good experience due to frame drops, and for that variable sync will help. I don't think indie devs can really push VR to the masses (and VR to the masses is what needs to happen for it to stick). And I don't see that happening because it'll be an extra $300 on top of your computer, and I don't see the general public paying that just to play unknown indie games.

But as I stated in my earlier post, what do I know. I'm just talking out of my ass :)
 

riflen

Member
This is what Mike Abrash had to say about G-Sync and VR last year:

MAbrash says:
October 25, 2013 at 9:02 am
I’m not sure what the effect of G-SYNC on VR will be. On the one hand, having some flexibility in frame times is obviously helpful, especially when you can’t repeat frames without strobing. On the other hand, any significant variation in frame time with low persistence will cause variable problems with flicker and strobing. My current thinking is that slight (1ms or so) variation might be beneficial, but more than that will probably not produce good VR results. But that’s just a guess, and there’s only one way to find out.

So yeah, no substitute for constant high frame rate where VR is concerned.
 

Seanspeed

Banned
I see your points, and you are most likely correct. I concede that. But expecting AAA devs to be able to guarantee 90FPS on their games, without drops (which can happen even at lower settings, unless as I stated earlier, high above your framelock, and honestly, you can still experience drops then), just tells me we won't be seeing AAA on VR for a long time. Or, we will see AAA games on VR, but it won't be a very good experience, due to frame drops, and for that variable sync will help. I don't think indie devs can really push VR to the masses (and VR to the masses is what needs to happen for it to stick). And I don't see that happening cause, it'll be an extra $300 in addition to your computer, and I don't see the general public paying that just to play unknown indie games.

But as I stated in my earlier post, what do I know. I'm just talking out of my ass :)
Well, like I said, designing a VR game will require different priorities than designing a typical 2D game. The games we see nowadays that are only 30fps, or even struggling to hit 30fps, are a result of the developers knowing what they can get away with. Fact is, most console gamers have accepted 30fps and can mostly deal with 30fps with dips as well. Pubs/devs love this as it means they can push the graphics hard, as that's what they can market easily. It's very hard to market framerate.

But this doesn't mean that's all these developers can do. Now, 90fps is hardly something that is likely to be easy to achieve, but when you design the game around a performance target, rather than a certain level of graphics, it probably shouldn't be that huge of an issue. It will certainly require a change in design philosophy, though. Going for all-out graphics will take a backseat to performance.

It's a challenge, and it even creates an awkward segregation between normal gaming and VR gaming that might be difficult for these pubs/devs to balance, but I ultimately feel that once people start getting their hands on good-quality VR experiences, there will be an extreme hunger for more.

This is what Mike Abrash had to say about G-Sync and VR last year:



So yeah, no substitute for constant high frame rate where VR is concerned.
Right. It's certainly worth looking into, though.
 

Red Comet

Member
I want a G-Sync monitor so bad, but I'm saving my money for a 980 Ti right now. Actually, I've got a VG248QE, so if I ever find the DIY kit on the cheap, maybe I'll get that.
 
I want a G-Sync monitor so bad, but I'm saving my money for a 980 Ti right now. Actually, I've got a VG248QE, so if I ever find the DIY kit on the cheap, maybe I'll get that.

AFAIK those kits were a limited-time production run until regular displays with G-Sync are released.
 

Danny Dudekisser

I paid good money for this Dynex!
It's an awesome feature saddled with terrible TN monitors. I really can't wait to see this sort of thing implemented in a decent display, because it's actually pretty great.
 

Tain

Member
I would think G-Sync + VR's benefit would be less lag compared to vsync, not making things easier on the hardware.
 

Ziffles

Member
This thread is literally the first and only relevant result for "Rolling raster interlaced asynchronous time warp" on Google. Do you have a link to whatever you're talking about?

http://www.extremetech.com/gaming/1...g-feature-will-make-vr-easier-on-your-stomach

https://www.youtube.com/watch?v=WvtEXMlQQtI

It sounds like more of a software solution to prevent VR stuttering on frame rate hitches, rather than some magic bullet to increase frame rates. I don't really see how it would put G-Sync out to pasture.

Unless the whole "rolling raster interlaced asynchronous" part is some wild new technology. In that case, fill us in, Krejlooc.
 

sk3tch

Member
Is anyone rocking the BenQ XL2420G? I'm very close to pulling the trigger. I've heard the complaints about image quality, but I own 3 other XL2420xx monitors and they're great - I'm all about multiplayer gaming - so PQ isn't of paramount importance.

EDIT: just bought.

Some info that led to the purchase:
Review in progress - link
Thread by the same dude that's working on the review, above - link

It's the same "bad" picture quality that I have used for 2+ years with the XL2420xx series - so I'm not worried at all about that. I'll post up some impressions when I receive it. Excited to try out G-SYNC!!
 

Deleted member 17706

Unconfirmed Member
Is anyone rocking the BenQ XL2420G? I'm very close to pulling the trigger. I've heard the complaints about image quality, but I own 3 other XL2420xx monitors and they're great - I'm all about multiplayer gaming - so PQ isn't of paramount importance.

EDIT: just bought.

Some info that led to the purchase:
Review in progress - link
Thread by the same dude that's working on the review, above - link

It's the same "bad" picture quality that I have used for 2+ years with the XL2420xx series - so I'm not worried at all about that. I'll post up some impressions when I receive it. Excited to try out G-SYNC!!

Sorry for the semi-necro bump, but I just set up my new BenQ XL2420G. I've seen monitors with this refresh rate before, and even experienced G-Sync a bit before, but it's a different story when it's in your own home. Man, this is kind of magical. I'm sure G-Sync is just as good on any of the models, but this was the cheapest, best-rated one I could find, and I wanted to stick with 1080p for a while. I also needed the ability to hook up my PS4 to sometimes game on my monitor, and I think this is the only G-Sync monitor with more than just a single DisplayPort connection.

Damn, the title of this thread was not exaggerating.

I've still only messed around in a few games, but tearing is gone and framerate transitions are smooth as hell. I just ran around in Assassin's Creed: Unity, going between frame rates as low as 30 and as high as almost 90, and it *felt* completely smooth. I then messed around in Far Cry 3, where frame rates moved about between 60 and 100, and it was incredible. Big difference when you start going well above 60fps. It's just a whole new level of smoothness.

Not to mention how crazy it is to be messing around in Windows at 144Hz.

Anyway, I'm very impressed so far and extremely pleased with this purchase. I need to go load up some more fast paced action games and test this out some more.
 

Dries

Member
Well, I'm curious. What's the best G-sync monitor I can get at 1080p, nowadays? I don't really have a budget, I'm willing to buy good stuff.

Also, last I read there was an issue with G-sync in combination with DSR or something, right? Has that been solved?
 
G-Sync precludes the use of strobing (at least it did the last time I looked), which is much more desirable on a high-end gaming PC. It makes sense for lower-end machines that can't lock a high refresh rate, but then the price premium kinda defeats it. Building a high-end PC that can lock, say, 80Hz and then letting it flail around on a 120Hz+ G-Sync monitor, with all the blur of a sample-and-hold LCD, is just strange.
 

SliChillax

Member
Well, I'm curious. What's the best G-sync monitor I can get at 1080p, nowadays? I don't really have a budget, I'm willing to buy good stuff.

Also, last I read there was an issue with G-sync in combination with DSR or something, right? Has that been solved?

If you don't have a budget, go beyond 1080p; you'll never want to go back, even with just 1440p. There's the new Acer IPS 144Hz G-Sync monitor that just came out; it seems perfect.
 

Seanspeed

Banned
G-Sync precludes the use of strobing (at least it did the last time I looked), which is much more desirable on a high-end gaming PC. It makes sense for lower-end machines that can't lock a high refresh rate, but then the price premium kinda defeats it. I honestly don't understand why it's so popular.
Monitors with ULMB achieve the same thing as LightBoost, and with better colors as well.

And you're incorrect that G-Sync is only good for lower-end machines. People running modern games at very high settings and/or high resolutions will often find themselves beneath 60fps even with good hardware. Certain games can run into a CPU bottleneck that won't let them get into the 90+ fps range. And of course, many games don't even allow for >60fps framerates in the first place.

G-Sync combined with 144Hz and ULMB basically covers everything. Throw in a 1440p resolution and you've got a monitor that will be impossible to *fully* utilize in most modern games, even with top-grade components.
 
G-Sync combined with 144Hz and ULMB basically covers everything.
Don't get me wrong, variable sync should've been around from the get-go. But when it comes at the cost of strobing, it's absolutely no contest IMO. We can't have both, yet.

People running modern games at very high settings and/or high resolutions will often find themselves beneath 60fps even with good hardware
Then they should turn down their settings/resolution; sub-60Hz on a sample-and-hold LCD is full of motion blur and loses huge amounts of detail. The only way to escape the inherent flaws of LCD is to boost the native refresh rate (i.e. 120Hz+) and lose all your eye candy, or use strobing and tolerate some flicker/loss of light output.
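A rough back-of-the-envelope sketch of the sample-and-hold point: when your eye tracks a moving object, the perceived smear is roughly the object's speed multiplied by how long each frame stays lit. The speed and persistence values below are assumed round numbers for illustration, not measurements of any particular panel.

def eye_tracking_blur_px(speed_px_per_s, persistence_ms):
    # Approximate blur width in pixels for an object the eye is tracking.
    return speed_px_per_s * (persistence_ms / 1000)

speed = 1920  # an object crossing a 1080p-width screen in one second
cases = {
    "60 Hz sample-and-hold (~16.7 ms persistence)": 16.7,
    "120 Hz sample-and-hold (~8.3 ms persistence)": 8.3,
    "strobed backlight (~2 ms persistence)": 2.0,
}
for name, persistence_ms in cases.items():
    print(f"{name}: ~{eye_tracking_blur_px(speed, persistence_ms):.0f} px of smear")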
 

elelunicy

Member
Have you seen it in person? It's not about a single line not tearing, it's about the whole image being consistently smooth without stuttering, tearing and lag. It's something that you have to see in person to understand.
Anyway, no tearing, no stutter, no framerate halving like you get with a double buffer, but less input lag than a triple buffer. What's there not to love? Well, maybe the price of entry, I'll give you that. But that will come down in time, hopefully to the point where all monitor manufacturers will include it as a standard feature, because there's no reason not to.

Since when does G-Sync eliminate stuttering now? If a frame takes too long to render (50ms or more), which happens all the time in games that stutter a lot, G-Sync doesn't really do anything to it.
 

Corpekata

Banned
Well, I'm curious. What's the best G-sync monitor I can get at 1080p, nowadays? I don't really have a budget, I'm willing to buy good stuff.

Also, last I read there was an issue with G-sync in combination with DSR or something, right? Has that been solved?

The issues with G-Sync and DSR are with SLI. It works with single cards.

Since when does G-Sync eliminate stuttering now? If a frame takes too long to render (50ms or more), which happens all the time in games that stutter a lot, G-Sync doesn't really do anything to it.

They probably meant judder.
 

UnrealEck

Member
Well, I'm curious. What's the best G-sync monitor I can get at 1080p, nowadays? I don't really have a budget, I'm willing to buy good stuff.

Since you don't really have a budget, buy an Acer Predator XB270HU and a Titan X to power it.
 
So, possibly a stupid question here:

Is there any advantage at all to G-Sync if you've got a system capable of running every game at a locked 60FPS? Or does a G-Sync monitor pretty much only come in handy when you're floating somewhere in that 30-60 FPS range?
 

riflen

Member
Since when does G-Sync eliminate stuttering now? If a frame takes too long to render (50ms or more), which happens all the time in games that stutter a lot, G-Sync doesn't really do anything to it.

It has always eliminated the stuttering caused by playing with Vsync on. That's what people are referring to.
 

Belmire

Member
I got the ROG Swift (TN) about a week after it came out and I'm running it on two 980s. Upgraded from a Korean 1440p IPS 60Hz monitor, a Qnix 27-inch. Even with web browsing being better on the Qnix, it doesn't touch the Swift when it comes to games. I hover around 140fps in almost every game without a trace of tearing or (j)stuttering... even BF4 almost maxes out.

Of course... I don't know how long I'm gonna last before caving and getting two Titan Xs and a 4K G-Sync screen. The new cards seem a long way off...
 

Belmire

Member
So, possibly a stupid question here:

Is there any advantage at all to G-Sync if you've got a system capable of running every game at a locked 60FPS? Or does a G-Sync monitor pretty much only come in handy when you're floating somewhere in that 30-60 FPS range?

If you're doing 60fps at 1080p ALL the time with Vsync on, then no, don't get G-Sync. It really becomes a thing of beauty when you get over 60fps, so it's nice to have a 120/144Hz refresh rate. If your system can always max out your monitor's refresh rate (60 or 120/144), then G-Sync is kind of useless.
 