
G-Sync is the god-level gaming upgrade.

Dries

Member
Swift is TN and Acer is IPS, so it's really up to your preference there. Given your other posts you will likely favor the Acer, as IPS is better for image quality. Spec-wise the Acer only really falters on response time, and even that's pretty minimal. At the same cost, the Acer seems superior in nearly every way. Both monitors have had some troubling QA issues, so even that shouldn't be a big factor.

True. I can't wait to experience this upgrade. I'm coming from a non-IPS 22 inch 1080p screen, so I expect to be blown away. I think this could almost even be a better upgrade than my 980.
 

SliChillax

Member
True. I can't wait to experience this upgrade. I'm coming from a non-IPS 22 inch 1080p screen, so I expect to be blown away. I think this could almost even be a better upgrade than my 980.

You will be blown away, guaranteed 200%.
 

AndyBNV

Nvidia
Longest wait ever.

So... concerning G-sync, do I have to install new display drivers when I hook up my new screen? Or do I have to enable it in nvidia control panel?

If you have the latest drivers just enable it in the NVIDIA Control Panel. Alternatively, reinstall the drivers and it'll auto-detect a G-SYNC monitor and enable everything for you.
 
So I guess I'll pop this in here too. I'm upgrading my GPU to a 980, and I'd like to get a monitor with G-Sync.

What are some good ones that will allow me to max out this year's games? It doesn't need to be 1440p.
 

Yasir

Member
What are the best G-Sync monitors? Looking at QHD ideally, but there seem to be quite a few with good ratings.
 
Was just about to upgrade my non-IPS 23 inch 1080p monitor to the ASUS MX279H but I'd love to bump it up even further to a g-sync monitor. Is there a consensus on the best ones out right now? Sounds like the Swift is riddled with QC problems.
 

boiled goose

good with gravy
Not irrelevant, no. Higher framerates are still going to result in smoother gaming. There just won't be any need to cap/lock a game at 30fps anymore.

Also, this is just about the display. Input is still tied to the game's frame rate, i.e. 30 fps fighting games would still suck; they'd just look better.
 

Afro

Member
So I guess I'll pop this in here too. I'm upgrading my GPU to a 980, and I'd like to get a monitor with G-Sync.

What are some good ones that will allow me to max out this year's games? It doesn't need to be 1440p.

Check out the AOC G2460PG 24" if you're on a budget (sub-$500) or are content with 1080p for the time being. I'm rocking one right now using Tom's Hardware's calibration settings. I remember it reviewing slightly better than the similarly priced BenQ and Asus monitors. Only one DisplayPort input though. No HDMI, DVI, etc.

That new Acer XB270HU 27" IPS looks pretty perfect if you want to spend almost double (worth it).

Not sure if any superior sub-$500 g-sync monitors have been released since I bought the AOC a few months ago though.
 
Just saw a pre-order page on Amazon for BenQ's new 27" 1440p, 144Hz, FreeSync monitor coming soon. It looks great... but it's a TN panel and the list price appears to be ~$750, which is the same price as the Acer 27" IPS G-Sync. I'd guess the BenQ will need to drop in price quickly unless there are large swaths of people who aren't aware how much better IPS panels look. Unless there have been some major improvements in TN panels over the past couple of years that I'm not aware of, the price seems high when I can get an IPS for the same money.

The other difference is FreeSync vs G-Sync, which seems like a wash and up to GPU preference. One thing I don't like about this whole G-Sync/FreeSync thing is being locked into one GPU brand or the other by a large investment in the monitor. Kinda sucks & makes me hesitant to invest in one or the other until the industry settles on one standard....
 
Longest wait ever.

So... concerning G-sync, do I have to install new display drivers when I hook up my new screen? Or do I have to enable it in nvidia control panel?

If you don't have one already, grab yourself a decent Displayport 1.2 cable, which is required for G-Sync monitors. You can get a good cheap one from Amazon or Monoprice if you're in the US.
 

Dries

Member
If you don't have one already, grab yourself a decent Displayport 1.2 cable, which is required for G-Sync monitors. You can get a good cheap one from Amazon or Monoprice if you're in the US.

Well, my Acer Predator XB270HU already comes with a displayport cable, so.... that one is probably fine, right?
 

SliChillax

Member
Well, my Acer Predator XB270HU already comes with a displayport cable, so.... that one is probably fine, right?

I don't know about Acer, but the DisplayPort cable that came with my ROG Swift is rubbish. I thought the panel had issues: I sometimes got flickering, or the screen would go black for a few seconds every day. I changed the DisplayPort cable and I haven't had any problems since.
 

Dries

Member
I don't know about Acer, but the DisplayPort cable that came with my ROG Swift is rubbish. I thought the panel had issues: I sometimes got flickering, or the screen would go black for a few seconds every day. I changed the DisplayPort cable and I haven't had any problems since.

Wow, that's pretty unforgivable imo. You should expect top-notch quality with the amount of cash we're spending on our hardware. *sigh*
 
True. I can't wait to experience this upgrade. I'm coming from a non-IPS 22 inch 1080p screen, so I expect to be blown away. I think this could almost even be a better upgrade than my 980.

Already made a post about it in another thread, but will repeat myself.
Yesterday I also upgraded from a 22 inch 60 Hz 1080p screen and yeah, the differences are pretty mindblowing. I got the ROG Swift, so yours should be even better. (I got the ROG Swift because the only way to get a new monitor was with my Micro Center credit card, and they said they wouldn't be getting any of the new Acers for a while.)
 

Dries

Member
Crosspost from PC thread:

Sup, PC-GAF

I'm in search of a new DisplayPort cable. I just ordered an Acer XB270HU and I've read that the DisplayPort cable that comes packaged with the monitor is kinda short, so I want a longer one. Couple of questions:

- Are there big differences in quality between different DisplayPort cables?

- If there are, what are the things I should keep an eye on?

- The DisplayPort cable will be working with an Acer XB270HU, so it should support 1440p, G-Sync and DisplayPort 1.2. Will all DisplayPort cables work fine with an Acer XB270HU? Or will some work "better" than others?

- How do 3rd party DisplayPort cables work with hardware? Should I be wary of anything? Will I be at a disadvantage by not using the packaged cable?

- I'll be going from roughly 1m (the packaged cable) to 2m. Will there be any noticeable quality loss?

Lastly, could anyone advise a specific brand/product? Thanks!!!
 
I bought my AOC G-Sync screen last year when my old one broke. Didn't really play much on PC since then and I like some games more on TV. So GTA5 is the first game that really takes advantage of the tech for me. It's amazing! Running the game from 60-90 fps and sending that GPU and CPU to work. :)
 
Is the new Acer 144 Hz 1440p IPS G-Sync only available in 27''? Is there a cheaper 24'' available?

Looks like it's 27" only.

If price is a concern & you might consider an AMD GPU instead, the similarly specced Asus 27" IPS 1440p/144Hz FreeSync monitor is coming soon with a list price of $599.

It has a 40 Hz floor for VRR, which may be a concern for some, so I'll be researching it extensively before making a purchase sometime this summer, after AMD's new Islands GPUs drop & we can get a better look at working silicon & monitor tech. In any case, this Asus monitor is rumored to use the exact same AU Optronics panel as the Acer G-Sync model, but my guess is the Asus will have better build quality. If it's as good as the Acer, the Asus starting $200 cheaper out of the gate is going to be awfully tough to pass up. That $600 price is awfully close to my insta-buy point of $500 for one of these new IPS adaptive sync monitors. Unless FreeSync is notably inferior, I can't see how the Acer can justify a $200 premium.
 

Dries

Member
Thanks for the tips on the DisplayPort cables, guys.

Btw, are there also DisplayPort cables or monitors that support 4K @ 144Hz? Or is that future stuff?
 
Thanks for the tips on the DisplayPort cables, guys.

Btw, are there also DisplayPort cables that support 4K @ 144Hz?

All DP 1.3 certified cables will support it (or something like 127Hz at 4K, I don't remember the exact calculation). But I don't know if any of them exist yet ;)


Also, no current GPU or monitor has DP 1.3, AFAIK.
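
Rough back-of-the-envelope for anyone who wants to sanity-check that figure - a quick sketch in Python using DP 1.3's published link rate (the exact ceiling depends on blanking timings, which this ignores):

# DP 1.3 (HBR3): 4 lanes x 8.1 Gbps with 8b/10b encoding -> 25.92 Gbps usable
lanes = 4
lane_rate_gbps = 8.1
effective_gbps = lanes * lane_rate_gbps * 8 / 10        # 25.92 Gbps after encoding overhead
bits_per_frame = 3840 * 2160 * 24                       # 4K at 24-bit colour, active pixels only
max_refresh_hz = effective_gbps * 1e9 / bits_per_frame
print(round(max_refresh_hz, 1))                         # ~130.2 Hz before blanking overhead

Blanking intervals eat some of that, so real timings land closer to 120 Hz - either way, the ~127Hz figure quoted above is in the right ballpark.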
 
Saw a g-sync demo the other day and was impressed... but sad that Nvidia chose the exclusivity/proprietary route. I'm not going to throw money at that any time soon because of that.
 

Water

Member
Bad news for those waiting for the Acer XR341CK ultrawide monitor: the refresh rate has been downgraded to 75 Hz max.

http://www.tftcentral.co.uk/news_archive/33.htm#acer_xr341ck

Weird. I can't say I care much about it going all the way up to 144 Hz, but I was hoping for 100-ish Hz. Won't ULMB at a refresh as low as 75 Hz produce flickering? I'm still happy that they seem to be including the option. On the other hand, it's very interesting that there's supposed to be a second input and not just DisplayPort.
 

AJLma

Member
Bad news for those waiting for the Acer XR341CK ultrawide monitor: the refresh rate has been downgraded to 75 Hz max.

http://www.tftcentral.co.uk/news_archive/33.htm#acer_xr341ck

I can live with this, 75Hz with G-Sync is a great target for my single GPU rig.

It's cool to see Acer actually pushing the boundaries of what can be done with IPS tech for the demands of what is (for now) a niche few. It'd be nice if they could get it up to 90 before launch. >_>
 
Saw a g-sync demo the other day and was impressed... but sad that Nvidia chose the exclusivity/proprietary route. I'm not going to throw money at that any time soon because of that.

Yeah, this is my general feeling too. I tend to keep monitors a lot longer than video cards. A monitor is at least a 5 year purchase for me (hell, now that I think about it I've never actually sold any of the flat-panels I've purchased over the years, they're all being put to use in some way...although I may start culling some of the older ones soon).

I'd guess it's similar for most people: they might switch out their GPU every year or two, but when most people buy a monitor or a TV they tend to keep it a long time. So the idea of being locked into only one brand of GPU to get the most out of a monitor's capabilities doesn't sit right.

I have more nVidia GPUs than AMD GPUs at the moment (love the 750 Ti for a variety of tasks & a 960 in my HTPC), so this isn't an issue of AMD loyalty for me. I just don't like the idea of not being able to take advantage of adaptive sync in an expensive G-Sync monitor if I switch my main gaming rig over to an AMD GPU at some point. G-Sync may have an advantage (I've yet to compare the two techs side-by-side in person), but I'm not sure it's worth the $200+ premium over FreeSync on otherwise identical panels with similar features.

Over the next couple years the market will no doubt choose one over the other & settle on a single standard. I think FreeSync is more likely to win this one because a lot of people are going to feel the same way. I suppose we shall see.
 
Has anyone else had trouble getting DSR to work after the latest Nvidia patch? My monitor loses signal if I try to activate it in the control panel. I have a single GPU, and it used to work flawlessly before.
 
Since the whole G-Sync/FreeSync thing is all the rage right now, I thought about starting a new thread for this review, but figured I'd put it here first so people have some counterpoints to consider:

http://techreport.com/review/28073/b...nitor-reviewed

I won't personally buy a TN panel since IPS/AHVA are so much nicer to look at. But, as TN panels go this one is apparently very good. And, more importantly, this review does seem to clear up some of the misinformation/hyperbole out there about the differences between G-Sync/FreeSync....

A couple observations from the summary:

"Spending time with a FreeSync monitor and walking through the gauntlet of supposed issues has crystallized my thoughts about some things. AMD and its partners have succeeded in bringing variable refresh technology to market using an open, collaborative approach. The concerns we've seen raised about niggling problems with FreeSync displays in specific cases, such as low-FPS scenarios and ghosting, are really nibbling around the edges."

"...the BenQ XL2730Z is good enough that I think it's time for the rest of the industry to step up and support the VESA standard for variable refresh rates...there is no technical reason why today's GeForce GPUs can't support variable refresh on Adaptive-Sync displays. All it would take is a driver update. If Nvidia really believes G-Sync offers compelling advantages over Adaptive-Sync, it should show that faith by supporting both display types going forward. Let consumers choose."

The upshot of this review for me is that I'm now eagerly awaiting a review of the Asus MG279Q (IPS panel) before making a purchase decision. After viewing the differences with my own eyes, I do slightly prefer G-Sync (mostly because of the VERY slight differences at the "below 40Hz" threshold; the "ghosting problem" is a total non-issue from the looks of it). But proprietary is bad & right now that means nVidia is the one causing the problem. The other way for that problem to be solved is for a display manufacturer to come out with a display that works with both techs. In the meantime, as the reviewer noted, nVidia should issue a driver update to enable VESA Adaptive-Sync (FreeSync) compatibility on their GeForce GPUs & let consumers decide. Of course, what nVidia should do & what they will actually do are probably very different things. My guess is they won't give up this proprietary fight until the market forces them to...
 
Unless it's patented by Nvidia? I have no idea though, I'm talking out of my ass.

From what I understand, there is nothing preventing AMD from modifying their driver to output the same frame twice or thrice and get the same behavior as G-Sync below 40Hz (or whatever minimum VRR level a particular panel sets). Apparently AMD has hinted such a thing is on the way. If so, my guess is that they delayed that feature to get the driver out now for use with the first FreeSync monitors & have been ironing out kinks. There is no technological hurdle to caching the frame on the video card instead of the monitor. And there is certainly no legal barrier, since FreeSync is just a branded form of the VESA-standard Adaptive-Sync, which covers the gamut of frame tricks that both companies use. nVidia already essentially uses a form of non-G-Sync adaptive refresh tech in their laptops since, lo and behold, laptops don't contain G-Sync modules. So that pretty much confirms that an nVidia driver to allow FreeSync compatibility with all GeForce GPUs would be incredibly easy to release... since it basically already exists.

Assuming AMD can correctly implement some form of frame redraw multiplication at the low end of the range, and if the manufacturers can get any niggling ghosting issues under control in their TCONs, the two techs will be roughly equal above and below FreeSync's VRR range. There may be other advantages G-Sync has that I'm missing, but from the looks of it nVidia's proprietary and more expensive "module" option was needed primarily to get their product to market first. It's slightly better right now, but that may change soon. On a side note, above the VRR range FreeSync actually has an advantage, as G-Sync forces V-Sync on while FreeSync leaves it selectable.
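
To make that frame-multiplication idea concrete, here's a minimal sketch in Python of how a driver could pick a redraw multiplier so the panel always refreshes inside its VRR window. The numbers and the function are hypothetical - this is not AMD's or nVidia's actual driver logic, just the arithmetic being described:

# Hypothetical sketch of low-framerate frame multiplication (not real driver code).
def pick_refresh(frame_time_ms, vrr_min_hz=40, vrr_max_hz=144):
    """Redraw the same frame `multiplier` times so the effective refresh
    rate stays inside the panel's [vrr_min_hz, vrr_max_hz] window."""
    source_hz = 1000.0 / frame_time_ms
    multiplier = 1
    while source_hz * multiplier < vrr_min_hz:
        multiplier += 1
    return multiplier, min(source_hz * multiplier, vrr_max_hz)

print(pick_refresh(40.0))   # 25 fps frame -> drawn twice, panel runs at 50 Hz
print(pick_refresh(16.7))   # ~60 fps frame -> drawn once, panel follows at ~59.9 Hz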
 

Corpekata

Banned
In the Nvidia Control Panel I assume? So will in-game V-sync options be greyed out or something?

Control Panel, yes, though I'm pretty sure it'll default to that setting either way. No, they won't be greyed out; you're just telling your GPU to override the setting, so no matter what you do in the game it won't affect anything regarding V-Sync.

Similar to how, if you set global AF to 16x and a game gives you the option and you set it to 4x, you'd still be getting 16x in the game.
 

Dries

Member
Control Panel, yes. No, they won't be greyed out; you're just telling your GPU to override the setting, so no matter what you do in the game it won't affect anything regarding V-Sync.

Similar to how, if you set global AF to 16x and a game gives you the option and you set it to 4x, you'd still be getting 16x in the game.

Great, thanks! Can't wait to experience G-sync. Monday is the day!
 

AndyBNV

Nvidia
Regarding G-SYNC being always-on now, this is by design and is a good thing. However, it makes using ULMB a little trickier at the moment, and this is what our driver team had to say about it:

NVIDIA said:
In our recent driver, when G-SYNC is enabled in the NV Control Panel, it is always enabled and now supersedes in-game settings. A consequence for example, is that an end-user wanting to use ULMB mode in a game, must first turn-off G-SYNC in the NV Control Panel. We are investigating alternatives for a future driver release, but for now, use the NV Control Panel to turn off G-SYNC before using ULMB mode.
 

antti-la

Member
I would love to see frame interpolation implemented in a way that doesn't introduce so much input lag. Sometimes I turn it on on my TV when I play sub-60fps games which aren't too fast-paced, and it's awesome - everything is super smooth.

Sony is already doing a similar thing with Morpheus. Now do it with your TVs already.

Also, I have been wondering whether people who say they don't notice a difference between 30 and 60fps have this enabled on their TVs (it's usually on in the standard settings) and that's the reason why.
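
For what it's worth, some of that lag is inherent to how interpolation works: the TV has to hold frame N until frame N+1 arrives before it can generate anything in between. Rough arithmetic in Python, with a made-up processing figure rather than a measurement of any particular set:

source_fps = 30
frame_time_ms = 1000 / source_fps   # 33.3 ms per source frame
processing_ms = 20                  # assumed time spent generating the in-between frames
# The interpolated output can only appear after the *next* real frame exists,
# so you pay at least one source-frame of delay plus the processing time.
added_lag_ms = frame_time_ms + processing_ms
print(round(added_lag_ms, 1))       # ~53.3 ms on top of the display's normal lag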
 
Regarding G-SYNC being always-on now, this is by design and is a good thing. However, it makes using ULMB a little trickier at the moment, and this is what our driver team had to say about it:

3D Vision had a separate and simple .exe / keystroke to turn it on and off at the driver level (in game or out of game). You should try and see if something similar is possible for G-Sync ULMB.
 

bizzle

Neo Member
3D Vision had a separate and simple .exe / keystroke to turn it on and off at the driver level (in game or out of game). You should try and see if something similar is possible for G-Sync ULMB.
Why do all that when the way it was before the latest patch was working correctly?

Unless edge cases existed that we haven't been made aware of, I've not seen anyone post that they like the current implementation compared to the older one.
 

ss_lemonade

Member
I would love to see frame interpolation implemented in a way that doesn't introduce so much input lag. Sometimes I turn it on on my TV when I play sub-60fps games which aren't too fast-paced, and it's awesome - everything is super smooth.

Sony is already doing a similar thing with Morpheus. Now do it with your TVs already.

Also, I have been wondering whether people who say they don't notice a difference between 30 and 60fps have this enabled on their TVs (it's usually on in the standard settings) and that's the reason why.
Doesn't that introduce artifacts, or is it dependent on your TV? I know mine does (Sony W900A), and that, coupled with the input lag, makes it a pretty terrible setting to have enabled IMO.
 

Corpekata

Banned
Why do all that when the way it was before the latest patch was working correctly?

Unless edge cases existed that we haven't been made aware of, I've not seen anyone post that they like the current implementation compared to the older one.

There are a lot of games that don't have real fullscreen options (including big releases like The Evil Within), so having G-Sync work on them is better than making ULMB a slight hassle.
 

SparkTR

Member
What is G-Sync?

It syncs your monitor's refresh rate with the game's frame rate, so when a game's frame rate varies between 40 and 120 fps (or anywhere above 30fps) it feels smooth, similar to a locked 60fps or better. It also eliminates screen tearing and cuts the input lag you'd otherwise get from V-Sync.
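
If it helps, here's a toy illustration in Python of why that feels smoother (simplified on purpose - real scanout is more involved): with a fixed 60 Hz refresh, a frame that finishes between ticks has to wait for the next one (or tear), while a variable-refresh display starts scanning out as soon as the frame is ready, limited only by the panel's maximum rate:

import math

def display_time_fixed(frame_ready_ms, refresh_hz=60):
    """Fixed refresh: the finished frame waits for the next scanout tick."""
    period_ms = 1000.0 / refresh_hz
    return math.ceil(frame_ready_ms / period_ms) * period_ms

def display_time_vrr(frame_ready_ms, max_hz=144, last_scanout_ms=0.0):
    """Variable refresh: scan out when the frame is ready, capped by max rate."""
    min_period_ms = 1000.0 / max_hz
    return max(frame_ready_ms, last_scanout_ms + min_period_ms)

print(round(display_time_fixed(20.0), 1))   # 33.3 -> frame sits ~13 ms waiting
print(round(display_time_vrr(20.0), 1))     # 20.0 -> shown the moment it's done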
 