
G-Sync is the god-level gaming upgrade.

Jimrpg

Member
Yeah that's what I'm using at the moment.

It's not the greatest monitor ever, but it's pretty affordable by G-Sync standards. As long as you're cool with 1080p (which you should be if you're playing EUIV :p)



According to those AMD slides the other day, G-Sync monitor refresh rates bottom out at 30 Hz. So it's a standard monitor below 30, presumably. Although theoretically, if it's running at 144 Hz, all sorts of funky low framerates will still look fine.
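
One way that could work is for the driver to scan each rendered frame out more than once, so the panel stays inside its 30-144 Hz window. That's just an illustration of the idea, not anything NVIDIA has documented here; a minimal sketch:

```python
# Sketch of the frame-repeating idea (my assumption, not NVIDIA's documented
# algorithm): keep the panel inside its 30-144 Hz window by scanning each
# rendered frame out an integer number of times.

PANEL_MIN_HZ = 30.0
PANEL_MAX_HZ = 144.0

def scanout_rate(game_fps: float) -> tuple[int, float]:
    """Return (repeats per frame, effective scan-out rate in Hz)."""
    repeats = 1
    while game_fps * repeats < PANEL_MIN_HZ:
        repeats += 1
    rate = game_fps * repeats
    if rate > PANEL_MAX_HZ:
        raise ValueError("rate falls outside the panel's refresh window")
    return repeats, rate

for fps in (20, 25, 40, 90):
    repeats, hz = scanout_rate(fps)
    print(f"{fps} fps -> draw each frame {repeats}x, panel refreshes at {hz:.0f} Hz")
```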

The AOC 24-inch G-Sync is the one I want to get.

What made you get that one?
 
I'm extremely pleased with my Acer XB280HK monitor. Playing with G-Sync at 4K is pure bliss, even though I usually have to knock a few settings down.

That said, I haven't seen stuttering completely expunged. You know those weird stutters you get with recent Ubisoft games? Those are still extremely apparent (which makes sense, considering the frame rate is essentially dropping to single digits/teens for a split second).

Ironically, Assassin's Creed Unity was the game where I saw the most benefit from G-Sync.
 

TSM

Member
The pendulum demo is highly misleading, in that it isn't particularly graphically intensive, and so they can get away with artificially capping the framerate and still get no dropped frames.

Real games will suffer legitimate cases of dropped frames and you will still notice hitching on g-sync when it happens.

Games don't "drop" frames when using G-Sync. You can get frames that take excessively long to render, which results in something that looks kind of like judder.
 
TVs aren't going to suddenly have displayport support, HDMI is entrenched in that market and there's no reason for any of the manufacturers to add it.
TVs aren't going to suddenly have HDMI support, RCA is entrenched in that market and there's no reason for any of the manufacturers to add it.
It's also possible that the AMD GPUs in the consoles don't actually have the display controllers to support the feature, which would mean they'd actually have to redesign the SoC to add it. It's probably more complicated than just adding the port.
The GPU in the Xbox 360 doesn't actually have the display controller to support HDMI, which would mean they'd actually have to redesign the SoC to add it. It's probably more complicated than just adding the port.

Just making a point. These are the same arguments against HDMI from 2005.

I think that if Sony can get some sort of competitive (marketing) edge by supporting a display synced to the framerate of the GPU, they'll find a way to do it. Stick a demonstration setup in every Best Buy, Costco, Gamestop, and Fred Meyer.
 
I have been using G-Sync since March. I run VLC with G-Sync and it's flawless for 24 Hz and 25 Hz content. Below 30 fps, G-Sync falls back to normal V-sync, but it still updates at a maximum of 144 Hz, so you get a judder-free experience with fixed frame rate video.

Do you know the settings for this? I'm working on this in Kodi right now, and it looks pretty damn good. None of the stutter that I see playing 23.98 Hz video on my 60 Hz TV.

EDIT: Of note, the LCD still suffers from frame change bleed. I suppose it will be a little while until someone works out how to mix ULMB and G-Sync/FreeSync/Adaptive-Sync. It just won't happen on this monitor.
 
The AOC 24-inch G-Sync is the one I want to get.

What made you get that one?

When I ordered it, it was the only 1080p G-Sync monitor I could get in Australia, and one of only two 1080p ones available anywhere in the world. They didn't sell DIY kits here, and I would have had to import the Asus one from another country. I think there are one or two more available around the world now. I'm not sure how it compares to other 1080p G-Sync displays that are currently available.
 
TVs aren't going to suddenly have HDMI support, RCA is entrenched in that market and there's no reason for any of the manufacturers to add it.
The GPU in the Xbox 360 doesn't actually have the display controller to support HDMI, which would mean they'd actually have to redesign the SoC to add it. It's probably more complicated than just adding the port.

Just making a point. These are the same arguments against HDMI from 2005.

Except that first generation GCN GPUs lack the "required complexity" to support Freesync per AMD.

Whatever it is, it's clear that not all display controllers are created equal. Otherwise all AMD GPUs and not just the most recent would support it. The consoles could be in the same boat.

Logically this extends to Nvidia and Intel GPUs also. It might not be a matter of want but a matter of can't.
 

TSM

Member
Do you know the settings for this? I'm working on this in Kodi right now, and it looks pretty damn good. None of the stutter that I see playing 23.98 Hz video on my 60 Hz TV.

That's the big benefit. G-Sync will render at the exact frame rate of the video, even if it's something way off-standard like 28.037 fps or something else weird. No more having to run your desktop at a refresh rate that's a multiple of one of the video frame rate standards to get flawless playback.
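
As a rough worked example of what that fixed-refresh judder looks like in numbers (a simplification, not how VLC or Kodi actually schedule frames):

```python
# Why 23.976 fps video judders on a fixed 60 Hz display but not with a
# variable refresh rate. The numbers are exact; the cadence logic is simplified.

video_fps = 23.976
fixed_hz = 60.0

print(f"refreshes per video frame: {fixed_hz / video_fps:.3f}")   # ~2.503

# Fixed 60 Hz: frames end up held for 2 or 3 refreshes (the 3:2 cadence),
# i.e. roughly 33.3 ms and 50 ms in an alternating pattern.
print("fixed 60 Hz hold times:", round(2000 / fixed_hz, 1), "ms /",
      round(3000 / fixed_hz, 1), "ms")

# Variable refresh: every frame is held for exactly one video frame period.
print("variable refresh hold time:", round(1000 / video_fps, 1), "ms, every frame")
```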
 
Except that first generation GCN GPUs lack the "required complexity" to support Freesync per AMD.

Whatever it is, it's clear that not all display controllers are created equal. Otherwise all AMD GPUs and not just the most recent would support it. The consoles could be in the same boat.

Logically this extends to Nvidia and Intel GPUs also. It might not be a matter of want but a matter of can't.

There's no can't. Just "How much?". Someone will do a cost/benefit analysis and it will either happen or it won't for one or more of the current generation of consoles.
 

DarkoMaledictus

Tier Whore
When 120 Hz, 4K, G-Sync becomes a thing on quality 30-inch monitors I will be buying... well, that or FreeSync, whatever makes more sense at that time!
 
I'd take a 4K monitor and an Oculus Rift with a high-end graphics board over a G-Sync monitor any day. I just brute force my way to 60 fps and enjoy much larger screens of over 31 inches.

-M
 
I have a GTX 570 and a 144 Hz monitor. I don't get the big deal over G-Sync. I have enough power to avoid screen tearing or frame drops in some games. Not the newest ones, that's for sure.
 

CND

Neo Member
I'm extremely pleased with my Acer XB280HK monitor. Playing with G-Sync at 4K is pure bliss, even though I usually have to knock a few settings down.

That said, I haven't seen stuttering completely expunged. You know those weird stutters you get with recent Ubisoft games? Those are still extremely apparent (which makes sense, considering the frame rate is essentially dropping to single digits/teens for a split second).

Ironically, Assassin's Creed Unity was the game where I saw the most benefit from G-Sync.

You can try capping your frame rate to something like 40, which will help reduce the frame time variance, leading to a subjectively smoother experience.

Capping a frame rate to anything other than a standard refresh rate results in persistent micro stutter on traditional monitors, but with G-Sync it works fine.
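
The arithmetic behind that 40 fps cap, roughly (simplified; real frame pacing is messier):

```python
# Why a 40 fps cap micro-stutters on a fixed 60 Hz panel but looks even with
# G-Sync (assumes v-sync in the fixed-refresh case).

cap_fps = 40
fixed_hz = 60

refresh_ms = 1000 / fixed_hz        # 16.7 ms between scan-outs
frame_ms = 1000 / cap_fps           # 25 ms between rendered frames

# Fixed 60 Hz: a new frame arrives every 25 ms but can only be shown on a
# 16.7 ms grid, so frames alternate between 2 refreshes and 1 refresh on screen.
print("fixed 60 Hz hold pattern:",
      [round(n * refresh_ms, 1) for n in (2, 1, 2, 1)], "ms")

# G-Sync: the panel refreshes whenever the frame arrives, so every frame is
# held for the same 25 ms.
print("G-Sync hold pattern:", [frame_ms] * 4, "ms")
```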

The sad thing is that a G-Synced 40 fps is not as pleasurable an experience as Nvidia marketing would lead you to believe.
 

SparkTR

Member
Yes, it's a great upgrade. It took probably the most frustrating aspects of gaming (frame-rate locks, screen tearing, input lag, stuttering frame drops) and made them smooth and hassle-free.

I got a GTX 980 as well, so I'm pretty high-end, but some games, mostly shooters, have either ridiculous input lag or ridiculous screen tearing, or have inevitable frame-rate drops into the 40s or micro-stutters. G-Sync is pretty much a single solution to fix all that. Every game always looks and plays smoothly, no matter what I'm doing in the game.
 

Grief.exe

Member
When 120 Hz, 4K, G-Sync becomes a thing on quality 30-inch monitors I will be buying... well, that or FreeSync, whatever makes more sense at that time!

4K FreeSync monitors from Samsung are coming early next year. Not sure about the bandwidth limitations of DisplayPort, though.

EDIT

DisplayPort version 1.3 was released on September 15, 2014.[17] This standard increases overall transmission bandwidth to 32.4 Gbit/s with the new HBR3 mode featuring 8.1 Gbit/s per lane (up from 5.4 Gbit/s with HBR2 in version 1.2), totalling 25.92 Gbit/s with overhead removed. This bandwidth allows for 5K displays (5120×2880 px) in RGB mode, and UHD 8K television displays at 7680×4320 (16:9, 33.18 megapixels) using 4:2:0 subsampling. The bandwidth also allows for two 4K (3840×2160 px) computer monitors at 60 Hz in 24-bit RGB mode using Coordinated Video Timing, a 4K stereo 3D display, or a combination of 4K display and USB 3.0 as allowed by DockPort. The new standard features HDMI 2.0 compatibility mode with HDCP 2.2 content protection. It also supports VESA Display Stream Compression, which uses a visually lossless low-latency algorithm to increase resolutions and color depths and reduce power consumption.[18]

http://en.wikipedia.org/wiki/DisplayPort
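
For the bandwidth question above, a back-of-the-envelope check (ignores blanking intervals and other overhead, so real limits are a bit tighter):

```python
# Rough bandwidth check for 4K over DisplayPort.

def video_gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

DP12_PAYLOAD = 17.28   # Gbit/s usable, DP 1.2 (HBR2, 4 lanes)
DP13_PAYLOAD = 25.92   # Gbit/s usable, DP 1.3 (HBR3, 4 lanes)

for hz in (60, 120, 144):
    need = video_gbps(3840, 2160, hz)
    print(f"4K @ {hz:3d} Hz needs ~{need:.1f} Gbit/s | "
          f"DP 1.2: {'ok' if need <= DP12_PAYLOAD else 'no'} | "
          f"DP 1.3: {'ok' if need <= DP13_PAYLOAD else 'no'}")
```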
 

TSM

Member
Another big benefit of G-Sync is emulation. With MAME, for instance, there are lots of games that run at really odd refresh rates. Mortal Kombat runs at 54.706840 Hz, and running this with G-Sync gives you flawless playback of the game.
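
Some quick numbers on why that 54.706840 Hz rate is awkward on a fixed 60 Hz display (illustrative arithmetic only):

```python
# Running a 54.706840 Hz arcade game on a fixed 60 Hz monitor.

game_hz = 54.706840
monitor_hz = 60.0

# Option 1: sync the emulator to the monitor -> the game runs too fast.
print(f"speed-up if locked to 60 Hz: {(monitor_hz / game_hz - 1) * 100:.1f}%")

# Option 2: keep correct speed -> frames can't map 1:1 onto refreshes, so some
# frames get shown twice in an irregular pattern (visible stutter).
print(f"refreshes per game frame: {monitor_hz / game_hz:.4f}")

# With G-Sync the panel just refreshes at 54.706840 Hz, so neither compromise
# is needed.
```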
 
Another big benefit of G-Sync is emulation. With MAME, for instance, there are lots of games that run at really odd refresh rates. Mortal Kombat runs at 54.706840 Hz, and running this with G-Sync gives you flawless playback of the game.

Good to know. Old games always had flawless framerates. They had to. The games were always programmed to sync with the display. The flip from 2D to 3D really changed that. I'm looking forward to enjoying emulated SNES/NES/Genesis/TG16 games without stutter.
 
Good to know. Old games always had flawless framerates. They had to. The games were always programmed to sync with the display. The flip from 2D to 3D really changed that. I'm looking forward to enjoying emulated SNES/NES/Genesis/TG16 games without stutter.

Well, it's just that they didn't skip frames; they went into slowdown instead and still drew every image at the given refresh rate. I'm pretty sure you could make a 3D engine that does the same thing if you wanted to.
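
A minimal sketch of the two loop styles being contrasted here (illustrative only, not any particular engine's main loop):

```python
# Two ways of structuring a game's main loop.
import time

TICK = 1 / 60  # one fixed simulation step per displayed frame

def slowdown_loop(update, render):
    # Old 2D-console style: advance the world exactly one tick per frame drawn.
    # If rendering is slow, the whole game slows down, but no motion is skipped.
    while True:
        update(TICK)
        render()

def variable_timestep_loop(update, render):
    # Typical modern style: advance the world by however much real time passed.
    # Game speed stays correct, but a slow frame shows up as a jump in motion.
    last = time.perf_counter()
    while True:
        now = time.perf_counter()
        update(now - last)
        last = now
        render()
```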
 
Might be a stupid question, but couldn't VR implement something like this in their displays, so you wouldn't need huge amounts of power pushing over 100 frames to get a decent experience? That's one thing turning me off from the Oculus, and I just might pick up a new monitor instead.
 
The Acer is amazing, but please don't make the mistake of thinking that just because it has G-Sync you can skimp on horsepower. At the bare minimum you will want two 780s (non-Ti), and even that will be a stretch.

I just have one OC'd 780. I was planning on adding another when GTA V comes out; maybe I'll take a look at them then.
 

DarkoMaledictus

Tier Whore
Might be a stupid question, but couldn't VR implement something like this in their displays, so you wouldn't need huge amounts of power pushing over 100 frames to get a decent experience? That's one thing turning me off from the Oculus, and I just might pick up a new monitor instead.

You'll definitely need some mad video card"S"
 
Yes, it's a great upgrade. It took probably the most frustrating aspects of gaming (frame-rate locks, screen tearing, input lag, stuttering frame drops) and made them smooth and hassle-free.

I got a GTX 980 as well, so I'm pretty high-end, but some games, mostly shooters, have either ridiculous input lag or ridiculous screen tearing, or have inevitable frame-rate drops into the 40s or micro-stutters. G-Sync is pretty much a single solution to fix all that. Every game always looks and plays smoothly, no matter what I'm doing in the game.

I still get screen tearing on my monitor. I think there's something wrong with the G-diffuser.
 
I'm tempted to buy one, but logically it's better to wait a few months:

- The current selection of screens is a bit lacking, as there are only 6 models on the market.
- First gen cannot do ULMB and G-Sync at the same time.
- There's only one input on those displays.
- I'm not sure, but those displays probably still use an FPGA board, so costs might drop when Nvidia switches to mass production.
- We might finally see Yeti ... erm, I mean some reviews of AMD FreeSync soon.

PS. And I can't decide if I want a 4K@60 Hz or a 1080p@144 Hz panel at all :D
 

manzo

Member
No G-sync or FreeSync for me until home theater projectors support them. 24" monitors? I couldn't fathom playing anything anymore with less than 110".
 
D

Deleted member 125677

Unconfirmed Member
Yeah that's what I'm using at the moment.

It's not the greatest monitor ever, but it's pretty affordable by G-Sync standards. As long as you're cool with 1080p (which you should be if you're playing EUIV :p)

lol, when are you going to join in on our multiplayer mayhem?

This monitor should be a GIGANTIC step up from my ancient Samsung SyncMaster 245 anyway
 

Scotch

Member
Not going to jump on G-Sync if there's no IPS or VA panel using it.
Yeah, same. I really want to upgrade my monitor to a G-Sync one, but I'm never going back to a TN panel. I'm never really bothered by ghosting; I am, however, extremely irritated by bad viewing angles.
 
G-Sync saved my life.

Well, not really, but it's the best thing to happen to PC gaming in a while.

Love my ROG Swift, but looking forward to any 1440p IPS/VA panel in the future.
 

Porcupine

Member
So for a guy that suffers a lot from motion sickness because of stuttering and inconsistent frame rates, a G-Sync monitor would be perfect?
 

wildfire

Banned
My expectation is that once FreeSync/Adaptive-Sync displays become available, Nvidia will update its drivers to support them. G-Sync itself will fade away as a transitional standard.

It won't.

The G-Sync modules also support strobed backlights, which are needed for both 3D mode (a feature that is not popular) and ULMB mode, aka LightBoost 2.0.

As great as G-Sync mode is, it's ULMB that is the best feature of the module.
 

Sentenza

Member
So for a guy that suffers a lot from motion sickness because of stuttering and inconsistent frame rates, a G-Sync monitor would be perfect?
Motion sickness could be tied to a lot of different issues, even the wrong FoV.

But a G-Sync monitor would eliminate any problem with tearing and most of the potential causes of stuttering, and with powerful enough hardware it allows for some spectacular smoothness.
 

Porcupine

Member
Motion sickness could be tied to a lot of different issues, even the wrong FoV.

But a G-Sync monitor would eliminate any problem with tearing and most of the potential causes of stuttering, and with powerful enough hardware it allows for some spectacular smoothness.

My main problem is with the constant ups and downs. If the game holds a perfect 60 frames, I have nearly zero problems.
But with recent games, even with a good gaming PC, you can't be sure you'll get 60 frames 100% of the time. The alternative would be a constant 30 frames, but I didn't spend over €1500 on a PC to play at 30 frames ;) (Please don't start a 30 vs 60 debate :D )
 