
G-Sync is the god-level gaming upgrade.

Would the smoothness from ULMB mode address that?

ULMB clears some of the remaining motion blur. Motion sickness is usually caused by uneven, fast movement, and since ULMB tries to faithfully represent even the slightest movement without blurring, it's not really helping there. For people suffering from motion sickness, the best choice is simply a capped framerate and a large FOV. Of course, your mileage may vary depending on how sensitive you are.
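
A cap is easy to bolt on even when a game doesn't offer one; external limiters work roughly like the sketch below. This is a minimal illustration only, and the `render_frame` callback is a hypothetical stand-in for the game's per-frame work:

```python
import time

def run_capped(render_frame, cap_fps=60.0):
    """Sleep away each frame's leftover budget so frame times stay even.
    Evenness, not raw speed, is what matters for motion sickness."""
    budget = 1.0 / cap_fps
    while True:
        start = time.perf_counter()
        render_frame()
        leftover = budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)  # real limiters busy-wait for sub-ms accuracy
```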
 

Jimrpg

Member
When I ordered it, it was the only 1080p G-Sync monitor I could get in Australia, and one of only two 1080p ones available anywhere in the world. They didn't sell DIY kits here, and I would have had to import the Asus one from another country. I think there are one or two more available around the world now. I'm not sure how it compares to other 1080p G-Sync displays that are currently available.

Ah yes! That's my problem too. In Australia, it's still the only 24" 1080p G-Sync monitor! The Asus one and the BenQ one out now don't come with G-Sync, and the BenQ one may be really expensive when it comes out. I can't afford the ROG Swift, plus the DPI is more or less the same. As long as it does the job, 144Hz and G-Sync, that will do for me. And it seems from your comments it's okay. As long as it's not a really tacky, cheap monitor I will be happy. The only thing I think it's missing is an HDMI-in port to plug a console into, but that's cool, it's not a must-have.
 

SparkTR

Member
My main problem is with constant ups and downs. If the game holds a perfect 60 frames I have nearly zero problems.
But with recent games, even with a good gaming PC, you can't be sure you'll get 60 frames 100% of the time. The alternative would be a constant 30 frames, but I didn't spend over 1500€ on a PC to play at 30 frames ;) (Please don't start a 30vs60 debate :D )

Yeah, it'll solve your issue provided the drops are in the 40s. Anything around 30fps still feels bad, but without the screen tearing or input lag you usually get with other solutions like V-Sync.
 

Sentenza

Member
Anything around 30fps should probably be avoided like the plague anyway.
G-Sync's utility lies mostly in making ANY variable framerate between 50-60 and 144Hz acceptable.
 
I can't say this enough: they need to add variable refresh rate to the HDMI spec. Most TV sets already have the capability to adjust their refresh rate (multiples of 24, 25, 30), so some sets could get FreeSync-like functionality with firmware upgrades.

They would probably need new scalers. I can't recall seeing a TV set that can switch refresh rate seamlessly.

According to those AMD slides the other day, G-Sync monitor refresh rates bottom out at 30Hz. So it's a standard monitor below 30, presumably.
Below 30, G-Sync flat out stops working. Although sometimes it doesn't switch fast enough when dipping below that, so you can see the backlight flashing on and off. I guess that's the reason they disable it.

Wake me up when I can get a quality panel with G-sync, 21:9 and *x1440 + 144Hz.

You'll be asleep for a while. I don't think any interface will be able to handle that amount of data any time soon.
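
Some back-of-the-envelope numbers, assuming the 21:9 wish means 3440x1440, 24-bit colour, and roughly 20% blanking overhead:

```python
width, height, bits, hz = 3440, 1440, 24, 144  # assumed panel from the post above
raw = width * height * bits * hz               # active pixel data only
on_wire = raw * 1.2                            # crude allowance for blanking intervals
print(f"{raw / 1e9:.1f} Gbps raw, ~{on_wire / 1e9:.1f} Gbps on the wire")
# ~17.1 Gbps raw, ~20.5 Gbps with blanking; DisplayPort 1.2's effective
# data rate is 17.28 Gbps, so today's single cable really can't carry it.
```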
 

riflen

Member
Variable-refresh displays are the future for a medium like games, where the frame rate is not predictable from one millisecond to the next.
G-sync v1.0 has some issues as an implementation, but it's very impressive to see in action at its most effective. NVIDIA deserve some credit for pushing this technology and proving it's viable. Display manufacturers have been quite happy to feed us the same tired technology since forever. I also don't miss fucking around with frame rate caps, refresh rates and buffering settings in an attempt to get the best playable experience from a game.

Things should get interesting in 2015 when a larger portion of people will get to experience this technology as competition increases, prices fall and AMD GPUs get support. In a year, we could have some very smart (probably pricey) variable-refresh IPS panels too.

One thing though, these displays (variable refresh of all kinds) will always be more expensive. There is a cost associated with the electronics in the display (G-Sync probably more so than Adaptive-Sync at the moment), which should go down over time, but there's also additional costs from testing and validation. Panels have to be specially selected and calibrated, which adds time and cost to the end product.

I also want to give a shout out to Ultra Low Motion Blur mode in the G-Sync displays or low-persistence modes in general. That is incredible to experience and really draws me into the game. The clarity is spectacular and it takes me back to the experience of playing Quake 3 on high quality CRT monitors. In ULMB mode, the text in this test is entirely readable. This mode translates to extremely crisp detail on moving objects in a game.
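
The persistence arithmetic behind that clarity is simple enough to sketch (assumed numbers; perceived blur is roughly pan speed times how long each frame stays lit):

```python
pan_speed = 960                      # assumed pan speed in pixels per second
hold_blur = pan_speed * (1 / 120)    # sample-and-hold at 120 Hz: lit ~8.3 ms
strobe_blur = pan_speed * 0.001      # ULMB strobes the backlight for ~1 ms
print(f"~{hold_blur:.0f} px of smear vs ~{strobe_blur:.0f} px")
```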
 

dreamlock

The hero Los Santos deserves
Anyone tried an ASUS VG248QE with a G-Sync DIY install?

I would like to upgrade, considering how ridiculously high the prices on G-Sync monitors are atm.

However, I've been unable to find any resellers of G-Sync DIY kits in Europe. If possible, I'd appreciate if someone could point me in the right direction. Preferably with low shipping costs.
 
I'm really hoping that it'll be the magic bullet I'm looking for to help me with the motion sickness I get with unstable framerates, but it seems like it'll be a while before I can get such a monitor, since I'm not paying $700+ for a monitor. At least not at the moment.

A fellow motion sicker, eh? Not sure if this would help me though, since mine seems to be related more to motion blur (and a few other blurs, like DoF). Anything that messes with how my eyes focus.
 

bodine1231

Member
A fellow motion sicker, eh? Not sure if this would help me though, since mine seems to be related more to motion blur (and a few other blurs, like DoF). Anything that messes with how my eyes focus.

It didn't help me. I get motion sickness from Far Cry 4, but that's because the default FOV is so zoomed in and there's lots of head bob. Disabling G-Sync didn't help, only changing the FOV did.
 

kinggroin

Banned
I think so. Not being anchored to a framerate that divides evenly into your monitor's refresh rate is a total gamechanger. No longer do I worry about changing settings to get a consistent fps. I just play, and adjust a few things here and there if I want the gameplay to be smoother. It's that simple.


Lol isn't this a contradiction
 

Thrakier

Member
I think that before G-Sync gains mass-market appeal and becomes available on TVs too, VR will be the standard and no one will ask for G-Sync anymore. It'll be a niche product for a short period of time.

That said, if it were available as an external device, I'd buy it in a heartbeat for my monitor AND my TV.
 

Sentenza

Member
As a side note, can anyone who's particularly up to date on the topic point out the best and the cheapest 1080p G-Sync monitors on the market at this moment?
Just to know where we stand right now.
Because ideally a panel with this tech could be my most wanted upgrade in the near future.
 

Paganmoon

Member
I'm really hoping that it'll be the magic bullet I'm looking for to help me with the motion sickness I get with unstable framerates, but it seems like it'll be a while before I can get such a monitor, since I'm not paying $700+ for a monitor. At least not at the moment.

Free-sync might be the answer. From what I've gathered, it's going to be part of the DisplayPort 1.2 standard, so all monitors with DisplayPort 1.2 will support it; no extra licensing or chip needed.
 
I love my G-Sync screen, but make no mistake, you still need a good GPU if you are sensitive to frame rates. Sub-60 feels better, but not perfect.
 
Free-sync might be the answer. From what I've gathered, it's going to be part of the DisplayPort 1.2 standard, so all monitors with DisplayPort 1.2 will support it; no extra licensing or chip needed.

DP 1.2a and 1.3 include Adaptive-Sync as an option, but it is not mandatory. It's an optional inclusion in monitors that use them.
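
As an aside, the refresh range a monitor advertises lives in its EDID range-limits descriptor, which is roughly what an Adaptive-Sync source consults. A minimal sketch, assuming you've already pulled the raw 128-byte base EDID block from the OS (offsets per the EDID spec):

```python
def vertical_rate_range(edid: bytes):
    """Find the range-limits descriptor (tag 0xFD) among the four 18-byte
    descriptor slots and return (min_hz, max_hz), or None if absent."""
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[5], d[6]  # min / max vertical field rate in Hz
    return None
```

Whether the source then actually drives a variable rate is exactly the optional part of the spec described above.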
 

Sentenza

Member
DP 1.2a and 1.3 include Adaptive-Sync as an option, but it is not mandatory. It's an optional inclusion in monitors that use them.
Besides, I'm not sure why people keep banking on FreeSync (sometimes even dismissing G-Sync in the process), since so far it's just an unproven attempt to match the competition's offering.
Something no one has yet had a chance to test first-hand. That needs to be stressed.
 

ZeroX03

Banned
Sounds like a decent idea for a startup/kickstarter.

It'd cost way too much to produce something competitive on a small scale I think.

I'm surprised NVIDIA haven't tried. They'd be in a better position than most; they have the money and would be pushing their own tech.
 
Besides, I'm not sure why people keep banking on FreeSync (sometimes even dismissing G-Sync in the process), since so far it's just an unproven attempt to match the competition's offering.
Something no one has yet had a chance to test first-hand. That needs to be stressed.

The way they're selling it sounds too good to be true: 9-240Hz instead of 30-144, zero added latency, costs nothing, and allows for fancy monitor features at the same time. Well, I mean, we know it requires additional hardware, which is itself something they've been sending (presumably deliberately) mixed messages about in PR. But I'm really holding out for the catch.
 

Paganmoon

Member
DP 1.2a and 1.3 include Adaptive-Sync as an option, but it is not mandatory. It's an optional inclusion in monitors that use them.

Ahh didn't know it was optional, thanks for clarifying.

Besides, I'm not sure why people keep banking on FreeSync (sometimes even dismissing G-Sync in the process), since so far it's just an unproven attempt to match the competition's offering.
Something no one has yet had a chance to test first-hand. That needs to be stressed.

True. I actually had a discussion about just this with a friend an hour or so ago: it doesn't bode well if the first retail monitors with FreeSync are to be released in a month's time but no one's gotten to test it yet (advance/prototype models?).

Guess we'll see in a month's time.
 
It'd cost way too much to produce something competitive on a small scale I think.

I'm surprised NVIDIA haven't tried. They'd be in a better position than most; they have the money and would be pushing their own tech.

Yeah, I actually tried to edit my post to add that I don't know how difficult it would be from a technical/logistics/market-politics point of view, but my edit didn't go through.

I sold my Panasonic plasma a few years ago and have been gaming on a PC monitor, an HMD, and handhelds. Don't really miss TVs... but it seems to me there would be a market for a good gaming-focused TV.
 

Naminator

Banned
Sooo, about G-Sync.

Has anyone figured out a way to install the G-Sync DIY kit on monitors other than that one BenQ (or was it Asus?) monitor?

I would love to mod the monitor I already have with G-Sync if that's a possibility.
 
Sooo, about G-Sync.

Has anyone figured out a way to install the G-Sync DIY kit on monitors other than that one BenQ (or was it Asus?) monitor?

I would love to mod the monitor I already have with G-Sync if that's a possibility.

The board is specifically tuned to the panel Asus is using in that specific model. It won't work with other panels.
 

Nachtmaer

Member
The way they're selling it sounds too good to be true: 9-240Hz instead of 30-144, zero added latency, costs nothing, and allows for fancy monitor features at the same time. Well, I mean, we know it requires additional hardware, which is itself something they've been sending (presumably deliberately) mixed messages about in PR. But I'm really holding out for the catch.

Their statements about the cost have been really vague so far. First there wasn't going to be any additional cost, then they said it'll just be less expensive. My bet is that FreeSync monitors will be cheaper than G-Sync monitors because the scalers will support it natively instead of having to use an FPGA, but they'll probably cost more than non-FreeSync ones. I'm not too up to date on monitor tech, so someone can correct me if I'm wrong.

Like you said, it's best to wait and see how things turn out before jumping to conclusions.
 

Kinthalis

Banned
Yeah, I think at a minimum I'd need a G-Sync panel that's IPS, high refresh rate, low latency, and 1440p at 28" or larger.

Ideally, and probably far into the future, it would be ultrawide 21:9, OLED, 144Hz and 3K resolution.
 

Deadstar

Member
Well, if you are moving the mouse, that would cause an update (I assume you're talking desktop here). If an update occurs, it should be sent to the monitor ASAP, so it should match with very low latency if you can run the desktop in G-Sync mode.

I think you can, but I heard there was increased power consumption when running G-Sync all the time. I don't know how big of an impact it is though.

I'm specifically talking in-game. On the desktop, if the refresh rate is 60Hz, the mouse moves with less precision than when it's at 120Hz; it's a lot less smooth. So in-game, my assumption would be that if the game drops the refresh rate to, say, 40Hz, the mouse would be chugging. Is this not true? It seems like a huge downside to G-Sync unless I'm not understanding something correctly.
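
For a sense of scale on that worry, the visible update interval is set by the refresh rate, not by how fast the mouse reports; some rough numbers (assuming a typical high-rate mouse that polls far faster than any of these):

```python
# The cursor/camera can only appear at a new position once per refresh,
# so however fast the mouse polls, this is the granularity you can see.
for hz in (40, 60, 120, 144):
    print(f"{hz:>3} Hz -> one visible update every {1000 / hz:.1f} ms")
```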
 

Sentenza

Member
Yeah, I think at a minimum I'd need a G-Sync panel that's IPS, high refresh rate, low latency, and 1440p at 28" or larger.

Ideally, and probably far into the future, it would be ultrawide 21:9, OLED, 144Hz and 3K resolution.
On the other hand, I doubt I'll want anything more than 1080p for at least the next 4-5 years, since the whole point of a monitor like this for me would be to enjoy the increased framerate (144Hz FTW), and a tri-SLI of top-of-the-range GPUs is definitely out of reach for my pockets.
 
Wake me up when I can get a quality panel with G-sync, 21:9 and *x1440 + 144Hz.

Could be soon. I work with 80 Gbps interfaces daily. I don't know if they'll adapt a display standard to QSFP (40 Gbps each way) soon, but it could happen.

Probably would be something similar to dual-link displayport to make it happen.

$90 cables are a bitch though.
 

ChrisG683

Member
Even though G-Sync will probably be killed off by FreeSync in the future, I've been enjoying the ever-living sh** out of my ASUS ROG Swift monitor. It really has changed my outlook on my fps counter, to the point where I've completely stopped using FRAPS.

Especially in a game like Planetside, where getting over 60 fps in large fights isn't realistic even on a GTX 980, G-Sync has drastically reduced my sensitivity to FPS fluctuations.

Now granted, you can still feel the drops below 60 fps and the smoothness of 100-144 fps, but it makes things MUCH more pleasant when the framerate takes a dive in crazy situations.

Previously, a rock-solid 60 fps was a more enjoyable experience than a rollercoaster 80-144 fps; you could see the stuttering and hiccups, and it would actually look worse than 60 fps.

NOW I don't give a flying f*** how much my FPS fluctuates; it's always buttery smooth.

Too bad it has a very aggressive matte coating, that shit makes me want to vomit sometimes.
 
Yeah, I think at the minimum I'd need a G-Sync panel to be IPS, high refresh rate low latency and 1440p at least 28".

Ideally, and probably far into the future it would be ultra wide 21:9, OLED, 144 Hz and 3K resolution.
Do they actually make OLED monitors? A TV I could understand, but your taskbar would be present so often on the display that there's no way it wouldn't burn in.

Is that no longer an issue? I don't like having to think about it with my Vita, even if it does look really nice.
 
Do they actually make OLED monitors? A TV I could understand, but your taskbar would be present so often on the display that there's no way it wouldn't burn in.

Is that no longer an issue? I don't like having to think about it with my Vita, even if it does look really nice.

There are OLED TVs. I'm sure there are or will be monitors.
 

mugwhump

Member
Pretty much zero. G-Sync as it stands requires a scaler made by Nvidia. You'd need a monitor that has two scaler boards in it: the Nvidia one and another one with Adaptive-Sync support. I think there actually is one model out there that has its own scaler alongside the Nvidia board, with a switch to alternate between them, but that one doesn't have Adaptive-Sync support. At the end of the day, the problem with dual boards is cost: you're essentially doubling things inside the monitor, and G-Sync itself already costs extra.

Awwww maaaaan

Welp, nvidia got me. Now I can either go without any kind of variable refresh rate, or I can buy a g-sync one, ensuring my next gpu will be nvidia. Dammit.
 

SapientWolf

Trucker Sexologist
You can try capping your frame rate to something like 40, which will help reduce the frame-time variance, leading to a subjectively smoother experience.

Capping the frame rate to anything other than a standard refresh rate results in persistent micro-stutter on traditional monitors, but with G-Sync it works fine.

The sad thing is that a G-Synced 40 FPS is not as pleasurable an experience as Nvidia marketing would lead you to believe.
If the pixel motion in the scene is below a certain threshold, a steady 40fps is nearly indistinguishable from 60fps with G-Sync. That's a big caveat, but at CES I was fooled into thinking Assassin's Creed was running at 60fps when it was in fact running at around 40. You can also simulate this effect with a slow pan in the pendulum demo. In something like an FPS with a mouse, the lower framerate becomes obvious. The other interesting thing is that games look smoother when you decrease the viewing angle (i.e. use a smaller screen or sit farther away). That could be part of the reason why 30fps looks so horrible on the PC.
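
The viewing-angle point is easy to put numbers on: the same on-screen pan sweeps fewer degrees of your visual field from farther away, so each frame's jump is smaller. A rough sketch with assumed screen dimensions:

```python
import math

def deg_per_second(pan_px_s, screen_px_wide, screen_m_wide, distance_m):
    """Convert an on-screen pan speed into degrees of visual field per second."""
    metres_per_px = screen_m_wide / screen_px_wide
    return math.degrees(math.atan(pan_px_s * metres_per_px / distance_m))

# A 1000 px/s pan on a 0.6 m wide, 1920 px monitor (assumed numbers):
print(deg_per_second(1000, 1920, 0.6, 0.6))  # ~27.5 deg/s sitting close
print(deg_per_second(1000, 1920, 0.6, 1.8))  # ~9.8 deg/s from TV distance
```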
 

pa22word

Member
Lol isn't this a contradiction

Uh, no?


G-Sync lets you use any refresh rate from 30 up with perfect stability, so if you want a consistent 40 fps you just adjust settings according to that. If you want a consistent 50 fps, you adjust settings based on that. This is all versus the current standard of locking at either 30 fps or 60 fps to match monitor refresh rates.
 

Unai

Member
How so? No one has explained this yet.

It's basically a very good way to let the OS itself handle v-sync and triple buffering, even in games that don't offer triple buffering as an option.

Just run the game in borderless window mode (you can force it with some software) and disable v-sync in-game.

It actually doesn't need to be borderless for this to work, but borderless window mode looks just like exclusive fullscreen, so it's better than normal windowed mode.
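
On Windows, the tools that force borderless mode mostly just strip the window's frame styles and stretch it over the screen. A minimal sketch with ctypes (the window title is hypothetical and error handling is omitted):

```python
import ctypes

user32 = ctypes.windll.user32
GWL_STYLE = -16
WS_OVERLAPPEDWINDOW = 0x00CF0000
SWP_FRAMECHANGED, SWP_NOZORDER = 0x0020, 0x0004

hwnd = user32.FindWindowW(None, "Some Game")          # hypothetical window title
style = user32.GetWindowLongW(hwnd, GWL_STYLE)
user32.SetWindowLongW(hwnd, GWL_STYLE,
                      style & ~WS_OVERLAPPEDWINDOW)   # drop caption and borders
user32.SetWindowPos(hwnd, 0, 0, 0,                    # cover the whole desktop
                    user32.GetSystemMetrics(0), user32.GetSystemMetrics(1),
                    SWP_FRAMECHANGED | SWP_NOZORDER)
```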
 
So, a question about this whole G-Sync thing. My next monitor is probably going to be the 4K one from Acer, which I'm pretty excited about. Hopefully it'll last me a long time. But just conceptually, what happens if you have G-Sync turned on and you're pumping out frame rates higher than what the monitor caps at? Wouldn't you still get tearing then?
 

Luigiv

Member
I think that before G-Sync gains mass-market appeal and becomes available on TVs too, VR will be the standard and no one will ask for G-Sync anymore. It'll be a niche product for a short period of time.

That said, if it were available as an external device, I'd buy it in a heartbeat for my monitor AND my TV.

Um, what? What does VR have to do with the demand for G-sync/Free-sync at all? VR headsets would benefit just as much from adaptive frame rates as any other video display. There's absolutely no reason we can't have both simultaneously.
 

Unai

Member
So, a question about this whole G-Sync thing. My next monitor is probably going to be the 4K one from Acer, which I'm pretty excited about. Hopefully it'll last me a long time. But just conceptually, what happens if you have G-Sync turned on and you're pumping out frame rates higher than what the monitor caps at? Wouldn't you still get tearing then?

No, it will work like ordinary v-sync.
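
A toy model of the behaviour at the top of the range, as I understand it (greatly simplified; `time.sleep` stands in for the real scan-out machinery):

```python
import time

MAX_HZ = 144
MIN_INTERVAL = 1.0 / MAX_HZ  # the panel can't start a new scan sooner than this

last_scan = 0.0

def present_frame():
    """Below the cap, the panel refreshes the moment a frame is ready; at the
    cap, finished frames are held back, which is ordinary v-sync behaviour."""
    global last_scan
    early = MIN_INTERVAL - (time.perf_counter() - last_scan)
    if early > 0:
        time.sleep(early)  # hold the completed frame; no tearing either way
    last_scan = time.perf_counter()
```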
 

Thrakier

Member
Um, what? What does VR have to do with the demand for G-sync/Free-sync at all? VR headsets would benefit just as much from adaptive frame rates as any other video display. There's absolutely no reason we can't have both simultaneously.

I would guess that with VR, developers will finally have to optimize for framerate and performance, even frametimes. No one wants stutter in a virtual world. If frametimes are good, we won't need G-Sync anymore.
 

Luigiv

Member
I would guess that with VR, developers will finally have to optimize for framerate and performance, even frametimes. No one wants stutter in a virtual world. If frametimes are good, we won't need G-Sync anymore.

Um, what? It's nice to dream, I guess, but nothing you've said has any grounding in reality. If anything, VR headsets are more likely to adopt adaptive refresh, as it has a much better chance of delivering a stutter-free experience than simply expecting perfect optimisation in every application, especially on PC and smart devices, where no amount of optimisation could possibly guarantee perfect performance across so many different hardware configurations.

Anyway, in lighter news, I just bought an AOC G2460PG this afternoon (shortly after my previous post). I've been mulling over upgrading to a G-Sync monitor for the past few weeks and this thread popping up when it did definitely helped push me over the edge, much to my wallet's protests.

First impressions are quite good. G-Sync appears to work as advertised, and the clarity and response are superb. Even without using ULMB, my eyes can't pick up any motion blur (admittedly I'm not super sensitive to this stuff). I'm also quite happy with the colour reproduction and contrast. Even though it's only a TN panel, it's still a big step up from my previous panel (also a TN), which is enough for me (that's the main reason I wanted to ditch the damn thing in the first place).

It should serve me well for many years to come.

Edit: Oh yeah, almost forgot: hurray for the physical power button! That's the other big reason I was desperate to get rid of the old panel. Its capacitive button drove me up the wall.
 