Must be that hidden dGPU inside the gsync module.
I really need that 21:9 G-Sync monitor to come out right now. Rumoured to be $1299 though...
http://www.tomshardware.com/news/acer-releases-g-sync-curved-monitor,28975.html
So they don't build it into the cards, have a separate device and special monitors with support for it?
Some PC gamers really are played as fools eh?
Oh, my god! Going to sell my car to buy this.
It's only 75Hz.
I can live with that. It's still an improvement over 60hz and 75 fps looks better to me than 60 fps on my current 144hz monitor. I'm assuming 75 fps on a 144hz monitor looks the same as 75fps on a 75hz.
If you're completely ignorant of the technology and what's being achieved, it could seem that way, I suppose.
I don't understand why people are so down on TN panels. I have a Dell U2713HM (1440p IPS) and an Acer XB280HK (4K TN), and there is barely any difference between the two other than the resolution.
I also have both monitors and there are noticeable differences, but high-quality TN panels these days are good enough that I can put the Dell away. For gaming, they are fine. If I need accurate colors for work, probably not. TN panels are nowhere near as bad as the earlier ones. I think people who are still down on them probably had bad experiences with the earlier ones and/or are still regurgitating old information.
Until the recent gsync IPS, that's the choice you have to make if you want fast refresh rate + gsync. For gaming, that's a totally fine compromise.

But for people spending tons of money on gaming rigs that enable them to max out visual fidelity, to then arrive at "fine" with the monitor itself seems like a terrific anticlimax.
So, you agree with me.

I have two panels of the same size, one IPS and the other TN. I can easily spot the difference between the two side by side. But on its own, I have no real problems with the TN.
Nvidia is hiring 5-year-olds to do tech reports now?
We know what Gsync does, but your rig must have been messed up pretty bad beforehand if Gsync is doing what you say it is doing. Congrats on fixing your issues along with adding Gsync.
I didn't change a thing on my rig at all. Just adding the monitor, setting it to 144Hz, and turning gsync on gave me the performance increase I described in the OP.

You realize that is impossible, right?
Doesn't make sense unless you are running at 720p or something. It does not increase frame rate.
Especially from 40 fps to 60 fps?
Is this true? So when I am only gaming in the 50-90 range, I won't have the full benefit of a 144hz monitor?
Butts. G-sync is advantageous at all framerates, and something like 45 FPS average is actually one of the cases where the difference compared to fixed refresh is most pronounced.
I guess not everyone is aware of the big money making machine.

While you, of course, are aware of the secret techniques necessary to introduce a new way to drive monitors by only changing the GPU side of the communication.
Yep, I'm definitely going g-sync when these stars align:
- single-card 4k performance becomes feasible (i.e. 60ish fps @ high settings @ 4k on modern titles on a single card)
- a 4k g-sync monitor becomes available that is high quality (i.e. 8-bit panel, not TN) and doesn't cost the earth.
I guess you bought before this was available?

100% with you on this.
I would have gone g-sync on my current 1440p setup if there was an IPS variant that could be used reliably for photography work as well as gaming [like my current Dell model], but that's still asking too much.

Soon, my friends, soon.
Query: I've been told that g-sync monitors are really only worth it if you're floating somewhere around 50fps with frequent fluctuations, hence 144hz monitors, as the dynamic refresh rate goes well with stabilising each frame at those framerates. And that despite this, they're actually kinda shitty if you're hovering between 30-60fps (so let's say averaging 45fps), even though I would have thought they'd be just as advantageous, if not more so, for stabilising frame rendering at sub-60fps.
True/false/butts?
If I remember correctly, when you are below 60 FPS, the module inserts frames to compensate.
I would imagine being above 60 is the best value for the monitor as you get all the advantages along with no vsync. Gsync creates a situation where dipping below 60 is acceptable.
Nope. This happens when it's below 30 with G-Sync. I believe Freesync is 40 FPS or so.
When you are below 30 the monitor interpolates frames, and when you are greater than 30 it just adjusts the refresh rate?
No. If it goes below 30, the screen will refresh at 30 Hz, just like a normal 30Hz monitor would do (IIRC). Anything 30 and above will refresh as soon as the frame is dished up.
That's how I thought it worked too, but recently it was shown that below 30 it actually multiplies the current framerate to get a >30 refresh. E.g., if you are at 28 FPS it will refresh at 56 Hz. (This is superior behavior.)
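That multiply-to-stay-in-range behavior is easy to sketch. This is a toy model only; the 30 Hz panel minimum and the function name are my assumptions for illustration, not vendor specifics:

```python
def compensated_refresh(fps, panel_min_hz=30):
    """Toy model of low-framerate compensation: when the source fps falls
    below the panel's minimum refresh rate, repeat each frame enough times
    that the effective refresh rate lands back inside the panel's range."""
    if fps >= panel_min_hz:
        return fps  # in range: the refresh simply tracks the frame rate
    multiple = 2
    while fps * multiple < panel_min_hz:
        multiple += 1
    return fps * multiple

print(compensated_refresh(28))  # -> 56: each frame painted twice
print(compensated_refresh(10))  # -> 30: each frame painted three times
```

At 28 FPS each frame is simply painted twice, so the panel runs at 56 Hz while motion still updates 28 times a second.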
Cool... would this introduce latency though? Though at 28 FPS maybe you wouldn't notice...
Very low fps still feels bad with variable refresh, but now 50 fps doesn't feel much different from 60.
FreeSync does this too. PcPer is wrong.
Some reports have suggested that when the frame-to-frame interval on a FreeSync display grows too long, the display responds by "locking" into a 40Hz refresh rate, essentially quantizing updates at multiples of 25 ms. Doing so would be pretty poor behavior, because quantization at 25 ms steps would mean horribly janky animation. You'd be making the worst of an already bad situation where the attached PC was running up against its own performance limitations. However, such talk is kind of nonsense on the face of it, since we're dealing with a variable-refresh display working in concert with a GPU that's producing frames at an irregular rate. What happens in such cases differs between FreeSync and G-Sync, but neither solution's behavior is terribly problematic.
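For concreteness, here's what the alleged lock-to-40Hz behavior would do to frame delivery. This is a toy model of the reported (and disputed) behavior, not anything confirmed about FreeSync itself:

```python
import math

def quantized_display_time(frame_ready_ms, step_ms=25.0):
    """If updates were locked to a fixed 40 Hz grid (25 ms steps), a frame
    finished between refreshes would be held until the next boundary."""
    return math.ceil(frame_ready_ms / step_ms) * step_ms

# A frame that just misses a boundary waits nearly a full 25 ms step,
# turning a small miss into a large, visible hitch:
print(quantized_display_time(26))  # -> 50.0
print(quantized_display_time(51))  # -> 75.0
```

That near-25 ms worst-case wait is exactly the "horribly janky animation" the quoted passage is worried about.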
Let's start with how G-Sync handles it. I talked with Nvidia's Tom Petersen about this question, since he's made some public comments on this matter that I wanted to understand.
Petersen explained that sorting out the timing of a variable-refresh scheme can be daunting when the wait for a new frame from the graphics card exceeds the display's maximum wait time. The obvious thing to do is to refresh the display again with a copy of the last frame. Trouble is, the very act of painting the screen takes some time, and it's quite possible the GPU will have a new frame ready while the refresh is taking place. If that happens, you have a collision, with two frames contending for the same resource.
Nvidia has built some logic into its G-Sync control module that attempts to avoid such collisions. This logic uses a moving average of the past couple of GPU frame times in order to estimate what the current GPU frame-to-frame interval is likely to be. If the estimated interval is expected to exceed the display's max refresh time, the G-Sync module will preemptively refresh the display part way through the wait, rather than letting the LCD reach the point where it must be refreshed immediately.
This preemptive refresh "recharges" the LCD panel and extends its ability to wait for the next GPU frame. If the next frame arrives in about the same time as the last one, then this "early" refresh should pay off by preventing a collision between a new frame and a gotta-have-it-now refresh.
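The estimation-and-preempt logic Petersen describes could be sketched roughly like this. All of the names, the two-frame averaging window, and the 33.3 ms maximum wait are illustrative assumptions, not Nvidia's actual implementation:

```python
from collections import deque

class PreemptiveRefresh:
    """Toy model of G-Sync-style collision avoidance: estimate the next
    GPU frame interval from recent frame times and, if it looks like it
    will exceed the panel's maximum wait, refresh early with a repeat of
    the last frame instead of being forced to refresh at the deadline."""

    def __init__(self, max_wait_ms=33.3, window=2):
        self.max_wait_ms = max_wait_ms       # panel's longest tolerable wait
        self.history = deque(maxlen=window)  # recent GPU frame times (ms)

    def on_frame(self, frame_time_ms):
        """Record how long the GPU took to produce the latest frame."""
        self.history.append(frame_time_ms)

    def should_preempt(self):
        """True if the predicted interval exceeds the panel's max wait."""
        if not self.history:
            return False
        estimate = sum(self.history) / len(self.history)  # moving average
        return estimate > self.max_wait_ms

monitor = PreemptiveRefresh()
monitor.on_frame(40.0)
monitor.on_frame(40.0)
print(monitor.should_preempt())  # -> True: refresh early, "recharging" the LCD
```

If the prediction holds, the early refresh finishes before the next GPU frame arrives, so the new frame never collides with a forced, gotta-have-it-now refresh.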