
G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

artist

Banned

http://blogs.nvidia.com/blog/2013/10/18/g-sync/


FULL FAQ: http://www.geforce.co.uk/hardware/technology/g-sync/faq

DEMO: http://www.youtube.com/watch?v=NffTOnZFdVs&hd=1

How To Upgrade To G-SYNC

If you’re as excited by NVIDIA G-SYNC as we are, and want to get your own G-SYNC monitor, here’s how. Later this year, our first G-SYNC modules will be winging their way to professional modders who will install G-SYNC modules into ASUS VG248QE monitors, rated by press and gamers as one of the best gaming panels available. These modded VG248QE monitors will be sold by the modding firms at a small premium to cover their costs, and a 1-year warranty will be included, covering both the monitor and the G-SYNC module, giving buyers peace of mind.

Alternatively, if you’re a dab hand with a Phillips screwdriver, you can purchase the kit itself and mod an ASUS VG248QE monitor at home. This is of course the cheaper option, and you’ll still receive a 1-year warranty on the G-SYNC module, though this obviously won’t cover modding accidents that are a result of your own doing. A complete installation instruction manual will be available to view online when the module becomes available, giving you a good idea of the skill level required for the DIY solution; assuming proficiency with modding, our technical gurus believe installation should take approximately 30 minutes.

If you prefer to simply buy a monitor off the shelf from a retailer or e-tailer, NVIDIA G-SYNC monitors developed and manufactured by monitor OEMs will be available for sale next year. These monitors will range in size and resolution, scaling all the way up to deluxe 3840x2160 “4K” models, resulting in the ultimate combination of image quality, image smoothness, and input responsiveness.

http://www.geforce.com/whats-new/ar...evolutionary-ultra-smooth-stutter-free-gaming

Made some images for a visual representation of tearing, lag/stutter, and what G-Sync does. There are probably some inaccuracies (I'm not overly knowledgeable on the tech), but it should at least give people an idea.

Image 1: Perfection.
In graphics rendering you have two forces at play. The first is your hardware/GPU, which renders each frame of the game; we call this framerate. The second is your monitor/display, which refreshes the image it shows; we call this hertz (Hz). Traditionally your monitor's Hz is locked. Do you have a 60Hz monitor? What about 120Hz? Or 144Hz? This number tells you how many times it forcibly refreshes itself every second.

And so, in a perfect world, for every single refresh of the monitor's frames the GPU would render a single frame of game data. They would work in perfect unison together.

[Image 1: each monitor refresh receives exactly one GPU-rendered frame]


But this rarely happens, not unless you have insane hardware or run the game at lower settings. We see it on PC, we see it on consoles: framerate fluctuates. Your GPU does not render at a locked framerate most of the time. And even when it is hitting that smooth 60fps, something can happen in the game that drops it. A big explosion out of nowhere. A building crumbling. The hardware is stressed, and the framerate drops. It might even drop below the monitor's refresh rate.
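If it helps to see it as numbers, here's a rough toy sketch in Python (values made up by me, nothing official): a fixed 60Hz monitor refreshes every ~16.7ms no matter what, while the GPU's frame times wander, so the two timelines drift apart as soon as a frame takes longer than one refresh.

Code:
REFRESH_HZ = 60
refresh_interval = 1000.0 / REFRESH_HZ            # ~16.7 ms between refreshes, always

frame_times_ms = [16.7, 16.7, 22.0, 25.0, 18.0]   # made-up per-frame render times

t = 0.0
finish_times = []
for ft in frame_times_ms:
    t += ft                                       # when the GPU finishes each frame
    finish_times.append(round(t, 1))

refresh_times = [round(i * refresh_interval, 1) for i in range(1, 7)]
print("frames finish at :", finish_times)
print("monitor refreshes:", refresh_times)
# frames finish at : [16.7, 33.4, 55.4, 80.4, 98.4]
# monitor refreshes: [16.7, 33.3, 50.0, 66.7, 83.3, 100.0]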

Image 2: Tearing
So what happens when our framerate is moving all over the place, but our monitor's refresh rate is locked? We get something called "tearing". Long and short of it, this is when the frames being rendered by the GPU are not in sync with the monitor's locked frame refresh rate. We get overlaps between the two. See image below.

[Image 2: GPU frames overlapping the monitor's locked refresh cycle]


This means that when the monitor refreshes, sometimes it draws data from two different GPU frames at once. This is what causes that big "tear" horizontally through the screen in some games.
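Again, a toy sketch (my own simplification, with made-up numbers): the monitor scans the image out top to bottom over one refresh, so if a new frame arrives partway through the scan, the top of the screen shows the old frame and the bottom shows the new one.

Code:
ROWS = 1080                    # rows the panel scans out each refresh
refresh_ms = 16.67             # one 60 Hz refresh
flip_time_ms = 9.0             # hypothetical moment the GPU hands over frame N+1

tear_row = int(ROWS * (flip_time_ms / refresh_ms))
for row in (0, tear_row - 1, tear_row, ROWS - 1):
    source = "frame N" if row < tear_row else "frame N+1"
    print(f"row {row:4d} comes from {source}")
# Everything above row 583 is the old frame, everything below is the new one.
# The visible seam between them is the horizontal tear.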

Image 3: Vsync Solution! Also Stutter/Lag
People fucking hate tearing, so a solution was found: Vsync. Vsync acknowledges the refresh rate of your monitor. 60Hz? 120Hz? And it says "I'm only going to send GPU rendered frames to the beat of that refresh rate!". Everything is synchronised! But this has another problem. As we just mentioned, games rarely run at locked framerates. So what happens when my 60fps game drops to 45fps, but my monitor is 60Hz, and I'm using Vsync?

[Image 3: Vsync forcing frames to wait for the monitor's locked refresh, causing stutter and lag]


Essentially, the GPU forces itself to 'wait' on each frame before the monitor refreshes. Remember, the monitor refresh rate is locked. It stays beating to the same rhythm, regardless of how fast or slow the GPU is spitting out frame data. In this case, our GPU is rendering frames slower than the monitor's Hz, but Vsync is forcing it to play catch up. If the monitor tries to draw a frame, but no frame exists, it simply draws the last one, doubling up for a refresh or two. Imagine this in a game. This would give the impression of "stuttering". This also introduces input lag from peripherals, as the GPU is constantly trying to play catch-up to the monitor's refresh rate.
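Same toy model as before (made-up frame times, not real measurements): with Vsync a finished frame can only go on screen at the next 60Hz tick, so a 22ms frame misses one refresh, the old image gets drawn again (the stutter), and the waiting is the extra lag.

Code:
import math

refresh = 1000.0 / 60.0                      # 16.67 ms per refresh, locked
frame_times = [15.0, 22.0, 15.0, 22.0]       # made-up render times in ms

t = 0.0
last_tick = 0
for i, ft in enumerate(frame_times, start=1):
    t += ft                                        # frame i finishes rendering here...
    shown_at = math.ceil(t / refresh) * refresh    # ...but must wait for a refresh tick
    tick = round(shown_at / refresh)
    repeats = tick - last_tick - 1                 # refreshes where the old image re-draws
    last_tick = tick
    print(f"frame {i}: ready at {t:5.1f} ms, displayed at {shown_at:5.1f} ms,"
          f" previous image repeated {repeats} time(s)")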

Image 4: G-Sync Solution
G-Sync is a hardware level solution inside your monitor that communicates directly with the GPU. Instead of Vsync, G-Sync says "why don't we change the monitor refresh rate too?".

[Image 4: G-Sync dynamically changing the monitor's refresh rate to match the GPU's framerate]


So no matter how fast or slow the GPU is rendering frames, the monitor is never locked to a particular refresh rate. Not stuck at 60Hz, even if the GPU is stuck on 45fps. In this case, the monitor would change to 45Hz to match the framerate. And if the GPU suddenly boosts to 110fps? The monitor boosts to 110Hz too.

Every frame is drawn perfectly in sync with the monitor. The monitor doesn't ever have to play catch up to the GPU (tearing), nor does the GPU ever have to play catch up to the monitor (stutter/lag).
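One last toy sketch of the idea (my own model, not NVIDIA's actual module, and the 30-144Hz panel range is my assumption): the monitor simply refreshes whenever the GPU finishes a frame, clamped to whatever range the panel supports, so nothing tears and nothing has to be repeated.

Code:
MIN_HZ, MAX_HZ = 30, 144                 # assumed panel limits, not a real spec
min_interval = 1000.0 / MAX_HZ           # ~6.9 ms: can't refresh faster than 144 Hz
max_interval = 1000.0 / MIN_HZ           # ~33.3 ms: can't hold a frame longer than this

frame_times = [16.7, 22.0, 9.0, 45.0]    # made-up render times in ms

for ft in frame_times:
    interval = min(max(ft, min_interval), max_interval)
    print(f"frame took {ft:5.1f} ms -> panel refreshes after {interval:5.1f} ms"
          f" ({1000.0 / interval:5.1f} Hz)")
# A 22 ms frame gets a ~45 Hz refresh, a 9 ms frame gets ~111 Hz, and a frame
# slower than the panel minimum would have to be shown again on a real display.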


Demo going on right now - http://www.twitch.tv/linustech (looks amazing even through a camcorder)
 

tipoo

Banned
Are we really posting articles for each slide in this presentation? Keep it to the main thread.

And they should have called it Nsync.
 
Are we really posting articles for each slide in this presentation? Keep it to the main thread.

And they should have called it Nsync.

If Sony announced the Last Guardian during a press event it would get its own thread. Don't be silly, this could be a really big deal and many people aren't viewing the nvidia stream.

Looking forward to seeing this though. I ended up waiting on the monitor I was going to get so if this works as stated it will be mine.
 

ghst

thanks for the laugh
Are we really posting articles for each slide in this presentation? Keep it to the main thread.

And they should have called it Nsync.

i can't remember the last time a tech was announced with more potential to improve the base quality of experience in gaming.

it gets its own thread.

just a drag that it's proprietary.
 

Pagusas

Elden Member
Are we really posting articles for each slide in this presentation? Keep it to the main thread.

And they should have called it Nsync.

No, it should be this way so people don't miss important news. This is a big deal and I hope it takes off.
 

Tain

Member
I am so fucking in. This solves IQ problems, but it also solves the shit out of my fundamental problems with emulation.
 

Coconut

Banned
I don't understand how adding another step in the process is going to decrease lag.

Can someone help my brain comprehend this?
 

Durante

Member
durante, hold my hand.
OHMYGOD OHMYGOD OHMYGOD

is this really what I think it is

If so,
OHMYFUCKINGGOD SOMEONE DID IT

BEST PC TECH INNOVATION OF THE PAST 2 DECADES!

Edit:
IT IS
OH MY FUCKING GOD

SOME MOD ADD "OH MY GOD" TO THE TITLE STAT
 

Daedardus

Member
More proprietary stuff?

When will both companies learn that you have the most to gain when you're just better at an open standard, rather than locking out half of a potential customer base? If everything gets split up, that's bad for the whole business in general.
 

Glorious Dell Ultrasharp master race :(
More proprietary stuff?

When will both companies learn that you have the most to gain when you're just better at an open standard, rather than locking out half of a potential customer base? If everything gets split up, that's bad for the whole business in general.

As opposed to making people buy an Nvidia GPU and G-Sync monitor?
 
No. Proprietary, and will only work with compatible GTX video cards.

Need a Kepler GPU, so no (based on what they've said so far).

oh ok

More proprietary stuff?

When will both companies learn that you have the most to gain when you're just better at an open standard, rather than locking out half of a potential customer base? If everything gets split up, that's bad for the whole business in general.

yea i agree
 

Tain

Member
OHMYGOD OHMYGOD OHMYGOD

is this really what I think it is

If so,
OHMYFUCKINGGOD SOMEONE DID IT

BEST PC TECH INNOVATION OF THE PAST 2 DECADES!

Edit:
IT IS
OH MY FUCKING GOD

SOME MOD ADD "OH MY GOD" TO THE TITLE STAT

isn't this the most beautiful shit in the world?
 

Dawg

Member
Great improvement for monitors.

Next up is no motion blur and no ghosting without having to use lightboost, right?
right? ;-;
 

RealMeat

Banned
Really cool. Have they said if it's an open standard so AMD could support it? I'm guessing not.
Also, I wonder if a future version of the Oculus Rift could support it. Seems like it would solve some of its problems, and help with the latency.
 

Skilletor

Member
Are we really posting articles for each slide in this presentation? Keep it to the main thread.

And they should have called it Nsync.

I didn't know there was a conference going on, and I have no intention of checking a thread for said conference.

I wouldn't have known this, so thank you OP.
 
So does someone want to explain to us plebes exactly what this is? Is it basically a monitor that does Vsync on its own without the card having to take the hit?
 

zhorkat

Member
It sounds great. Too bad it's only for Nvidia GPUs; hopefully the additional hardware required doesn't add too much extra cost.
 

Durante

Member
isn't this the most beautiful shit in the world?
IT'S FULL OF STARS

And I fixed my caps lock key.

So does someone want to explain to us plebes exactly what this is? Is it basically a monitor that does Vsync on its own without the card having to take the hit?
When I calm down I will write a post explaining it and why it's the best thing in gaming display technology in a decade. Seriously.
 

Freki

Member
This is really great technology but I have one concern - now you are basically locked into the nvidia ecosystem. I fear the worst for AMD...
 

Kaako

Felium Defensor
I was like why is Durante losing his shit? Then I read the article in the OP. Let's go into more details on this please.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
TLDR explanation:

Instead of the monitor refreshing its frames (aka Hz) irrespective of how fast/slow the GPU is rendering frames, G-Sync monitors fully synchronise GPU frame rendering with a dynamic Hz/refresh rate on the display.

When GPU frame rendering and monitor frame refreshes are not synchronised, this causes issues in games with fluctuating framerates, like tearing.

In the past, the solution to this problem was V-Sync, which forces the GPU to hold each completed frame until the monitor's next refresh, but this causes stuttering and lag when the monitor's refresh rate is faster than the GPU's frame rendering.

G-Sync is a hardware-level synchronisation of GPU frame rendering and monitor frame refreshing, meaning the monitor's Hz/refresh rate changes dynamically depending on the GPU's rate of frame rendering. You get each frame fully rendered (no tearing), and when the framerate drops the monitor doesn't have to fight it with a locked refresh rate, instead changing its refresh rate to fit the framerate (eliminating stuttering and input lag).
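If numbers make that clearer, here's a rough back-of-the-envelope comparison (toy values, not benchmarks): for the same GPU frame times, when does each frame actually reach the screen under locked-refresh V-Sync versus a G-Sync-style variable refresh?

Code:
import math

refresh = 1000.0 / 60.0                  # fixed 60 Hz refresh for the V-Sync case
frame_times = [15.0, 22.0, 15.0, 22.0]   # made-up render times in ms

t = 0.0
for i, ft in enumerate(frame_times, start=1):
    t += ft
    vsync_shown = math.ceil(t / refresh) * refresh   # waits for the next 60 Hz tick
    vrr_shown = t                                    # shown the instant it's finished
    print(f"frame {i}: V-Sync displays it at {vsync_shown:5.1f} ms,"
          f" variable refresh at {vrr_shown:5.1f} ms"
          f" ({vsync_shown - vrr_shown:4.1f} ms sooner)")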
 

Hagi

Member
This is cool right? How long will it take for this to take off outside of the Nvidia inner circle? I want to experience tech innovation too!
 