
G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

I read in the thread that there might be something for people who don't want to shell out for a new monitor.



I really should cancel my PS4/Xbox One preorders and get a 780, eh?
 

hoserx

Member
"This same observation got Jen-Hsun, our CEO, to commission some of the brightest minds at NVIDIA to solve this problem. We brought together about two dozen GPU architects and other senior guys to take apart the problem and look at why some games are smooth and others aren’t."

Made by old people confirmed.
 

Durante

Member
This is really great technology but I have one concern - now you are basically locked into the nvidia ecosystem. I fear the worst for AMD...
I agree this is a concern, but I don't think the market implications will really matter beyond enthusiasts. No one else cares about this stuff.


On the other hand, as an enthusiast, I'M FUCKING ENTHUSED.
 

Dawg

Member


Oh my god.

This is big. I'm really OCD about stuff like blurry moving text.

AWESOME
 
Hey that is really cool. Love this idea.

And on the plus side: I see folks complaining about this being proprietary, but it seems like much less of an issue than something like Mantle or PhysX being proprietary. With those, you actually have to change your game code to take advantage of platform-specific features. Here you don't; it all happens between the GPU and the monitor, so games don't have to add any code to support it. It doesn't really hurt much, it just means you have to purchase your GPU and monitor as a "unit" to know the technology will be supported.
 

Rains

Member
It sounds great. Too bad it's only for Nvidia GPUs, and hopefully the additional hardware required isn't too much of an extra cost.
Indeed, it forces you to buy their cards for the feature to work on the monitor. For me this is like saying, "Hey, these speakers will only work with the Sony brand." I would rather spend the extra on a better video card.
 

BigDug13

Member
Looks like awesome tech. But I play PC games 6 feet away from my 60" Sony LED.

How long until we see this kind of tech in standard TVs?
 

L~A

Member
TLDR explanation:

Instead of the monitor refreshing at a fixed rate (its Hz) regardless of how fast or slow the GPU is rendering frames, G-Sync monitors fully synchronise the display's refresh with the GPU's frame rendering.

When GPU frame rendering and monitor refreshes are not synchronised, games with fluctuating framerates suffer issues like tearing.

In the past, the solution was V-Sync, which forces the GPU to hold each completed frame until the monitor's next refresh, but this causes stuttering and input lag whenever the GPU can't keep up with the monitor's fixed refresh rate.

G-Sync is a hardware-level synchronisation of GPU frame rendering and monitor refreshing, meaning the monitor's Hz/refresh rate changes dynamically with the GPU's rate of frame rendering. You get each frame fully rendered (no tearing), and when the framerate drops the monitor doesn't fight it with a locked refresh rate, instead changing its refresh rate to fit the framerate (eliminating stuttering and input lag).

Thanks for the explanation, should totally be added to the OP.

And sounds awesome... too bad it's limited to Nvidia stuff. Proprietary is so... ugh.
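
To make the dynamic-refresh part concrete, here's a toy Python sketch of what "the display's Hz follows the GPU" means. Everything in it (the function name, panel limits, frame times) is my own illustration, not Nvidia's actual implementation:

```python
# Toy model of variable refresh: the display scans out when the GPU
# delivers a frame, instead of on a fixed clock. Numbers are illustrative.

PANEL_MIN_HZ = 30    # assumed lower bound of the panel's variable range
PANEL_MAX_HZ = 144   # assumed upper bound

def refresh_interval_ms(frame_render_ms):
    """Scan out as soon as the frame is ready, clamped to what the
    panel can physically do."""
    fastest = 1000.0 / PANEL_MAX_HZ  # can't refresh faster than max Hz
    slowest = 1000.0 / PANEL_MIN_HZ  # must refresh before the image decays
    return min(max(frame_render_ms, fastest), slowest)

# A GPU rendering at a fluctuating rate:
for render_ms in [16.7, 19.2, 25.0, 14.0, 33.3]:
    shown_ms = refresh_interval_ms(render_ms)
    print(f"frame took {render_ms:5.1f} ms -> held for {shown_ms:5.1f} ms "
          f"({1000.0 / shown_ms:5.1f} Hz)")
```

The only real constraint is the clamp: a panel can't scan out faster than its maximum refresh, and it still has to refresh eventually even if no new frame arrives.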
 

Schnozberry

Member
This is great tech. My 780 is ready. Hopefully the prices aren't in the stratosphere. That would be a huge barrier to adoption.
 

Somnid

Member
So does someone want to explain to us plebes exactly what this is? Is it basically a monitor that does V-sync on its own without the card having to take the hit?

-A monitor refreshes at 60 Hz.
-A GPU finishes rendering a frame whenever it finishes; that depends on the complexity of the frame.
-To show the frame, the monitor has to refresh.
-If the GPU finishes before a refresh, the frame waits until the next refresh.
-If the GPU misses the expected refresh, it takes nearly an entire extra refresh to catch back up.

That's frame lag with V-sync.

-If V-sync isn't on, the GPU will push its content whenever it's ready.
-If the screen is mid-refresh, it'll draw part of one frame and then switch frames mid-draw, making part of the screen show a different frame.

That's tearing.

This allows the GPU to decide when to push a frame: the refresh happens when the frame is done, rather than trying to sync GPU rendering to a fixed monitor refresh.

It's a very good idea and should eliminate a lot of gaming problems, but I wonder how the GPU signals the monitor. Sounds very proprietary.
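
Here's a back-of-the-envelope Python model of those two failure modes on a fixed 60 Hz display. The numbers and function names are made up for illustration; real drivers and display scalers are more complicated:

```python
# Toy timeline of a fixed 60 Hz display: V-sync lag vs. tearing.
import math

REFRESH_MS = 1000.0 / 60.0  # a scanout starts every 16.67 ms

def vsync_display_time(finish_ms):
    """With V-sync, a finished frame waits for the next refresh tick."""
    return math.ceil(finish_ms / REFRESH_MS) * REFRESH_MS

def tear_line_fraction(finish_ms):
    """Without V-sync, the buffer flips mid-scanout; the tear shows up
    this fraction of the way down the screen (0 = top)."""
    return (finish_ms % REFRESH_MS) / REFRESH_MS

for finish in [10.0, 17.0, 50.5]:  # GPU finish times in ms
    shown = vsync_display_time(finish)
    print(f"done {finish:5.1f} ms: v-synced at {shown:5.1f} ms "
          f"(+{shown - finish:4.1f} ms lag), "
          f"or tears {tear_line_fraction(finish):.0%} down the screen")
```

Note the frame finishing at 17.0 ms: it just missed the 16.7 ms refresh, so with V-sync it sits for ~16 ms before anyone sees it. That's the "entire refresh to catch back up" case above.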
 

ghst

thanks for the laugh
i think people are missing the big picture here.

i mean, it's great that it's being announced by nvidia now. it's great they'll have it in high end gaming monitors powered by their own GPUs.

but the really awesome thing is that the technology is here. it's like the first rocket in space. we're no longer waiting for the tech, the next stage is the gradual proliferation, implementation and iteration.

in a few years, no serious gamer will ever have to think about these problems again.
 

komplanen

Member
Sounds pretty neat. Will definitely buy a monitor with this or similar tech when I upgrade my current monitor. Won't be anytime soon though.
 

eot

Banned
How will this work for low framerates? If I'm getting 45 fps won't 45Hz look like shit? It's not a CRT with flickering, but still.
 

Leb

Member
The only thing raining on this parade is the accursed TN panel. I pray to Gabe that somewhere down the road, this technology will be adapted for use with superior panel techs like IPS/PLS.
 

Orayn

Member
So does someone want to explain to us plebes exactly what this is? Is it basically a monitor that does V-sync on its own without the card having to take the hit?

Opposite of that: If I'm reading this right, it makes the monitor's refresh behavior subservient to the video card instead of the other way around.
 

trh

Nifty AND saffron-colored!
TLDR explanation:

Instead of the monitor refreshing at a fixed rate (its Hz) regardless of how fast or slow the GPU is rendering frames, G-Sync monitors fully synchronise the display's refresh with the GPU's frame rendering.

When GPU frame rendering and monitor refreshes are not synchronised, games with fluctuating framerates suffer issues like tearing.

In the past, the solution was V-Sync, which forces the GPU to hold each completed frame until the monitor's next refresh, but this causes stuttering and input lag whenever the GPU can't keep up with the monitor's fixed refresh rate.

G-Sync is a hardware-level synchronisation of GPU frame rendering and monitor refreshing, meaning the monitor's Hz/refresh rate changes dynamically with the GPU's rate of frame rendering. You get each frame fully rendered (no tearing), and when the framerate drops the monitor doesn't fight it with a locked refresh rate, instead changing its refresh rate to fit the framerate (eliminating stuttering and input lag).

Thank you.



Holy shit.
 
Proprietary displays, bravo Nvidia, I never would have guessed that was your next proprietary device. What's next, power supplies, keyboards, mice?

Why are you locking out a huge segment of the market by making an Nvidia video card required? I guess the lock-in works for Apple. I wonder how much a replacement G-Sync video cable will cost, $50?
 

Durante

Member
i think people are missing the big picture here.

i mean, it's great that it's being announced by nvidia now. it's great they'll have it in high end gaming monitors powered by their own GPUs.

but the really awesome thing is that the technology is here. it's like the first rocket in space. we're no longer waiting for the tech, the next stage is the gradual proliferation, implementation and iteration.

in a few years, no serious gamer will ever have to think about these problems again.
Yeah baby

How will this work for low framerates? If I'm getting 45 fps won't 45Hz look like shit? It's not a CRT with flickering, but still.
It will look infinitely better than 45 FPS on a 60 Hz display.
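
To put rough numbers on that (my arithmetic, not Durante's): at 45 FPS on a fixed 60 Hz display with V-sync, frames can only change on 16.7 ms ticks, so each one is shown for either 16.7 ms or 33.3 ms in an alternating 2:1 cadence, which reads as judder. At a native 45 Hz, every frame is held for a uniform 22.2 ms. Same framerate, much smoother motion.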


Where's dark10x?
 

Sethos

Banned
Opposite of that: If I'm reading this right, it makes the monitor's refresh behavior subservient to the video card instead of the other way around.

It just matches the Hz with the FPS.

So if the game is at 52 FPS, the monitor runs at 52 Hz, and that changes constantly. That means you won't have frames arriving between scan times causing the tear, and you won't have the input lag of V-sync.
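
For the curious, the arithmetic behind that (mine, not from the announcement): 52 FPS means a new frame every 1000/52 ≈ 19.2 ms, so the panel simply waits about 19.2 ms between scanouts instead of forcing frames onto a fixed 16.7 ms (60 Hz) grid.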
 

Dawg

Member
My hands are actually shaking slightly. Maybe I should re-evaluate my life or maybe I should just be happy.

I'm really happy this tech is here. At first, it just looked like an easier V-sync... but that comparison pic where you can actually read moving text... damn!
 

rjc571

Banned
I am so fucking in. This solves IQ problems, but it also solves the shit out of my fundamental problems with emulation.

I assume this is fixed pixel? So it will still have IQ problems with upscaling artifacts, presumably?
 

Perkel

Banned
i think people are missing the big picture here.

i mean, it's great that it's being announced by nvidia now. it's great they'll have it in high end gaming monitors powered by their own GPUs.

but the really awesome thing is that the technology is here. it's like the first rocket in space. we're no longer waiting for the tech, the next stage is the gradual proliferation, implementation and iteration.

in a few years, no serious gamer will ever have to think about these problems again.

Yep. I would love to see monitors, and especially TVs, created with games in mind.
 