
G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

Tain

Member
rjc571 said:
I assume this is fixed pixel? So it will still have IQ problems with upscaling artifacts, presumably?

Yeah, there will still be upscaling. You won't get CRT beauty. But timing issues are a much bigger deal for me, and they're gone.

You can now play something like R-Type (a 55Hz arcade game) without stutter, screen tearing, lag, or altering the game's speed, and that's awesome.
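To put rough numbers on that (just my own back-of-the-envelope math in a tiny Python sketch, nothing official, and assuming a typical fixed 60Hz LCD), here's why a 55Hz game is a problem on a fixed-rate panel and a non-issue with variable refresh:

    # Illustrative sketch only: why 55Hz content judders on a fixed 60Hz display.
    game_hz = 55.0      # R-Type's native update rate
    panel_hz = 60.0     # assumed fixed LCD refresh rate

    frame_time_ms = 1000.0 / game_hz      # ~18.18 ms between new game frames
    refresh_time_ms = 1000.0 / panel_hz   # ~16.67 ms between panel refreshes

    # With v-sync on a fixed panel, 55 new frames have to fill 60 refreshes,
    # so some frames get shown twice each second -- that's the judder.
    duplicated_per_sec = panel_hz - game_hz
    print(f"~1 duplicated frame every {game_hz / duplicated_per_sec:.0f} frames")

    # With variable refresh, the panel just refreshes every ~18.18 ms,
    # so every frame is shown exactly once for the same duration.

The alternatives on a fixed panel are tearing (swapping mid-refresh) or running the game at 60Hz, which speeds it up.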
 
http://www.geforce.com/whats-new/ar...evolutionary-ultra-smooth-stutter-free-gaming

nvidia site is up.

How To Upgrade To G-SYNC

If you’re as excited by NVIDIA G-SYNC as we are, and want to get your own G-SYNC monitor, here’s how. Later this year, our first G-SYNC modules will be winging their way to professional modders who will install G-SYNC modules into ASUS VG248QE monitors, rated by press and gamers as one of the best gaming panels available. These modded VG248QE monitors will be sold by the modding firms at a small premium to cover their costs, and a 1-year warranty will be included, covering both the monitor and the G-SYNC module, giving buyers peace of mind.

Alternatively, if you’re a dab hand with a Phillips screwdriver, you can purchase the kit itself and mod an ASUS VG248QE monitor at home. This is of course the cheaper option, and you’ll still receive a 1-year warranty on the G-SYNC module, though this obviously won’t cover modding accidents that are a result of your own doing. A complete installation instruction manual will be available to view online when the module becomes available, giving you a good idea of the skill level required for the DIY solution; assuming proficiency with modding, our technical gurus believe installation should take approximately 30 minutes.

If you prefer to simply buy a monitor off the shelf from a retailer or e-tailer, NVIDIA G-SYNC monitors developed and manufactured by monitor OEMs will be available for sale next year. These monitors will range in size and resolution, scaling all the way up to deluxe 3840x2160 “4K” models, resulting in the ultimate combination of image quality, image smoothness, and input responsiveness.

Canceling all next gen consoles, getting a 780 and a new monitor.


oops late.
 

Enectic

Banned
So...question...is this similar to how OnLive handles v-sync (except on a local level)?

The nature of this v-sync mode deserves mention. Rather than altering a universal setting for all games on the server-side, it actually affects client-side decoding. With v-sync engaged, the client needs to wait for the next screen refresh before it can display the frame it has just finished decoding.
 

Serandur

Member
I... I don't know what to make of this. On one hand, it's flipping awesome. On the other, it's proprietary Nvidia stuff. That is bullshit... BULLSHIT. I really want AMD to remain a viable enthusiast GPU brand. Obviously, Nvidia probably don't, but still, something this big should not be proprietary. Fuck you Nvidia... no wait, take my money I guess... no, NO, FUCK YOU Nvidia... and take my money. Cognitive dissonance here, help?
 

Dawg

Member
I'm happy the industry is finally acknowledging the biggest problem in a gaming setup right now: the screens/monitors.

Really, I can buy a very high-end computer, but I'll always feel like it's "bottlenecked" by the monitor, because the tech is (imho) worse than CRT when it comes to input lag and ESPECIALLY motion blur, ghosting, and the like.

120Hz (with LightBoost) was the first step in the right direction, and I hope it will only get better from here on out, especially with G-Sync.
 

Dario ff

Banned
Steam Machine benefits just got a whole lot more interesting for competition.

And this just confirmed my next purchase will be an Nvidia card. :)
 

Exuro

Member
I... I don't know what to make of this. On one hand, it's flipping awesome. On the other, it's proprietary Nvidia stuff. That is bullshit... BULLSHIT. I really want AMD to remain a viable enthusiast GPU brand. Obviously, Nvidia probably don't, but still, something this big should not be proprietary. Fuck you Nvidia... no wait, take my money I guess... no, NO, FUCK YOU Nvidia... and take my money. Cognitive dissonance here, help?
Seeing as it looks like you can just pop the chip into certain monitors, couldn't amd make their own variant, then you order x monitor with the amd chip instead of nvidia?
 

Pappasman

Member
Does this affect input lag at all? (Sorry, I'm not too familiar with this kind of stuff)

Edit: nvm, I get it now. This is super cool.
 
Seeing as it looks like you can just pop the chip into certain monitors, couldn't amd make their own variant, then you order x monitor with the amd chip instead of nvidia?

Remember, we're speaking about a company that didn't notice it had CrossFire problems for the last 5 years ;)
 

Tik-Tok

Member
What happens when I'm NOT playing a game? How will having a fluctuating refresh rate affect day-to-day things like doing work on my desktop or browsing the internet and stuff?
 

Durante

Member
Could someone explain exactly what this is for the rest of us?
I'll try.

Since time immemorial, all standards for communication between a GPU and a display have used a fixed refresh rate: the rate at which a new picture is sent over the cable and shown on the display device (I'll call it "monitor" from now on for simplicity, but it doesn't have to be one). This originally comes from how CRT monitors work: they scan an electron beam across the screen and need to do so at a constant time interval to prevent flickering.

For a 3D game, this presents many challenges. Either you design your game such that it always manages to produce exactly 60 frames per second (or whatever the refresh rate is, or an integer multiple of it) in all situations, giving up performance in some cases and spending incredible amounts of engineering effort on corner cases, or you synchronize to the start of a new refresh (called "V-SYNC"). In the latter case, you have the following disadvantages:
  • As soon as you can no longer maintain one framerate, you drop directly to the next lower integer division of the base Hz.
  • You incur additional input lag waiting for the synchronization impulse.
  • The game engine animation timing needs to take into account not just the time to create frames, but when they are actually displayed. (Almost no one gets this right)
The first point can be mitigated by e.g. triple-buffering, but this does not fix the other two points.

Ever since we had modern displays which do not require a fixed refresh cycle, everyone who thinks about this for a bit has had the idea of changing things up so that the GPU is actually in control of the communication. It knows exactly when a frame is finished rendering, and at exactly that point the monitor should start showing it.

However, no one thought this would ever happen, because display standards are designed by huge committees which don't give a crap about what a few hardcore gaming enthusiasts want. The best chance was just increasing the refresh rate, which mitigates the issue but does not solve it.

Now Nvidia fixed it.
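To put rough numbers on the V-SYNC problem (this is just an illustrative Python sketch of my own, not anything from Nvidia): with a fixed 60Hz refresh, a frame can only be swapped on a refresh boundary, so its time on screen snaps up to the next multiple of 16.67 ms; with variable refresh, the monitor simply waits for the GPU.

    # Illustrative sketch: effective on-screen time per frame,
    # fixed-rate v-sync vs. GPU-driven variable refresh.
    import math

    REFRESH_HZ = 60.0
    REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ   # ~16.67 ms

    def onscreen_time_vsync(render_ms):
        # With v-sync, the new frame waits for the next refresh tick, so the
        # current frame stays up for a whole multiple of the refresh interval.
        return math.ceil(render_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS

    def onscreen_time_variable_refresh(render_ms):
        # With variable refresh, the monitor scans out as soon as the frame
        # is done (within the panel's supported range).
        return render_ms

    for render_ms in (10.0, 17.0, 22.2, 30.0, 34.0):
        print(f"render {render_ms:5.1f} ms -> "
              f"v-sync {onscreen_time_vsync(render_ms):5.2f} ms, "
              f"variable refresh {onscreen_time_variable_refresh(render_ms):5.2f} ms")

The 17 ms case is the nasty one: missing the refresh by a fraction of a millisecond doubles the time the frame stays on screen, which is exactly the "drop to the next integer division" in the first bullet above.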
 

teiresias

Member
Won't this result in flicker and headache-inducing low refresh rates if you're playing a game that really pushes the rendering hardware and can only keep about 30fps? I understand such horrible framerates are generally unknown to us glorious PC gamers, but I think it's a valid concern, since it may artificially force a graphics card upgrade earlier than one might want if you literally can't look at your monitor due to the strobing that comes with the refresh rate being limited by the rendering speed (though I suppose this functionality could be turned off if you want).
 
What happens when I'm NOT playing a game? How will having a fluctuating refresh rate affect day-to-day things like doing work on my desktop or browsing the internet and stuff?

What are you talking about? Any modern GPU should be able to provide a 144Hz 2D desktop.
 
Holy SHIT. Finally, modern displays can do what CRTs could do decades ago (although that had to do with how the game was programmed too). I feel like some ancient lost technology has been rediscovered. V-sync has always sucked because of latency, triple buffering has always sucked because of stuttering (and latency), and tearing has always sucked because it's just hideous. That I won't have to deal with this shit anymore is just fuckin amazing.
 

Orayn

Member
How will this work for low framerates? If I'm getting 45 fps won't 45Hz look like shit? It's not a CRT with flickering, but still.

I think it means no judder since it effectively makes the monitor's refresh rate the same as the game's framerate up to some maximum.

A game rendering 45 frames per second is taking 22.22 ms to make each new frame; I *think* G-Sync can tell the monitor to just keep each frame onscreen for that amount of time. A traditional refresh rate of 60Hz means that the monitor HAS to refresh every 16.67 ms, so you're left with a choice between tearing where you give it two partial frames at once, or vsync where each frame has to be displayed for some multiple of 16.67 ms.
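Rough numbers for that 45 fps case, as a quick Python snippet (my own arithmetic, not from any spec):

    # 45 fps on a 60Hz panel: the options, roughly.
    frame_ms = 1000.0 / 45      # ~22.22 ms per rendered frame
    refresh_ms = 1000.0 / 60    # ~16.67 ms per refresh on a fixed 60Hz panel

    # Fixed refresh + v-sync: hold each frame for 2 refreshes (~33.33 ms),
    # i.e. you effectively see 30 fps.
    vsync_hold_ms = 2 * refresh_ms
    # Fixed refresh, no v-sync: swap mid-scan and get tearing instead.
    # G-Sync: hold each frame for exactly ~22.22 ms, i.e. you see all 45 fps.
    print(f"frame ready every {frame_ms:.2f} ms; "
          f"v-sync shows it for {vsync_hold_ms:.2f} ms, G-Sync for {frame_ms:.2f} ms")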
 
So the divide is growing wider... I guess people will have to decide between Mantle and PhysX+G-SYNC in the future. Which would be sad. Both things deserve to be spread between everyone: A new, legacy-cruft-free, low-level, easy-to-develop-for graphics API as well as monitors seamlessly changing refresh rates according to the GPU's frame-render capabilities.

If NVIDIA's technology works in every application without any special implementation required from the developer's side (it definitely sounds like the driver would handle things automatically, as long as you turn off V-SYNC), this will be huge, though. Heck, this is a technology that should be in every screen from now on, period. I love it!

Since we're actually still pretty unsure how the Mantle SDK really functions, I'm heavily favouring NVIDIA for now.
 

Mandoric

Banned
VFR for monitors, nice. Hope it goes generic, since this would be pretty cool on the video front too.

Won't this result in flicker and headache-inducing low refresh rates if you're playing a game that really pushes the rendering hardware and can only keep about 30fps? I understand such horrible framerates are generally unknown to us glorious PC gamers, but I think it's a valid concern, since it may artificially force a graphics card upgrade earlier than one might want if you literally can't look at your monitor due to the strobing that comes with the refresh rate being limited by the rendering speed (though I suppose this functionality could be turned off if you want).

LCDs don't strobe, even at lower refresh rates.
 