
G-SYNC - New nVidia monitor tech (continuously variable refresh; no tearing/stutter)

2San

Member
For everyone coming to the thread, here's the high-level breakdown of what G-SYNC does:

  • DIY modkit available before the end of the year for the ASUS VG248QE (~$175, aiming to reduce price to ~$130)

Ah, that's a bit harsh. It might be nice if you guys added a discount for buying a new GPU (similar to SHIELD).
 
For everyone coming to the thread, here's the high-level breakdown of what G-SYNC does:

  • No tearing
  • No VSync input lag
  • No VSync stutter
  • Improved input latency compared to VSync Off mode on 'standard' monitors
  • Clearer, smoother movement (tech demo shows readable text on a moving object)
  • DIY modkit available before the end of the year for the ASUS VG248QE (~$175, aiming to reduce price to ~$130)
  • Modded ASUS VG248QE versions available from partners before the end of the year
  • Off-the-shelf models available from retailers next year. ASUS G-SYNC VG248QE MSRP is $399.
  • Will be in monitors of every resolution and size, right up to 3840x2160 - selection determined by manufacturers like ASUS, BenQ, Philips and ViewSonic
  • Requires GTX 650 Ti Boost or better



Mod it with the DIY kit, or have someone do it for you. Takes 20-30 minutes.

 
no more tearing, huh? the only tearing i see are the ones rolling down my cheeks in pure joy. i can soon kiss my fw900 goodbye and send her off to crt heaven.

i really have to see a demo of this myself. this might really pave the way for 4K gaming: if even 35fps is now the new 60, as long as it feels as smooth with my mouse as playing at 140hz, then this will be a miracle.
 
"i can soon kiss my fw900 goodbye and send her off to crt heaven."


I wish there were a monitor as good as my old FW900. I've never been so sad as when she died on me.
 
Yes, of course a big company chooses to make proprietary a technology which they independently researched and coordinated in order to actually profit on all the work they put into it. That shouldn't surprise or really even bother anybody.

If this technology plays out as well as expected, it will become a standard eventually, but a proprietary API paired with select partners is the way for someone to actually make the effort of developing this technology commercially viable.

A standard that will still only work on Nvidia cards?

People were upset at the prospect of Mantle becoming a standard and blocking out Nvidia cards, so how is this any different?
 
Could AMD license it, if it becomes widespread?

That's a question of whether nVidia think they can make more money by taking a cut on competing GPUs or by keeping an exclusive on a cool enthusiast tech like this. Even if they don't license it, someone will come up with a competing implementation eventually.
 

Shai-Tan

Banned
Carmack says:

they didn't talk about it, but this includes an improved lightboost driver; it is currently a choice -- gsync or flashed.

it sucks that it's either/or. i think i'll wait for this stuff to shake out.
 
A standard that will still only work on Nvidia cards?

People were upset at the prospect of Mantle becoming a standard and blocking out Nvidia cards, so how is this any different?

Mantle will in all likelihood not be generalised into a standard (by definition it cannot be: it is GCN-specific).

This will probably be generalised into a standard.
 

ymmv

Banned
The question of licensing this was actually brought up during the panel.

This thing is so blindingly obvious that I can really imagine it eventually being licensed.

It is that big and important.

And not just for computer monitors: every HDTV should have this tech built in. It will not only make games more responsive and the visuals smoother, but your graphics card will punch above its weight too.
 

elfinke

Member
This is a properly great advance in existing tech. Good stuff.

Just gotta get it into decent (any) 16:10 monitors now and I'm in. I need those vertical pixels!

Would love to see this tech added to future HDTV panels too, though I guess in its current implementation it wouldn't matter to the Xbone or PS4, given the hardware they're using. In any event, variable refresh rate 4K panels are certainly the future. Brilliant!
 
Someone may have mentioned this already, but what are the chances this tech could be used with the Oculus Rift? Stable framerates and no tearing combined with VR sounds like perfect match.

Edit: Yeah, I assumed it would have been brought up. Good to know it will be integrated in the future at least.
 
"i can soon kiss my fw900 goodbye and send her off to crt heaven."


I wish there were a monitor as good as my old FW900. I've never been so sad as when she died on me.

i look every week for unwanted ones on the net just in case she dies; best $35 i have ever spent. if this thing works i will be jumping to 4k almost a decade before i thought i would, while saying that 35fps is smooth.

this is crazy talk.
 
Someone may have mentioned this already, but what are the chances this tech could be used with the Oculus Rift? Stable framerates and no tearing combined with VR sounds like perfect match.


implemented with the rift or projectors, this could actually lower the buy-in price for a good experience
 

Eusis

Member
Why the fuck would a TV need this?
We do watch more than TV on it, plus consoles may need it more (if it could actually be used with them, but this is TOO new a technology for that), given you can't really do much about a lack of v-sync there.
FYI, according to Carmack, if frame rate drops below 30 FPS, it starts re-drawing old frames.

https://twitter.com/ID_AA_Carmack/status/391300283672051713

So it's variable above 30 FPS, but it essentially reverts to a 30 Hz VSYNC below that.
Sounds like its Achilles' heel. Which isn't too bad, really: at that point you either need to dial back settings or just be thankful it runs at all, especially since you need reasonably new hardware to use this anyway.
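The fallback Carmack describes can be sketched in a few lines. This is my own illustration based only on the tweet quoted above; the names and the exact threshold are assumptions, not Nvidia's implementation:

```python
MIN_REFRESH_HZ = 30
MAX_HOLD_S = 1.0 / MIN_REFRESH_HZ  # ~33.3 ms: longest the panel will wait for a frame

def next_scanout(frame_time_s: float) -> str:
    """Decide what the monitor shows, given how long the next frame takes to render."""
    if frame_time_s <= MAX_HOLD_S:
        # Variable refresh: scan out the new frame the moment it arrives.
        return "new frame"
    # Hold timer expired: re-draw the previous frame, so below 30 FPS the
    # display effectively behaves like a 30 Hz VSYNC'd monitor.
    return "repeat previous frame"

print(next_scanout(1 / 60))  # renders in ~16.7 ms -> "new frame"
print(next_scanout(1 / 24))  # renders in ~41.7 ms -> "repeat previous frame"
```

Above 30 FPS every frame is shown as soon as it is ready; below it, the old frame is re-scanned, which is why the experience degrades to ordinary 30 Hz VSync behavior at that point.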
 

Riposte

Member
I'm real dumb. Does this mean less input lag?

EDIT: Okay, someone summed it up well.

This is incredibly exciting. Big step forward, right?
 

Miguel81

Member
That's a question of whether nVidia think they can make more money by taking a cut on competing GPUs or by keeping an exclusive on a cool enthusiast tech like this. Even if they don't license it, someone will come up with a competing implementation eventually.

I really hope it ends up being the former, because AMD "solutions" always end up seeming like patchwork.
 
Mantle will in all likelihood... not be generalised into a standard (it by definition cannot be: GCN specific).

This will probably be generalised into a standard.

Mantle is open, so it could become a standard.

And "G-sync will probably become a standard" is very reassuring... yeah...
 

Nirolak

Mrgrgr
For everyone coming to the thread, here's the high-level breakdown of what G-SYNC does:

  • DIY modkit available before the end of the year for the ASUS VG248QE (~$175, aiming to reduce price to ~$130)
  • Modded ASUS VG248QE versions available from partners before the end of the year
  • Off-the-shelf models available from retailers next year. ASUS G-SYNC VG248QE MSRP is $399.

I had a couple quick questions that I'm not sure if you can answer:
1.) Is there any difference between the modded ones and the one shipping next year?
2.) Is "next-year" relatively early or are we talking like March, July, September...
 

AndyBNV

Nvidia
I had a couple quick questions that I'm not sure if you can answer:
1.) Is there any difference between the modded ones and the one shipping next year?
2.) Is "next-year" relatively early or are we talking like March, July, September...

1) No difference, no. The mods this year are by pro-modding firms. They'll buy in the ASUS monitor, buy in our DIY kit, install the kit, and then sell the modded panel on to customers with a warranty for both the panel and G-SYNC module. The ones available next year can be bought at any retailer and are made/produced/modded by the OEMs.

2) That timing is entirely down to the OEMs.
 

Nirolak

Mrgrgr
1) No difference, no. The mods this year are by pro-modding firms. They'll buy in the ASUS monitor, buy in our DIY kit, install the kit, and then sell the modded panel on to customers with a warranty for both the panel and G-SYNC module.

2) That timing is entirely down to the OEMs.

Thanks a bunch.

I think I'll get a modded one then assuming the mark-up isn't ridiculous.
 

chaosblade

Unconfirmed Member
Mantle is open, so it could become a standard.

And "G-sync will probably become a standard" is very reassuring... yeah...

Being open doesn't really matter. AMD has spent years building it from the ground up around GCN for the consoles. Nobody is going to put in tons of work to make it support Nvidia architectures, or older AMD ones. It's just AMD trying to leverage their console contracts on PC.

I doubt Mantle is going to be a big deal at all; it's marketing. This is a big deal, though: probably the biggest breakthrough for PC gaming (or arguably gaming in general) in a long time.
 

Alo81

Low Poly Gynecologist
Can always do it yourself :) We'll have installation guides, and hopefully a step by step video also.

Is there any word on how difficult installation will be? It certainly won't be as simple as opening it up and popping it in a slot.

Will a lot of soldering be required?
 

Lord Panda

The Sea is Always Right
It looks like the time for me to retire my U3011 is drawing near, but I'm not a fan of that Asus panel.

Looking forward to seeing the other gsync displays.
 

Ce-Lin

Member
I thought I would never see the day: the feel of locked 60 fps even with the framerate dips and stutter I hate. Good job, Nvidia. Now please sort out the bugs in the 32x.xx driver branch. This will be commonplace in a few years; it must be.
 

AndyBNV

Nvidia
Is there any word on how difficult installation will be? It certainly won't be as simple as opening it up and popping it in a slot.

Will a lot of soldering be required?

Screwdriver only, 20-30 minutes. I've looked through the guide and it doesn't appear too difficult. If you have computer building and/or computer modding experience it shouldn't be a problem.
 

Dawg

Member
Ugh, I really have a hard time deciding what monitor to buy. The Asus monitor is great, but the BenQ option is cheaper and better in Europe.

I kinda have faith BenQ will receive the DIY option as well (they were on the list of manufacturers when it was announced, after all) since they're pretty much the EU leader of 120hz atm.

Will probably just buy the BenQ and see what happens, I can still change later on if needed.
 

kartu

Banned
Apparently, the higher the refresh rate your monitor supports, the less you gain.
The faster your GPU is, the less you gain.

It's nice tech, but the price, on top of a GPU-vendor lock-in for your monitor, is disappointing.
I bet people with $200-250-ish GPUs are better off going SLI/CrossFire for most games than upgrading their monitors.
 
People were upset at the prospect of Mantle becoming a standard and blocking out Nvidia cards, so how is this any different?

Mantle is a low-level API that will directly affect development. In order to get the full benefit, developers will need to code directly to that API, which means that any benefits it provides will never work with non-AMD cards, and its functionality could make it harder to run games that support it in the future on new hardware. That pushes towards vendor lock-in and makes it harder for devs to support both platforms equally.

This is a technology that works with (almost) any game out of the box and doesn't affect development at all. That means that you might need to pay nVidia for it today, but if AMD releases a competing version (or someone puts together a standard) three years from now, it'll already work with all the games that came out in the interim.

Pretty stoked that I actually own a VG248QE. I'll definitely be picking this up day 1.

I do too, but rather than buy a modkit I think I'll just grab a premodded one when they come out -- I only have one display right now and wanted to upgrade to two anyway.
 
Apparently, the higher the refresh rate your monitor supports, the less you gain.
The faster your GPU is, the less you gain.

Not exactly.

Yes, people who already game on 120hz monitors and/or beastly GPU configurations probably already have much less tearing than people trying to eke out performance from aging configurations. But this will also have a lot of other benefits: it'll fix persistent speed/tearing problems in emulators, it'll allow you to turn off locked framerate on games that do rock-solid 60 FPS but can't hit 120, and reduce the input lag on games that people triple-buffer today. I expect this will be a pretty major improvement even for people with good systems.
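A rough way to see the pacing benefit described above: a game rendering a steady 50 FPS on a fixed 60 Hz VSync'd display has every fifth frame held on screen for two refresh intervals, while a variable-refresh display simply follows the render cadence. The numbers and helper names below are my own idealized illustration, not measurements:

```python
import math

REFRESH_S = 1 / 60  # fixed 60 Hz display: scanout happens only on these boundaries
RENDER_S = 1 / 50   # game finishes a frame every 20 ms

def vsync_display_times(n: int) -> list[float]:
    """With VSync, each finished frame waits for the next refresh boundary."""
    times = []
    for i in range(n):
        done = i * RENDER_S
        tick = math.ceil(done / REFRESH_S - 1e-9)  # next refresh at/after 'done'
        times.append(tick * REFRESH_S)
    return times

def vrr_display_times(n: int) -> list[float]:
    """With variable refresh, frames are scanned out as soon as they finish."""
    return [i * RENDER_S for i in range(n)]

def deltas_ms(times: list[float]) -> list[float]:
    """Frame-to-frame on-screen intervals in milliseconds."""
    return [round((b - a) * 1000, 1) for a, b in zip(times, times[1:])]

print(deltas_ms(vsync_display_times(6)))  # [33.3, 16.7, 16.7, 16.7, 16.7] - periodic judder
print(deltas_ms(vrr_display_times(6)))    # [20.0, 20.0, 20.0, 20.0, 20.0] - even pacing
```

The uneven 33.3/16.7 ms pattern is exactly the stutter people report when a game runs below the refresh rate with VSync on; matching the scanout to the render cadence removes it without introducing tearing.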
 

LiquidMetal14

hide your water-based mammals
Now I want to go back to getting Nvidia GPUs with a monitor compatible with this tech. I think that's the way I'm leaning.
 

stef t97

Member
We do watch more than TV on it, plus consoles may need it more (if it could actually be used with them, but this is TOO new a technology for that), given you can't really do much about a lack of v-sync there.

I get that, but on 60hz TVs developers should be focusing on a 30 or 60fps target, not the 22-30ish many seem to nowadays. Obviously it will still make a difference, but between that and most people using wireless controllers, it just doesn't seem like as big of a necessity.
 
Not exactly.

Yes, people who already game on 120hz monitors and/or beastly GPU configurations probably already have much less tearing than people trying to eke out performance from aging configurations. But this will also have a lot of other benefits: it'll fix persistent speed/tearing problems in emulators, it'll allow you to turn off locked framerate on games that do rock-solid 60 FPS but can't hit 120, and reduce the input lag on games that people triple-buffer today. I expect this will be a pretty major improvement even for people with good systems.

It will also just reduce input lag in general. Everything goes directly to the monitor, baby... immediately.
 

mephixto

Banned
Not exactly.

Yes, people who already game on 120hz monitors and/or beastly GPU configurations probably already have much less tearing than people trying to eke out performance from aging configurations. But this will also have a lot of other benefits: it'll fix persistent speed/tearing problems in emulators, it'll allow you to turn off locked framerate on games that do rock-solid 60 FPS but can't hit 120, and reduce the input lag on games that people triple-buffer today. I expect this will be a pretty major improvement even for people with good systems.

A good example is BF4: my fps swings wildly between 50-120, you can notice the drops, and it's not smooth.
 

Eusis

Member
I get that, but on 60hz TVs developers should be focusing on a 30 or 60fps target, not the 22-30ish many seem to nowadays. Obviously it will still make a difference, but between that and most people using wireless controllers, it just doesn't seem like as big of a necessity.
True, especially with the 30 FPS bit. That ends up making something like this more useful to games like Monster Hunter 3 Ultimate or Jak 2/3 than, say, The Last of Us or Grand Theft Auto V.
 