
G-SYNC Do-It-Yourself Kits available in NA for $199 - Link to YouTube Guide inside

D

Deleted member 22576

Unconfirmed Member
HDMI can't output past 60Hz anyway, which kills the entire point of the module.
Right but that means you can't plug a console into the display.
Supposedly the chip they use on them sells for $1000 MSRP. This is from a random display tech hobbyist on the Blur Busters forum.
Well that sure is bananas.
 

Hip Hop

Member
I did get that monitor for $200, but another $200 is too much. If anything, I hope they become available on Amazon or something so they can sell them open box for $100-$130
 

LiquidMetal14

hide your water-based mammals
You should seriously ask yourself why it makes you upset that there's one company out there actually creating and selling enthusiast products for the gaming market.

Because of 3D Vision we got 120Hz LCD monitors; because of its further development we got LightBoost and reclaimed motion clarity; and because of G-Sync, variable refresh rate adoption in gaming displays at the very least got a massive mindshare and development boost, and we just got the first official 1440p 120Hz monitor announcement.

The latter I'm excited for. My wallet isn't as happy about having to buy new things that raise the price of entry.

But this is an enthusiast thing. I can deal with that. I think I might hold off on getting that ASUS monitor for a year or two while I upgrade and build a new Haswell-E build. In 2 years there will surely be a nice 1440p GSYNC compatible monitor at a respectable price.

My initial reaction to things like this is never good, given Nvidia's usual approach, even though this is a nice potential game changer. The initial sting of the investment is what really gets me.
 

Xyber

Member
You should seriously ask yourself why it makes you upset that there's one company out there actually creating and selling enthusiast products for the gaming market.

Because of 3D Vision we got 120Hz LCD monitors; because of its further development we got LightBoost and reclaimed motion clarity; and because of G-Sync, variable refresh rate adoption in gaming displays at the very least got a massive mindshare and development boost, and we just got the first official 1440p 120Hz monitor announcement.

While all that is great stuff, it doesn't change the fact that they could have included a single HDMI port so you have the option to plug something else in, even if that means I can't take advantage of the higher refresh rate. For 200 bucks I expect a bit more. Even just having a second DisplayPort on the monitor would be good enough, since I could just get an adapter for it.

I was ready to buy it as soon as it was available here in Sweden, but now I'm not interested at all because that would mean I can't plug in my PS4 when I get one. And I'm not buying myself a new TV (my current one is shit and almost broken) just to play on a console.

I guess I will have to wait for their next version or something and hope they fix this.
 
D

Deleted member 22576

Unconfirmed Member
I guess I will have to wait for their next version or something and hope they fix this.

You don't have to guess. This is a weird upgrade for idiots like us; it's not a real product. The fact they're even offering this is sort of astonishing.
 

Dash Kappei

Not actually that important
I don't care how cool and hip G-Sync is. You are absolutely right. It makes me kind of upset that Nvidia does this kind of stuff. No way in hell I'm buying a $300+ monitor on top of a little $200 chip.

It makes you upset that there's a company out there pushing the boundaries for gaming enthusiasts, mmmkay.
You know you don't HAVE to buy it right now, right? It's tech aimed at enthusiasts, not the "best bang for the buck" level, so if you don't want to put up with the prices that come with owning new technology before everyone else, be glad they're bringing this stuff to the table and doing R&D like crazy, and just wait for the prices to come down.
Even Blu-ray players, something massively produced unlike these boards, cost an arm and a leg when they first hit the market; now you can get one for 50 bucks.

"Upset" Pffft.

edit: Durante my man, what you said
 

Branduil

Member
I feel like it's going to be a better idea to wait a couple years for a new monitor. So much new tech coming out between 4K, G-Sync, Oculus Rift, etc.
 
Whew, $200. Yeah, like others said, I'd rather put it towards the graphics card; $200 for the chip is heading into too-spicy territory for me. Hopefully this and FreeSync from AMD help settle down the tech and price in time.
 

mkenyon

Banned
To be fair, I saw how much the FPGA costs, and it's around $425.
You can buy the thing here
http://ca.mouser.com/ProductDetail/...=sGAEpiMZZMvoScKlWpK8THvOVBmiOJRoUcL5B6L/FL0=

edit: sorry, I got the wrong part. It is indeed this one: http://ca.mouser.com/ProductDetail/Altera-Corporation/5AGXMA3D4F31I3N/?qs=u/ajwFXIEhbNXrXYP0t%2bXA==
It's $925.06.
There we go.

They're selling this at a loss.
Whew, $200. Yeah, like others said, I'd rather put it towards the graphics card; $200 for the chip is heading into too-spicy territory for me. Hopefully this and FreeSync from AMD help settle down the tech and price in time.
FreeSync doesn't do what G-Sync does. No capability for refresh beyond native 60Hz (supposedly), and no ULMB, which is the really important part.

Don't look at it as a $200 chip. Look at it as the best $450 gaming monitor one could ever have.
So who besides me is buying/bought this?
Going in on a 790 and the 1440p G-Sync monitor. Replacing a 7970, a BenQ XL2420T, and a Samsung S23A750D (both 120Hz).
 

Branduil

Member
There we go.

They're selling this at a loss.

FreeSync doesn't do what G-Sync does. No capability for refresh beyond native 60Hz (supposedly), and no ULMB, which is the really important part.

Why is ULMB the really important part? You have to choose between using it or G-Sync, so that doesn't matter for someone who wants a variable-refresh monitor.
 
Why is ULMB the really important part? You have to choose between using it or G-Sync, so that doesn't matter for someone who wants a variable-refresh monitor.

Both are important. It's unfortunate that people have to choose one or the other right now. This may change in the future. It may not, we don't know.

From my perspective, ULMB is more important because CRT-like strobing is really neat.

The current go-to solution, ToastyX LightBoost, can't go lower than 100Hz.

This is a curse in games like Metal Gear Rising, which is capped at 60Hz.

ULMB could support 60Hz strobing. We'll just have to see on Monday.
 

nomis

Member
Just ordered one and was kicked back to a server error page. My credit card was still charged the full amount, even though according to Nvidia's own policies they only charge a "nominal amount" until the physical product has shipped.

 

Sentenza

Member
I think we were looking at $130 or so introductory, and then coming down from there to $75 or something. Would have to look back at the threads.
Not that it matters too much for me. The upgrade route was never an option with my old monitor.
It was always about purchasing a new one when the right moment comes.
 
I was really hoping they would eventually release a mod kit for the VG236H. Now, I completely doubt it at this price point.
 

Mandoric

Banned
Not enthused about the "limited time" part. I'd grabbed a new AMD board shortly before the G-Sync announcement, and my plan was to grab the monitor now and the upgrade daughterboard later in the year after another refresh. Now I may hold off on the monitor completely, or get a standard 120Hz one and just SLI my current card to keep it locked.
 
Not enthused about the "limited time" part. I'd grabbed a new AMD board shortly before the G-Sync announcement, and my plan was to grab the monitor now and the upgrade daughterboard later in the year after another refresh. Now I may hold off on the monitor completely, or get a standard 120Hz one and just SLI my current card to keep it locked.
Real G-Sync monitors should be rolling out shortly, though.
 

xBladeM6x

Member
I think I'll wait a while before investing into G-Sync. As awesome as it looks, I imagine it will be too pricey for a while.
 
... I don't think I'm gonna do it.

There appear to be a lot more trade-offs to using this kit on this monitor than I thought there would be. The ULMB section adds concerns like the lack of user control over strobing and black level.

It also turns out that strobing only goes as low as 85Hz, which does nothing for half-assed ports that don't allow refresh rates above 60Hz, like Metal Gear Rising and Dark Souls.

I actually don't mind the status quo of my monitor. LightBoost works fine, and I actually prefer its current coloration, which I spent ages configuring.

I guess I'm in the 1440p+, non-TN camp now.
 

mdrejhon

Member
It also turns out that strobing only goes as low as 85Hz, which does nothing for half-assed ports that don't allow refresh rates above 60Hz, like Metal Gear Rising and Dark Souls.
It's wholly possible that a firmware upgrade might permit 60Hz strobing. You never know.

Strobe backlights can go down to 60Hz, but allowing 60Hz can be problematic, since strobing that slowly produces very visible, CRT-style flicker.

The alternative is to use software-based black frame insertion to convert 120Hz strobing into 60Hz strobing. Some emulators use that, including WinUAE, a certain version of MAME, and a new version of an MSX emulator.
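As a rough illustration (my own hypothetical sketch, not code from WinUAE or MAME), software black frame insertion just doubles the output rate and interleaves black frames, so each real image is flashed once per source-frame interval:

```python
# Hypothetical sketch of software black frame insertion (BFI): a 60 fps
# source is shown on a 120 Hz display by alternating each rendered frame
# with a black frame, emulating 60 Hz strobing on a 120 Hz panel.
def insert_black_frames(frames, black="BLACK"):
    """Return the frame sequence with a black frame after each real one.

    The result is twice as long and is meant to be scanned out at double
    the source rate (e.g. 120 Hz output for a 60 fps source).
    """
    out = []
    for frame in frames:
        out.append(frame)  # real frame: panel shows the image
        out.append(black)  # black frame: emulates the backlight-off phase
    return out
```

For example, `insert_black_frames(["A", "B"])` yields `["A", "BLACK", "B", "BLACK"]`: two source frames become four output frames, half of them black.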

I already let NVIDIA know that strobe length is quite important.

 
You never know.

That's what I'm not comfortable with. It's not the cost for the kit, or the work required to mod the thing.

Now that all the reviews are out, I realize how much of a beta experience this kit is. Beta experiences lead to headaches. Also regrets. School just started back up today; two-and-some-odd weeks ago, they would have had my Christmas/I'm-bored money.

I want these companies to put more work into making G-SYNC and ULMB a gold master rather than a beta before I buy. The ROG SWIFT PG278Q coming out in a few months will be a benchmark.
 

FyreWulff

Member
For $200, I can get myself another monitor or even another GPU.

This is not for people who are frugal with their money. It's a beta prototype board that they've let enthusiasts purchase so they can try out the tech. Seriously, it's very rare for a large company to ask you to do something that normally voids your warranty.

99.9% of people on this forum who end up using G-Sync will be buying monitors that already support it out of the box. You really are just buying an alternate 'brain' for the display. This is not like buying an expansion card; it's more like upgrading to USB4 by swapping out your entire motherboard.

Those monitors will have the tech for cheaper since a) they won't be using FPGAs and b) they'll have economies of scale on their side instead of a limited custom run.
 

wildfire

Banned
Weren't these things supposed to be priced around 40 bucks when they talked about them the first time?


No, it was $175 for the add-in board.

It used to be in their FAQ, but they have since updated it and removed that price.

Fortunately, ASUS made a post back in October mentioning the $175 price as well.

http://rog.asus.com/267372013/gaming/nvidia-announces-g-sync-for-monitors/

Has this been stated? I was hoping that Asus was just the first monitor...


The Asus will be the only monitor. Nvidia isn't in the business of selling add-in boards. This is just to promote their brand.


I could cite 2 sources where they show a lot of disinterest in making new boards for other monitors.
 

Boogdud

Member
Mine arrives tomorrow.

In the vid it looks like it almost replaces the calibration GUI, etc. on the monitor. Curious about that.

I'm also wondering whether it's designed primarily for 60Hz, or whether I can still jump up to 120Hz and get benefits from it even if my framerate doesn't hit 120. It seems like it would, since it's designed to prevent stuttering when the rate dips and peaks.

I guess I'll find out tomorrow.
 

Durante

Member
BlurBusters G-Sync Review part 2 is up:
http://www.blurbusters.com/gsync/preview2/
That true input-to-screen lag measurement is totally awesome! Great work Mark.

I hope you can do this with lots more monitors/display technologies in the future! Also with traditional V-sync (double- and triple buffered).

Again, this is groundbreaking work.

It was good that we were also unable to detect any input lag degradation by using G-SYNC instead of VSYNC OFF. There were many situations where G-SYNC’s incredible ability to smooth the low 45fps frame rate, actually felt better than stuttery 75fps — this is a case where G-SYNC’s currently high price tag is justifiable, as Crysis 3 benefitted immensely
 

TheExodu5

Banned
I should be receiving mine soon, hopefully! I'm very excited for this.

One thing I would like to know: will the FPGA board be upgradable/reprogrammable in case of any major changes to the tech? Hopefully they've included a service option of some sort. Of course, since they're selling this at a massive discount, it's possible they may not be allowed to expose the FPGA, since that might undercut Altera's sales.

edit: incredible review Mark!

It looks like the input lag from high framerate G-Sync is way higher than predicted. I wonder why that is. At least capping at 120Hz seems like a good solution.

[chart: CS:GO input lag measurements]


I'm really surprised. Theoretically we figured that the total added input lag above 144Hz would be at most 1 frame (6.9ms) as the framerate went to infinity, but it seems like even at just 300fps we're looking at nearly 20ms of additional input lag. I wonder why that is. Maybe there is a buffer in place that only gets used when the GPU outputs too quickly, which incurs an additional input lag penalty.
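The arithmetic behind that expectation can be sketched like this (my own back-of-the-envelope check, not numbers from the review beyond the ~20ms figure quoted above):

```python
# If G-Sync merely held a finished frame until the panel's minimum scan-out
# interval elapsed, the added delay should be bounded by one 144 Hz period.
def frame_time_ms(refresh_hz):
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / refresh_hz

one_frame = frame_time_ms(144)        # ~6.94 ms: the "1 frame at most" figure
observed_extra_ms = 20.0              # roughly what was measured at ~300 fps
frames_buffered = observed_extra_ms / one_frame  # ~2.9 scan periods of delay
```

Nearly three scan periods of extra delay, rather than at most one, is what points toward some additional buffer stage rather than a simple hold-until-refresh.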
 
Edit: Or not.

OK, the connector needed to be pulled to the front and then lifted up. I've got the OSD menu working after making sure it's a nice flush connection. However, while messing around with that I left some pressure marks on the panel, because I am an idiot.

Mine arrives tomorrow.

In the vid it looks like it almost replaces the calibration gui, etc. on the monitor. Curious about that.

You are replacing all of the original display hardware other than the panel, so yes, all of the OSD and control interface is from Nvidia.
 
Some quick impressions.

ICC color profiles seem to be maintained in at least some games that are running in fullscreen. Which I guess is great news since it should at least somewhat help with the crappy TN colors of this panel.

Metal Gear Rising runs great even if it is locked at 60FPS. I was pulling off perfect parries no sweat on blade masters and mastiffs.

It seems to do what it promises to do, unlocked frame rates aren't that bothersome. Running Arkham Origins at max settings with the frame rate fluctuating between 60 and 90 wasn't a big deal.

However, it still does nothing for absolutely horrendous ports like CoD Ghosts, where the game just freezes up for seconds at a time. Same goes for Black Flag with PhysX on. The smoke effects still hit the frame rate hard enough for it to be jarring. In the case of Assassin's Creed, I would still prefer to have the game run at a locked 60 FPS, or close to it, with vsync on.
 

sirap

Member
I'm seeing a lot of posts about getting a better gpu/monitor etc..

A better gpu ain't gonna get you the results that gsync does. A better monitor won't do jack shit for tearing either.
 

SapientWolf

Trucker Sexologist
From what I am seeing right now it will give you better results than G-Sync.
The Ass Creed G-Sync demo I saw was pretty smooth, as long as the framerate stayed above 30. Nvidia might have been gaming the demo, but I don't know how. It was really bizarre to see Fraps say 38 even though it looked like a locked 60.
 
The Ass Creed G-Sync demo I saw was pretty smooth, as long as the framerate stayed above 30. Nvidia might have been gaming the demo, but I don't know how. It was really bizarre to see Fraps say 38 even though it looked like a locked 60.

My comparisons are based purely on my personal preference. From what I saw, I preferred the game running at settings aiming for a locked 60 FPS with VSYNC over G-Sync with the settings cranked higher.

I'm starting to agree with what John Carmack said: G-Sync would be great for the occasional frame dip, so games would still target 60 FPS at all times, but an occasional dip toward 50 would be tolerable.
 