When did tearing become such a big deal?
Cause it's only the tearing that gets fixed with this.
People stating that they're unhappy that this is currently a proprietary thing should bear in mind that if one manufacturer hadn't actually stepped up and put money into this technology, which has been possible in theory for quite a while now, we might have ended up waiting a ton longer for this than was necessary.
The point is that if this catches on, the tech is now presumably going to be pushed in competition by lots of different manufacturers over the next 5 or so years.
nvidia lists Windows 7/8/8.1 required for this to work. Is there a technical reason linux wouldn't work? I'm thinking of SteamOS in particular (yes, on a monitor).
Absolutely 0 interest until they start building monitors with this tech that are IPS displays at 1440p or higher. Until then it is DOA for me; never going back to 1080p, and you couldn't pay me to own another TN panel.
Maybe they just aren't working on getting this compatible with the drivers for linux yet.
"nvidia lists Windows 7/8/8.1 required for this to work. Is there a technical reason linux wouldn't work? I'm thinking of SteamOS in particular (yes, on a monitor)."

Maybe a DirectX-only solution...
When did tearing become such a big deal?
Since the end of CRTs and the introduction of fixed-pixel displays.
Seriously, this pattern has been going on since my last ATI card... the ATI Rage Pro. Every time I think I'll go AMD, Nvidia pulls me back in.
"I know you guys are content with it becoming a niche thing for enthusiasts, but this is a big deal if it works out and it's a bloody shame that it most likely won't reach the vast majority of gamers."
Why wouldn't it reach the vast majority of gamers?
"Limited to... the 62% of people (give or take) who buy their discrete GPUs from nVidia? Proprietary tech hasn't stopped people from building monitors that support Lightboost and this is going to be a noticeably bigger deal than that."

It's actually much, much lower than 62%.
Because it's proprietary? Monitor manufacturers are going to need to come up with a standard that works with both GPU vendors, and maybe even Intel's integrated graphics, for it to reach the vast majority of gamers.
If Valve has their way, will the majority of PC gamers even be playing on "monitors" in the first place?
I doubt Valve has any plan to make people switch to TVs; they wouldn't even be able to, I reckon.
Valve doesn't want people to move from monitors to TVs; they want people who are already on TVs to move to their Steamboxes.
When did tearing become such a big deal?
Ever since frames have been going above the refresh rate. Any reason you couldn't use v-sync? It's not a perfect solution like the one we're getting here, but it DOES eliminate tearing.
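For anyone unclear on the mechanics: tearing happens when the GPU swaps buffers while the monitor is mid-scanout, so part of the screen shows the old frame and part shows the new one. A minimal sketch of where the tear line would land, with made-up numbers purely for illustration:

    # Illustrative sketch only: where a tear line lands when a buffer
    # swap happens mid-scanout. All numbers are made up.
    REFRESH_HZ = 60
    SCANOUT_MS = 1000.0 / REFRESH_HZ  # one top-to-bottom scan: ~16.7 ms
    LINES = 1080                      # vertical resolution

    def tear_line(swap_time_ms):
        """Row where the old frame (above) gives way to the new frame
        (below), given a buffer swap swap_time_ms into the scanout."""
        progress = (swap_time_ms % SCANOUT_MS) / SCANOUT_MS
        return int(progress * LINES)

    print(tear_line(10.0))  # swap 10 ms in -> tear about 60% of the way down

V-sync avoids this by delaying the swap until the scanout finishes, which is exactly the trade-off discussed further down.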
I literally did not want to use my gaming computer because of tearing. It is so bad it makes a 90FPS game look like sub-par 30FPS. I didn't know it was all about the refresh rate, as I had just gotten into PC gaming.
Got myself a 144Hz monitor and that almost completely fixed the issue. Screen tearing doesn't happen on the same scale as it does at 60Hz, but it is still there. More than playable at a refresh rate of 144Hz, but it could still be perfected. Hope G-Sync delivers that.
"Any reason you couldn't use v-sync? It's not a perfect solution like the one we're getting here, but it DOES eliminate tearing."

Oh, I think I'd rather have tearing at 120/144hz than 60hz, but it's still worth bringing up, as it DOES eliminate tearing and is effectively a near-perfect solution in games that don't dip below 60 FPS on your setup anyway.
I dunno why he didn't enable v-sync, but the difference between 60 and 120/144 is as big as the difference between 30 and 60.
Too bad I missed posting earlier, but this is seriously fantastic.
As ghst, Durante, artist, and others already pointed out, this is a fantastic FIRST leap forward for having smoother games while needing less performance. It remains to be seen how much input lag this induces vs. no v-sync, but I'm very hopeful it'll work very nicely with 120Hz / 144Hz panels.
There is no reason others can't look to integrate this tech, get the costs down, and possibly even have it standard on most mid-high end panels in just a few years.
For that I award nVidia with a Flash .gif
Why not extend the goodness to the rest of the lower-end GPUs? I would love, really love it if this goes mainstream, and I ideally want as few (artificial) constraints as possible.
So, the end of ghosting? For some reason every monitor I've used has ghosting I can see, even the high-performance 120Hz stuff.
Let me get this straight: Does this somehow mean that if a game is running at 60 frames per second, and then it drops to 40 frames per second, or even as low as 25FPS, we won't notice due to G-sync, because the GPU render rate and the monitor refresh rate will be practically identical, making the game, as well as input response still feel flawless?
Is that what this is? Because if so, that's fucking unbelievable.
Because monitors refresh at a fixed 60Hz, there's typically a trade-off involved in using V-sync. It eliminates screen tearing, but you have to be able to render at a rock-solid 60 fps or above, or you get stutter. Without V-sync, you get screen tearing, because the monitor and GPU may be updating the frame at different times. With G-Sync, an Nvidia Kepler GPU will now control the timing of frame updates.
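To put rough numbers on that trade-off, here's a toy simulation of when frames actually become visible. This is my own sketch under simplified assumptions (a constant 25 ms render time, and ignoring buffer back-pressure, which in practice can lock double-buffered V-sync down to 30 fps), not how any real driver works:

    import math

    REFRESH_MS = 1000.0 / 60  # fixed display refreshes every ~16.7 ms
    RENDER_MS = 25.0          # assume the GPU needs 25 ms per frame (40 fps)

    def vsync_display_times(n):
        """Fixed refresh: a finished frame waits for the next refresh tick."""
        return [math.ceil(i * RENDER_MS / REFRESH_MS) * REFRESH_MS
                for i in range(1, n + 1)]

    def gsync_display_times(n):
        """Variable refresh: each frame is shown the moment it's done."""
        return [i * RENDER_MS for i in range(1, n + 1)]

    print([round(t, 1) for t in vsync_display_times(4)])  # [33.3, 50.0, 83.3, 100.0]
    print([round(t, 1) for t in gsync_display_times(4)])  # [25.0, 50.0, 75.0, 100.0]

The V-sync times step unevenly (16.7 ms, then 33.3 ms, alternating), which you perceive as stutter; the variable-refresh times are a steady 25 ms apart, which is why a 40 fps game can still feel even.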
This is specific to Kepler only, and even then limited to the 650 Ti Boost and above. I know there was a similar discussion in another thread (wrt Mantle), and that works with all GCN architectures. I wish Nvidia dropped a few more details on the requirements for G-SYNC: is it a dedicated IP block, a DSP, or something else present in GK106 or above?
It may have something to do with the technology in Kepler they used to enable Adaptive V-sync. This however is purely speculation on my part.
Adaptive V-sync works with other cards, like Fermi. Just marketing on their part. Also, this is nothing like adaptive V-sync, which is essentially V-sync that turns itself on over 60fps. G-Sync is far better than that.
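For reference, the publicly described behaviour of Adaptive V-sync boils down to a simple toggle; this is a paraphrase in code, not Nvidia's actual driver logic:

    # Paraphrase of Adaptive V-sync's publicly described behaviour,
    # not Nvidia's actual driver logic.
    def vsync_should_be_on(current_fps, refresh_hz=60):
        # At or above the refresh rate, sync to eliminate tearing;
        # below it, drop sync to avoid the V-sync stutter penalty.
        return current_fps >= refresh_hz

G-Sync instead removes the fixed refresh tick entirely, so there's no stutter penalty to avoid in the first place.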
I'm more curious about what specifics the monitor requires for it to be compatible with this module.