Even if this is still an enthusiast thing it seems like a thing that should be done
Honest question, why would you need 120+ fps on a 4k monitor to max them out? Would a solid 60 fps not have the same effect?
"Even if this is still an enthusiast thing it seems like a thing that should be done"
It's actual hardware. On both ends of the connection. I don't see what similarities it has with programmable shaders.
If it works as advertised (which it does, according to everyone who has witnessed it), then all you really need is a system that never dips below 30 or 40 fps.
Frame rate will continue to be important though, even if you need "less" to achieve a similar perceived motion smoothness. Input lag will continue to be affected by your frame rate, so even if 40 fps might "look" as smooth as 60 fps with this technology, you'll still want to achieve higher frame rates for lower input lag on the control side of things.
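To put rough numbers on that (back-of-envelope only; the once-per-frame input sampling below is an assumption, real engines vary):

```python
# Back-of-envelope frame-time math: even if 40 fps "looks" as smooth as
# 60 fps under G-Sync, the render-side contribution to input lag scales
# with frame time. Assumes input is sampled once per rendered frame and
# ignores display/driver overhead.
for fps in (30, 40, 60, 120):
    frame_ms = 1000.0 / fps
    # A click waits on average half a frame to be sampled, then a full
    # frame to be rendered, before it can even start scanning out.
    avg_lag_ms = 0.5 * frame_ms + frame_ms
    print(f"{fps:3d} fps: frame time {frame_ms:5.1f} ms, "
          f"~{avg_lag_ms:5.1f} ms average render-side lag")
```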
"@AndyBNV What does this mean for Lightboost monitors and specifically the ToastyX implementation? How does G-Sync compare to Lightboost strobing? Do they work together, like with 3D Vision?"

There are some technical challenges to combining G-Sync with LightBoost strobing, specifically flicker caused by variable strobing.
The VG248QE is one of the top monitors in this category. However, Lightboost requires a DVI connection, and last time I checked, DisplayPort does not support this feature.
G-Sync doesn't eliminate input lag, but significantly reduces it. Frames will always matter to the type of person who demands the absolute best competitive edge. But eliminating a good chunk of the lag is a major improvement for entry level and mid range cards/systems that want to run higher resolutions, more visual enhancements, or both.
"Honest question, why would you need 120+ fps on a 4k monitor to max them out? Would a solid 60 fps not have the same effect?"

No. You still get the motion blur problem. However, the input lag associated with the frame rate the game is rendering at will continue to be just as important as it is now.
"Oh cool, didn't know Blurbusters had an account on NeoGAF! Keep up the good work, enjoy reading the site."

I have to give John Carmack credit where due; Blur Busters started because of a tweet reply from him -- the same guy who's also worked on this impressive G-Sync stuff with nVidia.
"This technology means the most to entry level cards. The 'no more tearing' thing is nice, but what it does under 60 is just as important."

Yeah, I think people are missing this. The largest visible improvement from G-Sync (and one that should be immediately obvious to most gamers, not just enthusiasts) is likely to occur in the range between 30 and 60 FPS.
"Not exactly."

I might be missing something, why is there any tearing if V-Sync is on?
Yes, people who already game on 120hz monitors and/or beastly GPU configurations probably already have much less tearing than people trying to eke out performance from aging configurations.
Just as tearing occurs when the GPU outpaces the monitor, stutter and lag occur when the monitor outpaces the GPU. G-Sync will be in monitors that can downclock themselves all the way down to 30Hz in order to match the GPU. This significantly reduces input lag at frame rates between 30 and 60 FPS.
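A quick sketch of why the sub-60 range benefits so much (hypothetical 22 ms render times, i.e. roughly 45 fps, against a fixed 60Hz V-Synced display):

```python
import math

# Frames that each take 22 ms to render (~45 fps). On a fixed 60 Hz
# V-Synced display, each finished frame waits for the next 16.7 ms
# refresh boundary, so frames sit on screen for alternating 1 or 2
# refreshes -- visible judder. A variable-refresh display just shows
# each frame the moment it completes.
REFRESH_MS = 1000.0 / 60.0
ready_times = [22.0 * i for i in range(1, 6)]

for t in ready_times:
    vsync_t = math.ceil(t / REFRESH_MS) * REFRESH_MS  # next refresh boundary
    print(f"frame ready {t:6.1f} ms -> 60 Hz V-Sync shows it at "
          f"{vsync_t:6.1f} ms, variable refresh at {t:6.1f} ms")
```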
"(For those living under a rock, LightBoost (for 2D) is a strobe backlight that eliminates motion blur in a CRT-style fashion. It eliminates crosstalk for 3D, but also eliminates motion blur (for 2D too). LightBoost is now more popular for 2D. Just google "lightboost". See the "It's like a CRT" testimonials, the LightBoost media coverage (AnandTech, ArsTechnica, TFTCentral, etc.), the improved Battlefield 3 scores from LightBoost, the photos of 60Hz vs 120Hz vs LightBoost, the science behind strobe backlights, and the LightBoost instructions for existing LightBoost-compatible 120Hz monitors. It is truly an amazing technology that allows LCD to have less motion blur than plasma/CRT. John Carmack uses a LightBoost monitor, and Valve Software talked about strobing solutions too. Now you're no longer living under a rock!)"

I like Lightboost, but it still has a ways to go. It impacts image quality in other areas and is dependent upon high performance from your PC in order to function properly. The motion you can achieve on a CRT is independent of the source.
Strobing isn't limited to TN anymore. Sony's interpolation-free low-lag "Motionflow Impulse" does it (Sony's equivalent of LightBoost), and the Eizo professional FDF2405W 240Hz VA monitor uses a strobe backlight (see page 15 of the PDF manual for strobing information). You must have missed the news of the new, additional strobe backlights under brand names other than "LightBoost". 2014 shall be an interesting year.
I'd like to see those for myself to judge, as I wasn't made aware of the limitations of the technology until using it.
"What I don't get about strobing is why it has to be bound to FPS. Can't you strobe the same frame many times if you keep the strobing frequency extremely high? (Like 1000 Hz) Or is that not technically possible?"

Yes, I'd like to know this as well.
Yes, but variable-rate strobing can create flicker problems.
It seems like it would be necessary for G-Sync as well if you want to achieve the same effect.
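For what it's worth, here's the rough arithmetic on why repeat-strobing the same frame doesn't work (my own illustrative numbers, assuming the eye smoothly tracks a moving object):

```python
# If the same frame is flashed twice (e.g. 120 strobes/s on 60 fps
# content), the tracking eye keeps moving between the two flashes while
# the image doesn't, so the second flash lands on a shifted retinal
# position and is seen as a double image -- the classic CRT 30fps@60Hz
# double-strobe artifact. That's why the strobe has to track the frame
# rate, and why variable frame rates imply variable-rate (flickery)
# strobing.
track_speed_px_per_s = 1000.0
strobes_per_s = 120.0
gap_s = 1.0 / strobes_per_s            # time between repeat flashes
print(track_speed_px_per_s * gap_s)    # ~8.3 px double-image separation
```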
"So, the motion blur won't be reduced to CRT or anything and you'll still need lightboost, but what tech did they use for this screenshot:"
John Carmack (@ID_AA_Carmack) said:
@GuerrillaDawg they didn't talk about it, but this includes an improved lightboost driver, but it is currently a choice -- gsync or flashed.
"Strobing isn't limited to TN anymore. Sony's interpolation-free low-lag "Motionflow Impulse" does it (Sony's equivalent of LightBoost), and the Eizo professional FDF2405W 240Hz VA monitor uses a strobe backlight..."

Wow, I didn't know that; seems like Sony is doing good work as well.
"The problem I have with LightBoost is that it gives me a mild headache. Do you get used to it after a while?"

No... It depends on the human.
Some comments:
(1) Static photography does not accurately capture display motion blur. You need a pursuit camera.
HOWTO: Using a Pursuit Camera with TestUFO Motion Tests
For example, you can only photographically capture the checkerboard pattern at http://www.testufo.com/eyetracking#pattern=checkerboard if you pan the camera along with the UFO while taking a picture.
(2) G-Sync definitely does decrease motion blur, but not to LightBoost levels (for locked 120fps@120Hz motion, for panning similar to www.testufo.com/photo fluidity).
G-Sync is better for variable frame rate situations, but if you've got a Titan and max out at 144fps@144Hz, it still has more motion blur than LightBoost 120fps@120Hz.
Also, it is possible they may have already come up with a method of variable-rate strobing, but I'd like to hear a confirmation about this. And is this station running G-Sync, or running LightBoost? John Carmack has mentioned that they've enhanced LightBoost in the ASUS G-Sync upgrade board, but he says it's an "either-or" option.
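To put rough numbers on the blur comparison in (2) above (standard sample-and-hold math; the ~1.4 ms LightBoost strobe length is approximate and depends on the brightness setting):

```python
# Perceived motion blur on a tracked object is roughly eye-tracking
# speed multiplied by how long each frame stays lit (persistence).
def blur_px(speed_px_per_s, persistence_ms):
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960.0  # px/s, a TestUFO-style panning speed
print(blur_px(speed, 1000.0 / 60.0))   # 60 Hz sample-and-hold: ~16 px
print(blur_px(speed, 1000.0 / 144.0))  # 144fps@144Hz G-Sync:   ~6.7 px
print(blur_px(speed, 1.4))             # LightBoost ~1.4 ms:    ~1.3 px
```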
"No... It depends on the human."

Hmm, I don't recall having problems with CRTs when I was younger. I haven't owned a CRT screen in a while though.
For me specifically, it actually reduces eyestrain, because I get more headaches from fast gaming motion blur than from strobing. Eliminating the motion blur fixes it for me. Others such as CallSignVega agree. Also, Eizo's FDF2405W brochure says the 240Hz mode (strobe mode) can reduce eyestrain during map panning, so there's solid credibility behind this. But it is a tradeoff if you are more sensitive to flicker than to motion blur. (I'm VERY sensitive to motion blur.) Fortunately, ToastyX Strobelight makes it easy to turn off, so you can turn it on only during games.
LightBoost is good for people who are used to CRTs. If you never got CRT eyestrain, then it's easy to get used to LightBoost, since LightBoost behaves like a 120Hz CRT. However, it's not for people who get headaches with plasmas/CRTs.
Adjusting strobe brightness downwards and increasing ambient lighting in the room helps greatly -- common recommendations from back in the CRT tube days, too.
It's nice, but I do not see this as a huge improvement compared to a triple-buffered and V-synced image. The average person is not going to notice a significant difference.
Can anyone explain exactly why this needs an Nvidia (and a specific series onwards at that) card to function? It's built into the monitor so will only be receiving what the graphics card sends over HDMI/DVI/Displayport right? Is there something that's been built into the hardware of these cards for a while now that tells the GSync hardware what to do? Or is it just dependent on driver support and Nvidia no longer support driver updates on these older cards?
If it's hardware based then AMD users are screwed unless a licensing agreement is reached.
If it's driver based, it's similar on the licensing front, but there is the possibility of third-party mods.
If the G-Sync module is doing things on its own without special signals from the card/drivers, then there is no reason AMD cards (or even games consoles?) wouldn't work with G-Sync monitors. This would be good for everyone. It's unlikely this will be the case.
Isn't there a reason why monitors can't just change their refresh rate to whatever? I mean, maybe this is just nVidia being confusing by using terms like MHz, but they're making it sound like they can change the monitor's Hz to any figure they need to match the frame rate, and for some reason I just don't think that's the case.
nVidia said their cards indeed have special hardware in them which is used to make G-Sync work, which is why the feature isn't supported by e.g. older nVidia cards.
Good to know. Thanks.
"Can anyone explain exactly why this needs an Nvidia (and a specific series onwards at that) card to function? It's built into the monitor so will only be receiving what the graphics card sends over HDMI/DVI/Displayport right?"

Because what is sent over the DisplayPort cable for G-Sync is not a normal DisplayPort signal.
I suspect it's simply a variable-length vertical blanking interval, possibly with additional signalling added within. I have a good understanding of the front porch, back porch, and sync pixels, in both the horizontal and vertical dimensions. DisplayPort still maintains the legacy blanking intervals. (You can witness this by playing with the nVidia Custom Resolution Utility, the ToastyX Custom Resolution Utility, or even the very old PowerStrip utility popular in the late '90s/early '00s.)
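If it really is just a stretched blanking interval, the arithmetic works out neatly. A sketch with plausible made-up timing numbers (not actual G-Sync module values):

```python
# Refresh interval from classic modeline math: pixel clock divided by
# total pixels per frame. Holding the panel in vertical blanking
# (stretching vtotal) until the next frame arrives is all a
# variable-refresh scheme needs.
pixel_clock_hz = 330_000_000   # hypothetical dotclock for a ~144 Hz scan
h_total = 2080                 # active + blanking pixels per scanline
v_total_min = 1103             # shortest frame -> fastest refresh

line_time_s = h_total / pixel_clock_hz

def refresh_hz(v_total):
    return 1.0 / (line_time_s * v_total)

print(refresh_hz(v_total_min))      # ~143.8 Hz, frame scanned out in ~7 ms
print(refresh_hz(2 * v_total_min))  # wait one extra frame-time: ~71.9 Hz
```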
This makes sense. And I do remember tweaking front/back porch and other signalling parameters to get the most out of my 22" CRT in PowerStrip way back when. On a Matrox GPU. Those were the times.
The pixel dotclock keeps running at its constant 144Hz scan rate, to keep frame transmission times very quick. (Future faster dotclocks could be done, such as 240Hz or 480Hz, later this decade.)
Exactly. Until frames start getting delayed by frame transmission times (e.g. trying to do more than 144fps). Fortunately G-Sync can gain higher refresh rate caps -- on tomorrow's 200Hz, 240Hz, 500Hz, etc. monitors.
If this is true, then the latency of G-Sync should basically always be the best-case latency of 144Hz without V-Sync. Pretty awesome.
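Rough numbers on that (assuming scanout always runs at the full 144Hz rate, per the dotclock point above):

```python
# Every frame is transmitted/scanned out in ~6.9 ms regardless of how
# low the frame rate drops, and a finished frame never waits for a
# refresh boundary.
scan_ms = 1000.0 / 144.0
print(f"scanout time: {scan_ms:.2f} ms")               # ~6.94 ms

# Contrast: fixed 60 Hz V-Sync can add 0-16.7 ms of waiting before
# scanout even begins.
print(f"worst-case 60 Hz V-Sync wait: {1000.0 / 60.0:.2f} ms")
```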
The panel with John Carmack, Tim Sweeney and Johan Andersson if you missed it:
http://youtu.be/MH2hjhcfWic
"Hmm, I don't recall having problems with CRTs when I was younger. I haven't owned a CRT screen in a while though."

Also, other possible causes of LightBoost discomfort (other than an intolerance to 120Hz CRTs):
Yeah switching lightboost on and off is pretty straightforward. I'll give it a go again with your additional advice after my exams, thanks.