What's wrong with triple buffering?
Triple buffering is a mitigation strategy for the issues introduced by fixed refresh rates, and one which only addresses some of them.
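As a rough illustration of the part it doesn't address (a toy sketch in Python, not any real swap-chain API): the GPU can keep rendering without stalling, but finished frames still can't reach the screen until the next fixed refresh tick.
```python
# Toy sketch, not any real swap-chain API: with triple buffering the GPU keeps
# rendering into a spare back buffer instead of stalling, but the display still
# only picks up the newest finished frame at each fixed 60Hz tick, so frames
# appear on the tick grid rather than when they were actually rendered.
import math

REFRESH_MS = 1000 / 60  # fixed 60Hz refresh interval

def on_screen_times(render_times_ms):
    """When each frame appears on screen, given how long each took to render."""
    finish, shown = 0.0, []
    for t in render_times_ms:
        finish += t  # the frame is done rendering here...
        shown.append(math.ceil(finish / REFRESH_MS) * REFRESH_MS)  # ...but waits for the next tick
    return shown

# A steady 40 fps game (25 ms per frame) on a 60Hz panel: frames land on an
# uneven 16.7 / 33.3 ms cadence even though the GPU delivers them evenly.
print([round(t, 1) for t in on_screen_times([25, 25, 25, 25])])  # [33.3, 50.0, 83.3, 100.0]
```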
Someone ask Carmack the obvious one.
What's wrong with triple buffering?
G-SYNC requires a GTX 650 Ti Boost or better, so you'd need to buy a new GPU, too. This isn't some arbitrary BS to force folks to upgrade, by the way - Kepler and up have the tech in them to make G-SYNC work.
Ah man I just bought a second monitor
dark10x said:
I'm glad companies are looking into this refresh rate business at last but I just cannot fathom why they continue to stick with LCD.
The amount of brainwashing done by the industry already has most people convinced LCD is the second coming (that's been coming for 15 years), so why change?
Durante said:
This solves the problem at its core.
Strictly speaking I still don't care for variable framerates, no matter how they're displayed, but this at least allows you to control the variance better. I wonder what the chances are of seeing this in handheld devices someday, as outside of the dying dedicated handhelds they universally suffer from unstable, shit framerates, and you don't have the PC option of upgrading them away.
Holy crap, so essentially does this mean every game will have that silky smooth feel of high fps even if, say, it's around 30 fps? :O
I'm a little confused on how it can go past the monitor's 60Hz - I thought the hardware of the monitor itself cannot handle going past that? Is this removing a lock that monitors have, so most can go past 60 to the said 110 if my GPU hits 110 fps?
"I'm a little confused on how it can go past the monitors 60hz, I thought the hardware of the monitor itself cannot handle going past that? Is this removing a lock that monitors have and most can go past 60 to the said 110 if my gpu hits 110 fps?"
The monitors they're talking about are not 60hz monitors, but 120/144hz monitors.
How does it deal with framerates that are higher than the display's refresh rate? You'd have to either drop frames (resulting in slight desync - and therefore slight input lag) or limit the framerate to your display's maximum refresh rate.
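One plausible way to picture that case - my assumption, not something NVIDIA has spelled out in this thread - is that the panel's maximum refresh rate acts as an implicit cap:
```python
# Rough mental model - an assumption on my part, not NVIDIA's documented
# behaviour: a variable-refresh panel still has a maximum refresh rate, so a
# frame that finishes faster than the panel's minimum interval either waits
# (an implicit cap) or gets dropped.
PANEL_MAX_HZ = 144
MIN_INTERVAL_MS = 1000 / PANEL_MAX_HZ  # ~6.9 ms for a 144Hz panel

def effective_interval_ms(render_time_ms):
    """Interval at which frames can actually reach the screen."""
    return max(render_time_ms, MIN_INTERVAL_MS)

print(1000 / effective_interval_ms(5))   # GPU at 200 fps -> shown at ~144 fps
print(1000 / effective_interval_ms(11))  # below the cap, the display just follows the GPU (~91 fps)
```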
Don't 120hz monitors already have "no tearing" with v-sync off though?
How does it deal with framerates that are higher than the display's refresh rate? You'd have to either drop frames (resulting in slight desync - and therefore slight input lag) or limit the framerate to your display's maximum refresh rate.
Nope - why should they?
Don't 120hz monitors already have "no tearing" with v-sync off though?
It should be noted that frame latency is going to be an even more important metric for GPU reviews now. Inconsistent frame latency can still introduce stutter with this tech.
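For example, a review could look at the spread of frame times rather than just the average. A minimal sketch of what that flagging might look like (window size and threshold are arbitrary):
```python
# Minimal sketch of a frame-time consistency check; the window size and the
# 1.5x threshold are arbitrary. Even on a variable-refresh display, a frame
# that takes far longer than its neighbours still shows up as a hitch.
from statistics import median

def stutter_frames(frame_times_ms, factor=1.5):
    """Indices of frames that exceed the median of the last few frames by `factor`."""
    flagged = []
    for i, t in enumerate(frame_times_ms):
        window = frame_times_ms[max(0, i - 10):i + 1]
        if t > factor * median(window):
            flagged.append(i)
    return flagged

# A steady 16-17 ms stream with one 45 ms spike still reads as a stutter:
print(stutter_frames([16, 17, 16, 17, 45, 16, 17]))  # -> [4]
```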
In the demo NVIDIA gave us a couple weeks ago my eyes felt like they were "seeing" the smoothness of 60hz all the way down to about 35-40fps. It was at that point I thought the NVIDIA guys were lying and made them turn on an out-of-demo framecounter like FRAPs. It was at precisely that point I went and grabbed every person at Respawn that I could find. So many jaws on the floor that day, lol.
Sounds just as amazing as I expected. I want one now. But 1440p...
Don't 120hz monitors already have "no tearing" with v-sync off though?
So let me get this straight, if my games are locked at a solid 60 FPS, I would be seeing little to change with G-Sync technology, correct?
So let me get this straight, if my games are locked at a solid 60 FPS, I would be seeing little to change with G-Sync technology, correct?
If you aren't using any kind of vsync you are getting tearing, period. Over the refresh rate, under, doesn't matter.
If they are currently locked with vsync you'll be able to disable vsync and get noticeably less lag.
If they aren't currently locked with vsync you are currently getting screen tearing and with gsync you will no longer be getting screen tearing.
Which means if you have G-Sync, there's no reason to ever turn v-sync on, right?
These are 144hz displays as far as I know.
I don't understand the clock picture. Why would G-Sync eliminate motion blur?
Yup. So beautiful!
Yes. Because v-sync means holding back your game so it only draws a frame when the display needs one. If you miss your chance to give a new frame, you wait for the next opportunity. With this, you make the frame, and when it's done, you show it.
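A quick toy calculation of what that buys you (illustrative numbers only, not real driver behaviour):
```python
# Toy numbers, not real driver behaviour: under v-sync a finished frame sits
# around until the next fixed refresh tick; with G-SYNC the panel starts a
# refresh as soon as the frame arrives, so that waiting time goes away.
import math

REFRESH_MS = 1000 / 60  # fixed 60Hz tick

def wait_with_vsync(finish_time_ms):
    """How long a frame finished at `finish_time_ms` waits before being scanned out."""
    next_tick = math.ceil(finish_time_ms / REFRESH_MS) * REFRESH_MS
    return next_tick - finish_time_ms

def wait_with_gsync(finish_time_ms):
    return 0.0  # the display refreshes when the frame is ready

# A frame that finishes 1 ms after a 60Hz tick waits almost a full 16.7 ms with v-sync:
print(round(wait_with_vsync(17.7), 1))  # ~15.6 ms
print(wait_with_gsync(17.7))            # 0.0 ms
```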
But I just bought my VG248QE...
I'll buy a 2nd monitor anyway!
For everyone coming to the thread. Here's the high-level breakdown of what G-SYNC does:
- No tearing
- No VSync input lag
- No VSync stutter
- Improved input latency compared to VSync Off mode on 'standard' monitors
- Clearer, smoother movement (tech demo shows readable text on a moving object)
- DIY modkit available before the end of the year for the ASUS VG248QE (~$175, aiming to reduce price to ~$130)
- Modded ASUS VG248QE versions available from partners before the end of the year
- Off-the-shelf models available from retailers next year. ASUS G-SYNC VG248QE MSRP is $399.
- Will be in monitors of every resolution and size, right up to 3840x2160 - selection determined by manufacturers like ASUS, BenQ, Philips and Viewsonic
- Requires GTX 650 Ti Boost or better
Mod it with the DIY kit, or have someone do it for you. Takes 20-30 minutes.
This sounds pretty awesome, but of course Nvidia makes it proprietary.
You know what's funny, this solution could fix so many things at the lower end. For example finally displaying movies at their native 24/48Hz, or allowing weaker GPUs to push frames as they finish them.
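To put rough numbers on the movie case (just back-of-the-envelope arithmetic, not from anything in this thread):
```python
# Back-of-the-envelope for the movie case: 24 fps content on a fixed 60Hz panel
# needs 3:2 pulldown, so film frames alternate between 3 and 2 refresh cycles on
# screen; with variable refresh every frame can simply be held for 1/24 s.
TICK_60HZ_MS = 1000 / 60

pulldown_durations = [3 * TICK_60HZ_MS, 2 * TICK_60HZ_MS]
print([round(d, 1) for d in pulldown_durations])  # [50.0, 33.3] -> uneven, visible judder

native_24fps_ms = 1000 / 24
print(round(native_24fps_ms, 1))  # 41.7 ms for every frame, perfectly even
```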
We have a superior, low-persistence mode that should outperform that unofficial implementation, and importantly, it will be available on every G-SYNC monitor.
Yes, of course a big company chooses to make proprietary a technology which they independently researched and coordinated in order to actually profit on all the work they put into it. That shouldn't surprise or really even bother anybody.
If this technology plays out as well as expected, it will become a standard eventually, but a proprietary API paired with select partners is the way for someone to actually make the effort of developing this technology commercially viable.
Yeah the implications of this technology are pretty significant.
I can't believe you guys are gonna make me buy another monitor already. ;_;