
Introducing G-SYNC Pulsar tech

HeisenbergFX4

Gold Member


G-SYNC Displays Dazzle At CES 2024: G-SYNC Pulsar Tech Unveiled, G-SYNC Comes To GeForce NOW, Plus 24 New Models​

By Guillermo Siman & Andrew Burnes on January 08, 2024 | Featured Stories: G-SYNC, GeForce NOW, GeForce RTX GPUs, Hardware, NVIDIA Reflex, Pulsar
In the ever-evolving realm of gaming technology, NVIDIA has consistently pioneered technologies that have redefined experiences for users.
In 2013, the introduction of G-SYNC revolutionized display technology, and G-SYNC has continued bringing new advancements in displays ever since.
Today, you can buy hundreds of G-SYNC, G-SYNC ULTIMATE, and G-SYNC Compatible gaming monitors, displays, and TVs, ensuring a fantastically smooth, stutter-free experience when gaming on GeForce graphics cards.
At CES 2024, our partners at Alienware, AOC, ASUS, Dough, IO Data, LG, Philips, Thermaltake, and ViewSonic announced 24 new models that give further choice to consumers. Among them are the Alienware AW3225QF, one of the world’s first 240Hz 4K OLED gaming monitors; the Philips Evnia 49M2C8900, one of the world’s first ultrawide DQHD 240Hz OLED gaming monitors; and LG’s new 2024 144Hz OLED and wireless TV lineup, available in a multitude of screen sizes, from 48 inches to a whopping 97 inches.
At CES, we also announced two new G-SYNC innovations: G-SYNC on GeForce NOW, and G-SYNC Pulsar technology.
G-SYNC technology comes to the cloud with GeForce NOW, vastly improving the visual fidelity of streaming to displays that support G-SYNC. Members will see minimized stutter and latency, for an experience nearly indistinguishable from local gaming.
G-SYNC Pulsar is the next evolution of Variable Refresh Rate (VRR) technology, not only delivering a stutter-free experience and buttery smooth motion, but also a new gold standard for visual clarity and fidelity through the invention of variable frequency strobing. This boosts effective motion clarity to over 1000Hz on the debut ASUS ROG Swift PG27 Series G-SYNC gaming monitor, launching later this year.

nvidia-g-sync-pulsar-comparison-stacked.png





With G-SYNC Pulsar, the clarity and visibility of content in motion is significantly improved, enabling you to track and shoot targets with increased precision
No longer will users have to choose between smooth variable refresh rates and the improved motion clarity of Ultra Low Motion Blur (ULMB): our new G-SYNC Pulsar technology delivers the benefits of both, working in harmony to deliver the definitive gaming monitor experience.

PG27AQN-R.png





ASUS ROG Swift PG27 Series G-SYNC gaming monitors with Pulsar technology launch later this year
Read on for further details about our new CES 2024 G-SYNC announcements.

The Next Generation: Introducing G-SYNC Pulsar Technology​

NVIDIA’s journey in display technology has been marked by relentless innovation. The launch of G-SYNC in 2013 eradicated stutter by harmonizing the monitor’s refresh rate with the GPU’s output. This was just the beginning.
Following G-SYNC, we introduced Variable Overdrive, enhancing motion clarity by intelligently adjusting pixel response times. Then we incorporated HDR, bringing richer, more vibrant colors and deeper contrast, transforming the visual experience. Additionally, the expansion into broader color gamuts allowed for more lifelike and diverse color representation, while the introduction of wider refresh rate ranges further refined the smoothness of on-screen action—catering to both high-speed gaming and cinematic quality, and everything in between.

g-sync-timeline.png





The Breakthrough Innovation Of G-SYNC Pulsar​

Traditional VRR technologies dynamically adjust the display’s refresh rate to match the GPU’s frame rate, effectively eliminating stutter.

To evolve VRR further, the aspiration has always been to unify it with advanced strobing techniques that eliminate display motion blur (not to be confused with in-game motion blur visual effects). Display motion blur is caused both by slow LCD transitions and by the persistence of an image on the retina as our eyes track movement on-screen. Slow pixel transitions can’t keep up with fast-moving objects, leading to a smear effect; these can be eliminated by Variable Overdrive. But motion blur born of image persistence in the eye can only be removed by strobing the backlight.
However, strobing the backlight at a frequency that is not fixed causes serious flicker—which, until now, had prevented effective use of the technique in VRR displays.
For over a decade, our engineers have pursued the challenge of marrying the fluidity of VRR timing with the precise timing needed for effective advanced strobing.
The solution was a novel algorithm that dynamically adjusts strobing patterns to varying render rates. NVIDIA’s new G-SYNC Pulsar technology marks a significant breakthrough by synergizing two pivotal elements: Adaptive Overdrive and Pulse Modulation.
With Adaptive Overdrive, G-SYNC Pulsar dynamically adjusts the rate at which pixels transition from one color to another, a vital technique to reduce motion blur and ghosting. This process is complicated by VRR technology, where the refresh rate fluctuates in tandem with the GPU's output. G-SYNC Pulsar’s solution modulates overdrive based on both screen location and refresh rate—ensuring that clarity and blur reduction are maintained across a spectrum of speeds, and across the entire screen space.
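Conceptually, the refresh-rate half of Adaptive Overdrive behaves like interpolation between overdrive settings calibrated at different refresh rates. A minimal sketch under that assumption, with entirely illustrative gain values (the real algorithm also varies by screen location and is proprietary):

```python
# Hypothetical calibration points: overdrive gain tuned at two refresh
# rates, linearly interpolated for whatever rate VRR is running at now.
CAL = {144: 0.35, 360: 0.60}  # illustrative values, not real panel data

def overdrive_gain(refresh_hz, lo=144, hi=360):
    # Clamp to the calibrated window, then interpolate between endpoints.
    t = max(0.0, min(1.0, (refresh_hz - lo) / (hi - lo)))
    return CAL[lo] + t * (CAL[hi] - CAL[lo])
```

On a fixed-refresh display one tuned gain suffices; under VRR the gain must track the constantly changing frame time, which is why overdrive and refresh rate have to be solved together.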
Complementing this, the technology also intelligently controls the pulse's brightness and duration—key to maintaining visual comfort and eliminating flicker. Flickering, often a byproduct of strobing methods used to diminish motion blur, can disrupt the gaming experience and cause viewer discomfort. By adaptively tuning backlight pulses in response to the constantly changing game render rate, G-SYNC Pulsar creates a consistent and comfortable viewing experience, effectively accommodating the display's dynamic nature.
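The brightness-compensation side of pulse modulation can be sketched as follows. This is an illustration of the general principle, not NVIDIA's algorithm; all names and values are assumptions:

```python
MAX_DRIVE = 20.0  # hypothetical backlight headroom over steady-state level

def strobe_pulse(frame_time_ms, persistence_ms=0.7, target=1.0):
    # Keep the pulse width (persistence) fixed so motion blur stays
    # constant, and raise the drive level as the VRR frame time grows,
    # so average brightness (duty_cycle * drive) stays flat and the
    # varying strobe frequency doesn't read as flicker.
    duty_cycle = persistence_ms / frame_time_ms
    drive = min(target / duty_cycle, MAX_DRIVE)
    return persistence_ms, drive
```

A frame rendered at 144 FPS stays on screen far longer than one at 360 FPS, so without this compensation the image would visibly dim and brighten as the render rate fluctuates.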
Merging these two adaptive strategies, G-SYNC Pulsar transcends previous challenges associated with enhancing VRR with strobing backlight techniques. Prior attempts have often stumbled, leading to flickering and diminished motion clarity. However, G-SYNC Pulsar’s innovation ensures perfect synchronization between overdrive and backlight pulse with the screen's refresh cycle.
This represents a leap beyond incremental updates or a combination of existing technologies: it is a radical rethinking of display technology—necessitating the development of new panel technology, and representing a fundamental reengineering at both hardware and software levels.








g-sync-pulsar-tech-explainer-1.png



g-sync-pulsar-tech-explainer-2.png



g-sync-pulsar-tech-explainer-3.png









The resulting gaming experience is transformative: each frame is delivered with both stutter-free smoothness and motion clarity that is effectively quadruple its baseline refresh rate, enabling a truly immersive and uninterrupted visual journey for gamers, even in the most intense and fast-paced games.
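The "quadruple" figure follows from persistence arithmetic: perceived motion blur scales with how long each frame stays lit, so a strobe that lights the panel for a quarter of each refresh cycle matches the clarity of a sample-and-hold display running four times faster. The numbers below are illustrative, not NVIDIA's measured spec:

```python
def effective_motion_clarity_hz(persistence_s):
    # "Effective Hz" = the sample-and-hold refresh rate whose frame
    # persistence equals this display's lit time per refresh.
    return 1.0 / persistence_s

sample_and_hold = effective_motion_clarity_hz(1 / 360)  # full-persistence 360Hz panel
strobed = effective_motion_clarity_hz(0.25 / 360)       # pulse lit for 25% of the cycle
```

Under these assumptions a 360Hz panel strobed at 25% duty reaches an effective 1440Hz, consistent with the "over 1000Hz" claim for the PG27 Series.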

G-SYNC Pulsar Demo: See The Technology In Action​

The true impact of G-SYNC Pulsar is unmistakable in action. When enabled, the technology offers starkly smoother scenes and sharper clarity, surpassing the capabilities of earlier monitor technologies that attempted to combine VRR and strobing.
This advancement offers distinct advantages for various gaming genres. In competitive gaming, the elimination of stuttering is crucial, as these distractions can impede performance and affect outcomes. Similarly, enhanced motion clarity can provide a competitive edge, where precise tracking and response to fast-moving elements and friend-foe distinction are paramount. For immersive games, the technology's ability to maintain consistent smoothness and clarity enhances the player's sense of being part of the game world, free from immersion-breaking visual artifacts.

Furthermore, G-SYNC Pulsar simplifies the user experience by eliminating the need to switch between different monitor settings for either VRR or strobing technologies. Whether it's for the high-stakes environment of competitive gaming, or the rich, detailed worlds of immersive titles, G-SYNC Pulsar delivers a superior and convenient visual experience tailored to all facets of gaming.
In the video below, a 1000 FPS high-speed pursuit camera recorded Counter-Strike 2 running identically on a 360Hz G-SYNC monitor, first with Pulsar technology enabled, then with it disabled. Played back at 1/24 speed, the reduction of monitor-based motion blur on the G-SYNC Pulsar display is immediately evident, greatly improving clarity, fidelity, target tracking and target acquisition, helping improve hit rate and making users more competitive online.





With G-SYNC Pulsar, the clarity and visibility of content in motion is significantly improved, enabling you to track and shoot targets with increased precision

G-SYNC Technology Comes To GeForce NOW​

G-SYNC technology will be coming soon to the cloud with GeForce NOW, raising the bar even higher for high-performance game streaming.
With the introduction of the Ultimate tier, GeForce NOW delivered improved visual graphics in the cloud by varying the stream rate to the client, driving down total latency on Reflex-enabled games.
Newly improved cloud G-SYNC technology goes even further by varying the display refresh rate for smooth and instantaneous frame updates for variable refresh rate monitors, fully optimized for G-SYNC capable monitors, providing members with the smoothest tear-free gaming experience from the cloud.
Ultimate members will also soon be able to utilize Reflex in supported titles at up to 4K resolution and 60 or 120 FPS streaming modes, for low-latency gaming on nearly any device.
With both Cloud G-SYNC and Reflex, members will feel as if they’re connected directly to GeForce NOW servers, for a visual experience that is smoother, clearer, and more immersive than ever before.

24 New G-SYNC Gaming Monitors, Displays & TVs Coming Soon​

There are thousands of monitors, displays and TVs out there, but only the best receive the G-SYNC badge of honor. Monitors with dedicated G-SYNC modules push the limits of technology, reaching the highest possible refresh rates, and enabling advanced features such as G-SYNC esports mode. And G-SYNC ULTIMATE displays deliver stunning HDR experiences.

g-sync-advantages.png





Hundreds of other displays are validated by NVIDIA as G-SYNC Compatible, giving confidence for buyers looking for displays that don’t blank, pulse, flicker, ghost, or otherwise artifact during Variable Refresh Rate (VRR) gaming. G-SYNC Compatible also ensures that a display operates in VRR at any game frame rate by supporting a VRR range of at least 2.4:1 (e.g. 60Hz-144Hz), and offers the gamer a seamless experience by enabling VRR by default on GeForce GPUs.
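The ≥2.4:1 range requirement is what makes low framerate compensation possible: when a game drops below the panel's minimum refresh rate, the driver can repeat each frame at an integer multiple of the render rate and still land inside the VRR window (a 2:1 ratio is the hard minimum for this; 2.4:1 leaves margin). A sketch of the common approach, not NVIDIA's exact implementation:

```python
import math

def lfc_refresh_rate(fps, vrr_min=60, vrr_max=144):
    # Inside the window, the refresh rate tracks the render rate directly.
    if fps >= vrr_min:
        return min(fps, vrr_max)
    # Below it, show each frame an integer number of times so the
    # physical refresh rate climbs back into the supported range.
    multiple = math.ceil(vrr_min / fps)
    return fps * multiple
```

For example, on a 60Hz-144Hz display, 40 FPS is shown at 80Hz (each frame twice); a hypothetical 60Hz-100Hz display (1.67:1) would have no valid multiple for 59 FPS, which is why the wider range is required.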
Manufacturers are increasingly adopting VRR, and many incorporate game modes and additional features that enhance the PC experience, whether gaming on a monitor or TV. To ensure a flawless out of the box experience, partners such as Alienware, AOC, ASUS, Dough, IO Data, LG, Philips, Thermaltake, and ViewSonic share their displays with us for testing and optimization.
At CES 2024, our partners unveiled 24 new G-SYNC gaming monitors, displays and TVs, from a 16 inch portable monitor, all the way up to 97 inch TVs.
The Alienware AW3225QF, launching this January, is one of the world’s first 240Hz 4K OLED gaming monitors. Spanning 32 inches, the curved display has a wide VRR range, from 40Hz up to its headline-grabbing 240Hz maximum refresh rate. Alienware boasts that the AW3225QF has a 0.03ms gray-to-gray minimum response time, Dolby Vision® HDR, and VESA DisplayHDR True Black 400. And for peace of mind, there’s a three-year OLED burn-in warranty.

aw3225qf-cfp-00030rf095-wh.png





The Philips Evnia 49M2C8900 is one of the world’s first 5120x1440 (DQHD) 240Hz OLED gaming monitors, boasting a 48.9 inch curved screen. It’s DisplayHDR True Black 400 certified, for increased color accuracy and brightness, and VESA ClearMR-certified, for increased motion clarity. A Smart Image Game Mode tweaks the display output based on what you’re playing, and Philips’ Ambiglow technology, built into the back of the monitor, illuminates your surroundings with color that matches the on-screen action, heightening immersion.

philips-evnia-49m2c8900.png





LG came to CES with 4 new IPS monitors and 5 new 2024 TV lines, with 17 different size options. A notable highlight is the LG SIGNATURE OLED M4, the world’s first G-SYNC Compatible 144Hz TV. The striking wireless OLED TV lineup offers selections ranging from the versatile 65-inch model to the massive 97-inch giant, and is completely free of cables excepting the power cord, for cleaner, distraction-free viewing. The M4 is also the world’s first TV with wireless video and audio transmission at up to 4K 144Hz, delivering superior OLED performance with accurate details and an elevated sense of immersion.

lg-g4-g-sync-compatible-display-ces-2024.png





Below, you can check out the full specs of all 24 new G-SYNC gaming monitors, displays and TVs, to see if there’s something that meets your requirements. If not, head to our complete list of G-SYNC displays, dialing in on the perfect display with our filters.

| Manufacturer | Model | Size (Inches) | Panel Type | Resolution | Refresh Rate |
| --- | --- | --- | --- | --- | --- |
| Alienware | AW3225QF | 32 | OLED | 3840x2160 (4K) | 240Hz |
| ASUS | ROG Swift PG27 series | 27 | IPS | 2560x1440 (QHD) | 360Hz |
| ASUS | PG49WCD | 49 | OLED | 5120x1440 (DQHD) | 144Hz |
| AOC | 16G3 | 16 | IPS | 1920x1080 (FHD) | 144Hz |
| AOC | 24G4 | 24 | IPS | 1920x1080 (FHD) | 165Hz |
| AOC | 27G4 | 27 | IPS | 1920x1080 (FHD) | 165Hz |
| AOC | PD49 | 49 | OLED | 5120x1440 (DQHD) | 240Hz |
| AOC | Q27G2SD | 27 | IPS | 2560x1440 (QHD) | 180Hz |
| Dough | ES07E2D | 27 | OLED | 2560x1440 (QHD) | 240Hz |
| IO Data | GCU271HXA | 27 | IPS | 3840x2160 (4K) | 160Hz |
| LG | 2024 4K M4 / G4 | 97 | OLED | 3840x2160 (4K) | 120Hz |
| LG | 2024 4K M4 / G4 | 83, 77, 65, 55 | OLED | 3840x2160 (4K) | 144Hz |
| LG | 2024 4K C4 series | 83, 77, 65, 55, 48, 42 | OLED | 3840x2160 (4K) | 144Hz |
| LG | 2024 4K CS series | 65, 55 | OLED | 3840x2160 (4K) | 120Hz |
| LG | 2024 4K B4 series | 77, 65, 55, 48 | OLED | 3840x2160 (4K) | 120Hz |
| LG | 24G560F | 24 | IPS | 1920x1080 (FHD) | 180Hz |
| LG | 27G560F | 27 | IPS | 1920x1080 (FHD) | 180Hz |
| LG | 27GR75QB | 27 | IPS | 2560x1440 (QHD) | 144Hz |
| LG | 32GP75A | 32 | IPS | 2560x1440 (QHD) | 165Hz |
| Philips | 25M2N3200 | 25 | IPS | 1920x1080 (FHD) | 180Hz |
| Philips | 27M1N5500P | 27 | IPS | 2560x1440 (QHD) | 240Hz |
| Philips | 49M2C8900 | 49 | OLED | 5120x1440 (DQHD) | 240Hz |
| Thermaltake | 27FTQB | 27 | IPS | 2560x1440 (QHD) | 165Hz |
| ViewSonic | XG272-2K-OLED | 27 | OLED | 2560x1440 (QHD) | 240Hz |

For word on future G-SYNC ULTIMATE, G-SYNC, and G-SYNC Compatible displays, keep an eye on our GeForce Game Ready Driver announcements, where support for new monitors, displays and TVs is noted.

G-SYNC At CES 2024​

Whether you game on a monitor, display or TV, locally on a PC or laptop, or via GeForce NOW in the cloud, there are now even more ways to enhance your gaming experiences with G-SYNC technology.
G-SYNC technology on GeForce NOW makes the service’s fantastic streaming experience even better.
24 new G-SYNC gaming monitors, displays and TVs ensure there’s a perfect smooth and stutter free display for everyone.
And with the introduction of G-SYNC Pulsar, NVIDIA has addressed a challenge that has persisted for over a decade. G-SYNC Pulsar stands as a testament to NVIDIA's unwavering commitment to innovation and redefining the boundaries of gaming technology, offering gamers a visual experience that is smoother, clearer, and more immersive than ever before.
For news about new G-SYNC monitors, stay tuned to GeForce.com, where you can also check out our other announcements from CES 2024.
 

Dacvak

No one shall be brought before our LORD David Bowie without the true and secret knowledge of the Photoshop. For in that time, so shall He appear.
Damn, now I want an LG G4 😩
 

Danny Dudekisser

I paid good money for this Dynex!
It does sound cool, though GSYNC's performance has been so inconsistent for me over the years that I'm pretty skeptical of how well the implementation will actually work.
 

sertopico

Member
Oh, I was hoping they were finally gonna drop the proprietary modules and technologies after all these years, since FreeSync does almost all of it with no extra tax to pay to nvidia... Not gonna happen apparently.

Correct me if I'm wrong, but (QD) OLED owners do not need this new Pulsar thing, right?
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
Oh, I was hoping they were finally gonna drop the proprietary modules and technologies after all these years, since FreeSync does almost all of it with no extra tax to pay to nvidia... Not gonna happen apparently.

Correct me if I'm wrong, but (QD) OLED owners do not need this new Pulsar thing, right?
Nobody NEEDS it since we have functioned this long without it.

But if you are running at a full 240 fps, the motion will be so smooth at that point that the (likely) loss of brightness may not be worth it. Even 120 fps on an OLED is very smooth.
 
Last edited:

sertopico

Member
Nobody NEEDS it since we have functioned this long without it.

But if you are running at a full 240 fps, the motion will be so smooth at that point that the (likely) loss of brightness may not be worth it. Even 120 fps on an OLED is very smooth.
Understood, thanks.
Personally I will never be in need of such high framerates, I don't play any competitive FPS and such. Actually, I even brought my current Alienware's refresh rate down to 144Hz in order to get 10 bit color depth.
 

nkarafo

Member
Imagine needing all those fancy high tech advancements, in both hardware and software, powered by the latest graphics cards and monitors in 2024, just to be able to reach the motion clarity of an old CRT you had in the 90's.

Shows how abysmal the flat panel technology was, is and will continue to be.
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
Is this basically some kind of fancy black frame insertion that works with VRR? Even if they could drastically reduce the flicker, wouldn't that have a dramatic effect on brightness?
BFI absolutely has an effect on brightness. On my LGC1, I could compensate by adjusting the peak brightness.
 

Ulysses 31

Member
Imagine needing all those fancy high tech advancements, in both hardware and software, powered by the latest graphics cards and monitors in 2024, just to be able to reach the motion clarity of an old CRT you had in the 90's.

Shows how abysmal the flat panel technology was, is and will continue to be.
Soo you don't welcome this improvement in motion clarity in flat panels? :lollipop_confounded:
 

Porticus

Member
Imagine needing all those fancy high tech advancements, in both hardware and software, powered by the latest graphics cards and monitors in 2024, just to be able to reach the motion clarity of an old CRT you had in the 90's.

Shows how abysmal the flat panel technology was, is and will continue to be.

I don't care about motion clarity when the image is a shit smearing turd.
 

nkarafo

Member
Soo you don't welcome this improvement in motion clarity in flat panels? :lollipop_confounded:
I'm not very enthusiastic, no. Because this is another Nvidia exclusive, and they're going to lock it to whatever cards they want to sell this year. And a feature for some premium gaming monitors maybe?

It's not an improvement for the flat panel technology in general (that would still be 20 years too late).

I don't care about motion clarity when the image is a shit smearing turd.
Well, if you don't want the image to be a shit smearing turd you should care bout motion clarity.
 
Last edited:
Imagine needing all those fancy high tech advancements, in both hardware and software, powered by the latest graphics cards and monitors in 2024, just to be able to reach the motion clarity of an old CRT you had in the 90's.

Shows how abysmal the flat panel technology was, is and will continue to be.
a 90s CRT was 20", 50lbs+, wasnt flat, had to be manually adjusted (convergence, geometry, etc.), had to be degaussed (that was cool though), had phosphor streaking or other nonsense, didnt have perfect blacks, had worse contrast, had to warm up, had inherent image blurriness/smoothing, etc etc.

basically worse in every other way.

early flat panels were absolute garbage (insulting really), plasma was pretty good, but competent modern flat panels are just better.
CRTs are a special-use case now, nothing more.
 
Imagine needing all those fancy high tech advancements, in both hardware and software, powered by the latest graphics cards and monitors in 2024, just to be able to reach the motion clarity of an old CRT you had in the 90's.

Shows how abysmal the flat panel technology was, is and will continue to be.
I enjoy CRT as much as the next man. I have a JVC D-Series in my bedroom. But let’s be real here.

CRT has its own problems that are solved by fixed-pixel displays: physical distortion, flicker, moire, poor geometry, radio interference, bloom, breathing, size and weight.

If we can get displays that are bright enough to compensate for black frame insertion then I think we are good to go.
 

nkarafo

Member
I enjoy CRT as much as the next man. I have a JVC D-Series in my bedroom. But let’s be real here.

CRT has its own problems that are solved by fixed-pixel displays: physical distortion, flicker, moire, poor geometry, radio interference, bloom, breathing, size and weight.

If we can get displays that are bright enough to compensate for black frame insertion then I think we are good to go.
Some of these don't apply to all CRTs, like in some higher quality ones or high res PC monitors at 75hz+. Yeah they do have their own problems but most of those don't really affect the actual image that's displayed. Biggest complaint is weight and size but that doesn't have anything to do with image quality. LCD ghosting is a far more serious issue IMO.
 

Kenpachii

Member
I moved from a 100hz flat screen CRT playing Counter-Strike on it to an LCD with 30ms ghosting that cost 1k. Shooters were unplayable, one big smear fest.

It's funny how far into the future we really were back in the day, and only now are we getting back.
 

iHaunter

Member
I enjoy CRT as much as the next man. I have a JVC D-Series in my bedroom. But let’s be real here.

CRT has its own problems that are solved by fixed-pixel displays: physical distortion, flicker, moire, poor geometry, radio interference, bloom, breathing, size and weight.

If we can get displays that are bright enough to compensate for black frame insertion then I think we are good to go.
Everything except motion clarity is far worse on CRTs.
 

dave_d

Member
a 90s CRT was 20", 50lbs+, wasnt flat, had to be manually adjusted (convergence, geometry, etc.), had to be degaussed (that was cool though), had phosphorous streaking or other nonsense, didnt have perfect blacks, had worse contrast, had to warm up, had inherent image blurriness/smoothing, etc etc.

basically worse in every other way.

early flat panels were absolute garbage (insulting really), plasma was pretty good, but competent modern flat panels are just better.
CRTs are a special-use case now, nothing more.
And don't forget, CRTs had burn in. Not sure which is worse for burn in, CRTs or OLEDs. (If anybody has gone to FunSpot they can tell you how many of those old games have burn in.)
 

dave_d

Member
And yet they don't have anywhere near the motion clarity of CRTs, just like any other modern display without any form of BFI.
I could always point out CRTs basically have built in BFI. (Ok, rolling scan but a similar idea. BlurBuster has an article on that.)
 

SHA

Member
Can Jensen at least unleash all his shenanigans into a single product at once, instead of adding incremental steps and turning it into hundreds of products?
 

dave_d

Member
Everything except motion clarity is far worse on CRTs.
Well, to be fair to CRTs, they can produce more different colors than an LCD since they are analog. This is what they're talking about when they say CRTs have better color reproduction. Of course it'll pretty much be the wrong color, sometimes really wrong. (Hey, NTSC didn't get the nickname "Never The Same Color" for nothing.)
 
Oh, I was hoping they were finally gonna drop the proprietary modules and technologies after all these years, since FreeSync does almost all of it with no extra tax to pay to nvidia... Not gonna happen apparently.

Correct me if I'm wrong, but (QD) OLED owners do not need this new Pulsar thing, right?
You have no idea how much worse those "free" VRR displays are vs a real module Gsync panel. If your Freesync monitor has some way of displaying the dynamic refresh rate, often referred to as the fps display, then I suggest you turn it on and play some game with a locked framerate cap and watch how piss poor of a job those free scalers handle VRR. Not very good I can assure you. I made a video about this a few years ago to demonstrate how these garbage Freesync displays work:
 

sertopico

Member
You have no idea how much worse those "free" VRR displays are vs a real module Gsync panel. If your Freesync monitor has some way of displaying the dynamic refresh rate, often referred to as the fps display, then I suggest you turn it on and play some game with a locked framerate cap and watch how piss poor of a job those free scalers handle VRR. Not very good I can assure you. I made a video about this a few years ago to demonstrate how these garbage Freesync displays work:

I am aware of the issue, although I didn't test it myself yet.
At the same time I wouldn't personally use any frame cap since I bought a VRR display and therefore I want to use the full range/fps my 4090 is able to produce. The only thing that bothers me a bit from my current monitor, an AW3423DWF, is the flickering that occurs when I'm in game menus, specifically in Cyberpunk.
 
Last edited:
I am aware of the issue, although I didn't test it myself yet.
At the same time I wouldn't personally use any frame cap since I bought a VRR display and therefore I want to use the full range/fps my 4090 is able to produce. The only thing that bothers me a bit from my current monitor, an AW3423DWF, is the flickering that occurs when I'm in game menus, specifically in Cyberpunk.
I don't know how familiar you are with VRR, but it is not active at max refresh rate. In other words, if you don't framerate limit to below your max refresh rate, VRR disengages and it defers to regular vsync instead. Then you get a bump in input lag and noticeable stutter as it switches on and off. That's why Nvidia will intentionally use a framerate limiter with Gsync enabled when you also turn on Ultra low latency. Gsync just doesn't work without being below the max refresh rate.
 

sertopico

Member
I don't know how familiar you are with VRR, but it is not active at max refresh rate. In other words, if you don't framerate limit to below your max refresh rate, VRR disengages and it defers to regular vsync instead. Then you get a bump in input lag and noticeable stutter as it switches on and off. That's why Nvidia will intentionally use a framerate limiter with Gsync enabled when you also turn on Ultra low latency. Gsync just doesn't work without being below the max refresh rate.
Ouch that sucks.

But I think I won't have this issue: none of the games I'm playing reach my monitor's maximum refresh rate, so they never hit the framerate wall. Plus, with frame generation on, your max framerate is automatically capped through Reflex. I checked and it doesn't go above 138 fps. This of course might be an issue in games which don't support such technology. So far I haven't encountered any problems. Still, it's annoying it exists.
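The 138 fps figure lines up with a community-measured approximation of the cap Reflex applies to keep the render rate inside the G-SYNC range: roughly the refresh rate minus refresh squared over 3600. This formula is reverse-engineered by the community, not documented by NVIDIA:

```python
def reflex_auto_cap(max_refresh_hz):
    # Community approximation of Reflex's automatic framerate limit,
    # which keeps G-SYNC engaged by staying below the max refresh rate.
    return int(max_refresh_hz - max_refresh_hz ** 2 / 3600)
```

For a 144Hz panel this yields 138, matching the cap observed above; higher-refresh panels get proportionally larger margins.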
 
Last edited:

MarkMe2525

Member
I enjoy CRT as much as the next man. I have a JVC D-Series in my bedroom. But let’s be real here.

CRT has its own problems that are solved by fixed-pixel displays: physical distortion, flicker, moire, poor geometry, radio interference, bloom, breathing, size and weight.

If we can get displays that are bright enough to compensate for black frame insertion then I think we are good to go.
I had to calibrate the geometry on my 27" crt last night. It was hell finding the proper documentation and explanation of adjustment options for this 2003 TV. I was up to around 2 am before I was satisfied.

10 out of 10 would do it again.
 
RIP our eyes... Strobing gives everyone eye strain, good luck still playing with that when you're getting older...

I'm 34, and I can't use a G-SYNC monitor anymore. I used one without problems for the last 7 years, but since last year my eyes can't take it anymore, that plus screen flickering...

And any strobing just gives me a terrible headache... Never had this problem before... Too much of this just gives you terrible eye strain over time...
 

Silver Wattle

Gold Member
The added NVIDIA tax would probably put the monitors that use this within range of OLEDs, which do not benefit from this LCD-focused tech.
Kinda cool, but outside of competitive gamers, isn't it kinda pointless?
 

poodaddy

Member
Rip our eyes... Strobing give eye strain to everyone, good luck to still play with that when you are getting older...

I have 34 year, i can't use a Gsync monitor anymore, i used one without problem the last 7 years but since last year my eyes can't support it anymore, that + screen flickering...

And any strobing just give me terrible headache... Never got this problem before... Too much of these just give your terrible eye strain with the time...
I'm 36, almost 37, and I'm getting a host of eye-related headaches and strain as I get older too. Sucks man, but I feel your pain.
 