
Old CRT TVs had minimal motion blur; LCDs have a lot of motion blur. LCDs will need 1,000fps@1,000Hz in order to have as little motion blur as CRT TVs.

JeloSWE

Member
I will try playing in Movie mode to see if there is any difference. I suppose input lag will be one...

Edit: no improvement as far as I am concerned; I still get a very noticeable doubling of the picture when rotating the camera in games.

If anyone has suggestions for a 32-inch TV with good motion clarity, know that I am interested...
I'm guessing The Frame is not really meant as a media screen, more like a painting on the wall. So I'm not surprised Samsung has skimped on the PWM implementation, even in Movie mode.
 
Last edited:

cireza

Banned
I'm guessing The Frame is not really meant as a media screen, more like a painting on the wall. So I'm not surprised Samsung has skimped on the PWM implementation even in Movie mode as well.
It had a decent review on rtings for gaming.
 

JeloSWE

Member
It had a decent review on rtings for gaming.

https://www.rtings.com/tv/reviews/samsung/the-frame-2020#page-test-results
Flicker-Free: NO
Frequency: 240hz
The Frame 2020 uses PWM to dim its backlight. It normally flickers at 240Hz, but the frequency drops to 120Hz when Picture Clarity is enabled, even if you don't adjust the Blur and Judder reduction sliders. Enabling LED Clear Motion lowers the flickering frequency further to 60Hz.
Note that every picture mode except 'Movie' mode has Picture Clarity enabled by default.


240Hz PWM is definitely better than 120Hz but might still induce image duplication; you'd want at least 480Hz, and more if possible, to reduce that effect.

The way they do the PWM strobing seems to create a lot of visible duplication.

Here is how my Sony Z9F looks with its 720Hz PWM
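Here's a rough back-of-the-envelope sketch (illustrative numbers of my own, not measurements of either TV) of why higher PWM frequencies look less doubled: the total eye-tracking smear is set by the content frame rate, but a higher PWM rate splits that smear into more, closer-spaced copies that tend to blend together.

```python
# Crude model of PWM image duplication while the eye tracks motion.
# Assumptions (mine): eye tracks at speed_px_s, backlight flashes pwm_hz
# times per second, content updates at content_fps.
def pwm_duplicates(speed_px_s: float, pwm_hz: float, content_fps: float) -> dict:
    return {
        "copies_per_frame": pwm_hz / content_fps,    # how many copies of each frame you see
        "copy_spacing_px": speed_px_s / pwm_hz,      # gap between adjacent copies
        "total_smear_px": speed_px_s / content_fps,  # width the copies span (fixed by fps)
    }

# 60fps game panned at 960 px/s:
print(pwm_duplicates(960, 120, 60))  # 2 copies, 8 px apart -> obvious double image
print(pwm_duplicates(960, 720, 60))  # 12 copies, ~1.3 px apart -> merges into smooth blur
```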
 
Last edited:

dolabla

Member
I hope one day we can get there again. CRT had its faults (weight, etc.), but it was definitely superior to anything we have today in other ways. I sure hope we get a true successor one day.
 

cireza

Banned
240hz PWM is definitely better than 120hz but might still induce image duplication, you'd want at least 480hz and more if possible to reduce that effect.

It seems to create a lot of visible duplication the way they do the PWM strobing.

Here is how my Sony Z9F looks with it's 720hz PWM
Yes, I had seen their picture and duplication was very noticeable. The second one you posted is really blurry... I guess I can't be satisfied by any HD TV :(
 

JeloSWE

Member
Yes, I had seen their picture and duplication was very noticeable. The second one you posted is really blurry... I guess I can't be satisfied by any HD TV :(
They share the same blur distance, but mine is smoother due to the higher PWM Hz. It's a good thing. The only thing that could remedy the blur is BFI.
 

sunnysideup

Banned
Every TV tech is made with form factor as priority number one.

I wonder what would happen if they threw that out of the window and put picture quality above all, disregarding whether it's clunky or has a large power draw.

I don't care if it's huge, weighs a lot and gives me huge electric bills. I just want a good moving picture.
 
Last edited:

JeloSWE

Member
Every tv tech is made with form factor as priority number one.

I wonder what would happen is they threw that out of the window. And put picture quality above all. Disregard if its clunky or has large power draw.

I dont care if its huge, weights alot and give me huge electric bills. I just want at good moving picture.
Cost-effectiveness, not form factor. If you throw money at it, which they do in a way when they build one-off proof-of-concept prototypes to test and demonstrate new tech, then you can get really spectacular TVs, such as Sony's 10,000-nit 8K 120Hz HDR display. And you can in theory already build a MicroLED TV; it's just that it's impossible to mass-produce them yet.
 
CRT TVs were blurry by default indeed. Games got a natural AA solution for free.

CRT TVs also only had around 10% or fewer of the pixels to move around, so pick your battles.
 

n0razi

Member
I'm sure CRT had less motion blur than Led or Oled, but that it's only an aspect of image quality. For sure my 65" LG CX image quality would seem like science fiction to myself from the past.


This

Motion clarity is only one part of overall image quality... I will take the numerous other benefits of a modern OLED
 

mdrejhon

Member
This

Motion clarity is only one part of overall image quality... I will take the numerous other benefits of a modern OLED
The LG CX is the cat's meow indeed.

That said, I've seen some high-end LCDs matching the LG CX -- mainly five-figure-priced or high-four-figure-priced televisions. Nanosys Quantum Dot phosphors have a similar (or better) color gamut than many OLEDs, and putting a 100,000-LED FALD backlight behind the panel more or less solves the LCD blacks problem while having smaller bloom than a CRT electron-beam dot.

There are a lot of pieces to the puzzle of why an LCD can be massively worse than an OLED, but there are engineering paths to equalize nearly all attributes of LCD with OLED -- it's a matter of cost-effectiveness for the package deal.

The small size of VR displays makes this easier, so an LCD with OLED-quality blacks and colors is much easier to commercialize there -- perhaps a Quest Pro, Index Pro, or Index 2?
 
Last edited:

mdrejhon

Member
240hz PWM is definitely better than 120hz but might still induce image duplication, you'd want at least 480hz and more if possible to reduce that effect.
120fps at 120Hz PWM is more eye-friendly than framerate-mismatched PWM. My eyes can get bothered by 864Hz PWM that isn't synchronized to the refresh rate, but they aren't bothered by a motion-blur-reduction strobe backlight (proper one-flash-per-refresh PWM, which is more eye-friendly than unsynchronized PWM).

60 Hz PWM is way too low and flickery, but some people like it.

In fact, it can be advantageous to lower the refresh rate to match the frame rate, so you can have the stroberate=refreshrate=framerate match necessary for the CRT motion effect. (CRT is similar to PWM, but with phosphor fade at the end, so it's more of a triangular wave than a square wave on an oscilloscope graph.)

Nonetheless, users should have the choice of PWM frequency, but if there *HAS* to be PWM, matching PWM=framerate=Hz is usually the most eye-friendly. So if your display can't do the full 240Hz, you'd prefer PWM-free over 240Hz PWM at 120Hz.
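A minimal sketch of that advice, assuming you roughly know what frame rate your GPU can sustain and which refresh rates your panel's strobe mode supports (the function and numbers below are hypothetical):

```python
def pick_strobe_refresh(sustainable_fps, supported_hz):
    """Pick the highest supported refresh rate the GPU can actually match,
    so that stroberate == refreshrate == framerate (the CRT-like triple match)."""
    candidates = [hz for hz in supported_hz if hz <= sustainable_fps]
    return max(candidates, default=None)

# A 144Hz panel that also offers 100Hz and 120Hz strobe modes,
# with a GPU that holds about 130fps in your game:
print(pick_strobe_refresh(130, [100, 120, 144]))  # -> 120
```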
 
Last edited:

balgajo

Member
One final thing. A lot of games have motion blur turned on in the settings (Doom 2016 and Doom Eternal for example) go into settings and if you see "motion blur" turn it the fuck off! LCDs/OLEDs already have motion blur so why would you want more? Why game developers do that is beyond me, modern flat panel displays already have a lot of motion blur as it is so why they add more to it is crazy.
Games do that to simulate the smoothness of a real-world lens. They're not quite there yet, but nowadays motion blur is much better than it was some time ago.

Also, I don't see how a 1000fps display would improve movies, as they already come with natural motion blur.
 

mdrejhon

Member
Games do that to simulate the smoothness of real world lens. They are not quite that yet but nowadays motion blur is much better than some time ago.
As I wrote in my earlier post, motion blur is useful for making low frame rates comfortable.
Camera-shutter motion blur can be comfortable.

Also, I don't see how 1000fps display would improve movies as they already come with natural motion blur.
I work in the industry.

I recognize the correct & wrong uses of motion blur.

You probably misunderstood this thread -- it is not about Hollywood. It's a "right tool for the right job" situation.

1000fps is for reality simulation, or for matching CRT-motion-clarity video material.

It's wrong to have extra source-based motion blur in a Star Trek Holodeck.

Examples:
- Virtual Reality, such as Oculus Rift, Quest 2, HTC Vive, Valve Index, PSVR, etc.
- Video games at high frame rates (e.g. CS:GO)
- Simulator rides (they're more realistic feeling when removing external motion blur)
- Sports video
- Fast action GoPro / POV camera stuff
- Mimicking CRT motion clarity with no form of flicker/phosphor/strobe/BFI.
- Etc.

For reality simulation, you only want the motion blur generated by the human brain. You NEVER want additional source-based or display-based motion blur when trying to simulate real life. So you need blurless displays & blurless sources for a lot of applications.

The ONLY motion blur you want in Virtual Reality is the motion blur generated inside your own brain! NEVER in the source, NEVER in the display.
Full stop -- scientifically proven, researcher proven -- just ask any VR researcher.

However, movies are different. I love 24fps for all the classics -- and I'm a fan of the Hollywood Filmmaker Mode on TVs. But 24fps is not a one-size-fits-all tool for all motion material. I've also seen 1000fps at 1000Hz on prototypes.

The problem is that CRT used flicker to eliminate motion blur, and the only way to eliminate motion blur flickerlessly is retina refresh rates. Real life does not flicker. Real life does not strobe. Flicker-to-eliminate-blur (like CRT) is a band-aid that won't work for a Star Trek Holodeck. A true real-life-matching Holodeck should never flicker.

TL;DR:
1. To eliminate motion blur at low frame rates: You need to flicker (like CRT, plasma, strobe mode, BFI)
2. To eliminate motion blur without flicker: It now becomes mandatory to use ultrahigh refresh rates & frame rate (blurless sample-and-hold)

Right now, all VR headsets flicker because that's the only way to eliminate motion blur without unobtainium frame rates & refresh rates. We are stuck with that technique for now, despite the fact that real life doesn't flicker, even though we're trying to make a display mimic real life.

Also, games running at really low frame rates, such as 20 or 25 frames per second, can often look better when you enable the motion blur setting. But once games run at super high frame rates for reality simulation (VR), added source-based motion blur becomes a burden. It can actually create headaches/migraines, because it forces extra blur above and beyond the motion blur your brain naturally generates. It's no longer a planar display experience (a movie) but an immersed experience (VR), where the rules of the blur ballgame change dramatically. The motion blur setting in games is a personal preference; some people depend on it for specific reasons (e.g. really low GPU frame rates, or being really sensitive to phantom array effects).
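To put rough numbers on points 1 and 2 above (my own illustrative figures, not lab measurements): perceived display motion blur while eye-tracking is approximately tracking speed multiplied by pixel visibility time (persistence), which is a full refresh period on a sample-and-hold display but only the flash length on a strobed or CRT-like display.

```python
def motion_blur_px(speed_px_s: float, persistence_ms: float) -> float:
    """Approximate blur width while eye-tracking: speed times persistence."""
    return speed_px_s * persistence_ms / 1000.0

speed = 1920  # px/s, a one-second full-width pan on a 1080p screen
for label, persistence_ms in [
    ("60Hz sample-and-hold", 1000 / 60),    # ~16.7 ms -> ~32 px of blur
    ("120Hz sample-and-hold", 1000 / 120),  # ~8.3 ms  -> ~16 px
    ("1ms strobe/BFI (flickers)", 1.0),     # ~2 px, CRT-like clarity
    ("1000Hz sample-and-hold", 1.0),        # ~2 px with no flicker at all
]:
    print(f"{label:28s} {motion_blur_px(speed, persistence_ms):5.1f} px")
```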

For more educational reading, see motion animations here.

Right Tool For Right Job!
 
Last edited:

NeoIkaruGAF

Gold Member
mdrejhon, thanks a lot for your insight! This thread is amazing, thanks to your contribution.

I have a question: what is the best solution on modern displays to achieve good motion with low framerate content?

Like I said before, movies and sub-60fps games just look deeply unnatural to me on a LG C9, even to the point of making me physically troubled at times (headaches, nausea). BFI on this particular TV doesn’t do much to mitigate the problem because the flickering is just too noticeable and introduces a whole other set of visual fatigue symptoms.

For movies, some mild motion interpolation can be good without getting into soap-opera look territory. What’s amazing is how not every movie or TV program moves the same with the same motion interpolation settings. I’ve noticed I can get a bit more aggressive with it with modern movies (typically, 3 on the LG ”dejudder” settings), while older movies may look better by lowering dejudder to 2. Old hand-drawn cartoons just look better without any interpolation at all.

For gaming this isn’t really an option.
 

JeloSWE

Member
120fps at 120Hz PWM is more eye-friendly than framerate-mismatched PWM. My eyes can get bothered by 864 Hz PWM not synchronized to refresh rate, but isn’t bothered by a motion blur reduction strobe backlight (proper one-flash-per-refresh PWM that’s more eye friendly than unsynchronized PWM).
I have to disagree somewhat based on personal testing and experience, though the PWM rates I've tested have been multiples of the frame rate and not an irregular number, of course. I've had three Samsungs that do either 480Hz PWM in Movie mode or 120Hz in all other modes, including Game mode, and my Sony Z9F always does 720Hz unless you enable the 120Hz BFI option. To me, when panning the camera in games or just moving my eyes and the mouse around my PC desktop, higher PWM is preferable: the image duplication that results from, say, 60fps content on a 120Hz PWM screen, on both those Samsungs and my Sony, is very obvious and distracting. Whereas 480Hz, and even better 720Hz, PWM creates so many duplicates that they start to merge together and almost look like a static backlight implementation, unless you have thin white lines on a black background. I would even say I find higher PWM calmer than 120Hz, which isn't completely flicker-free. The best TV I had was my previous Sony 55W905A, as it didn't use PWM: no flicker, and any in-game panning looked so smooth. God, I hate PWM; they should really just leave the backlight on or run it at 3000Hz. When I flicked my eyes around the screen on the old Sony versus the new one and the Samsungs, it looked so stable and calm; now it always feels a bit uneasy, especially at 120Hz PWM, which I presume is due to the Phantom Array Effect.
 
Last edited:

svbarnard

Banned
Whats even worse is that mouse trail effect gets fainter as the refresh goes up, but gets alot more distracting since you can still see it, and more of it.

[image: VD9upLq.png]

[image: O4H8HO1.png]
Mr Rejhon I have some more questions for you.

1. You know how in some games (such as Doom Eternal or COD: Modern Warfare 2019), if you go into the settings, you will see "motion blur" as an option you can turn on or off? Well, Digital Foundry did a video three years ago explaining why game developers do that -- why they add motion blur to video games. The guy in the video says that motion blur is added to modern video games to "hide gaps between frames and be visually continuous", and also says things like "gaps in the motion between the frames". Here is a screenshot from the video; this apparently is what he's talking about when he says gaps between the frames, and why they add motion blur to video games to help smooth that out.
[image: lM5ckBb.png]

Does this guy know what he's talking about? What is he talking about exactly? Is that visible gap, pictured in the image above, produced by the stroboscopic effect, AKA the stepping-mouse effect? Here's the link to the video



2. The phantom array effect / stroboscopic effect / stepping-mouse effect: can you explain to the people here exactly what it is?

3. What's the difference between ghosting and the stroboscopic effect? Are they the same thing?

4. How noticeable is the stepping mouse effect at 1,000fps@1,000Hz? Is it annoyingly horrible at that frame rate or what? Or is it better at that frame rate?

As usual thanks for your insight. You're truly a genius at what you do!
 
Last edited:

Tygeezy

Member
Yes, it's impressive how they've made VR headsets more comfortable to view than cinema 3D glasses. Less blur than a plasma, less blur than a CRT, less blur than a 35mm projector. And far less nauseating 3D.

And even beyond that, native rendered 3D environments are even better than watching filmed content -- you can even lean down and look underneath a virtual desk, or crouch to crawl under a virtual desk, or walk around a virtual museum statue. Many VR apps have perfect 1:1 sync between VR and real life -- they're not the dizzying roller coaster experiences.

Ease has really improve things there too. 50% of the Oculus Quest 2 market are non-gamers, and these models are so easy that a boxed Quest 2 was mailed to a nursing home (some are literally almost jail-like during COVID times), and the 70-year-old was playing VR the same day with no help from the nursing home staff! They skip the rollercoaster apps and just download the comfortable VR apps (Oculus has "Comfort" ratings in their app store). For those who "hold their nose and buy despite Facebook", these super-easy standalone VR headsets makes it perfect for the hospital bed-ridden or the locked-down individual, the Quest 2 is a gift from heaven for these people.

Vacationing by sitting in a boring residence chair that actually becomes a simulated porch chair of a virtual beachfront house viewing out to the virtual seas and virtual palm trees -- teleporting yourself virtually (this is a real app, it's called the "Alcove" app, with Settings->Beachfront). And being able to play chess on an actual virtual table with a remote family member sitting across the table. And since the Quest 2 has a built in microphone, you're talking to each other in the same room despite both family members living in two different countries. This non-gamer app, "Alcove", is an actual app you can download in the in-VR app store on Quest 2, no computer or phone needed.

Helping the framerate=Hz experience of CRT motion clarity in VR (without a computer needed) -- the Quest 2 has a built-in 4K-capable GPU, a fan-cooled Snapdragon XR running at a higher clockrate than smartphones -- but the fan is so quiet you don't hear it, and with the hot air blowing out a long 1mm slit at the top edge). The graphics of a standalone computerless Quest 2 VR headset has 3D graphics as good as a GTX 1080 from five years ago -- that's downright impressive for $299 playing high-detail standalone VR games such as Star Wars: Tales From The Galaxy Edge (FPS) or playing lazy seated/bed games like Down The Rabbit Hole (like a Sierra Quest or Maniac Mansion game, except it's sidescrolling dollhouse-sized true 3D VR) or playing exercise (Beat Saber) burning more calories for cheaper than an annual gym membership. While many hate Facebook, many are holding noses and buying up Quest 2's as if they're RTX 3080's because they're the closest thing to a Star Trek Holodeck you can get today. John Carmack did an amazing job on that VR LCD screen.

And in an Apple-style "One More Thing", it can play PCVR games wirelessly now. So you can play PC-based Half Life Alyx too. Basically a cordless VR headset for your gaming PC. As long as you have a WiFi 5 or WiFi 6 router in the same room (preferably a 2nd router dedicated to just Quest 2) and a powerful RTX graphics card in the PC, it becomes like a "Wireless HDMI" connection (through nearly lossless H.EVC compression at up to triple-digit megabits per second -- pretty much E-Cinema Digital Cinema bitrates -- no compression artifacts at max settings, and only 22ms latency, less lag than many TVs, at a full frame rate of 90fps 90Hz), to the point where it feels like a wireless version of HDMI. So you can play PCVR as well as in-VR apps (using the in-headset Snapdragon GPU), so you have PC operation (using PC GPU) *and* portable operation (using built-in Snapdragon GPU). Quite flexible.

And while not everyone does this -- it also can optionally doubles as a strobed gaming monitor with Virtual Desktop, and is even capable of optional 60 Hz single strobe (for playing 60 years of legacy 60fps 60Hz content via PC). Virtual Desktop can even display a virtual room with a virtual desk with a virtual computer, and support is being added (in the next few months) to a Logitech Keyboard to map a 3D-rendered VR keyboard into the same physical location of your logitech keyboard! Not everyone would use VR this way, but it's one unorthodox way to get a CRT emulation, since this LCD is so uncannily good at zero motion blur, zero ghosting. You'd wear your VR headset seated at your physical computer, but you're staring into a virtual CRT-motion-clarity instead (in a rendered virtual office).

Now, obviously, some of this is an unconventional optional use -- but consider you get everything of the above for just $299: A desktop gaming monitor, a PCVR headset, a standalone VR headset, a virtual IMAX theatre, a virtual vacation, and a built-in battery powered fan-cooled GPU more powerful than a $299 graphics card of 5 years ago. All in one. Even just one or two of the uses, just pays for the headset itself, alone -- especially if you're in a COVID jail (quarantine).

Certainly isn't grandpa's crappy "Google Cardboard" toy VR. People who only have experience with 60 Hz LCDs don't know what they're missing with these formerly science fiction CRT-beating LCDs now already on the market, being made out of necessity for VR headsets. Zero blur, zero phosphor trails, zero crosstalk, zero double image effect, it's just perfect CRT motion clarity. For those who don't want Facebook, there's the Valve Index, but it's not standalone.

*The only drawback is LCD blacks. However, I saw a FALD VR prototype -- MicroLED local dimming with thousands of LEDs -- that will allow inky-blacks in future headsets by ~2025. But at $299, I didn't bother to wait.


Hello,
Your first reply here is blank — did you mean to say something in regards to my explanation of low-Hz strobing on a high-Hz-capable panel?

Now in regards to your second reply with two new questions:

1. I believe 120 Hz HFR will standardize for about a decade. 120 fps at 120 Hz is high enough that strobing (flicker) is not too objectionable, for an optional motion blur reduction mode. I think 8K will raise refresh rates before 16K becomes useful. 16K is useful for VR, but I think external direct-view TVs don’t really need to go beyond 8K in the consumer space. Jumbotrons, cinemas, and video walls will still have usefulness to go 16K to go retina for first row of seats at the front for example.

Retina frame rates will be used in specialty venues later this decade. There is an engineering path to 8K 1000fps 1000Hz display which can be achieved using today’s technologies. For content creation in this sphere, more information can be found in Ultra High Frame Rates FAQ. A new Christie digital cinema projector is currently capable of 480 Hz already!

I am working behind the scenes for 1000 Hz advocacy; it is definitely not unobtainium — at least for industrial / commercial / special venue purposes. Some of them can afford rigs with 8 RTX GPUs built into them, for example, running specialized 1000fps software. I was behind the scenes in convincing Microsoft to remove the 512 Hz Windows limitation, and now have a Windows Insider build capable of 1000 Hz.

However, for retail televisions, they will be stuck at 120 Hz HFR for a long time due to video industry inertia. Streaming is taking over cable/broadcast, and will be sticking to Hollywood MovieMaker modes.

However, there is already a hack to run YouTube videos at 120fps and 240fps HFR by installing a Chrome extension and playing 60fps videos at 2x or 4x speed. You record 240fps 1080p video, upload to YouTube as 60fps, then play back to a 240 Hz gaming monitor using the chrome extension at 4x playback speed. So you can do 240fps 1080p HFR today with just your own smartphone camera! (Many new smartphones support 240fps slo-mo which can be converted to 240fps real-time HFR video).

So the workflow isn’t that expensive to do, technology is already here today, and 1000 Hz isn’t going to be expensive technology in the year 2030s+. Specialized/niche, yes. But definitely not unobtainium as 1000 Hz is already running in the laboratories which I’ve seen.

Ultimately, by the end of the century, there’ll be legacy framerates (24fps, 50fps, 60fps) and retina frame rates (1000fps+). 120fps HFR is just a stopgap in my opinion. There is somewhat of a nausea uncanny valley, to the point that 1000fps at 1000Hz is less nauseous than 120fps at 120Hz (and fixes some aspects of the Soap Opera Effect). 1000fps is a superior zero-motion-blur experience that has no flicker and no stroboscopics. Better window effect. Fast motion speeds exactly as sharp as stationary video! As long as it is 1000fps+ native frame rates or good perceptually-lossless-artifactless-“artificial-intelligence”-interpolated video or other properly modern frame rate amplification technology.
I'd also like to point out that even if a game isn't graphically impressive, it sure looks a whole heck of a lot more impressive when you're actually in the game, everything is 3D, and you can interact with the environment. Things that are tedious in normal games, such as manually looting everything, all of a sudden become fun in VR. A lot of really popular and fun games are indie titles, not graphical powerhouses, anyway.

If someone could make a survival game in the vein of Valheim on Quest, which is certainly doable on the Quest 2 hardware, it would be amazing.
 

mdrejhon

Member
if someone could make a survival game in the vein of Valheim on quest which is certainly doable on the quest 2 hardware; it would be amazing.

You can play Valheim in VR using a headset like a stereoscopic 3D desktop monitor:

While it is not true VR, it is more immersive than normal. Check out Playing Valheim in VR with VorpX, using your VR headset as a stereoscopic 3D desktop gaming monitor that also happens to have CRT motion clarity.



You said you had a Quest 2. Basically, you'd use your VR headset as a goggle display for your PC to view your Windows desktop or play non-VR PC games. You view your PC screen remotely via BigScreen or Virtual Desktop (with the refresh rate configured to match the Quest 2 headset), and use your VR controllers like a gamepad. It works today, and it produces better motion clarity than some desktop gaming monitors, thanks to the Quest 2's superior motion-clarity LCD.

VR headsets can double as a stereoscopic 3D television/monitor/display (using the VR headset like NVIDIA 3D Vision) with better graphics quality than NVIDIA 3D Vision. No shutter glasses; both eyes get images at the same time.


I have a question: what is the best solution on modern displays to achieve good motion with low framerate content?

Like I said before, movies and sub-60fps games just look deeply unnatural to me on a LG C9, even to the point of making me physically troubled at times (headaches, nausea). BFI on this particular TV doesn’t do much to mitigate the problem because the flickering is just too noticeable and introduces a whole other set of visual fatigue symptoms.

For some people, low frame rates are ergonomically SOL

Unfortunately, if you find low frame rates nauseating, they are hugely problematic from an ergonomic POV, because you've got a major pick-your-poison situation:
  1. For the motion-blur-sensitive person (motion blur headaches), low frame rates on sample-and-hold can be nauseating.
  2. For the flicker-sensitive person (flicker headaches), low frame rates during strobing can be nauseating, since they either visibly flicker or generate double-image effects.
  3. For people who are both motion-blur-sensitive and flicker-sensitive, it becomes even more difficult (damned if you do, damned if you don't).
Having 60+ years of legacy 60fps 60Hz content means we are stuck with band-aids, such as strobing (flicker problem), sample-and-hold (flickerless but blurry), or interpolation (artifacts, lag).

It's very hard for people who are both flicker-sensitive (headaches) and motion-blur-sensitive (headaches): many people with vision sensitivities couldn't even watch a CRT because of flicker, and others can't watch an LCD because of display-derived motion blur.

Artificial-Intelligence Interpolation Without Artifacts

Currently, individuals who get low-framerate nausea have to swallow interpolation as a band-aid if they really hate flicker.

Long term, for legacy video content (recordings that don't have 3D or Z-buffer information), artificial-intelligence interpolation is probably the best bet. Basically, smart artificial intelligences (such as GPT-3 or better) acting as real-time artists, recreating the "fake" in-between images much more convincingly, like a professional Photoshopper, except done in real time -- artfully generating extra frames, indistinguishable from the originals, at hundreds of frames per second.



Greatly improved AI interpolation (such as DAIN) without parallax artifacts (the AI smartly "Photoshops" the parallax artifacts away) has already begun to appear, and it will gradually keep getting better until interpolation looks flawless.

Today, if you're one of those who get lots of headaches/nausea from low frame rates, you can take your 24fps content, download DAIN, run the process overnight (DAIN is compute-intensive, not real-time yet), and play your near-perfectly interpolated videos the next day. A lesser PC-based video-file-converting technique would be Smooth Video Project, but it has way more artifacts and Soap Opera Effect, though its strength is adjustable for PC content, just like a TV's low-interpolation setting.

Ideally, AI interpolation needs to be an easy real-time toggle, but you can roll your own non-realtime AI interpolation solution today.
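As a concrete stand-in you can try today without any AI tooling, here is a sketch that shells out to FFmpeg's motion-compensated minterpolate filter to convert a 24fps file to 120fps. It will show more artifacts than DAIN-style AI interpolation, and it assumes ffmpeg is installed and on your PATH; the filenames are placeholders.

```python
import subprocess

def interpolate_to_120fps(src: str, dst: str) -> None:
    """Offline motion-compensated interpolation (an artifact-prone, non-AI stand-in)."""
    subprocess.run(
        [
            "ffmpeg", "-i", src,
            "-vf", "minterpolate=fps=120:mi_mode=mci:mc_mode=aobmc:vsbmc=1",
            "-c:a", "copy",  # keep the original audio track untouched
            dst,
        ],
        check=True,
    )

# interpolate_to_120fps("movie_24fps.mkv", "movie_120fps.mkv")
```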

Other Real Time Low-Lag Techniques For Low-Frame-Rate Game Content:

Currently, many of us just get a faster GPU / reduce detail / reduce framerate-impacting settings like excess AA / etc. But sometimes this is not possible or affordable. Some TVs offer a lower-lag interpolation mode for game mode (e.g. the one Samsung first put in the NU8000 series a while back, called "Game Motion Plus").

Others use band-aids like phosphor fade (plasma, CRT) to soften the flicker effect of low refresh rates, though this will eventually be emulatable on ultra-high-Hz displays with a configurable rolling scan (e.g. using 1000Hz to emulate phosphor fade in 1ms increments), for playing legacy content (e.g. emulators).

The long-term answer for game content is frame rate amplification technologies (FRAT), since games have access to the original motion information (controllers, motion vectors, etc.) and can laglessly create virtually original-looking frames. This is already being done at 2:1 ratios today in algorithms such as Asynchronous Space Warp (ASW 2.0), built into Oculus Rift VR headsets. The HTC Vive and Valve Index do something similar, doubling frame rates without reducing graphics detail. AI can also become part of FRAT, as seen in NVIDIA DLSS 2.0, which uses neural network training.

In the future, GPUs will gain dedicated silicon to convert frame rates at 5:1 and 10:1 ratios within a decade or so. Some work is already being done at ratios like 4:1 in the nearer future, even for raytracing (see Temporally Dense Ray Tracing). In that sphere, NVIDIA credited me on page 2 of this peer-reviewed research paper where doubling the frame rate halves display motion blur.

Either way, multiple solutions are being worked on, but these are still the early days -- like Japan's HDTV experiments in the 1980s. It will be a long time before 240Hz or 480Hz is mainstream, but it is inevitable as long as it becomes a freebie feature (like retina screens, and like 120Hz slowly is -- almost all new 4K HDTVs now support 120Hz, and almost all new gaming consoles now support 120Hz).
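As simple arithmetic on those amplification ratios (illustrative numbers only, not vendor specs):

```python
def frat_output(base_fps: float, amplification_ratio: float, max_hz: float):
    """Output frame rate from frame-rate amplification, capped at the display's
    refresh rate, plus the resulting sample-and-hold persistence in ms."""
    out_fps = min(base_fps * amplification_ratio, max_hz)
    return out_fps, 1000.0 / out_fps

print(frat_output(45, 2, 90))      # ASW-style 2:1 on a 90Hz headset -> (90, ~11.1 ms)
print(frat_output(100, 10, 1000))  # hypothetical 10:1 feeding a 1000Hz panel -> (1000, 1.0 ms)
```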
 
Last edited:

Tygeezy

Member
You can play Valheim in VR using a headset like a stereoscopic 3D desktop monitor:

While it is not true VR, it is more immersive than normal. Check out Playing Valheim in VR with VorpX, using your VR headset as a stereoscopic 3D desktop gaming monitor that also happens to have CRT motion clarity.

[...]

I thought about getting VorpX mainly to play PC games in 3D. I really enjoyed playing Diablo 3 and Path of Exile with 3D Vision way back.

I'd definitely be interested in playing the Diablo 2 remaster with VorpX, assuming it works with it like Diablo 3 apparently does. I might pick up and test Valheim with it, but I was hoping someone would use the Unity engine, like Valheim does, and build a survival co-op game from the ground up exclusively for VR. I just think that genre in particular is an excellent fit for VR. The Walking Dead: Saints and Sinners has some minor survival mechanics, and that game is outstanding in VR.
 

mdrejhon

Member
I have to disagree some what from personal testing and experience, but those PWM has been multiples of fps and not an irregular number of course. So I've had 3 Samsungs that does either 480 Hz in Movie mode or 120hz in all other modes including game mode. And my Sony Z9F always does 720hz unless you enable the 120hz BFI option. To me when panning the camera in games or just moving my eyes and mouse around my PC desktop, higher PWM is preferred as the the image duplicates the result from say 60fps content on a 120hz PWM screen on both those Samsungs and my Sony creates a very obvious and distracting imager duplication. Where's 480hz and even better 720hz PWM creates so many duplicates they start to merge together and almost look like a a static backlight implementation, unless you have thin white lines on a black background. I would say that I even find higher PWM calmer than 120hz which isn't completely flicker free.
Correct.

We're not mutually exclusive -- you're correct too. There are roughly three groups:

1. People who are PWM-insensitive (they don't mind flicker/strobe/PWM of any kind, even unsynchronized)
2. People who are PWM-sensitive only when the frame rate mismatches the strobe rate (synchronized "motion blur reduction" PWM is fine for them)
3. People who are PWM-sensitive to any kind of PWM

I fall in the category of #2.

The bottom line is that some people get eyestrain from PWM artifacts, instead of eyestrain from PWM flicker.

These are essentially two separate causes of PWM strain, and different people are affected.

[image: 9VI4BMR.jpg]


Everybody sees differently. We respect that everybody has different vision sensitivities. Focus distance? Brightness? Display motion blur? Flicker? Tearing? Stutter? Some of these hurt people's eyes / create nausea / cause symptoms more than others. One person may be brightness-sensitive (hating monitors that are too bright or too dim). Another may get nausea from stutter. And so on. Displays are inherently imperfect emulations of real life.

Even back in the CRT days -- some people were insanely bothered by CRT 30fps at 60Hz, while others were not at all.

Also, there's the Vicious Cycle Effect, where higher resolutions, bigger displays, wider FOV, and faster motion amplify PWM artifacts. They become easier to see on 8K displays than on 4K displays than on 1080p displays.

The visibility of PWM artifacts increases with bigger resolutions & bigger FOV & faster motion, so the problem of framerate not equalling stroberate becomes visible again at ever higher PWM frequencies. This is part of why I'm a big fan of retina frame rates at retina refresh rates for many use cases.
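As a rough illustration of that vicious cycle (my own numbers, and assuming uniform pixels-per-degree across the field of view, which is a simplification): the same eye movement at the same strobe rate produces a bigger, easier-to-resolve gap on a sharper, wider display.

```python
def phantom_step_px(speed_deg_s: float, rate_hz: float, width_px: int, hfov_deg: float) -> float:
    """Gap between successive strobed copies, expressed in display pixels."""
    px_per_deg = width_px / hfov_deg
    return (speed_deg_s / rate_hz) * px_per_deg

# Same 30 deg/s eye movement, same 240Hz strobe/PWM rate:
print(phantom_step_px(30, 240, 1920, 40))   # ~6 px on a 1080p monitor spanning ~40 degrees
print(phantom_step_px(30, 240, 7680, 100))  # ~9.6 px on an 8K display spanning ~100 degrees
```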

In fact, many people are bothered by PWM artifacts (#2) when the display is big enough -- such as in virtual reality. Headaches from display artifacts (PWM artifacts, motion blur artifacts, etc.) are biggest with a massive IMAX-sized display strapped to your eyes. That's why, to minimize the percentage of nausea and headaches in the human population, VR headsets match their PWM frequency to the refresh rate, which fixes headaches from duplicate images.

From an Average Population Sample, Lesser Evil of Pick-Your-Poison Display Artifacts

VR researchers discovered people got a lot of nausea/headaches with things like display motion blur and PWM artifacts, so they (A) chose to strobe, and (B) they chose to strobe at framerate=Hz.

Strobing / blur reduction / BFI / "framerate=Hz PWM" are essentially the same thing -- synonyms from a display point of view. We don't normally equate "PWM" with "motion blur reduction" but motion blur reduction on LCD is essentially flashing a backlight, and that's the scientific definition of PWM (an ON-OFF square wave).

For people matching #2, the fix is to lower refresh rate and raise frame rates, until they converge to framerate=refreshrate=stroberate. When this happens, you get beautiful CRT motion clarity, zero blur, zero duplicates, etc. You still have flicker, but there can be a point where flicker is the lesser evil (as long as flicker frequency is high enough).

When screens are big enough to the eyes (like VR), it becomes a problem bigger and more visible than CRT 30fps at 60Hz.

VR researchers found that the fewest headaches occurred with VR PWM at framerate=Hz. You do need a proper triple match to eliminate the maximum amount of VR nausea for the maximum population, though: refreshrate == framerate == stroberate PWM, with the frequency beyond the flicker fusion threshold (i.e. 120Hz instead of 60Hz).

It is the lesser of evil of a pick-your-poison problem of displays that can't yet perfectly match real life.

[image: 9ReYBbG.png]


That's why if you want this for your desktop games, you must have the triple match, framerate = refreshrate = stroberate.

That's why blur busting is so difficult in many games at these non-retina frame rates. You need technologies similar to VSYNC ON, but with lower lag, to keep the frame rate synchronized. You need a GPU powerful enough to run frame rates equal to refresh rates that strobe high enough not to flicker. You need high-quality strobing without ghosting or crosstalk. Etc. So 120fps, 120Hz PWM, 120Hz display refresh is very tough to do with many games.

(BTW, this is also partly why RTSS Scanline Sync -- which I helped Guru3D create -- was invented as a low-lag alternative to laggy VSYNC ON.)
 
Last edited:

mdrejhon

Member
Mr Rejhon I have some more questions for you.

Hey! I am glad you are interested in asking questions.

But may I ask a favour first? Just like the student who hogs classroom time by asking more questions than the rest of the class -- would it be possible to answer my previous questions first, before I answer further questions of yours, please? Forum participation is unpaid time, and I have to ration the teaching moments across the maximum number of interested people. Thank you!

1. Can you answer my question earlier if you had a reply that went missing:
Link to your reply with missing reply.

2. Can you answer if you finished the exercise I told you about in the other forum:
"Steps To See Stroboscopic Text Effect In Browser Scroll"

Once you reply, I'll be happy to answer your other excellent questions (if they aren't self-answered by #2). It's important to do this specific exercise before you ask any additional questions, because running some motion tests yourself, to see with your own eyes, helps answer your own questions. Appreciated!
:messenger_winking:

However, since one of Blur Busters' prime directives is mythbusting the refresh rate race to retina refresh rates, I'd like to address a potential source of confusion:
1. So you know how in some games (such as Doom Eternal or COD modern warfare 2019) if you go in the settings you will see "motion blur" as an option you can turn on or off? Well digital foundry did a video three years ago explaining why game developers do that, why they add motion blur to video games. The guy in the video says that motion blur is added to modern video games to "hide gaps between frames and be visually continuous"
Correct.

"gap" = "stroboscopic stepping" = "strobscopics" = "steps between frames" = "gaps between frames" = "phantom array effect"

It's a synonym. Same thing. Identical meaning for the perspective of moving images on a display.

1. Watch the YouTube video you posted again, but mentally replace "gap between frames" with "stroboscopic effect between frames".
2. Then re-read The Gapping Effect of Finite Frame Rates, but mentally replace "stroboscopic effect" / "phantom array effect" with "gaps between frames"

They mean the same thing. That said, my article goes more in depth than the video (in some ways) by covering the different kinds of stroboscopic effect for non-strobed displays and strobed displays.

We just use different terminology. They don't cause each other. They are the same thing. One and the same.

I also mention it in the Ultra HFR FAQ: Real Time 1000fps on Real Time 1000 Hz Displays, where a 360-degree camera shutter introduces source-based motion blur to "fill the gaps between frames" (which means the same thing as "fill the stroboscopic stepping between frames"). The good news is that 1000fps means you only need 1ms of intentional source-based motion blur to fill the stroboscopic gaps between frames.

I know this will prompt additional questions, but before you ask them, please answer my earlier questions first. Thank you!

(From Ultra HFR FAQ)
  1. Fix source stroboscopic effect (camera): Must use a 360-degree camera shutter;
  2. Fix destination stroboscopic effect (display): Must use a sample-and-hold display;
  3. Fix source motion blur (camera): Must use a short camera exposure per frame;
  4. Fix destination motion blur (display): Must use a short persistence per refresh cycle.
Therefore:
  • Ultra high frame rate with a 360-degree camera shutter also means a short camera exposure per frame;
  • Ultra high refresh rate with a sample-and-hold display also means a short persistence per refresh cycle.
Therefore, solving (1), (2), (3) and (4) simultaneously requires ultra high frame rates at ultra high refresh rates -- see the sketch below.
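A crude worked example of those four points (illustrative numbers; real perceived blur does not combine quite this simply): source blur scales with camera exposure, display blur with persistence, and only high frame rates at high refresh rates shrink both at once without strobing.

```python
def total_tracked_blur_px(speed_px_s: float, camera_exposure_ms: float, persistence_ms: float) -> float:
    """Rough sum of source blur (camera shutter) and display blur (persistence) while eye-tracking."""
    return speed_px_s * (camera_exposure_ms + persistence_ms) / 1000.0

speed = 2000  # px/s pan
# 24fps with a 180-degree shutter, shown on a 60Hz sample-and-hold display:
print(total_tracked_blur_px(speed, 1000 / 24 / 2, 1000 / 60))  # ~75 px of smear
# 1000fps with a 360-degree shutter on a 1000Hz sample-and-hold display:
print(total_tracked_blur_px(speed, 1.0, 1.0))                  # ~4 px, and no stroboscopic gaps
```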

Also, svbarnard, please complete this exercise and re-read all the links (knowing the synonym "gapping" = "stroboscopics").

My article, The Gapping Effect of Finite Frame Rates, makes precisely the same point as the video you posted. Adding game-based motion blur is both a good and a bad thing; as the article explains, there are pros and cons to adding motion blur to frames, and the YouTube video focuses on certain of those pros and cons.

Once you recognize that the pros of motion blur come with its cons, it all becomes part of the vicious cycle effect: the more you try to eliminate display motion blur, the more visible the source-based artifacts become (whether source-based motion blur or the stroboscopic stepping effect).

So, to eliminate BOTH simultaneously for a bigger percentage of the population (including all the sensitive individuals): you need source-based motion blur to eliminate the stroboscopic effect, but minimizing source-based motion blur AND display-based motion blur requires extremely high frame rates at high refresh rates. That gets a display closer to mimicking real life, so a bigger percentage of eyes sees no issues.

Anyway, please complete the exercise, and let me know!

Thank you!
 
Last edited:

JRW

Member
We still have a Sony 34XBR960 CRT, and lately I've been playing Ghosts 'n Goblins Resurrection on it via Switch (the TV has one HDMI input, thankfully). It really is crazy how much smoother 60fps/60Hz looks on a CRT vs. even my main 144Hz PC LCD, and I won't even get into how much better the contrast ratio & black levels are on a decent CRT.

But the convergence has slipped a bit on the XBR960 over the years; that's one downside with CRTs, they need occasional calibration to keep things looking crisp.
 

dave_d

Member
We still have a Sony 34XBR960 CRT and lately Ive been playing Ghost N Goblins Resurrection on it via Switch (the tv has 1 HDMI input thankfully) It really is crazy how much smoother 60fps/60Hz looks on a CRT vs. even my main 144Hz PC LCD and I won't even get into how much better the contrast ratio & black levels are on a decent CRT.

But the convergence has slipped a bit on the XBR960 over the years, thats one downside w/ CRT they need occasional calibration to keep things looking crisp.
Well, and the image geometry can be a bit off. (E.g. on my old Trinitron in the '90s, straight lines in menus in Final Fantasy 2 were clearly not straight.)
 

Durask

Member
haha, sounds like you just came from the future.
Nah, there are 4 now.


I tried the Razer one.
 

Kuranghi

Member
Seemed like the best place to ask this:

nvidia-refresh-rate-animation.gif



Isn't this gif extremely disingenuous because the separate captures aren't synced? It seems to me like the gif starts after the moment where you'd have the advantage they are referring to in the text.

Surely the moment when the hand first appears is the moment where you have the "seeing them earlier" advantage; as in, the hand will smoothly slide from behind the door on the 240Hz test, whereas it will suddenly pop into view on the 60Hz test after it's already been on screen for a few frames at 240Hz.

I don't think you would fire at that point, of course -- you'd want to wait until the head/chest is visible -- but crucially, my understanding is that you can start moving your crosshair over to the target sooner because you've reacted to the movement.

I think I might be missing something here though so thought I'd ask.
 

perkelson

Member
No thank you. I'd take a shitty 60Hz LCD panel over a 1000Hz CRT with shitty colors, contrast and everything else. Oh, and by the way, I won't need a separate room to hold it in, since a 60-inch CRT would be the size of a small car.
 

mdrejhon

Member
Seemed like the best place to ask this:

nvidia-refresh-rate-animation.gif



Isn't this gif extremely disengenuous because the separate captures aren't sync'd? It seems to me like the gif starts after the moment where you'd have the advantage they are referring to in the text.

Surely the moment when the hand first appears is the moment where you have the "seeing them earlier" advantage, as in, the hand will smoothly slide from behind the door on the 240hz test whereas it will suddenly pop into view on the 60hz test, after its already been on screen for a few frames at 240hz.

I don't think you would fire at that point ofc, you'd want to wait until the head/chest is visible, but crucially you can start moving your crosshair over to the target sooner because you've reacted to the movement is my understanding.

I think I might be missing something here though so thought I'd ask.

Actually... this animation might be accurately synced from the Input Lag point of view (rather than Display Lag), because:

There's also a hidden framebuffer queue in the graphics driver.

So it may be possible it actually is synced (we don't know what driver settings were used for that comparison), because it only shows the display-output side of the equation, rather than the graphics driver's frame-queue latency.

Drivers at lower Hz generate bigger tape-delay latencies -- similar to what the animation shows.



"Input" lag is a long cascading chain of latencies from multiple sources -- not just "display" lag.

Higher framerate & higher Hz can decrease software/driver latency long before display latency. A bigger micro-tapedelay latency at lower Hz, just like in the animation.

For example, during VSYNC ON a prerendered frame queue of 2 frames can be 2/60sec at 60Hz, and 2/240sec at 240Hz, with certain sync settings.
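Rough arithmetic on that point (illustrative only):

```python
def queue_latency_ms(queued_frames: int, refresh_hz: float) -> float:
    """Added latency from a driver-side pre-rendered frame queue at a given refresh rate."""
    return queued_frames * 1000.0 / refresh_hz

print(queue_latency_ms(2, 60))   # ~33.3 ms of queue latency at 60Hz
print(queue_latency_ms(2, 240))  # ~8.3 ms at 240Hz -- same queue depth, quarter the tape-delay
```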

A similar issue can also occur with uncapped VRR (frame rates trying to exceed the refresh rate without being allowed to, as with VSYNC OFF).
 
Last edited:

mdrejhon

Member
No thank you. I'd take 60hz shitty LCD panel over 1000hz CRT with shitty colors, contrast and everything else, oh and by the way i won't need separate room to hold it in since 60CRT would be size of small car.
This thread is talking about a 1000 Hz LCD.

It would look exactly the same size as your 60Hz LCD, but with zero motion blur, like a CRT.

If you've ever used a 240Hz LCD, even web-scrolling motion blur is reduced by 75% -- less motion blur than a 120Hz iPad.

Double the Hz and frame rate = half the scrolling motion blur (scientific citations are linked from this article).

A 1000Hz LCD would have only 60/1000ths the motion blur of a 60Hz LCD.

High-Hz LCDs are no longer just for gaming. A very common usage -- web browser scrolling -- looks much nicer at higher Hz. It's the 120Hz iPad experience, except better. There are non-gamers using 240Hz and 360Hz monitors now for motion-blur ergonomic reasons. Some people get motion blur nausea or get motion sick during scrolling, and higher Hz helps with that. This may not be you, but it helps a lot of others. It's a fast-growing segment of the audience.
 
Last edited:

Vick

Gold Member
Without motion blur, 30fps titles are awful to look at, and per-object motion blur is one of the reasons (along with the much more prominent DOF, lol, and AO and SSS) Uncharted 2 was considered visually a generation above Drake's Fortune (no one apparently noticed downgrades such as texture and shader resolution, shadows, particles, enemy AI and animations, etc.) and God of War 3 was considered CGI-looking by everyone at the time. You need these things, but they need to be tastefully added.
Having awful displays with awful settings (every technology outside of CRT and plasma introduces, to various degrees, additional motion blur not present in the source) is not a good reason to push the industry backwards.

Glad John keeps fighting for per-object motion blur on DF at every occasion; love him for that.
The only people who should never be catered to on this sort of stuff are tasteless average Joes and cheap-ass gamers who can't afford anything better than awful LCDs.

I hate per-object motion blur!

Said no CG artist, CRT/Plasma/OLED/DLP projector owner ever.
It has much to do with LCDs, since this awful cheap-ass piece of tech (which I hate for the simple reason that it killed plasmas, being much cheaper to produce) introduces various degrees of blur not present in the source, killing the supposed look of the source in the process.
Yeah, that was an error on my part for wanting to be inclusive. OLEDs also introduce motion blur not present in the source. That is camera motion blur, not the per-object motion blur I was discussing.
Clarity during panning would never result in fluidity in 30fps games, and since the vast majority of console games target 30fps, camera blur of some extent is also essential.

Actually, the vast majority of console games have had these options since the start of the generation due to the constant bitching; that's why I think still complaining about it is ridiculous at this point.

Granted, some games have awful motion blur implementations that look like shit no matter the technology on which they are displayed, but that’s not the point.
What's indisputable is that the panel's way of handling motion plays a huge role, almost as prominent as the nature of said panel between CRT, plasma, LCD and OLED.
What a load of bullcrap; stop playing games on potatoes, then.
Try 60fps games on a 60Hz plasma and tell me you need more with a straight face.

Higher resolutions, on the other hand, 1:1 or downsampled, will always look better.
They already looked basically perfect on the Pro. First-party games at 60fps on a 60Hz top-of-the-range plasma, though...



Can’t wait.
You are in trouble, friend, just like I am.
NO TV on the market, no matter the price, can match our panels in motion (so 99.9% of the time), and if, like me, you've been using plasmas for a long time, there's just no way in hell you could adjust to how LCD/LED/OLED handle motion.
That's why I have a KURO and a VT50. And when they both die, hopefully years from now, I'll just try to find another one. Actually, I'd better start looking.
If you can find a top gamma plasma, in motion (so 99.9% of the time) it would look much better than any TV on the market, no matter the price.
Blacks were also as deep as OLED on the latest Panasonics, and you'd never have to worry about resolutions, because 1080p would be 1:1 and downsampled images would simply blow your mind.

As I said minutes ago in the other thread, when both my KURO and VT50 die, years from now hopefully, I'll just try to find another one.
There was always a very defined boundary in my case: PC games played at higher framerates on monitors, and console 30fps games played on the plasma. Unless they weren't consistent in hitting the target or had awful input lag, they never bothered me, mostly due to the excellence of that technology in handling motion.

But playing consistent-60fps, amazing-looking games on PS5 at higher resolutions has been a disaster. 60hz signals are quite simply orgasmic on the plasma... even things like 90fps on the LCD monitor look bad in comparison, and there's no turning back. I just downloaded the Crash games and I have no desire to play them, simply because they're 30fps.

InFamous: Second Son, Ghost of Tsushima, Days Gone, God of War or The Last Guardian supersampled at 60fps are truly something. Even Until Dawn was transformed.
PS5 games supersampled at 60fps on a VT50 or KRP:

[GIF]

Sorry, just needed to feel validated. And to spread Plasma Master Race™ awareness.
 

perkelson

Member
This thread is talking about a 1000 Hz LCD.

No, read the thread title. It is about a comparison between CRTs and LCDs on refresh rates/blur.

High-Hz LCDs are no longer just for gaming. A very common usage -- web browser scrolling -- looks much nicer at higher Hz. The 120Hz iPad experience, except better. There are non-gamers using 240Hz and 360Hz monitors now for motion-blur ergonomic reasons. Some people have motion blur nausea or get motion sick during scrolling, and higher Hz helps with that. This may not be you, but it helps a lot of others. It's a fast-growing segment of the audience.

Sorry, but you don't need more than 60hz at all to read STATIC content. And I seriously doubt you would even perceive blur in the first place, as LCD blur is usually not due to the refresh rate but to the panel itself and how the pixels switch.

I have a TV from 2007 (1080p), one of the top-tier Sony Bravias, and blur is basically non-existent, while I can still see plenty of blur on cheaper panels nearly 15 years later.
 

JeloSWE

Member
Sorry, but you don't need more than 60hz at all to read STATIC content. And I seriously doubt you would even perceive blur in the first place, as LCD blur is usually not due to the refresh rate but to the panel itself and how the pixels switch.
It's not quite that simple. In theory you could have a 1Hz screen and there would be no problem with reading static text, like on an e-book reader. BUT many LCD TVs and monitors use Pulse Width Modulation: basically, the backlight doesn't stay on constantly but flickers, similar to how CRTs and Black Frame Insertion work, where the screen spends part of each frame's refresh dark. This flickering can cause trailing image duplication on your retina as you move your eyes quickly over the screen, which is especially evident with thin white lines on black backgrounds. So for the best experience reading text you want no backlight flickering at all, and most modern work monitors use sample-and-hold with a static backlight.

The problem, however, is that sample-and-hold will produce motion blur on your retina if there is motion on screen and your eyes are following along, say a scrolling page of text. The only way to make text in motion not look blurry on a low-refresh-rate display is to use Black Frame Insertion on a gaming monitor or a TV in game mode. But if you use BFI on a 60Hz screen, it will flicker and cause eye strain after a while, as well as cause image duplication on the retina when you move your eyes quickly. To solve both of these problems you need a pretty high refresh rate with little to no BFI or PWM.
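To put rough numbers on the trade-off above, here is a simplified sketch (my own illustrative model, not a measurement): it assumes perceived blur scales with the fraction of each refresh the image is actually lit, which is why strobing/BFI trades blur for flicker.

```python
# Simplified model: blur width = pan speed x persistence,
# where persistence = (fraction of the refresh the image is lit) / refresh rate.

def blur_px(speed_px_s: float, refresh_hz: float, lit_fraction: float = 1.0) -> float:
    """Eye-tracking blur in pixels; lit_fraction = 1.0 means pure sample-and-hold."""
    return speed_px_s * lit_fraction / refresh_hz

speed = 960  # illustrative pan speed in px/s
print("60 Hz sample-and-hold :", blur_px(speed, 60), "px")        # blurry, but no flicker
print("60 Hz BFI, 25% lit    :", blur_px(speed, 60, 0.25), "px")  # sharper, but 60 Hz flicker
print("240 Hz sample-and-hold:", blur_px(speed, 240), "px")       # sharper AND flicker-free
```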
 

mdrejhon

Member
No, read the thread title. It is about a comparison between CRTs and LCDs on refresh rates/blur.



Sorry, but you don't need more than 60hz at all to read STATIC content. And I seriously doubt you would even perceive blur in the first place, as LCD blur is usually not due to the refresh rate but to the panel itself and how the pixels switch.

I have a TV from 2007 (1080p), one of the top-tier Sony Bravias, and blur is basically non-existent, while I can still see plenty of blur on cheaper panels nearly 15 years later.
First, you are right about static content, but scrolling is motion.

Text scrolling has motion blur, as seen in the animation www.testufo.com/framerates-text while eye-tracking the text. This blur is caused by display persistence, explained in the animation www.testufo.com/eyetracking (view this one before replying).

Next, did you know that I am the creator of TestUFO, and founder of Blur Busters? My research is cited in more than 20 peer reviewed research papers, including by TV manufacturers such as Samsung.

Finally, the thread title makes it very clear the person was indirectly referring to the motion article that I wrote, Blur Busters Law: The Amazing Journey to Future 1000 Hz Displays, the very article that makes people say LCDs need 1000fps at 1000Hz to match CRT motion clarity.
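As a rough illustration of where the 1000 Hz figure comes from (a simplified sketch of my own, assuming sample-and-hold persistence of one full refresh and a target of roughly one pixel of blur; the panning speeds are just examples):

```python
# Simplified sketch: refresh rate needed to shrink sample-and-hold persistence blur
# (pan speed x 1/Hz) down to roughly one pixel.

def required_hz(pan_speed_px_s: float, target_blur_px: float = 1.0) -> float:
    """Refresh rate at which persistence blur equals the target blur width."""
    return pan_speed_px_s / target_blur_px

for speed in (480, 1000, 2000):
    print(f"{speed:>4} px/s pan -> ~{required_hz(speed):.0f} Hz for ~1 px of persistence blur")
# Fast pans and scrolls land around 1000 Hz, hence "1000fps @ 1000 Hz" for CRT-like clarity.
```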

Also, you probably missed the full non-cropped thread title:

[screenshot of the full thread title]


What I do is important to those who easily get motion sick from games / motion blur / etc. Blur reduction helps these people a lot too. This is additional motion blur above and beyond source-based blur.

Not everyone is sensitive to everything. One person may be brightness sensitive, another may be sensitive to 3:2 pulldown judder (I worked in the home theater industry), yet others sensitive to motion blur from panning / scrolling / turning / etc.

You may not be sensitive, so if you are not, then do not worry about this thread. However, enough people are sensitive to this that it actually pays for an industry.

So you don't have to get more than 60 Hz if you don't need or want it, but what I am saying is still 100% scientifically proven and demonstrable, and many researchers agree with me.

To correctly understand the perspectives, there are great science explainers at www.blurbusters.com/area51 to catch up on.

We replace the “fake refresh rates” with real refresh rates that benefit humankind.

Sincerely,
Mark Rejhon
Founder of Blur Busters / TestUFO
 
Last edited:

nkarafo

Member
If you ever used a 240 Hz LCD, even web scrolling motion blur is reduced by 75%. That's less motion blur than a 120Hz iPad.
Can confirm.

I have a 240hz Dell monitor and using the smooth scrolling option in Firefox.

I can read the small-sized text while it's smoothly scrolling. At 60hz the text becomes a blurry mess. At 240hz it's very close to my CRT PC monitor at 85hz. Though it's not quite there; I can still see some artifacting/ghosting when the text is black and the background is white. But with dark themes it's basically the same.

After this, i can't go back to 60hz LCDs. Even 144hz ones aren't good enough.
 

Rat Rage

Member
First, you are right about static content, but scrolling is motion.

Text scrolling has motion blur, as seen in the animation www.testufo.com/framerates-text while eye-tracking the text. This blur is caused by display persistence, explained in the animation www.testufo.com/eyetracking (view this one before replying).

Next, did you know that I am the creator of TestUFO, and founder of Blur Busters? My research is cited in more than 20 peer reviewed research papers.

Finally, the thread title makes it very clear the person was indirectly referring to the motion article that I wrote, Blur Busters Law: The Amazing Journey to Future 1000 Hz Displays.

Also, you probably missed the full non-cropped thread title:

[screenshot of the full thread title]


What I do is important to those who easily get motion sick from games / motion blur / etc. Blur reduction helps these people a lot too. This is additional motion blur above and beyond source-based blur.

Not everyone is sensitive to everything. One person may be brightness sensitive, another may be sensitive to 3:2 pulldown judder (I worked in the home theater industry), yet others sensitive to motion blur from panning / scrolling / turning / etc.

You may not be sensitive, so if you are not, then do not worry about this thread. However, enough people are sensitive to this that it actually pays for an industry.

So you don't have to get more than 60 Hz if you don't need or want it, but what I am saying is still 100% scientifically proven and demonstrable, and many researchers agree with me.

Sincerely,
Mark Rejhon
Founder of Blur Busters / TestUFO

Nice to have first-hand information from people who are clearly in the know! Very much appreciated!
 

Murdoch

Member
Sorry, just needed to feel validated. And to spread Plasma Master Race™ awareness.

*Raises glass*

I "Downgraded" from an Xbox One X and a 4K monitor to a Panasonic TX-P42GT60B and Series S a few months ago. To call it a generational leap forwards is my personal understatement of the decade. It's absolute madness that marketing boffins have had us all fooled for the past 10 years. Although to be fair we've been using LCD for that long I can only presume the abhorrent blacks and LCD smear have become second nature.
 

Ulysses 31

Member
Although, to be fair, we've been using LCD for so long that I can only presume the abhorrent blacks and LCD smear have become second nature.
Could also be that you haven't been keeping up with the more modern LCD displays with VA panels, which don't have these issues, or at least have them far less than 5-10 years ago. :lollipop_grinning:
 

mdrejhon

Member
Could also be that you haven't been keeping up with the more modern LCD displays with VA panels, which don't have these issues, or at least have them far less than 5-10 years ago. :lollipop_grinning:
While VA improved massively, I still don’t quite like most VA panels from a ghosting perspective...
(They have their pros and cons, though)

...but from a motion-resolution point of view, I really like the strobed versions of the new FastIPS panels that were released in 2020.

These new panels are easier to calibrate with a low-persistence strobe with zero double images, and can beat a Sony FW900 CRT in motion clarity at some settings, if they are well-calibrated at the factory. Like the ViewSonic XG270 and XG2431 I worked on.

Another amazing LCD panel is the Oculus Quest 2 VR LCD, which has a true, real-world measured 0.3ms MPRT (not just GtG). The two pixel response benchmarks are different (Pixel Response FAQ: GtG versus MPRT).
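To illustrate the distinction between the two benchmarks, here is a toy sketch of my own (assuming an idealized panel with instant GtG; the duty-cycle number is purely illustrative, not a Quest 2 spec):

```python
# Toy illustration of GtG vs MPRT:
#  - GtG measures how fast a pixel finishes changing from one level to another.
#  - MPRT measures how long each frame stays visible (persistence), which drives blur.

def mprt_ms(refresh_hz: float, lit_fraction: float = 1.0) -> float:
    """Approximate MPRT in milliseconds for an idealized panel with instant GtG."""
    return 1000.0 * lit_fraction / refresh_hz

print("120 Hz sample-and-hold, 0 ms GtG:", round(mprt_ms(120), 1), "ms MPRT")          # ~8.3 ms
print("120 Hz strobed at 3.6% duty     :", round(mprt_ms(120, 0.036), 1), "ms MPRT")   # ~0.3 ms
# Even a hypothetical 0 ms GtG panel still has persistence blur; only shortening
# persistence (strobing, or a much higher refresh rate) brings MPRT down.
```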
 
Last edited:

GuinGuin

Banned
My OLED has minimal native motion blur, so I always turn it ON when given the option. It makes 60fps games look more natural, with less of a Soap Opera Effect.
 

svbarnard

Banned
480hz monitors are coming soon.
https://blurbusters.com/expose-stea...e-oleds-by-boe-china-surprising-blur-busters/

So the old CRT TVs had essentially zero motion blur, whereas modern-day LCDs/OLEDs have a lot of it, and according to Blur Busters the only way to fix this is 1,000 frames per second. Believe it or not, it doesn't stop there: we are eventually going to need 10,000fps. Why? Because there is a screen artifact called the phantom array effect that doesn't go away until you hit 10,000 frames per second. That means we would need a computer capable of running a game at 10,000fps and a screen capable of displaying a 10,000hz refresh rate (I think this will take 50 years or more to accomplish). Look at the screenshot; that's the founder of Blur Busters, posting on the Blur Busters forum.

[screenshot of a Blur Busters forum post]
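For a sense of why the phantom array effect pushes the requirement so much higher than the blur argument alone, here is a rough sketch (my own simplified model; the eye-sweep speed is illustrative and the real threshold depends on viewing conditions):

```python
# Rough sketch of stroboscopic / phantom-array spacing: during a fast eye movement
# across a flickering or strobed image, each refresh leaves a discrete copy,
# spaced by (eye sweep speed) x (time between refreshes).

def phantom_gap_px(eye_speed_px_s: float, refresh_hz: float) -> float:
    """Distance in pixels between successive phantom-array copies."""
    return eye_speed_px_s / refresh_hz

eye_speed = 10_000  # illustrative fast eye sweep across a large, dense screen (px/s)
for hz in (120, 1_000, 10_000):
    print(f"{hz:>6} Hz -> copies ~{phantom_gap_px(eye_speed, hz):.1f} px apart")
# The copies only merge to about one pixel apart once the refresh rate reaches
# several thousand Hz, which is where the 10,000fps figure comes from.
```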
 
Last edited: