
Old CRT TVs had minimal motion blur; LCDs have a lot of motion blur. LCDs will need 1,000fps @ 1,000Hz in order to have motion blur as minimal as CRT TVs

JeloSWE

Member
Mar 23, 2015
544
851
550
I will try playing in Movie mode to see if there is any difference. I suppose input lag will be one...

Edit: no improvement as far as I'm concerned; I still get a very noticeable doubling of the picture when rotating the camera in games.

If anyone has suggestions for a 32-inch TV with good motion clarity, know that I am interested...
I'm guessing The Frame is not really meant as a media screen, more like a painting on the wall. So I'm not surprised Samsung has skimped on the PWM implementation even in Movie mode.
 

cireza

Member
Jun 1, 2014
8,039
6,553
700
I'm guessing The Frame is not really meant as a media screen, more like a painting on the wall. So I'm not surprised Samsung has skimped on the PWM implementation even in Movie mode.
It had a decent review on rtings for gaming.
 

JeloSWE

Member
Mar 23, 2015
544
851
550
It had a decent review on rtings for gaming.

https://www.rtings.com/tv/reviews/samsung/the-frame-2020#page-test-results
Flicker-Free: NO
Frequency: 240hz
The Frame 2020 uses PWM to dim its backlight. It normally flickers at 240Hz, but the frequency drops to 120Hz when Picture Clarity is enabled, even if you don't adjust the Blur and Judder reduction sliders. Enabling LED Clear Motion lowers the flickering frequency further to 60Hz.
Note that every picture mode except 'Movie' mode has Picture Clarity enabled by default.


240 Hz PWM is definitely better than 120 Hz, but it might still induce image duplication; you'd want at least 480 Hz, and more if possible, to reduce that effect.

The way they do the PWM strobing seems to create a lot of visible duplication.

Here is how my Sony Z9F looks with its 720 Hz PWM:
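As a rough back-of-the-envelope model of why higher PWM frequencies look smoother, here's a minimal sketch (my own illustration, not from any of the posts above), assuming the eye tracks the motion perfectly and each backlight flash leaves one copy of the currently displayed frame on the retina:

```python
# Back-of-the-envelope sketch: duplicate images from a PWM backlight.
# Assumes perfect eye tracking; each backlight flash "freezes" one copy of the
# currently displayed frame, so copies land wherever the eye is at flash time.

def pwm_duplicates(pan_speed_px_per_s: float, content_fps: float, pwm_hz: float):
    """Return (copies shown per unique frame, gap between copies in pixels)."""
    copies_per_frame = pwm_hz / content_fps   # how often each frame gets re-flashed
    gap_px = pan_speed_px_per_s / pwm_hz      # eye travel between consecutive flashes
    return copies_per_frame, gap_px

if __name__ == "__main__":
    speed = 960.0  # panning speed in pixels/second (arbitrary example value)
    for pwm in (120, 240, 480, 720):
        copies, gap = pwm_duplicates(speed, content_fps=60, pwm_hz=pwm)
        print(f"{pwm:>4} Hz PWM: {copies:.0f} copies per frame, {gap:.1f} px apart")
    # Higher PWM -> more, closer-spaced copies that blend together, which is why
    # 480/720 Hz looks smoother than 120/240 Hz on the same 60fps content.
```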
 

dolabla

Member
Oct 9, 2013
5,875
12,688
1,005
I hope one day we can get there again. CRT had its faults (weight, etc.), but in other ways it was definitely superior to anything we have today. I sure hope we get a true successor one day.
 
  • Like
Reactions: chilichote

cireza

Member
Jun 1, 2014
8,039
6,553
700
240 Hz PWM is definitely better than 120 Hz, but it might still induce image duplication; you'd want at least 480 Hz, and more if possible, to reduce that effect.

The way they do the PWM strobing seems to create a lot of visible duplication.

Here is how my Sony Z9F looks with its 720 Hz PWM:
Yes, I had seen their picture and the duplication was very noticeable. The second one you posted is really blurry... I guess I can't be satisfied by any HD TV :(
 

JeloSWE

Member
Mar 23, 2015
544
851
550
Yes, I had seen their picture and the duplication was very noticeable. The second one you posted is really blurry... I guess I can't be satisfied by any HD TV :(
They share the same blur distance, but mine is smoother due to the higher PWM frequency. It's a good thing. The only thing that could remedy the blur is BFI.
 

sunnysideup

Member
Nov 11, 2018
837
1,314
430
Every TV tech is made with form factor as priority number one.

I wonder what would happen if they threw that out of the window and put picture quality above all, disregarding whether it's clunky or has a large power draw.

I don't care if it's huge, weighs a lot, and gives me huge electric bills. I just want a good moving picture.
 

JeloSWE

Member
Mar 23, 2015
544
851
550
Every TV tech is made with form factor as priority number one.

I wonder what would happen if they threw that out of the window and put picture quality above all, disregarding whether it's clunky or has a large power draw.

I don't care if it's huge, weighs a lot, and gives me huge electric bills. I just want a good moving picture.
Cost-effectiveness, not form factor. If you throw money at it (which they do, in a way, when they build one-off proof-of-concept prototypes to test and demonstrate new tech), then you can get really spectacular TVs, such as Sony's 10,000-nit 8K 120 Hz HDR display. And you can in theory already build a microLED TV; it's just that it's impossible to mass-produce them yet.
 

namekuseijin

Nintendo is for soys and kids. But my tears about them are for men.
Jun 10, 2020
2,793
3,051
615
CRT TVs were indeed blurry by default. Games got a natural AA solution for free.

CRT TVs also only had around 10% (or fewer) of the pixels to move around. So pick your fights.
 

chilichote

Member
Jan 6, 2020
610
1,867
385
Germany
I hate watching movies and playing games on LCDs. I would buy a modern CRT with moderate power consumption instantly if I could.
 
  • Like
Reactions: cireza

n0razi

Member
Nov 8, 2011
4,945
821
900
I'm sure CRT had less motion blur than LED or OLED, but that's only one aspect of image quality. For sure, the image quality of my 65" LG CX would seem like science fiction to my past self.


This

Motion clarity is only one part of overall image quality... I will take the numerous other benefits of a modern OLED
 

mdrejhon

Member
Aug 17, 2013
121
145
500
Toronto, Ontario
www.blurbusters.com
This

Motion clarity is only one part of overall image quality... I will take the numerous other benefits of a modern OLED
LG CX is the cat bean’s indeed.

That said, I've seen some high-end LCDs matching the LG CX: mainly five-figure-priced or high four-figure-priced televisions. Nanosys Quantum Dot phosphors have a similar (or better) color gamut than many OLEDs, and putting a 100,000-LED FALD backlight behind the panel largely solves the LCD blacks problem while having smaller bloom than a CRT electron-beam dot.

There are a lot of pieces to the puzzle of why an LCD can be massively worse than OLED, but there are engineering paths to equalize nearly all attributes of LCD with OLED; it's a matter of cost-effectiveness for the package deal.

The small size of VR displays makes this easier, so commercializing an LCD with OLED-quality blacks and OLED-quality colors is much more feasible there; perhaps a Quest Pro, Index Pro, or Index 2?
 
  • Like
Reactions: NeoIkaruGAF

mdrejhon

Member
Aug 17, 2013
121
145
500
Toronto, Ontario
www.blurbusters.com
240 Hz PWM is definitely better than 120 Hz, but it might still induce image duplication; you'd want at least 480 Hz, and more if possible, to reduce that effect.
120fps at 120Hz PWM is more eye-friendly than framerate-mismatched PWM. My eyes can get bothered by 864 Hz PWM that is not synchronized to the refresh rate, but they aren't bothered by a motion blur reduction strobe backlight (proper one-flash-per-refresh PWM, which is more eye-friendly than unsynchronized PWM).

60 Hz PWM is way too low and flickery, but some people like it.

In fact, it can be advantageous to lower the refresh rate to match the frame rate, so you can have the stroberate=refreshrate=framerate match necessary for the CRT motion effect. (CRT is similar to PWM, but with phosphor fade at the end, so it's more of a triangular wave than a square wave on an oscilloscope graph.)

Nonetheless, users should have a choice of PWM frequency, but if there *HAS* to be PWM, matching PWM=framerate=Hz is usually the most eye-friendly. So if your display can't do 240Hz, you'd prefer PWM-free over 240Hz PWM at 120Hz.
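To make the synchronized-vs-unsynchronized distinction concrete, here's a tiny helper (my own illustration, reusing the 864 Hz example above; not Blur Busters code):

```python
# PWM is "synchronized" when the backlight flashes an exact integer number of
# times per displayed frame -- ideally exactly once (strobe/BFI-style).
def pwm_sync(pwm_hz: float, framerate: float) -> str:
    flashes = pwm_hz / framerate
    if abs(flashes - round(flashes)) > 1e-6:
        return f"{pwm_hz:g} Hz PWM at {framerate:g} fps: unsynchronized ({flashes:.2f} flashes/frame)"
    if round(flashes) == 1:
        return f"{pwm_hz:g} Hz PWM at {framerate:g} fps: one flash per frame (most eye-friendly per the post above)"
    return f"{pwm_hz:g} Hz PWM at {framerate:g} fps: {round(flashes)} flashes per frame (duplicate images likely)"

if __name__ == "__main__":
    for pwm, fps in ((864, 60), (120, 120), (240, 120), (720, 60)):
        print(pwm_sync(pwm, fps))
```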
 
  • Like
Reactions: NeoIkaruGAF

balgajo

Member
Aug 31, 2013
2,002
97
605
Brazil
One final thing. A lot of games have motion blur turned on in the settings (Doom 2016 and Doom Eternal, for example). Go into the settings and, if you see "motion blur", turn it the fuck off! LCDs/OLEDs already have motion blur, so why would you want more? Why game developers do that is beyond me; modern flat-panel displays already have a lot of motion blur as it is, so adding more on top is crazy.
Games do that to simulate the smoothness of a real-world camera lens. They're not quite there yet, but nowadays motion blur is much better than it was some time ago.

Also, I don't see how a 1000fps display would improve movies, as they already come with natural motion blur.
 

mdrejhon

Member
Aug 17, 2013
121
145
500
Toronto, Ontario
www.blurbusters.com
Games do that to simulate the smoothness of a real-world camera lens. They're not quite there yet, but nowadays motion blur is much better than it was some time ago.
As I wrote in my earlier post, motion blur is useful to make low frame rates comfortable.
Motion blur is caused by the camera shutter and can be comfortable.

Also, I don't see how a 1000fps display would improve movies, as they already come with natural motion blur.
I work in the industry.

I recognize the correct & wrong uses of motion blur.

You probably misunderstood this thread -- It is not for Hollywood. It's a "Right tool for Right Job".

1000fps is for reality simulation or for matching CRT motion clarity video material.

It's wrong to have extra source based motion blur in a Star Trek Holodeck.

Examples:
- Virtual Reality, such as Oculus Rift, Quest 2, HTC Vive, Valve Index, PSVR, etc.
- Video games at high frame rates (e.g. CS:GO)
- Simulator rides (they're more realistic feeling when removing external motion blur)
- Sports video
- Fast action GoPro / POV camera stuff
- Mimicking CRT motion clarity with no form of flicker/phosphor/strobe/BFI.
- Etc.

For reality simulation, you only want the motion blur generated by the human brain. You NEVER want additional source-based motion blur or display-based motion blur when trying to simulate real life. So you need blurless displays & blurless sources for a lot of applications.

The ONLY motion blur you want in Virtual Reality should be the motion blur generated inside your human brain! NEVER in the source, NEVER in the display.
Full stop -- scientifically proven, researcher proven -- just ask any VR researcher.

However, movies are different. I love 24fps for all the classics, and I'm a fan of Hollywood Filmmaker Mode on TVs. But 24fps is not a one-size-fits-all screwdriver for all motion material. I've also seen 1000fps on 1000Hz too, on prototypes.

The problem is that CRT used flicker to eliminate motion blur, and the only way to eliminate motion blur flickerlessly is retina refresh rates. Real life does not flicker. Real life does not strobe. Flicker-to-eliminate-blur (like CRT) is a humankind band-aid that won't work for a Star Trek Holodeck. A true real-life-matching Holodeck should never flicker.

TL;DR:
1. To eliminate motion blur at low frame rates: you need to flicker (like CRT, plasma, strobe modes, BFI).
2. To eliminate motion blur without flicker: it becomes mandatory to use ultra-high refresh rates & frame rates (blurless sample-and-hold); see the quick calculation sketched below.
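A quick back-of-the-envelope calculation for point 2 (example numbers are mine, using the common approximation that flickerless sample-and-hold blur roughly equals the eye-tracking distance per displayed frame):

```python
# Approximate perceived blur on a flickerless sample-and-hold display:
# roughly the distance the tracking eye travels during one displayed frame.
# (Rule-of-thumb model; ignores pixel response time and content differences.)

def sample_and_hold_blur_px(pan_speed_px_per_s: float, fps_equals_hz: float) -> float:
    return pan_speed_px_per_s / fps_equals_hz   # eye travel per displayed frame

if __name__ == "__main__":
    speed = 1000.0  # panning speed in pixels/second (arbitrary example)
    for hz in (60, 120, 240, 1000):
        blur = sample_and_hold_blur_px(speed, hz)
        print(f"{hz:>4} fps @ {hz} Hz sample-and-hold: ~{blur:.1f} px of motion blur")
    # ~16.7 px at 60 Hz shrinks to ~1 px at 1000 Hz -- the "1,000fps@1,000Hz"
    # figure in the thread title, without any flicker or strobing.
```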

Right now, all VR headsets flicker because that's the only way to eliminate motion blur without unobtainium frame rates & refresh rates. We are stuck with that technique for now, despite the fact that real life doesn't flicker, even though we're trying to make a display mimic real life.

Also, games played at really low frame rates, such as 20 or 25 frames per second, can often look better when you enable the motion blur setting. But once games run at super high frame rates doing reality simulation (VR), adding source-based motion blur becomes a burden on reality emulation. When you combine the two, it actually creates headaches/migraines, because it forces extra blur above and beyond the motion blur your brain naturally generates. It's no longer a planar display experience (movie) but an immersed experience (VR), where the rules of the blur ballgame change dramatically. Motion blur settings in games are a personal preference; some people depend on them for specific reasons (e.g. really low frame rates on the GPU, or being really sensitive to phantom array effects).

For more educational reading, see motion animations here.

Right Tool For Right Job!
 

NeoIkaruGAF

Member
Dec 8, 2019
1,540
2,762
505
mdrejhon, thanks a lot for your insight! This thread is amazing thanks to your contribution.

I have a question: what is the best solution on modern displays to achieve good motion with low framerate content?

Like I said before, movies and sub-60fps games just look deeply unnatural to me on a LG C9, even to the point of making me physically troubled at times (headaches, nausea). BFI on this particular TV doesn’t do much to mitigate the problem because the flickering is just too noticeable and introduces a whole other set of visual fatigue symptoms.

For movies, some mild motion interpolation can be good without getting into soap-opera look territory. What’s amazing is how not every movie or TV program moves the same with the same motion interpolation settings. I’ve noticed I can get a bit more aggressive with it with modern movies (typically, 3 on the LG ”dejudder” settings), while older movies may look better by lowering dejudder to 2. Old hand-drawn cartoons just look better without any interpolation at all.

For gaming this isn’t really an option.
 

JeloSWE

Member
Mar 23, 2015
544
851
550
120fps at 120Hz PWM is more eye-friendly than framerate-mismatched PWM. My eyes can get bothered by 864 Hz PWM that is not synchronized to the refresh rate, but they aren't bothered by a motion blur reduction strobe backlight (proper one-flash-per-refresh PWM, which is more eye-friendly than unsynchronized PWM).
I have to disagree somewhat from personal testing and experience, though the PWM frequencies I've used have been multiples of the frame rate and not an irregular number, of course. I've had 3 Samsungs that do either 480 Hz in Movie mode or 120 Hz in all other modes, including Game mode. And my Sony Z9F always does 720 Hz unless you enable the 120 Hz BFI option.

To me, when panning the camera in games or just moving my eyes and mouse around my PC desktop, higher PWM is preferred: the image duplicates that result from, say, 60fps content on a 120 Hz PWM screen, on both those Samsungs and my Sony, create very obvious and distracting duplication. Whereas 480 Hz, and even better 720 Hz, PWM creates so many duplicates that they start to merge together and almost look like a static backlight implementation, unless you have thin white lines on a black background. I would say I even find higher PWM calmer than 120 Hz, which isn't completely flicker-free.

The best TV I had was my previous Sony 55W905A, as it didn't use PWM: no flicker, and any in-game panning looked so smooth. God, I hate PWM; they should really just let the backlight stay on, or run it at 3000 Hz. When I flicked my eyes around the screen on the old Sony versus the new one and the Samsungs, it looked so stable and calm; now it always feels a bit uneasy, especially at 120 Hz PWM, which I presume is due to the Phantom Array Effect.
 

svbarnard

Member
Mar 17, 2016
155
141
390
What's even worse is that the mouse trail effect gets fainter as the refresh rate goes up, but it gets a lot more distracting since you can still see it, and more of it.


Mr. Rejhon, I have some more questions for you.

1. So you know how in some games (such as Doom Eternal or COD Modern Warfare 2019), if you go into the settings, you will see "motion blur" as an option you can turn on or off? Well, Digital Foundry did a video three years ago explaining why game developers do that, why they add motion blur to video games. The guy in the video says that motion blur is added to modern video games to "hide gaps between frames and be visually continuous"; he also says things like "gaps in the motion between the frames". Here is a screenshot from the video; this apparently is what he's talking about when he says gaps between the frames and that they add motion blur to video games to help smooth that out.

Does this guy know what he's talking about? What is he talking about exactly? Is that visible gap, pictured in the image above, produced by the stroboscopic effect, AKA the stepping mouse effect? Here's the link to the video


2. The phantom array effect / stroboscopic effect / stepping mouse effect: can you explain to the people here exactly what it is?

3. What's the difference between ghosting and the stroboscopic effect? Are they the same thing?

4. How noticeable is the stepping mouse effect at 1,000fps@1,000Hz? Is it annoyingly horrible at that frame rate or what? Or is it better at that frame rate?

As usual thanks for your insight. You're truly a genius at what you do!
 
  • Like
Reactions: mdrejhon

Tygeezy

Member
Sep 28, 2018
1,114
1,113
465
Yes, it's impressive how they've made VR headsets more comfortable to view than cinema 3D glasses. Less blur than a plasma, less blur than a CRT, less blur than a 35mm projector. And far less nauseating 3D.

And even beyond that, native rendered 3D environments are even better than watching filmed content -- you can even lean down and look underneath a virtual desk, or crouch to crawl under a virtual desk, or walk around a virtual museum statue. Many VR apps have perfect 1:1 sync between VR and real life -- they're not the dizzying roller coaster experiences.

Ease of use has really improved things there too. 50% of the Oculus Quest 2 market are non-gamers, and these models are so easy that a boxed Quest 2 was mailed to a nursing home (some are literally almost jail-like during COVID times), and the 70-year-old was playing VR the same day with no help from the nursing home staff! They skip the rollercoaster apps and just download the comfortable VR apps (Oculus has "Comfort" ratings in their app store). For those who "hold their nose and buy despite Facebook", these super-easy standalone VR headsets are perfect for the hospital bed-ridden or the locked-down individual; the Quest 2 is a gift from heaven for these people.

Vacationing by sitting in a boring residence chair that actually becomes a simulated porch chair of a virtual beachfront house viewing out to the virtual seas and virtual palm trees -- teleporting yourself virtually (this is a real app, it's called the "Alcove" app, with Settings->Beachfront). And being able to play chess on an actual virtual table with a remote family member sitting across the table. And since the Quest 2 has a built in microphone, you're talking to each other in the same room despite both family members living in two different countries. This non-gamer app, "Alcove", is an actual app you can download in the in-VR app store on Quest 2, no computer or phone needed.

Helping the framerate=Hz experience of CRT motion clarity in VR (without a computer needed): the Quest 2 has a built-in 4K-capable GPU, a fan-cooled Snapdragon XR running at a higher clockrate than smartphones (but the fan is so quiet you don't hear it, with the hot air blowing out of a long 1mm slit at the top edge). A standalone, computerless Quest 2 VR headset has 3D graphics as good as a GTX 1080 from five years ago -- that's downright impressive for $299, playing high-detail standalone VR games such as Star Wars: Tales From The Galaxy's Edge (an FPS), playing lazy seated/bed games like Down The Rabbit Hole (like a Sierra Quest or Maniac Mansion game, except it's a sidescrolling, dollhouse-sized, true-3D VR), or playing exercise games (Beat Saber) that burn more calories for less than an annual gym membership. While many hate Facebook, many are holding their noses and buying up Quest 2's as if they're RTX 3080's, because they're the closest thing to a Star Trek Holodeck you can get today. John Carmack did an amazing job on that VR LCD screen.

And in an Apple-style "One More Thing", it can play PCVR games wirelessly now. So you can play PC-based Half Life Alyx too. Basically a cordless VR headset for your gaming PC. As long as you have a WiFi 5 or WiFi 6 router in the same room (preferably a 2nd router dedicated to just Quest 2) and a powerful RTX graphics card in the PC, it becomes like a "Wireless HDMI" connection (through nearly lossless H.EVC compression at up to triple-digit megabits per second -- pretty much E-Cinema Digital Cinema bitrates -- no compression artifacts at max settings, and only 22ms latency, less lag than many TVs, at a full frame rate of 90fps 90Hz), to the point where it feels like a wireless version of HDMI. So you can play PCVR as well as in-VR apps (using the in-headset Snapdragon GPU), so you have PC operation (using PC GPU) *and* portable operation (using built-in Snapdragon GPU). Quite flexible.

And while not everyone does this, it can also optionally double as a strobed gaming monitor with Virtual Desktop, and it is even capable of optional 60 Hz single-strobe (for playing 60 years of legacy 60fps 60Hz content via PC). Virtual Desktop can even display a virtual room with a virtual desk and a virtual computer, and support is being added (in the next few months) for a Logitech keyboard, to map a 3D-rendered VR keyboard to the same physical location as your Logitech keyboard! Not everyone would use VR this way, but it's one unorthodox way to get CRT emulation, since this LCD is so uncannily good: zero motion blur, zero ghosting. You'd wear your VR headset seated at your physical computer, but you'd be staring into a virtual CRT-motion-clarity display instead (in a rendered virtual office).

Now, obviously, some of this is an unconventional optional use -- but consider you get everything of the above for just $299: A desktop gaming monitor, a PCVR headset, a standalone VR headset, a virtual IMAX theatre, a virtual vacation, and a built-in battery powered fan-cooled GPU more powerful than a $299 graphics card of 5 years ago. All in one. Even just one or two of the uses, just pays for the headset itself, alone -- especially if you're in a COVID jail (quarantine).

Certainly isn't grandpa's crappy "Google Cardboard" toy VR. People who only have experience with 60 Hz LCDs don't know what they're missing with these formerly science fiction CRT-beating LCDs now already on the market, being made out of necessity for VR headsets. Zero blur, zero phosphor trails, zero crosstalk, zero double image effect, it's just perfect CRT motion clarity. For those who don't want Facebook, there's the Valve Index, but it's not standalone.

*The only drawback is LCD blacks. However, I saw a FALD VR prototype -- MicroLED local dimming with thousands of LEDs -- that will allow inky-blacks in future headsets by ~2025. But at $299, I didn't bother to wait.


Hello,
Your first reply here is blank; did you mean to say something in regard to my explanation of low-Hz strobing on a high-Hz-capable panel?

Now in regards to your second reply with two new questions:

1. I believe 120 Hz HFR will standardize for about a decade. 120 fps at 120 Hz is high enough that strobing (flicker) is not too objectionable for an optional motion blur reduction mode. I think 8K will raise refresh rates before 16K becomes useful. 16K is useful for VR, but I think external direct-view TVs don't really need to go beyond 8K in the consumer space. Jumbotrons, cinemas, and video walls will still find it useful to go 16K, to be retina for the first row of seats, for example.

Retina frame rates will be used in specialty venues later this decade. There is an engineering path to 8K 1000fps 1000Hz display which can be achieved using today’s technologies. For content creation in this sphere, more information can be found in Ultra High Frame Rates FAQ. A new Christie digital cinema projector is currently capable of 480 Hz already!

I am working behind the scenes for 1000 Hz advocacy; it is definitely not unobtainium — at least for industrial / commercial / special venue purposes. Some of them can afford rigs with 8 RTX GPUs built into them, for example, running specialized 1000fps software. I was behind the scenes in convincing Microsoft to remove the 512 Hz Windows limitation, and now have a Windows Insider build capable of 1000 Hz.

However, for retail televisions, they will be stuck at 120 Hz HFR for a long time due to video industry inertia. Streaming is taking over cable/broadcast, and will be sticking to Hollywood MovieMaker modes.

However, there is already a hack to run YouTube videos at 120fps and 240fps HFR by installing a Chrome extension and playing 60fps videos at 2x or 4x speed. You record 240fps 1080p video, upload it to YouTube as 60fps, then play it back on a 240 Hz gaming monitor using the Chrome extension at 4x playback speed. So you can do 240fps 1080p HFR today with just your own smartphone camera! (Many new smartphones support 240fps slo-mo, which can be converted to 240fps real-time HFR video.)

So the workflow isn’t that expensive to do, technology is already here today, and 1000 Hz isn’t going to be expensive technology in the year 2030s+. Specialized/niche, yes. But definitely not unobtainium as 1000 Hz is already running in the laboratories which I’ve seen.

Ultimately, by the end of the century, there'll be legacy frame rates (24fps, 50fps, 60fps) and retina frame rates (1000fps+). 120fps HFR is just a stopgap in my opinion. There is somewhat of a nausea uncanny valley, to the point that 1000fps at 1000Hz is less nauseating than 120fps at 120Hz (and fixes some aspects of the Soap Opera Effect). 1000fps is a superior zero-motion-blur experience that has no flicker and no stroboscopics. Better window effect. Fast motion exactly as sharp as stationary video! As long as it is 1000fps+ native frame rates, or good perceptually-lossless, artifact-free "artificial intelligence"-interpolated video, or other properly modern frame rate amplification technology.
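For anyone who wants to try the 240fps YouTube workflow quoted above, the retiming step can be scripted around ffmpeg. This is only a sketch of one way to do it (filenames are hypothetical), and it assumes the source clip's container is genuinely tagged at 240 fps real-time; phone slo-mo files are often already conformed to 30 or 60 fps, in which case the stretch factor must change.

```python
# Sketch: stretch a real-time 240 fps clip to quarter speed so every frame fits
# into a 60 fps upload; the Chrome-extension 4x playback then restores real time.
import subprocess

def retime_240_to_60(src: str, dst: str) -> None:
    subprocess.run([
        "ffmpeg", "-i", src,
        "-vf", "setpts=4.0*PTS",   # 4x longer timestamps = quarter speed, keeps all frames
        "-r", "60",                # encode/declare the output as 60 fps
        "-an",                     # drop audio (it would be out of sync after retiming)
        dst,
    ], check=True)

if __name__ == "__main__":
    retime_240_to_60("clip_240fps.mp4", "clip_for_youtube_60fps.mp4")  # hypothetical names
```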
I'd also like to point out that even if a game isn't graphically impressive, it sure looks a whole heck of a lot more impressive when you're actually in the game, everything is 3D, and you can interact with the environment. Things that are tedious in normal games, such as manually looting everything, all of a sudden become fun in VR. A lot of really popular and fun games are indie titles, not graphical powerhouses, anyway.

If someone could make a survival game in the vein of Valheim on Quest, which is certainly doable on the Quest 2 hardware, it would be amazing.
 

mdrejhon

Member
Aug 17, 2013
121
145
500
Toronto, Ontario
www.blurbusters.com
If someone could make a survival game in the vein of Valheim on Quest, which is certainly doable on the Quest 2 hardware, it would be amazing.

You can play Valheim in VR, using the headset like a stereoscopic 3D desktop monitor:

While it is not true VR, it is more immersive than normal. Check out Playing Valheim in VR with VorpX, using your VR headset as a stereoscopic 3D desktop gaming monitor that also happens to have CRT motion clarity.


You said you had a Quest 2. Basically, you'd use your VR headset as a goggles display for your PC, to view your Windows desktop or play non-VR PC games. You view your PC screen remotely via BigScreen or Virtual Desktop (with the refresh rate configured to match the Quest 2 headset), and use your VR controllers like a gamepad. It works today, and it produces better motion clarity than some desktop gaming monitors, thanks to the Quest 2's superior motion-clarity LCD.

VR headsets can double as a stereoscopic 3D television/monitor/display (using the VR headset like NVIDIA 3D Vision) with better graphics quality than NVIDIA 3D Vision. No shutter glasses; both eyes get images at the same time.


I have a question: what is the best solution on modern displays to achieve good motion with low framerate content?

Like I said before, movies and sub-60fps games just look deeply unnatural to me on a LG C9, even to the point of making me physically troubled at times (headaches, nausea). BFI on this particular TV doesn’t do much to mitigate the problem because the flickering is just too noticeable and introduces a whole other set of visual fatigue symptoms.

For some people, low frame rates are ergonomically SOL

Unfortunately, low frame rates are hugely problematic for some people from an ergonomic POV; if you find low frame rates nauseating, you've got major pick-your-poison trade-offs.
  1. For the motion-blur-sensitive person (motion blur headaches), low frame rates on sample-and-hold can be nauseating.
  2. For the flicker-sensitive person (flicker headaches), low frame rates during strobing can be nauseating, since they either visibly flicker or generate double-image effects.
  3. For people who are both motion-blur-sensitive and flicker-sensitive, it becomes even more difficult (damned if I do, damned if I don't).
We have 60+ years of legacy 60fps 60Hz content, which means we are stuck with band-aids such as strobing (flicker problem), sample-and-hold (flickerless but has blur), or interpolation (artifacts, lag).

It's very hard for people who are both flicker-sensitive (headaches) and motion-blur-sensitive (headaches); many people with vision sensitivities couldn't even watch a CRT because of flicker sensitivity, and others can't watch an LCD because of display-derived motion blur sensitivity.

Artificial-Intelligence Interpolation Without Artifacts

Currently, individuals who get low-framerate nausea have to swallow interpolation as a band-aid if they really hate flicker.

Long term, for legacy video content (legacy recordings that don't have 3D information or Z-buffer information), artificial-intelligence interpolation is probably the best bet. Basically, smart artificial intelligences (such as GPT-3 or better) become realtime artists that recreate the "fake" in-between images much more perfectly, like a professional Photoshopper, except done in real time, artfully recreating original-indistinguishable extra frames at hundreds of frames per second.


The early beginnings of greatly improved AI interpolation (such as DAIN) without parallax artifacts (the AI smartly "Photoshops" the parallax artifacts away) have already arrived, and it will gradually keep getting better until interpolation looks perfectly flawless.

Today, if you're one of those who get lots of headaches/nausea from low frame rates, you can take your 24fps content, download DAIN, run the process overnight (DAIN is compute-intensive, not real time yet), and play your near-perfectly interpolated videos the next day. A lesser PC-based video-file-converting technique would be Smooth Video Project, but it has way more artifacts and Soap Opera Effect, though its strength is adjustable for PC content, just like a TV's low-interpolation setting.

Ideally, AI interpolation needs to be a realtime easy toggle, but you can roll your own non-realtime AI interpolation solution today.
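Purely for illustration, the bare mechanics of "inserting in-between frames" can be sketched with OpenCV. This naive 50/50 blend doubler is nothing like DAIN's motion-aware synthesis (it will ghost on fast motion); it only shows where the extra frames go, and the filenames are hypothetical.

```python
# Naive frame-rate doubler: writes a 50/50 blend between consecutive frames.
# Illustrative only -- DAIN/SVP estimate motion instead of blending, which is
# why their results look far better. Requires: pip install opencv-python
import cv2

def double_frame_rate(src: str, dst: str) -> None:
    cap = cv2.VideoCapture(src)
    fps = cap.get(cv2.CAP_PROP_FPS)
    size = (int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
            int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)))
    out = cv2.VideoWriter(dst, cv2.VideoWriter_fourcc(*"mp4v"), fps * 2, size)
    ok, prev = cap.read()
    while ok:
        ok, cur = cap.read()
        out.write(prev)                                         # original frame
        if ok:
            out.write(cv2.addWeighted(prev, 0.5, cur, 0.5, 0))  # inserted in-between frame
            prev = cur
    cap.release()
    out.release()

if __name__ == "__main__":
    double_frame_rate("movie_24fps.mp4", "movie_48fps.mp4")  # hypothetical filenames
```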

Other Real Time Low-Lag Techniques For Low-Frame-Rate Game Content:

Currently, many of us just get a faster GPU / reduce detail / reduce framerate-impacting settings like excess AA / etc. But sometimes this is not possible or affordable. Some TVs have a lower-lag interpolation mode available for game mode (e.g. the one Samsung first put in the NU8000 series a while back, called "Game Motion Plus").

Others use band-aids like phosphor fade (plasma, CRT) to soften the flicker effect of low refresh rates, though this will eventually be emulatable on ultra-high-Hz displays with a configurable rolling scan (e.g. using 1000Hz to emulate phosphor fade in 1ms increments), for playing legacy content (e.g. emulators).
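To make the rolling-scan idea concrete, here's a toy model (my own sketch of the concept, not any shipping TV firmware): each 1 ms subframe of a 1000 Hz panel lights a fresh band of rows at full brightness while previously scanned rows decay like phosphor.

```python
# Toy model: emulating a 60 Hz CRT raster on a 1000 Hz panel with 1 ms subframes.
# Each subframe "scans" a new band of rows to full brightness; older rows decay
# exponentially, approximating phosphor fade. (Leftover rows from the integer
# division and real phosphor curves are ignored -- this is only conceptual.)
import numpy as np

ROWS, REFRESH_HZ, PANEL_HZ = 1080, 60, 1000
SUBFRAMES = PANEL_HZ // REFRESH_HZ     # ~16 one-millisecond slices per emulated field
DECAY = 0.5                            # brightness multiplier per 1 ms subframe (tunable)

def crt_subframes() -> np.ndarray:
    """Per-row brightness for each 1 ms subframe of one emulated 60 Hz field."""
    frames = np.zeros((SUBFRAMES, ROWS))
    brightness = np.zeros(ROWS)
    band = ROWS // SUBFRAMES           # rows freshly scanned each subframe
    for s in range(SUBFRAMES):
        brightness *= DECAY                          # phosphor-style fade of older rows
        brightness[s * band:(s + 1) * band] = 1.0    # the band the "beam" just passed
        frames[s] = brightness
    return frames

if __name__ == "__main__":
    f = crt_subframes()
    print(f.shape)             # (16, 1080): sixteen 1 ms brightness maps
    print(f[3, :70].round(3))  # rows scanned earlier are already fading
```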

The long-term solution for game content is frame rate amplification technologies (FRAT), since games can have access to the original motion information (controllers, motion vectors, etc.) and laglessly create virtually original-looking frames. This is already being done at 2:1 ratios today in algorithms such as Asynchronous Space Warp (ASW 2.0), built into Oculus Rift VR headsets. The HTC Vive and Valve Index do something similar, doubling frame rates without reducing graphics detail. AI can also become part of FRAT, as seen in NVIDIA DLSS 2.0, which uses neural-network training.

In the future, dedicated silicon will come to GPUs to convert frame rates at 5:1 and 10:1 ratios within a decade or so. Some work is already being done at ratios like 4:1 in the nearer future, even for raytracing (see Temporally Dense Ray Tracing). In that sphere, NVIDIA credited me on page 2 of a peer-reviewed research paper where doubling the frame rate halves display motion blur.

Either way, multiple solutions are being worked on, but it's still early days -- like Japan's HDTV experiments in the 1980s. It will be a long time before 240Hz or 480Hz is mainstream, but it is inevitable as long as it becomes a freebie feature (like retina screens, and like how 120Hz slowly is becoming one -- almost all new 4K HDTVs now support 120Hz, and almost all new gaming consoles now support 120Hz).
 

Tygeezy

Member
Sep 28, 2018
1,114
1,113
465
You can play Valheim in VR, using the headset like a stereoscopic 3D desktop monitor: [...]
I thought about getting VorpX mainly to play PC games in 3D. I really enjoyed playing Diablo 3 and Path of Exile with 3D Vision way back.

I'd definitely be interested in playing the Diablo 2 remaster with VorpX, assuming it works with it like Diablo 3 apparently does. I might pick up and test Valheim on it, but I was hoping someone would use the Unity engine like Valheim does and build a survival co-op game from the ground up exclusively for VR. I just think that genre in particular is an excellent fit for VR. Walking Dead: Saints and Sinners has some minor survival mechanics, and that game is outstanding in VR.
 

mdrejhon

Member
Aug 17, 2013
121
145
500
Toronto, Ontario
www.blurbusters.com
I have to disagree somewhat from personal testing and experience, though the PWM frequencies I've used have been multiples of the frame rate and not an irregular number, of course. I've had 3 Samsungs that do either 480 Hz in Movie mode or 120 Hz in all other modes, including Game mode. And my Sony Z9F always does 720 Hz unless you enable the 120 Hz BFI option. To me, when panning the camera in games or just moving my eyes and mouse around my PC desktop, higher PWM is preferred: the image duplicates that result from, say, 60fps content on a 120 Hz PWM screen, on both those Samsungs and my Sony, create very obvious and distracting duplication. Whereas 480 Hz, and even better 720 Hz, PWM creates so many duplicates that they start to merge together and almost look like a static backlight implementation, unless you have thin white lines on a black background. I would say I even find higher PWM calmer than 120 Hz, which isn't completely flicker-free.
Correct.

We're not mutually exclusive, and you're correct too.

1. People who are PWM-insensitive (doesn't mind flicker/strobe/PWM of any kind, even unsynchronized)
2. People who are PWM-sensitive only to framerates mismatching stroberate (aka "motion blur reduction" PWM)
3. People who are PWM-sensitive to any kind of PWM

I fall in the category of #2.

The bottom line is that some people get eyestrain from PWM artifacts, instead of eyestrain from PWM flicker.

These are essentially two separate causes of PWM strain, and different people are affected.



Everybody sees differently. We respect that everybody has different vision sensitivities. Focus distance? Brightness? Display motion blur? Flicker? Tearing? Stutter? Some of them hurt people's eyes / create nausea / cause symptoms more than others. One person may be brightness-sensitive (hating monitors that are too bright or too dim). Another may get nausea from stutter. And so on. Displays are inherently imperfect emulations of real life.

Even back in the CRT days -- some people were insanely bothered by CRT 30fps at 60Hz, while others were not at all.

Also, there's the Vicious Cycle Effect, where higher resolutions, bigger displays, wider FOV, and faster motion amplify PWM artifacts. They become easier to see on 8K displays than on 4K displays than on 1080p displays.

The visibility of PWM artifacts increases with bigger resolutions & bigger FOV & faster motion, to where the problem of framerate not equalling stroberate becomes visible again at ever higher PWM frequencies. This is part of why I'm a big fan of retina frame rates at retina refresh rates for many use cases.

In fact, many people are bothered by PWM artifacts (#2) when the display is big enough, such as in virtual reality. Headaches from display artifacts (PWM artifacts, motion blur artifacts, etc.) are biggest with a massive IMAX-sized display strapped to your eyes. So that's how VR headsets minimize the percentage of nausea and headaches in the human population: their PWM frequency is matched to the refresh rate to fix headaches from duplicate images.

From an Average Population Sample, Lesser Evil of Pick-Your-Poison Display Artifacts

VR researchers discovered people got a lot of nausea/headaches with things like display motion blur and PWM artifacts, so they (A) chose to strobe, and (B) they chose to strobe at framerate=Hz.

Strobing / blur reduction / BFI / "framerate=Hz PWM" are essentially the same thing -- synonyms from a display point of view. We don't normally equate "PWM" with "motion blur reduction" but motion blur reduction on LCD is essentially flashing a backlight, and that's the scientific definition of PWM (an ON-OFF square wave).

For people matching #2, the fix is to lower refresh rate and raise frame rates, until they converge to framerate=refreshrate=stroberate. When this happens, you get beautiful CRT motion clarity, zero blur, zero duplicates, etc. You still have flicker, but there can be a point where flicker is the lesser evil (as long as flicker frequency is high enough).

When screens appear gigantic enough to the eyes (like VR), it becomes a problem bigger and more visible than CRT 30fps at 60Hz.

VR researchers found that the fewest headaches occurred with VR PWM at framerate=Hz. You do need a proper triple match to eliminate the maximum amount of VR nausea for the maximum population, though: refreshrate == framerate == stroberate PWM, with the frequency of this beyond the flicker fusion threshold (i.e. 120Hz instead of 60Hz).

It is the lesser evil of a pick-your-poison problem for displays that can't yet perfectly match real life.



That's why if you want this for your desktop games, you must have the triple match, framerate = refreshrate = stroberate.

That's why blur busting is so difficult in many games at these non-retina frame rates. You need technologies similar to VSYNC ON, but with lower lag, to keep the frame rate synchronized. You need a GPU powerful enough to run frame rates equalling refresh rates that are strobing high enough not to flicker. You need high-quality strobing without ghosting or crosstalk. Etc. So 120fps, 120Hz PWM, 120Hz display refresh -- very tough to do in many games.

(BTW, this is also partially why RTSS Scanline Sync was invented as a substitute for laggy VSYNC ON -- I helped Guru3D create RTSS Scanline Sync -- which is essentially a low-lag VSYNC ON alternative.)
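Purely as a conceptual sketch of the "finish rendering, then present just before the next refresh" idea behind that kind of low-lag pacing (a timer-based toy with made-up example timings, not RTSS's actual scanline logic):

```python
# Conceptual frame pacing: render, then sleep until just before the next refresh
# deadline so the finished frame is presented with minimal queueing, keeping
# framerate locked to refresh rate. Timer-based toy, not real scanline sync.
import time

REFRESH_HZ = 120
PERIOD = 1.0 / REFRESH_HZ
MARGIN = 0.0005                 # aim to present ~0.5 ms before the deadline

def render_frame() -> None:
    time.sleep(0.004)           # stand-in for ~4 ms of game/render work

def present_frame() -> None:
    pass                        # stand-in for the actual swap/flip call

def run(frames: int = 240) -> None:
    next_deadline = time.perf_counter() + PERIOD
    for _ in range(frames):
        render_frame()
        remaining = next_deadline - MARGIN - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)   # wait out the rest of the refresh interval
        present_frame()             # one frame per refresh: framerate == Hz
        next_deadline += PERIOD

if __name__ == "__main__":
    run()
```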
 
  • Like
Reactions: JeloSWE

mdrejhon

Member
Aug 17, 2013
121
145
500
Toronto, Ontario
www.blurbusters.com
Mr Rejhon I have some more questions for you.

Hey! I am glad you are interested in asking questions.

But may I ask a favour first? Just like the student who hogs classroom time by asking more questions than the rest of the class: would it be possible to answer my previous questions first, before I answer further questions of yours, please? Forum participation is unpaid time, and I have to ration the teaching moments across the maximum number of interested people. Thank you!

1. Can you answer my earlier question about whether you had a reply that went missing:
Link to your reply with the missing reply.

2. Can you answer if you finished the exercise I told you about in the other forum:
"Steps To See Stroboscopic Text Effect In Browser Scroll"

Once you reply, I'll be happy to answer your other excellent questions (if they aren't self-answered by #2). It's important to do this specific exercise before you ask any additional questions, because running some motion tests yourself, and seeing it with your own eyes, helps answer your own questions. Appreciated!
:messenger_winking:

However, since one of Blur Busters' prime directives is mythbusting the refresh rate race to retina refresh rates, I'd like to address a potential source of confusion of yours:
1. So you know how in some games (such as Doom Eternal or COD Modern Warfare 2019), if you go into the settings, you will see "motion blur" as an option you can turn on or off? Well, Digital Foundry did a video three years ago explaining why game developers do that, why they add motion blur to video games. The guy in the video says that motion blur is added to modern video games to "hide gaps between frames and be visually continuous"
Correct.

"gap" = "stroboscopic stepping" = "stroboscopics" = "steps between frames" = "gaps between frames" = "phantom array effect"

It's a synonym. Same thing. Identical meaning from the perspective of moving images on a display.

1. Watch the YouTube video you posted again, but mentally replace "gap between frames" with "stroboscopic effect between frames".
2. Then re-read The Gapping Effect of Finite Frame Rates, but mentally replace "stroboscopic effect" / "phantom array effect" with "gaps between frames".

They mean the same thing. That said, my article goes more in depth than the video (in some ways) by covering the different kinds of stroboscopic effect for non-strobed displays and strobed displays.

We just use different terminology. They don't cause each other. They are the same thing. One and the same.

I even mention it in the Ultra HFR FAQ: Real Time 1000fps on Real Time 1000 Hz Displays, where a 360-degree camera shutter introduces source-based motion blur to "fill the gaps between frames" (which means the same thing as "fill the stroboscopic stepping between frames"). The good news is that at 1000fps you only need 1ms of intentional source-based motion blur to fill the stroboscopic gaps between frames.

I know this will create additional questions from you, but before you ask additional questions, please answer my earlier questions first. Thank you!

(From Ultra HFR FAQ)
  1. Fix source stroboscopic effect (camera): Must use a 360-degree camera shutter;
  2. Fix destination stroboscopic effect (display): Must use a sample-and-hold display;
  3. Fix source motion blur (camera): Must use a short camera exposure per frame;
  4. Fix destination motion blur (display): Must use a short persistence per refresh cycle.
Therefore:
  • Ultra high frame rate with a 360-degree camera shutter is also a short camera exposure per frame;
  • Ultra high refresh rate with a sample-and-hold display is also a short persistence per refresh cycle.
Therefore, solving (1), (2), (3), and (4) simultaneously requires ultra-high frame rates at ultra-high refresh rates. (A small worked example follows below.)
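A small worked example of the logic above (example numbers are mine): with a 360-degree shutter the per-frame exposure equals the whole frame time, so the intentional source blur needed to fill the gaps shrinks as the frame rate rises.

```python
# Source-based motion blur from a 360-degree shutter: exposure = full frame time,
# so the blur that hides the stroboscopic gaps shrinks as frame rate goes up.
def source_blur_px(pan_speed_px_per_s: float, fps: float, shutter_deg: float = 360.0) -> float:
    exposure_s = (shutter_deg / 360.0) / fps     # 360 degrees -> exposure = 1/fps
    return pan_speed_px_per_s * exposure_s

if __name__ == "__main__":
    speed = 1000.0  # panning speed in pixels/second (arbitrary example)
    for fps in (24, 60, 120, 1000):
        print(f"{fps:>4} fps, 360-deg shutter: ~{source_blur_px(speed, fps):.1f} px of source blur")
    # ~42 px of blur is needed at 24 fps to hide the gaps, but only ~1 px (1 ms)
    # at 1000 fps -- gap-free and nearly blur-free at the same time.
```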

Also, svbarnard, please complete this exercise and re-read all the links (knowing the synonym "gapping" = "stroboscopics").

My article, The Gapping Effect of Finite Frame Rates, makes precisely the same point as the video you posted. Adding game-based motion blur is both a good and a bad thing. As The Gapping Effect of Finite Frame Rates explains, there are pros and cons to adding motion blur to frames. The YouTube video you posted focuses on certain pros/cons.

Once you recognize that the pros of motion blur come with the cons of motion blur, you understand the trade-off. It's all part of the vicious cycle effect: the more you try to eliminate display motion blur, the more visible the source-based artifacts become (whether source-based motion blur or the stroboscopic stepping effect).

So, to eliminate BOTH simultaneously for a bigger percentage of the population (including all the sensitive individuals): you need source-based motion blur to eliminate the stroboscopic effect, but minimizing source-based motion blur AND display-based motion blur requires extremely high frame rates at high refresh rates. That gets a display closer to mimicking real life, so a bigger percentage of eyes sees no issues.

Anyway, please complete the exercise, and let me know!

Thank you!
 
  • Like
Reactions: Panajev2001a

JRW

Member
Nov 29, 2006
2,492
113
1,305
Clovis, California
We still have a Sony 34XBR960 CRT, and lately I've been playing Ghosts 'n Goblins Resurrection on it via Switch (the TV has one HDMI input, thankfully). It really is crazy how much smoother 60fps/60Hz looks on a CRT vs. even my main 144Hz PC LCD, and I won't even get into how much better the contrast ratio & black levels are on a decent CRT.

But the convergence has slipped a bit on the XBR960 over the years; that's one downside with CRTs, they need occasional calibration to keep things looking crisp.
 
  • Like
Reactions: Godfavor

dave_d

Member
Jan 31, 2005
1,850
677
1,580
We still have a Sony 34XBR960 CRT, and lately I've been playing Ghosts 'n Goblins Resurrection on it via Switch (the TV has one HDMI input, thankfully). It really is crazy how much smoother 60fps/60Hz looks on a CRT vs. even my main 144Hz PC LCD, and I won't even get into how much better the contrast ratio & black levels are on a decent CRT.

But the convergence has slipped a bit on the XBR960 over the years; that's one downside with CRTs, they need occasional calibration to keep things looking crisp.
Well, and the geometry of the image can be a bit off. (I.e., on my old Trinitron in the '90s, straight lines in menus in Final Fantasy 2 were clearly not straight.)
 

Kev Kev

Gold Member
Oct 25, 2012
3,649
6,733
1,050
I’ve never noticed it

I don’t notice most of the things you people complain about
 

Durask

Member
Feb 6, 2012
2,921
2,129
860
haha, sounds like you just came from the future.
Nah, there are 4 now.


I tried the Razer one.