You missed a golden opportunity by not naming this thread "Lightboost Returns: Final Frequency 240" or something like that.
Wrong, buddy
Sadly, no -- even 1000Hz is not yet enough to pass the "wow, I didn't know I was standing in a Holodeck" test for average human vision. There are wagonwheel effects, mousedropping effects, and strobe effects (some people still hate 360Hz PWM dimming backlights, and fixing motion blur via strobing only works at 1 strobe flash per refresh). Even faint motion blur can still occur. And many other reasons. See below for further explanations of why 240Hz is woefully low for a Holodeck.
240hz, holy shit. The human eye can see at about 400hz right? So we are now half way there if that's the case.
It doesn't work that way. It's very dependent on lots of variables.
Vision researchers have found situations where they kept seeing motion blur at 1000fps@1000Hz, so the limit doesn't stop there when the variables are set to ideal values. Some manufacturers make displays for vision researchers (vpixx.com has a true-500Hz monochrome projector for vision researchers, and there are others). So this science is already being done. Also see Why Do We Need 1000fps@1000Hz This Century? for some educational information.
Even in the best-case scenario (framerate matching refresh rate), several factors make it easier to see motion blur at ever-higher refresh rates, such as:
- Games have faster panning speeds;
- Closer viewing distances;
- Bigger field of view;
- Higher resolutions (4K motion blur is currently really bad).
Persistence is more important, because CRT 60fps@60Hz has less motion blur than non-strobed LCD 120fps@120Hz. This is because CRT had less persistence than LCD (until recently, when strobe backlights arrived). John Carmack also talks about persistence, and NVIDIA also talks about persistence. Persistence is not transition time -- persistence is different from GtG -- persistence is pixel visibility time, and persistence accounts for strobe backlights, while GtG does not. Since persistence accurately describes tracking-based motion blur, there's a very simple law that Blur Busters has discovered:
Blur Busters Law:
1 millisecond of persistence creates 1 pixel of motion blurring during 1000 pixels/second.
Examples (motion tests at TestUFO):
2ms persistence & 2000 pixels/second = 4 pixels of motion blur during motion test at framerate=Hz
0.5ms persistence & 3000 pixels/second = 1.5 pixels of motion blur during motion test at framerate=Hz
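Since the law is purely linear, it can be sanity-checked in a couple of lines of Python (the function name is my own, purely for illustration):

```python
def motion_blur_px(persistence_ms, speed_px_per_sec):
    """Blur Busters Law: blur (pixels) = persistence (ms) x speed (px/s) / 1000.

    Assumes framerate matches refresh rate and eye-tracked motion.
    (Hypothetical helper, not from any actual Blur Busters tool.)
    """
    return persistence_ms * speed_px_per_sec / 1000.0

print(motion_blur_px(2.0, 2000))  # 2ms @ 2000 px/s -> 4.0 pixels of blur
print(motion_blur_px(0.5, 3000))  # 0.5ms @ 3000 px/s -> 1.5 pixels of blur
```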
However, I'm not the first to study this. Tons of scientists already know this; all the complex stuff is already well documented in science papers on sample-and-hold and MPRT -- http://scholar.google.com/scholar?q=MPRT ... These are complex, but I'm able to de-mystify a lot of this stuff for everybody (and LightBoost provides a very simple "see for yourself" science experiment) -- hence the Plain English interpretation above. And the vision researchers I've talked to have agreed with the Blur Busters Law (so far) -- it's the world's most simplified link between display persistence and human-perceived motion blur (for eye-tracking-based motion). The law is uncannily accurate for squarewave persistence such as strobe backlights, DLP black frame insertion, OLED rolling scans, and other squarewave flashing occurring at one flash per refresh.
For a 100% flickerfree 1000Hz display without strobing, persistence can never be lower than 1ms, because persistence has a hard limit of one full refresh cycle if the pixel is continuously visible for the whole refresh. (If GtG is longer than a refresh cycle -- like the old 33ms LCD days -- you can have higher persistence than the refresh cycle, but it can never be lower than the length of a refresh unless you begin strobing or otherwise light-modulating.) As you track your eyes on moving objects, your eyes are in a different position at the end of a refresh than at the beginning. If you had 4K VR goggles strapped to your head and you turned your head, the whole scenery would pan past your face at over 4000 pixels/second, and 1ms of persistence at 4000 pixels/second creates 4 pixels of motion blur. Voilà -- 1000Hz is not enough for a flickerfree, PWM-free display to simultaneously be blur-free.
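The strobe-free floor described above is easy to express numerically: minimum persistence on a sample-and-hold display is one full refresh cycle, so the best-case blur is panning speed divided by refresh rate. A tiny Python sketch (the function name is hypothetical):

```python
def min_flickerfree_blur_px(refresh_hz, speed_px_per_sec):
    """Minimum tracking-based blur on a strobe-free (sample-and-hold) display.

    Persistence cannot drop below one full refresh cycle without strobing,
    so the floor is (1000 / Hz) milliseconds of persistence.
    """
    min_persistence_ms = 1000.0 / refresh_hz
    return min_persistence_ms * speed_px_per_sec / 1000.0

# VR head-turn example: 4000 px/s panning on a flickerfree 1000Hz display
print(min_flickerfree_blur_px(1000, 4000))  # -> 4.0 pixels of blur
```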
Today, LightBoost=10% has 1.4ms of persistence, which is only 1.4 pixels of motion blurring at 1000 pixels/second (see photos of LightBoost=10%). That's only 1/700sec -- the same amount of motion blur as a sample-and-hold flickerfree 700fps@700Hz, since frame visibility length is the chief dictator of tracking-based motion blur (example: www.testufo.com/eyetracking explains this very well. 120Hz only halves motion blur over 60Hz, then 240Hz halves motion blur again over 120Hz, and so on -- for sample-and-hold flickerfree displays. But as we keep getting bigger sizes, more resolution, more FOV, and faster panning speeds, the detectability thresholds keep going up).
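That 1/700sec equivalence is just the reciprocal of persistence: the flickerfree refresh rate with the same blur as a strobed display is 1000 divided by the persistence in milliseconds. A quick sketch (hypothetical helper name) of the same arithmetic:

```python
def equivalent_sample_and_hold_hz(persistence_ms):
    """Flickerfree (sample-and-hold) refresh rate with the same tracking
    blur as a strobed display of the given persistence, since frame
    visibility time is what dictates tracking-based motion blur."""
    return 1000.0 / persistence_ms

# LightBoost=10% at 1.4ms persistence
print(round(equivalent_sample_and_hold_hz(1.4)))  # -> 714, roughly the 700Hz figure
```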
The bottom line is that not even 1000Hz is the final frontier, since real life is framerateless (or infinite framerate, depending on interpretation). For true Holodeck emulation -- with zero motion blur, zero strobe effects, zero wagonwheel effects, and zero GPU blur effects, to simultaneously achieve ALL of the above and be tricked into "wow, I didn't know I was standing in a Holodeck" -- you will need continuous motion with no static frames (aka infinite frame rate). That is scientifically impossible for a display, but a 1000fps@1000Hz display will get us very close to Holodeck quality. 400Hz is not remotely close. Not by a long, long, long shot.
Photographic demonstrations of Blur Busters Law (1 ms of persistence equals 1 pixel of motion blur during 1000 pixels/sec):
-- PHOTOS: 60Hz vs 120Hz vs LightBoost
-- PHOTOS: LightBoost 10% vs 50% vs 100%
The great thing is that LightBoost provides a very simple "see-for-yourself" experiment that demonstrates this very well.