
Old CRT TVs had minimal motion blur, LCDs have a lot of motion blur, LCDs will need 1,000fps@1,000Hz in order to have motion blur as minimal as CRT TVs

01011001

Banned
No dude you're wrong but you're being stubborn about it. The average person is not too tech savvy and telling them that flat panel TVs have a lot of motion blur is something they can easily understand. Please you're being stubborn just accept that you're wrong on this one.

I mean you don't know if that's so easy for the average person to understand either.

if you tell a non-tech-savvy person that a TV has a lot of motion blur, I bet many will ask what that means, just like they would if you told them it has persistence blur.

because what does the average Joe know about motion blur or what it means in practice? Motion blur is not a term that anyone outside of tech-savvy people, gamers, or film nerds really ever thinks or talks about.

the only way you can rightfully claim that motion blur is easier to understand than persistence blur is if you tested it in the field at an electronics store: alternate between saying "motion blur" and "persistence blur" with every other customer, and write down how often each term gets a "what does that mean?" in return.

and then after, like, 100 customers (or 1,000 if you want to be more precise) you see which term needed clarification more often :)
 

JohnnyFootball

GerAlt-Right. Ciriously.
All these 1,000Hz, 10,000Hz, or million-Hz refresh rates don't mean much on their own.

In order to achieve clarity on these displays, the content must also run at such high frame rates.

I do have a 240Hz monitor, and when I feed it 240fps content, the clarity is improved (better than 60Hz, not as good as CRT). But if the content is the usual 60fps, which is what you get from most games or when you emulate console games, there is no difference; you're still getting a very blurred moving image. The refresh rate doesn't matter in this instance.
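
To put rough numbers on that (a back-of-envelope sketch assuming plain sample-and-hold persistence math and an instant-response panel, nothing measured):

```python
# Back-of-envelope sample-and-hold persistence math (assumes no strobing/BFI
# and instant pixel response). The eye-tracking smear is set by how long
# each *unique* frame stays on screen, i.e. by the content frame rate,
# not by how many times a high-Hz panel repeats that frame.

def blur_width_px(content_fps, scroll_speed_px_per_sec):
    persistence_sec = 1.0 / content_fps          # time each unique frame is visible
    return scroll_speed_px_per_sec * persistence_sec

# Panning at 3840 px/sec (one 4K screen-width per second):
print(blur_width_px(60, 3840))    # 60 fps content: ~64 px of smear, on a 60 Hz or 240 Hz panel alike
print(blur_width_px(240, 3840))   # 240 fps content: ~16 px of smear (needs the 240 Hz panel)
```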

So there needs to be a hardware change, a technological change, for this to work. All the software tricks in the world will never be as good, because they need processing, and processing creates artifacts and input lag.

And all this just so we can get quality similar to what we had for decades before flat panels took over. Talk about a huge step back.



240hz is not enough, i test it myself every day against my CRT TV and monitor.

BFI is also pretty bad because it mangles the colors and brightness.
Bullshit
 

JRW

Member
Ya, whenever I play games on my 2008 Kuro Plasma I'm reminded of how far behind LCDs still are in motion clarity, even when comparing the same games on my 27" Dell 144Hz G-Sync PC monitor. Last game was God of War on PC: 60fps/60Hz on the Plasma looks better than 144fps/144Hz on the LCD.
 

BadBurger

Is 'That Pure Potato'
here are the specs for the Sony Trinitron GDM-FW900

[image: Sony Trinitron GDM-FW900 spec sheet]

That's nice as a kind of theoretical, I guess, but who back in 2014 or so would spend $2,000 on archaic tech unless they needed the best CAD experience out there? And how would it benefit modern media, in any form? Slightly less motion blur in some games? And that's all before we even get into the nitty-gritty of a display that weighs more than a healthy teenager and lacks all of the (even then) modern attributes and technologies.

Not a convincing argument to me.
 

Type_Raver

Member
Using a console on CRT monitor represent!

I miss my 21" Dell Trinitron CRT, but I recently acquired a free 19" Hyundai CRT (shadow mask); it works quite well and is quite bright too!

I've got an old 21" Apple CRT, which looks quite nice and works well on PC, but it lacks geometry adjustment buttons and isn't good for consoles.

Always on the lookout for a 19" or 21" early-2000s Sony, Dell, Sun, or Mitsubishi.

 

mdrejhon

Member
So I thought the whole point was about eliminating motion blur? I thought the whole point was that modern day flat panel TVs such as our LCDs/OLEDs have a shit ton of motion blur compared to the old CRT TVs, and the only way to eliminate this motion blur was to achieve a 1,000 frames per second?
Right Tool For Right Job.
Don’t confuse the screwdriver with the hammer in the toolbox.

We’re simply choosing CRT simulation as a superior method of BFI. The HDR allows brighter BFI, and the rolling scan ensures that photons are hitting eyeballs at all times, rather than a harsher square-wave flicker. 60 Hz CRT flicker causes less eyestrain than 60 Hz squarewave BFI.

Running an emulator with BOTH spatial HLSL filters (existing technology) AND temporal HLSL filters (my CRT electron gun simulator idea) will make it look both spatially and temporally correct. Basically a superior method of BFI.

The technology can scale to better-and-better temporal accuracy the more Hz you throw at it, so it can be designed to be Hz-scaling, much like spatial CRT filters begin to look more and more accurate at higher resolutions, and even more accurate on OLED (good blacks, etc.).

Again, remember, Right Tool For Right Job. Not everyone wants to emulate a CRT, but there are use cases where you DO want to simulate a CRT both spatially AND temporally. (Or even, only temporally).

A phosphor fadebehind rolling scan is the gentlest possible flicker for a specific Hz.

So (if you’re stuck at low Hz) AND (you don’t want extra frames) THEN (rolling scan + fade logic) is the gentlest way to flicker at a specific “X Hz” in situations where you actually want to flicker.

Also, the dimness of BFI can be compensated by HDR nit surges at the small window sizes of a tight rolling scan. A 10,000-nit HDR display (I saw a prototype at CES 2020) can still do 500 nits at 1/20th persistence, which is great for simulating the electron beam dot (which is incredibly bright) at only a 5% window.
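
The brightness trade-off is just duty-cycle arithmetic; here's a rough sketch of the numbers above (the helper name is made up, not any real display API):

```python
# Rolling-scan / BFI dimming compensation: if each pixel is lit for only a
# fraction of the refresh period, the panel must surge correspondingly
# brighter to keep the same time-averaged (perceived) brightness.

def peak_nits_required(perceived_nits, persistence_fraction):
    return perceived_nits / persistence_fraction

# The example above: 500 perceived nits at 1/20th persistence (a ~5% window)
# needs a ~10,000-nit surge, i.e. the CES-2020-prototype class of panel.
print(peak_nits_required(500, 1 / 20))   # -> 10000.0
```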

Again, repeating: Right Tool For Right Job.

Sometimes the superior tool is blurfree sample-and-hold (1000fps+ at 1000Hz+), but it's not always the Right Tool For The Right Job (e.g. faithful retro simulation).

I don't understand why we would try to use software to simulate a CRT TV on LCDs/OLEDs? I thought the two main things we just wanted to eliminate are motion blur and the "phantom array effect AKA wagon wheel effect".
Again, repeating: “Right Tool For Right Job”.
Sometimes you want to, and sometimes you don’t want to.

But if you had a display with a 1920Hz refresh rate (and a computer capable of playing a video game at 1920fps) you wouldn't have motion blur anymore because according to blurbusters motion blur is eliminated once you hit 1,000 frames per second. (And also according to blurbusters we will need a screen with a 10,000Hz refresh rate in order to eliminate the phantom array effect).
Emulators can’t do 1000 frames per second without violating faithfulness.

For some specific applications, sometimes you don't want interpolation — if you are someone who is specifically preservation-focused and not flicker-sensitive (but wants a brighter BFI and a gentler-flicker BFI than old-fashioned digital squarewave BFI). So if you asked "What's the world's best BFI algorithm?" — the answer is a CRT, or a perfect simulation thereof.

In other words, my post is for people who want the world's best BFI algorithm. With that perspective, go back to the post and reread it — it's for situations where other tools are unsuitable for specific needs in a specific application. I was just confirming that that specific person is correct: display algorithm simulators are something that will eventually (10-20 years from now) become a popular substitute for a CRT purchase.

You'd still use the 1000fps 1000Hz to play your PC esports game like you describe, and then when you launch the simulator, you'd instead use the CRT simulation as a superior BFI, far better than today's BFI.

Again, Right Tool For Right Job.
It’s not black and white.

What I'm saying is, what's all this talk about using software to simulate CRT TVs? I thought the point was about eliminating motion blur on flat panel TVs? Here's what I want, just eliminate the motion blur and the phantom array effect and I'll be happy. But I don't suppose we'll be getting LCDs or OLEDs with a 10,000Hz refresh rate for at least 50 years from now correct?
Again, Right Tool For Right Job.
Don’t confuse the screwdriver with the hammer, in a manner of speaking, metaphorically…

You’re talking about a different legitimate tool than I am — they both co-exist on the same display and you can switch between display algorithms instantly.

Turn the CRT simulator mode (or plasma simulator mode) ON/OFF like turning BFI ON/OFF.

It's a superior version of BFI for retro-friendly preservation of the 60 years of legacy 60fps 60Hz material, where you actually want to preserve the original CRT flicker, the original (low/zero) blur, the original phosphor decay, the original phantom array effect, etc. — all the original artifacts. Whether at home, in a museum, or in a MAME arcade cabinet.

You might wish to re-read it through a corrected lens:
https://www.neogaf.com/threads/old-...blur-as-crt-tvs.1593080/page-7#post-266446063

One moment, your display is simulating a CRT as perfectly (to human vision margins) as a Sony FW900 CRT tube — even passing an A/B blind test behind a fake bezel! Including shadowmask/aperture-grille texture, fuzziness, resolution independence, brightness, phosphor ghosting, zero blur, etc. ALL of the attributes correctly spatially AND temporally simulated to human-retina league.

Next moment, your display is an ordinary PC 1000fps+ 1000Hz+* (choose any quadruple digit) blurless sample and hold display.

Note: *1000 may not yet be enough to pass blind tests with a CRT. It may require, say, 4000+ Hz. However, 1000 should get pretty close, assuming extremely bright HDR pulses are available.

The same screen is thus capable of chameleoning into every single display humankind has made.

Just look at all the CRT lovers here — they "Liked" my earlier post almost a dozen times already. A true chameleon of a display, which becomes technologically possible once resolution AND refresh rate AND HDR are all simultaneously retina'd.

The higher the refresh rate (and the more “retina” the resolution and HDR is), the more likely a temporal-domain retro display simulator will pass an A/B blind test with the original display — i.e. passing an A/B blind test with a flat CRT tube versus a flat panel (behind an equally thick glass front layer, anyway).

And it would be able to simulate an infinite number of displays on the same panel — at a moment's notice. Like an infinite number of custom BFI modes.

TL;DR: Right Tool For Right Job
 

mdrejhon

Member
This sounds like a lot of effort to compensate for the wrong technology to begin with.
Thinking outside of the box…

“Wrong Technology” is actually relative, because real-life does not flicker, and real-life has infinite frame rate. And real life does not add /additional/ motion blur beyond what your human brain already generates.

These are attributes worth attempting to simulate in a future VR headset, where VR perfectly matches real life — a perfect Holodeck.

CRTs flicker (unlike real life) and have a low frame rate (unlike real life).

Flicker (phosphor, BFI, strobing) is a humankind BAND-AID because we can’t simulate real life’s infinite frame rate.

Blurless sample and hold (ultrahigh framerate at ultrahigh refresh rate) is the right technology to pass a Holodeck Turing Test (blind test between VR headset and transparent ski goggles), where you can’t tell apart real life and VR.

At sufficiently high resolutions (e.g. 8K), even mere quadruple-digit refresh rates still produce a difference between real life and VR, e.g. phantom array effects and even motion blur (1000fps = 1ms of persistence = 8 pixels of motion blur at 8000 pixels/sec). Those two effects still create differences between real life and VR.

All modern VR headsets use flicker (strobing as a BFI method) because display motion blur is massively amplified in VR and must be avoided at all costs (even via flicker). So the Quest 2 unavoidably flickers, at about 0.3ms MPRT. It would take 3333fps at 3333Hz to fully eliminate flicker while keeping the same amount of motion blur: instead of ninety 0.3ms frames flashed per second, the whole second is filled consecutively with unique 0.3ms frames.
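
For reference, the arithmetic behind those figures (the usual MPRT rule-of-thumb as a sketch; the function names are illustrative, not Oculus specs):

```python
# MPRT rule of thumb: perceived blur width ~= tracking speed * persistence.
# Removing the strobe while keeping the same clarity means filling every
# second with unique frames of that same persistence.

def blur_px(speed_px_per_sec, mprt_ms):
    return speed_px_per_sec * (mprt_ms / 1000.0)

def strobeless_fps_needed(mprt_ms):
    return 1000.0 / mprt_ms

print(blur_px(8000, 1.0))           # 1000 fps sample-and-hold: 1 ms -> 8 px at 8000 px/sec
print(strobeless_fps_needed(0.3))   # ~0.3 ms persistence -> ~3333 fps at 3333 Hz
```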

In other words: we’re strobing/flicker/phosphor as a band-aid to fix motion blur, only because we can’t simulate real life’s infinite frame rate. So CRT is totally the wrong technology (long-term) for VR.

That being said, if you want faithfulness for retro games — yes, CRT is the right technology.

Just saying "wrong technology" requires disambiguation — it's relative to the content you need handled.

The bonus is that the "Right Technology" for VR automatically gives you a literal chameleon display capable of simulating any past display (including a realistic-looking CRT tube in VR, sitting on a virtual desk, or a realistic-looking plasma display in VR, hanging on a virtual wall) — looking spatially AND temporally correct, and indistinguishable from real life.

It might sound like a lot of effort, but look at the effort we put into simulating CRTs correctly spatially. We're only missing the effort to simulate CRTs correctly gamut-wise (HDR color space) and temporally (the ultrahigh Hz needed) to complete the "nothing looks off" goal.

The algorithm to simulate a CRT is actually surprisingly simple software-wise (I've got prototype code already, which begins to look semi-useful on 240Hz OLEDs — at least far better than ordinary squarewave software BFI).

At least it’s easier for a single individual to write the logic for a software-based CRT beam simulator, than to build the hardware of a brand new CRT.

The biggest problems were simply computing power (solved) and the refresh rates needed to begin to be superior to BFI (almost solved).
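
To illustrate the general idea, here is a rough sketch of a rolling-scan approach (illustrative only, not the actual prototype shader — the names and defaults are made up, and a real implementation runs per pixel in a GPU shader):

```python
# Rough sketch of a rolling-scan CRT "electron beam" simulator. One 60 Hz
# source frame is split into N sub-frames for an (N x 60) Hz display: a bright
# band sweeps down the frame while previously excited rows fade with an
# exponential phosphor-style decay.
import numpy as np

def crt_rolling_scan(frame, n_subframes, decay=0.25, boost=None):
    """frame: (H, W) or (H, W, 3) array in linear light, values 0..1.
    Returns n_subframes arrays, each displayed for 1/(60*n_subframes) sec."""
    h = frame.shape[0]
    stripe = -(-h // n_subframes)            # rows the "beam" covers per sub-frame
    if boost is None:
        boost = float(n_subframes)           # crude compensation for the short duty cycle
    phosphor = np.zeros_like(frame)
    subframes = []
    for i in range(n_subframes):
        phosphor *= decay                    # exponential phosphor-style fade of earlier rows
        top, bottom = i * stripe, min(h, (i + 1) * stripe)
        phosphor[top:bottom] = frame[top:bottom] * boost   # beam excites this stripe brightly
        subframes.append(phosphor.copy())
    return subframes

# Example: a 240 Hz OLED shows 4 sub-frames per 60 Hz source frame.
source = np.random.rand(480, 640).astype(np.float32)
for sub in crt_rolling_scan(source, n_subframes=4):
    pass  # present `sub` for 1/240 sec (tone mapping / HDR clipping not shown)
```

Throwing more Hz at it just shrinks the stripe and samples the decay curve more finely, which is why the quality scales with refresh rate.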

Imagine a time traveler going from 2022 to 1982 and showing a geek a modern iPhone or Galaxy. In the 1980s golden gaming era, we'd have been shocked that we'd one day have HD smartphones in our pockets, capable of better-than-Hollywood movie quality, with enough memory to hold the whole said movie in the phone's flash storage, with incredible compression algorithms undreamed of. The audacity of 1 terabit of rewriteable flash memory in your pocket (128 GB), back when 16 kilobytes cost over $200 just a few years prior (late 70s) and a full 64 kilobytes in the Commodore 64 was the new ginormous thing in town in 1982, a welcome relief from the 4 KB of the Apple ][ in 1978. With an amazing gigabit wireless modem (aka 5G) — fantasyland, pigs-can-fly stuff in 1982. Someone in 1982 would have said it was stupid-crazy overkill to have that much memory in a pocket battery-powered supercomputer capable of rendering the TRON movie in realtime at 100x the original detail — more than the whole planet's RAM in all the computers in the world added together at the time, all in one device.

So, nah, CRT electron beam simulation in a GPU shader isn’t sounding that crazy anymore. ;)

Besides, the "Right" display (capable of perfectly doing VR matching real life) is also automagically a chameleon display simulating any past display to human vision thresholds.

Fully optionally flickerable in infinite combinations, capable of simulating any past retro display on demand — literally a superset whose Venn diagram is a circle inside a circle, as far as human retina margins are concerned. Simulate a CRT, simulate a plasma, simulate a DLP, etc., including per-pixel temporal behaviors.

But yes — it will be a long time before blind tests get passed (real retro display versus simulation thereof). Long BEFORE then, though, faithfulness and usefulness increase as we throw more resolution / Hz / dynamic range (true HDR, bigger than today's HDR) at it, ever closer to the retina thresholds for each.

360Hz LCDs apparently still have too-slow GtG and weird mid-GtG gamma behaviors to really look good for CRT simulators (still a useful demo), but upcoming bright 240Hz OLEDs interestingly look usefully "more realistic CRT, even if not perfect" in software CRT simulation experiments, compared to classical 120Hz OLED TV BFI (the crude BFI that is the only option if your max Hz is only 120).

2022 Technology Statuses:
- Retina level of resolution: NEAR. We are already there on some displays.
- Retina level of refresh rate: FAR. We are still far from the quintuple digits.
- Retina level of dynamic range: FAR. Commodity displays are still very non-HDR, and even Dolby Vision (Even 10,000nit spec) isn’t retina HDR yet.

Fortunately, it is easy to beat the low bar of uncomfortable classical squarewave BFI. It is easy to beat in software once GtG is near zero, as on OLED and MicroLED displays, and once the Hz is at least 4x that of the retro display. The concept is sound, and it scales in quality (comfort/quality superior to classical BFI) as Hz and HDR get better over the years.
 

Mr.Phoenix

Member
I am sorry... doesn't BFI fix this issue?

The only issue with BFI is a dimmer image, which can be avoided by simply having a brighter display, to begin with.
 

01011001

Banned
I am sorry... doesn't BFI fix this issue?

The only issue with BFI is a dimmer image, which can be avoided by simply having a brighter display, to begin with.

it does, but BFI has issues.
for example, the vast majority of screens can't use BFI in combination with VRR, and the ones that can have issues.
then for some reason newer TVs use hardware that doesn't support 120Hz with BFI, while 2019 models did...

like it's a bit messy.

my PC monitor doesn't support BFI over HDMI, and only supports it at 60Hz and 120Hz. It doesn't work at 144Hz, 165Hz, or anything between those and 60Hz, apart from 120Hz.

and due to the dimming down to around 400 nits peak on most TVs, HDR loses all of its impact, of course.

so BFI is a way to get clarity, but on current screens it's still far from good enough to be worth using outside of specific cases.
 

Mr.Phoenix

Member
it does, but BFI has issues.
for example, the vast majority of screens can't use BFI in combination with VRR, and the ones that can have issues.
then for some reason newer TVs use hardware that doesn't support 120Hz with BFI, while 2019 models did...

like it's a bit messy.

my PC monitor doesn't support BFI over HDMI, and only supports it at 60Hz and 120Hz. It doesn't work at 144Hz, 165Hz, or anything between those and 60Hz, apart from 120Hz.

and due to the dimming down to around 400 nits peak on most TVs, HDR loses all of its impact, of course.

so BFI is a way to get clarity, but on current screens it's still far from good enough to be worth using outside of specific cases.
I get all that, but I feel BFI is ultimately a more practical solution than having TVs with 1GHz refresh rates and video to match. The solutions to get BFI right are easier.
 

01011001

Banned
I get all that, but I feel BFI is ultimately a more practical solution than having TVs with 1GHz refresh rates and video to match. The solutions to get BFI right are easier.

true, but so far it's not easy to get a screen that can do it really well, and basically none can use it in tandem with VRR.

the ones that can use it with VRR apparently start looking super bad as soon as you get below 100fps, because the uneven strobing becomes way too obvious.
 

JayK47

Member
I have a CRT I want to get rid of — an old Samsung 1080p set. It requires two strong people to move it. It was great for the Xbox 360, but it quickly became obsolete next to an LCD TV. I've barely used it since, and it's just a burden to have due to its weight.
 