
Old CRT TVs had minimal motion blur; LCDs have a lot of motion blur. LCDs will need 1,000fps@1,000Hz in order to have motion blur as minimal as CRT TVs

Whitecrow

Banned
Following this thread, I tried BFI on my C9.

I tried it before and didn't like the flicker, but now it was like...

Wtf is this sorcery??

Even in an fps mess like FF7R, you can appreciate how much more natural the motion looks.

Also, oddly enough, it brought the panel brightness down to a level where everything just looks better.

Bright areas still flicker but I can deal with that.
 

nkarafo

Member
Modern panels are starting to redeem themselves a bit. VRR is an amazing feature that I would miss on a CRT.

Though I still prefer CRT motion clarity, it's not like a modern monitor completely sucks now.
 

LiquidMetal14

hide your water-based mammals
I'll live with the negatives of the bastard-son-of-a-bitch 83" and 65" OLED TVs I have at the house. Good for the future, when those of us who can afford it will be able to see the light.
 
I've been using a software BFI program I grabbed off the Blur Busters forums to do some.. unethical... human testing, on myself 🤣
Right now, I am driving my Asus XG279Q at a custom refresh rate of 150Hz, and using this software BFI tool to draw 1 real frame followed by 4 black frames for an effective frame rate of 30Hz. Yep, it's a flickery mess. But let me tell you something: emulation and old games built around 30 fps have NEVER looked so sharp and smooth. When you single-cycle strobe, it doesn't matter what refresh/frame rate you're targeting, it will appear sharp and smooth. I've even tried 25 fps from the old Diablo 2 days, and the game looks crystal-clear sharp.
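To make the trick concrete, here's a minimal sketch of that kind of software BFI loop in Python with pygame (the window size, the moving-square test pattern, and the exact cadence are illustrative choices on my part, not the actual Blur Busters tool):

```python
# Minimal software-BFI sketch: with vsync on, each flip() waits one refresh,
# so 1 rendered frame + 4 black frames at 150Hz = a 30Hz image with only
# 1/150s of persistence per image (hence the CRT-like clarity and flicker).
import pygame

BLACK_FRAMES = 4  # black refreshes per real frame (150Hz / 5 = 30Hz effective)

pygame.init()
screen = pygame.display.set_mode((1280, 720), pygame.SCALED, vsync=1)

def render_game_frame(surface):
    # Stand-in for the real game frame: a moving square to eyeball clarity.
    surface.fill((96, 96, 96))
    t = pygame.time.get_ticks() / 1000.0
    x = int((t * 480) % surface.get_width())
    pygame.draw.rect(surface, (255, 255, 255), (x, 330, 60, 60))

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    render_game_frame(screen)
    pygame.display.flip()          # one lit refresh...
    for _ in range(BLACK_FRAMES):  # ...followed by four dark ones
        screen.fill((0, 0, 0))
        pygame.display.flip()
pygame.quit()
```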
And to be clear, this isn't a good LCD. It's a crappy fake IPS screen with around 3-4ms response times. I am so eager to try an OLED with software BFI combined with hardware BFI for these lower fps games and see not only the incredible smoothness, but the deeper contrast too.

Of course, this isn't ideal and no normal human being wants to be subjected to sub-60Hz flicker. I can tolerate it because I'm some kind of ungodly beast of a man. But the good news is, if we can reach 960Hz screens, then we can simulate rolling scans on just about any "low" fps content. E.g., for 60 fps feeds you'd have 16 refresh cycles per 60Hz frame to simulate a rolling scanout. This would dramatically reduce flicker and also greatly help boost brightness and colors. You don't even need to send a 960Hz signal to the display; it can be done locally onboard the panel's built-in processor. Just feed it a regular old 60Hz signal and let the display do all the heavy lifting of simulating 16 scanout subframes from the source. Done.
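The rolling-scan idea is easy to sketch too. Here's a toy numpy version of what the panel's processor would do with each incoming 60Hz frame (the band height and 16-subframe split are my illustrative numbers):

```python
# Rolling-scan sketch: one 60Hz source frame becomes 16 subframes for a
# 960Hz panel. Each subframe lights only a horizontal band, so some part
# of the screen is always lit (far less flicker than full-frame BFI).
import numpy as np

SUBFRAMES = 16  # 960Hz panel / 60Hz source

def rolling_scan_subframes(frame: np.ndarray, band_frac: float = 1 / 8):
    """Yield SUBFRAMES copies of `frame`, each lit only in a scanning band."""
    h = frame.shape[0]
    band_h = max(1, int(h * band_frac))  # lit fraction sets the persistence
    for i in range(SUBFRAMES):
        top = (i * h) // SUBFRAMES
        sub = np.zeros_like(frame)
        sub[top:top + band_h] = frame[top:top + band_h]
        yield sub
```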

I hope I live to see the day this can be done because it's going to be so fucking cool.
You're on to something.

But it won't be OLED that does it for the masses; it'll be MicroLED, and there are further mitigation measures to be taken. The amount of blue light (and, by proxy, near-UV) emitted by these new techs is off the charts, which is not good. We'll need a filter on top of the screens to negate it once we get into flicker territory, because peak brightness will have to increase as well.

Mini-LED is also not a bad candidate to do it if it lasts long enough.

Granted, this was also a concern with CRTs, where instead of blue light in your eyes you'd get X-rays if not for shielding. So they were shielded, and lost peak brightness because of it.


The last plasmas Panasonic made were already doing 3,000 Hz. :) I wouldn't be surprised if they come up with a subpixel layout that simulates the plasma subfields across a wide surface instead of layering. MicroLED could probably double a panel's effective hertz easily that way.
Good luck trying to find a high-res CRT PC monitor that isn't in the thousands of dollars.
Well, for work I have to say LCD is more comfortable because it actually holds the image without flicker. If you work in Excel, I wouldn't use a CRT ever. A CRT is still good for editing photos, watching movies, and gaming.

Although I've seen some crazy shenanigans lately with even and odd lines on LG LCD monitors that completely fuck with image stability.
Modern panels are starting to redeem themselves a bit. VRR is an amazing feature that I would miss on a CRT.

Though I still prefer CRT motion clarity, it's not like a modern monitor completely sucks now.
They are. The path we took was cheap pixels, which at first was a nosedive: 4K sets often had worse panels than their FHD counterparts, with the inclusion of 6-bit+FRC (IPS panels used to have 8-bit color per pixel, but 6-bit panels simulating the rest with interpolation/slight grain appeared when we made the jump; this wasn't a fluke but an opportunity). Instead of increasing the colors we were able to pull per pixel, manufacturers were keen on halving the color resolution (4K image, 2K color) and using look-up tables to produce the missing colors through color dithering.

Look-up tables use a 2x2 approximation to simulate a color that your panel doesn't render, so it's halving the color resolution in the process. It just doesn't look like half resolution because it's dithered over the rest of the image detail, and even if it did, 4K is overkill under 85" in most cases, so detail is perceived as retina before we could ever see that behaviour.
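As a toy illustration of that 2x2 trick, here's an ordered-dither sketch in Python (my own simplification, not any vendor's actual look-up table):

```python
# 2x2 ordered dithering: render an 8-bit level on a 6-bit panel by mixing
# the two nearest 6-bit levels across a 2x2 cell -- the eye averages the
# cell into the in-between shade, at the cost of spatial color resolution.
import numpy as np

BAYER_2X2 = np.array([[0, 2],
                      [3, 1]]) / 4.0  # classic 2x2 threshold matrix

def dither_8bit_to_6bit(img8: np.ndarray) -> np.ndarray:
    """img8: HxW array of 8-bit levels (0-255); returns 6-bit levels (0-63)."""
    levels = img8 / 255.0 * 63.0          # target value in 6-bit units
    base = np.floor(levels)
    frac = levels - base                  # how much of the next level is needed
    h, w = img8.shape
    thresh = np.tile(BAYER_2X2, (h // 2 + 1, w // 2 + 1))[:h, :w]
    return (base + (frac > thresh)).astype(np.uint8)

# A flat 8-bit gray of 130 maps to a 2x2 mix of 6-bit levels 32 and 33,
# which averages out to the otherwise unrenderable in-between shade.
```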

CRTs and plasmas were comparatively complex, and simply better if you take into account the quality of each pixel. The path we have now is panel simplicity, with as much speed and as many pixels as possible.

I wouldn't be surprised if we end up having 16K TVs with 960 hertz in order to pull 4K at 60 Hz properly. We might do the same with processing as well: smaller processors with less power, but more of them. It's the same paradigm everywhere.

At some point simulation will be better than these old techs even in the places they're hard to beat. It's an uphill battle, and in the end it'll win not because it bested the principle, but because our eyes can't tell the difference between the "proper way" to do it and the emulated version.
 

Mister Wolf

Member
I've been using a software BFI program I grabbed off the Blur Busters forums to do some.. unethical... human testing, on myself 🤣 [snip]

422mPCoihnDpk83SVGQiY2YZCv8Huprr_CXn7YW59bw.png
 
Plasma for life yo
I wish they would still make PDPs. My Panasonic 42GT60 is broken (after a lightning strike) but its picture quality was on another level compared to LCDs (motion clarity, contrast, blacks, shadow detail, vivid colors, natural-looking white balance instead of too cold/warm). My father still has a Panasonic 42X10 (just a 1024x768 panel) and even PS3/X360 games look perfectly sharp on it. I even prefer playing on this 42X10 over my 4K 55-inch LCD despite the much lower resolution, because from 2 meters even 1024x768 looks surprisingly detailed and sharp, and the 42X10 has much better colors/contrast/motion clarity.
 
I wish they would still make PDPs. My Panasonic 42GT60 is broken (after a lightning strike) but its picture quality was on another level compared to LCDs (motion clarity, contrast, blacks, shadow detail, vivid colors, natural-looking white balance instead of too cold/warm).
Hm, it's certainly worth fixing. If it's lightning, it's down to the power supply, which failed a bit more often than it should have on Panasonic plasmas (not as bad as Samsung and LG, but still).

Panasonic used to rebuild them locally when they failed; they might still have some. Otherwise a good TV technician can rebuild it.

I wanted to buy a GT60 desperately in 2013 for the small living room, but could only afford the 65VT60, not both the 65VT60 and the 42GT60.
My father still has a Panasonic 42X10 (just a 1024x768 panel) and even PS3/X360 games look perfectly sharp on it. I even prefer playing on this 42X10 over my 4K 55-inch LCD despite the much lower resolution, because from 2 meters even 1024x768 looks surprisingly detailed and sharp, and the 42X10 has much better colors/contrast/motion clarity.
Yes! :)

I have a 42X50. Bought it for Wii/PS2/retro console usage for 279 euros, I believe (!). It's crazy how little it cost and how ridiculously good it still is, despite the anamorphic pixel structure. You just can't buy anything with that image quality and size for that price. The 42X60 was even better, with full calibration features just like its top-range siblings and similar black levels, but it cost 100 euros more and was never on sale.

PS4/Xbox One look really good on those (compared to PS3/X360) because, with the framerate improved (fewer dips and vsync issues), motion improves even further.


There was also a cheap Panasonic 42" 852x480 plasma a few years before. It was even crazier for 480p content if a bit niche.

EDIT: Pictures of one of those 480p Panasonic plasmas:



It's incredible that they manufactured them until 2010/2011 or so.


The UT50 model was also crazy. It was a low-cost ST50 with basically everything the ST50 had (minus wifi, and with fewer image subfields). Sadly they didn't make a UT60 model, but ST60s were super cheap anyway.

Panasonic was on a roll; they were pushing every corner of their tech and doing so at very low prices. The demise of plasma was strange; they went out with a bang, just like Pioneer did before.
 
Hm, it's certainly worth fixing. If it's lightning, it's down to the power supply, which failed a bit more often than it should have on Panasonic plasmas (not as bad as Samsung and LG, but still).
The power supply works and the TV starts normally; however, the HDMI inputs show no picture. A TV repairman told me the signal-processing chip on the mainboard must be damaged, but I can't find a replacement mainboard (TNPH1043 1A) for my 42GT60 (UE model).
 
The power supply works and the TV starts normally; however, the HDMI inputs show no picture. A TV repairman told me the signal-processing chip on the mainboard must be damaged, but I can't find a replacement mainboard (TNPH1043 1A) for my 42GT60 (UE model).
Crap.

I found one on eBay; it's in the UK:

-> https://www.ebay.com/itm/325201421138

UK models were slightly different, but definitely compatible. (Some Panasonic UK models had RJ45 while the European models didn't.)

Apparently, VT and ZT models also used that board. I wonder what happens if you use their firmware on a GT60 (but I wouldn't try it); the board from a 50GT60 is most likely fine, though.
 

NeoIkaruGAF

Gold Member
There was also a cheap Panasonic 42" 852x480 plasma a few years before. It was even crazier for 480p content if a bit niche. [snip]

I had a 37" Panasonic EDTV plasma (852x480).
It was the perfect TV for the Wii. 16:9, no upscaling, the Wii never looked so crisp on any other screen (Wii sucks on CRT).
It wasn't that ideal for 240p stuff, though - the pixels were huge, sorta like having a giant GBA. It was pretty bad for TV too, and it had some tremendous image retention so it wasn't wise to use it too long with the same game.
The bezel was huge by today's standards though, it'd be awesome to have a new one with the thin bezels that can be made today.
 

cortadew

Member
I had a 37" Panasonic EDTV plasma (852x480).
It was the perfect TV for the Wii. 16:9, no upscaling, the Wii never looked so crisp on any other screen (Wii sucks on CRT).
It wasn't that ideal for 240p stuff, though - the pixels were huge, sorta like having a giant GBA. It was pretty bad for TV too, and it had some tremendous image retention so it wasn't wise to use it too long with the same game.
The bezel was huge by today's standards though, it'd be awesome to have a new one with the thin bezels that can be made today.
Why do you think the Wii sucks on a CRT? I had my Wii hooked up to a 32-inch Sony Wega and thought it looked fantastic, much better than my Full HD Sony Bravia from 2007.
 
Crap.

I found one on eBay; it's in the UK:

-> https://www.ebay.com/itm/325201421138 [snip]
Thanks mate :). According to the description this board will fit my 42-inch model as well, so there's hope for my TV :).
 

NeoIkaruGAF

Gold Member
Why do you think the Wii sucks on a CRT? I had my Wii hooked up to a 32-inch Sony Wega and thought it looked fantastic, much better than my Full HD Sony Bravia from 2007.
'Cause it looks very very soft, even with a SCART RGB cable.
I used a component cable on the plasma.
 
'Cause it looks very very soft, even with a SCART RGB cable.
I used a component cable on the plasma.
Well, the Wii's video encoder chip was crap. The signal was weak and noisy compared to, say, the GameCube's. There are 3 known revisions from different manufacturers though, so not all consoles were the same.

Are the cables original? That could make a difference.

Then there's what the TV does with the weak signal it receives: some TVs were able to amplify weak signals, some not so much.

My launch Wii is not blurry on my 14" PVMs when connected by either component cables or RGB SCART (my component cables are third party and "good but not the best"; the RGB SCART is Nintendo original), but mileage may vary. The output is generally considered soft and blurry, but with the right combo you can almost not notice.
 

Gamer79

Predicts the worst decade for Sony starting 2022
CRTs no doubt had great picture quality. I had a 20-inch back in the day that I used to play my Dreamcast on; then I had a jury-rigged setup and ran the Xbox 360 to the CRT with sound.

There is no question they handle motion much better, with deeper blacks, better color accuracy, etc. The issue is they are not practical these days due to their size, weight, and form factor. If they could build a widescreen CRT at 50 inches or more and make the weight a fraction of what it is, that would be a different story.

My 55-inch TV weighs 38 lbs. It is 4K, can produce billions of colors, 120Hz, 1000-nit HDR, quantum dot, etc. It has a fantastic picture.

With that said...

If you can build me a 50-inch CRT TV that's under 100 lbs, widescreen, and affordable? Sign me up. I just believe the weight of these behemoths makes them impractical these days.
 

Hoddi

Member
'Cause it looks very very soft, even with a SCART RGB cable.
I used a component cable on the plasma.
The AV connector used the same pins for RGB and YPbPr. Were you possibly seeing the difference between 480i and 480p?

For what it's worth, my GC and Wii look very similar on both my CRT and plasma over RGB SCART. I still own the original Nintendo GC cable and it isn't much better than the cheapo Wii cable I bought along with my CRT two years ago.
 

mdrejhon

Member
I wouldn't be surprised if we end up having 16K TVs with 960 hertz in order to pull 4K at 60 Hz properly. We might do the same with processing as well: smaller processors with less power, but more of them. It's the same paradigm everywhere.

At some point simulation will be better than these old techs even in the places they're hard to beat. It's an uphill battle, and in the end it'll win not because it bested the principle, but because our eyes can't tell the difference between the "proper way" to do it and the emulated version.

Correct. Yes, this is the path of the future (10-year timeline).

Software-Based Display Simulators
(e.g. CRT electron beam simulators in shader)

Upcoming 240Hz OLEDs are almost enough to begin CRT electron beam simulators in software (simulate 1 CRT refresh cycle via 4 digital refresh cycles).

Today we can do this:

An example of how software algorithms can simulate a different display that you don't already have:
  • testufo.com/vrr
    Variable refresh rate simulation
    Software-based simulation of VRR on a non-VRR display.
    Works on any display, even non-VRR displays.

  • testufo.com/blackframes#count=4&bonusufo=1
    Variable control of motion blur at a fixed frame rate
    Requires 144Hz or higher to avoid uncomfortable flicker.
    Applicable to future adjustable phosphor decay capabilities.

  • testufo.com/rainboweffect
    Simulation of DLP color wheel (rainbow effect)
    Requires 240Hz or higher refresh rate for the rainbow effect to look accurate.
    If you can't see the rainbow effect, wave your hand really fast in front of this animation to see the rainbow effect.
    WARNING: epilepsy warning if you don't already have a 240Hz monitor. These demos look flicker-free on a 240Hz+ screen.
    DO NOT RUN THIS TEST AT 60HZ: EYE PAIN WARNING
Tomorrow, we will be able to do this:

240Hz HDR/OLED -- semi-accurate CRT electron gun simulators (4ms blur)
600Hz+ displays -- plasma subfield simulators.
1000Hz+ displays -- accurate CRT electron gun simulators (1ms blur) with accurately similar zero motion blur and phosphor decay behind moving objects (and 60Hz flicker equally as comfortable as an old CRT), better than classic software BFI
1440Hz+ displays -- DLP temporal dithering simulation via binary 1-bit flashing on/off pixels to generate 24-bit color (24 LCD/OLED refresh cycles per simulated DLP refresh cycle)

We Only Need To Simulate Up To Temporal Retina Thresholds

NOTE: Yes, a CRT responds in nanoseconds on the leading edge, but there's a lot of trailing-edge blur. The human eye doesn't see the leading-edge blur on a CRT (due to the nanosecond/microsecond response) since it's far below the retina threshold. But the trailing blur is noticeable (as phosphor ghosting), and we can still simulate that in software -- because there's literally 20ms worth of blurring.

CRT-ghosting.jpg
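A toy model of that trailing decay (the exponential shape and the 4ms time constant are assumptions for illustration; real phosphors have messier curves):

```python
# Toy phosphor-decay model: the beam excites a pixel near-instantly, then
# brightness falls off roughly exponentially -- that ~20ms visible tail is
# the part software would reproduce.
import numpy as np

def phosphor_trail(t_ms: np.ndarray, tau_ms: float = 4.0) -> np.ndarray:
    """Relative brightness t_ms milliseconds after excitation."""
    return np.exp(-t_ms / tau_ms)

t = np.arange(0, 21)                   # 0..20 ms after the beam passes
print(np.round(phosphor_trail(t), 3))  # ~0.7% of peak remains at 20 ms
```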


Simulating a 60 Hz CRT Accurately Via 1920 Hz Display: 32 Digital Refreshes Per 1 CRT Refresh

For example, a future 1920Hz display gives us 32 digital refresh cycles to simulate 1 CRT refresh cycle for a 60Hz CRT. This is done by using a GPU shader to simulate 1/32nd of a CRT refresh cycle of the electron beam. So you generate 32 frames, each 1/1920sec worth of CRT electron gun simulation, to output to a 1920Hz display (1/60sec worth in total). Fast enough to be (more or less) a human-retina simulation of a CRT. And because it'd be a correct simulation of a CRT, including rolling scan and phosphor decay, the comfort will be equal to an original CRT (unlike uncomfortable 60Hz squarewave BFI).
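In sketch form (numpy here rather than a real shader, and with an assumed phosphor time constant), the per-subframe logic is just "decay everything, then freshly excite the band the beam scanned":

```python
# 32-subframe CRT-beam sketch: each 1/1920s output frame = the band the
# simulated beam just scanned at full brightness, plus everything scanned
# earlier faded by the phosphor curve. Numbers are illustrative.
import numpy as np

SUBFRAMES = 32               # 1920Hz output / 60Hz CRT refresh
TAU_MS = 4.0                 # assumed phosphor decay constant
SUB_MS = 1000.0 / 1920.0     # duration of one output refresh

def crt_subframes(frame: np.ndarray):
    """Yield 32 output frames simulating one 60Hz CRT refresh of `frame`."""
    h = frame.shape[0]
    lit = np.zeros(frame.shape, dtype=np.float64)
    decay = np.exp(-SUB_MS / TAU_MS)   # per-subframe phosphor falloff
    for i in range(SUBFRAMES):
        lit *= decay                   # everything already drawn keeps fading
        top = (i * h) // SUBFRAMES
        bot = ((i + 1) * h) // SUBFRAMES
        lit[top:bot] = frame[top:bot]  # beam freshly excites this band
        yield lit.copy()
```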

You'd get the same artifacts (e.g. plasma contouring, DLP noise, rainbow artifacts, CRT phosphor trails), because of the accuracy of simulation of the retro display afforded from the fine granularity of simulation made possible by brute Hz.

That being said, we ideally need HDR so we can flicker the pixels even brighter (if possible), because a CRT electron beam can be extremely bright. 2000-nit HDR will allow a 200-nit CRT electron beam simulation (if software-based phosphor decay is adjusted to ~90% blur reduction).

A Retina-Everything (Resolution, Hz, HDR) Display Can Software-Simulate Any Prior Display Accurately

4K 240Hz OLED, then 4K 480Hz, then 8K 240Hz, then 8K 500Hz, then 8K 1000Hz (and so on) -- display algorithm simulators will become more and more accurate as the century proceeds, and we'll achieve a perfect Turing test (an A/B blind test between a flat MicroLED/OLED and a flat CRT tube) once we retina-out resolution AND refresh rate AND HDR. Such a display can theoretically simulate any display before it!

This century will be lots of fun attempting to do retina simultaneously for all of those (resolution AND refresh AND HDR/color gamut). Tomorrow's temporal retro-display simulators will be written as GPU shaders, and open source projects will permanently preserve retro displays in a display-independent, OS-independent manner. Just throw sheer brute Hz at it and you've got your magic. When nobody can purchase a rare used Sony GDM-W900 CRT in the year 2045 for less than $10,000, you just purchase an 8K 1000Hz MicroLED/OLED and download a CRT-electron-simulator GitHub project instead. VOILA!!!!

Note: the retina threshold can vary from human to human, but one can simply target the 99th-percentile threshold, as an example, to capture most of the human population's sensitivities. Remember, geometric upgrades are needed for Hz (laboratory tests show that 240Hz-vs-1000Hz is easier to tell apart than 240Hz-vs-360Hz). This is true also for 1000Hz versus 4000Hz. Researchers discovered you need roughly 4x refresh rate increases in those stratospheres for human-visible differences, due to the diminishing curve of returns. Temporals behave like spatials in that the closer you get to retina, the bigger the jump you need to see even a very marginal difference. Like DVD-vs-4K is easier to tell apart than VHS-vs-DVD.
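A quick back-of-envelope with the persistence rule (assuming an illustrative 960 pixels/sec of eye-tracked motion) shows why the jumps must be geometric:

```python
# Sample-and-hold blur shrinks as 1/Hz, so equal absolute improvements need
# ever-larger multiplicative refresh-rate jumps. Speed value is illustrative.
SPEED = 960.0  # px/s of tracked motion

def blur_px(hz: float) -> float:
    return SPEED / hz  # eye-tracked blur width in pixels

for a, b in [(240, 360), (240, 1000), (1000, 4000)]:
    print(f"{a}->{b}Hz: {blur_px(a):.2f} -> {blur_px(b):.2f} px "
          f"(saves {blur_px(a) - blur_px(b):.2f} px)")
# 240->360 saves ~1.3px, 240->1000 saves ~3.0px, 1000->4000 only ~0.7px.
```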

Long term, we would like to see an open source project that creates a Windows Indirect Display Driver, to apply a display simulator to everything you do in Windows (including running emulators and games that don't have an accurate CRT-temporal simulator, as HLSL simulation of texture doesn't make motion blur / phosphor decay identical to an original CRT -- just try playing Sonic the Hedgehog in MAME, even with MAME HLSL). Adding temporal HLSL simulation in addition to spatial HLSL simulation solves that problem.
 

cortadew

Member
Correct. Yes, this is the path of the future (10-year timeline). [snip]
Be my friend please
 

Panajev2001a

GAF's Pleasant Genius
Correct. Yes, this is the path of the future (10-year timeline). [snip]
Or maybe CNTV will finally be practical and we will essentially go back to CRT’s but with lots of guns instead of a single one with big magnets to steer it :).
 

cireza

Banned
Correct. Yes, this is the path of the future (10-year timeline). [snip]
This sounds like a lot of effort to compensate for the wrong technology to begin with.
 

SkylineRKR

Member
Yeah, PDPs were godly. I had a V20 for approximately 8 years; it's still the best TV I've had. The CX is very good; it benefits from new tech like 4K, 120Hz, HDR, etc., but I feel it only just approaches PDP quality, despite those sets being 2009-2010 tech and 1080p max.

I bought a KS9000 and put my V20 up for sale (it still fetched a surprisingly good price; there are enthusiasts out there, and the grey colorway was exactly what that couple was looking for). I never liked the KS as much, despite it being billed as the top-tier 4K set of 2016. The viewing angle was worse, the color saturation was worse. I was never completely satisfied with it; that's why I sold it after 3 or 4 years. I always felt kind of dirty parting with the V20, as I knew it was better. But I had a PS4 Pro etc., and the V20 didn't have 4K/HDR.
 

Tarin02543

Member
I've found the perfect retro monitor:

The MicroLED display supports HDR and HDR10+, with a screen coating to enhance the colors and produce true blacks. The smaller two screens have a peak brightness of 1,000 nits, with the 165-inch version capable of up to 800 nits. All displays have a refresh rate of up to 3,840 Hz, with 16-bit color processing and an adjustable color temperature from 3,000 to 10,000 K.


The C Seed N1 TV range starts at €180,000 (~US$184,223).

 

nkarafo

Member
Especially with OLED, black levels and ghosting are no longer a problem
OLEDs still have ghosting. When was the last time you saw a moving image on a CRT? If it's been too long and you forgot, then you are lucky. It's better not to know, because you'll just get disappointed.

I guess, but you're still playing games on a CRT at like 460p or whatever. Time marches on.
Yeah, it doesn't matter if a new technology is shit; all that matters is to "get with the times". I fully agree with you.
 
I don't have a degree in image engineering to understand the discussion XD. Isn't there any modern CRT brand that produces models today?
 

rofif

Can’t Git Gud
OLEDs still have ghosting. When was the last time you saw a moving image on a CRT? If it's been too long and you forgot, then you are lucky. It's better not to know, because you'll just get disappointed.


Yeah, it doesn't matter if a new technology is shit; all that matters is to "get with the times". I fully agree with you.
It's the opposite. I went big on retro and CRT back in 2020. I have two CRT monitors in my "cellar" and unfortunately the last time I got 'em out was in 2020 :(
I got a 4K 27" 60Hz IPS around that time and a 48" OLED in 2021.
So a long time ago, really. Back then I was on a big LCD 240Hz/144Hz G-Sync crusade too, and the CRT looked much, MUCH better than VA did.
IPS and TN with high refresh rates looked not too bad.
The one thing that was better on CRT than on any LCD was Max Payne 1. It's a dark game with a lot of shades of grey, and it's really bad in motion on LCD. Not so bad on IPS, but TN and VA cannot handle the dark alleys.

I replayed Max Payne 1 and 2, RTCW, HL1, and Unreal in the month I had this plugged back in. 1024x768 at 120Hz.
It was great, but honestly it's not AS GOOD as people remember it to be. It is, no questions asked, better than VA. But better than OLED? I don't think so.
Sure, I know OLED still has ghosting, but it's so little I really don't mind, and compared to a CRT I have a huge screen, HDR, better colors and so on.

Wish I had some other room to set up for retro gaming with old PCs and consoles :(

Anyway - here is a pic from 2020
vZm9FQF.jpg


btw - out of all these games, Return To Castle Wolfenstein aged the best. It's an INCREDIBLE game.
Max Payne 2 also... but MP1, HL1 and Unreal are really antiquated quick-save spams. I love them for different reasons though. Games were so good back then!
 

dave_d

Member
It was great, but honestly it's not AS GOOD as people remember it to be. It is, no questions asked, better than VA. But better than OLED? I don't think so.
I agree with this, and I say that because in my area there's an "Arcade Museum" called Funspot which has tons of arcade games, mostly from the 80s, and those things all use CRTs. CRTs have advantages, but they're not the be-all and end-all. But yeah, if you can make it to Laconia NH, go check it out.
 

svbarnard

Banned
He may be confusing motion blur with low motion clarity. (motion blur can come from the source itself and not the display)

When moving, the sharpness/detail on modern displays tends to fade a lot, which ideally shouldn't be the case.

banners-motion-blur-faq-2x.png.webp


Just try to read something like text/signs when the camera scrolls/pans in games, or text/signs on a truck moving across the screen.

www.testufo.com shows motion clarity improving the more Hz your display can do.
I prefer to just call it motion blur: display motion blur. It's also easier for the average person to understand when you just say motion blur instead of motion clarity. I think the average person would have no idea what you were talking about if you said motion clarity.
 

01011001

Banned
I prefer to just call it motion blur, display motion blur. It's also easier for the average person to understand when you just say motion blur instead of motion clarity. I think the average person would have no idea what you were talking about if you said motion clarity.

I think the most accurate and easy-to-understand term would be persistence blur.
 

svbarnard

Banned
Correct. Yes, this is the path of the future (10-year timeline). [snip]
So I thought the whole point was eliminating motion blur? I thought the whole point was that modern flat-panel TVs such as our LCDs/OLEDs have a shit ton of motion blur compared to the old CRT TVs, and the only way to eliminate this motion blur was to achieve 1,000 frames per second?

I don't understand why we would try to use software to simulate a CRT TV on LCDs/OLEDs. I thought the two main things we wanted to eliminate were motion blur and the "phantom array effect," AKA the wagon-wheel effect.

"For example, a future 1920Hz display gives us 32 digital refresh cycles to simulate 1 CRT refresh cycle for a 60Hz CRT."

But if you had a display with a 1920Hz refresh rate (and a computer capable of playing a video game at 1920fps), you wouldn't have motion blur anymore, because according to Blur Busters motion blur is eliminated once you hit 1,000 frames per second. (And also according to Blur Busters, we will need a screen with a 10,000Hz refresh rate in order to eliminate the phantom array effect.)
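(For reference, here's the arithmetic behind that 1,000fps figure, per the Blur Busters persistence rule, assuming an illustrative 960 pixels/sec tracked motion speed:)

```python
# On a sample-and-hold display, eye-tracked blur width ~= speed * frame time.
def blur_px(speed_px_per_s: float, hz: float) -> float:
    return speed_px_per_s / hz

for hz in (60, 120, 240, 1000):
    print(f"{hz:>4}Hz -> {blur_px(960, hz):4.1f} px of blur at 960 px/s")
# 60Hz -> 16px, 240Hz -> 4px, 1000Hz -> ~1px: roughly CRT-like clarity.
```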

What I'm saying is, what's all this talk about using software to simulate CRT TVs? I thought the point was eliminating motion blur on flat-panel TVs. Here's what I want: just eliminate the motion blur and the phantom array effect and I'll be happy. But I don't suppose we'll be getting LCDs or OLEDs with a 10,000Hz refresh rate for at least another 50 years, correct?

FYI, sample-and-hold displays are LCDs/OLEDs. I appreciate you coming back to the thread, Mark Rejhon; you're a pioneer in the field of display motion blur and you're single-handedly changing the game.

F8IRoue.png
 

svbarnard

Banned
I think the most accurate and easy-to-understand term would be persistence blur.
So when it comes to the average person who isn't too tech-savvy, telling them that modern-day flat-panel TVs have a lot of "motion blur" compared to the old CRT TVs is a lot easier for them to understand than saying persistence blur. I think the average person would say, what the heck is persistence blur? Whereas everyone knows what motion blur is.
 

Yerd

Member
It's the opposite. I went big on retro and CRT back in 2020. I have two CRT monitors in my "cellar" and unfortunately the last time I got 'em out was in 2020 :( [snip]

I think I have a very similar monitor in my parents' basement. Was cleaning out stuff and they wanted to get rid of it, but I convinced them to keep it a little longer. I don't even remember what its capabilities are. Mine has a black shell. I don't know if I have any way to connect it to modern GPUs. I remember some VGA-to-HDMI adapter that came with an old AMD card, but I'm not sure if I still have that. It's been so long since I dealt with VGA; I think they might have male/female on different ends, or are the cable ends all the same? And now that I think about it, I think all the VGA cables were discarded in the cleanup...

Are they worth anything to anyone? I think that was why I kept it, to sell it. I need to test that thing out soon. They are so bulky and heavy, I don't find them worth the hassle of real estate and heft.
 

rofif

Can’t Git Gud
I think I have a very similar monitor in my parents' basement. [snip]

Are they worth anything to anyone? I think that was why I kept it, to sell it. I need to test that thing out soon. They are so bulky and heavy, I don't find them worth the hassle of real estate and heft.
These can be worth something to the right person.
Mine can go up to 180Hz if I lower the resolution enough.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I don't believe that we need 1000Hz displays to reach the clarity of a CRT. You get remarkable consistency with 240Hz. I bet 120 fps at 240Hz with BFI on an OLED would be indistinguishable: 1 frame, 1 black frame. Heck, 120 fps on my OLED without BFI is so damn close to perfect.

Also, someone mentioned that one tech where TVs/monitors would have a 960Hz refresh rate and, on something like 60fps content, would display 1/16 of the image per refresh. I don't know if that is something being talked about or actually developed. I'd love to see how that works.
 

nkarafo

Member
All these 1,000, 10,000, a-million or whatever refresh rates don't mean much.

In order to achieve clarity in these displays, the content must also run at such high frame rates.

I do have a 240Hz monitor, and when I feed it 240fps content the clarity is improved (better than 60Hz, not as good as a CRT). But if the content is the usual 60fps, which is what you get from most games or when you emulate console games, there is no difference; you're still getting a very blurred moving image. The refresh rate doesn't matter in this instance.

So there needs to be a hardware change, a technological change, for this to work. All the software tricks in the world will never be as good, because they need processing, and processing creates artifacts and input lag.

And all this just so we can get quality similar to what we had for decades before flat panels took over. Talk about a huge step back.


I don't believe that we need 1000Hz displays to reach the clarity of a CRT. You get remarkable consistency with 240Hz. I bet 120 fps at 240Hz with BFI on an OLED would be indistinguishable: 1 frame, 1 black frame. Heck, 120 fps on my OLED without BFI is so damn close to perfect.

Also, someone mentioned that one tech where TVs/monitors would have a 960Hz refresh rate and, on something like 60fps content, would display 1/16 of the image per refresh. I don't know if that is something being talked about or actually developed. I'd love to see how that works.
240Hz is not enough; I test it myself every day against my CRT TV and monitor.

BFI is also pretty bad because it mangles the colors and brightness.
 

svbarnard

Banned
I don't believe that we need 1000Hz displays to reach the clarity of a CRT. You get remarkable consistency with 240Hz. I bet 120 fps at 240Hz with BFI on an OLED would be indistinguishable: 1 frame, 1 black frame. Heck, 120 fps on my OLED without BFI is so damn close to perfect.

Also, someone mentioned that one tech where TVs/monitors would have a 960Hz refresh rate and, on something like 60fps content, would display 1/16 of the image per refresh. I don't know if that is something being talked about or actually developed. I'd love to see how that works.
The thing is, with black frame insertion (BFI) there can be flicker, from what I understand, and it makes the screen dimmer, so it has drawbacks.
 

svbarnard

Banned
All these 1,000, 10,000, a-million or whatever refresh rates don't mean much. [snip]

BFI is also pretty bad because it mangles the colors and brightness.
Yup, it makes the screen dim.
 

01011001

Banned
So when it comes to the average person who isn't too tech-savvy, telling them that modern-day flat-panel TVs have a lot of "motion blur" compared to the old CRT TVs is a lot easier for them to understand than saying persistence blur. I think the average person would say, what the heck is persistence blur? Whereas everyone knows what motion blur is.

I mean, tbh, a basic knowledge of the English language and a tiny bit of logical thinking should hopefully be enough for most people to understand what persistence blur might be.

Motion blur is also something the average person would not think about much or hear about, and it's also just 2 basic English words that describe something very directly, like persistence blur.
 

01011001

Banned
The thing is, with black frame insertion (BFI) there can be flicker, from what I understand, and it makes the screen dimmer, so it has drawbacks.

CRTs also flicker, and their max brightness usually doesn't exceed 250 nits.

So if you have a super-bright 1,000+ nit TV and use BFI, the image shouldn't really be much dimmer nor that much more flickery than a CRT, especially if the TV supports BFI at 120Hz.
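Rough math on that (illustrative numbers; real BFI duty cycles vary by TV):

```python
# Perceived brightness under BFI ~= peak brightness * fraction of time lit.
def bfi_average_nits(peak_nits: float, lit: int, black: int) -> float:
    duty = lit / (lit + black)
    return peak_nits * duty

# 120Hz BFI on a 1,000-nit panel, 1 lit + 1 black refresh = 50% duty:
print(bfi_average_nits(1000, 1, 1))  # 500 nits, still ~2x a 250-nit CRT
```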
 

svbarnard

Banned
All these 1,000, 10,000, a-million or whatever refresh rates don't mean much.



In order to achieve clarity in these displays, the content must also run at such high frame rates.
Yes, you are correct. In order to have a perfect screen with no motion blur and no screen artifacts such as the "phantom array effect," you would need a screen with a 10,000Hz refresh rate and a PC capable of playing your video game at 10,000 frames per second. I imagine it will take 50 years for this to happen. But technology is moving very fast these days. I can't imagine how advanced technology will be by 2050; people don't realize just how fast technology is moving. Let's put it this way: the rate of technological advancement is getting faster, not slower. In fact, I didn't even type any of this; I simply spoke to my smartphone and it turned my words into text for me. So yeah, I can't imagine what the 2050s will look like.
 

svbarnard

Banned
I mean, tbh, a basic knowledge of the English language and a tiny bit of logical thinking should hopefully be enough for most people to understand what persistence blur might be.

Motion blur is also something the average person would not think about much or hear about, and it's also just 2 basic English words that describe something very directly, like persistence blur.
No dude, you're wrong, but you're being stubborn about it. The average person is not too tech-savvy, and telling them that flat-panel TVs have a lot of motion blur is something they can easily understand. Please, you're being stubborn; just accept that you're wrong on this one.
 

K' Dash

Member
I have my old plasma Panasonic Viera at my parents'... I'll have to bring it home and do some testing.

I was looking to buy an HD CRT, but if I can avoid spending a few hundred and have good results, I'll be happy.
 