
Why aren't we all using CRT gaming monitors?

People keep mentioning noise... was that on all CRTs, or just ones that were damaged/poorly made?

(Again, I don't really remember what CRTs were actually like)
It's certainly much worse on some than others. My grandpa still uses a CRT and the whine from it is horrific, but he's long past being able to hear it. On my own CRT there's only a little noise, which I don't really notice unless I really focus on it.
 
Because people aren't tech-savvy enough. We like to think we are, yet here we are, looking at crap IQ through a goddamn TN panel.
 
I know I had CRTs that were quiet. I mean if you were right up next to it and listening for it, maybe you'd hear a hum/whine, but barely.

But like others have said, they are too f-ing heavy. I had a giant Sony Wega for years and it made every move a nightmare.

I'd also add the baseline for PQ has gone up dramatically. Even mediocre flatscreen TVs and monitors today are far better than 90% of CRTs. It's not a bad trade.
 
I had a CRT HDTV once, it was thirty inches and weighed 335lbs. No that is not hyperbole, the weight was posted on the box.

Amazing picture, but dang that thing was tough to manage.
 
CRTs had their pluses and minuses.

Pros:
-superior contrast levels
-superior black levels
-variable refresh rates (PC monitors obviously)
-variable resolutions

Cons:
-terrible geometry problems that were nearly impossible to get perfect even with very fine controls (warping images in scrolling games and pincushioning around the edges)
-terrible screen uniformity problems (especially if it wasn't a PC monitor with appropriate controls to adjust it)
-terrible color accuracy problems (the color temperature would vary depending on how warm the electronics were)
-convergence issues (especially the sides and corners)
-phosphor wear or uneven phosphor wear (TV gets dimmer as it ages and color accuracy drifts)

I had several high-end CRT HDTVs, including Sony's flagship 40" XBR. They were a pain in the ass to calibrate properly. Getting the geometry and convergence perfect was impossible, so you had to make the best compromise instead. Once you had the thing tuned up, you had to touch up the calibration every couple of months because the color temperature drifted.

On the topic of PC monitors, they supported an amazing number of resolutions, but they were horizontally limited by the number of phosphor stripes they had and the number of holes in the mask. Monitors that claimed very high resolutions would do their best to display them, but there was not enough granularity to actually get anywhere near 2k horizontal pixels. So the monitor would scan out the image, and whatever detail made it through the mask was what you got.
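The mask-granularity point can be sketched with some back-of-envelope numbers (illustrative figures, not from any specific model):

```python
# Rough estimate of how many distinct horizontal details a CRT mask can
# resolve. Assumed figures: ~365 mm visible width (a typical 19" tube)
# and a 0.25 mm aperture-grille stripe pitch.
visible_width_mm = 365
stripe_pitch_mm = 0.25

triads = int(visible_width_mm / stripe_pitch_mm)
print(triads)  # 1460 phosphor triads across the screen

# A claimed 2048-pixel-wide mode can't map one pixel per triad:
print(round(2048 / triads, 2))  # ~1.4 pixels competing for each triad
```

Anything finer than the stripe pitch just blends together on the phosphor, which is exactly the "whatever made it through the mask" behavior described above.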
 
I am a Sony PVM owner, believe me, these things are amazing. Even a consumer Trinitron will look great for older game systems.
 

Interesting post, thanks for the information! Do you still have any CRTs?
 
Why? You have 4 pages of people telling you why gamers don't want them. I'll happily pay the premium to have a UHD, 144Hz, IPS, low-response-time monitor rather than a 100lb+, loud, hot, huge monitor on my desk.

Exactly. CRT monitors are certainly superior in terms of nearly everything from a visual standpoint. However, they are extremely inconvenient: heavy, they take up a lot of desk space (not everyone has tons of space behind their monitor; I have 2 inches), and obviously CRT isn't possible in a mobile form (you're not getting CRT on laptop screens).

The reasons they have fizzled out are obvious. Can current tech ever get back up to the quality of a CRT monitor? Maybe some day. The possibility and capability is out there, someone has to just decide to really go after it. I am not sure there is enough demand to do it.
 
I'm sure it was because they were hard to dispose of, and they weren't very energy efficient either.
They also emitted low levels of radiation.
Today's TVs are energy efficient and recycling-friendly.
 
Interesting post, thanks for the information! Do you still have any CRTs?

No. I abandoned my 40" CRT when I moved cross country a couple years ago. Carting around a 350lb TV just wasn't worthwhile no matter how nice it looked.

Another thing to consider is that everything on a CRT drifts. You do your best, but you check the calibration patterns again when it's warmer/cooler or time has passed and you would find that everything is slightly off (convergence, geometry, color temperature, uniformity, etc.).

Fixed pixel displays solved issues like convergence and geometry and this alone was a huge improvement. The biggest failings of LCDs are contrast levels and black levels. Most of this is due to the fact that LCDs are not emissive displays. All the light leakage clouds the image.
 
It's a shame that we don't have any way of convincing TV manufacturers that many gamers, I'm sure, would be willing to pay a little extra for the benefits of CRT in HD flat-panel tech, such as the (for now) abandoned SED and FED screens.

My Philips 40" is as close to the CRT experience as I've seen from any HD TV, but it still can't beat the contrast, correct colour reproduction, white/black response time, and lack of motion blur of a CRT.
 
I don't really know about monitors.

However, I have both HD and CRT TVs for gaming, and I'm very happy with both.

But when I got my first HD TV, it was such a HUGE downgrade. Everything was blurry, whenever I would rotate the camera in a game, my eyes would bleed.

I now have a LED Sony TV, took the smallest size possible that offered 1080p, and the result is very good. Mainly play Wii U (and some Xbox One) on it.

My CRT of course is still rocking despite being 10 to 15 years old, and I play a lot of my old consoles on it. Which means something like 5-15 hours per week. Currently in a Shining Force III run. If new CRT TVs were manufactured, I would probably try one, but only if it has a Péritel/SCART plug.
 
I have an older PC with one of the last modern video cards to support s-video out (Radeon 4860), so a few years ago I picked up an old CRT TV off the side of the road to test it out. Games look amazing on it! And because I play them at such a low resolution, the old graphics card can handle most modern games (although not so much now with ports of PS4/XBone games being more common). The only problem is that UIs aren't designed with such low resolution in mind anymore, and can often be difficult to read. Forget Witcher 2 for instance.

I really wish they still sold video cards with S-Video out. Unless they do and I'm not aware?
 
Up until recently: Because they stopped making them. And they did that because LCD panels are better for working, easier to transport and have an attribute which is easier to sell to people (being thin).

Now: Because they don't support variable refresh :P

But seriously, especially until very recently when non-TN high refresh rate panels showed up, CRTs were incredibly superior displays for gaming. Just imagine what they could have done with one more decade of refinement.
 
Huggers knows all.

Haha cheers bro

Some interesting replies in here. I have a bit of a CRT obsession and have started to hoard them somewhat. There really is nothing I've seen to compare to a BVM or indeed a PVM. This is a video I made ages ago of me picking up and testing a BVM CRT. It gives some indication of how amazing a SNES can look through the right screen: https://youtu.be/Fb1zPp5GZSo
 
Back in 2004 I switched from a 19" (1600x1200) CRT (manufactured in 2001) to an Acer AL1714.

The main reasons back then were:
- Size: The 19" monitor was huge. It occupied 1/3 of my desk space. It was also so heavy and bulky that I hated carrying it to network sessions.
- Display size: Even though the CRT was a 19" model, it had the viewable size of a 17" TFT display.
- Power consumption: If I recall correctly, the power consumption was three times that of the TFT display. And as energy is not as cheap in Germany as in the US, it saved me some good money over a year.
- Radiation: As I used my computer 10+ hours a day, the lack of radiation hitting your eyes on a TFT was a big plus.
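The power-consumption point can be roughed out like this (illustrative wattages and tariff; actual figures vary by model, year and country):

```python
# Back-of-envelope yearly savings from swapping a CRT for a TFT.
# Assumed: CRT ~105 W vs. TFT ~35 W, 10 h of use per day, 0.20 EUR/kWh.
crt_watts, tft_watts = 105, 35
hours_per_day, eur_per_kwh = 10, 0.20

saved_kwh = (crt_watts - tft_watts) / 1000 * hours_per_day * 365
print(round(saved_kwh))                   # ~256 kWh saved per year
print(round(saved_kwh * eur_per_kwh, 2))  # ~51 EUR per year
```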

The only downsides I encountered were the bad performance of the TFT display at lower resolutions and the step back I was forced to make in overall screen resolution.

I still own an old Commodore 1084S CRT for my retro "needs", but I'm glad that the days of CRT monitors are over.
 
I work in a uni and we still use CRT monitors for testing participants and other research.

The main drawbacks are moving the things, finding them to buy, repairing them when they break, and the prices they now reach.

Sometimes you will get lucky and find one on eBay for £40 but most of the time you are looking at companies that have bought up old stock and know what they have so they are selling them for £800 or something.

I looked into moving our testing to flat panels to save space and power, but to get one that meets the requirements of the research previously done, with the same spec as a CRT, you are looking at spending the best part of £3000.
 
FLAC audio isn't the norm for the mass market. But among audio enthusiasts, it's actually becoming quite common.

As far as I'm aware, there is literally nobody on the entire planet who is currently building CRT monitors. I understand they're not for everybody, but it seems like something gaming enthusiasts would want.

Gaming enthusiasts spend hundreds of dollars on low-latency TN displays with crappy colors and crappy viewing angles. A CRT would be even LOWER latency without either of those downsides, while ALSO removing the need for antialiasing, because the pixels are round and soft like the dots of ink from a printer.

Sure, you can still find used CRTs at garage sales, but those suck. High quality CRTs are hard to find, and it's only going to get harder.
I'm sorry to call you out on this again but this is FUD.

Many N64 games had edge-based AA; why would they do that if CRTs removed the need for anti-aliasing? At the launch of the PS2, a complaint among some enthusiasts was how prominent the jaggies looked in some games; again, how would people notice that if CRTs naturally removed jaggies?

What some people like UncleSporky in this thread are talking about is pixel artists taking advantage of a TV CRT's focus/convergence issues, but those focus/convergence issues are not a real anti-aliasing method for 3D games. Also, the amount of focus/convergence issues a CRT has varies from CRT to CRT and is most prominent on the bigger TV CRTs.
 
Maybe when I used to play cs 1.6.

CRTs are heavy as fuck, take up too much space on your desk, and aren't widescreen. Not to mention I doubt they support 1080p.

If they do, that's crazy. Black bars suck though.
 
You know, I've been wondering a lot lately why no one makes a kickstarter for a new CRT monitor with inputs for both a PC and old consoles. But I don't understand the economics of what it would take to make that happen.
 
I would buy one if they were still available. I still use CRTs for specialised use cases and they are becoming harder to find.

Kickstarter that shit, you got $2k+ from me.
 
Maybe when I used to play cs 1.6.

CRTs are heavy as fuck, take up too much space on your desk, and aren't widescreen. Not to mention I doubt they support 1080p.

If they do, that's crazy. Black bars suck though.
It helps to read the thread.
LCD monitors have only recently caught up to CRT resolutions and refresh rates from 10+ years ago.
CRT monitor tech was pretty advanced in the early 2000s, with resolutions of 1440p and up and refresh rates of 100Hz and up.
Widescreen CRT monitors were starting to get common too before the LCD marketing started taking over.
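Those figures are plausible because a CRT has no fixed pixel grid; the ceiling is the tube's horizontal scan rate. A rough sketch, assuming an illustrative 121kHz high-end tube and ~5% vertical blanking overhead (real modelines vary):

```python
# Max vertical refresh a CRT can drive at a given resolution, limited by
# its horizontal scan rate. Assumed: 121 kHz tube, ~5% vertical blanking.
H_SCAN_HZ = 121_000
BLANKING = 1.05  # extra scan time spent in vertical blanking

def max_refresh_hz(active_lines):
    return H_SCAN_HZ / (active_lines * BLANKING)

print(round(max_refresh_hz(1440)))  # ~80 Hz at 1440 active lines
print(round(max_refresh_hz(960)))   # ~120 Hz at 960 lines
```

Lower the resolution and the same tube happily runs faster, which is exactly the variable-resolution/variable-refresh flexibility listed in the pros earlier in the thread.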

I agree that CRTs are greater when it comes to gaming. When you think about it, they still have a superior black-to-white contrast ratio, no issues with viewing angles, and no input lag. CRTs are an old tech, but they still have some advantages over current LCDs. But on the downside, they are bulky and use way more electricity than flat screens. In the long run, though, I do think we are starting to see some of these downsides of flat screens disappear.





They are incredibly difficult to find these days and LCDs have come a very long way and are still improving hugely with every passing year.


I keep seeing this misconception here.

No amount of time or research will allow LCD technology to overcome its flaws:

-the contrast will never approach that of a CRT or plasma, because LCD pixels do not produce light; they filter light from a backlight, so you cannot control the brightness of individual pixels.
No amount of 'LED backlighting' or 'dynamic contrast' (just dimming the backlight) can solve this.
It's an issue inherent to the technology. (OLED and plasma are much superior to LCD in this regard, as their pixels produce their own light, just like a CRT's.)

-the BIG one:
LCD tech* has one major glaring flaw that really REALLY hurts it in gaming (and hurts it far less in movies, where a lot of the time is spent in static scenes or slow panning shots): the way it refreshes its images.

*and OLED tech too, btw, so don't expect OLED to save us, despite it having light-producing pixels (which solves the contrast problem) and infinitely better pixel response time (which solves LCD smear and ghosting, but does NOT give you motion clarity anywhere near CRT or plasma; I'll explain why now).


LCD (and OLED) panels use a technique called 'sample and hold'.
What this means is that the pixels are continuously lit (because of the backlight), and that the pixels will (painfully slowly in the case of LCD tech, yes, even with a 'fast' TN panel) turn to the correct orientation to filter through the correct color of light, then hold that state until it's time for the next frame.
There are 2 problems with this, one BIG and one medium for LCD (and not an issue for OLED):

Medium: pixel response/transition time. LCD pixels are SLOW to reorient themselves.
Grey-to-grey transition on a faster TN panel at high refresh rate might be 1ms-ish, but that's the marketing number; other color transitions are significantly slower, often up to like 8ms for an IPS panel (at 60Hz, half of your frame duration may be spent transitioning a pixel to the right color).
This means your screen is showing you the wrong color pixel much of the time (let this sink in, really; it's such a glaring defect, and defect is the only word that's right for it), causing a very smeary image in panning scenes (pretty much all the time in side-scrolling or first-person games) or 'ghosting' (a ghost outline of the previous frame from pixels that haven't changed yet, especially visible at high-contrast edges in motion).
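To put the transition problem in perspective, here's the arithmetic using the numbers above (8ms worst-case transition, 60Hz refresh):

```python
# Fraction of each frame an LCD pixel can spend mid-transition,
# i.e. showing the wrong color, given an 8 ms worst-case transition.
transition_ms = 8
frame_ms = 1000 / 60  # ~16.7 ms per frame at 60 Hz

print(round(transition_ms / frame_ms * 100))  # ~48% of the frame
```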

BIG: pixel persistence.
Our eyes cannot resolve sample and hold motion properly.
Unless you expect to get bionic eyes this is a problem that LCD and OLED will always have.
Instead of me typing up some long explanation, it's much easier and better to link to some really good ones. Please read at least one of them; it's very interesting stuff:

An easy, basic explanation:
http://www.cnet.com/news/black-frame-insertion-busting-blur-from-oculus-to-lcd-tvs/

A bit more detailed:
http://www.blurbusters.com/faq/oled-motion-blur/

What it looks like (especially you guys with your '1ms response time' monitors are in for a laugh):
http://www.testufo.com/#test=eyetracking
(on a crt the moving ufo will look exactly like the static ufo, you can track it perfectly with your eyes, this is the fabled crt and plasma motion clarity and it's SO objectively superior to the garbage result on lcds that the comparisons to FLAC music format earlier in the thread make me cringe:p)
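The eye-tracking blur in that test follows a simple rule of thumb: perceived smear is roughly scroll speed times how long each frame stays lit. A sketch assuming the UFO test's typical 960 pixels/second:

```python
# Perceived motion blur on a sample-and-hold display, approximated as
# scroll speed x frame persistence (the rule of thumb Blur Busters uses).
# Assumed speed: 960 px/s, as in the testufo.com demo.
speed_px_per_s = 960

def blur_px(persistence_ms):
    return speed_px_per_s * persistence_ms / 1000

print(blur_px(1000 / 60))   # ~16 px of smear at 60 Hz sample-and-hold
print(blur_px(1000 / 120))  # ~8 px at 120 Hz
print(blur_px(2))           # ~2 px with a 2 ms strobed pulse: near-CRT
```

This is why halving the hold time (higher refresh, or strobing) halves the smear, while pixel response time alone can never get you there.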
As the article explains, there is a band-aid to try and mimic the CRT pulse (backlight strobing, aka black frame insertion), but it is only available on expensive high-refresh-rate monitors, lowers your brightness, and more importantly requires a high-end (very expensive) PC, because you need to maintain 120+ fps in your games.
It's literally a fraction of a percent of gamers that get to enjoy good motion clarity on LCD monitors... while every single CRT user enjoys perfect motion clarity.

Ironically, backlight strobing also introduces the same flicker as you get with a CRT (a non-factor, but a popular marketing parroting point in this thread!). Anyone who knows what sample and hold is understands the overwhelming irony of TV-manufacturer marketing having managed to convince the masses that a worthless method such as sample and hold is preferable to the far superior CRT/plasma pulse.


There are many other downsides to LCD tech, but some of them have to a point been overcome, or at least bandaged. Sample and hold and backlights, though, mean that there will never be such a thing as LCD monitor tech catching up to CRT. It's not going to happen.

The only replacement for CRT will be a different (non sample and hold) future technology. Oled is not it either... (though it's way better than LCD in everything except for motion clarity)

If only you guys knew how many superior technologies in development (promising and viable) were canned 4-6 years ago because of high LCD panel profit margins at the time.
You think you're supporting an industry and that its success and your money is accelerating technological innovation and advancement but it is quite the opposite.
 
Stephen, but OLED with good electronics can perfectly emulate the refresh patterns of a CRT. Black Frame Insertion works great even without emulating Phosphor decay.
 
Buy one of the Sony LCDs that support strobing. It's like a big, flat CRT with all the advantages of a flatscreen. Normal LCD is such a massive step backwards for gaming.

Strobed 4K OLED is going to be incredible.
 
Sure some people would like em but there is honestly absolutely no room for that kind of shit in my apartment. I remember when every desk had to have those pull-out keyboard things so the huge ass heavy monitor could hog the actual desk space.
 
Stephen, but OLED with good electronics can perfectly emulate the refresh patterns of a CRT. Black Frame Insertion works great even without emulating Phosphor decay.

If you can run your games at high refresh rates, yes... I just did an 850-euro upgrade and I sure can't do 120fps in every game; for that I would need a 980 Ti.
As I said, it's always going to be a 1 percent luxury thing for gamers, it's not something the mainstream can take advantage of.
We need to bury LCD and get new technology that does not use sample and hold (and meanwhile we can have OLED as a bridge to fix some of the most glaring non motion related issues that LCD panels suffer from)

If people (wrongfully) perceive CRT as outdated technology, then I can't wait to hear people talk about LCD panels a few years after we finally move on to proper display tech.
 
People keep mentioning noise... was that on all CRTs, or just ones that were damaged/poorly made?

(Again, I don't really remember what CRTs were actually like)

Wait a goddamn second. You don't remember what CRTs are like, and you still insist they should be standard, even though you've got multiple pages talking about how impractical they are in today's tech market?
 
-could definitely notice flicker, even at 90Hz
-over time they seemed to progressively get darker, which made it difficult to see anything in games. On an LCD, a still image looks better because of it

I would be willing to try a CRT if there were new ones that didn't have those problems, but weight probably figures into why they would be expensive, on top of them not being mass-market items anymore. I remember when I switched from CRT to LCD, and it was gross how much of a difference in motion blur there was between the two, e.g. in Supreme Commander I could see the ships fly by looking completely clear on the CRT, and they were completely blurred on an LCD. I know that was on an early LCD which probably had a 16ms response time, but we still can't have that on LCD without gimmicky LightBoost. Still hoping for OLED to solve this problem, I guess.
 
-could definitely notice flicker, even at 90Hz
-over time they seemed to progressively get darker, which made it difficult to see anything in games. On an LCD, a still image looks better because of it

I would be willing to try a CRT if there were new ones that didn't have those problems, but weight probably figures into why they would be expensive, on top of them not being mass-market items anymore.

Highly doubt you can see flicker at 90Hz; most people can't even see it at 75Hz anymore, but w/e, not going to argue, especially as your only path to matching CRT motion is an ultra-low-persistence mode on an LCD panel, which will flicker the same amount for you.
You need to understand that the strobing is what allows your eyes to track the motion on the screen. It's a good thing; marketing firms have twisted it into a negative with their eye-fatigue FUD to pressure parents into spending a lot of money on a new TV for the sake of their kids' health.
The constant blur you experience on a high-persistence LCD panel in motion causes way more eye strain; the only time an LCD panel is easier on the eyes than a 75+ Hz CRT is when you're looking at a still image.

I do agree that CRTs lose brightness over time, but you need to put it into context. My Iiyama CRT retained more than sufficient brightness for 10 years; it was only in the last 2 years (when the monitor was 12 years old) that I had to put it on 100 percent brightness, and it was starting to suffer.

My current LCD brightness preference (set at 20 percent) is still a lower brightness than my CRT was giving me after 10 years. (Also, my blacks were blacker on the CRT; there was 10 times more shadow detail in dark scenes; there was no black crush; the colors DEMOLISHED those of my LCD panel.)
Who here is using a 10-year-old LCD monitor again? Who expects to use their current LCD for 10 years? Unlike LCD panels, a CRT was usually a ten-year purchase.
Let's pretend for a second for the sake of argument that a crt would deteriorate within 5 years to a lower than ideal brightness.
Your LCD starts off from day one with:
-black crush
-extremely poor contrast
-grey blacks unless you have a VA panel
-IPS glow if you have an IPS panel
-backlight bleeding
-extremely poor viewing angles if you have a TN panel or IPS panel (only VA has acceptable viewing angles) It's so bad that on a larger TN monitor you'll have visible color shift towards the edges of your screen because you're looking at them from an angle. If I move my head at all I can see the color shift on my TN monitor.
-either serious response time related blur or annoying overshoot (there is literally ONE very expensive IPS panel out of all the monitors out there that does not suffer from it, one exceptional miracle panel that most people here can't afford)
-extreme sample and hold eye tracking blur at 60 hz, still serious blur at 120 hz, needs 120hz BFI to approach acceptable clarity
-uneven backlighting on all but the best panels, even most of the 'best' gaming panels suffer from it


But you're worried about possible brightness issues years after your monitor would have ended up in a landfill somewhere.
 
I still think CRT has better picture quality, even current stuff looks better on it. Got rid of mine though because of the weight. I had a 32 inch one like 3 years ago. Took me a week to recover after I moved it out of the house.
 
Highly doubt you can see flicker at 90Hz; most people can't even see it at 75Hz anymore, but w/e, not going to argue, especially as your only path to matching CRT motion is an ultra-low-persistence mode on an LCD panel, which will flicker the same amount for you.
You need to understand that the strobing is what allows your eyes to track the motion on the screen. It's a good thing; marketing firms have twisted it into a negative with their eye-fatigue FUD to pressure parents into spending a lot of money on a new TV for the sake of their kids' health.
The constant blur you experience on a high-persistence LCD panel in motion causes way more eye strain; the only time an LCD panel is easier on the eyes than a 75+ Hz CRT is when you're looking at a still image.

I do agree that CRTs lose brightness over time, but you need to put it into context. My Iiyama CRT retained more than sufficient brightness for 10 years; it was only in the last 2 years (when the monitor was 12 years old) that I had to put it on 100 percent brightness, and it was starting to suffer.

My current LCD brightness preference (set at 20 percent) is still a lower brightness than my CRT was giving me after 10 years.
Who here is using a 10-year-old LCD monitor again? Who expects to use their current LCD for 10 years? Unlike LCD panels, a CRT was usually a ten-year purchase.

A 60Hz strobed TV with a dim backlight and good ambient lighting is better on my eyes than typical LCD with all the blur.

Anyway, gamers don't care about motion quality any more. LCD has taught people to unfocus any time anything moves.
 
-Had virtually no input lag, at all.

Modern gaming LCDs are typically close enough to this already.


-Maintained color accuracy at different viewing angles.

I'll expand your thought to color accuracy in general.

There are a bunch of drawbacks among VA, IPS and CRT. This will be resolved when OLEDs are common.

-Didn't have issues with aliasing.

True.

-Could cleanly scale down to lower resolutions.

LCDs can do that already. It's mostly a software implementation problem.

The only downside, as far as I can tell, is that CRT's are large and heavy. While this could definitely be a problem for some people, I'd imagine that a lot of dedicated gamers wouldn't mind the extra space.

You're underselling this. The FW900 is 100lbs. CRT monitors are just as bad as CRT TVs; they only take up less volume. The components needed to display the image are relatively compact but dense.


Besides, there is one thing you forgot to mention that LCDs only started addressing consistently 2 years ago.

CRT screens don't have motion blur. LCDs had terrible blur for years, but a new technique with almost CRT-level blur reduction arrived with displays that have strobed backlights.

Any G-Sync display will have it, but they're locked down to work only with Nvidia GPUs.

For gpu agnostic displays you can use this list.

http://www.blurbusters.com/faq/120hz-monitors/
 
A 60Hz strobed TV with a dim backlight and good ambient lighting is better on my eyes than typical LCD with all the blur.

Anyway, gamers don't care about motion quality any more. LCD has taught people to unfocus any time anything moves.

Those TVs use motion interpolation: 120 or 240Hz refresh rates with varying amounts of backlight-off time (lower persistence = better clarity, but it comes at an equivalent brightness cost).

If there is a TV that does it at just 60Hz, then it will have worse brightness than your average shutter-glasses 3D movie in a shitty cinema, and flicker as much as a 60Hz CRT.
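The brightness cost is just duty-cycle math. A sketch with an assumed 2ms strobe pulse at 60Hz (real sets use varying pulse widths):

```python
# Brightness penalty of backlight strobing: the light is only on for a
# fraction of each frame. Assumed: 60 Hz refresh, 2 ms backlight pulse.
frame_ms = 1000 / 60  # ~16.7 ms per frame
pulse_ms = 2

duty = pulse_ms / frame_ms
print(round(duty * 100))   # backlight lit only ~12% of the time
print(round(1 / duty, 1))  # so it must flash ~8.3x brighter to match
```

Shorter pulses give better motion clarity but make the brightness deficit even worse, which is why strobed modes are noticeably dim.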

Motion interpolation cannot be used for gaming because it creates massive amounts of input lag.

The only way to do backlight strobing for gaming is 120 or 144fps on a 120 or 144Hz panel; as I already explained, that is not viable for midrange computers (unless all you play is Counter-Strike :p).
Backlight strobing only serves a small niche of users for gaming, and even for them the extra cost counts as a really big downside/deterrent.


All the bandaids to make LCD panels better for gaming are such a waste of energy to me, energy that should be put in developing better display technology.
LCD tech is not worth dignifying, it's deeply flawed garbage.

unironic: "LCD technology was a mistake, it's nothing but trash" =p
 
What makes me sad is going into an arcade, which used to be a CRT mecca, and seeing Sega Rally and Daytona cabs with horrible 16:9 LCDs in place of the 4:3 CRTs. Working CRTs will be like gold dust in 10, 20 years' time.
 
I have 3 CRTs hooked up in the mancave. One is a 14" Toshiba I use for my Saturn shmups, TATE style. Then there's the 23" Trinitron, which is for my SNES and Genesis. And finally, the big daddy 36" Wega, which is for my GameCube, PS2 and Xbox, since it supports progressive scan.

I love CRTs. My wife basically has to stop me from picking them up every time I see one in good shape that's been left by the curb, which happens all the time. The 14" Toshiba was a great find at a yard sale for $5. Component and S-Video, complete with remote, in pristine condition.
 
All the bandaids to make LCD panels better for gaming are such a waste of energy to me, energy that should be put in developing better display technology.
LCD tech is not worth dignifying, it's deeply flawed garbage.
I agree with this. You're not going to find many agreeing with you here when so many spend their time arguing the quality of static images of moving video games, prioritizing the wrong things about gaming, a moving medium that focuses the user on recognizing and tracking patterns and objects in motion. LCD fights this very task with its shit-quality motion.
 
You need to be able to make and sell millions of them in order for it to be worth it, and nobody wants to buy CRTs and nobody wants to stock CRTs outside a handful of people.
 
I bought a flat-screen 19" CRT 10 years ago; that thing weighed 20kg. If you want that at 24", I bet it would be 30kg. That's maybe why.
 