
HD discussion #18902471: CRT vs. LCD vs. Plasma

the new sony 720p RP LCDs are hitting stores today? I will have to go to my local hifi shop and check them out :D

and oh man - price drop on the KD-34XS955 >_<

ah this is painful... must stay strong... further price cuts inevitable.... resist. shiny. new TVs.
 
Deg said:
Not only that. But 1080p displays will use newer technology. 720p TVs will mostly be older tech and have more flaws than their newer counterparts. :) The 1080p sets will be where most new advancements will be showcased. 720p TVs will be cheaper for that reason.

Take LCD for example. The tech just keeps getting better. But these advancements are mostly showcased in newer, better-spec models which overtake the old ones. :)



720p sets don't use the same bloody panels year in year out. Technology used in LCD will be applied across the board. You'll just get better yields of 720p panels so they'll be cheaper. In fact, I'd lay money on some of the early 1080p sets being similar to early generation LCDs, with manufacturers more concerned about getting the resolution out and not concentrating on contrast etc. That will improve over time. It's dangerous to assume that increasing the resolution will also improve other elements.
 
mrklaw said:
In fact, I'd lay money on some of the early 1080p sets being similar to early generation LCDs, with manufacturers more concerned about getting the resolution out and not concentrating on contrast etc. That will improve over time. It's dangerous to assume that increasing the resolution will also improve other elements.


They do, however. They improve all the time. You know there will be 1080p showcase models that companies will push as their flagships with new tech, and then roll out the whole range to suit different tastes. At the same time there will be cheaper LCDs aimed at people with less of a budget, yet they can get 1080p too. Everyone will be catered for.
 
I doubt best buy has them, but this is an independent shop that is pretty good about getting new stuff in, I'll probably call them and ask if they have them 1st, obviously :)
 
mrklaw said:
You mean all 1080i TVs do that? No disrespect, but I think you're wrong.
I have been researching HDTV since a year and a half before I got my first display in 2002. I'm not wrong. and for the record, the HD sets aren't doing ANYTHING different than the HD boxes that convert 720p to 1080i do. how do those boxes do it if it is so impossible?

Yes, they display as 1080i, but that's effectively 540p. 540 scanlines every 1/60 second; each 1/60 of a second the 540 lines are updated. Combined you get the 1080 lines over the course of 1/30 second. Effectively you have a 540p/60 display
umm.. except you can't equate 30i to 60p like that. the effective resolution, and what our eyes see, is 1080 lines. the human eye loses the ability to see individual flashes at somewhere between 20-50 flashes of light per second (the actual number is debatable). TV screens are at 60 flashes per second. So what happens past 50 flashes per second? The flashing light then appears as one constant light source.

http://dwb.unl.edu/Teacher/NSF/C11/C11Links/www.ece.wpi.edu/infoeng/textbook/node71.html

so we effectively see 1080 lines of resolution, not 540 lines. true we are only seeing 30 real frames per second, but again, our eyes do not detect that frame rate difference. what they detect is a drop in frame rate from the framebuffer on the console.
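not trying to settle the argument, but the raw timing numbers both sides keep throwing around look like this (a toy Python sketch, the names are mine, just the standard 60Hz field/frame rates):

```python
# Quick arithmetic on the 1080i-vs-720p timing claims above.
RATE = 60  # fields per second (1080i) or frames per second (720p)

lines_per_field_1080i = 1080 // 2   # each field carries 540 lines
time_full_1080i = 2 / RATE          # two fields -> all 1080 lines in 1/30 s

lines_per_frame_720p = 720          # whole frame arrives at once
time_full_720p = 1 / RATE           # all 720 lines in 1/60 s

print(lines_per_field_1080i, round(time_full_1080i, 4))  # 540 0.0333
print(lines_per_frame_720p, round(time_full_720p, 4))    # 720 0.0167
```

so the whole debate is whether your eye integrates those two 540-line fields into 1080 lines, or treats each 1/60 s field on its own.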

I would assume (please point me to where you discussed this if you disagree) that a 1080i set takes a 720p input and scales it down to 540p then whacks it out the tube.
why would you assume this when that would effectively ruin everything the FCC has been working on for the past 15+ years? There are two basic HD resolutions. 720 lines progressive and 1080 lines interlaced. If manufacturers started making boxes that did 540p by default could you imagine the lawsuits stemming from the box not doing HD?

What else can it do? It can't scale it to 1080 lines and then interlace it. First, upscaling is generally bad. Secondly, it takes 1/30 of a second to display the full 1080 lines in two passes of 540 lines. 720p changes the whole 720 lines in 1/60 of a second.
kind of like how the xbox, ps2, and GCN take a 60Hz progressive picture and interlace it to 60Hz interlace? next.

So even if 1080i sets scaled up then interlaced, you'd end up with a 1/30 update rate, so all your 60fps games go out the window.
no, because you are confusing frame rate, that is, the number of frames per second the framebuffer is rendering, with the refresh rate at which your TV displays the picture. Games still run at 60fps on the XBox even though they are being sent out through the composite cables, right? games were still running on the DC at 60fps even though you had NO progressive output on that system aside from the VGA cable, right?
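here's the frame-rate-vs-refresh-rate point as a toy model (my own throwaway names, obviously simplified): the game renders 60 distinct frames a second, and interlaced output still takes one field from EACH new frame, so nothing forces the engine down to 30fps.

```python
# Toy model: a 60fps game feeding an interlaced output.
def render_frames(n):
    """Pretend framebuffer contents: frame i is just the integer i."""
    return list(range(n))

def interlaced_output(frames):
    """One field per frame: even lines on even ticks, odd lines on odd ticks."""
    return [(frame, "even" if tick % 2 == 0 else "odd")
            for tick, frame in enumerate(frames)]

fields = interlaced_output(render_frames(6))
print(fields)
# Every rendered frame contributes exactly one field, so motion is still
# sampled 60 times a second even though a full 2-field image takes 1/30 s.
```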

I'd still have a nagging doubt about going out and buying a TV that isn't capable of displaying what is being put into it - talking about games now, not TV where there is a mix of standards.
and yet no one is complaining that their XBox 720p games look like shit on their 1080i sets. and while TV is mixed standards, ABC outputs 720p 24/7. Fox outputs 720p 24/7. ESPN outputs 720p 24/7. No mixed standards there. those guys are all 720p 24/7.
 
OK, good post.

So now I'm just gonna rail on interlaced standards generally :)

surely it's better to have progressive output? Look at the 480p stuff from xbox and PS2.

From a console point of view, if you have an interlaced output, you run the risk of edge combing (no idea what it's called, but it's when an object moving quickly horizontally shifts position between refreshes)

Maybe it's just more noticeable over here in 50Hz land..
 
Wow. Looks like the current 42 and 50" Sony LCD RPTV sets are being firesaled right now! JR.com has the current 42" model for $1599 and the 50" model at $1999 now! Must be trying to clear out inventory for the new models! :D
 
Are Blu-Ray and HD-DVD movies expected to be mostly in 720p or 1080p? I thought I heard 720p would be the norm.

If HD movies, HD broadcasts, and 99% of Xbox 360/PS3 games are going to be primarily in 720p I don't really see the immediate need for a 1080p set.
 
mrklaw said:
OK, good post.

So now I'm just gonna rail on interlaced standards generally :)

surely it's better to have progressive output? Look at the 480p stuff from xbox and PS2.
the argument I got into with shog was this. We don't see the interlacing. at 60Hz it is impossible for our eyes to see the actual interlacing process. what we see are the artifacts from the separate fields being flashed back and forth. the artifacts occur because critical detail is lost in the other field that currently isn't being shown (i.e. the other half of the picture). This is more critical on lower resolution video, where the video is made up of fewer lines, therefore getting rid of every other line has a much larger chance of eliminating finer detail on a given field.. follow me? I used this example in another thread, so I'll use it in this one also.

Code:
######
##  ##
######
##  ##
##  ##

Code:
############
############
####    ####
####    ####
############
############
####    ####
####    ####
####    ####
####    ####

now the bottom letter is twice the resolution of the top letter. but assume that the letters are really the same size on the screen (like a TV). Now what happens when you interlace the top letter? Every other field you will lose the two horizontal rows on the 'A'. This is what causes the flickering on an interlaced picture. Not that the fields are being alternated, but that entire parts of the image are literally disappearing every 30th of a second.

now let's look at the bottom letter. what happens when you interlace it? not a whole lot. no whole part of the image will disappear on any given refresh/field. Where before the horizontal lines would flash from not there to there, the new horizontal lines will merely change from one line to the next.

so no, interlacing is not bad by default. what makes it bad is low resolution video that gets destroyed by the process. when you crank up the resolution, the artifacting of interlacing an image becomes much less noticeable the larger you go. is a progressive picture always preferable? If we are talking "all other things the same" resolution progressive picture I would say sure, why not. You don't lose anything and gain something, even if our eyes might not and probably won't detect it. but that there is the crux. is it worth it to go progressive, even if your eyes can't detect it, when it means giving up a superior visual display (CRT)? Is it worth going progressive, even if our eyes can't detect it, when it means spending a bunch more money for a smaller picture? These are decisions everyone has to make. For me, and I would imagine most others, the answer is a very definite NO. It is not worth it to have a progressive picture when our eyes won't even notice it, in the process paying more for it, getting less, and having a relatively inferior display tech. Now there are other factors in there. Available space, price, convenience, etc.. These IMHO are MUCH more important factors compared to stupid shit like interlace or progressive or 720 lines or 1080 lines. I think how much space I have or how much money I have to spend are significantly more important than some visual difference my eyes won't see.
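if it helps, here's the same 'A' demo as a few lines of Python (my own throwaway names): split each bitmap into its two fields and see which row patterns a single field loses entirely.

```python
# borghe's two letters: same glyph at 1x and 2x vertical resolution.
low_res = [
    "######",
    "##  ##",
    "######",
    "##  ##",
    "##  ##",
]
high_res = [
    "############",
    "############",
    "####    ####",
    "####    ####",
    "############",
    "############",
    "####    ####",
    "####    ####",
    "####    ####",
    "####    ####",
]

def split_fields(img):
    """Even field = lines 0, 2, 4...; odd field = lines 1, 3, 5..."""
    return img[0::2], img[1::2]

def missing_detail(img, field):
    """Row patterns in the full image that this one field loses entirely."""
    return set(img) - set(field)

even, odd = split_fields(low_res)
print(missing_detail(low_res, odd))      # {'######'} -- whole features flicker away

even_hi, odd_hi = split_fields(high_res)
print(missing_detail(high_res, odd_hi))  # set() -- doubled rows keep every feature
```

the low-res letter loses its crossbars completely in one field; the high-res letter never loses a feature, which is exactly why the flicker gets less noticeable as resolution goes up.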

---- said:
Are Blu-Ray and HD-DVD movies expected to be mostly in 720p or 1080p? I thought I heard 720p would be the norm.

If HD movies, HD broadcasts, and 99% of Xbox 360/PS3 games are going to be primarily in 720p I don't really see the immediate need for a 1080p set.
all HD video discs will be 1080p. Every single one.
 
borghe said:
35mm? 70mm? 4k?

projection displays are perfectly perfect as long as you have the resolution and brightness to accommodate screen size and throw.


I can't use a projection technology for a screen I can draw on. That's far from perfect. :P

At least for now, LCD has a place in that respect.

BTW, all this SXRD hype should be controlled a bit here since we are in a gaming forum: HT forums are going ga ga over SXRD because it's a variant of LCoS that's improving greatly on the weak blacks of LCoS. These folks actually prefer LCoS over DLP because of its softer look!

For us in the gaming forum, we should embrace DLP over LCoS and its variants because we prefer the sharper look of the DLP over the softer look of the LCoS, which is better for HT folks that want better emulation of the look of film. If you want the equivalent of a giant sharp projected computer image, DLP is much better than LCoS.

It's videogames, people! We don't want the display technology providing half-assed additional anti-aliasing on top of the AA driven by the silicon on the GPUs (otherwise, we would be elated with the fuzzy AA that SD NTSC sets provide for us right now).
 
borghe said:
all HD video discs will be 1080p. Every single one.


that's why i will wait before i go 1080p :D Maybe 2 more years.

The only thing i feel bad about is all the dvds i have got over the years. Might upgrade a few :p
 
Shogmaster said:
I can't use a projection technology for a screen I can draw on. That's far from perfect. :P

At least for now, LCD has a place in that respect.

BTW, all this SXRD hype should be controlled a bit here since we are in a gaming forum: HT forums are going ga ga over SXRD because it's a variant of LCoS that's improving greatly on the weak blacks of LCoS. These folks actually prefer LCoS over DLP because of its softer look!

For us in the gaming forum, we should embrace DLP over LCoS and its variants because we prefer the sharper look of the DLP over the softer look of the LCoS, which is better for HT folks that want better emulation of the look of film. If you want the equivalent of a giant sharp projected computer image, DLP is much better than LCoS.


There you go pimping DLP sets again! :lol

Braveheart.jpg


"You may take our DLP sets. But you'll never take OUR FREEDOM!"

What do you think about the D-ILA JVC technology? My biggest issue with DLP is rainbows. D-ILA is supposed to alleviate this.
 
Borghe,

The biggest problem I have with interlaced output on games consoles is movement, and differences between frames. Are you saying that'll be much reduced simply due to there being more lines? I really hope so.

Using your example, if the 'A' moves rapidly across the screen, it can move between fields, so you get


Code:
############
    ############
####    ####
    ####    ####
############
    ############
####    ####
    ####    ####
####    ####
    ####    ####
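to make the combing concrete, here's mrklaw's shifted-'A' picture as a quick sketch (names are mine): the object moves between the two field captures, and weaving the fields back together produces exactly that offset comb.

```python
# The object sits at the left when the even field is captured and has moved
# right by the time the odd field is captured.
capture_a = ["####....", "####....", "####....", "####...."]
capture_b = ["....####", "....####", "....####", "....####"]

# Weave: even-numbered lines from capture A, odd-numbered lines from capture B.
woven = [capture_a[i] if i % 2 == 0 else capture_b[i]
         for i in range(len(capture_a))]
for row in woven:
    print(row)
# ####....
# ....####
# ####....
# ....####   <- the comb edge on anything moving fast horizontally
```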
 
Shogmaster said:
To quote one of my buddies, "LCD's are a speed bump on the road to the perfect display technology."


That is a very good quote :D

---- said:
Are Blu-Ray and HD-DVD movies expected to be mostly in 720p or 1080p? I thought I heard 720p would be the norm.

If HD movies, HD broadcasts, and 99% of Xbox 360/PS3 games are going to be primarily in 720p I don't really see the immediate need for a 1080p set.


According to a Microsoft rep and the former BRD spokesman for Panasonic Hollywood Labs who frequently post over at avsforums, both HD-DVD and BRD movies will be authored @ 1080p.....as you may/may not know, film is 1080p/23.98fps and one of the reasons these new optical formats are being introduced is because they provide an easy way to author films at their native resolution.....1080p was the plan from the beginning..

Now as far as OUTPUTTING 1080p digitally....that will probably depend on the deck itself...

Toshiba has said their first gen HD-DVD deck will output either 720p or 1080i via HDMI while the PS3 will support 1080p output....I would imagine some HD-DVD/BRD decks will output full 1080p and some won't....your mileage may vary....

As far as 1080p TVs that support 1080p input via HDMI....they should be arriving later this week.....the issue up to now is the Silicon Image HDMI receiver chipsets in these TVs didn't support 1080p, but Zoran, Sigma Designs as well as Silicon Image are introducing 1080p HDMI transmitters/receivers and I believe the Mitsubishi 1080p TVs shipping this week incorporate the Silicon Image SiL 9011 receiver, which supports 1080p....so we'll see....
 
Well I would say then if all HD-DVD and BRD are going to be 1080p then definitely don't buy a tv set right now. Even if you don't see yourself replacing your DVD collection in 1080p you're still going to want to rent all new movie releases in 1080p.

The question I have though is how good are these 1080p sets going to be at handling 720p/1080i HDTV and 720p/1080i games? A lot of 720p HDTVs are horrible at handling standard definition tv signals. Is there going to be an affordable TV (sub-$2,000) any time soon that will be able to handle 480i/p, 720p, and 1080i/p resolutions? Right now there's definitely not anything on the market that I know of that will be able to properly handle everything from SDTV to HD-DVD and everything in between.
 
borghe said:
the argument I got into with shog was this. We don't see the interlacing. at 60Hz it is impossible for our eyes to see the actual interlacing process. what we see are the artifacts from the separate fields being flashed back and forth. the artifacts occur because critical detail is lost in the other field that currently isn't being shown (i.e. the other half of the picture). This is more critical on lower resolution video, where the video is made up of fewer lines, therefore getting rid of every other line has a much larger chance of eliminating finer detail on a given field.. follow me? I used this example in another thread, so I'll use it in this one also.

Code:
######
##  ##
######
##  ##
##  ##

Code:
############
############
####    ####
####    ####
############
############
####    ####
####    ####
####    ####
####    ####

now the bottom letter is twice the resolution of the top letter. but assume that the letters are really the same size on the screen (like a TV). Now what happens when you interlace the top letter? Every other field you will lose the two horizontal rows on the 'A'. This is what causes the flickering on an interlaced picture. Not that the fields are being alternated, but that entire parts of the image are literally disappearing every 30th of a second.

now let's look at the bottom letter. what happens when you interlace it? not a whole lot. no whole part of the image will disappear on any given refresh/field. Where before the horizontal lines would flash from not there to there, the new horizontal lines will merely change from one line to the next.

so no, interlacing is not bad by default. what makes it bad is low resolution video that gets destroyed by the process. when you crank up the resolution, the artifacting of interlacing an image becomes much less noticeable the larger you go. is a progressive picture always preferable? If we are talking "all other things the same" resolution progressive picture I would say sure, why not. You don't lose anything and gain something, even if our eyes might not and probably won't detect it. but that there is the crux. is it worth it to go progressive, even if your eyes can't detect it, when it means giving up a superior visual display (CRT)? Is it worth going progressive, even if our eyes can't detect it, when it means spending a bunch more money for a smaller picture? These are decisions everyone has to make. For me, and I would imagine most others, the answer is a very definite NO. It is not worth it to have a progressive picture when our eyes won't even notice it, in the process paying more for it, getting less, and having a relatively inferior display tech. Now there are other factors in there. Available space, price, convenience, etc.. These IMHO are MUCH more important factors compared to stupid shit like interlace or progressive or 720 lines or 1080 lines. I think how much space I have or how much money I have to spend are significantly more important than some visual difference my eyes won't see.


I understand what you're saying, but the whole interlaced vs. progressive scan argument has been a long running one.

EDIT: I tend to agree with you generally, but I thought I'd throw the thoughts of one of the industry's heavy hitters out there. :)

Here's what Joe Kane (Digital Video Essentials) has to say:

D-Theater - Questions and Answers

5) Which format is better?

Joe Kane Productions is strongly backing all progressive formats over any interlaced video option. Progressive images look better when objects in the pictures are in motion. A progressive image is complete in itself where an interlaced image is often different between the first and second half of the video signal. The differences between the two halves of the picture show up as interlaced artifacts. The vertical resolution of an interlaced signal has to be filtered in order to reduce the visibility of these artifacts. That reduces the real image resolution far below the scan numbers associated with the format. Digital compression of images, which is necessary to make them fit into the space allocated for a broadcasting TV channel or a D-Theater tape, is far more efficient with progressive video at the source than interlaced video.

6) If JKP is so strongly backing progressive images, why are you also making DVE available in the 1080i format?

In a word, heritage. Interlaced video has been with us since the beginning of television. Many of us had hoped that it would go away with the coming of HDTV. Unfortunately interlaced video is a bad habit that is hard for many analog oriented people to break. As a result interlaced video formats are part of HDTV rates. At the moment 1080i display capability represents a large portion of so called “HD Ready” TV sets currently on the market. We feel it is important to supply real setup and evaluation material for that format. We also suspect that many people seeing the true difference between the two versions of the program will want to shift to progressive video.

7) Are the two versions of DVE good for showing the differences between interlaced and progressive HDTV?

We believe the best demonstration of progressive versus interlaced HD video can be done with everyday video from the networks broadcasting each format. DVE can be used for such demonstrations with the limitation that it was produced in 1080p and does not have as many of the artifacts that would be present if it had been shot in 1080i/60. The quality of the display is an important factor. Many so called “DTV or HD Ready” sets don’t have the capability of showing any of the HD formats to their real capability, let alone adequately show the differences between interlaced and progressive video. Of course in these early days of DTV not all programming comes up to the capability of either 720p or 1080i.

8) If most "HD Ready" televisions aren't good enough to show the differences between interlaced and progressive video, why should I be concerned about the 720p tape of DVE being available?

We recall a time when broadcasters rightly claimed that consumers had no idea of just how good composite NTSC video looked. That was what got us started on consumer display device calibration. Along came the best of laserdisc and suddenly the consumer had the potential of seeing better quality NTSC than most broadcasters. The quality example was set in that format with many in the industry saying it couldn’t get any better. Then along came the component capability of DVD. It’s all part of viewers not knowing what they are missing until there is a new reference. The 720p tape serves as a reference for the capability of progressive high definition TV. On a good set it will show the difference between 1080i and 720p. It will help make a strong case for going progressive in high definition.

9) Why is it that when I watch ABC, which broadcasts in 720p, I don't see much of a difference between it and the other networks broadcasting in 1080i?

The majority of displays currently available to consumers aren’t capable of displaying 720p at 720p. The signal gets converted to 480p or 1080i. There are real picture quality losses in either of these conversions beyond the progressive to interlaced problem. The answer to this question is also dependent on the quality of the display itself, even if it can display 720p. Most “HD Ready” sets available in the mass market are not good enough to show either 720p or 1080i off to anywhere near their real capability.

Then there are individual programs that, for whatever reason, don’t look as good as we would expect for demonstrations of the capability of HDTV. If you look at the quality of the film transfer shown in DVE at index point 21 you’ll have a reference for the kind of detail we expect from HDTV programs. It was shot on 35 mm film and transferred to video on a Spirit DataCine at 1080p/24. It was then down converted to 1080i and 720p for the individual tapes made available in D-Theater. This production process represents what should be happening for all film based TV programs.

10) With all of the discussions of DTV being able to present much better image quality, why are there often times when upconverted standard definition material looks better from the station’s analog feed than from the digital feed?

This is most likely a problem at the station in their choice of equipment to convert standard definition material to digital high definition. All things being equal the analog path should not look as good as the digital path. This sort of thing is bound to happen in the early days of DTV where there is a lack of equipment and/or experience in the new digital medium. Even when the signal originated in the composite domain the station should have better equipment to convert that to the component signal required for digital transmission than is available to most consumers. That step alone should account for their having a better picture. Then there are all of the typical analog artifacts, such as ghosting and lower detail capability, that shouldn’t be a part of the digital signal path.

11) Why are you always putting "DTV" or "HD Ready" in quotes?

Over the past ten years we’ve written many articles explaining our position on what needs to be present in a display device for it to be truly capable of being called an HDTV ready set. In the late 90’s we proclaimed that if a CRT based set wouldn’t do 720p it’s not HDTV. Most “HD Ready” sets that are currently available in mass market retail stores are modifications of standard definition designs, not an all new design specifically engineered for HDTV. If they were designed for HDTV they would include a 720p display capability.

TV set manufacturers and the Consumer Electronics Association (CEA) would like you to believe that good HDTV sets are easily available and inexpensive. DVE will help you recognize the reality of that idea. Along with the CEA, JKP has proposed identification of quality levels of TV sets, essentially saying that not everything can be true HDTV, that there are steps of quality between standard definition and high definition. There is value in sets that do a lot better than current standard definition without going all of the way to HDTV. JKP is offering more stringent specifications for these categories than the CEA. We are taking the position that in the evolution of HDTV our first obligation is to make TV sets as good as they can be in their category. Making them less expensive comes after we have a firmly engrained expectation for quality.



borghe said:
all HD video discs will be 1080p. Every single one.

Perhaps for BR, but not according to Toshiba:

Toshiba Reaffirms Its HD-DVD Strategy

You have to register to read it, so here's the article:

This Week in Consumer Electronics said:
Toshiba Reaffirms Its HD-DVD Strategy


By Greg Tarr -- TWICE, 6/6/2005

ALBUQUERQUE, N.M.— Despite saying that the company continues to discuss a possible unification of rival high-definition optical disc formats, Toshiba's U.S. marketing executives speaking at the company's annual line show, here, held to earlier announcements that they will launch the first U.S. HD-DVD player in the fourth quarter.

Jodi Sally, Toshiba's digital A/V group marketing VP, unveiled to dealers and press an as yet unnamed HD-DVD player, which is slated to retail for “under $1,000” late this year.

The exact feature set of the player has not been finalized, but Sally said the player will play back HD DVDs, DVDs and CDs. High-definition resolution output will include both the 720p and 1,080i formats. Outputs will include HDMI and IEEE-1394, and the first player will support interactivity and Internet connectivity.

For audio, the player will include decoders for both Dolby Digital Plus and DTS HD, she added.

Sally said Toshiba will support the player launch with “an extensive marketing campaign,” involving advertising on television, radio and in print. Additionally, the company will work with key retailers to deliver in-store displays using high-definition monitors.

“Starting in the third quarter we want to start doing some consumer demonstrations at retailers across the country to increase awareness as a prelaunch campaign,” Sally said.

Company executives said Toshiba would like to settle on a unified standard, but it is also critical to launch HD DVD now should an agreement prove impossible.

“In order to avoid the difficulty of having a drop-dead date [for unification], these processes are going on in parallel,” said Mark Knox, a technical consultant on HD DVD to Toshiba. “We are going to continue to develop and deliver HD DVD on time this year. Only at such time as there is a clearly defined, established agreement are we going to change that plan.”

Knox acknowledged that a decision for unification could have a significant impact on Toshiba's planned investments in its HD-DVD marketing campaign this year, but “even now is the time that we need to be finalizing details. We just returned from the Media Tech show where we affirmed affordability and viability of [disc] production, and we are very excited about bringing HD DVD to market this year.”

Sally identified several factors that are encouraging Toshiba to market its player this year. These include increased penetration of high-definition television sets, the absence of packaged high-definition media in a disc format, and the peaking of the existing DVD market.

In establishing the next disc format, Toshiba felt it critical to keep a disc structure that is compatible with existing DVD discs, enabling today's DVDs to play on new HD-DVD players.

The HD-DVD player will feature a single objective lens capable of reading both red and blue laser discs, while discs will include intuitive navigation menus for interactivity.

As for disc capacity, a single-layer HD DVD disc will hold 15GB of data, a dual-layer disc will hold 30GB, and a just-announced triple-layer HD-DVD disc will hold 45GB capable of up to 12 hours of HDTV content, Sally said. A new hybrid “flipper” disc will enable storing HD content on one side and SD-DVD content on the other.

Contrary to recent industry rumors, Knox said development of a digital rights management (DRM) system — called AACS — planned for the player “is on track and won't delay the planned launch.”

The DRM system, Knox said, is based on device identification keys and watermarks, which Hollywood Studios will be able to use to prevent future software releases from playing back on machines that have been used to hack copyrighted HD-DVD discs. The device revocation capability has proven very popular with content rights holders, Knox said.

Also on track are previously announced plans for HD-DVD movies to support the player's launch. Present at the line show was Steve Nickerson, former Toshiba executive who is now Warner Home Video's (WHV) market management senior VP.

Nickerson said his company will have a sampling of titles ready to support various HD-DVD disc versions, including HD movies only, hybrid discs containing both HD and SD versions of a movie on one disc, and discs with HD and/or SD movies with interactive extras, such as video games. However, WHV has not determined which titles will be used for the hybrid or interactive discs. Pricing has not been determined, but a premium for HD discs versus current DVDs should be expected, he said.


Or are you saying that everything will be mastered at D5?
 
mrklaw said:
From a console point of view, if you have an interlaced output, you run the risk of edge combing (no idea what its called, but when an object is moving quickly horizontally, and it moves between refreshes)

Maybe its just more noticable over here in 50Hz land..

You mean when it looks like two images have been overlapped onto each other? Like the effect of exposing the same frame of film twice with an analog camera.. One of the pictures will look faded and ghost-like.. Isn't that just interlacing?
 
Shompola said:
You mean when it looks like two images have been overlapped onto each other? Like the effect of taking a picture with an analog camera on the same film twice.. One of the pictures will look faded and ghost looking.. Isn't that just interlacing?

Here ya' go Shompola:

Interlaced Video Explained

What he's talking about is the contested issue of interlacing artifacts for fast motion type content. IOW, the kind of stuff you find in televised sporting events and video games.
 
yah I know. but when people bitch about interlacing they aren't primarily bitching about flickering, they bitch about the issue I described.
 
Hokie Joe....the HD-DVD video will be mastered at 1080p....even if it is output at 1080i at the HD-DVD deck, as long as the embedded telecine flags are correct the 1080p display can "reconstruct" the original 1080p image via inverse telecine processing (3:2 cadence usually)......

This is how all of my D-theater tapes work and the deck outputs 1080i to my Qualia 006, which correctly displays it as 1080p....
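If it helps, here's a rough sketch (in Python, with toy frame labels standing in for real picture fields) of how that 3:2 cadence works and why correct flags let the display undo it losslessly:

```python
# Rough sketch of 3:2 pulldown: 4 film frames (24p) become 10 video
# fields (60i), and inverse telecine drops the repeats to recover
# the original progressive frames.

def telecine(frames):
    """Map 24p frames to 60i fields using the 3:2 cadence."""
    fields = []
    for i, frame in enumerate(frames):
        # alternate 3 fields, 2 fields, 3 fields, 2 fields...
        repeat = 3 if i % 2 == 0 else 2
        for _ in range(repeat):
            fields.append(frame)
    return fields

def inverse_telecine(fields):
    """Skip the duplicate fields to reconstruct the 24p frames."""
    frames = []
    i = 0
    repeat = 3
    while i < len(fields):
        frames.append(fields[i])
        i += repeat
        repeat = 5 - repeat  # toggle 3 <-> 2
    return frames

frames = ["A", "B", "C", "D"]   # four film frames
fields = telecine(frames)       # ten fields: A A A B B C C C D D
assert len(fields) == 10
assert inverse_telecine(fields) == frames
```

Real fields are half-pictures rather than whole repeated frames, but the cadence (and why the display can reverse it) is the same.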
 
Shompola said:
yah I know. but when people bitch about interlacing they aren't primarily bitching about flickering, they bitch about the issue I described.
actually in most instances that isn't noticeable in video games. where that primarily occurs is in scene changes. because the image is moving constantly and consistently between fields in a video game, our eyes don't really perceive that the first field of frame A is different from the second field of frame A, because it is continuous motion. Where this effect is primarily noticed is in movies and such, during scene changes, where field A is from one scene and field B is from the next.

it should also be noted that most of what you guys are still talking about is related to 480i, or relatively low resolution images. the game changes significantly when you bump up to 1080i.
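A toy way to picture the combing Shompola described (a made-up 12-pixel scanline with a block moving right; nothing to do with any real video pipeline):

```python
# Toy illustration of interlace "combing": a block moving right is
# captured in two fields 1/60s apart; weaving them together puts the
# object at two different positions on alternate scanlines.

WIDTH = 12

def field(x):
    """One scanline with a 3-pixel block at horizontal position x."""
    return "".join("#" if x <= c < x + 3 else "." for c in range(WIDTH))

even = field(2)   # field captured first (object at x=2)
odd = field(6)    # field captured 1/60s later (object at x=6)

# Weaving alternate lines from each field shows the offset edges:
woven = [even if line % 2 == 0 else odd for line in range(4)]
for line in woven:
    print(line)
```

The even lines show the block at one position and the odd lines at another, which is the sawtooth edge you see on fast horizontal motion.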

hokiejoe - I agree with you 100%. really, 100%. progressive is better. there is no way I would argue that an interlaced picture is better than or 100% equal to a progressive picture..... all things being the same. but here we aren't talking about all things the same. we are talking about different display technologies, each with their own artifacts and deficiencies. we are talking about various price points and sizes. we are talking about the relative quality of each set and additional pricing on top of that. when you take all of that into consideration, in my mind at least, the debate between progressive and interlaced, or 1080i and 720p, becomes very small compared to the other factors. I am more worried about whether I have space for a CRT, or if I really want another TV for 5+ years that I have to fuck around with convergence and geometry on. to me that stuff is way more important than, as I was saying, a visual discrepancy that I still, after 3 years, have never really noticed to the point of caring. so yes, I agree that progressive is better.. I am just disagreeing with other people here as to how much of a factor it plays in the next set I buy (which will probably be next year).

As for HD-DVD... I promise you HD-DVD will support 1080p. If not out of the gates then eventually. It makes no sense whatsoever, especially with 1080p sets out now and the HD-DVD format not even 100% finalized, NOT to support 1080p, especially when your competition does. At the very least, movies will be encoded the same way DVD is: 24p. of course TV will likely be encoded at 1080i (being that's what it's shot at for now) but the majority of releases will be 1080p. now of course whether the players support it at first is another story.
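For what it's worth, a quick pixel-rate comparison backs up the 24p point: encoding film at 1080p24 actually carries less raw picture data per second than 1080i's 60 half-height fields (toy arithmetic, uncompressed pixel counts only):

```python
# Raw pixel-rate comparison: 1080p at film rate vs 1080i at 60 fields/s.
w, h = 1920, 1080

p24 = w * h * 24           # 1080p24: full frames at the film rate
i60 = w * (h // 2) * 60    # 1080i: 60 fields of 540 lines each

print(p24, i60)            # 49,766,400 vs 62,208,000 pixels/s
print(p24 < i60)           # True: 1080p24 is the lower raw rate
```

So storing movies as 1080p24 on disc costs nothing extra over 1080i; the only question is whether the player's output stage bothers to send it out progressively.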

Actually my longshot belief is that we will still end up with one format. You can tell with the way the two keep trying for it that neither one wants to release separately, and that they know it will hurt both formats if they do.. we'll see, but my money is still on a unification.

edit - damn it.. kleeg beat me to the explanation of HD-DVD mastering. for the record though, I imagine players will have the capability out of the gates also. by the time the players are released (if we do see separate launches for the formats), the video output technology combined with the fact that the players will likely be $700-1000 means why would you save $80 by putting in a video signal generator only capable of 1080i instead of one capable of 1080p as well..
 
Well, we already know of a 1080p source.....the PlayStation 3

I fully expect you will see some BRD/HD-DVD products only support 720p and 1080i, while others will add 1080p outputs too....
 
---- said:
The question I have though is how good are these 1080p sets going to be at handling the 720p/1080i HDTV and 720p/1080i games? A lot of 720p HDTVs are horrible at handling standard definition tv signals. Is there going to be an affordable tv (sub $2,000) any time soon that will be able to handle 480i/p, 720i/p, 1080i/p resolutions? Right now there's definitely not anything on the market that I know of that will be able to properly handle everything from SDTV to HD-DVD and everything in between.


Well, that is the rub of fixed-pixel display technology, isn't it :)

The only display techs that can handle different resolution settings are CRTs, GLVs and Field Emission Displays (which are basically flat panel CRTs), and even they have a specific "ideal" resolution...

No, the burden of getting good HD/ED/SD video will most likely land on the shoulders of the scaler/deinterlacers which are getting better all the time...

If we can just get that Silicon Optix Realta HQV chip down in price a little more and include it in TVs......well....that would be just grand :D


As for disc capacity, a single-layer HD DVD disc will hold 15GB of data, a dual-layer disc will hold 30GB, and a just-announced triple-layer HD-DVD disc will hold 45GB capable of up to 12 hours of HDTV content, Sally said. A new hybrid “flipper” disc will enable storing HD content on one side and SD-DVD content on the other.


Yeah....too bad that triple-layer 45GB HD-DVD disc is total bullshit vaporware and wasn't even mentioned at the MS-Toshiba press event in Japan last week:

[image: 8cm-HD%20DVD.jpg]
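Side note on that "45GB, up to 12 hours" line from the quote above: the implied average bitrate is easy to sanity-check (decimal gigabytes, ignoring audio and overhead):

```python
# Back-of-the-envelope check: what average video bitrate does
# "45GB holding up to 12 hours of HDTV" imply?

capacity_bits = 45e9 * 8        # 45 GB (decimal) in bits
seconds = 12 * 3600             # 12 hours

avg_mbps = capacity_bits / seconds / 1e6
print(f"{avg_mbps:.1f} Mbit/s")  # ~8.3 Mbit/s average
```

~8.3 Mbit/s is only plausible for HD with the newer H.264/VC-1 codecs, not MPEG-2, which tells you what kind of encoding a claim like that assumes.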
 
borghe said:
As for HD-DVD... I promise you HD-DVD will support 1080p. If not out of the gates then eventually. It makes no sense whatsoever, especially with 1080p sets out now and the HD-DVD format not even 100% finalized, NOT to support 1080p, especially when your competition does. At the very least, movies will be encoded the same way DVD is: 24p. of course TV will likely be encoded at 1080i (being that's what it's shot at for now) but the majority of releases will be 1080p. now of course whether the players support it at first is another story.

Actually my longshot belief is that we will still end up with one format. You can tell with the way the two keep trying for it that neither one wants to release separately, and that they know it will hurt both formats if they do.. we'll see, but my money is still on a unification.


I agree with you that HD-DVD will eventually make it to 1080p.

I too hope that we see a unified format- screw format wars. I want my cake and eat it too, though, because I want BR capacity AND H.264/VC-1 encoding.

It doesn't hurt to dream I guess. :)
 
Kleegamefan said:
Hokie Joe....the HD-DVD video will be mastered at 1080p....even if it is output at 1080i at the HD-DVD deck, as long as the embedded telcine flags are correct the 1080p display can "reconstruct" the original 1080p image via inverse telcine processing(3:2 cadence usually)......

This is how all of my D-theater tapes work and the deck outputs 1080i to my Qualia 006, which correctly displays it as 1080p....


Yeah I knew that, I was just trying to understand what Borghe was getting at.

Thanks for the clarification though. :)
 
Mrbob said:
There you go pimping DLP sets again! :lol

[image: Braveheart.jpg]


"You may take our DLP sets. But you'll never take OUR FREEDOM!"

I don't own a DLP (nor plan to) so it's not a biased push for DLP on my part. It just happens to be the best projection tech for digital source content (i.e. computer generated images).

What do you think about the D-ILA JVC technology? My biggest issue with DLP is rainbows. D-ILA is supposed to alleviate this.

D-ILA is just another variant of LCoS. It's great for HT enthusiasts, but if you want, for example, to play Half Life 2 from your PC's video card blown up to 56", DLP kills it. But then again, watching non digitally generated movies transferred to DVD, LCoS variants look much more natural than DLP (the exceptions being digitally generated CG movies like Finding Nemo, which looks absolutely brilliant on DLP).
 
Kleegamefan said:
If we can just get that Silicon Optix Realta HQV chip down in price a little more and include it in TVs......well....that would be just grand :D

Yep, I've read some good things about it. From recollection, at one of the demo shows the writer was saying that he couldn't tell the difference between 480p and 720p. That sounds awfully romantic, but I'm certainly intrigued.
 
Shogmaster said:
D-ILA is just another variant of LCoS. It's great for HT enthusiasts, but if you want, for example, to play Half Life 2 from your PC's video card blown up to 56", DLP kills it. But then again, watching non digitally generated movies transferred to DVD, LCoS variants look much more natural than DLP (the exceptions being digitally generated CG movies like Finding Nemo, which looks absolutely brilliant on DLP).

I won't be hooking my PC up to it. Mainly X360 and PS3 games, which will be 720p native. My one concern has to do with the black level. I've read complaints that the black level is too bright in darkened areas with D-ILA sets?
 
Mrbob said:
I won't be hooking my PC up to it. Mainly X360 and PS3 games, which will be 720p native.

What do you think the X360 and PS3 are? They are just highly specialized PCs. They all primarily output images generated by their GPUs. I'm telling you, DLP will make those things POP compared to SXRD and other LCoS.

My one concern has to do with the black level. I've read complaints that the black level is too bright in darkened areas with D-ILA sets?

SXRD is definitely superior to D-ILA for blacker blacks from what I have seen.
 
OK....biased LCOS owner chiming in :)

I would take a serious look at the JVC D-ILAs if I were you....the newer xx585 series especially are really nice...

Both LCOS and DLPs have their pros and cons (can't dog on either since I own both) but LCOS is a newer, less mature technology than DLP, so it isn't as "known" a technology and its full potential hasn't been reached yet...

LCOS displays have been a royal PITA to manufacture over the last few years.....JVC and SONY were the first to perfect mass manufacture of LCOS TVs, but Brillian, LG Electronics and Hitachi will all introduce LCOS RPTV products later this year (and eLCOS and Philips will hopefully get their shit together soon too) so they *are* coming...

Compared to DLPs, you can squeeze more pixels onto a given IC and there seems to be no problem introducing 3-chip LCOS devices in TVs at prices competitive with 1-chip DLP DMDs...you never have to worry about rainbows with most LCOS TVs (the Philips LCOS used a rotating color-sphere-thingy, but they no longer make that POS) since they have 3 chips...one each for RGB....

LCOS also has the highest fill rate of any microdisplay technology (92%)...this gives a very smooth "film-like" image which is great for movies, and there is NO screendoor....silky smooth image, if that's your thing....

Some downsides compared to DLPs: we are still about 1 generation away from fully digital LCOS designs.....the first fully digital LCOS TVs to arrive will be the 1080p D-ILAs this fall, followed by the eLCOS panels, whenever they arrive, and perhaps the SXRDs and others will migrate to fully digital designs too.....as you know, DLPs are the only fully digital design so far and avoiding D/A conversion within the TV is a major plus in my book....especially for us gamers...

As good as the Sony SXRDs are, even they don't support a pure digital path to the display, and I can't imagine how good the PQ will be in a fully digital LCOS design....they are pretty kickass already...

As I said before, LCOS is not as mature as DLP...as such, it is a couple of years behind DLP in contrast ratio and black/shadow detail performance, though this is improving fast (the new SXRDs this fall will have 5000:1 CR). also...being a younger tech...LCOS tends to have more glitches and growing pains compared to DLPs...

To be fair, the new xHD4 DLPs will also display 1080p via wobulation....dunno if they will look as good as a true 1080p set though, that remains to be seen.........One thing is for sure though: DLPs excel at digital images like computer images and games, so you must consider that too...lost in the sea of variables yet :D


In other words.....everything I've said is for naught and you must decide for yourself what is best for you.....Just don't rule out LCOS no matter what the horror stories tell you :)
 
Holy shit Klee!

I just checked the presentation for the HQV on Silicon Optix's website- wow.

Oh, and from reading the reviews, they upconverted 480i and displayed it against a true 1080i source. The guy said he picked which source was true 1080i 3 out of 4 times, but said it was very close.

He also said that they demoed it against a Faroudja processor "showing what a 1080i signal de-interlaced to 1080p looks like on each processor". Apparently it was no contest- the Realta chip took the honors.

Sign me up.
 
Kleegamefan said:
Compared to DLPs, you can squeeze more pixels onto a given IC and there seems to be no problem introducing 3-chip LCOS devices in TVs at prices competitive with 1-chip DLP DMDs...you never have to worry about rainbows with most LCOS TVs (the Philips LCOS used a rotating color-sphere-thingy, but they no longer make that POS) since they have 3 chips...one each for RGB....

That's got nothing to do with squeezing more ICs per area. It's just how each technology handles projecting light. DLP is purely reflective. LCD is purely transmissive. LCoS is a combination of the two. You can overlap RGB with LCD and LCoS while obviously, you cannot with DLP.

Also, consumer DLPs make do with a single chip because they can. The rainbow effect from single chip DLP was not so bad that the market demanded a 2 or 3 chip version. Apparently that was not the case with LCoS. The market determined that RGB be separated for LCoS, since only those survived the market.

LCOS also has the highest fill rate of any microdisplay technology...this gives a very smooth "film-like" image which is great for movies...

Fillrate?!? This isn't GPUs. LCoS "fills out" more than DLP because it's partially transmissive. Some also call that being fuzzier. ;)
 
Shogmaster said:
Fillrate?!? This isn't GPUs. LCoS "fills out" more than DLP because it's partially transmissive. Some also call that being fuzzier. ;)


I thought that the higher fill rate of LCOS was due in part to the higher density of using three panels?
 
The Realta HQV stuff is muy impressive....


Considering Silicon Optix licensed UUUUUUUBBBBEEERRR expensive Teranex video processing technology (I am talking about their $60K video processor here) it should look good :D


Check it:

http://www.avsforum.com/avs-vb/showthread.php?t=444594&highlight=Realta


BTW, rumors have it CE TV firms are tripping over themselves to be the first on the market to include this chip in their TVs..

Happy days are ahead for sure :)
 
HokieJoe said:
I thought that the higher fill rate of LCOS was due in part to the higher density of using three panels?


No, there is a smaller interpixel gap on LCOS since the circuitry is actually *behind* the silicon substrate, and the light path makes TWO passes through the chip before hitting the screen..

3-chip Mercury DLP projectors (say, buddy, can you spare a 25 thousand dollar bill??) still have an 86% fill rate because currently there are limits to how small the gaps can be between each mirror on a DMD...remember, they have to rotate +/- 12 degrees, and that number will probably go up in the future...
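To make the fill-factor idea concrete, here's a toy calculation; the pixel pitches below are made-up illustration numbers (only the .35 micron gap figure comes from the SXRD discussion above), assuming square pixels and fill factor = (active width / pitch)^2:

```python
# Hypothetical fill-factor arithmetic: the fraction of each pixel's
# cell that actually emits/reflects light, given the pixel pitch and
# the dead gap between pixels (both in microns; pitches are made up).

def fill_factor(pitch_um, gap_um):
    active = pitch_um - gap_um        # light-producing width
    return (active / pitch_um) ** 2   # area fraction, square pixels

# e.g. an ~8.5um-pitch LCOS pixel with a 0.35um interpixel gap:
print(f"LCOS-ish: {fill_factor(8.5, 0.35):.0%}")   # ~92%
# vs a mirror with a wider gap needed for tilt clearance:
print(f"DMD-ish:  {fill_factor(13.7, 1.0):.0%}")   # ~86%
```

The point is just that fill factor falls out of gap-to-pitch ratio, which is why a smaller interpixel gap translates directly into less visible screendoor.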
 
HokieJoe said:
I thought that the higher fillrate ratio of LCOS was due in part to the higher density of using three panels?

If those 3 panels were separately capable of RGB, I can see that argument, but since each only handles R, G or B separately, I'd say no. And stop using the term "fillrate". That's a GPU term. You are just gonna confuse people (like you have done to yourself).
 
If it were me, I'd buy the CRT set. It's the cheapest solution for HD currently, and all the newer technology formats are imperfect for what they cost. I recently purchased a 53" Panasonic CRT set which I will hold on to until CNT tech hits the market.

edit: disclaimer - I haven't been keeping track of this thread so...
 
Shogmaster said:
That's got nothing to do with squeezing more ICs per area. It's just how each technology handles projecting light. DLP is purely reflective. LCD is purely transmissive. LCoS is a combination of the two. You can overlap RGB with LCD and LCoS while obviously, you cannot with DLP.

Also, consumer DLPs make do with a single chip because they can. The rainbow effect from single chip DLP was not so bad that the market demanded a 2 or 3 chip version. Apparently that was not the case with LCoS. The market determined that RGB be separated for LCoS, since only those survived the market.



Fillrate?!? This isn't GPUs. LCoS "fills out" more than DLP because it's partially transmissive. Some also call that being fuzzier. ;)


Actually, I must correct you on some things here:

[image: 0219e_b.gif]



SXRDs have .35 micron interpixel spacing, compared to more than double that on DMDs....


Here is a LCOS chip
[image: 05.jpg]


As you can see, the light source makes one pass through the glass substrate to the reflection electrode and then another pass through the substrate to the display...this 2nd pass helps reduce gaps between pixels at the display (screendoor), according to JVC whitepapers.....

Compare this with a DMD:

[images: DLP1.GIF, IMG0008539.gif, 03fig12.jpg]



The light source projects through the (rapidly spinning) color wheel and reflects off the mirrors on the surface of the DMD and out to the screen......since each mirror can rotate plus/minus 12 degrees, you must have a gap between mirrors to allow this....

There are no moving parts on an LCOS panel, and they use this advantage to produce smaller and more tightly packed pixels :)



Shogmaster said:
If those 3 panels were separately capable of RGB, I can see that argument, but since each only handles R, G or B separately, I'd say no. And stop using the term "fillrate". That's a GPU term. You are just gonna confuse people (like you have done to yourself).

The correct term is Fill Factor Rate


http://www.projectorcentral.com/news_story_368.htm

...but many people call it Fill Rate for short.....you can use either the term Fill Factor or Fill Rate when describing LCOS, just like these places did:

http://www.geocities.com/columbiaisa/tv_hdtv_01web.htm

http://www.epinions.com/content_3169230980

http://www.hdblog.net/index.php/2005/04/24/qualia-006/

Just FYI:)
 
Kleegamefan said:
SXRDs have .35 micron interpixel spacing, compared to more than double that on DMDs....

Isn't that comparing 1920x1080 native SXRD vs 1280x720 native DLP? Kinda not the same. ;)

There are no moving parts on an LCOS panel, and they use this advantage to produce smaller and more tightly packed pixels :)

Look man, all that's great, but the fact is, the LCoS look is not "filled out" as much as "fuzzy". I've never looked at a DLP image and said to myself, "look at all them gaps. It really needs to fill out!". DLP manages a plenty "filled-out" image without looking fuzzy. LCoS stuff looks fuzzier, period. That serves it well for making film look more natural, but let's not sugarcoat it into a feature. It's fuzzy, not filled out.



The correct term is Fill Factor Rate


http://www.projectorcentral.com/news_story_368.htm

...but many people call it Fill Rate for short.....you can use either the term Fill Factor or Fill Rate when describing LCOS, just like these places did:

http://www.geocities.com/columbiaisa/tv_hdtv_01web.htm

http://www.epinions.com/content_3169230980

http://www.hdblog.net/index.php/2005/04/24/qualia-006/

Just FYI:)

That makes me even more infuriated since folks are confusing it with the GPU term out of laziness! AAARRRGGGHHH!! This is a gaming forum first and foremost! Use "Fill Factor Rate" for that crap and leave "Fillrate" for proper GPU talk!
 
Shogmaster said:
If those 3 panels were separately capable of RGB, I can see that argument, but since each only handles R, G or B separately, I'd say no. And stop using the term "fillrate". That's a GPU term. You are just gonna confuse people (like you have done to yourself).


Oops, my bad: fill-rate.

HDTV PUB-Display Types & Methods said:
D-ILA is a type of LCD display that is owned by JVC. D-ILA, unlike LCD, uses a method of reflecting light through the elements twice before being passed to the lens. The fill-rate of D-ILA is said to be 93%.

http://www.hdtvpub.com/articles/whatisdtv/displaytypes.cfm


So, fillrate, fill-factor, fill ratio- who's counting?

Really man, lighten up. The question mark denoted that it was indeed a question. There was no reason to be snappish.
 
HokieJoe said:
Really man, lighten up. The question mark denoted that it was indeed a question. There was no reason to be snappish.

No need? NO NEED?!? IT'S WHAT WE LIVE FOR!!!!! *Knocks over virtual magazine rack*
 
Shogmaster said:
Isn't that comparing 1920x1080 native SXRD vs 1280x720 native DLP? Kinda not the same. ;)

Well, the interpixel spacing on the 720p and 1080p JVC D-ILA panels is the same size.....the 1080p D-ILA is just a bigger chip....


Look man, all that's great, but the fact is, the LCoS look is not "filled out" as much as "fuzzy". I've never looked at a DLP image and said to myself, "look at all them gaps. It really needs to fill out!". DLP manages a plenty "filled-out" image without looking fuzzy. LCoS stuff looks fuzzier, period. That serves it well for making film look more natural, but let's not sugarcoat it into a feature. It's fuzzy, not filled out.

You may notice I never spoke in absolutes.....picture quality is subjective, and LCOS "looking" filled out or fuzzy is not a fact, it is an opinion....

I very much realize this, which is probably why I summed up my post with:

kleegamefan said:
In other words.....everything I've said is for naught and you must decide for yourself what is best for you.....Just don't rule out LCOS no matter what the horror stories tell you

Gee....I even put an :) at the end of it ;)





That makes me even more infuriated since folks are confusing it with the GPU term out of laziness! AAARRRGGGHHH!! This is a gaming forum first and foremost! Use "Fill Factor Rate" for that crap and leave "Fillrate" for proper GPU talk!

They are talking about the display's ability to fill screen real estate with pixels....since when did every term have just one definition anyway?

Would you slap me or thank me if I said I could give you a daisy chain of tea bags....wait...don't answer that:D

No need? NO NEED?!? IT'S WHAT WE LIVE FOR!!!!! *Knocks over virtual magazine rack*

Oh shog......j00 so SILLY :)

Again, LCD/LCOS/DLPs have their pros/cons....here is how JVC describes them:

http://www.jvc.com/Presentations/HDILA/drawbacks.html


LCD

Fair Brightness Capability

LCD, Liquid Crystal Display, is a transmissive technology that allows light to pass through liquid crystal microchips to provide image data to the display screen.

The drawback of LCD is that transistors are required on each pixel, and wires must run between each pixel to transmit the image data.

Combined, these wires and transistors limit both the total area through which light can pass (limiting brightness) as well as the total area on which pixels can be placed (limiting the megapixel ratio).


DLP
Better Brightness Capability

DLP employs DMD, Digital Micro-mirror Devices, a reflective technology that uses micro-mirrors on hinges to reflect image data to the display screen.

The main drawback of DMD is that a hinge is required on, and a space is required between each pixel to allow the mirrors to move.

This space between the pixels and the space occupied by the hinges both limit the total reflective space available per pixel (limiting brightness) as well as the total area on which pixels can be placed (limiting the megapixel ratio).

DILA/LCOS
Best Brightness Capability

HD-ILA employs D-ILA, Direct-drive Image Light Amplifier, also a reflective technology, but it utilizes LCOS pixels which require virtually no space between them.

LCOS therefore is able to yield the highest aperture ratio (reflective area) and greatest total area for placing megapixels of any device type (maximizing megapixel count).

D-ILA further improves upon LCOS by using vertically oriented pixels and by adding an inorganic alignment layer that both stabilizes device performance and maximizes chip production.
 
so it's possible that initial Bluray players will be 1080i/720p only, not 1080p (kinda like how initial DVD players were 480i only), and then 1080p comes along later

Makes sense, especially as the 1080p standards probably aren't ratified yet.

But then if PS3 supports 1080p, and is out first, and is cheaper than other bluray players.....doesn't that kind of make them all redundant? I guess they'll have to be recorders to be worth buying?
 