QUOTE(Caldor @ Nov 11 2006, 06:52 AM)
In fact, it is you who needs to get a clue. Too many of you people are not professionals in these fields, yet you try to talk like you are. You have a superficial understanding of a complex field.
If you had studied this field instead of shooting off your mouth talking about things you don't understand, you would realise that it is impossible to perfectly convert interlaced fields to full progressive frames. Even the best consumer-level US$3000 video processors still leave interlace artifacts in certain situations.
Not having to do an interlaced-to-progressive conversion from the original 1080p/24fps source is a distinct advantage because it eliminates any interlace artifacting.
This is true, but contrast and blacks have improved in later generations of LCOS / LCD / Plasma.
SED and Laser will take over eventually.
CRTs are practical up to 60" screen sizes, which for most viewing distances is ideal for 1080-line sources. Think of the THX recommended viewing angle.
I'm a broadcast engineer at a post production facility in Santa Monica, CA. I'll be at NAB next April if you'd like to meet. Just last night I helped an editor do the pre-color-grade finishing for an independent feature film. It was shot on Varicam and edited natively on Final Cut Pro. We had to master it to D5 1080Psf 23.98 using nothing but FCP and it worked out great.
I'm also an AVS Special Member and I've had experience installing and engineering the NEC XG series CRT front projectors. I helped develop a solution for using RT effects with a motion JPEG codec on Final Cut Pro. I have experience working on a Xantus by Teranex. A real one. I've also been very successful at developing methods for doing software upconversions in house without having to go somewhere that has a Teranex. My credits are on IMDB as an editor and as a post-production supervisor.
So what is it you do for a living again?
Look, I understand why you think what you do. What you say is true for most standard-definition interlaced footage. Not so with 1080i content. The reality is that any and every 1080i master is made with proper 3:2 cadence. And it's digital, so re-interlacing is, for lack of a better word, perfect. Thus, any consumer device such as a Blu-ray player or a 1080p television set has the capability to deinterlace it 100% perfectly without any artifacts whatsoever.

Hey, but don't take my word for it. That article, even though I thought it was misleading, proves my point exactly. As do many others. The reason you think deinterlacing always causes artifacts is that you are taking known facts from the standard-definition world and applying them to HD. It doesn't work that way. HD masters are always mastered digitally and with correct field cadence, so you don't get any artifacts whatsoever.

The cheapest, crappiest deinterlacing chip (like the ones in the Westinghouse sets) can handle 1080i content flawlessly. Only computer monitors without any deinterlacing capability will have problems. And computer monitors don't support HDCP, so it's a moot point. Any non-CRT device that calls itself a television and can accept 1080i will remove the 3:2 pulldown perfectly from any 1080i HD source. A 1080p source is useful only for games.

This post has been edited by Lifter: Nov 11 2006, 12:51 PM
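For what it's worth, the "perfect re-interlacing" claim above is easy to sanity-check in software. Here's a minimal sketch (my own illustration, with made-up function names, not anything from the post): it applies 3:2 pulldown to four progressive "film frames", then removes it by weaving the matching field pairs back together, and checks that the recovery is bit-exact. This only works because the cadence is known and unbroken, which is exactly the assumption being argued about.

```python
# Hypothetical sketch: simulate 3:2 pulldown on 24p frames, then remove it.
# A "frame" here is just a list of rows; fields are the even/odd rows.

def split_fields(frame):
    """Split a progressive frame into (top, bottom) fields."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Re-interleave two fields back into a progressive frame."""
    frame = [None] * (len(top) + len(bottom))
    frame[0::2] = top
    frame[1::2] = bottom
    return frame

def telecine_32(frames):
    """3:2 pulldown: 4 film frames A B C D -> 5 interlaced frames with
    the field pattern At+Ab, At+Bb, Bt+Cb, Ct+Cb, Dt+Db."""
    out = []
    for i in range(0, len(frames), 4):
        a, b, c, d = frames[i:i + 4]
        at, ab = split_fields(a)
        bt, bb = split_fields(b)
        ct, cb = split_fields(c)
        dt, db = split_fields(d)
        out += [(at, ab), (at, bb), (bt, cb), (ct, cb), (dt, db)]
    return out

def inverse_telecine_32(interlaced):
    """Undo a known, unbroken 3:2 cadence by picking matching field pairs."""
    out = []
    for i in range(0, len(interlaced), 5):
        f1, f2, f3, f4, f5 = interlaced[i:i + 5]
        out.append(weave(*f1))           # A: both fields live in frame 1
        out.append(weave(f3[0], f2[1]))  # B: top from frame 3, bottom from 2
        out.append(weave(f4[0], f3[1]))  # C: top from frame 4, bottom from 3
        out.append(weave(*f5))           # D: both fields live in frame 5
    return out

# Four fake 8-row film frames with distinct per-frame pixel values.
film = [[(n, row) for row in range(8)] for n in range(4)]
recovered = inverse_telecine_32(telecine_32(film))
print(recovered == film)  # -> True: exact recovery, no artifacts
```

The catch, which cadence-detection chips have to deal with in practice, is that real broadcast streams can have the pattern broken by edits or mixed with true interlaced video; this toy assumes the ideal case of a clean digital master.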