QUOTE(mlmadmax @ Nov 12 2006, 06:35 PM)
Doesn't only 1080p60 have an advantage for video games? 1080p30, which is what's out right now, wouldn't have any advantage over 1080i even for video games, because both are capped at 30 frames per second.
In real-world terms, no. There will not be any visible difference when you view it on your TV set. In fact, for movies a 1080i source can actually be better on some TV sets. Some TV sets and video processors are able to do what's called an inverse telecine: turning 1080i60 back into 1080p24. Although the difference is very subtle, this is technically better than viewing a 24 fps film at 1080p30. Hardly a difference, but if you look closely you'll see smoother camera pans, provided your TV can actually display 24p. 30p will have what are called redundant frames (6 extra frames per second), which can be seen as stuttering during camera pans. But this is very, very trivial. Nothing to make any purchasing decisions on.
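A toy sketch of that cadence math (my own illustration, not something from the thread): repeating every fourth frame of a 24 fps film yields 30 fps, and the 6 duplicated frames per second are exactly the "redundant frames" described above.

```python
# Sketch: why 24 fps film shown at 30p stutters on pans.
# Repeating every 4th film frame turns 4 frames into 5 (24 -> 30 fps),
# so motion advances unevenly; the duplicates are the redundant frames.

def pulldown_24_to_30(frames):
    """Repeat every 4th source frame once: 4 frames in -> 5 frames out."""
    out = []
    for i, frame in enumerate(frames):
        out.append(frame)
        if i % 4 == 3:        # every 4th frame is held for two display slots
            out.append(frame)
    return out

film = list(range(24))            # one second of 24p film frames
video = pulldown_24_to_30(film)   # one second of 30p video
print(len(video))                 # 30
print(len(video) - len(set(video)))  # 6 redundant frames per second
```

An inverse telecine simply detects and discards those duplicates, recovering the original 24 unique frames.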
I don't know of any current TV models that actually deinterlace 1080i60 to 1080p24, but there are video processors and HTPCs that can. This can benefit TVs that accept 1080p24 signals (the HP DLP sets are the only ones I know of that can do this - but I don't follow it that closely so there may be more).
QUOTE(DeQuosaek @ Nov 13 2006, 05:35 AM)
I know this isn't directed at me, and correct me if I'm wrong, but even at 30 fps wouldn't a non-interlaced image look better in motion than an interlaced one? An interlaced display draws half of each frame on each pass, so in fast-moving pictures you may notice the separate lines, whereas a progressive display draws the entire image in one pass, avoiding that problem.
Now, obviously this example is not in 1080 resolution, but wouldn't the same principles apply?
Isn't this like comparing 480i to 480p? Don't we all agree that 480p looks better?
You are correct in theory, and it is the case with 480. But in reality, all 1080i signals are deinterlaced properly by the television itself. While analog CRTs display one field at a time, digital displays (LCD, DLP, LCoS, plasma) only display progressive frames, and they are all adept at deinterlacing 1080i properly, with computer monitors being the only exception.
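As a rough illustration (my sketch, not anything from the post): when the two fields come from the same source frame, "weave" deinterlacing interleaves them back into one progressive frame with no loss at all, which is why a properly deinterlaced 1080i film source gives up nothing to 1080p.

```python
# Sketch of "weave" deinterlacing: a top field (even lines) and a
# bottom field (odd lines) captured from the SAME source frame
# interleave back into the original progressive frame losslessly.

def weave(top_field, bottom_field):
    """Interleave even-line and odd-line fields into one frame."""
    frame = []
    for top, bottom in zip(top_field, bottom_field):
        frame.append(top)     # lines 0, 2, 4, ...
        frame.append(bottom)  # lines 1, 3, 5, ...
    return frame

frame = [f"line{i}" for i in range(8)]   # tiny stand-in for 1080 lines
top = frame[0::2]                        # even lines
bottom = frame[1::2]                     # odd lines
print(weave(top, bottom) == frame)       # True: perfect reconstruction
```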
QUOTE(-Spud- @ Nov 14 2006, 04:33 AM)
Lifter U DA MAN! Everyone OWNED!
If you have a native 720p display unit isn't the best possible picture going to come from a native 720p source? In other words is it pointless upscaling natively 720p games and video to 1080i even if your 720p display unit will let you see a 1080i image?
Yes, it is completely pointless. In fact, it's worse: in the case of the Xbox 360, its upconversion process turns 720p at 60 fps into 1080i at 30 frames per second. So you gain no benefit in resolution; you just cut the framerate in half and introduce minute scaling artifacts.
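The back-of-envelope arithmetic behind that (my illustration): an interlaced signal carries fields, not frames, so 1080i60 delivers only 30 complete frames per second, half the temporal resolution of a native 720p60 game.

```python
# Rough arithmetic: 1080i60 means 60 FIELDS per second, and two fields
# make one complete frame, so only 30 full frames reach the screen.
fields_per_second = 60                              # a 1080i60 signal
frames_per_second_1080i = fields_per_second // 2    # 30 complete frames
frames_per_second_720p = 60                         # native 720p game output

print(frames_per_second_1080i)                          # 30
print(frames_per_second_720p // frames_per_second_1080i)  # 2: framerate halved
```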
QUOTE(Caldor @ Nov 14 2006, 05:07 AM)
Thanks for that.
As I said, not all displays recognise the 2:2 cadence for restoring the original film content.
And film content is the point I want to stress. While I understand your point of view it does however assume that the content will be film based.
This is not the case for my country where much 1080 content is shot live for sporting events and the like using interlaced cameras.
With this type of footage, it is impossible to perfectly assemble all the temporal fields into full frames, and we arrive again at my original statement as to why progressive is best: it eliminates interlace artifacting.
mlmadmax - No, some countries have 1080i50 for example
-Spud- - the best method is to feed the display the native resolution of the display
I'm not going to pretend to know what's up with the UK. It's just something I've never dealt with. But if you are talking about deinterlacing film-based 1080i50, it's just a simple weave (1080i50 back to 1080p25), and as long as the source is digital (like an HDCam) then there shouldn't be any artifacts whatsoever from the deinterlace process.
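To illustrate the interlaced-camera case Caldor raises (my own toy example, not from the thread): when the two fields are captured at different instants, weaving them produces the "combing" artifact, because alternating lines show a moving object at two different positions.

```python
# Sketch: why fields from a true interlaced camera can't be perfectly
# woven. Each field samples a moving object at a DIFFERENT instant,
# so the woven frame "combs" on anything in motion.

def field_at(t, lines=8, speed=1):
    """Horizontal position of a moving edge on each captured line at time t."""
    return [speed * t for _ in range(lines)]

top = field_at(t=0)[0::2]      # even lines, sampled at the first instant
bottom = field_at(t=1)[1::2]   # odd lines, sampled one field-period later

woven = []
for a, b in zip(top, bottom):
    woven += [a, b]
print(woven)  # [0, 1, 0, 1, 0, 1, 0, 1]: alternating positions, the comb
```

With film (or progressive-capture) content both fields share one instant, so this never happens; that is the difference between Lifter's weave case and Caldor's live-sport case.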
Edited by Lifter, 15 November 2006 - 10:06 AM.