Is there really a 1080p advantage?

32 replies to this topic

#31 -Spud-


    X-S Member

  • Members
  • Pip
  • 106 posts
  • Location:Melbourne, Australia
  • Interests:God, Games, Music, Movies
  • Xbox Version:v1.0

Posted 14 November 2006 - 08:15 PM

QUOTE(Caldor @ Nov 14 2006, 02:07 PM) View Post

-Spud- - the best method is to feed the display the native resolution of the display

Cheers Caldor. I presumed as much.

#32 Lifter


    X-S Member

  • Members
  • Pip
  • 70 posts

Posted 15 November 2006 - 10:04 AM

QUOTE(mlmadmax @ Nov 12 2006, 06:35 PM) View Post

To lifter,

Doesn't only 1080p60 have a positive for video games? 1080p30, which is what is out right now, doesn't have any advantage over 1080i even for video games, because they are both capped at 30 frames, right?

In real-world terms, no. There will not be any difference when you view it on your TV set. In fact, for movies a 1080i source can actually be better on some TV sets. Some TV sets and video processors are able to do what's called an inverse telecine: turning 1080i60 back into 1080p24. Although the difference is very subtle, this is technically better than viewing a 24fps film at 1080p30. Hardly a difference, but if you look closely you'll see smoother camera pans if your TV can actually display 24p. 30p will have what's called redundant frames (6 extra frames per second), which can be seen as stuttering during camera pans. But this is very, very trivial. Nothing to make any purchasing decisions on.

I don't know of any current TV models that actually deinterlace 1080i60 to 1080p24, but there are video processors and HTPCs that can. This can benefit TVs that accept 1080p24 signals (the HP DLP sets are the only ones I know of that can do this - but I don't follow it that closely so there may be more).
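To put some numbers on the redundant-frames point above, here's a minimal Python sketch. It is not any real TV's pulldown algorithm, just the simplest possible 24-to-30 mapping (repeat every 4th film frame), which is enough to show where the 6 duplicated frames per second come from:

```python
def film_to_30p(film_frames):
    """Pad 24 film frames out to 30 output frames by repeating
    every 4th frame. A toy model of 24fps -> 30p conversion."""
    out = []
    for i, frame in enumerate(film_frames):
        out.append(frame)
        if i % 4 == 3:          # after every 4th film frame...
            out.append(frame)   # ...hold it for one extra output frame
    return out

one_second = list(range(24))        # 24 unique film frames
out = film_to_30p(one_second)
print(len(out))                     # 30 output frames
dups = sum(1 for a, b in zip(out, out[1:]) if a == b)
print(dups)                         # 6 redundant frames per second
```

Those 6 held frames are what reads as stutter on a slow camera pan; a display that accepts 1080p24 never has to insert them.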

QUOTE(DeQuosaek @ Nov 13 2006, 05:35 AM) View Post

I know this isn't directed at me, and correct me if I'm wrong, but even at 30 fps wouldn't a non-interlaced image look better in motion than an interlaced image? An interlaced display draws half of each frame each pass so that in fast moving pictures you may notice the separate lines whereas a progressive display draws the entire image one time per pass avoiding that problem.

For example:
[Example images comparing an interlaced and a progressive frame in motion]

Now, obviously this example is not in 1080 resolution, but wouldn't the same principles apply?

Isn't this like comparing 480i to 480p? Don't we all agree that 480p looks better?

You are correct in theory, and it is the case with 480. But in reality, all 1080i signals are deinterlaced properly by the television itself. While analog CRTs display one field at a time, digital displays (LCD, DLP, LCoS, plasma) only display progressive frames, and they are all capable of deinterlacing 1080i properly, with computer monitors being the only exception.

QUOTE(-Spud- @ Nov 14 2006, 04:33 AM) View Post

Lifter U DA MAN! Everyone OWNED! tongue.gif

If you have a native 720p display unit, isn't the best possible picture going to come from a native 720p source? In other words, is it pointless upscaling natively 720p games and video to 1080i, even if your 720p display unit will let you see a 1080i image?

Yes, it is completely pointless. In fact it's worse. In the case of the Xbox 360, its upconversion process turns 720p at 60fps into 1080 at 30fps. So you gain no benefit in resolution. You just cut the framerate in half and introduce minute scaling artifacts.

QUOTE(Caldor @ Nov 14 2006, 05:07 AM) View Post

Thanks for that.

As I said, not all displays recognise the 2:2 cadence for restoring the original film content.

And film content is the point I want to stress. While I understand your point of view it does however assume that the content will be film based.

This is not the case for my country where much 1080 content is shot live for sporting events and the like using interlaced cameras.

With this type of footage, it is impossible to perfectly assemble all the temporal fields into full frames, and we arrive again at my original statement as to why progressive is best: because it eliminates interlace artifacts.

mlmadmax - No, some countries have 1080i50 for example

-Spud- - the best method is to feed the display the native resolution of the display

I'm not going to pretend to know what's up with the UK. It's just something I've never dealt with. But if you are talking about 1080i50 to 1080p50, it's just a simple weave, and as long as the source is digital (like an HDCAM) then there shouldn't be any artifacts whatsoever from the deinterlace process.

Edited by Lifter, 15 November 2006 - 10:06 AM.

#33 mlmadmax


    X-S Genius

  • Members
  • PipPipPipPip
  • 870 posts
  • Location:California
  • Xbox Version:v1.4
  • 360 version:v2 (zephyr)

Posted 15 November 2006 - 09:45 PM

Thanks for the info, Lifter. I have been trying to nail down this whole technology for a while; when you get into some of the more technical aspects, it can be pretty confusing, to say the least.

So, in a nutshell, with an LCD, plasma, DLP, or LCoS, there really is no difference with 1080p for a movie.

But with games there can be differences due to framerate constraints/issues.
