Is there really a 1080p advantage?


32 replies to this topic

#16 unclepauly

unclepauly

    X-S Senior Member

  • Members
  • PipPip
  • 212 posts
  • Location:Toledo
  • Xbox Version:unk
  • 360 version:v1 (xenon)

Posted 11 November 2006 - 03:54 AM

1080p IS the holy grail of HDTV. It's basically the most perfect image you are going to get for the next 10 years (maybe less). I agree it is not a gigantic leap over 1080i, but over 720p? It's 2.25x the pixels... The more you can see, the better. I think this study is just trying to do some damage control for somebody.
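The pixel math behind that claim is easy to check with a quick sketch (purely illustrative; the resolutions are the standard 720p and 1080p rasters):

```python
# Compare the raster sizes of 720p and 1080p/1080i.
# 1080-line video is actually 2.25x the pixels of 720p per frame.
def pixel_count(width, height):
    return width * height

p720 = pixel_count(1280, 720)    # 921,600 pixels
p1080 = pixel_count(1920, 1080)  # 2,073,600 pixels

print(p1080 / p720)  # 2.25
```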

#17 mc_365

mc_365

    X-S Expert

  • Members
  • PipPipPip
  • 739 posts

Posted 11 November 2006 - 04:48 AM

This is so funny. I seem to remember everyone and their mother just two years ago saying it made no sense to purchase an HDTV that was not progressive; now we're saying 1080i is better than 720p and almost as good as 1080p.

Well, my experience is that 1080i on a CRT, properly dialed in with a 1080i signal from true HD source material, is the best picture you can get presently.

The flat panels may have higher resolutions and be progressive, but they just can't produce the blacks and/or the rich, smooth colors that you get with the tube.

That being said, I've seen a lot of sets because I go to J&R Music like twice a week (it's on my way from work), and the best image on a flat panel I have observed is the Panasonic plasmas at 720p. I think people like LCDs because of the brightness. The screens are so bright that when people go to the showroom the images look vibrant and sharp, but there is no depth, flesh tones look like shit, and fast movements look like the TV signal got scrambled.

I'll probably get a plasma because I could use the extra space, but I love the picture on my 7-year-old 36" Sony Trinitron XBR HD-ready set.

In my seven years of watching HDTV, both OTA and Dish, I think I know what a good image is, and tubes are still the benchmark.

#18 Lifter

Lifter

    X-S Member

  • Members
  • Pip
  • 70 posts

Posted 11 November 2006 - 05:42 AM

Get a clue people. This article has absolutely no reflection on video games and I have no idea why it was posted on the Xbox-scene news. It's totally misleading if you apply those findings to video games.

Mission Impossible 3, just like every other movie ever made, was shot at 24 frames per second. Without getting too technical, what that means is that whether you play the 1080i version or the 1080p version - it makes no difference whatsoever. None. Zip. Nada. Just like the article says. Play it back at 1080i, and your digital 1080 TV set is just converting it to 1080p at 30fps. Since the movie is only 24fps, you lose absolutely nothing. Full resolution. Full framerate. All from a 1080i source signal. A 1080p source of MI3 gives no advantage.
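The 24fps-into-1080i point above comes down to 3:2 pulldown. A toy sketch of the cadence (illustrative only; real pulldown alternates odd/even fields, here each list entry just stands for one field):

```python
# 3:2 pulldown: 24 film frames/sec become 60 interlaced fields/sec.
# Each film frame is held for 3 fields, then the next for 2, alternating,
# so every original frame survives intact inside the 1080i stream.
def three_two_pulldown(frames):
    fields = []
    for i, frame in enumerate(frames):
        hold = 3 if i % 2 == 0 else 2  # alternate 3-field / 2-field holds
        fields.extend([frame] * hold)
    return fields

film = ["A", "B", "C", "D"]        # 4 film frames...
fields = three_two_pulldown(film)  # ...become 10 fields (4 x 2.5)
print(len(fields))                 # 10, i.e. 24fps x 2.5 = 60 fields/sec
```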

What the article does not say is that their test is totally irrelevant when it comes to gaming. Games are not limited to 24fps like movies are. They can be 60fps, and in many cases they are. And if the game can also render 1920x1080 at 60fps (or 60Hz), like PS3 games for example, then the only way to take advantage of the full 1080p60 is to have a 1080p playback device and a 1080p display that actually accepts 1080p signals.

Current Xbox 360 games probably cannot take advantage of this because they were not designed to output 1080p at 60 frames per second, only 30fps. As any PC gamer would know, the 360 would need to double its rendering power in order to double the framerate at the same resolution. So any hypothetical Xbox 360 game that can do a true 1080p60 is cutting corners elsewhere as far as graphics are concerned.
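The "double the rendering power" point is just fill-rate arithmetic, sketched below (a back-of-envelope illustration, not a real GPU model):

```python
# Pixel throughput needed at a given resolution and framerate.
# 1080p60 demands exactly twice the fill rate of 1080p30.
def pixels_per_second(width, height, fps):
    return width * height * fps

p30 = pixels_per_second(1920, 1080, 30)  # ~62.2 million pixels/sec
p60 = pixels_per_second(1920, 1080, 60)  # ~124.4 million pixels/sec

print(p60 / p30)  # 2.0
```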

So once again, an article testing movies on Blu-ray has absolutely nothing to do with video games.

#19 Caldor

Caldor

    X-S Member

  • Members
  • Pip
  • 130 posts

Posted 11 November 2006 - 06:45 AM

QUOTE(Lifter @ Nov 11 2006, 02:49 PM) View Post

Get a clue people.....snip......what that means is that whether you play the 1080i version or the 1080p version - it makes no difference whatsoever. None. Zip. Nada. Just like the article says. Play it back at 1080i, and your digital 1080 TV set is just converting it to 1080p at 30fps. Since the movie is only 24fps, you lose absolutely nothing. Full resolution. Full framerate. All from a 1080i source signal. A 1080p source of MI3 gives no advantage.


In fact, it is you who needs to get a clue. Too many of you people are not professionals in these fields, yet you try to talk like you are. You have a superficial understanding of a complex field.

If you had studied this field instead of shooting off your mouth about things you don't understand, you would realise that it is impossible to perfectly convert interlaced fields to full progressive frames. Even the best consumer-level US$3000 video processors still leave interlace artifacts in certain situations.

Not having to convert interlaced to progressive from the original 1080p24 source is a distinct advantage because it eliminates any interlace artifacting.

Amateur. muhaha.gif


QUOTE(unclepauly @ Nov 11 2006, 01:01 PM) View Post

1080p IS the holy grail of HDTV. It's basically the most perfect image you are going to get for the next 10 years (maybe less). I agree it is not a gigantic leap over 1080i, but over 720p? It's 2.25x the pixels... The more you can see, the better. I think this study is just trying to do some damage control for somebody.


This is true, but contrast and blacks have improved in later generations of LCoS / LCD / plasma.

SED and Laser will take over eventually.

CRTs aren't practical at 60" screen sizes, which for most viewing distances is what's ideal for 1080 sources. Think of the THX recommended viewing angle.

#20 Burgleflickle

Burgleflickle

    X-S Young Member

  • Members
  • Pip
  • 41 posts
  • Location:MN
  • Xbox Version:unk

Posted 11 November 2006 - 09:34 AM

All I'm hearing is waaah. Grow up.

To Admins:
The credibility of this site has gone down considerably in the past few months, and it's because of some of the speculative, vague crap on these news feeds. With all due respect, if you have reliable news, post it; if you don't have a newsworthy story, let it go -- otherwise you'll have these crybabies deterring loyal members, flaming, and crapping up good forums that have helped thousands. I respect and enjoy this site, and I hope you keep helping newbs and those entrenched in the mod community.

To those holding bs flags:
If you don't know, don't post acting like you do, because you sound like a moron.

To those who flame:
It's intolerant morons like you who don't have the decency to keep things lighthearted, and you ruin it for everyone.

I realize the hypocrisy of my flame, but I can only take three months of people angrily arguing instead of debating points like ADULTS.

#21 Lifter

Lifter

    X-S Member

  • Members
  • Pip
  • 70 posts

Posted 11 November 2006 - 12:49 PM

QUOTE(Caldor @ Nov 11 2006, 06:52 AM) View Post

In fact, it is you who needs to get a clue. Too many of you people are not professionals in these fields, yet you try to talk like you are. You have a superficial understanding of a complex field.

If you had studied this field instead of shooting off your mouth about things you don't understand, you would realise that it is impossible to perfectly convert interlaced fields to full progressive frames. Even the best consumer-level US$3000 video processors still leave interlace artifacts in certain situations.

Not having to convert interlaced to progressive from the original 1080p24 source is a distinct advantage because it eliminates any interlace artifacting.

Amateur. muhaha.gif
This is true, but contrast and blacks have improved in later generations of LCoS / LCD / plasma.

SED and Laser will take over eventually.

CRTs aren't practical at 60" screen sizes, which for most viewing distances is what's ideal for 1080 sources. Think of the THX recommended viewing angle.



I'm a broadcast engineer at a post production facility in Santa Monica, CA. I'll be at NAB next April if you'd like to meet. Just last night I helped an editor do the pre-color-grade finishing for an independent feature film. It was shot on Varicam and edited natively on Final Cut Pro. We had to master it to D5 1080Psf 23.98 using nothing but FCP and it worked out great.

I'm also an AVS Special Member, and I've had experience installing and engineering the NEC XG series CRT front projectors. I helped develop a solution for using RT effects with a motion JPEG codec on Final Cut Pro. I have experience working on a Xantus by Teranex. A real one. I've also been very successful at developing methods for doing software upconversions in-house without having to go somewhere that has a Teranex. My credits are on IMDb as an editor and as a post-production supervisor.

So what is it you do for a living again?

Look, I understand why you think what you do. What you say is true for most standard-definition interlaced footage. Not so with 1080i content. The reality is that any and every 1080i master is made with proper 3:2 cadence, and since it's digital, the re-interlacing is, for lack of a better word, perfect. Thus, any consumer device such as a Blu-ray player or a 1080p television set has the capability to deinterlace it 100% perfectly, without any artifacts whatsoever.

Hey, but don't take my word for it. That article, even though I thought it was misleading, proves my point exactly. As do many others. The reason you think deinterlacing always causes artifacts is that you are taking known facts from the standard-definition world and applying them to HD. It doesn't work that way. HD masters are always mastered digitally and with correct field cadence, so you don't get any artifacts whatsoever. The cheapest, crappiest deinterlacing chip (like the ones in the Westinghouse sets) can handle 1080i content flawlessly.

Only computer monitors without any deinterlacing capability will have problems, and computer monitors don't support HDCP, so it's a moot point. Any non-CRT device that calls itself a television and can accept 1080i will remove the 3:2 pulldown perfectly from any 1080i HD source. A 1080p source is useful only for games.

Edited by Lifter, 11 November 2006 - 12:51 PM.
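The lossless-deinterlacing argument can be sketched as inverse telecine: if the 3:2 cadence is known and the fields are digital copies, the original frames re-pair exactly (a toy illustration, with each list entry standing for one field):

```python
# Inverse telecine on known 3:2-cadence material: since every field in a
# hold is a copy from one film frame, the original 24fps progressive
# frames can be recovered with no loss, i.e. "perfect" deinterlacing.
def inverse_telecine(fields):
    frames, i, hold = [], 0, 3
    while i < len(fields):
        frames.append(fields[i])        # one representative per hold
        i += hold                       # skip the rest of this hold
        hold = 2 if hold == 3 else 3    # cadence alternates 3,2,3,2...
    return frames

fields = ["A", "A", "A", "B", "B", "C", "C", "C", "D", "D"]
print(inverse_telecine(fields))  # ['A', 'B', 'C', 'D'] -- nothing lost
```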


#22 Caldor

Caldor

    X-S Member

  • Members
  • Pip
  • 130 posts

Posted 11 November 2006 - 12:59 PM

Ok, fair enough. I jumped the gun and for that I apologise. I'm sorry, but I think if you had explained your reasoning like the thoughts you shared in your second post, that might have prevented me reading it the way I did.

However, I don't live in a country that uses 3:2 - I'm on 2:2, and not all displays are good at that. Some don't detect any cadence other than 3:2. I do understand your point, but I'm not American and we're not all on those standards.

#23 Lifter

Lifter

    X-S Member

  • Members
  • Pip
  • 70 posts

Posted 11 November 2006 - 01:25 PM

No worries. But I'd be very surprised if HD sets in the UK don't have the same capability. I understand it's a problem with DVDs, but it shouldn't be with any HD disc or broadcast.

#24 mlmadmax

mlmadmax

    X-S Genius

  • Members
  • PipPipPipPip
  • 870 posts
  • Location:California
  • Xbox Version:v1.4
  • 360 version:v2 (zephyr)

Posted 12 November 2006 - 06:28 PM

To lifter,

Doesn't only 1080p60 offer an advantage for video games? Wouldn't 1080p30, which is what's out right now, have no advantage over 1080i even for video games, because they are both capped at 30 frames?



#25 DeQuosaek

DeQuosaek

    X-S Young Member

  • Members
  • Pip
  • 50 posts
  • Xbox Version:unk
  • 360 version:unknown

Posted 13 November 2006 - 05:28 AM

QUOTE(mlmadmax @ Nov 12 2006, 10:35 AM) View Post

To lifter,

Doesn't only 1080p60 offer an advantage for video games? Wouldn't 1080p30, which is what's out right now, have no advantage over 1080i even for video games, because they are both capped at 30 frames?

I know this isn't directed at me, and correct me if I'm wrong, but even at 30 fps wouldn't a non-interlaced image look better in motion than an interlaced one? An interlaced display draws half of each frame on each pass, so in fast-moving pictures you may notice the separate lines, whereas a progressive display draws the entire image in one pass, avoiding that problem.

For example: [interlaced vs. progressive comparison screenshots, not preserved]

Now, obviously this example is not in 1080 resolution, but wouldn't the same principles apply?

Isn't this like comparing 480i to 480p? Don't we all agree that 480p looks better?
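The half-the-frame-per-pass behavior described above can be modeled as splitting a frame into two fields of alternating scanlines (a toy sketch, nothing like real display hardware; with motion between field captures, re-pairing them is what produces visible combing):

```python
# Toy model of interlacing: each pass carries only alternating scanlines.
def split_fields(frame):
    even = frame[0::2]  # even scanlines (field 1)
    odd = frame[1::2]   # odd scanlines (field 2)
    return even, odd

# Weaving re-pairs the two fields into a full frame. For static content
# this is lossless; if the scene moved between fields, lines mismatch.
def weave(even, odd):
    frame = []
    for e, o in zip(even, odd):
        frame += [e, o]
    return frame

still = ["line0", "line1", "line2", "line3"]
even, odd = split_fields(still)
print(weave(even, odd) == still)  # True: static content weaves cleanly
```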


#26 zombie4rave

zombie4rave

    X-S Senior Member

  • Members
  • PipPip
  • 253 posts
  • Xbox Version:v1.0
  • 360 version:v1 (xenon)

Posted 13 November 2006 - 04:05 PM

The biggest advantage of 1080p right now that I see, upscaled or not, is that you can match the 360 to your TV's native resolution. I've got a Samsung 1080p DLP rear-projection set. If I can send it a 1080p signal, then I minimize game lag.

#27 mlmadmax

mlmadmax

    X-S Genius

  • Members
  • PipPipPipPip
  • 870 posts
  • Location:California
  • Xbox Version:v1.4
  • 360 version:v2 (zephyr)

Posted 13 November 2006 - 04:40 PM

I think 480p can go up to 60 frames a second, where 1080i and 1080p30 can only go 30 frames a second. This means for film they are exactly the same in some sense, but I don't know all the details as to why and would like clarification.
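The comparison being drawn above, in rough numbers (illustrative only, and assuming progressive-source content; a native interlaced camera does capture 60 distinct temporal samples per second in 1080i60):

```python
# Full frames per second delivered by common modes, for a progressive
# source: 1080i60 carries 60 fields/sec but only 30 woven frames/sec.
modes = {
    "480p60":  60,
    "1080i60": 30,  # 60 fields/sec -> 30 complete frames/sec
    "1080p30": 30,
    "1080p60": 60,
}

print(modes["1080i60"] == modes["1080p30"])  # True: same frame cap
```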

#28 -Spud-

-Spud-

    X-S Member

  • Members
  • Pip
  • 106 posts
  • Location:Melbourne, Australia
  • Interests:God, Games, Music, Movies
  • Xbox Version:v1.0

Posted 14 November 2006 - 04:26 AM

Lifter U DA MAN! Everyone OWNED! tongue.gif

If you have a native 720p display unit, isn't the best possible picture going to come from a native 720p source? In other words, is it pointless upscaling natively-720p games and video to 1080i, even if your 720p display will let you see a 1080i image?


#29 Caldor

Caldor

    X-S Member

  • Members
  • Pip
  • 130 posts

Posted 14 November 2006 - 05:00 AM

QUOTE(Lifter @ Nov 11 2006, 10:32 PM) View Post

No worries. But I'd be very surprised if HD sets in the UK don't have the same capability. I understand it's a problem with DVDs, but it shouldn't be with any HD disc or broadcast.


Thanks for that.

As I said, not all displays recognise the 2:2 cadence for restoring the original film content.

And film content is the point I want to stress. While I understand your point of view, it assumes that the content will be film-based.

This is not the case for my country where much 1080 content is shot live for sporting events and the like using interlaced cameras.

With this type of footage, it is impossible to perfectly assemble all the temporal fields into full frames, and we arrive again at my original statement as to why progressive is best - it eliminates interlace artifacting.

mlmadmax - No; some countries have 1080i50, for example.

-Spud- - the best method is to feed the display its own native resolution.


#30 CHRONIC 5000

CHRONIC 5000

    X-S Young Member

  • Members
  • Pip
  • 37 posts
  • Location:GA
  • Xbox Version:v1.0
  • 360 version:v1 (xenon)

Posted 14 November 2006 - 08:52 AM

Screw 1080p... I'm gonna go Ultra High Definition: 7,680 x 4,320 pixels biggrin.gif biggrin.gif

linkage----- http://en.wikipedia....efinition_Video

On a serious note. I won't go to 1080p, no matter how much better it is (or isn't???), until the sets get cheaper...much cheaper...and since I don't see that happening for a while, I'll just stick with 720p/1080i.



