Technical & Editorial
- Written by Brian Florian and Colin Miller
- Published on 30 November 2007
- High Definition 1080p TV: Why You Should Be Concerned
- Page 2: Flat Panel HDTVs and the Caveat No One Wants to Talk About
- Page 3: Native 1080p Material - A Hidden Reality
- Page 4: But, Do You Really Need 1080p?
- Page 5: Are 1080p Scaling Artifacts Worth Worrying About?
- Page 6: Conclusions About 1080p
1080p, or 1080 progressive, is a very high resolution video format and screen specification. It is one of the ATSC HDTV specified formats, which include 720p, 1080i, and 1080p. If you are even casually interested in Home Theater, you have no doubt heard the term 1080p, and if so, you most likely have been misinformed about it. Common misconceptions include that there is no media to carry it, that you need an enormous screen to benefit from it, and that on the whole you just shouldn't care about it. Why the industry has persisted in the charade is beyond the scope of this piece, but suffice it to say, if you don't care about 1080p now, you will.
1080p is here, it is now, and has been for quite some time!
In order to understand 1080p, you first need a solid understanding of 1080i (1080 interlaced). Please bear with us, don't cut to the chase, and keep on reading. Trust us, it'll be worth it.
1080i vs. 1080p: It's all a matter of time.
1080i is the highest resolution format of the HDTV ATSC specification as well as the recently launched HD DVD and Blu-ray media. 1080p is often quoted as being a higher resolution than 1080i, and though from a certain point of view (which we will touch on) that's true, in the broad context it is not (1).
In a very real way, 1080i and 1080p are the same resolution in that both consist of a 1920 x 1080 raster. That is, the picture is comprised of 1080 separate horizontal 'lines', with 1920 samples per line (or pixels per line, depending on your point of view). In other words, both 1080i and 1080p represent an image with 1920 x 1080 unique points of data in space.
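As a quick arithmetic check, that raster works out to just over two million points of data per frame. A minimal sketch (the 720p comparison is ours, for context):

```python
# Both 1080i and 1080p describe the same 1920 x 1080 raster:
# 1080 horizontal lines, 1920 samples (pixels) per line.
WIDTH, HEIGHT = 1920, 1080

pixels_per_frame = WIDTH * HEIGHT
print(f"{pixels_per_frame:,}")  # 2,073,600 unique points in space

# For comparison, the other HD raster in the ATSC spec:
print(f"{1280 * 720:,}")        # 921,600 pixels for 720p
```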
The difference between 'i' and 'p' can only be appreciated in the time domain.
In a "true" or "native" 1080i HDTV system, the temporal resolution is 60 Hz. The image is sampled, or updated if you prefer, every 1/60 of a second. As with any interlaced format though, only half the available lines are sampled, or updated, every 1/60 of a second. The capture device (say, a video camera) does not sample the entire 1920 x 1080 at one time. Rather, it samples fields. A single field consists of every other line out of the complete picture. So we have the "odds" field which has lines 1, 3, 5, 7, etc and the "evens" field which has lines 2, 4, 6, 8, etc.
So, in an interlaced system, the camera samples one field (say the "odds"), then 1/60 of a second later, it samples the opposite field (the "evens"), then 1/60 of a second later it refreshes the odds, then 1/60 of a second later the evens, and so on. The alternating set of fields of a 1080i source each make up half the image.
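The odds/evens field split described above can be sketched in a few lines of Python; a toy 8-line "frame" stands in for the full 1080-line raster:

```python
# Toy frame: 8 numbered lines standing in for the 1080-line raster.
frame = [f"line {n}" for n in range(1, 9)]

# An interlaced capture samples alternating lines as two fields.
odds  = frame[0::2]   # lines 1, 3, 5, 7 (the "odds" field)
evens = frame[1::2]   # lines 2, 4, 6, 8 (the "evens" field)

# If nothing moves between field captures, weaving the two fields
# back together reconstructs the complete, continuous frame.
woven = [None] * len(frame)
woven[0::2] = odds
woven[1::2] = evens
assert woven == frame
```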
The shorthand for this format is 1080i60.
The subject being captured is updated every 1/60 of a second, but only half the lines are used for each update. This has one benefit and many drawbacks.
The one virtue of this format is its high subject refresh rate: Think of a sporting event where the ball is traveling fast. We get an update on its position every 1/60 of a second. That's really good compared to film's 24 Hz refresh rate (even IMAX HD is only 48 Hz).
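To put numbers on that: the faster the refresh, the less a moving subject travels between updates. A quick sketch (the 30 m/s ball speed is a hypothetical figure of ours, chosen only for illustration):

```python
# How far does a fast-moving subject travel between updates?
# 30 m/s (~108 km/h) is a hypothetical ball speed for illustration.
speed_m_per_s = 30.0

for name, rate_hz in [("1080i60 video", 60), ("IMAX HD", 48), ("film", 24)]:
    travel_m = speed_m_per_s / rate_hz
    print(f"{name}: {travel_m * 100:.1f} cm between updates")
# 1080i60's 60 Hz update leaves the smallest gap in the motion.
```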
The downside of an interlaced format is that the alternating fields only truly complement each other if the subject is stationary. If it is, then the alternating fields "sum" to form a complete and continuous 1920 x 1080 picture (everything lines up perfectly between the two fields). If the subject moves, though, it will be in one position for one field and another position for the next. The interlaced fields no longer complement one another, and artifacts such as jaggies, line twitter, and other visual aberrations are a normal side effect of the interlaced format.
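The mismatch is easy to demonstrate: move the subject between the two field captures, then weave the fields together. This toy sketch (not a real deinterlacer) draws a vertical bar that shifts two columns between fields:

```python
def capture_field(subject_col, lines, parity, width=8):
    # Sample every other line of a frame containing a vertical bar
    # ("X") at subject_col; parity 0 = odds field, 1 = evens field.
    rows = []
    for y in range(parity, lines, 2):
        row = ["."] * width
        row[subject_col] = "X"
        rows.append("".join(row))
    return rows

def weave(odds, evens):
    frame = []
    for o, e in zip(odds, evens):
        frame += [o, e]
    return frame

LINES = 6
# Moving subject: the bar shifts from column 2 to column 4
# between the odds capture and the evens capture 1/60 s later.
odds_moving  = capture_field(2, LINES, 0)
evens_moving = capture_field(4, LINES, 1)

for row in weave(odds_moving, evens_moving):
    print(row)
# The "X" zig-zags between alternate lines: the comb/jaggies
# artifact of weaving fields captured at different moments.
```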
What does all this have to do with 1080p?
1080p differs from 1080i in that the entire 1920 x 1080 raster (all 1,080 lines) is sampled and/or displayed at one time. No fields. Just full, 1920 x 1080 frames. No combing. No line twitter. Just perfect pictures. But if our HDTV system does not incorporate 1080p, how does it become relevant at all?
We're going to show you.
First we will explain how and why 1080i must be processed, as well as possible, into 1080p in order to maximize the potential of today's digital displays, including LCD and Plasma flat panel TVs as well as LCD, DLP, and other projection systems.
Then we'll explain how 1080p material is already here; many of you have some in your home right now and don't even know it!
(1) The TV term "high definition" goes back to the 1930s, when anything higher than 30 scanning lines qualified. So, even the ATSC specification of 720p and 1080i is somewhere around the fourth step into the better world of increasing amounts of video picture information. Commercial 3,840 x 2,160 projectors are already becoming available at the cineplex, and we will probably have such displays at home within three to five years, to watch TV programs and movies in "higher definition".