Secrets Q & A

High Definition 1080p TV: Why You Should Be Concerned


Flat Panel HDTVs and the Caveat No One Wants to Talk About

As we've said, 1080i60 is the highest resolution format offered by today's media. It's fairly intuitive to think that simply displaying it as such will maximize the format's potential. The trouble is, with the exception of dinosaur CRTs not yet cleared from inventories, you can't buy a TV today that is capable of displaying it!

CRT (aka "tube") TVs are the only display technology which actually "does interlaced". They are the only displays that can alternately refresh the odd and even lines of their face. In other words, they are the only devices that can display a "raw" 1080i60 signal.

Well, CRT is dead! We (and countless others) sounded the death knell of CRT years ago amidst protest and anguish, but now there is no denying it: 2007 is the year CRTs disappear and flat panels take over . . . permanently.

This is extremely relevant. Flat panel TVs (LCD, Plasma) and any other fixed pixel technology (such as DLP/LCD projectors) have a fixed display mode, their so-called "native resolution". That is, they can only display the actual resolution of their panel (1024 x 768, 1366 x 768, and 1920 x 1080 being just a few examples). Everything else must be scaled and/or processed to the native format of the device.
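To make "mapping to native resolution" concrete, here is a minimal sketch in Python with NumPy. The function name is illustrative, and nearest-neighbor resampling is used only to keep the sketch short; real displays use far more sophisticated filters:

```python
import numpy as np

def scale_to_native(frame: np.ndarray, native_h: int, native_w: int) -> np.ndarray:
    """Resample a (height, width) luma frame onto the panel's native pixel grid.

    Nearest-neighbor is shown purely for illustration; a real TV's scaler
    applies much better interpolation filters.
    """
    src_h, src_w = frame.shape
    rows = np.arange(native_h) * src_h // native_h   # source row for each panel row
    cols = np.arange(native_w) * src_w // native_w   # source column for each panel column
    return frame[rows][:, cols]

# e.g., a 1280 x 720 picture shown on a 1366 x 768 panel:
source = np.zeros((720, 1280), dtype=np.uint8)
on_panel = scale_to_native(source, 768, 1366)
assert on_panel.shape == (768, 1366)
```

No matter what you feed the panel, something like this step always runs at the end of the chain.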

More importantly, with the exception of some odd, early, and now discontinued plasma models, NO flat panel or fixed pixel display devices "do interlaced". That is, although you can feed them an interlaced signal like 1080i60, one way or another it has to be converted, or "de-interlaced", into a progressive stream, and then scaled or mapped to the device's native resolution, whatever that may be.

You have no choice. It's either going to happen in the TV itself, or in the disc player, or in a processor in between, but make no mistake: if you are watching 1080i60 on anything other than a CRT, it's being de-interlaced.

As with many things, there is a "right" and "wrong" way to do it.

The wrong way, which of course happens to be the cheap way from a processing and cost perspective, is to simply scale each 540-line field to the native resolution of the display. That means that whether your TV is 720 lines, 768 lines, 1024 lines, or even one of the "Full HD" 1080-line models, if it de-interlaces this way, you are only seeing a picture which is 540 lines strong. Not what you paid for, is it?
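In code terms, the cheap approach looks something like this sketch (Python/NumPy; the line-doubling shown here is the simplest possible version of field-by-field de-interlacing, and the function name is our own):

```python
import numpy as np

def deinterlace_cheap(field: np.ndarray) -> np.ndarray:
    """Stretch a single 540-line field to 1080 lines by doubling each line.

    The output frame has 1080 lines, but only 540 lines' worth of real
    picture detail -- the other half is manufactured by repetition.
    """
    return np.repeat(field, 2, axis=0)

one_field = np.zeros((540, 1920), dtype=np.uint8)   # half of a 1080i frame
frame = deinterlace_cheap(one_field)
assert frame.shape == (1080, 1920)                  # full size, half the detail
```

Every output frame is full size, but half of it never came from the source.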

Remember how we said that if the picture is not moving, the two fields sum together to form a complete 1920 x 1080 picture?

The right way to process 1080i is to de-interlace it to 1080p (regardless of the TV's native resolution) using motion adaptive de-interlacing. This is a process which involves detecting which areas of the picture are moving and which ones are not, then combining fields in the non-moving areas while interpolating the moving ones (filling in the spaces between the alternating lines with average, in-between values). If you have a 1080p display (one which actually displays 1080p without cropping and re-scaling), you're done, because the result is a 1080p signal. If you have a TV of any other resolution, it's then just a matter of scaling the 1080p signal to whatever the native resolution of the device is.
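Here is a toy sketch of that idea, again in Python/NumPy. It assumes 8-bit luma fields and detects motion with a crude per-pixel difference against the previous field of the same parity; the names and threshold are illustrative, not any particular chip's algorithm:

```python
import numpy as np

def deinterlace_motion_adaptive(field_even: np.ndarray,
                                field_odd: np.ndarray,
                                prev_field_odd: np.ndarray,
                                threshold: int = 8) -> np.ndarray:
    """Build a 1080-line frame from two 540-line fields.

    Static pixels are "woven" (both fields kept intact); moving pixels
    on the odd lines are interpolated from the even lines above and below.
    """
    h, w = field_odd.shape
    frame = np.empty((2 * h, w), dtype=np.float64)
    frame[0::2] = field_even                     # even lines: always real pixels

    above = field_even.astype(np.float64)        # even line above each odd line
    below = np.vstack([above[1:], above[-1:]])   # even line below (edge clamped)
    interpolated = (above + below) / 2.0         # average, in-between values

    # Crude motion test: did this pixel change since the last odd field?
    moving = np.abs(field_odd.astype(int) - prev_field_odd.astype(int)) > threshold
    frame[1::2] = np.where(moving, interpolated, field_odd.astype(np.float64))
    return frame.astype(field_odd.dtype)
```

In a completely still scene, every pixel fails the motion test and the two fields weave back into the full 1920 x 1080 picture, exactly as described earlier; only the pixels that actually moved fall back to interpolation.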

So even though you might only have a 720-line device, that device needs to be able to handle 1080p (at least internally, after performing de-interlacing) in order to maximize its potential when viewing a 1080i source.
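Putting the two steps in the right order is the whole point. A hedged sketch of the pipeline a 1366 x 768 panel needs for a 1080i source (all names are ours, and weaving is shown for the static case only):

```python
import numpy as np

def weave(field_even: np.ndarray, field_odd: np.ndarray) -> np.ndarray:
    """Interleave two 540-line fields into one 1080-line frame (static case)."""
    frame = np.empty((1080, 1920), dtype=field_even.dtype)
    frame[0::2] = field_even
    frame[1::2] = field_odd
    return frame

def scale(frame: np.ndarray, h: int, w: int) -> np.ndarray:
    """Nearest-neighbor resample, for illustration only."""
    rows = np.arange(h) * frame.shape[0] // h
    cols = np.arange(w) * frame.shape[1] // w
    return frame[rows][:, cols]

even = np.zeros((540, 1920), dtype=np.uint8)
odd = np.zeros((540, 1920), dtype=np.uint8)

# Right order: 1080i -> 1080p -> 768p. The scaler sees all 1080 lines.
good = scale(weave(even, odd), 768, 1366)

# Wrong order: scale a lone 540-line field straight to the panel. Half
# the source detail is gone before the scaler even starts.
bad = scale(even, 768, 1366)
```

The de-interlacer must produce a full 1080p frame before the scaler runs, even though the panel itself has fewer lines.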

Bet the salesperson didn't mention that when he sold you that shiny new TV, did he?

Let's look at some illustrations:

If this were a scene shot at 1080i, and displayed at 1080i, it would look like this. But today's digital TVs cannot do this. The signal must be de-interlaced.
[Illustration: 1080i vs. 1080p]
If we de-interlace it the WRONG way, it would look like this.

The entire scene is reduced to 540 lines worth of resolution. Hint: look at the hands.

If you display this on a 1366x768 TV (a common resolution right now), you will be wasting 1/3 of the resolution you paid for!
[Illustration: 1080i vs. 1080p]
If we de-interlace it the RIGHT way though, to 1080p, it would look like this.

Only the areas in motion are reduced in detail. The rest remains at the full 1080 line resolution.

Though you need a full 1920 x 1080 TV to maximize the detail present, on a lesser TV, say a 1366 x 768 model, you will still realize the device's full potential.
[Illustration: 1080i vs. 1080p]


Still wonder if you should care about 1080p?