Is it possible to 'up convert' 720p to 1080i? - AVS Forum


johngi
12:06 AM Liked: 10
post #1 of 16
08-24-2001 | Posts: 3
Joined: Aug 2001
Hi, this is my first post.

I've ordered the Panasonic CT-34WX50 which supports 480p and 1080i. I'm concerned that for 720p content the TV will down convert to 480p. However, I've read (can't remember where) that this TV will up convert 720p to 1080i. Is this even possible? If so, does anyone know for sure if this TV supports this?

------------------
Thanks,
-john

kschonha
06:08 AM Liked: 10
post #2 of 16
08-24-2001 | Posts: 179
Joined: Mar 2001
My Toshiba DST3000/TW40X81 setup converts 720p and 480p content to 1080i. 1080i content is passed through in its native format. I believe that most of the STBs on the market function that way.
rodmanbra
12:36 PM Liked: 10
post #3 of 16
08-24-2001 | Posts: 1,559
Joined: Feb 1999
No. No, and NO!

720p to 1080i is DOWNCONVERSION. I have stated this at least 10 times!

720p is 45.

1080i is 33.7. The number is LOWER. Yes or no?
johngi
02:16 PM Liked: 10
post #4 of 16
08-24-2001 | Posts: 3
Joined: Aug 2001
What do the numbers 45 and 33.7 mean?

So it is not technically possible to convert 720p to 1080i?

------------------
Thanks,
-john
Thomas Desmond
02:42 PM Liked: 15
post #5 of 16
08-24-2001 | Posts: 2,336
Joined: Jun 2000
It is indeed possible to convert 720p to 1080i, and many of the set top boxes on the market will do it automatically. For example, the Toshiba/Mitsubishi/Hughes boxes will all automatically convert everything to 1080i.

Whether this is an "upconversion" or a "downconversion" is a debatable point -- and not a debate that I'd care to join. It's too much like the tech equivalent of arguing religion.
LeeAntin
02:51 PM Liked: 10
post #6 of 16
08-24-2001 | Posts: 2,311
Joined: Dec 1999
What Rod is referring to is the horizontal scanning rate of the two different HDTV formats. 720p runs at 45 kHz and 1080i runs at 33.7 kHz, so in essence he is correct: you are going from 45 to 33.7, which is down.

Lee
Frank
02:53 PM Liked: 14
post #7 of 16
08-24-2001 | Posts: 5,194
Joined: Apr 1999
Quote:
What do the numbers 45 and 33.7 mean?
He rounded off the horizontal scan frequency in kilohertz.
1080i HDTV has 1125 total scanning lines, 1080 of which are visible with no overscan. 1125 times 30 frames per second equals 33,750 hertz.
720p has 750 total scan lines, 720 of which are potentially visible. 750 times 60 frames per second equals 45,000 hertz.
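The same arithmetic as a quick Python sketch (the total line counts are the figures I believe these formats use, so treat them as assumptions):
```python
# Horizontal scan rate = total scan lines per frame x frames per second.
formats = {
    "1080i": {"total_lines": 1125, "frames_per_sec": 30},  # 60 interlaced fields/sec
    "720p":  {"total_lines": 750,  "frames_per_sec": 60},
}

for name, fmt in formats.items():
    h_freq_hz = fmt["total_lines"] * fmt["frames_per_sec"]
    print(f"{name}: {h_freq_hz / 1000:.2f} kHz")  # 1080i -> 33.75, 720p -> 45.00
```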

Frank

PVR
03:17 PM Liked: 10
post #8 of 16
08-24-2001 | Posts: 3,761
Joined: Jun 2001
I tend to think of "downconvert" as meaning less output resolution (pixels per frame), not a lower kHz signal rate.

For instance, when you go from 720p to 1080i you have to add pixels to build the resulting 1080i frames, so "PIXEL REPLICATION" or some sort of interpolation is done to calculate the extra pixels. I call that "UPCONVERT".

When you go from 1080i to 720p you have to remove pixels to fit the smaller frames. "PIXEL DECIMATION" is done to eliminate pixels which cannot be displayed at the lower resolution. I call that "DOWNCONVERT".
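A minimal Python sketch of what I mean by replication vs. decimation, using nearest-neighbor resampling of a single scan line (just an illustration, not what any particular box actually does):
```python
def resample_line(pixels, new_width):
    # Nearest-neighbor resample: widening repeats source pixels
    # ("pixel replication"), narrowing skips them ("pixel decimation").
    old_width = len(pixels)
    return [pixels[i * old_width // new_width] for i in range(new_width)]

line_720p = list(range(1280))                 # one 1280-pixel scan line
line_1080 = resample_line(line_720p, 1920)    # up: pixels get repeated
line_480  = resample_line(line_720p, 640)     # down: pixels get dropped
print(len(line_1080), len(line_480))          # 1920 640
```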

Doing a websearch on "720p to 1080i" I find the term "upconvert" is used many times (on different sites), but "downconvert" is generally only used when talking about 1080i->480i, 720p-> 480i.

I found one "DTV glossary" that attempts to explain this.

They seem to acknowledge that "upconvert 720p to 1080i" is the common usage, but they consider it a "misnomer":
http://www.nortek.net/learning_center/dtv_glossary.htm
========================================================
Downconvert
A term used to describe the format conversion from a higher resolution input signal number to a
lower display number, such as 1080i input to 480i display.

Upconvert
The term used to describe the conversion of a lower apparent resolution to a higher number, such
as "upconverting" 720p to 1080i. This is a misnomer, though, since to accomplish this, the
horizontal scanning frequency is actually lowered from 45kHz to 33.75kHz. Resolution quality is
not improved by this method.
========================================================

So, even though it seems that "upconvert 720p to 1080i" is the common usage, I vote that we come up with the new term "crossconvert" for going between formats where the higher pixel-resolution format (e.g. 1080i at 1920x1080 vs 720p at 1280x720) has a lower horizontal scanning frequency (e.g. 1080i at 33.75 kHz vs 720p at 45 kHz).

For instance:

Upconvert 480i to 1080i
Crossconvert 720p to 1080i
Downconvert 720p to 480p

If people don't know what I mean when I say "crossconvert" then I will stick with "upconvert 720p to 1080i" because I value final pixel resolution more than horizontal scanning frequency.
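And here is the naming question itself, boiled down to a small Python sketch that labels a conversion by comparing pixel count and horizontal scan rate (the figures are my own assumptions about the formats):
```python
FORMATS = {  # frame pixel count, horizontal scan rate in kHz
    "480i":  {"pixels": 720 * 480,   "h_khz": 15.73},
    "480p":  {"pixels": 720 * 480,   "h_khz": 31.5},
    "720p":  {"pixels": 1280 * 720,  "h_khz": 45.0},
    "1080i": {"pixels": 1920 * 1080, "h_khz": 33.75},
}

def classify(src, dst):
    more_pixels = FORMATS[dst]["pixels"] > FORMATS[src]["pixels"]
    higher_scan = FORMATS[dst]["h_khz"] > FORMATS[src]["h_khz"]
    if more_pixels and not higher_scan:
        return "crossconvert"  # more pixels per frame, but a lower scan rate
    return "upconvert" if more_pixels else "downconvert"

print(classify("480i", "1080i"))  # upconvert
print(classify("720p", "1080i"))  # crossconvert
print(classify("720p", "480p"))   # downconvert
```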

Savageone79
03:48 PM Liked: 10
post #9 of 16
08-24-2001 | Posts: 2,454
Joined: Jul 2001
Supposedly a 720p frame and a 1080i frame have about the same visual quality. The interlaced nature of 1080i lessens its effective visual detail to around 720p. The reason 1080i was chosen was because it is cheaper to make 1080i TVs than 720p TVs, or at least it was at the time the standard was adopted.
johngi
05:10 PM Liked: 10
post #10 of 16
08-24-2001 | Posts: 3
Joined: Aug 2001
Thanks for all the info guys!

I've been reading around on the net and it finally dawned on me that the TV would not be doing the conversion--most likely it would be done by an STB.

I agree with PVR's use of 'cross-convert' since you gain some and lose some when converting between 720p <--> 1080i.

I guess my question has been answered, and I'm pleased with the end result. As long as I can view 720p broadcasts without dropping down to 480p, I'm happy. Thanks again!

------------------
Thanks,
-john
work permit
09:09 PM Liked: 18
post #11 of 16
08-26-2001 | Posts: 2,583
Joined: May 2001
FWIW, broadcast papers call it "side conversion".

Quote:
Format converters come in many flavors such as up-converters, down-converters, side-converters, cross-converters and all-format converters. Side-converters convert between the multiple HDTV formats, for instance, 720p to 1080i or 1080i to 720p. Cross-converters are format converters that also perform frame rate conversion between 50 Hz and 60 Hz.
I seem to remember the term "cross converter" used for conversion between any HD format, but I may be mistaken.

------------------
Alex
PVR
01:46 AM Liked: 10
post #12 of 16
08-27-2001 | Posts: 3,761
Joined: Jun 2001
Thanks, work-permit... I guess "side conversion" is the term to use.

>> Supposedly a 720p frame and a 1080i frame have about the
>> same visual quality. The interlaced nature of the
>> 1080i lessens its effective visual detail to around 720p.

But some displays will de-interlace the 1080i signal. For instance, digital D-ILA/LCD/DLP projectors tend to have an on-board frame store that accepts the interlaced signal, buffers two fields, and outputs progressive scan.
Most of those digital projectors are still below 1080i resolution (under 1920x1080), but the day will soon be here when you can get real 1080p projectors, and then you will really want 1080i source material over 720p.
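The frame-store idea is basically a "weave": buffer two fields and interleave their lines into one progressive frame. A minimal Python sketch (ignoring the motion handling that real de-interlacers add):
```python
def weave(top_field, bottom_field):
    # Interleave two 540-line fields into one 1080-line progressive frame.
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

top    = [f"line {2 * i}"     for i in range(540)]  # lines 0, 2, 4, ...
bottom = [f"line {2 * i + 1}" for i in range(540)]  # lines 1, 3, 5, ...
print(len(weave(top, bottom)))                      # 1080
```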

Also these "visual quality" terms are subjective. Maybe they polled a bunch of people and asked which picture they thought looked better and they couldn't agree. To me 1080i tends to look better than 720p but others may disagree.
Also the type of display device makes a big difference in any benefit of 1080i so you really need to see a range of display devices to make up your mind. Unfortunately most of the current HDTV displays don't try hard to resolve more than 1280x720 worth of resolution so you probably won't notice the 1080i advantage.

>> The reason 1080i was chosen was because it is cheaper
>> to make 1080i TVs than 720p TVs, or at least it was at
>> the time the standard was adopted.

I have seen this statement before and it always annoys me.
Sure, some of the TV expense is in the electronics, and a lower-bandwidth 33.75 kHz 1080i signal saves some money over the higher-bandwidth 45 kHz 720p signal, but I don't consider a TV with 1080i "electronics" and a 720p-resolution picture tube to be a "real" 1080i TV.
Maybe call them "1080i compatible" TVs, but (to me) a real 1080i TV should have enough "pixel sharpness" to show at least 1600x900 worth of resolution. Unfortunately, the cost of a picture tube with better than 1600x900 resolution (or big enough CRTs in an RPTV) basically dwarfs the cost savings in the electronics. Real 1080i TVs should cost more than 720p TVs. 720p-resolution TVs with 1080i input electronics are the cheap hybrid we have been living with so far.
========
Because so many of these "1080i input bandwidth with 720p resolution" TVs have been sold, content producers feel little need to produce material with "true" 1920x1080 resolution detail. If you browse around the other HDTV forums you will find much discussion of how many of the "so called" HDTV programs have little inherent detail beyond regular NTSC. ("Hi Frank!")

Anyway, someday in the future we will probably all have upgraded to "true" 1080i 1920x1080 displays, and we will wish those "old" HDTV reruns didn't look so fuzzy and blurry compared to the quality we eventually get when all the equipment is able to live up to the spec.

(Sorry about all the ranting and raving. I will get off my soapbox now!)


trbarry
05:11 AM Liked: 10
post #13 of 16
08-27-2001 | Posts: 10,138
Joined: Jan 2000
It's probably not an important issue but there is one point that seems to get glossed over in discussions like these.

1080i sends 30 interlaced frames / second at a resolution of 1920 x 1080 for a total pixel rate of about 62 million pixels / second, assuming we could somehow resolve them all.

720p sends 60 progressive frames / second at a resolution of 1280 x 720 for a total pixel rate of about 55 million pixels / second, so they really are about the same.

But both rates above really only are for video source which actually contains that many input fields or frames.

If the source is from film then there are really only 24 frames / second to send anyway. The telecine process just creates duplicate fields for anything over 24 frames.

So, assuming our receivers could actually do 3:2 pulldown removal and display all the unique pixels, we would have a different story.

Watching a movie under 1080i would show 24 frames / second at 1920 x 1080 = 2.07 million new (non-duplicated) pixels / frame.

Watching a movie under 720p would show 24 frames / second at 1280 x 720 = .92 million new (non-duplicated) pixels / frame.

So with movies the difference becomes significant. Of course this assumes we are using a scaler or TV with 3:2 pulldown removal and that the frames will be properly duplicated to match the TV's refresh rate on display.
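For what it's worth, the arithmetic above boils down to a few multiplications; here is a Python sketch (the "new pixels" lines take the per-frame counts above at face value, which is the debatable part):
```python
MILLION = 1_000_000

# Full-rate video: every frame/field is unique.
video_1080i = 1920 * 1080 * 30 / MILLION  # ~62.2 Mpixels/sec
video_720p  = 1280 * 720  * 60 / MILLION  # ~55.3 Mpixels/sec

# Film via telecine: only 24 unique frames/sec; the rest are duplicates.
film_1080 = 1920 * 1080 * 24 / MILLION    # ~49.8 M new pixels/sec (2.07 M/frame)
film_720  = 1280 * 720  * 24 / MILLION    # ~22.1 M new pixels/sec (0.92 M/frame)

print(f"video: {video_1080i:.1f} vs {video_720p:.1f} Mpixels/sec")
print(f"film : {film_1080:.1f} vs {film_720:.1f} M new pixels/sec")
```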

I'm not really making a pitch for 1080i here and I don't really even like interlace much. I'm just pointing out that you can't give credit for more temporal resolution when the source material doesn't even have it. And that would hold for all 24p source material.

Most of our current TVs probably cannot display all of the above, but we can easily imagine future TVs and scalers that do.

It's early morning and I'm on my first cup of coffee so I hope I haven't screwed up the math above too bad.

- Tom

------------------
Getting started with HTPC: HTPC FAQ, DScaler, Xcel's Links, and The Anti-DMCA Website.
And Free Dmitri Sklyarov
roman
11:35 AM Liked: 10
post #14 of 16
08-27-2001 | Posts: 336
Joined: Jun 2000
I can barely perceive the interlacing, considering that each field of 1080i is 540p (which is higher than a full frame of progressive NTSC).


vruiz
11:56 AM Liked: 10
post #15 of 16
08-27-2001 | Posts: 4,534
Joined: May 2000
Quote:
Originally posted by trbarry:
Watching a movie under 1080i would show 24 frames / second at 1920 x 1080 = 2.07 million new (non-duplicated) pixels / frame.

Watching a movie under 720p would show 24 frames / second at 1280 x 720 = .92 million new (non-duplicated) pixels / frame.

It's early morning and I'm on my first cup of coffee so I hope I haven't screwed up the math above too bad.
I'm afraid you did, Tom. What you're describing in the first example is actually 1080p/24, not 1080i. In order to make a fair comparison you need to introduce the interlace factor and the time factor in the equation. In interlace only half the pixels are actually on the screen at any given time so a more accurate calculation would be:

1080i/24: 1920x540x24= 24,883,200 pixels per second
720p/24: 1280x720x24= 22,118,400 pixels per second

Or since we're dealing with 24 frames per second we could also think in terms of 1/24th of a second:

1080i/24: 1920x540= 1,036,800 pixels per 1/24th of a second
720p/24: 1280x720= 921,600 pixels per 1/24th of a second

Either way it's only a 12.5% increase in actual resolution of 1080i over 720p. The interlaced format, however, takes advantage of perceived resolution: the fact that our eyes and brain see the two 540-line fields as one 1080-line frame.

This is all theoretical, of course, since there is no 24 fps interlaced format. The 30 fps/60 Hz interlaced format does well in creating the illusion of one frame from the two fields, but if you were to reduce the refresh rate to 24 fps/48 Hz you would probably start to see flicker.
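The same comparison as a Python sketch (same field and frame sizes as above):
```python
# Count only the pixels carried per field: a 1080i field is 540 lines.
pixels_1080i_24 = 1920 * 540 * 24   # 24,883,200 pixels per second
pixels_720p_24  = 1280 * 720 * 24   # 22,118,400 pixels per second

increase = pixels_1080i_24 / pixels_720p_24 - 1
print(f"{increase:.1%}")            # 12.5%
```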

------------------
Vic Ruiz
STOP HDCP/DFAST/5C

trbarry
01:39 PM Liked: 10
post #16 of 16
08-27-2001 | Posts: 10,138
Joined: Jan 2000
Quote:
I'm afraid you did, Tom. What you're describing in the first example is actually 1080p/24, not 1080i. In order to make a fair comparison you need to introduce the interlace factor and the time factor in the equation. In interlace only half the pixels are actually on the screen at any given time so a more accurate calculation would be:

1080i/24: 1920x540x24= 24,883,200 pixels per second
720p/24: 1280x720x24= 22,118,400 pixels per second
Vic -

Dunno, I'm still not so sure. I think the telecine process will make TWO 1920x540 fields out of a single film frame, doubling the eventual number of pixels received for that frame, or almost 50 million new pixels / second. But 720p will make only one 1280x720 frame out of a single picture.
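A rough Python sketch of the field counting I have in mind, assuming a standard 3:2 telecine cadence (my illustration of the idea, not a description of any particular telecine):
```python
def telecine_3_2(film_frames):
    # 3:2 cadence: alternate film frames are held for 3 fields and 2 fields,
    # so 24 frames/sec become 60 fields/sec; the extras are duplicate fields.
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

one_second = list(range(24))            # 24 unique film frames
fields = telecine_3_2(one_second)
print(len(fields), len(set(fields)))    # 60 fields, from 24 unique frames
```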

Of course there are a whole raft of other issues. I was just counting available pixels being sent.


- Tom

