
Status
Not open for further replies.
1 - 20 of 36 Posts

·
Registered
Joined
·
2,250 Posts
Discussion Starter · #1 ·
I posted this in the HDTV hardware forum but didn't get any replies. I'm hoping someone here will have some insight.


-phil



I know that film-source DVDs are encoded with 3:2 pulldown to get the correct frame rate out at 480i. A thread about the new Samsung RPTVs got me thinking, and I have several questions:


1) Does anyone know if 3:2 pulldown is also employed for HDTV broadcast movies?


2) Is the answer different depending on whether it is broadcast in 720p or 1080i?


3) Is there some other frame rate compensation used in HDTV?


4) Is an HDTV monitor with 3:2 pulldown useful when viewing HD video material?


-phil
 

·
Registered
Joined
·
9,884 Posts
From movies I've captured and processed, it appears the same sort of 3:2 pulldown is used. This occurs on both 480i and 1080i.


A slightly different telecine process is used for 480p and 720p (at 60 fps), and I'm not sure it's properly called telecine. But the repeated frames still occur in a telecine-like pattern if the movie started out as 24p film, and removing the duplicates and displaying at a refresh rate that's a multiple of 24 would still be nice. Here, though, you don't have to worry about improperly woven, mismatched fields causing ugly artifacts.
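The duplicate-removal idea described here can be sketched with a toy model (integers stand in for frames; real video would need a fuzzy frame comparison rather than exact equality):

```python
# Toy model of 24p film repeated into a 60 fps progressive stream and
# then collapsed back. Illustrative only; not capture-card code.

def telecine_24p_to_60p(film_frames):
    """Repeat frames in the 3,2,3,2,... pattern: 24 frames -> 60 frames."""
    out = []
    for i, f in enumerate(film_frames):
        out.extend([f] * (3 if i % 2 == 0 else 2))
    return out

def drop_duplicates(frames_60p):
    """Collapse consecutive repeats to recover the original 24p sequence."""
    out = [frames_60p[0]]
    for f in frames_60p[1:]:
        if f != out[-1]:
            out.append(f)
    return out

film = list(range(24))                  # one second of "film frames"
stream = telecine_24p_to_60p(film)      # 60 frames with 3,2 repeats
recovered = drop_duplicates(stream)     # back to the 24 originals
```

The recovered 24 frames could then be shown at any refresh rate that is a multiple of 24, which is the point being made above.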


It's also interesting to note that some 480i from Fox really appears to be 480p @ 30 fps, just half the frames of whatever it is sending at 480p on another sub-channel. Proper display of this is really more like 2:2 pulldown removal.


- Tom
 

·
Registered
Joined
·
6,092 Posts
Why would you need to remove anything on the Fox broadcast? You are getting constant 30 fps with a 2,2,2,2... cadence.


Getting back to 1080i: movies are almost categorically shot at 24 frames per second, so you have to do telecine to get them to display at the proper 60 Hz rate in the U.S.


The 60 Hz is either a field rate for 1080i or a frame rate for 720p. It takes two fields, the odd lines and the even lines, to make up one frame, so 1080i has a frame rate of 30 Hz. (720p is progressive, so it is very rarely referred to in terms of field rate, but one could argue that 720p has a field rate of 120 Hz.)


Both are telecine, but 3:2 telecine with 1080i is done at the field level, while with 720p it is done at the frame level. (Brain-teaser: if you stop and think about it, only one of the 5 fields in the 3:2 sequence is redundant with 1080i, but 3 of the 5 frames are redundant with 720p.)
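The brain-teaser can be checked with a toy model (illustrative only; the field parities and the exact cadence phase are simplified assumptions):

```python
# Count redundant fields (1080i) vs. redundant frames (720p) in 3:2 telecine.

def telecine_1080i(film_frames):
    """Field-level 3:2: frames alternately contribute 3 and 2 fields;
    a 3-field frame's third field repeats its first (same parity)."""
    fields = []
    for i, f in enumerate(film_frames):
        if i % 2 == 0:
            fields += [(f, 'top'), (f, 'bot'), (f, 'top')]  # 3rd is a repeat
        else:
            fields += [(f, 'bot'), (f, 'top')]
    return fields

def telecine_720p(film_frames):
    """Frame-level 3:2: whole frames repeated in the 3,2 pattern."""
    frames = []
    for i, f in enumerate(film_frames):
        frames += [f] * (3 if i % 2 == 0 else 2)
    return frames

film = list(range(24))          # one second of 24p film
fields = telecine_1080i(film)   # 60 fields, 48 distinct: 1 repeat per 5
frames = telecine_720p(film)    # 60 frames, 24 distinct: 3 repeats per 5
```

So per 5-unit cadence cycle, 1080i wastes one field where 720p wastes three frames, matching the claim above.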


As far as deinterlacing is concerned, there are only two ways I know of that it is commercially possible: a Faroudja 5000 outputting at 1080p, or the Hi-Pix, which does 1080i-to-1080p conversion for film and then scales that down to a lower resolution, with 768p being the highest output.


I don't think any display does 3:2 pulldown detection for 1080i sources. They all do a form of interpolated "bob"-style deinterlacing. This looks really good, but not as good as true progressive output from film-based 1080i material.


-Mr. Wigggles
 

·
Registered
Joined
·
2,048 Posts
Quote:
I don't think any display does 3:2 pull-down detection for 1080i sources
The JVC DLA-QX1 does that.
 

·
Registered
Joined
·
9,884 Posts
Quote:
Why would you need to remove anything on the fox broadcast. You are getting constant 30fps with a 2,2,2,2... cadence.
I was allowing for the fact that some things like Fox News seem to be a stew of materials from many different sources. Some may be interlaced, film-sourced, or possibly (though I haven't proved it) 2:2 material that is out of cadence.


The only reason I use the 480i Fox stream at all is for test material for writing a new deinterlacer. That's where I noticed this.


- Tom
 

·
Registered
Joined
·
6,092 Posts
Quote:
Originally posted by Cliff Watson
Mr. Wigggles,


How does 1080i network programming shot on film differ from 1080i movies shot on film?
I don't think there is much difference at all even in terms of quality.


As far as JVC is concerned it can display a 1080p input but I don't know if it does proper 1080i to 1080p conversion or not internally. For $225K, I'm not sure if it really matters because you'll probably have a $60K Teranex process the source anyway.


-Mr. Wigggles
 

·
Registered
Joined
·
19,586 Posts
You'd think that they'd use pulldown for data reduction, even if there were no other requirement to do it, right? Anyway, if they do pulldown, there are field-repeat flags in the HD MPEG-2 data stream, right? If so, then you'd think it wouldn't be terribly hard to do the progressive conversion.
 

·
Registered
Joined
·
6,092 Posts
Dean,


You are correct; there are progressive frame flags in the HD stream. This is what the Hi-Pix uses to do 3:2 pulldown removal. But rarely does scaler equipment have access to the flags. The Faroudja 5000 actually looks at the image, which is a pain in the ass at 1080i, and hence the $$$.


For 1080p to exist at a consumer level, it will need to be done by the source. I don't think we'll see it anytime soon at a reasonable price. But we might see HD set-top boxes that advertise 3:2 pulldown detection for 1080i sources, which take the 1080p result and downscale it to 720p.


-Mr. Wigggles
 

·
Registered
Joined
·
19,586 Posts
I was referring to the actual STB itself, not an external scaler. Given access to the repeat flags in the digital stream, it should be fairly trivial for them to do so. Though it wouldn't be a huge market, they could put out a high-end box with RGB outputs that supported this option, since there are a good number of people out there with FP systems and direct-view computer monitors to support it. It could just revert to 1080i when not showing 24 f/s material. I can't see how it would require much more expense than what they are already doing. The D/A circuitry would have to be better, of course, because of the higher bandwidth required, but it doesn't seem like it would be prohibitively expensive these days, given that many computer video cards can handle it easily enough.
 

·
Registered
Joined
·
1,703 Posts
Sensing the 3:2 pulldown of 24 fps film source (it's there on all interlaced video of it too) is not relevant unless you are making progressive scan, which for this discussion means converting 1080i to 1080p. No consumer-grade TVs (or projectors) accept 1080p input; a few TVs or HDTV set-top boxes (I don't know of any) may convert the 1080i to 1080p within themselves. 1080p is not one of the ATSC HDTV standards and will only come into being after 1080i is de-interlaced.


Currently, having the 1080p be made anywhere except inside the TV is very finicky. You need 50 MHz of video bandwidth (the theoretical need is 75 MHz to actually resolve adjacent, tiniest 1/1920th-screen-width pixels) for your component video cable and A/V receiver, compared with getting away with 25 MHz for 1080i.
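As a back-of-envelope check on these figures, assuming the standard 1080-line raster of 2200 x 1125 total samples per frame (an assumption; the post doesn't show its arithmetic), and using the fact that resolving alternating on/off pixels needs one full cycle per two pixels:

```python
# Rough analog-bandwidth arithmetic for 1080p60 vs. 1080i30.
TOTAL_SAMPLES_PER_LINE = 2200   # active 1920 plus blanking
TOTAL_LINES = 1125              # active 1080 plus blanking

clock_1080p60 = TOTAL_SAMPLES_PER_LINE * TOTAL_LINES * 60   # 148.5 MHz pixel clock
clock_1080i30 = TOTAL_SAMPLES_PER_LINE * TOTAL_LINES * 30   # 74.25 MHz pixel clock

nyquist_1080p60 = clock_1080p60 / 2   # 74.25 MHz: close to the "75 MHz" figure
nyquist_1080i30 = clock_1080i30 / 2   # ~37 MHz: "25 MHz" is getting away with less
```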


>>> It's also interesting to note that some 480i from Fox appears really to be 480p @ 30 fps, just half the frames of whatever it is sending at 480p on another sub-channel...


Done right, making interlaced video out of what was once progressive scan cannot be distinguished from the same source televised or recorded as interlaced in the first place. Done wrong, you get a softer picture (240p quality as opposed to 480i), or occasional jerky motion from the outright skipping of whole frames and taking both even and odd fields from the frames that are kept.
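The "done right" path is a simple field split, sketched here as a toy NumPy model (array shapes are illustrative):

```python
import numpy as np

def split_fields(frame):
    """One progressive frame -> (top field, bottom field)."""
    return frame[0::2], frame[1::2]   # even lines, odd lines

def progressive_to_interlaced(frames_30p):
    """30p -> 60i: each frame yields its own two fields, nothing discarded,
    so the result is indistinguishable from natively interlaced material."""
    out = []
    for f in frames_30p:
        top, bot = split_fields(f)
        out.extend([top, bot])
    return out

# One second of 480p30 (8-pixel-wide frames just to keep the model small).
frames = [np.full((480, 8), n) for n in range(30)]
interlaced = progressive_to_interlaced(frames)   # 60 fields of 240 lines
```

The "done wrong" variant would derive both fields from only some frames and skip the rest, which is where the jerky motion comes from.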


Video hints:
http://members.aol.com/ajaynejr/video.htm
 

·
Registered
Joined
·
9,884 Posts
Allan -


That is certainly an impressive collection of useful info you've made; I've bookmarked it. But it wasn't obvious which section you were talking about above.


As far as distinguishing the difference, remember that I was talking about looking at captured data, one frame at a time. Some things are a bit more obvious there than if you displayed them in real time on an interlaced display.


As far as "done right", I have no idea. It seems likely the Fox stations have better line doublers than the average user, but I'm still not that impressed with their standard 480i-to-480p conversions. But I'm not sure whether you were talking about displaying the pseudo-480p on an interlaced display, or their usual process of upconverting 480i to 480p before sending it that way on their primary channel.


- Tom
 

·
Registered
Joined
·
19,586 Posts
Quote:
No consumer grade TV's (or projectors) accept 1080p input, a few TV's or HDTV set top boxes (I don't know of any) may convert the 1080i to 1080p within themselves, 1080p is not one of the ATSC HDTV standards and will only come into being after 1080i is de-interlaced
Plenty of folks have FP systems capable of accepting a 1080p signal. It doesn't matter that they cannot resolve all of the detail. Hell, most of us cannot resolve nearly all the detail in the 1080i signal, and that doesn't stop us from using and enjoying it. It's a matter of getting rid of interlace artifacts, which we could do perfectly on film-based material. Your projector will resolve whatever it can resolve, just as it does on 1080i material, and you'll have a progressive image.
 

·
Registered
Joined
·
9,884 Posts
I may not understand it completely, but if you're confident you have properly telecined 1080i (or 480i) from a 24p film source, then I think an MPEG-2 decoder can recover a proper 24p stream just by ignoring the various repeat flags and frames, and then displaying the remaining frames at 1/24-second intervals.
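That decoder shortcut can be sketched as follows (a toy model; the tuple format stands in for MPEG-2's repeat_first_field flag and is not a real decoder API):

```python
# Telecined 24p in an MPEG-2 stream is coded as 24 pictures/s with a
# repeat flag set on alternate pictures; honoring the flags yields 60
# fields/s, ignoring them yields the original 24p directly.

def decode_to_fields(coded):
    """What a naive 60i decoder emits: 3 fields when the repeat flag
    is set, otherwise 2."""
    return sum(3 if rff else 2 for _frame, rff in coded)

def inverse_telecine(coded):
    """Ignore the repeat flags: the coded pictures ARE the 24p frames."""
    return [frame for frame, _rff in coded]

# One second of film: 24 coded pictures, repeat flag on every other one.
coded = [(n, n % 2 == 0) for n in range(24)]
field_count = decode_to_fields(coded)   # 60 fields if flags are honored
film_frames = inverse_telecine(coded)   # 24 frames if flags are ignored
```

Which is why, as noted below, pulldown removal can actually be cheaper than faithful decoding, as long as the flags are trustworthy.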


That is, it is faster and more efficient to do 3:2 pulldown removal than not to do it. The problems come because you don't know whether your source was properly telecined from 24p material. And even when it was, it may have been messed up with video edits, overlays, time compression, etc.


- Tom
 

·
Registered
Joined
·
6,092 Posts
Quote:
Originally posted by Dean Roddey



Plenty of folks have FP systems capable of accepting a 1080p signal. It doesn't matter that they cannot resolve all of the detail. Hell, most of us cannot resolve nearly all the detail in the 1080i signal, and that doesn't stop us from using and enjoying it. It's a matter of getting rid of interlace artifacts, which we could do perfectly on film-based material. Your projector will resolve whatever it can resolve, just as it does on 1080i material, and you'll have a progressive image.
I think this is where the HiPix does a good job. It deinterlaces 1080i to 1080p with the flags and then scales that output to one of its progressive resolution outputs 1360X768p or 1280X720p. Depending on your digital display, there is a good chance one of these resolutions is native; or if you have an FP CRT, one of these progressive resolutions should fill your display.


And I would like to point out that mathematically, going from 1080p to 720p is fairly easy, and with the HiPix's scaling capabilities it doesn't have any trouble going to 768p either.


It would be nice if consumer STBs, when outputting 720p, would take the 1080i signal to 1080p first before going down to 720p. But since there isn't much demand for 720p, I don't think any of them go through the trouble.


And since 1920 X 1080p at 60 fps is at the extremes of what DVI can handle, it might complicate things when it comes time to develop a high-end STB.


I am just happy with my HiPix currently. It looks great, and I don't think I will personally need any more than 768p in the coming years.


-Mr. Wigggles
 

·
Registered
Joined
·
13,423 Posts
“I think this is where the HiPix does a good job. It deinterlaces 1080i to 1080p with the flags and then scales that output to one of its progressive resolution outputs 1360X768p or 1280X720p.”


Mr. Wigggles,


I’m still trying to understand this feature of the HiPix. What 1080i source material are you using for this 3:2 pulldown before scaling? This question leads back to my question above about the difference between 1080i network programming from film and 1080i movies from film. I don’t have HBO-HD or SHO-HD to compare to the network 1080i derived from film, but all network 1080i shows have excessive combing with this feature enabled and scaled to 720p/768p.


In other words what are the limitations of this HiPix feature?
 

·
Registered
Joined
·
6,092 Posts
Quote:
Originally posted by Cliff Watson
“I think this is where the HiPix does a good job. It deinterlaces 1080i to 1080p with the flags and then scales that output to one of its progressive resolution outputs 1360X768p or 1280X720p.”


Mr. Wigggles,


I’m still trying to understand this feature of the HiPix. What 1080i source material are you using for this 3:2 pulldown before scaling? This question leads back to my question above about the difference between 1080i network programming from film and 1080i movies from film. I don’t have HBO-HD or SHO-HD to compare to the network 1080i derived from film, but all network 1080i shows have excessive combing with this feature enabled and scaled to 720p/768p.


In other words what are the limitations of this HiPix feature?
Cliff,


I haven't looked at network film-based programming for a long time on the HiPix. It is possible that the network flags aren't set properly for 24 fps stuff shown at 1080i (60 fields per second). If the field cadence doesn't match the flag cadence, all hell can break loose when it comes time to deinterlace. I don't record CSI or any other CBS show, so I don't know. The only network movies I have are a few ABC recordings, which are 720p of course, so that doesn't matter.


Now, a few CBS shows, to my knowledge, are shot with video cameras, which makes them 60i to begin with, and that would probably weave-deinterlace very poorly. I'm not sure which ones, but that could cause some issues.


I haven't noticed any combing on my HBO/Showtime stuff, and I'm pretty sensitive to it. I could capture the input signal to my projector to show what I'm getting. I could also do some very close viewing on my Princeton Graphics monitor to look for problems. Anything I run up against should be minor on the HBO stuff, but who knows with the CBS stuff.


But all other things being equal, the CBS stuff and the HBO stuff are 24 fps telecined productions, both being shown at 1080i. So it should be possible to weave-deinterlace them to 1080p.
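Weave deinterlacing itself is just an interleave of two fields, sketched here with NumPy (illustrative; it is only artifact-free when both fields come from the same film frame, which is why cadence and flags matter):

```python
import numpy as np

def weave(top_field, bottom_field):
    """Two 540-line fields -> one 1080-line progressive frame."""
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines from the top field
    frame[1::2] = bottom_field   # odd lines from the bottom field
    return frame

# Toy fields (16 pixels wide to keep the model small).
top = np.zeros((540, 16), dtype=np.uint8)
bot = np.ones((540, 16), dtype=np.uint8)
full_frame = weave(top, bot)     # 1080 x 16 progressive frame
```

Weave mismatched fields from two different film frames and you get exactly the combing Cliff describes.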


-Mr. Wigggles


P.S. The 24fps 1080i deinterlace feature, to my knowledge, comes set to "on" or "1" in the registry upon install of Beta 3, which is what I'm still using. I have never changed it, so if it is actually "0" in the off position, I will have some egg on my face due to the bob deinterlacing actually going on. Breaking out the Princeton Graphics monitor for critical viewing should make this very apparent visually, one way or another.
 

·
Registered
Joined
·
13,423 Posts
“P.S. The 24fps 1080i deinterlace feature, to my knowledge, comes set to "on" or "1" in the registry upon install of Beta 3, which is what I'm still using. I have never changed it, so if it is actually "0" in the off position, I will have some egg on my face due to the bob deinterlacing actually going on. Breaking out the Princeton Graphics monitor for critical viewing should make this very apparent visually, one way or another.”


Mr. Wigggles,


Go wash the egg off your face. Beta 3 does not install with deinterlace before scaling enabled. :D


No need to check on a monitor. ALL CBS shows have very bad combing with the bit set to 1 in the registry. Wish I didn’t live on the east side of an apartment building so I could put up a dish.
 

·
Registered
Joined
·
6,092 Posts
Well, egg yolk does blur your vision. That goes to show you: when you only use a 1024 X 576 projected image, it doesn't really matter too much whether you are doing bob on the 540p or scaling down from 1080p :)


I will dust off the Princeton Graphics when I get home and do some serious testing with the 1080p enabled on the HBO stuff.


I'll let you know.


-Mr. Wigggles
 