Understanding Hi Def and NTSC/PAL - Page 2 - AVS Forum
post #31 of 39 Old 04-29-2007, 04:06 PM
AVS Special Member
 
CKNA
 
Join Date: Jan 2001
Location: CT, USA
Posts: 4,059
Quote:
Originally Posted by Quidam67

I'm glad I asked the original questions, and I've found everyone's contribution to this thread very informative (although some answers raise more questions than they answer, based on my lack of experience in this field).

The reason for the initial post is that unless you are an enthusiast and/or come from a technical background (with regard to this technology), you have no real idea what you are spending your money on. I've learnt the hard way that you cannot rely on the product manufacturers (or especially the guys who sell them) to truly educate you about the technology you are putting into your own home, or how to achieve an optimal set-up.

The issue of frame-rate incompatibility between film and TV (and the various methods used to solve it) is fascinating to me. Living in a PAL region (New Zealand) I can't recall ever witnessing judder, but a few nights ago I rented "The Queen" on DVD and the judder was horrific. I'm struggling to understand why. The film was PAL, and the DVD player was a PAL/NTSC progressive-scan Pioneer. As a test, I played the movie on my modded Xbox (which does 480p and 720p/1080i upscaling) and the judder was gone, so I suspect this movie was some sort of sloppy conversion from an NTSC master. Regardless, if the judder I witnessed is what NTSC people have to put up with, I'm thanking my lucky stars I'm in a PAL region.

As was previously mentioned, modern TVs seem to be able to handle all sorts of formats. Certainly both my plasmas (Panasonic and Samsung) can handle 50 and 60Hz plus 480p/576p/720p/1080i signals. They both have VGA as well, and I noted that the Samsung manual says it can handle 72Hz via this connection, which suggests it could potentially do a 24fps movie at 3:3 with no speed-up. But whether it can do 72Hz over component or HDMI is yet another question I'm not qualified to answer, and it frustrates me that this sort of information is almost invisible/unobtainable to the consumer. On that note, I guess folks using projectors probably care/know more about the 24fps issue, because I assume projectors are more likely to be able to do 48/72Hz (or higher), as they are typically going to be used to play films.

What I can say is that the issue of frame-rate incompatibility between film and TV should have been properly resolved with the new generation of TVs, players and hi-def standards, but that does not appear to be the case - at least, not in a really definitive way.


What you saw was not 2:3 pulldown judder but an issue with your player. Besides, it could not be pulldown judder, as it was a PAL DVD. Studios do not convert NTSC DVDs to PAL; they speed the film master up to 25fps for the PAL release. Film captured at 24fps has judder built into it - it is called strobing, and it is due to the poor motion sampling of 24fps. That is why there is no such thing as perfectly smooth playback of film, even with the so-called speed-up (which hurts the audio) or 3:3 pulldown. Most people confuse 2:3 pulldown judder with stuttering. If somebody sees stuttering, that means their player is not working properly. Judder is easiest to see during credits: the letters appear to move ever so slightly unevenly.
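
To put numbers on it, here is a minimal Python sketch (just an illustration of the arithmetic - not anything a real player runs) of the two mappings in question: 2:3 pulldown for 60Hz regions and the 4% speed-up used for PAL releases.

Code:
import math

FILM_FPS = 24.0

def pulldown_2_3(num_film_frames):
    """Expand film frames into 60i fields using the 2:3 cadence.
    Every other film frame is held for 3 fields instead of 2 -
    that uneven hold is what shows up as pulldown judder."""
    fields = []
    for i in range(num_film_frames):
        fields.extend([i] * (2 if i % 2 == 0 else 3))
    return fields  # 4 film frames -> 10 fields = 5 interlaced frames

def pal_speedup():
    """PAL discs simply run the 24fps master at 25fps: motion stays
    a perfectly even 2:2, but everything plays about 4% fast and the
    audio rises in pitch unless it is corrected."""
    speed = 25.0 / FILM_FPS
    pitch_semitones = 12 * math.log2(speed)
    return speed, pitch_semitones

print(pulldown_2_3(4))  # [0, 0, 1, 1, 1, 2, 2, 3, 3, 3]
print(pal_speedup())    # (1.0416..., ~0.71 semitones sharp)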
CKNA is offline  
post #32 of 39 Old 04-29-2007, 05:06 PM
AVS Special Member
 
sneals2000
 
Join Date: May 2003
Location: UK
Posts: 7,044
Quote:
Originally Posted by Quidam67

The issue of frame-rate incompatibility between film and TV (and the various methods used to solve it) is fascinating to me. Living in a PAL region (New Zealand) I can't recall ever witnessing judder, but a few nights ago I rented "The Queen" on DVD and the judder was horrific. I'm struggling to understand why. The film was PAL, and the DVD player was a PAL/NTSC progressive-scan Pioneer. As a test, I played the movie on my modded Xbox (which does 480p and 720p/1080i upscaling) and the judder was gone, so I suspect this movie was some sort of sloppy conversion from an NTSC master. Regardless, if the judder I witnessed is what NTSC people have to put up with, I'm thanking my lucky stars I'm in a PAL region.

Don't think this is the case.

The Queen was made in the UK, and I would expect any DVD release in a 50Hz region (Europe, Aus/NZ etc.) to have been a 2:2 pulldown 576/50i transfer with 4% speed-up. It would be very unlikely for a 3:2 480/60i master to have been used and then run through a standard video standards converter to 576/50i for a DVD release. If for some bizarre reason a 3:2 480/60i master had been used, a 576/48i DEFT conversion (horribly known as "Slow PAL" by some) would be the expected route, to remove the 3:2 judder and give a 2:2 transfer.
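
In principle a DEFT-style conversion is just cadence removal: find the repeated third field of each 3:2 group and drop it, leaving clean 2:2 at 48 fields/sec. A toy Python sketch of that idea (mine - a sketch of the principle, not the actual broadcast algorithm; lists of film-frame ids stand in for real fields):

Code:
def deft_to_48i(fields_60i):
    """Drop the repeated fields of a 3:2 telecined stream so each
    film frame is left with exactly two fields (2:2 at 48i)."""
    kept, counts = [], {}
    for frame_id in fields_60i:
        counts[frame_id] = counts.get(frame_id, 0) + 1
        if counts[frame_id] <= 2:  # a third field is always a repeat
            kept.append(frame_id)
    return kept

# [0,0,1,1,1,2,2,3,3,3] -> [0,0,1,1,2,2,3,3]: judder-free 2:2
print(deft_to_48i([0, 0, 1, 1, 1, 2, 2, 3, 3, 3]))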

Not sure what is causing the judder with your set-up - but I'd be surprised if it was 3:2 pulldown related.

Have you tried playing the disc on a regular, simple DVD player, or looked at the non-progressive output of your player?

Quote:
As was previously mentioned, modern TVs seem to be able to handle all sorts of formats. Certainly both my plasmas (Panasonic and Samsung) can handle 50 and 60Hz plus 480p/576p/720p/1080i signals. They both have VGA as well, and I noted that the Samsung manual says it can handle 72Hz via this connection, which suggests it could potentially do a 24fps movie at 3:3 with no speed-up. But whether it can do 72Hz over component or HDMI is yet another question I'm not qualified to answer, and it frustrates me that this sort of information is almost invisible/unobtainable to the consumer. On that note, I guess folks using projectors probably care/know more about the 24fps issue, because I assume projectors are more likely to be able to do 48/72Hz (or higher), as they are typically going to be used to play films.

Yep - CRTs had a much tighter line-rate/field-rate spec, due to the specialised scanning transformers and circuits needed to drive a CRT's line and field scans while keeping the display bright. These days plasmas and LCDs can be a bit more versatile.

That said, it would be interesting to know whether your display actually displays at 72Hz, or just accepts a 72Hz signal and "copes".

72Hz over component or HDMI is a non-starter in video-format terms - it is up to the display to accept a 1080/24p signal and convert it to 72Hz, not to be fed 72Hz by the source. You don't find DVD players or HD-DVD/BluRay devices with 1080/72p component or HDMI outputs.

I suspect the 72Hz via VGA is there because it is a standard PC option - though HTPC fans may run clever 60i-to-24p de-interlacing and 3:3 frame-repetition algorithms to convert 3:2 telecined DVDs back to 24p and output them as 72p over VGA.
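
As a rough sketch of what such an HTPC chain does (hypothetical Python, my own function names): inverse-telecine the 3:2 stream back to the original 24 film frames, then show each frame three times for an even 72Hz.

Code:
def htpc_60i_to_72p(fields_60i):
    """Inverse telecine (3:2 -> 24p) followed by 3:3 repetition
    for a 72Hz display. fields_60i is a list of film-frame ids
    in telecine order."""
    frames_24p = []
    for frame_id in fields_60i:  # collapse the repeats back to 24p
        if not frames_24p or frames_24p[-1] != frame_id:
            frames_24p.append(frame_id)
    # 3:3 pulldown: every film frame occupies exactly 3 refreshes
    return [f for f in frames_24p for _ in range(3)]

print(htpc_60i_to_72p([0, 0, 1, 1, 1, 2, 2]))
# [0, 0, 0, 1, 1, 1, 2, 2, 2] - evenly timed, no 3:2 judder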

Quote:

What I can say is that the issue of frame-rate incompatibility between film and TV should have been properly resolved with the new generation of TV's; Players and Hi Def standards, but that does not appear to be the case -at least, not in a really definative way.

There will always be more standards...

The EU licensing of the HD-Ready badge - mandating 480/60i, 576/50i, 480/60p, 576/50p, 1080/60i, 1080/50i, 720/60p and 720/50p compatibility on both component and HDCP-equipped HDMI or DVI inputs - has helped immensely in Europe, allowing broadcasters to stick at 50Hz (which they have to, for myriad reasons) while HD-DVD/BluRay titles can ship as single 1080/24p releases with 480/60i extras.

Quite a few displays and players now additionally support 1080/50p and 1080/60p, optionally and via some but not all inputs, though this is really only an issue for full-1080 displays (i.e. those with 1920x1080 panel resolutions).

The 1080/24p output now appearing on some HD-DVD/BluRay players is still pretty new, and really only benefits people whose displays will show a 24p signal with 2:2, 3:3, 4:4 or 5:5 frame repetition to avoid the 3:2 judder of 60p.

Don't get me wrong: 3:2 judder is not unwatchable - it is just noticeable to those of us used to 2:2 motion. You only really notice it on linear motion, like rolling/crawling credits, tracking camera shots, and particularly unrealistic CGI camera pans.
sneals2000 is offline  
post #33 of 39 Old 04-29-2007, 06:22 PM - Thread Starter
Senior Member
 
Quidam67
 
Join Date: Feb 2007
Posts: 376
Great stuff. I've learnt a lot from this thread.

I agree with you both; it seems unlikely the judder was the result of pulldown. The judder was so bad that everything seemed surreal. I can't believe anyone in an NTSC region would have accepted what I was watching - and I'm certainly not that fussy. I have another couple of DVD players I can test it on (a Philips and a Sony HDD recorder), so I'll report back.

BTW, my Samsung displays the actual refresh rate (Hz) it is running at when I press the info button (assuming that is the output rather than the input rate), so I guess I could also attach my laptop via VGA and see what rate is actually being used.

I like the idea of a HTPC, it's something I might look into at some point.

Cheers
b
Quidam67 is offline  
post #34 of 39 Old 04-30-2007, 01:45 AM - Thread Starter
Senior Member
 
Quidam67
 
Join Date: Feb 2007
Posts: 376
OK, The Queen played back perfectly (no judder) on my Sony, so I can conclude it was the Pioneer. Strange - my opinion of the Pioneer has gone down a bit. Although it is progressive scan, it was a very cheap unit.

My laptop plays through my Samsung at 60Hz, but that is also what the laptop is set to output, so I think there is still some ambiguity about what frame rates the TV can handle (via VGA).

Assuming any of you guys are reading this, I have a final question that is bugging me:

How does a TV that is essentially 720p native (like my 42-inch plasmas, which are 1024x768) process a 1080i signal?

For example, how does it convert 1080i/50 to 720p/50?

The reason I ask is that it may give me a better understanding of the best signal to feed the TV (assuming I have a choice).

Cheers
b
Quidam67 is offline  
post #35 of 39 Old 04-30-2007, 04:01 AM
Senior Member
 
YellowCows
 
Join Date: Nov 2002
Posts: 224
It depends on the internal scaler of your plasma.

All progressive HD panels have an internal de-interlacer/scaler that takes various input signals, both SD and HD, and 'converts' them: it de-interlaces (produces a progressive picture from an interlaced source) and/or scales (interpolates the incoming resolution to fit the native resolution of the display) them to the panel's fixed (native) resolution. Depending on the input, the scaler might also perform a frame-rate conversion on the signal (e.g. from 50Hz to 60Hz) if necessary.

As you may know, the need for scaling/de-interlacing comes from the fact that the panel in question is a fixed-pixel progressive display. In other words, it can only display the exact number of pixels it is rated at, progressively - so interlaced inputs need to be de-interlaced, and scaled up or down to fit that exact pixel count, before the panel can display the picture. Thus, assuming the display has a native rate of 1024x768@60Hz, an SD input of 576i/50 would need to be de-interlaced to 576p/50, then scaled up to 1024x768, and the frame rate converted from 50Hz to 60Hz (or not, if the display can handle a 50Hz frame rate natively).

The same process is needed to display 1080i on a 768p panel, with the scaler scaling down instead of up. However, there is one proviso: the quality of de-interlacing is highly dependent on the algorithms used internally. Many internal scalers actually halve the resolution of the 1080i signal during de-interlacing - down to a 540p picture - and then scale that up to 768p. This is the low-cost way of doing things and ultimately results in a loss of resolution, though the resulting picture may still be perceived as high definition. Happily, manufacturers nowadays compete not only on the quality of the panel but also on the internal processing, so the scalers found in the latest LCD and plasma models retain much of the picture information that previous generations discarded.
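
To make the chain concrete, here is a toy Python model of the steps just described (my own sketch - the step names and the cheap flag are invented for illustration):

Code:
def display_pipeline(src_lines, interlaced, src_hz,
                     panel_lines=768, panel_hz=60, cheap=False):
    """Toy model of a fixed-pixel panel's internal processing chain:
    de-interlace (properly, or via the cheap half-resolution path),
    scale to the native panel resolution, then convert frame rate."""
    steps = []
    if interlaced:
        if cheap:
            src_lines //= 2  # each field scaled alone: 1080i -> 540 lines
            steps.append(f"cheap de-interlace: fields treated as {src_lines}p")
        else:
            steps.append(f"full de-interlace: {src_lines}i -> {src_lines}p")
    steps.append(f"scale: {src_lines} -> {panel_lines} lines")
    if src_hz != panel_hz:
        steps.append(f"frame-rate conversion: {src_hz}Hz -> {panel_hz}Hz")
    return steps

for step in display_pipeline(1080, interlaced=True, src_hz=50):
    print(step)
# full de-interlace: 1080i -> 1080p
# scale: 1080 -> 768 lines
# frame-rate conversion: 50Hz -> 60Hz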

This is why so many of us here on AVS purchase outboard video processors (scalers) that handle all the de-interlacing and scaling with sophisticated chipsets and multi-tap filters, and pass the scaled/de-interlaced picture to the display at its native rate, bypassing the internal processing altogether. For that, it is essential that the display accept its native rate (NR) through DVI/HDMI, or at least VGA. But that is another story.

This is only a simplified explanation, but I hope I've managed to explain it a little ...
YellowCows is offline  
post #36 of 39 Old 04-30-2007, 01:11 PM - Thread Starter
Senior Member
 
Quidam67
 
Join Date: Feb 2007
Posts: 376
Thanks YellowCows.

It happens more or less as I was thinking; however, I'm struggling with the temporal aspect of this. In a progressive-scan DVD player, I can appreciate how a buffer could be used to "read ahead" and thus perform de-interlacing while maintaining a specific output frame rate, e.g. 480p/60Hz to the TV.

However, if the TV is receiving a 1080i/60 signal, I don't see how it could de-interlace the two sets of fields without dropping the frame rate to 30. Surely it can only work with what it is getting (in a temporal/cadence sense)?

The only other alternative (to my mind) is that it takes the first set of fields (540 lines) it receives, interpolates to produce a complete frame (1080 lines), and then scales that down to the native resolution of the TV (e.g. 768 lines) and displays it. Then it does the same with the second set of fields (for the current frame), and so on. This would result in the interlaced signal looking like a progressive-scan image, but with a source resolution of only 540 lines, which is actually worse than what I'm getting now in PAL land for an SD image at 576p. If this is how it works, then a 720p picture sounds superior to a 1080i one, at least on a 720p TV.
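
Something like this toy Python sketch, perhaps (purely my guess at how the cheap approach might work):

Code:
def bob_field(field):
    """Line-double one 540-line field into a full frame by averaging
    vertically adjacent lines. Done 60 times a second this keeps the
    frame rate, but each frame carries only 540 real source lines."""
    frame = []
    for i, line in enumerate(field):
        frame.append(line)  # real line from the field
        nxt = field[i + 1] if i + 1 < len(field) else line
        # invented line: average of the real lines above and below
        frame.append([(a + b) / 2 for a, b in zip(line, nxt)])
    return frame  # 540 lines in -> 1080 lines out (then scaled to 768)

# two 2-pixel lines in -> four lines out
print(bob_field([[10, 20], [30, 40]]))
# [[10, 20], [20.0, 30.0], [30, 40], [30.0, 40.0]]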

Your thoughts? I can't help but feel I'm missing something.
Quidam67 is offline  
post #37 of 39 Old 04-30-2007, 01:49 PM
AVS Special Member
 
sneals2000
 
Join Date: May 2003
Location: UK
Posts: 7,044
Quote:
Originally Posted by Quidam67

Thanks YellowCows.

It happens more or less as I was thinking; however, I'm struggling with the temporal aspect of this. In a progressive-scan DVD player, I can appreciate how a buffer could be used to "read ahead" and thus perform de-interlacing while maintaining a specific output frame rate, e.g. 480p/60Hz to the TV.

However, if the TV is receiving a 1080i/60 signal, I don't see how it could de-interlace the two sets of fields without dropping the frame rate to 30. Surely it can only work with what it is getting (in a temporal/cadence sense)?

Worth considering that a display can employ field or frame storage - it can store one field and process it together with the incoming field to produce a frame, whilst simultaneously storing the incoming field to process with the next one. In fact, some professional de-interlacing techniques use more than two fields to deliver better motion detection and compensation.

Quote:


The only other alternative (to my mind) is that it takes the first set of fields (540 lines) it receives, interpolates to produce a complete frame (1080 lines), and then scales that down to the native resolution of the TV (e.g. 768 lines) and displays it.

Some cheap displays just take the 540 lines of the field and scale directly to 768 lines.

Slightly more expensive de-interlacers, which have stored the previous 540-line field so that a full 1080-line frame can be woven, may decide that there is too much motion in the scene to do this, and instead, again, scale 540 lines to 768. However, if they have both fields available (one stored) AND there isn't much motion, they can scale the combined 1080-line frame to 768 lines for greater vertical resolution.

Even more expensive de-interlacers take the 540-vs-1080 decision on a per-pixel or per-block basis rather than for the entire field. How the 1080-line frame is created from the two 540-line fields depends on the processing power available, as motion detection and tracking can these days improve the vertical resolution of even some moving material. (Snell and Wilcox are using the same phase-correlation techniques they use in 50/60Hz converters to improve de-interlacing, particularly for SD-to-HD upconversion.)
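
To illustrate the per-pixel idea, here is a deliberately crude Python sketch (mine - nothing like the real Snell and Wilcox processing): weave where the picture appears static, interpolate vertically (bob) where it moves, decided pixel by pixel.

Code:
def deinterlace_pixel(prev_field, curr_field, x, y, threshold=10):
    """Fill in missing line y of the current field at column x.
    For simplicity, both fields are full-height luma buffers:
    curr_field has real lines at y-1 and y+1, prev_field at y.
    A crude combing test stands in for real motion detection."""
    above = curr_field[y - 1][x]
    below = curr_field[y + 1][x]
    woven = prev_field[y][x]  # the pixel from the stored field
    if abs(woven - (above + below) / 2) < threshold:
        return woven                 # static: weave, full 1080-line detail
    return (above + below) / 2       # moving: bob, interpolate vertically

# static area: the stored-field pixel agrees with its neighbours -> weave
print(deinterlace_pixel([[0], [100], [0]], [[100], [0], [100]], 0, 1))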

Quote:


Then it does the same with the second set of fields (for the current frame), and so on. This would result in the interlaced signal looking like a progressive-scan image, but with a source resolution of only 540 lines, which is actually worse than what I'm getting now in PAL land for an SD image at 576p.

Yes - except that on fast motion (on video sources, not film sources) the same thing happens to 576/50i: it gets treated as 288/50p. You don't get 576/50p resolution on all 576/50i material - though you should on 2:2 pulldown film (and 25p video) sources.

Quote:


If this is how it works, then a 720p picture sounds superior to a 1080i one, at least on a 720p TV.

Your thoughts? I can't help but feel I'm missing something.

How your display de-interlaces 1080i sources dictates whether 720p or 1080i looks better on it. However, now that full 1920x1080 displays and better de-interlacers are available, the balance seems to be swinging back in 1080i's favour. Sky in the UK and the BBC have gone with 1080i, not 720p, for their new HD services.
sneals2000 is offline  
post #38 of 39 Old 04-30-2007, 02:56 PM - Thread Starter
Senior Member
 
Quidam67
 
Join Date: Feb 2007
Posts: 376
I like the idea of a field buffer. I was stuck in the mindset that a frame is made of two sets of fields (which it is) and that there would never be any cross-over between frames, but I don't think that is true. So I now think it works like this (on a good-quality system):

1) Field A = buffered field
2) Field B = current field
3) Field A is de-interlaced (woven) with Field B, assuming a suitable lack of motion is detected
4) Scaling is done and the frame is displayed
5) Field A is replaced with Field B
6) Field B is replaced with the next incoming field
7) Go back to step 3

If this is the case, the only thing that bothers me is that we are potentially combining a field from frame A with a field from frame B. For a film, that was never intended, and I really can't say what impact it might have on the quality of the resulting picture - though, as you say, a smart de-interlacer would probably pick up on any significant movement between fields and choose not to weave for that particular cycle. The other issue with this method is temporal: if you are playing back a film, the de-interlacing process produces an extra frame in between each true frame (e.g. field B from frame 1 de-interlaced with field A from frame 2), so this would have to be allowed for when considering the frame rate the movie is played at. In loop form, I picture it like the sketch below.
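
A sketch of steps 1-7 as a Python loop (weave_or_bob and scale are placeholders for the real per-pixel work):

Code:
def deinterlace_stream(fields):
    """Slide a two-field window over an interlaced stream,
    following steps 1-7 above."""
    def weave_or_bob(a, b):  # placeholder: motion-adaptive combine
        return (a, b)
    def scale(frame):        # placeholder: e.g. 1080 -> 768 lines
        return frame
    frames = []
    field_a = None                     # step 1: buffered field
    for field_b in fields:             # steps 2 and 6: current field
        if field_a is not None:
            frame = weave_or_bob(field_a, field_b)  # step 3
            frames.append(scale(frame))             # step 4
        field_a = field_b              # step 5: buffer slides forward
    return frames                      # one output frame per field, bar the first

print(deinterlace_stream(["A1", "B1", "A2", "B2"]))
# [('A1','B1'), ('B1','A2'), ('A2','B2')] - note the cross-frame pair ('B1','A2')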

Complex stuff!
Quidam67 is offline  
post #39 of 39 Old 04-30-2007, 03:15 PM - Thread Starter
Senior Member
 
Quidam67
 
Join Date: Feb 2007
Posts: 376
Going back to something YellowCows said:

"It depends on the internal scaler of your plasma"

That really is the question, isn't it? Reading the technical specs of a particular TV model does not yield such information (in my experience), and that's understandable, given most people would not know what they were reading anyway. I'm betting the amount of money you paid offers a pretty solid hint, though.

I suspect both my plasmas have good de-interlacers/scalers (Panasonic and Samsung; based on picture, I'd say the Panasonic is the better of the two). But I also have a 32-inch Acer LCD, and I'm betting the processor in that baby isn't quite so hot.

About the 720p vs 1080i thing: I can see how the tide will turn against 720p, and assuming the TV does a good job with 1080i, the resulting picture quality (between 720p and 1080i) is probably going to be hard to tell apart.

Bear in mind that in New Zealand we have no hi-def broadcasts, so my hi-def experience has thus far been limited to the Xbox 360 - and Blu-ray, when I can bear to plunk down the cash for a PS3. Actually, I'm also looking very closely at an Oppo 981, as SD DVD is still going to be the main way I watch movies for quite some time.
Quidam67 is offline  