Found this on the CNET site about Sammy plasmas. Same story.
Film Mode is primarily for 480i/p input sources, but even more so for 480i inputs. What it does is correct the interpolation done by the 3:2 pulldown function of the picture.
Not to say that it couldn't be used to process other pictures, but how much Film Mode improves the picture depends on the way the DVD was encoded.
In layman's terms, it is supposed to improve the picture.
In technical terms.....
DVDs are based on MPEG-2 encoding, which can carry either progressive or interlaced sequences. Most discs use interlaced sequences, since the players are designed for interlaced output (which was the case until upconverting DVD players arrived).
If the sequence on the disc is progressive, then all sorts of rules kick into play so that the material stays progressive from start to finish. If it's interlaced, there are fewer rules and no set requirement to use progressive frames is established within the disc. The encoder can mix and match interlaced fields and progressive frames as long as each second of MPEG-2 data contains exactly 60 fields.
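To put a rough picture on that 3:2 cadence, here's a small Python sketch (the frame letters and the function name are just made up for illustration): each film frame alternately contributes 2 or 3 fields, so 4 film frames become 10 fields and 24 frames per second become exactly 60 fields.

```python
def pulldown_32(frames):
    """Expand film frames into a 3:2 field cadence.

    Each frame alternately contributes 2 fields (top, bottom) or
    3 fields (top, bottom, top), so 24 frames yield 60 fields.
    """
    fields = []
    for i, frame in enumerate(frames):
        repeats = 2 if i % 2 == 0 else 3  # the "3:2" alternation
        for parity in ["top", "bottom", "top"][:repeats]:
            fields.append((frame, parity))
    return fields

cadence = pulldown_32(["A", "B", "C", "D"])
# 4 film frames -> 10 fields: A A B B B C C D D D
print([frame for frame, parity in cadence])
```

Notice that every 5 fields, one frame's top field shows up twice. That repeated field is the fingerprint the deinterlacer looks for, as described below.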
So to display a perfect progressive image from a film-sourced DVD, the player needs to figure out which fields in the MPEG stream go together. It would be nice if the progressive frame flag on the disc told the player that the frames come from film and belong together, but discs aren't always optimized for progressive scan playback.
Most players use a standard MPEG-2 decoder to generate digital interlaced video, then send that feed to a deinterlacing chip. If the deinterlacing chip sees a constant stream of 5-field sequences in which the first and third fields are identical, it switches to film-mode deinterlacing.
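As a rough sketch of what that detection logic looks like (the function name and field labels are hypothetical, and a real chip compares actual pixel data rather than labels):

```python
def detect_film_cadence(fields, window=5):
    """Return True if the field stream falls into groups of 5 where,
    at some phase, the first and third fields of each group match."""
    for phase in range(window):
        groups = [fields[i:i + window]
                  for i in range(phase, len(fields) - window + 1, window)]
        if groups and all(g[0] == g[2] for g in groups):
            return True
    return False

# Ten fields from 3:2 pulldown of film frames A..D
# (t = top field, b = bottom field):
film = ["At", "Ab", "Bt", "Bb", "Bt", "Ct", "Cb", "Dt", "Db", "Dt"]

# Pure interlaced video: every field is a distinct moment in time.
video = [f"f{i}" for i in range(10)]

print(detect_film_cadence(film))   # True  -> switch to film mode
print(detect_film_cadence(video))  # False -> stay in video mode
```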
Film Mode is the one area of deinterlacing that can be objectively perfect.
Film Mode in a television also lets the set process any picture input the same way regardless of source, since not all players can do that processing themselves. Some players send the signal out unprocessed, which produces a grainy picture; an overprocessed signal produces a grainy picture as well.
The most common and worst artifacting you get in Film Mode happens when the deinterlacer combines two fields that were never meant to go together. This most often happens when the 3:2 sequence is interrupted and the deinterlacer doesn't react quickly enough. When that happens, the odd-numbered lines of the image are from one moment in time and the even-numbered lines are from a different moment, causing a horrible picture.
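A quick way to see why that looks so bad: a deinterlacer builds a frame by weaving one field into the odd lines and the other into the even lines. Rough sketch (the line labels are made up; "A" and "B" stand for two different moments in time):

```python
def weave(top_field, bottom_field):
    """Interleave two fields into one frame: the top field supplies
    lines 1, 3, 5, ... and the bottom field supplies lines 2, 4, 6, ..."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

# Both fields from the same film frame: a clean progressive frame.
print(weave(["A1", "A3"], ["A2", "A4"]))  # ['A1', 'A2', 'A3', 'A4']

# Fields from two different moments: alternate lines disagree in time,
# which shows up on screen as combing/feathering on anything that moves.
print(weave(["A1", "A3"], ["B2", "B4"]))  # ['A1', 'B2', 'A3', 'B4']
```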
The good news is that it's defeatable, so if it isn't improving the picture for a progressive-scan feed, you can turn it off.
This is an issue that can happen regardless of manufacturer. In saying that, it's important to note that this isn't a Samsung issue as much as it is an industry-compatibility issue.
Samsung products give you the ability to defeat most technologies that would hamper your viewing pleasure as much as we include technologies that enhance your experience.
Hope this helps.

http://forums.cnet.com/5208-13973_10...hreadID=299056
Sooo... since the info on my TV says the source is 1080p/24p and the motion is smooth, that's good enough for me. Meaning, it's doing what it's supposed to do. So in that regard, I'm content. I wasn't sure myself and was afraid I was missing out on something that would "improve" picture quality.