Quote:
Originally Posted by zack8322 /forum/post/16895767
It's the crappy AMP that LCDs need to have smooth motion. You can tame it, but you need it due to LCD's nature.
Quote:
Originally Posted by Nielo TM /forum/post/16895936
Low-level MCFI is a gimmick, and has been since 2002.
High-level MCFI is what's needed to reduce hold time.
Quote:
Originally Posted by jimfitz /forum/post/16895451
Was looking into purchasing a Samsung LED TV and saw it hooked up via Blu-ray, and thought it looked fake, or too good. Anyone ever notice this? Is there such a thing as too clear?
Quote:
Originally Posted by Gary McCoy /forum/post/16902264
The latest Samsung LCDs are brighter and have more saturated colors than any plasma. That, plus the semi-gloss screen finish, is what makes these products usable in a very brightly lit room. The appearance in a brightly lit showroom is what makes LCDs more popular with purchasers than plasmas.
As for frame interpolation, it really doesn't matter if you like it or not. The feature is available on many brands of LCDs and on Pioneer and Panasonic plasmas. But even way back in 2007 when I bought my 120Hz set, the options for setting AMP (the Samsung version of frame interpolation) were High/Medium/Low/Off. Newer sets have even more settings, but "Off" is always an option.
It's perfectly OK to prefer LCD or plasma technology in most viewing environments. But in a B&M store, with those merciless high-intensity overhead lights, plasma just doesn't look good in comparison to LCD.
But you are right about one thing - LCD sets with frame interpolation can actually look clearer than the original source video. The modern MCFI chipsets use advanced algorithms that can replace the blurred trailing edges of fast-moving objects with the same (clear) area of an adjacent frame. The effect works best when you have a clean video source at 1080p24, such as a Blu-ray. After MCFI processing, the 1080p120 or 1080p240 display can have less blur than the original camera footage. That does take some getting used to, and if misadjusted, MCFI can add artifacts to moving objects. The idea is to keep the MCFI setting just below the artifact threshold - but where exactly that setting should be depends upon the original photography, including film speed and exposure settings. The correct setting for one movie might be too high or too low for the next.
Traditionalists may not like it, but my personal tastes run to bright saturated colors and minimal blur. I don't think I'll ever go back to (or even admire) any display that lacks effective MCFI processing.
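The interpolation Gary describes can be sketched in miniature. The toy below is a hypothetical illustration in Python/NumPy, not Samsung's actual AMP pipeline: it estimates a motion vector per block by exhaustive matching between two frames, then synthesizes an in-between frame by moving each block half of its vector. Real MCFI chipsets add occlusion handling, sub-pixel vectors, and blending, which this sketch deliberately omits.

```python
import numpy as np

def estimate_motion(prev, nxt, block=8, search=4):
    """Exhaustive block matching: for each block in `prev`, find the
    offset (dy, dx) in `nxt` minimizing the sum of absolute differences."""
    h, w = prev.shape
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = prev[by:by + block, bx:bx + block].astype(int)
            best, best_sad = (0, 0), np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        cand = nxt[y:y + block, x:x + block].astype(int)
                        sad = np.abs(ref - cand).sum()
                        if sad < best_sad:
                            best_sad, best = sad, (dy, dx)
            vectors[(by, bx)] = best
    return vectors

def interpolate_midframe(prev, nxt, block=8, search=4):
    """Synthesize the frame halfway between prev and nxt by shifting each
    block half of its estimated motion vector (motion compensation).
    Uncovered regions are left black - a source of the artifacts the
    thread mentions when MCFI is pushed too hard."""
    h, w = prev.shape
    mid = np.zeros_like(prev)
    for (by, bx), (dy, dx) in estimate_motion(prev, nxt, block, search).items():
        hy, hx = by + dy // 2, bx + dx // 2
        if 0 <= hy <= h - block and 0 <= hx <= w - block:
            mid[hy:hy + block, hx:hx + block] = prev[by:by + block, bx:bx + block]
    return mid
```

Feeding it two frames of a bright square moving 4 pixels to the right produces a middle frame with the square shifted 2 pixels - the "extra" frame a 120Hz set would display between the 24p originals.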
Quote:
Originally Posted by aydu /forum/post/16902416
+1
Why buy a super monitor to watch poor looking content?
Personally, I don't care if the director of a film intended it to look bad, was drunk during the post production, or just didn't care about the appearance of his/her work. I want pq that enables me to engage with the storyline and not be distracted by poor looking video.
Quote:
Originally Posted by Nielo TM /forum/post/16900837
No, sorry
In the UK, all the A-series sets had low- and high-level MCFI unified under a single option, so the user had to manually enable it for high-motion content.
But that's all changed in the B series (kudos to Samsung). Now the user has control over Blur Reduction (high-level MCFI) and Judder reduction (low-level MCFI).