Originally Posted by *UFO*
I have seen the JS8500 in person .......... they can only do 4:4:4 in PC mode
Minor typo, probably. From what I have read, one of the limitations of most (all?) current 4K LCD TVs is their inability
to do RGB 4:4:4 chroma on their HDMI input ports (for example, when in game mode). In contrast, many of the good current-model 1080p LCD TVs (like our W900) can do 4:4:4 RGB chroma when the HDMI port is set to game mode. Being able to do 4:4:4 chroma mainly matters when using the TV as a PC monitor or with an HTPC (in order to bypass the TV's built-in processing modes).
For the average consumer this might not matter, because we normally get a YCbCr 4:2:0 video signal from standalone DVD/Blu-ray players. Also, since you still can't fit a full-length 4K movie on a standard double-layer Blu-ray disc (and for this reason there are no 4K Blu-ray players yet), early adopters of the 4K standard will be dependent on over-the-air or cable content for at least another year or two, I suspect.
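To put rough numbers on that difference, here is a back-of-the-envelope sketch (illustrative only: it counts raw samples per frame and ignores bit depth and compression) of why a 4:2:0 signal carries half the data of 4:4:4:

```python
# Rough comparison of chroma subsampling schemes for a 1920x1080 frame.
# Illustrative sketch only; real signals add bit depth and compression on top.

def samples_per_frame(width, height, scheme):
    """Total samples (luma + 2 chroma planes) per frame for a subsampling scheme."""
    luma = width * height
    if scheme == "4:4:4":            # full-resolution chroma (what a PC desktop wants)
        chroma = 2 * width * height
    elif scheme == "4:2:2":          # chroma halved horizontally
        chroma = 2 * (width // 2) * height
    elif scheme == "4:2:0":          # chroma halved both ways (DVD/Blu-ray sources)
        chroma = 2 * (width // 2) * (height // 2)
    else:
        raise ValueError(scheme)
    return luma + chroma

full = samples_per_frame(1920, 1080, "4:4:4")
disc = samples_per_frame(1920, 1080, "4:2:0")
print(full)          # 6220800 samples per frame
print(disc)          # 3110400 samples per frame
print(full / disc)   # 2.0 -> 4:2:0 is half the raw data of 4:4:4
```

So for movie playback the TV loses nothing by taking 4:2:0, but fine coloured text from a PC (rendered at full chroma resolution) visibly suffers without 4:4:4.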
Originally Posted by *UFO*
Another thing is the w900a has a native 240hz panel, while all current 4K panels are only 120hz panels.
I still find the high refresh-rate claims (90, 100, 120, 240 Hz, etc.) for flatscreen televisions rather confusing, partly because of the often deliberately misleading claims from various manufacturers. From what I can make out, LCD TVs still basically receive a 60 Hz signal (linked to the old CRT standard tied to the 110-volt/60 Hz electrical system, or 50 Hz in Europe with 220 volts). Anything displayed above 60 Hz is produced either by interpolation (inserting newly created frames that don't exist in the original source material) and/or by some backlight trickery (like flashing a black frame in between existing video frames).
As a consumer tech reviewer in a PC magazine recently put it:
Bottom line: Refresh rate is how often the TV shows a new image. Anything above 60Hz is entirely the invention of the TV itself. All modern video is either 24 frames per second (movies and most TV shows), 60 fields per second (1080i video), or 60 frames per second (720p video). Higher refresh rates are simply used to increase apparent motion resolution (using "newly created" video frames which were not present in the original source material).
Note: the 600 Hz claim for plasmas is largely marketing hype; it is technically correct, but it refers to something different from how an LCD works.
The purpose of all this added processing is to reduce the visible effects of the low frame rate (perceived as judder or blurring) in a PAL (25 frames/sec) or NTSC (30 frames/sec) video signal, or the flicker a display would show at refresh rates below roughly 100 Hz. As a result, with LCD display technology the individual pixels might need to switch 120 or 240 times per second (i.e. "Hz"), and the original video frames in the source material are padded with newly created frames (which still can't produce a perfectly smooth image for moving content).
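As a small illustration of why the panel's refresh number matters for 24 fps film (my own sketch, which simply divides refresh ticks among source frames; a real TV may interpolate instead of repeating frames):

```python
# How many refresh ticks each source frame occupies on a fixed-refresh panel.
# Sketch only: real TVs often interpolate new frames rather than repeat old ones.
import math

def repeat_pattern(source_fps, refresh_hz, n_frames=4):
    """Distribute refresh ticks over source frames (reveals pulldown cadence)."""
    pattern, shown = [], 0
    for i in range(1, n_frames + 1):
        ticks = math.floor(i * refresh_hz / source_fps) - shown
        pattern.append(ticks)
        shown += ticks
    return pattern

print(repeat_pattern(24, 60))    # [2, 3, 2, 3] -> uneven cadence = pulldown judder
print(repeat_pattern(24, 120))   # [5, 5, 5, 5] -> even cadence, no cadence judder
print(repeat_pattern(24, 240))   # [10, 10, 10, 10]
```

The uneven 2:3 cadence at 60 Hz is exactly the judder the interpolation/backlight tricks are trying to hide; at 120 or 240 Hz every film frame gets an equal number of ticks.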
It is on faster-moving video, for example sports or a shot panning across a landscape, that the better interpolation methods matter most for removing judder/blurring. For 1080p LCD TVs, the various existing methods from the better TV makers are now pretty good at adding these new frames (Sony's Motionflow is particularly good, but you need to select the appropriate setting). However, for 4K LCD TVs most manufacturers in 2015 still face technological limitations with panel refresh rates higher than 120 Hz (as UFO indicated in his post).
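To show where those "newly created" frames come from, here is the crudest possible interpolation: a plain average of two neighbouring frames. This is a toy sketch only; real motion interpolation like Motionflow is motion-compensated and far more sophisticated.

```python
# Toy frame interpolation: build an in-between frame as a plain average of its
# two neighbours. Frames are flat lists of pixel intensities. Real TV
# interpolation tracks motion; naive blending like this causes ghosting.

def blend(frame_a, frame_b, weight=0.5):
    """Weighted average of two frames, pixel by pixel."""
    return [a * (1 - weight) + b * weight for a, b in zip(frame_a, frame_b)]

def double_frame_rate(frames):
    """e.g. 24 fps -> 48 fps by inserting one blended frame between each pair."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(blend(a, b))
    out.append(frames[-1])
    return out

clip = [[0, 0], [10, 20], [20, 40]]   # three tiny 2-pixel "frames"
print(double_frame_rate(clip))
# [[0, 0], [5.0, 10.0], [10, 20], [15.0, 30.0], [20, 40]]
```

Every second frame in the output never existed in the source, which is exactly the point the reviewer quoted above was making.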
Similar limitations also still exist for PC LCD monitors. The manufacturers that have recently claimed 100 or 120 Hz are using TN panels, which are only 6-bit color (often used in cheap "gaming monitors" that claim super-low latencies). The better-quality 8-bit color panels (either MVA/PVA or IPS) still can't run higher than 60 or 70 Hz.
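The 6-bit vs 8-bit gap is easy to quantify, at least before the dithering tricks 6-bit panels use to fake extra shades:

```python
# Colours a panel can natively address per pixel, given bits per colour channel.
# Ignores the dithering/FRC tricks 6-bit TN panels use to simulate more shades.

def addressable_colours(bits_per_channel):
    return (2 ** bits_per_channel) ** 3   # R x G x B combinations

print(addressable_colours(6))   # 262144  (native 6-bit TN)
print(addressable_colours(8))   # 16777216 (the familiar "16.7 million colours")
```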