Originally Posted by praz
Any of these discussions involving a computer as the source is really a waste of time. Most do not understand the interactions and limitations of computer components when outputting a proper video level signal. And even if the signal is properly sent, there is always the risk that some future firmware, driver, or software update will break it.
Are you serious? Video cards in PCs are more powerful and capable than the ones in BD players, Roku and similar boxes, or the TVs themselves. Are you seriously saying that even gaming consoles cannot output a proper video level signal? Consoles were designed for HDTVs: they use the same HDMI connections as PCs and BD players, and they use slightly modified versions of the same GPUs from nVidia and AMD. They do an incredible job of outputting proper limited-range (16-235) and full-range (0-255) video levels, yet they are nowhere near as powerful as most high-end PC video cards. If anything, PC video cards are designed to output full RGB 4:4:4 at 0-255 over DVI and either 0-255 or 16-235 over HDMI. Some cards, like nVidia's, need a registry tweak before they will consistently send 0-255 to devices they detect as HDMI displays.
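For anyone unclear on what limited vs. full range actually means numerically, here is a minimal Python sketch of the standard 8-bit conversion between the two. The function names are mine, but the 219/255 scaling is the standard video-levels math (black 16 maps to 0, white 235 maps to 255):

```python
import numpy as np

def limited_to_full(levels_8bit: np.ndarray) -> np.ndarray:
    """Expand limited-range (16-235) values to full range (0-255).

    Standard 8-bit expansion: 16 -> 0, 235 -> 255; anything outside
    16-235 (blacker-than-black / whiter-than-white) gets clipped.
    """
    expanded = (levels_8bit.astype(np.float64) - 16.0) * 255.0 / 219.0
    return np.clip(np.rint(expanded), 0, 255).astype(np.uint8)

def full_to_limited(levels_8bit: np.ndarray) -> np.ndarray:
    """Compress full-range (0-255) values into limited range (16-235)."""
    compressed = levels_8bit.astype(np.float64) * 219.0 / 255.0 + 16.0
    return np.clip(np.rint(compressed), 16, 235).astype(np.uint8)

# Reference black and white in each convention:
print(limited_to_full(np.array([16, 235])))  # -> [  0 255]
print(full_to_limited(np.array([0, 255])))   # -> [ 16 235]
```

This is also why a levels mismatch between source and TV gives you either crushed blacks or washed-out grays: one side expands or compresses and the other does not.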
This is ridiculous, seriously ridiculous. TV manufacturers create PC or similar modes precisely because they are fully aware that 4:2:2 does not look good for PC use; otherwise they would not have made such modes! It has nothing to do with PCs being PCs, and everything to do with the fact that text and interface elements are rendered at full chroma resolution (4:4:4), which is what all monitors use. There was even a big thread in this exact section dedicated to chroma subsampling, and it provided specific example images showing the difference between 4:2:0, 4:2:2, and 4:4:4. People consistently report issues with 4:2:2 and see them resolved with 4:4:4. This whole "there should be no difference" line is pointless because there is one. Again, it's so damn stupid to rely on these theories when in practice things are different.
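To make the chroma point concrete, here is a rough Python/numpy simulation of what 4:2:2 does to pure-chroma detail such as colored text on a black background. It uses simplified full-range BT.601 math; real pipelines differ in matrix and range, but the loss mechanism is the same:

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Full-range BT.601 RGB -> YCbCr (a simplification for illustration)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128
    return np.stack([y, cb, cr], axis=-1)

def subsample_422(ycbcr):
    """Simulate 4:2:2: keep luma, halve horizontal chroma resolution."""
    out = ycbcr.copy()
    for c in (1, 2):
        ch = ycbcr[..., c]
        # Average each horizontal pixel pair, then hold that value
        # across both pixels, as a crude reconstruction.
        avg = (ch[:, 0::2] + ch[:, 1::2]) / 2.0
        out[..., c] = np.repeat(avg, 2, axis=1)
    return out

# One-pixel-wide red stripes on black: detail that lives only in chroma
img = np.zeros((4, 8, 3), dtype=np.float64)
img[:, 0::2, 0] = 255.0
before = rgb_to_ycbcr(img)
after = subsample_422(before)
print(np.abs(after - before)[0, :, 1])  # Cb error on every pixel of row 0
```

Every printed value is nonzero: the single-pixel color transitions are smeared, which is exactly the fringing people see on colored text over HDMI in 4:2:2 mode.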
Again, unless a user calibrated his/her TV using ICC profiles or video card image settings, no new driver will change a calibration performed on the TV itself through the user or service menu. No desktop PC cards from AMD and nVidia, which hold about 90% of the market between them, require any kind of firmware or BIOS update. I do not know about Intel and other integrated graphics (in laptops), but those should not be used for HTPCs anyway: they are really slow and a minority among the world's graphics adapters. Only a handful of experienced users looking to raise voltages for overclocking or to unlock features will ever mess with flashing a card's BIOS, and unless such an update specifically targets an output problem (the chances are slim to none), nothing will change for TV calibration.

This has been the case for many years now! I have installed 3 sets of drivers since I calibrated my TV, and I even wiped Windows 7 from my hard drive and installed Windows 8.1, which handles drivers differently. Guess what? My TV calibration is still fine! My ArgyllCMS 3DLUTs still produce identical results.

Now, if I switch to my integrated Intel HD4000 graphics, my RGB white balance will be off, but that is fine because it is a different source! If I were to connect a PS3 to my TV, it would also need adjustments, maybe in different areas. That is completely normal, and it is exactly why most TVs save separate WB settings for each source. The CMS tends to stay the same across sources, at least on my TV.
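For anyone wondering what a 3DLUT actually does per frame, here is a minimal sketch of how a normalized (N, N, N, 3) lookup table is sampled with trilinear interpolation, which is the usual way cube-style 3DLUT formats (including the ones ArgyllCMS can generate) are applied; the function name and the identity-LUT test are mine:

```python
import numpy as np

def apply_3dlut(rgb, lut):
    """Look up one RGB triple (values in 0..1) in a (N, N, N, 3) LUT
    using trilinear interpolation over the 8 nearest grid points."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=np.float64), 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo  # fractional distance to the next grid point, per axis

    out = np.zeros(3)
    # Blend the 8 surrounding grid points, weighted by distance
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                idx = (hi[0] if dr else lo[0],
                       hi[1] if dg else lo[1],
                       hi[2] if db else lo[2])
                out += w * lut[idx]
    return out

# Sanity check with an identity LUT: output should equal input
N = 17
grid = np.linspace(0.0, 1.0, N)
identity = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(apply_3dlut([0.25, 0.5, 0.75], identity))  # ~[0.25 0.5  0.75]
```

The key point: the LUT is just a static table applied by the video renderer, so reinstalling drivers or the OS does not change what it produces as long as the same file is loaded and the output levels stay the same.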