Originally Posted by joerod
TheBland, You may have a 14 foot screen but the question I have is your display Deep Color or x.v. Color friendly? That is one key feature that gives the 51FD the edge. Without it it should at least be a draw...
Again, I question your enthusiasm, as there is nothing to back it up. I am as much of a videophile as anyone... but you can't ignore the facts about x.v.Color and Deep Color. There is no edge with the '51. Folks here should know this before purchasing.
Since you don't believe me, here is a snippet from Audioholics:
Currently, Hollywood films are telecined directly to digital, with masters stored on D5 tape in 10-bit 4:2:2 format. While this is better than the 8-bit 4:2:0 present on current media, it's still not 12- or 16-bit Deep Color or even utilizing the xvYCC color space. Without mastering and the ability to store xvYCC on source material (other than games which are generated via PC video cards) it seems that xvYCC is largely a marketing gimmick, save the new lines of camcorders, etc. which boast 10-bit recording and xvYCC support. Somehow, eliminating the occasional color banding in home movies isn't exactly the incredible leap in technology for which most of us were hoping.
'x.v.Color,' a branding methodology (from Sony) with associated logo whose purpose is to provide consumers with a simplified means of identifying xvYCC-compliant displays in anticipation of retail availability later this year.
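For context on the formats named in that quote, the per-pixel storage cost works out as follows. This is my own back-of-the-envelope arithmetic (using the standard chroma-subsampling sample counts), not anything from the Audioholics piece:

```python
# Bits per pixel for the formats named above, using the standard
# samples-per-pixel counts: 4:4:4 = 3, 4:2:2 = 2, 4:2:0 = 1.5.
def bits_per_pixel(bit_depth, chroma):
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return bit_depth * samples

consumer_media = bits_per_pixel(8, "4:2:0")    # 12.0 bpp (DVD / Blu-ray encodes)
d5_master      = bits_per_pixel(10, "4:2:2")   # 20.0 bpp (D5 studio masters)
deep_color_12  = bits_per_pixel(12, "4:4:4")   # 36.0 bpp (12-bit Deep Color)
```

So the studio masters already carry less data per pixel than 12-bit Deep Color would require, which is the quote's point: the extra depth has nowhere to come from.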
Jim Peterson (head of Lumagen), on the private Lumagen forum, echoed this view as well: Deep Color is pure hype and marketing. He is the video-processor guru in the industry. Here is a snippet of his comments on Deep Color:
Deep color is pretty much all hype and no substance - at least for now and the foreseeable future.
Consumer sources are 8-bits --- then they are compressed. Even if a blu-ray movie master started with "more bits" the compression would reduce it to the same quality as if they started with 8-bits. There are only so many bits to go around even on a bluray disc. I looked at the bit-rate on one movie (Pirates - Dead Man's Chest) and the bit-rate was around 22 Mbits/second. Better than DVDs, which run 5 to 6 Mbits/sec, and the compression is more advanced. Still, if you wanted full-fidelity at 8-bits you should see more like 35 Mbps. Full-fidelity at, say, 12-bits would be about 8X to 16X this -- bluray is not currently capable of this. Granted you can make improvements beyond 8-bit in the 50 to 70 Mbps range, but this would be subtle, perhaps gaining at most 1 LSB of precision (i.e. 9-bit), and certainly could be handled with HDMI 1.1's 4:2:2 format.
Also, the compression is 4:2:0. That is, 1/4 resolution Chroma. 4:2:2 at 12-bit format is available with HDMI 1.1 and is already beyond what can be effectively used. Independent of what is claimed by manufacturers, the actual signal-to-noise of the current batch of TVs and projectors is probably around 10-bits. So, we generally recommend our customers use our 4:2:2 12-bit output mode, but with dither set to 10-bits, since this tends to best match the capability of the displays. Using a 4:4:4 12-bit mode would buy you nothing for consumer sources --- and it will cost you money.
So, you see, what you may be seeing is the excitement factor of a new player with an unknown menu feature, or an aberration in the output. Personally, I wouldn't tout the x.v.Color display support and 'Deep Color' as an improvement in image quality (or as what sets this player apart). Many cheaper players offer these, as well as DTS-HD MA decoding. This is the truth, as I see it, about the 51. It will likely be an excellent player, but let's not make it more than it really is....