Alright, I'll try to clear up and clarify some things I said. I think some of them were either phrased poorly or taken the wrong way, probably because of how they were written, so I'll cover as much as I can here.
Originally Posted by matrixj3
Right on! Heck i have been calibrating tv's for about 25 years and i quit using the calibration dvd's light meters etc. years ago...i just do it by feel and use 3 blurays and i calibrate audio by feel also. It's like cooking, if you have to keep measuring with meters and graphs then the feel and warmth is not there. I always get the best compliments on my calibrations because i ask my friends "what they like". Why calibrate a flat image if they spent $3000 on an HDTV when i can give them a VERY colorful image that makes Peyton Manning look like he is about to break through their tv screen and is razor sharp to the point they were glad they did not get a 3D set!
That's not calibrating. Calibrating implies that you have a standard you are trying to hit, and you're configuring the display to hit that standard. The goal with calibrating a display, and having a Blu-ray player set up correctly, is to replicate the content on the disc as accurately as possible.
Now, I'm not saying that should be what everyone wants. If you want something that has more color, even if it's not as accurate, or that is brighter, then do whatever you want. Really, the point is to enjoy the experience, but don't use the term "calibration" when what you're really doing is "tweaking" the controls to what you personally like. Personal preference goes out the door with a calibration; it's as close to the standard as you can get.
Originally Posted by khollister
I think you may be misinterpreting what I said. I am not necessarily advocating intentionally dialing in inaccuracies to achieve a "pleasing" picture or sound - I am advocating that there can be visual or audible differences in components that measure the same. The question (which Chris and I obviously disagree on) is whether the "better" picture is actually the accurate one or not.
My real point is that with a player like the 790, when you're using the internal controls to move the player off of reference output, saying that the picture is "better" than the Oppo is purely subjective, with nothing objective to back it up. The other issue is that almost everything the 790's settings do, you can probably do yourself in your display's controls, for the same benefit but with fewer tradeoffs.
Using the controls that remap the gamma causes a large drop in dynamic range. You can use the gamma control available in most current displays to do the same thing without losing dynamic range, so you get the benefit you were after, but without the tradeoff. Is your gamma now a reference 2.2? No, but if you prefer the picture, at least it looks better now than it did before.
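To make that concrete, here's a rough sketch in Python of one way the loss can happen. This isn't any specific player's actual processing, and the gamma values and 16-235 range are just illustrative assumptions; the point is that once a gamma remap is done on the quantized 8-bit signal and rounded back to 8-bit codes, some levels merge for good, while the same curve applied in the display's higher-precision internal processing keeps them apart.

```python
def remap(level, in_black, in_white, out_steps, source_gamma=2.2, target_gamma=2.4):
    """Remap one video level through a gamma change, quantizing onto out_steps steps."""
    v = (level - in_black) / (in_white - in_black)   # normalize to 0.0-1.0
    v = (v ** source_gamma) ** (1.0 / target_gamma)  # decode, then re-encode with new gamma
    return round(v * out_steps)                      # quantize to the working precision

levels_in = range(16, 236)                           # the 220 legal 8-bit video levels

# Player remaps on the 8-bit signal: results land back on 8-bit codes.
out_8bit = {remap(y, 16, 235, 219) for y in levels_in}

# Display applies the same curve in higher-precision internal processing.
out_10bit = {remap(y, 16, 235, 219 * 4) for y in levels_in}

print(len(out_8bit), "distinct levels survive an 8-bit remap")
print(len(out_10bit), "distinct levels survive a 10-bit remap")
```

The exact numbers depend entirely on the made-up curve here; the point is just that remapping the already-quantized signal in the player throws levels away permanently, while the display's internal processing (usually 10-bit or better) doesn't have to.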
The same goes for most of the sharpness and other controls. You can do all of those in your display just as easily and get the same benefit. The main advantage of doing them in the display is that, since almost all displays have multiple picture mode memories, you can set up one for 3D, one for Blu-ray, and one for DVD if you want, and adjust them based on content (perhaps more sharpness and noise reduction on DVD, higher gamma on 3D, etc...). With the player, you almost always have to remember to go in and adjust them yourself for each movie, and forgetting to will lead to a worse picture.
Really, do what you enjoy, but when there are alternatives that can do the same thing or better, they're sometimes the better way to go.
An example is color measurement. There are known display devices where the perceived color by a trained observer does not match what a spectroradiometer measures. Does this mean our color perception is wrong? No - it means the measurement science is imperfect because assumptions were made about spectral distributions and the human reaction to them.
The other way to look at this is "The human eye is incredibly adaptive and responds to shifts in color and white balance more easily, whereas an instrument doesn't have that adaptability and can only see the information presented to it."
Originally Posted by matrixj3
There are TONS of capacitors diodes etc. in blu ray players, AV amps and HDTV's...you think they are all going to look and sound the same in the end?...that means that all of the "electronic guts" in the BD player, AV amp and HDTV have to perfom at 100% to be identical in picture quality and sound quality?
The problem is confusing how analog stereo works with how HDMI works. Analog stereo sends a waveform through all those capacitors and cables and traces, and they all add some sort of change or distortion to it. It might be amazingly small, but it's there. HDMI is a packet-based digital output, so if it works, you can send the same data through 1,000 components and it won't change at all, and will always come out exactly the same.
HDMI operates more like Ethernet and TCP/IP do. The data is broken down into packets, encoded with error checking, and sent. If something goes wrong, you get a huge, visible error on screen (usually bright white blocks, or no image at all), not a barely noticeable shift ("Micro-Contrast," as someone put it). It's like sending an email from two different computers: the internals are completely different, the OS can even be different, but the packets of data are sent in such a way that if you inspect them from each, you will see that they are identical, and the exact same thing arrives at the destination.
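Here's a loose analogy in Python (obviously not HDMI's actual signaling, just the "bits are bits" point): the same decoded frame data is identical no matter which box produced it, and corruption shows up as an outright, detectable error rather than a gentle shift in character.

```python
import hashlib

# Stand-in for one decoded video frame's worth of data.
frame = bytes(range(256)) * 4096

player_a_output = bytes(frame)   # "player A" sends these bytes
player_b_output = bytes(frame)   # "player B" sends the same bytes

# True: bit-for-bit identical, regardless of what hardware produced them.
print(hashlib.sha256(player_a_output).hexdigest() ==
      hashlib.sha256(player_b_output).hexdigest())

corrupted = bytearray(frame)
corrupted[0] ^= 0x01             # flip a single bit "in transit"

# False: the data no longer matches, and it's detectable as an error,
# not as a subtly "softer" or "warmer" version of the same picture.
print(hashlib.sha256(bytes(corrupted)).hexdigest() ==
      hashlib.sha256(player_a_output).hexdigest())
```

That's the sense in which two working players sending the same data can't have a "punchier" version of it.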
Comparing HDMI to previous technologies, in either video or audio, is futile, as it operates totally differently. It works or it doesn't, and the nature of it means you aren't going to get subtle, subjective differences out of it. A TV will get the exact same data from two correctly working players, and if nothing else is different (colorspaces, cables, what they are going through, etc...), the output will be exactly the same. If it's not, the display is broken; that's it.
Originally Posted by matrixj3
All tv's look different...all speakers sound different...all amps sound different...and yea..players do too. Like i posted before i had mentioned i had returned a Sony player a while back because it was not as bright as my other 3 players...just to make sure, i got the same results when watching on my 60 inch led and 50 inch plasma which are also in my home theater. I tested it out with my switcher and directly to the other sets just to make sure it was the bd player and not my amp or tv's. So either it could be a quality control issue with that unit or that model was just not up to par
Some prior Sony firmwares, and even the current one, have an issue where the Luma (Y component) output is set lower than reference, which leads to a dimmer picture. This is something we can test and verify, and it happens because that player is working incorrectly. If it were working correctly, the output would be identical. The fact that players can do this is why we test it.
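As a purely illustrative example (the numbers here are made up, not Sony's actual error), this is the kind of thing a measurement catches: in 8-bit video levels, reference black is 16 and reference white is 235, so a player that scales luma below reference puts out a measurably dimmer white.

```python
REF_BLACK, REF_WHITE = 16, 235             # 8-bit video levels

def output_luma(y, scale=1.0):
    """What a player sends for an input luma code, given a hypothetical internal scale."""
    return round(REF_BLACK + (y - REF_BLACK) * scale)

print(output_luma(REF_WHITE))              # correct player: 235, reference white
print(output_luma(REF_WHITE, scale=0.95))  # faulty player: ~224, visibly and measurably dimmer
```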
Originally Posted by khollister
The problem is that most of us do not have a file that was literally the master for a particular blu-ray that we can do a direct comparison with (both by eye and by measurement). The "measurement clan" is, I believe, basing their declaration of reference on standard test patterns that have known content with which to make a comparison with the data stream coming off the BD. The potential problem with that is that actual movie content is far more complex and dynamic, and we actually do not know for sure that there couldn't be other factors in play that might alter the perfect transfer function.
The reference material and the Blu-ray content are the same. They're both 1080p24 or 1080p60, both 4:2:0 encoded. There is no difference between playing one back and playing the other. Doing the conversion from 4:2:0 to 4:2:2, 4:4:4, or RGB is math. If the player does the math right with the test pattern, it will do it right every single time. There are only so many possible values that can be sent over HDMI, and we test every single one of them. Saying that a movie is different from a test pattern in this case is like saying, "If I do 1+1 on my calculator it will be 2, but if I do it really, really fast, it might be something else."
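Here's a sketch of one piece of that math: converting a single BT.709 YCbCr pixel (8-bit, limited range) to RGB. The same input codes produce the same output codes every single time, whether they came from a test pattern or a movie frame; the player doesn't know or care which.

```python
def ycbcr709_to_rgb(y, cb, cr):
    """One pixel: 8-bit limited-range BT.709 YCbCr in, 8-bit full-range RGB out."""
    ey = (y - 16) / 219.0                 # normalized luma, 0.0-1.0
    ecb = (cb - 128) / 224.0              # normalized color-difference signals
    ecr = (cr - 128) / 224.0
    r = ey + 1.5748 * ecr                 # BT.709 matrix coefficients
    g = ey - 0.1873 * ecb - 0.4681 * ecr
    b = ey + 1.8556 * ecb
    clamp = lambda v: min(max(v, 0.0), 1.0)
    return tuple(round(255 * clamp(c)) for c in (r, g, b))

print(ycbcr709_to_rgb(235, 128, 128))     # reference white -> (255, 255, 255)
print(ycbcr709_to_rgb(63, 102, 240))      # a saturated red -> the same fixed value, every run
```

Run it twice, run it on a different machine, run it on a million frames in a row: the answer doesn't drift.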
Originally Posted by khollister
I see this as similar to the fallacy that measuring audio equipment with static sine waves or impulse functions seems to have little to do with how it behaves with actual source material.
Analog audio and video have nothing to do with HDMI, as I related earlier. We know how HDMI behaves in its current version. If content starts using Deep Color, or higher frame rates, or different color spaces, then we'll adapt and find new ways to test it. As it is, if you have two players with the same reference output over HDMI, same colorspace, and same content, they will look identical on a screen. If they don't, it's time to look at something else in the chain and see what is wrong, or find some objective proof to back it up.
Test patterns with audio also have their purpose. You can see that a device is adding distortion of a certain type, or has a different response curve, or has poor channel separation compared to something else. Does that mean some people can't like that distortion, or like boosted bass? No. Audio also doesn't have the same content standards that video does, so with video we can more easily say "This is what reference is, and anything else is a deviation from it" than we can with audio.
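For example, here's a minimal sketch of the kind of thing an audio test signal shows you. The "device" here is simulated and the 1% second harmonic is a made-up number: feed a pure 1 kHz tone through it and the measurement reports the distortion it adds, regardless of whether anyone happens to like how that distortion sounds.

```python
import numpy as np

fs = 48000                                    # sample rate in Hz
t = np.arange(fs) / fs                        # one second of samples
tone = np.sin(2 * np.pi * 1000 * t)           # the 1 kHz test signal

# Hypothetical device under test: passes the tone but adds 1% 2nd harmonic.
output = tone + 0.01 * np.sin(2 * np.pi * 2000 * t)

spectrum = np.abs(np.fft.rfft(output)) / (fs / 2)       # bins are 1 Hz apart here
fundamental = spectrum[1000]                            # level of the 1 kHz tone
harmonics = np.sqrt(np.sum(spectrum[2000::1000] ** 2))  # levels at 2 kHz, 3 kHz, ...
print(f"THD: {100 * harmonics / fundamental:.2f}%")     # ~1.00%
```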