Originally Posted by lyris
A few points to make on what you said.
Even if a standard doesn't explicitly mention sharpening controls or other "enhancements", there's also no standard saying that auto-contrast systems, or any of a multitude of other picture-tinkering features, shouldn't be used. In setting our displays up to reproduce the source as accurately as possible, it's implicit that no such processing goes on.
The exception is processing that has been designed to address or compensate for a flaw somewhere in the video chain. For example, Panasonic's BD players include a chroma upsampling feature to work around the 4:2:0 subsampling limitation of BD. I'm not against that at all, because the processing has been designed to bring the image closer to that of the studio master (although some artefacts do result).
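For anyone curious what that kind of reconstruction actually involves, here's a minimal Python/numpy sketch of a generic 4:2:0-to-4:4:4 chroma upsample using bilinear interpolation. This is purely illustrative - Panasonic's actual filter is proprietary and certainly more sophisticated:

```python
import numpy as np

def upsample_chroma_bilinear(chroma: np.ndarray) -> np.ndarray:
    """Double a 4:2:0 chroma plane in both dimensions by bilinear
    interpolation, reconstructing the samples the format discarded.
    Real players use more sophisticated (and proprietary) filters;
    this only illustrates the principle."""
    h, w = chroma.shape
    out = np.zeros((h * 2, w * 2), dtype=np.float64)
    out[0::2, 0::2] = chroma                 # known samples on the even grid
    # Fill the missing columns on the sample rows...
    out[0::2, 1:-1:2] = (out[0::2, 0:-2:2] + out[0::2, 2::2]) / 2
    out[0::2, -1] = out[0::2, -2]            # replicate the right edge
    # ...then the missing rows, from the rows just completed.
    out[1:-1:2, :] = (out[0:-2:2, :] + out[2::2, :]) / 2
    out[-1, :] = out[-2, :]                  # replicate the bottom edge
    return out
```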
Point #2: actually, there is a standard that addresses this: European Broadcasting Union Tech 3320, "User requirements for Video Monitors in Television Production" (http://tech.ebu.ch/docs/tech/tech3320.pdf) - emphasis mine. Notice also how they put 'enhancements' in quotes.
Of course, it's one thing to refer to spec papers that define studio monitors in a monitoring environment; it's another to understand the video system as a whole. I gave the example of reconstructing missing chroma details earlier.
But BD has no such resolution limitation when it comes to Luminance, which is what this box's processing seems to affect most. Luminance is not subsampled, and although it's transformed and quantized, on BD that's done at levels which don't affect the "sharpness" of the video (unless the encode is done really badly at low bit rates). Quite simply, unless you have a projector with an incredibly terrible lens - in which case a box like this could make the picture subjectively better - there is nothing that needs fixing.
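To put numbers on that, here's a trivial back-of-the-envelope check in Python (nothing assumed beyond the standard 1080p 4:2:0 frame layout):

```python
# Sample counts for one 1080p frame stored as 4:2:0, as on Blu-ray.
luma_samples = 1920 * 1080                  # 2,073,600 - full resolution
chroma_samples = (1920 // 2) * (1080 // 2)  # 518,400 per chroma plane

# Each chroma plane carries only a quarter of the spatial samples, so
# upsampling it genuinely reconstructs information the format discarded.
# Luma is already at full panel resolution - there is nothing missing
# for a "sharpening" box to restore.
print(luma_samples // chroma_samples)       # -> 4
```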
There are reasonable grounds to criticise (not attack!) this sort of processing from the point of view of video purism.
If the DoP or colorist had wanted to add unsharp mask-like emphasis on the foreground, he would have done so during grading. (DaVinci Resolve's own Sharpness control, when taken to extremes, looks pretty much the same as the examples posted on the Darbee Vision site).
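For reference, the classic unsharp mask is simply output = input + amount * (input - blurred(input)). Here's a minimal numpy/scipy sketch of that generic operation - illustrative only, and not a claim about how Resolve implements its Sharpness control:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(luma: np.ndarray, radius: float = 2.0,
                 amount: float = 0.6) -> np.ndarray:
    """Boost the difference between the image and a blurred copy of
    itself. Pushed to extremes (large amount), this is what produces
    the ringing/halo look around edges."""
    blurred = gaussian_filter(luma.astype(np.float64), sigma=radius)
    return np.clip(luma + amount * (luma - blurred), 0.0, 255.0)
```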
Again, there is absolutely nothing wrong with the Luminance resolution of current video on Blu-ray. There is nothing that needs fixing.
Secondly, you say that BD "could be improved". Well, yes, anything could be improved - I'll be happy to see 4K BD, for example. The difference is that that's a proportional improvement: with a higher-resolution display system, the entire image is sharper. What the Darbee Vision processing is doing is localized enhancement, and that's getting into the territory of infringing on the art, rather than the technicalities.
To put it another way: on the Blu-ray, and even DVD, projects that I work on, I work with the director or DoP (or someone acting on their behalf) to get the image that goes to disc just right. Part of calibration is to avoid adding distortions to the image, which is why I shut off NR controls and sharpening systems when I calibrate displays (although some sharpening, depending on the implementation quality, can be good for SD - which again goes back to what I'm saying about understanding why recommendations are in place, not just blindly following them).
Lastly, I'd point you to the Imaging Science Foundation ethos: we calibrate displays so that we get "the whole picture, right picture, and nothing but the picture". Adding ringing, halos and/or localized contrast enhancement that were not part of the original grade violates two of those principles.
This device is not conducive to accurate video, because it addresses a "problem" which does not exist with Blu-ray Disc on a 1080p display. If people like the distortions it's adding, that's cool, but please don't pretend that they're somehow enhancing accuracy. The description of the process clearly states that it's adding localized processing (I'm assuming it's frequency-based, to avoid sharpening out-of-focus backgrounds, etc.) which is not part of the original signal. Granted, it doesn't affect grayscale, gamma or color reproduction when measured with test patterns (although you can bet that if it's turned up to extremes, it'll absolutely affect gamma and color in the areas it's processing), but that doesn't mean it's not modifying the intent of the content producers.
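It's easy to see why test patterns don't catch this sort of thing: a flat window pattern contains no detail for an edge-driven sharpener to act on, so measurements sail through unchanged, while real picture content is altered. A quick demonstration using a generic unsharp mask as a stand-in (Darbee's actual algorithm is proprietary, so treat this as illustrative only):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def sharpen(img: np.ndarray, sigma: float = 2.0, amount: float = 0.6) -> np.ndarray:
    # Generic edge-driven sharpener (unsharp mask), standing in for
    # the kind of local enhancement under discussion.
    return img + amount * (img - gaussian_filter(img, sigma))

# A flat 50%-grey field has no edges, so it passes through untouched:
# grayscale, gamma and colour measurements all read as unmodified.
flat = np.full((200, 200), 128.0)
assert np.allclose(sharpen(flat), flat)

# A step edge, however, picks up overshoot and undershoot (halos)
# that were never part of the graded image.
edge = np.zeros((200, 200))
edge[:, 100:] = 200.0
out = sharpen(edge)
print(out.max() > 200.0, out.min() < 0.0)  # -> True True
```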
In reality, most of what we're debating is probably the principle more than anything. I've been told that the frankly unpleasant processing examples on the Darbee web site show the processing turned up to extremes. I would absolutely NOT want anything I work on to be viewed with processing like that. At very gentle settings it's probably not doing much harm, but again, like I said, there's nothing wrong with the resolution of BD as it is.