Originally Posted by jrp
A comment was made about changing the Lumagen's black and white levels to get them correct for the Ruby. Per our manual, this is the wrong thing to do (for the first input). What should be done is to set the black and white levels using the controls in the projector. If the projector levels aren't a match to the Lumagen, it must be fixed in the projector. Then differences between inputs can be calibrated in the Lumagen. Also, the DVI levels need to be set correctly in the Lumagen for input and output. We have actually heard of DVD players saying they use one and really using the other, which affects quality and calibration.
This is good advice, of course. A correctly calibrated projector should be the starting point.
|To set black level, use the Lumagen's contrast pattern (MENU->MISC->TPAT, and use the arrow keys to select the contrast pattern). Adjust so the 4 IRE bar is barely visible against black and the 96 IRE bar is visible against white. Then, using AVIA's needle pattern, adjust the DVD player's black and white levels in a similar fashion. Note that for black level I use AVIA's pattern of 2 and 4 IRE on a solid black background. This is because black level in projectors and displays varies a lot with average scene intensity. This difference can be exacerbated by a dynamic iris.
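For reference, those IRE figures translate to 8-bit code values in a simple way. This is my own illustration (not from the Lumagen manual), assuming the standard video-range quantization where 0 IRE sits at code 16 and 100 IRE at code 235, a 219-code span:

```python
def ire_to_code(ire: float) -> int:
    """Map an IRE level to its nearest 8-bit video-range code value,
    assuming 0 IRE = code 16 and 100 IRE = code 235 (219-code span)."""
    return round(16 + ire / 100 * 219)

print(ire_to_code(0))    # black     -> 16
print(ire_to_code(4))    # near-black bar -> 25
print(ire_to_code(96))   # near-white bar -> 226
print(ire_to_code(100))  # white     -> 235
```

So the 4 IRE bar is only about nine codes above black, which is why it is such a sensitive check of black-level setting.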
I would never use any DVD source to calibrate a video chain, with the exception of the DVD part of it. To do so makes no sense other than a rough 'rubber band' setup.
Before continuing, let me say that I am not a beginner to all this, having worked professionally in audio recording, and audio/video post production for a number of years. I have done many many calibrations with a variety of products in all parts of the audio or video chain, from source to display, and have designed/engineered several pieces of audio and video equipment (mostly analog).
In professional circles, the expectation is that any piece of equipment inserted into an existing video or audio path can be quickly set to conform to the signalling standards of that path (minor calibration issues are a separate matter). So for example, if I took my Faroudja DVP-5000 analog video processor and inserted it between a source and the projector's RGBHV input, with all its controls set to default, there would be no further calibration necessary. 0-1v in = 0-1v out, period. THEN I make changes to improve or shape the signal to my requirements. Likewise, if I set it for component in and out rather than RGBHV, it would follow those standards as well. The point is that by default, it is a neutral insert into an existing path.
For digital equipment of a similar function, the same expectation should be met. If you inserted a processing device in a digital path of a computer level signal (0-255), that device should be set to 0-255 levels input and 0-255 levels output, and be transparent assuming all other adjustments in the device are set to neutral. Likewise with video level signals (16-235) in and out, such as are typical in the home environment. And of course, when you transcode or 'map' between the two standards, the expectation is a neutral transformation.
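To make the "neutral transformation" expectation concrete, here is a minimal sketch of the standard mapping between computer levels (0-255) and video levels (16-235). The function names are my own and this is purely illustrative, not any shipping device's code:

```python
def full_to_video(c: int) -> int:
    """Map a full-range (0-255) code to the video range (16-235)."""
    return round(16 + c * 219 / 255)

def video_to_full(v: int) -> int:
    """Map a video-range (16-235) code back to full range (0-255),
    clipping out-of-range inputs such as blacker-than-black."""
    v = min(max(v, 16), 235)
    return round((v - 16) * 255 / 219)
```

Note that in 8 bits this transcode cannot be perfectly transparent in both directions (219 codes cannot represent all 256), which is one reason every conversion stage in the chain matters.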
Why, then, does the Lumagen not conform to this? It imposes an input calibration for each and every input, rather than simply letting you select video or computer input levels, leaving the door wide open for all sorts of operational miscalibrations and errors. Why is this? It makes no sense to me. Even SDI is a known digital standard with defined levels for accurate performance. Why, then, are the Lumagen's SDI settings completely off their defaults, and also different from the corrections needed to conform video-level signals? There's far too much tweaking needed for basic operation.
|Bill, while I assume you did not calibrate grayscale, did you at least set the color, hue, and hue offsets using the AVIA color patterns? It is possible the color issues you mention are simply that our out-of-box settings don't align with the Ruby. We try to have our settings be nominal for the varying requirements of the different manufacturers, but since they are all different, we can't be perfect for all of them out of the box.
You can be extremely close right out of the box, if the above conditions were met. Obviously there's going to be the occasional piece of auxiliary equipment (mostly DVD players) that doesn't follow the standards, and for those you have the range of controls to compensate. Those exceptions are not the norm, and they can be anticipated.
Yes, I have calibrated grey scale on the projector with Colorfacts software, and have noted, through level tests based on that calibration, where black and white levels should be on the projector (very close to those recommended in the Sony Ruby calibration section). I also have the numbers for the black/white level settings on the Lumagen that allow it to conform to a neutral-insert condition.
However, none of this has anything to do with the point I have been trying to get across in previous posts. And that is: simply inserting the Lumagen into an existing path (with level corrections applied) degrades the image. Period. That type of degradation cannot be corrected by calibration adjustments. This is the very first test we would always do with audio or video equipment under evaluation -- how does it change the signal just by being there. Then you decide, based on the amount and type of degradation, whether you can tolerate it compared to the attributes that device brings to the table. If it can fix a major problem in the image, then the degradation might be acceptable, but if it was just added to shift an image or something minor (for example), it probably wouldn't be.
And this is where I place the Lumagen. While it makes some really nice improvements to deinterlacing, and there is a wide variety of options to modify the image, the initial degradation from just being there is too much for me. Others have commented in agreement on that point both here and to me privately. By and large, those who see this characteristic have Rubys, Qualias, or other very high resolution projectors, usually not CRTs.
So now my question is whether this will ever be addressed in the VisionPro HDP. I think it's great that so many improvements have been made through software updates, but without improvement in the basics it's not something I'd want to keep. Would the next version, with the Realta chip for deinterlacing, be any better? That's unknown at this time. Would the 10-bit software upgrade be better? Quite possibly, if it changes the entire video processing path to 10 bits. But that's unlikely, except by accident, if Lumagen doesn't recognize that a problem exists.