Originally Posted by eelton
Thanks for the information.
I used the flashing bars pattern on the AVS HD 709 disc to set the black level (17 and higher flashing).
OK, so all the bars in the contrast/brightness patterns look the same between A and B?
Originally Posted by eelton
I have displayed test patterns (A) from AVS, stored on my TiVo, measured with HCFR, (B) from AVS, stored on a networked computer and played through the TV's Plex app, measured with HCFR, (C) from AVS, via my laptop with an HDMI connection using your Lightspace software and a floating window; and (D) generated by HCFR via Chromecast, measured with HCFR.
LightSpace is not my software; I just named my calibration disc "Ted's LightSpace" disc. It's a universal calibration disc (it also works with CalMAN/ChromaPure/HCFR), but the main content of the disc is for 3D LUT profiling via LightSpace, displaying thousands of colors without user prompts for 3D LUT creation with LightSpace.
What are the output settings of your TiVo? (YCbCr?)
When you are using a notebook with an extended desktop and a software generator (LightSpace/HCFR, etc.), the notebook and software settings affect the accuracy.
Very often, users find it quick and easy to use their notebook/PC HDMI output for patch generation with CalMAN's software pattern generator, but without doing proper testing to verify that the output is correct, and there can be a level mismatch (video and PC levels confused). Some discover the mistake at the end of their calibration, after they have already wasted a lot of time because the output was wrong. Others never notice anything, because the colorful CalMAN charts showing a low average dE seem to prove that everything is fine.
At the end of the day, you need accurate colors from the source you actually use to play back movies, not from some other untested solution.
For example, stand-alone players can have accurate YCbCr output, while notebook/PCs can have accurate RGB output. Some TVs handle the signal differently depending on whether you send RGB or YCbCr; usually they convert an incoming RGB signal to YCbCr for processing. There are also TVs that handle a 60p input differently from a 24p input. All of this requires some testing to find out what is happening.
To see if your notebook/PC HDMI RGB output is configured correctly, you have to compare it against a stand-alone player's output (configured to output YCbCr, because most players are bit-perfect at that colorspace output), using a calibration disc compatible with HCFR as the reference. (Setting the PC/notebook to output YCbCr is not recommended, because a specific 3x3 matrix would be required to convert RGB-Video to Rec.709 YCbCr.)
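For the curious, the Rec.709 RGB-to-YCbCr conversion mentioned above looks roughly like this. This is my own minimal sketch using the standard BT.709 luma coefficients and 8-bit video-range quantization; the function name is mine, not from any of the tools discussed:

```python
def rgb_to_ycbcr_bt709(r, g, b):
    """Convert normalized (0.0-1.0) non-linear R'G'B' to 8-bit
    video-range YCbCr using BT.709 luma coefficients."""
    # BT.709 luma coefficients: Kr = 0.2126, Kb = 0.0722
    ey = 0.2126 * r + 0.7152 * g + 0.0722 * b
    ecb = (b - ey) / 1.8556   # 1.8556 = 2 * (1 - Kb)
    ecr = (r - ey) / 1.5748   # 1.5748 = 2 * (1 - Kr)
    # Quantize to 8-bit video range: Y 16-235, Cb/Cr 16-240
    y = round(16 + 219 * ey)
    cb = round(128 + 224 * ecb)
    cr = round(128 + 224 * ecr)
    return y, cb, cr

print(rgb_to_ycbcr_bt709(1.0, 1.0, 1.0))  # white -> (235, 128, 128)
print(rgb_to_ycbcr_bt709(1.0, 0.0, 0.0))  # pure red -> Cr hits 240
```

Getting a matrix like this wrong (or applying it twice) is exactly the kind of error that's hard to spot by eye but shows up immediately in measurements.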
You need to compare grayscale and gamut (saturation or colorchecker measurement runs) from both patch solutions to see if they agree. If they agree, you can use your notebook/PC HDMI for patch generation (using HCFR's software generator); check also your reported black/white levels and gamma measurements. Not all notebooks can be used as software patch generators: there are video card settings, Windows ICC profiles, or the VCGT that can affect the accuracy of the HDMI output. With a stand-alone player and a bit-perfect calibration disc, you can use that as the reference point for this test.
To be sure the video card or OS is not altering your HDMI output accuracy, check your video card panel settings and disable any dynamic mode or other enhancement, then remove any ICC profile you see in Control Panel -> Color Management, and finally reset your VCGT (Video Card Gamma Table).
Also, make sure your software patch generator is configured to output correct levels: ideally, set HCFR to output video levels (16-235), the VGA/graphics output to full range (0-255), and the TV to expect 16-235.
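To make the video/PC levels relationship concrete, here is a small sketch (my own illustration, not HCFR code) of the standard 8-bit mapping between full range (0-255) and video range (16-235):

```python
def full_to_video(v):
    """Map an 8-bit full-range (0-255) code value to video range (16-235)."""
    return 16 + round(v * 219 / 255)

def video_to_full(v):
    """Map an 8-bit video-range (16-235) code value back to full range."""
    return round((v - 16) * 255 / 219)

# When levels are set correctly, black and white land on the right codes:
print(full_to_video(0), full_to_video(255))   # 16 235
print(video_to_full(16), video_to_full(235))  # 0 255
```

When one device in the chain applies this mapping and another expects the opposite, you get crushed blacks or raised, washed-out blacks, which is the level mismatch described above.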
For grayscale only, you can use the AVSHD grayscale patterns (once you have enabled the software offset that fixes the level mismatch between AVSHD and HCFR; see details here), but for gamut measurements (for example a 4-Point Saturation sweep), AVSHD is not compatible with all the colors of HCFR's 4-Point Saturations (and there is no fix for this). For example:
The AVSHD 50% Red saturation pattern has RGB triplet 190,95,95, but HCFR's color engine needs/calculates errors from RGB triplet 191,96,96; that mismatch alone is a 0.42 dE2000 error.
The AVSHD 75% Magenta saturation pattern has RGB triplet 203,100,203, but HCFR's color engine needs/calculates errors from RGB triplet 202,99,202; that's a 0.36 dE2000 error.
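Those dE2000 figures come from comparing the two triplets after converting them to CIELAB (the RGB-to-Lab step depends on the gamma and primaries HCFR's color engine assumes, which is why I quote the resulting numbers rather than derive them here). For reference, this is a self-contained sketch of the CIEDE2000 formula itself, operating on Lab values:

```python
import math

def de2000(lab1, lab2):
    """CIEDE2000 color difference between two CIELAB triplets."""
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    C1, C2 = math.hypot(a1, b1), math.hypot(a2, b2)
    Cbar = (C1 + C2) / 2
    G = 0.5 * (1 - math.sqrt(Cbar**7 / (Cbar**7 + 25**7)))
    a1p, a2p = (1 + G) * a1, (1 + G) * a2
    C1p, C2p = math.hypot(a1p, b1), math.hypot(a2p, b2)
    h1p = math.degrees(math.atan2(b1, a1p)) % 360
    h2p = math.degrees(math.atan2(b2, a2p)) % 360
    dLp, dCp = L2 - L1, C2p - C1p
    if C1p * C2p == 0:
        dhp = 0.0
    else:
        dhp = h2p - h1p
        if dhp > 180:
            dhp -= 360
        elif dhp < -180:
            dhp += 360
    dHp = 2 * math.sqrt(C1p * C2p) * math.sin(math.radians(dhp) / 2)
    Lbar, Cbarp = (L1 + L2) / 2, (C1p + C2p) / 2
    if C1p * C2p == 0:
        hbar = h1p + h2p
    elif abs(h1p - h2p) <= 180:
        hbar = (h1p + h2p) / 2
    elif h1p + h2p < 360:
        hbar = (h1p + h2p + 360) / 2
    else:
        hbar = (h1p + h2p - 360) / 2
    T = (1 - 0.17 * math.cos(math.radians(hbar - 30))
           + 0.24 * math.cos(math.radians(2 * hbar))
           + 0.32 * math.cos(math.radians(3 * hbar + 6))
           - 0.20 * math.cos(math.radians(4 * hbar - 63)))
    Sl = 1 + 0.015 * (Lbar - 50) ** 2 / math.sqrt(20 + (Lbar - 50) ** 2)
    Sc = 1 + 0.045 * Cbarp
    Sh = 1 + 0.015 * Cbarp * T
    dtheta = 30 * math.exp(-(((hbar - 275) / 25) ** 2))
    Rc = 2 * math.sqrt(Cbarp**7 / (Cbarp**7 + 25**7))
    Rt = -math.sin(math.radians(2 * dtheta)) * Rc
    return math.sqrt((dLp / Sl) ** 2 + (dCp / Sc) ** 2 + (dHp / Sh) ** 2
                     + Rt * (dCp / Sc) * (dHp / Sh))
```

The point of showing it is that even a 1-code-value difference per channel produces a small but nonzero dE2000, which accumulates as systematic error across a whole saturation sweep.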
When you perform such comparisons, you have to match the size of the patterns you are using.
Originally Posted by eelton
I've attached images for gamma curves. A and B produced identical results, showing a nearly flat gamma curve at 2.02. D produced a slightly less flat curve, with an average of 2.25. The gamma curve from C showed my gamma to be below the 2.4 goal, although I'm not sure how to tell by how much from the graph. I've also attached the table showing the levels in HCFR, from trial D. All of these measurements were with the same settings on the TV (including gamma BT.1886) in ISF dark room mode.
At any rate, all of my other results are where they should be, and I'm happy with the image. I'll choose to think of the gamma as the Chromecast result of 2.25, so that it doesn't seem way off.
It's unknown which of those is the most accurate when you don't have access to a reference generator to test all the other patch playback/generation solutions. One way is to use your TV's USB input to play back the media files from there and use that as the reference. If no stand-alone Blu-ray/media player of a known brand is available, then measure grayscale and color gamut patterns and see which option tracks your 'reference' better.
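The "see which option tracks better" step can be as simple as averaging the per-point error of each source against the chosen reference and picking the lowest. A minimal sketch, using hypothetical per-stimulus gamma readings loosely based on the numbers reported above (the function and metric here are my own placeholders; in practice you would average dE2000 over Lab measurements):

```python
def average_error(readings, reference):
    """Average absolute error of a list of measurements vs. a reference list.
    Placeholder metric; in practice use dE2000 on measured Lab values."""
    return sum(abs(m - r) for m, r in zip(readings, reference)) / len(reference)

# Hypothetical per-stimulus gamma readings for each source vs. a 2.4 target:
reference = [2.4, 2.4, 2.4, 2.4]
sources = {
    "TiVo":       [2.02, 2.03, 2.02, 2.01],
    "Chromecast": [2.25, 2.26, 2.24, 2.25],
}
best = min(sources, key=lambda s: average_error(sources[s], reference))
print(best)  # -> Chromecast (tracks the 2.4 target more closely)
```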
If you perform these tests, upload all the measurement data.
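As an aside on reading numbers off gamma curves like the ones discussed above: the point gamma at each grayscale stimulus can be computed directly from the luminance readings, so you don't have to eyeball the graph. A minimal sketch (function and variable names are mine):

```python
import math

def point_gamma(stimulus, lum, white_lum):
    """Point gamma at one grayscale stimulus level:
    gamma = log(L / L_white) / log(stimulus), with stimulus in 0.0-1.0."""
    return math.log(lum / white_lum) / math.log(stimulus)

# Sanity check: a display following a pure 2.4 power law exactly.
white = 100.0
for s in (0.2, 0.4, 0.6, 0.8):
    print(round(point_gamma(s, white * s ** 2.4, white), 3))  # 2.4 each time
```

Feeding in the measured luminance at each HCFR grayscale step gives the per-point gamma, and averaging those gives a single comparable number per source.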