There are some tests out there that are supposedly designed for those who do not have colorimeters, but I wonder what those tests really show and mean.
For example, this contrast test - http://www.lagom.nl/lcd-test/contrast.php
shows that my blue contrast is too high, because it is hard to distinguish the bands on the blue strip as they get closer to 100% IRE (right side). However, I used ArgyllCMS calibration for the desktop (not madVR), loaded the produced .cal file into the video card, and then measured the calibration with HCFR. HCFR reported near-perfect luminance (including blue), gamma (including blue), and RGB white balance, with all saturation/ColorChecker sweeps under dE 3.5, mostly under dE 3. Yet the loaded file did not really change the way that blue strip looks... How can my measured luminance and gamma be near perfect while that picture shows my blue contrast is still too high?
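For context, my understanding of what loading the .cal does: the video card applies a per-channel 1D LUT to every pixel it outputs, browser included, so the Lagom strip should be passing through the same curves HCFR measured. A rough sketch of that remapping (the curve below is a made-up placeholder, not my actual .cal data):

```python
import numpy as np

# Hypothetical per-channel calibration curve. A .cal file stores one such
# curve per channel; the gamma tweak below is a stand-in, not real data.
cal_in = np.linspace(0.0, 1.0, 256)
cal_out_blue = cal_in ** (2.2 / 2.4)    # placeholder blue correction curve

def apply_vcgt(values):
    """Remap pixel values the way the video card gamma table (vcgt) would."""
    return np.interp(values, cal_in, cal_out_blue)

# The Lagom contrast strip ends in near-100% pure-blue bands; whatever the
# LUT does to blue applies to these bands exactly as it does to HCFR patches.
top_bands = np.linspace(0.80, 1.0, 8)
print(np.round(apply_vcgt(top_bands), 3))
```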
Then there is this pattern - http://www.lagom.nl/lcd-test/gamma_calibration.php
that shows 1.6 for my blue gamma @ 48%. How is that possible when HCFR reports blue and overall luminance gamma of 2.2 @ 48% IRE, both with and without the ArgyllCMS calibration loaded? If this test is incorrect, what does it actually show?
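As far as I understand it, the Lagom pattern works by dithering: alternating black/full lines average out to 50% luminance, and the gamma reading is the solid level that visually blends with them. A quick sketch of that math (assuming the pattern really uses 0%/100% stripes):

```python
import math

def match_level(gamma):
    """Solid signal level whose light output equals a 50%-luminance dithered
    field (alternating 0%/100% lines): v**gamma = 0.5 => v = 0.5**(1/gamma)."""
    return 0.5 ** (1.0 / gamma)

def implied_gamma(v):
    """Inverse: the gamma implied when the visual match sits at signal v."""
    return math.log(0.5) / math.log(v)

for g in (1.6, 2.2, 2.4):
    print(f"gamma {g}: blend point at {match_level(g):.1%} signal")
# gamma 1.6 blends near 64.8% signal; gamma 2.2 near 73.0%
```

On that math, a blend near 65% signal reads as gamma 1.6, while 2.2 should blend near 73%, so the two readings really shouldn't be this far apart.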
Is it truly impossible to calibrate grayscale by eye to a point where it's more or less acceptable? My i1D3 does not measure blacks that well, and the common advice is to trust its readings for 10-100% IRE but check near-black levels with your eyes to make sure they are not tinted red, green, or blue. The same can be done with all IRE steps. Sure, by eye you won't get anywhere near the results a colorimeter gives, but you could at least fix glaring problems where the color is obviously too red, too green, or too blue, and you can definitely set black and white levels that way. If that gamma test on the Lagom site showed accurate gamma, you could technically calibrate gamma by eye using 10% IRE steps as well, but so far that test shows my blue gamma as 1.6 while HCFR shows 2.2. Monitor Calibration Wizard has similar patterns for gamma calibration; I used them a couple of times and the results were incredibly bad...
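For what it's worth, this is the kind of throwaway helper I mean for the eyeball check: a strip of neutral patches at 10% IRE steps, generated with Pillow (my own hypothetical script, not from any calibration tool):

```python
from PIL import Image

# Throwaway generator for the eyeball check: neutral patches at 10% IRE
# steps. Any patch with a visible red/green/blue cast flags a channel
# that is off at that level.
STEPS = 11                        # 0, 10, ..., 100% IRE
W, H = 120, 240                   # size of each patch

strip = Image.new("RGB", (W * STEPS, H))
for i in range(STEPS):
    level = round(255 * i / (STEPS - 1))   # equal R=G=B -> neutral input
    strip.paste((level, level, level), (i * W, 0, (i + 1) * W, H))
strip.save("ire_steps.png")
```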
Also, a side question regarding sharpness in general: is it truly bad to have a slight hint of sharpening, i.e., a setting of 1-5/100 instead of 0? I have seen several semi-pro calibrators use Sharpness 5, and one even used 10. I ventured into the service menu (SM) and realized that even with Sharpness = 0 and edge enhancement turned off, some sharpening is still applied, because if I turn it off entirely in the SM, things get very blurry. So, in the end, the only way to truly turn off sharpening is to use 4:4:4 PC Mode, because any 4:2:2 mode (even Movie) ends up sharpening things a bit. I personally like to use Game Mode, which also retains a tiny bit of sharpening that creates a small halo around the black rectangle in this test - http://www.lagom.nl/lcd-test/sharpness.php
but I never saw anything like that in a movie. I know I am supposed to use Movie Mode, but I have my reasons to use Game Mode, and I just wanted to know whether that slight sharpening alone would distort the picture in any noticeable way.
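To illustrate the halo I mean: an unsharp mask (just a stand-in here; whatever the TV's scaler actually does will differ) overshoots on both sides of a hard edge, which is exactly what shows up around the black rectangle on the Lagom sharpness page:

```python
from PIL import Image, ImageFilter

# Rough illustration of the halo: sharpening overshoots on both sides of
# a hard edge, brightening the background right next to the rectangle.
img = Image.new("L", (200, 200), 128)        # mid-gray background
img.paste(0, (50, 50, 150, 150))             # black rectangle

sharp = img.filter(ImageFilter.UnsharpMask(radius=2, percent=150))

row = [sharp.getpixel((x, 100)) for x in range(40, 60)]
print(row)   # values climb above 128 just outside the edge: the halo
```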