And thanks to this test build I found a slight variation of my current config that, in critical viewing (and in avg dE on the SG Color Checker), proved considerably better than my previous one.
What I learned from it: it's VERY important that between 30 and 50 IRE (and probably above) the color temperature curve EXACTLY kisses the 6500K line. My former calibration had one minor tweak in it to "correctly" reduce the deviation you will see at the 10 and 20 IRE points, but it slightly pushed 30 and 40 IRE below 6500K - and by slightly I mean down to, though probably even less than, the 6400K mark. Compared to the configuration now - and only in comparison - skin tones (edit: and blue skies) were noticeably too warm in critical viewing, even though the dE for skin tones in both configs was well below 2.5 (except bronze tones).
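If you want to check that numerically rather than eyeball the curve, McCamy's cubic approximation converts a measured CIE xy chromaticity into a correlated color temperature, so you can see how far a 30 or 40 IRE reading actually sits from 6500K. A minimal sketch - the "warm reading" xy values below are made-up illustration numbers, not my measurements:

```python
# McCamy's cubic approximation of correlated color temperature (CCT)
# from CIE 1931 xy chromaticity coordinates.
def cct_mccamy(x, y):
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# D65 white point (x=0.3127, y=0.3290) should land very close to 6500K:
print(round(cct_mccamy(0.3127, 0.3290)))  # ~6505

# A hypothetical slightly warm greyscale reading drops visibly below the line:
print(round(cct_mccamy(0.3160, 0.3320)))
```

The approximation is only meant for points near the blackbody/daylight locus, which is exactly where greyscale readings live, so it's fine for this purpose.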
Also, when calibrating greyscale - and all your values are way below dE 2 in both CIE76 and CIE2000 - go with the configuration that shows the lower errors in both standards. Basically: calibrate for CIE76, but don't ignore CIE2000 greyscale tracking (it's just four mouse clicks to switch between them). I had other configurations that showed better CIE76 tracking but noticeably worse CIE2000 performance, and in critical viewing they did not look as good.
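For anyone wondering why the two standards can disagree: they really are different formulas, not different scalings of the same number. CIE76 is a plain Euclidean distance in L*a*b*, while CIEDE2000 re-weights lightness, chroma, and hue to better match perception - so two configs can rank differently under them. A self-contained sketch of both (these are the standard published formulas, nothing specific to my setup; the target/reading Lab values are illustrative):

```python
from math import atan2, cos, degrees, exp, hypot, radians, sin, sqrt

def delta_e_76(lab1, lab2):
    """CIE76: plain Euclidean distance in L*a*b*."""
    return sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

def delta_e_2000(lab1, lab2):
    """CIEDE2000 with kL = kC = kH = 1."""
    L1, a1, b1 = lab1
    L2, a2, b2 = lab2
    C1, C2 = hypot(a1, b1), hypot(a2, b2)
    Cbar = (C1 + C2) / 2
    G = 0.5 * (1 - sqrt(Cbar ** 7 / (Cbar ** 7 + 25 ** 7)))
    a1p, a2p = (1 + G) * a1, (1 + G) * a2
    C1p, C2p = hypot(a1p, b1), hypot(a2p, b2)
    h1p = degrees(atan2(b1, a1p)) % 360 if (a1p or b1) else 0.0
    h2p = degrees(atan2(b2, a2p)) % 360 if (a2p or b2) else 0.0
    dLp, dCp = L2 - L1, C2p - C1p
    # Hue difference, wrapped into [-180, 180]:
    dh = h2p - h1p
    if C1p * C2p == 0:
        dh = 0.0
    elif dh > 180:
        dh -= 360
    elif dh < -180:
        dh += 360
    dHp = 2 * sqrt(C1p * C2p) * sin(radians(dh) / 2)
    Lbp, Cbp = (L1 + L2) / 2, (C1p + C2p) / 2
    # Mean hue, handling the wraparound at 0/360:
    hsum = h1p + h2p
    if C1p * C2p == 0:
        hbp = hsum
    elif abs(h1p - h2p) <= 180:
        hbp = hsum / 2
    elif hsum < 360:
        hbp = (hsum + 360) / 2
    else:
        hbp = (hsum - 360) / 2
    T = (1 - 0.17 * cos(radians(hbp - 30)) + 0.24 * cos(radians(2 * hbp))
         + 0.32 * cos(radians(3 * hbp + 6)) - 0.20 * cos(radians(4 * hbp - 63)))
    SL = 1 + 0.015 * (Lbp - 50) ** 2 / sqrt(20 + (Lbp - 50) ** 2)
    SC = 1 + 0.045 * Cbp
    SH = 1 + 0.015 * Cbp * T
    RT = (-sin(radians(2 * 30 * exp(-(((hbp - 275) / 25) ** 2))))
          * 2 * sqrt(Cbp ** 7 / (Cbp ** 7 + 25 ** 7)))
    return sqrt((dLp / SL) ** 2 + (dCp / SC) ** 2 + (dHp / SH) ** 2
                + RT * (dCp / SC) * (dHp / SH))

# One greyscale reading against a D65-neutral target (illustrative values):
target = (50.0, 0.0, 0.0)
reading = (50.5, 1.2, -1.5)
print(round(delta_e_76(target, reading), 2), round(delta_e_2000(target, reading), 2))
```

Note how the CIEDE2000 weighting terms (SL, SC, SH, T) depend on where in color space the error sits - that location dependence is exactly why a config can win on CIE76 and lose on CIE2000.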
Best that is possible on a KDL42W653 (just by using the TV's settings):
And just to give you an indication of how sensitive the "worst 10%" (avg dE) value is: a single increment in red Bias was the difference between a value of 3.2 and 2.47 (the average of the nine or ten colors it looks at). That's how sensitive this indicator is, or at least can be.
So all that is left for me at this point is to find someone who can educate me on whether the skewed sample of colors in the Color Checker SG (skewed as in "you have more points in certain sectors") also represents "visual importance", as in "more noticeable". I mean, why else would photographers have chosen it as a standard?