Thanks to Chad B for providing the actual model numbers, and thanks to nuke for providing a few more details about the Sony in-house tech currently used in their Triluminos-marketed TVs. (When trying to research them myself, I found the same "white" LEDs: initially blue LEDs made "white" by a new coating.)
Which brings us to the following problem.
As far as the "chain of trust" with colorimeters goes, it seems to be fully broken for LCDs at this point.
If we just look at the matrices Chad B provided, you'll see significant, non-linear deviations from the white-LED, RGB-LED and OLED presets that come with the i1d3.
Furthermore, I acquired a recalibration report from one of the i1d3 vendors who replaces the X-Rite presets with their own, TV-brand-specific ones, and those introduce only minor changes compared to the original X-Rite presets.
That said, the matrices Chad B just provided, compared against plain preset-only readings of an X-Rite i1d3, show deviations in the range of +/-8 dE (a sum of 16 dE between two readings is possible) in greyscale alone.
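To make concrete why two different corrections diverge like this: a colorimeter correction is just a 3x3 matrix applied to the meter's raw XYZ reading, so two different matrices turn the same raw measurement into two different results. The sketch below uses made-up illustrative matrix values (not Chad B's actual data, and not real i1d3 presets) and expresses the divergence as a du'v' distance in CIE 1976 chromaticity:

```python
import numpy as np

# Hypothetical correction matrices (illustrative values only, NOT real
# i1d3 or Chad B data): identity stands in for "use the preset as-is",
# the second matrix for a panel-specific correction.
M_preset = np.eye(3)
M_panel = np.array([[ 0.98, 0.02, -0.01],
                    [ 0.01, 0.99,  0.00],
                    [-0.02, 0.01,  1.04]])

def uv_prime(XYZ):
    """CIE 1976 u'v' chromaticity from XYZ tristimulus values."""
    X, Y, Z = XYZ
    d = X + 15 * Y + 3 * Z
    return 4 * X / d, 9 * Y / d

raw = np.array([95.0, 100.0, 108.0])  # raw colorimeter XYZ of a white patch

u1, v1 = uv_prime(M_preset @ raw)
u2, v2 = uv_prime(M_panel @ raw)

# du'v' distance between the two corrected interpretations of the SAME patch
duv = ((u1 - u2) ** 2 + (v1 - v2) ** 2) ** 0.5
print(f"du'v' = {duv:.4f}")
```

Even these modest made-up coefficients shift the white point by several thousandths in u'v'; the point is that whichever matrix is wrong for your backlight, the error lands directly in your calibration target.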
Meanwhile, the TV vendors do not even openly communicate which backlight technology they are using, and significant changes can be seen even between two generations of the same backlight technology.
(Sony's "in-house tech" is a variation on the white-LED backlight with different (phosphor?) coatings.)
About the same goes for the Samsung models Chad B provided matrices for. They perform significantly differently from the presets a colorimeter (i1d3) would expect.
Which leaves us with the following conclusions.
Colorimeters are, for now, unusable for calibrating LCDs: the inter-model deviations across current LCD backlight technologies are simply all over the place, and without a preset database to compare against, updated at least once a year for each manufacturer's backlight generation, calibrating with a colorimeter alone is simply not possible.
This problem could be fixed ENTIRELY if the whole scene adopted a "share deviation matrices" approach, at which point i1d3s would be sufficient for the entire industry, because of their low unit-to-unit deviation and the fact that they do not degrade over time.
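The mechanism behind such shared matrices is simple enough that sharing really would suffice. One common approach (a least-effort sketch, with invented illustrative numbers, not anyone's published measurements) derives a 3x3 matrix from red, green and blue patch readings taken once per panel with a trusted spectro; anyone with the same meter/panel combination can then reuse the matrix:

```python
import numpy as np

# Hypothetical XYZ measurements of full-field R, G, B patches, as columns.
# "spectro" is the trusted reference instrument; "colorimeter" is an i1d3
# reading the same patches through a generic preset. Numbers are invented.
spectro = np.array([[41.2, 35.8, 18.0],
                    [21.3, 71.5,  7.2],
                    [ 1.9, 11.9, 95.0]])
colorimeter = np.array([[40.1, 36.9, 18.9],
                        [20.7, 72.8,  7.6],
                        [ 2.2, 12.4, 92.1]])

# Correction matrix M such that M @ colorimeter_reading = spectro_reading
# for the three primaries used in the fit.
M = np.linalg.solve(colorimeter.T, spectro.T).T

# Once M is shared, any owner of the same meter/panel combination can
# correct raw readings without owning a spectroradiometer:
raw = colorimeter[:, 0]        # a raw red-patch reading
corrected = M @ raw            # matches the spectro reference by construction
print(np.allclose(corrected, spectro[:, 0]))  # True
```

This is essentially what tools like ArgyllCMS store in a ccmx file; the expensive part is the one-time spectro measurement, which is exactly what a sharing culture would amortize.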
But who, at that point, would buy the 10,000 USD equipment?
It would only make sense for testers, TV manufacturers and wholesale resellers of calibration equipment: the former already get to look at many TVs throughout the year, and the latter would be interested in running it as a for-profit service. But at the current point in time the market is skewed against this, because it is more profitable to sell 10k equipment, serviceable every 6 months, to whales, and to disregard the fact that even most test sites are probably using colorimeter presets that are not applicable to what they are testing.
Also, "recalibrating" or "preloading" certain presets onto an i1d3 for money can at this point basically be considered fraudulent behavior, because inter-model variation is well within the realm of introducing significant color errors, even within the product range of the one TV vendor a preset would be based on.
Then why is the community here so happy, and why don't the problems at hand get any form of recognition?
Because, again, we have a bias: most people here are happy as long as two or three TV models every five years get profiled sufficiently (correction matrices shared) to be calibratable with a colorimeter.
Also, this seems to be a problem that is only just starting to get recognition, and it will gain traction in the future, because almost no one here cared about LCDs and their backlight technologies at all; that should now be changing as plasmas are dying.
The problem is that it might take five years before a critical mass of the readers here becomes sensitive to what's going on, because of personal dependencies and buying cycles.
I have two things to ask of you.
1. You just shared four correction matrices for four different LCD models for the i1d3, which show highly significant deviations from the i1d3 presets AND from each other. Do you have any "sense", from your daily work, of how big the deviations from the i1d3's LCD (LED) (= white LED) preset actually are, uncorrected vs. corrected (with a spectro)? How big is the difference in general (dE uv or dE2000, or actual gain/bias)?
2. I myself also own a quantum-dot display, and right now I have almost no idea where on the spectrum my colorimeter readings would fall; in fact, I have a strong indication that with this much uncertainty introduced by the backlight alone, I can't use the meter at all. Would you have, and be willing to share, a correction matrix for a (55") W900 (W905A), also a quantum-dot display, for an i1d3? I wasn't tuned in enough to buy one of those 3-4 TVs for which matrices are publicly available. Also, I am a student living in the heart of Europe, so for some time I would probably not be able to afford the airfare to bring you over for a calibration.
As it stands right now, we only have one correction matrix for one quantum-dot display publicly available, and that is not very much. At all. If possible, please share the matrix publicly, so that we could dissect it for any kind of "trend" across at least two quantum-dot displays.
I will wait a few days for reactions from the forum and then write an introduction to this problem and pin it on top of the calibration tutorial I have written. With deviations as huge as those in the four matrices released in this thread, I have to warn others not to consider or buy calibration equipment below the 10k USD price point when calibrating LCDs.