I know you like to say that, but you offer no measured proof. In practice, that type of calibration combined with color-checker patterns captures the display's linearity well enough that residual errors anywhere in the gamut should not exceed 3 dE94 at the 3-sigma level. That is what I demonstrated in this post using a sample set of 2500 color patterns. Sampling 2500 patterns out of the 16M-color population gives a margin of error of roughly +/-1%, so the test accurately bounds the error of any color you'd care to measure from the full 16M population. As I mentioned earlier, there is still room for improvement, but it is slim: you can push that 3-sigma point down to 2 dE94, but that's the limit given a probe precision of 0.4 dE94.
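To make the statistics concrete, here is a minimal sketch of the two quantities involved: the sampling margin of error for a given patch count, and the empirical 3-sigma bound on a set of measured dE94 residuals. The function names and the normal-approximation formula are my own illustration, not taken from any particular calibration tool, and the exact confidence level depends on the z-value you choose.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Normal-approximation margin of error for a proportion
    estimated from n samples (z=1.96 corresponds to ~95% confidence).
    p=0.5 is the worst case, giving the widest interval."""
    return z * math.sqrt(p * (1.0 - p) / n)

def three_sigma_bound(residuals):
    """Empirical mean + 3*stddev of a list of dE94 residuals:
    the level that ~99.7% of errors stay under, if roughly normal."""
    n = len(residuals)
    mean = sum(residuals) / n
    var = sum((x - mean) ** 2 for x in residuals) / n
    return mean + 3.0 * math.sqrt(var)

# With 2500 patches, the worst-case margin of error at ~95% confidence:
moe = margin_of_error(2500)  # about 0.02, i.e. roughly +/-2%
```

Note that the margin of error depends only on the sample size, not on the 16M population size, which is why a few thousand well-chosen patches suffice to characterize the whole gamut.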
Unless there is some mythical color gremlin that pokes the gamut in between unverified points, it just isn't scientifically (or common-sense) plausible that a display behaves the way you describe. If it did, no amount of 3D-LUTing would save you, because you'd have to LUT all 16M points to make sure the gremlin was contained.