I don't know how the Lumagen does dithering. Dithering is supposed to differ *per pixel* in such a way that it averages out to the correct value if you look at multiple neighboring pixels. When doing measurements, the test patterns usually consist of a whole screen of the same color. If you do this with dithering turned on, some pixels might be 1 value lower, while other pixels might be 1 value higher. But if you average all pixels, you should get exactly the desired color value. Now a meter does not measure single pixels, it measures the averaged light output of many neighboring pixels. So dithering should not cause any problems with calibration, IMHO. Furthermore, even *if* the meter measured only one single pixel, you can set up madVR to change the dithering pattern for every new frame, which means that each pixel changes its dithered value all the time. And a meter does not measure a pixel for only one video frame, it measures over a longer period, so again the dithering should average itself out.
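Here's a minimal sketch of that averaging argument (plain Python with numpy; the 37.9 target and the simple random-noise dither are just illustrative assumptions, not how madVR or the Lumagen actually dither):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical full-field test pattern at the fractional 8bit level 37.9
# (that's IRE 10 in TV levels, see the example further down).
target = 37.9

# Spatial average: dither one 100x100 frame so each pixel ends up 37 or 38.
frame = np.round(target + rng.uniform(-0.5, 0.5, size=(100, 100)))
print(frame.mean())  # -> ~37.9, the meter sees the averaged light of many pixels

# Temporal average: re-dither the same pixel for 10000 frames.
pixel = np.round(target + rng.uniform(-0.5, 0.5, size=10000))
print(pixel.mean())  # -> ~37.9, the meter also averages over many frames
```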
The JRP post about dithering had a different background: Some user had monitored the Lumagen test pattern output and found that some pixels didn't have the expected values. If this was caused by dithering, it would be of no consequence (see above). But the user was concerned that the Lumagen might output wrong pixels *everywhere*, so JRP explained that the "problem" might be caused by dithering rather than by the Lumagen outputting incorrect values.
All this said: If the calibration software driving madVR/madTPG requests color values which don't need dithering, madVR will automatically not use dithering. madTPG only dithers if the calibration software asks for a value which can't be represented exactly without dithering. So the calibration software has the option to drive madTPG with un-dithered, clean pixels, if it carefully chooses the "right" test pattern colors. Just as an example:
Let's say the calibration software wants to measure grayscale IRE 10. With 0.0 as black and 1.0 as white, IRE 10 is 0.1. When using 8bit TV levels output (16-235), this corresponds to the 8bit value 16 + 0.1 × 219 = 37.9. That's a fractional value, so madTPG technically *cannot* render this requested test pattern color without using dithering. As a result madTPG will display most pixels as 38 and some pixels as 37. If the calibration software wants to avoid dithering, it could ask madTPG whether the output levels are TV or PC (there's a madTPG API for that) and then round the requested level to the nearest integer 8bit value. So instead of 0.1 the calibration software could ask for (38 - 16) / 219 = 0.100456621[...]. This would result in madVR disabling dithering and always outputting 38 for all pixels. Of course the calibration software would then have to adjust its math a bit to take into consideration that it didn't measure IRE 10, but IRE 10.0456621[...]. That should be quite doable. AFAIK most DVD test patterns round to the nearest integer value, anyway, so with DVD test patterns the calibration software would probably measure 10.0456621[...] in any case.
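A rough sketch of what that rounding logic could look like on the calibration software's side (Python; the function names are mine, and the tv_levels flag merely stands in for the actual madTPG levels query):

```python
def ire_to_8bit(ire, tv_levels=True):
    """Map an IRE value (0-100) to a possibly fractional 8bit level."""
    if tv_levels:
        return 16 + (ire / 100.0) * 219   # TV levels: 16 = black, 235 = white
    return (ire / 100.0) * 255            # PC levels: 0 = black, 255 = white

def snap_to_clean_level(ire, tv_levels=True):
    """Round to the nearest integer 8bit level (so madTPG doesn't need to
    dither) and return both the level and the IRE that is actually shown."""
    level = round(ire_to_8bit(ire, tv_levels))
    black, white = (16, 235) if tv_levels else (0, 255)
    effective_ire = (level - black) / (white - black) * 100
    return level, effective_ire

print(snap_to_clean_level(10))  # -> (38, 10.0456621...)
```

The calibration software would then simply use the returned effective IRE instead of the originally requested one in its calculations.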
BTW, madTPG's dithering quality is probably quite a bit higher than the Lumagen's. From what I read (on some forum, so take it with a pinch of salt), the Lumagen uses simple random dithering, which adds quite a bit of extra noise and changes almost every pixel in some way. madTPG's default dithering method is much less noisy and only changes pixels if they actually need to be changed.
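I don't know what either device really implements, but here's a toy sketch contrasting plain random (TPDF) dithering with a dither that only touches the fractional part; the random variant perturbs pixels even when the target value is exactly representable:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_dither(target, shape):
    # Plain random (TPDF) dithering: add ~1 LSB of triangular noise, then
    # round. Note this perturbs pixels even for an integer target.
    noise = rng.uniform(-0.5, 0.5, shape) + rng.uniform(-0.5, 0.5, shape)
    return np.round(target + noise)

def dither_only_if_needed(target, shape):
    # Dither only the fractional part; an integer target passes through clean.
    base = np.floor(target)
    frac = target - base
    return base + (rng.uniform(0.0, 1.0, shape) < frac)

shape = (200, 200)
for target in (38.0, 37.9):
    for name, dither in (("random", random_dither),
                         ("only-if-needed", dither_only_if_needed)):
        out = dither(target, shape)
        changed = (out != round(target)).mean()
        print(f"target {target}: {name:14s} mean={out.mean():.3f} "
              f"pixels touched={changed:.1%}")
```

Both variants average to the requested value, but the random one spreads each pixel across three levels instead of two (or one), which is the extra noise I mean. The real algorithms on both sides are certainly more sophisticated than this toy.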