Originally Posted by chronitis
For example, if I run a 70 ire pattern, the reading will start at 600 and continue to increase in brightness if the test pattern stays up. Are you supposed to use the first reading as reference? I just can't understand the dramatic increase in brightness every second...
Yeah, that's "normal". It was like that on the 2017 series as well (and probably goes way back, though it's likely less prominent on older models, which have lower peak luminance and more aggressive ABL).
If you deliberately heat the entire panel up as much as the built-in safeties allow (for example, by leaving a loop of 100% W, R, G, B, C, M, Y fullscreen fields running for 20+ minutes in an attempt to get rid of some visible image retention), you can observe obvious changes in the tone response with the naked eye: dark grays look overly bright, and even the overall white balance may seem a bit different (I didn't try to measure this). It goes back to "normal" within a few minutes of average content (say, movies or video games) as the panel cools down.
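For reference, a fullscreen field loop like that takes almost no code. Here's a quick Python/tkinter sketch (the 2-second field duration is just my pick, and given the whole IR/burn-in topic, run it at your own risk):

import tkinter as tk

# W, R, G, B, C, M, Y fullscreen fields at 100%
COLORS = ["white", "red", "green", "blue", "cyan", "magenta", "yellow"]

root = tk.Tk()
root.attributes("-fullscreen", True)
root.config(cursor="none")  # hide the mouse cursor

index = 0
def next_field():
    global index
    root.config(bg=COLORS[index % len(COLORS)])
    index += 1
    root.after(2000, next_field)  # 2 s per field, adjust to taste

root.bind("<Escape>", lambda e: root.destroy())  # Esc to quit
next_field()
root.mainloop()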
I'd guess the same thing happens with small window patterns (only faster locally, thanks to the higher peak luminance without ABL kicking in): the panel gets warmer at that spot and so the brightness increases.
(Note: "cools down" and "heats up" should be taken very loosely here; it's probably not just actual temperature but also some electric charge buildup, etc., but the end result is the same from the perspective of visual observation.)
I think the right answer is that we should measure the "average" state: not entirely "cool", not overly "hot". But that's obviously rather tricky in practice. I often see obvious IR after a 20-30 minute calibration session regardless of CalMAN's recommended fullscreen "pattern insertion" settings.
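If you wanted to automate it, one option would be to keep sampling each pattern until the readings stop drifting and then average the last few. A rough Python sketch (read_luminance() is a hypothetical stand-in for whatever your meter software exposes, and the tolerance/timing values are made up):

import time

def read_luminance():
    # Hypothetical meter call; swap in your actual meter/software API.
    raise NotImplementedError

def settled_reading(tolerance=0.5, window=5, interval=1.0, timeout=60.0):
    # Sample once per `interval` seconds until the last `window` readings
    # span less than `tolerance` cd/m2 peak-to-peak, then average them.
    samples = []
    start = time.monotonic()
    while time.monotonic() - start < timeout:
        samples.append(read_luminance())
        time.sleep(interval)
        recent = samples[-window:]
        if len(recent) == window and max(recent) - min(recent) < tolerance:
            return sum(recent) / window
    # Never settled within the timeout (panel still heating up);
    # fall back to the average of everything collected so far.
    return sum(samples) / len(samples)

Whether that "settled" value corresponds to the state the panel is actually in during real content is another question, of course.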