AVS › AVS Forum › Display Devices › Display Calibration › How constant is Display Output

# How constant is Display Output

I am trying to achieve the D65 white point of x=0.313, y=0.329.

Getting my 'Y' value constant is just a matter of leaving my LCD TV on for about 40 minutes, when it balances out with my room temperature, but stability in the xy coordinates proves elusive.

Nothing really significant, just a variation that I cannot predict.

Up to now I have adjusted, but my feeling now is that the display itself is not constant and I am simply chasing the LCD's natural variation.

In other words, I could leave it for another day and find it does not need adjustment.

Is my LCD alone in giving this variation, or is it a feature of all displays?
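For anyone who wants to sanity-check their own readings against that target, here is a quick Python sketch of turning tristimulus XYZ meter readings into xy chromaticity and showing the offset from D65. The readings below are made up purely for illustration.

```python
# Convert meter XYZ tristimulus readings to CIE xy chromaticity and show
# the offset from the D65 target. The readings below are hypothetical.

D65_X, D65_Y = 0.3127, 0.3290  # D65 white point, four-digit convention

def xyz_to_xy(X, Y, Z):
    """Project a tristimulus reading onto the xy chromaticity plane."""
    total = X + Y + Z
    return X / total, Y / total

# Hypothetical white-pattern readings taken a few minutes apart
readings = [(95.0, 100.0, 108.9), (95.3, 100.0, 108.2), (94.8, 100.0, 109.4)]

for X, Y, Z in readings:
    x, y = xyz_to_xy(X, Y, Z)
    print(f"x={x:.4f} y={y:.4f}  dx={x - D65_X:+.4f} dy={y - D65_Y:+.4f}")
```

Run over a session's worth of readings, this makes the drift in x and y visible as numbers rather than a feeling.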


Quote:
Originally Posted by PE06MCG

I am trying to achieve the D65 white point of x=0.313, y=0.329.

Getting my 'Y' value constant is just a matter of leaving my LCD TV on for about 40 minutes, when it balances out with my room temperature, but stability in the xy coordinates proves elusive.

Nothing really significant, just a variation that I cannot predict.

Up to now I have adjusted, but my feeling now is that the display itself is not constant and I am simply chasing the LCD's natural variation.

In other words, I could leave it for another day and find it does not need adjustment.

Is my LCD alone in giving this variation, or is it a feature of all displays?

The meter itself will fluctuate a bit when reading the exact same pattern with the exact same calibration settings, but I do believe the display might fluctuate to some extent as well, possibly less so than the meter.
Greetings

All displays behave this way. If you look closely at an 80% white pattern on a plasma ... you see it is not static. There is movement. The meter picks that up. There is also fluctuation in the current going to the TV, and the meter can pick that type of change up too.

While the goal might be perfection, reality says we can only get close enough that the changes are perhaps beyond what we can see anyway, so it is effectively perfect.

Imperfect humans are incapable of making perfect devices.

regards
Quote:
Originally Posted by Michael TLV

Greetings

All displays behave this way. If you look closely at an 80% white pattern on a plasma ... you see it is not static. There is movement. The meter picks that up. There is also fluctuation in the current going to the TV, and the meter can pick that type of change up too.

While the goal might be perfection, reality says we can only get close enough that the changes are perhaps beyond what we can see anyway, so it is effectively perfect.

Imperfect humans are incapable of making perfect devices.

regards

Thanks Michael,

As I said, I can achieve a reasonably constant 'Y' value, so the movement you see on a plasma is presumably 'Y' fluctuation across the screen?

However, you suggest that achieving a constant D65 is a futile exercise because of display imperfections.

As you say, though, they are often not seen.

The problem, I suppose, is that I am unable to know whether the 'accurate' calibration I make has been done at an extreme of my TV's variability.

If so, the possibility exists that visible errors may appear when the display returns to its normal, non-extreme variability.

Apart from making many observations over a period of time (assuming the variation is due to the display only), how can I ensure my calibration sits at the midpoint of its variation?
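One low-tech way to approach that midpoint question, purely as a sketch: log a reading per session and aim the calibration at the average rather than at any single snapshot. In Python, with invented x-coordinate readings:

```python
# Estimate the midpoint of a display's drift by averaging readings taken
# across several sessions, instead of trusting one snapshot.
# The x-coordinate readings below are invented for illustration.
from statistics import mean, stdev

x_readings = [0.3129, 0.3121, 0.3135, 0.3126, 0.3131, 0.3124]

midpoint = mean(x_readings)   # calibrate toward this value
spread = stdev(x_readings)    # how far individual sessions wander

print(f"calibrate toward x={midpoint:.4f} (session-to-session spread ~{spread:.4f})")
```

The spread also tells you when further adjustment is pointless: if the control steps are coarser than the session-to-session spread, you are chasing noise.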
Quote:
Originally Posted by PE06MCG

Thanks Michael,

As I said, I can achieve a reasonably constant 'Y' value, so the movement you see on a plasma is presumably 'Y' fluctuation across the screen?

However, you suggest that achieving a constant D65 is a futile exercise because of display imperfections.

As you say, though, they are often not seen.

The problem, I suppose, is that I am unable to know whether the 'accurate' calibration I make has been done at an extreme of my TV's variability.

If so, the possibility exists that visible errors may appear when the display returns to its normal, non-extreme variability.

Apart from making many observations over a period of time (assuming the variation is due to the display only), how can I ensure my calibration sits at the midpoint of its variation?

This may be one of those "How many angels can dance on the head of a pin?" kinds of questions. As Michael stated, displays fluctuate, not only in luminance, but also in color being displayed, and so do meters. The only way one can be certain is to do multiple readings after all adjustment is complete. If they are consistently below a dE of 3, which means they are essentially indistinguishable from each other, it's good to go.
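As a rough illustration of that dE check, here is a Python sketch using the simple CIE76 delta-E formula (Euclidean distance in L*a*b*); calibration packages often use newer formulas, and the Lab values below are invented:

```python
# Check whether repeated post-calibration readings agree within dE 3.
# Uses the simple CIE76 delta-E (straight-line distance in L*a*b*).
# All Lab values below are hypothetical.
import math

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two Lab triples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

reference = (95.0, 0.2, -0.4)                      # first reading after calibration
repeats = [(94.8, 0.5, -0.1), (95.1, -0.1, -0.6)]  # later repeat readings

consistent = all(delta_e76(reference, r) < 3.0 for r in repeats)
print("consistent" if consistent else "re-check calibration")  # prints "consistent"
```

If the repeats stay under dE 3 of each other, the residual drift is below the usual visibility threshold and, as said above, it's good to go.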
Quote:
Originally Posted by Rolls-Royce

This may be one of those "How many angels can dance on the head of a pin?" kinds of questions. As Michael stated, displays fluctuate, not only in luminance, but also in color being displayed, and so do meters. The only way one can be certain is to do multiple readings after all adjustment is complete. If they are consistently below a dE of 3, which means they are essentially indistinguishable from each other, it's good to go.

Do you know, I agree entirely.

There has been so much said recently about how accurate certain meters are that the point of choosing a measurement tool to suit its purpose has been lost.

The display seems to be a moving target, so why use a micrometer when a tape measure would probably be just as useful?
Quote:
Originally Posted by PE06MCG

Do you know, I agree entirely.

There has been so much said recently about how accurate certain meters are that the point of choosing a measurement tool to suit its purpose has been lost.

The display seems to be a moving target, so why use a micrometer when a tape measure would probably be just as useful?

You'd think it wouldn't matter given display fluctuation and the coarseness of available adjustments. However, total system error (meter + display) does add up, so using a more accurate meter can help keep that as low as possible. You may not achieve perfection, but it's good to know you're giving it your best shot.
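To put a rough number on why meter accuracy still matters: if the meter error and the display drift are independent, they combine roughly in quadrature rather than simply adding. A quick sketch with hypothetical dE contributions:

```python
# Independent error sources combine roughly in quadrature (root-sum-square),
# so a better meter still lowers the total even when the display drifts.
# The dE contributions below are hypothetical.
import math

def total_error(*sources):
    """Root-sum-square of independent error contributions."""
    return math.sqrt(sum(s ** 2 for s in sources))

display_drift = 1.5
print(total_error(display_drift, 2.0))  # cheaper meter: prints 2.5
print(total_error(display_drift, 0.5))  # better meter: total stays near the drift floor
```

The display's own drift sets a floor you cannot calibrate below, but the meter's contribution on top of it is the one part you can actually buy down.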
Quote:
Originally Posted by Rolls-Royce

You'd think it wouldn't matter given display fluctuation and the coarseness of available adjustments. However, total system error (meter + display) does add up, so using a more accurate meter can help keep that as low as possible. You may not achieve perfection, but it's good to know you're giving it your best shot.

Fair point, but all the more accurate meters seem to highlight is the degree of fluctuation you mention.

The pursuit of accuracy in meters is admirable, particularly if it does not carry a price premium, but it may not be a useful quality, at least for calibrating the variability of displays.