Originally Posted by taz291819
I'll try and explain this for you.
If you watch SDR (standard dynamic range) content with your display being triggered for HDR (High Dynamic Range), colors will be extremely over-saturated.
And vice versa: with HDR material that doesn't trigger HDR on the display, the image looks extremely flat.
SDR displays don't have a problem with the "always on" HDR that AT&T is doing, because their displays simply ignore it.
Originally Posted by PlanetAVS
You're describing a form of compression. HDR doesn't do that. If anything, HDR has the same goals as contrast stretching in that it tries to capture the widest range of levels of brightness.
Thanks guys. It looks like I was looking at web pages that oversimplified things.
I just looked things up elsewhere and found a CNET page titled "How HDR Works".
It explains things a bit more clearly.
In effect, it appears that video HDR enhances contrast in some parts of the image and compresses it in others, depending on the brightness and/or dynamic range of the pixels in that region, with the goal of making local variation more visible. Not every TV does it exactly the same way, and it isn't entirely the same as HDR in still photography.
One way of doing that would be to compress contrast at the global level, i.e., applied to the local mean, so fewer large-scale dark and bright areas saturate, while expanding contrast at the local level to bring out the variation. It's really quite clever.
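To make that concrete, here is a minimal sketch of the "compress globally, expand locally" idea in Python, assuming a grayscale image stored as a NumPy array of values between 0 and 1. The function name and parameter values are purely illustrative; I'm not claiming any particular TV implements its tone mapping this way.

[CODE]
# Minimal sketch: split the image into a smooth base layer (the local mean)
# and a detail layer, compress the base, and boost the detail before recombining.
import numpy as np
from scipy.ndimage import gaussian_filter

def local_tone_map(img, base_compress=0.6, detail_gain=1.5, sigma=16):
    """img: 2-D array of luminance values in [0, 1]. Parameters are illustrative."""
    eps = 1e-6
    log_l = np.log(img + eps)              # work in log-luminance
    base = gaussian_filter(log_l, sigma)   # large-scale brightness (the local mean)
    detail = log_l - base                  # local variation around that mean
    out = np.exp(base_compress * base + detail_gain * detail)
    return np.clip(out / out.max(), 0.0, 1.0)

# Example: a dark half and a bright half, each carrying faint local texture.
rng = np.random.default_rng(0)
img = np.concatenate([
    0.05 + 0.02 * rng.random((64, 64)),   # shadows with subtle detail
    0.80 + 0.10 * rng.random((64, 64)),   # highlights with subtle detail
], axis=1)
mapped = local_tone_map(img)
[/CODE]

The point of the sketch is just that the large-scale brightness gets pulled toward the middle so it fits the display's range, while the small-scale variation riding on top of it gets expanded - roughly the behavior described above.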
On top of that, HDR TVs are supposed to have a substantially wider range of achievable brightness, as well as more quantization levels within that range - but that alone isn't the definition of HDR itself.
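The quantization part is simple arithmetic. The numbers below are only illustrative (a 1000-nit peak is a made-up figure for the comparison, and real HDR systems spread their code values along a nonlinear, perceptually motivated curve rather than in equal brightness steps), but they show why more bits matter:

[CODE]
# Rough, illustrative arithmetic on quantization levels.
sdr_levels = 2 ** 8    # 256 code values in typical 8-bit SDR
hdr_levels = 2 ** 10   # 1024 code values in typical 10-bit HDR

peak_nits = 1000       # hypothetical peak brightness, just for comparison
print(peak_nits / sdr_levels)   # ~3.9 nits per step if spread linearly
print(peak_nits / hdr_levels)   # ~0.98 nits per step if spread linearly
[/CODE]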
So - whether or not HDR is what you want depends on the type of imagery, and what you want to get out of it.
For some scientific and engineering work, you often want a consistent relationship between the nominal brightness and/or color and the displayed brightness and/or color, and the results of HDR-style processing could be disastrous - for example, if you want to visually classify ground cover based on the distribution of brightness, color, polarization, phase, etc. But for other purposes, such as detecting objects, you simply want to bring out local detail, which is what HDR is designed to do.
For recreational TV watching, it sounds like it might sometimes be a very good idea, but only if it is done "right", which in part means it is properly adapted to the statistics of the picture being displayed and to the characteristics of the display device.
On my TV at home, an old non-smart LCD TV without HDR, I routinely set the color saturation to maximum and the contrast quite high to gain detail in most areas. I bring the brightness down so large areas don't over-saturate, which would lose detail in the bright areas. But I do sometimes lose detail in the dark areas. HDR is much more sophisticated than that.
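For comparison, those global picture controls amount to roughly a gain (contrast) and an offset (brightness) applied to every pixel - that's a simplification on my part, not a description of any specific TV - which is why pushing them hard clips away detail at one end or the other. A tiny illustrative sketch, with made-up values:

[CODE]
import numpy as np

def global_adjust(img, contrast=1.8, brightness=-0.15):
    """Crude global picture controls: contrast as gain, brightness as offset."""
    return np.clip(img * contrast + brightness, 0.0, 1.0)

# Shadow pixels near black all clip to 0.0, so their detail is gone.
shadows = np.array([0.02, 0.05, 0.08])
print(global_adjust(shadows))               # [0. 0. 0.]
print(global_adjust(np.array([0.5])))       # [0.75] - midtones gain "pop"
[/CODE]

A local approach like the earlier sketch avoids that trade-off by treating the shadows and the highlights differently.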
There are a lot of image processing techniques which can be helpful in bringing out detail for some imagery and some purposes, but not others. This is one of them.
So - as long as I have a fair degree of control over it, I think I would prefer a TV that had HDR. But if the Osprey box really does take away that control, I wouldn't choose that box. Of course, to each their own.