Originally Posted by darinp2
I wonder how high OLED can go for average light level or how high they will be able to go in sets that try to handle HDR encodings well.
I also wonder whether that is just a BDA guideline, or a guideline that is likely to apply to HDR in general.
That mentions 1000-nit peaks, but Dolby has displays that go way beyond that, and my post was partly about what people would want absent technological limits (and partly about the future, since those limits tend to recede). It feels to me like 1000 nits was chosen largely because of current display limitations, but maybe it would have been chosen even if it were easy for consumer displays to reach 10000 nits or more.
Actually, that is not correct - the footnote says: "Over 1000 Nits should be limited to specular highlights."
There is no mention of an upper limit.
An APL of 400 Nits is incredibly bright. My Vizio P70 puts out a max of 400 Nits on an all-white screen at max brightness. I use that setting occasionally for bright-room viewing when direct sunlight is flooding the room, and it makes the TV pretty watchable (unlike the 65VT60 plasma, which was not). But even in that mode, the APL on content is far below 400 Nits, probably more like 200 to 267 Nits.
For dark-room viewing, I am calibrated to 100 Nits peak, meaning an APL of 50-67 Nits on content.
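For what it's worth, the APL arithmetic above can be sketched in a few lines. The 0.50-0.67 fraction is just an assumption drawn from the figures in this post, not a measured property of any particular content:

```python
# Rough APL estimate: typical content averages well below peak white.
# The 0.50-0.67 content fraction is an assumption based on the post above.

def estimated_apl(peak_nits, content_fraction):
    """Estimate average picture level (nits) for a given peak luminance."""
    return peak_nits * content_fraction

# Dark-room calibration at 100 nits peak:
low = estimated_apl(100, 0.50)
high = estimated_apl(100, 0.67)
print(f"APL roughly {low:.0f}-{high:.0f} nits")  # roughly 50-67 nits
```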
Whatever constitutes 'small specular highlights', it is almost certain that these HDR TVs will have a separate 'HDR Contrast' control that lets you crank them up or down independently of APL (which will continue to be governed by the Backlight and Contrast controls).
All this bitching and moaning about HDR being a gimmick really leaves me stumped. We are getting 10-bit (or even 12-bit) depth, a standardized WCG colorspace, and an improved EOTF for rendering shadow detail.
Anyone here on AVS should be cheering those changes and welcoming the opportunity to purchase a new TV that can exploit them.
The bitching and moaning should be reserved for fears that the entire initiative may die on the vine before taking hold, and that content capitalizing on this increased capability never materializes.
Since cinema is already encoded in DCI-P3, I'm pretty optimistic that the HDR train is leaving the station and that streaming content will be offered in this improved, cinema-like format. Whether UHD Blu-ray discs take hold is a different question entirely, and I share the general skepticism about whether a new generation of physical media will get the necessary support from the studios...
The upgraded HDR/UHD video format has far fewer barriers to widespread adoption than 3D ever did. 3D required filming in a different (and more expensive) way, while HDR/UHD amounts to letting the studios share with consumers the cinema-level encoding they have already done for theaters, which should ultimately cost even less than delivering a Rec.709 master today (or at least no more).
I really couldn't give a rat's *ss about what ends up happening with the brightness of small specular highlights. It is inevitable that the end user will have the controls to crank them up or down (assuming they have a TV with the required peak brightness).
I have no concerns about the peak brightness of LG's current WOLEDs - what is important to me is that they can interpret the HDR format and exploit the wider color gamut, increased bit-depth, and SMPTE EOTF...
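For anyone curious what that EOTF actually does: the SMPTE ST 2084 "PQ" curve maps a normalized code value (0 to 1) to absolute luminance in nits. Here is a minimal sketch using the constants defined in the ST 2084 spec — this is an illustration of the reference curve, not how any particular TV implements it:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized code value (0..1) -> luminance in nits.
# Constants as defined in the ST 2084 specification.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(code_value):
    """Convert a normalized PQ code value to absolute luminance (cd/m^2)."""
    e = code_value ** (1 / M2)
    numerator = max(e - C1, 0.0)
    denominator = C2 - C3 * e
    return 10000.0 * (numerator / denominator) ** (1 / M1)

print(round(pq_eotf(1.0)))  # 10000 -- full code value maps to 10000 nits
print(round(pq_eotf(0.0)))  # 0 -- code zero maps to black
```

The key point is that PQ encodes absolute luminance, so a display that tops out below 10000 nits has to tone-map the upper range — which is exactly where those "small specular highlights" controls come in.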