Originally Posted by imagic
Calibration stuff. What initial settings to use when setting up for calibration, which mode(s) to calibrate, and what hardware you'll need to ensure accuracy.
Yes, I figured that, especially considering the subject of the seminar. I could easily see Samsung setting up a seminar after getting a quote like the one below.
Quote from a (pre-seminar) review:
We doubt that the Samsung could reach 1,000 nits but the lower HDR target of 500 nits does seem achievable thanks to the peak illuminator feature.
Would a recently calibrated meter like the SpectraCal C6 be adequate? It is wide-gamut capable. Or would its field of view be too large to accurately measure a small-field highlight of 1,000 nits? What window size would you use to measure/calibrate highlights?
Do you calibrate to an absolute cd/m2 for reference white and let highlights take care of themselves? Dolby seemed to suggest that they wanted a specific cd/m2 output for a specific bit setting.
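For anyone curious about the "specific cd/m2 for a specific bit setting" point: HDR10 and Dolby Vision both use Dolby's PQ curve (standardized as SMPTE ST 2084), which is an absolute transfer function, so each code value maps to a fixed luminance rather than a fraction of peak white. A minimal sketch of that mapping, using the published ST 2084 constants (full-range 10-bit codes assumed here; actual video signals are usually limited range, 64-940):

```python
# SMPTE ST 2084 (PQ) constants, as defined in the specification.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(code: int, bit_depth: int = 10) -> float:
    """Absolute luminance (cd/m2) for a full-range PQ code value."""
    e = code / (2 ** bit_depth - 1)       # normalize code to [0, 1]
    ep = e ** (1 / M2)
    y = (max(ep - C1, 0.0) / (C2 - C3 * ep)) ** (1 / M1)
    return 10000.0 * y                    # PQ is anchored at 10,000 nits

def nits_to_pq(nits: float, bit_depth: int = 10) -> int:
    """Full-range PQ code value closest to a target luminance."""
    y = (nits / 10000.0) ** M1
    e = ((C1 + C2 * y) / (1 + C3 * y)) ** M2
    return round(e * (2 ** bit_depth - 1))
```

So a 1,000-nit highlight lands at roughly 75% of the code range, and 100-nit "SDR-level" white sits near the halfway point, which is why a PQ calibration targets an exact cd/m2 at each measured stimulus instead of a relative gamma.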
Does it require a specific range of contrast and backlight settings to realize good HDR performance?
Some of the calibration threads may be a better place to discuss this, but some owners/potential owners may be interested as well, even if they have no interest in playing around themselves. For example, I'm getting a "free" calibration as part of an extended warranty, and I would like to point the calibrator to some of the things discussed at the seminar, since they appear to be important to getting good results.
Related question: did they discuss the Samsung UHD Calibration Tool mentioned at 1:21 in the video below from CES? I've heard you plug the meter into the TV and let it do its thing instead of using a laptop, but I can find very little info on it. It might be a good starting point for a subsequent manual calibration.
Sorry for all the questions. I'm honestly a bit jealous that you were able to attend, but thanks for a great write-up.
Best regards, dave.