Some aspects of HDR that are not well defined or require some refinement...
1) How to properly calibrate an HDR display to ensure it is reproducing the intended imagery. As we know from years of experience with HD displays, out-of-box settings are not always optimized or accurate; displays benefit from a good calibration. There is work being done in the professional calibration community to develop the proper tools and patterns needed to calibrate an HDR display, but many current HDR displays lock you out of most of the picture controls when receiving an HDR signal. Do manufacturers plan to unlock these settings at some point in the future (perhaps once proper calibration tools and methods become available to consumers)? Right now it feels like we're flying blind with consumer HDR reproduction, and it doesn't help that the quality of the content varies considerably. For example, many have noted that the grading for Exodus: Gods and Kings appears too dark on the Ultra HD Blu-ray, some are seeing an orange/yellow push in Mad Max: Fury Road, and so on.
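For a sense of what those calibration patterns actually involve: a test-pattern generator mostly has to map target luminance levels to code values through the SMPTE ST 2084 (PQ) curve. Here's a minimal Python sketch of that step, assuming a 10-bit limited-range (64-940) signal; the nit targets at the bottom are just illustrative patch levels, not any official pattern set.

```python
# Minimal sketch: convert target luminance (nits) to 10-bit PQ code values
# for HDR grayscale test patches, using the SMPTE ST 2084 inverse EOTF.
# Assumes limited-range 10-bit video levels (64 = black, 940 = peak).

M1 = 2610 / 16384            # PQ constants from ST 2084
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_inverse_eotf(nits: float) -> float:
    """Luminance in cd/m^2 -> nonlinear PQ signal value in [0, 1]."""
    y = max(nits, 0.0) / 10000.0      # PQ is defined over 0-10,000 nits
    y_m1 = y ** M1
    return ((C1 + C2 * y_m1) / (1 + C3 * y_m1)) ** M2

def pq_to_10bit_limited(nits: float) -> int:
    """Map a nit target to a 10-bit limited-range code value."""
    return round(64 + 876 * pq_inverse_eotf(nits))

if __name__ == "__main__":
    for target in (0.1, 1, 10, 100, 400, 1000, 4000, 10000):
        print(f"{target:>7} nits -> code {pq_to_10bit_limited(target)}")
```

(For reference, 100 nits lands around code 509, a little over half the signal range, which is why HDR controls behave so differently from the SDR controls we're used to adjusting.)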
2) HDR to SDR conversion - Rhetorical question: what do you get when you combine HDR content with an HDR-capable source device and a display that does not support Rec. 2020 signaling? Obviously, some sort of conversion needs to take place. Does the source device always target a specific set of SDR specs for this conversion (e.g. 100 nit peak brightness and Rec. 709 color gamut), or does it have more flexibility in mapping the content to the capabilities of the display? For instance, if the display can't accept Rec. 2020 signaling or SMPTE ST 2086 metadata, but can reproduce a wider color gamut than Rec. 709 and/or can hit 300-400 nits, can the source device remap the signal in such a way as to utilize those capabilities?

My assumption is that the source device always does its best to remap the signal to 100 nits Rec. 709, and that it is then up to the consumer to decide whether to let the display expand that back out to whatever it can reproduce. Regardless, what the consumer sees won't be an accurate representation of either the original HDR content or any SDR grade the content creator has done. Content creators would likely say that you shouldn't be playing HDR content on a display that isn't fully HDR-compliant and should play the SDR version instead. But hardware manufacturers have never been opposed to providing features that "improve" the content if it means they can sell you new hardware. Given that, and the reality that people will mix and match hardware and content of different quality levels and capabilities, it would be nice if there were documentation available to consumers explaining what to expect when doing so and how to optimize their setup short of replacing every piece of equipment they own with UHD Premium (or equivalent) gear.
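To make the conversion question concrete, here's a rough Python sketch of the kind of remapping a source device might perform. This is not what any particular player actually implements (real devices use proprietary tone mapping, and typically tone map luminance rather than each RGB channel independently); the peak-luminance and knee values are assumptions chosen for illustration. It just shows the three basic steps: decode the PQ signal to linear light, compress the luminance range down to a 100 nit SDR target, and convert the gamut from Rec. 2020 to Rec. 709 primaries.

```python
import numpy as np

# PQ (SMPTE ST 2084) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

# Standard matrix for converting linear Rec. 2020 RGB to linear Rec. 709 RGB
BT2020_TO_BT709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

def pq_eotf(signal):
    """PQ nonlinear signal in [0, 1] -> absolute luminance in nits (0-10,000)."""
    e = np.power(np.clip(signal, 0.0, 1.0), 1.0 / M2)
    return 10000.0 * np.power(np.maximum(e - C1, 0.0) / (C2 - C3 * e), 1.0 / M1)

def tone_map(nits, sdr_peak=100.0, knee=0.75):
    """Illustrative roll-off: pass levels below knee*sdr_peak through untouched,
    then compress everything above into the remaining headroom below sdr_peak."""
    knee_nits = knee * sdr_peak
    over = np.maximum(nits - knee_nits, 0.0)
    headroom = sdr_peak - knee_nits
    return np.minimum(nits, knee_nits) + headroom * over / (over + headroom)

def hdr10_to_sdr(pq_rgb_2020, sdr_peak=100.0):
    """pq_rgb_2020: float array (..., 3) of PQ-encoded Rec. 2020 RGB in [0, 1].
    Returns gamma-2.2-encoded Rec. 709 RGB in [0, 1]."""
    linear_nits = pq_eotf(pq_rgb_2020)                    # to absolute linear light
    tone_mapped = tone_map(linear_nits, sdr_peak)         # compress into SDR range
    rgb709 = (tone_mapped / sdr_peak) @ BT2020_TO_BT709.T # gamut conversion, 1.0 = SDR peak
    rgb709 = np.clip(rgb709, 0.0, 1.0)                    # out-of-gamut colors simply clip
    return np.power(rgb709, 1.0 / 2.2)                    # simple display gamma encode

if __name__ == "__main__":
    pixel = np.array([0.6, 0.5, 0.4])   # one PQ-encoded Rec. 2020 pixel (arbitrary values)
    print(hdr10_to_sdr(pixel))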