Well, I guess they could just encode HDR content as if it were normal content and then apply a stretch factor to account for the higher peak white. E.g. if a movie studio wants the peak white for a specific movie to be twice as bright as usual, the stretch factor for every Y (luma) value would simply be 0.5 — that way normal reference white still ends up at its usual brightness, while the upper half of the range carries the extra highlight detail. Doing it like this would let movie studios use a different peak white value for every movie (see the sketch below).
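Roughly what I mean, as a minimal sketch (the function names, the 100-nit reference and the factor are all just my assumptions, not from any spec):

```python
# Sketch of the per-title "stretch factor" idea.
# A factor of 0.5 means the title's peak white is twice the SDR reference,
# so full-scale luma comes out twice as bright while a luma code of 0.5
# still lands on ordinary SDR reference white.

SDR_REFERENCE_NITS = 100.0  # assumed SDR reference white

def display_nits(normalized_luma, stretch_factor):
    """Map a normalized luma code (0..1) to output brightness in nits."""
    return normalized_luma * SDR_REFERENCE_NITS / stretch_factor

print(display_nits(1.0, 0.5))  # 200.0 -> the title's doubled peak white
print(display_nits(0.5, 0.5))  # 100.0 -> normal SDR reference white
```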
I do wonder what happens to movies with HDR information if they're displayed on a "normal" display. Are they just extra dim there, with all the HDR information still visible? Or is the HDR information cut away so that the movie is normally bright but without any HDR detail? I guess it should be the latter, but will that still look "ok" with the HDR information clipped away? And how does the Blu-Ray player know whether the display supports increased peak white or not? Seems to me that a lot of things still need to be sorted out in that area. Quite possibly a new HDMI version could be necessary, or at least a new EDID extension, so that displays can communicate their peak white capability (and current setting) to the 4K Blu-Ray player.
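The handshake I'm imagining would look something like this — purely hypothetical, none of these fields or names come from a real EDID/HDMI spec:

```python
# Hypothetical capability handshake: the display advertises its peak white
# (via some future EDID/HDMI extension) and the player picks a strategy.

from dataclasses import dataclass

@dataclass
class DisplayCaps:
    peak_white_nits: float  # what the display claims it can output

def choose_output_mode(title_peak_nits: float, caps: DisplayCaps) -> str:
    if caps.peak_white_nits >= title_peak_nits:
        return "pass-through"  # display can show the title's full range
    return "tone-map"          # condense the highlights (see the edit below)

print(choose_output_mode(200.0, DisplayCaps(peak_white_nits=100.0)))  # tone-map
```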
Edit: If the display can't handle the movie's peak white value, the 4K Blu-Ray player should probably condense the HDR information into the near-peak-white area the display can still show, instead of just cutting it away.
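A simple way to picture that "condensing": leave the signal alone below some knee point and squeeze everything above it into the remaining headroom. The knee position and the linear squeeze here are arbitrary choices of mine, not from any standard:

```python
# Soft-knee sketch: below `knee` the signal passes through unchanged; above
# it, the title's extra range is compressed so its peak lands exactly on the
# display's peak. Luma values are relative, with 1.0 = SDR reference white.

def soft_knee(luma, display_peak=1.0, knee=0.8, title_peak=2.0):
    """Compress luma above `knee` so title_peak maps onto display_peak."""
    if luma <= knee:
        return luma
    # Linearly squeeze the remaining headroom (a real tone mapper would use
    # a smoother curve, but the principle is the same).
    return knee + (luma - knee) * (display_peak - knee) / (title_peak - knee)

for y in (0.5, 0.8, 1.0, 1.5, 2.0):
    print(y, "->", round(soft_knee(y), 3))
# 0.5 and 0.8 pass through; 2.0 (the title's peak) comes out at exactly 1.0
```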