Is 8bit HDR or HLG a "joke"? Here is an HDR HLG video shot from the same camera
using both its internal 8bit 420 Slog2 XAVC recording and Slog2 recorded externally from 14bit RAW to 10bit ProRes HQ. So the 8bit is heavily compressed (100 Mbps), the 10bit is not (950 Mbps!). Totally unfair to the 8bit. But, let's see...
Same scenes - for each, the first version is from the Slog2 ProRes 10bit, the second from the XAVC 8bit Slog2, placed on the same timeline. The video was rendered in HLG 10bit DNxHR HQ in Resolve Studio and sent to YouTube. No color correction other than the Resolve presets converting Slog2 to HLG and level adjustments. It is a real HLG video. You can easily tell which is which, since the 10bit source from RAW is 4K DCI and the 8bit source is 4K UHD.
The video has plenty of blue sky, ripe for banding, plus lots of detail and high contrast, with those pesky white clouds ripe for blowouts.
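For anyone wondering why blue sky in particular is the banding torture test, here is a minimal sketch of the arithmetic. It is a toy model only - it ignores the actual Slog2/HLG transfer curves, chroma subsampling, and codec compression, and the gradient range and pixel count are assumed numbers - but it shows how few code values an 8bit signal has to spread across a gentle sky gradient compared with 10bit:

```python
import numpy as np

# Toy illustration: quantize the same gentle "sky" luminance ramp to
# 8-bit and 10-bit code values and count how many distinct steps survive.
# Assumed numbers: the gradient spans only ~10% of full signal range,
# sampled across 2000 horizontal pixels. Real footage (Slog2/HLG curves,
# 4:2:0 subsampling, codec compression) will differ in detail.

gradient = np.linspace(0.60, 0.70, 2000)

levels_8bit = np.unique(np.round(gradient * 255)).size    # 8-bit: 256 codes over full range
levels_10bit = np.unique(np.round(gradient * 1023)).size  # 10-bit: 1024 codes over full range

print(f"8bit : {levels_8bit} distinct steps across the gradient")   # roughly 26 steps
print(f"10bit: {levels_10bit} distinct steps across the gradient")  # roughly 103 steps
```

Roughly four times as many steps in 10bit over the same gradient, which is the theoretical reason the 8bit clips should band first. Whether that difference is actually visible after the HLG conversion and YouTube's processing is exactly what the test is meant to show.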
There is a difference. But does it matter? Is it fixable? A laughable joke?
If you do not like the test ("the test is a joke," "the test tells me nothing," "YouTube converts to 8bit anyway, so what?"), suggest what you would want to see. I can do anything with the original Slog2 clips, including making them available for download. The point is to look at actual video rather than rely only on theory that may be true but may concern differences our eyes cannot see.
Finally, why do we care about this? ALL Sony mirrorless cameras and camcorders under $8,000 that have log gammas (Slog2/3) and/or HLG shoot 4K only in 8bit 420 - the RXxxx series, the A7xxx and A6xxx series, and the FS5. If shooting in log or HLG at 8bit to make HDR videos is a joke, that would call into question what Sony is trying to sell, and what anyone should consider buying (note, the GH5 can only do 10bit 422 in 4K30p, not 4K60p). No Canon cameras/camcorders do 4K 10bit below $8,000 either, and they have log gammas. Is this a big deal?