HDR indeed only makes sense on 10+ bit displays. 8-bit would have way too much banding to be usable. It already HAS plenty of banding, even in the scenes it's expected to display (8 bpc 4:2:0 Blu-ray content).
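To make the banding point concrete, here's a toy sketch (just NumPy, nothing video-specific) that quantizes a smooth ramp at 8 and 10 bits and counts how many distinct steps survive:

```python
# Quantize a "continuous" 0..1 gradient to n-bit code values and count the
# distinct steps. Stretching the same number of steps across a much brighter
# range is what turns quantization into visible banding.
import numpy as np

gradient = np.linspace(0.0, 1.0, 100_000)

for bits in (8, 10):
    levels = 2**bits - 1
    quantized = np.round(gradient * levels) / levels
    print(f"{bits}-bit: {len(np.unique(quantized))} distinct steps")
    # 8-bit: 256 steps; 10-bit: 1024 steps.
```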
Increasing the peak white level in nits means you need more bits per channel; otherwise there won't be enough gradations in the signal to give a smooth transition, not only in greyscale (luma) but also in the chroma components. You can convert between YCbCr (luma plus two chroma channels) or other color spaces and RGB using any number of bits, but those bits are typically the same per component to make the conversion back and forth (which happens often) straightforward; a sketch of that round trip is below.
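Here's a minimal sketch of that round trip using the BT.709 luma coefficients (Kr = 0.2126, Kb = 0.0722) on normalized 0..1 components. Real pipelines add limited-range offsets and integer quantization, which I've left out:

```python
# RGB <-> YCbCr round trip, full-range, floating point.
KR, KB = 0.2126, 0.0722
KG = 1.0 - KR - KB

def rgb_to_ycbcr(r, g, b):
    y  = KR * r + KG * g + KB * b       # luma
    cb = (b - y) / (2.0 * (1.0 - KB))   # blue-difference chroma
    cr = (r - y) / (2.0 * (1.0 - KR))   # red-difference chroma
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 2.0 * (1.0 - KR) * cr
    b = y + 2.0 * (1.0 - KB) * cb
    g = (y - KR * r - KB * b) / KG
    return r, g, b

# Lossless in floating point; quantizing every component to the same bit
# depth (e.g. 10 bits) is what keeps this conversion cheap in hardware.
print(ycbcr_to_rgb(*rgb_to_ycbcr(0.5, 0.25, 0.75)))  # -> (0.5, 0.25, 0.75)
```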
When you increase the peak white from 100 nits to 2,600 nits, as the UHD HDR standard does, you need a minimum of 10 bits per channel to avoid banding. To go beyond 2,600 nits, you need 12 bits.
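You can get a feel for why by running adjacent code values through the SMPTE ST 2084 (PQ) EOTF that UHD HDR uses and looking at the size of each step in nits. The constants below are from the ST 2084 spec; the step comparison is just arithmetic:

```python
# ST 2084 (PQ) EOTF: map an n-bit full-range code value to nits, then
# compare the size of the topmost code step at different bit depths.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(code, bits):
    """Decode an n-bit PQ code value to absolute luminance in nits."""
    e = code / (2**bits - 1)           # normalize to 0..1
    p = e ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

for bits in (8, 10, 12):
    top = 2**bits - 1
    step = pq_eotf(top, bits) - pq_eotf(top - 1, bits)
    # Fewer bits means bigger jumps between adjacent brightness levels.
    print(f"{bits}-bit: last step is about {step:.0f} nits")
```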
Deep Color has been supported in HDMI since version 1.3, meaning 30 or 36 bits total (10 or 12 bits per channel), but most of that data is merely extrapolated from 8-bit sources. With UHD Blu-ray HDR titles, you actually get content that's specifically graded for HDR. Dolby Vision is a separate spec that takes 12-bit video, encodes it as 10-bit 4:4:4 YCbCr SDR sent over the wire, and decodes it in the TV as 12-bit 4:2:0 HDR with a 10,000-nit peak white in the signal, plus metadata in the signal saying what the min and max white of the signal are, so that the TV can auto-calibrate itself to reproduce the signal the best it can.
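As a hand-wavy illustration of what that auto-calibration amounts to: the metadata carries the mastering display's min/max luminance, and the TV tone-maps from that range into its own. The field names and the simple knee/rolloff below are my own invention for the sketch, not Dolby's actual algorithm:

```python
# Toy tone mapping driven by signal metadata: keep most of the range 1:1,
# then roll off highlights above a knee so they don't all clip to peak.
from dataclasses import dataclass

@dataclass
class HdrMetadata:
    min_nits: float   # mastering display minimum luminance
    max_nits: float   # mastering display peak luminance

def tone_map(nits, meta, display_peak):
    """Compress signal luminance into what this display can actually show."""
    if meta.max_nits <= display_peak:
        return nits                  # display covers the signal range 1:1
    knee = display_peak * 0.75
    if nits <= knee:
        return nits                  # leave most of the image untouched
    # Linearly compress [knee, signal peak] into [knee, display peak].
    span = meta.max_nits - knee
    return knee + (nits - knee) / span * (display_peak - knee)

meta = HdrMetadata(min_nits=0.005, max_nits=4000.0)
for n in (100, 800, 4000):
    print(n, "->", round(tone_map(n, meta, display_peak=1000.0), 1))
```

Without the metadata the TV has to guess how bright the grade is; with it, a 1,000-nit panel knows exactly how much highlight rolloff a 4,000-nit master needs.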