I haven't watched the video yet, but...
8 bits is not quite enough for normal displays with smooth gradients, and you will see subtle banding.
Add 1/2 bit of noise before quantization, and the banding is imperceptible.
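To make that concrete, here's a toy sketch of the trick (plain rounding vs. adding ±1/2 LSB of uniform noise before rounding). The function names and the 8-bit code values are just mine for illustration:

```python
import random

def quantize(x, bits=8):
    """Quantize a value in [0, 1] straight to an integer code."""
    levels = (1 << bits) - 1
    return round(x * levels)

def quantize_dithered(x, bits=8):
    """Add +/- 1/2 LSB of uniform noise before rounding, so over many
    samples the average output tracks the true value: hard banding steps
    turn into fine noise."""
    levels = (1 << bits) - 1
    noise = random.random() - 0.5  # uniform in [-0.5, 0.5) LSB
    return max(0, min(levels, round(x * levels + noise)))

# A value exactly between two 8-bit codes: plain quantization always snaps
# to one code, while dithering lands on each neighbour about half the time,
# so the average recovers the in-between value.
x = 100.5 / 255
plain = quantize(x)
avg = sum(quantize_dithered(x) for _ in range(10000)) / 10000
print(plain)            # a single fixed code (100 or 101)
print(round(avg, 2))    # close to 100.5
```

Spatially this is why a dithered gradient looks smooth: neighbouring pixels with the same underlying value get a mix of adjacent codes, and your eye averages them.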
Often, cameras already have that much noise present, and some sources (including 3D rendering) add noise for this reason.
Unfortunately, that kind of subtle high frequency noise gets lost with lossy compression, although it works great for uncompressed signals.
Aside from compression, the other big culprit is multiple stages of quantization. For example: take an 8 bit source, process it (e.g. color correction), send out an 8 bit signal over HDMI, process it again, then show it on an 8 bit display device, and you get 3 stages of quantization. This results in strong banding, because the rounding errors compound at each stage. The solution is to do all the processing in higher precision and never quantize back down to 8 bits along the way: take an 8 bit source, process, transmit in 10 bits, process, then display on a 10 bit device.
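A quick way to see the compounding is to count how many distinct levels of a smooth ramp survive the chain. This is a made-up two-stage grade (the 0.8 gain and 0.05 lift are arbitrary numbers of mine, not anything standard), quantized after each stage at either 8 or 12 bits before the final 8-bit display:

```python
def q(x, bits):
    """Clamp to [0, 1] and round to the nearest code at the given bit depth."""
    levels = (1 << bits) - 1
    return round(min(max(x, 0.0), 1.0) * levels) / levels

def pipeline(x, stage_bits):
    x = q(x * 0.8 + 0.05, stage_bits)    # stage 1: a hypothetical grade
    x = q((x - 0.05) / 0.8, stage_bits)  # stage 2: undo it downstream
    return q(x, 8)                        # final 8-bit display quantization

ramp = [i / 1023 for i in range(1024)]   # smooth gradient input
out8  = sorted({pipeline(x, 8)  for x in ramp})
out12 = sorted({pipeline(x, 12) for x in ramp})
print(len(out8), len(out12))  # 8-bit intermediates keep fewer distinct levels
```

The 8-bit-everywhere chain throws away a big chunk of the 256 display codes (every missing code is a wider, more visible band), while 12-bit intermediates preserve essentially all of them.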
Another culprit is high dynamic range. The wider the dynamic range, the more bits you need to keep each quantization step below what your eye can distinguish. 8 bits starts to fall really short as displays get better.
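Rough back-of-envelope for why, assuming the eye can spot roughly a 1% brightness step (a crude stand-in for the real contrast-sensitivity curves; the contrast ratios are ballpark figures, not any particular display):

```python
import math

def steps_needed(contrast_ratio, jnd=0.01):
    """Number of multiplicative steps of size (1 + jnd) needed to cover
    the contrast range without any step exceeding the visibility threshold."""
    return math.log(contrast_ratio) / math.log(1 + jnd)

for name, contrast in [("~SDR 1000:1", 1_000), ("~HDR 1000000:1", 1_000_000)]:
    n = steps_needed(contrast)
    print(f"{name}: ~{n:.0f} steps, ~{math.log2(n):.1f} bits")
```

Under those assumptions an SDR-ish range needs around 700 steps, which a gamma-coded 8 bits only barely covers, while an HDR-ish range needs well over a thousand, which is part of why HDR formats specify 10 or 12 bits with a perceptual transfer curve.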