Thanks for the link. You're right, it is too technical for most. But unfortunately, it is also dated.
It is a long read, and I must admit that I only skimmed it. But all of the distortion concerns it raises are things we worried about in the '90s and have since put to bed. From your synopsis:
Roughly speaking, the jitter and distortion of an optical signal will (**depending on the DAC it is fed to**) degrade the sound audibly . . .
This is all true, but I have bolded the key phrase that makes it no longer relevant. The optical signal in question no longer feeds the DAC.
Jitter is only detrimental to the clock that drives the DAC (or the ADC, when recording). In the 'olden' days, we used to recover the clock from the S/PDIF stream (whether optical or coax) and send it directly into the DAC. Here, clock jitter could cause issues.
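If you want a feel for the numbers, here's a back-of-the-envelope sketch (my own illustration, not from the article, with assumed figures): sample a 10 kHz tone with a clock whose edges wander by a couple of nanoseconds RMS and look at how far the resulting error floor sits below the signal.

```python
# Toy model: clock jitter becomes amplitude error at the sampling instant.
# All numbers (sample rate, tone, jitter) are illustrative assumptions.
import numpy as np

fs = 48_000.0          # nominal sample rate (Hz)
f0 = 10_000.0          # test tone (Hz)
jitter_rms = 2e-9      # 2 ns RMS clock jitter (assumed)
n = 1 << 16

rng = np.random.default_rng(0)
t_ideal = np.arange(n) / fs
t_jittered = t_ideal + rng.normal(0.0, jitter_rms, n)

ideal = np.sin(2 * np.pi * f0 * t_ideal)
jittered = np.sin(2 * np.pi * f0 * t_jittered)

err = jittered - ideal
snr_db = 10 * np.log10(np.mean(ideal**2) / np.mean(err**2))
print(f"error floor relative to the tone: {snr_db:.1f} dB")
# Rule of thumb: error ~ 2*pi*f0*jitter_rms, i.e. roughly -78 dB here.
```

The point is only that the error scales with both the tone frequency and the jitter, which is why the clock that actually times the conversion is the one that matters.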
Later, we would remove the jitter with phase-locked loops (PLLs). But that added cost, so the cheaper devices just kept the jitter.
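To see why a PLL helps, think of it as a low-pass filter on the recovered clock's phase: fast jitter gets filtered out, and only slow wander (which the DAC tolerates) gets through. A toy model, with loop bandwidth and jitter figures of my own choosing:

```python
# Toy first-order PLL: the output phase slews toward the input phase with a
# time constant set by the loop bandwidth, attenuating fast phase noise.
# Edge rate, loop bandwidth, and jitter are assumed, illustrative values.
import numpy as np

edge_rate = 3.072e6     # incoming clock edges per second (illustrative)
loop_bw = 1_000.0       # PLL loop bandwidth in Hz (design choice)
jitter_rms_in = 5e-9    # incoming jitter, 5 ns RMS (assumed)
n = 200_000

rng = np.random.default_rng(1)
phase_in = rng.normal(0.0, jitter_rms_in, n)   # phase error of each incoming edge (s)

alpha = 2 * np.pi * loop_bw / edge_rate
phase_out = np.empty(n)
acc = 0.0
for i, p in enumerate(phase_in):
    acc += alpha * (p - acc)   # one-pole tracking of the input phase
    phase_out[i] = acc

print(f"jitter in : {np.std(phase_in) * 1e9:6.2f} ns RMS")
print(f"jitter out: {np.std(phase_out) * 1e9:6.2f} ns RMS")
# The narrower the loop bandwidth, the more of the fast jitter is removed.
```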
Today, every device has an embedded DSP, and those DSPs typically include an asynchronous sample-rate converter (ASRC). The incoming S/PDIF stream is resampled to the DSP's internal clock, which drives the DAC. Now, the jitter is whatever the designer chooses it to be when she selects the DSP's clock, and you can count on it being inaudible.
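Conceptually, the ASRC does something like the sketch below. This is a simplification (a linear interpolator standing in for the polyphase filters a real converter uses, and example rates I picked), but the timing principle is the same: output samples are produced on the local clock, not the S/PDIF link's clock.

```python
# Simplified ASRC sketch: read the incoming stream at a fractional position
# that advances by one *local-clock* period per output sample.
# Rates, tone, and the linear interpolator are illustrative simplifications.
import numpy as np

fs_in = 44_100.0     # rate of the incoming S/PDIF stream (its own clock domain)
fs_out = 48_000.0    # DSP's internal rate, set by the local low-jitter oscillator
f0 = 1_000.0         # 1 kHz test tone

t_in = np.arange(4096) / fs_in
x_in = np.sin(2 * np.pi * f0 * t_in)

ratio = fs_in / fs_out
out = []
pos = 0.0                      # fractional read position into the input stream
while pos < len(x_in) - 1:
    i = int(pos)
    frac = pos - i
    # Interpolate between neighbouring input samples (real ASRCs use
    # polyphase filters; the clock-domain argument is unchanged).
    out.append((1.0 - frac) * x_in[i] + frac * x_in[i + 1])
    pos += ratio               # advance by one output-clock period

out = np.asarray(out)
print(f"in: {len(x_in)} samples @ {fs_in:.0f} Hz -> out: {len(out)} samples @ {fs_out:.0f} Hz")
# The output instants are defined entirely by the local clock, so the jitter
# that reaches the DAC is the local oscillator's, not the S/PDIF link's.
```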
We can thank HDMI for forcing a solution to this issue, because the difficulty of synchronizing to the recovered audio clock from HDMI is one factor that made ASRC so attractive.