Dan may well be right about that, but I think he'd agree that even these digital clocking errors (if they indeed occur) would have little effect on perceived sound.
Significant errors would likely cause complete dropouts anyway, so if you aren't hearing dropouts, you're probably hearing everything. I'm in the "bits is bits" crowd, so it only makes sense to me that a digital bitstream decoded externally is either 100% right or 100% wrong.
There are no degrees of "good" or "bad."
The DD/AC3 standard includes provisions for error detection/correction/concealment. The first decode stage is an error check, which likely uses a number of check bits to verify the integrity of each data word, with at least single-error correction in most implementations. The final decode stage applies overlap/add error concealment if necessary, generally repeating the last known good data block when an error is irrecoverable, with muting as a last resort.
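If it helps to make that concrete, here's a rough Python sketch of the conceal-or-mute idea. It's purely illustrative, not actual AC-3 decoder code; the frame format, crc_ok(), decode_frame(), and the mute threshold are all made up for the example.

    BLOCK = 256        # assumed samples per decoded block, for illustration
    MUTE_AFTER = 3     # assumed: consecutive bad frames tolerated before muting

    def crc_ok(frame):
        # Stand-in integrity check; a real decoder verifies CRC words
        # embedded in each frame.
        return frame.get("crc_valid", False)

    def decode_frame(frame):
        # Stand-in decode; a real decoder reconstructs PCM from the
        # coded frame data.
        return frame["samples"]

    def silence():
        return [0] * BLOCK

    def conceal(frames):
        """Yield one PCM block per frame, repeating the last good block
        on an irrecoverable error and muting after a run of bad frames."""
        last_good = None
        bad_run = 0
        for frame in frames:
            if crc_ok(frame):
                last_good = decode_frame(frame)   # normal decode path
                bad_run = 0
                yield last_good
            else:
                bad_run += 1
                if last_good is not None and bad_run < MUTE_AFTER:
                    yield last_good               # conceal: hold last good block
                else:
                    yield silence()               # last resort: mute

    # e.g. two frames, the second corrupted: its block repeats the first
    frames = [{"crc_valid": True, "samples": [1] * BLOCK}, {"crc_valid": False}]
    blocks = list(conceal(frames))

The upshot is that a single corrupted frame just repeats a short slice of audio rather than producing garbage, which is why small data errors tend to pass unnoticed while gross ones produce obvious dropouts.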
Keep in mind this refers to data errors, not clocking errors, but depending on how DD is transferred and decoded, I would expect this to be a non-issue. With CD audio/PCM over SPDIF, the timing of the samples can have audible effects, especially if that data is not buffered and reclocked by the D/A converter. A compressed/encoded data format such as DD is likely buffered at both ends, with clocking information stripped from the decoded data and the decoded samples reclocked, so any jitter would have to be introduced by the decoder itself.
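The buffer-and-reclock arrangement I'm describing boils down to something like this, again just a toy Python sketch (the FIFO depth is arbitrary, and real receivers do this in hardware):

    import collections

    class ReclockingBuffer:
        """FIFO between a jittery input and a steady local output clock."""

        def __init__(self, depth=64):   # depth is an arbitrary choice here
            self.fifo = collections.deque(maxlen=depth)

        def push(self, sample):
            # Called whenever data happens to arrive; timing can be as
            # irregular as the transport makes it.
            self.fifo.append(sample)

        def pop(self):
            # Called by the fixed-rate local sample clock. Output timing is
            # set entirely by that clock, so input jitter never reaches the
            # D/A as long as the buffer neither underruns nor overflows.
            return self.fifo.popleft() if self.fifo else 0  # underrun -> silence

As long as the local clock is clean and the buffer stays partly full, the arrival-time jitter on the input side simply disappears at the output.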