Well, with the particular DAC I was using in my tests - again, one of the very first prototypes of silicon that was going to be included in an SoC - I was able to take its SNR from the 90dB range all the way down to the 70dB range (or maybe even less, if I remember correctly) by using a VCO instead of a VCXO, i.e., by increasing jitter dramatically.
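For a sense of scale, the standard jitter-limited SNR formula for a full-scale sine wave shows why a roughly 10x increase in RMS clock jitter costs about 20 dB. The frequencies and jitter values below are illustrative guesses, not the actual numbers from the prototype described here:

```python
import math

# Jitter-limited SNR for a full-scale sinusoid:
#   SNR = -20 * log10(2 * pi * f_in * t_jitter)
# where f_in is the signal frequency and t_jitter the RMS clock jitter.
def jitter_limited_snr_db(f_in_hz: float, t_jitter_s: float) -> float:
    """SNR ceiling (dB) imposed by sampling-clock jitter alone."""
    return -20.0 * math.log10(2.0 * math.pi * f_in_hz * t_jitter_s)

f_in = 20e3  # 20 kHz test tone (assumed, for illustration)

# ~250 ps RMS jitter (VCXO-class): SNR ceiling around 90 dB
print(round(jitter_limited_snr_db(f_in, 250e-12), 1))  # -> 90.1

# 10x worse jitter (free-running VCO): ceiling drops ~20 dB, to ~70 dB
print(round(jitter_limited_snr_db(f_in, 2.5e-9), 1))   # -> 70.1
```

Every factor-of-ten increase in jitter knocks 20 dB off the ceiling, which is why swapping the VCXO for a much noisier VCO can move a DAC from the 90dB range into the 70dB range.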
Removing the crystal was not industry practice, of course. No designs that I knew of at the time had tried such a thing. We were trying to see just how much cost we could cut from the design before the DAC performance really suffered (the crystal in the VCXO was expensive, at least relative to the other components in the circuit).
So what I was doing was really a worst-case-ever scenario: a DAC design whose performance was susceptible to jitter, surrounded by circuitry designed in a way that made the jitter horrible. The combination led to really poor performance.
Needless to say, we never went into production with that design.
But it did show that if you picked the worst-case DAC and then really did a poor design around it, it was possible to make jitter an issue.
Whether any other companies ever went into production with something so bad, well, I doubt it. But then again, you never know what 'audiophile' companies will do!