Originally Posted by Roger Dressler
Why can't you use a 1-bit FIFO for time delay of DSD? Dolby did that for their Time Link delay back before LPCM was fashionable. Worked fine.
Sorry, Roger, I shouldn't have said "you couldn't" do delays with SACD. I should have said "you couldn't easily" do delays with SACD.
In a previous post I wrote, "The problem with "DSD" is that it is completely unwieldy to work with for any signal processing. Even a simple time delay is a giant pain. The signal is only 1 bit wide, so it is a mile long and coming in at 64x speed. To build a time delay would require a special buffer. Then you would need six of them. (Thankfully, there is no such thing as 7.1 channel SACD!)"
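To put rough numbers on "a mile long and coming in at 64x speed" (a back-of-envelope sketch; 2.8224 MHz is the standard DSD64 bit rate, i.e. 64 × 44.1 kHz, and the 10 ms delay and helper name are just my illustration):

```python
# Back-of-envelope: FIFO storage needed to delay DSD64 streams.
DSD64_RATE = 64 * 44_100  # 2,822,400 one-bit samples per second, per channel

def fifo_bits(delay_ms: float, channels: int = 6) -> int:
    """Total buffer size in bits for a given delay across all channels."""
    return round(DSD64_RATE * delay_ms / 1000) * channels

# A 10 ms speaker-distance delay across the 6 channels of multichannel SACD:
total = fifo_bits(10, 6)
print(total, "bits =", total // 8, "bytes")  # 169344 bits = 21168 bytes
```

Not a huge amount of memory by modern standards, but it has to be written and read one bit at a time at 2.8 MHz per channel, which is the awkward part.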
Nearly all specialist manufacturers need to buy either SSP "engines" or pre-programmed DSP chips to build an SSP. To the best of my knowledge, there is no "off-the-shelf" solution that includes any signal-processing capability for "DSD" signals.
Doing a 1-bit-wide FIFO in an FPGA is quite easy. Doing it in a DSP chip is a giant pain in the neck. There is no memory for this in the DSP chip; it's all off-chip. So the DSP chip would have to take "chunks" of 1-bit DSD data, format them into (say) 32-bit "blocks" to send to the off-chip memory, and then re-serialize them back into a 1-bit stream when needed.
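The pack-then-re-serialize step looks roughly like this (an illustrative Python sketch, not DSP code; the function names are my own):

```python
# Sketch of the chunking described above: group a 1-bit DSD stream into
# 32-bit words for off-chip memory, then re-serialize on the way out.

def pack_bits(bits):
    """Pack a list of 0/1 samples (MSB first) into 32-bit words."""
    assert len(bits) % 32 == 0
    words = []
    for i in range(0, len(bits), 32):
        word = 0
        for b in bits[i:i + 32]:
            word = (word << 1) | (b & 1)
        words.append(word)
    return words

def unpack_bits(words):
    """Re-serialize 32-bit words back into the original 1-bit stream."""
    return [(w >> (31 - j)) & 1 for w in words for j in range(32)]

stream = [1, 0] * 32                              # 64 one-bit samples
assert unpack_bits(pack_bits(stream)) == stream   # lossless round trip
```

In an FPGA the shift registers to do this are nearly free, which is why the FIFO is easy there; in a DSP chip this packing/unpacking burns instruction cycles on every single bit.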
Again, if it were easy, everyone would be doing it.
The biggest problem is that 99% of the customers with big home theater rigs don't give a fig about multi-channel SACD. So there is very little incentive to put all the time and money into developing this kind of thing, especially when there are a million other new technologies that you are forced into keeping up with. 3D video, anyone???