That's a good question about sound quality. Let me answer the other question first. Yes, a "manual" switch that is just controlled by remote would be easier to manage (in my experience).
OK, now onto audio (which is truly a topic I enjoy):
Back in the stone age days, we had only two-channel (and then quad-channel) analog audio outputs from our cassette decks, reel-to-reel players, LD players, turntables, etc. Two-channel RCA analog output was the only choice (after quad died).
Once the CD came out, people started complaining that the sound wasn't "real" enough, and the blame was placed on the digital-to-analog converters used in the players. So, in a truly inspired decision, the industry said, "let's provide a digital output between the player and the receiver and make more money." And so they did. People purchased new CD players (at hundreds of dollars) for digital output and then purchased new receivers that had digital inputs (at hundreds to thousands of dollars). Most people never noticed the difference in sound quality, but they did it anyway. It was also the start of digitally copying the CD, but that is another story.
So now we have 44.1 kHz, 16-bit stereo all-digital signals going out of the S/PDIF output. Both optical and coax (with RCA plugs) were provided as a method to transmit the digital data. There was also sufficient bandwidth on these S/PDIF outputs to handle the newer (even better sounding) 48 kHz, 16-bit audio that everyone said would provide a noticeable improvement over CD-quality sound. And so the industry made more money as devices started coming out with 48 kHz, 16-bit stereo signals.
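If you want to see why stereo PCM fits in S/PDIF, the arithmetic is just sample rate × bit depth × channels. A quick sketch in Python (the helper name is mine, and this ignores S/PDIF framing overhead):

```python
# Raw PCM payload bitrate: sample rate x bit depth x channels.
# Ignores S/PDIF framing/subcode overhead.

def pcm_bitrate(sample_rate_hz, bit_depth, channels):
    return sample_rate_hz * bit_depth * channels

cd_stereo = pcm_bitrate(44_100, 16, 2)   # CD audio
hi_stereo = pcm_bitrate(48_000, 16, 2)   # the "better" 48 kHz stereo

print(f"44.1 kHz/16-bit stereo: {cd_stereo / 1e6:.3f} Mbit/s")  # 1.411 Mbit/s
print(f"48 kHz/16-bit stereo:   {hi_stereo / 1e6:.3f} Mbit/s")  # 1.536 Mbit/s
```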
Well, about this time the movie industry was noticing that people really liked it when movies had surround sound. First this was done by encoding a center and a (single) rear channel into the two-channel soundtrack, and later there was discrete 5.1-channel audio in the theaters. In order to make 5.1-channel audio fit in the space reserved for two-channel audio in the theaters, various forms of lossy compression were invented (or re-invented). The most famous of these are AC3 (which was later marketed as Dolby Digital) and DTS. When these were first marketed for the home, the LaserDisc (LD) was the most suitable medium (before DVD). Through some ingenious engineering, the LD people found that they could cram a Dolby Digital signal into one digital audio channel on an LD. In order to send this signal to the receiver, they did something that would take me a while to explain, but it was different from S/PDIF even though the S/PDIF output was available. Again, everyone had to upgrade their LD player and their receiver to use this new output.
About the same time, DTS came along and said they had a better way to compress audio. Their way used both channels of a standard CD output, and so DTS-CDs were born. These were playable over S/PDIF and again used the exact same number of bits normally used for 2-channel audio. The idea behind their method (and AC3) was that people would not hear certain parts of the audio, so those bits (not the people) could be eliminated. This, along with the elimination of duplicate signals between channels, is one of the base technologies behind almost all lossy audio encoding.
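To give a feel for the "duplicate signals between channels" part: one classic trick (this toy sketch is mine, not the actual AC3 or DTS algorithm) is to store the sum and difference of the two channels. When left and right are nearly identical, the difference channel is nearly silent and takes very few bits to represent:

```python
# Toy mid/side transform: one way encoders exploit redundancy
# between channels. Not the actual AC3/DTS algorithm.

def mid_side_encode(left, right):
    mid = [(l + r) / 2 for l, r in zip(left, right)]
    side = [(l - r) / 2 for l, r in zip(left, right)]
    return mid, side

def mid_side_decode(mid, side):
    left = [m + s for m, s in zip(mid, side)]
    right = [m - s for m, s in zip(mid, side)]
    return left, right

# Two nearly identical channels: the side signal is almost zero,
# so it can be stored with far fewer bits.
left = [0.50, 0.80, -0.30, 0.10]
right = [0.49, 0.81, -0.29, 0.11]
mid, side = mid_side_encode(left, right)
print(side)  # tiny values around +/-0.005
```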
With DVDs, the Dolby Digital and DTS output were sent over S/PDIF instead of requiring the special output that the LD used. This again required everyone to purchase a new receiver that could understand DTS from a DVD, since that DTS used slightly more bits than the DTS-CD equivalent (48 kHz instead of 44.1 kHz).
In the late 1990s, DVD-Audio and SACD came along. Both of those could output digital audio at rates greater than anything the S/PDIF outputs could handle. The audio was up to 192 kHz, 24-bit, with a full 6 channels of output (if all 6 channels were used, you were limited to 96 kHz). This was audio heaven. But, of course, many people had to purchase a new receiver that had 5.1-channel analog inputs so that the DVD-Audio and SACD players could convert to analog and then send that to the receiver. This way the lossless digital audio could not be copied, since it never left the player (theoretically).
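The same bitrate arithmetic shows why these formats blew past S/PDIF, which was built around roughly 1.5 Mbit/s of stereo PCM payload (these are raw PCM numbers, not any particular player's spec):

```python
def pcm_bitrate(sample_rate_hz, bit_depth, channels):
    return sample_rate_hz * bit_depth * channels

spdif_era = pcm_bitrate(48_000, 16, 2)    # ~1.5 Mbit/s: what S/PDIF carries
hires_2ch = pcm_bitrate(192_000, 24, 2)   # hi-res stereo
hires_6ch = pcm_bitrate(96_000, 24, 6)    # 5.1 channels at 96 kHz

for name, rate in [("S/PDIF-era stereo", spdif_era),
                   ("192 kHz/24-bit stereo", hires_2ch),
                   ("96 kHz/24-bit 5.1", hires_6ch)]:
    print(f"{name}: {rate / 1e6:.1f} Mbit/s")
# 1.5, 9.2, and 13.8 Mbit/s -- the last two are 6-9x what S/PDIF can carry
```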
Well, analog must be bad, so along came HDMI, which has the bandwidth needed to send 7.1 channels of lossless PCM audio from a player to a receiver. All digital and protected against copying. Of course, everyone needed a new receiver to connect the HDMI line and then decode the new lossless formats such as Dolby TrueHD and DTS-HD MA. S/PDIF just can't handle the bitrates of these formats.
So, now your question... The only audio formats that your cable box DVR can output are 2-channel PCM and Dolby Digital. Both of those fit in the S/PDIF bandwidth without modification. A Blu-ray player can send 7.1-channel LPCM, Dolby TrueHD, and DTS-HD MA. None of those can fit in the S/PDIF bandwidth, and therefore your Blu-ray player downgrades that audio to fit in the available bandwidth if you use S/PDIF (but it stays lossless over HDMI). Basically, on a BD player you take lossless audio and convert it to lossy audio if you use S/PDIF. On a cable box, the multichannel audio is already lossy, so it doesn't matter.
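To put rough numbers on that (the Dolby Digital figure is that format's standard maximum rate, so it's hedged in the lossy direction; broadcast often uses even less, and the LPCM figure is raw PCM arithmetic):

```python
# Rough fits/doesn't-fit check against S/PDIF's ~1.5 Mbit/s of
# usable stereo-PCM-rate bandwidth.

SPDIF_PAYLOAD = 48_000 * 16 * 2          # 1.536 Mbit/s

streams = {
    "2-ch PCM (cable box)":          48_000 * 16 * 2,  # 1.536 Mbit/s
    "Dolby Digital 5.1 (max rate)":  640_000,          # already lossy
    "7.1 LPCM, 96 kHz/24-bit (BD)":  96_000 * 24 * 8,  # 18.4 Mbit/s
}

for name, rate in streams.items():
    verdict = "fits" if rate <= SPDIF_PAYLOAD else "does NOT fit"
    print(f"{name}: {rate / 1e6:.2f} Mbit/s -> {verdict} in S/PDIF")
```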
A long explanation but I hope that answered your question.