While bits are bits, and most of the differences you mention are, shall we say, unlikely, not all software and hardware can deliver the same bits. For example, SPDIF cannot carry lossless multichannel audio: it can only provide lossless stereo LPCM or lossy multichannel Dolby Digital or DTS. HDMI with HDCP is required before a system is allowed to deliver lossless multichannel LPCM, Dolby TrueHD, or DTS-HD MA.
One of the argued issues is how much jitter must be present in a digital signal before it causes audible distortion. HDMI has been measured to have significantly higher jitter than SPDIF in most cases. Apparently some HDMI devices produce just enough jitter to cause effects that might be barely audible under extremely favorable circumstances.
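To put rough numbers on why jitter is so hard to hear, here's a minimal sketch using the standard small-angle approximation for sinusoidal jitter (sideband level relative to the signal is about 20·log10(π·f·Δt)). The specific figures (1 ns of jitter, a 10 kHz tone) are illustrative assumptions, not measurements of any particular device:

```python
import math

def jitter_sideband_db(signal_hz: float, jitter_peak_s: float) -> float:
    """Approximate level (dB relative to the signal) of the sidebands
    produced by sinusoidal jitter of peak amplitude jitter_peak_s
    on a pure tone of signal_hz, via 20*log10(pi * f * dt)."""
    return 20 * math.log10(math.pi * signal_hz * jitter_peak_s)

# Hypothetical example: 1 ns of jitter on a 10 kHz tone puts the
# distortion products around -90 dB, far below typical room noise.
level = jitter_sideband_db(10_000, 1e-9)
print(round(level, 1))  # about -90.1
```

Even several nanoseconds of jitter, which is worse than most measured HDMI links, only raises that figure by a handful of dB, which is why audibility requires such favorable conditions.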
In the vast majority of listening environments, however, jitter is completely inaudible, and the differences most people hear come primarily from expectation bias. Also, whenever you listen to anything extremely carefully, you'll hear things you never noticed before.
Edited to add: Another major cause of an audible quality difference is the volume level. Even a small fraction of a dB difference will make the louder source sound better. Exactly matching volume levels is extremely difficult.
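For a sense of scale, here's a quick sketch of the standard amplitude-to-dB conversion (20·log10 of the voltage ratio). The 0.5 dB figure is just an illustrative example of a mismatch small enough to go unnoticed as "volume" yet large enough to bias a comparison:

```python
import math

def db_difference(v_a: float, v_b: float) -> float:
    """Level difference in dB between two voltage/amplitude values."""
    return 20 * math.log10(v_a / v_b)

def amplitude_ratio(db: float) -> float:
    """Amplitude ratio corresponding to a level difference in dB."""
    return 10 ** (db / 20)

# A 0.5 dB mismatch is only about a 6% amplitude difference,
# yet listeners reliably prefer the louder source.
print(round(amplitude_ratio(0.5), 3))  # about 1.059

# Doubling the voltage is about +6 dB.
print(round(db_difference(2.0, 1.0), 2))  # about 6.02
```

This is why serious comparisons match levels with a voltmeter at the speaker terminals rather than by ear.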
Marantz SR7009/7.1.4/FH+TM/DefTech PM1000/LCR+TM amped