When upgrading my home theater system, I frequently try new combinations of playback gear in search of the best audio quality. So it was time to compare the latest in HDMI audio to S/PDIF audio.
Over the years, HDMI audio has improved with each major revision, whereas the Sony/Philips Digital Interconnect Format (S/PDIF) survives mainly in legacy systems and among high-end audio holdouts who reject HDMI.
Let’s compare the two specifications:
S/PDIF tops out at two channels sampled at 96 kHz with 20-bit words; 24-bit words are optional.
HDMI supports up to 8 channels of uncompressed audio at sample sizes of 16, 20, and 24 bits, with sample rates of 32 kHz, 44.1 kHz, 48 kHz, 88.2 kHz, 96 kHz, 176.4 kHz, and 192 kHz. HDMI also carries Dolby Digital and DTS, plus up to 8 channels of one-bit DSD audio at rates up to four times that of Super Audio CD. As of version 1.3, HDMI supports the lossless compressed audio formats Dolby TrueHD and DTS-HD Master Audio.
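To put those specs side by side, here's a back-of-the-envelope sketch of the raw PCM payload each interface can carry at the ceilings quoted above. This is illustrative arithmetic only; it ignores S/PDIF's biphase-mark line coding overhead and HDMI's packet framing.

```python
def pcm_bitrate_mbps(channels, bits_per_sample, sample_rate_hz):
    """Raw (uncompressed) PCM payload bit rate in Mbit/s."""
    return channels * bits_per_sample * sample_rate_hz / 1e6

# S/PDIF ceiling per the text: 2 channels, 24-bit, 96 kHz
spdif = pcm_bitrate_mbps(2, 24, 96_000)

# HDMI ceiling per the text: 8 channels, 24-bit, 192 kHz
hdmi = pcm_bitrate_mbps(8, 24, 192_000)

print(f"S/PDIF max payload: {spdif:.3f} Mbit/s")  # 4.608 Mbit/s
print(f"HDMI max payload:   {hdmi:.3f} Mbit/s")   # 36.864 Mbit/s
```

Roughly an 8x gap in raw capacity, before even counting HDMI's compressed and lossless bitstream formats.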
For this comparison I used my reference Samsung 700 all-digital receiver with an Asus media player, which has both HDMI and S/PDIF outputs. The speakers were the Triton Two towers.
Seriously, there was no comparison: the S/PDIF optical connection was inferior by every audible measure, even fatiguing and irritating to listen to. This bus shows its age.
The HDMI audio, by contrast, was dynamic, articulate, musical, and pleasing to the ear.
As background, I’ve done similar comparisons with each previous generation of HDMI. The first-generation HDMI chips, used in Panasonic DVD players and digital receivers, sounded atrocious. But that was eons ago. Now it’s time to make a strong recommendation for HDMI.
That the “high-end” two-channel audiophile (read: expensive) market continues to use S/PDIF is both perplexing and humorous; it is probably rooted in not having the resources to upgrade.