Originally Posted by bbexperience
You can do that, but if you've already gone from the source to the TV, without then sending the signal to the receiver, you know it's not the receiver. From what you listed before, you went HDMI to the TV, but then you went from the TV to the receiver. I'm saying just try to see if you have drops using the TV speakers. If you've already done that, you can start troubleshooting the sources. I just find it very hard to believe it's the source. Especially on the Xbox.
I don't get consistent drops through the TV speakers, only when the audio runs through the receiver. I do sometimes lose audio entirely with Nick on the TV speakers, usually when the show cuts to commercial; it happens every few days while my kid is watching Spongebob, and changing the channel or input brings it back.
Right, I get not spending the cash on Monster, but I'm comparing a couple of different $5 cables, and the BlueRigger is IMO much better built just by observation: thicker, better shielded, sturdier plugs. That article also points out two key things that support my hypothesis:
If the cable is faulty or if there is some cataclysm causing data to be lost between the player and the receiver, the decoders are designed to mute instead of blasting out compromised data. There is no such thing as an audio version of "sparkles." Instead, you just get a total dropout of the audio. So if you're getting audio dropouts, it's possible it's the HDMI cable. But if you're not getting video issues as well, the problem is likely elsewhere. If the audio isn't muting, then as long as you're outputting an audio codec, you're getting exactly what's on the disc.
Could this be the source of my mute issue? Data loss or an error in the DD stream coming from the source?
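For what it's worth, the mute-on-error behavior the article describes amounts to logic like this. This is just a toy sketch in Python, not actual decoder code; the frame/checksum structure is my own illustration of the idea that a decoder outputs silence rather than corrupted audio:

```python
import zlib
from typing import Optional

def decode_frame(payload: bytes, expected_crc: int) -> Optional[bytes]:
    """Toy model of a decoder's mute-on-error policy: if a frame arrives
    damaged, output nothing (mute) instead of playing compromised data."""
    if zlib.crc32(payload) != expected_crc:
        return None  # total dropout: the audio equivalent of "sparkles"
    return payload   # intact frame: exactly what's on the disc

good = b"\x01\x02\x03\x04"
print(decode_frame(good, zlib.crc32(good)))  # plays the frame as-is
print(decode_frame(good, 0xDEADBEEF))        # mutes (None)
```

So in this model there's no "degraded" middle ground: either the frame checks out and you get bit-perfect audio, or the decoder mutes, which matches the sudden total dropouts.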
The big "if" that I've been repeating is "if the signal gets there." Over short runs -- a few meters, say -- it is incredibly unlikely that even the cheapest HDMI cable won't work perfectly. Over longer runs, the answer is less clear-cut. The variables of the transmitter and receiver combo in the source and display, plus any repeaters you have in the mix (like a receiver), mean that not every long HDMI cable can handle all the data. By long, I mean 50 feet or more.
"Unlikely" is not "impossible": if cable quality can create audio/video delays over long runs, why not over a bad short run? Aren't short cables just long cables cut down? A bad one could be enough to throw the timing off by a few ms.
Believe me, I get the skepticism, but after systematic troubleshooting I'm not sure what else to rule out here. I've been in IT for over 17 years, and back when I did hands-on support I saw bad network cables do spectacular things: packet storms that brought down whole floors and networks. And network traffic is just a bunch of 1s and 0s as well.
Not a 1:1 relationship, but similar enough. Take it with a grain of salt since I'm the new guy here; I'm just telling you what I've observed. Appreciate the dialogue.