Originally Posted by LastQuestion
I was hoping for some form of technical explanation...
Well, we already gave you the simple explanation: Monoprice's advice is just BS, meaning there is no technical basis for it. But let's talk about how wire gauge actually affects the signal.
One of the factors that affects the quality of the signal received by an HDMI device is the resistance of the cable. Smaller AWG number equals larger wire cross section (every 3 AWG steps down roughly doubles the cross-sectional area). Larger wire cross section equals lower resistance for a given length. Lower resistance means the signal at the receiving end is attenuated less. IOW the signal is stronger. Over short distances, say up to 10 feet or so, the signal will usually be strong enough with even a 28 AWG cable. Longer runs may require heavier cable (smaller AWG number). That is pretty much all there is to wire gauge; rough numbers below.
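If you want ballpark figures, here's a quick Python sketch of DC resistance by gauge. The AWG diameter formula and the ~10.4 ohm-circular-mil/ft value for copper at room temperature are standard; ignoring skin effect (which matters a lot at HDMI bit rates) is my simplification, so treat these as rough numbers, not cable specs.

Code:
# Rough sketch: DC resistance per conductor of solid copper wire by AWG.
# Ignores skin effect and stranding, so this is ballpark only.

def awg_resistance_per_foot(awg):
    diameter_mils = 5.0 * 92 ** ((36 - awg) / 39.0)  # standard AWG formula
    circular_mils = diameter_mils ** 2               # area in circular mils
    return 10.4 / circular_mils                      # ohms/ft for copper

for awg in (28, 26, 24):
    for length_ft in (6, 25):
        r = awg_resistance_per_foot(awg) * length_ft
        print(f"AWG {awg}, {length_ft} ft: ~{r:.2f} ohm per conductor")

Run it and you'll see a 6 ft 28 AWG run is well under half an ohm per conductor, which is why short thin cables are fine, while a 25 ft run at the same gauge has several times the resistance.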
Wire gauge is not the sole, or even the primary, determinant of signal quality in an HDMI cable. There are other factors: inter-pair skew, crosstalk, impedance mismatches, and so on.
With an HDMI device, if the input signal is just good enough (in amplitude, shape, etc.), the result is essentially the same as if the input signal were perfect. The best analogy I can think of is a CAPTCHA, those web applications that make you type in a text sequence displayed in a garbled but still legible way to weed out bots: as long as you can read it at all, you get it exactly right. The toy simulation below shows the same cliff.
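Here's a toy Python simulation of that behavior; the levels, noise amount, and threshold are made-up numbers for illustration, nothing HDMI-specific. A receiver that just compares the incoming voltage against a threshold recovers every bit perfectly until the margin runs out, and then the error count explodes.

Code:
# Toy illustration of the digital "good enough is perfect" cliff.
# Differential-style signaling: levels are +/-1, sliced at zero volts.
import random

def bit_errors(attenuation, n=100_000, noise_sigma=0.05, seed=1):
    rng = random.Random(seed)
    errors = 0
    for _ in range(n):
        level = rng.choice((-1.0, 1.0))                 # transmitted bit
        rx = level * attenuation + rng.gauss(0.0, noise_sigma)  # weakened + noise
        decided = 1.0 if rx > 0.0 else -1.0             # simple threshold slicer
        errors += decided != level
    return errors

for a in (1.0, 0.5, 0.25, 0.15, 0.10, 0.05):
    print(f"attenuation x{a}: {bit_errors(a)} errors in 100k bits")

Down to about a quarter of the original amplitude the slicer still gets zero errors out of 100,000 bits, i.e. indistinguishable from a perfect cable; a little weaker than that and errors pile up fast.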
Of course, all the electronics involved, the bit rate, and the environment also affect what will and won't work.