We've helped with a lot of this confusion in the HDMI section of the AVSForum. First, let me give you the standard link to the HDMI Org, which owns the trademarks, copyrights, patents, standards, and firstborns of anything having to do with HDMI. This link explains the differences in HDMI cables: http://www.hdmi.org/learningcenter/faq.aspx#49
What you want to take away from that FAQ is that there is no such thing as a 1.3a cable or a 1.3c cable, and particularly nothing known as a 1.4a cable. There are only High Speed and Standard Speed HDMI cables, with some options. All passive HDMI cables are dumb: they don't change bits (unless there's an error) and they don't manipulate resolution. The resolution is selected by the source component based on input from the sink (the TV, in this case). That information is carried in a data structure called the EDID, which tells the source what it can send based on the capabilities of the sink. The HDMI cable will certainly not withhold bits so as to lower the resolution, no matter what version of HDMI cable you are using. And ghosting has nothing to do with HDMI cables (although letting the projector warm up for 30 minutes before playing a 3D movie can have an effect).
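If you're curious what that handshake actually contains, here's a minimal Python sketch that pulls the preferred video mode out of a raw 128-byte EDID block. The sysfs path in the comment is just illustrative (it's one place Linux exposes a connector's EDID); the byte offsets follow the standard EDID detailed timing descriptor layout:

```python
# Minimal sketch: read the preferred mode from a raw EDID base block.
# On Linux you might dump one from e.g. /sys/class/drm/card0-HDMI-A-1/edid
# (path is illustrative; your connector name will differ).
def parse_preferred_mode(edid: bytes):
    assert edid[:8] == bytes.fromhex("00ffffffffffff00"), "not an EDID block"
    assert sum(edid[:128]) % 256 == 0, "bad EDID checksum"
    dtd = edid[54:72]                    # first detailed timing descriptor
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)   # low 8 bits + high 4 bits
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    pixel_clock_mhz = (dtd[0] | (dtd[1] << 8)) / 100.0  # stored in 10 kHz units
    return h_active, v_active, pixel_clock_mhz

# Usage: with open("edid.bin", "rb") as f: print(parse_preferred_mode(f.read()))
# A 1080p sink would report something like (1920, 1080, 148.5).
```

The point: the source reads this from the sink and picks a resolution accordingly. The cable just carries the bytes.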
So, when you talk about resolution changes, the cable has nothing to do with that. BTW, the pinouts between a Standard Speed and a High Speed cable are exactly the same. The only change over the years has been for the cable manufacturers to connect *all* of the pins. Before some of the latest options, some manufacturers saved 2 cents by not connecting every line, but that would only affect your ability to use ARC and Ethernet over HDMI (which no one has implemented yet). So, those are irrelevant to this discussion.
Bit errors in HDMI show up as sparkles, lines, large sections of the screen changed to one color, or no picture at all. They do not show up as color changes, resolution changes, or other subtle differences. If you get a bit error, you'll know it. The odds of a random bit error improving your picture are equal to those of 20 trained monkeys on typewriters accidentally writing Shakespeare. And since the data is uncompressed but encoded, a single bit error affects more than one pixel's worth of bits when the decoding fails on that symbol.
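To see why one flipped bit blows up into something you can't miss, here's a toy Python model of the transition-minimization stage of TMDS encoding (the real 10-bit code adds a DC-balance bit on top of this, which I'm leaving out, so treat it as a sketch, not the actual HDMI line code). Flip a single bit "on the wire" and count how many decoded bits change:

```python
def tmds_encode(d):
    """Stage 1 of TMDS: XOR or XNOR chain over the 8 data bits."""
    bits = [(d >> i) & 1 for i in range(8)]
    use_xnor = bits.count(1) > 4 or (bits.count(1) == 4 and bits[0] == 0)
    q = [bits[0]]
    for i in range(1, 8):
        q.append(1 - (q[-1] ^ bits[i]) if use_xnor else q[-1] ^ bits[i])
    q.append(0 if use_xnor else 1)   # bit 8 tells the decoder which chain
    return q                         # real TMDS appends a 10th balance bit

def tmds_decode(q):
    bits = [q[0]]
    for i in range(1, 8):
        b = q[i] ^ q[i - 1]
        bits.append(b if q[8] else 1 - b)
    return sum(b << i for i, b in enumerate(bits))

sym = tmds_encode(0x80)        # one lit bit in the source byte
sym[8] ^= 1                    # flip ONE transmitted bit (the mode bit)
print(f"decoded: {tmds_decode(sym):#04x}")   # -> 0x7e: seven bits changed
```

One wire error turned 0x80 into 0x7e because the whole decode chain depends on neighboring bits. That's why errors look like bright sparkles, not subtle shifts.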
Finally, to get 3D, everything in your HDMI chain must be 3D compatible. If it isn't, then no 3D. Just because something says it is 1.4a-compatible doesn't make it 3D compatible. In the case of the PS3, it uses a 1.3c chipset and yet is still 3D compatible (as are some Denon receivers, which pass 3D video using a 1.3c chipset). There are many TVs with 1.4a chipsets that cannot show 3D. 3D is an option in the HDMI 1.4 and HDMI 1.4a specs.
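For the curious: a 3D-capable sink advertises it via the 3D_present flag in the HDMI Vendor Specific Data Block of its EDID's CEA-861 extension. Here's a hedged Python sketch that digs that flag out. The byte layout follows my reading of the HDMI 1.4a VSDB; take it as illustrative, not a complete, spec-proof parser:

```python
# Sketch: look for the 3D_present flag in the HDMI VSDB of a CEA-861
# extension block (assumes a two-block, 256-byte EDID).
def sink_advertises_3d(edid: bytes) -> bool:
    if len(edid) < 256 or edid[128] != 0x02:    # need a CEA-861 extension
        return False
    ext = edid[128:256]
    dtd_start = ext[2]                  # data blocks end where the DTDs begin
    i = 4
    while i < dtd_start:
        tag, length = ext[i] >> 5, ext[i] & 0x1F
        block = ext[i + 1 : i + 1 + length]
        if tag == 3 and block[:3] == bytes([0x03, 0x0C, 0x00]):   # HDMI VSDB
            if length >= 8 and block[7] & 0x20:   # HDMI_Video_present
                j = 8
                if block[7] & 0x80: j += 2        # skip latency fields
                if block[7] & 0x40: j += 2        # skip interlaced latency
                return length > j and bool(block[j] & 0x80)  # 3D_present
            return False
        i += 1 + length
    return False
```

This is exactly why "has a 1.4a chipset" and "does 3D" are different questions: the flag is optional, and a 1.3c-era device can still pass the 3D timings through.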
So, I'm betting that your difference in perceived quality is based on either 1) the stream speed allowed by your network or 2) the use of the side-by-side (SbS) 3D format where the other source used frame-packed 3D. Just a guess, since I don't have enough information to say anything else.
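As a quick sanity check on guess number 2, here's the back-of-the-envelope math in Python, using the standard CEA-861 1080p24 timing totals (2750 x 1125). Frame packing doubles the pixel rate to carry full resolution per eye; side-by-side half stays at the plain 2D rate by squeezing both eyes into one frame:

```python
# Rough pixel-rate comparison of the two 3D formats mentioned above.
H_TOTAL, V_TOTAL, FPS = 2750, 1125, 24   # standard 1080p24 timing totals

pclk_2d = H_TOTAL * V_TOTAL * FPS        # also what side-by-side half uses
pclk_fp = H_TOTAL * (V_TOTAL * 2) * FPS  # frame packing doubles the vertical

print(f"2D / side-by-side half: {pclk_2d/1e6:.2f} MHz")   # 74.25 MHz
print(f"frame packed 3D:        {pclk_fp/1e6:.2f} MHz")   # 148.50 MHz
# SbS half gives each eye only 960x1080; frame packing gives each eye
# the full 1920x1080 -- a difference you can absolutely see.
```

That halved per-eye resolution is the kind of "softer picture" people often blame on the cable.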
Hope that helps your understanding.