Originally Posted by Robotpedlr
First off I have read about 200 posts on here...all great info. But I am not sure I have seen an answer to my specific question.
I have a 25' HDMI cable through my wall that was providing signal from my DTV box to my old Plasma TV. I have now upgraded the TV to a new Panasonic GT50 3D plasma. The cable seems to work great (picture looks great, no weird issues). So my question is...do I have to upgrade to the new 1.4 standard (high speed) cable to take full advantage of 3D?
I have watched some DTV 3D and it looked fine (but I am no expert on how fine 3D should look). I assumed that if it works, it works. But I read some posts somewhere that said if you use an older-spec cable it will just reduce the resolution from 1080i (or p) down to something lower (I think it said 540i). Is that true? I can't see any obvious pixelation or anything, but I am not sure I could tell if I were being robbed of a better (higher-resolution) picture.
Thanks again... and the only reason I don't just rip and replace is the effort required for the in-wall cable pull (even with the old cable). Plus the old cable looks pretty beefy.
Step 1 of HDMI cable selection - There are only two types of HDMI cables: standard speed and high speed, with a number of options within each. Certified high speed indicates that the cable has been tested to the highest speeds used and planned by HDMI. With the certificate for that length of cable, you are guaranteed (short of a manufacturing defect or damage) that the cable will work for anything HDMI can throw at it. There is no such thing as a "1.4 cable."
Standard speed means that the cable was not designed to meet those high speeds. How far short of high speed the standard speed cable falls is different for each cable (or at least each model of cable). The only thing you are guaranteed with a standard speed cable is that 1080i and 720p will work. Both standard speed and high speed cables should have the same pinouts.
So, if you start off with a standard speed or high speed cable that is getting all of the bits from input to output without errors and you replace it with another cable that gets all of the bits from input to output without errors, have you gained anything? No. At least not in terms of bit error rates.
...and definitely no change to video quality no matter what. These are bits that represent pixels. There are no magic errors that will decrease sharpness or increase brightness. You just get random noise if you get bit errors. Combined with the requirement for "secure" communications (encryption), this results in no picture, lines, sparkles, a solid rectangle, or a partial picture. It does not result in a subtle change to any of the items you can adjust in a TV's menu system (such as tint, color, brightness, sharpness, refresh rate, etc.).
For your particular question: frame-packed 1080p/24 3D carries two full 1080p frames for every film frame, so its bandwidth is roughly equivalent to 1080p/48 2D, which is less than 1080p at 60Hz. So if you are getting a good 1080p/60 2D picture, then your picture will be fine at 1080p/24 3D. Resolution is chosen by the equipment and the media, not by the cable.
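To make that comparison concrete, here is a back-of-envelope sketch of the active-pixel throughput involved (the function name is just illustrative, and real HDMI timings add blanking intervals on top of these raw numbers, so treat this as a rough comparison rather than the actual link rates):

```python
# Rough active-pixel throughput comparison, ignoring blanking intervals.

def active_pixel_rate(width, height, frames_per_second):
    """Active pixels transmitted per second (illustrative helper)."""
    return width * height * frames_per_second

rate_2d_60 = active_pixel_rate(1920, 1080, 60)       # 1080p/60 2D
rate_3d_24 = active_pixel_rate(1920, 1080, 24) * 2   # frame-packed 1080p/24 3D:
                                                     # two full frames per film frame

print(f"1080p/60 2D         : {rate_2d_60 / 1e6:.1f} Mpixels/s")
print(f"frame-packed 3D/24  : {rate_3d_24 / 1e6:.1f} Mpixels/s")
print(f"3D exceeds 2D at 60 : {rate_3d_24 > rate_2d_60}")
```

The two-frames-per-film-frame payload works out to the same pixel count as 1080p at 48Hz, which is why a cable that already carries 1080p/60 cleanly has headroom for it.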
I also just realized you were talking about DTV 3D and not Blu-ray 3D. DTV 3D is just a regular 1080i or 720p picture split in half: one half of the frame goes to one eye and the other half goes to the other. Your display then recreates a full picture for each eye from the half-resolution information. From an HDMI standpoint, the data is therefore no different from a regular 720p/1080i picture, so even a standard speed cable is guaranteed to work with it.
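The half-resolution trade-off can be sketched with a little arithmetic (the function and the packing names are illustrative; broadcasters commonly use side-by-side for 1080i and top-and-bottom for 720p, but the exact choice is up to the broadcaster):

```python
# Sketch of how broadcast "half-resolution" 3D fits inside a normal 2D frame.
# Side-by-side halves the horizontal resolution per eye; top-and-bottom
# halves the vertical. The container is still an ordinary 1080i/720p frame,
# which is why the cable never sees anything new.

def per_eye_resolution(width, height, packing):
    """Resolution each eye actually receives before the TV upscales it."""
    if packing == "side-by-side":
        return (width // 2, height)
    elif packing == "top-and-bottom":
        return (width, height // 2)
    raise ValueError(f"unknown packing: {packing}")

print(per_eye_resolution(1920, 1080, "side-by-side"))   # per-eye size for 1080i
print(per_eye_resolution(1280, 720, "top-and-bottom"))  # per-eye size for 720p
```

Either way, the total pixel count on the wire is exactly that of the 2D format, which is the whole point of the scheme.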