Originally Posted by Goatse
I'll say once again... there is zero difference between 1080i and 1080p. Especially for 24fps film or 30fps standard hdtv. All marketing hype.
I think most of the "debate" in this thread stems from unfamiliarity with the new official HD3D standard the BDA just finalized, which the wider video industry has adopted for products such as DSS boxes and HDTVs in addition to the BDA-governed Blu-ray devices.
We've *always* had the ability to do half-baked 3-D: tricks like red/blue color filters over each eye (truly horrid), or "weaving" the left- and right-eye signals into a single video stream using alternating fields.
Both of those methods are substandard.
It should be obvious to anyone who's ever worn red/blue glasses what's wrong here: you never get actual color in your HD movie.
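To make that concrete, here's a minimal Python/NumPy sketch (function names are mine, purely illustrative) of how a red/cyan-style anaglyph gets built. Each eye's filter only passes part of the spectrum, so full color is thrown away by construction:

[code]
import numpy as np

def make_anaglyph(left_rgb, right_rgb):
    """Combine left/right RGB frames (H x W x 3 uint8 arrays) into a
    red/cyan anaglyph. The red lens passes only the left eye's red
    channel; the cyan lens passes only the right eye's green and blue.
    Full color for each eye is discarded by construction."""
    out = np.empty_like(left_rgb)
    out[..., 0] = left_rgb[..., 0]   # red channel comes from the left eye
    out[..., 1] = right_rgb[..., 1]  # green channel from the right eye
    out[..., 2] = right_rgb[..., 2]  # blue channel from the right eye
    return out
[/code]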
As for the odd/even 1080i method: the problem isn't bandwidth. A 1080p60 link has enough room for the picture content of two 1080p24 signals (technically, 1080p48 would already be enough). The problem is that HDMI 1.3 has no metadata or flag to tell the receiving display when a 1080 signal is a "regular" signal and when it needs to be treated differently because it actually carries left- and right-eye information weaved together.
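To illustrate the bandwidth point, here's a toy Python sketch (the framing and names are mine, not any spec's): two 24 fps eye streams time-multiplexed into one 48 fps stream. Notice the output is just a plain sequence of 1080p frames; nothing in it tells a receiver it's 3D:

[code]
def weave_eyes(left_frames_24, right_frames_24):
    """Interleave two 24 fps streams (lists of 1080p frames) into one
    48 fps stream: L0, R0, L1, R1, ... The result is bit-for-bit just
    another 1080p stream; without out-of-band metadata, a display has
    no way to know every other frame belongs to the other eye."""
    woven = []
    for l, r in zip(left_frames_24, right_frames_24):
        woven.append(l)  # even positions: left eye
        woven.append(r)  # odd positions: right eye
    return woven
[/code]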
That's the problem with HDMI 1.3... there's no "standard" or spec for carrying dual left/right-eye information in the video stream, because the spec was finalized before the 3D issue was worked through.
And when you change the spec, you need to change the HDMI chipset too, to ensure products can handle the new signal without hiccups... hence HDMI 1.4. Basically, HDMI 1.4 has the same bandwidth as 1.3, but it adds the metadata that allows *two* distinct 1080p signals to be sent together in such a way that the HDTV can *see* them as two distinct 1080p signals.
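Roughly speaking, that metadata rides in an HDMI Vendor Specific InfoFrame alongside the video. Here's a hedged Python sketch of building one; the field layout (HDMI_Video_Format in PB4, 3D_Structure in PB5, frame packing = 0) follows my reading of the HDMI 1.4 spec, so treat the exact offsets and values as assumptions rather than gospel:

[code]
HDMI_IEEE_OUI = 0x000C03       # IEEE registration ID carried in the InfoFrame
VIDEO_FORMAT_3D = 0b010        # HDMI_Video_Format value meaning "3D format present"
STRUCT_FRAME_PACKING = 0b0000  # 3D_Structure: one full frame per eye, stacked

def hdmi_vsi_3d_payload(structure=STRUCT_FRAME_PACKING):
    """Build the payload bytes of an HDMI 1.4 Vendor Specific InfoFrame
    announcing a 3D transmission (byte offsets per my reading of the
    spec; treat them as assumptions)."""
    pb1_3 = HDMI_IEEE_OUI.to_bytes(3, "little")  # PB1..PB3: OUI, LSB first
    pb4 = VIDEO_FORMAT_3D << 5                   # PB4, bits 7..5
    pb5 = structure << 4                         # PB5, bits 7..4
    return pb1_3 + bytes([pb4, pb5])

# A 1.4 source sends this with every frame; an HDMI 1.3 link simply has
# no defined place for it, which is the whole problem.
[/code]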
So the problem with a device that's only HDMI 1.3: since there's no standard, industry-agreed way to ship left/right-eye 1080 HD over HDMI 1.3, there's no way of knowing whether any particular 3D HD source will actually work with the display. Presumably, a display could analyze the incoming video stream for some field/frame cadence and try to figure out when a signal is really two interweaved left/right images, but no TV manufacturer has actually stated it does this. Realistically, the best the TV can do is "slave" to the incoming "3D" HDMI 1.3 signal and hope for the best... and since the LCD shutter glasses need a sync signal from the TV for left/right alternation, that gets really tricky.
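If a TV did try to guess, it would have to lean on a heuristic something like this (entirely hypothetical, my own sketch, not anything a manufacturer has published): in a frame-sequential L/R stream, frame n should look more like frame n+2 (same eye) than frame n+1 (opposite eye):

[code]
import numpy as np

def looks_frame_sequential(frames, samples=24):
    """Hypothetical detector: in an L/R-interleaved stream, the mean
    pixel difference between frames two apart (same eye) should be much
    smaller than between adjacent frames (opposite eyes). The 0.5
    threshold is arbitrary; no shipping display is documented to do this."""
    adj, alt = [], []
    for i in range(min(samples, len(frames) - 2)):
        a = frames[i].astype(int)
        adj.append(np.mean(np.abs(a - frames[i + 1].astype(int))))
        alt.append(np.mean(np.abs(a - frames[i + 2].astype(int))))
    return np.mean(alt) < 0.5 * np.mean(adj)
[/code]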
HDMI 1.4 solves all these problems because it delivers two real 1080p signals to the TV in a way that the TV recognizes and can deal with.
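Concretely, HDMI 1.4's mandatory "frame packing" mode stacks the two eyes into one oversized raster: 1920x1080 left, a 45-line active-space gap, then 1920x1080 right, for 1920x2205 at 24 Hz. A quick Python sketch of packing and unpacking (the array layout is my illustration; the 2205-line figure comes from the 1.4 spec):

[code]
import numpy as np

GAP = 45  # active-space lines between the eyes in 1080p frame packing

def pack_frame(left, right):
    """Stack left eye, 45-line gap, right eye into one 1920x2205 raster."""
    gap = np.zeros((GAP, left.shape[1], 3), dtype=left.dtype)
    return np.vstack([left, gap, right])

def unpack_frame(packed):
    """Recover the two 1080-line eye images from a packed raster."""
    left = packed[:1080]
    right = packed[1080 + GAP : 1080 + GAP + 1080]
    return left, right
[/code]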
Then the TV can do cool things like frame-interpolating each eye up to 120 Hz (if the set is 240 Hz)... stuff it could never do if it just had to slave to an HDMI 1.3 signal.
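With the eyes cleanly separated, per-eye interpolation is at least easy to describe. Real sets use motion-compensated interpolation; this naive crossfade (my own sketch) just shows the idea of taking one eye's 24 fps stream to 120 Hz:

[code]
import numpy as np

def interpolate_eye(frames_24, factor=5):
    """Naively up-sample one eye's 24 fps stream toward 120 Hz by
    linearly blending neighboring frames (24 * 5 = 120). A real TV
    would use motion estimation, not a crossfade."""
    out = []
    for a, b in zip(frames_24, frames_24[1:]):
        for k in range(factor):
            t = k / factor
            out.append(((1 - t) * a + t * b).astype(a.dtype))
    out.append(frames_24[-1])
    return out
[/code]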
These are the facts, gentlemen... our new HD3D standard has finally arrived and will be discussed heavily at CES. Only HDMI 1.4 devices can deliver FULL HD 1080p quality to each eye; HDMI 1.3 displays will be stuck with some compromised version of 3D that doesn't deserve to be advertised as "3D" now that the industry has actually set a real 3D standard.
If the industry polices the manufacturers as it should, you should only be seeing the HD3D logo (you can see it at TheDigitalBits.com) on true HDMI 1.4 products that can do full 1080p per eye and properly identify and handle dual-channel 1080p HD video.