
·
Registered
Joined
·
562 Posts
Discussion Starter · #1 ·
Hi All:


I've been following HDMI 1.3 developments for a while now, and posted the answer below to a question in another on-site forum about what HDMI 1.3 has to offer. Sorry to cross-post, but I would appreciate your critique of my evaluation. Always ready to learn! Thanks.


-----------------------------------------------------------------------------------------------------



HDMI 1.3 allows for some interesting potential features. The standard does not require that all of the features be included in a product claiming HDMI 1.3 compatibility however, so buyers need to be careful.


HDMI 1.3 allows for:
  • Greater Bandwidth- It more than doubles the bandwidth that can be passed through the interface (from 4.95 Gigabits per second to 10.2 Gbps), allowing more data to pass, and it will support up to 1440p through a single connection (see the rough numbers sketched after this list). Since current consumer displays don't exceed 1080p, this feature won't increase the resolution of a 1080p display over earlier versions of HDMI.
  • Deep Color- Current formats support up to 8-bit color. "Deep color" will allow 10-, 12-, and 16-bit color depths. This feature allows compatible displays to show billions of colors rather than the current millions. "Billions" of colors is more than the eye can see. Some people pooh-pooh this feature as useless (since it's more than you can see), but you want your display to show more colors than you can see: if you can see the steps between adjacent colors, you will see color banding in scenes with subtle color gradations- like skies. Deep color is claimed to eliminate color banding where it's used. Deep color also supposedly allows for greater contrast ratios; whether the upcoming display technologies will allow for greater contrast is a good question. Sony's PS3 Blu-ray player/game console and Toshiba's HD-XA2 standalone HD-DVD player both support HDMI 1.3 with deep color. The caveat here is that there's currently no content that supports deep color. Games will likely be the first source to support deep color, but movies on disc seem to be another matter: Blu-ray and HD-DVD discs currently support only 8-bit color. Toshiba seems to be betting that HD-DVD discs with deep color will be available at some point, but that's far from a given. Sony controls content from production (Sony Pictures), to disc manufacture (for Blu-ray), to display, and they're pushing HDMI 1.3/deep color/broader color space, so they might jump onboard with some compatible content, but it's a gamble whether they will or not. Downloadable movies might support deep color and a broader color space at some point in the future, but that's still vaporware. Some people claim that a source or display that allows for deep color will allow for finer calibration (and accuracy) even for 8-bit content, but I have no idea if this is true.
  • Broader Color Space- HDMI 1.3 supports the xvYCC color space (as opposed to RGB/YCbCr) that allows for 1.8 times as many colors as earlier color models. The hype says that this feature will allow more accurate and vivid colors to be displayed on compatible systems. Sony has announced HD camcorders for consumers that support xvYCC (Sony calls it xv.color), but who knows who else will join them, what content will be available, and when it might happen? It might turn out to be great. It might not be used.
  • Auto A/V Lip Sync- Some displays don't process video signals as fast as the A/V receiver in the surround system processes the audio causing the picture to lag behind the sound- like a poorly dubbed movie. Some current receivers have manual modes so you can adjust the lag yourself, but HDMI 1.3 allows for compatible HDMI 1.3 receivers and displays to automatically sync the sound and picture. The "buzz" says this is likely to be a popular feature that consumers will look for and therefore manufacturers will want to supply, but I haven't seen any "auto lip sync" products announced yet, and some argue about how helpful the feature will be.
  • New Lossless Audio Formats- HDMI 1.3 will allow the new Dolby-TrueHD and DTS-HD lossless audio formats to pass over the connection. A cool feature, but maybe unnecessary. Best I can tell, Blu-ray and HD-DVD players that are available now (and likely to be available in the future) decode the lossless audio in the player and can send it to any compatible receiver as PCM over earlier HDMI versions- so the HDMI 1.3 connection is unnecessary. Apparently some say Blu-ray and HD-DVD discs are (or will be) authored so that the lossless audio has to be decoded in the player and can't be processed in the receiver, so HDMI 1.3 wouldn't help for those discs. It's possible that other future content (e.g. downloadable movies) will allow or require processing of the lossless audio in a receiver through HDMI 1.3, but who knows?
  • New Mini Connector- HDMI 1.3 allows for a smaller connector on portable devices like camcorders- kinda like you see with USB.
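
To put some rough numbers on the bandwidth point above, here's a quick back-of-the-envelope calculation. The 1080p timing (2200x1125 total pixels) and the 10:8 TMDS coding overhead are my own assumptions- the familiar 4.95 and 10.2 Gbps figures fall out of them- so treat this as a sketch, not gospel:

Code:
# Back-of-the-envelope HDMI link-rate check (my own rough math, not from the spec).
# Assumes the CEA-861 1080p total timing of 2200x1125 pixels and HDMI's 10:8 TMDS
# coding overhead, which is how 165 MHz -> 4.95 Gbps and 340 MHz -> 10.2 Gbps arise.

HDMI_12_LIMIT_GBPS = 4.95   # aggregate TMDS rate through HDMI 1.0-1.2a
HDMI_13_LIMIT_GBPS = 10.2   # aggregate TMDS rate through HDMI 1.3

def link_rate_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Raw link rate needed: pixel clock * bits per pixel * 10/8 coding overhead."""
    pixel_clock = h_total * v_total * refresh_hz
    return pixel_clock * bits_per_pixel * 10 / 8 / 1e9

for name, h, v, hz, bpp in [
    ("1080p/60, 24-bit (8 bits/component)", 2200, 1125, 60, 24),
    ("1080p/60, 36-bit deep color", 2200, 1125, 60, 36),
    ("1080p/120, 24-bit", 2200, 1125, 120, 24),
    ("1080p/120, 36-bit deep color", 2200, 1125, 120, 36),
]:
    rate = link_rate_gbps(h, v, hz, bpp)
    print(f"{name}: {rate:.2f} Gbps "
          f"(fits 1.2a: {rate <= HDMI_12_LIMIT_GBPS}, fits 1.3: {rate <= HDMI_13_LIMIT_GBPS})")

The takeaway matches the bullet above: plain 1080p/60 already fits in older HDMI, while deep color and higher refresh rates are what actually need the extra headroom.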


IMHO, HDMI 1.3 allows for some exciting potential features, but it's still up in the air whether manufacturers will incorporate the features made possible by HDMI 1.3 in their products, how much content will be available to take advantage of those features, and how much real difference the features will make. It would be nice if next-generation products embraced HDMI 1.3, but there are other, perhaps more important, features to look for in the near future (24fps input support with a refresh rate that's some multiple of 24 to eliminate judder, better PC input support, etc.). You can bet that products with HDMI versions earlier than 1.3 won't take advantage of HDMI 1.3's features (except for lossless audio as mentioned above), but it's not certain that even HDMI 1.3 products will use the features. Caveat emptor.


Just my 2¢,


kelpie
 

·
Registered
Joined
·
269 Posts
Thanks for taking the time to read up on HDMI 1.3. Here are some comments I have on your analysis:

Greater bandwidth: The 10Gbps bandwidth can be used by manufacturers in a variety of ways. Think of HDMI as a data pipe which we've just made bigger and faster, but it's up to manufacturers to choose how they want to use it. Here are 3 performance features that we've identified as possible uses of the greater bandwidth:

1) higher resolution, such as 1440p or WQXGA (i.e. 30" LCDs running 2560x1600). A few TV makers have announced or demonstrated displays with these resolutions.

2) deeper color depths (10, 12, or 16-bit per component), which requires more pixel data to be sent within the same line/frame time.

3) higher refresh rates (72, 75, 90, or 120Hz). At CES, quite a few LCD TV makers announced and demonstrated TVs supporting a 120Hz refresh rate, which appeared quite effective at reducing LCD motion blur. With the new high def optical discs, the content is encoded on the disc at 1080p/24fps, yet a TV's display may support anywhere from 60-120Hz. Today, the player usually performs the conversion from 1080p/24 to 1080p/60, while the TV might perform another frame rate conversion (say 1080p/60 to 1080p/120) with its own video processor to match the TV's best refresh rate. Many video processing experts will tell you that if you want to do the best job performing frame rate conversion, you should do it all at the content level, because the raw/original encoded video (e.g. MPEG-4 video) has information about motion vectors and other data that allow the processor to apply the best algorithms and thus get the best quality. Compare that to the 2-step process I described above, where the TV does not have any of the important video information that would help it do a better job with the frame rate conversion. For this reason, we may see some HDMI source devices capable of sending a video timing such as 1080p at a 120Hz refresh rate, which requires twice the bandwidth of 1080p/60.
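
A simpler way to see part of the problem, leaving motion compensation aside entirely (a little illustration of my own, assuming plain frame repetition): the uneven 2:3 cadence introduced by the 24-to-60 step survives the TV's doubling to 120Hz, whereas a direct 24-to-120 conversion repeats every film frame evenly.

Code:
# Compare frame-repeat cadences: 24 fps film converted to 120 Hz in one step
# vs. via an intermediate 60 Hz step (simple pulldown only, no interpolation).

def repeat_cadence(src_fps, dst_hz, n_frames=4):
    """How many times each source frame gets shown at dst_hz."""
    cadence, shown = [], 0
    for i in range(1, n_frames + 1):
        total = (i * dst_hz) // src_fps      # output frames emitted after source frame i
        cadence.append(total - shown)
        shown = total
    return cadence

one_step = repeat_cadence(24, 120)                   # 24 -> 120 directly
via_60   = [c * 2 for c in repeat_cadence(24, 60)]   # 24 -> 60 (2:3 pulldown), then 60 -> 120 (x2)

print("24 -> 120 direct:", one_step)   # [5, 5, 5, 5]  even repetition, no cadence judder
print("24 -> 60 -> 120 :", via_60)     # [4, 6, 4, 6]  the 2:3 unevenness is carried forward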

Deep color: At CES, we had a side-by-side demonstration of two identical LCD TVs being run with 8-bit and 10-bit color depth content generated from a PS3. The difference was obvious enough that all the visitors who saw the demo were able to see the differences without needing us to point out where to look and what to look for. I haven't personally seen a 10 vs 12-bit side-by-side demo, but I have seen 10-bit content where I could still see some subtle banding, which leads me to think that the human eye can probably see the difference between 10 and 12-bit. In my discussions with video processing experts, they do claim that they can take 8-bit video content and apply a gamma extraction to smooth out the banding and deliver a pseudo 10-bit video experience. It appears a number of HD-DVD and Blu-ray player makers are betting that this will indeed yield a better experience, even though the content is natively encoded at 8-bit.

Broader Color Space: Given the limitations of today's color space, and the greater color gamuts that TVs can now produce, I expect this technology will trickle into more and more products over time. If you do a web search on xvYCC, I believe you will find other TV makers that have announced products supporting this.

Lip Sync: Given that an HDTV will typically buffer the video anywhere from 2-4 frames, we're talking anywhere from 33-67ms of video latency, where you hear the audio first and then see the corresponding motion 33-67ms later. Our brains are wired to be relatively tolerant of audio that is somewhat behind the video (so the lips move, but the audio comes out with a bit of a delay), but we are relatively sensitive when the audio comes first, followed by video. Some people are more sensitive, some are less, so your perception will vary. If you listen to audio through the TV, this is generally not a problem because the TV has a built-in audio buffer that delays the audio by the right amount to be in sync with its buffered video. But if you use an external device (e.g. an AV receiver) to render your audio, now you have audio that is presented with almost no delay, while the video comes out with somewhat more delay. The good news is that this feature is relatively simple & cheap to put into a TV, so we hope that it will become a standard feature in TVs quickly. Seeing more and more AV receivers and even some DVD players add an audio delay feature is a good sign that this feature is recognized as a useful benefit, and we hope that it becomes a more automatic and precise correction through HDMI.
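
For what it's worth, the arithmetic behind those numbers is trivial (assuming a 60Hz signal into the TV), which is part of why the correction is cheap to implement:

Code:
# Video delay from frame buffering, i.e. the audio delay an AVR or an automatic
# lip-sync correction would need to apply (assumes 60 Hz; other rates scale accordingly).
frame_ms = 1000 / 60                      # one frame at 60 Hz is about 16.7 ms
for frames_buffered in (2, 3, 4):
    print(f"{frames_buffered} buffered frames -> delay the audio by ~{frames_buffered * frame_ms:.0f} ms")
# 2 frames -> ~33 ms, 4 frames -> ~67 ms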

Lossless audio formats: You are correct that most (perhaps all) of the HD-DVD and Blu-ray players available now perform the decoding of these new formats into PCM, and HDMI 1.3 is not required for transporting multi-channel PCM audio. We have heard some express that the decoding of these audio formats is usually done better by the electronics in an AV receiver, which would be an example of a case where HDMI 1.3's ability to send the lossless formats in their encoded formats over the cable is needed.

Mini Connector: HDMI is distinct from USB in that USB cannot support uncompressed HD video & audio due to its bandwidth restrictions. Using a lower bandwidth interface would require the content to be compressed by the source (which generally translates to higher cost & potential quality degradation). In addition, the display would require a decoder to be able to decompress the content, which again implies cost & quality degradation. Also, there is greater possibility of incompatibility if the compression scheme used by the portable device is not supported by the TV. For these reasons, there are many benefits (lower cost, higher image quality, minimal risk of obsolescence or incompatibility) to using an uncompressed interface like HDMI. From a connectivity point of view, it's quite common to find HDTVs with HDMI connectors (many now put one on the front or side panel specifically for portable devices), but it is not common to find an HDTV with USB connectors, much less the ability to accept an HD video/audio stream over USB.


In summary, we created the HDMI 1.3 specification to enable these interesting features in new products, but it is up to the manufacturers to choose which features they implement and how, and of course, up to consumers to decide what features will affect their purchasing decisions.
 

·
Registered
Joined
·
214 Posts

Quote:
Originally Posted by HDMI_Org /forum/post/0


Lossless audio formats: You are correct that most (perhaps all) of the HD-DVD and Blu-ray players available now perform the decoding of these new formats into PCM, and HDMI 1.3 is not required for transporting multi-channel PCM audio. We have heard some express that the decoding of these audio formats is usually done better by the electronics in an AV receiver, which would be an example of a case where HDMI 1.3's ability to send the lossless formats in their encoded formats over the cable is needed.

How could an AV receiver uncompress a lossless format better? That is marketing spin. You are repeating a claim that would be equivalent to saying that using program A to unzip a text file vs. using program B to unzip the same file somehow results in a better text file.
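
To make the unzip analogy concrete (a throwaway sketch of my own, using zlib purely as a stand-in for a lossless audio codec):

Code:
# Lossless means the decoded output is bit-for-bit identical to what was encoded,
# no matter which conforming decoder unpacks it.
import zlib

original = b"24-bit/96kHz PCM samples ..." * 1000   # stand-in for the studio's audio
packed   = zlib.compress(original)                  # stand-in for TrueHD/DTS-HD MA encoding
unpacked = zlib.decompress(packed)                  # player or receiver, it makes no difference

print(unpacked == original)   # True - there is no "better" bitstream to recover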


Of course, once the audio is uncompressed there is a great deal of difference in what your pre/pro or AV receiver can do to help you manage and play all those channels of sound on your speakers. Paying extra (buying HDMI 1.3 equipment) just so you can move the compressed audio streams around is a waste of money.


This page on Dolby's website gives a great explanation of why you would rather send uncompressed PCM to your AV receiver. Take a look at page 3, which I'm quoting below:
http://www.dolby.com/consumer/techno...HD_avrs_3.html

Quote:
With six or eight channels of 24-bit/96 kHz audio to handle from these new HD formats, the post-processing DSP requirements for an A/V receiver more than double. Rather than devoting the considerable DSP resources to decoding the core audio signals within the A/V processor itself, it may be more fruitful to use the A/V processor's DSP resources to perform high-resolution post-processing such as bass management, room or speaker equalization, Dolby Pro Logic® IIx decoding, or other types of digital signal processing.
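
Some rough arithmetic of my own to put that in perspective, using the eight channels of 24-bit/96 kHz PCM mentioned in the quote:

Code:
# Data rate of fully decoded lossless audio vs. what HDMI already carries.
channels, bits, sample_rate = 8, 24, 96_000
pcm_mbps = channels * bits * sample_rate / 1e6
print(f"8ch 24-bit/96kHz PCM: {pcm_mbps:.1f} Mbps")   # about 18.4 Mbps
# That's a sliver of the ~4.95 Gbps an HDMI 1.1/1.2 link moves, which is why players
# can decode to multi-channel PCM and ship it over pre-1.3 HDMI with nothing lost.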


This excellent thread has been going into great detail about the value, or lack thereof, in waiting for hdmi 1.3
http://www.avsforum.com/avs-vb/showthread.php?t=786883


HDMI has just been a source of confusion and lots of marketing spin. Why not actually help consumers out by requiring manufacturers to spell out whether their HDMI ports support multi-channel audio, bass management, or proper handling of the LFE channel? Because that wouldn't sell new gear, of course.
 

·
Premium Member
Joined
·
2,787 Posts
I don't think it's fair to characterize the benefits of HDMI 1.3 over previous versions as purely marketing hype. It's accurate to say that 1.3 supports features that will allow better picture and audio quality to reach our eyes and ears. But to claim that the 1.3 pipeline alone will improve picture and audio, without manufacturer and studio support, would be inaccurate.


Obviously the full benefits of 1.3 will only be realized once 1) it is supported by the manufacturers of players, processors, receivers, and displays, and 2) it is supported by the studios mastering HD DVDs and BDs (source material). It's a safe bet that manufacturers will support it long before the studios will, as I'm not even sure that either HD optical format has provisions to support higher frame rates and color depths at 1080p.


HDMI 1.3 does provide a larger pipeline to allow for higher picture and audio quality - the question is whether or not A/V manufacturers and studios are going to take advantage of that pipeline. I'd like to see it happen; as to whether it will in the near future is anyone's guess.
 

·
Registered
Joined
·
562 Posts
Discussion Starter · #6 ·
Thanks for taking the time to review and reply to my post HDMI_Org.

Quote:
Originally Posted by HDMI_Org /forum/post/0

Greater bandwidth: The 10Gbps bandwidth can be used by manufacturers in a variety of ways. Think of HDMI as a data pipe which we've just made bigger and faster, but it's up to manufacturers to choose how they want to use it. Here are 3 performance features that we've identified as possible uses of the greater bandwidth:

1) higher resolution, such as 1440p or WQXGA (i.e. 30" LCDs running 2560x1600). A few TV makers have announced or demonstrated displays with these resolutions.

2) deeper color depths (10, 12, or 16-bit per component), which requires more pixel data to be sent within the same line/frame time.

3) higher refresh rates (72, 75, 90, or 120Hz). At CES, quite a few LCD TV makers announced and demonstrated TVs supporting a 120Hz refresh rate, which appeared quite effective at reducing LCD motion blur. With the new high def optical discs, the content is encoded on the disc at 1080p/24fps, yet a TV's display may support anywhere from 60-120Hz. Today, the player usually performs the conversion from 1080p/24 to 1080p/60, while the TV might perform another frame rate conversion (say 1080p/60 to 1080p/120) with its own video processor to match the TV's best refresh rate. Many video processing experts will tell you that if you want to do the best job performing frame rate conversion, you should do it all at the content level, because the raw/original encoded video (e.g. MPEG-4 video) has information about motion vectors and other data that allow the processor to apply the best algorithms and thus get the best quality. Compare that to the 2-step process I described above, where the TV does not have any of the important video information that would help it do a better job with the frame rate conversion. For this reason, we may see some HDMI source devices capable of sending a video timing such as 1080p at a 120Hz refresh rate, which requires twice the bandwidth of 1080p/60.

Thanks for expanding on my point. Some questions if you please.


You seem to be implying that a single HDMI 1.3 connection can be used to connect to a 2560x1600 resolution display. The 2560x1600 displays that I had heard of use dual-link DVI connections. Are you aware of displays with resolutions greater than 1080p that will use an HDMI connection?


Also, of course HDMI's bandwidth can be used for more than the increased resolution that I mentioned. But some people are claiming that HDMI 1.3 lacks the bandwidth to support higher resolutions, AND deep color, AND xvYCC color space, AND higher refresh rates, etc. all through the same connection at the same time. Would it be possible to, say for the sake of argument, create a source device that could send a 1440p (or even 1080p) image with deep color using the xvYCC color space at 120 hz with lossless audio auto lip-synced to an A/V receiver/ display over a single HDMI 1.3 connection? Or would the source have to pick and choose which features to offer because of bandwidth limitations?

Quote:
Originally Posted by HDMI_Org /forum/post/0

Deep color: At CES, we had a side-by-side demonstration of two identical LCD TVs being run with 8-bit and 10-bit color depth content generated from a PS3. The difference was obvious enough that all the visitors who saw the demo were able to see the differences without needing us to point out where to look and what to look for. I haven't personally seen a 10 vs 12-bit side-by-side demo, but I have seen 10-bit content where I could still see some subtle banding, which leads me to think that the human eye can probably see the difference between 10 and 12-bit. In my discussions with video processing experts, they do claim that they can take 8-bit video content and apply a gamma extraction to smooth out the banding and deliver a pseudo 10-bit video experience. It appears a number of HD-DVD and Blu-ray player makers are betting that this will indeed yield a better experience, even though the content is natively encoded at 8-bit.

Some questions here too, please. I'm kinda surprised that you still saw some color banding at 10 bits. Are you aware of any 10+ bit content being discussed so far other than games? Have you seen a demonstration of 8-bit content being sent to a 10-bit display with and without "deep color" gamma extraction to see if a "pseudo 10-bit video experience" makes an appreciable difference? Since this seems to be what we'll have available for a while with non-game content, this comparison seems quite relevant.

Quote:
Originally Posted by HDMI_Org /forum/post/0

Broader Color Space: Given the limitations of today's color space, and the greater color gamuts that TVs can now produce, I expect this technology will trickle into more and more products over time. If you do a web search on xvYCC, I believe you will find other TV makers that have announced products supporting this.

I'm sure other manufacturers are working on displays that support the xvYCC color space, but are you aware of any upcoming content that supports the xvYCC color space other than consumer camcorders? What's in the pipeline for the future?

Quote:
Originally Posted by HDMI_Org /forum/post/0

Lip Sync: Given that an HDTV will typically buffer the video anywhere from 2-4 frames, we're talking anywhere from 33-67ms of video latency, where you hear the audio first and then see the corresponding motion 33-67ms later. Our brains are wired to be relatively tolerant of audio that is somewhat behind the video (so the lips move, but the audio comes out with a bit of a delay), but we are relatively sensitive when the audio comes first, followed by video. Some people are more sensitive, some are less, so your perception will vary. If you listen to audio through the TV, this is generally not a problem because the TV has a built-in audio buffer that delays the audio by the right amount to be in sync with its buffered video. But if you use an external device (e.g. an AV receiver) to render your audio, now you have audio that is presented with almost no delay, while the video comes out with somewhat more delay. The good news is that this feature is relatively simple & cheap to put into a TV, so we hope that it will become a standard feature in TVs quickly. Seeing more and more AV receivers and even some DVD players add an audio delay feature is a good sign that this feature is recognized as a useful benefit, and we hope that it becomes a more automatic and precise correction through HDMI.

We can all hope that auto lip-sync will be available, but I'm kinda confused as to why it doesn't seem to have happened yet if it's so simple and cheap. For example, Sherwood Newcastle's new $1500 R-972 A/V receiver and Sony's new $33,000 KDL-70XBR3 70" flat-panel LCD display both support some HDMI 1.3 features, but neither makes any mention of auto lip-sync in their product descriptions. Are you aware of any specific upcoming products that do support this feature?

Quote:
Originally Posted by HDMI_Org /forum/post/0

Lossless audio formats: You are correct that most (perhaps all) of the HD-DVD and Blu-ray players available now perform the decoding of these new formats into PCM, and HDMI 1.3 is not required for transporting multi-channel PCM audio. We have heard some express that the decoding of these audio formats is usually done better by the electronics in an AV receiver, which would be an example of a case where HDMI 1.3's ability to send the lossless formats in their encoded formats over the cable is needed.

Have you heard any "buzz" about future sources/content (meaning not Blu-ray or HD-DVD) that will take advantage of an A/V receiver's ability to decode the lossless audio formats?

Quote:
Originally Posted by HDMI_Org /forum/post/0

Mini Connector: HDMI is distinct from USB in that USB cannot support uncompressed HD video & audio due to its bandwidth restrictions. Using a lower bandwidth interface would require the content to be compressed by the source (which generally translates to higher cost & potential quality degradation). In addition, the display would require a decoder to be able to decompress the content, which again implies cost & quality degradation. Also, there is greater possibility of incompatibility if the compression scheme used by the portable device is not supported by the TV. For these reasons, there are many benefits (lower cost, higher image quality, minimal risk of obsolescence or incompatibility) to using an uncompressed interface like HDMI. From a connectivity point of view, it's quite common to find HDTVs with HDMI connectors (many now put one on the front or side panel specifically for portable devices), but it is not common to find an HDTV with USB connectors, much less the ability to accept an HD video/audio stream over USB.

Sorry I wasn't clear. I wasn't trying to mention USB as an alternative to HDMI 1.3 to connect to a display. I was just comparing USB's availability of a mini-connector for cameras etc. to HDMI 1.3's new mini-connector. Thanks for straightening that point out.

Quote:
Originally Posted by HDMI_Org /forum/post/0


In summary, we created the HDMI 1.3 specification to enable these interesting features in new products, but it is up to the manufacturers to choose which features they implement and how, and of course, up to consumers to decide what features will affect their purchasing decisions.

This is a super point. It does seem to be wait-and-see as to whether HDMI 1.3's potential will be realized.


One last question. I've heard rumors that Silicon Image's current HDMI 1.3 chips don't support HDMI 1.3's full bandwidth potential and/or all of HDMI 1.3's features- so manufacturers can't yet make "Full HDMI 1.3" devices even if they wanted to. True or no?


For just one example, the Dolby article referenced above says:

Quote:
Originally Posted by Dolby /forum/post/0

Dolby TrueHD and Dolby Digital Plus in A/V Receivers
Eventually (emphasis added), A/V receivers will have direct access to Dolby® Digital Plus or Dolby TrueHD bitstreams. We are working with the IEC and HDMI organizations to update data protocols to enable future versions of these high-bandwidth interfaces to carry these bitstreams.

"Working with HDMI organizations to update data protocols"!? Does the current version of HDMI 1.3 not support Dolby TrueHD and Dolby Digital Plus for A/V receivers? Is there any part of the so-far-announced feature set for HDMI 1.3 that isn't currently possible with the available standards/protocols and hardware?


Thanks again for your help,


kelpie
 

·
Registered
Joined
·
269 Posts

Quote:
Originally Posted by kelpie /forum/post/0


You seem to be implying that a single HDMI 1.3 connection can be used to connect to a 2560x1600 resolution display. The 2560x1600 displays that I had heard of use dual-link DVI connections. Are you aware of displays with resolutions greater than 1080p that will use an HDMI connection?

There are no products shipping yet that support over 1080p on HDMI, but I do expect this to change in the future. Given that HDMI can do what dual-link DVI can do, but at a somewhat lower cost (because single-link electronics are cheaper than dual-link), we expect to see manufacturers take advantage of this in the future. Initially, we're seeing manufacturers use the deep color feature to take advantage of the higher bandwidths, but higher refresh rates & higher resolutions are the next logical trends.

Quote:
Originally Posted by kelpie /forum/post/0


Would it be possible to, say for the sake of argument, create a source device that could send a 1440p (or even 1080p) image with deep color using the xvYCC color space at 120 hz with lossless audio auto lip-synced to an A/V receiver/ display over a single HDMI 1.3 connection? Or would the source have to pick and choose which features to offer because of bandwidth limitations?

With the current maximum of 340MHz in HDMI 1.3, this could support 1080p, xvYCC, 120Hz, 8-bit RGB, and the 8 channels of lossless surround sound audio. Note: xvYCC and the audio play no practical role in using up the bandwidth. To push the next generation of performance, such as 1080p/12-bit RGB/120Hz, we would need to bump the speed up to 550MHz. While I can't promise this would be something in the next HDMI spec, I can say that HDMI has the technical foundation to be increased well over the current 340MHz limit.
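
Here is the rough arithmetic behind those figures (my own back-of-the-envelope numbers, using the standard 1080p/60 timing simply doubled for 120Hz- a real 1080p/120 timing isn't standardized, so treat these as lower bounds):

Code:
# TMDS clock needed for a given mode; deep color scales the clock with bits per component.
HDMI_13_MAX_TMDS_MHZ = 340

def tmds_clock_mhz(pixel_clock_mhz, bits_per_component):
    return pixel_clock_mhz * bits_per_component / 8

pclk_1080p60  = 148.5           # CEA-861 1080p/60 (2200 x 1125 total pixels)
pclk_1080p120 = 148.5 * 2       # same timing doubled to 120 Hz

for label, pclk, bpc in [
    ("1080p/60,  8-bit", pclk_1080p60, 8),
    ("1080p/120, 8-bit", pclk_1080p120, 8),
    ("1080p/120, 12-bit", pclk_1080p120, 12),
]:
    clk = tmds_clock_mhz(pclk, bpc)
    print(f"{label}: {clk:5.1f} MHz TMDS (within 340 MHz: {clk <= HDMI_13_MAX_TMDS_MHZ})")
# 1080p/120 at 8-bit squeaks in under 340 MHz; at 12-bit it lands well above it,
# which is why a faster link (the ~550 MHz mentioned above, with margin) would be needed.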

Quote:
Originally Posted by kelpie /forum/post/0


Are you aware of any 10+ bit content being discussed so far other than games? Have you seen a demonstration of 8-bit content being sent to a 10-bit display with and without "deep color" gamma extraction to see if a "pseudo 10-bit video experience" makes an appreciable difference? Since this seems to be what we'll have available for a while with non-game content, this comparison seems quite relevant.

As for other native deep color content, I would expect PC content (such as digital photography) to be the next one. Most digital cameras capture in 12-bit or greater, and most graphics chips already have 12-bit piping in their architecture. Eventually, I hope to see the HD-DVD and Blu-ray codec standards get upgraded to add a provision for defining the content in deep color as well. I have seen a demo of 8-bit content upscaled to 10-bit and displayed on a 10-bit LCD TV, and I could absolutely see the difference: banding was greatly reduced via the video processing. This made me a believer that video processing can still yield a good deep color experience even if the content is natively 8-bit.

Quote:
Originally Posted by kelpie /forum/post/0


I'm sure other manufacturers are working on displays that support the xvYCC color space, but are you aware of any upcoming content that supports the xvYCC color space other than consumer camcorders? What's in the pipeline for the future?

I haven't followed the trends on the broadcast side to know whether this will be implemented or not. I believe the cost is quite minor for the HDMI chips, but it is more a matter of whether the broadcasters will upgrade their cameras to xvYCC, and whether the broadcast video standards (like MPEG-2) can support it or not.

Quote:
Originally Posted by kelpie /forum/post/0


We can all hope that auto lip-sync will be available, but I'm kinda confused as to why it doesn't seem to have happened yet if it's so simple and cheap. For example, Sherwood Newcastle's new $1500 R-972 A/V receiver and Sony's new $33,000 KDL-70XBR3 70" flat-panel LCD display both support some HDMI 1.3 features, but neither makes any mention of auto lip-sync in their product descriptions. Are you aware of any specific upcoming products that do support this feature?

Unfortunately, I do not know of announced or launched products that support auto lip-sync correction at this time. I certainly hope to see it come into TVs very quickly this year at the least.

Quote:
Originally Posted by kelpie /forum/post/0


Have you heard any "buzz" about future sources/content (meaning not Blu-ray or HD-DVD) that will take advantage of an A/V receiver's ability to decode the lossless audio formats?

Can't say that I am aware of any other content on the horizon besides HD-DVD and Blu-ray movies that will have the Dolby & DTS lossless formats. I suggest watching the Dolby & DTS websites to see what they might be announcing in the future.

Quote:
Originally Posted by kelpie /forum/post/0


One last question. I've heard rumors that Silicon Image's current HDMI 1.3 chips don't support HDMI 1.3's full bandwidth potential and/or all of HDMI 1.3's features- so manufacturers can't yet make "Full HDMI 1.3" devices even if they wanted to. True or no?

You'll have to ask Silicon Image or refer to their website/documentation about the capabilities of their products. I'm not at liberty to comment on specific products.

Quote:
Originally Posted by kelpie /forum/post/0


For just one example, the Dolby article referenced above says: "Working with HDMI organizations to update data protocols"!? Does the current version of HDMI 1.3 not support Dolby TrueHD and Dolby Digital Plus for A/V receivers? Is there any part of the so-far-announced feature set for HDMI 1.3 that isn't currently possible with the available standards/protocols and hardware?

This statement is not quite accurate regarding HDMI. The HDMI 1.3 spec already includes all the required protocols to fully enable the design of a product that supports Dolby TrueHD. No updates are required.
 

·
Registered
Joined
·
5 Posts
Here are the specs for HDMI v1.3. These specs were approved in June 2006.


Single-cable digital audio/video connection increased to 10.2 Gbps


Increased color support, including 30-bit, 36-bit, and 48-bit color depths (RGB or YCbCr)


Supports xvYCC color standards


Supports automatic audio syncing capability


Supports output of Dolby TrueHD and DTS-HD Master Audio streams (audio codec formats used on HD DVDs and Blu-ray Discs) for external decoding by AV receivers.


Availability of a new mini connector for devices such as camcorders.


These specs provide sufficient bandwidth and bit rate to support the 1440p standard that is being experimented with in Japan and will probably hit here in the future. Remember, transmission of an HD 1440p signal is easily done with MPEG3 (just waiting in the wings) and most probably with MPEG-4 Part 10.
 

·
Read the FAQ!
Joined
·
36,621 Posts
For a different take on this stuff, see this sticky thread from the Amps/Receivers/Processors forum here:

http://www.avsforum.com/avs-vb/showthread.php?t=789994


There's no need to copy it all over here again.


Whether you call it hype or not, the simple fact is that many buyers are expecting more out of HDMI V1.3 than it can actually deliver -- in the real world -- over the next several years. There is nothing wrong with HDMI V1.3, per se. It's just that people see marketing terms like "Deep Color" and expect dramatic improvements which just aren't going to be there. Rather than touting things that lead to unrealistic expectations ("gee, maybe a miracle will occur and the HD-DVD and Blu-Ray formats will magically change somehow to allow movies that double the max data rate off disc, and the player you buy this year will magically know how to decode/handle that entirely new data format") why not just be up front about it and allow that Deep Color offers nothing to HD-DVD or Blu-Ray buyers, but is an enabling technology for future formats still several years away?


Rather than claiming that decreased banding or enhanced calibration will result from widening the HDMI transmission pipe, why not just be up front about it and allow that modern TVs *ALREADY* do more than 8-bit internal video processing, which is where it really matters? That THAT'S where the advantage lies?


The xvYCC color space is less of a problem, since most buyers have no clue what it is and wouldn't understand it if they did. But any claim that xvYCC support in HDMI V1.3 is going to mean ANYTHING to consumers of mass market, commercially produced content (i.e., movies you can buy on disc or TV shows you can tune into) for the balance of this decade is just plain wrong. Again, the limit is the content. You can't create a greater color gamut than is in the content to begin with.


Automatic Lip Sync correction is another example of something that buyers are reading way more into than is really there. Ask any DirecTV customer if he'd like to get rid, automatically, of some of the gross lip sync errors in his DirecTV viewing and of course he'll say yes. And how much effort is put into explaining to him that auto Lip Sync in HDMI V1.3 won't fix that? Or that the process of adjusting for the fixed, designed in, video processing delays in his TV is ALREADY trivial with a manual lip sync control in his AVR and a common, everyday calibration DVD?


How 'bout explaining to the poor HDMI V1.3 receiver buyer that HD-DVD titles authored for "advanced" content (i.e., virtually all such titles) or future Blu-Ray titles authored for "player profile 1.1" have to have their TrueHD or DTS-HD MA audio decoded *IN THE PLAYER* to play completely correctly, due to audio mixing that has to happen in the player and can't happen until after the packed audio formats are decoded?


HDMI V1.3 receivers with built-in decoders are just around the corner. And HD-DVD and Blu-Ray users who buy those receivers will eventually discover that their discs don't play the same "stuff" with their receiver doing the decoding as they can see at their friend's house if he has a player with the decoding built-in.


And the very idea of saying that decoding a lossless packed audio track into its component PCM streams in the player can POSSIBLY IN ANY WAY WHATSOEVER be inferior to the same process in the HDMI V1.3 receiver is also just wrong. These formats are "lossless" because the PCM that comes out of decoding is bit for bit identical to the PCM that went into the encoder in the studio. The certification process by Dolby Labs and DTS is meant to ensure this. The bits you end up with in the receiver are no different and no better.


And why on earth would the receiver treat incoming, pre-decoded PCM any worse than the PCM it might produce itself by internal decoding?


HDMI V1.3 is a useful next step in the HDMI standard. But there is significant misunderstanding in the buying community about what advantages it can really deliver and when it can deliver them. I don't think that is entirely the fault of the buyers.

--Bob
 

·
Registered
Joined
·
1,241 Posts

Quote:
people see marketing terms like "Deep Color" and expect dramatic improvements which just aren't going to be there

It is possible to extract more than 8 bits per component of resolution from the macroblocks in current codecs, and display that, even though no more than 8 bits were put in. That may not be the data that was put in there, but it might still lead to a picture with better perceived quality than a decoder that quantizes to 8 bits.


What's more important: when scaling (720 to 1080, or 1080 to 720), you will end up with intermediate color values, which certainly will have less banding if you use Deep Color. Even just 10 bits instead of 8 is a noticeable improvement.
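
A toy example of what I mean (my own illustration, not anything from the spec): a scaler interpolating between two adjacent 8-bit codes lands on a half step; an 8-bit connection has to round that back onto one of the original codes, while a 10-bit Deep Color connection can carry it.

Code:
# Scaling creates values between the original 8-bit codes; only a wider link keeps them.
a, b   = 100, 101                 # two adjacent 8-bit levels in a smooth gradient
interp = (a + b) / 2              # the scaler's interpolated value: 100.5

out_8bit  = round(interp)         # forced back onto 100 or 101 -> the step (banding) survives
out_10bit = round(interp * 4)     # 8-bit value mapped to 10-bit: 402, a genuine half step

print("interpolated:", interp)
print("over 8-bit  :", out_8bit)
print("over 10-bit :", out_10bit, f"(i.e. {out_10bit / 4} in 8-bit terms)")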


Yes, you'll probably have to buy a new source (read: HD player), and scaler (read: receiver), but at least you can get some benefit even from the current content.
 

·
Read the FAQ!
Joined
·
36,621 Posts

Quote:
Originally Posted by jwatte /forum/post/0


It is possible to extract more than 8 bits per component of resolution from the macroblocks in current codecs, and display that, even though no more than 8 bits were put in. That may not be the data that was put in there, but it might still lead to a picture with better perceived quality than a decoder that quantizes to 8 bits.


What's more important: when scaling (720 to 1080, or 1080 to 720), you will end up with intermediate color values, which certainly will have less banding if you use Deep Color. Even just 10 bits instead of 8 is a noticeable improvement.


Yes, you'll probably have to buy a new source (read: HD player), and scaler (read: receiver), but at least you can get some benefit even from the current content.

This is an advantage at the margins. Remember we are talking about what gets transmitted BETWEEN devices, not what gets done in the internal processing before that video gets sent out by any given device.


If the player does 10 (or 12) bit internal processing, and sends 8 bit results to a scaler which does 10 (or 12) bit internal processing, and sends 8 bit results to a display, which then does 10 (or 12) bit internal processing for things like gamma correction, you will, I suspect, get essentially all of the advantage you are anticipating. And of course better players, scalers and displays ALREADY do this.


But that aside, if you ask the average buyer what Deep Color in HDMI V1.3 means to him you are going to get a very different take than the reality. It's really no different than a few years back when buyers really thought that you could turn SDTV into HDTV just by scaling up the resolution.


It's worse in some senses this time, because buyers actually believe HD-DVD and Blu-Ray discs will deliver, in fact, Deep Color content, and not enough people are telling them otherwise.

--Bob
 

·
Registered
Joined
·
807 Posts

Quote:
Originally Posted by Bob Pariseau /forum/post/0


There is nothing wrong with HDMI V1.3, per se.

Well, actually, there is: all these wonderful features are OPTIONAL. Claiming 1.3 compliance per se does not mean much: pretty much any device on the market today with an HDMI connector can claim such compliance. No 1.3 device has to provide support for Deep Color, xvYCC, or the hi-def audio formats- not even lip-sync.

I completely agree with your point about content, or the lack thereof. But this is always the case with a new technology: content providers and mastering software developers have no way even to test their material until there is a delivery mechanism, and there is no delivery mechanism until it's hyped up to deliver money.


Small note about audio mixing in a player: 1.3 has enough bandwidth to be able to provide multiple audio streams to be mixed in a receiver. Is it useful, or is there any advantage to that? I don't know. I don't even know if 1.3 has a provision for that. But it can be done.
 

·
Read the FAQ!
Joined
·
36,621 Posts
ptsenter,

Although this is certainly a problem for consumers -- made more so by the fact that manufacturers can't agree what to CALL this stuff in their marketing materials, thus making it even harder to tell whether any given product implements any given feature -- I don't really balk at that. HDMI V1.3 is enabling technology, as it should be. The market will determine which features get pushed and which lag behind. Some features will take years to pan out.


What I balk at is letting people believe that something is there which really isn't.


Toshiba, for example, is putting out shelf placards for their XA2 player which tout HDMI V1.3 and note its "Deep Color" technology. This despite the fact that they know full well there is *NO DISC* that player can play, now or in the future, which actually contains Deep Color content. Their advertising is technically correct but deliberately misleading.


Unfortunately that's the state of the market right now. HDMI V1.3 is becoming a buzzword -- a check-off item -- with deliberately slippery meaning.


Caveat emptor.

--Bob
 

·
Registered
Joined
·
807 Posts

Quote:
Originally Posted by ptsenter /forum/post/0


Small note about audio mixing in a player: 1.3 has enough bandwidth to be able to provide multiple audio streams to be mixed in a receiver. Is it useful, or is there any advantage to that? I don't know. I don't even know if 1.3 has a provision for that. But it can be done.

1.3 has a provision for up to 8 audio streams.
 

·
Read the FAQ!
Joined
·
36,621 Posts

Quote:
Originally Posted by ptsenter /forum/post/0


1.3 has a provision for up to 8 audio streams.

What about the control information to instruct the mixer? Remember the user interface is in the player and instructions come from the disc itself.


I would point out that neither Dolby Labs nor DTS has even suggested in their materials that it might be possible for the mixing to take place downstream of the player. They have talked about bypassing the mixing -- essentially not doing it -- but not about moving it to the receiver.

-- Bob
 

·
Registered
Joined
·
807 Posts

Quote:
Originally Posted by Bob Pariseau /forum/post/0


HDMI V1.3 is enabling technology, as it should be.

Apparently, it's not enough. Case in point: HDMI Org is working, first, on a way to prevent manufacturers from coming up with their own names for the same features and, second, on making sure consumers understand what's actually implemented and what isn't.
Quote:
What I balk at is letting people believe that something is there which really isn't.

And this is HDMI's fault first, and the manufacturers' second.

Quote:
HDMI V1.3 is becoming a buzzword -- a check-off item -- with deliberately slippery meaning.

Ditto.
 

·
Registered
Joined
·
807 Posts

Quote:
Originally Posted by Bob Pariseau /forum/post/0


What about the control information to instruct the mixer? Remember the user interface is in the player and instructions come from the disc itself.


I would point out that neither Dolby Labs nor DTS has even suggested in their materials that it might be possible for the mixing to take place downstream of the player. They have talked about bypassing the mixing -- essentially not doing it -- but not about moving it to the receiver.

-- Bob

1.3 also allows control information (even the most elaborate) to be passed, even though it's, guess what, optional.

What Dolby Labs and DTS are doing is totally different. I don't know their reasoning, which probably has nothing to do with HDMI. It does not mean it can't be done.
 

·
Read the FAQ!
Joined
·
36,621 Posts

Quote:
Originally Posted by ptsenter /forum/post/0


1.3 also allows control information (even the most elaborate) to be passed, even though it's, guess what, optional.

What Dolby Labs and DTS are doing is totally different. I don't know their reasoning, which probably has nothing to do with HDMI. It does not mean it can't be done.

Well this is interesting in its own way, but of course the reality is that NOBODY is suggesting HDMI V1.3 receivers will actually implement the unique audio mixing and control requirements separately defined for the HD-DVD and Blu-Ray formats just to offload those functions from the players. Nor that players will be designed to ship out multiple audio tracks plus mixing control info.


So really we're back where we started: Despite HDMI V1.3, proper playback of current, "advanced" HD-DVD discs and future, "player profile 1.1" Blu-Ray discs will require audio decoding and mixing in the player. And the result of that is a set of PCM streams that works just fine over HDMI V1.1 or V1.2.

--Bob
 

·
Registered
Joined
·
5 Posts
Real uncompressed high definition video content is created in 10- and 12-bit digital signal processing environments. Some compressed HD video content is created in 8-bit.


Fake high definition, known as HDV, is a consumer video acquisition format being pushed by some manufacturers as legit HD. It is highly compressed out of the camera as an MPEG-2 stream with some color space limitations, and it commonly shows banding, pixelation, and dropouts.


Higher levels of high definition acquisition and processing are right around the corner, with new technology peeking out from under the covers. These new HD formats will take advantage of the increased data rates afforded by HDMI v1.3 and provide the engines to push the entire content creation chain forward to exploit its increased capacities.


Granted, manufacturers can be counted on to hype and deceive us about their products. However, when one finally takes advantage of new features enabled by new technology with the availability of the new HD content, the others will fall in line merely to maintain the appearance of being on the cutting edge.


So, yes. The implementation of HDMI v1.3 will go through a period of much hype and lies until one manufacturer takes full advantage of the new video and audio capabilities of v1.3 coupled with its 'proprietary' HD disc delivery format. Once this occurs, the rest will jump in line and HDMI v1.3 will then be used to its full capacity.
 

·
Read the FAQ!
Joined
·
36,621 Posts
scriptshooter,

Sure. HDMI has to lead the way with the V1.3 spec so that development like this can happen. There will also be plenty of improvement out there that is independent of the extensions introduced in V1.3 -- i.e., it is equally well supported by V1.1.


For example, one of the most important changes in the very near term is that superior video de-interlacing and scaling solutions are starting to migrate down into affordable products. These don't depend on HDMI V1.3, but you can bet that some manufacturers will tie the two together in marketing materials in an effort to get people to replace older HDMI products with newer V1.3 products sooner than they need to. I.e., implying that you need a new HDMI V1.3 standard DVD player to take advantage of what's actually improved de-interlacing and scaling in a new TV that also happens to be HDMI V1.3.


In the effort to promote HDMI, whether V1.3 or not, spokespeople should be careful not to oversell things that way.


It is also the case that non-mass market formats will precede new mass market formats. Many enthusiasts enjoyed HD quality movies on D-Theater tapes well before HD-DVD and Blu-Ray were launched. And perhaps new enthusiast formats will start to appear that take advantage of the HDMI V1.3 feature set.


But nobody should confuse buyers into thinking that this has anything to do with the REAL mass market formats -- i.e., HDTV (whether off air, cable, or satellite), HD-DVD, Blu-Ray, or standard DVD.


New mass market formats will, eventually, come out of course. The gestation period will likely be similar to that for HD-DVD and Blu-Ray. I.e., years. The issue is not whether a technology or combination of technologies can support a new feature from capture to display. The issue is how to get enough industry backing behind some new set of technologies to launch a new, incompatible format. And for HDTV to change you also have to allow for the swap out of capital equipment in the networks and local stations.


But have no doubt that there are technology issues here as well. Current technology for digitizing film stock can't even reach 1920x1080 resolution yet. Deep Color capture has its own challenges. And of course not all movies can be computer generated. End-to-end, digital, live movie production will likely be the eventual solution, but it is still early days.


This is not reason for giving up. It's just reason for saying spokespeople should be cautious when touting immediate advantages from HDMI V1.3.


And HDMI will also be tarnished if manufacturers continue to take the minimalist approach to implementation. Manufacturers are already feeling the heat because far too many HDMI V1.1 and V1.2 implementations don't handle 1080p resolution for example (which is, of course, optional). And that doesn't even get into the whole interoperability and connection robustness issue.


[It is fundamental that the HDCP and EDID processing issues get resolved, and perhaps Simplay will finally do that. I'm skeptical.]


I suppose what bothers me more than anything is that buyers are being led to believe that they HAVE TO update all of their equipment to HDMI V1.3, when in fact HDMI V1.1 or V1.2 alone, or a mix of HDMI V1.3 with those, will actually yield the same results, for all practical purposes, even for HD-DVD and Blu-Ray, over at least the next couple years.


My personal take on this is as follows:


* If you want to buy a product now, there is no reason to wait for HDMI V1.3. You can do all the fun and useful stuff over the next couple years with a well engineered HDMI V1.1 or V1.2 product, of which there are many. So if you see a product you like and it happens to be a good implementation of "only" HDMI V1.1, then go for it.


* Nor is there reason to pay a premium for HDMI V1.3. There is no immediate enhanced value which justifies premium pricing. It is all future potential that is at least a year or two out, and in many cases more.


* But there's also no reason *NOT* to get it! If the product you like happens to come with HDMI V1.3 then fine! And don't feel you have to change out all your other HDMI products at the same time. Just be sure that everything ELSE the product does justifies what you are being asked to pay for it.


That last point needs to be qualified in one particular. Using HDMI V1.3 to HDMI V1.3 connections at their highest data rate will require cables engineered for that. Folks who are doing in-wall cabling, particularly for longer runs, may run into problems when they finally put HDMI V1.3 products on both ends of the cable and crank it up. Although HDMI V1.3 cabling provisions are in place, it is hard for consumers to know which cable manufacturers are actually doing the right thing here because there is not yet enough HDMI V1.3 consumer gear out there -- and none at all that implements the highest bandwidth HDMI V1.3 connections yet -- for real user feedback.

--Bob
 