AVS › AVS Forum › Blu-ray & HD DVD › HDTV Software Media Discussion › Industry Insiders Q&A Thread: only Questions to insiders please

Industry Insiders Q&A Thread: only Questions to insiders please - Page 58  

post #1711 of 4623
Quote:
Originally Posted by wolfyncsu7
Question to any insiders who might know (and I apologize if this has been posted before... I searched for a little bit ... I promise):

Is HD DVD and/ or Blu-ray able to provide source material that will utilize the expanded color range (not sure if I phrased this correctly) that HDMI 1.3 brings to the table?

Is it just a matter of more bits and more space needed to provide more colors? Will this also be dependent on the data rate transfer specs for each format?
Crickets....

I guess that means there is no support for greater color depth in either format.

- Rich
post #1712 of 4623
That's correct, Rich. The whole HDMI discussion reminds me of people who confuse the link speed of a disc drive with its actual performance.

This is the real problem with these industry standards. They move slowly to adopt new formats. So we are stuck with 4:2:0 sampling at 8 bits. Not that this is terrible, but before we dream about "more colors" we have other things to improve on...
post #1713 of 4623
Regardless of the bit-depth of HD encodings however, couldn't players be provided that support HDMI 1.3 to stream video content using higher-bit-depth? I say this because there's often a bit of processing/decoding taking place in the player and often these calculations are above 8-bits. With deep-color HDMI 1.3 on an HD DVD/BD player, the video could stream to the display at the higher interpolated color depth rather than being decimated back down (like a 16-bit source transmitted as 24 bits to the D/A converter after EQ DSP rather than decimated back to 16)
post #1714 of 4623
Quote:
Originally Posted by amirm
Their solution to above is to use PCM. There, you lose the above savings. So now we are talking about 5.8 mbit/sec and whopping 12 mbit/sec for 96khz since even a 100% stream of zeros takes the same space as random data. This rules out general usage on BD-25 as otherwise, space becomes a big issue and you get to decide between that last bit of audio fidelity versus lots of extras (compared to HD DVD-30).

12 mbit/sec is about the same speed as VC-1 is doing, isn't it? One lossless track is using the same amount of space as the video? Pure crazy-ness.
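The uncompressed PCM figures being quoted can be sanity-checked with simple arithmetic. A minimal sketch, assuming 6 channels at 20 bits per sample (a guess at the parameters behind those numbers, not stated in the thread):

```python
def pcm_bitrate_mbps(sample_rate_hz, bits_per_sample, channels):
    """Uncompressed LPCM bitrate in megabits per second."""
    return sample_rate_hz * bits_per_sample * channels / 1e6

# Assumed parameters: 48/96 kHz, 20-bit, 6-channel (5.1) audio
print(pcm_bitrate_mbps(48_000, 20, 6))   # 5.76  -- the "5.8 mbit/sec"
print(pcm_bitrate_mbps(96_000, 20, 6))   # 11.52 -- the "whopping 12 mbit/sec"
```

As the quote notes, uncompressed PCM costs this much regardless of content, since even a stream of zeros takes the same space as random data.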
post #1715 of 4623
I hope there is no processing on HD video :). Even if there is, I am not sure a display would actually show the extra bits. But yes, it is always good to have a better pipe than the source, so as not to have that be a limitation in future scenarios.
post #1716 of 4623
Quote:
Originally Posted by skogan
12 mbit/sec is about the same speed as VC-1 is doing, isn't it? One lossless track is using the same amount of space as the video? Pure crazy-ness.
Had not thought about it that way :).
post #1717 of 4623
Quote:
Originally Posted by DaViD Boulet
Regardless of the bit-depth of HD encodings however, couldn't players be provided that support HDMI 1.3 to stream video content using higher-bit-depth? I say this because there's often a bit of processing/decoding taking place in the player and often these calculations are above 8-bits. With deep-color HDMI 1.3 on an HD DVD/BD player, the video could stream to the display at the higher interpolated color depth rather than being decimated back down (like a 16-bit source transmitted as 24 bits to the D/A converter after EQ DSP rather than decimated back to 16)
Right now at least we're stuck with 8 bit 4:2:0 imagery on the packaged media. Personally I hope that we can expand the specifications to include support for 10 bit content at some future date (and yes, there are ways to do that).

But 8 bit content aside, both blue-laser formats require a substantial amount of image processing downstream of the video decoder(s). The operations include luma keying of the PiP channel, interactive/presentation graphics compositing, color space conversion, scaling, etc. Every one of these operations potentially results in "fraction bits" to the right of the least significant bit in an 8 bit model. Many designs truncate these fraction bits or use them to invoke a rounding operation back to an 8 bit integer.

There is reason to believe that going to a 10 bit video data path right after the video decode, and keeping that extra precision around right through to the HDMI 1.3 connector will result in improved image quality.
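The "fraction bits" point can be illustrated with a toy example: a hypothetical two-stage adjustment (a gain followed by a lift, both invented for illustration, not taken from any real player) applied to one 8-bit code value, once truncating back to 8 bits between stages and once carrying a 10-bit intermediate:

```python
def two_stage(x, keep_bits):
    """Apply a 1.125x gain then a 0.75-code-value lift at keep_bits precision."""
    shift = keep_bits - 8
    v = x << shift                   # promote the 8-bit value to the working depth
    v = (v * 9) // 8                 # stage 1: gain of 1.125; fraction bits appear here
    v = v + (3 << shift) // 4        # stage 2: lift of 0.75 (in 8-bit code values)
    return v                         # result is still at keep_bits depth

x = 100                              # exact answer: 100 * 1.125 + 0.75 = 113.25
out8  = two_stage(x, 8)              # 112  -- fraction bits truncated at each stage
out10 = two_stage(x, 10)             # 453  -- i.e. 113.25 on the 8-bit scale
print(out8, out10 / 4)
```

The 10-bit path lands on the exact answer while the truncating path drifts low, which is the kind of error that accumulates across keying, compositing, color conversion, and scaling stages.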
post #1718 of 4623
Question for Amir and Keith:

Do either of you have any comments on this post, and follow-up posts?

Some pertinent quotes:
Quote:
Originally Posted by dr1394
Toshiba and HD-DVD have painted themselves into a corner with iHD. The requirements of the graphics subsystem are way beyond the capabilities of any current SoC solution. Either you build another PC based player (which nobody wants to do), or you build an SoC based player that isn't fully iHD capable.
Quote:
Originally Posted by dr1394
HTML and XML are part of the problem, not the solution. They are too PC-centric, and require a classic PC graphics sub-system. SoC solutions have limited graphics (at least today). BD-J is a much better fit to current SoC graphics capabilities.
Quote:
Originally Posted by Enigma
Are you saying that you believe that BD SoC players are not capable of BD-J menus, despite the BD spec, or are you saying that BD-J is simpler to implement than iHD?

Further, when you say "current SoC"; clearly this is a short term issue (not having an SoC solution). Keith of Sigma Designs has mentioned numerous times that their SoC chip will be capable of both BD and HD DVD, with different firmware. I'm guessing that players with this chip will begin showing up next spring, probably from Tosh as well as others. Do you disagree?
Quote:
Originally Posted by dr1394
I am saying that iHD requires a more capable graphics sub-system than BD-J. Assuming that SoC solutions are the path to lower cost players, then the SoC silicon is burdened with the heavy graphic requirements of iHD.
Quote:
Originally Posted by dr1394
Will they be fully iHD capable? That is, supporting all the possible objects per line while maintaining frame rate?
Quote:
Originally Posted by Enigma
I thought that was required by the HD DVD spec. Perhaps we can get some clarification from an insider?
I'd be interested in Amir's comments wrt iHD (as well as the general topic), and Keith's comments wrt SoC implementation (on a future anonymous product, of course).
post #1719 of 4623
Very strange comments. Unless he is saying BD-J cannot do everything iHD can, I have no idea what their reference to graphics is. After all, the goal of both systems is to display something eventually. As such, it is puzzling that he would think less graphics capability is needed to display the same data with BD-J!
post #1720 of 4623
Quote:
Originally Posted by Tom McMahon
Right now at least we're stuck with 8 bit 4:2:0 imagery on the packaged media. Personally I hope that we can expand the specifications to include support for 10 bit content at some future date (and yes, there are ways to do that).
That is very interesting (to me at least).

Would you vote for 10 bit 4:2:0 or would you like to go straight to 10 bit 4:4:4? Also, would expanding the specification work for both BluRay and HD-DVD? Could you post a little comment about how expanding the specification would work? Would new media with 10 bit encoding still be compatible with old players (of course only to the extent that old players would play only 8 bit)?

Thanks!
post #1721 of 4623
Quote:
Originally Posted by madshi
That is very interesting (to me at least).

Would you vote for 10 bit 4:2:0 or would you like to go straight to 10 bit 4:4:4?

wouldn't 4:4:4 require 12 bit?
post #1722 of 4623
Quote:
Originally Posted by AV Doogie
wouldn't 4:4:4 require 12 bit?
Not if it is YUV end to end.
post #1723 of 4623
Quote:
Originally Posted by madshi
That is very interesting (to me at least).

Would you vote for 10 bit 4:2:0 or would you like to go straight to 10 bit 4:4:4? Also, would expanding the specification work for both BluRay and HD-DVD? Could you post a little comment about how expanding the specification would work? Would new media with 10 bit encoding still be compatible with old players (of course only to the extent that old players would play only 8 bit)?

Thanks!
The correct next step would be to go to 4:2:0 10 bit through the packaged medium "channel". This would actually result in a reduction in the bitrate required to reach the same PSNR and visual quality levels as 8 bit coding. I know that this is completely counterintuitive but it is true because of the reduction in coding "noise". Dolby submitted several video coding contributions to the JVT (MPEG/ITU) that demonstrated this.

4:2:2 is *only* required for interlaced content and therefore we can avoid that altogether for any enlightened extensions to the spec. Future film content will *never* be interlaced. Future video content will *never* be interlaced.

4:4:4 is nice but for consumer packaged medium applications is not required as the multi-generation benefits are not accrued. 4:2:0 is fine as long as all of the processing aspects are honored.

Therefore, in my personal opinion, 4:2:0 10 Bit with optional xvYCC color space calibration and HDMI1.3 connectivity to the display will offer tremendous value-add for the consumer in the coming years.
post #1724 of 4623
Quote:
Originally Posted by DaViD Boulet
Regardless of the bit-depth of HD encodings however, couldn't players be provided that support HDMI 1.3 to stream video content using higher-bit-depth? I say this because there's often a bit of processing/decoding taking place in the player and often these calculations are above 8-bits. With deep-color HDMI 1.3 on an HD DVD/BD player, the video could stream to the display at the higher interpolated color depth rather than being decimated back down (like a 16-bit source transmitted as 24 bits to the D/A converter after EQ DSP rather than decimated back to 16)
Let me mention another aspect of the current blue-laser ENcoder situation. Some encoders accept 8 bits only. Some accept 10 bits. Either way somewhere deep in the encoder, whether it be MPEG-2, VC-1 or H.264/AVC, there is a conversion to "8 bits" (that's a simplification but let it be for now) for the final content that's packaged on the disc.

Depending on where the encoder does the math there may be a quality difference in the resulting video that the consumer experiences. So when you're talking about 8 bit, 10 bit, VC-1, H.264/AVC etc. you must be extremely precise in order to be comparing apples to apples.
post #1725 of 4623
Quote:
Originally Posted by dolphan
Hello Insiders!

Fantastic thread! Please keep it going.

I was wondering why Sony would not use SACD as the audio for BD? It is multichannel and sounds wonderful.
SACD as a mass market format is essentially dead. Those who have used DSD for mastering have mixed reports of quality. I enjoy all the SACD's (and DVD-A's) that I have.

With HD DVD and Blu-ray optical media, Dolby TrueHD 5.1 and DTS-HD Master Audio are both new lossless codecs that seem to be poised to replace SACD and DVD-A.

The Blu-ray camp has announced that at least 18 record labels, which made up 50% of music sales last year, have received Blu-ray hardware and tools for mastering. The only thing we don't know is if they will be using Dolby's TrueHD or DTS-HD Master Audio.

Perhaps Roger can shed some light for us?
post #1726 of 4623
Quote:
Originally Posted by Tom McMahon
Future video content will *never* be interlaced.
Tom,

Are you suggesting adding 1080p60 storage also? Or would this future video content be something else?

--Darin
post #1727 of 4623
Quote:
Originally Posted by TomsHT
Talkstr8t, I am curious if there have been any recent advances made referring to BD50’s
Sorry, I've had very little exposure to that end of the technology chain so I don't have a meaningful response for you.
post #1728 of 4623
Quote:
Originally Posted by amirm
Very strange comments. Unless he is saying BD-J cannot do everything iHD can, I have no idea what their reference to graphics is. After all, the goal of both systems is to display something eventually. As such, it is puzzling that he would think less graphics capability is needed to display the same data with BD-J!
I haven't yet read that thread, but it sounds like he is referring to the fact that a very basic browser with simple HTML and ECMAScript support typically requires at least as much CPU and more memory to perform satisfactorily as does a non-optimized (no JIT) Java VM environment as specified by Blu-ray and OCAP. Given the extensive additional graphics capabilities required by iHD relative to a basic browser, and the fact that all BD-J implementations are expected to include JIT support (which provides up to a 10x performance improvement relative to no JIT), I would assume a far more capable processor would be required to adequately support iHD.

- Talk
post #1729 of 4623
Talk,

Why have we not seen any movies even using BD-J menus up to this point?

Thanks,
Robert.
post #1730 of 4623
Quote:
Originally Posted by amirm
Even if there is, I am not sure a display would actually show the extra bits.
Many decent HDTVs now use 12-bit or more processing internally. They really rip apart the video and throw it back together. It's a wonder we get a picture at all. :)
post #1731 of 4623
Kjack,
If you were going to hook up your HD player (assuming you have one, of course) to a 42- to 50-inch HDTV, is there anything on the market right now you would go for?
post #1732 of 4623
Quote:
Originally Posted by wolfyncsu7
Question to any insiders who might know (and I apologize if this has been posted before... I searched for a little bit ... I promise):

Is HD DVD and/ or Blu-ray able to provide source material that will utilize the expanded color range (not sure if I phrased this correctly) that HDMI 1.3 brings to the table?

Is it just a matter of more bits and more space needed to provide more colors? Will this also be dependent on the data rate transfer specs for each format?
(Acknowledging Amir and Rich's posts, yes there were crickets for a little bit.)

Wolf/Rich,

First off, don't we want TVs that support 709 properly? Several DLPs and LCDs I've dug into still have no 10-bit support. There's a Pro-level plasma from Pioneer that says it has true 10-bit support, but that horse burned me bad before... ;) Don't worry, we'll still look at it to see if it'll pass our testing.

Amir is right that there's more to fix than whether or not we've got 10-bit 4:2:0 on the disc. Also, I don't think you want your players going THAT far up in price right off the bat as it's not just the codec and decoder you've got to worry about. There's the compositing and HDMI transmission chips that have to be upgraded. As I'm not an engineer of these boxes, I'd be curious to know exactly how much these would cost. The PS3 tacked on some for the HDMI 1.3 ability. Does the Sony, Pioneer, or other stand-alone player support HDMI 1.3 already?

Tom from Broadcom spoke about the 10-bit High Profile spec for H.264. I read a little into it and believe, as he does, that 10-bit 4:2:0 is like HD resolution in that the percentage increase does not directly reflect the increase in bitrate (that's 25% up for the math whizzes out there, for the bit depth). Don't focus on the 256-levels-versus-1024-levels thing, as it's the number of bits, not levels, that gets stored on the disc. We could see it increasing the bitrate maybe 10-12%, but that's nothing when you're looking at going from 14Mbits to ~16Mbits ABR. BTW, 10-bit 4:2:2 would be about another 25% jump over the 8-bit-to-10-bit one. 18-21Mbits ABR, anyone??? Mmmmm, smoother color gradients (insert Homer gargle... :D ). So, to answer your question: does either disc spec support 10-bit video? Not to my knowledge. Can the codecs do it efficiently? Oh yeah, and better than MPEG-2 did just for the 4:2:0-to-4:2:2 jump (~7Mbits to ~18Mbits for SD???). Maybe VC-1 can/does have it as well.

However, as to my earlier point, I'm making this WAY too simplistic for the CEs out there. You could see a player cost jump over that piddly 45% (potential) I mentioned in my codec response. Also, I've yet to see a 10-bit profile encoder for H.264. x264 project, pretty please??? :o So this could be pure speculation, and it could be difficult to code an encoder for it, as little widely available work exists in that regard.

Cjplay.
post #1733 of 4623
Quote:
Originally Posted by amirm
I hope there is no processing on HD video :). Even if there is, I am not sure a display would actually show the extra bits. But yes, it is always good to have a better pipe than the source aso to not have that be a limitation in future scenarios.
I've seen one 10-bit plasma display. There are also high-end front projectors and CRTs that support it (checked a little deeper). Again, though, CEs have WAY further to go than codec designers. Which is easier to change? The 24-bit D/A to a 30-bit (10bpc is 30-bit RGB) in your TV, or the coding of the picture on a $20 disc? It'll come someday, but as Amir said, it'll take time for adoption. Not to mention, as David said, they could still be displaying only the 8bpc signal even though they may accept the 10-bit signal. Refresh rates for 24p (48Hz and 72Hz) are just coming to acceptable levels to remove tearing from 8-bit display. Add 10-bit to the mix and that could screw things up as well.

Cjplay.
post #1734 of 4623
Quote:
Originally Posted by AV Doogie
wouldn't 4:4:4 require 12 bit
To clarify, when any of us mention a bit depth, it's per channel: 10 bits per channel, 8 bits per channel. RGB24 in PCs is just 8-bit RGB. YUY2 and UYVY are 16-bit color, so 8-bit YUV or YVU. The 4:4:4 is just the number of stored samples. In 4:2:0, there are still 8 bits representing the chroma level; it's just that 1/4 of the samples are stored versus 4:4:4. 10-bit just lets 1024 levels of luma and chroma exist versus 256 in 8-bit. While that may seem like a 300% bitrate increase, you don't store the levels, but you do store the bits that contain those levels, so that's why it's really ~25% more info.

This is where Tom's "never" starts to confuse me: never needing 4:2:2, yet keeping only 1/4 of the chroma sampling in 4:2:0.

Tom, based on the below articles, I would think we would want 4:2:2 sampling, interlaced or no, to increase chroma sampling precision. As a compressionist "raised" on Broadcast, VOD, and DVD MPEG-2, I've seen the increased color quality and sharpness of 4:2:2 over 4:2:0. Again, feel free to refute this as I'm sure you will.
TVTech article on 4:2:2
From MSDN (Amir's neck of the woods) -
MSDN on YUV formats
Fourcc.org, my favorite YUV hangout, but not always 100% updated.
FourCC.org
So you can see why people get confused when you point out there's no need for 4:2:2 or even 4:4:4.

Cjplay.
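The sampling and bit-depth accounting above can be sketched in a few lines. This is a rough storage model for uncompressed planar YUV only (compressed bitstream rates behave differently):

```python
def bits_per_pixel(bit_depth, subsampling):
    """Average stored bits per pixel for an uncompressed planar YUV frame."""
    # chroma samples per luma sample, summed over both chroma planes
    chroma = {"4:4:4": 2.0, "4:2:2": 1.0, "4:2:0": 0.5}[subsampling]
    return bit_depth * (1 + chroma)

bpp_420_8  = bits_per_pixel(8,  "4:2:0")   # 12.0
bpp_420_10 = bits_per_pixel(10, "4:2:0")   # 15.0 -> the ~25% raw increase
bpp_422_10 = bits_per_pixel(10, "4:2:2")   # 20.0
print(bpp_420_8, bpp_420_10, bpp_422_10)
```

This confirms the 25% raw-data increase for 8-bit to 10-bit at fixed subsampling; the jump from 10-bit 4:2:0 to 10-bit 4:2:2 works out to another third on top of that in this uncompressed model.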
post #1735 of 4623
Quote:
Originally Posted by Cjplay
...Tom from Broadcom spoke about the 10-bit High Profile spec for H.264. I read a little into it and believe, as he does, that 10-bit 4:2:0 is like HD resolution in that the percentage increase does not directly reflect the increase in bitrate (that's 25% up for the math whizzes out there, for the bit depth). Don't focus on the 256-levels-versus-1024-levels thing, as it's the number of bits, not levels, that gets stored on the disc. We could see it increasing the bitrate maybe 10-12%, but that's nothing when you're looking at going from 14Mbits to ~16Mbits ABR. ...
10 bit baseband uncompressed source material has a higher bitrate than 8 bit material, but the compressed bitstream rate does not go up when coding 10 bit source (and depending on the bitrate and the content, it can go down!).

You are still coding 1920 by 1080 so the spatial part of the bitstream syntax remains identical. The frequency domain information is basically the same. The motion vectors get a bit more accurate, but their lengths remain about the same. All of that stuff gets entropy-coded by either CAVLC or CABAC as usual. I know that it's completely counter-intuitive, but the 10 bit bitrate does not go up. If the video was being run-length encoded maybe, but not with the H.264/AVC toolkit.

To quote from one of Dolby's JVT contributions (JVT-L026):

"These tests demonstrate that the quantization design for FRExt performs as intended – the rate-distortion depends almost entirely on QP. Sample bit depth only has an effect at low QP values where encoding at a higher bit depth always improves PSNR and often reduces the bit rate. Taken together these plots demonstrate that encoding at a higher bit depth is never more costly (in rate-distortion) than encoding at a lower bit depth, and is often more efficient."

(QP is the "Quantization Parameter", a codec setting that determines how coarsely the video is quantized. Low quantization = higher video quality, high quantization = lower video quality.)

Quote:
Originally Posted by Cjplay
However, as to my earlier point, I'm making this WAY too simplistic for the CEs out there. You could see a player cost jump over that piddly 45% (potential) I mentioned in my codec response. Also, I've yet to see a 10-bit profile encoder for H.264. x264 project, pretty please??? :o So this could be pure speculation, and it could be difficult to code an encoder for it, as little widely available work exists in that regard.
Cjplay.
The JVT source-code reference encoder supports 10 bits and has been freely available to everyone for the last several years (that's how everyone verified each other's work during the development of the High Profiles (FRExt)). You wouldn't want to use this encoder for any real world production, but it serves as an excellent starting point for anyone building real product for the blue-laser marketplace.
post #1736 of 4623
Quote:
Originally Posted by Cjplay
Tom, based on the below articles, I would think we would want 4:2:2 sampling, interlaced or no, to increase chroma sampling precision. As a compressionist "raised" on Broadcast, VOD, and DVD MPEG-2, I've seen the increased color quality and sharpness of 4:2:2 over 4:2:0. Cjplay.
4:2:2 only increases the chroma resolution in the vertical dimension. When you rotate your head 90 degrees does the picture look sharper?

There could be a number of reasons why you've experienced better looking video with 4:2:2 chroma resolution - the most likely being that it is easier (cheaper) to make chroma filters that don't have to perform the extra steps required to get in and out of 4:2:0. Perhaps the implementation you were comparing against cut some corners. Also note that comparing a 4:2:2 encoder from XYZ Corporation against a 4:2:0 encoder made by Coders Inc. is like comparing apples and yogurt.

With progressive content and good filter design, there is no reason whatsoever that you'd favor 4:2:2 over 4:2:0 for a one-time codec designed for packaged media distribution of high quality entertainment content.

The answer will be different, though, if you're doing multi-pass compression on tape machines or servers or distribution/backhaul networks, of course. So too if you're dealing with interlaced content.
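The geometric difference between the schemes is easy to lay out explicitly. A minimal sketch of the chroma plane dimensions (the conventional sampling factors; actual plane layout varies by pixel format):

```python
def chroma_plane_size(width, height, subsampling):
    """Dimensions of each chroma plane for common subsampling schemes."""
    if subsampling == "4:4:4":
        return width, height               # full resolution in both dimensions
    if subsampling == "4:2:2":
        return width // 2, height          # half horizontal, FULL vertical
    if subsampling == "4:2:0":
        return width // 2, height // 2     # half in both dimensions
    raise ValueError(subsampling)

print(chroma_plane_size(1920, 1080, "4:2:0"))  # (960, 540)
print(chroma_plane_size(1920, 1080, "4:2:2"))  # (960, 1080)
```

As the reply says, moving from 4:2:0 to 4:2:2 changes only the vertical chroma resolution; horizontal chroma detail is identical in both.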
post #1737 of 4623
Quote:
Originally Posted by madshi
That is very interesting (to me at least).

Would you vote for 10 bit 4:2:0 or would you like to go straight to 10 bit 4:4:4? Also, would expanding the specification work for both BluRay and HD-DVD? Could you post a little comment about how expanding the specification would work? Would new media with 10 bit encoding still be compatible with old players (of course only to the extent that old players would play only 8 bit)?
Thanks!
4:4:4 has a place in production, post-production, and the image processing on the back end of a high definition DVD player, but it wouldn't gain you much on a blue-laser disc itself.

The H.264/AVC specification already supports High 10 Profile.

I can't speculate on if, how or when the respective standards organizations might address these issues going forward. But it is clear to me that with Hollywood content being 10+ bits, and with HDMI 1.3 and everything downstream of it going 10+ bits, there may be an opportunity to connect those dots at some point.

There are several ways to maintain downward compatibility with the first N generations of players. Too much detail to type here. But if you choose to pursue one of those options it would work for both standards.
post #1738 of 4623
Quote:
SACD as a mass market format is essentially dead. Those who have used DSD for mastering have mixed reports of quality. I enjoy all the SACD's (and DVD-A's) that I have.

With HD DVD and Blu-ray optical media, Dolby TrueHD 5.1 and DTS-HD Master Audio are both new lossless codecs that seem to be poised to replace SACD and DVD-A.

The Blu-ray camp has announced that at least 18 record labels, which made up 50% of music sales last year, have received Blu-ray hardware and tools for mastering. The only thing we don't know is if they will be using Dolby's TrueHD or DTS-HD Master Audio.

Perhaps Roger can shed some light for us?
DSD (the audio format used on SACD) is a single-bit encoding design, and not amenable to any medium where DSP on the audio signal (such as bass management or downmixing of channels) might be needed, as in a home-theater recorded format.

In such cases, DSD (direct stream digital) is converted to LPCM in order to perform DSP chores. Might as well have been LPCM (or lossless compressed LPCM) to begin with.
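The single-bit-versus-multi-bit point can be made concrete with a toy decoder: a 1-bit oversampled stream only yields samples you can do arithmetic on after low-pass filtering and decimation. Here a simple 64-sample average stands in for a real decimation filter; the parameters are illustrative, not SACD's actual ones:

```python
def one_bit_to_pcm(bits, decimate=64):
    """Average non-overlapping windows of a 1-bit stream into multi-bit samples."""
    out = []
    for i in range(0, len(bits) - decimate + 1, decimate):
        window = bits[i:i + decimate]
        # average of +1/-1 pulses -> a multi-bit sample in [-1.0, 1.0]
        out.append(sum(1 if b else -1 for b in window) / decimate)
    return out

# a stream that is 75% ones decodes to samples near +0.5
stream = ([1, 1, 1, 0] * 16) * 4
print(one_bit_to_pcm(stream))  # -> [0.5, 0.5, 0.5, 0.5]
```

Any DSP (gain, bass management, downmixing) has to operate on those multi-bit samples, which is exactly the conversion to LPCM described above.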
post #1739 of 4623
Quote:
Originally Posted by Tom McMahon
4:2:2 is *only* required for interlaced content and therefore we can avoid that altogether for any enlightened extensions to the spec. Future film content will *never* be interlaced. Future video content will *never* be interlaced.
Are you saying that on the new formats all video will be effectively de-interlaced correctly using 'proper' algorithms as part of the encoding? So for example a disc version of some sporting event shot on HDTV in 1080i would be made available as 1080p and therefore we wouldn't have to rely on the patchy video de-interlacing of most displays?? Is that mandatory or just optional for those content providers that prefer to deliver a quality product??

I know it'll be some time before interlaced content disappears from broadcast streams, but if BD and HD-DVD mean it will disappear from disc media that will be good news indeed provided the encoding steps do a decent deinterlacing job...

Which I guess throws up another question about VC-1 encoding. Does it do de-interlacing and make use of as much information as possible to do it properly?? Or would this be a future enhancement...??
post #1740 of 4623
For a native 1080i feed I'd rather have the choice as a consumer to get a decent deinterlacer than be forced to watch the signal through whatever deinterlacer the studio stuck in the path! As time marches forward deinterlacing will improve... but once a disc is mastered already deinterlaced, that resultant image quality could never be improved.