UHD - the questions to be answered... - AVS Forum
post #1 of 51 | 09-17-2013, 08:22 AM | JENC (Thread Starter)
I'm sure UHD will be the next big thing. My next television will be one, and I can't wait.

But there are still many open issues with UHD, and the current status seems to be that only panel resolution can be checked off as OK. To support when-to-buy decisions (yes, I know we still have some waiting to do) and give people an overall status update, perhaps the combined expertise of this forum can help with the questions below.

Is interlaced finally history?
In HD television we had to struggle with two formats: 720p and 1080i. Many hailed 720p as the better one because of its progressive format, and in theory that is right, but almost nothing is recorded in true progressive format, so we end up with an interlaced recording in a progressive signal path. Not a good thing.

- Will interlaced finally be scrapped, so we get _true_ progressive recordings, in all situations, in UHD?

TV tuner?
The examples of UHD TV channels seen so far are satellite broadcasts with separate boxes.
For many reasons I dislike separate boxes, and my provider is cable.

- Is the standard for UHD TV tuners defined?
- Will we have DVB-C2 HEVC tuners (as opposed to DVB-C MPEG4)?

- Which tuners are in currently sold UHD TVs? And has the first DVB-C2 HEVC tuner appeared yet?


Color depth and bit rate:
Where are we on color depth and bit rate? The UHD formats seem to support greater color depth and bit rates than panels and other 4K hardware actually realize.

- HDMI 2.0 didn't achieve the full color depth and bit rate of the UHD specifications, did it?
- Will the coming UHD Blu-ray be better?

- What color depth and bit rate do the currently sold UHD TVs support? And which one is best?


HDMI 2.0:
It seems HDMI 2.0 didn't meet the requirements (at least not the requirements of us nerds :-)
- HDMI 2.0 maxes out at 60 Hz
- and what about color depth, as asked above?

But what does this mean for HDMI?

- Is an HDMI 2.1 coming up shortly to improve on these things?

- Or will another standard take over?

- Or shall we just learn to be content with e.g. 50/60 Hz?


US vs. Europe:
The US uses 60 Hz and Europe 50 Hz (I'm in Europe).

- Will we continue to see this difference with UHD, or will we finally get a world standard?

Many things are seen across continents, e.g. sports, documentaries, soaps, etc.

Also, in connection with the TV tuner standards above, will UHD TV broadcasts end up being 2160p50 in Europe and 2160p60 in the US? (And hopefully no 2160i50 formats, as mentioned above.)


I don't see many articles summarizing these known issues. I would like to know myself, and probably many others would too.

So, if you specifically know the answers to the above, please share with us :-)

Thanx

Jens
Denmark
post #2 of 51 | 09-17-2013, 05:43 PM | NLPsajeeth
Quote:
Will interlaced finally be scrapped, so we get _true_ progressive recordings, in all situations, in UHD?

Yes, that is the plan with the Rec. 2020 standard.
Quote:
HDMI 2.0 didn't achieve the full color depth and bit rate of the UHD specifications, did it?
HDMI 2.0 barely scratched the surface of the UHD standards set out in Rec. 2020. It is incapable of anything beyond 4K 60p 8-bit RGB 4:4:4. The current DisplayPort standard (1.2) is a little better with 4K 60p 10-bit RGB 4:4:4, but we will likely have to wait until DisplayPort 1.3 before we start seeing more support. At least that is coming soon; who knows when the next HDMI standard is coming.
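
To put numbers on that, here is a back-of-the-envelope sketch (assumptions: the common 4400x2250 total timing for 3840x2160 active video, and HDMI 2.0's roughly 14.4 Gbit/s effective payload after 8b/10b coding of the 18 Gbit/s link):

Code:
# Rough uncompressed link bandwidth, blanking intervals included.
def link_rate_gbps(h_total, v_total, fps, bits_per_channel, channels=3):
    return h_total * v_total * fps * bits_per_channel * channels / 1e9

for bits in (8, 10, 12):
    rate = link_rate_gbps(4400, 2250, 60, bits)
    verdict = "fits" if rate <= 14.4 else "exceeds HDMI 2.0"
    print(f"4K 60p {bits}-bit RGB 4:4:4 needs {rate:.1f} Gbit/s ({verdict})")

Only the 8-bit case squeezes under the limit, which is why HDMI 2.0 stops at 4K 60p 8-bit 4:4:4.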
Quote:
Will the coming UHD Blu-ray be better?

UHD Blu-rays can be encoded in any format and are not restricted by the cable used to transport the video. So far no UHD Blu-ray standard has been published, so no one knows at this point. That said, based on what has happened in the past, 4K Blu-rays will probably have a 23.976p frame rate and will most likely use 4:2:0, which requires far less bandwidth than 4K 60p 8-bit 4:4:4.
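
For a sense of scale, compare the raw pixel data of the two extremes (a sketch; 8-bit 4:2:0 averages 12 bits per pixel, 8-bit 4:4:4 uses 24, with no compression or blanking counted):

Code:
PIXELS_4K = 3840 * 2160

def raw_gbps(fps, bits_per_pixel):
    return PIXELS_4K * fps * bits_per_pixel / 1e9

print(f"4K 23.976p 8-bit 4:2:0: {raw_gbps(23.976, 12):.2f} Gbit/s of pixel data")
print(f"4K 60p 8-bit 4:4:4:     {raw_gbps(60, 24):.2f} Gbit/s of pixel data")

Roughly a 5x difference before HEVC compression even enters the picture.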
Quote:
What color depth and bit rate do the currently sold UHD TVs support? And which one is best?

Current UHD TVs only support 8-bit.
More color depth is always better. See chronoptimist's excellent reply below for more details.
Quote:
Is an HDMI 2.1 coming up shortly to improve on these things?

Shortly, no. Eventually, yes.
Quote:
Or will another standard take over?

Unlikely. Even Apple with all its power couldn't get the HDMI forum to move over to VESA signalling.
Quote:
Or shall we just learn to be content with e.g. 50/60 Hz?

High frame rate (>100 fps) will come eventually. It will take a while.
Quote:
Will we continue to see this difference with UHD, or will we finally get a world standard?
Yes, the difference will remain.
The ITU would like the world to standardize on 120.00 fps.
Europe does not wish to use motion compensation on its content to convert 120 fps video down to its legacy 50 Hz or 25 Hz rates. However, it still wants high-frame-rate content and will likely force the addition of 100 fps or 150 fps to any UHD standard.
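
The arithmetic behind that tug-of-war, as a small illustration: a legacy rate can be derived from a high-frame-rate master by simple frame repetition or decimation only if it divides the master rate evenly.

Code:
legacy_rates = [24, 25, 30, 50, 60]
for hfr in (100, 120, 150):
    served = [rate for rate in legacy_rates if hfr % rate == 0]
    print(f"{hfr} fps cleanly serves: {served}")

120 fps covers the 24/30/60 family but leaves 25 and 50 Hz needing motion-compensated conversion, hence Europe's push for 100 or 150 fps.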
post #3 of 51 | 09-18-2013, 03:26 AM | JENC (Thread Starter)
Thanx for a very good reply.
Quote:
Originally Posted by NLPsajeeth View Post

... The current DisplayPort standard (1.2) is a little better with 4K 60p 10-bit RGB 4:4:4, but we will likely have to wait until DisplayPort 1.3 before we start seeing more support. At least that is coming soon; who knows when the next HDMI standard is coming.

- Will DisplayPort have a chance of becoming the standard interface in TVs, Blu-ray players, set-top boxes, etc., taking over from HDMI as the de facto standard?
As we would like to have the better technology.

Your summary points out that it will still take some time and a few hardware increments before we are "finally happy".
But the hardware manufacturers are probably happy about that and will want to sell us a UHD set three times :-)
post #4 of 51 | 09-18-2013, 04:56 AM | coolscan
Quote:
Originally Posted by JENC View Post

Thanx for a very good reply.
- Will DisplayPort have a chance of becoming the standard interface in TVs, Blu-ray players, set-top boxes, etc., taking over from HDMI as the de facto standard?
As we would like to have the better technology.
The rumor is that the computer manufacturers fought hard for DisplayPort this round but lost out to the CEMs, which also rule content and content protection, largely by buying off and bullying the smaller companies into voting for HDMI.

Reading between the lines, that means Sony, which is at once a movie content provider, a CEM, the ruler of all things Blu-ray, and the holder of the HDMI Forum chairmanship, protecting HDMI and its very small increase in specs.
Quote:
Your summary points out that it will still take some time and a few hardware increments before we are "finally happy".
But the hardware manufacturers are probably happy about that and will want to sell us a UHD set three times :-)
Some are happier than others, while some would like to do away with restrictive connector standards altogether so they have more room to move forward with new equipment without being hindered by a stupid cable standard.

As long as Sony and their lapdogs rule, nothing revolutionary will happen to HDMI (the worst connector plug in the history of electronics) anytime in the foreseeable future.

And a comment on color bits and such: when we see how slow Hollywood is to move to basic 4K, don't expect any Rec. 2020 or higher bit depths soon.
The existing 4K-and-up cameras can barely capture much of that in RAW, and then you have the post-production workflow that has to manage such changes, and much of it barely copes with 4K as it is.

And 4K broadcasters will do whatever they can to keep file sizes and bandwidth as low as possible.

There is some light at the end of the tunnel with the work on the Academy Color Encoding System (ACES), which aims to get everybody on the same "color page" with an increased color gamut, but they have spent years on this and are not finished yet.
RED, one of the two companies that make professional 4K cine cameras (Sony is the other), is not happy with the work that has been done on ACES and has pulled out of using it.

Some articles on ACES (some quite old).
http://www.ibtimes.com/aces-color-space-gamut-end-all-gamuts-404278
http://www.filmlight.ltd.uk/store/press_releases/digital-film-central-filmlight-create-aces-workflow-for-elysium-4k-finish/
http://wolfcrow.com/blog/what-is-aces-academy-color-encoding-system/
http://www.oscars.org/science-technology/council/projects/aces.html

Your original questions in the first post are interesting enough, but they are really premature and impossible to answer for many years yet.
Don't expect any big changes in the transition from HD to UHD except more pixels.
post #5 of 51 | 09-18-2013, 06:02 AM | blee0120
Coolscan, do you think we are going to be using BT.709 with 8 bits for 4K for a while? If so, do you think it will be a big improvement over 2K?
post #6 of 51 | 09-18-2013, 06:28 AM | Chronoptimist
Quote:
Originally Posted by NLPsajeeth View Post

UHD Blu-Rays will only be as good as what is allowed by HDMI. However, 4K Blu-Rays will be 23.976p and most likely 4:2:0 which requires way less bandwidth than 4K 60p 8-bit 4:4:4.
The "UHD Blu-rays" could be encoded to any format they like, regardless of what HDMI standards support. If HDMI 2.0 only supports 8-bit 4:4:4, you could still encode 12-bit 4:4:4 data on the disc, as long as this gets converted to 8-bit on output. Then when HDMI 2.1 comes along, they could output a 12-bit 4:4:4 signal.

But yes, they will likely be 4:2:0 - it would probably hurt image quality to move to 4:4:4 video right now. Hopefully they will move beyond 8-bit though. It's more efficient for encoding, and looks better.
Quote:
Originally Posted by NLPsajeeth View Post

8-bit. If UHD content is created at Rec 709, 8-bit is sufficient. If UHD content gets produced at Rec 2020 then 12-bit should be used.
This is a common misconception. Even though current Blu-rays are 8-bit, an 8-bit display is absolutely not sufficient. For starters, Blu-ray is encoded as YCbCr, and displays have RGB pixels. Converting 8-bit YCC to 8-bit RGB is a lossy process - you need to go beyond 8-bit when you convert to RGB. Current displays are also incapable of simply accepting the values they have been sent from the player and displaying them without any further processing. There's a lot of complex processing going on to calibrate the displays, and 8-bit is insufficient there too. This is why, when you look at high end desktop monitors which use 10-bit IPS panels, they look better than 8-bit panels even when receiving an 8-bit signal.
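
A quick way to see the loss is to push 8-bit RGB values through an 8-bit YCbCr round trip (a minimal sketch using the BT.601 studio-range equations quoted later in this thread; HD actually uses BT.709 coefficients, but the rounding effect is the same):

Code:
def rgb_to_ycbcr(r, g, b):
    y  =  0.257*r + 0.504*g + 0.098*b + 16
    cb = -0.148*r - 0.291*g + 0.439*b + 128
    cr =  0.439*r - 0.368*g - 0.071*b + 128
    return round(y), round(cb), round(cr)       # quantize to 8-bit integers

def ycbcr_to_rgb(y, cb, cr):
    r = 1.164*(y - 16) + 1.596*(cr - 128)
    g = 1.164*(y - 16) - 0.813*(cr - 128) - 0.392*(cb - 128)
    b = 1.164*(y - 16) + 2.017*(cb - 128)
    return tuple(max(0, min(255, round(v))) for v in (r, g, b))

changed = total = 0
for r in range(0, 256, 5):                      # coarse grid to keep it quick
    for g in range(0, 256, 5):
        for b in range(0, 256, 5):
            total += 1
            changed += ycbcr_to_rgb(*rgb_to_ycbcr(r, g, b)) != (r, g, b)
print(f"{changed} of {total} sampled RGB triples come back altered")

Every altered triple is a potential banding step once the display's own processing is stacked on top.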
Quote:
Originally Posted by JENC View Post

- Will DisplayPort have a chance of becoming the standard interface in TVs, Blu-ray players, set-top boxes, etc., taking over from HDMI as the de facto standard?
As we would like to have the better technology.
It's not likely, though computer monitors seem to have all moved over to DisplayPort now.
post #7 of 51 | 09-18-2013, 06:39 AM | NLPsajeeth
Quote:
Will DisplayPort have a chance of becoming the standard interface in TVs, Blu-ray players, set-top boxes, etc., taking over from HDMI as the de facto standard?

Personally, I wish we could all standardize on DisplayPort, but for the reasons coolscan outlined I don't think the CE companies will let that happen, and HDMI will remain the standard for A/V equipment. I think the best we can hope for is for more vendors to follow in the footsteps of the Panasonic Viera TC-L65WT600 and include both DisplayPort and HDMI ports.
Quote:
And a comment on color bits and such: when we see how slow Hollywood is to move to basic 4K, don't expect any Rec. 2020 or higher bit depths soon.
The existing 4K-and-up cameras can barely capture much of that in RAW, and then you have the post-production workflow that has to manage such changes, and much of it barely copes with 4K as it is.

I agree, getting Hollywood to move to Rec. 2020 will take a very long time, judging by past transitions. However, Hollywood should already be doing most or all of its color work in the DCI P3 color space at a depth of 12 bits per channel. Since Rec. 2020 covers 99.98% of that gamut, perhaps there is hope we can at least watch things in DCI P3, whose gamut is at least 1.26x larger than Rec. 709's.
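
Those gamut figures can be sanity-checked from the published primaries (a sketch; triangle areas are compared in the CIE 1976 u'v' plane, where the commonly quoted ratios are measured):

Code:
PRIMARIES = {  # CIE 1931 (x, y) for R, G, B
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def to_uv(x, y):                   # CIE 1931 xy -> CIE 1976 u'v'
    d = -2*x + 12*y + 3
    return 4*x/d, 9*y/d

def area(primaries):               # shoelace formula for the gamut triangle
    (ax, ay), (bx, by), (cx, cy) = [to_uv(x, y) for x, y in primaries]
    return abs(ax*(by - cy) + bx*(cy - ay) + cx*(ay - by)) / 2

base = area(PRIMARIES["Rec.709"])
for name, prims in PRIMARIES.items():
    print(f"{name}: {area(prims)/base:.2f}x the Rec.709 gamut area")

which prints roughly 1.00x, 1.26x and 1.72x, matching the figures above.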
Quote:
RED, one of the two companies that make professional 4K cine cameras (Sony is the other), is not happy with the work that has been done on ACES and has pulled out of using it.

That is a bummer. I too am hoping ACES will take off, as it seems like a good solution and is slowly being adopted in more and more movie and TV productions. Hopefully RED will have no choice but to support it in the future.

post #8 of 51 | 09-18-2013, 06:47 AM | NLPsajeeth
Quote:
The "UHD Blu-rays" could be encoded to any format they like, regardless of what HDMI standards support.
Quote:
This is a common misconception. Even though current Blu-rays are 8-bit, an 8-bit display is absolutely not sufficient.

Thank you for pointing this out and explaining your reasoning. You are correct and I have updated my original post accordingly.

post #9 of 51 | 09-18-2013, 09:56 AM | coolscan
Quote:
Originally Posted by blee0120 View Post

Coolscan, do you think we are going to be using BT.709 with 8 bits for 4K for a while? If so, do you think it will be a big improvement over 2K?
The real advantage and improvement of real 4K over 2K happens at the capture stage (and somewhat in post-production treatment; seating distance has less to do with it), and as long as displays can reproduce the captured content without degrading it, the improvement will be apparent.

There will be a lot of "4K" that is not really 4K: up-converted, captured on inferior cameras with mere 4K sensors, crappy capture codecs, etc. So don't expect too much from 4K, at least not for the first five years.

I don't expect any content with a wider color space, even if the displays become able to display it, because of the content providers' paranoid fear of their content being ripped.
post #10 of 51 | 09-18-2013, 10:52 AM | Glimmie
Quote:
Originally Posted by coolscan View Post

And a comment on color bits and such: when we see how slow Hollywood is to move to basic 4K, don't expect any Rec. 2020 or higher bit depths soon.
The existing 4K-and-up cameras can barely capture much of that in RAW, and then you have the post-production workflow that has to manage such changes, and much of it barely copes with 4K as it is.

Hollywood has been doing 4K features since 2005. Remember that scanned 35mm negative can also produce 4K. It's true that it has not been mainstream, but now, as more and more theaters get 4K projection, we will see more end-to-end 4K features. As a stopgap, most feature films today are shot in 4K. The post production and DCI are done in 2K, but the 4K camera masters are held onto. In the future it's not difficult at all to re-conform the feature in 4K if needed: all the editorial and color decisions made for the 2K DCI can be applied to a 4K re-conform, which is largely an automatic process.

What makes little sense, IMO, is consumer 4K. Outside of large home theaters, the benefit is nil. We shall see how the market plays out on this one.

post #11 of 51 | 09-18-2013, 12:46 PM | blee0120
I've been watching a lot of BDs over the last 3 years. Many of them don't look that good; only a few look clear and crisp. So I think 4K will do the same: most will look OK while a slight few look great. Is that worth so much money?
post #12 of 51 | 09-18-2013, 02:05 PM | Chronoptimist
Quote:
Originally Posted by blee0120 View Post

I've been watching a lot of BDs over the last 3 years. Many of them don't look that good; only a few look clear and crisp. So I think 4K will do the same: most will look OK while a slight few look great. Is that worth so much money?
Even a mediocre Blu-ray looks much better than a DVD or an HD broadcast/stream.
It should be the same with 4K sources - though at 4K you will definitely be seeing the limits of most film scans (better than Blu-ray, but maybe not as good as a great digital 4K image).
post #13 of 51 | 09-18-2013, 02:23 PM | blee0120
If it's just adding resolution, how can it be that much better?
post #14 of 51 | 09-18-2013, 03:34 PM | rogo
Quote:
Originally Posted by blee0120 View Post

If it's just adding resolution, how can it be that much better?

Blu-ray does not just add resolution.

post #15 of 51 | 09-18-2013, 03:57 PM | blee0120
So what else is it going to add? It's still going to be 8-bit color and most likely BT.709.
post #16 of 51 | 09-18-2013, 07:25 PM | tgm1024
Quote:
Originally Posted by Chronoptimist View Post
 
Quote:
Originally Posted by NLPsajeeth View Post

8-bit. If UHD content is created at Rec 709, 8-bit is sufficient. If UHD content gets produced at Rec 2020 then 12-bit should be used.
This is a common misconception. Even though current Blu-rays are 8-bit, an 8-bit display is absolutely not sufficient. For starters, Blu-ray is encoded as YCbCr, and displays have RGB pixels. Converting 8-bit YCC to 8-bit RGB is a lossy process - you need to go beyond 8-bit when you convert to RGB.

 

Is this because of the different weights placed upon the red, green, and blue sensitivities in the eye with YCC? Despite its ease of understanding, RGB by itself is not a data-efficient format, because it places equal color resolution on all three channels when green should be given the lion's share and blue about 1/6th of green's share. No?


post #17 of 51 | 09-18-2013, 08:15 PM | Chronoptimist
Quote:
Originally Posted by tgm1024 View Post

Is this because of the different weights placed upon the red, green, and blue sensitivities in the eye with YCC? Despite its ease of understanding, RGB by itself is not a data-efficient format, because it places equal color resolution on all three channels when green should be given the lion's share and blue about 1/6th of green's share. No?
The short version is that YCC values don't translate directly to RGB values, so you need greater than 8-bit precision to avoid rounding. (rounding can create banding in the image)

As for chroma resolution: yes, it's true that we are more sensitive to some colors than others and resolve them with more spatial resolution, but it's really not a relevant argument as far as displays are concerned.
Our displays are nowhere near the limit of what our eyes can resolve, so subsampled chroma is still rather obvious. It's most obvious if you have ever connected a computer or games console (both of which are natively RGB) to a display which converts to YCC and downsamples chroma for its processing - the image is very obviously blurred.

For example: http://www.avsforum.com/t/1162100/the-official-pioneer-9g-north-american-krp-500m-krp-600m-owners-discussion-pt-ii/6200_100#post_20822997
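
If you want to see the mechanism without hooking anything up, here is a toy sketch (full-range BT.601-style coefficients, an arbitrary choice here; any matrix shows the same smear). One saturated red pixel on gray, with chroma averaged over pixel pairs as in 4:2:2:

Code:
def rgb_to_ycbcr(r, g, b):
    y  =  0.299*r + 0.587*g + 0.114*b
    cb = -0.169*r - 0.331*g + 0.500*b + 128
    cr =  0.500*r - 0.419*g - 0.081*b + 128
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.402*(cr - 128)
    g = y - 0.344*(cb - 128) - 0.714*(cr - 128)
    b = y + 1.772*(cb - 128)
    return tuple(round(max(0, min(255, v))) for v in (r, g, b))

row = [(128, 128, 128), (255, 0, 0), (128, 128, 128), (128, 128, 128)]
ycc = [rgb_to_ycbcr(*px) for px in row]

out = []
for i in range(0, len(ycc), 2):    # keep luma per pixel, average chroma pairs
    cb = (ycc[i][1] + ycc[i + 1][1]) / 2
    cr = (ycc[i][2] + ycc[i + 1][2]) / 2
    out += [ycbcr_to_rgb(ycc[i][0], cb, cr), ycbcr_to_rgb(ycc[i + 1][0], cb, cr)]

print("before:", row)
print("after: ", out)   # the red desaturates and bleeds into its gray neighbour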
post #18 of 51 | 09-18-2013, 09:16 PM | tgm1024
Quote:
Originally Posted by Chronoptimist View Post
 
Quote:
Originally Posted by tgm1024 View Post

Is this because of the different weights placed upon the red, green, and blue sensitivities in the eye with YCC? Despite its ease of understanding, RGB by itself is not a data-efficient format, because it places equal color resolution on all three channels when green should be given the lion's share and blue about 1/6th of green's share. No?
The short version is that YCC values don't translate directly to RGB values, so you need greater than 8-bit precision to avoid rounding. (rounding can create banding in the image)

 

If the YCC starts off as 8-bit integers, you convert to RGB in the floating-point domain and then cast back down to 8-bit integers. I'm not sure where you lose precision, unless the ranges themselves (not the precision) for each of R, G, and B don't map 1:1 to the YCC ranges.

 

Quote:
Our displays are nowhere near the limit of what our eyes can resolve, so subsampled chroma is still rather obvious.

 

No, the displays don't approach our eyes, but that's why I'm asking what YCC is doing. I'm not talking about chroma subsampling; I was asking whether YCC weights the bits where the sensitivities lie (roughly a 30/59/11 split). Regardless of how sensitive our eyes are overall, it would make better use of the colors if more depth (more gradations) were allowed for green than for blue. If 24 bits is all we've got, it makes no sense to give only 8 of them to the most sensitive cone in our eyes and a full 8 of them to blue.


post #19 of 51 | 09-18-2013, 09:40 PM | tgm1024

Never mind, I just looked up the equations. It is a range issue; I understand what you were getting at now, because when the range is squashed you lose precision... I think that's what you meant, anyway. The YCC color model can produce many YCC values that translate to invalid RGB values when you use the same nominal ranges of 0...1.0 for Y, Cb, and Cr and expect 0...1.0 for RGB.

 

http://software.intel.com/sites/products/documentation/hpc/ipp/ippi/ippi_ch6/ch6_color_models.html

 

I'll have to think on this unless you can correct me one way or the other.
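
For what it's worth, that claim is easy to check numerically (a coarse-sampling sketch using the first set of equations from the Intel page quoted below):

Code:
valid = total = 0
for y in range(16, 236, 5):            # Y' nominal range 16..235
    for cb in range(16, 241, 5):       # Cb/Cr nominal range 16..240
        for cr in range(16, 241, 5):
            r = 1.164*(y - 16) + 1.596*(cr - 128)
            g = 1.164*(y - 16) - 0.813*(cr - 128) - 0.392*(cb - 128)
            b = 1.164*(y - 16) + 2.017*(cb - 128)
            total += 1
            valid += all(0 <= v <= 255 for v in (r, g, b))
print(f"{valid/total:.0%} of sampled YCbCr triples decode to in-range RGB")

The remainder of the cube is code space no RGB display can reach, which is the "invalid values" the Intel page describes.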

 

Quote from the above link.
Quote:

YCbCr and YCCK Color Models

The YCbCr color space is used for component digital video and was developed as part of the ITU-R BT.601 Recommendation. YCbCr is a scaled and offset version of the YUV color space.

The Intel IPP functions use the following basic equations [Jack01] to convert between R’G’B’ in the range 0-255 and Y’Cb’Cr’ (this notation means that all components are derived from gamma-corrected R’G’B’):

Y’ = 0.257*R' + 0.504*G' + 0.098*B' + 16

Cb' = -0.148*R' - 0.291*G' + 0.439*B' + 128

Cr' = 0.439*R' - 0.368*G' - 0.071*B' + 128

R' = 1.164*(Y’-16) + 1.596*(Cr'-128)

G' = 1.164*(Y’-16) - 0.813*(Cr'-128) - 0.392*(Cb'-128)

B' = 1.164*(Y’-16) + 2.017*(Cb'-128)

The Intel IPP color conversion functions specific for the JPEG codec use different equations:

Y = 0.299*R + 0.587*G + 0.114*B

Cb = -0.16874*R - 0.33126*G + 0.5*B + 128

Cr = 0.5*R - 0.41869*G - 0.08131*B + 128

R = Y + 1.402*Cr - 179.456

G = Y - 0.34414*Cb - 0.71414*Cr + 135.45984

B = Y + 1.772*Cb - 226.816

YCCK model is specific for the JPEG image compression. It is a variant of the YCbCr model containing an additional K channel (black). The fact is that JPEG codec performs more effectively if the luminance and color information are decoupled. Therefore, a CMYK image should be converted to YCCK before JPEG compression (see description of the function ippiCMYKToYCCK_JPEG for more details).

Possible RGB colors occupy only part of the YCbCr color space (see Figure "RGB Colors Cube in the YCbCr Space") limited by the nominal ranges, therefore there are many YCbCr combinations that result in invalid RGB values.

There are several YCbCr sampling formats such as 4:4:4, 4:2:2, 4:1:1, and 4:2:0, which are supported by the Intel IPP color conversion functions and are described in Image Downsampling.

[Figure: RGB Colors Cube in the YCbCr Space]
PhotoYCC color model:
 

Since the PhotoYCC model attempts to preserve the dynamic range of film, decoding PhotoYCC images requires selection of a color space and range appropriate for the output device. Thus, the decoding equations are not always the exact inverse of the encoding equations. The following equations [Jack01] are used in Intel IPP to generate R’G’B’ values for driving a CRT display and require a unity relationship between the luma in the encoded image and the displayed image:

R' = 0.981 * Y + 1.315 * (C2 - 0.537)
G' = 0.981 * Y - 0.311 * (C1 - 0.612)- 0.669 * (C2 - 0.537)
B' = 0.981 * Y + 1.601 * (C1 - 0.612)

The equations above are given on the assumption that source Y,C1, and C2 values are normalized to the range [0..1], and the display primaries have the chromaticity values in accordance with [ITU709] specifications.

The possible RGB colors occupy only part of the YCC color space (see Figure  "RGB Colors in the YCC Color Space") limited by the nominal ranges, therefore there are many YCbCr combinations that result in invalid RGB values.

[Figure: RGB Colors in the YCC Color Space]
 

post #20 of 51 | 09-19-2013, 02:43 AM | JENC (Thread Starter)
Hi

Tuners, from my original question, haven't gotten a hit yet.

- Which tuners will (eventually) be standard in UHD television sets? Especially for cable. I'm guessing DVB-C2 HEVC (or perhaps called DVB-C2 H.265).

- And which tuners are in the first sets currently sold?

There is very good info on color schemes in the answers above, but mostly by brief reference. Does anyone have some good links that provide an in-depth introduction to the various color formats?
post #21 of 51 | 09-19-2013, 05:13 AM | Chronoptimist
Quote:
Originally Posted by JENC View Post

Tuners, from my original question, haven't gotten a hit yet.
- Which tuners will (eventually) be standard in UHD television sets? Especially for cable. I'm guessing DVB-C2 HEVC (or perhaps called DVB-C2 H.265).
I missed that. Yes, there will likely be new tuners which support H.265 - but you will have to wait until companies announce broadcast plans before we have any idea of what will be used or required.

Doesn't cable usually go through an external decoder?
post #22 of 51 | 09-19-2013, 06:39 AM | 8mile13
''Operators will be in no great hurry to ditch their legacy set-tops and will perhaps start with trials of HEVC for one or two UHDTV channels. All our boxes support H.264, so they would become obsolete if we started transmitting HEVC video.'' So everybody needs a new receiver and a new (UHD) television - and don't forget to buy a 4K Blu-ray player :)
http://www.v-net.tv/operators-welcome-hevc-potential-for-uhd-and-multi-screen/
post #23 of 51 | 09-19-2013, 07:24 AM | tgm1024
Quote:
Originally Posted by 8mile13 View Post

''Operators will be in no great hurry to ditch their legacy set-tops and will perhaps start with trials of HEVC for one or two UHDTV channels. All our boxes support H.264, so they would become obsolete if we started transmitting HEVC video.''

 

Aside from the 4K hardware issues, the H.264 vs. H.265 issues are completely solvable by firmware updates, no?


post #24 of 51 | 09-19-2013, 07:29 AM | Glimmie
Keep in mind that OTA TV is under the control of the FCC. Any change from MPEG2 requires an act of congress. 4K over satellite and cable - possibly. But it's going to be a very long time before ATSC is changed in any way that obsoletes MPEG2. And don't expect more spectrum bandwidth to be allocated to OTA either. In fact the opposite is happening. The cell and internet companies would buy up all the spectrum in a minute if the government allowed it. The pressure to grab more of the TV spectrum for wireless is growing steadily.

post #25 of 51 | 09-19-2013, 07:47 AM | tgm1024
Quote:
Originally Posted by Glimmie View Post

Keep in mind that OTA TV is under the control of the FCC. Any change from MPEG2 requires an act of congress. 4K over satellite and cable - possibly. But it's going to be a very long time before ATSC is changed in any way that obsoletes MPEG2. And don't expect more spectrum bandwidth to be allocated to OTA either. In fact the opposite is happening. The cell and internet companies would buy up all the spectrum in a minute if the government allowed it. The pressure to grab more of the TV spectrum for wireless is growing steadily.

 

Not that you were strictly implying this, but the only thing I might raise in response is (in my opinion) that the adoption of OTA HD last time around is a poor analogy for the adoption of further standards.

 

Last time around, there were two things going on at once:

 

  • Adoption of HD
  • Adoption of digital transmission

 

Now that we've finally bitten the digital-transmission bullet, ever-increasing formats (along any axis: resolution, color, frame rate) have the backward-compatibility problem far more easily solved. I think.


post #26 of 51 | 09-19-2013, 08:18 AM | 8mile13
Quote:
Originally Posted by tgm1024 View Post


Aside from the 4K hardware issues, the H.264 vs. H.265 issues are completely solvable by firmware updates, no?
According to a report titled HEVC Decoding in Consumer Devices, senior analyst Michelle Abraham estimated that the number of consumer devices shipped in 2011 and 2012 that would be capable of HEVC playback with a software upgrade totaled around 1.4 billion, with more than a billion more expected to be sold in 2013.
http://www.streamingmedia.com/Articles/Editorial/Featured-Articles/The-Future-of-HEVC-Its-Coming-but-with-Plenty-of-Questions-89010.aspx

Keep in mind that many manufacturers will not provide such a software upgrade for many of their products, especially the older ones, and I am pretty sure that a lot of official cable and satellite provider receivers are not HEVC capable.
post #27 of 51 | 09-19-2013, 09:18 AM | Glimmie
Quote:
Originally Posted by tgm1024 View Post

Not that you were strictly implying this, but the only thing I might raise in response is (in my opinion) that the adoption of OTA HD last time around is a poor analogy for the adoption of further standards.

Last time around, there were two things going on at once:
  • Adoption of HD
  • Adoption of digital transmission

Now that we've finally bitten the digital-transmission bullet, ever-increasing formats (along any axis: resolution, color, frame rate) have the backward-compatibility problem far more easily solved. I think.

Two issues:

1) The consumer receiver can only receive MPEG2 and is not upgradable. Note I am referring to the current installed base. Sure, new products could be made with MPEG4/5 decoders built in, but to what standard would they be designed? There is no MPEG4, or 5, or 4K ATSC standard. And the FCC will protect that installed MPEG2-only user base.

2) Because the OTA broadcaster must still maintain MPEG2 compatibility, where do we squeeze in the MPEG4/5? MPEG2 is inefficient and cannot be squeezed much further to make room for a 4K subchannel. And the idea of another 6 MHz channel assignment for 4K is not going to happen, period. FCC and government issues aside, show me a TV station that is going to invest in a parallel plant, complete with antenna and transmitter, to broadcast 4K. Where's the 4K audience penetration to support that?

The HD/digital transition is an interesting analogy when it comes to 4K. Before any consumer products were introduced in the USA, we had an HDTV standard (albeit with several options) on the books and a plan to implement the change at both the broadcaster and the consumer level. Where is the plan for OTA 4K? There is none.

4K OTA is a pipe dream. Not gonna happen. Satellite, cable - possibly, internet - yes, but not OTA.

post #28 of 51 | 09-19-2013, 09:29 AM | Glimmie
Quote:
Originally Posted by 8mile13 View Post

Keep in mind that many manufacturers will not provide such a software upgrade for many of their products, especially the older ones, and I am pretty sure that a lot of official cable and satellite provider receivers are not HEVC capable.

And that's simply because it's not technically possible, in addition to all the other issues. Consumer products are built to a price, and to get there they use custom-designed ASICs. That's "hard" hardware. Sure, you can download a software update to change the menu structure or possibly make some minor alterations to the decoder, but you aren't going to find an MPEG2 decoder in a consumer product that can be magically updated into an MPEG4 or 5 decoder. We currently cannot build hardware like that at the consumer price point.

post #29 of 51 | 09-19-2013, 09:46 AM | tgm1024
Quote:
Originally Posted by Glimmie View Post
 
Quote:
Originally Posted by tgm1024 View Post

Not that you were strictly implying this, but the only thing I might raise in response is (in my opinion) that the adoption of OTA HD last time around is a poor analogy for the adoption of further standards.

Last time around, there were two things going on at once:
  • Adoption of HD
  • Adoption of digital transmission

Now that we've finally bitten the digital-transmission bullet, ever-increasing formats (along any axis: resolution, color, frame rate) have the backward-compatibility problem far more easily solved. I think.

Two issues:

1) The consumer receiver can only receive MPEG2 and is not upgradable. Note I am referring to the current installed base. Sure, new products could be made with MPEG4/5 decoders built in, but to what standard would they be designed? There is no MPEG4, or 5, or 4K ATSC standard. And the FCC will protect that installed MPEG2-only user base.

 

Aside from "where to put it" point you made below, I'm still not sure why this is a tough issue.  Firmware updates are nearly everywhere.  Even TVs that aren't connected to the internet have had SD card slots allowing updates.  And asside from bandwidth, MPEGanything is a software issue.  Unless the raw processing power required suddenly went up, which isn't out of the question.

 

Quote:
2) Because the OTA broadcaster must still maintain MPEG2 compatibility, where do we squeeze in the MPEG4/5?

And this is an astoundingly good point! It makes me wonder, though: I'm not 100% sure we couldn't mathematically define a single format that carries 2K MPEG2 plus an incremental amount of data bringing it up to 4K MPEG4/5, such that the sum total is close to what 4K MPEG4/5 would be alone. That is, having the MPEG4/5 information make use of the MPEG2 data. I'm trying to think back to JPEG and how the DCT was managed, and whether data could be added to make a "better JPEG" without wasting the original. Some of the craziest formats have been incrementable - I remember how ingenious YIQ was, and how it didn't waste the black-and-white signal already being sent.
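
For what it's worth, the idea is real: the scalable extension of H.264 (SVC) is built on exactly this base-plus-enhancement principle, and a scalable extension of HEVC is in the works. A toy illustration of the concept (a 1-D signal standing in for video; not any real codec's bitstream format):

Code:
signal = [10, 12, 14, 13, 50, 52, 51, 49]             # full-resolution samples

base = [(signal[i] + signal[i + 1]) // 2              # 2:1 downsampled base layer
        for i in range(0, len(signal), 2)]

upsampled = [b for b in base for _ in range(2)]       # legacy-style reconstruction
residual = [s - u for s, u in zip(signal, upsampled)]  # the enhancement layer

restored = [u + r for u, r in zip(upsampled, residual)]
assert restored == signal               # base + enhancement = the original
print("base:", base, "residual:", residual)

The residual stays small precisely because the base layer predicts it well, which is what makes the combined stream cheaper than two independent ones.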

 

But again, we're up against the firmware-update argument... which is another point you might be right on, but I don't yet agree with.

 

Quote:
The HD/Digital transition is an interesting analogy when it comes to 4K. Before there were any consumer products introduced in the USA we had a standard on the books and a plan to implement the change on both the broadcaster and consumer level

 

This might be cart before the horse(?). The reason we had to have such a plan on the books was the monumental leap we were attempting. And by the way, that plan wasn't around for long beforehand because we were forward thinking that many years ahead - AFAICT, it only seems that way because we kept delaying the adoption over and over and over, scared silly that Mom & Pop would wake up one day with broken TVs and vote their congressman out. Does it ever suck having Congress in the way of everything. This is why we're constantly looking over the lake at Japan and saying "aw.......@#$%, I want that."

 


post #30 of 51 | 09-19-2013, 10:23 AM | Glimmie
Quote:
Aside from "where to put it" point you made below, I'm still not sure why this is a tough issue. Firmware updates are nearly everywhere. Even TVs that aren't connected to the internet have had SD card slots allowing updates. And asside from bandwidth, MPEGanything is a software issue. Unless the raw processing power required suddenly went up, which isn't out of the question.

You are grossly overestimating the abilities of a firmware update with today's technology - I know, as I am a hardware engineer. For an MPEG decoder to be upgradable from MPEG2 to MPEG 4/5/6/7/8, the hardware would have to be entirely FPGA based. That's expensive; you can't do that in a $500 HDTV. And even if you were FPGA based, would that chip (or chips) have enough gates for the new decoder algorithm? Consumer video-processing products are ASIC based. They have to be in order to meet the current price point. There is a limit to how much field-programmable capability you can put in a low-cost chip.

Even with pure software solutions there are limits. Read the link above. Most MPEG4 decoding is done in GPU hardware. At the very least you would need a new video card for a computer system to decode MPEG5.

As for a standard that uses MPEG4 to carry additional detail for an MPEG2 stream, that boat has sailed. We have had MPEG4 for years. Who is going to take the time and money to come up with such a system for the sake of OTA HDTV? Wouldn't we rather spend those resources on MPEG6?

We don't need 4K for the average 50-inch HDTV. If most people saw a 1080i feed with the bandwidth that was intended in 1999, they would think they were looking at 4K! OTA is what it is; its future is limited. Look at Fox and CBS threatening to move entirely to cable, and that's not even for technical reasons. IMPO, OTA will be dead by the time we are due to abandon MPEG2 in 2030. For the few people who must have 4K now, there are alternative delivery mechanisms that make far more sense.
