SMPTE Standardization - AVS Forum
post #1 of 14 Old 06-13-2014, 03:02 PM - Thread Starter
Scott Wilkinson - AVS Special Member
SMPTE Standardization



Media-industry consultant Paul Briscoe and SMPTE International Governor Rich Welsh discuss the standardization of various picture-quality elements in UHD/4K and how content creators are dealing with them, including the difference between UHD and 4K, wide color gamut, high dynamic range, increased bit depth, high frame rates, fractional frame rates, codecs, content delivery, the expected timeline for deployment of these elements, answers to chat-room questions, and more.


post #2 of 14 Old 06-14-2014, 06:06 AM
tigerfan33 - AVS Special Member
Is it true that most displays today are 10-bit panels, as Paul stated?
post #3 of 14 Old 06-14-2014, 06:14 AM
JWhip - AVS Special Member
Really enjoyed this one, Scott. More reasons to wait before venturing into the world of 4K. The CE manufacturers should have set standards before rushing into 4K. There was no real clamor for it anyway, was there? Or have I missed something?
post #4 of 14 Old 06-14-2014, 07:39 AM
mightyhuhn - Advanced Member
Quote:
Originally Posted by tigerfan33
Is it true that most displays today are 10-bit panels, as Paul stated?
Only if you call an 8-bit panel plus A-FRC a 10-bit panel.

But he was also talking about input and processing. I don't think more than 10-bit processing is common, and you can already send 12 bit to a panel.

Most, if not nearly all, panels these days are still 8 bit. Some are true 10 bit, and some TVs are 10 bit too, but don't ask which are native 10 bit; most sets advertising 1.07 billion colors are still 8 bit with dither, so that claim is a simple lie.
Displaying 8-bit TV/BD content on a 10-bit display doesn't really help either; displaying it at 8 bit with dither is already very good, and displaying 8-bit RGB on a 10-bit display is totally worthless.
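For anyone wondering what 8 bit + A-FRC actually does, here's a rough toy sketch in Python (my own illustration, not how any particular panel implements it): a 10-bit level is approximated on an 8-bit panel by alternating between the two nearest 8-bit codes so that the time average comes out right.

Code:
# Toy sketch of 8-bit + FRC (temporal dithering), illustrative only.
def frc_sequence(level_10bit, num_frames=8):
    # Return the 8-bit codes shown over num_frames for one 10-bit level.
    target = level_10bit / 4.0              # ideal 8-bit value (just dropping the two extra bits)
    low, high = int(target), min(int(target) + 1, 255)
    frac = target - int(target)             # fraction of frames that should show 'high'
    frames, err = [], 0.0
    for _ in range(num_frames):
        err += frac
        if err >= 1.0:                      # simple error accumulation
            frames.append(high)
            err -= 1.0
        else:
            frames.append(low)
    return frames

seq = frc_sequence(513)                     # 10-bit level 513 -> ideal 8-bit 128.25
print(seq, sum(seq) / len(seq))             # the time average comes out at 128.25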
post #5 of 14 Old 06-14-2014, 12:16 PM - Thread Starter
Scott Wilkinson - AVS Special Member
Quote:
Originally Posted by mightyhuhn
Only if you call an 8-bit panel plus A-FRC a 10-bit panel. But he was also talking about input and processing. [...]
Yes, I think he was talking mostly about processing and post-production. As far as I know, most TVs still use 8-bit panels, but I'm going to do some research on this given what Paul said.

post #6 of 14 Old 06-14-2014, 12:18 PM - Thread Starter
Scott Wilkinson - AVS Special Member
Quote:
Originally Posted by JWhip
Really enjoyed this one, Scott. More reasons to wait before venturing into the world of 4K. The CE manufacturers should have set standards before rushing into 4K. There was no real clamor for it anyway, was there? Or have I missed something?
You're right, there was no real clamor for 4K; it was something the manufacturers did because HDTVs are reaching market saturation, and they want to sell more TVs.

post #7 of 14 Old 06-14-2014, 05:44 PM
TVOD - AVS Special Member
Another great interview. Having industry people on is enlightening.

I'm concerned that the consumer electronics industry will not be as patient with the evolution of UHD as it was with HD and will cut back; on the other hand, UHD panels may become the standard as they get easier to make.

Many current displays have a wider gamut than 709, which requires color management. Display firmware upgrades could accommodate a future UHD gamut increase and possibly HDR.

As mentioned, cinema 4096 and UHD 3840 have an aspect-ratio mismatch, so vertical scaling is probably required. As for horizontal scaling, since this would be done on the creation side, professional scaling would be used. Add distribution encoding and I doubt it's an issue.

A common misconception is that 25/30 fps interlaced video has less smooth motion than higher frame rates such as 48 fps in cinema or the 50p/60p standards. The motion update rate in 2:1 interlace is the field rate (twice the frame rate): 25i and 30i are updated at 50 Hz and 60 Hz, respectively. The shutter exposure time is the field duration.

The "film look" not only involves the slower frame rate but also the typical 1/2 frame duration shutter exposure time. 24 fps usually uses 1/48th second. This faster shutter time decreases motion blur but adds a stuttered look (judder). In live type video the frame duration and shutter time are usually nearly equal so judder is minimized. With 100+ fps, the issue of motion blur should be greatly diminished, though it should be interesting if someone decides that a faster shutter time makes it look more cinematic.

Judder can add distance to the storytelling, which can aid suspension of disbelief. Higher frame rates may require a filmmaker to change their visual storytelling methods. If that becomes popular, one might say soap operas led the way, and "soap opera effect" could become positive instead of pejorative. The increased frame rates TVs can display do not change the original shutter exposure time, which means there is a mismatch between frame rate and motion blur; it may look too smooth.

JPEG2K requires a high bit rate because it's an intraframe DWT (Discrete Wavelet Transform) encoding scheme. Besides cinema, JPEG2000 is also sometimes used for TV backhauls by companies such as FOX. JPEG2K can offer better quality, while the MPEG varieties offer better efficiency. MPEG encoding might use longer GOPs for high frame rates, as well as perhaps easier motion compensation, since less moves between frames. JPEG2K may require much more bandwidth for high frame rates.
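As a crude illustration of why an intraframe codec's bandwidth scales with frame rate while a long-GOP codec's doesn't have to (Python toy; the per-frame sizes are made-up placeholders, not measurements of any real encoder):

Code:
def intraframe_mbps(fps, bits_per_frame=8e6):
    # Every frame coded independently, so bit rate scales linearly with fps.
    return fps * bits_per_frame / 1e6

def long_gop_mbps(fps, gop_length=48, i_bits=8e6, p_bits=1e6):
    # Occasional I-frames plus much smaller predicted frames.
    i_frames_per_sec = fps / gop_length
    p_frames_per_sec = fps - i_frames_per_sec
    return (i_frames_per_sec * i_bits + p_frames_per_sec * p_bits) / 1e6

for fps in (24, 60, 120):
    print(fps, "fps  intraframe:", intraframe_mbps(fps),
          "Mb/s   long-GOP:", round(long_gop_mbps(fps), 1), "Mb/s")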

Anything else? I think I should change to decaf.
post #8 of 14 Old 06-15-2014, 03:15 PM
spacediver - Advanced Member
Quote:
Originally Posted by Scott Wilkinson
Yes, I think he was talking mostly about processing and post-production. As far as I know, most TVs still use 8-bit panels, but I'm going to do some research on this given what Paul said.

Yes, I think part of the confusion is between the bit depth of the LUT processor in the display and the bit depth of the panel itself. As I understand it, if you have a 10-bit LUT but an 8-bit panel, you still only get 256 steps per channel in any given signal to the panel, but each of those 256 levels can be chosen from 1024 possible values, which can reduce quantization artifacts.

Similarly, a video card can have a 10-bit LUT but only an 8-bit frame buffer. I believe Windows can specify a LUT with 16-bit precision.

I may be off on some of my comments - I find it hard to come by good information on this stuff.
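One toy way to see the benefit of the extra LUT precision (just my own illustration, not any particular display's pipeline): store the same correction curve in an 8-bit LUT and a 10-bit LUT and count how many distinct output levels survive.

Code:
import numpy as np

x = np.arange(256) / 255.0                       # 8-bit input codes, normalized
curve = x ** 2.4                                 # an example correction curve

lut_8bit  = np.round(curve * 255).astype(int)    # curve stored at 8-bit precision
lut_10bit = np.round(curve * 1023).astype(int)   # same curve at 10-bit precision

# A steep curve stored at 8 bits collapses many inputs onto the same output
# code; the 10-bit version keeps more distinct levels for the panel (or a
# later dithering stage) to work with.
print("distinct outputs from the 8-bit LUT :", len(np.unique(lut_8bit)))
print("distinct outputs from the 10-bit LUT:", len(np.unique(lut_10bit)))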
post #9 of 14 Old 06-15-2014, 11:32 PM
mightyhuhn - Advanced Member
The frame buffer in a GPU is not limited to 8 bit; every GPU these days can output 10-bit RGB with DirectX 10/11, and they don't do that with an 8-bit frame buffer.

The point of 8 bit + processing is this:
An LCD has three subpixels: red, green, and blue, better known as RGB.
The data on a BD/DVD isn't RGB; after decompression it is YCbCr 4:2:0.
To display YCbCr on an RGB display you need to do two things.
First, convert YCbCr 4:2:0 to YCbCr 4:4:4 (upscaling the chroma resolution to the luma resolution). Doing this at a higher bit depth creates fewer artifacts; I've never read about this, but I'm pretty sure an upscaler produces fractional values, so a higher-bit-depth frame buffer preserves more precision. Of course, you don't get fractional values with a point/nearest-neighbor resizer.
Then do a YCbCr -> RGB conversion. This also creates fractional values, so the error at a higher bit depth is lower.
At the end, the picture needs to be displayed at the native bit depth of the panel, so the high-bit-depth frame buffer has to be rounded or dithered down to 8 bit.

You can use madVR on Windows with a 16-bit 3D LUT to correct the image at really high quality.
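Here's a rough single-pixel sketch of that pipeline in Python (illustrative only; it uses full-range BT.709 coefficients to keep it short, whereas real BD/TV content is limited range, which adds another scaling step). It compares rounding only once at the panel with rounding at every intermediate step:

Code:
import numpy as np

def ycbcr_to_rgb(y, cb, cr):
    # BT.709 full-range: y in 0..1, cb/cr in -0.5..0.5. Returns float RGB.
    r = y + 1.5748 * cr
    g = y - 0.1873 * cb - 0.4681 * cr
    b = y + 1.8556 * cb
    return np.clip([r, g, b], 0.0, 1.0)

def quantize(values, bits):
    # Round to the nearest code at the given bit depth.
    scale = (1 << bits) - 1
    return np.round(np.asarray(values) * scale) / scale

# One example pixel, assumed already chroma-upsampled to 4:4:4.
y, cb, cr = 0.35, 0.12, -0.07

rgb_reference = ycbcr_to_rgb(y, cb, cr)                   # full precision
rgb_late_round = quantize(rgb_reference, 8)               # round once, at the panel
rgb_early_round = quantize(ycbcr_to_rgb(*quantize([y, cb, cr], 8)), 8)  # round at every step

print("max error, rounded once at the end:", np.abs(rgb_reference - rgb_late_round).max())
print("max error, rounded at every step  :", np.abs(rgb_reference - rgb_early_round).max())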
post #10 of 14 Old 06-15-2014, 11:47 PM
spacediver - Advanced Member
Quote:
Originally Posted by mightyhuhn
The frame buffer in a GPU is not limited to 8 bit; every GPU these days can output 10-bit RGB with DirectX 10/11, and they don't do that with an 8-bit frame buffer.
I wonder why it's not supported for OpenGL:

http://nvidia.custhelp.com/app/answe...a-geforce-gpus

I'm on a CRT (which can clearly support 10 bits). Does this mean I can get true 10-bit color with DirectX on my GPU (GTX 660)? What applications can I test this with? I'm using a DVI-VGA cable.
post #11 of 14 Old 06-16-2014, 12:11 AM
mightyhuhn - Advanced Member
Over VGA, I don't think so. Only DisplayPort or HDMI.
post #12 of 14 Old 06-16-2014, 01:23 AM
spacediver - Advanced Member
Quote:
Originally Posted by mightyhuhn
Over VGA, I don't think so. Only DisplayPort or HDMI.
Interesting report here

One confirmed instance of 10-bit frame-buffer support over VGA, and one instance over DVI-VGA, but the latter seemed to stop working. It also appears that for Nvidia cards, you need to use Linux and the Linux drivers to get this to work.

According to Graeme Gill (creator of ArgyllCMS), DVI output is typically limited to 8 bits, although I wonder if this restriction is only for DVI-D.

According to Wikipedia, DVI-D only carries 24 bits per pixel (8 bits per color channel). The specs for DVI-A / DVI-I don't seem to be very explicit:


Quote:
Originally Posted by wiki
The analog section of the DVI specification document is brief and points to other specifications like VESA VSIS[8] for electrical characteristics and GTFS for timing information. The idea of the analog link is to keep compatibility with the previous VGA cables and connectors.
However, if DVI-A / DVI-I are truly compatible with VGA, then they may support 10-bit frame buffers, given that VGA appears to support them.

post #13 of 14 Old 06-16-2014, 07:23 AM
Utopianemo - Senior Member
Moore's law again! Show me the math!

Seriously though, it was interesting to hear you guys talk about how we may be reaching the point where Moore's "law" no longer accurately forecasts the rate of technological advancement. It echoes what I've heard elsewhere:
"Although this trend has continued for more than half a century, Moore's law should be considered an observation or conjecture and not a physical or natural law. Sources in 2005 expected it to continue until at least 2015 or 2020.[note 1][14] However, the 2010 update to the International Technology Roadmap for Semiconductors predicted that growth will slow at the end of 2013, when transistor counts and densities are to double only every three years." (Source: Wikipedia "Moore's Law")
post #14 of 14 Old 06-17-2014, 12:55 AM
Joe Bloggs - AVS Special Member
Quote:
Originally Posted by TVOD
Another great interview. Having industry people on is enlightening.
A common misconception is that 25/30 fps interlaced video has less smooth motion than higher frame rates such as 48 fps in cinema or the 50p/60p standards. The motion update rate in 2:1 interlace is the field rate (twice the frame rate): 25i and 30i are updated at 50 Hz and 60 Hz, respectively. The shutter exposure time is the field duration.
I agree. Interlaced at 50i or 60i is smoother and more realistic than 25p/30p. But interlaced is also a messy standard: it doesn't always de-interlace well, and only half the real lines are captured for a single point in time (a field), so re-creating a full picture when de-interlacing is complex and could involve line doubling or other methods. Basically, 1080p50 should be better than 1080i25 (though it might sometimes need a higher bit rate). Even at the highest bit rate there will always be issues with interlaced; although it's more realistic than 25p/30p, it would be much better to have a full progressive picture at a high frame rate (not less than about 50 fps) at a high enough bit rate. So it's good that UHDTV is going high-frame-rate progressive (hopefully they'll make it more than 100/120), so we can retire interlaced (other than for legacy content) without moving to a worse standard like 25p/30p.
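As a crude example of the line-doubling ("bob") approach, and of why a single field only gives you half the vertical detail to work with (toy Python sketch; real deinterlacers are far more sophisticated):

Code:
import numpy as np

def bob_deinterlace(field, top_field=True):
    # field: (h/2, w) array holding one field; returns an (h, w) frame.
    half_height, width = field.shape
    frame = np.empty((half_height * 2, width), dtype=field.dtype)
    offset = 0 if top_field else 1
    frame[offset::2] = field         # the lines that were actually captured
    frame[1 - offset::2] = field     # missing lines filled by simple line doubling
    return frame

field = np.arange(12, dtype=float).reshape(3, 4)   # toy 3-line field
print(bob_deinterlace(field))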

Quote:
Originally Posted by TVOD
With 100+ fps the issue of motion blur should be greatly diminished, though it will be interesting to see whether someone decides that a faster shutter time makes it look more cinematic.
For high-frame-rate TV (100 fps or more), I think the EBU/NHK/BBC were wondering whether to use a shorter shutter (something like half the frame duration), not to make it more film-like, just to reduce blur. Whether that also makes judder look too bad at those rates I don't know; I think there will be increased judder/strobing, so in theory an open frame rate with as high an fps as possible should be best.
