8-bit panel vs. 10-bit panel? - AVS Forum
post #1 of 5, 01-02-2008, 06:45 PM - killswitch_19 (Member, Thread Starter)

What differences, if any, will I notice between the two? Are colors that much better on a 10-bit panel?
post #2 of 5, 01-03-2008, 09:47 AM - wtr_wkr (AVS Special Member, Silicon Valley)

The new top two 10-bit panels are simply better (S-LCD and LG's IPS). Whether the processing interpolates color in a way you can actually see is still TBD.

HD-DVD is dead, so now I'm a Gary McCoy fanboy.
post #3 of 5, 01-04-2008, 12:11 AM - owensj (Newbie)

I had a Sony 8-bit and now have a Sony 10-bit (with processing). The 10-bit: WOW!
post #4 of 5, 05-15-2008, 08:55 AM - jsm22 (Newbie)

I have worked in the electronics field, specifically with TVs, for the past five years, and this question is coming up now more than ever. The simple answer: as of now, it makes little to no difference whether you have a 10-bit or an 8-bit LCD. The reason is that there is no 10-bit source out there to feed a 10-bit panel, and when you send 8-bit to a 10-bit TV, the TV has to do more processing; as everyone knows, more thinking done by the TV means a lower-quality picture.

With Blu-ray now being the last disc-based source other than gaming, there will be no such source in the future either. The next generation of movies will be internet-based, downloadable content, and with online compression ratios of 13:1 and climbing, there is no room for 10-bit. Satellite and cable companies have said they will not increase their bandwidth, which means they cannot fit a 10-bit signal.

You may also notice that the biggest player in LCD, Samsung, has gone with an 8-bit panel in all of their new TVs. I have seen an 8-bit Samsung next to a 10-bit Sony XBR in a panel comparison using an HQV disc, and there is no real difference; Samsung's processing makes up the difference in color reproduction. So don't get caught up in the numbers game; just see if you like the picture.
post #5 of 5, 07-09-2009, 04:36 AM - submux (Newbie)

I know this is an old thread, but I'll chime in anyway. I am a professional multiplexer and CODEC developer, working every day with 10- and 12-bit video sources. To begin with, jsm22 is correct on nearly all points, but there is in fact a case where 10 bits is an improvement over 8 bits, even when the source itself is only 8-bit.

To start, 8-bit means that red (R), green (G), and blue (B) can each take values from 0 to 255. In 10-bit, each channel can take values from 0 to 1023. Per component, 10-bit is therefore four times as fine-grained as 8-bit, so a raw image with 10-bit depth would have a color palette 64 times as large (4 x 4 x 4 = 64). In the case of high-definition video, with the exception of footage from EXTREMELY high-end cameras (starting with the RedOne cinema camera and upward), you will never come across media at this bit depth. The reason is that it would require a signal of roughly 3.125 gigabits per second to transmit properly. TV networks with half-million-dollar cameras broadcast sports from the arena to the network at less than a third of that speed, with quality loss.
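To make that arithmetic concrete, here is a quick Python sketch (my own addition, purely illustrative): it computes the levels per channel, the total palette size, and the raw bit rate of uncompressed 1080p RGB. The 50 fps frame rate is an assumption I picked so the result lands near the ~3.125 Gbit/s figure above.

[CODE]
# Illustrative only: bit-depth arithmetic for uncompressed 1080p RGB.
# The 50 fps frame rate is an assumption, chosen to land near the
# ~3.1 Gbit/s figure quoted above.

def raw_bitrate_bps(width, height, fps, bits_per_channel, channels=3):
    """Raw, uncompressed bit rate in bits per second."""
    return width * height * fps * bits_per_channel * channels

for depth in (8, 10):
    levels = 2 ** depth          # values per channel: 256 or 1024
    palette = levels ** 3        # total representable RGB colors
    gbps = raw_bitrate_bps(1920, 1080, 50, depth) / 1e9
    print(f"{depth}-bit: {levels} levels/channel, "
          f"{palette:,} colors, raw 1080p50 = {gbps:.2f} Gbit/s")

# 8-bit:  256 levels/channel, 16,777,216 colors, raw 1080p50 = 2.49 Gbit/s
# 10-bit: 1024 levels/channel, 1,073,741,824 colors, raw 1080p50 = 3.11 Gbit/s
[/CODE]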

The piddly 50 Mbit/s you get from high-definition formats would almost certainly not benefit from higher bit depths, as it is already stretching itself quite thin by employing roughly 150:1 compression to begin with.

The case where 10 bits on a consumer screen makes a big difference is in upscaling video from a lower resolution. Before scaling, each color channel (red, green, blue) is multiplied by 4 to make it a 10-bit value. Then the image is scaled up by computing values between "neighboring" pixels and inserting a new pixel between each pair.

If you work in 8-bit and you have a pixel with the value 1 next to a pixel with the value 2, then when you double the size of the image, the pixel inserted between them is calculated by adding the two values together and dividing by two: (1 + 2) / 2 = 1.5.

But 1.5 is not a valid pixel value, so it would become either 1 or 2, even though scaling systems are generally smart enough to use a more complex calculation that takes other pixels into account as well.

Using the same values in 10-bit, we multiply the 1 and the 2 each by 4, giving 4 and 8 to start with instead. Then (4 + 8) / 2 = 6, which is a valid value. So instead of the 8-bit result of 1, 1, 2 or 1, 2, 2, we get the higher-quality scaling of 4, 6, 8.
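Here is a toy Python version of that rounding argument (my own sketch, assuming a simple midpoint interpolation; real scalers use far more sophisticated filters):

[CODE]
# Toy illustration: midpoint interpolation at 8-bit vs. 10-bit precision.
# This only shows why the in-between value survives at 10 bits but
# collapses at 8 bits.

def midpoint_8bit(a, b):
    # Integer math at 8 bits: the true midpoint 1.5 truncates to 1.
    return (a + b) // 2

def midpoint_10bit(a, b):
    # Promote 8-bit values to 10 bits (multiply by 4), then interpolate.
    return (a * 4 + b * 4) // 2

print(midpoint_8bit(1, 2))   # 1 -> upscaled row becomes 1, 1, 2
print(midpoint_10bit(1, 2))  # 6 -> upscaled row becomes 4, 6, 8
[/CODE]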

The result is that the "sub-pixel sampling", the pixels in between the encoded pixels, is of higher precision. The visible result, in special circumstances, is that color banding in the picture is much reduced (the effect was more obvious during the earlier jump from 5 to 6 bits per channel).

The quality is improved even further when spatial and temporal scaling are taken into consideration: the pixels around each pixel and the previous frames are used to help scale the current picture, so the scaler has as much data as possible to help it guess the new value of each pixel. As a very rough sketch of that idea (my own simplification, not the algorithm any actual TV uses; the 3:1 weighting below is an arbitrary assumption), it might look like this:
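[CODE]
# Rough sketch only: estimate an interpolated pixel from its spatial
# neighbors in the current frame plus the co-located pixel in the
# previous frame, with all intermediate math done at 10-bit precision.

def estimate_pixel(neighbors, prev_value):
    # neighbors: 8-bit values surrounding the pixel in the current frame
    # prev_value: 8-bit value of the same pixel in the previous frame
    spatial = sum(n * 4 for n in neighbors) // len(neighbors)  # 10-bit average
    temporal = prev_value * 4
    # Weight the current frame 3:1 over the previous frame (arbitrary choice).
    return (3 * spatial + temporal) // 4

# Four neighbors of 1, 2, 2, 2 and a previous-frame value of 2:
print(estimate_pixel([1, 2, 2, 2], 2))  # 7, a 10-bit value between 4 and 8
[/CODE]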

I know this is a bit too detailed for a forum like this, but I felt like jumping in.

To summarize: depending on the quality of the processor doing the scaling in the TV, it is possible to greatly improve an SD, 720p, or even 1080i picture (during the deinterlacing phase) on a 1080p screen using 10-bit per-channel precision, since detail is filled in by estimating values for pixels that were not present in the source media.

That being said, going from 16.7 million colors per pixel to a little over 1 billion is not as earth-shaking as it may sound. Thanks to motion in the picture, it's not likely to make a big enough difference to matter, especially in the case of backlit screens, but that's an entirely separate discussion.