Why calibrate? - AVS Forum
Old 05-01-2012, 09:54 AM - Thread Starter
techfreak191 - Member (Join Date: Dec 2005, Posts: 102)
Hi,

I'm fairly new to the whole calibration affair.

I own a Sony KLV-40S400A. I've never felt the need to calibrate my display as I feel the TV's menu settings are more than enough to get good, life-like colors as well as accurate grays, blacks and whites.

However, I do notice color banding in games and movies (Xbox 360, HDMI) as well as satellite TV (composite video).

Will calibration fix that? Or am I stuck with banding, since I own an 8-bit panel?
Old 05-01-2012, 10:15 AM
GeorgeAB - AVS Special Member (Join Date: Feb 2002, Location: Denver, CO, Posts: 3,320)
Quote:
Originally Posted by techfreak191

I've never felt the need to calibrate my display as I feel the TV's menu settings are more than enough to get good, life-like colors as well as accurate grays, blacks and whites.

Display calibration is not based upon feelings:

'Display Calibration: Root Fundamentals'
http://www.avsforum.com/avs-vb/showthread.php?t=1021933
Old 05-01-2012, 11:00 AM
PlasmaPZ80U - AVS Special Member (Join Date: Feb 2009, Posts: 7,291)
Another good one that is closest to George's link:

http://www.tlvexp.ca/2011/12/why-we-calibrate-myths/
Old 05-01-2012, 06:22 PM
Phase700B (Join Date: Jan 2004, Posts: 2,508)
Quote:
Originally Posted by techfreak191

Will calibration fix that? Or am I stuck with banding, since I own an 8-bit panel?

You can also make use of discs such as Disney WOW or AVS HD 709 to adjust your settings by eye and see if that helps before considering the major added expense of a professional calibration or extra equipment. Then you can decide. The AVS HD 709 patterns are available as a free download here on AVS. I've used both and have no banding.
Old 05-01-2012, 09:40 PM
PlasmaPZ80U - AVS Special Member (Join Date: Feb 2009, Posts: 7,291)
It's worth mentioning that the banding may be in the source itself, or otherwise unaffected by calibration, so YMMV when using discs like Disney WOW and AVS HD 709 in hopes of reducing or eliminating it.
Old 05-02-2012, 01:10 AM - Thread Starter
techfreak191 - Member (Join Date: Dec 2005, Posts: 102)
Quote:
Originally Posted by PlasmaPZ80U

It's worth mentioning that the banding may be in the source itself, or otherwise unaffected by calibration, so YMMV when using discs like Disney WOW and AVS HD 709 in hopes of reducing or eliminating it.

Right, that's why I'm now looking into calibration.... to reduce or eliminate banding completely.

But you say otherwise...
Old 05-02-2012, 01:12 AM - Thread Starter
techfreak191 - Member (Join Date: Dec 2005, Posts: 102)
Quote:
Originally Posted by Phase700B

You can also make use of discs such as Disney WOW or AVS HD 709 to adjust your settings by eye and see if that helps before considering the major added expense of a professional calibration or extra equipment. Then you can decide. The AVS HD 709 patterns are available as a free download here on AVS. I've used both and have no banding.

Will this calibration affect or improve quality in games? I use RGB for games on my console.

That's my primary concern at the moment.
Old 05-02-2012, 01:31 AM - Thread Starter
techfreak191 - Member (Join Date: Dec 2005, Posts: 102)
Quote:
Originally Posted by PlasmaPZ80U

Another good one that is closest to George's link:

http://www.tlvexp.ca/2011/12/why-we-calibrate-myths/

Superb link!

Personally, my only goal is to eliminate color banding. Other than that, the greys, blacks and whites look great. I find the banding in games particularly annoying. That's all.

My LCD panel is 8-bit. Can one expect better and smoother color gradients on 10-bit or 12-bit panels? How many colors can 8 bits represent on an LCD screen?

I've noticed banding in all source material, from movies to satellite TV and HD games.
Old 05-02-2012, 01:33 AM - Thread Starter
techfreak191 - Member (Join Date: Dec 2005, Posts: 102)
Quote:
Originally Posted by techfreak191

Will this calibration affect or improve quality in games? I use RGB for games on my console.

That's my primary concern at the moment.

Phase700B, can you also tell me if you own an 8-bit panel or higher?
Old 05-02-2012, 01:54 AM
sotti - AVS Special Member (Join Date: Aug 2004, Location: Seattle, WA, Posts: 6,640)
Quote:
Originally Posted by techfreak191

My LCD panel is 8-bit. Can one expect better and smoother color gradients on 10-bit or 12-bit panels? How many colors can 8 bits represent on an LCD screen?

8-bit (and all your sources are 8-bit) gives you 8 bits each for red, green and blue, so it's really 24 bits total. Because all your sources are only 8 bits per channel, the value of 10-bit or higher displays lies in being able to do more processing of the input signal.

So you get a total of about 16.7 million colors from the 24-bit combinations.
But 8 bits per channel means red, green or blue can only take 256 different values (0-255).

So it's not uncommon to see banding in things like gradients of blue or gray, where even a 1-bit difference can be significant. Also, with games it can sometimes be an artifact of the precision used in the game engine/pixel shaders.

That said, I've seen that some displays will have banding issues depending on brightness/contrast settings. To check that out, you'll need a good, high-quality pattern so you know the content isn't the issue. I believe the Spears & Munsil disc may have patterns for that.
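A rough back-of-the-envelope sketch of that arithmetic in Python (the 1920-pixel width and the 20-40 sky-gradient span are just assumed for illustration):

Code:
levels_per_channel = 2 ** 8             # 256 code values (0-255) each for R, G and B
total_colors = levels_per_channel ** 3  # 3 x 8 = 24 bits -> 16,777,216 combinations
print(levels_per_channel, total_colors)

# Why gradients band: a dark sky might only span, say, codes 20-40 across a
# 1920-pixel-wide frame, so each code value covers a band roughly 90 pixels wide.
width = 1920
low, high = 20, 40
print(round(width / (high - low + 1)), "pixels per code value")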

Joel Barsotti
SpectraCal
CalMAN Lead Developer
Old 05-02-2012, 07:19 AM
Phase700B (Join Date: Jan 2004, Posts: 2,508)
Quote:
Originally Posted by techfreak191

Will this calibration affect or improve quality in games? I use RGB for games on my console.

That's my primary concern at the moment.

Certainly, getting your TV settings as close as possible to their ideal points of operation will help you get the most out of your display. So, yes, it should improve your picture quality for games, provided you also have the proper settings on your game console. Both the Xbox and PS3 have internal video settings, and often it's a balance between your TV settings and the game console.

In answer to your question, I have a 2007 Mitsubishi and a 2010 LG. The Mits is an 8-bit panel and the LG purportedly is a 10-bit panel; however, the main board electronics don't necessarily take advantage of it.

Also, some video sources, including some games, cable and satellite set-top boxes, induce banding, and some of the program material on them is low enough in quality to produce banded gradients at times. I had both major satellite providers (8 yrs and 6 yrs each) and both were terrible at times with banding in SD material. I now use only OTA broadcast, where I hardly ever see banding except during a low-quality commercial or a poor SD signal from a sub-channel (24-2). I also use my HTPC for streaming, and lower-quality video sources there show banding at times as well.

Is it possible for you to try HDMI connections to your systems and see if the banding disappears?
Old 05-02-2012, 07:47 AM - Thread Starter
techfreak191 - Member (Join Date: Dec 2005, Posts: 102)
Quote:
Originally Posted by Phase700B

Certainly, getting your TV settings as close as possible to their ideal points of operation will help you get the most out of your display. ... Is it possible for you to try HDMI connections to your systems and see if the banding disappears?

I've toyed around with just about every setting on my TV and the console. With some, banding is somewhat less visible, but it is very much there. Interestingly, I don't see banding in every movie or game. However, I do see it on just about every channel on satellite TV.

From your response, which I found very helpful, I'm led to believe there's no need to upgrade to a 10-bit or higher panel, and that 8-bit would be more than enough. 8 bits each on R, G and B equals 16.7 million colors. Interesting, I didn't know that. Are those true and dedicated 16.7 million colors? So ALL PS3 and Xbox 360 games are 8-bit at the source? Same goes for DVD movies?

For the record, I'm playing games and watching movies on my console, which is hooked up to the TV via HDMI, running at 1080p, set to standard reference levels and the RGB color space (Xbox 360 dash settings). I think banding may have been somewhat less visible when I was using component, but I'm not sure.

I watch satellite TV through the composite video inputs; I suppose the input is at 480p, but the output isn't progressive. I'm not sure, but I do know that it's very sharp, good-looking SD content!

Cable TV comes in through the standard RF jack and is the lowest quality.

I see banding on all three of these, but as far as games and movies on my console go, only in certain games and movies, not all.

I apologize for the long post and too many questions! I'm new here, and the amount of info I'm absorbing is extraordinary!

Really appreciate the feedback, cheers!
Old 05-02-2012, 07:49 AM - Thread Starter
techfreak191 - Member (Join Date: Dec 2005, Posts: 102)
Quote:
Originally Posted by Phase700B

Certainly, getting your TV settings as close as possible to their ideal points of operation will help you get the most out of your display. ...

Sorry, the response on 8 bits and RGB was meant for the other gentleman!

Didn't realize it in all the excitement!
Old 05-02-2012, 07:56 AM
Phase700B (Join Date: Jan 2004, Posts: 2,508)
It sounds, then, as though your banding is probably source-dependent and "normal", in that lower-quality video will exhibit it. If this is your first higher-quality HDTV, and a larger panel as well, any picture-quality issues may be more evident.

And, yes, the 8-bit/10-bit concern is not really an issue here. In fact, as sotti pointed out, 8-bit has all the potential it needs. Also, 10-bit is an "overkill" aspect that is never really utilized in source material, so it's not really a factor.
Old 05-02-2012, 07:56 AM - Thread Starter
techfreak191 - Member (Join Date: Dec 2005, Posts: 102)
Quote:
Originally Posted by sotti

8-bit (and all your sources are 8-bit) gives you 8 bits each for red, green and blue, so it's really 24 bits total. ... Also, with games it can sometimes be an artifact of the precision used in the game engine/pixel shaders.


From your response, which I found very helpful, I'm led to believe there's no need to upgrade to a 10-bit or higher panel, and that 8-bit would be more than enough. 8 bits each on R, G and B equals 16.7 million colors. Interesting, I didn't know that. Are those true and dedicated 16.7 million colors? So ALL PS3 and Xbox 360 games are 8-bit at the source? Same goes for DVD movies?

For the record, I'm playing games and watching movies on my console, which is hooked up to the TV via HDMI, running at 1080p, set to standard reference levels and the RGB color space (Xbox 360 dash settings). I think banding may have been somewhat less visible when I was using component, but I'm not sure.

Like you said, depending on the game engine and pixel shaders, I notice it in certain games only, and in others it's not noticeable at all. Do devs sometimes use fewer colors to keep performance and frame rate up?

You say that 10-bit and higher panels have to do more signal processing. So does that mean it cuts into their precious response times? If so, then that's not good for gaming, and no real advantage, since all gaming content is 8-bit, like you pointed out.

I usually use my games' in-game brightness and contrast controls to determine accurate blacks, greys and whites. I set my TV's display mode to Standard instead of Cinema. Even though experts tell you to set it to Cinema to yield the best blacks, greys and whites, as well as accurate color tones, I still find the Standard setting does a better job of it.

So, bottom line: no need to upgrade to a higher-bit panel just to eliminate banding. But calibration CAN reduce or eliminate banding altogether, correct?
Old 05-02-2012, 08:41 AM
Doug Blackburn - AVS Special Member (Join Date: May 2008, Location: San Francisco - East Bay area, Posts: 3,457)
You can't assume "8 bits is enough", nor can you assume that more than 8 bits runs slower.

8 bits is BARELY enough to avoid contouring. Everything has to be PERFECT to not have visible contouring in 8-bit images. There are, essentially, no 8-bit video displays. The lowest bit depth I'm aware of in video displays is 10 bits. Samsung displays have an 18-bit internal data path, but Samsung displays don't run consistently slower than displays with 10- or 12-bit data paths.

You MUST have more bits in the data path than the source to avoid making adjustments to the source that result in lost resolution. Cable and satellite are so heavily compressed that, even though they are nominally 8-bit sources, the actual resolution of what is delivered to your TV is more like 6 or 7 bits.

You can't assign any weight/value/quality/performance to the number of bits a display supports internally. There is ZERO correlation between that number and image quality or "speed" of the display. There are too many other variables.

It sounds like you want some easily rationalized "rules of thumb" about the performance of video displays, but that just doesn't happen in the real world. There are just too many variables.

You should see if there's an FAQ for the Xbox here to determine what people have found to be the best combination of settings for movies and the best combination for games. They could be the same, but they could be different. Often there are one or more users of the device in question (the Xbox in this case) who have the equipment and skills to measure performance in different modes, and they will know the most accurate settings to use for various functions. You can't make any assumptions about console settings for different sources.

We do know that the gaming industry has no standards for color, so there's really no way to calibrate for a game mode, primarily because games from different "labels" can be set to different standards. Luckily, I've never heard anybody complain that games vary by LARGE amounts, so the difference from one brand to another won't be awful. The best that can be done is to make the video display accurate and let the games fall where they may. Consoles tend to have accurate output (the PS3 is highly accurate, image-wise, when playing Blu-ray discs, for example), so the real culprits are the games and movies themselves.
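A toy illustration of the lost-resolution point in Python (the 1.1 gamma tweak and the 8-bit vs. 10-bit paths are assumed for illustration, not any particular display's processing):

Code:
# Apply a mild adjustment to every 8-bit source code, quantize the result to an
# 8-bit or a 10-bit internal data path, and count how many distinct output codes survive.
def distinct_codes(path_bits, gamma=1.1):
    out_max = 2 ** path_bits - 1
    outputs = set()
    for code in range(256):                 # every possible 8-bit source level
        adjusted = (code / 255.0) ** gamma  # the adjustment itself, done in float
        outputs.add(round(adjusted * out_max))
    return len(outputs)

print("8-bit path :", distinct_codes(8))   # fewer than 256: some source steps merge
print("10-bit path:", distinct_codes(10))  # 256: every source step stays distinct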

"Movies is magic..." Van Dyke Parks
THX -- ISF -- HAA
Old 05-02-2012, 09:04 AM
sotti - AVS Special Member (Join Date: Aug 2004, Location: Seattle, WA, Posts: 6,640)
Quote:
Originally Posted by techfreak191

So, bottom line: no need to upgrade to a higher-bit panel just to eliminate banding. But calibration CAN reduce or eliminate banding altogether, correct?

Calibration may or may not reduce banding. There are so many variables that it's hard to say yes or no. 10-bit panels should reduce banding introduced by processing, and the processing is going to happen anyway. With 8-bit you've got 256 steps, and if the calibration process needs to adjust the size of a step, you end up with fewer steps. With 10-bit panels, that's not a problem. But video only uses 220 steps anyway (16-235), and cutting a few steps doesn't make a huge difference, so 8 bits can be very good.

But back to my original post: none of that matters if the source has banding in it.

Since you are watching through your console, everything you're watching is pretty heavily compressed. OTA HDTV is supposed to be about 20 Mbit per second; compressed internet streams at "HD" quality are typically only 3-5 Mbps. How do they shrink it? Some of it is fancy algorithms, but mostly it's throwing away subtle data. Blu-rays use the same fancy algorithms as compressed internet streams but run at around 35 Mbps. That missing data is what causes banding; Blu-rays have roughly 10x the amount of data in every frame compared to streaming video.
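Rough per-frame arithmetic with those ballpark bitrates (24 fps is assumed):

Code:
def mbit_per_frame(mbps, fps=24.0):
    return mbps / fps

for name, mbps in [("OTA HDTV", 20), ("'HD' internet stream", 4), ("Blu-ray", 35)]:
    print(f"{name:20s} ~{mbps:2d} Mbps -> ~{mbit_per_frame(mbps):.2f} Mbit per frame")

# Blu-ray vs. a 3-5 Mbps stream works out to roughly 7-12x the data per frame;
# much of what gets thrown away is the subtle gradient detail that then bands.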

Joel Barsotti
SpectraCal
CalMAN Lead Developer
Old 05-02-2012, 09:17 AM
kanti123 - Member (Join Date: Sep 2011, Posts: 189)
I think you have the same problem I did. Samsung is replacing my panel with a new one.
Old 05-02-2012, 09:24 AM
Smackrabbit - Advanced Member (Join Date: Sep 2001, Location: Portland, OR, USA, Posts: 895)
I have no idea if the LGs or others use true 10-bit panels, but I'd tend to think they don't. You can buy true 10-bit computer displays, but they are $1,500-2,000 for a 24"-30" display. A 10-bit panel that is affordable is really likely an 8+2 panel, in that it is really an 8-bit panel but uses A-FRC to quickly switch between two colors to simulate one it can't create. So, if you ask for an RGB value of 402,0,0 in 10 bits, which would be 100.5,0,0 in 8 bits, the display will switch that pixel between 100,0,0 and 101,0,0 quickly to simulate 100.5,0,0. From far away you might not notice, but up close you might be able to.

The electronics use higher precision to avoid errors, but the panel itself is likely just native 8-bit no matter what the electronics are. I could be proven wrong, but this would be my likely guess.
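A toy simulation of that 8+2 FRC idea, reusing the 402 example from above (the 60 Hz refresh and the 240-frame window are just assumptions):

Code:
target_10bit = 402                 # the example value from the post
target_8bit = target_10bit / 4.0   # 100.5 -- halfway between 8-bit codes 100 and 101

# Flip the pixel between the two adjacent codes an 8-bit panel *can* show...
frames = [100 if i % 2 == 0 else 101 for i in range(240)]   # about 4 seconds at 60 Hz

# ...and the time-average lands on the in-between value a true 10-bit panel shows directly.
print(sum(frames) / len(frames))   # 100.5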

Chris Heinonen
Senior Editor, Secrets of Home Theater and High Fidelity, www.hometheaterhifi.com
Displays Editor, AnandTech.com
Contributor, HDGuru.com and Wirecutter.com
ISF Level II Certified Calibrator, ReferenceHomeTheater.com
Old 05-02-2012, 09:34 AM
Phase700B (Join Date: Jan 2004, Posts: 2,508)
Quote:
Originally Posted by Smackrabbit

A 10-bit panel that is affordable is really likely an 8+2 panel, in that it is really an 8-bit panel but uses A-FRC to quickly switch between two colors to simulate one it can't create. ...

To my knowledge, what you just described in detail is what LG has used on some TV models. It is not true 10-bit and depends on the main board/panel circuitry to perform the manipulation. Pretty academic anyway.
Old 05-02-2012, 11:19 AM
PlasmaPZ80U - AVS Special Member (Join Date: Feb 2009, Posts: 7,291)
Quote:
Originally Posted by techfreak191

For the record, I'm playing games and watching movies on my console, which is hooked up to the TV via HDMI, running at 1080p, set to standard reference levels and the RGB color space (Xbox 360 dash settings). I think banding may have been somewhat less visible when I was using component, but I'm not sure.

One thing worth testing on your X360 is setting the HDMI Color Space to YCbCr709 instead of RGB. Some TVs work better that way, and it's usually best to send YCbCr to the display instead of RGB.
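For reference, a sketch of what the two output options mean numerically: the same full-range RGB pixel expressed as limited-range YCbCr with the standard BT.709 coefficients (which device performs this conversion, and how cleanly, is what varies from TV to TV):

Code:
def rgb_to_ycbcr709(r, g, b):
    rp, gp, bp = r / 255.0, g / 255.0, b / 255.0
    y  = 0.2126 * rp + 0.7152 * gp + 0.0722 * bp   # BT.709 luma weights
    pb = (bp - y) / 1.8556
    pr = (rp - y) / 1.5748
    return round(16 + 219 * y), round(128 + 224 * pb), round(128 + 224 * pr)

print(rgb_to_ycbcr709(255, 255, 255))   # white -> (235, 128, 128), i.e. video level 235
print(rgb_to_ycbcr709(0, 0, 0))         # black -> (16, 128, 128),  i.e. video level 16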
Old 05-02-2012, 11:22 AM
PlasmaPZ80U - AVS Special Member (Join Date: Feb 2009, Posts: 7,291)
Quote:
Originally Posted by Phase700B

To my knowledge, what you just described in detail is what LG has used on some TV models. It is not true 10-bit and depends on the main board/panel circuitry to perform the manipulation. Pretty academic anyway.

I believe my 2011 LG is 8-bit only, based on Panelook. I was going to post a link, but I can't access the page in my browser for some reason.
Old 05-02-2012, 11:36 AM
Phase700B (Join Date: Jan 2004, Posts: 2,508)
The Panelook site is where I got my info and, as you said, it does not connect at the moment.

My 42LD550 with the Global-Plat2 designation is listed as 10-bit/8-bit on Panelook, working as Smackrabbit described previously.

By the way, I have noticed that much of the information on Panelook may not be totally accurate anyway, as it depends on the submission of specifications by vendors and/or individuals in the field using the panels. I have noticed missing information and small errors in some detailed specs.
Old 05-02-2012, 12:54 PM - Thread Starter
techfreak191 - Member (Join Date: Dec 2005, Posts: 102)
Quote:
Originally Posted by Phase700B

It sounds, then, as though your banding is probably source-dependent and "normal", in that lower-quality video will exhibit it. If this is your first higher-quality HDTV, and a larger panel as well, any picture-quality issues may be more evident.

And, yes, the 8-bit/10-bit concern is not really an issue here. In fact, as sotti pointed out, 8-bit has all the potential it needs. Also, 10-bit is an "overkill" aspect that is never really utilized in source material, so it's not really a factor.

Helpful feedback. Appreciate it.

And yes, this is my first HD TV. It's a 40" LCD. Great for gaming, glad I upgraded!

Are you a gamer by any chance?

Also, can you tell me if an uncalibrated LCD screen exhibits more banding than a calibrated one?
Old 05-02-2012, 12:59 PM - Thread Starter
techfreak191 - Member (Join Date: Dec 2005, Posts: 102)
Quote:
Originally Posted by PlasmaPZ80U

One thing worth testing on your X360 is setting the HDMI Color Space to YCbCr709 instead of RGB. Some TVs work better that way, and it's usually best to send YCbCr to the display instead of RGB.

I have tried YCbCr709 for games. The colors look good, and so do the blacks, greys and whites. But the banding remains unchanged.
Old 05-02-2012, 01:01 PM
sotti - AVS Special Member (Join Date: Aug 2004, Location: Seattle, WA, Posts: 6,640)
Quote:
Originally Posted by techfreak191

Also, can you tell me if an uncalibrated LCD screen exhibits more banding than a calibrated one?

I play a ton of games (although mostly on PC).

How and when a display will show banding is completely dependent on the display itself. Some calibration controls add banding/posterization; other controls will increase dynamic range, mitigating it. Some TVs are funky and will band unless contrast and brightness are in specific ranges. Without spending time with the specific make and model, it's impossible to know the answer.

One thing is certain: a good professional calibrator will figure out how to strike a balance and should minimize visual artifacts. But the really good calibrators who pay attention to that level of detail are few and far between.

Joel Barsotti
SpectraCal
CalMAN Lead Developer
Old 05-02-2012, 01:04 PM - Thread Starter
techfreak191 - Member (Join Date: Dec 2005, Posts: 102)
Quote:
Originally Posted by Smackrabbit

A 10-bit panel that is affordable is really likely an 8+2 panel, in that it is really an 8-bit panel but uses A-FRC to quickly switch between two colors to simulate one it can't create. ...

Based on what you're describing, I see little to no reason at all to upgrade to a 10-bit panel.

Since all Xbox 360 games and DVD movies are 8-bit at the source, there's no point in buying a more expensive screen. If the banding is there at the source, it's there. I'm still not sold on calibration though, whether or not it may reduce or eliminate banding. I feel no need to calibrate at the moment, as (to my eye at least) I get very accurate, life-like colors, as well as good greys, blacks and whites.
Old 05-02-2012, 01:07 PM - Thread Starter
techfreak191 - Member (Join Date: Dec 2005, Posts: 102)
Quote:
Originally Posted by sotti

How and when a display will show banding is completely dependent on the display itself. Some calibration controls add banding/posterization; other controls will increase dynamic range, mitigating it. ...

Hmm, interesting info.

I've tried googling for detailed settings info for my model, a Bravia KLV-40S400A, and turned up practically nothing.
Old 05-02-2012, 01:08 PM - Thread Starter
techfreak191 - Member (Join Date: Dec 2005, Posts: 102)
Quote:
Originally Posted by Doug Blackburn

You can't assume "8 bits is enough", nor can you assume that more than 8 bits runs slower.

8 bits is BARELY enough to avoid contouring. Everything has to be PERFECT to not have visible contouring in 8-bit images. There are, essentially, no 8-bit video displays. ...

Er... right. So you're saying 10-bit panels do not exhibit banding in games and movies?