
Consumer Level Disney World of Wonder (WOW) vs. DVE Blu Ray - Page 8

post #211 of 450
Quote:
Originally Posted by RBFilms View Post
Hi,

This could be very simple.

Built in Pattern should match a pattern from a test generator fed directly to your HDTV.

External Pattern off a disc must go through a Calibrated Chain...meaning your BD Player may need to be calibrated and your AVR may also affect the output of the pattern to the HDTV Panel.

You cannot be 100% "correct" until you check to see if your AVR is affecting levels as well.

Rich
My AVR is a Pioneer 919AH-K. It is a 'repeater' pass-thru only. There are no adjustments made to the picture by my AVR and as I said before, there are no longer any possible changes to the picture from my blu-ray player/TV interaction. I have disabled any possible features. What I see on my TV is what is coming from the disc's display pattern, unchanged.
post #212 of 450
Pass Through mode does not guarantee signal integrity. There are AVRs on the market that can ... and do ... affect the integrity of the signal even in Pass Through mode.

Also, did you calibrate your BD Player? They can ... and do ... have an impact on the signal.

I can honestly say that the Patterns on Disney WOW are referenced to a Test Pattern Generator. I would also be willing to bet money on the fact that S&M and DVE Patterns are also accurate.

We also produce patterns that are Industry Standard Reference. There is no variable in this area...it either adheres to a reference...or it does not. They all yield very similar results.

If you are getting discrepancies between the discs and your internal patterns...you need to check your entire chain with a test pattern generator, software, and a very good meter.

Outside of color & tint ... which can vary slightly between discs due to the accuracy of the blue filters ... Brightness & Contrast should be similar. Disney WOW has a very accurate blue filter by the way. Yes, still not as good as a $14,000 meter and $2,500 worth of Software and an experienced ISF Technician doing the job ... but it yields surprisingly good results for a Blue Filter.

One thing to note...is the level of accuracy on the brightness and contrast patterns...some have finer gradients than others. Disney WOW uses a 1% gradient scale.

Last but not least, if you are finding discrepancies between the discs, there may be a subjective factor in how the test patterns are being used or read by the user.

With all that factored in, you may get slight discrepancies due to the gradient scale and human factors ... but I do not think they should be off by a great deal from each other.

I do know the patterns on the discs I mentioned are reference quality and adhere to an industry standard.

Outside of what I mentioned above, I cannot be sure why the results are different.

Quote:
Originally Posted by Timothy91 View Post

My AVR is a Pioneer 919AH-K. It is a 'repeater' pass-thru only. There are no adjustments made to the picture by my AVR and as I said before, there are no longer any possible changes to the picture from my blu-ray player/TV interaction. I have disabled any possible features. What I see on my TV is what is coming from the disc's display pattern, unchanged.
post #213 of 450
Quote:
Originally Posted by RBFilms View Post

Pass Through mode does not guarantee signal integrity. There are AVRs on the market that can ... and do ... affect the integrity of the signal even in Pass Through mode.

That is a cop out. In my research for a receiver, Pioneer support stated that the 919AH-K does NOT change any data in the signal; it's simply a "repeater" with NO video processing of any kind. Data integrity isn't an issue; if it were, I would be getting errors.

Quote:


Also, did you calibrate your BD Player? They can ... and do ... have an impact on the signal.

My blu-ray player is running in a mode where picture settings are not adjusted or altered. Calibrating a blu-ray player shouldn't be necessary if it's not altering the video data that is read from the disc.

Quote:


I can honestly say that the Patterns on Disney WOW are reference to a Test Pattern Generator. I would also be willing to bet money on the fact that S&M and DVE Patterns are also accurate.

I have not questioned any of the test patterns except "contrast". The differences between different patterns cannot be judged any other way but "different" when the settings are checked. And this is not happening with just me. This can be seen by simply reading the settings results people come up with when using the WOW calibration disc and the DVE disc. They are different.

Quote:


We also produce patterns that are Industry Standard Reference. There is no variable in this area...it either adheres to a reference...or it does not. They all yield very similar results.

Not on contrast.

Quote:


If you are getting discrepancies between the discs and your internal patterns...you need to check your entire chain with a test pattern generator, software, and a very good meter.

I would believe you and would bother checking if this weren't a widespread and easily identifiable difference seen by all users who have reported their DVE and WOW settings.

Quote:


Outside of color & tint ... which can vary slightly between discs due to the accuracy of the blue filters ... Brightness & Contrast should be similar. Disney WOW has a very accurate blue filter by the way. Yes, still not as good as a $14,000 meter and $2,500 worth of Software and an experienced ISF Technician doing the job ... but it yields surprisingly good results for a Blue Filter.

I haven't seen a problem with color or tint settings. Those are virtually identical settings from different patterns. Let's try to focus on the setting we're talking about: contrast.

Quote:


One thing to note...is the level of accuracy on the brightness and contrast patterns...some have finer gradients than others. Disney WOW uses a 1% gradient scale.

The level of gradient 'might' make a difference in a user's final choice of setting 'if' the instructions weren't clear enough for the user to identify how to set the TV to the pattern or what to look for. Otherwise, the setting should end up at least in the ballpark with other "industry standard" test patterns. In regard to "contrast", users are NOT ending up in the ballpark from one pattern to the next.

Quote:


Last but not least, if you are finding discrepancies between the discs, there may be a subjective factor in how the test patterns are being used or read by the user.

Ah, yes, and this is where I think you may have the only point, but I find it difficult to chalk it up to "user error" when, in the thread for my LH90 TV, people who have set the TV's gamma, color gamut, and black level the same all get very close settings results, which has me believing it's being done correctly by most users. And remember, those people in the LH90 thread have different blu-ray players, so it's highly unlikely the blu-ray players are introducing much of any difference in the picture.

Quote:


With all that factored in, you may get slight discrepancies due to the gradient scale and human factors ... but I do not think they should be off by a great deal from each other.

They are. You should actually LOOK at the settings people are getting when using the same TV model (I recommend visiting the threads people create for the individual TV models) and you'll see, when all factors that might affect the outcome of the picture settings are equal (like TV settings, gamma, black level, etc.), that users are getting significantly different contrast results from one test pattern to another.

Quote:


I do know the patterns on the discs I mentioned are reference quality and adhere to an industry standard.

Outside of what I mentioned above, I cannot be sure why the results are different.

Based on the information I've found on this site, from settings reported by users who I have confirmed have identical model TVs and identical TV settings, I think it's rather clear what to hypothesize: either

a) The test patterns are not giving standardized results, or
b) The TVs themselves are responding differently to the different test patterns, due to some weakness in the technology the set is using, or because the set is simply not designed to display a consistent result with different test patterns.

It's one of those two. I have not looked into the results of these different patterns on plasma TVs, but if plasma TVs also suffer from the problem, I would start to question the consistency of contrast test patterns across the different discs, or we would have to accept that contrast is a setting much like sharpness, where even same-model TVs will produce different/inconsistent errors based on their circuits or parts and how they react to the patterns.
post #214 of 450
post #215 of 450

Interesting links. Contrast appears to be a bit of a mystery to most people trying to set it. I would like to know if there is an industry standard for mastering video and if there is any calibration gear that can help correctly set contrast and backlight based on "FtL" as one of those threads suggests. It would go a long way toward standardizing things for pro calibrators who come out to your place with their equipment.

Right now I'm using the contrast scale on DVE along with my own eyeball/judgement to compare video shot outdoors in sunlight to the intensity of REAL light I see looking out of my window. By doing this kind of comparison, I'm trying to achieve a kind of "looking through the window" effect on my TV. I must admit doing it this way results in incredible 'pop', but it seems some of the brightest whites are clipping/smooshing together. Perhaps this is normal and my eyes just aren't able to see the details in the upper whites. Admittedly, those commercials and nature videos shot outdoors look so close to the real thing it's amazing sometimes, and while I'm sure I would notice a difference in the upper bright whites in an A/B compare of the same view in real life vs. my TV, I can't help but think I've come really close in overall intensity of light.

It's hard to deny that LED backlights have an incredible ability to compete with real/natural light from outside. The quality of my daytime viewing is amazing. My TV for nighttime viewing is also pretty darn awesome, so I'm by no means dissatisfied with the settings I've settled on.

On a side note, if LCD didn't have that damn weakness of backlight bleeding in the corners of the screen when a really dark-gray background (not quite black) is showing, I would say Plasma has no reason to be around; but because Plasma can achieve a very uniform look on near-black backgrounds, it is still the preferred night-time technology. LCDs are getting VERY close though. There are only a few dark shades of gray which cause my LCD to lack a uniform look. When the screen gets black enough, my LCD actually does show a nice, uniform black with no light bleed because of local dimming, and it's amazing because other LCDs still look gray and leak light in the corners. It's only a matter of time before a trick or advance in LCD design conquers this one last weakness.
post #216 of 450
Yes, of course there are standards. This is the heart of calibration, and there is lots of gear.
post #217 of 450
Quote:
Originally Posted by buzzard767 View Post

Yes, of course there are standards. This is the heart of calibration, and there is lots of gear.

In that thread you referenced, there were several different claims about what FtL the industry uses for its mastering equipment. No one really showed what gear to use or what the proper settings actually were. If you know, could you make it a bit more clear? Thanks.
post #218 of 450
Quote:
Originally Posted by Timothy91 View Post

In that thread you referenced, there were several different claims about what FtL the industry uses for its mastering equipment. No one really showed what gear to use or what the proper settings actually were. If you know, could you make it a bit more clear? Thanks.

Yes, I understand that. "Thee" reference is the first url I posted, authored by Steve Shaw (Light Illusion), and it's 80 nits (23.2 foot lamberts). A totally light controlled room is part of the calibration. Most viewing areas have some light, so if you want to be able to see the picture the luminance must be increased to, say, 30-35 fL for night and 40-50 fL for daytime viewing. Grayscale and color gamut are then adjusted at these luminances. No, these don't match the 80 nit reference, but under the circumstances, the picture viewed will now be as good as it can get, all things considered.

The gear is software and a meter to measure light and color.
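For anyone who wants to sanity-check those numbers, here is a minimal sketch in Python of the nits-to-foot-lambert conversion (1 fL = 3.4262591 cd/m^2; the exact conversion of 80 nits actually comes out closer to 23.3 fL). The target values in the comments are simply the ones from the post above, not a separate standard:

NITS_PER_FL = 3.4262591  # 1 foot-lambert = 3.4262591 cd/m^2 (nits)

def nits_to_fl(nits):
    return nits / NITS_PER_FL

def fl_to_nits(fl):
    return fl * NITS_PER_FL

print("80 nits = %.1f fL" % nits_to_fl(80))  # ~23.3 fL, the dark-room mastering reference
print("35 fL = %.0f nits" % fl_to_nits(35))  # ~120 nits, night viewing target from the post
print("50 fL = %.0f nits" % fl_to_nits(50))  # ~171 nits, bright-room target from the post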
post #219 of 450
Quote:
Originally Posted by buzzard767 View Post

Yes, I understand that. "Thee" reference is the first url I posted, authored by Steve Shaw (Light Illusion), and it's 80 nits (23.2 foot lamberts). A totally light controlled room is part of the calibration. Most viewing areas have some light, so if you want to be able to see the picture the luminance must be increased to, say, 30-35 fL for night and 40-50 fL for daytime viewing. Grayscale and color gamut are then adjusted at these luminances. No, these don't match the 80 nit reference, but under the circumstances, the picture viewed will now be as good as it can get, all things considered.

The gear is software and a meter to measure light and color.

I'm assuming both contrast and backlight settings are going to affect this number (FtL)?
post #220 of 450
Definitely.
post #221 of 450
Quote:
Originally Posted by buzzard767 View Post

Definitely.

Okay, it's all well and good that there may be a way to calibrate grayscale via an expensive device, but this still doesn't explain the reason for the inconsistent contrast settings results from these setup discs.
post #222 of 450
Quote:
Originally Posted by Timothy91 View Post

Okay, it's all well and good that there may be a way to calibrate grayscale via an expensive device, but this still doesn't explain the reason for the inconsistent contrast settings results from these setup discs.

Sorry, the only way to accurately calibrate GS is with a meter. Contrast is individual to your display and viewing environment. Set it so you get the luminance you need, there is no coloration, and your eyes don't tire while watching. That's it. The subject is controversial and there are as many answers as there are calibrators and viewers. Don't look for something magic because it isn't there.
post #223 of 450
Quote:
Originally Posted by Timothy91 View Post

Okay, it's all well and good that there may be a way to calibrate grayscale via an expensive device, but this still doesn't explain the reason for the inconsistent contrast settings results from these setup discs.

Hi Tim,

I'm jumping into the middle of this thread, so I apologize if someone has already said this. There are three basic guidelines for setting contrast. You should have no:

1) Clipping of whites, especially up to about 234 (digital video level) where the whitest whites will usually, but not always, be; no clipping of R, G, B channels if possible
2) Color shift in the grayscale, or
3) Eye fatigue (based on the particular viewing environment you're setting the contrast for)

If you put the contrast as high as you can without encountering any of those three things and in the environment in which you are going to watch, you'll be doing well!

Yes, calibrators with meters have standard ranges of luminance to which they set contrast (and other parameters) initially based on the viewing environment, and then adjust from there using the 3 guidelines above.
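To illustrate guideline 1, here is a minimal sketch in Python (assuming the numpy and Pillow libraries are installed; the output filename is made up) that generates a rough grayscale clipping-bars image in the spirit of the disc patterns discussed in this thread - it is not a reproduction of any of them. One caveat: a PC typically outputs full-range video, so these codes only line up with video levels if the player or GPU is set to limited-range output.

import numpy as np
from PIL import Image

WIDTH, HEIGHT = 1920, 1080
levels = list(range(230, 256, 2))  # bars at codes 230, 232, ..., 254

# Reference-white field (video level 235); the bars just above 235 should
# remain barely distinguishable from the background when contrast is set
# correctly. If they vanish, whites are clipping.
img = np.full((HEIGHT, WIDTH), 235, dtype=np.uint8)
bar_w = WIDTH // len(levels)
for i, level in enumerate(levels):
    img[HEIGHT // 3: 2 * HEIGHT // 3, i * bar_w:(i + 1) * bar_w] = level

Image.fromarray(img, mode="L").save("white_clipping_bars.png")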

Not all displays are created equal and there usually are compromises that have to be made to optimize things based on the particular display. This may even vary between different displays of the same model since the electronics of the individual display are usually slightly different. This is the reason settings taken from another person's display may, or may not, work well on your display.

Now, on top of the variability inherent in each display, add the differences that may exist in everyone's video chain from different source devices through different receivers (or heaven knows what else) going to the display, and you have a good idea about why everyone may wind up with slightly different settings for the same parameter.

Hope that helps a bit (and I hope I didn't repeat what someone else already said!).

Best,
Greg
post #224 of 450
Quote:
Originally Posted by gerianne View Post

1) Clipping of whites, especially up to about 234 (digital video level) where the whitest whites will usually, but not always, be; no clipping of R, G, B channels if possible

Is there a test pattern I can download online and pipe thru my blu-ray player from my computer which allows me to look at these results? I figured the DVE contrast pattern was pretty much doing this already, with its bars showing a bunch of shades under "peak white" and "whiter than white". I could see every shade difference on the DVE pattern and was able to match the peak white bar to the whiter-than-white bar. So, I have to figure this is a success; however, other test pattern results have me scratching my head wondering how accurate the different patterns really are. I believe I've shown the patterns are giving different results, but no one seems to be addressing that issue at all, just avoiding it.
post #225 of 450
Don't go getting all paranoid on us. lol

Download the AVSHD disc AND Patterns Manual here. Use the clipping patterns, both GS and Color.
post #226 of 450
Quote:
Originally Posted by buzzard767 View Post

Don't go getting all paranoid on us. lol

Download the AVSHD disc AND Patterns Manual here. Use the clipping patterns, both GS and Color.

Cool. I will test these out sometime this week. Hopefully will find time when I get home this evening.
post #227 of 450
No need to get upset here. I am not copping out. I am sharing what I know to be a fact.

The better BD players do include calibration tools for just this reason.

Color should NOT be the same for every test disc...I know for a fact the color filters vary in the amount of Green they leak. I know they are all fairly close...but they are not the same.

I can assure you that until you test your signal path with a test generator and a meter ... none of your claims can be verified...nor can mine. Neither of us can be right without using a FULLY calibrated reference system to test these theories.

One needs to eliminate all variables to get an accurate result. Without a calibrated signal path, there is no way to verify what is really happening. This is the only point I am making in my post.




Quote:
Originally Posted by Timothy91 View Post
That is a cop out. In my research for a receiver, Pioneer support stated that the 919AH-K does NOT change any data in the signal; it's simply a "repeater" with NO video processing of any kind. Data integrity isn't an issue; if it were, I would be getting errors.



My blu-ray player is running in a mode where picture settings are not adjusted or altered. Calibrating a blu-ray player shouldn't be necessary if it's not altering the video data that is read from the disc.



I have not questioned any of the test patterns except "contrast". The differences between different patterns cannot be judged any other way but "different" when the settings are checked. And this is not happening with just me. This can be seen by simply reading the settings results people come up with when using the WOW calibration disc and the DVE disc. They are different.



Not on contrast.



I would believe you and would bother checking if this weren't a widespread and easily identifiable difference seen by all users who have reported their DVE and WOW settings.



I haven't seen a problem with color or tint settings. Those are virtually identical settings from different patterns. Let's try to focus on the setting we're talking about: contrast



The level of gradient 'might' make a difference in a user's final choice of setting 'if' the instructions weren't clear enough for the user to identify how to set the TV to the pattern or what to look for. Otherwise, the setting should end up at least in the ballpark with other "industry standard" test patterns. In regard to "contrast", users are NOT ending up in the ballpark from one pattern to the next.



Ah, yes, and this is where I think you may have the only point, but I find it difficult to chalk it up to "user error" when, in the thread for my LH90 TV, people who have set the TV's gamma, color gamut, and black level the same all get very close settings results, which has me believing it's being done correctly by most users. And remember, those people in the LH90 thread have different blu-ray players, so it's highly unlikely the blu-ray players are introducing much of any difference in the picture.



They are. You should actually LOOK at the settings people are getting when using the same TV model (I recommend visiting the threads people create for the individual TV models) and you'll see, when all factors that might affect the outcome of the picture settings are equal (like TV settings, gamma, black level, etc.), that users are getting significantly different contrast results from one test pattern to another.



Based on the information I've found on this site, from settings reported by users who I have confirmed have identical model TVs and identical TV settings, I think it's rather clear what to hypothesize: either

a) The test patterns are not giving standardized results, or
b) The TVs themselves are responding differently to the different test patterns, due to some weakness in the technology the set is using, or because the set is simply not designed to display a consistent result with different test patterns.

It's one of those two. I have not looked into the results of these different patterns on plasma TVs, but if plasma TVs also suffer from the problem, I would start to question the consistency of contrast test patterns across the different discs, or we would have to accept that contrast is a setting much like sharpness, where even same-model TVs will produce different/inconsistent errors based on their circuits or parts and how they react to the patterns.
post #228 of 450
Just heard about this Disney WOW setup disc from elsewhere -- as I'm not an AVS regular and haven't shopped for a new setup disc since it came out -- and thought maybe I can help a little bit to clarify Tim's observed discrepancies/problem w/ the luminance level settings for his LED LCD model by offering a different POV (and rephrasing of things) on the matter. Also, I have some small thoughts on the issue that haven't been fleshed out so far (near as I can tell). Hope I'm not butting in too much w/ this.



Anyhoo, Tim is pointing out that the discrepancy he sees is fairly uniform/consistent across all reporting owners of that model in that specific model's owners' thread.

And he (and I) can understand why the display's own internal test may yield different results from a BD calibration disc that uses a different video chain. However, that logic doesn't really explain what seems to be a substantial difference between, say, the DVE disc and the WOW disc, which both presumably go thru the same video chain from disc to display.



I'm not that familiar w/ Tim's display model's tech, etc., but I'm wondering though whether that display model has some sort of undefeatable circuitry that might cause it to react differently based on the different test patterns provided by the different setup discs.

Is it possible that the previously mentioned "local dimming" capability, for instance (among other possible built-in tech), reacts differently in this regard and thus leads users to generally choose contrast and brightness settings that differ between setup discs but yet are uniform/consistent across display units of that model? After all, the test patterns *are* different (even though the peak white and deepest black points should presumably be identical or at least extremely close to identical), and it wouldn't be unheard of for certain implementations of display tech to have certain weaknesses that cause them to handle certain test patterns differently than one might expect in a "perfect world" scenario.

Yes, I also understand that the ambient light environment *will* impact that choice of settings, so Tim will also need to control (and account for) that variable in his observations. However, that won't explain the consistent difference between setup discs for every user/display/environment combo, assuming Tim's observation is correct and everyone has been diligent to control those factors well enough.



One other thing in this regard. It's also quite possible that the different test patterns may generally lead users to *prefer* different luminance settings as a whole. This would be another factor that may *add* to the variance, although this one would vary more between users than the above hypothesized issue of some undefeatable in-display tech being a potential cause, if such tech exists in that model.



Another thing just occurred to me. Not sure if this was fleshed out before, but are these setup discs yielding uniformly/consistently different backlight settings? OR is that not manually settable on this display (which has variable backlight, if I understand correctly)? If it's not settable, then it's probably handled and adjusted automatically, and you're at the mercy of the display's dynamic contrast/lighting/whatchamacallit tech. In that case, all bets are off on this. IF, however, it can be manually set/fixed, then it needs to be set identically across all the different setup discs (for a given user/environment) in order to yield consistent contrast/brightness settings between them.



Hope I have not muddied the waters for y'all in trying to help find the answer/solution to Tim's concern...

Peace...

_Man_
post #229 of 450
Getting end users to understand the goals and realities of preserving headroom is not easy...but it is important that folks get this concept.

On the current LG models the differences between settings of 80 and 90 are insignificant by design - those units are designed to prevent clipping within most of the range of the user's adjustments - however a disc player with a hot signal will clip at anything over 80.

Consumer disc player output levels cannot be individually calibrated in volume manufacturing to reference signal levels. Manufacturers have included user adjustments in select disc players because they know exactly what they are doing.

Manufacturing engineering expertise on these select models enables professional installers to specify disc players with features that can be deployed to match output levels to reference signal generator levels. ISF calibrators do this every day and have invested in the rather expensive tools required to set these levels.

This critical information cannot be ignored - especially when using discs alone to calibrate HDTVs with patterns that provide headroom and toeroom markers above and below digital 16 to 235. If players match reasonably well within 16 to 235, we are getting exactly what is intended and reasonably expected at disc player price points.

The ISF-dictated methodology of matching disc player output to reference generators is critical for accurate results. Also, some AVRs can ... and do ... affect, change or alter audio and video signals.

We are getting very close results using the Disney WOW, the Spears & Munsil and the test patterns on the Video Forge - IF the disc player matches a generator or has been set to match.
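To make the 16-235 headroom/toeroom talk concrete, here is a minimal sketch in Python of the standard 8-bit video-range scaling (an illustration of the level math only, not any player's firmware). Black sits at code 16 and reference white at 235, leaving toeroom at 1-15 and headroom at 236-254; anything above 235 maps past full-range white and will clip if the chain does not preserve it:

def video_to_full(code):
    # Standard 8-bit scaling: video 16 -> full 0, video 235 -> full 255
    return (code - 16) * 255.0 / 219.0

for code in (16, 235, 240, 254):
    full = video_to_full(code)
    note = "clips in full range" if full > 255 else "fits"
    print("video %3d -> full %6.1f (%s)" % (code, full, note))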
post #230 of 450
Quote:
Originally Posted by RBFilms View Post

Getting end users to understand the goals and realities of preserving headroom is not easy...but it is important that folks get this concept.

Now we're getting to the meat of the matter.

As an LG owner, I can also confirm your comment about the difference between 80 and 90 being insignificant - from a clipping perspective.

I think that I may be able to explain why users of the WOW disc tend to come up with a generally lower setting than the other discs. The attached picture is from the WOW reference manual. It shows what the user should see when using the "beginner level" contrast pattern once the contrast is properly set. This picture would lead the user to make a less aggressive contrast setting that preserves some of the all-important headroom that is mentioned above. Note that the whiter-than-white bar is not clipped. This is good. There needs to be some headroom to allow for differences between input devices/signals.

I don't use any of the calibration discs that were mentioned (including WOW), but from reading the descriptions of the instructions provided by the other discs, I'd say they are suggesting an aggressive contrast setting that leaves little to no headroom.

By the way, I discovered that there are some really nice patterns on the WOW disc while looking through the manual. It also includes audio tools and reference material to "real world test" your settings. Looks very comprehensive. Here is a link to the manual if interested in checking it out: http://disneydvd.disney.go.com/managed/WOW_Manual.pdf
post #231 of 450
Quote:
Originally Posted by djams View Post

Now we're getting to the meat of the matter.

As an LG owner, I can also confirm your comment about the difference between 80 and 90 being insignificant - from a clipping perspective.

I think that I may be able to explain why users of the WOW disc tend to come up with a generally lower setting than the other discs. The attached picture is from the WOW reference manual. It shows what the user should see when using the "beginner level" contrast pattern once the contrast is properly set. This picture would lead the user to make a less aggressive contrast setting that preserves some of the all-important headroom that is mentioned above. Note that the whiter-than-white bar is not clipped. This is good. There needs to be some headroom to allow for differences between input devices/signals.

This is the kind of explanation that I've been looking for. So, contrast indeed doesn't have a focused/precise setting, and it appears even the various sources have different contrast set for the broadcast video. Brightness/black level seems to be much more accurate (in the ballpark) than the peak white (contrast) setting. I would again like to request that the makers of these patterns try to hammer out a decent ballpark setting people should use instead of the willy-nilly way it is now.
post #232 of 450
Quote:
Originally Posted by Timothy91 View Post

I would again like to request that the makers of these patterns try to hammer out a decent ballpark setting people should use instead of the willy-nilly way it is now.

It's not a pattern issue. It's the nature of the contrast control, and all the other factors involved. What's important is to understand and accept all these factors and adjust accordingly.
post #233 of 450
You can set Contrast as high as you want as long as you're not clipping 235 and below, not getting discoloration, and not inducing eyestrain. Setting Brightness is much more clearcut.
post #234 of 450
Quote:
Originally Posted by RBFilms View Post
On the current LG models the differences between settings of 80 and 90 are insignificant by design - those units are designed to prevent clipping within most of the range of the user's adjustments


To me that sounds like the set is automatically adjusting things, which is probably not so good for optimal calibration. If that is not something defeatable, then I guess owners may need to look for the lowest contrast setting that still yields the max desirable white level.

Still, that doesn't really explain what Tim's observing across AVS members *IF* his observation is correct.

Quote:
- however a disc player with a hot signal will clip at anything over 80.

Consumer disc player output levels cannot be individually calibrated in volume manufacturing to reference signal levels. Manufacturers have included user adjustments in select disc players because they know exactly what they are doing.


Are we talking about analog video or digital via HDMI?

Assuming HDMI, which players would actually output a "hot signal" in the digital domain?

I was under the impression that kind of issue was largely something in the analog video domain, not digital, although I suppose some odd player maker *might* somehow produce players that run the digital signal too "hot". I can't imagine the better, major makers (eg. Panasonic, Sony, Oppo) doing something like that (by default anyway).

Certainly, I haven't heard any such thing before -- not that I was looking for it in the past...



Hmmm... On 2nd thought, I guess I can see why LG did it that way w/ the contrast control.

They are leaving that headroom to handle *other* kinds of sources that may run hot relative to your reference calibration source, eg. wildly varying cable/sat source signals vs your calibration disc run on a BD player *or* the display's own built-in/internal test patterns.

Also, if you're not using a good calibration disc for the setup, then yeah, you're much more likely to need the headroom.

Still, ideally, I'd think they should allow you the option to defeat that feature on an input-by-input basis. For instance, you really should not need to worry about clipping from the BD player (using the same video chain) once you've properly calibrated that source w/ one of these setup discs. It's really the other sources, especially analog video sources, that you need to worry about for the most part. I guess this is especially true if you use an AVR to do the video switching between both digital and analog sources (w/ conversion to digital), instead of letting your display do the switching (particularly for the analog sources)...

_Man_
post #235 of 450
Man-Fai Wong - To me that sounds like the set is automatically adjusting things, which is probably not so good for optimal calibration.

R&B FILMS - More manufacturers are starting to do this. It actually makes sense: there is no reason to allow users to drive their HDTV Panels into clipping. On some sets, I see 98% to 100% as MAX white level, which is just before the Panel starts to clip.


Man-Fai Wong - Still, that doesn't really explain what Tim's observing across AVS members *IF* his observation is correct.

R&B FILMS - My comment about the 80-to-90 scale being an insignificant difference does explain part of it. I checked with an authority in the industry who is familiar with that specific TV. They explained the engineering behind this particular HDTV Panel and why 80 to 90 is insignificant. The other reason why one may see a difference is an improperly calibrated signal path from the BD Player to the AVR, to the HDTV Panel. Bear in mind, the patterns on Disney WOW show Super White and Super Black. It is crucial to calibrate the entire path to assure the BD Player, AVR, and HDTV Panel pass a reference calibrated signal that is identical to a common source like a test pattern generator.



Man-Fai Wong - I suppose some odd player maker *might* somehow produce players that run the digital signal too "hot". I can't imagine the better, major makers (eg. Panasonic, Sony, Oppo) doing something like that (by default anyway).

R&B FILMS - Now think about that statement for a minute. Why would you think for one minute that Panasonic or Sony, who would NOT take the time to calibrate an HDTV Panel that costs thousands, would take the time to calibrate a player that costs hundreds? The speed and efficiency required to manufacture these products at a reasonable cost does not allow for testing and calibration of these devices. I would almost guarantee you that many if not most BD Players are outputting a signal that would not match a calibrated reference. Also, Oppo is one of the manufacturers that actually gives you the controls to calibrate their BD Players. Why would they do that if it is not necessary? I can tell you why...because it is necessary.

** See the attached file which is a snapshot of the lower half of Page 55 in the Oppo BD-83 owners manual entitled "Picture Controls Adjustment" **

R&B FILMS - You absolutely do need to worry about calibrating your BD Player, and your AVR if it has a processor. You also need to check your AVR to see if it affects signals in any way. Black level shifts are not uncommon in an AVR. Headroom is desirable to a degree because there can be spikes above 235...I would say 239 to 241 is good...the chance of needing headroom beyond that is slim. Internal display patterns should match external patterns if your entire signal path is calibrated.

I can assure you, I am NOT making this stuff up.




Quote:
Originally Posted by Man-Fai Wong View Post



To me that sounds like the set is automatically adjusting things, which is probably not so good for optimal calibration.....

Still, that doesn't really explain what Tim's observing across AVS members *IF* his observation is correct.





Are we talking about analog video or digital via HDMI?

Assuming HDMI, which players would actually output a "hot signal" in the digital domain?

I was under the impression that kind of issue was largely something in the analog video domain, not digital, although I suppose some odd player maker *might* somehow produce players that run the digital signal too "hot". I can't imagine the better, major makers (eg. Panasonic, Sony, Oppo) doing something like that (by default anyway).

Certainly, I haven't heard any such thing before -- not that I was looking for it in the past...



Hmmm... On 2nd thought, I guess I can see why LG did it that way w/ the contrast control.

They are leaving that headroom to handle *other* kinds of sources that may run hot relative to your reference calibration source, eg. wildly varying cable/sat source signals vs your calibration disc run on a BD player *or* the display's own built-in/internal test patterns.

Also, if you're not using a good calibration disc for the setup, then yeah, you're much more likely to need the headroom.

Still, ideally, I'd think they should allow you the option to defeat that feature on an input-by-input basis. For instance, you really should not need to worry about clipping from the BD player (using the same video chain) once you've properly calibrated that source w/ one of these setup discs. It's really the other sources, especially analog video sources, that you need to worry about for the most part. I guess this is especially true if you use an AVR to do the video switching between both digital and analog sources (w/ conversion to digital), instead of letting your display do the switching (particularly for the analog sources)...

_Man_


post #236 of 450
Quote:
Originally Posted by Man-Fai Wong View Post

They are leaving that headroom to handle *other* kinds of sources that may run hot relative to your reference calibration source, eg. wildly varying cable/sat source signals vs your calibration disc run on a BD player *or* the display's own built-in/internal test patterns.

Bingo. What I noticed with the built-in TV test pattern's ideal setting is that when viewing HD Cable, more content with obviously different contrast levels appears "real", as opposed to the DVE pattern's setting, which appears a bit 'hot' on some content. I have developed my own little contrast "test" using HD Cable channels from news broadcasts and sports commentator wrap-up desk shots. What I focus on with those channels is the paper on the desk. Having worked in an office for many years, I know what indoor light looks like when reflecting off of paper. When contrast is too bright, the paper on the tables of these broadcasts looks unnaturally bleach-white with a high intensity. The DVE contrast pattern leaves the look of the light reflecting off the paper exaggerated and unnatural. When I kick the contrast from '97' slowly down to '87', the paper on the table slowly starts to look like it does in the real world: a more natural intensity and a more realistic difference of white shading across the paper, with more detail differences of white.

However, on well-done, high quality blu-ray video which has been meticulously mastered with millions of dollars in the production budget, the DVE contrast pattern has undeniable realism in outdoor scenes. It's simply amazing the difference in the realism of light intensity and 'pop'. Also, even with the occasional over-done bright output on some channels from HD cable, there is undeniable 'pop' from a high contrast. It's like candy for the eyes. And like candy, it's not good for you (not necessarily accurate to the original intent) but sure does taste good. :P

Setting a TV properly is a constant battle between the best compromise to compensate for non-uniform cable-TV broadcasts and strict accuracy for video that is done right and comes from a trusted source like a well-mastered blu-ray disc.
post #237 of 450
Okay,

Yesterday I ran the AVSHD patterns and they match up to the DVE pattern results identically. The 235-white pattern goes white at "97" on medium gamma, the exact same setting I got on DVE. On "low" gamma it goes white at "100" (only 'slightly' different from the DVE setting of "98"). The black level chart does the same thing: "57" brightness on 'medium' gamma shows as blacked-out on the bar intended for video black, right on the money with the DVE test pattern, the Monster video test pattern, and the TV's built-in test pattern. All brightness test patterns land at PRECISELY the same setting on my TV ("57") with medium gamma.

So, there is at least good consistency found on TWO test patterns through my Blu-ray player in regards to contrast. This is a good thing. The only thing left to do is test the WOW disc pattern to be 100% sure we have consistency on test patterns and I will be satisfied.

If I find the WOW disc also matches the results of DVE & AVSHD, then my television's test pattern may be showing me that my blu-ray player is offsetting something in contrast/peak-white output. Black level is a match and color is a match. Contrast, however, is not between the TV test pattern and the DVE/AVSHD patterns.

I'm very happy right now that I've got TWO highly trusted contrast/peak-white output test patterns precisely matching each other from a single source (blu-ray player). Now, I will try to find the reason for the difference between the TV's contrast test pattern and the DVE/AVSHD patterns fed from my Blu-ray player.

The two possible reasons now may be either:
1. The blu-ray player may be changing the contrast/peak white output somehow.
2. The TV's built-in test pattern is purposefully trying to get the user to set the contrast to a more "universally" nice looking result for TV channel content with wildly different video sources. ("87" contrast does indeed look nicer on HD cable than the "97" setting the Blu-ray player is showing.)

I'm wondering: if I go into the blu-ray player controls and bump down contrast until my TV shows the pattern correctly right at "87", will this "offset" decently correct the difference between the Blu-ray and the TV's patterns and bring things to a more normal appearance? Without knowing if the TV's test pattern is adhering to the same standards as the DVE/AVSHD patterns, I can't really make this assumption. By artificially setting the blu-ray player's contrast lower, I may be introducing distortion and limiting my peak-white output (thereby artificially reducing my TV's dynamic range).

Why on earth would electronics manufacturers allow a blu-ray player to output anything other than the proper "reference" video? A blu-ray player shouldn't EVER need adjustment since this is digital video we are talking about. The whole purpose behind digital video is to deliver consistency and quality. It appears that my 55LH90 THX set is fully capable of a picture range from "video black" to "video white" while giving nice shadow detail.

The only thing left to do now is to test one more "trusted" contrast pattern from my blu-ray player (WOW disc) and then find out if my blu-ray player is somehow messing with the peak-white/contrast. It would be rather strange if it did, because every other setting (brightness, color & tint) closely matches the DVE/AVSHD patterns. I'm a step closer to reconciling the reasons for these apparent differences in contrast settings. The AVSHD contrast patterns perfectly matching the DVE patterns is awesome! It won't take much more investigation until I have a more definitive explanation.
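For anyone wondering why a single contrast step can make the 234 and 235 bars merge, here is a toy model in Python (purely an assumption for illustration; no actual TV necessarily implements its contrast control this way). It treats contrast as a linear gain on the video codes, with the panel clipping everything above its maximum drive level:

PANEL_MAX = 255.0  # arbitrary drive ceiling for this sketch

def displayed(code, gain):
    # Contrast modeled as a simple multiplier; the panel clips at PANEL_MAX
    return min(code * gain, PANEL_MAX)

for gain in (1.00, 1.05, 1.10):
    a, b = displayed(234, gain), displayed(235, gain)
    status = "merged (clipped)" if a == b else "distinct"
    print("gain %.2f: 234 -> %.1f, 235 -> %.1f  %s" % (gain, a, b, status))

At low gain the two codes stay distinct; past some gain they both hit the ceiling and become indistinguishable, which is exactly what the disc patterns reveal.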
post #238 of 450
Quote:
Originally Posted by Timothy91 View Post

Okay,

Yesterday I ran the AVSHD patterns and they match up to the DVE pattern results identically. The 235-white pattern goes white at "97" on medium gamma, the exact same setting I got on DVE. On "low" gamma it goes white at "100" (only 'slightly' different from the DVE setting of "98"). The black level chart does the same thing: "57" brightness on 'medium' gamma shows as blacked-out on the bar intended for video black, right on the money with the DVE test pattern, the Monster video test pattern, and the TV's built-in test pattern. All brightness test patterns land at PRECISELY the same setting on my TV ("57") with medium gamma.

So, there is at least good consistency found on TWO test patterns through my Blu-ray player in regards to contrast. This is a good thing. The only thing left to do is test the WOW disc pattern to be 100% sure we have consistency on test patterns and I will be satisfied.

If I find the WOW disc also matches the results of DVE & AVSHD, then my television's test pattern may be showing me that my blu-ray player is offsetting something in contrast/peak-white output. Black level is a match and color is a match. Contrast, however, is not between the TV test pattern and the DVE/AVSHD patterns.

I'm very happy right now that I've got TWO highly trusted contrast/peak-white output test patterns precisely matching each other from a single source (blu-ray player). Now, I will try to find the reason for the difference between the TV's contrast test pattern and the DVE/AVSHD patterns fed from my Blu-ray player.

The two possible reasons now may be either:
1. The blu-ray player may be changing the contrast/peak white output somehow.
2. The TV's built-in test pattern is purposefully trying to get the user to set the contrast to a more "universally" nice looking result for TV channel content with wildly different video sources. ("87" contrast does indeed look nicer on HD cable than the "97" setting the Blu-ray player is showing.)

I'm wondering: if I go into the blu-ray player controls and bump down contrast until my TV shows the pattern correctly right at "87", will this "offset" decently correct the difference between the Blu-ray and the TV's patterns and bring things to a more normal appearance? Without knowing if the TV's test pattern is adhering to the same standards as the DVE/AVSHD patterns, I can't really make this assumption. By artificially setting the blu-ray player's contrast lower, I may be introducing distortion and limiting my peak-white output (thereby artificially reducing my TV's dynamic range).

Why on earth would electronics manufacturers allow a blu-ray player to output anything other than the proper "reference" video? A blu-ray player shouldn't EVER need adjustment since this is digital video we are talking about. The whole purpose behind digital video is to deliver consistency and quality. It appears that my 55LH90 THX set is fully capable of a picture range from "video black" to "video white" while giving nice shadow detail.

The only thing left to do now is to test one more "trusted" contrast pattern from my blu-ray player (WOW disc) and then find out if my blu-ray player is somehow messing with the peak-white/contrast. It would be rather strange if it did, because every other setting (brightness, color & tint) closely matches the DVE/AVSHD patterns. I'm a step closer to reconciling the reasons for these apparent differences in contrast settings. The AVSHD contrast patterns perfectly matching the DVE patterns is awesome! It won't take much more investigation until I have a more definitive explanation.

I can practically guarantee that if you use the advanced patterns on the WOW disc on the same bluray player, and turn up the contrast until white 236 clips, you will be at the same contrast setting. These are all reference quality patterns (WOW, DVE, AVS HD709). Knock yourself out if you wish, but I think you're wasting your time.

Of course the picture wizard on the LG TVs is using some level of "whiter-than-white" to set the contrast control - to leave some headroom. I wonder if you read the excellent discussion of "White Clipping" in the Patterns Manual for the AVS HD709 disc. Specifically, I'd point out this statement:
Quote:


· If white-level on your display can cause bars to stop flashing, we suggest keeping some bars above reference white. A good compromise for displays that show levels above white may be setting white-level so you can still see 244 flash.

Translation: we suggest leaving some headroom. I wouldn't recommend messing with the contrast setting on your bluray player to "match" the LG's internal pattern results. Instead, try setting contrast as recommended above and see how close that comes to the 87 setting you get using the LG picture wizard.

BTW - since you have the AVS HD709 disc, you should also be looking at pattern A4 - "Color Clipping" to evaluate your contrast setting, as setting contrast too high can also cause color clipping. On my LG, 97 contrast clips blue. Mine is a different model, so it may be OK on your set - but it's important to take a look.
post #239 of 450
I checked out the color clipping pattern last night and found that at "97", both red and blue have a VERY faint bar blinking at 233 (234 is not shown on the pattern) with no activity at all at 235, while green has a nicely solid bar at 243 with a faint bar at 235. (I am virtually certain that red and blue aren't registering anything at 234, based on the readings.)

So, it seems that at the very peak of the output for red and blue, there is a tiny bit of clipping when the contrast is set at "97" per the peak white test pattern.

I have temporarily compromised with a setting of "96", which should allow red and blue to have a faint blink at "234". (Even if 234 is not on the pattern, it's a logical deduction that if it were available to see, it would register a very light blinking bar.)

Is there any way to tune the TV's color settings to improve the peak output contrast of red and blue? Or is this going to be a limitation due to the television's design?

Last night on my HD Cable TV, the TBS Conan O'Brien late show, IMO, was showing an absolutely PERFECT image: the color, contrast, shading detail, even sharpness (the fine cushion sewing pattern of the couch was very visible, but not even a glint of noise on the detailed pattern!). The quality of the video off that live feed is the best I've ever seen on HD cable. My TV is obviously set very well. :P
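A quick way to reason about readings like those: note the highest bar you can still see flash per channel and compare it to reference white (235). Here is a minimal sketch in Python using the observations from this post as example values (substitute your own readings; the dictionary entries are just the numbers reported above):

REFERENCE_WHITE = 235

# Highest visibly flashing bar observed per channel at contrast "97"
last_visible = {"red": 233, "green": 243, "blue": 233}

for channel, level in last_visible.items():
    margin = level - REFERENCE_WHITE
    if margin < 0:
        print("%s: clipping at/below reference white (last seen %d) -> lower contrast" % (channel, level))
    else:
        print("%s: %d codes of headroom above white (last seen %d)" % (channel, margin, level))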
post #240 of 450
This might be a stupid question, but I feel I should ask.
I have a DVD of Disney WOW. Will this work as effectively for Blu-ray players as the Blu-ray Disney WOW?