AVS › AVS Forum › Display Devices › Display Calibration › Calibrate each source. What does this mean?

Calibrate each source. What does this mean?

post #1 of 69
Thread Starter 
If I calibrate my Blu-ray/Freesat box, which uses an HDMI cable, does this mean any other HDMI devices, like my PS3, are now calibrated?

How do I calibrate my SD channels? The aerial cable is the source for these.

Thanks
post #2 of 69
Quote:
Originally Posted by sefmiller View Post

If I calibrate my Blu-ray/Freesat box, which uses an HDMI cable, does this mean any other HDMI devices, like my PS3, are now calibrated?

Unless you are running them through the same HDMI input on the set, the answer is "No". You can use your PS3 to play a calibration disc in order to set up the input it's on.

Quote:
Originally Posted by sefmiller View Post

How do I calibrate my SD channels? The aerial cable is the source for these.

Thanks

I use a composite video cable out from a DVD/Blu-ray/HD DVD player into a composite input on the set, then copy those settings over to the coax input you get SD material through. The TV runs composite video through the same comb filter and color decoding that it uses for the antenna coax source, so the results should be close enough between the two. Understand that you will still have variations from channel to channel. It's no accident that the old analog SD standard, NTSC, was jokingly said to stand for "Never The Same Color".
post #3 of 69
Thread Starter 
Does the picture differ with each HDMI input?
post #4 of 69
Calibrating for each source is an "old tape" from the days of analog video that people can't give up. Digital video sources RARELY need a custom calibration for each source. And by RARELY, I mean almost never.

In the analog days, a voltage produced by the source component (laserdisc player, VCR, etc.) determined how bright the screen would be. Getting those voltages and brightness levels identical across many products was damn near impossible. So analog components usually needed "custom" calibrations because none of them "matched".

In digital video, 1s and 0s determine how bright each color is. As long as the video signal is intact (not damaged), you will get the correct brightness levels, guaranteed. The only exceptions are when a digital component contains video processing and that processing alters the video source on purpose or by accident. That's pretty rare, especially if you have chosen all the correct settings in each source component... the settings that disable any internal processing.

So a calibration of a "modern" all-digital video system really no longer needs separate calibrations for each source except in some rare cases.
post #5 of 69
Thread Starter 
Do I need to calibrate each HDMI input then? I have my Blu-ray player on HDMI 1 and my PS3 on HDMI 2.
post #6 of 69
It doesn't hurt to check each device, because the PS3 and some other players can be set to output video that isn't at standard video levels. If you have your TV set to expect video levels but you send it computer levels from the PS3, then it probably won't display correctly. Just check each device you can (both the PS3 and Blu-ray player) to make sure you're displaying the intended video range.
post #7 of 69
Thread Starter 
Thanks.

How do I check them?
post #8 of 69
The link below has downloads that will play on the PS3 and most recent Blu-ray players. There are also various commercial calibration discs, such as Digital Video Essentials, Spears & Munsil, or Disney's World of Wonder. Follow the instructions from your calibration disc of choice to make sure that the basic controls on your display are set as intended.

Typically, if the video player and display are not both set for video levels, the test pattern for setting Brightness (black level) will noticeably differ from what the disc's instructions describe. For example, if the PS3 was set to output computer levels while the TV expected video levels, the brightness test would be too dark. Conversely, if the PS3 was outputting video levels while the display expected computer levels, the brightness test would be too bright. Most devices other than computers default to video levels, but if you're going to calibrate one video player, it doesn't hurt to also check that any other players are working as intended. For a basic check, just get a calibration disc and follow the instructions.
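To make the mismatch concrete, here is a rough Python sketch (purely my own illustration, with made-up function names, not code from any real player or TV) of what a display expecting video levels does to a full-range, computer-levels signal. Everything at or below code 16 collapses to black, which is why the brightness pattern then looks too dark:

```python
def full_to_video(code):
    # A display expecting video levels treats code 16 as reference black
    # and code 235 as reference white. Feed it full-range (0-255) content
    # and it still applies that mapping, clipping the result to [0, 1].
    return max(0.0, min(1.0, (code - 16) / (235 - 16)))

# Full-range black is code 0, but the display renders codes 0-16 all as
# the same black, crushing shadow detail (the "too dark" brightness test).
assert full_to_video(0) == 0.0
assert full_to_video(16) == 0.0
# Whites above 235 clip too, so near-white detail is lost as well.
assert full_to_video(255) == 1.0
```

The reverse mismatch (a video-levels source feeding a display that expects full range) raises black to a gray, which is why the brightness test then looks too bright instead.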
post #9 of 69
Quote:
Originally Posted by Doug Blackburn View Post

Calibrating for each source is an "old tape" from the days of analog video that people can't give up. Digital video sources RARELY need a custom calibration for each source. And by RARELY, I mean almost never.

In the analog days, a voltage produced by the source component (laserdisc player, VCR, etc.) determined how bright the screen would be. Getting those voltages and brightness levels identical across many products was damn near impossible. So analog components usually needed "custom" calibrations because none of them "matched".

In digital video, 1s and 0s determine how bright each color is. As long as the video signal is intact (not damaged), you will get the correct brightness levels, guaranteed. The only exceptions are when a digital component contains video processing and that processing alters the video source on purpose or by accident. That's pretty rare, especially if you have chosen all the correct settings in each source component... the settings that disable any internal processing.

So a calibration of a "modern" all-digital video system really no longer needs separate calibrations for each source except in some rare cases.

I was under the impression that most Blu-ray players do NOT process data correctly. For example, Oppos have perfect data pass-through with 0.0 dE. Many other Blu-ray players introduce dEs as high as 7.0 or 8.0... hence the need to calibrate the Blu-ray player (source) separately from the input it's on. Someone on here does extensive Blu-ray player reviews... I can't find the link now. So, I'm confused...
post #10 of 69
Quote:
Originally Posted by ZandarKoad View Post


I was under the impression that most Blu-ray players do NOT process data correctly. For example, Oppos have perfect data pass-through with 0.0 dE. Many other Blu-ray players introduce dEs as high as 7.0 or 8.0... hence the need to calibrate the Blu-ray player (source) separately from the input it's on. Someone on here does extensive Blu-ray player reviews... I can't find the link now. So, I'm confused...

You should be calibrating the input to be correct and then testing with the source. The Blu-ray players you are mentioning (I do the testing you are referring to) often have errors that cannot be adjusted around. To me, the user is better off spending $130 on a Panasonic 210 if they are getting a calibration than trying to adapt the display to a totally broken source. Not to mention, if you are going through a receiver to a single input, you can't do that anyway.

If the Blu-ray player is broken you really can't calibrate around it as usually it lacks proper dynamic range on output or something else.
post #11 of 69
Thread Starter 
What are these errors the Blu-ray players suffer?

Thanks
post #12 of 69
Quote:
Originally Posted by sefmiller View Post

What are these errors the Blu-ray players suffer?

Thanks

I just found the link I was looking for on those Blu-ray player reviews! It's in Smackrabbit's signature. LOL!

http://www.hometheaterhifi.com/blu-ray-players.html

Different players suffer different errors. I just purchased the Pany 210 myself. As far as I can tell, it's the cheapest "perfect" Blu-ray player. I'm using it as a poor man's signal generator...
post #13 of 69
Quote:
Originally Posted by ZandarKoad View Post

I was under the impression that most Blu-ray players do NOT process data correctly. For example, Oppos have perfect data pass-through with 0.0 dE. Many other Blu-ray players introduce dEs as high as 7.0 or 8.0... hence the need to calibrate the Blu-ray player (source) separately from the input it's on. Someone on here does extensive Blu-ray player reviews... I can't find the link now. So, I'm confused...

I'm not familiar with that thread or the methodology used. It's fairly easy to inadvertently introduce errors in the 7-8 range. All I can say is... if there's a pixel identified as 128, 128, 128 encoded on the Blu-ray disc in YCbCr 4:2:0 format (the format used to encode Blu-ray discs) and you aren't making the player (on purpose or by accident) do extra conversions, all the player should do is send that (gray) pixel out the HDMI port in YCbCr 4:2:2 format... that's a TRIVIAL conversion, exceedingly simple, and it shouldn't introduce errors. The rules for encoding and decoding Blu-ray discs are very well documented. Oppo players use a MediaTek chipset, so any other player using a MediaTek chipset should be equally accurate. Things were more unsettled in the early days of Blu-ray; in the last 2-3 years, I would be VERY surprised to find any visible differences between disc players (not including the cheapest off-brand models... I mean, they could have anything inside, from a MediaTek chipset to something very slapdash).

Methodology is the KEY to measurements. You would have to measure every Blu-ray player at exactly the same time if you were taking measurements off of a projection screen or panel display because the projection lamp will change from day to day and even measurements from panel displays will change from day to day. If you were looking at raw data from the HDMI output and not using a display for measurements, you should be able to trust what you find... and you could measure different players months or even years apart and not have measurements contaminated by shifts in the display. You might get away with measuring an Oppo/MediaTek player, then IMMEDIATELY measure the player in question before the projector or display has a chance to "move". And you'd have to make several measurements of the Oppo/MediaTek player just to see the range of readings that may be produced -- because there will be variations. You wouldn't want to rely on a single set of measurements for each player to tell you anything (if you are measuring from a video display). I can't second guess what someone else might be doing re. measurements and methodology.
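For what it's worth, here is a toy Python sketch (my own simplification, not real player firmware) of why the 4:2:0-to-4:2:2 step is so trivial: the luma plane passes through untouched, and the chroma planes only need their rows doubled. Real players interpolate rather than duplicate, but either way the brightness values are never altered:

```python
def upsample_420_to_422(chroma_plane):
    # In 4:2:0 each chroma sample covers two luma rows; 4:2:2 needs one
    # chroma row per luma row. Nearest-neighbor repetition is the crudest
    # valid conversion: just emit every chroma row twice.
    out = []
    for row in chroma_plane:
        out.append(list(row))
        out.append(list(row))
    return out

cb = [[128, 130], [126, 128]]  # tiny hypothetical 4:2:0 Cb plane
assert upsample_420_to_422(cb) == [[128, 130], [128, 130],
                                   [126, 128], [126, 128]]
```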
post #14 of 69
Thread Starter 
Quote:
Originally Posted by Smackrabbit View Post

You should be calibrating the input to be correct and then testing with the source. The Blu-ray players you are mentioning (I do the testing you are referring to) often have errors that cannot be adjusted around. To me, the user is better off spending $130 on a Panasonic 210 if they are getting a calibration than trying to adapt the display to a totally broken source. Not to mention, if you are going through a receiver to a single input, you can't do that anyway.

If the Blu-ray player is broken you really can't calibrate around it as usually it lacks proper dynamic range on output or something else.

I have noted that a lot of Blu-ray players come with their own picture modes (Cinema, RGB, Normal, etc.). Does the Normal mode change the TV's display in any way? This would further confuse calibration.
post #15 of 69
Quote:
Originally Posted by Doug Blackburn View Post

I'm not familiar with that thread or the methodology used. It's fairly easy to inadvertently introduce errors in the 7-8 range. All I can say is... if there's a pixel identified as 128, 128, 128 encoded on the Blu-ray disc in YCbCr 4:2:0 format (the format used to encode Blu-ray discs) and you aren't making the player (on purpose or by accident) do extra conversions, all the player should do is send that (gray) pixel out the HDMI port in YCbCr 4:2:2 format... that's a TRIVIAL conversion, exceedingly simple, and it shouldn't introduce errors. The rules for encoding and decoding Blu-ray discs are very well documented. Oppo players use a MediaTek chipset, so any other player using a MediaTek chipset should be equally accurate. Things were more unsettled in the early days of Blu-ray; in the last 2-3 years, I would be VERY surprised to find any visible differences between disc players (not including the cheapest off-brand models... I mean, they could have anything inside, from a MediaTek chipset to something very slapdash).

Methodology is the KEY to measurements. You would have to measure every Blu-ray player at exactly the same time if you were taking measurements off of a projection screen or panel display because the projection lamp will change from day to day and even measurements from panel displays will change from day to day. If you were looking at raw data from the HDMI output and not using a display for measurements, you should be able to trust what you find... and you could measure different players months or even years apart and not have measurements contaminated by shifts in the display. You might get away with measuring an Oppo/MediaTek player, then IMMEDIATELY measure the player in question before the projector or display has a chance to "move". And you'd have to make several measurements of the Oppo/MediaTek player just to see the range of readings that may be produced -- because there will be variations. You wouldn't want to rely on a single set of measurements for each player to tell you anything (if you are measuring from a video display). I can't second guess what someone else might be doing re. measurements and methodology.


I think they ARE measuring the data directly on a bench rather than from a display:
http://www.hometheaterhifi.com/techn...roduction.html

And they've performed dozens of such benchmarks...

Here is part two of that HDMI benchmark article:
http://www.hometheaterhifi.com/techn...follow-up.html

I wouldn't assume my Blu-ray player is accurate just because it should be... I'd check it first, like you said, by measuring the data directly from the HDMI output, sans display. I've only looked at maybe four or five of their benchmark results, so it's hard to generalize about how many players are accurate versus inaccurate. But only two of the five I looked at were accurate (dE generally 0.2 or less, usually 0.0).
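For readers wondering what those dE numbers mean: delta-E is a color-difference metric, and its simplest form (CIE76) is just Euclidean distance in L*a*b* space. The reviews may well use a newer formula such as dE2000, but the idea is the same. A minimal Python sketch, with made-up sample values:

```python
import math

def delta_e_76(lab1, lab2):
    # CIE76 delta-E: straight-line distance in L*a*b* coordinates.
    # Roughly, a dE near 1 is at the edge of visibility; 7-8 is obvious.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

reference = (50.0, 0.0, 0.0)   # the gray the disc encodes (hypothetical)
measured  = (50.0, 4.0, -3.0)  # what an inaccurate player might output
assert delta_e_76(reference, reference) == 0.0   # perfect pass-through
assert abs(delta_e_76(reference, measured) - 5.0) < 1e-9
```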
post #16 of 69
Thread Starter 
Quote:
Originally Posted by sefmiller View Post

I have noted that a lot of Blu-ray players come with their own picture modes (Cinema, RGB, Normal, etc.). Does the Normal mode change the TV's display in any way? This would further confuse calibration.

Also, do you know if the European versions of the Blu-ray players you review would have the same benchmark performance? We get the same products as America in the UK, but with different model numbers.
post #17 of 69
I can say one thing for certain: my Zat502hd tuner uses some crazily oversaturated profile, and there's nothing I can do to change it. It's not a Blu-ray player, though...
post #18 of 69
Quote:
Originally Posted by sefmiller View Post

I have noted that a lot of Blu-ray players come with their own picture modes (Cinema, RGB, Normal, etc.). Does the Normal mode change the TV's display in any way? This would further confuse calibration.

That's the sort of thing that would scare me away from a Blu-ray player. You'd likely (not absolutely) be looking at a player that was spec'd by the marketing people instead of by people interested in making an accurate disc player. There might be a good mode among the choices, but without a meter it could be difficult to tell which mode is accurate and which modes are not.
post #19 of 69
Quote:
Originally Posted by sefmiller View Post

I have noted that a lot of Blu-ray players come with their own picture modes (Cinema, RGB, Normal, etc.). Does the Normal mode change the TV's display in any way? This would further confuse calibration.

Pretty much ignore all these other modes. They are all horrible: they all do processing that affects the image, but don't tell you what that processing is. For example, on many of them the Cinema mode reduces the luminance across the whole output, almost as if it's trying to create the effect of an ISF Night setting when the display is in ISF Day mode. Of course, doing this just crushes dynamic range and gamma and leads to a washed-out image. Vivid modes are even worse, as they just bump up the luma and chroma levels, and you will find that the peak white value is now 200 or 205 instead of 235, so you've lost a ton of dynamic range again.

Even in Normal mode, some players aren't correct. Recently the Pioneer BDP-52FD came through, and it did just fine in the YCbCr modes (4:2:2 and 4:4:4). Neither of its RGB modes was correct: one compressed 0-255 content down to the 16-235 range, and the other expanded 16-235 content out to the 0-255 range. Neither took 0-255 content and left it as 0-255, which is what it should do for a normal display, and what's required to pass the benchmark.

Players have gotten better in the past year, it seems, but there are still plenty out there with issues (clipping BTB and WTW, among others). Since you can buy a player that passes the benchmark and has decent performance (the Panasonic BDT-210) for $130, and you get Avatar 3D included with it, there is no reason to keep a player that fails. If you need a universal player or want the best performance (better DVD scaling, a Source Direct mode, dual HDMI outputs, etc.), then go with the Oppo players. You can move your poorly performing player to a bedroom system or somewhere it won't bother you as much.
post #20 of 69
Quote:
Originally Posted by Smackrabbit View Post


Players have gotten better in the past year it seems, but there are still plenty out there that have issues (clipping BTB and WTW are issues). Since you can buy a player that passes the benchmark and has decent performance (The Panasonic BDT-210) for $130, and you get Avatar 3D included with that, there is no reason to have a player that fails these. If you need a universal player or want the best performance (better DVD scaling, a Source Direct mode, dual HDMI outputs, etc...) then go with the Oppo players. You can move your poorly performing player to a bedroom system or somewhere that it won't bother you as much.

I've been following this thread for a while. So, it seems that for a low-cost, high-quality player you can't go wrong with the BDT-210 (which I have), or for a higher-end one, the Oppos (95?).
post #21 of 69
Quote:
Originally Posted by Otto Pylot View Post


I've been following this thread for awhile. So, it seems that for a low cost, high quality player, you can't go wrong with the BDT-210 (which I have) or for a higher end one, the Oppo's (95?).

Or the 93
post #22 of 69
Or the Cambridge Audio 751BD (review not posted yet) if you want a really serious audio player as well.
post #23 of 69
Thread Starter 
Quote:
Originally Posted by Smackrabbit View Post

Pretty much ignore all these other modes. They are all horrible: they all do processing that affects the image, but don't tell you what that processing is. For example, on many of them the Cinema mode reduces the luminance across the whole output, almost as if it's trying to create the effect of an ISF Night setting when the display is in ISF Day mode. Of course, doing this just crushes dynamic range and gamma and leads to a washed-out image. Vivid modes are even worse, as they just bump up the luma and chroma levels, and you will find that the peak white value is now 200 or 205 instead of 235, so you've lost a ton of dynamic range again.

Even in Normal mode, some players aren't correct. Recently the Pioneer BDP-52FD came through, and it did just fine in the YCbCr modes (4:2:2 and 4:4:4). Neither of its RGB modes was correct: one compressed 0-255 content down to the 16-235 range, and the other expanded 16-235 content out to the 0-255 range. Neither took 0-255 content and left it as 0-255, which is what it should do for a normal display, and what's required to pass the benchmark.

Players have gotten better in the past year, it seems, but there are still plenty out there with issues (clipping BTB and WTW, among others). Since you can buy a player that passes the benchmark and has decent performance (the Panasonic BDT-210) for $130, and you get Avatar 3D included with it, there is no reason to keep a player that fails. If you need a universal player or want the best performance (better DVD scaling, a Source Direct mode, dual HDMI outputs, etc.), then go with the Oppo players. You can move your poorly performing player to a bedroom system or somewhere it won't bother you as much.

Does the Normal mode usually change anything, though? My Blu-ray player is also a Freesat receiver. The picture modes of the player only take effect when you play a DVD, USB stick, or Blu-ray. This would mean that if I were to calibrate the Blu-ray player, the Freesat picture could look different. I have tried contacting Panasonic about this, but they refuse to answer.
post #24 of 69
Quote:
Originally Posted by sefmiller View Post

Does the Normal mode usually change anything, though? My Blu-ray player is also a Freesat receiver. The picture modes of the player only take effect when you play a DVD, USB stick, or Blu-ray. This would mean that if I were to calibrate the Blu-ray player, the Freesat picture could look different. I have tried contacting Panasonic about this, but they refuse to answer.

What Normal mode does depends on the player, but it is usually the most accurate of the modes. With Panasonic there were a few other options to set correctly (like turning Advanced Chroma off), but I haven't used one of the combination units, so I have no idea if they use the same chipsets, or the same processing, or are totally different.
post #25 of 69
Quote:
Originally Posted by pokekevin View Post

Or the 93

There you are. I don't know the difference between the two, so I just sort of threw that out there.
post #26 of 69
Quote:
Originally Posted by alluringreality View Post

For example, if the PS3 was set to output computer levels and the TV was expecting video levels, the brightness test would be too dark.

post #27 of 69
Thread Starter 
Quote:
Originally Posted by Rolls-Royce View Post

Unless you are running them through the same HDMI input on the set, the answer is "No". You can use your PS3 to play a calibration disc in order to set up the input it's on.



I use a composite video cable out from a DVD/Blu-Ray/HDDVD player into a composite input on the set, then copy those settings over to the coax input you get SD material through. The TV runs composite video through the same comb filter and color decoding that it uses for the antenna coax source, so the results should be close enough between the two. Understand that you will still have variations from channel to channel. It's no accident that the old analog SD standard, NTSC, was believed to mean "Never The Same Color".

What type of composite cable? Is it the red, white, and yellow one?

Thanks
post #28 of 69
Thread Starter 
Quote:
Originally Posted by Rolls-Royce View Post

Unless you are running them through the same HDMI input on the set, the answer is "No". You can use your PS3 to play a calibration disc in order to set up the input it's on.



I use a composite video cable out from a DVD/Blu-Ray/HDDVD player into a composite input on the set, then copy those settings over to the coax input you get SD material through. The TV runs composite video through the same comb filter and color decoding that it uses for the antenna coax source, so the results should be close enough between the two. Understand that you will still have variations from channel to channel. It's no accident that the old analog SD standard, NTSC, was believed to mean "Never The Same Color".

Isn't it true that the source device has an effect on the calibration? So wouldn't using a composite video cable from my DVD player be a poor baseline for adjusting my SD antenna picture?
post #29 of 69
Quote:
Originally Posted by sefmiller View Post

Isn't it true that the source device has an effect on the calibration? So wouldn't using a composite video cable from my DVD player be a poor baseline for adjusting my SD antenna picture?

Every analog input should be calibrated individually.

Digital inputs, for the most part, are consistent, but should be checked if possible.
post #30 of 69
Quote:
Originally Posted by sotti View Post

Every analog input should be calibrated individually.

Digital inputs, for the most part, are consistent, but should be checked if possible.

When I was looking for a calibrator in Holland, one of them said that he would do one input and then copy the settings to the rest. There was also one who was only willing to do one input and didn't even mention the others. I had to start a thread here to get some proper information on the subject, and got some help from HogPilot. After that I found a guy who said he might do three inputs if he had some time left; he ended up doing two. <- My experiences with professional calibrators