
HOW-TO: Calibrating Display to Match HTPC Output - Page 3

post #61 of 486
Four dots and a period says:

Quote:
When running DVE test pattern with Zoomplayer in windowed mode reference black is being displayed at 16 and reference white at 235. When going fullscreen with Zoomplayer the picture gets brighter. Reference black is now at 33 and reference white at 241. Same result with both Nvidia and DScaler decoder, removing ffdshow doesn't have any effect. Same result with VMR9 windowless/windowed and VMR7.

I've noticed this too! It's a very strange phenomenon (I have a 6800GT card). It occurs in TheaterTek as well, with either the 67.66 or the 71.84 drivers. When you view either in a window and then go to fullscreen, the image does get brightened by a fair bit - although I didn't notice at first because my eyes adjust to it immediately. I only found out when I did a Print-Screen and noticed that, as an example, the 50 IRE bar would jump from 127,127,127 to 147,147,147!

I >think< if you just chapter skip forward, and then again, the colors will correct themselves, so it is only a temporary effect. I really should test that again just to be sure. Give it a go, see if that helps!

By the way, is there a reliable way of taking a screenshot of overlay that matches exactly what is seen on the screen?
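For reference, a minimal sketch of the arithmetic, assuming the driver were doing the standard 16-235 studio to 0-255 PC expansion (the function name is just for illustration):

```python
# Sketch of the standard studio-RGB (16-235) to PC-RGB (0-255) expansion,
# to compare against the jump reported above. This is just the textbook
# mapping; what the driver actually does is an assumption.

def expand_studio_to_pc(v):
    """Map a studio-range code (black=16, white=235) to full range (0-255)."""
    out = round((v - 16) * 255 / 219)
    return max(0, min(255, out))  # out-of-range values get clipped

for v in (16, 127, 235):
    print(v, "->", expand_studio_to_pc(v))
```

Note that a plain levels expansion would take 127 to about 129, not 147, so the fullscreen jump described above looks more like a brightness or gamma change than a simple range expansion.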
post #62 of 486
Quote:
Originally posted by maxleung
I >think< if you just chapter skip forward, and then again, the colors will correct itself, so it is only a temporary effect. I really should test that again just to be sure. Give it a go, see if that helps!

Thanks for the reply!

I got the same result with Avia (which is running video, not still images), and I tried skipping back and forward on DVE, but it didn't make any difference. Then I played a couple of regular movies and reference black (the black bars) was being displayed at 16, both with Zoomplayer windowed and in full screen. This makes me rather confused: what is happening with calibration patterns in fullscreen? It seems like the image I use to calibrate my display is actually too bright, and thus my calibration is bad. But if so, the picture should be way too dark when viewing DVDs, and it isn't. I need to investigate this further; all input appreciated.

Quote:
By the way, is there a reliable way of taking a screenshot of overlay that matches exactly what is seen on the screen?

Is it possible to take a screenshot of overlay at all?
post #63 of 486
Have a look at the Snell & Wilcox motion pattern on DVE - take a screenshot while it is in motion in full-screen mode versus windowed mode, and compare. I think you will find that as long as video has motion, the colors and grays should be correct.

Another thing to try: In fullscreen mode, go to the Snell & Wilcox pattern, let it run for a few seconds, then go to the grayramp screen and take a screenshot - the grays will probably look correct now. I think if you go to any pattern with motion, and then go back to a still screen, the colors should be correct.
post #64 of 486
I am not able to display DVE title 12, chapter 2 in fullscreen at the right levels no matter what I do. However, it works well with other chapters like title 12, chapter 5 (which is the same pattern with additional color bars), so I will use this one for calibration.
post #65 of 486
Well, I thought I had a solution - which was to disable YUV mixing. But, this isn't consistent and doesn't always work.

Are you running in multimonitor mode? I find that I get the correct RGB values more often only when I run my PC in single display mode. Yuck.

(This is with the NVIDIA 67.66 drivers - I believe the behavior is the same with the 71.84 drivers from testing I did a week earlier).
post #66 of 486
Quote:


Originally posted by maxleung
Well, I thought I had a solution - which was to disable YUV mixing. But, this isn't consistent and doesn't always work.

Yes, I tried that too, but it didn't make any difference here.

Quote:


Are you running in multimonitor mode? I find that I get the correct RGB values more often only when I run my PC in single display mode.

I do only use single mode for DVD playback (and testing).

As mentioned earlier, this is with a Radeon and the latest drivers. Maybe I will give some older drivers a try.
post #67 of 486
Ah! I forgot you had a Radeon!

This is a strange artifact we are seeing, because it seems to go away after a while. Right now I'm using YUV Mixing turned on, DXVA, in single display mode and it works okay if I minimize TheaterTek, then resume TT again, and skip forward a chapter and skip back. Then, it seems to "hold" the proper brightness and contrast after that.

Probably the most reliable way to calibrate with the DVE patterns is to use the DGIndex method and use a picture viewing program instead - TT and some other players can't seem to force the proper settings all the time.
post #68 of 486
Quote:


Originally posted by .....
As mentioned earlier, this is with a Radeon and the latest drivers. Maybe I will give some older drivers a try.

I'm running some pretty old drivers with my R9600 (last summer/fall - I forget what build they are), and I am not seeing this. However, it sounds like I need to do some more testing to make sure my eyes aren't deceiving me. All in all, I would say that the code base for the renderless modes is still fairly immature, so we shouldn't be too surprised by issues like these.

Later,
Bill

PS: Of course, I'll be jettisoning the R9600 as soon as the fanless 6600s come out here in another month or three.
post #69 of 486
Thread Starter 
Over the past weekend, I went through and reworked the settings on my Samsung display. I started off trying small tweaks, but ended up messing up my settings (I forgot to write down my service menu starting values), so I just re-calibrated the HTPC/DVI input settings. It had been about 115 hours (of usage) since I had a new light engine put in anyway, so it was due for a checkup. And now it looks better than before.

So here's my anecdotal experience, for calibrating on a digital display, using Finding Nemo as a gauge, etc. I will put this in the main post as well.



Calibrating for Digital Displays - Case Study == Using Finding Nemo as a Gauge:
(NOTE: This is based on my own experience calibrating my Samsung DLP, so your mileage may vary with your own display)


While the brightness/black level adjustment advice holds true - you should set brightness so that level 16 blacks show as black, with good gradation up from there - with digital displays the contrast/white level setting may not work the same way.

The idea of course is to set the darkest black your display is capable of to be at level 16 black, making sure you can see detail in the dark values, etc. But some digital displays, like my DLP, are capable of extremely bright whites. I found that if I adjusted the contrast/white level setting in my service menu up until I found the brightest white it is capable of producing, and set that to level 235 white, the upper range of the greyscale was adversely affected, shifting to yellow or red at certain percentages.

For reference, the "normal" contrast setting in my service menu is about 103. I could take it up to about 125 to produce the brightest whites the set is capable of, and depending on the level of the red/green/blue gains, I could take this up to 130-140 and get the "brightest white" the set can produce set to 235.

When I tested out DVDs with this higher contrast setting, they all seemed to look fine, albeit brighter than I remember them of course. But one DVD was extremely messed up, with banding, macroblocking, etc. I had seen banding on the DVD before, and was aware that it could show such problems if the display (and video chain) aren't up to par. The DVD? Disney's/Pixar's Finding Nemo, of course. The PQ of this DVD has been fervently debated, some people saying it's perfect reference quality, and others saying it has banding, blocking, etc., and then the people who don't see those issues say it's the video chain/display that's bad if you see issues. See this thread, started late 2003, to see what I'm talking about.

So, with my high contrast settings, thinking I was setting the "white level" at the correct spot, Finding Nemo looked like trash. Banding was slight in certain scenes, but at the end of Chapter 14, when Marlin and Dory come out of the swarm of jellyfish, I saw 3! colors of blue, with saw blade-like blocking between the different tones. Lost and trying to tweak levels, I spent the better part of Saturday (much to my wife's dismay) "fiddling" with the TV and computer. I settled on some decent settings Saturday night, something I could be satisfied with (for a day), back at an acceptable level of banding on Finding Nemo that I had grown accustomed to seeing.

But I was obsessed with nailing down the settings to get Finding Nemo to look better, even though every other DVD I tested looked "fine". So Sunday, I spent about 2-3 hours (much less time this time) tweaking. I started off by lowering my contrast down to around the previous level, and the extreme banding in the scene described above went away.

From there, I used DVE greyscale screens to adjust the red/green/blue gain/offset balance. I had a hard time with this, and found the Phillips Pattern Generator a much more complete and convenient tool, although I still checked the DVE screens for black level and overall status. The Phillips Pattern Generator has greyscale bars and many other test patterns, but I found the CRT color/b-w adjustments to be the most useful. I was able to nail down the red/green/blue gain/offset adjustments with the greyscale bars. Then I nailed down the contrast level and color saturation (in the user menu) to create a proper 0-100 gradation for each color, with clear distinctions at the 1%-5% steps on both the top and bottom end (although the bottom end, about 0-10% or so, required temporarily moving brightness/black level to 0 instead of 16). Too much contrast, and the upper range of the colors bands together, as seen in the Finding Nemo example. The same applies to the color saturation setting, as too much color will lose the differentiation between 100% and 99%, etc.

Working through my Samsung's service menu reverts the user menu controls to the "Dynamic" mode, which has Contrast at 100/100, Brightness at 50/100, Color at 65/100, and Sharpness at 65/100(?). So after getting the brightness and contrast levels dialed in with the service menu, I exited and went back to the user menu controls to turn off the sharpness and dial down the color to fine-tune the color gradations and get proper differences between single % changes with the Phillips program.


Basically, this meant that for my digital Samsung DLP display, I lowered my brightness/black level to set "black" at 16. But I kept my contrast/white level lower, not going for the absolute brightest white it was capable of. I went for the proper greyscale gradation through the whole 0-255 range using the Phillips Pattern Generator, aside from the lowered brightness for black=16, and found those results were the best. The "white" level appeared light grey compared to the maximum white the display can achieve on a greyscale ramp, but even at the lower contrast/white level setting, a screen of all "white" is still plenty bright, and looks "white" anyway.
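The gradation check described above can be sketched numerically, assuming 1% grey steps over the 16-235 video range (the function name is just for illustration):

```python
# A numeric sketch of the gradation check described above: whether
# neighboring 1% grey steps still land on distinct 8-bit video levels.
# The 16-235 video range is an assumption taken from the thread.

def video_level(percent):
    """8-bit code for a grey at `percent` of the 16-235 video range."""
    return round(16 + percent / 100 * (235 - 16))

steps = [video_level(p) for p in range(0, 101)]
# Each 1% step moves ~2.19 codes, so adjacent steps are always distinct:
distinct = all(b > a for a, b in zip(steps, steps[1:]))
print(steps[0], steps[-1], distinct)
```

Since adjacent 1% steps are always distinct codes in the signal, any banding between them on screen is the display (or video chain) collapsing levels, not the test pattern itself.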

I was a little nervous as I brought up Finding Nemo, but was pleasantly surprised. On all scenes tested -- 0:30 opening, aquamarine-colored water under "butt" boat, Dory's blue skin/scales at the beginning of Chapter ? right after Marlin and Dory first meet, water around deepsea fish's antenna, water where Marlin and Dory come out of jellyfish swarm, etc. -- banding was gone or reduced to so slight a level I had to stare and look to see it. I couldn't have been more pleased with myself, and the results, and every other DVD I tested looked great like before, with seemingly better gradations, colors, detail, etc. I had a good calibration before, after I got my new light engine installed a few weeks ago, but redoing it (light bulb losing a bit of brightness after first 100+ hours of use, etc.) this time, I think I have my settings better than ever before. And there is less noise in the picture than I remember from before as well - on the DVDs I tested, I could only make out the noise by pausing the DVD and advancing frame by frame.

Lessons (I) learned from this experience:

A) Digital displays can be very finicky, where a CRT may seem to hide/blend colors together for less banding, at least with DVD sources, or computer+DVD sources.
B) The Finding Nemo DVD is a great test to gauge your digital display for proper color/br/ct balance. If the settings are off just a little, you'll see banding and artifacts.
C) Brightness/black level is easy enough to set, but contrast can be a bit trickier (at least on digital displays with a high amount of light output), so testing with finicky real-world material like Finding Nemo can help put things into perspective. Other DVDs with distinct green/red/orange/blue looks can also help you see if you have your green or red, etc. up/down a few notches too far, as you can tell more easily that a green looks too green (or not green enough) in a scene you know than on a greyscale, where it may not show up as well in the greys.
D) Using an HTPC as a DVD player gives you a lot of control and options that normal DVD players don't, not only with playback, but especially with calibrations. Being able to use the Phillips Pattern Generator, etc. to dial in my settings meant that I can have (near) perfect settings without needing fancy, expensive signal generators, light meters, etc. Too bad this only applies to the DVI/HTPC input.


So, that's my digital display calibration experience for you. Your mileage may vary, but hopefully some of this will help people trying to calibrate their own digital displays.
post #70 of 486
cyberbri, sounds like you calibrated your DLP the same way I did, except I used DVE and grayscale patterns handmade in Photoshop.

I tried to preserve the 235-255 range as well - I didn't want reference white to be as bright as peak white, so I stepped the contrast back a bit.

I also had to be very careful that the player brightness and contrast settings did not result in incorrect RGB values being shown. If I had brightness and contrast set wrong in the DVD player itself, banding became much more prominent.
post #71 of 486
Overlay can do studio RGB as well (I can see the shadow of the THX logo).
post #72 of 486
Overlay can be all over the place though, there's no guarantee it produces anything near either Studio or PC levels.
post #73 of 486
Quote:


Originally posted by ChrisWiggles
Are you dense or what!

PC Levels will clip what you posted. Studio levels will PRESERVE all that detail. And it's also why I advocate, though not stringently, that users with digital displays calibrate for a max peak white, instead of at 235. If you use PC levels it doesn't matter what you do at the monitor because those values are clipped at the source/processor, so they can never be recovered.

What new clothes, I've been repeating myself over and over to you, across different threads for months, and across different forums. And you still don't get it.

I think what he was trying to show is that if you calibrate your display for Studio RGB values, then non-video computer images can lose visible information. I don't know what the image he was showing was from, but I have played some games where, after an explosion, the entire screen was almost all white and looked similar to the posted image.

This is not important if all you are using your computer for is to watch video, but it is important for me, as I use my HTPC for other things as well. I do like the look of VMR9 better than overlay, but I don't want to calibrate my display to studio levels due to the other things I use my HTPC for.

Arbury
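PS: A minimal sketch of the clipping being discussed - the expansion formula is the standard studio-to-PC one, and the sample codes below are made up for illustration:

```python
# Sketch of why BTB/WTW content is unrecoverable after a PC-levels
# expansion: codes below 16 or above 235 map outside 0-255 and clip.

def expand_and_clip(v):
    """Standard 16-235 -> 0-255 expansion with hard clipping."""
    out = round((v - 16) * 255 / 219)
    return max(0, min(255, out))

btb, wtw = 8, 245            # blacker-than-black, whiter-than-white codes
print(expand_and_clip(btb))  # clips to 0, same as true black (code 16)
print(expand_and_clip(wtw))  # clips to 255, same as reference white (235)
# Once clipped at the source/decoder, no display adjustment can bring
# the distinction back - which is the argument for keeping studio levels.
```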
post #74 of 486
And that is why it is extremely handy to have different user modes on your display. Most projectors and RPTVs have those, but regular CRT and LCD monitors have to be calibrated each time you switch from studio to PC levels or vice versa.
post #75 of 486
Henry, you are right! User modes are the best (quickest) solution for such a case.

Hmm, you just gave me an idea - my Benq 8700+ projector has a serial port (an RJ45 port?). The service manual documents all the available commands - and you can change just about anything, including contrast, brightness, gamma, etc. and also select user mode.

If your display has such a feature, you can program your PC to automatically change user modes for you. When launching a video player, tell the projector to use your video-level user mode. When going back to the desktop, the projector will use the PC-level mode...

Things get much simpler if you use Girder and have an IR transmitter...
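A sketch of that idea, with the caveat that every command string, mode name, and port setting below is a made-up placeholder - the real syntax would come from the Benq service manual:

```python
# Hypothetical sketch of the serial automation idea above. The command
# format here is NOT real Benq syntax - it is a placeholder to show the
# shape of the approach; the actual strings live in the service manual.

MODES = {"video": "user1", "pc": "user2"}  # assumed stored user modes

def build_mode_command(mode):
    """Build a placeholder command line selecting a stored user mode."""
    return f"*mode={MODES[mode]}#\r".encode("ascii")

# With pyserial installed, the launcher script might then do roughly:
#   import serial
#   with serial.Serial("COM1", 9600, timeout=1) as port:
#       port.write(build_mode_command("video"))  # before starting the player
#       ...
#       port.write(build_mode_command("pc"))     # back at the desktop
```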
post #76 of 486
Quote:


First, that image looks like it's been expanded to PC levels.

PC Levels will clip what you posted.

Kinda seems like you contradict yourself in back to back posts? Note my use of the letters P and C in reference to that image.
No. PC Levels won't clip it. You need PC levels (full white = 255) to resolve it on a digital display.
Quote:


Studio levels will PRESERVE all that detail.

Uhh No. Not on a digital if one has calibrated the display's full white to 235.
Quote:


And it's also why I advocate, though not stringently, that users with digital displays calibrate for a max peak white, instead of at 235.

Ok. But that exposes another downside:
Quote:


You do lower your CR when you calibrate to have all values above 235 visible. I went from 1500:1 down to 1000:1 when I did that.

You also suggest a subjective calibration for the white point on a digital.
Quote:


However, whites are a different matter. This is subjective, as we've discussed earlier on the thread.
You can either clip down to reference white, you can include it all, or do something in between.

A subjective calibration to arbitrary non-reference levels seems kinda an oxymoron to me. Calibration implies a reference and/or a standard to me. Webster's too.
Quote:


What you took a picture of proves MY point.

Quote:


Remember that you can't correctly display video levels and graphics levels simultaneously.

That's the point of yours that the image I posted here proves. The images where I tried to show the effects of clamping BTB/WTW are here http://www.avsforum.com/avs-vb/showt...3#post5178943. Now, perhaps I just haven't found the right scenes; that's why I've been asking to see your caps or get title / timestamps from you that show all this missing detail that you claim PC levels cause.
Quote:


You seem very adamant about this topic for some reason, yet you think this data does not matter, and you can't see the banding artifacts.

I've been very careful to confine my debates to the claim that PCs have been missing all the significant BTB / WTW image information encoded on all DVDs prior to StudioRGB levels. I haven't argued the supposed banding artifacts yet. Perhaps you confused me with someone else?
Quote:


So my question is, why do you enter all these threads?

Why does a dog lick. Seriously, I think the claim you guys are making - that there is significant BTB / WTW image information encoded on all our DVDs - is a bold one. To quote Carl Sagan, extraordinary claims demand extraordinary evidence. It seems to me it should be pretty easy for you to just show it. I've also discussed (argued?) the value of $2K power cords when the proponents can't offer any measurable, quantifiable, or documentable evidence to back their bold claims.
Quote:


then let it go and watch how you want and stop bothering us all about it.

This isn't the first time I've seen you express a desire for a dissenting opinion or argument to just shut the hell up, to just 'trust' you and accept 'these facts', or imply that anyone who challenges or questions some of these claims doesn't know what the hell they are talking about or must be blind. You and others have talked a lot about these missing BTB / WTW 'details', 'highlights', 'shadows', etc. that have been revealed to you by adopting StudioRGB levels. If you really want to shut me up, you could always just post some A/B images that show all this significant BTB / WTW image information encoded on all our DVDs. Otherwise, I think this is a public forum that anyone can post to. I've tried to keep it technical. I've posted pictures of how I see the emperor's new clothes vs his old ones. You and others have hinted many times about doing the same but haven't.
Quote:


Are you dense or what!

Question? Rhetorical? Personal? Anywho, I am denser than both air and water.

I'll close with some quotes from other respected posters that seem to me to support what I (and others) have been arguing and run counter to yours on this issue. Just so I don't feel so alone.
Quote:


Vern Dias
In the end, I feel that a calibration that prevents BTB from showing and clips WTW is optimum.

Quote:


Guy Kuo
The above white bar can clip without endangering details of properly mastered material.

Quote:


Guy Kuo
Because of the occasional poorly mastered recording, an ability to process signals in the below black and the above white regions can be useful.

Quote:


Mr D
For what it's worth, I don't regard hanging onto the BTB and WTW as being all that important, for reasons I think I've explained.

Quote:


Glimmie, Technicolor Inc
It's far better to clip in a controlled telecine environment than in the encoding lab, which is simply a hard clip.

Quote:


Glimmie, Technicolor Inc
Therefore things are kept within the published specs of SMPTE RP125, 259, and 292, and CCIR601, 656.

Quote:


Mr. Wigggles
Brightness should be set so that black (usually 0 IRE) coming out of the source equals the darkest black your projector can produce, then contrast should be set such that white (Always 100 IRE) creates the brightest white your projector can produce.

Quote:


Mr. Wigggles
*Some source material has blacker than black or whiter than white content. In these instances the video "engineer" has f*$@^!d things up in the encoding process.

Quote:


Mr. Wigggles
DVI was a computer interface for years before being used for video. Being digital there is NO reason to have blacker than black or whiter than white signal levels. 0 is black and 255 is white, PERIOD. (Getting back to my pet peeve numero uno), video "engineers" come along and start transmitting 16 as black and 235 as white - vunderful. There is very very little reason (see *) to not use the full 8 bits when translating the component video into RGB; you simply use different coefficients in the equations. But alas, half our sources use 0-255 (yeah!) while the others use 16-235 (booh!), and consequently our displays have brightness and contrast settings wrong half the time because no one seems to know what is black and white with DIGITAL SIGNALS.

Quote:


Mr. Wigggles
Personally I wish they would encode HD-DVD etc at full 8 bit RGB (i.e. 0 -255) and call it a day.

Dave
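The "different coefficients" point in the Mr. Wigggles quotes can be sketched with the standard BT.601 equations (rounding and chroma filtering details simplified):

```python
# Standard BT.601 YCbCr -> R'G'B' conversion, showing how the same
# source data yields different RGB depending on the assumed range.

def ycbcr_to_rgb601(y, cb, cr, full_range=False):
    """Convert one BT.601 YCbCr pixel to 8-bit R'G'B'."""
    if full_range:
        ys, cs = y, 1.0                # Y in 0-255, chroma at full excursion
    else:
        ys = (y - 16) * 255 / 219      # Y in 16-235 (studio swing)
        cs = 255 / 224                 # Cb/Cr in 16-240

    def clip(v):
        return max(0, min(255, round(v)))

    r = ys + 1.402 * cs * (cr - 128)
    g = ys - 0.344136 * cs * (cb - 128) - 0.714136 * cs * (cr - 128)
    b = ys + 1.772 * cs * (cb - 128)
    return clip(r), clip(g), clip(b)

# Mid-grey (Y=128, Cb=Cr=128): full range keeps it at 128, studio range
# expands it to about 130 - same source data, different coefficients.
```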
post #77 of 486
Quote:


Originally posted by dlarsen
The images where I tried to show the effects of clamping BTB/WTW are here http://www.avsforum.com/avs-vb/showt...3#post5178943. Now, perhaps I just haven't found the right scenes, that's why I've been asking to see your caps or get title / timestamps from you that shows all this missing details that you claim PC levels cause.

It seems to me it should be pretty easy for you to just show it. I've also discussed (argued?) the value of $2K power cords when the proponents can't offer any measurable, quantifiable, or documentable evidence to back their bold claims.

You've been shown, over and over again. There's no voodoo involved; there's real picture information that expanding to PC levels clips. The images and graphs posted by sspears and dumunsil PROVE beyond any doubt that picture information in real Hollywood movies exists beyond 16 and 235.

Now if you don't feel it's important to maintain that info all the way to the display, fine. Doesn't bother me at all. If you'd rather not re-calibrate your display for PC stuff, or whatever, no skin off my back, and I doubt any of the others here that advocate maintaining Studio RGB levels care either.

Quote:


I think the claim you guys are making - that there is significant BTB / WTW image information encoded on all our DVDs - is a bold one.

You use that statement a lot, but you also seem to miss the fact that at the very high end of A/V reproduction, "significant" <= "subtle". The difference between stock video card scaling and ffdshow Lanczos scaling is a perfect example: many here claim that it's "significant" (actually they usually use stronger words) and essential for playback. But in truth it's a small improvement that, to some, is a very important improvement, while to others it's not noticeable. The same is true of this discussion; maintaining proper video levels is much more important to some than to others.
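For reference, the standard Lanczos kernel behind that ffdshow scaling option - a=3 is a common choice, not necessarily ffdshow's exact tap count:

```python
import math

# The Lanczos windowed-sinc kernel used by Lanczos resampling.
# L(x) = sinc(x) * sinc(x/a) for |x| < a, 0 elsewhere.

def lanczos(x, a=3):
    """Lanczos kernel weight at distance x from the sample point."""
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

# Resampling weights for an output pixel come from evaluating this at
# the distances to the 2*a nearest source pixels, then normalizing.
```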

Really, I don't understand your persistence. Are you suggesting we should all be expanding video to PC levels? If so, can you provide proof that it doesn't hurt the image? (contrary to many experts) To me it seems more like you want to convince yourself (and others) that it's OK not to maintain Studio levels. So what is your motivation?

If you aren't arguing that, then your persistent comments don't add anything to the discussion.
post #78 of 486
Quote:


You've been shown, over and over again. There's no voodoo involved; there's real picture information that expanding to PC levels clips. The images and graphs posted by sspears and dumunsil PROVE beyond any doubt that picture information in real Hollywood movies exists beyond 16 and 235.

That 'proof' was not a full frame A/B, and one has no way to make any kind of comparison as there is nothing to compare or contrast with. It also must PROVE that there is real 'picture information' hidden in the black bars of a 2.35 movie, as that's where many of the BTB pixels were.

Dave
post #79 of 486
Quote:


The difference between stock video card scaling and ffdshow Lanczos scaling is a perfect example: many here claim that it's "significant" (actually they usually use stronger words) and essential for playback. But in truth it's a small improvement that, to some, is a very important improvement, while to others it's not noticeable.

Agreed, and there have also been many A/B images posted that capture, document, and compare those differences. Subtle or significant. FWIW it's often not subtle to me.

Quote:


Are you suggesting we should all be expanding video to PC levels?

Nope.

Quote:


So what is your motivation?

To see for myself, in an A/B comparison, all these BTB / WTW image details that you guys seem to see. Just like with your example of A/Bing different scaling algos, or others' comparisons of sharpening, etc... As far as I know, I'm the only one who has posted full frame A/B image comparisons of this. Again, I'm not the only one who seems to be missing these details.

Dave
post #80 of 486
Quote:
It also must PROVE that there is real 'picture information' hidden in the black bars of a 2.35 movie as that's where many of the BTB pixels were.
You like making the comment that some of the highlighted pixels were in the black bars, don't you - even though in those shots (except for the FOTR one) most of the highlighted pixels are on the border of the bars. Plus 4 of the 7 shots have NO bars at all, so all the highlighted pixels are in the picture.

And regarding the LOTR TT screenshot, almost half the picture contains BTB.

Here's an AB for you. I threw in the highlighted pic as well.

ps

For the benefit of others reading this thread, here's a link to the images being discussed (and a great thread on out of range picture info):
http://www.avsforum.com/avs-vb/showt...08#post3992208

-edit

Fixed the zip file, it has the right images now.

 

Attachment: sample_normal.zip (195.2 KB)
post #81 of 486
Dave, you're saying that we are claiming there is significant below-black and above-reference-white detail - but that is not the case. It depends on the source material! A dark movie will have a lot more below-black information than a bright one (i.e. horror versus romantic comedy).

I could make the case that we don't need 16 million colors - after all, no single scene uses all of them. Why don't we limit our displays to 16 bit (65536 colors) then? Can someone show me a scene where 65536 or more distinct colors are actually used? Golly, no scene can have that many colors! Let's all use 16 bit then!
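The arithmetic behind the analogy, assuming a common 5-6-5 bit packing for 16-bit color (other packings exist):

```python
# Arithmetic behind the 16-bit analogy above. A common 16-bit layout is
# 5-6-5 bits per channel (an assumption), which leaves far fewer levels
# for a pure grey ramp than 24-bit color does.

total_24bit = 2 ** 24            # 16,777,216 colors
total_16bit = 2 ** 16            # 65,536 colors

# Quantizing an 8-bit grey ramp down to 5 bits per channel:
grey_levels = {v >> 3 for v in range(256)}
print(total_24bit, total_16bit, len(grey_levels))
```

Only 32 of the 256 8-bit grey levels survive 5-bit quantization, which is exactly the kind of gradation loss that shows up as banding.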

Preserving Studio RGB levels does NOT hurt the picture at all. But expanding to PC levels can hurt the image - banding artifacts, etc. So what is the big deal? Can you prove that Studio RGB damages the picture? Why is this such a big issue?

Carl Sagan is spinning in his grave...
post #82 of 486
Dave: I've asked you before and I will ask you again.

You seem adamant that expanding to PC levels causes no degradation in image. Which means that it is equal to you. Fine, then go watch.

You have *NEVER* claimed that expanding to PC levels provides *ANY* even theoretical PQ improvement.

You have *NEVER* even attempted to explain why video engineers had their proverbial thumbs up their rear ends on this matter.

Guy Kuo and Joe Kane have also expressly stated (moreso Joe Kane) the need for proper preservation of video levels. Guy has seemingly contradicted himself in the past, but the standards for the ISF RL, which he put together, clearly *demand* that proper video levels be maintained.

There have been numerous examples and explanations given to you, on VARIOUS threads over the course of damn near a year or some godawful long time, on NUMEROUS forums.

I will only address one specific misconception in your post:

Quote:


Studio levels will PRESERVE all that detail.
--------------------------------------------------------------------------------

Uhh No. Not on a digital if one has calibrated the display's full white to 235.

This is a display *SPECIFIC* weakness. This is not a problem on a CRT display which is the video reference as it stands now. To fully emulate a CRT in this regard requires calibrating to 254. If you can't handle tradeoffs or understand that this causes the need for subjective calibration choices, then fine. But that is always the way things are. I shouldn't have to state the obvious that there is no perfect display. EVERY system has huge compromises made, and these are often subjective, some are choices made for us that we have no choice in.
post #83 of 486
Quote:


Here's an AB for you. I threw in the highlighted pic as well.

I must be missing something, but where is the A/B comparison to be made in the images in your zip file? Like one with the BTB present and passed vs the same image with it clamped? Like the ones I posted. I also note that again MANY of those highlighted pixels ARE WELL into the black bars and even off frame. Is there hidden image information in areas where there isn't even an image present?

If you could post the original unclamped image (without highlighting any pixels) THEN we could do an A/B comparison to the clamped one. You have the clamped one, so you must have the unclamped one too? I'd also suggest not using JPGs for this, as pixel values can shift with the compression.
Quote:


I could make the case that we don't need 16 million colors - after all, no single scene uses all of them. Why don't we limit our displays to 16 bit (65536 colors) then? Can someone show me a scene where 65536 or more distinct colors are actually used? Golly, no scene can have that many colors! Let's all use 16 bit then!

Kind of a silly analogy IMO. I'm sure it would be EASY to demonstrate the differences between 16 and 24 bit RGB.
Quote:


Preserving Studio RGB levels does NOT hurt the picture at all.

I never said it did. I have shown some downsides to it in a multi-use HTPC environment and when using a digital display if calibrated for peak vs. reference white. Also, you aren't 'preserving' the original Studio RGB. Our DVDs have been encoded in YCbCr, and that damage was irrecoverably done before the disc was ever stamped.
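For reference, the YCbCr step mentioned here can be sketched as follows, assuming 8-bit BT.601 studio-range YCbCr in and studio-range RGB out (the function name, the rounding of the scaled coefficients, and the 0/255 clipping choice are my own):

```python
# Sketch of a BT.601 YCbCr -> studio-range RGB conversion (8-bit).
# Chroma coefficients are scaled by 219/224 so Y 16-235 maps to RGB 16-235
# with no level expansion; clipping only at 0/255 lets BTB/WTW survive.
def ycbcr_to_studio_rgb(y: int, cb: int, cr: int) -> tuple:
    r = y + 1.371 * (cr - 128)
    g = y - 0.336 * (cb - 128) - 0.698 * (cr - 128)
    b = y + 1.732 * (cb - 128)
    clip = lambda x: max(0, min(255, round(x)))
    return clip(r), clip(g), clip(b)

# Reference white stays at 235, reference black at 16, and a below-black
# luma value such as 10 passes through rather than being clamped to 16.
```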
Quote:


But expanding to PC levels can hurt the image - banding artifacts, etc. So what is the big deal?

I guess the big deal is that I (and a few other respected members) can't really see all this claimed upside and you guys can't seem to just SHOW all this downside.
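A minimal sketch of the expansion under discussion (assuming 8-bit video; the helper name is my own):

```python
# Sketch: expanding studio-range video (16-235) to PC range (0-255).
def expand_studio_to_pc(v: int) -> int:
    v = max(16, min(235, v))            # BTB/WTW are clipped by the expansion
    return round((v - 16) * 255 / 219)  # 219 studio steps spread over 255

# The 220 studio codes land on only 220 of the 256 PC codes, so 36 output
# values can never appear after expansion -- one mechanism behind banding.
produced = {expand_studio_to_pc(v) for v in range(16, 236)}
missing = sorted(set(range(256)) - produced)
print(len(produced), len(missing))  # 220 36
```

Going the other direction, compressing 0-255 into 16-235, has the mirror-image problem: distinct input codes collapse onto the same output code.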
Quote:


Can you prove that Studio RGB damages the picture? Why is this such a big issue?

Nope. Haven't tried. I have tried to prove your point for myself that PC levels damage the image, but have failed. I don't think you have really proven that PC levels damage it either, whether through lost BTB/WTW image information or through banding artifacts.
Quote:


Carl Sagan is spinning in his grave...

Indeed.

Quote:


Dave: I've asked you before and I will ask you again.

I don't see a question mark anywhere in your post. I'll assume these must be the questions?
Quote:


You have *NEVER* claimed that expanding to PC levels provides *ANY* even theoretical PQ improvement.

Correct. And you have provided ONLY theories as to the improvements of StudioRGB levels IMO.

Quote:


You have *NEVER* even attempted to explain why video engineers had their proverbial thumbs up their rear ends on this matter.

Correct again. I never said any such thing. I do find it kinda interesting that seemingly no other digital imaging application requires BTB/WTW, and that the specification that established such headroom is very old and likely predates most digital imaging. See the Mr. Wigggles posts I quoted.
Quote:


This is a display *SPECIFIC* weakness. This is not a problem on a CRT display which is the video reference as it stands now.

Note my use of the word digital in the comment you quoted me on. There's a lot of them in use now.
Quote:


subjective calibration

Those two words still don't seem like they belong together, but oh well.


You also keep requesting that I quit posting my comments and opinions on this subject yet you keep asking me questions, lay down challenges and even flame me and expect me not to respond?

Dave
post #84 of 486
This is just silly. I see banding artifacts on my DLP and a lesser amount on my 19" CRT monitor if I use PC levels. That is proof enough for me. I invite you to come over and witness this yourself (a screenshot will not work - a camera cannot capture that much detail with such limited light, unfortunately).

Anyways, enough of this hijacking. I vote that this thread gets split up - the idea of it is to have the HTPC match reference output which is Studio RGB levels.

Dave, for chrissakes create a new thread on the subject. Why dilute this once-informative thread? This is ridiculous.
post #85 of 486
Quote:


Originally posted by dlarsen
I must be missing something but where is the A/B comparison to be made in the images your zip file? Like one with the BTB present and passed vs the same image with it clamped? Like the ones I posted.


Sorry I zipped the wrong files, it's fixed now.

Quote:


I also note that again MANY of those highlighted pixels ARE WELL into the black bars and even off frame. Is there hidden image information in the areas where there isn't even an image' present?

What are you talking about? "WELL into the black bars"? There's nothing beyond the edges, and besides, almost half of that picture (not including bars) contains BTB.


Quote:


You also keep requesting that I quit posting my comments and opinions on this subject yet you keep asking me questions, lay down challenges and even flame me and expect me not to respond?

Oh, don't go blaming us for this. You're the one who (back in post #39) started the whole "Show me the BTB" topic in this thread.
post #86 of 486
Thanks for the images Stranger89. Perhaps it's the jpeg format, but I do measure values below 16 on the clamped image (I'd expect nothing below 16,16,16), and I don't measure the same levels on the mid-level data like I'd expect. For instance, at pixel 455,416 (relative to the top left) I measure 28,31,38 in the clamped and 24,31,37 in the unclamped. I ran into similar problems with my caps when using jpegs.

Having said that, I can't really perceive any difference between the two images when flipping back and forth in an A/B manner.
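A spot check like the one above can also be scripted. A minimal sketch, assuming the two captures are already decoded into nested lists of (R, G, B) tuples (the helper name is hypothetical; captures should come from a lossless format such as PNG or BMP, since jpeg compression shifts pixel values):

```python
# Report the largest per-channel pixel difference between two same-size
# images, and where it occurs -- handy for A/B-ing two screen captures.
def max_pixel_diff(img_a, img_b):
    worst = (0, None)  # (max abs channel difference, (x, y) location)
    for y, (row_a, row_b) in enumerate(zip(img_a, img_b)):
        for x, (pa, pb) in enumerate(zip(row_a, row_b)):
            d = max(abs(a - b) for a, b in zip(pa, pb))
            if d > worst[0]:
                worst = (d, (x, y))
    return worst

# Example: the measured pair 28,31,38 vs 24,31,37 differs by 4 in red.
print(max_pixel_diff([[(28, 31, 38)]], [[(24, 31, 37)]]))  # (4, (0, 0))
```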

Thanks again.

Dave
post #87 of 486
So Dave, if you see no difference, then why do you care about this argument? I don't see any motivation for you to keep arguing this. My motivation is merely to point out to the HTPC folks, who apparently want high quality video at DIY prices, that "hey, just make sure you're handling the video right, like any other video scaler." That's all.

It pains me when it seems some are *so* stubborn or feel attacked when I'm just trying to bring up a way to make sure that they *are* getting the best possible video, just as if they were using a standalone video processor.

YOU are the one advocating doing things *differently*, and this is something you still don't seem to have realized. I am advocating sticking with the current way of handling video; I am NOT advocating any change.

Quote:


the specification that established such headroom is very old and likely predates most digital imaging.

Correct. Video standards, calibration methods, and ways of handling video are all still based on CRT displays, like it or not. To display the video as well as possible, a digital display has to emulate a CRT. This was the goal of the Joe Kane Samsung, which alters things like black level to emulate a CRT environment as much as possible. Like it or not, this is the way things are now. Will they change in the future? Possibly, but even in digital video environments, there are benefits that have been explained to you, repeatedly.

Quote:


Those two words still don't seem like they belong together but oh well

I'm not going to explain to you the basics of calibration.

Quote:


You also keep requesting that I quit posting my comments and opinions on this subject yet you keep asking me questions, lay down challenges and even flame me and expect me not to respond?

No, I expect you either 1) to have some kind of coherent argument, or 2) to recognize you don't, and stop arguing.

You keep asking everyone to prove to you that the world is round, basically. I and others are advocating maintaining the SAME way of handling video as has been the case for years in video equipment. I want HTPC users to be able to get the best possible image, to have high quality processing etc, just as if they had spent tens of thousands on standalone video processors. You seem to have a problem with this.

Yet you have not articulated any argument, at any point, to advocate any departure from the orthodox way of handling video as video. You have no coherent argument besides "prove this to me personally." Prove what? There's nothing to prove; there is a way video is handled, it's been handled that way since forever, and everyone seems to agree on this but you.

Quote:


Preserving Studio RGB levels does NOT hurt the picture at all.
--------------------------------------------------------------------------------


I never said it did. I have shown some downsides to it in a multi-use HTPC environment and when using a digital display if calibrated for peak vs reference white.

You even admit here that you have no argument that you are making (besides being hideously argumentative).

You have shown that multi-use, multi-standards environments are, gee whiz, problematic. No crap. You have shown that certain kinds of displays with limitations different than CRTs are, gee whiz, limiting in that regard. Wow. That's just... I mean... I just never knew that.

WHAT IS YOUR POINT?

Back in school they called it a thesis. Do you have one?

You have utterly failed to articulate any coherent point or argument.
post #88 of 486
I'm not going to jump into the debate, but regardless of which side one takes it's helpful to have good tools to set black and white levels to where *you* believe they should be for best PQ. I developed the attached screens to allow higher precision calibration of black and white levels at the ends of the scale around the reference levels that are the subject of the debate. Please give them a try and let me know if you find them useful.

Each screen is 1280x720, developed for setting up my native 720p projector via HTPC/DVI. Two small GIF files, but zipped to beat the screen size limit imposed by the system.

 

Attachment: reference calibrator 1280x720.zip (131.4 KB)
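For anyone wanting to roll their own screens along these lines, here is a minimal sketch that generates a near-black bar pattern as a plain grayscale PGM file (the levels, filename, and layout are my own choices, not a recreation of the attached GIFs):

```python
# Write a 1280x720 grayscale PGM with vertical bars at luma codes 0..32,
# spanning below-black up through and past reference black (16).
def write_near_black_ramp(path="near_black_ramp.pgm", width=1280, height=720):
    levels = list(range(0, 33))        # one bar per code value, 0 through 32
    bar_w = width // len(levels)
    row = bytearray()
    for lv in levels:
        row += bytes([lv]) * bar_w     # each bar is bar_w pixels wide
    row += bytes([0]) * (width - len(row))  # pad the remainder with black
    with open(path, "wb") as f:
        f.write(b"P5 %d %d 255\n" % (width, height))  # binary PGM header
        f.write(bytes(row) * height)   # repeat the same row for every line
```

On a correctly set display, the bars above code 16 should each be just distinguishable from their neighbors, while bars at or below 16 should blend into reference black (unless BTB is being passed and displayed).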
post #89 of 486
Thread Starter 
3no,

Thanks for sharing that! I'll check it out tonight at home, and definitely save these.

Have you found with your 720p projector that you could calibrate contrast to 235 for peak white? On my DLP TV, if I try to get the brightest white the display is "capable" of at 235, it messes with other values below that (high grey turns pink/orange), and I got hideous results as I mentioned with Finding Nemo.

Since DLPs don't bloom like CRTs at a certain point, it's harder to figure out where white level "should" be without using some light measurement device to get the right amount of actual light output. I found that for me (and for now), I get the best results by calibrating black to 16 and white to 255 (give or take) with DVE/Philips Pattern Generator. Whites in movies are still very bright, and I don't think my eyes could handle the Virtual Dock Handler white scene in chapter 4(?) of Matrix Reloaded (which has whites at about 220 IIRC) if it were any brighter than it already is.

But I'll have to check your screens as well and see if I can dig up an old copy of Photoshop (3.0 or 4.0 or so) and see if I can do anything with it.
post #90 of 486
cyberbri, it is weird that your display would mess up other colors when you push up the contrast! Maybe the contrast control acts like a gamma control on your set? Hmmmm... I do recall discussions on how, if you turn up contrast too high on a digital projector, you can exceed the lamp's color range.

For example, my Benq 8700 projector makes whites turn blue if I crank the contrast all the way up, because the bulb does not have enough red in the spectrum.