
806 Posts
Discussion Starter #1
First let me say that ColorFacts is a hoot! It is simple to set up and use, and the data is pretty straightforward. There are a couple of issues with it, though.


1. The graphs could be more detailed. Perhaps they are kept coarse on purpose so they don't take up too much storage when saved, but it would be nice to have more detail as an option.


2. The software seems to be somewhat unstable, at least on my system. Note that it is the only software that misbehaves; I can run WinDVD, etc., forever without issues.


Sometimes when I just leave it sitting on 'Align Sensor' it gives a failure message. I have had to reboot my system numerous times while ColorFacts was running.


That being said, I would still buy it again because it does seem to work and the resets are nothing more than a distraction.


Now for Dilard 2.2.


Let me say that this is a nice piece of software! It is very stable and has given me no trouble whatsoever (till just now...see below).


I first loaded ColorFacts on my laptop and took some baseline measurements.


To my surprise, the temperature chart was flat with the JVC as-is, but the contrast was only about 350:1.


I then moved on to the first attempt at calibration.


The first thing that I stumbled on was the precalibration. It asks you to put white and black blocks into targets but doesn't tell you which goes into which. Moreover, there isn't any white block. The wizard puts up three primary-color ramps on the screen and sliders to move the 'blocks'. My take, barring any other input, was to put the black block on the right target and the PRIMARY COLOR block on the left target.


Mark, is this correct? This is one place where a picture would be worth a thousand words. Perhaps even an automated demo of the two controls sliding the two bars onto the targets, so that people would know what to look for (or maybe I'm just a dunce :p ).


I then ran ColorFacts to read out the results. I got a temp curve that started way below 6500, slowly ramped up to above 7500, then slowly came back down to 7260 or so.


Undaunted, I popped in a DVD and took a look. WOW, the picture was completely blown out. This is because Dilard does not seem to use the overlay but the main display memory instead. Since I have sensitive eyes, I had the desktop dimmed. Dilard took this as low output and boosted the gain. When I played the movie, which does use the (scoped) overlay, it was way too hot.


I then reset the desktop to default and re-ran the entire precalibration and calibration. This time it gave a flat temp line, but at 6977, not at 6500. Contrast was ~400:1.


At this point I decided to revisit the precalibration to see if there really was a white block; I couldn't find one. I then decided to stretch the blocks towards the outside of the target (still enclosed, but biased towards the outside edges). The documentation says that an error in this direction gives better results. Again I reran the precalibration and calibration. This time I got a hump at around 8000 that quickly ramped down to 6500, then slowly ramped back up to beyond 7500, averaging about 7220. Contrast was ~440:1.


I then decided to restore the previous setup since it had the flattest temp graph. All went well except that it refuses to restore the blue gamma. So here is where I am at the moment: an unusable display due to bad blue gamma.


Mark,


Would it be possible for you to e-mail me with some suggestions? Particularly about getting around the restore failure. I have tried three times now to no avail, and I'd really like to watch movies tonight. I've e-mailed you as well.


To all of those who've had better luck, my hat's off to you. If anyone else has a suggestion, let me know.


Anyone else?


Thanks,

Phil
 

806 Posts
Discussion Starter #2
Update: I was able to restore the gamma from the last save (the one with the hump), but even that was flaky; I had to try three times to get it to go without errors.


Mark, do you automatically downshift to a lower speed when you get errors? It might be something to think about. Do you try three times and then give up? You may wish to increase that as well. Also, it would be nice if you let out a 'ding' every time the comm path got an error. That way, if you are getting a lot of 'dings', you will know to be careful and try to improve your environment.
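Something like this is what I have in mind (just a rough Python sketch to show the shape of the idea; it needs the pyserial package and Windows for winsound, and the baud list, retry count and one-byte ACK protocol are all made up, since I obviously don't know Dilard's internals):

Code:
import serial, winsound

BAUD_RATES = [115200, 57600, 19200]   # downshift to a lower speed on repeated errors
MAX_RETRIES = 3                       # per speed; maybe this should be configurable too

def send_block(ser, block):
    """Write one block and wait for a one-byte ACK (made-up protocol)."""
    ser.write(block)
    return ser.read(1) == b'\x06'     # ASCII ACK

def send_with_fallback(port, block):
    for baud in BAUD_RATES:
        with serial.Serial(port, baudrate=baud, timeout=2) as ser:
            for _ in range(MAX_RETRIES):
                if send_block(ser, block):
                    return True
                winsound.Beep(1000, 100)   # the 'ding': one beep per comm error
    return False                           # every speed and retry exhausted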


I am using two 25' serial cables, so that may be an issue! However, I still don't think that backup/restore is as bulletproof as it needs to be.


Phil
 

4,525 Posts
Hi Phil,


Interesting findings. I suppose that some additional "environment checks" may be a good idea in the future.

Quote:
I had the desktop dimmed
If the contrast, brightness, gamma, etc. of the desktop have been adjusted, they will, of course, affect the calibration. When the Wizard asks for a 50% gray, it expects to be reading a 50% gray from the colorimeter, and makes appropriate adjustments so that it becomes that.


It sure sounds like the projector ended up with an abnormally bright boost due to the dimmed desktop.
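To make the failure mode concrete, here is a toy numeric illustration (not actual Wizard code, just the arithmetic; the 70% dimming figure is an example):

Code:
TARGET = 50.0                        # the Wizard expects a 50% gray reading

def wizard_gain(measured):
    """Gain the Wizard applies so 'measured' becomes TARGET."""
    return TARGET / measured

desktop_dim = 0.7                          # desktop dimmed to 70% of default
gain = wizard_gain(TARGET * desktop_dim)   # Wizard sees 35, boosts by ~1.43x
overlay_gray = 50.0 * gain                 # overlay was never dimmed, so it lands at ~71
print(f"gain={gain:.2f}, overlay 50% gray now reads {overlay_gray:.0f}")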

Quote:
Would it be possible for you to e-mail me with some suggestions?
Sure thing. Also, I have put up a page with Calibration Tips and Tricks that may be helpful in maximizing the performance of the Wizards.


Mark
 

4,351 Posts
Hmm, this is a little worrisome. How will we know that Dilard autocalibrated our color to a roughly flat 6500K if we don't have ColorFacts?


How can we tell whether the brightness/gamma of our desktop is equal to that of the overlay? Maybe Dilard could force this equality?


Thanks,


Andy K.
 

889 Posts
Mark,


Wow, great help page! Everything seems crystal clear to me except this part:

Quote:
6. Remeasure Initial Contrast Ratio

Checking this box will cause the Precalibration Wizard to measure the initial contrast before any changes are made to the projector and store the value for the Calibration Wizard to display after the calibration is complete. If this box remains UNchecked, the Precalibration Wizard will leave the stored value intact (preserving the original, original contrast).


The first time this Wizard is run the box will be disabled and can not be unchecked.
I have to say, I'm thoroughly confused. :)


What "stored value" does this refer to? The value of the previous calibration run? If so, how does forcing it to be disabled the first time the calibration is run make any sense (as there's no previous calibration number to preserve)? Or am I just misunderstanding what this checkbox does?
 

806 Posts
Discussion Starter #6
Hello again! I shortened the serial cable and was able to restore the gamma. I would still suggest a 'ding' or something when an error occurs so that people can monitor their environment. My guess is that I was getting some errors during backup but they were corrected via retry. I must have been right on the hairy edge.


Now on to what I think would be a great addition. As I first mentioned and Andy reaffirmed, since the calibration uses the desktop and not the overlay, there are only two ways to make sure that the final result (the overlay, that is) is correct.


One would be to use the overlay to do the calibration. It seems that there must be something insurmountable about this, or I am sure Mark would have thought of it.


The other would be to calibrate the desktop. This can be done quite easily, I believe, if you have access to a scope. Alternatively, you could just get the numbers from someone who has done the scoping.


******IDEA******


Before the precalibrator gets started, bring up a pop-up window that asks if it should display a 100 IRE to 7.5 IRE ramp, the exact same one that is on Avia. If you say yes, it displays the ramp so that the person can scope the display and jump to the driver's contrast/brightness controls to make corrections until the 716 mV/56.3 mV readings are achieved.


The person then hits escape and the precalibrator is free to do its thing. Once the precal and cal are done, the person can slip in the Avia disc, display and scope the same ramp, and thereby match the desktop calibration to the overlay calibration.
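For anyone who wants to sanity-check scope targets at other IRE levels, the arithmetic is trivial. A quick Python helper (I'm using the nominal 714 mV for 100 IRE here, so it won't land exactly on my 716 mV/56.3 mV readings; plug in your own white level):

Code:
def ire_to_mv(ire, white_mv=714.0):
    """Voltage above blanking for a given IRE level (1 IRE = white_mv/100)."""
    return ire * white_mv / 100.0

for level in (100.0, 50.0, 7.5):
    print(f"{level:5.1f} IRE -> {ire_to_mv(level):.1f} mV")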


An important side benefit is that the D-ILA is no longer calibrated to the PC; it is now calibrated to a standard. Think of someone who uses a scaler and only hooks up a PC to do the calibrations. If the PC display card is off, the scaler output will not be correct.


Does anyone see any flaws in the above? Mark, how 'bout it?


I am downloading the latest Dilard release and will try again, hopefully tomorrow night. I should be able to do better than what I am seeing, so it's either me, the software, or my D-ILA (probably in that order :p).


If there is a way to add pictures to a thread, would someone let me know? I could place the temp histograms of my various trials in this thread (I don't have my own website, and I can't find any free picture-hosting sites anymore).


Isn't anyone else willing to share results?


Phil
 

4,525 Posts
Hi Chris,


Regarding "Remeasure Initial Contrast Ratio"-

Quote:
What "stored value" does this refer to? The value of the previous calibration run?
The Calibration Wizard displays the "results". Two of them are initial contrast and final contrast. Since changes have already been made by the time the Calibration Wizard runs, it can't measure initial contrast...the Precalibration Wizard must do this.


The Precalibration Wizard measures contrast if you check this box, and will store it in the Registry (a key called "InitialContrast" or something similar). You can look for it between running the two Wizards, if you wish.


You can't UNcheck the box during the first run because the Precal Wizard MUST measure the contrast, as there is no stored value to fall back to. The default is UNchecked, so the original contrast will remain in the Registry forever, indicating what your projector was before Dilard did anything to it.
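If you are curious, a few lines of Python will show it to you between Wizard runs. The key path below is a guess (as I said, the value is "InitialContrast" or something similar), so hunt around under HKEY_CURRENT_USER for the real location:

Code:
import winreg

KEY_PATH = r"Software\Dilard"     # hypothetical path -- search for the real one
VALUE_NAME = "InitialContrast"    # "or something similar", as noted above

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
    value, _type = winreg.QueryValueEx(key, VALUE_NAME)
    print("Stored initial contrast:", value)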



Phil -


Some great ideas there! This is really "version 1.0" of the Calibration Wizards. Maybe a future version will use the Overlay instead of the Desktop to create the test patterns. We use DVD players that use the Overlay for video more than anything, so this idea does make sense.


For now, setting the desktop and the overlay to default values (or at least the same values) is the only thing to do.


Thanks for the suggestions! I have copied your post into the development tasks database.
 

4,351 Posts
Hi Mark,


Let me just reiterate my point above. It's fantastic that the wizard provides some kind of feedback on the starting and finishing contrast ratios, but it would also make things a lot easier if there were some kind of feedback on the starting and finishing color temperature.


Obviously, I understand that this could infringe on your ColorFacts product, but if the wizard provided something simple like "Starting color temperature varied between 3500K and 9000K. Ending color temperature varied between 6450K and 6600K," it would strike a balance between giving valuable feedback that confirms you set up your calibration correctly and not giving away too much information :)


Andy K.
 

4,525 Posts
Hi Andy,


That's a really good idea, but it becomes one of those slippery-slope kind of things. It would be easy to keep track of minimum and maximum temp, before and after. Average temp., too.


However, then it would be nice to see a graph, to know where the errors are. Since correlated color temp doesn't really take into account errors in the green channel, we would need to see a three-dimensional presentation of the color information as well (or at least a projection of the 3-D data, like a CIE chart).


Then it becomes difficult to track the actual data against the reference data (since the target is so small), and you need a way to show color errors easily, something like an extruded Red, Green, Blue error histogram. Finally, we have the information you really want to see. Correlated color temperature minimums and maximums are really not much information at all. An RGB histogram is!
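Here is a quick numeric illustration of that green blindness, using McCamy's standard approximation for correlated color temperature from CIE 1931 (x, y). The shifted chromaticities are made-up examples, not measurements:

Code:
def cct_mccamy(x, y):
    """McCamy's approximation: CCT in kelvin from CIE 1931 chromaticity."""
    n = (x - 0.3320) / (y - 0.1858)
    return -449.0 * n**3 + 3525.0 * n**2 - 6823.3 * n + 5520.33

print(round(cct_mccamy(0.3127, 0.3290)))  # D65 itself: ~6505 K
print(round(cct_mccamy(0.3127, 0.3390)))  # green shift (y +0.01): only ~70 K change
print(round(cct_mccamy(0.3227, 0.3290)))  # same-size red shift (x +0.01): ~500 K change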


Plus, wouldn't it be nice to know how "red" your reds are? And how "green" your greens are? How dark your blacks are (and what "color" they are)? How closely your projector matches a standard (like NTSC, PAL or HDTV)?


In the end, there is so much information needed to gauge the performance of a display device that it could easily form the basis for a completely separate application....and that is the way it was implemented.


Make no mistake....ColorFacts was born from Dilard, and the code in the former was at one point to be found in the latter. It became such a voluminous amount of information, and such a tangential presentation to the primary task of calibrating the projector, that it was pulled out into its own "Display Device Analysis System".


When the scope of the already-well-endowed Dilard started to mushroom, it was necessary to stop and look at what the product's core competencies would be.

It was decided that the product would be a less expensive "at home" alternative to sending the projector away for various operations, calibrations and automation, for those people who wish to take the do-it-yourself route.


Comparing against that marketplace might help you see what features should/should not be in the product. How many correlated color temperature graphs, RGB histogram readings, conformance-to-standards reports and other information would you receive if you sent the projector away to be calibrated by someone else?


I'm not saying that this product shouldn't (or couldn't) be a better alternative to the other choices available; I'm just trying to rein in the scope of the application to have a clearly defined feature set.


Make no mistake: This is an at-home alternative to doing many operations that are 1) Expensive and 2) Require shipping. There will certainly be other bonuses (automation, tweaker's tools, etc), but that is the primary goal.


Thanks! Sorry for the long-windedness. Guess I had a lot to say on that!
 

4,351 Posts
Hi Mark,


I absolutely 100% agree with your point. It is a slippery slope, no doubt about it. Luckily, since you get to be the final arbiter of what goes into the product, you can put as much or as little into it as you want.

Quote:
Correlated Color Temperature minimums and maximums is really not much information at all. An RGB Histogram is!
Right, that is exactly what I'm saying. It's not much information at all, by design. What would be ideal, from both a customer standpoint and a business-model standpoint, is the *bare minimum* required to tell the user that the calibration went properly.


Heck, color temp min/max may even be overkill. Even if the wizard just confirmed that the result is within X% of a flat 6500K color temperature and reported "Yes/No", that would be sufficient. I thought the min/max was a good idea just to tell you whether you were way off (something seriously wrong with the calibration setup) or pretty close (maybe tweak the position of the spectrometer, or just rerun the calibration). You could also gauge how well multiple calibration runs are converging.


Bottom line is that feedback like this should be solely in support of taking the guesswork out of the calibration and 'proving' it worked properly, and no more. As long as the feedback gives the user no specific information about the final response of the display, I think you have the two products nicely differentiated.
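In code terms, all I'm asking for is something this small (a Python sketch; the readings and the 5% tolerance are made-up examples):

Code:
TARGET_K = 6500.0

def calibration_report(temps_k, tolerance_pct=5.0):
    """Pass/fail against a flat TARGET_K, plus min/max for a sanity check."""
    lo, hi = min(temps_k), max(temps_k)
    worst_pct = max(abs(t - TARGET_K) / TARGET_K * 100.0 for t in temps_k)
    verdict = "PASS" if worst_pct <= tolerance_pct else "FAIL"
    return (f"{verdict}: temperature varied between {lo:.0f}K and {hi:.0f}K "
            f"(worst error {worst_pct:.1f}%)")

print(calibration_report([6450, 6510, 6600, 6480]))   # hypothetical run: PASS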

Quote:
Comparing that marketplace might help you put things in perspective. How many correlated color temperature graphs, RGB histogram readings and other information would you receive if you sent the projector away to be calibrated?
None at all - but at least you are 100% guaranteed that the projector calibration worked. The confidence level is very, very high. Unfortunately, with so many variables (PC settings, ambient light, and especially user error), it would be nice to avoid the situation where you are never really sure whether your calibration was done properly. Without proper feedback, the confidence level falls.


I know I'm not looking for color temp graphs or pretty pictures, but I would like to know that I didn't screw up my calibration from an independent observation tool, rather than my eyes. :)


Thanks for hearing my 2 cents!


Andy K.
 

4,525 Posts
When the projector is calibrated by someone else...

Quote:
at least you are 100% guaranteed that the projector calibration worked
Really? Ever analyzed a professionally calibrated projector? I have. Several. On some, the final contrast wasn't better than 250:1. Wouldn't it be nice to know that?


Anyway, that's beside the point. I actually agree with you. I will see what I can put in for additional metrics in the next release.


Thanks for the great suggestions and feedback!
 

4,351 Posts
Thanks, Mark. The contrast ratio is definitely a great thing to know, since it can be the subject of many 'bragging rights' threads around here :)


One other question just popped into my head - is there a reason why the spectrometer has to be at least 6-8 feet away? Would it be even more accurate if it were 1 foot away, for example? (Difficult to do with a ceiling-mounted setup, but not with a table-mounted one.)


Thanks,


Andy K.
 

806 Posts
Discussion Starter #13
I ran 2.23 last night and got 435:1 contrast with a temp graph that started very high (the color of black), then quickly came down to 6500, then slowly ramped up to around 7200, if I recall. Still not ready for prime time.


This leads me to a possible cause. If the precalibration wizard located the black point and then the white point, and got the white point wrong, the calibration would correct for off-black as quickly as possible (as stated above), then run a flat line to white.


If white is off, the calibration will do a slow ramp to off-white, as I've seen on most of my calibrations. So I think the problem is in the precalibration wizard, in that it doesn't find the proper white. What do you think, Mark?


By the way, I must say that Mark has been very attentive to my challenges. I am sure that we will get to the bottom of this.


My guess is that when all of us who have ColorFacts and 2.2 say that we have 600:1+ contrast and a temp histogram that is 6500 +/- 50, there will be no need for extra info in Dilard. You set it up and it does its job, period.


Remember that this is a first release of the calibration wizard and there are bound to be unforeseen problems. I am sure Mark will figure this out.


As I stated previously, I don't think it is the sensor. That being the case, we may only be a download away from perfection.


I would also urge all others who have ordered ColorFacts and have Dilard to run the same tests and share the info here. It would be helpful to know if this is just a fluke or is more common. I am sure that Mark would like to know this as well.


Phil


P.S. One thing that I forgot to mention is that the grey scale absolutely rocks. Dilard displays a 16-level (I think) grey-bar pattern after calibration, and you can see every gradation from pure white to pure black. Excellent! Now if those greys WERE truly grey.... ;)
 

4,525 Posts
Quote:
6500 then slowly ramped up to around 7200 if I recall.
Hi Phil,


Did you measure your primaries before running that Wizard?


Also, another great instrument to use with the gray scale tracking is the CIE chart. You will see each gray scale plotted in the color of the test image (white is white, black is black, etc.), and you can see how close they are landing to the D65 reference point on the chart.


I like this instrument better than the temperature histogram, since the temperature histogram mainly considers the Red and Blue balance; green errors will not register on it (see the Help file for details).


The CIE Chart will show correlation with the D65 standard illuminant, which can be much more informative.
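If you want a single number instead of eyeballing the chart, the distance of each gray from D65 in the CIE 1976 u'v' plane works well, since that plane is more perceptually even than 1931 xy. A sketch (the three gray-scale readings are hypothetical):

Code:
import math

def xy_to_uv(x, y):
    """CIE 1931 (x, y) to CIE 1976 (u', v')."""
    d = -2.0 * x + 12.0 * y + 3.0
    return 4.0 * x / d, 9.0 * y / d

U65, V65 = xy_to_uv(0.3127, 0.3290)   # the D65 reference point

def delta_uv(x, y):
    """How far a reading lands from D65 on the chart."""
    u, v = xy_to_uv(x, y)
    return math.hypot(u - U65, v - V65)

for ire, x, y in [(100, 0.3130, 0.3295), (50, 0.3180, 0.3350), (10, 0.3050, 0.3200)]:
    print(f"{ire:3d} IRE: delta u'v' = {delta_uv(x, y):.4f}")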
 

4,525 Posts
Quote:
600+ contrast and a temp histogram that is 6500+/-50
Just so you know, the Dilard tolerance goal is about +/-200 degrees, not fifty. I thought that was a good compromise between "really good" and "taking really long" :D.


We can revisit the basic assumptions, if you wish, but one of the goals of the software was that it would be "fairly easy" to run.
 

806 Posts
Discussion Starter #16
Mark,


I should have put a smiley by that +/-50! What would you think of having a parameter that would allow a trade-off of time vs. accuracy? That way, one could try a couple of calibrations just to see the interactions, then let the thing run overnight, if they want, to do the final tracking.
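Conceptually, I'm picturing something like this (a Python-flavored sketch; measure_temps and adjust_gains are stand-ins for whatever the wizard really does internally):

Code:
def calibrate(measure_temps, adjust_gains, tolerance_k=200.0, max_passes=50):
    """Iterate until every gray-scale step is within tolerance_k of 6500K.
    A tighter tolerance_k means more passes, i.e., the time/accuracy knob."""
    for n in range(max_passes):
        temps = measure_temps()                      # one reading per IRE step
        errors = [t - 6500.0 for t in temps]
        if all(abs(e) <= tolerance_k for e in errors):
            return n                                 # converged after n passes
        adjust_gains(errors)                         # nudge the gains per step
    raise RuntimeError("no convergence -- loosen tolerance_k or check the setup")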


All,


I did some more testing last night and got some interesting results.


First I ran the greyscale measurements with Avia as opposed to the internal ColorFacts patterns. The results are close enough that no one who must use Avia need worry about inaccurate readings.


Next, I measured the Elsa default values of the desktop. Wow, they were right on the money! Both at 100 IRE and at 10 IRE (ColorFacts doesn't have a 7.5 IRE display), R, G, and B were not even one click off the standard!


Now for the bad news. The overlay default DOES NOT EQUAL the desktop default, not even close. Luckily I had already dialed in my overlay with Avia and a scope, so I am good to go (default desktop at the standard, tweaked overlay at the same standard).


For those with GeForce cards, I think the desktop defaults should all be about the same; they have a reputation for consistency. All you need is the scoped overlay tweak and you're dialed in, not only to your PC but to any standard source. I'll remeasure my overlay and post the numbers tonight.


For those with other cards, it may not be that simple. From what I gather, the Radeons are all over the map in output. The LE I had wouldn't even put out 716 mV. For optimum results, you may need to scope both the desktop and the overlay to get them to agree, and use that value to calibrate from.


To this end, I am asking Mark to come up with a wallpaper that has the left half at 100 IRE (according to ColorFacts/Dilard) and the right half at 7.5 IRE. That way, those with scopes can dial in their boards, while those without scopes can take the numbers that are posted for the various boards. Alternatively, those with ColorFacts can set the display back and forth between 100 IRE and 10 IRE as I did.
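In the meantime, anyone with Python and the PIL imaging library can roll their own. A sketch, assuming a full-range desktop where 0 IRE maps to code 0 and 100 IRE to code 255 (which is itself one of the variables in question):

Code:
from PIL import Image, ImageDraw

W, H = 1024, 768
img = Image.new("RGB", (W, H))
draw = ImageDraw.Draw(img)
draw.rectangle([0, 0, W // 2 - 1, H - 1], fill=(255, 255, 255))   # left half: 100 IRE
g = round(7.5 / 100 * 255)                                        # ~19 of 255
draw.rectangle([W // 2, 0, W - 1, H - 1], fill=(g, g, g))         # right half: 7.5 IRE
img.save("ire_split_wallpaper.bmp")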


I personally think that the only way around all of this is for Dilard to use the overlay as the calibration standard. That way it is guaranteed to be in accordance with the software DVD players, since they also use the overlay.


Anyone else with any experiences to share?


Phil
 

4,351 Posts
I have a question - How can we adjust settings on the Overlay if the controls are in the DVD player?


Do the Radeon and/or GeForce have Overlay color controls in the Display Properties? (Does this include gamma control, in the Radeon case?)


Andy K.
 

806 Posts
Discussion Starter #18
Hello all!


Well, last night I didn't have a lot of time to play, but I did get to the measured values.


For the Elsa Gladiac GTS-2: set the desktop to default (hit the 'reset to defaults' button); for the overlay, set brightness to 11 and contrast to 94. It was late, so this may not be dead-nuts on, but it should be close enough to get you started. Note that the overlay will not affect calibration, since Dilard doesn't use it (yet!). So when I have a chance to reverify the overlay numbers, you won't have to recalibrate.


I think that GeForce cards start with a brightness of zero, so the proper value for them should be 111. Can someone confirm this?


I also sent Mark some new graphs. I'll admit that I cheated to get a flat graph. We'll see if 'Mikey likes it'!


Tonight I will try a new calibration with the primaries dialed in first. If it still has that upward ramp, I'll just cheat again until Mark figures out the fix.


Unless tonight resolves the 'upward ramp' issue, I will retire from this thread and start a new one when Mark does another point release.


Thanks,

Phil
 

2,151 Posts
Phil,


When I did my calibration, I didn't see anything unusual at all in how the ATI overlay displayed compared to before. Then again, I don't have a scope. :D


I always use the atidvd.reg file before playing a DVD. Those settings (if memory serves me) were done/scoped to match the zero IRE and to display the best image, right? Is it safe to assume that the ATI overlay would produce a great image using that file if the projector were properly set up/calibrated? This, of course, doesn't rule out variances between individual video cards.


I don't think I would want my G11 calibrated to my video card's overlay if there were that big of a difference. It may not serve very well for HD material.


Chris
 

806 Posts
Discussion Starter #20
Chris,


The problem is that Dilard uses the desktop to do the calibration. Therefore, if the desktop and overlay don't agree, the calibration is invalid. Your overlay can be tweaked to no end and the picture can still look lousy if the desktop isn't calibrated to the same standard.


Note that they don't necessarily have to be set to a standard; they just have to agree. I just happened to luck out because the default desktop is exactly at the 716 mV / 100 IRE standard, and my overlay was scoped to the same standard. Therefore, they were both the same.


Your overlay is set to the standard; is your desktop? That's the question. Of course, many people aren't as anal as I am, being an engineer. :D


Hope this helps,

Phil
 