The official Ruby Calibration Q/A Thread

#1 ·
Looks like we needed a good, centralized thread to share helpful tips, tricks, and settings related to calibrating the Ruby. I'll start by posting some of my findings and some questions.


[Mods - please leave this thread in this forum rather than moving it to the calibration forum - as most Ruby owners will not find it there (a previous thread on calibrating Ruby dynamic iris got no action there and quickly sank despite many attempts to keep it going...)]

=====


I've calibrated two different Rubys at this point (using ColorFacts 6.0 and the EyeOne sensor) - one was a demo unit and the other was mine. The pjs were connected via DVI to a Bravo D1 outputting 1080i, using patterns from both VE and DVE.


I thought you might find it interesting that out of the box both were quite warm on the color temp from 20-100 IRE, particularly below 50 IRE. For example, both units ranged from 5800 to 6250K, generally starting around 5800 and moving up toward 6250 as the IRE increased. The demo unit had about 125 hours; mine had just a couple at the time.
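
For anyone who wants to sanity-check their own meter readings, here's a rough Python sketch of how a color temperature like those above can be estimated from CIE xy chromaticity using McCamy's approximation. The sample xy values are made-up illustrations, not my actual measurements.

Code:

# Rough sketch: estimate correlated color temperature (CCT) from CIE 1931 xy
# chromaticity using McCamy's approximation (good to within a few tens of
# kelvin near the daylight locus).
def cct_mccamy(x, y):
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# The D65 grayscale target (x=0.3127, y=0.3290) comes out near 6500 K:
print(round(cct_mccamy(0.3127, 0.3290)))   # ~6504

# A slightly "warm" reading, e.g. x=0.322, y=0.338, lands just under 6000 K:
print(round(cct_mccamy(0.322, 0.338)))     # ~5984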


I did all my calibration in the auto iris mode. I found it worked best to calibrate using window patterns. If you calibrated on the windows and then later checked with full fields, the full fields were pretty close. The same was not true in reverse (calibrating to full fields resulted in numbers that were widely off when measuring against windows).


One thing I did not try, but am interested in, is calibrating on the windows, then tweaking on the full fields, then checking the windows again. Basically the full fields were close enough, and I didn't want to spend hours on this since I know I'll need to recalibrate soon anyway - I only had a few hours on the bulb.


Nonetheless, if anyone knows the best approach to finding the right balance between APLs when calibrating with the auto iris, please post it here, as I am still a bit unclear on the process despite getting pretty good results from my attempts.
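
In case it helps anyone compare their own runs, below is a hypothetical Python sketch of how the window-pattern and full-field runs could be tabulated against each other at each IRE level. The (IRE, x, y, Y) readings are placeholders rather than my measurements - you'd substitute whatever your meter software exports.

Code:

# Hypothetical sketch: compare a grayscale run taken with window patterns
# against one taken with full fields, to see how much the auto iris shifts
# things at each stimulus level. Each entry is (IRE, x, y, Y); the numbers
# below are placeholders, not real measurements.
window = [(20, 0.316, 0.332, 1.9), (50, 0.314, 0.330, 12.0), (100, 0.313, 0.329, 48.0)]
field  = [(20, 0.318, 0.334, 1.7), (50, 0.315, 0.331, 11.4), (100, 0.313, 0.329, 41.5)]

def cct_mccamy(x, y):
    # Same McCamy approximation as in the sketch above.
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

for (ire, wx, wy, wY), (_, fx, fy, fY) in zip(window, field):
    d_cct = cct_mccamy(fx, fy) - cct_mccamy(wx, wy)
    d_lum = 100.0 * (fY - wY) / wY
    print(f"{ire:3d} IRE: CCT shift {d_cct:+6.0f} K, luminance shift {d_lum:+5.1f}%")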


If anyone is interested in my calibration #s from the Ruby with just a couple hours on it, let me know and I'll post them here. Assuming some similarity between units/lamps, these numbers may be a good starting point for those without calibration equipment.


On another note, I was quite surprised that I had to set my Color control to 57 and Tint to 48 to get the Split Color Bars pattern to look correct. Anyone else notice this? I find that the grayscale I set via the Bravo works great on my Comcast STB as well. However, I wonder whether the same color/tint settings apply there, as colors sometimes look oversaturated.


I set my Contrast at 69 based solely on how the 10-step IRE bars looked. At the default of 80 the 100 IRE bar looked kinda "mushed" into the 90 IRE bar. As I stepped it down toward 69, the separation between the 100 and 90 IRE bars became much more defined. I stopped at 69, the point beyond which lowering Contrast further didn't add any more definition between the 100 and 90 IRE bars. Was this the result of brightness compression?
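
If you'd rather quantify that 90-vs-100 IRE "mush" than eyeball it, here's a hypothetical sketch of the check I have in mind: each step of the 10-step pattern should still gain a meaningful amount of luminance over the one below it. The luminance numbers and the 5% threshold are made-up illustrations, not measurements from my Ruby.

Code:

# Hypothetical sketch: flag possible white crush from 10-step pattern readings.
# Each entry is (IRE level, measured luminance in fL); the values are made-up.
steps = [(70, 21.5), (80, 28.4), (90, 36.1), (100, 37.0)]

for (lo_ire, lo_fl), (hi_ire, hi_fl) in zip(steps, steps[1:]):
    gain = 100.0 * (hi_fl - lo_fl) / lo_fl                # % brighter than the step below
    flag = "  <-- possible crush" if gain < 5.0 else ""   # rough, arbitrary threshold
    print(f"{lo_ire}->{hi_ire} IRE: +{gain:.1f}%{flag}")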


Normally you have to back the contrast down on these projectors because you run out of the limiting color at the default contrast setting. With the Ruby, however, I had plenty of blue (the limiting color with its xenon lamp) to spare at 100 IRE. So it was a shame to back the Contrast down to 69, but seeing the 10-step pattern made it pretty clear that this was a necessity. Perhaps when the bulb ages and light output drops, I may opt to bring the contrast back up to get the increased brightness at the expense of the upper end. So in a way it is good this flexibility is there.


One tough spot for me, as usual, is that the EyeOne will not generate a valid CIE graph. I had this same issue with the Sharp 10K. It shows red right on, but blue and green come out way short of their Rec. 709 coordinates, so I know this is wrong. Therefore I cannot use the RCP to gauge how to dial in proper Rec. 709 coordinates.
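
For reference, these are the Rec. 709 primary and white-point targets the RCP adjustments are ultimately supposed to hit; the little Python sketch below just reports how far a measured point sits from its target in xy. The "measured" values are placeholders for illustration, not my EyeOne readings (which I don't trust here anyway).

Code:

import math

# Rec. 709 / D65 chromaticity targets (CIE 1931 xy).
REC709 = {"red": (0.640, 0.330), "green": (0.300, 0.600),
          "blue": (0.150, 0.060), "white": (0.3127, 0.3290)}

# Placeholder measurements - substitute your own meter readings.
measured = {"red": (0.642, 0.331), "green": (0.295, 0.605),
            "blue": (0.152, 0.065), "white": (0.316, 0.332)}

for name, (tx, ty) in REC709.items():
    mx, my = measured[name]
    err = math.hypot(mx - tx, my - ty)   # straight-line distance in xy space
    print(f"{name:5s}: target ({tx:.4f}, {ty:.4f})  measured ({mx:.4f}, {my:.4f})  xy error {err:.4f}")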


I have my Ruby in Normal color space. Can someone please recommend any tweaks/settings to the RCP to dial in the colors a bit more accurately, as Greg mentioned doing in his review?


Please chime in with your questions, tips and tricks related to calibrating your Ruby.
 
#503 ·
I don't think the HDMI input is sharper, but it has different default settings, which results in a different-looking image. So simply swapping the cable from one input to the other will show differences that are related to settings, not to the input itself. A comparison between both inputs would best be done with an HDMI splitter or a Lumagen Radiance (I don't have one yet, still using my old ProHDP), with each input calibrated to the specific source (or the source to the input).


About the firmware & mods: lots of things changed. How the panels are controlled, better on/off CR, better ANSI CR, iris algorithms, no crush, no iris artifacts, signal processing and a few other things. I started to take some measurements when Mark Peterson asked me if I could provide some numbers. That was back before WWDC in June and I never got around to finishing them. I still plan to finish, but I'm not sure when, as I have a ton of other stuff to work on and already have a few other potential projectors in the pipeline for tweaking.
 
#504 ·
I can tweak HDMI to look nearly identical to DVI, but only using my Crystalio II. I will admit I have not done 1080p/23.98 into my VP and then 1080p/47.95 into the DVI on the Ruby for a while. I will mess around with it this weekend and do some comparisons. Maybe my new Pioneer 95FD will get here...
 
#505 ·

Quote:
Originally Posted by Stephan /forum/post/11795304


I don't think the HDMI input is sharper, but it has different default settings, which results in a different-looking image. So simply swapping the cable from one input to the other will show differences that are related to settings, not to the input itself. A comparison between both inputs would best be done with an HDMI splitter or a Lumagen Radiance (I don't have one yet, still using my old ProHDP), with each input calibrated to the specific source (or the source to the input).


About the firmware & mods: lots of things changed. How the panels are controlled, better on/off CR, better ANSI CR, iris algorithms, no crush, no iris artifacts, signal processing and a few other things. I started to take some measurements when Mark Peterson asked me if I could provide some numbers. That was back before WWDC in June and I never got around to finishing them. I still plan to finish, but I'm not sure when, as I have a ton of other stuff to work on and already have a few other potential projectors in the pipeline for tweaking.

Stephan,


Thanks for the info. Please don't tell me that convergence is somewhere in the mix of tweaks under "how the panels are controlled".


Sounds cool. Is this going to be something you're going to be selling or offering up as a service in the future?


Andy
 
#508 ·

Quote:
Originally Posted by AndyN /forum/post/11796510


Stephan,


Thanks for the info. Please don't tell me that convergence is somewhere in the mix of tweaks under "how the panels are controlled".


Sounds cool. Is this going to be something you're going to be selling or offering up as a service in the future?


Andy,


Convergence is not in there. But convergence can be adjusted physically in the pj. Of course it's a real pain, so I wouldn't recommend trying it (besides, you lose the warranty and there's a high risk of damaging things).


I'm not planning to offer this as a service or sell kits for doing it. A kit is pretty much impossible, as there are other things involved that differ from projector to projector. That would only leave the service, which is not feasible at all.

It would probably take 1 to 1 1/2 weeks to apply all the changes to a single projector, so maybe 3 to 4 projectors per month. You do the math... there's no money to be made with it and it would leave no time for anything else. Prices would be so high that there would be no market.


Now, the big question is probably why I did it in the first place... well, first of all, I'm nuts.


I've always modded things if there was something to gain. Back in the old CRT days, I enhanced the bandwidth of video circuits. The Qualia 004 was the first digital projector I could live with after many years of 1080p with a 9" CRT.


I mainly work in (medical) image processing, software and hardware, so besides doing this for myself I've used it in some projects where cost is almost no object (6 to 7 digit $ range).


I have done these mods, so others can as well. It takes a little knowledge of how these things work, so any EE with some time and an interest in projectors should be able to do it. But the Ruby is a dated projector - the next generation has just started shipping or will pretty soon - so I don't know how much time people are willing to put into it anymore.


And besides that, I think there will be a lot of competition from the DLP camp.
 
#509 ·
I sent my Ruby in for service due to the lamp-not-firing-up issue. It went in on Monday, September 24th via UPS 3-day air, and I got it back yesterday, October 4th. Amazing turnaround.

While it was at Sony I talked to Juan, and he explained that they had installed the hard start mod kit, replaced the lamp, and replaced the optical block. The first two items I was expecting, but the optical block was news. I was always about a half pixel off and suspect shipping knocked it further out. Long story short, all three items were taken care of at no charge. It was sent back next-day air at their expense.

I did ask about 24p capabilities, and he really knew nothing of this. I didn't expect much here. If there were a firmware or hardware upgrade path, we would have heard about it.


Sony gets an A for their incredible service and timely turnaround. This is the best experience I've ever had in getting service for anything in consumer electronics. My theater was offline for 10 days, 5 of which were shipping and the weekend.


I contrast this with the F that I gave JVC for their service of my HD2K two years ago. They had the pj for two months and then sent it back not working right. I had to hound them for answers. I don't know if things are the same today; I hope not. I was offline for nearly 3 months by the time that problem was fixed.
 
#512 ·

Quote:
Originally Posted by joerod /forum/post/11819734


That's good news. Well, except the 1080p/24 part.
How is the picture now? Better?

The picture is better than when I first got it. Brand new lamp and perfect convergence.

Quote:
Originally Posted by drhankz /forum/post/11820080


REMEMBER - I promised you that kind of experience

You spoke the truth!


I ran the output from my Radiance XD into the DVI input this morning to compare DVI & HDMI. I haven't spent much time with it, but I remembered why I have used HDMI almost exclusively. I get random picture noise on DVI. Does anyone else experience this? It looks perfect and then there'll be a bit of static, maybe 4 or 5 pixels high and a foot wide. Kinda like antenna interference.
 
#513 ·

Quote:
Originally Posted by kraigk /forum/post/11820588


I ran the output from my Radiance XD into the DVI input this morning to compare DVI & HDMI. I haven't spent much time with it, but I remembered why I have used HDMI almost exclusively. I get random picture noise on DVI. Does anyone else experience this? It looks perfect and then there'll be a bit of static, maybe 4 or 5 pixels high and a foot wide. Kinda like antenna interference.

I have NOT experienced any DVI problems as you describe.


However, when I output 1080p/48 from either of my video processors to the Ruby, the picture is MUCH SOFTER - CRT LIKE. If you look at a detailed scene - like in Phantom Of The Opera on Blu-ray, the chapter where the Phantom comes down the Opera hall stairs - the Opera hall has FANTASTIC detail.


I compared that scene back to back with 1080p/60 HDMI versus 1080p/48 DVI.


I can't watch the DVI. The detail gets all SOFT - aka NOT CRISP.
 
#518 ·

Quote:
Originally Posted by Health Nut /forum/post/11754321


By the way, the VW-200 has a 400W xenon bulb, same as the Ruby but with a 2500-hour life expectancy... Joe, maybe you should return that bulb... I bet you could use the new, improved VW-200 bulb... improved in terms of 2500 hours...


Guys, we need to figure out the VW-200 bulb - it could be a great replacement for the Ruby bulb... it might work, last 2500 hours, and age more gracefully... Both are 400W xenons...

???

My Ruby bulb is rated for 2500 hours, and I have already reached 1800.

Where did you get a Ruby with a bulb rated at only 1000 hours?
 
#520 ·
geeji, I just tried the DVI Detective and it would not pass my HD DVD player's signal. Am I doing something wrong? I tried a couple of different DVI adapters and cables. I know it is not HDCP compliant, so maybe that is what is happening...
 
#521 ·

Quote:
Originally Posted by Health Nut /forum/post/11836321


The Ruby bulbs retailed at $1,000.00 and were rated for 1,000 hours, hence people bitched about the $1.00/hr bulb usage... If you have a 2,500-hour bulb, it must be a NEW design, same as the VW-200... Would be nice if that is true...

My Ruby was bought in January 2006, one of the earliest available. The original bulb lasted for 2500 hours until the menu notice came up. It would have lasted longer if I hadn't decided to replace it. I don't understand what this 1000 hour myth is.
 
#524 ·

Quote:
Originally Posted by ericlhyman /forum/post/11857446


Is this really a different bulb or the same one with a more realistic spec for expected lifetime? Is the cost lower due to greater sales volume?

I also have a Feb 2006 Ruby with 1400 hours on the lamp... it still looks great... I've always thought it to be rated for 2500 hours, with some brightness loss after a couple hundred hours.


charlie
 
#525 ·

Quote:
Originally Posted by joerod /forum/post/11837628


geeji, I just tried the DVI Detective and it would not pass my HD DVD player's signal. Am I doing something wrong? I tried a couple of different DVI adapters and cables. I know it is not HDCP compliant, so maybe that is what is happening...

Joerod, I used the DVI Detective successfully with an HTPC outputting NON-HDCP 1920x1080p.

If your HD DVD player's output is 1080p and thus HDCP protected, it is unfortunately normal that the DVI Detective cannot pass it. I apologize if my post, for lack of detail, led you to believe otherwise.

I believe Gefen is working on an update of the DVI Detective that may also be "HDCP passthru", so you may want to check that with their friendly support and exchange your present version for the new one when it becomes available.

Just in case, also check that your DVI Detective is properly programmed and locked to your projector, and test the connection directly from the HD DVD player to the projector with the DVI Detective's power supply connected: contrary to what the manual says, even after locking you may still need the power supply in some cases. Also, if you have an AVR or switch in between, it may be the source of your problem.

I could not understand why I did not get a stable image either, only to discover that my Yamaha RX-V2600, supposedly "HDMI 1.1", was actually incapable of passing through 1080p video.
 
#526 ·

Quote:
Originally Posted by Bill Cruce /forum/post/11843250


My Ruby was bought in January 2006, one of the earliest available. The original bulb lasted for 2500 hours until the menu notice came up. It would have lasted longer if I hadn't decided to replace it. I don't understand what this 1000 hour myth is.

My Ruby was bought in December 2005 and was rated for 2500 hours.

AFAIK, all Rubys since day one have been rated for 2500 hours.
 