I posted this over in the CRT section in response to a query in another thread, so some may have already seen this, but it's more pertinent here...


I keep reading that digital projectors have constant light output regardless of how much of the image area is lit, but that's not what I'm finding when I measure them.


I have measured a difference in light output on a Sharp 9000 DLP between a 100 IRE window and a 100 IRE full field, where the full-field signal produces LESS output. Today I did the same measurements on a JVC G15 (Dukane) that I am calibrating, and thought the results might be of interest...


Projector: Freshly calibrated JVC G15 (Dukane) with 3 hours on the lamp.

Screen: Stewart 1.3 gain microperf 80" wide


Signal: [email protected] (HTPC) 4x3 image

100 IRE Window pattern = 30.9 fL

100 IRE Full screen = 29.2 fL


This works out to roughly 800 ANSI lumens (from the window reading) by the formula:

foot lamberts = (ANSI lumens) * (screen gain) / (screen area in square feet)
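
To sanity-check that conversion, here is a minimal sketch in Python (the function name is mine; the measured values and the 80" x 60" 4x3 image size come from the numbers above):

def ansi_lumens(foot_lamberts, screen_gain, area_sq_ft):
    # Inverts the formula: foot lamberts = ANSI lumens * gain / area
    return foot_lamberts * area_sq_ft / screen_gain

area_4x3 = (80 * 60) / 144  # 80" x 60" image = 33.3 sq ft

print(ansi_lumens(30.9, 1.3, area_4x3))  # window pattern: ~792 lumens
print(ansi_lumens(29.2, 1.3, area_4x3))  # full screen: ~749 lumens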


Signal: 720p 16x9 image (Accupel HDG-2000)

100 IRE Window pattern = 34 fL

100 IRE Full screen = 32 fL


So again, I am measuring lower light output with a full-field white input signal than with a window pattern, and lower output with a 4x3 image than with a 16x9 image.
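
For what it's worth, the relative drop is about the same for both signal formats. A quick check (again just a sketch, using the readings above):

def pct_drop(window, full_screen):
    # Percentage by which the full-screen reading falls below the window reading
    return 100 * (window - full_screen) / window

print(pct_drop(30.9, 29.2))  # 4x3 HTPC signal: ~5.5%
print(pct_drop(34.0, 32.0))  # 720p 16x9 signal: ~5.9%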


I've no idea what causes this.


For these measurements: McMahan Lightspex in "*LUM" mode, fiber-optic probe on a tripod 1 foot from the screen surface, projector positioned to produce an 80"-wide image.


William
 