I'm reading conflicting reports and opinions on what the correct gamma is for viewing movies.
CRTs typically have a gamma of 2.5, which is what we've been used to for many decades. With the advent of digital displays (LCD flat panels, digital projectors, etc.), though, there seems to be a lot of variance. Most computer LCD monitors are 2.2 gamma - just look at the reviews on behardware.com, X-bit Labs, etc., whose measurements produce gamma charts of 2.2. My own DLP projector, the BenQ 8700+, has a default gamma of 2.5. My new LCD monitor, the excellent NEC 20WMGX2, has a gamma of 2.2 and is close to D65 (6500K) right out of the box (but it still needed some calibration, oh well)!
Watching movies on my projector (with a brand new bulb) is a different experience from watching them on my LCD - shadow detail is slightly crushed on the projector, but the image has a more 3D look to it and video artifacts are not very noticeable. The LCD shows loads of shadow detail, but video noise is much more apparent. I don't believe LCD overdrive is an issue here.
My guess is that the LCD's lower gamma is the main culprit here: a 2.2 power curve puts out more light at low signal levels than a 2.5 curve does, so shadows - and the noise in them - look brighter.
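To put a rough number on that hunch, here's a quick Python sketch. The 2.2 and 2.5 values are just the nominal figures above, not measurements of either of my displays, and it assumes both behave as pure power curves:

```python
# Compare normalized output luminance of a dark input signal
# under a pure power-law response at gamma 2.2 vs. gamma 2.5.
for signal in (0.05, 0.10, 0.20):   # fraction of full-scale input
    out_22 = signal ** 2.2
    out_25 = signal ** 2.5
    print(f"input {signal:.2f}: 2.2 gamma -> {out_22:.5f}, "
          f"2.5 gamma -> {out_25:.5f}, ratio {out_22 / out_25:.1f}x")
```

At a 10% signal the 2.2-gamma display puts out roughly twice the light of the 2.5-gamma one, and about 2.5x at 5%. That alone would read as "more shadow detail."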
In discussions here on AVS a while back, I read that most DVDs are mastered on Sony Trinitron studio CRT monitors, which have a 2.5 gamma. So why do most of the new digital projectors ship with 2.2 gamma? And why do many of the esteemed ISF calibrators who post here calibrate to 2.2 gamma if DVDs are mastered for 2.5?
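If the mastered-at-2.5 story is right (and that's the premise I'm asking about, not something I can confirm), the mismatch is easy to quantify: encode for a 2.5 display, decode at 2.2, and the net effect is raising the intended luminance to the power 2.2/2.5 = 0.88. A minimal sketch under that assumption:

```python
# Content graded on a 2.5-gamma monitor but shown on a 2.2-gamma display:
# the net transfer is intended_luminance ** (2.2 / 2.5), i.e. ** 0.88,
# which lifts shadows relative to what the colorist actually saw.
for intended in (0.001, 0.01, 0.1):
    encoded = intended ** (1 / 2.5)   # signal level stored on the disc
    shown = encoded ** 2.2            # what the 2.2-gamma display produces
    print(f"intended {intended:.3f} -> shown {shown:.4f}")
```

Deep shadows intended at 0.1% luminance come out at about 0.2% - more than double what the colorist saw on the mastering monitor.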
This has me concerned - when people report "great shadow detail" from a trendy projector (last year it was the Pearl, this year it's the JVC RS-1), I wonder if they're merely seeing the effect of 2.2 gamma. That could fool them into thinking the projector is superior, when the display they'd been watching before simply had a 2.5 gamma!
Of course, I have also read here that some good projectors have a linear output response at the lowest IRE levels and only follow the standard gamma curve above that - this could also explain the enhanced shadow detail on some displays.
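For a sense of what that linear-toe behavior might look like, here's a made-up response function - the 5% breakpoint and the whole `toe_eotf` shape are my own invention for illustration, not any manufacturer's spec:

```python
def toe_eotf(v, gamma=2.2, toe_end=0.05):
    """Hypothetical display response: linear below toe_end, power law above.
    The linear slope is chosen so the two pieces meet at the breakpoint."""
    if v < toe_end:
        return v * (toe_end ** gamma) / toe_end   # straight line into the join
    return v ** gamma

# Near black the toe is far brighter than the pure power curve:
print(toe_eotf(0.01), 0.01 ** 2.2)   # ~2.7e-4 vs ~4.0e-5, roughly a 7x lift
```

A lift that size at 1% signal would absolutely be visible as extra shadow detail, regardless of what the gamma reads at mid-scale.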
Anyway, I hope you guys can clear this up for me - and maybe when people gush over a new projector, they could also tell us what gamma it's set to.
EDIT: I was wrong about the dE of my NEC being 1.0 before calibration. That would be AFTER calibration. My bad!