Aw crap, you have me there. Well, there MUST be some way to correlate that. Right? Like, for instance, if there were a theoretically perfect projector, perfectly set up, with perfect blacks, how many lumens would it take to put out the brightest whites? Well, I think *someone* must understand what I'm trying to say. Eh?
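To put some numbers on what I mean (these figures are pulled out of thin air, just for illustration), a quick sketch of the on/off contrast math as I understand it:

```python
# on/off contrast ratio is just white luminance divided by black luminance
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    return white_nits / black_nits

print(contrast_ratio(500, 0.5))   # 1000:1  -- ballpark for a DLP of this era
print(contrast_ratio(500, 0.05))  # 10000:1 -- same brightness, better blacks

# With a truly perfect black (0 nits), the ratio blows up to infinity no
# matter how dim the whites are -- so brightness and contrast decouple,
# which might be why nobody can give me a straight lumens number for it.
```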
Hence dither, from what I gather. The TI spokesman told me that if they didn't dither the crap out of the DLP chips, they would look absolutely terrible. Not that I like the look of dither much. Supposedly they are now going to 10-bit DLPs, but I don't know how that will help unless they can use interpolation to fake more bit depth. Of course my EDTV looks like crap on a lot of satellite channels, with faces looking like they've been dropped to 4-6 bits. I suppose, though, that if a processor can sense pixels that differ by one shade of gray, or sense a row of pixels that are all the same, it could be programmed to create a gradient of values between them. I would think that would take some slick processing, though. So I'm guessing the actual answer is that there's plenty of brightness to fulfill HDTV and that only the blacks need to be improved. On DLPs, that is.
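For anyone curious what "dithering the crap out of it" actually does, here's a little toy I threw together (not TI's actual algorithm, just the textbook idea): add sub-step noise before quantizing, and hard banding turns into fine grain that the eye averages out.

```python
import numpy as np

def quantize(img, bits):
    """Quantize a [0, 1] float image to the given bit depth."""
    levels = 2 ** bits - 1
    return np.round(img * levels) / levels

def dither_quantize(img, bits, seed=0):
    """Add +/- half-a-step noise before quantizing, so the quantization
    error gets spread around as grain instead of piling up as bands."""
    levels = 2 ** bits - 1
    noise = np.random.default_rng(seed).uniform(-0.5, 0.5, img.shape) / levels
    return np.round(np.clip(img + noise, 0, 1) * levels) / levels

# A smooth horizontal ramp -- think of a sky, or a face in shadow.
ramp = np.tile(np.linspace(0, 1, 1920), (4, 1))

banded   = quantize(ramp, 6)         # hard steps: visible contouring
dithered = dither_quantize(ramp, 6)  # same 6 bits, steps broken up by noise

# Long runs of identical neighbors are what you see as bands.
flat_pairs = lambda row: int(np.sum(row[1:] == row[:-1]))
print("flat neighbor pairs, banded:  ", flat_pairs(banded[0]))    # ~1856
print("flat neighbor pairs, dithered:", flat_pairs(dithered[0]))  # far fewer
```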