1 - 16 of 16 Posts

Registered · 650 Posts · Discussion Starter · #1
OK--here is the definitive word from Optoma on the HD 70 color processing:


"HD70 utilizes an 8 bit color processing system for digital inputs and a 10 bit color processing system for analog inputs.


Please note that this is the exact same methodology that ALL 720p projectors priced less than $3000 utilize, regardless of manufacturer."


I assume that the Mits HD1000U is the same as this???


Brian
 

Registered · 4,662 Posts
I dunno, I've seen 10 and 12 bit as well.


Now I'm not certain of their EXACT usage of these specs, but the Z5 even says 12 bit, and that can be had for FAR less than 3K.
 

Registered · 857 Posts

Quote:
Originally Posted by briansxx


OK--here is the definitive word from Optoma on the HD 70 color processing:


"HD70 utilizes an 8 bit color processing system for digital inputs and a 10 bit color processing system for analog inputs.


Please note that this is the exact same methodology that ALL 720p projectors priced less than $3000 utilize, regardless of manufacturer."


I assume that the Mits HD1000U is the same as this???


Brian

This very information has been available and beaten to death in other threads. Do a search if you want all the information you ever wanted to read about this issue.
 

Registered · 650 Posts · Discussion Starter · #4

Quote:
Originally Posted by hmcewin


This very information has been available and beaten to death in other threads. Do a search if you want all the information you ever wanted to read about this issue.

I've seen discussion on this, but most of it seems to have been hearsay of one sort or another. I don't recall seeing anything "official" from Optoma regarding this matter. I do seem to keep seeing, again and again, the "Mits is 10-bit color and the HD 70 is 8-bit" argument, so I was just hoping to clarify that debate. Optoma also confused the matter by saying that their European models were 10-bit and the US model 8-bit. Later in the email I quoted from above, they refute this and say: "The HD70 sold in the US is from the exact same factory line as the HD70 sold in the UK."


Best,


Brian
 

Registered · 1,367 Posts

Quote:
Originally Posted by briansxx


I've seen discussion on this, but most of it seems to have been hearsay of one sort or another. I don't recall seeing anything "official" from Optoma regarding this matter. I do seem to keep seeing, again and again, the "Mits is 10-bit color and the HD 70 is 8-bit" argument, so I was just hoping to clarify that debate. Optoma also confused the matter by saying that their European models were 10-bit and the US model 8-bit. Later in the email I quoted from above, they refute this and say: "The HD70 sold in the US is from the exact same factory line as the HD70 sold in the UK."


Best,


Brian

Then they must have totally different gamma curves, because earlier in one of these threads someone posted calibrated settings for the European HD70 and they were way off from what people here were using and from the charts being shown here, at least according to Kras I believe...

I agree the whole 8-bit thing was stupid and beaten to death (I also read months ago here exactly what you said about the digital inputs being limited to 8 bit on all the lower-end models; I believe it was just a marketing ploy by Optoma to keep the HD72s selling as well as the 70s that kind of backfired on them), but I am also interested in the facts so I can set the HD70 haters straight every time they mention this while trying to steer someone away from the HD70 toward the virtually identically performing HD1000U. I believe the facts need to be set straight here to lift the bad rap this projector seems to have acquired because it was the first of its kind: it ruffled a lot of feathers and hurt a lot of biased people who either had just bought their projectors too early and had to justify it, or who simply could not comprehend that this thing was under a grand and felt threatened by it. It seems that by the time the HD1000U hit the streets these people had already given up complaining, yet you still hear their complaints being referenced even though the HD1000U has most of the same issues (or hardware limitations).
 

Registered · 1,367 Posts

Quote:
Originally Posted by gwlaw99


From what I understand, if you leave BrilliantColor on 0-1, the HD70's 8-bit limitation does not affect the picture much, if at all.

That's because the effects you speak of really have nothing to do with the 8-bit thing but with BC itself: BC at 0 or 1 results in a dim, dull, low-contrast picture. When the bulb is brand new, BC at 1 is tolerable, but when the bulb breaks in you find yourself turning it up to 3 or even 4 to compensate, because you can now do this without the whites getting totally blown out of proportion. It is all because of the bulb brightness and the clear segment. At least that's the way I understand it.
 

Registered · 321 Posts
^ So true. I've gone from BC1 to either BC3 or BC4 as the lamp dimmed. I also have the brightness 5 notches higher than I started with.


I don't understand people saying the PJ would have better blacks when the lamp dimmed. If you don't increase the brightness, you lose shadow detail. So my blacks are the same as when I started. Also, to get the "plasma" look so many spoke of, I now need to use the high lamp setting.
 

Registered · 11,213 Posts
auggiedoggy


For tweakers, it is important to know the technical design of the PJ. For example, the SP4805's contrast control was 10b; on the SP7205 it was 8b. Since the SP7205 did not have a video DVI preset, it was important to know that you should use RGB gain/bias to adjust for video DVI instead of brightness/contrast like on the SP4805.


BC at various levels increases the brightness of color and the brightness of white by using the spoke time and clear segments. The gamma tracks but gets higher with the clear segment, and the bright colors will crush and posterize if you go too far. But that really is not a bit issue, since it is color-wheel related. It was the original reviewer assuming it was a bits issue that started all the confusion.


It still does not answer the questions, though: is the RGB gain/bias 8b or 10b? That would determine whether you get grainy/banded results if you try to calibrate. Are the gamma tables 8b, 10b, or 12b? Or is this a reference to brightness/contrast/color/tint being 8b or 10b depending on the input? If HDMI color adjustment is 8b but component is 10b, maybe you get a better picture, when calibration is required, by using component instead.


The oblique reference from Optoma is really just saying it's the same as every other PJ that uses the TI chipsets for video/color processing. It should be easy to track down those details and ignore marketing - but not every PJ uses the TI chipset, especially if it is an LCD!


This confusion is much like the DVD players that advertise 14b DACs - then you find out the video controls are 8b and centered at the wrong values - requiring adjustment but causing banding/dithering/sparkling the moment you try to do so.


In the higher-end forum there are people buying scalers to work around PJ calibration issues - but they need to be aware of the same thing: knowing which controls and ports are limited to 8b. In fact, I had a PM that someone had a scaler with the HD70 and was going to try to calibrate that way!
 

Registered · 101 Posts

Quote:
Originally Posted by briansxx


OK--here is the definitive word from Optoma on the HD 70 color processing:


"HD70 utilizes an 8 bit color processing system for digital inputs and a 10 bit color processing system for analog inputs.


Please note that this is the exact same methodology that ALL 720p projectors priced less than $3000 utilize, regardless of manufacturer."


I assume that the Mits HD1000U is the same as this???


Brian

How does Optoma know exactly what all other manufacturers use? They are hardly an unbiased source. I think the reason this subject goes on and on is that nobody really knows, and there isn't a picture-quality test that clearly shows it.


I suspect that in the end 8 bit vs 10 bit is not a big deal. If you can't see color banding on the HD70, then does it really matter if it is 8 or 10 bit?


That said, I have an HD1000. It came down to price and a coworker who already had an HD1000.


-Mike
 

Registered · 1,367 Posts

Quote:
Originally Posted by mshust


How does Optoma know exactly what all other manufacturers use? They are hardly an unbiased source. I think the reason this subject goes on and on is that nobody really knows, and there isn't a picture-quality test that clearly shows it.


I suspect that in the end 8 bit vs 10 bit is not a big deal. If you can't see color banding on the HD70, then does it really matter if it is 8 or 10 bit?


That said, I have an HD1000. It came down to price and a coworker who already had an HD1000.


-Mike

They say that because of the hardware included in the Texas Instruments one-chip DLP solutions and its limitations. I do think maybe Kras is right here and Optoma is giving us only half the story; I think there is more than one place in the video processing chain where the signal can be limited to 8 bit. I honestly do not have the technical knowledge to understand all of it, just the basics. Maybe the processing bandwidth is the same as the HD1000U's, 8 bit for digital and 10 bit for analog, or maybe they are limited somewhere else to 8 bit before the signal even gets that far. I have seen people ask for clarification on this before, and each time the same limited explanation is given. I wish someone with some real hardware knowledge would chime in, clear this up, and put this issue to bed.


BTW, they would be a pretty unbiased source, since they also manufacture the HD1000U as well as many Dell, HP, Mits, and even Epson projectors, I believe. (Optoma's parent company is Coretronics.)
 

Registered · 11,213 Posts
All you need to show it is DVE with the greyscale ramp, or AVIA PRO with greyscale and RGB ramps. If you have a PC you can whip up a gradient pattern easily, or buy DisplayMate. If a control is 8b, the pattern will band when you adjust it, since the data itself is 8b. I do this all the time on in-home calibrations to figure out whether I should adjust the source or the display.
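For anyone without DisplayMate, here is a minimal sketch of whipping up that gradient on a PC (the 1280x720 size and the "ramp.pgm" file name are just assumptions): it writes a full-range 0-255 horizontal grayscale ramp as a binary PGM file, which most image viewers can open full-screen.

```python
# Minimal sketch: generate a full-range 0-255 grayscale ramp and save it
# as a binary PGM ("P5") file. Size and file name are arbitrary choices.

WIDTH, HEIGHT = 1280, 720

def make_ramp(width: int, height: int) -> bytes:
    # Gray level steps linearly from 0 at the left edge to 255 at the right.
    row = bytes(min(255, x * 256 // width) for x in range(width))
    return row * height

def write_pgm(path: str, width: int = WIDTH, height: int = HEIGHT) -> None:
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n255\n" % (width, height))  # PGM P5 header
        f.write(make_ramp(width, height))

write_pgm("ramp.pgm")
```

Display it full-screen over whichever input you are testing; if adjusting a control makes visible steps appear in the ramp, that control is likely operating at (or rounding back to) 8 bits.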


AVIA's greyscale ramp is dithered and not full-range 0-255 - but use it if you've got it.


A Coretronics engineer would be the unbiased source - of course Optoma marketing/engineers are going to be biased; they don't want information out that causes someone to dither on a buying decision, even if it benefits the tweaker who already bought. Manufacturing may not understand the designs that well, so Coretronics might not be of help other than for finding out what parts are used. From there you snoop TI to get the datasheets and white papers.
 

Registered · 5,344 Posts
Just as an added clarification, there are fundamentally two places where the digital word width can affect you here. First and foremost is the digital input connection: both DVI and HDMI use 24-bit RGB (or component) color, which is 8 bits for each of R, G, and B (a simplification). Thus, even on the "10 bit" HD1000U, the incoming color resolution from the source is 8 bit. Second is the PJ's internal video processing engine. It's responsible for color matrix conversion, screen resolution scaling, user video adjustments, etc. Here is where the reported and assumed difference lies. Video processing in 8 bits often involves rounding errors in calculations, which result in color banding; doing the calculations in 10 bits and then truncating the bottom 2 bits for display at 8 bits means the rounding errors are discarded.
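A deliberately extreme toy illustration of that rounding-error point (assumed numbers, not any projector's real pipeline): halve the signal in one adjustment stage, store the intermediate, then double it in a later stage. An 8-bit intermediate throws away the low bit for good; a 10-bit intermediate keeps it.

```python
# Toy example of intermediate bit depth (not a real projector pipeline):
# apply two adjustments in sequence and compare how many distinct
# output levels survive out of the 256 input codes.

def halve_then_double_8bit(v: int) -> int:
    a = v // 2            # intermediate stored back into 8 bits: low bit lost
    return a * 2          # only even levels survive -> visible banding

def halve_then_double_10bit(v: int) -> int:
    a = (v * 4) // 2      # promote to a 10-bit intermediate (0..1023) first
    return (a * 2) // 4   # truncate back to 8 bits: original value intact

levels_8 = {halve_then_double_8bit(v) for v in range(256)}
levels_10 = {halve_then_double_10bit(v) for v in range(256)}
print(len(levels_8), len(levels_10))   # prints: 128 256
```

Half the levels vanish in the 8-bit path; on a smooth gradient those missing codes show up as bands.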

 

Registered · 11,213 Posts
Nice try at saying adjustment bit depth is irrelevant... by presuming that the display truncates the bits at display time, when that is not the case. The issue is that when you make adjustments you need more bits than in the source - that is a fundamental of digital math, to avoid banding/dithering the result. Saying you don't need more than 8b because your source has no more bit depth is absolutely wrong and indicates a failure to understand display processing.


Displays usually are not 8b at the panel - with DLP there are no bits, since they are timing-driven PWM, not bit-depth-driven PCM. So you have to step back to the framebuffer that does the PCM-to-PWM conversion. These are a higher bit depth because the display must simulate a non-linear CRT gamma curve, even though the panels are inherently linear. Encoded video presumes that this gamma expansion will happen. You need more than 8b to properly expand video with gamma, or you will get tremendous banding.
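A rough sketch of that gamma point, assuming a plain 2.2 power-law decode (actual display curves vary): expand an 8-bit ramp through gamma into an 8-bit versus a 10-bit framebuffer and count how many distinct levels survive. The 8-bit target merges many near-black codes into the same output level, which is exactly the banding being described.

```python
# Rough sketch: decode an 8-bit video ramp through an assumed 2.2 gamma
# into an 8-bit vs a 10-bit framebuffer and count surviving levels.

GAMMA = 2.2  # assumed decode exponent; real display curves differ

def decode(v: int, out_max: int) -> int:
    # Expand an 8-bit video code (0-255) into a framebuffer whose
    # full-scale value is out_max (255 for 8b, 1023 for 10b).
    return round(out_max * (v / 255) ** GAMMA)

levels_8b = {decode(v, 255) for v in range(256)}
levels_10b = {decode(v, 1023) for v in range(256)}

# The 8-bit framebuffer collapses many dark input codes together;
# the 10-bit one keeps far more of them apart.
print(len(levels_8b), len(levels_10b))
```

The lost levels cluster near black, which is why insufficient gamma-table depth shows up as banding in shadows rather than highlights.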


There are several controls in the display chain: you have video decoding, which is the brightness/contrast/tint/color that converts component to RGB; you have the white balance, which adjusts the RGB mix; and you have the gamma control, which uncompresses the linear video into the expected non-linear gamma curve. Insufficient bit depth in any of these will cause banding/dithering - it is not just one place. It requires an understanding of the video/display chain if you want to be a calibrator.


LCDs often have 12b; DLPs often have 10b. That does not make LCD better - rather, they need more because their panels are not linear like DLP's. It takes more bits to compensate for that! As much as you have to turn down an LCD to get gamma and greyscale in a good range, you effectively have the same bits as a DLP.
 

Registered · 5,344 Posts

Quote:
Nice try at saying adjustment bit depth is irrelevant... by presuming that the display truncates the bits at display time, when that is not the case. The issue is that when you make adjustments you need more bits than in the source - that is a fundamental of digital math, to avoid banding/dithering the result. Saying you don't need more than 8b because your source has no more bit depth is absolutely wrong and indicates a failure to understand display processing.

Nothing could be further from what I was trying to say! I'm on your side... by distilling down to a 24-bit display path, I was trying to show that the 10-bit calculation path was STILL very important... either I didn't express it clearly or you jumped when you saw the least-common-denominator output level.


 