
Premium Member · 7,402 Posts · Discussion Starter #1
I heard that if you have a DVD player with HDMI output and a video display with a DVI input, the signal could be downconverted from 10-bit to 8-bit, and that is less than ideal. Some say rounding errors cause the loss of below-black levels. Others say they use an HDMI DVD player with a DVI display and have no issues.


What I am wondering is how the iScan products fit in.


From the iScan Ultra to the HD/HD+ to the VP30.


1) Which processors use 8-, 10- or 12-bit processing, and what is output on DVI (for the Ultra and HD/HD+) and on HDMI (for the VP30)? I see references to 8-bit scaling in the Ultra through HD+ and 10-bit scaling in the VP30, but what about the rest of the de-interlacing and video processing: how many bits? I see the DACs are 12-bit. Am I comparing apples to oranges when I compare the HDMI output of DVD players to the processing in the iScan units?


2) Also, when I use the VGA output of my iScan Ultra to my Hitachi PJ TX100, what bit conversions are done and what do I end up with?


The picture looks great with VGA. I have compared it with DVI to DVI, but I cannot say I have noticed a difference in casual viewing comparisons.


thanx in advance for any feedback,


:)
 

Registered · 1,901 Posts
Quote:
Originally Posted by cpc
I heard that if you have a DVD player with HDMI output and a video display with a DVI input, the signal could be downconverted from 10-bit to 8-bit, and that is less than ideal.
A DVD player with an HDMI output may or may not produce a 10-bit output. If the output is RGB, then it will only have 8-bit resolution. If it's YCbCr, then it could have 8-bit or 10-bit resolution (although HDMI supports 12-bit 4:2:2 YCbCr, I have yet to see a player which produces this). Unless the DVI input of the display has non-standard support for YCbCr (DVI by definition is RGB, although it is possible to send YCbCr over DVI), the HDMI output of the player will be 8-bit RGB. This conversion will be done in the player.
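

To make that last rounding step concrete, here is a minimal Python sketch of a round-to-nearest 10-bit-to-8-bit requantizer. Round-to-nearest is an assumption for illustration; what any given player actually does internally is unspecified.

Code:
def requantize_10_to_8(code10):
    """Round a 10-bit video code (0-1023) down to 8 bits (0-255).

    Round to nearest: add half of the 4-codes-per-step interval
    before dropping the two extra bits.
    """
    return min((code10 + 2) >> 2, 255)

# Neighbouring 10-bit codes collapse onto the same 8-bit code,
# which is exactly the information lost in the downconversion:
for c in (256, 257, 258, 259):
    print(c, "->", requantize_10_to_8(c))   # 64, 64, 65, 65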


Note that if the display has a standard DVI input, then you can only send it an 8-bit signal regardless of how many bits of resolution you may have in the source. It simply won't accept anything higher than this. This is as 'ideal' as it gets for that particular display, at least for a digital input.

Quote:
Some say rounding errors cause the loss of below-black levels. Others say they use an HDMI DVD player with a DVI display and have no issues.
It is highly unlikely that rounding errors would cause this, as they are typically an order of magnitude below what is necessary to significantly remove below-black information. There is a known issue in some HDMI transmitters and receivers, however, which could cause this. If the HDMI device converts from one format to another (e.g., YCbCr to RGB), below-black and above-white information could be lost.
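

As a sketch of that failure mode, assuming the converter does a standard BT.601 limited-range YCbCr to full-range RGB conversion with a hard clamp (actual chip behavior is vendor-specific):

Code:
def ycbcr601_to_rgb(y, cb, cr):
    """Limited-range BT.601 YCbCr -> full-range RGB, with the
    hard 0-255 clamp found in many converters."""
    def clamp(v):
        return min(255, max(0, round(v)))
    r = 1.164 * (y - 16) + 1.596 * (cr - 128)
    g = 1.164 * (y - 16) - 0.813 * (cr - 128) - 0.391 * (cb - 128)
    b = 1.164 * (y - 16) + 2.018 * (cb - 128)
    return clamp(r), clamp(g), clamp(b)

# A below-black bar (Y=10, as in a PLUGE pattern) and super-white
# (Y=254) both land outside 0-255 RGB and are clamped away, while
# proper black (Y=16) is unaffected:
print(ycbcr601_to_rgb(10, 128, 128))    # (0, 0, 0) -- below black lost
print(ycbcr601_to_rgb(254, 128, 128))   # (255, 255, 255) -- above white lost
print(ycbcr601_to_rgb(16, 128, 128))    # (0, 0, 0) -- black proper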

Quote:
What I am wondering is how the iScan products fit in.


From the iScan Ultra to the HD/HD+ to the VP30.


1) Which processors use 8-, 10- or 12-bit processing, and what is output on DVI (for the Ultra and HD/HD+) and on HDMI (for the VP30)? I see references to 8-bit scaling in the Ultra through HD+ and 10-bit scaling in the VP30, but what about the rest of the de-interlacing and video processing: how many bits? I see the DACs are 12-bit. Am I comparing apples to oranges when I compare the HDMI output of DVD players to the processing in the iScan units?
All of these iScan units use an SiI504 deinterlacer. This is an 8-bit processor which effectively produces an 8-bit 4:2:2 YCbCr output.


The scaling engines in the Ultra, HD and HD+ all take an 8-bit input signal. However, they all maintain more than 8 bits of accuracy internally and will produce a 10-bit output. The scaler in the HD and HD+ is higher quality than that in the Ultra (where it is used only for aspect ratio conversion), and the scaling calculations in those two processors are done in such a way that full numerical accuracy is maintained throughout the scaling path until the final rounding stage, which produces the 10-bit output signal. The VP30 has a full 10-bit scaler, meaning that it accepts a 10-bit input signal as opposed to the 8-bit signal used in the HD and HD+. It is otherwise similar (at least in terms of computational accuracy).
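

Here is a fixed-point Python sketch of that "full accuracy until the final rounding" idea. The tap values and bit widths are illustrative only, not the actual scaler coefficients:

Code:
# Illustrative 4-tap interpolation filter with 12-bit coefficients
# summing to 4096 (unity gain in fixed point) -- made-up values,
# not the real iScan filter.
TAPS = (-410, 2458, 2458, -410)

def filter_to_10bit(pixels8):
    """Filter four 8-bit pixels and round once to a 10-bit result.

    The accumulator holds every 8-bit x 12-bit product at full
    width, so the only precision loss in the whole path is the
    single rounding step at the end.
    """
    acc = sum(p * t for p, t in zip(pixels8, TAPS))
    # The filter gain is 2^12; keeping 2 extra output bits for a
    # 10-bit result means shifting by 12 - 2 = 10, with rounding.
    out = (acc + (1 << 9)) >> 10
    return max(0, min(1023, out))

print(filter_to_10bit([100, 100, 100, 100]))   # 400 = 100 at 10-bit scale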


Other video processing stages in these devices have sufficient accuracy to maintain the resolution of the signal being processed. I.e., if the signal being processed has N-bit resolution, then the processing in the path for that signal is done with at least N-bit accuracy (and may be done at higher resolution and then rounded down at the output, as is the case with the scaling engine).


The Ultra through HD+ have DVI outputs which only produce an 8-bit 4:4:4 RGB signal. (The HD and HD+ have hardware which can produce a YCbCr output, but the software to support this was never implemented.) The internal 10-bit YCbCr signal is converted to RGB and reduced in resolution to 8 bits. The HD and HD+ use a random noise signal to dither the output to 8 bits while I believe the Ultra just rounds (although it's been a while and I don't recall exactly). The VP30's HDMI output is capable of RGB or YCbCr output. Any required color-space or other conversions are done in custom logic to avoid problems with below-black or above-white clipping in the HDMI transmitter. The VP30 can output 8-bit RGB, 8-bit 4:4:4 YCbCr, or 10-bit 4:2:2 YCbCr.
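

A sketch of the difference between plain rounding and the random-noise dithering described above; the dither amplitude and noise source here are assumptions for illustration:

Code:
import random

_rng = random.Random(0)   # any noise source will do

def round_10_to_8(code10):
    """Plain round-to-nearest. The quantization error is
    correlated with the signal, which shows up as banding
    in smooth gradients."""
    return min((code10 + 2) >> 2, 255)

def dither_10_to_8(code10):
    """Add 0-3 ten-bit codes (just under one 8-bit step) of
    random noise before truncating: banding is traded for
    fine noise that averages out to the right level."""
    noisy = min(code10 + _rng.randint(0, 3), 1023)
    return noisy >> 2

# A shallow 10-bit ramp: rounding produces flat steps, dithering
# produces a mix of the two neighbouring 8-bit codes.
ramp = [256 + i // 8 for i in range(32)]
print([round_10_to_8(c) for c in ramp])
print([dither_10_to_8(c) for c in ramp])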


The Ultra through HD+ use 12-bit DACs. The Ultra oversamples the signal in the DAC hardware (see below), while the HD and HD+ use the scaling engine to do this. The VP30 does things a bit differently. It uses four 10-bit instrumentation-quality DACs for the analog output. Three of the DACs are used for the video signal and the fourth is used to generate the embedded sync (if present) so that the full 10-bit range of the DAC is available for the active video signal. The VP30 supports up to 10x oversampling of the analog output, with 1080p being oversampled at 2x (i.e., ~300 MHz).

Quote:
2) Also, when I use the VGA output of my iScan Ultra to my Hitachi PJ TX100, what bit conversions are done and what do I end up with?
I can't tell you exactly what your display will do, as I don't know what internal video processing it has. It will, however, have to digitize the analog input signal and will likely perform other processing. You'll probably have to contact the manufacturer to determine what this processing might be and what resolution the input A/D converters are.


The only conversion done by the iScan Ultra to its internal digital signal will be in the video encoder at the output. The video encoder is fed a 10-bit 4:2:2 YCbCr output from the Ultra's scaling chip (used for aspect ratio conversion). It will then internally oversample the luma by 4x and the chroma by 8x. If the Ultra is configured for an RGB output, the video encoder will perform this conversion. These conversions produce a signal with more than 10-bit resolution, but I don't have access to the details of actual video processing and conversions being done inside the encoder. The oversampled signal is then sent to 12-bit DACs inside the encoder. The resulting analog signal is then sent through an analog reconstruction filter to remove higher-frequency images of the baseband signal.
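

As a toy sketch of what n-times oversampling means (zero-stuff to the higher rate, then low-pass), with a deliberately crude moving-sum filter that is nothing like the encoder's real interpolation filter:

Code:
def oversample(samples, n=4):
    """Insert n-1 zeros after each input sample (raising the
    sample rate n times), then low-pass with a length-n
    moving-sum filter whose DC gain of n restores the level."""
    stuffed = []
    for s in samples:
        stuffed.append(s)
        stuffed.extend([0] * (n - 1))
    return [sum(stuffed[max(0, i - n + 1):i + 1])
            for i in range(len(stuffed))]

print(oversample([10, 20, 30], n=4))
# [10, 10, 10, 10, 20, 20, 20, 20, 30, 30, 30, 30] -- a 4x-rate
# sample-and-hold. A sharper filter would interpolate smoothly;
# either way, the images of the baseband signal move up in
# frequency where the analog reconstruction filter can remove them.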


- Dale Adams
 

Premium Member · 7,402 Posts · Discussion Starter #3
Thank you for your very informative response. I believe my DVD player has 10-bit DACs, and good ones at that. The picture from the component out to the Ultra and VGA to the projector is almost always very good.


I guess I have to do some more research.


What about SDI? Do you know how many bits are involved in that signal?
 

Registered · 2,951 Posts
Quote:
Originally Posted by cpc
What about SDI? Do you know how many bits are involved in that signal?
8-bit 4:2:2 YCbCr.
 

Premium Member · 7,402 Posts · Discussion Starter #5
So all this 10- and 12-bit processing obviously does not introduce more data. What does it do? Is it like the 18- and 20-bit processing of early CD players? Does the 10- and 12-bit processing of 8-bit DVD signals result in some sort of higher accuracy?


So going from HDMI down to DVI is not an issue for the most part? That is the impression I am getting.


thanx


:)
 

Registered · 2,951 Posts
Quote:
Originally Posted by cpc
So all this 10- and 12-bit processing obviously does not introduce more data. What does it do? Is it like the 18- and 20-bit processing of early CD players? Does the 10- and 12-bit processing of 8-bit DVD signals result in some sort of higher accuracy?
Complex mathematical operations introduce rounding errors. The point of the extra bits is to reduce or eliminate the effect of those rounding errors by increasing the precision of the intermediate calculations.


So say you had a number between 1 and 10, and every time you did a set of calculations on it the result could go up or down by 0.4; you can see how you could rapidly end up with a very different number after a few calculations. But if you multiply everything by 10, so you've got a number between 10 and 100, and the rounding effect is still 0.4 each time, and then divide everything by 10 when you've finished all your calculations, the effect of the rounding errors will be much smaller. That's the idea of the extra bits. (Of course, processing is done in base-2 (binary) rather than base-10 (decimal) maths, so you're multiplying things by 2 for each extra bit, not by 10.)
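

The base-10 analogy, rendered directly in Python:

Code:
gains = [1.1, 0.9] * 3      # a chain of six processing stages
x = 7

low = x                     # no spare precision:
for g in gains:             # round after every stage
    low = round(low * g)

high = x * 1000             # three spare decimal digits: same
for g in gains:             # chain, one division at the very end
    high = round(high * g)
high /= 1000

exact = x
for g in gains:
    exact *= g

print(low, high, round(exact, 3))   # 7 vs 6.792 vs 6.792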

Quote:
So going from HDMI down to DVI is not an issue for the most part? That is the impression I am getting.
It's not ideal, as it's better to only oversample and downsample once for optimum accuracy, but unless you have amazing vision you'd be hard-pressed to notice much difference.
 

Premium Member · 7,402 Posts · Discussion Starter #7
Ok, fair enough.


So sending DVI from my iScan Ultra to the HDMI of a TX200 will be ok? The HDMI connection itself and the video processing of the TX200 are 10-bit.


thanx
 

Registered · 2,951 Posts
Quote:
Originally Posted by cpc
Ok, fair enough.


So sending DVI from my iScan Ultra to the HDMI of a TX200 will be ok? The HDMI connection itself and the video processing of the TX200 are 10-bit.


thanx
Sure.
 

Premium Member · 7,402 Posts · Discussion Starter #9
Recommend keeping my dual-link DVI-D cable and using an HDMI to DVI adapter like this:

http://www.pacificcable.com/Picture_...Name=HDMIMDVIF


or should I simply grab an HDMI to DVI cable?


thanx


:)
 

Registered · 2,951 Posts
Quote:
Originally Posted by cpc
Recommend keeping my dual-link DVI-D cable and using an HDMI to DVI adapter like this:

http://www.pacificcable.com/Picture_...Name=HDMIMDVIF


or should I simply grab an HDMI to DVI cable?


thanx


:)
Whatever's easier for you. The result will be the same.


In my experience DVI->HDMI adapters aren't particularly pleasant, as the stupid HDMI connector struggles to hold the weight of both the adapter and the cable.
 

Registered · 10,821 Posts
Those adapters that have the HDMI end inserted into the TV's port and the big DVI port on the back are asking for trouble; they put way too much strain on the HDMI input.


I like the DVI adapter that screws into the DVI port on the TV and has a female HDMI port on its back: very sturdy and solid. Then all you need is an HDMI to HDMI cable.

http://www.monoprice.com/products/pr...t=1#largeimage


Most would agree that an HDMI to DVI cable is best.


-Gary
 

Premium Member · 7,402 Posts · Discussion Starter #12
For now I will try to set up something so that the DVI cable plugging "onto" the DVI>HDMI adapter is not straining the HDMI connection on the back of the TX200. I will fudge around with it for now. I have the DVI cable, so if I can make it work ok, I will use it. I tried the other way around, connecting my old TX100 to the new owner's S77 Panny via the exact same adapters, and it worked no problemo at 720p, no sparkles.


At first I bought a $50 adapter (Monster) and now I bought a $30 adapter. I will compare them and probably return the more expensive one. If I can find a buyer for my DVI cable, perhaps I will get an HDMI to DVI cable instead, or an HDMI cable.


Thanx folks :)
 

Premium Member · 7,402 Posts · Discussion Starter #13
Here is another question.


My Panasonic CV 51 (essentially similar to the RV-31) is set with Blacks to DARKER.

My iScan Ultra is set normal as far as I can tell (brightness contrast etc).

I am using the DVI output of the iScan.


What do I set the HDMI setting to in my Hitachi TX200?


Here are the options:


AUTO - Automatically selects an optimum mode.

Normal - Suitable for DVD signals (16-235).

Enhanced - Suitable for VGA signals (0-255).


Also, in the projector, there is FRAME LOCK (not sure if it works for HDMI).


thanx for any feedback


:)
 

Registered · 2,951 Posts
Quote:
Originally Posted by cpc
What do I set the HDMI setting to in my Hitachi TX200?


Here are the options:


AUTO - Automatically selects an optimum mode.

Normal - Suitable for DVD signals (16-235).

Enhanced - Suitable for VGA signals (0-255).
Use 16-235. That keeps the below-black and above-white parts of the signal.
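

A quick numeric sketch of why, assuming the 0-255 path simply rescales 16-235 to full range and clamps (how the Hitachi actually implements its "Enhanced" mode is not documented here):

Code:
def expand_to_full(code):
    """Expand a limited-range (16-235) code to full range
    (0-255) with a clamp. This has to happen somewhere in a
    0-255 chain, and it throws away the extremes."""
    return min(255, max(0, round((code - 16) * 255 / 219)))

# In a 16-235 chain the codes below 16 and above 235 still
# exist on the wire; after expansion they are gone for good.
for code in (4, 10, 16, 128, 235, 250):
    print(f"limited {code:3d} -> full {expand_to_full(code)}")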

Quote:
Also, in the projector, there is FRAME LOCK (not sure if it works for HDMI).
Sounds like genlocked framerate conversion. I suspect it'll only work with an interlaced signal, but give it a try. If you notice any judder/tearing/stuttering/dropped frames, then turn it off.


The iScan probably has superior framerate conversion anyway.
 

Premium Member · 7,402 Posts · Discussion Starter #17
Ok, I thought the "Enhanced" was full and the "Normal" was truncated.


So what is the difference? I will go with Normal then. Enhanced looked less contrasty anyways.
 

Registered · 2,951 Posts
Quote:
Originally Posted by cpc
Ok, I thought the "Enhanced" was full and the "Normal" was truncated.


So what is the difference? I will go with Normal then. Enhanced looked less contrasty anyways.
Not in the case you listed above.


Generally the terms "normal black" and "enhanced black" are some marketing person's terms for black points of 7.5 IRE and 0 IRE respectively, and in that situation you'd want to pick the setting that gives you a black point of 0 IRE. But in this case, just to confuse you, they've used the same terminology to refer to something completely different, i.e., whether to use studio video levels (as DVD does) or computer video levels (which DVD does not use) for the HDMI output.
 

Premium Member · 7,402 Posts · Discussion Starter #19
I thought so. I used normal "DVD" and it looks better.


thanx


:)
 

Premium Member · 7,402 Posts · Discussion Starter #20
As far as all of this is concerned:

Quote:
All of these iScan units use an SiI504 deinterlacer. This is an 8-bit processor which effectively produces an 8-bit 4:2:2 YCbCr output. [...] The VP30 supports up to 10x oversampling of the analog output, with 1080p being oversampled at 2x (i.e., ~300 MHz).
and especially this:

Quote:
The VP30 can output 8-bit RGB, 8-bit 4:4:4 YCbCr, or 10-bit 4:2:2 YCbCr.
What benefits are there between outputs? If my display could handle any of the above through HDMI, which would I want to use from a VP30, for instance?


In my case, the Hitachi TX200 has these choices:


AUTO / RGB / SMPTE240 / REC709 / REC601


Perhaps my projector only accepts RGB over HDMI?

It has component inputs too of course.
 