Originally Posted by drlopezmdfacc
I too have received my D2/P5 and am very happy with the results. I'm no expert, but the picture on my Pioneer PRO-730HD is significantly improved whether I am using the 8300 SA DVR or my Toshiba HD-DVD player. I have HDMI from the cable box and the DVD player to the D2 and an HDMI-to-DVI connection to the TV. As predicted by Bob, the picture is significantly improved using the RGB video output. RGB extended (no idea what that is!?) does not look better than regular RGB.
I have some questions: there is a soft hum coming from the P5 -- is that supposed to be there? I hear no speaker hum. Do I have a ground loop problem? Is this important? Will it "hurt" the amp? I cannot hear the hum from my listening position, and the sound is otherwise spectacular coming through my M&K 5.2 set-up. Also, I'm still not able to get an analog sound signal from a non-HD channel coming through the cable box; I thought I went through the set-up as advised in the manual. Any thoughts? The screen says it is receiving "Analog-DSP" but I hear nothing. Thanks and Happy New Year!
I suspect your 8300 is sending digital audio out even for SDTV channels -- even if it is receiving those over analog-style cable. It will only be digital stereo, of course, not multi-channel, but it will still be digital. Try that first -- i.e., use the same digital audio input you use for your HDTV channels. If that doesn't work for you, then there may be a setting you need to make inside the 8300 itself. We have several 8300 users here who can probably provide more detail.
Typically what you want for home theater DVI connections is "RGB". In addition, you may need to make a setting to the DVI connection in your TV itself. Look for something in the TV that adjusts its DVI input to expect stuff from a "video" or "set top box" source as opposed to a "computer" or "PC" source. Any DVI connection which is also HDCP (copy protection) compliant should either offer such a setting or should be explicitly spec'ed as being intended for home theater use and NOT for use connecting a computer to the TV. Check your owner's manual.
This all arises because DVI was originally a standard for connecting computers to monitors digitally, and the original use of DVI connectors on TVs was so that folks could use them as digital video computer monitors.
Extended RGB, you see, is the encoding typically used by computer graphics cards.
Technically what's going on is that Extended RGB encodes "Black" as digital 0, and "Reference White" as digital 255. Since the digital signal is limited to the range of 0 to 255 there is no space below Black to encode "Blacker than Black" data and no space above Reference White to encode "Peak White" data.
Blacker than Black data is *NOT* intended to be seen but is included in the video signal to help video processing work better. It keeps the image data from cutting off abruptly at Black -- which can produce artifacts when various types of math are performed on the video. Peak White data *IS* intended to be seen, but not all TVs are capable of reproducing it, so the content producers make sure the image looks good even if the TV cuts off at the slightly lower Reference White level. Glints, sparks, and detail in clouds are examples of Peak White data.
"RGB" on the other hand -- or more properly "Studio RGB" -- encodes Black as digital 16, and Reference White as digital 235.
The range from 1 to 15 in Studio RGB is used for Blacker than Black data, and the range from 236 to 254 is used for Peak White data. 0 and 255 are reserved values.
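To make the two encodings concrete, here is a small sketch (mine, not from the Anthem documentation) of the level math that maps between Studio RGB (black = 16, reference white = 235) and Extended RGB (black = 0, reference white = 255):

```python
def studio_to_extended(level):
    """Map a Studio RGB code value (black=16, white=235) onto the
    Extended RGB scale (black=0, white=255). Anything outside 16-235
    (Blacker than Black, Peak White) has nowhere to go and is clipped."""
    scaled = (level - 16) * 255.0 / (235 - 16)
    return max(0, min(255, round(scaled)))

def extended_to_studio(level):
    """Map an Extended RGB code value into the 16-235 Studio range."""
    return round(16 + level * (235 - 16) / 255.0)

print(studio_to_extended(16))   # Studio black  -> 0
print(studio_to_extended(235))  # Studio white  -> 255
print(studio_to_extended(240))  # Peak White data gets clipped to 255
print(extended_to_studio(0))    # Extended black -> 16
print(extended_to_studio(255))  # Extended white -> 235
```

Note how the Peak White sample (240) clips: that is exactly why the extra headroom and footroom of Studio RGB disappear once everything is stretched to fill 0-255.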
On the face of it Extended RGB would seem "better" because it provides more steps for displaying the visible gray scale. But in fact Studio RGB is what you want to use because the extra data at either end helps video processing algorithms produce a better result and because the people who are producing the content have designed it to be used that way. So you should only use Extended RGB if your TV input gives you no choice in the matter.
As you can imagine, if you send Extended RGB when the TV is expecting RGB, or the other way around, the first thing you will notice is that black levels are wrong -- either too dark, with loss of detail near black, or too gray and washed out. Some TVs provide enough control of black level (the Brightness control) that you can compensate and make things SEEM correct. But it is still not correct because, in addition to the gray scale differences, the math used to convert YCbCr to RGB *COLOR* information is also a little bit different, and that is much harder to pick up by eye. The bottom line is that you want the Anthem and your TV to agree on which format of RGB is being used, and, given a choice, that should just be "RGB".
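You can see the black-level symptoms in a quick sketch (my own illustration, using the standard 16/235 level math) of what a TV does when it guesses the wrong encoding:

```python
def display_as_extended(level):
    """TV expecting Extended RGB: shows the code value as-is (0 = black)."""
    return level

def display_as_studio(level):
    """TV expecting Studio RGB: re-expands 16-235 to 0-255 and clips."""
    scaled = (level - 16) * 255.0 / (235 - 16)
    return max(0, min(255, round(scaled)))

# Studio RGB black (code 16) on a TV expecting Extended RGB:
print(display_as_extended(16))  # shown as level 16 -- gray, washed out

# Extended RGB shadow detail (codes 0-15) on a TV expecting Studio RGB:
print([display_as_studio(v) for v in range(16)])  # all crushed to 0
```

The first case is the "too gray" picture, the second is the "too dark, detail lost near black" picture. The YCbCr-to-RGB color matrix difference mentioned above is a separate, subtler effect and is not modeled here.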
The same issues exist on the input side if your source device insists on sending RGB. Regular "RGB" is the likely default for home theater devices, but if your source is, say, a typical computer graphics card, then it is likely sending Extended RGB. The Anthem can be set to expect either type of input.