For those keeping score, I've really got only two concerns about my D2. The one I've posted most frequently on is the inability of the D2 to maintain a robust HDMI/HDCP connection with my Comcast/Motorola HD/DVR. This Comcast box is known to be, umm, difficult, so my refusal to just give up and switch to Component cabling is pure stubbornness.
I've developed a lot of theories about heat related issues, and/or combinations of D2 setup stuff leading to these problems, but in reality there's nothing solid yet that really explains this. Sometimes the Comcast/D2 combo works like a champ for days at a time. Other times it fails and can be a bear to get working again. On my most recent failure (as it turns out, *NOT* fixable by disabling Triggers, by the way) I just decided to leave the Comcast and the D2 sitting there with the ghastly green screen of failure displayed. After about 10-15 minutes the problem FIXED ITSELF with no further intervention from me. I've now got about 24 hours of flawless operation since then, including several power cycles and such, which might well have produced a new failure. Go figure...
However, my second problem was really making me have second thoughts about the D2. My display is a couple years old and has a DVI/HDCP input. That means it takes 8 bit RGB as input. It's a Fujitsu plasma and does a good job of dithering low level grays to counter the effects of that limited input.
Unfortunately, when connected to the D2, I had severe noise problems in the low level grays. The D2 does 10 bit internal processing but has to dither down to 8 bit when sending out RGB -- see Kris's review for example. My theory is that the noise I was seeing was a result of a conflict between the D2's dithering and what was going on in my plasma -- kind of a beat frequency between those two algorithms.
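For anyone curious what "dithering down to 8 bit" actually buys you, here's a minimal sketch. This is my own illustration, *not* the D2's actual algorithm (Anthem doesn't publish it): it just compares plain truncation of a 10-bit gray ramp against dithered quantization, which trades banding for fine noise. Two devices each doing this kind of thing independently is where my "beat frequency" theory comes from.

```python
import numpy as np

# Hypothetical illustration (not the D2's actual algorithm): quantizing a
# 10-bit grayscale ramp down to 8 bits, with and without dithering.
ramp10 = np.arange(0, 1024, dtype=np.float64)   # 10-bit code values 0..1023

# Plain truncation: every group of 4 adjacent 10-bit codes collapses to
# one 8-bit code, producing visible banding in smooth gradients.
truncated = np.floor(ramp10 / 4).astype(np.uint8)

# Dithered quantization: add sub-LSB noise before rounding so the average
# output level tracks the 10-bit input between 8-bit steps.
rng = np.random.default_rng(0)
noise = rng.uniform(-0.5, 0.5, size=ramp10.shape)
dithered = np.clip(np.round(ramp10 / 4 + noise), 0, 255).astype(np.uint8)

# Truncation throws the two low-order bits away outright; dithering
# preserves them on average across neighboring pixels, as visible noise.
print(truncated[:8])   # banded: [0 0 0 0 1 1 1 1]
print(dithered[:8])    # noisy mix of 0s and 1s
```

The display then applies its *own* dithering on top of whatever pattern arrives, and two overlapping noise patterns can interact badly in exactly the low-gray region where both are working hardest.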
Well today I did another calibration pass, specifically focused on eliminating the noise in the low level grays. I use the gray scale ramps, IRE windowed fields, and steps charts (both normal and "monotonicity") in Avia Pro to see this stuff. The noise, when present, is not at all subtle. It's easy to see the noise vary, or move between IRE levels, when making Brightness/Contrast level changes.
And I found a solution.
Actually I'd say I'm about 90% there right now. Certainly close enough for delightful viewing, and maybe as good as it gets with my display.
The trick was to use a control in my plasma that raises the overall luminance level independent of the Blacks/Whites settings. Raising this one step apparently changed the dithering going on inside the display enough that it was no longer beating against the dithering produced by the D2. What's left is a "normal" level of dithering noise for this display.
Whatever's really going on, I've now got low level grays which are comparable in quality to the best I could get previously by directly connecting my 59avi DVD player to the DVI input of the Fujitsu. And every other aspect of the imagery is improved by running 480i into the D2 (HDMI to HDMI) and then 1360x768p to the display (HDMI to DVI).
I may be able to tweak this up a bit better as time goes on, but I'm now solidly in the camp that says the HDMI video path through the D2 really produces spectacular results. However if you have an 8 bit RGB display, it may take some serious patience to find the magic combo of display and D2 settings that produces nirvana.
And this leads to the real reason I'm sticking with HDMI from my Comcast box.
The settings I'm using on my 59avi are known to produce "standard" HDMI output values from the 59avi (see the 59avi Owner's Thread in the Standard Definition DVD forum here). The level settings combo of (1) the D2's input settings for the 59avi input, and (2) the settings on my display which together produced the best results are thus the "best" settings for processing "proper" HDMI values through the D2 to my display.
And that means that if I connect my Comcast box via HDMI, setting the SAME COMBO of D2 and display settings should give me the best chance of having proper levels for Comcast viewing as well -- modulo the odd variations that happen between various broadcast stations.
The one gotcha here is the color space difference between the DVD content (SDTV style color) and HDTV from the Comcast, but the D2 automagically handles that colorspace stuff already.
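To make that gotcha concrete: SD video (BT.601) and HD video (BT.709) encode YCbCr with different luma coefficients, so the matrix that converts back to RGB differs too, and decoding HD with the SD matrix visibly shifts colors. This sketch is just my illustration of the math the D2 is handling for me; the function and sample pixel values are made up for the example:

```python
import numpy as np

# Illustration of the SD/HD colorspace difference, not the D2's code.
# BT.601 (SD/DVD) and BT.709 (HD) use different luma coefficients,
# which changes the YCbCr -> RGB conversion matrix.
def ycbcr_to_rgb_matrix(kr, kb):
    """Build the 3x3 YCbCr->RGB matrix (rows R,G,B; cols Y,Cb,Cr)."""
    kg = 1.0 - kr - kb
    return np.array([
        [1.0, 0.0,                          2.0 * (1.0 - kr)],
        [1.0, -2.0 * (1.0 - kb) * kb / kg, -2.0 * (1.0 - kr) * kr / kg],
        [1.0, 2.0 * (1.0 - kb),             0.0],
    ])

M601 = ycbcr_to_rgb_matrix(kr=0.299,  kb=0.114)    # SD / DVD content
M709 = ycbcr_to_rgb_matrix(kr=0.2126, kb=0.0722)   # HD broadcast

# Decode the same HD pixel with each matrix (Y in 0..1, chroma
# centered on 0): the wrong matrix produces a hue/saturation error.
pixel = np.array([0.5, 0.2, -0.1])
print(M709 @ pixel)  # correct decode for HD content
print(M601 @ pixel)  # SD matrix applied to HD: shifted color
```

The point being: a processor that tags each input with the right colorspace and converts accordingly is doing real work here, which is one more reason to let the D2 sit in the middle of both paths.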
Contrast that with having the Comcast connected via Component cabling. There would be no reason whatsoever to believe that the "best" HDMI settings from the 59avi would also be the "best" Component settings for viewing the Comcast. Thus I'd have to get a light sensor, *AND* find some standardized test patterns broadcast on some Comcast channel, to calibrate the Comcast independently.
Or more likely I'd have to hire an ISF guy to come in with a signal generator, a light sensor, and so on.
The point is, the proper settings with this combo of hardware are finicky in the extreme. Screw up just a little and the nasty noise comes back. So being able to transfer settings "proven" with Avia Pro on my 59avi to the Comcast video path is a big win. And that can only be done if I use the Comcast's HDMI output.
The proof is in the viewing. Having achieved near nirvana with my current "best" settings for 59avi viewing through the D2, I then transferred those settings to the Comcast video path and, voila, I've now got eye candy there as well. At least until the Comcast decides it no longer likes my HDCP again...
At the moment, life is good.