I've been feeding the interlaced output of a Sony DVP9000ES to a Crystal Image scaler, which drives a JVC DILA15 on a 110" diagonal screen. I purchased the Sony three years ago. For the past several months it had been having dropouts and other video problems that got progressively worse. I then saw a Denon 1600 floor sample on clearance at a local Tweeter and picked it up. Since the Sony is an ES model, it was still within its 5-year parts & labor warranty, so I sent it in; Sony replaced parts, cleaned it, etc., and now it works like new.

The problem is that in comparing the PQ of the two players, I seem to prefer the Denon. Both players look good, and I've tried tweaking the Sony's various video adjustments to match the Denon, since I had planned to give the Denon to a friend when the Sony came back. I've gotten the image closer (the Sony's image always appeared "softer" to me, so I corrected that). I've also compared the two players using various test patterns, and they test rather similarly, although the Denon's black levels appear marginally better. The difference is subtle, but overall I see more detail from the Denon than from the Sony.

As I said, I'm using the interlaced output of both players. I'm not a video processing geek, so what could be responsible for the Denon's better PQ? Is anyone surprised at this result? Any ideas?