Originally Posted by xtrips
I understand the logic behind this, but then aren't we spoiling the signal extensively? I mean, this is the path as I reckon it:
- BD (1920x1080?/24) > HTPC (1920x1080i/50) > D2 (1920x1080p/24)
This way I think I've lost all the advantage of supporting the original 24 fps. Right?
What do you think I should do?
And one more thing, I am pretty sure I read somewhere that the Gennum performs its de-interlacing only when fed an interlaced signal (logical). So is this the reason why you recommend using a 1920x1080i/50 signal?
I think I am going sideways....
I have video settings in the HTPC (nvidia), in the D2 and in the Sharp projector. I am lost.
So I decided first to isolate the D2.
I projected the test patterns from the D2 and proceeded as described in the Anthem manual. This led me to some major changes in my projector settings. When the test patterns looked as expected (I used the optical filter set from a Digital Video Essentials DVD), I thought I was good to go for setting up the nvidia.
But then the HTPC's output looked terrible. So I started adjusting the nvidia settings. I couldn't get it right somehow. Things looked wrong.
What is the best practice, please?
I think I've managed to confuse you.
First are you in the US or in Europe? There is no reason at all to use a /50 signal in US markets. Those are for European TVs.
Second, you should *NOT* be using interlaced signals for film-based Blu-Ray discs if your HTPC supports 1080p/24. And for video-based Blu-Ray discs you should be using 1080i/60 (in US markets).
For film-based Blu-Ray discs the path should be: BD (1080p/24 off the disc) --> HTPC (1080p/24) --> D2 --> TV where the output from the D2 is set to the "best" video resolution and frame rate for your TV. The frame rate to your TV will be /24 if the TV supports it properly and /60 otherwise. I.e., it is OK to convert /24 from the disc to /60 if your TV doesn't support /24 -- let the D2 do this conversion.
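To see why letting the D2 convert /24 to /60 is harmless, here is an illustrative sketch (not anything the D2 actually runs) of the standard 3:2 pulldown cadence: each pair of 24 fps film frames is repeated 3 then 2 times, yielding exactly 60 output frames per second. The function name and frame representation are hypothetical, for illustration only.

```python
# Hypothetical sketch of 3:2 pulldown: 24 fps film frames are repeated in a
# 3,2,3,2,... cadence to fill a 60 Hz output. This illustrates the cadence
# only; a real video processor works on fields/frames in hardware.

def three_two_pulldown(frames):
    """Expand a list of 24 fps frames to 60 Hz by repeating them 3,2,3,2,..."""
    out = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2
        out.extend([frame] * repeats)
    return out

# One second of film: 24 frames in, 60 frames out.
second = three_two_pulldown(list(range(24)))
print(len(second))  # 60
```

The point is that every film frame survives intact; the conversion only changes how long each frame is shown, which is why it is safe to let the D2 do it when the TV can't accept /24.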
For video-based Blu-Ray discs the path should be: BD (1080i/60 off the disc) --> HTPC (1080i/60) --> D2 --> TV where the output from the D2 is set to the "best" video resolution and /60 frame rate for your TV. I.e., do not try to convert /60 input into /24 output.
A player like the PS3 can be set to automatically switch its output from 1080p/24 to 1080i/60 according to what's coming off the disc. Ideally your HTPC Blu-Ray playback system should do the same.
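The auto-switching behavior described above amounts to a simple pass-through rule; a hypothetical sketch (the names are illustrative, not any player's real API):

```python
# Hypothetical sketch of a player's auto output switching (US market):
# pass the disc's native format straight through rather than converting it.

def choose_output(source_format):
    """Pick the player output mode from what's coming off the disc."""
    if source_format == "1080p/24":    # film-based disc
        return "1080p/24"
    if source_format == "1080i/60":    # video-based disc
        return "1080i/60"
    raise ValueError("unexpected source format: " + source_format)

print(choose_output("1080p/24"))  # 1080p/24
print(choose_output("1080i/60"))  # 1080i/60
```

Either way, the D2 then handles any final conversion to whatever the display needs.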
There are lots of ways to screw up video calibration setup so I can't tell what you might have done wrong from your post. Carefully re-read the "Video Calibration for non-ISF Techs" post in the collection of links in the first post of this thread for suggestions.
First, what are your Setup / Video Output settings in the D2 and does your projector use an HDMI or a DVI input? For an HDMI to DVI connection to your projector, are you using Studio RGB output from the D2? Second, double check the "picture mode" setting in your projector as described in that post, as well as turning off image enhancement "features" in the projector.
If your "best" projector settings for displaying the D2's internally generated test patterns are still quite a bit different from those you used with your HTPC, then either your D2's Video Output is set wrong or your HTPC was sending out an odd signal which you compensated for using the settings in your projector. Either way, you also need to revisit the settings in your HTPC for its output to the D2.