Thanks for the reply pohh-bah.
Perhaps this is a lack of understanding on my part regarding what is going on, then. If I select a custom aspect ratio from my video driver, doesn't that mean the video is being scaled once by my HTPC to something other than 1080p, passed through my AVR, and then scaled again by my TV back to 1080p? I don't understand why this is necessary if my HTPC is putting out a 1920x1080 image at 60 Hz (a full 1080p signal). I further don't understand why, with my HTPC video output held constant at 1920x1080 @ 60 Hz, changing my HDMI input type between 'Video' and 'PC' results in the same signal being displayed totally differently. I'm certainly not telling my TV to zoom the 'Video' image or anything like that.
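Just to make concrete what I think is happening (the 5% figure is my own guess, not something I've measured), here's a quick sketch of why a custom aspect ratio would mean two lossy resamples:

```python
# Rough sketch (assumed numbers): why a custom aspect ratio /
# underscan setting implies the image gets scaled twice.
native = (1920, 1080)
overscan = 0.05  # assume the TV crops ~5% of the frame in 'Video' mode

# Step 1: the GPU driver shrinks the desktop to fit inside the
# region the TV will actually show, padding the edges with black.
underscanned = tuple(round(d * (1 - overscan)) for d in native)

# Step 2: the TV zooms the incoming 1920x1080 frame to hide its
# overscan, rescaling the already-shrunken desktop back up.
print(underscanned)  # the desktop is only rendered at this size
```

So with a 5% overscan the desktop would effectively be rendered at 1824x1026 and then stretched back to 1920x1080, which is exactly the kind of softening I'm worried about.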
I guess I just don't understand why the custom aspect ratio is necessary. Shouldn't I be able to set my HTPC resolution to 1080p and have that signal be fully resolved by my 1080p TV? You'll have to forgive me, as I'm relatively new to this. I had no trouble previously, when I ran DVI->HDMI and simply set the input type to 'PC'. Is this something you typically have to deal with on HTPCs when going HDMI all the way? I'm not willing to give up the new HD audio, but I'm concerned that I'll be introducing unnecessary artifacts by setting up a custom aspect ratio.
If someone could explain this to me, or at least point me in the direction of some material that might explain it, I'd much appreciate it.