So, could someone explain this one to me?
I was goofing around with the TV today, having decided I was going to hook the new computer build directly into it rather than through the receiver. Really, I was just testing whether I could see any difference in quality between VGA and HDMI.
After playing with that, I thought I'd hook the computer back into HDMI1 on the TV, since the computer is the primary component for use with this TV. The picture, still fed into the TV directly from the computer, was crap again. The resolution settings were fine (still 1360x768); it just looked slightly fuzzy ("crap" is a bit of an exaggeration).
So, then I hooked the computer back up to the receiver, but hooked the receiver up to HDMI2 on the TV. Crystal clear.
I have no idea what is going on. I reset all of the picture settings on HDMI1 to no avail. Now, HDMI1 does have settings that I can't adjust on HDMI2, like color balance and an "entertainment mode" for sports, games, etc. HDMI1 also has Zoom and Just Scan viewing modes, while HDMI2 can only go to 16:9 or 4:3. Yet, despite the apparent limitations, HDMI2 looks far crisper than anything I can get HDMI1 to display.
Does anyone have any idea what is going on? For reference again, my TV is a Samsung LN40A450, the receiver is an Onkyo TX-SR606, and for the time being the computer is a Dell Zino HD running over HDMI.
EDIT: Another wrinkle: when hooked up to HDMI1, Windows 7 says that 1280x720 is the recommended resolution. When hooked into HDMI2 it says that 1360x768 is the recommended resolution. Setting HDMI1 to Just Scan and then adjusting the GPU's overscan/underscan does work, but the picture is still slightly fuzzy.
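Since the two inputs give different "recommended" resolutions, I'm guessing the TV is advertising a different EDID block on HDMI1 than on HDMI2. In case anyone wants to compare them, here's a rough Python sketch (just something I threw together, not specific to this TV) that dumps the EDID data Windows caches in the registry for each monitor instance and prints the native mode from the first detailed timing descriptor:

```python
# Rough sketch: dump cached EDID blocks from the Windows registry and print the
# native resolution each one advertises. Assumes the standard registry layout
# HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY\<id>\<instance>\Device Parameters\EDID
import winreg

def native_mode(edid: bytes):
    """Read H x V active pixels from the first 18-byte detailed timing descriptor."""
    dtd = edid[54:72]
    if len(dtd) < 18 or (dtd[0] == 0 and dtd[1] == 0):
        return None  # first descriptor isn't a timing descriptor
    h = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return (h, v)

def walk_displays():
    base = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, base) as disp:
        for i in range(winreg.QueryInfoKey(disp)[0]):
            dev_id = winreg.EnumKey(disp, i)
            with winreg.OpenKey(disp, dev_id) as dev:
                for j in range(winreg.QueryInfoKey(dev)[0]):
                    inst = winreg.EnumKey(dev, j)
                    try:
                        with winreg.OpenKey(dev, inst + r"\Device Parameters") as params:
                            edid, _ = winreg.QueryValueEx(params, "EDID")
                    except FileNotFoundError:
                        continue  # no cached EDID for this instance
                    mode = native_mode(edid)
                    print(dev_id, inst, "->", mode if mode else "no timing descriptor")

if __name__ == "__main__":
    walk_displays()
```

If HDMI1 and HDMI2 show up as separate instances with different native modes (say 1280x720 vs 1360x768), that would at least confirm the TV itself is reporting the inputs differently rather than the GPU doing something odd.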