Originally Posted by jkaiser
I can only run 1080p over VGA to my Samsung (DVI/HDMI are 1080i only - but not recommended by Samsung). When I use the overscan utility (for 1080p), the Samsung reports "unsupported resolution".
Your lack of 1080p control may be related to what the ATI driver is detecting as your display (it doesn't think your display supports 1080p, so it doesn't give you that option in Catalyst).
Hi folks, I did some experimenting and found that this is neither a hardware limitation nor a basic driver issue; it's simply a bug. Here's the proof.
Requirements: a 1080p-capable monitor with both HDMI and DVI inputs (mine's a Westy), a Radeon card (mine is an X1300, but as you'll see, I suspect any Radeon will do), a DVI-DVI cable, and a DVI-HDMI cable.
1. Connect the monitor to the ATI card via DVI. Set it to 1920x1080x60 (or, in Catalyst, 1080p). Verify that the display works with no borders.
2. WITHOUT TURNING OFF ANYTHING OR CHANGING ANYTHING ON THE PC OR MONITOR, disconnect the DVI and hook up the HDMI. If your monitor doesn't auto-detect the new input, manually change the monitor's input selection to HDMI.
3. If you get a Catalyst warning about not receiving audio through HDMI, ignore it.
4. Verify that the display works with no borders.
That's right - the Radeon is pumping out pure 1080p perfectly through HDMI, with no underscanning, no borders, no nothing. If your monitor can report the incoming signal in its menus, you can verify that it is receiving a 1080p signal.
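Side note, if you want a cross-check from the PC side as well: below is a tiny sketch of my own, using nothing but the stock Win32 EnumDisplaySettings call, that prints the mode Windows believes it is currently driving. It only reports what the OS is outputting, not what the monitor is receiving, so treat it as a sanity check, not proof.

Code:
/* Sketch: print the current mode of the primary display.
   Build with any Windows C compiler; uses only EnumDisplaySettings. */
#include <stdio.h>
#include <windows.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* NULL = primary display, ENUM_CURRENT_SETTINGS = mode in use right now */
    if (!EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
        fprintf(stderr, "EnumDisplaySettings failed\n");
        return 1;
    }

    printf("Current mode: %lux%lu @ %lu Hz\n",
           (unsigned long)dm.dmPelsWidth,
           (unsigned long)dm.dmPelsHeight,
           (unsigned long)dm.dmDisplayFrequency);
    return 0;
}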
5. Reboot the PC. (On my PC I can change resolutions and change back to accomplish the same thing, but YMMV and rebooting is the one sure bet.)
6. Verify that your picture now sucks.
This rules out any hardware or basic driver inability to display 1080p through HDMI - because it was doing exactly that before the reboot. Now things really get interesting.
7. If your monitor supports it, verify that the current resolution is still 1080 (p or i).
How is that possible? It's because the HDMI cable is still pushing a 1080 signal, but somewhere along the line the image is being shrunk, and a black border is ADDED around it so the total still comes out to 1080. I first suspected this when I turned the brightness way up at a 1280x1024 setting on the same monitor and saw the "black" (now gray) border around the Windows desktop (normally invisible, because the black put out by the ATI blends in with the blank area of the monitor). Somehow a border was being added...
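If you want to see the arithmetic, here's a quick sketch. The 15% figure is purely illustrative - I don't know the exact factor Catalyst applies - but the point is the same at any percentage: the desktop gets shrunk, and black padding is added so the signal still totals 1920x1080.

Code:
/* Sketch of the border math; the 15% underscan factor is an assumption,
   used only to illustrate how a shrunken image plus black padding still
   adds up to a full 1920x1080 signal. */
#include <stdio.h>

int main(void)
{
    const int full_w = 1920, full_h = 1080;
    const double underscan = 0.15;  /* assumed value, for illustration only */

    int scaled_w = (int)(full_w * (1.0 - underscan));   /* 1632 */
    int scaled_h = (int)(full_h * (1.0 - underscan));   /*  918 */

    printf("Scaled image:  %dx%d\n", scaled_w, scaled_h);
    printf("Black border:  %d px left/right, %d px top/bottom\n",
           (full_w - scaled_w) / 2, (full_h - scaled_h) / 2);
    printf("Total signal:  still %dx%d\n", full_w, full_h);
    return 0;
}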
8. WITHOUT TURNING OFF ANYTHING OR CHANGING ANYTHING ON THE PC OR MONITOR, disconnect the HDMI and hook up the DVI. If your monitor doesn't auto-detect the new input, manually change the monitor's input selection to DVI.
9. Verify that the picture still sucks.
WHAT?!? But you're running off DVI now! What has this got to do with the HDMI?
Answer: Nothing! Oh wait, maybe it does...maybe selecting HDMI somehow latched something in the hardware and now your video card is permanently screwed up and will display those borders forever. Let's disprove that theory:
10. Reboot the computer and verify that everything is back to normal.
So: there is nothing wrong with the core display driver, and the hardware is fully capable of producing a perfect 1080p image whether it's going through DVI or HDMI. It's just a stupid, stupid bug that ATI refuses to publicly acknowledge (as of this date there's nothing about it in their FAQs or Knowledge Base). Someone in this thread mentioned uninstalling Catalyst, and I'm going to give that a try, because clearly what is happening is that some ATI routine outside the core driver is intercepting the 1080p output, forcing it into an underscanned size, and adding a black border around it so it still comes out as 1920x1080. It CANNOT be the hardware and it CANNOT be the basic driver, because otherwise you would have seen the borders the moment you switched from DVI to HDMI in step 2.
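One more thing for the tinkerers, before anyone nukes Catalyst: some of the registry-tweak guides for these drivers mention an underscan-related value named DigitalHDTVDefaultUnderscan under the card's device key. I have NOT verified that name or location against every Catalyst release, so consider both an educated guess. The little sketch below only reads the value (it changes nothing), using the standard RegOpenKeyEx/RegQueryValueEx calls.

Code:
/* Sketch: read a (possibly present) underscan value from the ATI device key.
   The value name "DigitalHDTVDefaultUnderscan" and its data type are
   unverified assumptions taken from registry-tweak threads; this program
   only reads, never writes.
   Find your card's key under:
     HKLM\SYSTEM\CurrentControlSet\Control\Video\{some-GUID}\0000
   and pass the path (without the HKLM\ part) as the only argument. */
#include <stdio.h>
#include <windows.h>

int main(int argc, char *argv[])
{
    HKEY key;
    DWORD value = 0, size = sizeof(value), type = 0;
    LONG rc;

    if (argc != 2) {
        fprintf(stderr,
                "usage: %s SYSTEM\\CurrentControlSet\\Control\\Video\\{GUID}\\0000\n",
                argv[0]);
        return 1;
    }

    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, argv[1], 0, KEY_READ, &key) != ERROR_SUCCESS) {
        fprintf(stderr, "could not open HKLM\\%s\n", argv[1]);
        return 1;
    }

    rc = RegQueryValueExA(key, "DigitalHDTVDefaultUnderscan", NULL, &type,
                          (LPBYTE)&value, &size);
    if (rc == ERROR_SUCCESS)
        printf("DigitalHDTVDefaultUnderscan = %lu (type %lu)\n",
               (unsigned long)value, (unsigned long)type);
    else
        printf("could not read that value under this key (rc=%ld)\n", (long)rc);

    RegCloseKey(key);
    return 0;
}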