(wiring diagram, partly lost: HDMI -> TV Set, 1920x1080p)
THEN I INSTALLED THE CATALYST CONTROL CENTER (CCC):
No matter what I tried, the setup always wanted to use the HDMI TV set as the primary display. Even if I got it to work with the monitor as primary and the TV as extended, on boot-up it would try to use the TV as primary. The TV set would only show a black screen with a text box saying that the resolution was not correct. Disconnecting the HDMI cable was the only way I could power on and have the desktop display on the monitor. Other times, with both cables connected, things would deteriorate: the computer monitor would go black and the TV would fill with white noise, with a periodic cycling effect as if it were trying to connect.
In Win7, right-clicking the desktop and clicking "Screen Resolution" always sets the TV as 1 and the monitor as 2; there is no way to set the monitor as 1 and the TV set as 2. If I unplug the HDMI cable and boot, the monitor becomes 1, but as long as the HDMI is connected, the TV becomes 1 and the monitor becomes 2. Meanwhile, CCC shows the opposite, with the monitor as 1 and the TV as 2. Things just don't work right.
The only way I can get the two displays to work reliably is to:
1. Uninstall CCC.
2. Connect the monitor to the motherboard's VGA connector and the TV to the HD4350's HDMI connector.
This way, Win7's "Screen Resolution" always shows the monitor as 1 and the TV as 2 (what I want). The desktop always displays on the monitor and the TV is always the extended display.
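To make the two wiring setups easier to compare, here is a toy model in Python. It is not Windows' actual assignment logic, just the symptoms reported above encoded as a function: with both displays on the HD4350 the HDMI TV grabs display 1, while splitting the displays across the integrated graphics and the card puts the monitor first.

```python
# Toy model of the numbering behavior described above. This is NOT how
# Windows actually decides -- it simply restates the symptoms from this
# thread so the two wiring configurations can be compared side by side.

def win7_display_order(wiring):
    """wiring: dict mapping display name -> (adapter, connector).
    Returns display names in the order Win7 was observed to number them."""
    adapters = {adapter for adapter, _ in wiring.values()}
    if len(adapters) == 1:
        # Single adapter (both on the HD4350): the HDMI output
        # ends up as display 1 -- the problem case.
        return sorted(wiring, key=lambda d: wiring[d][1] != "HDMI")
    # Split across adapters: the integrated (boot) adapter's
    # display becomes 1 -- the workaround case.
    return sorted(wiring, key=lambda d: wiring[d][0] != "integrated")

# Problem case: both displays on the HD4350 -> TV becomes display 1.
both_on_card = {"Monitor": ("HD4350", "DVI"), "TV": ("HD4350", "HDMI")}
# Workaround: monitor on the motherboard VGA -> monitor becomes display 1.
split = {"Monitor": ("integrated", "VGA"), "TV": ("HD4350", "HDMI")}
```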
FOR WHAT IT'S WORTH:
When I had both my monitor and TV connected to the motherboard's integrated VGA (or DVI) and HDMI ports, I never had, with the Intel HD Graphics, any of the craziness that the ATI card and CCC combo causes. Everything worked as expected.
How do you folks that use the Catalyst Control Center and have both your monitor and TV connected to your ATI graphics card manage to get everything working correctly and reliably? Is there a way to make Win7 assign display1 to the Monitor and display2 to the HDMI TV set? I ask in case my motherboard's integrated graphics ever goes bad and I have to connect both displays to my HD4350 card.
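For anyone diagnosing this, a small script can at least report which display Windows currently treats as primary. This is a sketch using the Win32 EnumDisplayDevices API through Python's ctypes; it only reads the assignment (it cannot change it), and it returns an empty list when not running on Windows:

```python
# Sketch: list displays attached to the desktop and flag the primary one
# via the Win32 EnumDisplayDevices API. Read-only -- it reports which
# display Windows treats as primary; it does not change the assignment.
import ctypes
import sys

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x01
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x04

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [
        ("cb", ctypes.c_ulong),
        ("DeviceName", ctypes.c_wchar * 32),
        ("DeviceString", ctypes.c_wchar * 128),
        ("StateFlags", ctypes.c_ulong),
        ("DeviceID", ctypes.c_wchar * 128),
        ("DeviceKey", ctypes.c_wchar * 128),
    ]

def list_displays():
    """Return (device_name, description, is_primary) tuples; [] off Windows."""
    if sys.platform != "win32":
        return []
    displays = []
    i = 0
    dd = DISPLAY_DEVICE()
    dd.cb = ctypes.sizeof(dd)
    while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
        if dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
            displays.append((dd.DeviceName, dd.DeviceString,
                             bool(dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)))
        i += 1
        dd = DISPLAY_DEVICE()
        dd.cb = ctypes.sizeof(dd)
    return displays
```

Running it before and after plugging in the HDMI cable would show exactly when Windows swaps which device carries the primary flag.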
Have you set manual detection in CCC?
1. The default in CCC was manual detection, so when I had problems I tried "auto detect when CCC is opened". The delay each time CCC opened with auto detect was too irritating, so I went back to manual detection. A big problem was that when I clicked manual detect (or when auto detect ran), the desktop display would be switched from the monitor to the TV set. The monitor would go black (or blue as extended), but the TV set would only show a screen of white noise with a periodic cycling effect, as if it were trying to connect.
2. I wonder if my problem occurs because the second display is an HDTV set connected via HDMI. Is one of your monitors connected via HDMI?
3. I also wonder if my problem manifests itself because my monitor's resolution is 1680x1050, and when the desktop is switched to the HDTV set, the TV doesn't like that resolution. Maybe both of your monitors can work at each other's resolution.
I'm not sure how you intend to use both displays... clone, extended, what? Personally, I've used an HD4550, HD5450, HD6450 & HD6570, and I have ONE display active at a time. I set HDMI as the default output to my AVR & PJ, while VGA drives my back-room 1600x1050 display. Whenever my AVR/PJ is on, the video goes there; when I power them off, it reverts to the VGA. Works like a champ!
Have you tried using a different version of CCC?
I have had this problem come and go with a variety of monitors and graphics cards (both ATI and Nvidia) over the years. My theory is that Microsoft auto-updates screw things up. I have tried all the suggested fixes with no effect. Now, when I start Media Center in live TV and Media Center hangs, I just hit the previous button on my remote until I get synchronized audio, or I shut down Media Center and/or the TV and/or the computer and start from scratch.
Best wishes. If you find a solution, please post it.