Originally Posted by shud
Thankfully I found this thread!
Sad to report that even with Catalyst 7.5 this is still an issue. A craptacular Nvidia card I got for like $35 from Newegg that I have in another PC outputs a 1920x1080 signal perfectly fine.
This is really hard to wrap my head around. I'm tempted to ditch my X800 XL in favor of a similar GeForce card. Is there a simpler way to just get a 1080p resolution working?
Overscan shouldn't be an issue; my Toshiba has a "native" picture size mode that cuts down the overscan at 1280x720 just fine.
If I go into CCC and set my display to 1920x1080x30, it works just fine. But as soon as I bump it up to 60Hz there's absolutely no signal.
I reallllly want this to push a progressive output. This interlacing is killing my eyes.
Even if I set a custom timing in PowerStrip and adjust the refresh to 60Hz, when I go into the Windows display properties for my Toshiba LCD it's set at 30Hz, and if I bump it up I lose the signal.
Can this card simply not push 1920x1080x60? I find it hard to believe, since that junky Nvidia can do it. If that's the case I may just switch to a budget GeForce option.
Thanks for posting. I've been busy and haven't followed this thread in a while. And thanks for the update on Catalyst 7.5; I haven't had a chance to check since 7.4.
The goal of this thread is really to help overcome a particular inability of some Radeon cards, or rather of the Catalyst drivers, to deliver a proper interlaced 1080i30 signal. Since earlier drivers could do it without problems, and since PowerStrip can do it as well, we know the cards themselves are capable.
It is becoming less and less of an issue over time, as newer TVs can accept 1080p input, and I am not aware of Radeon cards having problems with 1920x1080x60.
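To put some numbers on it: going from 30Hz to 60Hz at 1920x1080 doubles the pixel clock the TV has to accept. A quick Python sketch, using the standard CEA-861 totals for these modes (your card's actual blanking may differ slightly):

```python
# Pixel clock = total pixels per frame (active + blanking) x refresh rate.
# The totals below are the standard CEA-861 timings for these modes.
modes = {
    "1080p60": (2200, 1125, 60),             # full progressive frames, 60/sec
    "1080p30": (2200, 1125, 30),             # full progressive frames, 30/sec
    "1080i (60 fields)": (2200, 562.5, 60),  # half-height fields, 60/sec
}

for name, (h_total, v_total, rate) in modes.items():
    clock_mhz = h_total * v_total * rate / 1e6
    print(f"{name}: {clock_mhz:.2f} MHz pixel clock")

# 1080p60: 148.50 MHz pixel clock
# 1080p30: 74.25 MHz pixel clock
# 1080i (60 fields): 74.25 MHz pixel clock
```

148.5 MHz is well within single-link DVI's 165 MHz limit, so the card should be able to generate it; but a TV whose inputs top out at 74.25 MHz would accept 1080p30 and 1080i and drop 1080p60, which matches what you're describing.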
What kind of TV do you have? Are you sure it supports 1080p input?
I do note that you have 1920x1080 working with the cheap Nvidia card, but it may well read the TV's EDID and force 1080i when you set a 1920x1080 resolution.
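If you want to see exactly what the TV is telling the card, you can look at its EDID; PowerStrip can show the raw EDID bytes, if I remember right. Decoding the first detailed timing descriptor tells you the set's preferred mode and whether it's interlaced. A rough Python sketch, assuming you've saved the 128-byte dump as edid.bin (just a placeholder name):

```python
# Minimal sketch: decode the first detailed timing descriptor (DTD) from a
# raw 128-byte EDID block. "edid.bin" is a placeholder for a dump saved
# with whatever EDID tool you use.
with open("edid.bin", "rb") as f:
    edid = f.read(128)

dtd = edid[54:72]  # the first DTD starts at byte 54 and is 18 bytes long

pclk_hz  = (dtd[0] | (dtd[1] << 8)) * 10000   # stored in 10 kHz units
h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
h_blank  = dtd[3] | ((dtd[4] & 0x0F) << 8)
v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)    # field height if interlaced
v_blank  = dtd[6] | ((dtd[7] & 0x0F) << 8)
interlaced = bool(dtd[17] & 0x80)             # bit 7 of the flags byte

rate = pclk_hz / ((h_active + h_blank) * (v_active + v_blank))
lines = v_active * 2 if interlaced else v_active
print(f"Preferred mode: {h_active}x{lines}{'i' if interlaced else 'p'}"
      f" at {rate:.1f} {'fields' if interlaced else 'frames'}/sec")
```

If the preferred mode comes back as 1280x720p or 1920x1080i, that's a strong hint the set doesn't take 1080p.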
If the TV has a native resolution of 1280x720 (which is what I gathered from your message), it may not support 1080p input at all. It supports 1080i input (as you are seeing) and scales that down to 1280x720 (after de-interlacing).
In that case, I wouldn't bother with 1080i or 1080p, since you are not going to see any more detail in the final picture anyway. Set your PC resolution to 1280x720x60 and save yourself the hassle. If you use a good video player (such as VLC) it will scale down 1080i and 1080p source material as well as the TV does (or even better).
(Of course, if your TV is one of those 1366x768 LCDs or plasmas, you should set your PC to that resolution instead.)
On the other hand, if the TV has a native resolution of 1920x1080 (LCD, DLP, LCoS, plasma etc) you definitely want to get 1920x1080 working. The key is whether it supports 1080p input or only 1080i.
If it's a 1080p flat-panel LCD or plasma it probably DOES support 1080p input. If it's a recent-model 1080p DLP, LCoS or other RPTV it MAY support 1080p input. Otherwise it most likely ONLY supports 720p and 1080i input (like my 2005 Toshiba DLP).
If you know positively that your TV supports 1080p input, I apologize for insinuating otherwise. It's just news to me that ATI cards would have problems outputting 1920x1080p60; to me, it's always been interlaced 1080i that was the problem.
That said, if your TV is a native 1920x1080 display (LCD, plasma, DLP, LCoS etc.) you shouldn't be seeing that much flickering with a 1080i signal. These TVs all de-interlace the 1080i signal and display it at 60Hz.
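To illustrate: the simplest de-interlacing method, "bob", just line-doubles each 540-line field into a full frame, so 60 fields per second come out as 60 full frames per second. A toy Python sketch, not how any real TV is implemented:

```python
# Toy sketch of "bob" de-interlacing: each 540-line field becomes a
# full-height frame by line-doubling, so 60 fields/sec in -> 60 frames/sec out.
def bob_deinterlace(fields):
    for field in fields:                                # each field: 540 scanlines
        yield [line for line in field for _ in (0, 1)]  # double to 1080 lines

# Example: two dummy 540-line fields
fields = [["field0-line"] * 540, ["field1-line"] * 540]
for frame in bob_deinterlace(fields):
    print(len(frame))  # prints 1080, twice
```

Real sets use smarter motion-adaptive de-interlacing, but the point is the same: the panel refreshes 60 times a second either way, so there's no 30Hz flicker.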
Did you try the settings described in the opening post of this thread ("Use centered timings" in particular)? You don't have to go through the entire procedure, since your X800 XL obviously doesn't suffer from the same 1080i glitch that the procedure is meant to circumvent.