Guide to 1080i DVI on Radeons - Page 4 - AVS Forum
post #91 of 94 Old 06-16-2007, 01:19 PM - Thread Starter
Brandenborg (Senior Member)
Quote:
Originally Posted by shud

Thankfully I found this thread!

Sad to report that even with Catalyst 7.5 this is still an issue. A craptacular Nvidia card I got for like $35 from Newegg that I have in another PC outputs a 1920x1080 signal perfectly fine.

This is really hard to wrap my head around. I'm tempted to ditch my x800XL in favor of a similar Geforce card.

Is there a simpler way to just get a 1080p resolution to work? Overscan shouldn't be an issue; my Toshiba has a "native" picture size that cut down the overscan on the 1280x720 resolution just fine.

If I go into CCC and set my display to 1920x1080x30, it works just fine. But as soon as I bump it up to 60 Hz there's absolutely no signal. I really want this to push a progressive output. This interlacing is killing my eyes.

Even if I set a custom timing in PowerStrip and adjust the refresh to 60 Hz, when I go into the Windows properties for my Toshiba LCD it's set at 30 Hz, and if I bump it up I lose the signal.

Can this card simply not push 1920x1080x60? I find it hard to believe, since that junky Nvidia can do it. If this is the case I may just switch to a fairly budget GeForce option.

Hi shud,

Thanks for posting. I've been busy and haven't followed this thread for a long while. And thanks for the update on Catalyst 7.5. I haven't had a chance to check since 7.4.

The goal of this thread is really to help overcome a particular inability of some Radeon cards (or rather, of the Catalyst drivers) to deliver a proper interlaced 1080i30 signal. Since earlier drivers could do it without problems, and since PowerStrip can do it as well, we know the cards themselves are capable.
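For reference, the signal the drivers should be producing is the standard CEA-861 1080i timing. Written here as an X-style modeline, which is just a compact notation for the same numbers you would enter in PowerStrip's advanced timing fields (the timing values are the standard ones, but exactly how they map onto PowerStrip's dialog is something you should verify yourself):

  # CEA-861 1920x1080 interlaced: 74.25 MHz pixel clock, 33.75 kHz horizontal, 60 Hz fields
  Modeline "1920x1080i60"  74.25  1920 2008 2052 2200  1080 1084 1094 1125  interlace +hsync +vsync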

It is becoming less and less of an issue over time, as newer TVs can accept 1080p input -- and I am not aware of Radeon cards having problems with 1920x1080x60.

What kind of TV do you have? Are you sure it supports 1080p input?

I do note that you have 1920x1080 working with the cheap Nvidia card, but it may well be reading the TV's EDID and forcing 1080i when you set the 1920x1080 resolution.

If the TV has a native resolution of 1280x720 (which is what I gathered from your message) it may not support 1080p input at all. It supports 1080i input (as you are seeing) and scales that down to 1280x720 (after de-interlacing).

In that case, I wouldn't bother with 1080i or 1080p, since you are not going to see any more detail in the final picture anyway. Set your PC resolution to 1280x720x60 and save yourself the hassle. If you use a good video player (such as VLC) it will scale down 1080i and 1080p source material as well as the TV does (or even better).
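If it helps, VLC can be told to deinterlace and go fullscreen from the command line. A sketch using option names from recent VLC builds (the options in 2007-era versions may differ, and video.ts stands in for your own file; check vlc --help on your version):

  vlc --fullscreen --deinterlace=1 --deinterlace-mode=blend video.ts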

(Of course, if your TV is one of those 1366x768 LCDs or plasmas, you should set your PC to that resolution.)

On the other hand, if the TV has a native resolution of 1920x1080 (LCD, DLP, LCoS, plasma etc) you definitely want to get 1920x1080 working. The key is whether it supports 1080p input or only 1080i.

If it's a 1080p flat panel LCD or plasma it probably DOES support 1080p input. If it's a recent model 1080p DLP, LCoS or other RPTV it MAY support 1080p input. Otherwise it most likely ONLY supports 720p and 1080i input (like my Toshiba 2005 DLP).

If you know positively that your TV supports 1080p input, I apologize for insinuating otherwise. It's just news to me that ATI cards should have problems outputting 1920x1080p60 -- to me, it's always been interlaced 1080i that was the problem.

That said, if your TV is a native 1920x1080 (LCD, plasma, DLP, LCoS etc.) you shouldn't be seeing that much flickering with a 1080i signal. These TVs all de-interlace the 1080i signal and display it at 60 Hz.

Did you try the settings described in the opening post of this thread ("Use centered timings" in particular)? You don't have to go through the entire procedure, since your x800XL obviously doesn't suffer from the 1080i glitch that procedure is meant to circumvent.
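And if your set really does accept 1080p60, the timing to aim for is the standard CEA-861 one. Again in modeline notation as a sketch (the numbers are standard; translating them into PowerStrip's fields is up to you):

  # CEA-861 1920x1080 progressive: 148.5 MHz pixel clock, 67.5 kHz horizontal, 60 Hz
  Modeline "1920x1080p60"  148.50  1920 2008 2052 2200  1080 1084 1089 1125  +hsync +vsync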

Good luck.
post #92 of 94 Old 06-16-2007, 02:03 PM - Thread Starter
Brandenborg (Senior Member)
Quote:
Originally Posted by scherer326

I can get 1:1 mapping when I set the resolution to 1920x1080 at 30 Hz (1080i), but I get a 1-inch black bar around the screen when I set the resolution to 1920x1080 at 60 Hz (1080p). I tried the VGA input and it works perfectly for 1080p, but I want to save that input for my Xbox 360.

I have an ATI x1600 Pro and a Sony KDL-46XBR2 1080 LCD television. The computer is connected to the TV via a DVI-to-HDMI cable.

Why can't 1080p at 60 Hz fill the entire screen with 1:1 mapping? What am I doing wrong? Please help.

scherer326, I am sorry you never got a response to your post. I haven't been following this thread for several months. You probably solved or worked around this issue a long time ago.

I saw some others post about similar problems, and I am not sure why you would get a black border in 1080p (underscan). Usually it's the other way round (overscan).

I suspect the TV has some built-in overscan compensation -- and that it just compensates too much. Any chance there is a setting in the TV's menus to turn overscan compensation on/off?

I don't see what can be done on the PC side to fix this. There is no valid HDTV resolution larger than 1080p that you could use.

If you found a solution (and if you happen to see this message again) I would like to hear it.
post #93 of 94 Old 06-16-2007, 02:19 PM - Thread Starter
Brandenborg (Senior Member)
Quote:
Originally Posted by cbm64

First, let me say what a great thread this is! I'm not running Windows, but there is a lot of useful information here. Special credits go out to Brandenborg.

OK, so here goes. Perhaps this is not the 'proper' thread, but since I am trying to get things working with DVI > HDMI, I think you can help me out. In the end I want 1080i on a Radeon 9200 using DVI.

I've been into analog MythTV for some time now and have been running Linux since '92. I got the idea to use my old Mac Mini (PPC version) as a frontend for watching TV/movies/games/whatever. This machine is running Gentoo/PPC and I have a great picture using D-sub VGA on an LCD monitor running 1280x1024. I use the Xorg open-source radeon driver, since ATI didn't port their fglrx drivers to PPC. But no problem there, since I only use it for 2D anyway, and the guys at Xorg have worked miracles with this OS driver.

Since this mini will be the frontend for watching TV, I plunged right in and bought a JVC LT-32x70 LCD TV, which has a pixel grid of 1366x768. Besides the usual ports, it has 2 HDMI ports, a D-sub VGA port and a component port. According to the manual, one HDMI port supports 'DVI equipment'.

So I got myself a DVI > HDMI cable (can't believe what people are willing to pay) and got to work. The first snag I ran into is that I cannot boot the mini with the cable attached. It could be the negotiation, where in the end the mini cannot find a suitable resolution to present itself. I also have an Asus Pundit here with a mobile GeForce 6 and ran into the same problem. The cable itself and the HDMI port on the JVC are OK, since I get a picture when I connect the cable to my MacBook and use it as a second screen. Colors are a bit coarse though, but I believe you guys call this 'black crush'.

Since the mini will be powered on all the time this is not a big issue, but when I eventually go for standby it might become a show-stopper. Is there another solution, aside from unplugging the cable at boot?

Second question. I can control the combined DVI-I/A port on the mini to quite some degree with the drivers from Xorg. I can also fiddle with modelines and timings, since I ate those for breakfast back in my MAME arcade monitor days. But since going digital is something different, do you think I can pull it off using the modelines I saw in this very thread for 1080i? Or is there more than meets the eye in the Catalyst driver, and am I way off?

Third. There is always the option of a VGA solution using the D-sub connector. From the manual I get the impression that it scales everything to WXGA at 1024x768, but on the MacBook I think I've managed to crank out 1280x768. I will have to verify this with a test pattern though. So I think a multiple of 8 close to 1366 might be an option; I just might get 1:1 pixel mapping. Will this degrade the quality a lot? In one of the early replies in this thread the VGA option was also mentioned as an alternative, I believe.

Fourth and last (promise). I can also go for the component option. I've just ordered an ATI DVI-to-component dongle on eBay, just to be sure. Perhaps it will work with the 9200, perhaps not, but I thought it might get me through the boot sequence. I also believe, from reading this thread, that using component at 1080i25 (I'm from the Netherlands) will produce a fine image.

Still, that won't be as much fun, so the DVI > HDMI route still has my vote. And after all, my wife watches TV and I only tinker with it! Would be a shame if I fixed it in one sweep.

Hope to hear your opinions!

regards,
cmb64

Hi cmb64,

As I wrote to scherer326: I am sorry you never got a response to your post. I haven't been following this thread for several months. You probably got this thing working a long time ago.

Sounds like a fun project. I don't know half the products you mentioned, as I am not into Myth/Linux or Mac.

Some of the problems are common to the PC side, though, such as the inability to boot in a working HD resolution. I am unable to view my boot screen on the TV, so I always use the LCD monitor for all setup (as you use the MacBook), then connect the TV once the OS is set to the proper resolutions.

My impression is that Nvidia cards are a lot more plug-n-play with HDTVs, including picking up compatible resolutions through EDID at boot time.
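As for your modeline question: in principle, yes -- the standard CEA-861 1080i timing should carry straight over to an Xorg modeline. A sketch of the 50 Hz variant you would want in the Netherlands (the timing numbers are the standard ones, but I haven't tested this on the PPC radeon driver, so treat it as a starting point only):

  # CEA-861 1920x1080 interlaced: 74.25 MHz pixel clock, 28.125 kHz horizontal, 50 Hz fields
  Modeline "1920x1080i50"  74.25  1920 2448 2492 2640  1080 1084 1094 1125  interlace +hsync +vsync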

But any comments from me this long after your post are probably moot.

If you do see this and have made progress on your project (as I am sure you have), I would love to hear the outcome.
post #94 of 94 Old 08-30-2007, 06:05 AM
Flip (Member, Sweden)
Hi

I am trying to get a nice HTPC desktop on my European Pioneer PDP427 plasma (I don't know the American model): 42 inch, 1024x768 pixels, with HDMI inputs. It is connected with an HDMI-HDMI cable plus adaptor to an ATI Radeon x700 (DVI).
The HDMI inputs support 720p, 1080i and 1080p24 (only a blank screen for me with PowerStrip).

I have tried 1024x768, 720p and 1080i, but the Windows desktop is either too big or too small.

About your great guide: which checkboxes/options in PowerStrip should not be enabled? And when should the refresh rates indicate 60 Hz, 30 Hz, etc.?
Screenshots from PowerStrip would be great.

/Filip fille48@hotmail.com