Status: Not open for further replies.
1 - 15 of 15 Posts

Registered · 11 Posts · Discussion Starter · #1
I'm considering switching my P4 1.8 GHz PC from Windows 2000 with a MyHD 120 card to Fedora Core 3 with one or two pcHDTV HD-3000 cards. I currently connect to a 1080i CRT-type HDTV via component video and to a 6.1 receiver via S/PDIF (either optical or coax). I believe I'll need to add the following hardware:


Sound Card: The MyHD card has its own S/PDIF out, so I'll need a new card to replace that output. On another thread ('44.1khz spdif without resample to 48'), bac522 suggests a cheap card that can be made to work under Linux (Chaintech AV-710, $25 at Newegg). Looks pretty good.
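For reference, getting S/PDIF output from an Envy24HT-based card like the AV-710 under ALSA usually comes down to pointing default playback at the card's IEC958 device. A minimal ~/.asoundrc sketch (the device name and card number are assumptions; check `aplay -l` for your system):

```
# Hypothetical ~/.asoundrc -- route default playback to the S/PDIF output.
# "iec958" and CARD=0 are assumptions; verify with `aplay -l`.
pcm.!default {
    type plug
    slave.pcm "iec958:CARD=0"
}
```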


Video Card: This one's a bit confusing for me. To replace the MyHD's hardware decoding, I was planning on getting a 128 MB card with the nVidia GeForce FX 5200 chipset. This is the best chipset I could find that is still offered in fanless versions. It sounds like I could go with a slower choice (e.g. the 4000), but I'd only save $10 or $15. There are quite a few options to choose from, all at about the same price (e.g. Rosewill, AOpen, Soltek, Albatron, Chaintech, MSI, and Gainward). These all have DVI outputs, which I guess I don't need now but might find useful later. They do seem to vary in maximum refresh rate, though I don't see how that matters since no HDTV refreshes faster than 60 Hz. Some have a 64-bit memory bus and some (more expensive) a 128-bit one. Does this matter in a Linux HTPC? Is 256 MB useful (adds around $20)?


Video Card to Component Interface: Here's where I'm really lost. The MyHD card is configurable so that it puts out 1920x1080i (actually I've been having trouble getting this working, but that's another story), 1280x720p (won't work on my current TV, maybe later), or several 480p choices. You can also choose Y-Pb-Pr or R-G-B (supposedly my current TV will do both, but I've only gotten Y-Pb-Pr working so far). I use a VGA-to-component cable now - would I continue to use the same cable? Some cards come with a DVI-to-VGA adapter - would that signal then be identical to the one from the VGA connector (maybe after configuring the video card)? Is getting the Xorg configuration files just right always a painful tweaking experience? (I read some of the advice in the Fedora Myth Howto, and it sounds like it can get pretty involved if you aren't lucky.)
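For what it's worth, driving 1080i over the VGA port generally comes down to giving X the right modeline and a sync range the set accepts. A sketch of the Monitor section (the sync ranges are guesses, and the modeline is the commonly published ATSC 1080i timing, untested on this particular TV):

```
Section "Monitor"
    Identifier  "HDTV"
    # Sync ranges are guesses -- check the TV's manual before using.
    HorizSync   30-70
    VertRefresh 59-61
    # Standard 1080i timing: 74.25 MHz pixel clock, 2200x1125 total
    ModeLine "1920x1080i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125  Interlace +HSync +VSync
EndSection
```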


Tuner Card: Only one option here - the pcHDTV HD-3000. Hopefully they won't sell out before I can make a decision.


Dara



p.s. If anyone is curious why I want to abandon a functioning Windows system for possibly more tweaking than will be enjoyable, the answer is that I absolutely hate the MyHD card. I can't record and watch a taped program at the same time; the fast forward isn't a fast forward, it's a skip ahead; and scheduled recording has a bug where it won't choose the channel correctly when there are two channels with the same prefix number (e.g. 28-1 and 28-2 in Los Angeles). On top of that, I have to deal with old functionality being lost during upgrades (e.g. you used to be able to play DVDs ripped to your hard drive through this card, but no longer). I don't think any Windows solution is that great right now (ATI looks like it has problems too), and if I'm going to get frustrated, I'd rather be able to get involved with a forum and a community (e.g. MythTV) that looks a lot more interesting than the MyHD non-community.
 

Registered · 471 Posts
I've found the FX 5200 to have worse black detail than my previous Radeon 9100. It has some advantages for HDTV decoding, such as hardware-accelerated motion compensation and iDCT through the XvMC extension, but I think I would opt for an FX 5700 if you go the nVidia route.


Too bad you are in the US. Here in Europe I can enjoy DVB-* through VDR, a truly remarkable program with TiVo-like functionality that can record multiple streams at once from multiple tuner cards, with independent stream playback.
 

Registered · 11 Posts · Discussion Starter · #3
Torgeir,


I am forced to go with nVidia (unless I want to upgrade to a 3 GHz P4, and I'd rather stall on that if I can), so I can't choose ATI. I understand some of the physics behind why black-level performance can be better on one TV type than another, but I don't understand where the video card fits into the equation. Have you noticed this effect in a blind comparison? Can you explain it?


I am willing to spend a bit more for a faster video chip, though plenty of sources on the web say there is absolutely no benefit for just displaying HDTV. The other downside is that none of the more expensive nVidia cards come fanless (on newegg.com anyway, 5700LE or 5700). I bought a PC a few years ago that was designed to be quiet and it still doesn't satisfy me - I don't want to add any noise source unless it's the only solution.


I'm reading more about the video-card-to-component interface issue. I may get stuck for another $100 there unless I can use RGB (or maybe RGBHV, but then I can't switch video using my AV receiver, which is a bit annoying - OK if I use the HTPC to watch DVDs, I guess). My Mitsubishi TV supposedly does both, but I never got RGB to work using the MyHD card, so I'm not optimistic there.


Thanks for the info,


Dara
 

Registered · 471 Posts
I think the black-detail issue is due to the Radeons using ten bits of precision per component when doing scaling and colour-space conversion for the video image. I'll switch back to the Radeon to compare again.
 

Registered · 254 Posts
Dara,


Remember that you can always get a third-party heatsink with heatpipes to get rid of that fan. I have a 5700 Ultra running fanless in my Hush ATX, and the 5700U runs much cooler than the 9800 AIW it replaced (at least in HTPC use; gaming etc. is probably another story).

As you may have read in another thread, I had some trouble with my 5200s running HD material, in that the picture had distortions (I think they were heat-related, but I may be wrong) - but only at 1920x1080.


- Micael
 

Registered · 471 Posts
Well, I put the Radeon 9100 back, and it will stay for the time being. The Radeon indeed has much better colour fidelity and much better black level. It brought back some judder as well, though, so I think I need to upgrade in any case.
 

Registered · 471 Posts
A few nights with the FX 5200 have convinced me that this card is unsuitable for HTPC use. Its scaler is lacking in horizontal resolution and exhibits colour banding. The only positive is that it has good pans with little judder, which I'd attribute to good driver support.


Its XvMC advantage will soon diminish too, as I hear an open-source XvMC implementation for ATI cards is coming RSN.
 

Registered · 254 Posts
Torgeir,

Did you ever test the 5700U? I'd certainly like to hear a second opinion on the pros and cons of this video card.

When it comes to scaling, I have found another problem with the nVidia: downscaling from 1920x1080 can, with some material, render horrible vertical artifacts. However, this may also be Xv. Since ATI's binary driver does not allow 1920x1080, I have not been able to track this down.

Someone mentioned (was it you?) that the ATI driver from xorg had a lot of the GATOS stuff in it. Are you using that (xorg) one?


- Micael
 

Registered · 471 Posts
Yes, I was using a recent binary xorg snapshot for the ATI card. I am curious about trying their proprietary driver, though. I see the X server using about 10% CPU with the ATI driver, while with the nVidia card it uses about 3%; I think this could be the source of some of the judder I'm seeing. I am still interested in trying both a 9600 and a 5700 (does it really have to be an Ultra?) to compare picture quality.


How would you describe the "horrible vertical artifacts"? What I mean by limited horizontal resolution I can best compare to a computer game that doesn't antialias its rendering, causing jagged edges. I see this on faces and other objects.
 

Registered · 10,688 Posts
I've been using an FX 5200 for several months and have had no problems with scaling, color banding or shadow detail when running mplayer through the DVI port. Xine doesn't look very good, but I'm not sure that's related to the card. The XvMC acceleration works very well with HD streams, and I like having one less fan in my system.


On the other hand, Nvidia's drivers have been terrible. Their last two or three driver releases simply don't work with my dual-monitor setup. I'm still using the one they released in April, hoping they'll fix the things they've broken.
 

Registered · 254 Posts
Well, I'm not sure how to describe that downscaling artifact. Not pixelation - it looks more like "block ringing". I guess I cannot describe this ;-)

But it does not happen with all HD material, and with the material that has the problem it happens all the time while the camera is moving - the film is really unwatchable. I have only seen it with 1920x1080 material. It is nothing like (de)interlacing distortions; much bigger.

I'm not too concerned about it, since I really only need to downscale 1920x1080 in my test environment.


scowl, I would guess that, since you are using the DVI port, you have fewer of the quality problems that I have seen on the cheaper boards. (VGA output is really inferior on the 5200 compared to the 5700 Ultra.)


And pivot, I really think the Ultras should be better, since they use a much better PCB (more layers) with a better/cleaner power supply. But as usual, YMMV.


- Micael
 

Registered · 10,688 Posts
Do you see this downscaling artifact with more than one application? I've been noticing a lot of quality differences between xine and mplayer. For example, with XvMC enabled, xine displays the 1080i stream from my WB network station as a bunch of vertical bands 16 pixels wide. I've noticed that xine also has other, more subtle rendering problems that I didn't notice until I compared it with mplayer.


I've always used this board with a VGA monitor hooked up to the VGA port (the DVI goes to an LCD display), and I don't remember seeing anything like what you're describing on that monitor either.
 

Registered · 254 Posts
I have now tried some configurations and found that it has to do with the deinterlacer in combination with downscaling. It seems something goes wrong when you do both.

I tried MPlayer but could not figure out how to enable the deinterlacer; I did get the usual microstutter with it, which I have always seen ;-)

The vertical banding you described I have seen some time ago, but it disappeared along with some upgrade I did - your old nvidia driver, perhaps(?). I have not seen it at all on the HTPC I set up this summer, so it must have been before that.
 

Registered · 10,688 Posts
I had a feeling it was software.


Mplayer doesn't have deinterlacing with XvMC, at least not in the version I have. I changed the flag in the call to XvMCPutSurface to XVMC_TOP_FIELD, which makes it do cheap one-field deinterlacing. I haven't had any stuttering with mplayer since I increased my kernel's timer frequency to 1000 Hz; this made mplayer run as smoothly as my STB. Xine still drops frames and goes screwy sometimes, but I improved that by adding a call to XvMCFlushSurface in xvmc_display_frame (mplayer already did this). I would get the timer-frequency change for free if I could upgrade to a 2.6 kernel, but I can't because Nvidia's 2.6-compatible drivers won't work with my two-display setup.
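For anyone wanting to try the same one-field trick, the change amounts to swapping the flags argument of XvMCPutSurface. A sketch of the call (the variable names are made-up stand-ins; the real call site is in mplayer's libvo/vo_xvmc.c and will differ by version):

```c
/* Sketch only -- identifiers here are hypothetical, not mplayer's own.
 * XVMC_FRAME_PICTURE weaves both fields together; XVMC_TOP_FIELD displays
 * just the top field, which the hardware scales to the full window:
 * cheap bob-style one-field deinterlacing. */
XvMCPutSurface(display, surface, window,
               0, 0, src_w, src_h,     /* source rectangle      */
               0, 0, dst_w, dst_h,     /* destination rectangle */
               XVMC_TOP_FIELD);        /* was XVMC_FRAME_PICTURE */
```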


I guess my old driver could be causing the banding, except there's no trace of it when I run mplayer, which calls the same XvMC functions that xine calls.
 