
New HD graphics for Old Core2 Duo 2.1GHz HTPC?

#1 · (Edited)
EDIT: As of today, I installed the Ubuntu 14.04 packaged driver for the NVIDIA GT 730 and it is now working fine with mplayer. I just checked the NVIDIA web site and found documentation indicating that the latest version packaged by Canonical (nvidia-331-updates) now supports the GT 730.
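For anyone following the same route, this is roughly the packaged-driver procedure (a minimal sketch; package names are from the Ubuntu 14.04 repositories, and vdpauinfo is an extra package I use just for verification):

Code:
# install the Canonical-packaged driver plus a VDPAU query tool
sudo apt-get update
sudo apt-get install nvidia-331-updates vdpauinfo
# reboot, then confirm the NVIDIA VDPAU backend is visible
vdpauinfo | head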

I have not yet gotten 7.1 audio working but 5.1 is fine.

The vanilla driver on the NVIDIA web site was causing some weirdness with Firefox and Flash, and I could not get DKMS to play nice with it either. Just about every OS update broke the graphics driver and it had to be re-installed every time.

http://www.nvidia.com/object/linux-display-amd64-331.113-driver.html

Not so much activity on the ATI graphics. The system is still working fine with the Nouveau driver and Oibaf PPA but I never did get VA-API or HDMI audio working with it.

EDIT: Many thanks to the kind forum members who helped me through this learning curve configuring Linux graphics for hardware video stream (bluray etc) decoding.

There are numerous technical details to work out, and the process is not straightforward for anyone unfamiliar with custom-installing Linux graphics drivers, or with enabling hardware decoding in the various players, each of which requires its own command-line parameters, configuration-file parameters, and output device settings.

Status of my experiments on Ubuntu 14.04 so far:

EVGA Geforce GT 730
- with latest proprietary NVIDIA driver tarball and its bundled VDPAU driver:
vdpau, va-api, digital audio passthrough working
- with open-source oibaf ppa and its mesa-vdpau-drivers:
untested yet (the Nouveau driver has difficulty with this new model)

XFX Radeon HD 6570
- with proprietary AMD driver:
unknown (my testing might have been compromised by my errors)
- with open-source oibaf ppa and its mesa-vdpau-drivers:
vdpau working
va-api and digital audio passthrough not working

flash hardware decoding:
- untested yet

Ubuntu 12.04:
- hardware decoding (if implemented at all) worked poorly

Hardware-decoding CPU load on the GT 730 with the proprietary driver is substantially better on Ubuntu 14.04 than on 12.04. With 12.04, transitions in and out of fullscreen caused multiple players to freeze or crash. Open-source support for hardware decoding is also nonexistent on 12.04, so I do not recommend 12.04.

Somewhere around post #90 I finally got the GT 730 with the latest NVIDIA driver doing hardware video decoding.

In the next couple of posts after #90 I got the XFX Radeon HD 6570 with the oibaf PPA doing hardware video decoding, through VDPAU only.

I have not attempted to get hardware accelerated flash working yet.

If I get more working, I will append another post and update the status here. I am still finalizing the configuration of my HTPCs.

Here are links to drivers and instructions.

Open-source driver hardware decoding:

Ubuntu will not enable open-source VDPAU support in their 14.04 repository and livecd disk image:

Ubuntu Will Not Enable Open-Source VDPAU Support
http://www.phoronix.com/scan.php?page=news_item&px=MTYwNzU

so you need the oibaf PPA:

Updated and Optimized Open Graphics Drivers (oibaf)
https://launchpad.net/~oibaf/+archive/ubuntu/graphics-drivers
Using the VDPAU driver
http://wiki.cchtml.com/index.php/Ubuntu_Trusty_Installation_Guide#Using_the_VDPAU_driver
AMD Radeon VDPAU Video Performance With Gallium3D
http://www.phoronix.com/scan.php?page=article&item=amd_gallium3d_vdpau&num=1

Note that to get VDPAU working with the Radeon HD 6570 I had to install the oibaf PPA, run a full update, and then add the mesa-vdpau-drivers package (a sketch of the commands follows). I did not get VA-API or digital audio pass-through working.
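This is the sequence that worked for me on 14.04, from memory (the PPA name matches the Launchpad link above):

Code:
sudo add-apt-repository ppa:oibaf/graphics-drivers
sudo apt-get update
sudo apt-get dist-upgrade                # pulls in the updated Mesa stack
sudo apt-get install mesa-vdpau-drivers  # the Gallium3D VDPAU driver
# reboot, then verify the radeon VDPAU driver is found
vdpauinfo | head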

Enable Open Source UVD On Fedora 18
http://www.liangsuilong.info/?p=1609

I have not tried Fedora.

xorg-edgers fresh X crack
https://launchpad.net/~xorg-edgers/+archive/ubuntu/ppa

At that point in my learning curve I still did not understand how to force various applications to use hardware decoding. The functionality of hardware decoding with this driver is unknown to me.

Closed-source driver hardware decoding:

More Detailed Installation Instructions
https://help.ubuntu.com/community/BinaryDriverHowto#More_Detailed_Installation_Instructions

BinaryDriverHowto/AMD
https://help.ubuntu.com/community/BinaryDriverHowto/AMD
AMD Catalyst™ Driver
http://support.amd.com/en-us/download/desktop?os=Linux+x86
Unofficial Wiki for the AMD Linux Driver
http://wiki.cchtml.com/index.php/Main_Page
Hardware Video Decode Acceleration (EXPERIMENTAL)
http://wiki.cchtml.com/index.php/Ubuntu_Trusty_Installation_Guide#Hardware_Video_Decode_Acceleration_.28EXPERIMENTAL.29

My experiments with the Radeon proprietary driver were uninformed as to codec selection in the player, so the hardware-decoding conclusions are suspect. I will attempt to repeat the experiments, append a post, and update here.

BinaryDriverHowto/Nvidia
https://help.ubuntu.com/community/BinaryDriverHowto/Nvidia
Unix Driver Archive
http://www.nvidia.com/object/unix.html
NVIDIA VDPAU Performance Metrics On Ubuntu 14.04 Linux
http://www.phoronix.com/scan.php?page=article&item=nvidia_vdpau_metrics&num=1

Note that to get NVIDIA VDPAU working (for mpv, mplayer, MythTV...) I had to install the latest driver from the NVIDIA web site (one that supports the recently released GT 730), which requires running the install script from the recovery root shell prompt; install the bundled VDPAU driver too; then blacklist the nouveau driver; and then install the vdpau-va-driver package from the repositories as well, to get the VA-API frontend to the VDPAU decoder working (primarily for VLC). A sketch of the procedure follows.
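Roughly, the steps look like this (a sketch reconstructed from what I did; the installer file name matches the 331.113 link above, so adjust for your download):

Code:
# from the recovery root shell, with X stopped:
sh NVIDIA-Linux-x86_64-331.113.run    # accept the bundled VDPAU/OpenGL libraries

# keep nouveau from grabbing the card at boot
echo "blacklist nouveau" >> /etc/modprobe.d/blacklist-nouveau.conf
update-initramfs -u

# after rebooting into the normal system, add the VA-API wrapper for VLC
sudo apt-get install vdpau-va-driver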

If you also need to get VDPAU working with an NVIDIA GT 730 or another recent card, check the posts leading up to post #90 to see how I did it. I had to dig deep to figure out how to correctly install the NVIDIA driver straight from the NVIDIA tarball, and the instructions in the README were incorrect.

Superficially, the performance of Geforce and Radeon VDPAU decoding was similar in my experiments, but I was too busy learning how to configure them to make any methodical comparisons. Maybe later.

---

I have two 2.1GHz Core2 Duo machines, one Intel and one ABIT, each with PCI Express 1.0 (two x16 slots on the ABIT), ancient graphics adapters with no hardware codecs or acceleration under Linux, 8GB of memory, and gobs of disk. I use them (and am upgrading them) for HD home theater, both as an NFS server and as clients for playback in the living room and bedroom.

I know there are tons of smart tightwads like me reading these posts with their own Rube Goldberg HTPCs that work great. Someone somewhere has already conquered this HD mountain on a shoestring.:)

So, does anyone know of an inexpensive graphics card that will reliably decode 1080p in hardware in a 1.0 slot with Linux kernel 3.2 or later (Ubuntu 12.04, intending to upgrade to 14.04 once the hardware is stable)? My primary video applications are MythTV and VLC, although I am interested in trying XBMC someday, and I occasionally need an mplayer-based standalone player for formats I cannot figure out how to play in my other two video players.

Gaming acceleration is welcome for future reference if it comes along for the ride but not required at this time since I have no plans to game.

Ideally, the replacement graphics adapters would be used or no-name, very affordable or free, and referenced somewhere on e.g. linuxtv documenting PCIExpress 1.0 slot compatibility plus the availability and reliability of the hardware codecs, or with a personal recommendation and verification of compatibility from someone who has already done this.

I need an expert opinion or at least some tips on how to develop one myself.

Ideas? Thanks.

More info for the masochistic:

These machines currently cannot handle bluray HD in either software or hardware decoding.

I also have some issues with MythTV playback because neither graphics card has Linux drivers that enable any hardware acceleration. I can barely get 1080i running and must let the TV do the de-interlacing.

I searched AVS forum several times and cannot find anything useful on selecting an inexpensive graphics card that will play bluray etc. in hardware decoding. Even on NewEgg, the emphasis is on bus speeds, processors, and gaming rather than on what codecs are implemented and how well they work under Linux. It seems no one cares now that the average CPU handles HD streaming in software anyway.

Also, from looking at the pictures on NewEgg, it seems that newer cards might need an additional power connector at the end of the PCIe slot near the case penetration, which does not exist on my v1.0 backplanes. The Intel power supply has no 6-pin graphics power connector either, just 4-pin, although I could maybe swap power supplies, since the ABIT board has a separate Molex power connector for the PCI backplane anyway. I am not expecting bluray playback to draw down the city electrical grid, am I?

I have no idea what limitations using an older model graphics adapter with hardware codecs would impose on my viewing experience, or why. Neither do I understand any incompatibilities with newer models plugged into PCIE 1.0 or power supply issues. It is all incomprehensible to me at this point. I only know that my ideal solution is zero cost or as close as possible since these machines are a decade old.

My other option is to just network through the bluray players for HD but that leaves me with having to switch the receiver, makes navigation a pain, limits multitasking to the tiny PC monitor, and does not help with MythTV, unless I can also use bluray for that?

I only have 2 bluray players but both seem network capable. Anyone have any experience using Sony bluray players as streaming remote graphics adapters for your network-attached media servers? NFS seems not to work with them. I read something about Samba somewhere to get Windoze compatibility for bluray players and set-top boxes?

Tight budget constraints are in play here. That is why I even bother asking; otherwise I would just buy a new machine or two. Using the bluray players is my backup plan if I cannot find a solution 'inside the box'.

I considered upgrading the main boards but then I need microprocessors too plus ram and maybe case and power supply. The cost quickly adds up to equal or exceed the cost of a new graphics adapter.

I am OK with needing to tweak software to work with old hardware, but less OK with needing to drain substantial ca$h from my bank account for HD television.

Thanks for your patience reading such a long post.
 
#6 ·
Thank you for the suggestion Lex.

The picture of that model on NewEgg shows a tab on the card-edge connector near the external connectors. There is no corresponding receptacle on my motherboards at that end of the PCIe socket. Will that pose a problem? Or is it just for a power connection that can be made directly with a cable instead?

My GeForce 7300GS has the same tab, but there are no traces on it, and it just hangs in midair where the socket stops.

The price is higher than I would like, but maybe I can find one used. Thanks for the suggestion.
 
#4 ·
Get a Broadcom Crystal HD card. I got one for a Dell Studio Hybrid with a dual-core Core2 CPU, and now it decodes 1080p with no problems and plays BDs, since I replaced the DVD player with a BD player. They cost next to nothing on eBay and come in various configurations. It's even directly supported in several media players.
 
#11 · (Edited)
I haven't gotten it to work properly yet. I open up a page with Flash and the CPU immediately starts to ramp up. As far as I know, you have to manually set flags to get it to work, but that's still experimental, I think. I never did get the flags to work properly.

To be clear, Chrome uses its own Flash plugin called Pepper Flash. It works, mostly; I still run into sites that try to make me download Adobe Flash. On any system I work on, though, Adobe takes over if you have both installed. So Chrome is trying to get Pepper Flash to use acceleration within the browser, not to get Adobe Flash to use acceleration. At least, that's what I think... take it for what it's worth.

I'm now using the latest Chrome Beta which has the HTML5 DRM enabled in it for Netflix. It's awesome!!! Netflix now plays natively without any emulation (WINE/Silverlight).

That being said, I haven't dug into the latest Chrome version to see if hardware acceleration for Flash is implemented yet and more importantly, working. I'll futz with it over the next few days and see.

Ultimately, I'm just extremely happy that I have my MythTV and Netflix all working on one interface with no funky work arounds. I'm kind of at a point of "If it ain't broke, don't fix it". That'll change soon, as my real motto is "If it ain't broke, fix it till it is".

Linux really needs to come up with its own Flash variant that WORKS. Of course the best solution is for Flash to just go away, quickly and quietly into the night. With Netflix moving to HTML5, there's going to be more exposure to the fact that you can do what Flash does without bowing down to proprietary strangleholds.
 
#13 ·
After reading up on graphics cards for a few days I have tentatively settled on the following GT 730 models:

http://www.evga.com/products/Specs/GPU.aspx?pn=8f4503ba-a8c4-401d-9406-b3c1cc4c4824

http://www.evga.com/products/Specs/GPU.aspx?pn=07db90d8-e448-4d74-9c9e-bfd0b111d295

The reason for settling on these two is that they are both PCIe 2.0 and are the latest low-end offering with VDPAU feature set D. I suspect my ancient motherboards cannot handle the 128b/130b bus encoding of PCIe 3.0, and the vendors stopped releasing firmware updates several years ago. Rather than tempt fate, I decided to just stick with PCIe 2.0.

Any suggestions on what the major difference is (besides form factor and 20% more memory bandwidth) between a card with 96 CUDA cores and one with 384?

I am also considering a used EVGA GTX 560 SC
http://www.evga.com/Products/Specs/GPU.aspx?pn=FC6EE9E2-97D8-45F6-BB58-590D7E58FDFD

It seems a much higher-performing card (~2x) judging by the FLOPS...

...but it also has VDPAU feature set C instead of D, and that may be a more important criterion than speed for my home theater application. Besides, it draws 150W instead of 38W, which might lead to power supply issues since my supplies are only 500W and I am running 6-7 hard disks in each case.

I am placing high value on feature set D since I will be using these machines almost entirely as home theater PCs and the CPUs definitely cannot do software decoding beyond mpeg2.

So, what do the experts think? Or am I over-analyzing this?
 
#14 ·
Quote:
...but it also has VDPAU feature set C instead of D, and that may be a more important criterion than speed for my home theater application. Besides, it draws 150W instead of 38W, which might lead to power supply issues since my supplies are only 500W and I am running 6-7 hard disks in each case.

I am placing high value on feature set D since I will be using these machines almost entirely as home theater PCs and the CPUs definitely cannot do software decoding beyond mpeg2.

So, what do the experts think? Or am I over-analyzing this?
The only difference between feature sets C and D is support for decoding H.264 at resolutions up to 4032 × 4080 and MPEG-1/MPEG-2 at up to 4032 × 4048 pixels. In a standard HTPC setup, resolutions that high are not needed.

Go with the card that will give you the most power to decode and don't worry about resolution.
 
#15 ·
I suppose I was thinking that multi-monitor desktop size was somehow involved in feature set D, but in retrospect decoding resolution has no effect on multi-monitor desktop size. Memory is probably the limiting factor for desktop size, or scaler resolution, or something like that, not the decoder. I suppose only the latest games use real-time decoding across multi-monitor displays for their 'back story' video clips or some such, and they probably need more CPU horsepower too. Maybe eventually home theater in general will benefit from 4K, but I will probably never need it for my frankly low-end setup. It seems it is just another gaming feature for Joe Consumer.

OK, thanks for the input. All in all I am still leaning toward a new low-end card with feature set D (and some processing horsepower for transcoding), just in case some titles are eventually released in a 4K 'bluray' with H.264 extensions that will not play on older cards. But some of the deals on used eBay cards seem too good to pass up, especially the 210 that can be had for ~$15. I could just leave that with the server and find something else for gaming in my second childhood (drool) if need be, rather than spend $80 on something only marginally better and waste it in a server that cannot handle any complexity while I wait for new titles in higher resolution.

This has been a learning experience. Thanks for the help. I really have not been paying attention to computer tech for a while and am sort of out of touch.

The last consideration I am looking at is whether Handbrake transcoding benefits from CUDA cores, memory size, clock speed, memory bandwidth, etc. I suppose it is time to check the Handbrake website and find out if they have any recommendations, or if they even use CUDA cores for transcoding.

I do transcode things to shrink them down, especially the video that my old WinTV capture card outputs as mpeg2, and DVDs that look basically the same transcoded as not. I guess the original consideration for choosing inefficient mpeg2 encoding was ease of decoding on a dog of a machine (like a DVD player), not efficiency.

My x264-transcoded video stutters and tears on this Core2 Duo, especially when I use advanced compression features to encode at a visually 'lossless' quality of 18, and the DivX and Vorbis transcoder results just do not look as nice -- there is a 'grain' to the look and the files are bigger.

I suppose I could also just start with the used 210 and try something 'better' if it does not work out the way I need. For $15 the price is right; a restocking fee at NewEgg costs that much, and then I would have nothing at all to show for my trouble and money. At least a used 210 is something useful. I can rationalize it as an experiment worth buying.

Anyway I thank you all for your kind input. I was totally lost when I posted originally. Thank you for the education. I will check back once I have the cards upgraded and let you know how it worked out.

Ciao, everyone!
 
#16 ·
@CherylJosie:

Handbrake is basically just a frontend for ffmpeg, and AFAIK ffmpeg hasn't attempted to tap into CUDA. So transcoding is all handled by the CPU. At least it can utilize multiple cores.
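For what it's worth, a CPU-only transcode from the command line looks something like this (a sketch; HandBrakeCLI is the command-line build of HandBrake, and x264 spreads the work across both cores on its own):

Code:
HandBrakeCLI -i input.mkv -o output.mp4 -e x264 -q 20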

I'd definitely go with the 210 for $15. I think that you'll be pleasantly surprised with the video quality from VDPAU on such an inexpensive card.
 
#17 ·
Well, I just bought an EVGA Geforce 210 for $14.99 on eBay. It is a bare card, allegedly tested but never used, with a fanless heat sink and free shipping. A decent lunch out costs more these days.

I am going to give it a try and see how it goes before I commit to a second one. EVGA cards of all sorts of models are all over eBay.

One advantage of this Geforce 210 is that it can put the audio through HDMI. I have nothing in the Intel machine that will output digital multichannel.

The built-in optical SPDIF on the Intel motherboard is stereo only, and the EMU 1212m (with RCA SPDIF that can handle master audio) is not showing up in the sound settings control panel. I see others online with similar issues on Ubuntu 12.04.

Audacity can still use it, but PulseAudio is not recognizing it. I have to use Audacity's play-through function to route the sound in and out of the 1212m, and that is still stereo because Audacity does not seem to do multichannel either.

As far as Handbrake goes, it does not actually run ffmpeg as far as I know (the ffmpeg command-line tool was deprecated at the time), but rather builds its own executable against the ffmpeg/avconv library functions. I do not recall Handbrake spawning child processes.

I see some mention of hardware acceleration with Handbrake, but it seems it might still be beta. Their web site did not have much info. In any case, it seems that support for hardware acceleration in Handbrake is minimal, and maybe limited to the decoding half of the process -- in Willy World (Microsloth) only.

https://trac.handbrake.fr/wiki/GPUAcceleration
https://trac.handbrake.fr/wiki/HardwareAcceleration
https://trac.handbrake.fr/milestone/OpenCL%20Beta
http://www.anandtech.com/show/5835/testing-opencl-accelerated-handbrakex264-with-amds-trinity-apu

The short version is, now I am confused again. I have no idea what the difference is between Handbrake using hardware acceleration for decoding vs. for encoding, unless it means that Handbrake can both parse the input and write the output via the GPU.

I still have no idea what QuickSync is or how to use it but it seems that it might only work under Windows so the majority of the acceleration might not be available in Linux.

I guess I will have to wait and see how it works with the 210. There is supposed to be a new check box section in the user interface for OpenCL scaling and hardware decoding. If it shows up, I guess it is supported. Kind of a tough way to figure out if it is working...

I could really use the acceleration. It takes days to transcode a bluray on a core2 duo. Even a DVD takes a long time. That was one reason for adding drive space and just plain ripping instead of squeezing. Even if I use my Dell i7 laptop it still takes a long time and that thing goes into thermal shutdown unless I point a desk fan at it.
 
#18 ·
So here I find a comprehensive explanation of hardware accelerated encoding, at least as it relates to a subset of the available hardware and software anyway:

http://techreport.com/review/23324/a-look-at-hardware-video-transcoding-on-the-pc

It looks as if hardware acceleration in Handbrake is minimal (probably due to the complexity of parallelizing the tasks), but it is there, for what it is worth, using only the generalized shaders (CUDA) rather than the 'black box' encoder packaged in the graphics card. A GPU is probably not going to make much of an improvement on a Core2, compared to an equal investment in CPU, for the only application that really matters to me (Handbrake).

Quality in the competing proprietary applications is awful using either the CPU or the GPU black box. Even from 2 meters away I can see a marked difference on a 22" monitor between the original and the CPU-only transcode. The output looks terrible.

Interestingly enough, the best and most consistent output is from an open-source application encoding 'out of the black box'.. sort of like the best virus immunity and security and online free tech support and database and web server and file server and kernel of... what is it called again? L-something? Linus? Liberalism? Socialism? Cancer?

Can you believe they actually called it cancer? That is like calling a lush green forest and meadow cancer. Linux is organic. Better they just call it good ****.

If this article is up to date, it looks like quality hardware acceleration of transcoders in general is practically nonexistent on the PC at this time, despite a huge pile of graphics processing resources sitting there like an appendix (or is it a bud?) in the graphics card.

I guess I will wait until acceleration is working better and then consider a higher-performance card if I need one for that. Looks like I might need a more reliable CPU for transcoding, or lots, and lots, and lots... of... patienceeeeee...

You were right, I should get a modest feature set C card that supports HDMI, like the Geforce 210, since none of the rest of the card is going to help out a home theater PC at this time anyway. It is all strictly for games and maybe business.

Problem solved (on paper anyway). Looks like I made the right choice.
 
#19 ·
thanks for the info on handbrake. i thought ffmpeg was deprecated but wasn't sure -- drives are cheap enough that i haven't bothered with transcoding anything for a few years now. i agree that utilizing cuda for transcoding would be nice, but that's probably a lot easier said than done.
 
#20 ·
Well the fanless EVGA Geforce 210 arrived yesterday.

Fanless is an ironic description.

The HDMI audio has the same limitations as the optical SPDIF stream: the SPDIF selector must be set in VLC's ALSA output module or the output is a stereo downmix.
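For anyone checking the same thing, the ALSA device names can be listed and the HDMI output exercised directly (a sketch; the card/device name below is an example, substitute whatever aplay reports):

Code:
aplay -l                                       # find the HDMI playback device
speaker-test -D hdmi:CARD=NVidia,DEV=0 -c 6    # 6-channel test tone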

The hardware video decoding tears and glitches all over the place with two screens running, even playing a DVD.

With one screen a DVD is almost watchable but still glitches. It is only slightly better in this respect than my Geforce 7300 GS.

The HDMI video decoding barely works at 720p. It stutters and judders constantly even with vertical refresh sync disabled.

In 720p the frame rate is limited to 60Hz.

In 1080p there is 24/30/60Hz. All of them stutter and judder when decoding bluray.

I noticed that my Geforce 7300GS worked better under Windows. Possibly this Geforce 210 is the same.

I suspect that those having good results with this card are not depending on the hardware decoder. I suspect it does not have the necessary bandwidth, either in the decoder block itself or in the GPU/memory path, depending on which one the decoder relies on.

Otherwise there is some basic flaw in my setup that I have failed to detect, such as some sort of driver issue or hardware bandwidth problem in the PCIE 1.0 graphics slot.

I spent hours trying to get it working. Maybe the card is damaged, but somehow I doubt it. It works exactly like I have come to expect a weak card to work.

The NVIDIA X configuration utility has no scroll bar, and once displayed at 720p it must be moved with Alt-F7 to reach the 'Apply' button and set the display back to 1080p.

With Nouveau drivers the hardware decoded video tears, stutters, and judders all over, and the HDMI audio is nonfunctional.

I tried KPlayer, KMPlayer, and SMPlayer; all have issues with the audio. They also seem to have shorter but more frequent stutters. Possibly this is because they are not sending audio through HDMI directly but through Pulse instead. None of them seems to do 5.1, even when HDMI output is selected in the application (if it allows that).
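For reference, this is the kind of mplayer invocation that bypasses Pulse and uses the VDPAU path directly (a sketch; the trailing comma in -vc lets mplayer fall back to software decoding, and the ALSA device name is an assumption for my HDMI port):

Code:
mplayer -vo vdpau -vc ffmpeg12vdpau,ffh264vdpau, -ao alsa:device=hdmi movie.mkv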

I think I might get a GT 730 instead and try that. If that misbehaves, there might be a problem with the motherboard or CPU speed. I should try it in the ABIT as well as this Intel machine.

This is going to be a real bummer if I have to upgrade the motherboard to get the bandwidth out of the PCIE. My processors are not capable of version 2.0, according to Wikipedia. I will need memory and CPU if I do that.

So, based on what I just wrote, any suggestions now? Should I try the GT 730? Which one?

http://www.evga.com/products/ProductList.aspx?type=0&family=GeForce+700+Series+Family&chipset=GT+730

I am leaning toward P/N 01G-P3-3731-KR because it has the best mix of core and memory speed. Not sure if that matters, but I would rather play it safe if I have to buy new... the 2GB version is not out yet, so I would save $10 for half the memory...

http://www.evga.com/Products/Product.aspx?pn=01G-P3-3731-KR

Anybody?
 
#24 ·
Quote:
Well the fanless EVGA Geforce 210 arrived yesterday.

Anybody?
Give the official nvidia drivers a try and make sure the VDPAU package (maybe libvdpau) is also installed. For a very long time I was using an 8400 GTX (I think that was the one... ancient) with no issues on a low-power AMD chipset.

I do remember something about the 2XX-level cards having some HDMI audio issues (lack of full support), but other than that they were very good cards for an HTPC.
 
#21 ·
I see from Wikipedia that the maximum PCIe 1.0 x16 transfer rate is 4GB/s, or 32Gb/s.

Meanwhile, the data rate reported by VLC when playing bluray peaks at ~44Gb/s.

Looks like it could indeed be pcie1.0 limiting the performance. This would explain all the dropped frames.
 
#22 ·
NVIDIA X configuration utility reports GPU utilization less than 20% even with bluray. PCIE utilization is pegged at 0. Possibly they thought it not worth their while to implement that measurement for PCIE1.0.

I suppose the next step is to start shopping for a motherboard and CPU and memory.

Note to self: Forget about pcie 1.0.
 
#23 · (Edited)
Is the nouveau driver loading? There is an in-kernel driver for nVidia cards named "nouveau". It is a horrible video driver, and not suitable for an HTPC. Run this command in a terminal; it will look for that module and print nothing if it isn't loaded:

Code:
lsmod | grep nouveau
Also to see if the nVidia driver is loading:

Code:
lsmod | grep nvidia
EDIT:

I see that you did mention using the nouveau driver. You must use the driver provided by nVidia. I don't recall which Linux distribution you are using, but it usually can be installed through your package manager.
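A quick way to confirm which kernel driver actually has the card, and which OpenGL stack is active (a sketch; glxinfo comes from the mesa-utils package):

Code:
lspci -k | grep -A 3 VGA       # look for the 'Kernel driver in use:' line
glxinfo | grep -i renderer     # should report NVIDIA, not nouveau/llvmpipe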
 
#25 · (Edited)
Quote:
EDIT: It appears from research and testing that PCIE 1.0 is inadequate for bluray playback (above 4GB/s), and for some reason the Geforce 210 is having difficulty even with dvd playback (~1GB/s) on pcie1.0 so it looks like the intent of this thread is a non-starter. PCIE1.0 is not likely to work.
LOL what? I dunno what research and testing you've been doing but that is completely false. PCIe 1.0 is more than fast enough for all of that. Unless Linux has some kind of limitations, PCIe most certainly does not. If you were talking about an x1 slot then yeah sure but an x16 slot, 1.0 or not is enough for almost everything except perhaps the very fastest videocards of today.

I run an HD 7950 on a PCIe 1.1 (x16) slot and there are no issues with bandwidth, and that is in gaming, not just simple BD playback. My framerate in most games, while more than enough for the games I'm playing, is held back more by my CPU, a Xeon X3360 (C2Q 9550 eqv.) @ 3.6Ghz, than by the PCIe slot "limitation". Note that I've also had an HD 4830 and an HD 6750 (with CPUs from a C2D E6600 to an E8400, to the CPU I have now) in the same exact slot previously, and have never had a problem with BD playback or anything else related to PCIe bandwidth limitations.

PCIe scaling and limitation has been tested many times by sites like Tom's Hardware, and the difference between PCIe 1.x, 2.x, and 3.0 for single graphics cards is essentially meaningless (a 1-5% difference). It only begins to matter when you have dual-GPU video cards or SLI/Crossfire configurations. And even then, PCIe 2.0 is fast enough for the highest-end cards of today in dual SLI/Crossfire; i.e., 1.x would not be enough, but 3.0 is overkill for what even two GPUs can do these days. Unless you're running four video cards, 3.0 is not necessary.

The notion that PCIe 1.x is not enough for simple BD playback, is rather comical.
 
#26 · (Edited)
Quote:
I run an HD 7950...
'nough said

Thanks for your insight and opinion, but your experience with AMD/Radeon under Windows doesn't apply to Linux. AMD/Radeon has HORRIBLE Linux drivers, while NVIDIA has excellent Linux drivers. No one who has a Linux HTPC would ever use a Radeon card.

CherylJosie seemed to be using the Linux open-source nvidia driver, nouveau. It doesn't support hardware acceleration (not that I am aware of).

Also, I really don't know if the PCIe 1.0 is different in Linux, maybe some other members can help on this.
 
#28 ·
Yes, I made a mistake. I misread the number that VLC reports for bitrate. VLC is reporting spikes over 40,000Kbps, not 40,000Mbps.

I suppose a sanity check would have been in order: the 4GB/s limit of PCIe 1.0 would be fast enough to transfer an entire bluray in about 10 seconds. Mea culpa for troubleshooting while blinded by fatigue.

At 32Gb/s, PCIe 1.0 x16 should handle roughly 1000x the bitrate of the bluray stream I used for testing, on average (40,000Kb/s is 40Mb/s, or 0.005GB/s). Foot, mouth, insert, chew vigorously. Pie, humble, eat. Etc.:eek:

There is more. It appears that I have not actually tested the hardware acceleration on this card until tonight.

Of the four drivers that Jockey located for the Geforce 210, only version '331-updates' actually seems to use hardware acceleration with VLC. Version '331' (recommended, according to Jockey) will not, and neither will '304' or '304-updates', according to my testing. Bluray playback freezes with all but that one driver.

To make matters worse, version 331-updates only seems to use hardware acceleration at 60fps in 1920x1080, 1360x768, and 1280x720. All other resolution/fps modes appear to lack GPU acceleration.

1360x768 seems to have the smoothest output (the aspect ratio seems OK too at first glance), so maybe that was the resolution the Linux driver's GPU acceleration was actually developed at, and the other two might have been hasty ports of the original solution.

I did stumble across a bug in the nvidia-settings application, version '331' (the one automatically installed by Jockey). After installing and removing drivers several times, nvidia-settings began crashing on 'Apply', making it worthless. It also started reporting 1080p 60/50fps settings for driver version 331-updates instead of 1080p 60/30/24fps settings.

I checked online and discovered a bug report about the 'apply' crash and a recommendation to downgrade nvidia-settings to version 304, which I did immediately and locked it.

There were more quirks causing issues too.

I originally had VLC's 'Accelerated video output (overlay)' setting turned on (it is on by default). It is located on the 'Video' tab of Show settings: Simple, and its popup info bubble says it is for 'direct video rendering'. The settings tool calls this 'hardware acceleration', but it is apparently referring to a DMA controller of some sort (maybe on the video card), not a decoder or anything else I recognize as 'acceleration'.

I see no noticeable benefit from overlay in my test case, and a net detriment: direct rendering seems to take precedence over GPU acceleration in VLC's settings interface, and I had to turn off overlay in order to get hardware decoder acceleration working.

I also had to locate the VLC setting for 'GPU acceleration' with a Google search, because I did not even know where it was.

Here is the one setting I had to turn on to enable acceleration in VLC settings, the 'Use GPU accelerated decoding' bit on the 'inputs & codecs' tab (under show settings: simple):

https://wiki.videolan.org/VLC_GPU_Decoding/
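To verify that VLC is actually handing frames to VDPAU rather than silently falling back to software, libvdpau's trace facility can help (a sketch; VDPAU_TRACE is libvdpau's own debug variable, and decode calls show up as vdp_decoder functions):

Code:
VDPAU_TRACE=1 vlc video.mkv 2>&1 | grep -i vdp_decoder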

After playing with this card for a few days now it looks as if the hardware acceleration is finally on, but it is a mess.

So now the bluray freezing is gone, but stuttering remains, and so do the jaggies, on both DVD (24fps progressive mpeg2 distribution media) and broadcast 480i/720p/1080i captured to mpeg2 in hardware. Blu-ray seems to have some minor jaggies too, although not as bad. Maybe they are just as bad, but the higher resolution hides them by making them smaller.

Additionally, non-tearing video is STILL limited to one screen! When the analog port is on, the gpu acceleration works but the bluray/dvd video tears something awful just like it does without acceleration.

There is one additional jolly point. Now that I have installed and removed all four proprietary drivers twice, the 'mouse magnet' (border between displays where the popup menu might be triggered) has become an absolute mouse pointer trap. The mouse gets trapped on the secondary screen and the launcher hides permanently. Only alt-tab and then alt-f7 lets me select a window on the primary screen for moving, and that frees up the mouse pointer until it gets trapped again.

Apparently something in the desktop got corrupted from activating and deactivating all those drivers repeatedly. I had to reveal the launcher constantly (disable auto-hide) or use just one monitor.

The most galling issue is that even with hardware acceleration off, DVD output is STILL full of jaggies! Fast-moving animation at the start of Doctor Who looks like horizontal corduroy, and the slowly panning space view of Earth in the opening scene shows a blue and white sphere with a serrated profile.

The video still stutters in every single mode, with or without acceleration, even with the speed/quality slider all the way toward speed and adaptive power management off. Syncing to vblank seems to do nothing at all.

nvidia-settings also seems to be missing interlaced video modes for every driver version it comes with. Maybe this indicates that interlacing, de-interlacing, scaling, and frame rate conversion were hastily implemented in the drivers, without comprehensive debugging. Maybe the register settings in the graphics card get set improperly, resulting in corrupted output.

So at this time I still have no idea what the actual problem is, but things are definitely not working properly. It could be a driver issue, a hardware issue, an OS issue, who knows?

It seems unlikely that the OS or hardware is at fault, given the wide disparity in performance with different drivers. The overall picture is one of driver issues.

Maybe none of the drivers for this card are implemented correctly. It would not be the first time I came across an issue with drivers for Linux!

The legendary Nvidia Linux Driver Support (tm) is apparently overstated. My final conclusion, after all this testing, is that the driver is at fault. Until I have a video card with a properly implemented driver for it, I will not be able to play quality video on this machine, period, IMO.

The only questions are, will a newer card perform better and will the latest driver be compatible? It seems like a total crapshoot. My e-mu soundcard and my Geforce 7300gs video card both have lousy drivers.

Is there anyone out there currently running bluray and DVD through GPU acceleration on a Core2 Duo at 2.1GHz or slower, using VLC under Linux, and getting good hardware decoder results?

Which graphics card are you using? Which driver version? Which Linux?

Thanks.
 
#31 ·
I may have missed it, but have you checked this setting in VLC:

Tools->Preferences->Input/Codecs->Hardware-accelerated decoding

Set it to:
Video Decode and Presentation API for Unix (VDPAU)

Save the new settings before exiting. Then, when watching an MPEG video (like a DVD), use the "Video" dropdown menu and try the different deinterlacing algorithms. See if one works better than the others. If you find one that works best, you can set it to be the default:

Tools->Preferences->Video->Deinterlacing->Mode

Also, check to see if the CPU isn't being overloaded when watching a deinterlaced video.
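The same experiment can also be run from the command line, which makes quick A/B comparisons easier (a sketch; the mode names come from VLC's --deinterlace-mode option):

Code:
vlc --deinterlace=1 --deinterlace-mode=yadif video.mpg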
 
#33 ·
The option you suggest is not present in my version of VLC. Instead, there is a check box under 'Codecs' that says 'Use GPU accelerated decoding'. I have noticed little to no benefit from turning it on, and I may have been fooled by random changes in the prolonged 'freezing' (long stutters) as I played with the settings. As I noted above, the CPU utilization reported by 'top' seems to be independent of all 'acceleration' settings in VLC with this Geforce 210.

The Linux and Windows implementations are different. I am using 2.0.8, the latest release in the Ubuntu 12.04 repositories. Later versions of Ubuntu might carry later versions of VLC; unlike Windows, where the latest version of an application usually runs on all currently supported versions of the OS, support for a Linux open-source application is tied to the OS release in most cases, due to lack of manpower to backport fixes and upgrades. So upgrading the OS is usually the best option, next to custom-compiling the application against the older kernel/OS.

I have de-interlacing turned off. I did not even bother trying it out. No sense complicating things until I have evidence that acceleration is working.

So, I suppose my next test is to try Ubuntu 14.04 with this Geforce 210 and see if anything changes. I did the same with my EMU 1212m sound card and nothing improved. Older hardware seems to get 'stuck' at a particular level of support once the developers' free time is burned through and new hardware comes out.

I am probably going to end up with a GT 730 next. At least it is a new card, although more than I wanted to spend to watch TV from a computer.
 
#32 ·
The more I test this Geforce 210, the less convinced I am of its hardware acceleration support on Linux. My 2.1GHz Core2 Duo is underpowered and stutters on bluray playback without acceleration, but upon further testing it also seems to stutter WITH acceleration 'enabled'...

I also just found that phoronix has already published actual testing data that supports my results:

http://www.phoronix.com/scan.php?page=article&item=nvidia_vdpau_metrics&num=2

The testing showed poorest performance with a Geforce GT 220, the lowest-performing model in the testing lineup. I am using a Geforce 210, presumably a more budget-oriented card with less performance and maybe no PureVideo acceleration at all.

The 2x0 cards only reduced peak CPU usage from 14% to 13% (220), or to 11% (240). Presumably then, they only implemented PureVideo version A despite what Wikipedia states. For all I know, the Geforce 210 has little or no hardware acceleration under Linux.

Even the 9500GT, a prior generation entirely, outperformed the 2x0 cards, matching the rest of the entire test suite of several models from the GT 460 up to the GTX 760 that Phoronix tested, all reducing peak CPU load percentage by half. Presumably, all of these cards implement PureVideo feature set C or later.

According to Phoronix testing data, any 2xx card seems like it might be a bad idea for hardware acceleration under Linux and no data at all is reported by Phoronix on the 3xx series.

Finally, when I checked my CPU utilization with and without 'GPU acceleration' turned on in VLC, I saw no noticeable difference. It still seems to use one CPU at 55% and the other at 15% in 'top', although I am skeptical of these numbers given the actual performance I see. The CPU usage must be higher on peaks, and why is only one CPU doing most or all of the decoding anyway? Something is wrong.

Phoronix testing was done under Ubuntu 14.04 with the 337.25 driver. I am using Ubuntu 12.04 with the 331.38 driver.

I am also using a core2duo instead of an i3.

I am using a commercial bluray with higher bitrate than the 'Big Buck Bunny' clip that plays fine on this machine even with no graphics acceleration at all (usually). It does peak at nearly 40Mbps twice, when the scene transitions to the meadow outside the burrow, but I only saw one stutter on one of three tries playing the clip compared to frequent stutters on a commercial bluray.

In every respect, my test case is downgraded from the Phoronix test case in which the worst-performing graphics card delivered minimal peak decoding acceleration, leaving the CPU to cover the peak load -- the portion that is going to cause stuttering.

This is the same sort of question I have been asking about Linux hardware decoding specifically: which modern graphics cards support hardware decoding well, and under which Linux? Apparently Phoronix had the answer for at least a few cards, with direct testing of PureVideo.

Anyway, it now looks as if I might have a better idea which cards might actually work for this application. I should probably consider the GT 730 or higher to ensure up-to-date driver support. I guess that is the next step, unless I can find a great deal on one of the other cards Phoronix tested, or something similar.

So, conventional wisdom aside, it appears that PureVideo is not a slam-dunk on Linux after all.
 
#34 ·
sorry that card didn't work out for you. i'm surprised since the 8500GT that i had performed quite well, so i figured that VDPAU had only gotten better -- and that's been the case with the newer ones that i've used, too.

again, though, i'd at least try it with XBMC before i gave up on it. it usually plays things better than vlc for me.
 
#37 ·
I am wondering about the conventional wisdom on AMD vs. Nvidia streaming support under Linux. Did things change recently?

http://www.phoronix.com/scan.php?page=news_item&px=MTYwNzU

" While the VDPAU state tracker inside Gallium3D has stabilized greatly and can be used by the Nouveau driver as well as the R600/RadeonSI Gallium3D drivers, it will not be enabled. The Video Decode and Presentation API for Unix is widely supported by Linux multimedia software for offloading the video acceleration of popular video formats onto the GPU. This works really well in the open-source world with the AMD Radeon Linux driver stack since last year when they provided open-source support for UVD, the AMD Unified Video Decoder block found on most modern Radeon GPUs."

After checking Wikipedia for which Radeon cards are supported, of course I find that my Radeon in this ABIT machine might not have any hardware streaming support at all.

It is fine that I got the Geforce 210. Part of the reason for doing this upgrade is to see how far I can extend my dollars while learning as much as possible in the process.

Besides, the 7300GS I discarded had jury-rigged fan and capacitor plague. Better it got replaced sooner than later. Call it a $15 fire insurance policy.

So now I have to investigate AMD graphics cards I guess, now that Phoronix claims their streaming decoding is better supported under Ubuntu 14.04. AMD is usually less expensive for similar performance.

The Nvidia cards with streaming decoders, even used on ebay, still seem pricey for such old tech ($25 -> $250 plus S&H) and the power consumption on the cards from older generations is higher too.
 
#40 ·
I am trying to find some utility or diagnostic that will report to me the status of the purevideo hardware decoding, both in the Geforce 210 and the driver.

I installed the Phoronix test suite and it is downloading a video clip of Big Buck Bunny (that I already have... sigh).

Never used XBMC. Wanted to, never got to it. Is it difficult to install and configure? I understand it is sort of like an alternative to MythTV.

I checked the MythTV VDPAU page that blackcat6 pointed me to, and learned about vdpauinfo:
Code:
vdpauinfo
display: :0   screen: 0
API version: 1
Information string: NVIDIA VDPAU Driver Shared Library  331.38  Wed Jan  8 19:13:15 PST 2014

Video surface:

name   width height types
-------------------------------------------
420     4096  4096  NV12 YV12 
422     4096  4096  UYVY YUYV 

Decoder capabilities:

name               level macbs width height
-------------------------------------------
MPEG1                 0  8192  2048  2048
MPEG2_SIMPLE          3  8192  2048  2048
MPEG2_MAIN            3  8192  2048  2048
H264_MAIN            41  8192  2048  2048
H264_HIGH            41  8192  2048  2048
VC1_SIMPLE            1  8190  2048  2048
VC1_MAIN              2  8190  2048  2048
VC1_ADVANCED          4  8190  2048  2048
MPEG4_PART2_SP        3  8192  2048  2048
MPEG4_PART2_ASP       5  8192  2048  2048
DIVX4_QMOBILE         0  8192  2048  2048
DIVX4_MOBILE          0  8192  2048  2048
DIVX4_HOME_THEATER    0  8192  2048  2048
DIVX4_HD_1080P        0  8192  2048  2048
DIVX5_QMOBILE         0  8192  2048  2048
DIVX5_MOBILE          0  8192  2048  2048
DIVX5_HOME_THEATER    0  8192  2048  2048
DIVX5_HD_1080P        0  8192  2048  2048

Output surface:

name              width height nat types
----------------------------------------------------
B8G8R8A8          8192  8192    y  Y8U8V8A8 V8U8Y8A8 
R10G10B10A2       8192  8192    y  Y8U8V8A8 V8U8Y8A8 

Bitmap surface:

name              width height
------------------------------
B8G8R8A8          8192  8192
R8G8B8A8          8192  8192
R10G10B10A2       8192  8192
B10G10R10A2       8192  8192
A8                8192  8192

Video mixer:

feature name                    sup
------------------------------------
DEINTERLACE_TEMPORAL             y
DEINTERLACE_TEMPORAL_SPATIAL     y
INVERSE_TELECINE                 y
NOISE_REDUCTION                  y
SHARPNESS                        y
LUMA_KEY                         y
HIGH QUALITY SCALING - L1        y
HIGH QUALITY SCALING - L2        -
HIGH QUALITY SCALING - L3        -
HIGH QUALITY SCALING - L4        -
HIGH QUALITY SCALING - L5        -
HIGH QUALITY SCALING - L6        -
HIGH QUALITY SCALING - L7        -
HIGH QUALITY SCALING - L8        -
HIGH QUALITY SCALING - L9        -

parameter name                  sup      min      max
-----------------------------------------------------
VIDEO_SURFACE_WIDTH              y         1     4096
VIDEO_SURFACE_HEIGHT             y         1     4096
CHROMA_TYPE                      y  
LAYERS                           y         0        4

attribute name                  sup      min      max
-----------------------------------------------------
BACKGROUND_COLOR                 y  
CSC_MATRIX                       y  
NOISE_REDUCTION_LEVEL            y      0.00     1.00
SHARPNESS_LEVEL                  y     -1.00     1.00
LUMA_KEY_MIN_LUMA                y  
LUMA_KEY_MAX_LUMA                y
This looks very similar to what I see in nvidia-settings. It appears to me that vdpauinfo thinks this system should be able to play bluray fine. So what gives?
 
#43 ·
Installed XBMC. Could not figure out how to play an mkv.

Then a problem I noticed a few times since installing this Geforce 210, and increasingly frequently since then, finally terminated this entire experiment.

The user interface (not just the video playback) has been stuttering intermittently.

Now the Intel machine will not boot at all with the 210 installed. Neither will the ABIT. The card has died completely.

Some people sell broken junk alongside 'used' the way some people sell 'pink slime' with ground meat.

For ~$80 I can get a GT 730, which has the almost-latest feature set D decoder block.

The biggest drawback to this approach is that I would spend $160 for two of the GT 730s.

I have a more affordable option, though. For ~$30 or less (they sell for as low as $20), I can get a used Radeon 6xxx from eBay, with the latest AMD decoder block.

Phoronix's NVIDIA/VDPAU benchmarking study states that Ubuntu 14.04 supports all the AMD hardware decoders. If that holds in practice, it is great news and opens up many inexpensive possibilities for home theater graphics card choices.

Currently I am leaning toward trying one of each of these two proposed solutions -- AMD Radeon 6xxx first, Nvidia GT 730 second.

The third option is an older-generation used NVIDIA card (with a VDPAU feature set C decoding block), e.g. a GT 510 or up, for ~$40 and up, but that still seems like a bad deal to me for such old tech just to convert a file server into an HTPC/server.

The last solution is to get one of those 'Hong Kong' Nvidia-based graphics cards auctioning on Ebay at bargain-basement prices. I am considering them the equivalent of the white-van speaker con.

Thanks for all the helpful suggestions and links. I really knew next to nothing about graphics when I first posted this thread. Could you tell?
 