Cheapest videocard for MadVR? - Page 2

post #31 of 112
Quote:
Originally Posted by madshi View Post

Please also take into account that madVR will get more demanding high-quality algorithms in the future. E.g. there's a new madVR test build available (download see last pages of doom9 thread) which can do Lanczos scaling without ringing artifacts now. The anti-ringing filter eats quite a bit of performance. There will be more improvements like that which will all take a hit on GPU performance, if you use them. Of course you can always stick to the standard algorithms, then a budget GPU should do fine. But if you want to be ready for future high-quality algorithms, then you might want to aim a bit higher. My recommendation for a madVR GPU has always been to get the fastest GPU that fits your budget and your thermal envelope. Of course another option is to get a budget GPU now and upgrade later. GPUs get faster every year, after all...
Personally, those new 28nm GPUs look tasty to me. The performance per watt ratio is noticeably better than with the older 40nm GPUs, which is quite important for HTPCs. I'm thinking about putting a Radeon 7750 into my HTPC. Or maybe even a 7770 or 7850. The soon-to-come GK106 based NVidia cards (650Ti and 660) might be good options, too.

Quote:
Originally Posted by FantaXP7 View Post

Excellent, thank you for this reply. Good to know that the ATI cards are an option as well. I think awhile back mostly Nvidia cards were recommended.


Is it "for sure" that ATI cards are viable options for MadVR use today? If so, which ATI chipset(s) are recommended... and starting with which MadVR revision?

Thanks for the informative discussion, as I am presently selecting a graphics card specifically for MadVR use in my next HTPC build :)
post #32 of 112
If anybody knows, it's madshi, as he develops MadVR, so I would say that the cards he listed work fine.
post #33 of 112
Quote:
Originally Posted by whiteboy714 View Post

Really? All that? That seems like a lot of power.
Is something like a gt430 or 6570 going to start not being enough soon for madvr?
Depends on what kind of algorithms you want to use and which resolutions and framerates you need. I guess 1080p60 up/downscaling with the new anti-ringing algorithm might be too much for those GPUs, but I've not tested that. If you don't need/want the new anti-ringing algorithm, nor any of the other future algorithm enhancements, then maybe a GT430 or 6570 are enough. But I don't know for sure. As I said in my previous comment, my recommendation is (and has always been) to get the fastest GPU you can afford which doesn't get too hot/loud in your HTPC. But it's also not a bad idea to get a budget GPU now and upgrade next year to the next budget GPU again etc...
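For readers wondering what an anti-ringing filter actually does: a common approach (this is an illustrative sketch, not madVR's actual implementation) is to clamp the Lanczos-interpolated value to the range of the nearest source samples, which suppresses the over/undershoot that shows up as halos around sharp edges. A minimal 1-D version:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

static const double kPi = 3.14159265358979323846;

// Lanczos-2 kernel: sinc(x) * sinc(x/2) for |x| < 2, else 0.
static double lanczos2(double x) {
    if (x == 0.0) return 1.0;
    if (std::fabs(x) >= 2.0) return 0.0;
    double px = kPi * x;
    return 2.0 * std::sin(px) * std::sin(px / 2.0) / (px * px);
}

// Resample a 1-D signal at fractional position `pos`, then clamp the
// result to the min/max of the two nearest source samples. The clamp
// is what suppresses ringing (over/undershoot) near sharp edges.
double lanczosAntiRing(const std::vector<double>& src, double pos) {
    int n = (int)src.size();
    int base = (int)std::floor(pos);
    double sum = 0.0, wsum = 0.0;
    for (int i = base - 1; i <= base + 2; ++i) {
        int j = std::max(0, std::min(i, n - 1)); // edge-repeat at borders
        double w = lanczos2(pos - i);
        sum += w * src[j];
        wsum += w;
    }
    double value = sum / wsum;
    // Anti-ringing: never exceed the range of the two nearest neighbors.
    int j0 = std::max(0, std::min(base, n - 1));
    int j1 = std::max(0, std::min(base + 1, n - 1));
    double lo = std::min(src[j0], src[j1]);
    double hi = std::max(src[j0], src[j1]);
    return std::max(lo, std::min(value, hi));
}
```

The clamp is cheap per pixel in 1-D, but a real upscaler runs it on every output pixel in two dimensions on the GPU, which is why it "eats quite a bit of performance" as madshi says.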
Quote:
Originally Posted by Sammy2 View Post

I'm pretty happy now with my GT 430 but am seeing the aliasing during LiveTV playback (not so much on Blu-ray VC-1 or whatever). I'm not sure if that is the compression from the CableCo or if it is something on my end.
Sounds like CableCo problem to me.
Quote:
Originally Posted by FantaXP7 View Post

Good to know that the ATI cards are an option as well. I think awhile back mostly Nvidia cards were recommended.
NVidia cards have one big advantage: Custom resolutions. Other than that there's no disadvantage using ATI cards with madVR. Actually, playback in fullscreen exclusive mode is currently less problematic on ATI and Intel compared to NVidia, due to a nasty NVidia driver bug which has been there for ages. All those stupid tweak options in madVR are only there because NVidia makes smooth playback in fullscreen exclusive mode very difficult, unlike ATI. With NVidia you may have to tweak fullscreen exclusive settings a lot to get perfect results. With ATI, results are often perfect with default settings. However, I plan to report this problem to NVidia and hope to have it fixed sooner or later. Then NVidia will have the advantage again, thanks to custom resolutions. Don't really know why ATI doesn't offer something similar.
post #34 of 112
Quote:
Originally Posted by Vlad Theimpaler View Post

Is it "for sure" that ATI cards are viable options for MadVR use today? If so, which ATI chipset(s) are recommended... and starting with which MadVR revision?
Thanks for the informative discussion, as I am presently selecting a graphics card specifically for MadVR use in my next HTPC build :)
madVR has *always* supported ATI cards just fine. Actually the GPU in my development PC has always been an ATI card. There's no specific GPU generation or madVR revision that is needed. Just be aware that ATI does *not* offer custom resolutions. That's the one big disadvantage for HTPC users from my point of view. I recommend the new 28nm generation, due to much improved performance per watt ratio. The 7750 is especially power efficient. It consumes < 50W and is relatively powerful. The NVidia 550Ti might become a good competitor to the 7750 when it's released (September, I think), we'll have to wait and see about that.
post #35 of 112
A big "THANK YOU" to madshi :) :) :)
post #36 of 112
Quote:
Originally Posted by Sammy2 View Post

I'm pretty sure it is the CableCo's (Charter) compression more than anything else.
Sounds like it to me.
post #37 of 112
Quote:
Originally Posted by madshi View Post

Actually, playback in fullscreen exclusive mode is currently less problematic on ATI and Intel compared to NVidia, due to a nasty NVidia driver bug which has been there for ages. All those stupid tweak options in madVR are only there because NVidia makes smooth playback in fullscreen exclusive mode very difficult, unlike ATI. With NVidia you may have to tweak fullscreen exclusive settings a lot to get perfect results. With ATI, results are often perfect with default settings. However, I plan to report this problem to NVidia and hope to have it fixed sooner or later. Then NVidia will have the advantage again, thanks to custom resolutions. Don't really know why ATI doesn't offer something similar.

What specific Nvidia bug are you referring to? Do you have steps to reproduce it? We work pretty closely with Nvidia (as a PC game developer) and I've always found their drivers much better than ATI's. If I can reproduce your bug, I can try to forward it to our developer support rep.
post #38 of 112
Quote:
Originally Posted by Wizziwig View Post

What specific Nvidia bug are you referring to? Do you have steps to reproduce it? We work pretty closely with Nvidia (as a PC game developer) and I've always found their drivers much better than ATI's. If I can reproduce your bug, I can try to forward it to our developer support rep.
That would be awesome!!

I'm still busy with my commercial projects at the moment. But as soon as I find some time again for madVR, I plan to create a test project which demonstrates the NVidia driver bug.

The bug is that when pre-presenting frames in fullscreen exclusive mode (similar to triple buffering, but with a user selectable number of frames instead of 3), IDirect3DSwapChain9Ex::GetPresentStatistics() is reporting lots of presentation glitches, and they are true glitches, visible on screen. No such problem with ATI and Intel. I've found some tricks and hacks to work around the problem, e.g. creating separate D3D devices for rendering and for presentation helps a bit, but that requires sharing the D3D surfaces between 2 devices, quite complicated, and it only helps, but doesn't completely fix the problem.
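To make the symptom concrete, here is a simplified sketch (hypothetical struct and function names, loosely modeled on D3D9Ex's D3DPRESENTSTATS; not madVR's actual code) of how a renderer can turn those presentation statistics into a glitch count: if consecutive samples show the refresh counter advancing by more vsyncs than frames were presented, a frame missed its refresh and was shown late.

```cpp
#include <cstdint>

// Simplified stand-in for the fields of D3DPRESENTSTATS that matter here:
// how many Presents have completed, and at which vsync the last one
// actually appeared on screen.
struct PresentStats {
    uint32_t presentCount;         // completed Present() calls
    uint32_t presentRefreshCount;  // vsync count when the last frame was shown
};

// For smooth 1:1 playback (one frame per refresh), consecutive samples
// should advance presentRefreshCount by exactly the same amount as
// presentCount. Any extra refreshes mean a frame missed its vsync:
// a visible presentation glitch.
int countGlitches(const PresentStats* stats, int n) {
    int glitches = 0;
    for (int i = 1; i < n; ++i) {
        uint32_t frames    = stats[i].presentCount - stats[i - 1].presentCount;
        uint32_t refreshes = stats[i].presentRefreshCount - stats[i - 1].presentRefreshCount;
        if (refreshes > frames)  // took more vsyncs than frames presented
            glitches += (int)(refreshes - frames);
    }
    return glitches;
}
```

In the real renderer the stats would come from IDirect3DSwapChain9Ex::GetPresentStatistics() after each Present; the point of madshi's report is that on NVidia this counter jumps even though ATI and Intel stay glitch-free under the same load.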
post #39 of 112
Quote:
Originally Posted by renethx View Post


i3-550's GPU is fast enough for all SD and all 1080p24 contents with madVR highest settings. (It struggles only with 1080i60 and 1080p60.)


Does your display support 24Hz input? Then I recommend Intel HD Graphics + DXVA/EVR, LAV Audio Decoder, dtsdecoderdll.dll + ReClock (media adaptation + WASAPI exclusive mode), that costs $0 for perfect judder-free zero frame drop/repeat playback of all contents.

Well. There you have it.
post #40 of 112
renethx's comment does not cover the new anti-ringing scaling algorithm, though. The new algorithm didn't exist yet when renethx posted his comment.
post #41 of 112
P.S: I forgot to mention one thing: With NVidia GPUs you can have hardware video decoding when using madVR. With ATI it's possible, too, but with ATI it's slower. So if you want to use hardware video decoding, NVidia might be the better choice at the moment.
post #42 of 112
So the GT 430 will not be "strong" enough for these future algorithms?

What is the approximate time frame? I see you are busy with other things so I'm thinking it won't be for a while yet?
post #43 of 112
If one is using software decoding, what is the minimum CPU recommended? I have problems with a few movies where CPU is getting up into the 90's during certain passages (e.g. Avatar) and am getting dropped frames galore. I've only an E4500 and an ATI 5570.
post #44 of 112
@Sammy2, the new anti-ringing scaling algorithm is already available, although only as a test build right now, but it seems to be working just fine. Can't give you any approximate time frames for other future algorithms. Don't know myself. Whether the 430 is strong enough or not depends a lot on which resolutions and frame rates you need support for, whether you want to use DXVA deinterlacing and whether you want to use the new anti-ringing scaling algorithm. There's no simple answer. At least none that I could give. So let me come back to my recommendation of getting the fastest GPU that you can afford and that's cool and silent enough for your PC. Of course that's just my personal recommendation. If you don't have the money, a lesser GPU will do, but it might not be able to do "everything".

@JMGNYC, don't know, maybe someone else can answer that?
post #45 of 112
Thanks..

The "ringing" I'm seeing is the "haloing" that I describe with LiveTV playback and probably is the CableCo's compression so MadVR isn't even in play there. Blu-ray (mkv) playback is stellar though. No complaints with that at all for me but I'm only using a 40" 120Hz HDTV that is about 4 years old now. When I upgrade to 60" later this year it may be an issue to look into then..
post #46 of 112
I'm curious if anyone has seen this message before

"this application is not rated by nvidia corp"

I get it in MPC-HC a lot, usually when switching inputs from my HTPC to my Xbox and then back... Maybe another reason for me to try out the ATI 7750.
post #47 of 112
Do you leave MPC open when switching inputs? That may be the problem; if so, my HTPC does weird stuff as well when I do that.
post #48 of 112
Typically no. I have seen it occur opening something new after making the input switch back.
post #49 of 112
So... Sammy2...

How do you get live TV playback through MadVR?
post #50 of 112
No. MadVR is only invoked as a video renderer through MPC-HC. LiveTV uses MS filters.
post #51 of 112

In madshi's post from a few days ago, he mentioned the "Nvidia 550Ti" as a potential competitor to the Radeon 7750 with relation to madVR performance. He stated that he thought the Nvidia card would not be out until sometime in September. However, I wonder whether he was referencing something like this:

EVGA GeForce GTX 550 Ti (Fermi)

or is there another upcoming chipset with a similar naming convention? The above has 192 CUDA cores. However, it uses a mini-HDMI port and requires a minimum of 400W system power.

post #52 of 112
Quote:
Originally Posted by madshi View Post

That would be awesome!!
I'm still busy with my commercial projects at the moment. But as soon as I find some time again for madVR, I plan to create a test project which demonstrates the NVidia driver bug.
The bug is that when pre-presenting frames in fullscreen exclusive mode (similar to triple buffering, but with a user selectable number of frames instead of 3), IDirect3DSwapChain9Ex::GetPresentStatistics() is reporting lots of presentation glitches, and they are true glitches, visible on screen. No such problem with ATI and Intel. I've found some tricks and hacks to work around the problem, e.g. creating separate D3D devices for rendering and for presentation helps a bit, but that requires sharing the D3D surfaces between 2 devices, quite complicated, and it only helps, but doesn't completely fix the problem.

Interesting. Our current graphics engine only supports OpenGL on the PC (for easier porting to Mac, iOS, Android, etc.). I'll be adding a D3D backend renderer in a few weeks and will try to reproduce your problem. Have you tried increasing the maximum latency with SetMaximumFrameLatency()? Maybe the default queue size is too low for really intensive CPU/GPU apps. You could also try disabling Nvidia's power management - I've seen cases where it would randomly toggle into a low power state where the card was too slow to keep up with our game. I also remember something like what you describe happening when Nvidia first released their 260.xx series of drivers. I've been doing mostly console games the past few years so don't remember if they ever fixed that on current D3D drivers.

As a general observation, Windows is a really poor platform for a media player! For any "real-time" application, it's difficult to get 100% reliable results from the task scheduler. We've had much better luck in Linux - it's almost as reliable as our console projects. Might be something to consider for a future madvr version.
post #53 of 112
Quote:
Originally Posted by rbrinson View Post

In madshi's post from a few days ago. He mentioned the "Nvidia 550Ti" as a potential competitor to the Radeon 7750 with relation to madvr performance. He stated that he thought the Nvidia card would not be out until sometime in September. However, I wonder was he referencing something like this:

EVGA GeForce GTX 550 Ti (Fermi)

or is there another upcoming chipset with a similar naming convention? The above has 192 CUDA cores. However, it uses a mini-HDMI port and requires a minimum of 400W system power.
I think he meant the 650, which is not released yet. The 550 Ti is from Nvidia's previous generation and has been out for a while. I would hope it would be plenty for madVR, but who knows down the road. As far as gaming power goes, the 550 Ti sits between AMD's 7750 and 7770.
post #54 of 112
Quote:
Originally Posted by Wizziwig View Post

Interesting. Our current graphics engine only supports OpenGL on the PC (for easier porting to Mac, iOS, Android, etc.). I'll be adding a D3D backend renderer in a few weeks and will try to reproduce your problem. Have you tried increasing the maximum latency with SetMaximumFrameLatency()? Maybe the default queue size is too low for really intensive CPU/GPU apps. You could also try disabling Nvidia's power management - I've seen cases where it would randomly toggle into a low power state where the card was too slow to keep up with our game. I also remember something like what you describe happening when Nvidia first released their 260.xx series of drivers. I've been doing mostly console games the past few years so don't remember if they ever fixed that on current D3D drivers.
As a general observation, Windows is a really poor platform for a media player! For any "real-time" application, it's difficult to get 100% reliable results from the task scheduler. We've had much better luck in Linux - it's almost as reliable as our console projects. Might be something to consider for a future madvr version.
I'm already using SetMaximumFrameLatency(). One of the madVR "tweak" options (specifically designed to help with the NVidia glitching problem) defines whether I'm calling SetMaximumFrameLatency with the exact number of backbuffers, or with backbuffers + 2. I'm quite sure that one of the many NVidia users affected by this problem has already tried disabling the power management, so I don't think it will help here, either. Both good suggestions, though.

I think the problem has to do with synchronization. I think the problem occurs if the GPU is busy doing something specific (maybe copying to backbuffer or whatever) while a VSync occurs. In that case it seems that NVidia's driver sometimes isn't able to flip the page, probably because it fails to get access to some critical section or something like that. That's why another madVR tweak option checks where the VSync scanline position is and avoids starting a new rendering pass when VSync is near to the next VSync interrupt. And that does help a lot in avoiding those presentation glitches.

Yeah, I can well imagine that Linux might be a more stable platform for time critical stuff. But really, D3D exclusive mode *should* in theory solve this problem because you can present multiple frames in advance and the pre-presented frames are supposed to be flipped by the VSync interrupt in driver land. That should take out all timing problems. In theory. It seems to work well enough with ATI and Intel, just not with NVidia at the moment.
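For the curious, the scanline-based tweak madshi describes boils down to something like the check below (illustrative only; madVR's real heuristic is more involved, and the current scanline would come from IDirect3DDevice9::GetRasterStatus):

```cpp
// Decide whether to kick off a new rendering pass right now. If the
// raster is within `guard` scanlines of the next VSync, starting GPU
// work risks colliding with the page flip, so we hold off until the
// VSync has passed.
bool safeToStartRender(int currentScanline, int totalScanlines, int guard) {
    int linesUntilVSync = totalScanlines - currentScanline;
    return linesUntilVSync > guard;
}
```

The caller would poll the raster status in a loop and only begin the next render pass once this returns true, trading a little latency for fewer presentation glitches.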
Quote:
Originally Posted by whiteboy714 View Post

I think he meant the 650, which is not released yet. The 550 Ti is from Nvidia's previous generation and has been out for a while. I would hope it would be plenty for madVR, but who knows down the road. As far as gaming power goes, the 550 Ti sits between AMD's 7750 and 7770.
Yeah, sorry for the confusion, I meant the 650ti.
post #55 of 112
Quote:
Originally Posted by madshi View Post

P.S: I forgot to mention one thing: With NVidia GPUs you can have hardware video decoding when using madVR. With ATI it's possible, too, but with ATI it's slower. So if you want to use hardware video decoding, NVidia might be the better choice at the moment.

FWIW, with the 7000 series of AMD cards, they really managed to improve the performance of "Copy Back" DXVA mode, and I don't see a reason not to recommend it anymore. It's really slow on any previous generation (6000 and below), though.
I use a passive 7750 in my HTPC myself, mostly because NVIDIA lacks good passive cards (especially in the 6xx generation). I do however use Intel's QuickSync decoder in that system, and not the AMD GPU, for decoding.
post #56 of 112
Ah, good to know. It makes sense that AMD improved the "copy back" in the 7xxx series, after all they put a major effort into improving GPU computing with 7xxx, and copy back is a part of that.
post #57 of 112
Quote:
Originally Posted by Nevcairiel View Post

FWIW, with the 7000 series of AMD cards, they really managed to improve the performance of "Copy Back" DXVA mode, and I don't see a reason not to recommend it anymore. It's really slow on any previous generation (6000 and below), though.
I use a passive 7750 in my HTPC myself, mostly because NVIDIA lacks good passive cards (especially in the 6xx generation). I do however use Intel's QuickSync decoder in that system, and not the AMD GPU, for decoding.

Great post, and I hear you. I get noticeable framerate loss with my 6750 in DXVA2 copy-back mode using the LAV decoder, and while I was considering upgrading to something like Nvidia's GTX 560, my local stores have just about no passive cards out there at that kind of speed. Of course, in these parts they don't have any passive cards in AMD's 7xxx range either, but I imagine that's just a matter of time.

In the meantime I will stick to having my Q6600 CPU do the software decoding :D
post #58 of 112
Quote:
Originally Posted by Venturai View Post

In the meantime I will stick to having my Q6600 CPU do the software decoding :D

Yep, with an ATI 6670 I get issues with DXVA. Using the Q6600 for software decoding too, and it plays perfectly.
post #59 of 112
Ok. So, in about three weeks I'll be ordering the parts for my first HTPC build. Based upon previous recommendations the GT 430 was going to be the video card. However, it now looks as though the GT 430 is soon going to be unable to keep up. I have given greater weight to Nvidia's line due to the ability to define custom refresh rates. So, which card should I be considering now given the limitation of PCIe 2.0 due to the i3-2120 CPU? A GTS 450?
post #60 of 112
PCIe 2.0 is not really a limitation for the mid-range segment. It may only be a limitation on very high-end cards, and even there it's questionable.

Rumors are that the new NVIDIA GTX 650 is to be released on the 6th of September; if you can/want to wait for that, it might be worth it. Otherwise go with a 550 Ti or 450/460.