
DXVA2 in XBMC Frodo - Page 2

post #31 of 46
Thread Starter 
Quote:
Originally Posted by Dark_Slayer View Post

The i3 I previously mentioned is running OpenElec, and I tried enabling DXVA under the video playback settings as well. It worked fine here as well.

So... egg on my face for that line (which I edited with a strikethrough).

I should make it a point to re-check the settings I mention rather than just trying to recall them off the top of my head.

DXVA is obviously never an option in OpenELEC, but I do leave VAAPI enabled for the i3 2105 running OpenELEC version 2.99.2. VAAPI might have been working for a while in OpenELEC, but because of the issue I had with hardware acceleration the first time I ever set up XBMC, I never thought to enable it. After I went to "check" in the stable release on my i5 and found it working, I toggled VAAPI on in OE and noticed similar ~5% utilization.
Quote:
Originally Posted by StardogChampion View Post

Well, according to this guy who Googled it and theorized about it, you're all wrong: http://forum.xbmc.org/showthread.php?tid=140534&pid=1336310#pid1336310 -- it will only play "fluently" on the HD4000 and Intel NUC. Sorry guys...


I can't really prove him wrong, but from what renethx said above it shouldn't be different for any IVB processor. My OpenELEC box is using a Sandy Bridge i3, but I'd have to throw Windows on there just to test. I might get a chance to do so this weekend, in the interest of science. A screenshot may or may not be forthcoming.
post #32 of 46
OK, installed XBMC 12 and the aforementioned Intel driver on the Celeron 847 test system. I still see banding/pixelation on scene changes. CPU usage was around 14%. Turning DXVA off, I get about 85-90% CPU utilization, which is better than the 100% in XBMC 11, but the video is still choppy. Using WMP for the same MKV, it's smooth as butter. So that's always an option for the Celeron 847 on Windows: just launch WMP from XBMC (a config sketch follows below). Bummer though.
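
For anyone who wants that WMP fallback without leaving the XBMC interface, the usual route is an external-player rule in playercorefactory.xml in the XBMC userdata folder. The sketch below is from memory only and assumes WMP's default install path; double-check the element and attribute names against the XBMC wiki before relying on it.
Code:
<!-- userdata\playercorefactory.xml : hand .mkv files to Windows Media Player -->
<playercorefactory>
  <players>
    <player name="WMP" type="ExternalPlayer" audio="false" video="true">
      <filename>C:\Program Files\Windows Media Player\wmplayer.exe</filename>
      <!-- "{1}" is replaced with the path of the file being played -->
      <args>"{1}"</args>
      <hidexbmc>true</hidexbmc>
    </player>
  </players>
  <rules action="prepend">
    <!-- send all .mkv files to the external player -->
    <rule filetypes="mkv" player="WMP"/>
  </rules>
</playercorefactory>
The same approach should work for MPC-HC or any other external player; just point the filename element at that executable instead.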
post #33 of 46
Quote:
Originally Posted by renethx View Post

Progressive VC-1 can be played back just fine whether DXVA2 is on or off. It's interlaced VC-1 (a short clip here: BBC Life - Plants) that can't be played with XBMC, whether DXVA2 is on or off, whatever graphics card you use (not only Intel, but also AMD or NVIDIA). But interlaced VC-1 is found only on BBC Blu-rays and a few others.

DXVA2 on means decoding is done by the GPU, as well as deinterlacing and scaling. With DXVA2 off, decoding is done by the CPU, but deinterlacing and scaling can still be done by the GPU (set Render method = DXVA). The GPU in any IVB processor behaves exactly the same way.

MPC-HC (or any DirectShow player)+LAV Video Decoder (with whatever video renderer) is just perfect.

I have an Onkyo with an HQV Vida VHD1900 processor for upscaling to 1080p, with Qdeo ... what do you recommend I set XBMC to in order to get the best PQ using my AVR?

My content is all 720p and 1080i (OTA broadcasts).
post #34 of 46
Yup, an external post-processor is another solution. But AFAIK there is no simple, automatic way for XBMC to output native resolution based on the source.
post #35 of 46
There is, it's just not in the main XBMC (yet?).

http://forum.xbmc.org/showthread.php?tid=64139&page=11
post #36 of 46

Since you guys are all subscribed to this thread (and seem to know a lot about this), maybe you can help me? I currently have this machine running Win7-64 / XBMC v11. An older machine, but working fine:

 

AMD Athlon 64 X2 (dual-core)
- On-board ATI Radeon HD 3200 (similar to HD 2400), VGA/DVI/HDMI out
- 2 GB RAM and a small HDD
- 400 W PSU

 

With DXVA2 on, it works fine and can handle VIDEO_TS, 1080p MKV, AC3, and DTS. DXVA2 was the magic that made this config work (I assume by offloading a lot to the GPU). I mainly want movies to play so you can't really tell it's not a real disc in a real stand-alone player (high-bitrate video and sound, with all channels bitstreamed to the AVR).

 

Swapping it out for this machine:

 

Intel Core i3-2120 (dual-core, four threads)
- On-board Intel HD 2000 graphics, VGA/HDMI out
- 6 GB RAM, 1 GB HDD
- 280 W PSU

 

The question is: will the Intel HD 2000 work as well as the AMD GPU setup? Or do I need to drop in an AMD 5670/6670 and a power-supply upgrade? I ask because I'm reading about Intel HD 2000/3000/4000 problems with XBMC. I did notice that Windows Update is suggesting a driver update to WDDM 1.1/1.2; I will likely install it or the latest from Intel.com (once I document my current version). I plan to go ahead and install XBMC v12.1 since it's the current stable release.

 

The AVR and plasma are old (see workout room below), so I must use VGA (at 720p) and S/PDIF. However, I would also like it to handle 1080p over HDMI later (without messing with the insides again), so I will also test the final config with the living room setup (front HDMI Aux input on the Onkyo 607).

post #37 of 46
Quote:
Originally Posted by Tesla1856 View Post

I did notice that Windows Update is suggesting a driver update to WDDM1.1/1.2. I will likely install it or the latest from Intel.com (once I document my current version).

I treated that more as a notification that an update was available than anything else.

 

Current Intel HD Video Driver
DM: Intel HD Graphics Family, Intel, 12-15-2012, v8.15.10.2598, MS-WHCP
Intel Graphics and Media (Control Panel): Driver Version v8.15.10.2598, vBIOS v2126.0
WEI 4.7: 7.1, 7.4, 4.7, 4.6, 5.9

 

Installed the updated driver from Intel.com.
Dated 3-21-2013, package version v15.28.15.64.3062.
Verified below that the driver version DOES match what the driver text said it would be.

 

Now, the current Intel HD video driver:
DM: Intel HD Graphics, Intel Corp, 3-8-2013, v9.17.10.3062, MS-WHCP
Intel Graphics and Media (Control Panel): Driver Version v9.17.10.3062, vBIOS v2126.0
WEI 5.1: 7.1, 7.4, 5.1, 5.8, 5.9
 

Notes: WEI increased for Aero 5.1 and 3D 5.8. vBIOS stayed the same.

post #38 of 46
Quote:
Originally Posted by renethx View Post

DXVA2 is not just decoding; it also involves deinterlacing (both video and film mode) and rendering (in particular, scaling). XBMC mixes the various methods in a complicated way:



If "Allow hardware acceleration (DXVA2)" is on, then all subsequent rendering methods are only DXVA2 (i.e. the card vendor supplied driver's algorithm). If it is off, you can still use DXVA2 for deinterlacing and scaling. You can also choose hardware accelerated video scaling (Lanczos3 etc.) as well as crappy software scaling (Bilnear etc.), while deinterlacing is always software mode ("De-interlace" etc.) and the quality less than satisfactory.

DXVA2 video scaling:



AMD is the worst, NVIDIA is not so good (ES = Edge sharpening), Intel is the best among the three. Unfortunately, with Intel, DXVA2 decode is sometimes broken and "DXVA Best" deinterlacing never works in XBMC. So the best compromise would be:



The best deinterlacing method (DXVA2 Best) and the best video scaling method (Lanczos, except for Jinc) never work at the same time in XBMC's internal player, whatever card you use. If you care about the best PQ, you'd better use an external player with madVR (DXVA2 for Intel).


Thanks for the helpful post. In the new XBMC Gotham, maybe you can make sense of this for me: is this what you were discussing, where we could perhaps use DXVA Best with Lanczos3?

To access these settings you need Confluence (if I remember correctly) and the latest XBMC nightly, then hit the sidebar to access expert mode.

https://github.com/xbmc/xbmc/pull/838
post #39 of 46
The table of "DXVA2 Scaling Algorithms" is wrong. The correct one (from this post) is:
Code:
      Scaling        |   AMD    |  NVIDIA  |   Intel    
--------------------------------------------------------
SD --> HD (1.5x)     | Lanczos  | Bilinear | Lanczos+AR 
SD --> FHD (2.25x)   | Lanczos  | Lanczos  | Lanczos+AR 
SD --> 4K UHD (4.5x) | Lanczos  | Lanczos  | Lanczos+AR 
HD --> FHD (1.5x)    | Bilinear | Bilinear | Lanczos+AR 
HD --> 4K UHD (3x)   | Bilinear | Bilinear | Lanczos+AR 
FHD --> 4K UHD (2x)  | Bilinear | Bilinear | Lanczos+AR
post #40 of 46
I will take a look.

I guess your solution for DXVA Best + Lanczos3 is actually here with this guy's builds:


http://forum.xbmc.org/showthread.php?tid=127174&page=9

I have not tried it.

Do any of these settings, such as Lanczos3, affect chroma upsampling in XBMC too, or is that purely a madVR thing?
post #41 of 46
Where did OP go?

DarkSlayer I hope you didn't get slayed.
post #42 of 46
post #43 of 46

Ran into something strange with Intel 2000.

 

In XBMC v11.x, DXVA2 over HDMI works fine. Over VGA, I had to switch it off (or the colors went crazy). No problem leaving it off because the i3 CPU is decent; it would still do 1080p MKV and original BD M2TS. Sound was bitstreamed over optical.

 

Is it even possible for it to act like this?

post #44 of 46
Thread Starter 
Quote:
Originally Posted by Mfusick View Post

Where did OP go?
DarkSlayer I hope you didn't get slayed.
I didn't get slayed, but thanks for caring
Quote:
Originally Posted by renethx View Post

DXVA2 video scaling:

Quote:
Originally Posted by renethx View Post

The table of "DXVA2 Scaling Algorithms" is wrong. The correct one (from this post) is
Code:
      Scaling        |   AMD    |  NVIDIA  |   Intel    
--------------------------------------------------------
SD --> HD (1.5x)     | Lanczos  | Bilinear | Lanczos+AR 
SD --> FHD (2.25x)   | Lanczos  | Lanczos  | Lanczos+AR 
SD --> 4K UHD (4.5x) | Lanczos  | Lanczos  | Lanczos+AR 
HD --> FHD (1.5x)    | Bilinear | Bilinear | Lanczos+AR 
HD --> 4K UHD (3x)   | Bilinear | Bilinear | Lanczos+AR 
FHD --> 4K UHD (2x)  | Bilinear | Bilinear | Lanczos+AR

So the table (corrected) is showing DXVA2 image upscaling? Is Intel still the sole user of bicubic chroma upscaling while the other two use bilinear?

Quote:
Originally Posted by Tesla1856 View Post

In XBMC v11.x, DXVA2 over HDMI works fine. Over VGA, I had to switch it off (or the colors went crazy). No problem leaving it off because the i3 CPU is decent; it would still do 1080p MKV and original BD M2TS. Sound was bitstreamed over optical.

Is it even possible for it to act like this?
Quoted you to indicate that I didn't ignore your post, but I have no idea what the answer is.
post #45 of 46
Quote:
Originally Posted by Murilo View Post

I will take a look.

I guess your solution for DXVA Best + Lanczos3 is actually here with this guy's builds:


http://forum.xbmc.org/showthread.php?tid=127174&page=9

I have not tried it.

Do any of these settings, such as Lanczos3, affect chroma upsampling in XBMC too, or is that purely a madVR thing?
I use this build and it works flawlessly. I have an AMD 5450, and it uses hardware-accelerated Lanczos3 + DXVA Best for deinterlacing. Basically, everything in this chart that says Bilinear is changed to Lanczos3. I have not tried 4K, as my card doesn't do it, nor do I have a monitor that supports it.
Code:
      Scaling        |   AMD    |  NVIDIA  |   Intel    
--------------------------------------------------------
SD --> HD (1.5x)     | Lanczos3 | Bilinear | Lanczos+AR 
SD --> FHD (2.25x)   | Lanczos3 | Lanczos  | Lanczos+AR 
SD --> 4K UHD (4.5x) | Lanczos3 | Lanczos  | Lanczos+AR 
HD --> FHD (1.5x)    | Lanczos3 | Bilinear | Lanczos+AR 
HD --> 4K UHD (3x)   | ?        | Bilinear | Lanczos+AR 
FHD --> 4K UHD (2x)  | ?        | Bilinear | Lanczos+AR
It works with Nvidia cards as well.
Edited by StinDaWg - 7/23/13 at 5:51pm
post #46 of 46
Quote:
Originally Posted by Dark_Slayer View Post


So the table (corrected) is showing DXVA2 image upscaling? Is intel still the sole user of bicubic chroma upscaling while the other 2 use bilinear?

The table is about image upscaling. Look at this post for chroma upscaling. Intel is the sharpest among the three.