
Intel HD2000 iGPU testing - Page 2

post #31 of 78
Quote:
Originally Posted by renethx View Post

Quote:
Originally Posted by jim2100 View Post

Yes, that is what I do. MPC-HC does it automatically.


For live TV (the main source of interlaced contents), say in WMC, no way.

Yes way! Easily done.
post #32 of 78
Quote:
Originally Posted by jim2100 View Post


Yes way! Easily done.


I mean, no automatic way in WMC. You have to switch refresh rate manually.

 

BTW is your GPU's deinterlacer so poor? Which are you using? Just out of curiosity.

post #33 of 78
Quote:
Originally Posted by renethx View Post

Quote:
Originally Posted by jim2100 View Post

Yes way! Easily done.


I mean, no automatic way in WMC. You have to switch refresh rate manually.

BTW is your GPU's deinterlacer so poor? Which are you using? Just out of curiosity.

No, there is an automatic way. Just set MPC-HC to automatically switch back to the default after it is done. And the default is set to 1080i, so that is what WMC will use.

My display's deinterlacer is fine. Why should I mess with anything else?
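
For anyone who would rather script that refresh-rate switch than rely on a player doing it, here is a minimal Win32 sketch (not from this thread; the 24 Hz target is purely illustrative and assumes the driver exposes that rate):

```cpp
// Minimal sketch: switch the primary display's refresh rate on Windows.
// The 24 Hz target is only an example (e.g. for 1080p24 film content);
// it assumes the driver actually exposes that rate at the current resolution.
#include <windows.h>
#include <cstdio>

int main() {
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);

    // Start from the current mode so resolution stays unchanged.
    if (!EnumDisplaySettings(nullptr, ENUM_CURRENT_SETTINGS, &dm)) {
        printf("Could not query the current display mode\n");
        return 1;
    }
    printf("Current: %lux%lu @ %lu Hz\n",
           dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);

    dm.dmDisplayFrequency = 24;   // illustrative target rate
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    // CDS_TEST first: ask the driver whether the mode is supported at all.
    if (ChangeDisplaySettings(&dm, CDS_TEST) != DISP_CHANGE_SUCCESSFUL) {
        printf("Driver does not expose %lu Hz at this resolution\n",
               dm.dmDisplayFrequency);
        return 1;
    }
    ChangeDisplaySettings(&dm, 0);   // apply dynamically for this session
    return 0;
}
```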
post #34 of 78

I see, you have your own unique way. smile.gif

post #35 of 78
Quote:
Originally Posted by assassin View Post

Edit: Just did a little more testing. Looks like the CPU uses a whopping 3-5% more with DXVA disabled once de-interlacing is turned on. (14-15% with de-interlacing off in XBMC compared to 17-18% with de-interlacing enabled on my system)
Again, we are talking about 1080i content, which is only about 5% or so of all movies. But even those 5% play, and this is just another HTPC fallacy that has been perpetuated with no one really taking the time to actually test it.

 

If you turn off DXVA2, decoding is in software mode. This affects all video playback (progressive or interlaced). With an Intel dual core processor, the CPU usage at 1080p H.264 video playback is

 

- ~50% if DXVA2 off

- ~10% if DXVA2 on

 

This is just a fact. smile.gif

 

BTW I read your original post. As the Intel iGPU's decoder and deinterlacer are broken in XBMC, you have to disable DXVA2. Then every potential problem will go away, because the video playback is in software mode; it's completely hardware-independent, i.e. you will get exactly the same PQ whatever GPU you use. Using XBMC with DXVA2 off is pointless for testing the Intel iGPU...


Edited by renethx - 6/17/12 at 11:25am
post #36 of 78
Thread Starter 
Quote:
Originally Posted by renethx View Post

If you turn off DXVA2, decoding is in software mode. This affects all video playback (progressive or interlaced). With an Intel dual core processor, the CPU usage at 1080p H.264 video playback is

- ~50% if DXVA2 off
- ~10% if DXVA2 on

Just a fact. smile.gif

I don't know if that is accurate renethx.

Actually the screenshot from above is with DXVA off. Like I said it went up to about 17% when I turned on de-interlacing.

[screenshot attachment]

I will post some screenshots of 1080p H.264 material but know with certainty that I never come close to 50% CPU usage. I will also test on my G620.
post #37 of 78
Thread Starter 
Here is the CPU use on my i3 2100 dual core system with DXVA turned off. I will test my G620 next.

Wall.E 1080p H264 test (straight rip from bluray). About 20% CPU usage.

[screenshot attachments]

Super 8 1080p H264 test. About 13% CPU usage.

[screenshot attachments]
post #38 of 78

You'd better not post screenshots. That's meaningless (nobody is asking about the PQ of progressive content). If you want to tell us CPU usage, numerical values are enough and concise.

post #39 of 78
Thread Starter 
Quote:
Originally Posted by renethx View Post

You'd better not post screenshots. That's meaningless (nobody is asking about the PQ of progressive content). If you want to tell us CPU usage, numerical values are enough and concise.

CPU usage is listed in the screenshot. That's why I am posting them.
post #40 of 78

I know, but it's a pain to click the picture and enlarge it just to see CPU usage. Writing down numerical values is enough.

 

DXVA2 off means XBMC does not use the CPU's hardware decoder; it uses the CPU to decode everything. So CPU usage is very predictable (if you have ever done this type of test).
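
If you want those numerical values without eyeballing Task Manager, here is a minimal sketch (not from the thread; the 30-second window is arbitrary) that averages whole-system CPU usage over a playback window using GetSystemTimes:

```cpp
// Sketch: report average whole-system CPU usage over a fixed window,
// e.g. while a test clip is playing. Note that the "kernel" time reported
// by GetSystemTimes() already includes idle time.
#include <windows.h>
#include <cstdio>

static unsigned long long toU64(const FILETIME& ft) {
    return (static_cast<unsigned long long>(ft.dwHighDateTime) << 32) | ft.dwLowDateTime;
}

int main() {
    FILETIME idle0, kern0, user0, idle1, kern1, user1;
    GetSystemTimes(&idle0, &kern0, &user0);

    Sleep(30 * 1000);   // measurement window: 30 seconds of playback

    GetSystemTimes(&idle1, &kern1, &user1);

    unsigned long long idle  = toU64(idle1) - toU64(idle0);
    unsigned long long total = (toU64(kern1) - toU64(kern0))    // kernel incl. idle
                             + (toU64(user1) - toU64(user0));
    double usage = 100.0 * (1.0 - static_cast<double>(idle) / total);
    printf("Average CPU usage over the window: %.1f%%\n", usage);
    return 0;
}
```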

post #41 of 78
Thread Starter 
Quote:
Originally Posted by renethx View Post

I know, but it's a pain to click the picture and enlarge it just to see CPU usage. Writing down numerical values is enough.

DXVA2 off means XBMC does not use the CPU's hardware decoder; it uses the CPU to decode everything. So CPU usage is very predictable.

I listed what I thought the average CPU use was for each movie so you don't even have to click unless you want to (also, I can clearly see the CPU use percentage on my laptop screen without clicking the picture, but maybe it's too small for some). But for those that want proof, there it is. smile.gif
post #42 of 78

Basically what you are doing is measuring CPU usage when hardware decode acceleration is on and off. Not a very interesting test.

 

As I wrote repeatedly, the Intel iGPU's decoder and deinterlacer are broken in XBMC, so you have to disable DXVA2. Then every potential problem will go away, because the video playback is in software mode; it's completely hardware-independent, i.e. you will get exactly the same PQ whatever GPU you use (Intel, AMD or NVIDIA). Using XBMC with DXVA2 off is pointless for showing that the Intel iGPU's deinterlacer works fine.

 

I have to go out of town. See you later. smile.gif


Edited by renethx - 6/17/12 at 11:00pm
post #43 of 78
Thread Starter 
Quote:
Originally Posted by renethx View Post

Basically what you are doing is measuring CPU usage when hardware decode acceleration is on and off. Not a very interesting test.

As I wrote repeatedly, the Intel iGPU's decoder and deinterlacer are broken in XBMC, so you have to disable DXVA2. Then every potential problem will go away, because the video playback is in software mode; it's completely hardware-independent, i.e. you will get exactly the same PQ whatever GPU you use (Intel, AMD or NVIDIA). Using XBMC with DXVA2 off is pointless for showing that the Intel iGPU's deinterlacer works fine.

Well, you edited your post where we were discussing the CPU usage of the Intel CPU with DXVA on and off. I think what I am doing is very interesting in that, again, there seem to be these reports that turning DXVA off will increase CPU usage by 40% or so (up to 50%, which may be significant to some). While maybe this isn't interesting to you, it is very interesting to me, as I want to know what is actually fact when it comes to HTPCs and not what is merely reported to be fact.

Again, I am not doing these latest tests to show anything at all about the deinterlacer but trying to see what happens to CPU usage with DXVA off. smile.gif
Edited by assassin - 6/17/12 at 11:49am
post #44 of 78
Thread Starter 
Here is my Intel G620T with HD1000 iGPU for those interested. ASRock motherboard, 4GB RAM.

CPU usage with DXVA off with this $50 CPU.

Wall.E 1080p test file (about 30-50% CPU usage). The CPU usage for this movie was all over the place from 30% up to about 55%. I tried to get screenshots of each.

[screenshot attachments]

Super 8 1080p test file (about 20-25% CPU usage)

[screenshot attachments]
post #45 of 78
One more thing to test, if you don't mind. There's a claim that the G530 or G620 cannot bitstream HD audio in XBMC. The claimant never tried it -- it's just a theory, but of course, since it's the internet, all theories are true until proven otherwise: those CPUs couldn't possibly be powerful enough to bitstream HD audio (because CPU power somehow matters?!). I tried to explain that the CPU doesn't matter as long as the GPU supports it, but what do I know?

Care to give it a try?
post #46 of 78
Thread Starter 
Quote:
Originally Posted by StardogChampion View Post

One more thing to test, if you don't mind. There's a claim that the G530 or G620 cannot bitstream HD audio in XBMC. The claimant never tried it -- it's just a theory, but of course, since it's the internet, all theories are true until proven otherwise: those CPUs couldn't possibly be powerful enough to bitstream HD audio (because CPU power somehow matters?!). I tried to explain that the CPU doesn't matter as long as the GPU supports it, but what do I know?
Care to give it a try?

Sure. But I don't need to do this test as I have already done it. In fact we do it about every week or month as we sell G620 HTPCs often and test them before shipping.

The claim you list is not correct.
post #47 of 78
Thread Starter 
Enough testing for today, as it's Father's Day. Hope at least some of you found the tests interesting (sorry you did not, renethx). I know I did. Please feel free to comment and let me know what you think, as I value your opinion even if it differs from mine.
post #48 of 78
post #49 of 78
Thread Starter 
Quote:

Looks like there are promising fixes and workarounds. This isn't just limited to Intel but, again, I am sure Intel will be blamed as the only one this affects.
Quote:
I resolved the 29/59 frame rate issue on:

ASUS P8H67-M EVO motherboard (BIOS v2303) with HD2000 Integrated Graphics

Driver Version 8.15.10.2656

Windows 7 x64 SP1

Plugged in via HDMI

In the Driver Control Panel (Advanced Mode) I am currently setup as follows:

DISPLAY:

General Settings:

Resolution: 1920x1080

Color Depth: 32bit

Refresh Rate: 60i Hz (I have a 1080i panel)

Color Enhancement:

YCbCr: CHECKED

Monitor / TV Settings: [note: I could never get these to 'stick' to any other values, but I did play around here]

Quantization Range: Limited

IT Content: CHECKED

MEDIA:

Color Enhancement:

Standard Color Correction: Driver Settings

Brightness: 0.0

Contrast: 1.00

Hue: 0.0

Saturation: 1.00

Total Color Correction: UNCHECKED

Image Enhancement:

Noise Reduction: Driver Automatic Settings

Luma: SELECTED

Sharpness: Driver Automatic Settings

Skin Tone Enhancement: UNCHECKED

Other Settings:

Film Mode Detection: UNCHECKED

Adaptive Contrast Enhancement: UNCHECKED

Image Scaling:

Scaling: UNCHECKED

NOTE:

After doing all of this, I also did the following, which in the end seems to have been the trick:

- Right Click on the HD Graphics Notification Center Icon

- Select Graphic Options, Profiles, Media Properties, and Default Profile.

No check mark will appear next to Default Profile if you check it; however, when I had WMC in windowed mode on HBOHD and performed this, the flickering stopped immediately. I rebooted and made sure things were still good, and indeed they were. I also confirmed that a recorded video which previously experienced this problem no longer had the visual flicker, even though when I looked at '411-More Info' I could still see the rapid cycling between 59 and 29 on the Presentation Mode screen.

Seems like this issue is bigger than Intel, AMD, NVIDIA and even Microsoft. It's a multi-centered, complete screwup by the companies that broadcast the metadata.
Quote:
Solution

There is only one true solution to this problem: Encode the video with the correct metadata to begin with. The solution is the responsibility of the entity that encoded the video, which means contacting the broadcaster, cable network, or cable company that encoded the video and working with their engineers to resolve the issue. Additionally, Microsoft has acknowledged this issue in their Knowledge Base Article ID 2658140 with no known Windows Media Center resolution at this time.

Edit: I just checked all my channels and I don't have any that have the 29/59 bug, so I won't be able to test. I would have loved to have been able to find a durable fix.
Edited by assassin - 6/17/12 at 2:30pm
post #50 of 78
This has been an interesting read. I also tried this and did not find any OTA TV shows that are having this issue. However, I am still disappointed in the TV quality of fast-moving video and not sure if it's my HD2000 GPU or just the way it is with the OTA signal...
post #51 of 78
Thread Starter 
Quote:
Originally Posted by jbilliel View Post

This has been an interesting read. I also tried this and did not find any OTA TV shows that are having this issue. However, I am still disappointed in the TV quality of fast-moving video and not sure if it's my HD2000 GPU or just the way it is with the OTA signal...

Thanks. I thought so as well.

I get pixelation on my DirecTV HD DVR box as well on some fast-moving content on some channels, which obviously has absolutely nothing to do with the HTPC. This is perhaps the best explanation of pixelation -- even from OTA sources -- that I have ever read from one of our AV members (so good that I bookmarked it):
Quote:
HDTV broadcasts are in a compressed data format in which both full frames and partial frame updates are used. If a frame update is dropped in transmission, then the next frame update will often apply to the wrong pixels on the screen, causing pixelation. This is very common in high-motion or panning HD video, where almost every transmission can be a full frame of content and there is not enough bandwidth available for all of these transmissions to complete normally from your source to your receiver.
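
To make that mechanism concrete, here is a toy sketch (purely illustrative; no real codec is involved, and the keyframe interval and values are made up) of why one dropped delta update keeps the picture corrupted until the next full frame arrives:

```cpp
// Toy model of keyframe + delta-frame decoding. One dropped delta means every
// later delta is applied to a stale base picture, so the output stays
// corrupted ("pixelated") until the next full keyframe resynchronizes it.
#include <cstdio>
#include <vector>

int main() {
    const int KEYFRAME_INTERVAL = 6;
    // Whether each transmitted update arrived; frame 4's delta is lost.
    std::vector<bool> received = {true, true, true, true, false, true,
                                  true, true, true, true, true,  true};

    int encoded = 0;   // what the broadcaster's encoder "meant"
    int decoded = 0;   // what the receiver reconstructs
    for (size_t i = 0; i < received.size(); ++i) {
        bool keyframe = (i % KEYFRAME_INTERVAL == 0);
        encoded = keyframe ? 100 * static_cast<int>(i / KEYFRAME_INTERVAL)
                           : encoded + 1;

        if (keyframe) {
            decoded = encoded;       // full picture: always resynchronizes
        } else if (received[i]) {
            decoded = decoded + 1;   // delta applied to whatever base we have
        }                            // dropped delta: base is now stale

        printf("frame %2zu  %s  sent=%3d  shown=%3d  %s\n",
               i, keyframe ? "KEY  " : "delta", encoded, decoded,
               decoded == encoded ? "ok" : "CORRUPT");
    }
    return 0;
}
```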
post #52 of 78
I just built an HTPC with a G620 CPU, using integrated graphics and WMC. Its main function is to be used as an OTA DVR. It is connected via HDMI to a 1280x720 HDTV.

With the Intel graphics set for a 1280x720 display, the properties indicate a refresh rate of 29 Hz. (the only choices are 29 or 30).

What does the HTPC output when watching a 720p program? Isn’t this a 720p60 signal? Am I losing half of the frames? What happens with a 1080i program?
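
One way to see exactly which refresh rates the Intel driver exposes at each resolution (and whether any 29/30 Hz entries are actually interlaced modes) is to enumerate the modes it reports. A minimal sketch, assuming the primary display:

```cpp
// Sketch: list every mode the primary display driver reports, so you can see
// which refresh rates are actually offered at 1280x720 and 1920x1080.
#include <windows.h>
#include <cstdio>

int main() {
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);
    for (DWORD i = 0; EnumDisplaySettings(nullptr, i, &dm); ++i) {
        if (dm.dmPelsWidth >= 1280)   // skip low-res modes to keep output short
            printf("%lux%lu  %lu-bit  %lu Hz%s\n",
                   dm.dmPelsWidth, dm.dmPelsHeight,
                   dm.dmBitsPerPel, dm.dmDisplayFrequency,
                   (dm.dmDisplayFlags & DM_INTERLACED) ? "  (interlaced)" : "");
    }
    return 0;
}
```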
post #53 of 78
Quote:
Originally Posted by assassin View Post

Sure. But I don't need to do this test as I have already done it. In fact we do it about every week or month as we sell G620 HTPCs often and test them before shipping.
The claim you list is not correct.

So the counter-claim, then, is that it only works for the G620 -- it couldn't possibly work for the G530 or G440, which are too weak to bitstream.

You could probably post a picture of "DTS Master Audio" on your AVR with the G530 in the background and that guy would not believe you biggrin.gif. Not that he has tried it -- he just figures it wouldn't work. No explanation that bitstreaming is passthrough and has nothing to do with the CPU will suffice.
post #54 of 78
Thread Starter 
Quote:
Originally Posted by StardogChampion View Post

So the counter-claim, then, is that it only works for the G620 -- it couldn't possibly work for the G530 or G440, which are too weak to bitstream.
You could probably post a picture of "DTS Master Audio" on your AVR with the G530 in the background and that guy would not believe you biggrin.gif. Not that he has tried it -- he just figures it wouldn't work. No explanation that bitstreaming is passthrough and has nothing to do with the CPU will suffice.

I don't think the strength of the CPU has anything at all to do with HD Audio bitstreaming. I am not going to test my G530 as it is in my server which is running WHS2011.
post #55 of 78
Quote:
Originally Posted by assassin View Post

I don't think the strength of the CPU has anything at all to do with HD Audio bitstreaming. I am not going to test my G530 as it is in my server which is running WHS2011.

I haven't tried it in the DSPlayer, but I've bitstreamed with the G530 in MPC-HC, WMC and WMP, so yeah, I don't get his assertion that it won't work.
post #56 of 78
Quote:
Originally Posted by StardogChampion View Post

One more thing to test, if you don't mind. There's a claim that the G530 or G620 cannot bitstream HD audio in XBMC. The claimant never tried it -- it's just a theory, but of course, since it's the internet, all theories are true until proven otherwise: those CPUs couldn't possibly be powerful enough to bitstream HD audio (because CPU power somehow matters?!). I tried to explain that the CPU doesn't matter as long as the GPU supports it, but what do I know?
Care to give it a try?

 

No problem in bitstreaming HD audio with Celeron G530+XBMC (I tested). Bitstreaming audio uses very little CPU time. I see there are tons of foolish claimants on the Internet. smile.gif

Nevertheless, I don't recommend an Intel iGPU for XBMC, in particular a Celeron. You have to disable DXVA so that the video playback is in software mode, which results in high CPU usage (40-60% to play back HD video) and sometimes poorer PQ and/or stuttering for interlaced content.


Edited by renethx - 6/17/12 at 10:30pm
post #57 of 78

There are lots of pictures (most of them are pointless, unfortunately) and posts. Here is a quick summary of my own. smile.gif

 

Summary of Intel iGPU + XBMC

 

1. Intel iGPU's decoder and deinterlacer are broken in XBMC so that you have to disable DXVA2.

2. Then the video playback is completely in software mode and PQ is identical no matter what GPU you use (Intel, AMD or NVIDIA). Decoding is done by the CPU, and all DI/post-processing tasks are done by the CPU (or maybe the GPU's pixel shaders) with XBMC's own algorithms.

3. As a result, you will see higher CPU usage than in DXVA2 mode. With a Celeron G530 2.4 GHz playing back HD AVC video, I saw:

 

- ~10% with DXVA2 on (often with heavy pixelation; that's the reason why DXVA2 has to be off)

- 40%-60% CPU usage with DXVA2 off

 

Deinterlacing is poorer than DXVA2 mode (at least in benchmarks) and often results in stuttering. Read also

 

- DI in DXVA2

- DI in non-DXVA2.

 

So the conclusion is pretty clear (to me smile.gif): avoid Intel iGPU for XBMC. Llano (I recommend A6-3500) is a better choice (or add a discrete card for the best PQ).

 

Update: I found that the A6-3500 is already one of the most popular CPU+GPU combinations in the XBMC community. No need for my recommendation. wink.gif

 

Summary of Intel iGPU's hardware deinterlacer

 

The Intel iGPU does deinterlacing with a fixed-function GPU unit called the "Media Sampler" plus the Intel graphics driver's algorithm. It works under any one of the following:

 

- Any DXVA2 video decoder + EVR (e.g. WMC/WMP, TMT, PowerDVD, MPC-HC with built-in decoder + EVR [CP])

- Any video decoder outputting NV12 with correct interlaced flag (e.g. ffdshow Video Decoder) + EVR

- Intel QuickSync video decoder (the latest build) + any video renderer

- Any video decoder + madVR

 

Deinterlacing quality is more or less equivalent to AMD's Vector Adaptive.

 

Problems? Personally I don't see any.
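
For anyone curious what the iGPU's hardware decoder actually advertises (independent of what XBMC or any other player does with it), here is a sketch that enumerates the DXVA2 decoder profile GUIDs the driver exposes. It only lists GUIDs; it doesn't prove a given player will use them. A minimal sketch, assuming an MSVC build:

```cpp
// Sketch: ask the graphics driver which DXVA2 decoder profiles it exposes.
// Build with MSVC; link d3d9.lib, dxva2.lib and ole32.lib.
#include <windows.h>
#include <objbase.h>
#include <d3d9.h>
#include <dxva2api.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;
    pp.hDeviceWindow = GetDesktopWindow();

    IDirect3DDevice9* dev = nullptr;
    if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                 GetDesktopWindow(),
                                 D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                 &pp, &dev))) {
        d3d->Release();
        return 1;
    }

    IDirectXVideoDecoderService* svc = nullptr;
    if (SUCCEEDED(DXVA2CreateVideoService(dev, __uuidof(IDirectXVideoDecoderService),
                                          reinterpret_cast<void**>(&svc)))) {
        UINT count = 0;
        GUID* guids = nullptr;
        if (SUCCEEDED(svc->GetDecoderDeviceGuids(&count, &guids))) {
            printf("Driver exposes %u DXVA2 decoder profiles:\n", count);
            for (UINT i = 0; i < count; ++i) {
                wchar_t s[64];
                StringFromGUID2(guids[i], s, 64);
                wprintf(L"  %s\n", s);
            }
            CoTaskMemFree(guids);   // the GUID list is allocated by the service
        }
        svc->Release();
    }
    dev->Release();
    d3d->Release();
    return 0;
}
```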


Edited by renethx - 6/23/12 at 6:28am
post #58 of 78
Quote:
Originally Posted by Mike99 View Post

I just built an HTPC with a G620 CPU, using integrated graphics and WMC. Its main function is to be used as an OTA DVR. It is connected via HDMI to a 1280x720 HDTV.
With the Intel graphics set for a 1280x720 display, the properties indicate a refresh rate of 29 Hz. (the only choices are 29 or 30).
What does the HTPC output when watching a 720p program? Isn’t this a 720p60 signal? Am I losing half of the frames? What happens with a 1080i program?

I made a mistake here. With a 1280x720 display, the properties indicate a refresh rate of 59 Hz. The 29 Hz shows up when choosing a 1920x1080 display.

My question is almost the same, as it would still pertain to the 29 Hz rate. What happens to a 720p60 signal, which has 60 frames/fields per second?
post #59 of 78
Quote:
Originally Posted by renethx View Post

Summary of Intel iGPU + XBMC

1. Intel iGPU's decoder and deinterlacer are broken in XBMC so that you have to disable DXVA2.

So the conclusion is pretty clear (to me smile.gif ): avoid Intel iGPU for XBMC. Llano (I recommend A6-3500) is a better choice (or add a discrete card for the best PQ).

A lot of this is still new to me. What is the advantage of using XBMC instead of WMC?
post #60 of 78
Quote:
Originally Posted by Mike99 View Post

Quote:
Originally Posted by Mike99 View Post

I just built an HTPC with a G620 CPU, using integrated graphics and WMC. Its main function is to be used as an OTA DVR. It is connected via HDMI to a 1280x720 HDTV.
With the Intel graphics set for a 1280x720 display, the properties indicate a refresh rate of 29 Hz. (the only choices are 29 or 30).
What does the HTPC output when watching a 720p program? Isn’t this a 720p60 signal? Am I losing half of the frames? What happens with a 1080i program?

I made a mistake here. With a 1280x720 display, the properties indicate a refresh rate of 59 Hz. The 29 Hz shows up when choosing a 1920x1080 display.

My question is almost the same as it would still pertain to the 29 Hz rate. What happens to a 720p60 signal which has 60 frames/fields per second?


720p60 has 60 frames (not fields) per second. If the desktop refresh rate is 30 Hz, only 30 of those frames can be presented each second, so you lose half the temporal resolution (although you may not notice it).
