
AMD Llano - THE great HTPC chip? - Page 12

post #331 of 880
Quote:
Originally Posted by Zon2020 View Post

Can you give us a hint when your Sandy Bridge driver fix review will be out (and will we be happy or disappointed)?

This is off-topic in the Llano thread, but let me respond anyway:

Currently, the renderer locks to a non-23.976 fps playback frame rate, similar to NVIDIA (without the custom resolution). In this behaviour it is similar to what the UAC hack used to give (it is now available without messing with UAC).

The display refresh rate is also similar to what NVIDIA provides (drifting on either side of the locked frame rate and approaching 23.976), but the deviation on either side seems to be much larger than NVIDIA's.
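For context, here is a minimal sketch (plain Python; the 23.970 Hz figure is a hypothetical value, not a measurement) of how quickly such a deviation accumulates into a repeated or dropped frame:

Code:
# How fast a refresh-rate deviation turns into a repeated or dropped frame.
content_fps = 24000 / 1001              # 23.976... fps content
actual_hz = 23.970                      # hypothetical locked refresh rate
drift = abs(actual_hz - content_fps)    # frames of error accumulated per second
seconds_per_glitch = 1 / drift
print(f"one repeated/dropped frame every {seconds_per_glitch:.0f} s "
      f"(~{seconds_per_glitch / 60:.1f} min)")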

It is not as bad as Clarkdale, where the 23 Hz setting still refreshed at 24 Hz.

Why didn't I mention this in the Llano review? Users taking advantage of the UAC hack apparently ended up with lip sync errors some way into playback. I am trying to see whether the drivers I am testing also have this issue.

As to why the review is not up yet: I need to do some more testing on the actual PC, like measuring USB 3 speeds, running some standard benchmarks, etc.; nothing related to the HTPC-specific features of the SNB processor.
post #332 of 880
Quote:
Originally Posted by ilovejedd View Post

Hmm, yeah, I figured that's probably the case. Still, the Asus Z68 does appear to use more power compared to other boards.


It seems to me MSI tends to beat out the other manufacturers when it comes to idle power consumption.

I now have an MSI Zacate and an MSI H61.
post #333 of 880
The least expensive is a Biostar for $95, & it's got an internal SPDIF connector. The Biostar USA website still doesn't list FM1 motherboards, but the Biostar global website does.

For ~$25 less (actually a greater difference, due to no shipping charge), this minimum feature (in its more convenient rear-panel optical form) can be had for $70 on an H61 chipset motherboard. SATA3 and USB3 are nice features to have, but for a semi-dedicated playback-only HTPC they are probably don't-cares to me.

Hmm...
post #334 of 880
A8-3850 (AD3850WNGXBOX) and A6-3650 (AD3650WNGXBOX) are available at retail stores now.
post #335 of 880
post #336 of 880
A Tiger Direct email said they can start selling them today, but their website won't let you add them to the cart yet.
post #337 of 880
Newegg has the 3850 at $139, so $10 less than Tiger.

http://www.newegg.com/Product/Produc...-942-_-Product
post #338 of 880
Any indications on when we should start seeing Mini-ITX mobos?
post #339 of 880
The ASUS F1A75-I Deluxe and F1A75-I Mini-ITX motherboards will be available in August.
ASRock A75M-ITX? (This is not even listed on the ASRock website.)
post #340 of 880
I hope the 65W parts show up soon, specifically the 3800, which is just what I need.

Sucks that ITX is a month out and only ASUS is confirmed so far; usually good, just pricey.

Time to start popping by Micro Center on the way to work, too.
post #341 of 880
Quote:
Originally Posted by jakmal View Post

Maybe we can start collecting them over here; readers should feel free to contribute so that we can get AMD to look at each one and tick them off one by one.

1. ESVP not working for 1080p60 H.264: clips play back OK if all the post-processing options are fully turned off manually, so it should be fairly easy to get it to work.

2. High GPU utilization when using DXVA (compared to using the ATI proprietary calls) for non-Blu-ray files

3. Colour levels are not maintained properly when playing limited-range content (most videos) on a display set to 16-235 (there is an intermediate conversion to 0-255 and a loss of precision in this unnecessary expansion/contraction; see the sketch below)

4. Silent stream bug when bitstreaming HD audio to AVR

I suspect (3) and (4), but since I didn't have time to do extensive testing and rule out any other problems, I didn't mention them in the review.
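To make item (3) concrete, here is a minimal sketch (plain Python; the truncating 8-bit arithmetic is an illustrative assumption, not a description of AMD's actual pipeline) of how a needless limited-to-full-to-limited round trip loses precision:

Code:
# Sketch of the precision loss in (3): a 16-235 -> 0-255 -> 16-235 round trip in 8 bits.
# Truncating (floor) math is assumed here purely for illustration.
def expand(y):       # limited range (16-235) -> full range (0-255)
    return max(0, min(255, (y - 16) * 255 // 219))

def contract(y):     # full range (0-255) -> limited range (16-235)
    return y * 219 // 255 + 16

shifted = [y for y in range(16, 236) if contract(expand(y)) != y]
print(f"{len(shifted)} of 220 levels come back off by one")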

Thank you for posting these.
post #342 of 880
post #343 of 880
Video playback

jakmal pointed out in his review that at 1080p60 playback approximately half the frames were dropped. It looks like enabling Dynamic contrast is the main cause of the dropped frames here. I observed that the fps drawn fluctuates between 30fps and 60fps, seemingly periodically: in some scenes the fps suddenly drops, then a few seconds later it goes back to 60fps. So I disabled Dynamic contrast and, voila, there were zero dropped frames. Turning the other post-processors on or off affects dropped frames little to none. I tried DDR3-1333, 1600 and 1866; DDR3-1866 is a lot better at 1080p60 playback even with Dynamic contrast enabled (very few dropped frames).

As for madVR, the APU is good enough for SD/HD 24p content. However, I see lots of dropped frames (roughly two thirds of the frames) at 1080i60 playback (let alone 1080p60) with ffdshow Video Decoder (libavcodec H.264 decoder) + yadif (a software deinterlacer) + madVR. Edit: This is a bug in the latest libavcodec build that integrates ffmpeg-mt. After rolling back to SVN 3866 with ffmpeg-mt, playback of 1080i60 and 1080p60 became smooth as butter.

In summary, the new APU is good for:

Under DXVA/EVR

- HD/SD 24p
- HD/SD 60i
- HD/SD 60p if Dynamic contrast is disabled (well, who'd want to enable it anyway?)

Under ffdshow Video Decoder (ffmpeg-mt + yadif)/madVR and DDR3-1600 (see below)

- HD/SD 24p
- HD/SD 60i (VC-1 is not supported)
- HD/SD 60p

Obviously ffdshow/madVR is a better solution than DXVA/EVR because of consistent PQ and smoother playback (higher system power consumption is a drawback, however).

Memory selection

DXVA/EVR

You'd better stay away from DDR3-1333. DDR3-1600 is the minimum for stable GPU operation (only +$5 for 2 x 2GB), or go for DDR3-1866 if you can afford it (+$26; but at that point you'd be better off adding a discrete card).
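For a rough sense of why the faster DIMMs help the IGP, here is a back-of-the-envelope sketch (plain Python) of the theoretical peak bandwidth of a dual-channel, 64-bit-per-channel DDR3 interface at each speed grade:

Code:
# Theoretical peak bandwidth of a dual-channel (2 x 64-bit) DDR3 interface.
# Raw bus numbers only; the IGP shares this bandwidth with the CPU cores,
# so real playback headroom is considerably smaller.
for mt_s in (1333, 1600, 1866):
    gb_s = mt_s * 1e6 * 8 * 2 / 1e9     # transfers/s x 8 bytes/channel x 2 channels
    print(f"DDR3-{mt_s}: {gb_s:.1f} GB/s peak")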

ffdshow/madVR

Here are some slightly strange/surprising results. Number of dropped frames (those during the first few seconds are excluded):
            1080i60 (La Traviata)   1080p60 (Ginza Cat)
DDR3-1333   ~500                    0
DDR3-1600   4                       0
DDR3-1773   ~1000                   ~100
I tried a couple of times, with similar results, so obviously DDR3-1600 is the best choice. BTW, I saw stability issues at DDR3-1866 (at 1.65V; not the memory itself but the GPU), so I set BCLK to 95MHz (the default is 100MHz) so that the effective memory clock is 95MHz * 28/3 * 2 = 1773.333...MHz (28/3 = 9.333..., and doubled for DDR that is 18.666...).
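The clock arithmetic above as a minimal sketch (plain Python):

Code:
# Effective DDR3 data rate = BCLK x memory divider x 2 (double data rate).
# The 28/3 divider is the DDR3-1866 setting described above.
def effective_mt_s(bclk_mhz, divider=28 / 3):
    return bclk_mhz * divider * 2

print(effective_mt_s(100))   # 1866.67 MT/s at the default 100 MHz BCLK
print(effective_mt_s(95))    # 1773.33 MT/s at 95 MHz BCLK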

Dual display

The GPU is not powerful enough to drive dual displays stably (let alone decode two video streams at once), even if video memory is set to 1GB.
post #344 of 880
I attached an HD 6570 discrete graphics card in CrossFire mode; the Dual Graphics combination is now called HD 6630D2. The display can be attached to either of the GPUs, but with slightly different video playback performance.

1080i60

It looks like both GPUs are used evenly at video playback, a great improvement over the previous Hybrid CrossFireX.

1080p60

Obviously only the UVD of the GPU to which the display is attached is used. That means:

- If the display is attached to HD 6570, then playback is as good as HD 6570
- If the display is attached to HD 6550D, then playback is as good/bad as HD 6550D.
post #345 of 880
post #346 of 880
Some good info, renethx. So what level of discrete graphics would you say it stacks up against?

HD6450
HD6570
GT430
post #347 of 880
Quote:
Originally Posted by renethx View Post
Video playback
...
It looks like enabling Dynamic contrast is the main cause of the dropped frames here. I observed that the fps drawn fluctuates between 30fps and 60fps, seemingly periodically.
...
renethx, did you observe the GPU load spike up to 100% and down to normal levels like a sine wave? This is what the AMD engineers observed too, and I think it is more of a driver issue. Dynamic contrast shouldn't be as heavy on the GPU as some of the other post-processing algorithms. I am sure this is very useful info for the AMD engineers to continue debugging.

Quote:
I tried DDR3-1333, 1600 and 1866; DDR3-1866 is a lot better at 1080p60 playback even with Dynamic contrast enabled (very few dropped frames).
What is the power consumption difference between DDR3-1333 and DDR3-1866?

Quote:
Originally Posted by renethx View Post
1080i60

It looks like both GPUs are used evenly at video playback, a great improvement over the previous Hybrid CrossFireX.
This is the most surprising result of all (and one that I haven't been able to reproduce). I saw that for video activities, the GPU to which the display was connected was the only one active. I am going to update my drivers and see what the issue is (AMD engineers told me before the NDA lift that
Quote:
Dual Graphics scales performance in applications using D3D and OpenCL, but not for video playback at the moment.
)
post #348 of 880
Jakmal or Renethx,

Any chance either of you is able to test/check how badly Llano exhibits the 29/59 stutter bug?

Thanks in advance.
post #349 of 880
Renethx, what video player is that in the frame shot of La Traviata which provides that playback technical info? Thanks.
post #350 of 880
Quote:
Originally Posted by jakmal View Post

renethx, did you observe the GPU load spike up to 100% and down to normal levels like a sine wave? This is what the AMD engineers observed too, and I think it is more of a driver issue. Dynamic contrast shouldn't be as heavy on the GPU as some of the other post-processing algorithms. I am sure this is very useful info for the AMD engineers to continue debugging.

What is the power consumption difference between DDR3-1333 and DDR3-1866?

This is the most surprising result of all (and one that I haven't been able to reproduce). I saw that for video activities, the GPU to which the display was connected was the only one active. I am going to update my drivers and see what the issue is (AMD engineers told me before the NDA lift that ...)

No, I don't see it. If Dynamic contrast is on, GPU usage fluctuates between 90% and 95% and "Frame rate" goes down and up between 30fps and 60fps. If it is off, GPU usage is 80%-90% and "Frame rate" is pretty constant, ~60fps. I played Ginza Cat and Basketball dozens of times and I have never seen "Frame rate" drop if Dynamic contrast is off.

DDR3-1866 usually requires 1.65V or more to run stably, and that increases the power consumption by 2-3 watts. Most DDR3-1600 runs stably at 1.50V, and the power consumption is almost the same as DDR3-1333 at 1.50V.

I found the secret: if the display is connected to the HD 6550D, the load is balanced between the GPUs; if the display is connected to the HD 6570, only the HD 6570 is used, at least for video playback.

The display connected to HD 6550D

The display connected to HD 6570

BTW, all the post-processors including Dynamic contrast are on, and the memory is DDR3-1600. In either case I see zero dropped frames (except at startup). Well, that's expected.
post #351 of 880
Quote:
Originally Posted by Zon2020 View Post

Renethx, what video player is that in the frame shot of La Traviata which provides that playback technical info? Thanks.

The player is the MPC HomeCinema BE mod found at XvidVideo.RU, with View > Statistics enabled.
post #352 of 880
renethx,

Overall, this is welcome progress. Disappointed to hear that the load stays at 95% despite disabling dynamic contrast, though.

Now, if only ESVP were configured to disable Dynamic contrast on detection of 1080p60 H.264 (or, well, it could disable all post-processing if it wants to!)

The worrisome point is that not many HTPC users will go for the higher rated DRAM modules.

I hope there can be some driver 'magic' to optimize memory bandwidth usage and handle the extra latency :| Now I am also not so optimistic about the prowess of the mobile Llano chips for HTPCs :| (Mobile SNB makes a pretty decent HTPC, IMHO.)
post #353 of 880
Right now, buying DDR3-1600 9-9-9-24 2 x 2GB like this one ($44) and disabling Dynamic contrast permanently is the best option for the A-Series APU. That works pretty well for (almost) whatever video file you throw at it with either DXVA/EVR or madVR (Dynamic contrast or any other AMD post-processor is irrelevant to madVR, of course). Surely memory management is the weakest point of the APU. I expect improvement in Trinity and later.

Intel has improved its GPU incrementally (G4x chipset, Clarkdale [a separate die on the same package], SNB [on the same die]) and it's pretty mature (apart from long-time negligence of the 23.976Hz issue). Even with DDR3-1333, 1080p60 is no problem. (There are not enough EUs for madVR, however.) AMD tried to integrate a discrete-class GPU on the die in the new 32nm process. This is the 1st gen of the APU, and I think it works pretty well for a first attempt. Trinity, with Bulldozer cores and better memory management, will improve performance considerably (I hope).
post #354 of 880
95%???

I know there are no dropped frames but doesn't that worry you?
post #355 of 880
If Dynamic contrast is disabled, GPU usage is 80-90%; jakmal read it wrong.
post #356 of 880
Quote:
Originally Posted by renethx View Post
If Dynamic contrast is disabled, GPU usage is 80-90%; jakmal read it wrong.
That still seems a little high. It's okay for now, but I am a little concerned about it being able to handle future upgrades in technology.

What's the GPU usage of the integrated Intel graphics by comparison?
post #357 of 880
AFAIK there is no tool that reports the GPU usage of Intel HD Graphics.

That's why there is a PCI Express x16 slot for a discrete graphics card. We don't have to upgrade the platform/CPU/APU too often; just add a new graphics card and we're done.
post #358 of 880
Thread Starter 
Quote:
Originally Posted by renethx View Post
That's why there is a PCI Express x16 slot for a discrete graphics card. We don't have to upgrade the platform/CPU/APU too often; just add a new graphics card and we're done.
*sigh* It is exactly for this reason that I'm hesitant to go Llano. If I'm going to need a discrete GPU anyway, I might as well go for the faster Sandy Bridge CPU. Granted, high-end Llano has the benefit of having four physical cores, but most applications suffer some overhead from multi-threading anyway. For the apps I'll be using, two of SNB's faster cores are better than four of Llano's slower ones. The slow CPU would have been acceptable had the GPU performed between HD 5570 and 5670 level, as earlier speculated. As it is, Llano seems horribly bandwidth-limited.

I think I'll still get Llano, but I'll probably go with the less expensive dual-core A4 or E2 model. At the very least, the A4's GPU should perform better than an HD 4550, and that's the minimum performance level I require. An HD 5570 would have allowed me to do more with the HTPC; alas, it appears we won't be getting that level of performance until Trinity. Hopefully Trinity will still be on socket FM1 so I can just do a drop-in replacement to increase both CPU and GPU performance.
post #359 of 880
Actually, I like Llano very much. Finally I can get the superb PQ of madVR without adding a discrete graphics card (DXVA/EVR is also very nice). Once the 65W A6-3600 is released, that will be the best option for SFF + madVR.
post #360 of 880
Thread Starter 
Quote:
Originally Posted by renethx View Post

Actually, I like Llano very much. Finally I can get the superb PQ of madVR without adding a discrete graphics card (DXVA/EVR is also very nice). Once the 65W A6-3600 is released, that will be the best option for SFF + madVR.

It really is all about the pricing. If the A6-3600 65W is priced at ~$100, it'd probably sell like hotcakes. I'd definitely go for one of those. At $120 for the A6-3650 and $140 for the A8-3850, meh...

By the way, are there any reviews of the A6-3650? From memory, it has 320 SPs compared to 400 on the A8-3850, so it'll certainly be interesting to see whether there's a noticeable difference in performance or whether the latter is just extremely bandwidth-limited.