Trinity vs. i3 Ivy Bridge - Page 2 - AVS Forum
post #31 of 51 | 09-27-2012, 08:13 AM
Ruiner | Advanced Member | Join Date: Dec 2002 | Posts: 535
A8/A10 GPU benchmarks (GPU only, per AMD's NDA) for Trinity are up at the usual suspects' sites. No surprises: 10-15% over Llano in games, creams the HD 4000, and good enough for modern titles at 1080p/low or 720p/medium settings, give or take.

Release is probably 10/2, with the A8/A10 likely priced against the i3.

Some power numbers here, but only for A10: http://www.xbitlabs.com/articles/graphics/display/amd-trinity-graphics_11.html#sect0
post #32 of 51 | 09-27-2012, 09:15 AM
nathanddrews | Advanced Member | Join Date: Jun 2012 | Location: Minnesota | Posts: 933
HTPC-specific benchmarks:

http://www.anandtech.com/show/6335/amds-trinity-an-htpc-perspective
post #33 of 51 | 09-27-2012, 09:24 AM
assassin | AVS Addicted Member | Join Date: Jul 2004 | Posts: 12,961
Quote:
Originally Posted by nathanddrews

HTPC-specific benchmarks:
http://www.anandtech.com/show/6335/amds-trinity-an-htpc-perspective

Glad to see that drivers are improving for AMD. Was quite disappointed with power consumption though.

Overall I like what I am seeing with the new Trinity platform.
post #34 of 51 | 09-27-2012, 09:45 AM
Ruiner | Advanced Member | Join Date: Dec 2002 | Posts: 535
The 65W A8 seems to be the HTPC choice over the A10-5800K Anand tested. That article shows 84% GPU utilization with madVR and LAV copy-back for 1080i60 VC-1...on the A10. I wonder if the A8 could pull that off without hitting 100%. Is that type of content common enough to be an issue?
post #35 of 51 | 09-27-2012, 11:29 AM
jeffkro | AVS Special Member | Join Date: Aug 2009 | Posts: 2,053
http://www.anandtech.com/show/6332/amd-trinity-a10-5800k-a8-5600k-review-part-1/8

Wow, Trinity manages to beat Ivy Bridge in system idle power consumption. Pretty good, since it's got more transistors on a larger process.
post #36 of 51 | 09-27-2012, 12:19 PM
jakmal | AVS Special Member | Join Date: Dec 2008 | Location: Sunnyvale, CA | Posts: 1,059
Quote:
Originally Posted by assassin

Glad to see that drivers are improving for AMD. Was quite disappointed with power consumption though.
Overall I like what I am seeing with the new Trinity platform.

Same here. Very disappointed with the power profile, but I have to say that it is a 100W TDP part, so I should probably cut it some slack. Not sure what would happen with one of the 65W TDP parts. Also, I'm hoping that future driver releases don't break what is working in the current one.

Quote:
Originally Posted by Ruiner

The 65W A8 seems to be the HTPC choice over the A10-5800K Anand tested. That article shows 84% GPU utilization with madVR and LAV copy-back for 1080i60 VC-1...on the A10. I wonder if the A8 could pull that off without hitting 100%. Is that type of content common enough to be an issue?

Please note that DXVA2CB + madVR for 1080i60 / 1080p60 is NOT working properly. Note the bolded text beneath the table. We have software decode + madVR working without problems. For Ivy Bridge, even software decode + madVR was an issue. However, in the QS or DXVA2CB / madVR case, I have to say that the number of dropped frames in the IVB setup was actually lower than in the Trinity setup. DXVA2CB seems to be taking up more GPU resources than expected on Trinity, and that actually results in madVR dropping frames. It is not a memory bandwidth issue, because I saw only a very slight improvement with DDR3-2133. I wasn't allowed to talk about overclocking in the review, but I will mention here that OCing the GPU is possible, and it didn't help in this case.

If you want DXVA2CB + madVR, a discrete GPU is the only solution.

Btw, 1080i60 content is quite common. Most of the BBC documentaries on Blu-ray are encoded in 1080i60 VC-1 IIRC.
Quote:
Originally Posted by jeffkro

http://www.anandtech.com/show/6332/amd-trinity-a10-5800k-a8-5600k-review-part-1/8
Wow, Trinity manages to beat Ivy Bridge in system idle power consumption. Pretty good, since it's got more transistors on a larger process.

Idle power consumption is quite easy to manage with power gating. Once you power gate, leakage power is quite minimal. A difference of a few watts can be attributed to environmental factors such as PSU efficiency, chipset efficiency, etc. The proper way to judge this would be to measure the CPU package power. HWiNFO provides that for Intel CPUs, but AMD doesn't play nice. Only AMD's System Monitor program gives a few insights, and even that is restricted to CPU / GPU load sharing and separate CPU / GPU and memory loads.

Ganesh T S
Sr. Editor, AnandTech Inc.
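
For anyone who wants to try the package-power measurement jakmal describes: HWiNFO reads Intel's RAPL energy counters on Windows, and on Linux the same counters are exposed through the powercap sysfs interface. A minimal sketch, assuming a Linux box with an Intel CPU and a readable /sys/class/powercap/intel-rapl:0 node (the path and sampling interval here are assumptions, not anything quoted in the thread):

Code:
# Minimal sketch: average Intel CPU package power from the Linux RAPL
# powercap interface. Assumes an Intel CPU with a readable
# /sys/class/powercap/intel-rapl:0 node (root may be required).
import time

RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_energy_uj() -> int:
    # Cumulative package energy in microjoules.
    with open(RAPL_ENERGY) as f:
        return int(f.read().strip())

def package_watts(interval_s: float = 1.0) -> float:
    # Sample the counter twice; the delta over time is average watts.
    # (The counter wraps eventually; this sketch ignores that case.)
    e0 = read_energy_uj()
    time.sleep(interval_s)
    e1 = read_energy_uj()
    return (e1 - e0) / 1e6 / interval_s

if __name__ == "__main__":
    print(f"CPU package power: {package_watts():.1f} W")

As jakmal notes, AMD parts of the Trinity era expose no equivalent counter to the end user, so this only works on the Intel side.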
post #37 of 51 | 09-27-2012, 12:25 PM
im bored | Member | Join Date: Feb 2012 | Posts: 42
There's something I don't get: if the A10-5800K is supposed to be a 100W CPU+GPU, then why is it going all the way up to 133W in that test? The same goes for the A8-5600K: if that's a 65W part, then that should be the absolute max it would ever use. Am I missing something, or are these 65W and 100W figures just marketing BS?
post #38 of 51 | 09-27-2012, 12:34 PM
jakmal | AVS Special Member | Join Date: Dec 2008 | Location: Sunnyvale, CA | Posts: 1,059
Quote:
Originally Posted by im bored

There's something I don't get: if the A10-5800K is supposed to be a 100W CPU+GPU, then why is it going all the way up to 133W in that test? The same goes for the A8-5600K: if that's a 65W part, then that should be the absolute max it would ever use. Am I missing something, or are these 65W and 100W figures just marketing BS?

That is power consumed at the wall by the whole system. Please compute the numbers relative to the idle power. Even that number includes power consumed by network activity or SATA read activity, which is not part of the 100W TDP (that figure is for the processor alone). As I mentioned in the previous post, CPU package power is the correct metric to check, but AMD doesn't expose it to the end user while Intel does.

Ganesh T S
Sr. Editor, AnandTech Inc.
post #39 of 51 | 09-27-2012, 12:51 PM
tuskers | Newbie | Join Date: Jan 2005 | Posts: 12
Quote:
Originally Posted by im bored

There's something I don't get: if the A10-5800K is supposed to be a 100W CPU+GPU, then why is it going all the way up to 133W in that test? The same goes for the A8-5600K: if that's a 65W part, then that should be the absolute max it would ever use. Am I missing something, or are these 65W and 100W figures just marketing BS?

That is total system power measured at the wall. It includes all the electronics in the entire computer (system drive, optical drive, motherboard, memory) and doesn't take into account the efficiency of the power supply. With an 85% efficient power supply, the system is actually using about 113 watts, and it's very reasonable to assume that the rest of the system is using more than the remaining 13 watts.

Also, I think everyone saying "Haswell is just around the corner" needs to realize that if you'd buy an i3 today, the Haswell i3-class chips won't hit the market for at least nine months, and probably more. The Ivy Bridge i3s literally just hit the market. Haswell's launch chips will almost certainly be i5 class at the lowest, with pricing over $200.
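
Putting rough numbers on the wall-power arithmetic from the two posts above, a back-of-envelope sketch (the 133W reading and the 85% PSU efficiency come from the thread; the idle figure is a hypothetical placeholder, since no exact idle number is quoted here):

Code:
# Back-of-envelope check of the wall-power reasoning above.
# WALL_LOAD_W and PSU_EFFICIENCY come from the thread; IDLE_WALL_W
# is a hypothetical placeholder value.
WALL_LOAD_W = 133.0    # measured at the wall under load
PSU_EFFICIENCY = 0.85  # assumed 85% efficient power supply
IDLE_WALL_W = 40.0     # hypothetical idle wall reading
TDP_W = 100.0          # A10-5800K rated TDP

# Power actually delivered to the components, after PSU losses.
dc_load_w = WALL_LOAD_W * PSU_EFFICIENCY
print(f"DC power under load: {dc_load_w:.0f} W")  # ~113 W

# jakmal's method: the load delta over idle bounds what the APU
# (plus disk/network activity) adds on top of the idle baseline.
print(f"Wall delta over idle: {WALL_LOAD_W - IDLE_WALL_W:.0f} W")

# tuskers' method: if the rest of the system draws more than the
# ~13 W left over, the APU is staying inside its 100 W TDP.
print(f"Left over beyond the TDP: {dc_load_w - TDP_W:.0f} W")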
post #40 of 51 | 09-27-2012, 01:01 PM
Ruiner | Advanced Member | Join Date: Dec 2002 | Posts: 535
Jakmal,
What was the CPU utilization like when running madVR with software decode?

re: power
http://hothardware.com/Reviews/AMD-A10-and-A8-Virgo-APU-Experience-and-Gaming/?page=8
http://www.xbitlabs.com/articles/graphics/display/amd-trinity-graphics_11.html#sect0

It may be safe to say that an i3 and a ~40W discrete GPU would use less power under load and outperform an A10, possibly significantly on both counts.
post #41 of 51 | 09-27-2012, 01:05 PM
jakmal | AVS Special Member | Join Date: Dec 2008 | Location: Sunnyvale, CA | Posts: 1,059
Quote:
Originally Posted by Ruiner

Jakmal,
What was the CPU utilization like when running madVR with software decode?

It was around 30-40% for the 1080p60 material. Software decode using avcodec makes very good use of multithreading. I am sure even 4K decode in software is not an issue.

Ganesh T S
Sr. Editor, AnandTech Inc.
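
A quick way to see how well avcodec's threaded software decode scales on your own clips is to time the stock ffmpeg CLI at different thread counts. A minimal sketch from a small Python wrapper ("-f null -" decodes without writing output; the input file name is a placeholder):

Code:
# Minimal sketch: time software decode at several thread counts with
# the stock ffmpeg CLI. "-f null -" decodes without encoding or
# writing output; "-threads" before "-i" sets decoder threads.
# The input file name is a placeholder.
import subprocess
import time

SAMPLE = "sample_1080p60.mkv"  # hypothetical test clip

for threads in (1, 2, 4):
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-threads", str(threads), "-i", SAMPLE,
         "-f", "null", "-"],
        check=True,
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    elapsed = time.perf_counter() - start
    print(f"{threads} thread(s): {elapsed:.1f} s")

Decode time dropping sharply from one thread to four would be consistent with the 30-40% overall CPU utilization jakmal reports for 1080p60 on Trinity's four cores.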
post #42 of 51 | 09-27-2012, 01:14 PM
jeffkro | AVS Special Member | Join Date: Aug 2009 | Posts: 2,053
Quote:
Originally Posted by jakmal

Same here. Very disappointed with the power profile, but I have to say that it is a 100W TDP part, so I should probably cut it some slack. Not sure what would happen with one of the 65W TDP parts. Also, I'm hoping that future driver releases don't break what is working in the current one.
Please note that DXVA2CB + madVR for 1080i60 / 1080p60 is NOT working properly. Note the bolded text beneath the table. We have software decode + madVR working without problems. For Ivy Bridge, even software decode + madVR was an issue. However, in the QS or DXVA2CB / madVR case, I have to say that the number of dropped frames in the IVB setup was actually lower than in the Trinity setup. DXVA2CB seems to be taking up more GPU resources than expected on Trinity, and that actually results in madVR dropping frames. It is not a memory bandwidth issue, because I saw only a very slight improvement with DDR3-2133. I wasn't allowed to talk about overclocking in the review, but I will mention here that OCing the GPU is possible, and it didn't help in this case.
If you want DXVA2CB + madVR, a discrete GPU is the only solution.
Btw, 1080i60 content is quite common. Most of the BBC documentaries on Blu-ray are encoded in 1080i60 VC-1 IIRC.
Idle power consumption is quite easy to manage with power gating. Once you power gate, leakage power is quite minimal. A difference of a few watts can be attributed to environmental factors such as PSU efficiency, chipset efficiency, etc. The proper way to judge this would be to measure the CPU package power. HWiNFO provides that for Intel CPUs, but AMD doesn't play nice. Only AMD's System Monitor program gives a few insights, and even that is restricted to CPU / GPU load sharing and separate CPU / GPU and memory loads.

It's also tied to how much has been moved off the motherboard controller chip onto the CPU. The closer you move to the SoC concept, the more efficient you get. ARM / Windows RT-based nettops should be able to spank anything from Intel or AMD for this reason, but will they get DVR software and good hardware-based video acceleration support?
post #43 of 51 | 09-27-2012, 01:29 PM
jeffkro | AVS Special Member | Join Date: Aug 2009 | Posts: 2,053
Quote:
Originally Posted by Ruiner

The 65W A8 seems to be the HTPC choice over the A10-5800K Anand tested. That article shows 84% GPU utilization with madVR and LAV copy-back for 1080i60 VC-1...on the A10. I wonder if the A8 could pull that off without hitting 100%. Is that type of content common enough to be an issue?

I would have liked to see them put the dual-core chips through this test to see if they could pull everything off; they might be fully capable for HTPC use.
post #44 of 51 | 09-27-2012, 01:38 PM
jeffkro | AVS Special Member | Join Date: Aug 2009 | Posts: 2,053
"AMD's Southern Islands as well as NVIDIA's Kepler GPUs support 4K output over HDMI, but Trinity doesn't have 4K video decode acceleration or 4K display output over HDMI."

Ivy Bridge has the same limitation; it looks like 4K still requires discrete graphics. It might not make sense for Intel or AMD to support 4K until there is some demand for it. Anyway, the conclusion I draw from this article is that you can't go wrong anymore: both Trinity and Ivy Bridge make excellent HTPCs.
post #45 of 51 | 09-27-2012, 01:40 PM
jeffkro | AVS Special Member | Join Date: Aug 2009 | Posts: 2,053
Quote:
Originally Posted by jakmal

It was around 30-40% for the 1080p60 material. Software decode using avcodec makes very good use of multithreading. I am sure even 4K decode in software is not an issue.

Neither Ivy Bridge nor Trinity can output 4K, so it's not much of an issue.
post #46 of 51 | 09-27-2012, 01:45 PM
jakmal | AVS Special Member | Join Date: Dec 2008 | Location: Sunnyvale, CA | Posts: 1,059
Quote:
Originally Posted by jeffkro

"AMD's Southern Islands as well as NVIDIA's Kepler GPUs support 4K output over HDMI, but Trinity doesn't have 4K video decode acceleration or 4K display output over HDMI."
Ivy bridge had the same limitation, looks like 4K still requires discrete graphics. It might not make sense for intel or AMD to support 4K until there is some demand for it. Anyways the conclusion I draw from this article is that you can't go wrong anymore, both trinity and ivy bridge make excellent HTPC's.

Fully agree :) From a technical standpoint, both IVB and Trinity are good for most common HTPC workloads. Pricing is going to favour AMD, while power consumption is going to favour Intel. Depending on the end user's requirements (some gaming prowess needed, more CPU grunt needed, etc.), an appropriate choice can be made. Personally, I don't play games at all... so it is pretty clear what works for me :), but I can't say it is going to be the same for everyone.

Ganesh T S
Sr. Editor, AnandTech Inc.
post #47 of 51 | 09-27-2012, 03:03 PM
Tiddles88 | Advanced Member | Join Date: May 2012 | Posts: 583
Not that impressed. I'd rather get a G540 and a GTX 650; that seems to be the easiest option for decoding with madVR. Those new chips are still thirsty, and a discrete option is still better.
post #48 of 51 | 09-27-2012, 03:04 PM
assassin | AVS Addicted Member | Join Date: Jul 2004 | Posts: 12,961
Quote:
Originally Posted by Tiddles88

Not that impressed. I'd rather get a G540 and a GTX 650; that seems to be the easiest option for decoding with madVR. Those new chips are still thirsty, and a discrete option is still better.

I don't use madVR. Don't see a difference.
post #49 of 51 | 09-27-2012, 04:10 PM
jeffkro | AVS Special Member | Join Date: Aug 2009 | Posts: 2,053
Quote:
Originally Posted by assassin

I don't use madVR. Don't see a difference.

I agree; madVR seems like a whole lot of fuss, and it's a massive resource hog.
post #50 of 51 | 09-27-2012, 04:29 PM
Shark007 | AVS Special Member | Join Date: Apr 2007 | Location: BC, Canada | Posts: 1,097
Quote:
Originally Posted by assassin

I don't use madVR. Don't see a difference.

madVR is great for those people with too much time on their hands.
It keeps them busy chasing rainbows, but in the end, that pot of gold wasn't worth the effort.

Use Shark007 Codecs and retain your sanity.
post #51 of 51 | 09-27-2012, 04:48 PM
Tiddles88 | Advanced Member | Join Date: May 2012 | Posts: 583
I don't use madVR either; I was just speaking generally. I've left my HTPC at 50Hz, so 25p is fine. At 23.976/24, I can't see a difference or any judder. As for a quality difference, I doubt there is any either.