1 - 4 of 4 Posts

Registered · 406 Posts · Discussion Starter · #1
 Tom's Hardware has been running an article describing overclocking of the Intel Core 2 E6300 processor, showing that it can generally outperform the $999 Core 2 X6800 CPU in a number of tests. The two CPUs differ in clock frequency, cache size (4MB vs. 2MB shared) and price.


This is interesting in itself, but I think the results for MPEG2 -> H.264 1080 transcoding are particularly interesting for HTPC applications.

http://www.tomshardware.com/2007/01/...eme/page8.html


"MainConcept H.264 Encoder v2 Version: 2.1

2:19 min MPEG2-source 1920x1080 to H.264

Profile: High

Audio: AAC

Stream: Program "


This seems to be a transcode of 1080 HDTV material (unknown bitrates) from MPEG2 to H.264. OTOH, I would guess that H.264 encoding takes the bulk of the processing time, and that the same scaling should apply to a pure H.264 decoder as well?


For the special case of a purely CPU-limited application, one would expect a 1:1 scaling between CPU clock and execution speed. Normally this isn't so, because typical applications may be limited by graphics card performance (games), hard drive performance, memory performance, etc.


If we calculate (processing time) * (clock frequency):


10:58 (658 seconds) at 1.86 GHz -> 1224

8:26 (506 seconds) at 2.44 GHz -> 1235

7:07 (427 seconds) at 2.93 GHz (X6800) -> 1251

6:12 (372 seconds) at 3.4 GHz -> 1265
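A quick Python sketch of the arithmetic above (times and clocks taken from the quoted Tom's Hardware runs): if the product of encode time and clock frequency is roughly constant, the task scales almost 1:1 with clock.

```python
# Check whether H.264 encode time scales inversely with CPU clock.
# A near-constant (seconds * GHz) product means near-perfect clock scaling.
runs = [
    (658, 1.86),  # 10:58 at stock E6300
    (506, 2.44),  # 8:26 overclocked
    (427, 2.93),  # 7:07 at X6800 clock
    (372, 3.40),  # 6:12 fully overclocked
]

products = [round(t * f) for t, f in runs]
print(products)  # [1224, 1235, 1251, 1265]

# Relative spread of the products around their mean.
spread = (max(products) - min(products)) / (sum(products) / len(products))
print(f"{spread:.1%}")  # 3.3%
```

The products drift by only ~3% across an 83% clock increase, which is what the near-linear-scaling conclusion below rests on.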


It is evident that for this particular task, increasing the CPU clock leads to a near-linear increase in encoding performance, and that the larger 4MB cache seems to have no impact.


One can only speculate whether this extends to other similar tasks, such as real-time decoding of high-bitrate H.264/VC-1 material.


If we compare with WinRAR (a ZIP-like utility for lossless compression of general data files):


The 3.4GHz overclocked E6300 gives virtually no improvement over the 2.93GHz X6800.


Possible reasons:

1. Cache size is important for this application

2. At above ~3 GHz, this platform cannot improve WinRAR performance due to other bottlenecks such as hard disk latency/bandwidth


-k
 

Registered · 2,914 Posts
It probably does scale for decoding, but it doesn't matter too much, as even a 2GHz part can decode any HD DVD or broadcast H.264 (even 20Mbit MBAFF), regardless of the graphics card.


Maybe if someone is daft enough to produce 35Mbit Blu-ray AVC you might need more, but I'm doubtful that will ever happen, as some set-top Blu-ray players might not have the processing grunt for that either.
 

Registered · 1,548 Posts
Let me add a few comments to this, since I've actually done overclocking on my computer:

Core 2 Duo is very easy to overclock. I run my E6400 at 2.8GHz daily (up from the original 2.13GHz) with everything stock and all adjustable voltages at minimum, which makes my computer run cooler now at the higher speed than it did originally with the higher factory-set voltages. I also got the CPU to run at 3.4GHz at higher vcore, but without some extra cooling I don't feel comfortable running it that high all the time. All you really need is a capable motherboard, and you can easily get 20%-50% higher performance without even trying.

As for why? Well, why not? Playing straight HD video may be fine even on an E6300, but try adding some filters like noise reduction, sharpening, etc., and you'll bring even the fastest X6800 to its knees in no time. And don't forget the sound processing and all the background tasks run by the operating system. Having some headroom now will save some headaches later. Depending on the particular application the bottleneck may not be the CPU, but it's easy enough to make sure.

The reason WinRAR runs slower on the overclocked E6300 is the X6800's 4MB cache, but it still ran much faster than on the stock E6300, didn't it?
 

Registered · 406 Posts · Discussion Starter · #4

Quote:
Originally Posted by arfster


It probably does scale for decoding, but it doesn't matter too much, as even a 2GHz part can decode any HD DVD or broadcast H.264 (even 20Mbit MBAFF), regardless of the graphics card.


Maybe if someone is daft enough to produce 35Mbit Blu-ray AVC you might need more, but I'm doubtful that will ever happen, as some set-top Blu-ray players might not have the processing grunt for that either.
http://www.anandtech.com/video/showdoc.aspx?i=2886&p=4

"We tested CPU utilization by using perfmon to record data while we viewed a section of X-Men: The Last Stand. The bookmark feature really helped out, allowing us to easily jump to the specific scene we wanted to test in Chapter 18. In this scene, the Golden Gate is being torn apart and people are running everywhere. This is one of the most stressful scenes in the movie, reaching a bitrate of over 41 Mbps at one point.

...
"


The fact that even a powerful Core 2 E6600 peaked at 100% CPU utilisation during pure software playback seems to indicate that, for this single title and at the current level of decoder optimisation, CPU choice and clock scaling do matter.


-k
 