The Official Gigabyte GA-MA78GM-S2H RS780 mATX Thread - Page 14 - AVS Forum
post #391 of 4430 Old 03-04-2008, 10:53 AM
Senior Member
 
omholt's Avatar
 
Join Date: Nov 2006
Location: Norway
Posts: 278
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Quote:
Originally Posted by crabnebula View Post

Here is another review of the Gigabyte that gives HQV and HD HQV scores with an Athlon 64 X2 processor and with the 8.3 drivers that Gary from AnandTech mentions in his blog post.

http://www.bit-tech.net/hardware/200...ics_chipset/10

This time, we get 93/130 and 47/100 respectively. Interestingly, the deinterlacing performance for 1080i (especially film resolution loss) isn't judged to be quite as bad as it was at ocworkbench.

Subjective results, huh?

Hmmm. The Intel G35 scores much better. How visible are the differences on a
42" screen? Do they really matter much?
omholt is offline  
post #392 of 4430 Old 03-04-2008, 10:54 AM
AVS Special Member
 
vkristof's Avatar
 
Join Date: Dec 2002
Location: Long Island
Posts: 1,116
Quote:
Originally Posted by crabnebula View Post

By the way, that article gives a perfect HQV score (not HD) of 130 running with a 4850e processor, which is HT 2.0

Are the 4850es for sale anywhere online yet?
vkristof is offline  
post #393 of 4430 Old 03-04-2008, 11:00 AM
AVS Special Member
 
vkristof's Avatar
 
Join Date: Dec 2002
Location: Long Island
Posts: 1,116
Quote:
Originally Posted by ChrisMorley View Post

The CPU does change it because it enables the UVD to open up post processing abilities only available with an HT3.0 link. Also, I would challenge anyone to show me a real world issue with B2 stepping Phenoms outside of Virtualization environments. You can turn off the TLB patch in the GB board in the BIOS, FYI.

OK. Is AMD going to sell the slow-clocked Phenoms in the low-$100 price range then?

It sounds like you're saying that HT3.0 would help with some of the finer points of HD video processing. I'd bite if the Phenoms are in the 100-130 range that somebody mentioned previously.

For the hundredth time: I wish AMD success, competition is GOOD for the consumer. Me!
vkristof is offline  
post #394 of 4430 Old 03-04-2008, 11:07 AM
Senior Member
 
crabnebula's Avatar
 
Join Date: Jan 2005
Location: Montreal, Canada
Posts: 389
Quote:
Originally Posted by ChrisMorley View Post

The CPU does change it because it enables the UVD to open up post processing abilities only available with an HT3.0 link. Also, I would challenge anyone to show me a real world issue with B2 stepping Phenoms outside of Virtualization environments. You can turn off the TLB patch in the GB board in the BIOS, FYI.

Chris, do you happen to know exactly what further post-processing is enabled by using an HT 3.0 CPU that provides increased memory bandwidth?

I assume deinterlacing of video-based HD material might be one thing, as that is fairly demanding, but I would think deinterlacing film-based 1080i shouldn't be a problem either way (reviews don't corroborate this, however). And what about PQ with SD DVD - is it the same regardless of CPU?

Finally, I was wondering whether a motherboard with the optional side-port memory would unlock the extra post processing, even with an older HT 2.0 CPU, since it also provides greater memory bandwidth to the GPU as demonstrated by the increase of 3D performance. Has anyone announced boards with this feature?

Thanks for any further info!
crabnebula is offline  
post #395 of 4430 Old 03-04-2008, 11:16 AM
Newbie
 
spin_dizzy's Avatar
 
Join Date: Mar 2008
Posts: 2
Is the Gigabyte board reviewed on Tom's Hardware an engineering sample as well? Look at the closeup of the 780G on this page:
http://www.tomshardware.com/2008/03/04/amd_780g_chipset/page14.html

Is 0744 ENG an engineering chip?
spin_dizzy is offline  
post #396 of 4430 Old 03-04-2008, 11:33 AM - Thread Starter
Advanced Member
 
ChrisMorley's Avatar
 
Join Date: Aug 2006
Location: Austin, TX
Posts: 770
Quote:
Originally Posted by vkristof View Post

OK. Is AMD going to sell the slow-clocked Phenoms in the low-$100 price range then?

It sounds like you're saying that HT3.0 would help with some of the finer points of HD video processing. I'd bite if the Phenoms are in the 100-130 range that somebody mentioned.

For the hundredth time: I wish AMD success, competition is GOOD for the consumer. Me!

I've been trying to get pricing out of all my channels in AMD, but from what I'm hearing it hasn't been handed down to the FAEs yet - so while I HOPE it's in the $100-$130 range, which would ROCK, I have no idea what it really will be.

Also, I do have a Phenom 9100e and it does run cooler and is probably going to be THE chip to have for HTPCs if you are going the AMD platform route.
ChrisMorley is offline  
post #397 of 4430 Old 03-04-2008, 11:34 AM - Thread Starter
Advanced Member
 
ChrisMorley's Avatar
 
Join Date: Aug 2006
Location: Austin, TX
Posts: 770
Quote:
Originally Posted by crabnebula View Post

Chris, do you happen to know exactly what further post-processing is enabled by using an HT 3.0 CPU that provides increased memory bandwidth?

I assume deinterlacing of video-based HD material might be one thing, as that is fairly demanding, but I would think deinterlacing film-based 1080i shouldn't be a problem either way (reviews don't corroborate this, however). And what about PQ with SD DVD - is it the same regardless of CPU?

Finally, I was wondering whether a motherboard with the optional side-port memory would unlock the extra post processing, even with an older HT 2.0 CPU, since it also provides greater memory bandwidth to the GPU as demonstrated by the increase of 3D performance. Has anyone announced boards with this feature?

Thanks for any further info!


I will send an email right now in regards to post-processing. As for side-port performance, I'm pretty sure that's only going to affect gaming performance, but your theory is sound.
ChrisMorley is offline  
post #398 of 4430 Old 03-04-2008, 11:45 AM
Senior Member
 
crabnebula's Avatar
 
Join Date: Jan 2005
Location: Montreal, Canada
Posts: 389
Thanks Chris
crabnebula is offline  
post #399 of 4430 Old 03-04-2008, 12:34 PM
Newbie
 
cgsheen's Avatar
 
Join Date: Feb 2008
Posts: 6
Quote:
Originally Posted by ChrisMorley View Post

Most people test outside of the chassis. This issue is being QUICKLY resolved, however. There are gating issues with older BIOSes that have been identified that lead to higher heat. It's not an issue in most instances, just some high-heat, low-flow HTPC environments. The F3E BIOS fixed this to a degree, and they're about to button it up completely.

I've been following this thread and your comments/problems with heat. I got my Gigabyte 780G from PCClub last Weds - got it installed last Thursday. I haven't had the same experience...

I was nervous about the NB after reading the posts here so I pulled the HS, cleaned off the TIM they used and reset it using AS5 before I installed the MB. I'm using an Athlon 64 X2 5400+ and it runs a little warm with a stock AMD HSF.

I installed AMD Overdrive 2 beta to check the GPU temp and usually get the same "81C" that others have reported. EXCEPT - I have a "point and shoot" laser thermometer (from my HVAC days) and it's extremely accurate. While Overdrive is reporting the GPU core at 81C, the heatsink (with no active cooling) is showing 47-51C or so at its hottest point.

SO - either I really screwed up re-installing the sink, or the software is reporting the temp wrongly... Since the heatsink sits directly on the core, there really shouldn't be that large of a difference to my mind. I know the heatsink will never show as hot as the core actually is, but I wouldn't expect that large a "spread". (Oh, and I DIDN'T screw up the heatsink install...)

Nonetheless - mine hasn't shown any of the overheating signs that you mentioned in your previous posts. It's in a big Silverstone case, and I've just replaced the stock CPU cooler with an HSF that does a better job with the CPU and blows its air over the NB sink also. Buttoned up, this thing is running WAY better than the 690G it replaced.

Sphere: I've noticed that the Vista Desktop @ 1080i is MUCH easier to read and there is MUCH less "interlacing jitter" with the 780G compared to the 8600GT we were using with the 690G MB...
cgsheen is offline  
post #400 of 4430 Old 03-04-2008, 01:03 PM
Member
 
HwyXingFrog's Avatar
 
Join Date: Nov 2007
Posts: 24
Well, it's looking like the Gigabyte 780G is going to be a necessary upgrade from my Gigabyte 690G.

Maybe I will wait a little bit for one with 6 internal SATA ports instead of the eSATA - or not; geez, I hate choosing mobos.

I just want to upgrade to a Blu-ray drive (the LG combo isn't a big deal anymore since HD DVD is now dead), but I knew the 690G would struggle with my BE-2350.
HwyXingFrog is offline  
post #401 of 4430 Old 03-04-2008, 01:07 PM
AVS Special Member
 
vkristof's Avatar
 
Join Date: Dec 2002
Location: Long Island
Posts: 1,116
Quote:
Originally Posted by spin_dizzy View Post

Is the Gigabyte board reviewed on Tom's Hardware an engineering sample as well? Look at the closeup of the 780G on this page:
http://www.tomshardware.com/2008/03/04/amd_780g_chipset/page14.html

Is 0744 ENG an engineering chip?

YES! All the board photos I've seen with the heatsinks removed are stamped ENG. 0744 is 2007, week 44.

Remember that these are VERY short design cycles in a VERY competitive market.
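As a sketch, that YYWW date-stamp reading can be decoded like so (assuming the common two-digit-year, two-digit-week marking; actual vendor conventions can vary):

```python
def decode_date_code(stamp: str) -> tuple[int, int]:
    """Decode a YYWW chip date stamp, e.g. '0744' -> (2007, week 44).

    Assumes the common two-digit-year / two-digit-week convention;
    real vendor markings can differ.
    """
    year = 2000 + int(stamp[:2])
    week = int(stamp[2:])
    if not 1 <= week <= 53:
        raise ValueError(f"implausible week number: {week}")
    return year, week

print(decode_date_code("0744"))  # -> (2007, 44)
```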
vkristof is offline  
post #402 of 4430 Old 03-04-2008, 01:09 PM
Senior Member
 
redtyler1's Avatar
 
Join Date: Jan 2007
Posts: 423
cgsheen:

What BIOS version are you using? Is it possible that this whole overheating thing is a BIOS error: either 1) an incorrect display of the temperature, or 2) the actual cause of the NB getting too hot?

Is the F3E version of the BIOS shipping now?

Hell, I've been waiting for a Biostar or MSI board because I thought Chris et al knew that Gigabyte jacked up their heatsink.

I guess a definitive answer will arise eventually. But, thanks for everyone's input especially Mr. Morley, Java Jack and Bingo 13.
redtyler1 is offline  
post #403 of 4430 Old 03-04-2008, 01:09 PM
AVS Special Member
 
vkristof's Avatar
 
Join Date: Dec 2002
Location: Long Island
Posts: 1,116
Quote:
Originally Posted by HwyXingFrog View Post

Well, it's looking like the Gigabyte 780G is going to be a necessary upgrade from my Gigabyte 690G.

Maybe I will wait a little bit for one with 6 internal SATA ports instead of the eSATA - or not; geez, I hate choosing mobos.

I just want to upgrade to a Blu-ray drive (the LG combo isn't a big deal anymore since HD DVD is now dead), but I knew the 690G would struggle with my BE-2350.

You can always run the eSATA port back into the case via a cable.
vkristof is offline  
post #404 of 4430 Old 03-04-2008, 01:18 PM - Thread Starter
Advanced Member
 
ChrisMorley's Avatar
 
Join Date: Aug 2006
Location: Austin, TX
Posts: 770
Quote:
Originally Posted by cgsheen View Post

I've been following this thread and your comments/problems with heat. I got my Gigabyte 780G from PCClub last Weds - got it installed last Thursday. I haven't had the same experience...

I was nervous about the NB after reading the posts here so I pulled the HS, cleaned off the TIM they used and reset it using AS5 before I installed the MB. I'm using an Athlon 64 X2 5400+ and it runs a little warm with a stock AMD HSF.

I installed AMD Overdrive 2 beta to check the GPU temp and usually get the same "81C" that others have reported. EXCEPT - I have a "point and shoot" laser thermometer (from my HVAC days) and it's extremely accurate. While Overdrive is reporting the GPU core at 81C, the heatsink (with no active cooling) is showing 47-51C or so at its hottest point.

SO - either I really screwed up re-installing the sink, or the software is reporting the temp wrongly... Since the heatsink sits directly on the core, there really shouldn't be that large of a difference to my mind. I know the heatsink will never show as hot as the core actually is, but I wouldn't expect that large a "spread". (Oh, and I DIDN'T screw up the heatsink install...)

Nonetheless - mine hasn't shown any of the overheating signs that you mentioned in your previous posts. It's in a big Silverstone case, and I've just replaced the stock CPU cooler with an HSF that does a better job with the CPU and blows its air over the NB sink also. Buttoned up, this thing is running WAY better than the 690G it replaced.

Sphere: I've noticed that the Vista Desktop @ 1080i is MUCH easier to read and there is MUCH less "interlacing jitter" with the 780G compared to the 8600GT we were using with the 690G MB...

Glad to hear it - it seems like a well-ventilated desktop chassis doesn't have an issue. I've had access to 4 of these boards, and there was definitely an overheating issue, particularly prior to the F3E BIOS. Just make sure you have an HS/F on your CPU that blows air over the NB, keep good ventilation, and you'll have nothing to worry about.
ChrisMorley is offline  
post #405 of 4430 Old 03-04-2008, 01:20 PM - Thread Starter
Advanced Member
 
ChrisMorley's Avatar
 
Join Date: Aug 2006
Location: Austin, TX
Posts: 770
Quote:
Originally Posted by redtyler1 View Post

cgsheen:

What BIOS version are you using? Is it possible that this whole overheating thing is a BIOS error: either 1) an incorrect display of the temperature, or 2) the actual cause of the NB getting too hot?

Is the F3E version of the BIOS shipping now?

Hell, I've been waiting for a Biostar or MSI board because I thought Chris et al knew that Gigabyte jacked up their heatsink.

I guess a definitive answer will arise eventually. But, thanks for everyone's input especially Mr. Morley, Java Jack and Bingo 13.


F3E is public. There is another one in development that further alleviates any overheating issues. TBH, this is the first time I've come across an overheating issue that can be attributed to the BIOS. But it's getting cleared up apparently...
ChrisMorley is offline  
post #406 of 4430 Old 03-04-2008, 02:24 PM
Senior Member
 
kenyee's Avatar
 
Join Date: Jul 2005
Location: Boston, PRofMA
Posts: 258
Does the Gigabyte board support ECC memory? I can't find anything in the specs or manual that mentions it.
Since ECC is already supported by the Athlon chips' memory controller, they just have to include BIOS settings to enable it...
kenyee is offline  
post #407 of 4430 Old 03-04-2008, 02:29 PM
AVS Special Member
 
Java Jack's Avatar
 
Join Date: Mar 2006
Location: Austin TX
Posts: 1,784
Just posted some info on the MSI mobo targeted at the HTPC space. Take a look at the MSI 7411 thread.

Regards.
Java

There are 10 types of people in the world, those that understand binary, and those that don't.

AMD@Home Blog: http://links.amd.com/Home
Twitter: http://twitter.com/Java_Jack
Java Jack is offline  
post #408 of 4430 Old 03-04-2008, 03:29 PM
Newbie
 
cgsheen's Avatar
 
Join Date: Feb 2008
Posts: 6
The board came with F1, I believe. I immediately flashed it to F3b and then found F3e; it's currently running F3e. The temp in Overdrive's Status Monitor never varies more than 4 or 5 C, even when I know I'm pushing the GPU. I'm not certain I have a lot of faith in it. I saw Tom's Hardware used GPU-Z in their review. Does it report GPU temp?
cgsheen is offline  
post #409 of 4430 Old 03-04-2008, 04:16 PM
AVS Club Gold
 
renethx's Avatar
 
Join Date: Jan 2006
Posts: 16,016
Quote:
Originally Posted by SpHeRe31459 View Post

The HQV scores are really hard to compare because of how much the subjective portions of the tests weight the scores. Further, I contend the CPU doesn't actually change the HQV results (at least in the portions of the test that really matter in the real world); it's all on the graphics processor.

Also, I would be hesitant to get any Phenom until the B3 stepping comes out; the 9100e is still B2, and it will be replaced by summer with the 9150e, which is the fixed B3 version.

The point seems to be not the CPU but the HyperTransport bus. As you know, the memory controller is integrated in the CPU, and the graphics core communicates with system memory via the HT bus (thus implementing an IGP is harder on the AMD platform than on the Intel platform, where the memory controller sits in the chipset itself). The higher bandwidth of HT 3.0 helps a lot with 3D performance (communication between the shader processors and memory; a 110% increase, and that's not mere speculation but fact!), and so could help deinterlacing.

If it's the HT bus that is important, the B2 stepping and BIOS update should be fine for these specific purposes.

For your information, here is a 3DMark06 score chart (source: 日経パソコンオンライン [Nikkei PC Online], "AMD 780G follow-up: faster when paired with Phenom").


The graphs are normalized relative to the scores for the Athlon X2 5000+ BE. Processor speed has no effect (there's no difference between the 5000+ at 2.6GHz and the 6000+ at 3.0GHz). The improvement with Phenom is just dramatic.
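To be concrete about what "normalized" means here: each score is just divided by the baseline part's score. The numbers below are hypothetical placeholders for illustration, not the chart's actual data:

```python
def normalize(scores: dict[str, float], baseline: str) -> dict[str, float]:
    """Express each score as a multiple of the baseline part's score."""
    base = scores[baseline]
    return {part: round(s / base, 2) for part, s in scores.items()}

# Hypothetical placeholder scores, NOT the chart's real data:
scores = {
    "Athlon X2 5000+ BE": 1500.0,
    "Athlon X2 6000+": 1500.0,   # same as the 5000+: clock speed has no effect
    "Phenom (HT 3.0)": 3150.0,   # ~2.1x, reflecting the ~110% gain claimed
}
print(normalize(scores, "Athlon X2 5000+ BE"))
```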
renethx is offline  
post #410 of 4430 Old 03-04-2008, 04:20 PM
Newbie
 
markc3's Avatar
 
Join Date: Feb 2008
Location: Washington State
Posts: 13
Quote:
I installed AMD Overdrive 2 beta to check the GPU temp and usually get the same "81C" that others have reported. EXCEPT - I have a "point and shoot" laser thermometer (from my HVAC days) and it's extremely accurate. While Overdrive is reporting the GPU core at 81C, the heatsink (with no active cooling) is showing 47-51C or so at its hottest point.

I upgraded to F3E Sunday night. I'm using a Rogue case with three surprisingly very quiet 120mm fans.
When I look at OD, it holds steady at 80, once in a while going to 79 or 81. I tried SpeedFan with the same results. Also tried CPUID HWMonitor. I've read a couple of posts with the same types of readings that then mention it doesn't feel that hot to the touch...
markc3 is offline  
post #411 of 4430 Old 03-04-2008, 04:57 PM
AVS Special Member
 
vkristof's Avatar
 
Join Date: Dec 2002
Location: Long Island
Posts: 1,116
Quote:
Originally Posted by renethx View Post

The point seems to be not the CPU but the HyperTransport bus. As you know, the memory controller is integrated in the CPU, and the graphics core communicates with system memory via the HT bus (thus implementing an IGP is harder on the AMD platform than on the Intel platform, where the memory controller sits in the chipset itself). The higher bandwidth of HT 3.0 helps a lot with 3D performance (communication between the shader processors and memory; a 110% increase, and that's not mere speculation but fact!), and so could help deinterlacing.

If it's the HT bus that is important, the B2 stepping and BIOS update should be fine for these specific purposes.

For your information, here is a 3DMark06 score chart (source: 日経パソコンオンライン [Nikkei PC Online], "AMD 780G follow-up: faster when paired with Phenom").


The graphs are normalized relative to the scores for the Athlon X2 5000+ BE. Processor speed has no effect (there's no difference between the 5000+ at 2.6GHz and the 6000+ at 3.0GHz).

I haven't bought a CPU for my 780G yet.

What's an "Athlon X2 5000+ BE"?
vkristof is offline  
post #412 of 4430 Old 03-04-2008, 04:59 PM - Thread Starter
Advanced Member
 
ChrisMorley's Avatar
 
Join Date: Aug 2006
Location: Austin, TX
Posts: 770
Athlon X2 5000+ BE is a BLACK EDITION with unlocked multiplier. Good for overclocking.
ChrisMorley is offline  
post #413 of 4430 Old 03-04-2008, 05:05 PM
AVS Special Member
 
vkristof's Avatar
 
Join Date: Dec 2002
Location: Long Island
Posts: 1,116
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Quote:
Originally Posted by ChrisMorley View Post

Athlon X2 5000+ BE is a BLACK EDITION with unlocked multiplier. Good for overclocking.

Gotcha. I thought the BE was some 45W variant I hadn't heard of.
vkristof is offline  
post #414 of 4430 Old 03-04-2008, 05:13 PM
AVS Special Member
 
SpHeRe31459's Avatar
 
Join Date: Nov 2004
Location: Sacramento, CA
Posts: 1,017
Quote:
Originally Posted by renethx View Post

The point seems to be not the CPU but the HyperTransport bus. As you know, the memory controller is integrated in the CPU, and the graphics core communicates with system memory via the HT bus (thus implementing an IGP is harder on the AMD platform than on the Intel platform, where the memory controller sits in the chipset itself). The higher bandwidth of HT 3.0 helps a lot with 3D performance (communication between the shader processors and memory; a 110% increase, and that's not mere speculation but fact!), and so could help deinterlacing.

If it's the HT bus that is important, the B2 stepping and BIOS update should be fine for these specific purposes.

There are two different things going on in my comments that aren't necessarily related, and I think that has you confused about what I've been saying.

1. My comments about the Phenom aren't related to whether it helps 3D graphics performance; clearly it does. And the increased HyperTransport bandwidth may indeed help with deinterlacing. My hesitation isn't about the performance increase; it's about buying a CPU we know to be flawed. Yes, with the TLB patch off in the BIOS it's fine; still, I would be cautious and wait an extra couple of months for the B3 stepping. I would tend to agree that just getting the 9100e is fine. However, consider the time frame: the 9100e is supposed to come out around April, and then sometime in summer AMD will get the B3 steppings out, so you'd basically have given in to impulse and bought the old stepping within a few months of a new one. Not the end of the world, but just a recommendation of caution if you can wait to build your HTPC.

2. Again, I think you're confused about what I'm saying. The 3DMark scores do not necessarily correlate with improved video performance. You'll recall that VC-1 and H.264 decoding are done in dedicated silicon, as is the noise reduction; the designs are borrowed from ATI's Xilleon line of DTV chips. What is done in the shaders is deinterlacing and MPEG2 decode. So yes, those areas may improve, but we don't have any tests of this yet; all we see is it failing to score well in the HD HQV tests (which are compressed in VC-1, so they don't exercise the MPEG2 shader code, only the deinterlacing portion). We need the same test bench with an Athlon 64 X2 and then a Phenom swapped in.

Now, the post-processing is only what the video drivers expose to the video decoder filter. If the drivers don't even expose the ability to choose ATI's advanced deinterlacing, then it will fail regardless of the CPU. I posted earlier on why this could be happening; I'm not saying this is the reality of it, but the tactic has been used before by ATI and NVIDIA: when they felt their low-end parts didn't have the speed for a type of deinterlacing or post-processing, they simply disabled it in the drivers. Hopefully ATI has implemented load-balancing code in their deinterlacing drivers that properly recognizes when you have a beefy CPU that can help bear the brunt of advanced deinterlacing.
SpHeRe31459 is offline  
post #415 of 4430 Old 03-04-2008, 05:31 PM
AVS Club Gold
 
renethx's Avatar
 
Join Date: Jan 2006
Posts: 16,016
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 120 Post(s)
Liked: 341
Quote:
Originally Posted by cgsheen View Post

I've been following this thread and your comments/problems with heat. I got my Gigabyte 780G from PCClub last Weds - got it installed last Thursday. I haven't had the same experience...

I was nervous about the NB after reading the posts here so I pulled the HS, cleaned off the TIM they used and reset it using AS5 before I installed the MB. I'm using an Athlon 64 X2 5400+ and it runs a little warm with a stock AMD HSF.

I installed AMD Overdrive 2 beta to check the GPU temp and usually get the same "81C" that others have reported. EXCEPT - I have a "point and shoot" laser thermometer (from my HVAC days) and it's extremely accurate. While Overdrive is reporting the GPU core at 81C, the heatsink (with no active cooling) is showing 47-51C or so at its hottest point.

SO - either I really screwed up re-installing the sink, or the software is reporting the temp wrongly... Since the heatsink sits directly on the core, there really shouldn't be that large of a difference to my mind. I know the heatsink will never show as hot as the core actually is, but I wouldn't expect that large a "spread". (Oh, and I DIDN'T screw up the heatsink install...)

Nonetheless - mine hasn't shown any of the overheating signs that you mentioned in your previous posts. It's in a big Silverstone case, and I've just replaced the stock CPU cooler with an HSF that does a better job with the CPU and blows its air over the NB sink also. Buttoned up, this thing is running WAY better than the 690G it replaced.

Sphere: I've noticed that the Vista Desktop @ 1080i is MUCH easier to read and there is MUCH less "interlacing jitter" with the 780G compared to the 8600GT we were using with the 690G MB...

Thanks for the report. Unfortunately, it's very incomplete. What task was the NB doing when you measured the temperature? Measuring the temp in three cases would be good:

- idle
- video playback, e.g. playing a BD movie
- gaming (3DMark06 is enough)

Which CPU cooler are you using? Many users prefer the Scythe Ninja Mini (side-flow) to a top-flow cooler because they usually (?) use the Antec NSK2480. Maybe reporting the temp with the stock cooler would be helpful for these users.
renethx is offline  
post #416 of 4430 Old 03-04-2008, 05:42 PM
AVS Special Member
 
AbMagFab's Avatar
 
Join Date: Feb 2001
Posts: 4,611
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
So when is the 4850e going to be available? Or should I just get the Athlon 64 X2 5000+ (non-BE; I don't care about OC'ing my HTPC, I don't think)?

So: the Gigabyte MB, an Athlon 64 X2 5000+ CPU, and a Radeon 3450 for some extra GPU power? Is that the right setup for an HTPC with this MB?

TiVo is on its way out - stream everything!
AbMagFab is offline  
post #417 of 4430 Old 03-04-2008, 05:45 PM - Thread Starter
Advanced Member
 
ChrisMorley's Avatar
 
Join Date: Aug 2006
Location: Austin, TX
Posts: 770
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
Quote:
Originally Posted by vkristof View Post

Gotcha. I thought the BE was some 45W variant I hadn't heard of.


Close - those have a lowercase 'e' designation at the end, like the 4850e, which is a 2.5GHz dual core rated at 45W, or the Phenom 9100e, which is a 1.8GHz quad core at 65W.
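For what it's worth, the suffix convention above can be sketched as a tiny helper; the labels are my own informal shorthand, not official AMD strings:

```python
def classify_suffix(model: str) -> str:
    """Rough sketch of AMD's model-suffix convention circa 2008.

    'BE' = Black Edition (unlocked multiplier);
    trailing lowercase 'e' = energy-efficient (lower-TDP) part.
    Labels are informal shorthand for illustration only.
    """
    if model.endswith("BE"):
        return "Black Edition: unlocked multiplier"
    if model.endswith("e"):
        return "energy-efficient: lower TDP"
    return "standard part"

for m in ("Athlon X2 5000+ BE", "Athlon X2 4850e", "Phenom 9100e", "Athlon X2 5000+"):
    print(m, "->", classify_suffix(m))
```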
ChrisMorley is offline  
post #418 of 4430 Old 03-04-2008, 05:47 PM - Thread Starter
Advanced Member
 
ChrisMorley's Avatar
 
Join Date: Aug 2006
Location: Austin, TX
Posts: 770
Quote:
Originally Posted by AbMagFab View Post

So when is the 4850e going to be available? Or should I just get the Athlon 64 X2 5000+ (non-BE; I don't care about OC'ing my HTPC, I don't think)?

It launched today and will be available en masse in the next few weeks. If you have plenty of airflow, the 5000+ is fine, but it doesn't come with an HS/F. The 4850e does, and it's $10 cheaper.
ChrisMorley is offline  
post #419 of 4430 Old 03-04-2008, 05:49 PM
Senior Member
 
crabnebula's Avatar
 
Join Date: Jan 2005
Location: Montreal, Canada
Posts: 389
Quote:
Originally Posted by SpHeRe31459 View Post

2. Again, I think you're confused about what I'm saying. The 3DMark scores do not necessarily correlate with improved video performance. You'll recall that VC-1 and H.264 decoding are done in dedicated silicon, as is the noise reduction; the designs are borrowed from ATI's Xilleon line of DTV chips. What is done in the shaders is deinterlacing and MPEG2 decode. So yes, those areas may improve, but we don't have any tests of this yet; all we see is it failing to score well in the HD HQV tests (which are compressed in VC-1, so they don't exercise the MPEG2 shader code, only the deinterlacing portion). We need the same test bench with an Athlon 64 X2 and then a Phenom swapped in.

Now, the post-processing is only what the video drivers expose to the video decoder filter. If the drivers don't even expose the ability to choose ATI's advanced deinterlacing, then it will fail regardless of the CPU. I posted earlier on why this could be happening; I'm not saying this is the reality of it, but the tactic has been used before by ATI and NVIDIA: when they felt their low-end parts didn't have the speed for a type of deinterlacing or post-processing, they simply disabled it in the drivers. Hopefully ATI has implemented load-balancing code in their deinterlacing drivers that properly recognizes when you have a beefy CPU that can help bear the brunt of advanced deinterlacing.

I think what Chris Morley has been suggesting is that the drivers would indeed expose further post-processing when using an HT 3.0 CPU; however, my understanding is that it's the extra memory bandwidth that makes this possible, not reliance on extra CPU processing. Does that make sense? In other words, would motion-adaptive deinterlacing be constrained by memory bandwidth or by the actual computational power of the GPU shaders in this case?

I don't have any recent AMD/ATI hardware myself, but I thought registry hacks had been found to enable the post-processing that is disabled on low-end parts like the 2400 Pro (though I have no idea what the results are). Assuming similar hacks work for the 780G and that the post-processing is in part GPU-constrained, then overclocking the IGP might also provide working deinterlacing...

I really wish we had a proper review comparing an Athlon with a Phenom, with and without overclocking the IGP. Unfortunately, even if we did, it might be a little early to assume that the drivers won't change how this all works over the next few releases.
crabnebula is offline  
post #420 of 4430 Old 03-04-2008, 05:54 PM
AVS Special Member
 
AbMagFab's Avatar
 
Join Date: Feb 2001
Posts: 4,611
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 0 Post(s)
Liked: 10
So does this 780G MB support DD Live/DTS Connect so I can get TrueHD 5.1 soundtracks to play over the HDMI connection? And is the DD Live/DTS Connect output any good on this board? I haven't seen any reviews yet about audio over HDMI on the 780G.

TiVo is on its way out - stream everything!
AbMagFab is offline  