Intel HD 3000 real use PQ same/close to AMD & NVidia? - AVS Forum
post #1 of 67 Old 05-23-2012, 08:44 PM - Thread Starter
Livin (Advanced Member)
I promised my current HTPC to a friend, so I'm looking at buying a laptop (prices are rock bottom right now) or building an mITX box to replace it.

I currently run an AMD HD6450, which has been fine on my 65" DLP. I use Vector Adaptive deinterlacing and turn off all other features for most content. I play mainly 720p, 1080i, and 1080p content. Sometimes I use 480p and upconvert to 1080p.

So I'm curious... does the PQ of the Intel HD3000 rival that of an HTPC-class AMD or Nvidia GFX card, like the HD6450 or GT430/520?

I'm not talking about pixel peeping, but real-world viewing under various lighting conditions based on time of day, materials, etc.

My DLP is calibrated/tweaked (using service menus, etc) and I sit the normal 8-10' away. Also, my next set will likely be 70 or 80" at 10-12' viewing distance.

So the questions remain...

1) Will using an Intel HD3000 show any degradation of PQ, colors, motion, etc?

2) Will there be any issues specific to the DLP using HD3000?

3) anything else to consider?


thx for the help!

________
Ltek

my setup: XBMC, Windows Media Center, Z-Wave/Insteon automation, Paradigm-Parasound-Onkyo-Velodyne Home Theater, 110" DIY Screen & BenQ W1070
 
post #2 of 67 Old 05-25-2012, 05:26 PM - Thread Starter
Livin (Advanced Member)
anyone?

post #3 of 67 Old 05-25-2012, 05:49 PM
assassin (AVS Addicted Member)
Q: Intel HD 3000 real use PQ same/close to AMD & NVidia?

A: Yes


post #4 of 67 Old 05-26-2012, 08:10 AM
C17chief (Senior Member)
Between a G620 using the on-chip graphics and an AMD E-350 APU setup I have, both look exactly the same as far as color, saturation, etc. go (Intel HD graphics is much better at things like de-interlacing, so I can't compare my two machines in those respects), post calibrating with a meter. Default to default, before putting a meter to use, I want to say the picture from the Intel setup looks better than the AMD, but since they are about the same post-calibration, it's probably just a default-settings sort of thing rather than any hardware reproduction difference.
post #5 of 67 Old 05-26-2012, 08:45 AM - Thread Starter
Livin (Advanced Member)
Quote:
Originally Posted by C17chief View Post

Between a G620 using the on-chip graphics and an AMD E-350 APU setup I have, both look exactly the same as far as color, saturation, etc. go (Intel HD graphics is much better at things like de-interlacing, so I can't compare my two machines in those respects), post calibrating with a meter. Default to default, before putting a meter to use, I want to say the picture from the Intel setup looks better than the AMD, but since they are about the same post-calibration, it's probably just a default-settings sort of thing rather than any hardware reproduction difference.

You are talking about a lower class of GPU than the examples I gave... and than what I'm comparing.

I'm comparing the HD3000 to the AMD A-series APUs (6620G or higher) and the Nvidia 430/520.

I've seen reports saying the Intel HD3000 cannot render HD material nearly as well. And since I'm using a large 65" screen today, and an 80" will be my next step, I'm thinking I'll stick with AMD or Nvidia.

post #6 of 67 Old 05-26-2012, 09:02 AM
assassin (AVS Addicted Member)
Quote:
Originally Posted by Livin View Post

You are talking about a lower class of GPU than the examples I gave... and than what I'm comparing.

I'm comparing the HD3000 to the AMD A-series APUs (6620G or higher) and the Nvidia 430/520.

I've seen reports saying the Intel HD3000 cannot render HD material nearly as well. And since I'm using a large 65" screen today, and an 80" will be my next step, I'm thinking I'll stick with AMD or Nvidia.

Why did you even ask then?

I have used the HD2000 on 100" and 112" screens. 1080p/720p looks exactly the same on Intel, NVidia, and ATI.

Edit: Where did you see these "reports"? The XBMC forums, by chance? They are decidedly anti-Intel, and I have seen numerous things posted there that are just flat-out wrong (often by people who admittedly haven't even used the current generation of Intel iGPUs).


post #7 of 67 Old 05-26-2012, 09:41 AM - Thread Starter
Livin (Advanced Member)
Quote:
Originally Posted by assassin View Post

Why did you even ask then?

I have used the HD2000 on 100" and 112" screens. 1080p/720p looks exactly the same on Intel, NVidia, and ATI.

Edit: Where did you see these "reports"? The XBMC forums, by chance? They are decidedly anti-Intel, and I have seen numerous things posted there that are just flat-out wrong (often by people who admittedly haven't even used the current generation of Intel iGPUs).

I asked because I was hoping to get some real-world feedback... not just a few professional reviewers who may or may not use an HTPC daily. The reviews I'm talking about were not from XBMC forums, or any forums: articles from AnandTech and others who do reviews regularly.

Your feedback in this post was the type I'm looking for... Have you used any post-processing to clean up the videos? AMD has a ton of post-processing features. I don't use any of them except Vector Adaptive motion, since my HD6450 cannot do any of the others at HD res. But I have noticed Vector Adaptive is MUCH better than other forms of motion compensation.

post #8 of 67 Old 05-26-2012, 09:48 AM
assassin (AVS Addicted Member)
Quote:
Originally Posted by Livin View Post

I asked because I was hoping to get some real-world feedback... not just a few professional reviewers who may or may not use an HTPC daily. The reviews I'm talking about were not from XBMC forums, or any forums: articles from AnandTech and others who do reviews regularly.

Your feedback in this post was the type I'm looking for... Have you used any post-processing to clean up the videos? AMD has a ton of post-processing features. I don't use any of them except Vector Adaptive motion, since my HD6450 cannot do any of the others at HD res. But I have noticed Vector Adaptive is MUCH better than other forms of motion compensation.

Those same reviewers use benchmarking to try to show picture quality, which is not "real world".

There are many well respected people at AVS (including but not limited to myself, fitbrit, GreenEyez, idividebyzero, Nevcairiel (developer of LAV), olyteddy, pokekevin, Shark007 (developer of Shark007 codec pack), Somewhatlost, steelman1991, SUBCOB, Tony_Montana, Tulli, vladd, whiteboy714, xfett, etc.) who have used all three and not seen any difference for 1080p.

By default ATI has all their post-processing crap turned on, while Intel and NVidia have theirs off.

The best advice is to try the Intel iGPU first and if you aren't happy for whatever reason add an ATI or NVidia card later. There is very little downside to this approach.


post #9 of 67 Old 05-26-2012, 12:33 PM
ctviggen (AVS Special Member)
Has anyone compared computer video processing with processing by an external video processor? I have a Dell video processor that I'd like to put between my HT computer (with Intel HD 3000 graphics on an i3) and my display, but I don't want to take the time to figure out how to do it unless it's worth it.

Bob
post #10 of 67 Old 05-26-2012, 03:26 PM - Thread Starter
Livin (Advanced Member)
Quote:
Originally Posted by assassin View Post

Those same reviewers use benchmarking to try to show picture quality, which is not "real world".

There are many well respected people at AVS (including but not limited to myself, fitbrit, GreenEyez, idividebyzero, Nevcairiel (developer of LAV), olyteddy, pokekevin, Shark007 (developer of Shark007 codec pack), Somewhatlost, steelman1991, SUBCOB, Tony_Montana, Tulli, vladd, whiteboy714, xfett, etc.) who have used all three and not seen any difference for 1080p.

By default ATI has all their post-processing crap turned on, while Intel and NVidia have theirs off.

The best advice is to try the Intel iGPU first and if you aren't happy for whatever reason add an ATI or NVidia card later. There is very little downside to this approach.

For 1080p I suspect there will be very little, if any, noticeable PQ difference, since no processing will occur on it... but I'm curious about 720p and 1080i when upscaling to 1080p, which is what I do, since 95% of the content I watch is 720p (ripped material) or 1080i (OTA material), played on a 65" 1080p DLP display which shows any and all video PQ defects, much more so than any LCD I've seen.

I turn all post processing off except Vector Adaptive motion, since it makes a huge difference in PQ for movement. All the reviewers I've seen have pointed this out as a large AMD advantage. Though I have not seen Intel or NVidia myself.

Since I'm looking to replace my current HTPC with a laptop or mITX (for space/size reasons), adding an AMD or NVidia card later is not an option.

I'll see if I can borrow an Intel system from someone and try it for a bit.

post #11 of 67 Old 05-26-2012, 03:33 PM
assassin (AVS Addicted Member)
There are plenty of low-profile graphics card options for mini-ITX or mATX, so they are an option.

Don't use a laptop.

Again, it seems like your mind is already made up regarding what a certain product will or won't do.


post #12 of 67 Old 05-26-2012, 03:54 PM
ricabullah (AVS Special Member)
I cannot say AMD, nVidia, and Intel give the same PQ.
nVidia gives the most natural colors, and Intel comes very close to nVidia, while AMD/ATI does not; its PQ has always looked to me like a watercolor picture.

Chronical Tester
post #13 of 67 Old 06-11-2012, 05:10 PM - Thread Starter
Livin (Advanced Member)
Quote:
Originally Posted by assassin View Post

There are plenty of low-profile graphics card options for mini-ITX or mATX, so they are an option.


Don't use a laptop.


Again, it seems like your mind is already made up regarding what a certain product will or won't do.

Quote:
Originally Posted by ricabullah View Post

I cannot say AMD, nVidia, and Intel give the same PQ.

nVidia gives the most natural colors, and Intel comes very close to nVidia, while AMD/ATI does not; its PQ has always looked to me like a watercolor picture.


Well, a laptop fits best in the space I have for it... and laptop prices are better than building an mITX (plus I get a built-in screen and kb/mouse when I need them).

I just moved from a system with an AMD HD6450 to a gen1 i7 laptop with an NVidia 880M. After using it for 2 weeks, I'm not sure NVidia's colors are any more natural than AMD's. Flesh tones actually seem less natural, more saturated. (I did not touch color settings with either card.)
I do notice more macroblocking with the NVidia in darker scenes.

The AMD drivers seem to offer more ability to 'tweak' via post processing (type of deinterlacing, denoise, deblock, etc.)... Does NVidia have this and I'm just missing it?

thx

post #14 of 67 Old 06-12-2012, 07:47 PM
x0lliex (Member)
I haven't found any difference between an HD2000 and an AMD HD6570, with one exception: de-interlacing. I found the Intel HD2000's de-interlacing to be poor on 1080i material. I have had no such problems with the AMD HD6570. Obviously my Intel GPU was the 2000, so maybe the 3000 has improved de-interlacing performance; just something to think about.
post #15 of 67 Old 06-12-2012, 08:29 PM - Thread Starter
Livin (Advanced Member)
Quote:
Originally Posted by x0lliex View Post

I haven't found any difference between an HD2000 and an AMD HD6570, with one exception: de-interlacing. I found the Intel HD2000's de-interlacing to be poor on 1080i material. I have had no such problems with the AMD HD6570. Obviously my Intel GPU was the 2000, so maybe the 3000 has improved de-interlacing performance; just something to think about.

I don't know what type/encoding/res/bitrate/etc. you are comparing, but I'd be very surprised if the HD2000 or even HD3000 could come close to the post-processing capabilities of the AMD HD6570.

Look at the in-depth review...
http://www.anandtech.com/show/4380/discrete-htpc-gpus-shootout/1
... unfortunately it does not compare the Intel HD2000/3000, but as you can see, even the discrete GT520 and HD6450, which both easily beat the Intel HD3000 in horsepower, cannot come close to the HD6570.

I'm still looking for ways to change the NVidia settings like I can with AMD.

EDIT: I did find a recent review of the HD3000 & HD4000 running the same tests...

http://www.anandtech.com/show/5773/intels-ivy-bridge-an-htpc-perspective/3

The HD6570 still easily bests the HD3000 and looks similar to the HD4000... but, of course, you can only get an HD4000 with a VERY expensive Ivy Bridge CPU.

post #16 of 67 Old 06-13-2012, 01:32 PM
nxsfan (Member)
This thread has raised some questions for me. I was going to post my subjective experiences with nvidia versus intel, but the answers to these questions may be more useful. Can anyone help?

1) Do people generally use the post-processing features of the Intel/AMD/NVidia drivers? (I deliberately leave them off; I want to preserve the original 1080p material. If you are dealing with poor quality material, maybe it makes sense.)

2) When talking about picture quality, does that make sense only in the context of using the native Intel/AMD/NVidia hardware-accelerated/optimized decoder implementations? Or is it just affected by post processing?

3) In that case, do most people use the native decoders?

4) What about when using 3rd-party decoders and renderers (LAV, madVR)? I use the LAV decoder + madVR renderer and I can't tell the difference between my GTX460 and my HD2K HTPCs.

5) To rephrase: if I'm using the same LAV decoder and madVR versions, using QuickSync and CUDA acceleration on the HD2K and GTX460 respectively, won't the resultant output be identical for all intents and purposes?

6) OK, finally: if 3rd-party decoders and renderers provide excellent performance and probably the best output picture quality, doesn't that make this thread redundant? Or, more likely, there is something fundamental about the decode/render process that I have failed to appreciate.

A quote from the AnandTech link provided above.

"Blu-rays are usually mastered very carefully. Any video post processing (other than deinterlacing) which needs to be done is handled before burning it in. In this context, we don't think it is a great idea to run the HQV benchmark videos off the disc. Instead, we play the streams after copying them over to the hard disk."

Is it just me, or does this make no sense whatsoever? (It's probably just me.)
post #17 of 67 Old 06-13-2012, 02:16 PM
Shark007 (AVS Special Member)
Quote:
Originally Posted by nxsfan View Post

6) ....doesn't that make this thread redundant?

quality entertainment is not redundant. (I refer to the thread and not anything htpc related)

Use [link] and retain your sanity.
post #18 of 67 Old 06-13-2012, 02:33 PM
Sammy2 (AVS Special Member)
Quote:
Originally Posted by assassin View Post


Those same reviewers use benchmarking to try to show picture quality, which is not "real world".

There are many well respected people at AVS (including but not limited to myself, fitbrit, GreenEyez, idividebyzero, Nevcairiel (developer of LAV), olyteddy, pokekevin, Shark007 (developer of Shark007 codec pack), Somewhatlost, steelman1991, SUBCOB, Tony_Montana, Tulli, vladd, whiteboy714, xfett, etc.) who have used all three and not seen any difference for 1080p.

By default ATI has all their post-processing crap turned on, while Intel and NVidia have theirs off.

The best advice is to try the Intel iGPU first and if you aren't happy for whatever reason add an ATI or NVidia card later. There is very little downside to this approach.

I don't get no respect.

post #19 of 67 Old 06-13-2012, 03:15 PM
jakmal (AVS Special Member)
Quote:
Originally Posted by nxsfan View Post

This thread has raised some questions for me. I was going to post my subjective experiences with nvidia versus intel, but the answers to these questions may be more useful. Can anyone help?
1) Do people generally use the post-processing features of the Intel/AMD/NVidia drivers? (I deliberately leave them off; I want to preserve the original 1080p material. If you are dealing with poor quality material, maybe it makes sense.)

By default, NVIDIA leaves everything off. So, that is an out-of-the-box solution for users who turn off post processing features.
Quote:
2) When talking about picture quality, does that make sense only in the context of using the native Intel/AMD/NVidia hardware-accelerated/optimized decoder implementations? Or is it just affected by post processing?

Decoding is the same on all GPUs (for a given supported profile). Post processing is what is done after the decoding process to make the video appear 'better' or to make it compliant with the output device. For example, most Blu-rays are encoded in 4:2:0 format. However, HDMI doesn't support 4:2:0 (it needs 4:2:2 or 4:4:4). Therefore, chroma scaling is something that can't be avoided here. This is one type of 'post processing'.
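To make that chroma scaling step concrete, here is a minimal numpy sketch of the crudest possible 4:2:0 to 4:4:4 upsampler (plain sample replication). The array sizes are illustrative, and the filters real GPUs use for this step are proprietary and generally better than replication:

Code:
import numpy as np

def upsample_420_to_444(y, cb, cr):
    # y is (H, W); cb and cr are (H/2, W/2), as stored on a Blu-ray.
    # Nearest-neighbor replication is the simplest chroma scaler;
    # GPU post processing typically uses bilinear or better filters.
    cb_full = np.repeat(np.repeat(cb, 2, axis=0), 2, axis=1)
    cr_full = np.repeat(np.repeat(cr, 2, axis=0), 2, axis=1)
    return np.stack([y, cb_full, cr_full], axis=-1)

# toy 4x4 luma plane with 2x2 subsampled chroma
y = np.arange(16, dtype=np.uint8).reshape(4, 4)
cb = np.full((2, 2), 128, dtype=np.uint8)
cr = np.full((2, 2), 128, dtype=np.uint8)
print(upsample_420_to_444(y, cb, cr).shape)  # -> (4, 4, 3)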
Quote:
3) In that case, do most people use the native decoders?

Choice of decoder basically depends only on whether it can connect to the appropriate renderer filter in your desired graph and whether it supports the profile of the video file you are trying to play back.
Quote:
4) What about when using 3rd-party decoders and renderers (LAV, madVR)? I use the LAV decoder + madVR renderer and I can't tell the difference between my GTX460 and my HD2K HTPCs.

madVR + HD2000 may not be reliable in terms of avoiding dropped frames for certain types of content (720p60, for example). As long as the hardware is powerful enough for the file being played back and the configuration of the decoders/renderers is the same, you should not find any difference between the GTX460 and HD2000, unless the driver version has issues with black level or color space, or some other driver-dependent post processing gets used in the chain.
Quote:
5) To rephrase: if I'm using the same LAV decoder and madVR versions, using QuickSync and CUDA acceleration on the HD2K and GTX460 respectively, won't the resultant output be identical for all intents and purposes?

Yes, as long as the drivers on either the Intel or NVIDIA side are not broken with respect to some post-processing step in the chain that uses driver API calls. IIRC, Mathias indicated that some driver post processing could be inadvertently enabled when choosing hardware deinterlacing in recent versions of madVR unless it is explicitly turned off.
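For the decode step itself, the "identical output" claim can be verified rather than argued: hash every decoded frame from two decode paths and compare. A sketch assuming ffmpeg is installed, using its framemd5 muxer; the hwaccel names and the file name are placeholders:

Code:
import subprocess

def frame_md5s(path, hwaccel=None):
    # Per-frame MD5 digests of the decoded video via ffmpeg's
    # framemd5 muxer. Identical digests mean the decoders are
    # bit-identical, so any visible PQ difference must come later
    # in the chain (renderer, driver post processing, display).
    cmd = ["ffmpeg", "-v", "error"]
    if hwaccel:  # e.g. "dxva2", "cuda", "qsv"
        cmd += ["-hwaccel", hwaccel]
    cmd += ["-i", path, "-an", "-f", "framemd5", "-"]
    out = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return [line.split()[-1] for line in out.stdout.splitlines()
            if line and not line.startswith("#")]

# hypothetical comparison: software decode vs. DXVA2 hardware decode
sw = frame_md5s("clip.mkv")
hw = frame_md5s("clip.mkv", hwaccel="dxva2")
print("bit-identical" if sw == hw else "outputs differ")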
Quote:
"Blu-rays are usually mastered very carefully. Any video post processing (other than deinterlacing) which needs to be done is handled before burning it in. In this context, we don't think it is a great idea to run the HQV benchmark videos off the disc. Instead, we play the streams after copying them over to the hard disk."
Is it just me, or does this make no sense whatsoever? (It's probably just me.)

This note was meant for PQ-testing aspects such as skin tone correction. Also, as a general note, it is content such as TV broadcasts and user-recorded camcorder content that needs deinterlacing / cadence detection. Blu-rays with interlaced content are few in number. Further, film transfers on most (if not all) Blu-rays are done without the telecining process, so cadence detection is actually not needed.

Ganesh T S
Sr. Editor, AnandTech
post #20 of 67 Old 06-13-2012, 03:29 PM
jakmal (AVS Special Member)
Quote:
Originally Posted by Livin View Post

EDIT: I did find a recent review of the HD3000 & HD4000 running the same tests...
http://www.anandtech.com/show/5773/intels-ivy-bridge-an-htpc-perspective/3
The HD6570 still easily bests the HD3000 and looks similar to the HD4000... but, of course, you can only get an HD4000 with a VERY expensive Ivy Bridge CPU.

I also had this quote in the above piece:

HQV scores need to be taken with a grain of salt. In particular, one must check the tests where the GPU lost points. If those tests don't reflect the reader's usage scenario, the handicap can probably be ignored. So it is essential that the scores for each test be compared, rather than just the total value.
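As a trivial sketch of that reading method (the test names and scores below are made up for illustration, not actual HQV results):

Code:
# Hypothetical per-test scores; the point is to total only the
# tests that match your own content, not the whole suite.
scores = {
    "3:2 cadence detection": {"HD3000": 5, "HD6570": 5},
    "2:2 cadence detection": {"HD3000": 0, "HD6570": 5},
    "noise reduction":       {"HD3000": 3, "HD6570": 5},
    "1080i deinterlacing":   {"HD3000": 4, "HD6570": 5},
}

# e.g. a viewer with only 1080p film rips never exercises 2:2
# cadence or 1080i deinterlacing, so those tests are dropped
relevant = ["3:2 cadence detection", "noise reduction"]
for gpu in ("HD3000", "HD6570"):
    print(gpu, sum(scores[t][gpu] for t in relevant))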

Ganesh T S
Sr. Editor, AnandTech
post #21 of 67 Old 06-13-2012, 04:36 PM
assassin (AVS Addicted Member)
Agreed.

Those scores are pretty meaningless.

I said this when it showed the Llano to be "better" and I will say it again now that it shows Intel to be "better".


post #22 of 67 Old 06-13-2012, 04:47 PM
jeffkro (AVS Special Member)
Trinity should have the graphics power of these low-end graphics cards but in an integrated setup; it might pay to wait.
post #23 of 67 Old 06-13-2012, 04:52 PM - Thread Starter
Livin (Advanced Member)
In the same review I linked to, GPUs are compared in one section based only on deinterlacing, and then with denoise, etc. (on otherwise unadulterated images), and there are clear differences in what each GPU is able to render and with what accuracy.

For those of you that only use 1080p material, it likely does not matter what GPU you use as long as it can handle the base processing... BUT if you use other material like 1080i, 720p, 480p, etc., AND have a large screen where you'll notice 'PQ inconsistencies', AND want the cleanest scaling, deinterlacing, denoise, deblocking, etc., then the GPU matters... a lot. I'm in all of these categories.
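(To make "scaling" concrete: a minimal sketch of the 720p-to-1080p upscale contrasting two generic scalers, using scipy's zoom as a stand-in. The actual algorithms GPU drivers use for this step are proprietary; this only shows that the scaler choice changes the pixels you see.)

Code:
import numpy as np
from scipy.ndimage import zoom

# toy luma gradient standing in for a 720p frame
frame_720 = np.linspace(0, 255, 1280 * 720).reshape(720, 1280)

scale = 1080 / 720  # the 1.5x upscale discussed in this thread
nearest = zoom(frame_720, scale, order=0)   # blocky, cheapest
bilinear = zoom(frame_720, scale, order=1)  # a basic GPU-style scaler

print(nearest.shape, bilinear.shape)        # (1080, 1920) for both
print(np.abs(nearest - bilinear).mean())    # nonzero: scalers disagree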

As I mentioned before... I moved from an AMD HD6450 (low-end card) to an Nvidia Quadro 880M and immediately noticed much more macroblocking. I've been trying to tweak the NVidia settings as best I can, but there seems to be FAR LESS ability to do so than with AMD. Thus, PQ is noticeably lower. AND this comes from the fact that I did no post processing with the HD6450 other than vector adaptive deinterlacing... so I must conclude the AMD is much better at upscaling and deinterlacing by default.

I was hoping to hear from other videophiles that might have experience with different GPUs, but it seems maybe there are not many out there that have done evaluations.

post #24 of 67 Old 06-13-2012, 05:12 PM
assassin (AVS Addicted Member)
Quote:
Originally Posted by jeffkro View Post

Trinity should have the graphics power of these low-end graphics cards but in an integrated setup; it might pay to wait.

Where have I heard that before?


post #25 of 67 Old 06-13-2012, 05:14 PM
assassin (AVS Addicted Member)
Quote:
Originally Posted by Livin View Post

In the same review I linked to, GPUs are compared in one section based only on deinterlacing, and then with denoise, etc. (on otherwise unadulterated images), and there are clear differences in what each GPU is able to render and with what accuracy.
For those of you that only use 1080p material, it likely does not matter what GPU you use as long as it can handle the base processing... BUT if you use other material like 1080i, 720p, 480p, etc., AND have a large screen where you'll notice 'PQ inconsistencies', AND want the cleanest scaling, deinterlacing, denoise, deblocking, etc., then the GPU matters... a lot. I'm in all of these categories.
As I mentioned before... I moved from an AMD HD6450 (low-end card) to an Nvidia Quadro 880M and immediately noticed much more macroblocking. I've been trying to tweak the NVidia settings as best I can, but there seems to be FAR LESS ability to do so than with AMD. Thus, PQ is noticeably lower. AND this comes from the fact that I did no post processing with the HD6450 other than vector adaptive deinterlacing... so I must conclude the AMD is much better at upscaling and deinterlacing by default.
I was hoping to hear from other videophiles that might have experience with different GPUs, but it seems maybe there are not many out there that have done evaluations.

There are videophiles on AVS that have tried all three.

They don't share your experience or summary.


post #26 of 67 Old 06-13-2012, 05:17 PM
jakmal (AVS Special Member)
Quote:
Originally Posted by Livin View Post

In the same review I linked to, GPUs are compared in one section based only on deinterlacing, and then with denoise, etc. (on otherwise unadulterated images), and there are clear differences in what each GPU is able to render and with what accuracy.

Denoising is another post-processing step that, IMHO, is not really necessary for Blu-ray videos and even most broadcast content. It might be necessary only for camcorder content (and even in that case, most camcorders do plenty of denoising before actually encoding the video). The HQV test clip is an artificial one, just to show the denoising algorithms in action.

Deinterlacing in current-day GPUs: different GPU vendors use different algorithms. While AMD's vector adaptive deinterlacing used to be very good, it appears that other vendors have caught up. For non-artificial test material, as long as the cadence is detected properly, it is almost impossible to tell the difference between the deinterlaced outputs of the various GPUs. Please look at MissingRemote's reviews, where Andrew uses a football test clip. Recently, at AnandTech, we have started using some 480i TV broadcast content with particularly nasty ticker combing artifacts. Using that clip and madVR deinterlacing (which uses the HW deinterlacer in the GPU), I can't visually find any difference when playing it back on GPUs from any of the three vendors.
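For readers unfamiliar with the terminology, here is a stripped-down numpy sketch of the basic deinterlacing families (bob, weave, and a per-pixel motion-adaptive blend). Vector adaptive deinterlacers go further and follow motion vectors; none of the vendors' actual algorithms are public, so this conveys only the idea:

Code:
import numpy as np

def bob(field):
    # line-double one field: no combing on motion,
    # but only half the vertical resolution
    return np.repeat(field, 2, axis=0)

def weave(top, bottom):
    # interleave both fields: full resolution where the image is
    # static, combing artifacts wherever there is motion
    out = np.empty((2 * top.shape[0], top.shape[1]), dtype=top.dtype)
    out[0::2], out[1::2] = top, bottom
    return out

def motion_adaptive(top, bottom, prev_top, thresh=8):
    # weave static areas, bob moving ones
    moving = np.abs(top.astype(int) - prev_top.astype(int)) > thresh
    moving = np.repeat(moving, 2, axis=0)
    return np.where(moving, bob(top), weave(top, bottom))

# toy 1080i fields (540 lines each)
f = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
print(motion_adaptive(f, f, f).shape)  # -> (1080, 1920)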

Also, in the pictures in the discrete HTPC GPU shootout piece, note that different GPU vendors have different default contrast enhancement settings / output levels / RGB or YCbCr output. To discuss things on an equal footing for the average consumer, we left everything at default. It is possible that a reader might like the default config of one GPU vendor better, but that doesn't make that output right for everyone.
Quote:
For those of you that only use 1080p material, it likely does not matter what GPU you use as long as it can handle the base processing... BUT if you use other material like 1080i, 720p, 480p, etc., AND have a large screen where you'll notice 'PQ inconsistencies', AND want the cleanest scaling, deinterlacing, denoise, deblocking, etc., then the GPU matters... a lot. I'm in all of these categories.

For H.264, deblocking is not optional in the decoder path. Hence, you should be able to get the same output from all the GPU decoders. If you use madVR for scaling, the output will again be the same. EVR / EVR-CP use driver APIs for scaling, so the algorithm used may not be the same. Unfortunately, I don't think you can choose what algorithm gets used in that step for any of the GPUs. I already covered the denoising aspect in the first part of this post.
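As a rough picture of what in-loop deblocking does, here is a toy edge smoother in that spirit. The real H.264 filter is normative (every compliant decoder must produce the same result) and adapts per-edge strength from coding modes and the quantizer; this sketch only conveys the idea of filtering small steps at block boundaries:

Code:
import numpy as np

def toy_deblock(img, block=4, alpha=10):
    # average across each vertical block boundary, but only where
    # the step is small (likely a coding artifact, not real detail)
    out = img.astype(np.int16).copy()
    for x in range(block, img.shape[1], block):
        p, q = out[:, x - 1].copy(), out[:, x].copy()
        weak = np.abs(p - q) < alpha
        avg = (p + q) // 2
        out[:, x - 1] = np.where(weak, (p + avg) // 2, p)
        out[:, x] = np.where(weak, (q + avg) // 2, q)
    return out.astype(np.uint8)

# blocky test image: random 4x4-pixel flat blocks
blocky = np.repeat(np.random.randint(0, 256, (8, 4)), 4, axis=1).astype(np.uint8)
print(toy_deblock(blocky))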
Quote:
As I mentioned before... I moved from an AMD HD6450 (low-end card) to an Nvidia Quadro 880M and immediately noticed much more macroblocking. I've been trying to tweak the NVidia settings as best I can, but there seems to be FAR LESS ability to do so than with AMD. Thus, PQ is noticeably lower. AND this comes from the fact that I did no post processing with the HD6450 other than vector adaptive deinterlacing... so I must conclude the AMD is much better at upscaling and deinterlacing by default.

The Quadro 880M is a GT2xx part, as per this page. Further, it is a professional workstation card, and it is not immediately obvious whether the Quadro drivers have all the video post-processing steps enabled. GT2xx was never a great HTPC candidate (except for its Linux VDPAU support). So I believe comparing the Quadro 880M and the AMD 6450 is comparing apples and oranges. For a 6450 comparison, I suggest using a GT430; PQ-wise, you will get very similar results.
Quote:
I was hoping to hear from other videophiles that might have experience with different GPUs, but it seems maybe there are not many out there that have done evaluations.

What sort of further evaluation are you looking for? We will try to address this in future coverage.

Ganesh T S
Sr. Editor, AnandTech
post #27 of 67 Old 06-13-2012, 05:42 PM
SUBCOB (Senior Member)
Quote:
Originally Posted by Livin View Post

I don't know what type/encoding/res/bitrate/etc. you are comparing, but I'd be very surprised if the HD2000 or even HD3000 could come close to the post-processing capabilities of the AMD HD6570.
Look at the in-depth review...
http://www.anandtech.com/show/4380/discrete-htpc-gpus-shootout/1
... unfortunately it does not compare the Intel HD2000/3000, but as you can see, even the discrete GT520 and HD6450, which both easily beat the Intel HD3000 in horsepower, cannot come close to the HD6570.
I'm still looking for ways to change the NVidia settings like I can with AMD.
EDIT: I did find a recent review of the HD3000 & HD4000 running the same tests...
http://www.anandtech.com/show/5773/intels-ivy-bridge-an-htpc-perspective/3
The HD6570 still easily bests the HD3000 and looks similar to the HD4000... but, of course, you can only get an HD4000 with a VERY expensive Ivy Bridge CPU.



Cut and paste from your top link:

"The HQV benchmarking procedure has been heavily promoted by AMD, but it is something NVIDIA says it doesn't optimize for. Considering the fact that there aren't any other standardized options available to evaluate the video post processing capabilities of the GPUs, we feel that HQV benchmarking should be an integral part of the reviews."

This in itself makes this test useless. Video quality is subjective to one's eyes, just as sound is to one's ears. There is no one card that works for everyone. All work, and as long as I get good PQ I will stick with integrated in order to save some power and to make for smaller form-factor builds to fit into my entertainment system. Finding benchmarks that one company optimizes for and one doesn't does not make your point.
post #28 of 67 Old 06-13-2012, 05:44 PM
Zon2020 (AVS Special Member)
Thank you, jakmal. That was a very useful explanation.

Kind of wish you'd said all that in the article.

Wondered about the Quadro. Don't see people using those for video much. Now I know why.
post #29 of 67 Old 06-13-2012, 05:55 PM
assassin (AVS Addicted Member)
Quote:
Originally Posted by Zon2020 View Post

Thank you, jakmal. That was a very useful explanation.
Kind of wish you'd said all that in the article.
Wondered about the Quadro. Don't see people using those for video much. Now I know why.

+1.

I thought it was excellent as well. Agreed that I would love to see this expanded upon in an article.


post #30 of 67 Old 06-13-2012, 05:56 PM - Thread Starter
Livin (Advanced Member)
Quote:
Originally Posted by jakmal View Post

The Quadro 880M is a GT2xx part, as per this page. Further, it is a professional workstation card, and it is not immediately obvious whether the Quadro drivers have all the video post-processing steps enabled. GT2xx was never a great HTPC candidate (except for its Linux VDPAU support). So I believe comparing the Quadro 880M and the AMD 6450 is comparing apples and oranges. For a 6450 comparison, I suggest using a GT430; PQ-wise, you will get very similar results.
What sort of further evaluation are you looking for? We will try to address this in future coverage.

Yes, the Quadro is a professional card, made for CAD, photo editing, video editing, etc... thus it should be able to compete (at minimum) with a low-end consumer card. The professional and consumer cards are always very similar (usually the same silicon and driver core); the difference is usually that the testing cycles (and ISV certifications) are much more rigorous on the professional side, and they don't bother with game optimizations in the drivers; they focus on ISV (app) optimization. I used to work for Dell, very closely with NVidia & AMD, on both the consumer and commercial sides of Dell's PC/workstation-class systems.

The question still remains... can the user control the same settings/tweaks with the NVidia drivers as they can with AMD? My findings so far say AMD gives the user a LOT more control... but maybe I'm missing something?

Quote:
Originally Posted by SUBCOB View Post

"The HQV benchmarking procedure has been heavily promoted by AMD, but it is something NVIDIA says it doesn't optimize for. Considering the fact that there aren't any other standardized options available to evaluate the video post processing capabilities of the GPUs, we feel that HQV benchmarking should be an integral part of the reviews." (this is a cut and paste from the top link)
This in itself makes this test useless. Video quality is subjective to one's eyes, just as sound is to one's ears. There is no one card that works for everyone. All work, and as long as I get good PQ I will stick with integrated in order to save some power and to make for smaller form-factor builds to fit into my entertainment system. Finding benchmarks that one company optimizes for and one doesn't does not make your point.

I see your point on optimizing for benchmark tests, and I agree about the subjective nature of PQ... though I have seen tests stating some of the Intel GPUs simply cannot render some types of images (not talking decoding, simply rendering). Thus, it makes me wonder about the rest of the chip's ability to do anything else well. All in all, I've decided Intel video (chip + drivers) is too problematic... Intel is still putting out "beta" video, IMO. I read the Intel forums a bit yesterday, and there are some seriously crazy issues plaguing many users, even on the latest chips.

At this point, I'm only considering AMD and NVidia... and I suspect I'll stick with AMD since I'm leaning towards an A8 series CPU.
