
*Official* NVIDIA Kepler (6xx) GPUs for HTPCs - Page 6

post #151 of 238
Thread Starter 
Quote:
Originally Posted by renethx View Post

OK, I didn't see such a problem. Almost the same configuration as yours, except for the MPC-HC version (mine is 1.6.3.4943), madVR fullscreen exclusive mode, and possibly the driver (mine is 12.6 beta). Actually, playing back 1080i60 with DXVA2CB + madVR's default HQ settings can be done even on a Llano A6-3500 with no problem. The improvements I saw are related to 720p60 and 1080p60 under DXVA2CB + madVR default HQ settings.

FSW (fullscreen windowed) vs. FSE (fullscreen exclusive) could be an issue. Let me try with FSE later tonight. (Or, if you are close to your testbed now, can you try with FSW?)
post #152 of 238

The main advantages of the HD 7750 over the GT 640 that I found are:

 

- Working full / limited range RGB levels (NVIDIA requires a custom resolution or a registry hack)

- A ready-for-use 23Hz refresh rate closer to 23.976Hz, as well as 24Hz (NVIDIA requires a custom resolution, and then it loses 24Hz, not good for European users; for stereoscopic 3D, we have to switch back and forth between 2D mode and 3D mode manually to make use of 2D custom resolutions)

- Support for 88.2kHz/176.4kHz LPCM over HDMI (NVIDIA can't output these sample rates in WASAPI exclusive mode, not good for music lovers; see the probe sketch at the end of this post)

 

Disadvantages are:

 

- No dual audio streams (yet); not good for 3D with an HDMI 1.3 AVR. Actually, this does not matter at all if you use SNB/IVB (use the Intel iGPU for audio)

- Two HW-accelerated decoders (DXVA2CB, QuickSync) vs. three (CUVID, DXVA2CB, QuickSync) under madVR (with an Intel processor)

- No support for 4K decode yet. Well, this does not matter at all, IMO (simply because there is almost no 4K content nor an affordable 4K display yet); NVIDIA can't decode 4Kp60 either (maybe GDDR5 fixes it?).

 

The advantages outweigh the disadvantages for me (the disadvantages are not really disadvantages at all), which is why I prefer the HD 7750.
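
As a rough illustration of how anyone can check the sample-rate point on their own hardware, the sketch below probes a WASAPI endpoint for exclusive-mode support of a given rate. It is only a sketch of the standard Core Audio API (the 24-bit stereo format and the helper name are my own choices, not taken from any particular player):

```cpp
#include <windows.h>
#include <mmdeviceapi.h>
#include <audioclient.h>
#include <ksmedia.h>

// Probe whether an audio endpoint (e.g. the HDMI output) accepts a sample
// rate in WASAPI exclusive mode. The IAudioClient is assumed to have been
// obtained beforehand from the endpoint via IMMDevice::Activate.
bool SupportsExclusiveRate(IAudioClient* client, DWORD sampleRate)
{
    WAVEFORMATEXTENSIBLE wfx = {};
    wfx.Format.wFormatTag           = WAVE_FORMAT_EXTENSIBLE;
    wfx.Format.nChannels            = 2;
    wfx.Format.nSamplesPerSec       = sampleRate;   // e.g. 88200 or 176400
    wfx.Format.wBitsPerSample       = 24;
    wfx.Format.nBlockAlign          = wfx.Format.nChannels * wfx.Format.wBitsPerSample / 8;
    wfx.Format.nAvgBytesPerSec      = sampleRate * wfx.Format.nBlockAlign;
    wfx.Format.cbSize               = sizeof(WAVEFORMATEXTENSIBLE) - sizeof(WAVEFORMATEX);
    wfx.Samples.wValidBitsPerSample = 24;
    wfx.dwChannelMask               = KSAUDIO_SPEAKER_STEREO;
    wfx.SubFormat                   = KSDATAFORMAT_SUBTYPE_PCM;

    // In exclusive mode the closest-match argument must be NULL;
    // S_OK means the driver accepts the format exactly as given.
    HRESULT hr = client->IsFormatSupported(AUDCLNT_SHAREMODE_EXCLUSIVE,
                                           reinterpret_cast<WAVEFORMATEX*>(&wfx),
                                           nullptr);
    return hr == S_OK;
}
```

If 88200 and 176400 come back unsupported while 96000 and 192000 succeed, you are seeing the limitation I described above.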


post #153 of 238
Thread Starter 
Quote:
Originally Posted by renethx View Post

- A ready-for-use 23Hz refresh rate closer to 23.976Hz, as well as 24Hz (NVIDIA requires a custom resolution, and then it loses 24Hz, not good for European users)
.....
- Two HW-accelerated decoders (DXVA2CB, QuickSync) vs. three (CUVID, DXVA2CB, QuickSync) under madVR (with an Intel processor)

renethx,

What about the driver's native chroma upsampling algorithm used by EVR? In my experience, Intel and NVIDIA have been better than AMD over the past few releases.

Also, regarding the 23 Hz custom resolution on NVIDIA apparently resulting in the loss of 24 Hz: that is not entirely true. Setting the screen refresh rates in the monitor properties (rather than the NV control panel) resolves the problem:

http://images.anandtech.com/galleries/1941/004-01-Post-23Hz-Setup-Mon-23Hz.png

http://images.anandtech.com/galleries/1941/005-01-Post-23Hz-Setup-Mon-24Hz.png

(The above screenshots are from a GT 540M, but I have no reason to believe the GT 640 behaves otherwise.)

If I remember correctly, the decoded video doesn't need to leave the GPU at all when decoding with CUVID. For DXVA2CB and QS, system RAM speed will have quite a bit of an effect, because the decoded video has to reside in system RAM before madVR can take over (please correct me if I am wrong).
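
To make the copy-back point concrete, here is a simplified sketch of what "copy-back" means in this context; it is only my illustration (real decoders like LAV use optimized streaming copies and handle formats more carefully), not actual LAV or madVR code:

```cpp
#include <d3d9.h>
#include <cstdint>
#include <cstring>

// Simplified DXVA2 copy-back: the decoded frame lives in a GPU-side D3D9
// surface and is read out into a system-RAM buffer before the renderer's
// CPU-side path can touch it. This readback is why system RAM / bus speed
// shows up in the benchmarks.
void CopyBackNV12(IDirect3DSurface9* decodeSurface, uint8_t* dst,
                  int width, int height)
{
    D3DLOCKED_RECT lr = {};
    if (FAILED(decodeSurface->LockRect(&lr, nullptr, D3DLOCK_READONLY)))
        return;

    const uint8_t* src  = static_cast<const uint8_t*>(lr.pBits);
    const int      rows = height + height / 2;   // NV12: luma plane + interleaved chroma
    for (int y = 0; y < rows; ++y)
        memcpy(dst + static_cast<size_t>(y) * width,
               src + static_cast<size_t>(y) * lr.Pitch, width);

    decodeSurface->UnlockRect();
}
```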

The WASAPI audio sample-rate issue is something I am hearing about for the first time. I do sample-rate testing for all the media streamer boxes, but haven't done it for HTPCs so far. Maybe it is time to revisit the audio segment.
post #154 of 238
Quote:
Originally Posted by jakmal View Post

By the way, I am also simultaneously evaluating the AMD 7750 (its 4K decode is broken), where I expect at least madVR + DXVA2CB to be functional for hardware decode and HQ rendering. I am facing the issue described here:
http://forum.doom9.org/showthread.php?p=1579080#post1579080

If you think that's bad, you should try a 4K clip. :P
I only have my 7750 in my HTPC and didn't get a chance to test your clip there; I will try tomorrow.
post #155 of 238
Thread Starter 
Quote:
Originally Posted by Nevcairiel View Post

If you think that's bad, you should try a 4K clip. :P
I only have my 7750 in my HTPC and didn't get a chance to test your clip there; I will try tomorrow.

I have given up on madVR with 4K content. I don't think there is a chance for any HTPC GPU to work its magic there. Maybe the 680 or 7900 series could do something, but I don't have those cards to test.

4K with DXVA2CB and EVR-CP works on the 640 but crashes on the AMD 7750, which is a bit strange, because DXVA Native doesn't work on any GPU and I would imagine DXVA2CB behavior would be similar: either work on both GPUs or fail on both.
post #156 of 238
Does anyone know if Kepler has the same issue the GT 430 did, where defining a custom resolution breaks frame-packed 3D playback? Link to the issue: http://www.avsforum.com/t/1240499/faq-for-the-3d-htpc/1080#post_21969498
post #157 of 238
Thread Starter 
Quote:
Originally Posted by jstabb View Post

Does anyone know if Kepler has the same issue the GT 430 did, where defining a custom resolution breaks frame-packed 3D playback? Link to the issue: http://www.avsforum.com/t/1240499/faq-for-the-3d-htpc/1080#post_21969498

This appears to be a driver bug and not hardware-specific. If it is not working on your GT 430 with the latest drivers, it probably doesn't work on the Keplers either. I will check the next time I plug the 640 back into my testbed.
post #158 of 238
Quote:
Originally Posted by jakmal View Post

This appears to be a driver bug and not hardware-specific. If it is not working on your GT 430 with the latest drivers, it probably doesn't work on the Keplers either. I will check the next time I plug the 640 back into my testbed.
It does indeed seem like a driver issue. But even with unified driver models, you never know when there's branching logic in the driver that's model-dependent. Thank you for offering to check; it's much appreciated! Also, if the issue does indeed remain, I'd suggest you include a note about it in your review so that others like myself, who make purchasing decisions based on custom resolution support, are well informed. Thanks again!
post #159 of 238
Quote:
Originally Posted by jakmal View Post

FSW (fullscreen windowed) vs. FSE (fullscreen exclusive) could be an issue. Let me try with FSE later tonight. (Or, if you are close to your testbed now, can you try with FSW?)

 

Are you using FSW in all your tests? Then that's a bad idea. FSW requires a lot of GPU processing power, particularly on AMD. A Llano A6-3500 can play all content easily except for 720p60 and 1080p60 with DXVA2CB + madVR HQ FSE.

post #160 of 238
Quote:
Originally Posted by jakmal View Post

4K with DXVA2CB and EVR-CP works on the 640 but crashes on the AMD 7750, which is a bit strange, because DXVA Native doesn't work on any GPU and I would imagine DXVA2CB behavior would be similar: either work on both GPUs or fail on both.

Didn't I explain that often enough already?
DXVA2 Native in LAV is currently artificially limited to 1080p; I specifically block any higher resolutions.

This is done because there is no proper auto-detection for 4K hardware support yet. This means that if you ran a 4K sample on a GPU that didn't support 4K DXVA, you would end up not getting an image (switching from DXVA Native back to software decoding is not easily possible, so you have to fail very early to manage the fallback).
I also haven't invested much in this yet, because until recently NVIDIA just crashed on 4K DXVA (need to re-test with recent drivers) and AMD produces a broken image, so the need just wasn't there.
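
Roughly, the check amounts to something like this (a simplified illustration of the idea, not the actual LAV source):

```cpp
// Simplified illustration of the early rejection described above.
// With DXVA2 Native the decoder has to refuse unsupported streams up front,
// because switching back to software decoding mid-stream is not easily done.
bool AllowDxva2Native(int codedWidth, int codedHeight)
{
    // There is no reliable runtime query for 4K DXVA capability, so anything
    // above 1080p is blocked; the caller then falls back to software (avcodec).
    if (codedWidth > 1920 || codedHeight > 1088)
        return false;
    return true;
}
```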

Also, regarding 4K on the 7750, I didn't mean using madVR specifically; the last time I tried with any renderer, it resulted in half an image and half green. Very amusing. If it crashes now, AMD may have changed something in the driver (not for the better, I guess).
post #161 of 238
Thread Starter 
Quote:
Originally Posted by Nevcairiel View Post

If you think thats bad, you should try a 4K clip. :P
I only have my 7750 in my HTPC, and didn't get a chance to test your clip in there, will try tomorrow.

Hendrik, after a couple of reboots and a change to FSE, both FSE and FSW playback are now flawless for the 1080i60 clip. Definitely a strange symptom that I am not able to reproduce now or in a position to explain away.

On a separate note, is there any possibility at all of a CUVID equivalent for AMD GPUs, i.e., transferring decoded data to madVR without going through system RAM in between?
Quote:
Originally Posted by renethx View Post

Are you using FSW in all your tests? Then that's a bad idea. FSW requires a lot of GPU processing power, particularly on AMD. A Llano A6-3500 can play all content easily except for 720p60 and 1080p60 with DXVA2CB + madVR HQ FSE.

I remember FSW used to cause dropped frames on the 540M (that is why I used FSE in all the tests for the Vision 3D 252B), but it was flawless for the GT 640. For the GT 640 review, I left the madVR install at its defaults (I think it defaults to FSW) and didn't get any dropped frames, so the graphs in the 640 review are all with FSW. I am wondering how much difference it makes to the power numbers. I will revisit that aspect sometime down the road.
post #162 of 238
Quote:
Originally Posted by jakmal View Post

On a separate note, is there any possibility at all of a CUVID equivalent for AMD GPUs, i.e., transferring decoded data to madVR without going through system RAM in between?

CUVID doesn't even work like that; it also copies the data back to system RAM.
The only way to avoid the copy is DXVA Native, which madVR does not support (yet).
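
For illustration, the readback step looks roughly like this (a sketch against the NVCUVID driver API; names and error handling simplified, not the actual LAV code): the decoded picture is mapped in device memory, copied to host memory, and unmapped again.

```cpp
#include <cuda.h>
#include <nvcuvid.h>

// Rough sketch of the CUVID device-to-host copy; surface-height alignment
// and error handling are omitted for brevity.
void CopyFrameToHost(CUvideodecoder dec, int picIdx,
                     unsigned char* hostNV12, int width, int height)
{
    CUdeviceptr devFrame = 0;
    unsigned int pitch = 0;
    CUVIDPROCPARAMS vpp = {};            // deinterlacing / mapping parameters

    cuvidMapVideoFrame(dec, picIdx, &devFrame, &pitch, &vpp);

    CUDA_MEMCPY2D cpy = {};
    cpy.srcMemoryType = CU_MEMORYTYPE_DEVICE;
    cpy.srcDevice     = devFrame;
    cpy.srcPitch      = pitch;
    cpy.dstMemoryType = CU_MEMORYTYPE_HOST;
    cpy.dstHost       = hostNV12;
    cpy.dstPitch      = width;
    cpy.WidthInBytes  = width;
    cpy.Height        = height + height / 2;   // NV12: luma + interleaved chroma
    cuMemcpy2D(&cpy);                          // this is the copy back to system RAM

    cuvidUnmapVideoFrame(dec, devFrame);
}
```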
post #163 of 238
Thread Starter 
Quote:
Originally Posted by Nevcairiel View Post

Also, regarding 4K on the 7750, I didn't mean using madVR specifically; the last time I tried with any renderer, it resulted in half an image and half green. Very amusing. If it crashes now, AMD may have changed something in the driver (not for the better, I guess).

avcodec works. You are right about the 4K behavior with DXVA2CB: it results in a display driver crash. renethx believes HW decode is not important for H.264 4K material, and while I agree with him that is the case for the YouTube encodes, I am not sure it will hold for other 4Kp30 material. 4Kp60 is important, but it will probably reach consumers a good 2-3 years after 4Kp30 becomes common.
post #164 of 238
Thread Starter 
Quote:
Originally Posted by Nevcairiel View Post

CUVID doesn't even work like that; it also copies the data back to system RAM.
The only way to avoid the copy is DXVA Native, which madVR does not support (yet).

Thanks for the clarification! I was doubtful about this anyway (as one of my previous posts shows). If Mathias implements support for DXVA Native, I assume CUVID also wouldn't need to copy back to system RAM for madVR to take over. (I am finding CUVID more reliable than native DXVA. By the way, is MPEG-4 acceleration possible with native DXVA?)
post #165 of 238
1080p60 isn't even all that important right now, which tells you a lot about 4Kp60.

CUVID will always copy the data back; doing anything else would be quite complicated and most likely not worth it. DXVA2 Native should work fine in most cases. MPEG-4 support is possible in theory; in practice it may be a bit more complicated, because I heard NVIDIA doesn't follow the MS standard there.

With CPUs getting faster and faster, the need for GPU decoding also goes down. Even today (on recent hardware) it's only really important for mobile use cases.
In any case, somehow I don't think we'll be getting commercial 4K content in H.264 anyway. My guess is that by the time we get 4K it'll be H.265/HEVC, which current GPUs don't support anyway.
post #166 of 238
Thread Starter 
Quote:
Originally Posted by Nevcairiel View Post

1080p60 isn't even all that important right now, which tells you a lot about 4Kp60.
CUVID will always copy the data back; doing anything else would be quite complicated and most likely not worth it. DXVA2 Native should work fine in most cases. MPEG-4 support is possible in theory; in practice it may be a bit more complicated, because I heard NVIDIA doesn't follow the MS standard there.
With CPUs getting faster and faster, the need for GPU decoding also goes down. Even today (on recent hardware) it's only really important for mobile use cases.
In any case, somehow I don't think we'll be getting commercial 4K content in H.264 anyway. My guess is that by the time we get 4K it'll be H.265/HEVC, which current GPUs don't support anyway.

Agreed about 4Kp60: the camera that renethx linked apparently shoots four separate 1080p60 streams and records to four SDXC cards. Stitching them together is done offline in software, if my understanding is right. Too many complications.

The problem with faster and faster CPUs is the need to cool them down. For a quiet HTPC, I think GPU decoding is still very important.

HEVC / H.265 is still quite some time off from standardization (best case, the standard is finalized and sent for approval this July and then, after more meetings, finally ratified). However, apart from standardization, I think the biggest roadblock to HEVC is the patent issue. For H.264, MPEG-LA has had a patent pool which ensures that any company using H.264 doesn't end up getting sued by a patent holder as long as they pay royalties to MPEG-LA. Unfortunately, HEVC patent owners are threatening to license on their own. Imagine a company doing an HEVC encoder or decoder and being forced to pay royalties to 50 different patent holders, each of them demanding 2% of the revenue or something like that. It would be foolhardy for any company to come out with an HEVC product in that scenario. My belief is that we will eventually need HEVC for 4K and higher-resolution content, but for the first generation or two of 4K-enabled products, it will be H.264.
post #167 of 238
Quote:
Originally Posted by jakmal View Post

The problem with faster and faster CPUs is the need to cool them down. For a quiet HTPC, I think GPU decoding is still very important.

I dunno, if you get one of the new Intel CPUs, cooling them seems to be a non-issue, even at full load. CPUs aren't just getting faster, but also more efficient (at least on Intel's side; AMD is a bit lacking here).

It's also important to know that CUVID has one major drawback: it causes the GPU to run at 3D clocks, which can also generate additional heat.
Sadly, that's just a side effect of using CUDA to access the video decoder; CUDA always runs at high power levels to guarantee performance.
post #168 of 238
Quote:
Originally Posted by jakmal View Post

What about the driver's native chroma upsampling algorithm used by EVR? In my experience, Intel and NVIDIA have been better than AMD over the past few releases.

Also, regarding the 23 Hz custom resolution on NVIDIA apparently resulting in the loss of 24 Hz: that is not entirely true. Setting the screen refresh rates in the monitor properties (rather than the NV control panel) resolves the problem:

http://images.anandtech.com/galleries/1941/004-01-Post-23Hz-Setup-Mon-23Hz.png

http://images.anandtech.com/galleries/1941/005-01-Post-23Hz-Setup-Mon-24Hz.png

(The above screenshots are from a GT 540M, but I have no reason to believe the GT 640 behaves otherwise.)

The WASAPI audio sample-rate issue is something I am hearing about for the first time. I do sample-rate testing for all the media streamer boxes, but haven't done it for HTPCs so far. Maybe it is time to revisit the audio segment.

 

I ran HQV 2.0, but to my eyes this is not a big problem. Surely the AMD driver's algorithm can be improved, and you should report any problems you find in your articles.

 

Thanks for the tip on the NVIDIA custom resolution. It works, but it's a bit ugly. The video playback side of the NVIDIA driver hasn't changed in a long time. I feel as if NVIDIA has abandoned driver development for HTPC users.

 

There are tons of music sources at 88.2kHz/176.4kHz, and some people even upsample 44.1kHz music to 88.2kHz for better quality. Although many people prefer analog sound cards, being able to send 88.2kHz/176.4kHz over HDMI to an AVR is a nice addition.

 

BTW, the Sapphire Ultimate HD 7750 won't fit popular microATX HTPC cases such as the SilverStone GD04/GD05/GD06 and Antec Fusion Remote, while the HIS H775P1GD with the iSilence 5 passive cooler will (but it occupies 2.5 slots and is too long for the Antec Fusion Remote). This is an important point when recommending the HD 7750.


post #169 of 238

CPU usage on a Core i5-3570K when playing back a 4Kp60 clip with avcodec + EVR is ~60%. A Core i3 can't handle it (100% CPU usage). Intel HD Graphics 4000 with the ArcSoft Video Decoder + EVR can play it perfectly with ~5% CPU usage (but it can't output even 4Kp24). Adding a good third-party cooler is necessary if you want a quiet PC. Anyway, by the time 4K content is more popular, we will have much better graphics cards and processors.


post #170 of 238
Quote:
Originally Posted by jakmal View Post

The camera that renethx linked apparently shoots four separate 1080p60 streams and records to four SDXC cards. Stitching them together is done offline in software, if my understanding is right. Too many complications.

 

That's correct. There is another video clip on the JVC web site (4 MP4 files), but it looks like I need Mac OS to convert them into a single 4K MOV file.

 

I would like to have a consumer 4Kp60 camcorder for $500 in ten years.


post #171 of 238
Quote:
Originally Posted by renethx View Post

The main advantages of the HD 7750 over the GT 640 that I found are:

 

- A ready-for-use 23Hz refresh rate closer to 23.976Hz, as well as 24Hz (NVIDIA requires a custom resolution, and then it loses 24Hz, not good for European users)

 

Here is another complication with custom resolutions: for stereoscopic 3D, I have to switch to 3D mode manually, then go back to 2D mode manually after finishing with 3D; otherwise the custom resolution won't work properly. AMD cards handle this automatically. (For example, read this thread.) I am more and more confident that NVIDIA either lost interest in HTPC or just ceased HTPC-related driver development for some reason (maybe a financial one).


post #172 of 238
Thread Starter 
Quote:
Originally Posted by renethx View Post

Here is another complication with custom resolutions: for stereoscopic 3D, I have to switch to 3D mode manually, then go back to 2D mode manually after finishing with 3D; otherwise the custom resolution won't work properly. AMD cards handle this automatically. (For example, read this thread.) I am more and more confident that NVIDIA either lost interest in HTPC or just ceased HTPC-related driver development for some reason (maybe a financial one).

renethx,

Can we summarize the HTPC issues with the NVIDIA drivers? I will try to shoot them an email and see their response.

To your findings, please add the 'colour space / black level' issues that I reported in the 640 review.

By the way, the 7750 is not that accurate with black levels either (though it only takes a simple switch of the applicable options to fix the problem): http://imgur.com/a/fWRU4
post #173 of 238
NVIDIA should just add a switch in the driver for full or limited range. The driver can clearly do it; for some reason they just don't expose a switch and only do what the TV reports, and sadly most TVs don't report this correctly.
At least NVIDIA has a workaround for the issue, unlike Intel, where it just doesn't work at all.
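
The switch being asked for would not need to be complicated. Here is a hypothetical sketch of the decision logic (my own illustration, not any vendor's actual driver code):

```cpp
// Hypothetical: an explicit user setting wins; only fall back to what the
// display reports about itself when the user has not chosen anything.
enum class RgbRange { Auto, Limited, Full };

RgbRange PickOutputRange(RgbRange userSetting, bool displayReportsItselfAsTV)
{
    if (userSetting != RgbRange::Auto)
        return userSetting;                 // exposed switch: override wins
    // Fallback heuristic: TVs usually expect limited (16-235) RGB over HDMI,
    // PC monitors full (0-255) - but many TVs misreport, hence the complaint.
    return displayReportsItselfAsTV ? RgbRange::Limited : RgbRange::Full;
}
```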
post #174 of 238

I'm curious how much of a real-world difference there is between a Kepler GT 640 with DDR3 and one with GDDR5 for typical HTPC tasks.

 

The reason I ask is that I see a lot of people on HTPC forums immediately focus on the DDR3 version being "memory bandwidth starved". But at the same time I see positive comments like "significantly improved performance when decoding H.264, VC-1 and MPEG-2 codecs" on the NVIDIA PureVideo wiki page, as well as very positive comments on AnandTech about its use in HTPCs. Any extra processing the GPU could handle instead of my CPU is a welcome addition to my HTPC, even if my quad-core CPU could handle these kinds of tasks on its own.

 

So, what's the deal? For what typical HTPC tasks would I see a noticeable improvement in performance using GDDR5?

 

Personally, I use common MPEG-2/H.264/VC-1 decoders (such as LAV Video, MPC-HC, CoreAVC, ArcSoft), as well as 2D/3D Blu-ray playback with PowerDVD 12. I don't use madVR, since my setup revolves around EVR-specific apps. Nor do I play back 4K video. Once in a great while I'll re-encode a video to H.264.

post #175 of 238
I am looking at purchasing a 660 Ti for HTPC use today (I want to play games on it); either the Zotac, Gigabyte, or Asus. It will replace a 7750 that doesn't run games very well. Does anyone know if the 660 Ti supports bitstreaming?

I see that people have mentioned the 670 and 680 do. Also, on Zotac's website they list the 660 Ti AMP as supporting bitstreaming, but I can't find anything from NVIDIA saying so. I'm assuming it does, since Kepler was supposed to, but I don't see it listed as a feature or discussed in any reviews. I am not willing to get rid of my AMD 7750 until I know for sure. But I will be happy to get rid of it.
post #176 of 238
NVIDIA has officially announced the GTX 660 and the GTX 650.

http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-660

http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-650

The GTX 650 would probably be the most interesting card for HTPC usage, and with 80 GB/s of bandwidth it's not bandwidth-starved like the GT 640.
post #177 of 238

The term "bandwidth starved" was actually meant for 3D gaming; not to be confused with anything related to HTPCs (in respect to the GT640s).  The GT640 can comfortably handle the most advanced video formats out.  Actually, the GT640s are considered ideal for HTPC's since they would offer the least fan noise; and, take up the least amount of space.  A lot of people don't realize that the DDR3 can be a benefit; which dissipates less heat than DDR5; which is why the GT640's can comfortably fit into one PCI slot space.  That's why you won't see any one-slot space GTX650's being released.

 

Having said that, the GTX 650s look like they may be ideal for someone who needs to play games in addition to HTPC-related tasks.

 

Quote:
Originally Posted by HDGT View Post

NVIDIA has officially announced the GTX 660 and the GTX 650.
http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-660
http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-650
The GTX 650 would probably be the most interesting card for HTPC usage, and with 80 GB/s of bandwidth it's not bandwidth-starved like the GT 640.
post #178 of 238
For the record, I have no issues with the GT 640 either on my dual-monitor system (madVR, no 4K, no gaming).
post #179 of 238
Well, as of today, nobody should look at the GT 640. The GTX 650 is here, costs only $10 more (list) and is better in every way, including idle power. It looks like the GTX 650 is the new HTPC card to beat.
post #180 of 238
jakmal,

Don't you think it would be a really good time for a top-level technology website to do a new HTPC card/driver roundup?
GT 640, GTX 650, throw in a 670 for good measure
HD 7750, HD 7770, HD 7870

Latest drivers - what has been fixed, anything still broken...

I'm just saying...