Originally Posted by staknhalo
Wouldn't the GPU chipset have to be built with support for h.265 in mind? Or am I as wrong as can be?
You are correct. It should be self-evident that the card does not contain an H.265 decoder - the GPU was designed circa 2009, went on sale in 2010, and H.265 was not standardized until years later. It will be at least one more generation before a regular Nvidia GPU ships with an H.265 hardware decoder.
As for a shader- or OpenCL-based decoder, that isn't going to happen either. The GT430 has bottom-of-the-barrel math performance, generally struggling to even match some of the lower-end cards of the previous generation. I couldn't find a solid reference, but H.265 is expected to be significantly more math-intensive to decode than H.264 is.
Plus, nobody has an H.264 decoder written natively in either OpenCL or CUDA (OK, CORE has one that is partly CUDA, but it isn't open source). Greeneyez was incorrect when he stated that the LAV filter uses CUDA. What LAV actually uses is CUVID, which is Nvidia's way of exposing the card's own H.264 hardware decoder through CUDA. Under the hood, it is exactly the same part of the GPU that decodes video via DXVA - LAV does not contain an H.264 decoder itself, it only knows how to drive the hardware decoders.