Originally Posted by GetGray
Maybe for many on this thread you are right. But I strongly disagree. The last place I want the video processing done is in the PC. Many of us have high-end VPs that are designed and dedicated to just that task and do a wonderful job of it, given the opportunity. As for HDMI, nothing will make me happier than to bypass it with HD-SDI. But alas, I don't see that happening on a PC, either. I wasn't aware the HDMI spec disallowed 4:2:0.
I’m curious, have you actually done a per-frame comparison at high magnification?
I have yet to find any stand-alone device that can match the image quality of an HTPC running madVR. The only possible exception would be devices with dedicated deinterlacing hardware for video-based content, but that is of no concern to me as I don't own any interlaced video content. I watch films, where deinterlacing is trivial, and I'm not aware of any stand-alone device that can even output 24p from an NTSC or PAL DVD, for example.
In fact, most of the high-end external video processors I have owned only process the image in 10-bit 4:2:2, as the manufacturers do not consider there to be any benefit to 4:4:4/RGB when the video source is natively 4:2:0.
With madVR, the image is processed with 16 bits of internal precision, and there absolutely is a distinct advantage to outputting RGB compared to 4:2:2 chroma. Perhaps because these external video processors have to be fed a 4:2:2 signal (as far as I am aware, it is not possible to connect them to a 4:2:0 source), they either cannot do a good enough job or simply don't bother to try.
Where an external video processor does have an advantage is that it can pass 10-bit data on to the display, rather than being restricted to 8 bits of precision. This is fine if you are simply outputting the final image, but if you are doing 3D LUT display calibration in the VP/HTPC, 8 bits is not enough (there is noticeable degradation when using madVR's 3D LUT option). HTPCs should be capable of outputting 10-bit data to the display, though; it's just a case of the software needing to support it.
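To make the 8-bit vs. 10-bit point concrete, here is a back-of-envelope sketch in Python. The gamma-style curve is just a hypothetical stand-in for a LUT correction (this is not madVR's actual pipeline); the point is only that each extra output bit halves the worst-case rounding step.

```python
# Illustrative sketch only (not madVR's actual pipeline): apply a small
# gamma-style correction (a stand-in for a 3D LUT) to a grey ramp, then
# quantize the result to 8 or 10 bits and measure the rounding error.
import numpy as np

def max_quantization_error(bits):
    levels = 2 ** bits - 1
    ramp = np.linspace(0.0, 1.0, 1024)       # ideal 0..1 input signal
    corrected = ramp ** (2.4 / 2.2)          # hypothetical LUT correction
    quantized = np.round(corrected * levels) / levels
    return float(np.max(np.abs(quantized - corrected)))

err8 = max_quantization_error(8)             # ~0.002 (half an 8-bit step)
err10 = max_quantization_error(10)           # roughly 4x smaller
print(err8, err10)
```

At 8 bits the worst-case error is about half of 1/255 of full scale, which is exactly the kind of error that shows up on screen as banding in smooth gradients after a calibration LUT has been applied.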
Originally Posted by blaubart
Bitstreaming: with good equipment and good ears, you'll never want to live without it again.
Bitstreaming is no different from the decoded LPCM signal. With ReClock you can resample the audio when adjusting it to keep it in sync, so there should be no perceptible quality loss (and if you have spare CPU cycles, there is a replacement DLL for even higher-quality resampling options).
One could argue that playing back audio at the 24/1.001 fps found on most US-released discs (for legacy compatibility with 60/1.001 Hz displays), rather than at the 24.000 fps of the original film source (as found on many European releases), is a far more noticeable change than anything ReClock is doing.
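For the curious, the arithmetic behind that claim can be sketched in a few lines (the cents figure is just the standard 1200·log2 conversion of a frequency ratio):

```python
# Back-of-envelope arithmetic for the speed difference described above:
# 24.000 fps film played at 24/1.001 fps, and the pitch shift implied if
# the audio is simply resampled to stay in sync.
import math

film_fps = 24.0
disc_fps = 24.0 / 1.001            # ~23.976 fps on US releases
speed_ratio = disc_fps / film_fps  # ~0.999 (a 0.1% slowdown)
pitch_cents = 1200 * math.log2(speed_ratio)

print(f"speed ratio: {speed_ratio:.6f}")
print(f"pitch shift: {pitch_cents:+.2f} cents")
```

A shift of under 2 cents is well below the commonly quoted ~5-cent threshold of audibility, which is the basis for saying the 24/1.001 speed change itself is at least as significant as anything a resampler adds.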
Originally Posted by Wizziwig
MadVR converts and scales 4:2:0/NV12 decoded frames into a full-resolution RGB frame buffer. The video card then converts this full-res RGB data back to downsampled 4:2:2 for HDMI output. The extra RGB stage often causes mismatched black levels and/or aliasing artifacts from poor chroma down-scaling by the video card. If you set your PC HDMI output to RGB, then this extra conversion will just happen inside the TV or processor. It's just not as clean a path as you get from a CE device.
When you add the whole de-interlacing aspect to all of the above issues, things only get worse.
You should absolutely not be outputting YCC from your HTPC; it should be outputting RGB, or else you are undoing most of the work that madVR has done.
My television—and any high-end display—will display the full RGB source information. It is possibly converted to 4:4:4 YCC internally rather than remaining RGB, but if that is the case, then the display would be converting a 4:2:2 YCC input to that as well.
There is nothing “cleaner” about the signal chain of a CE device converting from 4:2:0 to 4:2:2 and then having the display convert that to 4:4:4—if anything this is worse
than going straight from 4:2:0 to RGB and then RGB to 4:4:4 YCC.
With the CE device chain there are two image scaling steps, whereas with the HTPC running madVR there is a single scaling step, performed with 16 bits of internal precision and high-fidelity image scaling algorithms. Who knows what your player or display is doing when converting from 4:2:0 to 4:2:2, and then from 4:2:2 to 4:4:4.
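Purely as an illustration of why the extra stage matters (linear interpolation and an 8-bit intermediate are assumptions here, not a model of any real device), a quick 1-D sketch comparing a full-precision chroma upsample against the same upsample with the intermediate 4:2:2 stage rounded to integer codes:

```python
# Toy 1-D illustration (assumed: linear interpolation, 8-bit rounding at
# the intermediate stage; real players/displays use unknown filters).
# Compare a full-precision chroma upsample against a 4:2:0 -> 4:2:2 ->
# 4:4:4 chain that stores the intermediate result as integer codes.
import numpy as np

rng = np.random.default_rng(0)
chroma = rng.integers(0, 256, size=64).astype(np.float64)  # one chroma row

def upsample2x(row):
    """Linear interpolation to twice the sample count."""
    x = np.arange(len(row))
    xi = np.linspace(0, len(row) - 1, 2 * len(row))
    return np.interp(xi, x, row)

full_precision = upsample2x(upsample2x(chroma))  # float throughout
intermediate = np.round(upsample2x(chroma))      # 4:2:2 stage rounded to 8-bit codes
two_step = upsample2x(intermediate)

divergence = float(np.max(np.abs(full_precision - two_step)))
print(divergence)  # up to half a code value of error from one extra rounding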