Originally Posted by Benjii
I'd like to preface this by saying I'm totally new to madVR, I've never used it!
I'm upgrading my 4k non-HDR TV to a 4k OLED with HDR, which I will be getting professionally calibrated and a 3D LUT will be provided at the end. I already have an HTPC (G4560 CPU, 8GB RAM, SSD) which I was running Kodi on and watching 4k remuxes on it. The CPU obviously has 10-bit hardware support, so it can play 4k remuxes without breaking a sweat. How much more taxing is madVR going to be beyond 4k playback? Does it seem viable that I'd be able to watch HDR10 content with a G4560 or will a GPU upgrade be needed?
On a somewhat related note... as far as I'm aware, madVR has additional processing involved, why is it needed? Why isn't the source video left untouched? I was told madVR would be my best option for my HTPC, but I don't really understand why.
I appreciate any help given!
The integrated graphics won't work well for madVR. madVR does its processing on the GPU, and it needs the kind of shader horsepower a discrete gaming card provides, which the G4560's integrated graphics can't deliver.
The additional processing done by madVR is often misunderstood. Kodi, for example, does its basic color space conversions, upscaling and other processing at 8 bits without dithering, even for 10-bit sources. madVR, by comparison, does its color space conversions in 32-bit floating point and applies high-quality dithering at output. This alone takes more processing power than the average Kodi box running LibreELEC can handle. That expensive math may sound like overkill, but it avoids rounding errors in the color conversions and reduces image-quality problems such as banding. The original uncompressed Y'CbCr video is also mastered with 32-bit floating point math; if 8 bits were good enough not to introduce errors, the studios would have used it. Color space conversions are especially sensitive on a PC because the display pipeline operates in RGB rather than Y'CbCr, so every frame has to be converted.
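To see why dithering at output matters, here is a tiny sketch (illustrative only, not madVR's actual code; madVR uses higher-quality error diffusion and ordered dithering, while this uses plain uniform noise). A shallow gradient that spans barely two 8-bit code values collapses into flat bands when rounded directly, but dithering spreads the rounding error so the average still tracks the true gradient:

```python
import random

def quantize(x, bits=8):
    # Round a [0,1] float straight to the nearest 8-bit code value.
    levels = (1 << bits) - 1
    return round(x * levels) / levels

def quantize_dithered(x, bits=8):
    # Add up to half an LSB of noise before rounding; the error becomes
    # fine noise instead of visible bands. (Uniform noise keeps the
    # sketch short; real dithering algorithms are more sophisticated.)
    levels = (1 << bits) - 1
    noise = random.random() - 0.5
    return max(0.0, min(1.0, round(x * levels + noise) / levels))

# A very shallow gradient: 1000 samples spanning about one 8-bit step.
gradient = [0.5 + i * (1.0 / 255) / 1000 for i in range(1000)]

plain = [quantize(v) for v in gradient]
random.seed(0)
dithered = [quantize_dithered(v) for v in gradient]

# plain collapses to one or two flat levels (banding); dithered uses
# more levels, and its average stays close to the true gradient.
print(len(set(plain)), len(set(dithered)))
```

The same principle is why madVR dithers even when outputting 10-bit: any time high-precision math is rounded down to a fixed bit depth, dithering hides the quantization steps.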
That is just one basic difference. The rest comes down to tailoring the output to your equipment and personal taste. You can choose the output gamut and bit depth, control how HDR content is processed, pick the quality of image upscaling, and clean up imperfections in the source such as banding, ringing, compression artifacts and noise, because even high-quality sources have flaws. 1080p Blu-rays, for example, often come with banding, are not always shot to look sharp (so sharpening can help), and many are marred by edge enhancement and digital noise reduction. Processing a video, even an uncompressed Blu-ray, is not always a terrible idea, though much of that is personal taste. Basic operations like chroma upscaling and image upscaling have to be performed on every video regardless (those missing pixels must contain something), and madVR does a much better job than Kodi of interpolating them. Proper HDR10 support with full metadata passthrough is also available. Kodi does not offer proper HDR support and instead does a poor HDR -> SDR conversion, something madVR can also do at much higher quality.
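For the "those pixels must contain something" point: 4:2:0 video stores chroma at half resolution in each direction, so the player has to invent the missing chroma samples one way or another. A minimal sketch (hypothetical helper names; madVR's real scalers use far better kernels such as Bicubic or NGU) comparing the crudest option, nearest-neighbour repetition, with simple linear interpolation on one chroma row:

```python
def upscale_nearest(row, factor=2):
    # Nearest-neighbour: just repeat each half-resolution chroma sample.
    return [v for v in row for _ in range(factor)]

def upscale_linear(row, factor=2):
    # Linear interpolation between neighbouring samples; the last sample
    # is held at the edge. Smoother transitions, fewer blocky chroma edges.
    out = []
    for i, v in enumerate(row):
        nxt = row[min(i + 1, len(row) - 1)]
        for k in range(factor):
            t = k / factor
            out.append(v * (1 - t) + nxt * t)
    return out

chroma = [0.0, 1.0, 0.0]  # one half-resolution chroma row
print(upscale_nearest(chroma))  # [0.0, 0.0, 1.0, 1.0, 0.0, 0.0]
print(upscale_linear(chroma))   # [0.0, 0.5, 1.0, 0.5, 0.0, 0.0]
```

Every player does some version of this on every frame; the quality difference between madVR and a basic renderer is in how well those in-between values are estimated.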
If you want to know more about madVR's features, try the link in my signature. The first post covers the equipment needed for a madVR HTPC. In your case, you may only need a new graphics card to go with your existing parts.
Hopefully, that is helpful.