Trying a CFI filter, DmitriRender, and getting high GPU usage. It's GPU-dependent software.
GTX 1060 6GB (core boosts to 2 GHz), Win10 x64, driver 425.31, MPC-HC x64, LAV Filters.
With EVR it stays around 10% with regular spikes to 40%.
With madVR it stays around 30-40% with spikes to 90%+.
Without DmitriRender, madVR with the settings below sits at about 10%.
No artifact removal, image enhancements, or scaling, other than chroma upscaling set to NGU Anti-Alias low. Nothing ticked under "trade quality for performance". No smooth motion; dithering is Error Diffusion 2; D3D11 decoding from LAV Video.
Not sure if this makes a difference in performance:
NVCP: 8 bit, RGB full.
Display: PC mode
8 bit, PC levels
"this display is calibrated", disable GPU gamma ramps, BT.709 primaries, pure power curve 2.2
"enable gamma processing", pure power curve, 2.2
1080p file on 1080p display.
Why the extra GPU consumption when using madVR?
What else could I disable?
Originally Posted by Onkyoman
I would disable all of the trade quality for performance checkboxes related to tone mapping and enable everything under tone map HDR using pixel shaders with the exception of output video in HDR format and color tweaks for fire & explosions. You should leave those two disabled.
What are the other HDR tone mapping settings, other than "compromise on tone & gamut mapping accuracy"?
Looks fine. Rendering times for 23.976fps content should stay under 41.71ms, the per-frame budget.
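For anyone wondering where the 41.71ms comes from: it's just 1000 divided by the frame rate, i.e. the time madVR has to render each frame before the next one is due. A quick sketch (the function name is my own, not from madVR):

```python
# Per-frame render budget: each frame must be ready within 1000 / fps
# milliseconds, or the renderer starts dropping/delaying frames.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

print(round(frame_budget_ms(23.976), 2))  # 41.71 -> film content
print(round(frame_budget_ms(59.94), 2))   # 16.68 -> 60p content
```

So if madVR's rendering-time stats sit comfortably below that number, you're fine even with DmitriRender eating extra GPU headroom.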