Noticed something recently and would like some input.
Full HTPC in my signature, but in summary: Win10 Pro, Jriver, i5-4670k, GTX-1060 SC Gaming (6GB). I have a Vizio M70-d3, and I'm outputting video directly from the GPU (while audio goes from the iGPU to my old receiver).
I set up madVR using the Kodi forum guide (linked in my signature and in the OP). I followed the part of the guide several posts down, starting with "Let's repeat this process, this time assuming the display resolution is 3840 x 2160p (4K UHD). Two graphics cards will be used for reference. A Medium-level card such as the GTX 960, and a High-level card similar to a GTX 1080."
For the most part I used the GTX 960 settings. I can't remember at this point whether I made any substantial changes toward the GTX 1080 settings, but in trying to solve this issue I'm pretty much back at the 960 settings now.
On certain content, namely TV shows at 30 (or 29.97) fps, whether 1080p or 720p, I get dropped frames constantly, at least a few per second. Using a little Android remote system-monitor app, I could see the GPU core running at 96-99% while viewing this content. For 1080p24 content it hovered around 85-92%. Clearly I'm taxing the GPU.
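For context on why 30 fps content tips it over when 24 fps doesn't: the renderer has to finish every frame within one frame interval or it drops frames, so the budget shrinks as the frame rate rises. A quick sketch of the arithmetic (my own illustration, not madVR output):

```python
# Per-frame render budget at common video frame rates.
# The GPU must finish all madVR processing within this window
# or the frame is dropped.

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render each frame."""
    return 1000.0 / fps

for fps in (23.976, 29.97, 59.94):
    print(f"{fps:>6} fps -> {frame_budget_ms(fps):.1f} ms per frame")
# 23.976 fps -> 41.7 ms, 29.97 fps -> 33.4 ms, 59.94 fps -> 16.7 ms
```

So going from 23.976 to 29.97 fps cuts the per-frame budget by about 20%, which lines up with the GPU sitting near 100% on that content while 24p still had headroom.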
I tried every manner of dialing back settings with no luck until last night, when I unchecked SuperRes under upscaling refinement. Boom: the problem went away, GPU usage dropped to the mid-80s, and all was good.
Previously I had only basic profiles in madVR: SD, 720p, 1080p, 2160p. I created additional profiles as the guide suggests, one for 1080p60 and one for 720p60, and unchecked SuperRes for those.
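For anyone setting up similar profiles: madVR's auto profile selection accepts simple rules keyed on source properties. Something along these lines should route high-frame-rate content to the lighter profiles (treat the exact variable names and thresholds as approximate; check the sample rules in madVR's profile dialog for the syntax it actually accepts):

```
if (srcHeight <= 720) and (fps > 25) "720p60"
elseif (srcHeight <= 1080) and (fps > 25) "1080p60"
elseif (srcHeight <= 1080) "1080p"
else "2160p"
```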
My questions are...
- Does this make sense?
- Should my 1060 have this sort of issue with settings proposed for a 960?
- Am I better off sacrificing something other than SuperRes to bring the processing time down?
Here is some MediaInfo data for one of the files in question: