Originally Posted by madshi
I'm not completely sure which kind of GPU is needed to do 1080p -> 4K upscaling. Of course much depends on which NGU quality level you're aiming at.
Something I've been wondering about: is it possible to roughly guesstimate madVR capability based on GPU metrics? E.g., 768 CUDA cores, 1300 MHz core clock, 4 GB of 128-bit GDDR5, on a PCI-E 3.0 bus?
Or ramping that up to 1152 CUDA cores, 1500 MHz core clock, and 3 GB of 192-bit GDDR5? Those are basically non-overclocked GTX 1050 Ti and GTX 1060 specs, respectively. The 1050 Ti is attractive due to its low 75 W TDP versus the 120 W TDP of the 1060 cards (less heat = less noise in an HTPC environment), though the 1.6x thermal cost is significantly less than the potential performance gain (~2.6x).
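To make the arithmetic behind those ratios explicit, here's a quick back-of-the-envelope sketch in Python using only the numbers quoted above. Treating cores × clock as a crude throughput proxy deliberately ignores memory bandwidth, boost clocks, and architecture, so it understates the gap compared with benchmark-derived figures like the ~2.6x above; it's only a first-order sanity check, not a madVR measurement.

```python
# Crude comparison from the spec-sheet numbers quoted above.
# cores * clock is only a rough throughput proxy; it ignores memory
# bandwidth, boost behavior, and how NGU actually loads the GPU.
cards = {
    "GTX 1050 Ti": {"cores": 768,  "clock_mhz": 1300, "tdp_w": 75},
    "GTX 1060":    {"cores": 1152, "clock_mhz": 1500, "tdp_w": 120},
}

def proxy_throughput(card):
    """Relative shader throughput in arbitrary units (cores * clock)."""
    return card["cores"] * card["clock_mhz"]

t_1050ti = proxy_throughput(cards["GTX 1050 Ti"])
t_1060   = proxy_throughput(cards["GTX 1060"])

print(f"Throughput ratio (1060 / 1050 Ti): {t_1060 / t_1050ti:.2f}x")
print(f"TDP ratio:                         {120 / 75:.2f}x")
print(f"Perf-per-watt ratio:               {(t_1060 / 120) / (t_1050ti / 75):.2f}x")
```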
My other key question involves how difficult different types of source input are to process. In my own case, the bulk of what I have (90+%) is either high-quality Blu-ray content (~30 Mbit, 1080p/24, MPEG-4), which we've been discussing here as a good match for NGU, or moderate-quality, early-day C-band content (~17-19 Mbit, 1080i/60, MPEG-2, on D-VHS tapes) from before they cranked up the compression knob. I assume the latter will take more work to clean up and upscale to 4K, because it will likely have more artifacts that one would want to reduce, and because deinterlacing adds its own overhead.

Since this will be viewed on an 8.5-ft wide screen from a 10-ft distance (a relatively immersive 46-degree FOV), the improvements madVR can provide should be readily apparent, though my expectations for the latter material are obviously lower than for the former.
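For reference, here's the quick arithmetic behind the 46-degree figure and the extra load from the interlaced material (a rough sketch; it assumes the screen width and distance stated above, and that deinterlacing 1080i/60 produces 60 output frames per second for the GPU to upscale):

```python
import math

# Viewing geometry: 8.5 ft wide screen viewed from 10 ft.
screen_width_ft = 8.5
distance_ft = 10.0
fov_deg = 2 * math.degrees(math.atan((screen_width_ft / 2) / distance_ft))
print(f"Horizontal FOV: {fov_deg:.0f} degrees")   # ~46 degrees

# Relative frame load: 1080p/24 Blu-ray vs 1080i/60 deinterlaced to 60p.
# Assuming one full output frame per field, the GPU upscales 60 frames/s
# instead of 24 -- 2.5x the frames, before counting the deinterlacing and
# artifact-reduction passes themselves.
print(f"Frame-rate load ratio: {60 / 24:.1f}x")
```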