Hi Madshi, that definitely makes sense, and keep up the good work.

@Mike, there seems to be a misunderstanding here. When the two of us discussed this topic previously, you posted in a way that read to me as if you were saying that comparing image quality should only be done with moving images, not with still frames. I replied by pointing out that scientists and researchers worldwide today actually compare video upscaling quality primarily by measuring and comparing single ("still") frames with well-known metrics such as PSNR and SSIM, which is certainly a worthwhile fact to mention.
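For anyone curious what a metric like PSNR actually measures: it's just the mean squared pixel error between a reference frame and a test frame, expressed on a log scale in dB. A minimal sketch in Python/NumPy (the frame values here are made up for illustration; real comparisons would use a pristine source frame vs. the upscaler's output):

```python
import numpy as np

def psnr(reference, test, max_val=255.0):
    """Peak signal-to-noise ratio between two same-sized frames, in dB.

    Higher is better; identical frames give infinity.
    """
    diff = reference.astype(np.float64) - test.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0:
        return float("inf")  # frames are pixel-identical
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy example: a flat gray frame vs. the same frame off by one code value.
ref = np.full((4, 4), 128, dtype=np.uint8)
tst = np.full((4, 4), 129, dtype=np.uint8)
print(round(psnr(ref, tst), 2))  # → 48.13
```

SSIM is more involved (it compares local luminance, contrast, and structure rather than raw pixel error); scikit-image ships a ready-made implementation as `skimage.metrics.structural_similarity` if you want to experiment with both.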
But I fully agree with you and Javs that comparing image quality in motion is important, as well. I'm a fan of comparing "everything": artificial test patterns, still photos, and movie content; in still frames using pixel peeping and in motion; from seating distance and with the nose pressed to the screen. Personally, I use pixel peeping in screenshots as my primary testing, comparison, and development method, which is what scientists and researchers seem to do, as well. The reason for this preference is that it makes things more objective. You just can't cheat that easily on still frames. Many artifacts, e.g. ringing and aliasing, become easily visible in stills, while they can be missed by the inexperienced eye in motion. Plus, you can switch back & forth between 2 comparison images in the blink of an eye (which makes even small differences visible), while when comparing motion sequences you have to rely on short-term visual memory. But as I said, I like to test in motion, as well, to get the "full picture" and to make sure that there are no motion artifacts etc.
There are certain artifacts that are much more visible in motion, too. E.g. if you have banding, the bands often move very visibly during a fade-to-black, which makes the banding stick out like a sore thumb, while in still frames it can be much less obvious. Or if there's an upscaling algo which tries to hallucinate texture detail, it could look great in stills but potentially terrible in motion. So testing in motion is definitely helpful/useful.
Does the Envy allow native (non-scaled) content to be sharpened? I am very sensitive to sharpness and like my images sharp without the detrimental effects of over-sharpening. Having said that, I would want my images (still and moving) to be profoundly sharp when viewed from my seating position, which means they would look a touch over-sharp when viewed close-up.
Another question I have is whether you've looked into a more sophisticated method for local contrast enhancement?