Originally Posted by Luminated67
Originally Posted by jch2
Yes, that is correct. Just doubling up and sending the same 1080p frame twice, with the second sub-frame identical to the first but shifted half a pixel up and to the right, would look terrible and blur the original 1080p frame. It would look like double vision.
The simplest solution for the "4k Enhancement" algorithm would be to guess the pixels of the shifted frame by interpolating the four neighboring pixels, but the image processing is much more sophisticated than that, and also includes noise reduction, edge enhancement (i.e. sharpness), and lots of other algorithmic trickery to create an even better image. If it works (and it generally does) it looks better than just sending the original 1080p image without the "4k Enhancement" processing.
That's why I love this place, you learn something new every day. Why does a REAL 4K Blu-ray look that little bit better than a 1080p one? Is it the fact that the system has more data to work with and can better calculate what to make the second image from?
That's mostly correct.
A 1080p Blu-ray disc has 2 million pixels per frame, uses 8-bit or 10-bit color, and a much smaller SDR color space called Rec.709. To display 4 million pixels from this source the Epson uses its "4k Enhancement" -- it displays the actual 2 million pixels and then it "guesses" the other 2 million (called upscaling) to make a better image. The image is still SDR / Rec.709 though -- there's no "color space upscaling". It generally does a good job of this, and the upscaled image looks better than if you turned 4k Enhancement off and just viewed the original 1080p image natively -- but not always.
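To make the "guessing" concrete, here's a toy sketch in Python of the interpolation idea from the quote above (this is NOT Epson's actual algorithm, which is proprietary and far more sophisticated -- just a minimal illustration of estimating the shifted sub-frame from the four real neighboring pixels; the function name is made up):

import numpy as np

def estimate_shifted_subframe(frame):
    # 'frame' is a 2D numpy array of luminance values from the 1080p source.
    # The shifted sub-frame sits half a pixel up and to the right, so each of
    # its pixels is surrounded by four real source pixels: self, right, up,
    # and up-right. Averaging them is plain bilinear interpolation.
    padded = np.pad(frame, ((1, 0), (0, 1)), mode="edge")  # repeat top row / right column
    return (padded[1:, :-1] + padded[1:, 1:] +     # self, right neighbor
            padded[:-1, :-1] + padded[:-1, 1:]) / 4.0   # up, up-right neighbor

The real processing then layers the noise reduction, edge enhancement, and the rest of the trickery on top of something like this.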
A 4k HDR Blu-ray has 8 million pixels per frame (4 times as many), uses 10-bit or 12-bit color, and has a much larger HDR color space called Rec.2020 or BT.2020 (roughly twice as large as Rec.709). To display 4 million pixels from this source the Epson can compute values for all 4 million pixels it can display directly from the actual source pixels (called downscaling). It doesn't have to guess anything like it does when it upscales. A downscaled 4-million-pixel image will almost always look better than 4 million pixels upscaled from 1080p, because guessing pixel values will always be less accurate than computing them from real source pixels.
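And the downscaling case, sketched the same way (again just a toy illustration, under the simplifying assumption that each displayed pixel averages the 2x2 block of real 4k source pixels it covers):

import numpy as np

def downscale_to_subframes(uhd_frame):
    # 'uhd_frame' is a 2D numpy array of a 3840x2160 source frame.
    # Each pixel-shift sub-frame is 1920x1080-sized; every displayed pixel is
    # computed from real source pixels (no guessing), here by averaging the
    # 2x2 block it sits over. The second sub-frame's blocks are offset by one
    # source pixel diagonally -- the half-pixel shift on the panel.
    def block_average(img):
        h, w = img.shape
        img = img[:h - h % 2, :w - w % 2]                      # trim to even size
        return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    sub_a = block_average(uhd_frame)          # unshifted sub-frame
    sub_b = block_average(uhd_frame[1:, 1:])  # shifted sub-frame (edges ignored for simplicity)
    return sub_a, sub_b

Because both sub-frames come straight from real source pixels, this kind of downscaled result beats the upscaled guess sketched above.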
So, the differences you are seeing between the two images the Epson displays -- a 1080p SDR image upscaled to 4 million pixels and a 4k HDR image downscaled to 4 million pixels -- are a combination of resolution (8 million source pixels is better than 2 million; there's no guessing when it has all 8 million pixels to work with) and color (HDR / Rec.2020 is better than SDR / Rec.709).
However, the 4 million pixels displayed will look somewhat less sharp than a true 4k projector that does a good job of displaying all 8 million pixels (like a JVC NX7). But from normal viewing distances (more than 50% of the screen diagonal away) and with 20/20 vision, your eyes don't have the physical ability to tell the difference between 4 million pixels (Epson pixel-shift) and 8 million pixels (true 4k). If you do see a difference from more than 50% of the screen diagonal away, what you are most likely seeing is the difference in image processing (sharpness / edge enhancement, noise reduction, etc.) between display manufacturers, not the difference in the number of pixels rendered on the screen.
Even though you can't see the difference between 4 million and 8 million pixels at typical viewing distances, what you can see clearly with 20/20 vision out to about 125% of the screen diagonal away is the difference between 2 million pixels (HD/1080p, upscaled or not) and 4 million pixels downscaled from a 4k source with Epson pixel-shift. So the Epson pixel-shifted, downscaled image from a 4k source looks sharper than a native or upscaled 1080p image all the way out to about 125% of the screen diagonal. But even if you walk far enough away from your screen that the resolution no longer matters, you'll still clearly see the difference between the limited SDR / Rec.709 color space and the expanded HDR / Rec.2020 color space.
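For anyone who wants to sanity-check the distance figures, here's the rough geometry -- a back-of-the-envelope sketch assuming a 16:9 screen and the commonly quoted ~1 arcminute of resolving power for 20/20 vision (the exact 50% / 125% thresholds above also fold in the pixel-shift overlap and other assumptions, so treat this as illustrative only; the helper name is made up):

import math

def pixel_angle_arcmin(vertical_pixels, distance_in_diagonals):
    # Angle (in arcminutes) subtended by one pixel row on a 16:9 screen when
    # viewed from a distance expressed as a multiple of the screen diagonal.
    screen_height = 9 / math.hypot(16, 9)           # screen height as a fraction of its diagonal
    pixel_pitch = screen_height / vertical_pixels   # one pixel row, in the same units
    return math.degrees(math.atan2(pixel_pitch, distance_in_diagonals)) * 60

# At one screen-diagonal away, a 1080-row pixel subtends roughly 1.6 arcmin
# (resolvable with 20/20 vision), while a 2160-row pixel subtends roughly 0.8 arcmin.
for rows in (1080, 2160):
    print(rows, "rows:", round(pixel_angle_arcmin(rows, 1.0), 2), "arcmin")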
Also, as a note: properly calibrated, the Epson can display over 100% of Rec.709 in any color mode, and about 75% of Rec.2020 in high lamp mode with any of the three color modes that engage the color filter (Cinema, B&W Cinema, and Digital Cinema), with Digital Cinema being the most accurate out of the box.
Rec.2020 is a target, and no current movie theater projection system or home display can hit 100% of Rec.2020. There are some pro displays that can, but they are small and super-expensive, and only used by a handful of movie studios. Rec.2020 has always been an aspirational target; hitting it will require about 1,000 nits of brightness (nearly 300 fL!) and 12-bit panels (10-bit panels are the current high end). The best home and theater displays (at the time of this post) are reaching into the high 80s of percent coverage of Rec.2020, but nothing can hit 100% yet. Also note that the next aspirational standard is called Rec.2100 (also BT.2100); it keeps the same wide Rec.2020 color gamut but adds the HDR transfer functions (PQ and HLG) and even higher resolutions on top of it.
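For reference, the nit / foot-lambert conversion behind that brightness figure (1 fL is about 3.43 cd/m^2, i.e. nits):

NITS_PER_FOOTLAMBERT = 3.426   # 1 fL = 1/pi candela per square foot, about 3.426 cd/m^2
print(round(1000 / NITS_PER_FOOTLAMBERT))   # about 292 fL for the 1,000-nit Rec.2020 target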
While movie producers master in the Rec.2020 container, they target the subset of it that current display technology can actually reproduce, called DCI-P3. DCI-P3 lies roughly halfway between Rec.709 and Rec.2020, so if a display can reach about 75% of Rec.2020, it will have close to 100% coverage of DCI-P3 -- which means you can watch a film exactly as the producers intended. The Epson covers about 87% of DCI-P3 in the color modes without the color filter (Dynamic, Bright Room, and Natural), and over 100% of DCI-P3 in the color modes with the color filter (Cinema, B&W Cinema, and Digital Cinema). So, with the Epson properly calibrated, you can experience a 4k HDR source mastered to DCI-P3 exactly as the movie producers intended.
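If you want to see roughly where those coverage numbers come from, here's a quick sketch comparing the sizes of the gamut triangles in the CIE 1931 xy chromaticity diagram, using the published primaries of each color space (manufacturers' coverage figures are often computed differently, e.g. in CIE 1976 u'v', so the exact percentages vary):

# Published CIE 1931 xy chromaticities of the red, green, and blue primaries.
PRIMARIES = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    # Shoelace formula for the area of a gamut triangle.
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

rec2020_area = triangle_area(PRIMARIES["Rec.2020"])
for name, pts in PRIMARIES.items():
    print(f"{name}: {triangle_area(pts) / rec2020_area:.0%} of the Rec.2020 triangle")
# Rec.709 comes out a bit over half, and DCI-P3 a bit over 70%, of the Rec.2020
# area -- the rough basis for the "twice as large" and "~75%" figures above.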
I know that's a lot to digest, but movie production and display reproduction technology is complicated, and this post is a very simplified version of that complexity. Hope it helps explain all of it in layman's terms and that you find it useful.