I've long maintained that pixel resolutions of 4K/UHD and higher are unimportant for video displays of a common size—say, 50-65 inches—when viewed from a normal seating distance—say, 9-10 feet—because human visual acuity is limited. Most people can't discern the extra resolution offered by 4K/UHD, not to mention 8K, on a screen that size while viewing from that distance. However, at last week's SMPTE tech conference, one presentation convinced me that higher resolutions are important after all, even if we can't see the extra detail directly.

The presentation, titled "Beyond the Limits of Visual Acuity: The Real Reason for 4K and 8K Image Resolution," was given by Edward Reuss, an independent industry consultant. He started by explaining that visual acuity is not a single number. "Reading acuity" refers to a viewer's ability to read the standard eye chart in a doctor's office; with 20/20 vision, a person should be able to read letters whose features subtend an angle of 1 arc-minute (1/60 of a degree) at a distance of 20 feet. "Simple acuity" refers to the optical limit for viewers with very keen vision under optimal lighting and viewing conditions; such viewers can theoretically perceive details half the size of the reading-acuity limit, or about 0.5 arc-minute.
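To put some rough numbers on those thresholds, here's a quick back-of-the-envelope calculation of my own (not from the presentation), using the assumptions I mentioned at the top: a 65-inch 16:9 4K/UHD panel viewed from 9 feet, with a simple small-angle approximation.

```python
import math

# My own back-of-the-envelope check, not Reuss's figures: how big is one
# pixel of a 65-inch 4K/UHD panel, in arc-minutes, from a 9-foot seat?
DIAGONAL_IN = 65.0       # assumed screen diagonal, inches
H_PIXELS = 3840          # 4K/UHD horizontal resolution
DISTANCE_IN = 9 * 12     # assumed viewing distance, inches

# A 16:9 panel's width is 16/sqrt(16^2 + 9^2) of its diagonal.
width_in = DIAGONAL_IN * 16 / math.hypot(16, 9)
pixel_pitch_in = width_in / H_PIXELS

# Small-angle approximation: angle (radians) ~= size / distance.
pixel_arcmin = math.degrees(pixel_pitch_in / DISTANCE_IN) * 60

print(f"one 4K pixel subtends about {pixel_arcmin:.2f} arc-min")  # ~0.47
print("reading-acuity limit: 1.00 arc-min")
print("simple-acuity limit:  0.50 arc-min")
```

Each 4K pixel works out to roughly 0.47 arc-minute at that distance: comfortably below the 1-arc-minute reading-acuity limit and right around the 0.5-arc-minute simple-acuity limit, which is exactly why the extra pixels are so hard to see directly.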

Then there's "hyperacuity," which refers to a viewer's ability to discern anomalies in an object's shape. Of most interest to videophiles are artifacts from image compression and spatial aliasing, such as jaggies and moiré, which viewers can often perceive at a scale finer than reading or even simple acuity would suggest. In fact, artifacts with features one-half to one-quarter the size of the simple-acuity limit can still be visible.
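Flipping the same arithmetic around gives a sense of what that hyperacuity scale means for resolution. The sketch below is again my own rough estimate under the same assumed setup (65-inch 16:9 screen, 9 feet, small-angle math), not a figure from the talk: it asks how many pixels across the screen would need before its pixel pitch shrinks to half or a quarter of the simple-acuity limit.

```python
import math

# Same assumed setup as before: 65-inch 16:9 screen viewed from 9 feet.
DIAGONAL_IN = 65.0
DISTANCE_IN = 9 * 12
width_in = DIAGONAL_IN * 16 / math.hypot(16, 9)

# Total horizontal angle the screen subtends, in arc-minutes
# (small-angle approximation, consistent with the per-pixel estimate).
screen_arcmin = math.degrees(width_in / DISTANCE_IN) * 60

# Hyperacuity-scale features: half to a quarter of the 0.5 arc-min
# simple-acuity limit, i.e. 0.25 down to 0.125 arc-min.
for threshold in (0.25, 0.125):
    pixels_needed = screen_arcmin / threshold
    print(f"{threshold:.3f} arc-min pitch needs ~{pixels_needed:,.0f} pixels across")
# Roughly 7,200 and 14,400 pixels across -- on the order of 8K (7,680)
# and about twice that.
```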

One example Reuss provided was two still images of a tennis court, seen above. In the upper image, the resolution is relatively low, and you can see jaggies in the white lines and moiré in the net. In the lower image, the resolution is twice as high, and the moiré is effectively eliminated while the jaggies are greatly reduced, though not entirely removed, especially in the upper left of the court. (Interestingly, he said that a "pathological" image with spatial artifacts can be constructed for any resolution, no matter how high.)
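To see how undersampling produces the moiré in that net, here's a tiny numpy sketch of my own (nothing Reuss showed, and the 30-cycle pattern and pixel counts are arbitrary): a fine repeating pattern sampled by too few pixels shows up at a completely different, lower spatial frequency, while doubling the sampling resolution reproduces it faithfully.

```python
import numpy as np

def dominant_cycles(n_samples, cycles=30.0):
    """Sample a sinusoidal grating (think of the mesh of a tennis net)
    with n_samples pixels across the frame and report the spatial
    frequency, in cycles per frame, that actually appears after sampling."""
    x = np.arange(n_samples) / n_samples          # pixel-center positions
    grating = np.sin(2 * np.pi * cycles * x)      # fine repeating detail
    spectrum = np.abs(np.fft.rfft(grating))
    return int(np.argmax(spectrum[1:]) + 1)       # skip the DC bin

# A 30-cycle pattern sampled by only 48 pixels violates the Nyquist limit
# (24 cycles), so it masquerades as an 18-cycle pattern -- visible moire.
print(dominant_cycles(48))   # -> 18 (aliased)

# Doubling the sampling resolution raises the Nyquist limit to 48 cycles,
# so the same 30-cycle pattern comes through at its true frequency.
print(dominant_cycles(96))   # -> 30 (correct)
```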

Jaggies and moiré can be reduced or eliminated with a lowpass filter, which removes high-frequency components from the video signal, including the components that make edges look sharp and hard. The catch is that applying such a filter softens those edges. But if the resolution is higher than humans can perceive to begin with, the softening happens at a scale too fine to see, so edges still appear sharp and the artifacts are gone.
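Here's a minimal sketch of that idea as I understand it, not an implementation from the presentation: render a hard edge at several times the target resolution, apply a lowpass filter, then downsample. The block-averaging "box filter" below is just the simplest stand-in for a proper lowpass filter, and the edge, image sizes, and 4x oversampling factor are my own arbitrary choices.

```python
import numpy as np

def diagonal_edge(n):
    """Render an n x n binary image of a slightly tilted edge by point
    sampling: 1 below the line y = 0.1*x + 0.45*n, else 0."""
    y, x = np.mgrid[0:n, 0:n]
    return (y > 0.1 * x + 0.45 * n).astype(float)

def box_downsample(img, factor):
    """Average each factor x factor block of pixels -- a crude lowpass
    filter followed by decimation."""
    n = img.shape[0] // factor
    return img.reshape(n, factor, n, factor).mean(axis=(1, 3))

target = 64
naive = diagonal_edge(target)              # point-sampled: hard jaggies
supersampled = diagonal_edge(target * 4)   # 4x the target resolution
filtered = box_downsample(supersampled, 4) # lowpass + downsample

# The naive edge jumps between 0 and 1 in whole-pixel steps (a visible
# staircase), while the filtered edge uses intermediate gray levels, so
# the staircase is smoothed away and the edge still reads as sharp.
print(np.unique(naive).size)      # 2 gray levels
print(np.unique(filtered).size)   # many gray levels along the edge
```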

Reuss went on to say that high dynamic range (HDR) and wide color gamut (WCG) actually increase the visual impact of spatial artifacts such as jaggies and moiré. Thus, higher resolutions are even more important for HDR/WCG displays, so lowpass filters can be applied to remove these artifacts without visibly softening sharp edges.

As a result of this presentation, I've changed my position on 4K and 8K consumer displays. I still believe that the extra resolution is not perceptible on a typical display at a typical seating distance, but the ability to apply lowpass filters to reduce or eliminate spatial artifacts without visibly softening the image is a big advantage, especially for HDR/WCG content. I guess even an old dog can learn new tricks!