Originally Posted by ARROW-AV
Absolutely correct. 'Extra pixels' don't matter beyond the respective HVP threshold viewing distance.
A native 4K single-pixel checkerboard test pattern will be perceived as a solid grey screen, identical to the same pattern downscaled to HD 1080p resolution, once you are viewing it from beyond the respective HVP threshold viewing distance.
In other words, you won't be able to perceive any difference.
Originally Posted by imagic
Hey, you think I'm writing nonsense? I can verify my observations and indeed can replicate them on the TV I'm using to type this.
I have a computer, it's got Photoshop, feeding 4K to a 4K TV.
So, right now, in my own home, as I type this, I can tell you in no uncertain terms that I have to get about 30+ feet away from the 75-inch TV that I'm using to see no stair-step whatsoever—just a smooth (slightly) diagonal line. That's with sharpening off (as it should be). It's a question of seeing one single pixel, or not. I can see that stair-step transition from quite a distance.
Yes, at such distance, a single-pixel checkerboard looks grey. Not arguing that. Indeed, the checkerboard looks grey a lot closer than 30 feet, in 4K.
Interestingly, with 4K, anti-aliasing can't "fix" it; the line looks like it's getting fatter and skinnier, even from 30+ feet. So it's either stair-steps or fat/skinny. With 8K, the anti-aliasing would be more effective and the diagonal line would look much smoother, but still sharp, from a much closer distance.
Another interesting thing is how far away you can see a one-pixel-wide, screen-width black line on a white background in 4K. You can see it from very, very far away. It does not disappear and become white. I had to exit my house and cross the street before it disappeared. Eyes are capable of discerning that line despite it being much thinner than what you should be able to "see."
I'm looking for the maximum distance at which I can see, make out, however you want to define it, the difference a single pixel can make. Exactly as described in the article itself.
There is no chart in the world that's gonna replace the fact that I can replicate these effects right now, as I type this.
Originally Posted by imagic
Have you even bothered reading my posts? Or are you all about arguing with an appeal to authority while ignoring a practical example that contradicts your point?
Firstly, because I am going to reference it, I will quote this excerpt from your article here:
"As an experiment, I put a 4K and an 8K TV side-by-side (82″ TVs) in a large room and walked backward until I could no longer see stair-steps in the lines. For the 8K TV, an 82″ Samsung Q900F, the line became smooth at about a 15-foot viewing distance. But the 4K TV (also an 82″ Samsung) required that I step back to somewhere around 40-50 feet away before the lines became smooth—I was very surprised that I had to triple the distance, I thought it would happen at double the distance. This is one provable, observable reason upscaling to 8K can yield a visible benefit"
Secondly, there are multiple different conversations going on in this thread.
The conversation which I have been having is with respect to the perception of image resolution specifically.
What you are talking about is the use of algorithms to upscale 4K content to 8K in order to reduce the visibility of video artefacts, such as aliasing. This will not magically enable anyone to fully distinguish and perceive all of the additional detail that might be present within the 8K image as compared with the 4K image if the viewing distance and screen size combination falls beyond the limits of HVP. In other words, video artefacts aside, you won't be able to perceive any difference in overall resolution or fine detail in general. Sorry!
Your discussion about the possibility of using 8K resolution to reduce video artefacts associated with 4K images that are being perceived at (or below) 4K resolution (which will be the case unless the screen size is large enough and the viewing distance close enough for the combination to fall within the perception limits of HVP) is very interesting, though.
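For anyone who wants to put rough numbers on those HVP limits: the threshold distance can be estimated from screen size and pixel count with the small-angle approximation. This is only a sketch under stated assumptions — the ~1 arcminute acuity figure (20/20 vision) and the 16:9 panel geometry are mine, not anything measured in this thread, and hyperacuity effects like spotting a lone black line are a separate matter:

```python
import math

# Rough estimate of the distance beyond which individual pixels can no
# longer be resolved. The ~1 arcminute acuity figure and the 16:9 panel
# geometry are assumptions, not measurements.

def threshold_distance_ft(diagonal_in, horizontal_pixels, arcmin=1.0):
    """Distance (feet) at which one pixel subtends `arcmin` arcminutes."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 screen width
    pixel_in = width_in / horizontal_pixels          # pitch of one pixel
    angle_rad = math.radians(arcmin / 60.0)
    return pixel_in / angle_rad / 12.0               # small-angle; in -> ft

print(round(threshold_distance_ft(75, 3840), 1))   # 75" 4K: ~4.9 ft
print(round(threshold_distance_ft(82, 3840), 1))   # 82" 4K: ~5.3 ft
print(round(threshold_distance_ft(82, 7680), 1))   # 82" 8K: ~2.7 ft
```

By this crude yardstick the pixels of an 82-inch 8K panel blend at roughly half the distance of an 82-inch 4K panel, which is consistent with the point that stair-steps seen from 40-50 feet are not a matter of resolving single pixels.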
However, your experiment is unfortunately fundamentally flawed in more ways than one.
You have included only one set of data points from one set of samples: one singular 4K TV and one singular 8K TV. You therefore cannot factor the influence of inter-unit variance out of the equation. Furthermore, the TVs are not 100% identical in every way aside from the resolution. They have differing hardware, software, firmware, image processing, etc., hence you are merely evaluating the comparative performance of these particular units of these particular makes and models of TV.
Therefore, for hopefully obvious reasons, you cannot generalize that your subjective observations are applicable to ALL 4K and 8K TVs.
Additionally, you have used Photoshop for the anti-aliasing, meaning that what you are really testing here is how good or bad Photoshop's anti-aliasing is... which isn't brilliant. Try repeating your experiment using MadVR instead, configured with its optimum settings for best anti-aliasing performance, and you will find yourself experiencing a better outcome than with Photoshop.
If you are able to perceive a video artefact at a distance where the HVP limit is 4K resolution, then the video artefact has a resolution of 4K (or less), meaning that if upscaling to 8K reduces the severity of the artefact, this calls into question how good the video processing and the display itself actually are. However you dress it up, you cannot perceive 8K resolution beyond the HVP limit for perceiving 8K resolution. So if you are perceiving something, and the screen size and viewing distance combination is beyond the HVP limit for perceiving 8K resolution, then the specific thing you are perceiving is not 8K resolution. Bear in mind that aliasing artefacts are not single-pixel features but comprise multiple pixels, so they are in fact lower than the display's native resolution.
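To put a rough number on that last point: by the same small-angle reasoning, a feature spanning N pixels subtends N times the angle of a single pixel, so it stays resolvable roughly N times farther away. A hypothetical sketch — the ~1 arcminute acuity figure and the pixel pitch (approximating a 75-inch 4K panel) are my assumptions:

```python
import math

# A stair-step artefact that is effectively 2-3 pixels tall remains
# visible 2-3x farther away than a single-pixel feature would.

def visibility_distance_ft(feature_in, arcmin=1.0):
    """Distance (feet) at which a feature subtends `arcmin` arcminutes."""
    return feature_in / math.radians(arcmin / 60.0) / 12.0

pixel_in = 0.017  # approx. pixel pitch of a 75" 4K panel, in inches
for n in (1, 2, 3):  # artefact height in pixels
    print(n, round(visibility_distance_ft(n * pixel_in), 1))
```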
Also, consider this for a moment: 'MANUFACTURER A' wants to successfully market 8K TVs at a premium over its new-model 4K TVs. Are you really surprised, therefore, that you can see differences in performance between the two in this regard? Incidentally, when the very first prototype 8K TVs were demoed against 4K TVs, nobody present when I viewed them could tell any difference between them. However, at the next trade show some months later, you could. The reason was that the video content being fed to the 8K TV had edge enhancement, with black lines around people and objects, to make them stand out more and make the 8K TV look 'sharper' than the 4K TV. Seriously.
Originally Posted by imagic
Another interesting thing is how far away you can see a one-pixel-wide, screen-width black line on a white background in 4K. You can see it from very, very far away. It does not disappear and become white. I had to exit my house and cross the street before it disappeared.
With respect, you are looking for the wrong thing here. The correct test in this instance would be to display the native 4K single-pixel-width black line on a white background on two displays that are identical in every way aside from one being 4K resolution and the other being 8K resolution, with the 8K TV upscaling the 4K image to 8K, and then see at what distance you cannot tell the difference between the two. It would also be interesting to use a native 8K single-pixel-width black line against a white background, feed this natively to the 8K TV, downscale it to 4K and feed that to the 4K TV, and then, again, see at what distance you cannot tell the difference between the two. This is how you would need to go about properly evaluating this.
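For what it's worth, the source patterns for such a test are easy to generate. A minimal sketch in Python, writing binary PGM files (a simple grayscale format most image viewers open); the nearest-neighbour 2x upscale here is only a stand-in for whatever scaler the 8K TV would actually apply, and the file names are made up:

```python
def write_pgm(path, rows):
    """Write rows of grayscale bytes as a binary (P5) PGM image."""
    h, w = len(rows), len(rows[0])
    with open(path, "wb") as f:
        f.write(b"P5\n%d %d\n255\n" % (w, h))
        for row in rows:
            f.write(row)

def line_pattern(width, height, line_row):
    """White field with a single-pixel-width black horizontal line."""
    return [b"\x00" * width if y == line_row else b"\xff" * width
            for y in range(height)]

def upscale2x(rows):
    """Naive nearest-neighbour 2x upscale: each pixel becomes a 2x2 block."""
    out = []
    for row in rows:
        doubled = bytes(v for v in row for _ in (0, 1))
        out.append(doubled)
        out.append(doubled)
    return out

native_4k = line_pattern(3840, 2160, 1080)   # feed natively to the 4K set
upscaled_8k = upscale2x(native_4k)           # stand-in for the 8K set's scaler
write_pgm("line_4k.pgm", native_4k)
write_pgm("line_8k_upscaled.pgm", upscaled_8k)
```

Note that a plain 2x duplication keeps the upscaled line exactly as wide, physically, as the 4K original, which is the point of the comparison: any visible difference then comes from the TVs' own processing, not the pattern.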