Originally Posted by tgm1024
I've stared and stared at the 4K X900 at BB/Mag and because my couch is more than 12 feet back from my own set I tested that as well. There's no way I'm not seeing 4K effects.
I also have a pre-press / printed graphics background and yes, I also know not to base things on acuity charts. But with TVs the charts are a far less appropriate metric because motion is now involved (as opposed to typography). And your eye/brain neurology can detect edges in apparently insane ways: that's its job, in fact. Your eye is not a CCD array. One of the oddball examples I've given in the past has to do with a spiderweb strand some 20 feet away. You often can't see it at all, even if you're staring right at it, until it moves. In fact, your eye undergoes saccadic motion even with stationary things to further this phenomenon, and when it's coupled with a change against a reference background as well (such as with video), you really have an amazing ability to see a difference.
We see *less* resolution on a screen for items in motion than for static items. The reason is motion blur during capture and judder during display. The eye's resolving power for items in motion is also about the same as for static items, because nothing is truly static on the retina anyway (thanks to the saccades you mention). I would also point out that the individual display pixels aren't moving anyway, unless you put your TV on casters and push it across your floor.
But, there is an important factor from motion in determining the pixels needed for a display with "invisible" pixels. By invisible, I mean one where you can't tell that it is made up of individual pixels. That factor is that artifacts that move are far more visible than stationary artifacts. So, while some aliasing is not too objectionable in a static image, any aliasing in a moving image is distracting and ugly. Now, avoiding aliasing is at odds with preserving detail right up to the pixel spacing, so having extra pixels beyond what a person can strictly see is helpful to simultaneously avoid artifacts and have sharp details at the visible limit. How much extra you need depends on how the anti-aliasing is performed. If you first capture a very high resolution image (say with a Red Epic-Dragon camera) and then digitally down-sample it, you can retain more detail without artifacts than if you directly capture at the display resolution. (The best Blu-ray discs are generated from movies captured originally with more than 2K resolution.)
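To illustrate the capture-high-then-downsample point, here is a minimal sketch in Python. The 2x oversampled grayscale frame, the test grating, and the simple box filter are all my illustrative assumptions; a real mastering chain uses better filters, but the principle is the same.

import numpy as np

def box_downsample(img, factor=2):
    # Average factor-by-factor blocks: a crude anti-aliasing filter applied
    # before the resolution is thrown away.
    h, w = img.shape
    h, w = h - h % factor, w - w % factor        # trim to a multiple of factor
    img = img[:h, :w]
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def naive_decimate(img, factor=2):
    # Just drop pixels with no filtering, so detail near the limit aliases.
    return img[::factor, ::factor]

# A fine diagonal grating close to the Nyquist limit of the capture resolution:
yy, xx = np.mgrid[0:2160, 0:3840].astype(float)      # pretend "camera" frame
fine_detail = 0.5 + 0.5 * np.sin(0.9 * np.pi * (xx + yy))
hd_filtered = box_downsample(fine_detail)     # 1080p frame, detail smoothed but clean
hd_aliased = naive_decimate(fine_detail)      # 1080p frame with false coarse patterns

The filtered result trades a little sharpness for freedom from the moving false patterns the decimated version would show.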
It is easy to determine how much resolution you need if you own a set today: simply change your viewing distance to check the effect of different numbers of pixels per angle of view. I think a reasonable rule of thumb is to use a pixel spacing just finer than the high-contrast resolution limit for your vision. At that spacing, with natural images (not computer generated), the pixels will be invisible for all but the highest contrast edges. This is the value used in the Sony paper justifying 4K theater projection. Sony used 20/20 vision, which yields the following ratios for viewing distance to image height, based on the number of pixels vertically (a short calculation reproducing these numbers follows the chart):
lines: ratio (distance/image-height)
480 : 7.16
720 : 4.77
1080 : 3.18
1620 : 2.12
2160 : 1.59
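For anyone who wants to check the numbers, here is a minimal sketch of the arithmetic. The only assumption is the standard one that 20/20 vision resolves about 1 arcminute, so the pixel pitch should subtend no more than 1/60 of a degree at the viewing distance.

import math

def distance_to_height_ratio(lines, arcmin=1.0):
    # Viewing distance / picture height at which one pixel row subtends `arcmin`.
    return 1.0 / (lines * math.tan(math.radians(arcmin / 60.0)))

for lines in (480, 720, 1080, 1620, 2160):
    print(lines, round(distance_to_height_ratio(lines), 2))
# prints 7.16, 4.77, 3.18, 2.12, 1.59 -- the chart above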
I think many people can just discern 20/15 details even if they can't identify text at that scale. This finer acuity yields the following ratios (see the note after the chart):
lines: ratio (distance/image-height)
480 : 9.55
720 : 6.37
1080 : 4.24
1620 : 2.83
2160 : 2.12
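These 20/15 numbers are simply the 20/20 ratios scaled by 20/15 = 4/3, since 20/15 acuity corresponds to resolving roughly 0.75 arcminute: for example, 3.18 x 4/3 = 4.24 for 1080 lines. With the helper above, distance_to_height_ratio(lines, arcmin=0.75) reproduces this chart.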
So, at 10 feet the maximum screen size (16:9 diagonal) for each number of lines, using the 20/15 ratios, is:
480 : 25"
720 : 38"
1080 : 57"
1620 : 86"
2160 : 115"
Since after 1080p the next step up usually available for a TV is 2160p (for a 4K set), this chart implies you will see a difference moving up to 4K for 58" TVs and above (the sketch below shows the arithmetic).
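The screen sizes follow from the 20/15 ratios plus 16:9 geometry. A minimal sketch; using the rounded ratios and truncating to whole inches are my assumptions about how the chart was built.

import math

DIAG_PER_HEIGHT = math.sqrt(1.0 + (16.0 / 9.0) ** 2)   # about 2.04 for a 16:9 screen

def max_diagonal_inches(distance_inches, ratio):
    # Tallest picture whose pixel rows stay below the acuity limit, as a diagonal.
    return (distance_inches / ratio) * DIAG_PER_HEIGHT

ratios_20_15 = {480: 9.55, 720: 6.37, 1080: 4.24, 1620: 2.83, 2160: 2.12}
for lines, ratio in ratios_20_15.items():
    print(lines, int(max_diagonal_inches(10 * 12, ratio)))   # 10 feet = 120 inches
# prints 25, 38, 57, 86, 115 -- the chart above
# int(max_diagonal_inches(12 * 12, 4.24)) gives 69, the 12-foot 1080p figure below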
At 12 feet, 1080p is good up to a 69" screen. So, I think the OP shouldn't need more than 1080p for a 55" set.