At what point will increased resolution surpass what the human eye can actually resolve? At what resolution does going any further become pointless? Anyone know?
Originally posted by Ernie Smith: What does this have to do with HDTV Programming?
Originally posted by ChadD: I believe HDTV was planned with this maximum resolution in mind. However, we will need cameras capable of capturing the full 1080p resolution and devices capable of displaying it before we can appreciate this. The generally accepted maximum resolution of human eyesight ( http://www.spie.org/web/oer/october/oct97/eye.html ) is 2 arc minutes per resolved line pair, which I believe works out to 1/60th of a degree (1 arc minute) per single pixel. So if you are sitting at a distance such that the screen spans 30 degrees of your visual field (roughly twice the width of the screen away), 1800 horizontal pixels equate to the maximum resolution of the human eye. Hence the recommended viewing distance of twice the screen width for HDTV.
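A quick sketch of ChadD's arithmetic (Python; the function and parameter names are mine, not from the thread). With acuity of 2 arc minutes per line pair, i.e. 1 arc minute per pixel, the resolvable pixel count is just the horizontal viewing angle expressed in arc minutes:

```python
import math

ARCMIN_PER_PIXEL = 1.0  # 2 arc min per resolved line pair -> 1 arc min per pixel

def resolvable_pixels(screen_width, viewing_distance):
    """Horizontal pixels resolvable at a given distance (same units for both)."""
    angle_deg = math.degrees(2 * math.atan(screen_width / (2 * viewing_distance)))
    return angle_deg * 60 / ARCMIN_PER_PIXEL  # 60 arc minutes per degree

# Sitting twice the screen width away:
print(round(resolvable_pixels(screen_width=1.0, viewing_distance=2.0)))
# -> about 1684 by exact trigonometry; ChadD's 1800-pixel figure uses the
#    round 30-degree approximation (30 deg * 60 arcmin = 1800 pixels).
```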
S.A. Moore - No, that simply means that if you sit twice the width of the screen back from it, the distance between pixels is on the edge of your resolving power.
Originally posted by BarryO: This came up a long time ago (actually, it may have been on another forum). I did some calculations showing that, at the HDTV "design viewing distance" of 3x the picture height, 1920x1080 basically matches the visual acuity of someone with 20/20 vision. In other words, getting more resolution doesn't buy you much, unless you're sitting closer or have exceptional eyesight.
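The same arithmetic checks out for BarryO's claim. A rough sketch (my own calculation, not his original one): at 3x the picture height, how many scan lines can 20/20 vision, at 1 arc minute per line, distinguish?

```python
import math

def resolvable_lines(picture_height, viewing_distance):
    """Vertical lines resolvable with 20/20 vision (1 arc minute per line)."""
    angle_deg = math.degrees(2 * math.atan(picture_height / (2 * viewing_distance)))
    return angle_deg * 60  # 60 arc minutes per degree

print(round(resolvable_lines(picture_height=1.0, viewing_distance=3.0)))
# -> about 1135 lines, close to 1080: at 3x picture height, 1920x1080 sits
#    near the limit of 20/20 acuity, as BarryO says.
```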
Originally posted by John Mason: (A full 1920x1080, of course, currently requires 1.5 billion bits per second instead of broadcast HDTV's ~19.39 million bits per second.)
Why would it be necessary to broadcast uncompressed video to get the highest resolution? This sounds like a myth in progress...
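It isn't necessary, and the back-of-envelope numbers show why the two figures aren't in conflict (my own arithmetic, not from the thread; the 10-bit 4:2:2 studio format is an assumption about where the ~1.5 Gbit/s figure comes from):

```python
# Uncompressed 1080-line video at 10-bit 4:2:2 (20 bits/pixel), 30 frames/s:
active_bits = 1920 * 1080 * 20 * 30
print(active_bits / 1e9)        # ~1.24 Gbit/s of active video
# (an HD-SDI studio link carries this, plus blanking, at 1.485 Gbit/s)

broadcast = 19.39e6             # ATSC broadcast payload, bits/s
print(active_bits / broadcast)  # ~64:1 compression ratio
```

So the ~1.5 Gbit/s figure describes the uncompressed source feed, not what has to be broadcast: MPEG-2 compression delivers the same 1920x1080 picture within the 19.39 Mbit/s channel, which is exactly the point of compressing it.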