
Premium Member · 11,839 Posts · Discussion Starter #1
At what point will increased resolution surpass what the human eye can actually decipher? At what resolution does going any further become pointless? Anyone know?
 

Registered · 1,888 Posts
For HT, I would say that when we reach film resolution (estimated at around 4000 lines) we will have reached the upper limit of humanly perceivable resolution, even on 100"-and-up FP systems. Of course, if we are talking about 36" and less, I think HD (1080i) could already be there!
 

Registered · 545 Posts
Using the figures contained here:

http://white.stanford.edu/html/numbers/node1.html


I did extensive calculations and came to the inescapable conclusion that I need to get a life.


Anyways, if you make the unwarranted assumption that eyes are square, that's upwards of 30,000 by 30,000. The eye does not really achieve that, nor is the density the same all over, so for the area you are actually looking directly at, it's probably a lot higher. The eye does not waste a lot of receptors on your peripheral vision.


So I suspect the short answer is, there is no answer, and the rule-of-thumb answer is, it's probably well over 10,000 for all practical purposes.
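If you want to play with the numbers yourself, here is a rough Python sketch of that back-of-the-envelope logic. The ~0.5 arc-minute foveal acuity and ~120-degree field of view are my assumptions, in the spirit of the figures on that page, not measured values.

Code:

# Back-of-the-envelope: pretend the whole visual field had foveal
# acuity everywhere (it doesn't -- which is why this overestimates).
foveal_acuity_arcmin = 0.5   # assumed: ~0.5 arc-min resolvable at the fovea
field_of_view_deg = 120.0    # assumed: useful binocular field of view

pixels_across = field_of_view_deg * 60.0 / foveal_acuity_arcmin
print(f"hypothetical 'eye pixels' across the field: {pixels_across:,.0f}")
# ~14,400 across, i.e. on the order of 200 megapixels if you square it --
# so "well over 10,000 for all practical purposes" is a safe rule of thumb.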
 

Registered · 6,092 Posts
Frank,


4000 is a great estimate for 35mm stills, but once you have the film rolling, the resolution is less. I would guess about half that for a good film presentation. (Others would argue.)


I did do an interesting experiment at our IMAX in Houston last year. The IMAX has a test film loop of about 100 frames that puts out a static resolution/focus test pattern (similar to the TVL test pattern in AVIA). We were comparing the image of the IMAX with that of a Proxima ProAV 9410 SXGA projector. The Proxima was projecting an image about half as wide as the IMAX screen. The test pattern was roughly twice as sharp as the 9410's. A little quick math: 1280 pixels X twice as sharp X twice the width = a >5000-pixel-equivalent image. It was a sort of apples-to-oranges comparison, but a little more than 5000 would be a good guess. (BTW, the 9410 was going to be used for commercials for the museum in between showings, but the funding fell through.)
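That quick math as a one-liner, assuming the sharpness advantage holds across the full width:

Code:

# 1280 pixels, twice as sharp, across twice the screen width
print(1280 * 2 * 2)  # 5120 -- hence a >5000-pixel-equivalent image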


Kipp,


The maximum resolution you need comes down to how many screen widths away you are. The minimum recommended screen size is a 30-degree width, which puts you almost 1.9 screen widths away. At that distance, I think it would be difficult for someone with 20/20 vision to see pixels on a 1280-pixel image. Likewise, at 1.3 screen widths away, I think it would be nearly impossible to see pixels on a 1920-pixel image.
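If you want to check that geometry yourself, here is a quick sketch. The 1 arc-minute-per-pixel figure for 20/20 vision is an assumption; use 1 arc-minute per line pair instead and the numbers halve.

Code:

import math

def resolvable_pixels(screen_widths_away, arcmin_per_pixel=1.0):
    # Angle subtended by the screen width, in arc-minutes, divided by
    # the finest detail the eye is assumed to resolve.
    half_angle = math.atan(1.0 / (2.0 * screen_widths_away))
    total_arcmin = math.degrees(2.0 * half_angle) * 60.0
    return total_arcmin / arcmin_per_pixel

for widths in (1.9, 1.3):
    print(f"{widths} screen widths away: ~{resolvable_pixels(widths):.0f} pixels")
# About 1770 at 1.9 widths and about 2520 at 1.3 widths; with 1 arc-min
# per line pair instead, halve these (about 885 and 1260) -- close enough
# to 1280 and 1920 that it's a borderline call either way.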


There are some more scientific articles on the subject if you do a search.


-Mr. Wigggles


P.S. It's interesting that I never thought to look at the IMAX test pattern with a magnifying glass when it wasn't moving. That might give a good idea of the resolution difference.
 

Registered · 396 Posts
I believe HDTV was planned with this maximum resolution in mind. However, we will need cameras capable of capturing the full 1080p resolution, and devices capable of displaying it, before we can appreciate this.


The generally accepted maximum resolution for human eyesight ( http://www.spie.org/web/oer/october/oct97/eye.html ) is 2 arc-minutes per resolved line pair, which I believe works out to 1/60th of a degree (1 arc-minute) for a single pixel.


So if you are sitting at a distance from your screen such that the screen takes up 30 degrees of your viewing field (i.e., you are roughly twice the width of your screen away), 1800 horizontal pixels should equate to the maximum resolution of the human eye. Hence the recommended twice-screen-width viewing distance for HDTV.
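Spelling that arithmetic out (a sketch; the 30-degree field and 1 arc-minute per pixel are the assumptions above, ignoring any small-angle correction):

Code:

field_deg = 30.0          # screen fills 30 degrees of the viewing field
arcmin_per_pixel = 1.0    # 2 arc-min per line pair = 1 arc-min per pixel
print(field_deg * 60.0 / arcmin_per_pixel)  # 1800.0 pixels across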
 

AVS Forum Special Member · 11,139 Posts
Eighteen hundred isn't a bad number for HDTV programming, IMO. It's on the optimistic side, but it's roughly what the FCC's experts committee anticipated (and measured) here, and in the chart here. That's avoiding getting into minutiae such as whether it involves moving or stationary parts of a 1080i image, factoring in color resolution, whether a 1440-limited Sony HDCAM was involved, whether it's a live 1080i image vs. a D-5 tape-format image, etc., and whether you have a display that can resolve such detail. -- John
 

Premium Member · 11,839 Posts · Discussion Starter #8
Quote:
Originally posted by Ernie Smith
What does this have to do with HDTV Programming?
The leveling off of the resolution of the HDTV programming we watch. I wanted to know where the resolution of HDTV programming can go and what we will actually be able to see.
 

Registered · 545 Posts
Quote:
Originally posted by ChadD
I believe HDTV was planned with this maximum resolution in mind. However, we will need cameras capable of capturing the full 1080p resolution, and devices capable of displaying it, before we can appreciate this.


The generally accepted maximum resolution for human eyesight ( http://www.spie.org/web/oer/october/oct97/eye.html ) is 2 arc-minutes per resolved line pair, which I believe works out to 1/60th of a degree (1 arc-minute) for a single pixel.


So if you are sitting at a distance from your screen such that the screen takes up 30 degrees of your viewing field (i.e., you are roughly twice the width of your screen away), 1800 horizontal pixels should equate to the maximum resolution of the human eye. Hence the recommended twice-screen-width viewing distance for HDTV.
No, that simply means that if you sit twice the width of the screen back from it, the distance between pixels is on the edge of your resolving power. It's not the maximum resolution of the eye. If your field of view were filled by the screen at that distance, I would say it is so, but it's not.
 

Registered · 621 Posts
I really don't understand this thread at all. You must all be engineers.


I do use Photoshop, so I can understand 2400 dpi as referring to 2400 dots per inch, an actual measurement. 2400 is at the upper edge of what is required in print resolution, and details can be seen clearly at far lower resolutions.


When we're referring to the resolution of film, however, are we talking about the actual measured resolution on a piece of 35mm film, or how that resolution is perceived by someone sitting far back from the screen, where the image is blown up a thousand times from the size of the film itself? It's very confusing.


The picture which comes over my cable box of Jay Leno in high-definition (1080i) is far more detailed and clear than what I myself would see, standing on the stage at about the same distance away from Jay, to approximate my viewing distance in my home theater. The major difference seems to be that Jay on my TV is cut off to the dimensions of the display, but in life, I would have peripheral vision which would encompass more.


I also find film over my cable to be softer and of lower overall resolution than Jay Leno, yet with more depth, substance, and soul. Still, I find movies in HDTV to provide much more detail than I could perceive in life. I can see the makeup on the actors' faces, covering moles and pores. This would not be perceivable to me in life unless I were within 6 inches of the actors' faces.
 

Registered · 396 Posts
Quote:
S.A. Moore -

No, that simply means that if you sit twice the width of the screen back from it, the distance between pixels is on the edge of your resolving power.
I don't understand your point. That's the definition of "resolution" when applied to optics: the ability to distinguish separate images of close objects.


Resolution has nothing to do with the size of the field of view, only with the ability to distinguish two separate points. It is my understanding that HDTV was designed with a 30-degree field of view in mind.
 

Registered · 1,273 Posts
This came up a long time ago (actually, it may have been on another forum). I did some calculations showing that, at the HDTV "design viewing distance" of 3X the picture height, 1920x1080 basically matches the visual acuity of someone with 20/20 vision. In other words, getting more resolution doesn't buy you much, unless you're sitting closer or have exceptional eyesight.
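The gist of that calculation, for anyone who wants to redo it; assuming 20/20 vision means resolving 1 arc-minute per line, which is the whole ballgame:

Code:

import math

# At 3x picture height, how many lines can a 1 arc-min eye resolve
# across the height of the screen?
distance_in_heights = 3.0
half_angle = math.atan(0.5 / distance_in_heights)
lines = math.degrees(2.0 * half_angle) * 60.0
print(f"~{lines:.0f} resolvable lines")  # ~1135 -- right about 1080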
 

Registered · 545 Posts
Quote:
Originally posted by BarryO
This came up a long time ago (actually, it may have been on another forum). I did some calculations showing that, at the HDTV "design viewing distance" of 3X the picture height, 1920x1080 basically matches the visual acuity of someone with 20/20 vision. In other words, getting more resolution doesn't buy you much, unless you're sitting closer or have exceptional eyesight.
What he said; he said it pretty well.


I was simply pointing out that the originator of this thread was asking at what resolution the eye could perceive no more, and you guys were saying that HDTV is the most people can perceive AT THE PROPER DISTANCE.


These are not one and the same.
 

AVS Forum Special Member · 11,139 Posts
Okay, I'll point out, too, that my post about 1800 pixels being close to HDTV reality was related only to the posts just preceding it about HDTV programming.


Just to expand things a bit: I recall reading just a while back, perhaps here, that some HDTV systems with 4 times the current ATSC system's resolution are being investigated. As pointed out, at typical viewing distances in homes you couldn't appreciate that resolution. But you could, like a radiologist zooming in on a high-resolution X-ray graphics display, zoom in on one part of the screen that interested you--say, some hockey action being shown in wide view. Such HDTV resolutions could be expanded almost indefinitely, IMO, depending on image-sensor pitch and available transmission bandwidth. My preference would be to try for as much of a true 1920X1080 (or higher) as the sharpest-eyed viewer could resolve on the best displays available. (A full, uncompressed 1920X1080, of course, currently requires about 1.5 billion bits per second instead of broadcast HDTV's ~19.39 million bits per second.) Then, from that starting point, HDTV should add the third dimension, depth, with stereoscopic 3-D HDTV. I suspect you'd have to be at the end of an optical-fiber transmission link.
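Roughly where that ~1.5-billion figure comes from -- a sketch using serial-HD-style numbers; the blanking totals, 4:2:2 sampling, and 10 bits per sample are my assumptions:

Code:

# Total raster including blanking, not just the 1920x1080 active picture.
samples_per_line = 2200    # assumed: 1920 active + horizontal blanking
lines_per_frame = 1125     # assumed: 1080 active + vertical blanking
frames_per_sec = 30        # 1080i: 60 fields/sec = 30 frames/sec
samples_per_pixel = 2      # 4:2:2: one luma + one chroma sample per pixel
bits_per_sample = 10

bps = (samples_per_line * lines_per_frame * frames_per_sec
       * samples_per_pixel * bits_per_sample)
print(f"{bps / 1e9:.3f} Gb/sec")  # 1.485 Gb/sec -- the ~1.5 billion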


Upcoming hardware for giant E-cinema screens is getting up there. If you visit JVC's D-ILA site, Hayakawa's PDF-file paper mentions JVC's 3395X2160 Q-HDTV projector. -- John
 

Registered · 545 Posts
Quote:
Originally posted by John Mason
(A full, uncompressed 1920X1080, of course, currently requires about 1.5 billion bits per second instead of broadcast HDTV's ~19.39 million bits per second.)
Why would it be necessary to broadcast uncompressed video to get the highest resolution? This sounds like a myth in progress...
 

Registered · 1,054 Posts
Having looked at ~80 HDTV full resolution captures, I think I can safely say that we have not reached the resolution plateau.


When we say resolution, let's clarify that that is resolvable pixels, or distinct color/luminance variations.


If I take a 320x240 image and blow it up to 1920x1080, the *resolution*, in the strictest sense, is 320x240 distinct points of light (thanks, George Bush!).


Most of the stuff in HD right now is more like 1400x700 (or less). I think as cameras get better, and as producers buy more expensive cameras, we'll approach 1920x1080 resolution.


It's the same way that NTSC started out at probably 300x200 distinct "pixels," or spots of brightness, and slowly improved. Just look at how bad TV looked in the '80s. Now, NTSC has been refined for 50 years and some of it looks pretty darn good.


So far, the examples that jump out at me of programs that really are using the full resolution are some of the WRAL snapshots and the Showtime HD transfers. I can safely say that those contain a full 1920x1080 worth of distinct picture elements (pixels), or pretty close.
 

AVS Forum Special Member · 11,139 Posts
Quote:


Why would it be necessary to broadcast uncompressed video to get the highest resolution? This sounds like a myth in progress...
True enough. Just wanted to illustrate the contrast. But of course then you have to ask: given the sharpest eyes and the best display, at what point in any compression process do you start missing something? I'm not particularly happy with 19.39 (actually ~17) Mbps. -- John
 

Registered · 1,054 Posts
17 Mbps is indeed a straitjacket, but if WRAL can do it...
 

Registered · 2,412 Posts
Some of the ~19.4 Mb/sec in ATSC goes to non-picture-related data like audio, ASIP, PSIP, and other housekeeping signals, even if you're doing only one full-blown 1080i HDTV program, leaving ~17 Mb/sec for the image.


One way of putting this part of the discussion into perspective is to realize that ATSC 1080/60i "can provide up to 1920 X 1080 pixels," depending on circumstances. The HD signal is compressed in ATSC, and all 2 million pixels may not be present at once, depending on scene content, rapid motion, etc. The system tries to give you as much perceived resolution as practical.


When and if uncompressed HDTV pictures become available in the home via one or more of the copy-protected formats being put forward, don't be surprised when the picture looks better than ATSC. This is the other side of the double-edged sword that is DVI, 5C, HDCP, etc.: there's the threat of image-constrained analog HD outputs and copy restrictions on the one hand, but the possibility of being able to view uncompressed HD images via the approved digital outputs on the other.
 

Registered · 1,054 Posts
You won't see any uncompressed HDTV format in consumers' hands. EVER.


What you might see is HDTV with compression that isn't quite so nasty as ATSC requires. It's probably 70:1 MPEG-2 right now. Maybe at 40:1 we'll see a big difference.
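A rough sanity check on that 70:1 figure; how you count the uncompressed baseline (8 vs. 10 bits, blanking or not) moves it around a lot, so treat these as assumptions:

Code:

active_pixels = 1920 * 1080
frames_per_sec = 30
samples_per_pixel = 2        # 4:2:2 chroma subsampling
video_payload_mbps = 17.0    # roughly what ATSC leaves for the picture

for bits in (8, 10):
    raw_mbps = active_pixels * frames_per_sec * samples_per_pixel * bits / 1e6
    print(f"{bits}-bit baseline: ~{raw_mbps / video_payload_mbps:.0f}:1")
# ~59:1 at 8 bits, ~73:1 at 10 bits -- so "probably 70:1" is in the ballpark.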
 