Status
Not open for further replies.
1 - 20 of 35 Posts

·
Registered
Joined
·
502 Posts
Discussion Starter · #1 ·
With newer and greater projectors coming out all the time, I can't help but wonder when a projector will give us a picture as good as the real world we see with our eyes. Does anybody know what resolution our eyes are capable of? I know my eyes are way better than 1080p!
 

·
Registered
Joined
·
467 Posts
Quote:
Originally posted by ChaCha
I know my eyes are way better than 1080p!
Not really. :)


The visual acuity of the human eye for a person with 20/20 vision is 1/60th of a degree of arc (one arcminute); beyond that, no more detail can be discerned.


In practical terms, say you had a screen 80 inches diagonally and were projecting HD in full 1080p. If you sat any farther back than about 10.5 ft, details would blend together and you wouldn't be able to see the difference between 1080p and 10,000p. This assumes you have 20/20 vision.
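A quick sanity check of that number (my own sketch, not from the post): with an 80-inch 16:9 diagonal and one-arcminute acuity, the distance at which a single 1080p pixel stops being resolvable works out to roughly 10.4 feet.

```python
import math

# Sketch: distance beyond which a 20/20 eye (1 arcminute acuity)
# can no longer resolve individual pixels on an 80" diagonal
# 16:9 screen showing 1080p. The screen size is the example above;
# the geometry is just "pixel height subtends one arcminute".
diag_in = 80.0
aspect_w, aspect_h = 16, 9
height_in = diag_in * aspect_h / math.hypot(aspect_w, aspect_h)
pixel_in = height_in / 1080                    # one pixel's height
acuity_rad = math.radians(1 / 60)              # 1 arcminute
distance_in = pixel_in / math.tan(acuity_rad)  # pixel subtends 1'
print(round(distance_in / 12, 1))              # distance in feet, ~10.4
```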


That sounds pretty damn good to me. I don't think the broadcast standards are going to change for a very long time, and I don't think anyone is going to complain that we need more resolution once true 1080p arrives on HD-DVD and all TV is 1080i and 720p (hopefully). I'm sure we'll complain about something, but I don't see resolution being a problem.
 

·
Registered
Joined
·
19,586 Posts
The primary reason to go beyond 1080p would be to be able to support things like anti-aliasing. The one thing that still brings out the pixel based nature of even HD content is aliasing along hard edges. We can see these very clearly. So though we wouldn't necessarily need much more than 1080p to provide some really good home video, we could use more resolution for playing tricks that make the image look smoother.


Actually, depending on how it's done, you wouldn't have to deliver more than 1080p. There could be vector information in the 1080p data stream providing hints as to how each frame could benefit from anti-aliasing, and the display could apply it on the fly, given sufficient processing power. Given the great inertia involved in changing broadcast, delivery, and display formats, that might be an easier nut to crack, since if your display didn't use the hints, that's fine. But if you had a display with 2x or more the available source resolution, you could make use of them.
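As a toy illustration of the "more display resolution buys smoother edges" idea (this is plain supersampling, not the hint scheme described above): render at 2x and box-filter down, and a hard diagonal edge picks up intermediate gray values instead of stair-stepping.

```python
# Toy anti-aliasing sketch: a hard diagonal edge rendered directly
# at low resolution contains only 0.0 and 1.0, while rendering at
# 2x and box-filtering down yields intermediate values (0.25 here)
# along the diagonal, which reads as a smoother edge.
def render_edge(n):
    # 1.0 to the right of the diagonal, 0.0 elsewhere, on an n x n grid
    return [[1.0 if x > y else 0.0 for x in range(n)] for y in range(n)]

def downsample2x(img):
    n = len(img) // 2
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4
             for x in range(n)] for y in range(n)]

hard = render_edge(4)                   # direct low-res render: jaggies
smooth = downsample2x(render_edge(8))   # 2x supersampled, then filtered
```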
 

·
Registered
Joined
·
122 Posts
Quote:
Originally posted by David777
The visual acuity of the human eye for a person with 20/20 vision is 1/60th of a degree of arc (one arcminute); beyond that, no more detail can be discerned.
That's not entirely true. Vernier acuity is 5-10 seconds of arc for the average person. What you are talking about is minimum separable acuity. Check out this site for an explanation:

http://www.pc.ibm.com/ww/healthycomp...vdt13eyee.html


Albert
 

·
Banned
Joined
·
2,411 Posts
I agree, beyond 1024x768 it's mostly for anti-aliasing. However, anything over 1280x720p seems to be overkill for screen sizes up to 96" wide. Beyond that, more resolution is better, of course, because of how the pixels spread per square inch of screen relative to how far away you sit.

The caveat with too much resolution is that the image gets softer and smoother, and this isn't necessarily desirable to the brain. And filtering or sharpening can actually knock true resolution down a couple of notches, because edges take up several pixels instead of one. It's all very subjective, of course. I cranked my CRT up to its physical 1600x1200 max for HD once and it was way too smooth and soft, even without line overlaps. I cranked it back to 1024x768 and liked it better.

I now have the BenQ 8700, and at 1280x720p it is as good as the CRT ever was, except for the black-level perception issue, which helped the CRT look more detailed at the same resolution. So that's another factor as well. High resolution without detail is meaningless too, which is why some are going to the Vutec SilverStar screens that add more perceived detail.


Bob
 

·
Registered
Joined
·
1,458 Posts
Quote:
The caveat with too much resolution is that the image gets softer and smoother, and this isn't necessarily desirable to the brain.
What? This makes no sense. A softer, smoother image should be more life-like. It's like looking at a circle on a 640x480 screen versus a 1024x768 screen; the circle looks smoother, less digital, and more "real" on the latter.
 

·
Registered
Joined
·
10,200 Posts
Quote:
The caveat with too much resolution is that the image gets softer and smoother, and this isn't necessarily desirable to the brain.
Well, if that is the case we should blame our experience with projectors convincing us that the high-frequency distortion that is screen door, pixel structure, and/or scan lines is somehow desirable. Or, we can blame our current scaler hardware, which is probably not doing the filtering quite properly, for giving filtering a bad name. But the truth is that a 1280x720p video image, properly upsampled and filtered, should look a lot smoother than it appears to be on a 1280x720 display---but it will also reveal a lot more of its detail as well.


Count me in the "more resolution is always better---if the scaling hardware is doing its job properly" camp.
 

·
Registered
Joined
·
9,769 Posts
1024 x 768 :rolleyes: you're joking right?


Give me 7680 x 4320 display and a 1080p source :)
 

·
Registered
Joined
·
10,200 Posts
You know, there was a really great thread, by Bjoern Roy I *think* (but don't hold me to that), with an awesome pictorial illustration of the benefits of upscaling video. It was pretty darn evident from the illustrations that proper upsampling did not sacrifice detail; indeed, it made the detail easier to see! As an aside, the Gaussian-like shape of a CRT beam spot came reasonably close to an ideal upsampler, suggesting that CRTs may not need as much "help" from upscaling as digital projectors do.


If anyone knows where that thread is it would be an *ideal* link to post here.
 

·
Registered
Joined
·
3,639 Posts
Why only a 1080p source? I'd want a 4320p source. Sure, it would take a whole hard drive for one movie, but it'll be cheap by then.
 

·
Registered
Joined
·
10,200 Posts
threed123, what you're saying is all well and good. But if we need edge enhancement (which is what you're describing) in order to enjoy our home theater images, we should be shooting for perfect reproduction first, then we can add high-frequency emphasis properly. There's no excuse to forcibly limit ourselves to pixel structure and screen door, which is a flawed way to get that "detail" effect.


But hey, it sounds like we want one and the same thing! Bring it on!
 

·
Registered
Joined
·
19,586 Posts
Quote:
Why only a 1080p source? I'd want a 4320p source. Sure, it would take a whole hard drive for one movie, but it'll be cheap by then.
For the reasons already discussed. In order to actually see that resolution, you would need to either sit a foot away from a normal home FP screen or, from a normal viewing distance, have a screen far larger than any normal human can afford. It won't be so cheap by then that it will be done just for a handful of people.


As discussed, 1080p is actually a lot of resolution, and probably not far from what most people are getting in reality at their crappy local film theater on a huge screen. On your home screen, it's a lot of resolution. We can use more display resolution in order to do anti-aliasing, and we could use more color depth, but 1080p resolution isn't at all bad. There's no use shipping resolution that no one can see.
 

·
Registered
Joined
·
3,822 Posts
It depends on the distance from the screen. For the usual 1/60-degree (one arcminute) resolution limit, you end up with:


x = 1/tan(n/60 degrees)


x is distance in screen widths

n is the horizontal pixel count


Some common values for x:


1.60 (n=1920, large LCD flat panel)

2.34 (n=1386, Sony LCD-RPTV)

2.56 (n=1280, HD DLP)

3.95 (n=853, Panny ED Plasma)

4.70 (n=720, DVD horizontal resolution)


Any closer and the "blur" sets in.


The vernier limit? Probably not in our lifetimes.
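The table above can be reproduced directly (my sketch, not from the post). Note the formula's argument n/60 is in degrees, i.e. the whole n-pixel screen width subtends n arcminutes, one arcminute per pixel:

```python
import math

# x = 1/tan(n/60 degrees): viewing distance, in screen widths, at
# which an n-pixel-wide screen subtends n arcminutes total, so each
# pixel gets roughly one arcminute of the 1/60-degree acuity limit.
def screen_widths(n):
    return 1 / math.tan(math.radians(n / 60))

for n in (1920, 1386, 1280, 853, 720):
    print(n, round(screen_widths(n), 2))
# Reproduces the posted values: 1.6, 2.34, 2.56, 3.95, 4.7
```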
 

·
Banned
Joined
·
2,411 Posts
Just for laughs,


If the human eye can see detail up to about 200 dots/pixels per inch on printed matter, and an inch held 1 foot from the eye covers about the same angle as 1 foot of screen at 14 feet, then for an 8-foot screen at 14 feet the resolution should be in the neighborhood of 1600 pixels across; at 16:9, that's 1600x900. Isn't that interesting?
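A back-of-envelope check of the arithmetic above, using the poster's own assumptions (~200 dpi perceivable on print at 1 foot, and an inch at 1 foot covering roughly the same angle as a foot of screen at 14 feet):

```python
# Assumptions from the post above, not measured values: the eye
# resolves ~200 dpi on print held 1 foot away, and 1 inch at 1 foot
# covers roughly the same angle as 1 foot of screen at 14 feet
# (a rough scaling; strict similar triangles would give 12 feet).
print_dpi = 200                  # perceivable dots per inch at 1 ft
screen_width_ft = 8              # 8-foot-wide screen viewed at 14 ft
pixels_across = print_dpi * screen_width_ft  # each screen-foot ~ one print-inch
print(pixels_across)             # 1600 across; at 16:9 that's 1600x900
```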
 

·
Registered
Joined
·
1,278 Posts
Don't forget the 12-bit video! That would significantly help the sense of three-dimensionality and give a LOT more precision without the dithering. I believe the eye can handle a total of about 100 million pixels, but we are probably only able to focus on, let's say, 25 million. A 7680x4320 display would not only perfectly scale all the HDTV formats, but would give us about 33 million pixels. Unless they develop a "holodeck," I don't see how it gets any better than that.
 

·
Registered
Joined
·
3,639 Posts
Quote:
Originally posted by Dean Roddey
For the reasons already discussed. In order to actually see that resolution, you would need to either sit a foot away from a normal home FP screen or, from a normal viewing distance, have a screen far larger than any normal human can afford. It won't be so cheap by then that it will be done just for a handful of people.


As discussed, 1080p is actually a lot of resolution, and probably not far from what most people are getting in reality at their crappy local film theater on a huge screen. On your home screen, it's a lot of resolution. We can use more display resolution in order to do anti-aliasing, and we could use more color depth, but 1080p resolution isn't at all bad. There's no use shipping resolution that no one can see.
That makes no sense. Higher resolution doesn't mean a giant screen, just more lines of information within the picture. Why would I have to watch from 1 foot away or have a giant screen?


And 1080p is nice, but I want better. I want to be able to walk up to within a few inches of the screen and not see pixels or anything else. Why do I want that much detail in the picture? Why not?
 

·
Registered
Joined
·
19,586 Posts
Quote:
That makes no sense. Higher resolution doesn't mean a giant screen, just more lines of information within the picture. Why would I have to watch from 1 foot away or have a giant screen?
That's what's been discussed throughout this entire thread. You can't see the resolution. The human eye has limited spatial resolution, and once you get beyond that limit, the extra resolution is wasted. The smaller the screen relative to the amount of resolution, the less of that resolution you can see.
 

·
Registered
Joined
·
10,200 Posts
Well, according to dr_mark2001's calculations we have a ways to go before we run out of useful resolution...
 

·
Registered
Joined
·
19,586 Posts
But you can't sit that close for other reasons. Start getting down to 1.0 screen widths and your viewers will look like they're at a tennis match, and you couldn't have any sort of useful sound system unless you were going to use near-field speakers. You also have to take into account the lower-resolution material you need to watch, and so forth. The real issue at hand is what you can see from a useful viewing distance.


1.6 screen widths is about right for HD viewing, and you can probably see HD resolution to about its fullest extent from there. If you doubled the resolution, you'd be getting so close to the screen that you'd be seeing the shadow of your head on it by the time you got close enough to take advantage of all those pixels.
 