Originally Posted by mastermaybe
Re the decision on 1080 content: Yeah, I guess I kinda sorta didn't buy the earlier explanation (post 287).
If it's reasonable to conclude that we'll eventually be consuming all (or nearly all) 1080 content - which is easily conceivable to me in the next 5 years, especially considering how much 1080p content is already available - why would any of what you proposed matter in the slightest?
And even if there's some truth to your assertion that the number is some kind of hybrid between differing "standards" (for lack of a better term), do you not think it's unbelievably coincidental that the resolutions you listed are fantastically similar to one another (1080, 950-1150)? Where do you suppose those figures came from?
Sure, 1080 may be some kind of "compromised" figure, but where did the points that comprise that compromise emanate from? Something tells me the answer is very, very straightforward.
You are making a circular argument/guess: it is based on visual acuity, so 1080 is the human limit, and the human limit is 1080. Until someone can show that 1080 was chosen because they considered it the visual limit, it is pure speculation. Like I said, I have never read such literature. Have you? Young irkuck says he has, but too bad it's a difficult dig, because only that particular SMPTE paper mentions it in the internet library.
Put another way: 480 DVD is double 240 VCD. 1150-950 HD is about double 480/576. And dubious 8K is, well, double 4K. That sounds pretty familiar - like a digital binary progression - but IMHO I don't see a link to visual acuity in the trend.
BTW, 10' for 80" may possibly be perceptible for 4K based on the Sony document, but you said 1080 cannot be resolved at 10.5' for 80". That doesn't click. Like all research, it's probably somewhere in between for different people and environments.
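As a quick sanity check on the 10.5'/80" claim, here is a small Python sketch. It assumes the common 1-arcminute (20/20) acuity threshold and a 16:9 panel; both assumptions are mine, not from either post.

```python
import math

def pixel_arcmin(diag_in, rows, dist_in, aspect=(16, 9)):
    """Angle subtended by one pixel row, in arcminutes."""
    w, h = aspect
    height_in = diag_in * h / math.hypot(w, h)  # panel height from diagonal
    pixel_in = height_in / rows                 # height of one pixel row
    return math.degrees(math.atan2(pixel_in, dist_in)) * 60

# 80" 16:9 display, 1080 rows, viewed from 10.5 ft (126")
angle = pixel_arcmin(80, 1080, 10.5 * 12)
print(f"{angle:.2f} arcmin per pixel")  # ~0.99: right at the 1-arcmin limit
```

Under those assumptions a 1080 pixel on an 80" panel subtends about 0.99 arcmin at 10.5', i.e. that distance sits exactly at the textbook acuity threshold rather than clearly beyond it, which is consistent with "somewhere in between for different people".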
Originally Posted by specuvestor
Are you asking about printed-media dots per inch (dpi) vs. digital-display pixels per inch (ppi)?
Originally Posted by irkuck
480 is a very old story, with very little to do with visual acuity. 720 I told you already: some infighting and horse trading. 1080 was based on subjective testing and background acuity research.
What I am trying to explain to you is that the visual system is very tricky, and a lot depends on the viewing scenario. Thus for 3-4H, 1080 is fine; for 1-2H, 4K is OK. But in the glossy-magazine scenario, 1500 ppi is necessary for things to look fine.
Yes, under an optimistic scenario the 4K difference should be visible up to 3.2H, and that is the max. So realistically 2.5H is safe, and 2.5H is an extreme viewing scenario for TV. If you say a 4MP DSLR will look better on a 4K, then the question is what the viewing scenario is. I am saying 4K computer monitors are absolutely needed for digital photography work.
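The picture-height figures above can be derived directly from the acuity threshold. A minimal sketch, again assuming the 1-arcminute criterion and the small-angle approximation (my assumptions, not stated in the post):

```python
import math

ARCMIN = math.radians(1 / 60)  # 1 arcminute in radians

def acuity_distance_H(rows):
    """Viewing distance, in picture heights H, at which one pixel row
    subtends exactly 1 arcminute (small-angle approximation)."""
    # pixel angle = (screen_height / rows) / (n * screen_height) = 1 / (rows * n)
    return 1 / (rows * ARCMIN)

for rows in (1080, 2160):
    print(f"{rows} rows: ~{acuity_distance_H(rows):.1f} H")
# 1080 rows: ~3.2 H
# 2160 rows: ~1.6 H
```

This lands on ~3.2H for 1080 and ~1.6H for 2160, matching the "visible up to 3.2H" figure and the 3-4H / 1-2H rules of thumb quoted above.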
It was a loaded question in disguise:
1) Your nomenclature in the first place is incorrect. ppi and dpi are different, though people use them as interchangeably as Hz and fps; the two have significantly different connotations. But since this is AVS and not AMZN, we have to be accurate. I tend to edit most of my longer posts several times to be as factually accurate as possible.
2) Technically, each pixel of a 326 ppi retina display consists of 3 RGB primaries, so the resolution is 326×3 if you want to make a "somewhat apples-to-apples" comparison with dpi.
3) Displays are additive RGB vs. print, which is subtractive CMY. If you are old enough you will remember the hype around WYSIWYG, which IMHO for this reason can never be perfected in terms of color.
4) Displays have greyscale, as you can adjust the luma. Print cannot adjust luma, so it has to use more colors to produce the grey scale, i.e. its palette is technically always insufficient. Hence it has to resort to dithering/half-tones, which is why it needs more resolution. In fact this technique makes use of the limits of visual acuity; it should not be fed into another circular argument that the retina can see beyond 326 ppi.
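Point 4 is easy to demonstrate. Below is a toy ordered (Bayer) dither, one of the classic halftoning techniques: a binary device with no luma control reproduces intermediate tones by spending spatial resolution, which is exactly why print needs more dpi than a display needs ppi. The 4×4 matrix and sizes are illustrative choices, not anything from the thread.

```python
# 4x4 Bayer threshold matrix (values 0..15)
BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither(gray):
    """gray: 2-D list of values in 0..255 -> 2-D list of 0/1 dots."""
    return [
        [1 if gray[y][x] / 256 > (BAYER4[y % 4][x % 4] + 0.5) / 16 else 0
         for x in range(len(gray[0]))]
        for y in range(len(gray))
    ]

# A flat 50% grey comes out as a checkerboard of dots; viewed from far
# enough away (below the eye's acuity limit), it averages back to grey.
flat = [[128] * 8 for _ in range(8)]
for row in dither(flat):
    print("".join("#" if d else "." for d in row))
```

Running it on a flat mid-grey prints an alternating dot pattern: half the dots are on, so the average reflectance is 50%, but only because the pattern's detail sits below the viewer's acuity limit.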
I don't think it is even an issue with the visual system. It is an issue with the print hardware, i.e. it is a technical limitation.
As to the 4MP DSLR, I don't think you got my point, as usual.