That's not a well-formed question as stated. The eye's resolution is defined in terms of the smallest subtended arc it can distinguish. Any angle spreads out wider and wider as it travels over distance. So whatever the eye's angular resolution, what that means in terms of the pixel density required for pixels not to be visible depends on how far the pixels are from your eye and the size of those pixels (pixels per inch). If the screen is a foot away, then even a very dense pixel structure on a small display can still be very obviously visible.
TVs and projectors are typically a lot further from you than a handheld device. So generally the handheld has to have a higher (a lotta alliteration) resolution despite being much smaller, which makes it doubly more pixel dense. You may hold it a few feet from your eyes, but a plasma 14' away can have a far lower resolution without that being noticeable, because by then the smallest angle your eye can resolve has spread out over a much larger distance.
So, anyway, the question is always: what actual distance does the eye's angular resolution cover at the desired viewing distance, given the size of the display? All those factors interact, so the answer about what's optimal will be different in almost everyone's situation.
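To make that concrete, here's a rough sketch of the calculation. It assumes an angular resolution of about one arcminute (1/60 of a degree), which is a common rule-of-thumb figure for 20/20 vision, not a hard physiological constant:

```python
import math

# Assumption: eye resolves roughly one arcminute (1/60 degree).
ARCMIN_RAD = math.radians(1 / 60)

def required_ppi(viewing_distance_inches: float,
                 angular_resolution_rad: float = ARCMIN_RAD) -> float:
    """Pixel density (PPI) at which one pixel subtends no more than
    the given angular resolution, so adjacent pixels blend together."""
    # Physical size one arcminute covers at this distance:
    pixel_size = viewing_distance_inches * math.tan(angular_resolution_rad)
    return 1 / pixel_size

# Handheld at about a foot vs. a TV at 14 feet (168 inches):
print(round(required_ppi(12)))    # roughly 286 PPI needed up close
print(round(required_ppi(168)))   # only about 20 PPI needed across the room
```

Same eye, same angle, wildly different pixel-density requirements; that's the whole story of why a phone needs far more PPI than a big-screen TV.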