While this is not pulled from an article, this poster had an excellent point about the limitations of film.
There may be a lot of argument about what a typical film is. However, some good films measure out to roughly 100 line pairs per mm. That is 200 pixels per mm, or about 5000 pixels per inch. On a 35mm frame, that is 7200 x 4800 pixels. Now, there are MANY factors other than the film itself that limit resolution, so you are lucky to get half that value when EVERYTHING is factored in, which brings you to about the values CSMI gets.
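For anyone who wants to check the arithmetic, here is a quick Python sketch. The 36 x 24 mm frame size and the two-pixels-per-line-pair conversion are standard assumptions, and the halving at the end is just the rough real-world loss factor mentioned above, not a measured number:

```python
# Back-of-the-envelope film resolution math; assumes a full 36 x 24 mm
# 35mm still frame and the usual two-pixels-per-line-pair conversion.
LP_PER_MM = 100                 # assumed resolving power of a good film stock
PX_PER_MM = 2 * LP_PER_MM       # a line pair needs at least two pixels
FRAME_W_MM, FRAME_H_MM = 36, 24

ppi = PX_PER_MM * 25.4          # ~5080, i.e. roughly 5000 pixels per inch
w_px = FRAME_W_MM * PX_PER_MM   # 7200
h_px = FRAME_H_MM * PX_PER_MM   # 4800
print(f"{ppi:.0f} ppi, {w_px} x {h_px} px ({w_px * h_px / 1e6:.0f} MP)")

# "Lucky to get half" of the linear resolution once lens, focus, and
# motion losses are factored in:
print(f"effective: {w_px // 2} x {h_px // 2} px "
      f"({(w_px // 2) * (h_px // 2) / 1e6:.1f} MP)")
```

The halved figure of 3600 x 2400 lands right around a 4K scan, which is why the purist "film is infinite" claim doesn't hold up in practice.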
Another thing to take into account in the current reality of filmmaking is that the film itself is scanned into digital form in order to add CGI and edit the film. They no longer cut and splice film in the editing room to make most movies. After the editing, it is transferred back to film. This will inherently create artifacts, even at the smallest scale. So in reality, film is not infinite in the purist sense.
You may argue that scanning is better than raw digital capture. You may be right, and you may be wrong; there is a lot that goes into the transfer, and all things are not equal. There can even be differences between individual film scanners of the same model, let alone between different models.
What it all comes down to is the perceived difference. Most people can see a difference between 480p and 720p once you get above 27". Most can see a difference between 720p and 1080p. How much of a difference can one see between 1080p and 2K? You can argue that the "big screen" is bigger, but many of us have screens where the viewing angle (which is what is important) is actually greater than what most watch in the theater. Most people seem to sit in the back of the theater.
In my home theater, I have a viewing angle of 40 degrees. When I go to the theater, I sit about midway and estimate a viewing angle of about 32 degrees. I imagine many toward the back have a viewing angle in the 20s.
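If you want to estimate your own viewing angle, the geometry is simple. A sketch, with hypothetical numbers (a 100-inch-wide screen at about 137 inches happens to work out to the 40 degrees I get at home):

```python
import math

def viewing_angle_deg(screen_width, distance):
    # Horizontal viewing angle for a flat screen viewed on-axis;
    # width and distance must be in the same units.
    return math.degrees(2 * math.atan(screen_width / (2 * distance)))

print(f"{viewing_angle_deg(100, 137):.0f} degrees")
```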
At 40 degrees, I can obviously notice a difference between 480p and 720p, and also between 720p and 1080p. I may notice a difference between 2K and 4K, but that is questionable. Would any of us really see a difference between 4K and 6K, or between 4K and 8K? And that is at 40 degrees. At 28 degrees, I doubt there would really be any perceived difference.
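To put a number on "questionable": a common rule of thumb (an assumption, not a hard limit) is that 20/20 vision resolves about one arcminute, or roughly 60 pixels per degree. A rough comparison, using DCI widths for 2K/4K and treating the pixels as spread evenly across the viewing angle:

```python
def pixels_per_degree(h_pixels, angle_deg):
    # Crude average: horizontal pixels spread evenly over the viewing
    # angle. Good enough at these angles for a ballpark comparison.
    return h_pixels / angle_deg

ACUITY_PPD = 60  # assumed ~1 arcminute per pixel for 20/20 vision

for angle in (40, 28):
    for name, h in [("720p", 1280), ("1080p", 1920),
                    ("2K", 2048), ("4K", 4096), ("8K", 8192)]:
        ppd = pixels_per_degree(h, angle)
        verdict = "above" if ppd >= ACUITY_PPD else "below"
        print(f"{name} at {angle} deg: {ppd:.0f} px/deg, {verdict} ~{ACUITY_PPD}")
```

By that estimate, at 40 degrees 1080p sits below the ~60 px/degree mark while 4K and 8K are both well past it, and at 28 degrees even 1080p is already at the limit, which lines up with the intuition above.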
And yes, I know there is more than resolution... but resolution is nonetheless extremely important, isn't it?
So why introduce grain, or allow it to exist, to provide, in theory, resolution that is not perceivable by most? That is the essence of my argument. To me, HD means that the image is cleaner and crisper. That is what I am looking for. It doesn't make it look plastic-like... just more crisp. Just like a new car looks all beautiful and shiny while a four-year-old car can look a little dull. Sure, most cars look a little worn, but that doesn't make a new, shiny car look any less real.