Originally Posted by etc50
They are beginning to shoot movies in 4K as 4K cameras become widespread. 4K digital resolution is comparable to 35mm film. HD does not carry as much detail as 35mm film. That has changed with the arrival of 4K.
I think both statements are gross simplifications. The #1 digital camera by far in Hollywood is the Arri Alexa, which (at best) only resolves about 2.8K. In general, most cinematographers are going for the overall look and dynamic range of the image, not for resolution alone, so the choice of camera does not hinge on the K. I think very few of them care.
I agree that HD has never had as much detail as 35mm film, but consider that for decades you had to go through four generations to get to a theatrical print: OCN (original camera negative) -> IP (interpositive) -> IN (internegative) -> release print. When I worked for Kodak, they told me that, best case, their then-best negative emulsion, 5219, had as much as 6K of resolution, depending on how you measured it, how the MTF stacked up, the lenses involved, and so on.
The problem is, every contact generation in the lab subtracted another 1K of information. They figured out more than 20 years ago that, on average, a theatrical print carried no more than about 2048x1556 of real resolution... and that's where the numbers for the digital Cineon standard came from.
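Just to put rough numbers on that generational loss, here's a quick back-of-the-envelope sketch in Python. The ~6K starting point and the ~1K-per-generation loss are only the ballpark figures quoted above, not measured data; real loss depends on the stocks, the MTF, and the printing chain.

[code]
# Back-of-the-envelope illustration of generational loss in the film lab chain.
# Starting resolution and per-generation loss are rough figures, purely illustrative.

start_resolution_k = 6.0      # approximate best-case resolution of the camera negative (OCN)
loss_per_generation_k = 1.0   # rough rule of thumb: each contact printing step costs ~1K

generations = ["IP (interpositive)", "IN (internegative)", "release print"]

resolution = start_resolution_k
print(f"OCN (camera negative): ~{resolution:.0f}K")
for stage in generations:
    resolution -= loss_per_generation_k
    print(f"{stage}: ~{resolution:.0f}K")

# With these best-case numbers the release print lands around 3K; average
# real-world prints measured closer to 2K (2048x1556), which is where the
# Cineon scanning standard got its numbers.
[/code]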
In truth, the VFX business still can't deal with 4K data because the pipeline is so slow and limited. It's not a question of money or technology... it's really a question of time. When you have to deliver a film at the very end, there's just no time to take on another 4X of work just to deal with that much more data. Not yet.
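For a sense of what "that much more data" means: going from 2K to 4K doubles both dimensions, so every frame carries four times the pixels. A rough sketch using the DCI container sizes and uncompressed 16-bit half-float RGB frames (illustrative assumptions, not any particular studio's pipeline):

[code]
# Rough illustration of why 4K means ~4x the data of 2K for a VFX pipeline.
# Bytes/pixel assumes uncompressed 16-bit half-float RGB; numbers are illustrative.

def frame_bytes(width, height, bytes_per_pixel=6):
    return width * height * bytes_per_pixel

dci_2k = (2048, 1080)
dci_4k = (4096, 2160)

px_2k = dci_2k[0] * dci_2k[1]
px_4k = dci_4k[0] * dci_4k[1]

print(f"2K frame: {px_2k:,} pixels, ~{frame_bytes(*dci_2k) / 1e6:.1f} MB uncompressed")
print(f"4K frame: {px_4k:,} pixels, ~{frame_bytes(*dci_4k) / 1e6:.1f} MB uncompressed")
print(f"Pixel ratio 4K/2K: {px_4k / px_2k:.1f}x")  # doubling both dimensions -> 4x the data
[/code]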
As a result, 99% of the "4K" movies you see in theaters are uprezzed. In a handful of cases, they're taking the 2K VFX, uprezzing those, and integrating them into 4K camera footage... but not very often. There are also non-effects films that do stick with 4K all the way through, but those are relatively rare.