Originally Posted by coolscan
You must distinguish between capture (film & digital) and delivery after post-production.
Both film and digital capture formats have low contrast, regardless of resolution, in order to capture as much dynamic range as possible.
The desired contrast and electronic sharpening are then added in post.
I agree that high contrast increases the perceived sharpness.
Low-resolution capture, e.g. 2K, requires more electronic sharpening in post than capture done at a higher resolution.
5K or 6K capture for 4K or 2K delivery has a higher organic sharpness and avoids the edge enhancement (EE) we so often see in movies released on BD.
The reason for this EE is that it was previously thought that 2K film scans and 2K cameras were sufficient for 2K delivery. When that material is authored for BD, the production people deem it "too soft" and over-sharpen it.
We have now learned that film should be scanned at 4K for 2K delivery and at least 6K for 4K delivery.
The same goes for digital cameras, which now have sensors with higher than 2K resolution for 2K delivery.
That those cameras will also be used for up-converting to 4K delivery just shows how little quality consciousness and care for image quality there is in the movie industry.
What I see people who work in the industry complain about is that the choice of capture format for a movie is mostly driven by stupidity, hearsay, fanboyism, incompetence and money.
Choosing equipment based on the desire for the best technical image quality and for "future-proofing" is rarer.
The 2K digital cameras do a lot of in-camera processing (including electronic sharpening) before the images are stored on the capture cards/tapes.
For 35mm film, this processing doesn't happen until the post-production stages.
That's why the 2K cameras can look sharper.
Digital RAW capture on cameras that don't pre-process the RAW in camera behaves like 35mm film in this respect: it is only sharpened in post.
Given what you state: have you seen many movies shot on a Sony F65 (roughly 6K) or a Red Epic (5K) in RAW and properly processed, or film scanned at 6K, projected with the latest Barco or Christie Series 2 4K projectors (Sony's 4K SXRD/LCoS is softer)?
I don't see the evidence for your estimates at all, because you forget an important point.
That is that IMAX titles are usually scanned at 8K and down-sampled to 2K.
Which shows that very-high-resolution capture matters a great deal compared to 2K capture.
By the way, estimates for 35mm capture film (not release prints) put it at 1.5 to 4 megapixels of resolution, depending on the type of film used and the shooting situation.
As consumers we only care about the delivery format; the pixel count of the capture system is a secondary issue.
MTF and dynamic range are completely separate issues. MTF quantifies the amplitude-versus-frequency response of the system, not the overall dynamic range.
All digital capture systems have a falling high-frequency response that must drop close to zero at Nyquist (half the pixel sampling frequency, i.e. one cycle per two pixels) to avoid out-of-band frequencies entering the system and causing aliasing distortion. The low-pass filters used (both optical and digital) cannot have an infinitely steep cut-off, so capturing with many more pixels than the final output requires allows better filter performance and a higher in-band response when scaling down to the lower output resolution.
Lenses, film and film scanners also have an MTF response (a falling high-frequency response), and the MTF of the final output is the product of the MTFs of every element in the chain, so the losses compound. With digital cameras the additional losses of film and scanning are avoided, which allows a higher mid-band MTF, and that is what defines a sharp picture.
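To make those two points concrete, the compounding of losses and the benefit of oversampled capture, here is a minimal sketch. The Gaussian lens curve, the sinc pixel-aperture curve and the assumption of a lossless downscale are all stand-ins I chose for illustration, not data for any real camera or scanner:

[code]
# Illustrative only: toy MTF curves, not measurements of any real system.
import numpy as np

def lens_mtf(f, f50=0.35):
    """Toy lens MTF: Gaussian falloff with 50% response at spatial frequency
    f50 (frequencies in cycles per 2K pixel, so 0.5 = 2K Nyquist)."""
    return np.exp(-np.log(2) * (f / f50) ** 2)

def pixel_mtf(f, pitch=1.0):
    """Pixel-aperture MTF: |sinc(f * pitch)|.  A 4K sensor on the same image
    area has half the pitch of a 2K sensor, so its MTF falls off more slowly."""
    return np.abs(np.sinc(f * pitch))

# Evaluate at a mid-band frequency well inside the 2K passband
f = 0.35   # cycles per 2K pixel (2K Nyquist = 0.5)

# System MTF is the PRODUCT of the element MTFs
native_2k   = lens_mtf(f) * pixel_mtf(f, pitch=1.0)
oversampled = lens_mtf(f) * pixel_mtf(f, pitch=0.5)  # 4K capture, ideal downscale

print(f"2K-native capture, in-band MTF at f={f}: {native_2k:.2f}")
print(f"4K capture -> 2K,  in-band MTF at f={f}: {oversampled:.2f}")
[/code]

The point is that the combination is multiplicative: two elements that each pass 70% of the contrast at some frequency leave only about 49% of it in the final image, and halving the pixel pitch (4K instead of 2K over the same image area) keeps the pixel-aperture term much closer to 1 inside the 2K passband.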
No matter what capture system is used, a digital image can only show about 60% of the resolution its pixel count would suggest; above that, MTF is too low for detail to be visible.
The reason IMAX or high-res digital looks sharper than 35mm film when encoded to 2K is that the mid-frequency MTF (well below the 2K limit) is higher; the limiting resolution is the same in all cases and is set by the 2K format.
Since most 35mm film titles are significantly softer than IMAX or high-res digital when encoded to 2K, it's clear the 2K format is not the major limitation for a 35mm film source.
"Resolution" and image sharpness are separate issues. A high-resolution image with low MTF will look soft, while a lower-resolution image with high MTF will look sharp. High resolution without high MTF is pointless for movies.
I am all for high-res capture systems as they do offer tangible benefits; the issue here is 8K versus 4K as a display format, and I just don't see 8K being worth it for reasonable screen-size-to-viewing-distance ratios. A 200" diagonal screen at 8' is not reasonable or practical, IMHO.
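To put rough numbers on the screen-size-to-distance argument: a common rule of thumb is that 20/20 vision resolves about one arc-minute, i.e. around 60 pixels per degree, beyond which extra pixels can't be seen. The two setups below are examples I picked, and the 60 px/degree figure is that rule of thumb rather than a hard limit:

[code]
# Rough pixels-per-degree check for 4K vs 8K at two example setups.
# The ~60 px/degree acuity figure is the common 20/20 rule of thumb.
import math

def pixels_per_degree(h_pixels, diag_in, dist_in, aspect=16/9):
    """Horizontal pixels per degree of viewing angle for a flat 16:9 screen."""
    width = diag_in * aspect / math.sqrt(1 + aspect ** 2)
    h_angle = 2 * math.degrees(math.atan(width / 2 / dist_in))
    return h_pixels / h_angle

setups = [('120" at 10 ft', 120, 120), ('200" at 8 ft', 200, 96)]
for name, diag, dist in setups:
    ppd_4k = pixels_per_degree(3840, diag, dist)
    ppd_8k = pixels_per_degree(7680, diag, dist)
    print(f"{name}: 4K = {ppd_4k:.0f} px/deg, 8K = {ppd_8k:.0f} px/deg "
          f"(acuity rule of thumb ~60)")
[/code]

At 120" and 10 ft, 4K is already past the rule-of-thumb limit, so 8K adds nothing visible; only something as extreme as 200" at 8' gets into territory where 8K could matter, which is exactly the ratio I'm calling unreasonable.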
Here is an example of what falling MTF at high spatial frequencies does to visible resolution. The top half is 100% MTF (perfect); the lower half is a typical representation of the performance of a digital capture system, where MTF drops to about 10% at the pixel limit (Nyquist). The limiting "resolution" is the same in both cases, as you can see the fine detail if you look closely, but stand back from the screen and the resolution you can actually see changes dramatically.
Remember, most details in the real world don't have 100% relative contrast to begin with; 30% is probably more typical. Factor in 50% or more MTF loss and a 30% contrast difference between details in the original image becomes 15% or less in the video, which is useless.
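The arithmetic behind that is just multiplication: the modulation you end up seeing is the scene's own contrast times the system MTF at that detail's spatial frequency. A quick sketch, with example values only:

[code]
# Delivered modulation = scene contrast x system MTF at that frequency.
# The contrast and MTF values below are examples only.
scene_contrast = 0.30   # typical real-world detail, not a test chart
for mtf in (1.0, 0.5, 0.3, 0.1):
    delivered = scene_contrast * mtf
    print(f"system MTF {mtf:.0%}: delivered contrast {delivered:.0%}")
[/code]

So at the ~10% MTF the lower half of the example reaches near Nyquist, a 30% scene contrast ends up at about 3%, which is effectively invisible from any normal viewing distance.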
For those who want a better understanding of MTF, I suggest this tutorial.