Originally Posted by interstellar
We shot in 4K and did the down rez off a 4K master in post.
Reads like this would maximize any potential boost in effective resolution from the 4K => 2K or 1080p conversion. Wonder if anyone has measured such gains in final effective resolution, say with spectral analysis showing the change in MTF (like the Arri-paper graphs cited earlier above)? A rough way to probe this is sketched below.
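For the curious, here is a minimal sketch of that kind of measurement, assuming numpy and scipy; the frequencies and widths are my own illustrative choices, not the Arri methodology. It runs sinusoidal test gratings through two downscale paths (4K -> 1920 direct, and 4K -> 2K -> 1920) and reports how much modulation survives:

```python
# Toy "MTF" probe: how much contrast survives two downscale paths.
# Hedged sketch, not the Arri method; assumes DCI-ish widths.
import numpy as np
from scipy.signal import resample  # FFT-based, ideally band-limited

def grating(n, cycles):
    # 1-D sinusoidal test pattern: `cycles` full periods over n samples
    return 0.5 + 0.5 * np.sin(2 * np.pi * cycles * np.arange(n) / n)

def modulation(s):
    # Michelson contrast of the resampled pattern
    return (s.max() - s.min()) / (s.max() + s.min())

SRC, MID, DST = 4096, 2048, 1920
# 960 cycles is the Nyquist limit at 1920; 1100 is beyond it
for cycles in (240, 480, 720, 900, 1100):
    g = grating(SRC, cycles)
    direct = resample(g, DST)                 # 4K -> 1920 in one pass
    via2k = resample(resample(g, MID), DST)   # 4K -> 2K -> 1920
    print(f"{cycles:4d} cycles: direct {modulation(direct):.3f}, "
          f"via 2K {modulation(via2k):.3f}")
```

With an ideal FFT resampler both paths preserve sub-Nyquist detail essentially perfectly and kill everything above it, so they print the same; the practical bicubic/Lanczos kernels post houses actually use roll off gently near Nyquist, which is where the two paths would start to diverge on a real MTF plot.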
Most producers will shoot in 4K and work off a 2K master before down-rezzing to 1920x1080. That allows them to reuse the footage for any possible resolution changes in the future - like 4K monitors and broadcasts.
Going from a 2K master to 1920X1080 for distribution, since those resolutions are so close, seems to waste much of the benefit of 4K capture for 1920X1080p distribution. Puzzling why they wouldn't work from an exact digital copy of the original 4K, although I suspect computing costs and crunch time are factors. The numbers below show just how small that last step is.
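For concreteness, the scale factors involved, assuming DCI widths (4096 for 4K, 2048 for 2K); actual sensor widths vary by camera, so treat these as illustrative:

```python
# Per-axis scale factors for the paths discussed (assumed DCI widths)
DCI_4K, DCI_2K, HD = 4096, 2048, 1920

print(f"4K -> 2K : {DCI_4K / DCI_2K:.3f}x")  # 2.000, a clean 2:1 downscale
print(f"2K -> HD : {DCI_2K / HD:.3f}x")      # 1.067, a nearly 1:1 resample
print(f"4K -> HD : {DCI_4K / HD:.3f}x")      # 2.133 if done in one pass
```

The 2K -> HD step discards only about 6% of the width, so nearly all of the oversampling benefit is spent in the 4K -> 2K step; whether a second near-unity resample costs anything visible is exactly the MTF question above.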
BluRay version is coming.
From your ~4K recording (RED One cameras?) and the best BluRay bit rate, it seems this would maximize effective resolution, then. Although, if diffusion filters were used on the cameras, I'd guess the result could vary widely.
The many articles--and AVS posts--on HD images versus 4k (or 4k downconversions) imply most viewers can't tell the difference, although comparison images often show less stair-stepping of thinner lines from the 4k downconversion. A quick way to see that effect is sketched below.
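Here is a minimal sketch for eyeballing the stair-stepping point yourself, assuming Pillow is installed; the file names and line geometry are arbitrary choices of mine, not taken from any of the published comparisons. It draws a thin diagonal line natively at 1920x1080, draws it again at 3840x2160 with double the stroke width, then Lanczos-filters the 4K render down to HD:

```python
# Stair-stepping demo: native-HD line vs. a 4K line downconverted to HD.
# Hedged sketch assuming Pillow; output file names are arbitrary.
from PIL import Image, ImageDraw

def thin_line(width, height, stroke):
    # Black frame with one thin white diagonal line corner to corner
    img = Image.new("L", (width, height), 0)
    ImageDraw.Draw(img).line((0, height - 1, width - 1, 0),
                             fill=255, width=stroke)
    return img

hd_native = thin_line(1920, 1080, 1)        # drawn directly at HD
from_4k = thin_line(3840, 2160, 2).resize(  # drawn at 4K (2x stroke width),
    (1920, 1080), Image.Resampling.LANCZOS) # then filtered down 2:1

hd_native.save("line_hd_native.png")
from_4k.save("line_from_4k.png")
```

Zoomed in, the native HD render shows hard one-pixel stair steps, while the downconverted version gets the intermediate gray anti-aliased edge pixels--the same smoothing of thinner lines the comparison images show. -- John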