Originally Posted by nathanddrews
If certain studios choose to "dumb down" the OCN in favor of making it look like a theatrical print, then it stands to reason that we'll hit the wall of diminishing returns much sooner with higher resolution formats. Is it then logical to assume that the inevitable 4K format will be wasted on catalog films from said studios? As evidenced by Jaws, Universal (or specifically Spielberg in this case) thinks that too much original detail detracts from the viewing experience... for 1080p. What then, 4K?
I don't like where this is going.
You're forgetting that there were fundamental shifts in the film stocks used, first at the end of the 60's and then again late in the 80's.
The first change was moving away from the expensive stuff, like large formats and 3-strip Technicolor. In the case of the latter, the studios in the 70's and early 80's were mastering to cheaper single-dye-layer Kodak stocks with a horrible shelf life. In other words, they went from high quality masters, skipped past the premium color stocks Kodak also offered, and went straight to stuff that was not truly archival. The result is prints from 30 years ago that look far worse than prints of older vintage.
Luckily, the studios realized their mistake (especially after seeing the revenues that could be made from home video down the road) and were using improved stocks for mastering by the time the 90's rolled around. The new problem is the digital workflow, which is largely 2K. That means you would likely see diminishing returns from higher resolution formats on those titles.
So, there is a lot to be gained from creating 4K scans of movies from certain eras, particularly the 50's and 60's, when plenty of films were still being shot on large, high quality, fine-grained stocks.
Unfortunately, just like with TV, the 70's and 80's were a bad timeframe for content. Cheaper was chosen over quality, which is why some TV series will likely be stuck in SD video forever.