Originally Posted by Dan Hitchman
However, there are still compression artifacts and other unwanted anomalies showing up at lower bitrates with H.264 and 1080p at only 8 bit... especially if they have to filter the master file to within an inch of its life to compress the data at that extreme level. The better Blu-ray transfers usually have higher average bitrates and far less pre-filtering.
The mistake here is the general assumption that the artifacts are solely the result of overly aggressive compression and low bitrates. In fact, most of those artifacts are already present in the raw material coming out of the camera or the scanner.
The reason is that most digitally shot films are captured on cameras with 2K sensors, or come from film scans done at 2K for the DI. These camera sensors and scanners give barely 1.5 megapixels of real resolution, output as 2K with the artifacts baked in, and those artifacts are then amplified by further compression.
On top of that, these images are rather soft and need electronic sharpening, which is why you see all the ugly edge enhancement on 1080p material. This would be avoided if the 1080p source had originated in 4K.
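The "barely 1.5 megapixels from a 2K sensor" claim follows from simple arithmetic, if you accept the common rule of thumb that Bayer demosaicing recovers only around 70% of the photosite count as real luma detail (the exact factor is debated and depends on the demosaic algorithm; 0.7 here is an assumption, not a measured value):

```python
# Back-of-envelope: effective resolution of a 2K Bayer sensor after demosaicing.
# The 0.7 "demosaic efficiency" factor is a common rule of thumb, not a spec.

sensor_w, sensor_h = 2048, 1080            # nominal 2K photosite grid
photosites_mp = sensor_w * sensor_h / 1e6  # raw photosite count in megapixels
DEMOSAIC_FACTOR = 0.7                      # assumed fraction of real detail recovered
effective_mp = photosites_mp * DEMOSAIC_FACTOR

print(f"photosites: {photosites_mp:.2f} MP")  # ~2.21 MP on the sensor
print(f"effective:  {effective_mp:.2f} MP")   # ~1.55 MP of real resolution
```

So a nominal 2K capture carries roughly 1.5 MP of genuine detail, which is then sharpened and compressed as if it were a full 2.2 MP image.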
We generally assume that post production is done at a high professional level, with knowledge kept up to date on the latest processing techniques. But that is very far from the truth; it is much more the opposite.
Much of what is shot on 4K cameras has been transcoded straight to 2K Apple ProRes, discarding the 4K RAW, and then edited, graded and rendered from the 2K ProRes, because of the Apple/FCP-centric "standard" in post houses and/or pure stupidity. This has happened to most movies shot on Red 4K cameras, and even though films are now scanned at 4K, 6K and even 8K, they are still processed with bad methods.
The first digital 2K camera that gives a full 2K file after debayering is the Arri Alexa, because it has a 3K sensor that is down-sampled in camera to 2K. The downside is that the camera has an extra-strong optical low-pass filter, which gives a very soft picture.
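The Alexa trick works because the detail lost to demosaicing is roughly won back by the oversampling. A quick sanity check, using the commonly quoted ~2880-photosite width of the Alexa's 16:9 sensor area and the same assumed 0.7 demosaic factor as above:

```python
# Why a ~3K Bayer sensor can yield a "full" 2K image: demosaic loss is
# roughly offset by downsampling. 2880 is the commonly quoted Alexa 16:9
# photosite width; 0.7 is an assumed rule-of-thumb demosaic factor.

sensor_w = 2880                     # Alexa photosites across (16:9 mode)
DEMOSAIC_FACTOR = 0.7               # assumed effective luma fraction after debayer
effective_w = sensor_w * DEMOSAIC_FACTOR

print(effective_w)                  # ~2016 -- close to a true 2048-wide 2K image
```

In other words, after debayering there is still about one real pixel of detail for every output pixel in the 2K file, which a native 2K sensor cannot deliver.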
So only movies shot with Red 4K and 5K cameras (or the new Sony F65), shot on the Alexa, or scanned at higher than 2K can give an artifact-free 2K source, provided the files are treated right.
Only the Red and Sony cameras can provide a 4K "future-proof" source, which is what makes it a head-scratcher that so many high-profile films and TV series are shot on the Alexa, like "Game of Thrones", "Downton Abbey", "The Avengers" and the James Bond film "Skyfall"!
Which means all of these productions will only exist in a future "4K world" as up-converted versions.
Why don't the studios do some "forward thinking" and "future-proof" their material by demanding that their productions be finished for 4K delivery, even if that delivery is somewhat into the future?
The post processing situation is slowly improving at those post houses that care to update themselves, but the choice of the right camera for the job seems, if anything, to be getting worse.
20-25 Mb/s sure seems extremely low for 4K material. That can sometimes be too low even for 1080p today, depending on the complexity of the image.
I want the 4K format to look outstanding on a BIG screen, not a small telecine monitor. That will necessitate higher-than-Blu-ray bitrates and 10-bit, 4:2:2 or higher video, even with H.265.
The claim is that 4K at those low bitrates, projected on a 20-foot-wide screen, is almost indistinguishable from a DCI version, provided the material is shot with a camera of higher resolution than 4K, is artifact-free, and is treated properly in post.
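To put those numbers in perspective, it helps to normalize bitrate to bits per pixel per frame. A quick sketch at 24 fps, using illustrative bitrates (a typical good Blu-ray rate, the proposed 22 Mb/s UHD rate, and the DCI 250 Mb/s JPEG 2000 ceiling; none of these are tied to a specific disc or stream):

```python
# Bits per pixel per frame at 24 fps -- shows how aggressive 20-25 Mb/s
# is for UHD. The bitrates below are illustrative examples.

def bits_per_pixel(bitrate_bps, width, height, fps=24):
    """Average compressed bits spent on each pixel of each frame."""
    return bitrate_bps / (width * height * fps)

print(f"Blu-ray 1080p @ 30 Mb/s : {bits_per_pixel(30e6, 1920, 1080):.3f} bpp")
print(f"UHD 2160p     @ 22 Mb/s : {bits_per_pixel(22e6, 3840, 2160):.3f} bpp")
print(f"DCI 4K max    @ 250 Mb/s: {bits_per_pixel(250e6, 4096, 2160):.3f} bpp")
```

The UHD case ends up with roughly a fifth of the bits per pixel of a well-encoded Blu-ray, and around a tenth of the DCI ceiling, so the claim rests heavily on H.265's efficiency gains and on a clean, well-prepared source.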
If UHD doesn't look noticeably better than 1080p @ 8 bits with a packaged consumer medium it will never sell.
It is possible that the resolution difference between 1080p and 4K is not big enough to sell to the general public, who are not very quality-conscious and have screens that are too small. That is also why NHK decided to skip 4K for its future broadcast system and settle on 8K.
For us enthusiasts with large screens, I think we will appreciate the quality of a 4K upgrade. But how impressed we will be also depends on a combination of several factors: the aforementioned post-production treatment; the fact that high-contrast (and color-saturated) images impress more than low-contrast (desaturated) ones; laser projectors giving a wider color gamut than lamp-based projectors; and so on.