Originally Posted by irkuck
This is a very tricky issue which you most likely do not appreciate because of a lack of deeper knowledge of how compression works.
You repeatedly state this nonsense, even now criticising others for not having a deeper understanding of how compression works, and you have absolutely nothing to back it up.
Your claims are made up solely in your head and go against all reports from the people who work with the new compression codecs, and from everyone who has seen the effect in person.
Originally Posted by irkuck
In this case 4K and Blu-ray then both have about a 25 Mb/s budget, but for 4K this is the deep compression regime and for Blu-ray it is safely away from it. Now, one could downconvert the 4K to 1080 and, according to your claim, be sure it is absolutely better than Blu-ray, which is unlikely or absurd, since it would imply that deep compression at higher res is better than non-deep compression at the original res.
4K-to-1080p down-conversions have been done numerous times for re-scanned movie titles re-released on Blu-ray, and they consistently show better image quality.
Bitrate matters far less than many people believe and claim. You reach the point of diminishing returns long before most people think you do.
Often the reason for a high bitrate on BD is trouble spots/faults in the source that introduce unacceptably heavy artefacts, and the bitrate is pushed far above normal just to save the situation.
Extra bitrate has no apparent effect on clean sources. If it did, all BD movies would run at the maximum bitrate; they certainly have the disc space for it.
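To put rough numbers on the bitrate question, here is a minimal back-of-the-envelope sketch (assuming 24 fps and the 25 Mb/s figure quoted above; it deliberately ignores the efficiency gap between H.264 and HEVC):

```python
# Crude bits-per-pixel comparison at a fixed 25 Mb/s budget, assuming 24 fps video.
def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    return bitrate_bps / (width * height * fps)

print(f"1080p @ 25 Mb/s: {bits_per_pixel(25e6, 1920, 1080, 24):.2f} bits/pixel")  # ~0.50
print(f"2160p @ 25 Mb/s: {bits_per_pixel(25e6, 3840, 2160, 24):.2f} bits/pixel")  # ~0.13
```

The raw bits-per-pixel figure is four times lower at 2160p, but how visible that is depends on the codec and the content, which is exactly the point.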
Originally Posted by irkuck
But one could upconvert the Blu-ray to 4K and easily claim that it will be better than the 4K, just by sidestepping the compression artefacts from the deep compression regime.
You never seem to get that an original with four times the resolution, from a camera or scanner source, will always trump the image quality coming from a 1080p source.
The higher-resolution original will have fewer artefacts, not more, because many of the artefacts in a lower-resolution source are already present in the original camera files.
That source then gets compressed, which both emphasises the original artefacts and adds new ones.
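On the down-conversion point, a toy numpy sketch (a flat grey frame with Gaussian noise standing in for sensor grain and compression noise; purely illustrative): averaging four 4K pixels into each 1080p pixel roughly halves the random noise.

```python
import numpy as np

rng = np.random.default_rng(0)
frame_4k = 128 + rng.normal(0, 10, size=(2160, 3840))   # flat grey "4K" frame plus noise

# 2x2 box down-conversion to 1080p: each output pixel averages four source pixels.
frame_1080 = frame_4k.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

print(f"noise std at 4K:    {frame_4k.std():.1f}")    # ~10
print(f"noise std at 1080p: {frame_1080.std():.1f}")  # ~5, i.e. std / sqrt(4)
```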
Take the example of a much-complained-about artefact in BD movies: edge enhancement (EE).
Why do you think it is there?
Because the original 2K source is so unsharp that during authoring the technician has to turn the sharpening knob up to "11".
When your source is a 4K master shot at even higher resolution, the resolved detail (the more correct term is surface texture) makes the image appear sharper through the increased resolution alone.
Because that improved natural, organic sharpness is present in the original source material, the need for added electronic sharpening decreases immensely.
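For what that "sharpening knob" actually does, here is a minimal 1-D sketch of unsharp masking (one common form of electronic sharpening; not a claim about any specific authoring tool): boosting an already-soft edge produces the overshoot and undershoot that show up on screen as EE halos.

```python
import numpy as np

def box_blur(signal: np.ndarray, radius: int = 3) -> np.ndarray:
    # Simple moving-average blur with edge padding so the flat regions stay flat.
    padded = np.pad(signal, radius, mode="edge")
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(padded, kernel, mode="valid")

edge = np.repeat([50.0, 200.0], 50)               # an ideal hard edge
soft = box_blur(edge)                             # the soft, "unsharp" source
sharpened = soft + 2.0 * (soft - box_blur(soft))  # unsharp mask with the knob turned up

print(f"source range:    {soft.min():.0f} .. {soft.max():.0f}")            # 50 .. 200
print(f"sharpened range: {sharpened.min():.0f} .. {sharpened.max():.0f}")  # over/undershoot = halos
```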
In addition, because improved camera sensor technology introduces fewer artefacts into the original source, the image will be much cleaner.
Marry all these facts together and send the finished 4K original through one of the new compression codecs, and you will be able to compress the source harder without losing any significant image quality, retaining transparency to the original source.
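As a concrete illustration of "send it through one of the new codecs", a minimal sketch assuming an ffmpeg build with libx264 and libx265; the file names and CRF values are placeholders, and a real transparency check would involve metrics such as PSNR/SSIM plus critical viewing:

```python
import subprocess

SOURCE = "master_4k.mov"  # placeholder path to the finished 4K master

def encode(codec: str, crf: int, outfile: str) -> None:
    # -an drops audio so the comparison is video-only; -preset slow trades speed for efficiency.
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE, "-c:v", codec, "-preset", "slow",
         "-crf", str(crf), "-an", outfile],
        check=True,
    )

encode("libx264", 18, "avc_test.mkv")   # H.264 reference encode
encode("libx265", 20, "hevc_test.mkv")  # HEVC encode pushed a step harder
```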
The biggest problem with HD original sources is that they have always been too low in resolution. At 4K we only just pass an acceptable level of resolution quality.
You have been told this time and time again, and still you are stuck on how 2K resolution and H.264 behave under compression and what the result looks like.
Now it is time to understand that with 4K and the new HEVC codec and compression methods, everything you knew has changed.
Until you can come up with some new data to back your antiquated claims, you should resist the temptation to post this endless stream of statements that exist only in your own head and have nothing to do with the technical reality of 2013.
Stop misleading people.
Originally Posted by irkuck
You are really comparing apples and... HD. Talking about the PPI without a viewing scenario is purely absurd.
Again and again you show that you have no understanding of the relationship between higher-resolution image sources plus higher-resolution displays and PPI.
You are just stuck on the viewing-distance/angle-of-view argument alone, and never consider all the other factors that affect display quality when resolution is quadrupled from HD to UHD.
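For reference, a small sketch of how PPI relates panel size and resolution (the panel sizes are arbitrary examples; what a viewer can actually resolve also depends on viewing distance, which is the other half of this argument):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    # Pixels per inch along the panel diagonal.
    return math.hypot(width_px, height_px) / diagonal_in

for diag in (55, 65):
    print(f'{diag}" panel: 1080p = {ppi(1920, 1080, diag):.0f} PPI, '
          f'UHD = {ppi(3840, 2160, diag):.0f} PPI')
```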