Originally Posted by irkuck
BTW, when talking about res and PQ one can consult the Digital Cinema standard. They define 2K and 4K formats. Most important, though, is that digital cinema content is full 12-bit RGB and the compression is intraframe JPEG 2000. In addition, there is a specified 2K profile @48fps. These guys really knew where the weak points in present digital video are. In a similar way, instead of talking about 4K for the home, it would be logical to talk first about addressing those very same aspects - e.g. a new Blu-ray @ 10-bit RGB, intraframe, @48fps. That would then solve any PQ problems and blow away any need for mediocre 4K.
Haha. Just enough knowledge to be dangerous!
2K DCI footage is in the region of 200-300GB for a film at 24p, so you're looking at 400-600GB at 48fps. (and probably 3D too?)
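For anyone curious where those numbers come from, here's the back-of-envelope version. The specific figures (DCI's 250 Mbit/s JPEG 2000 ceiling, a 2-hour runtime, and bitrate scaling linearly with frame rate) are my assumptions, not something from the post:

```python
# Rough DCI package size estimate - assumed numbers, not measured data.
# DCI caps the JPEG 2000 stream at 250 Mbit/s; assume a ~2 hour feature.
bitrate_mbit = 250
runtime_s = 2 * 3600

# Mbit/s -> MB/s -> MB -> GB (decimal GB, as disc/drive sizes are quoted)
size_gb = bitrate_mbit / 8 * runtime_s / 1000
print(f"~{size_gb:.0f} GB at 24p")                    # ~225 GB: in the 200-300GB ballpark
print(f"~{size_gb * 2:.0f} GB if 48fps doubles the bitrate")
```

That lands right in the 200-300GB range quoted above, before you even think about 3D doubling it again.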
A big part of this is, as you say, because each frame is stored as its own JPEG 2000 image. This means it approaches lossless compression: you can take any still frame from a film and it will be fully detailed, with almost no compression artefacts. Do that in a fast-moving scene on Blu-ray and you will see compression there.
Except the compression formats used on Blu-ray take motion into consideration. Neither the displays we have nor the human visual system requires every frame to be perfect, especially with a lot of motion.
As for 10 or 12-bit data, there are virtually no consumer displays available today that are even transparent to 8-bit. There can be benefits to sending greater-than-8-bit data from the player, and with 10-bit panels, but with the exception of animated content there's likely to be very little, if any, visual benefit to using 10-bit at the source. It quadruples the number of levels per channel (8-bit = 256 levels per channel, 10-bit = 1024), though the raw data itself only grows by 25%.
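To make the levels-versus-data distinction concrete, here's a quick sketch (nothing here comes from a spec, it's just the powers-of-two arithmetic):

```python
# Levels per channel quadruple from 8-bit to 10-bit,
# but the raw bits per sample only grow by 25%.
for bits in (8, 10, 12):
    levels = 2 ** bits          # distinct code values per channel
    raw_growth = bits / 8       # raw data size relative to 8-bit
    print(f"{bits}-bit: {levels:5d} levels/channel, {raw_growth:.2f}x raw data vs 8-bit")
```

So each extra pair of bits buys 4x the tonal precision for a comparatively small raw-data cost; the question is whether any consumer display can show the difference.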
For animated content, using high bitrates on Blu-ray seems to be sufficient as it is. Increasing the bit-depth allows higher compression to be used with animated content before macroblocking starts to become visible. As far as I know, this is only done by people who want to rip a disc they own and compress it further to save storage space.
DCI colour is also a much wider gamut than the BT.709 standard for HDTV, and the data is stored in an X'Y'Z' format, which also plays a big part in why they use 12-bit data.
Assuming that both formats have the same amount of space to work with, there will be far more benefit to 4K Blu-ray at 8-bit than to 1080p Blu-ray with 10-bit colour.
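A quick illustration of the raw-data side of that trade-off (my assumptions: three full-resolution channels, no chroma subsampling, uncompressed frames):

```python
# Raw (uncompressed) frame sizes: 4K at 8-bit vs 1080p at 10-bit.
def bits_per_frame(width, height, bit_depth, channels=3):
    """Raw bits in one frame, assuming full-resolution channels (no subsampling)."""
    return width * height * channels * bit_depth

uhd_8bit   = bits_per_frame(3840, 2160, 8)
fhd_10bit  = bits_per_frame(1920, 1080, 10)

print(f"4K 8-bit frame:     {uhd_8bit / 8e6:.1f} MB")
print(f"1080p 10-bit frame: {fhd_10bit / 8e6:.1f} MB")
print(f"ratio: {uhd_8bit / fhd_10bit:.1f}x")   # 4K@8-bit carries ~3.2x the raw data
```

In other words, at the same disc capacity the 4K/8-bit option is spending the bits on spatial detail you can actually see, rather than on extra tonal precision most displays can't resolve.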
You keep saying that there are image quality problems with Blu-ray that using less compression and moving to 10-bit colour would fix. (If anything, compression may need to be higher, but let's ignore that for now.) Can you give some examples of this?
Increasing the bit-depth, increasing the colour gamut (this would be the first thing on my list after 4K), moving to 4:4:4 chroma (or an X'Y'Z' format), and reducing compression are all nice things to have, but you have to be realistic: as far as image quality is concerned, the thing that will have the most benefit right now is moving to 4K. (By which I mean 3840x2160 Blu-ray, not the cinema format.)