Originally Posted by lordx2
Have studios released 10bit blurays? Or have these 10bit files come from a digital studio copy? So if I have a bluray of a movie - and I see that same movie out there in a 10bit encode - is that 10bit better than my current untouched bluray?
Are you talking about 10-bit color or a 10 Mbit/s stream? That needs clarifying.
Blu-ray is limited (per spec) to 8 bits per color channel, which equates to 24 bpp (24 bits per pixel, i.e. selecting 24-bit true color mode on your computer). Encoding/decoding 10 bits per channel and higher is easy with the right software, but actually viewing it requires hardware capable of displaying native 10-bit+ content. Otherwise, the 10/12/16-bit-per-channel content will just be dithered down to 8-bit (or 6-bit) when it hits your TV/display.
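That dither-down step is simple in principle. Here's a minimal sketch (my own illustration, not any particular TV's algorithm) of reducing 10-bit samples to 8 bits: a little noise is added before the low two bits are dropped, so smooth gradients don't turn into visible bands.

```python
import random

def dither_10bit_to_8bit(samples, seed=0):
    """Reduce 10-bit samples (0-1023) to 8 bits (0-255) with simple
    random dithering: add up to 3 counts of noise, then drop the low
    2 bits (i.e. divide by 4)."""
    rng = random.Random(seed)
    out = []
    for s in samples:
        dithered = min(s + rng.randrange(4), 1023)  # clip at 10-bit max
        out.append(dithered >> 2)
    return out

# A full 10-bit ramp collapses 4:1 onto the 8-bit scale.
reduced = dither_10bit_to_8bit(list(range(1024)))
print(min(reduced), max(reduced))  # 0 255
```

Real displays use fancier dithering (ordered or temporal), but the endpoint is the same: four 10-bit codes map onto each 8-bit code.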
Originally Posted by 42041
The Blu-ray spec only supports 8-bit/channel color. I'm not sure what those 10bit encodes are but it's probably BS.
The 10-bit encodes could be real, but without context as to what those files are or where they came from, we can't say for certain. Like most things involving "Deep Color", it's mostly marketing at this point.
Originally Posted by Iain-
I don't think that's entirely correct. The times I've checked my BD video signal on AVR Information overlay screen, it has always indicated:
1080P 24Hz 12 bit. BTW, my player is configured for original resolution and Y'CbCr 4:4:4 colour space, AVR is set for video signal pass-through without modification and display is configured for 1080P Pure Direct.
If you have disabled ALL processing, then your AVR is simply describing the connection, not the content (a subtle difference). Your equipment is all Deep Color-compatible, so the connection is 12-bit (36 bpp), but your disc still only holds 8-bit (24 bpp) content. If you activate Deep Color processing, the player takes your standard 8-bit (24 bpp) stream and resamples it to 10-bit (30 bpp), 12-bit (36 bpp), or 16-bit (48 bpp), whatever your equipment is capable of. Of course, most people only have 6-bit or 8-bit LCDs, so everything just gets dithered back down anyway. LOL
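To see why that resampling adds nothing, here's a sketch of one common way to widen 8-bit codes to 12 bits (bit replication; I'm not claiming any specific player does exactly this): black and white map exactly, but no new intermediate shades appear.

```python
def expand_8_to_12(v8):
    """Resample an 8-bit code (0-255) to 12 bits (0-4095) by bit
    replication: shift left 4 bits and refill the low bits with the
    top bits of the sample. 0 -> 0 and 255 -> 4095 exactly, but the
    content is still 8-bit, just carried on a wider connection."""
    return (v8 << 4) | (v8 >> 4)

print(expand_8_to_12(0))    # 0
print(expand_8_to_12(255))  # 4095
# Only 256 of the 4096 possible 12-bit codes ever appear:
print(len({expand_8_to_12(v) for v in range(256)}))  # 256
```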
Originally Posted by lordx2
I think it is 10 bit per channel??? Hell, I don't know, which is why I was asking.
So would it be possible for a studio to release a digital download with a super high bit depth and a super high frame rate? I would assume so, if the source was shot in those higher specs. (Anyone here remember the Terminator 2 special edition DVD? It had a digital copy on the bonus disc that was 1080p, way before Blu-ray hit the market!) Can digital containers like MKV handle a bit depth of 10+ and a high FPS of 48+ @ 1080p?

PS - You guys do bring up a good point about the super deep colors at the 30-bit range (10 bits per channel). It seems HDMI 1.3a+ is capable of transmitting those colors; it's just that the Blu-ray spec may not handle them, so it would have to be a digital computer video file.

PPS - Anyone know of any digital 1080p files at super high frame rates like 60fps? I would love to see my 1080p projector firing on all cylinders @ 60fps!
Yes, videos of any frame rate, color depth, and resolution are available in nearly every container and many codecs (containers are generally agnostic to the A/V inside them). With the right codec you can play anything on a computer, but what good is that if you don't have a display that can show it natively? I have some 48fps, 60fps, and 120fps videos on my computer; my laptop screen is only good for 60Hz, though my CRT can do 120Hz (it can go as high as 260Hz, but not at HD resolution). So you have to figure out what makes sense.
Also, I still have my T2 Extreme Edition 1080p DVD.
Originally Posted by Kilian.ca
You haven't told us where you came across those files so we can't tell either. There might be software on the PC to tell you and my standalone video processor can also show the info.
AVCHD can do that and someone made some test files in the BDP forum.
Yeah, that's still the real question. For all we know it's something he found on TPB. LOL
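If he wants to check the files himself, tools like ffprobe or MediaInfo will report the pixel format, and FFmpeg-style names spell out the bit depth. A small sketch (assuming FFmpeg's naming convention for pixel formats; the function name is my own):

```python
import re

def bit_depth_from_pix_fmt(pix_fmt):
    """Guess per-channel bit depth from an FFmpeg-style pixel format
    name (the kind of string ffprobe reports for a video stream).
    Names like yuv420p10le / yuv422p12be spell out their depth; names
    with no depth suffix (yuv420p, etc.) are 8 bits per channel."""
    m = re.search(r"(\d+)(?:le|be)$", pix_fmt)
    if m:
        return int(m.group(1))
    return 8  # no explicit depth suffix -> assume 8 bits per channel

print(bit_depth_from_pix_fmt("yuv420p"))     # 8  (a normal Blu-ray rip)
print(bit_depth_from_pix_fmt("yuv420p10le")) # 10 (a "Hi10P"-style encode)
```

So a stream reported as yuv420p10le really is 10-bit, whatever its provenance.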