Originally Posted by helvetica bold
So with the announcement of the Ultra HD Blu-ray general specs, I'm wondering how much the W9 can take advantage of the improved PQ. I realize there are things we don't know yet about UHD Blu-ray, but could the W9 handle the 10-bit video, HFR, or even the P3 color gamut? I care more about that stuff than about resolution. From the sound of it we won't get the benefit of HDR, but what could we possibly gain from the new format?
I have been wondering the same thing while reading about the new Rec. 2020 UHD format. Of all the new specs, the higher resolution is probably the least interesting part, imho. The exciting parts are the increased color space, the higher bit rate (12 to 16 Mb/s is stated as the minimum needed for "good" video quality), progressive video (50p, 60p, even 100p and 120p) and HDR. (Note: for those higher-frame-rate progressive signals HDMI can't cope, and you would need DisplayLink or DisplayPort connectors, which many of the new UHD TVs don't have yet.) Our W900As can also already cope with 1080p50 and 1080p60, yet there is still no indication of any material at that level of quality becoming available (the latest Hobbit movie might get a 48p version on Blu-ray, I believe).
Netflix has already indicated that when bandwidth fluctuations reduce transfer speeds for their new limited-release 4K web streams, they will preferentially drop the resolution to HD and keep HDR, because viewers perceive less reduction in video quality that way (adding HDR to a UHD video stream only adds 2 or 3 Mb/s). For comparison: SD bitrates for DTV are around 6 Mb/s, and HD DTV is between 9 and 13 Mb/s. SD DVDs are usually 6 to 8 Mb/s, and Blu-ray can be from 15 to 35 Mb/s (most are around 25 Mb/s). The few HD "Mastered in 4K" Blu-ray movies released by Sony that use the enhanced x.v.Color gamut (with expanded red and green range) are 35 to 38 Mb/s. My point being: a good HD TV can already quite happily cope with the new "higher bit rate" they are lauding for UHD broadcasts.
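To put those bitrates in perspective, here's a quick back-of-envelope conversion from Mb/s to gigabytes per hour. The figures are the ones quoted above (midpoints of the stated ranges); the conversion itself is just arithmetic:

```python
# Convert a video bitrate in megabits per second to gigabytes per hour.
# Uses decimal GB (1 GB = 1000 MB), as disc and broadcast specs usually do.
def gb_per_hour(mbps):
    return mbps * 3600 / 8 / 1000

# Bitrates as quoted above (rough midpoints of the stated ranges).
formats = {
    "SD DTV": 6,
    "HD DTV": 11,
    "Blu-ray (typical)": 25,
    "Mastered in 4K Blu-ray": 36,
}
for name, mbps in formats.items():
    print(f"{name}: {mbps} Mb/s is about {gb_per_hour(mbps):.1f} GB per hour")
```

So a typical 25 Mb/s Blu-ray stream works out to roughly 11 GB per hour of video, which is why a two-hour film fits comfortably on a 25 GB disc.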
An interesting point is that, if I understood it correctly, some technical documents for UHD indicate that the transition phase from HD to UHD (phase 1, which we are currently in until roughly 2018) will initially allow some UHD material to use the Rec. 709 color space.
The current 2015 UHD models also only cover about 85% of the new Rec. 2020 color space. As a comparison, quantum dot LCDs have a 50% larger color gamut than standard LCDs, and our W900A can already happily produce 73.9% of Rec. 2020.

OLED has an even larger color gamut than quantum-dot-backlit LCDs, and has no problem reaching a similar 85% of Rec. 2020; however, it is struggling with the UHD resolution and experiences color bleed. OLED also still has problems reaching the 1,000-nit level required for HDR brightness (LG has promised a firmware update later this year to implement this on their just-released first 4K OLED TV). And the final buzzkill for OLED is the limited lifespan of one of its three primary colors (blue, I believe), which starts to fade after even a few years (the brighter you run the display, the quicker it fades). If they had issues with this in the last few years on HD OLED models, they will need some clever magic to cope with the 1,000-nit spec of UHD. All this matters because HDR in the new Rec. 2020 standard depends very much on those increased brightness levels, although even then, for most new UHD video material, 300 to 500 nits would be more than adequate.
Looking at the new 2015 range from the better main brands like Sony, Samsung, Panasonic and LG, only a few select high-end models use quantum dots, and from what I can make out they are all UHD. There are no more HD models with quantum dots (or even HD models using other wide-color-gamut methods, aside from OLED). If we could find a way to downscale UHD to HD, our current Sony W900As could cope with most of the other specs.
The Blu-ray HD and HDTV DTV broadcast standards specify 100 nits maximum brightness. The new HDR standard uses a much brighter 1,000 nits, but that level of brightness would be like looking directly into the sun, something it is very unrealistic to expect a TV to need to produce in any significant way (and which would potentially be dangerous to your eyesight). Its main use will be in providing better contrast (the range between dark and light is what matters in creating better shades of grey/black, and in selectively brightening parts of the screen). But 200 or 300 nits would be ample for that purpose in most situations, and iirc our Sony W900 can already do 350 nits (the quantum dot technology producing the increased brightness over what older LCDs with different backlighting technologies could manage).
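As a rough illustration of why the dark-to-light range matters more than raw brightness: dynamic range in photographic stops is the base-2 log of peak luminance over black level. The black-level figure below is my own ballpark assumption for a decent LCD, not a measured value:

```python
import math

# Dynamic range in stops = log2(peak luminance / black level).
def stops(peak_nits, black_nits):
    return math.log2(peak_nits / black_nits)

# Assumed black level of ~0.05 nits (ballpark for a good LCD, not measured).
print(round(stops(100, 0.05), 1))   # SDR 100-nit spec  -> ~11.0 stops
print(round(stops(1000, 0.05), 1))  # HDR 1000-nit spec -> ~14.3 stops
```

Under that assumption, the jump from 100 to 1,000 nits buys a bit over 3 extra stops of range; lowering the black level (as OLED does) widens the range just as effectively as raising the peak.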
One very positive element of this new UHD standard is the use of HEVC compression (HD used H.264/AVC), which provides up to 60% better compression, reducing bandwidth needs (e.g. for broadcasts) and storage space. However, a single UHD movie would still require 300 to 500 GB of space! Other than some UHD broadcasts of selected events, a widespread uptake of this technology by home users is highly unlikely for some years to come, and it will remain a niche product. To keep this in context: Blu-ray never had major success with the masses, and DVD is still the main format being used (and sold). Similarly, DVD-A and SACD never gained traction in the audio world, however good they were/are. Worse even, where I live two thirds of all DTV broadcasts are still only in SD, and only one third are in HD! In the real world most of us would be happier with HD at a decent bit rate (and better color space etc.) than with some newer but further watered-down low-bitrate UHD. SD also scales OK to HD, but it is nonsense to try to scale it to UHD and expect a good result.
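Taking the 60% compression improvement above at face value, here's a sketch of what it means for bitrates and file sizes. The 100 Mb/s H.264 baseline for UHD is my own hypothetical figure, not from any spec:

```python
# If HEVC compresses ~60% better than H.264 (the figure quoted above;
# real-world gains are often reported closer to 40-50%), a stream that
# needs X Mb/s in H.264 needs roughly 0.4 * X in HEVC.
def hevc_bitrate(h264_mbps, improvement=0.6):
    return h264_mbps * (1 - improvement)

# Movie size in decimal GB for a given bitrate and runtime.
def movie_size_gb(mbps, hours):
    return mbps * hours * 3600 / 8 / 1000

# Hypothetical example: UHD needing 100 Mb/s in H.264.
h264 = 100
hevc = hevc_bitrate(h264)               # roughly 40 Mb/s in HEVC
size = movie_size_gb(hevc, 2)           # a 2-hour movie at that rate
print(f"{hevc:.0f} Mb/s -> {size:.0f} GB for 2 hours")
```

Note these assumed numbers come out far below the 300 to 500 GB figure above; that figure presumably reflects higher-bitrate masters or less aggressive compression, so treat this only as a feel for the arithmetic.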
In short, I think it is still much too early to jump on the 4K wagon. These are great new specs to aim for, and we will all welcome the video improvements they promise, but it will take some time before they become relevant for most viewers. However, as these improvements are gradually implemented, they will hopefully also improve the HD video experience at the same time. For those who don't have a good HD TV yet, or might want to upgrade to a larger/better screen, I think it is a painful time to have to make a choice. Only the high-end models of the new UHD TVs from the main brands have started, in 2015, to implement these new UHD specs, and prices range from $5,000 to $8,000 for a 65" or 70" set. I haven't seen any 55" UHD models from the four main good brands that have all the high-end specs (they usually lack the HDR or quantum dot elements).
By all indications our W900As will happily last us a few more years while the UHD specs are standardized further and become more mainstream, and while we wait for any worthwhile amount of UHD material to become available. If only we could find a way to downconvert UHD to HD and keep most of the improved video quality elements, we would have the best of both worlds.