The real reason sony and Samsung aren't using dolby vision - Page 4 - AVS Forum | Home Theater Discussions And Reviews
post #91 of 405 Old 04-11-2016, 10:21 AM
Lionheart of AVS
 
King Richard's Avatar
 
Join Date: Oct 2015
Location: Greenstone, Ontario, Canada
Posts: 3,287
Mentioned: 86 Post(s)
Tagged: 0 Thread(s)
Quoted: 1797 Post(s)
Liked: 4367
Quote:
Originally Posted by latreche34 View Post
What does all this mean to the masses? Are they going to make two versions of HDR for every Blu-ray released, like PAL and NTSC DVDs back in the day?

First of all, UHD Blu-rays do not have to include any HDR at all...

... however, HDR10 is the mandatory format when HDR is included on the disc. Dolby Vision is only optional on those HDR UHD Blu-rays.

In other words, the Blu-ray Disc Association (BDA) has mandated that any HDR Ultra HD Blu-ray disc always start from a generic (SMPTE ST 2084 standard) HDR10 base layer (which requires an HDMI 2.0a input on the TV) and, if the content provider so desires, a proprietary Dolby Vision layer can then ride on top of that.

Stated differently, all HDR Ultra HD Blu-rays must include basic HDR10 metadata while only some will add Dolby Vision HDR metadata that will ride on top of the basic layer.
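If it helps to picture the layering, here's a toy sketch of that decision in Python. This is just the rule above restated as code, not the BDA spec or any real player's logic; the function and flag names are made up:

```python
# Illustrative sketch only -- not the BDA spec or any actual player's firmware.
# It just restates the layering described above: an HDR10 base layer is
# mandatory whenever HDR is on the disc, and Dolby Vision is an optional
# enhancement layer that rides on top of it.

def pick_output(disc_has_hdr, disc_has_dolby_vision,
                tv_supports_hdr10, tv_supports_dolby_vision):
    """Return which grade a hypothetical player would send to the display."""
    if not disc_has_hdr:
        return "SDR"                    # UHD Blu-ray does not have to include HDR at all
    if disc_has_dolby_vision and tv_supports_dolby_vision:
        return "Dolby Vision"           # optional layer on top of the HDR10 base
    if tv_supports_hdr10:
        return "HDR10"                  # mandatory base layer (SMPTE ST 2084 PQ)
    return "SDR (tone-mapped down)"     # display can't accept the HDR10 stream

print(pick_output(True, True, tv_supports_hdr10=True, tv_supports_dolby_vision=False))  # HDR10
print(pick_output(True, True, tv_supports_hdr10=True, tv_supports_dolby_vision=True))   # Dolby Vision
```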


Richard

Last edited by King Richard; 04-11-2016 at 10:26 AM.
King Richard is offline  
post #92 of 405 Old 04-11-2016, 10:41 AM
 
player002's Avatar
 
Join Date: Dec 2008
Posts: 6,246
Mentioned: 197 Post(s)
Tagged: 1 Thread(s)
Quoted: 3935 Post(s)
Liked: 2944
Quote:
Originally Posted by alexanderg823 View Post
Bruh, you're the guy claiming HDR on an edge lit unit is A-OK while trying to go around telling people they don't understand things?


OK buddy, you need to get back to the fundamentals.
On the podcast they both pushed back on people who dismiss edge lit as unable to do HDR. They said they tested HDR on edge-lit sets and it looked very good; they went further and said that in a darkened room it looks great and the colours are fantastic. WIFISPY added that HDR is not all about nits, it's mostly about colour, and that the edge-lit sets did such a good job they were surprised. They downplayed comments like yours saying edge lit is not capable of doing HDR. They also said that FALD is better, which we all know. You do know that the edge-lit displays from Samsung are UHD certified this year, with 1000 nits and much better dimming, right? FALD will have a better image due to its more precise dimming, no doubt, but edge lit is more than capable of doing HDR too. This comes from them testing many TVs. Nobody is saying edge lit is better than FALD, but the misconception that edge lit can't do HDR is false, as SpectraCal clearly stated.
player002 is offline  
post #93 of 405 Old 04-11-2016, 11:15 AM
AVS Forum Special Member
 
bp787's Avatar
 
Join Date: Feb 2005
Posts: 1,782
Mentioned: 19 Post(s)
Tagged: 0 Thread(s)
Quoted: 893 Post(s)
Liked: 619
Quote:
Originally Posted by King Richard View Post
Some of you might be interested in this new CNet article:

>>> HDR is TV's next big format war <<<

I somewhat disagree with the Title and premise of the article however.

I don't really see it as a "format war" (like Blu-ray vs HD-DVD or VHS vs Betamax) because I believe that both formats will continue to co-exist side-by-side like Dolby Audio and DTS Audio currently do.

Again, as I stated in a previous post, they are currently talking about the possibility of adding dynamic metadata to HDR10. So if/when this does happen, HDR10, depending on how well it is implemented, will probably look very similar to Dolby Vision. It will certainly narrow the "gap" between the two.


Richard


So, I generally agree with Richard here. A little background: I'm an Electrical and Software Engineer, and I also do full Systems Engineering. I often work with control systems, embedded processing, and video systems. I'm not a "TV" expert by any means. I like to watch them, though.


At its heart, Dolby Vision is a standard, plain and simple. It just so happens that DV is a standard that uses a bit of silicon to control the output, taking that part out of the manufacturers' hands. This can certainly be a good thing, but the trouble with standards is that there is no single implementation: every MFR can implement a standard differently, incorrectly, etc... I personally deal with this quite often. The only way any standard can be guaranteed is if the governing body does QC on it themselves. HDMI is a REALLY good example of how standards can be totally borked.


Regarding the display management component that has been soooo hotly contested in this little thread: without seeing an actual specification document, we can't really draw too many conclusions about what it is. From the Dolby white paper, which IMO is the only valid source on this topic, the DISPLAY MANAGER consists of the Display Composer. It's not clear what else makes up that piece, and this is highly likely to be a typo; for all intents and purposes, the white paper means Display Manager = DISPLAY MAPPER.


What is the Display Mapper? It is a tunable piece of gear that allows the TV manufacturer to adjust the output to meet the needs of the panel and the other components of the TV. From the graphic on page 10, the Display Management/Display Mapper muxes the actual video data stream with the metadata stream to create a correctly composited image plus metadata for the actual display. It doesn't "take control" of the TV itself; the manufacturer still has control of the output of the DV engine. (This tuning point is listed on page 11 of the white paper, FYI.)
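Since Dolby hasn't published the mapping algorithm, here's nothing more than a toy sketch of what "tuned to the panel" could mean in practice: take the mastered luminance, know the panel's limits (the part the manufacturer controls), and roll the highlights off so the mastered peak lands at the panel peak. Every name and the curve itself are invented for illustration; this is not Dolby's actual math.

```python
# Toy "display mapper" sketch: mastered luminance + panel limits -> output luminance.
# The knee/roll-off curve and all names here are invented for illustration; Dolby's
# real mapping algorithm is proprietary and is not described in the white paper.

def map_luminance(scene_nits, scene_max_nits, panel_max_nits, knee=0.75):
    """Map a mastered luminance value (in nits) into the panel's displayable range.

    Values below knee * panel_max_nits pass through untouched; everything above
    is compressed so that scene_max_nits lands exactly at panel_max_nits.
    """
    knee_nits = knee * panel_max_nits
    if scene_nits <= knee_nits or scene_max_nits <= panel_max_nits:
        return min(scene_nits, panel_max_nits)
    # compress [knee_nits, scene_max_nits] into [knee_nits, panel_max_nits]
    t = (scene_nits - knee_nits) / (scene_max_nits - knee_nits)
    return knee_nits + t * (panel_max_nits - knee_nits)

# A 4000-nit-mastered highlight shown on a hypothetical 600-nit panel:
print(round(map_luminance(3000, scene_max_nits=4000, panel_max_nits=600)))  # compressed highlight
print(round(map_luminance(300,  scene_max_nits=4000, panel_max_nits=600)))  # midtone passes through
```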


A really important thing to note is that this DV output could be completely ripped apart by ANY manufacturer and further tweaked, modified, etc...


DV also doesn't appear to have much to do with manufacturers' own internal engines that perform functions like upscaling, etc...


DV appears to HELP manufacturers achieve a standard HDR picture that they would otherwise have to spend a lot of time perfecting/integrating. I'd equate it to using an NVIDIA board in a computer versus developing your own graphics subsystem (like in the old days). In no way does it dictate how a display looks. It just helps the MFR get there...


I'd expect to see most high end MFRs have DV work in conjunction with their own engines to deliver the best possible PQ for a particular set.


The biggest reason I think a number of set makers won't include DV.... licensing cost. I haven't seen any indication of how much the license or the chipset will cost, but I bet it won't be cheap. Since HDR10 and DV aren't that far apart, I would guess that many larger companies won't bother with it for a while.




It's just my .02, but Dolby isn't taking over anything; they're providing a tool set. It can't hurt if it brings many of the sets into a similar realm of PQ, but it will in no way make all sets equal or look the same. Cheap materials, bad QC, and other factors will always trump this.
Egan, Tom Roper, cmdrdredd and 4 others like this.

Sony xbr65x850c Side Blooming test clips:
https://dl.dropboxusercontent.com/u/...br65x850c.xlsx
bp787 is offline  
post #94 of 405 Old 04-11-2016, 11:24 AM
AVS Forum Special Member
 
tmdorsey's Avatar
 
Join Date: Feb 2002
Posts: 1,148
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 457 Post(s)
Liked: 405
Edge-lit displays can't do HDR? LOL

Well, the HDR I've viewed on my edge-lit set has looked pretty damn good to me and a noticeable difference from regularly graded SDR content.
cmdrdredd and King Richard like this.
tmdorsey is offline  
post #95 of 405 Old 04-11-2016, 02:48 PM
AVS Forum Special Member
 
DreamWarrior's Avatar
 
Join Date: Nov 2005
Location: Dirty South Jersey
Posts: 2,177
Mentioned: 28 Post(s)
Tagged: 0 Thread(s)
Quoted: 993 Post(s)
Liked: 579
So, 94 posts later and we're all (except @alexanderg823 ) in agreement with posts 2 and 3, lol.
DreamWarrior is offline  
post #96 of 405 Old 04-11-2016, 02:56 PM - Thread Starter
 
alexanderg823's Avatar
 
Join Date: May 2015
Posts: 3,449
Mentioned: 16 Post(s)
Tagged: 0 Thread(s)
Quoted: 3128 Post(s)
Liked: 2544
Quote:
Originally Posted by tmdorsey View Post
Edge-lit displays can't do HDR? LOL

Well, the HDR I've viewed on my edge-lit set has looked pretty damn good to me and a noticeable difference from regularly graded SDR content.
If you think edge lit can really do hdr, you don't understand how hdr or lcd tech or edge lighting works.

Just because you see an image doesn't mean it's anything but smoke and mirrors.

I can get my old 720p tv to display a 1080p signal. Doesn't mean it's 1080p.

Anyone with basic knowledge understands why edge lit hdr is a con job for uninformed consumers.
6athome likes this.
alexanderg823 is offline  
post #97 of 405 Old 04-11-2016, 03:29 PM
AVS Forum Special Member
 
bruceames's Avatar
 
Join Date: May 2006
Location: San Rafael, CA
Posts: 1,427
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 538 Post(s)
Liked: 371
Quote:
Originally Posted by alexanderg823 View Post
If you think edge lit can really do hdr, you don't understand how hdr or lcd tech or edge lighting works.

Just because you see an image doesn't mean it's anything but smoke and mirrors.

I can get my old 720p tv to display a 1080p signal. Doesn't mean it's 1080p.

Anyone with basic knowledge understands why edge lit hdr is a con job for uninformed consumers.

I have both edge-lit and FALD HDR displays, and the HDR looks very similar and very good on both of them. Yes, edge lit can do HDR very well.
bruceames is offline  
post #98 of 405 Old 04-11-2016, 03:32 PM
AVS Forum Special Member
 
DreamWarrior's Avatar
 
Join Date: Nov 2005
Location: Dirty South Jersey
Posts: 2,177
Mentioned: 28 Post(s)
Tagged: 0 Thread(s)
Quoted: 993 Post(s)
Liked: 579
Quote:
Originally Posted by alexanderg823 View Post
If you think edge lit can really do hdr, you don't understand how hdr or lcd tech or edge lighting works.

Just because you see an image doesn't mean it's anything but smoke and mirrors.

I can get my old 720p tv to display a 1080p signal. Doesn't mean it's 1080p.

Anyone with basic knowledge understands why edge lit hdr is a con job for uninformed consumers.
I think the Sony 930D has gone a long way towards improving what edge-lighting can do, and as they tweak their approach I can only see it getting better. Certainly FALD will always outdo it, but with quality light guides, good software, and Sony's stacked edge-lights increasing the zone count, edge-lighting is edging (lol) closer to FALD.

That said, I'd still take a FALD display as the display's thickness doesn't matter to me.
DreamWarrior is offline  
post #99 of 405 Old 04-11-2016, 04:14 PM
AVS Forum Special Member
 
tmdorsey's Avatar
 
Join Date: Feb 2002
Posts: 1,148
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 457 Post(s)
Liked: 405
The real reason sony and Samsung aren't using dolby vision

Quote:
Originally Posted by alexanderg823 View Post
If you think edge lit can really do hdr, you don't understand how hdr or lcd tech or edge lighting works.

Just because you see an image doesn't mean it's anything but smoke and mirrors.

I can get my old 720p tv to display a 1080p signal. Doesn't mean it's 1080p.

Anyone with basic knowledge understands why edge lit hdr is a con job for uninformed consumers.


Well I better contact a lawyer to get my class action lawsuit prepared.

:rolleyes:

Just because a FALD set may do it better doesn't mean an edge-lit set can't do it.
tmdorsey is offline  
post #100 of 405 Old 04-11-2016, 04:43 PM
AVS Forum Special Member
 
bp787's Avatar
 
Join Date: Feb 2005
Posts: 1,782
Mentioned: 19 Post(s)
Tagged: 0 Thread(s)
Quoted: 893 Post(s)
Liked: 619
Quote:
Originally Posted by alexanderg823 View Post
If you think edge lit can really do hdr, you don't understand how hdr or lcd tech or edge lighting works.

Just because you see an image doesn't mean it's anything but smoke and mirrors.

I can get my old 720p tv to display a 1080p signal. Doesn't mean it's 1080p.

Anyone with basic knowledge understands why edge lit hdr is a con job for uninformed consumers.
Just to pile on here... your comparison is flawed. If a TV is displaying HDR and is capable of displaying 10-bit data with a wide color gamut, then it's doing what it's supposed to be doing. The TV isn't modifying the HDR10 metadata or signal to display the image. There is a limitation with edge lighting, sure, but that doesn't mean it's not displaying HDR. This is quite different from upscaling a 720p image, which modifies the actual video by adding additional pixels between the existing ones.

And technically, only the following has to be met to be HDR compliant:
Quote:
High Dynamic Range
SMPTE ST2084 EOTF
A combination of peak brightness and black level either:
More than 1000 nits peak brightness and less than 0.05 nits black level
OR
More than 540 nits peak brightness and less than 0.0005 nits black level
Only the 1000-nit figure is an issue for certification purposes, and the TV really only has to do the ST 2084 EOTF.
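For what it's worth, those two allowed combinations are easy to restate as a quick check; the measurement numbers in the examples below are made up, this is just the criteria from the quote above in code form:

```python
# The two allowed peak-brightness / black-level combinations from the quote above,
# restated as a quick check. The example measurements are made up for illustration.

def meets_premium_hdr(peak_nits, black_nits):
    lcd_path  = peak_nits > 1000 and black_nits < 0.05     # the route typical LCDs take
    oled_path = peak_nits > 540  and black_nits < 0.0005   # the route typical OLEDs take
    return lcd_path or oled_path

print(meets_premium_hdr(peak_nits=1100, black_nits=0.04))    # True  (bright LCD-style set)
print(meets_premium_hdr(peak_nits=700,  black_nits=0.0003))  # True  (OLED-style set)
print(meets_premium_hdr(peak_nits=600,  black_nits=0.03))    # False (falls between the two)
```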

Sony xbr65x850c Side Blooming test clips:
https://dl.dropboxusercontent.com/u/...br65x850c.xlsx
bp787 is offline  
post #101 of 405 Old 04-11-2016, 06:55 PM
 
player002's Avatar
 
Join Date: Dec 2008
Posts: 6,246
Mentioned: 197 Post(s)
Tagged: 1 Thread(s)
Quoted: 3935 Post(s)
Liked: 2944
Quote:
Originally Posted by alexanderg823 View Post
If you think edge lit can really do hdr, you don't understand how hdr or lcd tech or edge lighting works.

Just because you see an image doesn't mean it's anything but smoke and mirrors.

I can get my old 720p tv to display a 1080p signal. Doesn't mean it's 1080p.

Anyone with basic knowledge understands why edge lit hdr is a con job for uninformed consumers.
Well, you need to tell SpectraCal that they have no idea what they are talking about and should instead listen to the armchair calibration expert that you are.
jhinesjr, KidHorn, bp787 and 2 others like this.
player002 is offline  
post #102 of 405 Old 04-11-2016, 09:10 PM
AVS Forum Special Member
 
jjackkrash's Avatar
 
Join Date: Nov 2010
Posts: 5,276
Mentioned: 40 Post(s)
Tagged: 0 Thread(s)
Quoted: 2223 Post(s)
Liked: 2331
Quote:
Originally Posted by player002 View Post
The only difference is the tone mapping, which Dolby will keep for themselves. Much fail on the OP and lots of misinformation. The truth is that DV and HDR10 behave and look very similar; personally I take SpectraCal's observations and info over the OP's. The OP used to say edge lit can't do HDR, yet SpectraCal says this is nonsense and that edge lit does HDR just fine.
I have listened to this several times as well as a few other podcasts with Tyler. What I get out of this is that DV tone mapping is critical because without it there is no way to calibrate a set to HDR standards because HDR10/DV is graded in a Rec 2020 container, which none of the current sets can do. The tone mapping allows some semblance of accuracy for a set that falls short of Rec 2020. Without tone mapping, the sets are all over the place and wildly inaccurate. Also, it is still up in the air whether any of the current HDR10 sets will be upgradable when HDR10 starts using active/dynamic meta data. So while HDR10 sets--including edge lit sets--can play HDR10 content, there is really no way to predict whether the sets are playing the content accurately.
Madmax67 likes this.

Last edited by jjackkrash; 04-11-2016 at 09:19 PM.
jjackkrash is offline  
post #103 of 405 Old 04-11-2016, 10:54 PM
AVS Forum Special Member
 
DreamWarrior's Avatar
 
Join Date: Nov 2005
Location: Dirty South Jersey
Posts: 2,177
Mentioned: 28 Post(s)
Tagged: 0 Thread(s)
Quoted: 993 Post(s)
Liked: 579
Quote:
Originally Posted by jjackkrash View Post
I have listened to this several times as well as a few other podcasts with Tyler. What I get out of this is that DV tone mapping is critical because without it there is no way to calibrate a set to HDR standards because HDR10/DV is graded in a Rec 2020 container, which none of the current sets can do. The tone mapping allows some semblance of accuracy for a set that falls short of Rec 2020. Without tone mapping, the sets are all over the place and wildly inaccurate. Also, it is still up in the air whether any of the current HDR10 sets will be upgradable when HDR10 starts using active/dynamic meta data. So while HDR10 sets--including edge lit sets--can play HDR10 content, there is really no way to predict whether the sets are playing the content accurately.
Graded into a Rec 2020 container that no current monitor, including those the studios use (to the best of my knowledge), can display. So... I still don't completely get it. If the studios don't push the colors out further than their own monitors can display, then at the moment I don't see why it makes a difference that consumer monitors can't display them....

That said, I suppose there is still value in DV's standardizing the way tone-mapping is done and giving the manufacturers a canned method to perform it. This will help get us at home closer to what the studios intended than whatever "special-sauce" the manufacturers like to add to their color profiles. However, I still fail to see how an HDR10 set can't be calibrated to perform within the set's capabilities and still look just as good, assuming, of course, that the manufacturer gives us an appropriate set of calibration controls to do so.
jjackkrash and King Richard like this.

Last edited by DreamWarrior; 04-11-2016 at 10:57 PM.
DreamWarrior is offline  
post #104 of 405 Old 04-11-2016, 11:18 PM
Lionheart of AVS
 
King Richard's Avatar
 
Join Date: Oct 2015
Location: Greenstone, Ontario, Canada
Posts: 3,287
Mentioned: 86 Post(s)
Tagged: 0 Thread(s)
Quoted: 1797 Post(s)
Liked: 4367
Quote:
Originally Posted by jjackkrash View Post
DV tone mapping is critical because without it there is no way to calibrate a set to HDR standards because HDR10/DV is graded in a Rec 2020 container, which none of the current sets can do. The tone mapping allows some semblance of accuracy for a set that falls short of Rec 2020. Without tone mapping, the sets are all over the place and wildly inaccurate.

Let's see if I can shed a bit of light on this...

Dolby Vision content is first graded and mastered on a 42" Dolby Pulsar Monitor, which is a 4000-nit LCD monitor with a DCI (Digital Cinema Initiatives) P3 Color Gamut. (P3 covers 72.9% of the Rec.2020 Color Space.)

For a Dolby Vision Theater release, they then do a "trim pass" on a Christie Laser Projector that is used in Dolby Vision Theaters.

For a Home Video release, they use the "full" 4000-nit mastered version, which is then "mapped", using Dolby's proprietary "tone mapper" and "dynamic metadata" (Dolby's "secret sauce"), to the capabilities of the specific Dolby Vision enabled TV. Those capabilities are described by what Dolby calls the TV's "golden reference values", a set of parameters the tone mapper uses to map the Dolby Vision content down to what the specific home display can do. The capabilities of the TV are in large part determined by its dynamic range and maximum nit values.

Yes, a Rec.2020 Color Space container is used for the Home Video release... However, the original Dolby Vision content is mastered to a DCI P3 Color Gamut. Note: no current consumer display can reach the full Rec.2020 Color Gamut - and as far as I know, neither can the Dolby Pulsar Monitor used to grade and master the content.


Now getting back to this whole HDR10 vs. Dolby Vision debate, as I stated in my very first post in this thread, "HDR10 and Dolby Vision are actually very similar. They both use SMPTE ST-2084 PQ EOTF (Perceptual Quantizer Electro-Optical Transfer Function) which is a logarithmic-like curve that replaces the gamma curve in image encoding."
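For anyone who wants to see what that PQ curve actually is: unlike Dolby's mapper, ST 2084 is fully public, and the EOTF is simple enough to write out. Here it is as a small Python function (my own transcription of the published constants, shown purely for illustration):

```python
# SMPTE ST 2084 (PQ) EOTF: normalized code value in [0, 1] -> absolute luminance in nits.
# The constants below are the ones published in the standard.

m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(code):
    """Map a normalized PQ code value (0.0 - 1.0) to luminance in cd/m^2 (nits)."""
    p = code ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(round(pq_eotf(0.508), 1))   # ~100 nits (roughly SDR reference white)
print(round(pq_eotf(0.752), 1))   # ~1000 nits
print(round(pq_eotf(1.0)))        # 10000 nits, the ceiling of the PQ container
```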

Again, the main difference between HDR10 and Dolby Vision HDR is that Dolby uses "dynamic metadata" while HDR10 uses "static metadata" (SMPTE ST-2086). However, there is talk of adding "dynamic metadata" (SMPTE ST-2094) to HDR10 in the future. Whether that will require a hardware upgrade or can be implemented via a firmware update remains to be seen.

And as I said before: "If/when this does happen, HDR10, depending on how well this is implemented, will probably look very similar to Dolby Vision. It will certainly narrow the "gap" between the two."


Richard
Egan, Tom Roper, bp787 and 4 others like this.

Last edited by King Richard; 05-16-2016 at 09:11 PM.
King Richard is offline  
post #105 of 405 Old 04-12-2016, 06:22 AM
Senior Member
 
GatorJZ's Avatar
 
Join Date: Oct 2007
Posts: 245
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 54 Post(s)
Liked: 61
Quote:
Originally Posted by alexanderg823 View Post
I didn't say it makes all sets equal. Never said that.

I said it makes all sets with the same hardware specs equal, since there's no proprietary processing software on brand X's TV to spice the picture up.

And that's what the big boys are scared of.

The Sony X940C is a 128-zone, 600-nit full-white LCD with a 5000:1 panel. Sound familiar? Those are the same specs as a Vizio P that retails for thousands less.
Uhmm, that might be because Sony and Vizio now buy most of their panels from the same manufacturer.

Samsung UN50HU8550, Vizio P602ui-B3 & Vizio M43‑C1.
GatorJZ is offline  
post #106 of 405 Old 04-12-2016, 06:37 AM
AVS Forum Special Member
 
jjackkrash's Avatar
 
Join Date: Nov 2010
Posts: 5,276
Mentioned: 40 Post(s)
Tagged: 0 Thread(s)
Quoted: 2223 Post(s)
Liked: 2331
Quote:
Originally Posted by King Richard View Post
Now getting back to this whole HDR10 vs. Dolby Vision debate, as I stated in my very first post in this thread, "HDR10 and Dolby Vision are actually very similar. They both use SMPTE ST-2084 PQ EOTF (Perceptual Quantizer Electro-Optical Transfer Function) which is a logarithmic-like curve that replaces the gamma curve in image encoding."

Again, the main difference between HDR10 and Dolby Vision HDR is that Dolby uses "dynamic metadata" while HDR10 uses "static metadata" (SMPTE ST-2086). However, there is talk of adding "dynamic metadata" (SMPTE ST-2094) to HDR10 in the future. Now whether this will require a Hardware upgrade (very likely) or can be implemented via a Firmware update (unlikely) remains to be seen.

And as I said before: "If/when this does happen, HDR10, depending on how well this is implemented, will probably look very similar to Dolby Vision. It will certainly narrow the "gap" between the two."


Richard

I appreciate the detailed explanation, as I am trying to learn a bit more about this stuff to make an informed purchasing decision on my next set. Is it fair to say that, currently, a calibrator is more likely to get an accurate calibration on a DV set, with its golden reference values, than on a static HDR10 set without them?

I get the sense that a lot of people here are saying HDR10 is "just as good" because of what it will do in the future, once it is upgraded to dynamic metadata, but this seems to discount the strong likelihood that no current set will be able to take advantage of that feature. So, for current purchasing decisions, it seems to me that HDR10 is not as good as DV. This is what I am trying to assess, because if current HDR10 sets are or will be "just as good", I would likely pull the trigger on a Sony 940D. If not, I will likely wait a year or two, or break down and get the Vizio with DV.

For example, Stacey Spears recommends waiting for one to two years before a purchase decision if you intend to keep the set more than one to two years. (see post #29):

https://www.avsforum.com/forum/138-av...-p-series.html
King Richard likes this.
jjackkrash is offline  
post #107 of 405 Old 04-12-2016, 06:41 AM
AVS Forum Special Member
 
jjackkrash's Avatar
 
Join Date: Nov 2010
Posts: 5,276
Mentioned: 40 Post(s)
Tagged: 0 Thread(s)
Quoted: 2223 Post(s)
Liked: 2331
Also, on a related note, I have seen lots of anecdotal reports that the same HDR content "looks great" or was "blown out" or "looked like crap" or was "dull" on different sets, which also leads me to believe that different sets capable of playing HDR10 are simply not playing the content accurately. So that's another concern I have.
Tom Roper likes this.
jjackkrash is offline  
post #108 of 405 Old 04-12-2016, 06:56 AM
AVS Forum Special Member
 
bp787's Avatar
 
Join Date: Feb 2005
Posts: 1,782
Mentioned: 19 Post(s)
Tagged: 0 Thread(s)
Quoted: 893 Post(s)
Liked: 619
Quote:
Originally Posted by jjackkrash View Post
I appreciate the detailed explanation, as I am trying to learn a bit more about this stuff to make an informed purchasing decision on my next set. Is it fair to say that, currently, a calibrator is more likely to get an accurate calibration on a DV set, with its golden reference values, than on a static HDR10 set without them?

I get the sense that a lot of people here are saying HDR10 is "just as good" because of what it will do in the future, once it is upgraded to dynamic metadata, but this seems to discount the strong likelihood that no current set will be able to take advantage of that feature. So, for current purchasing decisions, it seems to me that HDR10 is not as good as DV. This is what I am trying to assess, because if current HDR10 sets are or will be "just as good", I would likely pull the trigger on a Sony 940D. If not, I will likely wait a year or two, or break down and get the Vizio with DV.

For example, Stacey Spears recommends waiting for one to two years before a purchase decision if you intend to keep the set more than one to two years. (see post #29):

https://www.avsforum.com/forum/138-av...-p-series.html
I doubt that the difference between a good HDR10-based set and a DV set will be all that significant. In two years there will be something new and you'll ask "should I wait another two years?", and so on. HDR10-based sets could easily get upgraded with dynamic metadata support via software. HDR10 sets, however, cannot be upgraded to DV unless they already contain the necessary chipsets.
King Richard likes this.

Sony xbr65x850c Side Blooming test clips:
https://dl.dropboxusercontent.com/u/...br65x850c.xlsx
bp787 is offline  
post #109 of 405 Old 04-12-2016, 07:08 AM
AVS Forum Special Member
 
bp787's Avatar
 
Join Date: Feb 2005
Posts: 1,782
Mentioned: 19 Post(s)
Tagged: 0 Thread(s)
Quoted: 893 Post(s)
Liked: 619
Quote:
Originally Posted by jjackkrash View Post
Also, on a related note, I have seen lots of anecdotal reports that the same HDR content "looks great" or was "blown out" or "looked like crap" or was "dull" on different sets, which also leads me to believe that different sets capable of playing HDR10 are simply not playing the content accurately. So that's another concern I have.
There are soooo many factors to take into consideration with people's "opinions" on the internet.
What are their settings?
What source?
Dull, blown out, etc... those descriptors are tainted by the person's preferences. Maybe they watch their sets in Torch/Best Buy mode and like that; HDR kills that off completely on most sets when it's being viewed.

HDR on an Element-brand set might not look like HDR on a Sony/Samsung, though. Again, the manufacturers can do a lot to screw up ANYTHING. Just gotta look at it and see if you like it on that set.
Tom Roper and King Richard like this.

Sony xbr65x850c Side Blooming test clips:
https://dl.dropboxusercontent.com/u/...br65x850c.xlsx
bp787 is offline  
post #110 of 405 Old 04-12-2016, 07:12 AM
AVS Forum Special Member
 
bruceames's Avatar
 
Join Date: May 2006
Location: San Rafael, CA
Posts: 1,427
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 538 Post(s)
Liked: 371
Quote:
Originally Posted by GatorJZ View Post
Uhmm, that might be because Sony and Vizio now buy most of their panels from the same manufacturer.
Both the 940C and Vizio have about equal full-on brightness. But that's where the similarities end. The 940C has 3D and gets around 1250 nits in a 10 percent window, while the Vizio doesn't get any brighter in a smaller window area (actually it gets worse when local dimming is enabled).
King Richard likes this.
bruceames is offline  
post #111 of 405 Old 04-12-2016, 07:18 AM
AVS Forum Special Member
 
bruceames's Avatar
 
Join Date: May 2006
Location: San Rafael, CA
Posts: 1,427
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 538 Post(s)
Liked: 371
Quote:
Originally Posted by jjackkrash View Post
So, for current purchasing decisions, it seems to me that HDR10 is not as good as DV. This is what I am trying to assess, because if current HDR10 sets are or will be "just as good", I would likely pull the trigger on a Sony 940D. If not, I will likely wait for a year or two or break down an get the Vizio with DV.
DV is of course better, but you also have to consider the display. DV is only as good as the display it's played on. DV content is mastered at up to 4000 nits but no DV TV comes close to that. So the DV content is remapped according to the capabilities of the display. In the case of Vizio, that is around 600 nits full white and even less than that for highlights.

For HDR10 it's 1000 nits, and the HDR10 content does not take the capabilities of the display into account, so HDR TVs with only 400 nits peak luminance will not look very impressive, especially in a lit room. But for a TV that can deliver the full peak luminance, you will get the full benefit of HDR10.

So given real world conditions, what is really better: DV on a 600 nit display or HDR10 on a 1000 nit display? The former is being limited by the display while the latter is not. It's a good question. When the Vizio gets updated for HDR10, the HDR10 on it will probably not look as good as the HDR10 on a UHD Premium display because it only gets up to 600 nits (and it may be worse than that if local dimming is used). And who knows when we will see DV players and discs. I know the world is transitioning to streaming but I for one will be enjoying HDR exclusively on disc.
bruceames is offline  
post #112 of 405 Old 04-12-2016, 07:26 AM - Thread Starter
 
alexanderg823's Avatar
 
Join Date: May 2015
Posts: 3,449
Mentioned: 16 Post(s)
Tagged: 0 Thread(s)
Quoted: 3128 Post(s)
Liked: 2544
Quote:
Originally Posted by player002 View Post
Well, you need to tell SpectraCal that they have no idea what they are talking about and should instead listen to the armchair calibration expert that you are.
I don't think SpectraCal would disagree with me.


You're confusing the ability to decode something with the ability to actually display it properly.


There's simply no way for an edge lit unit to accurately light up highlights without distorting the surrounding areas unless it defies the laws of physics. This has been explained to you over and over and over again. And it's not a difficult concept to grasp either.
dancolt likes this.
alexanderg823 is offline  
post #113 of 405 Old 04-12-2016, 07:52 AM
AVS Forum Special Member
 
jjackkrash's Avatar
 
Join Date: Nov 2010
Posts: 5,276
Mentioned: 40 Post(s)
Tagged: 0 Thread(s)
Quoted: 2223 Post(s)
Liked: 2331
Quote:
Originally Posted by bruceames View Post
DV is of course better, but you also have to consider the display. DV is only as good as the display it's played on. DV content is mastered at up to 4000 nits but no DV TV comes close to that. So the DV content is remapped according to the capabilities of the display. In the case of Vizio, that is around 600 nits full white and even less than that for highlights.

For HDR10 it's 1000 nits, and the HDR10 content does not take the capabilities of the display into account, so HDR TVs with only 400 nits peak luminance will not look very impressive, especially in a lit room. But for a TV that can deliver the full peak luminance, you will get the full benefit of HDR10.

So given real world conditions, what is really better: DV on a 600 nit display or HDR10 on a 1000 nit display? The former is being limited by the display while the latter is not. It's a good question. When the Vizio gets updated for HDR10, the HDR10 on it will probably not look as good as the HDR10 on a UHD Premium display because it only gets up to 600 nits (and it may be worse than that if local dimming is used). And who knows when we will see DV players and discs. I know the world is transitioning to streaming but I for one will be enjoying HDR exclusively on disc.
Thanks for this explanation, this helps.
jjackkrash is offline  
post #114 of 405 Old 04-12-2016, 08:03 AM
 
player002's Avatar
 
Join Date: Dec 2008
Posts: 6,246
Mentioned: 197 Post(s)
Tagged: 1 Thread(s)
Quoted: 3935 Post(s)
Liked: 2944
Quote:
Originally Posted by alexanderg823 View Post
I don't think SpectraCal would disagree with me.


You're confusing the ability to decode something with the ability to actually display it properly.


There's simply no way for an edge lit unit to accurately light up highlights without distorting the surrounding areas unless it defies the laws of physics. This has been explained to you over and over and over again. And it's not a difficult concept to grasp either.
Dude, this is just the difference between FALD and edge lit. FALD will always have better dimming, period. That's like saying your TV is not a true HD TV because it can't dim as accurately as a full array. The edge-lit sets handle the metadata and display HDR correctly, to the set's capabilities; as long as the edge-lit set can control zones it's fine. An edge-lit set with no zones is obviously going to have light bleed and uniformity issues, but the top-end edge-lit sets have many zones and do a good job. FALD has better highlights because its dimming is better, but the top-tier edge-lit sets do not have the horrible light bleed you describe; they have improved from the days of old.

SpectraCal does disagree with you. Watch the video, bro; they clearly say that people wrongly claim edge-lit sets can't do HDR. It doesn't get much clearer than that. Once again, the 2016 edge-lit sets from Samsung are UHD certified for premium HDR. So no, you are wrong; HDR is being displayed correctly on edge lit, or how else did it get certified at more than 1000 nits peak brightness and less than 0.05 nits black level? Full array will look better for highlights as it has superior dimming, but saying edge lit can't do HDR is false, especially with the new models this year THAT ARE UHD CERTIFIED!!!
player002 is offline  
post #115 of 405 Old 04-12-2016, 08:11 AM
AVS Forum Special Member
 
bruceames's Avatar
 
Join Date: May 2006
Location: San Rafael, CA
Posts: 1,427
Mentioned: 2 Post(s)
Tagged: 0 Thread(s)
Quoted: 538 Post(s)
Liked: 371
Quote:
Originally Posted by player002 View Post
Dude, this is just the difference between FALD and edge lit. FALD will always have better dimming, period. That's like saying your TV is not a true HD TV because it can't dim as accurately as a full array. The edge-lit sets handle the metadata and display HDR correctly, to the set's capabilities; as long as the edge-lit set can control zones it's fine. An edge-lit set with no zones is obviously going to have light bleed and uniformity issues, but the top-end edge-lit sets have many zones and do a good job. FALD has better highlights because its dimming is better, but the top-tier edge-lit sets do not have the horrible light bleed you describe; they have improved from the days of old.

SpectraCal does disagree with you. Watch the video, bro; they clearly say that people wrongly claim edge-lit sets can't do HDR. It doesn't get much clearer than that. Once again, the 2016 edge-lit sets from Samsung are UHD certified for premium HDR. So no, you are wrong; HDR is being displayed correctly on edge lit, or how else did it get certified at more than 1000 nits peak brightness and less than 0.05 nits black level? Full array will look better for highlights as it has superior dimming, but saying edge lit can't do HDR is false, especially with the new models this year THAT ARE UHD CERTIFIED!!!
Exactly. The difference in HDR quality between edge lit and FALD is proportional to the quality difference between regular content viewed on those displays.
bruceames is offline  
post #116 of 405 Old 04-12-2016, 09:07 AM
Super Moderator
 
markrubin's Avatar
 
Join Date: May 2001
Location: Jersey Shore
Posts: 21,158
Mentioned: 60 Post(s)
Tagged: 0 Thread(s)
Quoted: 1800 Post(s)
Liked: 3579
posts deleted

please take the high road in every post: do not respond to or quote a problematic post; report it
HDMI.org: what a mess. HDCP = Hollywood's Draconian Copy Protection system
LG C9 OLED owner


markrubin is offline  
post #117 of 405 Old 04-12-2016, 11:05 AM
AVS Forum Special Member
 
DreamWarrior's Avatar
 
Join Date: Nov 2005
Location: Dirty South Jersey
Posts: 2,177
Mentioned: 28 Post(s)
Tagged: 0 Thread(s)
Quoted: 993 Post(s)
Liked: 579
Quote:
Originally Posted by jjackkrash View Post
I appreciate the detailed explanation, as I am trying to learn a bit more about this stuff to make an informed purchasing decision on my next set. Is it fair to say that, currently, a calibrator is more likely to get an accurate calibration on a DV set, with its golden reference values, than on a static HDR10 set without them?

I get the sense that a lot of people here are saying HDR10 is "just as good" because of what it will do in the future, once it is upgraded to dynamic metadata, but this seems to discount the strong likelihood that no current set will be able to take advantage of that feature. So, for current purchasing decisions, it seems to me that HDR10 is not as good as DV. This is what I am trying to assess, because if current HDR10 sets are or will be "just as good", I would likely pull the trigger on a Sony 940D. If not, I will likely wait a year or two, or break down and get the Vizio with DV.

For example, Stacey Spears recommends waiting for one to two years before a purchase decision if you intend to keep the set more than one to two years. (see post #29):

https://www.avsforum.com/forum/138-av...-p-series.html
In addition to what @bruceames mentioned in post #111, I'd like to ask about the dynamic metadata part of the equation. I'm not in the industry, so I can't really say, beyond the marketing, what practical advantage it adds. However, it sounds like the point of it is to regrade based on each scene's overall levels, while HDR10 has only one static set of metadata characteristics describing the grade of the entire movie.

How much benefit does it offer the studio to change the metadata on a scene-to-scene basis? At the end of the day, wouldn't the entire film be viewed on a single monitor front-to-back? Couldn't the HDR10 static metadata reflect that environment with similar intent?

Moreover, I think I heard that the dynamic metadata becomes more valuable as the capabilities of the mastering monitor drift further from those of the playback monitor. Right now, the best HDR10 sets are probably closer to the mastering equipment than any of the DV LCD sets are to theirs. So maybe they are using DV to help them overcome this deficit? My layman's reading is that tone-mapping can be performed more accurately with dynamic metadata than with a single set of static metadata, but if the set is closer to the original mastering set, less tone mapping needs to be done anyway, so the static metadata is probably "good enough".
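To make that concrete, here's a deliberately simplified toy example of the difference (the panel peak, scene peaks, and the naive linear scaling are all made up; real tone mapping is far more sophisticated): with one static value for the whole title, every scene gets squeezed as if it contained the brightest highlight on the disc, whereas per-scene metadata only compresses the scenes that actually need it.

```python
# Toy illustration of static vs. per-scene (dynamic) metadata. The numbers, names
# and the naive linear scaling are invented purely to show the idea: a static tone
# map has to assume the brightest scene on the whole disc, while a dynamic one only
# has to fit the scene it is actually in.

PANEL_MAX = 600.0                       # hypothetical panel peak, in nits
SCENE_PEAKS = [120.0, 800.0, 4000.0]    # hypothetical per-scene mastered peaks
TITLE_PEAK = max(SCENE_PEAKS)           # what a single static (MaxCLL-style) value would carry

def scale_factor(mastered_peak, panel_max=PANEL_MAX):
    """Naive linear factor to fit a mastered peak into the panel's range."""
    return min(1.0, panel_max / mastered_peak)

for peak in SCENE_PEAKS:
    static_scale = scale_factor(TITLE_PEAK)    # same compression applied to every scene
    dynamic_scale = scale_factor(peak)         # compression only where the scene needs it
    print(f"scene peak {peak:6.0f} nits   static x{static_scale:.2f}   dynamic x{dynamic_scale:.2f}")
```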

That said, DV does seem more future-proof in-so-much as you'll get a better experience on lesser equipment as the mastering equipment becomes more capable and your set stays as-is. This is, I suppose, a benefit. However, I'd still choose a more capable HDR10 set, as I'd want to upgrade anyway if the mastering sets become sufficiently more capable, because I'd rather see the original than a "close" tone-mapped display.
Quote:
Originally Posted by bp787 View Post
*snip*
HDR10 based sets could easily get upgraded with dynamic metadata support via software. *snip*
Not sure this is true. If you read the specs it says the ability to transmit HDR10 dynamic metadata is coming to HDMI 2.1. Considering HDMI's history, the upgrade from 2.0a -> 2.1 will probably require a new SOC since those greedy pricks can't seem to separate what should be a firmware upgrade from a hardware upgrade. So, to get the added functionality for dynamic metadata, they may force you to buy equipment containing a new 2.1 capable SOC -- this may extend all the way back to your receiver!

To that, I say *middle fingers* and probably won't be upgrading a thing this year unless a piece of my system dies unexpectedly.
DreamWarrior is offline  
post #118 of 405 Old 04-12-2016, 11:38 AM
AVS Forum Special Member
 
bp787's Avatar
 
Join Date: Feb 2005
Posts: 1,782
Mentioned: 19 Post(s)
Tagged: 0 Thread(s)
Quoted: 893 Post(s)
Liked: 619
Quote:
Originally Posted by DreamWarrior View Post

Not sure this is true. If you read the specs it says the ability to transmit HDR10 dynamic metadata is coming to HDMI 2.1. Considering HDMI's history, the upgrade from 2.0a -> 2.1 will probably require a new SOC since those greedy pricks can't seem to separate what should be a firmware upgrade from a hardware upgrade. So, to get the added functionality for dynamic metadata, they may force you to buy equipment containing a new 2.1 capable SOC -- this may extend all the way back to your receiver!

To that, I say *middle fingers* and probably won't be upgrading a thing this year unless a piece of my system dies unexpectedly.
That was from a software perspective. Right now I don't know of any physical limitations on HDR10 metadata to say one way or the other. And yes, all ends of the equipment would need to be "flashable" to support that dynamic metadata, even just from a software perspective.

Sony xbr65x850c Side Blooming test clips:
https://dl.dropboxusercontent.com/u/...br65x850c.xlsx
bp787 is offline  
post #119 of 405 Old 04-12-2016, 11:42 AM
AVS Forum Special Member
 
jjackkrash's Avatar
 
Join Date: Nov 2010
Posts: 5,276
Mentioned: 40 Post(s)
Tagged: 0 Thread(s)
Quoted: 2223 Post(s)
Liked: 2331
DreamWarrior, good post, I appreciate it.
DreamWarrior likes this.
jjackkrash is offline  
post #120 of 405 Old 04-12-2016, 01:00 PM
AVS Forum Special Member
 
DreamWarrior's Avatar
 
Join Date: Nov 2005
Location: Dirty South Jersey
Posts: 2,177
Mentioned: 28 Post(s)
Tagged: 0 Thread(s)
Quoted: 993 Post(s)
Liked: 579
Quote:
Originally Posted by bp787 View Post
That was from a software perspective. Right now I don't know of any physical limitations on HDR10 metadata to say one way or the other. And yes, all ends of the equipment would need to be "flashable" to support that dynamic metadata, even just from a software perspective.
I don't believe the HDMI physical layer would require upgrading to support the metadata. However, for some reason the HDMI community seems to find ways to force their customers to buy a new unit to get support for new features. I don't know if that's by design, or if in the past at least one feature added between specifications really did change the physical layer (e.g. ARC/CEC using new pins on the connector, or additional bandwidth added for new formats), so that it was easier just to force an upgrade than to sort out which features are software-upgradeable versus hardware+software and deal with all the potential incompatibilities that would cause (not that HDMI is without incompatibilities, lol).

Either way, I have little confidence that HDMI 2.1 will be released and current HDMI SOCs will be upgradeable to support it, regardless of whether the existing HDMI PHY can.

I would love to think that the responsibility to tx/rx on the PHY is all the HDMI chip does and the rest is "just software". However, it doesn't appear that's how manufacturers are going about implementing HDMI. They appear to be using a SOC with more capability than just tx/rx; maybe to avoid writing their own protocol handling software. Do those SOC vendors provide firmware upgrades to support new features? Have we seen any of them do it yet? I think the answer is, "no".

It seems, despite the undue hardware churn it creates, a SOC vendor is happy to sell new chips every version upgrade rather than providing firmware updates to the manufacturers to implement new features in their units. I won't support that behavior with my wallet. Especially since, as a software engineer, I know there is a clear separation between the limits of the PHY and what the software sends over it.
bp787 and King Richard like this.
DreamWarrior is offline  