HDR10+ at CES 2018 - AVS Forum | Home Theater Discussions And Reviews
post #1 of 47 Old 01-22-2018, 12:36 PM - Thread Starter
AVS Forum Special Member
 
Scott Wilkinson's Avatar
 
Join Date: Sep 2001
Location: Burbank, CA
Posts: 3,258
HDR10+ at CES 2018

HDR10+ gained momentum at CES 2018, leading me to believe it can be a viable alternative to Dolby Vision.

https://www.avsforum.com/hdr10plus-ces-2018
Manni01 likes this.
Scott Wilkinson is offline  
post #2 of 47 Old 01-22-2018, 12:56 PM
AVS Forum Special Member
 
JohnAV's Avatar
 
Join Date: Jun 2007
Location: SF Bay Area
Posts: 7,166
Quote:
Instead, Samsung came up with a different solution: It added dynamic metadata to HDR10, calling it HDR10+. What’s the difference between HDR10+ and Dolby Vision? HDR10+ is an open, royalty-free standard—now codified as SMPTE ST 2094—that requires no licensing fee to implement. (Another difference is that Dolby Vision content starts out with 12-bit precision, while HDR10+ starts out with 10-bit precision.)
Per http://www.hdr10plus.org/press-release/

Quote:
Key aspects of the license program will include:
  • Benefits for device manufacturers (e.g., TV, UHD Blu-ray, OTT STB, etc.), content distribution services providers, SoC manufacturers, content publishers, and content creation tool providers.
  • No per unit royalty.
  • A nominal annual administration fee for device manufacturers, SoC manufacturers and content distribution service providers.
  • Technical specification, test specification, HDR10+ logo/logo guide, patents from the three companies directly related to the technical specification and the test specification.
  • Certification for devices will be performed by a third-party, authorized testing center
Dolby Vision? HDR10+? Dolby makes the case that both can coexist - Tech Radar
This is a somewhat biased interview with Dolby's SVP of Consumer Entertainment at CES 2018, but to be candid, it did have this interesting comment.

Quote:
What Dolby Vision and HDR10+ both do is use dynamic metadata to change those attributes on a moment-to-moment basis. This means that scenes can be more dynamic with one scene reaching peak black levels of the TV, with the next doing the same for brightness.

The difference between them, and the point Baker wants to make abundantly clear, is that, for Dolby Vision, all of that metadata is created by hand by colorists and editors at the movie studio. HDR10+’s metadata, meanwhile, is produced by an upscaling algorithm. This algorithm-based mode for content generation takes a lot of work off the editors and reduces time in development, which, Baker admitted, he could understand that being seen as a positive trait.
If you search a bit more about HDR10+, it actually uses scene-by-scene and then frame-by-frame metadata. Whether that makes much of a difference for videophiles has yet to be determined, as we have no A-vs-B comparisons.
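On the bit-depth difference quoted above, here is a minimal sketch (mine, not from the article) using the SMPTE ST 2084 PQ curve that these HDR formats are built on, showing why 12-bit quantization steps are roughly 4x finer than 10-bit:

```python
# Sketch of the 10-bit vs 12-bit precision point, using the SMPTE ST 2084
# (PQ) EOTF that HDR10/HDR10+ and Dolby Vision are built on.
# Constants below are from the ST 2084 specification.

m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(e: float) -> float:
    """Map a normalized PQ signal (0..1) to luminance in nits (cd/m^2)."""
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def step_nits(code: int, bits: int) -> float:
    """Luminance jump between adjacent code values at a given bit depth."""
    top = 2 ** bits - 1
    return pq_eotf((code + 1) / top) - pq_eotf(code / top)

print(2 ** 10, 2 ** 12)       # 1024 vs 4096 code values
print(step_nits(512, 10))     # step near mid-signal at 10 bits
print(step_nits(2048, 12))    # step near mid-signal at 12 bits (~4x finer)
```

Finer steps mainly matter in smooth gradients, where coarse quantization shows up as banding.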

Oppo Beta Group
JohnAV is offline  
post #3 of 47 Old 01-22-2018, 01:15 PM
AVS Forum Special Member
 
Join Date: Mar 2015
Location: Los Angeles, CA
Posts: 2,814
The missing detail in your article is that Panasonic announced that their new UHD Blu-ray player will get Dolby Vision in a firmware update.



Sent from my SM-G360T1 using Tapatalk
DisplayCalNoob is online now  
post #4 of 47 Old 01-22-2018, 01:18 PM
Advanced Member
 
EvLee's Avatar
 
Join Date: Dec 2009
Posts: 689
Quote:
Originally Posted by JohnAV View Post
Dolby Vision? HDR10+? Dolby makes the case that both can coexist - Tech Radar
This is a somewhat biased interview with Dolby's SVP of Consumer Entertainment at CES 2018, but to be candid, it did have this interesting comment.

Quote:
What Dolby Vision and HDR10+ both do is use dynamic metadata to change those attributes on a moment-to-moment basis. This means that scenes can be more dynamic with one scene reaching peak black levels of the TV, with the next doing the same for brightness.

The difference between them, and the point Baker wants to make abundantly clear, is that, for Dolby Vision, all of that metadata is created by hand by colorists and editors at the movie studio. HDR10+’s metadata, meanwhile, is produced by an upscaling algorithm. This algorithm-based mode for content generation takes a lot of work off the editors and reduces time in development, which, Baker admitted, he could understand that being seen as a positive trait.
If you search a bit more about HDR10+, it actually uses scene-by-scene and then frame-by-frame metadata. Whether that makes much of a difference for videophiles has yet to be determined, as we have no A-vs-B comparisons.
Dolby Vision actually starts with an automatic pass. You could just run that to generate the metadata and be done. It is optional for a colorist to fine tune the results of that pass if they feel the initial result from the Dolby algorithm could benefit from some manual adjusting.

So the suggestion that there is an underlying time difference between creating one format vs the other doesn't really exist. The folks using Dolby Vision on their content are probably going to be inclined to put the extra time and effort into fine tuning the metadata, but that may not always be the case.

One practical benefit of having frame by frame metadata is when there is a dissolve between two scenes. If the tone mapping is going to change on either side of the dissolve, then frame by frame metadata during the dissolve makes a better transition.
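That dissolve scenario can be sketched with a toy example. This is purely illustrative (my own simplified model), not the actual ST 2094 or Dolby Vision metadata schema:

```python
# Toy illustration of the dissolve point above: with per-frame metadata,
# the tone-mapping target can ramp smoothly through a dissolve instead
# of jumping at the cut. Not the real ST 2094 / Dolby Vision schema.

def frame_metadata(scene_a_max, scene_b_max, dissolve_frames):
    """Hypothetical per-frame 'scene max luminance' values (in nits)
    blended linearly across a dissolve between two scenes."""
    out = []
    for i in range(dissolve_frames + 1):
        t = i / dissolve_frames
        out.append((1 - t) * scene_a_max + t * scene_b_max)
    return out

# A dark scene (max 100 nits) dissolving into a bright one (max 4000 nits)
# over 4 frames: the tone mapper sees a gradual ramp, not a single jump.
print(frame_metadata(100, 4000, 4))   # [100.0, 1075.0, 2050.0, 3025.0, 4000.0]
```

With only scene-level metadata, the same dissolve would tone-map the whole transition with one scene's values and then snap to the other's.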
JohnAV and sneals2000 like this.
EvLee is offline  
post #5 of 47 Old 01-22-2018, 01:55 PM
Senior Member
 
Jish9's Avatar
 
Join Date: Apr 2014
Posts: 355
The way I perceive it, this article has more to do with the intellectual property side than anything else, much like what Google did with their phone OS, which is now used in over 80% of all cell phones worldwide. By making HDR10+ an open and free architecture, and with less manual work needed to render scenes, why wouldn't we begin to see it on a vast majority of devices and media? The truth of the matter is that much of the technology we are seeing today has been available for some time, but was purposefully trickled in by companies in order to maintain their profit margins. Leave it to one of those same companies to come up with an industry-disruptive solution that basically cuts the legs out from under another's business model. It will be interesting to see whether Dolby sues Samsung over this, or whether there is enough difference between the two for a court to side with Samsung.
Jish9 is offline  
post #6 of 47 Old 01-22-2018, 05:52 PM
Member
 
Join Date: Aug 2017
Posts: 152
There is no patent dispute here; otherwise it wouldn't have been accepted as part of the SMPTE ST 2094 standard and HDMI 2.1.
mozmo is offline  
post #7 of 47 Old 01-23-2018, 02:28 AM
AVS Forum Special Member
 
Mashie Saldana's Avatar
 
Join Date: Dec 2008
Location: UK
Posts: 2,201
So does HDR10+ require specific hardware like Dolby Vision, or could it in theory be added via firmware to TVs/media players from 2017?

Tower Cinema - 9.1.6 in a 12'x12' room
Input : Nvidia Shield TV, Panasonic DMP-UB400
Magic : Marantz SR7010, Marantz SR6010, 2x NAD T743
Output : Panasonic TX65EZ952B, SVS PB13 Ultra, Monitor Audio GSLCR 2xGS20 2xGS10 4xGSFX 6xBX1
Mashie Saldana is offline  
post #8 of 47 Old 01-23-2018, 03:32 AM
AVS Forum Special Member
 
Join Date: Jan 2016
Posts: 7,626
It may require HDMI 2.1 if you want to feed HDR10+ from another device.
video_analysis is offline  
post #9 of 47 Old 01-23-2018, 03:46 AM
AVS Forum Special Member
 
Mashie Saldana's Avatar
 
Join Date: Dec 2008
Location: UK
Posts: 2,201
Quote:
Originally Posted by video_analysis View Post
It may require HDMI 2.1 if you want to feed HDR10+ from another device.
I can't imagine that is a requirement, considering the 2018 Panasonic OLEDs and UHD players will support HDR10+ and they don't have HDMI 2.1.

Tower Cinema - 9.1.6 in a 12'x12' room
Input : Nvidia Shield TV, Panasonic DMP-UB400
Magic : Marantz SR7010, Marantz SR6010, 2x NAD T743
Output : Panasonic TX65EZ952B, SVS PB13 Ultra, Monitor Audio GSLCR 2xGS20 2xGS10 4xGSFX 6xBX1
Mashie Saldana is offline  
post #10 of 47 Old 01-23-2018, 03:53 AM
AVS Forum Special Member
 
Join Date: Jan 2016
Posts: 7,626
The story is constantly changing with regard to the interoperability of this format (and virtually all HDR formats), so I'm probably behind the times.
video_analysis is offline  
post #11 of 47 Old 01-23-2018, 04:00 AM
AVS Forum Special Member
 
Mashie Saldana's Avatar
 
Join Date: Dec 2008
Location: UK
Posts: 2,201
Quote:
Originally Posted by video_analysis View Post
The story is pretty constantly changing with interoperability of this format (and virtually all HDR formats), so I'm probably behind the times.
I can't disagree with that; the world of HDR is an absolute mess.
rak306 and Sal1950 like this.

Tower Cinema - 9.1.6 in a 12'x12' room
Input : Nvidia Shield TV, Panasonic DMP-UB400
Magic : Marantz SR7010, Marantz SR6010, 2x NAD T743
Output : Panasonic TX65EZ952B, SVS PB13 Ultra, Monitor Audio GSLCR 2xGS20 2xGS10 4xGSFX 6xBX1
Mashie Saldana is offline  
post #12 of 47 Old 01-23-2018, 04:33 AM
AVS Forum Special Member
 
ratm's Avatar
 
Join Date: Sep 2006
Location: South Florida
Posts: 1,874
Scott,

Given the juggernaut that is Samsung as a whole and their reluctance to adopt DV, do you see this developing into a VHS/Betamax or Blu-ray/HD DVD sort of battle? If so, do you see other manufacturers (LG, Sony, Panasonic, etc.) allowing all formats in their products via firmware updates until a "standard" wins?
ratm is offline  
post #13 of 47 Old 01-23-2018, 08:07 AM
Senior Member
 
Join Date: Jan 2018
Posts: 285
I think Dolby Vision has already won the streaming battle, though I don't think it will ever be widely used for UHD Blu-ray.
AVS Commenter is offline  
post #14 of 47 Old 01-23-2018, 09:53 AM
Advanced Member
 
Utopianemo's Avatar
 
Join Date: May 2007
Location: Portland, OR
Posts: 751
The elephant in the room

Quote:
Originally Posted by Scott Wilkinson View Post
HDR10+ gained momentum at CES 2018, leading me to believe it can be a viable alternative to Dolby Vision.

https://www.avsforum.com/hdr10plus-ces-2018
While I went out of my way to install a DV-compatible pipeline, it was perhaps premature. I have yet to read a Ralph Potts review (or any other) that indicates there's a significant difference between DV versus even standard HDR10 on a UHD disc. So I'm all fine with HDR10+, but I'm not holding my breath that it will be much different than what we have available now.

Pioneer SC-95, 2x Crown XLS 1000 (LCR amps)
3x Epos Epic 2(LCR), 4x Emotiva UAW 6.2(Surrounds), 4x Aperion Intimus LC-I6(Atmos/DTS:X)
2x Dayton Ultimax 18" sealed(subs), Behringer Inuke 6000DSP(sub amp),
Vizio P-65C1 LED HDR
Oppo UDP-203 HD-Blu Ray Player, Oppo DV971H DVD Player, Roku Premiere+
Utopianemo is offline  
post #15 of 47 Old 01-23-2018, 10:02 AM
Member
 
Join Date: Mar 2016
Posts: 44
So now we have to wait for equipment (TVs in particular) to be able to handle HDR10, DV, HDR10+, and HLG? I feel like I'm always waiting if I want to get a TV that can do it all for the foreseeable future.
Dan Stanley is offline  
post #16 of 47 Old 01-23-2018, 10:41 AM - Thread Starter
AVS Forum Special Member
 
Scott Wilkinson's Avatar
 
Join Date: Sep 2001
Location: Burbank, CA
Posts: 3,258
Quote:
Originally Posted by video_analysis View Post
It may require HDMI 2.1 if you want to feed HDR10+ from another device.
Apparently not. As I wrote in the article, Samsung demonstrated sending HDR10+ over HDMI 2.0. They said it was only the "important data" and that HDMI 2.1 is required to send full HDR10+ metadata, but the partial metadata got through via HDMI 2.0.
Scott Wilkinson is offline  
post #17 of 47 Old 01-23-2018, 11:37 AM
AVS Forum Special Member
 
mtbdudex's Avatar
 
Join Date: May 2007
Location: SE Michigan
Posts: 6,830
Blu-ray vs. HD DVD ... hey, there is still a forum here for that, believe it or not:
https://www.avsforum.com/forum/148-blu-ray-hd-dvd/
Of course we all know which "won", and why.
Some old timers may even remember the hot-hot debates in this very forum, and the time-out(s) given.

Now, with so many HDR formats, how the heck does the industry expect the average Joe consumer to know or rightly care?

For my family room I've got a 2016 OLED B6 and a 2016 Denon 4300H AVR; am I "safe" for HDR?
Sources include UHD Blu-ray via an Xbox One S, plus Netflix and Amazon Prime ....

For my basement HT, I've got a 2016 JVC RS400; sources include an Xbox One X and, soon, the 2018 Denon 8500H. Am I "safe" for HDR?

This becomes tedious and old, fast ....
dr_gallup, twan69666 and The Mice like this.
mtbdudex is offline  
post #18 of 47 Old 01-23-2018, 11:55 AM
AVS Forum Special Member
 
JohnAV's Avatar
 
Join Date: Jun 2007
Location: SF Bay Area
Posts: 7,166
Quote:
Originally Posted by Scott Wilkinson View Post
Apparently not. As I wrote in the article, Samsung demonstrated sending HDR10+ over HDMI 2.0. They said it was only the "important data" and that HDMI 2.1 is required to send full HDR10+ metadata, but the partial metadata got through via HDMI 2.0.
Hi Scott, please check out https://www.flatpanelshd.com/news.ph...&id=1516615699

Quote:
When FlatpanelsHD met HDMI Forum’s CEO Rob Tobias and Marketing Director Brad Bramy at CES 2018, the team highlighted VRR (Variable Refresh Rate), QMS (Quick Media Switching) and eARC (enhanced Audio Return Channel) as examples of features that can be added via a firmware update.

The organization added that if a product implements some of the features of HDMI 2.1, it can be sold as a HDMI 2.1 compatible product. However, the manufacturer must specify which features of HDMI 2.1 are supported.
DanBa likes this.

Oppo Beta Group
JohnAV is offline  
post #19 of 47 Old 01-23-2018, 12:21 PM
AVS Forum Special Member
 
wco81's Avatar
 
Join Date: May 2001
Posts: 7,820
Why would consumers want to invest in equipment now for "partial metadata"?

So if people have to wait until HDMI 2.1 gear is widely available, Dolby may have more licensees from hardware and content makers, increasing its lead.

Do Warner and Fox not license DV for their content?

Or does Amazon not support DV?

Then you have a format war.

But if the companies supporting HDR10+ are also supporting DV, with the exception of Samsung, then Samsung might end up the last holdout.
wco81 is online now  
post #20 of 47 Old 01-23-2018, 12:25 PM - Thread Starter
AVS Forum Special Member
 
Scott Wilkinson's Avatar
 
Join Date: Sep 2001
Location: Burbank, CA
Posts: 3,258
Quote:
Originally Posted by JohnAV View Post
Well, my article is about HDR10+ and dynamic metadata, not VRR, QMS, or eARC. Also, I'm very surprised that eARC can be added via firmware update. According to my research for this article, it requires a new chipset that uses the HDMI Ethernet channel to carry all the extra data...ARC has a maximum data rate of 1 Mbps, while eARC's max data rate is 37 Mbps.
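Those data rates make it easy to check which audio formats would fit over each link. The per-format bitrates below are approximate, typical figures of my own (not from the post), purely to illustrate the gap:

```python
# Sanity check on the ARC vs eARC figures above (1 Mbps vs 37 Mbps).
# The per-format bitrates are approximate, typical maximums and are
# illustrative assumptions, not numbers from the article.

ARC_MBPS = 1.0    # legacy Audio Return Channel, per the post
EARC_MBPS = 37.0  # enhanced ARC (HDMI 2.1), per the post

def best_link(rate_mbps: float) -> str:
    """Return the narrowest link that can carry a given audio bitrate."""
    if rate_mbps <= ARC_MBPS:
        return "ARC"
    if rate_mbps <= EARC_MBPS:
        return "eARC"
    return "neither"

formats_mbps = {
    "Dolby Digital 5.1": 0.64,   # lossy: fits legacy ARC
    "DTS core 5.1": 1.5,         # over the 1 Mbps figure quoted above
    "Dolby TrueHD/Atmos": 18.0,  # lossless: needs eARC
    "DTS-HD MA 7.1": 24.5,       # lossless: needs eARC
}

for name, rate in formats_mbps.items():
    print(f"{name}: ~{rate} Mbps -> {best_link(rate)}")
```

The short version: lossy bitstreams squeeze through legacy ARC, but lossless formats need the wider eARC pipe.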

Last edited by Scott Wilkinson; 01-24-2018 at 10:36 AM.
Scott Wilkinson is offline  
post #21 of 47 Old 01-23-2018, 12:40 PM
Senior Member
 
Despoiler's Avatar
 
Join Date: Dec 2006
Posts: 249
As a consumer, I will only buy equipment that supports both standards. Sorry, Samsung. From a studio perspective, I read that the choice largely comes down to the provided tools and workflow refinement; Dolby Vision is supposedly ahead on those. The capabilities of the competing formats are a tertiary concern.

|| Samsung N8000 || Oppo Digital UDP-203 || Chromecast Ultra ||

|| Quad S-2 || Pass ACA Amp ||
Despoiler is offline  
post #22 of 47 Old 01-23-2018, 12:48 PM
Member
 
BaeckerX1's Avatar
 
Join Date: Apr 2008
Posts: 30
Quote:
Originally Posted by Scott Wilkinson View Post
HDR10+ gained momentum at CES 2018, leading me to believe it can be a viable alternative to Dolby Vision.

https://www.avsforum.com/hdr10plus-ces-2018
Quite frankly, all this is BS unless Samsung bothers to update their 2017 HDR10 TVs to support HDR10+ as they promised. After getting burned on my 65" KS8000 purchase, I won't be buying another TV for the pleasure.

Between that, the game-mode HDR issues, and them violating their own ToS by showing ads in my menus even though I declined their privacy policy (thus losing all smart features), I won't ever buy another Samsung TV. Somehow they get away with this garbage, though.
BaeckerX1 is offline  
post #23 of 47 Old 01-23-2018, 12:50 PM
Senior Member
 
The Mice's Avatar
 
Join Date: Dec 2010
Location: Edmonton, Alberta, Canada
Posts: 491
I own an LG 55LW5000 passive 3D 55 inch TV that I bought new in 2011. I admit that I have the "upgrade bug" but it would have to be something compelling for me to want to give up 3D, and these changing standards also give me pause for thought. Especially when I would want a 65" to 75" set. At those prices, I don't want to make a mistake.
The Mice is offline  
post #24 of 47 Old 01-23-2018, 01:02 PM
Member
 
BaeckerX1's Avatar
 
Join Date: Apr 2008
Posts: 30
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 11 Post(s)
Liked: 13
Quote:
Originally Posted by The Mice View Post
I own an LG 55LW5000 passive 3D 55 inch TV that I bought new in 2011. I admit that I have the "upgrade bug" but it would have to be something compelling for me to want to give up 3D, and these changing standards also give me pause for thought. Especially when I would want a 65" to 75" set. At those prices, I don't want to make a mistake.
They are indeed treacherous waters to jump into right now.

Current issues for consumers right now (IMO):

  • TVs being treated like disposable tech (two-year smartphone lifecycle, forced obsolescence)
  • Inconsistent HDMI spec
  • Inconsistent HDR format support
  • Inconsistent display coatings (matte, glossy, reflections)
  • Same TV model using different displays in different sizes
  • Lack of firmware update support
  • TV issues switching between HDR and non-HDR modes and maintaining settings
  • Companies trying to monetize your TV viewing (Vizio busted for screen capture, Samsung ads on purchased TVs with the opt-out option not working)
  • Input lag still an issue for gamers and not an advertised spec
  • Artificial refresh rates listed instead of true native refresh rates
  • Max brightness not an advertised spec (required for good HDR)

These are just off the top of my head. It's a pain in the ass buying a TV these days.
ciuvak and The Mice like this.

Last edited by BaeckerX1; 01-23-2018 at 01:06 PM.
BaeckerX1 is offline  
post #25 of 47 Old 01-23-2018, 01:04 PM
Member
 
blue dragon's Avatar
 
Join Date: Mar 2008
Posts: 145
Quote:
Originally Posted by Despoiler View Post
As a consumer, I will only buy equipment that supports both standards. Sorry, Samsung. From a studio perspective, I read that the choice largely comes down to the provided tools and workflow refinement; Dolby Vision is supposedly ahead on those. The capabilities of the competing formats are a tertiary concern.
And that is exactly why I bought an LG and not a Samsung panel this year. Unless Samsung is willing to support both standards, they won't get my $$$

7.2.4 Atmos HT -> Paradigm Studio 100s- Fronts, + Studio CC-690 - Center + Studio 20s Surround + Rear Surround + dual Studio Sub 12s + four AMS-150R-30 (Atmos overhead)
Pre/Pro - Yamaha CX-A5100, AMP - Yamaha MX-A5000
Projector: Panasonic PT-AE8000U, Screen: Seymour F100 2.35:1 TV: LG OLED 65B7P
blue dragon is offline  
post #26 of 47 Old 01-23-2018, 01:12 PM
Senior Member
 
dr_gallup's Avatar
 
Join Date: Oct 2011
Posts: 256
Mentioned: 0 Post(s)
Tagged: 0 Thread(s)
Quoted: 61 Post(s)
Liked: 44
Quote:
Originally Posted by mtbdudex View Post
Blu-ray vs. HD DVD ... hey, there is still a forum here for that, believe it or not:
https://www.avsforum.com/forum/148-blu-ray-hd-dvd/
Of course we all know which "won", and why.
Some old timers may even remember the hot-hot debates in this very forum, and the time-out(s) given.

Now, with so many HDR formats, how the heck does the industry expect the average Joe consumer to know or rightly care?

For my family room I've got a 2016 OLED B6 and a 2016 Denon 4300H AVR; am I "safe" for HDR?
Sources include UHD Blu-ray via an Xbox One S, plus Netflix and Amazon Prime ....

For my basement HT, I've got a 2016 JVC RS400; sources include an Xbox One X and, soon, the 2018 Denon 8500H. Am I "safe" for HDR?

This becomes tedious and old, fast ....
I think this industry passed the average Joe's understanding and caring long ago. TV used to be plug and play; now you have to spend weeks researching the components, standards, bugs, workarounds, etc. Then you fork out a fortune, spend days or weeks getting it all hooked up, and figure out arcane manuals and on-screen displays. Finally you get it calibrated and fine-tuned. Then you find out your significant other can't stand the array of remote controls and button pushes and reverts to watching Game of Thrones on a 5" phone with earbuds.
Papwolf, Tarheel72 and krumme like this.
dr_gallup is offline  
post #27 of 47 Old 01-23-2018, 01:21 PM
AVS Forum Special Member
 
PeterTHX's Avatar
 
Join Date: Jun 2006
Posts: 2,534
Mentioned: 18 Post(s)
Tagged: 0 Thread(s)
Quoted: 569 Post(s)
Liked: 431
Quote:
Originally Posted by Jish9 View Post
The way I perceive it, this article has more to do with the intellectual property side than anything else, much like what Google did with their phone OS, which is now used in over 80% of all cell phones worldwide. By making HDR10+ an open and free architecture, and with less manual work needed to render scenes, why wouldn't we begin to see it on a vast majority of devices and media? The truth of the matter is that much of the technology we are seeing today has been available for some time, but was purposefully trickled in by companies in order to maintain their profit margins. Leave it to one of those same companies to come up with an industry-disruptive solution that basically cuts the legs out from under another's business model. It will be interesting to see whether Dolby sues Samsung over this, or whether there is enough difference between the two for a court to side with Samsung.
Except it's not free. It's a yearly subscription rather than per unit.
TCL, Vizio, and Lionsgate can afford Dolby Vision on their budget devices/releases. The licensing cost is negligible and on par with something like Dolby Atmos for audio. As for "less physical work," that's because it's an automated process, much like what "auto HDR" does on existing devices (and why companies like Sony or LG are not interested).


HDR10+ is limited to 10-bit color. Dolby Vision is already 12-bit and ready for the upcoming generation of true 12-bit display devices.
curtisb likes this.

My opinions do not reflect the policies of my company
PeterTHX is offline  
post #28 of 47 Old 01-23-2018, 01:24 PM
AVS Forum Special Member
 
wco81's Avatar
 
Join Date: May 2001
Posts: 7,820
Quote:
Originally Posted by PeterTHX View Post
Except it's not free. It's a yearly subscription rather than per unit.
TCL, Vizio, and Lionsgate can afford Dolby Vision on their budget devices/releases. The licensing cost is negligible and on par with something like Dolby Atmos for audio. As for "less physical work," that's because it's an automated process, much like what "auto HDR" does on existing devices (and why companies like Sony or LG are not interested).


HDR10+ is limited to 10-bit color. Dolby Vision is already 12-bit and ready for the upcoming generation of true 12-bit display devices.
The other thing I read is that for a disc to have both HDR10+ and DV, there have to be two separate encodes, and those two encodes may not fit on a single UHD BD disc.

So studios will have to choose whether to support both, whether to ship two discs, etc.
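As a rough back-of-envelope check on the capacity concern, if two full encodes really were needed, the arithmetic looks like this (the 120-minute runtime and 60 Mbps average bitrate are illustrative assumptions of mine, not measured values; UHD Blu-ray discs hold 66 GB dual-layer or 100 GB triple-layer):

```python
# Back-of-envelope arithmetic for the "two encodes may not fit" concern.
# Runtime and average bitrate are illustrative assumptions, not measured
# values; UHD Blu-ray discs hold 66 GB (dual layer) or 100 GB (triple).

def encode_size_gb(runtime_min: float, avg_mbps: float) -> float:
    """Approximate size of one video encode in decimal gigabytes."""
    return runtime_min * 60 * avg_mbps / 8 / 1000  # Mb -> MB -> GB

one = encode_size_gb(120, 60)  # a 2-hour film at an assumed 60 Mbps average
print(one)        # 54.0 GB: fits a 66 GB disc
print(2 * one)    # 108.0 GB: over even a 100 GB disc
```

Under those assumed numbers, one encode fits comfortably but two full-bitrate encodes would not, which is the trade-off the post describes.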
wco81 is online now  
post #29 of 47 Old 01-23-2018, 02:00 PM
Advanced Member
 
EllisGJ's Avatar
 
Join Date: Dec 2012
Location: Colorado
Posts: 540
From my consumer perspective, Dolby Vision took a big hit last week with the totally flubbed roll-out of the Dolby Vision firmware update for Sony TVs, including the A1E, ZD9 and many other models. What a mismanaged mess and what a disappointment. The update only supports apps running on the TVs, not HDMI or USB sources. Many Sony TV owners have waited up to a year for this update, only now to be in limbo.

I blame Dolby foremost for allowing this to happen. I'll think twice about buying a Sony TV again, too. In any event, I'm now far more interested in HDR10+ going forward.

Last edited by EllisGJ; 01-23-2018 at 02:06 PM.
EllisGJ is offline  
post #30 of 47 Old 01-23-2018, 02:06 PM
AVS Forum Special Member
 
wco81's Avatar
 
Join Date: May 2001
Posts: 7,820
Mentioned: 8 Post(s)
Tagged: 0 Thread(s)
Quoted: 2362 Post(s)
Liked: 944
Quote:
Originally Posted by EllisGJ View Post
From my consumer perspective, Dolby Vision took a big hit last week with the totally flubbed roll-out of the Dolby Vision firmware update for Sony TVs, including the A1E, ZD9 and many other models. What a mismanaged mess and what a disappointment. The update only supports apps running on the TVs, not HDMI or USB sources. Many Sony TV owners have waited up to a year for this update, only now to be in limbo.

I blame Dolby foremost for allowing this to happen. I'll think twice about buying a Sony TV again, too. In any event, I'm now far more interested in HDR10+ going forward.
Well, how reliable are firmware updates from AV manufacturers?

I'd just make sure DV is supported out of the box rather than relying on promises of future updates.
wco81 is online now  
Tags
20th century fox , ces2018 , hdr10+ , Panasonic , Samsung
