2017 TV Shootout Evaluation event will be in NYC, July 12 and July 13, 2017 - Page 111 - AVS Forum | Home Theater Discussions And Reviews
post #3301 of 3545 Old 07-30-2017, 08:46 AM
Quote:
Originally Posted by jrref View Post
So despite the confusion, the Sony is generating dynamic metadata for HDR10, meaning it generates the data frame by frame. If a frame or a set of frames has a peak of, say, 600 nits and an average of 100 nits, that is the metadata generated, versus static metadata that might say a peak of 1000 and an average of 300 for the whole movie. This results in an overall brighter picture. Why? Because a frame with a peak of 600 nits and an average of 100 nits (I'm making these numbers up to illustrate the point) will be displayed as such, with little or no tone mapping, whereas with static metadata of 1000/300 it might be tone mapped more aggressively, depending on the set, resulting in a darker picture because the tone mapping is trying to retain detail.

Now, all that said: since the Sony A1's peak luminance is around 700 nits and its tone mapping assumes content is mastered at 1000 nits, if the content goes over 1000 nits the Sony's tone-mapping formula will tend to keep the brightness and throw away the detail. Tone mapping is a balance between keeping detail and keeping brightness, which is why you get an overall brighter picture but lose detail over 1000 nits. This actually varies slightly depending on the PM used.
So, detail over 1000 nits is still being lost. What makes it different than a static curve that clips over 1000 nits? What makes you so sure (beyond Sony's claims) that there is dynamic metadata generation at all?

Quote:
LG does the same only with a slightly different formula and they also give you the option of turning off the dynamic metadata where the Sony does not on the A1.
LG's dynamic metadata generation is not the same at all, it retains all detail above 1000 nits.

Quote:
This is a very high level explanation to describe this.
There's an even simpler explanation, Occam's razor and all.
caramonrun is offline  
post #3302 of 3545 Old 07-30-2017, 08:54 AM
Quote:
Originally Posted by caramonrun View Post
So, detail over 1000 nits is still being lost. What makes it different than a static curve that clips over 1000 nits? What makes you so sure (beyond Sony's claims) that there is dynamic metadata generation at all?

LG's dynamic metadata generation is not the same at all, it retains all detail above 1000 nits.

There's an even simpler explanation, Occam's razor and all.
Yeah, I'd chalk it up to boredom and the effect it has on Internet posts in public forums.

Mark Henninger (aka Imagic)
imagic is online now  
post #3303 of 3545 Old 07-30-2017, 08:58 AM
To summarize... a few conspiracy theorists here say the same person who loaned his mastering display to the shootout, which was used to pick the winner, is lying about a feature, even after being asked about it in the specific context of this thread. That's not just rude and disrespectful, it's illogical and kinda... (fill in the blank)ish.

imagic is online now  
 
post #3304 of 3545 Old 07-30-2017, 09:01 AM
Quote:
Originally Posted by Al Leong View Post
I don't "bicker" (females bicker)
Seriously? I find this very offensive. What is this? 1950 all over again?

Seriously. AVS is a place where you go to learn to be unhappy. - Bear5k
gorman42 is offline  
post #3305 of 3545 Old 07-30-2017, 09:03 AM
Quote:
Originally Posted by caramonrun View Post
So, detail over 1000 nits is still being lost. What makes it different than a static curve that clips over 1000 nits? What makes you so sure (beyond Sony's claims) that there is dynamic metadata generation at all?

LG's dynamic metadata generation is not the same at all, it retains all detail above 1000 nits.

There's an even simpler explanation, Occam's razor and all.
So, detail over 1000 nits is still being lost. What makes it different than a static curve that clips over 1000 nits? What makes you so sure (beyond Sony's claims) that there is dynamic metadata generation at all?
>>Correct, detail over 1000 nits or so is lost, but since most content doesn't go over 1000 nits (as discovered by many enthusiasts with software to check this), overall you get a brighter picture. I haven't looked at the metadata stream, but side by side you can see it's doing essentially the same thing to the content as the LG with dynamic metadata on, for content mastered at 1000 nits. The LG is a little darker since its tone-mapping formula is slightly different.



LG's dynamic metadata generation is not the same at all, it retains all detail above 1000 nits.
>>Correct, it's trying to tone map to 4000 nits at the sacrifice of an overall slightly darker picture with content that's less than 1000 nits, say in the 100-200 nit range. Remember that tone mapping is a trade-off between detail and brightness. There is only so much you can do to "map" 1000-4000 nit content onto a set that has a peak of around 700 nits. There is no right or wrong, because there is no standard: each manufacturer has its own formula, and they can change it at any time, as Sony has at least once that we have measured.


This is the best I can do to explain this. Vincent's tone-mapping video gives a slightly more detailed explanation, but this is how I understand it to work, as verified by LG and Sony engineers I have personally spoken to.
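The static-vs-dynamic difference jrref describes can be shown with a toy model. This is not any manufacturer's actual algorithm, just a sketch of why a per-frame peak brightens the picture relative to a whole-movie peak on a roughly 700-nit panel, using the made-up numbers from the discussion:

```python
# Toy tone-map: compress content linearly only when the assumed content
# peak exceeds what the panel can show. All numbers are illustrative.
DISPLAY_PEAK = 700  # approximate A1 peak, per the discussion

def tone_map(pixel_nits: float, assumed_peak: float) -> float:
    """Map a pixel's mastered luminance to displayed luminance, given
    the peak the metadata claims for the content."""
    if assumed_peak <= DISPLAY_PEAK:
        return min(pixel_nits, DISPLAY_PEAK)  # nothing to compress
    # Scale everything down so the claimed peak lands at the panel peak.
    return min(pixel_nits * DISPLAY_PEAK / assumed_peak, DISPLAY_PEAK)

# A 600-nit highlight in a frame:
static = tone_map(600, assumed_peak=1000)  # whole-movie metadata
dynamic = tone_map(600, assumed_peak=600)  # per-frame peak
```

With the whole-movie ("static") peak of 1000 nits, the 600-nit highlight is scaled down to 420 nits; with the per-frame ("dynamic") peak of 600 nits it fits the panel and is displayed untouched.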

John
Sony 55A1E / LG 55OLEDE6P
Marantz 7009
Ohm Walsh Speakers
ISF Level II Certified
jrref is offline  
post #3306 of 3545 Old 07-30-2017, 09:15 AM
Quote:
Originally Posted by jrref View Post
LG's dynamic metadata generation is not the same at all, it retains all detail above 1000 nits.
>>Correct, it's trying to tone map to 4000 nits at the sacrifice of an overall slightly darker picture with content that's less than 1000 nits, say in the 100-200 nit range. Remember that tone mapping is a trade-off between detail and brightness. There is only so much you can do to "map" 1000-4000 nit content onto a set that has a peak of around 700 nits. There is no right or wrong, because there is no standard: each manufacturer has its own formula, and they can change it at any time, as Sony has at least once that we have measured.

There is a "right": it is called adherence to the PQ EOTF. Tone mapping that reaches well below 200 nits is throwing away PQ tracking to preserve highlight detail. That detail cannot be "right" because the nit levels cannot be achieved. Globally lowering the APL because of static metadata is akin to global ABL. Folks rightly don't like ABL but seem to like this. It is not correct to say there is no standard.


Dynamic metadata on the LG corrects most tone-mapping ABL, restoring PQ tracking. However, the aggressive tone-mapping returns in the presence of any bright highlight (no matter how insignificant to the scene). This is where DV has an advantage. It is not real-time and can make better decisions.


If LG were to provide tone-mapping options in addition to Active HDR, the user could choose to emphasize PQ tracking to preserve the intended look of the film for HDR10 sources. Please, LG, please!
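For reference, the PQ EOTF RichB is referring to is the SMPTE ST 2084 transfer function. Below is a minimal Python transcription of the standard's formulas; the constants come straight from ST 2084, and nothing here is specific to any particular TV:

```python
# SMPTE ST 2084 (PQ) constants.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Map a normalized PQ code value in [0, 1] to absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def pq_inverse_eotf(nits: float) -> float:
    """Map absolute luminance (0..10000 nits) to a PQ code value in [0, 1]."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2
```

Feeding 100 nits in returns a code value of about 0.51, which is why the bulk of typical HDR content sits in the lower half of the PQ signal range even though the curve tops out at 10,000 nits.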


- Rich

Oppo UPD-205 | Sonica DAC | BDP-105D | HA-1 | PM-1 | Emotiva XMC-1 | ATI Signature AT6002 x 2 + AT6006 | Revel Salon2s, Voice2, Studio2s | Velodyne HGS-15 | LG 65C7 | Lumagen 2020
RichB is online now  
post #3307 of 3545 Old 07-30-2017, 09:21 AM
Quote:
Originally Posted by RichB View Post
There is a "right": it is called adherence to the PQ EOTF. Tone mapping that reaches well below 200 nits is throwing away PQ tracking to preserve highlight detail. That detail cannot be "right" because the nit levels cannot be achieved. Globally lowering the APL because of static metadata is akin to global ABL. Folks rightly don't like ABL but seem to like this. It is not correct to say there is no standard.


Dynamic metadata on the LG corrects most tone-mapping ABL, restoring PQ tracking. However, the aggressive tone-mapping returns in the presence of any bright highlight (no matter how insignificant to the scene). This is where DV has an advantage. It is not real-time and can make better decisions.


If LG were to provide tone-mapping options in addition to Active HDR, the user could choose to emphasize PQ tracking to preserve the intended look of the film for HDR10 sources. Please, LG, please!


- Rich
The PQ EOTF is a standard, but the way each set tracks it to preserve detail or brightness isn't. Just look at Vincent's video and you can see how different sets track.
jrref is offline  
post #3308 of 3545 Old 07-30-2017, 09:28 AM
Dolby still looks better on OLEDs, so why do we stress out about HDR10 when Dolby is better?
losservatore is offline  
post #3309 of 3545 Old 07-30-2017, 09:30 AM
Quote:
Originally Posted by losservatore View Post
Dolby still looks better on OLEDs, so why do we stress out about HDR10 when Dolby is better?
You hit the nail on the head lol!

jrref is offline  
post #3310 of 3545 Old 07-30-2017, 09:37 AM
Quote:
Originally Posted by losservatore View Post
Dolby still looks better on OLEDs, so why do we stress out about HDR10 when Dolby is better?
Quote:
Originally Posted by jrref View Post
You hit the nail on the head lol!
Because the selection of Dolby Vision movies available in the highest-quality format (Ultra HD Blu-ray) can still be counted on your fingers? Because even though it's rolling out now, not every Ultra HD Blu-ray is coming out as a Dolby Vision title?

I mean sure, if you have the choice, go with the best format your system can handle. But @losservatore's comment would only really be true if Dolby Vision had been included on Ultra HD Blu-rays from day one as part of the standard. Given that's not the case, how a TV deals with the limitations of HDR10 (or rather, how a TV deals with its own limitations in displaying HDR10 content) matters.

imagic is online now  
post #3311 of 3545 Old 07-30-2017, 09:52 AM
LG OLEDs give a better price-to-performance ratio and support DV out of the box, but having seen LG, Sony, and Panasonic side by side, I definitely thought the Panasonic looked better than the other two. On the LG the color appears a little 'touched up'; the Panasonic's shades look more natural. Panasonic has a more detailed CMS than LG. Motion in panning scenes looks a little better than on the LG, and near-black areas show minimal noise on the Panasonic, while the LG shows visible noise in the same dark scenes. On the plus side for LG, the LG OLED in general appears brighter.
rene2kx is online now  
post #3312 of 3545 Old 07-30-2017, 09:55 AM
Quote:
Originally Posted by imagic View Post
To summarize... a few conspiracy theorists here say the same person who loaned his mastering display to the shootout, which was used to pick the winner, is lying about a feature, even after being asked about it in the specific context of this thread. That's not just rude and disrespectful, it's illogical and kinda... (fill in the blank)ish.
This is just a BLOG. About 80-90% of posts could be deleted. You should read what the Kardashians put up with on their blog!
tanman is offline  
post #3313 of 3545 Old 07-30-2017, 10:32 AM
Quote:
Originally Posted by imagic View Post
Because the selection of Dolby Vision movies available in the highest-quality format (Ultra HD Blu-ray) can still be counted on your fingers? Because even though it's rolling out now, not every Ultra HD Blu-ray is coming out as a Dolby Vision title?

I mean sure, if you have the choice go with the best format your system can handle. But @losservatore 's comment would only really be true if Dolby Vision was included on Ultra HD Blu-rays from day 1 and was part of the standard. Given that's not the case, how a TV deals with the limitations of HDR10 (or rather, how a TV deals with its own limitations in displaying HDR10 content) matters.
I know, but my point is that it is better and will always be better. I know some will disagree, maybe because their display doesn't support Dolby, but Sony is working to add Dolby Vision; Samsung will be the only brand without the format.

HDR10 will still be available as the base layer, but Dolby is the enhanced version.

Once movies start releasing in Dolby Vision there will be higher demand for the format, and many people will avoid displays that don't carry the Dolby logo.

I don't know why some people think HDR10 is going to be the format that dominates the industry. HDR10 is a mess right now, and it will be for a long time.

The Sony Z9D will look even more impressive with Dolby Vision. Dolby isn't only for limited-peak-brightness displays; it's for all types of displays. I'm sure they are waiting for that firmware release.
losservatore is offline  
post #3314 of 3545 Old 07-30-2017, 10:51 AM
In the future I would like to see a category for best picture quality without a pro calibration. In the real world hardly anyone gets their TV calibrated, and only a fraction of a fraction of buyers are going to get the level of calibration that was done to the sets for the shootout. This seems more useful than the streaming category.
SubZombie is offline  
post #3315 of 3545 Old 07-30-2017, 10:52 AM
Quote:
Originally Posted by losservatore View Post
I know, but my point is that it is better and will always be better. I know some will disagree, maybe because their display doesn't support Dolby, but Sony is working to add Dolby Vision; Samsung will be the only brand without the format.

HDR10 will still be the base layer, but Dolby is the enhanced version.

Once movies start releasing in Dolby Vision there will be higher demand for the format, and many people won't buy a display without the Dolby logo.

I don't know why some people think HDR10 is going to be the format that dominates the industry. HDR10 is a mess right now, and it will be for a long time.
That's speculative. The appeal of HDR10 is threefold: 1. It's the standard used by UHD Blu-ray. 2. It's free and open. (Dolby Vision is proprietary and costs money to license.) 3. Samsung still sells a lot of TVs, and every other TV maker that sells HDR TVs supports HDR10 as well.

Oh, and in projector land, Dolby Vision is not happening yet. And let's face it, HDR is still niche, driven by early adopters and enthusiasts, and that includes the home theater guys.

I'm curious how it all plays out.
imagic is online now  
post #3316 of 3545 Old 07-30-2017, 10:58 AM
Quote:
Originally Posted by imagic View Post
That's speculative. The appeal of HDR10 is threefold. 1. It's the standard used by UHD Blu-ray. 2. It's free and open source. (Dolby Vision is proprietary and costs $$$ to use) 3. Samsung still sells a lot of TVs, and every other TV maker that sells HDR TVs supports HDR10 as well.

Oh, and in projector land, Dolby Vision is not happening yet. And let's face it, HDR is still niche and driven by early-adopters plus enthusiasts and that includes the home theater guys.

I'm curious how it all plays out.
So do you think this category will shake out like Betamax/VHS and Blu-ray/HD DVD, or will both standards co-exist like DTS/Dolby sound?
tanman is offline  
post #3317 of 3545 Old 07-30-2017, 11:16 AM
Quote:
Originally Posted by tanman View Post
So you think this category will shake-out like BetaMax/VHS, Blu-Ray/HD-DVD or will both standards co-exist like DTS/Dolby Sound?
The latter, given that the two already co-exist relatively peacefully. And I'm not discounting Dolby at all. But if HLG catches on for broadcast and Technicolor releases something else, it's hard to see how Dolby can wrap up the whole market for itself. It'll be one of numerous HDR flavors available, likely marketed as "premium" HDR.

imagic is online now  
post #3318 of 3545 Old 07-30-2017, 11:51 AM
Quote:
Originally Posted by jrref View Post
So, detail over 1000 nits is still being lost. What makes it different than a static curve that clips over 1000 nits? What makes you so sure (beyond Sony's claims) that there is dynamic metadata generation at all?
>>Correct, detail over 1000 nits or so are lost but since most content doesn't go over 1000 nits, as discovered by many enthusiasts with software to check this, overall you will get a brighter picture. I haven't looked at the metadata stream but you can see it's doing essentially the same thing to the content as the LG with dynamic metadata on with content mastered at 1000 nits when the sets are side by side. The LG is a little darker since the tone mapping formula is slightly different.

LG's dynamic metadata generation is not the same at all, it retains all detail above 1000 nits.
>>Correct it's trying to tone map to 4000 nits at the sacrifice of an overall slightly darker picture with content that's less than 1000 nits, say in the 100 -200 nit range for example. Remember that tone mapping is a trade off between detail and brightness. There is only so much you can do to "map" 1000- 4000 nit content into a set that has a peak of around 700 nits. There is no right or wrong, because there is no standard. Each manufacturer has their own formula and they can change it at any time and Sony has at least once that we have measured.

This is the best I can do to explain this and Vincent's tone mapping video can give you a slightly more detailed explanation but this is the way I understand this to work as verified by LG and Sony engineers whom I have personally spoken to.
1,000 nits will melt your face. When I watch TV, I don't want my face continually melted - nor do I want to wear eclipse certified sunglasses. Who cares what details are lost at 1000 nits? Who's going to see it besides a meter or an electron?

I mean, in all seriousness: there are specs which might enhance the viewing experience, and then there are specs that only the respective manufacturers' marketing departments find useful. And then we all act like a bunch of lemmings drawn to a spec abyss.

LG OLED65E6P | VIZIO M70-D3
Pioneer Elite VSX-90 | Samsung K8500 | AppleTV 4G | DirecTV 4K
Polk RTiA3 cherrywood Mains | CSiA6 cherrywood Center | OWM3 Surrounds
Pioneer SP-T22A Atmos Speaker Toppers | Sunfire Signature True Subwoofer
M70-D3 Calibration FW 3.3.18.1 SDR & HDR: http://www.avsforum.com/forum/166-lc...l#post54599532
sonoftumble is online now  
post #3319 of 3545 Old 07-30-2017, 11:52 AM
This is a very interesting thread: Universal HDR-compliant displays. There is great info about every HDR format.

losservatore is offline  
post #3320 of 3545 Old 07-30-2017, 12:13 PM

Quote:
Originally Posted by sonoftumble View Post
1,000 nits will melt your face. When I watch TV, I don't want my face continually melted - nor do I want to wear eclipse certified sunglasses. Who cares what details are lost at 1000 nits? Who's going to see it besides a meter or an electron?

I mean in all seriousness - there are specs which might enhance the viewing experience, and then there are specs that only the respective manufacturer's marketing depts. find useful. And then we all act like a bunch of Lemmings who become attracted to a spec abyss.

1000 nits will not melt your face. A bright sunny day is 30,000 nits. From the Dolby blog's 2013 announcement of Dolby Vision:

"The super TV

At Dolby, we wanted to find out what the right amount of light was for a display like a television. So we built a super-expensive, super-powerful, liquid-cooled TV that could display incredibly bright images. We brought people in to see our super TV and asked them how bright they liked it.

Here’s what we found: 90 percent of the viewers in our study preferred a TV that went as bright as 20,000 nits. (A nit is a measure of brightness. For reference, a 100-watt incandescent lightbulb puts out about 18,000 nits.)

You may be thinking, “Wow, I don’t want to look at a TV that bright. Looking at a 100-watt bulb would hurt my eyes!” And you’d be right if the TV was displaying a full-screen, pure-white image. That would be uncomfortable.

But real TV images, like scenes in the real world, include a mixture of dark and light. Only small parts of real-world scenes are very bright, and we have no problem looking at them. In fact, one of the secrets to producing TV images that look like real life is having that mix of true brights and darks.

If viewers want images of as much as 20,000 nits, guess what the industry standard is for the brightness of current TV images. (Go ahead, we’ll wait.)

If your guess is more than 100 nits, you’re wrong. It’s true—most viewers want TV images that are 200 times brighter than today’s industry standard.

Does that difference really matter? You bet it does. Today’s TVs simply can’t match the depth and detail of a display that can produce far brighter images. And conventional TVs can’t recreate all the colors found in the world around us. It’s a classic case of “you don’t know what you’re missing until you see it.” When you experience a display with much higher brightness, you never want to go back to a conventional display."

https://blog.dolby.com/2013/12/tv-bright-enough/


Sent from my iPhone using Tapatalk

ataneruo is online now  
post #3321 of 3545 Old 07-30-2017, 12:25 PM
I am sticking with 1080p to the bitter end. Not buying into this 4K crap. Even if I am forced to buy a 4K set, it will be used for 1080p.

Sony 65" A1E
OPPO 83 &103
Pioneer Elite DV 59AVI
Paradigm 490cc B&W 685 S2's
Pioneer Elite SC61 SVS SB13Ultra
B&W CM10 S2's Acurus 200A3
CHASLS2 is online now  
post #3322 of 3545 Old 07-30-2017, 12:58 PM
Quote:
Originally Posted by ataneruo View Post
1000 nits will not melt your face. A bright sunny day is 30,000 nits. Today’s TVs simply can’t match the depth and detail of a display that can produce far brighter images. And conventional TVs can’t recreate all the colors found in the world around us. It’s a classic case of “you don’t know what you’re missing until you see it.” When you experience a display with much higher brightness, you never want to go back to a conventional display."

https://blog.dolby.com/2013/12/tv-bright-enough/


Thank you for all the detail. I hope you realize that I wasn't being literal about the face melting comment.

My point is that when you get a 1,000-nit light blast from your TV, it's usually an explosion or other brief flash that happens so fast that the eye cannot adapt quickly enough to make out any details in the image before it's gone. So I don't really care if a particular TV loses detail at the 1,000-nit level, because my eyes won't be able to tell anyway. I have two HDR TVs, and I certainly appreciate the higher luminance dynamics of my OLED vs. my FALD LCD. And I'm a big fan of Dolby Vision.

I recently watched the UHD/HDR Blu-ray of the original 1996 "Independence Day". Throughout the movie there are bright flashes of white to indicate scene cuts. On my 460-peak-nit TV, those cuts are very uncomfortable to look at, especially in a dark room where our eyes have adjusted to the dark. I haven't put a meter on the flash, and it's probably lower than the TV's peak, yet still difficult to watch; I certainly wouldn't want it any brighter than it was. I'm thankful that they were just quick flashes. I couldn't look at that for any extended period of time. Maybe in a bright room, 1,000 nits will produce some PQ benefit.

So as a spec, 1,000 nits doesn't get me as excited as other specs such as WCG or HDR or OLED. To me, it's really a matter of a manufacturer's marketing dept. being able to claim that their TV will melt your face better than another brand. I think there comes a point where enough is enough.

sonoftumble is online now  
post #3323 of 3545 Old 07-30-2017, 01:04 PM
Quote:
Originally Posted by jrref View Post
The PQ EOTF is a standard but the way each set tracks it to preserve detail or brightness isn't. Just look at Vincent's video and you can see the way different set's track.


And since humans are most sensitive to luminance, sets that cannot track the PQ EOTF (all of them, for 4000-nit content) must either clip bright content or implement tone mapping, which reduces overall APL above some starting nit level. For HDR10, each manufacturer's display is going to look different even when using the same OLED panels. They cannot be calibrated to an accurate standard without similar tone-mapping options.


There may not be a tone-mapping standard, but it would be nice to have some "best practices" that preserve PQ EOTF tracking in mid-to-low-nit content. A good approach preserves the PQ EOTF up to some minimum level; 300 nits, for example, would allow most scenes to look alike.
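RichB's suggested best practice can be sketched as a piecewise curve: track the content's luminance exactly below a knee point, then roll highlights off toward the panel's peak. This is only a sketch; the 300-nit knee, the 0.6 roll-off exponent, and the peak values are illustrative assumptions, not anything a real set is known to use.

```python
# Piecewise tone map: exact tracking below the knee, power roll-off above.
KNEE = 300.0           # nits below which tracking is preserved
DISPLAY_PEAK = 700.0   # assumed panel peak, nits
CONTENT_PEAK = 4000.0  # assumed mastering peak from HDR10 static metadata

def pq_preserving_tone_map(nits: float) -> float:
    if nits <= KNEE:
        return nits  # exact tracking: most scene content untouched
    # Compress [KNEE, CONTENT_PEAK] into [KNEE, DISPLAY_PEAK] with a
    # power curve so highlights compress progressively, not with a clip.
    t = (nits - KNEE) / (CONTENT_PEAK - KNEE)
    return KNEE + (DISPLAY_PEAK - KNEE) * t ** 0.6
```

Everything below 300 nits tracks one-to-one, so typical scene APL is untouched; only highlights between the knee and the 4000-nit metadata peak get compressed into the remaining 400 nits of headroom.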


- Rich

RichB is online now  
post #3324 of 3545 Old 07-30-2017, 01:20 PM
Major film studios & HDR10 / Dolby Vision.


http://www.flatpanelshd.com/news.php...&id=1500554934
losservatore is offline  
post #3325 of 3545 Old 07-30-2017, 01:59 PM
Quote:
Originally Posted by RichB View Post
And since humans are most sensitive to luminance, sets that cannot track the PQ EOTF (all of them, for 4000-nit content) must either clip bright content or implement tone mapping, which reduces overall APL above some starting nit level. For HDR10, each manufacturer's display is going to look different even when using the same OLED panels. They cannot be calibrated to an accurate standard without similar tone-mapping options.

There may not be a tone-mapping standard, but it would be nice to have some "best practices" that preserve PQ EOTF tracking in mid-to-low-nit content. A good approach preserves the PQ EOTF up to some minimum level; 300 nits, for example, would allow most scenes to look alike.


- Rich
According to Vincent, Panasonic uses a formula that stays true to the EOTF curve the longest, but again, it's a trade-off. There is no user control over the way a set tone maps, and I doubt we will ever see a standard, or there would have been one already.
jrref is offline  
post #3326 of 3545 Old 07-30-2017, 02:03 PM
Quote:
Originally Posted by CHASLS2 View Post
I am sticking with 1080p to the bitter end. Not buying into this 4k crap. Even if i am forced to buy a 4K set it will be used for 1080p.
4K has to become a broadcast industry standard before it really picks up. But then the industry will induct 8K into the hall of fame, and then watch out for soap opera effect complaints on these threads!
tanman is offline  
post #3327 of 3545 Old 07-30-2017, 02:13 PM
Advanced Member
 
tanman's Avatar
 
Join Date: Apr 2003
Location: Dallas, TX
Posts: 683
Mentioned: 1 Post(s)
Tagged: 0 Thread(s)
Quoted: 170 Post(s)
Liked: 131
Quote:
Originally Posted by imagic View Post
The latter, given that the two already co-exist relatively peacefully. And I'm not discounting Dolby at all. But if HLG catches on for broadcast and Technicolor releases something else, it's hard to see how Dolby can wrap up the whole market for itself. It'll be one of numerous HDR flavors available, likely marketed as "premium" HDR.
I am all for more competition and am hoping Technicolor's magic (dust) will join in. But then the problem arises for the movie industry of whom to support. Who am I to say, but some movies pick the wrong standard, for listening and/or viewing!
tanman is offline  
post #3328 of 3545 Old 07-30-2017, 02:21 PM
Quote:
Originally Posted by tanman View Post
4K has to become a broadcast industry standard until it really picks up. But then the industry will induct 8K in the hall-of-fame and then watch out for Soap Opera Effects on these threads!
We never even got 1080p with OTA HD. I am fine with just 720p and leaving it at that. I am not replacing hundreds of DVDs and BDs, five BD players, and an AVR just to see 4K. And I will never stream.

720p is good enough for me. I can't tell a difference with 4K in my bedroom, 11 ft away from a 60" screen, anyway. I couldn't care less about color.

CHASLS2 is online now  
post #3329 of 3545 Old 07-30-2017, 02:27 PM
Quote:
Originally Posted by Al Leong View Post

I don't "bicker" (females bicker)
Quote:
Originally Posted by gorman42 View Post
Seriously? I find this very offensive. What is this? 1950 all over again?
yes and thank you gorman, as a female i very much agree with you. perhaps al just forgot to add a smilie or winky face after that comment? would have made it somewhat less offensive.

so much more i'd really like to say but i am doing my best to respect the mods' multiple requests.
kittycarole is offline  
post #3330 of 3545 Old 07-30-2017, 02:37 PM
Quote:
Originally Posted by CHASLS2 View Post
We never even got 1080p with OTA- HD. I am fine with just 720p and leave it at that. I am not replacing 100's of DVD's and BD's and 5 BD players and a AVR just to see 4k. I will never stream as well.

720P is good enough for me. Can't tell a diff in 4k in my bedroom 11ft away from a 60" screen anyways. I can't care less about color.
So was it because the broadcast industry couldn't catch up to the standards, or was interlaced just much easier in the long run?
tanman is offline  