AVS Forum banner

301 - 320 of 648 Posts

·
Registered
Joined
·
506 Posts
8 bit+FRC is "capable" of doing HDR - but it has to quickly switch between 2 different colors to give you a "fake" or "approximate" color - that is not exactly the "real" color that you would see on a 10-bit panel.

:confused:

It uses frame rate control to do that. Remember, most HDR content is in movies that are 24p, while the panels run at 120Hz. My HU8550 has a 10-bit panel and so does my KS8000. I wish I could convince my relative that the colors are fake and to trade with me if this proved out to be fact, but they never would, lol. I can tell you without a doubt the Q80T is capable of better HDR than most TVs on the market. Remember, that switching happens far faster than your eye can resolve. It isn’t a fake or inaccurate color, and your eyes will still see the color that is being produced. The method has been used with monitors and displays for years. If you really want to know for sure, your best bet is to just see the TV for yourself.
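The frame-rate-control idea described above can be sketched in a few lines of Python — a simplified illustration of temporal dithering in general, not any vendor's actual implementation (which also spreads the pattern spatially and handles the top code value):

```python
def frc_frames(level_10bit: int, num_frames: int = 4) -> list[int]:
    """Build the sequence of 8-bit frame values that time-average
    to the requested 10-bit level (temporal dithering)."""
    base, frac = divmod(level_10bit, 4)  # 10-bit has 4x the levels of 8-bit
    # show the next-higher 8-bit level in `frac` out of every 4 frames
    return [base + 1 if i < frac else base for i in range(num_frames)]

def perceived(frames: list[int]) -> float:
    """The eye integrates the rapid alternation into one intermediate shade."""
    return sum(frames) / len(frames)

# 10-bit level 513 has no exact 8-bit equivalent (513 / 4 = 128.25):
print(frc_frames(513))             # [129, 128, 128, 128]
print(perceived(frc_frames(513)))  # 128.25 -- the "in-between" shade
```

Because a 120Hz panel refreshes far faster than the eye integrates, the alternating frames blend into that intermediate shade rather than being seen as flicker.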


Sent from my iPhone using Tapatalk
 
  • Like
Reactions: venus933 and denpom

·
Registered
Joined
·
3,315 Posts
Also, depending on the implementation of FRC, image quality in some cases is better than on some "true" 10-bit panels, or at least indistinguishable.

We get it, you love your Q80T, nothing is wrong with your Q80T...

Anyone in this forum who brings up a concern gets attacked personally and gunned down by you - your Q80T is the most perfect, most wonderful TV that you have ever had.

We get it - enjoy your TV! :)
 

·
Registered
Joined
·
1,932 Posts
Discussion Starter #304
We get it, you love your Q80T, nothing is wrong with your Q80T...
10-bit gradients, again, as proven by tests, are definitely not what's wrong with it. I'm still confused about what you're doing in these threads.
 

·
Registered
Joined
·
429 Posts
It uses frame rate control to do that. Remember, most HDR content is in movies that are 24p, while the panels run at 120Hz. My HU8550 has a 10-bit panel and so does my KS8000. I wish I could convince my relative that the colors are fake and to trade with me if this proved out to be fact, but they never would, lol. I can tell you without a doubt the Q80T is capable of better HDR than most TVs on the market. Remember, that switching happens far faster than your eye can resolve. It isn’t a fake or inaccurate color, and your eyes will still see the color that is being produced. The method has been used with monitors and displays for years. If you really want to know for sure, your best bet is to just see the TV for yourself.


Sent from my iPhone using Tapatalk
As you give great explanations and I don't want to get into the back-and-forth match: will we see an impact if the screen is receiving a true 10-bit color signal? Would 8-bit + FRC be a limit to fine color reproduction for sources such as next-gen consoles or UHD players? I know it might not matter much now, as most sources the TV receives are 8-bit.
 

·
Registered
Joined
·
506 Posts
As you give great explanations and I don't want to get into the back-and-forth match: will we see an impact if the screen is receiving a true 10-bit color signal? Would 8-bit + FRC be a limit to fine color reproduction for sources such as next-gen consoles or UHD players? I know it might not matter much now, as most sources the TV receives are 8-bit.

The short answer is no, you won’t see a difference (you can Google it if you’re still curious). All HDR10 content is 10-bit. There could be an issue if these sets supported Dolby Vision, which is 12-bit, and the panels were indeed 8-bit + FRC, as I’m not sure if they can go above 10-bit. Although DV does work at 10-bit (for an interesting read, check out the article below):

https://community.cedia.net/blogs/david-meyer/2018/05/16/hdmi-data-rates-for-4k-hdr
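For perspective on those bit depths, the per-channel level counts work out as follows (a quick back-of-the-envelope calculation, not from the linked article):

```python
# Per-channel shade counts for each bit depth (illustrative math only)
for bits in (8, 10, 12):
    levels = 2 ** bits        # shades per color channel
    colors = levels ** 3      # total RGB combinations
    print(f"{bits}-bit: {levels} levels/channel, {colors:,} colors")
# 8-bit: 256 levels/channel, 16,777,216 colors
# 10-bit: 1024 levels/channel, 1,073,741,824 colors
# 12-bit: 4096 levels/channel, 68,719,476,736 colors
```

Since 1024 / 256 = 4, an 8-bit+FRC panel only has to dither between adjacent native levels to cover the three 10-bit steps in between, which is part of why the two are so hard to tell apart in practice.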


Sent from my iPhone using Tapatalk
 

·
Registered
Joined
·
429 Posts
The short answer is no, you won’t see a difference (you can Google it if you’re still curious). All HDR10 content is 10-bit. There could be an issue if these sets supported Dolby Vision, which is 12-bit, and the panels were indeed 8-bit + FRC, as I’m not sure if they can go above 10-bit. Although DV does work at 10-bit (for an interesting read, check out the article below):

https://community.cedia.net/blogs/david-meyer/2018/05/16/hdmi-data-rates-for-4k-hdr


Sent from my iPhone using Tapatalk
Awesome, I'll check it out, thanks!
 

·
Vendor
Joined
·
26,628 Posts
I'm happy with all my sets!

I had to edit... here is the Q70 and a 77C8 OLED. This is from First Man. Granted, the lunar module is overexposed, but the blooming and black bars stick out. If the Q80T eliminates that altogether, then I'll drink the Kool-Aid.
We get it, you love your Q80T, nothing is wrong with your Q80T...

Anyone in this forum who brings up a concern gets attacked personally and gunned down by you - your Q80T is the most perfect, most wonderful TV that you have ever had.

We get it - enjoy your TV! :)
That is great if he found a set that suits his needs perfectly; nothing wrong with that. The 80T has a great price point as well.
 

·
Registered
Joined
·
35 Posts
I'm having trouble determining if my Q80T is receiving HDR content. Rtings says, "a small HDR icon appears next to the picture mode on the quick settings menu." However, when I am playing an HDR game on my Xbox One X, I do not see this icon. I suspect that it is receiving an HDR signal since some settings seem to change after the game starts vs the Xbox menu (brightness moves up to 50), but I don't see any other indication. Is there some other way to check?
 

·
Registered
Joined
·
2,216 Posts
There is a lot of discussion going on about 8-bit+FRC vs 10-bit, not only in this thread but in others as well. Anyone who has technical knowledge about FRC will tell you it is impossible to tell the difference between 8-bit+FRC and 10-bit without test patterns and a high-speed camera. I have seen panels side by side. There is absolutely no difference between the two in content. LCD panels have been using FRC for decades without issue.
 

·
Registered
Joined
·
506 Posts
I'm having trouble determining if my Q80T is receiving HDR content. Rtings says, "a small HDR icon appears next to the picture mode on the quick settings menu." However, when I am playing an HDR game on my Xbox One X, I do not see this icon. I suspect that it is receiving an HDR signal since some settings seem to change after the game starts vs the Xbox menu (brightness moves up to 50), but I don't see any other indication. Is there some other way to check?

So there are a few ways to tell. The first is what you are seeing: maxed-out settings. Another is clicking on the source again to see the information banner, where it should read UHD HDR or something like that (there was a bug in early NU8000 firmware where this didn’t update the information banner properly, but you could tell because settings were maxed out). This method won’t work inside Samsung apps, as you are not using one of the HDMI sources. The last method is when you click Home and scroll over to Settings (don’t enter the settings, but look at the quick settings): it will say the picture mode (i.e., Movie) and HDR or HDR10+ in the corner of the picture mode box.


Sent from my iPhone using Tapatalk
 

·
Registered
Joined
·
35 Posts
So there are a few ways to tell. The first is what you are seeing: maxed-out settings. Another is clicking on the source again to see the information banner, where it should read UHD HDR or something like that (there was a bug in early NU8000 firmware where this didn’t update the information banner properly, but you could tell because settings were maxed out). This method won’t work inside Samsung apps, as you are not using one of the HDMI sources. The last method is when you click Home and scroll over to Settings (don’t enter the settings, but look at the quick settings): it will say the picture mode (i.e., Movie) and HDR or HDR10+ in the corner of the picture mode box.


Sent from my iPhone using Tapatalk
Thank you, the source method worked best. Picture mode is grayed out in game mode.
 

·
Registered
Joined
·
16 Posts
HDR viewing:

Picture Mode: Movie (note that Standard mode slightly oversaturates and boosts contrast - not by much, but still. For example, Color: 27 in Standard mode would be about 5 ticks higher in Movie mode if you want to roughly match them)

Brightness: 50
Contrast: 50
Sharpness: 1-10 (HDR material with a lot of film grain, especially artificial grain, wouldn't look good with anything but 0/1, but for something like Transformers: The Last Knight this can easily be boosted higher)
Color: 27/30 (see above)
Picture Clarity: Blur: 10/0 Judder: 0 Noise Reduction: Auto
Local Dimming: High
Contrast Enhancer: Low (yes, High would make it brighter, but 98% of content out there is not mastered to be THAT bright, and the High setting would blow out the highlights. Some content is, though; leave it on Low if you don't want to experiment with it all the time)
Color Tone: Warm1
White Balance 2 point: Gain: 6, -9, -3 Offset: 2, 0, 2
Gamma ST.2084: 2
Shadow Detail: 0

Color Space Custom (DCI-P3):
RED: 100 40 32
GREEN: 5 90 70
BLUE: 30 3 100
YELLOW: 55 40 45
CYAN: 25 50 43
MAGENTA: 46 40 55

I really like these settings! I'm going with Contrast Enhancer on High; that's the only difference.
I'm watching Hunters, which is HDR10+ and these settings are legit!

Thanks!
 

·
Registered
Joined
·
38 Posts
I really like these settings! I'm going with Contrast Enhancer on High; that's the only difference.
I'm watching Hunters, which is HDR10+ and these settings are legit!

Thanks!
What’s the viewing environment? I wouldn’t imagine it would be a dark room.
 

·
Registered
Joined
·
16 Posts
What’s the viewing environment? I wouldn’t imagine it would be a dark room.
My living room, pretty bright in the daytime. At night it's kinda dark, I don't watch in total dark.
That show is pretty dark, and these settings work well for it. I noticed that when I watched HDR with these settings, not HDR10+ but just HDR, I had to change settings up a bit.
I guess it just depends on the content? I'm not real familiar with how HDR works, but some movies are dark, some not...I wish I could find a one and done setting where everything looks good.
 

·
Registered
Joined
·
16 Posts
My living room, pretty bright in the daytime. At night it's kinda dark, I don't watch in total dark.
That show is pretty dark, and these settings work well for it. I noticed that when I watched HDR with these settings, not HDR10+ but just HDR, I had to change settings up a bit.
I guess it just depends on the content? I'm not real familiar with how HDR works, but some movies are dark, some not...I wish I could find a one and done setting where everything looks good.
The only time I can sit down and watch movies is on the weekends. That's when I get to see HDR content.
During the week, I get home and basically watch news and a few shows and I'm off to bed.
I just basically use standard settings and turn on intelligent mode/adaptive picture and just enjoy the shows without trying to judge the picture.
I know that it kills the blacks, but if I'm not constantly trying to judge, the picture looks pretty darn good out of the box for what I watch! I don't care what the director intended, as long as it looks good to me.

I'm guessing I'm the only one on here that uses intelligent/adaptive settings when just casually watching?
 

·
Registered
Joined
·
93 Posts
After my Q80T got wrecked, I replaced it with a Vizio PX75-G1. I decided to go with a cheaper set so that if it got wrecked again, the loss would sting less.



I can't tell how they compare calibrated, because I never got a chance to calibrate the Q80T, but I can say that I miss the anti-glare filter. The Vizio does OK at anti-glare, but the Samsung filter is really special.


Out of the box, the Vizio comes with brighter shadows. To my eyes, it looks better 90% of the time, but 10% of the time the shadows look noisy or boosted somehow.


The Q80T really nailed local dimming when displaying text and other blocky angular content like user interfaces. I rarely noticed the FALD working when switching apps or navigating through menus. On the Vizio, the FALD is very noticeable in menus, but I could get used to it. FALD on both sets did great on real content.


I noticed zero DSE on the Q80T, and I have already noticed some DSE on the Vizio in real content, which sucks.



Overall, in terms of PQ each set has pros and cons, except for the Vizio's DSE. The Samsung anti-glare filter is what makes the Q80T a winner for me, but in the end I didn't think it was worth 80% more money (given the way-higher-than-average risk that a sword might go through my screen).


I don't think I'll be back in this thread very much but thought I would throw in my comparison.
 

·
Registered
Joined
·
1,932 Posts
Discussion Starter #319
I noticed that when I watched HDR with these settings, not HDR10+ but just HDR, I had to change settings up a bit.
I guess it just depends on the content? I'm not real familiar with how HDR works, but some movies are dark, some not.
exactly. HDR content differs in mastering quite a bit so it's normal to slightly tweak settings depending on what you're watching.
 

·
Registered
Joined
·
16 Posts
Going back to Hunters, which I am watching on the Prime app... during the intro of every episode my Q80T really changes brightness levels; it's really noticeable.
Being that this is my first full-array panel, I am assuming this is how Local Dimming works? I notice it every now and then while I'm watching, and I am OK with it.
Here is the thing: I don't notice it a whole lot in other apps, or at least it hasn't really jumped out at me. My question is, is this just how Local Dimming works? Or is it the Prime app?
 