

·
Registered
Joined
·
1,562 Posts
Thanks, now I see it :)



Would be nice if there were a list of games that support it. But it's easy enough to toggle to check.
I haven't been able to find a list anywhere.

TV - LG C9, C8 & C6 - AVR - Pioneer SC-LX502, Xbox One X, PS4 Pro & Switch
 

·
Registered
Joined
·
9,194 Posts
Please correct me if I am wrong, but I think you should use your regular HDR settings unless the game itself supports HGIG. My understanding is HGIG disables any tone mapping by the TV (someone a few pages back tested patterns and found with HGIG enabled there was detail loss around 700 nits peak brightness) so the game engine itself can do all the tone mapping dynamically to the TV's capabilities.
You can't disable the TV's own tone mapping - it's baked in, and required. Every time the signal asks for a colour outside the gamut the TV can reproduce (WCG), and every time the signal asks for a pixel to be brighter than the TV can display, it will be tone-mapped down to something within the TV's capabilities. The standards for HDR and WCG were designed to be future-proof (at least more so than SD and HD were). This means that every HDR TV in the world will be doing tone mapping for decades to come, until some time in the distant future when all TVs can display the full Rec.2020 colour gamut over the full range from 0-10,000 nits. This is something we all need to remember.

You can disable dynamic tone mapping in the TV's menus, which is completely different. That just changes the TV's tone-mapping curve dynamically by reading the input signal and guessing. In other words, it controls how and where the TV does its tone mapping, but the TV still has to do tone mapping! :) Using DTM with an HGIG game would interfere with HGiG's tone-mapping strategy, where the game produces images within the known tone curve for each TV, which it can look up from its database. For that to work, the TV needs to have a fixed tone curve that the game can render within, not a constantly changing one. So you can't then go and use something like "dynamic tone mapping", which would act against it. The thinking around here is that the HGiG item in the menu is the same as "off", but that's just a logical theory based on HGiG's own documents.
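To put some toy numbers on that (completely made up - not LG's actual algorithm, just an illustration of fixed vs. dynamic curves), here's a little Python sketch. With a fixed curve the same 600-nit pixel always comes out at the same brightness; with a dynamic curve the knee moves with whatever the TV measures in the frame, which is exactly what a game can't render into:

import numpy as np

# Toy numbers only - not any real TV's processing. A fixed (static) tone curve
# always maps a given signal level to the same light output; a "dynamic" one
# moves its knee around based on what it measures in each frame, so the same
# pixel lands at a different brightness from scene to scene.

PANEL_PEAK = 750.0  # assumed panel peak, in nits

def tone_map(nits, knee_nits):
    # Track the signal 1:1 up to the knee, then roll off so the output
    # never exceeds PANEL_PEAK.
    nits = np.asarray(nits, dtype=float)
    return np.where(
        nits <= knee_nits,
        nits,
        knee_nits + (PANEL_PEAK - knee_nits)
        * np.tanh((nits - knee_nits) / (PANEL_PEAK - knee_nits)),
    )

def dynamic_knee(frame_peak):
    # Made-up heuristic: the brighter the frame's measured peak, the earlier
    # the roll-off starts, to leave headroom for that frame's highlights.
    return PANEL_PEAK * min(1.0, 0.9 * PANEL_PEAK / frame_peak)

pixel = 600.0  # the same 600-nit pixel appearing in two different scenes

fixed = tone_map(pixel, knee_nits=0.65 * PANEL_PEAK)           # same result every frame
dtm_dim = tone_map(pixel, dynamic_knee(frame_peak=800.0))      # dim scene
dtm_bright = tone_map(pixel, dynamic_knee(frame_peak=4000.0))  # bright scene

print(round(float(fixed)), round(float(dtm_dim)), round(float(dtm_bright)))
# -> 594 600 526 : with DTM the same pixel changes with the scene around it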

All discussed in this thread a few days ago, as you said. HTH :)
 

·
Registered
Joined
·
2,767 Posts
Discussion Starter #8,024
So should I turn HGIG on or off to get the best picture?


Personally I would use Cinema for HDR if you have an Xbox X, because it supports ALLM.
I want to wait and see what exactly is going on with HGiG. No harm in using it, though.


Sent from my iPhone using Tapatalk
 

·
Registered
Joined
·
375 Posts
You can't disable the TV's own tone mapping - it's baked in, and required.
The TV uses different tone-mapping curves for different settings. I believe HGIG eliminates the use of tone-mapping curves and instead displays everything 1:1, clipping anything above the panel's peak brightness (which should never be sent by the game in the first place).

Every time the signal asks for a colour outside the gamut the TV can reproduce (WCG), and every time the signal asks for a pixel to be brighter than the TV can display, it will be tone-mapped down to something within the TV's capabilities. The standards for HDR and WCG were designed to be future-proof (at least more so than SD and HD were). This means that every HDR TV in the world will be doing tone mapping for decades to come, until some time in the distant future when all TVs can display the full Rec.2020 colour gamut over the full range from 0-10,000 nits. This is something we all need to remember.
The point of HGIG, as you mention below, is that the game will never ask for a color outside the TV's native gamut. Tone mapping lets a TV represent content mastered beyond its capabilities as accurately as possible within those capabilities. What an HGIG game sends to the TV is already mapped to the TV's capabilities, meaning no further mapping needs to be done by the TV. The TV shouldn't be rolling off HDR content before hitting the panel's color/brightness limits, because the game won't send anything beyond those limits. This is why the TV needs to support HGIG: so it doesn't tone map the already tone-mapped image a second time, further compressing the HDR content.

You can disable dynamic tone mapping in the TV's menus, which is completely different. That just changes the TV's tone-mapping curve dynamically by reading the input signal and guessing. In other words, it controls how and where the TV does its tone mapping, but the TV still has to do tone mapping! :) Using DTM with an HGIG game would interfere with HGiG's tone-mapping strategy, where the game produces images within the known tone curve for each TV, which it can look up from its database. For that to work, the TV needs to have a fixed tone curve that the game can render within, not a constantly changing one. So you can't then go and use something like "dynamic tone mapping", which would act against it. The thinking around here is that the HGiG item in the menu is the same as "off", but that's just a logical theory based on HGiG's own documents.
On this we are agreed. I am not talking about using the TV's dynamic tone mapping with HGIG. The two together would be pointless, and I believe the HGIG setting disables dynamic tone mapping.

After further reflection, our disagreement about how HGIG works may simply be an issue of semantics. When I say HGIG disables tone mapping, what I mean is that the TV displays everything 1:1, without changing the color or brightness of each pixel as it would with a normal tone-mapping curve, such as rolling off luminance values above a certain level to preserve highlight detail in content mastered beyond the panel's capabilities.
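As a purely illustrative sketch of that "second tone map" problem (made-up curve and numbers, not any real TV's or game's processing): if the game has already compressed everything down to the panel's peak, a TV roll-off applied on top dims those highlights a second time, whereas a 1:1 display that only clips leaves the game's output untouched:

import numpy as np

PANEL_PEAK = 750.0  # assumed panel peak, in nits

def rolloff(nits, peak, knee=0.65):
    # Track 1:1 up to knee*peak, then roll off smoothly so nothing exceeds peak.
    nits = np.asarray(nits, dtype=float)
    k = knee * peak
    return np.where(nits <= k, nits,
                    k + (peak - k) * np.tanh((nits - k) / (peak - k)))

scene = np.array([100.0, 400.0, 1000.0, 4000.0])  # what the game wants to show

sent = rolloff(scene, PANEL_PEAK)            # game-side mapping: nothing above 750 leaves the console
double = rolloff(sent, PANEL_PEAK)           # TV rolls off again -> already-mapped highlights dimmed
hgig_style = np.clip(sent, 0.0, PANEL_PEAK)  # 1:1 display with clip: nothing to clip, image untouched

print(np.round(sent))        # [ 100.  400.  740.  750.]
print(np.round(double))      # [ 100.  400.  683.  687.] - double-compressed highlights
print(np.round(hgig_style))  # [ 100.  400.  740.  750.] - exactly what the game intended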
 

·
Registered
Joined
·
9,194 Posts
The TV uses different tone-mapping curves for different settings. I believe HGIG eliminates the use of tone-mapping curves and instead displays everything 1:1, clipping anything above the panel's peak brightness (which should never be sent by the game in the first place).
I don't think that's correct because the HGIG document on their website (discussed here the other day) goes to great length to talk about the TV's tone curves. If it was the case that they were asking every manufacturer to implement a brand new extra tone-mapping algorithm with no rolloff (they nearly all have rolloffs, except for mastering displays which clip), then that would be a huge change across every model!

But yes I think we need to find out a bit more what the TV is doing. I await more games that use it and people testing things with great interest!

It should be quite simple for someone with a C9 and a meter to measure the tone-curves in HGIG mode with different metadatas.
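For reference, the targets would just be the standard ST 2084 PQ curve. A quick Python sketch of the nits a display tracking 1:1 should measure at a few example stimulus levels (the levels below are ones I picked for illustration, not an official pattern set):

# SMPTE ST 2084 (PQ) EOTF: converts a normalised signal level (0.0 - 1.0)
# to absolute luminance in nits. A display tracking 1:1 should measure these
# values for a window pattern at the given level, then clip (HGIG-style) or
# roll off once it runs out of panel.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(signal):
    e = signal ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

for pct in (50, 58, 65, 75, 90, 100):
    print(f"{pct}% PQ -> {pq_to_nits(pct / 100):.0f} nits")
# ~92, ~202, ~390, ~983, ~3906, 10000 nits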

For the game to arrange not to request a brightness outside the TV's capabilities sounds logical. But colours still have to be tone-mapped, and I haven't seen anything in the HGIG documents about a database of TV's colour gamuts. Unless the games stick to Rec.709, and hope that the TVs can all display it, on the basis that "most" TVs can get 9x% of rec.709 now.
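On the colour side, a small illustration of why something still has to handle out-of-gamut colours (the matrix is the commonly published BT.2020-to-BT.709 conversion; the test colour is just an example I picked): a fully saturated Rec.2020 green converts to Rec.709 with negative components, which a narrower-gamut panel simply can't show, so either the game or the TV has to decide how to clip or compress it:

import numpy as np

# Linear-light BT.2020 -> BT.709 conversion matrix (per ITU-R BT.2087).
M_2020_TO_709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

saturated_green = np.array([0.0, 1.0, 0.0])   # a pure BT.2020 green, as an example
in_709 = M_2020_TO_709 @ saturated_green
print(in_709)                     # [-0.5876  1.1329 -0.1006] -> out of gamut
print(np.clip(in_709, 0.0, 1.0))  # the crudest fix: hard clip (shifts hue/saturation)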
 

·
Registered
Joined
·
375 Posts
I don't think that's correct because the HGIG document on their website (discussed here the other day) goes to great length to talk about the TV's tone curves. If it was the case that they were asking every manufacturer to implement a brand new extra tone-mapping algorithm with no rolloff (they nearly all have rolloffs, except for mastering displays which clip), then that would be a huge change across every model!

But yes I think we need to find out a bit more what the TV is doing. For the game to arrange not to request a brightness outside the TV's capabilities sounds logical. But colours still have to be tone-mapped, and I haven't seen anything in the HGIG documents about a database of TV's colour gamuts. Unless the games stick to Rec.709, and hope that the TVs can all display it, on the basis that "most" TVs can get 9x% of rec.709 now.

It should be quite simple for someone with a C9 and a meter to measure the tone-curves in HGIG mode with different metadatas.
I agree we need a little more information. The document also discussed the console overriding the TV's tone mapping.

See my post here - NBA 2K20 (PS4 PRO) definitely does.

http://imgur.com/a/E4ibMvJ
That's an interesting video. I know you can't judge everything from a video, but it looks like the red paint and the brightest highlight from the light reflection stayed the same, while the wood grain of the court had increased highlights, bringing out more detail.
 

·
Registered
Joined
·
769 Posts
I agree we need a little more information. The document also discussed the console overriding the TV's tone mapping.

That's an interesting video. I know you can't judge everything from a video, but it looks like the red paint and the brightest highlight from the light reflection stayed the same, while the wood grain of the court had increased highlights, bringing out more detail.
Yeah, I went through every game I have installed last night.

Madden NFL 20 -

Definitely not updated (not surprising); Madden is notorious for having a flawed HDR implementation.

NBA 2K20 -

Interestingly, HGIG enabled results in a very similar image to dynamic tone mapping, but without the fluctuations, so that leads me to believe it's implemented.

Horizon Zero Dawn -

Much dimmer with HGIG enabled; however, I did note that on the title screen the sun has extra brightness with HGIG enabled, as does the onscreen text - so the jury's out.


Battlefront II -

Couldn't tell much of a difference here. The clouds in the sky had a slight change in brightness/detail between off and HGIG.


Those are the games I tested last night - with the eye test, of course. I fully anticipate that Modern Warfare will support HGIG, and that's why Sony pushed this update out in a timely manner.
 

·
Registered
Joined
·
45 Posts
Can someone summarize the best settings I should be using for HDR PC gaming? I currently have 4:4:4 RGB 8-bit set, along with PC mode on the LG.

I know 4:2:2 10-bit YCbCr is an option and I try that from time to time. I am also reading in this thread that I should rename my PC input icon to something else to help alleviate banding; I'm confused about that. Is keeping my PC in PC mode on the LG OK? I do notice some minor banding in Gears 5 in dim scenes if I'm looking for it. Thanks.
 

·
Registered
Joined
·
10 Posts
Cleanupdisc.

I asked the same question and got the following reply from Adam at Rtings:

'To get the full HDR effect, you have to be sending a minimum of 10 bit color, which automatically restricts you to YCbCr422 for 4k @ 60Hz gaming, due to the bandwidth limitations of HDMI 2.0. If you want the best HDR experience, this is the recommended setting. If frame rate is more important to you, it might be best to switch to 8 bit SDR.'

I have played about with 4:2:2, 4:4:4 and RGB, and can only really say that 4:2:2 10-bit appears brighter and slightly more vibrant to my eyes, but it does have a small decrease in text clarity.
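For anyone curious where that restriction comes from, the rough arithmetic looks like this (nominal CTA-861 4K60 timing and the usual TMDS 8b/10b overhead; the packing details are simplified, so treat the numbers as approximate):

HDMI20_LINK_GBPS = 18.0                        # HDMI 2.0: 3 lanes x 6 Gbps
HDMI20_DATA_GBPS = HDMI20_LINK_GBPS * 8 / 10   # after 8b/10b coding -> 14.4 Gbps

PIXEL_CLOCK_4K60 = 594e6   # 3840x2160@60 incl. blanking (4400 x 2250 total)

def required_gbps(bits_per_pixel):
    return PIXEL_CLOCK_4K60 * bits_per_pixel / 1e9

formats = {
    "RGB / 4:4:4, 8-bit":     24,  # 3 components x 8 bits
    "RGB / 4:4:4, 10-bit":    30,  # what we'd want for HDR without subsampling
    "YCbCr 4:2:2, 10/12-bit": 24,  # HDMI carries 4:2:2 at 24 bits/pixel regardless of depth
    "YCbCr 4:2:0, 10-bit":    15,
}

for name, bpp in formats.items():
    need = required_gbps(bpp)
    verdict = "fits" if need <= HDMI20_DATA_GBPS else "does NOT fit"
    print(f"{name}: {need:.2f} Gbps needed -> {verdict} in {HDMI20_DATA_GBPS:.1f} Gbps")
# RGB/4:4:4 10-bit needs ~17.8 Gbps, which is why 4K60 HDR drops to 4:2:2 (or 8-bit) on HDMI 2.0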
 

·
Vendor
Joined
·
26,676 Posts
FYI, fellow wall-mount members: your old mounts will need to move down the wall, because the VESA holes are not centered on the panel but are located near the bottom of the TV. I have to get the TV tomorrow and measure carefully before I drill, as I cannot find a diagram anywhere.
I like it nice and close to the wall...
https://www.amazon.com/dp/B007HQW0HQ/ref=psdc_760796_t4_B0069MCH6S Yeah, I'll probably have to cut the access arms off with a hacksaw, but it sits 0.35" off the wall...
 

·
Registered
Joined
·
1,562 Posts
That's the only mode HGIG is available in.

TV - LG C9, C8 & C6 - AVR - Pioneer SC-LX502, Xbox One X, PS4 Pro & Switch
 

·
Registered
Joined
·
259 Posts
Thanks Ivan, hopefully when HDMI 2.1 graphics cards come out we can just put our settings at 4:4:4 10-bit and be done with it, haha.
This is the statement of the century.

Although I just bought a 2080 Ti a few months ago, I knew this would be a problem, but I still bought one. I can still sell it.

But.....

Can't we just reduce the resolution? Shouldn't [email protected] 4:4:4 10-bit be possible, either at 60 or 120 Hz?
 

·
Registered
Joined
·
316 Posts
The Last of Us Remastered also supports HGiG.
I played around with it, but IMHO it looks even worse compared to HGiG off.
DTM still gives me the best result, even though it messes around with the whole image.

Some examples (recorded with my 1+6T, so do not expect perfect quality, but I think you should still see the differences):

1st Example (Bright sun):
https://www.youtube.com/watch?v=H5YRR5dltOU
With HGiG on I lose detail in the brighter areas around the sun. On the other hand, the clouds on the left side looked a bit better IMHO. This goes for pretty much every bright spot: instead of gaining detail I lose detail compared to HGiG off/DTM on.

2nd Example (Darker area):
https://www.youtube.com/watch?v=mgjvZ3bTp3I
As you might see, this is a darker area. Between HGiG on and off there is zero difference.
DTM on brightens the whole image and reveals more details.

During the HDR configuration in the PS4 menu I noticed the following:
With HGiG off and with HGiG on, configuring the sliders is much more "rough".
The setup tells me to adjust the sliders until I can barely see the logo.
If I try this, at one point the logo is very noticeable; one click further, I do not notice the logo at all.
Then I have to decide between "clearly seeing the logo" and "not seeing the logo at all". There is no in-between.
In the end it does not even make a difference for the in-game result which way I go; both look the same.

I know it is not recommended to do this setup with DTM enabled, but if I run it with DTM enabled I am able to control these sliders much more precisely. Only with DTM enabled am I able to make the logos "barely" visible.
 

·
Registered
Joined
·
1,562 Posts
The Last of Us Remastered also supports HGiG.
I played around with it, but IMHO it looks even worse compared to HGiG off.
DTM still gives me the best result, even though it messes around with the whole image.

Some examples (recorded with my 1+6T, so do not expect perfect quality, but I think you should still see the differences):

1st Example (Bright sun):
https://www.youtube.com/watch?v=H5YRR5dltOU
With HGiG on I lose detail in the brighter areas around the sun. On the other hand, the clouds on the left side looked a bit better IMHO. This goes for pretty much every bright spot: instead of gaining detail I lose detail compared to HGiG off/DTM on.

2nd Example (Darker area):
https://www.youtube.com/watch?v=mgjvZ3bTp3I
As you might see, this is a darker area. Between HGiG on and off there is zero difference.
DTM on brightens the whole image and reveals more details.

During the HDR configuration in the PS4 menu I noticed the following:
With HGiG off and with HGiG on, configuring the sliders is much more "rough".
The setup tells me to adjust the sliders until I can barely see the logo.
If I try this, at one point the logo is very noticeable; one click further, I do not notice the logo at all.
Then I have to decide between "clearly seeing the logo" and "not seeing the logo at all". There is no in-between.
In the end it does not even make a difference for the in-game result which way I go; both look the same.

I know it is not recommended to do this setup with DTM enabled, but if I run it with DTM enabled I am able to control these sliders much more precisely. Only with DTM enabled am I able to make the logos "barely" visible.
I don't think it supports HGIG.

Even with games that don't support it, you will still see some changes; unfortunately, they will be the wrong changes.

The game is just reacting to you changing the setting.

I doubt they knew anything about HGIG when they made or remastered TLOU.

TV - LG C9, C8 & C6 - AVR - Pioneer SC-LX502, Xbox One X, PS4 Pro & Switch
 