

·
Registered
Joined
·
815 Posts
I doubt LG will fix that. They will just call it a feature. Or it needs a true HDMI 2.1 source to go away; if not, they would have fixed it already. Also, I'm presuming the raised blacks are a feature to control burn-in from gaming.

My bet is that this is a simple oversight. Similar to how the display panel behaves differently between SDR and HDR modes (higher white-subpixel brightness ceiling, different ABL, etc.), it also behaves differently between VRR and FRR modes. I guess they could make them look the same, either on the panel-driver side or via the processor (for example, separate LUTs for VRR and FRR, though if any Brightness setting causes elevated MLL, it probably has to be fixed on the panel side). Maybe it's not even that sort of mis-calibration and only a simple bug (like a typo or trivial mistake in the code).
 

·
Registered
Joined
·
490 Posts
So, to avoid the raised blacks in VRR HDR games on the Xbox One X, should I disable both VRR and Auto Low Latency Mode, or just VRR?
 

·
Registered
Joined
·
815 Posts
So, to avoid the raised blacks in VRR HDR games on the Xbox One X, should I disable both VRR and Auto Low Latency Mode, or just VRR?
On my C9, only VRR alters the tone response (it makes dark gray shades overly bright), but MLL remains absolute black. Low-latency mode (any picture mode with ALLM engaged, or Game mode at all times, even with ALLM disabled) only causes a subtle change in precision, especially near black, but nothing serious, not even in HDR10 mode.
 

·
Registered
Joined
·
1,079 Posts
Discussion Starter #404
So, to avoid the raised blacks in VRR HDR games on the Xbox One X, should I disable both VRR and Auto Low Latency Mode, or just VRR?
On my C9, only VRR alters the tone response (it makes dark gray shades overly bright), but MLL remains absolute black. Low-latency mode (any picture mode with ALLM engaged, or Game mode at all times, even with ALLM disabled) only causes a subtle change in precision, especially near black, but nothing serious, not even in HDR10 mode.
Wait. On Xbox you can enable VRR and ALLM separately? That's not possible on PC: once G-SYNC Compatible is toggled off, the TV drops out of Instant Game Response mode, so ALLM is gone as well.
 

·
Registered
Joined
·
815 Posts
Wait. On Xbox you can enable VRR and ALLM separately? That's not possible on PC: once G-SYNC Compatible is toggled off, the TV drops out of Instant Game Response mode, so ALLM is gone as well.
I don't have an Xbox; I used PC sources. Game picture mode, with or without ALLM (with identical settings where applicable), is effectively the same as Cinema with ALLM and low-latency mode engaged. In other words, LLM is forced on for Game mode, and Cinema+LLM is the same as Game. This is how I deduced what LLM and VRR do separately.
 

·
Registered
Joined
·
1,079 Posts
Discussion Starter #406
Wait. On Xbox you can enable VRR and ALLM separately? That's not possible on PC: once G-SYNC Compatible is toggled off, the TV drops out of Instant Game Response mode, so ALLM is gone as well.
I don't have an Xbox; I used PC sources. Game picture mode, with or without ALLM (with identical settings where applicable), is effectively the same as Cinema with ALLM and low-latency mode engaged. In other words, LLM is forced on for Game mode, and Cinema+LLM is the same as Game. This is how I deduced what LLM and VRR do separately.
Hmm, I'm not really sure about that, because if I remember correctly Cinema+ALLM, and even Game+ALLM, have some settings greyed out (presumably to reduce input lag), while Game mode without ALLM has all the settings available. So I don't think they are exactly the same.
 

·
Registered
Joined
·
815 Posts
So I don't think they are exactly the same.
I mean, the same in terms of what you see on the screen when all the available settings are matched. Display a gradient test chart, stare at the near-black region, and switch between modes. There is a recognizable difference in the banding pattern (it's fairly subtle, but it's there) between normal and low-latency modes (similar to, but less severe than, the banding in PC mode).
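If you want a simple chart for this, here's a minimal sketch (assuming Python with numpy and Pillow installed; the file name and resolution are just placeholders) that writes an 8-bit near-black ramp you can show full screen while switching modes:

Code:
# A minimal sketch: write an 8-bit near-black horizontal ramp as a PNG.
# Display it full screen, look at the darkest bands, and switch between
# normal / low-latency / PC modes to compare the banding pattern.
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 3840, 2160
MAX_CODE = 64          # only sweep the near-black range: 8-bit codes 0..63

# Horizontal ramp: left edge is code 0 (black), right edge is code 63.
codes = np.linspace(0, MAX_CODE - 1, WIDTH).astype(np.uint8)
ramp = np.tile(codes, (HEIGHT, 1))

Image.fromarray(ramp, mode="L").save("near_black_ramp_8bit.png")
print("Wrote near_black_ramp_8bit.png (codes 0-63 of 255)")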
 

·
Registered
Joined
·
90 Posts
I was wondering if anyone has heard of this issue and if there's a fix, or if my TV needs to be replaced, since I'm still within the return window. During the past few weeks I've owned my 65CX, it has randomly shut down and sometimes restarted itself. At first I thought it was the timer setting or an HDMI control setting, so I changed that stuff and turned it off.

It seemed fine until today. I was playing a game on Xbox (4K HDR, 10-bit settings on the Xbox), and in this game it's easy to have lots of explosions; when that started happening, the CX would just turn off and restart itself. This happened at least 3 times within an hour while playing the same game. I'm thinking it's something to do with the brightness, maybe it's overloading somehow and the TV shuts down to prevent damage. If it's not that, I assume a power supply or other internal issue, if it's not just a simple software bug somewhere on the Xbox or the TV itself. The game that triggers this easily is the newly remastered Saints Row 3.
 

·
Registered
Joined
·
2,835 Posts
I was wondering if anyone has heard of this issue and if there's a fix, or if my TV needs to be replaced, since I'm still within the return window. During the past few weeks I've owned my 65CX, it has randomly shut down and sometimes restarted itself. At first I thought it was the timer setting or an HDMI control setting, so I changed that stuff and turned it off.

It seemed fine until today. I was playing a game on Xbox (4K HDR, 10-bit settings on the Xbox), and in this game it's easy to have lots of explosions; when that started happening, the CX would just turn off and restart itself. This happened at least 3 times within an hour while playing the same game. I'm thinking it's something to do with the brightness, maybe it's overloading somehow and the TV shuts down to prevent damage. If it's not that, I assume a power supply or other internal issue, if it's not just a simple software bug somewhere on the Xbox or the TV itself. The game that triggers this easily is the newly remastered Saints Row 3.
I would exchange it, that's not normal.

Sent from my SM-G986U using Tapatalk
 

·
Registered
Joined
·
1,159 Posts
Is 2.2 or 2.4 correct for games? I mostly watch in a dark room and have my movie/TV inputs set to 2.4 gamma.

I could have sworn I read that games are “mastered” at 2.2? What are you guys using?


Sent from my iPhone using Tapatalk
 

·
Registered
Joined
·
169 Posts
PC SDR content, such as games, is not represented using any of the transfer functions (aka "gamma") the TV supports. The PC SDR color space is sRGB, which shares its color primaries with Rec. 709 but uses the sRGB transfer function, which is different from both the 2.2 and 2.4 options you can find in the TV's display settings. If you are using Game mode, just set Gamma to Medium, as that's the closest to sRGB. The commercial software calibration packages such as Calman and LightSpace that can be used to make custom LUTs don't properly support the sRGB transfer function either; they just use a simple 2.2 gamma power curve instead and pretend it's sRGB.
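For reference, here's a minimal sketch (plain Python, nothing outside the standard library) comparing the sRGB transfer function with a pure 2.2 power curve; the difference sits mostly near black, which is why a 2.2 preset is only the closest match, not an exact one:

Code:
# Compare the piecewise sRGB EOTF (IEC 61966-2-1) with a pure 2.2 power
# curve, which is roughly what the TV's 2.2/Medium gamma options apply.
def srgb_to_linear(v):
    # Linear segment near black, 2.4 power segment above the 0.04045 threshold.
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(v):
    return v ** 2.2

for code in (8, 16, 32, 64, 128, 255):
    v = code / 255.0
    print(f"8-bit code {code:3d}: sRGB {srgb_to_linear(v):.5f}  "
          f"gamma 2.2 {gamma22_to_linear(v):.5f}")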
 

·
Registered
Joined
·
169 Posts
Just wanting to say one more thing. It's normal to see banding with both 8-bit (in both SDR and HDR) and 10-bit (in HDR) signals. There is research that was carried out to find out when people start noticing a difference between two close colors, and its results were used to figure out what signal quantization (i.e. using 8 bits, 10 bits, or something else for color encoding) is needed to keep people from noticing color differences (one of the most obvious cases being banding) caused purely by the quantization being too low. I've posted a link to the ITU BT-2380 report, which has graphics for this, in the owners' thread, but you can easily search for it on the internet.

The fact that these LG TVs show banding with the HDMI input set to PC, in Game mode or even outside it, at 8 or 10 bits, shows that they display a noise-free signal accurately (a PC generates a noise-free signal, while broadcast signals and what ends up encoded as movies always carry noise introduced in the mastering process). If you don't see banding, it's because the TV is adding noise to the signal to hide it. Search the net for "noise shaping" to see how it's done. Outside the HDMI PC input mode, these LG OLEDs do add additional noise to the signal from any source, which is why HDR at YCbCr 4:2:2 10-bit appears to be banding-free. It isn't, by the way, and the TV is kind of dumb about how it adds the noise: display a 10-bit grayscale ramp with the HDMI input set to anything but PC, with YCbCr 10-bit in HDR, and with only the grayscale on screen it will look as if it has no banding; then display something else, like a Windows Explorer window, over the grayscale, and you'll suddenly notice banding over the grayscale, because the TV algorithm that controls the added noise is tricked into decreasing the amount of noise shaping.
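To illustrate the dithering part of this (noise shaping proper also controls where the error lands in frequency, which this doesn't attempt), here's a minimal sketch assuming Python with numpy:

Code:
# Why dither hides banding: quantize a shallow near-black ramp to 8-bit codes
# with and without dither noise added first, then compare local averages to
# the ideal ramp. The eye does a similar spatial averaging, so the dithered
# version looks smooth while the plain one shows steps.
import numpy as np

rng = np.random.default_rng(0)

# Ideal high-precision ramp spanning only a few 8-bit codes (16..20 of 255),
# i.e. exactly the kind of gradient where banding is obvious.
x = np.linspace(16.0, 20.0, 8192)

plain    = np.round(x)                                    # straight quantization
dithered = np.round(x + rng.uniform(-0.5, 0.5, x.size))   # dither, then quantize

# Average over 256-sample blocks (a crude stand-in for how the eye averages
# neighbouring pixels) and measure how far each version drifts from the ideal.
blocks = lambda a: a.reshape(-1, 256).mean(axis=1)
for name, q in (("plain", plain), ("dithered", dithered)):
    drift = np.abs(blocks(q) - blocks(x)).mean()
    print(f"{name:9s} mean drift from ideal ramp: {drift:.3f} code values")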

Another thing I want to set straight is the often-heard "what's the point of 12 bit, the panel is 10 bit anyway?" There are multiple reasons why you want the signal that enters the TV to have as high a quantization level as possible:
1. because at 12 bit, there is no banding visible to people due to the signal quantization itself. 8-bit and 10-bit (with no noise added anywhere in the signal) will show visible banding when displayed accurately, simply because 8 or 10 bits are not enough to encode the colors without banding that is visible to the human eye (see the ITU report I mentioned; 10 bits is enough for SDR but not for HDR). 12 bits, however, is enough to encode the colors so that there is no visible banding when that content is displayed accurately (see the sketch after this list).
2. because any processing performed on a signal introduces errors in its least significant bits. When the signal being processed is 12 bits, those errors have less visible consequences than when it is 10 or 8 bits.
3. because a 12-bit input signal allows both noise shaping and dithering down to 10 bits, which preserves some of the extra detail in the 12-bit signal and results in an image that appears more detailed than one encoded at 10 bits to begin with.
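As a rough illustration of point 1, here's a plain-Python sketch of how large the luminance jump between adjacent PQ (SMPTE ST 2084) code values is at 10 vs 12 bits. Whether a given step is actually visible depends on the contrast-sensitivity (Barten) curves in the ITU report I mentioned; this only shows how much finer the 12-bit steps are:

Code:
# PQ (ST 2084) constants
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e):                       # PQ EOTF: code value 0..1 -> nits
    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def nits_to_pq(nits):                    # inverse PQ: nits -> code value 0..1
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# Relative luminance step between two adjacent code values at a few levels.
for target in (0.1, 1, 10, 100, 1000):
    row = [f"{target:7.1f} nits"]
    for bits in (10, 12):
        levels = 2 ** bits - 1
        code = round(nits_to_pq(target) * levels)
        base = pq_to_nits(code / levels)
        step = pq_to_nits((code + 1) / levels) - base
        row.append(f"{bits}-bit step ~{100 * step / base:.2f}%")
    print("  ".join(row))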

All of this is basic signal-processing stuff. It's done not only in the video world but for any kind of signal acquisition, processing, storage and use. A lot of reviewers and people who post stuff on the internet lack the basics needed to understand what they are reviewing, and that sometimes seems to include popular references such as RTINGS, FlatpanelsHD, or Vincent.

There are people who do this in their professional work, or have researched it professionally. Have you noticed you don't find references to their comments or work anywhere? That's because they soon learn that any effort to share that knowledge outside their circle gets drowned out by noise: from popular internet "celebrities", from for-profit businesses more interested in keeping a discussion going than in simply delivering the answer (what do you think the business model of most of the internet, including forums like this one, is?), or from people who get really upset when someone contradicts the gut feelings they post as "truths". So they give up on it altogether. My advice to those who are really interested in knowing things, rather than knowing whatever passes for truth in street talk, is to look at the existing knowledge on these topics in science and engineering sources. Smart people have already investigated all of this, and it's published out there.
 

·
Registered
Joined
·
103 Posts
For anyone hassling with the Nvidia Control Panel to change the output settings, I've created a command-line tool which lets you change the output color format and bit depth with ease.
It uses the NvAPI_Disp_ColorControl function, which is probably what the control panel itself uses.
I've also added functionality that lets you change the driver's dithering algorithm.
You can't, however, change the resolution (yet).

Download link:
http://www.mediafire.com/file/yxcecws1dwfwhqq/NvColorControl2.0.0.0.zip/file

Here is some basic usage info:

Code:
Usage: NvColorControl <bits> <format> [<dithering: state bits mode>] [<hdr>]
  <bits>      : 8, 10 or 12
  <format>    : RGB (full), RGBLM (limited), YUV444, YUV422 or YUV420
  <dithering> : state: 0 = auto, 1 = enabled, 2 = disabled,
                bits : 0 = 6 bit, 1 = 8 bit, 2 = 10 bit,
                mode : 0 = none, 1, 2 or 3 = spatial, 4 = temporal
  <hdr>       : 0 or 1
Examples:
- NvColorControl 8 YUV444
- NvColorControl 10 YUV422
- NvColorControl 12 YUV420
- NvColorControl 8 RGB 1 1 4 1

NOTES:
- not all combinations are possible
- HDR can currently only be enabled and requires the application to stay open
- this application does not revert automatically to the previous settings after a timeout
The neat thing is you can create a shortcut to this application (with parameters) on the desktop and configure a shortcut key, so that you can change modes by simply pressing a key combination.
Let me know what you all think of this.
 

·
Registered
Joined
·
916 Posts
For anyone hassling with the Nvidia Control Panel to change the output settings, I've created a command-line tool which lets you change the output color format and bit depth with ease.
It uses the NvAPI_Disp_ColorControl function, which is probably what the control panel itself uses.
...
The neat thing is you can create a shortcut to this application (with parameters) on the desktop and configure a shortcut key, so that you can change modes by simply pressing a key combination.
Let me know what you all think of this.
Amazing, I'll try it out, probably right now :)

Edit: does it support multiple "monitors"?

There are already 2 useful tools like this:
- Display Changer
- DisplayChanger II (uses the Win10 API) (I use this to manage 2 "monitors")
-- but it can't set bit depth, etc.
 

·
Registered
Joined
·
916 Posts
All of these is basic signal processing stuff.
...
A lot of reviewers and people that post stuff on the internet lack the basics to understand what they are reviewing, and that seems to sometime include popular references such as rtings, flatpanelshd, or Vincent.
...
That's because everybody soon learns that any effort to share some of that knowledge outside their circle gets drowned by noise from popular internet "celebrities" or for-profit businesses with interest in having a discussion going on rather than just simply delivering the answer
First of all, I completely agree with you on this, 100%!

"what's the point of 12 bit, the panel is 10 bit anyway?" There are multiple reasons why you want the signal that enters the TV to have as high quantization level as possible
I agree with you on this as well, in theory :), but HDR10 mode in particular is screwed up in PC mode (see below).

It's normal to see banding when using both 8 (in both SDR and HDR) and 10 bit (in HDR).
...
The fact that the LG TVs show banding when using HDMI input set as PC in Game mode (or even outside Game mode) with 8 or 10 bits shows that these TVs display a noise-free signal in an accurate way (the PC generates a noise free signal
Do you have a CX? (I have a B8.)

I. I just tested HDR10 banding one more time in PC mode on the B8, with the Mehanik HDR10 test patterns (e.g. "03. Grayscale\02. Ramps\01. Diagonal Ramp 400nit.mp4") played from an Oppo 203 (its output format is easily selectable), comparing full RGB 10-bit vs 12-bit output:
- there's only a tiny difference between 10 and 12 bit (10 bit looks just a bit better)
- but there's a huge (!) difference between PC and non-PC HDMI modes (e.g. the ISF Dark Room preset), no matter whether 10 or 12 bit is used!

If you play the same sample files on an SDR (say 8-bit, i.e. 6-bit + FRC) PC monitor, even after tone mapping, you shouldn't see anything like the issues the LG produces in PC mode with HDR10 content, no matter what the theory behind it is!

, while the TV signal or what ends up encoded as movies is always with noise introduced in the mastering process). If you would not see banding, it's because the TV adds noise to the signal, to hide the banding. Search the net for "noise shaping" to see how it's done. When outside HDMI PC input mode these LG OLEDs do add additional noise to the signal from any source, that's why the HDR at YCbCr 4:2:2 10bit appears to be banding free. It's not, btw, and the TV is kind of dumb at how it adds noise - use a 10 bit grayscale ramp with HDMI set as anything else but PC, with YCbCr 10 bit in HDR and with only a grayscale shown it will look as it has no banding, then suddenly display something else like the Windows Explorer window over the grayscale and you'll suddenly notice the banding over the grayscale as the TV algo which controls the noise added is tricked into decreasing the amount of noise shaping.
That's an interesting test, thanks and it works as you described (even with SDR content).

II. But as I suggested a couple of days ago, it's not just banding with HDR10:
- HDR10 posterization with dark content: it seems to be some sort of lack of processing

Take a look at this HDR10 sample (UHD BD Remux, 01-2160p_23fps_hdr0686-thgm_banding.mkv):
- stop playback, and switch between PC mode and non-PC mode (e.g. the Cinema presets)

It has at least 3 different problems (maybe the 2nd and 3rd are the same), tested with passthrough + DTM off + RGB full 12-bit:
- banding: e.g. red light of torches around @03:20
- added noise: @00:32, around @01:16
- posterization: clouds in dark scenes, @03:20, @03:25, etc.

Posterization is just crazy sometimes! :) That's the biggest problem, and it has nothing to do with 8-bit vs 12-bit, as it's also present at 8-bit.

8-bit YCbCr 4:4:4 (limited) eliminates a lot of them, but there's still banding, and the nasty posterization still exists, e.g. @03:25 (although it removes the one at @03:20).

None of this happens in non-PC mode even with RGB Full 12bit!

III. There's also some HDR10 gamut-mapping problem in dark scenes (because BT.2020 is applied; SDR doesn't have this issue):
- it results in a cyan "overlay" when blue is shown
- this is there in non-PC mode as well
- it's especially visible if you can also play the same content via DoVi and switch between DoVi and HDR10
- take a look at this HDR10/DoVi sample (UHD BD Remux, 02-2160p_23fps_hdr1052-DoVi-ict_cyan_error.7z):
-- play it back on an Oppo device to get DoVi
-- play it back on a PC to get HDR10, and switch between the 2 inputs

So, in summary:
can someone test the CX at least with these 2 files and report back? :)

Note:
- DoVi mode doesn't have any of these issues!
- although DoVi has no PC mode (not even from external sources)
 

·
Registered
Joined
·
103 Posts
Amazing, I'll try out, probably now :)

Edit: does it support multiple "monitors"?

There're already 2 useful tools like this:
- Display Changer
- DisplayChanger II (uses Win10 API) (I use this to manage 2 "monitors")
-- but it can't set bit depth, etc.
No, it currently always uses the primary display of the system. I myself only have the TV (LG C9) hooked up to the PC, no other monitors.
So I imagine it'll only work if the TV is set up as the primary monitor.

For managing the HDR switch in the display settings I currently use an AutoHotkey script, because there's still no API to reliably enable (and disable!) HDR on the desktop. My tool can only enable it at the moment.

And if anyone's interested, I also have a tool to connect to the tv and send commands to it. This enables you to, for example, automatically open the picture settings and change the picture mode or even the "Home dashboard" and change the input label (from pc to game console and vice versa). It's a bit tricky though, because it has to send all the key presses you would normally do via the remote in the right order and take into account the delay that some actions require.
 

·
Registered
Joined
·
31 Posts
It does upset me that this can't do DTS. I'm a gamer, but I have plenty of movies with DTS:X, and I keep reading that there's no way to bypass this. It's mind-boggling that LG removed that feature.
 

·
Registered
Joined
·
916 Posts
So I imagine it'll only work if the TV is set up as the primary monitor.
Thanks, maybe you'll have time for multi-monitor support later.

For managing the HDR switch in the display settings I currently use an AutoHotkey script, because there's still no API to reliably enable (and disable!) HDR on the desktop. My tool can only enable it at the moment.
Hmm, are you talking about the Win10 HDR toggle? Or is there an HDR switch in the Nvidia CP? (I still use an old driver, see my signature.)
madVR can, however, correctly trigger HDR via the Nvidia API, but I don't know how madshi did it.

And if anyone's interested, I also have a tool to connect to the tv and send commands to it. This enables you to, for example, automatically open the picture settings and change the picture mode or even the "Home dashboard" and change the input label (from pc to game console and vice versa). It's a bit tricky though, because it has to send all the key presses you would normally do via the remote in the right order and take into account the delay that some actions require.
Now that could also be interesting :)

Btw, have you heard of aiopylgtv? It's written in Python, and it can do a lot using LG's undocumented (reverse-engineered :) ) APIs:
- e.g. uploading a 3D LUT created by DisplayCAL
- I use it every day to automate switching the TV off when playback of media content from the PC finishes (as a substitute for the sleep timer)
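If anyone wants to try it, here's a minimal sketch of the kind of thing I do (assuming aiopylgtv is installed via pip and its WebOsClient API still matches the project's README example; the IP address is just a placeholder, and the first run asks you to accept a pairing prompt on the TV):

Code:
# Power the TV off over the network with aiopylgtv. The create/connect/
# power_off calls follow the project's README pattern; check the current
# README if the API has changed.
import asyncio
from aiopylgtv import WebOsClient

async def main():
    client = await WebOsClient.create("192.168.1.50")   # placeholder TV IP
    await client.connect()        # pairing key is cached after the first run
    await client.power_off()
    await client.disconnect()

asyncio.run(main())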
 

·
Registered
Joined
·
186 Posts
For anyone hassling with the Nvidia Control Panel to change the output settings, I've created a command-line tool which lets you change the output color format and bit depth with ease.
It uses the NvAPI_Disp_ColorControl function, which is probably what the control panel itself uses.
I've also added functionality that lets you change the driver's dithering algorithm.
You can't, however, change the resolution (yet).

Download link:
http://www.mediafire.com/file/yxcecws1dwfwhqq/NvColorControl2.0.0.0.zip/file
Here is some basic usage info:

Code:
Usage: NvColorControl <bits> <format> [<dithering: state bits mode>] [<hdr>]
  <bits>      : 8, 10 or 12
  <format>    : RGB (full), RGBLM (limited), YUV444, YUV422 or YUV420
  <dithering> : state: 0 = auto, 1 = enabled, 2 = disabled,
                bits : 0 = 6 bit, 1 = 8 bit, 2 = 10 bit,
                mode : 0 = none, 1, 2 or 3 = spatial, 4 = temporal
  <hdr>       : 0 or 1
Examples:
- NvColorControl 8 YUV444
- NvColorControl 10 YUV422
- NvColorControl 12 YUV420
- NvColorControl 8 RGB 1 1 4 1

NOTES:
- not all combinations are possible
- HDR can currently only be enabled and requires the application to stay open
- this application does not revert automatically to the previous settings after a timeout
The neat thing is you can create a shortcut to this application (with parameters) on the desktop and configure a shortcut key, so that you can change modes by simply pressing a key combination.
Let me know what you all think of this.

WOW, thanks. It would be great if choosing the refresh rate were integrated (even without a resolution change), so I wouldn't have to open NVCP... and I guess that for it to work, the refresh rate should be the first option.
 

·
Registered
Joined
·
9 Posts
I'm legitimately curious: why is the 40 Gbps vs 48 Gbps debate such an issue (other than LG leading people to believe it is)? Can't you just use 12-bit 4:2:2 chroma at 4K 120 fps with Nvidia cards on the CX? In terms of gaming on a TV, isn't that enough, especially when you're sitting farther away than you would from a monitor? Is this even an issue?
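Just doing the back-of-envelope math (a Python sketch; active pixels only, ignoring HDMI blanking and FRL encoding overhead, which is why the commonly quoted figures of roughly 40 Gbps for 4K120 10-bit RGB and roughly 48 Gbps for 12-bit RGB come out higher than these raw numbers), the formats compare like this:

Code:
# Raw pixel-data rates for a few 4K120 formats. The point is only to compare
# formats, not to reproduce the exact HDMI 2.1 link figures.
def raw_gbps(width, height, fps, bits_per_component, samples_per_pixel):
    return width * height * fps * bits_per_component * samples_per_pixel / 1e9

formats = {
    "4K120 RGB / 4:4:4 12-bit": (12, 3),    # 3 full-rate components
    "4K120 RGB / 4:4:4 10-bit": (10, 3),
    "4K120 4:2:2 12-bit":       (12, 2),    # chroma shared between 2 pixels
    "4K120 4:2:0 12-bit":       (12, 1.5),
}
for name, (bpc, spp) in formats.items():
    print(f"{name}: ~{raw_gbps(3840, 2160, 120, bpc, spp):.1f} Gbps raw")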
 