Originally Posted by webdove
At some point I want to see if I can replicate what you are describing, but I know that I have my inputs set to the HDMI icon and not PC.
PC Icon mode is the only way to get 4:4:4 chroma on the 2016 OLEDs, with a bonus of low input latency very close to Game mode, without losing access to the calibration+CMS controls.
I don't have the option of using anything else as I use my E6 as my computer monitor.
If/when you check, simply measure primaries and compare (bt.2020 vs rec.709 with Windows HDR enabled). I didn't compare SDR on PC Icon at all as I immediately downgraded at that point.
Since the issue I poorly described appeared after a recent firmware update, and someone with a 2017 model described the same behavior in the MadVR thread on doom9, I think this may affect any LG display with that version of webOS (3.5? Whatever version came with the firmware that replaced the dog/flower/mushroom/mountain no-signal screens).
I don't think there are enough complaints to get LG to fix this. Look how long and loud we had to get for HDR Game mode to be added, then look at all the complaints about the darker tone mapping that replaced the original one for Game mode (darker to avoid clipping), and all LG had to say was "enable Dynamic Contrast and set it to HIGH for correct picture".
You know something is wrong when LG's own phone/online support reps are telling them it's wrong lol.
Originally Posted by webdove
The two PC cards, PS3 and XBOne all behave the same. When set to RGB Full the Vertex input reports RGB Limited. I don't know whether there is no metadata for code range in the standard and the Vertex just assumes Limited or if all four of those video devices are failing to report the code range properly (which seems unlikely).
All I know about the Vertex is that it can be used to force custom metadata flags, so if the results from the consoles and the video cards all match, that leads me to believe the issue lies with the Vertex.
Ignoring what it reports, do the levels LOOK like they match what you set them to (ex: 0-255/full from GPU and display at HIGH)?
If so, perhaps the Vertex wants to be like LG with backwards names (IMO "high" black should be 16, makes more logical sense to me).
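For a quick way to sanity-check what codes to expect, the standard 8-bit full-to-limited mapping is just a 219/255 scale with a +16 offset. A small Python sketch (not tied to the Vertex or any tool, just the math):

```python
# Sketch of 8-bit full-range vs limited/video-range level mapping,
# for checking whether levels LOOK right vs what a device reports.

def full_to_limited(v):
    """Map an 8-bit full-range code (0-255) to limited/video range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Map an 8-bit limited-range code (16-235) back to full range (0-255)."""
    return round((v - 16) * 255 / 219)

if __name__ == "__main__":
    # Black and white in each convention:
    print(full_to_limited(0), full_to_limited(255))   # 16 235
    print(limited_to_full(16), limited_to_full(235))  # 0 255
```

If black looks grey, or near-black detail is crushed, odds are one side of the chain is applying (or assuming) the wrong one of these two mappings.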
I don't know if it will be helpful, but there are two different EDIDs on the E6 per input port: one is used when the "HDMI ULTRA HD Deep Color" option is enabled and another when it's disabled.
Service Manual has timings and raw data for EDID blocks : https://lg.encompass.com/shop/model_...OLED55E6PU.pdf
There may be a few bad timings in the EDID. I hit a few that resulted in a black screen with audio while playing around trying to force 4:2:0 or 4:2:2 at 10-bit depth for making custom resolutions for MadVR. None of them worked, and I think something is going on there: the non-deep-color EDID would more frequently show as supporting 10-bit, yet every attempt gave a black screen with audio, even the timings my PS4 reported as 10-bit, which did work as expected from the PS4.
Originally Posted by webdove
BTW, I could not find any way to get the LG E6 to report what it believes is the input coding system (YCrCB or RGB, bits and range). This is the first TV I have had that lacks that function.
I really wish this information was shown, it's not even in the service menu this time.
You can see SOME flags by clicking on the Input label in the top left. This will show flags for stuff like HDR, bt.2020, audio quality and crap like that (I mean it's pretty easy to know if HDR is on...)
Originally Posted by webdove
I was trying madvr yesterday. It appeared to be full screen. I am not sure how I verify exclusivity. It also does not seem to report what bit depth and coding it thinks are in use.
Press CTRL+J to show the OSD, it has everything you need to know.
You can also see which API is being used for HDR.
Another way to check for exclusive full screen is to enable "delay switch to exclusive mode by 3 seconds" as the screen will flash as it switches to FSE. (MadVR options: Rendering > exclusive mode).
Originally Posted by maxOLED
Thank you for your work on HCFR and this thread. I have a request for the "black frame insertion" feature:
Could you please allow a frequency setting of 1, so that after every pattern a black frame is shown, and add a setting to define the number of ms this black frame will be shown? For how long is it shown right now?
ConnecTEDDD did some tests with an LG 8 and showed that inserting a black frame after every pattern can reduce panel drift: https://www.avsforum.com/forum/139-di...l#post56281456
While the black frame insertion helps, I don't think it should be the only tool used. In my case it caused more harm than good (more detail below).
This was something I tried, thinking it would help with the behavior Ted was referring to in one of his comments on my first HDR calibration, way back in the 2016 LG OLED calibration thread. At some point I basically gave up following typical calibration procedures due to all the unexpected behavior of my E6 that I kept finding: how controls affected the picture, how a single control behaves and how controls interact with each other, auto-dimming triggers and behavior, etc.
My E6 has a red push coming out of black, and the higher it goes over 100 nits, the less time it takes for image retention to affect measurements and the larger the magnitude of the drift.
I've found that using a low-energy grey, like 35-35-35 RGB, helps stabilize the display for more accurate measurements between ~10-45% for HDR and ~30% for SDR.
I saw minimal impact outside of these ranges, and a slight improvement to the red tint of HDR near blacks at 0-4% (which can't be fixed with adjustments, since HDR control point 127 is way too damn strong).
This is what I mean by causing more harm than good.
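For what it's worth, the behavior maxOLED is requesting could be sketched like this (a hypothetical Python outline of the option, the names and defaults are mine, not HCFR's):

```python
# Hypothetical sketch of the requested black frame insertion option:
# a black frame of configurable length after every Nth pattern, where
# frequency=1 means after every single pattern.

def build_sequence(patterns, frequency=1, black_ms=500, pattern_ms=2000):
    """Return the list of (frame, duration_ms) a pattern generator would show."""
    sequence = []
    for i, pattern in enumerate(patterns, start=1):
        sequence.append((pattern, pattern_ms))
        if i % frequency == 0:
            sequence.append(("black", black_ms))
    return sequence

# With frequency=1 every pattern is followed by a black frame;
# with frequency=2 only every second pattern is.
```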
Ted made a damn nice post HERE
in the 2018 LG OLED calibration thread. This is basically the behavior I was talking about, and I never could understand why everyone kept trying to do sweeps for measurements.
Bobof (Fabio?) did a great job combating this issue when making his patch sets, even without having an OLED of his own to test with. HERE
Anger.miki shows the differences between black frame insertion and one of Fabio's patch sets (same 2018 OLED thread as above).
I've been "planning" to make my own patch set for my E6, but for HDR (100+ nits), and there are so many variables that need to be considered that I've been way too afraid to start. I know how bad my OCD is, and it scares me knowing how deep this particular rabbit hole will go.
Here is a brief outline of some of my thoughts on this based on observed behavior of my own E6:
First, what's the issue? Stability over 100+ nits. The general consensus is that this makes calibrating HDR impossible (see the link to Ted's post above). I don't entirely agree or disagree with this.
Calibration patterns are "synthetic" content so how the display behaves WILL be different.
What I'm getting at is that the signal sent to a pixel changes far differently in real content than it does while showing a calibration pattern.
To illustrate what I mean, imagine pixels as a group of people who each got special instructions saying "you are brighter than the others around you", so a big fight starts as everyone tries to follow their instructions and be brighter than everyone else.
I think there are three critical "energy" points to consider.
1: As mentioned above, my E6 has a red push coming out of "black". It decays over ~5 seconds depending on what energy is already stored at the pixel level, which in turn depends on the average energy of everything shown in the last ~15-20 seconds (the luminance increase is linear over time, but the RGB balance isn't, thanks to the white subpixel).
2: Because of image retention behavior, the current state of retention needs to be factored into the current measurement. Too high and the results will "drift". Too low and the response will be inconsistent (unstable, like my red push).
I've found the best results by using a dark grey full screen in between measurements. I tried to find something neutral so all pixels would be "on". By not using black I can raise or lower a small amount of image retention, getting pixel behavior closer to how retention in real content would affect it. This agrees with what some 3D LUT users were doing by pre-emptively running a patch set for a few minutes before profiling.
I've recently started playing with the idea of using different levels of grey for the different ranges I'm measuring, but only showing the brighter ones for X seconds before switching back to the dark grey and then immediately taking a measurement. This seems to help stabilize measurements in the 200-350 nit range, and the time can be changed to simulate more or less retention.
I've been going with 5 seconds, which gives green a notable boost with no real impact on blue or red.
3: You need to know how the last measurement will affect the next one, all thanks to retention again.
I've been ignoring this entirely by doing manual measurements, but accounting for it is crucial to automating any of this, even for 50-100 nits.
I've been trying to think of a way to visualize all of this, but Ted's and BlackJoker's posts in that 2018 thread have done better than anything I could come up with (I don't have LightSpace).
But a static picture doesn't really show the impact of this behavior, IMO, as it all depends on TIME which isn't easily represented by something without time :P