Originally Posted by skavan
First off -- thanks for taking the time to help me figure this out.
I tried to follow instructions! TX0 is connected to LG C9. TX1 (via RX) is connected to an Epson 5050 Projector.
- On the TX, I set a custom EDID on all my inputs - such as the LG C8 Custom EDID #8 (also tried #5).
- On the RX, I enabled "Use custom HDR for TX0 and TX1 when input is LLDV (for ATV only)".
P.S. I didn't enter any data into the text box above the "Send HDR" button.
The outcome is a pink/purple screen on the LG C9 (which is DV capable!) and a proper picture on the projector. Which makes no sense to me whatsoever!
Any ideas welcome -- including whether all this brain damage is worth it and whether I should just set the TX EDIDs to a non-DV set like #4, Q9FN. In other words, is fighting for DV on the LG C9 worth the battle?
Originally Posted by claw
Did you click the Send HDR button with the defaults? It might be that the LG doesn't know what to do with LLDV if you also include custom HDR10 metadata. Why do you want to send custom HDR metadata? Does your Epson require it in order to go into HDR picture mode? I would suggest testing without sending custom HDR.
I think you want the LG to process the input as LLDV since it supports Dolby Vision, but the Epson to process it as HDR10.
It's quite simple: for both normal DV and LLDV, the source uses the display's DV string to encode the signal, and the display decodes it.
The idea here is to use the Sony A1's LLDV string (the DV string in the A1's EDID), because the Sony A1 TV has nearly no processing available, so all processing is done at the source level and the result is a picture close to HDR. By forcing a capable source (ATV 4K and Sony players, mainly) to output LLDV for a Sony A1, one can then handle that signal just as if it were an HDR signal. So if you are sending it to an HDR-capable display, you can send custom HDR at the same time to force the display into HDR mode, and the picture will be nearly as good as HDR10, with the advantage of the source making frame-by-frame adjustments as it does for DV/LLDV content, compared to static HDR10.
Of course the Sony A1 probably adjusts a few other things too, so some additional settings might be required on the display, in the HDR metadata sent, or both. This has yet to be determined.
In all cases, once your source outputs LLDV for an A1, then if you don't have an A1 display, the only way to handle that signal is to force HDR mode. Another DV display won't be able to handle the signal properly, because it was not encoded using that display's DV string, but the A1's DV string.
So what Skavan reported is just NORMAL.
If he had used his C9's DV string, the signal would not have been processed much at the source level, and it would be fully unwatchable as HDR on another display.
So, if you don't have a Sony A1 and you force a source into LLDV using the Sony A1 EDID, then you have to force both displays into HDR mode. Here, with SOURCES > TX/C9 > RX/EPSON, both the TX and the RX need to send custom HDR to force both displays into HDR mode, so both can render the LLDV stream as HDR.
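For the curious, the "custom HDR" used to force a display into HDR mode is just an HDR10 static-metadata packet: a CTA-861 Dynamic Range and Mastering (DRM) InfoFrame. Here is a minimal Python sketch of how such a packet is laid out. The BT.2020 primaries, D65 white point, and 1000-nit mastering values are example assumptions (not the HDFury defaults), and the R/G/B ordering of the primaries shown here is an assumption as well:

```python
# Sketch of a CTA-861 DRM (Dynamic Range and Mastering) InfoFrame, the
# packet behind "custom HDR". Example values only, not HDFury defaults.
import struct

def scale_xy(x, y):
    # Chromaticity coordinates are coded in units of 0.00002, little-endian.
    return struct.pack("<HH", round(x / 0.00002), round(y / 0.00002))

def build_drm_infoframe():
    payload = bytearray()
    payload.append(2)          # EOTF: 2 = SMPTE ST 2084 (PQ) -> display enters HDR mode
    payload.append(0)          # Static_Metadata_Descriptor_ID: Type 1
    # Display primaries and white point (ordering shown R, G, B is an assumption)
    payload += scale_xy(0.708, 0.292)    # red (BT.2020)
    payload += scale_xy(0.170, 0.797)    # green
    payload += scale_xy(0.131, 0.046)    # blue
    payload += scale_xy(0.3127, 0.3290)  # white point (D65)
    payload += struct.pack("<H", 1000)   # max mastering luminance, cd/m^2
    payload += struct.pack("<H", 50)     # min mastering luminance, 0.0001 cd/m^2 units
    payload += struct.pack("<H", 1000)   # MaxCLL
    payload += struct.pack("<H", 400)    # MaxFALL
    header = bytes([0x87, 0x01, len(payload)])  # InfoFrame type, version, length
    checksum = (-sum(header) - sum(payload)) & 0xFF  # all bytes must sum to 0 mod 256
    return header + bytes([checksum]) + bytes(payload)

frame = build_drm_infoframe()
assert len(frame) == 30 and sum(frame) % 256 == 0
```

The key byte is the EOTF field: setting it to 2 (PQ) is what actually flips an HDR-capable display into its HDR picture mode, regardless of what the pixels contain.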
Just a small misunderstanding about how DV works here.
Normal DV: the source encodes for the display using the display's DV string, and only that display can decode it. The stream also travels in an RGB container, so no operation is possible on the signal with a device in the middle without fully breaking DV.
LLDV: the source likewise encodes for the display using the display's DV string, and only that display can decode it, but the stream travels normally, so operations can be done by a device in the middle without breaking it. In addition, if the user has a TV with weak processing power such as the A1, all processing is done at the source level by a capable source, and the result is an HDR-like stream that can be rendered as HDR on any HDR-capable display. A few other tweaks might still be needed to make it perfect, because if you compare LLDV > A1 and HDR > A1, there are very slight differences when you look closely at the stream, before the display even goes into LLDV or HDR mode.
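To illustrate what a "DV string" physically is: it's the Dolby Vision Vendor-Specific Video Data Block (VSVDB) inside the CTA-861 extension of the display's EDID, which is what the source reads to decide how to encode DV/LLDV. A rough Python sketch of locating it follows; the Dolby OUI value and the CTA tag layout are to my knowledge correct, but the meaning of the capability bytes past the OUI is left opaque here, and any sample EDID bytes would be made up:

```python
# Sketch: find the Dolby Vision VSVDB (the "DV string") in a 128-byte
# CTA-861 EDID extension block. Capability-byte semantics not decoded.

DOLBY_OUI = 0x00D046  # IEEE OUI registered to Dolby Laboratories

def find_dv_vsvdb(cta_block):
    assert cta_block[0] == 0x02, "not a CTA-861 extension block"
    dtd_start = cta_block[2]           # offset where detailed timings begin
    i = 4                              # data block collection starts at byte 4
    while i < dtd_start:
        tag = cta_block[i] >> 5        # top 3 bits: block tag code
        length = cta_block[i] & 0x1F   # bottom 5 bits: payload length
        payload = cta_block[i + 1 : i + 1 + length]
        # Tag 7 = "use extended tag"; extended tag 0x01 = Vendor-Specific Video DB
        if tag == 7 and length >= 4 and payload[0] == 0x01:
            oui = payload[1] | (payload[2] << 8) | (payload[3] << 16)  # LSB first
            if oui == DOLBY_OUI:
                return payload[4:]     # Dolby Vision capability bytes
        i += 1 + length
    return None                        # display advertises no DV support
```

This is exactly the block the HDFury custom EDID tables swap out: present the A1's VSVDB to the source and it encodes LLDV for an A1, whatever display is actually downstream.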