Originally Posted by venkatesh_m
Question: if all we were to use is 4k SDR 2020 from the Lumagen with DTM, would the 18 Gbps outputs be necessary? What kind of data rates would likely be used?
For HDR sources, most require an 18 GHz input on the Pro. There are some exceptions that work with a 9 GHz input for 4k24 HDR sources (Kaleidescape Strato, Panasonic UB900 and likely UB9000, Oppo 203). We recommend 18 GHz inputs for HDR sources. For non-HDR sources you can choose either 18 GHz or 9 GHz, since the material is compressed 8-bit to start, and so even at 4k60, 9 GHz carrying 4:2:0 at 8-bit is enough.
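To put rough numbers on the data-rate question, here is the back-of-the-envelope HDMI bandwidth arithmetic in Python. The 594 MHz pixel clock and the 10/8 TMDS encoding overhead are standard HDMI figures; the tmds_gbps helper is just my own illustration, not anything from the Radiance:

```python
def tmds_gbps(pixel_clock_mhz, chroma, bits_per_component):
    """Approximate total HDMI TMDS link rate in Gbps for a video format."""
    if chroma == "4:4:4":
        bits_per_pixel = 3 * bits_per_component
    elif chroma == "4:2:2":
        # HDMI carries 4:2:2 (up to 12-bit) in the same 24-bit-per-pixel
        # container as 8-bit 4:4:4, so bit depth does not raise the rate.
        bits_per_pixel = 24
    elif chroma == "4:2:0":
        bits_per_pixel = 1.5 * bits_per_component
    else:
        raise ValueError(chroma)
    # The 10/8 factor is the TMDS 8b/10b encoding overhead.
    return pixel_clock_mhz * 1e6 * bits_per_pixel * 10 / 8 / 1e9

# 4k60 uses a 594 MHz pixel clock (CTA-861 timing):
print(tmds_gbps(594, "4:4:4", 8))   # ~17.8 -> needs an 18 GHz link
print(tmds_gbps(594, "4:2:2", 12))  # ~17.8 -> needs an 18 GHz link
print(tmds_gbps(594, "4:2:0", 8))   # ~8.9  -> fits a 9 GHz link
print(tmds_gbps(594, "4:4:4", 10))  # ~22.3 -> over even an 18 GHz link
```

That last line is also why 4k60 HDR is never carried as 10-bit 4:4:4 over HDMI 2.0: it has to drop to 4:2:2 or 4:2:0 to fit the link.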
For Radiance Pro outputs, I actually recommend 9 GHz outputs to customers, unless they are set on having 18 GHz outputs just because. Here's why:
The 9 GHz outputs have slower edge rates than the 18 GHz outputs, which is easier on what I refer to as "marginal HDMI input designs." We have seen some projectors that work with the 9 GHz output at 9 GHz, but *not* with the 18 GHz output running at 9 GHz. The only differences in this case are the output edge rate and perhaps slightly different output EQ. Assuming a good HDMI cable, the problem is in the projector's input PCB layout, in the HDMI input chip, or both. Interestingly, TVs tend to have better HDMI inputs than most projectors; all the TVs we have tested work well at 18 GHz.
Even if your projector has a good 18 GHz input (our RS4500, for example), it will almost certainly lock on to a 9 GHz signal faster than an 18 GHz signal.
For output at 4k24, there is no difference in the output data between the 9 GHz and 18 GHz output cards in our recommended 4:2:2 12-bit output mode. Zero. Both output card types send 12-bit 4:2:2, so you cannot see a difference between an 18 GHz output card and a 9 GHz output card for 4k24 movies and other 24 Hz programs.
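For the curious, the same tmds_gbps sketch from above puts a number on this:

```python
# 4k24 uses a 297 MHz pixel clock, and 12-bit 4:2:2 rides in HDMI's
# standard 24-bit-per-pixel container:
print(tmds_gbps(297, "4:2:2", 12))  # ~8.9 Gbps -> fits a 9 GHz link
```

So at 4k24 the recommended 12-bit 4:2:2 format fits a 9 GHz link, and both card types send exactly the same bits.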
For SDR sources output at 4k60: the SDR source is compressed 8-bit to start, and the Pro's excellent output dither takes the 12-bit pipeline (which is up-sampled from that 8-bit source) and dithers it back down to 8-bit. In my opinion, there is no chance you would see any difference between the 9 GHz and 18 GHz outputs, even in an A/B comparison.
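The Pro's dither is our own implementation, but the principle is easy to demonstrate. Here is a generic sketch in Python using simple TPDF (triangular) dither; it is purely illustrative and not the Pro's algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# A smooth 12-bit ramp, standing in for the internal pipeline after an
# 8-bit source has been up-sampled and processed.
ramp12 = np.linspace(0, 4095, 1 << 16)

# Naive truncation to 8-bit: a deterministic staircase, which shows up
# as visible banding on smooth gradients.
truncated = np.floor(ramp12 / 16).astype(np.uint8)

# TPDF dither: add +/-1 LSB (8-bit) of triangular noise before
# quantizing, so each output code is correct on average and the eye
# integrates neighboring pixels back to the smooth ramp.
noise = rng.uniform(-1, 1, ramp12.shape) + rng.uniform(-1, 1, ramp12.shape)
dithered = np.clip(np.round(ramp12 / 16 + noise / 2), 0, 255).astype(np.uint8)

# Truncation leaves a systematic bias; dither leaves only zero-mean noise:
print((truncated * 16.0 - ramp12).mean())  # about -8: consistent bias
print((dithered * 16.0 - ramp12).mean())   # near 0: error is noise, not bias
```

The truncated ramp is biased by half a step at every level, which the eye sees as bands; the dithered ramp trades that for fine zero-mean noise, which is why a well-dithered 8-bit output can look like the 12-bit pipeline it came from.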
That leaves 10-bit HDR10 output at 4k60 in the Lumagen-recommended "HDR output in an SDR container." There is currently only one 4k60 HDR movie that I am aware of. So we tested by outputting 4k24 HDR movies at 4k60 and compared the Pro's 8-bit 9 GHz 4:2:0 output to its 12-bit 4:2:2 4k60 output. I can't see a difference even on tough scenes. There used to be a very small visible difference on a very few scenes, but we improved the Radiance Pro output dither until, in my opinion, there is no visible difference on these either. I have challenged a few technical video folks to prove me wrong, and so far no takers, and that is for A/B comparisons. Since I have not evaluated all possible content, I can only say that I have not found a difference. And since you would not be doing an A/B comparison, but rather watching video, I am very confident you would never see an issue caused by the output being 8-bit dithered rather than 12-bit. You might see issues related to content compression, but those are not due to the Pro's output dither.
Another advantage of the dual 9 GHz output is that both outputs on a 424X can carry both audio and video. We have seen a number of audio processors that apparently do not correctly implement all HDMI audio interrupts; these products depend on the video changing to know when the audio has changed, and with an audio-only output the video does not change. We have worked hard to "kick" such audio processors awake when the Pro detects an audio change, but for some of them the only reliable work-around is to enable both audio and video on the Pro output connected to their input. With the 9 GHz output card in a 424X you can do this, but with the 18 GHz output card in the 424X, only one output can carry both audio and video.