As Joe Kane suggested in his presentation at the Samsung QLED/HDR10 Summit, it would be great to generate a single master file of UHD content that could then be easily adapted for different displays. That’s the basic idea behind the Colorfront Transkoder grading system, which was demonstrated at the event. (To be accurate, Joe was talking about a single delivery file, while Colorfront was talking about a single master file from which different grades can be derived.)
The system is GPU-based software running on Supermicro, HP Z840, and Mac Pro computers. It features an ACES-compliant pipeline with input transforms for camera-specific live signals and recorded footage as well as output transforms for projectors and flat panels in SDR and HDR—including HDR10, HDR10+, Dolby Vision, and HLG. It consolidates all types of color spaces and formats onto a common platform, simplifying them into a single workflow.
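The structure of such an ACES-style pipeline can be sketched as a composition of transforms: an input transform brings each camera's footage into a common working space, a single creative grade is applied there, and a per-display output transform renders the result. This is only an illustrative sketch of the concept—the function names and the placeholder math below are not Colorfront's or ACES's actual transforms.

```python
# Illustrative ACES-style pipeline: camera -> working space -> grade -> display.
# All transforms here are simplified stand-ins, not real colorimetric math.

def make_pipeline(input_transform, grade, output_transform):
    """Compose one input transform, one grade, and one output transform."""
    def process(pixel):
        return output_transform(grade(input_transform(pixel)))
    return process

# Placeholder transforms (hypothetical, for structure only)
camera_idt = lambda p: [c * 1.0 for c in p]       # camera signal -> working space
creative_grade = lambda p: [c * 1.1 for c in p]   # the single creative grade
hdr_odt = lambda p: [min(c, 1.0) for c in p]      # working space -> HDR display
sdr_odt = lambda p: [min(c, 0.5) for c in p]      # working space -> SDR display

# One grade, two deliverables: only the output transform differs.
hdr_pipe = make_pipeline(camera_idt, creative_grade, hdr_odt)
sdr_pipe = make_pipeline(camera_idt, creative_grade, sdr_odt)
```

The point of the design is that the grade is done once in the working space; adding a new delivery format means adding only a new output transform, not regrading the footage.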
The Colorfront Engine—the foundation of Transkoder—is modeled on human perception. Image analysis determines a mix of perceptual weighting and a gentle rolloff at the high end of the brightness range while leaving the low end untouched. As a result, the perceived relationships of color and brightness remain constant across different dynamic ranges, peak brightnesses, and color gamuts. The goal is to preserve the original creative intent no matter what the final delivery format is.
According to Colorfront, the system is fully compatible with current and future delivery formats. It’s also camera-agnostic, allowing it to ingest and process footage from any camera. In addition, the Colorfront Engine is implemented in the AJA FS-HDR real-time converter/frame synchronizer that is designed to meet the HDR and WCG needs of broadcast, streaming, post-production, and live-event applications in 4K/UHD and 2K/HD. In fact, the Colorfront Engine can operate in real time with resolutions up to 8K.
The demo consisted of ungraded 4K footage shot on an Arri Alexa Mini camera in Amsterdam. Transkoder was running on a Supermicro computer, which played the footage out via HDMI to a Samsung 65Q9 TV and to a monitor for the GUI (graphical user interface). As seen in the photo above, the system was set up to split the image in a "butterfly" (mirror-image) configuration, with a 1000-nit HDR10 version on the left and a 400-nit HDR10+ version on the right. (It can also split the screen in a quad configuration with each window displaying a different format.)
The GUI monitor displayed a waveform and a color-gamut plot in real time. As the waveform shows, the right side was at a lower brightness than the left. Also, the colors extended beyond P3 in the blue-green region.
The image was clearly brighter on the left, and I didn’t see any significant advantage in the HDR10+ image. As far as I could tell, the scenes didn’t vary a lot in overall brightness from one to the next. I guess the demo was intended to show that the system could process and play different formats on the same screen.
I asked the Colorfront reps to show me the same footage at the same peak brightness, one side in HDR10 and the other in HDR10+. It didn't take long to reset Transkoder to output both at 600 nits, with the left in HDR10 and the right in HDR10+. I saw almost no difference between them on this particular footage, though the HDR10+ side might have had a bit more detail in the bright areas. This seemed to confirm that the demo was not intended to reveal the difference between HDR10 and HDR10+, but rather to show that Transkoder can display a given file at just about any peak brightness in any HDR format.