Billy Lynn’s Long Halftime Walk Demo at NAB 2016

At the Future of Cinema Conference that precedes the NAB (National Association of Broadcasters) convention this week, the big buzz is an 11-minute rough-cut clip from Ang Lee’s upcoming movie Billy Lynn’s Long Halftime Walk. Due to be released this fall, the movie is based on a novel by Ben Fountain and tells the tale of an Army unit that returns from Iraq to be honored during the halftime show at a Dallas Cowboys football game. But that’s only the most superficial description; it goes much deeper than that—in fact, it’s been called the Catch-22 of the Iraq war.

Aside from the incredibly moving, emotional story, what makes this movie remarkable—unique, in fact—is that Lee shot it at 120 frames per second, five times the frame rate of almost all movies for the last hundred years. Other capture parameters include 4K and native 3D—which means 120 fps for each eye!

Conference attendees waited in long lines to see the demo, which was repeated six times during the afternoon. We had to surrender our cell phones and cameras before entering the room to make sure no one captured any unauthorized images.

The 11-minute clip comprised nearly 10 terabytes of uncompressed data split between two 7thSense servers, each of which was connected to a Christie Mirage 4KLH RGB laser-illuminated projector, the only projector in the world capable of rendering 4K at 120 fps. Each server transmitted its data to the corresponding projector using four DisplayPort connections, each one sending a 2K section at 120 fps, and the four sections were tiled to form the complete 4K image.
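As a rough sanity check on those numbers, the data rate can be worked out from the figures quoted above. The bit depth isn't stated in the article, so the 10-, 12-, and 16-bit cases below are assumptions for scale:

```python
# Rough sanity check: an 11-minute clip, nearly 10 TB uncompressed,
# 4K at 120 fps per eye, two eyes. Bit depth is not stated in the article;
# 10-, 12-, and 16-bit RGB are assumptions for illustration only.

WIDTH, HEIGHT = 4096, 2160   # assuming a DCI 4K raster
FPS_PER_EYE = 120
EYES = 2
SECONDS = 11 * 60

for bits in (10, 12, 16):
    bytes_per_frame = WIDTH * HEIGHT * 3 * bits / 8          # RGB, one eye
    rate_gb_s = bytes_per_frame * FPS_PER_EYE * EYES / 1e9   # sustained data rate
    total_tb = rate_gb_s * SECONDS / 1e3                     # whole 11-minute clip
    print(f"{bits}-bit: ~{rate_gb_s:.1f} GB/s, ~{total_tb:.1f} TB total")
```

At 16 bits per channel that works out to roughly 12.7 GB/s and about 8.4 TB before any file or container overhead, which is in the same ballpark as the quoted figure and makes it easy to see why the image had to be split into four 2K tiles, one per DisplayPort link.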

The demo used Dolby 3D, a process in which one projector renders the image for the left eye and the other projector renders the image for the right eye. Each projector uses slightly different wavelengths of red, green, and blue—a technique commonly called 6P to indicate a total of six primary colors (two reds, two greens, two blues)—and the glasses filter the wavelengths so only one set of primaries reaches each eye.
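A minimal sketch of that wavelength-multiplexing idea, in case it helps; the wavelengths below are placeholders for illustration, not Dolby's published values:

```python
# Each projector uses a slightly shifted set of RGB laser primaries, and each
# lens in the glasses is a narrowband filter stack that passes only its own
# set. All wavelengths here are illustrative placeholders, not Dolby's values.

LEFT_PRIMARIES = {"red": 629, "green": 532, "blue": 446}    # nm (illustrative)
RIGHT_PRIMARIES = {"red": 615, "green": 518, "blue": 432}   # nm (illustrative)

def lens_passes(eye: str, wavelength_nm: float, tolerance_nm: float = 5.0) -> bool:
    """True if the given lens would pass light of this wavelength."""
    primaries = LEFT_PRIMARIES if eye == "left" else RIGHT_PRIMARIES
    return any(abs(wavelength_nm - p) <= tolerance_nm for p in primaries.values())

# The left lens passes the left projector's green but blocks the right's:
print(lens_passes("left", 532), lens_passes("left", 518))   # True False
```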

The image was projected onto a 25-foot-wide, 1.85:1, matte-white screen. With a native contrast ratio of about 3000:1, the Mirage projectors do not exhibit high dynamic range, but they were able to achieve a peak luminance of 28 foot-lamberts at each eye through the Dolby 3D glasses. That’s twice as bright as Dolby Vision 3D and around eight times brighter than conventional 3D in a commercial cinema.
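For reference, 1 foot-lambert is about 3.426 cd/m² (nits), and the Dolby Vision 3D and conventional 3D figures below are simply worked back from the ratios quoted above rather than measured:

```python
FL_TO_NITS = 3.426   # 1 foot-lambert ≈ 3.426 cd/m²

demo_fl = 28.0                    # peak luminance at each eye in the demo
dolby_vision_3d_fl = demo_fl / 2  # "twice as bright as Dolby Vision 3D"
conventional_3d_fl = demo_fl / 8  # "around eight times brighter than conventional 3D"

for label, fl in (("Demo", demo_fl),
                  ("Dolby Vision 3D", dolby_vision_3d_fl),
                  ("Conventional 3D", conventional_3d_fl)):
    print(f"{label}: {fl:.1f} fL ≈ {fl * FL_TO_NITS:.0f} nits")
# Demo: 28.0 fL ≈ 96 nits
# Dolby Vision 3D: 14.0 fL ≈ 48 nits
# Conventional 3D: 3.5 fL ≈ 12 nits
```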

Superlatives seem entirely inadequate in light of what I saw—it was astounding. At 120 fps, all movement—including moving objects and camera pans—was crystal clear with absolutely no visible strobing or motion blur. Of course, some will complain that it isn’t “film-like,” and a few people reported that it looked like video, but those same people were so impressed that it didn’t matter to them, and they soon forgot about it. Instead, they were blown away by images never before seen in a commercial-cinema presentation—and so was I.

The image was very bright, especially the scenes in the desert of Iraq, which looked a bit blown out—an intentional effect, I’m sure. The black level was not terribly deep, but then again, there weren’t many really dark shots in the clip we saw. Later, we were told that the footage was shot and protected for high dynamic range, so I expect the final version to have a much greater dynamic range, at least if it’s shown in Dolby Cinemas.

Perhaps even more amazing to me was that I saw virtually no problems with the Dolby 3D glasses. Normally, I really dislike this technology because of reflections between the inner surface of the 3D glasses and the outer surface of my prescription glasses, which result in milky halos around the screen and double images. But I saw none of that in this case, and I'm not sure why. My best guess is that the brightness of the image helped, though I was also told that the design of the glasses has been improved, with the filters more closely tuned to the wavelengths used for each eye.

Whatever the reason, it was easily the best, most comfortable 3D I’ve ever seen, with no “cardboard cutout” effect. It looked like I was actually watching the scene in the real world. More than one viewer commented that when someone crossed the screen in the near field, they thought it was someone in the audience, and I can certainly understand that misperception.

Late in the afternoon, Ang Lee and several of the people working on the movie took the stage in the conference room to talk about it. Lee acknowledged Douglas Trumbull—who was my guest on the Home Theater Geeks podcast last December—as one of the people who opened his eyes to what high frame rates can do. He emphasized that he wanted to “see more clear” and that HFR and well-made 3D could achieve that goal.

L-R: Tim Squyres, editor; Ang Lee, director; Ben Gervais, production systems supervisor; Demetri Portelli, stereographer; Scot Barbour, VP, Production Technology, Sony Pictures Entertainment; David Cohen (Variety), moderator

However, this also meant they would have to relearn the art of moviemaking. “We are brainwashed about how to make movies,” he said. Everything about this movie would be different, from lighting to makeup—in fact, they used almost no makeup at all, leading the actors to engage in a weeks-long regimen to clear their skin. As Lee put it, “I want to see through the skin, through the eyes, into how the characters feel.” He also said that the more clarity the image has in terms of resolution and lack of strobing, the more deeply it impacts the emotions of the audience.

As a counterexample, Lee mentioned one of his previous movies, Life of Pi. Because it was shot at 24 fps, he said, you sometimes can't see Suraj Sharma's performance clearly; as the boat he was in moved around on the ocean, motion blur and strobing got in the way. At 120 fps, you can see the actors' performances in every detail. As a result, even acting becomes more difficult, requiring the actors to get closer to real life.

Editor Tim Squyres mentioned that shooting at 120 fps with a 360-degree shutter angle (the shutter remains open during the entire frame) is more like capturing data, which allows much more flexibility in how the movie is ultimately presented. This is critically important, since there are no commercial cinemas that can currently show content at 4K/3D/120 fps. The Christie Mirage projectors used in the demo are not actually digital-cinema projectors, because they do not have the security protections required for commercial distribution.
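To put that 360-degree shutter in context, exposure time follows directly from shutter angle and frame rate; here is a quick sketch, with the traditional 180-degree/24 fps combination included only as a baseline for comparison:

```python
def exposure_ms(shutter_angle_deg: float, fps: float) -> float:
    """Exposure time per frame: (angle / 360) * (1 / frame rate), in milliseconds."""
    return (shutter_angle_deg / 360.0) * (1000.0 / fps)

print(f"{exposure_ms(360, 120):.2f} ms")   # 8.33 ms: shutter open for the full 1/120 s frame
print(f"{exposure_ms(180, 24):.2f} ms")    # 20.83 ms: the traditional cinema baseline
```

Even with the shutter open for the entire frame, each 120 fps exposure is still well under half as long as a traditional 180-degree exposure at 24 fps, and because there are no temporal gaps between frames, the footage can be resampled to lower frame rates cleanly.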

The best that commercial cinemas can do is 2K/3D/60 fps with one projector (e.g., a Christie Series 2, which is found in many commercial theaters) or 2K/3D/120 fps with two projectors. Reducing the frame rate to 60 or even 24 fps is not a simple matter of dropping frames; instead, adjacent frames are blended together. As a result, every version all the way down to streaming to mobile devices will look much better than if the movie had been shot at the lower rate to begin with. (Interestingly, Lee’s team has a pair of Mirage projectors, but the Avid software can barely support 4K/3D/60 fps, so that’s the format they must use for editing.)
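As a rough illustration of what that blending means, here is one simple way to derive 60 fps from 120 fps capture; the article doesn't describe the actual weighting used in post, so a straight average of each adjacent pair is assumed:

```python
import numpy as np

def blend_to_half_rate(frames: np.ndarray) -> np.ndarray:
    """Blend adjacent frame pairs: (n, h, w, 3) at 120 fps -> (n/2, h, w, 3) at 60 fps (n even)."""
    pairs = frames.reshape(-1, 2, *frames.shape[1:])   # group consecutive frames in twos
    return pairs.mean(axis=1)                          # average each pair into one frame

# Tiny dummy clip (8 frames of 4x4 RGB) just to show the shapes involved:
clip_120 = np.random.rand(8, 4, 4, 3)
clip_60 = blend_to_half_rate(clip_120)
print(clip_120.shape, "->", clip_60.shape)             # (8, 4, 4, 3) -> (4, 4, 4, 3)
```

Because the 360-degree shutter leaves no gaps between exposures, averaging two adjacent 1/120-second frames behaves much like a single 1/60-second exposure, which is presumably why the downconverted versions still look natural rather than stuttery.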

Lee had wanted his next project to be a boxing movie, but when he read Ben Fountain’s novel, he realized that it would be a superb vehicle for HFR 3D. “People don’t understand veterans,” he said, “and this is a good way to examine the new technology and humanity.” He sees movie theaters as temples—”I pray to the movie god,” he quipped. His dream is to create a spiritual experience in a shared space, and I’m convinced that Billy Lynn’s Long Halftime Walk will deliver that experience beautifully.