This doesn't tell you anything about how the video was mastered.
For a start, the CGI would likely have been lit and textured in linear light, the DI would have been done at 0.6 on a 1.7 display, and all of this could have been done on a display with a gamma of anywhere between 1.7 and 2.5. That display is then profiled and LUTted to take the imagery to whatever reasonable colorspace the colorist desires.
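As a rough sketch of the linear-light side of that (pure power law assumed, not any particular facility's pipeline), going from linear CGI values to a display-referred encode is just an exponent:

import numpy as np

# Purely illustrative: linear-light CGI values in [0, 1]; 0.18 is the usual mid-grey reference
linear = np.array([0.0, 0.18, 0.5, 1.0])

display_gamma = 2.2                          # assumed display gamma for the sketch
encoded = linear ** (1.0 / display_gamma)    # encode so the display's power law cancels back to linear light

print(encoded)                               # mid-grey 0.18 lands around code value 0.46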
I sit in front of a display with a 2.2 gamma and a D65 color temp, but what I actually view on it is D54 at gamma 1.7 through a suitable LUT.
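A crude sketch of that kind of viewing LUT, gamma only (the white-point shift needs a matrix or 3D LUT, which I'm ignoring here), might look like this:

import numpy as np

native_gamma = 2.2    # what the panel actually does
target_gamma = 1.7    # the end-to-end response I want to see

# 1D LUT: code value in -> code value out, so that out ** 2.2 == in ** 1.7
codes = np.linspace(0.0, 1.0, 1024)
lut = codes ** (target_gamma / native_gamma)

def apply_lut(signal):
    # simple linear interpolation into the 1D LUT
    return np.interp(signal, codes, lut)

print(apply_lut(np.array([0.18, 0.5, 0.9])))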
The hardware isn't important; the end-to-end gamma is. And even then, people have moved away from defining simplistic gammas and now use transfer functions.
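The Rec.709 camera OETF is a good example of why a single gamma number undersells what's going on: it isn't a pure power law at all, it has a linear toe near black. Straight from the published BT.709 formula:

def rec709_oetf(L):
    # ITU-R BT.709 opto-electronic transfer function (scene light in [0, 1] -> signal)
    if L < 0.018:
        return 4.5 * L                       # linear segment near black
    return 1.099 * (L ** 0.45) - 0.099       # power segment with offset

# A pure 0.45 power law and the real transfer function diverge most in the shadows
for L in (0.001, 0.018, 0.18, 1.0):
    print(L, rec709_oetf(L), L ** 0.45)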
Also, professional imagery is created with an eye on robustness. A gamma deviation on the order of 0.3 is within the tolerance of good imagery, because you may want to tweak the end gamma depending on display capabilities and viewing environment.
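To put a rough number on what a 0.3 deviation means at mid-grey (my arithmetic, pure power laws assumed):

# Same code value shown at two end-to-end gammas 0.3 apart
code = 0.46                   # roughly where 18% grey sits after a 1/2.2 encode
light_22 = code ** 2.2
light_25 = code ** 2.5
print(light_22, light_25, light_25 / light_22)   # ~0.18 vs ~0.14, roughly a 20% drop in mid-grey output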
When I'm creating video imagery I view it at 2.2 and 2.5, with and without a hard clip to 16-235 in place, to ensure robustness. I actually find retaining detail above 235 to be far more critical than nailing gamma to within a couple of tenths of a point.
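By a hard clip I mean nothing more exotic than this (8-bit values, a sketch rather than my actual tooling):

import numpy as np

def hard_clip_video_range(frame_8bit):
    # Clamp everything to legal video levels: 16 = black, 235 = reference white.
    # Any detail that was riding above 235 gets flattened to a single value,
    # which is exactly the loss I'm checking for when I toggle the clip on and off.
    return np.clip(frame_8bit, 16, 235)

frame = np.array([10, 16, 128, 235, 240, 250], dtype=np.uint8)
print(hard_clip_video_range(frame))   # -> [ 16  16 128 235 235 235]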
Within the envelope of technical video requirements, how the actual imagery looks is not an absolute quantity. If you gave two potters exactly the same amount and type of clay and asked each for a pot of the same design, the two pots would still have slight differences. Imagery is a bit like this.
There is no single set of numbers that will convert film imagery into video imagery and give acceptable results for every single shot in the film. That is why you need a human being, paid a lot of money, to subjectively decide how best to compromise the film into video.
I assume that something competently mastered for video will look acceptable anywhere between gamma 2.0 and 2.5. I generally prefer something a bit higher than 2.2. There is little merit in trying to hit an absolute gamma number on your display beyond starting at 2.2 (although I prefer to hit Rec.709) and applying a reasonable gamma offset according to your preference and situation.
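If you do want to nudge the end-to-end response from, say, 2.2 towards 2.4 in software rather than at the display, the offset is just a ratio of exponents (pure power laws assumed again, purely illustrative):

def gamma_offset(code, source_gamma=2.2, target_gamma=2.4):
    # Re-encode a signal mastered for source_gamma so that a source_gamma display
    # shows it with an effective target_gamma response.
    return code ** (target_gamma / source_gamma)

print(gamma_offset(0.46))   # mid-grey code value drops slightly, darkening the image a touch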