AND THIS IS THE BS I'M SPEAKING OF!!!!

__A. 77" 2019 OLED TV:__

HDR Real Scene Peak Brightness: 593 cd/m²

HDR Peak 2% Window: 689 cd/m²

HDR Peak 100% Window: 130 cd/m²

HDR Sustained 100% Window: 126 cd/m²

cd/m² is the same thing as a nit; it's a 1:1 conversion. So anytime you see cd/m², you can just read it as 'nits'.

OLED is a phenomenal technology, and reviews of them tend to be very strong.

__B. 100" Triple-laser UST:__

So, they are saying that this projector puts up 400 nits on a 100" screen?

LET'S DO THE MATH!!!!

A 100" 16:9 image is about a 2.76 square meter screen, which is about a 29.7 square foot screen.

There is no direct conversion of nits to lumens, because nits measure luminance at the screen. For a 1.0 gain (Lambertian) screen, the relationship is: lumens = nits × π × screen area in m². Equivalently, 400 nits is about 116.7 foot-lamberts, and on a 1.0 gain screen, foot-lamberts equal lumens per square foot.

400 nits × π × 2.76 m² gives us total light of about 3,460 lumens.

116.7 lm/ft² × 29.7 ft² gives us total light of about 3,470 lumens.

So, nits times π times the screen area (in m²) gives us the lumen output the projector has to deliver. Instead of just saying it is claiming roughly 3,500 lumens on screen, they confuse the matter by talking about nits.
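The math above can be sketched in a few lines of Python. This is a minimal sketch, assuming a 16:9 image and a 1.0 gain Lambertian screen; the function names are mine, not from any spec:

```python
import math

def screen_area_m2(diagonal_inches, aspect=16 / 9):
    """Area in square meters of an image with the given diagonal and aspect ratio."""
    d_m = diagonal_inches * 0.0254              # inches -> meters
    height = d_m / math.sqrt(1 + aspect**2)     # d^2 = h^2 * (1 + aspect^2)
    return aspect * height * height             # width * height

def nits_to_lumens(nits, area_m2, gain=1.0):
    """Lumens needed on a Lambertian screen to sustain a given luminance."""
    return nits * math.pi * area_m2 / gain

area = screen_area_m2(100)            # ~2.76 m^2
lumens = nits_to_lumens(400, area)    # ~3,460 lm
print(f"{area:.2f} m^2 -> {lumens:.0f} lm")
```

Run it and the 400-nit claim on a 100" screen comes out to roughly 3,460 lumens of on-screen light.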

__C. 120" projector bright enough for dark room:__

1,065 lumens

If the goal is 15-18 lumens per square foot (as it should be), then a 120" diagonal is about 43 square feet. 43 × 18 ≈ 775 lumens. That's really all that is needed in a dark theater space to produce an excellent image. 1,065 lumens gives us about 25 lm/ft². That's a bit more than is needed in a dark space, but it is nice for lamp aging and for 3D viewing, and it will help with HDR content.
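The same sizing check in Python, assuming a 16:9 image; the 15-18 lm/ft² target is the figure quoted above, not an industry constant:

```python
import math

def screen_area_ft2(diagonal_inches, aspect=16 / 9):
    """Area in square feet of an image with the given diagonal and aspect ratio."""
    height_in = diagonal_inches / math.sqrt(1 + aspect**2)
    return aspect * height_in * height_in / 144.0  # in^2 -> ft^2

area = screen_area_ft2(120)     # ~42.7 ft^2
target = 18 * area              # ~770 lm for the top of the 15-18 lm/ft^2 range
density = 1065 / area           # ~25 lm/ft^2 from the 1,065-lumen spec
print(f"{area:.1f} ft^2, target {target:.0f} lm, actual {density:.1f} lm/ft^2")
```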

Two questions:

1. Which brightness number would you use to compute the TV's lumen output? If the sustained 100% window brightness is used, the lumen output comes out lower than the 120" projector's. If the Real Scene Peak Brightness is used, the lumen output is on par with the UST's. Which one looks more plausible?

This is why nits are used with TVs and why the numbers are all over the place. HDR means absolute peak brightness is reached only on specular highlights, and full white fields are rare in actual television production: snow scenes, fades to white, and the like. But this exemplifies why the numbers for TVs, and even projectors, are often so meaningless without an actual review and real-world measurements.
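Running the same nits-to-lumens conversion (π × area, under a 1.0 gain Lambertian assumption, which is only an approximation for an emissive panel) on the 77" OLED makes the point concrete:

```python
import math

def screen_area_m2(diagonal_inches, aspect=16 / 9):
    """Area in square meters of an image with the given diagonal and aspect ratio."""
    d_m = diagonal_inches * 0.0254
    height = d_m / math.sqrt(1 + aspect**2)
    return aspect * height * height

area = screen_area_m2(77)  # ~1.63 m^2 of panel
for label, nits in [("real scene peak", 593), ("sustained 100% field", 126)]:
    print(f"{label}: ~{nits * math.pi * area:.0f} lm")
```

With the sustained full-field number, the panel emits roughly 650 lm, below the 120" projector; with the real-scene peak it's roughly 3,000 lm, in the ballpark of the UST's claim. The answer depends entirely on which spec you pick.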

2. Not being bright enough is one common complaint about OLED, yet it actually has a higher nit rating than the 120" projector. What other factors are at play here?

Many thanks and best regards,

Doing the math, you may not have realized that the UST's 400-nit claim works out to roughly 3,500 lumens on a 100" screen, more than three times what the 120" projector claims. This is why lumens is typically given for projectors: it measures the total light output of the projector, independent of screen size, while nits describe brightness at the screen and therefore depend on image area and screen gain. For a 1.0 gain screen, lumens = nits × π × area in m². So a 1,000-lumen projector filling exactly 1 m² would produce about 318 nits (1,000 ÷ π), and spreading the same light over a larger screen drops the nits proportionally.
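The 1 m² thought experiment, under the same 1.0 gain Lambertian assumption:

```python
import math

def full_screen_nits(lumens, area_m2=1.0, gain=1.0):
    """Luminance if a projector's entire output lands on the given screen area."""
    return lumens * gain / (math.pi * area_m2)

print(full_screen_nits(1000))         # ~318 nits on exactly 1 m^2
print(full_screen_nits(1000, 2.76))   # ~115 nits on a 100" (2.76 m^2) screen
```

Doubling the screen area halves the nits while the lumen spec stays the same, which is exactly why total lumens is the more portable number for a projector.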

This all gets extremely convoluted between the different measurements and calculations available out there and the conversion between metric and imperial. The hard part is that a nit is a measurement of luminance, candela per square meter of screen surface, so it depends on both image size and screen gain. If a screen were exactly 1 m² with a true 1.0 gain, nits would be a good standard form of measurement, but since light is typically measured directly from the projector and then spread over a much larger image, the lumen makes more sense.

In the end, it is still contrast that matters most, and that's why OLED reviews report several different measurements, and why a projector review is far more important than the specifications on paper.

I may have some incorrect math here and would be happy to be corrected and learn something new.