Originally Posted by Smackrabbit
That is going to depend on how you measure nits. The most common method is with a 10% window, and there the Z9D will do around 1800 nits while the OLEDs are going to be in the 650-750 range, depending on the panel you got.
However, if you measure a highlight inside real content you'll start to see more differences. The Z9D will still come out on top, but the OLEDs won't drop at all, while some other LCDs that aren't full-array backlights can have a significant drop-off. There are many ways to measure peak nits, and a big number doesn't always translate to real-world content.
This is a pretty important point. A FALD set without enough zones will either show raised black levels (blooming) around HDR highlights or will significantly dim those highlights, because each backlight zone covers thousands of pixels and a small bright object forces the whole zone to pick one or the other. A rough sketch of the math is below.
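To make that concrete, here's a back-of-the-envelope sketch of how many pixels a single zone has to cover on a 4K panel. The zone counts are illustrative round numbers, not the specs of any set discussed here:

```python
# Rough sketch: pixels covered by one backlight zone on a 4K panel.
# Zone counts below are illustrative, not measured specs of real TVs.

PANEL_W, PANEL_H = 3840, 2160  # 4K UHD resolution
TOTAL_PIXELS = PANEL_W * PANEL_H

for zones in (32, 128, 640):
    pixels_per_zone = TOTAL_PIXELS // zones
    print(f"{zones:>4} zones -> ~{pixels_per_zone:,} pixels per zone")

# A small HDR highlight inside a zone forces the whole zone bright
# (blooming in the surrounding dark pixels) or keeps the zone dim
# (crushed highlight). There's no per-pixel control, unlike OLED
# where every pixel is its own light source.
```

Even at 640 zones, one zone is still roughly 13,000 pixels, which is why zone count matters so much for HDR.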
In fact, here's a summary of 12 recent TVs tested by rtings, comparing 10% window and "real scene" peak brightness:
It quickly becomes clear that the "LCD HDR brightness advantage" is reserved for only a small portion of the high-end market. The further a dot sits to the left of the black line, the bigger the drop-off you'll see in real content versus the standard 10% window brightness test (see the plotting sketch after the set list).
(Sets included in graph: Sony Z9D, Sony X930E, Sony X900E, Vizio P Series 2016, Sony X700D, Samsung MU8000, Samsung Q7F, LG B6, LG C6, LG E6, LG C7)
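For anyone who wants to throw their own numbers at a chart like this, here's a minimal matplotlib sketch of the layout. The brightness values are made-up placeholders to show the idea, not the actual rtings measurements:

```python
# Sketch of the chart described above: each TV is a dot at
# (real-scene nits, 10%-window nits) with a black y = x reference line.
# Dots left of the line drop off in real content; dots near it don't.
# Values below are hypothetical placeholders, NOT rtings data.
import matplotlib.pyplot as plt

tvs = {
    "Hypothetical FALD flagship": (1500, 1800),  # small drop-off
    "Hypothetical edge-lit LCD":  (400, 1000),   # big drop-off
    "Hypothetical OLED":          (700, 720),    # almost no drop-off
}

fig, ax = plt.subplots()
for name, (real_scene, window_10) in tvs.items():
    ax.scatter(real_scene, window_10, label=name)

lim = 2000
ax.plot([0, lim], [0, lim], color="black")  # y = x: no drop-off
ax.set_xlabel("Real-scene peak brightness (nits)")
ax.set_ylabel("10% window peak brightness (nits)")
ax.legend()
plt.show()
```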
:edit: fixed axis scales