I found a way to use the PC HDR mode in a meaningful way. There are two things to consider for this:
1: The v390.65 nVidia driver not only brought back NVAPI HDR support for Win10 1703+ but also applies dithering to the 8-bit output (rather than simply truncating the higher-precision input like some older drivers did; at least on my GTX 1070 and with Win10).
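To illustrate why this dithering matters, here is a minimal sketch in plain NumPy (this is NOT the driver's actual, undocumented algorithm; it just contrasts truncation with simple sub-LSB noise dithering on a 10-bit ramp):

```python
import numpy as np

# A smooth 10-bit gradient (values 0..1023) sampled over 4096 pixels.
grad10 = np.linspace(0, 1023, 4096)

# Truncation to 8-bit: drop the low 2 bits -> long flat runs, i.e. banding.
trunc8 = (grad10.astype(np.int64) >> 2).astype(np.uint8)

# Simple random dithering: add +/- half an 8-bit step of noise before
# quantizing, so the quantization error averages out over neighboring pixels.
rng = np.random.default_rng(0)
noise = rng.uniform(-0.5, 0.5, grad10.shape)
dith8 = np.clip(np.round(grad10 / 4.0 + noise), 0, 255).astype(np.uint8)

# The dithered image trades visible bands for fine noise while preserving
# the average level of the original 10-bit ramp.
```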
2: The TV's PC mode can't properly handle >8-bit HDR input (I guess it's just arbitrary low precision rather than exact truncation to 8 bits, but it looks about as bad: clearly visible banding), and it can't do gamut mapping from the Rec.2020 signal down to the device's space (you just get the roughly-DCI-P3 native gamut with the fixed Wide option in Standard or Game, and sRGB in the other modes with Auto).
So... when the graphics driver dithers the display output down to 8 bits (from the 10+ bit software image), the TV in PC/HDR mode can handle that 8-bit signal properly, so it looks decent overall (not as good as real 10+ bit, but closer to real 10-bit than to plain 8-bit, so it's acceptable for WCG and HDR too).
And... if you use a color sensor to profile the Wide gamut mode (which is actually easier to do than profiling Auto, since Wide is closer to the native gamut, and the native gamut is usually the easiest to profile reliably...) and create a 3DLUT for ReShade, you can have proper Rec.2020 colors in games. I used the PC/HDR/Game mode for this (Standard could also work, but it's further from the desired tone curve; the rest are garbage).
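For anyone curious what the 3DLUT actually does for the gamut step, here is a rough Python/NumPy sketch. The "native" primary coordinates below are made-up P3-ish placeholders (your profiling run reports the real ones), and a real LUT also folds in the measured tone curve; this only shows the Rec.2020-to-native matrix part:

```python
import numpy as np

def rgb_to_xyz_matrix(primaries_xy, white_xy):
    # Build the 3x3 RGB->XYZ matrix from xy chromaticities, scaled so that
    # RGB = (1,1,1) lands exactly on the given white point.
    M = np.array([[x / y, 1.0, (1 - x - y) / y] for x, y in primaries_xy]).T
    wx, wy = white_xy
    W = np.array([wx / wy, 1.0, (1 - wx - wy) / wy])
    S = np.linalg.solve(M, W)          # per-channel scaling for the white
    return M * S

D65 = (0.3127, 0.3290)
# Rec.2020 primaries (from the standard).
M2020 = rgb_to_xyz_matrix([(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)], D65)
# Hypothetical measured native primaries; substitute your profiled values.
Mnative = rgb_to_xyz_matrix([(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)], D65)

# Combined Rec.2020 -> native panel RGB matrix: inv(Mnative) @ M2020.
M = np.linalg.solve(Mnative, M2020)

# Sample it onto a small 3D LUT grid (real LUTs use 33 or 65 points).
N = 17
grid = np.linspace(0.0, 1.0, N)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
rgb = np.stack([r, g, b], axis=-1)     # (N, N, N, 3), linear light
lut = np.clip(rgb @ M.T, 0.0, 1.0)     # clip out-of-gamut values
```

Since Rec.2020 is wider than the panel's native gamut, pure Rec.2020 primaries map to negative native components, which is exactly what the clip (or a smarter gamut-mapping strategy) has to deal with.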
After that, the only outstanding issue is the handling of the anisotropic WRGB color space of these panels. I solved this by creating the 3DLUT with a hard clip at 650 nit (the actual peak white of the display with W+R+G+B) while setting the video game to 400 nit peak output. That latter number comes from two sources: the R+G+B cluster (without the help of the W sub-pixel) can produce ~380 nit white on my panel, and I also spent some time in the game staring at shiny lights while tweaking the number (I found it's fine to overshoot the R+G+B figure a little without obvious problems, but values like 500 are clearly problematic).
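To make the hard-clip idea concrete, here is a sketch using the standard SMPTE ST 2084 (PQ) transfer function with a 650 nit clip applied before encoding (the 650/400/380 figures are measurements from my panel; substitute your own):

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    # Absolute luminance (cd/m^2) -> nonlinear PQ signal in [0, 1].
    y = np.asarray(nits, dtype=float) / 10000.0
    return ((c1 + c2 * y**m1) / (1 + c3 * y**m1)) ** m2

def pq_decode(code):
    # Inverse: PQ signal -> absolute luminance.
    e = np.asarray(code, dtype=float) ** (1 / m2)
    y = np.maximum(e - c1, 0) / (c2 - c3 * e)
    return 10000.0 * y ** (1 / m1)

# Hard clip in the LUT at the display's true W+R+G+B peak (~650 nit),
# while the game itself tone-maps to a lower 400 nit target so that pure
# R+G+B highlights (~380 nit on this panel) are not pushed past what the
# color sub-pixels can actually produce.
signal_nits = np.array([100, 380, 400, 650, 1000, 4000], dtype=float)
clipped = np.minimum(signal_nits, 650.0)
codes = pq_encode(clipped)             # what the 3DLUT would emit
```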
(If you wonder: in Frostbite games, you can set the peak white with the Render.DisplayMappingHDR10PeakLuma command in the developer console.)
So, after all that, I have decently smooth-ish gradients and 4:4:4 in NVAPI HDR games with the LG C7 running in PC/HDR/Game. The only things I changed in this picture mode were turning the Sharpness down from 10 to 0 and the Color from 65 to 50 (because 50 is the "neutral" point for Color in every single picture mode, both TV and PC; I have no idea why it's set to 55 for all the "accurate" modes in TV mode, since R, G, B reach their peak saturation and luminance exactly at 50: not at 49, not at 55, but 50). Ah, and I also tweaked the white point settings in the service menu so this stupid W30-C30 slider always produces D65 white (I set all the factory white balance presets to the same values, so the slider does nothing).
"DIY certified hobby-calibrator" (based on ChadB's "warning signs" list)
Last edited by janos666; 01-18-2018 at 01:26 PM.