Unfortunately, as far as I can see, they haven't solved the problem; they have only disabled the colorspace setting when receiving 12bits, so the symptoms are less visible.
YCC422 is still forced in 12bits even at 30p and lower, but in 12bits all the colorspace options in the JVC have the same effect, which is to select YCC422 internally.
So when you send RGB 444 12bits or YCC 444 12bits, you don't see the wrong colorspace being applied as long as you don't select auto in the JVC, but there is still an unwanted chroma conversion happening behind the source's back, which is (mildly) detrimental to chroma resolution.
It's not a big issue, but I would still recommend using 4:4:4 8bits or 4:2:2 12bits, depending on the source. RGB 4:4:4 12bits and YCC 4:4:4 12bits are still not recommended.
I attach a screenshot showing that YCC422 is forced irrespective of the colorspace option selected, and a close-up on the "fox" pattern to show the effect (look at the red and blue lines at the bottom: the chroma resolution is much better in 8bits than in 12bits).
This is more likely to be visible in games and desktop apps, and unlikely to be noticeable with video content, except in very unusual situations, and only if the film is very boring.
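To make the chroma cost concrete, here's a minimal numpy sketch (my own illustration of the general principle, not JVC's actual processing path) of a 4:4:4 -> 4:2:2 -> 4:4:4 round trip on one-pixel-wide red/blue lines like the ones in the fox pattern. Once the Cb/Cr planes are halved horizontally, adjacent red and blue pixels come back as a blended purple:

```python
# Illustrative sketch: why a forced 4:4:4 -> 4:2:2 round trip costs
# chroma resolution on 1-pixel-wide colored lines. Assumes BT.709
# full-range coefficients and naive pair-averaging/nearest-neighbour
# chroma resampling (real scalers differ, but the principle holds).
import numpy as np

def rgb_to_ycbcr(rgb):
    m = np.array([[ 0.2126,  0.7152,  0.0722],
                  [-0.1146, -0.3854,  0.5   ],
                  [ 0.5   , -0.4542, -0.0458]])
    return rgb @ m.T

def ycbcr_to_rgb(ycc):
    m = np.array([[1.0,  0.0   ,  1.5748],
                  [1.0, -0.1873, -0.4681],
                  [1.0,  1.8556,  0.0   ]])
    return ycc @ m.T

# Test signal: alternating pure red / pure blue columns, 1 pixel wide.
rgb = np.zeros((1, 8, 3))
rgb[0, 0::2] = [1.0, 0.0, 0.0]   # red columns
rgb[0, 1::2] = [0.0, 0.0, 1.0]   # blue columns

ycc = rgb_to_ycbcr(rgb)

# 4:2:2: average each horizontal pair of chroma samples, then repeat
# them back to full width. Luma is left untouched.
sub = ycc[..., 1:].reshape(1, 4, 2, 2).mean(axis=2)  # halve Cb/Cr
ycc[..., 1:] = np.repeat(sub, 2, axis=1)             # back to 4:4:4

out = ycbcr_to_rgb(ycc)
print(np.round(rgb[0, :4], 2))   # distinct red/blue pixels going in
print(np.round(out[0, :4], 2))   # blended purple-ish pixels coming out
```

The luma (and thus the line structure) survives, but the colors get averaged in pairs, which is exactly the kind of softening visible in the 12bits close-up.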
If your source does good dithering (such as madVR), and especially if your source is a PC you also use for desktop work or gaming, I highly recommend sticking with RGB 8bits until this is fixed (with madVR set to 8bits for native panel bit depth). This doesn't add any banding compared to 12bits. You'll get better results than with RGB 12bits and madVR set to 10bits native bit depth, due to this minor chroma issue, and you'll get better results in games and desktop apps too.
If you have an HTPC dedicated exclusively to video content and you feel better using RGB 12bits because it feels more "right", by all means do so; I doubt it will harm the picture in any significant way.
What's important with HDR is to have 10bits in the content. Whether madVR dithers from its internal 16bits down to 10bits or to 8bits makes very little difference to the final result, and certainly none that would be noticeable from seating distance in terms of banding or added noise.
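For anyone wondering why dithered 8bits doesn't band, here's a toy numpy example (just the basic principle; madVR's actual dithering is far more sophisticated than the plain random noise used here): rounding a shallow gradient to integer levels produces flat steps, while adding dither noise before rounding keeps the local average tracking the original signal.

```python
# Illustrative sketch: dithered quantization vs plain rounding on a
# very shallow gradient (~2 output levels across the whole ramp).
import numpy as np

rng = np.random.default_rng(0)
ramp = np.linspace(0.0, 2.0, 4096)   # smooth high-precision gradient

truncated = np.round(ramp)           # plain rounding -> visible steps
dithered = np.round(ramp + rng.uniform(-0.5, 0.5, ramp.size))

# Average over small windows: the dithered version tracks the ramp
# closely, while the truncated one shows the staircase (banding).
w = 256
print("original :", ramp.reshape(-1, w).mean(axis=1).round(2))
print("truncated:", truncated.reshape(-1, w).mean(axis=1).round(2))
print("dithered :", dithered.reshape(-1, w).mean(axis=1).round(2))
```

The trade is banding for a tiny amount of noise, and with good dithering that noise is far below what you can see from the sofa.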
With a standalone player, I still recommend using YCC 422 12bits. It will work at all frame rates up to 60p and should give you excellent results. If, for any reason, you need/prefer to use RGB or YCC444 12bits, again I doubt it will produce any significant/visible degradation with video content.
Mike, please forward this to JVC and ask them whether the new f/w is supposed to actually fix the problem, or whether this is as far as they can go due to hardware limitations. V2.07 makes the issue less obvious when a "wrong" colorspace is selected manually, but from a PQ point of view there is zero difference as far as I can see.
I tested with nVidia drivers V385.28. I'll try with 430.86 later if I have the time, but I don't expect the results to be any different. [EDIT: tested with 430.86, same results].