I'm not an expert on this by any means, but it's my understanding that Dolby Vision is a superset of HDR10, and that in fact any Dolby Vision title has an HDR10 base layer that is used on a non-Dolby Vision display. So I got an X800M2 to play with, and I found that to determine whether a disc has Dolby Vision encoding, you should enable Dolby Vision output on the player and then start the disc.
Once the disc starts, press the "Display" button on the remote and the player will tell you whether the disc has Dolby Vision encoding or just HDR, and that the player is outputting Dolby Vision to your TV.
Doing it the other way - leaving Dolby Vision off and pressing the "Display" button - will tell you the disc has HDR encoding and that the player is outputting HDR; it does not tell you whether the disc is Dolby Vision encoded.
This is what it looks like when you play Blade Runner 4K, which is HDR10 only, with Dolby Vision output ON. You would want to turn Dolby Vision off so the correct HDR mapping is used.
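As a side note, if you've ripped a disc to a file you can check for Dolby Vision without the player at all. Here's a rough Python sketch that shells out to ffprobe - this assumes ffprobe from a reasonably recent ffmpeg build is on your PATH, that the rip actually preserved the DV metadata, and the filename is just a placeholder:

```python
import json
import subprocess

def has_dolby_vision(path: str) -> bool:
    """Return True if ffprobe reports a Dolby Vision configuration record.

    Assumes ffprobe (from a recent ffmpeg build) is on PATH and that the
    rip kept the DV metadata; a stripped rip will correctly report False.
    """
    result = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", path],
        capture_output=True, text=True, check=True,
    )
    for stream in json.loads(result.stdout).get("streams", []):
        for side_data in stream.get("side_data_list", []):
            # ffprobe labels DV metadata as a "DOVI configuration record"
            if "DOVI" in side_data.get("side_data_type", ""):
                return True
    return False

# Placeholder filename, not an actual rip
print(has_dolby_vision("BladeRunner2049.mkv"))
```

On a DV title this should find the DOVI record on the video stream; on an HDR10-only title like Blade Runner it won't, regardless of what the player's output setting says.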
So, extrapolating that out: if you play an HDR10-only source disc such as Blade Runner with Dolby Vision selected as the output format, all the player should be doing is passing the HDR10 metadata through, even though the signal is flagged as DV. So there shouldn't be any modification of the metadata, and the picture should be identical to outputting it as HDR10. The only variable I can see here is whether the display device has some sort of profile that changes its settings when the source flags itself as DV rather than HDR10.
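For context on what "the HDR10 metadata" actually consists of: HDR10 carries only static metadata (the SMPTE ST 2086 mastering display color volume plus MaxCLL/MaxFALL), fixed for the whole title, and Dolby Vision layers its dynamic per-scene metadata on top of a base like that. Here's a rough Python sketch of those fields - the primaries and white point are the standard BT.2020/D65 values, but the luminance numbers are just illustrative, not read from the actual Blade Runner disc:

```python
from dataclasses import dataclass

@dataclass
class Hdr10StaticMetadata:
    """Static metadata an HDR10 stream carries (SMPTE ST 2086 + CEA-861.3).

    These values apply to the entire title; Dolby Vision adds dynamic,
    per-scene metadata (the RPU) on top of a static base like this.
    """
    # Mastering display color volume (ST 2086), CIE 1931 xy coordinates
    red_primary: tuple[float, float]
    green_primary: tuple[float, float]
    blue_primary: tuple[float, float]
    white_point: tuple[float, float]
    max_luminance_nits: float
    min_luminance_nits: float
    # Content light levels (CEA-861.3)
    max_cll_nits: int   # brightest single pixel in the title
    max_fall_nits: int  # brightest frame-average light level

# Illustrative values in the ballpark of a 1000-nit HDR10 master
example = Hdr10StaticMetadata(
    red_primary=(0.708, 0.292),    # BT.2020 red
    green_primary=(0.170, 0.797),  # BT.2020 green
    blue_primary=(0.131, 0.046),   # BT.2020 blue
    white_point=(0.3127, 0.3290),  # D65
    max_luminance_nits=1000.0,
    min_luminance_nits=0.0001,
    max_cll_nits=1000,
    max_fall_nits=400,
)
```

If the player really is just passing this static block through under a DV flag, there's nothing scene-by-scene for the TV to act on, which is why I'd expect the picture to match plain HDR10 output.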
Is there anyone who understands the specs well enough to confirm what actually gets output from the player in this situation (HDR10 source, player set to DV output)?