Dolby Vision on HDR10 Display Device
I did some testing over the past few days and I have a question for everyone, as I'm sure there is an answer to it. I have an Epson 4000 HC, which is compatible with HDR10 metadata. However, when streaming Netflix from my Apple TV 4K with the Match Frame Rate / Match Dynamic Range options enabled, the titles show the Dolby Vision logo. Now, I know my projector is not capable of Dolby Vision but is capable of HDR10. In fact, when playing content with the DV logo the projector switches to HDR mode, but my question is: am I watching Dolby Vision content shown through a device not capable of DV, or am I watching HDR10 content shown correctly?
I know that Dolby Vision contains HDR10 static metadata plus dynamic frame-by-frame metadata... but I'm wondering whether, because the Apple TV is capable of Dolby Vision and my projector is not, the resulting image quality is not correct. The picture tends to look washed out when playing titles with the Dolby Vision logo, yet the projector shows stunning images when playing HDR10 content from the Spears and Munsil calibration disc.
It would be nice to test the same content with an HDR10-only device like the Roku stick and see if the image looks different. With devices that do not support Dolby Vision, the same Netflix content displays the HDR logo instead of Dolby Vision. The problem is that the ATV4K is the only device that matches frame rate for content streamed through Netflix (Roku defaults to 60 fps when streaming 4K from Netflix, and my projector only supports 4K HDR up to 30 fps, but that's a different story...).
Can anyone tell me whether the ATV4K automatically outputs the content in HDR10 and ignores the extra dynamic metadata when the display device (my projector) is not capable of DV, even though the Netflix title shows the Dolby Vision logo? Or does the extra dynamic metadata somehow affect the picture shown through my HDR10-capable projector, washing it out?