Originally Posted by toonj64
I didn't think about bandwidth demands being the holdup, but that could be possible. HDR is just metadata, so I doubt it's really that heavy, no?
And I know Vizio has nothing to do with it; certification is on Amazon's end.
It's 10-bit WCG/HDR vs. 8-bit; the metadata just tells the display how to render it as HDR. That said, the extra data usage is probably only around 20%, which isn't a huge increase compared to 4K, which is roughly a 100% increase over 1080p. But yeah, there's no way this is about data usage.
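Rough back-of-the-envelope sketch of where that figure comes from (illustrative numbers only, ignoring how well HEVC compresses the extra bits):

# Uncompressed bits per pixel for 4:2:0 video, before compression
def bits_per_pixel(bit_depth):
    # 4:2:0 chroma subsampling: 1 luma sample + 0.5 chroma samples per pixel
    return bit_depth * 1.5

sdr_8bit = bits_per_pixel(8)     # 12 bits/pixel
hdr_10bit = bits_per_pixel(10)   # 15 bits/pixel
print(f"10-bit overhead: {hdr_10bit / sdr_8bit - 1:.0%}")  # ~25% more raw data

# The resolution jump dwarfs that: 4K has 4x the pixels of 1080p,
# though streaming bitrates typically only roughly double in practice.

In practice HEVC soaks up some of that 25%, which is roughly where a ~20% figure lands.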
Vizio 2016 P65, Denon x2200, 5.0.2 Elac Debut F5, C5, B4, and A4
PS4 Pro, AMD Vega 56, and Nvidia Shield