Originally Posted by JackB
I'm not sure if I totally understand the tech side of this, but why don't the hardware device companies, Roku, Amazon, Shield, etc., just publish the interface standards for both the hardware and the software, and leave it up to the program suppliers to make their streams match up with the various hardware devices?
They already do - they all have developer programs with published SDKs/APIs/Tools/Specs/etc - and therein lies the "problem":
Not all devices' SDKs/APIs/specs are equal, and, to further compound the issue, they change over time (to enable new/better/changed functionality, fix bugs, create/eliminate limitations, add new API calls, deprecate API calls, support new hardware, eliminate/limit support for older hardware, support new UI elements/changes, support platform management changes, and on and on)
They can even have identical hardware (CPU/chipset/etc), but with significant software platform differences/changes (different OS/Firmware, with updates over time) that have to be taken into account.
So they do "make sure the streams match up with the hardware devices", but those hardware devices are not monolithic - there are different models even within a given platform, and the device OS/firmware/APIs change over time, and all of that has to be taken into account on both the server and client sides (e.g. Amazon releases a new FireOS version, and/or a new Fire TV Home version, and app developers (Netflix, Apple TV) then release app updates to account for new/changed APIs/functionality/etc).
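As a rough illustration of what that means for an app developer (every name here is hypothetical - this is not any vendor's real SDK, just the general shape of the problem), the same client app often has to branch on the OS/API level the device reports, using newer APIs where they exist and keeping deprecated fallbacks alive for older hardware:

```python
# Hypothetical sketch: one app binary gating features on the device's
# reported OS/API level. None of these names are from a real vendor SDK.

def pick_playback_api(api_level: int) -> str:
    """Choose which (made-up) playback API path the app should use,
    mirroring how real apps gate functionality on OS/firmware version."""
    if api_level >= 30:
        return "media_session_v3"   # newest API, added in a recent OS update
    elif api_level >= 21:
        return "media_session_v2"   # older API, still supported
    else:
        return "legacy_player"      # deprecated path kept for old hardware
```

So the "same" app behaves differently across device models and firmware versions, and every OS update is a chance for one of those branches to need rework.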
Effectively all these devices are just use-specific computers, just like more "traditional" computers with Operating System updates and upgrades and fixes and patches (leading to new/changed APIs etc), running on different hardware platforms/mixes, requiring constant OS/hardware-driver/app updates (for security/compatibility/functionality/usability/etc) - the difference being that the OS/driver/app updates/upgrades are all integrated and automated and mostly beyond our control (officially).
And just like in the rest of the computing-sphere, streaming app developers have to keep up with all the OS/API changes and updates on all the different platforms they develop streaming apps for - they design their app/streaming framework to be as flexible and abstracted as possible for/from the hardware/OS/API/platform differences, but ultimately there are differences (e.g. HDR10 versus Dolby Vision, or EAC3-JOC Atmos versus PCM MAT 2.0 Atmos) which lead to different experiences in terms of audio/video/usability - and some companies just have better app designs and/or are just better at tracking and maintaining their client apps.
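The format-selection side of that abstraction can be sketched roughly like this (the capability names and preference orderings are made up for illustration - real services do far more elaborate manifest/capability negotiation than this):

```python
# Hypothetical sketch: picking the best stream variant a device claims
# it can decode. Format names and preference order are illustrative only.

VIDEO_PREFERENCE = ["dolby_vision", "hdr10", "sdr"]
AUDIO_PREFERENCE = ["eac3_joc_atmos", "eac3", "aac_stereo"]

def pick_variant(device_caps: set[str], preference: list[str]) -> str:
    """Return the most-preferred format that the device reports supporting."""
    for fmt in preference:
        if fmt in device_caps:
            return fmt
    raise ValueError("no mutually supported format")

# e.g. a device that handles HDR10 and Atmos, but not Dolby Vision:
caps = {"hdr10", "sdr", "eac3_joc_atmos", "aac_stereo"}
```

Two devices playing the "same" title from the "same" service can end up on different video/audio variants purely because of what each one reports it can handle - which is exactly why the experience differs box to box.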
And all of this is why you see so many folks hereabouts using multiple streaming devices/platforms/apps - to try and maximize their experiences for any given content/service by using the device/app that best suits their preferences, knowing all the advantages/disadvantages that exist for all/any of them.