Originally Posted by blazar
Home automation is a mess for lack of standards as well as an unwillingness for companies to standardize for fear of being "too interoperable" and therefore obsolete in a sense.
That's kind of arguable. If you look at the companies that would have to agree on any sort of common protocol, 99.9% of them aren't in the home automation business. And even if they all adopted one, that interoperability wouldn't be the deciding factor when a customer selects their gear; purchases would still be driven by the same things that drive those decisions now. So such standardization wouldn't be a factor for them one way or the other.
Without that 99.9%, it would make little difference if the actual automation system vendors became more interoperable. That wouldn't improve the situation more than fractionally, if at all. The automation vendors don't make most of the gear that needs to be integrated; they make the gear that has to do the integrating.
So I don't think it's fair to say that the situation has much to do with fear of becoming too interoperable, since that 99.9% of companies would only benefit from it, though for most of them the benefit would be pretty smallish.
Still, even if such a protocol were adopted ubiquitously (and I periodically harp on this because it's important), a common protocol isn't going to make that much difference. There are a number of reasons, and I've written at length about them before, particularly in a previous thread here about Apple's protocol, which was being hyped out of proportion by Applites as though it were somehow going to solve all of the automation world's problems. A standard protocol is just a small part of the issue, not a panacea at all.
On the issue of device support, that's hard to deal with. To throw some example numbers out there: say 1000 new devices/models/modules came out this year that some number of customers would like supported, which is probably conservative for most years. And say it takes, on average, a man-week to write a good, solid, two-way driver once all of the time costs are factored in (not unreasonable). 1000 devices times a man-week is 1000 man-weeks, which is about 19.2 man-years. At $100K or more in salary/benefits overhead per man-year for people with the skills to create good drivers, that's roughly $1.9M just in labor to support those devices, each of which may only be applicable to a handful of customers.
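The back-of-the-envelope math above can be sketched as a quick calculation. All the inputs are the illustrative figures from this post, not real vendor data:

```python
# Back-of-the-envelope driver-support cost model.
# Every input below is an illustrative assumption, not a real vendor figure.
NEW_DEVICES_PER_YEAR = 1000      # new devices/models worth supporting
WEEKS_PER_DRIVER = 1             # avg effort for a solid two-way driver
WORK_WEEKS_PER_YEAR = 52
COST_PER_PERSON_YEAR = 100_000   # salary/benefits overhead, USD

person_weeks = NEW_DEVICES_PER_YEAR * WEEKS_PER_DRIVER
person_years = person_weeks / WORK_WEEKS_PER_YEAR
labor_cost = person_years * COST_PER_PERSON_YEAR

print(f"{person_years:.1f} person-years, ~${labor_cost:,.0f} in labor")
# → 19.2 person-years, ~$1,923,077 in labor
```

Note that this counts only driver-writing labor; it excludes the hardware acquisition, lab space, and regression-testing overhead discussed next, and the total compounds yearly since prior devices still need support.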
And that leaves aside the cost of getting those devices in house, keeping them around for regression testing, setting them up in realistic scenarios to test them, and keeping people spun up on the ins and outs of each one, plus the facilities all of that requires. Some of these devices are complex. So you might ultimately be talking tens of millions of dollars in expenses, and that overhead grows every year, since you have to keep supporting the previous years' devices as well.
Customers just have to accept, to some degree, that the deluge of devices cannot be kept up with unless they're willing to pay a lot more for the product to finance it. The only companies that can get around that to some degree are those so large (e.g. Crestron) that device vendors often feel obligated to provide the drivers themselves. Of course, those vendors usually aren't automation experts, so quality can vary, and bad drivers make for bad automation solutions, which is not something a company wants happening. Even just providing validation services for such vendor-provided drivers would be a huge undertaking.
Anyway, it's a lot to bite off, even for a large company.