Originally Posted by hpmoon
UHD is extremely elementary, old technology. It needs to be sub-$1k NOW
This really made me chuckle.
I'm not sure why I saw so much humor in it... Is it just me?
I'm sure it is! Anyway, I digress...
Originally Posted by hpmoon
I'm so tired of people making excuses for manufacturers who time innovation according to profit margins (with arguable anti-trust violations when they cut private deals to prevent cannibalizing high-margin "professional" gear). UHD is extremely elementary, old technology. It needs to be sub-$1k NOW, as surely as my Acer 1080p DLP projector cost me $600. As long as people like you make excuses, the executives will keep buying up McMansions and delaying release of minimum-spec gear while delighting in the profitability of their market-gaming portfolios.
My Panasonic GH4 cost $1699. Getting to UHD on the acquisition side was no big deal. The display industry is stubborn and it will once again undermine their long-term profits, in the same way that 3D got killed by greed.
The GH4 is a great camera. It's not a good comparison here, though.
Think about it: digital photography, even in its budget form, has exceeded the 8-12 megapixel barrier (that is, '4K') for the better part of the last decade. Many of these budget cameras have offered fairly capable burst modes at their maximum resolution: say, 4 frames per second.
The jump to 24 frames per second - the maximum at full resolution for your GH4 - wasn't so much a technical barrier as a resource-allocation barrier. That is, imaging sensors have supported resolutions in excess of 4K for many years; one of the main limitations for video has been the processing power required to manage and maintain a steady stream of 4K images at the 24 fps video requirement. It's easier to add processing power to an already-established imaging-device paradigm than it is to fabricate a new-design imaging device while still maintaining the same good image quality.
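To put that resource-allocation point in rough numbers, here's a back-of-envelope comparison of the raw pixel throughput a stills burst demands versus UHD video. The stills figures (a ~16 MP sensor at a 4 fps burst) are illustrative assumptions, not any particular camera's specs:

```python
# Back-of-envelope: sustained pixel throughput the image pipeline must handle.
# All figures are illustrative assumptions for the sake of comparison.

def pixel_throughput(width, height, fps):
    """Pixels per second the processing pipeline must sustain."""
    return width * height * fps

# A typical budget stills camera: ~16 MP sensor, 4 fps burst (assumed)
burst = pixel_throughput(4608, 3456, 4)

# UHD video: 3840x2160 at 24 fps
video = pixel_throughput(3840, 2160, 24)

print(f"16 MP burst:   {burst / 1e6:.0f} MP/s")
print(f"UHD at 24 fps: {video / 1e6:.0f} MP/s")
print(f"ratio: {video / burst:.1f}x")
```

So sustaining UHD video means roughly triple the pixel rate of that burst mode, and continuously rather than in a short buffered burst - a processing problem, not a sensor problem.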
By that token, if you want a more analogous example, I feel you should be comparing the resolution history of your image-acquisition device instead. Keep in mind that digital cameras took over two decades from their early sub-1-megapixel origins to truly compete with film - especially in the budget segment.
4K stills/bursts were available years ago to anyone on a budget. Thereafter it was a matter of increasing burst-speed to video-levels.
In contrast, when you're dealing with projection, the imaging devices - DLP DMD chips, LCD chips, LCoS chips - are all a fraction of an inch in size. It's certainly easier to quadruple the 1080p resolution on a 60" TV than it is to do the same on a sub-1" projection imaging device. Costs will certainly come down - and have already started to - as 4K LCoS chips become more popular and cheaper to manufacture. But the volumes are still small, and the demand isn't massive.
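To see how much harder the small-chip problem is, consider the pixel pitch each device has to achieve at 4K. The 0.7" chip diagonal below is an assumption (a common DMD size), used purely for illustration:

```python
# Illustrative pixel-pitch comparison: a sub-1" projection chip vs. a 60" TV
# panel, both at 4K horizontal resolution. The 0.7" diagonal is an assumption.
import math

def pixel_pitch_um(diagonal_in, h_pixels, aspect_w=16, aspect_h=9):
    """Horizontal pixel pitch in microns for a 16:9 panel of given diagonal."""
    width_in = diagonal_in * aspect_w / math.hypot(aspect_w, aspect_h)
    return width_in * 25.4 * 1000 / h_pixels  # inches -> microns, per pixel

print(f'0.7" chip at 4K: {pixel_pitch_um(0.7, 3840):.1f} um/pixel')
print(f'60" TV at 4K:    {pixel_pitch_um(60, 3840):.0f} um/pixel')
```

The TV's pixels come out dozens of times wider than the chip's, so the panel maker has enormously more fabrication headroom than the chip maker does.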
Think about it: in the TV segment, 4K is still a tad on the niche side. There are practically no 4K broadcasts, and there's no optical media for it yet.
Heck, HDCP 2.2 - the copy-protection scheme intended to protect 4K content - was only finalized two months ago! That's right: most older 4K devices are now going to have a problem playing copy-protected 4K material. Even the receiver I purchased 8 months ago - touting its "4K-Readiness" - is somewhat worthless in this regard.
That's not good for market penetration.
And on the subject of market penetration: take a walk around your neighborhood. How many of the people you bump into own flat-panel displays? Just about all of them, right?
Now how many of those people own 4K flat-panel displays? A significantly smaller bunch, correct?
OK, now how many of those people own home-theater projectors? Once again, compared to flat-panels, it's a much, much smaller number.
So we're dealing with a fairly niche market within an already-niche market. Anyone who knows their way around economics will tell you that's not great for manufacturing optimization, nor for pricing.
How long did it take 1080p projection - your Acer, for example - to reach the sub-$1000 price-point? A decade?
Manufacturers don't necessarily want this kind of delay. Think of how companies like BenQ, Optoma and Acer have had such success with their sub-$1000 1080p models, to the point where that price-point has become something of a focus for them. That pricing puts them in volume-seller territory - volume is good for business, and the manufacturing scale it brings is good for margins. I don't think it's as simple as "manufacturers deliberately holding off" - these manufacturers want volume. Even stalwarts like Sony have recently had good success dipping their toes into the sub-$3000 bracket with their 1080p projectors, certainly taking a knock on profit margin to do so but almost certainly making up for it in volume.
As the core projection technologies - DLP (unavailable in home-projection-ready 4K form at present), LCD (also unavailable in home-projection-ready 4K form at present) and LCoS - become available and then cheaper to fabricate, manufacturers will jump on board, and prices will start dropping as economics takes over.
I see it happening in the next few years: certainly not as soon as the "NOW" you're after; but definitely less than the decade it took for 1080p.
But one thing's for certain: given the points above, you'll almost certainly see the $3000 price-bracket broken significantly sooner than the $1000 one.