Originally Posted by Michael9009
At this point in time, I think 4k is a manufacturer's gimmick.
I was going to disagree with you, but most of the dictionary definitions of "gimmick" do in fact apply to 4K, and to most things we enjoy:
- an ingenious or novel mechanical device
- an important feature that is not immediately apparent
- an ingenious and usually new scheme or angle
Though I believe you were getting at this one: "a trick or device used to attract business or attention".
4K falls quite squarely into the more positive definitions and is definitely not a "trick".
1. HD television (720p and 1080p) was introduced in the mid-'90s. It took more than ten years until the general population started to buy into it.
So is HD a "gimmick", a trick?
a. Most people were used to watching TV from relatively far away and did not see a big enough difference in picture quality to justify the cost of a new television set. Most people were also not used to replacing the TV every five years, but rather at 10-20 year intervals.
b. I bought my first HD TV in 2002: a 300 lb Sony KV-40XBR800 CRT television. Most people who saw it were not impressed by its picture quality versus SD.
c. Two changes made the difference: (i) the introduction of thin, flat-screen TVs, which people saw as fashionable, trendy and cool. They did not care about the PQ as much as they cared about the TV's slim profile, and many continued to watch only SD programming on their state-of-the-art units; and (ii) the discontinuation of CRT TVs, which forced anyone who had to upgrade to buy an HD TV, as there were no other choices left.
I don't think any of that made HD a "trick".
2. At the end of the 2000s, manufacturers realized that most people had already upgraded to HD TV units, and sales began to decrease. They had to find new ways to attract buyers. So, here came:
a. The 3D gimmick. Obviously a failure. Outside of a dedicated HT, how many people would be willing to wear weird spectacles to watch 3D TV? Also, many people do other things while watching TV, and the 3D spectacles no longer allowed that. And many would feel nauseated after watching 3D programming for a while. Epic failure.
b. After the 3D gimmick failed, they came up with 4k. This one may have real PQ benefits, and we enthusiasts will appreciate them. For the general population, though, who have recently upgraded their TVs, the slight increase in perceived PQ and the lack of 4k programming will not make the new technology worthwhile. It will take another 10+ years to reach mass adoption, and only if the manufacturers stop making 1080p display gear. Unfortunately, they have already made the TVs very thin, so I am not sure what the manufacturers could change to further increase the coolness factor.
4K's been in the works for way more than the last "few" years; technology is just finally making it practical.
The same can be said about the PJs. 4k is a technology for enthusiasts only, for now and for the immediate future, I think.
Just because something is niche or of limited interest doesn't make it a "trick" or a "gimmick".
Originally Posted by Michael9009
If it's something that alters and post-processes the image, not for me, thanks. I run an unaltered video channel straight from the Blu-ray player, through the receiver to the projector.
Everything you see is altered and post-processed. Do you calibrate your home projector for <2000:1 sequential contrast to match what you get in a theater? Do you disable scaling and watch 480p and 720p in small windows on your projector? Or do you have a separate projector with the correct native resolution for everything you watch? What exactly does it mean to not be "post-processed"? What is "accurate"? Unless you are watching on exactly the same display that was used for the mastering, what you see on your screen is altered. Unless you watch on the same DCI machine that you saw at the theater, what you see is altered.
Regardless of whether any digital processing is turned on or introduced, every projector (its imaging device, lens, light engine, lamp) and every screen "alter" the image and impart their own fingerprint.
The point is, "accurate" is not black and white, and it's not as simple as just turning off features. The best way I can describe Darbee is as a lens compensation system: it produces an image approximating what you'd get from a projector with a better lens, with better sharpness and higher MTF. It does not "alter" the image; it's like upgrading to a better projector, but for $300 instead of $3,000 or $30,000 (obviously not to say it's equivalent).
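Darbee's actual algorithm is proprietary, so I can't show what it really does, but the general family of techniques it belongs to (boosting local contrast to raise apparent sharpness/MTF) can be illustrated with a plain unsharp mask. A minimal, purely illustrative Python/NumPy sketch; the blur radius and strength are made-up parameters, not anything from Darbee:

```python
import numpy as np

def box_blur(img, radius=1):
    """Simple box blur; a crude stand-in for a proper Gaussian blur."""
    pad = np.pad(img, radius, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    size = 2 * radius + 1
    for dy in range(size):
        for dx in range(size):
            out += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (size * size)

def unsharp_mask(img, amount=0.6, radius=1):
    """Boost local contrast by adding back the difference between the
    image and a blurred copy -- a rough analogue of 'raising the MTF'."""
    blurred = box_blur(img.astype(np.float64), radius)
    sharpened = img + amount * (img - blurred)
    return np.clip(sharpened, 0, 255).astype(np.uint8)

# Synthetic test: a soft luminance ramp gets visibly steeper transitions.
ramp = np.tile(np.linspace(80, 170, 16), (8, 1)).astype(np.uint8)
print(ramp[0])
print(unsharp_mask(ramp)[0])  # edges are exaggerated vs. the original row
```

Darbee's processing is presumably far more content-adaptive than this, but the goal is the same flavor: steeper edge transitions and higher apparent MTF, which is why it reads as a "better lens" rather than a filter.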
Originally Posted by Charles R
When I read comments about 4k not being important I think about phone and tablet crazy resolutions. My tablet is 1920x1280 at 8.9"... more pixels than my projector at 120". When are we getting Retina Display projectors...
Yup, that's what I'm waiting for. There's more to resolution than simply not "seeing" pixels.
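For anyone who wants to put numbers on that: raw pixel density (PPI) doesn't tell the whole story; what matters is pixels per degree at your actual seating distance. A quick back-of-the-envelope Python script; the viewing distances (15" for the tablet, 10 ft for the projector) are my assumptions, not Charles R's:

```python
import math

def ppi(h_px, v_px, diag_in):
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(h_px, v_px) / diag_in

def pixels_per_degree(density_ppi, distance_in):
    """Pixels subtended by one degree of vision at a given distance."""
    return density_ppi * distance_in * math.tan(math.radians(1.0))

tab = ppi(1920, 1280, 8.9)   # the tablet from the quote
pj = ppi(1920, 1080, 120)    # 1080p projector on a 120" diagonal

print(f"tablet:    {tab:5.0f} ppi, {pixels_per_degree(tab, 15):4.0f} px/deg at 15 in")
print(f"projector: {pj:5.0f} ppi, {pixels_per_degree(pj, 120):4.0f} px/deg at 10 ft")
# Roughly 60 px/deg is the usual 20/20 "retina" rule of thumb.
# The tablet clears it (~68 px/deg); the 1080p projector comes in
# around 38 px/deg even from 10 ft away.
```

Which is the point: a 1080p projector at normal seating distances is nowhere near the acuity limit, so there's real headroom for 4K and beyond even if the pixel structure itself isn't obvious.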