Originally Posted by markmon1
This is absolutely not proof that the chart is wrong. The increased bitrate on the 4K source, all by itself, can provide a much sharper, more detailed image without the resolution itself being part of it. The 1080p disc is not HDR, and contrast/color changes alone can provide what appears to be a much sharper and more detailed image. There are tons of factors here, and 4K vs. 1080p cannot really be isolated outside of running an HTPC for side-by-side comparisons.
The bitrate HAS to increase if there is more information (i.e., 4x the pixels to display). BUT, from what I have seen, UHD Blu-rays don't end up with 4x the bitrate of their 1080p counterparts; it is usually lower (typically I see about 2x the bitrate), meaning that for the number of pixels, there is more compression. So in effect, typical UHD content has a lower bitrate per pixel.
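To put rough numbers on that (the bitrates below are illustrative figures of the kind I typically see, not measurements from any specific disc), here is a quick back-of-the-envelope calculation:

```python
# Back-of-the-envelope bits-per-pixel comparison. The bitrates are
# illustrative/hypothetical, not measurements from a specific disc.

def bits_per_pixel(bitrate_mbps: float, width: int, height: int, fps: float) -> float:
    """Average compressed bits available per pixel per frame."""
    return bitrate_mbps * 1_000_000 / (width * height * fps)

bd = bits_per_pixel(30, 1920, 1080, 24)    # ~30 Mbps 1080p Blu-ray
uhd = bits_per_pixel(60, 3840, 2160, 24)   # ~60 Mbps UHD: 2x bitrate, 4x pixels

print(f"1080p Blu-ray: {bd:.2f} bits/pixel/frame")   # ~0.60
print(f"UHD Blu-ray:   {uhd:.2f} bits/pixel/frame")  # ~0.30
```

One caveat worth noting: UHD discs use HEVC rather than the AVC on most 1080p discs, and HEVC compresses more efficiently, so half the bits per pixel does not translate one-to-one into half the quality.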
You are correct, there is MUCH more to 4K UHD vs. 1080p than just pixels, and that is part of my point. We aren't just talking about a 1080p static image vs. the same image upscaled to 4K while sitting at a specific distance from the display. We are talking about more image information: a higher-resolution source, a wider color gamut, and a greater dynamic range. Combined, this becomes far more than how well your eyes can resolve pixels from your seating distance.
When people talk about "4K" in the context of a home theater, they are talking about the whole package, not just the number of pixels. So telling them that they will gain nothing by going to 4K because they don't sit close enough to the screen for its size is simply bad advice, IMHO. As a rough guideline, I buy it, but as a hard and fast rule, my eyes tell me it is wrong.
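For context, here is a sketch of where those chart numbers typically come from. It assumes, as most of the charts seem to, that 20/20 vision resolves about one arcminute of detail; the formula and the 65" screen are my own hypothetical illustration, not taken from any particular chart:

```python
import math

# Sketch of the assumption behind the usual resolution/viewing-distance
# charts: 20/20 vision resolves roughly 1 arcminute, so the charts mark
# the distance at which a single pixel subtends 1 arcminute.

ARCMIN = math.radians(1 / 60)  # one arcminute, in radians

def pixel_blend_distance_ft(diagonal_in: float, horizontal_px: int) -> float:
    """Distance (feet) beyond which one pixel of a 16:9 screen
    subtends less than 1 arcminute."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # screen width from diagonal
    pixel_in = width_in / horizontal_px              # size of one pixel
    return pixel_in / math.tan(ARCMIN) / 12          # inches -> feet

for label, px in [("1080p", 1920), ("4K", 3840)]:
    d = pixel_blend_distance_ft(65, px)
    print(f'65" {label}: individual pixels blend beyond ~{d:.1f} ft')
# Output: ~8.5 ft for 1080p, ~4.2 ft for 4K on a hypothetical 65" screen.
```

Notice what that actually computes: the distance at which individual pixels stop being separable, which is exactly the criterion I take issue with below.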
I do understand that at some point higher resolution becomes moot because you can no longer resolve more detail, but I don't think that point is the same for everyone, and I don't think it has anything to do with whether you can make out individual pixels. When I go to the eye doctor and read that bottom line, I can score 20/20 despite the fact that the letters are blurry. There is just enough detail that my brain can resolve them and determine what they are. They're still blurry, though, and someone else who scores 20/20 could easily have better vision than me.

Likewise, if I look at one pixel in 1080p and then at the corresponding 4 pixels in 4K that make up the same dot, I wouldn't be able to tell you the difference between them. But build a larger image from those pixels, and my brain starts to see the difference even though I couldn't spot one at the smaller scale. The point is that you can't just say you have to sit close enough to resolve individual 1080p pixels in order to gain anything from 4K. You need to look at the images yourself and determine whether you can see a difference.
You can all disagree, but I know what I can see, and the charts don't accurately represent whether it is "worth it" for me. YMMV.