You are correct, in an ideal world.
Originally Posted by Ryan1
This makes no sense on multiple levels.
That's simply because we're looking at it under different assumptions. Both you and I can be correct if we define the assumptions differently.
(1) Assumption #1 that I made: the angular field of view remains the same, e.g. 30 degrees of vision.
(2) Assumption #2 that I made: content-provider bitrates go up, but not to Blu-ray ratios (bits per pixel high enough to eliminate artifacts).
Originally Posted by Ryan1
First, pixels have nothing to do with artifacts. You either can see the individual pixels, or you cannot.
Pixels combine into artifacts at the macro level. Pixels are the building blocks of an image, and imperfect pixel colors (compression loss) combine into imperfect artifacts at the macro level.
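To illustrate (a toy sketch, not any particular codec: it just heavily quantizes one 8x8 DCT block, the building block of JPEG/MPEG-style compression; the quantizer step and pixel values are arbitrary assumed numbers):

```python
import numpy as np
from scipy.fftpack import dct, idct

# Toy illustration (not any real codec): heavily quantizing the DCT
# coefficients of one 8x8 block smears error across the whole block,
# which is why compression loss shows up as macro-level patches
# (blocking/banding), not as single wrong pixels.

rng = np.random.default_rng(0)
block = rng.integers(100, 156, size=(8, 8)).astype(float)  # noisy gray patch

def dct2(b):  return dct(dct(b.T, norm='ortho').T, norm='ortho')
def idct2(b): return idct(idct(b.T, norm='ortho').T, norm='ortho')

q = 40.0                                  # coarse quantizer step (assumed)
coeffs = np.round(dct2(block) / q) * q    # quantize: fine detail rounds to zero
decoded = idct2(coeffs)

err = decoded - block
print(f"pixels changed: {(np.abs(err) > 0.5).sum()} of 64")
print(f"max error: {np.abs(err).max():.1f} levels")
```

Nearly every pixel in the block shifts at once, which is exactly why compression loss reads as macro-level patches rather than one stray wrong pixel.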
Originally Posted by Ryan1
From 10', you are highly unlikely to be able to tell the difference between 720p and 1080p grids on a 50" screen. And even if you are eagle-eyed, you are highly unlikely to discern any difference between 1080p and 4K.
I am not talking about individual pixels.
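In fact, I don't dispute the pixel-visibility math at all. It's easy to sanity-check against the common ~1 arcminute acuity rule of thumb (a rough sketch; the screen size, distance, and acuity figure are the usual assumed ballpark values):

```python
import math

# Rough sanity check of pixel visibility at viewing distance,
# assuming the common ~1 arcminute visual-acuity rule of thumb.
# All figures below are illustrative assumptions, not measurements.

def pixel_arcminutes(diagonal_in, horiz_px, distance_in, aspect=(16, 9)):
    """Angle subtended by one pixel, in arcminutes."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)   # screen width from diagonal
    pitch_in = width_in / horiz_px                  # size of one pixel
    return math.degrees(math.atan2(pitch_in, distance_in)) * 60

for label, px in [("1080p", 1920), ("4K", 3840)]:
    arcmin = pixel_arcminutes(50, px, 120)          # 50" screen, 10 feet away
    print(f"{label}: {arcmin:.2f} arcmin/pixel "
          f"({'below' if arcmin < 1 else 'above'} ~1 arcmin acuity)")
```

At roughly 0.65 arcminutes per pixel for 1080p at that distance, individual pixels are already below the acuity threshold. That's precisely why my argument is about artifact clusters, not pixels.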
Originally Posted by Ryan1
Second, if processing x-number of bits is causing artifacts, trying to process 4 times x-number of bits with finite resources is likely to result in more apparent artifacts.
Again, I reiterate:
(2) Assumption #2 that I made: content-provider bitrates go up, but not to Blu-ray ratios (bits per pixel high enough to eliminate artifacts).
You've got 4 times as many pixels in the same angular view. Bitrate is also going to go up, but not proportionally into Blu-ray territory. If providers use the same bitrate ratios as today's cable TV, Netflix, and iTunes, we *are* still going to see artifacts in 4K at 50" from 10 feet away. Guaranteed: the compression artifacts are blatantly obvious from 10 feet away at 1080p today, and they're not going to magically disappear as long as those assumptions hold. Content providers ARE going to skimp on bitrate, compressing "just enough" that artifacts still show up, even if the result is vastly better than 1080p streaming.
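The back-of-the-envelope arithmetic looks like this (the bitrates are assumed ballpark figures for illustration, not quotes from any actual provider):

```python
# Back-of-the-envelope bits-per-pixel comparison.
# All bitrates are rough ballpark assumptions, not official figures.

streams = {
    "Blu-ray 1080p":            (1920 * 1080, 30e6),  # ~30 Mbps video (assumed)
    "Streamed 1080p":           (1920 * 1080,  5e6),  # ~5 Mbps (assumed)
    "Streamed 4K (same ratio)": (3840 * 2160, 20e6),  # 4x the 1080p stream rate
}

fps = 24  # typical film frame rate
for name, (pixels, bps) in streams.items():
    print(f"{name}: {bps / (pixels * fps):.3f} bits/pixel/frame")
```

At the same bits-per-pixel ratio as today's 1080p streams, a 4K stream still sits at a fraction of Blu-ray's per-pixel budget, so the artifacts come along for the ride.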
Just by having 4K, we will at least gain better streaming quality than 1080p, and we'll finally have "slightly over-compressed 4K" that looks roughly as good as "well-compressed Blu-ray" at consumer viewing distances from consumer-sized HDTVs.
Are you going to buy Comcast and Charter, and mandate a rule that 4K broadcasts must run at a minimum of 100 megabits per second?
Are you going to wave a magic wand and force Netflix to blast 100 megabits per second for 4K video, in order for your argument to make better sense than mine? (Please throw in free Google Fiber for everyone, if you do.)
I thought not.
Conclusion: Compression artifacts are still going to exist. Full stop.
I reiterate: we aren't talking about individual pixels. We're talking about clusters of imperfect pixels (a.k.a. compression artifacts), which are inevitable with 4K Comcast, 4K Charter, 4K Netflix, etc.
Originally Posted by Ryan1
As others have pointed out, 4K will result in noticeable improvements in cases where pixels are visible on a 1080p screen, but for a distance of 10', we are talking 100" or so at a minimum.
The other option is to just boycott low-bitrate 4K, but that's not something everyone is going to do. At the very *least*, 4K will generally look better than today's 720p, 1080i, and 1080p from the same content providers for the SAME angular field of view. By not upgrading to 4K, we don't get that image-quality upgrade, which is still noticeable at 10 feet from a 50-inch screen. And sometimes people do sit closer (e.g. games, movie nights, photos, use as a monitor, etc.). If it costs only a few dollars more, why avoid 4K?
See? My argument factors in the inevitable economics.