Originally Posted by DaViD Boulet
Let's assume that's true for the sake of your argument.
Damned by faint praise
If it was, then still, why would 1080 not benefit from upsampling?
I explained that, David.
If I pull up a 1920 x 1080 desktop image from my PC I can see the limits of the 1920 x 1080 pixels from between 1 and 1.5 screen widths without having 20/20 vision. Are all HT viewers mandated to sit farther away than 1.5 screen widths by a law of cinema? 30 degrees is recommended as a minimum viewing angle, not a maximum for peripheral stimulation, so why not enjoy it as a minimum rather than a maximum in our HT venues as well? Why not an HT system that's friendly to front-seat viewers as well as mid-to-rear-seat viewers?
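As a rough sanity check on the geometry behind this claim, here is a sketch in Python. It is idealized (flat screen, on-axis viewer, the commonly quoted ~1 arcminute acuity for 20/20 vision, and no lens or screen blur), so treat the numbers as ballpark only:

```python
import math

# Angular size of one pixel, in arcminutes, for a display h_pixels wide
# viewed from a distance expressed in screen widths. Idealized geometry:
# flat screen, on-axis viewer, no lens/screen blur.
def pixel_subtense_arcmin(h_pixels, screen_widths):
    pitch = 1.0 / h_pixels  # pixel pitch as a fraction of screen width
    return math.degrees(math.atan(pitch / screen_widths)) * 60

# 1080p at 1.5 screen widths: ~1.19 arcmin -- right around the ~1 arcmin
# figure usually quoted for 20/20 vision, so pixel structure is borderline.
print(round(pixel_subtense_arcmin(1920, 1.5), 2))

# 4K at the same distance: ~0.6 arcmin, comfortably below that figure.
print(round(pixel_subtense_arcmin(3840, 1.5), 2))
```

On this idealized math, a 1080p pixel at 1.5 screen widths sits near the edge of visibility, which is why real-world factors (lens blur, convergence, screen texture) end up deciding what any given viewer actually sees.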
I have not said any of this to you. I have said the following:
1. Upsampling results in no new data and no increased resolution/detail. None. The math dictates this, so it is not an opinion.
2. If from your seating position you see the pixel structure, then getting a projector with finer pixels will solve that problem.
3. Due to the realities of pixel structure, when you resample to finer pixels the image will subjectively look somewhat softer. Not a lot, but somewhat softer. People like to talk about these images being "smoother." The way an image becomes smoother is by having less sharp edges.
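Point 1 can be illustrated with a minimal sketch in plain Python (the function names are mine, not from any real scaler): nearest-neighbor upsampling is exactly undone by decimation, so the upsampled signal cannot contain any information the original lacked, and interpolated pixels are just weighted averages of pixels that were already there.

```python
# Nearest-neighbor upsample: each sample is simply repeated `factor` times.
def upsample_nn(samples, factor):
    return [s for s in samples for _ in range(factor)]

# Decimation: keep every `factor`-th sample.
def decimate(samples, factor):
    return samples[::factor]

row = [12, 200, 45, 45, 90]      # one row of pixel values
up = upsample_nn(row, 4)         # 4x the "resolution"

# The original is recovered exactly: the extra pixels added nothing.
assert decimate(up, 4) == row

# Linear interpolation likewise only averages neighbors -- no new detail.
def upsample_linear_2x(samples):
    out = []
    for a, b in zip(samples, samples[1:]):
        out += [a, (a + b) / 2]
    return out + [samples[-1]]

print(upsample_linear_2x([0, 10, 20]))   # [0, 5.0, 10, 15.0, 20]
```

Every output value is either a copy or an average of input values, which is the sense in which "the math dictates" that no detail is created.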
Being able to enjoy native 1080p-encoded images (via 4K upsampling) from between 1 and 1.5 screen widths, without visible pixel structure but maintaining clarity, would benefit front-row viewers in many viewing rooms, and create a more immersive image for those wishing to view constant-width 2.35:1 systems which attempt to widen, rather than shorten, scope pictures.
The vast majority of people are not seeing pixel structure from 1.5X. Not even close. If you are sitting closer than that, then you are likely seeing convergence errors in your projector, artifacts in the image, etc. Getting finer pixels doesn't solve those problems.
Bjoern's excellent examples of The Fifth Element upconverted DVD demonstrate how upsampling can better reveal detail that's inherent in the native image but obscured by quantization/pixel noise.
Context is important. As I said, no one is saying that you should watch 480p video on a 10 foot screen. You will see the pixel structure and therefore a 1080p projector will show a much more pleasing image.
There is however a crossover point at which the pixel structure is adding just enough high frequency noise to make the lower resolution image look sharper. Since you are a believer in Bjoern's work, here is an interchange documenting that from the archive: http://archive.avsforum.com/avs-vb/s...?postid=691456
Poster: but even then, the hard edges of the pixels add artificial high frequency content to the image. That's why we like to use higher resolution modes, and scale up to them.
Bjoern: That's right. The higher you scale, the less aliasing noise (hard edges) you get.
Again, if the pixels are quite large and visible, then the harshness becomes a liability and resampling helps a lot with that. But in the tiny amounts typical at our viewing distances with 1080p, it can actually be beneficial, and getting rid of it will make the image look a bit softer subjectively.
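The aliasing-noise point can be made concrete with a toy measurement (my own crude metric, not anything from Bjoern's actual tests): start from the same source samples, upsample them two ways, and compare how much high-frequency energy the hard pixel edges contribute.

```python
# Sum of squared sample-to-sample differences: a crude proxy for
# high-frequency ("hard edge") energy in a 1-D signal.
def edge_energy(samples):
    return sum((b - a) ** 2 for a, b in zip(samples, samples[1:]))

def upsample_nn(samples, factor):        # hard pixel edges (staircase)
    return [s for s in samples for _ in range(factor)]

def upsample_linear_2x(samples):         # smoothed edges
    out = []
    for a, b in zip(samples, samples[1:]):
        out += [a, (a + b) / 2]
    return out + [samples[-1]]

ramp = [0, 1, 2, 3, 4]                   # a smooth gradient in the source

nn = upsample_nn(ramp, 2)
lin = upsample_linear_2x(ramp)

# Same underlying samples, but the hard-edged version carries more
# high-frequency energy. That extra energy is the "sharpening" the pixel
# grid adds, and what a smoother resample trades away for a softer look.
print(edge_energy(nn), edge_energy(lin))   # 4 vs 2.0
```

Whether that extra edge energy reads as pleasing sharpness or as visible screen-door depends on how large the pixels are at your seat, which is exactly the crossover being described.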
Such algorithms interpolate new pixels, but not additional detail. I'm suggesting that 1080p images also have quantization noise and can have their native detail better delivered by careful upsampling as well. No "new" detail need be invented... just good algorithms to reduce quantization noise to best reveal the source's inherent detail.
Hopefully you see from above that as non-intuitive as it might be, the quantization noise can be your friend. It wasn't in 480p world because its level was huge. Not so with 1080p.
And again remember that your content has been sharpened relative to a 1080p display in post. Not 4K. If you get rid of that extra quantization noise, you will have a softer image than the talent approved.
If 1080p captured all possible visible detail at 1.5 screen widths, then native 4K at 1.5 screen widths would look identical to native 1080p, without any detectable difference whatsoever.
There are no fast and loose formulas like the ones you are using. A typical PJ at less than $10K will have a huge amount of lateral CA, lens issues, non-flat MTF, an AT screen, etc., all of which act as different types of filters between you and the pixel edges. At what point you see the pixel structure is highly dependent on these factors, which in virtually all cases hide pixels rather than reveal them. I know with our $8K JVC, if I got close enough to see the pixels, I would see a rainbow of colors due to CA.
Here is a shot I have from a $65K three-chip DLP projector:
You can see more than a pixel of bleed. So the quantization noise is nothing like what you might imagine from the simulations people post on forums. Here is another shot showing the offset of each color:
As you can clearly observe, the three primaries do not line up. As such, pixels do not just go from white to black, but from white to other shades and then black.
If 4K looks better at all (all other variables being the same), then this demonstrates that quantization noise is present and visible at the 1080 level (hence, the potential for 1080p to benefit from upscaling to reduce quantization/pixel noise and improve visible detail as a result).
Who conceded that 4K looks better? At typical viewing distance you may not see the difference at all.
At CES Panasonic had a 4K display, but in a small size. I want to say 24 inches. They had the exact same size display showing 1080p. I stood there for five minutes and only occasionally could I tell the sharpness difference. This was from a ~2 foot distance. If I stepped back at all the difference would easily vanish, and at any rate, on many scenes there was no difference. Both displays looked great and folks would have been happy with either.
Sounds good to me. Apparently Sony feels that this reduction in stairstep jaggies benefits native 1080p images. I don't disagree.
So you first have to determine whether, from your seating position, you see said jaggies, which I can tell you is very hard on real video material as opposed to computer sources. And further, even if they are there and you remove them, you are going to get a softer image, per the above.
Please guys, don't believe in video alchemy. There is a world out there that wants you to dump a huge amount of money toward 4K. But they couldn't even be bothered to provide the source in that resolution. Don't fall for it. Know that it only solves one problem (pixel structure being too visible) and it comes at a cost, both in softening the video and of course in the price of the display, AVR, etc.