Originally Posted by *UFO*
Originally Posted by tgm1024
You compared the W900A (2K) to the X900A (4K), both running a 2K Blu-ray movie? You're likely looking at a simple up-convert, which for all we know is limited to nearest neighbor (1 pixel -> 2x2 pixels).
I saw a very informative video on the disaster 4K has in store for the majority of content. If I find it again I'll post it, but basically a 1080p movie will look horrible on a 4K display because of the inability to scale it even remotely correctly. Not even a super-high-end scaler would solve the problem; it's just the way it maps out.
Sorry, but that's completely incorrect. I have no idea where you got that, but it wasn't any place reputable.
First, the point was that you were comparing two TVs, each showing 2K content. You're not going to see a slam-dunk 4K effect from either of them that way, so your side-by-side test wasn't showing you what you thought it was.
Second (and this is important): upscaling 2K to a 3840x2160 (UHD) display using nearest-neighbor sampling is as clean an up-convert as you can get. Do you know what that is? Every incoming 2K pixel is duplicated into a 2x2 box. How is this an advantage over 2K? Technically it isn't, except that the grid between the pixels on the panel ends up thinner relative to the image, so the pixel structure is less visible. That actually does result in a superior display, though not by much.
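To make the 1 -> 2x2 point concrete, here's a minimal sketch in Python/numpy (the function name and the random test frame are just for illustration, not anyone's actual scaler):

```python
import numpy as np

def nn_upscale_2x(frame: np.ndarray) -> np.ndarray:
    """Nearest-neighbor 2x upscale: duplicate every source pixel into a 2x2 box.

    frame: (height, width, 3) RGB array, e.g. a 1920x1080 image.
    returns: (2*height, 2*width, 3) array, e.g. 3840x2160.
    """
    # Repeating each row twice and each column twice is exactly the 1 -> 2x2 mapping.
    return frame.repeat(2, axis=0).repeat(2, axis=1)

# Example: a 1080p frame becomes a 2160p frame with no new pixel values
# invented -- every output pixel is an exact copy of one input pixel,
# so nothing gets blurred, rung, or otherwise mangled.
src = np.random.randint(0, 256, size=(1080, 1920, 3), dtype=np.uint8)
dst = nn_upscale_2x(src)
assert dst.shape == (2160, 3840, 3)
```

The point is that the output carries exactly the same pixel values as the input; the worst case is that it looks like the 2K original, not worse.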
You don't go backwards in quality with UHD 4K.
And that's just nearest neighbor. If they employ edge-detection algorithms, the up-convert can improve things dramatically.
However, if the display were something less amenable to scaling (say DCI "4K", which was/is 4096x2160, or grosser stuff like 4096x3112 and others), then you can get immediate artifacts, because the scale factor from 1920x1080 is no longer a clean integer. But those are not UHD, the consumer television format. Nearest neighbor (1 --> 2x2) is as clean as you can get. Double-check your sources.
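Here's a quick sketch of why the integer factor matters, using one common nearest-neighbor mapping (the helper below is just illustrative arithmetic, not any real scaler's code):

```python
import numpy as np

def duplication_counts(src_width: int, dst_width: int) -> np.ndarray:
    """For a simple nearest-neighbor scale, count how many output columns
    map back to each source column. A single uniform count means a clean
    scale; mixed counts mean some pixels get fattened more than their
    neighbors, which is where the visible artifacts come from."""
    dst_cols = np.arange(dst_width)
    src_cols = (dst_cols * src_width) // dst_width  # nearest-neighbor mapping
    return np.bincount(src_cols, minlength=src_width)

# UHD (3840 wide) from 1080p (1920 wide): every source column used exactly twice.
print(set(duplication_counts(1920, 3840)))   # {2}

# DCI 4K (4096 wide) from 1920: 4096/1920 ~= 2.13, so some columns get
# doubled and some tripled -- uneven widths, hence artifacts.
print(set(duplication_counts(1920, 4096)))   # {2, 3}
```

Same idea vertically. UHD's exact 2x factor is what makes the consumer case the clean one.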