Originally Posted by Pete
Ok...let me be more specific...having viewed a 2K source on a number of premium single-chip 2K DLP projectors -- both conventional lamp-driven and LED -- and having viewed the same source material upconverted on a Sony 4K projector, the image on the upconverted Sony is softer and seems more prone to artifacts. Same with the JVC faux K. That's all.
These may well be your observations, and I can accept that this is what your eyes were seeing. However, from a purely scientific point of view (this is AVScience, after all), given two otherwise identical projectors, one 2K and one 4K, the 4K one should produce a clearly better and sharper image with 1080p/2K content. Fill rate plays a certain role, though. Let me show you with images:
On the left side you see a 2K video frame displayed on a 2K projector with 1:1 pixel mapping, viewed close enough to the screen that the individual pixels are visible. Each square on the left side of the image above is one pixel of the original 2K image. Now, a 4K projector can simulate a 2K projector simply by using four pixels (a 2x2 square) to render each 2K content pixel. This is what you see on the right side of the image above: a 4K projector simulating a 2K projector by simple pixel replication (this method is called "nearest neighbor interpolation"). The result is not quite identical, though, because the fill rate is usually not perfect. (Fill rate is the fraction of each pixel's area that actually emits light; the remainder forms a dark border around the pixel.) So the fill rate makes a 2K projector look slightly different from a 4K projector simulating one.
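To make the pixel-replication step concrete, here's a minimal sketch in Python/NumPy. This is just my illustration of the math, not anything a projector actually runs, and the frame dimensions assume DCI 2K/4K:

```python
import numpy as np

# A toy 2K (DCI: 2048x1080) frame: height x width x RGB.
# Random values stand in for real content.
frame_2k = np.random.randint(0, 256, size=(1080, 2048, 3), dtype=np.uint8)

# Nearest-neighbor "simulation" of a 2K panel on a 4K panel:
# replicate every source pixel into a 2x2 block.
frame_4k = np.repeat(np.repeat(frame_2k, 2, axis=0), 2, axis=1)

# The result has exactly 4K (DCI: 4096x2160) dimensions.
assert frame_4k.shape == (2160, 4096, 3)
```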
Now let's suppose both the 2K and the 4K projector have a 100% fill rate, i.e. no black/gray borders around the individual pixels. The same image as above then looks like this:
This means: all else being equal, and with a 100% fill rate, a 4K projector can *perfectly* simulate a 2K projector using nearest neighbor interpolation. In real life there's always a small difference because the fill rate is never exactly 100%, but I don't think that difference is big; other factors play a bigger role. For example, if you compare a 2K DLP projector to a 4K SXRD projector, the two differ in so many properties (on/off contrast is better on the SXRD, ANSI contrast is better on the DLP, etc.) that you're comparing much more than just the resolution difference.
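Here's a toy sketch of what a less-than-100% fill rate does: each source pixel becomes a square of which only part is lit, the rest being the dark inter-pixel gap. The block/lit geometry is purely illustrative; real panel apertures are shaped differently:

```python
import numpy as np

def render_with_fill_rate(frame, block=8, lit=7):
    """Blow each source pixel up to a block x block square, lighting
    only the top-left lit x lit area; the rest stays black, like the
    inter-pixel gap. Fill rate here is (lit/block)**2, ~77%."""
    h, w, c = frame.shape
    out = np.zeros((h * block, w * block, c), dtype=frame.dtype)
    for dy in range(lit):
        for dx in range(lit):
            out[dy::block, dx::block] = frame
    return out

# lit == block gives the idealized 100% fill rate case, where a 4K
# panel's 2x2 replication of a 2K frame becomes indistinguishable
# from a native 2K panel.
```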
Anyway, let's get back to comparing the same theoretical projector in 2K and 4K versions, identical in every property except resolution, and let's again assume a 100% fill rate. In this situation the 4K projector could simulate the 2K projector perfectly, as shown above. But is that actually the best way for the 4K projector to display 2K content? No. Here's why:
2K castle image displayed in 2K (or in 4K using nearest neighbor interpolation).
2K castle image upscaled to 4K and sharpened with high-quality algorithms.
So a 4K projector displaying 2K content with high-quality upscaling and sharpening algorithms should produce better results than a 2K projector. BUT this only applies if we compare two projectors that are equal in all properties except resolution, *and* if high-quality algorithms are used.
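As a rough illustration of that difference, here's a sketch using Pillow: Lanczos resampling plus a mild unsharp mask, compared against plain pixel replication. The file names and filter parameters are hypothetical stand-ins; a projector's video processor uses its own (usually proprietary) scaler:

```python
from PIL import Image, ImageFilter

# Hypothetical file name; any 2K source frame will do.
src = Image.open("castle_2k.png")

# High-quality resampling (Lanczos) to double the resolution...
upscaled = src.resize((src.width * 2, src.height * 2),
                      Image.Resampling.LANCZOS)

# ...followed by mild unsharp masking, standing in for the
# projector's sharpening stage (parameters chosen arbitrarily).
sharpened = upscaled.filter(
    ImageFilter.UnsharpMask(radius=2, percent=100, threshold=2))

# For comparison: plain 2x2 pixel replication (nearest neighbor),
# i.e. the "simulated 2K projector" case from above.
replicated = src.resize((src.width * 2, src.height * 2),
                        Image.Resampling.NEAREST)

sharpened.save("castle_4k_lanczos_sharp.png")
replicated.save("castle_4k_nearest.png")
```

Viewing the two output images side by side should reproduce the kind of difference shown in the castle comparison above.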