View Poll Results: At one theoretical screen width, where do you see the difference?
- I didn't follow the instructions. The right image is definitely the clearest. (7 votes, 21.21%)
- The right image is noticeably clearer. (15 votes, 45.45%)
- The middle image and right image look the same. (9 votes, 27.27%)
- All three images look the same. (2 votes, 6.06%)
Voters: 33
The image below was created using Windows Paint. The original (right, "8K") image was shrunk to 1/2 its normal size. I then used the "zoom" function to bring it back to its original size. (Resize would not work, as Paint upsamples the image; zooming instead effectively turns each remaining pixel into a 2x2 block.) I then used the "PrtSc" (Print Screen) button to capture the image, pasted it into a new file, and cropped it to the original size. This creates the pixelated effect you would see viewing the picture on a display with half the resolution of your actual display, and produced the middle ("4K") image. The same process was done for the left ("2K") image, except it was shrunk/zoomed to 1/4 size.
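The shrink/zoom trick can also be sketched in code. Below is a minimal pure-Python sketch (no image library assumed, grayscale values only): averaging each block stands in for the shrink, and repeating the averaged value across the block stands in for Paint's pixel-doubling zoom.

```python
def simulate_low_res(pixels, factor):
    """Average each factor x factor block of a 2-D grayscale image (the
    shrink), then repeat the averaged value across the whole block (the
    nearest-neighbor zoom), mimicking the Paint shrink/zoom trick."""
    h, w = len(pixels), len(pixels[0])
    out = [[0] * w for _ in range(h)]
    for by in range(0, h, factor):
        for bx in range(0, w, factor):
            block = [pixels[y][x]
                     for y in range(by, by + factor)
                     for x in range(bx, bx + factor)]
            avg = sum(block) // len(block)
            for y in range(by, by + factor):
                for x in range(bx, bx + factor):
                    out[y][x] = avg
    return out

# A tiny 4x4 stand-in "image": simulating a half-resolution display
# collapses it into 2x2 blocks that each share one value.
img = [[0, 10, 20, 30],
       [40, 50, 60, 70],
       [80, 90, 100, 110],
       [120, 130, 140, 150]]
sim = simulate_low_res(img, 2)
# e.g. the top-left block [0, 10, 40, 50] all becomes its average, 25.
```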
IMPORTANT: Right-Click the picture and select "Open image in new tab" to see the full size (1198x368) image.
First, some instructions are in order. You can't simply view this image normally and assume you're seeing the difference 8K makes. You have to simulate an 8K display.
To see what an 8K display would look like at one screen width:
1. First, find the resolution your screen is set to. Right-click your desktop and select "Screen Resolution". For best results, ensure your computer is set to the native resolution of your display. My laptop is 1366x768.
2. Divide 1920 (as in 1920x1080) by your monitor's horizontal resolution. For me, 1920 / 1366 ≈ 1.4.
3. Measure the width of your display. Mine is about 14".
4. Multiply your display width by the result from step 2. I get 1.4 x 14 = 19.6. This is how wide your monitor would be if it were a 1920x1080 monitor, and it is also the viewing distance at which you would view that theoretical 1920x1080 monitor at 1 screen width.
5. Multiply the result of step 4 by 4. I get 19.6 x 4 = 78.4. This is how far away you have to view your display to simulate it being an 8K display.
(Note: If the above doesn't make sense, think about it. If you are looking at a 1920x1080 monitor, an 8K monitor of the same size would have pixels 1/4 the size of your current monitor's. So to simulate pixels 1/4 the size of your current monitor's, you need to view your monitor from 4 times your normal viewing distance.)
6. Now, right-click the picture above and open it in a new tab to ensure it is being displayed at full size (1198x368).
7. View the picture from the distance you calculated in step 5, and report your observations.
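The steps above boil down to one line of arithmetic; here is a quick sketch using the poster's own numbers (1366-pixel-wide laptop screen, 14 inches wide):

```python
def simulated_8k_distance(native_horizontal_px, screen_width_inches):
    """Distance (inches) from which your screen simulates an 8K display
    viewed at one screen width."""
    # Step 2: how much wider a 1920-pixel-wide screen would be than yours.
    scale = 1920 / native_horizontal_px
    # Step 4: equivalent 1080p screen width == 1-screen-width viewing distance.
    width_1080p = scale * screen_width_inches
    # Step 5: 8K pixels are 1/4 the linear size, so view from 4x the distance.
    return width_1080p * 4

print(round(simulated_8k_distance(1366, 14), 1))  # prints 78.7
# (The poster's 78.4 comes from rounding 1920/1366 down to 1.4 first.)
```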
At one theoretical screen width, I can see a difference between the left (2K) and middle (4K) image. So I do see the benefit there, and will consider a 4K projector when they come down below $5000 in probably 5 years.
However, I don't see a difference between the 4K and 8K image. At .25 widths, yes, absolutely. At .5 widths, kinda. At .75 widths, barely maybe. At one width, I highly doubt I could correctly pick which one is the 8K image with 100% accuracy if the images were randomly swapped.
At 1.5 screen widths, no way.
And so, for my situation where I view a 100" (front projected) screen from 1 screen width, there is a plausible benefit to going 4K. But not in going 8K.
The room is 14′ long, and the TV is wall mounted. With my back against the wall and the 3″ of the wall mount, it is exactly 160″ viewing distance.
My normal viewing distance is around 1W, where the difference between the three is very obvious.
At 80″, which is 2W, the difference is just as clear as at one screen width.
At 120″, which is 3W, the difference is still clear to me, though I'd say this is the point where 4K and 8K start to look close.
At 160″, which is 4W, there is a difference if you're looking for it, but it's not as obvious. You probably couldn't tell if the image were in motion. There's clearly more detail in the hairs and the centre of the flower, and the antennae seem smoother. Overall, the image also has the appearance of slightly higher contrast than the others.
If I turn on the "detail enhancer" on my TV and set it to low (it sharpens the image without ringing), the difference is clear again at 160″.
I wear glasses, and have a 1.25 and 1.00 correction on my eyes, so relatively minor. (as well as correcting for astigmatism)
I would vote, but anything like that has been completely broken on the site for me since the upgrade.
OK, I stopped seeing a difference going to 8K at ~0.75 simulated screen widths (although I tested on a truly blurry CRT monitor, mind you).
BUT... there are situations with pixel-sharp high contrast, e.g. badly rendered player menus, poor text, games without anti-aliasing, OS elements (rarer in film/video), and there you have a clear gain over 4K at 1 simulated screen width.
Test these contrasty stripes: it's easier to make out the 8K pixels there at 1 simulated screen width (the limit is somewhere around 1.25-1.75?).
(Open in a new tab, and click, for the correct full size.)
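The stripes are the worst case for the reason the shrink/zoom trick makes obvious: a lower-resolution display effectively averages neighboring pixels, and averaging full-contrast 1-pixel stripes collapses them to uniform gray. A tiny sketch, assuming a black/white stripe pattern like the attached image:

```python
# High-contrast 1-pixel stripes: alternating white (255) and black (0).
stripes = [255 if x % 2 == 0 else 0 for x in range(8)]

# Simulate a half-resolution display: average each pair, then repeat it
# (the same shrink/zoom trick, in one dimension).
half_res = []
for i in range(0, len(stripes), 2):
    avg = (stripes[i] + stripes[i + 1]) // 2
    half_res += [avg, avg]

print(stripes)   # [255, 0, 255, 0, 255, 0, 255, 0]
print(half_res)  # [127, 127, 127, 127, 127, 127, 127, 127]: detail gone
```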
Anyway, the middle and right images look very close. The left one is obviously far worse. I can see subtle differences between the middle and right images. For example, in the middle image the left antenna has a kind of kink in it, while in the right image it is straight.
So my initial thoughts going in were the same as they are now. 4K is definitely going to be the way to go, and sticking with 2K / HD as 'good enough' isn't going to cut it. I'll upgrade to 4K Blu-ray (or similar) when it becomes available. If 4K content only comes via download, I won't bother as I prefer to own the content rather than rely on the long term availability of content on a remote server. If that were the case, I'd miss out on a better experience but would have my reasons. Now eventual 8K...I highly doubt I would upgrade to that, unless content was only available in that form. There's very little point in upgrading for home theater use when your eyes won't likely be able to tell the difference in 'real life' usage (unless you're a front few rows at the cinema type).
Add motion to this and you'll have different effects. If motion blur were not an issue, as an object moves through a pixel in various fractional stages, the additional shades over the time domain form an anti-aliasing which forms a new edge at a perceived higher resolution---This would move the 2K appearance closer to the 4K/8K, but I believe it'll still be apparent. With motion blur, I'm just not sure, but I'm still confident that we'll see a 2K vs 4K difference.
Also, I'm not sure of the effect here, but we probably should test this with colors at 4:2:0 chroma subsampling, not 4:4:4. You'd have to be careful with how you generate the bees.
For each Bee picture:
- Start with 4:4:4 picture of a bee.
- Reduce it in size (to 1/2 or 1/4 of each dimension, depending on which bee).
- Only then apply 4:2:0 chroma subsampling at that reduced resolution.
- Then nearest-neighbor enlarge it.
Do that for the "2K" bee and "4K" bee, then glue them together.
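A minimal sketch of just the 4:2:0 step in the pipeline above, on a single chroma plane (Cb or Cr) represented as plain integer lists; the RGB-to-YCbCr conversion and the luma plane, which keeps full resolution, are omitted:

```python
def subsample_420(plane):
    """4:2:0 chroma subsampling of one chroma plane: average each 2x2
    block, then repeat the average across the block. Luma (Y) would be
    left untouched at full resolution."""
    h, w = len(plane), len(plane[0])
    out = [[0] * w for _ in range(h)]
    for by in range(0, h, 2):
        for bx in range(0, w, 2):
            avg = (plane[by][bx] + plane[by][bx + 1]
                   + plane[by + 1][bx] + plane[by + 1][bx + 1]) // 4
            for y in (by, by + 1):
                for x in (bx, bx + 1):
                    out[y][x] = avg
    return out

# One 2x2 chroma block collapses to its average (here 130) in all four cells.
cb = [[100, 120], [140, 160]]
quarter_res_chroma = subsample_420(cb)
```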
User interface engineers: always try to think in terms of providing undo rather than a confirmation dialog on every single question! Good grief! We're STILL doing this junior-level newbie nonsense in 2015?????
I'm viewing on a 17 inch laptop monitor, and I can tell the right image is slightly clearer even if I'm all the way against the opposing wall. The difference between the left and middle images is still extremely apparent.
8K is quite noticeable for me. I'd say in real viewing settings, 65 inches would be the point where 8K can be appreciated over 4K. Of course there are always diminishing returns. As far as this maximum-human-perception crap is concerned, televisions aren't even close to achieving "life-like" resolution, especially large screens. As long as the industry makes sure each next step up is exponentially greater, there will be people buying 8K, 32K and beyond, until we get displays with 100% field-of-view coverage at 576 megapixels.