In my opinion, speculating about whether effectively quadrupling the amount of visual information our standard display technology outputs is 'useless' or not is quite silly. This increase in pixel density is not only inevitable, it will become standard faster than previous resolution jumps did. To argue against the development (and, consequently, the adoption) of 2160p technology is to severely lack foresight and/or to have ignored the past evolution of display resolutions. The simple fact of the matter is that there are a huge number of use cases in which display technology becomes far more useful and detailed thanks to the 4x (and later, 16x with 4320p) increase in visual information. I know there are a lot of HT enthusiasts here, but as we go into the future, display devices will be used for increasingly more than simply playing back pre-recorded (or pre-rendered, pre-animated, etc.) video content, to the point where 'dumb' display devices without an entire SoC inside running a fairly modern and capable operating system will eventually be niche products. 2160p film and TV content will in time be produced and then become the standard, but the lack of said content right now in no way makes me turn up my nose at the idea of 2160p, *especially* as I am MORE of a heavy user of computers for a multitude of tasks than I am a film/TV consumer.
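For anyone who wants to sanity-check those 4x/16x figures, the pixel arithmetic is trivial; here's a quick Python sketch of it, purely illustrative:

```python
# Pixel counts per resolution tier, relative to 1080p
resolutions = {
    "1080p": (1920, 1080),
    "2160p": (3840, 2160),
    "4320p": (7680, 4320),
}

base = 1920 * 1080  # 2,073,600 pixels
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>10,} pixels ({pixels // base}x 1080p)")

# 1080p:  2,073,600 pixels (1x 1080p)
# 2160p:  8,294,400 pixels (4x 1080p)
# 4320p: 33,177,600 pixels (16x 1080p)
```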
I currently use a 2560x1440 panel that I sit ~55cm (~2 feet) from, and I have a 1080p projector with a 100"+ screen hooked up to the same computer. Before this, I had a 1920x1200 panel and a 720p projector. In both cases, upgrading the resolution was very noticeable: it gave me much more room to display visual information on screen and improved my workflow in multimedia editing applications. There is every reason to expect that jumping to 2160p will benefit those of us who attach display devices to our computers, be they PCs, game consoles, or tablets/smartphones that can wirelessly stream 1080p (and soon, 2160p) via Miracast.
For those who think of their displays as simple playback devices for centralised broadcast video or polycarbonate discs, then yes, maybe staying with 1080p makes sense for the foreseeable future. However, even if I had a 2160p display right now, I still couldn't display or edit the photos produced by my DSLR at 100% view without scrolling off the screen! The same goes for any of the fullscreen applications where I can select the output resolution, from video games to rendering software, or being able to edit 1080p video without the project preview being downscaled at all, or tiling multiple windows next to each other to improve my workflow... I could go on, but there are many reasons why more visual information than 1080p can be useful. Not to mention, I am absolutely sure that any informed projector owners here with decent screen sizes know that getting 4 times the visual information on their walls is going to be incredibly sexy. On a related note, video mappers will have much more creative license in their craft if the total available pixel grid to be projected is increased by *any* amount beyond the current 1080p standard.
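To put a rough number on the DSLR point (the 6000x4000 frame below is just a stand-in for a typical ~24MP sensor, not necessarily my camera), the same kind of back-of-the-envelope check works:

```python
# Can a full-resolution photo be shown at 100% (1:1) on a given display?
# The 6000x4000 frame is a hypothetical ~24MP DSLR image; substitute your own.
photo_w, photo_h = 6000, 4000

displays = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
    "4320p": (7680, 4320),
}

for name, (dw, dh) in displays.items():
    fits = photo_w <= dw and photo_h <= dh
    visible = min(100.0, (dw * dh) / (photo_w * photo_h) * 100)
    print(f"{name}: fits at 100%? {fits}  (~{visible:.0f}% of the frame visible)")
```

Even a 2160p panel only shows roughly a third of such a frame at 1:1, which is exactly why I don't see the extra pixels as wasted.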
In ten years, when most members of this forum have ≥2160p devices in our pockets that can wirelessly beam their full resolution to 2160p displays natively (and upscaled to 4320p displays), and when those displays will have ceased to be simple video output devices and become IP-connected computers in their own right, the idea of clinging to 1080p will seem downright silly.