All this talk of 4k rendering makes me realize that there are some pretty obvious and elegant ways to handle the 1080p to 4k transition. It wouldn't be terribly difficult for developers to take advantage of how poor human vision really is and play with some rendering tricks.
Humans are far more sensitive to vertical resolution than to horizontal. As a quick and dirty example, one could render to a 4:3 2880x2160 image before stretching it to 16:9 3840x2160 for display. That alone drops you from an 8.3MP to a 6.2MP rendering requirement, or only 3x 1080p. Layer a full-res 4k HUD on top of that, and you'd be hard-pressed to find people who could tell the difference in a proper double-blind test.
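Just to sanity-check that arithmetic, here's a quick Python sketch (the 2880x2160 intermediate resolution is simply 4:3 at 2160 lines, as assumed above):

```python
# Anamorphic trick: render at 4:3 (2880x2160), stretch horizontally
# to 16:9 (3840x2160) for display.

native = 3840 * 2160          # 8,294,400 px (~8.3MP)
stretched = 2880 * 2160       # 6,220,800 px (~6.2MP)
p1080 = 1920 * 1080           # 2,073,600 px

print(f"native 4k:   {native / 1e6:.1f}MP")
print(f"4:3 render:  {stretched / 1e6:.1f}MP ({stretched / p1080:.1f}x 1080p)")
print(f"savings:     {1 - stretched / native:.0%}")
```

So the horizontal squeeze alone shaves 25% off the pixel budget.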
As a secondary step, one could take advantage of the fact that human vision is much sharper at the center of the view than at the edges. Again, a quick and dirty example: render the center 50% of the image at full 4k detail, the next 25% band at 1620p-equivalent detail (75% scale), and the outer 25% at 1080p-equivalent detail (50% scale). The rendering requirement drops from 8.3MP to about 5.8MP. Upon close inspection of individual pixels one could find the differences, but in actual moment-to-moment play, especially in first-person titles of any sort, it would be hard to say that it's any less sharp.
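Same sanity check for the banded approach. This assumes each band covers a fraction of the 4k frame's area and is rendered with both dimensions scaled, so the per-band cost is area fraction times scale squared; the exact band geometry doesn't change the total under that assumption:

```python
# Banded (foveated-style) rendering budget for a 3840x2160 frame.

frame = 3840 * 2160  # 8,294,400 px

bands = [
    (0.50, 1.00),  # center 50% of the frame at full 4k detail
    (0.25, 0.75),  # next 25% at 1620p-equivalent detail
    (0.25, 0.50),  # outer 25% at 1080p-equivalent detail
]

# Rendered pixels per band = area fraction * frame * scale^2
total = sum(frac * frame * scale ** 2 for frac, scale in bands)
print(f"banded render: {total / 1e6:.1f}MP vs {frame / 1e6:.1f}MP native")
```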
Combine these two fairly simple tricks and you are down to roughly 4.4MP. That is a little over twice the requirement for 1080p, and just over half the requirement for native 4k. While you would only be rendering about twice as many pixels as at 1080p, the perception would probably be that it's 50% sharper than 1080p, and probably only 10% less sharp than native 4k. Again, human vision is pretty terrible. Native 4k is not perceived as 4x as sharp as 1080p even though there are 4x the pixels in play. At most it feels twice as sharp as 1080p, and probably not even that to even the most picky among us.
As a reference point, two years ago I was rendering 4.6MP across my three 1080p monitors at 5100x900 on what is now a $150 card. A rendering load in that same ballpark for really nice-looking 4k would be easy to accomplish on commodity cards today with a bit of elegance on either Nvidia's part in the drivers or the individual devs' part in their software.
Edited by darklordjames - 6/6/13 at 2:42pm