Originally Posted by confidenceman
Seriously. If this happens, I hope it has more to do with non-gaming media.
But I think I see a big reason why Sony might want to do this (other than to sell a new display ecosystem to people). Wouldn't this give them a way to outstrip the bandwidth limits of OTA and cable broadcast? The hurdle for physical media and high-end display tech is that content of very similar quality is already available through cable subscriptions and streaming. But if they can push 4K successfully, it might differentiate physical media from digital streaming/broadcast.
I share your worry, T2, that game publishers might use raw resolution to sell games. It makes for pretty trailers and kiosk demos, but ****** long-term performance. Give me 1080p/60fps any day over 4K at sub-par framerates (20-30fps). Unfortunately, I think most developers and publishers (and, frankly, gamers too) consider ~30fps "good enough." The prevailing wisdom among developers seems to be that console games should target 30fps and PC versions should be optimized for 60fps. The pessimist in me doesn't see that paradigm changing anytime soon.
It'd be worse than that. Try 15-30 FPS: 4K is about 4x the pixels of 1080p.
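Rough back-of-the-envelope math, for anyone who wants to sanity-check that. A Python sketch; it assumes framerate scales inversely with pixel count (i.e., a purely fill-rate/shading-bound renderer, which real games only approximate):

Code:
# Pixel math behind the ~15 FPS estimate. Assumption: the renderer
# is fill-rate/shading-bound, so FPS scales inversely with pixel
# count. Real engines also pay vertex, AI, and physics costs that
# don't scale with resolution, so treat this as a rough bound.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
}

base_w, base_h = resolutions["1080p"]
base_fps = 60  # hypothetical 1080p target

for name, (w, h) in resolutions.items():
    ratio = (w * h) / (base_w * base_h)
    print(f"{name}: {w * h:,} px ({ratio:.2f}x 1080p) "
          f"-> ~{base_fps / ratio:.0f} FPS")

# 4K is 8,294,400 px, 4.00x 1080p: a 60 FPS game drops to ~15 FPS.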
If you're very, very lucky you might get late-generation PSN titles running at it. Maybe. And really, unless there's a dedicated hardware scaler in the box, the cost of upscaling lower-res games wouldn't be marginal; it would hit performance. (Think of how COD renders natively at a lower resolution and is upscaled to 720p.)
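For anyone curious why a software upres isn't free: a bilinear scaler has to touch every output pixel, so the work scales with the target resolution no matter how low the native buffer is. A minimal illustrative sketch in plain Python (grayscale only; real consoles do this in fixed-function hardware or a post-process shader):

Code:
# Naive bilinear upscale. The work is proportional to the
# *destination* resolution: scaling anything to 4K means ~8.3M
# interpolations per frame, ~500M per second at 60 FPS, which is
# why you want a dedicated hardware scaler instead of burning GPU
# time on it.
def bilinear_upscale(src, src_w, src_h, dst_w, dst_h):
    """src: flat row-major list of grayscale values."""
    dst = [0.0] * (dst_w * dst_h)
    for y in range(dst_h):
        fy = y * (src_h - 1) / (dst_h - 1)
        y0 = int(fy)
        y1 = min(y0 + 1, src_h - 1)
        wy = fy - y0
        for x in range(dst_w):
            fx = x * (src_w - 1) / (dst_w - 1)
            x0 = int(fx)
            x1 = min(x0 + 1, src_w - 1)
            wx = fx - x0
            top = src[y0 * src_w + x0] * (1 - wx) + src[y0 * src_w + x1] * wx
            bot = src[y1 * src_w + x0] * (1 - wx) + src[y1 * src_w + x1] * wx
            dst[y * dst_w + x] = top * (1 - wy) + bot * wy
    return dst

# The COD case: a 1024x600 native buffer scaled to 1280x720.
# frame = bilinear_upscale(native, 1024, 600, 1280, 720)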
I want 1080p @ 60 FPS with better shaders and 4x AA, plus better AI, better physics, and more memory for assets. You know, stuff that actually improves visuals and gameplay.
At this point, resolution offers diminishing returns for steeply rising rendering costs (in video games, and somewhat in other media). We already sat through a generation of sub-720p games with no AA and poor framerates. Never again, please.
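On the diminishing-returns point, the usual AVS viewing-distance math applies to consoles too. This uses the common 1-arcminute acuity rule of thumb, and the 55" panel is just an example size:

Code:
# Distance past which individual pixels can no longer be resolved,
# using the rule of thumb that 20/20 vision resolves ~1 arcminute.
import math

ARCMIN = math.radians(1 / 60)  # one arcminute, in radians

def max_useful_distance_in(diag_in, horiz_px, aspect=16 / 9):
    """Farthest distance (inches) at which one pixel still
    subtends a full arcminute on a panel of the given diagonal."""
    width_in = diag_in * aspect / math.hypot(aspect, 1)
    pixel_in = width_in / horiz_px
    return pixel_in / math.tan(ARCMIN)

for name, px in [("1080p", 1920), ("4K", 3840)]:
    ft = max_useful_distance_in(55, px) / 12
    print(f'55" {name}: pixels stop being resolvable past ~{ft:.1f} ft')

# ~7.2 ft for 1080p, ~3.6 ft for 4K: sit much past 7 ft from a 55"
# set and 4K buys you little over 1080p.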