Some more observations:
I was back at the local high-end AV store watching the Sony 4K display again yesterday. There are many brands of flat panels playing content all around the store, often in dedicated rooms - Panasonic plasmas, Sharp and Sony LCDs, etc. - playing HD feeds or Avatar or the like. (FWIW, I have both an old Panny ED plasma at home that I don't really watch much anymore, and last year's model Samsung 55" 1080p LCD, which is quite good.) While last time the Sony 4K was playing the Avatar Blu-ray, this time it was playing Sony's native 4K demo material off a hard drive - the same content I saw playing at the Sony store last month: images of museums, Spanish beaches and beachfront homes, cities, sports, etc. Once again it looked great, but it wasn't an immediately obvious difference from 1080p. As I moved closer it held detail better, of course, and looked pretty spectacular. But it didn't have an "absolute" lifelike clarity - I'm not sure if that's down to the panel settings or whether it really takes 8K to get there. Still, it was impressively clear and very finely detailed. I sat for quite a while watching this display. And here, for me, was the interesting thing...
...once I got used to watching the 4K content, I went back to the 1080p displays and they seemed more like SD displays! I'd become accustomed to seeing detail resolve so finely, no matter where on the screen I looked or how far into the distance, that 1080p now looked less focused and obviously less resolved - like moving from 1080p to SD! I guess that makes some sense after all. Engadget reported the same effect among its reporters after taking in so many 4K images at CES: 1080p now looked more like SD to them. And I remember it being somewhat similar when HD first came out. There wasn't always an obvious "wow" difference between HD and really good SD, but after becoming accustomed to HD images on a display, going back to SD looked obviously more blurry.
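Incidentally, that intuition has a rough numeric basis. Here's a quick back-of-envelope sketch (assuming 3840x2160 for consumer 4K, 1920x1080 for HD, and 720x480 NTSC for SD - pick your own SD standard):

```python
# Rough numeric check on the "1080p looks like SD after 4K" impression.
# Assumed resolutions: consumer UHD/4K, 1080p HD, and NTSC SD (480 lines).
RESOLUTIONS = {
    "4K":    (3840, 2160),
    "1080p": (1920, 1080),
    "SD":    (720, 480),
}

def linear_detail_ratio(hi, lo):
    """Ratio of vertical pixel counts -- a proxy for linear (not areal) detail."""
    return RESOLUTIONS[hi][1] / RESOLUTIONS[lo][1]

print(f"4K vs 1080p: {linear_detail_ratio('4K', '1080p'):.2f}x linear detail")
print(f"1080p vs SD: {linear_detail_ratio('1080p', 'SD'):.2f}x linear detail")
# Prints 2.00x and 2.25x: the step down from 4K to 1080p is about the same
# linear factor as the step down from 1080p to SD.
```

So once your eye recalibrates to 4K, 1080p really is roughly as far below it as SD is below 1080p.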
Also, I think I became a bit more aware of pixelation on the 1080p displays. Though I STILL don't think the big selling point of 4K is (or ought to be) "reducing pixelation." That's nice to have, but as a selling point it falls flat, because most people don't notice pixelation to begin with. "Buy this expensive 4K display and no longer will you have to put up with visible pixels in your image!" Average buyer: "Huh? What pixels in my image? I'm not bothered by pixels in my image." But more realistic image detail? If 4K can deliver that, along with the big image size, that's more compelling.
One last observation: for some reason the Sony 4K still has yet to truly blow me away, as in "Wow, THIS is INCREDIBLE, I've got to have this next-generation display technology!" At home afterward I set up my JVC RS55 projector to project about a 95" 16:9 image and put on the Avatar Blu-ray and others. It still blew my mind - the sense of detail, the seeing-into-a-world clarity. Just in terms of subjective, perceptual impact, it felt in some ways more amazing than what I saw on the Sony 4K display. What I take from this is that I still haven't seen the full potential of 4K content. My projector has been professionally calibrated, and also subtly tweaked with some processing from devices like the Darbee Darblet, so I presume I'm comparing optimized 1080p against a not-fully-optimized Sony 4K set. I therefore presume that when 4K sources become available for true 4K displays (my JVC displays 4K pixels but doesn't take 4K sources, so I'd need a "true" 4K projector), then, once optimized, things will look pretty glorious!
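As an aside, the "how close can you sit" math helps explain both effects - why the 4K set held up as I moved in, and why 1080p at 95" can still look stunning from the couch. Another back-of-envelope sketch (the one-arcminute figure for 20/20 acuity is a common rule of thumb, not gospel, and your seating distance is your own):

```python
import math

# Acuity-limited viewing distance for a 95" 16:9 screen.
# Assumption: 20/20 vision resolves roughly 1 arcminute per pixel.
DIAGONAL_IN = 95.0
ARCMINUTE_RAD = math.radians(1 / 60)

# 16:9 geometry: width = diagonal * 16 / sqrt(16^2 + 9^2)
width_in = DIAGONAL_IN * 16 / math.hypot(16, 9)

def pixel_blend_distance_ft(horizontal_pixels):
    """Distance beyond which individual pixels subtend < 1 arcminute."""
    pixel_in = width_in / horizontal_pixels
    return pixel_in / math.tan(ARCMINUTE_RAD) / 12  # inches -> feet

print(f"1080p pixel structure blends beyond ~{pixel_blend_distance_ft(1920):.1f} ft")
print(f"4K pixel structure blends beyond ~{pixel_blend_distance_ft(3840):.1f} ft")
# ~12.4 ft and ~6.2 ft: inside roughly 12 feet of a 95" image, 4K's extra
# resolution starts to pay off; farther back, much less so.
```

Which squares with what I saw in the store: the 4K advantage really shows up as you move in close.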