Originally Posted by bd2003
There seems to be very basic disagreement about what "3D" really is.
Some (myself included) see 3D as the next obvious step. Human beings have two eyes for a reason. We experience the world in 3D (at least, we're supposed to). You don't need to explain the benefits to me; why wouldn't I want to see a movie the way I see the world? For gaming, even more so. I can see how 3D can be used in a silly way, but nevertheless, I'd rather see in three dimensions than not. It's as integral to my visual system as color.
Some think of 3D as some sort of parlor trick, as if it's some unnatural effect added to the natural goodness of film. I suspect the same people would argue in favor of film grain and 24 frames per second. They're used to seeing film in a certain way, and any such drastic change basically spoils it. They're fine with the experience as is, and it seems like there's a certain level of realism that they don't want. Likewise, I've met people who don't like surround sound because they don't want to be immersed; they don't want the movie to take over their whole sensory system. They want the distinction between movie and reality to be clear.
No one is really arguing against 3D being the next logical step. It is the implementation of 3D that is the problem. When Nintendo released the Virtual Boy, it was unanimously hated and scrapped because the effect was extremely poor and the health implications were more than just mild. Would you have argued to keep it instead?
The fact that people are still getting nauseated shows how "unnatural" the current 3D implementation can be. If our eyes and visual system are naturally tuned for and comfortable with 3D, and 3D is thus the next logical step, then why is it such a pain? The IMPLEMENTATION is at fault. They are still using "parlor tricks" to achieve 3D. Even people who like the effect can get tired after a while. The tech hasn't really advanced in all these years, nor has a superior way emerged to achieve 3D that looks more realistic and/or lessens the strain. The best analogy is CGI effects in movies: the best effects are the ones you DON'T notice, or don't realize are fake. Otherwise, they are really corny. The fact that you can tell it is still fake 3D shows there is a lot of progress to be made before it becomes good enough. The glasses, dimmed picture, and TV size requirements certainly don't add to this idea of realism you are touting so much.
Saying that people who don't like surround sound and are fine with 24fps just don't like "realism" is also rather condescending and ignorant of several facts. With surround sound, when you watch a movie, the image comes from a flat screen in one direction. If a man punches a window, why is the glass shattering BEHIND me? Why not from where the window is, in FRONT of me, where the television is? Immersion and realism require logic as well. Surround sound makes much more sense in video games, especially first-person games, and less sense in movies, where you are usually just a spectator, unless it is a first-person movie with no scene cuts.
As for 24fps, everyone sees and perceives at a different framerate, or is more or less sensitive to certain ones. That's why some people get headaches looking at old-school CRT monitors below a certain refresh rate and others do not. Ever hear jokes about a movie looking cheap because it is buttery smooth, like a Mexican soap opera? How about the fake, cheap look of 120Hz televisions? That's the same reason some people like 24fps and camera blur. You have to respect the fact that not everyone perceives images in the same way, that their realism is not always the same as your realism, and that neither is better than the other. The same goes for grain. And 3D. Some people can't even see 3D, or their brain uses it sparingly or ignores it completely. A perfect 3D image would thus still be completely unreal to them.
Moral of the story: realism is in the eye of the beholder.