Originally Posted by mogbert2
I'm not certain of that, IW. The reason is that the pixels also have a large space between them horizontally. Now, it may be that they had to choose a screen with this kind of separation in order to prevent crosstalk through the FPR. Either way, it makes pictures look like I'm viewing them on a bad CRT rather than like I'm actually looking at a picture. I'm still trying to decide whether that alone is worth giving up on 3D displays for. I can't go active, so for me it's either passive or some sort of parallax barrier (a la the 3DS).
(Edit: To be clear, I'm not disagreeing that the FPR and the grid are related; I'm just saying maybe they chose this kind of screen for the FPR rather than the FPR making a normal screen look like this.)
I touched on it before, but the TV really is better at displaying reds than my last LG 32". So much so that I might consider it a tradeoff and keep the set.
One thing I don't really get is why we need middleware or HDMI 1.4 in order for a game to display 3D. It seems that interlaced 3D would be easy to compute inside the game and output at 1080p 60fps over HDMI 1.3. Why are we trying to output a squashed SBS signal or 1080p 24fps? Some of the solutions just seem bass ackwards. (Edit: looked into this, and the reason is standardization. Rather than program different options for every type of TV, they just output a standard signal and let the TV convert it to whatever it wants. So, you know... nevermind. A rough sketch of what the game-side interleaving would look like is below.)
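Just to illustrate why the game-side approach seemed cheap to me: row-interleaving two eye views for a passive/FPR panel is a single pass over the frame. This is a minimal sketch in Python/NumPy, with hypothetical left/right frame buffers standing in for the game's two camera passes; it's not how any particular driver or middleware actually does it.

```python
import numpy as np

def interleave_for_fpr(left, right):
    """Row-interleave two 1080p eye views for a passive (FPR) display.

    Even rows carry the left-eye image, odd rows the right-eye image,
    matching the alternating polarizer strips on the panel. Each eye
    effectively gets 540 lines of vertical resolution.
    """
    assert left.shape == right.shape
    out = np.empty_like(left)
    out[0::2] = left[0::2]   # even rows: left-eye image
    out[1::2] = right[1::2]  # odd rows: right-eye image
    return out

# Hypothetical 1080p RGB frames from the game's two camera passes:
left = np.zeros((1080, 1920, 3), dtype=np.uint8)
right = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame = interleave_for_fpr(left, right)  # ships over HDMI 1.3 as plain 1080p60
```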
In answer to a previous question, I have an Nvidia GTX 570 (super OC'd out of the box). It outperforms a GTX 580 in some circumstances. It's pretty much as fast as I'd care to go without going SLI (or CrossFire or whatever). However, with Nvidia wanting to charge extra to get their card to output a standard 3D signal to a TV, I may defect to ATI next time. I just see it as a very douche move. I've heard the arguments about how they spend man-hours on it, etc., but in that case they should charge us to read their spec sheets or charge for driver updates. There is a point where a company gives something away because losing it would mean losing customers, and maybe they just haven't reached that point yet (come on, ATI, we need you to poke this sleeping bear!).
Every driver I get from Nvidia has a 3D component it wants to install as well, and previously I haven't installed it (because people were occasionally seeing performance degradation in 2D mode after installing it). What is that component supposed to do? If it isn't outputting a 3D image, why are they asking everyone to install it all the time? Honestly, when I bought the TV, I thought the driver would be all I needed to play my games on it.
I'm new at this, and lately I haven't had much time to play games or sleep, and I'm afraid it's making me cranky. My Google-fu is suffering and I can't find good information on the pros and cons of each. Mostly I get a very one-sided view where one program was apparently handed down from God with the Ten Commandments and the other two are the devil's work. Only by reading a number of these have I been able to, maybe, see some of the possible flaws in each.
Does anyone have a fairly unbiased, in-depth review of each of these programs, complete with how well they work, screenshots of how to use them, etc.? I just don't like buying a pig in a poke. When Nvidia was accused of hiding the troubles its software had, along with the things it couldn't do, they claimed the information was freely available in various posts on their user forums... so these middleware programs are definitely a case of buyer beware. Need more info, please.
Edit 2: ARRRRRGGGGG!!! I finally got 3D working on regular games and stuff through one of the free trials. BAD NEWS!!! This may work for 3D for console games, but if you are planning on using it for computer games, there is a VERY IMPORTANT CAVEAT!
You MUST sit farther than 4 FEET away from the screen, or you will get areas of the screen (for me, starting with the lower right) where both eyes WILL see both frames. I sit about 2 to 2.5 feet away from the screen in order to reach the keyboard and mouse. At that distance, only about 50% of the screen can possibly show 3D images.
Again, this is not a case of "optimal viewing"; it is a case of "possible viewing." Any closer than 4 feet, and areas of the screen WILL fail to display 3D.
I'm sorry, but for me this is the ultimate deal breaker. I will have to attempt to return this screen.
Edit 3: Did the math, and the screen has an effective vertical viewing angle of about 10 degrees (if you are dead center at 48 inches, the top of the screen sits 10 degrees above your eye line and the bottom 10 degrees below). For me to sit 2 feet away from a 32" 16:9 screen, I would need a viewing angle of about 18 degrees (if perfectly positioned; more if I'm off center at all). Seems to me the viewing angle should improve if they can get the FPR film closer to the LCD cells. Either way, I have to return this and come up with another way, or, more likely, just wait until the technology matures. I'm bummed because when it worked, it looked great. I would still recommend this TV for small-room console gaming.
Edit 4: Edited some numbers above because I made a small mistake. The screen is 15.5 inches tall and 28 inches wide. I needed to divide 15.5 by 2 to get the right triangle between my eye line and the top of the screen. When I redid the calculations using 7.75, I found I had been asking for about an 18 degree viewing angle where the screen only supports about 10. The math I used is sketched below. Just saw a press release on the HR274H. A smaller screen could reduce the distance I need to sit from it, but I'd need to see its vertical viewing angle to know how close I could sit.
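For anyone who wants to check the geometry: the half-angle from your eye line (at screen center) to the top of the screen is arctan(half-height / viewing distance). A quick sketch with the numbers from above; the ~10 degree half-angle limit is my measured estimate, not a published spec:

```python
import math

half_height = 15.5 / 2  # inches; half the vertical extent of a 32" 16:9 panel

def half_angle_deg(distance_in):
    """Half-angle from eye line (screen center) up to the top edge of the screen."""
    return math.degrees(math.atan(half_height / distance_in))

print(half_angle_deg(24))  # ~17.9 deg at 2 ft -- beyond what the FPR tolerates
print(half_angle_deg(48))  # ~9.2 deg at 4 ft  -- inside the ~10 deg limit

# Minimum viewing distance for an assumed ~10 degree vertical half-angle limit:
min_dist = half_height / math.tan(math.radians(10))
print(min_dist / 12)  # ~3.7 ft -- lines up with the observed 4-foot cutoff
```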