Originally Posted by orbidia
I posted this rant in the nvidia forum but it seems like it is just as appropriate for the AVS Forum. I also wanted to see if there were any responses from the good folks of AVS Forum who probably know a lot more than I do:
I just wanted to write in this forum to vent a bit.
It seems like all companies involved in the entire 3D launch screwed up.
Nvidia tried their best a couple of years ago with Nvidia 3D Vision. This basically had everything needed to make the best 3D possible at the time. Even though the first 3D monitors didn't support that resolution, 3D Vision supported TRUE 120hz 1920x1080p. That is the resolution that makes the most sense for 3D games. So Nvidia had the "vision" to get the specification down properly. A lot of people debate about how 3D should be done, but Nvidia's solution works better than the current HDMI 1.4a solution. The problem is that 3D Vision needs displays to work, and the bigger the display, the better. And Nvidia didn't get TV manufacturers to properly implement 3D Vision. People bothering with 3D want at least a 32" monitor/TV for 3D. Now, the best we can get for using large TVs with 3D and computers is 3DTV Play, which can only play games at stereo 1280x720 60hz.
The problem really comes down to the competition between Displayport and HDMI. HDMI is a licensed technology which requires fees to use on devices. It is also inferior to Displayport in bandwidth. HDMI is basically the same as DVI. Nvidia could have chosen Displayport to implement 3D Vision, which would have had enough bandwidth even with the Displayport 1.0 spec. Instead, Nvidia chose to use Dual Link DVI because all of their graphics cards had included dual link DVI for a while. Single link DVI does not have enough bandwidth for 120hz 1920x1080. Maybe Nvidia thought HDMI would eventually incorporate a new Dual Link HDMI standard. That would certainly have been nice, but it didn't happen.
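To put some numbers on the bandwidth point, here's my own back-of-the-envelope check. I'm assuming the standard CEA 1080p timing totals (2200x1125 including blanking), 24-bit color, and the 4-lane Displayport 1.0 payload rate after 8b/10b coding - treat it as a sketch, not spec chapter and verse:

```python
# Rough link-bandwidth check for 1920x1080 @ 120hz (my own sketch;
# timing totals assumed to be the standard CEA 1080p 2200x1125 figures).
H_TOTAL, V_TOTAL = 2200, 1125   # active 1920x1080 plus blanking
BPP = 24                        # bits per pixel, 8-bit RGB

def pixel_clock_mhz(refresh_hz):
    """Pixel clock in MHz for full-frame 1080p at the given refresh rate."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

# Link limits: pixel clock ceilings for the TMDS links,
# payload Gbit/s for Displayport.
SINGLE_LINK_DVI_MHZ = 165       # one TMDS link
DUAL_LINK_DVI_MHZ   = 330       # two TMDS links
DP_1_0_GBPS         = 8.64      # 4 lanes after 8b/10b overhead

clk_120 = pixel_clock_mhz(120)
print(f"120hz 1080p needs ~{clk_120:.0f} MHz pixel clock")        # ~297 MHz
print("fits single-link DVI:", clk_120 <= SINGLE_LINK_DVI_MHZ)    # False
print("fits dual-link DVI:  ", clk_120 <= DUAL_LINK_DVI_MHZ)      # True
print("fits Displayport 1.0:", clk_120 * BPP / 1000 <= DP_1_0_GBPS)  # True
```

So by my math, 120hz 1080p blows right past single link DVI, fits inside dual link DVI with room to spare, and would have fit in the very first Displayport spec too - which is exactly the point.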
Instead, Sony, Samsung and the rest of the HDMI gang came up with HDMI 1.4a - the current spec supporting the latest 3D standards. There is a huge corporate push for this, with millions of advertising dollars trying to convince us 3D is the next great thing. But instead of giving us a nice clean 120hz signal for 3D, HDMI 1.4a uses Frame Packing and weird Side by Side etc. garbage to get around other technical limitations. I understand this was mostly for backward compatibility with previous blu-ray players and also for the compression advantages of the similar left/right eye frames. But it's really a messy solution. Instead of taking a step back and doing it right, they fudged the whole thing to try to get something to work that wasn't quite ready. They needed to add more bandwidth to HDMI to make a proper 3D standard, or not bother with 3D at all. The bandwidth of HDMI 1.3 and HDMI 1.4 is the same. They just wanted to get 3D out the door without thinking about the long term consequences.
The main limitation of HDMI 1.4a is that it only requires support for stereo 720p 60hz and stereo 1080p 24hz. HDMI (single link) does not have the bandwidth for stereo 1080p 60. But there is NO EXCUSE that the HDMI board didn't REQUIRE stereo 1080p 30. This was still within the bandwidth restrictions of HDMI and would have allowed an "acceptable" framerate for games of 30 frames per eye at 1080p. It would not have been ideal, but it would have been good enough. Stereo 1080p 24 is obviously unplayable for games. Nvidia should have pushed a lot harder to get stereo 1080p 30 into the HDMI 1.4a specification. Now, for the next 2 years minimum, we are stuck with a max resolution of 3D 720p 60 for playing games on large screen TVs. All the TVs released now will be obsolete when the next Xbox or Playstation 4 comes out in a couple of years, because that's when the HDMI standard will be upgraded to support stereo 1080p 60, which the new game consoles will surely have available. Again, the sad part is the lack of stereo 1080p 30 support in all the current TV models.
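If I have the frame-packing layout right (left and right eye frames stacked vertically with a blanking gap between them, so the vertical total doubles from 1125 to 2250 lines), the required TMDS clocks are easy to work out. Again, this is my own arithmetic, with the usual 1080p horizontal totals assumed - not anything quoted from the spec:

```python
# Sketch of HDMI frame-packing clock requirements (my own arithmetic).
# Assumed layout: both eye frames stacked vertically, vertical total
# doubled from 1125 to 2250 lines; horizontal totals are the usual
# 1080p figures (2750 for 24hz film timings, 2200 otherwise).
def fp_pixel_clock_mhz(h_total, refresh_hz):
    """TMDS pixel clock in MHz for a frame-packed stereo 1080p signal."""
    V_TOTAL_3D = 2 * 1125       # two eyes plus blanking gap
    return h_total * V_TOTAL_3D * refresh_hz / 1e6

print(fp_pixel_clock_mhz(2750, 24))  # stereo 1080p 24 -> 148.5 MHz
print(fp_pixel_clock_mhz(2200, 30))  # stereo 1080p 30 -> 148.5 MHz
print(fp_pixel_clock_mhz(2200, 60))  # stereo 1080p 60 -> 297.0 MHz
```

The punchline: stereo 1080p 30 frame-packed needs the exact same 148.5 MHz clock as the stereo 1080p 24 format they DID require, which is also the clock of plain 2D 1080p 60 that every HDMI 1.3 device already drives. That's why there is no excuse for leaving it out. Stereo 1080p 60 at 297 MHz is the one that's genuinely beyond what typical HDMI silicon of the day handled.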
Another benefit of 3D Vision was the ability to have a nice smooth 120hz 1920x1080 resolution for normal computer use as well. I used to use 120hz back in the old CRT days. It's really nice and a worthwhile upgrade in itself, even without the 3D. None of that is possible with the current HDMI 1.4a implementation. Also, any future 3D Vision monitors that have HDMI 1.4a won't support 120hz over HDMI, because it's not in the standard. To get 3D 1080p or 120hz, one will still need to use dual link DVI or Displayport, which Nvidia barely supports.
That's the other problem with Nvidia. They REFUSE to push Displayport. If Nvidia had started to use Displayport on all their reference graphics cards, it might have caught on a lot faster. It might even have become standard to put a Displayport input on TVs. This was a huge mistake on Nvidia's part. Displayport would have worked perfectly for 120hz 1920x1080p even with the old spec, so they could have used it for 3D Vision. There was also no disadvantage to adding Displayport to all of their graphics cards. First of all, Displayport has no licensing charges for manufacturers. It is much cheaper to add Displayport than HDMI.
DVI and HDMI are basically the same. The proper port configuration of a video card today should be, at a minimum, 1x Displayport, 1x HDMI, and 1x DVI. ATI uses 1x Displayport, 1x HDMI and 2x DVI, but the 2x DVI takes up venting room on the second slot. The second DVI is not necessary on a modern card anyway, because a very cheap HDMI-to-DVI converter or cable turns the 1x HDMI into a second DVI perfectly. All Nvidia must do is supply the converter with their graphics cards. Then the consumer has the choice of how to connect everything: 2x DVI; 1x DVI and 1x HDMI; or 1x Displayport and 1x HDMI or DVI. It's all possible. With Nvidia's current implementation, the consumer has the choice of HDMI or DVI (which is essentially just HDMI), i.e. no choice at all. Of course, it would also be nice to get Eyefinity-style functionality on Nvidia without having to buy two cards. Displayport also makes Eyefinity-style functionality easier, but that's a discussion for another time.
So ATI, Apple and Dell were already way ahead of the curve by supporting Displayport. Maybe if Nvidia had also supported Displayport, it would have tipped the scales enough to either:
1. get a TV manufacturer to use Displayport as one of the TV inputs, or
2. push HDMI members to put more bandwidth/better 3D standards into HDMI 1.4a.
Instead, we have the mess we are in now. Displayport has all the bandwidth necessary for whatever is needed - even 3D at 2560x1600. But no HDTVs and very few monitors use it, because the only things outputting Displayport are ATI cards and a few non-reference Nvidia cards - oh, and some Apple stuff. I was rather surprised that even the latest and greatest reference Nvidia GTX 580 still lacks Displayport. Maybe in 6 months we'll get a non-reference card which supports it. The really pathetic part is that Nvidia already supports Displayport at the chip level and in its drivers. But it totally failed by not making it part of the reference board. It's a small port - it should be standard on all graphics cards. At the very least, having such a high bandwidth port available everywhere puts pressure on HDMI and other standards to compete. Even if Displayport ultimately fails completely, it would have been to Nvidia's advantage to include it, because it would have given Nvidia more bargaining power in the future 3D market.
It seems fairly obvious that all the manufacturers involved are playing a game of "Manufactured Obsolescence." I guess, if I want to play with 3D Vision, I can play on a giant 24" monitor with Dual Link DVI. Or I can pay $2000 now for a brand new "obsolete" 46" TV and use Nvidia 3DTV Play to play games at an amazing 1280x720 resolution. Wow.
I can already hear Sony, Samsung, Sharp and Toshiba in a couple of years: "Everybody needs to upgrade EVERYTHING again because we forgot to include 3D 1080p30 in the HDMI specification!!" Whoops! This was a huge failure by all the manufacturers concerned, and the consumer lost another round.
All of this could have been avoided if Displayport had been adopted. Of course, at any time, any TV manufacturer could put Displayport on their TVs, and that would allow the full resolution we need for 3D and 120hz. Certainly AMD would be happy, because Radeon would get a quick boost in the new 3D market. Nvidia would probably also be happy, because they would have a great excuse to finally add Displayport to a reference design, and then Nvidia fans would need to fork out for a new graphics card. It's all about making money, isn't it? But it's very unlikely Sony or any TV manufacturer will bother with Displayport now. HDMI has too much momentum and marketing money.
The mess could also have been avoided if HDMI 1.4a had included at least stereo 1080p 30, which would have been good enough as a stopgap. It was entirely technically possible without any extra effort. There would still be plenty of people upgrading from their HDTVs to 3D or even OLED etc. in a couple of years, but people buying 3D now at least wouldn't feel totally shafted in the near future. I thought we were over the 720p hump already! If these giant corporate entities could look out for the consumer a bit more, it would actually benefit everyone, because more momentum would be built for 3D and 120hz in general. People wouldn't always feel like they are being tricked.
It certainly would have been really nice if the HDMI gang had adopted the full 120hz 1080p resolution of 3D Vision into HDMI. Of course, HDMI itself would need a bandwidth upgrade. But if there's a giant corporate push to start something big like "3D", then it makes sense to do it CORRECTLY and make SURE the bandwidth is there to implement 3D PROPERLY, with true 120hz. It also should integrate EASILY with computers, because everyone has computers now. All they really had to do was make a new dual link HDMI standard - everything would have slotted into place perfectly! Instead, we get this half-baked, non-120hz, frame-packed standard that needs $40 drivers from Nvidia to use with a computer. Messy standards like that will not be easy to undo once they are embedded in millions of blu-ray players and graphics cards. It would have made a lot more sense to POSTPONE the release of all this 3D stuff until they got the standard right... I guess it's too late for that now. The comical part is that TVs everywhere were already pushing fake 120hz frame interpolation and even fake 240hz frame interpolation. But apparently, the rich corporations couldn't just make an HDMI standard with true 120hz, or use Displayport, which already had it?? Sure - have them roll out a whole new standard that we'll use for a long time that is totally FLAWED, just so they can sell us a slightly less flawed version next Christmas!
I just want to repeat what should have been the proper solution for the new 3D standard:
Either use Displayport 1.2 or use a new Dual Link HDMI standard which is compatible with Dual Link DVI.
At the very least, make stereo 1080p30 a requirement of the HDMI 1.4a standard, for the sake of all the gamers in the world.
I guess all of these companies are competing in a cut-throat corporate world. Oh, the joys of capitalism!
My rant is over. I don't think writing this made me feel any better.