
Support for 3D 1080p @ 60 and 30 frames per second? - Page 2

post #31 of 73
Quote:
Originally Posted by Richard Paul View Post

1080p60 Frame Packing is listed in the "Secondary 3D Video Format Timings" section.

The only ones I see in that section are 1080p/60 half-resolution Side-by-Side or half-resolution Top-and-Bottom formats, not the full-resolution frame-packed buffer formats listed in Table 8-15 and described in the text preceding the table.
post #32 of 73
Quote:
Originally Posted by walford View Post

The only ones I see in that section are 1080p/60 half-resolution Side-by-Side or half-resolution Top-and-Bottom formats, not the full-resolution frame-packed buffer formats listed in Table 8-15 and described in the text preceding the table.

On page 12 of the HDMI 1.4a 3D specification look at the tenth video format listed. The HDMI 1.4 3D specification has a bit more information since in Table H-6 it specifically lists the bandwidth for 1080p60 Frame Packing as 297 MHz. Links to both HDMI 3D specifications can be found in this post.
post #33 of 73
Quote:
Originally Posted by Richard Paul View Post

On page 12 of the HDMI 1.4a 3D specification look at the tenth video format listed. The HDMI 1.4 3D specification has a bit more information since in Table H-6 it specifically lists the bandwidth for 1080p60 Frame Packing as 297 MHz. Links to both HDMI 3D specifications can be found in this post.

When I click the links at hdmi.org it says "The system cannot find the file specified".
post #34 of 73
3D gamers are up in arms about this because we have source devices that are capable of rendering 1080p60 3D, and 3DTVs that are capable of displaying 1080p60 3D.
The only thing standing in the way is sending the signal to the TV (and the TV receiving that signal).
From an engineering standpoint, that should be the easiest step to make, yet it lags behind. It seems suspiciously like they are deliberately holding back the tech via the HDMI stranglehold.
"HDMI chips are too slow" is a pretty lame excuse. If consumers can afford to buy source components that can render 1080p60 3D, and TVs that can display the same, surely consumers can front the cost of faster HDMI chips.

It's a delayed-feature marketing scheme, pure and simple. Feed us small upgrades every few years, rather than selling us the tech that could easily be available now. Frankly, the only force that can oppose this very profitable delayed-feature scheme is consumer education and the subsequent outrage and boycotts.
post #35 of 73
Quote:
Originally Posted by Joe Bloggs View Post

When I click the links at hdmi.org it says "The system cannot find the file specified".

You probably have to start at the following link:

http://www.hdmi.org/manufacturer/specification.aspx

One of the problems in the spec is that it uses "frame packing" to describe both the two half-resolution frames contained in 1080i or 720p buffers for the Side-by-Side and Top-and-Bottom formats, and the full-frame packed-buffer format, which uses a 1920x2205 buffer for 1080p 3D content coming from the new 3D Blu-ray players.
post #36 of 73
Quote:
Originally Posted by Joe Bloggs View Post

When I click the links at hdmi.org it says "The system cannot find the file specified".

Those links used to work but it looks like that has changed.


Quote:
Originally Posted by walford View Post

One of the problems in the spec is that it uses "frame packing" to describe both the two half-resolution frames contained in 1080i or 720p buffers for the Side-by-Side and Top-and-Bottom formats, and the full-frame packed-buffer format, which uses a 1920x2205 buffer for 1080p 3D content coming from the new 3D Blu-ray players.

Where did it say that in the specification? I have read both HDMI 3D specifications and Frame Packing always refers to full resolution per eye 3D video.

For example, with the blanking intervals included, 24-bit color 1080p60 Frame Packing is 2200 (horizontal pixels) x 2250 (vertical pixels) x 60 (frame rate) x 10 (an 8-bit color component converted into a 10-bit TMDS symbol) x 3 (a color component sent over each of the three TMDS data links) = 8.91 Gbit/s, or 297 MHz (since a 10-bit TMDS symbol is sent over each of the three TMDS data links every clock cycle).
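As a sanity check, the arithmetic above can be reproduced in a few lines of Python (the 2200 x 2250 totals are the figures quoted above, blanking included):

```python
# 24-bit 1080p60 Frame Packing bandwidth, using the totals quoted above.
h_total = 2200      # horizontal pixels per line, including blanking
v_total = 2250      # two 1125-line frames packed vertically
frame_rate = 60

tmds_clock = h_total * v_total * frame_rate   # pixel/TMDS clock in Hz
# Each 8-bit color component becomes a 10-bit TMDS symbol, and one
# component is carried on each of the three TMDS data channels.
bit_rate = tmds_clock * 10 * 3

print(tmds_clock / 1e6, "MHz")    # 297.0 MHz
print(bit_rate / 1e9, "Gbit/s")   # 8.91 Gbit/s
```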
post #37 of 73
Line 10 is quoted below:
"1920x1080p @ 59.94/60Hz (Frame Packing, Side-by-Side (Half))"
post #38 of 73
Quote:
Originally Posted by walford View Post

Line 10 is quoted below:
"1920x1080p @ 59.94/60Hz (Frame Packing, Side-by-Side (Half))"

That means that 1080p60 can be done using either Frame Packing or Side-by-Side (Half), and that both are Secondary 3D Video Format Timings, since that line is in the "Secondary 3D Video Format Timings" section. Neither the "Primary 3D Video Format Timings" section nor the "Secondary 3D Video Format Timings" section spells out the actual video format timings, which would be different for 1080p60 Frame Packing and 1080p60 Side-by-Side (Half).

This was easier to see in the HDMI 1.4 3D specification, since it had a longer, more detailed list of video format timings (including the vertical active resolution for both frames with Frame Packing), but you can also see this in the HDMI 1.4a 3D specification. Look at Table 8-15, which has different video timings for 1080p24 Frame Packing (requires 148.5 MHz of bandwidth) and 1080p24 Top-and-Bottom (requires 74.25 MHz of bandwidth). For 1080p24 Frame Packing, the video timing with blanking intervals would be 2750 (horizontal pixels) x 2250 (vertical pixels) x 24 (frame rate) = 148.5 MHz. For 1080p24 Top-and-Bottom, it would be 2750 (horizontal pixels) x 1125 (vertical pixels) x 24 (frame rate) = 74.25 MHz. They have different video timings, but both of them are on the same line in the "Primary 3D Video Format Timings" section.
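The same arithmetic in a short sketch (the 2750-pixel horizontal total is the 1080p24 figure quoted above):

```python
# Pixel clocks for the two 1080p24 3D formats compared above.
h_total = 2750   # horizontal pixels per line for 1080p24, including blanking

fp_clock  = h_total * 2250 * 24   # Frame Packing: two 1125-line frames stacked
tab_clock = h_total * 1125 * 24   # Top-and-Bottom: one ordinary 1125-line frame

print(fp_clock / 1e6, "MHz")    # 148.5 MHz
print(tab_clock / 1e6, "MHz")   # 74.25 MHz
```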
post #39 of 73
It appears I misunderstood the line I quoted.
post #40 of 73
I am confused. The Acer Aspire GD245HQ can handle 1080p 60Hz per eye, so there is no bandwidth problem. DVI is roughly equivalent to HDMI as far as I know, so what is the problem with 3DTVs?!
post #41 of 73
Quote:
Originally Posted by tohdom View Post

I am confused. The Acer Aspire GD245HQ can handle 1080p 60Hz per eye, so there is no bandwidth problem. DVI is roughly equivalent to HDMI as far as I know, so what is the problem with 3DTVs?!

The problem with 3D TVs is that they don't have a dual-link DVI connector. That monitor can't do 60Hz/eye over HDMI.
post #42 of 73
Quote:
Originally Posted by Darin View Post

The problem with 3D TVs is that they don't have a dual-link DVI connector. That monitor can't do 60Hz/eye over HDMI.

Yeah, it's too bad that the 2010 model 3DTVs don't have DisplayPort or dual-link DVI. I guess I wouldn't really expect them to have dual-link DVI, but DisplayPort seems like an input that should be included if they want to sidestep the bandwidth issue on current HDMI chips.
post #43 of 73
I wish they had at least included a mandatory 3D format for 1080p30 frame packing. A 30Hz display rate is good enough for many games, especially if the rendering and physics are done at higher frame rates before being delivered at a rock-solid 30Hz.
The difference between 24Hz and 30Hz is staggering in gaming.

It's quite clear that current HDMI chips can handle the bandwidth of 1080p30 3D, so I wonder why they didn't make it one of the mandatory standards?
post #44 of 73
Quote:
Originally Posted by obveron View Post

I wish they had at least included a mandatory 3D format for 1080p30 frame packing. A 30Hz display rate is good enough for many games, especially if the rendering and physics are done at higher frame rates before being delivered at a rock-solid 30Hz.
The difference between 24Hz and 30Hz is staggering in gaming.

It's quite clear that current HDMI chips can handle the bandwidth of 1080p30 3D, so I wonder why they didn't make it one of the mandatory standards?

Probably because Blu-ray doesn't include 30Hz within its specs, just 24 and 60Hz.
post #45 of 73
The first generation of HDMI chips supported only the ATSC standard digital formats. The second generation added support for 1080p/60, which is not an ATSC format.
The HDMI 1.4 standard was released in the spring of 2009, and chips became available in late 2009. The HDMI 1.4a 3D standard was released in March 2010, and the 3DTV manufacturers implemented it in firmware since they already had 1.4 receiver chips in their systems.
AFAIK, additions to the ATSC standards are being worked on, and they will cause further additions to the HDMI standards to be developed.
post #46 of 73
Quote:
Originally Posted by Lee Stewart View Post

Probably because Blu-ray doesn't include 30Hz within its specs, just 24 and 60Hz.

And 50Hz (720p50, 1920x1080/50i). And 23.976.

HDMI supports 1080p50 too.
post #47 of 73
Quote:
Originally Posted by Lee Stewart View Post

Probably because Blu-ray doesn't include 30Hz within its specs, just 24 and 60Hz.

Quote:
Originally Posted by Joe Bloggs View Post

And 50Hz (720p50, 1920x1080/50i). And 23.976.

HDMI supports 1080p50 too.

Blu-ray only supports 1080p23.976 / 1080p24, but not 1080p25, 1080p29.97, 1080p30, 1080p50, 1080p59.94 or 1080p60. Most Blu-ray players can convert to the higher frame rates on the HDMI output, but that information is not on the disc, and the TV might do a better job of upping the frame rate (3:3, 4:4 or 5:5 pulldown instead of 3:2).

Only 720p and 1080i are supported in 50, 59.94 and 60 Hz.

Again, 1080p60 Frame Packing 3D would require a 297 MHz pixel clock on HDMI, while most current Tx and Rx chips top out at 225 MHz. So, pretty much no one is doing this, even though it's specified in the standards.
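To make the gap concrete, here is a small sketch comparing the required TMDS clocks against that roughly 225 MHz chip ceiling. The per-frame-rate horizontal totals are the standard CEA-861 values (my assumption for the 25 Hz entry), and the verdicts are only as good as that ceiling figure:

```python
# Required TMDS clock for 1080p Frame Packing at several frame rates,
# compared against the ~225 MHz ceiling of typical HDMI 1.3/1.4-era chips.
# Horizontal totals (active + blanking): 24 Hz uses 2750, 25 Hz uses 2640,
# 30/60 Hz use 2200 pixels per line.
H_TOTALS = {24: 2750, 25: 2640, 30: 2200, 60: 2200}
V_TOTAL = 2250        # two 1125-line frames packed vertically
CHIP_LIMIT_MHZ = 225.0

for fps in sorted(H_TOTALS):
    mhz = H_TOTALS[fps] * V_TOTAL * fps / 1e6
    verdict = "fits" if mhz <= CHIP_LIMIT_MHZ else "exceeds"
    print(f"1080p{fps} Frame Packing: {mhz} MHz -> {verdict} a 225 MHz chip")
```

Running this shows 1080p24/25/30 Frame Packing all landing at 148.5 MHz, well under the ceiling, while 1080p60 needs 297 MHz.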
post #48 of 73
OK, but why does it need to be in the Blu-ray spec to be a mandatory format for HDMI? We're all aware that many devices other than Blu-ray players use HDMI.

1080p30 frame-packed 3D would have been nice for gamers, and it would have been possible within the bandwidth of current HDMI chips.
post #49 of 73
Quote:
Originally Posted by obveron View Post

OK, but why does it need to be in the Blu-ray spec to be a mandatory format for HDMI? We're all aware that many devices other than Blu-ray players use HDMI.

1080p30 frame-packed 3D would have been nice for gamers, and it would have been possible within the bandwidth of current HDMI chips.

I did not dispute that. It would be nice if 24, 25 and 30 Hz modes would always be supported together as a "package", since they require the exact same bandwidth on HDMI, and allow 24, 50 and 60 Hz material to be rendered at even pulldown rates.

(23.976 and 24.00, as well as 29.97 vs. 30.00 and 59.94 vs. 60.00 Hz, are already "packaged" on HDMI; the spec does not allow separating support between the integer-rate and the X*1000/1001 modes.)
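The 1000/1001 pairing mentioned above is just each integer rate scaled down by a hair; a tiny illustration:

```python
from fractions import Fraction

# Each "NTSC" rate is the integer rate multiplied by 1000/1001.
for base in (24, 30, 60):
    ntsc = Fraction(base) * Fraction(1000, 1001)
    print(f"{base} Hz -> {float(ntsc):.3f} Hz")   # 23.976, 29.970, 59.940
```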
post #50 of 73
I posted this rant in the nvidia forum but it seems like it is just as appropriate for the AVS Forum. I also wanted to see if there were any responses from the good folks of AVS Forum who probably know a lot more than I do:

-------------------------------------

I just wanted to write in this forum to vent a bit.

It seems like all companies involved in the entire 3D launch screwed up.

Nvidia tried their best a couple years ago with Nvidia 3D Vision. This basically had everything needed to make the best 3D possible at the time. Even though the first 3D monitors didn't support it, 3D Vision supported TRUE 120Hz 1920x1080p. That is the resolution that makes the most sense for 3D games. So Nvidia had the "vision" to get the specification down properly. A lot of people debate how 3D should be done, but Nvidia's solution works better than the current HDMI 1.4a solution. The problem is that 3D Vision needs displays to work. The bigger the display, the better. And Nvidia didn't get TV manufacturers to properly implement 3D Vision. People bothering with 3D want at least a 32" monitor/TV for 3D. Now the best we can get for using large TVs with 3D and computers is 3DTV Play, which can only play games at stereo 1280x720 at 60Hz.

The problem really comes down to the competition between DisplayPort and HDMI. HDMI is a licensed technology which requires fees to use on devices. It is also inferior to DisplayPort in bandwidth. HDMI is basically the same as DVI. Nvidia could have chosen DisplayPort to implement 3D Vision, which would have had enough bandwidth even with the DisplayPort 1.0 spec. Instead, Nvidia chose to use dual-link DVI because all of their graphics cards had had dual-link DVI for a while. Single-link DVI does not have enough bandwidth for 120Hz 1920x1080. Maybe Nvidia thought HDMI would eventually incorporate a new dual-link HDMI standard. That would certainly have been nice, but it didn't happen.

Instead, Sony, Samsung and all the HDMI gang came up with HDMI 1.4a - the current spec supporting the latest 3D standards. There is a huge corporate push for this, with millions of advertising dollars trying to convince us 3D is the next great thing. But instead of giving us a nice clean 120Hz signal for 3D, HDMI 1.4a uses Frame Packing and weird Side-by-Side etc. garbage to get around other technical limitations. I understand this was mostly for backward compatibility with previous Blu-ray players and also for the compression advantages of the similar left/right eye frames. But it's really a messy solution. Instead of taking a step back and doing it right, they fudged the whole thing to try to get something to work that wasn't quite ready. They needed to add more bandwidth to HDMI to make a proper 3D standard, or not bother with 3D. The bandwidth between HDMI 1.3 and HDMI 1.4 is the same. They just wanted to get 3D out the door without thinking about long-term consequences.

The main limitation of HDMI 1.4a is that it only requires the standards of stereo 720p 60hz or stereo 1080p 24 hz. HDMI (single link) does not have the bandwidth for stereo 1080p 60. But there is NO EXCUSE that the HDMI board didn't REQUIRE stereo 1080p 30. This was still within the bandwidth restrictions of HDMI and would have allowed an "acceptable" framerate for games of 30 frames per eye at 1080p. It would not have been ideal but it would have been good enough. Stereo 1080p 24 is obviously unplayable. Nvidia should have pushed a lot harder to get the stereo 1080p 30 as part of the HDMI 1.4a specification. Now for the next 2 years minimum, we are stuck with a max resolution of 3D 720p 60 for playing games when using large screen TVs. All the TVs released now will be obsolete when the next Xbox or Playstation 4 comes out in a couple years. Because that's when the HDMI standard will be upgraded to support Stereo 1080p 60 which the new game consoles will surely have available. Again, the sad part of this is the lack of Stereo 1080p 30 support for all the current TV models.

Another benefit of 3D Vision was the ability to have nice smooth 120Hz 1920x1080 for normal computer use as well. I used to use 120Hz in the old CRT days. It's really nice and a worthwhile upgrade in itself, even without the 3D. None of that is possible with the current HDMI 1.4a implementation. Also, any future monitors coming out for 3D Vision that have HDMI 1.4a won't support 120Hz over HDMI, because it's not in the standard. To get 3D 1080p or 120Hz, one will still need to use dual-link DVI or DisplayPort, which Nvidia barely supports.

That's the other problem with Nvidia. They REFUSE to push DisplayPort. Possibly, if Nvidia had started to use DisplayPort on all their reference graphics cards, then it would have caught on a lot faster. It might even have become standard to put a DisplayPort input on TVs. This was a huge mistake on Nvidia's part. DisplayPort would have worked perfectly for 120Hz 1920x1080p even with the old spec, so they could have used it for 3D Vision. There was also no disadvantage to adding DisplayPort to all of their graphics cards. First of all, DisplayPort has no licensing charges for manufacturers. It is much cheaper to add DisplayPort than HDMI.

DVI and HDMI are basically the same. The proper port configuration of a video card today should be, at a minimum, 1x DisplayPort, 1x HDMI, and 1x DVI. ATI uses 1x DisplayPort, 1x HDMI and 2x DVI, but the 2x DVI takes up venting room on the second slot. The second DVI is not necessary to include on a modern card, though, because a very cheap HDMI-to-DVI converter or cable can make the 1x HDMI become a second DVI perfectly. All Nvidia must do is supply the converter with their graphics cards. Then the consumer has the choice of how to connect everything: 2x DVI; 1x DVI and 1x HDMI; or 1x DisplayPort and 1x HDMI or DVI. It's all possible. With Nvidia's current implementation, the consumer has the choice of HDMI or DVI (which is essentially just HDMI), i.e. no choice at all. Of course, it would also be nice to get Eyefinity functionality on Nvidia without having to buy two cards. DisplayPort also makes it easier to have Eyefinity functionality, but that's for another discussion.

So ATI, Apple and Dell were already way ahead of the curve by supporting DisplayPort. Maybe if Nvidia had also supported DisplayPort, it would have tipped the scales enough to either:
1. get a TV manufacturer to use DisplayPort as one of the TV inputs, or
2. push HDMI members to include more bandwidth/better standards for 3D in HDMI 1.4a.

Instead we have the mess we are in now. DisplayPort has all the bandwidth necessary for whatever is needed - even 3D at 2560x1600. But no HDTVs and very few monitors use it, because the only things outputting DisplayPort are ATI cards and a few non-reference Nvidia cards - oh, and some Apple stuff. I was rather surprised that even the latest and greatest reference Nvidia GTX 580 still lacks DisplayPort. Maybe in 6 months we'll get a non-reference card which supports it. The really pathetic part is that Nvidia supports DisplayPort at the chip level and in its drivers already, but it totally failed by not making it part of the reference board. It's a small port - it should be standard on all graphics cards. At the very least, having such a high-bandwidth port available everywhere puts pressure on HDMI and other standards to compete. Even if DisplayPort ultimately fails completely, it would have been to Nvidia's advantage to include it, because it would have given Nvidia more bargaining power in the future 3D market.

It seems fairly obvious that all manufacturers involved are playing a game of "Manufactured Obsolescence." I guess, if I want to play with 3D Vision, I can play on a giant 24" monitor with Dual Link DVI. Or I can pay $2000 now for a brand new "obsolete" 46" TV and use nvidia 3DTV to play games at an amazing 1280x720 resolution. Wow.

I can only hear Sony, Samsung, Sharp and Toshiba in a couple years: "Everybody needs to upgrade EVERYTHING again because we forgot to include 3D 1080p30 in the HDMI specification!!" Whoops! This was a huge failure by all the manufacturers concerned, and the consumer lost another round.

All of this could have been avoided if DisplayPort had been adopted. Of course, at any time, any TV manufacturer could put DisplayPort on their TV, and that would allow the full resolution we need for 3D and 120Hz. Certainly AMD would be happy because Radeon would get a quick boost in the new 3D market. Nvidia would probably also be happy because they would have a great excuse to finally add DisplayPort to a reference design, and then Nvidia fans would need to fork out for a new graphics card. It's all about making money, isn't it? But it's very unlikely Sony or any TV manufacturer will bother with DisplayPort now. HDMI has too much momentum and marketing money.

The mess could also have been avoided if HDMI 1.4a had included at least stereo 1080p 30, which would have been good enough as a stopgap. It was entirely technically possible without any extra effort. There would still be plenty of people upgrading from their HDTV to 3D or even OLED etc. in a couple years, but people buying 3D now at least wouldn't feel totally shafted in the near future. I thought we were over the 720p hump already! If these giant corporate entities would try to look out for the consumer a bit more, it would actually benefit everyone, because more momentum would be built in general for 3D and 120Hz. People wouldn't always feel like they are being tricked.

It certainly would have been really nice if the HDMI gang had adopted the full 120Hz 1080p resolution of 3D Vision into HDMI. Of course, HDMI itself would need a bandwidth upgrade. But if there's a giant corporate push to start something big like "3D", then it makes sense to do it CORRECTLY and make SURE the bandwidth is there to implement 3D PROPERLY and with true 120Hz. It also should be able to integrate EASILY with computers, because everyone has computers now. All they really had to do was make a new dual-link HDMI standard - everything would have slotted into place perfectly! Instead, we get this half-baked, non-120Hz, frame-packed standard that needs $40 drivers from Nvidia to use with a computer. Messy standards like that will not be easy to undo when they are embedded in millions of Blu-ray players and graphics cards. It would have made a lot more sense to POSTPONE the release of all this 3D stuff until they got the standard right... I guess it's too late for that now. The comical part is that TVs everywhere were already pushing fake 120Hz frame interpolation and even fake 240Hz frame interpolation. But apparently, the rich corporations couldn't just make an HDMI standard with true 120Hz or use DisplayPort, which already had it?? Sure - have them roll out a whole new standard that we'll use for a long time that is totally FLAWED, just so they can sell us a slightly less flawed version next Christmas!

I just want to repeat what should have been the proper solution for the new 3D standard:
Either use DisplayPort 1.2, or create a new dual-link HDMI standard that is compatible with dual-link DVI.
At the very least, make the HDMI 1.4a standard require 1080p30 for each eye, for the sake of all the gamers in the world.

I guess all of these companies are competing in a cut-throat corporate world. Oh, the joys of capitalism!

My rant is over. I don't think writing this made me feel any better.
post #51 of 73
Quote:
Originally Posted by stinster View Post

I'm looking to purchase a 3D plasma, likely a Samsung 63C8000 or Panasonic 65VT25, this summer in time for the World Cup (3D). I've gleaned that 3D Blu-ray discs will only require a 3D TV which can display 1080p @ 24 frames per second per eye, with the display refreshing more frequently for smooth video. I've also learned that the initial offerings from cable, satellite, and fiber TV providers will likely display at 720p (not sure of the fps) per eye. I believe these formats will easily be handled by "3D ready" devices currently or soon to be on the market.

What about hardware capable of displaying a 1080p image for each eye at 30 and 60 frames per second? Will this first generation of TVs and audio video receivers support these standards? I'm DEEPLY concerned about spending 3-4 thousand dollars on a television and another thousand on a nice 3D ready receiver, if they lack support for these formats.

I've read about NVIDIA's upcoming 3D conversion software for PC games. I would hope, so long as my hardware is capable and I'm willing to dial in the graphics options in games, that this solution would be able to output 1080p at 30 fps for each eye. If the hardware isn't quite up to the task today for a full 60 fps, I'm sure it will be in another year. Likewise, I believe that the next generation of game consoles will also be capable of producing good-quality graphics in these formats.

Have any of the TV or receiver manufacturers discussed their plans for these formats? Are they targeted as required formats for a future HDMI standard?

Any gamers out there with similar concerns about spending money this summer on hardware without guarantees that they can game in 1080p 3D at 30 and 60 fps?

At the moment you will not be able to run 30p or 60p 3D games on time-sequential 3D TVs.
The only way for now is to use time-parallel systems with Nvidia's Quadro professional dual graphics adapter.

Mathew Orman

http://www.*******************-usa.com/
post #52 of 73
Quote:
Originally Posted by orbidia View Post

That is the resolution that makes the most sense for 3D games.


Looks like a nice satire you wrote there; I'll have a laugh when I have the time.
post #53 of 73
Orbidia:

The 3DTVs that are available now are designed around two different 3D formats:
  • Frame Packed for 3D BD (movies and PS3 games)
  • Frame Compatible for DBS/CBL and eventually OTA

3D movies are shot/rendered at 24 frames/sec, not 30 and not 60. The PS3 is limited to 720p60 for 3D games because that is all the Cell BE can handle; it can't do 1080p60.

The Frame Compatible 3D format allows millions of DBS and CBL STBs to pass 3D signals with just a simple firmware upgrade as opposed to replacing them. Plus it takes up the same bandwidth that a regular HD channel does.
post #54 of 73
Quote:
Originally Posted by orbidia View Post

I posted this rant in the nvidia forum but it seems like it is just as appropriate for the AVS Forum. I also wanted to see if there were any responses from the good folks of AVS Forum who probably know a lot more than I do:

-------------------------------------

I just wanted to write in this forum to vent a bit.

It seems like all companies involved in the entire 3D launch screwed up.

Nvidia tried their best a couple years ago with Nvidia 3D Vision. This basically had everything needed to make the best 3D possible at the time. Even though the first 3D monitors didn't support it, 3D Vision supported TRUE 120hz 1920x1080p. That is the resolution that makes the most sense for 3D games. So Nvidia had the "vision" to get the specification down properly. A lot of people debate about how 3D should be done but Nvidia's solution works better than the current HDMI 1.4a solution. The problem is that 3D Vision needs displays to work. The bigger the display, the better. And Nvidia didn't get TV manufacturers to properly implement 3D Vision. People bothering with 3D want at least a 32" monitor/TV for 3D. Now, the best we can get to use large TVs with 3D and computers is 3DTV Play which can only play games at Stereo 1280x720 60hz resolution.

The problem really comes down to the competition between Displayport and HDMI. HDMI is a licensed technology which requires fees to use on devices. It is also inferior to displayport in bandwidth. HDMI is basically the same as DVI. Nvidia could have chosen displayport to implement 3D Vision which would have had enough bandwidth even with diplayport 1.0 spec. Instead, Nvidia chose to use Dual Link DVI because all of their graphics cards had dual link DVI for a while. Single link DVI does not have enough bandwidth for 120hz 1920x1080. Maybe Nvidia thought HDMI would eventually incorporate a new Dual Link HDMI standard. That would certainly have been nice but it didn't happen.

Instead Sony, Samsung and all the HDMI gang came up with HDMI 1.4a - the current spec supporting the latest 3D standards. There is a huge corporate push for this with millions of advertising dollars trying to convince us 3D is the next great thing. But instead of giving us a nice clean 120hz signal for 3D, HDMI 1.4a uses Frame Packing and weird Side by Side etc garbage to get around other technical limitations. I understand this was mostly for backward compatibility with previous blu-ray players and also for compression advantages of the similar left/right eye frames. But its really a messy solution. Instead of taking a step back and doing it right, they fudged the whole thing to try to get something to work that wasn't quite ready. They needed to add more bandwidth to HDMI to make a proper 3D standard or not bother with 3D. The bandwidth between HDMI 1.3 and HDMI 1.4 is the same. They just wanted to get 3D out the door without thinking about long term consequences.

The main limitation of HDMI 1.4a is that it only requires the standards of stereo 720p 60hz or stereo 1080p 24 hz. HDMI (single link) does not have the bandwidth for stereo 1080p 60. But there is NO EXCUSE that the HDMI board didn't REQUIRE stereo 1080p 30. This was still within the bandwidth restrictions of HDMI and would have allowed an "acceptable" framerate for games of 30 frames per eye at 1080p. It would not have been ideal but it would have been good enough. Stereo 1080p 24 is obviously unplayable. Nvidia should have pushed a lot harder to get the stereo 1080p 30 as part of the HDMI 1.4a specification. Now for the next 2 years minimum, we are stuck with a max resolution of 3D 720p 60 for playing games when using large screen TVs. All the TVs released now will be obsolete when the next Xbox or Playstation 4 comes out in a couple years. Because that's when the HDMI standard will be upgraded to support Stereo 1080p 60 which the new game consoles will surely have available. Again, the sad part of this is the lack of Stereo 1080p 30 support for all the current TV models.

Another benefit of 3D vision was the ability to have nice smooth 120hz 1920x1080 resolution for normal computer use as well. I used to use 120hz in old CRT days. Its really nice and a worthwhile upgrade in itself even without the 3D. None of that is possible with the current HDMI 1.4a implementation. Also, any future monitors coming out for 3D Vision that have hdmi 1.4a won't support 120hz over hdmi because its not in the standard. To get 3D 1080p or 120hz, one will still need to use dual link dvi or displayport which nvidia barely supports.

That's the other problem with nvidia. They REFUSE to push displayport. Possibly, if Nvidia had started to use displayport on all their reference graphics cards, then it would have caught on a lot faster. It might even have become standard to put a displayport input on TVs. This was a huge mistake on Nivdia's part. Displayport would have worked perfectly for 120hz 1920x1080p even with the old spec so they could have used it for 3D Vision. There was also no disadvantage to adding displayport to all of their graphics cards. First of all Displayport has no licensing charges for manufacturers. It is much cheaper to add displayport than hdmi.

DVI and HDMI are basically the same. The proper port configuration of a video card today should be at a minimum, 1x displayport, 1xhdmi, and 1x dvi. ATI uses 1xdisplayport, 1xhdmi and 2x dvi. But the 2xdvi takes up venting room on the second slot. The second DVI is not necessary to include on a modern card though because a very cheap HDMI to DVI converter or cable can make the 1xhdmi become a second DVI perfectly. All nvidia must do is supply the converter with their graphics cards. Then the consumer has the choice of how to connect everything. 2x DVI, 1xDVI and 1xHMDI, 1xdisplayport and 1xHDMI or DVI. It's all possible. With Nvidia's current implementation, the consumer has the choice of HDMI or DVI (which is essentially just HDMI) i.e. no choice at all. Of course, it would also be nice to get eyefinity functionality on nvidia without having to buy two cards. Displayport also makes it easier to have eyefinity functionality, but that's for another discussion.

So ATI, Apple and Dell were already way ahead of the curve by supporting displayport. Maybe if nvidia had also supported displayport, it would have tipped the sails enough to either:
1. get a TV manufacturer to use displayport as one of the TV inputs.
2. pushed HDMI members to have more bandwidth/better standards for 3D in hdmi 1.4a.

Instead we have the mess we are in now. Displayport has all the bandwidth necessary for whatever is needed - even 3D at 2560x1600. But no HDTVs and very few monitors use it because the only things outputting displayport are ATI cards and a few non-reference nvidia cards - oh, and some apple stuff. I was rather suprised even the latest and greatest reference Nvidia 580gtx still lacks displayport. Maybe in 6 months we'll get a non-reference card which supports it. The really pathetic part is that Nvidia supports displayport at the chip level and in its drivers already. But it totally failed not making it part of the reference board. It's a small port - it should be standard on all graphics cards. At the very least, having such a high bandwidth port available everywhere puts pressure on HDMI and other standards to compete. Even if displayport ultimately fails completely, it would have been to nvidia's avantage to include it because it would have given Nvidia more bargaining power in the future 3D market.

It seems fairly obvious that all the manufacturers involved are playing a game of "manufactured obsolescence." I guess, if I want to play with 3D Vision, I can play on a giant 24" monitor over dual-link DVI. Or I can pay $2000 now for a brand new "obsolete" 46" TV and use Nvidia 3DTV Play to run games at an amazing 1280x720. Wow.

I can only hear Sony, Samsung, Sharp and Toshiba in a couple of years: "Everybody needs to upgrade EVERYTHING again because we forgot to include 3D 1080p30 in the HDMI specification!!" Whoops! This was a huge failure by all the manufacturers concerned, and the consumer lost another round.

All of this could have been avoided if DisplayPort had been adopted. Of course, at any time, any TV manufacturer could put DisplayPort on their TVs, which would allow the full resolution we need for 3D and 120 Hz. Certainly AMD would be happy, because Radeon would get a quick boost in the new 3D market. Nvidia would probably be happy too, because they would have a great excuse to finally add DisplayPort to a reference design, and then Nvidia fans would need to fork out for a new graphics card. It's all about making money, isn't it? But it's very unlikely that Sony or any other TV manufacturer will bother with DisplayPort now. HDMI has too much momentum and marketing money.

The mess could also have been avoided if HDMI 1.4a had included at least stereo 1080p30, which would have been good enough as a stopgap. It was entirely possible technically, without any extra effort. There would still be plenty of people upgrading from their HDTVs to 3D or even OLED in a couple of years, but people buying 3D now at least wouldn't feel totally shafted in the near future. I thought we were over the 720p hump already! If these giant corporate entities could look out for the consumer a bit more, it would actually benefit everyone, because more momentum would build for 3D and 120 Hz in general. People wouldn't always feel like they are being tricked.

It certainly would have been really nice if the HDMI gang had adopted the full 1080p 120 Hz resolution of 3D Vision into HDMI. Of course, HDMI itself would need a bandwidth upgrade. But if there's a giant corporate push to start something as big as "3D", then it makes sense to do it CORRECTLY and make SURE the bandwidth is there to implement 3D PROPERLY, with true 120 Hz. It should also integrate EASILY with computers, because everyone has a computer now. All they really had to do was create a new dual-link HDMI standard and everything would have slotted into place perfectly! Instead, we get this half-baked, non-120 Hz, frame-packed standard that needs $40 drivers from Nvidia to use with a computer. Messy standards like that will not be easy to undo once they are embedded in millions of Blu-ray players and graphics cards. It would have made a lot more sense to POSTPONE the release of all this 3D stuff until they got the standard right... I guess it's too late for that now. The comical part is that TVs everywhere were already pushing fake 120 Hz frame interpolation, and even fake 240 Hz. But apparently the rich corporations couldn't just make an HDMI standard with true 120 Hz, or use DisplayPort, which already had it? Sure, have them roll out a whole new standard that we'll use for a long time that is totally FLAWED, just so they can sell us a slightly less flawed version next Christmas!

I just want to repeat what the proper solution for the new 3D standard should have been:
Either use DisplayPort 1.2, or create a new dual-link HDMI standard that is compatible with dual-link DVI.
At the very least, make stereo 1080p30 (1080p30 per eye) a requirement in HDMI 1.4a, for the sake of all the gamers in the world.
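As a back-of-envelope check on that last claim (my own arithmetic, not from any spec table; raster totals use the standard CEA-861 1080p blanking of 2200x1125, and the link capacities are from the published interface specs):

```python
# Uncompressed video bandwidth in Gbit/s at 24 bits per pixel,
# counting the full raster (active video plus blanking).
def gbps(h_total, v_total, refresh_hz, bits_per_pixel=24):
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

# Frame packing doubles the 1080p vertical total (2 x 1125 = 2250).
formats = {
    "1080p30 per eye (frame packed)":  gbps(2200, 2250, 30),   # ~3.56
    "1080p60 per eye (frame packed)":  gbps(2200, 2250, 60),   # ~7.13
    "1080p120 per eye (frame packed)": gbps(2200, 2250, 120),  # ~14.26
}

links = {
    "Dual-link DVI (2 x 165 MHz)":     330e6 * 24 / 1e9,  # 7.92
    "HDMI 1.3/1.4 (340 MHz TMDS max)": 340e6 * 24 / 1e9,  # 8.16
    "DisplayPort 1.2 (HBR2, 4 lanes)": 17.28,
}

for name, need in formats.items():
    print(f"{name}: {need:.2f} Gbit/s")
for name, cap in links.items():
    print(f"{name}: {cap:.2f} Gbit/s capacity")
```

On these numbers, stereo 1080p30 fits comfortably inside what HDMI-era silicon could already carry, stereo 1080p60 squeaks in under the 340 MHz ceiling, and only DisplayPort 1.2 has room for a true 120 Hz per eye.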

I guess all of these companies are competing in a cut-throat corporate world. Oh, the joys of capitalism!

My rant is over. I don't think writing this made me feel any better.

What is wrong with regular, frame-sequential FULL HD 1080p 60 Hz?
It works fine on my DLP TV, which is not even 3D-ready.
It works fine with games running at 30 fps per eye.
60 Hz frame-sequential 3D movies play just fine on a regular Blu-ray player.
HDMI 1.3 is just fine for FULL HD 3D 60 Hz frame-sequential stereoscopic content.

Mathew Orman
post #55 of 73
Quote:
Originally Posted by orbidia View Post

... The problem really comes down to the competition between Displayport and HDMI. ...

You are right about that one. But there are larger politics involved. There is the CEA-861 standard, which defines the video timings, audio formats and metadata, such as AVI InfoFrames (these tell your TV, for example, what color space and aspect ratio are being used). HDMI did not define these specifics in its own standard, which only describes the physical interface and data transfer, such as audio data embedding.

HDMI is not the only display interface standard using CEA-861. DisplayPort uses it as well, and so does DiiVA.

CEA was working on 3D extensions for the 861 standard.

Now, it appears that HDMI felt the heat from the DisplayPort competition. So they went ahead of CEA and snatched the 3D feature by implementing their own video timings and InfoFrames for 3D, and also threw in the 4K resolutions.

This means 3D is currently only defined for HDMI, not for DisplayPort or any other competing interface, which gives HDMI a competitive advantage.

It makes the technical implementation more awkward: related timings are defined in two different specs, and signaling of content formats is done in two complementary InfoFrames (the CEA-861 AVI InfoFrame and the HDMI VSI). And it has put CEA in the position of having to cancel its own 3D implementation, since no HDMI developer wants two competing standards for that.

By the way, once more of the new HDMI 1.4 chips with HEAC become available, you will see more devices supporting the 297 MHz pixel clock necessary for 1080p60 3D and 1080p120. The timings are already defined; the hardware just has to follow now.
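For anyone wondering where the 297 MHz figure comes from, it falls straight out of the raster arithmetic (a sketch using the standard 2200x1125 1080p totals; the exact placement of the blanking inside the frame-packed structure differs, but the totals don't):

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a total raster of h_total x v_total at refresh_hz."""
    return h_total * v_total * refresh_hz / 1e6

# Plain 2D 1080p60: the familiar 148.5 MHz
print(pixel_clock_mhz(2200, 1125, 60))   # 148.5

# 1080p60 frame packing: the vertical total doubles to 2250
print(pixel_clock_mhz(2200, 2250, 60))   # 297.0

# 1080p120 (frame sequential) needs exactly the same clock
print(pixel_clock_mhz(2200, 1125, 120))  # 297.0
```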
post #56 of 73
DisplayPort has claimed Full HD 3D support since DP 1.1, and DP 1.2 pushed it even further. What exactly is not working with their implementation?
post #57 of 73
Quote:
Originally Posted by BlackShark View Post

DisplayPort has claimed Full HD 3D support since DP 1.1, and DP 1.2 pushed it even further. What exactly is not working with their implementation?

I have to admit I do not know much about the DisplayPort specs. Is their 3D comparable to HDMI 3D in terms of supported formats and such? I assume their 3D formats and signaling are not compatible with HDMI 3D.
post #58 of 73
I know DP 1.1 specified only frame-sequential output, with a labelling system that identified the frames as left and right.
DP 1.2 extended the 3D features to support more formats, like side-by-side and others, but I don't know the actual list, or whether any are mandatory; DP did not specify that in their press release.
You'd have to look through the actual specification documents to know.
post #59 of 73
Setting aside what we should have, it's good to know how to deal with what we do have.
With the current limited HDMI standards, should we be gaming at 720p?

By my math, 3D gaming on the current 3DTVs is best done at 1080p side-by-side or top-and-bottom. Frame-packed 720p delivers slightly fewer pixels per eye:
1920 x 540 = 1,036,800
or
960 x 1080 = 1,036,800
versus
1280 x 720 = 921,600
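A quick sanity check on that per-eye arithmetic (the mode labels are just my own descriptions):

```python
# Pixels delivered per eye in each half-resolution or frame-packed mode.
modes = {
    "1080p top-and-bottom (1920 x 540 per eye)": 1920 * 540,
    "1080p side-by-side (960 x 1080 per eye)":   960 * 1080,
    "720p frame packed (1280 x 720 per eye)":    1280 * 720,
}
for name, pixels in modes.items():
    print(f"{name}: {pixels:,}")

# Half-resolution 1080p gives each eye 12.5% more pixels than full 720p.
print(1920 * 540 / (1280 * 720))  # 1.125
```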
post #60 of 73
Quote:
Originally Posted by obveron View Post

Setting aside what we should have, it's good to know how to deal with what we do have.
With the current limited HDMI standards, should we be gaming at 720p?

By my math, 3D gaming on the current 3DTVs is best done at 1080p side-by-side or top-and-bottom. Frame-packed 720p delivers slightly fewer pixels per eye:
1920 x 540 = 1,036,800
or
960 x 1080 = 1,036,800
versus
1280 x 720 = 921,600

I sincerely doubt you could ever see any difference between those two numbers.