
Support for 3D 1080p @ 60 and 30 frames per second?

21K views 72 replies 20 participants last post by  gain3 
#1 ·
I'm looking to purchase a 3D plasma, likely a Samsung 63c8000 or Panasonic 65vt25, this summer in time for the World Cup (3D). I've gleaned that 3D Blu-ray discs will only require a 3D TV which can display 1080p @ 24 frames per second per eye, with the display refreshing more frequently for smooth video. I've also learned that the initial offerings from cable, satellite, and fiber TV providers will likely display at 720p (not sure of the fps) per eye. I believe these formats will easily be handled by "3D ready" devices currently or soon to be on the market.


What about hardware capable of displaying a 1080p image for each eye at 30 and 60 frames per second? Will this first generation of TVs and audio video receivers support these standards? I'm DEEPLY concerned about spending 3-4 thousand dollars on a television and another thousand on a nice 3D ready receiver, if they lack support for these formats.


I've read about NVIDIA's upcoming 3D conversion software for PC games. I would hope, so long as my hardware is capable and I'm willing to dial in the graphics options in games, that this solution would be able to output 1080p at 30 fps for each eye. If the hardware isn't quite up to the task today for a full 60 fps, I'm sure it will be in another year. Likewise, I believe that the next generation of game consoles will also be capable of producing good quality graphics in these formats.


Have any of the TV or receiver manufacturers discussed their plans for these formats? Are they targeted as required formats for a future HDMI standard?


Any gamers out there with similar concerns about spending money this summer on hardware without guarantees that they can game in 1080p 3D at 30 and 60 fps?
 
#53 ·
Orbidia:


The 3DTVs that are available now are designed around two different 3D formats:
  • Frame Packed for 3D BD (movies and PS3 games)
  • Frame Compatible for DBS/CBL and eventually OTA


3D movies are shot/rendered at 24 frames/sec. Not 30 and not 60. The PS3 is limited to 720p60 for 3D games because that is all the Cell BE can handle. It can't do 1080p60.


The Frame Compatible 3D format allows millions of DBS and CBL STBs to pass 3D signals with just a simple firmware upgrade as opposed to replacing them. Plus it takes up the same bandwidth that a regular HD channel does.
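The frame-compatible idea can be sketched in a few lines of Python (a hypothetical illustration, not any vendor's actual encoder): each eye's view is decimated to half width so both fit in one ordinary HD frame, which is why the channel costs no more bandwidth than regular 2D HD.

```python
# Sketch of "frame compatible" side-by-side packing: each eye's
# 1920x1080 view is squeezed to half width, and the two halves share
# one ordinary 1920x1080 frame. Naive decimation is used here; real
# encoders filter before downsampling.

def pack_side_by_side(left, right):
    """left/right: lists of rows, each row a list of 1920 pixels."""
    packed = []
    for lrow, rrow in zip(left, right):
        # keep every second pixel per eye: 960 + 960 = 1920 wide
        packed.append(lrow[::2] + rrow[::2])
    return packed

left  = [[("L", y, x) for x in range(1920)] for y in range(1080)]
right = [[("R", y, x) for x in range(1920)] for y in range(1080)]
frame = pack_side_by_side(left, right)

assert len(frame) == 1080      # same line count as a 2D HD frame
assert len(frame[0]) == 1920   # same width: 960 px per eye
```

The 3DTV then unpacks the two halves and stretches each back to full width, trading horizontal resolution for compatibility with existing STBs.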
 
#54 ·

Quote:
Originally Posted by orbidia /forum/post/19533052


I posted this rant in the nvidia forum but it seems like it is just as appropriate for the AVS Forum. I also wanted to see if there were any responses from the good folks of AVS Forum who probably know a lot more than I do:


-------------------------------------


I just wanted to write in this forum to vent a bit.


It seems like all companies involved in the entire 3D launch screwed up.


Nvidia tried their best a couple years ago with Nvidia 3D Vision. This basically had everything needed to make the best 3D possible at the time. Even though the first 3D monitors didn't support it, 3D Vision supported TRUE 120hz 1920x1080p. That is the resolution that makes the most sense for 3D games. So Nvidia had the "vision" to get the specification down properly. A lot of people debate about how 3D should be done but Nvidia's solution works better than the current HDMI 1.4a solution. The problem is that 3D Vision needs displays to work. The bigger the display, the better. And Nvidia didn't get TV manufacturers to properly implement 3D Vision. People bothering with 3D want at least a 32" monitor/TV for 3D. Now, the best we can get to use large TVs with 3D and computers is 3DTV Play which can only play games at Stereo 1280x720 60hz resolution.


The problem really comes down to the competition between Displayport and HDMI. HDMI is a licensed technology which requires fees to use on devices. It is also inferior to displayport in bandwidth. HDMI is basically the same as DVI. Nvidia could have chosen displayport to implement 3D Vision, which would have had enough bandwidth even with the displayport 1.0 spec. Instead, Nvidia chose to use Dual Link DVI because all of their graphics cards had dual link DVI for a while. Single link DVI does not have enough bandwidth for 120hz 1920x1080. Maybe Nvidia thought HDMI would eventually incorporate a new Dual Link HDMI standard. That would certainly have been nice but it didn't happen.


Instead Sony, Samsung and all the HDMI gang came up with HDMI 1.4a - the current spec supporting the latest 3D standards. There is a huge corporate push for this, with millions of advertising dollars trying to convince us 3D is the next great thing. But instead of giving us a nice clean 120hz signal for 3D, HDMI 1.4a uses Frame Packing and weird Side by Side etc garbage to get around other technical limitations. I understand this was mostly for backward compatibility with previous blu-ray players and also for compression advantages of the similar left/right eye frames. But it's really a messy solution. Instead of taking a step back and doing it right, they fudged the whole thing to try to get something to work that wasn't quite ready. They needed to add more bandwidth to HDMI to make a proper 3D standard, or not bother with 3D. The bandwidth between HDMI 1.3 and HDMI 1.4 is the same. They just wanted to get 3D out the door without thinking about long term consequences.


The main limitation of HDMI 1.4a is that it only requires the standards of stereo 720p 60hz or stereo 1080p 24 hz. HDMI (single link) does not have the bandwidth for stereo 1080p 60. But there is NO EXCUSE that the HDMI board didn't REQUIRE stereo 1080p 30. This was still within the bandwidth restrictions of HDMI and would have allowed an "acceptable" framerate for games of 30 frames per eye at 1080p. It would not have been ideal but it would have been good enough. Stereo 1080p 24 is obviously unplayable. Nvidia should have pushed a lot harder to get the stereo 1080p 30 as part of the HDMI 1.4a specification. Now for the next 2 years minimum, we are stuck with a max resolution of 3D 720p 60 for playing games when using large screen TVs. All the TVs released now will be obsolete when the next Xbox or Playstation 4 comes out in a couple years. Because that's when the HDMI standard will be upgraded to support Stereo 1080p 60 which the new game consoles will surely have available. Again, the sad part of this is the lack of Stereo 1080p 30 support for all the current TV models.
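The bandwidth claim in the paragraph above checks out on the back of an envelope (a sketch using the standard CEA-861 total timings, including blanking): frame-packed stereo 1080p30 needs exactly the same pixel clock as ordinary 2D 1080p60, which every HDMI 1.3 device already carries.

```python
# Total timings (with blanking) per CEA-861: 2200x1125 for a 1080p
# frame. Frame packing stacks both eyes (plus the active-space gap)
# into one 2200x2250 super-frame.

H_TOTAL = 2200   # 1080p horizontal total incl. blanking
V_TOTAL = 1125   # 1080p vertical total incl. blanking

px_1080p60_2d   = H_TOTAL * V_TOTAL * 60        # plain 2D 1080p60
px_1080p30_3dfp = H_TOTAL * (V_TOTAL * 2) * 30  # frame-packed stereo 1080p30

print(px_1080p60_2d)    # 148.5 MHz pixel clock
print(px_1080p30_3dfp)  # identical: 148.5 MHz
```

So stereo 1080p30 would have fit in the existing 148.5 MHz budget, exactly as the rant argues.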


Another benefit of 3D vision was the ability to have nice smooth 120hz 1920x1080 resolution for normal computer use as well. I used to use 120hz in the old CRT days. It's really nice and a worthwhile upgrade in itself even without the 3D. None of that is possible with the current HDMI 1.4a implementation. Also, any future monitors coming out for 3D Vision that have hdmi 1.4a won't support 120hz over hdmi because it's not in the standard. To get 3D 1080p or 120hz, one will still need to use dual link dvi or displayport, which nvidia barely supports.


That's the other problem with nvidia. They REFUSE to push displayport. Possibly, if Nvidia had started to use displayport on all their reference graphics cards, then it would have caught on a lot faster. It might even have become standard to put a displayport input on TVs. This was a huge mistake on Nvidia's part. Displayport would have worked perfectly for 120hz 1920x1080p even with the old spec, so they could have used it for 3D Vision. There was also no disadvantage to adding displayport to all of their graphics cards. First of all, Displayport has no licensing charges for manufacturers. It is much cheaper to add displayport than hdmi.


DVI and HDMI are basically the same. The proper port configuration of a video card today should be, at a minimum, 1x displayport, 1x hdmi, and 1x dvi. ATI uses 1x displayport, 1x hdmi and 2x dvi. But the 2x dvi takes up venting room on the second slot. The second DVI is not necessary to include on a modern card though, because a very cheap HDMI to DVI converter or cable can make the 1x hdmi become a second DVI perfectly. All nvidia must do is supply the converter with their graphics cards. Then the consumer has the choice of how to connect everything: 2x DVI, 1x DVI and 1x HDMI, or 1x displayport and 1x HDMI or DVI. It's all possible. With Nvidia's current implementation, the consumer has the choice of HDMI or DVI (which is essentially just HDMI), i.e. no choice at all. Of course, it would also be nice to get eyefinity functionality on nvidia without having to buy two cards. Displayport also makes it easier to have eyefinity functionality, but that's for another discussion.


So ATI, Apple and Dell were already way ahead of the curve by supporting displayport. Maybe if nvidia had also supported displayport, it would have tipped the scales enough to either:

1. get a TV manufacturer to use displayport as one of the TV inputs.

2. pushed HDMI members to have more bandwidth/better standards for 3D in hdmi 1.4a.


Instead we have the mess we are in now. Displayport has all the bandwidth necessary for whatever is needed - even 3D at 2560x1600. But no HDTVs and very few monitors use it, because the only things outputting displayport are ATI cards and a few non-reference nvidia cards - oh, and some apple stuff. I was rather surprised that even the latest and greatest reference Nvidia 580gtx still lacks displayport. Maybe in 6 months we'll get a non-reference card which supports it. The really pathetic part is that Nvidia supports displayport at the chip level and in its drivers already. But it totally failed by not making it part of the reference board. It's a small port - it should be standard on all graphics cards. At the very least, having such a high bandwidth port available everywhere puts pressure on HDMI and other standards to compete. Even if displayport ultimately fails completely, it would have been to nvidia's advantage to include it because it would have given Nvidia more bargaining power in the future 3D market.


It seems fairly obvious that all manufacturers involved are playing a game of "Manufactured Obsolescence." I guess, if I want to play with 3D Vision, I can play on a giant 24" monitor with Dual Link DVI. Or I can pay $2000 now for a brand new "obsolete" 46" TV and use nvidia 3DTV to play games at an amazing 1280x720 resolution. Wow.


I can only hear Sony, Samsung, Sharp and Toshiba in a couple years: "Everybody needs to upgrade EVERYTHING again because we forgot to include 3D 1080p30 in the HDMI specification!!" Whoops! This was a huge failure by all manufacturers concerned, and the consumer lost another round.


All of this could have been avoided if displayport had been adopted. Of course, at any time, any TV manufacturer could put displayport on their TV and that would allow the full resolution we need for 3D and 120hz. Certainly AMD would be happy because Radeon would get a quick boost in the new 3D market. Nvidia would probably also be happy because they would have a great excuse to finally add displayport to a reference design, and then nvidia fans would need to fork out for a new graphics card. It's all about making money, isn't it? But it's very unlikely Sony or any TV manufacturer will bother with displayport now. HDMI has too much momentum and marketing money.


The mess could also have been avoided if HDMI 1.4a included at least Stereo 1080p 30 which would have been good enough as a stop gap. It was entirely technically possible without any extra effort. There would still be plenty of people upgrading from their HDTV to 3D or even OLED etc in a couple years, but people buying 3D now at least wouldn't feel totally shafted in the near future. I thought we were over the 720p hump already! If these giant corporate entities can try to look out for the consumer a bit more, that would actually benefit everyone because more momentum would be built in general for 3D and 120hz. People wouldn't always feel like they are being tricked.


It certainly would have been really nice if the HDMI gang had adopted the full 120hz 1080p resolution of 3D vision into HDMI. Of course, HDMI itself would need a bandwidth upgrade. But if there's a giant corporate push to start something big like "3D", then it makes sense to do it CORRECTLY and make SURE the bandwidth is there to implement 3D PROPERLY and with true 120hz. It also should be able to integrate EASILY with computers, because everyone has computers now. All they really had to do was make a new dual link HDMI standard - everything would have slotted into place perfectly! Instead, we get this half-baked, non-120hz, frame packed standard that needs $40 drivers from nvidia to use with a computer. Messy standards like that will not be easy to undo when they are embedded in millions of blu-ray players and graphics cards. It would have made a lot more sense to POSTPONE the release of all this 3D stuff until they got the standard right... I guess it's too late for that now. The comical part is that TVs everywhere were already pushing a fake 120hz frame interpolation and even fake 240hz frame interpolation. But apparently, the rich corporations couldn't just make an hdmi standard with true 120hz, or use displayport which already had it?? Sure - have them roll out a whole new standard that we'll use for a long time that is totally FLAWED, just so they can sell us a slightly less flawed version next Christmas!


I just want to repeat what should have been the proper solution for the new 3D standard:

Either use Displayport 1.2 or use a new Dual Link HDMI standard which is compatible with Dual Link DVI.

At the very least, make HDMI 1.4a standard have 1080p30 for each eye a requirement for the sake of all the gamers in the world.


I guess all of these companies are competing in a cut-throat corporate world. Oh, the joys of capitalism!


My rant is over. I don't think writing this made me feel any better.

What is wrong with regular, frame sequential FULL HD 1080p 60Hz?

Works fine on my DLP TV and it is not even 3D ready.

Works fine with games running 30fps per eye.

60 Hz frame sequential 3D movies play just fine using a regular Blu-ray player.

HDMI 1.3 is just fine for FULL HD 3D 60Hz frame sequential stereoscopic content.


Mathew Orman
 
#55 ·

Quote:
Originally Posted by orbidia /forum/post/19533052


... The problem really comes down to the competition between Displayport and HDMI. ...

You are right with that one. But there are larger politics involved. There is the CEA 861 standard, which defines the video timings, audio formats and metadata such as AVI InfoFrames (these tell your TV, e.g., what color space and aspect ratio are being used). HDMI did not define these specifics in their standard, which only describes the physical interface and data transfer, such as audio data embedding.


HDMI is not the only display interface standard using CEA 861. DisplayPort also uses this, and Diiva as well.


CEA was working on 3D extensions for the 861 standard.


Now, it appears that HDMI felt the heat from the DisplayPort competition. So, they went ahead of CEA and snatched the 3D feature by implementing their own video timings and InfoFrames for 3D, and also threw in the 4K resolutions.


This means, 3D is currently only defined for HDMI, but not for DisplayPort or any other competing interface. Which gives HDMI a competitive advantage.


It makes the technical implementation more awkward: related timings are defined in two different specs, and signaling of content formats is done in two complementary InfoFrames (CEA 861 AVI and HDMI VSI). And it has put CEA in the position of having to cancel their own 3D implementation, since no HDMI developer wants two competing standards for that.


By the way, once more of the new HDMI 1.4 chips with HEAC become available, you will see more devices supporting the 297 MHz pixel clock necessary for 1080p60 3D and 1080p120. The timings are already defined; the hardware just has to follow now.
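That 297 MHz figure is easy to verify (a quick sketch: 2200x1125 is the standard 1080p total timing including blanking, and frame packing doubles the vertical total).

```python
# Frame-packed 1080p60: double the vertical total of a 2200x1125
# 1080p frame, at 60 frames per second.
pixel_clock = 2200 * (1125 * 2) * 60
print(pixel_clock)  # 297,000,000 - the 297 MHz the new chips support
```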
 
#56 ·
Display Port has claimed Full HD 3D support since DP1.1, and DP1.2 pushed it even beyond. What exactly is not working with their implementation?
 
#57 ·

Quote:
Originally Posted by BlackShark /forum/post/19535320


Display Port claimed FullHD 3D support since DP1.1 and DP1.2 pushed it even beyond. What exactly is not working with their implementation ?

I have to admit I do not know much about the Display Port specs. Is their 3D comparable to HDMI 3D, in terms of supported formats and such? I assume their 3D formats and signaling is not compatible with the HDMI 3D.
 
#58 ·
I know DP1.1 specified only emitting frame sequential with some labelling system that identified the frames as left and right.

DP1.2 extended the 3D features to support more outputs with lots of different formats like side by side and others but I don't know the actual list of formats, if some are mandatory or not. DP did not specify that in their press release.

You'd have to look through the actual specification documents to know.
 
#59 ·
What we should have aside, it's good to know how to deal with what we do have.

With the current limited HDMI standards, should we be gaming at 720p?


By my math, 3D gaming on the current 3DTVs is best done at 1080p side by side or top/bottom. Framepacked 720p has slightly fewer total pixels per eye:

1920x540 = 1,036,800

or

960x1080 = 1,036,800

or

1280x720 = 921,600
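The arithmetic above, collected in one place (a trivial sketch of the per-eye active-pixel counts):

```python
# Per-eye active pixels for the three 3D formats being compared
formats = {
    "1080p top/bottom (1920x540 per eye)":   1920 * 540,
    "1080p side-by-side (960x1080 per eye)":  960 * 1080,
    "720p frame-packed (1280x720 per eye)":  1280 * 720,
}
for name, pixels in sorted(formats.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {pixels:,}")
```

Both half-resolution 1080p formats carry 1,036,800 pixels per eye versus 921,600 for frame-packed 720p, a difference of about 12%.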
 
#60 ·

Quote:
Originally Posted by obveron /forum/post/19536336


What we should have aside, it's good to know how to deal with what we do have.

With the current limited HDMI standards, should we be gaming at 720p?


By my math, 3D gaming on the current 3DTVs is best done at 1080p side by side or top bottom. Framepacked 720p has slightly less total pixels.

1920X540=1,036,800

or

960x1080= 1,036,800

or

1280x720 = 921,600

I sincerely doubt you could ever see any difference between those two numbers.
 
#61 ·

Quote:
Originally Posted by Lee Stewart /forum/post/19536374

Quote:
Originally Posted by obveron /forum/post/19536336


What we should have aside, it's good to know how to deal with what we do have.

With the current limited HDMI standards, should we be gaming at 720p?


By my math, 3D gaming on the current 3DTVs is best done at 1080p side by side or top bottom. Framepacked 720p has slightly less total pixels.

1920X540=1,036,800

or

960x1080= 1,036,800

or

1280x720 = 921,600

I sincerely doubt you could ever see any difference between those two numbers.

I suppose that's true, but still it's good to know what's best. A lot of people are saying 720p is the best gamers can do right now.


Also, isn't 1080p SBS supported on older HDMI standards (pre 1.4)?

(not to mention 1080p checkerboard, which also beats framepacked 720p)

If so, the best standards for 3D gaming on TV so far were available before 1.4. That's definitely worth noting in this whole ordeal. HDMI 1.4 strictly brings better standards for movies and broadcast only.


Gamers with 1.3 HDMI AVRs might be interested to know this...
 
#62 ·
Indeed Checkerboard 1080p beats 720p frame packed.

Especially since checkerboard should use 5-point sampling to create the checkerboard out of a real stereo 1080p frame.
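A hypothetical sketch of the checkerboard interleave (the "5-point" quincunx filtering mentioned above is omitted; this only shows which pixels each eye contributes, so every other pixel in both directions comes from the opposite eye):

```python
# Build a checkerboard 3D frame from two full eye views.
# Pixels where (x + y) is even come from the left eye, odd from the
# right, so each eye keeps exactly half its pixels in a quincunx grid.

def checkerboard(left, right):
    """left/right: lists of rows (lists of pixels), same dimensions."""
    out = []
    for y, (lrow, rrow) in enumerate(zip(left, right)):
        row = [lrow[x] if (x + y) % 2 == 0 else rrow[x]
               for x in range(len(lrow))]
        out.append(row)
    return out

# tiny 4x4 demo frames
left  = [["L"] * 4 for _ in range(4)]
right = [["R"] * 4 for _ in range(4)]
mixed = checkerboard(left, right)
for row in mixed:
    print("".join(row))  # LRLR / RLRL / LRLR / RLRL
```

Applied to 1920x1080 views, each eye retains 1,036,800 samples, which is why checkerboard compares favorably with frame-packed 720p.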
 
#63 ·

Quote:
Originally Posted by obveron /forum/post/19536431


I suppose that's true, but still it's good to know what's best. A lot of people are saying 720p is the best gamers can do right now.

I know that for the PS3 and games off of a 3D BD, frame-packed 720p was chosen because that was the limit of the Cell BE's processing power. It can't do 1080p60. It just isn't powerful enough.

Quote:
Also, isn't 1080p SBS supported on older HDMI standards (pre 1.4)?

(not mention 1080p checkerboard which also beats framepacked 720p)

If so, the best standards for 3D gaming on TV , so far, were available before 1.4. That's definitely worth noting in this whole ordeal. HDMI 1.4 strictly brings better standards for movies and broadcast only.


Gamers with 1.3 HDMI AVRs might be interested to know this...

It isn't just the bandwidth, AFAIK. There is also the EDID to deal with.
 
#64 ·
This thread makes me both excited and worried. I've been thinking about buying a 3DTV because I can afford one now and 3D is awesome, but 1080p 60 didn't cross my mind. No way will I buy a TV now that will be obsolete in 2 years.



On the other hand, my career is on the rise and in two more years I will be able to afford a supremely massive 3DTV (projector or whatever) when 1080p 60 is supported.
 
#65 ·
All HDMI 1.4a 3D formats can be transmitted by any HDMI 3D transmitter chip, and can be received and passed through by any A/V receiver that passes through what it receives. However, only with the HDMI 1.4a frame-packed double-buffer 3D format can the audio content be removed and processed by an HDMI 1.4a receiver.
 
#66 ·
Quote:
Originally Posted by Lee Stewart
I sincerely doubt you could ever see any difference between those two numbers.
Also, for this kind of bandwidth calculation, you need to use the total timing size, including blanking. For 720p, this is 1650x750; 1080i/p have 2200x1125 (this is per frame, so there's 30 per second for 1080i, and 60 per second for 1080p): 1650*750*60=2200*1125*30, so 720p has exactly the same bandwidth requirement as 1080i.
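In code, the equality above is a one-liner to check (a trivial sketch of the same arithmetic):

```python
# With blanking included, 720p60 and 1080i30 occupy identical
# link bandwidth: total timing x frame rate.
bw_720p60  = 1650 * 750 * 60    # 720p total timing, 60 frames/s
bw_1080i30 = 2200 * 1125 * 30   # 1080i/p total timing, 30 frames/s
print(bw_720p60, bw_1080i30)    # both 74,250,000 pixels/s
```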
 
#67 ·
OK, but I was just referring to what LOOKS better on screen, regardless of bandwidth. 1080p SBS or T/B or checkerboard offer more resolution on screen than framepacked 720p. Checkerboard is particularly better from my understanding.


Those were available formats before HDMI 1.4. So in reality, 3D gamers shouldn't have had to wait for HDMI 1.4; it has offered nothing better.

What we had before 720p framepacking was just as good, if not better. This is why the Xbox 360 has not needed HDMI 1.4 for 3D: it just uses SBS, and that's not any worse than 720p framepacked, and SBS can also work over component, VGA, DVI, etc.


Now, what the consoles can actually render is a different story. Obviously both the PS3 and the 360 struggle to render 720p at decent frame rates in 2D, let alone 3D.

But that doesn't make this discussion moot.



As far as I can tell, PC gamers using 3DTV play might benefit slightly from using 1080p SBS rather than framepacked 720p.
 
#68 ·

Quote:
Originally Posted by obveron /forum/post/19538778


OK, but I was just referring to what LOOKS better on screen, regardless of bandwidth. 1080p SBS or T/B or checkerboard offer more resolution on screen than framepacked 720p. Checkerboard is particularly better from my understanding.


Those were available formats before HDMI 1.4. So in reality, 3D gamers shouldn't have had to wait for HDMI 1.4; it has offered nothing better.

What we had before 720p framepacking was just as good, if not better. This is why the Xbox 360 has not needed HDMI 1.4 for 3D: it just uses SBS, and that's not any worse than 720p framepacked, and SBS can also work over component, VGA, DVI, etc.


Now, what the consoles can actually render is a different story. Obviously both the PS3 and the 360 struggle to render 720p at decent frame rates in 2D, let alone 3D.

But that doesn't make this discussion moot.



As far as I can tell, PC gamers using 3DTV play might benefit slightly from using 1080p SBS rather than framepacked 720p.

PC games have no such limitation and can render at 4K UHD 3D stereo using dual Nvidia Quadro professional adapters.


Mathew Orman
 
#71 ·
The Mits 3D ready DLP TVs are 1080p 120Hz models and give you 960x1080 @ 60fps per eye when in 3D mode, and the TVs themselves have no video crosstalk since they are so fast.

However, glasses have video crosstalk problems. The new Xpand 103 glasses have been reported to have a lot less crosstalk than the Mits IR glasses in the Mits 3D kit.
 
#72 ·

Quote:
Originally Posted by walford /forum/post/19540953


The Mits 3D ready DLP TVs are 1080p 120Hz models and give you 960x1080 @ 60fps per eye when in 3D mode, and the TVs themselves have no video crosstalk since they are so fast.

However, glasses have video crosstalk problems. The new Xpand 103 glasses have been reported to have a lot less crosstalk than the Mits IR glasses in the Mits 3D kit.

I was referring to non 3D ready DLP TVs running FULL HD 1920x1080p at 60Hz or 30 Hz per eye.


Mathew Orman
 