Originally Posted by Thebarnman
Well, I do know that everything I watch (480i, 480p, 720p and 1080p) shows up on my TV as 1080p.
I'm talking about frame rate: 24fps shows up as 24fps, and 30fps shows up as 30fps.
However, I do not know if my TV is capable of DECODING a 1080p signal @ 60fps.
Makes me wonder. Is 1080p 60fps even one of the HDTV standards?
I guess my answer wasn't too clear because I didn't spell it out completely. A 1080i signal is a "30 based" signal (60 interlaced fields per second, which works out to 30 full frames), while "true 1080p" is a 60 based signal, i.e. 60fps. IF your TV supports an input of 1080p, then it supports 1080p(60). You will normally see a "rider" (a special condition) called out if the TV can also handle a 24fps input, because to display that properly it needs to do some special processing.
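If it helps to see the arithmetic, here's a quick back-of-the-envelope sketch in Python (purely my own illustration, not taken from any spec or manual) of the 30-vs-60 relationship and why 24fps content needs that extra handling on a 60Hz set:

Code:
# Rough sketch of why 1080i is "30 based", 1080p is "60 based",
# and 24fps film needs special treatment on a 60Hz display.

FIELDS_PER_SECOND = 60                  # 1080i: 60 interlaced fields per second
frames_1080i = FIELDS_PER_SECOND / 2    # two fields make one full frame -> 30 fps
frames_1080p = 60                       # 1080p carries 60 complete frames per second

print(f"1080i effective frame rate: {frames_1080i:.0f} fps")
print(f"1080p frame rate:           {frames_1080p} fps")

# Film is 24 fps. Without true 24p handling, a 60Hz set repeats frames
# in an alternating 3:2 pattern so the output still adds up to 60:
film_fps = 24
cadence = [3, 2] * (film_fps // 2)      # one repeat count per film frame, alternating 3 and 2
print(f"Repeats produced from 24 film frames per second: {sum(cadence)}")  # 60, matches 60Hz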
And yes, on your screen a 1080p input signal should show up as "1080p 60"; if the incoming source is interlaced, it will show up as "1080i 60", though exactly how the status of the incoming signal is displayed depends on the manufacturer. Ultimately the TV has to de-interlace it and display it on the screen as 1080p 60. So if you are seeing a picture from a 1080p source, try Blu-ray material, but not the movie itself (most movies are 24fps based, which you will see reported); use the menus and special features instead. If those display, the set is processing 1080p60 material and showing it.
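For what it's worth, here's a toy sketch (again Python, purely illustrative, and not how any particular set's video processor actually works; the "weave" name and variables are just mine) of the basic de-interlacing idea described above, pairing two 540-line fields back into one 1080-line frame before the panel shows it:

Code:
# Toy "weave" de-interlace: interleave a top field and a bottom field
# back into a single progressive frame.

def weave(top_field, bottom_field):
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)      # lines 0, 2, 4, ... come from the top field
        frame.append(bottom_line)   # lines 1, 3, 5, ... come from the bottom field
    return frame

# Stand-ins for two 540-line fields (shortened to 4 lines each here):
top = [f"top line {i}" for i in range(4)]
bottom = [f"bottom line {i}" for i in range(4)]
for line in weave(top, bottom):
    print(line)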