
1080i signal to 1080p HDTV?

post #1 of 16
Thread Starter 
I just built a new computer with HDMI and hooked it up to my new Denon 1712.

For some reason I have to use a 1080i signal on my TV to get the scaling correct; otherwise, at 1080p, my control center removes the scaling option and everything is too big for my screen.

My question is: will I cause any damage to my HDTV by sending it 1080i when it's a 1080p set?
post #2 of 16
Quote:
Originally Posted by petriebird View Post

I just built a new computer with HDMI and hooked it up to my new Denon 1712.

For some reason I have to use a 1080i signal on my TV to get the scaling correct; otherwise, at 1080p, my control center removes the scaling option and everything is too big for my screen.

My question is: will I cause any damage to my HDTV by sending it 1080i when it's a 1080p set?

No
post #3 of 16
No problem. No different than connecting a 1080i cable box.
post #4 of 16
Like everyone else is saying, nope. You could send it 480 and the TV wouldn't care at all.
post #5 of 16
Quote:
Originally Posted by petriebird View Post


My question is: will I cause any damage to my HDTV by sending it 1080i when it's a 1080p set?

You cannot cause damage this way. Even if you sent a resolution the TV doesn't support, you'd just get a "not supported mode" message.

A 1080p set can handle 480i/p, 720p, and 1080i/p.

A 720p set can handle all of those formats aside from 1080p.
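
Here's a minimal sketch of that behavior (the mode lists below are hypothetical examples, not read from any real set's EDID): an unsupported format just produces a message, never damage.

Code:
SUPPORTED_MODES = {
    "1080p TV": {"480i", "480p", "720p", "1080i", "1080p"},
    "720p TV":  {"480i", "480p", "720p", "1080i"},  # no 1080p
}

def tv_response(tv_type, input_mode):
    """Return what the set does with a given input format."""
    if input_mode in SUPPORTED_MODES[tv_type]:
        return tv_type + ": displays " + input_mode
    return tv_type + ": shows a 'mode not supported' message, no harm done"

print(tv_response("1080p TV", "1080i"))   # displays it
print(tv_response("720p TV", "1080p"))    # just the message, no damage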
post #6 of 16
Quote:
Originally Posted by PlasmaPZ80U View Post

You cannot cause damage this way. Even if you sent a resolution the TV doesn't support, you'd just get a "not supported mode" message.

A 1080p set can handle 480i/p, 720p, and 1080i/p.

A 720p set can handle all of those formats aside from 1080p.

But why then is he having problems with the 1080p signal from his HTPC? I am fairly sure my HTPC is outputting 1080p through my Denon 1712 receiver into my TV - and there is no problem handling it. I don't have the scaling issue as reported.
post #7 of 16
Quote:
Originally Posted by petriebird View Post

I just built a new computer with HDMI and hooked it up to my new Denon 1712.

For some reason I have to use a 1080i signal on my TV to get the scaling correct; otherwise, at 1080p, my control center removes the scaling option and everything is too big for my screen.

My question is: will I cause any damage to my HDTV by sending it 1080i when it's a 1080p set?

Quote:
Originally Posted by indio22 View Post

But why then is he having problems with the 1080p signal from his HTPC? I am fairly sure my HTPC is outputting 1080p through my Denon 1712 receiver into my TV - and there is no problem handling it. I don't have the scaling issue as reported.

If "everything is too big" for the screen, then the TV itself needs to be set to 1:1 or native mode to get rid of overscan. Yours is set correctly, the OP's is not.
post #8 of 16
Quote:
Originally Posted by mdavej View Post

If "everything is too big" for the screen, then the TV itself needs to be set to 1:1 or native mode to get rid of overscan. Yours is set correctly, the OP's is not.

Good point - I do have my TV set to the "Just Scan" option that is supposed to display 1:1 mapping. Also, I don't recall having any issues with my i3 on-chip graphics, but some dedicated video cards might require playing with settings to eliminate overscan.
post #9 of 16
No.
post #10 of 16
Overscan adjustments (back-porch timing, etc.) are needed only for analog video signals, unless you have a very poorly designed HDTV.

As an example, I have a Vizio 24" 1080p TV. When I connected my 1080p FHD laptop to it using VGA, there was a slight overscan: just enough of the Windows desktop was missing around all four edges to be annoying. When I switched to DisplayPort/HDMI, it became pixel-perfect. Now there is no overscan at all and I can see the entire desktop; it fits exactly within the TV's bezels. Although I previously had the impression that the VGA picture was excellent, the computer's graphics are now noticeably sharper, since the pixels sent from the computer exactly match the pixels on the screen.

Edited to add:
Just to clarify: VGA signals are analog, although you were probably aware of that. My laptop's HDMI connection gets to the TV by going through a Marantz NR1501 receiver, providing very good 5.0 surround sound. (I happen not to have a subwoofer in that room; it would have been too inconvenient due to space limitations. There's one on the system in the living room, though.)
post #11 of 16
Like analog Component, VGA is also pixel perfect. Had you adjusted overscan, there would have been practically no difference between HDMI and VGA.
post #12 of 16
Sorry, but you seem to be using a different meaning for "pixel perfect" than I am.

Analog video signals are too easily "smeared". The timing of the sampling by the display device's input ADCs does not necessarily match the timing of the signal generated by the originating device. Too often, careful tweaking of sync signals, in addition to accounting for cable lengths and their dispersion indexes, is needed to make sending and receiving devices agree. (E.g., sending VGA over Cat6 cabling is a pain, since the pairs have different wraps and thus different frequency responses.) This is especially true for component (YPbPr/YUV) video, since normally its color-difference signals have only half the resolution of its luminance signal.
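
As a toy illustration of that sampling-timing problem (a deliberately simplified model, not a real ADC): a hard black-to-white edge in a scanline gets averaged across neighbouring pixels when the sampling clock is offset by half a pixel.

Code:
import numpy as np

# One scanline from the source: a hard black-to-white edge.
source = np.array([0, 0, 0, 255, 255, 255], dtype=float)

def sample_with_phase_error(line, phase):
    """Resample the line as if the display's ADC clock were offset by 'phase' pixels."""
    positions = np.arange(len(line)) + phase
    return np.interp(positions, np.arange(len(line)), line)

print(sample_with_phase_error(source, 0.0))  # perfect timing: [0 0 0 255 255 255]
print(sample_with_phase_error(source, 0.5))  # half-pixel offset: the edge smears to 127.5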

In contrast, a digital video signal contains an explicit specification for each of the pixels that it transmits. There's never any question about what values have been assigned to which pixel. Unless, of course, there have been errors in the digital transmission. (How those pixel values get scaled onto a display device which has a digital resolution different from that of the signal is yet another problem, of course.)
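
And as a sketch of that separate scaling problem (assuming, just for illustration, a 1280x720 source mapped onto a 1920x1080 panel with nearest-neighbour scaling): the 1.5x ratio means some source pixels get doubled and others don't, which is part of why non-native signals can look uneven.

Code:
SRC_W, DST_W = 1280, 1920   # source width vs. panel width (1.5x ratio)

def source_column(dst_x):
    """Which source column feeds panel column dst_x, nearest-neighbour style."""
    return dst_x * SRC_W // DST_W

# The first nine panel columns map to source columns 0 0 1 2 2 3 4 4 5:
# every other source pixel is duplicated while its neighbours are not.
print([source_column(x) for x in range(9)])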
post #13 of 16
I agree analog can degrade over longer cable runs, but for short runs you'd be hard-pressed to see any difference. VGA can also handle higher resolutions than HDMI (not that any TVs can take advantage of it). If you can use HDMI, it's always perfect, so use it by all means. But if you must use VGA, you'll still get excellent results at reasonable distances.
post #14 of 16
Quote:
Originally Posted by mdavej View Post

I agree analog can degrade over longer cable runs, but for short runs you'd be hard-pressed to see any difference. VGA can also handle higher resolutions than HDMI (not that any TVs can take advantage of it). If you can use HDMI, it's always perfect, so use it by all means. But if you must use VGA, you'll still get excellent results at reasonable distances.

If you've seen CNET's results with PC inputs via VGA, you'll see some TVs can give excellent PQ and resolution via VGA, while others appear softer and may not show full resolution despite being set to 1080p.

http://reviews.cnet.com/flat-panel-t...wBody;continue

"PC: While it handled the full 1,920x1,080 resolution via its VGA input, the LND550 wasn't as good as the LG or Vizio in this department. We detected minor interference and softness in high-res test patterns--imperfect, but still better than the Sony overall.

Test: PC input resolution (VGA)
Result: 1920x1080
Score: Poor"
post #15 of 16
Thread Starter 
Quote:
Originally Posted by RollTide2011 View Post

No

Thank you
post #16 of 16
Thread Starter 
Quote:
Originally Posted by Selden Ball View Post

Sorry, but you seem to be using a different meaning for "pixel perfect" than I am.

Analog video signals are too easily "smeared". The timing of the sampling by the display device's input ADCs does not necessarily match the timing of the signal generated by the originating device. Too often, careful tweaking of sync signals, in addition to accounting for cable lengths and their dispersion indexes, is needed to make sending and receiving devices agree. (E.g., sending VGA over Cat6 cabling is a pain, since the pairs have different wraps and thus different frequency responses.) This is especially true for component (YPbPr/YUV) video, since normally its color-difference signals have only half the resolution of its luminance signal.

In contrast, a digital video signal contains an explicit specification for each of the pixels that it transmits. There's never any question about what values have been assigned to which pixel. Unless, of course, there have been errors in the digital transmission. (How those pixel values get scaled onto a display device which has a digital resolution different from that of the signal is yet another problem, of course.)

ummm.........

http://translate.google.com/

The link didn't help any.