
720p or 1080i on 1360 x 768 max resolution LCD ?

3731 Views 11 Replies 8 Participants Last post by  Nmlobo
**** I posted this in the PlayStation forum but couldn't get a decent reply ****


I haven't updated my PS3 yet, but from what I've read on the forum, it seems the PS3 can now play Blu-ray movies at 720p instead of 1080i. I have an Olevia 32" LCD with a max resolution of 1360 x 768.


Now the question is: should I select 720p as the highest resolution on the PS3 (previously 1080i was automatically selected by the PS3)? Will I see any difference in PQ? Secondly, about the new BTB and WTW feature: will turning them on along with 720p make a big difference?

Edit: I updated the firmware and played POTC 2 and Apocalypto at both resolutions, but I could not see any difference. Is that how it's supposed to be, or am I missing something?
The native resolution of your screen is 720p, so you should set everything to that for the clearest picture. Any 1080i source has to be scaled to fit a 720p screen, so that offsets the "higher resolution" of 1080i on a 720p display.

Quote:
Originally Posted by wallst32


The native resolution of your screen is 720p, so you should set everything to that for the clearest picture. Any 1080i source has to be scaled to fit a 720p screen, so that offsets the "higher resolution" of 1080i on a 720p display.

Well, not exactly. The native resolution is 1360x768. Even 720p sources will need to be scaled.


My guess is that no matter what resolution is used, the difference will be subtle, but that 720p will be better for games and 1080i will be better for movies.


I don't think the WTW and BTB information will make a big difference, but it can't hurt.
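To put numbers on the scaling point above, here is a minimal Python sketch (using the resolutions discussed in this thread) showing that both 720p and deinterlaced 1080i have to be resampled to fit a 1360x768 panel:

```python
# Scale factors needed to map common HD signal formats onto a
# 1360x768 panel. Neither 720p nor (deinterlaced) 1080i matches
# the panel exactly, so the TV must resample in both cases.
PANEL = (1360, 768)

SIGNALS = {
    "720p":  (1280, 720),
    "1080i": (1920, 1080),  # after deinterlacing to 1080 lines
}

for name, (w, h) in SIGNALS.items():
    sx = PANEL[0] / w
    sy = PANEL[1] / h
    direction = "upscale" if sy > 1 else "downscale"
    print(f"{name}: {sx:.4f}x horizontal, {sy:.4f}x vertical ({direction})")
```

So 720p is a slight upscale (about 1.06x) and 1080i is a larger downscale (about 0.71x); in neither case does the panel show the signal pixel-for-pixel.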
I'll have to try it on games, but for movies I did not see any difference between 1080i and 720p. Maybe I need to watch a little more than five minutes of a movie to see the difference.
My understanding is that HDTV standards are not resolution-based (i.e., pixel-based). HDTV standards are based on active vertical scanning lines, a 16:9 display ratio, and digital audio. You can extrapolate the horizontal pixels from 720 vertical scanning lines and a 16:9 aspect ratio, but that is not part of the standard. For example, 42" plasma TVs have a pixel structure of 1024x768 but are still considered 720p HDTVs. Resolutions and HDTV standards are related but not interchangeable.

Quote:
Originally Posted by juicius


My understanding is that HDTV standards are not resolution-based (i.e., pixel-based). HDTV standards are based on active vertical scanning lines, a 16:9 display ratio, and digital audio. You can extrapolate the horizontal pixels from 720 vertical scanning lines and a 16:9 aspect ratio, but that is not part of the standard. For example, 42" plasma TVs have a pixel structure of 1024x768 but are still considered 720p HDTVs. Resolutions and HDTV standards are related but not interchangeable.

That may be the case, but actual real life digital HDTV signals are made up of pixels. Discussing pixels is also probably the easiest way to understand them, since people are familiar with that way of thinking from their experience with computers.
Here is my .02


For a 720p signal, no deinterlacing is needed (which is a plus), but upscaling is still needed: the TV has to interpolate extra pixels to fit the 720-line picture into 768 lines.


A 1080i signal has to be deinterlaced and scaled down. In this case, though, the TV isn't inventing extra pixels; it's (potentially) just discarding some of the 1080 lines it doesn't need to fit the picture into 768.


The real-life difference between the two really depends on both the deinterlacing and scaling quality of the TV. I've probably oversimplified how a TV scales in the two examples above, but you get the idea.


In the end, I personally don't notice a difference at all on my Sharp 32D43U when connecting my XBOX360 via 720p or 1080i so I just use 1080i.
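The duplicate-versus-discard behavior described above can be sketched with a naive nearest-neighbor line mapping (a deliberate oversimplification, as the post itself notes; real scalers interpolate rather than repeat or drop whole lines):

```python
# Nearest-neighbor mapping of source scan lines onto a 768-line panel.
# Upscaling 720 -> 768 repeats some source lines; downscaling
# 1080 -> 768 leaves some source lines unused.
def line_map(src_lines, dst_lines=768):
    """For each panel line, pick the nearest source line."""
    return [round(i * src_lines / dst_lines) for i in range(dst_lines)]

up = line_map(720)     # 720p: every source line is used, some twice
down = line_map(1080)  # deinterlaced 1080i: some source lines never appear

print("repeated lines (720p):", 768 - len(set(up)))
print("unused source lines (1080i):", 1080 - len(set(down)))
```

With these numbers, 48 of the 768 panel lines are repeats when upscaling 720p, and 312 of the 1080 source lines go unused when downscaling; a real scaler blends neighboring lines instead, but the counts show why neither direction is a pixel-perfect fit.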

Quote:
Originally Posted by pdp76


Here is my .02


For a 720p signal, no deinterlacing is needed (which is a plus), but upscaling is still needed: the TV has to interpolate extra pixels to fit the 720-line picture into 768 lines.


A 1080i signal has to be deinterlaced and scaled down. In this case, though, the TV isn't inventing extra pixels; it's (potentially) just discarding some of the 1080 lines it doesn't need to fit the picture into 768.


The real-life difference between the two really depends on both the deinterlacing and scaling quality of the TV. I've probably oversimplified how a TV scales in the two examples above, but you get the idea.


In the end, I personally don't notice a difference at all on my Sharp 32D43U when connecting my XBOX360 via 720p or 1080i so I just use 1080i.

but aren't most 360 games output at 720p?


if they are, the 360 would have to upscale to 1080i and then your TV would have to scale it down to 768p...


or am I wrong here?

Quote:
Originally Posted by krabby5


but aren't most 360 games output at 720p?


if they are, the 360 would have to upscale to 1080i and then your TV would have to scale it down to 768p...


or am I wrong here?

Yeah, you're right if the game is only 720p. But if you have the HD DVD add-on, it might make a bigger difference. But this is all moot for me since I can't tell the difference anyway!

Quote:
Originally Posted by chrisherbert


Discussing pixels is also probably the easiest way to understand them, since people are familiar with that way of thinking from their experience with computers.

Not necessarily. I was involved in a discussion on another board with people who adamantly refused to believe a 1024x768 plasma HDTV (a 42" Vizio, in fact) could display 720p, since it only has 1024 horizontal pixels. They seemed to believe that if it wasn't 1280x720, it wasn't "720p." Well, I suppose that sucks for most LCD panels that are 1366x768, as well as most mid-size plasmas.
Some Olevia TVs have a "1:1" setting (via the "Aspect" button on the Olevia remote).


On that setting, with a 720p input, the TV passes it through without any scaling. (Since the panel is actually 1360x768, there will be thin black borders around the image: 24 pixels top and bottom, 40 pixels left and right.) In theory, that should be better quality than having the PS3 convert to 1080i and then having the Olevia convert back down to 720p (or 768p).


In practice, the scalers on 2007 products are pretty good, and any differences tend to be subtle.
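As a rough illustration of the 1:1 mode described above, the border sizes follow from simple arithmetic (a Python sketch; it assumes a 1280x720 frame shown unscaled and centered on a 1360x768 panel):

```python
# Black-border sizes when a 720p frame is displayed 1:1 (unscaled,
# centered) on a 1360x768 panel, as with Olevia's "1:1" aspect mode.
panel_w, panel_h = 1360, 768
img_w, img_h = 1280, 720

side_border = (panel_w - img_w) // 2  # left/right borders
top_border = (panel_h - img_h) // 2   # top/bottom borders

print(f"{side_border} px left/right, {top_border} px top/bottom")
```

That works out to 40-pixel borders on the sides and 24-pixel borders top and bottom, which is why the 1:1 image looks slightly letterboxed on all four edges.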

Quote:
Originally Posted by juicius


My understanding is that HDTV standards are not resolution-based (i.e., pixel-based). HDTV standards are based on active vertical scanning lines, a 16:9 display ratio, and digital audio. You can extrapolate the horizontal pixels from 720 vertical scanning lines and a 16:9 aspect ratio, but that is not part of the standard. For example, 42" plasma TVs have a pixel structure of 1024x768 but are still considered 720p HDTVs. Resolutions and HDTV standards are related but not interchangeable.

Correct: there are no display standards other than vertical scanning lines. The display manufacturers fought to be held only to the vertical line count. There are, however, format standards for each signal resolution.

The ATSC Standard defines the following formats:


Lines   Pixels/line   Aspect Ratio   Display Rates
1080    1920          16:9           60i, 30p, 24p
720     1280          16:9           60p, 30p, 24p
480     704           4:3 or 16:9    60p, 60i, 30p, 24p
480     640           4:3            60p, 60i, 30p, 24p


A "true" 1080 display should be able to show 1920 pixels per line, and a 720 display 1280.
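The lines-times-aspect-ratio extrapolation mentioned earlier can be checked against the format table with a short Python sketch (the format list is taken from the table above; the 480x704 entry is listed here at 16:9, one of its two allowed ratios):

```python
from fractions import Fraction

# ATSC video formats: (active lines, pixels per line, aspect ratio).
# For the square-pixel HD formats, pixels-per-line = lines * 16/9;
# where that equality fails, the format uses non-square pixels.
ATSC_FORMATS = [
    (1080, 1920, Fraction(16, 9)),
    (720,  1280, Fraction(16, 9)),
    (480,  704,  Fraction(16, 9)),  # also broadcast at 4:3
    (480,  640,  Fraction(4, 3)),
]

for lines, pixels, aspect in ATSC_FORMATS:
    square = (lines * aspect == pixels)
    print(f"{lines} lines x {pixels} px: square pixels = {square}")
```

The two HD formats come out square-pixel (720 x 16/9 = 1280, 1080 x 16/9 = 1920), while 480x704 at 16:9 does not, which is the same reason a 1024x768 plasma can carry a 720p signal: the standard counts lines, not a fixed pixel grid.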