

·
Registered
Joined
·
998 Posts
Discussion Starter #1
What is 720p? 1024 x 768 or 1366 x 768?


LCDs and plasmas with both 1024 and 1366 say they are 720p. Which one is the true 720p?


I know 1366 is a better resolution, but if a 720p broadcast is 1024x768, then does the extra resolution make for a better picture?


Does 1366 offer a significant advantage when watching 1080?


Or would I be better off looking at brightness, contrast, and other specs, and not worrying about 1024 vs 1366?
 

·
Registered
Joined
·
836 Posts
720p material is 1280x720 resolution. So of those two, a 1366x768 panel is the only one capable of displaying the full 720p resolution.
 

·
Registered
Joined
·
7 Posts
720p is 1280x720. A set with 1366x768 will have to slightly upscale in both directions when you feed it a 720p signal. A set with 1024x768 will upscale the horizontal lines, and downscale the vertical lines, so you'll lose some detail.
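To put numbers on the scaling directions, here is a quick Python sketch (just the arithmetic, not any set's actual scaler):

```python
# Per-axis scale factors when a 1280x720 signal is mapped onto each panel
src_w, src_h = 1280, 720

for panel_w, panel_h in [(1366, 768), (1024, 768)]:
    h_scale = panel_w / src_w   # >1 means upscale, <1 means downscale
    v_scale = panel_h / src_h
    print(f"{panel_w}x{panel_h}: horizontal x{h_scale:.3f}, vertical x{v_scale:.3f}")

# 1366x768: horizontal x1.067, vertical x1.067  (slight upscale both ways)
# 1024x768: horizontal x0.800, vertical x1.067  (downscaled across, upscaled vertically)
```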
 

·
Registered
Joined
·
998 Posts
Discussion Starter #4

Quote:
Originally Posted by fugiot


720p material is 1280x720 resolution. So of those two, a 1366x768 panel is the only one capable of displaying the full 720p resolution.

I know this is going to be a stupid question. Why don't they make 720p TVs that display in 1280x720? Why undershoot and then overshoot it? What happens with the extra pixels in 1366? Is the picture stretched out?


My second question: Is there a significant difference in picture quality between 1024 and 1366?
 

·
Registered
Joined
·
3,315 Posts

Quote:
Originally Posted by szupan


720p is 1280x720. A set with 1366x768 will have to slightly upscale in both directions when you feed it a 720p signal. A set with 1024x768 will upscale the horizontal lines, and downscale the vertical lines, so you'll lose some detail.

What a confusing way to say that.

Vertical lines?

It sounds like you have it backwards. It upscales the vertical pixels and downscales the horizontal pixels.
 

·
Registered
Joined
·
3,315 Posts

Quote:
Originally Posted by doogiehowser


I know this is going to be a stupid question. Why don't they make 720p TVs that display in 1280x720? Why undershoot and then overshoot it? What happens with the extra pixels in 1366? Is the picture stretched out?


My second question: Is there a significant difference in picture quality between 1024 and 1366?

Pioneer did for quite a while.


I think plasma makers realized they could maximize resolution within the practical limits of signals in the real world. There are many resolution limiters out there.

I think I do notice the small difference but I doubt most would. People are more sensitive to the vertical resolution, I believe.
 

·
Registered
Joined
·
194 Posts

Quote:
Originally Posted by Elemental1


Pioneer did for quite a while.


I think plasma makers realized they could maximize resolution within the practical limits of signals in the real world. There are many resolution limiters out there.

I think I do notice the small difference but I doubt most would. People are more sensitive to the vertical resolution, I believe.

I am with Doogie


Elemental, you seem to be very knowledgeable in all this, so please be patient with some of us.


But like Doogie, I see the 768 capability out there, yet stations put out 720 if I remember right, and I have not understood why the HDTV manufacturers use 768 instead of 720. Do the manufacturers use the extra 48 lines in some magic way, or did they go to 768 because there are sources that output 768, so they have to go to the extra effort of translating 720 to their panel's 768?


Maybe a better question is: what are the common output resolutions of the various sources, such as television stations (SD and HD), satellite, DVD players, Blu-ray, etc.?


I presume the HDTV manufacturers must look at all the possible and common inputs from those sources, design accordingly, and then write software/firmware that translates those input resolutions to the type of panel they design and sell.


It seems a simple table would be of real help so those of us who are not at your level could learn.


thanks
 

·
Registered
Joined
·
1,302 Posts
I think there is just too much to take in easily. Heck, my first plasma was a 16:9 HDTV, yet it had a resolution of 1024x1024. That means it had rectangular pixels.


Now add in 1080i broadcasts, which really have a resolution of 540 lines per field, except it is interlaced, so the TV has to 'somehow' put the signal back together and then scale it to 720p, which is really 768p. Ahhh... forget everything I said. It makes no sense.


I do remember years ago when everyone (including me) was trying to hook up their computers to these HD monitors, and it wasn't that easy. Once they started coming out with these 1366x768 monitors, it was much easier... except for that stupid 1366, which video cards don't easily do, so now many monitors also support 1360x768. Uh oh... there I go again.
 

·
Registered
Joined
·
1,557 Posts
480x480 - DirecTV and Dish Network SD

720x480 - DVD (widescreen discs are the same 720x480; a flag just tells the player to show wider pixels)

1280x720 - 720p stations like ESPN and ABC and FOX

1280x1080 - HDLite from DirecTV and Dish Network

1920x1080 - Full HD from most broadcast stations and cable providers and some Dish channels (I think they still have it). Also, all HD-DVD and Blu-Ray discs are at this resolution.


Everything is scaled on most TVs, because it is much harder to build a circuit that bypasses the scaler chip.


I think some 1366x768 sets actually show 720p pixel-for-pixel inside a black border as a special case, but you don't see it on a projector because only the 1280x720 pixels land on the screen (the black border falls in the overscan area).
 

·
Registered
Joined
·
2,540 Posts
There are a few things going on here.


The two HDTV resolutions are 1280x720 and 1920x1080.


First, manufacturers use the term "720p display" in their marketing literature loosely and incorrectly. These days, they tend to use it for anything that isn't a "1080p display". So let's put that one to rest first.


The native resolution of a 1366x768 panel is not 720p. If anything, it is 768p, since all input is scaled to the 768 lines. But, of course, 768p is not a resolution that is used in the source material. Only 720p and 1080i/p are used. We should count 480i/p also, for SD material and DVDs.


In the case of plasma, it is fiendishly difficult to make plasma panels with small pixels. This is why 1080p plasmas are so long in coming and why 42 inch plasma panels are 1024x768.


How can 1024x768 be HD, you ask. Many dozens of people have asked that over and over here in the forum. The answer is, the Consumer Electronics Association has decided that any display with at least 720 lines can be called an HDTV and have the logo. There is some basis in fact for this because the human eye is more sensitive to vertical resolution than to horizontal resolution. This has been taken advantage of for years in NTSC because the horizontal resolution of NTSC is really poor. It is also taken advantage of in so-called HD-lite where the video is resampled to 1440x1080 on things like satellite, or maybe even shot with a 1440x1080 camera instead of the 1920x1080 camera you thought the network was using, and you probably didn't even notice.


We have had intense wars about whether that CEA definition is legitimate. Believe what you will, and if you don't think 1024x768 is HD, buy something else. Vote with your wallet. Simple as that.


In the case of LCD, it is difficult to make large pixels. This is one reason why 1366x768 has been used. Also, please do not forget that a lot of HD is in 1080i, so 1920x1080 has to be downscaled to 1366x768. Conversely, 1280x720 has to be upscaled to 1366x768.


To those who say, but wouldn't it be a good idea if the panel were 1280x720 so at least one of the resolutions wouldn't have to be scaled, I say nope! For one thing, you'd be downscaling 1920x1080 all the way down to 1280x720, so you'd lose a lot of resolution on the very format that is supposed to be giving you lots of detail, which is also what most channels broadcast in. For another thing, the extra resolution is really helpful on things like diagonal lines, where you can smooth out the line. If you were in 1280x720 on a 32 or 37 inch display, you'd probably be complaining that the display looks all blocky and pixelated because the pixels are just too big and the diagonal lines have too much stairstepping.
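For a rough feel of how much of a 1080 frame each panel can hold, the raw pixel counts (plain arithmetic, nothing vendor-specific):

```python
# Total pixel counts as a fraction of a full 1920x1080 frame
full_hd = 1920 * 1080

for w, h in [(1920, 1080), (1366, 768), (1280, 720), (1024, 768)]:
    print(f"{w}x{h}: {w * h:,} pixels ({w * h / full_hd:.0%} of full HD)")

# 1920x1080: 2,073,600 pixels (100% of full HD)
# 1366x768: 1,049,088 pixels (51% of full HD)
# 1280x720: 921,600 pixels (44% of full HD)
# 1024x768: 786,432 pixels (38% of full HD)
```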


Turns out you can get 1280x720, but you have to go all the way down to 22 and 23 inches, where it works nicely.


Oh, and one other thing that comes up over and over again. The resolution of 1080i is not "540p", it is 1920x1080, dang it. If you were to shoot a still life, you would have 1920x1080 worth of pixels. It's simply that you have to wait 1/30 second rather than 1/60 second to get all the pixels, if it's a live shot. If it's film, it was shot at a mere 24 frames per second, so inverse telecine (undoing the 3:2 pulldown) done correctly gives you back your original movie frames.
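A tiny sketch of that point, using a hypothetical still frame and numpy just to keep the indexing readable: weaving the two 540-line fields of a static shot gives back every one of the 1080 lines.

```python
import numpy as np

# Hypothetical still frame: 1080 lines x 1920 pixels of luma
frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)

# Interlaced transmission sends it as two 540-line fields, 1/60 second apart
top_field    = frame[0::2]      # lines 0, 2, 4, ... (540 lines)
bottom_field = frame[1::2]      # lines 1, 3, 5, ... (540 lines)

# Weave deinterlacing: interleave the fields back together
rebuilt = np.empty_like(frame)
rebuilt[0::2] = top_field
rebuilt[1::2] = bottom_field

# For a static shot nothing is lost -- the full 1920x1080 comes back
assert np.array_equal(rebuilt, frame)
```

Motion is where opinions diverge: if the scene changes between fields, the two fields no longer describe the same instant, and the deinterlacer has to interpolate.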
 

·
Registered
Joined
·
3,655 Posts
What about scaling to easy multiples (1.5 x up or down from 720, for example) rather than non-multiples? (I'm no math major, so I apologize if this is not expressed correctly)


480 x 1.5 = 720


1080/720 = 1.5


768/720 = 1.066666666666666


1080/768 = 1.40625


Would not the scaler work less hard with a 1.5 conversion? Just asking. And also, while there are few, if any, flat panels at 1280x720, are there not a number of RPTVs at that resolution? Is there a reason why it is easier (if it is easier) to do so with RPTVs?
 

·
Registered
Joined
·
578 Posts
Yes, in one sense it's easier to build the very simplest of scalers if the ratio is 1.5 (1080/720), but these days we want better results than can be had from such simple scalers. Once a designer bites the bullet and implements a more sophisticated scaling algorithm, it's not nearly so compelling to keep to a nice ratio like 1.5. This path was followed precisely with the original DVD standard, where 720 x 480 pixels are scaled down to 720 x 360 pixels when the DVD player letterboxes a 16:9 aspect ratio program for a 4:3 aspect ratio display. In that case, it was a nice 4 vs 3, and the very early players simply discarded every 4th line to make a really gross downscaler. Nowadays, no manufacturer does it that way. To do it that way for 1080/720, you just throw away every 3rd line. The result from that method is not so great, introducing aliasing artifacts, deinterlacing problems, etc. This can be prevented by implementing a more sophisticated scaler that uses all of the source lines to produce the final downscaled output. As I said, once you use all of the lines to produce the final output, it's not such an advantage to keep a simple ratio of two small integers.
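A toy illustration of the two approaches described above for a 1080-to-720 downscale (line counts only, not any real chip's filter):

```python
import numpy as np

# Pretend this is one column of 1080 luma samples
src = np.arange(1080, dtype=np.float64)

# Crude scaler: throw away every 3rd line (1080 -> 720, a third of the input ignored)
keep = np.ones(1080, dtype=bool)
keep[2::3] = False
cheap = src[keep]

# Better: use every source line, e.g. blend each group of 3 lines into 2
groups = src.reshape(-1, 3)                    # 360 groups of 3 lines
better = np.column_stack([
    (2 * groups[:, 0] + groups[:, 1]) / 3,     # output line biased toward input line 0
    (groups[:, 1] + 2 * groups[:, 2]) / 3,     # output line biased toward input line 2
]).ravel()

print(cheap.shape, better.shape)               # (720,) (720,) -- same size, different quality
```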


Now, as to why 768? Well, 768 happens to be a fairly nice-ish binary number that looks like this:


1100000000


vs 720 which looks like this:


1011010000


720, in binaryland where chip designers work, is not nearly so nice a round number as 768. Chip designers grumble a lot less when presented with 768 than they do when presented with a not-so-round number like 720. Once the designer chooses an "anything" to "anything" scaler architecture, it greatly simplifies things if the output "anything" is actually constrained to a nice round binary number like 768. It's a lot more trouble if the output resolution needs to be 720.


There are lots of other round-number issues with HDTV. Example: MPEG actually likes things in multiples of 16, which 1920 x 1080 most certainly is not! 1080 / 16 = 67.5, yech! That's why the broadcast MPEG stream for a 1080i program is actually encoded as 1920 x 1088. There are lots of computer video cards with round-number issues too. Many of them cannot handle vertical or horizontal resolutions that aren't a nice multiple of 8, of which 1366 most certainly is not (1366 / 8 = 170.75).
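Those round-number claims are easy to check in a few lines of Python:

```python
# The binary shapes of 768 vs 720
print(format(768, '010b'))    # 1100000000  (512 + 256)
print(format(720, '010b'))    # 1011010000  (512 + 128 + 64 + 16)

# MPEG wants the line count in multiples of 16
print(1080 / 16)              # 67.5 -> not a multiple, so the stream pads up...
print(68 * 16)                # 1088 -> ...to the 1920x1088 frame mentioned above

# And 1366 is not a multiple of 8, while 1360 is
print(1366 % 8, 1360 % 8)     # 6 0
```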


Once it's been chosen to have 768 lines, the 16:9 display aspect ratio dictates the pixel count per line, and it doesn't come out to a whole number: 768 x 16/9 is about 1365.33, which gets rounded to 1366. So 1366 x 768 isn't precisely a 16:9 aspect ratio, assuming that pixels are isometric (square). It hasn't always been of much concern to have isometric pixels. Only with the more recent popularity of connecting PCs to HDTV displays has it really mattered, since few PC designs allow for undistorted display of shapes on a device with anisometric pixels. So, if you want to connect up a PC for non-movie use, you will struggle if your display does not have isometric pixels.


J.J.
 

·
Registered
Joined
·
1,153 Posts
To answer your original question, they went to x768 because it is more compatible with common computer resolutions such as 1024x768.


This makes the displays more useful as computer monitors while still being able to display the full 720p signal.


Some LCD projectors like the Sony VW-10ht were also 1366x768 back in their day.


-Allen
 

·
Registered
Joined
·
998 Posts
Discussion Starter #14
Thanks to everyone who responded.


I only have one more question left. Will a 1366x768 set do a better job with 1080 than a 1024x768 set? Is it a HUGE difference?


If you had two sets next to one another, with everything else equal except 1366 vs. 1024, and they both had the same 1080 signal, would the 1366 look much better than the 1024? Or would they both look about the same?
 

·
Registered
Joined
·
9 Posts

Quote:
Originally Posted by Valence01


It hasn't always been of much concern to have isometric pixels. Only with the more recent popularity of connecting PCs to HDTV displays has it really mattered, since few PC designs allow for undistorted display of shapes on a device with anisometric pixels. So, if you want to connect up a PC for non-movie use, you will struggle if your display does not have isometric pixels.

J.J.

Excellent stuff J.J.


I am looking for a display that I can use split-screen (PC monitor/TV), 26 or 32", and have narrowed the field to the Samsung or the Sony. Can I determine whether they have isometric pixels or not? I have scoured the product detail pages, and there is no mention.

Any other comments from you or anyone else here are appreciated on my search.....


MD
 

·
Registered
Joined
·
2,540 Posts

Quote:
Originally Posted by doogiehowser


Thanks to everyone who responded.


I only have one more question left. Will a 1366x768 set do a better job with 1080 than a 1024x768 set? Is it a HUGE difference?


If you had two sets next to one another, with everything else equal except 1366 vs. 1024, and they both had the same 1080 signal, would the 1366 look much better than the 1024? Or would they both look about the same?

The problem with your question is that everything else isn't going to be equal. In all probability, you are actually choosing between a 1366x768 LCD and a 1024x768 plasma. In that case, the LCD/plasma differences (brightness, contrast, glare resistance) swamp the resolution difference.


Answer this question for yourself by going to a store and looking at the two side by side.
 

·
Registered
Joined
·
2,540 Posts

Quote:
Originally Posted by Macdaddie


Excellent stuff J.J.


I am looking for a display that I can use split-screen (PC monitor/TV), 26 or 32", and have narrowed the field to the Samsung or the Sony. Can I determine whether they have isometric pixels or not? I have scoured the product detail pages, and there is no mention.

Any other comments from you or anyone else here are appreciated on my search.....


MD

You can always determine for yourself whether the pixels are isometric (square). On a 16:9 screen, if the native resolution ratio is 16:9, they are; if it is not 16:9, they are rectangular. 1024x768 is probably the most common rectangular-pixel case. This rule doesn't apply to LCD computer monitors, which are often physically 16:10, 5:4, or other ratios and still use square pixels.
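A hypothetical little helper (the function and the 16:9 screen assumption are mine, not from any spec sheet) that turns that rule into numbers:

```python
def pixel_aspect(native_w: int, native_h: int, screen_ratio: float = 16 / 9) -> float:
    """Width/height of one pixel on a screen of the given shape; 1.0 means square."""
    return screen_ratio / (native_w / native_h)

for w, h in [(1920, 1080), (1366, 768), (1280, 720), (1024, 768), (1024, 1024)]:
    print(f"{w}x{h}: pixel aspect {pixel_aspect(w, h):.4f}")

# 1920x1080 -> 1.0000 (square)
# 1366x768  -> 0.9995 (square for all practical purposes)
# 1280x720  -> 1.0000 (square)
# 1024x768  -> 1.3333 (wide, rectangular pixels)
# 1024x1024 -> 1.7778 (very wide, like the 1024x1024 plasma mentioned earlier)
```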
 

·
Registered
Joined
·
474 Posts
Easiest way to know it's not: watch 720p source material, set the 768p set to 1:1 pixel mapping, and notice the black bars all the way around. It must be zoomed to fill the screen. On the other hand, set a computer to 1360x768 at 1:1 and notice the full screen.
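The size of that border is just arithmetic, assuming a 1366x768 panel:

```python
panel_w, panel_h = 1366, 768
src_w, src_h = 1280, 720

print((panel_w - src_w) // 2)   # 43 unused columns on each side
print((panel_h - src_h) // 2)   # 24 unused rows top and bottom
```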


And please stop the 1080i lies. Theoretically, 1080i is almost, ALMOST showing you full 1920x1080. The human eye can detect the flicker that results from the alternating fields. You are not seeing the full resolution. You think you are during slower scenes, where the "wobbling" is almost undetectable, but just wait until the QB rifles one downfield. You are seeing 540p several times per second. That's the same wobble we've seen all our lives. Now that it's 1080i, it seems not to count for some reason, but when it's 480i vs 480p it's the same thing. Insanity. There's a reason one has the i and one has the p, and one is more expensive than the other.
 

·
Registered
Joined
·
3,655 Posts

Quote:
Originally Posted by Valence01


Yes, in one sense it's easier to build the very simplest of scalers if the ratio is 1.5 (1080/720), but these days we want better results than can be had from such simple scalers. Once a designer bites the bullet and implements a more sophisticated scaling algorithm, it's not nearly so compelling to keep to a nice ratio like 1.5. This path was followed precisely with the original DVD standard, where 720 x 480 pixels are scaled down to 720 x 360 pixels when the DVD player letterboxes a 16:9 aspect ratio program for a 4:3 aspect ratio display. In that case, it was a nice 4 vs 3, and the very early players simply discarded every 4th line to make a really gross downscaler. Nowadays, no manufacturer does it that way. To do it that way for 1080/720, you just throw away every 3rd line. The result from that method is not so great, introducing aliasing artifacts, deinterlacing problems, etc. This can be prevented by implementing a more sophisticated scaler that uses all of the source lines to produce the final downscaled output. As I said, once you use all of the lines to produce the final output, it's not such an advantage to keep a simple ratio of two small integers.


Now, as to why 768? Well, 768 happens to be a fairly nice-ish binary number that looks like this:


1100000000


vs 720 which looks like this:


1011010000


720, in binaryland where chip designers work, is not nearly so nice a round number as 768. Chip designers grumble a lot less when presented with 768 than they do when presented with a not-so-round number like 720. Once the designer chooses an "anything" to "anything" scaler architecture, it greatly simplifies things if the output "anything" is actually constrained to a nice round binary number like 768. It's a lot more trouble if the output resolution needs to be 720.


There are lots of other round-number issues with HDTV. Example: MPEG actually likes things in multiples of 16, which 1920 x 1080 most certainly is not! 1080 / 16 = 67.5, yech! That's why the broadcast MPEG stream for a 1080i program is actually encoded as 1920 x 1088. There are lots of computer video cards with round-number issues too. Many of them cannot handle vertical or horizontal resolutions that aren't a nice multiple of 8, of which 1366 most certainly is not (1366 / 8 = 170.75).


Once it's been chosen to have 768 lines, the 16:9 display aspect ratio dictates the pixel count per line, and it doesn't come out to a whole number: 768 x 16/9 is about 1365.33, which gets rounded to 1366. So 1366 x 768 isn't precisely a 16:9 aspect ratio, assuming that pixels are isometric (square). It hasn't always been of much concern to have isometric pixels. Only with the more recent popularity of connecting PCs to HDTV displays has it really mattered, since few PC designs allow for undistorted display of shapes on a device with anisometric pixels. So, if you want to connect up a PC for non-movie use, you will struggle if your display does not have isometric pixels.


J.J.

Thank you for the detailed answer that even a non-math guy like me can understand.
 

·
Registered
Joined
·
836 Posts

Quote:
Originally Posted by rantanamo


Easiest way to know it's not: watch 720p source material, set the 768p set to 1:1 pixel mapping, and notice the black bars all the way around. It must be zoomed to fill the screen. On the other hand, set a computer to 1360x768 at 1:1 and notice the full screen.


And please stop the 1080i lies. Theoretically, 1080i is almost, ALMOST showing you full 1920x1080. The human eye can detect the flicker that results from the alternating fields. You are not seeing the full resolution. You think you are during slower scenes, where the "wobbling" is almost undetectable, but just wait until the QB rifles one downfield. You are seeing 540p several times per second. That's the same wobble we've seen all our lives. Now that it's 1080i, it seems not to count for some reason, but when it's 480i vs 480p it's the same thing. Insanity. There's a reason one has the i and one has the p, and one is more expensive than the other.

So according to this, when I'm watching my old 480i Trinitron, I'm actually watching 240p several times per second??
 