
Registered · 42 Posts · Discussion Starter · #1
Hello everyone! As you may know, most (if not all) 720p televisions out there have a native resolution of 1366x768, even though most 720p content is 1280x720. Since 1280x720 and 1366x768 have slightly different aspect ratios, does this mean that a 1366x768 television stretches 1280x720 content, making circles look like ovals and squares look like rectangles? Thank you!
 

Registered · 6,908 Posts
You're not going to see any significant geometric problems:
  • 1280/720 ≈ 1.7778
  • 1366/768 ≈ 1.7786

The big problem is that even 720p source material has to be internally upconverted on these bastard-resolution sets, which DOES cause artifacts and distortion.
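For what it's worth, a quick back-of-the-envelope sketch in Python makes both points: the geometry error is tiny, but the scale factors are non-integer, so every output pixel has to be interpolated:

```python
# Aspect ratios of the source and of the panel
src_w, src_h = 1280, 720
panel_w, panel_h = 1366, 768

print(src_w / src_h)      # 1.777... (exactly 16:9)
print(panel_w / panel_h)  # 1.778...

# Worst-case geometry error if 720p is stretched to fill the panel
error = (panel_w / panel_h) / (src_w / src_h) - 1
print(f"{error:.4%}")     # 0.0488%: far too small to see

# The scale factors, however, are non-integer, so the set has to
# interpolate every output pixel; that is where the artifacts come from
print(panel_w / src_w, panel_h / src_h)  # 1.0671875 1.0666...
```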


Best to leave all sources at their native resolution and let the TV do the conversion ... if you have to convert resolution, it is best to do it only once!!
 

Registered · 4,696 Posts
If you are using a PC, set the resolution to 1360x768. You will lose 3 pixels on each side (you'll hardly notice), but the PC will do all the scaling.


Oddly, when watching 720p content the PC will just display the pixels 1:1, leaving a small border all the way around the screen.
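The border arithmetic, as a quick sketch (centered_borders is a hypothetical helper, just to show the numbers):

```python
def centered_borders(panel, image):
    """Borders (left/right, top/bottom) when an image is shown
    centered and unscaled on a larger panel."""
    (pw, ph), (iw, ih) = panel, image
    return (pw - iw) // 2, (ph - ih) // 2

print(centered_borders((1366, 768), (1360, 768)))  # (3, 0)   desktop
print(centered_borders((1366, 768), (1280, 720)))  # (43, 24) 720p video
```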
 

Registered · 42 Posts · Discussion Starter · #4

Quote:
Originally Posted by cavu /forum/post/18151641


You're not going to see any significant geometric problems:
  • 1280/720 ≈ 1.7778
  • 1366/768 ≈ 1.7786

The big problem is that even 720p source material has to be internally upconverted on these bastard-resolution sets, which DOES cause artifacts and distortion.


Best to leave all sources at their native resolution and let the TV do the conversion ... if you have to convert resolution, it is best to do it only once!!

Why don't these 1366x768 televisions just pillarbox 1280x720 content to prevent stretching?
 

Registered · 1,355 Posts
If I may comment, I hooked up my Xbox 360, via VGA, to my Samsung LN32B360, which has a native panel resolution of 1366x768.


So my 360 was also set to natively output 1366x768, because that's a supported resolution.


I played a DVD (Corpse Bride, the widescreen version) with the 360's beefy scaler chip doing all the conversion math, letting the panel just be a display and do its thing. And WOW! Next to a Blu-ray at the store, I have never seen a cleaner, brighter picture: no noise, no artifacts.


So let's not poo-poo 1366x768.
 

Registered · 16,749 Posts
Because no one would buy them, since they are marketed as 720p TVs.

There are no longer any true 1280x720 resolution TVs on the market.

The images are not stretched, they are upscaled, so there is no distortion like there would be if they were stretched horizontally and had black bars put on the top and bottom, which is what would happen if the images were letterboxed.
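To see why scaling wins here, a minimal fit-inside sketch (fit_inside is a hypothetical helper, not what any TV actually runs):

```python
def fit_inside(panel_w, panel_h, src_w, src_h):
    """Largest size that fits the panel while preserving the source
    aspect ratio; any leftover panel area would become bars."""
    scale = min(panel_w / src_w, panel_h / src_h)
    return round(src_w * scale), round(src_h * scale)

# 720p scaled onto a 1366x768 panel fills it to within a pixel,
# so there is nothing worth letterboxing and no distortion either
print(fit_inside(1366, 768, 1280, 720))  # (1365, 768)
```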
 

Registered · 12,736 Posts
720p (1280 by 720) is very close to 1366 by 768 and, as mentioned, they are effectively both 16:9 formats. While the Xbox 360 does support such odd resolutions (among others), the PS3 and standalone BD and upconverting DVD players will not. However, 720p on a 720p set is not the real concern; 1080i is. That's because a 1080i signal (1920 by 1080, delivered as 1920 by 540 fields) doesn't map cleanly onto a 1280 by 720 or 1366 by 768 panel.
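The mismatch is easy to see with a quick sketch (just the ratios; actual scaler behaviour varies by set):

```python
# Downscaling deinterlaced 1080-line material to HD-ready panels
for pw, ph in [(1280, 720), (1366, 768)]:
    print(f"{pw}x{ph}: x{1920 / pw:.4f} horizontal, x{1080 / ph:.4f} vertical")
# 1280x720: x1.5000 both ways (non-integer, but at least uniform)
# 1366x768: x1.4056 horizontal vs x1.4062 vertical (the axes don't even match)
```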
 

Registered · 16,749 Posts
1080i is 1920x1080 interlaced, which means it contains a 1920x540 field containing the odd-numbered lines of a 1080p frame, followed by a 1920x540 field containing the even-numbered lines of that frame. When these two fields are de-interlaced, a true 1080p frame is created, which is obviously of much higher resolution than 1366x768 or 1280x720 frames.

Years ago some poorer-quality HDTVs did de-interlace 1080i into 1920x540 frames, using what is called bob de-interlacing, to save costs, but thank goodness this practice has ended.
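A toy NumPy sketch of the difference, just the line bookkeeping rather than real video processing:

```python
import numpy as np

frame = np.arange(1080 * 4).reshape(1080, 4)  # stand-in 1080p frame

top    = frame[0::2]  # the 540 odd-numbered lines (one field)
bottom = frame[1::2]  # the 540 even-numbered lines (the other field)

# Weave: interleave the two fields back into a full 1080-line frame
woven = np.empty_like(frame)
woven[0::2], woven[1::2] = top, bottom
assert (woven == frame).all()  # perfect reconstruction for static images

# Bob: throw one field away and line-double the other to 1080 lines
bobbed = np.repeat(top, 2, axis=0)  # only 540 lines of real detail
```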
 

Registered · 1,355 Posts
Why can't all broadcasters just broadcast in 720p???


Why 1080i if it's (a) bad for fast motion, and (b) a bit trickier for LCDs to scale and fit properly, especially given that so many people have 1366x768 panels or older 1280x720 panels? Make all broadcasters do 720p, period.
 

Registered · 42 Posts · Discussion Starter · #10

Quote:
Originally Posted by walford /forum/post/18157182


1080i is 1920x1080 interlaced, which means it contains a 1920x540 field containing the odd-numbered lines of a 1080p frame, followed by a 1920x540 field containing the even-numbered lines of that frame. When these two fields are de-interlaced, a true 1080p frame is created, which is obviously of much higher resolution than 1366x768 or 1280x720 frames.

Years ago some poorer-quality HDTVs did de-interlace 1080i into 1920x540 frames, using what is called bob de-interlacing, to save costs, but thank goodness this practice has ended.

How exactly is 1080i de-interlaced when the two fields are captured at different moments in time?
 

Registered · 848 Posts

Quote:
Originally Posted by Shadowz O Death /forum/post/18158368


How exactly is 1080i de-interlaced when the two fields are captured at different moments in time?

Probably the same way 480i is deinterlaced, you know... motion adaptive and all that jazz.


A good deinterlacer has to be able to "fill in" the missing detail, basically.


It's a pretty complicated thing, which is why those video deinterlacers/scalers from DVDO, Lumagen, etc. are so pricey.


You have to do the same thing to convert 1080i video (60 fields per second) to 1080p (60 frames per second). Not an easy task!
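For a rough flavour of what "motion adaptive" means, here is a heavily simplified per-pixel sketch (deinterlace_field is hypothetical; real processors use far smarter motion detection and edge-directed interpolation):

```python
import numpy as np

def deinterlace_field(cur, prev, nxt, threshold=10):
    """Toy motion-adaptive deinterlacer producing one frame per field.

    cur holds the lines we actually received (one parity); prev and
    nxt are the neighbouring fields of the opposite parity.  Where
    prev and nxt agree, the scene is static and we weave them in as
    the missing lines; where they differ, there is motion and we
    interpolate the missing lines from cur instead (bob).
    """
    motion = np.abs(prev.astype(float) - nxt.astype(float)) > threshold
    # Vertical interpolation within the field we have (wraps at the
    # bottom edge; good enough for a toy example)
    interp = (cur.astype(float) + np.roll(cur, -1, axis=0)) / 2
    missing = np.where(motion, interp, (prev.astype(float) + nxt) / 2)

    h, w = cur.shape
    frame = np.empty((2 * h, w))
    frame[0::2], frame[1::2] = cur, missing
    return frame
```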
 

Registered · 42 Posts · Discussion Starter · #12

Quote:
Originally Posted by sodaboy581 /forum/post/18158478


Probably the same way 480i is deinterlaced, you know... motion adaptive and all that jazz.


A good deinterlacer has to be able to "fill in" the missing detail, basically.


It's a pretty complicated thing, which is why those video deinterlacers/scalers from DVDO, Lumagen, etc. are so pricey.


You have to do the same thing to convert 1080i video (60 fields per second) to 1080p (60 frames per second). Not an easy task!

I'm pretty sure that 1080i/60 would be converted to 1080p/30, not 1080p/60, because 2 fields make up a frame. That's why I don't understand how the two fields are "combined" when they're captured at different moments in time.
 

Registered · 16,749 Posts
Recent-generation 1080p HDTVs convert 1080i/60 to 1080p/60 by de-interlacing each field received twice: once with the preceding field and once with the following field. This is how the even-line field of one frame can be "combined" with the odd-line field of another frame.

This method produces much smoother 60fps content than if each of the 30 frames contained in 1080i/60 content were just displayed twice on a 60Hz TV.
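The pairing is easy to see in a sketch:

```python
# 60 fields/s in, 60 frames/s out: every field is de-interlaced twice,
# once paired with the field before it and once with the field after it
fields = ["T0", "B0", "T1", "B1", "T2", "B2"]  # T = top field, B = bottom

frames = [(fields[i], fields[i + 1]) for i in range(len(fields) - 1)]
print(frames)
# [('T0', 'B0'), ('B0', 'T1'), ('T1', 'B1'), ('B1', 'T2'), ('T2', 'B2')]
# One output frame per input field, not per field pair, which is why
# the result is 1080p/60 rather than 1080p/30.
```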
 

Registered · 848 Posts

Quote:
Originally Posted by Shadowz O Death /forum/post/18158532


I'm pretty sure that 1080i/60 would be converted to 1080p/30, not 1080p/60, because 2 fields make up a frame. That's why I don't understand how the two fields are "combined" when they're captured at different moments in time.

Well, you'd think wrong.
Like I said, it's the same as when 480i/60 is converted to 480p/60: when content is shot on video (not film) with a camera, professional or not, a proper deinterlacer keeps the smooth frame rate and restores the lost detail.


You don't change the frame rate when converting. (Unless the content on the media is flagged as FILM... then you convert 480i/60 or 1080i/60 to 480p/24 or 1080p/24, or you use 3:2 pulldown on it... but there is no such thing as 1080p/30 or 480p/30, just FYI!!)
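On the FILM case, a tiny sketch of what 3:2 pulldown does to the field stream:

```python
# 3:2 pulldown: 24 film frames/s spread across 60 interlaced fields/s
film = ["A", "B", "C", "D"]  # four film frames become ten fields
cadence = [3, 2, 3, 2]       # fields emitted per film frame

fields = [f for frame, n in zip(film, cadence) for f in [frame] * n]
print(fields)  # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
# A good deinterlacer spots this cadence, reassembles the original 24
# progressive frames, and skips video-style deinterlacing entirely.
```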

That's why deinterlacing 480i/1080i video, which is 60 fields per second, into 480p/1080p at 60 frames per second is difficult.
 

Registered · 12,736 Posts
The benefit of 1080i HDTV programming is that a 1080p TV (or the older HD CRT 1080i TVs) should be able to properly deinterlace the signal to enable you to see a 1920 by 1080 image that is essentially identical to a 1080p signal. Limiting all programming to 720p would favor 720p sets over 1080p ones, and at this point 1080p is the norm and 720p is simply a cheaper alternative. These days getting 1080p is easy, starting at 32" for LCDs and 42" for Plasmas.
 

Registered · 42 Posts · Discussion Starter · #18

Quote:
Originally Posted by walford /forum/post/18159548


Recent-generation 1080p HDTVs convert 1080i/60 to 1080p/60 by de-interlacing each field received twice: once with the preceding field and once with the following field. This is how the even-line field of one frame can be "combined" with the odd-line field of another frame.

This method produces much smoother 60fps content than if each of the 30 frames contained in 1080i/60 content were just displayed twice on a 60Hz TV.

Ahh, I see. I thought that each field was only de-interlaced once, as if every two fields were combined to form one frame. I still don't understand how the fields are "combined" when EVERY field is captured at a different moment in time.
 

Registered · 848 Posts

Quote:
Originally Posted by Shadowz O Death /forum/post/18161387


Ahh, I see. I thought that each field was only de-interlaced once, as if every two fields were combined to form one frame. I still don't understand how the fields are "combined" when EVERY field is captured at a different moment in time.
http://en.wikipedia.org/wiki/Deinter...n_Compensation


Basically gives you the gist of it.


Good deinterlacers usually look a few fields ahead; I believe the DVDO video processors look at 4 or 5 fields before creating the 480p/1080p frame from the 480i/1080i content. (This usually adds 2 frames of delay to the video processing, but game mode reduces it to 1 frame of delay...)


Video processor or TV (unless it's a CRT TV that takes an interlaced signal natively), you will probably always have AT LEAST 1 frame of delay in your video. But great TVs are somewhere between 1 and 2, OK TVs 3, bad ones 4+ :p I mean, if you care about input lag!!
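For reference, a quick conversion of frames of delay into milliseconds at 60 Hz:

```python
# Each frame at 60 Hz lasts 1000/60 ms, so buffered lookahead
# translates directly into input lag
frame_ms = 1000 / 60

for frames_of_delay in (1, 2, 3, 4):
    print(f"{frames_of_delay} frame(s) of delay is about "
          f"{frames_of_delay * frame_ms:.0f} ms")
# 1 frame ~ 17 ms, 2 ~ 33 ms, 3 ~ 50 ms, 4 ~ 67 ms
```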
 