
Registered · 1,075 Posts · Discussion Starter · #1
I have a hard time understanding this. If HD is defined by being 1280x720 or 1920x1080, how can you have content considered HD that is 4:3? For example, the Charlie Brown Blu-rays. They are 4:3 1080p. How does that work if 1920x1080 is by nature a 16:9 resolution?
 

Registered · 483 Posts

Quote:
Originally Posted by Trip in VA /forum/post/19555185


1440x1080p is 4:3.


- Trip

...and some older material was originally captured in 4:3 aspect ratio but in a medium such as 35mm film that had enough resolution that it could be reproduced using a modern HD format.


Great example: The Wizard of Oz is on TBS tonight in HD (a Thanksgiving weekend institution!). It was filmed in 4:3 Technicolor and has since been scanned via telecine into an electronic HD format for television broadcast. The original film medium actually has MORE than enough resolution to qualify as HD.


Another way to look at it: you are seeing these classics in their Original Aspect Ratio (OAR)! It just happens to be narrower than 16:9, whereas many more recent films are wider than 16:9.
 

Registered · 1,108 Posts

Quote:
Originally Posted by Satori84 /forum/post/19555454


...and some older material was originally captured in 4:3 aspect ratio but in a medium such as 35mm film that had enough resolution that it could be reproduced using a modern HD format.


Great example: The Wizard of Oz is on TBS tonight in HD (a Thanksgiving weekend institution!). It was filmed in 4:3 Technicolor and has since been scanned via telecine into an electronic HD format for television broadcast. The original film medium actually has MORE than enough resolution to qualify as HD.


Another way to look at it: you are seeing these classics in their Original Aspect Ratio (OAR)! It just happens to be narrower than 16:9, whereas many more recent films are wider than 16:9.

Theatrical aspect has been 20x9 for several decades.....
 

Banned · 2,443 Posts

Quote:
Originally Posted by BPTTV /forum/post/19555515


Theatrical aspect has been 20x9 for several decades.....

Most movies made before the 1950s were 4:3. It was the advent of TV that drove movies to wider formats (and color) in order to lure people back into theaters.
 

Registered · 1,482 Posts
Modern theatrical films are made at either 1.85 or 2.35 (effectively 2.40 for projection). AFAIK Avatar is the only major theatrical release made and released at 1.77 (though in some theatres it may have been scaled slightly to 1.85, and all kinds of stuff was done for IMAX). Very rarely is a 1.33 film produced for major theatrical release, even among independent films. Artistic considerations aside, it would be tough to get financing for such a project these days. It's more common to see 4:3 used as a vignette in modern films, like the opening of A Serious Man.
 

Registered · 10,682 Posts

Quote:
Originally Posted by ABCTV99 /forum/post/19558921


Very rarely is a 1.33 film produced for major theatrical release, even among independent films. Artistic considerations aside, it would be tough to get financing for such a project these days.

In the United States. It's still common to shoot 1.33 in Central and South America. It's commonly (but not always) masked to 1.66 in the theaters and sometimes shot safe for 1.85 cropping.


The cinemas in these countries never had the amount of competition from television that they did in the U.S., so there was much less enthusiasm for installing expensive wide screens in their theaters. It's common to find old screens that are still 4:3. American features will have those darn black bars at the top and bottom.
 

Registered · 7,958 Posts
However, to answer the OP's question: the mainstream HD scanning formats are defined as 16:9, at 1280x720 and 1920x1080. If you are displaying 4:3 content, it will be pillarboxed with bars on either side to fill the 16:9 frame. If you are displaying 21:9 content, it will be letterboxed to 16:9 with bars top and bottom.


Whilst there are both 4:3 and 16:9 full-height/full-width formats for SD content, the HD standards are, in the main, 16:9 only.


So 4:3 content will occupy the central 1440x1080 section of a 1920x1080 frame, with padding of 240 pixels on each side to fill the full 1920x1080 frame.


Confusingly, there are also non-square-pixel 16:9 formats in use (1440x1080, 1280x1080 and 960x720 are all in widespread use in broadcast applications with codecs like DVCPRO HD and HDCAM).
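
To make the arithmetic above concrete, here is a minimal Python sketch (mine, not from the thread): it computes the active image area and bar padding when fitting content into a 16:9 frame, plus the pixel aspect ratio implied by the non-square-pixel formats.

# A minimal sketch of the pillarbox/letterbox arithmetic described above,
# plus the pixel aspect ratio (PAR) implied by anamorphic HD rasters.

def fit_to_frame(content_aspect, frame_w=1920, frame_h=1080):
    """Return (active_w, active_h, side_bar, top_bar) for content scaled
    to fit a frame while preserving its aspect ratio."""
    if content_aspect < frame_w / frame_h:
        # Narrower than the frame: full height, pillarbox bars at the sides.
        active_w = round(frame_h * content_aspect)
        return active_w, frame_h, (frame_w - active_w) // 2, 0
    # Wider than the frame: full width, letterbox bars top and bottom.
    active_h = round(frame_w / content_aspect)
    return frame_w, active_h, 0, (frame_h - active_h) // 2

def pixel_aspect_ratio(stored_w, stored_h, display_aspect=16 / 9):
    """PAR needed for a stored raster to display at a given aspect ratio."""
    return display_aspect * stored_h / stored_w

print(fit_to_frame(4 / 3))             # (1440, 1080, 240, 0): 4:3 pillarboxed
print(fit_to_frame(21 / 9))            # (1920, 823, 0, 128): ~21:9 letterboxed
print(pixel_aspect_ratio(1440, 1080))  # 1.33...: 1440x1080 stretches to 16:9
print(pixel_aspect_ratio(960, 720))    # 1.33...: 960x720 stretches to 16:9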
 

Registered · 1,052 Posts

Quote:
Originally Posted by Lkr /forum/post/19555141


I have a hard time understanding this. If HD is defined by being 1280x720 or 1920x1080, how can you have content considered HD that is 4:3?

Maybe your premise ("HD is defined by being 1280x720 or 1920x1080") is incorrect. The old definition I've heard is that HD is any resolution higher than SD and ED.


Your definition of HD as only 1280x720 or 1920x1080 might be a common misuse of the term.

But not as bad as (mis)using "SD" to mean 4:3 aspect ratio or analog (i.e. not digital) video/TV (e.g. "SD tuner").


Regards
 

Registered · 189 Posts

Quote:
Originally Posted by blue_z /forum/post/19571477


Maybe your premise ("HD is defined by being 1280x720 or 1920x1080") is incorrect. The old definition I've heard is that HD is any resolution higher than SD and ED.

That's really a fuzzy area. For example, what would you call the old French system, which was 819 lines, 4:3, monochrome? For many years, there were industrial video formats with more than 625 lines (European 'standard definition'), but they weren't thought of as 'high-definition television'. In fact, that phrase was originally used to describe the Marconi-EMI 405-line system of 1934, when it replaced the Baird 30-line system previously used by the BBC.


In modern times, the term High-Definition Television is intimately associated with the format which came out of research at NHK in the 1960s & 1970s. This system was always characterized by a higher static resolution & wider aspect ratio than conventional television, along with multichannel audio. The canonical form of the NHK Hi-Vision HDTV system was 1125 lines (1035 visible), 2:1 interlace scanning at 30.00 frames per second, with an aspect ratio of 5:3 and four-channel (LCRS) audio. This became the basis for SMPTE 240M, which has of course been considerably amended over the years.


Nowadays we have varying framerates, from 23.976 to 60.00 frames per second progressive, and your choice of 1125/1080 or 750/720 lines per frame, with 16:9 aspect ratio and 5.1-channel (sometimes) sound… who knows what "High Definition" is anymore? It's certainly possible to shoot 4:3 material with 1080 or 720 lines, but I don't know anyone who would recommend it, & it's just about the one thing the ATSC format table doesn't include.
 

Registered · 7,958 Posts

Quote:
Originally Posted by Sheer Lunacy /forum/post/19572120


That's really a fuzzy area. For example, what would you call the old French system which was 819 lines, 4:3, monochrome?

In reality, AIUI, not particularly good.
The spec promised more than the broadcasts actually delivered, I understand...


However, in spec terms it is a bit of an oddity. Essentially twice the resolution of the British system! (Hmmm - wonder why that was...)

Quote:
For many years, there were industrial video formats with more than 625 lines (European 'standard definition'), but they weren't thought of as 'high-definition television'. In fact, that phrase was originally used to describe the Marconi-EMI 405-line system of 1934, when it replaced the Baird 30-line system previously used by the BBC.

ISTR the BBC initially allowed Baird to test ~30-line TV using BBC facilities, but when they came to decide on their own broadcast standard they ran 240-line progressive Baird and 405-line interlaced Marconi/EMI side-by-side. Was the 240-line progressive Baird system also described as high definition?


I know that the BBC's Marconi/EMI 405-line system was described as 'The world's first regular high definition TV service', which started in 1936.

Quote:
In modern times, the term High-Definition Television is intimately associated with the format which came out of research at NHK in the 1960s & 1970s. This system was always characterized by a higher static resolution & wider aspect ratio than conventional television, along with multichannel audio. The canonical form of the NHK Hi-Vision HDTV system was 1125 lines (1035 visible), 2:1 interlace scanning at 30.00 frames per second, with an aspect ratio of 5:3 and four-channel (LCRS) audio. This became the basis for SMPTE 240M, which has of course been considerably amended over the years.


Nowadays we have varying framerates, from 23.976 to 60.00 frames per second progressive, and your choice of 1125/1080 or 750/720 lines per frame, with 16:9 aspect ratio and 5.1-channel (sometimes) sound… who knows what "High Definition" is anymore? It's certainly possible to shoot 4:3 material with 1080 or 720 lines, but I don't know anyone who would recommend it, & it's just about the one thing the ATSC format table doesn't include.

Also - AFAIK there is no recognised 4:3 native HD broadcast interconnect or data standard? All the 1080i/p and 720p broadcast interconnect standards are based around 16:9 image ratios?
 

Registered · 45,683 Posts

Quote:
Originally Posted by blue_z /forum/post/19571477


Maybe your premise ("HD is defined by being 1280x720 or 1920x1080") is incorrect. The old definition I've heard is that HD is any resolution higher than SD and ED.


Your definition of HD as only 1280x720 or 1920x1080 might be a common misuse of the term.

Not at all.


If you go back to when TV was just getting off the ground, a number of formats were called high definition, simply because they were higher than the previous resolutions.


But, since the advent of ATSC in the US, HD formats are defined as 720p, 1080i, and 1080p.
 

Registered · 189 Posts
Quote:
Originally Posted by sneals2000
In reality, AIUI, not particularly good.
The spec promised more than the broadcasts actually delivered, I understand...
My information is that the cameras & associated optical systems weren't capable of resolving what the video system could theoretically transmit.


Quote:
Originally Posted by sneals2000
ISTR the BBC initially allowed Baird to test ~30-line TV using BBC facilities, but when they came to decide on their own broadcast standard they ran 240-line progressive Baird and 405-line interlaced Marconi/EMI side-by-side. Was the 240-line progressive Baird system also described as high definition?
The 30-line Baird TV service ran for several years altogether, part of that time on a regular schedule. When it was decided to test improved systems, the intent was to run the Baird 240-line system for a few weeks, then the Marconi-EMI system, alternately for a year or more. After the first time the all-electronic system went on the air, they never looked back.

I am not aware that the improved Baird system was ever described as "high-definition"; the Marconi system was, at the time, the highest resolution in use in the world, although that only lasted a little while. The French pre-war system was, I think, 455 lines at 25 fps, while the EIA preliminary standard (for which a few hundred sets were made in the USA before the adoption of the monochrome NTSC standard) was 441 lines at 30 fps.

Quote:
Originally Posted by sneals2000
Also - AFAIK there is no recognised 4:3 native HD broadcast interconnect or data standard? All the 1080i/p and 720p broadcast interconnect standards are based around 16:9 image ratios?
SMPTE 240M, last time I looked, provided only for the 16:9 ratio, but that doesn't mean much; the introduction of widescreen (anamorphic) NTSC video in the 1990s did not bring with it a revision of EIA RS-170A to allow for aspect ratios other than 4:3. As long as it's internal to your plant, there's nothing to stop you from doing any oddball thing you want, such as packaging 4:3 or 21:9 (CinemaScope) into the "Common Image Format" of 1920×1080 pixels, but interchange or broadcast will be different.
 

Registered · 9,204 Posts
There were "live" real-time X-ray systems in the 1970s through the '90s that used a 1050-line video system, IIRC. Ampex and IVC both made modified open-reel industrial VTRs that could record these signals for later evaluation.


I would expect this all moved to digital, computer-workstation-based systems around the turn of the century.
 

Registered · 1,052 Posts

Quote:
Originally Posted by Sheer Lunacy /forum/post/19572120


That's really a fuzzy area.

Quote:
... who knows what "High Definition" is anymore? ...

Well, that's sort of the point I wanted to make.

Isn't "HD" more of a (vague) marketing term than technical jargon? Does the EIA, IEEE, ATSC or any organization actually define "high definition"?

Quote:
Originally Posted by Ken H /forum/post/19592639


Not at all.

...

But, since the advent of ATSC in the US, HD formats are defined as 720p, 1080i, and 1080p.

Well, two of those are ATSC formats that do satisfy the "more than SD and ED" generic definition.

Is this "definition" by a standards organization or popular usage?


Is there a different definition for HD formats versus HDTVs? There are 1024x768 displays (e.g. current-model Panasonic "720p" plasmas) that are advertised as HDTVs. There are 13XXx768 displays (e.g. various LCD "720p" displays) that don't match exactly either of those two resolutions.


Regards
 

Registered · 4,649 Posts
I don't see any confusion in the definition of high-definition TV. ATSC defines high-definition as 1080x1920 or 720x1280 with 16:9 aspect ratio. Some broadcasters also apparently use 1080x1440. ATSC is the standard for broadcast TV in the USA. I think they have another definition that compares it to at least twice the resolution of SD. Blu-ray supports the same high-definition resolutions. The 1024x768 and 13XXx768 resolutions you mention are computer monitor resolutions (XGA and WXGA) and don't have anything to do with TV other than the fact that many devices are capable of displaying both TV signals and computer signals.
 

Registered · 16,749 Posts
AFAIK the reason that many manufacturers use 1366x768 for 720p HDTVs is that many commercial users like to program their store displays using a PC, so the 1366x768 PC resolution makes this easier for them while at the same time meeting the requirement that 720p content be displayed at full or higher resolution. An added benefit is that 1080i or 1080p content does not have to be downscaled as much when going down to 1366x768 instead of to 1280x720.
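
A quick back-of-the-envelope check of that last point (my arithmetic, not from the post), sketched in Python:

# Vertical scale factor applied to 1080-line content on each panel;
# also note 720p is slightly upscaled on a 1366x768 panel (768/720).
for panel_w, panel_h in [(1280, 720), (1366, 768)]:
    print(f"1920x1080 -> {panel_w}x{panel_h}: {panel_h / 1080:.1%} of original height")
# 1920x1080 -> 1280x720: 66.7% of original height
# 1920x1080 -> 1366x768: 71.1% of original height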
 

AVS Forum Special Member · 11,139 Posts

Quote:
Originally Posted by Colm /forum/post/19595895


Some broadcasters also apparently use 1080x1440. ATSC is the standard for broadcast TV in the USA.

Understand that Australia's and some of the UK's broadcasting is/was 1440x1080, but not U.S. ATSC OTA. DBS or cable can use it, too.
 