AVS Forum

Registered · 1,417 Posts · Discussion Starter #1
When I got my 65XWX, ABC always looked really hazy, and I thought 720p was just horrible by comparison. But when I went to the 57SWX it was much better, so I figured it was just screen size and that 1080i was simply that much better. After watching a few Monday night games this year, though, I am seeing that MNF looks a lot better than regular ABC prime-time shows.


I like 1080i on CBS for the 2:30 game a little better, but comparing regular ABC to CBS at night, ABC may be transmitting 720p without true HD source material. Whatever they do differently on the football games, it is a very good picture, and close enough to 1080i on CBS that I couldn't be unhappy either way. This is just a very good picture.


Can somebody more familiar with 720p explain what they are doing on primetime that looks decent, but comes nowhere close to what they do on MNF?


I am noticing the back-view replay cameras are not HD, but the other three are just awesome.


Does anyone have an idea how much cost it would add for these sets to display both 720p native and 1080i native? Would it make sense for just some brands' better models to do it, or is it one of those deals where, unless all the high-end units do it, the added cost scares manufacturers off? As good as 720p looks upconverted to 1080i, it seems like it would be scary good if these RPTVs could display it at full native resolution.


I apologize for my mistake in thinking ABC was broadcasting 720p in HD regularly, when it clearly seems they are not, as verified by my picture looking so much better on MNF. I am a big college fan, and before I got this HD set I rarely watched pro ball, but now I watch half pro because it looks so good. I am having to cut back on college to keep my wife happy, because football in HD is just too good no matter what level it is played at. Bring on the Pee-Wee game on Tuesdays, because I am in.
 

Registered · 32,172 Posts
Yes, on primetime they are using film-based productions or standard-def video as the source. When it's film-based and shot indoors, the material doesn't look that incredible (like Alias). When it's SD video (some sitcoms, perhaps?), they can only upconvert it.


On MNF, they use live HD video cameras.


And, wow, it looks great, huh?
 

Registered · 1,058 Posts
Alias is intentionally fuzzy and grainy. Look at Life with Bonnie for the best ABC sitcom/drama picture. Others are in between.


Doug McDonald
 

Registered · 9,826 Posts
I receive Alias on my year 2000 Panny 720p native-capable 65", and it is NOT fuzzy! It is every bit as crisp as CSI in 1080i, and remarkably better than ABC's The Practice, and FOX's Ally McBeal. Alias, CSI and The Agency are some of the crispest HD offerings on the air.


In dark scenes it does get grainy sometimes. I figure that's because of the more sensitive, higher-speed film they use for darker-scene shots, which is always grainier than regular film. I see that in all film-based renderings broadcast in HD.


Perhaps it is getting fuzzy in the translation to 1080i on your set? Mine was the last year Panny did 720p, and I bought it a year old from onecall.com as a last year's closeout, just to get that feature.


When I asked them at CES if they would be coming out with 720p-native-capable sets again soon, they said that electronics capable of a 45 kHz scan frequency are much more expensive than those for 33 kHz (1080i's highest needed frequency), and no, they will not be doing that anymore, with the prices of RPTVs having gone so low in the past few years.



True HD is true HD. 720p is 720p, and ABC is 720p 24/7.


Sometimes 720p is carrying true HD. Most of the time on the air it is not. A show has to be produced in HD before you will pick it up in HD. Many of the shows on ABC - and here in the SF Bay Area on FOX, which also transmits in 720p here, while in the rest of the country it transmits in 480p - are regular 480i shows, simply upconverted to 720p for broadcast purposes, and there's no way short of a Faroudja-class instrument to make one even remotely resemble the other.


That said, as far as I was concerned, Super Bowl last year sucked! I see much cleaner product from every Sunday night's episode of Alias than ABC's treatment of the Super Bowl!



Mr Bob
 

Registered · 32,172 Posts
Mr. Bob, I concur that the film used indoors yields the grain that Alias exhibits. It's just not that stunning indoors as a result (JG, still stunning though :)).


I have seen Alias on a number of displays, and it's never as "Wow!" picture quality-wise on indoors scenes as, say, CSI (which is still not all that amazing indoors, but perhaps cleaner).


Of course, in whichever episode Sydney donned the cover-up outfit and had to fight Dixon in the Middle East or Far East or whatever it was, the colors and picture quality then were totally unreal.


I have not seen anything on ABC that is as mind boggling as the stuff on CBS -- in terms of dramas / comedies.


But sports broadcasts like MNF vs. the CBS HD football show that on some level, either 1080i or 720p yields stunning stuff for sports -- when done well.
 

Registered · 661 Posts
I am glad to read this stuff because I am breaking in a new LCoS set that is getting mixed reviews, and I was absolutely horrified when I saw the PQ of CSI and Alias dark scenes (not that they were horrible, but they weren't what I expected from some of the other HD stuff I have been watching). I wasn't sure if it was a limitation of the set or if the broadcast was grainy. Whew!!!
 

Registered · 1,426 Posts
The NFL games on ESPNHD in 720p on my DLP are IMO even better than MNF's 720p, but more likely they are identical in quality.
 

Registered · 1,058 Posts
Quote:
Originally posted by Mr Bob
I receive Alias on my year 2000 Panny 720p native-capable 65", and it is NOT fuzzy! It is every bit as crisp as CSI in 1080i, and remarkably better than ABC's The Practice, and FOX's Ally McBeal. Alias, CSI and The Agency are some of the crispest HD offerings on the air.



Perhaps it is getting fuzzy in the translation to 1080i on your set



That said, as far as I was concerned, Super Bowl last year sucked! I see much cleaner product from every Sunday night's episode of Alias than ABC's treatment of the Super Bowl!



Mr Bob


I have a true 720p TV, the Panasonic PT40LC12.


OK, perhaps I did not communicate well enough. Alias looks fuzzy because it is intentionally shot with a very narrow depth of field.


In 720p, the eyes of the subject in focus are indeed sharp.


The Super Bowl was sharper than the current Monday Night Football, but not by much ... but MNF suffers badly from edge enhancement.


Alias and The Practice are the least effective of ABC's HD offerings. Alias is the worst.


Doug McDonald
 

Registered · 2,241 Posts
IMO HD was invented for football!:D I have seen hundreds of shows in HD (or converted to 1080i) and nothing on TV excites me as much as watching football in HD - the grass, uniforms, cheerleaders...
 

Registered · 9,826 Posts
>>Alias looks fuzzy because it is intentionally shot with a very narrow depth of field.


Hadn't noticed this. Will look for it. The eyes will be crisp but everything closer than the eyes - and farther away than the eyes - will be fuzzy, right?


My experience of Alias is that there's a lot more to its crispness than just the eyes being crisp. They have lots of shots of the inner offices, and the bars and nightclubs she visits, and that room they kept her mother in, and her roommate and the various friends, and Dixon, and her father, and all those foreign cities she visits...


I don't remember any of that being fuzzy, no matter how far away or close to the camera it all was, or where the middleground that would represent the midpoint of the depth of field would be -


What kind is the PT40LC12? LCD? DLP? DV?



Mr Bob
 

Registered · 1,417 Posts · Discussion Starter #11
Quote:
Originally posted by Mr Bob
[snip]


True HD is true HD. 720p is 720p, and ABC is 720p 24/7.


Sometimes 720p is carrying true HD. Most times while on the air it is not. A show has to be broadcast in HD before you will pick it up in HD. Many of the shows on ABC - and here in the SF Bay Area FOX, which also transmits in 720p here, while in the rest of the country it transmits in 480p - are regular 480i shows, which are simply upconverted to 720p for broadcast purposes, and there's no way short of a Faroudja-class instrument to make one even remotely resemble the other.


Mr Bob
720p is not HD 720p all the time, for the reason I stated. HD football looks great and almost all the regular shows do not, so if you were right, there would not be a night-and-day difference between the two. You are lucky to have a Panny set that does 720p native, though; when I bought last year, their sets were not close to the brands I ended up comparing, but if one had done 720p native I would still have considered them. I guess Panny thought back then that all sets would end up doing 720p native, so they included it early.
 

Registered · 32,172 Posts
OK, again, the difference between outdoor shots and indoors shots is huge with HD -- period.


The difference between HD video-sourced material and HD created from film is huge -- period.


Alias is often dark and indoors, which are not that HD friendly. And Alias is always HD created from film.


None of this has anything to do with it being 720p. Period.
 

Registered · 9,826 Posts
>>You are lucky to have a Panny set that does 720p native though because when I bought last year, their sets were not close to the brands I ended up comparing, but if one did 720p native then I would have still considered them.



Agreed, Pannys don't look that great OOB or in the showroom.


But after full calibration and the ultimate in fine tweaking, they simply can't be beat.


The reason I was sold even before I had made up my mind about getting one with 720p was that I had calibrated one, a 56", that truly had a really lifelike "glow" to it after the cal was complete.


I had occasion to see it in action again a year later when I was looking over this client's HD receiving and recording equipment, which I subsequently bought, and still use, and absolutely love also.


It still looked absolutely fabulous!


And his DIDN'T have 720p! Panny had done 720p native twice, and his was the in-between model, without the 720p native capability.


That made up my mind. And when another guy whose Panny 65" I was considering buying - halfway across the country from where I live on the West Coast - found out that onecall was closing out his exact model, he shut down the potential sale of his set to me and told me about the sale, which got me a brand new, warrantied one for the same price I was going to pay him: just under $3K. He lost the sale, but knew he'd sleep better at night if he told me about this incredibly rare window of opportunity he had just learned of.


I will always thank him. Hope he's reading this.


I still don't agree that HD football is great and nothing else is. Virtually everything my HD system picks up is great, as far as I am concerned, with the exceptions noted in another of my posts, above. Super Bowl was football, and it sucked last year, on ABC's 720p, in terms of "turbulence" I saw around figures in the distance. This turbulence is rare and I have not seen it since, on anything ABC has offered since then.



Mr Bob
 

Banned · 29,681 Posts
Quote:
Originally posted by Addicted Help!!



Can somebody more into 720p explain what they are doing on primetime to make it look good, but nothing close to this on MNF?

http://www.avsforum.com/avs-vb/showt...hreadid=257964


Read this link- there is some good talk and posts about 720p and 1080i differences....


In theory, 1080i should provide a superior image in terms of detail, since 1920x1080i contains more "active" pixel elements than 1280x720p does.


Even if you look at a field of video and not a frame (two interlaced fields make a frame), 1080i on paper looks better.


One field of 1080i is 540 lines by 1920 pixels, for a total of 1,036,800 pixel elements. (Right?)


720p has no separate fields; a full 1280x720 frame is 921,600 pixels (1280x720).


Now, since our eyes can be fooled into thinking we see all 1080 of the interlaced lines at once, 1080i does indeed contain more picture elements than 720p does. Also, the first 540 lines don't have to be exactly the same as the second set of 540 lines, since the two fields are captured at slightly different times, whereas in 720p every frame is complete.


Now for reality,


There are not really any 1920x1080i signals available. As posted earlier, due to current limitations of certain brands of HDTV recorders, cameras, and broadcast-station links to local networks, the most you really get is about 1440x1080i or so.


Now try the math out again...


One field of 1080i now gives you 540x1440 = 777,600.


While 1280x720p still gives you 921,600.


Take into account that 1080i has more interlaced artifacts and picture noise, and 720p seems superior.


Particularly if you extend this to a full frame of video, where as much as 30% (according to Mr. Kane) of the interlaced resolution can be lost in the conversions from progressive to interlaced to minimize the visibility of interlaced artifacts - which 720p, again, does not suffer from.


If you take the full frame (two fields), 720p looks a lot better, because now you have 1280x720x2 = 1,843,200 active picture elements, and resolution in both horizontal and vertical directions remains constant over time, whereas 1080x1440 (540x1440x2) only gives you 1,555,200 active picture elements.


1,843,200 minus 1,555,200 = 288,000 more active elements of the picture in 720p, along with the absence of interlaced artifacts.


For now, 720p does seem really better, in practice, than 1080i; in theory, or in the future, perhaps 1080i can be better... perhaps even 1080p someday....
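The arithmetic in this post can be checked with a short script (a sketch that follows the post's own assumptions, including the reduced 1440-pixel horizontal sampling for 1080i):

```python
# Compare active picture elements in one 1/30 s interval (one full frame),
# following the post above. 1080i is taken as horizontally subsampled to 1440.
def pixels_per_interval_1080i(h_pixels=1440):
    # Two interlaced fields of 540 lines each arrive per 1/30 s.
    return 540 * h_pixels * 2

def pixels_per_interval_720p():
    # 720p delivers two full 1280x720 progressive frames in the same window.
    return 1280 * 720 * 2

print(pixels_per_interval_1080i())  # 1,555,200
print(pixels_per_interval_720p())   # 1,843,200
print(pixels_per_interval_720p() - pixels_per_interval_1080i())  # 288,000
```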
 

Banned · 29,681 Posts
 http://www.dtv720p.com/hoffner_article.htm


Here is a great link.


===================================================


Posted with permission of the author


by Randy Hoffner

ABC Television Network


July, 2002


Good HDTV: It's More Than a Numbers Game


The advent of digital television(DTV) broadcasting and HDTV has presented the broadcaster and the consumer with a multitude of choices. One choice that broadcasters were required to make was the selection of an HDTV scanning format. This choice might seem difficult at first glance. However, once we have cut through the myths and the hype, we find that from a technological standpoint choosing is not really so difficult after all.


Since digital HDTV broadcasting began, we have heard a lot of discourse about the two HDTV scanning formats that are used by broadcasters: 1080i and 720P. Most of us know that 1080i is an interlaced scanning format, while 720P is a progressive scanning format. Let's take a closer look at these two formats. What are their respective attributes, and what does it all mean to the HDTV viewer?


Skipping the technical details, a television picture is produced by electronically scanning a scene line-by-line, converting those scan lines to electrical signals, and transmitting the signals to a receiver, where they are converted back to scan lines that reconstruct the original picture. When we compare the pictures made by broadcast NTSC, VHS, and DVD, all of which have the same 480 scanning lines, it is apparent that the number of scanning lines alone does not determine the quality of the television picture, although the greater the number of lines, the greater the picture's resolution potential.


The earliest experimental electronic television pictures were scanned progressively. Progressive scanning is just what its name implies: the lines of the television picture are scanned sequentially, line 1 followed by line 2, etc. While this seems the logical way to scan, other considerations led to a different approach. The greater the number of scanning lines, the greater the resolution possible in the resulting pictures. However, there is no free lunch, and the greater the number of scanning lines, the greater the analog bandwidth the signals occupy when transmitted. In NTSC television broadcasting, the analog bandwidth of the visual signal is strictly limited in order to fit it into the television channel along with the sound signal.


The inventors of television as we know it found themselves on the horns of a dilemma. They wanted in excess of 400 scan lines to afford sufficient resolution, but if they scanned this number of lines progressively, they were limited to a frame repetition rate of about 30 per second in order to stay within the allotted analog bandwidth. But they also had to confront the problem of flicker, which is the sensation of the picture perceptibly fluttering or flashing. Flickering occurs when the vertical repetition rate or the number of light flashes per second is too low for the specific viewing circumstances. The critical flicker frequency, or the repetition rate above which flicker cannot be perceived, falls between 40 and 60 repetitions per second, the exact number depending on conditions that include picture brightness and ambient room light. A repetition rate of 30 flashes per second is below the critical flicker threshold under any viewing circumstances. Although motion pictures run at 24 frames per second, each frame is projected twice, bringing the light flash rate up to 48 per second. This is above the critical flicker frequency in a dark motion picture theater where the brightness of the images is relatively low, but it is below the critical flicker frequency for a bright television picture viewed in a lighted room.
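The flash-rate numbers above can be laid out as a tiny sketch (the 40-60 Hz critical flicker band and the doubled film shutter are from the article; the comparison logic is only illustration):

```python
# Flash rates vs the critical flicker band described above.
CRITICAL_FLICKER_LOW = 40   # Hz; below this, flicker is visible in any setting
CRITICAL_FLICKER_HIGH = 60  # Hz; above this, flicker is never perceived

film_flash_rate = 24 * 2    # each film frame projected twice -> 48 flashes/s
progressive_30_rate = 30    # 30 progressive frames/s, no repetition
interlaced_field_rate = 60  # 60 interlaced fields/s

print(film_flash_rate >= CRITICAL_FLICKER_LOW)        # True: OK in a dark theater
print(progressive_30_rate >= CRITICAL_FLICKER_LOW)    # False: would flicker
print(interlaced_field_rate >= CRITICAL_FLICKER_HIGH) # True
```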


In the early 1930's, someone hit upon an idea that was hailed as a great invention to solve the flicker problem in television. It was called interlaced scanning. In interlace, each video frame is scanned as two half-frames, called fields. In one field, all the odd-numbered lines of the frame: 1, 3, 5, etc., are scanned, while in the other field all the even-numbered lines: 2 , 4, 6, etc., are scanned. The fields are scanned, transmitted, and displayed sequentially. As they are displayed, the odd-numbered lines and the even-numbered lines, which are spatially offset from one another by one scanning line's height, are "interlaced" together by the human visual system into a complete frame or picture. This would, at first glance, seem to be the best of all possible worlds. Thirty frames worth of picture information is transmitted each second, while the repetition rate is doubled to 60 light flashes per second. Yet, like a David Lynch movie, if we dig below the surface, things are not quite what they seemed at first glance to be. Interlaced scanning is a compromise that trades the absence of flicker for a number of other problems. The price interlaced scanning exacts in visual quality was necessary to meet certain objectives in NTSC television, but in the digital TV broadcasting milieu, it is not necessary to pay the interlace penalty.
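The field-splitting described above can be sketched in a few lines (a toy model; real fields are offset in time as well as in space):

```python
# A minimal sketch of interlaced scanning: one frame is transmitted as two
# fields -- the odd-numbered lines, then the even-numbered lines.
def interlace(frame_lines):
    odd_field = frame_lines[0::2]    # lines 1, 3, 5, ... (1-based numbering)
    even_field = frame_lines[1::2]   # lines 2, 4, 6, ...
    return odd_field, even_field

def weave(odd_field, even_field):
    # The display (or the viewer's visual system) re-interleaves the fields.
    frame = []
    for o, e in zip(odd_field, even_field):
        frame.extend([o, e])
    return frame

lines = list(range(1, 11))        # a toy 10-line frame
odd, even = interlace(lines)
print(weave(odd, even) == lines)  # True: the two fields rebuild the frame
```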


1080i HDTV continues the tradition of interlaced scanning, and brings with it the interlace quality penalties. In the DTV world, each scanning line is made up of samples, called pixels. In 1080i, each line is made up of 1,920 pixels, which is in some cases reduced to 1,440 pixels. There are 1,080 lines in each complete frame, and 540 lines in each field, a little more than double the number of lines in an NTSC frame and field respectively. 1080i is usually transmitted with a frame rate of about 30 frames per second, as is NTSC.


The other HDTV scanning format, 720P, is a progressively-scanned format. Each 720P line is made up of 1,280 pixels, and there are 720 lines in each frame. 720P is typically transmitted at about 60 full frames per second, as opposed to 1080i's 60 half-frames per second. This affords 720P some significant advantages in picture quality over 1080i, advantages such as improved motion rendition and freedom from interlace artifacts.


The advocates of 1080i HDTV support their cause with a flurry of numbers: 1080 lines, 1920 pixels per line, 2 million pixels per frame. The numbers, however, don't tell the whole story. If we multiply 1920 pixels per line times 1080 lines, we find that each 1080i frame is composed of about two million pixels. 1080i advocates are quick to point out that a 720P frame, at 1280 pixels by 720 lines, is composed of about one million pixels. They usually fail to mention that during the time that 1080i has constructed a single frame of two million pixels, about 1/30 second, 720P has constructed two complete frames, which is also about two million pixels. Thus, in a given one-second interval, both 1080i and 720P scan out about 60 million pixels. The truth is that, by design, the data rates of the two scanning formats are approximately equal, and 1080i has no genuine advantage in the pixel rate department. In fact, if the horizontal pixel count of 1080i is reduced to 1440, as is done in some encoders to reduce the volume of artifacts generated when compressing 1080i, the 1080i pixel count per second is less than that of 720P.
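The per-second pixel rates the article compares work out as follows (straight multiplication from the format parameters given above):

```python
# Pixels scanned per second by each broadcast HDTV format.
rate_1080i = 1920 * 1080 * 30       # 62,208,000 -- "about 60 million"
rate_720p = 1280 * 720 * 60         # 55,296,000 -- "about 60 million"
rate_1080i_1440 = 1440 * 1080 * 30  # 46,656,000 -- reduced-sampling encoders

print(rate_1080i, rate_720p, rate_1080i_1440)
print(rate_1080i_1440 < rate_720p)  # True, as the article notes
```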


Another parameter 1080i advocates use to advance their cause is resolution. Resolution is the ability to preserve the separate components of fine detail in a picture, so that they may be discerned by the viewer. But picture quality is not dependent on resolution alone. Numerous studies of perceived picture quality reveal that it is dependent on brightness, color reproduction, contrast, and resolution. Color reproduction is identical in all HDTV scanning formats, and may thus be disregarded as a factor. A typical study assigns the following weights to brightness, contrast, and resolution:


Contrast 64%


Resolution 21%


Brightness 15%


Resolution, then, is only a factor, and not the largest factor, in the determination of the subjective quality of a television picture. This was well illustrated in an industry meeting of professional video engineers that took place a few years ago. At that meeting, two direct-view (cathode ray tube) monitors of the same size, shape, and brand were fed the same HDTV signal. One of these monitors was priced in the $40,000 range, while the other was priced in the $4000 range. The $40,000 monitor unsurprisingly had a picture tube of far higher resolution capability than the lesser priced monitor, but the lesser monitor, because of its larger pixel "dots", had the higher contrast ratio, the relationship between the lightest and darkest parts of the picture. With a single exception, the engineers preferred the pictures displayed on the lower-definition monitor. While they seem at first glance to contradict intuition, the results of this demonstration are consistent with all the published literature on the subject.
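The study's weights can be read as a simple weighted score. The monitor ratings below are hypothetical numbers chosen only to illustrate why the higher-contrast, lower-resolution display won the comparison:

```python
# Weighting of subjective picture quality per the study cited above
# (contrast dominates; resolution is a minor factor).
WEIGHTS = {"contrast": 0.64, "resolution": 0.21, "brightness": 0.15}

def quality_score(contrast, resolution, brightness):
    # Each input is a normalized 0-1 rating of that attribute.
    return (WEIGHTS["contrast"] * contrast
            + WEIGHTS["resolution"] * resolution
            + WEIGHTS["brightness"] * brightness)

# Illustrative ratings: the $4000 monitor has lower resolution but
# higher contrast, so it scores higher overall.
cheap = quality_score(contrast=0.9, resolution=0.6, brightness=0.8)
pricey = quality_score(contrast=0.6, resolution=0.95, brightness=0.8)
print(cheap > pricey)   # True
```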


Television pictures move, so when we consider resolution, dynamic resolution is typically a more important factor than static resolution. We have seen that the goal of interlaced scanning in NTSC was to effectively provide about 480 lines of vertical resolution, while keeping the vertical repetition rate above the critical flicker threshold. The latter goal was met, but it has long been known that the former goal was not. A 480-line interlaced picture only has a vertical resolution near 480 lines when it is a still picture. In the interlaced scanning structure, the two halves of a frame are separated in time by 1/60th second, and consequently, when something in the picture moves in the vertical dimension between half-frames, vertical resolution is compromised. In the worst case, the resolution of an image that moves vertically is reduced to half, or about 240 lines. Similarly, a moving 1080i picture may have its vertical resolution reduced to around 540 lines. Thus, the real vertical resolution of a 1080i picture dynamically varies between the limits of almost 1080 lines and almost 540 lines, depending on the degree and speed of motion. This resolution degeneration in interlaced scanning has been well known for many years, and its degree is quantified by application of the interlace factor, which effectively specifies dynamic vertical resolution as a percentage of the total number of lines in an interlaced frame. Progressive scanning does not have this problem, and the dynamic vertical resolution of a 720P picture is very close to 720 lines under any conditions of motion.


As long ago as 1967, a Bell Laboratories study concluded that the degree of resolution enhancement that accrued from use of interlaced scanning over the number of lines in a single field depends on the picture brightness, but at normal brightness this enhancement amounted not to 100 percent, but only to about 20 percent, corresponding to an interlace factor of about 0.60.


Results of testing done by the Japanese broadcaster NHK in the early 1980's indicate that picture quality achieved with interlacing is nearly equivalent to that achieved from progressive scanning with only 60 percent of the number of scanning lines, which is an interlace factor of 0.60. This finding agrees with the 1967 study, and also with another study that was published back in 1958. What this means to the HDTV viewer is that the vertical resolution of any HDTV pictures that have a vertical motion component is better in 720P than in 1080i. Based on the above findings, progressively-scanned images equivalent to the observed dynamic vertical resolution of 1080i may be achieved using only 648 lines. If we want to play a numbers game, 720P has better dynamic vertical resolution than 1080i by 72 lines.
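The interlace-factor arithmetic above, as a sketch:

```python
# Dynamic vertical resolution under motion, using the ~0.6 interlace
# factor cited in the article.
INTERLACE_FACTOR = 0.6

def dynamic_vertical_resolution(total_lines, interlaced):
    # Progressive scanning keeps its full line count under motion;
    # interlaced scanning effectively drops to factor * total lines.
    return total_lines * INTERLACE_FACTOR if interlaced else total_lines

print(round(dynamic_vertical_resolution(1080, interlaced=True)))   # 648
print(round(dynamic_vertical_resolution(720, interlaced=False)))   # 720
# 720P's dynamic-resolution edge over 1080i: 720 - 648 = 72 lines
```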


Horizontal motion also causes artifacts when interlaced scanning is used. Depending on its speed, horizontal motion in interlaced scanning generates distortions that range from serrated edges, through blurriness, to double images in the extreme case.


But wait, there's more! The resolution impairments of interlace, plus the fact that progressive scanning affords far better motion rendition than interlaced scanning, make it apparent that a football game, for example, would be much more enjoyable in 720P than in 1080i. Add to this its freedom from other well-known interlace artifacts such as visibility of scanning lines, line crawl, and flickering aliases, and it quickly becomes clear that 720P is equal to, if not better than, 1080i in the representation of real-world, moving television images.


We have seen that interlaced scanning was born as a compromise to conserve analog bandwidth; a compromise that results in picture impairments and artifacts. A DTV broadcast is limited not by analog bandwidth but by digital bandwidth: the critical limitation is on the number of digital bits per second that may be transmitted. In order to broadcast DTV pictures, their bit rate must be aggressively reduced by digital compression to fit within the broadcast channel or pipeline that is available. The digital bits representing HDTV pictures must be compressed by a ratio that averages around 70 to 1 in order to fit into the 19 megabit-per-second DTV transmission channel. This creates a "funnel effect": for each 70 bits that enter the funnel's large end, only a single bit passes through the small end of the funnel into the transmission channel. Digital compression technology is improving rapidly, but it has been consistently observed that 720P HDTV pictures may be compressed much more aggressively than 1080i pictures before they become visually unacceptable. In fact, compression of 1080i pictures routinely generates visible artifacts, particularly when the pictures contain fast motion or fades to or from black. These artifacts cause the picture to degenerate into a blocky, fuzzy, mosaic, that may be observed frequently in 1080i broadcasts. The stress level to the HDTV broadcast system caused by bit rate reduction is much lower for 720P, and blockiness artifacts are seldom observed in 720P broadcast pictures. It may be expected that 720P will always lead 1080i in compressibility and freedom from compression artifacts, because progressive scanning is by its nature superior in the area of motion estimation. This gives it a "coding gain" relative to interlaced scanning, and the result will always be delivery of the same picture quality at a lower bit rate.
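The "funnel" ratio can be approximated from raw studio bit rates. The sampling format used here (10-bit 4:2:2, i.e. 20 bits per pixel on average) is an illustrative assumption, not something the article specifies; actual production formats vary:

```python
# The "funnel effect": raw studio bit rate vs the 19 Mbit/s ATSC channel.
# Assumes 10-bit 4:2:2 sampling, 20 bits/pixel on average (an assumption).
BITS_PER_PIXEL = 20
CHANNEL_BPS = 19_000_000

raw_1080i = 1920 * 1080 * 30 * BITS_PER_PIXEL   # ~1.24 Gbit/s
raw_720p = 1280 * 720 * 60 * BITS_PER_PIXEL     # ~1.11 Gbit/s

print(round(raw_1080i / CHANNEL_BPS))   # ~65:1 under these assumptions
print(round(raw_720p / CHANNEL_BPS))    # ~58:1 under these assumptions
```

Both ratios land in the same ballpark as the roughly 70-to-1 average the article describes.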


Finally, let's take a closer look at the display. The resolution of any type of display is dependent on its dot pitch, which effectively defines the physical size of the dots, or screen pixels: the higher the resolution, the smaller each dot must be. We see this when considering computer monitors or printers: a 600 dot-per-inch printer makes a sharper image than a 300 dot-per-inch printer, and a 0.28 dot-pitch monitor makes a higher resolution image than a 0.50 dot-pitch monitor, and of course the higher resolution printer and monitor cost more than their lower-resolution counterparts.


In order to fully resolve a 1080i picture, a display screen must have about 6 million dots, and for 720P, the figure is about 2.75 million dots. The larger the number of dots required, the smaller each dot must be, and the smaller the dot, the less light it generates. The full resolution of 720P may be displayed using dots three times larger than 1080i for a given screen size, and this gives the HDTV viewer a brighter picture with a higher contrast ratio. As an added bonus, the lower resolution display is less expensive to make.
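The dot counts behind the article's "about 6 million" and "about 2.75 million" figures follow from three colored dots (red, green, blue) per displayed pixel:

```python
# Screen-dot counts needed to fully resolve each format.
def display_dots(h_pixels, v_pixels, dots_per_pixel=3):
    # Each displayed pixel needs red, green, and blue dots.
    return h_pixels * v_pixels * dots_per_pixel

print(display_dots(1920, 1080))   # 6,220,800 -- "about 6 million" for 1080i
print(display_dots(1280, 720))    # 2,764,800 -- "about 2.75 million" for 720P
```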


We saw previously that the real vertical resolution of 720P pictures is better than that of 1080i pictures. It is also true that the additional horizontal resolution that 1080i boasts cannot be displayed on any currently available consumer HDTV display of any technology. Fortunately for the viewer, it is not necessary to the enjoyment of HDTV. An instructive illustration is the much-admired digital cinema, where micromirror projectors are used to project theatrical features onto screens that may be 50 feet or more wide. The horizontal resolution capability of these projectors is 1280 pixels, the same as that of 720P, and we have not heard anyone complain that digital cinema has inadequate horizontal resolution.


Micromirror projection is one of several advanced display technologies that are now available. Others include LCD and plasma flat-panel displays. All these advanced displays are inherently scanned progressively, and 720P may be displayed on all of them without the potentially image-degrading de-interlacing step.


720P, when compared with 1080i, provides better dynamic resolution, better motion rendition, the absence of interlace artifacts, and the absence of compression artifacts. It makes brighter pictures with a higher contrast ratio than 1080i. It is well matched to the resolution capability of consumer displays. It is a forward-looking technological choice that is compatible with computers, with advanced display technologies, and with the display of text and multiple windows as well as conventional television pictures. Given all this, the technological choice between 720P and 1080i is not a difficult one. The topic of subjective picture quality is complex, but the reasons ABC chose 720P HDTV may be distilled down to a simple truth: it gives the viewer better HDTV pictures.
 

·
Banned
Joined
·
29,681 Posts
1080i vs 720p

What Is True HDTV - 1080i vs 720p


by William F. Schreiber,

Prof. Emeritus of Electrical Engineering, MIT
[email protected]


The Manicheans, a third-century Persian religious sect, divided the world into two states: spirit vs. matter. Ever since, people have repeated the pattern of Manichean dualism in every aspect of their lives: love vs. sex, art vs. commerce, toMAY-to vs. to-MAH-to. Tech heads are not immune. We also revel in dualism: Beta vs. VHS, analog vs. digital, DVD vs. Divx.


Thanks to the pluralism inherent in the digital television standard, which embraces 18 different video formats, Manichean dualism is taking even more new forms. The most obvious one is HDTV vs. SDTV, or high-definition vs. standard-definition TV. You've probably already figured out that standard-def is a euphemism and that for a large high-end home-theater screen, high-def is the stuff to get. But the specs include two different ways of producing an HDTV picture, so here comes a new technological divide: 1080i vs. 720p.


The Advanced Television Systems Committee (ATSC), which originated the 18-headed beast, says both qualify as true HDTV. So does the Consumer Electronics Manufacturers Association, an influential trade group representing makers of TVs and other A/V products. But for various reasons, other parties to the HD debate are choosing sides.


Most TV makers who have staked out a position at this writing have chosen to display HDTV in 1080i. Among their supporters are CBS, NBC, HBO, and Harry Somerfield, this magazine's technical editor. But others claim 1080i is significantly flawed and that 720p is better-looking, better suited for new display technologies, and therefore more in tune with the future of digital television. On this side of the fence are ABC, Fox, the computer industry, and video guru Joe Kane of the Imaging Science Foundation. Still others state that on real-world sets the difference is so minor that the debate itself is a destructive distraction.


But we never shy away from controversy, so let's consider 1080i vs. 720p. Who's right? Which looks better? And does it really matter? Let's start by looking at the numbers.


Counting Pixels


Looking at the numbers is how 1080i proponents make their case. If you count scanning lines, the number 1080 is higher than the number 720, right? However, the preferred method is to count "pixels," or picture elements: the dots that make up the picture.


Do the math: multiply the number of active vertical pixels in each format by the number of horizontal pixels. In 1080i, 1080 times 1920 equals 2,073,600 dots. In 720p, 720 times 1280 equals 921,600 dots. When you count dots, 1080i seems to have more than twice as many dots as 720p, and therefore a picture that's more than twice as sharp.
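The paragraph's arithmetic, spelled out as a quick sketch:

```python
# The article's pixel math: active vertical pixels times horizontal pixels.
px_1080i = 1080 * 1920   # 2,073,600 dots
px_720p = 720 * 1280     # 921,600 dots
print(px_1080i, px_720p, px_1080i / px_720p)  # 2073600 921600 2.25
```

The exact ratio is 2.25, which is where "more than twice as many dots" comes from.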


Watching the Clock


However, numbers aren't everything. As the "i" and "p" imply, there is another distinction: 1080i is an "interlaced" format while 720p is a "progressive" format. Each uses a different method to turn a succession of still images into moving pictures. Interlaced scanning produces a still picture, or a "frame," by scanning two sets of alternating lines, or "fields." Progressive scanning creates a frame in one pass.


If both are moving at the same rate, "refreshing" the screen at the same number of passes per second, that gives progressive scanning a big advantage, because it scans a complete picture (a frame), not half a picture (a field). It produces fewer dots and lines per pass, but completes full pictures at twice the speed.


So now it's a question of timing. As ABC's FAQ explains: "The number of lines of resolution in progressive and interlace pictures is not an 'apples-to-apples' comparison. In the time it takes 720p to paint 720 lines, 1080i paints only 540 lines. And by the time 1080i does paint 1080 lines, 720p has painted 1440 lines."
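The FAQ's timing argument can be sketched in a few lines, assuming both formats refresh at the same 60 passes per second: 720p paints a full 720-line frame each pass, while 1080i paints a 540-line field.

```python
# Lines painted after each pass at a common refresh rate:
# 720p lays down a full frame per pass, 1080i half a frame (one field).
lines_per_pass = {"720p": 720, "1080i": 540}
for passes in (1, 2):
    for fmt, lines in lines_per_pass.items():
        print(f"after {passes} pass(es): {fmt} has painted {lines * passes} lines")
# after 2 passes: 720p has painted 1440 lines, 1080i 1080 lines
```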


As Sony's Phil Abram points out, the comparison is legitimate only if the timing, the refresh rate, remains constant. So far, that seems to be the case. While the standard allows different refresh rates, the first generation of digital sets refresh the picture, tracing new fields or frames across the screen, every 1/60th of a second, according to Joe Kane. Anything broadcast more slowly will be upconverted by the set, presumably to prevent slower-moving frames from inducing visible flicker.


Spatial vs. Temporal Resolution


The truth is that 1080i and 720p each look better in different situations. The 1080i format is better at producing fine detail in still frames and pictures with little or no motion; it's stronger at "spatial resolution." Regardless of how long it takes to produce a picture, that picture has more lines, more dots. But this works well only as long as nothing moves. Remember how two fields make a frame? If something moves, the trajectory of the motion changes between the alternating fields. That introduces "motion artifacts," or visible distortion, such as a stair-step pattern on a diagonal edge.


The 720p format excels at reproducing motion, introducing no visible distortion regardless of the timing of moving objects, so it has better "temporal resolution." Still frames won't look as sharp, because when you stop the clock, 720p has fewer dots and lines.


So in the Manichean dualism of static vs. moving images, which affects performance more? As Kane explains it, the performance of 1080i varies constantly between 540 and 1080, depending on the amount of motion in the picture at any given moment. "The whole idea of television is to have moving pictures. And since there is usually motion, the average resolution is always going to be well below 1080. It's rare that the picture would ever reach a resolution of 1080. Calculations based on average motion mean vertical resolution is going to be somewhere around 635 lines. It's not a numerical average because what determines the actual resolution is the amount of motion. But 720p has a solid 720 lines all the time. I believe 720p is a better direction for the time being."


Don't get too excited about temporal resolution, though. As Ed Milbourn, Thomson's manager for advanced TV planning, points out: "Movies have terrible temporal resolution." They're filmed at only 24 frames per second, slower than both 1080i and 720p! To minimize flicker, projectionists open the shutter twice per frame. Yet 35mm film is our standard for what video should aspire to: "film-like" is high praise.
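A quick check of the arithmetic behind Milbourn's film aside:

```python
# Film runs at 24 frames per second, and the projectionist's shutter
# opens twice per frame to cut flicker, so the screen flashes 48 times
# a second even though only 24 distinct images are shown.
film_fps = 24
shutter_openings_per_frame = 2
flashes_per_second = film_fps * shutter_openings_per_frame
print(flashes_per_second)  # 48
```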


According to Milbourn, the textbook definition of true HDTV includes: resolution of about twice the current analog NTSC standard, a wide screen, and "CD quality" (meaning digital) sound. Both 1080i and 720p qualify.


What About the Display?


It's fine to talk about the theoretical limits of technology. But what about the technology in our home theaters and TV dens? Here, it's a horse race, though 1080i seems to be winning on the manufacturing side, at least right now.


Echoing other 720p proponents, Kane says progressive scanning is more appropriate for gas plasma, DLP, and other next-generation display technologies. In fact, Kane says, even tube TVs "are more efficient when driven progressive-scan, so you get more light out of them. If I use a line doubler to convert NTSC to progressive, on most displays I can get a 40 percent increase in light output without distorting the signal. That's a serious amount of light. That's why I'm heavily behind the progressive-scan format."


However, 720p is not practical for affordable tube-based TVs, Sony's Phil Abram says. "Interlaced vs. progressive is all tied to horizontal scanning frequency. That's how quickly you can draw that electron beam across the face of the picture tube." Interlacing, he explains, "gives you a break" because each field omits every other line, drawing only half the picture. "If you're going up to 60 [full] frames per second in 720p, the electronics, chassis, electron gun, shielding, and everything else becomes very expensive."


He says a good approach would be to display either 1080i or 480p, converting everything else to one of those two formats: "Those horizontal scanning frequencies are close together. You can center on those and make a cost-effective set." Some, though not all, manufacturers are doing just that.


At this writing, Toshiba has introduced 71- and 65-inch multi-scan rear projectors that show both true high-def 1080i and true standard-def 480p (the latter can come from new DVD players with a special ColorStream Pro progressive output), but 720p will be converted to 1080i, and analog NTSC to 480p. A widescreen 56-incher from Panasonic will work similarly; Panasonic is also offering two SDTV models. Mitsubishi has unveiled seven sets, both widescreen and conventional, that display 1080i at 1080i and convert analog NTSC, 720p, and 480p to its own unique scanning format of 960i. The odd man out is Pioneer, whose 50-inch gas-plasma set will display 768 x 1280 progressive, slightly in excess of the 720p standard's vertical res. Everything, including 720p, gets converted to vertical res of 768. Many more choices will be available by the time you read this.


Although gas-plasma and liquid-crystal displays are both "natively progressive," Abram says, getting the full resolution of either 1080i or 720p is "an issue of how many pixels you can have in a particular display. Then it's a matter of getting the yields up and the size. This is extraordinarily expensive in any format." Guess that's why the Pioneer plasma costs $25,000. Even so, he adds, "I'm always surprised at what engineers can do when they have the right motivation and goals."


Transmission, Production, and the Future


Because it is digitally maladroit, 1080i has a major disadvantage in transmission, according to the ABC FAQ: "The 1080 x 1920 (1080i) format ... cannot be compressed to fit in a 6MHz channel without creating objectionable artifacts and it has been recommended that the 1920 pixels be sub-sampled to 1440 to reduce compression artifacts. Therefore, encoder manufacturers have elected to discard approximately 25 percent of the picture for over-the-air transmission. This compromise is not required for 720p. More of the original picture information remains through the transmission chain." John Malone of cable giant TCI has even said he would not voluntarily carry 1080i signals, calling it a "spectrum hog," though he added he had no problem with 720p.
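The FAQ's "approximately 25 percent" figure is just the fraction lost when each 1920-pixel line is sub-sampled to 1440 pixels for transmission:

```python
# Sub-sampling each 1920-pixel scan line to 1440 pixels discards
# a quarter of the horizontal picture information.
full_width, subsampled_width = 1920, 1440
discarded = 1 - subsampled_width / full_width
print(f"{discarded:.0%} of each line discarded")  # 25% of each line discarded
```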


However, regardless of their strengths and weaknesses, neither 1080i nor 720p is the last word in digital TV. As a previous column has explained, the ATSC standard specifies a 1080p format that combines the high spatial resolution of 1080i with the high temporal resolution of 720p. It requires even more compression for transmission than 1080i (assuming the frame rate stays the same). And few consumers, even those with high-end home theaters, could afford the almost impossibly expensive and high-quality display it demands. So it is not feasible as a transmission or display format for the vidiot on the street, now or in the near future. But Kane says it would be ideal for production and archiving. And it will make its debut next year at the National Association of Broadcasters convention. He should know: he's been asked to come up with the demo displays.



 

·
Registered
Joined
·
1,023 Posts
Quote:
Originally posted by Mfusick
Too bad CRT's don't do 720p native... which makes this whole thing pointless.



And too bad digital fixed-pixel displays don't/can't do 1080i.... which makes these points more pointless.....
And too bad that most CRT RP sets can't do true 1080i either (having 7" tubes, they are too small for the full 1080-line screen resolution).


Only the few which use 9" tubes can come close to true 1080i; however, those sets are priced a bit high for your average consumer.


At least the fixed pixel displays can handle true 720p resolutions.
 

·
Registered
Joined
·
845 Posts
Too many of the numbers are misleading. Just because 720P draws 720 lines per field, it isn't accurate to say that it does 1440 lines per frame, as it is redrawing the same lines. All this gives it is more light output. And though 1080i only draws 540 lines per field, its next field draws a different 540 lines, which gives the appearance of 1080 lines. Without a doubt, 1080i has much more resolution than 720P. The kink is that with fast-moving scenes, 720P is more stable. I think they tried to set the bar higher with 1080i, by virtue of emerging technologies that will actually operate at 1080P, which will be decidedly better than 720P. And hopefully the next generation of DLP chips will indeed be 1080P devices. I think that right now, ABC is making the best choice in using 720P for MNF.
 

·
Banned
Joined
·
29,681 Posts
Quote:
Originally posted by luclin999
And too bad that most CRT RP sets can't do true 1080i either (having 7" tubes they are too small to have the full 1080 screen resolution.)


Only the few which use 9" tubes can come close to true 1080i. however those sets are priced a bit high for your average consumer.


At least the fixed pixel displays can handle true 720p resolutions.
It's not the 1080 part of a 1080iX1920 signal that CRT's have a problem with.


It's the 1920 theoretical part; the key word here is THEORETICAL.


People bash CRT for not being able to resolve 1920 pixels across each scan line (9" CAN DO IT) but..... DOES THIS REALLY MATTER?


I SAY NO.


Because:


There really is no content or sources that support 1920 pixels per scan line!


Most HDTV-spec cameras are limited to about 1440i at the most... and even more of this is lost in the post-mastering process because of technology and mastering equipment limitations....


HDNET on DIRECTV is probably 1080iX1220~1320 tops.... (NOT 1080iX1920)


Can CRT's resolve 1300+ pixels per scan line? YES!


What can fixed pixels do? 1366 for most LCD and 1280 for DLP. Plasma is lower at 1024.


SO... Does CRT have lower resolutions in 1080i than LCD, DLP, or Plasma?


Do Plasma, LCD, and DLP offer increased resolution compared to CRT?



I would say that a high quality, well calibrated and converged CRT can keep up with any fixed-pixel display on 1080i programming.... and show better colors and contrast doing it.


Uncalibrated, CRT doesn't stand a chance because of convergence and focus requirements of the technology- and most people don't calibrate- SO DLP and LCD and Plasma do offer some real world benefits here.


I just don't understand why people beat up CRT for resolution so much, because it doesn't deserve it.
 