
HD quality: Film versus HD cameras

1930 Views 25 Replies 15 Participants Last post by scowl
From a technical standpoint, all other factors being equal, which HD method produces superior picture quality: HD converted from film (for these purposes, a new Hollywood release), or HD from HD cameras? Thanks.
HD video cameras. But of course "superior" means very different things to different people.
Apples and oranges. It isn't a fair or appropriate comparison. Both are superior in their own right when done well.
The IMAX films shown on INHD are a good example of a great film to HDTV transfer. The Tonight Show is a good example of using HD cameras on a well lit show. The Sound of Music was a good example of a poor transfer from film to HDTV.
I read somewhere that it took 4000 lines of resolution to pick up everything a 35mm frame had in it. Don't know if that's true or not.
Most people commenting here say that The Tonight Show and American Idol have the best HD PQ; however, those are videotaped shows. The majority of dramas are filmed, and how good the transfer is dictates the end quality. Shows like Malcolm and The O.C. are not the best film-to-HD transfers. However, I saw Mary Poppins on ABC in HD once and the PQ was so much better than The O.C., mostly because more care went into the film transfer and because Mary Poppins was shot on a larger film format than The O.C.


In television, news, reality shows, sports, soap operas, talk shows, and comedies are videotaped; sitcoms, dramas, and movies (obviously) are filmed. Filmed shows are 24 fps at the source, with a film gamma curve. That look is imitated by two SD prosumer camcorders. Some shows that look like film might actually be videotaped with a film-look process applied. Some say Arrested Development is one of those shows; it looks much cleaner than Malcolm, free of grain.


On another note, whatever happened to videotaping sitcoms, like Home Improvement and Married with Children? In the HD era, I have not seen any videotaped sitcoms yet.
Many CBS sitcoms are shot on HD, but at 24 fps. I believe Arrested Development is also shot that way, as are many other shows on Fox. You will see less and less film-originated drama and sitcoms on the networks because of cost; 1080p/24fps HD is much cheaper to shoot.


The reason you no longer see sitcoms videotaped is that videotape doesn't convert well for foreign markets. You need 24fps to do that well.
I watched "Married with Children" on British television years ago. It was videotaped and it looked as good as any other show.
Videotaping generally takes two forms for the 1080 format. 1080psf (progressive segmented frames) captures images at 24 fps to emulate film: 24 frames per second are captured, with each frame stored in two halves that can be recombined to represent a single frame. Like film, the 24psf tape format can easily be converted to other formats for international marketing.


The other widely used 1080 format captures 60 fields or half-frames per second (1080/60i). This interlaced format means that 540 odd-numbered video lines are captured at a different instant of time than 540 even-numbered lines. The two 1/60-second interlaced fields create a 1/30-second frame, so the format is sometimes expressed as 1080/30i, even though each 1/60-sec field is a 'snapshot' of an image. Conversion of interlaced images to other formats isn't practical for international marketing.
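As a minimal sketch of that field structure (Python/NumPy, with function names of my own invention): a 1080-line frame splits into 540 odd-numbered and 540 even-numbered lines, and psf reassembly is just weaving the two halves back together.

```python
import numpy as np

def split_into_fields(frame):
    """Split a 1080-line frame into its two 540-line fields.

    In true 1080/60i the two fields are exposed 1/60 second apart;
    in 1080psf they are simply the two halves of one 1/24-second
    exposure, stored separately and recombined on playback.
    """
    top = frame[0::2, :]     # odd-numbered picture lines (1, 3, 5, ...)
    bottom = frame[1::2, :]  # even-numbered picture lines (2, 4, 6, ...)
    return top, bottom

def weave_fields(top, bottom):
    """Recombine two 540-line fields into one 1080-line frame."""
    frame = np.empty((top.shape[0] + bottom.shape[0], top.shape[1]), dtype=top.dtype)
    frame[0::2, :] = top
    frame[1::2, :] = bottom
    return frame

# Example with a synthetic 1920x1080 luma frame
frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)
top, bottom = split_into_fields(frame)
assert np.array_equal(weave_fields(top, bottom), frame)
```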


1080/60i tapes, seen as 'wow-effect' images on HDNet, INHD, etc., have smoother motion, since images (half-frames) are captured every 1/60 second instead of every 1/24 second as with film or 1080psf tape. Also, 1080/60i captures more information per second, making images appear 'crisper' than images captured at 24 fps--or even images captured at 1080/50i (an overseas format). And 24-fps film or tape must be converted to 1080/60i before being broadcast. This involves "2:3 pulldown" (720p broadcast differs), which repeats fields and varies the playback speed slightly to achieve 1080/60i. One result is motion 'judder' created by the field repetition.
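Here's a small sketch of that 2:3 pulldown (Python; the frame labels are just placeholders): each group of four film frames is held for 2, 3, 2, and 3 fields, so 24 frames/sec come out as exactly 60 fields/sec, and the repeated fields are where the judder comes from.

```python
def two_three_pulldown(film_frames):
    """Lay 24 fps frames onto a 60-field/sec stream with a 2:3 cadence.

    Each group of four source frames (A, B, C, D) is held for
    2, 3, 2 and 3 fields respectively, so 24 frames become 60 fields.
    """
    cadence = [2, 3, 2, 3]
    fields = []
    for i, frame in enumerate(film_frames):
        for _ in range(cadence[i % 4]):
            # alternate top/bottom parity as fields are emitted
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

# One second of film (24 frames) -> one second of 1080/60i (60 fields)
fields = two_three_pulldown([f"F{n:02d}" for n in range(24)])
print(len(fields))  # 60
```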


With film, of course, images must first be optically scanned (telecined) onto tape or discs. While film carries far more detail than current HDTV can deliver, it appears most telecines capture far less resolvable detail (800--1300 lines/picture width) than the ~1700-line limiting resolution of the 1080 format. (The format is still 1920x1080, but the resolvable detail is limited to ~1700 lines/PW unless oversampling above the standard ~74-MHz sampling rate is used.) -- John
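A rough back-of-the-envelope version of that arithmetic (Python; the filter factor here is an assumption chosen only to line up with the ~1700-line figure, not a measured value):

```python
# 1080/60i line structure: 1125 total lines per frame, 30 frames/sec,
# sampled at 74.25 MHz -> 2200 samples per line, of which 1920 are active.
sampling_rate_hz = 74.25e6
total_lines_per_sec = 1125 * 30
samples_per_line = sampling_rate_hz / total_lines_per_sec   # 2200
active_samples = 1920

# The absolute ceiling is one TV line per active sample (1920 lines/PW);
# practical camera/telecine filtering knocks that down. A filter factor
# of ~0.88 (assumed here) reproduces the ~1700 lines/PW limit cited above.
filter_factor = 0.88
limiting_resolution = active_samples * filter_factor
print(round(samples_per_line), round(limiting_resolution))  # 2200 1690
```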
Quote:
Originally posted by John Mason
With film, of course, images must first be optically scanned (telecined) onto tape or discs. While film carries far more detail than current HDTV can deliver, it appears most telecines capture far less resolvable detail (800--1300 lines/picture width) than the ~1700-line limiting resolution of the 1080 format. (The format is still 1920x1080, but the resolvable detail is limited to ~1700 lines/PW unless oversampling above the standard ~74-MHz sampling rate is used.) -- John
Certainly that's true for TV series, where budgets and time constraints don't allow for higher-quality (and time-consuming) scans. However, more feature films are utilizing 4K scans that preserve much more of the original film resolution than 2K scans that are still commonplace. I would suspect that as these scanners become the norm in feature production, the prices (and speed of the scans) will get to the point where television productions can both afford and utilize higher grade scanning.


That being said, some of the best-looking shows on TV (CSI, Lost, Desperate Housewives and - formerly - NYPD Blue) are film-based productions. Editing, lighting and other factors can play as much of a role, or more, in the quality of the image.


Film is still king in Hollywood because it's predictable yet infinitely variable, allowing nearly any look to be achieved. Many of the results that can be created on set with film require extra post work to simulate in video. However, you can't beat the convenience of a multi-camera video setup for sitcoms and the soaps, where quickness is essential.


Video excels in a controlled soundstage environment, while film is more easily adaptable to the changing conditions of location shooting (where color of light is constantly changing as the day progresses, or over the course of a few days). In addition, the contrast ratio of film makes the use of shadows and hard lighting more effective for dramatic purposes. Video's smaller range makes it more adaptable to a more neutrally lit set. Video thrives on light, while film devours contrast. Video is improving in leaps and bounds in this area, though.


There are other differences, too. Video camera lenses tend to "breathe" during focus changes (the image on the screen actually expands and contracts during a focus pull), where film camera lenses generally don't (this happens only under rare circumstances when the wrong type of lens is used on a film camera during a rack focus - it's rare if the DP knows his job). Film-style lenses can minimize this on video cameras, but those lenses are very expensive for productions on a strict budget. Another big feature of film is the usage of the frame (or, in some cases, lack thereof). Film is not widescreen by its nature, so it's usually cropped for widescreen use. This gives the director and editor the ability to customize cropping in case something enters the frame that shouldn't have or the framing just isn't quite right. The extra resolution of film usually allows shots to be "tightened" by blowing the image up a small amount. Finally, the film negative allows any format to be achieved without loss of image information, such as opening the matte for a "full-screen" presentation or tighter cropping for theatrical release.


Finally, because film currently contains greater resolution than current HD standards, it's upgradeable to the next format way down the road. Current HD doesn't offer that ability, though even current HD masters would still look better than what we see at home now, due to compression for broadcast/cable/satellite, etc.


That being said, the gap is narrowing. HD video contrast ratios and color control are improving all the time as technology improves. It's slowly moving toward simple preference on the part of directors and DPs, rather than any technical reasons. We have a long way to go, though, before more minds are changed. The fact that most films are posted digitally is a huge step in that direction. Larger numbers of productions shooting video can't be far behind, though with improved film scanning that change may take a step back.


The biggest obstacle yet to come is archiving. Unless stored on a non-magnetic medium, video degrades quickly. Film masters degrade too, but much more slowly. That's why some films that are decades old still look great, even with minimal restoration efforts. The biggest problem with film is the color dyes. They degrade much more quickly than the rest of the image, leaving a dull, lifeless image. Older black-and-white films that were stored properly often look better than color films that came a decade or more later. Like all things, technology marches on, and these obstacles will soon become minor bumps in the road.
Great post NetworkTV. All very good points.
Quote:
Originally posted by NetworkTV
Film-style lenses can minimize this on video cameras, but those lenses are very expensive for productions on a strict budget. Another big feature of film is the usage of the frame (or, in some cases, lack thereof). Film is not widescreen by its nature, so it's usually cropped for widescreen use.
...although it may be worth noting that in the world of television, this typically isn't the case. Super 16/Super 35/3-perf give a flat negative of anywhere between 1.66:1 and 1.78:1, requiring little or no cropping to extract a widescreen image, and the 1.33:1 version seen in analog broadcasts and the like are center-extracted from that. I know there are a number of exceptions to that, but from what I've read, it seems like that's the most frequent approach to shooting wide for television.
This is a wonderfully informative thread! I have really enjoyed reading the information posted. Thank you all!
Quote:
Originally posted by Adam Tyner
...although it may be worth noting that in the world of television, this typically isn't the case. Super 16/Super 35/3-perf give a flat negative of anywhere between 1.66:1 and 1.78:1, requiring little or no cropping to extract a widescreen image, and the 1.33:1 version seen in analog broadcasts and the like are center-extracted from that. I know there are a number of exceptions to that, but from what I've read, it seems like that's the most frequent approach to shooting wide for television.
Of course. I was just making the point that film offers the ability to be used in any format, unlike video which pretty much leaves you with widescreen or 4:3. Production folks like to have options.
What's driving me nuts at the moment are programs that were filmed or taped in 24 fps but have edits that break the pulldown pattern. Yes, I understand that no one edits television shows in film any more and these edits have no consequences to anyone except the few of us who like to convert these shows back to their original 24 fps so they take up less space (I can get two to three hours of 24 fps HD on a DVD with MPEG4). These video edits sometimes cause pullup filters to grab the wrong frames, causing the video to stutter for a second or two. I'm sure I'll find a way around this problem.
You should never be seeing that. Post houses generally go to a great deal of trouble to make sure that doesn't happen since it introduces so many problems in the transmission chain.
scowl, have you ever A-B compared original 1080i(?) movie recordings with the reverse-telecined version? If so, can you detect significant differences for judder, interlace versus progressive, etc.? For HD, my CRT RPTV setup displays interlace only. -- John
Quote:
Originally posted by scowl
What's driving me nuts at the moment are programs that were filmed or taped in 24 fps but have edits that break the pulldown pattern. Yes, I understand that no one edits television shows in film any more and these edits have no consequences to anyone except the few of us who like to convert these shows back to their original 24 fps so they take up less space (I can get two to three hours of 24 fps HD on a DVD with MPEG4). These video edits sometimes cause pullup filters to grab the wrong frames, causing the video to stutter for a second or two. I'm sure I'll find a way around this problem.
It's not the editing process. Even digital editing systems can edit true 24fps video; the problem is that no one broadcasts it that way. What you're seeing is caused by your conversion process: you're trying to convert a 24fps source that has already been converted to 30 or 60fps back to 24fps, and that screws with MPEG big time.
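As a rough illustration of what those pulldown-removal ("pullup") filters are doing, and why a cadence broken by a video edit trips them up (Python/NumPy sketch; the field representation and threshold are my own simplification, not any particular tool's method):

```python
import numpy as np

def find_pulldown_repeats(fields, threshold=1.0):
    """Flag fields that look like 2:3 pulldown repeats.

    `fields` is a list of decoded fields as NumPy arrays. Each field is
    compared against the previous field of the same parity (two
    positions back in an alternating top/bottom stream); a near-zero
    difference marks a repeated field that can be dropped to recover
    the original 24 fps frames. A video edit that breaks the 2:3
    cadence shifts where the repeats fall, so a filter locked onto the
    old pattern drops the wrong fields and motion stutters until it
    re-locks.
    """
    repeats = [False] * min(2, len(fields))
    for i in range(2, len(fields)):
        diff = np.mean(np.abs(fields[i].astype(np.int16) -
                              fields[i - 2].astype(np.int16)))
        repeats.append(diff < threshold)
    return repeats
```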
NetworkTV wrote


"However, more feature films are utilizing 4K scans that preserve much more of the original film resolution than 2K scans that are still commonplace. I would suspect that as these scanners become the norm in feature production, the prices (and speed of the scans) will get to the point where television productions can both afford and utilize higher grade scanning."




quoted from filmandvideomagazine.com, "D-Cinema Dilemma" by George Jarrett

"The studios are quite worried that HDTV is a consumer format, and will be rolled out much quicker than was originally thought. If they don’t better HD, the offering in the home is going to be the same as the offering in the theater, so they are looking at compression schemes that maybe give them RGB 4:4:4 and 12 bits rather than 8 bits, and obviously give them up to 4K studio resolution,†he added.


Wilson believes that once the politics of the technology — like the 2K versus 4K and answer-print conflicts — have stabilized, what actually rolls out will be related to venue size.

"But I think I will stick with my bet on us ending up with graded cinemas."


Wilson is manager of HDTV and advanced technology at Snell & Wilcox, chairs the European Digital Cinema Forum’s technical group and has become a big ally of the Digital Cinema Initiative (DCI), which comprises the major Hollywood studios.
http://www.filmandvideomagazine.com/...cinema0603.htm

- - -


In 2004, "Spider-Man 2" was the first Hollywood feature film to use 4K resolution all the way from scanning of the film negative through to digital cinema projection.





mmost wrote on 05-01-03

"Telecine transfer of dailies is a rather mechanical step that is done to established standards, usually on a "graveyard" shift (i.e., night shift) and is not usually intended to establish the final look of the program. There is no difference between television series, MOW's, or features to a more limited extent, in how this is done or how much time it takes. Final color correction is usually budgeted at somewhere between 10 and 16 hours for a one hour (i.e., approx. 44 minute) television series episode, including one hour to record, or "lay off" the corrected show. Some series, like CSI, require more massaging in color correction than other, more "straightforward" programs like, say, Boston Public, and are budgeted accordingly. This would also be the case with shows that are "cuttier," since more cuts require more time. MOW's vary, but most of them are done over a 3-4 day period (i.e., 24-32 hours), or a bit less than two series episodes."

- - -




A TV show like Carnivàle spends a lot of time in color correction, as described in a February 2005 article in Millimeter magazine.
http://millimeter.com/mag/video_carnivale_life/




This article details how and why a TV show shot on 35mm looks as good as a feature film.

Guys and gals, the gap is narrowing: the quality and the tools used for [narrative] television programs and Hollywood feature films are converging, because the work is currently performed either at HD (1920x1080 resolution) or very close to it at 2K film resolution (the Hollywood standard before Spider-Man 2).


comparison:

HDTV 1920x1080

2K 2048x1556

4K 4096x3112




don't rule out future formats....


video:

UHDV 7680 by 4320 pixels (replacement for HDTV in 15 years, currently in research and development stages)


film scanned at:

8K 8192x6224
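Putting those figures side by side as raw pixel counts (a quick Python sketch; the ratios simply restate the resolutions listed above):

```python
formats = {
    "HDTV 1080":    (1920, 1080),
    "2K film scan": (2048, 1556),
    "4K film scan": (4096, 3112),
    "UHDV":         (7680, 4320),
    "8K film scan": (8192, 6224),
}

hd_pixels = 1920 * 1080
for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name:13s} {w}x{h:<5d} {pixels / 1e6:5.1f} Mpixels "
          f"({pixels / hd_pixels:4.1f}x HDTV)")
```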




-kspaz