The Digital Bits: grain is not a defect on the disc! - Page 6

post #151 of 338
Quote:
Originally Posted by MovieSwede View Post

But does the digital cam really need less noise than the best filmstocks?

Not sure what you mean by that?
I've seen digital footage from a few sources (Genesis, mainly) and I'd describe them all as looking noisier than a nominally exposed modern filmstock.

The assumption that digital capture will avoid noise-type artifacts is still something I've yet to see borne out.
post #152 of 338
Quote:
Originally Posted by Mr.D View Post

Not sure what you mean by that?
I've seen digital footage from a few sources (Genesis, mainly) and I'd describe them all as looking noisier than a nominally exposed modern filmstock.

The assumption that digital capture will avoid noise-type artifacts is still something I've yet to see borne out.

I was referring to the fact that digital doesn't need to be better than film to replace it, just good enough that the difference won't be an issue for the production.
post #153 of 338
Quote:
Originally Posted by MovieSwede View Post

I was referring to the fact that digital doesn't need to be better than film to replace it, just good enough that the difference won't be an issue for the production.

The whole premise of bringing digital into this discussion is that digital capture will produce noise-free images in comparison to imagery that contains film grain. It's not about whether digital will replace film.

Some people have made the assumption that digital capture will have a visibly less noisy picture compared with film grain. I've yet to see this, and in fact I would describe most digitally captured imagery I've seen as looking noisier than nominally exposed film.
post #154 of 338
I just watched Prince Caspian on my local theater's 4K digital projector. They added grain, of course. I actually found it kind of distracting for a few seconds... I think mostly because I was conscious of it. I wish they wouldn't.

For instance, I watched Caddyshack for the first time on HDM this weekend. I found the grain really seemed to "degrade" the image in my eyes. I had only seen it in 480p on a 27" screen. You really see it in HDM at 92".

Will say this about the HD: I never realized that Cindy Morgan was sporting the late-'70s/early-'80s no-bra look in that movie. Found it quite distracting... in a good way.
post #155 of 338
Quote:
Originally Posted by bjmarchini View Post

I just watched Prince Caspian on my local theater's 4K digital projector. They added grain, of course. I actually found it kind of distracting for a few seconds... I think mostly because I was conscious of it. I wish they wouldn't.

What do you mean by "they added grain"? How did you come to this assumption?
post #156 of 338
Since Caspian was shot on S35, the source should already have all the grain it needs.
post #157 of 338
Quote:
Originally Posted by Mr.D View Post

What do you mean by "they added grain"? How did you come to this assumption?

Because it is a massively digitally edited movie.

Warner and Sony have gone to all-digital "filming". It would make sense for Disney to film this in digital, as it requires so much post-processing for all the CGI animation.

Adding grain is nothing new. Look at 300.

Are you sure it was shot on S35? I did a search under S35 and nothing came up.
post #158 of 338
I know a lot of people would hate me for this, but I hate screen grain. I want it to look as real as possible, which is why I went HD. There is no screen grain in real life..... unless you need glasses or are in a super dusty room.

I kinda view it in the same light as people who still want claymation or puppets instead of CGI. There are those out there for these too, if the technology is there.
post #159 of 338
Camera
Arricam Cameras, Zeiss Master Prime Lenses
Arriflex 235, Zeiss Master Prime Lenses
Arriflex 435 Xtreme, Zeiss Master Prime Lenses

Film negative format (mm/video inches)
35 mm

Cinematographic process
Digital Intermediate (master format)
Super 35 (source format)

Printed film format
35 mm (anamorphic)

Aspect ratio
2.35 : 1

http://www.imdb.com/title/tt0499448/technical


It doesn't matter if you shoot digital or on film for the post-production, since they use a DI.
post #160 of 338
Quote:
Originally Posted by bjmarchini View Post

I know a lot of people would hate me for this, but I hate screen grain. I want it to look as real as possible, which is why I went HD. There is no screen grain in real life..... unless you need glasses or are in a super dusty room.

I kinda view it in the same light as people who still want claymation or puppets instead of CGI. There are those out there for these too, if the technology is there.

There is no orchestra playing during car chases or when someone gets murdered either.
post #161 of 338
Quote:
Originally Posted by MovieSwede View Post

There is no orchestra playing during car chases or when someone gets murdered either.

Yeah, but that is different. You know what I mean. There aren't really dinosaurs or monsters either.... usually.

The point is I just don't understand the drawback of removing degradation inherent in a medium when a better alternative is available.

I am not saying that screen grain for artistic purposes won't enhance an experience.

Here is a better example. When color was introduced, most films jumped at the chance to go color. Isn't this the same argument? Improving the image. There are some instances where using black and white enhances a presentation (Citizen Kane, Sin City, Schindler's List....). It doesn't mean that I want to see Star Wars in black and white, or most others. I understand putting screen grain in 300. I can sorta understand it in a movie like Indiana Jones. I don't understand it in Narnia. Why would they use S35 when digital is available? It would actually be more interesting if they used screen grain for the England shots and digital in Narnia, like The Wizard of Oz.

On the other hand, I would not go back and artificially remove grain on older films. I would relate this to "colorising" black and white films.
post #162 of 338
Quote:
Originally Posted by bjmarchini View Post

There is no screen grain in real life..... .

Remove grain and lose image detail. Also, you are probably observing how the (A/D) digitizing process actually emphasizes film grain in an attempt to achieve transparency.
post #163 of 338
Quote:
Originally Posted by bjmarchini View Post

Here is a better example. When color was introduced, most films jumped at the chance to go color. Isn't this the same argument? Improving the image. There are some instances where using black and white enhances a presentation (Citizen Kane, Sin City, Schindler's List....). It doesn't mean that I want to see Star Wars in black and white, or most others. I understand putting screen grain in 300. I can sorta understand it in a movie like Indiana Jones. I don't understand it in Narnia. Why would they use S35 when digital is available? It would actually be more interesting if they used screen grain for the England shots and digital in Narnia, like The Wizard of Oz.

Very good example. I agree with you.
post #164 of 338
Quote:
Originally Posted by bjmarchini View Post


Here is a better example. When color was introduced, most films jumped at the chance to go color. Isn't this the same argument? Improving the image. There are some instances where using black and white enhances a presentation (Citizen Kane, Sin City, Schindler's List....). It doesn't mean that I want to see Star Wars in black and white, or most others. I understand putting screen grain in 300. I can sorta understand it in a movie like Indiana Jones. I don't understand it in Narnia. Why would they use S35 when digital is available? It would actually be more interesting if they used screen grain for the England shots and digital in Narnia, like The Wizard of Oz.

On the other hand, I would not go back and artificially remove grain on older films. I would relate this to "colorising" black and white films.


I can expand a bit. I'm not that much into grain, but when I use a projector the grain actually helps create the illusion that I'm watching film, not an LCD. So it helps bring the theater into the home theater. But if they shoot the movie on low-grain filmstocks I'm perfectly fine with that. I just don't want DNR, other than for restoration purposes like Blade Runner.


As for the use of S35 when digital is available: mainly because S35 is still better, even if digital is starting to catch up.
post #165 of 338
Quote:
Originally Posted by tbrunet View Post

Remove grain and lose image detail. Also, you are probably observing how the (A/D) digitizing process actually emphasizes film grain in an attempt to achieve transparency.

Ok, you are losing me here. There is a difference between removing screen grain and adding screen grain. If you remove screen grain, there is a possibility you are going to soften the image. If it is shot in digital, there is nothing to lose in detail. If anything, adding screen grain to a digital image will degrade the detail. As I stated in my example about the black-and-white-to-color transition, I am not for removing ACTUAL screen grain, just against adding it OR using film when digital is available.

I can understand if S35 will give you more detail than the current digital standard. But everything is going digital eventually. From a home theater enthusiast's perspective, I would actually prefer the "inferior" digital, as I don't know that we could see the enhanced difference at 1080p anyway. At 4K+, yes, but probably not at 1080p. Of course, this goes back to another discussion of whether movies are moving toward the home theater and away from the actual "big screen".

Wikipedia actually has a decent article on the subject.

I don't know that we would see that much of a difference between the newer, more expensive 4K cameras and S35. While they may still use 2K for lower-budget films, this would almost definitely have been shot with 4K cameras if done digitally. Ocean's 13 was shot in digital... and looked pretty good to me. Not the best of the three movies, but definitely good PQ.

Quote:


Wikipedia

Resolution
Substantive debate over the subject of film resolution vs. digital image resolution is clouded by the fact that it is difficult to meaningfully and objectively determine the resolution of either.

Unlike a digital sensor, a film frame does not have a regular grid of discrete pixels. Rather, it has an irregular pattern of differently sized grains. As a film frame is scanned at higher and higher resolutions, image detail is increasingly masked by grain, but it is difficult to determine at what point there is no more useful detail to extract. Moreover, different film stocks have widely varying ability to resolve detail.

Determining resolution in digital acquisition seems straightforward, but is significantly complicated by the way digital camera sensors work in the real world. This is particularly true in the case of high-end digital cinematography cameras that use a single large bayer pattern CMOS sensor. A bayer pattern sensor does not sample full RGB data at every point; each pixel is biased toward red, green or blue[7], and a full color image is assembled from this checkerboard of color by processing the image through a demosaicing algorithm. Generally with a bayer pattern sensor, actual resolution will fall somewhere between the "native" value and half this figure, with different demosaicing algorithms producing different results. Additionally, most digital cameras (both bayer and three-chip designs) employ optical low-pass filters to avoid aliasing. Such filters reduce resolution.

In general, it is widely accepted that film exceeds the resolution of HDTV formats and the 2K digital cinema format, but there is still significant debate about whether 4K digital acquisition can match the results achieved by scanning 35mm film at 4K, as well as whether 4K scanning actually extracts all the useful detail from 35mm film in the first place. However, as of 2007 the majority of films that use a digital intermediate are done at 2K because of the costs associated with working at higher resolutions. Additionally, 2K projection is chosen for almost all permanent digital cinema installations, often even when 4K projection is available.

One important thing to note is that the process of optical duplication, used to produce theatrical release prints for movies that originate both on film and digitally, causes significant loss of resolution. If a 35mm negative does capture more detail than 4K digital acquisition, ironically this may only be visible when a 35mm movie is scanned and projected on a 4K digital projector.


Grain & Noise
Film has a characteristic grain structure, which many people view positively, either for aesthetic reasons or because it has become associated with the look of 'real' movies. Different film stocks have different grain, and cinematographers may use this for artistic effect.

Digitally acquired footage lacks this grain structure. Electronic noise is sometimes visible in digitally acquired footage, particularly in dark areas of an image or when footage was shot in low lighting conditions and gain was used. Some people believe such noise is a workable aesthetic substitute for film grain, while others believe it has a harsher look that detracts from the image.

Well shot, well lit images from high-end digital cinematography cameras can look almost eerily clean. Some people believe this makes them look "plasticky" or computer generated, while others find it to be an interesting new look, and argue that film grain can be emulated in post-production if desired.

Since most theatrical exhibition still occurs via film prints, the super-clean look of digital acquisition is often lost before moviegoers get to see it, because of the grain in the film stock of the release print.
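
To illustrate the Bayer-pattern point in that excerpt: each photosite records only one colour, and the other two are interpolated afterwards. Below is a rough numpy sketch of the simplest (bilinear) demosaic, purely illustrative and not how any particular camera or RAW converter actually works; the function names and the random test frame are invented for the example.

Code:

import numpy as np

def mosaic_rggb(rgb):
    """Simulate an RGGB Bayer sensor: keep one colour sample per photosite."""
    h, w, _ = rgb.shape
    bayer = np.zeros((h, w))
    bayer[0::2, 0::2] = rgb[0::2, 0::2, 0]   # red sites
    bayer[0::2, 1::2] = rgb[0::2, 1::2, 1]   # green sites
    bayer[1::2, 0::2] = rgb[1::2, 0::2, 1]   # green sites
    bayer[1::2, 1::2] = rgb[1::2, 1::2, 2]   # blue sites
    return bayer

def demosaic_bilinear(bayer):
    """Fill in the two missing colours at each photosite by averaging the
    nearest recorded neighbours of that colour (classic bilinear demosaic)."""
    h, w = bayer.shape
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True
    masks[0::2, 1::2, 1] = True
    masks[1::2, 0::2, 1] = True
    masks[1::2, 1::2, 2] = True
    k_green = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_redblue = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    out = np.zeros((h, w, 3))
    for c, k in ((0, k_redblue), (1, k_green), (2, k_redblue)):
        plane = np.where(masks[..., c], bayer, 0.0)    # zero-filled colour plane
        padded = np.pad(plane, 1, mode='reflect')
        acc = np.zeros((h, w))
        for dy in range(3):                            # small 3x3 convolution
            for dx in range(3):
                acc += k[dy, dx] * padded[dy:dy + h, dx:dx + w]
        out[..., c] = acc
    return out

rng = np.random.default_rng(0)
frame = rng.random((64, 64, 3))                        # stand-in for a real frame
error = np.abs(demosaic_bilinear(mosaic_rggb(frame)) - frame).mean()
print("mean reconstruction error:", error)             # non-zero: detail is estimated, not measured

Random noise is a worst case, so the printed error overstates what you would see on a real image, but it shows why a "4K" Bayer sensor does not resolve like 4K of full RGB samples: two thirds of the colour information at each photosite is estimated.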
post #166 of 338
Quote:
Originally Posted by bjmarchini View Post

I can understand if S35 will give you more detail than the current digital standard. But everything is going digital eventually. From a home theater enthusiast's perspective, I would actually prefer the "inferior" digital, as I don't know that we could see the enhanced difference at 1080p anyway. At 4K+, yes, but probably not at 1080p. Of course, this goes back to another discussion of whether movies are moving toward the home theater and away from the actual "big screen".

It's not just about the detail in the image, it's about latitude, colors, etc.

I doubt anyone has eyes trained enough to see the difference between an actual film resolution of 1000 lines and 1500 lines, since the film is rolling.

But you can see if the total image experience has flaws.


Basically you could say, in theory, that any digital cam capable of shooting 1920x1080/24p with 8-bit 4:2:0 would be impossible to tell apart from film shot and converted to 1920x1080/24p 8-bit 4:2:0.

But in reality you work with the negative before you make the final conversion. So what is impossible to see on a digital display will still show itself, since any tweak to the original will come out differently depending on the source you start with.

It's like recording a music track directly to MP3, since the end product will be MP3.

Transparency fails when you tweak.
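
To make that concrete with a toy example (numpy, all numbers made up for illustration, not a real grading pipeline): apply the same shadow-lifting grade to a smooth dark ramp before and after it has been squeezed into 8 bits, and count how many distinct levels survive.

Code:

import numpy as np

ramp = np.linspace(0.02, 0.20, 4096)        # smooth dark gradient, standing in for the negative

def grade(x):
    return np.clip(x ** 0.6, 0.0, 1.0)      # a simple shadow-lifting "tweak"

def to_8bit(x):
    return np.round(x * 255).astype(np.uint8)

# Tweak the already-delivered 8-bit version (grading the final master).
graded_late = to_8bit(grade(to_8bit(ramp) / 255.0))

# Tweak the high-precision source first, then make the 8-bit deliverable.
graded_early = to_8bit(grade(ramp))

print("distinct shadow levels, graded after 8-bit :", np.unique(graded_late).size)
print("distinct shadow levels, graded before 8-bit:", np.unique(graded_early).size)

The version graded after the 8-bit conversion ends up with far fewer distinct levels across the same brightness range, which shows up as banding; graded from the higher-precision source first, the same range is filled properly. That is the video equivalent of re-working a track that was recorded straight to MP3.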
post #167 of 338
Quote:
Originally Posted by MovieSwede View Post

It's not just about the detail in the image, it's about latitude, colors, etc.

I doubt anyone has eyes trained enough to see the difference between an actual film resolution of 1000 lines and 1500 lines, since the film is rolling.

But you can see if the total image experience has flaws.


Basically you could say, in theory, that any digital cam capable of shooting 1920x1080/24p with 8-bit 4:2:0 would be impossible to tell apart from film shot and converted to 1920x1080/24p 8-bit 4:2:0.

But in reality you work with the negative before you make the final conversion. So what is impossible to see on a digital display will still show itself, since any tweak to the original will come out differently depending on the source you start with.

It's like recording a music track directly to MP3, since the end product will be MP3.

Transparency fails when you tweak.

I disagree with your analogy. The resolution is 4K, not 1080p. They are talking about the raw digital video as well.

What if you recorded to 640 kbit MP3? There would be no noticeable difference. It is hard for even the best ears to discern 320, and it is only slightly noticeable below 224. I am not talking about ripping either, which introduces recompression artifacts from non-professional encoders.

Quote:

Thus the future of digital cinema can be expected to have as a standard 4K capture and 4K projection. Currently in development are cameras capable of recording 4K RAW, such as the RED One and Dalsa Corporation's Origin......

Early DLP projectors, used primarily in the U.S., used a limited 1280 x 1024 resolution, which is still widely used for pre-show advertising but not usually for feature presentations. The DCI specification for digital projectors calls for three levels of playback to be supported: 2K (2048x1080) at 24 frames per second, 4K (4096x2160) at 24 frames per second, and 2K at 48 frames per second.

8,847,360 versus the 2,073,600 of Blu-ray or the ~350K of DVD. To put that into perspective, this is an 8.8-megapixel image. With a professional camera, there is little difference between film and digital, especially if it is in RAW AVI, which is where they are going for the digital cinema cameras.
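
For what it's worth, those pixel counts check out (taking NTSC DVD as 720x480); a few lines of Python to sanity-check the arithmetic:

Code:

formats = {"DCI 4K": (4096, 2160), "DCI 2K": (2048, 1080),
           "Blu-ray": (1920, 1080), "NTSC DVD": (720, 480)}
for name, (w, h) in formats.items():
    print(f"{name:9s} {w}x{h} = {w * h:>9,} pixels")   # 4K ~ 8.85M, Blu-ray ~ 2.07M, DVD ~ 0.35M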
post #168 of 338
Quote:
Originally Posted by Mr.D View Post

I'll file this under the "believe it when I see it" category.

Which part? The first or second?
Working with stills in RAW format from the Nikon D3, I'm very impressed by what kind of detail you can get out of the shadows and highlights in shots that look hopelessly overexposed or that lack any detail in the shadows or blacks in the 8-bit version. A digital grader's wet dream, basically. (The RED is doing the same at >= 24 fps, just with more noise for now.) And how much noise, by which I mean how little, there is at ISO 1200, 2400, even 3200. 6400 is still usable and less grainy than many film prints. Such performance will come to sensors that work at 24 fps. The D3 already works at 9 fps for a very short time before the buffers are full.
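
To put a rough number on that headroom, here is a small numpy sketch (illustrative only; it ignores demosaicing and gamma encoding and just simulates quantizing the same deep-shadow values to 14 bits and to 8 bits, then pushing both three stops in the grade):

Code:

import numpy as np

rng = np.random.default_rng(1)
scene = rng.uniform(0.001, 0.02, size=100_000)     # deep-shadow linear values

def quantize(x, bits):
    levels = 2 ** bits - 1
    return np.round(x * levels) / levels

push = 2 ** 3                                      # +3 stops in the grade
from_raw = np.clip(quantize(scene, 14) * push, 0, 1)
from_8bit = np.clip(quantize(scene, 8) * push, 0, 1)
ideal = np.clip(scene * push, 0, 1)

print("distinct shadow tones after push, 14-bit source:", np.unique(from_raw).size)
print("distinct shadow tones after push,  8-bit source:", np.unique(from_8bit).size)
print("mean error vs ideal, 14-bit: %.5f   8-bit: %.5f"
      % (np.abs(from_raw - ideal).mean(), np.abs(from_8bit - ideal).mean()))

The 14-bit version keeps hundreds of distinct shadow tones and tracks the ideal push closely; the 8-bit version collapses to a handful of posterized levels, which is in line with what the RAW files show in practice.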
post #169 of 338
Quote:
Originally Posted by mhafner View Post

A digital grader's wet dream, basically. (The RED is doing the same at >= 24 fps, just with more noise for now.)

I have to say I'm hearing very bad reports from friends of mine about RED footage in a film currently in production. Jellyvision, basically. It now generally seems to be regarded as a bit of a joke throughout the industry.
post #170 of 338
Quote:
Originally Posted by mhafner View Post

Which part? The first or second?
Working with stills in RAW format from the Nikon D3, I'm very impressed by what kind of detail you can get out of the shadows and highlights in shots that look hopelessly overexposed or that lack any detail in the shadows or blacks in the 8-bit version. A digital grader's wet dream, basically. (The RED is doing the same at >= 24 fps, just with more noise for now.) And how much noise, by which I mean how little, there is at ISO 1200, 2400, even 3200. 6400 is still usable and less grainy than many film prints. Such performance will come to sensors that work at 24 fps. The D3 already works at 9 fps for a very short time before the buffers are full.

It will get there soon enough. Look at how cameras have evolved. I have a friend who is a photographer for a newspaper and freelances for a magazine. What a lot of newspapers and magazines are starting to do is switch from digital still cameras to frame-grabbing off of video. The idea is that instead of taking a still or a series and hoping you get the action, you just shoot the whole event in HD video and then go back and grab the frame you need for the photo in the paper or magazine. He expects digital still cameras to be like typewriters... just not really used anymore except under the rarest of needs.

Really shocked me when he told me what they were doing. I never thought digital still cameras would go by the wayside... but I could see it happening after I thought about it. The only reason this is possible is how fast the technology is advancing.

We are really just at the beginning of digital video capture for movies. People thought the idea of a 1GB hard drive was impossible... and then over 100GB was just not physically possible either. Now you can easily pick up a 1TB HDD.

Look how fast iPods evolved from a 5GB device. 5-10 years is a small amount of time for movies, but a lifetime for technology. Remember when they talked about having 5GHz processors? They ran into the thermal problem, and many "experts" said there was no way around it... until the idea of a multi-core processor became reality. Technology will advance forward..... at least until those damn dirty apes take over.

There is only one constant in the universe.... change.
post #171 of 338
Quote:
Originally Posted by bjmarchini View Post


There is only one constant in the universe.... change.


Where's the yawn smiley gone?
post #172 of 338
Quote:
Originally Posted by Mr.D View Post

Where's the yawn smiley gone?

Great response.
post #173 of 338
IMO if a little bit of DNR can get rid of the majority of the grain and keep 80% or greater of the sharpness and detail, then I say do it. I like my movies looking like they're shot in HD cam.
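
To make that trade-off concrete, here is a toy numpy sketch (all numbers invented for illustration; real restoration DNR is far more sophisticated, usually temporal and motion-compensated): a light spatial filter knocks grain down, but it also takes some fine detail with it.

Code:

import numpy as np

def box_blur(img, radius=1):
    """Mean of the (2r+1)x(2r+1) neighbourhood; edges handled by reflection."""
    h, w = img.shape
    padded = np.pad(img, radius, mode='reflect')
    acc = np.zeros((h, w))
    taps = 2 * radius + 1
    for dy in range(taps):
        for dx in range(taps):
            acc += padded[dy:dy + h, dx:dx + w]
    return acc / (taps * taps)

def mild_dnr(frame, strength=0.5, radius=1):
    """Blend the frame toward a blurred copy of itself."""
    return (1.0 - strength) * frame + strength * box_blur(frame, radius)

rng = np.random.default_rng(2)
yy, xx = np.mgrid[0:256, 0:256]
detail = 0.25 * np.sin(xx / 2.0) * np.sin(yy / 2.0)    # stand-in for fine picture detail
grain = 0.05 * rng.standard_normal(detail.shape)       # stand-in for grain/noise

def rms(a):
    return float(np.sqrt(np.mean(a ** 2)))

# The filter is linear, so grain and detail can be measured separately.
print("grain amplitude kept :", rms(mild_dnr(grain)) / rms(grain))
print("detail contrast kept :", rms(mild_dnr(detail)) / rms(detail))

With these made-up settings the grain amplitude drops to a bit more than half while this fairly coarse detail keeps over 90% of its contrast; push the strength or radius harder, or use finer detail, and the detail figure falls off quickly, which is the risk the posts below are worried about.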
post #174 of 338
Quote:
Originally Posted by 30XS955 User View Post

IMO if a little bit of DNR can get rid of the majority of the grain and keep 80% or greater of the sharpness and detail, then I say do it. I like my movies looking like they're shot in HD cam.

It all depends on how much effort they put into it. I was a bit disappointed when I watched Caddyshack in HD. It was much better than the SD DVD, but the grain was really strong in that film. I wouldn't mind seeing that one cleaned up. On the other hand, I would keep it in Jaws or One Flew Over the Cuckoo's Nest. Wouldn't mind if they removed it from Dune as well. The Last Starfighter, I am on the fence about.

The problem is I don't trust them doing a good DNR job. Some of the studios don't even bother to put lossless audio on the disc or HD extras. Nothing drives me more nuts than 480p extras on an HD disc, especially on the newer movies... like Harry Potter.
post #175 of 338
Quote:
Originally Posted by bjmarchini View Post

Great response.

Well I just get bored going over the same old ground.

Firstly, you keep talking about digital capture not having any noise. It does; there is not one digital camera system out there that I would say has a less noisy image than modern filmstock exposed nominally.

Secondly, you keep waxing lyrical about 4K and digital; have you any experience of the actual visual difference 4K makes to an image over 2K?

Thirdly and finally, you seem to assume that the ultimate aesthetic goal of film-making is reality. Film watching is an extremely abstract experience with its own visual language. Assuming that the ultimate goal of film representation is merely to mimic reality is an extremely dull and limited ideal. It's like proclaiming the ultimate goal of music should be to perfectly mimic birdsong.

It's boring, it's facile, and it's usually spewed forth by people whose grasp of these subjects consists of realising that 4 is a larger number than 2.
post #176 of 338
Quote:
Originally Posted by 30XS955 User View Post

IMO if a little bit of DNR can get rid of the majority of the grain and keep 80% or greater of the sharpness and detail, then I say do it. I like my movies looking like they're shot in HD cam.

I like my movies looking the way they're supposed to look according to the creator. Everything else is projecting someone else's interpretation on the matter.
post #177 of 338
Quote:
Originally Posted by bjmarchini View Post


The problem is I don't trust them doing a good DNR job.

There is no such thing as a good "DNR" job.
post #178 of 338
Quote:
Originally Posted by bjmarchini View Post


8,847,360 versus the 2,073,600 of Blu-ray or the ~350K of DVD. To put that into perspective, this is an 8.8-megapixel image. With a professional camera, there is little difference between film and digital, especially if it is in RAW AVI, which is where they are going for the digital cinema cameras.

Except that resolution is just one part of the equation. And if the digital camera doesn't have the same latitude as a film cam, then the end result on a Blu-ray will look different.

But I have seen filmouts of HDCAM footage that looked so close to S35 that I had to check afterwards that it really was digital from the beginning.

But since the camera they shot the movie with isn't as good as 35mm, they had to be more careful when they filmed, such as protecting the highlights.


Quote:


What if you recorded to 640 kbit MP3? There would be no noticeable difference. It is hard for even the best ears to discern 320, and it is only slightly noticeable below 224. I am not talking about ripping either, which introduces recompression artifacts from non-professional encoders.

Yes, it's hard to hear the differences, but when you start to tweak it in post you could very well amplify compression artifacts.
post #179 of 338
Quote:
Originally Posted by Mr.D View Post

Well I just get bored going over the same old ground.

Firstly, you keep talking about digital capture not having any noise. It does; there is not one digital camera system out there that I would say has a less noisy image than modern filmstock exposed nominally.

Secondly, you keep waxing lyrical about 4K and digital; have you any experience of the actual visual difference 4K makes to an image over 2K?

Thirdly and finally, you seem to assume that the ultimate aesthetic goal of film-making is reality. Film watching is an extremely abstract experience with its own visual language. Assuming that the ultimate goal of film representation is merely to mimic reality is an extremely dull and limited ideal. It's like proclaiming the ultimate goal of music should be to perfectly mimic birdsong.

It's boring, it's facile, and it's usually spewed forth by people whose grasp of these subjects consists of realising that 4 is a larger number than 2.

The problem for me and many others is that screen grain does not add to, but detracts from, the presentation. To me... and others, it comes across as a distraction, similar to SDE. Sure, these are two different things, but you get the idea, I think. Some people have no problem with SDE, some do. Funny thing is, SDE doesn't really bother me, but screen grain does.

You would think that making something look more real is a bad thing. It doesn't seem to have made football or baseball worse... in fact, better. Sure, these are two different types of entertainment, but are they really that different? Is the problem similar to watching SD DVD after you have been watching HDM for a while? The SD DVDs are watchable, but you really see them as a much softer image than before. Some people go so far as to try to avoid SD entirely from that point on. I didn't really notice screen grain on SD DVD before, but now that I am watching HD I do. I have gotten used to not seeing screen grain and now take notice of it on films that have it.

A different but similar thing happened with my car. I got a little ding in it. I had a dent wizard fix it for $20. He did a great job, and no one would notice there ever was a dent.... except me. I can still see where it was, but only because I know what to look for.

Once you taste from the fruit of the tree of no screen grain, never again can you return to the garden of screen grain ignorance.

And I am obviously not the only one that notices. HD has brought about this problem. There weren't complaints like this when DVD was introduced.

Interesting complaint thread as they try and figure out why 300 looks so bad in HD. lol

http://www.amazon.com/Grainy/forum/F...sin=B000Q6GXW2
post #180 of 338
Quote:
Originally Posted by chirpie View Post

I like my movies looking the way they're supposed to look according to the creator. Everything else is projecting someone else's interpretation on the matter.

I like my movies how I would like to see them, not from someone else's point of view. Just because a director chooses one way to present the material doesn't mean that it is the best way. Have you ever watched a "director's cut" and thought that the theatrical version was better? (Case in point: Independence Day.)