Originally Posted by madshi
Edit: And no, noise does not cause grain.
Digital and analog noise are different, though. If we're speaking of digital noise, I'd say yes, it definitely can. If we're speaking of analog noise, then we'd have to argue semantics. From the Wikipedia article: "Image noise can also originate in film grain and in the unavoidable shot noise of an ideal photon detector."
So to say that GRAIN and NOISE are mutually exclusive does not work IMO, unless we are referring to one specific type of grain. But the film grain we see differs from movie to movie: 35mm vs. 16mm, camera brand to camera brand, digital vs. analog, different mastering processes and cleanup algorithms, etc. So we cannot just call it "film grain" either.
Since we are now over-analyzing this: even if we were to call it "analog film grain", it still varies with the source, the camera type, and the film gauge. There is no simple "this is grain and that is noise" formula. Some of the original grain may also have been enhanced (or removed) accidentally during digital mastering, so my point is that you cannot cleanly separate grain and noise these days.
http://en.wikipedia.org/wiki/Image_noise

From the NET:
Higher-ISO film tended to have more grain, and higher-ISO digital shots exhibit more noise: a similar cause, but the visual appearance is different.
Digital ISO noise is related to the size of each pixel, since the noise is per-pixel (so the more pixels you have, the less obvious the noise is when viewed at the same size), whereas with film the noise is per crystal: you need larger crystals for higher sensitivity.
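The per-pixel point is easy to demonstrate. Here's a minimal NumPy sketch (my own illustration, not from any of the quoted sources): add independent Gaussian noise to every pixel of a flat frame, then average 4x4 blocks as a crude stand-in for viewing a higher-resolution image at the same display size. Averaging 16 independent samples cuts the noise standard deviation by roughly sqrt(16) = 4.

```python
import numpy as np

rng = np.random.default_rng(0)

# Flat gray frame with independent per-pixel Gaussian sensor noise (sigma = 10).
high_res = 50.0 + rng.normal(0.0, 10.0, size=(1024, 1024))

# "Viewed at the same size": average 4x4 pixel blocks, so 16 noisy
# samples contribute to each displayed pixel.
h, w = high_res.shape
low_res = high_res.reshape(h // 4, 4, w // 4, 4).mean(axis=(1, 3))

print(high_res.std())  # roughly 10 (per-pixel noise)
print(low_res.std())   # roughly 10 / sqrt(16) = 2.5
```

Film grain doesn't behave this way: the crystal size sets the scale of the noise structure regardless of how finely you scan the negative, so scanning at a higher resolution doesn't average the grain away.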
Film grain generally does not look as even as digital grain or remastered grain, though they can sometimes look similar. I think film grain is what people prefer, not digital grain. The problem is that digital sources of non-analog origin do not contain film grain, so either it was added, or it came from digital noise (that is my take on it).

Edited by coderguy - 9/3/13 at 4:01am