• Members 78 posts
    April 29, 2023, 7:36 a.m.

    On Earth, much of the light in nature comes from stars like the sun, which have been shown to behave like Lambertian blackbody radiators. Lambertian here can be taken to mean that the source emits diffusely, with roughly the same apparent brightness from every direction, and that it is extended rather than a single point of light in the sky. Blackbody radiation was explained by Planck and then Einstein, who realized that the emitted light is not continuous but quantized - in the form of energy packets referred to as photons.

    Photons are not emitted uniformly in space and time but according to a distribution described by Poisson: you may count an average of 60 cars per minute passing under a bridge, but you won't see exactly one car every second - some seconds two cars pass, others none. We say that 1 car per second is the mean number of cars that pass under the bridge, and the typical observed deviation from that mean, second to second, is the standard deviation.

    The standard deviation of the count of light packets emitted by such a source is what we refer to in photography as photon noise; their mean is what we refer to as the signal.
    So the ratio of the mean to the standard deviation is the Signal to Noise Ratio (SNR), which correlates well with our impression of how noisy an image appears.
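
    If you want to see those numbers come out, here is a quick NumPy sketch (just an illustration, with an assumed mean of 60 arrivals per interval) that simulates Poisson arrivals like the cars under the bridge and checks that the standard deviation comes out close to the square root of the mean, so SNR = mean/stdev ≈ sqrt(mean):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    mean_rate = 60.0                               # e.g. 60 cars (or photons) per interval, on average
    counts = rng.poisson(mean_rate, size=100_000)  # many one-interval observations

    signal = counts.mean()                         # the signal: mean count per interval
    noise = counts.std()                           # the noise: standard deviation of the count
    snr = signal / noise                           # Signal to Noise Ratio

    print(f"mean  = {signal:.2f}")                 # ~60
    print(f"stdev = {noise:.2f}")                  # ~sqrt(60) ~ 7.75
    print(f"SNR   = {snr:.2f}")                    # ~sqrt(60) too, since SNR = mean/stdev = sqrt(mean)
    ```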

    It turns out that these concepts generalize well. Photons are emitted according to Poisson statistics; after some of them have been randomly scattered or absorbed (say by clouds, or by the glass in a lens) the surviving photons still follow a Poisson distribution; and the same is true of the process that turns photons interacting with the photodiode in a pixel into photoelectrons (the photoelectric effect), which again results in a Poisson-distributed signal. There is no one-for-one correspondence, but the means and standard deviations are those of Poisson distributions once one accounts for the losses (or Quantum Efficiency). And so it goes.
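
    As a rough illustration of that point, the following sketch (with an assumed survival fraction, i.e. QE, of 0.5) thins a Poisson stream of photons at random, the way scattering or quantum efficiency would, and shows that the surviving counts still have variance equal to their mean - the Poisson signature:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    mean_photons = 1000.0          # mean photons arriving at a pixel per exposure
    qe = 0.5                       # assumed fraction that survive (scattering losses, QE, etc.)

    photons = rng.poisson(mean_photons, size=200_000)     # Poisson arrivals
    electrons = rng.binomial(photons, qe)                 # each photon survives independently with prob. qe

    print(f"electron mean     = {electrons.mean():.1f}")  # ~500 = qe * mean_photons
    print(f"electron variance = {electrons.var():.1f}")   # ~500 too: still Poisson (variance == mean)
    ```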

    In photography we often say 'photon' noise when photons are directly involved, and 'shot' noise generically. 'Shot' because the pattern of counts recorded by a uniformly illuminated patch of pixels on a sensor is similar to what one gets when shooting a shotgun at a wall.

    Jack

    PS A bit more here: www.strollswithmydog.com/information-theory-for-photographers/
    PPS A more in-depth explanation by Antisthenes at the old farm.

  • Members 280 posts
    April 29, 2023, 8:27 a.m.

    I think the metaphor "shot noise" refers to the random distribution of drops of lead in a shot tower. See the animation in Wikipedia here:
    en.wikipedia.org/wiki/Shot_tower

    Don Cox

  • Members 78 posts
    April 29, 2023, 9:49 a.m.

    Well I'll be, learn something new every day, thanks Don.

  • Members 4254 posts
    April 29, 2023, 10:14 a.m.

    @JackHogan

    Thank you so much for explaining in fairly easy-to-understand language the definitions of signal, noise and SNR, although some of it has gone over my head and will need a few rereads to sink in 🙂. The cars-passing-under-a-bridge analogy helps me a lot to understand what you are explaining.

    But there is one thing that seems to contradict what is in the article you linked to.

    In your post you said that the ratio of the standard deviation to the mean is the SNR,

    which from my high school maths recollection means SNR = standard deviation / mean,

    but in the article you linked to it says

    "SNR = mean / standard deviation"

    which to me sounds correct, but I am not sure because it is the reverse of what I understood from your post.

    Am I misunderstanding something or is there a typo in either your post or the article?

  • Members 78 posts
    April 29, 2023, 10:18 a.m.

    Typo, good catch Danno. The mean is what we refer to as the signal, and the standard deviation is the noise. So Signal to Noise is indeed mean/stdev; I've corrected it above.

  • Members 4254 posts
    April 29, 2023, 10:44 a.m.

    No problem. You're welcome.

    Up until now I have been thinking of SNR, in layman's terms, as the ratio of good data to bad data, which I have been told is wrong by other tech people who know much more than I do about the nuts and bolts of what goes on under the hood.

    I accept that it is not technically correct, but it helped me get my head around the fact that to minimise visible noise I need to maximise the SNR, within my artistic requirements (depth of field and blur) for the shot, without clipping highlights.

    After reading your post and the linked article, I can now understand much better why my good-data-to-bad-data analogy is not technically correct.

  • Members 1737 posts
    April 29, 2023, 5 p.m.

    It has always seemed to me strange that the distribution of counted photons is Poisson regardless of the QE of the detector. That goes to Eric's "both" comment.

  • Members 878 posts
    April 29, 2023, 6:18 p.m.

    That would be, well, the standard deviation of the noise. The noise itself is the phenomenon we observe. The statistical view of noise is a bunch of random variables, roughly speaking.

  • Members 1737 posts
    April 29, 2023, 6:28 p.m.

    It's also the rms value of the noise.
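
    A tiny sketch of why the two coincide: the noise is the fluctuation about the mean, which itself has zero mean, and for a zero-mean quantity the rms value and the standard deviation are the same number:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    counts = rng.poisson(100.0, size=100_000)      # simulated photoelectron counts

    fluctuation = counts - counts.mean()           # the noise: deviation from the mean (zero-mean)
    rms = np.sqrt(np.mean(fluctuation**2))         # root-mean-square of the noise

    print(f"stdev = {counts.std():.3f}")           # ~10
    print(f"rms   = {rms:.3f}")                    # same number
    ```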

  • Members 78 posts
    April 29, 2023, 6:53 p.m.

    Right. And as for the fact that the processes that result in fewer quanta out than in are random, Antisthenes explained it well up top.

  • Members 976 posts
    April 29, 2023, 7:15 p.m.

  • Members 16 posts
    April 29, 2023, 8:24 p.m.

    It's fine to say the distribution is Poisson, but it's useful to note that sensels don't see the distribution. What they see is charge deposited by some number of individual photons over a period of time. Thus, the game is trying to estimate the distribution's central value from sampling only a modest number of photons -- estimating the ideal value for each pixel from incomplete information. Photography is generally NOT really about measuring the light, but about constructing an appearance model for the scene; photons are merely the sampling mechanism. The reason this distinction is important is that, if you know some things about the scene's appearance, that knowledge can be used to alter how you estimate the center of the distribution for each pixel, thus significantly reducing noise.

    For example, suppose that a region within an image has no apparent gradients or edges, but the measured pixel values randomly vary within statistical error bounds corresponding to shot noise. It is then reasonable to assume that the region is actually a single material with a consistent appearance across all those pixels, and noise can be reduced by estimating the pixel values using all the samples in that region together. KREMY, the raw-file noise reduction program I described in "An improved raw image enhancement algorithm using a statistical model for pixel value error", works by building a statistical model for the noise in an image and using similarity-weighted averaging to tweak the values of pixels within their computed error bounds.

    The same concept can be applied temporally: if there is no indication of scene movement (or if scene content is spatially aligned before processing), it is reasonable to assume that the scene appearance has not changed across a time sequence of captures, and noise can be effectively reduced by averaging over the time sequence of values for each pixel. That's basically the method used for low-light imaging in most cell phones.

    Trained AI noise reduction methods essentially recognize noisy patterns that correspond to the de-noised structures with which they were trained, and nudge the pixel value estimates toward the corresponding pixel values in those patterns. It is fundamentally the same thinking behind compressed sensing, if we call the statistical properties of the training set "prior knowledge".
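
    To make the "use more samples to tighten the estimate" idea concrete, here is a toy sketch (not KREMY itself, just the simplest cases above, with made-up numbers): averaging a flat, edge-free patch spatially, and averaging a static pixel across an aligned burst. Either way the shot-noise standard deviation of the estimate drops roughly as 1/sqrt(number of samples averaged):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    true_level = 400.0        # assumed 'ideal' photoelectron count for a uniform patch
    patch_pixels = 64         # pixels in a flat, edge-free region
    burst_frames = 16         # aligned frames of a static scene

    # Spatial averaging: one frame, many pixels of the same material
    patch = rng.poisson(true_level, size=(100_000, patch_pixels))
    spatial_estimate = patch.mean(axis=1)

    # Temporal averaging: one pixel, many aligned frames
    burst = rng.poisson(true_level, size=(100_000, burst_frames))
    temporal_estimate = burst.mean(axis=1)

    print(f"single-sample noise : {np.sqrt(true_level):.1f}")      # ~20 e- of shot noise
    print(f"64-pixel average    : {spatial_estimate.std():.1f}")   # ~20/8 = 2.5
    print(f"16-frame average    : {temporal_estimate.std():.1f}")  # ~20/4 = 5.0
    ```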

    Photon shot noise is simply the natural statistical variation in the rate of photon emission from any light source -- not really "noise" per se. It becomes noise when we estimate the ideal value of a pixel from a sensel's count of photons, in which case photon shot noise becomes statistical error bounds on the ideal pixel value. The interesting thing is that, although photon shot noise is not under the camera's control, we now have lots of computational methods that effectively allow us to improve the value estimates by using higher-level scene properties to narrow those bounds.

  • Members 78 posts
    April 29, 2023, 8:48 p.m.

    Good link Iliah, as long as readers are aware that in practice Fano noise does not apply to visible light, and thus not to photography.

  • Members 1737 posts
    April 29, 2023, 9:16 p.m.

    Jack, thanks for recommending that book to me many years ago.

  • Members 878 posts
    April 30, 2023, 6:36 a.m.

    I am not sure what that even means - the sensels do not see that distribution. There is a Poisson process creating the pixel values, and those values are subject to a Poisson distribution. That means that if you keep shooting, each particular pixel would take values governed by a Poisson distribution. Or, if you have one shot with infinitely many (or a large enough number of) pixels, the values would follow the same distribution. If you have too few pixels, the sample mean would fluctuate from experiment to experiment (which is another distribution one can estimate), and one cannot even talk meaningfully about whether the histogram looks Poisson or not, but the underlying process is still Poisson.
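
    If it helps, here is a quick sketch of those two equivalent views under a perfectly uniform exposure (assumed mean of 200 electrons): the values of one pixel across many shots, and the values of many identically illuminated pixels in one shot, follow the same Poisson distribution with the same mean and standard deviation:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    mean_electrons = 200.0                                        # assumed uniform illumination level

    frames, pixels = 2_000, 2_000
    stack = rng.poisson(mean_electrons, size=(frames, pixels))    # many shots of a uniform patch

    pixel_over_time = stack[:, 0]      # one particular pixel, across all the shots
    frame_over_space = stack[0, :]     # all the pixels, in one particular shot

    for label, x in [("one pixel over many shots", pixel_over_time),
                     ("many pixels in one shot  ", frame_over_space)]:
        print(f"{label}: mean = {x.mean():.1f}, stdev = {x.std():.2f}")   # both ~200 and ~14.1
    ```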

  • Members 83 posts
    April 30, 2023, 11:41 a.m.

    Thanks for the paper. I must have missed or forgotten your previous mention of it. It is an interesting read.

    I am not certain I understood the probability calculation. Could consideration of exposure time and sensor temperature improve the probability calculation? Would a shorter exposure time increase the probability that a measured value deviates more from the ideal value? If the ISO setting for a particular camera modifies the raw values in some known way, could knowledge of the ISO setting and of the particular sensor implementation also improve the probability calculation? Would higher sensor temperatures change the shape of the probability distribution curve?

    (I have been playing with dcraw_emu included with libraw source and it has been many years since I looked at dcraw, so I might have missed much)

    Thanks again.

    John

  • Members 46 posts
    April 30, 2023, 8:28 p.m.

    Jim, what alternative might have seemed less strange? Just curious where that assumed degree of freedom might otherwise have seemed likely to go: a change in the exponent in the relation of noise to signal, or the insertion of a scalar factor, maybe?

    My simplistic idea is that in any scenario where we care deeply about arrival statistics we can often safely ignore things that don’t happen.

    Jack, I'm having difficulty imagining how such a "loss" process could be anything but random. But I also don't know how a missed photon plays out in the actual physics. Does the photon energy just go directly towards heating the sensor, or is there a pair production that reverts back by recombination, or does it pass right through, etc.? In any event the term "capture cross section" comes to mind.

    And if you care to elaborate, the mention of Antisthenes went over my head. 😊

    I appreciate the learning opportunities.

  • Members 1737 posts
    April 30, 2023, 8:46 p.m.

    Well, maybe that the Poisson statistics of the light that falls on the sensor would apply to the electrons, just with a scale factor. But that's not what happens.
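
    To see why a plain scale factor doesn't do it, here is a small sketch (assumed QE of 0.4) comparing "multiply the photon count by QE" with "discard each photon at random with probability 1 - QE". Both give the same mean, but the scaled version has far too little variance to be Poisson, while the randomly thinned version keeps variance equal to mean:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    mean_photons = 1000.0
    qe = 0.4                                        # assumed quantum efficiency

    photons = rng.poisson(mean_photons, size=200_000)

    scaled = qe * photons                           # Poisson stats "with a scale factor"
    thinned = rng.binomial(photons, qe)             # what happens statistically: random losses

    print(f"scaled : mean = {scaled.mean():.0f}, variance = {scaled.var():.0f}")    # ~400, ~160 (= qe^2 * 1000)
    print(f"thinned: mean = {thinned.mean():.0f}, variance = {thinned.var():.0f}")  # ~400, ~400 (still Poisson)
    ```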