• April 30, 2023, 9:11 p.m.

    Good spot. From a physics point of view, it's not 'mean' but 'expected value'. If we don't expect a value, all we can do is take the mean. The term 'signal' comes from military 'signals': the 'signal' is the message being sent, and the 'noise' is the bits that aren't the message. If you think of light as a continuous wave phenomenon, then the expectation is a constant, steady value describing the power or energy of the light - that's the 'message' from whatever is causing the illumination. Anything that deviates from that expectation thus becomes 'noise'.

  • Members 878 posts
    April 30, 2023, 9:54 p.m.

    It (the expected value) is the mean (with respect to the probability measure) of the process creating that outcome. The outcome itself typically has a somewhat different mean, indeed. Of course, it also depends on what the meaning of the word "it" is. We may want the random variable in question to be not the process creating it but the output (one image of a uniform area) instead. Then the expected value of each pixel would be the mean of that image.

  • April 30, 2023, 9:56 p.m.

    Which process, which mean?

  • Members 878 posts
    April 30, 2023, 10:19 p.m.

    The random process creating the noise in the first place. In layman's terms this is the process responsible for creating different values for each pixel under repeated experiments. The noise we observe in each shot is a creation (event) of that process.

    The process itself would have a mean/average in the sense I explained, which is called the expected value. Each particular experiment may have a somewhat different mean (with standard deviation equal to that of the process divided by the square root of the number of pixels in our case; all this needs some assumptions, as you can guess).
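
    To make the distinction concrete, here is a minimal Python sketch (the numbers are purely illustrative, not tied to any real sensor) that simulates repeated shots of a uniform patch: each shot's sample mean scatters around the process's expected value, with a spread of roughly sigma divided by the square root of the pixel count.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    expected_value = 100.0   # mean of the Poisson process (photons/pixel) - the 'process mean'
    n_pixels = 10_000        # pixels in one uniform patch
    n_shots = 500            # repeated experiments

    # Each 'shot' is one realization of the process over the whole patch.
    shots = rng.poisson(expected_value, size=(n_shots, n_pixels))

    # The per-shot sample means scatter around the expected value ...
    shot_means = shots.mean(axis=1)
    print("expected value (process mean):", expected_value)
    print("spread of per-shot means     :", shot_means.std())
    # ... with a spread close to sigma / sqrt(n_pixels) = sqrt(100) / sqrt(10_000) = 0.1
    print("predicted spread             :", np.sqrt(expected_value) / np.sqrt(n_pixels))
    ```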

  • Members 3952 posts
    May 1, 2023, 3:55 a.m.

    Thank you, but it was a fairly glaring inconsistency that even my not very technically minded brain was able to pick up for clarification. The rest of the discussion starting from JimKasson's first post is over the top of my head.

    As I posted earlier, JackHogan's OP and linked article helped me a lot to get a better understanding of what the signal-to-noise ratio actually is, at least to a basic level, which is all I really need as a hobby/enthusiast photographer.

  • Members 105 posts
    May 1, 2023, 9:02 a.m.

    When I gathered my limited knowledge of such matters, standard deviation was defined on both sides of the peak of a Gaussian distribution. Not so for a Poisson distribution, so the Gaussian formulas for deviation could not be used. Also, black-body radiation as a fourth-power function of temperature was described by the Stefan-Boltzmann formula. Please clarify.

    p.

  • Members 78 posts
    May 1, 2023, 1:33 p.m.

    Hi Tony,
    A good question that I'll leave for the physicists amongst us.

  • Members 78 posts
    May 1, 2023, 2:07 p.m.

    Yes, it's easy to forget that photons (p) arrive from the scene with their own Poisson distribution; and that - since each of them is converted to a photoelectron (e-) with probability QE, or not converted at all - conversion to e- is a classic binomial process, which results in its own well-known distribution. So there are two standard deviations involved which, being independent of each other, can and must be added in quadrature.

    1) Photon arrival (Poisson) stdev after QE = QE * sqrt(p)
    2) Conversion (Binomial) stdev = sqrt( QE * ( 1-QE) * p )
    3) e- stdev = quadrature sum 1+2 = sqrt( QE^2 * p + QE * p - QE^2 * p ) = sqrt( QE * p )

    The mean number of e- captured is QE * p by definition. So their standard deviation is the square root of their mean, which we all know for a Poisson distribution.
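
    As a quick numerical sanity check (a sketch with purely illustrative values for p and QE, not measured data), simulating the two stages directly reproduces sqrt( QE * p ):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    p_mean = 1000.0    # mean photons per pixel per exposure (illustrative)
    QE = 0.6           # probability that a given photon converts to an e- (illustrative)
    n_trials = 200_000

    # 1) Photons arrive with Poisson statistics ...
    photons = rng.poisson(p_mean, n_trials)
    # 2) ... and each one is converted to an e- with probability QE (binomial thinning).
    electrons = rng.binomial(photons, QE)

    print("mean e-               :", electrons.mean())        # ~ QE * p = 600
    print("stdev e- (simulated)  :", electrons.std())          # ~ 24.5
    print("stdev e- (sqrt(QE*p)) :", np.sqrt(QE * p_mean))     # 24.49...
    ```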

    Jack

  • Members 1737 posts
    May 1, 2023, 3:42 p.m.

    Perfect explanation!

  • Members 105 posts
    May 2, 2023, 8:38 a.m.

    Yes, I just took a quick look at the Wikipedia piece on sigma: it seems that for distributions with an infinite tail like the Poisson, one gets into difficulties having to choose how large an error one would accept, although one could make it quite small. But it did not look like a rigorous method, so it seems that the Gaussian nomenclature is more reassuring than exact in this case, but sufficient for practical purposes. Also, in the early post above I do miss a reference to Boltzmann.

    p.

  • Members 105 posts
    May 2, 2023, 8:52 a.m.

    On further reflection I realize that one is considering a finite time window of observation, so no cardinality difficulties arise for exposures.

    p

  • Members 878 posts
    May 2, 2023, 9:16 a.m.

    What do you mean? The Gaussian has an "infinite tail" as well, but for both of them the standard deviation is well defined. For small photon counts, the two distributions are different enough.

  • Members 128 posts
    May 2, 2023, 9:27 a.m.

    Bernoulli is maybe a bit more specific than "Binomial". But that's maybe getting a bit pedantic.


    The way I think about this special case of a compound Poisson process is:

    In the base Poisson process, the probability of an arrival in an infinitesimal interval of length dt is 𝝺dt.

    In the compound process, each arrival in the base process has a probability p of being accepted.

    So in the compound process, the probability of an arrival in an infinitesimal interval of length dt is (𝝺p)dt.

    And we have a new Poisson process, with rate parameter 𝝺p.
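
    A rough empirical way to see this (a sketch with illustrative values for 𝝺 and p, not a proof): thin a simulated Poisson stream arrival by arrival and compare the resulting count frequencies with the Poisson pmf of rate 𝝺p.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    lam, p, n = 20.0, 0.5, 200_000   # base rate, acceptance probability, sample size

    # Thinned process: draw the base arrivals, then accept each one with probability p.
    base = rng.poisson(lam, n)
    thinned = rng.binomial(base, p)

    # Empirical frequencies of the thinned counts vs. the Poisson(lam * p) pmf.
    counts = np.arange(0, 25)
    empirical = np.array([(thinned == k).mean() for k in counts])
    theoretical = stats.poisson.pmf(counts, lam * p)
    print(np.round(np.c_[counts, empirical, theoretical], 4))
    ```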

  • Members 78 posts
    May 2, 2023, 3:15 p.m.

    Yes, thinning the incoming process by randomly removing a percentage of the population still maintains all the characteristics of a Poisson process (homogeneous rate, independent events, no collisions).

  • Members 878 posts
    May 3, 2023, 7:41 a.m.

    This does not explain why it would be a Poisson process. It just says that its expectation would be 𝝺p.

    One has to start with the probability distribution of a Poisson process and take it from there, if you want to follow this path.
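
    For completeness, following that path gives the standard thinning computation (sketched here, with N the base Poisson count at rate 𝝺 and K the number of accepted arrivals):

    ```latex
    P(K=k) = \sum_{n=k}^{\infty} \frac{e^{-\lambda}\lambda^{n}}{n!}\binom{n}{k}p^{k}(1-p)^{n-k}
           = \frac{e^{-\lambda}(\lambda p)^{k}}{k!}\sum_{m=0}^{\infty}\frac{\bigl(\lambda(1-p)\bigr)^{m}}{m!}
           = \frac{e^{-\lambda p}(\lambda p)^{k}}{k!}
    ```

    i.e. the accepted counts come out Poisson again, now with parameter 𝝺p.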

  • May 3, 2023, 9:23 a.m.

    You don't know in advance what the 'noise' is, so you can't associate a process with creating the 'noise'. That's the basic problem with shot noise: it is the actuality, yet our 'expectation' is a smoothly graduated analog scene - part of where the quantum world diverges from what we expect.

    Can't agree with that. If the 'expected value' were just the mean, we'd call it the 'mean value'. That's one of the problems facing NR - if the expected value were just the mean (over some period), then NR would be easy - but actually we have an expectation that is something different. That becomes particularly problematic when the photon counts are low enough that there isn't a statistically significant mean.

  • Members 878 posts
    May 3, 2023, 10:03 a.m.

    Know it or not, it is there. For shot noise, it is a Poisson process with some expected value.

    In what way is that a problem?

    Some do. From the wikipedia page: "Despite the newly abstract situation, this definition [of expected value] is extremely similar in nature to the very simplest definition of expected values, given above, as certain weighted averages." Mean = average in this case.

    Also: "In probability theory, the expected value (also called expectation, expectancy, mathematical expectation, mean, average, or first moment) is a generalization of the weighted average. Informally, the expected value is the arithmetic mean of a large number of independently selected outcomes of a random variable."

    It is the mean (with respect to the probability measure) of the process. Not of the outcome of each experiment. Of course, with a low sample count, we would have a poor estimate of it.