Analog ISO brightening is commonly deferred until after the ADC to increase the available highlight headroom when using various DR modes; in those modes the brightening is instead handled digitally by the JPEG processing engine.
I can draw whatever schematic you want. However, that doesn't mean it corresponds to reality.
There exist some cameras where the ISO setting does not affect the raw data values at all. For instance, the Fujifilm X-H1 can act this way. In that case the effect of the diagram block labeled "ISO" is not altered by the ISO setting.
While specific (even common) implementations of ISO are interesting to discuss, such implementations are outside the scope of the ISO standard. Different cameras are free to implement ISO in dramatically different ways.
I enjoy reading your blog articles. Thank you for those.
This very short, narrowly focused little article has begun to resemble a Rorschach test: people read into it topics it doesn't address, or form impressions from what it doesn't say.
I can tell you that nothing I read in the article gave me "the false impression that the input to the HVS [Human Visual System] should be non linear". I came away with the opposite impression from reading it. The article differentiates between what is Linear (the light we see and the raw capture files which record it) and Nonlinear (our visual perception of that light and processed image files with gamma adjustment applied).
Once again, the article does not delve into all of the linear and nonlinear pathways of human vision and perception, such as how the cones in the retina function, or the vast array of other related topics. It doesn't cover the raw processing pipeline or imaging chain in any depth. There is much that it doesn't discuss 'cause it ain't that kind of article. Its scope is narrowly limited and the coverage is brief, but it hopefully stimulates enough interest for some to seek out additional material and learn more.
A different article or book may offer more expansive and deeper exploration of topics left unaddressed or only lightly touched upon in this article. A different author may have expressed what is covered with different language or in a different style. A reader may want additional content to learn more about the subjects raised. I think that's all great! With all due respect, what I don't intend to do is to continue discussing this same article ad nauseam.
Please continue adding to your own blog articles as they are informative and well written. Best regards to you.
Given the audience for which the article was written, I took the statements regarding how linear images "look" without gamma correction to reflect, in practical terms, the standard practice for consumer digital imaging, as opposed to a generic statement that is true in every possible scenario, such as a wholly linear workflow. As I mentioned in an earlier reply, the description of linear images as very dark corresponds with my own experience.
As examples of "standard practice for consumer digital imaging", you can look at this linked image gamma comparison as a practical demonstration on the display device you're currently using. The very dark image on the left is linear (no gamma correction) and the image on the right is the same image with gamma correction. Another example is this comparison, labeled with the encoded gamma of each image. They resemble the difference seen in the referenced images from the article because the images we commonly produce every day are processed with built-in nonlinear gamma correction, the convention for consumer digital image files. This de facto standard has roots both in history and in efficiency of encoding.
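To make the "very dark" effect concrete, here's a minimal Python sketch of what happens when linear data is sent to a display that expects gamma-encoded input. The sRGB encoding constants are standard; the 0.18 mid-gray value and the simple 2.2 power-law decode are illustrative choices, not any particular camera's or display's exact behavior:

```python
# Sketch: why linear data looks dark on a gamma-expecting display.
# An sRGB display effectively applies a ~2.2 decoding gamma, so linear
# data sent without encoding gets its mid-tones crushed toward black.

def srgb_encode(v):
    """Standard sRGB encoding (linear -> display-referred), v in [0, 1]."""
    if v <= 0.0031308:
        return 12.92 * v
    return 1.055 * v ** (1 / 2.4) - 0.055

linear_midgray = 0.18  # ~18% scene reflectance (illustrative)

encoded = srgb_encode(linear_midgray)    # what should be in the file
shown_unencoded = linear_midgray ** 2.2  # display decodes linear data anyway

print(f"properly encoded value:        {encoded:.3f}")          # ~0.46
print(f"mid-gray shown with no encode: {shown_unencoded:.3f}")  # ~0.02
```

An 18% gray that should land near mid-scale on screen instead displays at about 2% of full brightness, which is exactly the "very dark" look of the left-hand comparison image.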
Nothing in the article says light is nonlinear. In fact it's a major point of the article that light is indeed linear.
My impression is that the encoding of gamma correction as currently implemented is, to some degree, the result of legacy systems technology and design practices which have become entrenched as industry standards for the encoding of processed images and for the most widely used color spaces like sRGB, Adobe RGB, ProPhoto RGB, etc. Here's another link to an article worth reading, with a quote that briefly summarizes one piece of a much larger set of interconnected moving parts...
Noise is an issue in low light photography because light itself is noisy. The noise inherent in the quantum nature of light (called "shot noise") is usually the main factor in low light image noise. Cameras are not perfect, and they do add some additional noise; however, this is usually minor compared to the shot noise.
If cameras were perfect, and they did not add any noise to the image, we would still get noise in low light photography.
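A quick simulation illustrates this. This is a sketch assuming an ideal, noiseless sensor that does nothing but count Poisson-distributed photon arrivals; the Knuth sampler, the seed, and the sample counts are just illustrative choices:

```python
import math
import random

# Sketch: shot noise alone, from an ideal noiseless camera.
# Photon arrivals are Poisson-distributed, so even a perfect sensor
# sees a standard deviation of sqrt(N) on a mean signal of N photons.

def poisson_sample(mean, rng):
    # Knuth's algorithm; adequate for the modest means used here.
    L = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(42)
for mean_photons in (9, 100):
    samples = [poisson_sample(mean_photons, rng) for _ in range(20000)]
    m = sum(samples) / len(samples)
    var = sum((s - m) ** 2 for s in samples) / len(samples)
    print(f"mean photons {mean_photons:3d}: measured std {var ** 0.5:5.2f}, "
          f"sqrt(mean) {mean_photons ** 0.5:5.2f}")
```

With no camera noise added at all, the measured standard deviation tracks the square root of the mean photon count, which is why dim exposures (few photons) look noisy even from a perfect camera.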
Well, in a sense they all count photons, just not very precisely. And noise will always be there because it is inherent in how light is emitted and detected.
Ideally, the Data Numbers written in the raw file are proportional to the intensity of the incoming light. We can actually measure the constant of proportionality to pretty decent accuracy for a given setup: for instance, 8 green photons per DN at base ISO.
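As a sketch of what that proportionality means in practice: the 8 photons/DN gain is the figure above, while the 512 DN black level is a hypothetical offset that varies from camera to camera:

```python
# Sketch: DN -> estimated photon count, assuming the gain quoted above
# (8 green photons per DN at base ISO) and a hypothetical black level
# of 512 DN -- both numbers are specific to a given camera and setup.

PHOTONS_PER_DN = 8    # measured conversion gain (setup-specific)
BLACK_LEVEL_DN = 512  # hypothetical offset added before writing the raw file

def dn_to_photons(dn):
    """Estimate detected photons from a raw green-pixel Data Number."""
    return max(dn - BLACK_LEVEL_DN, 0) * PHOTONS_PER_DN

print(dn_to_photons(512))   # 0 -- at the black level
print(dn_to_photons(1512))  # 8000 -- 1000 DN above black
```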
Sometimes I see shopkeepers go into a bank with bags of coins they need to deposit. Generally the shopkeeper has them already sorted into different coin denominations. The bank counts them by putting them on a scale.
Thank you for the kind words; you obviously understand the subject matter well. On the other hand, after reading that paper, I was one of the beginners who early on, and for quite a while, believed incorrectly that digital sensors lacked 'the compressive nonlinearity typical of human perception' and was therefore convinced that as a result the raw data would produce dark images if processed linearly. I made innumerable posts at the old farm based on these misconceptions.
Then, with the help of others, some of them in this thread, I learned a bit more and realized that none of that was true. So when that article comes up, I get a Freudian knee-jerk overreaction to set the record straight.
Right. As long as everyone understands that any gamma introduced by the processing needs to ideally be completely undone by the output medium before it hits our eyes to make the system from scene to eye linear end-to-end, we are on our way.
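A minimal sketch of that end-to-end cancellation, assuming a plain 2.2 power law on both sides rather than any particular real encoding:

```python
# Sketch: end-to-end linearity. The gamma applied during processing is
# (ideally) exactly undone by the output medium's decoding gamma, so
# scene luminance maps linearly to emitted light. A simple 2.2 power
# law stands in for the real transfer functions here.

GAMMA = 2.2

def encode(linear):   # processing: scene-linear -> file values
    return linear ** (1 / GAMMA)

def decode(encoded):  # display: file values -> emitted light
    return encoded ** GAMMA

for scene in (0.01, 0.18, 0.5, 1.0):
    emitted = decode(encode(scene))
    print(f"scene {scene:.2f} -> emitted {emitted:.2f}")  # linear end to end
```

The nonlinearity lives only in the intermediate file values; from scene to eye, the system is linear.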
Are you just making that up? When I worked in the branches, if things were prerolled, we'd credit their account accordingly. It'd be verified later and the deposit adjusted if discrepancies were found (shady types were known to slip nickels into quarter rolls). Or they'd bring it loose and we could have a machine count it, sort it, and credit their account when done. But that wasn't in branch; that was sent to proof. At no point did a scale ever factor in. For currency, it gets fed into a machine and counted. Newer ones can read the denominations and sort; older ones just counted the bills and our eyes confirmed the denominations.
You are correct on the history. Gamma correction stems from CRTs -- note the "correction" in the name. The concept of gamma predates CRT imaging, though. Those three color spaces all use different tone curves, and the ProPhoto RGB color space in particular was not designed to be directly displayed on a CRT. Nonlinear tone curves are useful for reducing the precision required to encode wide-DR, perceptually pleasing images. However, not all tone curves have anything to do with CRTs; consider those in CIE L*a*b* and CIE L*u*v*.
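A small sketch of the precision point, using a plain 1/2.2 power law as a stand-in for a real tone curve and an arbitrary deep-shadow value:

```python
# Sketch: why a nonlinear tone curve reduces the precision needed.
# Quantizing linear data to 8 bits leaves deep shadows coarsely
# stepped; encoding with a 1/2.2 power law (a stand-in for a real
# tone curve) first spreads the 256 codes more perceptually evenly.

GAMMA = 2.2
LEVELS = 2 ** 8 - 1  # 8-bit encoding

def quantize_linear(v):
    return round(v * LEVELS) / LEVELS

def quantize_gamma(v):
    code = round(v ** (1 / GAMMA) * LEVELS)  # encode, then quantize
    return (code / LEVELS) ** GAMMA          # decode back to linear

shadow = 0.002  # an arbitrary deep-shadow linear intensity
lin_err = abs(quantize_linear(shadow) - shadow) / shadow
gam_err = abs(quantize_gamma(shadow) - shadow) / shadow
print(f"relative error, linear 8-bit:        {lin_err:.1%}")
print(f"relative error, gamma-encoded 8-bit: {gam_err:.1%}")
```

At 8 bits, the gamma-encoded path represents that shadow value with a small fraction of the error of straight linear quantization, which is why linear encodings generally need more bits to avoid visible banding.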
Different countries, I guess. In the UK coins are designed to weigh the same if of the same denomination. When they started making them of plated steel rather than copper, they got fatter because steel is less dense than copper. The scale is simpler than a counting machine and every branch can have one - in fact each teller's position has its own scale. No need to send it outside, the customer can get their cash credited immediately.
Which "digital" sensors have 'the compressive nonlinearity typical of human perception'? The analog sensors used in consumer digital cameras, at least those of which I'm aware, measure the light falling on them in a linear fashion and lack "the nonlinearity typical of human perception" in their light intensity measurements. They just measure photons in a linear fashion as they respond to the linear light entering the sensor.
Since human perception of light is nonlinear (though the light itself is linear), I presume that's not what you consider to be incorrect in the statement you find objectionable.