• Members 1737 posts
    April 20, 2023, 3:50 p.m.

    Someone recommended a Bruce Fraser white paper that he wrote for Adobe in 2004. It’s still up on the web. I read it. It seemed that it was far from Bruce’s finest work. I read it again, and noticed more things that are wrong with it. I now think that it is not only error-ridden, but that it is conceptually wrong and should be pulled. Some of the errors are being promulgated even today, and so some explication is in order.

    In this post, I'm only going to concern myself with the first two errors.

    Here's the beginning of the article:

    There are two assertions that I’d like to examine. The first is that film and digital respond to light in fundamentally different ways, and the second is that film responds to light like the human visual system (HVS).

If you’re used to film sensitometry, you’re familiar with the Hurter and Driffield (aka H&D) curves. They are also known as D log E curves. In these curves, the x-axis is the base 10 logarithm of the exposure, and the y-axis is the base 10 logarithm of the absorption of light. That last quantity is also known as density. In the following graph, I’ve plotted a linear response, which is what most digital sensors have. It is possible to develop film so that it is linear through much of its exposure range, and that line can serve as a proxy for such film. Negative film usually has more of what Bruce calls compression: if you lower the exposure to half of what it was, the density won’t halve, but will go down by less than that. A gamma of 0.7 is not uncommon for developed negative films. Slide, or reversal, film not only doesn’t compress the input, it expands it: if you halve the exposure, the density will change by more than that. A gamma of 1.5 is typical of slide films developed normally.

Gamma in digital images is specified differently from gamma in film. An image encoded with a gamma of 2.2 will look right on a monitor with a gamma of 2.2. In film sensitometry terms, we’d say that file had a gamma of 0.45. There are several ways to estimate the nonlinearity of human visual response. For small changes, the lightness axis of CIELAB works pretty well. Over most of its range, CIE L* is proportional to the input luminance raised to the one-third power. In display gamma terms, we’d say it has a gamma of 3.

    All of those are plotted on the graph below.

    film digital handd.png

    In order to make the different sensor media comparable, I’ve rendered the negative plot above as a positive.
    The two film curves are on both sides of the linear one that represents digital capture. Even the flattest film curve is a long way from a gamma 2.2 display, and even further from the human visual system lightness nonlinearity. I see little justification for saying that film capture offers nonlinearities that are far different from digital capture, or for making the claim that film capture nonlinearities are close to those of the human visual system.
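As a cross-check on the curves discussed above, here is a short NumPy sketch (the gamma values are the ones quoted in the post; the labels are mine). On D-log-E axes a pure power law is a straight line, and fitting its slope recovers the gamma:

```python
import numpy as np

# Normalized linear exposures spanning about 6.6 stops
exposure = np.logspace(-2, 0, 5)   # 0.01 .. 1.0

curves = {
    "linear (digital)":  exposure,                # digital sensor response
    "gamma 0.7 (neg)":   exposure ** 0.7,         # negative film, rendered as a positive
    "gamma 1.5 (slide)": exposure ** 1.5,         # reversal film
    "1/2.2 (display)":   exposure ** (1 / 2.2),   # encoding for a 2.2 monitor
}

for name, curve in curves.items():
    # On D-log-E axes, the slope of log10(response) vs log10(exposure) is the gamma
    slope = np.polyfit(np.log10(exposure), np.log10(curve), 1)[0]
    print(f"{name:>18}: fitted gamma = {slope:.2f}")

# CIE L* (lightness) is a cube root over most of its range
def cie_lstar(y):
    y = np.asarray(y, dtype=float)
    return np.where(y > 0.008856, 116 * np.cbrt(y) - 16, 903.3 * y)

print(f"L* of 18% gray = {cie_lstar(0.18):.1f}")   # about 49.5
```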

    film digital handd.png

    PNG, 70.6 KB, uploaded by JimKasson on April 20, 2023.

  • Members 78 posts
    April 20, 2023, 4:49 p.m.

Right. My pet peeve with that white paper is that it gives the impression that a raw/linear image is 'dark' because the sensor does not 'mimic the eye's response to light'. This is incorrect and misleading: the eye expects a linear image as input, as previously mentioned here.

    Demo: The following two images are based on the exact same linear raw data. One of them was white balanced and saved as is with a linear gamma profile; the other in addition had gamma applied assuming a 2.2 gamma monitor and saved with an sRGB profile. They are from this article that explains how raw images are rendered.
    DSC_4022cfaGamma1.jpg

    DSC_4022cfaGamma22.jpg

If the site and the browser are properly color managed, and the monitor is calibrated to a 2.2 gamma, they should look similarly bright,* because the objective of the imaging system as a whole (camera, software, display medium) is to deliver luminance to the eye proportional (i.e. linear) to that from the scene. Any non-linearity introduced by film or the monitor ideally needs to be fully undone before the image reaches the eye.

    So digital sensors respond to light just like the eye does, linearly. The HVS is then free to respond non-linearly to that linear input, as it always does.
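A toy numerical version of the demo above (the 0.18 mid-gray value is an illustrative assumption, not taken from the actual images; the "dark" case stands in for a linear file shown without color management):

```python
# Toy mid-gray scene value in normalized linear raw units (0.18 is an
# illustrative assumption, not taken from the demo images)
scene = 0.18

# Case 1: file saved with a 1/2.2 gamma encode, shown on a gamma-2.2 monitor
encoded = scene ** (1 / 2.2)
on_screen_gamma = encoded ** 2.2        # the monitor decodes it back
print(on_screen_gamma)                  # about 0.18: proportional to the scene

# Case 2: file saved linear but shown WITHOUT color management on that monitor
on_screen_linear = scene ** 2.2
print(on_screen_linear)                 # about 0.023: the 'dark raw image' effect
```

A properly color-managed viewer would undo the monitor's power function for the linear-tagged file as well, which is why the two demo images should look similarly bright.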

    Jack

* If they don't, it's because one of the components in the system is not doing its job properly. I probably should have chosen a better example, but this is what I had lying around.
    DSC_4022cfaGamma1.jpg

    JPG, 135.0 KB, uploaded by JackHogan on April 20, 2023.

    DSC_4022cfaGamma22.jpg

    JPG, 232.5 KB, uploaded by JackHogan on April 20, 2023.

  • Members 1737 posts
    April 20, 2023, 5:08 p.m.

I am. But the white paper lives on. I think it should be taken down. It is not indicative of the quality of his work.

  • Members 976 posts
    April 20, 2023, 5:14 p.m.

    Much of what Bruce wrote will be hard to find, but this one is easy and it doesn't represent his legacy.

  • Members 511 posts
    April 20, 2023, 5:33 p.m.

    The link was posted merely as a reference to his passing.
    The website is a tribute created by the photographer Stephen Johnson.

  • Members 511 posts
    April 20, 2023, 5:43 p.m.

    Jim, if I'm not mistaken that pdf is being hosted on some "Mickey Mouse" website. lol
    Maybe google found it.

    If Bruce had survived and revisited his work, he would have been the first one to hold his hand up and say... 'okay fellas, this is incorrect.'
    You sure don't need me to tell you that a lot of water has gone under the bridge in the last 17 years.

    Sorry to hear about your recent mishap.
    I enjoy your writings at your blog, and elsewhere.

  • Members 1737 posts
    April 20, 2023, 5:46 p.m.

    It was posted on this site recently as an exemplar. The poster has defended it as such. I didn't go looking for it.

  • Members 1737 posts
    April 20, 2023, 5:57 p.m.

It appears that Adobe has taken that white paper down or moved it, because this link no longer works:

    wwwimages2.adobe.com/content/dam/acom/en/products/photoshop/pdfs/linear_gamma.pdf

    But there are many people currently hosting that white paper in whole or in part.

  • Members 50 posts
    April 20, 2023, 6:09 p.m.

I have been told that film, at least near the top of its exposure range, does not simply blow out to white (actually black in the negative) the way digital sensors do. In other words, film handles overexposed highlights better. I always wondered whether this is actually true. If it is, film is indeed more non-linear than a sensor.

    The next question I have in this regard is about the dynamic range of the eye. It seems to be only so big because we can adapt with the iris. So, saying that the eye can adapt to 10000 times the light (12 stops), but the sensor cannot, makes no sense to me. A camera has an aperture and a shutter speed.
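As a side note on the arithmetic: converting a luminance ratio to stops is just a base-2 logarithm, and a 10,000:1 ratio works out to about 13.3 stops rather than 12:

```python
import math

# Stops are the base-2 logarithm of the luminance ratio
ratio = 10_000
print(f"{ratio}x = {math.log2(ratio):.1f} stops")   # 13.3 stops
```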

  • Members 1737 posts
    April 20, 2023, 6:13 p.m.

    Agree. I don't think the person who originally posted the link on this forum is an engineer. But the concepts that Bruce discussed were associated with the engineering aspects of digital -- and film -- imaging systems.

    But the date of publication is not the issue here. Accuracy is. The paper was rife with problems the day it was published.

  • Members 1737 posts
    April 20, 2023, 6:14 p.m.

    That is true of negative film. That is not true of slide film.

    The nonlinear portion of a digital sensor highlight response is typically hidden from the camera user by the people who designed the camera.

    However, you could argue that a smooth rolloff is less nonlinear than hard clipping.

  • Members 1737 posts
    April 20, 2023, 6:18 p.m.

    The eye's high DR comes from neural processing, not from the iris, which has quite a limited range.

  • Members 1737 posts
    April 20, 2023, 10:24 p.m.

I've been asked to explain where the 0.45 comes from. It's the reciprocal of 2.2. The film world specifies the gamma that the capture medium has when developed. The display world specifies the gamma of the monitor on which the image will look right.

  • Members 78 posts
    April 21, 2023, 7:30 a.m.

    If the monitor's response is non-linear to the tune of a power function with a 2.2 exponent (gamma), and we want light hitting the eye to be proportional (linear) to light from the scene, we need to apply a pre-emphasis to light from the scene to the tune of a 1/2.2 (≈0.45) exponent to counteract what will happen at the monitor. Simplifying somewhat, 'applying gamma' is all about performing this operation: i^0.45, with i representing normalized intensity in the linear raw data. Then after the monitor:

    (i^0.45)^2.2 ≈ i = linear with respect to light from the scene.
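That identity is easy to verify numerically, e.g. with a quick NumPy sketch:

```python
import numpy as np

i = np.linspace(0, 1, 11)              # normalized linear raw intensities
pre_emphasized = i ** (1 / 2.2)        # 'applying gamma' in the file
after_monitor = pre_emphasized ** 2.2  # the monitor's power-function response

# Round trip recovers the input: linear with respect to scene light
assert np.allclose(after_monitor, i)
```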

    Current good quality LCD monitors can be made to effectively provide virtually arbitrary responses, including linear.

  • April 21, 2023, 8:15 a.m.

And as I remember, gamma correction was originally put in place to deal with the non-linearity of CRT displays.

  • Members 128 posts
    April 21, 2023, 8:33 a.m.

    So, how linear can I reasonably expect RAW files to be?

    Say at base ISO, between 10% and 90% of the way between black and clipping, measured over a 40k pixel (200x200) patch, in a single channel of a Bayer sensor, with WB fixed (thanks for that last bit, Nikon).

    If (in RawDigger) a patch in one test reads 88.00%, and a patch in another test reads 22.00%, I estimate that 4.000x more photons have gotten past the CFA in the first test than in the second test.

    What can I expect the +/- to be in the "4.000x" ? (From indirect evidence, I'm guessing +/-0.5%, 0.01EV or better).
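For what it's worth, the arithmetic behind that estimate can be sketched like this (the patch values are from the question; the ±0.5% figure is the poster's guess, not a measurement):

```python
import math

# RawDigger-style patch readings, as % of the raw clipping point
patch_a, patch_b = 88.00, 22.00

ratio = patch_a / patch_b              # 4.000 if the response is linear
print(f"ratio = {ratio:.3f}, delta = {math.log2(ratio):.3f} EV")

# A +/-0.5% relative error in the ratio is about log2(1.005) EV
print(f"+/-0.5% in the ratio is about +/-{math.log2(1.005):.4f} EV")
```

So a ±0.5% ratio error corresponds to roughly ±0.007 EV, consistent with the "0.01 EV or better" guess.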