• Removed user
    March 20, 2024, 12:17 a.m.

    Yes - one would think that, in a perfect world, variations between different sensor outputs would be eliminated during conversion to RGB by matrices or by Color Lookup Tables, such that the RGB results would show barely noticeable differences, e.g. a ΔE of less than, say, two or three in CIE LAB space.
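
    For concreteness, a minimal sketch of that "two or three in CIE LAB" yardstick (CIE76 colour difference; the Lab values below are made up for illustration):

    ```python
    import numpy as np

    def delta_e_76(lab1, lab2):
        """CIE76 colour difference: Euclidean distance in CIELAB."""
        return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))

    # Hypothetical renderings of the same patch from two different cameras:
    print(delta_e_76((52.0, 18.0, -30.0), (53.5, 16.5, -28.0)))  # ~2.9, near the threshold
    ```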

  • Removed user
    March 20, 2024, 12:39 a.m.

    Are you sure?

    This is a raw composite (no conversion) image exported from a Lumix G9 raw file:

    P1000564-RAWcomposite.jpg

    Perhaps we are talking at cross-purposes. It is obvious that without a CFA over the photocells the sensor outputs would have no color ... but the raw file does include the effect of the CFA as can be seen above.

    P1000564-RAWcomposite.jpg (JPG, 437.9 KB), uploaded by xpatUSA on March 20, 2024.
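
    For anyone curious, here is a minimal sketch (assuming an RGGB Bayer layout and numpy; real cameras vary) of how such a "raw composite" can be built: each raw sample is simply dropped into its nominal R, G, or B channel, with no demosaicing and no colour conversion:

    ```python
    import numpy as np

    def raw_composite(raw):
        """Place each Bayer sample into its nominal channel (RGGB assumed)."""
        h, w = raw.shape
        rgb = np.zeros((h, w, 3), dtype=raw.dtype)
        rgb[0::2, 0::2, 0] = raw[0::2, 0::2]  # 'red'-filtered sites
        rgb[0::2, 1::2, 1] = raw[0::2, 1::2]  # 'green' sites, even rows
        rgb[1::2, 0::2, 1] = raw[1::2, 0::2]  # 'green' sites, odd rows
        rgb[1::2, 1::2, 2] = raw[1::2, 1::2]  # 'blue'-filtered sites
        return rgb
    ```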

  • Members 651 posts
    March 20, 2024, 1:10 a.m.

    But what you posted was a jpg, so naturally it has color. Assuming we could view a RAW file directly, though, wouldn't it be a bunch of pixels that are each entirely red, green, or blue? And wouldn't the hue of that red, green, and blue necessarily correlate to the dyes in the CFA?

  • Members 2316 posts
    March 20, 2024, 1:14 a.m.

    That's not true 🤪 The voltages are measured from the colour-filtered photodiodes, and I believe every sensor maker has their own colour mixtures.

  • Members 878 posts
    March 20, 2024, 1:29 a.m.

    [deleted]

  • Members 1445 posts
    March 20, 2024, 2:43 a.m.

    Agreed. Raw is just numbers. How these are turned into colour is what is being discussed. There is the manufacturer's interpretation and then there is any interpretation you care to apply to that RAW data.
    Interestingly, when people do blind tests to decide which camera/processor produced which image, the variation is huge. By and large, people are very poor at telling which result came from which camera. Colour interpretation/rendition is a very subjective thing.

  • March 20, 2024, 8:02 a.m.

    They represent an energy-weighted photon count. The filters select (loosely) by energy. We lazily call them 'red', 'blue' and 'green', but that's not strictly right. When talking about the eye, the responses are 'S' (short), 'M' (medium) and 'L' (long). In the CIE colour model the basal stimuli are 'X', 'Y' and 'Z'.
    The channels in a digital camera sensor are not standardised, so they could be called anything; 'red', 'green' and 'blue', while descriptive, can be misleading.

  • March 20, 2024, 8:05 a.m.

    It's actually charge that's being measured, not voltage. It gets turned into a voltage along the way. I suppose that you can go a step back from that. The charge is the aggregate charge of a bunch of electrons, which in turn were freed from silicon atoms by incident photons. So in the end what's happening is photons are being counted.
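
    As a rough sketch of that chain for a single photosite (every parameter value below is an assumption for illustration, not any real sensor's spec):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    photons   = rng.poisson(1000)             # photon arrivals are Poisson-distributed
    electrons = rng.binomial(photons, 0.55)   # assumed ~55% quantum efficiency
    charge    = electrons * 1.602e-19         # aggregate charge on the sense node, coulombs
    voltage   = electrons * 40e-6             # assumed ~40 uV per electron conversion gain
    dn        = round(voltage / 1.0 * 16383)  # 14-bit ADC over an assumed 1 V swing
    print(photons, electrons, charge, voltage, dn)
    ```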

  • March 20, 2024, 8:12 a.m.

    Effectively that is exactly what does happen. All colour spaces in use are defined by reference to the CIE XYZ space. Conceptually, processing converts the camera-native space to XYZ, then converts XYZ to the target space, quite often using LUTs (especially in older cameras with limited computational power). It's more computationally efficient to chain the conversions and go directly from camera to target, skipping XYZ along the way: work out the single function that performs the two translations combined, and program that directly. Remember that a colour temperature operation also has to be performed, in effect removing the spectrum of the illuminating light and replacing it with that of the standard illuminant.
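
    A sketch of the chaining idea (both matrices below are made-up placeholders, not any real camera's or colour space's values):

    ```python
    import numpy as np

    cam_to_xyz = np.array([[0.7, 0.2, 0.1],        # hypothetical camera-native -> XYZ
                           [0.3, 0.6, 0.1],
                           [0.0, 0.1, 0.9]])
    xyz_to_target = np.array([[ 2.0, -0.8, -0.2],  # hypothetical XYZ -> target RGB
                              [-0.5,  1.6, -0.1],
                              [ 0.1, -0.3,  1.2]])

    # Pre-multiply once; then every pixel needs only a single matrix multiply:
    cam_to_target = xyz_to_target @ cam_to_xyz

    pixel = np.array([0.4, 0.5, 0.3])  # white-balanced camera-native values
    assert np.allclose(cam_to_target @ pixel, xyz_to_target @ (cam_to_xyz @ pixel))
    ```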

  • Members 2316 posts
    March 20, 2024, 8:45 a.m.

    I will stick with voltage 😁
    Quote: "There is no physical variable of any kind that you can measure directly. You either measure it by calibrating an instrument against a constant of nature, or you declare a fundamental constant of nature to be a fixed value without measuring it (e.g. the speed of light). Charge is measured in one of several ways: by the force it exerts at a known distance from a reference charge (e.g. the charge of an electron), or deduced from the trajectory of a particle of known mass in a known magnetic field. In engineering practice we measure charge by the voltage on a known capacitor. Is this abstract? It probably is, but how do you measure distance? The only officially precise way is to measure the time it would take light to travel between the two points. But then how do we measure time? I will stop here, and would encourage you to take a look at the branch of physics called metrology."
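
    To put numbers on the "known capacitor" method in the quote: Q = C × V, so a (hypothetical) 5 fF sense node read at 1 mV holds 5 × 10⁻¹⁸ C, which at 1.602 × 10⁻¹⁹ C per electron is about 31 electrons.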

  • March 20, 2024, 9:32 a.m.

    Well, you're wrong.

    From whom?

    So you're just choosing to go one step further down the line. By the same token, there is no direct way to measure electrical potential (aka 'voltage'). Moreover, potential is not absolute; it's always relative to another point. Charge is absolute, one of the basic properties of matter. So what you're measuring is charge, via current, then potential, then the relative potential in the channel of a MOS transistor, etc. What you're measuring is charge.

  • Members 2316 posts
    March 20, 2024, 10:05 a.m.

    Quote, from an answer by a professor of physics at UC Berkeley, author of "Now: The Physics of Time" (2016):
    We have no way to calculate the charge of the electron. Indeed, many people consider such a calculation to be an ultimate goal of unified field theory. Put in dimensionless units, the charge is usually referred to as the "fine structure constant", represented by the Greek letter alpha.

    Perhaps you mean “how was the charge on the electron first measured?” That was done by Millikan in his famous “oil drop” experiment - see Wikipedia

  • Members 86 posts
    March 20, 2024, 10:20 a.m.

    OK, before we go too far down a rabbit hole here regarding "colour science", we have to define and understand that we are talking specifically about colour perception, not about colour itself, the physics of light, or any 'actual' colour.

    @bobn2 Yes, absolutely.

    RGB is misleading, especially the assumed symmetry it implies via the meaning of the labels. There is no real link between RGB sensors, RGB colour spaces, and the RGB output of additive colour systems such as computer/phone screens, other than the way the human eye sees and perceives colour. Colour accuracy is defined by comparisons made by the standard (or average) human eye under strict reference illumination in the subtractive colour model.

    Additive colour, as in the RGB colour from your screen, is not colour at all. No colours are composed of R, G, and B components, nor can they be broken down into them; similarly, R, G, and B mixed do not create the colours you see on your screen.

    The eye is a remarkable thing in the way it gets around the problem of balancing accuracy against efficiency. All colour is a combination of many different wavelengths. So if you want high accuracy you really need to record those wavelengths and their relative intensities, but the problem is that this takes lots of sensors tuned to different bands, and so efficiency (and sample rate, therefore accuracy) falls as increasingly more of the light lands on sensors that fail to record it.

    The human eye effectively does this with three sensors (in the simple model). Basically, it reduces all colour to three variable signals.

    So it is theoretically, and practically, possible to produce the sensation of colour in a human eye simply by stimulating the three receptors to produce the same signals as a real colour would, basically by shining narrow bands (wavelengths) of red, green, and blue light directly at the eye.
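
    A sketch of that three-signal reduction (the Gaussian curves below are crude stand-ins for the real tabulated CIE colour-matching functions, for illustration only):

    ```python
    import numpy as np

    wl = np.arange(380.0, 781.0, 5.0)  # wavelength grid, nm

    def gauss(mu, sigma):
        return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

    # Rough stand-ins for the CIE 1931 x-bar, y-bar, z-bar curves:
    x_bar = 1.06 * gauss(599, 38) + 0.36 * gauss(446, 19)
    y_bar = gauss(556, 46)
    z_bar = 1.78 * gauss(449, 22)

    def tristimulus(spectrum):
        """Reduce a whole spectral power distribution to just three numbers."""
        return tuple(np.trapz(spectrum * cmf, wl) for cmf in (x_bar, y_bar, z_bar))

    # Two physically different spectra that produce the same three signals
    # (a metameric pair) are indistinguishable to the eye.
    print(tristimulus(np.ones_like(wl)))  # broadband 'white'
    ```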

    Now the problems here are that a camera sensor can't replicate either the complexity of the spectral sensitivities of the cones or their behaviour when exposed to light. So we have a compromise, both in the spectral sensitivities of the sensor's RGB filters and in the selection and widths of the RGB primaries the screen emits. And to do all this on a computer requires that we create an absolute model of numerical colour.

    But again, two things:
    1. Colour accuracy and the LUTs are still defined by the standard (or average) human eye, not by wavelength or measurement of actual light.
    2. RGB has little to nothing to do with real colour or the physics of light.

    When you display jpegs on computer screens, you are creating an illusion of colour that looks similar to your own eye. There is no measured accuracy other than that, and no colour fidelity; in fact, there is no actual colour accuracy at all, as it is all a complete illusion based on nothing more than looking the same to the standard (average) human eye.

    [EDIT] To make it clearer: suppose you were to place a vase of daffodils on your desk next to your monitor, which was displaying the excellent photo you just took of those flowers, and then measured the actual light being reflected off the flowers and the light emitted by the computer screen showing the same flowers...

    ...You would find that there is nothing even remotely comparable between the two readings, other than that they look the same to a standard human eye.

    That holds for whatever voltage/charge is measured on whatever device, and however that is transformed into whatever numerical colour model.

    A simple rule, though: if you want to sell cameras to a mass market, go for good, clean, punchy primaries. We all want our photos and lives to look better than others' on social media. That is what we perceive as reality, and so it becomes reality. You can measure it if you like, but accuracy doesn't sell cameras; perversely, it doesn't look real compared to our perception/memory.

  • Members 538 posts
    March 20, 2024, 12:48 p.m.

    There's no actual interpretation of anything regarding color in the raw pixel values. A "color" raw image is actually a set of interleaved B&W images (stacked, for Foveon) shot through different color filters, and the raw data, as cooked or mangled as it sometimes is, has no color interpretation in it; that only happens when you convert it.
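
    A sketch of that view of the data (assuming an RGGB Bayer layout; the file name and dimensions are placeholders):

    ```python
    import numpy as np

    # Raw sensel values as one 2-D array (layout and dimensions assumed):
    raw = np.fromfile("frame.raw", dtype=np.uint16).reshape(3456, 4608)

    # Four interleaved monochrome images, one per CFA filter position:
    r  = raw[0::2, 0::2]
    g1 = raw[0::2, 1::2]
    g2 = raw[1::2, 0::2]
    b  = raw[1::2, 1::2]
    # Each plane is just a greyscale image; no colour meaning is attached
    # until a converter assigns one.
    ```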

  • Members 86 posts
    March 20, 2024, 1:36 p.m.

    The baseline starting point for colour is how the standard human eye perceives it reflected off objects in controlled conditions. Colour is meaningless unless related directly to the human eye.

    There is no saturated colour recorded in any colour reproduction system, because colour is not about absolute wavelength but about the average relative values of multiple wavelengths: a standardised average and deviations from it (colour temperature). Sensors don't record absolute wavelength, and neither does the human eye.

    It's misleading to talk about colour as a property of light. It is only an absolute property in the sense of the perceived colour of a surface of known reflectivity under strictly controlled ambient light, and this must be so to enable colour to be translated into a theoretical, numerical colour space and thus reproduced, with accuracy as defined by the standard human eye, across multiple platforms.

  • Removed user
    March 20, 2024, 1:51 p.m.

    Yes, that's exactly what my first Sigma camera did, with a matrix called 'CamToXYZ_Flash', where I believe "Flash" means XYZ(D55).

    In that camera the WB preset matrices remained in XYZ space, presumably followed by an XYZ-to-RGB transformation dependent on the selected RGB color space. All nice and clean and simple to follow ...

    Then, around the time Sigma took over Foveon, XYZ matrices disappeared from the raw meta-data and understanding it became less easy. Grump.
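
    For reference, a sketch of that final XYZ-to-RGB step for the sRGB case (matrix and transfer curve per IEC 61966-2-1; if the camera matrix lands in XYZ(D55), a chromatic adaptation to D65 would be needed first):

    ```python
    import numpy as np

    XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                            [-0.9689,  1.8758,  0.0415],
                            [ 0.0557, -0.2040,  1.0570]])

    def xyz_to_srgb(xyz):
        linear = np.clip(XYZ_TO_SRGB @ np.asarray(xyz), 0.0, 1.0)
        # sRGB transfer curve ('gamma' encoding):
        return np.where(linear <= 0.0031308,
                        12.92 * linear,
                        1.055 * linear ** (1 / 2.4) - 0.055)

    print(xyz_to_srgb([0.9505, 1.0, 1.089]))  # D65 white -> approx (1, 1, 1)
    ```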