• Members 13 posts
    March 26, 2023, 10:07 a.m.

Let's get things going in this forum. Here's a start on a topic often referred to in discussions about pixel density and high ISO vs sharpness and noise.

Here are two highly enlarged pictures of single pixels: one pixel from a very sharp part of an ISO 64 image, the other pixel from a very blurry part of an ISO 12800 image.
Please discuss which is which, and what general conclusions can be drawn from these examples.

    (All in good fun, of course)

    2.jpg

    JPG, 13.5 KB, uploaded by Rubank on March 26, 2023.

    1.jpg

    JPG, 13.1 KB, uploaded by Rubank on March 26, 2023.

  • Members 4 posts
    March 26, 2023, 11:11 a.m.

    2.jpg is lighter so it’s obviously the ISO 12800 shot.

  • Members 13 posts
    March 27, 2023, 9:42 a.m.

    Excellent analysis. High ISO gives brighter images.

Now we're getting somewhere! :)

  • Members 97 posts
    March 29, 2023, 7:58 a.m.

Since it is not yet April 1, I refrain from commenting.

    p.

  • Members 878 posts
    March 29, 2023, 4:44 p.m.

    The EXIF says one is at ISO 140, the other one is at ISO 250.

    What did I win?

  • Members 2 posts
    March 30, 2023, 8:29 a.m.

I'm going with 2.jpg as the ISO 64 shot, because at 129x128 pixels (0.016512 megapixels, perceptual or otherwise) it clearly has higher "resolution" than 1.jpg, which weighs in at only 128x128 pixels (0.016384 megapixels).

    I tried some slanted-edge analyses, but the contrast across the edge was too low to provide any additional insights.

  • Removed user
    March 30, 2023, 5:47 p.m.

    What?

  • Members 360 posts
    April 1, 2023, 7:20 p.m.

    WTH?

Since sharpness is properly described as a contrast ratio, with a single pixel there is no sharpness to talk about.
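To put that point in code: a minimal sketch (my own illustration, with made-up pixel values) of a gradient-style sharpness measure, which needs at least two neighbouring pixels before it can say anything at all.

```python
def sharpness(pixels):
    """Crude sharpness proxy: largest difference between adjacent pixels.
    Undefined for a single pixel, since there are no neighbours to compare."""
    if len(pixels) < 2:
        return None  # one pixel: no contrast, no sharpness
    return max(abs(b - a) for a, b in zip(pixels, pixels[1:]))

print(sharpness([30, 30, 200, 200]))  # abrupt edge -> 170
print(sharpness([30, 90, 150, 200]))  # smeared edge -> 60
print(sharpness([128]))               # single pixel -> None
```

The single pixel returns None rather than zero on purpose: the quantity simply is not defined there, which is the whole point.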

  • Members 1737 posts
    April 1, 2023, 7:52 p.m.

    WRT sharpness, none at all.

  • Members 16 posts
    April 3, 2023, 4:05 p.m.

    You also can't say anything about color from a single pixel. Color is not about wavelengths, nor even energy distribution over wavelengths, but about relative behavior across multiple pixels. Yes, I'm talking about that stuff that puzzled Land for so long, and that Wendy Carlos (the musician) actually explains rather well on her WWW site: www.wendycarlos.com/colorvis/color.html. What colors do you see in:

    www.wendycarlos.com/colorvis/gridmed.gif

    Open the image in a separate tab for the best effect -- it gets scaled funny here.

    That image really only contains red, white, and black...

  • Members 1737 posts
    April 3, 2023, 5:49 p.m.
  • Members 16 posts
    April 3, 2023, 11:52 p.m.

    Too narrow. ;-) Just to be clear, the colors that aren't really there can actually be photographed using color film, etc. In fact, that's how this cover shot from May 1959 Scientific American was made:

    www.wendycarlos.com/colorvis/landcov.jpg

    I first learned of this way back in the mid 1970s, and quickly confirmed that you really could take two monochrome images with color filters as close as just a few tens of nm, process the B&W images and then project them, and see -- and photograph -- full color images from typical scenes. Being clever, I figured I would shoot bright line spectra this way to really get the color response nailed down... however, the bright line spectra handled this way looked monochromatic! In any case, one pixel tells you nothing about color because (1) one pixel would normally be sampling only one color "channel" and (2) wildly different energy distributions over the sensed wavelengths can result in the exact same single pixel value.

Incidentally, my then high-school AP physics teacher told me about Land's work when I asked him for an explanation of something even stranger. When my brother and I were very young, we only had a B&W TV, but lots of shows were filmed in color and regularly announced the colors of various objects so folks could adjust their color TVs. Well, both my brother and I somehow learned to see B&W TV broadcasts in color! Our parents confirmed that we both were able to identify colors in shows we hadn't seen before with disturbingly high accuracy.

We both lost that ability by the time we were something like 6 years old, but the fact we had it always bothered me until I read Land's work, which actually confirms that distinguishing colors in monochrome images is actually possible provided the scene content has the right kind of structure. The retinex algorithm was the best result from Land's work, but he never got a complete explanation of what is happening, nor has anyone else that I'm aware of.

Incidentally, the 2-color trick that Land "discovered" was widely used in the printing industry to reduce cost in making apparently full-color printed images as far back as the 1800s; this weirdness has been around and in use for a very long time without being well understood.
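Retinex exists in several modern formulations; a minimal single-scale sketch (a 1-D simplification of my own, not Land's original procedure) shows the key idea that each output depends on a pixel's value *relative to its neighbourhood*, not on the raw value alone.

```python
import math

def box_blur(row, radius):
    """Crude local-average estimate of the illumination."""
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

def retinex_1d(row, radius=2):
    """Single-scale Retinex: log(signal) minus log(estimated illumination)."""
    return [math.log(p + 1.0) - math.log(b + 1.0)
            for p, b in zip(row, box_blur(row, radius))]

print(retinex_1d([100] * 5))                   # flat field -> all zeros
print(retinex_1d([10, 10, 10, 200, 200, 200])) # edge -> nonzero response
```

A uniform field maps to zero everywhere regardless of its absolute level, which is the "relative behavior across multiple pixels" point made above.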

  • Members 1737 posts
    April 4, 2023, 12:59 a.m.

    When I was doing color science, I spent too much time looking at Retinex and not finding any way for it to improve image reproduction.

  • Members 16 posts
    April 4, 2023, 4:05 a.m.

    You're in great company on that; Land spent something like two decades on this.

    I personally spent a while doing things related to retinex too. Never improved upon it, but did get some spin-offs. One was a novel method for reconstructing full color stereo pairs from a single anaglyph image: Reprocessing anaglyph images. The other involved some hacks for obtaining multispectral data with more bands than the sensor had filters. I first did this using a Canon PowerShot G1 (CMYG CFA) in 2001, and by 2005 I could sometimes extract up to 8 bands from a Sony F828 (RGBE CFA with NIR filter disengaged), but the method was very sensitive to noise. Anyway, we used a similar method in Multispectral, high dynamic range, time domain continuous imaging. Maybe one day one of us will figure out how color really works...? Until then, I see all this as a very deep and ponderous rabbit hole with "color science" being the term slapped on whatever magical approximation currently works best. ;-)

  • Members 97 posts
    April 10, 2023, 8:14 a.m.

A most interesting piece about colour perception, but to my mind the original question about SINGLE-pixel sharpness seems nonsensical.

A pixel on the sensor is a tiny "light meter" preceded by a coloured filter, and its physical size is determined in the fabrication process. So how the "light meter" is calibrated cannot possibly deform the chip or make it squishy. The perception of the aggregate of pixels is another matter.

    p.