• Members 2331 posts
    April 19, 2024, 3:03 a.m.

    pretty safe bet if you ask me πŸ€”

    2024-03-04-03.04.22 ZS PMax copyfs (2024_03_04 06_39_11 UTC).jpg

    JPG, 8.4Β MB, uploaded by DonaldB on April 19, 2024.

  • Members 4254 posts
    April 19, 2024, 3:12 a.m.

    I was referring to your opinions about Jim K.

    Just because you might disagree with him doesn't make what he says wrong.

    You were justifiably sandboxed from what I saw.

  • April 19, 2024, 11:57 a.m.

    You have to be pretty smart to find Jim K to be wrong - and be right. The converse also applies.

  • Members 4254 posts
    April 19, 2024, 12:02 p.m.

    That's why I would back Jim K against the likes of DonaldB, especially when DonaldB makes unproven claims that are, at best, simply his opinions.

  • Members 2331 posts
    April 19, 2024, 1:36 p.m.

    i don't take images of other people's art/work, i create my own content.

  • Members 273 posts
    April 19, 2024, 1:38 p.m.

    Then you should shoot with a single-pixel sensor.

  • Members 273 posts
    April 19, 2024, 1:38 p.m.

    Are you dumb, or just obnoxious?

  • Members 2331 posts
    April 19, 2024, 1:40 p.m.

    you should shoot with a phone 😁 it might even power your home, they have 200 solar panels, 2x the gfx2.

  • Members 542 posts
    April 19, 2024, 2:37 p.m.

    This has no application, whatsoever, to the issue of subdividing a given sensing area into photosites of varying density.

  • Members 542 posts
    April 19, 2024, 4:05 p.m.

    The greyscale? If it is the greyscale you are talking about, that is a matter of glare on the card and conversion levels/contrast settings, and has nothing to do with color filters, noise, or anything that can vary with pixel size, when the exposure is that high. With less total light per patch and more noise, the darker patches may all look similar or get lost, no matter the contrast, gamma, or lightness.

  • Members 542 posts
    April 19, 2024, 4:12 p.m.

    Talk about all your eggs in one basket!

  • Members 2331 posts
    April 19, 2024, 9:21 p.m.

    all the test images were taken with controlled lighting by DPR. i pushed the sample images to expose the darker patches to see which sensor had better tone separation, which is due to more accurate photon charge being recorded, and we can clearly see the larger pixel is by far better. i just processed another sample from DPR's sample images. shadow recovery from the 3 MP sensor is amazing and i'm pushing the jpegs. i always said my FF jpegs can easily outperform any m43 raw files.

  • Members 273 posts
    April 19, 2024, 10:21 p.m.

    Do you even realize how crazy this argument is, from a mathematical point of view? Essentially what you're saying is that stairs are a better approximation of the side of mountain than a curve is.

  • Members 676 posts
    April 20, 2024, 12:11 a.m.

    That should be the new motto for the GOP. πŸ˜‚

  • Members 2331 posts
    April 20, 2024, 1:56 a.m.

    plot the "stair" in the images and post the graph from several cameras of any era. i've measured some of the patches via the Ps eyedropper tool and some go the wrong way 🤔 just post your conclusions via images. i'm sick of armchair mathematicians, they are about as accurate as meteorologists 😁

  • Members 878 posts
    April 20, 2024, 2:07 a.m.

    Joe's uncle was eaten by cannibals.

  • Members 676 posts
    April 20, 2024, 2:30 a.m.

    How the ++++ did you know that?! πŸ˜πŸ˜‚πŸ˜†

  • Members 542 posts
    April 20, 2024, 5:47 p.m.

    What are you plotting? Raw values? sRGB values? Of pixels? Of averages for entire patches?

    The average or mean values in the raw data will separate the patches quite well, even under extreme under-exposure, unless the read noise starts falling below about 0.8 DN, at which point the dither of noise will fail to prevent mean shifts in levels as over-quantization shows its artifacts. There are actually very few cameras that do that; it is most common among the last few models that used Sony Exmor 12-bit sensors a decade or more ago, and only at base ISO, as 2x base usually brings those read noise DNs into the safe zone. Some had black frame noise in the 0.4 DN to 0.6 DN range at base, and they cannot record even mean patch values accurately. At 2x base, this might be 0.7 to 1.1 DN, which, while not perfect, is much less problematic than base ISO.
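The dither effect described above is easy to demonstrate with a short simulation (a sketch with assumed levels and noise figures, not data from any camera): with enough read noise, the ADC's rounding averages out and recorded patch means track the true levels; with too little noise, the means snap toward whole DN and a sub-DN step between patches gets distorted.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantized_mean(level_dn, read_noise_dn, n=200_000):
    """Mean recorded value for a flat patch: true level plus Gaussian
    read noise, rounded to whole DN by the ADC. (Assumes a black-level
    offset preserves negative excursions, as most raw formats do.)"""
    return np.round(level_dn + rng.normal(0.0, read_noise_dn, n)).mean()

# Two dark patches 0.4 DN apart. With ~1.3 DN of read noise, the noise
# dithers the quantizer and the recorded means track the true levels;
# with ~0.15 DN, the means snap toward whole DN and the step distorts.
for noise in (1.3, 0.15):
    m1, m2 = quantized_mean(5.2, noise), quantized_mean(5.6, noise)
    print(f"read noise {noise:>4} DN -> patch means {m1:.2f}, {m2:.2f}")
```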

    With the presentation you gave, and the levels of exposure of even the darkest patches, nothing but conversion style and/or levels editing is going to show any loss of definition between the first and second darkest patches, or the first and second brightest. They're way out of the range where the sensor and pixels can conflate neighboring patches.

    Let me show you how well small pixels can distinguish extremely low-exposure mean patch levels, if the read noise is above 1.3 DN. This is the entire frame of a 12-bit, 1.86-micron compact camera, the Canon G9 at ISO 80, under-exposed by about 12+ stops (pushed to about ISO 300,000), where the brightest, clipped part of the image near the top, above the transparency, is one ADU above the black level, as I clipped away almost 4000 levels. Obviously, there is a lot of image-level noise here, but even the two darkest slices in the transparency are almost distinguishable, as are any of the brighter slices, all of which are recorded STOPS BELOW the bottom of the DR:

    sub1ADUG9ISO80.jpg

    So, why would there be confusion of levels with much larger pixels, and much more exposure?

    JPG, 69.0Β KB, uploaded by JohnSheehyRev on April 20, 2024.
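The claim that patch means survive stops below the nominal DR can be sketched numerically (hypothetical step-wedge levels and noise, not the G9's measured figures): averaging many noisy, quantized pixels recovers sub-DN differences between patch means.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical step wedge (assumed numbers, not measured G9 data):
# patch means all within about 1 DN of black, i.e. "below" the DR.
true_levels = np.array([0.2, 0.4, 0.7, 1.1])
read_noise_dn = 1.4            # assumed; above the ~1.3 DN threshold cited
pixels_per_patch = 500_000     # small pixels -> many samples per patch

# Simulated raw values: level + read noise, rounded to whole DN by the
# ADC. (Assumes a black-level offset preserves negative excursions.)
recorded = [np.round(lvl + rng.normal(0, read_noise_dn, pixels_per_patch)).mean()
            for lvl in true_levels]
print(" ".join(f"{m:.2f}" for m in recorded))
```

Despite per-pixel noise larger than any of the levels themselves, the patch means come out in the right order and close to the true values, which is the point about averaging over many small pixels.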