• Members 2104 posts
    Feb. 26, 2024, 7:03 a.m.

    Today I watched a YouTube video (Matt Granger) and finally made a decision not to pursue the Sony a9 III. As you all know, I'm an extreme macro shooter, shooting live bugs at 10x at 10 frames per second with specialised gear, which no one else has done successfully/consistently yet. Plenty are trying, but none as advanced as my shooting/style is at the moment.

    Anyway, I thought the a9 III would have been a great choice: shooting at 120 frames per second, a 200-image stack in 2 seconds would have been amazing. But... you can't shoot at 120 frames per second with microscope objectives, only 15. Next I thought the big pixels with lower noise would be good. Wrong: more noise than my a7 IV, which has fewer pixels. I shoot X.Fine JPEGs, as I'm stacking up to 400 frames at a time with no buffer; the a9 III can only shoot 105 X.Fine JPEGs and takes 11 seconds to clear the buffer. That's slower than my a7 IV, which can shoot 130 images in 10 seconds with no buffer.

    You might say, well, why not the A1? Well, its 50 MP means a smaller pixel pitch, so diffraction is present, whereas the a7 IV's 5.0 µm pixels are the perfect size for 10x microscope objectives, and the A1 again can only shoot 15 frames per second with an objective lens. So it looks like I have the best equipment at the moment for what I shoot. I have a new Nikon 10x objective with a 16.5 mm working distance arriving tomorrow, and I'm hoping it's as good as, if not better than, my Olympus 10x objectives. I will keep on dreaming. 😎

    One more thing: if I were shooting at 120 fps, would the max shutter speed of 1/40 sec (a lighting restriction) have been too slow and shown motion blur at 10x, moving the camera 2 mm in 1-2 seconds? Do we have any math experts? 🤔🤨
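    A quick back-of-envelope check of the motion-blur question, assuming (from the post) 10x magnification, a 1/40 s shutter, the 2 mm of travel spread over the full 2 s burst, and the a7 IV's ~5.0 µm pixel pitch:

    ```python
    # Worst-case motion-blur estimate for a 120 fps stacking burst.
    # All numbers are the post's own assumptions, not measured values.
    rail_speed_mm_s = 2.0 / 2.0            # 2 mm of travel over ~2 s -> 1 mm/s
    shutter_s = 1.0 / 40.0                 # lighting-limited shutter speed
    travel_during_exposure_um = rail_speed_mm_s * shutter_s * 1000

    # If that travel were lateral, the smear at the sensor is magnified 10x:
    sensor_smear_um = travel_during_exposure_um * 10
    pixels_of_blur = sensor_smear_um / 5.0  # ~5.0 um pixel pitch

    print(travel_during_exposure_um)  # 25.0 um of travel per exposure
    print(sensor_smear_um)            # 250.0 um at the sensor plane
    print(pixels_of_blur)             # 50.0 pixels
    ```

    So a purely lateral 1 mm/s motion would smear ~50 pixels per frame, which would be hopeless. But since the 2 mm of travel is along the optical axis (the focus rail), it shows up within each frame as a small focus shift rather than lateral blur, so the practical question is how that 25 µm of per-exposure travel compares with the objective's depth of field.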

    2024-02-20-08.04.35 ZS PMax copy.jpg

    2024-02-20-08.04.35 ZS PMax copy.jpg: JPG, 18.4 MB, uploaded by DonaldB on Feb. 26, 2024.

  • Members 474 posts
    March 1, 2024, 7:33 p.m.

    Don't hold back, Don -- just say you're the macro God everyone bows down to. 😁

    Pixel size/count has very little to do with the noisiness of a photo, except, perhaps, when the light gets very, very, very low.

    Why do bugs tat their eyes? 😆

  • March 1, 2024, 8 p.m.

    That's camouflage 😁

  • Members 2104 posts
    March 1, 2024, 8:37 p.m.

    That's not true either, GB. Pixel size has everything to do with extreme macro. The reason is that the stacking programs stack noise, and you can't get away from it. The cleaner the image straight from the camera, the better the results.

  • Members 474 posts
    March 2, 2024, 5:37 a.m.

    But more smaller pixels aren't more noisy than fewer larger pixels (unless the light is very, very, very low).

  • Members 2104 posts
    March 2, 2024, 6:59 a.m.

    Images are not made up of white paper 🤔😜. Extreme macro shooting shows up all the faults of smaller pixels. 😎 Haven't you seen people complaining about noise in skies at base ISO on certain cameras?
    Check this image out. I've never seen anyone shoot the eye cells with this high-gloss detail before.

    2024-02-06-05.13.06 ZS PMax copy web (2024_02_06 08_09_22 UTC).jpg

    Not many, if any, can shoot a 300-image stack of a live subject at these magnifications 😁

    2024-02-13-03.24.33 ZS PMax (2024_02_13 06_52_21 UTC).jpg

    2024-02-13-03.24.33 ZS PMax (2024_02_13 06_52_21 UTC).jpg: JPG, 21.3 MB, uploaded by DonaldB on March 2, 2024.

    2024-02-06-05.13.06 ZS PMax copy web (2024_02_06 08_09_22 UTC).jpg: JPG, 14.0 MB, uploaded by DonaldB on March 2, 2024.

  • Members 511 posts
    March 2, 2024, 5:16 p.m.

    Only to an automaton who views everything at 100% and judges by visceral reaction to what they see, regardless of how many pixels actually make up the desired composition.

    This is what a horizontal white fiber, a little out of focus, actually would look like in an analog capture, with enough magnification, where every registered photon was colored according to its wavelength. The more you pixelate it (simulating larger pixels), the whiter and smoother it gets, but the more it lies about what is really there, as it loses information:

    whitelinepoiss.GIF

    That is reality. It is not damage done by a camera, other than the lack of 100% quantum efficiency; it is what light becomes, when captured. You may not feel the need to express this level of detail, but this is always a better source image to start with than one that is already pixelated to be whiter and smoother, especially in an image with lots of fine detail. Pixelating to a pixel-sharp, smoother-variation version in-camera and then resampling yet again can cause extra aliasing, which you wouldn't get going from full pixel resolution to the target display resolution.
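    The trade-off above can be simulated in a few lines: Poisson photon counts at a fine pixel pitch are noisy per pixel, and summing groups of them (simulating larger pixels) makes each pixel smoother while discarding any structure finer than the coarse pitch. The photon counts here are hypothetical, just to illustrate the statistics:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # A flat patch receiving ~10 photons per fine pixel, with shot noise.
    fine = rng.poisson(10.0, size=100_000)

    # Simulate 4x-larger pixels by summing groups of 4 fine pixels.
    binned = fine.reshape(-1, 4).sum(axis=1)

    # Relative noise drops from ~1/sqrt(10) to ~1/sqrt(40) per pixel,
    # but detail narrower than the coarse pitch is irretrievably gone.
    print(fine.std() / fine.mean())      # roughly 0.32 (1/sqrt(10))
    print(binned.std() / binned.mean())  # roughly 0.16 (1/sqrt(40))
    ```

    The binned image looks "cleaner" at 100%, yet it carries the same photons; you can always produce the smooth version from the fine capture in post, but not the other way around.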

    whitelinepoiss.GIF: GIF, 14.9 KB, uploaded by JohnSheehyRev on March 2, 2024.

  • Members 2104 posts
    March 3, 2024, 12:50 a.m.

    a6300: 24 MP APS-C; a7 IV in APS-C crop: 14 MP.
    This is the stress test: flick between the two images full screen and see the diffraction softness between different-size pixels. I had a dead fly lying around; I've done this test before. So you can imagine that as soon as I shoot the a7 IV on the full sensor it blows APS-C away, not even close.

    These are straight from Zerene, NO PP.

    a74 fly.jpg

    a6300_2.jpg

    Screenshot 2024-03-03 103930.jpg

    Screenshot 2024-03-03 104146.jpg

    a6300_2.jpg: JPG, 16.4 MB, uploaded by DonaldB on March 3, 2024.

    Screenshot 2024-03-03 104146.jpg: JPG, 1.5 MB, uploaded by DonaldB on March 3, 2024.

    Screenshot 2024-03-03 103930.jpg: JPG, 1.6 MB, uploaded by DonaldB on March 3, 2024.

    a74 fly.jpg: JPG, 10.7 MB, uploaded by DonaldB on March 3, 2024.

  • Members 2104 posts
    March 3, 2024, 1:06 a.m.

    Applied USM to bring the a6300 up to the a7 IV, but as you can see the a6300 has lost a bit of detail.

    a74 a6300 usm a6300.jpg

    a74 a6300 usm a6300.jpg: JPG, 1.9 MB, uploaded by DonaldB on March 3, 2024.

  • Removed user
    March 3, 2024, 1:43 a.m.

    Please at least be consistent, Donald. Either mis-spell it "a7iv" or mis-spell it "a74" ... ๐Ÿ˜‰

    My experience with diffraction is that the bigger the pixel pitch, the less sensitive the sensor is to diffraction, irrespective of sensor size.

    So it is that on my camera with a 9.12 µm pixel pitch, diffraction only "sets in" at f/16, whereas on another with a 3.9 µm pixel pitch it "sets in" at maybe f/7 or f/8, even with its smaller sensor ...
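    Those two onset points are roughly what you get from a common rule of thumb: call diffraction visible once the Airy-disk diameter, 2.44 × wavelength × f-number, exceeds about two pixel widths (assuming ~0.55 µm green light; the two-pixel threshold is itself a convention, not a hard limit):

    ```python
    # Estimate the f-number at which the Airy disk spans ~2 pixels.
    # 2.44 * wavelength * N = 2 * pitch  =>  N = 2 * pitch / (2.44 * wavelength)
    def onset_f_number(pixel_pitch_um, wavelength_um=0.55):
        return 2 * pixel_pitch_um / (2.44 * wavelength_um)

    print(round(onset_f_number(9.12), 1))  # 13.6 -> consistent with "f/16"
    print(round(onset_f_number(3.9), 1))   # 5.8  -> consistent with "f/7 or f/8"
    ```

    The pixel pitch is the only sensor property in the formula, which matches the observation that the behaviour is irrespective of sensor size.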

  • Members 2104 posts
    March 3, 2024, 2:13 a.m.

    Exactly. My microscope objectives are nominally f/4 but apparently equivalent to f/16 on a FF sensor.

  • Removed user
    March 3, 2024, 5:02 a.m.

    Sorry, I thought that microscope objectives are classified by Numerical Aperture (NA), so I'm not sure how your "f4" relates to "f16 FF sensor" ...

    .. not asking for help with that, maybe someone else knows?

  • Members 474 posts
    March 3, 2024, 6:49 a.m.

    This is very interesting! So, assuming the same number of stacks with the same lens and same settings (which you said was the case) and also assuming both are equally well focused, then I would argue that the difference comes not from the pixel size, but from differences in the AA filter and/or height of the sensor stack.

  • Members 206 posts
    March 3, 2024, 7:02 a.m.

    NA = 1 / (2 * f#)
    Thus f/4 → NA = 0.125.
    He might assume 3x magnification, i.e.
    effective aperture = f# * (1 + magnification)
    effective aperture = 4 * (1 + 3) = 16.
    But it still remains irrespective of any sensor size.
    Then again, he might have pulled the numbers out of thin air.
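    The relations above, as a quick sketch (the 3x magnification is this poster's guess, not a value stated in the original post):

    ```python
    # Small-angle approximation linking NA and f-number, plus the usual
    # bellows-factor formula for effective aperture at magnification m.
    def na_from_f_number(f_number):
        return 1.0 / (2.0 * f_number)

    def effective_f_number(f_number, magnification):
        return f_number * (1.0 + magnification)

    print(na_from_f_number(4))        # 0.125
    print(effective_f_number(4, 3))   # 16.0 -> the "f/16 equivalent"
    ```

    Note that at the 10x magnification mentioned earlier in the thread, the same formula would give 4 * (1 + 10) = f/44, which is why the 3x assumption looks arbitrary.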

  • Members 206 posts
    March 3, 2024, 7:06 a.m.

    A6300 does have an OLPF / AA filter.
    A7.4 doesn't.

  • Members 204 posts
    March 3, 2024, 7:22 a.m.

    One also has to look at what the stacking software does: any difference in contrast will show up as a greater amount of detail coming out of the software, since it detects contrast variations. You would really need the same visual contrast between the two images; an image that has more contrast via processing has the potential to show more detail in the stacked output, and that is not a function of the sensor itself.

    There is a lot done under the hood when creating a stacked image. Next, do both cameras produce the same amount of sharpening for the same setting? Has the same sharpening been applied? This can play a very large role in how the stacking software determines what to stack.

    How the image is generated before stacking can dramatically change how much detail is created, and this in itself has to be considered.

  • Members 2104 posts
    March 3, 2024, 7:51 a.m.

    My m4/3 cameras didn't have AA filters, but we will never know, as we can't take them off.

  • Members 2104 posts
    March 3, 2024, 7:53 a.m.

    Haven't done that experiment yet; will give it a go tomorrow. You might have given me a good idea for how I can take my images to the next step. Thanks mate.
    That might help with the problems another poster and I are having using LWD objectives.