Sometimes there are no other options, because the manufacturer does not make the body you want, with all the other bells and whistles, at a lower megapixel count within your price range.
[quote="@davidwien"]
If you have a 40 MP sensor, you need lenses that are good enough to take advantage of it. Otherwise why have it?

David
[/quote]
Good point. What lenses are "good enough to take advantage of [40 MP]"?
I have a Panasonic m4/3 DC-G9, which can be set to 80 MP in pixel-shift mode. But the MTF curve for pixel-shift captures is pretty poor, and the MTF50 point occurs at a low spatial frequency relative to Nyquist. Which means that I don't need an Otus 55mm for that camera, or for any of mine, for that matter.
On the other hand, I can stick almost any old lens on the SD9, with its 9.12 µm pixel spacing ...
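To put rough numbers on "relative to Nyquist", here's a quick Python sketch of the sensor Nyquist limit implied by pixel pitch. The ~3.3 µm figure for the G9 is my assumption for a 20 MP m4/3 sensor; the 9.12 µm figure is from the post above.

```python
# Back-of-envelope: sensor Nyquist limit from pixel pitch.
# One cycle (line pair) needs at least two pixels.
def nyquist_lp_per_mm(pixel_pitch_um: float) -> float:
    """Nyquist limit in line pairs per mm for a given pixel pitch."""
    return 1000.0 / (2.0 * pixel_pitch_um)

# ~3.3 um is an assumed pitch for a 20 MP m4/3 sensor like the G9;
# 9.12 um is the SD9 figure quoted above.
for label, pitch_um in [("G9 (assumed ~3.3 um)", 3.3),
                        ("SD9 (9.12 um)", 9.12)]:
    print(f"{label}: Nyquist ~ {nyquist_lp_per_mm(pitch_um):.0f} lp/mm")
```

On those assumptions the G9 grid demands detail out to ~150 lp/mm, while the SD9 tops out around 55 lp/mm, which is why almost any old lens will do for the latter.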
LOL. With respect, that is a very vague statement! "On axis" says nothing about the majority of the area of a photo, "optimal aperture" is also rather restrictive, and "most decent lenses" needs definition: how many is "most", and what is the criterion for "decent"?
I think it more appropriate to compare the results of using a given lens with 40 MP and 20 MP sensors of "agreed" quality.
Another audio analogy, though inverted: what is the point of spending thousands of £/$/¥/€ on microphones, amplifiers, etc., if the result is to be played on a boom box? One expects to match the quality of a lens to the recording abilities of the camera: the glass bottoms of Coca-Cola bottles are not appropriate as lenses for Alan's camera.
Unfortunately, reviews seem not to make recommendations of suitable lens/camera combinations.
This is part of a photo taken with an a7R IV, which has 3.76 µm pixels. The aperture was between f/8 and f/11, so diffraction was in play too.
What I see in the picture is aliasing, if I understand how aliasing looks in pictures.
I'll bet aliasing would also be visible in a picture taken with a sensor with a 3.1 µm pixel pitch.
That means the lens outresolves the sensor?
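A back-of-envelope check on the diffraction point, in Python: the incoherent diffraction cutoff is 1/(λN) cycles/mm, and if that sits above the sensor's Nyquist limit, diffraction alone won't suppress aliasing. The 550 nm wavelength is my assumption for green light; the pitch and f-numbers are from the post above.

```python
# Incoherent diffraction cutoff vs sensor Nyquist for the example above.
WAVELENGTH_MM = 550e-6  # assumed green light, 550 nm, expressed in mm

def diffraction_cutoff_lp_per_mm(f_number: float) -> float:
    """Frequency (cycles/mm) at which a diffraction-limited MTF hits zero."""
    return 1.0 / (WAVELENGTH_MM * f_number)

def nyquist_lp_per_mm(pixel_pitch_um: float) -> float:
    return 1000.0 / (2.0 * pixel_pitch_um)

nyq = nyquist_lp_per_mm(3.76)  # pixel pitch from the post above
for n in (8, 11):
    cut = diffraction_cutoff_lp_per_mm(n)
    print(f"f/{n}: cutoff ~ {cut:.0f} lp/mm vs sensor Nyquist ~ {nyq:.0f} lp/mm")
```

Both cutoffs (~227 lp/mm at f/8, ~165 lp/mm at f/11) come out above the ~133 lp/mm Nyquist of a 3.76 µm pitch, which is consistent with aliasing still being visible at those apertures if the lens is sharp enough.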
The reduction in aliasing in real-world photography when switching from the 50 MP to the 100 MP 33x44mm sensor was something I found important in my photography. But I still see aliasing with the GFX 100x and the Hasselblad X2D.
It is easier to produce aliasing in a lab, but you don't need a lab to produce aliasing. As long as you have sufficient optical sharpness and camera/subject stability relative to pixel density, aliasing is always possible. What's so annoying about aliasing is that for any given sensor, the sharper the lens that you use and the more stable things are, which should be good things, the more likely you are to get visible aliasing.
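The sampling-theory version of that point fits in a few lines of Python (NumPy assumed; the frequencies are arbitrary illustrative choices): detail above Nyquist doesn't disappear, it comes back as a lower, false frequency.

```python
import numpy as np

# Sample a pattern finer than Nyquist (0.5 cycles/pixel) on a pixel grid:
# the detail doesn't vanish, it reappears at a lower, false frequency.
n = np.arange(64)                    # pixel indices
f_detail = 0.7                       # cycles/pixel, above Nyquist
sampled = np.cos(2 * np.pi * f_detail * n)

# Indistinguishable from a cosine at the alias frequency 1 - 0.7 = 0.3:
alias = np.cos(2 * np.pi * (1.0 - f_detail) * n)
print("max difference:", np.abs(sampled - alias).max())   # ~1e-13
```

The sharper the lens and the steadier the camera, the more energy survives above Nyquist, and the more of this false detail lands in the capture.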
Every Bayer capture that is moderately pixel-sharp has aliasing in the red and blue channels if the subject matter has aliasing potential; the raws are generally converted to "play dumb" with pixel-level color detail and hide the wild chromatic artifacts of color aliasing. This is true even of sensors with AA filters, because the point spread of the AA filter is small relative to the pixel spacing in the red and blue channels. If it were big enough to greatly reduce aliasing in the red and blue channels, individual points of light could get ghosted into 4 distinct visible points in the implied luminance channel.
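For a concrete sense of why red and blue alias first: on a Bayer grid each of those channels is sampled at every second pixel in each direction, which doubles the effective pitch and halves that channel's Nyquist. A minimal sketch, reusing the 3.76 µm pitch quoted earlier:

```python
# Bayer geometry: red and blue are each sampled at every second pixel
# in each direction, so their effective pitch doubles and their Nyquist
# halves relative to the full pixel grid.
def nyquist_lp_per_mm(pixel_pitch_um: float) -> float:
    return 1000.0 / (2.0 * pixel_pitch_um)

pitch_um = 3.76   # full-grid pitch, reusing the figure quoted above
print("full grid Nyquist ~", round(nyquist_lp_per_mm(pitch_um)), "lp/mm")
print("red/blue  Nyquist ~", round(nyquist_lp_per_mm(2 * pitch_um)), "lp/mm")
```

So detail between roughly 66 and 133 lp/mm on that sensor can be resolved in the implied luminance channel while still aliasing in red and blue, which is exactly the band the demosaicker has to "play dumb" about.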
I think it is more productive to think in terms of the frequency of artifacts for any given usage and equipment, rather than "real world" vs "lab". How much can a lab improve over real world for stability, using a relatively wide lens and f/4 in sunlight at 1/1000? Why talk "real world vs lab" when one can easily tell the full story of "hand-held after too much caffeine vs tripod"?
I see no aliasing here.
I would expect aliasing to show up most clearly on a photo of a city scene, with plenty of slanted straight edges on buildings.