I don't see why not. In the end, seeing more real detail is what actually matters.
In 25 years of routinely shooting multi-shot images, I can recall many instances of a client asking to see the differences in the images produced by the various single-shot and multi-shot options. I can't ever recall one asking me to show them a graph. That's understandable because they don't pay to have charts or measurements produced. They are paying for images to be produced.
Differences in image qualities like apparent sharpness, acuity, clarity, resolution, detail, etc. can be described in a variety of ways. There are measurable differences and visible differences. Some differences are measurable but not visible. Some differences are barely discernible visually and others are readily apparent, with a continuously variable range between the extremes. Visual inspection and evaluation of those differences are factored into whether the additional time, effort, and expense of multi-shot images are approved and budgeted or not. Sometimes they are, and sometimes good enough is good enough.
Multi-shot image capture itself is subject to variables which affect the resulting image quality and, as a result, any comparative differences seen. At the end of the day, it's the final images produced and how they're perceived that matter most.
A photograph alone doesn't prove I reproduced the colours on a painting accurately enough.
Passing FADGI, Metamorfoze, Golden Thread, or Delt.ae tests is the proof.
Of course, those are objective measurements of photographs taken of test charts and used for quality assurance and control. Color was not what I was addressing and yes, there are resolution components to those quality control standards.
My comments were addressed to the question of whether one could "define higher resolution as seeing more real detail" when comparing "pixel shifted images". Within that context, what my customers have always wanted to compare across single-shot, 4-shot, 6-shot, and 16-shot image captures is what they themselves can see in the image detail gained by each method. Others may have had different experiences.
MTF and deltaE came about because evaluating colour by eye, and distinguishing real detail from artifacts, noise, and grain by eye, isn't a reliable or repeatable method, for many reasons. That is to say, I can see "why not". At some point your customers may also want to rely on standardized metrics.
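To make that concrete, here is a minimal sketch of the simplest standardized colour-difference metric, CIE76 Delta E*ab. Note this is only the simplest variant; modern QA guidelines typically specify more elaborate formulas such as CIEDE2000, and the patch values below are made up purely for illustration.

```python
# Minimal sketch of the CIE76 colour-difference metric (Delta E*ab):
# the Euclidean distance between two colours in CIELAB space.
import math

def delta_e_76(lab1, lab2):
    """CIE76 Delta E*ab between two (L*, a*, b*) colours."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# A chart patch's reference value vs. the value measured in a reproduction
# (illustrative numbers only):
reference = (52.0, -23.0, 18.0)   # (L*, a*, b*)
measured  = (53.1, -21.5, 17.2)
print(round(delta_e_76(reference, measured), 2))  # ~2.02
```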
But no matter, we both stated our positions.
My customers and I do utilize and rely on standardized metrics as part of evaluating images and for quality control. But never solely; there is also a role for visual evaluation and comparison.
Standardized quality assurance methods for calibration, control, and evaluation are valuable. Visual evaluation and comparison are also useful. They are not mutually exclusive, and both are applied frequently within any production environment in which I've participated.
In some instances, a combination of measurement and visual evaluation is used to improve productivity and accuracy in a workflow; for example, when visually fine-tuning the parameters used in monitor calibration for soft proofing to obtain better screen-to-print matching for specific paper and printer combinations. A good general overview of that is demonstrated in 5 minutes here.
Note: For the full text and specific context of the quotes from me shown above click here or here.
PPI is used mainly for setting the physical print size for an image file.
For example: say you have a 3000px x 2000px image and you want to make a 12in x 8in print. You set the PPI to 250, since 3000/12 = 250 and 2000/8 = 250.
Or in your print dialogue box you could just set the paper size to 12 x 8 and the printing app calculates the PPI, 250 in this case, for you.
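If it helps, the arithmetic is trivial. Here is a minimal sketch in Python; the function names are just mine for illustration:

```python
# Minimal sketch of the PPI arithmetic above.

def ppi_for_print(pixels: int, inches: float) -> float:
    """PPI needed to print a given pixel dimension at a given physical size."""
    return pixels / inches

def print_size_at_ppi(pixels: int, ppi: float) -> float:
    """Physical size (inches) a pixel dimension yields at a given PPI."""
    return pixels / ppi

# 3000 x 2000 px image printed at 12 x 8 in:
print(ppi_for_print(3000, 12))   # 250.0
print(ppi_for_print(2000, 8))    # 250.0

# The same image at 300 PPI would instead print at:
print(print_size_at_ppi(3000, 300))  # 10.0 in
print(print_size_at_ppi(2000, 300))  # ~6.67 in
```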
When you submit the print, the image file and the PPI value you set for the print size you want are both sent to the printer. It then does its stuff (a discussion for another thread) to convert the image data to DPI (dots of ink per inch) and output the image at the print size you wanted. I normally print at 1440 DPI.
With the exception of a few RIPs, the interpolation done in the printer driver is crude. You should send the file to the driver with the pixel pitch that the halftoner in the driver uses, not let the driver do the resampling.
Yes, that is true, but for the purposes of answering Alan's question I thought it would have overcomplicated the explanation of PPI he was asking about.
Fwiw, yes, I send images to my Epson printer at 360 PPI for the paper size I want, since 360 PPI is my printer's native print resolution.
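For anyone who wants to do that resampling themselves rather than leave it to the driver, here is a minimal sketch using Pillow. It assumes the 360 PPI native resolution and 12 x 8 in print size from this thread (Epson's native resolutions vary by model), and the file names are placeholders:

```python
# Minimal sketch: resample an image to the printer's native PPI
# before printing, rather than letting the driver interpolate.
from PIL import Image

NATIVE_PPI = 360                 # assumed native resolution
PRINT_W_IN, PRINT_H_IN = 12, 8   # target print size in inches

img = Image.open("photo.tif")    # placeholder file name

# Target pixel dimensions at the printer's native resolution.
target = (PRINT_W_IN * NATIVE_PPI, PRINT_H_IN * NATIVE_PPI)  # (4320, 2880)

# High-quality Lanczos resampling instead of the driver's interpolation.
resized = img.resize(target, resample=Image.Resampling.LANCZOS)

# Embed the PPI so the print dialog reports the intended 12 x 8 in size.
resized.save("photo_360ppi.tif", dpi=(NATIVE_PPI, NATIVE_PPI))
```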
Here I need to disagree to some extent, insofar as I think describing it as "crude" is a bit harsh.
I am not a scientist or an engineer, but I accept that, looking at prints under a microscope, you might very well be able to see a difference in print quality if the printer driver does the interpolation to the printer's native print resolution.
But for all intents and purposes, I think most "mum and dad" photographers will not be able to see any difference in print quality in their prints, viewed under their normal viewing conditions, if the printer driver does the interpolation.