Ok, I'll try to break the ice on this forum and start a thread.
This thread is not really a question but a description of my understanding of what histograms are and what they represent.
Ok, here goes:
It should be helpful to most people to understand what the histogram actually is and what it is telling them.
The two main histograms I use are:
- RGB histogram - nothing more than a frequency count of the 0-255 values across every channel of every pixel in the image. On its own, it tells you nothing at all about exactly where in the image any data point on the histogram came from.
- Luminosity histogram - a frequency count of the brightness value of each pixel, calculated from that pixel's RGB values.
Personally I prefer to use the luminosity histogram to set white and black points for an image.
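To make the difference concrete, here is a rough sketch in Python (assuming NumPy and Pillow are available; "photo.jpg" is just a placeholder filename, and the 0.299/0.587/0.114 weights are the common Rec. 601 ones - your editor's exact luminosity formula may differ):

```python
import numpy as np
from PIL import Image

# Load any image as an HxWx3 array of 0-255 values ("photo.jpg" is a placeholder).
img = np.asarray(Image.open("photo.jpg").convert("RGB"))

# RGB histogram: one frequency count over every channel value of every pixel.
rgb_hist = np.bincount(img.ravel(), minlength=256)

# Luminosity histogram: first collapse each pixel's R,G,B to a single brightness
# value (Rec. 601 weights assumed here), then count those per-pixel values.
luma = 0.299 * img[..., 0] + 0.587 * img[..., 1] + 0.114 * img[..., 2]
luma_hist = np.bincount(np.clip(luma.round(), 0, 255).astype(np.uint8).ravel(),
                        minlength=256)

print(rgb_hist.sum())   # 3 x number of pixels (one entry per channel value)
print(luma_hist.sum())  # number of pixels (one entry per pixel)
```

The totals at the end show the basic difference: the RGB histogram counts three values per pixel, while the luminosity histogram counts one brightness value per pixel.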
Anyway, to demonstrate what the far more commonly used RGB histogram actually is, consider this simple image with pure red (255,0,0), green (0,255,0) and blue (0,0,255) pixels on a pure white (255,255,255) background.
Obviously the RGB data in the image will contain only the values 0 and 255.
Here is the RGB histogram for the above image.
You can see that the only data in the histogram are the two vertical bars at 0 and 255. The height of each bar represents the total number of 0 values and 255 values, respectively, in the RGB data.
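If you want to check those bar heights yourself, here is a quick sketch that builds a tiny stand-in for the example image (the exact pixel layout is made up - only the colour counts matter) and counts the channel values:

```python
import numpy as np

# A tiny 3x3 stand-in: one pure red, green and blue pixel on a white background.
img = np.full((3, 3, 3), 255, dtype=np.uint8)
img[0, 0] = (255, 0, 0)   # red
img[0, 1] = (0, 255, 0)   # green
img[0, 2] = (0, 0, 255)   # blue

hist = np.bincount(img.ravel(), minlength=256)
print(np.nonzero(hist)[0])   # [  0 255] -> the only two bars
print(hist[0], hist[255])    # 6 and 21 -> the bar heights
```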
The histogram on its own cannot tell you what the colors are in the actual image or where in the image (white, red, green, blue elements) any particular histogram data point came from.
If someone were given just the above histogram, without the image it came from, it would be totally wrong to conclude that the image contained only the colors black and white.
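To illustrate just how wrong that could be, here is a sketch of a deliberately constructed black-and-white image whose RGB histogram is identical to the red/green/blue-on-white one above (the pixel counts are chosen purely so the totals match):

```python
import numpy as np

# Same red/green/blue-on-white 3x3 image as above...
colour = np.full((3, 3, 3), 255, dtype=np.uint8)
colour[0, 0], colour[0, 1], colour[0, 2] = (255, 0, 0), (0, 255, 0), (0, 0, 255)

# ...and a plain black-and-white 3x3 image: 2 black pixels, 7 white pixels.
bw = np.full((3, 3, 3), 255, dtype=np.uint8)
bw[0, 0] = bw[0, 1] = (0, 0, 0)

h1 = np.bincount(colour.ravel(), minlength=256)
h2 = np.bincount(bw.ravel(), minlength=256)
print(np.array_equal(h1, h2))  # True: identical histograms, very different images
```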
From an RGB histogram you can judge whether the image is overall lightish, darkish or somewhere in between, and whether it possibly lacks contrast.
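One crude way to put numbers on "lightish/darkish" and "lacks contrast" is to look at the mean and spread of the per-pixel brightness; the thresholds below are arbitrary illustrations (and "photo.jpg" is again a placeholder), not standard values:

```python
import numpy as np
from PIL import Image

img = np.asarray(Image.open("photo.jpg").convert("RGB")).astype(float)
luma = 0.299 * img[..., 0] + 0.587 * img[..., 1] + 0.114 * img[..., 2]

mean, std = luma.mean(), luma.std()
# Thresholds are arbitrary examples, only meant to show the idea.
tone = "darkish" if mean < 85 else "lightish" if mean > 170 else "mid-toned"
contrast = "possibly low contrast" if std < 40 else "reasonable contrast"
print(f"mean={mean:.0f} ({tone}), std={std:.0f} ({contrast})")
```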