Often, the "noise present from the get-go" is actually shot noise. This is the noise inherent in the quantum nature of light. Even if you had a perfect camera that did not add any noise to the image, there would still be visible noise at very low exposures. That's just physics.
Getting back to my raindrop analogy, imagine a tray of shot glasses that you expose to the rain for a very short period. Perhaps the glasses average 5 raindrops each. Some may have 4, some 6. Perhaps a few will have 3 and a few 7. Even though the average is 5, there's significant variation between 3 and 7. Those +/- 2 drops represent a +/- 40% variation in the raindrops captured.
Now leave the glasses out for 100 times as long, so there's an average of 500 raindrops in each glass. The absolute spread actually grows, but only with the square root of the count: most glasses will come in within about +/- 22 drops of that average. With such a high exposure to the rain, that spread is only about +/- 4.5% of the raindrops captured. Hardly a significant deviation compared to the +/- 40% above.
Of course, this is a simplification, but it should get the idea across that at low exposures, the noise is in the light itself, and not merely a product of an imperfect camera.
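If you want to see the scaling for yourself, here's a minimal sketch of the shot-glass experiment in Python. It simply draws Poisson-distributed counts (the standard statistical model for shot noise) at the two exposure levels from the analogy; the means of 5 and 500 and the number of glasses are just illustrative values, not anything from a real camera.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
num_glasses = 100_000  # number of shot glasses (or pixels) to simulate

for mean_drops in (5, 500):
    # Raindrop/photon arrivals follow a Poisson distribution, whose
    # standard deviation equals the square root of the mean.
    counts = rng.poisson(lam=mean_drops, size=num_glasses)
    absolute_noise = counts.std()
    relative_noise = absolute_noise / counts.mean()
    print(f"mean {mean_drops:>3}: noise ~ +/- {absolute_noise:.1f} drops "
          f"({relative_noise:.1%} of the signal)")
```

Running it prints roughly +/- 2.2 drops (around 45%) for the short exposure and +/- 22 drops (around 4.5%) for the long one, which is why the longer exposure looks so much cleaner even though the absolute noise went up.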