Hour-by-hour GIF of integration time. Get more integration time; it's worth it.

The only difference between 1 hour and 1,000 hours should be the signal-to-noise ratio.

If you average the subexposures rather than just adding them, there's also an improvement in digitization levels: the more frames you average, the finer the digitization becomes. For a simple example, imagine a black-and-white exposure of a smooth gradient that spans the digitization levels '0' and '1'. Every point in the gradient gets recorded as either a '0' or a '1', so you end up with an image with only two brightness levels: half of it '0' and half of it '1'. Take a second exposure of the same gradient and you get the same thing again.

But if it's a 'real' gradient with some fluctuation in the photon count, as anything in the real sky has, some of the points near the '0.5' level will fluctuate above 0.5 in one exposure (recorded as a '1') and below 0.5 in the other (recorded as a '0'). Average the two together and those points land halfway between the '0' and '1' levels, creating a whole new digitization level. Your recorded gradient now has some '0.5' values in the middle, smoothing it out. The more images you take, the more fractional values become possible and the smoother the gradient gets. This works for pretty much anything with real-world photon statistics.
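If you want to see this for yourself, here's a minimal sketch (my own illustration, not from the linked GIF) using NumPy: a smooth ramp between ADU levels 0 and 1, per-exposure noise standing in for photon-count fluctuation, each exposure quantized to whole levels, then many exposures averaged. The frame count and noise level are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

true_signal = np.linspace(0.0, 1.0, 200)   # smooth gradient spanning levels 0 and 1
n_exposures = 64
noise_sigma = 0.2                          # stand-in for photon-count fluctuation

# Each exposure: add noise, clip to the sensor's range, quantize to whole ADU (0 or 1 here).
noisy = true_signal + rng.normal(0.0, noise_sigma, (n_exposures, true_signal.size))
exposures = np.rint(np.clip(noisy, 0.0, 1.0))

single = exposures[0]                      # one exposure: only the values 0 and 1 survive
stacked = exposures.mean(axis=0)           # average of many: fractional levels appear

print("distinct levels in one exposure:", np.unique(single).size)   # 2
print("distinct levels after stacking:", np.unique(stacked).size)   # many
print("RMS error, single exposure:", np.sqrt(np.mean((single - true_signal) ** 2)))
print("RMS error, stacked:       ", np.sqrt(np.mean((stacked - true_signal) ** 2)))
```

The single exposure only ever contains 0s and 1s, while the stack recovers a staircase of intermediate values that tracks the original gradient much more closely.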

In short, averaging images effectively increases the bit depth of your analog-to-digital converter, leading to less posterization and more room for histogram stretching in post-processing. How much of a practical difference it makes depends on the image and the processing steps. There's a nice article on it here.
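As a rough rule of thumb (the standard oversampling/dithering approximation, which assumes the frame-to-frame noise is at least on the order of one ADU so it actually dithers the signal), the noise averages down as 1/sqrt(N), so every 4x increase in stacked frames buys about one extra effective bit:

```python
import math

def extra_effective_bits(n_frames: int) -> float:
    # Noise shrinks by sqrt(N), so the effective quantization step shrinks by the
    # same factor; each halving of the step corresponds to one extra bit.
    return 0.5 * math.log2(n_frames)

for n in (4, 16, 64, 256):
    print(f"{n:4d} frames -> ~{extra_effective_bits(n):.1f} extra effective bits")
```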
