Exposure time and noise

Unlike daylight photography on Earth, astrophotography requires a very long exposure time because very few photons from the objects reach Earth.

A later section therefore considers how many photons actually reach the Earth's surface and thus the telescope.

In principle, a long total exposure time should always be the aim. Especially with faint objects, several hours of total exposure should be achieved; some objects are even exposed over several nights. Since it is hardly feasible, and also disadvantageous, to expose a single photo over the whole night, several individual photos (sub-frames, also called light-frames) are taken, which are then superimposed (stacked) with suitable software and divided by their number.

In low-light astrophotography, physical effects always produce noise, which is present both in the image background and in the object itself, and which differs in each image. It is exactly this difference that causes the signal-to-noise ratio (SNR) to improve with each additional sub-frame during stacking.
This can be explained as follows: each individual image contains the weak object information, superimposed by noise in such a way that it is hardly or not at all perceptible when viewing the individual image. Stacking (adding and dividing) averages the signals. The object signal remains almost unaffected by this averaging, but the noise components, which vary from pixel to pixel in every sub-frame, are reduced. The signal-to-noise ratio improves and the image appears smoother as a result. Subsequent stretching (see menu item 'Image processing' – 'Stacking and stretching of images') then increases the contrast of the image and brings out the object details.
(Source: Burkart, R. (2024). Astronomie, Das Magazin (edition 43) (page 69f))
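The averaging effect described above can be illustrated with a minimal NumPy sketch. All values (signal level, noise level, frame count) are assumed example numbers, not from the text; the point is only that the constant object signal survives averaging while the per-frame random noise shrinks by roughly the square root of the number of frames.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical example values: a 55 ADU mean level (background + faint
# object) with 20 ADU of random noise in every sub-frame.
mean_level = 55.0
noise_sigma = 20.0
n_frames = 100

# Each sub-frame = constant signal + fresh random noise per pixel.
frames = mean_level + rng.normal(0.0, noise_sigma, size=(n_frames, 64, 64))

single = frames[0]
stacked = frames.mean(axis=0)  # stacking = adding and dividing by N

# The signal survives averaging; the noise shrinks by roughly sqrt(N).
print(f"noise in single frame : {single.std():.2f}")
print(f"noise in stacked frame: {stacked.std():.2f}"
      f"  (~ {noise_sigma / np.sqrt(n_frames):.2f} expected)")
```

With 100 stacked frames the pixel-to-pixel scatter drops from about 20 ADU to about 2 ADU, while the mean level (the signal) is unchanged.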


  1. sub-frames: the object information (blue) is superimposed by noise components
  2. after one stacking step: noise components have been averaged out, the object information (blue) becomes visible
  3. further images were stacked as in 1.
  4. the stacked images are averaged again: the object information (blue) remains visible (but not brighter), while the noise components are strongly averaged and smoothed out
  5. stretching the image and cutting away the shadows in the tone curve

(Source: Burkart, R. (2024). Astronomie, Das Magazin (edition 43) (page 70))


The main aim of astrophotography is therefore to obtain a good signal-to-noise ratio (SNR).

In one night, it is possible to take a few photos with a long single exposure time or a large number of photos with short single exposure times. Both approaches have advantages and disadvantages. For the stacking software to achieve a good result, however, there should be at least a two-digit number of individual images.

Each photon that hits a pixel releases an electron from the silicon of that pixel. The electrons are collected during a single exposure and stored in the pixel's memory. After the single exposure is completed, the fill level of this memory represents the signal of the individual pixel. The collected electrons produce a voltage across the memory, which can be increased by an amplifier (ISO/gain) and is converted by an Analog-to-Digital Converter (ADC) into a digital number (ADU - Analog Digital Unit) as a value for the brightness. In this way, different gray levels are created (also in color cameras), which define the dynamic range of the image. In a color camera, a so-called Bayer matrix sits above the chip: a color filter that lets only green, red or blue pass through for each pixel. Only afterwards is the corresponding color assigned to the digital value.

Depending on how many bits the ADC of a camera has, this memory and the signal values it outputs (and thus also the dynamic range, i.e. the number of gray levels) differ.

Data depth of camera chip | Value range for output signal (ADU) behind the ADC (number of gray levels)
 8 bit | 0 – 255 (256)
12 bit | 0 – 4095 (4096)
14 bit | 0 – 16383 (16384)
16 bit | 0 – 65535 (65536)

(Source: https://www.youtube.com/watch?v=M2upgShXNO0)
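The relationship between data depth and gray levels is simply a power of two: an n-bit ADC outputs integer values from 0 to 2^n − 1. A short sketch:

```python
# Gray-level range behind the ADC for common data depths: an n-bit
# converter outputs integer values from 0 to 2**n - 1.
for bits in (8, 12, 14, 16):
    levels = 2 ** bits
    print(f"{bits:2d} bit ADC: ADU values 0 .. {levels - 1} ({levels} gray levels)")
```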


Each pixel can only generate a certain number of charge carriers until the memory is full (full well capacity).
However, there are two limits for this memory (Source: https://www.baumer.com/):

    • Absolute sensitivity threshold - the smallest number of photons at which the camera can distinguish useful information in the image from noise
    • Saturation capacity - to avoid non-linearities, the manufacturer defines a value that is typically smaller than the full well capacity, so that the uppermost memory limit is never reached


The area between these limits is then scaled to the data depth (bit values of the ADC) during evaluation.
If the memory reaches saturation capacity during a single exposure, no further information about the object can be collected and the pixel "burns out" (overexposure). For stars this is not too serious, but if it affects the whole object, details are lost (the dynamic range decreases).
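The loss of detail from burned-out pixels can be sketched with clipping. The full well and saturation values below are assumed example numbers, not specifications from the text:

```python
import numpy as np

# Hypothetical sensor: manufacturer-defined saturation capacity of
# 45,000 e-. Every signal above it is clipped to the same value, so
# any structure in the bright parts of the object is lost.
saturation = 45_000
signal = np.array([1_000, 30_000, 44_999, 60_000, 80_000])  # e- per pixel
recorded = np.minimum(signal, saturation)
print(recorded)  # the two brightest pixels become indistinguishable
```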



Types of noise and their influence

(Background knowledge from the Astrophotocast by Frank Sackenheim.)


Photon noise / shot noise R_P

During an exposure, and during the subsequent exposures of a series, not every pixel is hit by the same number of photons. It lies in the nature of light that one pixel is hit by more or fewer photons than its neighbor; in the next shot it could be the other way around. In addition, each pixel has a slightly different sensitivity to light (quantum efficiency). Thus, in a single exposure of duration t, neighboring pixels will record different object signals S_P, even though the object may show no differences at this point.
The standard deviation of the photon count from the expected mean number of photons incident per pixel is a measure of the scatter around this value and is called photon noise (also shot noise). Because photon arrival follows Poisson statistics, the photon noise R_P is given by:

R_P = √(S_P)

These differences in brightness between the pixels become especially apparent when few photons reach the chip, as in astrophotography.
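The Poisson character of photon noise is easy to verify numerically. The photon counts below are assumed example values; the sketch samples photon arrivals and shows that the measured scatter matches √(S_P), and that the *relative* noise shrinks as more photons arrive, which is why faint objects look so noisy:

```python
import numpy as np

rng = np.random.default_rng(1)

# Photon arrival is a Poisson process: if a pixel expects S_P photons
# during the exposure, the standard deviation (photon noise R_P) is
# sqrt(S_P). Example photon counts, not values from the text.
for s_p in (10, 100, 10_000):
    counts = rng.poisson(s_p, size=100_000)
    print(f"S_P = {s_p:6d}: measured noise {counts.std():7.2f}, "
          f"sqrt(S_P) = {np.sqrt(s_p):7.2f}, "
          f"relative noise {counts.std() / s_p:.3f}")
```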


Background noise R_H due to sky glow S_H

The sky glow (moonlight, light pollution, reflecting dust, ...) is an additional signal that is added to the object signal and generates noise in the same way as photon noise. The unwanted signal S_H can be removed by software, but since only the average value of the signal is subtracted, the noise R_H remains in each pixel. For a single exposure of duration t it is again calculated from the same formula:

R_H = √(S_H)


Dark current noise R_D

Due to thermal effects, electrons are released from the silicon chip from time to time even when the camera is closed, generating a signal S_D over the duration of the single exposure time t. The dark current responsible for this signal varies due to various physical effects. This deviation from the average dark current is the dark current noise R_D and is again calculated with the standard-deviation formula:

R_D = √(S_D)

By recording dark-frames, this spurious signal can be subtracted, but here again the noise remains. To keep the thermally caused dark current signal, and thus the dark current noise, as low as possible, the chips of astro cameras are cooled with Peltier elements.


Readout noise R_A

Due to physical processes, random errors occur when the electrons are read out of the pixel memory and converted. These deviations from the statistical average of the readout are called readout noise R_A; it occurs unavoidably once for every single image and is independent of the exposure time. Readout noise therefore has a particularly strong effect on short single exposures.

Since noise values lie both above and below the expected value, the four noise components cannot simply be added. The total noise of the image can only be determined by squaring the noise values, adding them, and taking the square root (quadrature sum):

R_total = √(R_P² + R_H² + R_D² + R_A²)

This squaring has the effect that larger noise values become even more dominant, while smaller noise values are so small in comparison that they can be neglected.

For the pure signal values, a simple addition can be used. All signals reaching the chip during the single exposure time t add up to a total effective signal S_N = S_P + S_H + S_D.
If everything is put into the formula for the signal-to-noise ratio, the following results for a single image:

SNR = S_N / √(S_P + S_H + S_D + R_A²)

During stacking, the software superimposes a certain number of images N. If this number is included in the formula, it must be applied equally to the signal and to each noise value. This results in:

SNR = (N · S_N) / √(N · S_P + N · S_H + N · S_D + N · R_A²) = √N · S_N / √(S_P + S_H + S_D + R_A²)
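The √N behavior of stacking follows directly from this formula. A small sketch, with all per-frame signal and noise values chosen as hypothetical examples (electrons per pixel), computes the stacked SNR for several frame counts:

```python
import math

def snr_stack(n, s_p, s_h, s_d, r_a):
    """SNR of n stacked sub-frames: signal and every noise term are
    multiplied by n, and the readout noise enters squared once per
    frame. All example values below are assumptions."""
    s_n = s_p + s_h + s_d                         # effective signal per frame
    noise = math.sqrt(n * (s_p + s_h + s_d + r_a ** 2))
    return n * s_n / noise

# Hypothetical per-frame values in electrons:
s_p, s_h, s_d, r_a = 30.0, 120.0, 0.6, 2.0
for n in (1, 4, 16, 64):
    print(f"N = {n:2d}: SNR = {snr_stack(n, s_p, s_h, s_d, r_a):7.2f}")
```

Quadrupling the number of frames doubles the SNR, as the √N factor in the formula predicts.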

The following becomes visible from the formula:

    • In contrast to the values in the denominator, the useful signal in the numerator is not under a root → with an increasing number of images and/or a longer single exposure time, the signal-to-noise ratio increases.
    • The readout noise is squared → the camera should have as small a readout noise as possible (see the data sheet specifications of the camera).
    • The readout noise is a fixed value for each frame. → A large number of single frames increases the total readout noise of the final image, but it also increases the signal.
    • In modern cameras with cooling, the dark current is very low compared to the other noise values and can therefore usually be neglected.
    • If the additional sky glow is subtracted from the effective signal in the numerator by the software, longer exposures must be used to compensate for the resulting poorer signal-to-noise ratio.
    • A dark sky produces less background signal and therefore requires a longer exposure time before the readout noise becomes negligible.

Depending on the sky glow, it is therefore necessary to find a good compromise between a minimum and a very long exposure time, so that the readout noise becomes negligible. The exposure time should be chosen so that the signal S_H generated by the sky glow brightens the image just enough that object details are not lost. The background photons therefore limit the exposure time; such exposures are called background-limited. (Source: https://www.youtube.com/watch?v=xUzk9V2NZBY)

In areas close to a city, the sky glow S_H is very strong; this is often noticeable after 30 s to 60 s. A capturing software (e.g. N.I.N.A.) can recommend an exposure time based on the sky glow via test images (link). (The equipment must also be considered here: is a slow or a fast optic being used, and are narrowband filters applied, which let only few photons through?)

The sky glow S_H is then the dominant part of the noise values in the formula, and the disturbing readout noise can be neglected. Under very dark skies, however, exposures of several minutes are required before the readout noise becomes negligible.


If a color camera or an RGB filter set is used, the calculated exposure time should be tripled. This is because a Bayer matrix sits above the chip of a color camera, which lets through only green, red or blue for each pixel. (see menu item 'Components' - 'Camera' - 'Color or monochrome')

When using narrow band filters (12 nm), the exposure time should be multiplied by 25, and even by 100 when using a 3 nm narrow band filter.
(Source: https://www.youtube.com/watch?v=3RH93UvP358)
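The rule-of-thumb multipliers above can be collected in a small lookup sketch. The base exposure time of 30 s is an assumed example value:

```python
# Rule-of-thumb multipliers from the text for the calculated
# background-limited exposure time:
FILTER_FACTOR = {
    "no filter (monochrome)":       1,
    "color camera / RGB filter set": 3,
    "12 nm narrowband filter":      25,
    "3 nm narrowband filter":      100,
}

base_exposure_s = 30  # assumed example of a background-limited exposure
for name, factor in FILTER_FACTOR.items():
    print(f"{name:32s}: {base_exposure_s * factor:5d} s")
```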

For a direct calculation of the optimal exposure time from a given light pollution and the readout noise of the camera, the following formula applies:

t = C · R² / P
(Source: https://www.youtube.com/watch?v=3RH93UvP358)

R – readout noise of the camera
P – light pollution rate (electrons released by sky glow per pixel per second [e-/pixel/s])

C is a factor that specifies how much percentage deviation E from the unavoidable minimum possible noise is still allowed: C = 1 / ((1 + E)² − 1). For E = 5 %, C ≈ 10.
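A short sketch of this calculation. The readout noise and light pollution rate below are assumed example values, not measurements from the text:

```python
import math

def optimal_exposure(read_noise, pollution_rate, allowed_increase=0.05):
    """Background-limited sub-exposure time t = C * R^2 / P, where
    C = 1 / ((1 + E)^2 - 1) allows the total noise to exceed the
    sky-limited minimum by at most E (here 5 %)."""
    c = 1.0 / ((1.0 + allowed_increase) ** 2 - 1.0)
    return c * read_noise ** 2 / pollution_rate

# Example: a camera with R = 2 e- readout noise under a suburban sky
# producing P = 1.5 e-/pixel/s of sky glow (assumed values):
t = optimal_exposure(read_noise=2.0, pollution_rate=1.5)
print(f"recommended single exposure: {t:.0f} s")
```

With these example numbers the formula yields roughly 26 s, which is consistent with the ~30 s guideline given below for medium light pollution.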

The following link shows an example calculation for this formula, using the astro camera ZWO ASI294MC Pro (cooled color camera) and an f/6 refractor as the telescope.

As a rough guide, at sites with medium light pollution and optics at f/5 to f/6, an exposure time of about 30 s can be recommended for a CMOS color camera without a narrowband filter in order to be background-limited. Since many other factors also play a role, however, this is not a fixed value; times between 30 s and 300 s can also be tried, especially since longer exposure times are often the better choice.


Short single exposure times (e.g. 900 images with 20 s exposure time each = 5 h)

Advantages:

•   Failed images (blurred, satellite trails, airplanes, meteors) can be sorted out without greatly reducing the total exposure time

•   Suitable for very bright objects, so that the pixels do not go into saturation (burn out)

•   Small deviations that occur during alignment are less relevant

Disadvantages:

•   Very large amount of data (a single image has several MB, depending on chip size)

•   Long processing times during stacking

•   When using narrowband filters, or with very faint objects, too few photons reach the chip

•   Exposures that are too short result in a bad signal-to-noise ratio

Long single exposure times (e.g. 150 images with 120 s exposure time each = 5 h)

Advantages:

•   Smaller data volumes

•   Processing times are considerably shorter than with a high number of images

•   At the limit of background limitation, longer single exposure times improve the signal-to-noise ratio

Disadvantages:

•   Failed images (blurred, satellite trails, airplanes, meteors) weigh heavily when sorted out, so the final image then lacks exposure time

•   Only after a few captures can it be seen whether the object is well framed; by then, however, a lot of the night has already passed

•   The alignment must be very exact

•   The sky glow has a stronger influence near the city and must be taken into account