M31 (Andromeda Galaxy)

Sensitivity and novel techniques to get the best from a classic subject.


Equipment:

Refractor, 98 mm aperture, 618 mm focal length

Reducer / flattener (0.8x)

QSI683 CCD (Kodak KAF8300 sensor)

QSI integrated Filter Wheel (1.25” Baader filters)

QSI integrated off-axis guider with Lodestar CCD

Paramount MX, Berlebach tripod

Software: (Windows 7)

Sequence Generator Pro, ASCOM drivers

PHD2 autoguider software

PixInsight (Mac OSX)

Exposure: (LRGBHα)

L bin 1; 10 × 120, 15 × 300, 15 × 600 seconds

RGB bin 2; 15 × 300 seconds, Ha bin 1; 15 × 600 seconds

There are a few popular subjects that appear in everyone's portfolio yet at the same time present technical difficulties that demand special measures to make the very best image. One of these is the great Andromeda Galaxy. It is often one of the first objects to be attempted and is within the reach of a digital SLR fitted with a modest telephoto lens. At the same time, it is particularly difficult to render the faint outer reaches while keeping the bright central core from clipping and losing detail. The temptation with such a bright object is to take too few exposures that, although they show up the impressive dust lanes, glowing core and neighboring galaxies, are individually too long to preserve detail in the galaxy core and insufficient in total duration to capture the outer margins. To compensate, the image requires considerable stretching, and many early attempts consequently show clipped white stars. This subject deserves a little respect and sensitivity, using some novel techniques to do it justice.

One of the most impressive versions of M31 is that by Robert Gendler, editor of Lessons from the Masters, who produced a 1-GB image mosaic using a remotely operated 20-inch RC telescope in New Mexico. His beautiful image is the benchmark for those of us on terra firma. By comparison, my modest portable system in my back yard, 30 miles from London, feels somewhat inadequate, which makes my final image all the more satisfying.

fig135_308_1.tif

M31 is one of the few galaxies that exhibit a blue-shift, as it hurtles towards us on a collision course (due in approximately 3.75 billion years). The object is about 3° wide (about six times the width of the full Moon) and requires a short focal length. Using an APS-C sized CCD with a 618 mm focal length refractor, fitted with a 0.8x reducer, its extent just squeezes across the sensor diagonal. The lack of a margin caused some issues during the background equalization process, since the background samples near two of the corners affected the rendering of the galaxy itself. After realizing this, I repeated the entire workflow with fewer background samples, which produced an even better image with extended fine tracery.
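For readers who like to check the geometry, the diagonal field of view follows from the reduced focal length and the sensor dimensions. The short sketch below assumes the nominal KAF-8300 sensor size of 17.96 × 13.52 mm (not quoted in the text):

```python
import math

# Assumed nominal KAF-8300 sensor dimensions (mm) and the reduced focal length (mm)
sensor_w, sensor_h = 17.96, 13.52
focal_length = 618 * 0.8              # 0.8x reducer gives roughly 494 mm

diagonal = math.hypot(sensor_w, sensor_h)
fov_diag = math.degrees(2 * math.atan(diagonal / (2 * focal_length)))
print(f"diagonal field of view ~ {fov_diag:.1f} degrees")   # roughly 2.6 degrees
```

With a diagonal of about 2.6°, the bright extent of a 3°-wide galaxy only just fits corner to corner, which is why the background samples crowded the galaxy's periphery.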

Setting Up

The equipment setup is remarkably simple: a medium-sized refractor fitted with a reducing field-flattener, screwed to a CCD camera, filter wheel and off-axis guider. For precise focus control, the telescope's Feather Touch focuser upgrade was fitted with a digital stepper motor and USB controller. At this short focal length the demands on the telescope mount are slight and in this instance the Paramount MX hardly noticed the load. The mount was quickly aligned using its polar scope and the alignment confirmed to within one arc minute with a 15-point TPoint model.

Acquisition

Imaging occurred over four nights during Andromeda's early season in September. Image acquisition started at an altitude of 35°; ideally, had deadlines permitted, I would have waited a few months to image from 45° and above, away from the light-pollution gradient near the horizon. To minimize its effect, on each night I started with a binned color filter exposure set and followed with luminance, by which time M31 had risen to a respectable altitude and the local street lamps had turned off. The exposures were captured with Sequence Generator Pro, using PHD2 to guide via the Lodestar's ST4 interface. SGP remembers the sequence progress when it shuts down and picks up where it left off on each subsequent session. The filename and FITS header of each file were clearly labeled so that the different luminance sub-exposures could be grouped together during calibration. Early autumn in the UK is a damp affair and the dew heater was set to half power to ensure the front optic remained clear in the high humidity.

Guiding performance on this occasion was a further improvement over previous sessions. For the DEC axis I changed PHD2's guiding algorithm from "resist switching" to "low pass". RMS tracking errors with 5-second guide exposures were around 0.3 arc seconds. PHD2 mimics PHD's guiding algorithms and I experimented with the different control options, including hysteresis, resist switching and low pass. The hysteresis algorithm, with hysteresis set to 0, behaves similarly to Maxim DL's guiding algorithm.
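As a rough illustration of the idea behind a hysteresis-style algorithm (this is a generic sketch, not PHD2's actual implementation, and the aggression and hysteresis values are invented for the example): the latest guide-star error is blended with the previous correction, so the mount does not chase every seeing-induced excursion.

```python
def hysteresis_correction(error, prev_correction, aggression=0.7, hysteresis=0.1):
    """Generic hysteresis-style guide correction (illustrative only):
    blend the latest error with the previous correction, then scale by
    the aggression before issuing the guide pulse."""
    blended = (1 - hysteresis) * error + hysteresis * prev_correction
    return aggression * blended

# Example: a 0.5" excursion following a 0.2" correction
print(hysteresis_correction(0.5, 0.2))   # issues roughly a 0.33" correction
```

With hysteresis set to 0, the correction collapses to aggression × error, which is why that setting behaves like a simple proportional guider.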

fig135_1.tif

fig.1 This simplified workflow is designed for use with PixInsight. Most of the imaging steps can be accomplished in other packages or combinations too.

Calibration and Channel Integration

The 1.25-inch filters in the QSI683 are a tight crop and, when used with a fast aperture scope, the extreme corners vignette sharply. Flat calibration does not entirely remedy this and in practice, if these corners are crucial, they are corrected in Photoshop as a final manual manipulation. For this case study, the calibration process generated seven sets of calibrated and registered files: three luminance sets of different exposure duration, R, G, B and Hα. The individual calibrated files were registered to the Hα frame with the smallest half flux density. These file sets were separately integrated to form seven master image files, weighted by noise and using Winsorized Sigma Clipping to ensure that plane trails and cosmic-ray hits were removed.
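For those curious about the rejection step, the sketch below illustrates the idea behind Winsorized sigma clipping on a stack of registered frames. It is a simplified, per-pixel illustration only; PixInsight's ImageIntegration adds per-frame scaling, noise weighting and more robust estimators.

```python
import numpy as np

def winsorized_sigma_clip(stack, k=1.5, sigma_low=4.0, sigma_high=3.0, iters=5):
    """Simplified Winsorized sigma clipping for a stack of registered frames
    (shape: frames x height x width). Outliers such as plane trails and
    cosmic-ray hits are excluded from the final average."""
    data = stack.astype(np.float64).copy()
    for _ in range(iters):
        med = np.median(data, axis=0)
        sig = np.std(data, axis=0)
        # Winsorize: clamp extreme samples to the k-sigma envelope so they
        # cannot distort the sigma estimate itself
        data = np.clip(data, med - k * sig, med + k * sig)
    med, sig = np.median(data, axis=0), np.std(data, axis=0)
    keep = (stack > med - sigma_low * sig) & (stack < med + sigma_high * sig)
    counts = np.maximum(keep.sum(axis=0), 1)
    return np.where(keep, stack, 0.0).sum(axis=0) / counts
```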

Processing

Channel Combinations

One of the first "tricks" with this image is the combination of image channels to form a set of LRGB master files. In the case of the luminance images, we have three sets, two of which have progressively more clipping at the galaxy's core. PixInsight has an effective high dynamic range combination tool, named HDRComposition, that blends two or more images together using a set threshold for the 50:50 point. This is great, but in this case several hours of shorter exposures would do nothing to improve the noise level of the 600-second exposures. Recall I took three luminance exposure sets? The ImageIntegration tool can weight-average three or more images. The first trick is to simply average the 120-, 300- and 600-second exposure sets, weighted by their exposure duration (fig.2), and then, to improve the galaxy core, use the HDRComposition tool to substitute in the 300- and 120-second exposure information. Since the image is already an average of three stacks, the core is not clipped, but on closer inspection it is posterized above an image level of about 0.2. Using 0.15 as the threshold in HDRComposition, the smoother, detailed core from the shorter exposures blends into the galaxy center as well as into the brighter stars (fig.3). (In fact, using HDRComposition with a shorter-exposure starfield image is a nifty method of reducing bright star intensity in images.)
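The two steps can be pictured in a few lines. This is only a loose analogue under stated assumptions: the real HDRComposition tool fits the inter-exposure scale factor and builds smooth composition masks, and the threshold, feather value and the array names in the usage comment are illustrative.

```python
import numpy as np

def exposure_weighted_average(stacks, exposures):
    """Average several integrated masters, weighted by exposure time -
    a simplification of ImageIntegration's weighting options."""
    w = np.asarray(exposures, dtype=np.float64)
    return np.tensordot(w / w.sum(), np.stack(stacks), axes=1)

def hdr_substitute(long_img, short_img, threshold=0.15, feather=0.02):
    """Loose analogue of HDRComposition: above `threshold`, blend in the
    linearly rescaled short-exposure data so the bright core is neither
    clipped nor posterized."""
    faint = long_img < threshold
    scale = np.median(long_img[faint]) / max(np.median(short_img[faint]), 1e-9)
    blend = np.clip((long_img - threshold) / feather, 0.0, 1.0)
    return (1.0 - blend) * long_img + blend * short_img * scale

# e.g. lum = exposure_weighted_average([m120, m300, m600], [120, 300, 600])
#      lum = hdr_substitute(lum, m300, threshold=0.15)
```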

In practice, the luminance files contain considerable light pollution. Before integrating the three exposure stacks (with no pixel rejection), I used the DynamicBackgroundExtraction (DBE) tool to remove the sky pedestal and even out the background. In that way, the ImageIntegration tool only took account of the image signal rather than the image plus light pollution. (To confirm my assumptions were correct, I compared the image noise and noise weighting of the separate luminance files, the HDRComposition file and the hybrid combination using the SubframeSelector script.)

fig135_2.tif

fig.2 The evaluate noise feature of the ImageIntegration tool provides a measure of the final noise level. This allows one to optimize the settings. In this case, an “exposure time” weighting outperformed the “noise” weighting.

The other departure from standard LRGB imaging is to combine the Hα and red channels. Comparing the red and Hα images, the Hα exposures pick out the nebulosity along the spiral arms and do not exhibit a gradient. Again, after carefully applying the DBE tool to each stack, they were combined with PixelMath using a simple weighted ratio. Several weightings were tried, of the general form:

red_new = (a × Hα) + ((1 − a) × red), for several trial values of a

fig135_3.tif

fig.3 The integrated image (top) looks fine at first glance. On closer inspection there are two concentric halos, corresponding to the clipping boundaries of the two longer exposure sets. Using HDRComposition, these are blended away by substituting the shorter-exposure data at the higher image intensities, leaving just a few saturated pixels in the very middle.

Since the red channel contained a prominent gradient (before applying DBE), it was evidently detecting light pollution, and I chose to favor the Hα channel 9:1 using the PixelMath expression:

red_new = (0.9 × Hα) + (0.1 × red)

Linear Processing

Some of the linear processing steps (DBE) had already been completed to assist with the unique image channel combinations. The remaining green and blue channels were similarly processed to remove background gradients. In all cases, applying DBE was particularly difficult, as the galaxy extended to the extreme corners of the image. In hindsight, my sampling points were too close to the faintest periphery; had I "retreated" a little, the subtraction would have been less severe and the outer glow better preserved. The master luminance file also required the further standard processing steps, including deconvolution using an extracted point spread function (PSF), deringing with the support of a purpose-made mask, and a combined star and range mask to exclude the background. The deringing option was adjusted precisely to remove tell-tale halos, especially around those stars overlapping the bright galaxy core. This improved star and dust-lane definition at the same time.
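As a rough illustration of PSF-based deconvolution (PixInsight's tool uses regularized algorithms with built-in deringing support, which this scikit-image sketch omits, and the iteration count here is an arbitrary example value):

```python
import numpy as np
from skimage import restoration

def deconvolve_luminance(lum, psf, n_iter=30):
    """Rough illustration of PSF-based deconvolution via Richardson-Lucy.
    `psf` stands in for the extracted point spread function."""
    psf = psf / psf.sum()                       # normalise the PSF
    # iteration count passed positionally for compatibility across
    # scikit-image versions (the keyword name changed over time)
    return restoration.richardson_lucy(np.clip(lum, 0, 1), psf, n_iter)
```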

fig135_4.tif

fig.4 The TGVDenoise tool has an extended "Local Support" option that progressively protects areas with a high signal-to-noise ratio. Its midtone, shadow and highlight settings are matched to the image; conveniently, the ScreenTransferFunction supplies these values (click on the wrench symbol). Note that the numbers appear in a different order in the two tools.

To complete the luminance linear processing, noise reduction was applied using the TGVDenoise tool, with local support carefully set to proportionally protect those areas with a high SNR, and with a range mask for good measure. The settings were rehearsed on a preview before being applied to the full image.
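A loose stand-in for this masked noise reduction is sketched below. TGVDenoise uses total generalized variation; plain total-variation denoising from scikit-image is used here purely to illustrate how a protection mask confines the effect, and the weight value is illustrative.

```python
import numpy as np
from skimage.restoration import denoise_tv_chambolle

def masked_denoise(lum, protect_mask, weight=0.05):
    """Denoise the luminance, then blend the result back through a mask so
    that high-SNR areas (protect_mask near 1) keep their original pixels.
    Illustrative only; not the TGVDenoise algorithm itself."""
    smooth = denoise_tv_chambolle(lum, weight=weight)
    return lum * protect_mask + smooth * (1.0 - protect_mask)
```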

For the color channels, having equalized their backgrounds, the separate image stacks were inspected for overall levels. Using the green channel as a reference, the LinearFit tool was applied in turn to the red and blue images to equalize them, prior to using the ChannelCombination tool to create the initial RGB color image. To neutralize the background, I created several small previews over areas of dark sky and combined them for use as the reference for the BackgroundNeutralization tool (fig.5). This combined preview also served as the background reference for the ColorCalibration tool which, together with a preview window drawn over the galaxy, set the color balance. After several attempts using different white references, a pleasing color balance was established, with a range of warm and cool tones in the galaxy.
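Conceptually, the LinearFit step matches each channel to the reference with a straight-line (gain and offset) fit. A one-pass sketch of the idea follows; PixInsight's tool iterates with outlier rejection, which this omits.

```python
import numpy as np

def linear_fit_to_reference(channel, reference):
    """Fit reference ~ a*channel + b, then apply the fit to the channel -
    a one-pass sketch of the LinearFit concept."""
    a, b = np.polyfit(channel.ravel(), reference.ravel(), 1)
    return a * channel + b

# e.g. red_eq  = linear_fit_to_reference(red, green)
#      blue_eq = linear_fit_to_reference(blue, green)
```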

fig135_5.tif

fig.5 Background neutralization is quite difficult on this image on account of the sky gradient. Multiple previews were created to cover the background. These were combined using the PreviewAggregator script and this preview used as the background reference.

Non-Linear Luminance Processing

Although non-linear processing followed a well-trodden path, it was not the easiest of journeys. Starting with the luminance, a careful set of iterative HistogramTransformation applications ensured that the histogram did not clip at either end and that there was sufficient headroom to allow for the brightening effect of the sharpening tools. The details in the galaxy were teased out using the LocalHistogramEqualization tool, applied at different scales and with different range masks: first to the galaxy, to improve the definition of the spiral arms, and again with a more selective mask revealing only the central core, to tease out the innermost dust lanes. By this stage some star intensities had started to clip and, unless they were reduced in intensity, they would be rendered white in the final color image. I used the MorphologicalTransformation tool, with a supporting star mask, to slightly shrink the stars and at the same time reduce their overall intensity. This useful trick ensures that, provided the non-linear RGB file is not over-stretched, the star color is preserved. (Another method is to substitute stars from a duplicate linear image that has had a masked stretch applied.) Using a general luminance mask to reveal the background and the faint parts of the galaxy, a little more noise reduction was applied to the background with TGVDenoise, taking care to avoid a "plastic" look. (It is easy to overdo noise reduction.)
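The star-shrinking step can be pictured as a greyscale erosion blended back through a star mask. The sketch below works under that assumption; the size and strength parameters are invented for illustration and this is not the MorphologicalTransformation tool itself.

```python
import numpy as np
from scipy import ndimage

def shrink_stars(lum, star_mask, size=3, strength=0.6):
    """Erode the image (tightening bright star profiles), then blend the
    eroded version in only where the star mask is high, so stars shrink
    and dim slightly while the rest of the image is untouched."""
    eroded = ndimage.grey_erosion(lum, size=(size, size))
    amount = np.clip(star_mask * strength, 0.0, 1.0)
    return lum * (1.0 - amount) + eroded * amount
```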

Non-Linear Color Processing

The role of the color file is simply to provide supporting color information for the luminance. As such, it has to be colorful yet with low chrominance noise, and the background must be neutral and even. Since bright images by their very nature have low color saturation, the non-linear stretch should be moderate, to avoid clipping pixel color values. The stretch was carried out in two passes to form a slightly low-key result (fig.6). Troublesome green pixels were removed with the SCNR tool and, after a mild saturation boost, noise reduction was applied to the background using TGVDenoise (with a supporting mask). The stars were then slightly blurred too (through a supporting star mask) to remove some colorful hot pixels at their margins; this evened out a few colored fringes and gave a more natural look. Satisfied I had a smooth and colorful image, the luminance file was applied to it using the LRGBCombination tool, adjusting the saturation and lightness settings to suit my taste. To preserve color, the image brightness was kept on the darker side of the default settings.
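Conceptually, applying the luminance amounts to replacing the lightness of the stretched RGB image with the separately processed luminance. A crude sketch in CIE L*a*b* space is shown below; LRGBCombination itself adds chrominance noise reduction and saturation/lightness transfer adjustments that this ignores.

```python
import numpy as np
from skimage import color

def lrgb_combine(rgb, lum):
    """Replace the lightness of a non-linear RGB image with a separately
    processed luminance - a crude analogue of LRGBCombination."""
    lab = color.rgb2lab(np.clip(rgb, 0, 1).astype(float))
    lab[..., 0] = np.clip(lum, 0, 1) * 100.0    # the L* channel runs 0-100
    return np.clip(color.lab2rgb(lab), 0, 1)
```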

fig135_6.tif

fig.6 The RGB color image is simply used for color information. To preserve color saturation, it is important not to over-stretch it or create excessive chrominance noise. For that reason the entire image was blurred slightly and noise reduction applied to selective areas. Star saturation was increased slightly using the CurvesTransformation tool and a star mask.

Now that the general color, saturation and brightness were established, the structures in the galaxy core and margins were emphasized using the MultiscaleMedianTransform and LocalHistogramEqualization tools, with supporting masks to concentrate their effect. On screen, I could see the faint tracery of the dust lanes spiral into the very center of the galaxy core. After some minor saturation boosts with the CurvesTransformation tool, a final round of noise reduction was applied, again using masks to direct the effect to those areas where it was most needed. (By the end of this exercise I had about 10 different masks, each optimized to select particular objects based on brightness, scale or both.)
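In spirit, the masked local-contrast boost resembles adaptive histogram equalization confined by a mask. The sketch below uses scikit-image's CLAHE as a stand-in; the kernel size and clip limit are illustrative and this is not the LocalHistogramEqualization algorithm itself.

```python
import numpy as np
from skimage import exposure

def masked_local_contrast(lum, mask, kernel_size=128, clip_limit=0.02):
    """Boost local contrast (CLAHE), then blend the result in only where
    the mask is high, e.g. over the spiral arms or the galaxy core."""
    boosted = exposure.equalize_adapthist(np.clip(lum, 0, 1),
                                          kernel_size=kernel_size,
                                          clip_limit=clip_limit)
    return lum * (1.0 - mask) + boosted * mask
```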

Improvements

This impressive subject is hard to resist, and I had to use several novel techniques to overcome the difficulties presented by its high dynamic range and to lift the result above the average interpretation. There was still room for improvement: having compared the final result with other notable examples, I realized that I had inadvertently lowered the intensity of the outer margins through over-zealous use of the DynamicBackgroundExtraction tool, and had missed an opportunity to boost a ring of bright blue nebulosity around the margins (the fault of a range mask that was restricted to the brighter parts of the galaxy). Feeding these corrections back into the beginning of the image workflow yielded a small but worthwhile improvement, many hours later.

At the same time, I realized another trick to improve the appearance of the red and blue nebulosity in the margins: I used CurvesTransformation's saturation curve to boost the red color, and its blue and RGB/K curves to emphasize and lighten the blue nebulosity. The trick was to select the content using stretched versions of the Hα and blue image stacks as masks. The final re-processed image appears below (fig.7). Phew!
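The same mask-driven idea can be sketched as a saturation boost applied only where a stretched Hα (or blue) mask is strong. This is an illustrative HSV-space analogue, not the CurvesTransformation tool, and the gain value is invented for the example.

```python
import numpy as np
from skimage import color

def masked_saturation_boost(rgb, mask, gain=1.4):
    """Increase color saturation only where the mask is high - the same
    idea as applying a saturation curve through a stretched Ha/blue mask."""
    hsv = color.rgb2hsv(np.clip(rgb, 0, 1).astype(float))
    hsv[..., 1] = np.clip(hsv[..., 1] * (1.0 + (gain - 1.0) * mask), 0.0, 1.0)
    return color.hsv2rgb(hsv)
```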

fig135_7.tif

fig.7 The final image makes full use of the APS-C sensor in more ways than one. In this particular image, tiny dust clouds can be seen in the companion galaxy M110. At first I thought they were noise, until an Internet search confirmed they are real.
