Processing a Typical Astrophotograph

For this example, I’m using a recent set of shots I took to put together a final shot for NGC6853 (the Dumbbell Nebula).  A friend of mine asked a while ago:

Q: So are you compositing multiple shots together, or taking different shots across spectrums? Quite curious how this works.

I didn’t like my answer much, because examples are always going to be better than a long, wordy explanation. Here’s the wordy part anyway, notwithstanding that last statement.

A: Yes, both. The CCD chip in your camera has infra-red and ultra-violet filters, and maybe some colour correction, in front of it. It gathers enough light in a short exposure that the ‘smarts’ in the camera can do some post-processing and present you with a picture that looks like what your eye would see. With astronomy cameras, the CCD chip is exposed to every spectrum of light, so you need to add filters and take many shots to reduce the noise as much as possible. You can see in this picture some graininess around the faint nebula areas – with more exposures, and longer exposures, you can make that smoother and the detail clearer.

I use many techniques, including stacking shots together with astronomy software (Nebulosity, Maxim DL, GCX), and then often have to do more post-processing in GIMPShop (a free Photoshop alternative I run on Ubuntu). I’ll often take ‘darks’, which show me the inherent noise in the CCD chip from electrical current flowing across it, and subtract that from the ‘light’ shots (where the CCD is exposed to light). I’ll often take shots with red, blue, green and yellow filters over the front to capture those spectrums as clearly as possible; then there are H-alpha and Oxygen wavelengths that reveal other detail. Stack all that together, put your interpretation of colour on it, and that’s how every photo of every nebula and galaxy is made. Even Hubble shots have processing like this done over them.

It’s a rough truth, but to our eyes, all that you see in space is greyish-green light. Everything else is a function of light refracting in some sort of medium (atmosphere). So all photos of space are adjusted to let us see the different wavelengths in false colour, which our eyes would not normally be able to see at all.
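The false-colour idea at the end can be sketched in a few lines. This is a hypothetical example with synthetic arrays standing in for real filtered exposures (in practice these would be loaded from FITS files); the channel mapping shown — H-alpha to red, OIII to green and blue — is one common convention, not the only one.

```python
import numpy as np

# Synthetic stand-ins for two narrowband exposures of the same field.
h, w = 64, 64
rng = np.random.default_rng(0)
halpha = rng.random((h, w))  # hypothetical H-alpha frame
oiii = rng.random((h, w))    # hypothetical OIII frame

def normalize(img):
    """Scale an image into the 0..1 range."""
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo)

# Map H-alpha to red and OIII to both green and blue (OIII sits
# between green and blue visually), producing a false-colour RGB image.
rgb = np.dstack([normalize(halpha), normalize(oiii), normalize(oiii)])
```

The choice of which wavelength drives which channel is exactly the "interpretation of colour" mentioned above.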

Here’s how it works, using some quick examples. These are taken with a QHY8 one-shot colour camera, so I don’t need to take multiple shots with colour filters. The trade-off is less sensitivity at various wavelengths, and the possibility of a band of colour being out of focus (which can be corrected using black-and-white photography with filters in place). I also have to take longer shots to get an acceptable amount of detail, which means keeping my equipment tracking for longer than I would otherwise need.

Step 1: Take a shot and capture the raw data. This particular one is a 4200-second exposure, taken on an average night – not the best seeing around, but I got enough data to produce this dark picture. You can barely see the nebula – but it's there!
[Image: NGC6853, unprocessed, 4200 seconds]
Step 2: I normally do this in MaximDL. The first part is to remove the inherent noise in the CCD chip by subtracting a "dark". A dark is just a picture taken with the scope covered, so there's no light data, just static. It should look like the back of your eyes when you close them! After the noise is reduced, I apply a few curve changes to the raw data to bring out more of the luminosity in the picture and a bit more detail. You can see more detail in this shot, and the nebula is coming out. I haven't painted anything or done anything artificial to the picture, just brought out what was already inherent in the first shot.
[Image: NGC6853, dark-subtracted and curved, 4200 seconds]
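The dark subtraction and curve adjustment in this step can be sketched like this. The frames here are synthetic stand-ins (real ones would come out of MaximDL or a FITS reader), and the square-root curve is just one simple choice for lifting faint mid-tones.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical 16-bit-ish data: a light frame and a matching dark frame.
light = rng.integers(0, 4000, size=(64, 64)).astype(np.float64)
dark = rng.integers(0, 500, size=(64, 64)).astype(np.float64)

# Dark subtraction: remove the sensor's thermal/readout pattern.
# Clipping at zero avoids negative pixel values.
calibrated = np.clip(light - dark, 0, None)

# A simple "curve": a gamma below 1 lifts the faint mid-tones,
# bringing out nebulosity without blowing out the bright stars.
scaled = calibrated / calibrated.max()
curved = scaled ** 0.5
```

Real tools let you shape the curve by hand rather than picking a single gamma, but the effect is the same: faint pixels are boosted more than bright ones.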
Step 3: I then 'stretch' the image. This process cuts out the data above and below certain points; those points are a personal choice and can change how your picture looks dramatically. Cutting out too much data towards the higher range means you lose it, so a photo that has been stretched incorrectly can look flat, with very little shadow or variation in colour tone. If you cut too much data out at the lower end, you also lose shading and detail – cut too much and the blacks in the picture go flat. Not cutting out enough, on the other hand, means you bring out more noise. It's a judgement call in many cases. This shot shows a stretched raw picture, with no other processing applied.
[Image: NGC6853, stretched, 4200 seconds]
Step 4: After stretching (this is hard work, after all!), I apply some sharpening filters to the picture to bring out more detail in the nebula and reduce some of the glare from the stars. I might adjust the colour to bring it towards a palette that I think this picture should have. That part is more art than science.
[Image: NGC6853, sharpened and re-curved, 4200 seconds]
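One common sharpening technique of the kind this step describes is an unsharp mask: subtract a blurred copy from the image and add the difference back, which boosts edges and fine detail. This is a generic sketch, not the specific filter MaximDL applies; the box blur is a crude stand-in for the Gaussian blur a real tool would use.

```python
import numpy as np

def box_blur(img, k=3):
    """Crude box blur: sliding-window mean over a k x k neighbourhood."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def unsharp_mask(img, amount=1.0):
    """Classic unsharp mask: add back (image - blurred) to boost detail."""
    blurred = box_blur(img)
    return np.clip(img + amount * (img - blurred), 0, 1)

rng = np.random.default_rng(3)
img = rng.random((32, 32))  # synthetic stand-in for a stretched frame
sharp = unsharp_mask(img)
```

The `amount` parameter is the artistic knob: too high and stars get harsh halos, which is why this step is "more art than science".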
Take the previous four steps and repeat them over many shots. My images take between 12 and 20 hours of exposure to get to something I am happy with. The more shots taken and applied to the final picture, the more detail you can bring out in the final image. The shots are stacked using software that does all the stacking for me; I usually use a 'drizzle' stack. This shot has been colour-shifted towards more of a blue/red image. I normally prefer something with less colour in it, but the end result brings out some of the knots in this shot, which are interesting. I also prefer less luminosity in my shots where possible, but the brightness in this one is needed to show detail outside the nebula that is otherwise hard to see.
[Image: NGC6853, The Dumbbell Nebula, 27 Jun 2015 (TIF)]
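Why stacking many shots helps can be shown with synthetic data: averaging N frames of the same scene cuts random noise by roughly a factor of sqrt(N). This sketch uses a fake signal and fake noise, and a plain mean stack rather than the drizzle algorithm mentioned above (drizzle also reconstructs sub-pixel detail, which a simple mean does not).

```python
import numpy as np

rng = np.random.default_rng(4)
true_signal = np.linspace(0, 1, 64).reshape(8, 8)  # hypothetical scene

def fake_exposure():
    """A synthetic sub-exposure: the same signal plus random noise."""
    return true_signal + rng.normal(0, 0.2, true_signal.shape)

# Mean-stack 25 sub-exposures; noise averages down by about sqrt(25) = 5x.
stack = np.mean([fake_exposure() for _ in range(25)], axis=0)

err_single = np.abs(fake_exposure() - true_signal).mean()
err_stack = np.abs(stack - true_signal).mean()
```

This is the whole reason for spending 12-20 hours on one object: the signal is the same in every frame, but the noise is different every time, so stacking keeps the former and washes out the latter.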