For this example, I’m using a recent set of shots I took to put together a final shot for NGC6853 (the Dumbbell Nebula). A friend of mine asked a while ago:
Q: ‘So are you compositing multiple shots together or taking differing shots across spectrums? Quite curious how this works’.
I didn’t like my answer much, because examples are always going to beat a long, wordy explanation. Here’s the wordy part anyway, notwithstanding that last statement.
A: Yes, both. The CCD chip in your everyday camera has infra-red and ultraviolet blocking filters, and maybe some colour correction, in front of it. In daylight there’s also enough light to take a short exposure, and the ‘smarts’ in the camera do some post-processing and present you with a picture that looks like what your eye would see.
With astronomy cameras, the CCD chip is exposed to the full spectrum of light, and you need to add filters and take many shots so that the signal-to-noise ratio is as high as possible. You can see in this picture some graininess around the faint nebula areas – with more exposures, and longer exposure times, you can make that smoother and the detail clearer.
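To make that concrete, here’s a minimal simulation (Python with NumPy – not the software I actually use, and the signal and noise values are made up for illustration) of why more exposures smooth the grain: the random noise in each shot averages out across frames, so the signal-to-noise ratio grows roughly with the square root of the number of exposures.

```python
import numpy as np

rng = np.random.default_rng(0)

signal = 100.0       # hypothetical per-exposure signal (arbitrary units)
noise_sigma = 30.0   # hypothetical per-exposure random noise

for n in (1, 4, 16, 64):
    # Each exposure is the same signal plus fresh random noise;
    # stacking simply averages the exposures together.
    frames = signal + rng.normal(0.0, noise_sigma, size=(n, 100_000))
    stacked = frames.mean(axis=0)
    # Noise in the stack shrinks as sqrt(n), so SNR grows as sqrt(n).
    print(f"{n:3d} exposures: SNR ~ {signal / stacked.std():.1f}")
```

Going from 1 to 64 exposures improves the signal-to-noise ratio roughly eightfold – that’s the smoothing of the faint, grainy regions described above.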
I use many techniques, including stacking shots together with astronomy software (Nebulosity, Maxim DL, GCX) and then often doing more post-processing in GIMPShop (a free, Photoshop-styled variant of GIMP that runs on Ubuntu). I’ll often take ‘darks’, which show me the inherent noise in the CCD chip from the electrical current flowing across it, and subtract that from the ‘light’ shots (where the CCD is exposed to light). I’ll often take shots with red, blue, green and yellow filters over the front to capture those parts of the spectrum as clearly as possible, and then there are the H-alpha and Oxygen (OIII) wavelengths that reveal other detail.

Stack all that together, put your interpretation of colour on it, and that’s how every photo of every nebula and galaxy is made. Even Hubble shots have processing like this done over them. It’s a rough truth, but to our eyes, all that you see in space is greyish-green light. Everything else is a function of light refracting in some sort of medium (the atmosphere).
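As a sketch of the dark-subtraction and stacking step – this is not what Nebulosity or Maxim DL actually run internally, just a NumPy illustration of the idea – the darks are averaged into a ‘master dark’, that fixed noise pattern is subtracted from every light frame, and the calibrated lights are then median-stacked:

```python
import numpy as np

def calibrate_and_stack(lights, darks):
    """Dark-subtract and stack a set of exposures.

    lights, darks: arrays of shape (n_frames, height, width), where
    the darks were shot at the same exposure time with the cap on.
    """
    # Average the darks into a 'master dark' that maps the sensor's
    # thermal/electrical noise pattern.
    master_dark = darks.mean(axis=0)
    # Remove that fixed pattern from every light frame.
    calibrated = lights - master_dark
    # A median stack also rejects one-off outliers such as
    # cosmic-ray hits and satellite trails.
    return np.median(calibrated, axis=0)
```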
So all photos of space are adjusted to let us see the different wavelengths in false colour – wavelengths our eyes would not normally be able to see at all.
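For the colour interpretation itself, one common convention (just one palette among many – the choice is up to you) is to map each filtered stack onto an RGB channel. A minimal sketch, assuming you already have stacked greyscale H-alpha and OIII images as 2-D arrays (the function and variable names here are my own, for illustration):

```python
import numpy as np

def false_colour(h_alpha, oiii):
    """Map two narrowband greyscale stacks into one RGB interpretation.

    H-alpha (656 nm, deep red) goes to the red channel; OIII (501 nm,
    teal) goes to both green and blue. This is an interpretation of
    colour, not what the eye would see.
    """
    def stretch(img):
        # Normalise a channel to the 0..1 display range.
        lo, hi = img.min(), img.max()
        return (img - lo) / (hi - lo + 1e-9)

    r = stretch(h_alpha)
    g = stretch(oiii)
    b = stretch(oiii)
    return np.stack([r, g, b], axis=-1)
```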
So how does it work? Here are some quick examples. These were taken with a QHY8 one-shot colour camera, so I don’t need to take multiple shots through colour filters. The trade-off is lower sensitivity at various wavelengths and the possibility of one band of colour being out of focus (which can be corrected by shooting in black and white with filters in place). I also have to take longer exposures to get an acceptable amount of detail, which means keeping my equipment tracking for longer than I would otherwise need.
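For reference, a one-shot colour camera like the QHY8 gets its colour from a Bayer filter mosaic printed over the sensor, which is why no external colour filters are needed – and also why some sensitivity is lost, since each pixel only sees one band. Here’s a deliberately crude half-resolution demosaic of an RGGB mosaic, just to show the idea (real capture software interpolates to full resolution):

```python
import numpy as np

def debayer_rggb(raw):
    """Crude half-resolution demosaic of an RGGB Bayer mosaic.

    raw: 2-D sensor array (even dimensions assumed) where even rows
    alternate R,G and odd rows alternate G,B. Each 2x2 cell becomes
    one RGB pixel.
    """
    r = raw[0::2, 0::2]
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0  # average the two greens
    b = raw[1::2, 1::2]
    return np.stack([r, g, b], axis=-1)
```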