Astrophotography Filters part 3: Putting it all together

This is part 3 of my series on astrophotography with filters. Last time I explained how the human eye sees in colour, using three different types of cone cells. We use one type for sensing red light, one for green light and one for blue light. The different channels are then recombined in our brains to produce colour images.

True colour images project the red part of an image onto the red-sensitive cones in your retinas, the green part onto the green-sensitive cones, and the blue part onto the blue-sensitive cones.

False colour images muck about with this, sometimes sending blue or green light to your red-sensitive cones, and so on. At other times they don’t start with red, green or blue components at all, but use highly specific “narrowband” filters to isolate different kinds of light and show objects differently.
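If you’d like to see just how mechanical that remapping is, here’s a tiny sketch in Python (using NumPy and Pillow, with a made-up file name). A colour image is just three monochrome channels stacked together, and a false colour image simply feeds different data into those same three channels.

```python
import numpy as np
from PIL import Image

# Load an ordinary colour photo and split it into its three channels.
# "convent.jpg" is a placeholder file name.
rgb = np.asarray(Image.open("convent.jpg").convert("RGB"))
r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

# True colour: each channel drives its matching channel on the screen.
true_colour = np.dstack([r, g, b])

# One possible false colour: shuffle which data drives which channel.
false_colour = np.dstack([g, b, r])

Image.fromarray(false_colour).save("convent_false_colour.jpg")
```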

After talking about this with my wife, she asked me what a familiar sight would look like in false colour, and I had absolutely no answer.

So I went and did the experiment myself.

What image did I decide on?

First, I needed an image. It had to be something familiar enough that everyone could relate to it. It also had to be a landscape: I’m no slouch when it comes to setting up and taking astronomical photos, but the whole process takes an hour or so, so I wasn’t going to be able to photograph people or other things that moved.

My sister happens to live in a house with a brilliant view – miles better than the freeway I can see from my place. She can see an old convent across the valley. Its owner converted it into an aged care facility, but everyone still calls it the convent. The convent is surrounded by trees of various sorts, and to its left there’s a mobile phone tower. Behind this, Melbourne’s suburbia stretches as far as the Dandenong Ranges.

I decided that this view fitted the bill just right.

This is what the convent looks like with a DSLR and a 500mm birdwatching lens. It’s a familiar enough scene, with green trees, a sandstone building, a concrete tower and the distant background fading to blue thanks to light scattering in the atmosphere (the same effect that makes the sky blue).

The convent, taken with a DSLR (colour camera)

Once I’d set up, I started taking photos through my telescope. I have a bunch of filters in a filter wheel, so after I’d framed the shot, I rattled off monochrome photos through each of the filters.

Broadband filters

To start with, I wanted a true colour photo. To build this, I needed the images I’d taken through red, green and blue filters. Here they are.

Black and white photos taken through red, green and blue filters

As you can see, they’re similar – like the photos of me mucking about with a telescope back in part 2. You’ll notice that blue areas in the colour version (like the background) are brighter in the blue filter image. Red areas (like the wall on the left of the convent, and some of the roofs) show up brightest in the red filter image. That’s the way the filters work.

True colour “RGB” image

When I got home, I recombined these three filter images to produce a new “true colour” image. Here it is. It’s obviously very similar to the original DSLR photo, but because I’ve done the processing rather than the camera, I’ve been able to make it a bit more vibrant. I hope you prefer it too, or my efforts will have been for nothing!

Red, green and blue filters recombined to form a "true colour" (RGB) image
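For the curious, here’s a rough sketch of that recombination step in Python (NumPy and Pillow again, with placeholder file names). The simple percentile stretch stands in for the rather fiddlier processing I actually did to boost the vibrancy.

```python
import numpy as np
from PIL import Image

def load_mono(path):
    """Load one monochrome filter exposure and scale it to [0, 1]."""
    arr = np.asarray(Image.open(path).convert("F"))
    return arr / arr.max()

def stretch(channel, low=1, high=99):
    """Simple percentile stretch so each channel uses the full range."""
    lo, hi = np.percentile(channel, [low, high])
    return np.clip((channel - lo) / (hi - lo), 0, 1)

# One monochrome exposure per broadband filter (placeholder file names).
red   = stretch(load_mono("convent_red.tif"))
green = stretch(load_mono("convent_green.tif"))
blue  = stretch(load_mono("convent_blue.tif"))

# Map each filter to its matching channel to build the "true colour" image.
rgb = (np.dstack([red, green, blue]) * 255).astype(np.uint8)
Image.fromarray(rgb).save("convent_rgb.png")
```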

Narrowband filters

After I got the red, green and blue “broadband” images, I took “narrowband” images. These, if you remember, were taken through the sulphur (SII), hydrogen alpha (Hα) and oxygen (OIII) filters, and here they are.

Black and white images taken through sulphur, hydrogen and oxygen filters.

First, notice that the sulphur and hydrogen images are quite similar. That’s because both filters sit at the deep red end of the spectrum. The interest (and the science) is in the differences between the two. The oxygen image is quite different, though, and is reminiscent of the blue broadband image.

Second, notice that some filters bring out details that others, especially the broadband filters, missed. Have a look at the background mountain, and compare the detail that the hydrogen filter, and especially the sulphur filter, produced. Now have a look back at what the blue broadband filter produced.

The oxygen filter punched through all the scattered haze in that blue area and saw the trees on the mountainside. The blue filter just saw blue light, even though the oxygen and blue filters are similar colours. Amazing!

This is because the oxygen filter (501nm) excludes light that is merely “nearly” oxygen (say, 495nm), while the blue filter lets it all in. That extra light just swamps the detail we actually want.

To give an analogy, let’s imagine you’re looking at a plate of spaghetti. You want to follow one strand from end to end. It’s impossible with all that other spaghetti around and on top of your strand. But if you remove all the other strands, the one you want is right there looking back at you.
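To put rough numbers on that, here’s a back-of-the-envelope sketch. The bandwidths are typical catalogue figures (a narrowband oxygen filter passing a few nanometres around 501nm, a broadband blue filter passing something like 100nm), not measurements of my particular filters.

```python
# Rough comparison of how much "everything else" each filter lets through,
# assuming the unwanted haze is spread evenly across the visible band.
OIII_BANDWIDTH_NM = 3      # typical narrowband oxygen filter
BLUE_BANDWIDTH_NM = 100    # typical broadband blue filter
VISIBLE_BAND_NM = 300      # roughly 400-700nm

oiii_fraction = OIII_BANDWIDTH_NM / VISIBLE_BAND_NM
blue_fraction = BLUE_BANDWIDTH_NM / VISIBLE_BAND_NM

print(f"OIII filter admits about {oiii_fraction:.0%} of the broadband haze")
print(f"Blue filter admits about {blue_fraction:.0%} of the broadband haze")
# The blue filter lets in roughly 30x more out-of-band light, which is
# what washes out the faint detail on the mountainside.
```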

False colour image

Back home, I recombined the narrowband images like this:

  • Red channel = Sulphur II
  • Green channel = Hydrogen alpha
  • Blue channel = Oxygen III

This is commonly known as an “SHO” image. It’s also known as “the Hubble Palette”, because a lot of images from the Hubble Space Telescope are presented this way.

So here it is.

Narrowband filters recombined to produce a "false colour" (SHO) image

There’s a lot I could say about this photo, but first, I was surprised at how “normal” it looked. The background is mainly blue, and the trees are basically green.

Of course, that’s only because the Hubble Palette has oxygen (which is, if you remember, a teal blue) being mapped to the blue channel, so blue things stay blue. I did a version which mapped the channels differently (“HOS”, where oxygen goes to green), and the picture looked pretty weird. The sky was green and the trees were purple.
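If you’re wondering how much work is involved in swapping palettes, the answer is: hardly any. Here’s a sketch (Python with NumPy and Pillow, placeholder file names, and none of the careful stretching a real composite needs). The narrowband frames stay the same; only their assignment to the red, green and blue channels changes.

```python
import numpy as np
from PIL import Image

def load_mono(path):
    """Load one narrowband exposure and scale it to [0, 1]."""
    arr = np.asarray(Image.open(path).convert("F"))
    return arr / arr.max()

# Placeholder file names for the three narrowband exposures.
sii  = load_mono("convent_sii.tif")    # Sulphur II
h_a  = load_mono("convent_ha.tif")     # Hydrogen alpha
oiii = load_mono("convent_oiii.tif")   # Oxygen III

def compose(r, g, b, out_path):
    """Stack three monochrome frames into the R, G and B channels."""
    rgb = (np.dstack([r, g, b]) * 255).astype(np.uint8)
    Image.fromarray(rgb).save(out_path)

# SHO ("Hubble Palette"): sulphur -> red, hydrogen -> green, oxygen -> blue.
compose(sii, h_a, oiii, "convent_sho.png")

# HOS: hydrogen -> red, oxygen -> green, sulphur -> blue. Same data,
# different mapping - this is the version with the green sky and purple trees.
compose(h_a, oiii, sii, "convent_hos.png")
```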

But look again at the details. Can you see the high-tension electricity pylon in the background? It shows up far more sharply in the narrowband image than in the broadband one. Actually, if you look back, both the narrowband and the broadband (RGB) images are better than the DSLR image.

The same goes for the background. All that detail that the oxygen filter was able to discern in the mountainside has come through in the narrowband image.

What science can we do with all this?

A scientist might use a narrowband image to search for tiny patches of different elements. She would then try to explain the details of what she saw.

When I looked at the details of the narrowband composite image, I found very few areas that are actually red. These would be things in the picture that show up bright in sulphur (mapped to red) but not in hydrogen (mapped to green). But sharp-eyed viewers will notice that there is something red. It’s to the right of the mobile phone tower, about halfway up the picture. I don’t know what it is, but it’s a highly specific colour that the hydrogen filter cuts out but the sulphur filter allows through.

Once she finds something like this, the scientist starts asking what it might be and why it shows up like that.
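A crude version of that hunt can even be automated. The sketch below (placeholder file names again) flags pixels that are much brighter through the sulphur filter than through the hydrogen filter, which is exactly the sort of thing that shows up red in an SHO composite. The thresholds are pulled out of thin air.

```python
import numpy as np
from PIL import Image

def load_mono(path):
    """Load one narrowband exposure and scale it to [0, 1]."""
    arr = np.asarray(Image.open(path).convert("F"))
    return arr / arr.max()

sii = load_mono("convent_sii.tif")
h_a = load_mono("convent_ha.tif")

# Flag pixels that are bright in sulphur but comparatively dim in hydrogen.
# The thresholds are arbitrary - tune them to taste.
candidates = (sii > 0.5) & (sii > 2 * h_a)

print(f"{candidates.sum()} pixels are 'sulphur-bright'")

# Save a mask so the interesting patches can be found by eye.
Image.fromarray((candidates * 255).astype(np.uint8)).save("sulphur_mask.png")
```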

Isaac Asimov is often credited with the observation that the most exciting phrase in science is not “Eureka!” but “that’s funny…”.

The scientific method

Observations lead to questions, which can lead to other observations. They can also lead to scientists proposing new theories or rejecting old theories. This is how humans learn. We call this “the scientific method”, and it’s as awesome as anything else I’ve talked about here.

There are other details that probably don’t show up in the small versions of the images I’m able to put on this blog. For example, the tower has cable stays that just don’t show up in any of the photos apart from those taken through the hydrogen narrowband filter and the red broadband filter. Perhaps it’s because these filters produce less diffraction, or maybe the diffraction that does occur is more consistent.

See? I’m proposing hypotheses. Now I can make more observations to test them. I’m off back to my sister’s place!

Conclusion

In this final part of my blog, I actually went out and did some science. I took careful observations, analysed the results and proposed a couple of explanations for what I found.

I hope you’ve enjoyed these blogs, and that you’ve learned something about colour filters, narrowband filters, recomposition and the science that might come from all this.

Further reading

Starizona’s “Narrowband Imaging” page explains all this in more detail, including some of the physics.

If you want to learn more about the scientific method, try The Sleepwalkers: A History of Man’s Changing Vision of the Universe by Arthur Koestler.

