Tricks for (Astro)Treats

OK y’all, I know it’s been a bit since my last post. After a few weeks of good weather and 3 good images collected, I ended up with the flu and an ear infection (thanks kids!) and have been catching up a bit rather than worrying about the site. My bad. I know you were waiting with bated breath for me to post another update, right?

I jest, I jest. So, the most recent image I worked on was another dataset from TelescopeLive, the remote astrophotography service that (conveniently for me) has several observatories in the Southern Hemisphere shooting objects I cannot see from my house. Cool objects, too, that I would love to image myself one day. One of those is the Rim Nebula, aka the Dragons of Ara, a neat object I wanted to add to my portfolio. The dataset was also a great example of one of the tricks we use in AP to bring out the feature object even when a dense starfield threatens to overwhelm the image.

When you use longer exposures, 10 minutes in this case, the stars can overwhelm your data as you try to gather enough photons from the object itself. Here's what I mean. Below is the stacked but completely unprocessed image of the H-alpha signal from that object. You can see how dense the starfield is; the nebulosity, the part we actually want to see, is almost an afterthought.

Unedited, but stacked Ha image

Now, the Ha is really strong here, as is typical for emission nebulae, so you can see the object fairly clearly. In the other emission lines collected (Sii, Oiii), the object is sometimes much dimmer than it is here. To get the image processed, one of the tricks we use is star removal, even at the initial stages of processing.

The reason for that is that as we process the image, mainly by "stretching" and other manipulations of color, the stars get thrown out of whack in both color and intensity. For those of you who don't want to click and read the stretching link: we are manipulating the image signal from linear to non-linear to bring the dim objects to life.
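If you're curious what a stretch actually does mathematically, here's a rough sketch in Python of one common flavor, the midtone transfer function. The function name and the midtone value below are just my own illustration, not any particular tool's defaults:

```python
import numpy as np

def mtf_stretch(img, midtone=0.15):
    """Midtone transfer function: a non-linear stretch that maps the
    chosen `midtone` pixel value to mid-gray (0.5) while keeping
    black at 0 and white at 1. Assumes `img` is a float array
    normalized to [0, 1]."""
    x = np.clip(img, 0.0, 1.0)
    return ((midtone - 1.0) * x) / ((2.0 * midtone - 1.0) * x - midtone)

# A faint pixel sitting at the midtone value gets lifted to mid-gray,
# which is what "brings the dim stuff to life":
linear = np.array([0.0, 0.15, 0.5, 1.0])
stretched = mtf_stretch(linear, midtone=0.15)
```

You can see why the stars suffer: a bright star core near 1.0 barely moves, but everything in its halo gets pushed way up, bloating the star and shifting its apparent color.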

There are some great tools for this, some paid and some open source (comparing them is outside the scope of this article), but when we remove all the stars, we get this:

Ha starless

As you can see, this lets us work on the main target without fear of blowing out the stars or turning them weird colors while we make changes to the base signal.

These tools also generate a separate image of just the stars, which you can leave in a natural state and re-integrate later. Completely starless images have become popular in their own right thanks to these tools, but the main function is to isolate the feature object, do your edits to it (some other processing steps can mess up the stars as well), and re-integrate natural-looking stars at the end.
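The re-integration step at the end is often just simple pixel math. Here's a rough Python sketch of the common "screen" blend (the function name is my own illustration; in real tools you'd typically type the equivalent expression into a pixel-math dialog):

```python
import numpy as np

def screen_blend(starless, stars):
    """Re-integrate a star image into a processed starless image using
    a screen blend: 1 - (1 - a) * (1 - b). Bright star pixels dominate,
    dark sky pixels leave the nebula untouched. Both inputs are assumed
    to be float arrays normalized to [0, 1]."""
    a = np.clip(starless, 0.0, 1.0)
    b = np.clip(stars, 0.0, 1.0)
    return 1.0 - (1.0 - a) * (1.0 - b)
```

The nice property is that where the star image is black (most of the frame), the result is exactly the processed nebula, so your edits pass through unchanged.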

Once your processing is done, you end up with something like this:

starless processed image of the Dragons

Through combinations of the color channels, you can do quite a bit to enhance the detail and appearance of the object. If I had left the stars in here, there would be a bunch of yellow and magenta stars, which look completely unnatural and distract from the image. We process those in the normal spectrum so they look right. Now you may say to yourself, "isn't this cheating? Or fake?" Well, I guess that depends on your perspective. A lot of stuff in space would look the same every time because of the spectrum of emissions available, and that gets boring. Alternate color palettes are something NASA/ESA processors have been doing for some time, especially with Hubble data, hence the nickname for SHO mapping, the "Hubble palette." It's a way for us to breathe life into these images and see the fantastic objects that are out there in the cosmos.
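For the curious, the SHO mapping itself is conceptually simple: each narrowband channel just gets assigned to a color slot. A bare-bones Python sketch (the function name is mine, and real processing involves a lot more weighting and curve tweaking than this):

```python
import numpy as np

def hubble_palette(sii, ha, oiii):
    """Combine three narrowband channels into an RGB image using the
    'Hubble palette' (SHO) assignment: Sii -> red, Ha -> green,
    Oiii -> blue. Each input is a 2-D float array of the same shape."""
    return np.stack([sii, ha, oiii], axis=-1)
```

Since Ha is usually by far the strongest signal, a naive combine like this comes out very green, which is why so much of SHO processing is about rebalancing the channels afterward.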

Once we feel comfortable with the feature object, we can add the stars back in, and we generally use some processes to reduce how prominent they are so attention stays on the feature object.

Cool, huh? I will be straight with you: the gear and image acquisition are the fun part, and not difficult. This is the hard part, and even getting to my level of processing ability (which isn't great) has taken quite a bit of time.

What’s next?

Well, the weather has not been great, and while I was excited for Jupiter to be super bright this season, the seeing conditions (atmospheric turbulence) were really bad. That led to some unimpressive images even by my standards. I will keep at it, since planetary imaging works around spotty clouds: I am imaging for minutes rather than 18-24 hours.

This is a weird time for my skies: the main nebula regions are either up late or set too early, so I have to wait a bit to get those winter targets into good imaging position. I have a few things in mind for the coming weeks, so hopefully the weather will cooperate as we get into those nice long nights.

Dear reader, I hope all is well with you… talk soon,

-M
