Racing against the Moon, Sun, and the meridian, on two consecutive nights I captured Panel G in luminance, red, green, and blue. The mosaic seen above consists of all six panels in luminance only.
Now having all data in LRGB, I created Panel G in color. The image above shows accurate star color and also variations in the arms of the galaxy as expected.
Try as I might, I could not get Astro Pixel Processor (APP) to create the grand mosaic consisting of five monochrome panels and one color panel. At the very end of the process it complained that it could not create a mosaic consisting of just one panel.
In a couple of weeks, when the Moon recedes, I will capture RGB data for Panel F on the first night. Hopefully that will satisfy APP. With luck I’ll get a string of clear nights as fall approaches.
If you have been following the progress of the mosaic and you’ve wondered how I derived the panel naming scheme, this explains it:
That screenshot came from a program called Computer Assisted Astronomy (C2A) by Philippe Deverchère. I highly recommend it. It offers the ability to create User Catalogs. I created a catalog named “M31 Mosaic” that contains 8 records. A record defines each of the eight rectangles (or panels) that you see. The size of each panel represents the field-of-view of my Atik 314E camera and William Optics ZenithStar 71 telescope. Furthermore, I adjusted the position and orientation of each panel so as to provide sufficient overlap to satisfy Astro Pixel Processor (APP).
The body of the galaxy is represented by the large ellipse. As I’ve come to learn, the perimeter of the ellipse traces the very outer edges of the galaxy, which are not visible in my mosaic, so I’ve decided not to capture panels H and D. Perhaps one day, after I acquire more data, I will go back and capture them.
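As an aside, here is a rough sketch of the arithmetic behind the panel sizes. The pixel count and pixel size are the figures commonly quoted for the Atik 314E, and the focal length follows from 71 mm at f/5.9; treat these numbers, and the overlap fraction, as assumptions for illustration rather than exact values from my catalog:

```python
import math

# Rough panel-size arithmetic for mosaic planning.
# Assumed values for illustration: ~1392 x 1040 pixels at 4.65 um for the
# Atik 314E, and a 71 mm f/5.9 refractor (~419 mm focal length).
FOCAL_LENGTH_MM = 71 * 5.9
PIXEL_SIZE_UM = 4.65
WIDTH_PX, HEIGHT_PX = 1392, 1040
OVERLAP = 0.20  # placeholder overlap fraction to keep APP happy

def fov_arcmin(n_pixels: int) -> float:
    """Field of view along one axis, in arcminutes."""
    sensor_mm = n_pixels * PIXEL_SIZE_UM / 1000.0
    return math.degrees(2 * math.atan(sensor_mm / (2 * FOCAL_LENGTH_MM))) * 60

fov_w, fov_h = fov_arcmin(WIDTH_PX), fov_arcmin(HEIGHT_PX)
print(f"Panel size: {fov_w:.1f}' x {fov_h:.1f}'")  # roughly 53' x 40'
print(f"Panel-center spacing: {fov_w*(1-OVERLAP):.1f}' x {fov_h*(1-OVERLAP):.1f}'")
```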
I captured three additional panels last night: C, E, and F. The plan was to also capture G, but a cloud bank interfered with F. I waited for the clouds to pass, but by then G was no longer possible due to the meridian and approaching dawn.
The weather forecast says that this coming Saturday may be favorable. It will be my last chance to capture G for two weeks due to the Moon.
Six more panels to go in Luminance! And then to do eight panels in Red, eight panels in Green, and eight panels in Blue. Lots to do, but the results are encouraging.
For perspective here is Panel A:
and here is Panel B:
Astro Pixel Processor (APP) for image processing and mosaic creation. INDI/Ekos used for image capture.
Dew is condensed water vapor that likes to form on optical surfaces whenever the relative humidity is high. The featured image shows a dew drop that formed on my camera’s sensor window shortly after I turned on the thermo-electric cooler. It ruined a night’s worth of astrophotography because I did not have a plan in place.
My first experience with dew was two years ago. The sky was clear but I could see my breath and the grass was glistening with moisture. When I began my imaging run the computer screen showed bright stars and dark space but as the night wore on I noticed that the image looked increasingly less defined. When I shined a light on the objective lens of my refractor I saw that it was fogged over.
Don’t let this happen to you. Clear nights are hard to come by, so don’t waste one by not having a dew management plan. There are several vendors who have solutions that consist of heater bands and a controller for adjusting the temperature. Me, I am a Do-It-Yourself person, so I like to build my own. I’ve built a couple of homemade heater strips as described at the DewBusters website:
I don’t have a controller, so if you build a heater strip to those specifications you will find that it runs hot with a constant 12 VDC across it. These days I build strips with half the number of resistors so that they run at half power. I solved the dew problem on my sensor window by building a small heater strip that fits nicely around the camera’s nose-piece, secured with Velcro.
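If you want to sanity-check a strip before soldering, the power arithmetic is simple: identical resistors in parallel across a constant 12 VDC each dissipate V²/R, so halving the resistor count halves the total power. The resistor value and count below are placeholders for illustration, not the DewBusters specification:

```python
# Back-of-the-envelope dissipation for a DIY heater strip: identical
# resistors wired in parallel across a constant 12 VDC supply.
SUPPLY_V = 12.0

def strip_power_watts(n_resistors: int, ohms_each: float) -> float:
    """Total dissipation of n identical parallel resistors, in watts."""
    # Each resistor sees the full supply voltage, so powers simply add.
    return n_resistors * SUPPLY_V ** 2 / ohms_each

full = strip_power_watts(10, 150.0)  # hypothetical "full" strip
half = strip_power_watts(5, 150.0)   # same strip with half the resistors
print(f"Full strip: {full:.1f} W, half strip: {half:.1f} W")  # 9.6 W vs 4.8 W
```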
The other night as I prepared to image the Eastern Veil Nebula I purposely disconnected the dew heaters earlier in the evening to conserve battery power as I waited for the target to rise above the treeline. When the time came I engaged the cooler, waited 20 minutes for the temperature to stabilize, and then took a test shot. Right away I knew what the problem was. I ran outside, reconnected the dew heaters and after only five minutes I could see the dew had evaporated.
Astro Pixel Processor (APP) has a powerful tool for creating mosaics. Last night I tested it out. The result is stunning:
My camera has a small field-of-view. It cannot fit the entire nebula in one shot. It must be broken up into two panels. Here is a screenshot of my planetarium software C2A. I used it to plan where to position the telescope. The two overlapping red rectangles indicate the framing. As shown there must be some overlap in order for APP to do its magic:
Each panel consists of 50 individual images using a 73-second exposure. The first step in creating the mosaic is to stack those 50 images to create the upper panel:
Followed by the lower panel:
By the way, you may have noticed when you enlarge each image that the stars look square. That is due to my choice to capture each image with 2×2 binning, essentially reducing each 2×2 block of pixels to one pixel. I did that to boost the signal-to-noise ratio at the cost of resolution. The luminance filter was used for all images; no narrowband.
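For the curious, here is a minimal software sketch of what 2×2 binning does to the data (my camera actually does it on-chip during readout):

```python
import numpy as np

def bin2x2(frame: np.ndarray) -> np.ndarray:
    """Collapse each 2x2 block of pixels into one pixel by summing."""
    h, w = frame.shape
    h, w = h - h % 2, w - w % 2  # drop any odd edge row/column
    return frame[:h, :w].reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

# Example: a fake 1040 x 1392 frame of Poisson (photon) noise.
frame = np.random.poisson(lam=100, size=(1040, 1392)).astype(float)
binned = bin2x2(frame)
print(frame.shape, "->", binned.shape)  # (1040, 1392) -> (520, 696)
```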
This summer and fall I plan to create a 15-panel color mosaic of the Andromeda Galaxy.
The Cocoon Nebula lies in the constellation Cygnus in one of the nearby arms of our Milky Way galaxy. In the distant past the nebula gave birth to a cluster of highly energetic stars. The energy from those stars ionizes the hydrogen gas, causing it to glow red. The technical classification of this nebula is IC 5146, a bright emission nebula, but there is another nebula, a dark nebula named Barnard 168. You can see hints of it immediately surrounding IC 5146, a region relatively devoid of stars that extends to the upper right-hand corner of the frame. In fact this dark nebula extends a great distance beyond what you see here. Do an internet search of “Cocoon Nebula” to see wide-field images that show it. If you live atop a mountain or somewhere with exceptionally clear skies away from city lights, dark nebulae can be seen as smoky gray regions.
This was an experiment that fortunately succeeded. I say fortunately because it enables me to practice both astrophotography and photometry using only four filters instead of the usual six. Normally astrophotography requires four filters: luminance, red, green, and blue (LRGB for short). Photometry requires a minimum of two filters: “V” and “B”.
My filter wheel has only five slots. So how did I fit six filters into five slots? I didn’t. I simply replaced the G and B filters with the photometric V and B. I call it LRVB instead of LRGB.
The photometric V filter looks green when you hold it up to light and the B filter looks blue. I knew for a fact that I needed to “white balance” them in order to determine the proper exposure for each. I performed that task last week. I thought it would end there but I was mistaken.
When the time finally came to process all of the images I was disappointed. The colors were muddy looking. What was the problem? The answer lies in the dissimilar spectral response of the filters. The traditional G filter passes light between 500nm and 600nm whereas the photometric V filter passes light between 475nm and 650nm. So the V filter passes some light into what is traditionally the blue and red bands! Furthermore the photometric B filter is slow to pick up light in the blue band but is aggressive in deep blue to ultraviolet.
The solution was found in Astro Pixel Processor (APP), which provides a tool to combine the individual LRVB stacks into a single color composite image. Originally I told APP to assign 100% of the V stack to the green channel, but that resulted in muddy colors. This time I told it to assign 75% to the green channel and 25% to the blue channel. That was the solution!
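To make the recipe concrete, here is a minimal sketch of that channel assignment. It is not APP’s internal algorithm; it just assumes registered, normalized single-channel arrays for the R, V, and B stacks:

```python
import numpy as np

def combine_rvb(r_stack, v_stack, b_stack, v_to_green=0.75, v_to_blue=0.25):
    """Build an RGB composite with the photometric V stack split
    between the green and blue channels."""
    red = r_stack
    green = v_to_green * v_stack
    blue = b_stack + v_to_blue * v_stack
    rgb = np.stack([red, green, blue], axis=-1)
    return np.clip(rgb / rgb.max(), 0.0, 1.0)  # crude rescale for display

# Usage with placeholder data standing in for the real stacks:
rng = np.random.default_rng(1)
r, v, b = (rng.random((100, 100)) for _ in range(3))
color = combine_rvb(r, v, b)
print(color.shape)  # (100, 100, 3)
```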
The technical details
William Optics 71mm f/5.9
Atik 314E CCD (cooled but not set-point)
Optolong Luminance and Red filters
Astrodon Photometric V and B filters
Unitron Model 142 German Equatorial Mount
Tracking: Own design Permanent Periodic Error Correction (PPEC) using stepper motor and Raspberry Pi Model 3B
Flat-fielder: Own design “The Flatinator”
Exposure:
Luminance (binning 1×1): 70x 60s using Optolong Luminance filter
Red (binning 2×2): 70x 73s using Optolong Red filter
Green (binning 2×2): 70x 45s using Astrodon Photometric V filter
Blue (binning 2×2): 70x 92s using Astrodon Photometric B filter
Flats: 50 each filter
Darks: 50 each filter
Bias: 100x 1ms
Total Integration Time: 5.25 hours
Captured with Astroberry/INDI/Ekos on Raspberry Pi Model 3B+. Processed in Astro Pixel Processor (APP) and GIMP. White Balancing using a method described by Al Kelly: “White Balancing RGB Filters with a G2V Star”
Bortle 5 site Transparency: Average Seeing: Average
The Dumbbell Nebula was discovered in 1764 by famed French astronomer and comet hunter Charles Messier. It is the 27th object in his eponymous catalog, better known as M27. The hot central star, which can be seen in this image, is in one of its last evolutionary stages. The gases were ejected about 9,800 years ago based on the expansion rate determined by a group of researchers in 1970.
The intense ultraviolet radiation from the central star causes the gas atoms of the nebula to emit light in the visible spectrum. The color of the light is significant. Red indicates hydrogen and green indicates oxygen. M27 is known as an emission nebula for that reason. Another type of nebula is a reflection nebula, which only reflects the light of nearby stars.
The color of ionized oxygen is green in my image. Other photos may show it as bluish-green or cyan. The actual color is indeed cyan. This discrepancy has to do with my camera’s filters. There are many decisions when purchasing filters, chief among them is how they handle ionized oxygen, the so-called OIII regions at the 501nm wavelength. OIII is right at the dividing line between the green and blue filters. My green filter passes nearly all of the OIII light; relatively little passes through the blue filter. Other manufacturers design their filters to pass equal amounts of OIII in both the green and blue filters, giving you cyan. This highlights the challenges of properly imaging emission nebulae. All other colors are accurate, including star colors.
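A toy example illustrates the point. This is not a calibrated color model, just a way to see how the split of 501nm light between the green and blue channels decides whether an OIII region renders green or cyan:

```python
def oiii_pixel(frac_to_green: float, frac_to_blue: float, flux: float = 1.0):
    """RGB triple produced by an OIII-only source for a given filter split."""
    return (0.0, frac_to_green * flux, frac_to_blue * flux)

print(oiii_pixel(1.0, 0.0))  # my filters: (0, 1.0, 0.0) -> pure green
print(oiii_pixel(0.5, 0.5))  # an even split: (0, 0.5, 0.5) -> cyan
```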
This image is the first done with my new $400 CCD camera: Atik 314E. It is “new” to me but the camera is actually 10 years old. This is an outstanding price for a quality CCD camera. You can easily spend $2,000 or more for newer CCD cameras.
CCD image quality is superior to CMOS in my opinion. The more experience I gain in astrophotography the more convinced I am that one size does not fit all. Before anything you must answer the question: what is my goal? If your answer is lunar and planetary imaging then CMOS is right for you. If your answer is deep-sky, like nebulae and galaxies, then CCD is the choice.
One closing remark about my image: you may have noticed a faint reddish cast to the leftmost two-thirds of the image. This is not light pollution. It is the Milky Way.
Here are the technical details for those who would like to duplicate my results:
William Optics 71mm f/5.9
Atik 314E CCD (cooled but not set-point)
Optolong LRGB filters
Unitron Model 142 German Equatorial Mount (GEM), 50 years old
Tracking: Own design Periodic Error Correction (PEC) using stepper motor and Raspberry Pi
Flat-fielder: Own design “The Flatinator”
Exposure:
Luminance (binning 1×1): 30x 60s
Red (binning 2×2): 30x 73s
Green (binning 2×2): 30x 45s
Blue (binning 2×2): 30x 61s
Flats: 50 each filter
Darks: 50 each filter
Bias (1×1): 100x 1ms
Bias (2×2): 100x 1ms
Total Integration Time: 120 minutes
Captured with INDI/Ekos running on Raspberry Pi. Processed in Astro Pixel Processor (APP) and GIMP. White Balancing using a method described by Al Kelly: “White Balancing RGB Filters with a G2V Star”
Bortle 5 site Transparency: Above Average Seeing: Average
There is a raging debate over short exposures vs long. The decision to use one over the other is multi-faceted. Here are some reasons to consider short exposures over longer ones:
1. Your mount isn’t up to the task.
2. You live in a zone where aircraft frequently buzz by.
3. You have clear skies but with intermittent clouds.
4. You are an EAA practitioner.
If none of these apply to you then you should open yourself up to the benefits of longer exposures. I’d like to present two examples. The first one has both signal and noise so it easily applies to imaging:
1. Imagine it is the dead of night and the world is asleep. You wake up suddenly and whisper to your partner: “Did you lock the front door?” Their reply is “Yes, now go back to sleep.” This conversation was possible due to a low-noise environment. Now consider a high-noise environment like Niagara Falls and try to whisper. Nope, doesn’t work. You need to raise your volume. Some people might interpret this as the reason why you need longer exposures with CCDs. This is true but please understand that CMOS and CCD both benefit from longer exposures. Imagine that you are back at Niagara Falls and your partner finally finds a voice level that you can just make out, barely. Doesn’t it seem reasonable to ask for louder voice levels just to be absolutely, positively clear what they said?
This next example is an analogy. Take it for what it is worth but I think it does a good job of explaining why increasing exposure is beneficial even when it means that you are recording high levels of sky glow:
2. Imagine that you are at an automobile drag strip. One car can travel at a maximum speed of 100 feet per second and the other at 110 fps. Assume that when the light turns green they can immediately accelerate to their maximum speed. After 1 second they have traveled 100 ft and 110 ft, respectively. Only 10 ft separate the two. After 2 seconds they have traveled 200 ft and 220 ft. Now 20 ft separate the two. You can see that the distance separating the two vehicles steadily increases as time goes by. It is the same with imaging. After 1 second you have only 10 units of brightness separating galaxy from sky glow. But after 2 seconds you have 20 units of brightness separating them. And so on.
Like I said, not everyone can run longer exposures, for the reasons I listed above, but there is one other consideration to be mindful of. Foreground stars are the brightest objects in your field of view. The longer you keep your shutter open, the greater the risk of saturating the brightest stars. The saturation level is very much dependent on your camera. CCDs handle this well because they generally have much deeper ‘wells’ than CMOS sensors and can hold more electrons (i.e. photons). But this is all a matter of taste. Personally I don’t like saturated stars, so I try to crop them out when I can.
Finally, what about stacking? Stacking helps increase signal-to-noise in both CMOS and CCD; however, keep in mind this inequality: 100x 1-second exposures do not equal 1x 100-second exposure. You will always get better results in less total time by increasing the exposure. You don’t have to go crazy. Just try doubling and then go from there.
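A minimal signal-to-noise sketch shows why the two are not equal: read noise is charged once per frame, so one hundred short frames pay it one hundred times. The object rate, sky rate, and read noise below are made-up illustration values, not measurements from my camera:

```python
import math

OBJ_RATE = 2.0     # object electrons/s/pixel (assumed)
SKY_RATE = 10.0    # sky-glow electrons/s/pixel (assumed)
READ_NOISE = 6.0   # electrons RMS per frame readout (assumed)

def snr(sub_exposure_s: float, n_subs: int) -> float:
    """Shot noise plus read noise only; dark current ignored."""
    obj = OBJ_RATE * sub_exposure_s * n_subs
    sky = SKY_RATE * sub_exposure_s * n_subs
    noise = math.sqrt(obj + sky + n_subs * READ_NOISE ** 2)
    return obj / noise

print(f"100 x 1 s : SNR = {snr(1, 100):.1f}")   # ~2.9
print(f"  1 x 100 s: SNR = {snr(100, 1):.1f}")  # ~5.7
```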
The fallacy of the perfect exposure:
Your choice of exposure is akin to a multi-lane highway: both have boundaries. Veer off one side of the road and you end up in a gully; veer off the other and you are in oncoming traffic. Likewise with imaging, too short an exposure records nothing of interest, and too long an exposure saturates stars and risks ruined frames from passing aircraft and clouds. What lane you travel in depends on your skill level and risk tolerance.
This was my first attempt at color with a monochrome camera. It was a real learning experience.
The image you see uses synthetic luminance, which I created from the RGB stacks with the help of StarTools. A couple of weeks earlier I imaged in actual luminance but made the mistake of not taking the full suite of calibration frames. That failure affected the appearance of the result, so I rejected it in favor of synthetic luminance.
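For those wondering what a synthetic luminance is, it can be as simple as a weighted sum of the registered color stacks. StarTools has its own recipe; the sketch below, with arbitrary weights, is just the simplest stand-in:

```python
import numpy as np

def synthetic_luminance(r, g, b, weights=(0.3, 0.4, 0.3)):
    """Weighted sum of registered R, G, B stacks (arbitrary weights)."""
    wr, wg, wb = weights
    return wr * r + wg * g + wb * b

# Placeholder arrays standing in for the calibrated, registered stacks:
rng = np.random.default_rng(0)
r_stack, g_stack, b_stack = (rng.random((100, 100)) for _ in range(3))
lum = synthetic_luminance(r_stack, g_stack, b_stack)
print(lum.shape)  # (100, 100)
```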
Factoid: NGC 6791 is an enigma. The stars are twice as old as our Sun but have an iron-to-hydrogen abundance ratio (metallicity) that is more than twice that of the Sun. This flies in the face of the rule of thumb that “older means metal-poor”. NGC 6791 is one of the most studied star clusters.
Technical Details:
William Optics 71mm f/5.9
Altair 290M camera (uncooled)
Optolong LRGB filters
Unitron Model 142 GEM
Passive tracking with PEC
No active guiding