Color Issues in Digicam Astrophotography

Jack Kramer

When I took my first digital camera image of M42, the Great Nebula in Orion, I was thrilled at actually being able to capture some of the nebulosity. But it was puzzling that the color of the nebulosity in the raw image was blue instead of predominantly red, as M42 "should be". I've since learned that this is a function of the sensor in the camera, and that color rendering varies from camera to camera, depending largely on which CCD chip the manufacturer uses. It's not an issue in terrestrial photography, where subjects reflect light across a broad range of wavelengths, but it can leave out an important color component in some astronomical objects.

M42 imaged with an AP 155, a Nikon CP950 (attached with a ScopeTronix Digi-T), and a ScopeTronix 40mm Plossl. Single image taken at the CP950's 8-second maximum exposure.

Photo courtesy of Arpad Kovacsy www.integram.com/astro/index.html
Digital cameras are not intended for astronomy, but for such everyday shots as vacations and family gatherings. Instead of the continuous spectrum of light in "normal" subjects, many astronomical subjects emit most of their light in a few discrete wavelengths. Emission nebulae, in particular, emit most strongly at the visible wavelengths of Oxygen III and Hydrogen-alpha (Hα).

CCD sensors tend to be very sensitive in the near infrared, though the human eye is not. So most digicams have an internal IR filter to make the result look more like what the eye sees. The Hydrogen-alpha line that is prevalent in many nebulae is close to the infrared, so it also falls victim to the camera's IR filter. The Hα line (656 nm) is on the borderline of the eye's sensitivity; that's why you need the light grasp of a large telescope to visually detect any reddish hue in M42. Since Hα is not an important wavelength for terrestrial photography, to get natural-looking daylight photographs the camera should be no more sensitive to it than the human eye is.
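To get a feel for the effect, here is a back-of-the-envelope sketch in Python with made-up numbers (the sensor sensitivity and filter transmission figures are purely illustrative, not measurements from any real camera): the camera's effective response at 656 nm is simply the sensor's native sensitivity multiplied by whatever fraction of that light the internal IR filter lets through.

```python
# Illustrative arithmetic only -- the numbers below are invented, not
# measured from any actual camera or filter.
H_ALPHA_NM = 656

sensor_sensitivity_at_ha = 0.35    # hypothetical native sensor response at 656 nm
ir_cut_transmission_at_ha = 0.20   # hypothetical fraction of 656 nm light the IR filter passes

effective_response = sensor_sensitivity_at_ha * ir_cut_transmission_at_ha
print(f"Effective response at {H_ALPHA_NM} nm: {effective_response:.2f} "
      f"({ir_cut_transmission_at_ha:.0%} of the sensor's native sensitivity)")
```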

Another example is the Oxygen III spectral lines at 496 and 501 nm, which fall between blue and green. Ideally, both the blue and green sensors in the camera's CCD chip should be equally sensitive at those wavelengths. But in fact there may be a blind spot, with little or no registration in either the blue or the green sensor, at the very narrow wavelengths a nebula emits.
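The following short Python sketch illustrates the idea with hypothetical sensitivity curves (the Gaussian centers and widths are invented for illustration, not taken from any actual camera): two cameras whose blue and green channel curves differ only slightly at their edges can record the same narrow O III line with a very different blue-to-green balance.

```python
# Hypothetical sketch: model blue and green channel sensitivities as
# Gaussian curves and compare each channel's response to the narrow
# O III line at 500.7 nm for two made-up cameras.
import numpy as np

def channel_response(line_nm, center_nm, width_nm):
    """Relative response of a Gaussian-shaped color channel to a narrow emission line."""
    return np.exp(-0.5 * ((line_nm - center_nm) / width_nm) ** 2)

OIII_NM = 500.7

# (center, width) in nm for each channel -- invented values for illustration
cameras = {
    "camera A": {"blue": (460, 35), "green": (540, 40)},
    "camera B": {"blue": (470, 40), "green": (525, 45)},
}

for name, channels in cameras.items():
    blue = channel_response(OIII_NM, *channels["blue"])
    green = channel_response(OIII_NM, *channels["green"])
    tint = "greener" if green > blue else "bluer"
    print(f"{name}: blue={blue:.2f}  green={green:.2f}  -> O III renders {tint}")
```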

Finnish digicam astrophotographer Pertti Tapola relates his experience with different cameras: "I got first-hand experience of this after looking at images taken through the same Oxygen III filter with a Nikon 990 and a Canon 10D. The Nikon image was definitely greener, while in ordinary daylight photography there wasn't any distinct difference. It is simply a case of what the blue and green sensitivity curves look like at their edges."

The digicam can also give a color rendering that doesn't exist in the object you photograph. The star Gamma Delphini appears green to many observers. But stars simply don't radiate in this "impossible" green color; it's generally conceded that the perception of green is purely in the eye of the beholder. Some astrophotographers note that this star is also recorded as green by their digital cameras, which in this respect behave very much like the human eye.

Is there a solution? I've heard that somewhere on the Internet there are instructions that show how to remove the IR filter from a Nikon Coolpix camera to increase its infrared sensitivity. But this is a tricky modification, and it creates a problem when the camera is used for terrestrial photography. Sony has a model that allows the user to control whether or not the filter is active. It certainly would be useful to be able to adjust a camera's sensitivity to different wavelengths. But realize that astrophotography is a use for which consumer digital cameras were never intended.

One advantage of dedicated CCD cameras is that they take monochromatic images through various colored filters, then combine them with software that assigns to each image the color of the filter through which it was taken. This captures as many different wavelengths as possible. It is also possible to use software to adjust the color values in digital camera images. All good image-processing programs have a feature that allows you to adjust the color curves. But use this judiciously or you run the risk of introducing a color that is not actually emitted by your target object. Sort of like NASA's "false color" images!
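As a rough illustration of that combining step, here is a minimal Python/NumPy sketch, assuming three aligned monochrome frames taken through red, green, and blue filters (the file names and the per-channel gamma values are placeholders, not a recommendation): each frame is assigned to its filter's color channel, and a simple "curves"-style gamma adjustment is applied per channel.

```python
# Minimal sketch of combining filtered monochrome frames into a color image.
# Assumes three aligned, same-size frames; file names are placeholders.
import numpy as np
from PIL import Image

def load_mono(path):
    """Load a monochrome frame as a float array scaled to the 0..1 range."""
    return np.asarray(Image.open(path).convert("F")) / 255.0

red   = load_mono("m42_red_filter.png")
green = load_mono("m42_green_filter.png")
blue  = load_mono("m42_blue_filter.png")

# Each frame becomes the color channel of the filter it was taken through.
rgb = np.dstack([red, green, blue])

# A crude "curves" adjustment: per-channel gamma (values < 1 brighten a channel).
gamma = np.array([0.9, 1.0, 1.1])
rgb = np.clip(rgb, 0.0, 1.0) ** gamma

Image.fromarray((rgb * 255).astype(np.uint8)).save("m42_rgb.png")
```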

Another color problem is "bleeding". A digital camera sensor captures just one color at each pixel location; the camera's software then combines these individual pixels to produce full-color pixels. But during longer exposures, colors tend to bleed from one pixel to another. (Some newer CCD chips capture full color at every pixel of the array, giving sharper images and truer color.) Another problem is "bloating", which occurs when there are large brightness differences, for example with the Martian polar caps or a crescent Moon when trying to capture earthshine. The brighter areas look much larger than they really are. Both bleeding and bloating can be dealt with by using shorter exposures, but taking more images and stacking them.
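Here is a minimal sketch of that short-exposure stacking approach in Python, assuming the frames are already aligned and saved as ordinary image files (the file names are placeholders): averaging many short exposures builds up signal without the long single exposure that causes bleeding and bloated bright areas.

```python
# Minimal stacking sketch: average several aligned short exposures.
# File names are placeholders; a median (np.median) can be used instead
# of the mean to reject outliers such as hot pixels or satellite trails.
import numpy as np
from PIL import Image

frame_files = ["m42_001.png", "m42_002.png", "m42_003.png"]

frames = np.stack([
    np.asarray(Image.open(f).convert("RGB"), dtype=np.float64)
    for f in frame_files
])

stacked = frames.mean(axis=0)
Image.fromarray(stacked.astype(np.uint8)).save("m42_stacked.png")
```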

Published in the January 2004 issue of the NightTimes