FITS --vs-- RAW
The differences aren't as big as they may seem -- RAW files too are wrappers that contain more or less unprocessed data from the sensor array plus a bunch of additional information the camera vendor has added. And the details of each RAW format differ from camera model to camera model and from vendor to vendor. But this "mess" is hidden by the fact that we deal with only one or two camera vendors and that they have documented their format or provide libraries so other developers can read their RAW files. As a user you install one tool and never see how many different formats that RAW converter has to support.
FITS is also used to bundle together more or less unprocessed data from the sensor array plus a bunch of additional information. In contrast, when you use an astronomy CCD camera, the data in the resulting FITS file depends not only on which camera you use but also on which software is involved in the image acquisition. You can use the same camera with different tools and end up with data in different representations inside the FITS wrapper ⇒ a big difference to dSLR & RAW. And subsequent image conversion & editing tools may or may not be able to process the particular format your camera + image acquisition SW generated.
Images are just lots of numbers
The way these numbers are turned into pictures & colors is what makes this more complicated. From 256 different colors (8 bit) to 16 million colors (R, G & B, each 8 bit) to formats that use 3 * 16 bit and even larger ones. Add to this the variations of signed or unsigned integers as well as floating-point data. On top of that, some tools use compression to reduce the size of the (often huge) files -- sadly not all tools support that, or they use different methods.
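To make this a bit more concrete, here is a minimal sketch (assuming Python with astropy installed; the filename is just a placeholder) that shows what kind of numbers a given FITS file actually contains:

```python
# Sketch: inspect what kind of numbers a FITS file actually stores.
# "image.fits" is a placeholder filename.
from astropy.io import fits

with fits.open("image.fits") as hdul:
    for i, hdu in enumerate(hdul):
        if hdu.data is None:
            continue
        # BITPIX tells the storage format: 8/16/32 = integers,
        # -32/-64 = IEEE floating point. BZERO/BSCALE may shift
        # signed 16-bit storage to unsigned camera counts.
        print(i, hdu.header.get("BITPIX"),
              hdu.header.get("BZERO", 0),
              hdu.data.dtype, hdu.data.shape)
```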
Here are some of the popular choices:
- 16-bit RGB FITS (Maxim/AstroArt), uncompressed
- 16-bit RGB FITS, 3 separate "slices" within the FITS (Nebulosity), uncompressed
- 16-bit RGB FITS (ImagesPlus), compressed
- 15-bit, 3 separate files, uncompressed or PNG/TIFF
- Photoshop file, or save as 16-bit/color TIFF
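As a rough illustration -- not tied to any particular product -- here is how two of these storage styles typically look when read in Python with astropy; the filenames and the axis order are assumptions:

```python
# Sketch: two of the storage styles listed above, read with astropy.
# Filenames are placeholders; axis order may differ between programs.
import numpy as np
from astropy.io import fits

# Style A: one FITS with three color "slices" (NAXIS = 3)
data = fits.getdata("rgb_cube.fits")        # shape e.g. (3, height, width)
r, g, b = data[0], data[1], data[2]

# Style B: three separate monochrome files, one per channel
r = fits.getdata("target_R.fits")
g = fits.getdata("target_G.fits")
b = fits.getdata("target_B.fits")
rgb = np.stack([r, g, b], axis=0)           # rebuild the same kind of cube
```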
It wouldn't be fun without color
When you use a one-shot color camera, you can save the three colors in separate files, store them as separate images inside a single FITS file, or ignore the color information and let the user & software figure this out later.
What I called "let the user & software figure this out later" is quite common, and it is also what the SLR's RAW format does. Each pixel is implicitly assigned to R, G or B. And there are several "standard" arrangements -- see wiki/Bayer_filter
In a step called "demosaic" or "debayer interpolation", the image processing software has to compute the combined color from a 2x2 array of pixels. But to do that, the SW has to know which filter mask was used to capture the data. "RGGB" is widely used, not only in SLRs. But if you pick the wrong mask for your data, you end up with a false-color image and may also get annoying moiré patterns in it.
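For illustration only, here is a minimal Python/numpy sketch of the crudest possible debayer (the 2x2 "superpixel" method). Real tools interpolate to keep the full resolution, but the sketch shows why the mask string matters:

```python
# Sketch: collapse each 2x2 cell of the Bayer mosaic into one RGB pixel
# ("superpixel" debayer). The mask string (RGGB, BGGR, GRBG, GBRG)
# tells us which of the four cell positions carries which color.
import numpy as np

def debayer_superpixel(raw, mask="RGGB"):
    # Trim to even dimensions so the 2x2 cells line up
    raw = raw[:raw.shape[0] // 2 * 2, :raw.shape[1] // 2 * 2]
    # The four samples of every 2x2 cell, in reading order:
    # top-left, top-right, bottom-left, bottom-right
    cells = [raw[0::2, 0::2], raw[0::2, 1::2],
             raw[1::2, 0::2], raw[1::2, 1::2]]
    r = b = None
    greens = []
    for cell, color in zip(cells, mask.upper()):
        if color == "R":
            r = cell
        elif color == "B":
            b = cell
        else:
            greens.append(cell)
    g = (greens[0].astype(np.float32) + greens[1]) / 2.0
    return np.dstack([r, g, b])   # half-resolution RGB image
```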
What can go wrong ...
In most cases, image capture / camera applications use a 16bit or 48bit format (16 bit integer per color) and don't perform any (debayer) color interpolation. In rare cases tools use FITS' NAXIS feature to store the different colors in separate "pages" -- most of the time, data is stored as is. Or so one may think ....
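If you want to see what your capture software actually wrote, a small header check helps. Note that keyword names like BAYERPAT vary between programs and are not guaranteed to be present -- this is just a sketch with a placeholder filename:

```python
# Sketch: look for hints about how color was stored. Keyword names vary
# by capture program (BAYERPAT is common but by no means guaranteed).
from astropy.io import fits

hdr = fits.getheader("capture.fits")         # placeholder filename
if hdr.get("NAXIS") == 3:
    print("3 color planes stored in one HDU, NAXIS3 =", hdr.get("NAXIS3"))
else:
    print("single plane -- probably raw Bayer data or mono")
    print("Bayer pattern hint:", hdr.get("BAYERPAT", "not recorded"))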
Sometimes the camera images are flipped horizontally and/or vertically -- that's very helpful when viewing the images because they now appear "correct" to a (SLR) photographer. And it is no problem when you use a monochrome camera or FITS file. But when these operations are applied to color-image data before the debayer interpolation, the resulting image has the wrong colors. For example, a cell with a blue filter in front of it now sits in the place where originally a red cell was. Using the wrong debayer mask to process a transformed image results in false-color images and often also creates a lot of black pixels.
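A tiny sketch makes the effect obvious: flip an RGGB mosaic horizontally and the first 2x2 cell now reads GRBG, so the debayer mask has to change along with the flip:

```python
# Sketch: why flipping before debayering matters.
import numpy as np

mosaic = np.array([["R", "G", "R", "G"],
                   ["G", "B", "G", "B"]])
flipped = mosaic[:, ::-1]                 # horizontal flip
print("".join(mosaic[:2, :2].ravel()))    # RGGB  (original mask)
print("".join(flipped[:2, :2].ravel()))   # GRBG  (effective mask after flip)
```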
Some tools have options to choose among many different input formats before running the demosaic / debayer interpolation. Other tools assume RGGB and create images with false colors.
How to do it right ...
That's the difficult part -- and I am still working on that one. One important step forward is to know your enemy: know the possible pitfalls. Many tools offer demo versions, and you can use those to try different combinations and see whether the import & export functions are compatible.
If you use a "OSC"-camera, you should carefully look at the options to "generate color images from gray-scale FITS", "demosaic" and "deBayer". Make sure you have all the different RGGB, BGGR, .... combinations available. Some cameras use different formats -- especially video have CMY(G/S).
Personally I am not a fan of the "Do-it-All" packages because you are tied to a single (expensive) product and often to proprietary data formats. Their approach can eliminate some of the obstacles I described above, but you may regret that choice later if or when you want to switch to a different tool.
- DeepSkyStacker can read the "gray-scale" FITS & RAW files and debayer them with various masks
- AstroArt also has a number of options to debayer a "grayscale" image. And it can "synthesize" an RGB from separate files (tbd : NAXIS support)
- Nebulosity has a good reputation, but when reading an existing FITS, it turned my red, eclipsed moon into a blue one.
- There are many tools to process FITS : fits.gsfc.nasa.gov/fits_viewer.htm but those don't handle color / debayer conversions
- Irfanview is one of the "colorblind" FITS tools -- it has a handy option to convert FITS into grayscale JPG ⇒ I use that to generate preview & thumbnail images
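For those who prefer to script that preview step, here is a rough Python equivalent (assuming astropy and Pillow, with an arbitrary percentile stretch and placeholder filenames) -- not what Irfanview does internally, just a sketch:

```python
# Sketch: generate a quick grayscale preview/thumbnail from a FITS file.
# Assumes astropy and Pillow; the 1%/99% stretch is arbitrary.
import numpy as np
from astropy.io import fits
from PIL import Image

data = fits.getdata("capture.fits").astype(np.float32)
if data.ndim == 3:                     # collapse color planes if present
    data = data.mean(axis=0)
lo, hi = np.percentile(data, (1, 99))
scaled = np.clip((data - lo) / (hi - lo + 1e-9), 0, 1)
img = Image.fromarray((scaled * 255).astype(np.uint8))
img.thumbnail((512, 512))              # in-place resize, keeps aspect ratio
img.save("capture_preview.jpg")
```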
A Checklist -- especially for OSC (one-shot color) cameras
Capture images with several different tools, compare the on-screen results, and compare the results again after importing into the editing tools. Does the image you see in the image processing program have the correct orientation? I'm speaking from experience: processing the same file with two different tools can yield images that are horizontally flipped relative to each other. Some tools mimic the SLR's processing and present the result like a "correct" image. That can mess up the debayer interpolation.
If an image capture tool performs some (unwanted) adjustments, you will see that moving the telescope moves the target in one direction, but the same target viewed through the guide camera & PHD moves in the opposite direction.
If you can, set up the camera during daylight and run these tests with terrestrial objects -- it is a lot easier, you get correct colors, and maybe a sign as a reference. My CCD supports exposure times down to 1/100s (GOOD !!) while the guide camera only goes down to 1/5s (BAD), which usually yields images that are too bright even at dawn.
Try out different capture tools and look at the results AFTER DEBAYER -- if you see a grid pattern of black dots, the conversion failed. Try a different color arrangement. If you used a daylight shot, it is easier to judge whether the colors are correct.
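A brute-force way to find the right arrangement, sketched here with the debayer_superpixel() function from the earlier example (filenames are placeholders):

```python
# Sketch: debayer the same raw frame with all four common patterns and
# save a small preview of each; the one with natural colors (and no dark
# grid) is your camera's mask. Reuses debayer_superpixel() from above.
import numpy as np
from astropy.io import fits
from PIL import Image

raw = fits.getdata("daylight_test.fits").astype(np.float32)
for mask in ("RGGB", "BGGR", "GRBG", "GBRG"):
    rgb = debayer_superpixel(raw, mask)
    lo, hi = np.percentile(rgb, (1, 99))
    preview = np.clip((rgb - lo) / (hi - lo + 1e-9), 0, 1)
    Image.fromarray((preview * 255).astype(np.uint8)).save(f"test_{mask}.jpg")
```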
And my recommendation is to keep very accurate track of which settings & software you have used -- some programs support cameras through their own driver as well as through an ASCOM driver. Ideally both deliver the same result. And ideally, different programs reading data from the camera via ASCOM would also produce the same results.
But we don't live in an ideal world.
© Copyright 2014, All rights reserved -- post a link to this page or contact me, if you want to use it in your blog or publication or if you want a (large) print