On 4/15/2021 1:11 PM, Jerry wrote:
> Hi Tom
>
> Here’s an overview of image processing I sent to Tim earlier.
>
> At last week’s telescope workshop you said you’d like to see a simple list of what image processing consists of, so here’s an overview.
>
>
> First there is the object in the sky that you want to take an image of. Let’s say it’s M42. So you attach your camera to your telescope, focus it, point it at the object, and take a picture.
>
>
> You have now taken a picture that includes M42, but you also took a picture of a number of other things too. These other things must be removed so you are left with a picture of only M42.
>
>
> These other things you imaged are:
>
> 1 Light reflected from the sky, including gradients;
>
> 2 Other objects in the sky
>
> airplanes
>
> meteors
>
> satellites
>
> 3 Scattered light in the telescope
>
> 4 Diffraction effects
>
> 5 Vignetting
>
> 6 Optical aberrations
>
> 7 Dust
>
> 8 Tracking errors
>
> uncorrected periodic errors in your telescope drive and wind.
>
> 9 Variation in dark current from pixel to pixel across the focal plane (current not generated by light, exponentially dependent on camera temperature)
>
> 10 Variation of gain between each preamp channel in the camera’s electronics.
>
>
> Numbers 1, 3, 4, 5, 6 and 7 can be imaged by themselves in what’s called a flat frame, which software then divides out of your raw image. There are a number of ways to take a flat frame. The idea is to have your camera attached to and oriented with the telescope, and in focus, exactly the same way it was when you took your astrophoto, and then use that setup to take a picture of a uniform, gradient-free field. Getting that field is the greatest challenge of flat framing.
>
> The first way I learned to get a white field is to stand next to your telescope while it’s in a horizontal position, then put a large piece of white poster board in front of the scope so the scope cannot see around it. Then, wearing a white shirt, shine a flashlight on yourself. The light will bounce off you, onto the poster board, and into the telescope.
>
> You can also make or buy a light box that fits on the end of your scope.
>
> A picture of the sky about 50 to 60 degrees above the horizon at dusk, before any stars come out, will also do.
>
> And finally there is software that can remove some of the common gradients. But the software will not remove dust donuts.
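>
> Worked out, the flat-frame correction is a division: normalize the flat to a mean of 1, then divide it into your raw image. Here’s a small Python sketch of just that step (the numbers and array names are made up for illustration, not from any particular program):

```python
import numpy as np

# Toy 4x4 "raw" image: a uniform target of brightness 1000,
# seen through an optical path that vignettes the corners.
vignette = np.array([
    [0.80, 0.90, 0.90, 0.80],
    [0.90, 1.00, 1.00, 0.90],
    [0.90, 1.00, 1.00, 0.90],
    [0.80, 0.90, 0.90, 0.80],
])
raw = 1000.0 * vignette    # what the camera records
flat = 20000.0 * vignette  # flat frame: uniform field through the same optics

# Normalize the flat to a mean of 1, then divide it out of the raw image.
flat_norm = flat / flat.mean()
corrected = raw / flat_norm

# The vignetting is gone: every pixel now has the same value.
print(corrected)
```

> Because the flat and the raw image went through the same optics, the vignetting pattern cancels exactly in the division.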
>
> We’ll get to number 2 later.
>
>
> Number 9 is captured in a single frame called a dark frame. This image is taken by putting the lens cap on your camera so no light can get in and taking a picture at exactly the same exposure time, ISO setting for CMOS (or gain for CCD), and focal plane temperature. Focal plane temperature is by far the most important parameter to control, since dark current is exponentially dependent on temperature.
>
> This is the hardest thing to control in DSLRs, as the focal plane temperature varies with camera use: the processing electronics generate heat, and even light.
>
> For DSLRs I find a “Bad Pixel Map” works better than a dark frame.
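>
> One way a bad pixel map can work, sketched in Python (the threshold, sizes, and array names here are my own illustrative assumptions, not any specific program’s method): flag pixels that sit far above the typical dark level in a dark frame, then replace only those pixels in the light frame with the median of their good neighbors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated 8x8 dark frame: low dark level plus two hot pixels.
dark = rng.normal(10.0, 1.0, size=(8, 8))
dark[2, 3] = 500.0   # hot pixel
dark[5, 6] = 800.0   # hot pixel

# Build the bad-pixel map: anything wildly above the median dark level.
bad = dark > np.median(dark) + 100.0

# Light frame: a smooth gradient, with the same hot pixels burned in.
light = np.linspace(100.0, 200.0, 64).reshape(8, 8)
light[bad] += dark[bad]

# Repair: replace each bad pixel with the median of the good pixels
# in its 3x3 neighborhood.
repaired = light.copy()
for y, x in zip(*np.nonzero(bad)):
    y0, y1 = max(y - 1, 0), min(y + 2, light.shape[0])
    x0, x1 = max(x - 1, 0), min(x + 2, light.shape[1])
    patch = light[y0:y1, x0:x1]
    good = patch[~bad[y0:y1, x0:x1]]
    repaired[y, x] = np.median(good)
```

> Unlike dark subtraction, this doesn’t depend on matching the exposure temperature, which is why it can be more forgiving with a DSLR.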
>
>
> Number 10 is captured in a single picture called a “bias frame”. To take a bias frame, put the lens cap on your camera and take a picture at the fastest shutter speed available. This will capture a map of the variation of electronic gain over the entire focal plane.
>
>
> Numbers 2 and 8 are handled in a different manner, through stacking subframes. For many reasons it’s better to take many subframes instead of one single picture. Suppose you wanted a one-hour picture of M42. You could hold the shutter open for one hour, but it’s better to take twelve 5-minute or sixty 1-minute subframes and combine them, a process called stacking. Errors like 2 and 8 will occur differently on each subframe. Many, like a meteor, will only occur on one subframe. With an outlier-rejecting combine (a median or sigma-clipped stack), anything that is unique to one subframe compared to the ensemble of subframes will be eliminated.
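>
> You can see why the combining method matters with a toy example in Python (the numbers are invented): a straight average lets a meteor leak into the result, while a median rejects it.

```python
import numpy as np

rng = np.random.default_rng(1)

# Twelve simulated 1-D "subframes" of the same field:
# the true signal (one "star" at index 2) plus read noise.
true_signal = np.array([10.0, 10.0, 50.0, 10.0, 10.0])
subs = true_signal + rng.normal(0.0, 1.0, size=(12, 5))

# A meteor streaks through subframe 4 only.
subs[4, 1] += 300.0

mean_stack = subs.mean(axis=0)          # meteor leaks through: +300/12 = +25 at index 1
median_stack = np.median(subs, axis=0)  # meteor rejected: index 1 stays near 10
```

> The median ignores the one wild value because it only looks at the middle of the twelve samples at each pixel; the mean drags the meteor’s full brightness, divided by twelve, into the final image.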
>
>
> None of these image errors vary in time; all are fixed, except of course for meteors and such. Since these image artifacts do not fluctuate in time on a scale we can notice, they are not technically noise. But amateurs, even amateur magazines, refer to them as noise. Unless you’re trying to write image processing algorithms or talk about image processing with a scientist, it’s OK to refer to them as noise. It’s just something you want to get rid of.
>
>
> In summary, your raw image of M42 will actually be an image of M42 plus a bias frame, a flat frame, and a dark frame. Every picture you take with your camera will have a bias frame in it. Dark frames and flat frames therefore also have a bias frame in them, and a flat frame additionally contains a dark frame. The software you use to calibrate your photos will keep everything straight for you.
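>
> Put together, the arithmetic your calibration software performs is roughly this (a sketch of the standard recipe, not any particular program’s exact implementation; all the numbers below are made up so the answer is easy to check):

```python
import numpy as np

def calibrate(raw, master_dark, master_flat, flat_dark):
    """Standard calibration sketch: subtract the dark, divide out the flat.

    master_dark matches the raw frame's exposure and temperature, and
    flat_dark matches the flat's; each dark already contains the bias,
    so no separate bias subtraction is needed in this form.
    """
    flat_corr = master_flat - flat_dark       # flat minus its own dark (and bias)
    flat_norm = flat_corr / flat_corr.mean()  # normalize to mean 1
    return (raw - master_dark) / flat_norm

# Toy 2x2 frames with a known answer.
bias = np.full((2, 2), 100.0)
thermal = np.array([[5.0, 7.0], [6.0, 8.0]])   # dark-current pattern
vignette = np.array([[0.8, 1.0], [1.0, 0.8]])  # optical falloff

sky = np.full((2, 2), 1000.0)                  # the "real" scene: uniform
raw = sky * vignette + thermal + bias
master_dark = thermal + bias
master_flat = 20000.0 * vignette + thermal + bias
flat_dark = thermal + bias

calibrated = calibrate(raw, master_dark, master_flat, flat_dark)
print(calibrated)  # uniform again: the thermal, bias, and vignetting are gone
```

> Notice that the bias never needs to be handled separately here, because it rides along inside both darks and cancels in the subtractions — exactly the bookkeeping the software does for you.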
>
> Sent from my iPad