On 4/6/2021 4:59 PM, TomCez wrote:
> JerryW & all,
> How do astronomers precisely measure the light curves (magnitude, relative flux?) for supernovas (and variable stars), which I would think would tell them what type of supernova is taking place? I guess this might be the best answer to my question:
> https://lco.global/spacebook/telescopes/what-is-photometry/
>
> "When measuring the brightness of an object or objects over several images taken over time, comparison stars must be used. By using a comparison star or stars, variation in brightness that can be caused by sources such as the atmosphere is removed. For example, if over the course of several exposures, a thin cloud passed over the telescope, the brightness of all of the stars in the image would be decreased by a similar amount. By using comparison stars, effects like these are divided out. Comparison stars that are of similar brightness to the target star, not very close to other stars, and not near the edge of the frame make the best choices. Astronomers usually compare the comparison stars, and don't use ones that are variable stars."
>
> Also,
> https://lco.global/education/activities/plotting-a-supernova-light-curve/
>
> I still do not get how they do it extremely precisely, to parts per million of a magnitude or better... by counting photons hitting a pixel of a camera sensor? And what is the exact standard magnitude star to start from, Vega = 0.0000000? For all wavelengths? Seems like a lot of work involved to get all the stars rated!
>
> TomT
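
The LCO passage quoted above is describing differential (relative) photometry. A minimal sketch of the arithmetic, assuming the instrumental fluxes of the target and a few comparison stars have already been extracted from each image (e.g. with aperture photometry); the star fluxes below are made-up numbers for illustration, not real data:

import math

def differential_magnitude(target_flux, comparison_fluxes):
    # Magnitude of the target relative to the combined flux of the comparison
    # stars measured in the same image.  Anything that dims the whole frame
    # (thin cloud, changing airmass) scales target and comparisons by nearly
    # the same factor, so it cancels in the ratio below.
    ensemble_flux = sum(comparison_fluxes)
    ratio = target_flux / ensemble_flux          # frame-wide losses divide out here
    return -2.5 * math.log10(ratio)              # Pogson's magnitude definition

# Made-up counts from three exposures of the same field.  In the second frame
# everything is ~30% fainter (say, a thin cloud), yet the differential
# magnitude barely changes.
frames = [
    (12000.0, [15000.0, 22000.0, 18000.0]),
    ( 8400.0, [10500.0, 15400.0, 12600.0]),
    (12100.0, [15050.0, 22100.0, 18020.0]),
]
for target_flux, comp_fluxes in frames:
    print(round(differential_magnitude(target_flux, comp_fluxes), 4))

That ratio is what the article means by effects being "divided out": the cloud multiplies every flux in the frame by nearly the same factor, and the factor cancels when you divide the target's flux by the comparison ensemble's flux.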
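On the precision question: a CCD really is, in effect, counting photons per pixel, and the unavoidable noise floor is then Poisson (shot) noise, so the 1-sigma magnitude error from photon statistics alone is roughly 1.0857/sqrt(N) for N detected photons. A back-of-the-envelope sketch (the photon counts are made up; real error budgets also include sky background, read noise, flat-fielding and scintillation):

import math

def poisson_mag_error(n_photons):
    # 1-sigma magnitude error from photon (shot) noise alone:
    # sigma_m ~= (2.5 / ln 10) * sigma_F / F = 1.0857 / sqrt(N).
    # Ignores sky background, read noise, flat-fielding errors and
    # scintillation, which usually dominate well before the ppm level.
    return (2.5 / math.log(10)) / math.sqrt(n_photons)

for n in (1e4, 1e6, 1e8, 1e12):
    print(f"N = {n:>8.0e} photons  ->  sigma_m ~ {poisson_mag_error(n):.1e} mag")
# ~1e6 detected photons reaches the millimagnitude level; parts per million
# would need ~1e12, which is why ppm-class photometry (e.g. Kepler) comes from
# very stable space-based instruments and huge accumulated photon counts,
# not from a single ground-based exposure.

As for the zero point: in the traditional Vega-based system each filter band is anchored so that Vega comes out very close to magnitude 0, and networks of carefully measured standard stars carry that calibration across the sky; more modern systems such as AB magnitudes are instead defined by a fixed reference flux density. So yes, a great deal of calibration work sits behind every published magnitude.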