 11 
 on: November 23, 2021, 09:23:49 PM 
Started by TomT - Last post by TomT
See attached files for more great insights: "Ophiuchus and the Rod of Asclepius area of the sky" and "Comments on Astrophotography".

 12 
 on: November 22, 2021, 12:23:29 PM 
Started by TomT - Last post by TomT
On Fri, Nov 19, 2021 at 9:06 PM Jerry wrote:

    Hi Tom
    I listed the steps I go through to set up a portable scope for imaging. I can go over it at Tuesday's meeting.

    Steps to set up a portable astro-imaging session


        1. Set up the tripod and mount before dusk and roughly align with north using a compass. Be sure to compensate for the difference between true and magnetic north.
        2. Using a polar scope in the polar axis of the mount, complete alignment to true north by aligning the image of Polaris with its location on the scope's reticle for the correct latitude and time.
        3. Mount the telescope and counterweights, and balance the scope in both axes.
        4. Align the finder scope with the main scope using a bright star and securely lock the finder scope in place.
        5. Using the finder, visually center a bright star.
        6. Put the camera in and focus on the image of the star using a Bahtinov mask.
        7. REMOVE THE BAHTINOV MASK!!!
        8. Tell the planetarium program (e.g., The Sky X) the name of the current star.
        9. Go to the first target of the evening.
        10. Take a few quick frames to frame the object in the FOV.
        11. Set the camera control to the desired number and duration of subframes, usually 12 exposures of 6 minutes each.
        12. In PHD Guiding, identify a guide star that will track with a 1-second exposure.
        13. Calibrate each axis and engage dithering with the camera control software.
        14. Start the capture sequence.
        15. Listen to my iPod.


    This is the sequence I have used for the past two decades with Astro-Physics mounts (900, 1200, and 1600), QSI wsg-type cameras with integral guide chip (500 and 600 series), and Canon DSLRs (20Da, 50DH, and 60Da). Software is Nebulosity 4 and PHD 2 from Craig Stark. Power is from deep-cycle batteries. The planetarium software is The Sky 6 and X. Processing is done using CCDStack, Mira Pro 7, and Photoshop 6.

    Jerry
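
If you want to play with the capture-plan arithmetic behind steps 11 through 14 above (subframe count and length, plus dithering), here is a rough Python sketch. The 3-pixel dither amplitude is just an illustrative guess, not a value from Jerry's setup:

    import random

    # Jerry's usual plan: 12 subframes of 6 minutes each.
    num_subs, sub_minutes = 12, 6
    print(f"Total integration: {num_subs * sub_minutes} minutes")

    # Dithering nudges the pointing a few pixels between subframes so that
    # fixed-pattern noise lands on different pixels in every frame and washes
    # out when the frames are aligned and stacked.  The 3-pixel amplitude is
    # an illustrative guess, not a value from Jerry's setup.
    dither_px = 3
    for i in range(num_subs):
        dx = random.uniform(-dither_px, dither_px)
        dy = random.uniform(-dither_px, dither_px)
        print(f"Sub {i + 1:2d}: dither ({dx:+.1f}, {dy:+.1f}) px, expose {sub_minutes} min")

Nebulosity and PHD Guiding handle the actual dither moves and settling; this is only the bookkeeping.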

 13 
 on: November 21, 2021, 06:07:47 PM 
Started by TomT - Last post by TomT
"Going beyond the Veil", a great, 9 page, 3MB, explainer of the objects in the sky near Cygnus the Swan constellation is attached.

 14 
 on: July 09, 2021, 10:27:39 PM 
Started by TomT - Last post by TomT
On 4/15/2021 1:49 PM, Jerry wrote:
> Hi Tom
>
> The major steps of image processing are:
> Calibration
> Stacking
> Stretching
>       First linear stretching using “levels”
>       Then non-linear stretching using “curves”
> Adjust color
>
> Hope that helps, Jerry
>
> Sent from my iPad
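
For anyone curious what those stretching steps look like numerically, here is a minimal numpy sketch of a linear "levels" stretch followed by a simple non-linear "curves" step (a gamma curve here). The black point, white point, and gamma are illustrative values only; in practice this is done interactively in Photoshop or similar:

    import numpy as np

    def stretch(image, black, white, gamma=0.4):
        # Linear "levels" stretch: map [black, white] onto [0, 1] and clip.
        linear = np.clip((image - black) / float(white - black), 0.0, 1.0)
        # Non-linear "curves" step (a simple gamma curve): lifts faint
        # nebulosity much more than the already-bright stars.
        return linear ** gamma

    # Toy stand-in for a calibrated, stacked image (values are arbitrary).
    rng = np.random.default_rng(0)
    img = rng.normal(1500, 80, (100, 100))
    out = stretch(img, black=1400, white=2400)
    print(img.mean(), out.mean(), out.max())

Photoshop's levels and curves tools apply fancier, hand-shaped versions of the same mapping.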

 15 
 on: July 09, 2021, 10:25:15 PM 
Started by TomT - Last post by TomT
On 4/15/2021 1:11 PM, Jerry wrote:
> Hi Tom
>
> Here’s an overview of image processing I sent to Tim earlier.
>
> At last week's telescope workshop you said you’d like to see a simple list of what image processing consists of. So here’s an overview.
>
>
> First there is the object in the sky that you want to take an image of. Let’s say it’s M42. So you attach your camera to your telescope, focus it and point it at the object and take a picture.
>
>
> You have now taken a picture that includes M42, but you have also captured a number of other things. These other things must be removed so you are left with a picture of only M42.
>
>
> These other things you imaged are:
>
> 1. Light reflected from the sky, including gradients
>
> 2. Other objects in the sky
>
>    airplanes
>
>    meteors
>
>    satellites
>
> 3. Scattered light in the telescope
>
> 4. Diffraction effects
>
> 5. Vignetting
>
> 6. Optical aberrations
>
> 7. Dust
>
> 8. Tracking errors
>
>    uncorrected periodic errors in your telescope drive, and wind.
>
> 9. Variation in dark current from pixel to pixel across the focal plane (non-optically generated current, exponentially dependent on camera temperature)
>
> 10. Variation of gain between each preamp channel in the camera's electronics.
>
>
> Numbers 1, 3, 4, 5, 6, and 7 can be imaged by themselves in what is called a flat frame, which software then removes from your raw image. There are a number of ways to take a flat frame. The idea is to have your camera attached to and oriented with the telescope, and in focus, exactly the same way it was when you took your astrophoto, and then use that setup to take a picture of a uniform, gradient-free field. Getting that field is the greatest challenge of flat framing.
>
>    The first way I learned to get a white field is to stand next to your telescope while it’s in a horizontal position, put a large piece of white poster board in front of the scope so the scope cannot see around it, and then, wearing a white shirt, shine a flashlight on yourself. The light will shine off you and onto the poster board and then into the telescope.
>
>    You can also make or buy a light box that fits on the end of your scope.
>
>    A picture of the sky at about 50 to 60 degrees above the horizon during dusk, but before any stars come out, will also do.
>
>    And finally there is software that can remove some of the common gradients. But the software will not remove dust donuts.
>
>    We’ll get to number 2 later.
>
>
> Number 9 is captured in a single frame called a dark frame. This image is taken by putting the lens cap on your camera so no light can get in, and taking a picture at exactly the same exposure time, ISO setting for CMOS (or gain for CCD), and focal plane temperature. Focal plane temperature is by far the most important parameter to control, since dark current is exponentially dependent on temperature.
>
>    This is the hardest thing to control in DSLRs, as the focal plane temperature varies with camera use. The processing electronics generate heat and light.
>
>    For DSLRs I find a “Bad Pixel Map” works better than a dark frame.
>
>
> Number 10 is captured in a single picture called a “bias frame”. To take a bias frame, put the lens cap on your camera and take a picture at the fastest shutter speed available. This will capture a map of the variation of electronic gain over the entire focal plane.
>
>
> Numbers 2 and 8 are handled in a different manner, through stacking subframes. For many reasons it’s better to take many subframes instead of one single picture. Suppose you wanted a one-hour picture of M42. You could hold the shutter open for one hour, but it’s better to take twelve 5-minute or sixty 1-minute subframes and add them, which is called stacking. Errors like 2 and 8 will occur differently on each subframe. Many, like a meteor, will only occur on one subframe. In stacking, anything that is unique to one subframe compared to the ensemble of subframes will be eliminated.
>
>
> None of these image errors vary in time; all are fixed, except of course for meteors and such. Since these image artifacts do not fluctuate in time on a scale we can notice, they are not technically noise. But amateurs, and even amateur magazines, refer to them as noise. Unless you’re trying to write image processing algorithms or talk about image processing to a scientist, it’s OK to refer to them as noise. It’s just something you want to get rid of.
>
>
> In summary, your raw image of M42 will actually be an image of M42 plus a bias frame and a flat frame and a dark frame. Every picture you take with your camera will have a bias frame in it. When you take a dark frame or a flat frame, they will also have a bias frame in them. In addition to a bias frame, a flat frame also contains a dark frame. The software you use to calibrate your photos will keep everything straight for you.
>
> Sent from my iPad
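
Here is a bare-bones numpy sketch of what calibration software (Nebulosity, CCDStack, etc.) is doing with those bias, dark, and flat frames, plus a simple sigma-clipped stack that rejects one-frame artifacts like satellites. The synthetic frames and numbers are made up purely for illustration and are not how any particular package is implemented:

    import numpy as np

    rng = np.random.default_rng(1)
    shape = (64, 64)                                  # tiny toy frames

    def calibrate(raw, master_bias, master_dark, master_flat):
        # Subtract the bias and dark signal, then divide by a normalized flat
        # (flat-fielding is a division; the software keeps the bias/dark
        # bookkeeping inside the flat and dark straight, as Jerry notes).
        light = raw - master_bias - master_dark
        flat = master_flat / np.median(master_flat)
        return light / flat

    def sigma_clip_stack(frames, sigma=3.0):
        # Average the aligned subframes, rejecting pixels that deviate wildly
        # in a single frame (meteors, satellites, cosmic-ray hits).
        cube = np.stack(frames)
        med = np.median(cube, axis=0)
        std = cube.std(axis=0) + 1e-9
        clipped = np.where(np.abs(cube - med) < sigma * std, cube, np.nan)
        return np.nanmean(clipped, axis=0)

    # Purely synthetic stand-ins for the real master frames.
    bias = rng.normal(100, 2, shape)
    dark = rng.normal(20, 1, shape)                   # dark current roughly doubles per ~6 C of warming
    flat = 1.0 - 0.2 * np.linspace(0, 1, shape[1])    # fake vignetting falloff

    raws = []
    for _ in range(12):                               # twelve subframes, as in the text
        sky = rng.poisson(500, shape).astype(float)   # the object plus sky
        raws.append(sky * flat + dark + bias)         # what the camera records
    raws[3][10, 10] += 50000                          # a "satellite" streaks one frame

    calibrated = [calibrate(r, bias, dark, flat) for r in raws]
    plain_mean = np.mean(np.stack(calibrated), axis=0)
    stacked = sigma_clip_stack(calibrated)
    print(plain_mean[10, 10], stacked[10, 10])        # the hit survives a plain
                                                      # average, not the clipped stack

Real packages also build their master bias, dark, and flat frames by combining many exposures of each; single synthetic frames are used here only to keep the sketch short.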

 16 
 on: July 09, 2021, 10:23:55 PM 
Started by TomT - Last post by TomT
On 4/6/2021 4:59 PM, TomCez wrote:
> JerryW & all,
> How do astronomers precisely measure the light curves (magnitude, relative flux?) for supernovas (and variable stars), which I would think would tell them what type of supernova is taking place?   I guess this might be the best answer to my question:  https://lco.global/spacebook/telescopes/what-is-photometry/  :
>
> "When measuring the brightness of an object or objects over several images taken over time, comparison stars must be used. By using a comparison star or stars, variation in brightness that can be caused by sources such as the atmosphere is removed. For example, if over the course of several exposures, a thin cloud passed over the telescope, the brightness of all of the stars in the image would be decreased by a similar amount. By using comparison stars, effects like these are divided out. Comparison stars that are of similar brightness to the target star, not very close to other stars, and not near the edge of the frame make the best choices. Astronomers usually compare the comparison stars, and don't use ones that are variable stars."
>
> Also, https://lco.global/education/activities/plotting-a-supernova-light-curve/
>
> I still do not get how they do it extremely precisely to parts per million of a magnitude or better...counting photons hitting a pixel of a camera sensor?  And what is the exact standard magnitude star to start from, Vega = 0.0000000?  For all wavelengths?  Seems like a lot of work involved to get all the stars rated!
>
> TomT
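
A toy numpy sketch of the differential photometry the LCO page describes: sum the counts in an aperture around the target and around a comparison star, and the magnitude difference is -2.5 log10 of the flux ratio, so losses from thin cloud or airmass that hit both stars cancel out. The aperture radius, star positions, and the comparison star's magnitude are made-up numbers:

    import numpy as np

    def aperture_flux(image, row, col, radius):
        # Sum the background-subtracted counts inside a circular aperture.
        # (A real pipeline would use a sky annulus; the image median is used
        # here to keep the sketch short.)
        rr, cc = np.indices(image.shape)
        in_aperture = (rr - row) ** 2 + (cc - col) ** 2 <= radius ** 2
        return (image[in_aperture] - np.median(image)).sum()

    def differential_magnitude(image, target_rc, comp_rc, comp_mag, radius=5):
        f_target = aperture_flux(image, *target_rc, radius)
        f_comp = aperture_flux(image, *comp_rc, radius)
        # Thin cloud, haze, or airmass dims both stars by the same factor,
        # so it cancels in the flux ratio (and in the magnitude difference).
        return comp_mag - 2.5 * np.log10(f_target / f_comp)

    # Fake frame: flat background with two "stars" as single bright pixels.
    img = np.full((100, 100), 100.0)
    img[30, 30] += 5000.0            # target (say, a supernova)
    img[70, 60] += 20000.0           # comparison star of known magnitude
    print(differential_magnitude(img, (30, 30), (70, 60), comp_mag=12.0))  # ~13.5

Real photometry also uses a proper sky annulus, several comparison stars, and standard-field calibration to tie the zero point to a reference system; this only shows the core ratio idea.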

 17 
 on: July 09, 2021, 09:39:00 PM 
Started by TomT - Last post by TomT


On 3/14/2021 11:54 AM, Henk Aling wrote:
> The only reason why I align the guide scope with the OTA is because I can’t plate-solve with my best imaging camera, a Fuji X-a1.  And even if I used my modified 450D instead, which I can control, it’s not as spiffy as the QHY5 of the Orion SSAG.  Once I get my ASI2600MM I will use it for alignment using plate solving.

> I’m sure a laser pointer will work very well and is easy to use but I vowed to myself to never use one because of the unknown risk to airplanes.

> My GSO has screws to attach a dovetail at the top but then I would need more counterweights especially because of the larger radius.  If I had looked better and noticed that the GSO was 15 lbs. heavier than an aluminum alternative it would not have been that bad.  The steel may be stiffer though so that’s an advantage.  GSO also mentioned it’s better for temperature adjustments.

> When I saw it was a GSO product I assumed it was good based on my experiences with GSO products.  They included a little cheap-looking finder scope that turns out to be quite nice optically.  I may try to convert that to a guide scope.  The eyepiece screws in with a plastic thread.  I’m sure I can drill into the plastic and attach a sensor to it.


> From: bkm
> Sent: Saturday, March 13, 2021 7:39 PM
>
> Subject: Re: My homebuilt 12.5 inch astrograph at CalStar

> Henk,
>
> I like your wood block addition to the lower dovetail rail.  Makes for easy and secure mounting!  When I mount the C11 on the Orion Atlas Pro equatorial mount, I have a black ink line placed on the C11's white Naugahyde sleeve that lines up with a mark on the mount, which I can easily see when I attach the scope.  Your wood stop seems better, and I may try that method.  It also seems more secure (less wobbly) during the attachment.
>
> Some Dobsonian mounting rings have saddles both top and bottom.  Obviously, yours has them on the bottom where the dovetail plate attaches.  If they are present on the top also, you can mount another dovetail plate there to which you can attach your guide scope.  It will always be aligned.
>
> The C11 has dovetail plates both top and bottom.  The finder scope mounts to the upper dovetail plate.
>
> I've also added a curved 1/8" thick aluminum plate to the top of the scope that mounts to existing tapped holes in the C11 OTA rear casting.  I've drilled and tapped holes in that new plate to which dovetail shoes are attached.  Among other things, I attach a laser finder to one of the dovetail shoes.  This addition makes alignment really easy.  The laser pointer is already collimated to the scope.  I swing the scope around in RA and dec to have the laser pointer point at the desired object.  When I look in the finder scope, the object is in view, and I only have to center it.  Lastly, I center it in the eyepiece of the main scope.  After two stars, the alignment is complete.
>
> Bruce
>
>
> On 3/13/2021 4:52 PM, Henk Aling wrote:
>> Well here’s my setup that I will try tonight.
>> 
>> I have added a wooden stop underneath the Losmandy dovetail.  There was a screw hole already so all I had to do was cut the wood to the right size so it is balanced, shorten the screw that I found, drill a hole and screw it in.  I only have to make a power push to lift the scope on the mount, flip it down so the stop holds it, then tighten the screws.  Not too hard really, it’s only 48 lbs. and when I grab it at both sides it’s easy to carry.  I can still see the DEC level with the stop on.
>> 
>> I fixed the counterweight problem by making an adapter for my guide scope to fit at the bottom of the counterweight bar.  It doubles as a stop for the two other weights.  It balances perfectly, and there is more space to push the weights out if I have to.  Overall it is better for balancing than putting a guide scope on top of the OTA.
>> 
>> I line up the guide scope to the OTA by pushing a glass fiber tent pole segment against the base of the guide scope shoe and eyeball if it lines up with the OTA.  Should be pretty accurate.  Keep in mind that for now I need to use my guide scope for plate solving so long as I don’t have my ASI2600MM yet, so it needs to be lined up.
>> 
>> I installed Astroberry on my Pi 4B with 8 GB and it looks great.  I have not yet hooked it up but maybe I will, tonight.  One problem is that it is powered by USB-C and I want it to be powered by a 12 V cigarette lighter socket.  I can use the power adapter that came with it for now if I add an extension cord.
>> 
>> I figured out how to use an old Boost cell phone as a webcam and view it through a web browser using the LiveDroid app.  It will be fun to watch the scope move from my computer.  If I want smooth video I need to hook up an ethernet cable.  The Pi is powerful enough but the WiFi dongle is slow.


 18 
 on: July 09, 2021, 04:55:17 PM 
Started by TomT - Last post by TomT
On 6/10/2021 6:06 PM, bkm wrote:
> Mike,
> Thanks for reminding me of the avalanche photodiode, and your nice explanation.
> Like you, it's been many years since I used them (I've been retired 32 years).  They are fast and sensitive, but excess noise is a problem.  Wikipedia has a nice writeup: https://en.wikipedia.org/wiki/Avalanche_photodiode
> In a camera detector, it is the capacitance of the photodiode that converts the photoelectric electrons to a voltage (V = Q/C).  This process is often referred to as filling the well, and when the well gets full, the pixel saturates (a voltage problem).
> Let's keep dreaming!
> Bruce
> They talk about Poisson statistics there.
>
> On 6/10/2021 1:21 PM, Mike Chibnik wrote:
>> That along with a new development of a new type of imaging sensor which is supposed to be more sensitive than what we have today.  Evidently they were describing something similar to avalanche diode technology.  It’s something they’ve been doing with fiber optics for a long time, and we used them at Canoga Perkins back in 2000.  With normal photodiodes, one photon gets converted into 0.2 to 0.9 electrons.  This number is referred to as the detector’s quantum efficiency.  Avalanche diodes utilize an additional mechanism where the initial electron is accelerated in an electric field by a high voltage applied to it.  As a result, additional electrons are generated, effectively amplifying the detected optical input.  It’s been 20 years so I’m fuzzy on the gain, but as I remember it was from about 20 to 1000 depending on the construction and operating voltages, which were around 100 volts.  These diodes were super expensive and tricky to use.  They were also a bit noisy.  In contrast, nearly all optical sensors are photovoltaic sensors, which in simplistic terms use photons to generate a current of electrons, which generates a charge that becomes a voltage, which is the signal that is detected.  If it took about 20 years to go from one expensive photodiode to a chip with millions of inexpensive, high-performance devices, then it shouldn’t be too unreasonable to assume that this will happen with avalanche photodiode technology.
>> I think history with image sensors will repeat itself, as something very similar happened with television camera tubes in the early fifties when the old iconoscopes were replaced by the more sensitive image orthicons.
>> So it might be possible in the future that people will hold up a thick version of what might look like an iPad, combining that new optical technology with this sensor technology, to view the heavens.
>> We can dream can’t we?
>> Mike
>>
>>> On Jun 10, 2021, at 12:17 PM, Robert C R wrote: 
>>> Thanks, Chuck...Wow!  This looks like transformative stuff when it comes to optics.  Amazing to watch how discoveries like this will affect the world of photography and telescope optics.  We do live in amazing times!
>>> Bob R.
>>>
>>> From: macpuzl
>>> To: Bruce M
>>> Cc: Jerry W Mike C; Web Master <webmaster@sbau.org>; Ron H; Tom W; Tom T; Bob R; Dick B
>>> Sent: Thu, Jun 10, 2021 12:00 pm
>>> Subject: Re: camera iso invariance
>>> Bruce (et al) -
>>> This may be interesting:
>>> https://phys.org/news/2021-06-goodbye-camera-miniaturized-optics-counterpart.html
>>> Hasta nebula - Chuck
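
Here is a back-of-the-envelope Python sketch of the sensor arithmetic in that exchange: photons become electrons at the quantum efficiency, an avalanche stage (if present) multiplies them, and the pixel capacitance converts the charge to a voltage via V = Q/C. The 5 fF capacitance and the avalanche gain of 100 are illustrative guesses, not specs for any real sensor:

    E_CHARGE = 1.602e-19      # coulombs per electron

    def pixel_voltage(photons, quantum_efficiency, capacitance_farads, avalanche_gain=1.0):
        # Photoelectrons generated (QE is the fraction of photons converted),
        # optionally multiplied by an avalanche stage, then converted to a
        # voltage by the pixel's capacitance: V = Q / C.
        electrons = photons * quantum_efficiency * avalanche_gain
        return electrons * E_CHARGE / capacitance_farads

    # Ordinary photodiode pixel: 10,000 photons at QE 0.6, a hypothetical 5 fF sense node.
    print(pixel_voltage(10_000, 0.6, 5e-15))                        # ~0.19 V
    # The same light with a hypothetical avalanche gain of 100:
    print(pixel_voltage(10_000, 0.6, 5e-15, avalanche_gain=100))    # well saturates long before this

Running it shows an ordinary pixel producing a fraction of a volt, while a 100x avalanche gain would swamp the same well, which is part of why APDs have been expensive and tricky to use in imagers.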

On 6/10/2021 3:06 AM, bkm wrote:
> Jerry,
> I forgot to mention that I also use the hat trick.  I have a piece of cardboard that is flat black painted on both sides.  I hold it in front of the telescope (not touching) when I remote trip the shutter, count to 2, then lift the cardboard away.
> The "hat trick" name arises from old-time photographers who used their hat to block the camera lens as they were manually releasing the shutter on the longer time exposures needed for ordinary photography of the day (ISO 25 film, or less).
> Bruce
>
> On 6/10/2021 2:37 AM, bkm wrote:
>> Thanks Jerry,
>> Accidental dithering is an apt description!
>> I normally take astrophotos with the mirror locked up, use a 2-second shutter delay, and use a remote trip.
>> If I average 10 or more images in Deep Sky Stacker, I don't see pattern noise, but I do see non-pinpoint stars due to atmospheric turbulence.  Brighter stars are bigger smeared circles.
>> Nikon went all out with the D500 shutter.  It and the mirror holder are made from carbon fiber.  The camera can take 10 pictures per second for 200 frames before it slows down.  Between each frame, it does an autofocus and an exposure calculation.  Both autofocus and exposure require the mirror to be down.
>> I know that this fast frame rate is not used in astrophotography, but the very light components should minimize shutter shake, especially if I use the quiet shutter release mode.
>> My images are usually unguided, typically 30-second exposures or less.
>> Thanks for the info.  It gives me a lot of food for thought.
>> Bruce
>>
>> On 6/9/2021 6:17 PM, Jerry wrote:
>>> Bruce
>>> I depended on the accidental dithering provided by the environment, until I realized it really indicated a weakness in my setup. When I went to non-DSLR cameras, which did not provide the mechanical shutter shock, my dithering disappeared and my images went down the tubes.
>>> I now do dithering on purpose using software designed for it, Nebulosity and PHD Guiding working together. It gives me better control of my imaging.
>>> Jerry
>>>
>>>> On Jun 9, 2021, at 5:51 PM, bkm wrote:
>>>> Mike,
>>>> Thanks for the reminder.  I figured that the minute movement of the image between pictures (turbulence, wind, etc.) coupled with the small sensor pixel dimensions (microns), pretty much guarantees the same Bayer filtered pixels will not overlay from one exposure to the next.  When I use Deep Sky Stacker, the pattern noise generally disappears.
>>>> Regarding your other email on dynamic range, the same website has other displays that might interest you.  See: https://www.photonstophotos.net/Charts/PDR_Area_scatter.htm
>>>> You can turn off the various camera manufacturers' colored symbols by clicking on them in the legend.  I find my Nikon D500 and D5200 have very similar dynamic ranges, and are among the best.
>>>> Bruce
>>>>
>>>> On 6/9/2021 2:48 PM, Mike Chibnik wrote:
>>>>> Hi Bruce:
>>>>> The way you get around the issue with fixed pattern color noise is to use dithering.  My friend Mark said that the best way he found was to dither for every single exposure.  Otherwise he found out that there was not enough randomization.
>>>>> Mike

Hi Bruce:
I went to the website and plugged in the dynamic range page for various cameras I had/have, in addition to the RA, which is Canon's latest astroimaging camera.
It appears that dynamic range increases as ISO is reduced.  However, this might result in losing faint nebulosity.  But if you want to best image the colors of objects like colorful globulars or the Orion Nebula, then it appears to me that a lower ISO is the way to go.
Mike

On Wed, Jun 9, 2021 at 11:03 AM bkm wrote:
    Jerry and Mike,
    Thanks for your responses.  I look forward to more information you might
    find.
    Jerry,  yes, fixed pattern noise in color pictures can be a problem with
    Bayer interpolation.  I've noticed it.  When Deep Sky Stacker averages a
    number of photos, the fixed pattern noise appears to be much reduced.
    Probably caused by the images not being aligned exactly on the sensor, a
    good thing.
    Bob Richard has noticed and complained about this artifact in his
    pictures with the Atik cameras he uses.
    Bruce
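
A small numpy experiment that illustrates the point about dithering and fixed-pattern noise: if the same pattern sits on the same pixels in every frame, stacking cannot average it away, but if each frame is dithered a few pixels and then re-registered before stacking, the pattern smears out. The pattern amplitude and dither range are arbitrary illustrative values:

    import numpy as np

    rng = np.random.default_rng(42)
    shape = (64, 64)
    fixed_pattern = rng.normal(0, 5, shape)          # the sensor's fixed-pattern "noise"

    def take_frame(dither):
        sky = rng.normal(100, 3, shape)              # true (flat) scene plus random noise
        # The fixed pattern is tied to the sensor, so it does NOT move with the sky.
        frame = np.roll(sky, dither, axis=(0, 1)) + fixed_pattern
        # Re-register on the sky (undo the dither) before stacking.
        return np.roll(frame, (-dither[0], -dither[1]), axis=(0, 1))

    def stack(dither_each_frame, n=20, max_px=4):
        frames = []
        for _ in range(n):
            d = tuple(rng.integers(-max_px, max_px + 1, 2)) if dither_each_frame else (0, 0)
            frames.append(take_frame(d))
        return np.mean(frames, axis=0)

    for label, dithered in [("no dither", False), ("dithered", True)]:
        residual = stack(dithered) - 100.0
        print(f"{label}: residual std = {residual.std():.2f}")

With these numbers, the un-dithered stack keeps essentially the full pattern amplitude, while the dithered stack knocks it down by roughly the square root of the number of frames.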

    On 6/9/2021 10:43 AM, Mike Chibnik wrote:
    > Thanks Bruce
    > I was going over other websites afterwards and found links in some
    > sites where they refer to others that have differing findings.  I’m
    > led to believe that as technology changes certain aspects such as read
    > noise, dark noise, pixel size and quantum efficiency have led to
    > differing optimum solutions.
    > Mike
 
    >> On Jun 9, 2021, at 10:10 AM, Jerry  wrote:
    >> Hi Bruce
    >> It’s an interesting article, thanks. The author is comparing dynamic
    >> range to photon noise and uses the proper physics definition of
    >> noise. Usually when astro imagers talk about noise in DSLRs they
    >> really mean fixed pattern noise, which is spatial nonuniformity. He does
    >> not address that aspect, at least at my first read. I’ll take a
    >> deeper look when I get some time.
    >> Jerry

    >>> On Jun 9, 2021, at 2:54 AM, bkm wrote:
    >>>  In today's telescope workshop, I related there was a website
    >>> showing digital camera ISO invariance, i.e., the signal-to-noise ratio is
    >>> the same over a wide ISO range.  See:
    >>> https://www.photonstophotos.net/Charts/PDR_Shadow.htm#Nikon%20D500
    >>> The D500 I have, which has a Sony sensor, is ISO invariant from
    >>> ISO 400 to ISO 102400.  This fact is refreshing to know, in that I
    >>> can shoot pictures at higher ISOs and faster shutter speeds and not
    >>> worry about noise.
    >>> The Canon 6D that astrophotographers like to use is not ISO invariant.
    >>> Notice the vertical axis scale is logarithmic.
    >>> Even though the camera names are grayed out, you can select other
    >>> cameras to see their input-referred read noise vs. ISO.
    >>> Here are the plots from the above website:
    >>> <sensor nosie collage.jpg>
    >>> Enjoy,
    >>> Bruce
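
Here is a simple Python model of what ISO invariance means in signal-to-noise terms: for a fixed exposure, an ISO-invariant sensor keeps its input-referred read noise roughly flat, so the ISO setting costs little, while a non-invariant sensor's read noise drops as ISO rises, penalizing shadow detail at low ISO. All the electron counts and read-noise figures below are invented for illustration, not measurements of the D500, the 6D, or any other camera:

    import math

    def snr(signal_e, sky_e, read_noise_e, dark_e=0.0):
        # Photon (Poisson) noise from the target and sky, plus dark current
        # and read noise, added in quadrature.
        noise = math.sqrt(signal_e + sky_e + dark_e + read_noise_e ** 2)
        return signal_e / noise

    signal, sky = 50.0, 200.0           # electrons collected in one subframe (made up)

    # An "ISO-invariant" sensor: input-referred read noise stays flat with ISO.
    for iso, read_noise in [(400, 1.5), (6400, 1.4), (102400, 1.4)]:
        print(f"invariant sensor, ISO {iso:6d}: SNR = {snr(signal, sky, read_noise):.2f}")

    # A non-invariant sensor: read noise drops substantially as ISO rises,
    # so shooting at too low an ISO costs real SNR in the shadows.
    for iso, read_noise in [(100, 25.0), (800, 6.0), (3200, 3.0)]:
        print(f"non-invariant sensor, ISO {iso:6d}: SNR = {snr(signal, sky, read_noise):.2f}")

The qualitative takeaway matches the point above: on the invariant sensor the SNR barely moves with ISO, while the non-invariant one loses shadow SNR at low ISO.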

 19 
 on: June 06, 2021, 01:46:54 PM 
Started by TomT - Last post by TomT
> On Jun 6, 2021, at 8:58 AM, bkm wrote:
> Tim,
> Thanks for the information.
> I have carbon fiber cloth which I use to repair broken plastic parts.  I glue the part back together with super glue to temporarily hold it in place, then I weld the broken plastic part back together using a soldering iron.  Then I use a large soldering iron to melt carbon fiber strands into the plastic surface all the way around.
> The repair is stronger than the original part was.
> Because the carbon fiber is black and the plastic is generally black, the repair is nearly invisible.
> I buy carbon fiber cloth at Fiberglass Hawaii in Ventura (https://shop.fiberglasshawaii.com/).
> Bruce
>
>
> On 6/6/2021 8:29 AM, Tim Crawford wrote:
>> If any of you have the Pelican cases for eyepieces or any of your gear:
>> One of my latches on my 1400 case broke. I contacted the Pelican company. They wanted me to send images of the case number and the latch damage. For $10 shipping they sent two new latches and pins. Ask them for instructions on replacement and they will send you instructions via email.
>> On my case, it required that I drill holes through the handle to allow access to the pins. On the 1400, they want you to drill a 1/8” hole opposite the pin axis. But in my case I drilled the first hole at about 1/4” so I could “pivot” the 1/8” drill bit to open the hole for the pin.
>> It’s a pretty easy operation and makes the case go a whole lot further! On newer cases, the access to the pins is much easier. You just tap out the pins and tap in the new latch and pins. That easy.
>> Tim

 20 
 on: March 01, 2021, 02:42:36 PM 
Started by TomT - Last post by TomT
Here are some weblinks and part of Jerry's agenda for the show.  Download and play the attached file for the audio.  YouTube link:  https://youtu.be/B_ah0YYtPWY

We may make this hour live on YouTube in the future if we can set up the permissions correctly.

March 1, 2021 Santa Barbara Astronomical Unit Astronomy Video Podcast weblinks:
https://www.sbau.org/ is our home page, with lots of links and info
New York Times, Exploring the Solar System--various missions listed, By Jonathan Corum, Updated July 30, 2020: 
https://www.nytimes.com/interactive/2020/science/exploring-the-solar-system.html
Scientific Psychic Interactive Moon Map--shows major near side areas with identification:
https://www.scientificpsychic.com/etc/moonmap/moon-map.html
Far side of the Moon:
https://www.publicdomainpictures.net/pictures/190000/velka/moon-far-side.jpg
The Sky Live 3D Solar System Simulator -- excellent interactive, moveable layout of the solar system.  Go to the objects listed in their menus and click on "information", scroll to bottom of the object's info page and quite often there is a 3D interactive map. 
https://theskylive.com/3dsolarsystem
Sizes of large telescopes, including Hubble and James Webb and beyond:
https://i.insider.com/5b76f354e361c037008b5083?width=1759
James Webb telescope will be placed at the L2 Lagrangian point: 
https://upload.wikimedia.org/wikipedia/commons/thumb/8/88/Lagrange_points.jpg/1200px-Lagrange_points.jpg
SpaceX Starlink satellite internet constellation:
https://en.wikipedia.org/wiki/Starlink
View of the 26,000 year circle of stars that the Earth's North Celestial Pole points to:
https://astronomy.com/-/media/Images/News%20and%20Observing/Ask%20Astro/2012/09/Earths-spin-axis.jpg?mw=600
NASA Solar System Dawn Mission to Vesta & Ceres:
https://solarsystem.nasa.gov/missions/dawn/overview/
The Sky Live 3D map of Asteroid 4 Vesta:
https://theskylive.com/3dsolarsystem?obj=vesta
The Sky Live 3D map of 1km wide Asteroid (NEO) 231937 flying by Earth on March 21, 2021:
https://theskylive.com/3dsolarsystem?obj=231937
Spaceweather shows the Sun conditions and asteroid encounter distances & size:
https://spaceweather.com/
Nice layout with Orion area nebulas identified:
https://eastexastronomy.blogspot.com/2010/08/orion.html
NASA’s Perseverance Has a Mars Rover Family Portrait: 
https://www.extremetech.com/extreme/320356-nasas-perseverance-has-a-mars-rover-family-portrait
Mars 2020 mission parachute secret decoded:
https://people.com/human-interest/nasa-reveals-secret-message-hidden-on-mars-rover-parachute/
Chinese mission to Mars, Tianwen-1: 
https://en.wikipedia.org/wiki/Tianwen-1
Emirates Mars Mission is by United Arab Emirates Space Agency: 
https://en.wikipedia.org/wiki/Emirates_Mars_Mission

More Santa Barbara Astronomical Unit videos:
https://www.youtube.com/channel/UCU0r4RxzoyT_pwNcfN5aJsA/videos


Jerry's agenda for the show, abbreviated:
This Week
  The full Moon occurred in the wee small hours of the morning last Saturday. As February’s Full Moon, it’s also known as the Snow Moon. Because the Full Moon is so bright, it’s difficult to observe deep-sky objects during this phase. But at the same time, the Moon makes an excellent target for beginning and experienced observers alike.

Monday, March 1
  Mars opens the month of March a mere 3° due south of M45, better known as the Pleiades. This sparkling open star cluster is easy to spot with the naked eye in the constellation Taurus (even with a nearly full Moon) and is already relatively high in the southwest by the time full darkness falls.
[Attached image APOD20191206.jpg] On March 30, 2019, Mars approached within 3.3° of the Pleiades star cluster in Taurus the Bull. A thin layer of clouds diffused the light from the Red Planet, producing the halo.

Tuesday, March 2
  The Moon reaches perigee — the closest point to Earth in its slightly elliptical orbit — at 9:18 A.M. PST. At that time, it will sit 227,063 miles (365,422 kilometers) away.

Thursday, March 4
  Asteroid 4 Vesta is at opposition today at 1 P.M. EST. You can find it in the constellation Leo tonight — the main-belt asteroid is now one of the top 10 brightest lights in the Lion’s hindquarters.
  Vesta, as its number indicates, was the fourth body discovered in the asteroid belt. At roughly 300 miles (480 km) across, it’s half the size of dwarf planet 1 Ceres.

The Big News
 Of course, the big news currently is the successful touchdown of the Perseverance rover in Jezero Crater on Mars. The landing was a spectacular success and was captured during the parachute phase by the Mars Reconnaissance Orbiter.

Asteroid in March. https://www.space.com/potentially-hazardous-asteroid-whizzes-near-earth-2001-fo32
 The space rock, officially called 231937 (2001 FO32), is about 0.5 to 1 mile (0.8 to 1.7 kilometers) in diameter and will come within 1.25 million miles (2 million kilometers) of Earth at 11:03 a.m. EST (1603 GMT) on March 21 — close enough and large enough to be classified as "potentially hazardous," as it whizzes by at almost 77,000 mph (124,000 km/h).
