Automated technology enables unprecedented space exploration, from the Moon to asteroids and beyond




OSIRIS-REx at Sample Site Nightingale

Top view of OSIRIS-REx at the Nightingale sample site, with a parking lot for comparison. Credits: NASA / Goddard / CI Lab / University of Arizona

During the Apollo 11 landing in 1969, the astronauts looked out the window to pick out features they recognized from maps of the Moon, steering the lander away from a disastrous touchdown atop a rocky area. Now, 50 years later, the process can be automated. Distinctive features, such as known craters, boulders, or other unique surface elements, flag surface hazards so a lander can avoid them during descent.

NASA scientists and engineers are maturing technology for navigating and landing on planetary bodies by analyzing images during descent, a process called terrain-relative navigation (TRN). This optical navigation technology flies on NASA’s latest Mars rover, Perseverance, which will test TRN when it lands on the Red Planet in 2021, paving the way for future crewed missions to the Moon and beyond. TRN was also used during the Touch-and-Go (TAG) sample collection event of NASA’s recent Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer (OSIRIS-REx) mission, which gathered material from the asteroid Bennu in order to better understand the characteristics and movement of asteroids.
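The core idea of TRN, matching a descent-camera image against a stored terrain map to recover position, can be illustrated with a minimal sketch. The code below is not NASA's implementation; it uses a synthetic terrain grid and plain normalized cross-correlation, a common building block of image-to-map matching:

```python
import numpy as np

def locate_descent_image(image_patch: np.ndarray, onboard_map: np.ndarray):
    """Find where a small descent-camera image best matches a stored
    terrain map, using normalized cross-correlation."""
    ph, pw = image_patch.shape
    mh, mw = onboard_map.shape
    best_score, best_pos = -np.inf, (0, 0)
    p = (image_patch - image_patch.mean()) / (image_patch.std() + 1e-9)
    for r in range(mh - ph + 1):
        for c in range(mw - pw + 1):
            w = onboard_map[r:r + ph, c:c + pw]
            w = (w - w.mean()) / (w.std() + 1e-9)
            score = np.mean(p * w)          # correlation in [-1, 1]
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Synthetic terrain with one distinctive "crater" as a landmark
rng = np.random.default_rng(0)
terrain = rng.normal(size=(64, 64))
terrain[20:28, 30:38] -= 3.0            # crater-like depression
camera_view = terrain[18:34, 28:44]     # what the descending lander "sees"
pos, score = locate_descent_image(camera_view, terrain)
print(pos)  # → (18, 28): the lander recovers its position over the map
```

A flight system would use far more robust feature extraction and matching, but the principle is the same: a distinctive landmark makes the camera view uniquely locatable on the map.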

Since reaching Bennu in 2018, the OSIRIS-REx spacecraft has mapped and studied the asteroid’s surface, including its topography and lighting conditions, in preparation for TAG. The Nightingale site was selected from four candidates for its abundance of sampleable material and its accessibility to the spacecraft.


On October 20, the OSIRIS-REx spacecraft successfully descended to the surface of the asteroid Bennu and collected a sample. Credit: Goddard Space Flight Center / NASA Scientific Visualization Studio

Engineers routinely use ground-based optical navigation methods to navigate the OSIRIS-REx spacecraft near Bennu, comparing new images taken by the spacecraft to three-dimensional topographic maps. During TAG, OSIRIS-REx performed a similar optical navigation process on board in real time, using a TRN system called Natural Feature Tracking. Images of the sample site taken during the TAG descent were compared to the onboard topographic maps, and the spacecraft’s trajectory was adjusted to aim for the landing site. Optical navigation could also be used in the future to minimize the risks of landing in other unfamiliar environments in our solar system.

NASA’s Lunar Reconnaissance Orbiter (LRO) has been acquiring images from orbit since 2009. LRO project scientist Noah Petro said one of the challenges in preparing for landed missions is the lack of high-resolution, narrow-angle camera images at all lighting conditions for any specific landing site. These images would be useful for automated landing systems, which need lighting data for a specific time of the lunar day. However, NASA has been able to collect high-resolution topographic data using LRO’s Lunar Orbiter Laser Altimeter (LOLA).

“The LOLA data, and other topographic data, allow us to take the shape of the Moon and illuminate it for any moment in the future or in the past, and with that we can predict what the surface will look like,” said Petro.

Artemis Astronaut on the Moon

Artist’s concept of an Artemis astronaut on the Moon. Credit: NASA

Using LOLA data, sun angles are superimposed on a three-dimensional elevation map to model the shadows cast by surface features at specific dates and times. NASA scientists know the position and orientation of the Moon and of LRO in space, having made billions of lunar laser measurements. Over time, these measurements are compiled into a grid map of the lunar surface. Images taken during landing can be compared against this master map, giving landers flown as part of the Artemis program another tool for navigating safely over lunar terrain.
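Illuminating an elevation map for a given sun position is a standard hillshade computation. The sketch below applies the textbook slope/aspect hillshade model to a synthetic Gaussian hill; it is an illustration of the general technique, not NASA's LOLA processing pipeline, and sign conventions for aspect vary between tools:

```python
import numpy as np

def hillshade(elev: np.ndarray, sun_azimuth_deg: float,
              sun_altitude_deg: float, cellsize: float = 1.0) -> np.ndarray:
    """Shade a digital elevation model for a given sun position using the
    classic slope/aspect hillshade formula. Returns values in [0, 1]."""
    az = np.radians(sun_azimuth_deg)
    alt = np.radians(sun_altitude_deg)
    dy, dx = np.gradient(elev, cellsize)          # row-wise, column-wise slopes
    slope = np.arctan(np.hypot(dx, dy))
    aspect = np.arctan2(-dx, dy)                  # one common convention
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0.0, 1.0)

# A single bump on flat ground: slopes facing the Sun brighten, slopes
# facing away darken — the effect is strongest at low sun altitudes,
# as at the lunar South Pole.
x, y = np.meshgrid(np.linspace(-1, 1, 50), np.linspace(-1, 1, 50))
dem = np.exp(-(x**2 + y**2) * 8)                  # Gaussian hill
img = hillshade(dem, sun_azimuth_deg=315, sun_altitude_deg=10)
```

Sweeping the azimuth and altitude through the values predicted for a landing date yields the sequence of shadow maps a lander could match against its camera images.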

The lunar surface is like a fingerprint, Petro said: no two landscapes are identical. Topography can therefore pin down a spacecraft’s exact location above the Moon by comparing descent images to the map, much as a forensic scientist matches fingerprints from a crime scene to a known person.

After landing, TRN can be used on the ground to help astronauts drive crewed rovers. As part of NASA’s lunar surface sustainability concept, the agency is considering a habitable mobility platform, similar to an RV, and a Lunar Terrain Vehicle (LTV) to help crews travel across the lunar surface.

Astronauts can typically travel short distances of a few miles in an unpressurized rover such as the LTV, as long as they have reference points to guide them. Traveling greater distances is much more challenging, especially since the Sun at the lunar South Pole is always low on the horizon, which adds to the visibility problems. Driving across the South Pole would be like driving a car due east in the early morning: the light can be blinding and landmarks can appear distorted. With TRN, astronauts might navigate better at the South Pole despite the lighting conditions, because the computer could detect hazards more reliably.

Speed is the key difference between using TRN to land a spacecraft and using it to navigate a crewed rover. Landing requires faster image capture and processing, with roughly one-second intervals between images. To bridge the gaps between images, onboard processors keep the spacecraft on track to land safely.
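Bridging the gap between image fixes is a form of dead reckoning: the last known state is propagated forward using the vehicle's own velocity until the next match arrives. The sketch below uses made-up numbers (a 5 m/s descent propagated at 10 Hz between 1 Hz fixes), purely to illustrate the idea:

```python
def propagate(pos: float, vel: float, dt: float) -> float:
    """Advance a position estimate by constant-velocity dead reckoning."""
    return pos + vel * dt

estimate = 1000.0          # altitude at the last image fix (m) — hypothetical
velocity = -5.0            # descent rate (m/s) — hypothetical
for _ in range(10):        # 10 Hz propagation between 1 Hz image fixes
    estimate = propagate(estimate, velocity, dt=0.1)
print(estimate)            # → 995.0 after one second of dead reckoning
```

When the next image is matched to the map, the propagated estimate is replaced (or blended, in a filter) with the fresh fix, keeping drift bounded between measurements.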

“When you move slower – such as with a rover, or like OSIRIS-REx orbiting the asteroid – you have more time to process the images,” said Carolina Restrepo, an aerospace engineer at NASA Goddard in Maryland who works to improve current data products for the lunar surface. “When you are moving very fast – during descent and landing – there is no time for that. You have to take the images and process them as quickly as possible on board the spacecraft, and it all has to be autonomous.”

Automated TRN solutions can meet the needs of human and robotic explorers as they navigate unique locations in our solar system, as the optical navigation challenges OSIRIS-REx faced during TAG on Bennu’s rocky surface demonstrate. Thanks to missions like LRO, Artemis astronauts will be able to use TRN algorithms, lunar topography data, and surface images to safely land on and explore the Moon’s South Pole.

“What we are trying to do is anticipate the needs of future terrain navigation systems by combining existing data types, to ensure we can build the highest-resolution maps for key positions along future trajectories and landing sites,” Restrepo said. “In other words, we need high-resolution maps for both scientific and navigation purposes.”


