
Online Magazine

What the Moon has kept in the dark – and how AI makes it visible

There are areas on the moon that neither humans nor robots have ever seen. Valentin Bickel and his team from ETH Zurich are now making them visible for the first time – with an AI-supported tool called HORUS. In an interview, Bickel explains how this works and why, with artificial intelligence, we could gain even deeper insights into our universe in the future.



Mr Bickel, you and your team from ETH Zurich are researching "the permanently shadowed regions of the Moon". Can you explain what that means?
The permanently shadowed regions are craters and depressions at the Moon's two poles that have not received a ray of sunlight for millions of years. As a result, these regions are extremely cold – in fact, they are among the coldest places in our solar system. At these temperatures, volatile elements and compounds such as water – so-called volatiles – could be deposited there. These volatiles are the focus of most missions that will fly to the Moon in the coming years; after all, they have great potential to answer a number of fascinating scientific questions: Where did the water on Earth come from? And how did our Sun evolve over its history? In addition, the hope is to use these volatiles and other resources in the future to supply a base on the Moon with water and oxygen – or to produce rocket fuel cheaply.

Topographic map of the lunar south pole with its permanently shadowed regions (Source).

To gain insights into this so-called South Pole region of the Moon, you use images from the Lunar Reconnaissance Orbiter (LRO) probe – together with artificial intelligence. What is AI's contribution here? What new insights does it provide?
The LRO camera was designed for the sunlit regions of the Moon. In shadowed areas, most images suffer from extreme noise – similar to a smartphone camera at night – which makes them largely unusable. We have developed an AI-assisted (deep learning) method to remove the noise in these images. These optimised images give us new, unique insights into shadowed areas – regions that no human or robot has seen before.

These optimised images give us new, unique insights into shadowed areas – regions that no human or robot has seen before.

How exactly does your AI or deep learning tool work in technical terms?
Our tool HORUS – Hyper-effective nOise Removal Unet Software – consists of two networks. Network #1 (DestripeNet) was trained with more than 70,000 calibration images routinely taken on the night side of the Moon. In addition, Network #1 uses a number of camera environment parameters, such as the camera temperature, which have a strong influence on image noise. Network #2 (PhotonNet) was trained with millions of synthetic image pairs, created by adding synthetic noise to "clean" images; Network #2 then learned to produce clean images from the noisy ones. The two networks are applied in sequence.
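The ideas described here – synthetic (noisy, clean) training pairs for PhotonNet, parameter-driven stripe removal, and applying both stages in sequence – can be sketched in a few lines. This is a minimal illustrative numpy sketch, not HORUS itself: the real networks are learned deep models, whereas here `destripe` and `denoise` are simple stand-ins, and `photon_scale` is a hypothetical parameter for generating photon noise.

```python
import numpy as np

rng = np.random.default_rng(42)

def make_training_pair(clean, photon_scale=20.0):
    """Build a (noisy, clean) pair by adding synthetic photon (Poisson)
    noise to a clean image - the training strategy described for PhotonNet.
    `photon_scale` (a hypothetical knob) sets expected photons per pixel."""
    counts = rng.poisson(clean * photon_scale)
    return (counts / photon_scale).astype(np.float32), clean.astype(np.float32)

def destripe(img, stripe_offsets):
    """Stand-in for DestripeNet: remove a per-column offset that, in the
    real system, is predicted from camera parameters such as temperature."""
    return img - stripe_offsets[np.newaxis, :]

def denoise(img, kernel=3):
    """Stand-in for PhotonNet: a plain mean filter instead of a learned
    U-net, just to keep the two-stage pipeline runnable."""
    pad = kernel // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    h, w = img.shape
    for dy in range(kernel):
        for dx in range(kernel):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (kernel * kernel)

# A smooth toy "clean" scene plus a synthetic per-column stripe pattern.
x = np.linspace(0.1, 1.0, 16)
clean = np.outer(x, x) + 0.1
stripes = rng.normal(0.0, 0.05, size=16)

noisy, target = make_training_pair(clean)
raw = noisy + stripes[np.newaxis, :]        # what the camera would deliver
restored = denoise(destripe(raw, stripes))  # stage 1, then stage 2
```

In the real system, millions of such pairs are used to train PhotonNet to invert the noise process, and the stripe pattern is predicted from camera telemetry rather than given directly.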

Can you tell us something about the development process of this AI solution and the challenges involved?
The biggest challenge was understanding and tracking the camera and all the relevant image processing steps. LRO uses a number of different, complex modes to capture images, which are then processed and compressed in different ways before being downlinked from the satellite. Before we could design HORUS, it was important to understand exactly where an AI-powered solution could come in – and exactly what data we could work with to preserve the scientific integrity of the images.

Gaining a better understanding of the Earth thanks to AI ...

Thanks to artificial intelligence, not only the surface of the Moon but also the Earth can be explored in new ways. AI combined with satellite images makes it possible, for example, to better identify changes in the Earth's surface over time. Among other things, this can help to combat climate change.

Find out more here!

With your work, you make a significant contribution to ensuring that future lunar missions are successful. Above all, you are visualising and identifying potential landing sites and promising locations for exploration, right?
At least that is our goal. Our images allow scientists and engineers to get a first look at certain shadowed areas: Which shadow area looks interesting? Where are there particularly many boulders on the surface? What would be the easiest and safest way to enter a particular shadow area? All this information has real practical use: for example, a landing site could be placed so as to minimise the distance to a particularly interesting shadow area. In the best case, our images will also help to identify potential risks for robots and astronauts ahead of time.

HORUS image of a permanently shadowed region on the Leibnitz Plateau embedded in a regular narrow-angle camera image (Source).

You work together with NASA, but also with other institutions. Which lunar missions will your ETH team be involved in and how?
For this particular project, the collaboration with NASA and colleagues from other institutions is on a purely scientific basis. Unfortunately, as far as I know, there are no concrete plans for mission participation at the moment, at least in the lunar shadow regions. However, a lot is still in flux at the moment – until then, we are using HORUS for further scientific studies that could have a direct or indirect benefit for future missions.

In the best case, our images will also help to identify potential risks for robots and astronauts ahead of time.

What is the next step in your current research in this area? Can or will the existing AI solution be further developed?
The performance of HORUS is mainly driven by the quality and quantity of available training data. We expect a flood of new data from the Moon in the next few years – with this data we want to further improve HORUS.

Does your research or your successful AI approach open other doors? Could it also be used to make other corners of the universe visible?
Absolutely. At its core, HORUS is a tool that uses environmental parameters and calibration data to reduce image noise in low-light images. Theoretically, one could develop and use a variant of HORUS for any imaging sensor, provided there is sufficient training data.

In general, what potential do you see in your field of research for technologies like AI – and where do we stand today?
Planetary research is still in its infancy as far as AI is concerned. Astrophysics and astronomy are already much further along. Nevertheless, I personally still see great potential for AI-supported methods in all these fields of research. For example, it is particularly interesting to use AI methods to systematically and, above all, quickly search through the sometimes extremely large data sets – terabytes to petabytes – for interesting objects and/or events that can then be scientifically analysed. It is impossible for a human to look through an archive of more than 2 million satellite images – an AI can do it in a few days.

More on the topic of AI in space

The idea of using artificial intelligence to solve problems in space is not new. The European Space Agency (ESA) and the German Research Centre for Artificial Intelligence (DFKI), for example, are pursuing the approach of using AI to prevent collisions between satellites and space debris.

Read the interview with Marlon Nuske here!

What is your personal research goal or wish?
I am only at the beginning of my career, but my personal goal is to make AI-based methods useful for planetary research and exploration. We are at the beginning of a great adventure – we are going back to the Moon! – and my wish is to help future missions succeed. The next few years will be very exciting.

About Valentin Bickel

Valentin Bickel (*1990) is a planetary scientist at ETH Zurich. His research focuses on the geomorphology and evolution of planetary surfaces, AI-assisted remote sensing, and soil and terramechanics. From November 2022, Bickel will continue his research at the University of Bern.

