NASA Picturing Earth: Behind the Scenes [Video]

From their perch on the International Space Station, astronauts have spent twenty years sharing a story about Earth as they see it from above. Like the directors of any film, those astronaut storytellers have a crew working behind the scenes to help them tell that story. Meet the Earth Science and Remote Sensing Unit (ESRS), the researchers who guide astronauts as they observe and document changes on Earth and then make those photographs accessible to scientists and the public.

Learn more about astronaut photography and the ESRS team in parts 1 and 2 of the series.

Video Transcript:

From their perch on the International Space Station, astronauts have spent two decades sharing a story about Earth as they see it from above.

Like the directors of any film, those astronaut storytellers have a cast and crew working behind the scenes to help them tell that story.

The Earth Science and Remote Sensing Unit (ESRS) here at Johnson Space Center exists primarily to support the International Space Station program. Our job there is twofold.

One is to run and manage the Crew Earth Observations Facility on the ISS. That’s the handheld crew imagery of the Earth. The other is to serve as subject matter experts to the ISS program on matters of remote sensing.

Though the photos taken from the space station can be quite beautiful, they’re mostly shot for practical reasons. Astronaut photography is a science product designed to help everyone from academic and government researchers to resource managers and conservation groups to educators and students.

The photographs document changes in our cities and remote ecosystems, in polluted waters, and pristine landscapes. They have been used to study everything from urban development and economics to unusual electrical discharges in the atmosphere. These snapshots have been used to observe fishing boats and coral reefs, calving icebergs, and vast inland deltas.

Though handheld cameras are not as precise as robotic imagers, they provide a nice complement. Most satellites view the world at the same time and same resolution with each pass. But each Space Station orbit brings different viewing angles, different times of day, and different lighting.

We’ve had a number of external science requests for nighttime imagery of various cities for investigators who are looking at light pollution and seeing how that’s affecting biodiversity in their cities.

The Earth Science and Remote Sensing team manages dozens of requests for images with new ones coming in weekly.

We’ve got this long list of maybe 50 to 100 targets on any given day that we will parse through. The way we decide which targets we will ask for the next day is, we’ll see where the ISS is. We’ll take into account lighting conditions, whether it’s light enough for daytime targets or dark enough for nighttime targets.

Once we know what they can see, we’ll filter that for predicted weather and cloud cover in that area. There’s no point in asking them for a target if they’re not going to be able to see it.
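
In rough terms, that daily triage is a filtering pipeline: orbit geometry first, then lighting, then weather. Here is a minimal Python sketch of the idea; the target list, ground-track check, and sun and cloud lookups are hypothetical stand-ins, not the team’s actual tooling.

```python
from dataclasses import dataclass

# Hypothetical stand-in for an entry on the team's target list.
@dataclass
class Target:
    name: str
    lat: float
    lon: float
    needs_daylight: bool  # True for daytime targets, False for nighttime ones

def is_near_ground_track(target: Target,
                         ground_track: list[tuple[float, float]],
                         max_offset_deg: float = 5.0) -> bool:
    """Rough check: does the predicted pass come close to the target?"""
    return any(abs(target.lat - lat) <= max_offset_deg and
               abs(target.lon - lon) <= max_offset_deg
               for lat, lon in ground_track)

def select_targets(targets, ground_track, sun_elevation_at, cloud_fraction_at,
                   max_cloud_fraction=0.3):
    """Filter the 50-100 candidates down to the next day's requests."""
    selected = []
    for t in targets:
        if not is_near_ground_track(t, ground_track):
            continue  # the ISS never comes close enough on this pass
        sun_up = sun_elevation_at(t.lat, t.lon) > 0
        if t.needs_daylight != sun_up:
            continue  # wrong lighting: daytime target at night, or vice versa
        if cloud_fraction_at(t.lat, t.lon) > max_cloud_fraction:
            continue  # no point asking if the crew can't see it
        selected.append(t)
    return selected
```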

With a new sunrise and sunset every 90 minutes, the astronauts have 16 chances to see what the planet and its people are doing.

But life on the Space Station brings its own sort of sunrise and sunset.

Flight surgeons say do not ask for anything during their sleep period, which is a third of the day. That’s a full eight hours. But you may ask for anything between six o’clock in the morning and 21:30 in the evening, which includes their getting-up time, breakfast time, and the work day. And bedtime, supper time, brush-your-teeth time.

Having considered the astronauts’ schedules and the viability of each target, the Earth Observations team chooses a few targets for each day. They assemble guides and maps so the astronauts can quickly orient themselves. They send the target plans to payload operations at Marshall Space Flight Center, where photo ops are reviewed, approved, and added to the ISS workday.

They will provide useful information for us, maybe cues off the surface of the Earth that will help train our eye to go find a specific target that otherwise would be very difficult to find. They want that target for specific reasons.

Preflight, they will have trained us on that and on why those elements of the data are so important, which helps us with the technique we might apply in getting the photograph.

Perhaps the hardest part of the job for the Earth Observations team comes after the astronauts put down the camera. Anywhere from 100 to 10,000 images are sent down to ground control every day.

Once the imagery gets downlinked, it gets ingested into our system, and I’ll look through the images and see if they’re appropriate to send to the researchers.

Though computers are now part of the process, for most of the past 20 years those photos have been sorted by human hands and eyes.

That’s the highest priority for us, to catalog them. And by cataloging I mean adding descriptive metadata to the image: things like what geographic features you can see in it. That helps when it’s fed into our online database. That’s what enables people to search for those images.
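
In practice, cataloging boils down to attaching searchable tags to each frame. Here is a minimal Python illustration; the record fields and feature tags are invented for the example, not the actual schema of the online database.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class CatalogRecord:
    """Illustrative metadata record for one downlinked frame."""
    image_id: str
    acquired: datetime
    nadir_lat: float   # where the ISS was, not necessarily what's pictured
    nadir_lon: float
    features: set[str] = field(default_factory=set)  # tags added by catalogers

def search(catalog: list[CatalogRecord], *wanted: str) -> list[CatalogRecord]:
    """Return frames tagged with every requested geographic feature."""
    return [rec for rec in catalog if set(wanted) <= rec.features]

# Example: find every cataloged frame showing both a delta and a city.
catalog = [
    CatalogRecord("frame-0001", datetime(2021, 6, 1, 14, 30), 29.9, 31.2,
                  {"delta", "river", "city"}),
    CatalogRecord("frame-0002", datetime(2021, 6, 1, 14, 32), 30.1, 32.0,
                  {"coastline", "clouds"}),
]
print([rec.image_id for rec in search(catalog, "delta", "city")])
```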

Every photo has some basic information about the location. But since the ground team did not shoot the photos they have to decipher what can be seen in them. It is a monumentally important but monumentally time-consuming job.

These images are taken by the astronauts looking out of the cupola with a handheld camera. They can look in any direction they want. We know exactly where the space station was when they took the picture.

So we kind of have a general idea of where it could possibly be. We know what hemisphere it is in, for instance. But we don’t necessarily know which way they were looking.
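
That ambiguity has a simple geometric bound: whatever the camera captured must lie within the horizon visible from the station. A back-of-the-envelope Python sketch, using rounded values for Earth’s radius and the ISS altitude:

```python
import math

# Illustrative numbers only; not the team's actual geolocation method.
R = 6371.0   # mean Earth radius, km
h = 420.0    # approximate ISS altitude, km

# Ground-arc distance from the nadir point to the visible horizon.
horizon_arc = R * math.acos(R / (R + h))
print(f"Anything in the frame lies within ~{horizon_arc:.0f} km of nadir.")
# -> about 2,250 km, so "a general idea" still leaves a huge search area.
```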

It is a rich decades-long catalog of the dynamic changes, forces, and beauties of our planet.

It always struck me how little the remote sensing community used this data. And primarily that was because it historically has been a difficult data set to use.

One of the focuses that I’ve had leading this group has been: okay, we need to bring new tools to bear to make this dataset more useful.

Stefanov’s team has been working to automate more of its tasks using new software and computing techniques like machine learning.

If we have machines that can identify these features automatically, then our database becomes that much more searchable and the public can use it more easily, scientists can find exactly what they’re looking for.

Lambert and colleagues are now training computers to detect and recognize certain features on the land, in the sky, and on the ocean. They’re using neural networks to help machines quickly identify photos with a lake or a sea, the limb of the Earth or the Moon, a crater or a city.
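
A minimal multi-label sketch of that idea in PyTorch might look like the following. The label set, architecture, and training data here are illustrative assumptions; the narration confirms only that neural networks are used to tag features like these.

```python
import torch
import torch.nn as nn

# Hypothetical label set; the team's actual classes are not published here.
FEATURES = ["lake_or_sea", "earth_limb", "moon", "crater", "city", "clouds"]

class FeatureTagger(nn.Module):
    """Tiny CNN sketch for multi-label tagging of astronaut photos.
    One output per feature, since a frame can show several at once."""
    def __init__(self, n_labels: int = len(FEATURES)):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, n_labels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(x))  # raw logits, one per feature

model = FeatureTagger()
loss_fn = nn.BCEWithLogitsLoss()  # independent yes/no decision per feature

# One dummy training step on random data, just to show the shapes.
images = torch.randn(4, 3, 224, 224)
labels = torch.randint(0, 2, (4, len(FEATURES))).float()
loss = loss_fn(model(images), labels)
loss.backward()
```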

They’re particularly focused on clouds.

One problem that people doing GIS have is that you’ve got our imagery, and it’s great, but it’s got clouds in it.

And those clouds obscure what you’re interested in looking at. If we have an algorithm that produces a mask where every pixel is labeled as either cloud or no cloud, those labels can be applied to the whole image, and researchers can subtract the clouds from their analysis so the cloudy pixels are ignored.
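
Consuming such a mask is straightforward once the per-pixel labels exist. Here is a minimal NumPy sketch; the image and mask are randomly generated stand-ins, since producing a real mask is exactly the hard part the team’s algorithm addresses.

```python
import numpy as np

# Fake inputs: a grayscale image and a per-pixel cloud mask (True = cloud).
# In practice the mask would come from the team's cloud-detection algorithm.
image = np.random.rand(512, 512)
cloud_mask = np.random.rand(512, 512) > 0.7

# A masked array lets downstream statistics simply ignore cloudy pixels.
clear_only = np.ma.masked_array(image, mask=cloud_mask)

print("mean over all pixels:  ", image.mean())
print("mean over clear pixels:", clear_only.mean())
print("fraction cloudy:       ", cloud_mask.mean())
```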

Astronaut photographs have been used for some scientific studies and for inspiring people to better appreciate the planet.

But the Earth Observations team believes those photos could have even wider use, and they want to make sure the world sees more of them.
