On a lovely evening in Pasadena, Sue Owen described the NASA-ISRO SAR (NISAR) mission, which is primarily aimed at mapping the damage caused by natural disasters using synthetic aperture radar and interferometry.
What does that really mean?
I couldn’t tell you. But there was some super cool space stuff discussed.
Joking, only partially though.
When a natural disaster occurs, a lot of questions are raised. How bad was the earthquake? What was its magnitude? How many people are affected? Where do we direct aid?
These can be answered, in part, through the mapping of damage. Data points collected by scientists help us understand how much the earth has changed as a result of a natural disaster. Examples of these measurements include sea-level increases, ground displacement, infrastructure damage, etc. But how do we know how much change has occurred? Well, that’s where your friendly neighborhood space agency swoops in.
Making these measurements from space makes sense: you can measure on a global scale (Nepal to Mexico to Hawaii), and humans are kept out of harm’s way. I always think of the reporter who inevitably ends up in “the eye of the storm.” That can’t be safe.
One of the ways this is done is geodetic imaging, which monitors the shifting of tectonic plates through the use of high-precision GPS. While the GPS in our phones is accurate to within about 5 meters, the GPS tracked via permanent stations is accurate to 1-2 millimeters horizontally and 5 millimeters vertically. Whoa. Also cool to note: apparently the government has a neat little site that talks about GPS accuracy.
Many of these permanent GPS stations are located in places you would expect, with the Western US and Japan being especially well covered (popping up… like daisies!). No, I will not cool it on the Mulan references.
But then, synthetic aperture radar busted its way into the presentation, and it held the audience captive for the rest of Sue’s talk.
The great thing about synthetic aperture radar (SAR being its street name) is that it can assess how much the land has changed by taking “photos” of the ground. Radar can pass through cloud cover, which makes it an obvious choice over traditional photography, and it can also cover a larger swath of ground.
What these passes create are somewhat trippy graphics called “interferograms” (the newer, hippier Instagram).
The satellite emitting the radar has to make two passes over the disaster area to measure the rate of change and ground displacement. Each color cycle is called a fringe, and one full cycle of color (from blue back to blue) indicates a fixed amount of ground movement. Thus, the closer together the bands of color are, the greater the displacement in that area (lots of movement).
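To make the fringe idea concrete, here’s a minimal sketch of the arithmetic, assuming the standard InSAR rule of thumb that one full fringe corresponds to half the radar wavelength of motion along the satellite’s line of sight. The wavelength values below are approximate and purely illustrative (they weren’t part of Sue’s talk).

```python
def fringes_to_displacement_cm(num_fringes: float, wavelength_cm: float) -> float:
    """Line-of-sight displacement implied by a count of full interferogram fringes.

    One fringe (a full color cycle) represents wavelength/2 of motion
    toward or away from the satellite.
    """
    return num_fringes * wavelength_cm / 2.0


# Approximate wavelengths for two common SAR bands (illustrative values):
L_BAND_CM = 24.0  # L-band, the kind of radar planned for NISAR
C_BAND_CM = 5.6   # C-band, used by other SAR satellites

# Three fringes packed into one area imply very different amounts of
# ground motion depending on the radar's wavelength:
print(fringes_to_displacement_cm(3, L_BAND_CM))  # 36.0 cm of motion
print(round(fringes_to_displacement_cm(3, C_BAND_CM), 1))  # 8.4 cm
```

This is also why tightly packed fringes mean big displacement: each extra band of color stacks another half-wavelength of motion into the same patch of ground.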
Changes detected by SAR can be localized to individual city blocks. These damage maps are often passed into the hands of FEMA and other agencies to plan aid for disaster relief.
Sue had a number of super interesting visuals in her presentation that I was unable to find online, but I assure you, the amount of precision in this technology is downright impressive.
The trouble with this program is its reliance on other countries’ cooperation. NASA does not currently have its own SAR satellite, so the mission partners with Italy, Canada, Japan, and other countries in order to access data. Not all of these countries are able to set aside the commercial and civilian missions that fund their SAR satellites to map areas just hit by disasters. Thus, time delays are common.
The good news is that in 2022, NASA will launch its own SAR satellite (WOO!).
Bad news is until then, we have to push countries for open and free data policies. Ha…. haha…. ha.
Boundaries aside, this mission is mega cool. And it further cements my perception of the people of NASA as superheroes.