Newswise — BROOKINGS, S.D. — Far-sighted data policy and cloud computing are leading to the “democratization of satellite mapping,” one expert says — and the payoff will be wider access to information about the earth through tools such as the new Google Earth Engine, a planetary-scale platform for environmental data and analysis.

That is the view of South Dakota State University professor Matt Hansen, one of several scientists who worked with Google to launch Google Earth Engine. The new technology was showcased at the annual meeting of the United Nations Framework Convention on Climate Change in Cancun, Mexico, in December 2010.

Hansen, co-director of the Geographic Information Science Center of Excellence at SDSU, said that until now, analyzing remote sensing data from satellites has required a hefty investment in infrastructure and extensive training. But not anymore. New U.S. Geological Survey policies have made satellite images available for free. That change, paired with the cloud-computing capability offered by organizations such as Google, is making it possible for ordinary people to analyze satellite imagery without needing expensive equipment of their own.

“Eventually — soon, I expect — they’ll have the entire Landsat archive online at Google. And they’ll have the cloud computing capability to process all the data,” Hansen said. “This is an incredible advantage in terms of generating the value-added products that we create for quantifying deforestation, natural hazards, cropland area, urbanization, you name it.”
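For readers curious what that looks like in practice, the sketch below shows the kind of query this enables, written against the present-day Earth Engine Python API (which the article itself does not describe). The collection ID, dates, and coordinates are illustrative assumptions, not details from Hansen's work.

```python
# Minimal sketch: building a cloud-processed Landsat composite from the
# Earth Engine archive. Assumes the `earthengine-api` package and an
# authenticated Google account; the collection ID, dates, and location
# below are illustrative.
import ee

ee.Authenticate()  # one-time, browser-based login
ee.Initialize()

# A point of interest somewhere in central Mexico (hypothetical).
aoi = ee.Geometry.Point([-99.1, 19.4])

# Landsat 5 surface reflectance scenes covering that point for one year.
landsat = (
    ee.ImageCollection('LANDSAT/LT05/C02/T1_L2')
    .filterBounds(aoi)
    .filterDate('2010-01-01', '2010-12-31')
)

# Per-pixel median composite across all matching scenes. The heavy
# computation runs server-side in Google's cloud, not on the user's
# machine, which is the point Hansen is making.
composite = landsat.median()

# NDVI from the red (SR_B3) and near-infrared (SR_B4) bands, a common
# first step toward vegetation and deforestation mapping.
ndvi = composite.normalizedDifference(['SR_B4', 'SR_B3'])
print(ndvi.getInfo())  # fetch metadata to confirm the computation ran
```

Nothing in that snippet touches local storage or local processing power; the archive and the computation both live in the cloud, which is what removes the infrastructure barrier Hansen describes.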

Google Earth Engine was one of the innovative ideas unveiled at the Cancun climate talks. Hansen and postdoctoral researcher Peter Potapov of SDSU worked with Google to process more than 50,000 images and produce a detailed map of Mexico that demonstrated the technology.

"We are very excited about our collaboration with Dr. Matt Hansen and SDSU,” said Rebecca Moore, engineering manager for Google Earth Outreach and Google Earth Engine. “We're hopeful that the combination of our technology and his deep scientific expertise will contribute to a better understanding of the earth and its dynamics."

Hansen noted that the technology is a response to a far-sighted decision by the U.S. Geological Survey to make satellite imagery data available for free. Just two years ago, a user would have had to spend $32 million simply to get access to the images Google and Hansen’s SDSU team processed.

“It’s not just Google. It’s good data policy. When the U.S. Geological Survey made the data free, all of a sudden this whole new world opened up to us. It implies that you have to have cloud computing capability to mine all of those data,” Hansen said. “Landsat imagery went from a cost model to a free basis, so the data that we use as our main monitoring observation, 30-meter Landsat data, went from $600 per image — which covers around 185 kilometers by 185 kilometers — to being free. So instead of begging and borrowing for money to work with, say, a couple hundred images, we now can access tens of thousands of images. Once you do that, you need to upscale your computing.”
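The arithmetic behind those figures is easy to check. The quick sketch below uses only the numbers quoted in the article; the exact image count is the “more than 50,000” from the Mexico demonstration, rounded up as an assumption.

```python
# Back-of-the-envelope check of the costs quoted above. The scene price
# and scene size come from the article; the image count is assumed from
# the "more than 50,000" figure for the Mexico map.
PRICE_PER_SCENE_USD = 600     # pre-2008 price of one Landsat scene
SCENE_EDGE_KM = 185           # each scene covers ~185 km x 185 km
N_IMAGES = 53_000             # assumed count, per "more than 50,000"

total_cost = PRICE_PER_SCENE_USD * N_IMAGES
print(f"Old cost of the imagery used: ${total_cost:,}")
# -> $31,800,000, consistent with the ~$32 million cited earlier

area_per_scene_km2 = SCENE_EDGE_KM ** 2
print(f"Ground area per scene: {area_per_scene_km2:,} km^2")
# -> 34,225 km^2 per scene
```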

Improved, publicly available processing tools will “democratize” satellite data analysis as more people become engaged in working with the data. But, he noted, that will require more collaboration among academics, government scientists, and perhaps private industry to process and characterize the satellite data sets.

“There is always a chance that uninformed people will try their hand at making advanced products, and they’ll be able to do it because they will not have to have the infrastructure behind it,” Hansen said. “That’s going to put the onus on accuracy. We’re going to have to put a lot of money into accuracy assessment or what we call validation — having data sets that help determine the accuracy of the map products. This will ensure that the most accurate information on how the earth is changing is used in making policy decisions.”
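As a concrete illustration of the validation step Hansen describes (a generic sketch, not his team’s actual method), accuracy assessment typically compares mapped classes against independently collected reference labels at sample locations and summarizes the agreement:

```python
# Generic sketch of map validation: compare mapped land-cover classes
# against independent reference labels at sample points and report
# overall accuracy. All labels and samples here are made up.
from collections import Counter

mapped    = ['forest', 'forest', 'crop', 'urban', 'forest', 'crop']
reference = ['forest', 'crop',   'crop', 'urban', 'forest', 'crop']

# Confusion matrix as counts of (mapped, reference) pairs.
confusion = Counter(zip(mapped, reference))

correct = sum(n for (m, r), n in confusion.items() if m == r)
overall_accuracy = correct / len(mapped)
print(f"Overall accuracy: {overall_accuracy:.0%}")  # 83% on this toy sample
```

In real assessments the reference data come from field visits or higher-resolution imagery, and the sample is designed statistically; the principle, though, is exactly this comparison.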
