Terrestrial Big Data
Posted: October 3, 2012
Filed under: Measurement and Analytics | Tags: big data, climate, environment, NEON, science
We all frequently work in and with organizations that operate and think in the short term. Here is a refreshing counterexample: a long-term plan to gather high-quality data in a standardized way. The plan is called NEON, the U.S. National Ecological Observatory Network, and the system is described in The Economist:
Once this network is completed, in 2016 if all goes well, 15,000 sensors will be collecting more than 500 types of data, including temperature, precipitation, air pressure, wind speed and direction, humidity, sunshine, levels of air pollutants such as ozone, the amount of various nutrients in soils and streams, and the state of an area’s vegetation and microbes.
Crucially, these instruments will take the same measurements in the same way in every place. By gathering data in this standardised way, and doing so in many places and over long periods of time, Dr Schimel hopes to achieve the statistical power needed to turn ecology from a craft into an industrial-scale enterprise. The idea is to see how ecosystems respond to changes in climate and land use, and to the arrival of new species. That will let the team develop models which can forecast the future of an ecosystem and allow policymakers to assess the likely consequences of various courses of action.
NEON’s researchers have divided America into 20 domains (see above), each of which is dominated by a particular type of ecosystem. Each domain will have three sets of sensors within it. One set will be based in a core site—a place where conditions are undisturbed and likely to remain so—that will be monitored for at least 30 years. The other two sets will move around, staying in one place for three to five years before being transplanted elsewhere. These “relocatable” sites will allow comparisons to be made within a domain.
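The sampling design described above can be summarized in a small sketch. This is purely illustrative (the class and function names are mine, not NEON's): 20 domains, each with one core site monitored for at least 30 years and two relocatable sites that move every three to five years.

```python
from dataclasses import dataclass

@dataclass
class Site:
    kind: str        # "core" or "relocatable"
    min_years: int   # planned monitoring duration at one location

def build_domain() -> list[Site]:
    """One core site (30+ years) plus two relocatable sites (3-5 years each)."""
    return [
        Site("core", 30),
        Site("relocatable", 3),
        Site("relocatable", 3),
    ]

# 20 domains, each dominated by a particular type of ecosystem
domains = {domain_id: build_domain() for domain_id in range(1, 21)}

total_site_sets = sum(len(sites) for sites in domains.values())
print(f"{len(domains)} domains, {total_site_sets} sensor-site sets in total")
```

The design choice worth noting is the split itself: the fixed core sites provide the long, uninterrupted baselines, while the relocatable sites trade continuity for within-domain comparisons.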
Every site, whether core or relocatable, will have a sensor-laden tower that reaches ten metres above the existing vegetation. In an area of a few tens of square kilometres around this tower, the researchers will place further sensors in the soil and in local streams, to measure temperature, carbon-dioxide and nutrient levels, along with rates of root growth and the activities of microbes. These sensors will indicate how efficiently different ecosystems use nutrients and water, how vegetation responds to the climate, and how carbon dioxide moves between living things and the atmosphere. That will help those who seek to understand the carbon cycle—and with it, the consequences of greenhouse-gas-induced climate change.
To complement these ground-based measurements, which can focus on only a limited area, the team will conduct aerial surveys once a year at each core site, looking at things like leaf chemistry and the health of forest canopies, and will also look down on them with satellites. In addition, NEON’s researchers can deploy a specially equipped aeroplane, fitted with lidar (an optical version of radar), a spectrometer (to measure chemical compositions) and a high-resolution camera, to assess the impact of natural disasters such as floods, wildfires and outbreaks of pests.
So many data, of course, require a lot of number crunching. Indeed, it might be argued that what truly distinguishes Big Science from the small stuff—as astronomers and physicists have known for decades and biologists discovered in the aftermath of the Human Genome Project—is not the amount of money involved but the volume of data that needs to be processed. When fully operational, NEON is expected to generate 200 terabytes a year. That is four times as much as the Hubble space telescope, a reasonably big piece of science, churned out in its first two decades.
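The comparison above implies some numbers worth making explicit. Working back from the article's figures (200 TB/year for NEON, four times Hubble's first-two-decades total), a quick back-of-envelope:

```python
# Back-of-envelope check of the data-volume comparison in the text.
neon_tb_per_year = 200
ratio_vs_hubble_total = 4  # NEON's yearly output is "four times" Hubble's 20-year total

# Implied Hubble totals, derived from the article's own figures
hubble_tb_total = neon_tb_per_year / ratio_vs_hubble_total   # 50 TB over ~20 years
hubble_tb_per_year = hubble_tb_total / 20                    # 2.5 TB/year on average

print(f"Implied Hubble total over 20 years: {hubble_tb_total:.0f} TB")
print(f"NEON yearly rate vs Hubble yearly rate: "
      f"{neon_tb_per_year / hubble_tb_per_year:.0f}x")
```

Put on a per-year basis, the gap is even starker than the headline figure: NEON's projected annual output is roughly 80 times Hubble's average annual output over its first two decades.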
NEON, then, truly does represent a shift by ecologists towards bigness. No doubt that will change the practice of the subject, just as astronomy, physics and genetics changed when they became big.