Global Warming


Thursday, 2 August 2012

China Olympics Traffic Measures Cut Carbon Emissions

A new NASA-funded study of the impacts of China's traffic restrictions for the 2008 Summer Olympics in Beijing shows how widespread changes in transportation patterns could greatly reduce the threat of climate change.

New research by an international team of scientists led by the National Center for Atmospheric Research (NCAR), Boulder, Colo., indicates that China's restrictions on motor vehicles designed to improve air quality during the games had the side benefit of dramatically cutting emissions of carbon dioxide by between 26,500 and 106,000 U.S. tons (24,000 and 96,000 metric tons) during the event.

To put this in perspective, the authors note that this reduction by a single city represents more than one-quarter of one percent of the emissions cut that would be necessary worldwide, on a sustained basis, to prevent the planet from heating up by more than about 3.6 degrees Fahrenheit (2 degrees Celsius) by the end of this century. That is the amount of heating generally considered to lead to major societal impacts.

While scientists have long known that reduced traffic would lead to lower carbon-dioxide emissions, precise estimates for an actual urban area are difficult to calculate. "The Beijing Olympics allowed us to actually measure what happens when people drive much less, and it turns out that it makes quite a substantial difference to our climate," says NCAR scientist Helen Worden, the lead author. "People may think their choice of how to commute to work doesn't make a difference, whether driving their cars or riding their bikes. But on a large scale, it really does."

Recent research has confirmed that China's traffic restrictions successfully reduced levels of air pollutants such as carbon monoxide and ozone. Worden and her colleagues, using new methods in satellite observations and computer simulations, were also able to estimate the impact on carbon dioxide, a powerful greenhouse gas. Data from the NCAR/University of Toronto Measurements of Pollution in the Troposphere (MOPITT) instrument aboard NASA's Terra satellite were used to obtain the carbon monoxide estimates utilized to infer the carbon dioxide emissions.

Worden adds that the same study could not be done for this summer's London Olympics, partly because the surface and cloud conditions in London aren't as favorable for the measurements of carbon monoxide near the surface from MOPITT. In addition, London is very different from Beijing in terms of pollution controls, and it has already restricted traffic in the central city for several years.

Funded primarily by NASA, the study was published in Geophysical Research Letters, a publication of the American Geophysical Union. It was co-authored by researchers at the University of Iowa, Iowa City; Tsinghua University in Beijing; Argonne National Laboratory, Lemont, Ill.; and NASA's Jet Propulsion Laboratory, Pasadena, Calif. NCAR is sponsored by the National Science Foundation.

Wednesday, 13 June 2012

Storm sentinels


Beginning this summer and over the next several years, NASA will be sending unmanned aircraft dubbed "severe storm sentinels" above stormy skies to help researchers and forecasters uncover information about hurricane formation and intensity changes.

Several NASA centers are joining federal and university partners in the Hurricane and Severe Storm Sentinel (HS3) airborne mission targeted to investigate the processes that underlie hurricane formation and intensity change in the Atlantic Ocean basin.

NASA's unmanned sentinels are autonomously flown. The NASA Global Hawk is well-suited for hurricane investigations because it can over-fly hurricanes at altitudes greater than 60,000 feet with flight durations of up to 28 hours - something piloted aircraft would find nearly impossible to do. Global Hawks were used in the agency's 2010 Genesis and Rapid Intensification Processes (GRIP) hurricane mission and the Global Hawk Pacific (GloPac) environmental science mission.

"Hurricane intensity can be very hard to predict because of an insufficient understanding of how clouds and wind patterns within a storm interact with the storm’s environment. HS3 seeks to improve our understanding of these processes by taking advantage of the surveillance capabilities of the Global Hawk along with measurements from a suite of advanced instruments," said Scott Braun, HS3 mission principal investigator and research meteorologist at NASA's Goddard Space Flight Center in Greenbelt, Md.

HS3 will use two Global Hawk aircraft and six different instruments this summer, flying from a base of operations at Wallops Flight Facility in Virginia.

"One aircraft will sample the environment of storms while the other will measure eyewall and rainband winds and precipitation," Braun said. HS3 will examine the large-scale environment that tropical storms form in and move through and how that environment affects the inner workings of the storms.

HS3 will address the controversial role of the hot, dry, and dusty Saharan Air Layer in tropical storm formation and intensification. Past studies have suggested that the Saharan Air Layer can both favor or suppress intensification. In addition, HS3 will examine the extent to which deep convection in the inner-core region of storms is a key driver of intensity change or just a response to storms finding favorable sources of energy.

The HS3 mission will operate during portions of the Atlantic hurricane seasons, which run from June 1 to November 30. The 2012 mission will run from late August through early October.

The instruments to be mounted in the Global Hawk aircraft that will examine the environment of the storms include the scanning High-resolution Interferometer Sounder (S-HIS), the Advanced Vertical Atmospheric Profiling System (AVAPS) also known as dropsondes, and the Cloud Physics Lidar (CPL). The Tropospheric Wind Lidar Technology Experiment (TWiLiTE) Doppler wind lidar will likely fly in the 2013 mission.

Another set of instruments will fly on the Global Hawk focusing on the inner region of the storms. Those instruments include the High-Altitude Imaging Wind and Rain Airborne Profiler (HIWRAP) conically scanning Doppler radar, the Hurricane Imaging Radiometer (HIRAD) multi-frequency interferometric radiometer, and the High-Altitude Monolithic Microwave Integrated Circuit Sounding Radiometer (HAMSR) microwave sounder. Most of these instruments represent advanced technologies developed by NASA that, in some cases, are precursors to future satellite sensors.

NASA's Science Mission Directorate Global Hawk aircraft will deploy to Wallops Flight Facility from their home base at NASA's Dryden Flight Research Center on Edwards Air Force Base, Calif.


NASA's Global Hawk soars aloft from Edwards Air Force Base, Calif. Credit: NASA/Tony Landis
"HS3 marks the first time that NASA's Global Hawks will deploy away from Dryden for a mission, potentially marking the beginning of an era in which they are operated regularly from Wallops," said Paul Newman, atmospheric scientist at NASA Goddard and deputy principal investigator on the HS3 mission.

NASA's Science Mission Directorate in Washington is establishing a Global Hawk operations center for science operations from Wallops. "With the Global Hawks at NASA Dryden in California, NASA Wallops will become the 'Global Hawk - Eastern' science center," Newman said.

From rockets studying the upper atmosphere to unmanned aircraft flying over hurricanes, NASA's Wallops Flight Facility is fast becoming a busy place for science. Wallops is one of several NASA centers involved with the HS3 mission. Others include Goddard, Dryden, Ames Research Center, Marshall Space Flight Center, and the Jet Propulsion Laboratory.

The HS3 mission is funded by NASA Headquarters and managed by NASA's Earth System Science Pathfinder Program at NASA's Langley Research Center, Hampton, Va. The HS3 mission also involves collaborations with various partners including the National Centers for Environmental Prediction, Naval Postgraduate School, Naval Research Laboratory, NOAA's Hurricane Research Division and Earth System Research Laboratory, Northrop Grumman Space Technology, National Center for Atmospheric Research, State University of New York at Albany, University of Maryland - Baltimore County, University of Wisconsin, and University of Utah.

NASA's new carbon-counting instrument leaves the nest


Its construction now complete, the science instrument that is the heart of NASA's Orbiting Carbon Observatory-2 (OCO-2) spacecraft — NASA's first mission dedicated to studying atmospheric carbon dioxide — has left its nest at NASA's Jet Propulsion Laboratory in Pasadena, Calif., and has arrived at its integration and test site in Gilbert, Ariz.

A truck carrying the OCO-2 instrument left JPL before dawn on Tuesday, May 9, to begin the trek to Orbital Science Corporation's Satellite Manufacturing Facility in Gilbert, southeast of Phoenix, where it arrived that afternoon. The instrument will be unpacked, inspected and tested. Later this month, it will be integrated with the Orbital-built OCO-2 spacecraft bus, which arrived in Gilbert on April 30.

Once technicians ensure the spacecraft is clean of any contaminants, the observatory's integration and test campaign will kick off. That campaign will be conducted in two parts, with the first part scheduled for completion in October. The observatory will then be stored in Gilbert for about nine months while the launch vehicle is prepared. The integration and test campaign will then resume, with completion scheduled for spring 2014. OCO-2 will then be shipped to Vandenberg Air Force Base, Calif., in preparation for a launch as early as the summer of 2014.

Technicians load the OCO-2 instrument and its ground support equipment aboard a moving van at JPL in preparation for its trek to Orbital Science Corporation's Satellite Manufacturing Facility in Gilbert, Ariz. Credit: NASA/JPL-Caltech.
"The OCO-2 instrument looks great, and its delivery to Orbital's Gilbert, Ariz., facility is a big step forward in successfully launching and operating the mission in space," said Ralph Basilio, OCO-2 project manager at JPL.

OCO-2 is the latest mission in NASA's study of the global carbon cycle. Carbon dioxide is the most significant human-produced greenhouse gas and the principal human-produced driver of climate change. The original OCO mission was lost shortly after launch on Feb. 24, 2009, when the Taurus XL launch vehicle carrying it malfunctioned and failed to reach orbit.

The experimental OCO-2 mission, which is part of NASA's Earth System Science Pathfinder Program, will uniformly sample the atmosphere above Earth's land and ocean, collecting more than half a million measurements of carbon dioxide concentration over Earth's sunlit hemisphere every day for at least two years. It will do so with the accuracy, resolution and coverage needed to provide the first complete picture of the regional-scale geographic distribution and seasonal variations of both human and natural sources of carbon dioxide emissions and their sinks, the places where carbon dioxide is removed from the atmosphere and stored.

Scientists will use OCO-2 mission data to improve global carbon cycle models, better characterize the processes responsible for adding and removing carbon dioxide from the atmosphere, and make more accurate predictions of global climate change.

The mission provides a key new measurement that can be combined with other ground and aircraft measurements and satellite data to answer important questions about the processes that regulate atmospheric carbon dioxide and its role in the carbon cycle and climate. This information could help policymakers and business leaders make better decisions to ensure climate stability and retain our quality of life. The mission will also serve as a pathfinder for future long-term satellite missions to monitor carbon dioxide.

Each of the OCO-2 instrument's three high-resolution spectrometers spreads reflected sunlight into its various colors like a prism, focusing on a different, narrow color range to detect light with the specific colors absorbed by carbon dioxide and molecular oxygen. The amount of light absorbed at these specific colors is proportional to the concentration of carbon dioxide in the atmosphere. Scientists will use these data in computer models to quantify global carbon dioxide sources and sinks.
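As a rough illustration of that principle, the Beer-Lambert law relates the light absorbed along a path to the concentration of the absorbing gas. The sketch below uses a made-up effective absorption coefficient purely for demonstration; it is not an OCO-2 retrieval, which involves far more detailed radiative transfer modeling.

```python
import math

def transmitted_fraction(co2_ppm, k=2.0e-6):
    """Beer-Lambert law: I/I0 = exp(-k * c).
    k is an invented effective absorption coefficient, not an OCO-2 value."""
    return math.exp(-k * co2_ppm)

# Higher CO2 concentration -> more light absorbed at the CO2-specific colors
for ppm in (350, 392, 450):
    absorbed = 1 - transmitted_fraction(ppm)
    print(f"{ppm} ppm: fraction absorbed = {absorbed:.6f}")
```

For small optical depths the absorbed fraction grows nearly linearly with concentration, which is the proportionality the article describes.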

For more information on the mission, visit the JPL and NASA OCO-2 websites.

Mild fire forecast


Forests in the Amazon Basin are expected to be less vulnerable to wildfires this year, according to the first forecast from a new fire severity model developed by university and NASA researchers.

Fire season across most of the Amazon rain forest typically begins in May, peaks in September and ends in January. The new model, which forecasts the fire season’s severity from three to nine months in advance, calls for an average or below-average fire season this year within 10 regions spanning three countries: Bolivia, Brazil and Peru.

“Tests of the model suggested that predictions should be possible before fire activity begins in earnest,” said Doug Morton, a co-investigator on the project at NASA’s Goddard Space Flight Center in Greenbelt, Md. “This is the first year to stand behind the model and make an experimental forecast, taking a step from the scientific arena to share this information with forest managers, policy makers, and the public alike.”


Gauges convey the fire severity forecast for 10 regions in the Amazon Basin where fire activity varies greatly from year to year, and where climate conditions have a significant impact on fire activity. Credit: Yang Chen/UC Irvine
The model was first described last year in the journal Science. Comparing nine years of fire data from NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) on the Terra satellite, with a record of sea surface temperatures from NOAA, scientists established a connection between sea surface temperatures in the Pacific and Atlantic oceans and fire activity in South America.

“There will be fires in the Amazon Basin, but our model predictions suggest that they won’t be as likely in 2012 as in some previous years,” said Jim Randerson of the University of California, Irvine, and principal investigator on the research project.

Specifically, sea surface temperatures in the Central Pacific and North Atlantic are currently cooler than normal. Cool sea surface temperatures change patterns of atmospheric circulation and increase rainfall across the southern Amazon in the months leading up to the fire season.

“We believe the precipitation pattern during the end of the wet season is very important because this is when soils are replenished with water,” said Yang Chen of UC Irvine. “If sea surface temperatures are higher, there is reduced precipitation across most of the region, leaving soils with less water to start the dry season.”

Without sufficient water to be transported from the soil to the atmosphere by trees, humidity decreases and vegetation is more likely to burn. Such was the case in 2010, when above-average sea surface temperatures and drought led to a severe fire season. In 2011, conditions shifted and cooler sea surface temperatures and sufficient rainfall resulted in fewer fires, similar to the forecast for 2012.


Improvements to the model are possible by incorporating data from the MODIS instrument on NASA's Aqua satellite, accounting for fires that occur in the afternoon when conditions are hotter and drier. Credit: Doug Morton.
Building on previous research, the researchers said there is potential to adapt and apply the model to other locations where large-scale climate conditions are a good indicator of the impending fire season, such as Indonesia and the United States.

Amazon forests, however, are particularly relevant because of their high biodiversity and vulnerability to fires. Amazon forests also store large amounts of carbon, and deforestation and wildfires release that carbon back to the atmosphere. Predictions of fire season severity may aid initiatives – such as the United Nation’s Reducing Emissions from Deforestation and forest Degradation program – to reduce the emissions of greenhouse gases from fires in tropical forests.

“The hope is that our experimental fire forecasting information will be useful to a broad range of communities to better understand the science, how these forests burn, and what predisposes forests to burning in some years and not others,” Morton said. “We now have the capability to make predictions, and the interest to share this information with groups who can factor it into their preparation for high fire seasons and management of the associated risks to forests and human health.”

Muddled outlook


The 2012 hurricane season in North and Central America arrives with a muddled outlook. Sea surface temperatures are not particularly warm or cool, and the El Niño–Southern Oscillation (ENSO) is drifting in a neutral state that NASA climate scientist Bill Patzert playfully calls “La Nada.”

The map above shows sea surface temperatures (SSTs) in the tropical Atlantic Ocean and tropical eastern Pacific on May 30, 2012. The map was built with data from the Microwave Optimally Interpolated SST product, a NASA-supported effort at Remote Sensing Systems. Researchers combine observations and analyses from NASA’s Tropical Rainfall Measurement Mission and Aqua and Terra satellites, as well as the U.S. Navy’s WindSAT instrument on the Coriolis satellite (operated jointly with the Air Force).

Shades of blue depict water temperatures below 27.8 degrees Celsius (about 82 degrees Fahrenheit), while yellows, oranges, and reds depict waters above that threshold. Scientists generally agree that waters above that temperature are needed to build and sustain hurricanes, though there are exceptions. Of course, measurements of sea surface temperature account for only the top few millimeters of the ocean, and the amount of heat stored at greater depths (which is harder to measure) can also be a factor in hurricane development. So SSTs do not tell the whole story, but they are a fair predictor of the readiness of the ocean to sustain tropical storms.

“The waters look on the slightly cool side across some of the ‘main development region (MDR)’—the tropical band extending over the east and central Atlantic off Africa,” noted Jeff Halverson, a hurricane researcher at the University of Maryland–Baltimore County. “Whether this will persist for several months as we get into the high season, I don't know.”

The official start of hurricane season is June 1, though four named tropical storms in May—Alberto and Beryl in the Atlantic, Aletta and Bud in the Pacific—didn't wait for the calendar. The Hurricane Research Division of the National Oceanic and Atmospheric Administration (NOAA) announced on May 24, 2012, that it is expecting a near-normal season, with nine to fifteen named storms and four to eight hurricanes. According to NOAA, an average season from 1980 to 2010 produced 12 named storms with six hurricanes, including three major hurricanes.
“We shouldn't be fooled by the storms that have already developed off the southeast U.S. in May,” Halverson said. “Development can and does happen this early—albeit infrequently—and these developments are almost always not far off the U.S. mainland. They have little to do with what is coming off Africa and streaming across the MDR. So these early home-grown storms are not necessarily a predictor of the August to October season, which is dominated by Cape Verde storms.”

Meteorologists often look to ENSO for a sense of whether atmospheric weather patterns will promote or tamp down hurricane formation. In general, researchers believe that El Niño reduces hurricane activity and La Niña promotes it. But the science on the matter is not really settled, and it may be that ENSO affects the number but not necessarily the intensity of storms.

La Niña just ended earlier this spring, and the next El Niño may be some months off. “The equatorial Pacific is neutral, with no El Niño developing...not even a hint,” said Patzert, who is based at the Jet Propulsion Laboratory. “If El Niño builds, I think it will be late and wimpy.”
In the eastern Pacific, NOAA is calling for a near-normal or below-normal season. “Forecasters estimate a 70 percent chance of 12 to 18 named storms, which includes 5 to 9 hurricanes.”

Regardless of the predictions, the key to hurricane season is vigilance. “The important issue is hurricane preparedness along the coasts,” said Patzert. “All it takes is one in your neighborhood to wreak havoc. Listen to the National Hurricane Center, know your evacuation routes, and be super prepared.”

Tuesday, 6 March 2012

NASA study: Earth's energy budget 'out of balance'

02.01.12
By Adam Voiland,
NASA's Earth Science News Team
A new NASA study underscores the fact that greenhouse gases generated by human activity—not changes in solar activity—are the primary force driving global warming.

The study offers an updated calculation of the Earth's energy imbalance, the difference between the amount of solar energy absorbed by Earth's surface and the amount returned to space as heat. The researchers' calculations show that, despite unusually low solar activity between 2005 and 2010, the planet continued to absorb more energy than it returned to space.


A prolonged solar minimum left the sun's surface nearly free of sunspots and accompanying bright areas called faculae between 2005 and 2010. Total solar irradiance declined slightly as a result, but the Earth continued to absorb more energy than it emitted throughout the minimum. Credit: NASA Goddard's Scientific Visualization Studio
James Hansen, director of NASA's Goddard Institute for Space Studies (GISS) in New York City, led the research. Atmospheric Chemistry and Physics published the study last December.

Total solar irradiance, the amount of energy produced by the sun that reaches the top of each square meter of the Earth's atmosphere, typically declines by about a tenth of a percent during cyclical lulls in solar activity caused by shifts in the sun's magnetic field. Usually solar minimums occur about every eleven years and last a year or so, but the most recent minimum persisted more than two years longer than normal, making it the longest minimum recorded during the satellite era.

Pinpointing the magnitude of Earth's energy imbalance is fundamental to climate science because it offers a direct measure of the state of the climate. Energy imbalance calculations also serve as the foundation for projections of future climate change. If the imbalance is positive and more energy enters the system than exits, Earth grows warmer. If the imbalance is negative, the planet grows cooler.

Hansen's team concluded that Earth has absorbed more than half a watt more solar energy per square meter than it let off throughout the six-year study period. The calculated value of the imbalance (0.58 watts of excess energy per square meter) is more than twice as much as the reduction in the amount of solar energy supplied to the planet between maximum and minimum solar activity (0.25 watts per square meter).
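Those two figures can be checked with back-of-envelope arithmetic. The sketch below assumes a total solar irradiance of about 1361 watts per square meter and a planetary albedo of about 0.3 (standard textbook values, not numbers from the study) to show why a 0.1 percent solar dip works out to roughly the 0.25 watts per square meter quoted.

```python
TSI = 1361.0    # W/m^2, approximate total solar irradiance (assumed value)
ALBEDO = 0.30   # approximate planetary albedo (assumed value)

# Spread the 0.1% dip over the whole sphere (factor 1/4) and
# discount the sunlight that is reflected rather than absorbed.
solar_swing = TSI * 0.001 * (1 - ALBEDO) / 4
print(f"solar min-to-max swing: {solar_swing:.2f} W/m^2")  # ~0.24

imbalance = 0.58  # W/m^2, the value calculated by Hansen's team
print(f"imbalance / swing: {imbalance / 0.25:.1f}")  # more than 2
```

The measured imbalance is indeed more than double the largest swing the solar cycle can produce, which is the article's point.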

"The fact that we still see a positive imbalance despite the prolonged solar minimum isn't a surprise given what we've learned about the climate system, but it's worth noting because this provides unequivocal evidence that the sun is not the dominant driver of global warming," Hansen said.

According to calculations conducted by Hansen and his colleagues, the 0.58 watts per square meter imbalance implies that carbon dioxide levels need to be reduced to about 350 parts per million to restore the energy budget to equilibrium. The most recent measurements show that carbon dioxide levels are currently 392 parts per million and scientists expect that concentration to continue to rise in the future.


Data collected by Argo floats, such as this one, helped Hansen's team improve the calculation of Earth's energy imbalance. Credit: Argo Project Office
Climate scientists have been refining calculations of the Earth's energy imbalance for many years, but this newest estimate is an improvement over previous attempts because the scientists had access to better ocean temperature measurements than were previously available.

The improved measurements came from free-floating instruments that directly monitor the temperature, pressure and salinity of the upper ocean to a depth of 2,000 meters (6,560 feet). The network of instruments, known collectively as Argo, has grown dramatically in recent years since researchers first began deploying the floats a decade ago. Today, more than 3,400 Argo floats actively take measurements and provide data to the public, mostly within 24 hours.

Hansen's analysis of the information collected by Argo, along with other ground-based and satellite data, shows the upper ocean has absorbed 71 percent of the excess energy and the Southern Ocean, where there are few Argo floats, has absorbed 12 percent. The abyssal zone of the ocean, between about 3,000 and 6,000 meters (9,800 and 20,000 feet) below the surface, absorbed five percent, while ice absorbed eight percent and land four percent.
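Taken together, the shares quoted above account for all of the excess energy; a quick tally confirms they sum to 100 percent:

```python
# Percent of the excess energy absorbed by each reservoir, as quoted in the article
heat_share_percent = {
    "upper ocean": 71,
    "Southern Ocean": 12,
    "abyssal ocean (3,000-6,000 m)": 5,
    "ice": 8,
    "land": 4,
}

total = sum(heat_share_percent.values())
assert total == 100
print(f"total: {total} percent")
```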

The updated energy imbalance calculation has important implications for climate modeling. Its value, which is slightly lower than previous estimates, suggests that most climate models overestimate how readily heat mixes deeply into the ocean and significantly underestimate the cooling effect of small airborne particles called aerosols, which along with greenhouse gases and solar irradiance are critical factors in energy imbalance calculations.

"Climate models simulate observed changes in global temperatures quite accurately, so if the models mix heat into the deep ocean too aggressively, it follows that they underestimate the magnitude of the aerosol cooling effect," Hansen said.

Aerosols, which can either warm or cool the atmosphere depending on their composition and how they interact with clouds, are thought to have a net cooling effect. But estimates of their overall impact on climate are quite uncertain given how difficult it is to measure the distribution of the particles on a broad scale. The new study suggests that the overall cooling effect from aerosols could be about twice as strong as current climate models suggest, largely because few models account for how the particles affect clouds.


A chart shows the global reach of the network of Argo floats. (Credit: Argo Project Office)
"Unfortunately, aerosols remain poorly measured from space," said Michael Mishchenko, a scientist also based at GISS and the project scientist for Glory, a satellite mission designed to measure aerosols in unprecedented detail that was lost after a launch failure in early 2011. "We must have a much better understanding of the global distribution of detailed aerosol properties in order to perfect calculations of Earth's energy imbalance," said Mishchenko.

NASA study solves case of Earth's 'missing energy'

01.31.12
By Alan Buis,
Jet Propulsion Laboratory
Two years ago, scientists at the National Center for Atmospheric Research in Boulder, Colo., released a study claiming that inconsistencies between satellite observations of Earth's heat and measurements of ocean heating amounted to evidence of "missing energy" in the planet's system.

Where was it going? Or, they wondered, was something wrong with the way researchers tracked energy as it was absorbed from the sun and emitted back into space?

An international team of atmospheric scientists and oceanographers, led by Norman Loeb of NASA's Langley Research Center in Hampton, Va., and including Graeme Stephens of NASA's Jet Propulsion Laboratory in Pasadena, Calif., set out to investigate the mystery.

They used 10 years of data—spanning 2001 to 2010—from NASA Langley's orbiting Clouds and the Earth's Radiant Energy System Experiment (CERES) instruments to measure changes in the net radiation balance at the top of Earth's atmosphere. The CERES data were then combined with estimates of the heat content of Earth's ocean from three independent ocean-sensor sources.

Their analysis, summarized in a NASA-led study published Jan. 22 in the journal Nature Geoscience, found that the satellite and ocean measurements are, in fact, in broad agreement once observational uncertainties are factored in.

"One of the things we wanted to do was a more rigorous analysis of the uncertainties," Loeb said. "When we did that, we found the conclusion of missing energy in the system isn't really supported by the data."

'Missing energy' is in the ocean

"Our data show that Earth has been accumulating heat in the ocean at a rate of half a watt per square meter (10.8 square feet), with no sign of a decline," Loeb said. "This extra energy will eventually find its way back into the atmosphere and increase temperatures on Earth."

Scientists generally agree that 90 percent of the excess heat associated with increases in greenhouse gas concentrations gets stored in Earth's ocean. If released back into the atmosphere, a half-watt per square meter accumulation of heat could increase global temperatures by 0.3 or more degrees centigrade (0.54 degree Fahrenheit).

Loeb said the findings demonstrate the importance of using multiple measuring systems over time, and illustrate the need for continuous improvement in the way Earth's energy flows are measured.

The science team at the National Center for Atmospheric Research had identified inconsistencies between 2004 and 2009 between satellite observations of Earth's heat balance and measurements of the rate of upper-ocean heating derived from temperatures in the upper 700 meters (2,300 feet) of the ocean. They said the inconsistencies were evidence of "missing energy."

Other authors of the paper are from the University of Hawaii, the Pacific Marine Environmental Laboratory in Seattle, the University of Reading in the United Kingdom, and the University of Miami.

NASA mission takes stock of Earth's melting land ice

02.08.12
By Alan Buis,
Jet Propulsion Laboratory

Steve Cole,
NASA Headquarters

In the first comprehensive satellite study of its kind, a University of Colorado at Boulder-led team used NASA data to calculate how much Earth's melting land ice is adding to global sea level rise.

Using satellite measurements from the NASA/German Aerospace Center Gravity Recovery and Climate Experiment (GRACE), the researchers measured ice loss in all of Earth's land ice between 2003 and 2010, with particular emphasis on glaciers and ice caps outside of Greenland and Antarctica.

The total global ice mass lost from Greenland, Antarctica and Earth's glaciers and ice caps during the study period was about 4.3 trillion tons (1,000 cubic miles), adding about 0.5 inches (12 millimeters) to global sea level. That's enough ice to cover the United States 1.5 feet (0.5 meters) deep.
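Those equivalences follow from straightforward unit conversion. The sketch below treats the article's 4.3 trillion tons as metric tons of meltwater and uses round figures for the global ocean area and U.S. land area (both assumptions, not values from the study):

```python
ice_mass_kg = 4.3e12 * 1000   # 4.3 trillion metric tons (article figure)
water_density = 1000.0        # kg/m^3
ocean_area_m2 = 3.6e14        # approximate global ocean area (assumed)
us_area_m2 = 9.8e12           # approximate U.S. land area (assumed)

meltwater_m3 = ice_mass_kg / water_density
print(f"sea level rise: {meltwater_m3 / ocean_area_m2 * 1000:.1f} mm")  # ~11.9
print(f"depth over U.S.: {meltwater_m3 / us_area_m2:.2f} m")            # ~0.44
```

Both results land close to the article's quoted 12 millimeters of sea level rise and 0.5 meters of depth over the United States.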

"Earth is losing a huge amount of ice to the ocean annually, and these new results will help us answer important questions in terms of both sea rise and how the planet's cold regions are responding to global change," said University of Colorado Boulder physics professor John Wahr, who helped lead the study. "The strength of GRACE is it sees all the mass in the system, even though its resolution is not high enough to allow us to determine separate contributions from each individual glacier."

About a quarter of the average annual ice loss came from glaciers and ice caps outside of Greenland and Antarctica (roughly 148 billion tons, or 39 cubic miles). Ice loss from Greenland and Antarctica and their peripheral ice caps and glaciers averaged 385 billion tons (100 cubic miles) a year. Results of the study will be published online Feb. 8 in the journal Nature.

Traditional estimates of ice loss from Earth's ice caps and glaciers have relied on ground measurements from relatively few glaciers to infer what all the world's unmonitored glaciers were doing. Only a few hundred of the roughly 200,000 glaciers worldwide have been monitored for longer than a decade.

One unexpected study result from GRACE was that the estimated ice loss from high Asian mountain ranges like the Himalaya, the Pamir and the Tien Shan was only about 4 billion tons of ice annually. Some previous ground-based estimates of ice loss in these high Asian mountains have ranged up to 50 billion tons annually.

"The GRACE results in this region really were a surprise," said Wahr, who is also a fellow at the University of Colorado-headquartered Cooperative Institute for Research in Environmental Sciences. "One possible explanation is that previous estimates were based on measurements taken primarily from some of the lower, more accessible glaciers in Asia and extrapolated to infer the behavior of higher glaciers. But unlike the lower glaciers, most of the high glaciers are located in very cold environments and require greater amounts of atmospheric warming before local temperatures rise enough to cause significant melting. This makes it difficult to use low-elevation, ground-based measurements to estimate results from the entire system."

"This study finds that the world's small glaciers and ice caps in places like Alaska, South America and the Himalayas contribute about 0.02 inches per year to sea level rise," said Tom Wagner, cryosphere program scientist at NASA Headquarters in Washington. "While this is lower than previous estimates, it confirms that ice is being lost from around the globe, with just a few areas in precarious balance. The results sharpen our view of land-ice melting, which poses the biggest, most threatening factor in future sea level rise."

The twin GRACE satellites track changes in Earth's gravity field by noting minute changes in gravitational pull caused by regional variations in Earth's mass, which, over periods of months to years, are typically due to movements of water on Earth's surface. They do this by measuring changes in the distance between the two identical spacecraft to one-hundredth the width of a human hair.

The GRACE spacecraft, developed by NASA's Jet Propulsion Laboratory, Pasadena, Calif., and launched in 2002, are in the same orbit approximately 137 miles (220 kilometers) apart. The California Institute of Technology manages JPL for NASA.

'First Light' Taken by NASA's Newest CERES Instrument

The doors are open on NASA's Suomi NPP satellite and the newest version of the Clouds and the Earth's Radiant Energy System (CERES) instrument is scanning Earth for the first time, helping to assure continued availability of measurements of the energy leaving the Earth-atmosphere system.

The CERES results help scientists to determine the Earth's energy balance, providing a long-term record of this crucial environmental parameter that will be consistent with those of its predecessors.


Thick cloud cover tends to reflect a large amount of incoming solar energy back to space (blue/green/white image), but at the same time, reduce the amount of outgoing heat lost to space (red/blue/orange image). Contrast the areas that do not have cloud cover (darker colored regions) to get a sense for how much impact the clouds have on incoming and outgoing energy. Credit: NASA/NOAA/CERES Team



In the longwave image, heat energy radiated from Earth (in watts per square meter) is shown in shades of yellow, red, blue and white. The brightest-yellow areas are the hottest and are emitting the most energy out to space, while the dark blue areas and the bright white clouds are much colder, emitting the least energy. Increasing temperature, decreasing water vapor, and decreasing clouds will all tend to increase the ability of Earth to shed heat out to space. Credit: NASA/NOAA/CERES Team

CERES arrived in space Oct. 28, 2011, carried by NASA's newest Earth-observing satellite, the recently renamed Suomi National Polar-orbiting Partnership, or Suomi NPP. Suomi NPP is the result of a partnership between NASA, NOAA and the Department of Defense.

Instrument cover-opening activities began at 10:12 a.m. Eastern time Jan. 26, an operation that took about three hours. The "first light" process represented the transition from engineering checkout to science observations. The next morning CERES began taking Earth-viewing data, and on Jan. 29 scientists produced an image from the scans.

"It's extremely gratifying to see the CERES FM-5 instruments on Suomi NPP begin taking measurements. We're continuing the legacy of the most accurate Earth radiation budget observations ever made," said CERES project scientist Kory Priestley, of NASA's Langley Research Center in Hampton, Va.

"It has taken an incredible team of engineers, scientists, data management and programmatic experts to get CERES to this point," he said.

NASA instruments have provided the scientific community unprecedented observations of the Earth's climate and energy balance for nearly 30 years. The first CERES instrument was launched in 1997. Before that, the Earth Radiation Budget Experiment (ERBE) did the job beginning in 1984.

Langley Research Center has led both the ERBE and CERES experiments and provided stewardship of these critical climate observations.

For 27 years without a break, the instruments collectively have returned a vast quantity of precise data about the solar energy reflected and absorbed by Earth, the heat the planet emits, and the role of clouds in that process.

"CERES monitors minute changes in the Earth's energy budget, the difference between incoming and outgoing energy," said CERES principal investigator Norman Loeb, of Langley Research Center.

"Any imbalance in Earth's energy budget due to increasing concentrations of heat trapping gases warms the ocean, raises sea level, and causes increases in atmospheric temperature," Loeb said. "Amassing a long record of data is important in order to understand how Earth's climate is changing in response to human activities as well as natural processes."


How It Works

In addition to observing changes in Earth's radiation budget, scientists are monitoring changes in clouds and aerosols, which strongly influence that budget.

"Clouds both reflect sunlight and block energy from radiating to space," Loeb said. "Which of these two effects dominates depends upon the properties of clouds, such as their amount, thickness and height."

"As the Earth's environment evolves, cloud properties may change in ways that could amplify or offset climate change driven by other processes. Understanding the influence of clouds on the energy budget is therefore a critical climate problem."

The four other CERES instruments are in orbit on NASA's Aqua and Terra satellites.


Overall Mission

The five-instrument suite on Suomi NPP collects and distributes remotely sensed land, ocean and atmospheric data to the meteorological and global Earth system science research communities. The mission will provide atmospheric and sea surface temperatures, humidity soundings, land and ocean biological productivity, cloud and aerosol properties, and total/profile ozone measurements, and will monitor changes in Earth's radiation budget.

NASA's Goddard Space Flight Center in Greenbelt, Md., manages the Suomi mission for the Earth Science Division of the Science Mission Directorate at NASA Headquarters in Washington. The National Oceanic and Atmospheric Administration's Joint Polar Satellite System (JPSS) program provides the satellite ground system and NOAA provides operational support. Suomi NPP commissioning activities are expected to be completed by March.

NASA Langley manages the CERES experiment with additional contracted support from Science Systems and Applications, Inc. The TRW Space & Electronics Group in Redondo Beach, Calif., now owned by Northrop Grumman Aerospace Systems, built all of the CERES instruments.




Michael Finneran
NASA Langley Research Center

Detecting detrimental change in coral reefs



02.09.12
By Laura Rocchio,
NASA Goddard Space Flight Center
Over dinner on R.V. Calypso while anchored on the lee side of Glover’s Reef in Belize, Jacques Cousteau told Phil Dustan that he suspected humans were having a negative impact on coral reefs. Dustan — a young ocean ecologist who had worked in the lush coral reefs of the Caribbean and Sinai Peninsula — found this difficult to believe. It was December 1974.


Reef environments provide habitat for hundreds of fish species including the queen angelfish shown here in the Florida Keys National Marine Sanctuary. Credit: Chris Huss.
But Cousteau was right. Over the following three-plus decades, Dustan, an ocean ecologist and biology professor at the College of Charleston in South Carolina, witnessed widespread coral reef degradation and bleaching up close. In the late 1970s Dustan helped build a handheld spectrometer, a tool to measure light given off by coral. Using his spectrometer, Dustan could examine light reflected and emitted by the different organisms that make up living reefs. Since then, he has watched reefs deteriorate at an alarming rate. Recently he has found that Landsat offers a way to evaluate these changes globally. Using an innovative way to map how coral reefs are changing over time, Dustan can now find 'hotspots' where conservation efforts should be focused to protect these delicate communities.


Left: A Landsat pixel-based map showing where the most change has been detected on Carysfort Reef between 1984 and 1996. Right: The spine of elevation shows where the most change has occurred; for Carysfort this change has been correlated with coral decline. Credit: Phil Dustan.
A role for remote sensing
Situated in shallow, clear water, most coral reefs are visible to satellites that use passive remote sensing to observe Earth's surface. But coral reefs are complex ecosystems in which coexisting coral species, sand and water all reflect light. Dustan found that currently orbiting satellites do not offer the spatial or spectral resolution needed to distinguish among these components and specifically classify coral reef composition. So instead of attempting to classify the inherently complex coral ecosystem to monitor reef health, Dustan has started to look for change — how overall reflectance for a geographic location varies over time.

Dustan uses a time series of Landsat data to calculate something called temporal texture — essentially a map showing where change has occurred, based on statistical analysis of reflectance information. While Dustan cannot diagnose the type of change with temporal texture, he can establish where serious changes have occurred. Coral communities have seasonal rhythms and periodicities, but larger, significant changes show up as statistical outliers in temporal texture maps and often correlate with reef decline.
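The article does not spell out the statistic behind temporal texture, but the idea can be sketched as per-pixel variability across a stack of co-registered images. In the sketch below, the standard-deviation measure and the outlier threshold are illustrative assumptions, not Dustan's published method:

```python
import numpy as np

def temporal_texture(stack):
    """Per-pixel change map from a time series of co-registered images.

    stack: array of shape (time, rows, cols) holding calibrated reflectance.
    Returns the per-pixel standard deviation over time; high values mark
    locations whose reflectance varied most between acquisitions.
    """
    return np.std(stack, axis=0)

def change_hotspots(stack, n_sigma=2.0):
    """Flag pixels whose temporal variability is a statistical outlier.

    The n_sigma threshold is an illustrative choice; seasonal rhythms
    produce modest variability, while serious change stands far outside it.
    """
    texture = temporal_texture(stack)
    threshold = texture.mean() + n_sigma * texture.std()
    return texture > threshold
```

Applied to a stack of six or more calibrated Landsat scenes of the same reef, the flagged pixels would correspond to the change "hotspots" described below.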


There was a 92 percent loss of living coral on Carysfort between 1975 and 2000. Credit: Dustan and Halas; FKNMS Coral Reef Evaluation and Monitoring Project.
Carysfort Reef — named for HMS Carysfort, an eighteenth-century British warship that ran aground on the reef in 1770 — is considered the most ecologically diverse reef on the Florida Keys National Marine Sanctuary's northern seaward edge, but today it is in a state of ecological collapse.
A case study
Dustan and colleagues conducted the first quantitative field study of coral health at Carysfort in 1974. After a quarter century, their studies showed that coral had declined 92 percent. The coral had succumbed to an array of stressors culminating in deadly diseases.

Using the well-characterized Carysfort reef as his control, Dustan calculated the temporal texture for the reef using a series of 20 Landsat images collected between 1982 and 1996. The resulting temporal texture maps correlated with the known areas of significant coral loss (where coral communities have turned into algal-dominated substrates) and they correctly showed that the seaward shallow regions have had the most detrimental change.

This novel approach to change detection is only possible because the long-term calibration of Landsat data ensures that data from year to year is consistent. Dustan needs at least six to eight Landsat images to create a reliable temporal texture map, but the more data that is available, the finer the results.


Carysfort Reef is located in the eastern portion of the Florida Keys National Marine Sanctuary, offshore of Key Largo. Credit: National Oceanic and Atmospheric Administration.
Dustan tested this work in the U.S. because he had a robust study site and because, prior to 1999, coverage of reefs outside the U.S. was spotty. With the Landsat 7 launch in 1999, a new global data acquisition strategy was established, and for the first time the planet's coral reefs were systematically and regularly imaged, greatly increasing our knowledge of reefs. The Landsat archive enabled the completion of the first comprehensive global survey of reefs (the Millennium Global Coral Reef Mapping Project, http://landsat.gsfc.nasa.gov/news/news-archive/news_0031.html). Efforts are currently underway to receive and ingest Landsat data collected and housed by international ground-receiving stations. International partners often downlink Landsat scenes of their countries that the U.S. does not, so it is very likely that historic reef images will be added to the U.S. Landsat archive during this process.
Carrying on outside Carysfort
Temporal texture gives scientists an entirely new way to look at coral reefs. A worldwide study could help managers locate change ‘hotspots’ and could better inform conservation efforts.

Ideally, after more testing, Dustan would like to see an automatic change detection system implemented to follow major worldwide reef systems. “There is no reason that a form of temporal texture monitoring could not be implemented with current satellites in orbit,” Dustan says.

Because reefs are underwater it is difficult to grasp the extensive devastation being exacted upon them. Global temporal texture mapping could bring the ravages into focus.

Infrared eye opens

02.09.12
By Cynthia O'Carroll, NASA Goddard Space Flight Center,
John Leslie, National Oceanic and Atmospheric Administration

A powerful new infrared instrument flying on NASA's newest polar-orbiting satellite has started sending its data back to Earth. The instrument is designed to give scientists more refined information about Earth's atmosphere, improve weather forecasts and deepen our understanding of climate.

The Cross-track Infrared Sounder (CrIS) joins four other new instruments aboard the Suomi National Polar-orbiting Partnership (NPP) satellite, which NASA launched on Oct. 28, 2011 from Vandenberg Air Force Base in California. The Suomi NPP mission is the bridge between NOAA's Polar Operational Environmental Satellite (POES) and NASA's Earth Observing System satellites and the next-generation Joint Polar Satellite System (JPSS).

Since reaching orbit, Suomi NPP and its suite of five instruments have been undergoing extensive checkouts before starting regular science observations. Suomi NPP is the result of a partnership between NASA, NOAA and the Department of Defense.

CrIS, an advanced spectrometer with 1,305 infrared spectral channels, is designed to provide high vertical resolution information on the atmosphere's three-dimensional structure of temperature and water vapor. The Atmospheric Infrared Sounder (AIRS) on the EOS Aqua mission, launched in 2002, demonstrated how useful this type of data could be for understanding the atmosphere. CrIS will continue this data record and provide data for use in NOAA's numerical weather prediction models to forecast severe weather days in advance.

"Significant overlap between AIRS and CrIS will provide the Earth science research community the ability to maintain the unprecedented accuracy and stability of the temperature and moisture data record initiated by AIRS," said Diane Wickland, Suomi NPP program scientist at NASA Headquarters.

"Having data from CrIS will only improve the quality, timeliness and accuracy of NOAA's weather and climate predictions, which directly affect everyone in America," said Mary Kicza, assistant administrator for NOAA's Satellite and Information Service (NESDIS).

"Over longer periods, data from CrIS will help NOAA to better understand climate phenomena such as El Niño and La Niña that impact global weather patterns," said Mitch Goldberg, NOAA's JPSS program scientist.

The Advanced Technology Microwave Sounder (ATMS), which measures temperature and humidity in both clear and cloudy conditions, was the first Suomi NPP instrument activated. ATMS and CrIS data together will be used operationally in weather forecasts beginning in the spring of 2012.

"The instrument commissioning is going well and we are pleased that Suomi NPP is taking the next step in its mission of providing critical weather data to NOAA and global Earth system science data to the U.S. research community," stated Ken Schwer, Suomi NPP project manager. Commissioning activities will continue through February, and once completed, satellite operations will be turned over to the JPSS program. NOAA will operate the satellite and process and distribute the data to users around the world.

NASA's Goddard Space Flight Center in Greenbelt, Md., manages the Suomi NPP mission for the Earth Science Division of the Science Mission Directorate at NASA Headquarters in Washington. NOAA provides the CrIS instrument and operational support and the JPSS program provides the satellite ground system.

NASA's Suomi NPP mission will advance our knowledge of how the entire Earth system works by providing enhanced data for the nation's weather forecasting system and new insights that help scientists better understand climate.

NASA map sees Earth's trees in a new light

02.21.12
By Alan Buis,
Jet Propulsion Laboratory
A NASA-led science team has created an accurate, high-resolution map of the height of Earth's forests. The map will help scientists better understand the role forests play in climate change and how their heights influence wildlife habitats within them, while also helping them quantify the carbon stored in Earth's vegetation.

Scientists from NASA's Jet Propulsion Laboratory, Pasadena, Calif.; the University of Maryland, College Park; and Woods Hole Research Center, Falmouth, Mass., created the map using 2.5 million carefully screened, globally distributed laser pulse measurements from space. The light detection and ranging (lidar) data were collected in 2005 by the Geoscience Laser Altimeter System instrument on NASA's Ice, Cloud and land Elevation Satellite (ICESat).




Accurate measurements of the height of Earth's forests can improve global efforts to monitor how much carbon they contain, while benefitting studies of forest biodiversity. Image credit: NASA/JPL-Caltech/Josh Fisher



"Knowing the height of Earth's forests is critical to estimating their biomass, or the amount of carbon they contain," said lead researcher Marc Simard of JPL. "Our map can be used to improve global efforts to monitor carbon. In addition, forest height is an integral characteristic of Earth's habitats, yet is poorly measured globally, so our results will also benefit studies of the varieties of life that are found in particular parts of the forest or habitats."
The map, available at http://lidarradar.jpl.nasa.gov, depicts the highest points in the forest canopy. Its spatial resolution is 0.6 miles (1 kilometer). The map was validated against data from a network of nearly 70 ground sites around the world.

The researchers found that, in general, forest heights decrease at higher elevations and are highest at low latitudes, decreasing in height the farther they are from the tropics. A major exception was found at around 40 degrees south latitude in southern temperate rainforests in Australia and New Zealand, where stands of eucalyptus, one of the world's tallest flowering plants, tower much higher than 130 feet (40 meters).

The researchers augmented the ICESat data with other types of data to compensate for the sparse lidar data, the effects of topography and cloud cover. These included estimates of the percentage of global tree cover from NASA's Moderate Resolution Imaging Spectroradiometer on NASA's Terra satellite, elevation data from NASA's Shuttle Radar Topography Mission, and temperature and precipitation maps from NASA's Tropical Rainfall Measuring Mission and the WorldClim database. WorldClim is a set of freely available, high-resolution global climate data that can be used for mapping and spatial modeling.

In general, estimates in the new map show forest heights were taller than in a previous ICESat-based map, particularly in the tropics and in boreal forests, and were shorter in mountainous regions. The accuracy of the new map varies across major ecological community types in the forests, and also depends on how much the forests have been disturbed by human activities and by variability in the forests' natural height.

"Our map contains one of the best descriptions of the height of Earth's forests currently available at regional and global scales," Simard said. "This study demonstrates the tremendous potential that spaceborne lidar holds for revealing new information about Earth's forests. However, to monitor the long-term health of Earth's forests and other ecosystems, new Earth observing satellites will be needed."

Results of the study were published recently in the Journal of Geophysical Research – Biogeosciences.

Trekking the globe in search of answers

02.15.12
By Alan Buis,
NASA Jet Propulsion Laboratory
While NASA's fleet of Earth science spacecraft — its "eyes on the Earth" — continues to monitor the pulse of our home planet, 2012 is also shaping up to be an extraordinary time for NASA's Airborne Science Program and its Earth system science research initiatives. Multiple aircraft and specialized instruments, including several from JPL, will operate in the United States, Europe, Asia and South America this year in support of studies conducted by NASA and the Earth science community, improving scientists' understanding of our planet.

The program maintains a fleet of highly modified aircraft and specialized instruments that can be deployed all over the world for Earth science missions. Researchers use these aircraft and sensors to obtain high-resolution measurements of local phenomena and processes, such as ice sheet thicknesses, precipitation and air quality. These measurements are often combined with global satellite observations and ground sampling to better model and understand the complete Earth system. NASA's Airborne Science Program and its flight campaigns play a key role in the development of both hardware and algorithms for future satellite missions, including JPL's planned Soil Moisture Active-Passive (SMAP) and Surface Water Ocean Topography (SWOT) missions, among others.

The aircraft provide scientists with access to unique capabilities, including high-altitude and long-duration flights, and the ability to fly large payloads and multiple instruments to nearly anywhere at any time. NASA's Airborne Science aircraft and sensors offer the science community the ability to collect data from Earth's surface to 70,000 feet (21,300 meters) in altitude. These unique assets add another research dimension to ground and satellite measurements.

"NASA's Airborne Science support of the Earth system science community will be exceptional in 2012," said Randy Albertson, NASA Airborne Science deputy program director. "Not only is the program on track to exceed the 2011 record of 2,600 hours flying science missions, the growth in new sensor integrations means NASA is well poised to conduct more accurate and complex airborne science in the future." science in the future."

For details on NASA's 2012 airborne science missions and links to more information, visit this page.

Earth's clouds are getting lower

02.22.12
By Alan Buis,
NASA Jet Propulsion Laboratory
Earth's clouds got a little lower — about one percent on average — during the first decade of this century, finds a new study based on NASA satellite data. The results have potential implications for future global climate.

Scientists at the University of Auckland in New Zealand analyzed the first 10 years of global cloud-top height measurements (from March 2000 to February 2010) from the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra spacecraft. The study, published recently in the journal Geophysical Research Letters, revealed an overall trend of decreasing cloud height. Global average cloud height declined by around one percent over the decade, or by around 100 to 130 feet (30 to 40 meters). Most of the reduction was due to fewer clouds occurring at very high altitudes.
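For a sense of scale, a decadal decline of 30 to 40 meters is the kind of signal a least-squares trend fit extracts from a noisy monthly record. The sketch below uses synthetic data: only the roughly 35-meter decadal decline mirrors the MISR result, while the monthly anomaly values and noise level are invented for illustration.

```python
import numpy as np

# Synthetic monthly cloud-height anomalies, March 2000 - February 2010.
# Only the ~35 m decadal decline mirrors the MISR finding; the 5 m noise
# level is an invented placeholder for month-to-month variability.
rng = np.random.default_rng(42)
months = np.arange(120)
anomalies = (-35.0 / 120.0) * months + rng.normal(0.0, 5.0, size=120)

# A least-squares linear fit recovers the decadal trend from the noisy record
slope_m_per_month, _intercept = np.polyfit(months, anomalies, 1)
decline_over_decade_m = -slope_m_per_month * 120.0
```

With 120 monthly samples, the fitted decline lands close to the true 35 meters despite the noise, which is why a ten-year record can reveal a roughly one percent change in average cloud height.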




Data from NASA's MISR instrument on NASA's Terra spacecraft show that global average cloud height declined by about 1 percent over the decade from 2000 to 2010, or around 100 to 130 feet (30 to 40 meters). Credit: University of Auckland/NASA JPL-Caltech.



Lead researcher Roger Davies said that while the record is too short to be definitive, it provides a hint that something quite important might be going on. Longer-term monitoring will be required to determine the significance of the observation for global temperatures.
A consistent reduction in cloud height would allow Earth to cool to space more efficiently, reducing the surface temperature of the planet and potentially slowing the effects of global warming. This may represent a "negative feedback" mechanism — a change caused by global warming that works to counteract it. "We don't know exactly what causes the cloud heights to lower," says Davies. "But it must be due to a change in the circulation patterns that give rise to cloud formation at high altitude."




Patterns that relate changes in cloud-top height with El Niño/ La Niña indicators. Credit: University of Auckland/NASA JPL-Caltech



NASA's Terra spacecraft is scheduled to continue gathering data through the remainder of this decade. Scientists will continue to monitor the MISR data closely to see if this trend continues.

Watching the planet breathe

Scientists have come up with an entirely new way to monitor the health of Earth’s plants from space. In work published in Geophysical Research Letters [1], researchers working at NASA’s Jet Propulsion Laboratory (JPL) and in Germany and Japan report on how measurements taken from space can open a whole new window onto the planet’s carbon cycle.

Carbon is a building block of life. It is also a key component of our climate. Carbon dioxide — a gas that exists naturally in the air, but is also produced by humans when we burn fossil fuels, drive cars and chop down trees — acts as a thermostat that controls the temperature of the planet. As a “greenhouse gas,” it acts like a blanket that traps heat close to the surface of the Earth. The more carbon dioxide we emit, the more the planet warms. Since the beginning of the industrial age, carbon dioxide levels have gone up by nearly 40 percent, and the world’s average temperature has risen by about 0.5 degrees Celsius (nearly 1 degree Fahrenheit) as a result. Knowing how much carbon is going into and out of the Earth’s land, air and oceans — the carbon cycle — is critical for understanding how much global warming is likely to happen to our planet in the future. And plants and vegetation are a key part of this cycle.

When plants photosynthesize, they use energy from sunlight to turn carbon dioxide from the air into sugars used to live and grow. In doing so, they give off a fluorescent light — a glow that can’t be seen with the naked eye, but that can be seen with the right instruments. More photosynthesis translates into more fluorescence, meaning that the plants are very productive in taking up carbon dioxide. The amount of carbon dioxide taken up by plants is called “gross primary productivity,” and is the largest part of the global carbon cycle.

Launched in 2009, the Japanese Greenhouse Gases Observing Satellite (GOSAT) has the ability to pick up this glow. Using GOSAT data, JPL scientist Christian Frankenberg and colleagues have shown that it is possible to detect this fluorescent glow from space over the entire planet, and thereby infer details about the health and activity of vegetation on the ground.

Typically, our best guess of global plant productivity comes from looking at the general greenness of plants from space, taking into account a plant's ability both to block out harmful infrared radiation and to absorb useful visible light. Normally, we would expect that the greener the plant, the more productive it is. However, there are exceptions to this rule. Evergreen trees in the winter, for example, are not very productive; water-stressed tropical forests may ramp down photosynthesis until the rains come back, but in the meantime they still maintain their greenness. So greenness is not always the best measure of plant productivity.

“The greenness-based approaches offer good approximate estimates, but they make assumptions. They are indirect estimates relying on additional information about the plants that is not always readily available, and are often contaminated by atmospheric interference,” explains Joshua Fisher, a climate scientist at JPL and co-author of the paper. “Our observations of plant fluorescence are instead direct indicators of plant productivity. They don’t make any assumptions based on apparent greenness, and take advantage of a narrow window in the atmosphere where fluorescence can escape to space unimpeded by atmospheric interference.” In addition, fluorescence responds immediately to environmental stress, while it can take days or even weeks before changes in greenness are seen by space satellites. The fluorescent glow given off by plant activity can therefore offer an early warning sign.


Figure 1. (a) Global map of plant chlorophyll fluorescence as measured by the GOSAT satellite from June 2009 to May 2010. The fluorescence is measured at a spectral wavelength of 757 nanometers and superimposed on a 2° x 2° grid. Areas of higher and lower plant activity can be seen in different parts of the world. (b) Time variations in the fluorescence signal given off by vegetation, from June 2009 to August 2010. A pronounced seasonal variation can be seen that reflects the growing season in the northern hemisphere and seasonal vegetation shifts in the tropics.
The JPL-led team — which also includes scientists from the Institute for Meteorology and Climate Research in Germany, the Max Planck Institute for Biogeochemistry in Germany, the Japan Aerospace Exploration Agency and the National Institute for Environmental Studies in Japan — has produced a global map of plant activity from space (Figure 1). The map pinpoints areas of very active vegetation and areas of lower activity such as barren or snow-covered surfaces. Plants fluoresce only when they are actively photosynthesizing. If plants are in a drought situation and short of water, for example, they don’t photosynthesize as much as when growing conditions are good, and their fluorescence drops.

The map shows increased plant activity over tropical evergreen forests, the eastern United States, Asia and central Europe. It also captures smaller-scale variations, such as enhanced fluorescence in southeastern Australia and comparatively low fluorescence in the Iberian Peninsula. In addition, a pronounced seasonal variation in plant activity is observed, reflecting the growing season in the northern hemisphere and seasonal vegetation shifts in the tropics.

While this is not the first map of plant fluorescence produced from space [2], these new findings provide the first accurate fluorescence data because they take into account important instrument effects that can severely impact the accuracy. It is also the first time that fluorescence has been compared to model-derived gross primary productivity on a global scale. The authors will continue to scrutinize finer details, for example, the higher-than-expected fluorescence signals over croplands and savannas (thought to be linked to underestimates of plant productivity).

As Frankenberg explains, the work is a proof-of-principle. “We’ve shown that chlorophyll fluorescence exhibits a strong linear correlation with gross primary production, and can therefore be used as an entirely new way to monitor plant productivity from space.” The findings bode well for NASA’s upcoming mission, the Orbiting Carbon Observatory-2 (OCO-2), which will measure Earth’s carbon dioxide levels and plant fluorescence from space much like GOSAT. OCO-2, whose launch date is to be determined, will collect about 50 times more data than GOSAT and offer full coverage of the planet. Together, GOSAT and OCO-2 will provide an unprecedented amount of information on the health of plants and carbon dioxide levels of our planet. The hope is that this will give us a much better grip on the Earth’s carbon cycle — and therefore climate change.
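The reported linear relationship suggests a simple proxy model of the form GPP ≈ a × fluorescence + b. The sketch below fits such a line with an ordinary least-squares slope and intercept; the data values are invented purely for illustration and are not from the study.

```python
# Toy illustration of a linear fluorescence-GPP proxy, per the correlation
# described above. All numbers are hypothetical.
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

fluorescence = [0.5, 1.0, 1.5, 2.0]  # W m^-2 sr^-1 um^-1 (hypothetical)
gpp = [2.0, 4.0, 6.0, 8.0]           # gC m^-2 day^-1 (hypothetical)
a, b = linear_fit(fluorescence, gpp)
print(round(a, 2), round(b, 2))  # 4.0 0.0
```

With a fit like this in hand, a new fluorescence retrieval can be converted directly into an estimated productivity, which is the essence of using fluorescence as a monitoring proxy.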


References
[1] C. Frankenberg et al., “New global observations of the terrestrial carbon cycle from GOSAT: Patterns of plant fluorescence with gross primary productivity,” Geophys. Res. Lett., vol. 38, L17706 (2011).
[2] J. Joiner et al., “First observations of global and seasonal terrestrial chlorophyll fluorescence from space,” Biogeosciences Discuss., 7, 8281-8313 (2010).

Thursday, 2 February 2012

NASA Spacecraft Returns First Video from Far Side Of The Moon

WASHINGTON -- A camera aboard one of NASA's twin Gravity Recovery And Interior Laboratory (GRAIL) lunar spacecraft has returned its first unique view of the far side of the moon. MoonKAM, or Moon Knowledge Acquired by Middle school students, will be used by students nationwide to select lunar images for study.

GRAIL consists of two identical spacecraft, recently named Ebb and Flow, each of which is equipped with a MoonKAM. The images were taken as part of a test of Ebb's MoonKAM on Jan. 19. The GRAIL project plans to test the MoonKAM aboard Flow at a later date.

To view the 30-second video clip, visit:

http://go.nasa.gov/zZXAPs



In the video, the north pole of the moon is visible at the top of the screen as the spacecraft flies toward the lunar south pole. One of the first prominent geological features seen on the lower third of the moon is the Mare Orientale, a 560-mile-wide (900-kilometer) impact basin that straddles the moon's near and far sides.

The clip ends with rugged terrain just short of the lunar south pole. To the left of center, near the bottom of the screen, is the 93-mile-wide (149-kilometer) Drygalski crater with a distinctive star-shaped formation in the middle. The formation is a central peak, created billions of years ago by a comet or asteroid impact.

"The quality of the video is excellent and should energize our MoonKAM students as they prepare to explore the moon," said Maria Zuber, GRAIL principal investigator from the Massachusetts Institute of Technology in Cambridge.

The twin spacecraft successfully achieved lunar orbit last New Year's Eve and New Year's Day. Previously named GRAIL-A and -B, the washing machine-sized spacecraft received their new names from fourth graders at the Emily Dickinson Elementary School in Bozeman, Mont., following a nationwide student-naming contest.

Thousands of fourth- to eighth-grade students will select target areas on the lunar surface and send requests to the GRAIL MoonKAM Mission Operations Center in San Diego. Photos of the target areas will be sent back by the satellites for students to study. The MoonKAM program is led by Sally Ride, America's first woman in space. Her team at Sally Ride Science and undergraduate students at the University of California in San Diego will engage middle schools across the country in the GRAIL mission and lunar exploration. GRAIL is NASA's first planetary mission carrying instruments fully dedicated to education and public outreach.

"We have had great response from schools around the country, more than 2,500 signed up to participate so far," Ride said. "In mid-March, the first pictures of the moon will be taken by students using MoonKAM. I expect this will excite many students about possible careers in science and engineering."

Launched in September 2011, Ebb and Flow periodically perform trajectory correction maneuvers that, over time, will lower their orbits to near-circular ones with an altitude of about 34 miles (55 kilometers). During their science mission, the duo will answer longstanding questions about the moon and give scientists a better understanding of how Earth and other rocky planets in the solar system formed.

NASA's Jet Propulsion Laboratory in Pasadena, Calif., manages the GRAIL mission for NASA's Science Mission Directorate in Washington. The GRAIL mission is part of the Discovery Program managed at NASA's Marshall Space Flight Center in Huntsville, Ala. Lockheed Martin Space Systems in Denver built the spacecraft.

NASA finds 2011 ninth-warmest year on record

While average global temperature will still fluctuate from year to year, scientists focus on the decadal trend. Nine of the 10 warmest years since 1880 have occurred since the year 2000, as the Earth has experienced sustained higher temperatures than in any decade during the 20th century. As greenhouse gas emissions and atmospheric carbon dioxide levels continue to rise, scientists expect the long-term temperature increase to continue as well. (Data source: NASA Goddard Institute for Space Studies. Credit: NASA Earth Observatory, Robert Simmon)


01.20.12
By Leslie McCarthy,
NASA's Goddard Institute for Space Studies
The global average surface temperature in 2011 was the ninth warmest since 1880, according to NASA scientists. The finding continues a trend in which nine of the 10 warmest years in the modern meteorological record have occurred since the year 2000.

NASA's Goddard Institute for Space Studies (GISS) in New York, which monitors global surface temperatures on an ongoing basis, released an updated analysis that shows temperatures around the globe in 2011 compared to the average global temperature from the mid-20th century. The comparison shows how Earth continues to experience warmer temperatures than several decades ago. The average temperature around the globe in 2011 was 0.92 degrees F (0.51 degrees C) warmer than the mid-20th century baseline.

"We know the planet is absorbing more energy than it is emitting," said GISS Director James E. Hansen. "So we are continuing to see a trend toward higher temperatures. Even with the cooling effects of a strong La Niña influence and low solar activity for the past several years, 2011 was one of the 10 warmest years on record."

The difference between 2011 and the warmest year in the GISS record (2010) is 0.22 degrees F (0.12 degrees C). This underscores the emphasis scientists put on the long-term trend of global temperature rise. Because of the large natural variability of climate, scientists do not expect temperatures to rise consistently year after year. However, they do expect a continuing temperature rise over decades.

The first 11 years of the 21st century experienced notably higher temperatures compared to the middle and late 20th century, Hansen said. The only year from the 20th century in the top 10 warmest years on record is 1998.

Higher temperatures today are largely sustained by increased atmospheric concentrations of greenhouse gases, especially carbon dioxide. These gases absorb infrared radiation emitted by Earth and release that energy into the atmosphere rather than allowing it to escape to space. As their atmospheric concentration has increased, the amount of energy "trapped" by these gases has led to higher temperatures.

The carbon dioxide level in the atmosphere was about 285 parts per million in 1880, when the GISS global temperature record begins. By 1960, the average concentration had risen to about 315 parts per million. Today it exceeds 390 parts per million and continues to rise at an accelerating pace.
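Those figures imply a markedly faster growth rate in recent decades. A quick calculation makes the acceleration explicit; the endpoint year 2012 is an assumption standing in for "today" at the time of this article.

```python
# Average CO2 growth rates implied by the figures quoted above
# (285 ppm in 1880, 315 ppm in 1960, >390 ppm "today", assumed 2012).
rate_1880_1960 = (315 - 285) / (1960 - 1880)   # ppm per year
rate_1960_today = (390 - 315) / (2012 - 1960)  # ppm per year
print(round(rate_1880_1960, 2), round(rate_1960_today, 2))  # 0.38 1.44
assert rate_1960_today > rate_1880_1960  # the rise is accelerating
```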

The temperature analysis produced at GISS is compiled from weather data from more than 1,000 meteorological stations around the world, satellite observations of sea surface temperature and Antarctic research station measurements. A publicly available computer program is used to calculate the difference between surface temperature in a given month and the average temperature for the same place during 1951 to 1980. This three-decade period functions as a baseline for the analysis.
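The baseline method described above can be sketched in a few lines: each monthly reading is compared with the 1951-1980 mean for the same place and calendar month. The station values here are hypothetical, not actual GISS inputs.

```python
# Minimal sketch of the anomaly calculation; baseline values are invented.
def monthly_anomaly(temp_c, month, baseline_by_month):
    """Deviation of one month's temperature from the 1951-1980 mean
    for the same place and calendar month."""
    return temp_c - baseline_by_month[month]

# Hypothetical 1951-1980 monthly means (deg C) for one station
baseline = {"jan": -2.1, "jul": 21.4}

print(round(monthly_anomaly(22.3, "jul", baseline), 2))  # 0.9 deg C warmer
```

Averaging such anomalies over all stations and months, rather than raw temperatures, is what lets records from very different climates be combined into one global figure.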

The resulting temperature record is very close to analyses by the Met Office Hadley Centre in the United Kingdom and the National Oceanic and Atmospheric Administration's National Climatic Data Center in Asheville, N.C.

Hansen said he expects record-breaking global average temperature in the next two to three years because solar activity is on the upswing and the next El Niño will increase tropical Pacific temperatures. The warmest years on record were 2005 and 2010, in a virtual tie.

"It's always dangerous to make predictions about El Niño, but it's safe to say we'll see one in the next three years," Hansen said. "It won't take a very strong El Niño to push temperatures above 2010."

NASA study solves case of Earth's 'missing energy'

Clouds play a vital role in Earth's energy balance, cooling or warming Earth's surface depending on their type. This painting, "Cumulus Congestus," by JPL's Graeme Stephens, principal investigator of NASA's CloudSat mission, depicts cumulus clouds, which transport energy away from Earth's surface. 

01.31.12
By Alan Buis,
Jet Propulsion Laboratory
Two years ago, scientists at the National Center for Atmospheric Research in Boulder, Colo., released a study claiming that inconsistencies between satellite observations of Earth's heat and measurements of ocean heating amounted to evidence of "missing energy" in the planet's system.

Where was it going? Or, they wondered, was something wrong with the way researchers tracked energy as it was absorbed from the sun and emitted back into space?

An international team of atmospheric scientists and oceanographers, led by Norman Loeb of NASA's Langley Research Center in Hampton, Va., and including Graeme Stephens of NASA's Jet Propulsion Laboratory in Pasadena, Calif., set out to investigate the mystery.

They used 10 years of data—spanning 2001 to 2010—from NASA Langley's orbiting Clouds and the Earth's Radiant Energy System Experiment (CERES) instruments to measure changes in the net radiation balance at the top of Earth's atmosphere. The CERES data were then combined with estimates of the heat content of Earth's ocean from three independent ocean-sensor sources.

Their analysis, summarized in a NASA-led study published Jan. 22 in the journal Nature Geoscience, found that the satellite and ocean measurements are, in fact, in broad agreement once observational uncertainties are factored in.

"One of the things we wanted to do was a more rigorous analysis of the uncertainties," Loeb said. "When we did that, we found the conclusion of missing energy in the system isn't really supported by the data."

'Missing energy' is in the ocean

"Our data show that Earth has been accumulating heat in the ocean at a rate of half a watt per square meter (10.8 square feet), with no sign of a decline," Loeb said. "This extra energy will eventually find its way back into the atmosphere and increase temperatures on Earth."

Scientists generally agree that 90 percent of the excess heat associated with increases in greenhouse gas concentrations gets stored in Earth's ocean. If released back into the atmosphere, a half-watt per square meter accumulation of heat could increase global temperatures by 0.3 or more degrees centigrade (0.54 degree Fahrenheit).

Loeb said the findings demonstrate the importance of using multiple measuring systems over time, and illustrate the need for continuous improvement in the way Earth's energy flows are measured.

The science team at the National Center for Atmospheric Research had identified inconsistencies, from 2004 to 2009, between satellite observations of Earth's heat balance and measurements of the rate of upper-ocean heating derived from temperatures in the upper 700 meters (2,300 feet) of the ocean. They said the inconsistencies were evidence of "missing energy."

Other authors of the paper are from the University of Hawaii, the Pacific Marine Environmental Laboratory in Seattle, the University of Reading in the United Kingdom and the University of Miami.

NASA study: Earth's energy budget 'out of balance'



A graph of the sun's total solar irradiance shows that in recent years irradiance dipped to the lowest levels recorded during the satellite era. The resulting reduction in the amount of solar energy available to affect Earth's climate was about 0.25 watts per square meter, less than half of Earth's total energy imbalance. (Credit: NASA/James Hansen)

02.01.12
By Adam Voiland,
NASA's Earth Science News Team
A new NASA study underscores the fact that greenhouse gases generated by human activity—not changes in solar activity—are the primary force driving global warming.

The study offers an updated calculation of the Earth's energy imbalance, the difference between the amount of solar energy absorbed by Earth's surface and the amount returned to space as heat. The researchers' calculations show that, despite unusually low solar activity between 2005 and 2010, the planet continued to absorb more energy than it returned to space.

James Hansen, director of NASA's Goddard Institute for Space Studies (GISS) in New York City, led the research. Atmospheric Chemistry and Physics published the study last December.


A prolonged solar minimum left the sun's surface nearly free of sunspots and accompanying bright areas called faculae between 2005 and 2010. Total solar irradiance declined slightly as a result, but the Earth continued to absorb more energy than it emitted throughout the minimum. Credit: NASA Goddard's Scientific Visualization Studio





Total solar irradiance, the amount of energy produced by the sun that reaches the top of each square meter of the Earth's atmosphere, typically declines by about a tenth of a percent during cyclical lulls in solar activity caused by shifts in the sun's magnetic field. Usually solar minimums occur about every eleven years and last a year or so, but the most recent minimum persisted more than two years longer than normal, making it the longest minimum recorded during the satellite era.
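As a rough cross-check, a 0.1 percent dip works out to approximately the 0.25 watts per square meter figure cited in the caption above, once the irradiance is averaged over the sphere and reduced by Earth's reflectivity. The total solar irradiance value and the roughly 30 percent albedo used below are standard approximations, not figures from this article.

```python
# Back-of-envelope link between the ~0.1 percent irradiance dip and the
# ~0.25 W/m^2 change in energy available to the climate system.
TSI = 1361.0        # W/m^2 at top of atmosphere (standard approximate value)
dip = 0.001 * TSI   # ~0.1 percent cyclical decline

# Average over the sphere (factor of 4) and discount the ~30% reflected:
forcing_change = dip / 4 * (1 - 0.3)
print(round(forcing_change, 2))  # about 0.24
```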

Pinpointing the magnitude of Earth's energy imbalance is fundamental to climate science because it offers a direct measure of the state of the climate. Energy imbalance calculations also serve as the foundation for projections of future climate change. If the imbalance is positive and more energy enters the system than exits, Earth grows warmer. If the imbalance is negative, the planet grows cooler.

Hansen's team concluded that Earth has absorbed more than half a watt more solar energy per square meter than it let off throughout the six-year study period. The calculated value of the imbalance (0.58 watts of excess energy per square meter) is more than twice as much as the reduction in the amount of solar energy supplied to the planet between maximum and minimum solar activity (0.25 watts per square meter).
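A back-of-envelope check of that comparison, using the two values quoted in the paragraph above:

```python
# Ratio of the measured imbalance to the solar-cycle variation.
imbalance = 0.58        # W/m^2, excess energy absorbed (study value)
solar_cycle_dip = 0.25  # W/m^2, solar supply change, max to min activity

ratio = imbalance / solar_cycle_dip
print(round(ratio, 2))  # 2.32, i.e. "more than twice"
```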

"The fact that we still see a positive imbalance despite the prolonged solar minimum isn't a surprise given what we've learned about the climate system, but it's worth noting because this provides unequivocal evidence that the sun is not the dominant driver of global warming," Hansen said.


Data collected by Argo floats, such as this one, helped Hansen's team improve the calculation of Earth's energy imbalance. Credit: Argo Project Office

According to calculations conducted by Hansen and his colleagues, the 0.58 watts per square meter imbalance implies that carbon dioxide levels need to be reduced to about 350 parts per million to restore the energy budget to equilibrium. The most recent measurements show that carbon dioxide levels are currently 392 parts per million and scientists expect that concentration to continue to rise in the future.

Climate scientists have been refining calculations of the Earth's energy imbalance for many years, but this newest estimate is an improvement over previous attempts because the scientists had access to better measurements of ocean temperature than researchers have had in the past.

The improved measurements came from free-floating instruments that directly monitor the temperature, pressure and salinity of the upper ocean to a depth of 2,000 meters (6,560 feet). The network of instruments, known collectively as Argo, has grown dramatically in recent years since researchers first began deploying the floats a decade ago. Today, more than 3,400 Argo floats actively take measurements and provide data to the public, mostly within 24 hours.

Hansen's analysis of the information collected by Argo, along with other ground-based and satellite data, shows the upper ocean has absorbed 71 percent of the excess energy and the Southern Ocean, where there are few Argo floats, has absorbed 12 percent. The abyssal zone of the ocean, between about 3,000 and 6,000 meters (9,800 and 20,000 feet) below the surface, absorbed five percent, while ice absorbed eight percent and land four percent.
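As a quick sanity check, the quoted shares of the excess energy close to 100 percent:

```python
# Partition of excess energy quoted above, in percent.
shares = {
    "upper ocean": 71,
    "Southern Ocean": 12,
    "abyssal ocean": 5,
    "ice": 8,
    "land": 4,
}
print(sum(shares.values()))  # 100
```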

The updated energy imbalance calculation has important implications for climate modeling. Its value, which is slightly lower than previous estimates, suggests that most climate models overestimate how readily heat mixes deeply into the ocean and significantly underestimate the cooling effect of small airborne particles called aerosols, which along with greenhouse gases and solar irradiance are critical factors in energy imbalance calculations.

"Climate models simulate observed changes in global temperatures quite accurately, so if the models mix heat into the deep ocean too aggressively, it follows that they underestimate the magnitude of the aerosol cooling effect," Hansen said.




Aerosols, which can either warm or cool the atmosphere depending on their composition and how they interact with clouds, are thought to have a net cooling effect. But estimates of their overall impact on climate are quite uncertain given how difficult it is to measure the distribution of the particles on a broad scale. The new study suggests that the overall cooling effect from aerosols could be about twice as strong as current climate models suggest, largely because few models account for how the particles affect clouds.

A chart shows the global reach of the network of Argo floats. (Credit: Argo Project Office)



Friday, 27 January 2012

NASA finds Russian runoff freshening Canadian Arctic


Increasing freshwater on the U.S. and Canadian side of the Arctic from 2005 to 2008 is balanced by decreasing freshwater on the Russian side, so that on average the Arctic did not have more freshwater. Here blue represents maximum freshwater increases and the yellows and oranges represent maximum freshwater decreases. Credit: University of Washington


01.04.12
By Alan Buis,
Jet Propulsion Laboratory

A new NASA and University of Washington study allays concerns that melting Arctic sea ice could be increasing the amount of freshwater in the Arctic enough to have an impact on the global "ocean conveyor belt" that redistributes heat around our planet.
Lead author and oceanographer Jamie Morison of the University of Washington's Applied Physics Laboratory in Seattle, and his team, detected a previously unknown redistribution of freshwater during the past decade from the Eurasian half of the Arctic Ocean to the Canadian half. Yet despite the redistribution, they found no change in the net amount of freshwater in the Arctic that might signal a change in the conveyor belt.
The team attributes the redistribution to an eastward shift in the path of Russian runoff through the Arctic Ocean, which is tied to an increase in the strength of the Northern Hemisphere's west-to-east atmospheric circulation, known as the Arctic Oscillation. The resulting counterclockwise winds changed the direction of ocean circulation, diverting upper-ocean freshwater from Russian rivers away from the Arctic's Eurasian Basin, between Russia and Greenland, to the Beaufort Sea in the Canada Basin bordered by the United States and Canada. The stronger Arctic Oscillation is associated with two decades of reduced atmospheric pressure over the Russian side of the Arctic. Results of the NASA- and National Science Foundation-funded study are published Jan. 5 in the journal Nature.
Between 2003 and 2008, the resulting redistribution of freshwater was equivalent to adding 10 feet (3 meters) of freshwater over the central Beaufort Sea.
The freshwater changes were seen between 2005 and 2008 by combining ocean bottom pressure, or mass, data from NASA's Gravity Recovery and Climate Experiment satellites with ocean height data from NASA's ICESat satellite. By calculating the difference between the two sets of measurements, the team was able to map changes in freshwater content over the entire Arctic Ocean, including regions where direct water sample measurements are not available.
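The differencing can be sketched as follows; the numbers are invented, not GRACE or ICESat data. Subtracting the mass (bottom-pressure) contribution from total sea surface height leaves the steric, density-driven part, which in the cold Arctic mostly tracks salinity and hence serves as a proxy for freshwater content.

```python
# Hypothetical illustration of combining altimetry (total height) with
# ocean-bottom-pressure data (mass-equivalent height) to isolate the
# steric signal associated with freshening. Values are made up.
def steric_height(total_height_m, mass_equiv_height_m):
    """Steric part of sea surface height: total minus mass contribution."""
    return total_height_m - mass_equiv_height_m

# Example grid cell: total height rose 0.30 m while added mass accounts
# for 0.10 m, so 0.20 m of the change reflects a density (freshwater) shift.
print(round(steric_height(0.30, 0.10), 2))  # 0.2
```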
Red arrows show the new path of Russian river water into the Canada Basin. The previous freshwater pathway - across the Eurasian Basin toward Greenland and the Atlantic - was altered by atmospheric conditions created by the Arctic Oscillation. Credit: University of Washington
"Knowing the pathways of freshwater is important to understanding global climate because freshwater protects sea ice by helping create a strongly stratified cold layer between the ice and warmer, saltier water below that comes into the Arctic from the Atlantic Ocean," said Morison. "The reduction in freshwater entering the Eurasian Basin resulting from the Arctic Oscillation change could contribute to sea ice declines in that part of the Arctic."
"Changes in the volume and extent of Arctic sea ice in recent years have focused attention on melting ice," said co-author and senior research scientist Ron Kwok of NASA's Jet Propulsion Laboratory, Pasadena, Calif., which manages Grace for NASA. "The Grace and ICESat data allow us to now examine the impacts of widespread changes in ocean circulation."
An instrument about to be dropped through an opening in the ice to the seafloor will record ocean bottom pressure to compare with similar data recorded by NASA's GRACE satellites. Data from GRACE, ICESat and actual water samples led to the discovery of a new pathway of freshwater in the Arctic. Credit: C. Peralta-Ferriz/UW Applied Physics Laboratory
Kwok said that, on the whole, Arctic Ocean salinity is similar to what it was in the past, but the Eurasian Basin has become more saline and the Canada Basin has freshened. In the Beaufort Sea, the water is the freshest it's been in 50 years of record keeping, with only a tiny fraction of that freshwater originating from melting ice and the vast majority coming from Russian river water.
The Beaufort Sea stores more freshwater when an atmospheric pressure system called the Beaufort High strengthens, driving a counterclockwise wind pattern. Consequently, it has been argued that the primary cause of freshening is a strengthening of the Beaufort High, but salinity began to decline early in the 1990s, when the Beaufort High relaxed and the counterclockwise Arctic Oscillation pattern increased.
"We discovered a pathway that allows Russian river runoff to feed the Beaufort gyre," Kwok said. "The Beaufort High is important, but so are the hemispheric-scale effects of the Arctic Oscillation."
"To better understand climate-related changes in sea ice and the Arctic overall, climate models need to more accurately represent the Arctic Oscillation's low pressure and counterclockwise circulation on the Russian side of the Arctic Ocean," Morison added.

