Thursday, March 26, 2009

Deep-sea Rocks Point To Early Oxygen On Earth

ScienceDaily (Mar. 25, 2009) — Red jasper cored from rock layers 3.46 billion years old suggests not only that the oceans contained abundant oxygen then, but that the atmosphere was as oxygen-rich as it is today, according to geologists.

This jasper, or hematite-rich chert, formed in much the same way the rock forms around hydrothermal vents in the deep oceans today.
"Many people have assumed that the hematite in ancient rocks formed by the oxidation of siderite in the modern atmosphere," said Hiroshi Ohmoto, professor of geochemistry, Penn State. "That is why we wanted to drill deeper, below the water table and recover unweathered rocks."
The researchers drilled diagonally into the base of a hill in the Pilbara Craton in northwest Western Australia to obtain samples of jasper that could not have been exposed to the atmosphere or water. These jaspers could be dated to 3.46 billion years ago.
"Everyone agrees that this jasper is 3.46 billion years old," said Ohmoto. "If hematite were formed by the oxidation of siderite at any time, the hematite would be found on the outside of the siderite, but it is found inside," he reported in a recent issue of Nature Geoscience.
The next step was to determine if the hematite formed near the water's surface or in the depths. Iron compounds exposed to ultraviolet light can form ferric hydroxide, which can sink to the bottom as tiny particles and then be converted to hematite at temperatures of at least 140 degrees Fahrenheit (60 degrees Celsius).
"There are a number of cases around the world where hematite is formed in this way," said Ohmoto. "So just because there is hematite, there is not necessarily oxygen in the water or the atmosphere."
The key to determining whether ultraviolet light or oxygen formed the hematite is the crystalline structure of the hematite itself. If the precursors of hematite had formed at the surface, the rock's crystalline structure would have formed from small particles aggregating, producing large crystals with many empty spaces between them. Using transmission electron microscopy, the researchers found no such crystalline structure.
"We found that the hematite from this core was made of a single crystal and therefore was not hematite made by ultraviolet radiation," said Ohmoto.
This could only happen if oxygen-bearing deep-ocean water came into contact with iron-rich fluids at high temperatures. Ohmoto and his team believe that this specific layer of hematite formed when a plume of heated water, like those found today at hydrothermal vents, converted the iron compounds into hematite using oxygen dissolved in the deep ocean water.
"This explains why this hematite is only found in areas with active submarine volcanism," said Ohmoto. "It also means that there was oxygen in the atmosphere 3.46 billion years ago, because the only mechanism for oxygen to exist in the deep oceans is for there to be oxygen in the atmosphere."
In fact, the researchers suggest that to have sufficient oxygen at depth, there had to be as much oxygen in the atmosphere 3.46 billion years ago as there is in today's atmosphere. To have this amount of oxygen, the Earth must have had oxygen producing organisms like cyanobacteria actively producing it, placing these organisms much earlier in Earth's history than previously thought.
"Usually, we look at the remnant of what we think is biological activity to understand the Earth's biology," said Ohmoto. "Our approach is unique because we look at the mineral ferric oxide to decipher biological activity."
Ohmoto suggests that this approach eliminates the problems trying to decide if carbon residues found in sediments were biologically created or simply chemical artifacts.
Other researchers on the study included Masamichi Hoashi, graduate student at Kagoshima University, Japan; Arthur H. Hickman, geologist with the Geological Survey of Western Australia; Satoshi Utsunomiya, Kyushu University, Japan; David C. Bevacqua and Tsubasa Otake, former master's and doctoral students at Penn State; and Yumiko Watanabe, research associate, Penn State.
The NASA Astrobiology Institute supported this work.
Adapted from materials provided by Penn State.

Monday, March 23, 2009

Climate Warming Affects Antarctic Ice Sheet Stability

ScienceDaily (Mar. 22, 2009) — A five-nation scientific team has published new evidence that even a slight rise in atmospheric concentrations of carbon dioxide, one of the gases that drives global warming, affects the stability of the West Antarctic Ice Sheet (WAIS). The massive WAIS covers the part of the continent on the Pacific side of the Transantarctic Mountains. Any substantial melting of the ice sheet would cause a rise in global sea levels.
The research, which was published in the March 19 issue of the journal Nature, is based on investigations by a 56-member team of scientists conducted on a 1,280-meter (4,100-foot)-long sedimentary rock core taken from beneath the sea floor under Antarctica's Ross Ice Shelf during the first project of the ANDRILL (ANtarctic geological DRILLing) research program--the McMurdo Ice Shelf (MIS) Project.
"The sedimentary record from the ANDRILL project provides scientists with an important analogue that can be used to help predict how ice shelves and the massive WAIS will respond to future global warming over the next few centuries," said Ross Powell, a professor of geology at Northern Illinois University.
"The sedimentary record indicates that under global warming conditions that were similar to those projected to occur over the next century, protective ice shelves could shrink or even disappear and the WAIS would become vulnerable to melting," Powell said. "If the current warm period persists, the ice sheet could diminish substantially or even disappear over time. This would result in a potentially significant rise in sea levels."
ANDRILL--which involves scientists from the United States, New Zealand, Italy and Germany--refines previous findings about the relationship between atmospheric carbon dioxide concentration, atmospheric and oceanic temperatures, sea level rise and natural cycles in Earth's orbit around the Sun, through the study of sediment and rock cores that are a geological archive of past climate.
The dynamics of ice sheets, including the WAIS, are not well understood, and improving scientists' comprehension of the mechanisms that control the growth, melting and movement of ice sheets was one of NSF's research priorities during the International Polar Year (IPY). The IPY field campaign, which officially ended in March 2009, was an intense scientific effort to explore new frontiers in polar science, improve our understanding of the critical role of the polar regions in global processes, and educate students, teachers, and the public about the polar regions and their importance to the global system. NSF was the lead agency for U.S. IPY efforts.
The cores retrieved by ANDRILL researchers have allowed them to peer back in time to the Pliocene epoch, roughly 2 million to 5 million years ago. During that time, the Antarctic was in a natural climate state warmer than today's, and atmospheric carbon dioxide levels were higher. Data from the cores indicate the WAIS advanced and retreated numerous times in response to forcing driven by these climate cycles.
Powell and Tim Naish, director of Victoria University of Wellington's Antarctic Research Centre, served as co-chief scientists of the 2006-2007 ANDRILL project that retrieved the data and are lead authors in one of two companion studies published in Nature.
Naish said the new information gleaned from the core shows that changes in the tilt of Earth's rotational axis have played a major role in the ocean warming that drove repeated cycles of growth and retreat of the WAIS between 3 million and 5 million years ago.
"It also appears that when atmospheric carbon dioxide concentrations reached 400 parts per million around four million years ago, the associated global warming amplified the effect of the Earth's axial tilt on the stability of the ice sheet," he said.
"Carbon dioxide concentration in the atmosphere is again approaching 400 parts per million," Naish said. "Geological archives, such as the ANDRILL core, highlight the risk that a significant body of permanent Antarctic ice could be lost within the next century as Earth's climate continues to warm. Based on ANDRILL data combined with computer models of ice sheet behavior, a collapse of the entire WAIS would likely take on the order of 1,000 years, but recent studies show that melting has already begun."
The second ANDRILL study in Nature--led by David Pollard of Pennsylvania State University and Rob DeConto from University of Massachusetts--reports results from a computer model of the ice sheets. The model shows that each time the WAIS collapsed, some of the margins of the East Antarctic Ice Sheet also melted, and the combined effect was a global sea level rise of 7 meters above present-day levels.
Whether the beginnings of such a collapse could start 100 years from now or within the next millennium is hard to predict and depends on future atmospheric CO2 levels, the researchers said. However, the new information from ANDRILL contributes a missing piece of the puzzle as scientists try to refine their predictions of the effects of global warming.
The most recent report of the Intergovernmental Panel on Climate Change (IPCC) noted that because so little is understood about ice sheet behavior it is difficult to predict how ice sheets will contribute to sea level rise in a warming world. The behavior of ice sheets, the IPCC report said, is one of the major uncertainties in predicting exactly how the warming of the globe will affect human populations.
"From these combined data modeling studies, we can say that past warming events caused West Antarctic ice shelves and ice grounded below sea level to melt and disappear. The modeling suggests these collapses took one to a few thousand years," Pollard said.
Pollard and DeConto also underscored the role of ocean temperatures in melting of the ice.
"It's clear from our combined research using geological data and modeling that ocean temperatures play a key role," DeConto said. "The most substantial melting of protective ice shelves comes from beneath the ice, where it is in contact with seawater. We now need more data to determine what is happening to the underside of contemporary ice shelves."
The National Science Foundation (NSF), which manages the U.S. Antarctic Program (USAP), provided about $20 million in support of the ANDRILL program. The other ANDRILL national partners contributed an additional $10 million in science and logistics support.
The ANDRILL Science Management Office, located at the University of Nebraska-Lincoln, supports science planning and the activities of the international ANDRILL Science Committee (ASC). Antarctica New Zealand is the ANDRILL project operator and has developed the drilling system in collaboration with Alex Pyne at Victoria University of Wellington and Webster Drilling and Exploration.
The U.S. Antarctic Program and Raytheon Polar Services Corporation (RPSC) supported the science team at McMurdo Station and in the Crary Science and Engineering Laboratory, while Antarctica New Zealand supported the drilling team at Scott Base.
ANDRILL scientific studies are jointly supported by: the U.S. National Science Foundation, the New Zealand Foundation for Research, the Italian Antarctic Research Program, the German Science Foundation and the Alfred Wegener Institute.
Adapted from materials provided by National Science Foundation.

Friday, March 20, 2009

Earth Science: Lithosphere Deformed And Fractured Under Indian Ocean Much Earlier Than Previously Thought

ScienceDaily (Mar. 20, 2009) — The discovery by Indian and British scientists that the Earth’s strong outer shell – the ‘lithosphere’ – within the central Indian Ocean began to deform and fracture 15.4–13.9 million years ago, much earlier than previously thought, has implications for our understanding of the birth of the Himalayas and the strengthening of the Indian-Asian monsoon.
India and Asia collided around 50 million years ago as a result of plate tectonics – the large-scale movements of the lithosphere, which continue to this day. The new study, published in the scientific journal Geology, focuses on the tectonics-related deformation of the lithosphere below the central Indian Ocean.
“Compression of the lithosphere has caused large-scale buckling and cracking,” says team member Professor Jon Bull of the University of Southampton’s School of Ocean and Earth Science based at the National Oceanography Centre; “The ocean floor has been systematically transformed into folds 100-300 kilometres long and 2,000-3,000 metres high, and there are also regularly spaced faults or fractures that are evident from seismic surveys and ocean drilling.”
The onset of this deformation marks the start of major geological uplift of the Himalayas and the Tibetan Plateau, some 4,000 km further to the north, due to stresses within the wider India-Asia area. Some studies indicate that it began around 8.0–7.5 million years ago, while others have indicated that it started before 8.0 million years ago, and perhaps much earlier.
This controversy has now been addressed by Professor Bull and his colleagues Dr Kolluru Krishna of the National Institute of Oceanography in India, and Dr Roger Scrutton of Edinburgh University. They have analysed seismic profiles of 293 faults in the accumulated sediments of the Bengal Fan. This is the world’s largest submarine fan, a delta-shaped accumulation of land-derived sediments covering the floor of the Bay of Bengal.
They demonstrate that deformation of the lithosphere within the central Indian Ocean started around 15.4–13.9 million years ago, much earlier than most previous estimates. This implies considerable Himalayan uplift before 8.0 million years ago, which is when many geologists believe that the strong seasonal winds of the India-Asia monsoon first started.
“However,” says Professor Bull, “the realisation that the onset of lithospheric deformation within the central Indian Ocean occurred much earlier fits in well with more recent evidence that the strengthening of the monsoon was linked to the early geological uplift of the Himalayas and Tibetan plateau up to 15-20 million years ago.”
Intensive deep-sea drilling within the Bengal Fan should provide better age estimates for the onset of deformation of the lithosphere in the central Indian Ocean and help settle the controversy.
The research was funded by India’s Council of Scientific and Industrial Research (CSIR), and the United Kingdom’s Royal Society and Natural Environment Research Council (NERC).
Adapted from materials provided by National Oceanography Centre, Southampton.

Thursday, March 19, 2009

Ozone: New Simulation Shows Consequences Of A World Without Earth's Natural Sunscreen

ScienceDaily (Mar. 19, 2009) — The year is 2065. Nearly two-thirds of Earth's ozone is gone -- not just over the poles, but everywhere. The infamous ozone hole over Antarctica, first discovered in the 1980s, is a year-round fixture, with a twin over the North Pole. The ultraviolet (UV) radiation falling on mid-latitude cities like Washington, D.C., is strong enough to cause sunburn in just five minutes. DNA-mutating UV radiation is up 650 percent, with likely harmful effects on plants, animals and human skin cancer rates.
Such is the world we would have inherited if 193 nations had not agreed to ban ozone-depleting substances, according to atmospheric chemists at NASA's Goddard Space Flight Center, Greenbelt, Md., Johns Hopkins University, Baltimore, and the Netherlands Environmental Assessment Agency, Bilthoven.
Led by Goddard scientist Paul Newman, the team simulated "what might have been" if chlorofluorocarbons (CFCs) and similar chemicals were not banned through the treaty known as the Montreal Protocol. The simulation used a comprehensive model that included atmospheric chemical effects, wind changes, and radiation changes. The analysis has been published online in the journal Atmospheric Chemistry and Physics.
"Ozone science and monitoring has improved over the past two decades, and we have moved to a phase where we need to be accountable," said Newman, who is co-chair of the United Nations Environment Programme's Scientific Assessment Panel to review the state of the ozone layer and the environmental impact of ozone regulation. "We are at the point where we have to ask: Were we right about ozone? Did the Montreal Protocol work? What kind of world was avoided by phasing out ozone-depleting substances?"
Ozone is Earth's natural sunscreen, absorbing and blocking most of the incoming UV radiation from the sun and protecting life from DNA-damaging radiation. The gas is naturally created and replenished by a photochemical reaction in the upper atmosphere where UV rays break oxygen molecules (O2) into individual atoms that then recombine into three-part molecules (O3). As it is moved around the globe by upper level winds, ozone is slowly depleted by naturally occurring atmospheric gases. It is a system in natural balance.
But chlorofluorocarbons -- invented in 1928 as refrigerants and as inert carriers for chemical sprays -- upset that balance. Researchers discovered in the 1970s and 1980s that while CFCs are inert at Earth's surface, they are quite reactive in the stratosphere (10 to 50 kilometers altitude, or 6 to 31 miles), where roughly 90 percent of the planet's ozone accumulates. UV radiation causes CFCs and similar bromine compounds in the stratosphere to break up into elemental chlorine and bromine that readily destroy ozone molecules. Worst of all, such ozone depleting substances can reside for several decades in the stratosphere before breaking down.
In the 1980s, ozone-depleting substances opened a springtime "hole" over Antarctica and opened the eyes of the world to the effects of human activity on the atmosphere. By 1987, the World Meteorological Organization and United Nations Environment Program had brought together scientists, diplomats, environmental advocates, governments, industry representatives, and non-governmental organizations to forge an agreement to phase out the chemicals. In January 1989, the Montreal Protocol went into force, the first-ever international agreement on regulation of chemical pollutants.
“The regulation of ozone depleting substances was based upon the evidence gathered by the science community and the consent of industry and government leaders," Newman noted. "The regulation pre-supposed that a lack of action would lead to severe ozone depletion, with consequent severe increases of solar UV radiation levels at the Earth’s surface."
In the new analysis, Newman and colleagues "set out to predict ozone losses as if nothing had been done to stop them." Their "world avoided" simulation took months of computer time to process.
The team started with the Goddard Earth Observing System Chemistry-Climate Model (GEOS-CCM), an earth system model of atmospheric circulation that accounts for variations in solar energy, atmospheric chemical reactions, temperature variations and winds, and other elements of global climate change. For instance, the new model accounts for how changes in the stratosphere influence changes in the troposphere (the air masses near Earth's surface). Ozone losses change the temperature in different parts of the atmosphere, and those changes promote or suppress chemical reactions.
The researchers then increased the emission of CFCs and similar compounds by 3 percent per year, a rate about half the growth rate for the early 1970s. Then they let the simulated world evolve from 1975 to 2065.
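That emissions assumption is simple compound growth, and its consequences are easy to sketch. The short Python example below is illustrative only; the baseline is normalized to 1.0 rather than taken from the study itself.

```python
# Emissions trajectory assumed in the "world avoided" run: CFC and
# related emissions grow 3 percent per year from 1975 onward.
def cfc_emissions(year, base_year=1975, growth=0.03):
    """Relative emission level after compound growth from the baseline year."""
    return (1 + growth) ** (year - base_year)

# Compound growth is deceptively fast: by 2020 emissions are about 3.8x
# the 1975 level, and by the end of the run in 2065 about 14x.
for year in (1975, 2020, 2065):
    print(year, round(cfc_emissions(year), 1))
```

Even a modest 3 percent annual growth rate, half the rate of the early 1970s, multiplies emissions more than an order of magnitude over the 90-year simulation.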
By the simulated year 2020, 17 percent of all ozone is depleted globally, as assessed by a drop in Dobson Units (DU), the standard measure of the total amount of ozone in a column of atmosphere overhead. An ozone hole starts to form each year over the Arctic, which was once a place of prodigious ozone levels.
By 2040, global ozone concentrations fall below 220 DU, the threshold that currently defines the "hole" over Antarctica. (In 1974, globally averaged ozone was 315 DU.) The UV index in mid-latitude cities reaches 15 around noon on a clear summer day (a UV index of 10 is considered extreme today), giving a perceptible sunburn in about 10 minutes. Over Antarctica, the ozone hole becomes a year-round fixture.
In the 2050s, something strange happens in the modeled world: Ozone levels in the stratosphere over the tropics collapse to near zero in a process similar to the one that creates the Antarctic ozone hole.
By the end of the model run in 2065, global ozone drops to 110 DU, a 67 percent drop from the 1970s. Year-round polar values hover between 50 and 100 DU (down from 300-500 in 1960). The intensity of UV radiation at Earth's surface doubles; at certain shorter wavelengths, intensity rises by as much as 10,000 times. Skin cancer-causing radiation soars.
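As a back-of-the-envelope check on the quoted Dobson-unit figures (a sketch, not part of the study): the drop from the 315 DU global average of 1974 to 110 DU works out to roughly two-thirds; the 67 percent figure in the text implies a slightly higher 1970s average as its reference point.

```python
def percent_drop(baseline_du, final_du):
    """Ozone loss as a percentage of the baseline column amount."""
    return 100 * (baseline_du - final_du) / baseline_du

# Figures quoted above: ~315 DU globally averaged in 1974, 110 DU by 2065.
print(round(percent_drop(315, 110), 1))  # ~65 percent, i.e. roughly two-thirds
```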
"Our world avoided calculation goes a little beyond what I thought would happen," said Goddard scientist and study co-author Richard Stolarski, who was among the pioneers of atmospheric ozone chemistry in the 1970s. "The quantities may not be absolutely correct, but the basic results clearly indicate what could have happened to the atmosphere. And models sometimes show you something you weren't expecting, like the precipitous drop in the tropics."
"We simulated a world avoided," said Newman, "and it's a world we should be glad we avoided."
The real world of CFC regulation has been somewhat kinder. Production of ozone-depleting substances was mostly halted about 15 years ago, though their abundance is only beginning to decline because the chemicals can reside in the atmosphere for 50 to 100 years. The peak abundance of CFCs in the atmosphere occurred around 2000, and has decreased by roughly 4 percent to date.
Stratospheric ozone has been depleted by 5 to 6 percent at middle latitudes, but has somewhat rebounded in recent years. The largest Antarctic ozone hole on record occurred in 2006.
"I didn't think that the Montreal Protocol would work as well as it has, but I was pretty naive about the politics," Stolarski added. "The Montreal Protocol is a remarkable international agreement that should be studied by those involved with global warming and the attempts to reach international agreement on that topic."
Adapted from materials provided by NASA/Goddard Space Flight Center.

Drought, Urbanization Were Ingredients For Atlanta's Perfect Storm

ScienceDaily (Mar. 18, 2009) — On March 14, 2008, a tornado swept through downtown Atlanta, its 130 mile-per-hour winds ripping holes in the roof of the Georgia Dome, blowing out office windows, and trashing parts of Centennial Olympic Park. It was an event so rare in an urban landscape that researchers immediately began to examine NASA satellite data and historical archives to see what weather and climatological ingredients may have combined to brew such a storm.

Though hundreds of tornadoes form each year across the United States, records of "downtown tornadic events" are quite rare. The 2008 Atlanta tornado—the first in the city's recorded history—was also unique because it developed during extreme drought conditions.
In a NASA-funded study, researchers from Purdue University in West Lafayette, Ind., and the University of Georgia (UGA) in Athens found that intermittent rain in the days before the storms—though providing temporary drought relief—may have moistened some areas enough to create favorable conditions for severe storms to form and intensify. Additionally, the sprawling urban landscape may have given the storms the extra, turbulent energy needed to spin up a tornado. The researchers reported their findings in January at the annual meeting of the American Meteorological Society.
"The Atlanta tornado, though forecasted well, caught us by surprise because it evolved rapidly under very peculiar conditions during a drought and over a downtown area," said Dev Niyogi, an assistant professor of regional climatology at Purdue and lead author of the modeling study. "We wanted to know why it hit Atlanta during one of the longest, harshest droughts the southeast has experienced. Was it a manifestation of the drought? Does urban development have an effect on such a storm?"
Such questions are becoming more relevant as the Intergovernmental Panel on Climate Change, NASA, and other institutions investigate the relationships between extreme water cycle events (such as drought), land cover change, weather, and climate change.
In the southeastern U.S., tornadoes are quite common in the spring when upper level wind patterns, surface moisture, and surface weather features promote severe weather. But moisture was scarce in the weeks leading up to the March 2008 Atlanta tornado, and likely should have suppressed a storm, according to atmospheric scientist Marshall Shepherd of UGA. Shepherd, Niyogi and colleagues recently completed a 50-year climatological assessment that finds tornadic activity is often suppressed during droughts in the Southeast.
To get to the bottom of how such a storm could have developed despite the drought, Purdue researchers Niyogi, Ming Lei, and Anil Kumar—along with Shepherd—investigated reports of isolated rain showers that had swept through parts of Alabama and northwest Georgia in the 48 hours prior to the tornado. They suspected that these "wet pockets" might have triggered—but more likely enhanced—the initial thunderstorms.
The scattered rainfall fell between areas that received no rain, setting up pockets of high humidity between areas of warm, dry air. The wet and dry areas may have acted as weak atmospheric fronts or may have promoted air circulation and evaporation that could have intensified the storms. A similar phenomenon promotes severe thunderstorms in Florida, where moist sea breezes interact with dry interior air masses.
Niyogi and Shepherd also found evidence that storm intensity was amplified by the heat-retaining effects of Atlanta's buildings and streets. The "heat island" effect leads to warmer air temperatures in urban areas because impervious surfaces like glass, metal, concrete and asphalt absorb, reflect, and store heat differently than tree or grass-covered land. Urban environments heat the air and cause moisture to rise quickly, creating a "thunderstorm pump" that can fuel or intensify storms. In March 2008, the differences in soil moisture and Atlanta's sprawling land cover may have provided the perfect blend for storms to intensify.
"A thunderstorm, energized by moist pockets within a drought region, grew into a tornado-causing severe thunderstorm because of weather instabilities it encountered at the rural-urban boundary," Niyogi explained.
"Drought and urbanization do not cause the thunderstorms or tornado, but ultimately they added fuel to the fire of an already energized storm," he added. "The variable rain bands created patches of land that were wet and dry, green and not green. The combination created surface boundaries that can destabilize the weather system and energize an approaching storm, providing the one-two punch."
Niyogi, Shepherd, and colleagues used the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard NASA's Aqua satellite to assess the state of ground vegetation immediately before and after the storm, as well the long-term differences before and during the drought. The researchers also examined rainfall estimates captured by NASA's Tropical Rainfall Measurement Mission satellite to identify the unusual bands of rainfall two days before the tornado.
Finally, they examined soil moisture data from the Japanese Advanced Microwave Scanning Radiometer - Earth Observing System (AMSR-E) instrument on NASA's Aqua satellite to evaluate the intensity of the drought at the time of the tornado. When these real drought and urban land cover conditions were included in the team's atmosphere-land surface computer models, the simulations produced a more intense storm that mirrored reality.
"Our findings highlight the difficulty in detangling the influences of the atmosphere and of Earth's surface within the weather-hydroclimate system," said Shepherd. "Soil moisture and urban land cover are not well-represented in weather models, but a new look at satellite data offers a fresh opportunity to improve forecasts."
"With many studies suggesting more potential for urbanization and droughts in our future," Niyogi added, "it will be important to see if this kind of intense storm development could happen more frequently in future climates."
Adapted from materials provided by NASA/Goddard Space Flight Center.

Wednesday, March 18, 2009

Lessons From Hurricane Rita Not Practiced During Hurricane Ike


ScienceDaily (Mar. 19, 2009) — A new Rice University report released yesterday, exactly six months after Hurricane Ike slammed the Texas Gulf Coast, suggests that people did not practice the lessons learned from Hurricane Rita.
According to the study, 75 percent of Harris County residents say they would evacuate if a Category 4 hurricane threatened Houston. This is a significant potential increase over the 24 percent of residents who left during the Category 2 Hurricane Ike. It's also a significant increase over the 52 percent of Harris County residents who evacuated in 2005 during the Category 4 Hurricane Rita but found themselves stuck in miles-long traffic jams on highways or stranded as the storm approached.
"Essentially, this study shows that people didn't learn from Hurricane Rita," said the report's co-author Robert Stein, the Lena Gohlman Fox Professor of Political Science at Rice. "Had Hurricane Ike been a severe storm -- a Category 3 or 4 -- more people would have evacuated, and we would have experienced roadway gridlock."
The report shows that significantly fewer people evacuated during Hurricane Ike than during Hurricane Rita, but a large portion of the population left areas that were not under an evacuation order.
"The timing of evacuations showed no improvement over the experience during Hurricane Rita, when roadways experienced paralyzing gridlock," Stein said. "People evacuating from Hurricane Ike all left too late, potentially creating the same conditions that existed during Hurricane Rita had a larger population evacuated."
The report details the results of surveys that assessed people's experience before, during and after each hurricane's landfall. The surveys were conducted in the weeks immediately after each storm -- Sept. 29-Oct. 3 for the Hurricane Rita survey, and Sept. 23-Oct. 24 for the Hurricane Ike survey.
The report is intended to enable policymakers and leaders to be more effective in getting their constituents to comply with evacuation orders.
The report also found:
Local television weather reporters were the most-relied-upon source of information for both hurricanes. During Hurricane Ike, the Weather Channel was the second most-relied-upon source.
In non-evacuation zones during Hurricane Rita, 40 percent of residents evacuated. These "shadow evacuees" were largely responsible for the road congestion. During Hurricane Ike, that number fell to 21 percent.
Evacuees during Hurricane Ike responded correctly by taking fewer vehicles and slightly more people per vehicle. This was particularly true for people from areas under an evacuation order.
The release of this report coincides with a free public forum at Rice University March 12 featuring Houston Mayor Bill White and Harris County Judge Ed Emmett discussing the leadership challenges they had to overcome to guide Houston through the disaster. "Leadership in Crisis: Guiding Houston through the Storm" will be held from 6 to 7 p.m. in Sewall Hall, Room 301, on the Rice campus, 6100 Main St. Stein and report co-authors Leonardo Dueñas-Osorio, assistant professor of civil and environmental engineering, and Devika Subramanian, professor of computer science and of electrical and computer engineering, will be available to take questions before and after the event.
The full report is available at http://www.media.rice.edu/images/media/0312_CCE_HurricaneIke_report.pdf

Earth's Crust Melts Easier Than Previously Thought

ScienceDaily (Mar. 19, 2009) — A University of Missouri study just published in Nature has found that the Earth's crust melts more easily than previously thought. In the study, researchers measured how well rocks conduct heat at different temperatures and found that as rocks get hotter in the Earth's crust, they become better insulators and poorer conductors.
This finding provides insight into how magmas are formed and will lead to better models of continental collision and the formation of mountain belts.
"In the presence of external heat sources, rocks will heat up more efficiently than previously thought," said Alan Whittington, professor of geological sciences in the MU College of Arts and Science. "We applied our findings to computer models that predict what happens to rocks when they get buried and heat up in mountain belts, such as the Himalayas today or the Black Hills in South Dakota in the geologic past. We found that strain heating, caused by tectonic movements during mountain belt formation, quite easily triggers crustal melting."
In the study, researchers used a laser-based technique to determine how long it took heat to conduct through different rock samples. In all of the samples, thermal diffusivity, or how well a material conducts heat, decreased rapidly with increasing temperatures. Researchers found the thermal diffusivity of hot rocks and magmas to be half that of what had been previously assumed.
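The size of that effect can be illustrated with a toy one-dimensional heat-conduction model (an explicit finite-difference sketch with invented parameters, not the authors' laser measurements): halving diffusivity slows how quickly heat conducts away from a hot contact, so rock near a heat source stays hot while rock farther away warms more slowly.

```python
# Illustrative only: 1-D explicit finite-difference heat conduction.
# All numbers are invented for illustration, not the study's data.
def heat_penetration(kappa, nx=50, dx=1.0, dt=4000.0, steps=5000):
    """Temperature profile (degrees above ambient) in a rock slab
    after steps*dt seconds, with a hot boundary held at 100 at x=0."""
    T = [0.0] * nx
    T[0] = 100.0                    # hot contact, e.g. an intrusion
    r = kappa * dt / dx**2          # must stay below 0.5 for stability
    for _ in range(steps):
        prev = T[:]
        for i in range(1, nx - 1):
            T[i] = prev[i] + r * (prev[i+1] - 2*prev[i] + prev[i-1])
    return T

high = heat_penetration(kappa=1.0e-6)  # older assumed diffusivity
low = heat_penetration(kappa=0.5e-6)   # roughly half, as measured

# With lower diffusivity, heat conducts away more slowly: rock a few
# metres from the contact stays cooler while heat piles up at the source.
print(high[5] > low[5])  # → True
```

The same localization is why strain heating, generated in place during mountain building, can build up to melting temperatures instead of leaking away.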
"Most crustal melting on the Earth comes from intrusions of hot basaltic magma from the Earth's mantle," said Peter Nabelek, professor of geological sciences in the MU College of Arts and Science. "The problem is that during continental collisions, we don't see intrusions of basaltic magma into continental crust. These experiments suggest that because of low thermal diffusivity, strain heating is much faster and more efficient, and once rocks get heated, they stay hotter for much longer. Of course, these processes take millions of years to occur and we can only simulate them on a computer. This new data will allow us to create computer models that more accurately represent processes that occur during continental collisions."
The study was co-authored by Whittington, Nabelek and Anne Hofmeister, a professor at Washington University. The National Science Foundation funded this research.
Journal reference:
Temperature-dependent thermal diffusivity of the Earth's crust and implications for magmatism. Nature, March 19, 2009
Adapted from materials provided by University of Missouri-Columbia, via EurekAlert!, a service of AAAS.

Tree-eating Bugs Seen By Satellite As They Denude Invasive Tamarisk Trees In Southwest U.S.


ScienceDaily (Mar. 18, 2009) — More than 150 years after a small Eurasian tree named tamarisk or saltcedar started taking over river banks throughout the U.S. Southwest, saltcedar leaf beetles were unleashed to defoliate the exotic invader.
Now, University of Utah scientists say their new study shows it is feasible to use satellite data to monitor the extent of the beetle's attack on tamarisk, and whether use of the beetles may backfire with unintended environmental consequences.
"We don't have any idea of the long-term impacts of using the beetles; their release may have unexpected repercussions," says Philip Dennison, an assistant professor of geography and first author of the study scheduled for online publication later this month in the journal Remote Sensing of Environment.
"The impact of this defoliation is largely unknown," says study co-author Kevin Hultine, a research assistant professor of biology at the University of Utah. "The net impact of controlling tamarisk could be positive or negative."
"We would like on-the-ground scientists and managers to understand and think about the long-term impact – what are these riparian [riverbank] areas going to look like 15 years from now, and how can we maintain ecosystems" as well as water flows for farms, cities and river recreation, Hultine says.
Dennison and Hultine conducted the study with Jim Ehleringer, a distinguished professor of biology at the University of Utah; physical scientist Pamela Nagler, of the U.S. Geological Survey in Tucson, Ariz.; and Edward Glenn, a University of Arizona environmental scientist.
A Shady Invader from Eurasia
Anyone who has rafted Southwestern rivers like the Green and Colorado knows about the shady thickets of tamarisk that line the riverbanks. The trees can grow up to 30 feet tall. There are about 10 species of tamarisk.
The U.S. Animal and Plant Health Inspection Service (APHIS) says saltcedar or tamarisk is "a highly invasive, exotic weed" in the form of "a large shrub or small tree that was introduced to North America from Asia in the early 1800s. The plant has been used for windbreaks, ornamentals, and erosion control. By 1850, saltcedar had infested river systems and drainages in the Southwest, often displacing native vegetation."
"By 1938, infestations were found from Florida to California and as far north as Idaho," according to APHIS. "Saltcedar continues to spread rapidly and currently infests water drainages and areas throughout the United States."
Tamarisk dominates riverbank habitats, limiting camping areas for river runners, reducing diversity and providing poor habitat for some species of wildlife. Tamarisk also raises the risk of fires that destroy cottonwoods and other native plants but not tamarisk, which re-sprouts from roots. And tamarisk forms a dense canopy, also helping wipe out competing plants. Finally, tamarisk has a bad rap as a water-sucking wastrel that dries springs, lowers water tables and reduces stream flows, even impairing boating.
Dennison and Hultine say recent research indicates tamarisk's thirst is overstated.
"Some of the earliest research on tamarisk water use suggested tamarisk uses dramatically more water than other tree species," Hultine says. "So a lot of estimates on water loss over entire river reaches are based on information that now has been discredited in the scientific literature."
Hultine believes that unless aggressive programs to restore defoliated areas are implemented, tamarisk will be replaced by other invaders – Russian knapweed, Russian olive and pepperweed – that may use more water than tamarisk. Eradicating tamarisk with beetles also may reduce bird habitat, he adds.
Monitoring the Attack of the Tamarisk-Munching Beetles
The saltcedar leaf beetle, Diorhabda elongata, was brought to the U.S. from Kazakhstan. After an environmental assessment, APHIS approved the beetles for tamarisk control.
Dennison says thousands of the beetles first were released in Utah during summer 2004, then again in summer 2005 and 2006 at locations along the Colorado River near Moab. Widespread defoliation of tamarisk in the area was noted during summer 2007.
Because long stretches of rivers in the Colorado River Basin are remote, Dennison and colleagues decided to test the feasibility of using satellite images to detect tamarisk leaf loss due to the spread of the saltcedar leaf beetles.
They mapped 56 accessible areas of tamarisk already defoliated by the beetles, and studied whether the defoliation could be detected using two instruments on Terra, one of the National Aeronautics and Space Administration's Earth-observing satellites.
Both instruments make images using red and near-infrared light. Plant pigments absorb red from sunlight and reflect near-infrared. In near-infrared images, tamarisk-covered areas appear red. Defoliated areas appear brown or black because there are no leaves to absorb red light and reflect near-infrared light. The two instruments are:
ASTER, the Advanced Spaceborne Thermal Emission and Reflection Radiometer, obtains relatively high-resolution images, with each pixel covering an area about 50 feet long by 50 feet wide. It can detect big changes like tamarisk defoliation on an even smaller scale. It only obtains one to three images of a given area every summer.
MODIS, the Moderate Resolution Imaging Spectroradiometer, which can detect less detail – a pixel measures about 820 feet by 820 feet. But it can see where large swaths of tamarisk have been defoliated, Dennison says. MODIS makes daily images.
Dennison says the infrequent, higher-resolution ASTER images allow researchers to map defoliated areas, while the frequent, lower-resolution MODIS images help them detect changes in vegetation over time.
The area studied included four sites along the Colorado River northeast of Moab, and a fifth site along the tributary Dolores River at the Entrada Field Station operated by the University of Utah for education and research. The five sites covered 589 acres, and within them, researchers mapped 56 polygon-shaped areas totaling 57 acres where tamarisk had been defoliated by the beetles.
ASTER measured what is known as NDVI – the normalized difference vegetation index, computed from the red light absorbed by plants and the near-infrared light they reflect. The index is high when plants are present, low when they are absent.
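The index itself is simple to compute from the two bands (a minimal sketch; the reflectance values below are invented for illustration, not ASTER measurements):

```python
def ndvi(red, nir):
    """Normalized difference vegetation index from red and
    near-infrared reflectances (both in the range 0..1)."""
    return (nir - red) / (nir + red)

# Healthy tamarisk canopy: absorbs red, reflects near-infrared strongly.
print(round(ndvi(red=0.05, nir=0.50), 2))  # → 0.82 (dense foliage)

# Beetle-defoliated stand: little red absorption, weak NIR reflectance.
print(round(ndvi(red=0.20, nir=0.25), 2))  # → 0.11 (sparse foliage)
```

A sharp year-to-year drop in NDVI over a mapped tamarisk stand is the defoliation signal the researchers looked for in the satellite record.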
Those satellite measurements showed minor changes in vegetation at the test sites from 2005 to 2006, but a large change between 2006 and 2007 – indicating extensive defoliation of tamarisk, even though the defoliated plants regrow within about six weeks.
The satellite's MODIS instrument used another vegetation index that also revealed widespread tamarisk defoliation at the five sites in July 2007.
While some tamarisk has died in Nevada where the beetles first were established, "we don't understand whether repeated defoliation eventually will kill most of the trees, or will they reach some point where they'll just have less leaf area over the entire year," Hultine says.
The researchers also used the satellite to estimate "evapotranspiration" – the evaporation of water from soil and the transpiration or use of water by plants – to learn more about how defoliation of tamarisk affects water use. For comparison, Hultine measured sap flow through trees, which reflects how much water is used by the trees.
Satellite estimates of tamarisk water use declined modestly as the plants were defoliated, Dennison says. The findings also were consistent with earlier research indicating tamarisk is less of a water hog than previously thought.
Dennison says he and his colleagues did the study to test the feasibility of using satellites to monitor tamarisk defoliation on an ongoing basis. That, he says, could be done by federal agencies such as the Bureau of Land Management, Bureau of Reclamation and U.S. Geological Survey.
Adapted from materials provided by University of Utah.

Tuesday, March 17, 2009

Robot Sub Searches For Signs Of Melting 60 Km Into An Antarctic Ice Shelf Cavity


ScienceDaily (Mar. 17, 2009) — Autosub, a robot submarine built and developed by the UK's National Oceanography Centre, Southampton, has successfully completed a high-risk campaign of six missions travelling under an Antarctic glacier.
Autosub has been exploring Pine Island Glacier, a floating extension of the West Antarctic ice sheet, using sonar scanners to map the seabed and the underside of the ice as it juts into the sea. Scientists hope to learn why the glacier has been thinning and accelerating over recent decades. Pine Island Glacier is in the Amundsen Sea, part of the South Pacific bordering West Antarctica. Changes in its flow have been observed since the early 1970s, and together with neighbouring glaciers it is currently contributing about 0.25 mm a year to global sea level rise.
Steve McPhail led the Autosub team during the ten-day survey. He said: "Autosub is a completely autonomous robot: there are no connecting wires with the ship and no pilot. Autosub has to avoid collisions with the jagged ice overhead and the unknown seabed below, and return to a pre-defined rendezvous point, where we crane it back onboard the ship.
"Adding to the problems are the sub-zero water temperatures and the crushing pressures at 1000 m depth. All systems on the vehicle must work perfectly while under the ice or it would be lost. There is no hope of rescue 60 km in, with 500 metres of ice overhead."
An international team of scientists led by Dr Adrian Jenkins of British Antarctic Survey and Stan Jacobs of the Lamont-Doherty Earth Observatory, Columbia University, New York on the American ship, the RVIB Nathaniel B Palmer, has been using the robot sub to investigate the underside of the ice and measure changes in salinity and temperature of the surrounding water.
After a test mission in unusually ice-free seas in front of the face of the glacier, they started with three 60km forays under the floating glacier and extended the length of missions to 110km round-trip. In all, a distance over 500km beneath the ice was studied.
Using its sonar, the Autosub picks its way through the water, while creating a three-dimensional map that the scientists will use to determine where and how the warmth of the ocean waters drives melting of the glacier base.
"There is still much work to be done on the processing of the data", said Adrian Jenkins, "but the picture we should get of the ocean beneath the glacier will be unprecedented in its extent and detail. It should help us answer critical questions about the role played by the ocean in driving the ongoing thinning of the glacier."
The lead US researcher on the project, Stan Jacobs, is studying the Pine Island Glacier with International Polar Year (IPY) funding from the National Science Foundation (NSF). One of the IPY research goals is to better understand the dynamics of the world's massive ice sheets, including the West Antarctic Ice Sheet. If this were to melt completely, global sea levels would rise significantly. The most recent report of the Intergovernmental Panel on Climate Change (IPCC) noted that because so little is understood about ice-sheet behaviour, it is difficult to predict how ice sheets will contribute to sea level rise in a warming world. The behaviour of ice sheets, the IPCC report said, is one of the major uncertainties in predicting exactly how the warming of the globe will affect human populations.
Complementing the Autosub exploration, other work during the 53-day NB Palmer cruise included setting out 15 moored instrument arrays to record the variability in ocean properties and circulation over the next two years, extensive profiling of 'warm' and melt-laden seawater, sampling the perennial sea ice and swath-mapping deep, glacially-scoured troughs on the sea floor.
Autosub is an AUV – Autonomous Underwater Vehicle – designed, developed and built at the National Oceanography Centre, Southampton with funding from the Natural Environment Research Council. Autosub has a maximum range of 400 km and is powered by 5,000 ordinary D-cell batteries, packed in bundles in pressure-tested housings. At either end of the seven-metre sub are free-flooding areas where the payload of instruments is installed. It carries a multibeam sonar system that builds up a 3D map of the ice above and the seabed below. It also carries precision instruments for measuring the salinity, temperature and oxygen concentrations in the seawater within the ice cavity, which are vital to understanding the flow of water within the cavity and the rate of melting. Autosub weighs 3.5 tonnes and, travelling at 6 km per hour, can dive to 1600 m and operate for 72 hours (400 km) between battery changes.
Adapted from materials provided by National Oceanography Centre, Southampton (UK).

Cleaning Up Oil Spills Can Kill More Fish Than Spills Themselves

ScienceDaily (Mar. 17, 2009) — A new Queen's University study shows that detergents used to clean up spills of diesel oil actually increase the oil's toxicity to fish.

"The detergents may be the best way to treat spills in the long term because the dispersed oil is diluted and degraded," says Biology professor Peter Hodson. "But in the short term, they increase the bioavailability and toxicity of the fuel to rainbow trout by 100-fold."
The detergents are oil dispersants that decrease the surface tension between oil and water, allowing floating oil to mix with the water as tiny droplets. Dr. Hodson and his team found that dispersion reduces the potential impacts of oil on surface-dwelling animals. While this should enhance biodegradation, it also creates a larger reservoir of oil in the water column.
This increases the transfer of hydrocarbons from oil to water, Dr. Hodson explains. The hydrocarbons pass easily from water into tissues and are deadly to fish in the early stages of life. "This could seriously impair the health of fish populations, resulting in long-term reductions in economic returns to fisheries," he says.
The study is published in the journal, Environmental Toxicology and Chemistry.
The researchers also determined that even though chemical dispersants are not typically used in freshwater, turbulent rivers can disperse spilled diesel and create similar negative effects.
"It doesn't matter if the oil is being dispersed by chemicals or by the current," says Dr. Hodson. "Now that we know how deadly dispersed oil is, it is important to assess the risks of diesel spills to fish and fisheries in terms of the spill location, and the timing relative to fish spawning and development."
Funding for the study was provided by the Natural Sciences and Engineering Research Council of Canada (NSERC) and by Petroleum Research Atlantic Canada. Also on the research team are Allison Schein and Jason Scott from Queen's School of Environmental Studies and environmental consultant Lizzy Mos.
Adapted from materials provided by Queen's University.

Climate-related Changes Affect Life On The Antarctic Peninsula


ScienceDaily (Mar. 17, 2009) — Scientists have long established that the Antarctic Peninsula is one of the most rapidly warming spots on Earth. Now, new research using detailed satellite data indicates that the changing climate is affecting not just the penguins at the apex of the food chain, but simultaneously the microscopic life that is the base of the ecosystem.
The research was published in the journal Science by researchers with the National Science Foundation's (NSF) LTER (Long Term Ecological Research) program. The LTER, which has 26 sites around the globe, including two in Antarctica, enables tracking of ecological variables over time, so that the mechanisms of climate change impact on ecosystems can be revealed. The specific findings were made by researchers with the Palmer LTER, using data collected near Palmer Station and from the research vessel Laurence M. Gould. Both Palmer Station and the Laurence M. Gould are operated by NSF's Office of Polar Programs.
Hugh Ducklow, of the Marine Biological Laboratory at Woods Hole, the principal investigator for the Palmer LTER project, said that the new findings are scientifically significant, and that they are also consistent with the climate trends on the Peninsula and other observed changes.
However, it took new scientific tools and analytical work by post-doctoral fellow Martin Montes Hugo to verify scientifically what scientists had been inferring from other changes for some time.
"I have to say the findings weren't a surprise; I think with the weight of all the other observations that we had on changes happening to organisms higher up in the food chain, we thought that phytoplankton weren't going to escape this level of climate change," Ducklow said. "But it took Martin to have all the right tools and the abilities to go in and do the analysis and prove what we suspected."
Those data, gathered over years, were essential to tracking patterns that supported the new findings.
"That's the beauty of the LTER program," he added.
Over the past 50 years, winter temperatures on the Peninsula have risen five times faster than the global average and the duration of sea-ice coverage has decreased. A warm, moist maritime climate has moved into the northern Peninsula region, pushing the continental, polar conditions southward.
As a result, the prevalence of species that depend on sea ice, such as Adelie penguins, Antarctic silverfish and krill, has decreased in the Peninsula's northern region, and new species that typically avoid ice, such as Gentoo and Chinstrap penguins, and lanternfish are moving into the habitat.
The LTER researchers show that satellite data on ocean color, temperature, sea ice and winds indicate that phytoplankton at the base of the food chain are also responding to changes in sea-ice cover and winds driven by climate change. However, there are contrasting changes in northern and southern regions, and the satellite and ground-based data provide insights into the forcing mechanisms for each region.
The researchers weren't surprised that primary productivity in the waters of the Peninsula has changed dramatically over the last 20 years. But the contrasting changes in the north and south were a surprise.
In the north, where ice-dependent species are disappearing, sea-ice cover has declined and wind stress has increased. The stronger winds and reduced sea ice cause greater mixing of the surface ocean waters. The result is a deeper surface mixed layer that exposes phytoplankton cells to less light, lowering primary productivity rates and shifting the mix of phytoplankton species.
Conversely, in the southern Peninsula waters, where ice-dependent species continue to thrive, the situation is reversed. There, sea ice loss has been in areas where it formerly covered most of the ocean surface for most of the year. Now, ice is less prevalent, exposing more water to sunlight and stimulating phytoplankton growth. The ice loss in the South, combined with less wind stress, promotes the formation of a shallower mixed layer, with increased light and the development of large phytoplankton cells, such as diatoms. Diatoms, single-celled creatures, form the base of the rich Antarctic food web that includes krill, penguins and whales.
Journal reference:
Montes-Hugo et al. Recent Changes in Phytoplankton Communities Associated with Rapid Regional Climate Change Along the Western Antarctic Peninsula. Science, March 13, 2009; 323 (5920): 1470 DOI: 10.1126/science.1164533
Adapted from materials provided by National Science Foundation.

Monday, March 16, 2009

New Tool For Study Of Air Quality Developed

ScienceDaily (Mar. 17, 2009) — Air quality models have achieved a great degree of sophistication over the last few years, thanks mainly to scientific and computational advances. These tools simulate the dynamics of the atmosphere and estimate the impact of particular sources of contamination, such as industries or traffic, on air quality, so that plans and decisions can then be made according to the results produced.

The Grupo de Modelos y Software para el medio Ambiente of the Facultad de Informática at the Universidad Politécnica de Madrid has developed a sophisticated tool (OPANA) that uses latest-generation models to estimate the impact of air quality on the health of citizens.
The tool is based on advanced numerical methods that produce precise estimates of the concentration of a given atmospheric contaminant that a person breathes at a particular time and place, from a particular source (an industry, an incinerator, a motorway, etc.). Thanks to the enormous computing power available today, the consequent impact can be determined under almost any circumstances and at almost any distance from the source.
To ensure that the results obtained are reliable, accurate data must be fed into the tool. It requires detailed information about the topography of the site under study, land uses obtained through remote sensing, meteorological information, relevant information about the area's surroundings and, above all, an accurate estimate of the emissions that occur in the area and its surroundings.
With these tools, it is possible to evaluate the impact that a new industry would have on the atmospheric contamination of an area and carry out experiments using different scenarios to be compared against the current conditions. In this way, the best decisions can be made to protect the health of inhabitants of the area.
Journal reference:
Sanjose et al. The evaluation of the air quality impact of an incinerator by using MM5-CMAQ-EMIMO modeling system: North of Spain case study. Environment International, 2008; 34 (5): 714 DOI: 10.1016/j.envint.2007.12.010
Adapted from materials provided by Universidad Politécnica de Madrid, via AlphaGalileo.

Atmospheric 'Sunshade' Could Reduce Solar Power Generation


ScienceDaily (Mar. 16, 2009) — The concept of delaying global warming by adding particles into the upper atmosphere to cool the climate could unintentionally reduce peak electricity generated by large solar power plants by as much as one-fifth, according to a new NOAA study.
“Injecting particles into the stratosphere could have unintended consequences for one alternative energy source expected to play a role in the transition away from fossil fuels,” said author Daniel Murphy, a scientist at NOAA’s Earth System Research Laboratory in Boulder, Colo.
The Earth is heating up as fossil-fuel burning produces carbon dioxide, the primary heat-trapping gas responsible for man-made climate change. To counteract the effect, some geoengineering proposals are designed to slow global warming by shading the Earth from sunlight.
Among the ideas being explored is injecting small particles into the upper atmosphere to produce a climate cooling similar to that of large volcanic eruptions, such as Mt. Pinatubo’s in 1991. Airborne sulfur hovering in the stratosphere cooled the Earth for about two years following that eruption.
Murphy found that particles in the stratosphere reduce the amount and change the nature of the sunlight that strikes the Earth. Though a fraction of the incoming sunlight bounces back to space (the cooling effect), a much larger amount becomes diffuse, or scattered, light.
On average, for every watt of sunlight the particles reflect away from the Earth, another three watts of direct sunlight are converted to diffuse sunlight. Large power-generating solar plants that concentrate sunlight for maximum efficiency depend solely on direct sunlight and cannot use diffuse light.
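That 1-to-3 ratio is why concentrating plants take a disproportionate hit, as some simple bookkeeping makes clear (the 3 percent reflected fraction below is an assumed input for illustration, not a figure from the study):

```python
def solar_output(reflected_fraction):
    """Split an initial 100 units of direct sunlight after aerosol
    injection, using the study's ratio: for every 1 unit reflected
    back to space, 3 more units are converted from direct to diffuse."""
    reflected = 100 * reflected_fraction
    diffused = 3 * reflected           # converted to diffuse light
    direct = 100 - reflected - diffused
    total = direct + diffused          # all light reaching the ground
    return direct, total

direct, total = solar_output(0.03)     # assume 3% reflected away
print(direct)  # concentrating plants use only this: 88 units
print(total)   # flat panels use direct + diffuse: 97 units
```

Under this assumed 3 percent reflection, total sunlight at the ground drops only 3 percent, but the direct beam that concentrating systems depend on drops 12 percent, echoing the post-Pinatubo pattern described below.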
Murphy verified his calculations using long-term NOAA observations of direct and diffuse sunlight before and after the 1991 eruption.
After the eruption of Mt. Pinatubo, peak power output of Solar Electric Generating Stations in California, the largest collective of solar power plants in the world, fell by up to 20 percent, even though the stratospheric particles from the eruption reduced total sunlight that year by less than 3 percent.
“The sensitivity of concentrating solar systems to stratospheric particles may seem surprising,” said Murphy. “But because these systems use only direct sunlight, increasing stratospheric particles has a disproportionately large effect on them.”
Nine Solar Electric Generating Stations operate in California and more are running or are under construction elsewhere in the world. In sunny locations such systems, which use curved mirrors or other concentrating devices, generate electricity at a lower cost than conventional photovoltaic, or solar, cells.
Flat photovoltaic and hot water panels, commonly seen on household roofs, use both diffuse and direct sunlight. Their energy output would decline much less than that from concentrating systems.
Even low-tech measures to balance a home’s energy, such as south-facing windows for winter heat and overhangs for summer shade, would be less effective if direct sunlight is reduced.
The findings recently appeared in the journal Environmental Science and Technology.
Journal reference:
Daniel M. Murphy. Effect of Stratospheric Aerosols on Direct Sunlight and Implications for Concentrating Solar Power. Environmental Science & Technology, 2009; 090311080700076 DOI: 10.1021/es802206b
Adapted from materials provided by National Oceanic And Atmospheric Administration.

Sea Level Rise Due To Global Warming Poses Threat To New York City


ScienceDaily (Mar. 16, 2009) — Global warming is expected to cause the sea level along the northeastern U.S. coast to rise almost twice as fast as global sea levels during this century, putting New York City at greater risk for damage from hurricanes and winter storm surge, according to a new study led by a Florida State University researcher.
Jianjun Yin, a climate modeler at the Center for Ocean-Atmospheric Prediction Studies (COAPS) at Florida State, said there is a better than 90 percent chance that the sea level rise along this heavily populated coast will exceed the mean global sea level rise by the year 2100. The rising waters in this region -- perhaps by as much as 18 inches or more -- can be attributed to thermal expansion and the slowing of the North Atlantic Ocean circulation because of warmer ocean surface temperatures.
Yin and colleagues Michael Schlesinger of the University of Illinois at Urbana-Champaign and Ronald Stouffer of Geophysical Fluid Dynamics Laboratory at Princeton University are the first to reach that conclusion after analyzing data from 10 state-of-the-art climate models, which have been used for the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report. Yin's study is published in the journal Nature Geoscience.
"The northeast coast of the United States is among the most vulnerable regions to future changes in sea level and ocean circulation, especially when considering its population density and the potential socioeconomic consequences of such changes," Yin said. "The most populous states and cities of the United States and centers of economy, politics, culture and education are located along that coast."
The researchers found that the rapid sea-level rise occurred in all climate models whether they depicted low, medium or high rates of greenhouse-gas emissions. In a medium greenhouse-gas emission scenario, the New York City coastal area would see an additional rise of about 8.3 inches above the mean sea level rise that is expected around the globe because of human-induced climate change.
Thermal expansion and the melting of land ice, such as the Greenland ice sheet, are expected to drive the global sea-level rise. The researchers projected a global sea-level rise of 10.2 inches based on thermal expansion alone; the contribution from melting land ice was not assessed in this study because of its uncertainty.
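The two contributions combine by simple addition (a sketch using only the figures quoted in this article):

```python
# New York City sea-level rise under the medium-emission scenario,
# using only the numbers quoted in this article (inches).
global_thermal_expansion = 10.2   # projected global mean rise
regional_extra = 8.3              # additional rise for the NYC coast
nyc_total = global_thermal_expansion + regional_extra
print(nyc_total)  # about 18.5 inches, consistent with the
                  # "perhaps as much as 18 inches or more" above
```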
Considering that much of the metropolitan region of New York City is less than 16 feet above the mean sea level, with some parts of lower Manhattan only about 5 feet above the mean sea level, a rise of 8.3 inches in addition to the global mean rise would pose a threat to this region, especially if a hurricane or winter storm surge occurs, Yin said.
Potential flooding is just one example of coastal hazards associated with sea-level rise, Yin said, but there are other concerns as well. The submersion of low-lying land, erosion of beaches, conversion of wetlands to open water and increase in the salinity of estuaries all can affect ecosystems and damage existing coastal development.
Although low-lying Florida and Western Europe are often considered the most vulnerable to sea level changes, the northeast U.S. coast is particularly vulnerable because the Atlantic meridional overturning circulation (AMOC) is susceptible to global warming. The AMOC is the giant circulation in the Atlantic with warm and salty seawater flowing northward in the upper ocean and cold seawater flowing southward at depth. Global warming could cause an ocean surface warming and freshening in the high-latitude North Atlantic, preventing the sinking of the surface water, which would slow the AMOC.
Journal reference:
Yin et al. Model projections of rapid sea-level rise on the northeast coast of the United States. Nature Geoscience, March 15, 2009; DOI: 10.1038/ngeo462
Adapted from materials provided by Florida State University, via EurekAlert!, a service of AAAS.

Researchers Study Cave’s 'Breathing' For Better Climate Clues

ScienceDaily (Mar. 16, 2009) — A University of Arkansas researcher studying the way caves “breathe” is providing new insights into the process by which scientists study paleoclimates.
Katherine Knierim, a graduate student at the University of Arkansas, together with Phil Hays of the geosciences department and the U.S. Geological Survey and Erik Pollock of the University of Arkansas Stable Isotope Laboratory, are conducting close examinations of carbon cycling in an Ozark cave. Caves “breathe” in the sense that air flows in and out as air pressure changes.
The researchers have found that carbon dioxide pressures vary with external temperatures and ground cover, indicating a possible link between the carbon found in rock formations in the caves and seasonal changes. They presented their findings at a recent meeting of the American Geophysical Union.
The movement of carbon in cave systems is controlled by the concentration of carbon dioxide. When conditions are right, this carbon can be deposited as layers in stalagmites, stalactites and soda straws. These layers resemble the rings found in trees, except that they can date back millions of years and hold information about cave conditions.
“People have been using these formations as paleoclimate records,” Hays said. However, researchers make an assumption when they do so.
“The problem is that you have to assume you are getting even carbon and oxygen isotope exchange,” Knierim said. Isotopes are atoms of the same element with slightly different masses; they are found in plants, animals, organic matter and rocks. Different types of material have unique “signatures,” or characteristic proportions of atoms at a particular atomic weight.
By looking at carbon isotope ratios in cave topsoils, the cave atmosphere and the stream within the cave, Knierim and her colleagues will be able to determine how much each carbon source contributes to the formations. This will help scientists reconstruct paleoclimate conditions from cave formations more accurately.
A greater knowledge of how carbon cycles through cave systems also will help scientists develop better methods for watershed management.
The researchers are in the geosciences department of the J. William Fulbright College of Arts and Sciences.
Adapted from materials provided by University of Arkansas, Fayetteville.

Ninth Warmest February For Globe, NOAA

ScienceDaily (Mar. 16, 2009) — The combined global land and ocean surface average temperature for February 2009 was the ninth warmest since records began in 1880, according to an analysis by NOAA’s National Climatic Data Center in Asheville, N.C.
The analyses in NCDC’s global reports are based on preliminary data, which are subject to revision. Additional quality control is applied when later reports arrive several weeks after the end of the month and as improved scientific methods refine NCDC’s processing algorithms.
Temperature Highlights – February
The combined global land and ocean surface temperature for February was 54.80 degrees F, 0.90 degree F above the 20th century mean of 53.9 degrees F, ranking as the ninth warmest on record.
Separately, the global land surface temperature was 39.38 degrees F, 1.58 degrees F above the 20th century mean of 37.8 degrees F.
The global ocean surface temperature of 61.25 degrees F ranked as eighth warmest on record and was 0.65 degree F above the 20th century mean of 60.6 degrees F.
Temperature Highlights – Boreal (Meteorological) Winter
The combined global land and ocean surface temperature for boreal winter (December-February) was 54.72 degrees F, 0.92 degree F above the 20th century mean of 53.8 degrees F, ranking as the eighth warmest on record.
Separately, the global land surface temperature was 39.31 degrees F, 1.51 degrees F above the 20th century mean of 37.8 degrees F, ranking as ninth warmest on record.
The global ocean surface temperature of 61.20 degrees F ranked as seventh warmest on record and was 0.70 degree F above the 20th century mean of 60.5 degrees F.
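The anomalies quoted in both sections follow directly from the absolute temperatures and the 20th-century means. A short sketch that recomputes a few of them, including the standard conversion of a Fahrenheit difference to Celsius (the figures are from the article; the helper function is just for illustration):

```python
# Recompute the February and boreal-winter anomalies quoted above from the
# absolute temperatures and 20th-century means (all values from the article).

def anomaly_f(observed_f, mean_f):
    """Return the anomaly in degrees F and its equivalent in degrees C."""
    diff_f = observed_f - mean_f
    return diff_f, diff_f * 5.0 / 9.0  # a degree-F *difference* converts by 5/9

records = {
    "Feb combined land+ocean": (54.80, 53.9),
    "Feb land only":           (39.38, 37.8),
    "Feb ocean only":          (61.25, 60.6),
    "DJF combined land+ocean": (54.72, 53.8),
}

for name, (obs, mean) in records.items():
    d_f, d_c = anomaly_f(obs, mean)
    print(f"{name}: {d_f:+.2f} F ({d_c:+.2f} C) vs. 20th-century mean")
```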
Global Highlights for February
Based on NOAA satellite observations of snow cover extent, 10.7 million square miles (27.7 million square kilometers) of Eurasia (Europe and Asia) were covered by snow in February 2009, which is 0.4 million square miles (1.1 million square kilometers) below the 1966-2009 average of 11.1 million square miles (28.8 million square kilometers).
Satellite-based snow cover extent for the Northern Hemisphere was 17.4 million square miles (45.0 million square kilometers) in February, which is 0.3 million square miles (0.9 million square kilometers) below the 1966-2009 average of 17.7 million square miles (45.9 million square kilometers).
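The paired square-mile and square-kilometer figures can be cross-checked with the exact mile-to-kilometer definition. A minimal sketch (region labels and values taken from the paragraphs above):

```python
# Cross-check the square-mile / square-kilometer pairs quoted above.
# 1 mile = 1.609344 km exactly, so 1 mi^2 = ~2.59 km^2.

MI2_TO_KM2 = 1.609344 ** 2

snow_cover_mmi2 = {
    "Eurasia, Feb 2009":             10.7,  # million square miles
    "Northern Hemisphere, Feb 2009": 17.4,
}

for region, mmi2 in snow_cover_mmi2.items():
    mkm2 = mmi2 * MI2_TO_KM2
    print(f"{region}: {mmi2} M mi^2 = {mkm2:.1f} M km^2")
```

The results (about 27.7 and 45.1 million km²) match the converted figures in the text to within rounding.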
Arctic sea ice coverage during February 2009 was at its fourth lowest February extent since satellite records began in 1979, according to the National Snow and Ice Data Center. Average ice extent during February was 5.7 million square miles (14.8 million square kilometers). The Arctic sea ice pack usually expands during the cold season, reaching a maximum in March, then contracts during the warm season, reaching a minimum in September.
Very hot, dry conditions affected southern Australia during the end of January and beginning of February. An intense heat wave February 6-8 produced a high temperature of 119.8 degrees F at Hopetoun, Victoria, on Feb. 7, surpassing the previous record of 117.0 degrees F set in January 1939. This is a state record and perhaps the highest temperature ever recorded at such a southerly latitude. The hot, dry conditions contributed to the deadliest wildfires in Australia's history.
China declared its highest level of emergency for eight provinces that were suffering from their worst drought in 50 years. The drought conditions, which began in November 2008, affected more than 4 million people and more than 24 million acres of crops.
A strong winter storm brought heavy snow to parts of the United Kingdom on February 2, disrupting transportation and bringing London to a virtual standstill. The event, in which up to 12 inches of snow fell in southeastern England, was the UK’s most widespread snow in 18 years, according to the UK Met Office.
Adapted from materials provided by National Oceanic And Atmospheric Administration.

Sunday, March 15, 2009

Silica Algae Reveal How Ecosystems React To Climate Changes


ScienceDaily (Mar. 14, 2009) — A newly published dissertation by Linda Ampel from the Department of Physical Geography and Quaternary Geology at Stockholm University in Sweden examined how rapid climate changes during the most recent ice age affected ecosystems in an area in continental Europe.
Rapid and extensive climate changes have taken place on several occasions in the past. For example, the latest ice age (lasting from about 115,000 to 11,500 years ago) is characterized by several rapid and dramatic climate swings. These swings recurred in cycles of roughly 1,500 years and were originally discovered through studies of ice cores from Greenland in the early 1990s. These cycles started with an extremely rapid rise in temperatures, over just a few years or decades, of as much as 8-16°C over Greenland.
Linda Ampel studied how these rapid climate cycles affected ecosystems in an area in continental Europe. The study was based on analyses of sediment cores from an overgrown lake named Les Echets in eastern France and focused on a time interval between 40,000 and 16,000 years ago.
The findings are based on analyses of fossil silica algae known as diatoms. Various species of diatoms prefer different water conditions relating to physical and chemical parameters such as temperature, salinity, access to nutrients, light, water depth, or available types of places to grow. These parameters, in turn, are affected by climate. Different species of diatoms can therefore indicate how the water environment changed as a consequence of the climate in the past.
Diatom analyses of the environmental archive from Les Echets, together with further analyses of chemical and biological parameters such as content of organic material and pollen grains from trees and other plants preserved in the lake, show that the ecosystems in the lake and its surroundings underwent marked changes during the latest ice age as a consequence of these 1,500-year cycles. The adaptation of the ecosystems prompted by the recurring warm periods took place as quickly as within 50 to 200 years.
“These findings show that ecosystems have changed rapidly in reaction to climate changes in the past, which indicates that quick adaptations could also take place in the future as a consequence of global warming, for instance,” says Linda Ampel.
Adapted from materials provided by Vetenskapsrådet (The Swedish Research Council), via AlphaGalileo.

Saturday, March 14, 2009

Wind Shifts May Stir Carbon Dioxide From Antarctic Depths, Amplifying Global Warming

ScienceDaily (Mar. 13, 2009) — Natural releases of carbon dioxide from the Southern Ocean, driven by shifting wind patterns, could have amplified global warming at the end of the last ice age, and the process could repeat as manmade warming proceeds, a new paper in the journal Science suggests.
Many scientists think that the end of the last ice age was triggered by a change in Earth's orbit that caused the northern part of the planet to warm. This partial climate shift was accompanied by rising levels of the greenhouse gas CO2, ice core records show, which could have intensified the warming around the globe. A team of scientists at Columbia University's Lamont-Doherty Earth Observatory now offers one explanation for the mysterious rise in CO2: the orbital shift triggered a southward displacement in westerly winds, which caused heavy mixing in the Southern Ocean around Antarctica, pumping dissolved carbon dioxide from the water into the air.
"The faster the ocean turns over, the more deep water rises to the surface to release CO2," said lead author Robert Anderson, a geochemist at Lamont-Doherty. "It's this rate of overturning that regulates CO2 in the atmosphere." In the last 40 years, the winds have shifted south much as they did 17,000 years ago, said Anderson. If they end up venting more CO2 into the air, manmade warming underway now could be intensified.
Scientists have been studying the oceans for more than 25 years to understand their influence on CO2 levels and the glacial cycles that have periodically heated and chilled the planet for more than 600,000 years. Ice cores show that the ends of other ice ages also were marked by rises in CO2.
Two years ago, J.R. Toggweiler, a scientist at the National Oceanic and Atmospheric Administration (NOAA), proposed that westerly winds in the Southern Ocean around Antarctica may have undergone a major shift at the end of the last ice age. This shift would have raised more CO2-rich deep water to the surface, and thus amplified warming already taking place due to the earth's new orbital position. Anderson and his colleagues are the first to test that theory by studying sediments from the bottom of the Southern Ocean to measure the rate of overturning.
The scientists say that changes in the westerlies may have been triggered by two competing events in the northern hemisphere about 17,000 years ago. The earth's orbit shifted, causing more sunlight to fall in the north, partially melting the ice sheets that then covered parts of the United States, Canada and Europe. Paradoxically, the melting may also have spurred sea-ice formation in the North Atlantic Ocean, creating a cooling effect there. Both events would have caused the westerly winds to shift south, toward the Southern Ocean. The winds simultaneously warmed Antarctica and stirred the waters around it. The resulting upwelling of CO2 would have caused the entire globe to heat.
Anderson and his colleagues measured the rate of upwelling by analyzing sediment cores from the Southern Ocean. When deep water is vented, it brings not only CO2 to the surface but nutrients. Phytoplankton consume the extra nutrients and multiply.
In the cores, Anderson and his colleagues say spikes in plankton growth between roughly 17,000 years ago and 10,000 years ago indicate added upwelling. By comparing those spikes with ice core records, the scientists realized the added upwelling coincided with hotter temperatures in Antarctica as well as rising CO2 levels.
In the same issue of Science, Toggweiler writes a column commenting on the work. "Now I think this really starts to lock up how the CO2 changed globally," he said in an interview. "Here's a mechanism that can explain the warming of Antarctica and the rise in CO2. It's being forced by the north, via this change in the winds."
At least one model supports the evidence. Richard Matear, a researcher at Australia's Commonwealth Scientific and Industrial Research Organisation, describes a scenario in which winds shift south and produce an increase in CO2 venting in the Southern Ocean. Plants, which incorporate CO2 during photosynthesis, are unable to absorb all the added nutrients, causing atmospheric CO2 to rise.
Some other climate models disagree. In those used by the Intergovernmental Panel on Climate Change, the westerly winds do not simply shift north-south. "It's more complicated than this," said Axel Timmermann, a climate modeler at the University of Hawaii. Even if the winds did shift south, Timmermann argues, upwelling in the Southern Ocean would not have raised CO2 levels in the air. Instead, he says, the intensification of the westerlies would have increased upwelling and plant growth in the Southeastern Pacific, and this would have absorbed enough atmospheric CO2 to compensate for the added upwelling in the Southern Ocean.
"Differences among model results illustrate a critical need for further research," said Anderson. These include "measurements that document the ongoing physical and biogeochemical changes in the Southern Ocean, and improvements in the models used to simulate these processes and project their impact on atmospheric CO2 levels over the next century."
Anderson says that if his theory is correct, the impact of upwelling "will be dwarfed by the accelerating rate at which humans are burning fossil fuels." But, he said, "It could well be large enough to offset some of the mitigation strategies that are being proposed to counteract rising CO2, so it should not be neglected."
In addition to Anderson, the paper was coauthored by Simon Nielsen of Florida State University, and five Lamont-Doherty researchers: Shahla Ali, Louisa Bradtmiller, Martin Fleisher, Brenton Anderson and Lloyd Burckle. The study was funded by NOAA, the National Science Foundation, Norwegian Research Council and Norwegian Polar Institute.
Journal reference:
Anderson et al. Wind-Driven Upwelling in the Southern Ocean and the Deglacial Rise in Atmospheric CO2. Science, March 13, 2009
Adapted from materials provided by The Earth Institute at Columbia University, via EurekAlert!, a service of AAAS.

Friday, March 13, 2009

New Method For Monitoring Volcanoes

ScienceDaily (Mar. 13, 2009) — Seventeen of the world’s most active volcanoes have been equipped with monitoring instruments from Chalmers University of Technology in Sweden to measure their emissions of sulphur dioxide. The measurements will make it easier to predict volcanic eruptions and can also be used to improve today’s climate models.
One of the Chalmers researchers who developed the monitoring equipment is Mattias Johansson, who recently defended his doctoral dissertation in the subject.
The most active volcanoes in the world have special observatories that monitor them in order to be able to sound the alarm and evacuate people in the vicinity if an eruption threatens. These observatories keep track of several parameters, primarily seismic activity. Now 17 observatories have received a new parameter that facilitates their work – the volcanoes’ emissions of sulphur dioxide.
“Increasing gas emissions may indicate that magma is rising inside the volcano,” says Mattias Johansson at the Department of Radio and Space Science at Chalmers. “If this information is added to the other parameters, better risk estimates can be made at the observatories.”
The equipment he has been working with measures the total amount of gas emitted, whereas most other methods for metering gas can only indicate the gas concentration at a particular point. This is made possible by placing two or more metering instruments in different places around the volcano and then aggregating the information they gather.
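The article does not describe the Novac calculation in detail, but the general idea of turning point measurements into a total flux can be sketched: integrate the gas column across a plume cross-section, then multiply by the plume speed. All numbers and the function below are hypothetical illustrations, not the Chalmers software:

```python
# Hypothetical sketch of estimating a total gas flux from column measurements
# taken across a plume cross-section (not the actual Novac algorithm).

def so2_flux(columns_mg_m2, segment_widths_m, plume_speed_m_s):
    """Flux (kg/s): sum column * width across the plume, times plume speed."""
    cross_section_mg_m = sum(c * w for c, w in zip(columns_mg_m2, segment_widths_m))
    return cross_section_mg_m * plume_speed_m_s / 1e6  # mg/s -> kg/s

# e.g. five 200-m measurement segments across the plume, plume moving at 8 m/s
flux = so2_flux([120.0, 340.0, 510.0, 300.0, 90.0], [200.0] * 5, 8.0)
print(f"{flux:.3f} kg SO2/s")  # 2.176 kg SO2/s
```

With two or more instruments around the volcano, such cross-sections can be combined regardless of which way the wind carries the plume, which is what makes a total-emission estimate possible.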
Much of the Chalmers researchers’ work has involved making the equipment sufficiently automatic, robust, and energy-efficient for use in the inhospitable environment surrounding volcanoes, in poor countries with weak infrastructure.
“I have primarily been working with the software required for processing and presenting the measurement results,” says Mattias Johansson. “Among other things, I have created a program that analyzes the data collected, calculates the outward flow of gas, and presents the information as a simple graph on a computer screen that the observatory staff need only glance at to find out how much sulphur dioxide the volcano is emitting at any particular time.”
He has also participated in the installation of the equipment on two of the volcanoes, Etna in Italy and San Cristobal in Nicaragua. In Project Novac, which his research is part of, a total of 20 volcanoes will be provided with monitoring equipment from Chalmers.
It will also be possible to improve global climate models when the Chalmers researchers receive continuous reports about how much sulphur dioxide is emitted by the 20 most active volcanoes.
“Sulphur dioxide is converted in the atmosphere to sulphate particles, and these particles need to be factored into climate models if those models are to be accurate,” says Associate Professor Bo Galle, who supervised the dissertation. “Volcanoes are an extremely important source of sulphur dioxide. Etna alone, for instance, releases roughly ten times more sulphur dioxide than all of Sweden does.”
The methods that Mattias Johansson has devised can moreover be used to measure the total emissions of air pollutants from an entire city. China has already purchased equipment that they are now using to study the pollution situation in the megacity Beijing.
Adapted from materials provided by The Swedish Research Council, via AlphaGalileo.

Wednesday, March 11, 2009

Coral Reefs May Start Dissolving When Atmospheric Carbon Dioxide Doubles

ScienceDaily (Mar. 10, 2009) — Rising carbon dioxide in the atmosphere and the resulting effects on ocean water are making it increasingly difficult for coral reefs to grow, say scientists. A study to be published online March 13, 2009 in Geophysical Research Letters by researchers at the Carnegie Institution and the Hebrew University of Jerusalem warns that if carbon dioxide reaches double pre-industrial levels, coral reefs can be expected to not just stop growing, but also to begin dissolving all over the world.
The impact on reefs is a consequence of both ocean acidification caused by the absorption of carbon dioxide into seawater and rising water temperatures. Previous studies have shown that rising carbon dioxide will slow coral growth, but this is the first study to show that coral reefs can be expected to start dissolving just about everywhere in just a few decades, unless carbon dioxide emissions are cut deeply and soon.
"Globally, each second, we dump over 1000 tons of carbon dioxide into the atmosphere and, each second, about 300 tons of that carbon dioxide is going into the oceans," said co-author Ken Caldeira of the Carnegie Institution's Department of Global Ecology, testifying to the U.S. House of Representatives Subcommittee on Insular Affairs, Oceans and Wildlife of the Committee on Natural Resources on February 25, 2009. "We can say with a high degree of certainty that all of this CO2 will make the oceans more acidic – that is simple chemistry taught to freshman college students."
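Scaled to a year, these per-second figures line up with commonly cited emission totals. A quick sketch (the tons-per-second values are from Caldeira's testimony; the annual total and percentage are straightforward arithmetic):

```python
# Scale the per-second figures quoted by Caldeira to annual totals.
# The 1000 t/s and 300 t/s values are from the article; the rest is arithmetic.

EMISSION_T_PER_S = 1000.0      # tons of CO2 emitted to the atmosphere per second
OCEAN_UPTAKE_T_PER_S = 300.0   # tons of that CO2 absorbed by the oceans per second
SECONDS_PER_YEAR = 365.25 * 24 * 3600

annual_emissions_gt = EMISSION_T_PER_S * SECONDS_PER_YEAR / 1e9
ocean_fraction = OCEAN_UPTAKE_T_PER_S / EMISSION_T_PER_S

print(f"annual emissions: ~{annual_emissions_gt:.1f} Gt CO2")  # ~31.6 Gt
print(f"ocean uptake fraction: {ocean_fraction:.0%}")          # 30%
```

So roughly 30% of emitted CO2 is taken up by the oceans on these figures, which is the fraction driving the acidification discussed in the study.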
The study was designed to determine the impact of this acidification on coral reefs. The research team, consisting of Jacob Silverman, Caldeira, and Long Cao of the Carnegie Institution as well as Boaz Lazar and Jonathan Erez from The Hebrew University of Jerusalem, used field data from coral reefs to determine the effects of temperature and water chemistry on coral calcification rates. Armed with this information, they plugged the data into a computer model that calculated global seawater temperature and chemistry at different atmospheric levels of CO2 ranging from the pre-industrial value of 280 ppm (parts per million) to 750 ppm. The current atmospheric concentration is over 380 ppm, and is rapidly rising due to human-caused emissions, primarily the burning of fossil fuels.
Based on the model results for more than 9,000 reef locations, the researchers determined that at the highest concentration studied, 750 ppm, acidification of seawater would reduce calcification rates of three quarters of the world's reefs to less than 20% of pre-industrial rates. Field studies suggest that at such low rates, coral growth would not be able to keep up with dissolution and other natural as well as manmade destructive processes attacking reefs.
Prospects for reefs are even gloomier when the effects of coral bleaching are included in the model. Coral bleaching refers to the loss of symbiotic algae that are essential for healthy growth of coral colonies. Bleaching is already a widespread problem, and high temperatures are among the factors known to promote it. Using their model, the researchers calculated that under present conditions 30% of reefs have already undergone bleaching and that at CO2 levels of 560 ppm (twice pre-industrial levels) the combined effects of acidification and bleaching will reduce the calcification rates of all the world's reefs by 80% or more. This lowered calcification rate will render all reefs vulnerable to dissolution, without even considering other threats to reefs, such as pollution.
"Our fossil-fueled lifestyle is killing off coral reefs," says Caldeira. "If we don't change our ways soon, in the next few decades we will destroy what took millions of years to create."
"Coral reefs may be the canary in the coal mine," he adds. "Other major pieces of our planet may be similarly threatened because we are using the atmosphere and oceans as dumps for our CO2 pollution. We can save the reefs if we decide to treat our planet with the care it deserves. We need to power our economy with technologies that do not dump carbon dioxide into the atmosphere or oceans."
Journal reference:
Silverman, J., B. Lazar, L. Cao, K. Caldeira, and J. Erez. Coral reefs may start dissolving when atmospheric CO2 doubles. Geophys. Res. Lett., 2009; DOI: 10.1029/2008GL036282
Adapted from materials provided by Carnegie Institution, via EurekAlert!, a service of AAAS.