Thursday, October 1, 2009

Google Earth Application Maps Carbon's Course

ScienceDaily (Sep. 30, 2009) — Sometimes a picture really is worth a thousand words, particularly when the picture is used to illustrate science. Technology is giving us better pictures every day, and one of them is helping a NASA-funded scientist and her team to explain the behavior of a greenhouse gas.
Google Earth -- the digital globe on which computer users can fly around the planet and zoom in on key features -- is attracting attention in scientific communities and aiding public communication about carbon dioxide. Recently Google held a contest to present scientific results using KML, a data format used by Google Earth.
"I tried to think of a complex data set that would have public relevance," said Tyler Erickson, a geospatial researcher at the Michigan Tech Research Institute in Ann Arbor.
He chose to work with data from NASA-funded researcher Anna Michalak of the University of Michigan, Ann Arbor, who develops complex computer models to trace carbon dioxide back in time to where it enters and leaves the atmosphere.
"The datasets have three spatial dimensions and a temporal dimension," Erickson said. "Because the data is constantly changing in time makes it particularly difficult to visualize and analyze."
A better understanding of the carbon cycle has implications for energy and environmental policy and carbon management. In June 2009, Michalak described this research at the NASA Earth System Science at 20 symposium in Washington, D.C.
A snapshot from Erickson's Google Earth application shows green tracks representing carbon dioxide in the lowest part of the atmosphere close to Earth's surface where vegetation and land processes can impact the carbon cycle. Red tracks indicate particles at higher altitudes that are immune from ground influences.
The application is designed to educate the public and even scientists about how carbon dioxide emissions can be traced. A network of 1,000-foot towers across the United States is equipped with NOAA instruments that measure the carbon dioxide content of parcels of air at single locations.
But where did that gas come from and how did it change along its journey? To find out, scientists rely on a sleuthing technique called "inverse modeling" – measuring gas concentrations at a single geographic point and then using clues from weather and atmospheric models to deduce where it came from. The technique is complex and difficult to explain even to fellow scientists.
Michalak related the technique to cream in a cup of coffee. "Say someone gave you a cup of creamy coffee," Michalak said. "How do you know when that cream was added?" Just as cream is not necessarily mixed perfectly, neither is the carbon dioxide in the atmosphere. If scientists can see the streaks of cream (carbon dioxide) and understand how the coffee (atmosphere) was stirred (weather), they can use those clues to retrace the time and location at which the ingredient was added to the mix.
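The inverse step can be illustrated with a toy linear model. The sketch below is a hedged illustration of the general idea, not the NASA/University of Michigan system: hypothetical surface fluxes x are related to tower measurements y through a transport matrix H supplied by a weather model, and recovering x from y is the "inverse" part. All numbers are made up.

```python
import numpy as np

# Toy illustration of inverse modeling (a sketch only, not the actual model).
# Assume a weather model tells us how strongly each of 3 hypothetical source
# regions contributes to each of 4 tower measurements: y = H @ x + noise.
H = np.array([[0.8, 0.1, 0.1],
              [0.3, 0.6, 0.1],
              [0.1, 0.2, 0.7],
              [0.4, 0.4, 0.2]])   # transport "footprints" (made-up numbers)

true_fluxes = np.array([2.0, 0.5, 1.2])            # unknown CO2 fluxes (made up)
rng = np.random.default_rng(0)
y = H @ true_fluxes + rng.normal(0, 0.05, size=4)  # simulated tower readings

# Invert: find the fluxes that best explain the observations (least squares).
estimated_fluxes, *_ = np.linalg.lstsq(H, y, rcond=None)
print("estimated fluxes:", estimated_fluxes)
```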
The visual result typically used by scientists is a static two-dimensional map of the location of the gas, as averaged over the course of a month. Most carbon scientists know how to interpret the 2D map, but visualizing the 3D changes for non-specialists has proved elusive. Erickson spent 70 hours programming the Google Earth application that makes it easy to navigate through time and watch gas particles snake their way toward the NOAA observation towers. For his work, Erickson was declared one of Google's winners in March 2009.
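For readers curious what delivering data in KML looks like in practice, here is a minimal Python sketch that writes a single time-stamped placemark Google Earth can animate with its time slider. It is an illustration of the file format only, not Erickson's application; the coordinates, altitude, and timestamp are made up.

```python
# Minimal sketch of a time-stamped KML placemark (illustrative values only).
kml = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>CO2 parcel (example)</name>
      <TimeStamp><when>2004-07-01T12:00:00Z</when></TimeStamp>
      <Point>
        <altitudeMode>absolute</altitudeMode>
        <coordinates>-90.08,45.95,500</coordinates><!-- lon,lat,alt (m) -->
      </Point>
    </Placemark>
  </Document>
</kml>
"""

with open("co2_parcel_example.kml", "w") as f:
    f.write(kml)  # open the file in Google Earth and drag the time slider
```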
"Having this visual tool allows us to better explain the scientific process," Michalak said. "It's a much more human way of looking at the science."
The next step, Erickson said, is to adapt the application to fit the needs of the research community. Scientists could use the program to better visualize the output of complex atmospheric models and then improve those models so that they better represent reality.
"Encouraging more people to deliver data in an interactive format is a good trend," Erickson said. "It should help innovation in research by reducing barriers to sharing data."
Adapted from materials provided by NASA.

San Andreas Affected By 2004 Sumatran Quake; Largest Quakes Can Weaken Fault Zones Worldwide


ScienceDaily (Sep. 30, 2009) — U.S. seismologists have found evidence that the massive 2004 earthquake that triggered killer tsunamis throughout the Indian Ocean weakened at least a portion of California's famed San Andreas Fault. The results, which appear this week in the journal Nature, suggest that the Earth's largest earthquakes can weaken fault zones worldwide and may trigger periods of increased global seismic activity.
"An unusually high number of magnitude 8 earthquakes occurred worldwide in 2005 and 2006," said study co-author Fenglin Niu, associate professor of Earth science at Rice University. "There has been speculation that these were somehow triggered by the Sumatran-Andaman earthquake that occurred on Dec. 26, 2004, but this is the first direct evidence that the quake could change fault strength of a fault remotely."
Earthquakes are caused when a fault fails, either because of the buildup of stress or because of the weakening of the fault. The latter is more difficult to measure.
The magnitude 9 earthquake in 2004 occurred beneath the ocean west of Sumatra and was the second-largest quake ever measured by seismograph. The temblor spawned tsunamis as large as 100 feet that killed an estimated 230,000, mostly in Indonesia, Sri Lanka, India and Thailand.
In the new study, Niu and co-authors Taka'aki Taira and Paul Silver, both of the Carnegie Institution of Science in Washington, D.C., and Robert Nadeau of the University of California, Berkeley, examined more than 20 years of seismic records from Parkfield, Calif., which sits astride the San Andreas Fault.
The team zeroed in on a set of repeating microearthquakes that occurred near Parkfield over two decades. Each of these tiny quakes originated in almost exactly the same location. By closely comparing seismic readings from these quakes, the team was able to determine the "fault strength" -- the shear stress level required to cause the fault to slip -- at Parkfield between 1987 and 2008.
The team found that fault strength changed markedly three times during the 20-year period. The authors surmised that the 1992 Landers earthquake, a magnitude 7 quake north of Palm Springs, Calif. -- about 200 miles from Parkfield -- caused the first of these changes. The study found the Landers quake destabilized the fault near Parkfield, causing a series of magnitude 4 quakes and a notable "aseismic" event -- a movement of the fault that played out over several months -- in 1993.
The second change in fault strength occurred in conjunction with a magnitude 6 earthquake at Parkfield in September 2004. The team found another change at Parkfield later that year that could not be accounted for by the September quake alone. Eventually, they were able to narrow the onset of this third shift to a five-day window in late December during which the Sumatran quake occurred.
"The long-range influence of the 2004 Sumatran-Andaman earthquake on this patch of the San Andreas suggests that the quake may have affected other faults, bringing a significant fraction of them closer to failure," said Taira. "This hypothesis appears to be borne out by the unusually high number of large earthquakes that occurred in the three years after the Sumatran-Andaman quake."
The research was supported by the National Science Foundation, the Carnegie Institution of Washington, the University of California, Berkeley, and the U.S. Geological Survey.
Adapted from materials provided by Rice University.

Saturday, July 25, 2009

Auroras In Northern And Southern Hemispheres Are Not Identical


ScienceDaily (July 24, 2009) — Norwegian researchers have shown that the auroras in the Northern and the Southern hemispheres can be totally asymmetric. These findings contradict the common assumption that the two auroras are mirror images of each other.
The study was performed by PhD student Karl Magnus Laundal and professor Nikolai Østgaard at the Institute of Physics and Technology at the University of Bergen.
"The aurora is produced due to collisions between the Earth’s atmosphere and electrically charged particles streaming along the Earth’s geomagnetic field lines. – Since these processes occur above the two hemispheres, both the Northern and the Southern light are created. So far researchers have assumed that these auroras are mirror images of each other, but our findings show that this is not always the case," professor Nikolai Østgaard says.
The researchers at the University of Bergen have used data from the two NASA satellites IMAGE and Polar to examine the Northern and the Southern lights. In the Nature letter they present several possible explanations for the unexpected result.
"The most plausible explanation involves electrical currents along the magnetic field lines. Differences in the solar exposure may lead to currents between the two hemispheres, explaining why the Northern and the Southern light are not identical," PhD student Karl Magnus Laundal says.
In addition to yielding new knowledge about the aurora, the results are important for other researchers studying near-Earth space.
"Our study shows that data from only one hemisphere is not sufficient to infer the conditions in the other hemisphere. This is important because most of our knowledge about the aurora, as well as about processes in the upper atmosphere in the polar regions, is based solely on data from the Northern hemisphere," Østgaard points out.
The work of Østgaard and Laundal has been financed by the Research Council of Norway via the International Polar Year project IPY-ICESTAR.
Journal reference:
Laundal, K.M. and Østgaard, N. Asymmetric auroral intensities in the Earth's Northern and Southern hemispheres. Nature, 2009; 460 (7254): 491-493. DOI: 10.1038/nature08154
Adapted from materials provided by University of Bergen, via AlphaGalileo.

Wednesday, July 22, 2009

California's Channel Islands Hold Evidence Of Clovis-age Comets


ScienceDaily (July 21, 2009) — A 17-member team has found what may be the smoking gun of a much-debated proposal that a cosmic impact about 12,900 years ago ripped through North America and drove multiple species into extinction.
In a paper appearing online ahead of regular publication in the Proceedings of the National Academy of Sciences, University of Oregon archaeologist Douglas J. Kennett and colleagues from nine institutions and three private research companies report the presence of shock-synthesized hexagonal diamonds in 12,900-year-old sediments on the Northern Channel Islands off the southern California coast.
These tiny diamonds and diamond clusters were buried below four meters of sediment. They date to the end of Clovis -- a Paleoindian culture long thought to be North America's first human inhabitants. The nano-sized diamonds were pulled from Arlington Canyon on the island of Santa Rosa, which had once been joined with three other Northern Channel Islands in a landmass known as Santarosae.
The diamonds were found in association with soot, which forms in extremely hot fires, and together with nearby environmental records they suggest associated regional wildfires.
Such soot and diamonds are rare in the geological record. They were previously found in sediment dating to the massive asteroid impact 65 million years ago, in a layer widely known as the K-T Boundary. That thin layer of iridium-and-quartz-rich sediment dates to the transition between the Cretaceous and Tertiary periods, which marks the end of the Mesozoic Era and the beginning of the Cenozoic Era.
"The type of diamond we have found -- Lonsdaleite -- is a shock-synthesized mineral defined by its hexagonal crystalline structure. It forms under very high temperatures and pressures consistent with a cosmic impact," Kennett said. "These diamonds have only been found thus far in meteorites and impact craters on Earth and appear to be the strongest indicator yet of a significant cosmic impact [during Clovis]."
The age of this event also matches the extinction of the pygmy mammoth on the Northern Channel Islands, as well as numerous other North American mammals, including the horse, which Europeans later reintroduced. In all, an estimated 35 mammal and 19 bird genera became extinct near the end of the Pleistocene with some of them occurring very close in time to the proposed cosmic impact, first reported in October 2007 in PNAS.
In the Jan. 2, 2009, issue of the journal Science, a team led by Kennett reported the discovery of billions of nanometer-sized diamonds concentrated in sediments -- weighing from about 10 to 2,700 parts per billion -- in six North American locations.
"This site, this layer with hexagonal diamonds, is also associated with other types of diamonds and with dramatic environmental changes and wildfires," said James Kennett, paleoceanographer and professor emeritus in the Department of Earth Science at the University of California, Santa Barbara.
"There was a major event 12,900 years ago," he said. "It is hard to explain this assemblage of materials without a cosmic impact event and associated extensive wildfires. This hypothesis fits with the abrupt cooling of the atmosphere as shown in the record of ocean drilling of the Santa Barbara Channel. The cooling resulted when dust from the high-pressure, high-temperature, multiple impacts was lofted into the atmosphere, causing a dramatic drop in solar radiation."
The hexagonal diamonds from Arlington Canyon were analyzed at the UO's Lorry I. Lokey Laboratories, a world-class nanotechnology facility built deep in bedrock to allow for sensitive microscopy and other high-tech analyses of materials. The analyses were done in collaboration with FEI, a Hillsboro, Ore., company that distributes the high-resolution Titan microscope used to characterize the hexagonal diamonds in this study.
Transmission and scanning electron microscopes were used in the extensive analyses of the sediment, which contained clusters of Lonsdaleite ranging in size from 20 to 1,800 nanometers. These diamonds were inside or attached to carbon particles found in the sediments.
These findings are inconsistent with the alternative and already hotly debated theory that overhunting by Clovis people led to the rapid extinction of large mammals at the end of the ice age, the research team argues in the PNAS paper. An alternative theory has held that climate change was to blame for these mass extinctions. The cosmic-event theory suggests that rapid climate change at this time was possibly triggered by a series of small and widely dispersed comet strikes across much of North America.
The National Science Foundation provided primary funding for the research. Additional funding was provided by way of Richard A. Bray and Philip H. Knight faculty fellowships of the University of Oregon, respectively, to Kennett and UO colleague Jon M. Erlandson, a co-author and director of the UO's Museum of Natural and Cultural History.
The 17 co-authors on the PNAS paper are Douglas Kennett, Erlandson and Brendan J. Culleton, all of the University of Oregon; James P. Kennett of UC Santa Barbara; Allen West of GeoScience Consulting in Arizona; G. James West of the University of California, Davis; Ted E. Bunch and James H. Wittke, both of Northern Arizona University; Shane S. Que Hee of the University of California, Los Angeles; John R. Johnson of the Santa Barbara Museum of Natural History; Chris Mercer of the Santa Barbara Museum of Natural History and National Institute of Materials Science in Japan; Feng Shen of the FEI Co.; Thomas W. Stafford of Stafford Research Inc. of Colorado; Adrienne Stich and Wendy S. Wolbach, both of DePaul University in Chicago; and James C. Weaver of the University of California, Riverside.
About the University of Oregon
The University of Oregon is a world-class teaching and research institution and Oregon's flagship public university. The UO is a member of the Association of American Universities (AAU), an organization made up of the 62 leading public and private research institutions in the United States and Canada. The UO is one of only two AAU members in the Pacific Northwest.
Adapted from materials provided by University of Oregon.

Wednesday, July 15, 2009

First Remote, Underwater Detection Of Harmful Algae, Toxins


ScienceDaily (July 15, 2009) — Scientists at NOAA's National Centers for Coastal Ocean Science and the Monterey Bay Aquarium Research Institute (MBARI) have successfully conducted the first remote detection of a harmful algal species and its toxin below the ocean's surface. The achievement was recently reported in the June issue of Oceanography.
This achievement represents a significant milestone in NOAA's effort to monitor the type and toxicity of harmful algal blooms (HABs). HABs are considered to be increasing not only in their global distribution, but also in the frequency, duration, and severity of their effects. HABs damage coastal ecosystem health and pose threats to humans as well as marine life. Climate change is expected to exacerbate this trend, since many critical processes that govern HABs dynamics, such as water temperature and ocean circulation, are influenced by climate.
A robotic instrument designed by MBARI, called the Environmental Sample Processor, or 'ESP,' functions as a fully automated analytical laboratory in the sea. It lets researchers collect algal cells and extract both the genetic information required to identify the organism and the toxin needed to assess the risk to humans and wildlife. The ESP then conducts specialized, molecular-based measurements of species and toxin abundance, and transmits the results to the laboratory via radio signals.
"This represents the first autonomous detection of both a HAB species and its toxin by an underwater sensor," notes Greg Doucette, Ph.D., a research oceanographer at NOAA's Center for Coastal Environmental Health and Biomolecular Research laboratory in Charleston, S.C. "It allows us to determine not only the organism causing a bloom, but also the toxicity of the event, which ultimately dictates whether it is a threat to the public and the ecosystem."
For the first demonstration of the ESP's ability to detect HABs and their toxins, Doucette and his MBARI colleague, Chris Scholin, Ph.D., targeted certain members of the algal genus Pseudo-nitzschia and their neurotoxin, domoic acid, in Monterey Bay, Calif.
Pseudo-nitzschia and domoic acid have been a concern in the Monterey Bay area for well over a decade. In 1991, the first U.S. outbreak of domoic acid poisoning was documented in Monterey Bay. This outbreak resulted in the unusual deaths of numerous pelicans and cormorants that ingested sardines and anchovies, which had accumulated the domoic acid by feeding on a bloom of the toxic algae.
In the spring of 1998, a mass mortality of sea lions in and around the Monterey Bay area was attributed to the sea lions' feeding on domoic acid-contaminated anchovies. Since that time, Pseudo-nitzschia and domoic acid have appeared on virtually an annual basis in California coastal waters and are the objects of an intensive statewide monitoring program run by the California Dept. of Public Health. Humans also can be affected by the toxin through consumption of contaminated seafood such as shellfish.
"Our public health monitoring program is one of the many groups that can benefit directly from the ESP technology and ability to provide an early warning of impending bloom activity and toxicity," said Gregg Langlois, director of the state of California's Marine Biotoxin Monitoring Program. "This is critical information for coastal managers and public health officials in mitigating impacts on the coastal ecosystem, since the toxicity of these algae can vary widely from little or no toxicity to highly toxic."
Beyond improving forecasting of HABs, this research will contribute to the rapidly emerging U.S. Integrated Ocean Observing System (IOOS) by adding a new way to make coastal ocean observations.
Adapted from materials provided by National Oceanic and Atmospheric Administration.

Monday, July 13, 2009

Tremors On Southern San Andreas Fault May Mean Increased Earthquake Risk


ScienceDaily (July 13, 2009) — Increases in mysterious underground tremors observed in several active earthquake fault zones around the world could signal a build-up of stress at locked segments of the faults and presumably an increased likelihood of a major quake, according to a new University of California, Berkeley, study.
Seismologist Robert M. Nadeau and graduate student Aurélie Guilhem of UC Berkeley draw these conclusions from a study of tremors along a heavily instrumented segment of the San Andreas Fault near Parkfield, Calif. The research is reported in the July 10 issue of Science.
They found that after the 6.5-magnitude San Simeon quake in 2003 and the 6.0-magnitude Parkfield quake in 2004, underground stress increased at the end of a locked segment of the San Andreas Fault near Cholame, Calif., at the same time as tremors became more frequent. The tremors have continued to this day at a rate significantly higher than the rate before the two quakes.
The researchers conclude that the increased rate of tremors may indicate that stress is accumulating more rapidly than in the past along this segment of the San Andreas Fault, which is at risk of breaking like it did in 1857 to produce the great 7.8 magnitude Fort Tejon earthquake. Strong quakes have also occurred just to the northwest along the Parkfield segment of the San Andreas about every 20 to 30 years.
"We've shown that earthquakes can stimulate tremors next to a locked zone, but we don't yet have evidence that this tells us anything about future quakes," Nadeau said. "But if earthquakes trigger tremors, the pressure that stimulates tremors may also stimulate earthquakes."
While earthquakes are brief events originating, typically, no deeper than 15 kilometers (10 miles) underground in California, tremors are an ongoing, low-level rumbling from perhaps 15 to 30 kilometers (10-20 miles) below the surface. They are common near volcanoes as a result of underground fluid movement, but were a surprise when discovered in 2002 at a subduction zone in Japan, a region where a piece of ocean floor is sliding under a continent.
Tremors were subsequently detected at the Cascadia subduction zone in Washington, Oregon and British Columbia, where several Pacific Ocean plates dive under the North American continental plate. In 2005, Nadeau identified mysterious "noise" detected by the Parkfield borehole seismometers as tremor activity, and has focused on the tremors ever since. Unlike the Japanese and Cascadia tremor sites, however, the Parkfield area is a strike/slip fault, where the Pacific plate is moving horizontally against the North American plate.
"The Parkfield tremors are smaller versions of the Cascadia and Japanese tremors," Nadeau said. "Most last between three and 21 minutes, while some Cascadia tremors go on for days."
Because in nearly all known instances the tremors originate from the edge of a locked zone - a segment of a fault that hasn't moved in years and is at high risk of a major earthquake - seismologists have thought that increases in their activity may forewarn of stress build-up just before an earthquake.
The new report strengthens that association, Nadeau said.
For the new study, Nadeau and Guilhem pinpointed the location of nearly 2,200 tremors recorded between 2001 and 2009 by borehole seismometers implanted along the San Andreas Fault as part of UC Berkeley's High-Resolution Seismic Network. During this period, two nearby earthquakes occurred: one in San Simeon, 60 kilometers from Parkfield, on Dec. 22, 2003, and one in Parkfield on the San Andreas Fault on Sept. 28, 2004.
Before the San Simeon quake, tremor activity was low beneath the Parkfield and Cholame segments of the San Andreas Fault, but it doubled in frequency afterward and was six times more frequent after the Parkfield quake. Most of the activity occurred along a 25-kilometer (16-mile) segment of the San Andreas Fault south of Parkfield, around the town of Cholame. Fewer than 10 percent of the tremors occurred at an equal distance above Parkfield, near Monarch Peak. While Cholame is at the northern end of a long-locked and hazardous segment of the San Andreas Fault, Monarch Peak is not. However, Nadeau noted, Monarch Peak is an area of relative complexity on the San Andreas Fault and also ruptured in 1857 in the Fort Tejon 7.8 earthquake.
The tremor activity remains about twice as high today as before the San Simeon quake, while periodic peaks of activity have emerged that started to repeat about every 50 days and are now repeating about every 100-110 days.
"What's surprising is that the activity has not gone down to its old level," Nadeau said. The continued activity is worrisome because of the history of major quakes along this segment of the fault, and the long-ago Fort Tejon quake, which ruptured southward from Monarch Peak along 350 kilometers (220 miles) of the San Andreas Fault.
A flurry of pre-tremors was detected a few days before the Parkfield quake, which makes Nadeau hopeful of seeing similar tremors preceding future quakes.
He noted that the source of tremors is still somewhat of a mystery. Some scientists think fluids moving underground generate the tremors, just as movement of underground magma, water and gas causes volcanic tremors. Nadeau leans more toward an alternative theory: that non-volcanic tremors are generated in a deep region of hot, soft rock that flows somewhat like Silly Putty and, except for a few hard rocks embedded in it like nuts in peanut brittle, normally moves without generating earthquakes. The fracturing of these brittle inclusions, however, may generate swarms of many small quakes that combine into a faint rumble.
"If tremors are composed of a lot of little earthquakes, each should have a primary and secondary wave just like large quakes," but they would overlap and produce a rumble, said Guilhem.
The stimulation of tremors by shear (tearing) stress rather than by compressional (opening and closing) stress is more consistent with deformation in the fault zone than with underground fluid movement, Nadeau said. The researchers' mapping of the underground tremors also shows that the tremors are not restricted to the plane of the fault, suggesting that faults spread out as they dive into the deeper crust.
Whatever their cause, tremors "are not relieving a lot of stress or making the fault less hazardous, they just indicate changes in stress next to locked faults," said Nadeau.
Seismologists around the world are searching for tremors along other fault systems, Guilhem noted, although tremors can be hard to detect because of noise from oceans as well as from civilization. Brief tremor activity has been observed on a few faults, triggered by huge quakes far away, and these may be areas to focus on. Tremors were triggered on Northern California's Calaveras Fault by Alaska's Denali quake of 2002, Nadeau said.
The work is supported by the U.S. Geological Survey and the National Science Foundation.
Adapted from materials provided by University of California - Berkeley.

Saturday, July 11, 2009

New Envisat Images Highlight Dramatic Retreat Of Aral Sea's Shoreline


ScienceDaily (July 12, 2009) — New Envisat images highlight the dramatic retreat of the Aral Sea’s shoreline from 2006 to 2009. The Aral Sea was once the world’s fourth-largest inland body of water, but it has been steadily shrinking over the past 50 years since the rivers that fed it were diverted for irrigation projects.
By the end of the 1980s, it had split into the Small Aral Sea (north), located in Kazakhstan, and the horse-shoe shaped Large Aral Sea (south), shared by Kazakhstan and Uzbekistan.
By 2000, the Large Aral Sea had split into two – an eastern and western lobe. As visible in the images, the eastern lobe retreated substantially between 2006 and 2009. It appears to have lost about 80% of its water since the 2006 acquisition, at which time the eastern lobe had a length of about 150 km and a width of about 70 km.
The sea’s entire southern section is expected to dry out completely by 2020, but efforts are underway to save the northern part.
The Kok-Aral dike, a joint project of the World Bank and the Kazakhstan government, was constructed between the northern and southern sections of the sea to prevent water from flowing into the southern section. Since its completion in 2005, the water level in the northern section has risen by an average of 4 m.
As the Aral Sea evaporated, it left behind a 40,000 sq km zone of dry, white salt terrain now called the Aral Karakum Desert. Each year violent sandstorms pick up at least 150,000 tonnes of salt and sand from the Aral Karakum and transport it across hundreds of kilometres, causing severe health problems for the local population and making regional winters colder and summers hotter. In an attempt to mitigate these effects, vegetation that thrives in dry, saline conditions is being planted in the former seabed.
In 2007, the Kazakhstan government secured another loan from the World Bank to implement the second stage of the project aimed at reversing this man-made environmental disaster, a stage that includes the building of a second dam.
Envisat acquired these images on 1 July 2006 and 6 July 2009 with its Medium Resolution Imaging Spectrometer (MERIS) instrument while working in Full Resolution Mode to provide a spatial resolution of 300 m.
Adapted from materials provided by European Space Agency.

Wednesday, July 8, 2009

Beyond Carbon Dioxide: Growing Importance Of Hydrofluorocarbons (HFCs) In Climate Warming


ScienceDaily (July 9, 2009) — Some of the substances that are helping to avert the destruction of the ozone layer could increasingly contribute to climate warming, according to scientists from NOAA's Earth System Research Laboratory and their colleagues in a new study in the journal Proceedings of the National Academy of Sciences.
The authors took a fresh look at how the global use of hydrofluorocarbons (HFCs) is expected to grow in coming decades. Using updated usage estimates and looking farther ahead than past projections (to the year 2050), they found that HFCs—especially from developing countries—will become an increasingly important factor in future climate warming.
"HFCs are good for protecting the ozone layer, but they are not climate friendly," said David W. Fahey, a scientist at NOAA and second author of the new study. "Our research shows that their effect on climate could become significantly larger than we expected, if we continue along a business-as-usual path."
HFCs currently make a small contribution to climate change (less than 1 percent) compared with the contribution of carbon dioxide (CO2) emissions. The authors show that by 2050 the HFC contribution could rise to 7 to 12 percent of what CO2 contributes. And if international efforts succeed in stabilizing CO2 emissions, the relative climate contribution from HFCs would increase further.
HFCs, which do not contain ozone-destroying chlorine or bromine atoms, are used as substitutes for ozone-depleting compounds such as chlorofluorocarbons (CFCs) in such uses as refrigeration, air conditioning, and the production of insulating foams. The Montreal Protocol, a 1987 international agreement, has gradually phased out the use of CFCs and other ozone-depleting substances, leading to the development of long-term replacements such as HFCs.
Though the HFCs do not deplete the ozone layer, they are potent greenhouse gases. Molecule for molecule, all HFCs are more potent warming agents than CO2 and some are thousands of times more effective. HFCs are in the "basket of gases" regulated under the 1997 Kyoto Protocol, an international treaty to reduce emissions of greenhouse gases.
The new study factored in the expected growth in demand for air conditioning, refrigerants, and other technology in developed and developing countries. The Montreal Protocol's gradual phase-out of the consumption of ozone-depleting substances in developing countries after 2012, along with the complete phase-out in developed countries in 2020, is another factor that will lead to increased usage of HFCs and other alternatives.
Decision-makers in Europe and the United States have begun to consider possible steps to limit the potential climate consequences of HFCs. The PNAS study examined several hypothetical scenarios to mitigate HFC consumption. For example, a global consumption limit followed by a 4 percent annual reduction would cause HFC-induced climate forcing to peak in the year 2040 and then begin to decrease before the year 2050.
"While unrestrained growth of HFC use could lead to significant climate implications by 2050, we have shown some examples of global limits that can effectively reduce the HFCs' impact," said John S. Daniel, a NOAA coauthor of the study.
The authors of the PNAS study are Guus J.M Velders of the Netherlands Environmental Assessment Agency, Fahey and Daniel of NOAA's Earth System Research Laboratory, Mack McFarland of DuPont Fluoroproducts, and Stephen O. Andersen of the U.S. Environmental Protection Agency.
Journal reference:
Velders et al. The large contribution of projected HFC emissions to future climate forcing. Proceedings of the National Academy of Sciences, June 22, 2009
Adapted from materials provided by National Oceanic and Atmospheric Administration.

Ice Volume Of Switzerland’s Glaciers Calculated


ScienceDaily (July 9, 2009) — Swiss glaciers have lost a lot of ice in recent years due to increased melting. As temperatures climb, so do fears that the glaciers could one day disappear altogether. Until now, however, the ice volume in the Swiss Alps, and how it has changed in recent years, could only be roughly estimated.
A team of scientists headed by Martin Funk, ETH-Professor at the Laboratory of Hydraulics, Hydrology and Glaciology (VAW) at ETH Zurich, however, has now developed a novel procedure for determining the ice volume of a glacier. Their results are presented in the current issue of Global and Planetary Change.
The researchers developed the new method based on the law of mass conservation, which states that the surface mass balance has to be balanced by the ice flow and the change in ice thickness. This allows the ice thickness distribution of a glacier to be inferred from its surface topography by estimating the mass balance distribution. "The calculation of the current ice volume is the most important indicator in predicting future glacier changes," explains Martin Funk.
74 km³ of glacier ice
The scientists applied the procedure to the 59 Swiss glaciers larger than three square kilometers. For the remaining 1,400 glaciers, the ice volume was estimated using an empirical area-volume relationship derived from the newly generated data set. The total glacier ice volume in 1999 was estimated at 74 km³, with an accuracy of 9 km³. This means the total volume of all the glaciers in Switzerland is smaller than that of Lake Geneva, which has a water volume of 89 km³. With a glaciated area of 1,063 km², the Swiss glaciers have an average ice thickness of 70 meters.
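As a quick consistency check on the numbers above, and to illustrate the kind of empirical area-volume scaling used for the many small glaciers, here is a short sketch. The scaling coefficients in the second part are hypothetical placeholders for illustration, not the values calibrated in the ETH study.

```python
# Average ice thickness implied by the reported totals (1999 inventory).
total_volume_km3 = 74.0      # from the study
glaciated_area_km2 = 1063.0  # from the study
mean_thickness_m = total_volume_km3 / glaciated_area_km2 * 1000.0
print(f"mean thickness ~ {mean_thickness_m:.0f} m")   # ~70 m, as reported

# Generic power-law area-volume scaling, V = c * A**gamma, of the kind used
# for small glaciers. The coefficients here are made-up placeholders, NOT the
# ones derived from the new Swiss data set.
def glacier_volume_km3(area_km2, c=0.03, gamma=1.36):
    return c * area_km2 ** gamma

print(f"example: a 2 km2 glacier -> ~{glacier_volume_km3(2.0):.3f} km3 of ice")
```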
The scientists also discovered that the 59 larger glaciers account for almost 88% of the ice volume, and that about 24% is stored in the Aletsch region alone (the Oberaletschgletscher, Mittelaletschgletscher and Grosser Aletschgletscher). The area of the Grosser Aletschgletscher is approximately the same as the total area of all the Swiss glaciers smaller than one square kilometer. However, those small glaciers have an overall ice volume that is twenty times smaller than that of the Grosser Aletschgletscher. "This just goes to show that the large glaciers carry the most weight in determining a region’s ice volume," explains Funk.
Volume decreased by 12 percent
The ETH Zurich study also revealed how the ice volume has changed since 1999. Over the last decade – the warmest in 150 years – Switzerland’s glaciers have lost 9 km³ of ice (-12%), including 2.6 km³ (-3.5%) in the record-breaking summer of 2003 alone. These figures are all the more alarming as the climate continues to warm and temperatures in the Swiss Alps are expected to increase by 1.8 degrees in winter and 2.7 degrees in summer by 2050.
Adapted from materials provided by ETH Zurich.

Tuesday, July 7, 2009

Close Relationship Between Past Warming And Sea-level Rise


ScienceDaily (July 7, 2009) — A team from the National Oceanography Centre, Southampton (NOCS), along with colleagues from Tübingen (Germany) and Bristol, presents a novel continuous reconstruction of sea-level fluctuations over the last 520 thousand years. Comparison of this record with data on global climate and carbon dioxide (CO2) levels from Antarctic ice cores suggests that even stabilisation at today's CO2 levels may commit us to sea-level rise over the next couple of millennia, to a level much higher than the long-term projections from the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC).
Little is known about the total amount of possible sea-level rise in equilibrium with a given amount of global warming. This is because the melting of ice sheets is slow, even when temperature rises rapidly. As a consequence, current predictions of sea-level rise for the next century consider only the amount of ice sheet melt that will occur until that time. The total amount of ice sheet melting that will occur over millennia, given the current climate trends, remains poorly understood.
The new record reveals a systematic equilibrium relationship between global temperature and CO2 concentrations and sea-level changes over the last five glacial cycles. Projecting this relationship to today's CO2 concentrations results in a sea level 25 (±5) metres above the present. This is in close agreement with independent sea-level data from the Middle Pliocene epoch, 3-3.5 million years ago, when atmospheric CO2 concentrations were similar to the present-day value. This suggests that the identified relationship accurately records the fundamental long-term equilibrium behaviour of the climate system over the last 3.5 million years.
Lead author Professor Eelco Rohling of the University of Southampton's School of Ocean and Earth Science based at NOCS, said: "Let's assume that our observed natural relationship between CO2 and temperature, and sea level, offers a reasonable 'model' for a future with sustained global warming. Then our result gives a statistically sound expectation of a potential total long-term sea-level rise. Even if we would curb all CO2 emissions today, and stabilise at the modern level (387 parts per million by volume), then our natural relationship suggests that sea level would continue to rise to about 25 m above the present. That is, it would rise to a level similar to that measured for the Middle Pliocene."
Project partners Professor Michal Kucera (University of Tübingen) and Dr Mark Siddall (University of Bristol), add: "We emphasise that such equilibration of sea level would take several thousands of years. But one still has to worry about the large difference between the inferred high equilibrium sea level and the level where sea level actually stands today. Recent geological history shows that times with similarly strong disequilibria commonly saw pulses of very rapid sea-level adjustment, at rates of 1-2 metres per century or higher."
The new study's projection of long-term sea-level change, based on the natural relationship of the last 0.5 to 3.5 million years, differs considerably from the IPCC's model-based long-term projection of +7 m. The discrepancy cannot be easily explained, and new work is needed to ensure that the 'gap is closed'.
The observed relationships from the recent geological past can form a test-bed or reality-check for models, to help them achieve improved future projections.
The project was funded by the Natural Environment Research Council (UK) and the Deutsche Forschungs-Gemeinschaft (Germany).
The authors are Eelco Rohling (NOCS), Katharine Grant (NOCS), Mike Bolshaw (NOCS), Andrew Roberts (NOCS), Mark Siddall (University of Bristol), Christoph Hemleben (University of Tübingen) and Michal Kucera (University of Tübingen).
Journal reference:
Rohling et al. Antarctic temperature and global sea level closely coupled over the past five glacial cycles. Nature Geoscience, June 21, 2009; DOI: 10.1038/ngeo557
Adapted from materials provided by National Oceanography Centre, Southampton (UK).

Monday, July 6, 2009

Natural Deep Earth Pump Fuels Earthquakes And Ore


ScienceDaily (July 6, 2009) — For the first time scientists have discovered the presence of a natural deep earth pump that is a crucial element in the formation of ore deposits and earthquakes.
The process, called creep cavitation, involves fluid being pumped through pores in deformed rock in mid-crustal shear zones, which lie approximately 15 km below the Earth's surface.
The fluid transfer through the middle crust also plays a key role in tectonic plate movement and mantle degassing.
The discovery was made by examining one-millimetre-sized cubes of exposed rock from Alice Springs that was deformed around 320 million years ago during a period of natural mountain formation.
The evidence is described in a paper published in the latest edition of Nature entitled Creep cavitation can establish a dynamic granular fluid pump in ductile shear zones.
One of the paper's authors, CSIRO Exploration and Mining scientist Dr Rob Hough, said that this was the first direct observation of fluids moving through a mid-crustal shear zone.
"We are seeing the direct evidence for one of the processes that got ore-forming fluids moving up from the mantle to the shallow crust to form the ore deposits we mine today. It is also one of the mechanisms that can lead to earthquakes in the middle crust," Dr Hough said.
Research leader Dr Florian Fusseis, from the University of Western Australia, said that the discovery could provide valuable information in understanding how earthquakes are formed.
"While we understand reasonably well why earthquakes happen in general, due to stress build-up caused by motions of tectonic plates, the triggering of earthquakes is much more complex," Dr Fusseis said.
"To understand the 'where' and 'when' of earthquakes, the 'how' needs to be understood first. We know that earthquakes nucleate by failure on a small part of a shear zone."
Dr Fusseis said that while their sample did not record an earthquake it gave them an insight into the structures that could be very small and localized precursors of seismic failure planes.
The discovery was made possible through the use of high-resolution synchrotron X-ray tomography, scanning electron microscope observations at the nanoscale, and advanced visualisation using iVEC in Western Australia.
The authors of the paper propose that the fluid movement, described as a granular fluid pump, is a self-sustaining process in which pores open and close, allowing fluid and gas to be pumped out.
The paper was written by five authors from CSIRO Exploration and Mining (working through the Minerals Down Under National Research Flagship), the School of Earth & Environmental Sciences at the University of Western Australia, and the Advanced Photon Source at Argonne National Laboratory, USA.
Three of the authors are with CSIRO: Prof Klaus Regenauer-Lieb, who shares his time between CSIRO and the University of Western Australia and is also a WA Premier's Fellow; Dr Jie Liu; and Dr Rob Hough.
The experiments at the Advanced Photon Source in Chicago were funded in part by the Australian Synchrotron Research Program.
Adapted from materials provided by CSIRO Australia.

Sunday, July 5, 2009

World's Largest Aerosol Sensing Network Has Leafy Origins


ScienceDaily (July 6, 2009) — Twenty years ago, Brent Holben was part of a NASA team studying vegetation from space. In an unlikely career twist, his research morphed into the study of a critical, if overlooked, subplot in the story of climate change.
From his office at NASA's Goddard Space Flight Center in Greenbelt, Md., Holben helps manage the world's largest network of ground-based sensors for aerosols -- tiny specks of solids and liquids that waft about in the atmosphere. These particles come from both human and natural sources and can be observed everywhere in the world.
Scientists know that some of them play an outsized role in Earth's climate. And much of that knowledge has come from the Aerosol Robotic Network, or AERONET, the collaborative, international sensor network which Holben leads.
"Aerosols play a key role in climate, and pretty much everybody who studies aerosols uses data from AERONET," said William Lau, director of the Atmospheric Sciences Division at Goddard. "Without AERONET, our understanding of the climate system simply wouldn't be where it is today."
Trouble Seeing the Forest and the Trees
The origins of AERONET date to the late 1980s, when Goddard researchers were attempting — and struggling — to study vegetation using satellites. "The atmosphere kept getting in the way," Holben said. Satellites couldn't properly sense the vegetation through all the dust, minerals, soot, salt, and other atmospheric aerosols obscuring the view. The problem prompted Holben to put his vegetation research aside "temporarily" to tackle aerosols.
In 1992, he planned a field campaign to the Amazon, where farmers were burning swaths of rainforest to clear the land. The heavy emissions from the fires made it an ideal environment to study aerosol particles. During that project, Holben began to develop a method for studying aerosols that became a template for future campaigns.
He used lamp-sized instruments called sun-sky photometers to measure the intensity of light filtering through a given column of atmosphere. Aerosol particles scatter or absorb portions of the incoming light, allowing scientists to deduce their size, shape, and chemical composition. Often the instruments are installed on the roofs of universities, but solar-powered versions of the devices can also be deployed in remote corners of the world, far off the grid.
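The basic retrieval behind a sun photometer can be sketched with the Beer-Lambert law: the direct-sun signal falls off exponentially with the optical depth of the atmosphere along the slant path. The example below is a simplified, single-wavelength illustration with made-up numbers; AERONET's operational processing additionally accounts for calibration, gas absorption, and other terms.

```python
import math

# Simplified sun-photometer retrieval at one wavelength (illustrative numbers).
# Beer-Lambert: V = V0 * exp(-tau_total * m), where m is the relative air mass.
V0 = 1.000            # hypothetical calibration signal at the top of the atmosphere
V = 0.780             # hypothetical measured direct-sun signal
solar_zenith_deg = 40.0
m = 1.0 / math.cos(math.radians(solar_zenith_deg))   # simple plane-parallel air mass

tau_total = -math.log(V / V0) / m     # total optical depth along the vertical
tau_rayleigh = 0.05                   # assumed molecular-scattering contribution
tau_aerosol = tau_total - tau_rayleigh
print(f"aerosol optical depth ~ {tau_aerosol:.3f}")
```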
Intriguing results began to emerge from the Amazon campaign as well as others in Africa, Canada, and Hawaii. Though aerosols generally reside in the atmosphere for just a few weeks, the data from the Amazon showed that heavy fires could increase pollution levels dramatically for as long as two months after the burning season ended.
A Time to Plant, A Time to Reap
The timing of Holben's foray into aerosol research turned out to be impeccable. Around the time he was deploying photometers in the Amazon, the volcanic eruption of Mount Pinatubo in the Philippines flooded the atmosphere with sulfate aerosols. The burst blocked some solar radiation from reaching Earth's surface and caused global temperatures to drop by 0.5 °C (0.9 °F) for a few years. The eruption underscored the profound impact sulfate aerosols could have on climate. It also reminded researchers how poorly they understood many other types of aerosols.
Deploying more photometers was a logical and relatively low-cost way to start filling the gaps in knowledge. Holben and colleagues slowly set up an array of sensors in the United States, while forging collaborations for similar networks in France, Brazil, Spain, Canada, and Japan. Soon, Holben and his collaborators realized that they had created a global network. In 1998, he described the network's potential in an article in Remote Sensing of Environment, laying out methods of calibrating the sensors and guidelines for collecting and interpreting data. With that paper, AERONET was officially born.
Today AERONET consists of approximately 400 sites in 50 countries on all seven continents. There are AERONET stations on remote sand dunes in Mali, on the ice sheet at the South Pole, and on the tiny island nation of Nauru in the South Pacific.
An Ever-Wider Net
By providing accurate measurements from the ground, AERONET has emerged as the best tool to validate the accuracy of new satellite instruments. For example, scientists have relied upon AERONET to reconcile differences between aerosol measurements from the Moderate-Resolution Imaging Spectroradiometer (MODIS) and the Multiangle Imaging SpectroRadiometer (MISR), two instruments on NASA's Terra satellite.
"Without AERONET, we'd have no baseline for comparison," said Michael Mishchenko, a remote sensing expert at NASA's Goddard Institute for Space Studies in New York City and project scientist for NASA's upcoming Glory mission. Glory will rely on AERONET to validate the Aerosol Polarimetry Sensor, an innovative instrument that will distinguish between different types of aerosols from space.
Though developed nations are dense with AERONET stations, coverage in many developing areas remains sparse. That's a problem because aerosols don't recognize borders and they aren't limited to land masses.
Gaps in coverage can lead to gaps in understanding, said Venkatachalam Ramaswamy, director of the National Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory and a professor of geosciences at Princeton University, N.J. Compared to other factors that affect climate — such as the output of the Sun or greenhouse gases — aerosols are considered the least well understood.
So Holben and colleagues are working to expand AERONET and continue filling in the gaps. Zhanqing Li, an atmospheric scientist at the University of Maryland, College Park, Md., is leading an international field campaign in China, where aerosol loading is exceptionally high. The scientists are deploying AERONET photometers and other instruments that gather information about the impact of aerosols on the region's climate, especially on the dynamics of the Asian monsoon.
In India, AERONET-affiliated researchers are deploying sensors along the flight track of NASA's CALIPSO satellite, which uses light detection and ranging (LIDAR) to measure aerosols. AERONET's open-source approach to data collection and analysis has also aided its expansion. All data is relayed through weather satellites to a centralized database at Goddard, where it quickly becomes available on the Internet.
"It's always been important to me that AERONET data be freely available," Holben said. "The taxpayers fund this project, and they deserve to know what we're finding."
"Scientists are typically protective of their data, so Holben's insistence on data sharing was a bit avant-garde," said Joel Schafer, a Goddard AERONET scientist. The strategy has paid off. The 1998 study Holben used to introduce AERONET recently passed an impressive academic milestone: it has been cited more than 1,000 times, making it one of the most referenced papers in contemporary earth science. With that accomplishment under his belt, perhaps Holben will have the time to turn back to that old vegetation research.
Adapted from materials provided by NASA/Goddard Space Flight Center. Original article written by Adam Voiland.

Desert Dust Alters Ecology Of Colorado Alpine Meadows

ScienceDaily (July 5, 2009) — Accelerated snowmelt--precipitated by desert dust blowing into the mountains--changes how alpine plants respond to seasonal climate cues that regulate their life cycles, according to results of a new study reported this week in the journal Proceedings of the National Academy of Sciences (PNAS). These results indicate that global warming may have a greater influence on plants' annual growth cycles than previously thought.
Current mountain dust levels are five times greater than they were before the mid-19th century, due in large part to increased human activity in deserts.
"Human use of desert landscapes is linked to the life cycles of mountain plants, and changes the environmental cues that determine when alpine meadows will be in bloom, possibly increasing plants' sensitivity to global warming," said Jay Fein, program director in the National Science Foundation (NSF)'s Division of Atmospheric Sciences, which funded the research in part.
This year, 12 dust storms have painted the mountain snowpack red and advanced the retreat of snow cover, likely by more than a month across Colorado.
"Desert dust is synchronizing plant growth and flowering across the alpine zone," said Heidi Steltzer, a Colorado State University scientist who led the study. "Synchronized growth was unexpected, and may have adverse effects on plants, water quality and wildlife."
"It's striking how different the landscape looks as result of this desert-and-mountain interaction," said Chris Landry, director of the Center for Snow and Avalanche Studies (CSAS) in Silverton, Colo., who, along with Tom Painter, director of the Snow Optics Laboratory at the University of Utah, contributed to the study.
"Visitors to the mountains arriving in late June will see little remaining snow," said Landry, "even though snow cover was extensive and deep in April. The snow that remains will be barely distinguishable from the surrounding soils.
Earlier snowmelt by desert dust, said Painter, "depletes the natural water reservoirs of mountain snowpacks and in turn affects the delivery of water to urban and agricultural areas."
With climate change, warming and drying of the desert southwest are likely to result in even greater dust accumulation in the mountains.
In an alpine basin in the San Juan Mountains, the researchers simulated dust effects on snowmelt in experimental plots. They measured the effects of dust-accelerated snowmelt on the life cycles of alpine plants.
The timing of snowmelt signals to mountain plants that it's time to start growing and flowering. When dust causes early snowmelt, plant growth does not necessarily begin soon after the snow is gone.
Instead, plants delay their life cycle until air temperatures have warmed consistently above freezing.
"Climate warming could therefore have a great effect on the timing of growth and flowering," said Steltzer.
Competition for water and nutrient resources among plants should increase, leading to the loss of less competitive species. Delayed plant growth could increase nutrient losses, decreasing water quality.
Similarity in flowering times and plant growth will result in abundant resources for wildlife for a short time rather than staggered resources over the whole summer, Steltzer believes.
"With increasing dust deposition from drying and warming in the deserts," she said, "the composition of alpine meadows could change as some species increase in abundance, while others are lost, possibly forever."
Adapted from materials provided by National Science Foundation.

Thursday, July 2, 2009

New Type Of El Nino Could Mean More Hurricanes Make Landfall


ScienceDaily (July 3, 2009) — El Niño years typically result in fewer hurricanes forming in the Atlantic Ocean. But a new study suggests that the form of El Niño may be changing, potentially causing not only a greater number of hurricanes than in average years, but also a greater chance of hurricanes making landfall, according to climatologists at the Georgia Institute of Technology. The study appears in the July 3, 2009, edition of the journal Science.
"Normally, El Niño results in diminished hurricanes in the Atlantic, but this new type is resulting in a greater number of hurricanes with greater frequency and more potential to make landfall," said Peter Webster, professor at Georgia Tech's School of Earth and Atmospheric Sciences.
That's because this new type of El Niño, known as El Niño Modoki (from the Japanese meaning "similar, but different"), forms in the Central Pacific, rather than the Eastern Pacific as the typical El Niño event does. Warming in the Central Pacific is associated with a higher storm frequency and a greater potential for making landfall along the Gulf coast and the coast of Central America.
Even though the oceanic circulation pattern of warm water known as El Niño forms in the Pacific, it affects the circulation patterns across the globe, changing the number of hurricanes in the Atlantic. This regular type of El Niño (from the Spanish meaning "little boy" or "Christ child") is more difficult to forecast, with predictions of the December circulation pattern not coming until May. At first glance, that may seem like plenty of time. However, the summer before El Niño occurs, the storm patterns change, meaning that predictions of El Niño come only one month before the start of hurricane season in June. But El Niño Modoki follows a different prediction pattern.
"This new type of El Niño is more predictable," said Webster. "We're not sure why, but this could mean that we get greater warning of hurricanes, probably by a number of months."
As to why the form of El Niño is changing to El Niño Modoki, that's not entirely clear yet, said Webster.
"This could be part of a natural oscillation of El Niño," he said. "Or it could be El Niño's response to a warming atmosphere. There are hints that the trade winds of the Pacific have become weaker with time and this may lead to the warming occurring further to the west. We need more data before we know for sure."
In the study, Webster, along with Earth and Atmospheric Sciences Chair Judy Curry and research scientist Hye-Mi Kim used satellite data along with historical tropical storm records and climate models.
The research team is currently looking at La Niña, the cooling of the surface waters in the Eastern and Central Pacific.
"In the past, La Nina has been associated with a greater than average number of North Atlantic hurricanes and La Nina seems to be changing its structure as well," said Webster. "We're vitally interested in understanding why El Niño-La Niña has changed. To determine this we need to run a series of numerical experiments with climate models."
Adapted from materials provided by Georgia Institute of Technology, via EurekAlert!, a service of AAAS.

Sea Ice At Lowest Level In 800 Years Near Greenland


ScienceDaily (July 2, 2009) — New research, which reconstructs the extent of sea ice between Greenland and Svalbard from the 13th century to the present, indicates that there has never been so little sea ice as there is now. The research results, from the Niels Bohr Institute among others, are published in the scientific journal Climate Dynamics.
There are, of course, neither satellite images nor instrumental records of the climate going all the way back to the 13th century. But nature keeps its own 'archive' of the climate in ice cores and in the annual growth rings of trees, and humans have recorded a great many things over the years, such as observations in ships' logbooks and in harbour records. Piece all of this information together and you get a picture of how much sea ice there has been through time.
Modern research and historic records
"We have combined information about the climate found in ice cores from an ice cap on Svalbard and from the annual growth rings of trees in Finland and this gave us a curve of the past climate" explains Aslak Grinsted, geophysicist with the Centre for Ice and Climate at the Niels Bohr Institute at the University of Copenhagen.
In order to determine how much sea ice there has been, the researchers turned to the logbooks that whalers and fishermen kept of their expeditions to the edge of the sea ice. The ship logbooks are very precise, go all the way back to the 16th century, and record the geographical positions at which the ice was found. Another source of information about the ice is records from harbours in Iceland, where the severity of the winters has been recorded since the end of the 18th century.
By combining the curve of the climate with the actual historical records of the distribution of the ice, researchers have been able to reconstruct the extent of the sea ice all the way back to the 13th century. Even though the 13th century was a warm period, the calculations show that there has never been so little sea ice as in the 20th century.
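In essence, the proxy-based climate curve is calibrated against the years for which logbooks and harbour records give direct ice observations, and the fitted relationship is then used to estimate ice extent in the centuries without direct observations. Here is a minimal sketch of that idea, using invented numbers and a simple least-squares fit rather than the authors' actual statistical method:

```python
# Toy calibration of a proxy climate index against observed sea-ice extent.
# The proxy values and ice extents below are invented for illustration; the
# real study uses ice-core and tree-ring records plus historical ice-edge data.
import numpy as np

# Years with both a proxy value and a direct (logbook/harbour) ice observation
proxy_obs = np.array([1.2, 0.8, 0.3, -0.4, -1.1, -1.6])   # proxy climate index
ice_obs   = np.array([9.0, 8.1, 7.5, 6.2, 5.0, 4.3])      # ice extent, 10^5 km^2

# Least-squares linear fit: ice = a * proxy + b
a, b = np.polyfit(proxy_obs, ice_obs, 1)

# Apply the fitted relationship to proxy values from centuries with no direct records
proxy_early = np.array([-0.9, -0.5, 0.1])
reconstructed = a * proxy_early + b
print("reconstructed ice extent (10^5 km^2):", np.round(reconstructed, 1))
```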
In the middle of the 17th century there was also a sharp decline in sea ice, but it lasted only a very brief period. The greatest cover of sea ice was in a period around 1700-1800, which is also called the 'Little Ice Age'.
"There was a sharp change in the ice cover at the start of the 20th century," explains Aslak Grinsted. He explains, that the ice shrank by 300.000 km2 in the space of ten years from 1910-1920. So you can see that there have been sudden changes throughout time, but here during the last few years we have had some record years with very little ice extent.
"We see that the sea ice is shrinking to a level which has not been seen in more than 800 years", concludes Aslak Grinsted.
Journal reference:
Macias Fauria et al. Unprecedented low twentieth century winter sea ice extent in the Western Nordic Seas since A.D. 1200. Climate Dynamics, 2009; DOI: 10.1007/s00382-009-0610-z
Adapted from materials provided by University of Copenhagen.

Thunderhead Accelerator


Besides being host to stunning lightning displays, thunderclouds also emit gamma rays, although researchers aren't completely sure why. Last fall, detectors installed on a mountaintop in Japan captured the first simultaneous observations of this radiation along with the high-speed electrons thought to be their source. The results, detailed in the 26 June Physical Review Letters, support the prevailing model of thundercloud accelerators generating "runaway" electrons, which may sometimes initiate lightning.
Since 1994, satellites, aircraft, and ground-based detectors have picked up gamma ray flashes from thunderstorms. They can last from a few milliseconds to minutes, but only the shortest ones appear to be associated directly with lightning strikes.
Experts believe the gamma rays come from electrons accelerated to near the speed of light in the strong electric fields of thunderclouds. When one of these fast electrons collides with an air molecule it slows down, causing it to emit a gamma ray photon as so-called bremsstrahlung radiation. To account for enough high-speed electrons, theorists have proposed that cosmic rays provide a "seed" population. As these primary electrons accelerate in a thundercloud's electric field, they knock other electrons off of air molecules, and the newly liberated electrons accelerate and knock out still more electrons from atoms. This "runaway" avalanche model is consistent with short flashes, and it may provide the trigger for lightning strikes [1]. But it hasn't been clear whether the model could also explain long-duration bursts.
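In the runaway model, each seed electron multiplies roughly exponentially with distance travelled through the high-field region, so the electron population (and with it the gamma-ray output) grows as exp(z/λ) for an avalanche e-folding length λ. A minimal sketch with illustrative numbers follows; the e-folding length, seed count, and field-region size are assumptions, not values from the paper:

```python
# Toy runaway-electron avalanche: exponential multiplication with distance.
# All numbers are assumed for illustration only.
import math

SEED_ELECTRONS = 100        # cosmic-ray secondaries entering the field region
EFOLD_LENGTH_M = 80.0       # assumed avalanche e-folding length (metres)
REGION_LENGTH_M = 200.0     # assumed length of the accelerating region (metres)

def runaway_electrons(z_m):
    """Electron count after travelling z metres through the field region."""
    return SEED_ELECTRONS * math.exp(z_m / EFOLD_LENGTH_M)

n_exit = runaway_electrons(REGION_LENGTH_M)
print(f"electrons leaving the region: {n_exit:.0f} "
      f"({n_exit / SEED_ELECTRONS:.0f}x multiplication)")
```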
To provide a new type of data set, Harufumi Tsuchiya of RIKEN, a Japanese research institute, and his colleagues built a system that could measure both the electrons and photons from a thunderstorm. Their main components were a sodium-iodide scintillator that could detect all incoming particles in the range of 10 thousand electron-volts (keV) to 12 million electron-volts (MeV) and a plastic scintillator sensitive primarily to electrons with 500 keV or more energy. Because these relativistic electrons travel at most a few hundred meters through the atmosphere, the detectors were placed 2770 meters above sea level at the Norikura Observatory, where low-lying thunderstorms are frequent.
During a storm on the night of 20 September 2008, the detectors picked up a radiation burst lasting 90 seconds, with no associated lightning strike. A computer model showed that the gamma rays--which were identified by subtracting the two scintillator signals--likely originated from 90 meters above the detectors. "Because of this short source distance, the accelerated electrons were able to arrive at our detector after escaping an acceleration region in the thunderclouds," Tsuchiya says. The team's estimated electron energies extending up to 20 MeV were consistent with the runaway model, suggesting that at least the energy predictions of the theory are reasonable for long-duration gamma-ray bursts.
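The two detectors respond differently: the sodium-iodide scintillator counts both gamma rays and electrons, while the plastic scintillator responds mainly to electrons. Subtracting a scaled electron signal from the combined signal therefore isolates the gamma-ray component. A minimal sketch of that bookkeeping is shown below; the count rates and the relative-efficiency factor are hypothetical, not the team's calibration:

```python
# Toy separation of gamma rays from electrons using two detector channels.
# Count rates and the efficiency factor below are invented for illustration.
import numpy as np

nai_counts     = np.array([520, 610, 740, 705, 660])  # NaI: gammas + electrons
plastic_counts = np.array([110, 140, 180, 170, 150])  # plastic: mostly electrons

ELECTRON_EFFICIENCY_RATIO = 1.2   # hypothetical NaI/plastic response to electrons

# Estimated gamma-ray contribution in the NaI channel
gamma_estimate = nai_counts - ELECTRON_EFFICIENCY_RATIO * plastic_counts
print("estimated gamma counts per bin:", gamma_estimate.astype(int))
```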
From the electron counts, the authors also inferred the cloud's accelerator to be 200 meters long. This length is shorter than might be expected from the runaway model, says Robert Roussel-Dupré, a science consultant in Santa Fe, New Mexico, who helped develop the model. He thinks an extension of the theory is needed to explain the size of the accelerator, the long duration of such bursts, and also the puzzlingly small electric fields measured by balloons flown inside thunderclouds. By current theories, these fields aren't large enough to initiate lightning. A clearer picture of long-duration bursts could connect runaway electrons to the lightning spark.
--Michael Schirber
Michael Schirber is a freelance science writer in Lyon, France.
References:
[1] J. R. Dwyer, M. A. Uman, and H. K. Rassoul, "Remote Measurements of Thundercloud Electrostatic Fields," J. Geophys. Res. 114, D09208 (2009) [see Cosmic Rays Offer Clue to Lightning (news article at Physicsworld.com)].

QuikScat Finds Tempests Brewing In 'Ordinary' Storms


ScienceDaily (July 2, 2009) — "June is busting out all over," as the song says, and with it, U.S. residents along the Atlantic and Gulf coasts begin to gaze warily toward the ocean, aware that the hurricane season is revving up. In the decade since NASA's QuikScat satellite and its SeaWinds scatterometer launched in June 1999, the satellite has measured the wind speed and wind direction of these powerful storms, providing data that are increasingly used by the National Oceanic and Atmospheric Administration's (NOAA) National Hurricane Center and other world forecasting agencies. The data help scientists detect these storms, understand their wind fields, estimate their intensity and track their movement.
But tropical cyclones aren't the only storms that generate hurricane-force winds. Among others that do is a type of storm that dominates the weather in parts of the United States and other non-tropical regions every fall, winter and into spring: extratropical cyclones.
Extratropical Cyclones: Meteorological 'Bombs'
Scientists have long known that extratropical cyclones (also known as mid-latitude or baroclinic storms) sometimes produce hurricane-force winds. But before QuikScat, hurricane-force extratropical cyclones were thought to be relatively rare. Thanks to QuikScat, we now know that such storms occur much more frequently than previously believed, and the satellite has given forecasters an effective tool for routinely and consistently detecting and forecasting them.
These storms, which occur near busy trans-oceanic shipping lanes, pose a significant threat to life and property for those on the high seas, generating high winds and waves up to 30 meters (100 feet) high. When they make landfall, in areas like Alaska, the Pacific Northwest, New England and the U.S. mid-Atlantic coast, they produce strong winds, high surf, coastal flooding, heavy rains, river flooding and even blizzard conditions.
Take the "Hanukkah Eve" extratropical cyclone of Dec. 14-15, 2006, for example. That storm viciously raked the U.S. Pacific Northwest and British Columbia with torrential rainfall and hurricane-force winds exceeding 87 knots (100 miles per hour) in spots. Dozens of people were injured and 18 people lost their lives, while thousands of trees were downed, power was knocked out for more than 1.5 million residents and structural damage topped $350 million.
NOAA defines an extratropical cyclone as "a storm system that primarily gets its energy from the horizontal temperature contrasts that exist in the atmosphere." These low pressure systems have associated cold fronts, warm fronts and occluded fronts. Tropical cyclones, in contrast, don't usually vary much in temperature at Earth's surface, and their winds are generated by the energy released as clouds and rain form in warm, moist, tropical air. While a tropical cyclone's strongest winds are near Earth's surface, the strongest winds in extratropical cyclones are about 12 kilometers (8 miles) up, in the tropopause. Tropical cyclones can become extratropical, and vice versa.
Extratropical cyclones occur in both the North Atlantic and North Pacific year-round. Those with hurricane-force winds have been observed from September through May. Their frequency typically begins to increase in October, peaks in December and January, and tapers off sharply after March. They can range from less than 100 kilometers (62 miles) in diameter to more than 4,000 kilometers (nearly 2,500 miles) across. They typically last about five days, but their hurricane-force winds are usually short-lived--just 24 hours or less. Because they can intensify rapidly, they're often referred to as meteorological "bombs." Wind speeds in extratropical cyclones can vary from just 10 or 20 knots (12 to 23 miles per hour) to hurricane-force (greater than 63 knots, or 74 miles per hour). During their development, they can trek along at more than 30 knots (35 miles per hour), but they slow down as they mature. At their seasonal peak, up to eight such storms of varying intensity have been observed at once in both the North Atlantic and North Pacific.
Early work by scientists at NASA, NOAA and other organizations demonstrated the effectiveness of using scatterometers for detecting these powerful and destructive winds. Scatterometers work by sending radar signals to the ocean surface and measuring the strength of the radar signals that bounce back. The higher the wind speed, the more the ocean surface is disturbed, and the stronger the reflection that is bounced back to the satellite.
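Operational wind retrievals invert an empirical geophysical model function that relates backscatter to wind speed and direction. The sketch below substitutes a made-up power law for the real, empirically derived model-function tables, simply to show the inversion idea; it also ignores the multiple antenna looks used to resolve wind direction:

```python
# Toy scatterometer wind retrieval: invert an assumed power-law model function.
# Real retrievals use empirical model-function tables and multiple antenna
# looks; this only illustrates the speed inversion.
import numpy as np

def toy_sigma0(wind_speed_ms):
    """Assumed backscatter (linear units): rougher sea -> stronger echo."""
    return 1e-3 * wind_speed_ms ** 1.8

def retrieve_speed(sigma0_measured):
    """Pick the wind speed whose modelled backscatter best matches the echo."""
    candidates = np.linspace(1.0, 60.0, 600)           # m/s search grid
    misfit = np.abs(toy_sigma0(candidates) - sigma0_measured)
    return candidates[np.argmin(misfit)]

echo = toy_sigma0(33.0)                    # simulate a hurricane-force return
print(f"retrieved wind speed: {retrieve_speed(echo):.1f} m/s")
```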
Among those who pioneered these efforts at NASA was Senior Research Scientist Timothy Liu of NASA's Jet Propulsion Laboratory, Pasadena, Calif., who used data from the NASA Scatterometer, the predecessor to QuikScat, to study the transition of tropical cyclones into extratropical storms in 1997. In addition, Robert Atlas of NASA's Goddard Space Flight Center, Greenbelt, Md., demonstrated that scatterometer data were able to improve predictions of extratropical storm strength and location.
Raising Forecaster Awareness
Joe Sienkiewicz, chief of the Ocean Applications Branch at NOAA's Ocean Prediction Center, Camp Springs, Md., says QuikScat data have raised the awareness of forecasters to the occurrence of hurricane-force intensity conditions in extratropical cyclones and have significantly advanced their short-term wind warning and forecast processes.
"QuikScat winds have given forecasters at NOAA's Ocean Prediction Center a high level of situational awareness over the data-sparse waters of the North Atlantic and North Pacific Oceans," he said. "Ocean Prediction Center forecasters daily examine every QuikScat pass and patch of wind and frequently base wind warning and forecast decisions solely on QuikScat winds. Through confidence gained from QuikScat, the National Weather Service began issuing warnings for dangerous hurricane-force winds in extratropical cyclones in December 2000.
"From 10 years of QuikScat, we have learned that hurricane force winds in extratropical cyclones occur more frequently than thought, are most frequent in winter months, and the conditions are most often observed south of the cyclone center," he added.
Over the years, the number of storms observed with hurricane-force winds has steadily increased, both because forecasters have gained confidence in using the data and because the QuikScat data themselves have improved. From the fall of 2006 through 2008, NOAA's Ocean Prediction Center identified and issued warnings for 115 separate extratropical cyclones (64 in the Atlantic and 51 in the Pacific) that reached hurricane force.
As confirmed in a 2008 study, QuikScat substantially extends the ability of forecasters to detect hurricane-force wind events in extratropical storms. For the studied case, QuikScat was able to identify more than three-and-a-half times as many hurricane-force events as combined data from the European ASCAT sensor on the METOP-A satellite, directly-measured buoy and ship information, and model predictions.
Another study in 2002 found that incorporating QuikScat data increased the number of wind warnings the Ocean Prediction Center issued for extratropical cyclones by 30 percent in the North Atlantic and by 22 percent in the North Pacific. Between 2003 and 2006, the Ocean Prediction Center's forecasters successfully predicted hurricane-force winds two days in advance 58 percent of the time in the Atlantic and 44 percent in the Pacific. Considering that a successful forecast of hurricane-force winds requires accurate prediction of the timing and intensity of an explosive deepening cyclone, these numbers are impressive.
QuikScat data have been instrumental in the ability to forecast hurricane-force extratropical cyclones several days in advance, while they are still well out over the ocean. Forecasters can use the data to determine which numerical weather prediction models are handling a storm the best, thereby improving the accuracy of forecasts and increasing warning lead times. QuikScat data are available to forecasters within three hours of acquisition.
The availability of a consistent observing capability for extratropical cyclones from QuikScat has allowed NOAA to add a third "hurricane-force" warning category for extratropical cyclone winds, in addition to gale and storm, providing better warnings of a coming storm's severity. The U.S. Coast Guard broadcasts these warnings by radiofax, and they are posted online at: http://www.opc.ncep.noaa.gov .
A Boon to Shipping
These extratropical cyclone warnings have a great economic impact on the $200 billion global marine shipping industry. A recent study estimates improvements to warning and forecast services due to QuikScat save the container and bulk shipping industry $135 million a year by reducing their exposure to hurricane-force wind conditions in non-tropical storms over the North Pacific and North Atlantic. Without QuikScat, the severity of many extratropical cyclones would not be determined. The data are also vital to the fishing industry, offshore energy industries, search and rescue organizations, and agencies that track and manage marine hazards like oil spills.
Paul Chang, ocean winds science team lead at NOAA's National Environmental Satellite, Data and Information Service/Center for Satellite Applications and Research, Camp Springs, Md., said ocean vector wind measurements from QuikScat have become a basic part of NOAA's day-to-day forecasting and warning processes.
"The 10 years of observations from the QuikScat mission have provided critical information for the monitoring, modeling, forecasting and research of the atmosphere, oceans and climate," he said.
For more information about QuikScat, visit http://winds.jpl.nasa.gov/.
Adapted from materials provided by NASA/Jet Propulsion Laboratory.

Wednesday, July 1, 2009

Rising Acidity Levels Could Trigger Shellfish Revenue Declines, Job Losses


ScienceDaily (July 1, 2009) — Changes in ocean chemistry — a consequence of increased carbon dioxide (CO2) emissions from human industrial activity — could cause U.S. shellfish revenues to drop significantly in the next 50 years, according to a new study by researchers at the Woods Hole Oceanographic Institution (WHOI).
Intensive burning of fossil fuels and deforestation over the last two centuries have increased CO2 levels in the atmosphere by almost 40 percent. The oceans have absorbed about one-third of all human-generated carbon emissions, but the buildup of CO2 in the ocean is pushing surface waters toward more acidic conditions.
This “ocean acidification” creates a corrosive environment for marine organisms such as corals, marine plankton, and shellfish that build carbonate shells or skeletons. Mollusks — including mussels and oysters, which support valuable marine fisheries — are particularly sensitive to these changes.
In a case study of U.S. commercial fishery revenues published in the June issue of Environmental Research Letters, WHOI scientists Sarah Cooley and Scott Doney calculated the possible economic effects of ocean acidification over the next 50 years using atmospheric CO2 trajectories from the Intergovernmental Panel on Climate Change and laboratory studies of acidification’s effects on shell-forming marine organisms, focusing especially on mollusks.
Mollusk sales by fishermen currently generate about $750 million per year — nearly 20 percent of total U.S. fisheries revenue. The study assumed that mollusk harvests in the U.S. would drop 10 to 25 percent in 50 years' time as a result of increasing acidity levels, which would decrease mollusk sales by $75 million to $187 million annually.
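Those figures follow directly from the assumed decline, as the quick calculation below shows (the 10 to 25 percent range is the study's assumption; the rest is arithmetic):

```python
# Annual primary-revenue loss implied by the study's assumed harvest decline.
CURRENT_MOLLUSK_REVENUE_M = 750.0   # million US dollars per year

for decline in (0.10, 0.25):
    loss = CURRENT_MOLLUSK_REVENUE_M * decline
    print(f"{decline:.0%} harvest decline -> ${loss:.1f} million lost per year")
```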
“Losses in primary revenue from commercial mollusk harvests — or the money that fishermen receive for their catch — could add up to as much as $1.4 billion by 2060,” said Cooley.
Reduced harvests of mollusks, as well as losses of predatory fish and other species that depend on mollusks for food, could lead to economic hardships for fishing communities.
“Ocean acidification will impact the millions of people that depend on seafood and other ocean resources for their livelihoods,” said Doney. “Losses of crustaceans, bivalves, their predators, and their habitat — in the case of reef-associated fish communities — would particularly injure societies that depend heavily on consumption and export of marine resources.”
Because changes in seawater chemistry are already apparent and will grow over the next few decades, Cooley and Doney suggest measures that focus on adaptation to future CO2 increases to lessen the impact on marine ecosystems, such as flexible fishery management plans and support for fishing communities.
“Limiting nutrient runoff from land helps coastal ecosystems stay healthy,” said Cooley. “Also, fishing rules can be adjusted to reduce pressure on valuable species; fisheries managers may set up more marine protected areas, or they may encourage development of new fisheries.”
This research was supported by grants from the National Science Foundation and Woods Hole Oceanographic Institution.
Journal reference:
Cooley et al. Anticipating ocean acidification's economic consequences on commercial fisheries. Environmental Research Letters, June 1, 2009; 4 (2): 024007 DOI: 10.1088/1748-9326/4/2/024007
Adapted from materials provided by Woods Hole Oceanographic Institution.

NASA, Japan Release Most Complete Topographic Map Of Earth


ScienceDaily (July 1, 2009) — NASA and Japan released a new digital topographic map of Earth on Monday that covers more of our planet than ever before. The map was produced with detailed measurements from NASA's Terra spacecraft.
The new global digital elevation model of Earth was created from nearly 1.3 million individual stereo-pair images collected by the Japanese Advanced Spaceborne Thermal Emission and Reflection Radiometer, or Aster, instrument aboard Terra. NASA and Japan's Ministry of Economy, Trade and Industry, known as METI, developed the data set. It is available online to users everywhere at no cost.
"This is the most complete, consistent global digital elevation data yet made available to the world," said Woody Turner, Aster program scientist at NASA Headquarters in Washington. "This unique global set of data will serve users and researchers from a wide array of disciplines that need elevation and terrain information."
According to Mike Abrams, Aster science team leader at NASA's Jet Propulsion Laboratory in Pasadena, Calif., the new topographic information will be of value throughout the Earth sciences and has many practical applications. "Aster's accurate topographic data will be used for engineering, energy exploration, conserving natural resources, environmental management, public works design, firefighting, recreation, geology and city planning, to name just a few areas," Abrams said.
Previously, the most complete topographic set of data publicly available was from NASA's Shuttle Radar Topography Mission. That mission mapped 80 percent of Earth's landmass, between 60 degrees north latitude and 57 degrees south. The new Aster data expand coverage to 99 percent, from 83 degrees north latitude to 83 degrees south. Each elevation measurement point in the new data is 30 meters (98 feet) apart.
"The Aster data fill in many of the voids in the shuttle mission's data, such as in very steep terrains and in some deserts," said Michael Kobrick, Shuttle Radar Topography Mission project scientist at JPL. "NASA is working to combine the Aster data with that of the Shuttle Radar Topography Mission and other sources to produce an even better global topographic map."
NASA and METI are jointly contributing the Aster topographic data to the Group on Earth Observations, an international partnership headquartered at the World Meteorological Organization in Geneva, Switzerland, for use in its Global Earth Observation System of Systems. This "system of systems" is a collaborative, international effort to share and integrate Earth observation data from many different instruments and systems to help monitor and forecast global environmental changes.
NASA, METI and the U.S. Geological Survey validated the data, with support from the U.S. National Geospatial-Intelligence Agency and other collaborators. The data will be distributed by NASA's Land Processes Distributed Active Archive Center at the U.S. Geological Survey's Earth Resources Observation and Science Data Center in Sioux Falls, S.D., and by METI's Earth Remote Sensing Data Analysis Center in Tokyo.
Aster is one of five Earth-observing instruments launched on Terra in December 1999. Aster acquires images from the visible to the thermal infrared wavelength region, with spatial resolutions ranging from about 15 to 90 meters (50 to 300 feet). A joint science team from the U.S. and Japan validates and calibrates the instrument and data products. The U.S. science team is located at JPL.
For visualizations of the new Aster topographic data, visit: http://www.nasa.gov/topics/earth/features/20090629.html .
Data users can download the Aster global digital elevation model at: https://wist.echo.nasa.gov/~wist/api/imswelcome and http://www.gdem.aster.ersdac.or.jp .
Adapted from materials provided by NASA/Jet Propulsion Laboratory.

Tuesday, June 23, 2009

How Aerosols Contribute To Climate Change


ScienceDaily (June 23, 2009) — What happens in Vegas may stay in Vegas, but what happens on the way there is a different story.
As imaged by Lynn Russell, a professor of atmospheric chemistry at Scripps Institution of Oceanography at UC San Diego, and her team, air blown by winds between San Diego and Las Vegas gives the road to Sin City a distinctive look.
The team has sampled air from the tip of the Scripps Pier since last year, creating a near real-time record of what kinds of particles — from sea salt to car exhaust — are floating around at any given time. Add data about wind speed and direction and the scientists can tell where particles came from and can map their pathways around Southern California.
When Russell and her students put it all together, the atmosphere of greater San Diego comes alive in colors representing the presence of different airborne chemical compounds in aerosol form. One streak of deep red draws a distinct line from the pier that sometimes extends all the way to Las Vegas. The red denotes organic mass, a carbon-based component of vehicular and industrial emissions that pops up on Russell’s readouts frequently. Plot the streak on a road atlas and it reveals the daily life of pollution in Southern California. For one stretch of time, it neatly traced Interstate 15 all the way past the California-Nevada border.
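Connecting a measurement at the pier to a source region is essentially a back-trajectory calculation: step an air parcel backwards in time using the recorded wind speed and direction at each step. The sketch below shows the idea in a flat-earth approximation with invented wind data; the team's actual trajectory analysis is more sophisticated and uses gridded meteorological fields:

```python
# Minimal back-trajectory: walk an air parcel backwards from the measurement
# site using hourly wind speed and direction. The winds below are invented.
import math

# Hourly records at the site: (speed in m/s, direction the wind blows FROM, degrees)
wind_record = [(4.0, 250.0), (5.0, 245.0), (6.0, 240.0), (6.5, 235.0)]

x_km, y_km = 0.0, 0.0          # parcel position relative to the pier (east, north)
for speed, from_deg in reversed(wind_record):   # latest observation first
    # Stepping backwards in time means moving toward where the wind came from.
    theta = math.radians(from_deg)
    x_km += speed * 3.6 * math.sin(theta)   # km moved eastward in one hour
    y_km += speed * 3.6 * math.cos(theta)   # km moved northward in one hour

# Negative values mean west/south of the pier.
print(f"parcel position 4 hours earlier: {x_km:.0f} km east, {y_km:.0f} km north")
```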
“We were really surprised,” said Russell. “We did not expect to have such consistent winds for the selected study days.”
The hunt for various types of aerosols is helping Russell draw new kinds of global maps, ones that depict what organic compounds—whether natural or from sources such as Southern California traffic and industries—could do to affect rainfall, snowfall, atmospheric warming and cooling, and a host of other climate phenomena. Russell is part of an effort that involves several researchers at Scripps and UCSD and around the world. Collectively they are attempting to address a human-caused phenomenon in the Earth system scarcely considered before the last decade.
Aerosol research is considered one of the most critical frontiers of climate change science, much of which is devoted to the creation of accurate projections of future climate. These projections are generated by computer models — simulations of phenomena such as warming patterns, sea level fluctuations, or drought trends. The raw data for the models can come from historical records of climate basics like temperature and precipitation, but scientists often must rely on incomplete data and best guesses to represent more complex phenomena. The more such uncertainty goes into a model, the greater its margin of error becomes, making it less reliable as a guide for forecasts and adaptive actions.
Among these complex phenomena, the actions of aerosols are what some researchers consider the field’s holy grail, representing the biggest barrier to producing accurate representations of climate. In fact, the Intergovernmental Panel on Climate Change in 2007 specifically listed the effect of aerosols on cloud formation as the largest source of uncertainty in present-day climate models.
Bits of dust, sea salt, the remnants of burned wood, and even living things like bacteria all add to the mix of aerosols that create the skeletons on which clouds form. Around these particles, water and ice condense and cluster into cloud masses. The size and number of each of these droplets determine whether the clouds can produce rain or snow.
The aerosols are also influencing climate in other ways. Diesel exhaust, industrial emissions, and the smoke from burning wood and brush eject myriad bits of black carbon, usually in the form of soot, into the sky and form so-called “brown clouds” of smog. This haze has a dual heating and cooling effect. The particles absorb heat and make the air warmer at the altitudes to which they tend to rise but they also deflect sunlight back into space. This shading effect cools the planet at ground level.
The Arctic Circle is one of the places in the world most sensitive to changes in the mix of aerosols. Since the beginnings of the Industrial Revolution, scientists and explorers have noted the presence of the Arctic haze, a swirl of pollution that appears when sunlight returns after a winter of darkness. The presence of smog over a mostly uninhabited region leads many scientists to believe it is the reason the Arctic is experiencing the most rapid climate-related changes in the world. The haze now lingers for a longer period of time every year. It may be contributing to the forces now causing a meltdown of Arctic ice, a release of methane once stored in permafrost, and a host of ecological changes affecting the spectrum of organisms from mosquitoes to polar bears.
Russell has taken part in two recent analyses of polar air to understand where its imported aerosols come from and how the chemical components of those aerosols could be affecting temperature and cloud formation. From a research vessel in the Norwegian Sea and via continuous measurements from a ground station in Barrow, Alaska, Russell’s team is analyzing particles likely to have been blown to the Arctic from Europe and Asia. Her group has just compiled a full season of air samples fed through intake valves onto filters collected at Barrow.
With it, she believes she has proven what colleagues have previously theorized about where the particles are coming from. She is especially interested in organic particles—aerosols containing carbon supplied either by natural sources such as ocean or land plants or by human sources. Work in her group has shown that organics in the spring haze carry a signature consistent with dust and biomass burning taking place most likely in Siberia. The chemical signature changes in other seasons, revealing itself in infrared spectroscopy readings to be the product of aerosols from natural sources.
The aerosols could be influencing how much snowfall the Arctic gets and keeps. Human-produced aerosols are thought to stifle precipitation in some areas but may provide the impetus for torrential rain in others depending on their chemical make-up. Even if the Asian aerosols are not affecting precipitation, however, Russell said they appear to cool the Arctic atmosphere by deflecting light into space. At the same time, there is strong evidence that they are accelerating ice melt in the Arctic by darkening and heating ice once they fall to the ground the way a dark sweater makes its wearer hotter on a sunny day than does a white sweater.
Russell has been part of another collaborative effort launched in 2008, the International Polar Year, that created chemical profiles of relatively untainted air off the Norwegian west coast, which is only occasionally tinged by European smog. She has also teamed with collaborators at Scripps, NOAA and other universities to profile aerosols around Houston, Texas, and Mexico City.
In the latter two projects, she has provided evidence that agriculture adds more to the aerosol mix in an oil town like Houston than previously thought and that organic particles in Mexico City, rising from the smoke of street vendors and exhaust of cars driving on gas formulated differently than in the United States, glom on to dust in a different manner than American pollution to create aerosols with distinct chemical structures. Figuring out what they do locally and regionally is the next step.
Russell collaborates with a number of other faculty at UCSD whose research also focuses on aerosols, such as Kim Prather, an atmospheric chemistry professor with joint appointments at Scripps and the UCSD Department of Chemistry and Biochemistry. Russell and Prather are comparing their results from Mexico City in an effort to better understand the sources of aerosols in the atmosphere.
“We are trying to understand the major sources of aerosols in our atmosphere and how they affect the overall temperature of our planet; as opposed to greenhouse gases which we know are warming, aerosols can cool or warm depending on their composition and where they are located in the atmosphere," said Prather. Like Russell, Prather also studies long-range transport of aerosols from terrestrial and marine sources. Prather and Russell have worked together on several other projects and recently helped form the Aerosol Chemistry and Climate Institute, a collaboration between Scripps and the Department of Energy’s Pacific Northwest National Laboratory.
For her most comprehensive study, Russell need only make her shortest journey: to the end of the Scripps Pier. It is possible that aerosol journeys of a thousand miles or more might be explained by shorter commutes between Southern California counties. Complete analysis of the Interstate 15 data suggests Vegas might not be a source of dirtiness after all. Using data collected over longer time periods, Russell's pollution map of local counties now suggests organic human-made aerosols might just be blowing toward Nevada from San Bernardino and Riverside, then back toward San Diego as winds shift. Russell employs a suite of complementary measurements at the pier to characterize short- and long-term aerosol trends. Those are combined with particle profiles made by Prather's group and by collaborators whose numbers are growing out of necessity.
“Understanding the big picture is the only way we’re going to be able to reduce the uncertainty associated with aerosol particles and their effects on climate,” said Russell. “There are so many parameters, there’s no one instrument or even one person who can do all of it at once.”
Adapted from materials provided by University of California, San Diego, via Newswise.