Thursday, October 1, 2009

Google Earth Application Maps Carbon's Course

ScienceDaily (Sep. 30, 2009) — Sometimes a picture really is worth a thousand words, particularly when the picture is used to illustrate science. Technology is giving us better pictures every day, and one of them is helping a NASA-funded scientist and her team to explain the behavior of a greenhouse gas.
Google Earth -- the digital globe on which computer users can fly around the planet and zoom in on key features -- is attracting attention in scientific communities and aiding public communication about carbon dioxide. Recently Google held a contest to present scientific results using KML, a data format used by Google Earth.
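A note on format for readers unfamiliar with KML: it is simply XML that Google Earth knows how to draw. The short Python sketch below writes a single hypothetical particle track as a KML LineString; the coordinates are invented for illustration, and Erickson's actual application is far more elaborate, animating many tracks through time.

```python
# Minimal sketch: write one (invented) particle track as a KML LineString.
track = [(-90.1, 45.3, 500.0), (-90.4, 45.5, 800.0), (-90.9, 45.8, 1200.0)]

coords = " ".join(f"{lon},{lat},{alt}" for lon, lat, alt in track)
kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>CO2 parcel track (illustrative)</name>
    <LineString>
      <altitudeMode>absolute</altitudeMode>
      <coordinates>{coords}</coordinates>
    </LineString>
  </Placemark>
</kml>"""

with open("track.kml", "w") as f:
    f.write(kml)  # open the file in Google Earth to see the track
```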
"I tried to think of a complex data set that would have public relevance," said Tyler Erickson, a geospatial researcher at the Michigan Tech Research Institute in Ann Arbor.
He chose to work with data from NASA-funded researcher Anna Michalak of the University of Michigan, Ann Arbor, who develops complex computer models to trace carbon dioxide back in time to where it enters and leaves the atmosphere.
"The datasets have three spatial dimensions and a temporal dimension," Erickson said. "Because the data is constantly changing in time makes it particularly difficult to visualize and analyze."
A better understanding of the carbon cycle has implications for energy and environmental policy and carbon management. In June 2009, Michalak described this research at the NASA Earth System Science at 20 symposium in Washington, D.C.
A snapshot from Erickson's Google Earth application shows green tracks representing carbon dioxide in the lowest part of the atmosphere close to Earth's surface, where vegetation and land processes can impact the carbon cycle. Red tracks indicate particles at higher altitudes that are largely immune to ground influences.
The application is designed to educate the public and even scientists about how carbon dioxide emissions can be traced. A network of 1,000-foot towers across the United States is equipped by NOAA with instruments that measure the carbon dioxide content of parcels of air at single locations.
But where did that gas come from and how did it change along its journey? To find out, scientists rely on a sleuthing technique called "inverse modeling" – measuring gas concentrations at a single geographic point and then using clues from weather and atmospheric models to deduce where it came from. The technique is complex and difficult to explain even to fellow scientists.
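The core idea can be sketched in a few lines of code. The toy Python example below steps an air parcel backward through a made-up wind field to estimate where it came from; the wind function, time step and tower coordinates are all invented stand-ins, whereas a real inverse model works with gridded meteorological analyses and a statistical treatment of many observations at once.

```python
import numpy as np

def wind(lon, lat, t):
    """Hypothetical wind field (degrees per hour); a real model would
    interpolate gridded meteorological analyses here."""
    u = 0.5 + 0.1 * np.sin(np.radians(lat))        # eastward component
    v = 0.1 * np.cos(np.radians(lon) + 0.01 * t)   # northward component
    return u, v

def back_trajectory(lon, lat, hours, dt=1.0):
    """Trace an air parcel backward in time from a measurement site by
    reversing the motion implied by the wind field."""
    path = [(lon, lat)]
    t = 0.0
    for _ in range(int(hours / dt)):
        u, v = wind(lon, lat, t)
        lon -= u * dt   # subtract the forward motion to step backward
        lat -= v * dt
        t -= dt
        path.append((lon, lat))
    return path

# Parcel sampled at an invented tall-tower location, traced back 48 hours.
print(back_trajectory(-90.27, 45.95, hours=48)[-1])
```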
Michalak related the technique to cream in a cup of coffee. "Say someone gave you a cup of creamy coffee," Michalak said. "How do you know when that cream was added?" Just as cream is not necessarily mixed perfectly, neither is the carbon dioxide in the atmosphere. If scientists can see the streaks of cream (carbon dioxide) and understand how the coffee (atmosphere) was stirred (weather), they can use those clues to retrace the time and location at which the ingredient was added to the mix.
The visual result typically used by scientists is a static two-dimensional map of the location of the gas, as averaged over the course of a month. Most carbon scientists know how to interpret the 2D map, but visualizing the 3D changes for non-specialists has proved elusive. Erickson spent 70 hours programming the Google Earth application that makes it easy to navigate through time and watch gas particles snake their way toward the NOAA observation towers. For his work, Erickson was declared one of Google's winners in March 2009.
"Having this visual tool allows us to better explain the scientific process," Michalak said. "It's a much more human way of looking at the science."
The next step, Erickson said, is to adapt the application to fit the needs of the research community. Scientists could use the program to better visualize the output of complex atmospheric models and then improve those models so that they better represent reality.
"Encouraging more people to deliver data in an interactive format is a good trend," Erickson said. "It should help innovation in research by reducing barriers to sharing data."
Adapted from materials provided by NASA.

San Andreas Affected By 2004 Sumatran Quake; Largest Quakes Can Weaken Fault Zones Worldwide

ScienceDaily (Sep. 30, 2009) — U.S. seismologists have found evidence that the massive 2004 earthquake that triggered killer tsunamis throughout the Indian Ocean weakened at least a portion of California's famed San Andreas Fault. The results, which appear this week in the journal Nature, suggest that the Earth's largest earthquakes can weaken fault zones worldwide and may trigger periods of increased global seismic activity.
"An unusually high number of magnitude 8 earthquakes occurred worldwide in 2005 and 2006," said study co-author Fenglin Niu, associate professor of Earth science at Rice University. "There has been speculation that these were somehow triggered by the Sumatran-Andaman earthquake that occurred on Dec. 26, 2004, but this is the first direct evidence that the quake could change fault strength of a fault remotely."
Earthquakes occur when a fault fails, either because of the buildup of stress or because of the weakening of the fault. The latter is more difficult to measure.
The magnitude 9 earthquake in 2004 occurred beneath the ocean west of Sumatra and was the second-largest quake ever measured by seismograph. The temblor spawned tsunamis as large as 100 feet that killed an estimated 230,000 people, mostly in Indonesia, Sri Lanka, India and Thailand.
In the new study, Niu and co-authors Taka'aki Taira and Paul Silver, both of the Carnegie Institution for Science in Washington, D.C., and Robert Nadeau of the University of California, Berkeley, examined more than 20 years of seismic records from Parkfield, Calif., which sits astride the San Andreas Fault.
The team zeroed in on a set of repeating microearthquakes that occurred near Parkfield over two decades. Each of these tiny quakes originated in almost exactly the same location. By closely comparing seismic readings from these quakes, the team was able to determine the "fault strength" -- the shear stress level required to cause the fault to slip -- at Parkfield between 1987 and 2008.
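The paper's full analysis is more involved, but one standard ingredient of repeating-earthquake studies is comparing waveforms from successive repeats to detect subtle changes along the path through the fault zone. The Python sketch below, using synthetic 100 Hz waveforms invented for illustration, estimates the time shift between two such records by cross-correlation.

```python
import numpy as np

def time_shift(ref, rep, dt):
    """Estimate the lag (seconds) of a repeating event's waveform
    relative to a reference via cross-correlation; systematic shifts
    between repeats hint at changes in the fault zone."""
    n = len(ref)
    xcorr = np.correlate(rep - rep.mean(), ref - ref.mean(), mode="full")
    return (xcorr.argmax() - (n - 1)) * dt

# Synthetic 100 Hz records: the 'repeat' arrives 0.05 s later.
dt = 0.01
t = np.arange(0, 10, dt)
ref = np.exp(-((t - 5.00) ** 2) / 0.1) * np.sin(40 * t)
rep = np.exp(-((t - 5.05) ** 2) / 0.1) * np.sin(40 * (t - 0.05))
print(f"estimated shift: {time_shift(ref, rep, dt):.3f} s")
```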
The team found fault strength changed markedly three times during the 20-year period. The authors surmised that the 1992 Landers earthquake, a magnitude 7 quake north of Palm Springs, Calif. -- about 200 miles from Parkfield -- caused the first of these changes. The study found the Landers quake destabilized the fault near Parkfield, causing a series of magnitude 4 quakes and a notable "aseismic" event -- a movement of the fault that played out over several months -- in 1993.
The second change in fault strength occurred in conjunction with a magnitude 6 earthquake at Parkfield in September 2004. The team found another change at Parkfield later that year that could not be accounted for by the September quake alone. Eventually, they were able to narrow the onset of this third shift to a five-day window in late December during which the Sumatran quake occurred.
"The long-range influence of the 2004 Sumatran-Andaman earthquake on this patch of the San Andreas suggests that the quake may have affected other faults, bringing a significant fraction of them closer to failure," said Taira. "This hypothesis appears to be borne out by the unusually high number of large earthquakes that occurred in the three years after the Sumatran-Andaman quake."
The research was supported by the National Science Foundation, the Carnegie Institution of Washington, the University of California, Berkeley, and the U.S. Geological Survey.
Adapted from materials provided by Rice University.

Saturday, July 25, 2009

Auroras In Northern And Southern Hemispheres Are Not Identical


ScienceDaily (July 24, 2009) — Norwegian researchers have shown that the auroras in the Northern and the Southern hemispheres can be totally asymmetric. These findings contradict the common assumption that the two auroras are mirror images of each other.
The study was performed by PhD student Karl Magnus Laundal and professor Nikolai Østgaard at the Institute of Physics and Technology at the University of Bergen.
"The aurora is produced due to collisions between the Earth’s atmosphere and electrically charged particles streaming along the Earth’s geomagnetic field lines. – Since these processes occur above the two hemispheres, both the Northern and the Southern light are created. So far researchers have assumed that these auroras are mirror images of each other, but our findings show that this is not always the case," professor Nikolai Østgaard says.
The researchers at the University of Bergen used data from the two NASA satellites IMAGE and Polar to examine the Northern and the Southern light. In the Nature letter they present several possible explanations for the unexpected result.
"The most plausible explanation involves electrical currents along the magnetic field lines. Differences in the solar exposure may lead to currents between the two hemispheres, explaining why the Northern and the Southern light are not identical," PhD student Karl Magnus Laundal says.
In addition to yielding new knowledge about the aurora, the results are important for other researchers studying near-Earth space.
"Our study shows that data from only one hemisphere is not sufficient to state the conditions on the other hemisphere. This is important because most of our knowledge about the aurora, as well as processes in the upper atmosphere in the polar regions, is based solely on data from the Northern hemisphere," Østgaard points out.
The work of Østgaard and Laundal has been financed by the Research Council of Norway via the International Polar Year project IPY-ICESTAR.
Journal reference:
Laundal, K.M., and Østgaard, N. Asymmetric auroral intensities in the Earth's Northern and Southern hemispheres. Nature, 2009; 460 (7254): 491-493. DOI: 10.1038/nature08154
Adapted from materials provided by the University of Bergen, via AlphaGalileo.

Wednesday, July 22, 2009

California's Channel Islands Hold Evidence Of Clovis-age Comets


ScienceDaily (July 21, 2009) — A 17-member team has found what may be the smoking gun of a much-debated proposal that a cosmic impact about 12,900 years ago ripped through North America and drove multiple species into extinction.
In a paper appearing online ahead of regular publication in the Proceedings of the National Academy of Sciences, University of Oregon archaeologist Douglas J. Kennett and colleagues from nine institutions and three private research companies report the presence of shock-synthesized hexagonal diamonds in 12,900-year-old sediments on the Northern Channel Islands off the southern California coast.
These tiny diamonds and diamond clusters were buried below four meters of sediment. They date to the end of Clovis -- a Paleoindian culture whose people were long thought to be North America's first human inhabitants. The nano-sized diamonds were pulled from Arlington Canyon on the island of Santa Rosa, which was once joined with three other Northern Channel Islands in a landmass known as Santarosae.
The diamonds were found in association with soot, which forms in extremely hot fires; together with nearby environmental records, this association suggests regional wildfires at the time.
Such soot and diamonds are rare in the geological record. They have been found together in sediment dating to the massive asteroid impact 65 million years ago, in a layer widely known as the K-T boundary. That thin layer of iridium-and-quartz-rich sediment dates to the transition between the Cretaceous and Tertiary periods, which marks the end of the Mesozoic Era and the beginning of the Cenozoic Era.
"The type of diamond we have found -- Lonsdaleite -- is a shock-synthesized mineral defined by its hexagonal crystalline structure. It forms under very high temperatures and pressures consistent with a cosmic impact," Kennett said. "These diamonds have only been found thus far in meteorites and impact craters on Earth and appear to be the strongest indicator yet of a significant cosmic impact [during Clovis]."
The age of this event also matches the extinction of the pygmy mammoth on the Northern Channel Islands, as well as numerous other North American mammals, including the horse, which Europeans later reintroduced. In all, an estimated 35 mammal and 19 bird genera became extinct near the end of the Pleistocene, with some of the extinctions occurring very close in time to the proposed cosmic impact, first reported in October 2007 in PNAS.
In the Jan. 2, 2009, issue of the journal Science, a team led by Kennett reported the discovery of billions of nanometer-sized diamonds concentrated in sediments -- at about 10 to 2,700 parts per billion -- in six North American locations.
"This site, this layer with hexagonal diamonds, is also associated with other types of diamonds and with dramatic environmental changes and wildfires," said James Kennett, paleoceanographer and professor emeritus in the Department of Earth Science at the University of California, Santa Barbara.
"There was a major event 12,900 years ago," he said. "It is hard to explain this assemblage of materials without a cosmic impact event and associated extensive wildfires. This hypothesis fits with the abrupt cooling of the atmosphere as shown in the record of ocean drilling of the Santa Barbara Channel. The cooling resulted when dust from the high-pressure, high-temperature, multiple impacts was lofted into the atmosphere, causing a dramatic drop in solar radiation."
The hexagonal diamonds from Arlington Canyon were analyzed at the UO's Lorry I. Lokey Laboratories, a world-class nanotechnology facility built deep in bedrock to allow for sensitive microscopy and other high-tech analyses of materials. The analyses were done in collaboration with FEI, a Hillsboro, Ore., company that distributes the high-resolution Titan microscope used to characterize the hexagonal diamonds in this study.
Transmission and scanning electron microscopes were used in the extensive analyses of the sediment, which contained clusters of Lonsdaleite ranging in size from 20 to 1,800 nanometers. These diamonds were inside or attached to carbon particles found in the sediments.
These findings are inconsistent with the alternative and already hotly debated theory that overhunting by Clovis people led to the rapid extinction of large mammals at the end of the ice age, the research team argues in the PNAS paper. An alternative theory has held that climate change was to blame for these mass extinctions. The cosmic-event theory suggests that rapid climate change at this time was possibly triggered by a series of small and widely dispersed comet strikes across much of North America.
The National Science Foundation provided primary funding for the research. Additional funding was provided by way of Richard A. Bray and Philip H. Knight faculty fellowships of the University of Oregon, respectively, to Kennett and UO colleague Jon M. Erlandson, a co-author and director of the UO's Museum of Natural and Cultural History.
The 17 co-authors on the PNAS paper are Douglas Kennett, Erlandson and Brendan J. Culleton, all of the University of Oregon; James P. Kennett of UC Santa Barbara; Allen West of GeoScience Consulting in Arizona; G. James West of the University of California, Davis; Ted E. Bunch and James H. Wittke, both of Northern Arizona University; Shane S. Que Hee of the University of California, Los Angeles; John R. Johnson of the Santa Barbara Museum of Natural History; Chris Mercer of the Santa Barbara Museum of Natural History and National Institute of Materials Science in Japan; Feng Shen of the FEI Co.; Thomas W. Stafford of Stafford Research Inc. of Colorado; Adrienne Stich and Wendy S. Wolbach, both of DePaul University in Chicago; and James C. Weaver of the University of California, Riverside.
Adapted from materials provided by University of Oregon.

Wednesday, July 15, 2009

First Remote, Underwater Detection Of Harmful Algae, Toxins


ScienceDaily (July 15, 2009) — Scientists at NOAA's National Centers for Coastal Ocean Science and the Monterey Bay Aquarium Research Institute (MBARI) have successfully conducted the first remote detection of a harmful algal species and its toxin below the ocean's surface. The achievement was recently reported in the June issue of Oceanography.
This achievement represents a significant milestone in NOAA's effort to monitor the type and toxicity of harmful algal blooms (HABs). HABs are considered to be increasing not only in their global distribution, but also in the frequency, duration, and severity of their effects. HABs damage coastal ecosystem health and pose threats to humans as well as marine life. Climate change is expected to exacerbate this trend, since many critical processes that govern HABs dynamics, such as water temperature and ocean circulation, are influenced by climate.
An MBARI-designed robotic instrument called the Environmental Sample Processor, or 'ESP' -- a fully functional analytical laboratory in the sea -- lets researchers collect algal cells and extract both the genetic information required to identify the organism and the toxin needed to assess the risk to humans and wildlife. The ESP then conducts specialized, molecular-based measurements of species and toxin abundance, and transmits the results to the laboratory via radio signals.
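As a rough illustration of that two-stage logic -- identify the organism first, then assess its toxicity -- here is a hypothetical Python sketch. The function, thresholds and report fields are invented for illustration and are not taken from the MBARI design.

```python
# Invented thresholds for illustration only.
CELL_ALERT = 1.0e4    # cells per liter that trigger a toxin assay
TOXIN_ALERT = 0.5     # micrograms of domoic acid per liter

def process_sample(cells_per_liter, toxin_ug_per_liter):
    """Mimic the two-stage logic: identify the organism first, then
    measure the toxin, and report both for transmission to shore."""
    report = {"cells": cells_per_liter, "toxin": None, "alert": False}
    if cells_per_liter >= CELL_ALERT:           # bloom organism present?
        report["toxin"] = toxin_ug_per_liter    # run the toxin assay
        report["alert"] = toxin_ug_per_liter >= TOXIN_ALERT
    return report

print(process_sample(2.3e4, 0.8))   # abundant and toxic -> alert
print(process_sample(5.0e2, 0.0))   # sparse cells -> no toxin assay
```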
"This represents the first autonomous detection of both a HAB species and its toxin by an underwater sensor," notes Greg Doucette, Ph.D., a research oceanographer at NOAA's Center for Coastal Environmental Health and Biomolecular Research laboratory in Charleston, S.C. "It allows us to determine not only the organism causing a bloom, but also the toxicity of the event, which ultimately dictates whether it is a threat to the public and the ecosystem."
For the first demonstration of the ESP's ability to detect HABs and their toxins, Doucette and his MBARI colleague, Chris Scholin, Ph.D., targeted certain members of the algal genus Pseudo-nitzschia and their neurotoxin, domoic acid, in Monterey Bay, Calif.
Pseudo-nitzschia and domoic acid have been a concern in the Monterey Bay area for well over a decade. In 1991, the first U.S. outbreak of domoic acid poisoning was documented in Monterey Bay. This outbreak resulted in the unusual deaths of numerous pelicans and cormorants that ingested sardines and anchovies, which had accumulated the domoic acid by feeding on a bloom of the toxic algae.
In the spring of 1998, a mass mortality of sea lions in and around the Monterey Bay area was attributed to the sea lions' feeding on domoic acid-contaminated anchovies. Since that time, Pseudo-nitzschia and domoic acid have appeared on virtually an annual basis in California coastal waters and are the focus of an intensive statewide monitoring program run by the California Dept. of Public Health. Humans also can be affected by the toxin through consumption of contaminated seafood such as shellfish.
"Our public health monitoring program is one of the many groups that can benefit directly from the ESP technology and ability to provide an early warning of impending bloom activity and toxicity," said Gregg Langlois, director of the state of California's Marine Biotoxin Monitoring Program. "This is critical information for coastal managers and public health officials in mitigating impacts on the coastal ecosystem, since the toxicity of these algae can vary widely from little or no toxicity to highly toxic."
Beyond improving forecasting of HABs, this research will contribute to the rapidly emerging U.S. Integrated Ocean Observing System (IOOS) by adding a new way to make coastal ocean observations.
Adapted from materials provided by National Oceanic and Atmospheric Administration.

Monday, July 13, 2009

Tremors On Southern San Andreas Fault May Mean Increased Earthquake Risk


ScienceDaily (July 13, 2009) — Increases in mysterious underground tremors observed in several active earthquake fault zones around the world could signal a build-up of stress at locked segments of the faults and presumably an increased likelihood of a major quake, according to a new University of California, Berkeley, study.
Seismologist Robert M. Nadeau and graduate student Aurélie Guilhem of UC Berkeley draw these conclusions from a study of tremors along a heavily instrumented segment of the San Andreas Fault near Parkfield, Calif. The research is reported in the July 10 issue of Science.
They found that after the 6.5-magnitude San Simeon quake in 2003 and the 6.0-magnitude Parkfield quake in 2004, underground stress increased at the end of a locked segment of the San Andreas Fault near Cholame, Calif., at the same time as tremors became more frequent. The tremors have continued to this day at a rate significantly higher than the rate before the two quakes.
The researchers conclude that the increased rate of tremors may indicate that stress is accumulating more rapidly than in the past along this segment of the San Andreas Fault, which is at risk of breaking as it did in 1857, when it produced the great magnitude 7.8 Fort Tejon earthquake. Strong quakes have also occurred just to the northwest along the Parkfield segment of the San Andreas about every 20 to 30 years.
"We've shown that earthquakes can stimulate tremors next to a locked zone, but we don't yet have evidence that this tells us anything about future quakes," Nadeau said. "But if earthquakes trigger tremors, the pressure that stimulates tremors may also stimulate earthquakes."
While earthquakes are brief events originating, typically, no deeper than 15 kilometers (10 miles) underground in California, tremors are an ongoing, low-level rumbling from perhaps 15 to 30 kilometers (10-20 miles) below the surface. They are common near volcanoes as a result of underground fluid movement, but were a surprise when discovered in 2002 at a subduction zone in Japan, a region where a piece of ocean floor is sliding under a continent.
Tremors were subsequently detected at the Cascadia subduction zone in Washington, Oregon and British Columbia, where several Pacific Ocean plates dive under the North American continental plate. In 2005, Nadeau identified mysterious "noise" detected by the Parkfield borehole seismometers as tremor activity, and he has focused on the tremors ever since. Unlike the Japanese and Cascadia tremor sites, however, the Parkfield area is a strike-slip fault, where the Pacific plate is moving horizontally against the North American plate.
"The Parkfield tremors are smaller versions of the Cascadia and Japanese tremors," Nadeau said. "Most last between three and 21 minutes, while some Cascadia tremors go on for days."
Because in nearly all known instances the tremors originate from the edge of a locked zone -- a segment of a fault that hasn't moved in years and is at high risk of a major earthquake -- seismologists have thought that increases in their activity may forewarn of stress build-up just before an earthquake.
The new report strengthens that association, Nadeau said.
For the new study, Nadeau and Guilhem pinpointed the location of nearly 2,200 tremors recorded between 2001 and 2009 by borehole seismometers implanted along the San Andreas Fault as part of UC Berkeley's High-Resolution Seismic Network. During this period, two nearby earthquakes occurred: one in San Simeon, 60 kilometers from Parkfield, on Dec. 22, 2003, and one in Parkfield on the San Andreas Fault on Sept. 28, 2004.
Before the San Simeon quake, tremor activity was low beneath the Parkfield and Cholame segments of the San Andreas Fault, but it doubled in frequency afterward and was six times more frequent after the Parkfield quake. Most of the activity occurred along a 25-kilometer (16-mile) segment of the San Andreas Fault south of Parkfield, around the town of Cholame. Fewer than 10 percent of the tremors occurred at an equal distance above Parkfield, near Monarch Peak. While Cholame is at the northern end of a long-locked and hazardous segment of the San Andreas Fault, Monarch Peak is not. However, Nadeau noted, Monarch Peak is an area of relative complexity on the San Andreas Fault and also ruptured in 1857 in the Fort Tejon 7.8 earthquake.
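The rate comparison behind such statements is simple bookkeeping: count tremors per day in windows before and after a candidate triggering quake. The Python sketch below shows the idea with an invented tremor catalog; the San Simeon date is real, but the tremor times and windows are not from the study.

```python
from datetime import datetime

san_simeon = datetime(2003, 12, 22)   # magnitude 6.5 San Simeon quake

def rate(times, start, end):
    """Tremor episodes per day within [start, end)."""
    n = sum(start <= t < end for t in times)
    return n / ((end - start).days or 1)

# An invented catalog; real times would come from borehole seismometers.
tremor_times = (
    [datetime(2003, m, d) for m in (10, 11) for d in (3, 9, 17, 25)]
    + [datetime(2004, m, d) for m in (1, 2) for d in (2, 5, 9, 13, 18, 21, 26, 28)]
)

before = rate(tremor_times, datetime(2003, 10, 1), san_simeon)
after = rate(tremor_times, san_simeon, datetime(2004, 3, 1))
print(f"before: {before:.3f}/day  after: {after:.3f}/day  ratio: {after / before:.1f}")
```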
The tremor activity remains about twice as high today as before the San Simeon quake, while periodic peaks of activity have emerged that started to repeat about every 50 days and are now repeating about every 100-110 days.
"What's surprising is that the activity has not gone down to its old level," Nadeau said. The continued activity is worrisome because of the history of major quakes along this segment of the fault, and the long-ago Fort Tejon quake, which ruptured southward from Monarch Peak along 350 kilometers (220 miles) of the San Andreas Fault.
A flurry of pre-tremors was detected a few days before the Parkfield quake, which makes Nadeau hopeful of seeing similar tremors preceding future quakes.
He noted that the source of tremors is still somewhat of a mystery. Some scientists think fluids moving underground generate the tremors, just as movement of underground magma, water and gas causes volcanic tremors. Nadeau leans more toward an alternative theory: that non-volcanic tremors are generated in a deep region of hot, soft rock that normally flows without generating earthquakes -- somewhat like Silly Putty -- except for a few hard rocks embedded in it like peanuts in brittle. The fracturing of these brittle inclusions may generate swarms of many small quakes that combine into a faint rumble.
"If tremors are composed of a lot of little earthquakes, each should have a primary and secondary wave just like large quakes," but they would overlap and produce a rumble, said Guilhem.
The stimulation of tremors by shear (tearing) stress rather than by compressional (opening and closing) stress is more consistent with deformation in the fault zone than with underground fluid movement, Nadeau said. The researchers' mapping of the underground tremors also shows that the tremors are not restricted to the plane of the fault, suggesting that faults spread out as they dive into the deeper crust.
Whatever their cause, tremors "are not relieving a lot of stress or making the fault less hazardous; they just indicate changes in stress next to locked faults," said Nadeau.
Seismologists around the world are searching for tremors along other fault systems, Guilhem noted, although tremors can be hard to detect because of noise from oceans as well as from civilization. Brief tremor activity has been observed on a few faults, triggered by huge quakes far away, and these may be areas to focus on. Tremors were triggered on Northern California's Calaveras Fault by Alaska's Denali quake of 2002, Nadeau said.
The work is supported by the U.S. Geological Survey and the National Science Foundation.
Adapted from materials provided by University of California - Berkeley.