Sunday, April 27, 2008

Scientists Reveal Presence Of Ocean Current 'Stripes'


Source:

ScienceDaily (Apr. 26, 2008) — More than 20 years of continuous measurements and a dose of "belief" have yielded the discovery of subtle ocean currents that could dramatically improve forecasts of climate and ecosystem changes. An international team of scientists led by Peter Niiler, a physical oceanographer at Scripps Institution of Oceanography, UC San Diego, and Nikolai Maximenko, a researcher at the International Pacific Research Center, University of Hawaii, has detected crisscrossing patterns of currents running throughout the world's oceans. The new data could help scientists significantly improve the high-resolution models used to understand trends in climate and marine ecosystems.
The basic dimensions of these steady patterns, called striations, have slowly been revealed over the course of several research papers by Niiler, Maximenko and colleagues. An analysis by Maximenko, Niiler and colleagues appearing today in the journal Geophysical Research Letters has produced the clearest representation to date of these striated patterns in the eastern Pacific Ocean and revealed that the complex patterns of currents extend from the surface to depths of at least 700 meters (2,300 feet). The discovery of similarly detailed patterns around the world is expected to emerge from future research.
Niiler credits the long-term, comprehensive ocean current measurements made over more than 20 years by the Global Drifter Program, now a network of more than 1,300 drifting buoys designed by him and administered by the National Oceanic and Atmospheric Administration (NOAA), with making it possible to detect these new current patterns on a global basis. Niiler added that the foresight of the University of California in providing long-term support to scientists was crucial to the discovery.
"I'm most grateful to the University of California for helping to support the invention and the 20-year maintenance of a comprehensive program of ocean circulation measurements," he said. "Scripps Institution of Oceanography is unique because of its commitment to long-term observations of the climate. Instrumental measurements of the ocean are fundamental to the definition of the state of the climate today and improvement of its prediction into the future."
In portions of the Southern Ocean, these striations, also known as ocean fronts, produce alternating eastward and westward accelerations of circulation, and portions of them nearly circumnavigate Antarctica. These striations also delineate the ocean regions where uptake of carbon dioxide is greatest. In the Atlantic Ocean, the flows bear a strong association to the Azores Current, along which water flowing south from the North Atlantic circulation is subducted. This spatially high-resolution view of the linkage between the striations and the larger-scale patterns of currents could improve predictions of ocean temperatures and hurricane paths.
In addition, the striations are connected to important ecosystems like the California and Peru-Chile current systems. Off California, the striations are linked to the steady east-west displacements, or meanders, of the California Current, a major flow that runs from the border of Washington and Oregon to the southern tip of Baja California. The striations run nearly perpendicular to the California Current and continue southwestward to the Hawaiian Islands.
Niiler said there are a number of scientists who have theorized the existence of striations in the ocean. He was the first to formulate such a theory as a postdoctoral researcher at Harvard University in 1965. Niiler's theory today is that the steady-state striations in the eastern North Pacific are caused by the angular momentum of the swirling eddies within the California Current System.
A worldwide crisscrossing pattern of ocean current striations has been revealed through measurements made by drifting buoys over a period of more than 20 years and through satellite readings of ocean velocity. Blue bands represent westward-flowing currents and red bands indicate eastward-flowing currents that move at roughly 1 centimeter per second. Image courtesy of Nikolai Maximenko, University of Hawaii.
The new maps of ocean circulation produced by a combination of drifter and satellite measurements will eventually be the yardstick for judging the accuracy of the circulation patterns portrayed by climate and ocean ecosystem models, a major deficiency in current simulations, and for generating substantially more reliable forecast products for climate and ecosystem management. Niiler noted, for example, that a large number of computer models can simulate equatorial currents but fail to accurately simulate the meandering flow of the California Current and the striations that emanate from it.
"I think this research presents the next challenge in ocean modeling," said Niiler. "I'm looking forward to the day when we can correctly portray most ocean circulation systems with all climate and ecosystem models."
Maximenko said the clear resolution of the subtle striations would not have been possible without the use of data from both the drifters and satellites.
"Our finding was so unbelievable that our first proposal submitted to the National Science Foundation failed miserably because most reviewers said 'You cannot study what does not exist,'" Maximenko said. "The striations are like ghosts. To see them one needs to believe in them. No doubt, armed with our hint, scientists will start finding all kinds of striations all around the world."
Maximenko, Niiler and their international colleagues are now writing a series of papers that reveal new details about the crisscross patterns and their ties to currents such as the Kuroshio, which flows in western Pacific Ocean waters near Japan.
NOAA, the National Science Foundation, the NASA Ocean Surface Topography Team, and the Japan Agency for Marine-Earth Science and Technology supported the research.
Adapted from materials provided by University of California - San Diego.

Fausto Intilla - www.oloscience.com

Northern Lights Glimmer With Unexpected Trait


Source:

ScienceDaily (Apr. 26, 2008) — An international team of scientists has detected that some of the glow of Earth's aurora is polarized, an unexpected state for such emissions. Measurements of this newfound polarization in the Northern Lights may provide scientists with fresh insights into the composition of Earth's upper atmosphere, the configuration of its magnetic field, and the energies of particles from the Sun, the researchers say.
If observed on other planets, the phenomenon might also give clues to the shape of the Sun's magnetic field as it curls around other bodies in the solar system.
When a beam of light is polarized, its electromagnetic waves share a common orientation, say, aligned vertically, or at some other angle. Until now, scientists thought that light from energized atoms and molecules in planetary upper atmospheres could not be polarized. The reason is simple: despite the low number of particles at the altitudes concerned, above 100 kilometers (60 miles), there are still numerous collisions between molecules and gas atoms, and those collisions depolarize the emitted light.
Fifty years ago, an Australian researcher, Robert Duncan, claimed to observe what looked like polarization of auroral light, but other scientists found that single observation unconvincing.
To revisit the question, Jean Lilensten of the Laboratory of Planetology of Grenoble, France, and his colleagues studied auroral light with a custom-made telescope during the winters of 2006-2007 and 2007-2008. They made their observations from Svalbard, Norway, in the polar region at a latitude of 79° north.
At the north and south magnetic poles, many charged particles in the solar wind, a flow of electrically charged matter from the Sun, are captured by the planet's field and forced to plunge into the atmosphere. The particles strike atmospheric gases, causing light emissions.
Lilensten and his colleagues observed weak polarization of a red glow that radiates at an altitude of 220 kilometers (140 miles). The glow results from electrons hitting oxygen atoms. The scientists had suspected that such light might be polarized because Earth's magnetic field at high latitudes funnels the electrons, aligning the angles at which they penetrate the atmosphere.
The finding of auroral polarization "opens a new field in planetology," says Lilensten, who is the lead author of the study. He and his colleagues reported their results on 19 April in Geophysical Research Letters, a publication of the American Geophysical Union, or AGU.
Fluctuations in the polarization measurements can reveal the energy of the particles coming from the Sun when they enter Earth's atmosphere, Lilensten notes. The intensity of the polarization gives clues to the composition of the upper atmosphere, particularly with regard to atomic oxygen.
Because polarization is strongest when the telescope points perpendicularly to the magnetic field lines, the measurements also provide a way to determine magnetic field configurations, Lilensten adds. That could prove especially useful as astronomers train their telescopes on other planetary atmospheres. If polarized emissions are observed there as well, the measurements may enable scientists to understand how the Sun's magnetic field is distorted by obstacles such as the planets Venus and Mars, which lack intrinsic magnetic fields.
Journal reference: Lilensten, J., J. Moen, M. Barthélemy, R. Thissen, C. Simon, D. A. Lorentzen, O. Dutuit, P. O. Amblard, and F. Sigernes (2008), Polarization in aurorae: A new dimension for space environments studies, Geophys. Res. Lett., 35, L08804, doi:10.1029/2007GL033006.
Adapted from materials provided by American Geophysical Union.

Fausto Intilla - www.oloscience.com

Friday, April 25, 2008

Mystery Of Ancient Supercontinent's Demise Revealed


Source:
ScienceDaily (Apr. 24, 2008) — In a paper published in Geophysical Journal International, Dr Graeme Eagles from the Earth Sciences Department at Royal Holloway, University of London, reveals how one of the largest continents ever to exist met its demise.
Gondwana was a ‘supercontinent’ that existed between 500 and 180 million years ago. For the past four decades, geologists have debated how Gondwana eventually broke up, developing a multitude of scenarios that can be loosely grouped into two schools of thought – one theory claiming the continent separated into many small plates, and a second theory claiming it broke into just a few large pieces. Dr Eagles, working with Dr Matthias König from the Alfred Wegener Institute for Polar and Marine Research in Bremerhaven, Germany, has devised a new computer model showing that the supercontinent, too heavy to hold itself together, cracked into two pieces.
Gondwana comprised most of the landmasses in today’s Southern Hemisphere, including Antarctica, South America, Africa, Madagascar, Australia-New Guinea, and New Zealand, as well as Arabia and the Indian subcontinent in the Northern Hemisphere. Between around 250 and 180 million years ago, it formed part of the single supercontinent ‘Pangea’.
Evidence suggests that Gondwana began to break up around 183 million years ago. Analysing magnetic and gravity anomaly data from some of Gondwana’s first cracking points – fracture zones in the Mozambique Basin and the Riiser-Larsen Sea off Antarctica – Dr Eagles and Dr König reconstructed the paths that each part of Gondwana took as it broke apart. The computer model reveals that the supercontinent divided into just two large, eastern and western plates. Approximately 30 million years later, these two plates started to split to form the familiar continents of today’s Southern Hemisphere.
‘You could say that the process is ongoing as Africa is currently splitting in two along the East African Rift,’ says Dr Eagles. ‘The previously held view of Gondwana initially breaking up into many different pieces was unnecessarily complicated. It gave fuel to the theory that a plume of hot mantle, about 2,000 to 3,000 kilometres wide, began the splitting process. A straightforward split takes the spotlight off plumes as active agents in the supercontinent’s breakup, because the small number of plates involved resembles the pattern of plate tectonics in the rest of Earth’s history, during which plumes have played bit parts.’
According to Dr Eagles and Dr König’s study, because supercontinents like Gondwana are gravitationally unstable to begin with, and have very thick crusts in comparison to oceans, they eventually start to collapse under their own weight.
Says Dr Eagles, ‘These findings are a starting point from which more accurate and careful research can be made on the supercontinent. The new model challenges the positions of India and Sri Lanka in Gondwana which have been widely used for the past 40 years, assigning them very different positions in the supercontinent. These differences have major consequences for our understanding of Earth.’
Adapted from materials provided by Royal Astronomical Society.
Fausto Intilla - www.oloscience.com

Monday, April 21, 2008

Extreme Ocean Storms Have Become More Frequent Over Past Three Decades, Study Of Tiny Tremors Shows


Source:
ScienceDaily (Apr. 21, 2008) — Data from faint earth tremors caused by wind-driven ocean waves -- often dismissed as "background noise" at seismographic stations around the world -- suggest extreme ocean storms have become more frequent over the past three decades. The Intergovernmental Panel on Climate Change (IPCC) and other prominent researchers have predicted that stronger and more frequent storms may occur as a result of global warming trends. The tiny tremors, or microseisms, offer a new way to discover whether these predictions are already coming true, said Richard Aster, a geophysics professor at the New Mexico Institute of Mining and Technology.
Unceasing as the ocean waves that trigger them, the microseisms show up as five- to 30-second oscillations of Earth's surface at seismographic stations around the world. Even seismic monitoring stations "in the middle of a continent are sensitive to the waves crashing all around the continent," Aster said.
As storm winds drive ocean waves higher, the microseism signals increase in amplitude as well, offering a unique way to track storm intensities across seasons, over time, and at different geographical locations. For instance, Aster and colleagues Daniel McNamara from the U.S. Geological Survey and Peter Bromirski of the Scripps Institution of Oceanography recently published an analysis in the Seismological Society of America journal Seismological Research Letters showing that microseism data collected around the Pacific Basin and throughout the world could be used to detect and quantify wave activity from multi-year events such as the El Niño and La Niña ocean disruptions.
The findings spurred them to look for a microseism signal that would reveal whether extreme storms were becoming more common in a warming world. In fact, they saw "a remarkable thing" in the worldwide microseism data collected from 1972 to 2008, Aster recalled: at all 22 stations included in the study, the number of extreme storm events had increased over time.
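As a rough illustration of this kind of analysis (not the authors' actual method), the sketch below counts "extreme" days per year as those whose microseism amplitude exceeds a high percentile of a station's long-term record, then fits a linear trend to the yearly counts; the amplitude data here are synthetic stand-ins.

```python
import numpy as np

# Illustrative sketch only: flag "extreme" days from microseism amplitudes
# and look for a long-term trend in their yearly counts.
rng = np.random.default_rng(0)

# Synthetic stand-in for daily microseism amplitudes at one station, 1972-2008.
years = np.arange(1972, 2009)
amplitudes = {year: rng.lognormal(mean=0.0, sigma=0.5, size=365) for year in years}

# Pool all days to define the "extreme" threshold (here the 99th percentile).
pooled = np.concatenate(list(amplitudes.values()))
threshold = np.percentile(pooled, 99)

# Days above threshold per year, and the linear trend in that count.
counts = np.array([(amplitudes[year] > threshold).sum() for year in years])
slope = np.polyfit(years, counts, 1)[0]
print(f"threshold={threshold:.2f}, trend={slope:+.3f} extreme days per year")
```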
While the work on evaluating changes in extreme storms is "still very much in its early stages," Aster is "hoping that the study will offer a much more global look" at the effects of climate change on extreme storms and the wind-driven waves they produce. At the moment, most of the evidence linking the two comes from studies of hurricane intensity and shoreline erosion in specific regions such as the Pacific Northwest and the Gulf of Mexico, he noted.
The researchers are also working on recovering and digitizing older microseism records, potentially creating a data set that stretches back to the 1930s. Aster praised the work of the long-term observatories that have collected the records, calling them a good example of the "Cinderella science", unloved and overlooked, that often supports significant discoveries.
"It's absolutely great data on the state of the planet. We took a prosaic time series, and found something very interesting in it," he said.
The presentation of "Microseism-Based Climate Monitoring" was made in the session: Models, Methods, and Measurements: Seismic Monitoring Research on April 17, 2008 at the annual meeting of the Seismological Society of America. Authors include Aster, R. New Mexico Institute of Mining and Technology; McNamara, D., U.S. Geological Survey in Golden, CO; Bromirski, P., Scripps Institution of Oceanography; and Gee, L., and Hutt, C.R., U.S. Geological Survey in Albuquerque, NM.
Adapted from materials provided by Seismological Society of America, via EurekAlert!, a service of AAAS.
Fausto Intilla - www.oloscience.com

Sunday, April 20, 2008

Greenland Ice May Not Be Headed Down Too Slippery A Slope, But Stability Still Far From Assured


Source:
ScienceDaily (Apr. 20, 2008) — Lubricating meltwater that makes its way from the surface down to where a glacier meets bedrock turns out to be only a minor reason why Greenland's outlet glaciers accelerated their race to the sea 50 to 100 percent in the 1990s and early 2000s, according to University of Washington's Ian Joughin and Woods Hole Oceanographic Institution's Sarah Das. The two are lead co-authors of two papers posted April 18 on the Science Express website.
The report also shows that surface meltwater is reaching bedrock farther inland under the Greenland Ice Sheet, something scientists had speculated was happening but had little evidence for.
"Considered together, the new findings indicate that while surface melt plays a substantial role in ice sheet dynamics, it may not produce large instabilities leading to sea level rise," says Joughin, a glaciologist with the UW's Applied Physics Laboratory. Joughin goes on to stress that "there are still other mechanisms that are contributing to the current ice loss and likely will increase this loss as climate warms."
Outlet glaciers are rapid flows of ice that start in the Greenland Ice Sheet and extend all the way to the ocean, where their fronts break apart in the water as icebergs, a process called calving. While most of the ice sheet moves less than one-tenth of a mile per year, some outlet glaciers gallop along at 7.5 miles per year, making them a concern because of their more immediate potential to cause sea level rise.
If surface meltwater lubrication at the intersection of ice and bedrock was playing a major role in speeding up the outlet glaciers, one could imagine how global warming, which would create ever more meltwater at the surface, could cause Greenland's ice to shrink much more rapidly than expected -- even catastrophically. Glacial ice is second only to the oceans as the largest reservoir of water on the planet and 10 percent of the Earth's glacial ice is found in Greenland.
It turns out, however, that when considered over an entire year, surface meltwater was responsible for only a few percent of the movement of the six outlet glaciers monitored, says Joughin, lead author of "Seasonal Speedup along the Western Flank of the Greenland Ice Sheet." Even in the summer it appears to contribute at most 15 percent, and often considerably less, to the total annual movement of these fast-moving outlet glaciers.
Calculations were made both by digitally comparing pairs of images acquired at different times from the Canadian RADARSAT satellite and by ground-based GPS measurements in a project funded by the National Science Foundation and National Aeronautics and Space Administration.
But while surface meltwater plays an inconsequential role in the movement of outlet glaciers, meltwater is responsible for 50 to 100 percent of the summer speed up for the large stretches near the edge of the ice sheet where there are no major outlet glaciers, a finding consistent with, but somewhat larger than, earlier observations.
"What Joughin, Das and their co-authors confirm is that iceflow speed up with meltwater is a widespread occurrence, not restricted to the one site where previously observed. But, they also show that the really fast-moving ice doesn't speed up very much with this. So we can expect the ice sheet in a warming world to shrink somewhat faster than previously expected, but this mechanism will not cause greatly faster shrinkage," says Richard Alley, professor of geosciences at Pennsylvania State University, who is not connected with the papers.
So what's behind the speedup of Greenland's outlet glaciers? Joughin says he thinks what's considerably more significant is when outlet glaciers lose large areas of ice at their seaward ends through increased calving, which may be affected by warmer temperatures. He's studied glaciers such as Jakobshavn Isbrae, one of Greenland's fastest-moving glaciers, and says that as ice calves and icebergs float away it is like removing a dam, allowing ice farther uphill to stream through to the ocean more quickly. At present, iceberg calving accounts for approximately 50 percent of the ice loss of Greenland, much of which is balanced by snowfall each winter. Several other studies recently have shown that the loss from calving is increasing, contributing at present rates to a rise in sea level of 1 to 2 inches per century.
"We don't yet know what warming temperatures means for increased calving of icebergs from the fronts of these outlet glaciers," Joughin says.
Until now scientists have only speculated if, and how, surface meltwater might make it to bedrock from high atop the Greenland Ice Sheet, which is a half-mile or more thick in places. The paper "Fracture Propagation to the Base of the Greenland Ice Sheet During Supraglacial Lake Drainage," with Woods Hole Oceanographic Institution's glaciologist Das as lead author, presents evidence of how a lake that disappeared from the surface of the inland ice sheet generated so much pressure and cracking that the water made it to bedrock in spite of more than half a mile of ice.
The glacial lake described in the paper was 2 to 2½ miles across at its widest point and 40 feet deep. Researchers installed monitoring instruments and, 10 days after they left the area, a large fracture developed, a crack spanning nearly the full length of the lake. The lake drained in 90 minutes with a fury comparable to that of Niagara Falls. (The researchers were ever so glad they hadn't been on the lake in their 10-foot boat with its 5-horsepower engine, and they don't plan future instrument deployments when the lakes are full of water. They'll get them in place only when the lakes are dry.)
Measurements after the event suggest there's an efficient drainage system under the ice sheet that dispersed the meltwater widely. The draining of multiple such lakes could explain the observed net regional summer ice speedup, the authors write.
Along with Das and Joughin other authors on the two papers are Matt King, Newcastle University, UK; Ben Smith, Ian Howat (now at Ohio State) and Twila Moon of the UW's Applied Physics Laboratory; Mark Behn and Dan Lizarralde of Woods Hole Oceanographic Institution; and Maya Bhatia, Massachusetts Institute of Technology/WHOI Joint Program.
Adapted from materials provided by University of Washington.
Fausto Intilla - www.oloscience.com

Saturday, April 19, 2008

Climate Change Likely To Intensify Storms, New Study Confirms


Source:
ScienceDaily (Apr. 19, 2008) — Hurricanes in some areas, including the North Atlantic, are likely to become more intense as a result of global warming even though the number of such storms worldwide may decline, according to a new study by MIT researchers.
Kerry Emanuel, the lead author of the new study, wrote a paper in 2005 reporting an apparent link between a warming climate and an increase in hurricane intensity. That paper attracted worldwide attention because it was published in Nature just three weeks before Hurricane Katrina slammed into New Orleans.
Emanuel, a professor of atmospheric science in MIT's Department of Earth, Atmospheric and Planetary Sciences, says the new research provides an independent validation of the earlier results, using a completely different approach. The paper was co-authored by postdoctoral fellow Ragoth Sundararajan and graduate student John Williams and recently appeared in the Bulletin of the American Meteorological Society.
While the earlier study was based entirely on historical records of past hurricanes, showing nearly a doubling in the intensity of Atlantic storms over the last 30 years, the new work is purely theoretical. It made use of a new technique to add finer-scale detail to computer simulations called Global Circulation Models, which are the basis for most projections of future climate change.
"It strongly confirms, independently, the results in the Nature paper," Emanuel said. "This is a completely independent analysis and comes up with very consistent results."
Worldwide, both methods show an increase in the intensity and duration of tropical cyclones, the generic name for what are known as hurricanes in the North Atlantic. But the new work shows no clear change in the overall numbers of such storms when run on future climates predicted using global climate models.
However, Emanuel says, the new work also raises some questions that remain to be understood. When projected into the future, the model shows a continuing increase in power, "but a lot less than the factor of two that we've already seen," he says. "So we have a paradox that remains to be explained."
There are several possibilities, Emanuel says. "The last 25 years' increase may have little to do with global warming, or the models may have missed something about how nature responds to the increase in carbon dioxide."
Another possibility is that the recent hurricane increase is related to the fast pace of increase in temperature. The computer models in this study, he explains, show what happens after the atmosphere has stabilized at new, much higher CO2 concentrations. "That's very different from the process now, when it's rapidly changing," he says.
In the many different computer runs with different models and different conditions, "the fact is, the results are all over the place," Emanuel says. But that doesn't mean that one can't learn from them. And there is one conclusion that's clearly not consistent with these results, he said: "The idea that there is no connection between hurricanes and global warming, that's not supported," he says.
The work was partly funded by the National Science Foundation.
Adapted from materials provided by Massachusetts Institute Of Technology.
Fausto Intilla - www.oloscience.com

Global Land Temperature Warmest On Record In March 2008


Source:
ScienceDaily (Apr. 19, 2008) — The average global land temperature last month was the warmest on record and ocean surface temperatures were the 13th warmest. Combining the land and the ocean temperatures, the overall global temperature ranked the second warmest for the month of March. Global temperature averages have been recorded since 1880.
An analysis by NOAA’s National Climatic Data Center shows that the average temperature for March in the contiguous United States ranked near average for the past 113 years. It was the 63rd warmest March since record-keeping began in the United States in 1895.
Global Highlights
The global land surface temperature was the warmest on record for March, 3.3°F above the 20th century mean of 40.8°F. Temperatures more than 8°F above average covered much of the Asian continent. Two months after the greatest January snow cover extent on record on the Eurasian continent, the unusually warm temperatures led to rapid snow melt, and March snow cover extent on the Eurasian continent was the lowest on record.
The global surface (land and ocean surface) temperature was the second warmest on record for March in the 129-year record, 1.28°F above the 20th century mean of 54.9°F. The warmest March on record (1.33°F above average) occurred in 2002.
Although the ocean surface average was only the 13th warmest on record, as the cooling influence of La Niña in the tropical Pacific continued, much warmer than average conditions across large parts of Eurasia helped push the global average to a near record high for March.
Despite above average snowpack levels in the U.S., the total Northern Hemisphere snow cover extent was the fourth lowest on record for March, remaining consistent with boreal spring conditions of the past two decades, in which warming temperatures have contributed to anomalously low snow cover extent.
Some weakening of La Niña, the cold phase of the El Niño-Southern Oscillation, occurred in March, but moderate La Niña conditions remained across the tropical Pacific Ocean.
U.S. Temperature Highlights
In the contiguous United States, the average temperature for March was 42°F, which was 0.4°F below the 20th century mean, ranking it as the 63rd warmest March on record, based on preliminary data.
Only Rhode Island, New Mexico and Arizona were warmer than average, while near-average temperatures occurred in 39 other states. The monthly temperature for Alaska was the 17th warmest on record, with an average temperature 3.8°F above the 1971-2000 mean.
The broad area of near-average temperatures kept the nation’s overall temperature-related residential energy demand for March near average, based on NOAA’s Residential Energy Demand Temperature Index.
U.S. Precipitation Highlights
Snowpack conditions dropped in many parts of the West in March, but in general, heavy snowfall during December-February left the western snowpack among the healthiest in more than a decade, with most locations near to above average.
Nine states from Oklahoma to Vermont were much wetter than average, with Missouri experiencing its second wettest March on record. Much of the month’s precipitation fell March 17-20, when an intense storm system moved slowly from the southern Plains through the southern Midwest.
Rainfall amounts in a 48-hour period totaled 13.84 inches in Cape Girardeau, Mo., and 12.32 inches in Jackson, Mo. The heavy rainfall combined with previously saturated ground resulted in widespread major flooding of rivers and streams from the Missouri Ozarks eastward into southern Indiana.
From March 7-9, eight to 12 inches of snow fell from Louisville, Ky., to central Ohio. In Columbus, an all-time greatest 24-hour snowfall of 15.5 inches broke the old record of 12.3 inches set on April 4, 1987.
In the Southeast, a powerful tornado moved through downtown Atlanta on March 14, causing significant damage to many buildings. This was one of 90 tornado reports from the Southeast in March.
Rainfall in the middle of March improved drought conditions in much of the Southeast, but moderate-to-extreme drought still remained in more than 59 percent of the region.
In the western U.S., the weather pattern in March bore a greater resemblance to a typical La Niña, with especially dry conditions across Utah, Arizona, Nevada, and California. March was extremely dry in much of California, tying as the driest in 68 years at the Sacramento airport with 0.05 inches, a 2.75 inch departure from average.
Adapted from materials provided by National Oceanic And Atmospheric Administration.
Fausto Intilla - www.oloscience.com

Friday, April 18, 2008

Millions Of Pounds Of Trash Found On Ocean Beaches


Source:
ScienceDaily (Apr. 18, 2008) — Ocean Conservancy released its annual report on trash in the ocean with new data from the 2007 International Coastal Cleanup, the most comprehensive snapshot of the harmful impacts of marine debris. The mission of Ocean Conservancy’s International Coastal Cleanup is to engage people to remove trash from the world’s beaches and waterways, to identify the sources of debris, and to change the behaviors that cause pollution.
This year, more than 378,000 volunteers participated in cleanups around every major body of water on the globe. Volunteers record the trash found on land and underwater, giving Ocean Conservancy a global snapshot of the problem.
"Our ocean is sick," says Laura Capps, Senior Vice President at Ocean Conservancy. "And the plain truth is that our ocean ecosystem cannot protect us unless it is healthy and resilient. Harmful impacts like trash in the ocean, pollution, climate change, and habitat destruction are taking its toll. But the good news is that hundreds of thousands of people from around the world are starting a sea change by joining together to clean up the ocean. Trash doesn’t’ fall from the sky it falls from people’s hands. With the International Coastal Cleanup, everyone has an opportunity to make a difference, not just on one day but all year long."
Trash in the ocean kills more than one million seabirds and 100,000 marine mammals and turtles each year through ingestion and entanglement. This year, volunteers found 81 birds, 63 fish, 49 invertebrates, 30 mammals, 11 reptiles and one amphibian entangled in debris. The items animals were entangled in or had ingested include plastic bags, fishing line, fishing nets, six-pack holders, string from balloons or kites, glass bottles and cans.
Entanglement in Ocean Trash
Wraps around flippers, causing circulation loss and amputations
Creates wounds and cuts, leading to bacterial infections
Slows animals' ability to swim, making them more vulnerable to predators
Smothers or traps animals, causing them to drown
Causes starvation when animals can no longer eat or feed their young
Ingestion of Ocean Trash
Leads to starvation by blocking digestive tracts
Provides a false sense of being full once swallowed, which leads to starvation
Sharp objects like metal or glass perforate the stomach, causing internal bleeding
Objects lodge in animals' windpipes, cutting off airflow and causing suffocation
How Long Does It Take for Trash in the Ocean to Decompose?
A tin can that entered the ocean in 1986 will still be decomposing in 2036 (50 years later)
A plastic bottle that entered the ocean in 1986 will still be decomposing in 2436 (450 years later)
A glass bottle that entered the ocean in 1986 will still be decomposing in the year 1,001,986 (one million years later)
Prevention is the real solution to trash in the ocean. The International Coastal Cleanup volunteers make ocean conservation an everyday priority. Since 1986, more than six million volunteers have removed 116,000,000 pounds of debris across 211,460 miles of shoreline in 127 nations. The 23rd annual Flagship International Coastal Cleanup will be held Sept. 20, 2008.
Adapted from materials provided by Ocean Conservancy.
Fausto Intilla - www.oloscience.com

Ice Sheet 'Plumbing System' Found: Lakes Of Meltwater Can Crack Greenland's Ice And Contribute To Faster Ice Sheet Flow


Source:
ScienceDaily (Apr. 18, 2008) — Researchers from the Woods Hole Oceanographic Institution (WHOI) and the University of Washington (UW) have for the first time documented the sudden and complete drainage of a lake of meltwater from the top of the Greenland ice sheet to its base.
From those observations, scientists have uncovered a plumbing system for the ice sheet, where meltwater can penetrate thick, cold ice and accelerate some of the large-scale summer movements of the ice sheet.
According to research by glaciologists Sarah Das of WHOI and Ian Joughin of UW, the lubricating effect of the meltwater can accelerate ice flow 50 to 100 percent in some of the broad, slow-moving areas of the ice sheet.
“We found clear evidence that supraglacial lakes—the pools of meltwater that form on the surface in summer—can actually drive a crack through the ice sheet in a process called hydrofracture,” said Das, an assistant scientist in the WHOI Department of Geology and Geophysics. “If there is a crack or defect in the surface that is large enough, and a sufficient reservoir of water to keep that crack filled, it can create a conduit all the way down to the bed of the ice sheet.”
But the results from Das and Joughin also show that while surface melt plays a significant role in overall ice sheet dynamics, it has a more subdued influence on the fast-moving outlet glaciers (which discharge ice to the ocean) than has frequently been hypothesized. (To learn more about this result, read the corresponding news release from UW.)
The research by Das and Joughin was compiled into two complementary papers and published on April 17 in the online journal Science Express. The papers will be printed in the journal Science on May 9.
Co-authors of the work include Mark Behn, Dan Lizarralde, and Maya Bhatia of WHOI; Ian Howat, Twila Moon, and Ben Smith of UW; and Matt King of Newcastle University.
Thousands of lakes form on top of Greenland’s glaciers every summer, as sunlight and warm air melt ice on the surface. Past satellite observations have shown that these supraglacial lakes can disappear in as little as a day, but scientists did not know where the water was going or how quickly, nor the impact on ice flow.
Researchers have hypothesized that meltwater from the surface of Greenland’s ice sheet might be lubricating the base. But until now, there were only theoretical predictions of how the meltwater could reach the base through a kilometer of subfreezing ice.
“We set out to examine whether the melting at the surface—which is sensitive to climate change—could influence how fast the ice can flow,” Das said. “To influence flow, you have to change the conditions underneath the ice sheet, because what’s going on beneath the ice dictates how quickly the ice is flowing."
"If the ice sheet is frozen to the bedrock or has very little water available," Das added, "then it will flow much more slowly than if it has a lubricating and pressurized layer of water underneath to reduce friction.”
In the summers of 2006 and 2007, Das, Joughin, and colleagues used seismic instruments, water-level monitors, and Global Positioning System sensors to closely monitor the evolution of two lakes and the motion of the surrounding ice sheet. They also used helicopter and airplane surveys and satellite imagery to monitor the lakes and to track the progress of glaciers moving toward the coast.
The most spectacular observations occurred in July 2006 when their instruments captured the sudden, complete draining of a lake that had once covered 5.6 square kilometers (2.2 square miles) of the surface and held 0.044 cubic kilometers (11.6 billion gallons) of water.
Like a draining bathtub, the entire lake emptied from the bottom in 24 hours, with the majority of the water flowing out in a 90-minute span. The maximum drainage rate was faster than the average flow rate over Niagara Falls.
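A back-of-envelope check of those figures, using only the volume and time span quoted above plus an assumed average flow over Niagara Falls of roughly 2,400 cubic meters per second (a reference value not given in the article):

```python
# Back-of-envelope check of the drainage figures quoted above. The Niagara
# average flow (~2,400 m^3/s) is an assumed reference value, not from the text.
lake_volume_m3 = 0.044 * 1e9     # 0.044 cubic kilometres of meltwater
main_span_s = 90 * 60            # most of the water left in a 90-minute span

mean_rate = lake_volume_m3 / main_span_s   # upper-bound mean rate, m^3/s
niagara_avg_m3s = 2400.0                   # assumed average flow, m^3/s

print(f"mean drainage rate ~{mean_rate:,.0f} m^3/s, "
      f"about {mean_rate / niagara_avg_m3s:.1f}x Niagara's average flow")
# ~8,100 m^3/s: consistent with the claim that the maximum rate exceeded the
# average flow over Niagara Falls.
```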
Closer inspection of the data revealed that the pressure of the water from the lake split open the ice sheet from top to bottom, through 980 meters (3,200 feet) of ice. This water-driven fracture delivered meltwater directly to the base, raising the surface of the ice sheet by 1.2 meters in one location.
In the middle of the lake bottom, a 750-meter (2,400 foot) wide block of ice was raised by 6 meters (20 feet). The horizontal speed of the ice sheet--which is constantly in motion even under normal circumstances--became twice the average daily rate for that location.
“It’s hard to envision how a trickle or a pool of meltwater from the surface could cut through thick, cold ice all the way to the bed,” said Das. “For that reason, there has been a debate in the scientific community as to whether such processes could exist, even though some theoretical work has hypothesized this for decades.”
The seismic signature of the fractures, the rapid drainage, and the uplift and movement of the ice all showed that water had flowed all the way down to the bed. As cracks and crevasses form and become filled with water, the greater weight and density of the water forces the ice to crack open.
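A rough sense of the forcing involved comes from standard glaciology: because water is denser than ice, a crevasse kept full of meltwater has a water pressure at its tip that exceeds the surrounding ice pressure, and the excess grows with depth. The sketch below estimates that excess at the 980-meter depth quoted above, using textbook densities that are assumptions rather than figures from the study.

```python
# Rough estimate of the overpressure at the tip of a water-filled crack.
# Densities and g are textbook values (assumptions, not from the study).
RHO_WATER = 1000.0  # kg/m^3, fresh meltwater
RHO_ICE = 917.0     # kg/m^3, glacier ice
G = 9.81            # m/s^2

depth_m = 980.0     # ice thickness penetrated by the fracture (from the text)

# With the crack kept brim-full, water pressure at the tip exceeds the ice
# overburden by (rho_water - rho_ice) * g * depth, which drives it deeper.
overpressure_pa = (RHO_WATER - RHO_ICE) * G * depth_m
print(f"tip overpressure ~{overpressure_pa / 1e6:.2f} MPa "
      f"(~{overpressure_pa / 101325:.1f} atm)")  # ~0.80 MPa, ~7.9 atm
```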
As water pours down through these cracks, it forms moulins (cylindrical, vertical conduits) through the ice sheet that allow rapid drainage and likely remain open for the rest of the melt season.
Das, Joughin, and their field research team will be featured this summer during an online- and museum-based outreach project known as Polar Discovery. Their return research expedition to Greenland will be chronicled daily through photo essays, and the researchers will conduct several live conversations with students, educators, and museum visitors via satellite phone.
Funding for the research was provided by the National Science Foundation, the National Aeronautics and Space Administration, the WHOI Clark Arctic Research Initiative, and the WHOI Oceans and Climate Change Institute.
Adapted from materials provided by Woods Hole Oceanographic Institution.
Fausto Intilla - www.oloscience.com

Seven Months On A Drifting Ice Floe


Source:

ScienceDaily (Apr. 18, 2008) — For the first time, a German has taken part in a Russian drift expedition, exploring the atmosphere above the central Arctic during the polar night. Jürgen Graeser, a member of the Potsdam Research Unit of the Alfred Wegener Institute for Polar and Marine Research in the Helmholtz Association, has just returned home to Germany. As a member of the Russian expedition NP 35 (35th North Pole Drift Expedition), which consisted of 21 persons, he spent seven months on a drifting ice floe in the Arctic.
The 49-year-old scientific technician has gathered observational data from a region that is normally inaccessible during the Arctic winter and therefore largely unexplored. Ascents with a tethered balloon up to an altitude of 400 metres, as well as balloon-borne sensor ascents up to an altitude of 30 kilometres, provided data that will help improve existing climate models for the Arctic.
In spite of its importance for the global climate system, the Arctic is still a blank spot on the data map. Until now, continuous measurements of the atmosphere above the Arctic Ocean have been missing. "We cannot develop reliable climate scenarios without data series of high temporal and spatial resolution covering the Arctic winter. The data Jürgen Graeser obtained in the course of the NP 35 expedition are unique, and they can considerably reduce the remaining uncertainties in our climate models," said Prof. Dr. Klaus Dethloff, project leader at the Alfred Wegener Institute for Polar and Marine Research.
Russian-German co-operation
Since 1937/38, the Russian Institute for Arctic and Antarctic Research (AARI) has operated 34 Russian North Pole drift stations. In the course of the International Polar Year 2007/2008, a foreigner was allowed to take part in a drift expedition (NP 35) for the first time. Thanks to their close co-operation with the AARI, the scientists of the Potsdam Research Unit of the Alfred Wegener Institute could realize a project to study the polar atmosphere in this largely inaccessible region of the Arctic Ocean.
The expedition NP 35
From September 2007 to April 2008, the scientific technician Jürgen Graeser from the Potsdam Research Unit was a member of the NP 35 team. For seven months, the 49-year-old lived and worked together with twenty Russian colleagues on an ice floe measuring three by five kilometres. While Graeser concentrated on measuring the Arctic atmosphere, the Russian scientists investigated the top layer of the ocean, the characteristics of the sea ice, the snow coverage and the energy balance above the ice surface. Moreover, they recorded atmospheric data on temperature, moisture, wind and air pressure by means of ground stations as well as radiosonde ascents. In the course of the winter the ice floe drifted 850 kilometres in a northwesterly direction across the Arctic Ocean.
In April Jürgen Graeser was picked up from the ice floe by Polar 5, the research aircraft of the Alfred Wegener Institute. A specialised pilot, Brian Burchartz from Enterprise Airlines Oshawa, Canada, accomplished the difficult landing and take-off operation on the ice. "I experienced my stay on the ice floe as an incredible enrichment, both personally and professionally," Jürgen Graeser said. The Russian colleagues will continue their measurements until the planned evacuation of the station in September 2008.
The exploration of the atmospheric boundary layer
During the drift Jürgen Graeser explored the atmosphere above the Arctic Ocean. In order to measure the meteorological structure of the Arctic boundary layer and its temporal changes, he regularly sent up a tethered balloon filled with helium. The six sensors fixed on the tether registered temperature, air pressure, moisture and wind, and sent the data to Graeser's computer. The exchange of heat, momentum and moisture between the earth's surface and the atmosphere, which is important for the climate, takes place in the layer between the ground and an altitude of about 400 metres.
For the first time, the spatial and temporal structure of ground-level temperature inversions has been measured through the complete polar night. To evaluate and interpret the data, the scientists in Potsdam performed simulations with a regional climate model of the Arctic. Preliminary comparisons of temperature profiles measured on the ice floe with those from the regional climate model underline the importance of Jürgen Graeser's measurements: considerable deviations appear between the observed data and the model data in the region between the ground and an altitude of about 400 metres. Subsequent research in Potsdam will focus on the connection between the Arctic boundary layer and the development and tracks of low-pressure areas.
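For illustration only, the following sketch shows how a ground-level inversion might be flagged from a single tethered-balloon profile like those described above; the six altitude levels echo the sensor setup, but the temperature readings are invented.

```python
# Minimal sketch: flag a ground-level temperature inversion from a
# tethered-balloon profile. The altitudes mirror the setup described above
# (six sensors up to ~400 m); the readings themselves are invented.
profile = [
    # (altitude_m, temperature_C)
    (2, -32.5),
    (50, -30.1),
    (120, -27.8),
    (200, -26.0),
    (300, -25.4),
    (400, -25.9),
]

def inversion_top(profile):
    """Return the altitude where temperature stops increasing with height,
    or None if the lowest layer is not inverted."""
    if profile[1][1] <= profile[0][1]:
        return None  # temperature falls with height: no surface inversion
    for (z0, t0), (z1, t1) in zip(profile, profile[1:]):
        if t1 <= t0:
            return z0
    return profile[-1][0]

top = inversion_top(profile)
print(f"surface inversion up to ~{top} m" if top else "no surface inversion")
```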
The investigation of the atmosphere – ozone
Vertical high-resolution ozone data from the central Arctic are rare. To close this data gap, Jürgen Graeser regularly launched a research balloon equipped with a radiosonde and an ozone sensor. These balloons carry the sensors up to an altitude of about 30 kilometres. In the past winter, the region of the ozone layer at an altitude of about 20 kilometres was exceptionally cold, continuing the trend toward colder conditions at this altitude observed in past years. The cold conditions fostered considerable destruction of the Arctic ozone layer this past winter. The unique measurements of NP 35 will significantly help determine precisely how much of the ozone destruction is caused by human activities.
"The heavy workload of the extensive measuring program made the time on the ice floe pass extremely quickly," Jürgen Graeser said on his return. Daily life was structured by the measurements on the one hand and by the meals with the colleagues on the other. A cook was responsible for the meals of the whole team, but each overwinterer helped him with the kitchen work for one day every three weeks. This kitchen duty coincided with station duty: checking the condition of the ice floe and watching for polar bears near the station. These tasks turned out to be very important, for in the course of the winter the ice floe cracked several times, though the crevices closed again. Moreover, frequent visits by polar bears regularly caused alarm among the participants. Jürgen Graeser was able to communicate with the Potsdam colleagues via satellite telephone and to relay the current measurement data promptly.
Future projects
The long-term aim is to significantly reduce the great imprecision of present climate models in polar regions. To build models, mathematical descriptions of physical processes taking place under natural conditions are used. These so-called "parameterizations" are based on measured data, and only an excellent data base enables them to produce realistic climate simulations. In November 2008, the scientists taking part in the NP-35 project will discuss the results of their expedition at an international workshop in Potsdam. Altogether, the NP-35 project is one more significant milestone for the Potsdam atmospheric researchers. The results deliver an important basis for the international focal projects CliC (Climate and Cryosphere) and SPARC (Stratospheric Processes and their Role in Climate Change) of the World Climate Research Programme (WCRP, wcrp.wmo.int/).
Adapted from materials provided by Helmholtz Association of German Research Centres.

Fausto Intilla - www.oloscience.com

Thursday, April 17, 2008

Jet Streams Are Shifting And May Alter Paths Of Storms And Hurricanes


Source:

ScienceDaily (Apr. 17, 2008) — The Earth's jet streams, the high-altitude bands of fast winds that strongly influence the paths of storms and other weather systems, are shifting--possibly in response to global warming. Scientists at the Carnegie Institution determined that over a 23-year span from 1979 to 2001 the jet streams in both hemispheres have risen in altitude and shifted toward the poles. The jet stream in the northern hemisphere has also weakened. These changes fit the predictions of global warming models and have implications for the frequency and intensity of future storms, including hurricanes.
Cristina Archer and Ken Caldeira of the Carnegie Institution's Department of Global Ecology tracked changes in the average position and strength of jet streams using records compiled by the European Centre for Medium-Range Weather Forecasts, the National Centers for Environmental Prediction, and the National Center for Atmospheric Research. The data included outputs from weather prediction models, conventional observations from weather balloons and surface instruments, and remote observations from satellites.
Jet streams twist and turn in a wide swath that changes from day to day. The poleward shift in their average location discovered by the researchers is small, about 19 kilometers (12 miles) per decade in the northern hemisphere, but if the trend continues the impact could be significant. "The jet streams are the driving factor for weather in half of the globe," says Archer. "So, as you can imagine, changes in the jets have the potential to affect large populations and major climate systems."
Storm paths in North America are likely to shift northward as a result of the jet stream changes. Hurricanes, whose development tends to be inhibited by jet streams, may become more powerful and more frequent as the jet streams move away from the sub-tropical zones where hurricanes are born.
The observed changes are consistent with numerous other signals of global warming found in previous studies, such as the widening of the tropical belt, the cooling of the stratosphere, and the poleward shift of storm tracks. This is the first study to use observation-based datasets to examine trends in all the jet stream parameters, however.
"At this point we can't say for sure that this is the result of global warming, but I think it is," says Caldeira. "I would bet that the trend in the jet streams' positions will continue. It is something I'd put my money on."
The results are published in the April 18 Geophysical Research Letters.
Adapted from materials provided by Carnegie Institution, via EurekAlert!, a service of AAAS.

Fausto Intilla - www.oloscience.com

Worst Offenders For Carbon Dioxide Emissions: Top 20 US Counties Identified


Source:

ScienceDaily (Apr. 17, 2008) — The top twenty carbon dioxide-emitting counties in the United States have been identified by a research team led by Purdue University.
The top three counties include the cities of Houston, Los Angeles and Chicago.
Kevin Gurney, an assistant professor of earth and atmospheric science at Purdue University and leader of the carbon dioxide inventory project, which is called Vulcan, says the biggest surprise is that each region of the United States is included in the ranking.
"It shows that CO2 emissions are really spread out across the country," he says. "Texas, California, New York, Florida, New Mexico, the Midwest — Indiana, Illinois, Ohio — and Massachusetts are all listed. No region is left out of the ranking, it would seem."
The listing of the counties includes the largest city in each county. The numbers are for millions of tons of carbon emitted per year.
Harris, Texas (Houston) — 18.625 million tons of carbon per year
Los Angeles, Calif. (Los Angeles) — 18.595
Cook, Ill. (Chicago) — 13.209
Cuyahoga, Ohio (Cleveland) — 11.144
Wayne, Mich. (Detroit) — 8.270
San Juan, N.M. (Farmington) — 8.245
Santa Clara, Calif. (San Jose) — 7.995
Jefferson, Ala. (Birmingham) — 7.951
Wilcox, Ala. (Camden) — 7.615
East Baton Rouge, La. (Baton Rouge) — 7.322
Titus, Texas (Mt. Pleasant) — 7.244
Carbon, Pa. (Jim Thorpe) — 6.534
Porter, Ind. (Valparaiso) — 6.331
Jefferson, Ohio (Steubenville) — 6.278
Indiana, Pa. (Indiana) — 6.224
Middlesex, Mass. (Boston metro area) — 6.198
Bexar, Texas (San Antonio) — 6.141
Hillsborough, Fla. (Tampa) — 6.037
Suffolk, N.Y. (New York metro area) — 6.030
Clark, Nev. (Las Vegas) — 5.955
The current emissions are based on information from 2002, but the Vulcan system will soon expand to more recent years.
Gurney says Vulcan, which is named for the Roman god of fire, quantifies all of the CO2 that results from the burning of fossil fuels such as coal and gasoline. It also tracks the hourly outputs at the level of factories, power plants, roadways, neighborhoods and commercial districts.
"It's interesting that the top county, Harris, Texas, is on the list because of industrial emissions, but the second highest CO2 emitting county, Los Angels, California, is on the list because of automobile emissions," Gurney says. "So it's not just cars, and it's not just factories, that are emitting the carbon dioxide, but a combination of different things."
Gurney notes that some counties are on the list because they produce goods or power for populations of a different area.
"Counties such as Titus, Texas, Indiana, Pennsylvania, and Clark, Nevada, are dominated by large power production facilities that serve populations elsewhere," he says.
"My favorite one on the list is Carbon, Pennsylvania," Gurney adds.
The three-year project, which was funded by NASA and the U.S. Department of Energy under the North American Carbon Program, involved researchers from Purdue University, Colorado State University and Lawrence Berkeley National Laboratory.
The Vulcan data is available for anyone to download from the Web site at http://www.eas.purdue.edu/carbon/vulcan. Smaller summary data sets that offer a slice of the data and are easier to download also are available for non-scientists on the Vulcan Web site. These can be broken down into emission categories, such as industrial, residential, transportation, power producers, by fuel type, and are available by state, county, or cells as small as six miles (10 kilometers) across.
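As a hypothetical sketch of working with one of those summary slices (the file name and column names below are invented for illustration; the actual download's layout may differ):

```python
import pandas as pd

# Hypothetical sketch of summarizing a county-level slice downloaded from the
# Vulcan site. "vulcan_county_summary_2002.csv" and the column names are
# invented; check the real download for its actual layout.
df = pd.read_csv("vulcan_county_summary_2002.csv")

# Total emissions per county across sectors, converted to millions of tons of
# carbon per year, assuming columns: county, state, sector, tons_carbon.
totals = (
    df.groupby(["county", "state"])["tons_carbon"]
      .sum()
      .div(1e6)                      # tons -> millions of tons
      .sort_values(ascending=False)
)
print(totals.head(20))  # would reproduce a top-20 ranking like the one above
```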
Adapted from materials provided by Purdue University.

Fausto Intilla - www.oloscience.com

Wednesday, April 16, 2008

Ward Hunt Ice Shelf, Largest In Northern Hemisphere, Has Fractured Into Three Main Pieces


Source:

ScienceDaily (Apr. 16, 2008) — A team of scientists including polar expert Dr. Derek Mueller from Trent University and Canadian Rangers have discovered that the largest ice shelf in the Northern Hemisphere has fractured into three main pieces.
During their sovereignty patrol across the northernmost parts of Canada over the last two weeks, they visited a new 18-kilometre-long network of cracks running from the southern edge of the Ward Hunt Ice Shelf to the Arctic Ocean. This accompanies a large central fracture first detected in 2002, and raises the concern that the remaining ice shelf will disintegrate within the next few years.
Evidence of these cracks first came from Radarsat satellite images in February. Confirmation came from Canadian Rangers, partnering with International Polar Year scientists during Operation NUNALIVUT 08, a Canadian Forces High Arctic sovereignty patrol. Rangers mapped the extent of the fissures and monitored melt rates for Quttinirpaaq National Park, which encompasses the ice shelf.
The patrol scientists also found that the nearby Petersen Ice Shelf lost over a third of its surface area in the past three years. This ice shelf calved following the break-up, in the summers of 2005 and 2007, of the landfast sea ice that had protected it from the open ocean.
“Canadian ice shelves have undergone substantial changes in the past six years, starting with the first break-up event on the Ward Hunt Ice Shelf, and the loss of the Ayles Ice Shelf,” said Dr. Luke Copland of the University of Ottawa. “These latest break-ups we are seeing have come after decades of warming and are irreversible,” said Dr. Derek Mueller of Trent University.
Only five large ice shelves remain in Arctic Canada, covering less than a tenth of the area they did a century ago.
Derek Mueller holds Trent's Roberta Bondar Fellowship in Northern and Polar Studies.
Adapted from materials provided by Trent University.

Fausto Intilla - www.oloscience.com

Tuesday, April 15, 2008

Better Understanding Of Hurricane Trajectories Learned From Patterns On Soap Bubbles


Source:

ScienceDaily (Apr. 15, 2008) — Researchers at the Centre de Physique Moléculaire Optique et Hertzienne (CPMOH, CNRS/Université Bordeaux 1) and the Université de la Réunion(1) have discovered that vortices created in soap bubbles behave like real cyclones and hurricanes in the atmosphere. Soap bubbles have enabled the researchers to characterize for the first time the random factor that governs the movement and paths of vortices. These results, published in the journal Physical Review Letters, could lead to a better understanding of such increasingly common and often devastating atmospheric phenomena.
A soap bubble is an ideal model for studying the atmosphere because it has analogous physical properties and, like the atmosphere, is composed of a very thin film relative to its diameter(2). In this experiment, the researchers created a half soap bubble that they heated at the “equator” and cooled at the “poles”, thereby creating a single large vortex, similar to a hurricane, in the wall of the bubble. The researchers studied the movement of this vortex, which fluctuates in a random manner; the motion follows a superdiffusive law(3), well known to physicists but not previously observed for single vortices in a turbulent environment.
The striking resemblance between vortices on soap bubbles and cyclones led the researchers to study their similarities. By analyzing in detail the trajectories of certain recent hurricanes such as Ivan, Jane and Nicholas, the researchers measured the random factor that is always present in the movement of hurricanes. They then demonstrated the remarkable similarity of these fluctuations with those that characterize the disordered movement of the vortices that they created on soap bubbles.(4)
Taking this random factor into account in predicting the trajectory of hurricanes will be useful in anticipating the probability of impact on a given site or locality. Although the mean trajectory of hurricanes (without any fluctuations) is beginning to be well simulated by meteorologists, this random factor has, until now, been poorly understood. This discovery highlights a universality in the statistics of trajectory fluctuations and should make it possible in the future to better predict the behavior of hurricanes and anticipate the risks.
Notes:
1) Laboratoire de Génie Industriel.
2) The skin or film of soap is only several microns thick whereas the diameter of the bubble is around ten centimeters.
3) A law corresponding to “Lévy flight” motion, i.e. a type of random walk dominated by a small number of jumps of large amplitude (see the sketch after these notes).
4) With a similar superdiffusive law.
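As a concrete illustration of the superdiffusive behaviour described in notes 3 and 4, here is a minimal sketch in Python (not the authors' code): a walk whose step lengths are heavy-tailed produces a mean squared displacement that grows faster than linearly in time.

```python
# Illustrative Levy-flight simulation: heavy-tailed (Pareto) step lengths
# with tail exponent 1.5 < 2 give infinite-variance steps and hence
# superdiffusive spreading, MSD ~ t^alpha with alpha > 1.
import numpy as np

rng = np.random.default_rng(0)
n_walks, n_steps = 500, 1000

lengths = rng.pareto(1.5, size=(n_walks, n_steps)) + 1.0   # step sizes
angles = rng.uniform(0, 2 * np.pi, size=(n_walks, n_steps))  # step directions
x = np.cumsum(lengths * np.cos(angles), axis=1)
y = np.cumsum(lengths * np.sin(angles), axis=1)

msd = np.mean(x**2 + y**2, axis=0)        # mean squared displacement
t = np.arange(1, n_steps + 1)
alpha = np.polyfit(np.log(t[10:]), np.log(msd[10:]), 1)[0]  # scaling exponent
print(f"MSD ~ t^{alpha:.2f}  (alpha > 1 indicates superdiffusion)")
```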
Journal reference: F. Seychelles, Y. Amarouchene, M. Bessafi and H. Kellay, "Thermal convection and emergence of isolated vortices in soap bubbles," Physical Review Letters, April 7, 2008. (Université Bordeaux 1, CPMOH UMR 5798 du CNRS; M. Bessafi: Université de la Réunion, Laboratoire de Génie Industriel.)
Adapted from materials provided by CNRS.

Fausto Intilla
www.oloscience.com

California Has More Than 99% Chance Of A Big Earthquake Within 30 Years, Report Shows


Source:
ScienceDaily (Apr. 15, 2008) — California has more than a 99% chance of having a magnitude 6.7 or larger earthquake within the next 30 years, according to scientists using a new model to determine the probability of big quakes.
The likelihood of a major quake of magnitude 7.5 or greater in the next 30 years is 46%, and such a quake is most likely to occur in the southern half of the state.
The new study determined the probabilities that different parts of California will experience earthquake ruptures of various magnitudes. The new statewide probabilities are the result of a model that comprehensively combines information from seismology, earthquake geology, and geodesy (measuring precise locations on the Earth's surface). For the first time, probabilities for California having a large earthquake in the next 30 years can be forecast statewide.
"This new, comprehensive forecast advances our understanding of earthquakes and pulls together existing research with new techniques and data," explained USGS geophysicist and lead scientist Ned Field. "Planners, decision makers and California residents can use this information to improve public safety and mitigate damage before the next destructive earthquake occurs."
The new information is being provided to decision makers who establish local building codes, earthquake insurance rates, and emergency planning and will assist in more accurate planning for inevitable future large earthquakes.
The official earthquake forecast, known as the "Uniform California Earthquake Rupture Forecast (UCERF)," was developed by a multidisciplinary group of scientists and engineers, known as the Working Group on California Earthquake Probabilities. Building on previous studies, the Working Group updated and developed the first-ever statewide, comprehensive model of California's earthquake probabilities.
The organizations sponsoring the Working Group include the U.S. Geological Survey, the California Geological Survey and the Southern California Earthquake Center. An independent scientific review panel, as well as the California and National Earthquake Prediction Evaluation Councils, have evaluated the new UCERF study.
The consensus of the scientific community on forecasting California earthquakes allows for meaningful comparisons of earthquake probabilities in Los Angeles and the San Francisco Bay Area, as well as comparisons among several large faults.
The probability of a magnitude 6.7 or larger earthquake striking the greater Los Angeles area over the next 30 years is 67%; in the San Francisco Bay Area it is 63%, similar to previous Bay Area estimates. For the entire California region, the fault with the highest probability of generating at least one quake of magnitude 6.7 or larger is the southern San Andreas (59% in the next 30 years).
For northern California, the most likely source of such earthquakes is the Hayward-Rodgers Creek Fault (31% in the next 30 years). Such quakes can be deadly, as shown by the 1989 magnitude 6.9 Loma Prieta and the 1994 magnitude 6.7 Northridge earthquakes.
Earthquake probabilities for many parts of the state are similar to those in previous studies, but the new probabilities calculated for the Elsinore and San Jacinto Faults in southern California are about half those previously determined. For the far northwestern part of the state, a major source of earthquakes is the offshore 750-mile-long Cascadia Subduction Zone, the southern part of which extends about 150 miles into California. For the next 30 years there is a 10% probability of a magnitude 8 to 9 quake somewhere along that zone. Such quakes occur about once every 500 years on average.
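As a rough aside -- a simplification, not the UCERF methodology itself -- a time-independent Poisson model lets one translate a 30-year probability P into an implied annual rate and mean recurrence interval. A minimal sketch in Python:

```python
# Under a Poisson model, P = 1 - exp(-lam * T), so a probability P over a
# T-year window implies an annual rate lam = -ln(1 - P) / T.
import math

def poisson_rate(p, window=30.0):
    """Annual event rate implied by probability p over a time window (years)."""
    return -math.log(1.0 - p) / window

for label, p in [("statewide M>=6.7", 0.99),
                 ("southern San Andreas M>=6.7", 0.59),
                 ("Cascadia M8-9", 0.10)]:
    lam = poisson_rate(p)
    print(f"{label}: rate ~{lam:.4f}/yr, mean recurrence ~{1/lam:.0f} yr")
```

Note that this naive conversion gives Cascadia an implied recurrence of roughly 285 years, shorter than the stated 500-year average; the gap is consistent with the forecast's use of time-dependent probabilities, which rise as the time since the last rupture grows.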
The new model does not estimate the likelihood of shaking (seismic hazard) that would be caused by quakes. Even areas in the state with a low probability of fault rupture could experience shaking and damage from distant, powerful quakes. The U.S. Geological Survey (USGS) is incorporating the UCERF into its official estimate of California's seismic hazard, which in turn will be used to update building codes. Other subsequent studies will add information on the vulnerability of manmade structures to estimate expected losses, which is called "seismic risk." In these ways, the UCERF will help to increase public safety and community resilience to earthquake hazards.
The results of the UCERF study serve as a reminder that all Californians live in earthquake country and should be prepared. Although earthquakes cannot be prevented, the damage they do can be greatly reduced through prudent planning and preparedness. The ongoing work of the Southern California Earthquake Center, USGS, California Geological Survey, and other scientists in evaluating earthquake probabilities is part of the National Earthquake Hazard Reduction Program's efforts to safeguard lives and property from the future quakes that are certain to strike in California and elsewhere in the United States.
Adapted from materials provided by U.S. Geological Survey.

Fausto Intilla
www.oloscience.com

Ancient Method, 'Black Gold Agriculture,' May Revolutionize Farming, Curb Global Warming


Source:
ScienceDaily (Apr. 15, 2008) — Fifteen hundred years ago, tribespeople from the central Amazon basin mixed their soil with charcoal derived from animal bone and tree bark. Today, at the site of this charcoal deposit, scientists have found some of the richest, most fertile soil in the world. Now this ancient, remarkably simple farming technique seems far ahead of the curve, holding promise as a carbon-negative strategy to rein in world hunger as well as greenhouse gases.
At the 235th national meeting of the American Chemical Society, scientists report that charcoal derived from heated biomass has an unprecedented ability to improve the fertility of soil -- one that surpasses compost, animal manure, and other well-known soil conditioners.
They also suggest that this so-called "biochar" profoundly enhances the natural carbon-sequestering ability of soil. Scientists say this "revolutionary" farming technique, dubbed "black gold agriculture," can provide a cheap, straightforward strategy to reduce greenhouse gases by trapping them in charcoal-laced soil.
"Charcoal fertilization can permanently increase soil organic matter content and improve soil quality, persisting in soil for hundreds to thousands of years," Mingxin Guo, Ph.D., and colleagues report. In what they describe as a "new and pioneering" ACS report -- the first systematic investigation of soil improvement by charcoal fertilization -- Guo found that soils receiving charcoal produced from organic wastes were much looser, absorbed significantly more water and nutrients and produced higher crop biomass. The authors, with Delaware State University, say "the results demonstrate that charcoal amendment is a revolutionary approach for long-term soil quality improvement."
Soil deterioration from depletion of organic matter is an increasingly serious global problem that contributes to hunger and malnutrition. It is often a result of unsustainable farming, overuse of chemical fertilizers and drought, and the main weapons against it -- compost, animal manure and crop debris -- decompose rapidly.
"Earth's soil is the largest terrestrial pool of carbon," Guo said. "In other words, most of the earth's carbon is fixed in soil." But if this soil is intensively cultivated by tillage and chemical fertilization, organic matter in soil will be quickly decomposed into carbon dioxide by soil microbes and released into the atmosphere, leaving the soil compacted and nutrient-poor.
Applying raw organic materials to soil only provides a temporary solution, since the applied organic matter decomposes quickly. Converting this unutilized raw material into biochar, a non-toxic and stable fertilizer, could keep carbon in the soil and out of the atmosphere, says Guo.
"Speaking in terms of fertility and productivity, the soil quality will be improved. It is a long-term effect. After you apply it once, it will be there for hundreds of years," according to Guo. With its porous structure and high nutrient- and water-holding capabilities, biochar could become an extremely attractive option for commercial farmers and home gardeners looking for long-term soil improvement.
The researchers planted winter wheat in pots of soil in a greenhouse. Some pots were amended with two percent biochar, generated from readily available ingredients like tree leaves, corn stalks and wood chips. The other pots contained ordinary soil.
The biochar-infused soil showed vastly improved germination and growing rates compared to regular soil. Guo says that even a one-percent charcoal treatment would lead to improved crop yield.
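To put a two-percent amendment in field terms, here is some illustrative arithmetic. The amendment rate comes from the pot experiment above, but the plough-layer depth, soil bulk density and biochar carbon content are assumed round numbers, not figures from the study.

```python
# Back-of-the-envelope carbon accounting for a 2% biochar amendment.
AREA_M2 = 10_000          # one hectare
DEPTH_M = 0.15            # assumed plough-layer depth
BULK_DENSITY = 1300       # kg of soil per m^3, assumed
CHAR_FRACTION = 0.02      # 2 percent amendment, as in the pot experiment
CARBON_CONTENT = 0.7      # assumed carbon fraction of biochar
CO2_PER_C = 44.0 / 12.0   # mass ratio of CO2 to carbon

soil_mass = AREA_M2 * DEPTH_M * BULK_DENSITY          # ~1.95e6 kg per hectare
char_mass = soil_mass * CHAR_FRACTION                 # ~39 t biochar per hectare
co2_locked = char_mass * CARBON_CONTENT * CO2_PER_C   # ~100 t CO2 per hectare
print(f"{char_mass/1000:.0f} t biochar/ha locks up ~{co2_locked/1000:.0f} t CO2/ha")
```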
Guo is "positive" that this ground-breaking farming technique can help feed countries with poor soil quality. "We hope this technology will be extended worldwide," says Guo.
"The production of current arable land could be significantly improved to provide more food and fiber for the growing populations. We want to call it the second agricultural revolution, or black gold revolution!"
He notes that charcoal production has been practiced for at least 3,000 years, but nobody realized that this charcoal could improve soil fertility until archaeologists stumbled on the aforementioned Amazonian soil several years ago.
Biochar production is straightforward, involving a heating process known as pyrolysis. First, organic residue such as tree leaves and wood chips is packed into a metal container and sealed. The container is then heated through a small hole on top, transforming the raw organic matter into black charcoal. The smoke generated during pyrolysis can also be collected and cooled to form bio-oil, a renewable energy source, says Guo.
In lieu of patenting biochar, Guo says he is most interested in putting the technology into practice as soon as possible. To that end, his colleagues at Delaware State University are investigating a standardized production procedure for biochar. They also foresee that long-term field studies will be needed to validate and demonstrate the technology. Guo noted that the downsides of biochar include transportation costs, owing to its bulk, and the need to develop new tools for spreading the granular fertilizer over large tracts of farmland.
The researchers are about to embark on a five-year study of the effect of "black gold" on spinach, green peppers, tomatoes and other crops, examining the long-term effects of biochar fertilization on soil carbon, crop productivity and the soil microorganism community.
"Through this long-term work, we will show to people that biochar fertilization will significantly change our current conventional farming concepts," says Guo.
Adapted from materials provided by American Chemical Society, via EurekAlert!, a service of AAAS.
Fausto Intilla

Sea Salt Worsens Coastal Air Pollution


Source:
ScienceDaily (Apr. 14, 2008) — Air pollution in the world's busiest ports and shipping regions may be markedly worse than previously suspected, according to a new study showing that industrial and shipping pollution is exacerbated when it combines with sunshine and salty sea air.
In a paper published in the journal Nature Geoscience, a team of researchers that included University of Calgary chemistry professor Hans Osthoff reports that the disturbing phenomenon substantially raises the levels of ground-level ozone and other pollutants in coastal areas.
"We found unexpectedly high levels of certain air pollutants where pollution from cities and ships meets salt in the ocean air along the southeast coast of the United States," said Osthoff, who joined the U of C's Department of Chemistry last August. "It only makes sense that this is a problem everywhere industrial pollution meets the ocean, as is the case in many of the largest cities around the world. It also changes our view of the chemical transformations that occur in ship engine exhaust plumes, and tells us that emissions from marine vessels may be polluting the globe to a greater extent than currently estimated."
Dr. Osthoff was part of a National Oceanic and Atmospheric Administration (NOAA) team that spent six weeks monitoring air quality in busy shipping areas off the southeastern coast of the United States between Charleston, South Carolina and Houston, Texas, in the summer of 2006. The researchers found unexpectedly high levels of nitryl chloride (ClNO2), a chemical long suspected to be involved in ground-level ozone production along the coast.
They then determined that the compound is efficiently produced at night by the reaction of the nitrogen oxide N2O5 in polluted air with chloride from sea salt. With the help of sunlight, the chemical then splits into radicals that accelerate production of ozone and, potentially, fine particulate matter, which are the main components of air pollution. Their findings also show that up to 30 per cent of the ground-level ozone present in seaside cities such as Houston may be the result of pollution mixing with salt from ocean mist.
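The mechanism can be summarized in a simplified reaction scheme, reconstructed here from the description above rather than quoted from the paper:

```
N2O5(g) + Cl-(sea-salt aerosol)  -->  ClNO2(g) + NO3-       (night-time formation)
ClNO2 + sunlight                 -->  Cl radical + NO2      (morning photolysis)
Cl radical + hydrocarbons/NOx    -->  ... -->  O3, fine PM  (daytime production)
```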
Dr. Osthoff intends to continue to work on halogen compounds at the University of Calgary.
"The Texas study covered only a very limited geographic area. We would like to find out to what extent this chemistry affects air quality in other regions, for example, the the Greater Vancouver area, or the Arctic," he said. "Our study indicates that halide salts such as chloride or bromide, which have been thought of as being relatively inert, may be playing a much greater role overall in the lower atmosphere."
The paper "High levels of nitryl chloride in the polluted subtropical marine boundary layer" is available in the April 6, 2008 advance online edition of the journal Nature Geoscience. The print version is scheduled to appear on May 1st, 2008.
Adapted from materials provided by University of Calgary, via EurekAlert!, a service of AAAS.
Fausto Intilla

Monday, April 14, 2008

Geologists Discover New Way Of Estimating Size And Frequency Of Meteorite Impacts


ScienceDaily (Apr. 12, 2008) — Scientists have developed a new way of determining the size and frequency of meteorites that have collided with Earth.
Their work shows that the size of the meteorite that likely plummeted to Earth at the time of the Cretaceous-Tertiary (K-T) boundary 65 million years ago was four to six kilometers in diameter. The meteorite was the trigger, scientists believe, for the mass extinction of dinosaurs and other life forms.
François Paquay, a geologist at the University of Hawaii at Manoa (UHM), used variations (isotopes) of the rare element osmium in sediments at the ocean bottom to estimate the size of these meteorites. The results are published in this week's issue of the journal Science.
When meteorites collide with Earth, they carry a different osmium isotope ratio than the levels normally seen throughout the oceans.
"The vaporization of meteorites carries a pulse of this rare element into the area where they landed," says Rodey Batiza of the National Science Foundation (NSF)'s Division of Ocean Sciences, which funded the research along with NSF's Division of Earth Sciences. "The osmium mixes throughout the ocean quickly. Records of these impact-induced changes in ocean chemistry are then preserved in deep-sea sediments."
Paquay analyzed samples from two sites, Ocean Drilling Program (ODP) site 1219 (located in the equatorial Pacific) and ODP site 1090 (located off the tip of South Africa), and measured osmium isotope levels during the late Eocene period, a time during which large meteorite impacts are known to have occurred.
"The record in marine sediments allowed us to discover how osmium changes in the ocean during and after an impact," says Paquay.
The scientists expect that this new approach to estimating impact size will become an important complement to a more well-known method based on iridium.
Paquay, along with co-author Gregory Ravizza of UHM and collaborators Tarun Dalai from the Indian Institute of Technology and Bernhard Peucker-Ehrenbrink from the Woods Hole Oceanographic Institution, also used this method to make estimates of impact size at the K-T boundary.
Even though the method works well for the K-T impact, it would break down for a larger event: the meteorite contribution of osmium to the oceans would overwhelm existing levels of the element, researchers believe, making it impossible to sort out the osmium's origin.
Under the assumption that all the osmium carried by meteorites is dissolved in seawater, the geologists were able to use their method to estimate the size of the K-T meteorite as four to six kilometers in diameter.
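A sketch of the isotope mass balance behind that assumption is given below. All parameter values are rough, textbook-style assumptions for illustration -- not the values used in the Science paper -- and the mixing is simplified to treat the 187Os/188Os ratio as averaging linearly over total osmium.

```python
# Simplified osmium mass balance for an impactor dissolving into the ocean.
import math

OS_CHONDRITE = 500e-9   # Os concentration in a chondrite, g per g (assumed)
R_METEORITE = 0.13      # typical meteoritic 187Os/188Os ratio (assumed)
R_SEAWATER = 0.40       # assumed pre-impact seawater ratio near the K-T
OCEAN_OS = 1.4e10       # assumed dissolved Os inventory of the ocean, g
DENSITY = 2500.0        # impactor density, kg/m^3 (assumed)

def mixed_ratio(diameter_km):
    """Seawater 187Os/188Os after dissolving all of an impactor's osmium."""
    radius_m = diameter_km * 500.0
    mass_g = (4.0 / 3.0) * math.pi * radius_m**3 * DENSITY * 1000.0
    os_in = mass_g * OS_CHONDRITE
    return (R_SEAWATER * OCEAN_OS + R_METEORITE * os_in) / (OCEAN_OS + os_in)

for d in (1, 4, 6, 10):
    print(f"{d} km impactor -> post-impact ratio ~{mixed_ratio(d):.2f}")
```

On these assumed numbers, the ratio drops sharply for kilometre-scale impactors but saturates near the meteoritic value for the largest ones -- which is exactly why the method loses resolution for events much bigger than the K-T impact.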
The potential for recognizing previously unknown impacts is an important outcome of this research, the scientists say.
"We know there were two big impacts, and can now give an interpretation of how the oceans behaved during these impacts," says Paquay. "Now we can look at other impact events, both large and small."
Adapted from materials provided by National Science Foundation.
Fausto Intilla - www.oloscience.com

Unusual Earthquake Swarm Off Oregon Coast Puzzles Scientists



ScienceDaily (Apr. 14, 2008) — Scientists at Oregon State University’s Hatfield Marine Science Center have recorded more than 600 earthquakes in the last 10 days off the central Oregon coast in an area not typically known for a high degree of seismic activity.
This earthquake “swarm” is unique, according to OSU marine geologist Robert Dziak, because it is occurring within the middle of the Juan de Fuca plate – away from the major, regional tectonic boundaries.
“In the 17 years we’ve been monitoring the ocean through hydrophone recordings, we’ve never seen a swarm of earthquakes in an area such as this,” Dziak said. “We’re not certain what it means. But we hope to have a ship divert to the site and take some water samples that may help us learn more.” The water samples may indicate whether the process causing the earthquakes is tectonic or hydrothermal, he added.
At least three of the earthquakes have been of magnitude 5.0 or higher, Dziak said, which also is unusual. The largest event, a 5.4 quake, took place on Monday (April 7); seismic activity continued through the week, with a 5.0 tremor on Thursday. Numerous small quakes have continued in between the periodic larger events.
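For scale (a standard magnitude-energy relation, not a figure from the release), radiated seismic energy grows roughly as 10^(1.5 M), so the 5.4 event released about four times the energy of a 5.0:

```python
# Gutenberg-Richter energy scaling between two magnitudes.
ratio = 10 ** (1.5 * (5.4 - 5.0))
print(f"Energy ratio, M5.4 vs M5.0: ~{ratio:.1f}x")   # ~4.0x
```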
Few, if any, of these earthquakes would be felt on shore, Dziak said, because they originate offshore and deep within the ocean.
The earthquakes are located about 150 nautical miles southwest of Newport, Ore., in a basin between two subsurface “faulted” geologic features rising out of the deep abyssal sediments. The hill closest to the swarm location appears to be on a curved structure edging out in a northwestern direction from the Blanco Transform Fault toward the Juan de Fuca ridge, Dziak said.
Analysis of seismic “decay” rates, which look at the decreasing intensity of the tremors as they radiate outward, suggests that the earthquakes are not the usual sequence of a primary event followed by a series of aftershocks, Dziak said.
“Some process going on down there is sustaining a high stress rate in the crust,” he pointed out.
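Aftershock decay of the kind referred to above is commonly described by the modified Omori law; the sketch below illustrates the idea, though the article does not specify that this is the exact model Dziak's group fitted.

```python
# Modified Omori law: in a mainshock-aftershock sequence the event rate
# decays roughly as n(t) = K / (t + c)**p with p near 1. A swarm sustained
# by an ongoing process shows a roughly flat rate instead of this decay.
import numpy as np

def omori_rate(t, K=100.0, c=0.1, p=1.0):
    """Aftershock rate at time t (days) after a mainshock."""
    return K / (t + c) ** p

days = np.arange(1, 11)
print("Expected aftershock counts per day:", omori_rate(days).round(1))
# A non-decaying daily count over the same window would instead point to a
# swarm driven by sustained stress, as suggested for the Oregon events.
```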
Dziak and his colleagues are monitoring the earthquakes through a system of hydrophones located on the ocean floor. The network – called the Sound Surveillance System, or SOSUS – was used during the decades of the Cold War to monitor submarine activity in the northern Pacific Ocean. As the Cold War ebbed, these and other unique military assets were offered to civilian researchers performing environmental studies, Dziak said.
Hatfield Marine Science Center researchers also have created their own portable hydrophones, which Dziak has deployed in Antarctica to listen for seismic activity in that region. The sensitive hydrophones also have recorded a symphony of sounds revealing not only undersea earthquakes, but the movement of massive icebergs, and vocalizations of whales, penguins, elephant seals and other marine species.
This isn’t the first time the researchers have recorded earthquake swarms off the Oregon coast, Dziak said. In 2005, they recorded thousands of small quakes within a couple of weeks along the Juan de Fuca Ridge northwest of Astoria. Those earthquakes were smaller, he pointed out, and located along the tectonic plate boundary.
This is the eighth such swarm over the past dozen years, Dziak said, and the first seven were likely caused by volcanic activity on the Juan de Fuca Ridge. The plate doesn't move in a continuous manner, and some parts move faster than others. Movement generally occurs when magma is injected into the ocean crust and pushes the plates apart.
“When it does, these swarms occur and sometimes lava breaks through onto the seafloor,” Dziak pointed out. “Usually, the plate moves at about the rate a fingernail might grow – say three centimeters a year. But when these swarms take place, the movement may be more like a meter in a two-week period.”
But this eighth swarm may be different.
“The fact that it’s taking place in the middle of the plate, and not a boundary, is puzzling,” Dziak admitted. “It’s something worth keeping an eye on.”
Adapted from materials provided by Oregon State University.

Fausto Intilla - www.oloscience.com