Tuesday, May 27, 2008

Melting Glaciers May Release DDT And Contaminate Antarctic Environment


Source:
ScienceDaily (May 27, 2008) — In an unexpected consequence of climate change, scientists are raising the possibility that glacial melting is releasing large amounts of the banned pesticide DDT, which is contaminating the environment in Antarctica.
The study is scheduled for the June 1 issue of ACS’ bi-weekly journal Environmental Science & Technology.
In the study, Heidi N. Geisz and colleagues estimate that 2.0 to 8.8 pounds of DDT are released into coastal waters annually along the Western Antarctic Ice Sheet by glacial meltwater. The researchers point out that DDT reaches Antarctica by long-range atmospheric transport in snow, and then gets concentrated in the food chain.
DDT has been banned in the northern hemisphere and has been regulated worldwide since the 1970s. Geisz found, however, that DDT levels in Adélie penguins have remained unchanged since the 1970s, despite an 80 percent reduction in global use.
Global warming may explain that contradiction, they say. As the annual winter temperature on the Antarctic Peninsula has increased by about 10 degrees Fahrenheit in the last 30 years, glaciers have retreated. The possibility that glacial meltwater has contaminated Antarctic organisms with DDT, the study says, “has compelling consequences” if global warming should continue and intensify.
Fausto Intilla - www.oloscience.com

Scorched Earth Millennium Map Shows 'Fire Scars'


Source:
ScienceDaily (May 27, 2008) — A geographer from the University of Leicester has produced for the first time a map of the scorched Earth for every year since the turn of the Millennium.
Dr Kevin Tansey, of the Department of Geography, a leading scientist in an international team, created a visual impression of the fire scars on our planet between 2000 and 2007. The work was funded by the Joint Research Centre of the European Commission.
The map reveals that between 3.5 and 4.5 million km2 of vegetation burns each year, an area equivalent to that of the European Union (EU27) and larger than India.
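As a rough check on that comparison, here is a back-of-the-envelope sketch in Python; the areas of India and the EU27 are approximate outside figures, not values from the study:

    # Sanity check of the burned-area comparison; reference areas are approximate.
    burned_low_km2, burned_high_km2 = 3.5e6, 4.5e6  # annual burned area range
    india_km2 = 3.29e6                              # approximate area of India
    eu27_km2 = 4.3e6                                # approximate area of the EU27

    print(burned_low_km2 > india_km2)                     # True: exceeds India even at the low end
    print(burned_low_km2 <= eu27_km2 <= burned_high_km2)  # True: EU27 sits within the range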
The information is vital for scientists and agencies involved in monitoring global warming, measuring and understanding pollutants in the atmosphere, managing forests and controlling fire and even for predicting future fire occurrence.
Dr Tansey, a Lecturer in Remote Sensing at the University of Leicester, said: "We have produced, for the first time, a global database and map of the occurrence of fire scars covering the period 2000-2007. Prior to this development, data were only available for the year 2000. With seven years of data, it is not possible to determine if there is an increasing trend in the occurrence of fire, but we have significant year-to-year differences, of the order of 20%, in the area that is burnt.
"The work was undertaken with colleagues from the Joint Research Centre of the European Commission (Italy) and the Université catholique de Louvain (Belgium).
"This unique data set is in much demand by a large community of scientists interested in climate change, vegetation monitoring, atmospheric chemistry and carbon storage and flows.
"We have used the VEGETATION instrument onboard the SPOT European satellite, which collects reflected solar energy from the Earth's surface, providing global coverage on almost a daily basis.
"When vegetation burns the amount of reflected energy is altered, long enough for us to make an observation of the fire scar. Supercomputers located in Belgium were used to process the vast amounts of satellite data used in the project. At the moment, we have users working towards predicting future fire occurrence and fire management issues in the Kruger Park in southern Africa".
"The majority of fires occur in Africa. Large swathes of savannah grasslands are cleared every year, up to seven times burnt in the period 2000-2007 (see Figure 1). The system is sustainable because the grass regenerates very quickly during the wet season. From a carbon perspective, there is a net balance due to the regenerating vegetation acting as a carbon sink. Fires in forests are more important as the affected area becomes a carbon source for a number of years.
"The forest fires last summer in Greece and in Portugal a couple of years back, remind us that we need to understand the impact of fire on the environment and climate to manage the vegetation of the planet more effectively. Probably 95% of all vegetation fires have a human source; crop stubble burning, forest clearance, hunting, arson are all causes of fire across the globe. Fire has been a feature of the planet in the past and under a scenario of a warmer environment will certainly be a feature in the future".
The project was funded by the European Commission, through DG Joint Research Centre.
Fausto Intilla - www.oloscience.com

Tuesday, May 13, 2008

Hot Climate Could Shut Down Plate Tectonics


Source:
ScienceDaily (May 13, 2008) — A new study of possible links between climate and geophysics on Earth and similar planets finds that prolonged heating of the atmosphere can shut down plate tectonics and cause a planet's crust to become locked in place.
"The heat required goes far beyond anything we expect from human-induced climate change, but things like volcanic activity and changes in the sun's luminosity could lead to this level of heating," said lead author Adrian Lenardic, associate professor of Earth science at Rice University. "Our goal was to establish an upper limit of naturally generated climate variation beyond which the entire solid planet would respond."
Lenardic said the research team wanted to better understand the differences between the Earth and Venus and establish the potential range of conditions that could exist on Earth-like planets beyond the solar system. The team includes Lenardic and co-authors Mark Jellinek of the University of British Columbia in Vancouver and Louis Moresi of Monash University in Clayton, Australia. The research is available online from the journal Earth and Planetary Science Letters.
The findings may explain why Venus evolved differently from Earth. The two planets are close in size and geological makeup, but Venus' carbon dioxide-rich atmosphere is almost 100 times more dense than the Earth's and acts like a blanket. As a result, Venus' surface temperature is hotter than that of even Mercury, which is twice as close to the sun.
The Earth's crust -- along with carbon trapped on the oceans' floors -- gets returned to the interior of the Earth when free-floating sections of crust called tectonic plates slide beneath one another and return to the Earth's mantle. The mantle is a flowing layer of rock that extends from the planet's outer core, about 1,800 miles below the surface, to within about 30 miles of the surface, just below the crust.
"We found the Earth's plate tectonics could become unstable if the surface temperature rose by 100 degrees Fahrenheit or more for a few million years," Lenardic said. "The time period and the rise in temperatures, while drastic for humans, are not unreasonable on a geologic scale, particularly compared to what scientists previously thought would be required to affect a planet's geodynamics."
Conventional wisdom holds that plate tectonics is both stable and self-correcting, but that view relies on the assumption that excess heat from the Earth's mantle can efficiently escape through the crust. The stress generated by flowing mantle helps keep tectonic plates in motion, and the mantle can become less viscous if it heats up. The new findings show that prolonged heating of a planet's crust via rising atmospheric temperatures can heat the deep inside of the planet and shut down tectonic plate movement.
"We found a corresponding spike in volcanic activity could accompany the initial locking of the tectonic plates," Lenardic said. "This may explain the large percentage of volcanic plains that we find on Venus."
Venus' surface, which shows no outward signs of tectonic activity, is bone dry and heavily scarred with volcanoes. Scientists have long believed that Venus' crust, lacking water to help lubricate tectonic plate boundaries, is too rigid for active plate tectonics.
Lenardic said one of the most significant findings in the new study is that the atmospheric heating needed to shut down plate tectonics is considerably less than the critical temperature beyond which free water could no longer exist on the Earth's surface.
"The water doesn't have to boil away for irrevocable heating to occur," Lenardic said. "The cycle of heating can be kicked off long before that happens. All that's required is enough prolonged surface heating to cause a feedback loop in the planet's mantle convection cycle."
The research was supported by the National Science Foundation and the Canadian Institute for Advanced Research.
Fausto Intilla - www.oloscience.com

Solar Variability: Striking A Balance With Climate Change


Source:

ScienceDaily (May 12, 2008) — The sun has powered almost everything on Earth since life began, including its climate. The sun also delivers an annual and seasonal impact, changing the character of each hemisphere as Earth's orientation shifts through the year. Since the Industrial Revolution, however, new forces have begun to exert significant influence on Earth's climate.
"For the last 20 to 30 years, we believe greenhouse gases have been the dominant influence on recent climate change," said Robert Cahalan, climatologist at NASA’s Goddard Space Flight Center in Greenbelt, Md.
For the past three decades NASA scientists have investigated the unique relationship between the sun and Earth. Using space-based tools, like the Solar Radiation and Climate Experiment (SORCE), they have studied how much solar energy illuminates Earth, and explored what happens to that energy once it penetrates the atmosphere. The amount of energy that reaches Earth's outer atmosphere is called the total solar irradiance. Total solar irradiance is variable over many different timescales, ranging from seconds to centuries due to changes in solar activity.
The sun goes through a roughly 11-year cycle of activity, from stormy to quiet and back again. Solar activity often occurs near sunspots, dark regions on the sun caused by concentrated magnetic fields. Solar irradiance is measurably higher during solar maximum, when sunspots and solar activity peak, than during solar minimum, when the sun is quiet and there are usually no sunspots.
"Fluctuations over the solar cycle impact Earth's global temperature by about 0.1 degree Celsius, slightly hotter during solar maximum and cooler during solar minimum," said Thomas Woods, solar scientist at the University of Colorado in Boulder. "The sun is currently at its minimum, and the next solar maximum is expected in 2012."
Using SORCE, scientists have learned that about 1,361 watts per square meter of solar energy reaches Earth's outermost atmosphere during the sun's quietest period. But when the sun is active, 1.3 watts per square meter (0.1 percent) more energy reaches Earth. "This TSI measurement is very important to climate models that are trying to assess Earth-based forces on climate change," said Cahalan.
Over the past century, Earth's average temperature has increased by approximately 0.6 degrees Celsius (1.1 degrees Fahrenheit). Solar heating accounts for about 0.15 C, or 25 percent, of this change, according to computer modeling results published by NASA Goddard Institute for Space Studies researcher David Rind in 2004. Earth's climate depends on the delicate balance between incoming solar radiation, outgoing thermal radiation and the composition of Earth's atmosphere. Even small changes in these parameters can affect climate. Around 30 percent of the solar energy that strikes Earth is reflected back into space. Clouds, atmospheric aerosols, snow, ice, sand, ocean surface and even rooftops play a role in deflecting the incoming rays. The remaining 70 percent of solar energy is absorbed by land, ocean, and atmosphere.
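The percentages above follow from simple arithmetic; here is a minimal sketch (in Python, using only figures quoted in this article) that makes the bookkeeping explicit:

    # Reproduce the article's arithmetic on solar irradiance and warming.
    tsi_quiet = 1361.0   # W/m^2 reaching the outer atmosphere at solar minimum
    tsi_extra = 1.3      # additional W/m^2 at solar maximum
    print(tsi_extra / tsi_quiet * 100)  # ~0.1 percent variation over the cycle

    warming_total = 0.6   # deg C rise over the past century
    warming_solar = 0.15  # deg C attributed to solar heating (Rind, 2004)
    print(warming_solar / warming_total * 100)  # ~25 percent of the change

    reflected = 0.30      # fraction of incoming sunlight reflected to space
    print(1 - reflected)  # ~0.70 absorbed by land, ocean and atmosphere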
"Greenhouse gases block about 40 percent of outgoing thermal radiation that emanates from Earth," Woods said. The resulting imbalance between incoming solar radiation and outgoing thermal radiation will likely cause Earth to heat up over the next century, accelerating the melting polar ice caps, causing sea levels to rise and increasing the probability of more violent global weather patterns.
Non-Human Influences on Climate Change
Before the Industrial Age, the sun and volcanic eruptions were the major influences on Earth's climate change. Earth warmed and cooled in cycles. Major cool periods were ice ages, with the most recent ending about 11,000 years ago.
"Right now, we are in between major ice ages, in a period that has been called the Holocene,” said Cahalan. “Over recent decades, however, we have moved into a human-dominated climate that some have termed the Anthropocene. The major change in Earth's climate is now really dominated by human activity, which has never happened before."
The sun is relatively calm compared to other stars. "We don't know what the sun is going to do a hundred years from now," said Doug Rabin, a solar physicist at Goddard. "It could be considerably more active and therefore have more influence on Earth's climate."
Or, it could be calmer, creating a cooler climate on Earth similar to what happened in the late 17th century. Almost no sunspots were observed on the sun's surface during the period from 1650 to 1715. This extended absence of solar activity may have been partly responsible for the Little Ice Age in Europe and may reflect cyclic or irregular changes in the sun's output over hundreds of years. During this period, winters in Europe were longer and colder by about 1 C than they are today.
Since then, there seems to have been on average a slow increase in solar activity. Unless we find a way to reduce the amount of greenhouse gases we put into the atmosphere, such as carbon dioxide from fossil fuel burning, the solar influence is not expected to dominate climate change. But the solar variations are expected to continue to modulate both warming and cooling trends at the level of 0.1 to 0.2 degrees Celsius (0.18 to 0.26 Fahrenheit) over many years.
Future Measurements of Solar Variability
For three decades, a suite of NASA and European Space Agency satellites has provided scientists with critical measurements of total solar irradiance. The Total Irradiance Monitor, also known as the TIM instrument, was launched in 2003 as part of NASA’s SORCE mission, and provides irradiance measurements with state-of-the-art accuracy. TIM has been rebuilt as part of the Glory mission, scheduled to launch in 2009. Glory's TIM instrument will continue an uninterrupted 30-year record of solar irradiance measurements and will help researchers better understand the sun's direct and indirect effects on climate. Glory will also collect data on aerosols, one of the least understood pieces of the climate puzzle.

Fausto Intilla - www.oloscience.com

Sunday, May 4, 2008

Coherent Description Of Earth's Inaccessible Interior Clarifies Mantle Motion


Source:
ScienceDaily (May 4, 2008) — A new model of inner Earth constructed by Arizona State University researchers pulls past information and hypotheses into a coherent story to clarify mantle motion.
"The past maybe two or three years there have been a lot of papers in Science and Nature about the deep mantle from seismologists and mineral physicists and it's getting really confusing because there are contradictions amongst the different papers," says Ed Garnero, seismologist and an associate professor in Arizona State University's School of Earth and Space Exploration.
"But we've discovered that there is a single framework that is compatible with all these different findings," he adds.
Garnero partnered with geodynamicist and assistant professor Allen McNamara, also in the School of Earth and Space Exploration in ASU's College of Liberal Arts and Sciences, to synthesize the information for their paper to be published in the May 2 issue of Science.
"Our goal was to bring the latest seismological and dynamical results together to put some constraints on the different hypotheses we have for the mantle. If you Google 'mantle' you'll see 20 different versions of what people are teaching," explains McNamara.
According to the ASU scientists, all this recent research of the past few years fits into a single story. But what is that story? Is it a complicated and exceedingly idiosyncratic story or is it a straightforward simple framework?
"In my opinion," explains Garnero, "it's simple. It doesn't really appeal to anything new; it just shows how all those things can fit together."
The pair paints a picture of a chemically complex inner Earth, a model that contrasts sharply with the paradigm, heavily relied upon over the past few decades, that the mantle is all one thing and well mixed. The original model was composed of simple concentric spheres representing the core, mantle and crust -- but the inner Earth isn't that simple.
What lies beneath
Earth is made up of several layers. Its skin, the crust, extends to a depth of about 40 kilometers (25 miles). Below the crust is the mantle area, which continues to roughly halfway to the center of Earth. The mantle is the thick layer of silicate rock surrounding the dense, iron-nickel core, and it is subdivided into the upper and lower mantle, extending to a depth of about 2,900 km (1,800 miles). The outer core is beneath that and extends to 5,150 km (3,200 mi) and the inner core to about 6,400 km (4,000 mi).
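For readers keeping track of the depths, the layering just described can be tabulated directly (a sketch in Python; the round boundary figures are the ones quoted above):

    # Boundary depths (km) of Earth's layers, as quoted in the article.
    layers = [
        ("crust",              0,    40),
        ("upper/lower mantle", 40,   2900),
        ("outer core",         2900, 5150),
        ("inner core",         5150, 6400),
    ]
    for name, top, bottom in layers:
        print(f"{name:20s} {top:4d}-{bottom:4d} km (thickness ~{bottom - top} km)")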
The inner Earth is not a static storage space of the geologic history of our planet -- it is continuously churning and changing. How a mantle convects and how the plates move is very different depending on whether the mantle is isochemical (chemically homogeneous, made entirely of one kind of material) or heterogeneous, composed of different kinds of compounds.
Garnero and McNamara's framework is based upon the assumption that the Earth's mantle is not isochemical. Garnero says new data supports a mantle that consists of more than one type of material.
"Imagine a pot of water boiling. That would be all one kind of composition. Now dump a jar of honey into that pot of water. The honey would be convecting on its own inside the water and that's a much more complicated system," McNamara explains.
Observations, modeling and predictions have shown that the deepest mantle is complex and significantly more anomalous than the rest of the lower mantle. To understand this region, seismologists analyze tomographic images constructed from seismic wave readings. For 25 years they have been detecting differences in the speeds of waves that go through the mantle.
This difference in wave speeds provides an "intangible map" of the major boundaries inside the mantle -- where hot areas are, where cold areas are, where there are regions that might have a different composition, and so on. The areas with sluggish wave speeds seem to be bounded rather abruptly by areas with wave speeds that are not sluggish or delayed. An abrupt change in wave speed means that something inside the mantle has changed.
If the mantle is all the same material, then researchers shouldn't be observing the boundary between hot and cold in the mantle as a super sharp edge and the temperature anomalies should also be more spread out. The abrupt change in velocity was noticeable, yet they didn't know what caused it.
Garnero and McNamara believe that the key to this story is the existence of thermo-chemical piles. On nearly opposite sides of the Earth sit two big, chemically distinct, dense "piles" of material that are hundreds of kilometers thick -- one beneath the Pacific and the other below the Atlantic and Africa. These piles are two nearly antipodal large low shear velocity provinces situated at the base of Earth's mantle.
"You can picture these piles like peanut butter. It is solid rock but rock under very high pressures and temperatures become soft like peanut butter so any stresses will cause it to flow," says McNamara.
Recently mineral physicists discovered that under high pressure the atoms in the rocks go through a phase transition, rearranging themselves into a tighter configuration.
In these thermo-chemical piles the layering is consistent with a new high pressure phase of the most abundant lower mantle mineral called post-perovskite, a material that exists specifically under high pressures that cause new arrangements of atoms to be formed.
Perovskite is a specific arrangement of silicon and magnesium and iron atoms.
"At a depth a few hundred kilometers above the core, the mineral physicists tell us that the rocks' atoms can go into this new structure and it should happen abruptly and that's consistent with the velocity discontinuities that the seismologists have been seeing for decades," says Garnero.
These thick piles play a key role in the convection currents. Ultra-low velocity zones sit closest to the edges of the piles because those are the hottest regions of the mantle, heated by the currents that sweep against the pile walls as they bring heat up from the core. Instabilities off the piles' edges produce the upwelling plumes that feed hot spots such as Hawaii, Iceland and the Galapagos.
"We observe the motions of plate tectonics very well, but we can't fully understand how the mantle is causing these motions until we better understand how the mantle itself is convecting," says McNamara. "The piles dictate how the convective cycles happen, how the currents circulate. If you don't have piles then convection will be completely different."
Adapted from materials provided by Arizona State University, via EurekAlert!, a service of AAAS.
Fausto Intilla - www.oloscience.com

Sunday, April 27, 2008

Scientists Reveal Presence Of Ocean Current 'Stripes'


Source:

ScienceDaily (Apr. 26, 2008) — More than 20 years of continuous measurements, plus a dose of "belief," have yielded the discovery of subtle ocean currents that could dramatically improve forecasts of climate and ecosystem changes. An international collaborative of scientists led by Peter Niiler, a physical oceanographer at Scripps Institution of Oceanography, UC San Diego, and Nikolai Maximenko, a researcher at the International Pacific Research Center, University of Hawaii, has detected the presence of crisscrossing patterns of currents running throughout the world's oceans. The new data could help scientists significantly improve the high-resolution models they use to understand trends in climate and marine ecosystems.
The basic dimensions of these steady patterns called striations have slowly been revealed over the course of several research papers by Niiler, Maximenko and colleagues. An analysis by Maximenko, Niiler and colleagues appearing today in the journal Geophysical Research Letters has produced the clearest representation of these striated patterns in the eastern Pacific Ocean to date and revealed that these complex patterns of currents extend from the surface to at least depths of 700 meters (2,300 feet). The discovery of similarly detailed patterns around the world is expected to emerge from future research.
Niiler credits the detection of these new current patterns on a global basis to the long-term, comprehensive ocean current measurements made over more than 20 years by the Global Drifter Program, now a network of more than 1,300 drifting buoys designed by him and administered by the National Oceanic and Atmospheric Administration (NOAA). Niiler added that the foresight of the University of California in providing long-term support to scientists was crucial to the discovery.
"I'm most grateful to the University of California for helping to support the invention and the 20-year maintenance of a comprehensive program of ocean circulation measurements," he said. "Scripps Institution of Oceanography is unique because of its commitment to long-term observations of the climate. Instrumental measurements of the ocean are fundamental to the definition of the state of the climate today and improvement of its prediction into the future."
In portions of the Southern Ocean, these striations, also known as ocean fronts, produce alternating eastward and westward accelerations of circulation, and portions of them nearly circumnavigate Antarctica. These striations also delineate the ocean regions where uptake of carbon dioxide is greatest. In the Atlantic Ocean, these flows bear a strong association to the Azores Current, along which water flowing south from the North Atlantic circulation is being subducted. The spatially high-resolution view of the linkage between the striations and the larger-scale patterns of currents could improve predictions of ocean temperatures and hurricane paths.
In addition, the striations are connected to important ecosystems like the California and Peru-Chile current systems. Off California, the striations are linked to the steady east-west displacements, or meanders, of the California Current, a major flow that runs from the border of Washington and Oregon to the southern tip of Baja California. The striations run nearly perpendicular to the California Current and continue southwestward to the Hawaiian Islands.
Niiler said there are a number of scientists who have theorized the existence of striations in the ocean. He was the first to formulate such a theory as a postdoctoral researcher at Harvard University in 1965. Niiler's theory today is that the steady-state striations in the eastern North Pacific are caused by the angular momentum of the swirling eddies within the California Current System.
[Figure: A worldwide crisscrossing pattern of ocean current striations, revealed through measurements made by drifting buoys over a period of more than 20 years and through satellite readings of ocean velocity. Blue bands represent westward-flowing currents and red bands indicate eastward-flowing currents that move at roughly 1 centimeter per second. Image courtesy of Nikolai Maximenko, University of Hawaii.]
The new maps of ocean circulation produced by a combination of drifter and satellite measurements will eventually be the yardstick for judging the accuracy of the circulation patterns portrayed by climate and ocean ecosystem models (a major deficiency in current simulations) and for generating substantially more reliable forecast products for climate and ecosystem management. Niiler noted, for example, that there are a large number of computer models that can simulate equatorial currents but fail to accurately simulate the meandering flow of the California Current and the striations that emanate from it.
"I think this research presents the next challenge in ocean modeling," said Niiler. "I'm looking forward to the day when we can correctly portray most ocean circulation systems with all climate and ecosystem models."
Maximenko said the clear resolution of the subtle striations would not have been possible without the use of data from both the drifters and satellites.
"Our finding was so unbelievable that our first proposal submitted to the National Science Foundation failed miserably because most reviewers said 'You cannot study what does not exist,'" Maximenko said. "The striations are like ghosts. To see them one needs to believe in them. No doubt, armed with our hint, scientists will start finding all kinds of striations all around the world."
Maximenko, Niiler and their international colleagues are now writing a series of papers that reveal new details about the crisscross patterns and their ties to currents such as the Kuroshio, which flows in western Pacific Ocean waters near Japan.
NOAA, the National Science Foundation, the NASA Ocean Surface Topography Team, and the Japan Agency for Marine-Earth Science and Technology supported the research.
Adapted from materials provided by University of California - San Diego.

Fausto Intilla - www.oloscience.com

Northern Lights Glimmer With Unexpected Trait


Source:

ScienceDaily (Apr. 26, 2008) — An international team of scientists has detected that some of the glow of Earth's aurora is polarized, an unexpected state for such emissions. Measurements of this newfound polarization in the Northern Lights may provide scientists with fresh insights into the composition of Earth's upper atmosphere, the configuration of its magnetic field, and the energies of particles from the Sun, the researchers say.
If observed on other planets, the phenomenon might also give clues to the shape of the Sun's magnetic field as it curls around other bodies in the solar system.
When a beam of light is polarized, its electromagnetic waves share a common orientation, say, aligned vertically, or at some other angle. Until now, scientists thought that light from energized atoms and molecules in planetary upper atmospheres could not be polarized. The reason is simple: in spite of the low number of particles at the altitudes concerned (above 100 kilometers, or 60 miles), there are still numerous collisions between molecules and gas atoms. Those collisions depolarize the emitted light.
Fifty years ago, an Australian researcher, Robert Duncan, claimed to observe what looked like polarization of auroral light, but other scientists found that single observation unconvincing.
To revisit the question, Jean Lilensten of the Laboratory of Planetology of Grenoble, France, and his colleagues studied auroral light with a custom-made telescope during the winters of 2006-2007 and 2007-2008. They made their observations from Svalbard Island, Norway, which is in the polar region, at a latitude of 79° north.
At the north and south magnetic poles, many charged particles in the solar wind -- a flow of electrically charged matter from the Sun -- are captured by the planet's field and forced to plunge into the atmosphere. The particles strike atmospheric gases, causing light emissions.
Lilensten and his colleagues observed weak polarization of a red glow that radiates at an altitude of 220 kilometers (140 miles). The glow results from electrons hitting oxygen atoms. The scientists had suspected that such light might be polarized because Earth's magnetic field at high latitudes funnels the electrons, aligning the angles at which they penetrate the atmosphere.
The finding of auroral polarization "opens a new field in planetology," says Lilensten, who is the lead author of the study. He and his colleagues reported their results on 19 April in Geophysical Research Letters, a publication of the American Geophysical Union, or AGU.
Fluctuations in the polarization measurements can reveal the energy of the particles coming from the Sun when they enter Earth's atmosphere, Lilensten notes. The intensity of the polarization gives clues to the composition of the upper atmosphere, particularly with regard to atomic oxygen.
Because polarization is strongest when the telescope points perpendicularly to the magnetic field lines, the measurements also provide a way to determine magnetic field configurations, Lilensten adds. That could prove especially useful as astronomers train their telescopes on other planetary atmospheres. If polarized emissions are observed there as well, the measurements may enable scientists to understand how the Sun's magnetic field is distorted by obstacles such as the planets Venus and Mars, which lack intrinsic magnetic fields.
Journal reference: Lilensten, J., J. Moen, M. Barthélemy, R. Thissen, C. Simon, D. A. Lorentzen, O. Dutuit, P. O. Amblard, and F. Sigernes (2008), Polarization in aurorae: A new dimension for space environments studies, Geophys. Res. Lett., 35, L08804, doi:10.1029/2007GL033006.
Adapted from materials provided by American Geophysical Union.

Fausto Intilla - www.oloscience.com

Friday, April 25, 2008

Mystery Of Ancient Supercontinent's Demise Revealed


Source:
ScienceDaily (Apr. 24, 2008) — In a paper published in Geophysical Journal International, Dr Graeme Eagles from the Earth Sciences Department at Royal Holloway, University of London, reveals how one of the largest continents ever to exist met its demise.
Gondwana was a ‘supercontinent’ that existed between 500 and 180 million years ago. For the past four decades, geologists have debated how Gondwana eventually broke up, developing a multitude of scenarios which can be loosely grouped into two schools of thought – one theory claiming the continent separated into many small plates, and a second theory claiming it broke into just a few large pieces. Dr Eagles, working with Dr Matthias König from the Alfred Wegener Institute for Polar and Marine Research in Bremerhaven, Germany, has devised a new computer model showing that the supercontinent, too heavy to hold itself together, cracked into two pieces.
Gondwana comprised most of the landmasses in today’s Southern Hemisphere, including Antarctica, South America, Africa, Madagascar, Australia-New Guinea, and New Zealand, as well as Arabia and the Indian subcontinent, which now lie in the Northern Hemisphere. Between around 250 and 180 million years ago, it formed part of the single supercontinent ‘Pangea’.
Evidence suggests that Gondwana began to break up around 183 million years ago. Analysing magnetic and gravity anomaly data from some of Gondwana’s first cracking points – fracture zones in the Mozambique Basin and the Riiser-Larsen Sea off Antarctica – Dr Eagles and Dr König reconstructed the paths that each part of Gondwana took as it broke apart. The computer model reveals that the supercontinent divided into just two large, eastern and western plates. Approximately 30 million years later, these two plates started to split to form the familiar continents of today’s Southern Hemisphere.
‘You could say that the process is ongoing as Africa is currently splitting in two along the East African Rift,’ says Dr Eagles. ‘The previously held view of Gondwana initially breaking up into many different pieces was unnecessarily complicated. It gave fuel to the theory that a plume of hot mantle, about 2,000 to 3,000 kilometres wide, began the splitting process. A straightforward split takes the spotlight off plumes as active agents in the supercontinent’s breakup, because the small number of plates involved resembles the pattern of plate tectonics in the rest of Earth’s history, during which plumes have played bit parts.’
According to Dr Eagles and Dr König’s study, because supercontinents like Gondwana are gravitationally unstable to begin with, and have very thick crusts in comparison with ocean basins, they eventually start to collapse under their own weight.
Says Dr Eagles, ‘These findings are a starting point from which more accurate and careful research can be made on the supercontinent. The new model challenges the positions of India and Sri Lanka in Gondwana which have been widely used for the past 40 years, assigning them very different positions in the supercontinent. These differences have major consequences for our understanding of Earth.’
Adapted from materials provided by Royal Astronomical Society.
Fausto Intilla - www.oloscience.com

Monday, April 21, 2008

Extreme Ocean Storms Have Become More Frequent Over Past Three Decades, Study Of Tiny Tremors Shows


Source:
ScienceDaily (Apr. 21, 2008) — Data from faint earth tremors caused by wind-driven ocean waves -- often dismissed as "background noise" at seismographic stations around the world -- suggest extreme ocean storms have become more frequent over the past three decades. The Intergovernmental Panel on Climate Change (IPCC) and other prominent researchers have predicted that stronger and more frequent storms may occur as a result of global warming trends. The tiny tremors, or microseisms, offer a new way to discover whether these predictions are already coming true, said Richard Aster, a geophysics professor at the New Mexico Institute of Mining and Technology.
Unceasing as the ocean waves that trigger them, the microseisms show up as five- to 30-second oscillations of Earth's surface at seismographic stations around the world. Even seismic monitoring stations "in the middle of a continent are sensitive to the waves crashing all around the continent," Aster said.
As storm winds drive ocean waves higher, the microseism signals increase their amplitude as well, offering a unique way to track storm intensities across seasons, over time, and at different geographical locations. For instance, Aster and colleagues Daniel McNamara from the U.S. Geological Survey and Peter Bromirski of the Scripps Institution of Oceanography recently published analysis in the Seismological Society of America journal Seismological Research Letters showing that microseism data collected around the Pacific Basin and throughout the world could be used to detect and quantify wave activity from multi-year events such as the El Niño and La Niña ocean disruptions.
The findings spurred them to look for a microseism signal that would reveal whether extreme storms were becoming more common in a warming world. In fact, they saw "a remarkable thing" in the worldwide microseism data collected from 1972 to 2008, Aster recalled: at all 22 of the stations included in the study, the number of extreme storm events had increased over time.
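The kind of analysis described, counting extreme storm events in long microseism records and looking for a trend, can be illustrated with a short sketch. This is not the authors' code; the synthetic amplitude series and the 99th-percentile threshold are illustrative assumptions only:

    # Count days per year on which microseism amplitude exceeds a high
    # percentile threshold, then inspect the year-to-year trend.
    # Synthetic data for illustration; not the study's actual records.
    import random

    random.seed(0)
    years = range(1972, 2008)
    amplitude = {y: [random.gauss(1.0 + 0.003 * (y - 1972), 0.3)
                     for _ in range(365)] for y in years}

    all_values = sorted(v for series in amplitude.values() for v in series)
    threshold = all_values[int(0.99 * len(all_values))]  # 99th percentile

    for y in years:
        extreme_days = sum(1 for v in amplitude[y] if v > threshold)
        print(y, extreme_days)  # counts drift upward as mean amplitude rises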
While the work on evaluating changes in extreme storms is "still very much in its early stages," Aster is "hoping that the study will offer a much more global look" at the effects of climate change on extreme storms and the wind-driven waves they produce. At the moment, most of the evidence linking the two comes from studies of hurricane intensity and of shoreline erosion in specific regions such as the Pacific Northwest and the Gulf of Mexico, he noted.
The researchers are also working on recovering and digitizing older microseism records, potentially creating a data set that stretches back to the 1930s. Aster praised the work of the long-term observatories that have collected the records, calling them a good example of the "Cinderella science" -- unloved and overlooked -- that often supports significant discoveries.
"It's absolutely great data on the state of the planet. We took a prosaic time series, and found something very interesting in it," he said.
The presentation of "Microseism-Based Climate Monitoring" was made in the session: Models, Methods, and Measurements: Seismic Monitoring Research on April 17, 2008 at the annual meeting of the Seismological Society of America. Authors include Aster, R. New Mexico Institute of Mining and Technology; McNamara, D., U.S. Geological Survey in Golden, CO; Bromirski, P., Scripps Institution of Oceanography; and Gee, L., and Hutt, C.R., U.S. Geological Survey in Albuquerque, NM.
Adapted from materials provided by Seismological Society of America, via EurekAlert!, a service of AAAS.
Fausto Intilla - www.oloscience.com

Sunday, April 20, 2008

Greenland Ice May Not Be Headed Down Too Slippery A Slope, But Stability Still Far From Assured


Source:
ScienceDaily (Apr. 20, 2008) — Lubricating meltwater that makes its way from the surface down to where a glacier meets bedrock turns out to be only a minor reason why Greenland's outlet glaciers accelerated their race to the sea 50 to 100 percent in the 1990s and early 2000s, according to University of Washington's Ian Joughin and Woods Hole Oceanographic Institution's Sarah Das. The two are lead co-authors of two papers posted April 18 on the Science Express website.
The report also shows that surface meltwater is reaching bedrock farther inland under the Greenland Ice Sheet, something scientists had speculated was happening but had little evidence.
"Considered together, the new findings indicate that while surface melt plays a substantial role in ice sheet dynamics, it may not produce large instabilities leading to sea level rise," says Joughin, a glaciologist with the UW's Applied Physics Laboratory. Joughin goes on to stress that "there are still other mechanisms that are contributing to the current ice loss and likely will increase this loss as climate warms."
Outlet glaciers are rapid flows of ice that start in the Greenland Ice Sheet and extend all the way to the ocean, where their fronts break apart in the water as icebergs, a process called calving. While most of the ice sheet moves less than one-tenth of a mile a year, some outlet glaciers gallop along at 7.5 miles a year, making them a particular concern because of their more immediate potential to cause sea level rise.
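The contrast in speeds is worth making explicit (a short sketch using the article's round numbers):

    # Speed contrast between the bulk ice sheet and the fastest outlet glaciers.
    bulk_mi_yr, outlet_mi_yr = 0.1, 7.5   # article figures
    print(outlet_mi_yr / bulk_mi_yr)      # outlet glaciers ~75x faster
    print(outlet_mi_yr * 1.609)           # ~12 km per year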
If surface meltwater lubrication at the intersection of ice and bedrock was playing a major role in speeding up the outlet glaciers, one could imagine how global warming, which would create ever more meltwater at the surface, could cause Greenland's ice to shrink much more rapidly than expected -- even catastrophically. Glacial ice is second only to the oceans as the largest reservoir of water on the planet and 10 percent of the Earth's glacial ice is found in Greenland.
It turns out, however, that when considered over an entire year, surface meltwater was responsible for only a few percent of the movement of the six outlet glaciers monitored, says Joughin, lead author of "Seasonal Speedup along the Western Flank of the Greenland Ice Sheet." Even in the summer it appears to contribute at most 15 percent, and often considerably less, to the total annual movement of these fast-moving outlet glaciers.
Calculations were made both by digitally comparing pairs of images acquired at different times from the Canadian RADARSAT satellite and by ground-based GPS measurements in a project funded by the National Science Foundation and National Aeronautics and Space Administration.
But while surface meltwater plays an inconsequential role in the movement of outlet glaciers, meltwater is responsible for 50 to 100 percent of the summer speed up for the large stretches near the edge of the ice sheet where there are no major outlet glaciers, a finding consistent with, but somewhat larger than, earlier observations.
"What Joughin, Das and their co-authors confirm is that iceflow speed up with meltwater is a widespread occurrence, not restricted to the one site where previously observed. But, they also show that the really fast-moving ice doesn't speed up very much with this. So we can expect the ice sheet in a warming world to shrink somewhat faster than previously expected, but this mechanism will not cause greatly faster shrinkage," says Richard Alley, professor of geosciences at Pennsylvania State University, who is not connected with the papers.
So what's behind the speedup of Greenland's outlet glaciers? Joughin says he thinks what's considerably more significant is when outlet glaciers lose large areas of ice at their seaward ends through increased calving, which may be affected by warmer temperatures. He's studied glaciers such as Jakobshavn Isbrae, one of Greenland's fastest-moving glaciers, and says that as ice calves and icebergs float away, it is like removing a dam, allowing ice farther uphill to stream through to the ocean more quickly. At present, iceberg calving accounts for approximately 50 percent of the ice loss from Greenland, much of which is balanced by snowfall each winter. Several other studies have recently shown that the loss from calving is increasing, contributing at present rates to a rise in sea level of 1 to 2 inches per century.
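That sea-level contribution can be put in rough mass terms (a back-of-the-envelope sketch; the global ocean area is a standard outside approximation, not a figure from the papers):

    # Convert 1-2 inches of sea level rise per century into an approximate
    # ice-mass flux. Ocean area is a standard approximation (~3.6e14 m^2).
    ocean_area_m2 = 3.6e14
    inch_m = 0.0254
    rho_water = 1000.0  # kg/m^3 of meltwater added to the ocean

    for inches in (1, 2):
        volume_m3 = inches * inch_m * ocean_area_m2
        gt_per_year = volume_m3 * rho_water / 1e12 / 100  # Gt/yr over a century
        print(f"{inches} inch/century ~ {gt_per_year:.0f} Gt of ice per year")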
"We don't yet know what warming temperatures means for increased calving of icebergs from the fronts of these outlet glaciers," Joughin says.
Until now scientists have only speculated whether, and how, surface meltwater might make it to bedrock from high atop the Greenland Ice Sheet, which is a half-mile or more thick in places. The paper "Fracture Propagation to the Base of the Greenland Ice Sheet During Supraglacial Lake Drainage," with Woods Hole Oceanographic Institution glaciologist Das as lead author, presents evidence of how a lake that disappeared from the surface of the inland ice sheet generated so much pressure and cracking that the water made it to bedrock in spite of more than half a mile of ice.
The glacial lake described in the paper was 2 to 2 ½ miles at its widest point and 40 feet deep. Researchers installed monitoring instruments and left the area; 10 days later, a large fracture developed, a crack spanning nearly the full length of the lake. The lake drained in 90 minutes with a fury comparable to that of Niagara Falls. (The researchers were ever so glad they hadn't been on the lake in their 10-foot boat with its 5-horsepower engine, and they don't plan future instrument deployments when the lakes are full of water. They'll get the instruments in place only when the lakes are dry.)
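The Niagara comparison holds up on simple arithmetic (a sketch; Niagara's mean flow of roughly 2,400 cubic meters per second is an outside approximation, not a figure from the paper):

    # Average discharge implied by the reported lake drainage.
    lake_volume_m3 = 0.044 * 1e9          # 0.044 cubic kilometers
    drain_seconds = 90 * 60               # drained in 90 minutes

    discharge = lake_volume_m3 / drain_seconds
    print(f"mean discharge ~ {discharge:.0f} m^3/s")  # ~8,100 m^3/s

    print(lake_volume_m3 * 264.17 / 1e9)  # ~11.6 billion gallons, as reported
    niagara_m3_s = 2400.0                 # rough mean flow over Niagara Falls
    print(discharge / niagara_m3_s)       # several times Niagara's mean flow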
Measurements after the event suggest there's an efficient drainage system under the ice sheet that dispersed the meltwater widely. The draining of multiple such lakes could explain the observed net regional summer ice speedup, the authors write.
Along with Das and Joughin other authors on the two papers are Matt King, Newcastle University, UK; Ben Smith, Ian Howat (now at Ohio State) and Twila Moon of the UW's Applied Physics Laboratory; Mark Behn and Dan Lizarralde of Woods Hole Oceanographic Institution; and Maya Bhatia, Massachusetts Institute of Technology/WHOI Joint Program.
Adapted from materials provided by University of Washington.
Fausto Intilla - www.oloscience.com

Saturday, April 19, 2008

Climate Change Likely To Intensify Storms, New Study Confirms


Source:
ScienceDaily (Apr. 19, 2008) — Hurricanes in some areas, including the North Atlantic, are likely to become more intense as a result of global warming even though the number of such storms worldwide may decline, according to a new study by MIT researchers.
Kerry Emanuel, the lead author of the new study, wrote a paper in 2005 reporting an apparent link between a warming climate and an increase in hurricane intensity. That paper attracted worldwide attention because it was published in Nature just three weeks before Hurricane Katrina slammed into New Orleans.
Emanuel, a professor of atmospheric science in MIT's Department of Earth, Atmospheric and Planetary Sciences, says the new research provides an independent validation of the earlier results, using a completely different approach. The paper was co-authored by postdoctoral fellow Ragoth Sundararajan and graduate student John Williams and recently appeared in the Bulletin of the American Meteorological Society.
While the earlier study was based entirely on historical records of past hurricanes, showing nearly a doubling in the intensity of Atlantic storms over the last 30 years, the new work is purely theoretical. It made use of a new technique to add finer-scale detail to computer simulations called Global Circulation Models, which are the basis for most projections of future climate change.
"It strongly confirms, independently, the results in the Nature paper," Emanuel said. "This is a completely independent analysis and comes up with very consistent results."
Worldwide, both methods show an increase in the intensity and duration of tropical cyclones, the generic name for what are known as hurricanes in the North Atlantic. But the new work shows no clear change in the overall numbers of such storms when run on future climates predicted using global climate models.
However, Emanuel says, the new work also raises some questions that remain to be understood. When projected into the future, the model shows a continuing increase in power, "but a lot less than the factor of two that we've already seen," he says. "So we have a paradox that remains to be explained."
There are several possibilities, Emanuel says. "The last 25 years' increase may have little to do with global warming, or the models may have missed something about how nature responds to the increase in carbon dioxide."
Another possibility is that the recent hurricane increase is related to the fast pace of increase in temperature. The computer models in this study, he explains, show what happens after the atmosphere has stabilized at new, much higher CO2 concentrations. "That's very different from the process now, when it's rapidly changing," he says.
In the many different computer runs with different models and different conditions, "the fact is, the results are all over the place," Emanuel says. But that doesn't mean that one can't learn from them. And there is one conclusion that's clearly not consistent with these results, he said: "The idea that there is no connection between hurricanes and global warming, that's not supported," he says.
The work was partly funded by the National Science Foundation.
Adapted from materials provided by Massachusetts Institute Of Technology.
Fausto Intilla - www.oloscience.com

Global Land Temperature Warmest On Record In March 2008


Source:
ScienceDaily (Apr. 19, 2008) — The average global land temperature last month was the warmest on record and ocean surface temperatures were the 13th warmest. Combining the land and the ocean temperatures, the overall global temperature ranked the second warmest for the month of March. Global temperature averages have been recorded since 1880.
An analysis by NOAA’s National Climatic Data Center shows that the average temperature for March in the contiguous United States ranked near average for the past 113 years. It was the 63rd warmest March since record-keeping began in the United States in 1895.
Global Highlights
The global land surface temperature was the warmest on record for March, 3.3°F above the 20th century mean of 40.8°F. Temperatures more than 8°F above average covered much of the Asian continent. Two months after the greatest January snow cover extent on record on the Eurasian continent, the unusually warm temperatures led to rapid snow melt, and March snow cover extent on the Eurasian continent was the lowest on record.
The global surface (land and ocean surface) temperature was the second warmest on record for March in the 129-year record, 1.28°F above the 20th century mean of 54.9°F. The warmest March on record (1.33°F above average) occurred in 2002.
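NOAA reports these anomalies in Fahrenheit; since anomalies are temperature differences, they convert to Celsius with a plain 5/9 factor and no offset (a small sketch using the figures above):

    # Convert Fahrenheit temperature anomalies to Celsius (differences: no +32).
    def anomaly_f_to_c(delta_f):
        return delta_f * 5.0 / 9.0

    for label, delta_f in [("land, March 2008", 3.3),
                           ("land + ocean, March 2008", 1.28),
                           ("land + ocean, March 2002 record", 1.33)]:
        print(f"{label}: {delta_f}F = {anomaly_f_to_c(delta_f):.2f}C above the mean")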
Although the ocean surface average was only the 13th warmest on record, as the cooling influence of La Niña in the tropical Pacific continued, much warmer than average conditions across large parts of Eurasia helped push the global average to a near record high for March.
Despite above average snowpack levels in the U.S., the total Northern Hemisphere snow cover extent was the fourth lowest on record for March, remaining consistent with boreal spring conditions of the past two decades, in which warming temperatures have contributed to anomalously low snow cover extent.
Some weakening of La Niña, the cold phase of the El Niño-Southern Oscillation, occurred in March, but moderate La Niña conditions remained across the tropical Pacific Ocean.
U.S. Temperature Highlights
In the contiguous United States, the average temperature for March was 42°F, which was 0.4°F below the 20th century mean, ranking it as the 63rd warmest March on record, based on preliminary data.
Only Rhode Island, New Mexico and Arizona were warmer than average, while near-average temperatures occurred in 39 other states. The monthly temperature for Alaska was the 17th warmest on record, with an average temperature 3.8°F above the 1971-2000 mean.
The broad area of near-average temperatures kept the nation’s overall temperature-related residential energy demand for March near average, based on NOAA’s Residential Energy Demand Temperature Index.
U.S. Precipitation Highlights
Snowpack conditions dropped in many parts of the West in March, but in general, heavy snowfall during December-February has left the western snow pack among the healthiest in more than a decade, with most locations near to above average.
Nine states from Oklahoma to Vermont were much wetter than average, with Missouri experiencing its second wettest March on record. Much of the month’s precipitation fell March 17-20, when an intense storm system moved slowly from the southern Plains through the southern Midwest.
Rainfall amounts in a 48-hour period totaled 13.84 inches in Cape Girardeau, Mo., and 12.32 inches in Jackson, Mo. The heavy rainfall combined with previously saturated ground resulted in widespread major flooding of rivers and streams from the Missouri Ozarks eastward into southern Indiana.
From March 7-9, eight to 12 inches of snow fell from Louisville, Ky., to central Ohio. In Columbus, an all-time greatest 24-hour snowfall of 15.5 inches broke the old record of 12.3 inches set on April 4, 1987.
In the Southeast, a powerful tornado moved through downtown Atlanta on March 14, causing significant damage to many buildings. This was one of 90 tornado reports from the Southeast in March.
Rainfall in the middle of March improved drought conditions in much of the Southeast, but moderate-to-extreme drought still remained in more than 59 percent of the region.
In the western U.S., the weather pattern in March bore a greater resemblance to a typical La Niña, with especially dry conditions across Utah, Arizona, Nevada, and California. March was extremely dry in much of California, tying as the driest in 68 years at the Sacramento airport with 0.05 inches, a 2.75 inch departure from average.
Adapted from materials provided by National Oceanic And Atmospheric Administration.
Fausto Intilla - www.oloscience.com

Friday, April 18, 2008

Millions Of Pounds Of Trash Found On Ocean Beaches


Source:
ScienceDaily (Apr. 18, 2008) — Ocean Conservancy released its annual report on trash in the ocean with new data from the 2007 International Coastal Cleanup, the most comprehensive snapshot of the harmful impacts of marine debris. The mission of Ocean Conservancy’s International Coastal Cleanup is to engage people to remove trash from the world’s beaches and waterways, to identify the sources of debris and to change the behaviors that cause pollution.
This year, more than 378,000 volunteers participated in cleanups at every major body of water around the globe. Volunteers recorded the trash found on land and underwater, giving Ocean Conservancy a global snapshot of the problem.
"Our ocean is sick," says Laura Capps, Senior Vice President at Ocean Conservancy. "And the plain truth is that our ocean ecosystem cannot protect us unless it is healthy and resilient. Harmful impacts like trash in the ocean, pollution, climate change, and habitat destruction are taking their toll. But the good news is that hundreds of thousands of people from around the world are starting a sea change by joining together to clean up the ocean. Trash doesn’t fall from the sky; it falls from people’s hands. With the International Coastal Cleanup, everyone has an opportunity to make a difference, not just on one day but all year long."
Trash in the ocean kills more than one million seabirds and 100,000 marine mammals and turtles each year through ingestion and entanglement. This year, volunteers found 81 birds, 63 fish, 49 invertebrates, 30 mammals, 11 reptiles and one amphibian entangled in debris. The debris in which animals were entangled, or which they had ingested, included plastic bags, fishing line, fishing nets, six-pack holders, string from balloons or kites, glass bottles and cans.
Entanglement in Ocean Trash
Wraps around flippers, causing circulation loss and amputations
Creates wounds and cuts that lead to bacterial infections
Slows animals' ability to swim, making them more vulnerable to predators
Smothers or traps animals, causing them to drown
Causes starvation when animals can no longer eat or feed their young
Ingestion of Ocean Trash
Leads to starvation by blocking digestive tracts
Provides a false sense of being full once swallowed, which leads to starvation
Sharp objects like metal or glass perforate the stomach, causing internal bleeding
Trash becomes lodged in animals' windpipes, cutting off airflow and causing suffocation
How Long Does It Take for Trash in the Ocean to Decompose?
A tin can that entered the ocean in 1986 will still be decomposing in 2036
A plastic bottle that entered the ocean in 1986 will still be decomposing in 2436
A glass bottle that entered the ocean in 1986 will still be decomposing in the year 1,001,986
Prevention is the real solution to trash in the ocean. The International Coastal Cleanup volunteers make ocean conservation an everyday priority. Since 1986, more than six million volunteers have removed 116,000,000 pounds of debris across 211,460 miles of shoreline in 127 nations. The 23rd annual Flagship International Coastal Cleanup will be held Sept. 20, 2008.
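A quick arithmetic sketch puts those cumulative totals in per-mile and per-volunteer terms (derived only from the numbers in this article; the volunteer figure is a lower bound):

    # Per-mile and per-volunteer averages from the cumulative totals above.
    pounds = 116_000_000
    miles = 211_460
    volunteers = 6_000_000  # "more than six million", so a lower bound

    print(f"~{pounds / miles:.0f} pounds of debris per mile of shoreline")  # ~549
    print(f"~{pounds / volunteers:.1f} pounds per volunteer")               # ~19.3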
Adapted from materials provided by Ocean Conservancy.
Fausto Intilla - www.oloscience.com

Ice Sheet 'Plumbing System' Found: Lakes Of Meltwater Can Crack Greenland's Ice And Contribute To Faster Ice Sheet Flow


Source:
ScienceDaily (Apr. 18, 2008) — Researchers from the Woods Hole Oceanographic Institution (WHOI) and the University of Washington (UW) have for the first time documented the sudden and complete drainage of a lake of meltwater from the top of the Greenland ice sheet to its base.
From those observations, scientists have uncovered a plumbing system for the ice sheet, where meltwater can penetrate thick, cold ice and accelerate some of the large-scale summer movements of the ice sheet.
According to research by glaciologists Sarah Das of WHOI and Ian Joughin of UW, the lubricating effect of the meltwater can accelerate ice flow by 50 to 100 percent in some of the broad, slow-moving areas of the ice sheet.
“We found clear evidence that supraglacial lakes—the pools of meltwater that form on the surface in summer—can actually drive a crack through the ice sheet in a process called hydrofracture,” said Das, an assistant scientist in the WHOI Department of Geology and Geophysics. “If there is a crack or defect in the surface that is large enough, and a sufficient reservoir of water to keep that crack filled, it can create a conduit all the way down to the bed of the ice sheet.”
But the results from Das and Joughin also show that while surface melt plays a significant role in overall ice sheet dynamics, it has a more subdued influence on the fast-moving outlet glaciers (which discharge ice to the ocean) than has frequently been hypothesized. (To learn more about this result, read the corresponding news release from UW.)
The research by Das and Joughin was compiled into two complementary papers and published on April 17 in the online journal Science Express. The papers will be printed in the journal Science on May 9.
Co-authors of the work include Mark Behn, Dan Lizarralde, and Maya Bhatia of WHOI; Ian Howat, Twila Moon, and Ben Smith of UW; and Matt King of Newcastle University.
Thousands of lakes form on top of Greenland’s glaciers every summer, as sunlight and warm air melt ice on the surface. Past satellite observations have shown that these supraglacial lakes can disappear in as little as a day, but scientists did not know where the water was going or how quickly, nor the impact on ice flow.
Researchers have hypothesized that meltwater from the surface of Greenland’s ice sheet might be lubricating the base. But until now, there were only theoretical predictions of how the meltwater could reach the base through a kilometer of subfreezing ice.
“We set out to examine whether the melting at the surface—which is sensitive to climate change—could influence how fast the ice can flow,” Das said. “To influence flow, you have to change the conditions underneath the ice sheet, because what’s going on beneath the ice dictates how quickly the ice is flowing."
"If the ice sheet is frozen to the bedrock or has very little water available," Das added, "then it will flow much more slowly than if it has a lubricating and pressurized layer of water underneath to reduce friction.”
In the summers of 2006 and 2007, Das, Joughin, and colleagues used seismic instruments, water-level monitors, and Global Positioning System sensors to closely monitor the evolution of two lakes and the motion of the surrounding ice sheet. They also used helicopter and airplane surveys and satellite imagery to monitor the lakes and to track the progress of glaciers moving toward the coast.
The most spectacular observations occurred in July 2006 when their instruments captured the sudden, complete draining of a lake that had once covered 5.6 square kilometers (2.2 square miles) of the surface and held 0.044 cubic kilometers (11.6 billion gallons) of water.
Like a draining bathtub, the entire lake emptied from the bottom in 24 hours, with the majority of the water flowing out in a 90-minute span. The maximum drainage rate was faster than the average flow rate over Niagara Falls.
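A rough back-of-the-envelope check of those rates is possible from the reported figures alone. In the sketch below, the lake volume and durations come from the study; the Niagara Falls comparison flow of roughly 2,400 cubic metres per second is an assumed round figure, and the 90-minute rate is an upper bound, since the majority of the water, not all of it, drained in that span:

# Drainage-rate arithmetic from the reported figures.
LAKE_VOLUME_M3 = 0.044e9  # 0.044 cubic kilometres, in cubic metres

avg_rate_24h = LAKE_VOLUME_M3 / (24 * 3600)   # whole lake over 24 hours
peak_rate_90min = LAKE_VOLUME_M3 / (90 * 60)  # upper bound: all water in 90 minutes

NIAGARA_M3_PER_S = 2400  # assumed average flow over Niagara Falls, for comparison

print(f"Average rate over 24 h:        {avg_rate_24h:,.0f} m^3/s")
print(f"Upper bound over 90 minutes:   {peak_rate_90min:,.0f} m^3/s")
print(f"Niagara Falls average (assumed): {NIAGARA_M3_PER_S:,} m^3/s")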
Closer inspection of the data revealed that the pressure of the water from the lake split open the ice sheet from top to bottom, through 980 meters (3,200 feet) of ice. This water-driven fracture delivered meltwater directly to the base, raising the surface of the ice sheet by 1.2 meters in one location.
In the middle of the lake bottom, a 750-meter (2,400-foot) wide block of ice was raised by 6 meters (20 feet). The horizontal speed of the ice sheet, which is constantly in motion even under normal circumstances, became twice the average daily rate for that location.
“It’s hard to envision how a trickle or a pool of meltwater from the surface could cut through thick, cold ice all the way to the bed,” said Das. “For that reason, there has been a debate in the scientific community as to whether such processes could exist, even though some theoretical work has hypothesized this for decades.”
The seismic signature of the fractures, the rapid drainage, and the uplift and movement of the ice all showed that water had flowed all the way down to the bed. As cracks and crevasses form and become filled with water, the greater weight and density of the water forces the ice to crack open.
As water pours down through these cracks, it forms moulins (cylindrical, vertical conduits) through the ice sheet that allow rapid drainage and likely remain open for the rest of the melt season.
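One way to see why a water-filled crack keeps propagating is simple hydrostatics: water is denser than ice, so at depth the pressure of the water column exceeds the pressure of the surrounding ice. The sketch below uses standard density values and the article's 980-metre ice thickness; it is textbook physics for illustration, not the authors' full fracture model:

# Excess pressure of a water-filled crevasse over the ice overburden.
RHO_WATER = 1000.0  # kg/m^3, standard value
RHO_ICE = 917.0     # kg/m^3, standard value
G = 9.81            # m/s^2

def excess_pressure_pa(depth_m: float) -> float:
    """Excess of water pressure over ice pressure at the crack tip."""
    return (RHO_WATER - RHO_ICE) * G * depth_m

# At the 980 m ice thickness reported above: roughly 0.8 MPa of excess pressure.
print(f"Excess pressure at 980 m: {excess_pressure_pa(980) / 1e6:.1f} MPa")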
Das, Joughin, and their field research team will be featured this summer during an online- and museum-based outreach project known as Polar Discovery. Their return research expedition to Greenland will be chronicled daily through photo essays, and the researchers will conduct several live conversations with students, educators, and museum visitors via satellite phone.
Funding for the research was provided by the National Science Foundation, the National Aeronautics and Space Administration, the WHOI Clark Arctic Research Initiative, and the WHOI Oceans and Climate Change Institute.
Adapted from materials provided by Woods Hole Oceanographic Institution.
Fausto Intilla - www.oloscience.com

Seven Months On A Drifting Ice Floe


Source:

ScienceDaily (Apr. 18, 2008) — For the first time, a German researcher has taken part in a Russian drift expedition, exploring the atmosphere above the central Arctic during the polar night. Jürgen Graeser, a member of the Potsdam Research Unit of the Alfred Wegener Institute for Polar and Marine Research in the Helmholtz Association, has just returned home to Germany. As a member of the 21-person Russian expedition NP 35 (the 35th North Pole Drift Expedition), he spent seven months on a drifting ice floe in the Arctic.
The 49-year-old scientific technician gathered observational data from a region that is normally inaccessible during the Arctic winter and therefore largely unexplored. Ascents with a tethered balloon up to an altitude of 400 metres, as well as balloon-borne sensor ascents up to an altitude of 30 kilometres, provided data that will help improve existing climate models for the Arctic.
In spite of its importance for the global climate system, the Arctic is still a blank spot on the data map. Until now, continuous measurements of the atmosphere above the Arctic Ocean have been missing. "We cannot develop reliable climate scenarios without data series of high temporal and spatial resolution covering the Arctic winter. The data which Jürgen Graeser obtained in the course of the NP 35 expedition are unique, and they can considerably reduce the remaining uncertainties in our climate models," said Prof. Dr. Klaus Dethloff, project leader at the Alfred Wegener Institute for Polar and Marine Research.
Russian-German co-operation
Since 1937/38, the Russian Institute for Arctic and Antarctic Research (AARI) has operated 34 North Pole drift stations. In the course of the International Polar Year 2007/2008, a foreigner was allowed to take part in a drift expedition (NP 35) for the first time. Thanks to their close co-operation with the AARI, the scientists of the Potsdam Research Unit of the Alfred Wegener Institute were able to carry out a project to study the polar atmosphere in this largely inaccessible region of the Arctic Ocean.
The expedition NP 35
From September 2007 to April 2008, the scientific technician Jürgen Graeser from the Potsdam Research Unit was a member of the NP 35 team. For seven months, the 49-year-old lived and worked together with twenty Russian colleagues on an ice floe measuring three by five kilometres. While Graeser concentrated on measuring the Arctic atmosphere, the Russian scientists investigated the upper layer of the ocean, the characteristics of the sea ice, the snow cover and the energy balance above the ice surface. They also recorded atmospheric data on temperature, moisture, wind and air pressure using ground stations as well as radiosonde ascents. Over the course of the winter, the ice floe drifted 850 kilometres in a northwesterly direction across the Arctic Ocean.
In April, Jürgen Graeser was picked up from the ice floe by Polar 5, the research aircraft of the Alfred Wegener Institute. A specialised pilot, Brian Burchartz of Enterprise Airlines, Oshawa, Canada, accomplished the difficult landing and take-off on the ice. "I experienced my stay on the ice floe as an incredible enrichment, both personally and professionally," Jürgen Graeser said. The Russian colleagues will continue their measurements until the planned evacuation of the station in September 2008.
The exploration of the atmospheric boundary layer
During the drift, Jürgen Graeser explored the atmosphere above the Arctic Ocean. To measure the meteorological structure of the Arctic boundary layer and its changes over time, he regularly launched a helium-filled tethered balloon. Six sensors fixed to the tether registered temperature, air pressure, moisture and wind, and sent the data to Graeser's computer. The exchange of heat, momentum and moisture between the Earth's surface and the atmosphere, which is important for the climate, takes place in the layer between the ground and an altitude of about 400 metres.
For the first time, the spatial and temporal structure of ground-level temperature inversions has been measured throughout the complete polar night. To evaluate and interpret the data, the scientists in Potsdam performed simulations with a regional climate model of the Arctic. Preliminary comparisons of temperature profiles measured on the ice floe with those from the regional climate model underline the importance of Jürgen Graeser's measurements: considerable deviations between observed and modelled data appear in the region between the ground and an altitude of about 400 metres. Subsequent research in Potsdam will focus on the connection between the Arctic boundary layer and the development and tracks of low-pressure systems.
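To illustrate the kind of boundary-layer structure such profiles capture, the sketch below detects a surface-based temperature inversion, where temperature increases with height, in a small hypothetical profile. The six heights and temperatures are invented for illustration; they are not NP 35 data:

# Hypothetical tethered-balloon profile: (height in metres, temperature in deg C).
profile = [
    (0, -32.0), (50, -28.5), (100, -26.0), (200, -25.5), (300, -27.0), (400, -28.0),
]

def inversion_top(profile):
    """Return the height where temperature stops increasing with altitude,
    i.e. the approximate top of a surface-based inversion, or None."""
    if profile[1][1] <= profile[0][1]:
        return None  # temperature falls from the ground up: no surface inversion
    for (h1, t1), (h2, t2) in zip(profile, profile[1:]):
        if t2 <= t1:
            return h1
    return profile[-1][0]

top = inversion_top(profile)
print(f"Surface inversion up to about {top} m" if top else "No surface inversion")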
The investigation of the atmosphere – ozone
Vertically high-resolution ozone data from the central Arctic are rare. To close this data gap, Jürgen Graeser regularly launched a research balloon equipped with a radiosonde and an ozone sensor; these balloons carry the sensors up to an altitude of about 30 kilometres. In the past winter, the region of the ozone layer at an altitude of about 20 kilometres was exceptionally cold, continuing the trend toward colder conditions at this altitude observed in recent years. These cold conditions fostered considerable destruction of the Arctic ozone layer last winter. The unique measurements from NP 35 will contribute significantly to determining precisely how much of the ozone destruction is caused by human activities.
"The heavy workload of the extensive measuring programme made the time on the ice floe go by extremely fast," Jürgen Graeser said on his return. Daily life was structured by the measurements on the one hand and by meals with colleagues on the other. A cook was responsible for feeding the whole team, but every overwinterer helped with the kitchen work for one day every three weeks. This kitchen duty coincided with station duty: checking the condition of the ice floe and watching for polar bears near the station. These tasks turned out to be very important, for over the course of the winter the ice floe cracked several times, although the crevices closed again, and frequent visits from polar bears regularly caused alarm among the participants. Jürgen Graeser was able to communicate with his Potsdam colleagues via satellite telephone and to relay current measurement data promptly.
Future projects
The long-term aim is to significantly reduce the large uncertainties of present climate models in the polar regions. Climate models rely on mathematical descriptions of physical processes that take place under natural conditions. These so-called "parameterizations" are based on measured data, and only an excellent data base enables them to produce realistic climate simulations. In November 2008, the scientists taking part in the NP-35 project will discuss the results of their expedition at an international workshop in Potsdam. Altogether, the NP-35 project is one more significant milestone for the Potsdam atmospheric researchers. The results provide an important basis for the international core projects CliC (Climate and Cryosphere) and SPARC (Stratospheric Processes and their Role in Climate) of the World Climate Research Programme (WCRP, wcrp.wmo.int/).
Adapted from materials provided by Helmholtz Association of German Research Centres.

Fausto Intilla - www.oloscience.com

Thursday, April 17, 2008

Jet Streams Are Shifting And May Alter Paths Of Storms And Hurricanes


Source:

ScienceDaily (Apr. 17, 2008) — The Earth's jet streams, the high-altitude bands of fast winds that strongly influence the paths of storms and other weather systems, are shifting--possibly in response to global warming. Scientists at the Carnegie Institution determined that over a 23-year span from 1979 to 2001 the jet streams in both hemispheres have risen in altitude and shifted toward the poles. The jet stream in the northern hemisphere has also weakened. These changes fit the predictions of global warming models and have implications for the frequency and intensity of future storms, including hurricanes.
Cristina Archer and Ken Caldeira of the Carnegie Institution's Department of Global Ecology tracked changes in the average position and strength of jet streams using records compiled by the European Centre for Medium-Range Weather Forecasts, the National Centers for Environmental Prediction, and the National Center for Atmospheric Research. The data included outputs from weather prediction models, conventional observations from weather balloons and surface instruments, and remote observations from satellites.
Jet streams twist and turn in a wide swath that changes from day to day. The poleward shift in their average location discovered by the researchers is small, about 19 kilometers (12 miles) per decade in the northern hemisphere, but if the trend continues the impact could be significant. "The jet streams are the driving factor for weather in half of the globe," says Archer. "So, as you can imagine, changes in the jets have the potential to affect large populations and major climate systems."
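The scale of that trend is easy to work out. The sketch below simply extrapolates the reported 19 kilometres-per-decade drift linearly; it is an illustration of magnitude, not the authors' model:

# Linear extrapolation of the observed poleward drift of the jet streams.
SHIFT_KM_PER_DECADE = 19  # reported northern-hemisphere average

def poleward_shift_km(years: float) -> float:
    """Total poleward shift if the trend simply continues."""
    return SHIFT_KM_PER_DECADE * years / 10

print(f"Over the 1979-2001 study period (~23 yr): {poleward_shift_km(23):.0f} km")
print(f"If sustained for a full century:          {poleward_shift_km(100):.0f} km")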
Storm paths in North America are likely to shift northward as a result of the jet stream changes. Hurricanes, whose development tends to be inhibited by jet streams, may become more powerful and more frequent as the jet streams move away from the sub-tropical zones where hurricanes are born.
The observed changes are consistent with numerous other signals of global warming found in previous studies, such as the widening of the tropical belt, the cooling of the stratosphere, and the poleward shift of storm tracks. However, this is the first study to use observation-based datasets to examine trends in all the jet stream parameters.
"At this point we can't say for sure that this is the result of global warming, but I think it is," says Caldeira. "I would bet that the trend in the jet streams' positions will continue. It is something I'd put my money on."
The results are published in the April 18 Geophysical Research Letters.
Adapted from materials provided by Carnegie Institution, via EurekAlert!, a service of AAAS.

Fausto Intilla - www.oloscience.com

Worst Offenders For Carbon Dioxide Emissions: Top 20 US Counties Identified


Source:

ScienceDaily (Apr. 17, 2008) — The top twenty carbon dioxide-emitting counties in the United States have been identified by a research team led by Purdue University.
The top three counties include the cities of Houston, Los Angeles and Chicago.
Kevin Gurney, an assistant professor of earth and atmospheric science at Purdue University and leader of the carbon dioxide inventory project, which is called Vulcan, says the biggest surprise is that each region of the United States is included in the ranking.
"It shows that CO2 emissions are really spread out across the country," he says. "Texas, California, New York, Florida, New Mexico, the Midwest — Indiana, Illinois, Ohio — and Massachusetts are all listed. No region is left out of the ranking, it would seem."
The listing of the counties includes the largest city in each county. The numbers are millions of tons of carbon emitted per year; a rough conversion to CO2 terms is sketched after the list.
Harris, Texas (Houston) — 18.625 million tons of carbon per year
Los Angeles, Calif. (Los Angeles) — 18.595
Cook, Ill. (Chicago) — 13.209
Cuyahoga, Ohio (Cleveland) — 11.144
Wayne, Mich. (Detroit) — 8.270
San Juan, N.M. (Farmington) — 8.245
Santa Clara, Calif. (San Jose) — 7.995
Jefferson, Ala. (Birmingham) — 7.951
Wilcox, Ala. (Camden) — 7.615
East Baton Rouge, La. (Baton Rouge) — 7.322
Titus, Texas (Mt. Pleasant) — 7.244
Carbon, Pa. (Jim Thorpe) — 6.534
Porter, Ind. (Valparaiso) — 6.331
Jefferson, Ohio (Steubenville) — 6.278
Indiana, Pa. (Indiana) — 6.224
Middlesex, Mass. (Boston metro area) — 6.198
Bexar, Texas (San Antonio) — 6.141
Hillsborough, Fla. (Tampa) — 6.037
Suffolk, N.Y. (New York metro area) — 6.030
Clark, Nev. (Las Vegas) — 5.955
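For readers more used to CO2 figures, carbon tonnage can be converted to CO2 tonnage with the molecular-weight ratio 44/12 (about 3.67), since CO2 weighs 44 g/mol against carbon's 12 g/mol. A minimal sketch using the top three counties from the list above:

# Standard carbon-to-CO2 mass conversion applied to the ranking above.
CO2_PER_CARBON = 44 / 12  # ~3.67 tons of CO2 per ton of carbon

top_counties = [  # (county, city, million tons of carbon per year)
    ("Harris, Texas", "Houston", 18.625),
    ("Los Angeles, Calif.", "Los Angeles", 18.595),
    ("Cook, Ill.", "Chicago", 13.209),
]

for county, city, carbon_mt in top_counties:
    co2_mt = carbon_mt * CO2_PER_CARBON
    print(f"{county} ({city}): {carbon_mt} Mt C -> {co2_mt:.1f} Mt CO2")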
The current emissions are based on information from 2002, but the Vulcan system will soon expand to more recent years.
Gurney says Vulcan, which is named for the Roman god of fire, quantifies all of the CO2 that results from the burning of fossil fuels such as coal and gasoline. It also tracks the hourly outputs at the level of factories, power plants, roadways, neighborhoods and commercial districts.
"It's interesting that the top county, Harris, Texas, is on the list because of industrial emissions, but the second highest CO2 emitting county, Los Angels, California, is on the list because of automobile emissions," Gurney says. "So it's not just cars, and it's not just factories, that are emitting the carbon dioxide, but a combination of different things."
Gurney notes that some counties are on the list because they produce goods or power for people living elsewhere.
"Counties such as Titus, Texas, Indiana, Pennsylvania, and Clark, Nevada, are dominated by large power production facilities that serve populations elsewhere," he says.
"My favorite one on the list is Carbon, Pennsylvania," Gurney adds.
The three-year project, which was funded by NASA and the U.S. Department of Energy under the North American Carbon Program, involved researchers from Purdue University, Colorado State University and Lawrence Berkeley National Laboratory.
The Vulcan data are available for anyone to download from the Web site at http://www.eas.purdue.edu/carbon/vulcan. Smaller summary data sets that offer a slice of the data and are easier to download are also available for non-scientists on the Vulcan Web site. These can be broken down by emission category (such as industrial, residential, transportation, and power producers) and by fuel type, and are available by state, county, or grid cells as small as six miles (10 kilometers) across.
Adapted from materials provided by Purdue University.

Fausto Intilla - www.oloscience.com

Wednesday, April 16, 2008

Ward Hunt Ice Shelf, Largest In Northern Hemisphere, Has Fractured Into Three Main Pieces


Source:

ScienceDaily (Apr. 16, 2008) — A team of scientists including polar expert Dr. Derek Mueller from Trent University and Canadian Rangers have discovered that the largest ice shelf in the Northern Hemisphere has fractured into three main pieces.
During their sovereignty patrol across the northernmost parts of Canada over the last two weeks, they visited a new 18-kilometre-long network of cracks running from the southern edge of the Ward Hunt Ice Shelf to the Arctic Ocean. This accompanies a large central fracture that was first detected in 2002, and raises the concern that the remaining ice shelf will disintegrate within the next few years.
Evidence of these cracks first came from Radarsat satellite images in February. Confirmation came from Canadian Rangers, partnering with International Polar Year scientists during Operation NUNALIVUT 08, a Canadian Forces High Arctic sovereignty patrol. Rangers mapped the extent of the fissures and monitored melt rates for Quttinirpaaq National Park, which encompasses the ice shelf.
The patrol scientists also found that the nearby Petersen Ice Shelf lost over a third of its surface area in the past three years. This ice shelf calved following the break-up, in the summers of 2005 and 2007, of the landfast sea ice that had protected it from the open ocean.
“Canadian ice shelves have undergone substantial changes in the past six years, starting with the first break-up event on the Ward Hunt Ice Shelf, and the loss of the Ayles Ice Shelf,” said Dr. Luke Copland of the University of Ottawa. “These latest break-ups we are seeing have come after decades of warming and are irreversible,” said Dr. Derek Mueller of Trent University.
Only five large ice shelves remain in Arctic Canada, covering less than a tenth of the area they did a century ago.
Derek Mueller holds Trent's Roberta Bondar Fellowship in Northern and Polar Studies.
Adapted from materials provided by Trent University.

Fausto Intilla - www.oloscience.com

Tuesday, April 15, 2008

Better Understanding Of Hurricane Trajectories Learned From Patterns On Soap Bubbles


Source:

ScienceDaily (Apr. 15, 2008) — Researchers at the Centre de Physique Moléculaire Optique et Hertzienne (CPMOH, CNRS/Université Bordeaux 1) and the Université de la Réunion(1) have discovered that vortices created in soap bubbles behave like real cyclones and hurricanes in the atmosphere. Soap bubbles have enabled the researchers to characterize for the first time the random factor that governs the movement and paths of vortices. These results, published in the journal Physical Review Letters, could lead to a better understanding of such increasingly common and often devastating atmospheric phenomena.
A soap bubble is an ideal model for studying the atmosphere because it has analogous physical properties and, like the atmosphere, it is composed of a very thin film in relation to its diameter(2). In this experiment, the researchers created a half soap bubble that they heated at the “equator” and cooled at the “poles”, thereby creating a single large vortex, similar to a hurricane, in the wall of the bubble. The researchers studied the movement of this vortex, which fluctuates in a random manner characterized by a superdiffusive law(3), well known to physicists but never before observed for single vortices in a turbulent environment.
The disconcerting resemblance between vortices on soap bubbles and cyclones led the researchers to study their similarities. By analyzing in detail the trajectories of certain recent hurricanes such as Ivan, Jane, Nicholas, etc., the researchers measured the random factor that is always present in the movement of hurricanes. They then demonstrated the remarkable similarity of these fluctuations with those that characterize the disordered movement of the vortices that they created on soap bubbles.(4)
Taking this random factor into account in predicting the trajectory of hurricanes will be useful in anticipating the probability of impact on a given site or locality. Although the mean trajectory of hurricanes (without any fluctuations) is beginning to be well simulated by meteorologists, this random factor has, until now, been poorly understood. This discovery highlights a universality in the statistics of trajectory fluctuations and should make it possible in the future to better predict the behavior of hurricanes and anticipate the risks.
Notes:
1) Laboratoire de Génie Industriel.
2) The skin or film of soap is only several microns thick whereas the diameter of the bubble is around ten centimeters.
3) A law corresponding to a “Lévy flight”, a type of random walk dominated by a small number of jumps of very large amplitude.
4) With a similar superdiffusive law.
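To give a feel for the “Lévy flight” behaviour described in note 3, the sketch below compares an ordinary random walk with one whose step lengths follow a heavy-tailed Pareto distribution, so that a few large jumps dominate the total displacement. It is purely illustrative and is not the authors' analysis of bubble vortices or hurricane tracks:

# Ordinary diffusion vs. a Levy flight in two dimensions.
import math
import random

random.seed(0)

def walk(n_steps, levy=False, alpha=1.5):
    """Return final (x, y) after n_steps of a 2-D random walk."""
    x = y = 0.0
    for _ in range(n_steps):
        theta = random.uniform(0, 2 * math.pi)  # random direction
        # Pareto-distributed step lengths give the heavy tail;
        # unit steps give ordinary diffusion for comparison.
        r = random.paretovariate(alpha) if levy else 1.0
        x += r * math.cos(theta)
        y += r * math.sin(theta)
    return x, y

for label, levy in (("ordinary walk", False), ("Levy flight", True)):
    d = math.hypot(*walk(10_000, levy))
    print(f"{label}: displacement after 10,000 steps ~ {d:,.0f}")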
Journal reference: F. Seychelles, Y. Amarouchene, M. Bessafi, and H. Kellay. Thermal convection and emergence of isolated vortices in soap bubbles. Physical Review Letters, April 7, 2008. (Université Bordeaux 1, CPMOH UMR 5798 du CNRS; Université de la Réunion, Laboratoire de Génie Industriel.)
Adapted from materials provided by CNRS.

Fausto Intilla - www.oloscience.com

California Has More Than 99% Chance Of A Big Earthquake Within 30 Years, Report Shows


Source:
ScienceDaily (Apr. 15, 2008) — California has more than a 99% chance of having a magnitude 6.7 or larger earthquake within the next 30 years, according to scientists using a new model to determine the probability of big quakes.
The likelihood of a major quake of magnitude 7.5 or greater in the next 30 years is 46%, and such a quake is most likely to occur in the southern half of the state.
The new study determined the probabilities that different parts of California will experience earthquake ruptures of various magnitudes. The new statewide probabilities are the result of a model that comprehensively combines information from seismology, earthquake geology, and geodesy (measuring precise locations on the Earth's surface). For the first time, probabilities for California having a large earthquake in the next 30 years can be forecast statewide.
"This new, comprehensive forecast advances our understanding of earthquakes and pulls together existing research with new techniques and data," explained USGS geophysicist and lead scientist Ned Field. "Planners, decision makers and California residents can use this information to improve public safety and mitigate damage before the next destructive earthquake occurs."
The new information is being provided to decision makers who establish local building codes, earthquake insurance rates, and emergency planning and will assist in more accurate planning for inevitable future large earthquakes.
The official earthquake forecasts, known as the "Uniform California Earthquake Rupture Forecast (UCERF)," were developed by a multidisciplinary group of scientists and engineers, known as the Working Group on California Earthquake Probabilities. Building on previous studies, the Working Group updated and developed the first-ever statewide, comprehensive model of California.
The organizations sponsoring the Working Group include the U.S. Geological Survey, the California Geological Survey and the Southern California Earthquake Center. An independent scientific review panel, as well as the California and National Earthquake Prediction Evaluation Councils, have evaluated the new UCERF study.
The consensus of the scientific community on forecasting California earthquakes allows for meaningful comparisons of earthquake probabilities in Los Angeles and the San Francisco Bay Area, as well as comparisons among several large faults.
The probability of a magnitude 6.7 or larger earthquake over the next 30 years striking the greater Los Angeles area is 67%, and in the San Francisco Bay Area it is 63%, similar to previous Bay Area estimates. For the entire California region, the fault with the highest probability of generating at least one magnitude 6.7 quake or larger is the southern San Andreas (59% in the next 30 years).
For northern California, the most likely source of such earthquakes is the Hayward-Rodgers Creek Fault (31% in the next 30 years). Such quakes can be deadly, as shown by the 1989 magnitude 6.9 Loma Prieta and the 1994 magnitude 6.7 Northridge earthquakes.
Earthquake probabilities for many parts of the state are similar to those in previous studies, but the new probabilities calculated for the Elsinore and San Jacinto Faults in southern California are about half those previously determined. For the far northwestern part of the State, a major source of earthquakes is the offshore 750-mile-long Cascadia Subduction Zone, the southern part of which extends about 150 miles into California. For the next 30 years there is a 10% probability of a magnitude 8 to 9 quake somewhere along that zone. Such quakes occur about once every 500 years on average.
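For context, the simplest way to relate a mean recurrence interval to an N-year probability is the time-independent Poisson model sketched below. UCERF itself combines time-dependent, fault-specific information, so this simple formula does not reproduce the report's exact figures:

# Poisson (time-independent) probability of at least one event in a window.
import math

def poisson_probability(recurrence_years: float, window_years: float) -> float:
    """P(at least one event in the window) for a Poisson process."""
    return 1.0 - math.exp(-window_years / recurrence_years)

# Cascadia-style example: ~500-year average recurrence, 30-year window.
p = poisson_probability(500, 30)
print(f"30-year probability at a 500-year recurrence: {p:.1%}")  # roughly 6%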
The new model does not estimate the likelihood of shaking (seismic hazard) that would be caused by quakes. Even areas in the state with a low probability of fault rupture could experience shaking and damage from distant, powerful quakes. The U.S. Geological Survey (USGS) is incorporating the UCERF into its official estimate of California's seismic hazard, which in turn will be used to update building codes. Other subsequent studies will add information on the vulnerability of manmade structures to estimate expected losses, which is called "seismic risk." In these ways, the UCERF will help to increase public safety and community resilience to earthquake hazards.
The results of the UCERF study serve as a reminder that all Californians live in earthquake country and should be prepared. Although earthquakes cannot be prevented, the damage they do can be greatly reduced through prudent planning and preparedness. The ongoing work of the Southern California Earthquake Center, USGS, California Geological Survey, and other scientists in evaluating earthquake probabilities is part of the National Earthquake Hazard Reduction Program's efforts to safeguard lives and property from the future quakes that are certain to strike in California and elsewhere in the United States.
Adapted from materials provided by U.S. Geological Survey.

Fausto Intilla - www.oloscience.com