
Tuesday, June 23, 2009

How Aerosols Contribute To Climate Change


ScienceDaily (June 23, 2009) — What happens in Vegas may stay in Vegas, but what happens on the way there is a different story.
As imaged by Lynn Russell, a professor of atmospheric chemistry at Scripps Institution of Oceanography at UC San Diego, and her team, air blown by winds between San Diego and Las Vegas gives the road to Sin City a distinctive look.
The team has sampled air from the tip of the Scripps Pier since last year, creating a near real-time record of what kinds of particles — from sea salt to car exhaust — are floating around at any given time. Add data about wind speed and direction and the scientists can tell where particles came from and can map their pathways around Southern California.
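To make that provenance analysis concrete, here is a minimal back-trajectory sketch in Python: given wind speed and direction measured at a sampling site, it steps an air parcel backwards in time to estimate where sampled particles came from. All numbers are invented for illustration, and operational studies use full three-dimensional trajectory models (such as NOAA's HYSPLIT) rather than a single station's winds.

    import numpy as np

    # Toy single-station back-trajectory: step an air parcel backwards
    # through hourly wind observations to estimate its origin.
    hours = 24
    wind_speed = np.full(hours, 5.0)      # m/s, assumed steady for the sketch
    wind_dir = np.full(hours, 60.0)       # degrees; wind FROM the northeast

    x, y = 0.0, 0.0                       # sampling site, metres east/north
    for spd, wdir in zip(wind_speed, wind_dir):
        # Meteorological "from" direction -> the vector the air travelled along.
        u = -spd * np.sin(np.deg2rad(wdir))
        v = -spd * np.cos(np.deg2rad(wdir))
        x -= u * 3600.0                   # one hour backwards in time
        y -= v * 3600.0

    print(f"estimated source region ~{np.hypot(x, y) / 1000:.0f} km upwind")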
When Russell and her students put it all together, the atmosphere of greater San Diego comes alive in colors representing the presence of different airborne chemical compounds in aerosol form. One streak of deep red draws a distinct line from the pier that sometimes extends all the way to Las Vegas. The red denotes organic mass, a carbon-based component of vehicular and industrial emissions that pops up on Russell’s readouts frequently. Plot the streak on a road atlas and it reveals the daily life of pollution in Southern California. For one stretch of time, it neatly traced Interstate 15 all the way past the California-Nevada border.
“We were really surprised,” said Russell. “We did not expect to have such consistent winds for the selected study days.”
The hunt for various types of aerosols is helping Russell draw new kinds of global maps, ones that depict what organic compounds—whether natural or from sources such as Southern California traffic and industries—could do to affect rainfall, snowfall, atmospheric warming and cooling, and a host of other climate phenomena. Russell is part of an effort that involves several researchers at Scripps and UCSD and around the world. Collectively they are attempting to address a human-caused phenomenon in the Earth system scarcely considered before the last decade.
Aerosol research is considered one of the most critical frontiers of climate change science, much of which is devoted to the creation of accurate projections of future climate. These projections are generated by computer models — simulations of phenomena such as warming patterns, sea level fluctuations, or drought trends. The raw data for the models can come from historical records of climate basics like temperature and precipitation, but scientists often must rely on incomplete data and best guesses to represent more complex phenomena. The more such uncertainty goes into a model, the greater its margin of error becomes, making it less reliable as a guide for forecasts and adaptive actions.
Among these complex phenomena, the actions of aerosols are what some researchers consider the field’s holy grail, representing the biggest barrier to producing accurate representations of climate. In fact, the Intergovernmental Panel on Climate Change in 2007 specifically listed the effect of aerosols on cloud formation as the largest source of uncertainty in present-day climate models.
Bits of dust, sea salt, the remnants of burned wood, and even living things like bacteria all add to the mix of aerosols that create the skeletons on which clouds form. Around these particles, water and ice condense and cluster into cloud masses. The size and number of each of these droplets determine whether the clouds can produce rain or snow.
The aerosols are also influencing climate in other ways. Diesel exhaust, industrial emissions, and the smoke from burning wood and brush eject myriad bits of black carbon, usually in the form of soot, into the sky and form so-called “brown clouds” of smog. This haze has a dual heating and cooling effect. The particles absorb heat and make the air warmer at the altitudes to which they tend to rise but they also deflect sunlight back into space. This shading effect cools the planet at ground level.
The Arctic Circle is one of the places in the world most sensitive to changes in the mix of aerosols. Since the beginnings of the Industrial Revolution, scientists and explorers have noted the presence of the Arctic haze, a swirl of pollution that appears when sunlight returns after a winter of darkness. The presence of smog over a mostly uninhabited region leads many scientists to believe it is the reason the Arctic is experiencing the most rapid climate-related changes in the world. The haze now lingers for a longer period of time every year. It may be contributing to the forces now causing a meltdown of Arctic ice, a release of methane once stored in permafrost, and a host of ecological changes affecting the spectrum of organisms from mosquitoes to polar bears.
Russell has taken part in two recent analyses of polar air to understand where its imported aerosols come from and how the chemical components of those aerosols could be affecting temperature and cloud formation. From a research vessel in the Norwegian Sea and via continuous measurements from a ground station in Barrow, Alaska, Russell’s team is analyzing particles likely to have been blown to the Arctic from Europe and Asia. Her group has just compiled a full season of air samples fed through intake valves onto filters collected at Barrow.
With it, she believes she has proven what colleagues have previously theorized about where the particles are coming from. She is especially interested in organic particles—aerosols containing carbon supplied either by natural sources such as ocean or land plants or by human sources. Work in her group has shown that organics in the spring haze carry a signature consistent with dust and biomass burning taking place most likely in Siberia. The chemical signature changes in other seasons, revealing itself in infrared spectroscopy readings to be the product of aerosols from natural sources.
The aerosols could be influencing how much snowfall the Arctic gets and keeps. Human-produced aerosols are thought to stifle precipitation in some areas but may provide the impetus for torrential rain in others depending on their chemical make-up. Even if the Asian aerosols are not affecting precipitation, however, Russell said they appear to cool the Arctic atmosphere by deflecting light into space. At the same time, there is strong evidence that they are accelerating ice melt in the Arctic by darkening and heating ice once they fall to the ground the way a dark sweater makes its wearer hotter on a sunny day than does a white sweater.
Russell has been part of another collaborative effort launched in 2008, the International Polar Year, that created chemical profiles of relatively untainted air off the Norwegian west coast, which is only occasionally tinged by European smog. She has also teamed with collaborators at Scripps, NOAA and other universities to profile aerosols around Houston, Texas, and Mexico City.
In the latter two projects, she has provided evidence that agriculture adds more to the aerosol mix in an oil town like Houston than previously thought and that organic particles in Mexico City, rising from the smoke of street vendors and exhaust of cars driving on gas formulated differently than in the United States, glom on to dust in a different manner than American pollution to create aerosols with distinct chemical structures. Figuring out what they do locally and regionally is the next step.
Russell collaborates with a number of other faculty at UCSD whose research also focuses on aerosols, such as Kim Prather, an atmospheric chemistry professor with joint appointments at Scripps and the UCSD Department of Chemistry and Biochemistry. Russell and Prather are comparing their results from Mexico City in an effort to better understand the sources of aerosols in the atmosphere.
“We are trying to understand the major sources of aerosols in our atmosphere and how they affect the overall temperature of our planet; as opposed to greenhouse gases which we know are warming, aerosols can cool or warm depending on their composition and where they are located in the atmosphere,” said Prather. Like Russell, Prather also studies long-range transport of aerosols from terrestrial and marine sources. Prather and Russell have worked together on several other projects and recently helped form the Aerosol Chemistry and Climate Institute, a collaboration between Scripps and the Department of Energy’s Pacific Northwest National Laboratory.
For her most comprehensive study, Russell need only make her shortest journey, to the end of Scripps Pier. It is possible that aerosol journeys of a thousand miles or more might be explained by shorter commutes between Southern California counties. Complete analysis of the Interstate 15 data suggests Vegas might not be a source of dirtiness after all. Using data collected over longer time periods, Russell’s pollution map of local counties now suggests organic human-made aerosols might just be blowing toward Nevada from San Bernardino and Riverside then back toward San Diego as winds shift. Russell employs a suite of complementary measurements at the pier to characterize short- and long-term aerosol trends. Those are combined with particle profiles made by Prather’s group and collaborators whose numbers are growing out of necessity.
“Understanding the big picture is the only way we’re going to be able to reduce the uncertainty associated with aerosol particles and their effects on climate,” said Russell. “There are so many parameters, there’s no one instrument or even one person who can do all of it at once.”
Adapted from materials provided by University of California, San Diego, via Newswise.

Monday, June 22, 2009

Carbon Dioxide Higher Today Than Last 2.1 Million Years


ScienceDaily (June 21, 2009) — Researchers have reconstructed atmospheric carbon dioxide levels over the past 2.1 million years in the sharpest detail yet, shedding new light on its role in the earth's cycles of cooling and warming.
The study, in the June 19 issue of the journal Science, is the latest to rule out a drop in CO2 as the cause for earth's ice ages growing longer and more intense some 850,000 years ago. But it also confirms many researchers' suspicion that higher carbon dioxide levels coincided with warmer intervals during the study period.
The authors show that peak CO2 levels over the last 2.1 million years averaged only 280 parts per million; today, by contrast, CO2 stands at 385 parts per million, 38% higher. This finding means that researchers will need to look back further in time for an analog to modern day climate change.
In the study, Bärbel Hönisch, a geochemist at Lamont-Doherty Earth Observatory, and her colleagues reconstructed CO2 levels by analyzing the shells of single-celled plankton buried under the Atlantic Ocean, off the coast of Africa. By dating the shells and measuring their ratio of boron isotopes, they were able to estimate how much CO2 was in the air when the plankton were alive. This method allowed them to see further back than the precise records preserved in cores of polar ice, which go back only 800,000 years.
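The chemistry behind the method: the boron isotope ratio of the borate built into shell calcite tracks seawater pH, and pH in turn reflects dissolved CO2. Below is a rough sketch of the standard d11B-to-pH step, using illustrative modern-seawater constants rather than the study's own calibration; converting the resulting pH to an atmospheric CO2 value additionally requires an assumption about a second carbonate-system parameter such as alkalinity.

    import math

    # Illustrative constants (modern seawater, ~25 C); not the study's values.
    D11B_SEAWATER = 39.61   # per mil
    ALPHA = 1.0272          # boric acid / borate fractionation factor
    PKB = 8.60              # apparent dissociation constant of boric acid

    def ph_from_borate_d11b(d11b_borate):
        """Seawater pH from the boron isotope ratio recorded in shell calcite."""
        num = D11B_SEAWATER - d11b_borate
        den = D11B_SEAWATER - ALPHA * d11b_borate - 1000.0 * (ALPHA - 1.0)
        return PKB - math.log10(-num / den)

    # Isotopically heavier shells imply higher pH, hence lower CO2:
    for d11b in (18.0, 20.0, 22.0):
        print(f"d11B = {d11b:.1f} per mil -> pH ~ {ph_from_borate_d11b(d11b):.2f}")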
The planet has undergone cyclic ice ages for millions of years, but about 850,000 years ago, the cycles of ice grew longer and more intense—a shift that some scientists have attributed to falling CO2 levels. But the study found that CO2 was flat during this transition and unlikely to have triggered the change.
"Previous studies indicated that CO2 did not change much over the past 20 million years, but the resolution wasn't high enough to be definitive," said Hönisch. "This study tells us that CO2 was not the main trigger, though our data continues to suggest that greenhouse gases and global climate are intimately linked."
The timing of the ice ages is believed to be controlled mainly by the earth's orbit and tilt, which determine how much sunlight falls on each hemisphere. Two million years ago, the earth underwent an ice age every 41,000 years. But some time around 850,000 years ago, the cycle grew to 100,000 years, and ice sheets reached greater extents than they had in several million years—a change too great to be explained by orbital variation alone.
A global drawdown in CO2 is just one theory proposed for the transition. A second theory suggests that advancing glaciers in North America stripped away soil in Canada, causing thicker, longer lasting ice to build up on the remaining bedrock. A third theory challenges how the cycles are counted, and questions whether a transition happened at all.
The low carbon dioxide levels outlined by the study through the last 2.1 million years make modern day levels, caused by industrialization, seem even more anomalous, says Richard Alley, a glaciologist at Pennsylvania State University, who was not involved in the research.
"We know from looking at much older climate records that large and rapid increase in CO2 in the past, (about 55 million years ago) caused large extinction in bottom-dwelling ocean creatures, and dissolved a lot of shells as the ocean became acidic," he said. "We're heading in that direction now."
The idea to approximate past carbon dioxide levels using boron, an element released by erupting volcanoes and used in household soap, was pioneered over the last decade by the paper's coauthor Gary Hemming, a researcher at Lamont-Doherty and Queens College. The study's other authors are Jerry McManus, also at Lamont; David Archer at the University of Chicago; and Mark Siddall, at the University of Bristol, UK.
Funding for the study was provided by the National Science Foundation.
Journal reference:
Hönisch, Hemming, Archer, Siddall and McManus. Atmospheric Carbon Dioxide Concentrations Across the Mid-Pleistocene Transition. Science, June 19, 2009
Adapted from materials provided by The Earth Institute at Columbia University.

Friday, June 12, 2009

Typhoons Trigger Slow Earthquakes


ScienceDaily (June 12, 2009) — Scientists have made the surprising finding that typhoons trigger slow earthquakes, at least in eastern Taiwan. Slow earthquakes are non-violent fault slippage events that take hours or days instead of a few brutal seconds to minutes to release their potent energy. The researchers discuss their data in a study published in the June 11 issue of Nature.
"From 2002 to 2007 we monitored deformation in eastern Taiwan using three highly sensitive borehole strainmeters installed 650 to 870 feet (200-270 meters) deep. These devices detect otherwise imperceptible movements and distortions of rock," explained coauthor Selwyn Sacks of Carnegie's Department of Terrestrial Magnetism. "We also measured atmospheric pressure changes, because they usually produce proportional changes in strain, which we can then remove."
Taiwan has frequent typhoons in the second half of each year but is typhoon-free during the first four months. During the five-year study period, the researchers, including lead author Chiching Liu (Academia Sinica, Taiwan), identified 20 slow earthquakes that each lasted from hours to more than a day. The scientists did not detect any slow events during the typhoon-free season. Eleven of the 20 slow earthquakes coincided with typhoons. Those 11 were also stronger and characterized by more complex waveforms than the other slow events.
"These data are unequivocal in identifying typhoons as triggers of these slow quakes. The probability that they coincide by chance is vanishingly small," remarked coauthor Alan Linde, also of Carnegie.
How does the low pressure trigger the slow quakes? The typhoon reduces atmospheric pressure on land in this region, but does not affect conditions at the ocean bottom, because water moves into the area and equalizes pressure. The reduction in pressure above one side of an obliquely dipping fault tends to unclamp it. "This fault experiences more or less constant strain and stress buildup," said Linde. "If it's close to failure, the small perturbation due to the low pressure of the typhoon can push it over the failure limit; if there is no typhoon, stress will continue to accumulate until it fails without the need for a trigger."
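Linde's unclamping argument can be put in rough numbers with a back-of-envelope Coulomb stress calculation. Every value below is an assumption chosen for illustration, not a figure from the Nature study:

    import math

    dp = 5.0e3                    # Pa: ~50 hPa pressure drop under a strong typhoon
    dip = math.radians(45.0)      # assumed fault dip
    mu = 0.6                      # assumed friction coefficient

    # A reduced vertical load lowers the normal stress clamping a dipping fault;
    # for a vertical stress change dp the reduction scales with cos^2(dip).
    d_sigma_n = dp * math.cos(dip) ** 2

    # Lower normal stress raises the Coulomb failure stress by mu * |d_sigma_n|
    # (positive = pushed toward failure).
    d_cfs = mu * d_sigma_n
    print(f"normal stress down ~{d_sigma_n / 1e3:.1f} kPa; "
          f"Coulomb stress up ~{d_cfs / 1e3:.1f} kPa")
    # A kilopascal-scale nudge is tiny beside tectonic stresses, which is why
    # it only triggers fault patches already very close to failure.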
"It's surprising that this area of the globe has had no great earthquakes and relatively few large earthquakes," Linde remarked. "By comparison, the Nankai Trough in southwestern Japan, has a plate convergence rate about 4 centimeters per year, and this causes a magnitude 8 earthquake every 100 to 150 years. But the activity in southern Taiwan comes from the convergence of same two plates, and there the Philippine Sea Plate pushes against the Eurasian Plate at a rate twice that for Nankai."
The researchers speculate that devastating earthquakes are rare in eastern Taiwan because the slow quakes act as valves, frequently releasing stress along a small section of the fault and eliminating the situation where a long segment sustains continuous high stresses until it ruptures in a single great earthquake. The group is now expanding its instrumentation and monitoring for this research.
Adapted from materials provided by Carnegie Institution, via EurekAlert!, a service of AAAS.

Friday, June 5, 2009

Height Of Large Waves Changes According To Month


ScienceDaily (June 2, 2009) — A team of researchers from the University of Cantabria has developed a statistical model that makes it possible to study the variability of extreme waves throughout the year. Their study has shown that there are seasonal variations in the height of waves reaching Spain's coasts, and stresses the importance of this data in planning and constructing marine infrastructures.
"Anybody who observes waves can see that they are not the same height in winter and summer, but rather that their height varies over time, and we have applied a ‘non- seasonal' statistical model in order to measure extreme events such as these," says Fernando J. Méndez, an engineer at the Institute of Environmental Hydraulics at the University of Cantabria and co-author of a study published recently in the journal Coastal Engineering.
The new model can chart the pattern of extreme waves "with a greater degree of reliability" by studying 'significant wave height' (Hs) in relation to a specific return period. The Hs is the representative average height of the sea as reported by buoys (it is calculated as the average of the highest one-third of waves in a record), and the return period is the average time between occurrences of an event of a given size.
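Computing Hs from a record of individual waves is simple enough to sketch directly; the snippet below applies the highest-one-third definition to a synthetic, Rayleigh-distributed wave record (the Rayleigh distribution being the usual statistical model for individual wave heights):

    import numpy as np

    rng = np.random.default_rng(1)
    wave_heights = rng.rayleigh(scale=1.5, size=300)   # synthetic record, metres

    def significant_wave_height(heights):
        """Mean of the highest one-third of waves in the record."""
        tallest_first = np.sort(heights)[::-1]
        return tallest_first[: len(heights) // 3].mean()

    print(f"Hs ~ {significant_wave_height(wave_heights):.2f} m "
          f"(mean height {wave_heights.mean():.2f} m)")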
For example, if a wave height of 15 metres is established at a certain point on the coast with a return period of 100 years, this means that, on average, a wave of 15 metres could reach this point once every 100 years. "This can be very useful when it comes to building an oil platform in the sea or a particular piece of coastal infrastructure", explains Méndez.
The researchers have used data recorded between 1984 and 2003 by five coastal buoys located near the cities of Bilbao, in Vizcaya; Gijón, in Asturias; La Coruña, Cádiz and Valencia in order to demonstrate the validity of their model. The results show that extreme Hs values vary according to location and the month of the year.
The meteorological component of extreme waves
The results showed a similar seasonal variation between waves in Bilbao and Gijón, with waves being less than four metres high between May and September, but increasing after this to reach an average height of seven metres between December and January. The period of large waves in La Coruña extends from October to April, because of the city's westerly position and resulting exposure to more prolonged winter storms.
The Atlantic coast of Cádiz, meanwhile, reflects the characteristic calm of this area of sea between July and September, with Hs values below two metres. The figures for December and January, however, can vary a great deal from one year to another, reaching wave heights in excess of six metres.
Waves on the Mediterranean coast at Valencia measure between 3 and 3.5 metres from September until April, although the graphics reveal two peaks during this period, one of which coincides with the start of spring and the other with the autumn months, during which the phenomenon of the gota fría occurs. (Gota fría events are atmospheric cold air pools that cause rapid, torrential and very localised downpours and high winds).
"All these data are of vital importance in terms of coastal management, since they can establish the risk of flooding and are indispensable for the carrying out of marine construction work, for example infrastructure built close to the coast," says Melisa Menéndez, another of the study's authors. "In addition, they make it possible to calculate the likelihood of a maritime storm occurring."
The researcher also stresses that this information could be very useful in helping to better understand some biological processes, such as how the distribution of marine animals is affected by wave swell, and seaweed growth rates, as well as geological processes, such as how particulates and sediments are transported along the coast.
Extreme value theory
The model developed by the Spanish scientists is based on 'extreme value theory', a recently developed statistical discipline that aims to quantify the random behaviour of extreme events. The latest advances in this field have made it possible to better study climatic variability at various scales: over a year (seasonality), over consecutive years or decades (which allows climatic patterns to be derived), and over the long term (providing trends).
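In its simplest, stationary form, extreme value theory fits a generalized extreme value (GEV) distribution to block maxima, such as annual maximum Hs; the T-year return level is then the quantile exceeded with probability 1/T in any given year. The sketch below uses synthetic data and fixed parameters, whereas the Cantabria model additionally lets the GEV parameters vary with the season:

    import numpy as np
    from scipy.stats import genextreme

    # Synthetic 40-year series of annual maximum significant wave height.
    annual_max_hs = genextreme.rvs(c=-0.1, loc=6.0, scale=1.2, size=40,
                                   random_state=42)

    # Fit the GEV and read off the 100-year return level: the Hs value
    # exceeded with probability 1/100 in any given year.
    shape, loc, scale = genextreme.fit(annual_max_hs)
    hs_100yr = genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc, scale)
    print(f"estimated 100-year Hs: {hs_100yr:.1f} m")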
The study into extreme waves is on the seasonal scale, but the team has also studied extreme sea level values over almost a 100-year period, thanks to data gathered during the 20th Century by a mareograph located in Newlyn, in the United Kingdom. The scientists have already started to obtain information about extreme swell and sea level values at global level as part of a United Nations project to study the sea's impacts on coasts all over the planet, and how these affect climate change.
Journal references:
Melisa Menéndez, Fernando J. Méndez, Cristina Izaguirre, Alberto Luceño and Inigo J. Losada. The influence of seasonality on estimating return values of significant wave height. Coastal Engineering, 2009; 56 (3): 211 DOI: 10.1016/j.coastaleng.2008.07.004
Melisa Menéndez, Fernando J. Méndez and Inigo J. Losada. Forecasting seasonal to interannual variability in extreme sea levels. ICES Journal of Marine Science, 2009; DOI: 10.1093/icesjms/fsp095
Adapted from materials provided by Plataforma SINC, via AlphaGalileo.

Friday, May 29, 2009

Huge undersea mountain found off Indonesia: scientists


This aerial view shows new homes being constructed to the north of Banda Aceh on the island of Sumatra in 2006.
A massive underwater mountain discovered off the Indonesian island of Sumatra could be a volcano with potentially catastrophic power, a scientist said Friday.
Indonesian government marine geologist Yusuf Surachman said the mountain was discovered earlier this month about 330 kilometres (205 miles) west of Bengkulu city during research to map the seabed's seismic faultlines.
The cone-shaped mountain is 4,600 metres (15,100 feet) high, 50 kilometres in diameter at its base and its summit is 1,300 metres below the surface, he said.
"It looks like a volcano because of its conical shape but it might not be. We have to conduct further investigations," he told AFP.
He denied reports that researchers had confirmed the discovery of a new volcano, insisting that at this stage it could only be described as a "seamount" of the sort commonly found around the world.
"Whether it's active or dangerous, who knows?" he added.
The ultra-deep geological survey was conducted with the help of French scientists and international geophysical company CGGVeritas.
The scientists hope to gain a clearer picture of the undersea lithospheric plate boundaries and seafloor displacement in the area, the epicentre of the catastrophic Asian earthquake and tsunami of 2004.
The tsunami killed more than 220,000 people across Asia, including 168,000 people in Aceh province on the northern tip of Sumatra.
Indonesia is on the so-called Pacific "Ring of Fire," where the meeting of continental plates causes high volcanic and seismic activity.
(c) 2009 AFP

Thursday, May 14, 2009

Ocean Circulation Doesn't Work As Expected


This model of North Atlantic currents has been called into question by new data from Duke University and the Woods Hole Oceanographic Institution. Image: Archana Gowda, Duke
(PhysOrg.com) -- The familiar model of Atlantic ocean currents that shows a discrete "conveyor belt" of deep, cold water flowing southward from the Labrador Sea is probably all wet.
New research led by Duke University and the Woods Hole Oceanographic Institution relied on an armada of sophisticated floats to show that much of this water, originating in the sea between Newfoundland and Greenland, is diverted generally eastward by the time it flows as far south as Massachusetts. From there it disperses to the depths in complex ways that are difficult to follow.
A 50-year-old model of ocean currents had shown this southbound subsurface flow of cold water forming a continuous loop with the familiar northbound flow of warm water on the surface, called the Gulf Stream.
"Everybody always thought this deep flow operated like a conveyor belt, but what we are saying is that concept doesn't hold anymore," said Duke oceanographer Susan Lozier. "So it's going to be more difficult to measure these signals in the deep ocean."
And since cold Labrador seawater is thought to influence and perhaps moderate human-caused climate change, this finding may affect the work of global warming forecasters.
"To learn more about how the cold deep waters spread, we will need to make more measurements in the deep ocean interior, not just close to the coast where we previously thought the cold water was confined," said Woods Hole's Amy Bower.
Lozier, a professor of physical oceanography at Duke's Nicholas School of the Environment, and Bower, a senior scientist in the department of physical oceanography at the Woods Hole Oceanographic Institution, are co-principal authors of a report on the findings to be published in the May 14 issue of the research journal Nature.
Their research was supported by the National Science Foundation.
Climatologists pay attention to the Labrador Sea because it is one of the starting points of a global circulation pattern that transports cold northern water south to make the tropics a little cooler and then returns warm water at the surface, via the Gulf Stream, to moderate temperatures of northern Europe.
Since forecasters say effects of global warming are magnified at higher latitudes, that makes the Labrador Sea an added focus of attention. Surface waters there absorb heat-trapping carbon dioxide from the atmosphere. And a substantial amount of that CO2 then gets pulled underwater where it is no longer available to warm Earth's climate.
"We know that a good fraction of the human caused carbon dioxide released since the Industrial revolution is now in the deep North Atlantic" Lozier said. And going along for the ride are also climate-caused water temperature variations originating in the same Labrador Sea location.
The question is: how do these climate change signals spread further south? Oceanographers long thought all this Labrador seawater moved south along what is called the Deep Western Boundary Current (DWBC), which hugs the eastern North American continental shelf all the way to near Florida and then continues further south.
But studies in the 1990s using submersible floats that followed underwater currents "showed little evidence of southbound export of Labrador sea water within the Deep Western Boundary Current (DWBC)," said the new Nature report.
Scientists challenged those earlier studies, however, in part because the floats had to return to the surface to report their positions and observations to satellite receivers. That meant the floats' data could have been "biased by upper ocean currents when they periodically ascended," the report added.
To address those criticisms, Lozier and Bower launched 76 special Range and Fixing of Sound floats into the current south of the Labrador Sea between 2003 and 2006. Those "RAFOS" floats could stay submerged at 700 or 1,500 meters depth and still communicate their data for a range of about 1,000 kilometers using a network of special low frequency and amplitude seismic signals.
But only 8 percent of the RAFOS floats followed the conveyor belt of the Deep Western Boundary Current, according to the Nature report. About 75 percent of them "escaped" that coast-hugging deep underwater pathway and instead drifted into the open ocean by the time they rounded the southern tail of the Grand Banks.
Eight percent "is a remarkably low number in light of the expectation that the DWBC is the dominant pathway for Labrador Sea Water," the researchers wrote.
Studies led by Lozier and other researchers had previously suggested cold northern waters might follow such "interior pathways" rather than the conveyor belt in route to subtropical regions of the North Atlantic. But "these float tracks offer the first evidence of the dominance of this pathway compared to the DWBC."
Since the RAFOS float paths could only be tracked for two years, Lozier, her graduate student Stefan Gary, and German oceanographer Claus Boning also used a modeling program to simulate the launch and dispersal of more than 7,000 virtual "efloats" from the same starting point.
"That way we could send out many more floats than we can in real life, for a longer period of time," Lozier said.
Subjecting those efloats to the same underwater dynamics as the real ones, the researchers then traced where they moved. "The spread of the model and the RAFOS float trajectories after two years is very similar," they reported.
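Conceptually, the virtual-float experiment amounts to seeding particles in a model velocity field and integrating their positions forward in time. The toy sketch below does exactly that, with a made-up "boundary jet that leaks eastward" standing in for the real circulation model, so the percentage it prints is illustrative only:

    import numpy as np

    def velocity(x, y):
        """Idealised flow: a southward coastal jet whose water leaks eastward."""
        u = 0.08 * (1.0 - np.exp(y / 1.0e6))    # eastward leakage grows southward
        v = -0.10 * np.exp(-x / 2.0e5)          # southward jet, decaying offshore
        return u, v

    rng = np.random.default_rng(3)
    n = 2000
    x = rng.uniform(0.0, 5.0e4, n)              # metres east of the "coast"
    y = np.zeros(n)                             # metres north of the seeding line

    dt = 6 * 3600.0                             # 6-hour Euler steps, one year total
    for _ in range(4 * 365):
        u, v = velocity(x, y)
        x += u * dt + rng.normal(0.0, 3.0e3, n) # advection plus crude eddy noise
        y += v * dt + rng.normal(0.0, 3.0e3, n)
        x = np.maximum(x, 0.0)                  # floats cannot cross the coast

    frac = (x < 1.0e5).mean()                   # still within 100 km of the coast?
    print(f"{frac:.0%} of virtual floats stayed in the boundary current")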
"The new float observations and simulated float trajectories provide evidence that the southward interior pathway is more important for the transport of Labrador Sea Water through the subtropics than the DWBC, contrary to previous thinking," their report concluded.
"That means it is going to be more difficult to measure climate signals in the ," Lozier said. "We thought we could just measure them in the Deep Western Boundary Current, but we really can't."
Source: Duke University

Monday, October 8, 2007

Geologists Recover Rocks Yielding Unprecedented Insights Into San Andreas Fault



Science Daily — For the first time, geologists have extracted intact rock samples from 2 miles beneath the surface of the San Andreas Fault, the infamous rupture that runs 800 miles along the length of California.
Never before have scientists had available for study rock samples from deep inside one of the actively moving tectonic plate-bounding faults responsible for the world's most damaging earthquakes. Now, with this newly recovered material, scientists hope to answer long-standing questions about the fault's composition and properties.
Altogether, the geologists retrieved 135 feet of 4-inch diameter rock cores weighing roughly 1 ton. They were brought to the surface through a research borehole drilled more than 2.5 miles into the Earth. The last of the cores was brought to the surface in the predawn hours of Sept. 7.
Scientists seeking to understand how the great faults bounding Earth's vast tectonic plates evolve and generate earthquakes have always had to infer the processes through indirect means. Up until now, they could only work with samples of ancient faults exposed at the Earth's surface after millions of years of erosion and uplift, together with computer simulations and laboratory experiments approximating what they think might be happening at the depths at which earthquakes occur.
"Now we can hold the San Andreas Fault in our hands," said Mark Zoback, the Benjamin M. Page Professor in Earth Sciences at Stanford. "We know what it's made of. We can study how it works."
Zoback is one of three co-principal investigators of the San Andreas Fault Observatory at Depth (SAFOD) project, which is establishing the world's first underground earthquake observatory. William Ellsworth and Steve Hickman, geophysicists with the U.S. Geological Survey (USGS) in Menlo Park, Calif., are the other co-principal investigators.
SAFOD, which first broke ground in 2004, is a major research component of EarthScope, a National Science Foundation-funded program being carried out in collaboration with the USGS and NASA to investigate the forces that shape the North American continent and the physical processes controlling earthquakes and volcanic eruptions.
"This is tremendously exciting. Obtaining cores from the actively slipping San Andreas Fault is truly unprecedented and will allow truly transformative research and discoveries," said Kaye Shedlock, EarthScope program director at the National Science Foundation.
In the next phase of the experiment, the science team will install an array of seismic instruments in the 2.5-mile-long borehole that runs from the Pacific plate on the west side of the fault into the North American plate on the east. By placing sensors next to a zone that has been the source of many small temblors, scientists will be able to observe the earthquake generation process with unprecedented acuity. They hope to keep the observatory operating for the next 10 to 20 years.
Studying the San Andreas Fault is important because, as Zoback noted, "The really big earthquakes occur on plate boundaries like the San Andreas Fault." The SAFOD site, located about 23 miles northeast of Paso Robles near the tiny town of Parkfield, sits on a particularly active section of the fault that moves regularly. But it does not produce large earthquakes. Instead, it moves in modest increments by a process called creep, in which the two sides of the fault slide slowly past one another, accompanied by occasional small quakes, most of which are not even felt at the surface.
One of the big questions the researchers seek to answer is this: when most of the fault moves in violent, episodic upheavals, how can there be a section where the same massive tectonic plates seem, by comparison, to gently tiptoe past each other with the delicate tread of little cat feet?
"There have been many theories about why the San Andreas Fault slides along so easily, none of which could be tested directly until now," Hickman said. Some posit the presence of especially slippery clays, called smectites. Others suggest there may be high water pressure along the fault plane lubricating the surface. Still others note the presence of a mineral called serpentine exposed in several places along the surface trace of the fault, which-if it existed at depth-could both weaken the fault and cause it to creep.
Zoback said the correlation between the occurrence of serpentine, a metamorphosed remnant of old oceanic crust, and the slippery nature of the fault motion in the area has been the subject of speculation for more than 40 years. However, it has never been demonstrated that serpentine actually occurs along the active San Andreas at depth, and the mechanism by which serpentine might limber up the fault was unknown.
Then, in 2005, when the SAFOD drill pierced the zone of active faulting using rotary drilling (which grinds up the rock into tiny fragments), mineralogist Diane Moore of the USGS detected talc in the rock cuttings brought up to the surface. This finding was published in the Aug. 16, 2007, issue of Nature.
"Talc is one of the slipperiest, weakest minerals ever studied," Hickman said.
Might the same mineral that helps keep a baby's bottom smooth also be smoothing the way for the huge tectonic plates? Chemically, it's possible, for when serpentine is subjected to high temperatures in the presence of water containing silica, it forms talc.
Serpentine might also control how faults behave in other ways. "Serpentine can dissolve in ground water as fault particles grind past each other and then crystallize in nearby open pore spaces, allowing the fault to creep even under very little pressure," Hickman said.
The SAFOD borehole cored into two active traces of the fault this summer, both contained within a broad fault "zone" about 700 feet wide. The deeper of the two active fault zones, designated 10830 for its distance in feet from the surface as measured along the curving borehole, yielded an 8-foot-long section of very fine-grained powder called fault gouge. Such gouge is common in fault zones and is produced by the grinding of rock against rock. "What is remarkable about this gouge is that it contains abundant fragments of serpentine that appear to have been swept up into the gouge from the adjacent solid rock," Hickman said. "The serpentine is floating around in the fault gouge like raisins in raisin pudding."
The only way to know what role serpentine, talc or other exotic minerals play in controlling the behavior of the San Andreas Fault is to study the SAFOD core samples in the laboratory.
"To an earthquake scientist, these cores are like the Apollo moon rocks," Hickman said. "Scientists from around the world are anxious to get their hands on them in the hope that they can help solve the mystery of how this major, active plate boundary works."
Will these new samples allow scientists to predict earthquakes? The short answer is no. But research on these samples could provide clues to answer the question of whether earthquakes are predictable. The observatory will allow scientists to begin to address whether there are precursory phenomena occurring within the fault zone.
The other fault zone, called 10480, contains 3 feet of fault gouge. It also produces small earthquakes at a location about 300 feet below the borehole. "Remarkably, we observe the same earthquake rupturing at the same spot on the fault year after year," Ellsworth said. This repeating earthquake, always about a magnitude 2, will be the focus of the observatory to be installed inside the fault in 2008.
Sensitive seismometers and tiltmeters to be installed in the SAFOD borehole directly above the spot that ruptures will observe for the first time the birthing process of an earthquake from the zone where the earthquake energy accumulates. Preliminary observations made in 2006 already have revealed the tiniest earthquakes ever observed-so small they have negative magnitudes.
In early December, a "sample party" will be held at the USGS office in Menlo Park, where the cores will be on display and scientists will offer their proposals to do research projects in a bid to be allowed to analyze part of the core.
Zoback said most of the initial testing will be nondestructive in order to preserve the samples for as long as possible. "But then, some of the material will be made available for testing that simulates earthquakes and fault slip in the lab," he said.
When not being examined, the core samples will be refrigerated and kept moist to prevent the cores and the fluid in them from being disturbed.
Some of the cores will be on display at the press conference to be held Oct. 4 at Stanford University in Tresidder Union's Oak Room.
In addition to funding from the National Science Foundation, USGS and Stanford University, the SAFOD project also has been supported financially by the International Continental Scientific Drilling Program.
Note: This story has been adapted from material provided by Stanford University.

Fausto Intilla

Wednesday, October 3, 2007

Carbon Dioxide Did Not End The Last Ice Age, Study Says



Science Daily — Carbon dioxide did not cause the end of the last ice age, a new study in Science suggests, contrary to past inferences from ice core records.
"There has been this continual reference to the correspondence between CO2 and climate change as reflected in ice core records as justification for the role of CO2 in climate change," said USC geologist Lowell Stott, lead author of the study, slated for advance online publication Sept. 27 in Science Express.
"You can no longer argue that CO2 alone caused the end of the ice ages."
Deep-sea temperatures warmed about 1,300 years before the tropical surface ocean and well before the rise in atmospheric CO2, the study found. The finding suggests the rise in greenhouse gas was likely a result of warming and may have accelerated the meltdown -- but was not its main cause.
The study does not question the fact that CO2 plays a key role in climate.
"I don't want anyone to leave thinking that this is evidence that CO2 doesn't affect climate," Stott cautioned. "It does, but the important point is that CO2 is not the beginning and end of climate change."
While an increase in atmospheric CO2 and the end of the ice ages occurred at roughly the same time, scientists have debated whether CO2 caused the warming or was released later by an already warming sea.
The best estimate from other studies of when CO2 began to rise is no earlier than 18,000 years ago. Yet this study shows that the deep sea, which reflects oceanic temperature trends, started warming about 19,000 years ago.
"What this means is that a lot of energy went into the ocean long before the rise in atmospheric CO2," Stott said.
But where did this energy come from? Evidence pointed southward.
Water's salinity and temperature are properties that can be used to trace its origin -- and the warming deep water appeared to come from the Antarctic Ocean, the scientists wrote.
This water then was transported northward over 1,000 years via well-known deep-sea currents, a conclusion supported by carbon-dating evidence.
In addition, the researchers noted that deep-sea temperature increases coincided with the retreat of Antarctic sea ice, both occurring 19,000 years ago, before the northern hemisphere's ice retreat began.
Finally, Stott and colleagues found a correlation between melting Antarctic sea ice and increased springtime solar radiation over Antarctica, suggesting this might be the energy source.
As the sun pumped in heat, the warming accelerated because of sea-ice albedo feedbacks, in which retreating ice exposes ocean water that reflects less light and absorbs more heat, much like a dark T-shirt on a hot day.
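The analogy is easy to quantify: the solar energy a surface absorbs is (1 - albedo) times the incoming sunlight, and open water reflects far less than sea ice. With typical textbook albedo values (not figures from this study):

    # Ice-albedo feedback in one line of arithmetic.
    insolation = 200.0   # W/m^2, rough high-latitude springtime average (assumed)
    for surface, albedo in [("sea ice", 0.6), ("open water", 0.07)]:
        print(f"{surface:10s} absorbs {(1.0 - albedo) * insolation:5.0f} W/m^2")
    # Open water absorbs roughly 2-3x more energy, so each patch of retreating
    # ice amplifies the warming that exposed it.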
In addition, the authors' model showed how changed ocean conditions may have been responsible for the release of CO2 from the ocean into the atmosphere, also accelerating the warming.
The link between the sun and ice age cycles is not new. The theory of Milankovitch cycles states that periodic changes in Earth's orbit cause increased summertime sun radiation in the northern hemisphere, which controls ice size.
However, this study suggests that the pace-keeper of ice sheet growth and retreat lies in the southern hemisphere's spring rather than the northern hemisphere's summer.
The conclusions also underscore the importance of regional climate dynamics, Stott said. "Here is an example of how a regional climate response translated into a global climate change," he explained.
Stott and colleagues arrived at their results by studying a unique sediment core from the western Pacific composed of fossilized surface-dwelling (planktonic) and bottom-dwelling (benthic) organisms.
These organisms -- foraminifera -- incorporate different isotopes of oxygen from ocean water into their calcite shells, depending on the temperature. By measuring the change in these isotopes in shells of different ages, it is possible to reconstruct how the deep and surface ocean temperatures changed through time.
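A classic (Shackleton-type) calibration shows how the conversion works; the coefficients below are the widely quoted textbook values, used here only to illustrate the method rather than to reproduce the study's own calibration:

    def calcite_temperature(d18o_shell, d18o_water=0.0):
        """Water temperature (deg C) from the oxygen isotope composition of
        foraminiferal calcite, relative to the isotopic composition of the
        seawater the organism grew in (both in per mil)."""
        d = d18o_shell - d18o_water
        return 16.9 - 4.38 * d + 0.10 * d * d

    # Isotopically heavier shells record colder water:
    for d18o in (-1.0, 0.0, 1.0, 2.0):
        print(f"shell d18O = {d18o:+.1f} per mil -> T ~ {calcite_temperature(d18o):.1f} C")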
If CO2 caused the warming, one would expect surface temperatures to increase before deep-sea temperatures, since the heat slowly would spread from top to bottom. Instead, carbon-dating showed that the water used by the bottom-dwelling organisms began warming about 1,300 years before the water used by surface-dwelling ones, suggesting that the warming spread bottom-up instead.
"The climate dynamic is much more complex than simply saying that CO2 rises and the temperature warms," Stott said. The complexities "have to be understood in order to appreciate how the climate system has changed in the past and how it will change in the future."
Stott's collaborators were Axel Timmermann of the University of Hawaii and Robert Thunell of the University of South Carolina. Stott was supported by the National Science Foundation and Timmermann by the International Pacific Research Center.
Stott is an expert in paleoclimatology and was a reviewer for the Intergovernmental Panel on Climate Change. He also recently co-authored a paper in Geophysical Research Letters tracing a 900-year history of monsoon variability in India.
The study, which analyzed isotopes in cave stalagmites, found correlations between recorded famines and monsoon failures, and found that some past monsoon failures appear to have lasted much longer than those that occurred during recorded history. The ongoing research is aimed at shedding light on the monsoon's poorly understood but vital role in Earth's climate.
Note: This story has been adapted from material provided by University of Southern California.

Fausto Intilla

Monday, September 24, 2007

Satellite Paints Picture Of World's Oceans Over Last Decade



Science Daily — The NASA-managed Sea-viewing Wide Field-of-view Sensor (SeaWiFS) instrument settled into orbit around Earth in 1997 and took its first measurements of ocean color. A decade later, the satellite's data has proved instrumental in countless applications and helped researchers paint a picture of a changing climate. NASA recognized the satellite's tenth anniversary September 19 with briefings at the Goddard Space Flight Center in Greenbelt, Md.
NASA and GeoEye's SeaWiFS instrument has given researchers the first global look at ocean biological productivity. Its data have applications for understanding and monitoring the impacts of climate change, setting pollution standards, and sustaining coastal economies that depend on tourism and fisheries.
"SeaWiFS allows us to observe ocean changes and the mechanisms linking ocean physics and biology, and that's important for our ability to predict the future health of the oceans in a changing climate," said Gene Carl Feldman, SeaWiFS project manager at Goddard.
Researchers used SeaWiFS data to identify factors controlling the unusual timing of the 2005 phytoplankton bloom in the California Current System that led to the die-off of Oregon coast seabirds. The blooming tiny microscopic plants are key indicators of ocean health, form the base of marine food webs, and absorb carbon dioxide -- a major greenhouse gas -- from Earth's atmosphere.
"Long-term observations of the California coast and other sensitive regions is essential to understanding how changing global climate impacted ecosystems in the past, and how it may do so in the future," said Stephanie Henson of the University of Maine, lead author of a study published last month in the American Geophysical Union's "Journal of Geophysical Research -- Oceans." "This type of large-scale, long-term monitoring can only be achieved using satellite instrumentation," she added.
The SeaWiFS instrument orbits Earth fourteen times a day, measuring visible light over every area of cloud-free land and ocean once every 48 hours. The result is a map of Earth with colors spanning the spectrum of visible light. Variations in the color of the ocean, particularly in shades of blue and green, allow researchers to determine how the numbers of the single-celled plants called phytoplankton are distributed in the oceans over space and time.
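The workhorse SeaWiFS chlorophyll retrieval, OC4, is a maximum-band-ratio algorithm: the brightest of three blue bands is ratioed against a green band and passed through an empirical polynomial. The sketch below follows that form using the commonly cited OC4v4 coefficients, reproduced from memory and worth checking against the published algorithm before any serious use:

    import math

    OC4V4 = [0.366, -3.067, 1.930, 0.649, -1.532]   # polynomial coefficients

    def chlorophyll_oc4(rrs443, rrs490, rrs510, rrs555):
        """Chlorophyll-a (mg/m^3) from remote-sensing reflectances (1/sr)."""
        r = math.log10(max(rrs443, rrs490, rrs510) / rrs555)
        log_chl = sum(a * r ** i for i, a in enumerate(OC4V4))
        return 10.0 ** log_chl

    # Blue-dominated (clear) water vs. green-dominated (productive) water:
    print(f"{chlorophyll_oc4(0.010, 0.008, 0.006, 0.002):.2f} mg/m^3")  # ~0.1, clear ocean
    print(f"{chlorophyll_oc4(0.002, 0.003, 0.004, 0.005):.2f} mg/m^3")  # bloom-like water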
In other research, Mike Behrenfeld of Oregon State University, Corvallis, Ore., and colleagues were the first to use SeaWiFS to quantify biological changes in the oceans as a response to El Niño, which they described in a landmark 2001 study in Science.
"The 2001 study is significant because it marked the first time that global productivity was measured from a single sensor," said Paula Bontempi, program manager for the Biology and Biogeochemistry Research Program at NASA Headquarters in Washington. "The simplicity of SeaWiFS -- a single sensor designed only to measure ocean color -- has made it the gold standard for all ocean color monitoring instruments."
More recently, Zhiqiang Chen and colleagues at the University of South Florida, St. Petersburg, showed that SeaWiFS data have direct application for state and federal regulators looking to better define water quality standards. The team reported in "Remote Sensing of Environment" that instead of relying on the infrequent measurements collected from ships or buoys, SeaWiFS data can be used to monitor coastal water quality almost daily, providing managers with a more frequent and complete picture of changes over time.
SeaWiFS has revolutionized our understanding of the ocean's response to environmental change. Here's one example: over the last ten years, the instrument has gathered daily global bio-productivity readings. When coupled with daily sea surface temperature readings over the same time span, we immediately see a relationship between temperature and productivity.
Beyond the realm of ocean observations, however, SeaWiFS has "revolutionized the way people do research," Feldman said. SeaWiFS was one of the first missions to open up data access online to researchers, students and educators around the world. The mission was able to capitalize on advances in data processing and storage technologies and ride the crest of the World Wide Web's growth from its beginning.
When the SeaWiFS program launched in 1997, the goal was to place a sensor in space capable of routinely monitoring ocean color to better understand the interplay between the ocean and atmosphere and most importantly, the ocean's role in the global carbon cycle. A decade later, Feldman said, "SeaWiFS has exceeded everyone's expectations."
Note: This story has been adapted from a news release issued by NASA/Goddard Space Flight Center.

Fausto Intilla

Friday, September 14, 2007

Sea Ice Is Getting Thinner



Science Daily — Large areas of the Arctic sea-ice are only one metre thick this year, an approximate 50 percent thinning compared with 2001. These are the initial results from the latest expedition to the North Polar Sea led by the Alfred Wegener Institute for Polar and Marine Research in the Helmholtz Association.
Fifty scientists have been on board the research ship Polarstern for two and a half months, their main aim being to carry out research on the sea-ice areas of the central Arctic. Among other things, they have found that not only are the ocean currents changing, but community structures in the Arctic are also altering. Autonomous measuring buoys have been deployed, and they will continue to contribute valuable data on the environmental changes occurring in this region even after the expedition is finished.
“The ice cover in the North Polar Sea is dwindling, the ocean and the atmosphere are becoming steadily warmer, the ocean currents are changing,” said chief scientist Dr Ursula Schauer of the Alfred Wegener Institute for Polar and Marine Research in the Helmholtz Association, commenting on the latest results from the current expedition. She is currently in the Arctic, underway with 50 scientists from Germany, Russia, Finland, the Netherlands, Spain, the USA, Switzerland, Japan, France and China, investigating ocean and sea-ice conditions.
“We are in the midst of a phase of dramatic change in the Arctic, and the International Polar Year 2007/08 offers us a unique opportunity to study this dwindling ocean in collaboration with international researchers,” said Schauer. Oceanographers on board the research ship Polarstern are investigating the composition and circulation of the water masses, the physical characteristics of sea-ice, and the transport of biological and geochemical components in ice and seawater. Sea-ice ecosystems in the seawater and on the ocean floor will also be a focus of investigations. Scientists will take sediments from the ocean floor in order to reconstruct the climatic history of the surrounding continents.
Oceanographic measuring buoys were set out in all regions of the Arctic Ocean for the first time during this International Polar Year. They are able to drift freely in the Arctic Ocean whilst collecting data on currents, temperature, and the salt content of the seawater. The buoys will collect data continuously and send them back to the scientists via satellite.
In addition, a new titanium measuring system has been deployed which, thanks to its high effectiveness, allows contamination-free collection of trace-element samples for the first time. These studies will take place within the context of different research projects, all taking place during the International Polar Year: SPACE (Synoptic Pan-Arctic Climate and Environment Study), iAOOS (Integrated Arctic Ocean Observing System) and GEOTRACES (Trace Elements in the Arctic). At the same time, a large component of the work is supported by the European Union Program DAMOCLES (Developing Arctic Modelling and Observing Capabilities for Long-term Environment Studies).
Changes in sea-ice
The thickness of the Arctic sea-ice has decreased since 1979, and at the moment measures about a metre in the central Arctic Basin. In addition, oceanographers have found a particularly high concentration of melt-water in the ocean and a large number of melt-ponds. These data, collected from on board the Polarstern and also from helicopter flights, allow the scientists to better interpret their satellite images.
Sea-ice biologists from the Institute of Polar Ecology at the University of Kiel are studying the animals and plants living in and beneath the ice. They are using the opportunity to investigate the threatened ecosystem. According to the newest models, the Arctic could be ice-free in less than 50 years if warming continues. This may cause the extinction of many organisms that are adapted to this habitat.
Ocean currents
The Arctic Ocean currents are an important part of global ocean circulation. Warm water masses flowing in from the Atlantic are changed in the Arctic through water cooling and ice formation, and sink to great depths. Constant monitoring by the Alfred Wegener Institute for Polar and Marine Research over the last ten years has recorded significant changes, and has demonstrated a warming of the incoming current from the Atlantic Ocean. During this expedition, the propagation of these warming events along each of the currents in the North Polar Sea will be investigated.
The large rivers of Siberia and North America transport huge amounts of freshwater to the Arctic. The freshwater appears to function as an insulating layer, controlling the heat transfer between the ocean, the ice and the atmosphere.
The study area stretches from the shelf areas of the Barents Sea, the Kara Sea and the Laptev Sea, across the Nansen and Amundsen basins right up to the Makarov Basin.
Between Norway and Siberia and up to the Canada Basin, the scientists have taken temperature, salinity and current measurements at more than 100 places. First results show that the temperatures of the influx of water from the Atlantic are lower than in previous years. The temperatures and salinity levels in the Arctic deep sea are also slowly changing.
The changes are small here, but the areas go down to great depths, and enormous water volumes are therefore involved. In order to follow the circulation patterns in winter, oceanographic measuring buoys will be attached to ice floes, and continuous measurements will be taken whilst they float along with the ice. The measurements will be relayed back via satellite.
In addition to the ocean current and sea-ice measurements, samples of zooplankton, sediments from the sea floor and trace elements will be taken. Zooplankton are at the base of the food chain for many marine creatures, and are therefore an important indicator of the health of the ecosystem. The deposits found on the ocean floor of the North Polar Sea read like a diary of the history of climate change for the surrounding continents. Through sediment cores, the scientists may be able to unlock the key to the glaciation of northern Siberia.
In addition, the members of the expedition will be able to measure trace elements from Siberian rivers and shelf areas that are being pushed towards the Atlantic by the transpolar drift.
Further information on this project can be found on the German contribution to the International Polar Year website (http://www.polarjahr.de/).
Note: This story has been adapted from a news release issued by Alfred Wegener Institute for Polar and Marine Research.

Fausto Intilla

Thursday, September 6, 2007

‘Bringing the Ocean to the World,’ in High-Def


Source: The New York Times

By WILLIAM YARDLEY
Published: September 4, 2007

SEATTLE — Thousands of miles of fiber-optic cables are strung across the world’s oceans, connecting continents like so many tin cans in this age of critical global communication. So the fact that about 800 more miles of fiber-optic cable will soon thread the sea floor off the coast of the Pacific Northwest might not seem particularly revolutionary. Until you meet John R. Delaney, part oceanographer, part oracle.
“This is a mission to Planet Ocean,” said Mr. Delaney, a professor at the University of Washington. “This is a NASA-scale mission to basically enter the Inner Space, and to be there perpetually. What we’re doing is bringing the ocean to the world.”
Under a $331 million program long dreamed of by oceanographers and being financed by the National Science Foundation, Professor Delaney and a team of scientists from several other institutions are leading the new Ocean Observatories Initiative, a multifaceted effort to study the ocean — in the ocean — through a combination of Internet-linked cables, buoys atop submerged data collection devices, robots and high-definition cameras. The first equipment is expected to be in place by 2009.
A central goal, say those involved, is to better understand how oceans affect life on land, including their role in storing carbon and in climate change; the causes of tsunamis; the future of fish populations; and the effect of ocean temperature on growing seasons. Oceanographers also hope to engage other scientists and the public more deeply with ocean issues by making them more immediate. Instead of spending weeks or months on a boat gathering data, then returning to labs to make sense of it, oceanographers say they expect to be able to order up specific requests from their desktops and download the results.
Researchers will be able, for example, to assemble a year’s worth of daily data on deep ocean temperatures in the Atlantic or track changes in currents as a hurricane nears the Gulf of Mexico. And schoolchildren accustomed to dated graphics and grainy shark videos will only have to boot up to dive deep in high definition. “It’ll all go on the Internet and in as real time as possible,” said Alexandra Isern, the program director for ocean technology at the National Science Foundation. “This is really going to transform not only the way scientists do science but also the way the public can be involved.”
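A desktop request of the kind described above might look something like the following Python sketch, which pulls a year of daily deep-ocean temperatures from an observatory data server as CSV. The endpoint URL, station name and parameter names are all hypothetical, invented for illustration; they are not the actual Ocean Observatories Initiative interface.

```python
# Illustrative only: request a year of daily deep-ocean temperatures
# from an observatory web service. The URL and query parameters are
# hypothetical, not the real OOI data interface.
import csv
import io
import urllib.parse
import urllib.request

def fetch_daily_temps(base_url: str, station: str, year: int) -> list[tuple[str, float]]:
    """Download (date, temperature) pairs as CSV from a data server."""
    query = urllib.parse.urlencode({
        "station": station,
        "variable": "deep_ocean_temperature",
        "start": f"{year}-01-01",
        "end": f"{year}-12-31",
        "interval": "daily",
        "format": "csv",
    })
    with urllib.request.urlopen(f"{base_url}?{query}") as resp:
        text = resp.read().decode("utf-8")
    reader = csv.reader(io.StringIO(text))
    next(reader)  # skip the header row
    return [(date, float(temp)) for date, temp in reader]

# Example call against a hypothetical server:
# temps = fetch_daily_temps("https://example.org/ooi/data", "ATL-DEEP-01", 2009)
```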
The program has three main parts: two involve placing a range of sensors in the oceans, and a third connects all the information gathered through the Internet, so that scientists and the public can have access to it.
A “coastal/global” program will include stand-alone deep-water data-gathering stations far offshore, mostly in the higher latitudes of the Atlantic and Pacific, where cold, rough conditions have made ship-based oceanography difficult.
In American waters, observation systems are planned on both coasts. In the Pacific, off the Oregon coast, the system will study the upwelling of cold water that has led to repeated “dead zones” of marine life in recent summers. In the East, off Martha’s Vineyard, a coastal observation system is planned along the continental shelf, gathering information at the surface, subsurface and on the sea floor, where warm Gulf Stream currents confront colder water from off the coast of Canada.
“That mixing affects surface productivity, weather, carbon cycling,” said Robert S. Detrick, a senior scientist at Woods Hole Oceanographic Institution.
In August, the Joint Oceanographic Institutions, which is administering the Ocean Observatories Initiative for the National Science Foundation, chose Woods Hole to lead the offshore buoy and coastal program. Woods Hole, which will receive about $98 million of the total cost, will partner with the Scripps Institution of Oceanography at the University of California, San Diego, and Oregon State University’s College of Oceanic and Atmospheric Sciences.
In the Northwest, about $130 million of the initiative’s cost is being dedicated to build a regional observatory, a series of underwater cables that will crisscross the tectonic plate named for the explorer Juan de Fuca. Rather than provide an information superhighway that bypasses the ocean, this new network is being put in place to take its pulse. Professor Delaney, whose specialty is underwater volcanoes that form at the seams between tectonic plates and the surprising life those seams support, is among those who have been pursuing the cable network for more than a decade, overcoming hurdles of money, technology and skepticism.
Some scientists have suggested that the Juan de Fuca is an imperfect laboratory, that it is small and lacks some features, like the most intense El Niño fluctuations, that might reveal more about how that phenomenon affects conditions at sea and on land. But Professor Delaney says the Juan de Fuca plate is well-suited for the program precisely because it is self-contained, just offshore and rich with seafloor activity, complicated current patterns and abundant fish and marine mammals. The new network shares many similarities with a plan called Neptune that Professor Delaney and others began pushing for in the 1990s. As part of an earlier effort related to that project, Canada is moving forward with its own cabled network off the coast of British Columbia.
“For the first three or four years, people just laughed when I said we’re going to turn Juan de Fuca Plate into a national laboratory,” Professor Delaney said. “Now they’re not laughing.”

Many oceanographers say the program will transform their field. Oscar Schofield, a biological oceanographer at Rutgers University, has spent nearly a decade helping piece together a small-scale coastal observatory in the Atlantic using a combination of radar, remote-controlled underwater gliders and a 15-kilometer underwater cable. The program, called the Coastal Ocean Observation Lab, which he runs with Scott Glenn, also a professor at Rutgers, has a Web site where outsiders can track currents, water temperatures and salinity levels in parts of the Atlantic. They can also follow measuring instruments guided remotely from the New Jersey coast to south of the Chesapeake Bay.

Professor Schofield said that the data gathered already had upended some of what he was taught in graduate school, from the way rivers flow into the ocean to the complexity of surface currents.
“When there’s a hurricane, when all the ships are running for cover, I’m flying my gliders into the hurricane,” using his office computer, Professor Schofield said. “Then I’m sitting at home drinking a beer watching the ocean respond to a hurricane.”
He added: “What’s great about oceanography is we’re still in the phase of just basic exploration. We’ve discovered things off one of the most populated coasts in the United States that we didn’t know yet. O.O.I. will take us one level beyond that, to where any scientist in the world will be able to explore any ocean.”
Several scientists involved in the project cited improved technology — from increased bandwidth to the ability to provide constant power to instruments at sea by underwater cables or solar or wind power — as critical to making the new program possible. They also say that increased concern about global warming, storms and fisheries has brought new attention to their field.
Some experts say they wish the project included more money for, say, placing buoys in the Indian Ocean, where monsoons and other events make for rich ocean activity. But John A. Orcutt, a professor at Scripps who is directing the effort to link the new research to the Internet, said being able to provide constant new data to scientists around the world, or even to teenagers surfing YouTube, will help build support for expanding the project in the future.
“We want to get the oceans and the sea floor in front of the public now,” Professor Orcutt said, “so they can understand the need for more.”

Fausto Intilla

Thursday, August 30, 2007

Next Ice Age Delayed By Rising Carbon Dioxide Levels


Source: ScienceDaily

Science Daily — Future ice ages may be delayed by up to half a million years by our burning of fossil fuels. That is the implication of recent work by Dr Toby Tyrrell of the University of Southampton's School of Ocean and Earth Science at the National Oceanography Centre, Southampton.
Arguably, this work demonstrates the most far-reaching disruption of long-term planetary processes yet suggested for human activity.
Dr Tyrrell's team used a mathematical model to study what would happen to marine chemistry in a world with ever-increasing supplies of the greenhouse gas, carbon dioxide.
The world's oceans are absorbing CO2 from the atmosphere but in doing so they are becoming more acidic. This in turn is dissolving the calcium carbonate in the shells produced by surface-dwelling marine organisms, adding even more carbon to the oceans. The outcome is elevated carbon dioxide for far longer than previously assumed.
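The mechanism can be illustrated with a toy box model: the ocean takes up a share of excess atmospheric CO2, but as it acidifies its uptake efficiency declines, so the excess stays in the atmosphere for longer. Every coefficient and functional form below is invented for illustration; this is not the calibrated marine-chemistry model used in the study.

```python
# Toy illustration of the feedback described above. All numbers are
# made up; only the qualitative behaviour (slower CO2 drawdown as the
# ocean acidifies) is the point.
def atmospheric_decay(excess_gtc: float, years: int) -> list[float]:
    """Trace the decay of an excess atmospheric carbon pulse (Gt C)."""
    ocean_absorbed = 0.0
    history = []
    for _ in range(years):
        # Uptake efficiency falls as the ocean takes up carbon and
        # acidifies (assumed functional form).
        efficiency = 0.02 / (1.0 + ocean_absorbed / 1000.0)
        taken = efficiency * excess_gtc
        excess_gtc -= taken
        ocean_absorbed += taken
        history.append(excess_gtc)
    return history

# e.g. atmospheric_decay(1000.0, 500) shows how slowly a 1000 Gt C
# pulse decays once uptake efficiency declines.
```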
Computer modelling in 2004 by a then oceanography undergraduate student at the University, Stephanie Castle, first interested Dr Tyrrell and colleague Professor John Shepherd in the problem. They subsequently developed a theoretical analysis to validate the plausibility of the phenomenon.
The work, which is part-funded by the Natural Environment Research Council, confirms earlier ideas of David Archer of the University of Chicago, who first estimated the impact rising CO2 levels would have on the timing of the next ice age.
Dr Tyrrell said: 'Our research shows why atmospheric CO2 will not return to pre-industrial levels after we stop burning fossil fuels. It shows that if we use up all known fossil fuels, it doesn't matter at what rate we burn them. The result would be the same whether we burned them at present rates or at more moderate rates; we would still get the same eventual ice-age-prevention result.'
Ice ages occur around every 100,000 years as the pattern of Earth's orbit alters over time. Changes in the way the sun's radiation strikes the Earth allow ice caps to grow, plunging the Earth into an ice age. But it is not only variations in received sunlight that determine the descent into an ice age; levels of atmospheric CO2 are also important.
Humanity has to date burnt about 300 Gt C (gigatonnes of carbon) of fossil fuels. This work suggests that if even 1000 Gt C are eventually burnt, out of total reserves of about 4000 Gt C, then the next ice age is likely to be skipped. Burning all recoverable fossil fuels could lead to the avoidance of the next five ice ages.
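A back-of-the-envelope reading of those figures: roughly 300 Gt C burnt so far, about 1000 Gt C to skip the next ice age, and the full ~4000 Gt C of recoverable reserves to skip five. The linear interpolation between those two anchor points in the sketch below is an assumption made purely for illustration; the study does not state intermediate thresholds.

```python
# Illustrative arithmetic only. The two anchor points (1000 Gt C -> 1
# skipped ice age, 4000 Gt C -> 5) come from the article; the linear
# step of ~750 Gt C per additional skipped ice age is an invented
# interpolation between them.
def ice_ages_skipped(cumulative_gtc: float) -> int:
    if cumulative_gtc < 1000.0:
        return 0
    return min(5, 1 + int((cumulative_gtc - 1000.0) // 750.0))

for burnt in (300.0, 1000.0, 4000.0):
    print(burnt, "Gt C ->", ice_ages_skipped(burnt), "ice age(s) skipped")
```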
Dr Tyrrell is a Reader in the University of Southampton's School of Ocean and Earth Science. This research was published in Tellus B, vol. 59, p. 664.
Note: This story has been adapted from a news release issued by the University of Southampton.

Fausto Intilla