Monday, September 24, 2007

Satellite Paints Picture Of World's Oceans Over Last Decade


Source:

Science Daily — The NASA-managed Sea-viewing Wide Field-of-view Sensor (SeaWiFS) instrument settled into orbit around Earth in 1997 and took its first measurements of ocean color. A decade later, the satellite's data have proved instrumental in countless applications and helped researchers paint a picture of a changing climate. NASA recognized the satellite's tenth anniversary on September 19 with briefings at the Goddard Space Flight Center in Greenbelt, Md.
NASA and GeoEye's SeaWiFS instrument has given researchers the first global look at ocean biological productivity. Its data have applications for understanding and monitoring the impacts of climate change, setting pollution standards, and sustaining coastal economies that depend on tourism and fisheries.
"SeaWiFS allows us to observe ocean changes and the mechanisms linking ocean physics and biology, and that's important for our ability to predict the future health of the oceans in a changing climate," said Gene Carl Feldman, SeaWiFS project manager at Goddard.
Researchers used SeaWiFS data to identify factors controlling the unusual timing of the 2005 phytoplankton bloom in the California Current System that led to the die-off of Oregon coast seabirds. These microscopic plants are key indicators of ocean health, form the base of marine food webs, and absorb carbon dioxide -- a major greenhouse gas -- from Earth's atmosphere.
"Long-term observations of the California coast and other sensitive regions are essential to understanding how changing global climate has affected ecosystems in the past, and how it may do so in the future," said Stephanie Henson of the University of Maine, lead author of a study published last month in the American Geophysical Union's "Journal of Geophysical Research -- Oceans." "This type of large-scale, long-term monitoring can only be achieved using satellite instrumentation," she added.
The SeaWiFS instrument orbits Earth fourteen times a day, measuring visible light over every cloud-free area of land and ocean once every 48 hours. The result is a map of Earth with colors spanning the spectrum of visible light. Variations in the color of the ocean, particularly in shades of blue and green, allow researchers to determine how the single-celled plants called phytoplankton are distributed across the oceans over space and time.
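In practice, retrievals like this rest on a band-ratio idea: the ratio of blue to green reflectance falls as phytoplankton pigment absorbs blue light. The sketch below follows the polynomial form of the OC4-family algorithms used for SeaWiFS, but the coefficients and reflectance values are illustrative placeholders, not the operational calibration:

```python
import math

# Illustrative maximum-band-ratio chlorophyll algorithm of the OC4 family.
# The polynomial coefficients below are representative placeholders for the
# operational SeaWiFS calibration, used here only to show the method.
COEFFS = [0.366, -3.067, 1.930, 0.649, -1.532]

def chlorophyll_mg_m3(rrs443, rrs490, rrs510, rrs555):
    """Estimate near-surface chlorophyll-a (mg/m^3) from remote-sensing
    reflectances in SeaWiFS blue (443, 490, 510 nm) and green (555 nm) bands."""
    # Greener water (lower blue/green ratio) means more phytoplankton.
    ratio = max(rrs443, rrs490, rrs510) / rrs555
    r = math.log10(ratio)
    log_chl = sum(c * r ** i for i, c in enumerate(COEFFS))
    return 10.0 ** log_chl

# Clear open-ocean water: strong blue reflectance -> low chlorophyll.
clear = chlorophyll_mg_m3(0.010, 0.008, 0.005, 0.002)
# Productive coastal water: blue absorbed by pigments -> higher chlorophyll.
green = chlorophyll_mg_m3(0.002, 0.003, 0.004, 0.004)
```

The design choice is the one the article hints at: a single color ratio, measured globally and repeatedly, stands in for ship-based pigment sampling.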
In other research, Mike Behrenfeld of Oregon State University, Corvallis, Ore., and colleagues were the first to use SeaWiFS to quantify biological changes in the oceans as a response to El Niño, which they described in a landmark 2001 study in Science.
"The 2001 study is significant because it marked the first time that global productivity was measured from a single sensor," said Paula Bontempi, program manager for the Biology and Biogeochemistry Research Program at NASA Headquarters in Washington. "The simplicity of SeaWiFS -- a single sensor designed only to measure ocean color -- has made it the gold standard for all ocean color monitoring instruments."
More recently, Zhiqiang Chen and colleagues at the University of South Florida, St. Petersburg, showed that SeaWiFS data have direct application for state and federal regulators looking to better define water quality standards. The team reported in "Remote Sensing of Environment" that instead of relying on the infrequent measurements collected from ships or buoys, SeaWiFS data can be used to monitor coastal water quality almost daily, providing managers with a more frequent and complete picture of changes over time.
SeaWiFS has revolutionized our understanding of the ocean's response to environmental change. Here's one example: over the last ten years, the instrument has gathered daily global bio-productivity readings. When coupled with daily sea surface temperature readings over the same time span, we immediately see a relationship between temperature and productivity.
Beyond the realm of ocean observations, however, SeaWiFS has "revolutionized the way people do research," Feldman said. SeaWiFS was one of the first missions to open up data access online to researchers, students and educators around the world. The mission was able to capitalize on advances in data processing and storage technologies and ride the crest of the World Wide Web's growth from its beginning.
When the SeaWiFS program launched in 1997, the goal was to place a sensor in space capable of routinely monitoring ocean color to better understand the interplay between the ocean and atmosphere and most importantly, the ocean's role in the global carbon cycle. A decade later, Feldman said, "SeaWiFS has exceeded everyone's expectations."
Note: This story has been adapted from a news release issued by NASA/Goddard Space Flight Center.

Fausto Intilla

Friday, September 21, 2007

Deep Earth Model Challenged By New Experiment


Source:

Science Daily — In the first experiments able to mimic the crushing, searing conditions found in Earth's lower mantle, and simultaneously probe tell-tale properties of iron, scientists* have discovered that material there behaves very differently than predicted by models. The research also points to the likelihood of a new zone deep in the Earth.
Surface phenomena such as volcanoes and earthquakes are generated by what goes on in Earth's interior. To understand some of these surface dynamics, scientists have to probe deep into the planet. The lower mantle lies between 400 and 1,740 miles deep (650--2,800 km) and sits atop the outer core.
Coauthor of the paper, Viktor Struzhkin of the Carnegie Institution's Geophysical Laboratory explains: "The deeper you go, the higher the pressures and temperatures become. Under these extreme conditions, the atoms and electrons of the rocks become squeezed so close together that they interact very peculiarly. In fact, spinning electrons in iron, which is prevalent throughout the inner Earth, are forced to pair up. When this spin state changes from unpaired electrons--called a high-spin state--to paired electrons--a low-spin state--the density, sound velocities, conductivity, and other properties of the materials can change. Understanding these conditions helps scientists piece together the complex puzzle of the interior/surface interactions."
The pressures in the lower mantle are brutal, ranging from about 230,000 times the atmospheric pressure at sea level (23 GPa) to almost 1.35 million times sea-level pressure (135 GPa). The heat is equally extreme, from about 2,800 to 6,700 degrees Fahrenheit (1,800--4,000 K).
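The unit conversions quoted here are easy to verify. A minimal sketch using only standard conversion factors and the figures from the article:

```python
ATM_PA = 101_325.0  # one standard atmosphere, in pascals

def gpa_to_atm(gpa):
    """Convert a pressure in gigapascals to multiples of sea-level pressure."""
    return gpa * 1e9 / ATM_PA

def kelvin_to_fahrenheit(k):
    """Convert an absolute temperature in kelvin to degrees Fahrenheit."""
    return (k - 273.15) * 9.0 / 5.0 + 32.0

# Lower-mantle range quoted in the article:
# 23 GPa  -> ~227,000 atm      135 GPa -> ~1.33 million atm
# 1,800 K -> ~2,780 F          4,000 K -> ~6,740 F
```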
Using a laser-heated diamond anvil cell to heat and compress the samples, the scientists subjected ferropericlase to almost 940,000 atmospheres and 3,140 °F, analyzing it with a technique called X-ray emission spectroscopy. As its name suggests, ferropericlase is iron-laden.
It is also the second most prevalent material in the lower mantle. Prior to this study, ferropericlase had been subjected to high pressures, but only at room temperature. The new experiments reached the highest pressures and temperatures yet attained to probe the spin state of iron in the mineral at lower-mantle conditions.
Under the less-intense conditions of the earlier experiments, the high-spin to low-spin transition occurred over a narrow pressure range. In the new study, however, both spin states coexisted in the same crystal structure and the spin transition was continuous over a large pressure range, indicating that the mineral is in a complex state over a large range of depths in the planet.
"We were expecting to find a transition zone, but did not know how extensive it might be in the Earth's mantle," commented Struzhkin. "Our findings suggest that there is a region or 'spin-transition zone' from about 620 miles to 1,365 miles deep, where high-spin, unpaired electrons transition to low-spin, paired electrons. The transitioning appears to be continuous over these depths. At pressures representing a lower depth of about 1,365 miles the transition stops and ferropericlase is dominated by low-spin electrons."
Since measurements that scientists use to determine the composition and density of the inner Earth, such as sound velocities, are influenced by the ratio of high-spin to low-spin states, the new finding calls into question the traditional techniques for modeling this region of the planet.
In addition, a continuous spin transition zone may explain some interesting experimental findings including why there has been no significant iron partitioning, or separating, into ferropericlase or perovskite, the most prevalent mineral in the region. The research also suggests that the depth of the transition zone is less than scientists had speculated.
The existence of this transition zone may also account for seismic-wave behavior at those depths. The fact that the lowermost area is dominated by denser low-spin material could also affect the temperature stability of mantle upwellings--the generators of volcanic hotspots, such as those in Hawaii.
"This paper solves only part of the puzzle," cautioned Struzhkin. "Since the major lower mantle mineral perovskite has not been measured yet with this technique, we know there are more surprises to come."
"The spin transition zone of iron needs to be considered in future models of the lower mantle," said Choong-Shik Yoo, a former staff member at LLNL and now a professor at Washington State University. "In the past, geophysicists had neglected the effects of the spin transition when studying the Earth's interior.
"Since we identified this zone, the next step is to study the properties of lower mantle oxides and silicates across the zone. This research also calls for future seismic and geodynamic tests in order to understand the properties of the spin transition zone."
"The benchmark techniques developed here have profound implications for understanding the electronic transitions in lanthanoid and actinoid compounds under extreme conditions because their properties would be affected by the electronic transitions," said Valentin Iota, a staff member in LLNL's Physics and Advanced Technologies Directorate.
The work is published in the September 21, 2007, issue of Science.
*Authors on this paper are Jung-Fu Lin, Lawrence Livermore National Laboratory (LLNL); György Vankó, KFKI Research Institute for Particle and Nuclear Physics and the European Synchrotron Radiation Facility; Steven Jacobsen, Northwestern University; Viktor Struzhkin, Carnegie Institution's Geophysical Laboratory; Vitali Prakapenka, University of Chicago; Alexie Kuznetsov, University of Chicago; and Choong-Shik Yoo, LLNL.
Note: This story has been adapted from a news release issued by Carnegie Institution.

Fausto Intilla

Wednesday, September 19, 2007

Does Underground Water Regulate Earthquakes?

Source:
Science Daily — Earthquakes are classified as shallow-focus (surface), intermediate, and deep. Seismologists place the boundary between the first two types at a depth of about 70 kilometers, though the nature of that boundary remains unclear.
Russian researchers from the Institute of Maritime Geology and Geophysics (Far-Eastern Branch, Russian Academy of Sciences), the Geophysical Center of the Russian Academy of Sciences and the P.P. Shirshov Institute of Oceanology (Russian Academy of Sciences) have put forward a hypothesis that this seismic boundary is also the lower boundary of the hydrosphere, and that the character of earthquakes depends on underground water.
Earthquakes on either side of this boundary differ in more than depth. Shallow-focus earthquakes, which account for about 85% of all recorded events, often take place under the influence of periodic external effects such as rising tides, which disturb the entire lithosphere of the Earth. Deeper earthquakes show no such periodicity; they occur at random. The researchers reached this conclusion by analyzing data from the worldwide ISC/NEIC catalogues, which cover the period 1964-2005 and include about 80,000 events.
Seismologists connect the existence of the 70-kilometer boundary with changes in the state of water in the Earth's interior. The deeper water molecules lie, the more compressed they become; at a depth of about 70 kilometers, the compression of water reaches a factor of about 1.3, enough to squeeze water molecules into the crystal lattice of the surrounding rock. Above this boundary, water exists mainly as a free phase; below it, water is embedded in the crystal structure of the rock.
Rock containing free water (above the boundary) responds promptly to periodic tidal effects, even the faintest ones. Pressure changes, and the accompanying changes in density, cause a system of cracks to form, into which free water rushes. The cracks widen and multiply, and the resulting rock failure gives birth to a seismic focus. In rock where free water is absent (below the boundary), weak tidal effects do not accumulate and deformation does not grow.
So the seismic boundary at a depth of about 70 kilometers (where, according to the researchers' assumption, the lower boundary of the hydrosphere runs) separates events that can respond to external forcing from those that cannot, and therefore separates different types of earthquakes. It remains, however, a hypothesis that requires experimental validation.
Note: This story has been adapted from a news release issued by Russian Academy Of Sciences.

Fausto Intilla
www.oloscience.com

Sunday, September 16, 2007

Northwest Passage Opens: Arctic Sea Ice Reaches New Low


Source:

Science Daily — The area covered by sea ice in the Arctic has now (September 14, 2007) shrunk to its lowest level since satellite measurements began nearly 30 years ago, opening up the Northwest Passage – a long-sought short cut between Europe and Asia that has been historically impassable.
Leif Toudal Pedersen from the Danish National Space Centre said: "We have seen the ice-covered area drop to just around 3 million sq km, which is about 1 million sq km less than the previous minima of 2005 and 2006. There has been a reduction of the ice cover over the last 10 years of about 100,000 sq km per year on average, so a drop of 1 million sq km in just one year is extreme.
"The strong reduction in just one year certainly raises flags that the ice (in summer) may disappear much sooner than expected and that we urgently need to understand better the processes involved."
Arctic sea ice naturally extends its surface coverage each northern winter and recedes each northern summer, but the rate of overall loss since 1978 when satellite records began has accelerated.
The most direct route of the Northwest Passage across northern Canada is now fully navigable, while the Northeast Passage along the Siberian coast remains only partially blocked. To date, the Northwest Passage has been predicted to remain closed even during reduced ice cover by multi-year ice pack – sea ice that survives one or more summers. However, according to Pedersen, this year’s extreme event has shown the passage may well open sooner than expected.
The previous record low was in 2005 when the Arctic area covered by sea ice was just 4 million sq km. Even then, the most direct Northwest Passage did not fully open.
The Polar Regions are very sensitive indicators of climate change. The UN’s Intergovernmental Panel on Climate Change showed these regions are highly vulnerable to rising temperatures and predicted the Arctic would be virtually ice free by the summer of 2070. Still other scientists predict it could become ice free as early as 2040 due to rising temperatures and sea ice decline.
Because sea ice has a bright surface, the majority of solar energy that hits it is reflected back into space. When sea ice melts, the dark-coloured ocean surface is exposed. Solar energy is then absorbed rather than reflected, so the oceans get warmer and temperatures rise, making it difficult for new ice to form.
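The feedback described here can be illustrated with a back-of-envelope calculation. The albedo and insolation values below are typical textbook figures chosen for illustration, not measurements from the article:

```python
# Simple illustration of the ice-albedo feedback: albedo is the fraction
# of incoming sunlight a surface reflects, so (1 - albedo) is absorbed.
SOLAR_FLUX = 200.0   # W/m^2, rough summertime Arctic insolation (assumed)
ALBEDO_ICE = 0.7     # bright sea ice reflects most sunlight (typical value)
ALBEDO_OCEAN = 0.06  # dark open water absorbs most of it (typical value)

def absorbed(albedo, flux=SOLAR_FLUX):
    """Solar power absorbed per square metre of surface, in W/m^2."""
    return (1.0 - albedo) * flux

ice_heating = absorbed(ALBEDO_ICE)      # ~60 W/m^2
ocean_heating = absorbed(ALBEDO_OCEAN)  # ~188 W/m^2
# Replacing ice with open water roughly triples the absorbed energy,
# which warms the ocean and suppresses new ice formation.
```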
The Arctic is one of Earth’s most inaccessible areas, so obtaining measurements of sea ice was difficult before the advent of satellites. For more than 20 years, ESA has been providing satellite data to the cryosphere communities. Currently, ESA is contributing to the International Polar Year (IPY) – a large worldwide science programme focused on the Arctic and Antarctic.
Since 2006, ESA has supported Polar View, a satellite remote-sensing programme funded through the Earthwatch GMES Service Element (GSE) that focuses on the Arctic and the Antarctic.
In 2009, ESA will make another significant contribution to cryosphere research with the launch of CryoSat-2. The observations made over the three-year lifetime of the mission will provide conclusive evidence on the rates at which ice cover is diminishing.
Note: This story has been adapted from a news release issued by European Space Agency.

Fausto Intilla

Friday, September 14, 2007

Sea Ice Is Getting Thinner


Source:

Science Daily — Large areas of the Arctic sea ice are only one metre thick this year, roughly a 50 percent thinning compared with 2001. These are the initial results from the latest expedition to the North Polar Sea led by the Alfred Wegener Institute for Polar and Marine Research in the Helmholtz Association.
Fifty scientists have been on board the research ship Polarstern for two and a half months, their main aim being to study the sea-ice areas of the central Arctic. Among other things, they have found that not only the ocean currents are changing; community structures in the Arctic are also altering. Autonomous measuring buoys have been deployed and will continue to deliver valuable data on the environmental changes in this region after the expedition has finished.
“The ice cover in the North Polar Sea is dwindling, the ocean and the atmosphere are becoming steadily warmer, the ocean currents are changing,” said chief scientist Dr Ursula Schauer of the Alfred Wegener Institute for Polar and Marine Research, part of the Helmholtz Association, commenting on the latest results from the current expedition. She is currently in the Arctic, underway with 50 scientists from Germany, Russia, Finland, the Netherlands, Spain, the USA, Switzerland, Japan, France and China, investigating ocean and sea-ice conditions.
“We are in the midst of a phase of dramatic change in the Arctic, and the International Polar Year 2007/08 offers us a unique opportunity to study this dwindling ocean in collaboration with international researchers,” said Schauer. Oceanographers on board the research ship Polarstern are investigating the composition and circulation of the water masses, physical characteristics of sea-ice and transport of biological and geochemical components in ice and seawater. Sea-ice ecosystems in the seawater and on the ocean floor will also be a focus of investigations. Scientists will take sediments from the ocean floor in order to reconstruct the climatic history of the surrounding continents.
Oceanographic measuring buoys were set out in all regions of the Arctic Ocean for the first time during this International Polar Year. They drift freely in the Arctic Ocean while collecting data on the currents, temperature and salt content of the seawater, and will continuously relay these data back to the scientists via satellite.
In addition, a new titanium measuring system has been deployed that, for the first time, allows contamination-free collection of trace-element samples. These studies will take place within the context of several research projects, all running during the International Polar Year: SPACE (Synoptic Pan-Arctic Climate and Environment Study), iAOOS (Integrated Arctic Ocean Observing System) and GEOTRACES (Trace Elements in the Arctic). A large part of the work is also supported by the European Union programme DAMOCLES (Developing Arctic Modelling and Observing Capabilities for Long-term Environment Studies).
Changes in sea-ice
The thickness of the Arctic sea ice has decreased since 1979 and now measures about a metre in the central Arctic Basin. In addition, oceanographers have found a particularly high concentration of melt-water in the ocean and a large number of melt-ponds. These data, collected from on board the Polarstern and from helicopter flights, allow the scientists to better interpret their satellite images.
Sea-ice biologists from the Institute of Polar Ecology at the University of Kiel are studying the animals and plants living in and beneath the ice, using the opportunity to investigate this threatened ecosystem. According to the newest models, the Arctic could be ice free in less than 50 years if warming continues, which may cause the extinction of many organisms adapted to this habitat.
Ocean currents
The Arctic Ocean currents are an important part of the global ocean circulation. Warm water masses flowing in from the Atlantic are transformed in the Arctic through cooling and ice formation, and sink to great depths. Constant monitoring by the Alfred Wegener Institute for Polar and Marine Research over the last ten years has recorded significant changes and demonstrated a warming of the incoming current from the Atlantic Ocean. During this expedition, the propagation of these warming events along each of the currents in the North Polar Sea will be investigated.
The large rivers of Siberia and North America transport huge amounts of freshwater to the Arctic. The freshwater appears to function as an insulating layer, controlling the heat transfer between the ocean, the ice and the atmosphere.
The study area stretches from the shelf areas of the Barents Sea, the Kara Sea and the Laptev Sea, across the Nansen and Amundsen basins, right up to the Makarov Basin.
Between Norway and Siberia and up to the Canada Basin the scientists have taken temperature, salinity and current measurements at more than 100 locations. First results show that the temperatures of the inflowing Atlantic water are lower than in previous years. The temperatures and salinity levels in the Arctic deep sea are also slowly changing.
The changes are small here, but the areas go down to great depths, and enormous water volumes are therefore involved. In order to follow the circulation patterns in winter, oceanographic measuring buoys will be attached to ice floes, and continuous measurements will be taken whilst they float along with the ice. The measurements will be relayed back via satellite.
In addition to the ocean currents and sea ice, zooplankton, sediment samples from the sea floor and trace elements will be sampled. Zooplankton are at the base of the food chain for many marine creatures, and are therefore an important indicator of the health of the ecosystem. The deposits on the ocean floor of the North Polar Sea read like a diary of the climate history of the surrounding continents, and through sediment cores the scientists may be able to reconstruct the glaciation history of northern Siberia.
The members of the expedition will also be able to measure trace elements from Siberian rivers and shelf areas that the transpolar drift is pushing towards the Atlantic.
Further information on this project can be found on the German contribution to the International Polar Year website (http://www.polarjahr.de/).
Note: This story has been adapted from a news release issued by Alfred Wegener Institute for Polar and Marine Research.

Fausto Intilla

Friday, September 7, 2007

Large Asteroid Breakup May Have Caused Mass Extinction On Earth 65 Million Years Ago


Source:

Science Daily — The impactor believed to have wiped out the dinosaurs and other life forms on Earth some 65 million years ago has been traced back to a breakup event in the main asteroid belt. A joint U.S.-Czech team from Southwest Research Institute (SwRI) and Charles University in Prague suggests that the parent object of asteroid (298) Baptistina disrupted when it was hit by another large asteroid, creating numerous large fragments that would later create the Chicxulub crater on the Yucatan Peninsula as well as the prominent Tycho crater found on the Moon.
The team of researchers, including Dr. William Bottke (SwRI), Dr. David Vokrouhlicky (Charles University, Prague) and Dr. David Nesvorny (SwRI), combined observations with several different numerical simulations to investigate the Baptistina disruption event and its aftermath. A particular focus of their work was how Baptistina fragments affected the Earth and Moon.
At approximately 170 kilometers in diameter and having characteristics similar to carbonaceous chondrite meteorites, the Baptistina parent body resided in the innermost region of the asteroid belt when it was hit by another asteroid estimated to be 60 kilometers in diameter. This catastrophic impact produced what is now known as the Baptistina asteroid family, a cluster of asteroid fragments with similar orbits. According to the team's modeling work, this family originally included approximately 300 bodies larger than 10 kilometers and 140,000 bodies larger than 1 kilometer.
Once created, the newly formed fragments’ orbits began to slowly evolve due to thermal forces produced when they absorbed sunlight and re-radiated the energy away as heat. According to Bottke, "By carefully modeling these effects and the distance traveled by different-sized fragments from the location of the original collision, we determined that the Baptistina breakup took place 160 million years ago, give or take 20 million years."
The gradual spreading of the family caused many fragments to drift into a nearby "dynamical superhighway" where they could escape the main asteroid belt and be delivered to orbits that cross Earth’s path. The team's computations suggest that about 20 percent of the surviving multi-kilometer-sized fragments in the Baptistina family were lost in this fashion, with about 2 percent of those objects going on to strike the Earth, producing a pronounced increase in the number of large asteroids hitting the planet.
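These percentages combine into a rough expected count of Earth impactors. The sketch below simply multiplies the family size quoted earlier by the two fractions given here; treating the initial count of kilometer-scale fragments as a stand-in for the survivors is an assumption of the sketch, not a figure from the study:

```python
# Back-of-envelope check of the Baptistina delivery numbers quoted above.
fragments_over_1km = 140_000  # initial family members larger than 1 km
escape_fraction = 0.20        # share drifting into the resonance "superhighway"
impact_fraction = 0.02        # share of escapees that go on to strike Earth

expected_impactors = fragments_over_1km * escape_fraction * impact_fraction
# A few hundred kilometer-scale fragments reaching Earth, spread over the
# ~160 million years since the breakup: a sustained shower, not a single event.
```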
Support for these conclusions comes from the impact history of the Earth and Moon, both of which show evidence of a two-fold increase in the formation rate of large craters over the last 100 to 150 million years. As described by Nesvorny, "The Baptistina bombardment produced a prolonged surge in the impact flux that peaked roughly 100 million years ago. This matches up pretty well with what is known about the impact record."
Bottke adds, "We are in the tail end of this shower now. Our simulations suggest that about 20 percent of the present-day, near-Earth asteroid population can be traced back to the Baptistina family."
The team then investigated the origins of the 180 kilometer diameter Chicxulub crater, which has been strongly linked to the extinction of the dinosaurs 65 million years ago. Studies of sediment samples and a meteorite from this time period indicate that the Chicxulub impactor had a carbonaceous chondrite composition much like the well-known primitive meteorite Murchison. This composition is enough to rule out many potential impactors but not those from the Baptistina family. Using this information in their simulations, the team found a 90 percent probability that the object that formed the Chicxulub crater was a refugee from the Baptistina family.
These simulations also showed there was a 70 percent probability that the lunar crater Tycho, an 85 kilometer crater that formed 108 million years ago, was also produced by a large Baptistina fragment. Tycho is notable for its large size, young age and its prominent rays that extend as far as 1,500 kilometers across the Moon. Vokrouhlicky says, "The probability is smaller than in the case of the Chicxulub crater because nothing is yet known about the nature of the Tycho impactor."
This study demonstrates that the collisional and dynamical evolution of the main asteroid belt may have significant implications for understanding the geological and biological history of Earth.
As Bottke says, "It is likely that more breakup events in the asteroid belt are connected in some fashion to events on the Earth, Moon and other planets. The hunt is on!"
The article, "An asteroid breakup 160 Myr ago as the probable source of the K/T impactor," was published in the Sept. 6 issue of Nature.
The NASA Origins of Solar Systems, Planetary Geology and Geophysics, and Near-Earth Objects Observations programs funded Bottke's and Nesvorny's research; Vokrouhlicky was funded by the Grant Agency of the Czech Republic.
Note: This story has been adapted from a news release issued by Southwest Research Institute.

Fausto Intilla

Thursday, September 6, 2007

‘Bringing the Ocean to the World,’ in High-Def


Source:

By WILLIAM YARDLEY
Published: September 4, 2007

SEATTLE — Thousands of miles of fiber-optic cables are strung across the world’s oceans, connecting continents like so many tin cans in this age of critical global communication. So the fact that about 800 more miles of fiber-optic cable will soon thread the sea floor off the coast of the Pacific Northwest might not seem particularly revolutionary. Until you meet John R. Delaney, part oceanographer, part oracle.
“This is a mission to Planet Ocean,” said Mr. Delaney, a professor at the University of Washington. “This is a NASA-scale mission to basically enter the Inner Space, and to be there perpetually. What we’re doing is bringing the ocean to the world.”
Under a $331 million program long dreamed of by oceanographers and being financed by the National Science Foundation, Professor Delaney and a team of scientists from several other institutions are leading the new Ocean Observatories Initiative, a multifaceted effort to study the ocean — in the ocean — through a combination of Internet-linked cables, buoys atop submerged data collection devices, robots and high-definition cameras. The first equipment is expected to be in place by 2009.
A central goal, say those involved, is to better understand how oceans affect life on land, including their role in storing carbon and in climate change; the causes of tsunamis; the future of fish populations; and the effect of ocean temperature on growing seasons. Oceanographers also hope to engage other scientists and the public more deeply with ocean issues by making them more immediate. Instead of spending weeks or months on a boat gathering data, then returning to labs to make sense of it, oceanographers say they expect to be able to order up specific requests from their desktops and download the results.
Researchers will be able, for example, to assemble a year’s worth of daily data on deep ocean temperatures in the Atlantic or track changes in currents as a hurricane nears the Gulf of Mexico. And schoolchildren accustomed to dated graphics and grainy shark videos will only have to boot up to dive deep in high definition. “It’ll all go on the Internet and in as real time as possible,” said Alexandra Isern, the program director for ocean technology at the National Science Foundation. “This is really going to transform not only the way scientists do science but also the way the public can be involved.”
The program has three main parts, two of which involve placing a range of sensors in the oceans and one that connects through the Internet all the information gathered, so that the public and scientists can have access to it.
A “coastal/global” program will include stand-alone deep-water data-gathering stations far offshore, mostly in the higher latitudes of the Atlantic and Pacific, where cold, rough conditions have made ship-based oceanography difficult.
In American waters, observation systems are planned on both coasts. In the Pacific, off the Oregon coast, the system will study the upwelling of cold water that has led to repeated “dead zones” of marine life in recent summers. In the East, off Martha’s Vineyard, a coastal observation system is planned along the continental shelf, gathering information at the surface, subsurface and on the sea floor, where warm Gulf Stream currents confront colder water from off the coast of Canada.
“That mixing affects surface productivity, weather, carbon cycling,” said Robert S. Detrick, a senior scientist at Woods Hole Oceanographic Institution.
In August, the Joint Oceanographic Institutions, which is administering the Ocean Observatories Initiative for the National Science Foundation, chose Woods Hole to lead the offshore buoy and coastal program. Woods Hole, which will receive about $98 million of the total cost, will partner with the Scripps Institution of Oceanography at the University of California, San Diego, and Oregon State University’s College of Oceanic and Atmospheric Sciences.
In the Northwest, about $130 million of the initiative’s cost is being dedicated to build a regional observatory, a series of underwater cables that will crisscross the tectonic plate named for the explorer Juan de Fuca. Rather than provide an information superhighway that bypasses the ocean, this new network is being put in place to take its pulse. Professor Delaney, whose specialty is underwater volcanoes that form at the seams between tectonic plates and the surprising life those seams support, is among those who have been pursuing the cable network for more than a decade, overcoming hurdles of money, technology and skepticism.
Some scientists have suggested that the Juan de Fuca is an imperfect laboratory, that it is small and lacks some features, like the most intense El Niño fluctuations, that might reveal more about how that phenomenon affects conditions at sea and on land. But Professor Delaney says the Juan de Fuca plate is well-suited for the program precisely because it is self-contained, just offshore and rich with seafloor activity, complicated current patterns and abundant fish and marine mammals. The new network shares many similarities with a plan called Neptune that Professor Delaney and others began pushing for in the 1990s. As part of an earlier effort related to that project, Canada is moving forward with its own cabled network off the coast of British Columbia.
“For the first three or four years, people just laughed when I said we’re going to turn Juan de Fuca Plate into a national laboratory,” Professor Delaney said. “Now they’re not laughing.”

Many oceanographers say the program will transform their field. Oscar Schofield, a biological oceanographer at Rutgers University, has spent nearly a decade helping piece together a small-scale coastal observatory in the Atlantic using a combination of radar, remote-controlled underwater gliders and a 15-kilometer underwater cable. The program, called the Coastal Ocean Observation Lab, which he runs with Scott Glenn, also a professor at Rutgers, has a Web site where outsiders can track currents, water temperatures and salinity levels in parts of the Atlantic. They can also follow measuring instruments guided remotely from the New Jersey coast to south of the Chesapeake Bay.

Professor Schofield said that the data gathered already had upended some of what he was taught in graduate school, from the way rivers flow into the ocean to the complexity of surface currents.
“When there’s a hurricane, when all the ships are running for cover, I’m flying my gliders into the hurricane,” using his office computer, Professor Schofield said. “Then I’m sitting at home drinking a beer watching the ocean respond to a hurricane.”
He added: “What’s great about oceanography is we’re still in the phase of just basic exploration. We’ve discovered things off one of the most populated coasts in the United States that we didn’t know yet. O.O.I. will take us one level beyond that, to where any scientist in the world will be able to explore any ocean.”
Several scientists involved in the project cited improved technology — from increased bandwidth to the ability to provide constant power to instruments at sea by underwater cables or solar or wind power — as critical to making the new program possible. They also say that increased concern about global warming, storms and fisheries has brought new attention to their field.
Some experts say they wish the project included more money for, say, placing buoys in the Indian Ocean, where monsoons and other events make for rich ocean activity. But John A. Orcutt, a professor at Scripps who is directing the effort to link the new research to the Internet, said being able to provide constant new data to scientists around the world, or even to teenagers surfing YouTube, will help build support for expanding the project in the future.
“We want to get the oceans and the sea floor in front of the public now,” Professor Orcutt said, “so they can understand the need for more.”

Fausto Intilla

Wednesday, September 5, 2007

Earthquakes in Real Time! (Place & Hour)

LINK to see the Real Time Map:

Fausto Intilla (Scientific Popularizer) - www.oloscience.com

NASA Satellites Eye Coastal Water Quality


Source:

Science Daily — Using data from instruments aboard NASA satellites, Zhiqiang Chen and colleagues at the University of South Florida in St. Petersburg found that they can monitor water quality almost daily, rather than monthly.
Such information has direct application for resource managers devising restoration plans for coastal water ecosystems and federal and state regulators in charge of defining water quality standards.
The team's findings, published July 30 in two papers in Remote Sensing of Environment, will help tease out factors that drive changes in coastal water quality. For example, sediments entering the water as a result of coastal development or pollution can cause changes in water turbidity -- a measure of the amount of particles suspended in the water. Sediments suspended from the bottom by strong winds or tides may also cause such changes. Knowing where the sediments come from is critical to managers because turbidity cuts off light to the bottom, thwarting the natural growth of plants.
"If we can track the source of turbidity, we can better understand why turbidity is changing. And if the source is human-related, we can try to manage that human activity," says Frank Muller-Karger, a study co-author from the University of South Florida.
Satellites previously have observed turbidity in the open ocean by monitoring how much light is reflected and absorbed by the water. The technique has not had much success in observing turbidity along the coast, however. That's because shallow coastal waters and Earth's atmosphere serve up complicated optical properties that make it difficult for researchers to determine which colors in a satellite image are related to turbidity, which to shallow bottom waters, and which to the atmosphere. Now with advances in satellite sensors combined with developments in how the data are analyzed, Chen and colleagues show it is possible to monitor turbidity of coastal waters via satellite.
The traditional methods of monitoring coastal water quality require scientists to use boats to gather water samples, typically on a monthly basis because of the high cost of these surveys. Monthly sampling is sufficient to capture seasonal events affecting water quality, such as freshwater runoff. Chen and colleagues suspected, however, that it was missing faster changes in the factors that affect water quality, such as winds, tides and human influences including pollution and runoff.
The team set out to see whether satellites could accurately measure two key indicators of water quality -- turbidity and water clarity -- in Tampa Bay, Fla. An analysis of turbidity takes into account water clarity, a measure of how much light can penetrate into deep water. Satellites, with their wide coverage and multiple passes per week, offered a way to look frequently and to measure an entire estuary within seconds.
To determine water clarity in Tampa Bay, the team looked at more than eight years of imagery from GeoEye's Sea-viewing Wide Field-of-view Sensor (SeaWiFS) instrument, whose data are analyzed, processed and distributed by NASA for research. The images give a measure of how much light is reflected by the water, and the data were put through a two-step calculation to arrive at a measure of clarity. Similarly, data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) instrument aboard the Aqua satellite were compared with measurements of turbidity gathered on the ground, and the resulting relationship was then applied to each whole image to make turbidity maps.
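The empirical approach described above, in which a relationship is fitted between satellite reflectance and turbidity measured on the ground and then applied to every pixel of an image, can be sketched roughly as follows. This is a minimal illustration, not the team's published algorithm: the linear model, the coefficients and the sample values are all assumptions.

```python
# Minimal sketch of empirical satellite calibration for turbidity:
# 1) fit reflectance -> turbidity on matched ground samples,
# 2) apply the fitted relationship to a whole image of reflectance values.
# The linear form and all numbers here are illustrative assumptions.

def fit_linear(x, y):
    """Ordinary least-squares fit y ~ a*x + b for matched samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    b = my - a * mx
    return a, b

# Matched pairs: satellite reflectance vs. in-situ turbidity (NTU).
reflectance = [0.010, 0.020, 0.030, 0.040]
turbidity_ntu = [1.0, 2.0, 3.0, 4.0]

a, b = fit_linear(reflectance, turbidity_ntu)

# Apply the calibration to every pixel of a (tiny) reflectance image
# to produce a turbidity map in one pass.
image = [[0.015, 0.025], [0.035, 0.012]]
turbidity_map = [[a * px + b for px in row] for row in image]
```

The point of the satellite method is exactly this second step: once the relationship is calibrated against a handful of boat-based samples, an entire estuary's worth of pixels can be converted to turbidity at once.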
When the satellite estimates were compared with independent field measurements, collected with help from the U.S. Geological Survey, they proved to be an accurate measure of water quality in the bay. The method can be applied to coastal waters worldwide with little change, according to Muller-Karger.
Frequent measurements from space could resolve questions about the specific timing and nature of events that led to decreases in water quality. Seasonal freshwater discharge from nearby rivers and runoff into the bay can carry nutrients. If these nutrients are not controlled, they can give rise to large and harmful phytoplankton blooms, which can kill sea grass. Wind conditions, however, are the driving force for a decline in water quality in the dry season between October and June, when bottom sediments are disturbed.
"It's important to look at baseline conditions and see how they change with the seasons and over the years, and whether that change is due to development, coastal erosion, the extraction and dumping of sediments, or digging a channel," Muller-Karger says.
The SeaWiFS sensor was launched aboard the OrbView-2 satellite in 1997 to collect ocean color data. MODIS was launched aboard the Aqua satellite in 2002 and collects measurements from the entire surface of Earth every one to two days.
Note: This story has been adapted from a news release issued by NASA/Goddard Space Flight Center.

Fausto Intilla

Monday, September 3, 2007

Volcanoes Key To Earth's Oxygen Atmosphere


Source:

Science Daily — A switch from predominantly undersea volcanoes to a mix of undersea and terrestrial ones shifted the Earth's atmosphere from one devoid of oxygen to one containing free oxygen, according to geologists.
"The rise of oxygen allowed for the evolution of complex oxygen-breathing life forms," says Lee R. Kump, professor of geoscience, Penn State.
Before 2.5 billion years ago, the Earth's atmosphere lacked oxygen. However, biomarkers in rocks 200 million years older than that boundary show that oxygen-producing cyanobacteria were already releasing oxygen at the same levels as today. The oxygen produced then had to be going somewhere.
"The absence of oxidized soil profiles and red beds indicates that oxidative weathering rates were negligible during the Archaean," the researchers report in the Aug. 30 issue of Nature.
The ancient Earth should therefore have had an oxygen atmosphere, but something was chemically reducing the oxygen and removing it from the atmosphere. The researchers suggest that submarine volcanoes, which produce a reducing mixture of gases and lavas, effectively scrubbed oxygen from the atmosphere, binding it into oxygen-containing minerals.
"The Archaean more than 2.5 billion years ago seemed to be dominated by submarine volcanoes," says Kump. "Subaerial andesite volcanoes on thickened continental crust seem to be almost absent in the Archaean."
About 2.5 billion years ago at the Archaean/Proterozoic boundary, when stabilized continental land masses arose and terrestrial volcanoes appeared, markers show that oxygen began appearing in the atmosphere.
Kump and Mark E. Barley, professor of geology, University of Western Australia, looked at the geologic record from the Archaean and the Palaeoproterozoic in search of the remains of volcanoes. They found that the Archaean was nearly devoid of terrestrial volcanoes but heavily populated by submarine ones. The Palaeoproterozoic, however, had ample terrestrial volcanic activity along with continuing submarine volcanism. The subaerial volcanoes that arose after 2.5 billion years ago did not strip oxygen from the air, so once the mix was dominated by terrestrial volcanoes, oxygen could persist in the atmosphere.
Terrestrial volcanoes became much more common in the Palaeoproterozoic because land masses had stabilized and the current tectonic regime came into play.
The researchers looked at the ratio of submarine to subaerial volcanoes through time. Because submarine volcanoes erupt at lower temperatures than terrestrial volcanoes, they are more reducing. As long as the reducing ability of the submarine volcanoes was larger than the amounts of oxygen created, the atmosphere had no oxygen. When terrestrial volcanoes began to dominate, oxygen levels increased.
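The mass balance described in this section, in which free oxygen appears only when photosynthetic production outpaces the reducing power of volcanic gases, can be illustrated with a toy box model. The flux values, the step size and the sharp transition at 2.5 billion years below are invented for illustration; only the qualitative switch comes from the article.

```python
# Toy oxygen mass balance across the Archaean/Proterozoic boundary.
# Free O2 accumulates only when the photosynthetic source exceeds the
# volcanic sink. All flux values and times are illustrative assumptions.

O2_SOURCE = 10.0  # cyanobacterial O2 production, arbitrary units per Gyr

def volcanic_sink(t_ga):
    """Reducing capacity of volcanic gases at time t_ga (Ga before present).
    Submarine volcanoes dominate before ~2.5 Ga (strong sink); terrestrial
    volcanoes dominate afterwards (weak sink)."""
    return 15.0 if t_ga > 2.5 else 4.0

def atmospheric_o2(t_start_ga=3.0, t_end_ga=2.0, step_ga=0.01):
    """Integrate the O2 budget forward in time; O2 cannot go negative."""
    o2, t = 0.0, t_start_ga
    while t > t_end_ga:
        net = O2_SOURCE - volcanic_sink(t)
        o2 = max(0.0, o2 + net * step_ga)
        t -= step_ga
    return o2

# Before ~2.5 Ga the strong submarine sink wins and O2 stays at zero;
# after the switch to terrestrial volcanoes, oxygen begins to accumulate.
```

The model captures the paper's central point: the cyanobacterial source was roughly constant, so it was the step change in the sink, not a change in production, that let oxygen appear.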
The National Science Foundation, NASA Astrobiology Institute and the Australian Research Council supported this work.
Note: This story has been adapted from a news release issued by Penn State.

Fausto Intilla

Sunday, September 2, 2007

"The World without us"



Starting Over

By JENNIFER SCHUESSLER
Published: September 2, 2007

When Rachel Carson’s “Silent Spring” was published in 1962, the chemical giant Monsanto struck back with a parody called “Desolate Spring” that envisioned an America laid waste not by pesticides but by insects: “The bugs were everywhere. Unseen. Unheard. Unbelievably universal. ... On or under every square foot of land, every square yard, every acre, and county, and state and region in the entire sweep of the United States. In every home and barn and apartment house and chicken coop, and in their timbers and foundations and furnishings. Beneath the ground, beneath the waters, on and in limbs and twigs and stalks, under rocks, inside trees and animals and other insects — and yes, inside man.”

To Alan Weisman, this nightmare scenario would be merely a promising start. In his morbidly fascinating nonfiction eco-thriller, “The World Without Us,” Weisman imagines what would happen if the earth’s most invasive species — ourselves — were suddenly and completely wiped out. Writers from Carson to Al Gore have invoked the threat of environmental collapse in an effort to persuade us to change our careless ways. With similar intentions but a more devilish sense of entertainment values, Weisman turns the destruction of our civilization and the subsequent rewilding of the planet into a Hollywood-worthy, slow-motion disaster spectacular and feel-good movie rolled into one.
A journalist and author of three previous books, Weisman travels from Europe’s last remnant of primeval forest to the horse latitudes of the Pacific, interviewing everyone from evolutionary biologists and materials scientists to archaeologists and art conservators in his effort to sketch out the planet’s post-human future. In even the most heavily fortified corners of the settled world, the rot would set in quickly. With no one left to run the pumps, New York’s subway tunnels would fill with water in two days. Within 20 years, Lexington Avenue would be a river. Fire- and wind-ravaged skyscrapers would eventually fall like giant trees. Within weeks of our disappearance, the world’s 441 nuclear plants would melt down into radioactive blobs, while our petrochemical plants, “ticking time bombs” even on a normal day, would become flaming geysers spewing toxins for decades to come. Outside of these hot spots, Weisman depicts a world slowly turning back into wilderness. After about 100,000 years, carbon dioxide would return to prehuman levels. Domesticated species from cattle to carrots would revert to their wild ancestors. And on every depopulated continent, forests and grasslands would reclaim our farms and parking lots as animals began a slow parade back to Eden.
A million years from now, a collection of mysterious artifacts would remain to puzzle whatever alien beings might stumble upon them: the flooded tunnel under the English Channel; bank vaults full of mildewed money; obelisks warning of buried atomic waste (as current law requires) in seven long-obsolete human languages, with pictures. The faces on Mount Rushmore might provoke Ozymandian wonder for about 7.2 million more years. (Lincoln would probably fare better on the pre-1982 penny, cast in durable bronze.) But it’s hard to imagine an alien archaeologist finding poetry in the remote Pacific atolls awash in virtually unbiodegradable plastic bottles, bags and Q-tip shafts, or in the quadrillions of nurdles, microscopic plastic bits in the oceans — they currently outweigh all the plankton by a factor of six — that would continue to cycle uncorrupted through the guts of sea creatures until an enterprising microbe evolved to break them down.
As for the creatures who made this mess, the only residue of our own surprisingly negligible biomass — according to the biologist E. O. Wilson, the six billion-plus humans currently wreaking planetary havoc could all be neatly tucked away in one branch of the Grand Canyon — would be the odd fossil, mingling perhaps with the limbs of Barbie dolls.
Weisman knows from the work of environmental historians that humans have been shaping the natural world since long before the industrial age. His inner Deep Ecologist may dream of Earth saying good riddance to us, but he finds some causes for hope amid the general run of man-bites-planet bad news. At Amboseli National Park in Kenya, he takes comfort in the spectacle of Masai herdsmen living in carefully managed harmony with predators and grazers alike. In the 30-kilometer-radius “Zone of Alienation” around the Chernobyl nuclear plant, where some bridges remain too hot to cross 20 years after the 1986 meltdown, he finds eerie peace in the forests full of moose, lynx and radioactive deer. Watching from inside his protective suit as barn swallows buzz around the reactor, Weisman writes: “You want them to fly away, fast and far. At the same time, it’s mesmerizing that they’re here. It seems so normal, as if apocalypse has turned out to be not so bad after all. The worst happens, and life still goes on.”

So could we ourselves really simply fly away, leaving the rest of nature to slowly clean up our mess? Doomsday rhetoric aside, the fact is that nothing is likely to wipe us out completely, at least not without taking a good chunk of the rest of creation with us. (Even a virus with a 99.99 percent kill rate would still leave more than half a million naturally immune survivors who could fully repopulate the earth to current levels in a mere 50,000 years.) Not that some people aren’t trying to take matters into their own hands. Weisman checks in with Les Knight, the founder of the Voluntary Human Extinction Movement, which advocates gradually putting our species to sleep by collective refusal to procreate. After an initial panic, we would look around and see that the world was actually getting better: “With no more resource conflicts, I doubt we’d be wasting each other’s lives in combat,” Knight says. “The last humans could enjoy their final sunsets peacefully, knowing they have returned the planet as close as possible to the Garden of Eden.” (Apparently he never saw “Children of Men.”)
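The parenthetical arithmetic above is easy to check: a 99.99 percent kill rate applied to an assumed 6.5 billion people leaves roughly 650,000 survivors, and returning to the starting population within 50,000 years requires only a tiny annual growth rate. The sketch below treats growth as simple compounding, which is an assumption, not the book's model.

```python
population = 6.5e9   # assumed world population circa 2007
kill_rate = 0.9999   # the hypothetical virus lethality from the review

# Survivors of a 99.99 percent kill rate.
survivors = population * (1 - kill_rate)   # roughly 650,000 people

# Annual compound growth rate r needed to satisfy
#   survivors * (1 + r) ** years == population.
years = 50_000
r = (population / survivors) ** (1 / years) - 1
# r works out to well under 0.1 percent per year, so the claim holds
# with enormous room to spare.
```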

Weisman has his own flirtation with religious language, his occasionally portentous impassivity giving way to the familiar rhetoric of eco-hellfire as he imagines the earth’s most “narcissistic” species cleansed from the earth as punishment for its “overindulged lifestyle.” But Weisman stops short of calling for our full green burial, arguing instead for a universal “one child per human mother” policy. It would take until 2100 to dwindle to a global population of 1.6 billion, a level last seen in the 19th century, before leaping advances in energy, medicine and food production, but well before then we’d experience “the growing joy of watching the world daily become more wonderful.” And the evidence, Weisman writes, “wouldn’t hide in statistics. It would be outside every human’s window, where refreshed air would fill each season with more birdsong.”
Even readers who vaguely agree that there are “too many of us” (or is it too many of them?) may not all share Weisman’s brisk certainty that trading a sibling for more birdsong is a good bargain, just as those who applaud the reintroduction of the North American wolf may not quite buy the claim by Dave Foreman, a founder of Earth First!, that filling the New World’s empty über-predator niche with African lions and cheetahs is our best chance to avoid what Weisman calls “the black hole into which we’re shoving the rest of nature.” In the end, it’s the cold facts and cooler heads that drive Weisman’s cautionary message powerfully home. When it comes to mass extinctions, one expert tells him, “the only real prediction you can make is that life will go on. And that it will be interesting.” Weisman’s gripping fantasy will make most readers hope that at least some of us can stick around long enough to see how it all turns out.

Fausto Intilla

Saturday, September 1, 2007

Secrets Of Red Tide Revealed


Source:

Science Daily — In work that could one day help prevent millions of dollars in economic losses for seaside communities, MIT chemists have demonstrated how tiny marine organisms likely produce the red tide toxin that periodically shuts down U.S. beaches and shellfish beds.
In the Aug. 31 cover story of Science, the MIT team describes an elegant method for synthesizing the lethal components of red tides. The researchers believe their method approximates the synthesis used by algae, a reaction that chemists have tried for decades to replicate, without success.
Understanding how and why red tides occur could help scientists figure out how to prevent the blooms, which cause significant ecological and economic damage. The New England shellfish industry, for example, lost tens of millions of dollars during a 2005 outbreak, and red tide killed 30 endangered manatees off the coast of Florida this spring.
The discovery by MIT Associate Professor Timothy Jamison and graduate student Ivan Vilotijevic not only could shed light on how algae known as dinoflagellates generate red tides, but could also help speed up efforts to develop cystic fibrosis drugs from a compound closely related to the toxin. Red tides, also known as algal blooms, strike unpredictably and poison shellfish, making them dangerous for humans to eat. It is unknown what causes dinoflagellates to produce the red tide toxins, but it may be a defense mechanism, possibly provoked by changes in the tides, temperature shifts or other environmental stresses.
One of the primary toxic components of red tide is brevetoxin, a large and complex molecule that is very difficult to synthesize.
Twenty-two years ago, chemist Koji Nakanishi of Columbia University proposed a cascade, or series of chemical steps, that dinoflagellates could use to produce brevetoxin and other red tide toxins. However, chemists have been unable to demonstrate such a cascade in the laboratory, and many came to believe that the "Nakanishi Hypothesis" would never be proven.
"A lot of people thought that this type of cascade may be impossible," said Jamison. "Because Nakanishi's hypothesis accounts for so much of the complexity in these toxins, it makes a lot of sense, but there hasn't really been any evidence for it since it was first proposed."
Jamison and Vilotijevic's work offers the first evidence that Nakanishi's hypothesis is feasible. Their work could also help accelerate drug discovery efforts. Brevenal, another dinoflagellate product related to the red tide toxins, has shown potential as a powerful treatment for cystic fibrosis (CF). It can also protect against the effects of the toxins.
"Now that we can make these complex molecules quickly, we can hopefully facilitate the search for even better protective agents and even more effective CF therapies," said Jamison.
Until now, synthesizing just a few milligrams of red tide toxin or related compounds, using a non-cascade method, required dozens of person-years of effort.
The new synthesis depends on two critical factors: giving the reaction a jump start and conducting the reaction in water.
Many red tide toxins possess a long chain of six-membered rings. However, the starting materials for the cascades, epoxy alcohols, tend to form five-membered rings. To overcome that, the researchers attached a "template" six-membered ring to one end of the epoxy alcohol. That simple step effectively launches the cascade of reactions that leads to the toxin chain, known as a ladder polyether.
"The trick is to give it a little push in the right direction and get it running smoothly," said Jamison.
The researchers speculate that in dinoflagellates, the initial jump start is provided by an enzyme instead of a template.
Conducting the reaction in water is also key to a successful synthesis. Water is normally considered a poor solvent for organic reactions, so most laboratory reactions are performed in organic solvents. However, when Vilotijevic introduced water into the reaction, he noticed that it proceeded much more quickly and selectively.
Although it could be a coincidence that these cascades work best in water and that dinoflagellates are marine organisms, water may nevertheless be directly involved in the biosynthesis of the toxins, or may emulate an important part of it, said Jamison. Because of this result, the researchers now believe that organic chemists should routinely try certain reactions in water as well as in organic solvents.
The research was funded by the National Institute of General Medical Sciences, Merck Research Laboratories, Boehringer Ingelheim, and MIT.
"This is an elegant piece of work with multiple levels of impact," said John Schwab, who manages organic chemistry research for the National Institute of General Medical Sciences. "Not only will it allow chemists to synthesize this important class of complex molecules much more easily, but it also provides key insights into how nature may make these same molecules. This is terrific bang for the taxpayers' buck!"
Note: This story has been adapted from a news release issued by Massachusetts Institute of Technology.

Fausto Intilla