Friday, November 30, 2007

Recipe For A Storm: Ingredients For More Powerful Atlantic Hurricanes


Source:

ScienceDaily (Nov. 30, 2007) — As the world warms, the interaction between the Atlantic Ocean and atmosphere may be the recipe for stronger, more frequent hurricanes.
University of Wisconsin-Madison scientists have found that the Atlantic organizes the ingredients for a powerful hurricane season so that either everything is conducive to hurricane activity or nothing is, potentially making the Atlantic more vulnerable to climate change than the world's other hurricane hot spots.
After the 2004 and 2005 hurricane seasons, many worry about what Atlantic hurricane seasons will look like in a warmer world. Evidence indicates that higher ocean temperatures add a lot of fuel to these devastating storms. In a paper published today in the "Bulletin of the American Meteorological Society," co-authors Jim Kossin and Dan Vimont caution against looking at only one piece of the puzzle. "Sea surface temperature is a bit overrated," says Kossin, an atmospheric scientist at UW-Madison's Cooperative Institute for Meteorological Satellite Studies. "It's part of a larger pattern."
Kossin and Vimont, a professor in the Department of Atmospheric and Oceanic Sciences, noticed that warmer water is just one part of a larger pattern indicating that conditions are right for more frequent, stronger hurricanes in the Atlantic. The atmosphere reacts to ocean conditions and the ocean reacts to the atmospheric situation, creating a distinct circulation pattern known as the Atlantic Meridional Mode (AMM). The AMM unifies the connections among the factors that influence hurricanes, such as ocean temperature, wind characteristics, and atmospheric moisture.
Finding that a basin-wide circulation pattern drives Atlantic hurricane activity helps explain evidence of significant differences in long-term hurricane trends among the world's basins. In a study published last February, Kossin and his co-authors created a more consistent record of hurricane data that accounted for the significant improvement in storm detection that followed the advent of weather satellites. An analysis of this recalibrated data showed that hurricanes have become stronger and more frequent in the Atlantic Ocean over the last two decades. The increasing trend, however, is harder to identify in the world's other oceans.
Kossin and Vimont wanted to determine why long-term trends in the Atlantic looked different from those in other basins, particularly in the Pacific, where the majority of the world's hurricane activity occurs. "The AMM helps us understand why hurricanes in the Atlantic react differently to climate changes than those in the Pacific," Vimont says. According to Vimont, the other oceanic basins have their own modes of variability.
Understanding how factors vary together provides a new framework from which to consider climate change and hurricanes. "Our study broadens the interpretation of the hurricane-climate relationship," Vimont says.
Looking at the larger set of varying conditions provides a more coherent understanding of how climate change affects hurricane activity. In the Atlantic, warmer water indicates that other conditions are also ideal for hurricane development. However, in the Pacific, a hurricane-friendly environment goes along with cooler ocean temperatures in the area where the storms spend their lives. The inconsistent relationship with sea surface temperature leads Vimont and Kossin to conclude that the connection between hurricane activity and climate variability hinges on more than just changes in ocean temperatures.
"You can never isolate one factor on this planet," Kossin says. "Everything is interrelated."
Depending on the other conditions hurricanes care about, warmer oceans can mean different outcomes. Concentrating on how the atmosphere and the ocean work together helps hurricane researchers see the bigger picture. Because higher sea surface temperatures in the Atlantic act in concert with the AMM, Vimont and Kossin suggest that Atlantic hurricanes will be more sensitive to climate changes than storms in other ocean basins.
In addition to helping researchers understand and predict the effects of climate change on hurricane activity, Vimont and Kossin can forecast the AMM up to a year in advance. If the AMM is positive, all the conditions are right for hurricane development. If it is negative, those living on the coasts can generally expect a quieter hurricane season. Vimont and Kossin plan to further develop their AMM forecasts for use during the hurricane season. The duo also hopes to continue to research the physical relationships that constitute the AMM as well as how future climate change will affect these modes of climate variability.
Adapted from materials provided by University of Wisconsin-Madison.

Fausto Intilla

Wednesday, November 28, 2007

High Resolution Antarctica Map Lays Ground For New Discoveries


Source:

ScienceDaily (Nov. 28, 2007) — A team of researchers from NASA, the U.S. Geological Survey, the National Science Foundation and the British Antarctic Survey unveiled a newly completed map of Antarctica November 27 that is expected to revolutionize research of the continent's frozen landscape.
The Landsat Image Mosaic of Antarctica is a result of NASA's state-of-the-art satellite technologies and an example of the prominent role NASA continues to play as a world leader in the development and flight of Earth-observing satellites.
The map is a realistic, nearly cloudless satellite view of the continent at a resolution 10 times greater than ever before with images captured by the NASA-built Landsat 7 satellite. With the unprecedented ability to see features half the size of a basketball court, the mosaic offers the most geographically accurate, true-color, high-resolution views of Antarctica possible.
"This mosaic of images opens up a window to the Antarctic that we just haven't had before," said Robert Bindschadler, chief scientist of the Hydrospheric and Biospheric Sciences Laboratory at NASA's Goddard Space Flight Center in Greenbelt, Md. "It will open new windows of opportunity for scientific research as well as enable the public to become much more familiar with Antarctica and how scientists use imagery in their research. This innovation is like watching high-definition TV in living color versus watching the picture on a grainy black-and-white television. These scenes don't just give us a snapshot, they provide a time-lapse historical record of how Antarctica has changed and will enable us to continue to watch changes unfold."
Researchers can use the detailed map to better plan scientific expeditions. The mosaic's higher resolution gives researchers a clearer view over most of the continent to help interpret changes in land elevation in hard-to-access areas. Scientists also think the true-color mosaic will help geologists better map various rock formations and types.
To construct the new Antarctic map, researchers pieced together more than a thousand images from three years of Landsat satellite observations. The resulting mosaic gives researchers and the public a new way to explore Antarctica through a free, public-access web portal. Eight different versions of the full mosaic are available to download.
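For readers who want to explore the underlying data, tiles downloaded from the portal can be stitched together locally with standard open-source tools. The snippet below is only a minimal sketch, not NASA's production processing chain; it assumes two hypothetical GeoTIFF tiles obtained from the LIMA site and merges them with the rasterio library.

import rasterio
from rasterio.merge import merge

# Hypothetical local copies of two LIMA GeoTIFF tiles (filenames are placeholders).
tile_paths = ["lima_tile_a.tif", "lima_tile_b.tif"]
sources = [rasterio.open(p) for p in tile_paths]

# Stitch the overlapping tiles into a single array plus an output geotransform.
mosaic, transform = merge(sources)

# Reuse the metadata of the first tile, updated for the merged extent.
profile = sources[0].profile
profile.update(height=mosaic.shape[1], width=mosaic.shape[2], transform=transform)

with rasterio.open("antarctica_subset.tif", "w", **profile) as dst:
    dst.write(mosaic)

for src in sources:
    src.close()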
In 1972, the first satellite images of the Antarctic became available with the launch of NASA's Earth Resources Technology Satellite (later renamed Landsat). The series of Landsat satellites has provided the longest continuous global record in existence of the land surface and its changes. Prior to these satellite views, researchers had to rely on airplanes and survey ships to map Antarctica's ice-covered terrain.
Images from the Landsat program, now managed by the U.S. Geological Survey, led to more precise and efficient research results as the resolution of digital images improved over the years with upgraded instruments on each new Earth-observing satellite.
"We have significantly improved our ability to extract useful information from satellites as embodied in this Antarctic mosaic project," said Ray Byrnes, liaison for satellite missions at the U.S. Geological Survey in Reston, Va. "As technology progressed, so have the satellites and their image resolution capability. The first three in the Landsat series were limited in comparison to Landsats 4, 5, and 7."
Bindschadler, who conceived the project, initiated NASA's collection of images of Antarctica for the mosaic project in 1999. He and NASA colleagues selected the images that make up the mosaic and developed new techniques to interpret the image data tailored to the project. The mosaic is made up of about 1,100 images from Landsat 7, nearly all of which were captured between 1999 and 2001. The collage contains almost no gaps in the landscape, other than a doughnut hole-shaped area at the South Pole, and shows virtually no seams.
"The mosaic represents an important U.S.-U.K. collaboration and is a major contribution to the International Polar Year," said Andrew Fleming of British Antarctic Survey in Cambridge, England. "Over 60,000 scientists are involved in the global International Polar Year initiative to understand our world. I have no doubt that polar researchers will find this mosaic, one of the first outcomes of that initiative, invaluable for planning science campaigns."
NASA has 14 Earth-observing satellites in orbit whose activities directly benefit humankind. After NASA develops and tests new technologies, the agency transfers those activities to other federal agencies. The satellites have helped revolutionize the information that emergency officials have for responding to natural disasters such as hurricanes and wildfires.
The Landsat Image Mosaic of Antarctica is now available on the Web at: http://lima.usgs.gov/
Adapted from materials provided by NASA/Goddard Space Flight Center.

Fausto Intilla

Tuesday, November 27, 2007

Number Of Tropical Storms In Recent Past Increasing


Source:

ScienceDaily (Nov. 27, 2007) — Counting tropical storms that occurred before the advent of aircraft and satellites relies on ships' logs and records of hurricane landfalls, leading many to believe that the numbers of historic tropical storms in the Atlantic are seriously undercounted. However, a statistical model based on the climate factors that influence Atlantic tropical storm activity shows that the estimates currently used are only slightly below the modeled numbers and indicates that the number of tropical storms has been increasing in the recent past, according to researchers.
"We are not the first to come up with an estimate of the number of undercounted storms," says Michael E. Mann, associate professor of meteorology, Penn State, and director of the Earth System Science Center.
In the past, some researchers assumed that a constant percentage of all the storms made landfall and so they compared the number of tropical storms making landfall with the total number of reported storms for that year. Other researchers looked at ship logs and ship tracks to determine how likely a tropical storm would have been missed. In the early 1900s and before, there were probably not sufficient ships crossing the Atlantic to ensure full coverage.
The researchers report in the current issue of Geophysical Research Letters "that the long-term record of historical Atlantic tropical cyclone counts is likely largely reliable, with an average undercount bias at most of approximately one tropical storm per year back to 1870."
This indicates that previously estimated undercounts of three or more storms per year are too high.
"We have a very accurate count of Atlantic tropical cyclones beginning in 1944 when aircraft became common," says Mann. "In the 1970s, satellites were added to that mix."
With more than 60 years of accurate hurricane counts, the researchers, who included Thomas Sabbatelli, an undergraduate in meteorology and the Schreyer Honors College at Penn State, and Urs Neu, a research scientist at ProClim, Swiss Academy of Sciences, looked at other, independent ways to determine the number of hurricanes before 1944.
They examined how the El Niño/La Niña cycle, the pattern of the Northern Hemisphere jet stream, and tropical Atlantic sea surface temperatures influence tropical storm generation, creating a model that includes these three climate variables. The necessary information is available back to 1870.
The statistical model proved successful in various tests of accuracy. Before the current season began, it also predicted 15 Atlantic tropical storms in total, with an error margin of four. So far, 14 storms have formed, with a little more than one week left in the season.
The model, trained on tropical storm occurrence data from 1944 to 2006, showed an undercount before 1944 of 1.2 storms per year. When the researchers assumed a possible undercount of three storms per year, their model predicted too few storms in total. With the available climate data, the model only works in the range of around 1.2 undercounted storms per year, and its findings were statistically significant.
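The press release does not spell out the model's form, but annual count models of this kind are commonly fit as Poisson regressions on climate covariates. The sketch below is only an illustration of that general approach, with synthetic data, made-up coefficients, and placeholder variable names (sst, enso, jet); it is not the authors' actual model or inputs.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_years = 63  # mimic a 1944-2006 training period

# Standardized climate covariates (synthetic stand-ins for the three predictors:
# tropical Atlantic SST, an El Nino/La Nina index, and a jet-stream-pattern index).
sst = rng.normal(size=n_years)
enso = rng.normal(size=n_years)
jet = rng.normal(size=n_years)

# Synthetic "observed" annual storm counts: warm SST raises the expected count,
# El Nino suppresses it. The coefficients are arbitrary, for illustration only.
expected = np.exp(2.2 + 0.3 * sst - 0.2 * enso + 0.1 * jet)
counts = rng.poisson(expected)

# Fit a Poisson regression of annual counts on the three covariates.
X = sm.add_constant(np.column_stack([sst, enso, jet]))
fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(fit.summary())

# Expected storm count for a hypothetical upcoming season, given forecast indices.
season = np.array([[1.0, 1.0, -0.5, 0.2]])  # constant, SST, ENSO, jet
print("Predicted storms:", float(fit.predict(season)[0]))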
"Fifty percent of the variation in storm numbers from one year to the next appears to be predictable in terms of the three key climate variables we used," says Mann. "The other 50 percent appears to be pure random variation. The model ties the increase in storm numbers over the past decade to increasing tropical ocean surface temperatures.
"We cannot explain the warming trend in the tropics without considering human impacts on climate. This is not a natural variation," says Mann.
"This . . . supports other work suggesting that increases in frequency, as well as powerfulness, of Atlantic tropical cyclones are potentially related to long-term trends in tropical Atlantic sea surface temperatures, trends that have in turn been connected to anthropogenic influences on climate," the researchers report.
Adapted from materials provided by Penn State.

Fausto Intilla

'Ultrasound' Of Earth's Crust Reveals Inner Workings Of A Tsunami Factory


Source:

ScienceDaily (Nov. 27, 2007) — Research just announced by a team of U.S. and Japanese geoscientists may help explain why part of the seafloor near the southwest coast of Japan is particularly good at generating devastating tsunamis, such as the 1944 Tonankai event, which killed at least 1,200 people. The findings will help scientists assess the risk of giant tsunamis in other regions of the world.
Geoscientists from The University of Texas at Austin and colleagues used a commercial ship to collect three-dimensional seismic data that reveals the structure of Earth's crust below a region of the Pacific seafloor known as the Nankai Trough. The resulting images are akin to ultrasounds of the human body.
The results, published in the journal Science, address a long-standing mystery as to why earthquakes below some parts of the seafloor trigger large tsunamis while earthquakes in other regions do not.
The 3D seismic images allowed the researchers to reconstruct how layers of rock and sediment have cracked and shifted over time. They found two things that contribute to big tsunamis. First, they confirmed the existence of a major fault that runs from a region known to unleash earthquakes about 10 kilometers (6 miles) deep right up to the seafloor. When an earthquake happens, the fault allows it to reach up and move the seafloor up or down, carrying a column of water with it and setting up a series of tsunami waves that spread outward.
Second, and most surprising, the team discovered that the recent fault activity, probably including the slip that caused the 1944 event, has shifted to landward branches of the fault, becoming shallower and steeper than it was in the past.
"That leads to more direct displacement of the seafloor and a larger vertical component of seafloor displacement that is more effective in generating tsunamis," said Nathan Bangs, senior research scientist at the Institute for Geophysics at The University of Texas at Austin who was co-principal investigator on the research project and co-author on the Science article.
The Nankai Trough is in a subduction zone, an area where two tectonic plates are colliding, pushing one plate down below the other. The grinding of one plate over the other in subduction zones leads to some of the world's largest earthquakes.
In 2002, a team of researchers led by Jin-Oh Park at Japan Marine Science and Technology Center (JAMSTEC) had identified the fault, known as a megathrust or megasplay fault, using less detailed two-dimensional geophysical methods. Based on its location, they suggested a possible link to the 1944 event, but they were unable to determine where faulting has been recently active.
"What we can now say is that slip has very recently propagated up to or near to the seafloor, and slip along these thrusts most likely caused the large tsunami during the 1944 Tonankai 8.1 magnitude event," said Bangs.
The images produced in this project will be used by scientists in the Nankai Trough Seismogenic Zone Experiment (NanTroSEIZE), an international effort designed to, for the first time, "drill, sample and instrument the earthquake-causing, or seismogenic portion of Earth's crust, where violent, large-scale earthquakes have occurred repeatedly throughout history."
"The ultimate goal is to understand what's happening at different margins," said Bangs. "The 2004 Indonesian tsunami was a big surprise. It's still not clear why that earthquake created such a large tsunami. By understanding places like Nankai, we'll have more information and a better approach to looking at other places to determine whether they have potential. And we'll be less surprised in the future."
Bangs' co-principal investigator was Gregory Moore of JAMSTEC in Yokohama and the University of Hawaii, Honolulu. The other co-authors are Emily Pangborn of the Institute for Geophysics at The University of Texas at Austin, Asahiko Taira and Shin'ichi Kuramoto of JAMSTEC, and Harold Tobin of the University of Wisconsin-Madison. Funding for the project was provided by the National Science Foundation, the Ocean Drilling Program and the Japanese Ministry of Education, Culture, Sports, Science and Technology.
Adapted from materials provided by University of Texas at Austin.

Fausto Intilla

Thursday, November 8, 2007

Scientists Enhance Mother Nature's Carbon Handling Mechanism


Source:

ScienceDaily (Nov. 8, 2007) — Taking a page from Nature herself, a team of researchers developed a method to enhance removal of carbon dioxide from the atmosphere and place it in the Earth's oceans for storage.
Unlike other proposed ocean sequestration processes, the new technology does not make the oceans more acidic and may even benefit coral reefs. The process is a manipulation of the natural weathering of volcanic silicate rocks. Reporting in the Nov. 7 issue of Environmental Science & Technology, the Harvard and Penn State team explained their method.
"The technology involves selectively removing acid from the ocean in a way that might enable us to turn back the clock on global warming," says Kurt Zenz House, graduate student in Earth and planetary sciences, Harvard University. "Essentially, our technology dramatically accelerates a cleaning process that Nature herself uses for greenhouse gas accumulation."
In natural silicate weathering, carbon dioxide from the atmosphere dissolves in fresh water and forms weak carbonic acid. As the water percolates through the soil and rocks, the carbonic acid converts to a solution of alkaline carbonate salts. This water eventually flows into the ocean and increases its alkalinity. An alkaline ocean can hold dissolved carbon, while an acidic one will release the carbon back into the atmosphere. The more weathering, the more carbon is transferred to the ocean where some of it eventually becomes part of the sea bottom sediments.
"In the engineered weathering process we have found a way to swap the weak carbonic acid with a much stronger one (hydrochloric acid) and thus accelerate the pace to industrial rates," says House.
The researchers minimize the potential for environmental problems by combining the acid removal with silicate rock weathering that mimics the natural process. The more alkaline ocean can then store carbon as bicarbonate, the most plentiful and innocuous form of carbon in the oceans.
According to House, this would allow removal of excess carbon dioxide from the atmosphere in a matter of decades rather than millennia.
Besides removing the greenhouse gas carbon dioxide from the atmosphere, this technique would counteract the continuing acidification of the oceans that threatens coral reefs and their biological communities. The technique can operate in remote areas on geothermal or natural gas power, and its effect is global rather than local. Unlike carbon dioxide scrubbers on power plants, the process can remove naturally generated carbon dioxide just as easily as carbon dioxide produced from burning fossil fuel for power.
The researchers caution that while they believe their scheme for reducing global warming is achievable, implementation would be ambitious and costly and would carry some environmental risks that require further study. In addition to Kurt House, the team includes Daniel P. Schrag, director of the Harvard University Center for the Environment and professor of Earth and planetary sciences, and Michael J. Aziz, the Gordon McKay Professor of Materials Science, both at Harvard University, as well as Kurt House's brother, Christopher H. House, associate professor of geosciences at Penn State. The process would involve building dozens of facilities, similar to large chlorine gas industrial plants, on volcanic rock coasts.
"This work shows how we can remove carbon dioxide on relevant timescales, but more work is be needed to bring down the cost and minimize other environmental effects," says Christopher H. House.
The Link Energy Foundation, Merck Fund of the New York Community Trust, U.S. DOE and NASA supported this work.
Adapted from materials provided by Penn State.

Fausto Intilla

Tuesday, November 6, 2007

Seismic Hazard: Stateline Fault System Is Major Component Of Eastern California Shear Zone

Source:
ScienceDaily (Nov. 6, 2007) — The 200-km (125-mile)-long Stateline fault system is a right-lateral strike-slip fault zone with clear Late Quaternary surface ruptures, extending along the California-Nevada state line from the Primm, Nevada, area along Interstate 15 to the Amargosa Valley.
The fault passes within 40 km of the Las Vegas strip, 10 km of the town center of Pahrump, Nevada, and appears to end near the town of Amargosa Valley, Nevada (about 40 km west-southwest of the site of the proposed high-level nuclear waste repository at Yucca Mountain).
This fault has long been considered inactive and of only minor importance to the tectonic pattern of eastern California and southwestern Nevada, whereas fault systems such as the Death Valley, Panamint Valley, and Owens Valley systems have received much more attention.
New research focused on the Stateline fault system is beginning to change how we view this fault zone. Guest et al. present geologic data that establishes the minimum offset on the southern segment of the fault system to be 30 ± 4 km over the last 13 million years.
This implies a minimum average slip rate for the southern segment of the fault system of 2.3 ± 0.35 mm/yr. This is twice the slip rate estimated from geodetic monitoring in the region, and therefore the fault is either in a transient period of slow slip or has been abandoned as activity in the eastern California shear zone has migrated west.
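As a quick check, that rate follows directly from dividing the offset by its age; the published uncertainty of ±0.35 mm/yr is slightly larger than this naive propagation, presumably because it also folds in the uncertainty on the 13-million-year age.

\[
\dot{s} \approx \frac{(30 \pm 4)\times 10^{6}\ \mathrm{mm}}{13\times 10^{6}\ \mathrm{yr}} \approx 2.3 \pm 0.3\ \mathrm{mm/yr}
\]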
The magnitude of accumulated offset, evidence for Late Quaternary slip, and rapid long-term slip rate indicate that the Stateline fault system is a major component of the Eastern California shear zone. Given its proximity to population centers and important infrastructure in southern Nevada, the fault warrants close scrutiny in seismic hazards analyses of the region.
Adapted from materials provided by Geological Society of America.

Fausto Intilla
www.oloscience.com