Sunday, December 23, 2007

Geologists Say 'Wall Of Africa' Allowed Humanity To Emerge


Source:

ScienceDaily (Dec. 22, 2007) — Scientists long have focused on how climate and vegetation allowed human ancestors to evolve in Africa. Now, University of Utah geologists are calling renewed attention to the idea that ground movements formed mountains and valleys, creating environments that favored the emergence of humanity.
"Tectonics [movement of Earth's crust] was ultimately responsible for the evolution of humankind," Royhan and Nahid Gani of the university's Energy and Geoscience Institute write in the January, 2008, issue of Geotimes, published by the American Geological Institute.
They argue that the accelerated uplift of mountains and highlands stretching from Ethiopia to South Africa blocked much ocean moisture, converting lush tropical forests into an arid patchwork of woodlands and savannah grasslands that gradually favored human ancestors who came down from the trees and started walking on two feet -- an energy-efficient way to search larger areas for food in an arid environment.
In their Geotimes article, the Ganis -- a husband-and-wife research team who met in college in their native Bangladesh -- describe this 3,700-mile-long stretch of highlands and mountains as "the Wall of Africa." It parallels the famed East African Rift valley, where many fossils of human ancestors were found.
"Because of the crustal movement or tectonism in East Africa, the landscape drastically changed over the last 7 million years," says Royhan Gani (pronounced rye-hawn Go-knee), a research assistant professor of civil and environmental engineering. "That landscape controlled climate on a local to regional scale. That climate change spurred human ancestors to evolve from apes."
Hominins -- the new scientific word for humans (Homo) and their ancestors (including Ardipithecus, Paranthropus and Australopithecus) -- split from apes on the evolutionary tree roughly 7 million to 4 million years ago. Royhan Gani says the earliest undisputed hominin was Ardipithecus ramidus 4.4 million years ago. The earliest Homo arose 2.5 million years ago, and our species, Homo sapiens, almost 200,000 years ago.
Tectonics -- movements of Earth's crust, including its ever-shifting tectonic plates and the creation of mountains, valleys and ocean basins -- has been discussed since at least 1983 as an influence on human evolution.
But Royhan Gani says much previous discussion of how climate affected human evolution involves global climate changes, such as those caused by cyclic changes in Earth's orbit around the sun, and not local and regional climate changes caused by East Africa's rising landscape.
A Force from within the Earth
The geological or tectonic forces shaping Africa begin deep in the Earth, where a "superplume" of hot and molten rock has swelled upward for at least the past 45 million years. This superplume and its branching smaller plumes help push apart the African and Arabian tectonic plates of Earth's crust, forming the Red Sea, Gulf of Aden and the Great Rift Valley that stretches from Syria to southern Africa.
As part of this process, Africa is being split apart along the East African Rift, a valley bounded by elevated "shoulders" a few tens of miles wide that sit atop "domes" a few hundred miles wide, themselves caused by upward bulging of the plume.
The East African Rift runs about 3,700 miles from the Ethiopian Plateau south-southwest to South Africa's Karoo Plateau. It is up to 370 miles wide and includes mountains reaching a maximum elevation of about 19,340 feet at Mount Kilimanjaro.
The rift "is characterized by volcanic peaks, plateaus, valleys and large basins and freshwater lakes," including sites where many fossils of early humans and their ancestors have been found, says Nahid Gani (pronounced nah-heed go-knee), a research scientist. There was some uplift in East Africa as early as 40 million years ago, but "most of these topographic features developed between 7 million and 2 million years ago."
A Wall Rises and New Species Evolve
"Although the Wall of Africa started to form around 30 million years ago, recent studies show most of the uplift occurred between 7 million and 2 million years ago, just about when hominins split off from African apes, developed bipedalism and evolved bigger brains," the Ganis write.
"Nature built this wall, and then humans could evolve, walk tall and think big," says Royhan Gani. "Is there any characteristic feature of the wall that drove human evolution?"
The answer, he believes, is the variable landscape and vegetation resulting from uplift of the Wall of Africa, which created "a topographic barrier to moisture, mostly from the Indian Ocean" and dried the climate. He says that contrary to those who cite global climate cycles, the climate changes in East Africa were local and resulted from the uplift of different parts of the wall at different times.
Royhan Gani says the change from forests to a patchwork of woodland and open savannah did not happen everywhere in East Africa at the same time, and the changes also happened in East Africa later than elsewhere in the world.
The Ganis studied the roughly 300-mile-by-300-mile Ethiopian Plateau -- the most prominent part of the Wall of Africa. Previous research indicated the plateau reached its present average elevation of 8,200 feet 25 million years ago. The Ganis analyzed rates at which the Blue Nile River cut down into the Ethiopian Plateau, creating a canyon that rivals North America's Grand Canyon. They released those findings in the September 2007 issue of GSA Today, published by the Geological Society of America.
The conclusion: There were periods of low-to-moderate incision and uplift between 29 million and 10 million years ago, and again between 10 million and 6 million years ago, but the most rapid uplift of the Ethiopian Plateau (by some 3,200 vertical feet) happened 6 million to 3 million years ago.
The Geotimes paper says other research has shown the Kenyan part of the wall rose mostly between 7 million and 2 million years ago, mountains in Tanganyika and Malawi were uplifted mainly between 5 million and 2 million years ago, and the wall's southernmost end gained most of its elevation during the past 5 million years.
"Clearly, the Wall of Africa grew to be a prominent elevated feature over the last 7 million years, thereby playing a prominent role in East African aridification by wringing moisture out of monsoonal air moving across the region," the Ganis write. That period coincides with evolution of human ancestors in the area.
Royhan Gani says the earliest undisputed evidence of true bipedalism (as opposed to knuckle-dragging by apes) is 4.1 million years ago in Australopithecus anamensis, but some believe the trait existed as early as 6 million to 7 million years ago.
The Ganis speculate that the shaping of varied landscapes by tectonic forces -- lake basins, valleys, mountains, grasslands, woodlands -- "could also be responsible, at a later stage, for hominins developing a bigger brain as a way to cope with these extremely variable and changing landscapes" in which they had to find food and survive predators.
For now, Royhan Gani acknowledges the lack of more precise timeframes makes it difficult to link specific tectonic events to the development of upright walking, bigger brains and other key steps in human evolution.
"But it all happened within the right time period," he says. "Now we need to nail it down."
Adapted from materials provided by University of Utah.

Fausto Intilla

Monday, December 17, 2007

Without Its Insulating Ice Cap, Arctic Surface Waters Warm To As Much As 5 C Above Average


Source:

ScienceDaily (Dec. 17, 2007) — Record-breaking amounts of ice-free water have deprived the Arctic of more of its natural "sunscreen" than ever in recent summers. The effect is so pronounced that sea surface temperatures rose to 5 C above average in one place this year, a high never before observed, says the oceanographer who has compiled the first-ever look at average sea surface temperatures for the region.
Such superwarming of surface waters can affect how thick ice grows back in the winter, as well as its ability to withstand melting the next summer, according to Michael Steele, an oceanographer with the University of Washington's Applied Physics Laboratory. Indeed, since September, the end of summer in the Arctic, winter freeze-up in some areas has been running two months later than usual.
The extra ocean warming also might be contributing to some changes on land, such as previously unseen plant growth in the coastal Arctic tundra, if heat coming off the ocean during freeze-up is making its way over land, says Steele, who spoke December 12 at the American Geophysical Union meeting in San Francisco.
He is lead author of "Arctic Ocean surface warming trends over the past 100 years," accepted for publication in AGU's Geophysical Research Letters. Co-authors are physicist Wendy Ermold and research scientist Jinlun Zhang, both of the UW Applied Physics Laboratory.
"Warming is particularly pronounced since 1995, and especially since 2000," the authors write. The spot where waters were 5 C above average was in the region just north of the Chakchi Sea. The historical average temperature there is -1 C -- remember that the salt in ocean water keeps it liquid at temperatures that would cause fresh water to freeze. This year water in that area warmed to 4 C, for a 5-degree change from the average.
That general area, the part of the ocean north of Alaska and Eastern Siberia that includes the Bering Strait and Chukchi Sea, experienced the greatest summer warming. Temperatures for that region were generally 3.5 C warmer than historical averages and 1.5 C warmer than the historical maximum.
Such widespread warming in those areas and elsewhere in the Arctic is probably the result of having increasing amounts of open water in the summer that readily absorb the sun's rays, Steele says. Hard, white ice, on the other hand, can work as a kind of sunscreen for the waters below, reflecting rather than absorbing sunlight. The warming also may be partly caused by increasing amounts of warmer water coming from the Pacific Ocean, something scientists have noted in recent years.
The Arctic has been primed for more open water since the early 1990s as the sea-ice cover has thinned, due to a warming atmosphere and more frequent strong winds sweeping ice out of the Arctic Ocean via Fram Strait into the Atlantic Ocean, where the ice melts. The wind effect was particularly strong in the summer of 2007.
Now the situation could be self-perpetuating, Steele says. For example, he calculates that having more heat in surface waters in recent years means 23 to 30 inches less ice will grow in the winter than formed in 1965. Since sea ice typically grows about 80 inches in a winter, that is a significant fraction of ice that's going missing, he says.
Then too, higher sea surface temperatures can delay the start of freeze-up because the extra heat must be discharged from the upper ocean before ice can form. "The effect on net winter growth would probably be negligible for a delay of several weeks, but could be substantial for delays of several months," the authors write.
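The arithmetic behind those sea surface and ice-growth figures is easy to reproduce. The short Python sketch below is a minimal illustration using only the numbers quoted above (the Chukchi-area temperatures and Steele's ice-growth estimates); it is not taken from the study itself.

```python
# Minimal sketch reproducing the arithmetic quoted above; all numbers come
# from the article text, nothing here is new data.

# Sea surface temperature anomaly in the region north of the Chukchi Sea
historical_avg_c = -1.0   # long-term average SST (deg C)
observed_2007_c = 4.0     # SST observed there in 2007 (deg C)
anomaly_c = observed_2007_c - historical_avg_c
print(f"SST anomaly: {anomaly_c:.1f} C above average")        # 5.0 C

# Fraction of a typical winter's ice growth lost to the extra ocean heat
typical_growth_in = 80.0            # typical winter ice growth (inches)
growth_deficit_in = (23.0, 30.0)    # estimated reduction vs. 1965 (inches)
for deficit in growth_deficit_in:
    share = deficit / typical_growth_in
    print(f"{deficit:.0f} in less growth = {share:.0%} of a typical winter")
# Roughly 29% to 38% of the usual winter growth, the "significant fraction"
# Steele refers to.
```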
The work is funded by the National Science Foundation.
Adapted from materials provided by University of Washington.

Fausto Intilla

Friday, December 14, 2007

Natural Climate Changes Can Intensify Hurricanes More Efficiently Than Global Warming


Source:

ScienceDaily (Dec. 13, 2007) — Natural climate variations, which tend to involve localized changes in sea surface temperature, may have a larger effect on hurricane activity than the more uniform patterns of global warming, a report in Nature suggests.
In the debate over the effect of global warming on hurricanes, it is generally assumed that warmer oceans provide a more favorable environment for hurricane development and intensification. However, several other factors, such as atmospheric temperature and moisture, also come into play.
Drs. Gabriel A. Vecchi of the NOAA Geophysical Fluid Dynamics Laboratory and Brian J. Soden from the University of Miami Rosenstiel School of Marine & Atmospheric Science analyzed climate model projections and observational reconstructions to explore the relationship between changes in sea surface temperature and tropical cyclone 'potential intensity' - a measure that provides an upper limit on cyclone intensity.
They found that warmer oceans do not alone produce a more favorable environment for storms because the effect of remote warming can counter, and sometimes overwhelm, the effect of local surface warming. "Warming near the storm acts to increase the potential intensity of hurricanes, whereas warming away from the storms acts to decrease their potential intensity," Vecchi said.
Their study found that long-term changes in potential intensity are more closely related to the regional pattern of warming than to local ocean temperature change. Regions that warm more than the tropical average are characterized by increased potential intensity, and vice versa. "A surprising result is that the current potential intensity for Atlantic hurricanes is about average, despite the record high temperatures of the Atlantic Ocean over the past decade," Soden said. "This is due to the compensating warmth in other ocean basins."
"As we try to understand the future changes in hurricane intensity, we must look beyond changes in Atlantic Ocean temperature. If the Atlantic warms more slowly than the rest of the tropical oceans, we would expect a decrease in the upper limit on hurricane intensity," Vecchi added. "This is an interesting piece of the puzzle."
"While these results challenge some current notions regarding the link between climate change and hurricane activity, they do not contradict the widespread scientific consensus on the reality of global warming," Soden noted.
The journal article is entitled "Effect of Remote Sea Surface Temperature Change on Tropical Cyclone Potential Intensity."
Adapted from materials provided by University of Miami Rosenstiel School of Marine & Atmospheric Science.

Fausto Intilla

Source:

ScienceDaily (Dec. 13, 2007) — The decade of 1998-2007 is the warmest on record, according to data sources obtained by the World Meteorological Organization (WMO). The global mean surface temperature for 2007 is currently estimated at 0.41°C/0.74°F above the 1961-1990 annual average of 14.00°C/57.20°F.
The University of East Anglia and the Met Office's Hadley Centre have released preliminary global temperature figures for 2007, which show the top 11 warmest years all occurring in the last 13 years. The provisional global figure for 2007, based on data from January to November, currently places the year as the seventh warmest in records dating back to 1850.
Other remarkable global climatic events recorded so far in 2007 include record-low Arctic sea ice extent, which led to first recorded opening of the Canadian Northwest Passage; the relatively small Antarctic Ozone Hole; development of La Niña in the central and eastern Equatorial Pacific; and devastating floods, drought and storms in many places around the world.
The preliminary information for 2007 is based on climate data up to the end of November from networks of land-based weather stations, ships and buoys, as well as satellites. The data are continually collected and disseminated by the National Meteorological and Hydrological Services (NMHS) of WMO’s 188 Members and several collaborating research institutions. Final updates and figures for 2007 will be published in March 2008 in the annual WMO brochure for the Statement on the Status of the Global Climate.
WMO’s global temperature analyses are based on two different sources. One is the combined dataset maintained by both the Hadley Centre of the UK Meteorological Office, and the Climatic Research Unit, University of East Anglia, UK, which at this stage ranked 2007 as the seventh warmest on record. The other dataset is maintained by the US Department of Commerce’s National Oceanic and Atmospheric Administration (NOAA), which indicated that 2007 is likely to be the fifth warmest on record.
Since the start of the 20th century, the global average surface temperature has risen by 0.74°C. But this rise has not been continuous. The linear warming trend over the last 50 years (0.13°C per decade) is nearly twice that for the last 100 years.
According to the Intergovernmental Panel on Climate Change’s 4th Assessment (Synthesis) Report, 2007, “warming of the climate system is unequivocal, as is now evident from observations of increases in global average air and ocean temperatures, widespread melting of snow and ice, and rising global average sea level.”
2007 global temperatures have been averaged separately for both hemispheres. Surface temperatures for the northern hemisphere are likely to be the second warmest on record, at 0.63°C above the 30-year mean (1961-90) of 14.6°C/58.3°F. The southern hemisphere temperature is 0.20°C higher than the 30-year average of 13.4°C/56.1°F, making it the ninth warmest in the instrumental record since 1850.
January 2007 was the warmest January in the global average temperature record at 12.7°C/54.9°F, compared to the 1961-1990 January long-term average of 12.1°C/53.8°F.
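A note on the paired Celsius/Fahrenheit values above: an absolute temperature converts with the familiar multiply-by-1.8-and-add-32 rule, but an anomaly, being a difference between two temperatures, converts with the 1.8 scale factor alone. The small sketch below simply reproduces the figures quoted above to make them easy to check.

```python
# Reproducing the WMO temperature figures quoted above in both units.
# Absolute temperatures: F = 1.8 * C + 32.  Anomalies (differences): the
# +32 offset cancels, so only the factor of 1.8 applies.

def c_to_f_absolute(temp_c):
    return 1.8 * temp_c + 32.0

def c_to_f_anomaly(delta_c):
    return 1.8 * delta_c

print(c_to_f_absolute(14.00))   # 1961-1990 global annual mean -> 57.2 F
print(c_to_f_anomaly(0.41))     # 2007 anomaly -> ~0.74 F
print(c_to_f_absolute(12.7))    # January 2007 global mean -> ~54.9 F
print(c_to_f_absolute(12.1))    # January 1961-1990 mean -> ~53.8 F
```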
Regional temperature anomalies
2007 started with record breaking temperature anomalies throughout the world. In parts of Europe, winter and spring ranked amongst the warmest ever recorded, with anomalies of more than 4°C above the long-term monthly averages for January and April.
Extreme high temperatures occurred in much of Western Australia from early January to early March, with February temperatures more than 5°C above average.
Two extreme heat waves affected south-eastern Europe in June and July, breaking previous records with daily maximum temperatures exceeding 40°C/104°F in some locations, including up to 45°C/113°F in Bulgaria. Dozens of people died and fire-fighters battled blazes devastating thousands of hectares of land. A severe heat wave occurred across the southern United States of America during much of August with more than 50 deaths attributed to excessive heat. August to September 2007 was extremely warm in parts of Japan, setting a new national record of absolute maximum temperature of 40.9°C/105.6°F on 16 August.
In contrast, Australia recorded its coldest ever June with the mean temperature dropping to 1.5°C below normal. South America experienced an unusually cold winter (June-August), bringing winds, blizzards and rare snowfall to various provinces with temperatures falling to -22°C/-7.6°F in Argentina and -18°C/-0.4°F in Chile in early July.
Prolonged drought
Across North America, severe to extreme drought was present across large parts of the western U.S. and Upper Midwest, including southern Ontario/Canada, for much of 2007. More than three-quarters of the Southeast U.S. was in drought from mid-summer into December, but heavy rainfall led to an end of drought in the southern Plains.
In Australia, while conditions were not as severely dry as in 2006, long term drought meant water resources remained extremely low in many areas. Below average rainfall over the densely populated and agricultural regions resulted in significant crop and stock losses, as well as water restrictions in most major cities.
China experienced its worst drought in a decade, affecting nearly 40 million hectares of farmland. Tens of millions of people suffered from water restrictions.
Flooding and intense storms
Flooding affected many African countries in 2007. In February, Mozambique experienced its worst flooding in six years, killing dozens, destroying thousands of homes and flooding 80,000 hectares of crops in the Zambezi valley.
In Sudan, torrential rains caused flash floods in many areas in June/July, affecting over 410,000 people, including 200,000 left homeless. The strong southwesterly monsoon resulted in one of the heaviest July-September rainfall periods, triggering widespread flash floods affecting several countries in West Africa, Central Africa and parts of the Greater Horn of Africa. Some 1.5 million people were affected and hundreds of thousands of homes destroyed.
In Bolivia, flooding in January-February affected nearly 200,000 people and 70,000 hectares of cropland. Strong storms brought heavy rain that caused extreme flooding in the littoral region of Argentina in late March/early April. In early May, Uruguay was hit by its worst flooding since 1959, with heavy rain producing floods that affected more than 110,000 people and severely damaged crops and buildings. Triggered by storms, massive flooding in Mexico in early November destroyed the homes of half a million people and seriously affected the country’s oil industry.
In Indonesia, massive flooding on Java in early February killed dozens and covered half of the city of Jakarta with up to 3.7 metres of water. Heavy rains in June ravaged areas across southern China, with flooding and landslides affecting over 13.5 million people and killing more than 120. Monsoon-related extreme rainfall events caused the worst flooding in years in parts of South Asia. About 25 million people were affected in the region, especially in India, Pakistan, Bangladesh and Nepal. Thousands lost their lives. However, rainfall for India during the summer monsoon season (June-September) was generally near normal (105% of the long-term average), but with marked differences in the distribution of rainfall in space and time.
A powerful storm system, Kyrill, affected much of northern Europe during 17-18 January 2007 with torrential rains and winds gusting up to 170km/h. There were at least 47 deaths across the region, with disruptions in electric supply affecting tens of thousands during the storm.
England and Wales recorded their wettest May-July period since records began in 1766, receiving 406 mm of rain compared with the previous record of 349 mm in 1789. Extensive flooding in the region killed nine people and caused more than US$6 billion in damages.
Development of La Niña
The brief El Niño event of late 2006 quickly dissipated in January 2007, and La Niña conditions became well established across the central and eastern Equatorial Pacific in the latter half of 2007.
In addition to La Niña, unusual sea surface temperature patterns with cooler than normal values across the north of Australia to the Indian Ocean, and warmer than normal values in the Western Indian Ocean, were recorded. These are believed to have modified the usual La Niña impacts in certain regions around the world.
The current La Niña is expected to continue into the first quarter of 2008 at least.
Devastating tropical cyclones
Twenty-four named tropical storms developed in the North-West Pacific during 2007, below the annual average of 27. Fourteen storms were classified as typhoons, equalling the annual average. Tropical cyclones affected millions in south-east Asia, with typhoons Pabuk, Krosa, Lekima and tropical storms like Peipah among the severest.
During the 2007 Atlantic Hurricane season, 14 named storms occurred, compared to the annual average of 12, with 6 being classified as hurricanes, equalling the average. For the first time since 1886, two category 5 hurricanes (Dean and Felix) made landfall in the same season.
In February, tropical cyclone Gamède set a new worldwide rainfall record on the French island of La Reunion, with 3,929 mm measured within three days.
In June, cyclone Gonu made landfall in Oman, affecting more than 20,000 people and killing 50, before reaching the Islamic Republic of Iran. There is no record of a tropical cyclone hitting Iran since 1945.
On 15 November, tropical cyclone Sidr made landfall in Bangladesh, generating winds of up to 240 km/h and torrential rains. More than 8.5 million people were affected and over 3,000 died. Nearly 1.5 million houses were damaged or destroyed. Often hit by cyclones, Bangladesh has developed a network of cyclone shelters and a storm early-warning system, which significantly reduced casualties.
Australia’s 2006/2007 tropical season was unusually quiet, with only five tropical cyclones recorded, equalling the lowest number observed since at least 1943-44.
Relatively small Antarctic ozone hole
The 2007 Antarctic ozone hole was relatively small due to mild stratospheric winter temperatures. Since 1998, only the 2002 and 2004 ozone holes were smaller. In 2007, the ozone hole reached a maximum of 25 million square kms in mid-September, compared to 29 million square kms in the record years of 2000 and 2006. The ozone mass deficit reached 28 megatonnes on 23 September, compared to more than 40 megatonnes in the record year of 2006.
Record-low Arctic sea ice extent opened the Northwest Passage
Following the Arctic sea ice melt season, which ends annually in September at the end of the northern summer, the average “sea ice extent” was 4.28 million square kms, the lowest on record. The sea ice extent in September 2007 was 39% below the long-term 1979-2000 average, and 23% below the previous record set just two years earlier in September 2005. The September rate of sea ice decline since 1979 is now approximately 10% per decade, or 72,000 square kms per year.
For the first time in recorded history, the disappearance of ice across parts of the Arctic opened the Canadian Northwest Passage for about five weeks starting 11 August. Nearly 100 voyages in normally ice-blocked waters sailed without the threat of ice.
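The sea-ice figures above can be cross-checked against one another. The sketch below is a rough consistency check using only the numbers quoted in the text, with the 1979-2000 baseline inferred from them rather than taken from the satellite record.

```python
# Rough consistency check of the Arctic sea-ice figures quoted above.

extent_2007_km2 = 4.28e6      # September 2007 average extent (square km)
deficit_vs_baseline = 0.39    # "39% below the long-term 1979-2000 average"

baseline_km2 = extent_2007_km2 / (1.0 - deficit_vs_baseline)
print(f"Implied 1979-2000 average: {baseline_km2/1e6:.2f} million sq km")  # ~7.0

decline_per_year_km2 = 0.10 * baseline_km2 / 10.0   # ~10% per decade
print(f"~10%/decade decline: {decline_per_year_km2:,.0f} sq km per year")
# ~70,000 sq km per year, comparable to the ~72,000 quoted in the text.
```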
Sea level rise continues
Sea level continued to rise at rates substantially above the 20th-century average of about 1.7 mm per year. Measurements show that the 2007 globally averaged sea level is about 20 cm higher than the 1870 estimate. Modern satellite measurements show that, since 1993, globally averaged sea level has been rising at about 3 mm per year.
Global 10 warmest years -- mean global temperature anomaly (°C, relative to 1961-1990):
1998 0.52
2005 0.48
2003 0.46
2002 0.46
2004 0.43
2006 0.42
2007 (Jan-Nov) 0.41
2001 0.40
1997 0.36
1995 0.28
UK 10 warmest years -- mean UK temperature anomaly (°C, relative to 1971-2000):
2006 +1.15
2007 (Jan to 10 Dec) +1.10
2003 +0.92
2004 +0.89
2002 +0.89
2005 +0.87
1990 +0.83
1997 +0.82
1949 +0.80
1999 +0.78
Adapted from materials provided by World Meteorological Organization.

Fausto Intilla

'Magma P.I.' Unearths Clues To How Earth's Crust Was Sculpted


Source:

ScienceDaily (Dec. 14, 2007) — About a decade ago, Johns Hopkins University geologist Bruce Marsh challenged the century-old concept that the Earth's outer layer formed when crystal-free molten rock called magma oozed to the surface from giant subterranean chambers hidden beneath volcanoes.
Marsh's theory -- that the deep-seated plumbing underneath volcanoes is actually made up of an extensive system of smaller sheet-like chambers vertically interconnected with each other and transporting a crystal-laden "magmatic mush" to the surface -- has become far more widely accepted. This sort of system, known as a "magmatic mush column," is thought to exist beneath all of the world's major volcanic centers.
Now, Marsh -- using the windswept McMurdo Dry Valleys of Antarctica as his "walk in" laboratory -- posits that these channels did more than simply transport or supply magma and crystals to form the Earth's surface: As the magma pushed up through the earth, the pressure fractured the crust in such a way that it provided a sort of "template," guiding later erosion in sculpting a series of valleys and mountain ranges there.
Marsh described his latest findings to fellow scientists at a recent meeting of the American Geological Society.
"As the magma made its way to the surface, the pressure broke the crust up into pieces," Marsh says. "That fracturing reflected a pattern of stress in the same way that a windshield put under pressure will eventually fracture and the pattern of the broken glass would reflect where the stress was originally applied.
"Magma then seeped in," he says, "and 'welded' the fractures, sealing them temporarily until erosion -- in the form of snow, rain, ice and wind -- went to work on these weaknesses, carving out valleys, mountains and other landforms that we see there today and marking where the solidified magma originally was."
Marsh said that, in Antarctica, both of these functions date back at least 180 million years to the time when the continents split apart. He points out that this observation brings together the usually disparate study of deep-seated magmatic processes and land-surface evolution.
"It's one of those situations where, usually, never the twain shall meet, but they do in this case," the earth scientist said. "Having recognized evidence in this critical process in the McMurdo Dry Valleys is important because it may allow us to recognize it in other areas where the geologic record is scantier and less complete."
The Dry Valleys region is an ideal place to study these systems because it was eroded into its present form millions of years ago and, unlike the rest of Earth's surface, has undergone very little subsequent erosion. Marsh's colleagues George Denton of the University of Maine and David Marchant of Boston University call this region "a relic landscape" because it is the only known place on Earth that looks almost exactly as it did millions of years ago.
"The delicacy of the landscape in the Dry Valleys has preserved for us an unusually rich collection of geologic evidence of the processes that formed this terrain," Marsh said.
For more than a quarter of a century, Marsh -- who could be thought of by fans of 1980s detective television shows as sort of a "Magma P.I." -- has been working to understand the deep underground systems that bring magma to the Earth's surface. In 1993, he found the Dry Valleys, a walk-in "museum" that he calls "the one place on earth where the plumbing system is exposed in this way."
"You can stand on shelves of solidified lava that were deposited by magmatic activity 180 million years ago," he said. "It's awe inspiring."
Adapted from materials provided by Johns Hopkins University.

Fausto Intilla

Wednesday, December 12, 2007

NASA Spacecraft Make New Discoveries About Northern Lights


Source:

ScienceDaily (Dec. 12, 2007) — A fleet of NASA spacecraft, launched less than eight months ago, has made three important discoveries about spectacular eruptions of Northern Lights called "substorms" and the source of their power.
NASA's Time History of Events and Macroscale Interactions during Substorms (THEMIS) mission observed the dynamics of a rapidly developing substorm, confirmed the existence of giant magnetic ropes and witnessed small explosions in the outskirts of Earth's magnetic field. The findings will be presented at the annual meeting of the American Geophysical Union in San Francisco in December.
The discoveries began on March 23, when a substorm erupted over Alaska and Canada, producing vivid auroras for more than two hours. A network of ground cameras organized to support THEMIS photographed the display from below while the satellites measured particles and fields from above.
"The substorm behaved quite unexpectedly," says Vassilis Angelopoulos, the mission's principal investigator at the University of California, Los Angeles. "The auroras surged westward twice as fast as anyone thought possible, crossing 15 degrees of longitude in less than one minute. The storm traversed an entire polar time zone, or 400 miles, in 60 seconds flat."
Photographs taken by ground cameras and NASA's Polar satellite (also supporting the THEMIS mission) revealed a series of staccato outbursts each lasting about 10 minutes. Angelopoulos said that some of the bursts died out while others reinforced each other and went on to become major onsets.
Angelopoulos was quite impressed with the substorm's power and estimated the total energy of the two-hour event at five hundred thousand billion joules -- equivalent to the energy of a magnitude 5.5 earthquake. Where does all that energy come from? THEMIS may have found the answer.
"The satellites have found evidence of magnetic ropes connecting Earth's upper atmosphere directly to the sun," said David Sibeck, project scientist for the mission at NASA's Goddard Space Flight Center, Greenbelt, Md. "We believe that solar wind particles flow in along these ropes, providing energy for geomagnetic storms and auroras."
A magnetic rope is a twisted bundle of magnetic fields organized much like the twisted hemp of a mariner's rope. Spacecraft have detected hints of these ropes before, but a single spacecraft was insufficient to map their 3D structure. THEMIS' five identical micro-satellites were able to perform the feat.
"THEMIS encountered its first magnetic rope on May 20," said Sibeck. "It was very large, about as wide as Earth, and located approximately 40,000 miles (70,000 km) above Earth's surface in a region called the magnetopause." The magnetopause is where the solar wind and Earth's magnetic field meet and push against one another like sumo wrestlers locked in combat. There, the rope formed and unraveled in just a few minutes, providing a brief but significant conduit for solar wind energy.
THEMIS also has observed a number of small explosions in Earth's magnetic bow shock. "The bow shock is like the bow wave in front of a boat," explained Sibeck. "It is where the solar wind first feels the effects of Earth's magnetic field. Sometimes a burst of electrical current within the solar wind will hit the bow shock and--Bang! We get an explosion."
The THEMIS satellites are equipped with instruments that measure ions, electrons and electromagnetic radiation in space. The satellites will line up along the sun-Earth line next February to perform their key measurements. Researchers expect to observe, for the first time, the origin of substorm onsets in space and learn more about their evolution. Scientists from the US, Canada, Western Europe, Russia and Japan are contributing to the scientific investigation over the next two years.
THEMIS is the fifth medium-class mission under NASA's Explorer Program, which provides frequent flight opportunities for world-class scientific investigations within the heliophysics and astrophysics science areas.
The Explorer Program Office at Goddard manages the NASA-funded THEMIS mission. The University of California, Berkeley's Space Sciences Laboratory is responsible for project management, science and ground-based instruments, mission integration and post launch operations. ATK (formerly Swales Aerospace), Beltsville, Md., built the THEMIS probes.
Adapted from materials provided by NASA/Goddard Space Flight Center.

Fausto Intilla

Greenland Melt Accelerating, According To Climate Scientist


Source:

ScienceDaily (Dec. 12, 2007) — The 2007 melt extent on the Greenland ice sheet broke the 2005 summer melt record by 10 percent, making it the largest ever recorded there since satellite measurements began in 1979, according to a University of Colorado at Boulder climate scientist.
The melting increased by about 30 percent for the western part of Greenland from 1979 to 2006, with record melt years in 1987, 1991, 1998, 2002, 2005 and 2007, said CU-Boulder Professor Konrad Steffen, director of the Cooperative Institute for Research in Environmental Sciences. Air temperatures on the Greenland ice sheet have increased by about 7 degrees Fahrenheit since 1991, primarily a result of the build-up of greenhouse gases in Earth's atmosphere, according to scientists.
Steffen gave a presentation on his research at the fall meeting of the American Geophysical Union held in San Francisco from Dec. 10 to Dec. 14. His team used data from the Defense Meteorology Satellite Program's Special Sensor Microwave Imager aboard several military and weather satellites to chart the area of melt, including rapid thinning and acceleration of ice into the ocean at Greenland's margins.
Steffen maintains an extensive climate-monitoring network of 22 stations on the Greenland ice sheet known as the Greenland Climate Network, transmitting hourly data via satellites to CU-Boulder to study ice-sheet processes.
Although Greenland has been thickening at higher elevations due to increases in snowfall, the gain is more than offset by an accelerating mass loss, primarily from rapidly thinning and accelerating outlet glaciers, Steffen said. "The amount of ice lost by Greenland over the last year is the equivalent of two times all the ice in the Alps, or a layer of water more than one-half mile deep covering Washington, D.C."
The Jacobshavn Glacier on the west coast of the ice sheet, a major Greenland outlet glacier draining roughly 8 percent of the ice sheet, has sped up nearly twofold in the last decade, he said. Nearby glaciers showed an increase in flow velocities of up to 50 percent during the summer melt period as a result of melt water draining to the ice-sheet bed, he said.
"The more lubrication there is under the ice, the faster that ice moves to the coast," said Steffen. "Those glaciers with floating ice 'tongues' also will increase in iceberg production."
Greenland is about one-fourth the size of the United States, and about 80 percent of its surface area is covered by the massive ice sheet. Greenland hosts about one-twentieth of the world's ice -- the equivalent of about 21 feet of global sea rise. The current contribution of Greenland ice melt to global sea levels is about 0.5 millimeters annually.
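The 0.5 millimetre per year contribution can be translated into an implied volume of meltwater. The sketch below is a back-of-the-envelope estimate; the global ocean surface area it uses (about 3.6 x 10^8 square km) is an assumed round number, not a figure from the article.

```python
# Back-of-the-envelope: what volume of meltwater corresponds to Greenland's
# quoted 0.5 mm/year sea-level contribution?  The ocean area is an assumed
# round number, not a value from the article.

ocean_area_km2 = 3.6e8        # assumed global ocean surface area (square km)
rate_mm_per_year = 0.5        # Greenland's contribution quoted in the text

rate_km_per_year = rate_mm_per_year * 1e-6          # mm -> km
volume_km3_per_year = ocean_area_km2 * rate_km_per_year
print(f"Implied meltwater volume: ~{volume_km3_per_year:.0f} cubic km per year")
# Roughly 180 cubic km of water per year spread across the world's oceans.
```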
The most sensitive regions for future, rapid change in Greenland's ice volume are dynamic outlet glaciers like Jacobshavn, which has a deep channel reaching far inland, he said. "Inclusion of the dynamic processes of these glaciers in models will likely demonstrate that the 2007 Intergovernmental Panel on Climate Change assessment underestimated sea-level projections for the end of the 21st century," Steffen said.
Helicopter surveys indicate there has been an increase in cylindrical, vertical shafts in Greenland's ice known as moulins, which drain melt water from surface ponds down to bedrock, he said. Moulins, which resemble huge tunnels in the ice and may run vertically for several hundred feet, switch back and forth from vertical to horizontal as they descend toward the bottom of the ice sheet, he said.
"These melt-water drains seem to allow the ice sheet to respond more rapidly than expected to temperature spikes at the beginning of the annual warm season," Steffen said. "In recent years the melting has begun earlier than normal."
Steffen and his team have been using a rotating laser and a sophisticated digital camera and high-definition camera system provided by NASA's Jet Propulsion Laboratory to map the volume and geometry of moulins on the Greenland ice sheet to a depth of more than 1,500 feet. "We know the number of moulins is increasing," said Steffen. "The bigger question is how much water is reaching the bed of the ice sheet, and how quickly it gets there."
Steffen said the ice loss trend in Greenland is somewhat similar to the trend of Arctic sea ice in recent decades. In October, CU-Boulder's National Snow and Ice Data Center reported the 2007 Arctic sea-ice extent had plummeted to the lowest levels since satellite measurements began in 1979 and was 39 percent below the long-term average tracked from 1979 to 2007.
CIRES is a joint institute of CU-Boulder and the National Oceanic and Atmospheric Administration. For more information on Steffen's research, visit the Web site at: http://cires.colorado.edu/science/groups/steffen/.
Adapted from materials provided by University of Colorado at Boulder.

Fausto Intilla

In Search For Water On Mars, Clues From Antarctica


Source:

ScienceDaily (Dec. 11, 2007) — Scientists have gathered more evidence that suggests flowing water on Mars -- by comparing images of the red planet to an otherworldly landscape on Earth.
In recent years, scientists have examined images of several sites on Mars where water appears to have flowed to the surface and left behind a trail of sediment. Those sites closely resemble places where water flows today in the McMurdo Dry Valleys in Antarctica, the new study has found.
The new study bolsters the notion that liquid water could be flowing beneath the surface of Mars. And since bacteria thrive in the liquid water flowing in the Dry Valleys, the find suggests that bacterial life could possibly exist on Mars as well.
Researchers have used the Dry Valleys as an analogy for Mars for 30 years, explained Berry Lyons, professor of earth sciences and director of the Byrd Polar Research Center at Ohio State University.
Lyons is lead principal investigator for the National Science Foundation's Long Term Ecological Research (LTER) Network, a collaboration of more than 1,800 scientists who study the ecology of sites around the world.
One of the LTER sites is in the Dry Valleys, a polar desert in Antarctica with year-round saltwater flowing beneath the surface. With temperatures that dip as low as negative 85 degrees Fahrenheit, it's as cold as the Martian equator, and its iron-rich soil gives it a similar red color.
“If you looked at pictures of both landscapes side by side, you couldn't tell them apart,” Lyons said.
In the new study, LTER scientists did just that -- they compared images of water flows in the Dry Valleys to images of gullies on Mars that show possible evidence of recent water flow.
Team member Peter Doran of the University of Illinois at Chicago presented the results on Tuesday, December 11, 2007, at the American Geophysical Union meeting in San Francisco.
The scientists' conclusion: the Martian sites closely resemble sites in the Dry Valleys where water has seeped to the surface.
The water in the Dry Valleys can be very salty -- it's full of calcium chloride, the same kind of salt we sprinkle on roadways to melt ice. That's why the water doesn't freeze. Natural springs form from melted ground ice or buried glacier ice, and the saltwater percolates to the surface.
“Even in the dead of winter, there are locations with salty water in the Dry Valleys,” Lyons said. “Two months a year, we even have lakes of liquid water covered in ice.”
But after the water reaches the surface, it evaporates, leaving behind salt and sediment.
The same thing would happen on Mars, he added.
Because the suspected sediment sites on Mars closely resemble known sediment sites in the Dry Valleys, Lyons and his colleagues think that liquid saltwater is likely flowing beneath the Martian surface.
Lyons, who has led many expeditions to Antarctica, said that his team will continue to compare what they learn on Earth to any new evidence of water uncovered on Mars.
As they walk across the Dry Valleys, they can't help but compare the two.
“There's just something about that landscape, about being so far from civilization, that makes you think about other worlds,” he said.
Adapted from materials provided by Ohio State University.

Fausto Intilla

Sunday, December 9, 2007

Age-old Mystery Of Missing Chemicals From Earth's Mantle May Be Solved


Source:

ScienceDaily (Dec. 9, 2007) — Observations about the early formation of Earth may answer an age-old question about why the planet's mantle is missing some of the matter that should be present, according to UBC geophysicist John Hernlund.
Earth is made from chondrite, very primitive meteoritic rock that dates from the earliest time of the solar system, before the Earth was formed. However, scientists have been puzzled as to why the composition of Earth's mantle and core differs from that of chondrite.
Hernlund's findings suggest that an ancient magma ocean swirled beneath the Earth's surface, which would account for the discrepancy.
"As the thick melted rock cooled and crystallized, the solids that resulted had a different composition than the melt," explains Hernlund, a post-doctoral fellow at UBC Earth and Ocean Sciences.
"The melt held onto some of the elements. This would be where the missing elements of chondrite are stored."
He says this layer of molten rock would have been around 1,000 km thick and 2,900 km beneath the surface.
Published in the journal Nature, Hernlund's study explores the melting and crystallization processes that have controlled the composition of the Earth's interior over geological time. Co-authors are Stéphane Labrosse, Ecole Normale Superieure de Lyon and Nicolas Coltice, Université de Lyon.
The centre of Earth is a fiery core of melted heavy metals, mostly iron. This core represents about 30 per cent of the planet's mass, while the remaining 70 per cent is the overlying mantle of solid rock.
Traditional views hold that a shallow ocean of melted rock (magma) existed down to 1,000 km below the Earth's surface, but that it was short-lived and gone within 10 million years of Earth's formation.
In contrast, Hernlund's evolutionary model predicts that during Earth's hotter past shortly after its formation 4.5 billion years ago, at least one-third of the mantle closest to the core was also melted.
The partially molten patches now observed at the base of the Earth's mantle could be the remnants of such a deep magma ocean, says Hernlund.
Adapted from materials provided by University of British Columbia.

Fausto Intilla

Monday, December 3, 2007

Helium Isotopes Point To New Sources Of Geothermal Energy


Source:

ScienceDaily (Dec. 3, 2007) — With fossil fuel sources depleting and global warming on the rise, exploring alternative means of power for humans is a necessary reality. Now, looking to the sky, relying on the wind or harnessing water power are not the only remaining options. Deep within Earth is an untapped source of energy: geothermal energy.
It has been estimated that within the continental United States, there is a sizable resource of accessible geothermal energy -- about 3,000 times the current annual U.S. consumption.
There are two important reasons this storehouse of energy has not been tapped: locating the specific energy hot spots is difficult, and confirming them has traditionally meant expensive drilling.
"Since many geothermal resources are hidden, that is, they do not show any clear indications of their presence at the surface, locating them by just using observations made at the surface is difficult," explains Matthijs van Soest, associate research professional at the Noble Gas Geochemistry and Geochronology Laboratory within the School of Earth and Space Exploration at Arizona State University.
"Often when people thought there might be a geothermal resource below the surface the only way to determine if their assumption was correct was drilling and drilling is extremely expensive," he says.
Now, research by van Soest and B. Mack Kennedy at Lawrence Berkeley National Laboratory reveals that geothermal exploration doesn't have to be high-priced.
And it doesn't even have to require drilling.
In a survey of the northern Basin and Range province of the western United States, geochemists Mack Kennedy of the Department of Energy's Lawrence Berkeley National Laboratory and Matthijs van Soest of Arizona State University have discovered a new tool for identifying potential geothermal energy resources.
Currently, most developed geothermal energy comes from regions of volcanic activity, such as The Geysers in Northern California. The potential resources identified by Kennedy and van Soest arise not from volcanism but from the flow of surface fluids through deep fractures that penetrate the earth's lower crust, in regions far from current or recent volcanic activity.
"A good geothermal energy source has three basic requirements: a high thermal gradient — which means accessible hot rock — plus a rechargeable reservoir fluid, usually water, and finally, deep permeable pathways for the fluid to circulate through the hot rock," says Kennedy, a staff scientist in Berkeley Lab's Earth Sciences Division. "We believe we have found a way to map and quantify zones of permeability deep in the lower crust that result not from volcanic activity but from tectonic activity, the movement of pieces of the Earth's crust."
Kennedy and van Soest made their discovery by comparing the ratios of helium isotopes in samples gathered from wells, surface springs, and vents across the northern Basin and Range. Helium-three, whose nucleus has just one neutron, is made only in stars, and Earth's mantle retains a high proportion of primordial helium-three (compared to the minuscule amount found in air) left over from the formation of the solar system.
Earth's crust, on the other hand, is rich in radioactive elements like uranium and thorium that decay by emitting alpha particles, which are helium-four nuclei. Thus a high ratio of helium-three to helium-four in a fluid sample indicates that much of the fluid came from the mantle.
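The mantle-versus-crust reasoning can be made concrete with a simple two-end-member mixing sketch. The end-member ratios below are rough textbook values expressed relative to the atmospheric ratio (Ra), and the equal-concentration mixing model is a deliberate simplification; none of these numbers comes from the Kennedy and van Soest survey.

```python
# Illustrative two-end-member mixing: estimate the mantle share of helium in
# a fluid sample from its 3He/4He ratio.  End-member values are rough
# textbook figures (in units of the atmospheric ratio, Ra ~ 1.4e-6), and the
# equal-concentration assumption is a simplification; none of this is data
# from the study described above.

R_MANTLE_RA = 8.0    # typical upper-mantle 3He/4He, in Ra
R_CRUST_RA = 0.02    # typical radiogenic crustal 3He/4He, in Ra

def mantle_fraction(sample_ra):
    """Mantle share of the helium, assuming two end members with equal
    helium concentrations mixed linearly."""
    return (sample_ra - R_CRUST_RA) / (R_MANTLE_RA - R_CRUST_RA)

for sample_ra in (0.1, 1.0, 4.0):        # hypothetical sample ratios, in Ra
    print(f"sample at {sample_ra:.1f} Ra -> ~{mantle_fraction(sample_ra):.0%} mantle helium")
# Higher ratios imply a larger mantle-derived component, which is why high
# ratios far from volcanoes point to deep, permeable fluid pathways.
```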
High helium ratios are common in active volcanic regions, where mantle fluids intrude through the ductile boundary of the lower crust. But when Kennedy and van Soest found high ratios in places far from volcanism, they knew that mantle fluids must be penetrating the ductile boundary by other means.
The geology of the region was the clue. The Basin and Range is characterized by mountain ranges that mostly run north and south, separated by broad, relatively flat-floored valleys (basins), which are blocks of crust that have sunk and become filled with sediment eroded from the uplifted mountains. The alternating basin and range topography is the result of crustal spreading by east to west extension, which has occurred over the past approximately 30 million years. The Earth's crust in the Basin and Range is some of the thinnest in the world, resulting in unusually high thermal gradients.
The faces of mountain blocks in the Basin and Range clearly exhibit the normal faults that result as the blocks are pulled apart by the extension of the crust. Normal faults form high-angle pathways deep down into the brittle upper crust. But as the fault plane approaches the ductile lower crust, changes in the density and viscosity of the rock refract the principal stress acting on the fault, deflecting the fault plane, which becomes more horizontal. It is from these deep, horizontally-trending faults that Kennedy thinks permeable passageways may emanate, penetrating the ductile boundary into the mantle.
One of the most seismically active areas in the Basin and Range occurs in what is called the central Nevada seismic belt. The researchers' detailed studies in this area, notably at the Dixie Valley thermal system next to the Stillwater range, established that the highest helium ratios were restricted to fluids emerging from the Stillwater range-front fault system.
The northern Basin and Range, which Kennedy and van Soest surveyed on behalf of DOE's Office of Basic Energy Sciences and Office of Geothermal Technologies, includes parts of California, Nevada, Oregon, Idaho, and Utah. In their survey the researchers mapped the steady progression from low helium ratios in the east to high ratios in the west. The distribution of the increasing ratios corresponds remarkably with an increase in the rate and a change in the direction of crustal extension, which shifts from an east to west trend across the Basin and Range to a northwest trend.
This change in rate and direction reflects the added shear strain induced by the northward movement of the Pacific Plate past the North American Plate. Kennedy and van Soest believe that the added component of shear strain and increasing extension rate tear open fluid pathways through the ductile lower crust, into the mantle. The high helium isotope ratios they found, indicating potential new sources of geothermal energy, were superimposed upon the general background trend: anomalously high ratios map zones of higher than average permeability.
"We have never seen such a clear correlation of surface geochemical signals with tectonic activity, nor have we ever been able to quantify deep permeability from surface measurements of any kind," says Kennedy. The samples they collected on the surface gave the researchers a window into the structure of the rocks far below, with no need to drill.
With the urgent need to find energy sources that are renewable and don't emit greenhouse gases, geothermal energy is ideal — "the best renewable energy source besides the sun," Kennedy says. Accessible geothermal energy in the United States, excluding Alaska and Hawaii, has been estimated at 9 x 10^16 (90 quadrillion) kilowatt-hours, 3,000 times more than the country's total annual energy consumption. Determining helium ratios from surface measurements is a practical way to locate some of the most promising new resources.
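The two statements of the resource's size (9 x 10^16 kilowatt-hours, and 3,000 times annual U.S. consumption) are mutually consistent, as the small check below shows. The conversion to BTU is included only as an assumed cross-check against the commonly cited figure of roughly 100 quadrillion BTU of total annual U.S. energy use, which is not stated in the article.

```python
# Quick consistency check of the geothermal resource figures quoted above.

resource_kwh = 9e16              # accessible geothermal energy (kWh), from the text
multiple_of_annual_use = 3000    # "3,000 times" annual U.S. consumption, from the text

implied_annual_use_kwh = resource_kwh / multiple_of_annual_use
print(f"Implied annual U.S. energy use: {implied_annual_use_kwh:.1e} kWh")  # 3.0e13

# Assumed cross-check (not from the article): total U.S. energy use is
# commonly quoted near 100 quadrillion BTU per year.
kwh_to_btu = 3412.0
print(f"... about {implied_annual_use_kwh * kwh_to_btu:.1e} BTU per year")  # ~1.0e17
```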
Journal reference: "Flow of Mantle Fluids Through the Ductile Lower Crust: Helium Isotope Trends," B. Mack Kennedy and Matthijs C. van Soest, Science, 30 November 2007.
Adapted from materials provided by DOE/Lawrence Berkeley National Laboratory.

Fausto Intilla

Friday, November 30, 2007

Recipe For A Storm: Ingredients For More Powerful Atlantic Hurricanes


Source:

ScienceDaily (Nov. 30, 2007) — As the world warms, the interaction between the Atlantic Ocean and atmosphere may be the recipe for stronger, more frequent hurricanes.
University of Wisconsin-Madison scientists have found that the Atlantic organizes the ingredients for a powerful hurricane season to create a situation where either everything is conducive to hurricane activity or nothing is -- potentially making the Atlantic more vulnerable to climate change than the world's other hurricane hot spots.
After the 2004 and 2005 hurricane seasons, many worry about what Atlantic hurricane seasons will look like in a warmer world. Evidence indicates that higher ocean temperatures add a lot of fuel to these devastating storms. In a paper published today in the "Bulletin of the American Meteorological Society," co-authors Jim Kossin and Dan Vimont caution against only looking at one piece of the puzzle. "Sea surface temperature is a bit overrated," says Kossin, an atmospheric scientist at UW-Madison's Cooperative Institute for Meteorological Satellite Studies. "It's part of a larger pattern."
Kossin and Vimont, a professor in the Department of Atmospheric and Oceanic Sciences, noticed that warmer water is just one part of a larger pattern indicating that the conditions are right for more frequent, stronger hurricanes in the Atlantic. The atmosphere reacts to ocean conditions and the ocean reacts to the atmospheric situation, creating a distinct circulation pattern known as the Atlantic Meridional Mode (AMM). The AMM unifies the connections among the factors that influence hurricanes such as ocean temperature, characteristics of the wind, and moisture in the atmosphere.
Finding that a basin-wide circulation pattern drives Atlantic hurricane activity helps explain evidence of significant differences in long-term hurricane trends among the world's basins. In a study published last February, Kossin and his co-authors created a more consistent record of hurricane data that accounted for the significant improvement in storm detection that followed the advent of weather satellites. An analysis of this recalibrated data showed that hurricanes have become stronger and more frequent in the Atlantic Ocean over the last two decades. The increasing trend, however, is harder to identify in the world's other oceans.
Kossin and Vimont wanted to determine why long-term trends in the Atlantic looked different from those in other basins, particularly in the Pacific, where the majority of the world's hurricane activity occurs. "The AMM helps us understand why hurricanes in the Atlantic react differently to climate changes than those in the Pacific," Vimont says. According to Vimont, the other oceanic basins have their own modes of variability.
Understanding how factors vary together provides a new framework from which to consider climate change and hurricanes. "Our study broadens the interpretation of the hurricane-climate relationship," Vimont says.
Looking at the larger set of varying conditions provides a more coherent understanding of how climate change affects hurricane activity. In the Atlantic, warmer water indicates that other conditions are also ideal for hurricane development. However, in the Pacific, a hurricane-friendly environment goes along with cooler ocean temperatures in the area where the storms spend their lives. The inconsistent relationship with sea surface temperature leads Vimont and Kossin to conclude that the connection between hurricane activity and climate variability hinges on more than just changes in ocean temperatures.
"You can never isolate one factor on this planet," Kossin says. "Everything is interrelated."
Depending on the other conditions hurricanes care about, warmer oceans can mean different outcomes. Concentrating on how the atmosphere and the ocean work together helps hurricane researchers see the bigger picture. Because higher sea surface temperatures in the Atlantic act in concert with the AMM, Vimont and Kossin suggest that Atlantic hurricanes will be more sensitive to climate changes than storms in other ocean basins.
In addition to helping researchers understand and predict the effects of climate change on hurricane activity, Vimont and Kossin can forecast the AMM up to a year in advance. If the AMM is positive, all the conditions are right for hurricane development. If it is negative, those living on the coasts can generally expect a quieter hurricane season. Vimont and Kossin plan to further develop their AMM forecasts for use during the hurricane season. The duo also hopes to continue to research the physical relationships that constitute the AMM as well as how future climate change will affect these modes of climate variability.
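The AMM itself is derived statistically from coupled sea surface temperature and wind fields, but the idea of a sign-based seasonal outlook can be illustrated with a very crude proxy. The Python sketch below is only an illustration under stated assumptions, not the authors' method: the input anomalies are hypothetical, and the cross-equatorial temperature difference is used merely as a stand-in for the sign of the AMM.

    # Minimal sketch (not the authors' method): a cross-equatorial SST-anomaly
    # difference serves as a crude stand-in for the sign of the Atlantic
    # Meridional Mode. Input anomalies (degrees C) are hypothetical.
    import numpy as np

    north_anom = np.array([0.3, 0.4, 0.5, 0.4])   # tropical North Atlantic, hypothetical
    south_anom = np.array([0.0, -0.1, 0.1, 0.0])  # tropical South Atlantic, hypothetical

    amm_proxy = float(np.mean(north_anom - south_anom))  # positive => warm north of the equator

    if amm_proxy > 0:
        outlook = "conditions favor a more active Atlantic hurricane season"
    else:
        outlook = "conditions favor a quieter Atlantic hurricane season"

    print(f"AMM-like proxy = {amm_proxy:+.2f} C; {outlook}")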
Adapted from materials provided by University of Wisconsin-Madison.

Fausto Intilla

Wednesday, November 28, 2007

High Resolution Antarctica Map Lays Ground For New Discoveries


Source:

ScienceDaily (Nov. 28, 2007) — A team of researchers from NASA, the U.S. Geological Survey, the National Science Foundation and the British Antarctic Survey unveiled a newly completed map of Antarctica November 27 that is expected to revolutionize research of the continent's frozen landscape.
The Landsat Image Mosaic of Antarctica is a result of NASA's state-of-the-art satellite technologies and an example of the prominent role NASA continues to play as a world leader in the development and flight of Earth-observing satellites.
The map is a realistic, nearly cloudless satellite view of the continent at a resolution 10 times greater than ever before with images captured by the NASA-built Landsat 7 satellite. With the unprecedented ability to see features half the size of a basketball court, the mosaic offers the most geographically accurate, true-color, high-resolution views of Antarctica possible.
"This mosaic of images opens up a window to the Antarctic that we just haven't had before," said Robert Bindschadler, chief scientist of the Hydrospheric and Biospheric Sciences Laboratory at NASA's Goddard Space Flight Center in Greenbelt, Md. "It will open new windows of opportunity for scientific research as well as enable the public to become much more familiar with Antarctica and how scientists use imagery in their research. This innovation is like watching high-definition TV in living color versus watching the picture on a grainy black-and-white television. These scenes don't just give us a snapshot, they provide a time-lapse historical record of how Antarctica has changed and will enable us to continue to watch changes unfold."
Researchers can use the detailed map to better plan scientific expeditions. The mosaic's higher resolution gives researchers a clearer view over most of the continent to help interpret changes in land elevation in hard-to-access areas. Scientists also think the true-color mosaic will help geologists better map various rock formations and types.
To construct the new Antarctic map, researchers pieced together more than a thousand images from three years of Landsat satellite observations. The resulting mosaic gives researchers and the public a new way to explore Antarctica through a free, public-access web portal. Eight different versions of the full mosaic are available to download.
In 1972, the first satellite images of the Antarctic became available with the launch of NASA's Earth Resources Technology Satellite (later renamed Landsat). The Landsat series of satellites has since provided the longest continuous global record in existence of the land surface and its changes. Prior to these satellite views, researchers had to rely on airplanes and survey ships to map Antarctica's ice-covered terrain.
Images from the Landsat program, now managed by the U.S. Geological Survey, led to more precise and efficient research results as the resolution of digital images improved over the years with upgraded instruments on each new Earth-observing satellite.
"We have significantly improved our ability to extract useful information from satellites as embodied in this Antarctic mosaic project," said Ray Byrnes, liaison for satellite missions at the U.S. Geological Survey in Reston, Va. "As technology progressed, so have the satellites and their image resolution capability. The first three in the Landsat series were limited in comparison to Landsats 4, 5, and 7."
Bindschadler, who conceived the project, initiated NASA's collection of images of Antarctica for the mosaic project in 1999. He and NASA colleagues selected the images that make up the mosaic and developed new techniques to interpret the image data tailored to the project. The mosaic is made up of about 1,100 images from Landsat 7, nearly all of which were captured between 1999 and 2001. The collage contains almost no gaps in the landscape, other than a doughnut hole-shaped area at the South Pole, and shows virtually no seams.
"The mosaic represents an important U.S.-U.K. collaboration and is a major contribution to the International Polar Year," said Andrew Fleming of British Antarctic Survey in Cambridge, England. "Over 60,000 scientists are involved in the global International Polar Year initiative to understand our world. I have no doubt that polar researchers will find this mosaic, one of the first outcomes of that initiative, invaluable for planning science campaigns."
NASA has 14 Earth-observing satellites in orbit with activities that have direct benefit to humankind. After NASA develops and tests new technologies, the agency transfers activities to other federal agencies. The satellites have helped revolutionize the information that emergency officials have to respond to natural disasters like hurricanes and wildfires.
The Landsat Image Mosaic of Antarctica is now available on the Web at: http://lima.usgs.gov/
Adapted from materials provided by NASA/Goddard Space Flight Center.

Fausto Intilla

Tuesday, November 27, 2007

Number Of Tropical Storms In Recent Past Increasing


Source:

ScienceDaily (Nov. 27, 2007) — Counting tropical storms that occurred before the advent of aircraft and satellites relies on ships' logs and records of hurricane landfalls, leading many to believe that historic tropical storm counts in the Atlantic are seriously undercounted. However, a statistical model based on the climate factors that influence Atlantic tropical storm activity shows that current estimates are only slightly below the modeled numbers and indicates that the number of tropical storms in the recent past is increasing, according to researchers.
"We are not the first to come up with an estimate of the number of undercounted storms," says Michael E. Mann, associate professor of meteorology, Penn State, and director of the Earth System Science Center.
In the past, some researchers assumed that a constant percentage of all storms made landfall, and so they compared the number of tropical storms making landfall with the total number of reported storms for each year. Other researchers looked at ship logs and ship tracks to determine how likely it was that a tropical storm would have been missed. In the early 1900s and before, there were probably not enough ships crossing the Atlantic to ensure full coverage.
The researchers report in the current issue of Geophysical Research Letters "that the long-term record of historical Atlantic tropical cyclone counts is likely largely reliable, with an average undercount bias at most of approximately one tropical storm per year back to 1870."
This suggests that previously estimated undercounts of three or more storms per year are too high.
"We have a very accurate count of Atlantic tropical cyclones beginning in 1944 when aircraft became common," says Mann. "In the 1970s, satellites were added to that mix."
With more than 60 years of accurate hurricane counts, the researchers, who included Thomas Sabbatelli, an undergraduate in meteorology and the Schreyer Honors College at Penn State, and Urs Neu, a research scientist at ProClim, Swiss Academy of Sciences, looked at other, independent ways to determine the number of hurricanes before 1944.
They looked at how the cycle of El Nino/La Nina, the pattern of the northern hemisphere jet stream and tropical Atlantic sea surface temperatures influence tropical storm generation by creating a model that includes these three climate variables. The information is available back to 1870.
The statistical model proved successful in various tests of accuracy. Before the current season began, it also predicted a total of 15 Atlantic tropical storms, with an error margin of 4. So far, 14 storms have formed, with a little more than one week left in the season.
The model, trained on tropical storm occurrence data from 1944 to 2006, showed an undercount before 1944 of 1.2 storms per year. When the researchers assumed a possible undercount of three storms per year, their model predicted too few storms overall. With the available climate data, the model only works in a range of around 1.2 undercounted storms per year, and its findings are statistically significant.
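As a rough illustration of this kind of count model (a sketch with made-up numbers, not the authors' data or code), the snippet below fits a Poisson regression of annual storm counts on three stand-in climate covariates -- an El Nino index, a jet-stream index, and a tropical Atlantic sea surface temperature anomaly -- and then predicts a season's storm total.

    # Illustrative Poisson count model with hypothetical data (not the Penn State model):
    # annual storm counts regressed on three climate covariates, with the mean
    # count modeled as lambda_t = exp(b0 + b . x_t).
    import numpy as np

    rng = np.random.default_rng(0)
    n_years = 60

    # Hypothetical standardized covariates: ENSO, jet-stream index, tropical Atlantic SST anomaly
    X = rng.standard_normal((n_years, 3))
    true_beta = np.array([-0.3, -0.2, 0.4])      # illustrative sensitivities
    counts = rng.poisson(np.exp(np.log(9.0) + X @ true_beta))   # ~9 storms per year on average

    # Fit by gradient ascent on the Poisson log-likelihood: gradient = X'(y - lambda)
    Xd = np.column_stack([np.ones(n_years), X])  # add an intercept column
    beta = np.zeros(4)
    for _ in range(5000):
        mu = np.exp(Xd @ beta)
        beta += 1e-3 * Xd.T @ (counts - mu)

    # Predicted storm count for a hypothetical upcoming season
    x_new = np.array([1.0, -0.5, 0.2, 1.0])      # [intercept, ENSO, jet, SST]
    print("fitted coefficients:", np.round(beta, 2))
    print("expected storms next season:", round(float(np.exp(x_new @ beta)), 1))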
"Fifty percent of the variation in storm numbers from one year to the next appears to be predictable in terms of the three key climate variables we used," says Mann. "The other 50 percent appears to be pure random variation. The model ties the increase in storm numbers over the past decade to increasing tropical ocean surface temperatures.
"We cannot explain the warming trend in the tropics without considering human impacts on climate. This is not a natural variation," says Mann.
"This . . . supports other work suggesting that increases in frequency, as well as powerfulness, of Atlantic tropical cyclones are potentially related to long-term trends in tropical Atlantic sea surface temperatures, trends that have in turn been connected to anthropogenic influences on climate," the researchers report.
Adapted from materials provided by Penn State.

Fausto Intilla

'Ultrasound' Of Earth's Crust Reveals Inner Workings Of A Tsunami Factory


Source:

ScienceDaily (Nov. 27, 2007) — Research just announced by a team of U.S. and Japanese geoscientists may help explain why part of the seafloor near the southwest coast of Japan is particularly good at generating devastating tsunamis, such as the 1944 Tonankai event, which killed at least 1,200 people. The findings will help scientists assess the risk of giant tsunamis in other regions of the world.
Geoscientists from The University of Texas at Austin and colleagues used a commercial ship to collect three-dimensional seismic data that reveals the structure of Earth's crust below a region of the Pacific seafloor known as the Nankai Trough. The resulting images are akin to ultrasounds of the human body.
The results, published in the journal Science, address a long-standing mystery as to why earthquakes below some parts of the seafloor trigger large tsunamis while earthquakes in other regions do not.
The 3D seismic images allowed the researchers to reconstruct how layers of rock and sediment have cracked and shifted over time. They found two things that contribute to big tsunamis. First, they confirmed the existence of a major fault that runs from a region known to unleash earthquakes about 10 kilometers (6 miles) deep right up to the seafloor. When an earthquake happens, this fault gives the rupture a direct path to the seafloor, moving it up or down, carrying a column of water with it and setting up a series of tsunami waves that spread outward.
Second, and most surprising, the team discovered that the recent fault activity, probably including the slip that caused the 1944 event, has shifted to landward branches of the fault, becoming shallower and steeper than it was in the past.
"That leads to more direct displacement of the seafloor and a larger vertical component of seafloor displacement that is more effective in generating tsunamis," said Nathan Bangs, senior research scientist at the Institute for Geophysics at The University of Texas at Austin who was co-principal investigator on the research project and co-author on the Science article.
The Nankai Trough is in a subduction zone, an area where two tectonic plates are colliding, pushing one plate down below the other. The grinding of one plate over the other in subduction zones leads to some of the world's largest earthquakes.
In 2002, a team of researchers led by Jin-Oh Park at Japan Marine Science and Technology Center (JAMSTEC) had identified the fault, known as a megathrust or megasplay fault, using less detailed two-dimensional geophysical methods. Based on its location, they suggested a possible link to the 1944 event, but they were unable to determine where faulting has been recently active.
"What we can now say is that slip has very recently propagated up to or near to the seafloor, and slip along these thrusts most likely caused the large tsunami during the 1944 Tonankai 8.1 magnitude event," said Bangs.
The images produced in this project will be used by scientists in the Nankai Trough Seismogenic Zone Experiment (NanTroSEIZE), an international effort designed to, for the first time, "drill, sample and instrument the earthquake-causing, or seismogenic portion of Earth's crust, where violent, large-scale earthquakes have occurred repeatedly throughout history."
"The ultimate goal is to understand what's happening at different margins," said Bangs. "The 2004 Indonesian tsunami was a big surprise. It's still not clear why that earthquake created such a large tsunami. By understanding places like Nankai, we'll have more information and a better approach to looking at other places to determine whether they have potential. And we'll be less surprised in the future."
Bangs' co-principal investigator was Gregory Moore at JAMSTEC in Yokohama and the University of Hawaii, Honolulu. The other co-authors are Emily Pangborn at the Institute for Geophysics at The University of Texas at Austin, Asahiko Taira and Shin'ichi Kuramoto at JAMSTEC and Harold Tobin at the University of Wisconsin, Madison. Funding for the project was provided by the National Science Foundation, Ocean Drilling Program and Japanese Ministry of Education, Culture, Sports, Science and Technology.
Adapted from materials provided by University of Texas at Austin.

Fausto Intilla

Thursday, November 8, 2007

Scientists Enhance Mother Nature's Carbon Handling Mechanism


Source:

ScienceDaily (Nov. 8, 2007) — Taking a page from Nature herself, a team of researchers developed a method to enhance removal of carbon dioxide from the atmosphere and place it in the Earth's oceans for storage.
Unlike other proposed ocean sequestration processes, the new technology does not make the oceans more acidic and may be beneficial to coral reefs. The process is a manipulation of the natural weathering of volcanic silicate rocks. Reporting in the Nov. 7 issue of Environmental Science and Technology, the Harvard and Penn State team explained their method.
"The technology involves selectively removing acid from the ocean in a way that might enable us to turn back the clock on global warming," says Kurt Zenz House, graduate student in Earth and planetary sciences, Harvard University. "Essentially, our technology dramatically accelerates a cleaning process that Nature herself uses for greenhouse gas accumulation."
In natural silicate weathering, carbon dioxide from the atmosphere dissolves in fresh water and forms weak carbonic acid. As the water percolates through the soil and rocks, the carbonic acid converts to a solution of alkaline carbonate salts. This water eventually flows into the ocean and increases its alkalinity. An alkaline ocean can hold dissolved carbon, while an acidic one will release the carbon back into the atmosphere. The more weathering, the more carbon is transferred to the ocean where some of it eventually becomes part of the sea bottom sediments.
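In schematic form, this natural weathering chain is the textbook silicate-weathering sequence (written here with a generic calcium silicate standing in for volcanic rock; this is standard chemistry, not a reproduction of the paper's reaction scheme):

    \mathrm{CO_2 + H_2O \rightleftharpoons H_2CO_3}
    \mathrm{CaSiO_3 + 2\,H_2CO_3 \rightarrow Ca^{2+} + 2\,HCO_3^- + SiO_2 + H_2O}

The bicarbonate carried to the ocean is the dissolved alkaline carbonate salt described above, and the carbon it contains stays out of the atmosphere as long as the ocean remains alkaline enough to hold it.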
"In the engineered weathering process we have found a way to swap the weak carbonic acid with a much stronger one (hydrochloric acid) and thus accelerate the pace to industrial rates," says House.
The researchers minimize the potential for environmental problems by combining the acid removal with silicate rock weathering mimicking the natural process. The more alkaline ocean can store carbon as bicarbonate, the most plentiful and innocuous form of carbon in the oceans.
According to House, this would allow removal of excess carbon dioxide from the atmosphere in a matter of decades rather than millennia.
Besides removing the greenhouse gas carbon dioxide from the atmosphere, this technique would counteract the continuing acidification of the oceans that threatens coral reefs and their biological communities. The technique is adaptable to operation in remote areas powered by geothermal energy or natural gas, and its effect is global rather than local. Unlike carbon dioxide scrubbers on power plants, the process can remove naturally generated carbon dioxide as easily as that produced by burning fossil fuel for power.
The researchers, Kurt House; Daniel P. Schrag, director, Harvard University Center for the Environment and professor of Earth and planetary sciences; Michael J. Aziz, the Gordon McKay professor of material sciences, all at Harvard University and Kurt House's brother, Christopher H. House, associate professor of geosciences, Penn State, caution that while they believe their scheme for reducing global warming is achievable, implementation would be ambitious, costly and would carry some environmental risks that require further study. The process would involve building dozens of facilities similar to large chlorine gas industrial plants, on volcanic rock coasts.
"This work shows how we can remove carbon dioxide on relevant timescales, but more work is be needed to bring down the cost and minimize other environmental effects," says Christopher H. House.
The Link Energy Foundation, Merck Fund of the New York Community Trust, U.S. DOE and NASA supported this work.
Adapted from materials provided by Penn State.

Fausto Intilla

Tuesday, November 6, 2007

Seismic Hazard: Stateline Fault System Is Major Component Of Eastern California Shear Zone

Source:
ScienceDaily (Nov. 6, 2007) — The 200-km-long (125-mile) Stateline fault system is a right-lateral strike-slip fault zone with clear Late Quaternary surface ruptures extending along the California-Nevada state line, from the Primm, Nevada, area along Interstate 15 to the Amargosa Valley.
The fault passes within 40 km of the Las Vegas strip, 10 km of the town center of Pahrump, Nevada, and appears to end near the town of Amargosa Valley, Nevada (about 40 km west-southwest of the site of the proposed high-level nuclear waste repository at Yucca Mountain).
This fault has long been considered inactive and of only minor importance to the tectonic pattern of eastern California and southwestern Nevada, whereas fault systems such as the Death Valley, Panamint Valley, and Owens Valley faults have received much more attention.
New research focused on the Stateline fault system is beginning to change how we view this fault zone. Guest et al. present geologic data establishing that the minimum offset on the southern segment of the fault system is 30 ± 4 km over the last 13 million years.
This implies a minimum average slip rate for the southern segment of the fault system of 2.3 ± 0.35 mm/yr. This is twice the slip rate estimated from geodetic monitoring in the region, and therefore the fault is either in a transient period of slow slip or has been abandoned as activity in the eastern California shear zone has migrated west.
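The slip-rate arithmetic quoted above is easy to check; the short Python sketch below simply divides the reported offset by the time interval (the published uncertainty of 0.35 mm/yr is slightly larger than the roughly 0.3 mm/yr obtained from the offset error alone, presumably because it also folds in the age uncertainty):

    # Back-of-the-envelope check of the Stateline fault slip rate quoted above:
    # 30 +/- 4 km of offset accumulated over roughly 13 million years.
    # Kilometers per million years are numerically identical to millimeters per year.
    offset_km, offset_err_km = 30.0, 4.0
    interval_myr = 13.0

    rate_mm_per_yr = offset_km / interval_myr          # ~2.3 mm/yr
    rate_err_mm_per_yr = offset_err_km / interval_myr  # ~0.3 mm/yr from the offset error alone

    print(f"average slip rate ~ {rate_mm_per_yr:.2f} +/- {rate_err_mm_per_yr:.2f} mm/yr")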
The magnitude of accumulated offset, evidence for Late Quaternary slip, and rapid long-term slip rate indicate that the Stateline fault system is a major component of the Eastern California shear zone. Given its proximity to population centers and important infrastructure in southern Nevada, the fault warrants close scrutiny in seismic hazards analyses of the region.
Adapted from materials provided by Geological Society of America.

Fausto Intilla
www.oloscience.com

Wednesday, October 31, 2007

Origin Of 'Breathable' Atmosphere Half A Billion Years Ago Discovered


Source:

ScienceDaily (Oct. 30, 2007) — Ohio State University geologists and their colleagues have uncovered evidence of when Earth may have first supported an oxygen-rich atmosphere similar to the one we breathe today.
The study suggests that upheavals in the earth's crust initiated a kind of reverse-greenhouse effect 500 million years ago that cooled the world's oceans, spawned giant plankton blooms, and sent a burst of oxygen into the atmosphere.
That oxygen may have helped trigger one of the largest growths of biodiversity in Earth's history.
Matthew Saltzman, associate professor of earth sciences at Ohio State, reported the findings October 28 at the meeting of the Geological Society of America in Denver.
For a decade, he and his team have been assembling evidence of climate change that occurred 500 million years ago, during the late Cambrian period. They measured the amounts of different chemicals in rock cores taken from around the world, to piece together a complex chain of events from the period.
Their latest measurements, taken in cores from the central United States and the Australian outback, revealed new evidence of a geologic event called the Steptoean Positive Carbon Isotope Excursion (SPICE).
Amounts of carbon and sulfur in the rocks suggest that the event dramatically cooled Earth's climate over two million years -- a very short time by geologic standards. Before the event, the Earth was a hothouse, with up to 20 times more carbon dioxide in the atmosphere compared to the present day. Afterward, the planet had cooled and the carbon dioxide had been replaced with oxygen. The climate and atmospheric composition would have been similar to today.
“If we could go back in time and walk around in the late Cambrian, this seems to be the first time we would have felt at home,” Saltzman said. “Of course, there was no life on land at the time, so it wouldn't have been all that comfortable.”
The land was devoid of plants and animals, but there was life in the ocean, mainly in the form of plankton, sea sponges, and trilobites. Most of the early ancestors of the plants and animals we know today existed during the Cambrian, but life wasn't very diverse.
Then, during the Ordovician period, which began around 490 million years ago, many new species sprang into being. The first coral reefs formed during that time, and the first true fish swam among them. New plants evolved and began colonizing land.
“If you picture the evolutionary ‘tree of life,’ most of the main branches existed during the Cambrian, but most of the smaller branches didn't get filled in until the Ordovician,” Saltzman said. “That's when animal life really began to develop at the family and genus level.” Researchers call this diversification the “Ordovician radiation.”
The composition of the atmosphere has changed many times since, but the pace of change during the Cambrian is remarkable. That's why Saltzman and his colleagues refer to this sudden influx of oxygen during the SPICE event as a “pulse” or “burst.”
“After this pulse of oxygen, the world remained in an essentially stable, warm climate, until late in the Ordovician,” Saltzman said.
He stopped short of saying that the oxygen-rich atmosphere caused the Ordovician radiation.
“We know that oxygen was released during the SPICE event, and we know that it persisted in the atmosphere for millions of years -- during the time of the Ordovician radiation -- so the timelines appear to match up. But to say that the SPICE event triggered the diversification is tricky, because it's hard to tell exactly when the diversification started,” he said.
“We would need to work with paleobiologists who understand how increased oxygen levels could have led to a diversification. Linking the two events precisely in time is always going to be difficult, but if we could link them conceptually, then it would become a more convincing story.”
Researchers have been trying to understand the sudden climate change during the Cambrian period ever since Saltzman found the first evidence of the SPICE event in rock in the American west in 1998. Later, rock from a site in Europe bolstered his hypothesis, but these latest finds in central Iowa and Queensland, Australia, prove that the SPICE event occurred worldwide.
During the Cambrian period, most of the continents as we know them today were either underwater or part of the Gondwana supercontinent, Saltzman explained. Tectonic activity was pushing new rock to the surface, where it was immediately eaten away by acid rain. Such chemical weathering pulls carbon dioxide from the air, traps the carbon in sediments, and releases oxygen -- a kind of greenhouse effect in reverse.
“From our previous work, we knew that carbon was captured and oxygen was released during the SPICE event, but we didn't know for sure that the oxygen stayed in the atmosphere,” Saltzman said.
They compared measurements of inorganic carbon -- captured during weathering -- with organic carbon -- produced by plankton during photosynthesis. And because plankton contain different ratios of the isotopes of carbon depending on the amount of oxygen in the air, the geologists were able to double-check their estimates of how much oxygen was released during the period, and how long it stayed in the atmosphere.
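The isotope measurements rely on the standard delta notation, which expresses the carbon-13 to carbon-12 ratio of a sample relative to a reference standard, in parts per thousand (this is the conventional definition, not a formula specific to this study):

    \delta^{13}\mathrm{C} = \left( \frac{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{sample}}}{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{standard}}} - 1 \right) \times 1000

Because photosynthesis preferentially incorporates the lighter carbon-12, organic carbon is depleted in carbon-13 relative to marine carbonate; tracking the two records together is what lets the team estimate how much organic carbon was buried and, in turn, how much oxygen was left behind in the atmosphere.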
They also studied isotopes of sulfur, to determine whether much of the oxygen being produced was re-captured by sediments.
It wasn't.
Saltzman explained the chain of events this way: Tectonic activity led to increased weathering, which pulled carbon dioxide from the air and cooled the climate. Then, as the oceans cooled to more hospitable temperatures, the plankton prospered -- and in turn created more oxygen through photosynthesis.
“It was a double whammy,” he said. “There's really no way around it when we combine the carbon and sulfur isotope data -- oxygen levels dramatically rose during that time.”
What can this event tell us about climate change today? “Oxygen levels have been stable for the last 50 million years, but they have fluctuated over the last 500 million,” Saltzman said. “We showed that the oxygen burst in the late Cambrian happened over only two million years, so that is an indication of the sensitivity of the carbon cycle and how fast things can change.”
Global cooling may have boosted life early in the Ordovician period, but around 450 million years ago, more tectonic activity -- most likely, the rise of the Appalachian Mountains -- brought on a deadly ice age. So while most of the world's plant and animal species were born during the Ordovician period, by the end of it, more than half of them had gone extinct.
Coauthors on this study included Seth Young, a graduate student in earth sciences at Ohio State; Ben Gill, a graduate student, and Tim Lyons, professor of earth sciences, both at the University of California, Riverside; Lee Kump, professor of geosciences at Penn State University; and Bruce Runnegar, professor of paleontology at the University of California, Los Angeles.
Adapted from materials provided by Ohio State University.

Fausto Intilla

Monday, October 8, 2007

Geologists Recover Rocks Yielding Unprecedented Insights Into San Andreas Fault


Source:

Science Daily — For the first time, geologists have extracted intact rock samples from 2 miles beneath the surface of the San Andreas Fault, the infamous rupture that runs 800 miles along the length of California.
Never before have scientists had available for study rock samples from deep inside one of the actively moving tectonic plate-bounding faults responsible for the world's most damaging earthquakes. Now, with this newly recovered material, scientists hope to answer long-standing questions about the fault's composition and properties.
Altogether, the geologists retrieved 135 feet of 4-inch diameter rock cores weighing roughly 1 ton. They were brought to the surface through a research borehole drilled more than 2.5 miles into the Earth. The last of the cores was brought to the surface in the predawn hours of Sept. 7.
Scientists seeking to understand how the great faults bounding Earth's vast tectonic plates evolve and generate earthquakes have always had to infer the processes through indirect means. Up until now, they could only work with samples of ancient faults exposed at the Earth's surface after millions of years of erosion and uplift, together with computer simulations and laboratory experiments approximating what they think might be happening at the depths at which earthquakes occur.
"Now we can hold the San Andreas Fault in our hands," said Mark Zoback, the Benjamin M. Page Professor in Earth Sciences at Stanford. "We know what it's made of. We can study how it works."
Zoback is one of three co-principal investigators of the San Andreas Fault Observatory at Depth (SAFOD) project, which is establishing the world's first underground earthquake observatory. William Ellsworth and Steve Hickman, geophysicists with the U.S. Geological Survey (USGS) in Menlo Park, Calif., are the other co-principal investigators.
SAFOD, which first broke ground in 2004, is a major research component of EarthScope, a National Science Foundation-funded program being carried out in collaboration with the USGS and NASA to investigate the forces that shape the North American continent and the physical processes controlling earthquakes and volcanic eruptions.
"This is tremendously exciting. Obtaining cores from the actively slipping San Andreas Fault is truly unprecedented and will allow truly transformative research and discoveries," said Kaye Shedlock, EarthScope program director at the National Science Foundation.
In the next phase of the experiment, the science team will install an array of seismic instruments in the 2.5-mile-long borehole that runs from the Pacific plate on the west side of the fault into the North American plate on the east. By placing sensors next to a zone that has been the source of many small temblors, scientists will be able to observe the earthquake generation process with unprecedented acuity. They hope to keep the observatory operating for the next 10 to 20 years.
Studying the San Andreas Fault is important because, as Zoback noted, "The really big earthquakes occur on plate boundaries like the San Andreas Fault." The SAFOD site, located about 23 miles northeast of Paso Robles near the tiny town of Parkfield, sits on a particularly active section of the fault that moves regularly. But it does not produce large earthquakes. Instead, it moves in modest increments by a process called creep, in which the two sides of the fault slide slowly past one another, accompanied by occasional small quakes, most of which are not even felt at the surface.
One of the big questions the researchers seek to answer is how, when most of the fault moves in violent, episodic upheavals, there can be a section where the same massive tectonic plates seem, by comparison, to gently tiptoe past each other with the delicate tread of little cat feet.
"There have been many theories about why the San Andreas Fault slides along so easily, none of which could be tested directly until now," Hickman said. Some posit the presence of especially slippery clays, called smectites. Others suggest there may be high water pressure along the fault plane lubricating the surface. Still others note the presence of a mineral called serpentine exposed in several places along the surface trace of the fault, which-if it existed at depth-could both weaken the fault and cause it to creep.
Zoback said the correlation between the occurrence of serpentine, a metamorphosed remnant of old oceanic crust, and the slippery nature of the fault motion in the area has been the subject of speculation for more than 40 years. However, it has never been demonstrated that serpentine actually occurs along the active San Andreas at depth, and the mechanism by which serpentine might limber up the fault was unknown.
Then, in 2005, when the SAFOD drill pierced the zone of active faulting using rotary drilling (which grinds up the rock into tiny fragments), mineralogist Diane Moore of the USGS detected talc in the rock cuttings brought up to the surface. This finding was published in the Aug. 16, 2007, issue of Nature.
"Talc is one of the slipperiest, weakest minerals ever studied," Hickman said.
Might the same mineral that helps keep a baby's bottom smooth also be smoothing the way for the huge tectonic plates? Chemically, it's possible, for when serpentine is subjected to high temperatures in the presence of water containing silica, it forms talc.
Serpentine might also control how faults behave in other ways. "Serpentine can dissolve in ground water as fault particles grind past each other and then crystallize in nearby open pore spaces, allowing the fault to creep even under very little pressure," Hickman said.
The SAFOD borehole cored into two active traces of the fault this summer, both contained within a broad fault "zone" about 700 feet wide. The deeper of the two active fault zones, designated 10830 for its distance in feet from the surface as measured along the curving borehole, yielded an 8-foot-long section of very fine-grained powder called fault gouge. Such gouge is common in fault zones and is produced by the grinding of rock against rock. "What is remarkable about this gouge is that it contains abundant fragments of serpentine that appear to have been swept up into the gouge from the adjacent solid rock," Hickman said. "The serpentine is floating around in the fault gouge like raisins in raisin pudding."
The only way to know what role serpentine, talc or other exotic minerals play in controlling the behavior of the San Andreas Fault is to study the SAFOD core samples in the laboratory.
"To an earthquake scientist, these cores are like the Apollo moon rocks," Hickman said. "Scientists from around the world are anxious to get their hands on them in the hope that they can help solve the mystery of how this major, active plate boundary works."
Will these new samples allow scientists to predict earthquakes? The short answer is no. But research on these samples could provide clues to answer the question of whether earthquakes are predictable. The observatory will allow scientists to begin to address whether there are precursory phenomena occurring within the fault zone.
The other fault zone, called 10480, contains 3 feet of fault gouge. It also produces small earthquakes at a location about 300 feet below the borehole. "Remarkably, we observe the same earthquake rupturing at the same spot on the fault year after year," Ellsworth said. This repeating earthquake, always about a magnitude 2, will be the focus of the observatory to be installed inside the fault in 2008.
Sensitive seismometers and tiltmeters to be installed in the SAFOD borehole directly above the spot that ruptures will observe, for the first time, the birthing process of an earthquake from the zone where the earthquake energy accumulates. Preliminary observations made in 2006 already have revealed the tiniest earthquakes ever observed -- so small they have negative magnitudes.
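Negative magnitudes follow naturally from the logarithmic magnitude scale. Using the standard moment-magnitude relation (a general seismological formula, not anything specific to SAFOD), with the seismic moment M_0 in newton-meters:

    M_w = \frac{2}{3}\left(\log_{10} M_0 - 9.1\right)

A magnitude 2 repeating earthquake corresponds to a seismic moment of roughly 10^12 newton-meters, while a tiny slip event with a moment near 10^6 newton-meters works out to about magnitude -2, so sensors placed close to the source can record events well below zero.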
In early December, a "sample party" will be held at the USGS office in Menlo Park, where the cores will be on display and scientists will offer their proposals to do research projects in a bid to be allowed to analyze part of the core.
Zoback said most of the initial testing will be nondestructive in order to preserve the samples for as long as possible. "But then, some of the material will be made available for testing that simulates earthquakes and fault slip in the lab," he said.
When not being examined, the core samples will be refrigerated and kept moist to prevent the cores and the fluid in them from being disturbed.
Some of the cores will be on display at the press conference to be held Oct. 4 at Stanford University in Tresidder Union's Oak Room.
In addition to funding from the National Science Foundation, USGS and Stanford University, the SAFOD project also has been supported financially by the International Continental Scientific Drilling Program.
Note: This story has been adapted from material provided by Stanford University.

Fausto Intilla