Thursday, June 17, 2010

Here’s a quick overview of how solar panels work.

What exactly is solar energy?
Solar energy is radiant energy produced by the sun. Every day the sun radiates, or sends out, an enormous amount of energy. The sun radiates more energy in a single second than people have used since the beginning of time!
The sun’s energy comes from within the sun itself. Like other stars, the sun is really a big ball of gases––mostly hydrogen and helium atoms.
The hydrogen atoms in the sun’s core combine to form helium and generate energy in a process called nuclear fusion.

During nuclear fusion, the sun’s extremely high pressure and temperature cause hydrogen atoms to come apart and their nuclei (the central cores of the atoms) to fuse, or combine. Four hydrogen nuclei fuse to become one helium atom. However, the helium atom contains less mass than the four hydrogen atoms that fused. Some matter is lost during nuclear fusion. The lost matter is emitted into space as radiant energy.
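As a rough check on the mass-to-energy conversion described above, one can plug approximate nuclear masses into Einstein's E = mc². The masses below are standard physical constants; the sketch ignores the positrons and neutrinos produced in the real fusion chain.

```python
# Rough check: mass lost when four hydrogen nuclei fuse into one helium
# nucleus, converted to energy via E = m * c^2.
m_hydrogen = 1.6726e-27   # mass of one proton (hydrogen nucleus), kg
m_helium = 6.6447e-27     # mass of one helium-4 nucleus, kg
c = 3.0e8                 # speed of light, m/s

mass_lost = 4 * m_hydrogen - m_helium
energy_joules = mass_lost * c**2
print(f"Mass lost: {mass_lost:.2e} kg -> {energy_joules:.2e} J per helium nucleus")
```

A tiny fraction of the original mass (well under one percent) disappears per fusion event, yet repeated across the sun's core this accounts for its enormous output.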
It takes millions of years for the energy in the sun’s core to make its way to the solar surface, and somewhat over eight minutes to travel the 93 million miles to earth. The solar energy travels to the earth at a speed of 186,000 miles per second, the speed of light.
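The eight-minute figure follows directly from the distance and speed quoted above:

```python
# Check of the travel-time figure: 93 million miles at 186,000 miles/second.
distance_miles = 93_000_000
speed_miles_per_sec = 186_000

seconds = distance_miles / speed_miles_per_sec
minutes = seconds / 60
print(f"Sunlight travel time: {seconds:.0f} s, about {minutes:.1f} minutes")
```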
Only a small part of the energy radiated from the sun into space strikes the earth, one part in two billion. Yet this amount of energy is enormous. On a daily basis, enough energy strikes the United States to supply the nation’s energy needs for one and a half years!

Where does all this energy go?
About 15 percent of the sun’s energy that hits our planet is reflected back into space. Another 30 percent is used to evaporate water, which, lifted into the atmosphere, produces rainfall. Solar power is also absorbed by plants, the land, and the oceans. The remainder could be used to supply our energy needs.
Who invented solar technology?
Humans have harnessed solar energy for hundreds of years. As early as the 7th century B.C., people used simple magnifying glasses to concentrate the light of the sun into beams so hot they would cause wood to catch fire. Over a century ago in France, a scientist used heat from a solar collector to create steam to drive a steam engine. Early in the 20th century, scientists and engineers began researching ways to use solar technology in earnest. One important development was a remarkably efficient solar boiler invented by Charles Greeley Abbott, an American astrophysicist, in 1936.

The solar water heater gained popularity at this time in Florida, California, and the Southwest. The industry started in the early 1920s and was in full swing just before World War II. This growth lasted until the mid-1950s, when low-cost natural gas became the primary fuel for heating American homes.
People and world governments remained largely indifferent to the possibilities of solar technology until the oil shortages of the 1970s. Today, people use solar technology to heat buildings and water and also to generate electricity.
How do we use solar power today?
Solar energy is employed in a variety of ways, of course. There are two standard forms of solar power:

* Solar thermal energy collects the sun's warmth through one of two means: in water or in an anti-freeze (glycol) mixture.

* Solar photovoltaic energy converts the sun's radiation to usable electricity.

Listed below are the five most practical and popular ways that solar energy is used:

1. Small portable solar photovoltaic systems. We see these used everywhere, from calculators to solar garden tools. Portable units can power everything from RV appliances, while single-panel systems are used for traffic signs and remote monitoring stations.

2. Solar pool heating. Running water in direct circulation systems via a solar collector is an extremely practical solution to heat water for your pool or hot spa.

3. Thermal glycol energy to heat water. In this method (indirect circulation), glycol is heated by sunshine and the heat is then transferred to water in a hot water tank. This process of collecting the sun's energy is more practical now than ever before. In areas as far north as Edmonton, Alberta, solar thermal to heat water is economically sound. It can pay for itself in three years or less.

4. Integrating solar photovoltaic energy into your home or office power. In most parts of the planet, solar photovoltaics is an economically feasible approach to supplement the power of your home. In Japan, photovoltaics are competitive with other forms of power. In the USA, new incentive programs make this form of solar technology ever more viable in many states. An increasingly popular and practical way of integrating solar energy into the power of your home or business is through the use of building integrated solar photovoltaics.

5. Large independent photovoltaic systems. If you have enough solar resource at your site, you may be able to go off grid. It's also possible to integrate or hybridize your solar energy system with wind power or other forms of renewable energy to stay 'off the grid.'
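The payback claim in item 3 ("three years or less") is just a ratio of up-front cost to yearly savings. Here is a minimal sketch; both figures are invented assumptions, not quotes for any real system.

```python
# Simple payback sketch for a solar thermal water-heating system (item 3).
installed_cost = 6000.0    # assumed up-front cost, dollars (illustrative)
annual_savings = 2200.0    # assumed yearly water-heating savings (illustrative)

payback_years = installed_cost / annual_savings
print(f"Simple payback: {payback_years:.1f} years")
```

With these assumed numbers the system pays for itself in under three years, consistent with the claim above; real paybacks depend on local fuel prices and sunlight.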

How do photovoltaic panels work?
Silicon is mounted beneath non-reflective glass to create photovoltaic panels. These panels collect photons from the sun and convert them into DC electrical energy. That energy then flows into an inverter, which transforms the DC power into standard-voltage AC power.
Photovoltaic cells are made of semiconductor materials, such as silicon, which is presently the most commonly used. When light strikes the photovoltaic cell, a certain share of it is absorbed by the semiconductor material, transferring the energy of the absorbed light to the semiconductor.

That energy knocks electrons loose, allowing them to flow freely. Photovoltaic cells also have one or more electric fields that force electrons freed by light absorption to flow in a particular direction. This flow of electrons is a current, and by placing metal contacts on the top and bottom of the photovoltaic cell, the current can be drawn off for external use.
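The chain described above (panels produce DC, the inverter converts it to AC) can be sketched with a back-of-the-envelope output estimate. Every number below is an illustrative assumption, not a measured value.

```python
# Back-of-the-envelope estimate of daily AC output from a small PV array:
# panels produce DC, the inverter converts it to AC with some loss.
panel_watts_dc = 250        # assumed rated DC output of one panel, watts
num_panels = 12             # assumed array size
peak_sun_hours = 4.5        # assumed daily full-sun-equivalent hours
inverter_efficiency = 0.95  # assumed DC-to-AC conversion efficiency

dc_kwh_per_day = panel_watts_dc * num_panels * peak_sun_hours / 1000
ac_kwh_per_day = dc_kwh_per_day * inverter_efficiency
print(f"Estimated AC output: {ac_kwh_per_day:.1f} kWh per day")
```

Real systems lose further energy to wiring, temperature, and shading, so actual output would be somewhat lower.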
What are the benefits and drawbacks of solar power?

Solar Pro Arguments:

- Heating our homes with oil or natural gas, or using electricity from power plants running on coal and oil, contributes to climate change and climate disruption. Solar power, on the other hand, is clean and environmentally friendly.

- Solar hot-water heaters require little maintenance, and their initial investment can be recovered within a relatively short time.

- Solar hot-water heaters can work in almost any climate, even very cold ones. You just need to choose the right system for your climate: drainback, thermosyphon, batch-ICS, etc.

- Maintenance costs of solar powered systems are minimal and the warranties long.

- Financial incentives (USA, Canada, European states…) can reduce the cost of the initial investment in solar technologies. The U.S. government, for example, offers tax credits for solar systems certified by the SRCC (Solar Rating and Certification Corporation), which amount to 30 percent of the investment (2009-2016 period).
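A quick sketch of how the 30 percent credit just mentioned affects the up-front cost; the system price here is an illustrative assumption.

```python
# Effect of the 30 percent U.S. federal tax credit on up-front cost.
gross_cost = 8000.0   # assumed installed system cost, dollars (illustrative)
credit_rate = 0.30    # credit for SRCC-certified systems (2009-2016 period)

net_cost = gross_cost * (1 - credit_rate)
print(f"Net cost after credit: ${net_cost:,.0f}")
```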

Solar Cons Arguments:

- The initial investment in solar hot-water heaters or in solar PV electric systems is greater than that required by conventional electric and gas heating systems.

- The payback period of solar PV-electric systems is long, as are those of solar space heating and solar cooling (only solar hot-water heating has a short or relatively short payback).

- Solar water heating does not work well in direct combination with radiators (including baseboard ones).

- Some systems (solar space heating and solar cooling) are costly and rather untested technologies: solar air conditioning is not, so far, a truly economical option.

- The efficiency of solar powered systems depends heavily on available sunlight. In colder climates, where heating and electricity needs are higher, the available sunlight is lower.

About me - Barbara Young writes on motorhome solar power in her personal hobby blog. Her work is centered on helping people save energy using solar power to reduce CO2 emissions and energy dependency.

Thursday, October 1, 2009

Google Earth Application Maps Carbon's Course.

ScienceDaily (Sep. 30, 2009) — Sometimes a picture really is worth a thousand words, particularly when the picture is used to illustrate science. Technology is giving us better pictures every day, and one of them is helping a NASA-funded scientist and her team to explain the behavior of a greenhouse gas.
Google Earth -- the digital globe on which computer users can fly around the planet and zoom in on key features -- is attracting attention in scientific communities and aiding public communication about carbon dioxide. Recently Google held a contest to present scientific results using KML, a data format used by Google Earth.
"I tried to think of a complex data set that would have public relevance," said Tyler Erickson, a geospatial researcher at the Michigan Tech Research Institute in Ann Arbor.
He chose to work with data from NASA-funded researcher Anna Michalak of the University of Michigan, Ann Arbor, who develops complex computer models to trace carbon dioxide back in time to where it enters and leaves the atmosphere.
"The datasets have three spatial dimensions and a temporal dimension," Erickson said. "Because the data is constantly changing in time, it is particularly difficult to visualize and analyze."
A better understanding of the carbon cycle has implications for energy and environmental policy and carbon management. In June 2009, Michalak described this research at the NASA Earth System Science at 20 symposium in Washington, D.C.
A snapshot from Erickson's Google Earth application shows green tracks representing carbon dioxide in the lowest part of the atmosphere close to Earth's surface where vegetation and land processes can impact the carbon cycle. Red tracks indicate particles at higher altitudes that are immune from ground influences.
The application is designed to educate the public and even scientists about how carbon dioxide emissions can be traced. A network of 1,000-foot towers across the United States is equipped with instruments by NOAA to measure the carbon dioxide content of parcels of air at single locations.
But where did that gas come from and how did it change along its journey? To find out, scientists rely on a sleuthing technique called "inverse modeling" – measuring gas concentrations at a single geographic point and then using clues from weather and atmospheric models to deduce where it came from. The technique is complex and difficult to explain even to fellow scientists.
Michalak related the technique to cream in a cup of coffee. "Say someone gave you a cup of creamy coffee," Michalak said. "How do you know when that cream was added?" Just as cream is not necessarily mixed perfectly, neither is the carbon dioxide in the atmosphere. If you can see the streaks of cream (carbon dioxide) and understand how the coffee (atmosphere) was stirred (weather), then scientists can use those clues to retrace the time and location that the ingredient was added to the mix.
The visual result typically used by scientists is a static two-dimensional map of the location of the gas, as averaged over the course of a month. Most carbon scientists know how to interpret the 2D map, but visualizing the 3D changes for non-specialists has proved elusive. Erickson spent 70 hours programming the Google Earth application that makes it easy to navigate through time and watch gas particles snake their way toward the NOAA observation towers. For his work, Erickson was declared one of Google's winners in March 2009.
"Having this visual tool allows us to better explain the scientific process," Michalak said. "It's a much more human way of looking at the science."
The next step, Erickson said, is to adapt the application to fit the needs of the research community. Scientists could use the program to better visualize the output of complex atmospheric models and then improve those models so that they better represent reality.
"Encouraging more people to deliver data in an interactive format is a good trend," Erickson said. "It should help innovation in research by reducing barriers to sharing data."
Adapted from materials provided by

San Andreas Affected By 2004 Sumatran Quake; Largest Quakes Can Weaken Fault Zones Worldwide.


ScienceDaily (Sep. 30, 2009) — U.S. seismologists have found evidence that the massive 2004 earthquake that triggered killer tsunamis throughout the Indian Ocean weakened at least a portion of California's famed San Andreas Fault. The results, which appear this week in the journal Nature, suggest that the Earth's largest earthquakes can weaken fault zones worldwide and may trigger periods of increased global seismic activity.
"An unusually high number of magnitude 8 earthquakes occurred worldwide in 2005 and 2006," said study co-author Fenglin Niu, associate professor of Earth science at Rice University. "There has been speculation that these were somehow triggered by the Sumatran-Andaman earthquake that occurred on Dec. 26, 2004, but this is the first direct evidence that the quake could change the strength of a fault remotely."
Earthquakes are caused when a fault fails, either because of the buildup of stress or because of the weakening of the fault. The latter is more difficult to measure.
The magnitude 9 earthquake in 2004 occurred beneath the ocean west of Sumatra and was the second-largest quake ever measured by seismograph. The temblor spawned tsunamis as large as 100 feet that killed an estimated 230,000, mostly in Indonesia, Sri Lanka, India and Thailand.
In the new study, Niu and co-authors Taka'aki Taira and Paul Silver, both of the Carnegie Institution of Science in Washington, D.C., and Robert Nadeau of the University of California, Berkeley, examined more than 20 years of seismic records from Parkfield, Calif., which sits astride the San Andreas Fault.
The team zeroed in on a set of repeating microearthquakes that occurred near Parkfield over two decades. Each of these tiny quakes originated in almost exactly the same location. By closely comparing seismic readings from these quakes, the team was able to determine the "fault strength" -- the shear stress level required to cause the fault to slip -- at Parkfield between 1987 and 2008.
The team found that fault strength changed markedly three times during the 20-year period. The authors surmised that the 1992 Landers earthquake, a magnitude 7 quake north of Palm Springs, Calif. -- about 200 miles from Parkfield -- caused the first of these changes. The study found the Landers quake destabilized the fault near Parkfield, causing a series of magnitude 4 quakes and a notable "aseismic" event -- a movement of the fault that played out over several months -- in 1993.
The second change in fault strength occurred in conjunction with a magnitude 6 earthquake at Parkfield in September 2004. The team found another change at Parkfield later that year that could not be accounted for by the September quake alone. Eventually, they were able to narrow the onset of this third shift to a five-day window in late December during which the Sumatran quake occurred.
"The long-range influence of the 2004 Sumatran-Andaman earthquake on this patch of the San Andreas suggests that the quake may have affected other faults, bringing a significant fraction of them closer to failure," said Taira. "This hypothesis appears to be borne out by the unusually high number of large earthquakes that occurred in the three years after the Sumatran-Andaman quake."
The research was supported by the National Science Foundation, the Carnegie Institution of Washington, the University of California, Berkeley, and the U.S. Geological Survey.
Adapted from materials provided by
Rice University.

Saturday, July 25, 2009

Auroras In Northern And Southern Hemispheres Are Not Identical

ScienceDaily (July 24, 2009) — Norwegian researchers have shown that the auroras in the Northern and the Southern hemispheres can be totally asymmetric. These findings contradict the common assumption that auroras are mirror images of each other.
The study was performed by PhD student Karl Magnus Laundal and professor Nikolai Østgaard at the Institute of Physics and Technology at the University of Bergen.
"The aurora is produced by collisions between the Earth’s atmosphere and electrically charged particles streaming along the Earth’s geomagnetic field lines. Since these processes occur above both hemispheres, both the Northern and the Southern lights are created. So far researchers have assumed that these auroras are mirror images of each other, but our findings show that this is not always the case," professor Nikolai Østgaard says.
The researchers at the University of Bergen used data from the two NASA satellites IMAGE and Polar to examine the Northern and the Southern lights. In the Nature letter they present several possible explanations for the unexpected result.
"The most plausible explanation involves electrical currents along the magnetic field lines. Differences in the solar exposure may lead to currents between the two hemispheres, explaining why the Northern and the Southern light are not identical," PhD student Karl Magnus Laundal says.
In addition to yielding new knowledge about the aurora, the results are important for other researchers studying the near-Earth space.
"Our study shows that data from only one hemisphere is not sufficient to state the conditions on the other hemisphere. This is important because most of our knowledge about the aurora, as well as processes in the upper atmosphere in the polar regions, is based solely on data from the Northern hemisphere," Østgaard points out.
The work of Østgaard and Laundal has been financed by the Research Council of Norway via the International Polar Year project IPY-ICESTAR.
Journal reference:
Laundal, K.M. and Østgaard, N. Asymmetric auroral intensities in the Earth's Northern and Southern hemispheres. Nature, 2009; 460 (7254): 491-493. DOI: 10.1038/nature08154
Adapted from materials provided by
University of Bergen, via AlphaGalileo.

Wednesday, July 22, 2009

California's Channel Islands Hold Evidence Of Clovis-age Comets

ScienceDaily (July 21, 2009) — A 17-member team has found what may be the smoking gun of a much-debated proposal that a cosmic impact about 12,900 years ago ripped through North America and drove multiple species into extinction.
In a paper appearing online ahead of regular publication in the Proceedings of the National Academy of Sciences, University of Oregon archaeologist Douglas J. Kennett and colleagues from nine institutions and three private research companies report the presence of shock-synthesized hexagonal diamonds in 12,900-year-old sediments on the Northern Channel Islands off the southern California coast.
These tiny diamonds and diamond clusters were buried below four meters of sediment. They date to the end of Clovis -- a Paleoindian culture long thought to be North America's first human inhabitants. The nano-sized diamonds were pulled from Arlington Canyon on Santa Rosa Island, which had once been joined with three other Northern Channel Islands in a landmass known as Santarosae.
The diamonds were found in association with soot, which forms in extremely hot fires, and they suggest associated regional wildfires, based on nearby environmental records.
Such soot and diamonds are rare in the geological record. They were previously found in sediment dating to the massive asteroid impact 65 million years ago, in a layer widely known as the K-T boundary. That thin layer of iridium-and-quartz-rich sediment dates to the transition between the Cretaceous and Tertiary periods, which marks the end of the Mesozoic Era and the beginning of the Cenozoic Era.
"The type of diamond we have found -- Lonsdaleite -- is a shock-synthesized mineral defined by its hexagonal crystalline structure. It forms under very high temperatures and pressures consistent with a cosmic impact," Kennett said. "These diamonds have only been found thus far in meteorites and impact craters on Earth and appear to be the strongest indicator yet of a significant cosmic impact [during Clovis]."
The age of this event also matches the extinction of the pygmy mammoth on the Northern Channel Islands, as well as numerous other North American mammals, including the horse, which Europeans later reintroduced. In all, an estimated 35 mammal and 19 bird genera became extinct near the end of the Pleistocene with some of them occurring very close in time to the proposed cosmic impact, first reported in October 2007 in PNAS.
In the Jan. 2, 2009, issue of the journal Science, a team led by Kennett reported the discovery of billions of nanometer-sized diamonds concentrated in sediments -- weighing from about 10 to 2,700 parts per billion -- in six North American locations.
"This site, this layer with hexagonal diamonds, is also associated with other types of diamonds and with dramatic environmental changes and wildfires," said James Kennett, paleoceanographer and professor emeritus in the Department of Earth Science at the University of California, Santa Barbara.
"There was a major event 12,900 years ago," he said. "It is hard to explain this assemblage of materials without a cosmic impact event and associated extensive wildfires. This hypothesis fits with the abrupt cooling of the atmosphere as shown in the record of ocean drilling of the Santa Barbara Channel. The cooling resulted when dust from the high-pressure, high-temperature, multiple impacts was lofted into the atmosphere, causing a dramatic drop in solar radiation."
The hexagonal diamonds from Arlington Canyon were analyzed at the UO's Lorry I. Lokey Laboratories, a world-class nanotechnology facility built deep in bedrock to allow for sensitive microscopy and other high-tech analyses of materials. The analyses were done in collaboration with FEI, a Hillsboro, Ore., company that distributes the high-resolution Titan microscope used to characterize the hexagonal diamonds in this study.
Transmission and scanning electron microscopes were used in the extensive analyses of the sediment that contained clusters of Lonsdaleite ranging in size from 20 to 1,800 nanometers. These diamonds were inside or attached to carbon particles found in the sediments.
These findings are inconsistent with the alternative and already hotly debated theory that overhunting by Clovis people led to the rapid extinction of large mammals at the end of the ice age, the research team argues in the PNAS paper. An alternative theory has held that climate change was to blame for these mass extinctions. The cosmic-event theory suggests that rapid climate change at this time was possibly triggered by a series of small and widely dispersed comet strikes across much of North America.
The National Science Foundation provided primary funding for the research. Additional funding was provided by way of Richard A. Bray and Philip H. Knight faculty fellowships of the University of Oregon, respectively, to Kennett and UO colleague Jon M. Erlandson, a co-author and director of the UO's Museum of Natural and Cultural History.
The 17 co-authors on the PNAS paper are Douglas Kennett, Erlandson and Brendan J. Culleton, all of the University of Oregon; James P. Kennett of UC Santa Barbara; Allen West of GeoScience Consulting in Arizona; G. James West of the University of California, Davis; Ted E. Bunch and James H. Wittke, both of Northern Arizona University; Shane S. Que Hee of the University of California, Los Angeles; John R. Johnson of the Santa Barbara Museum of Natural History; Chris Mercer of the Santa Barbara Museum of Natural History and National Institute of Materials Science in Japan; Feng Shen of the FEI Co.; Thomas W. Stafford of Stafford Research Inc. of Colorado; Adrienne Stich and Wendy S. Wolbach, both of DePaul University in Chicago; and James C. Weaver of the University of California, Riverside.
About the University of Oregon
The University of Oregon is a world-class teaching and research institution and Oregon's flagship public university. The UO is a member of the Association of American Universities (AAU), an organization made up of the 62 leading public and private research institutions in the United States and Canada. The UO is one of only two AAU members in the Pacific Northwest.
Adapted from materials provided by University of Oregon.

Wednesday, July 15, 2009

First Remote, Underwater Detection Of Harmful Algae, Toxins

ScienceDaily (July 15, 2009) — Scientists at NOAA's National Centers for Coastal Ocean Science and the Monterey Bay Aquarium Research Institute (MBARI) have successfully conducted the first remote detection of a harmful algal species and its toxin below the ocean's surface. The achievement was recently reported in the June issue of Oceanography.
This achievement represents a significant milestone in NOAA's effort to monitor the type and toxicity of harmful algal blooms (HABs). HABs are considered to be increasing not only in their global distribution, but also in the frequency, duration, and severity of their effects. HABs damage coastal ecosystem health and pose threats to humans as well as marine life. Climate change is expected to exacerbate this trend, since many critical processes that govern HABs dynamics, such as water temperature and ocean circulation, are influenced by climate.
An MBARI-designed robotic instrument called the Environmental Sample Processor, or 'ESP' -- a fully functional analytical laboratory in the sea -- lets researchers collect algal cells and extract the genetic information required for organism identification, as well as the toxin needed to assess the risk to humans and wildlife. The ESP then conducts specialized, molecular-based measurements of species and toxin abundance, and transmits results to the laboratory via radio signals.
"This represents the first autonomous detection of both a HAB species and its toxin by an underwater sensor," notes Greg Doucette, Ph.D., a research oceanographer at NOAA's Center for Coastal Environmental Health and Biomolecular Research laboratory in Charleston, S.C. "It allows us to determine not only the organism causing a bloom, but also the toxicity of the event, which ultimately dictates whether it is a threat to the public and the ecosystem."
For the first demonstration of the ESP's ability to detect HABs and their toxins, Doucette and his MBARI colleague, Chris Scholin, Ph.D., targeted certain members of the algal genus Pseudo-nitzschia and their neurotoxin, domoic acid, in Monterey Bay, Calif.
Pseudo-nitzschia and domoic acid have been a concern in the Monterey Bay area for well over a decade. In 1991, the first U.S. outbreak of domoic acid poisoning was documented in Monterey Bay. This outbreak resulted in the unusual deaths of numerous pelicans and cormorants that ingested sardines and anchovies, which had accumulated the domoic acid by feeding on a bloom of the toxic algae.
In the spring of 1998, a mass mortality of sea lions in and around the Monterey Bay area was attributed to the sea lions' feeding on domoic acid contaminated anchovies. Since that time, Pseudo-nitzschia and domoic acid have appeared on virtually an annual basis in California coastal waters and are the objects of an intensive statewide monitoring program run by the California Dept. of Public Health. Humans also can be affected by the toxin through consumption of contaminated seafood such as shellfish.
"Our public health monitoring program is one of the many groups that can benefit directly from the ESP technology and ability to provide an early warning of impending bloom activity and toxicity," said Gregg Langlois, director of the state of California's Marine Biotoxin Monitoring Program. "This is critical information for coastal managers and public health officials in mitigating impacts on the coastal ecosystem, since the toxicity of these algae can vary widely from little or no toxicity to highly toxic."
Beyond improving forecasting of HABs, this research will contribute to the rapidly emerging U.S. Integrated Ocean Observing System (IOOS) by adding a new way to make coastal ocean observations.
Adapted from materials provided by National Oceanic and Atmospheric Administration.