
Sunday, August 16, 2009

Studying How Germs Spread

Mark Nicas has given some of his best years to spittle. He builds models – the mathematical kind – of how someone else's slobber ends up on you. The size of the particles, whether they come out in a dry cough or a wet sneeze, their evaporation rate, air speed – these are all complications, reasons why people like Nicas can spend careers piling up academic papers, all the while building up a healthy respect for pathogens.

Nicas, whose day job is at the University of California, Berkeley, is one of a team of scientists affiliated with the Center for Advancing Microbial Risk Assessment (CAMRA), funded jointly by the U.S. Department of Homeland Security Science and Technology Directorate (DHS S&T) and the U.S. Environmental Protection Agency (EPA).
"In terms of homeland security, knowing how germs are spread is an important factor in countermeasures for potential biological attacks or pandemics," says Dr. Matthew Clark, Director of DHS S&T's Office of University Programs, who helps fund Nicas' research.
As an interdisciplinary research hub, CAMRA's goal is to help DHS S&T understand the risks associated with certain biological agents, and build a national network beyond the scientific community for sharing those insights.
Statistical predictions about flying saliva may seem like academic caricature. But they have important real-world applications to terrorist biological attacks and deadly diseases like bird flu that can ripple quickly through American cities. Disaster comes from the mouth, warns an ancient Chinese proverb on the dangers of linguistic drivel. But understanding the infectious potential of biological drivel may be the secret to restoring national health in a pandemic.
"When you get on an airplane, it's always best to sit at least three rows from a coughing person," said Nicas. "You don't know what they have."
Nicas used a Department of Homeland Security grant to test his airborne dispersion model for large and small particles in a small laboratory.
He isn't kidding about the airplane advice. It's a version of the three-foot rule—common in infection control circles—which holds that person-to-person transmission of pathogens through inhalation typically occurs within three feet. Beyond that range, the large particles that carry most of the pathogens fall out of the air quickly. On airplanes, the risk of infection declines rapidly between rows because the cabin is designed to circulate air within, not between, rows.
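The size dependence behind the three-foot rule can be sketched with Stokes' law, which gives the terminal settling speed of a small sphere in still air. This is a back-of-the-envelope illustration, not Nicas's actual dispersion model; the droplet density, air viscosity, and cough height below are assumed round numbers.

```python
# Illustrative Stokes-law estimate of how quickly respiratory droplets
# settle out of still air. Not Nicas's model -- a rough sketch with
# assumed round-number parameters.

RHO_DROPLET = 1000.0  # kg/m^3, density of water (assumed)
MU_AIR = 1.8e-5       # Pa*s, dynamic viscosity of air near room temperature
G = 9.81              # m/s^2, gravitational acceleration

def settling_speed(diameter_m):
    """Stokes terminal velocity for a small sphere falling in still air."""
    return RHO_DROPLET * G * diameter_m**2 / (18.0 * MU_AIR)

MOUTH_HEIGHT = 1.5    # m, assumed height of a cough above the floor

for microns in (100, 10):
    d = microns * 1e-6
    v = settling_speed(d)
    print(f"{microns:>3} um droplet: falls at {v:.3f} m/s, "
          f"airborne for ~{MOUTH_HEIGHT / v:.0f} s")
```

A 100 µm droplet falls at roughly 0.3 m/s and reaches the floor in about five seconds, so it can't travel far from the cougher; a 10 µm droplet settles a hundred times more slowly and can stay airborne for minutes, which is why the smallest particles matter most for longer-range spread.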
You might wonder if all that time spent thinking about germs might make Nicas obsessive about his own hygiene.
"I have a good sense of the risks," concedes Nicas, "probably more than most people. I try not to shake hands with people who have a cold. I tell my son to wash his hands. But I don't Lysol my counter every 10 minutes."

Transparent Aluminum Is ‘New State Of Matter’

Oxford scientists have created a transparent form of aluminium by bombarding the metal with the world’s most powerful soft X-ray laser. ‘Transparent aluminium’ previously only existed in science fiction, featuring in the movie Star Trek IV, but the real material is an exotic new state of matter with implications for planetary science and nuclear fusion.
In the journal Nature Physics an international team, led by Oxford University scientists, report that a short pulse from the FLASH laser ‘knocked out’ a core electron from every aluminium atom in a sample without disrupting the metal’s crystalline structure. This turned the aluminium nearly invisible to extreme ultraviolet radiation.
‘What we have created is a completely new state of matter nobody has seen before,’ said Professor Justin Wark of Oxford University’s Department of Physics, one of the authors of the paper. ‘Transparent aluminium is just the start. The physical properties of the matter we are creating are relevant to the conditions inside large planets, and we also hope that by studying it we can gain a greater understanding of what is going on during the creation of 'miniature stars' created by high-power laser implosions, which may one day allow the power of nuclear fusion to be harnessed here on Earth.’
The discovery was made possible with the development of a new source of radiation that is ten billion times brighter than any synchrotron in the world (such as the UK’s Diamond Light Source). The FLASH laser, based in Hamburg, Germany, produces extremely brief pulses of soft X-ray light, each of which is more powerful than the output of a power plant that provides electricity to a whole city.
The Oxford team, along with their international colleagues, focused all this power down into a spot with a diameter less than a twentieth of the width of a human hair. At such high intensities the aluminium turned transparent.
Whilst the invisible effect lasted for only an extremely brief period – an estimated 40 femtoseconds – it demonstrates that such an exotic state of matter can be created using very high power X-ray sources.
Professor Wark added: ‘What is particularly remarkable about our experiment is that we have turned ordinary aluminium into this exotic new material in a single step by using this very powerful laser. For a brief period the sample looks and behaves in every way like a new form of matter. In certain respects, the way it reacts is as though we had changed every aluminium atom into silicon: it’s almost as surprising as finding that you can turn lead into gold with light!’
The researchers believe that the new approach is an ideal way to create and study such exotic states of matter and will lead to further work relevant to areas as diverse as planetary science, astrophysics and nuclear fusion power.
A report of the research, ‘Turning solid aluminium transparent by intense soft X-ray photoionization’, is published in Nature Physics. The research was carried out by an international team led by Oxford University scientists Professor Justin Wark, Dr Bob Nagler, Dr Gianluca Gregori, William Murphy, Sam Vinko and Thomas Whitcher.

Scientists Capitalize On Extended Solar Eclipse

On Wednesday, July 22, 2009, a total eclipse of the Sun was visible from within a narrow corridor that traversed half of Earth. The path of the Moon's umbral shadow began in India and crossed Nepal, Bangladesh, Bhutan, Myanmar and China. After leaving mainland Asia, the path crossed Japan's Ryukyu Islands and curved southeast through the Pacific Ocean, where the maximum duration of totality reached 6 min 39 s. A partial eclipse was seen within the much broader path of the Moon's penumbral shadow, which included most of eastern Asia, Indonesia, and the Pacific Ocean.
The moon's shadow traced a path across the world's two most populous countries before racing across the Pacific, providing a view of totality for five minutes and 36 seconds for scientists gathered here from around the world as part of the Williams College Eclipse Expedition.
"We saw it! The clouds kept getting thinner, and we even had a pretty good-sized hole in the clouds for the five minutes of totality," reported Expedition Leader Jay Pasachoff, Field Memorial Professor of Astronomy at Williams and chair of the International Astronomical Union's Working Group on Solar Eclipses.
"Everyone saw all the coronal phenomena. The diamond rings were spectacular. Just before totality, the clouds were just the right thickness that allowed us to see partial phases without filters.
"All our equipment seems to have worked, so now we still have an hour or so of partial eclipse to image, and then we will download photos and start looking at them. The oscillation experiment has a lot of data through two filters, and we will assess later whether comparison of the two channels allows us to account for the cloud cover," Pasachoff said by email from China.
He was observing his 49th solar eclipse.
Pasachoff and his colleagues are capturing data over many eclipses to understand better why the Sun's corona, the outer halo of million-degree gas, shines hotter than the Sun itself. Most of the corona is visible from Earth only for the fleeting time that the moon totally blocks the Sun's direct rays.
They use a special rapid-readout electronic camera and single-color filters chosen to show only coronal gas, looking for oscillations with periods of around one second, which would signify certain classes of magnetic waves. The detailed structure of the corona, revealed by imaging in the visible and x-ray regions of the spectrum, and the correspondence of bright coronal regions with sunspot groups, show that magnetism drives both coronal heating and coronal structure. Competing explanations involve relatively tiny solar flares going off all the time.
Pasachoff's work with Miloslav Druckmuller of the Brno University of Technology in the Czech Republic and with Vojtech Rusin and Metod Saniga of the solar observatory in Slovakia has led to several joint papers in the Astrophysical Journal on views of the changing corona.
The expedition includes Bryce Babcock, staff physicist, and several undergraduate students from Williams and has been supported mainly by a grant from the Committee for Research and Exploration of the National Geographic Society.
The next total eclipse of the Sun, on July 11, 2010, will occur in the South Pacific and hit land only in the Cook Islands, Easter Island, and a small section of southern Chile and Argentina.

Cosmic Meddling With The Clouds By Seven-day Magic

Billions of tonnes of water droplets vanish from the atmosphere, as if by magic, in events that reveal in detail how the Sun and the stars control our everyday clouds. Researchers have traced the consequences of eruptions on the Sun that screen the Earth from some of the cosmic rays - the energetic particles raining down on our planet from exploded stars.
"The Sun makes fantastic natural experiments that allow us to test our ideas about its effects on the climate," says Prof. Henrik Svensmark, lead author of a report newly published in Geophysical Research Letters. When solar explosions interfere with the cosmic rays there is a temporary shortage of small aerosols, chemical specks in the air that normally grow until water vapour can condense on them, so seeding the liquid water droplets of low-level clouds. Because of the shortage, clouds over the ocean can lose as much as 7 per cent of their liquid water within seven or eight days of the cosmic-ray minimum.
"A link between the Sun, cosmic rays, aerosols, and liquid-water clouds appears to exist on a global scale," the report concludes. This research, to which Torsten Bondo and Jacob Svensmark contributed, validates 13 years of discoveries that point to a key role for cosmic rays in climate change. In particular, it connects observable variations in the world's cloudiness to laboratory experiments in Copenhagen showing how cosmic rays help to make the all-important aerosols.
Other investigators have reported difficulty in finding significant effects of the solar eruptions on clouds, and Henrik Svensmark understands their problem. "It's like trying to see tigers hidden in the jungle, because clouds change a lot from day to day whatever the cosmic rays are doing," he says. The first task for a successful hunt was to work out when "tigers" were most likely to show themselves, by identifying the most promising instances of sudden drops in the count of cosmic rays, called Forbush decreases. Previous research in Copenhagen predicted that the effects should be most noticeable in the lowest 3000 metres of the atmosphere. The team identified 26 Forbush decreases since 1987 that caused the biggest reductions in cosmic rays at low altitudes, and set about looking for the consequences.

Evidence Of Liquid Water In Comets Reveals Possible Origin Of Life

Comet Hale-Bopp.

Comets have contained vast amounts of liquid water in their interiors during the first million years of their formation, a new study claims.

The watery environment of early comets, together with the vast quantity of organics already discovered in comets, would have provided ideal conditions for primitive bacteria to grow and multiply. So argue Professor Chandra Wickramasinghe and his colleagues at the Cardiff Centre for Astrobiology in a paper published in the International Journal of Astrobiology.
The Cardiff team has calculated the thermal history of comets after they formed from interstellar and interplanetary dust approximately 4.5 billion years ago. The formation of the solar system itself is thought to have been triggered by shock waves that emanated from the explosion of a nearby supernova. The supernova injected radioactive material such as Aluminium-26 into the primordial solar system and some became incorporated in the comets. Professor Chandra Wickramasinghe together with Drs Janaki Wickramasinghe and Max Wallis claim that the heat emitted from radioactivity warms initially frozen material of comets to produce subsurface oceans that persist in a liquid condition for a million years.
Professor Wickramasinghe said: "These calculations, which are more exhaustive than any done before, leave little doubt that a large fraction of the 100 billion comets in our solar system did indeed have liquid interiors in the past.
"Comets in recent times could also liquefy just below their surfaces as they approach the inner solar system in their orbits. Evidence of such melting has been discovered in pictures of comet Tempel 1 taken by the 'Deep Impact' probe in 2005."
The existence of liquid water in comets gives added support for a possible connection between life on Earth and comets. The theory, known as cometary panspermia, pioneered by Chandra Wickramasinghe and the late Sir Fred Hoyle argues the case that life was introduced to Earth by comets.

Crashing Comets Not Likely The Cause Of Earth's Mass Extinctions

A long-period comet called 2001 RX14 (Linear) turned up in images captured in 2002 by the Sloan Digital Sky Survey telescope in New Mexico.

Scientists have debated how many mass extinction events in Earth's history were triggered by a space body crashing into the planet's surface. Most agree that an asteroid collision 65 million years ago brought an end to the age of dinosaurs, but there is uncertainty about how many other extinctions might have resulted from asteroid or comet collisions with Earth.
In fact, astronomers know the inner solar system has been protected at least to some degree by Saturn and Jupiter, whose gravitational fields can eject comets into interstellar space or sometimes send them crashing into the giant planets. That point was reinforced July 20 when a huge scar appeared on Jupiter's surface, likely evidence of a comet impact.
New University of Washington research indicates it is highly unlikely that comets have caused any mass extinctions or have been responsible for more than one minor extinction event. The work also shows that many long-period comets that end up in Earth-crossing orbits likely originate from a region astronomers have long believed could not produce observable comets. A long-period comet takes from 200 years to tens of millions of years to make a single orbit of the sun.

Discovering A New Earth 430 Light Years Away

Astronomers Spy Earth-like Planet Forming Around Distant Star

Astrophysicists analyzing infrared images captured by the Spitzer Space Telescope found indications of a dust cloud surrounding a relatively young star. The star is 10 to 16 million years old, and analysis of the dust cloud suggests that it may coalesce into a rocky planet like Earth. It lies at a distance from the star at which it may build an atmosphere, collect liquid water, and perhaps, in millions and millions of years, support life.
It took billions of years and the perfect conditions for our Earth to grow and form. Now, those same conditions can be seen in space, shaping a similar planet. Ivanhoe explains this exciting space discovery.
Far, far away, something amazing is brewing in space. Swirling around a giant star similar to our sun, astrophysicists have spotted the very early stages of a planet taking shape.
"What we think we're seeing is the actual formation of a planet -- terrestrial planet -- a rocky planet like the Earth, around the star," Carey Lisse, Ph.D., a senior research scientist at Johns Hopkins Applied Physics Laboratory in Laurel, Md., told Ivanhoe.
The Earth-like planet is about 430 light years away, or 2.5×10¹⁵ miles, from Earth. It's inside a huge dust belt -- bigger than our asteroid belt -- with enough dusty material to build a planet. "The material is forming at just the same distance, or close to the same distance where the Earth formed from the sun," Dr. Lisse says.
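The mileage figure is straightforward unit arithmetic: a light year is the distance light travels in one Julian year, and 430 of them come to about 2.5×10¹⁵ miles. A minimal sketch of the conversion:

```python
# Convert 430 light years to miles from first principles.
C = 299_792_458.0                    # speed of light, m/s (exact by definition)
SECONDS_PER_YEAR = 365.25 * 86_400   # Julian year, seconds
METERS_PER_MILE = 1_609.344          # international mile, meters

light_year_miles = C * SECONDS_PER_YEAR / METERS_PER_MILE
distance_miles = 430 * light_year_miles
print(f"430 light years = {distance_miles:.2e} miles")  # ~2.5e15
```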
To find the planet, astronomers used images captured by the Spitzer Space Telescope. It looks for infrared light or heat radiating from the dusty materials. The images also confirm the rocky fragments forming the new planet are similar to materials found in the Earth's crust and core.
"So, the body that's going to form -- the planet that's going to form -- isn't going to be this gas giant with incredibly thick atmosphere," explains Dr. Lisse. "It's going to be a rocky planet like Mars or Venus or the Earth."
There's also an outer ice belt circling the young planet, making it more likely that water could reach the new planet's surface … and maybe even life; but don't wait around for signs of life. The planet still needs another 100 million years before it's completely formed.
Astronomers say the star the new planet is spinning around is between 10 and 16 million years old, which is the perfect age for forming Earth-like planets.

ABOUT THE SPITZER TELESCOPE: The Spitzer Space Telescope was launched on 25 August 2003. Spitzer detects the infrared energy radiated by objects in space. Most of this infrared radiation is blocked by the Earth's atmosphere and cannot be observed from the ground. Spitzer allows us to peer into regions of space that are hidden from optical telescopes.
Many areas of space are filled with vast, dense clouds of gas and dust that block our view. Infrared light, however, can penetrate these clouds, allowing us to peer into regions of star formation, the centers of galaxies, and into newly forming planetary systems. Infrared also brings us information about the cooler objects in space, such as smaller stars which are too dim to be detected by their visible light, extrasolar planets, and giant molecular clouds. Also, many molecules in space, including organic molecules, have their unique signatures in the infrared.
WHAT IS INFRARED LIGHT? Infrared radiation is an invisible form of light that we usually detect as heat, like the sun shining on our face, or the warmth of a campfire. It has all the same properties as visible light: for example, it can be focused and reflected. The only difference is that it has a longer wavelength, which means we can't see it with the naked eye. Light is made of tiny particles called photons, and the wavelength tells us how rapidly the light wave oscillates -- its frequency. The shorter the wavelength, the higher the frequency and the more energetic the photons. Shorter light waves look blue, and longer ones look red.
The wavelength of infrared light is so long that we can't see it at all. Any warm object gives off infrared radiation. By checking in the infrared spectrum, engineers can find heat leaks in buildings, doctors can find hidden tumors in the body, and biologists can locate diseased plants in a forest. Astronomers use infrared imaging to detect warm dust around new stars that are not yet "hot" enough to emit visible light.
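The claim that infrared simply has a longer wavelength than visible light can be made quantitative with the relation c = λν (the speed of light equals wavelength times frequency). A minimal sketch comparing one visible and one infrared wavelength; the specific wavelengths are illustrative choices, not values from the article:

```python
# Wavelength <-> frequency for light: c = wavelength * frequency.
C = 2.998e8  # speed of light, m/s

def frequency_hz(wavelength_m):
    """Frequency of a light wave with the given wavelength, in hertz."""
    return C / wavelength_m

# Visible green light (~500 nm) vs. mid-infrared (~10 um, thermal emission)
for name, wavelength in (("green light, 500 nm", 500e-9),
                         ("mid-infrared, 10 um", 10e-6)):
    print(f"{name}: {frequency_hz(wavelength):.2e} Hz")
```

The 10 µm infrared wave oscillates twenty times more slowly than the 500 nm visible wave, which is exactly why it carries less energy per photon and registers as gentle heat rather than visible light.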
The American Astronomical Society and the American Geophysical Union contributed to the information contained in the video portion of this report.

Bad News For Coffee Drinkers Who Get Headaches

Consuming high amounts of caffeine each day has been linked in new research to a greater likelihood of suffering occasional headaches. But, oddly enough, low caffeine consumption was associated with a greater likelihood of chronic headaches.

People who consume high amounts of caffeine each day are more likely to suffer occasional headaches than those with low caffeine consumption, a team of researchers at the Norwegian University of Science and Technology (NTNU) reports in a study recently published in The Journal of Headache and Pain.
But in findings that had “no obvious reason”, the researchers, led by Knut Hagen from NTNU’s Faculty of Medicine, also reported that low caffeine consumption was associated with a greater likelihood of chronic headaches, defined as headaches for 14 or more days each month.
The results are drawn from a large cross-sectional study of 50,483 people who answered a questionnaire about caffeine consumption and headache prevalence as a part of the Nord-Trøndelag Health Survey (HUNT 2), a county-wide health survey conducted in 1995-1997 on a wide range of health topics.
To drink or not to drink
Caffeine is the world’s most commonly consumed stimulant, and has long been known to have both positive and negative effects on headaches. For example, caffeine is a common ingredient in headache analgesics because it can help relieve headaches.
But research worldwide into the relationship between caffeine consumption and headache provides no relief to headache sufferers wondering whether they should drink more coffee or less. Some studies have shown that high caffeine consumption increases the prevalence of headaches and migraines, while other studies have shown no such relationship.
At the same time, headaches are costly both to society, in work hours lost, and to individuals themselves. The World Health Organisation, for example, ranks migraine 19th among all causes of disability based on a measure called “years lived with disability”.
The issue is of particular interest in Scandinavia, because Scandinavians are heavy coffee drinkers, consuming on average about 400 mg of caffeine per day. That is roughly twice the average caffeine consumption in other European countries and in the US, and equates to roughly 4 cups of brewed coffee per day, although caffeine levels in coffee vary quite widely.
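The "roughly 4 cups" figure follows directly from that 400 mg average, though, as the article notes, per-cup caffeine content varies widely. A small sketch, assuming a typical range of 80 to 120 mg of caffeine per cup of brewed coffee (these per-cup figures are illustrative assumptions, not from the study):

```python
# How many cups of brewed coffee correspond to 400 mg of caffeine,
# across a range of assumed per-cup caffeine contents.
DAILY_CAFFEINE_MG = 400  # Scandinavian average cited in the article

for mg_per_cup in (80, 100, 120):  # assumed typical per-cup range
    cups = DAILY_CAFFEINE_MG / mg_per_cup
    print(f"{mg_per_cup} mg/cup -> {cups:.1f} cups/day")
```

At around 100 mg per cup the average works out to four cups a day, but the same 400 mg could plausibly mean anywhere from about three to five cups.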
The power – and limitation – of numbers
The HUNT study is powerful because it is large-scale, population-based and cross-sectional, but when it comes to headaches, these characteristics make it difficult to establish cause and effect. For example, researchers found non-migraine headache to be 18 per cent more likely in individuals with high caffeine consumption (500 mg per day or more) than among those with the lowest consumption (mean 125 mg per day).
But does that mean that all that caffeine causes headaches – or that people who are more likely to suffer from headaches drink caffeinated beverages in search of relief? “Since the study is cross-sectional, it cannot be concluded that high caffeine consumption causes infrequent headache,” the researchers write.
Even more difficult is explaining why chronic headache was less likely among individuals with moderate or high caffeine consumption, the researchers said. One possibility is that caffeine consumption helps change chronic headache into infrequent headache.
Cutting back may help
But it is equally possible that chronic headache sufferers had reduced their intake of caffeine because they had experienced its headache precipitating properties – and that individuals with infrequent headaches were unaware that high caffeine might be the cause.

'Hobbits' Couldn't Hustle: Feet Of Homo Floresiensis Were Primitive But Not Pathological

The long, flat "hobbit" foot next to the tibia. Both are from type specimen LB1.
A detailed analysis of the feet of Homo floresiensis—the miniature hominins who lived on a remote island in eastern Indonesia until 18,000 years ago—may help settle a question hotly debated among paleontologists: how similar was this population to modern humans?
A new research paper, featured on the cover of the May 7 issue of Nature, may answer this question. While the so-called "hobbits" walked on two legs, several features of their feet were so primitive that their gait was not efficient.
"The hobbits were bipedal, but they walked in a different way from modern humans," explains William Harcourt-Smith, a Research Scientist in the Division of Paleontology at the American Museum of Natural History and an author on the paper. "Their feet have a combination of human-like and more primitive early hominin traits, some of which are more akin to those in Lucy." Lucy is an early bipedal but small-brained hominin, or australopithecine, that lived in Africa 3.2 million years ago.
The "hobbits," excavated from Liang Bua Cave on the island of Flores, were first described in 2004. Known specimens range in age from 90,000 to 18,000 years old, making them contemporaneous with modern humans. This, in combination with the unusually small stature and brain size of H. floresiensis, led to considerable debate among researchers and in the press. Some consider the population a separate species, while others have assessed the fossils as pathological modern humans. But a number of recent analyses of the skull, face, and wrist have found many unusually primitive features among the "hobbits" that are more similar to chimpanzees and Australopithecus, suggesting that the Flores inhabitants represent a remnant population of early hominins.
The anatomy of the foot described in the new paper might finally answer the pathological modern vs. primitive population question. Although the foot is characteristic of a biped—being stiff and having no opposable big toe—many other traits fall outside of the range for modern humans. The H. floresiensis foot is very long in proportion to the lower limb and considerably more than half the length of the thighbone; modern human feet are relatively shorter at about half of the femur's length. The stubby big toe of the hobbits is another primitive, chimp-like trait. But the pivotal clue comes from the navicular bone, an important tarsal bone that helps form the arch in a modern human foot. The "hobbit" navicular bone is more akin to that found in great apes, which means that these hominins lacked an arch and were not efficient long-term runners.
"Arches are the hallmark of a modern human foot," explains Harcourt-Smith. "This is another strong piece of evidence that the 'hobbit' was not like us."
Researchers also assessed the pathology hypothesis by comparing "hobbit" feet to those of typical modern humans and pathological modern specimens such as pituitary dwarfs. While the pathological specimens fell well within the range of modern humans, the "hobbits" did not. This suggests that H. floresiensis was an unusual, isolated population of early hominins.
"The fossil record continues to surprise us," says William Jungers, Chairman of the Department of Anatomical Sciences at Stony Brook University Medical Center, and an author on the study. "H. floresiensis is either an island-dwarfed descendant of H. erectus that not only underwent body-size reduction but also extensive evolutionary reversals, or, as our analysis suggests, it represents a new species full of primitive retentions from an ancestor that dispersed out of Africa much earlier than anyone would have predicted. Either way, the implications for human evolution are profound."
In addition to Jungers and Harcourt-Smith, authors of the research paper include Roshna Wunderlich, James Madison University; Matthew Tocheri, National Museum of Natural History (Smithsonian Institution); Susan Larson, Stony Brook Medical Center; Thomas Sutikna and Rhokus Awe Due, National Research and Development Centre for Archaeology in Jakarta, Indonesia; and Mike Morwood, University of Wollongong in Australia. Research was funded by grants from the Australian Research Council, the National Geographic Society, the Wenner-Gren Foundation for Anthropological Research, the Wellcome Trust, and the Leakey Foundation.

Avalanche! The Incredible Data Stream Of Solar Dynamics Observatory

This is the sun photographed by an ultraviolet camera onboard NASA's STEREO spacecraft. Solar Dynamics Observatory will expand scenes like this one to IMAX resolution.
ScienceDaily (Aug. 12, 2009) — When NASA's Solar Dynamics Observatory (SDO) leaves Earth in November 2009 onboard an Atlas V rocket, the thunderous launch will trigger an avalanche.
Mission planners are bracing themselves -- not for rocks or snow, but an avalanche of data.
"SDO will beam back 150 million bits of data per second, 24 hours a day, 7 days a week," says Dean Pesnell of the Goddard Space Flight Center in Greenbelt, Md. That’s almost 50 times more science data than any other mission in NASA history. "It's like downloading 500,000 iTunes songs a day."
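The quoted rate is easy to sanity-check: 150 million bits per second, around the clock, comes to roughly 1.6 terabytes per day, and dividing by an assumed song file size of a few megabytes recovers the half-million-songs comparison. A minimal sketch (the 3.3 MB per-song size is an assumption for illustration):

```python
# Back-of-the-envelope check of SDO's quoted data rate.
BITS_PER_SECOND = 150e6
SECONDS_PER_DAY = 86_400
ASSUMED_SONG_BYTES = 3.3e6  # assumed ~3.3 MB per compressed song

bytes_per_day = BITS_PER_SECOND * SECONDS_PER_DAY / 8
terabytes_per_day = bytes_per_day / 1e12
songs_per_day = bytes_per_day / ASSUMED_SONG_BYTES

print(f"{terabytes_per_day:.1f} TB/day, ~{songs_per_day:,.0f} songs/day")
```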
SDO is on a mission to study the sun in unprecedented detail. Onboard telescopes will scrutinize sunspots and solar flares using more pixels and colors than any other observatory in the history of solar physics. And SDO will reveal the sun’s hidden secrets in a prodigious rush of pictures.
"SDO is going to send us images ten times better than high definition television," says Pesnell, the project scientist for the new mission. A typical HDTV screen has 720 by 1280 pixels; SDO's images will have almost four times that number in the horizontal direction and five times in the vertical. “The pixel count is comparable to an IMAX movie -- an IMAX filled with the raging sun, 24 hours a day."
Spatial resolution is only half the story, though. Previous missions have photographed the sun no faster than once every few minutes. SDO will shatter that record.
"We'll be getting IMAX-quality images every 10 seconds," says Pesnell. "We'll see every nuance of solar activity." Because these fast cadences have never been attempted before by an orbiting observatory, the potential for discovery is great.
To illustrate the effect this might have on solar physics, Pesnell recalls the 19th century photographer Eadweard Muybridge, who won a famous bet with racehorse owner Leland Stanford. In those days, horses were widely thought to keep at least one hoof on the ground even in full gallop. That's how it appeared to the human eye.
"But when Muybridge photographed horses using a new high-speed camera system, he discovered something surprising," says Pesnell. "Galloping horses spend part of the race completely airborne—all four feet are off the ground."
Pesnell anticipates similar surprises from high-speed photography of the sun. The images could upend mainstream ideas about sunspot genesis, what triggers solar flares, and how explosions ripple through the sun's atmosphere en route to Earth.
The Solar Dynamics Observatory has three main instruments. The Atmospheric Imaging Assembly (AIA) is a battery of four telescopes designed to photograph the sun's surface and atmosphere. AIA filters cover 10 different wavelength bands, or colors, selected to reveal key aspects of solar activity. The bulk of SDO's data stream will come from these telescopes.
The Helioseismic and Magnetic Imager (HMI) will map solar magnetic fields and peer beneath the sun's opaque surface using a technique called helioseismology. A key goal of this experiment is to decipher the physics of the sun's magnetic dynamo.
The Extreme Ultraviolet Variability Experiment (EVE) will measure fluctuations in the sun's ultraviolet output. EUV radiation from the sun has a direct and powerful effect on Earth's upper atmosphere, heating it, puffing it up, and breaking apart atoms and molecules. "We really don't know how fast the sun varies at these wavelengths," notes Pesnell. "We're guaranteed to learn something new."
To gather data from all three instruments, NASA has set up a pair of dedicated radio antennas near Las Cruces, New Mexico. SDO's geosynchronous orbit will keep the observatory in constant view of the two 18-meter dishes for the duration of its five-year mission. Not a single bit should be lost.

Nanoelectronic Transistor Combined With Biological Machine Could Lead To Better Electronics

An artist's representation of a nanobioelectronic device incorporating an alamethicin biological pore. At the core of the device is a silicon nanowire (grey), covered with a lipid bilayer (blue). The bilayer incorporates bundles of alamethicin molecules (purple) that form pore channels in the membrane. Transport of protons through these pore channels changes the current through the nanowire.

ScienceDaily (Aug. 10, 2009) — If artificial devices could be combined with biological machines, laptops and other electronic devices could get a boost in operating efficiency.
Lawrence Livermore National Laboratory researchers have devised a versatile hybrid platform that uses lipid-coated nanowires to build prototype bionanoelectronic devices.
Mingling biological components in electronic circuits could enhance biosensing and diagnostic tools, advance neural prosthetics such as cochlear implants, and could even increase the efficiency of future computers.
While modern communication devices rely on electric fields and currents to carry the flow of information, biological systems are much more complex. They use an arsenal of membrane receptors, channels and pumps to control signal transduction that is unmatched by even the most powerful computers. For example, conversion of sound waves into nerve impulses is a very complicated process, yet the human ear has no trouble performing it.
“Electronic circuits that use these complex biological components could become much more efficient,” said Aleksandr Noy, the LLNL lead scientist on the project.
While earlier research has attempted to integrate biological systems with microelectronics, none have gotten to the point of seamless material-level incorporation.
“But with the creation of even smaller nanomaterials that are comparable to the size of biological molecules, we can integrate the systems at an even more localized level,” Noy said.
To create the bionanoelectronic platform the LLNL team turned to lipid membranes, which are ubiquitous in biological cells. These membranes form a stable, self-healing, and virtually impenetrable barrier to ions and small molecules.
“That's not to mention that these lipid membranes also can house an unlimited number of protein machines that perform a large number of critical recognition, transport and signal transduction functions in the cell,” said Nipun Misra, a UC Berkeley graduate student and a co-author on the paper.
Julio Martinez, a UC Davis graduate student and another co-author added: “Besides some preliminary work, using lipid membranes in nanoelectronic devices remains virtually untapped.”
The researchers incorporated lipid bilayer membranes into silicon nanowire transistors by covering the nanowire with a continuous lipid bilayer shell that forms a barrier between the nanowire surface and solution species.
“This 'shielded wire' configuration allows us to use membrane pores as the only pathway for the ions to reach the nanowire,” Noy said. “This is how we can use the nanowire device to monitor specific transport and also to control the membrane protein.”
The team showed that by changing the gate voltage of the device, they can open and close the membrane pore electronically.
The research appears Aug. 10 in the online version of the Proceedings of the National Academy of Sciences.

Violent Youth Of Solar Proxies Steers Course Of Genesis Of Life

Stars similar to our Sun -- "solar proxies" -- enable scientists to look through a window in time to see the harsh conditions prevailing in the early or future Solar System, as well as in planetary systems around other stars. These studies could lead to profound insights into the origin of life on Earth and reveal how likely (or unlikely) the rise of life is elsewhere in the cosmos. This work has revealed that the Sun rotated more than ten times faster in its youth (over four billion years ago) than it does today, generating a stronger magnetic field and stronger activity. This also meant that the young Sun emitted X-rays and ultraviolet radiation up to several hundred times stronger than the Sun does today.
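The rotation claim can be put into concrete numbers. The present-day Sun rotates roughly once every 25 days at its equator (a standard reference value, assumed here rather than taken from the article), so a tenfold speedup implies the young Sun spun around in about two and a half days:

```python
# Rough arithmetic for the young Sun's spin rate described above.
# The ~25-day equatorial rotation period is a standard reference value,
# an assumption here rather than a figure from the article itself.
PRESENT_PERIOD_DAYS = 25.4   # sidereal equatorial rotation period of the Sun
SPEEDUP = 10                 # "more than ten times faster" in its youth

young_period_days = PRESENT_PERIOD_DAYS / SPEEDUP
print(f"Young Sun rotation period: ~{young_period_days:.1f} days")
```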

ScienceDaily (Aug. 13, 2009) — One of the hottest topics at this year’s XXVIIth General Assembly of the International Astronomical Union (IAU) in Rio de Janeiro, Brazil involves the study of the astrophysical conditions favourable for the development and survival of primordial life.
ScienceDaily (Aug. 14, 2009) — Scientists have discovered the first gene involved in regulating the optimal length of human sleep, offering a window into a key aspect of slumber, an enigmatic phenomenon that is critical to human physical and mental health.

Brain Innately Separates Living And Non-living Objects For Processing

Even in people who have been blind since birth the brain still separates the concepts of living and non-living objects, new research shows.

ScienceDaily (Aug. 14, 2009) — For unknown reasons, the human brain distinctly separates the handling of images of living things from images of non-living things, processing each image type in a different area of the brain. For years, many scientists have assumed the brain segregated visual information in this manner to optimize processing the images themselves, but new research shows that even in people who have been blind since birth the brain still separates the concepts of living and non-living objects.

Trigger-happy Star Formation: Radiation From Massive Stars

This composite image, combining data from the Chandra X-ray Observatory and the Spitzer Space Telescope, shows the molecular cloud Cepheus B, located in our Galaxy about 2,400 light years from the Earth. (Credit: X-ray: NASA/CXC/PSU/K. Getman et al.; IR: NASA/JPL-Caltech/CfA/J. Wang et al.)
ScienceDaily (Aug. 14, 2009) — A new study from two of NASA's Great Observatories provides fresh insight into how some stars are born, along with a beautiful new image of a stellar nursery in our Galaxy. The research shows that radiation from massive stars may trigger the formation of many more stars than previously thought.

World Record In Packing Puzzle Set In Tetrahedra Jam: Better Understanding Of Matter Itself?

Princeton researchers have beaten the present world record for packing the most tetrahedra into a volume. Research into these so-called packing problems has produced deep mathematical ideas and led to practical applications as well. (Credit: Princeton University/Torquato Lab).

ScienceDaily (Aug. 15, 2009) — Finding the best way to pack the greatest quantity of a specifically shaped object into a confined space may sound simple, yet it consistently has led to deep mathematical concepts and practical applications, such as improved computer security codes.

Antarctic Glacier Thinning At Alarming Rate

The Pine Island Glacier in West Antarctica. (Credit: Image courtesy of University of Leeds)
ScienceDaily (Aug. 15, 2009) — The thinning of a gigantic glacier in Antarctica is accelerating, scientists report. The Pine Island Glacier in West Antarctica, which is around twice the size of Scotland, is losing ice four times as fast as it was a decade ago.
The research, published in the journal Geophysical Research Letters, also reveals that ice thinning is now occurring much further inland. At this rate scientists estimate that the main section of the glacier will have disappeared in just 100 years, six times sooner than was previously thought.
The Pine Island Glacier is located within the most inaccessible area of Antarctica – over 1000 km from the nearest research base – and was for many years overlooked. Now, scientists have been able to track the glacier's development using continuous satellite measurements over the past 15 years.
"Accelerated thinning of the Pine Island Glacier represents perhaps the greatest imbalance in the cryosphere today, and yet we would not have known about it if it weren't for a succession of satellite instruments," says Professor Andrew Shepherd, a co-author of the research from the School of Earth and Environment at the University of Leeds.
"Being able to assemble a continuous record of measurements over the past 15 years has provided us with the remarkable ability to identify both subtle and dramatic changes in ice that were previously hidden," he adds.
Scientists believe that the retreat of glaciers in this sector of Antarctica is caused by warming of the surrounding oceans, though it is too early to link such a trend to global warming.
The 5,400 square kilometre region of the Pine Island Glacier affected today is big enough to affect the rate at which sea level rises around the world.
"Because the Pine Island Glacier contains enough ice to almost double the IPCC's best estimate of 21st century sea level rise, the manner in which the glacier will respond to the accelerated thinning is a matter of great concern," says Professor Shepherd.
The research was led by Professor Duncan Wingham at University College London, and was funded by the UK Natural Environment Research Council.

First Black Holes Born Starving

This computer-simulated image shows gas (blue) interacting with one of the first black holes (white) in the early universe, approximately 200 million years after the Big Bang. (Credit: Image and simulation courtesy of Marcelo Alvarez, John H. Wise and Tom Abel.)
ScienceDaily (Aug. 11, 2009) — The first black holes in the universe had dramatic effects on their surroundings despite the fact that they were small and grew very slowly, according to new supercomputer simulations.
The simulations were carried out by astrophysicists Marcelo Alvarez and Tom Abel of the Kavli Institute for Particle Astrophysics and Cosmology, jointly located at the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University, and John Wise, formerly of KIPAC and now of NASA Goddard Space Flight Center.
Several popular theories posit that the first black holes gorged themselves on gas clouds and dust in the early universe, growing into the supersized black holes that lurk in the centers of galaxies today. However, the new results, published in The Astrophysical Journal Letters, point to a much more complex role for the first black holes.
"I'm thrilled that we now can do calculations that start to capture the most relevant physics, and we can show which ideas work and which don't," said Abel. "In the next decade, using calculations like this one, we will settle some of the most important issues related to the role of black holes in the universe."
To make their discovery, the researchers created the most detailed simulations to date of the first black holes in the universe that formed from the collapse of stars. The simulations started with data taken from observations of the cosmic background radiation—the earliest view of the structure of the universe. The researchers then applied the basic laws that govern the interaction of matter, allowing the early universe in their simulation to evolve as it did in reality.
In the simulation, clouds of gas left over from the Big Bang slowly coalesced under the force of gravity, and eventually formed the first stars. These massive, hot stars burned bright for a short time, emitting so much energy in the form of starlight that they pushed nearby gas clouds far away. Yet these stars could not sustain such a fiery existence for long, and they soon exhausted their internal fuel. This caused one of the stars in the simulation to collapse under its own weight, forming a black hole located in a pocket of emptiness. With very little matter in the near vicinity, this black hole was essentially "starved" of food on which to grow.
"Quasars [extremely strong sources of radiation] powered by black holes a billion times more massive than our sun have been observed in the early universe, and we have to explain how these behemoths could have grown so big so fast,” said Alvarez. "Their origin remains among the most fundamental unanswered questions in astrophysics."
One explanation for the existence of supermassive black holes in the early universe postulates that the first black holes were "seeds" that grew into much larger black holes by gravitationally attracting and then swallowing matter. But in their simulation, Alvarez, Abel and Wise found that such growth was negligible, with the black hole in the simulation growing by less than one percent of its original mass over the course of a hundred million years.
Although the simulations do not yet completely rule out the theory, this makes it less likely that these first black holes could have grown directly into the supermassive black holes observed to have existed less than a billion years later, Alvarez said.
An Alternative Theory
Although the early stars pushed away nearby clouds of gas, delaying significant growth of the black holes the stars later became, wisps of gas sometimes found their way to the black holes. As this matter was sucked into the black hole in the researchers’ simulation, it accelerated and released enough X-ray radiation to heat gas as much as a hundred light years away to several thousand degrees. The additional heat from the X-rays caused the gas to expand away from the black hole, helping to keep the snack from turning into a feast.
Heating due to the X-rays was also enough to effectively prevent nearby gas from collapsing to form stars for tens and maybe even hundreds of millions of years. As a result, the researchers hypothesize, significantly larger than usual gas clouds may have had the opportunity to form without creating stars. Such enormous gas clouds may have eventually collapsed under their own weight, creating a supermassive black hole.
"While X-rays from matter falling onto the first black holes hindered their further growth, that very same radiation may have later cleared the way for direct formation of supermassive black holes by suppressing star formation," said Alvarez. "However, a lot of work remains to be done to test whether this idea will actually pan out; this is really just the tip of the iceberg in terms of realistic simulations of black holes in the early universe."
"This work will likely make people rethink how the radiation from these black holes affected the surrounding environment," added Wise. "Black holes are not just dead pieces of matter; they actually affect other parts of the galaxy."
The Kavli Institute for Particle Astrophysics and Cosmology, initiated by a grant from Fred Kavli and the Kavli Foundation, is a joint institute of Stanford University and SLAC National Accelerator Laboratory.
SLAC is a multi-program laboratory exploring frontier questions in astrophysics, photon science, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford for the U.S. Department of Energy Office of Science.

Graphene Has High Current Capacity, Thermal Conductivity

Scanning electron microscope image shows ten graphene nanoribbons between each pair of electrodes. (Credit: Image courtesy of Raghu Murali)
ScienceDaily (Aug. 15, 2009) — Recent research into the properties of graphene nanoribbons provides two new reasons for using the material as interconnects in future computer chips. In widths as narrow as 16 nanometers, graphene has a current carrying capacity approximately a thousand times greater than copper – while providing improved thermal conductivity.
The current-carrying and heat-transfer measurements were reported by a team of researchers from the Georgia Institute of Technology. The same team had previously reported measurements of resistivity in graphene that suggest the material’s conductance would outperform that of copper in future generations of nanometer-scale interconnects.
“Graphene nanoribbons exhibit an impressive breakdown current density that is related to the resistivity,” said Raghunath Murali, a senior research engineer in Georgia Tech’s Nanotechnology Research Center. “Our measurements show that these graphene nanoribbons have a current carrying capacity at least two orders of magnitude higher than copper at these size scales.”
Measurements of thermal conductivity and breakdown current density in narrow graphene nanoribbons were reported June 19 in the journal Applied Physics Letters. The research was supported by the Semiconductor Research Corporation/DARPA through the Interconnect Focus Center and by the Nanoelectronics Research Initiative through the Institute for Nanoelectronics Discovery and Exploration (INDEX).
The unique properties of graphene – which is composed of thin layers of graphite – make it attractive for a wide range of potential electronic devices. Murali and his colleagues have been studying graphene as a potential replacement for copper in on-chip interconnects, the tiny wires that are used to connect transistors and other devices on integrated circuits. Use of graphene for these interconnects, they believe, would help extend the long run of performance improvements in integrated circuit technology.
“Our measurements show that graphene nanoribbons have a current carrying capacity of more than 10^8 amps per square centimeter, while a handful of them exceed 10^9 amps per square centimeter,” Murali said. “This makes them very robust in resisting electromigration and should greatly improve chip reliability.”
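To put that current density into everyday units, one can estimate the absolute current a single ribbon carries at breakdown. The sketch below assumes a 16 nm wide single-layer ribbon with the conventional 0.34 nm graphene thickness; these geometry values are illustrative assumptions, not measurements from the paper:

```python
# Estimate the absolute breakdown current of one graphene nanoribbon,
# given the >1e8 A/cm^2 current density quoted above.
# Geometry values are illustrative assumptions, not from the paper:
WIDTH_M = 16e-9        # narrowest ribbon width reported (16 nm)
THICKNESS_M = 0.34e-9  # nominal single-layer graphene thickness
J_A_PER_CM2 = 1e8      # lower bound of the measured current density

cross_section_m2 = WIDTH_M * THICKNESS_M
j_a_per_m2 = J_A_PER_CM2 * 1e4   # convert A/cm^2 to A/m^2 (1 m^2 = 1e4 cm^2)
current_a = j_a_per_m2 * cross_section_m2
print(f"Breakdown current per ribbon: ~{current_a * 1e6:.1f} microamps")
```

At 10^8 A/cm^2 this works out to a few microamps per ribbon, a plausible scale for a nanoscale interconnect.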
Electromigration is a phenomenon that causes transport of material, especially at high current density. In on-chip interconnects, this eventually leads to a break in the wire, which results in chip failure.
“We are learning a lot of new things about this material, which will lead researchers to consider other potential applications,” said Murali. “In addition to the high current carrying capacity, graphene nanoribbons also have excellent thermal conductivity.”
Because heat generation is a significant cause of device failure, the researchers also measured the ability of the graphene nanostructures to conduct heat away from devices. They found that graphene nanoribbons have a thermal conductivity of more than 1,000 watts per meter Kelvin for structures less than 20 nanometers wide.
“This high thermal conductivity could allow graphene interconnects to also serve as heat spreaders in future generations of integrated circuits,” said Murali.
To study the properties of graphene interconnects, Murali and collaborators Yinxiao Yang, Kevin Brenner, Thomas Beck and James Meindl began with flakes of multi-layered graphene removed from a graphite block and placed onto an oxidized silicon substrate. They used electron beam lithography to construct four electrode contacts, then used lithography to fabricate devices consisting of parallel nanoribbons of widths ranging between 16 and 52 nanometers and lengths of between 0.2 and 1 micron.
The breakdown current density of the nanoribbons was then studied by slowly applying an increasing amount of current to the electrodes on either side of the parallel nanoribbons. A drop in current flow indicated the breakdown of one or more of the nanoribbons.
In their study of 21 test devices, the researchers found that the breakdown current density of graphene nanoribbons has a reciprocal relationship to the resistivity.
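A reciprocal relationship means the product of breakdown current density and resistivity is roughly constant across devices. The toy check below uses synthetic numbers chosen only to satisfy the relation, not the paper's actual data:

```python
# Illustrate the reciprocal relation J_breakdown ~ 1/rho reported above:
# if J = C / rho, then the product J * rho is the same constant for
# every device. The (rho, J) pairs below are synthetic values chosen
# only to satisfy the relation; they are not the paper's measurements.
C = 5.0e4                                      # hypothetical constant (arbitrary units)
resistivities = [1.0, 2.0, 4.0, 8.0]           # arbitrary units
currents = [C / rho for rho in resistivities]  # J = C / rho

products = [j * rho for j, rho in zip(currents, resistivities)]
print("J x rho for each device:", products)    # constant across devices
```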
Because graphene can be patterned using conventional chip-making processes, manufacturers could make the transition from copper to graphene without a drastic change in chip fabrication.
“Graphene has very good electrical properties,” Murali said. “The data we have developed so far looks very promising for using this material as the basis for future on-chip interconnects.”

Herschel And Planck On Way To Study Our Cosmic Roots

The Herschel and Planck spacecraft launched on May 14, 2009, from the Guiana Space Centre in French Guiana. (Credit: ESA-CNES-Arianespace / Optique Vidéo du CSG)

ScienceDaily (May 14, 2009) — The Herschel and Planck spacecraft successfully blasted into space at 6:12 a.m. Pacific Time (9:12 a.m. Eastern Time) on May 14 from the Guiana Space Centre in French Guiana.

The European Space Agency missions, with significant participation from NASA, hitched a ride together on an Ariane 5 rocket, but now have different journeys before them. Herschel will explore, with unprecedented clarity, the earliest stages of star and galaxy birth in the universe; it will help answer the question of how our sun and Milky Way galaxy came to be. Planck will look back to almost the beginning of time itself, gathering new details to help explain how our universe came to be.
"These two missions have spent a lot of time together," said Ulf Israelsson, NASA project manager for both Herschel and Planck at NASA's Jet Propulsion Laboratory, Pasadena, Calif. "But now they are going their separate ways, each ready to do what it does best."
JPL contributed key technology to both missions. NASA team members will play an important role in data analysis and science operations.
Herschel separated from its Ariane 5 rocket 26 minutes after launch, followed by Planck about two minutes later. The spacecraft are traveling on separate trajectories to a point in the Earth-sun system called the second Lagrangian point, four times farther away than the moon's orbit, or an average distance of 1.5 million kilometers (930,000 miles) from Earth. They will spend the rest of their missions independently orbiting this point -- located on the other side of Earth from the sun -- as they make their way around the sun every year. See animations at http://www.esa.int/esa-mmg/mmg.pl?b=b&type=VA&mission=Herschel&single=y&start=10 and http://www.esa.int/esa-mmg/mmg.pl?b=b&type=VA&mission=Planck&single=y&start=10 .
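The distance figures above can be cross-checked: the Moon's mean orbital radius is about 384,400 km (a standard value, assumed here rather than stated in the article), so 1.5 million kilometres is indeed roughly four times farther:

```python
# Sanity-check the "four times farther away than the moon's orbit" claim.
L2_DISTANCE_KM = 1.5e6    # average Earth-L2 distance quoted in the article
MOON_ORBIT_KM = 384_400   # mean Earth-Moon distance (standard assumed value)

ratio = L2_DISTANCE_KM / MOON_ORBIT_KM
print(f"L2 is ~{ratio:.1f} times farther from Earth than the Moon")
```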
Herschel will start preparing for science operations while en route toward its operational orbit, which will be reached in about two months. Four months later, the science mission will begin and is expected to last more than three-and-a-half years. Planck will reach a similar orbit in roughly two months, with science observations beginning one month later. The mission's science operations are scheduled to last a minimum of 15 months, with the possibility of an extension.
Both observatories are designed to see light that our human eyes cannot. Herschel will detect light that has gone largely unexplored until now, with wavelengths in the infrared and submillimeter range. It will make the most detailed measurements yet of the cold and dark wombs where the embryos of stars and galaxies have just begun to grow.
Herschel will also be able to detect key elements and molecules involved in a star's life, tracing their evolution from atoms to potentially life-forming materials. One of these molecules is water; astronomers say Herschel will provide a greatly improved measurement of how much water there is in space.
"Using Herschel is like opening a dirty window and getting a clear view of stars and galaxies," said Paul Goldsmith, the NASA Herschel project scientist at JPL.
Planck will see longer wavelength light, from the submillimeter to microwave range. It will work like the ultimate time capsule, to see light that has traveled billions of years from the newborn universe to reach us. This light, called the cosmic microwave background, contains information about the Big Bang that created space and time itself.
"Our previous images of the baby universe were like fuzzy snapshots -- now we'll have the cleanest, deepest and sharpest images ever made of the early universe," said Charles Lawrence, the NASA Planck project scientist at JPL.
In order to do their jobs, the instruments on both spacecraft will be icy cold. Liquid helium will cool the coldest of Herschel's detectors to just 0.3 Kelvin (minus 459 degrees Fahrenheit), or 0.3 degrees above the coldest temperature theoretically attainable in the universe. Planck's coldest detectors, which are chilled by cutting-edge coolers developed in part by JPL, will reach a frosty 0.1 Kelvin.
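Those temperatures convert between scales with F = K × 9/5 − 459.67, which recovers the minus-459-degree figure quoted in the text:

```python
# Convert the detector temperatures quoted above from kelvin to Fahrenheit.
def kelvin_to_fahrenheit(k: float) -> float:
    return k * 9.0 / 5.0 - 459.67

herschel_f = kelvin_to_fahrenheit(0.3)  # Herschel's coldest detectors
planck_f = kelvin_to_fahrenheit(0.1)    # Planck's coldest detectors
print(f"0.3 K = {herschel_f:.1f} F, 0.1 K = {planck_f:.1f} F")
```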
Herschel is a European Space Agency mission, with science instruments provided by a consortium of European-led institutes, and with important participation by NASA. NASA's Herschel Project Office is based at JPL. JPL contributed mission-enabling technology for two of Herschel's three science instruments. The NASA Herschel Science Center, part of the Infrared Processing and Analysis Center at the California Institute of Technology in Pasadena, supports the United States astronomical community. Caltech manages JPL for NASA.

Did 'Dark Matter' Create The First Stars?

Head of the "guitar nebula". The formation contains a fast moving pulsar followed by a tail of gas. Biermann and Kusenko's postulations about dark matter could explain puzzlingly high pulsar velocities, which lead to such cone-shaped features. Images are from the Planetary Camera aboard the Hubble Space Telescope in 1994 (left) and 2001 (right). (Image: Hubble Space Telescope (NASA/ESA), Shami Chatterjee)

ScienceDaily (Mar. 20, 2006) — Dark matter may have played a major role in creating stars at the very beginnings of the universe. If that is the case, however, the dark matter must consist of particles called "sterile neutrinos". Peter Biermann of the Max Planck Institute for Radio Astronomy in Bonn, and Alexander Kusenko, of the University of California, Los Angeles, have shown that when sterile neutrinos decay, it speeds up the creation of molecular hydrogen. This process could have helped light up the first stars only some 20 to 100 million years after the big bang. This first generation of stars then ionised the gas surrounding them, some 150 to 400 million years after the big bang. All of this provides a simple explanation to some rather puzzling observations concerning dark matter, neutron stars, and antimatter (Physical Review Letters, March 10, 2006).

Herschel And Planck Spacecraft Ready To Move To Launch Site

ScienceDaily (Feb. 10, 2009) — ESA’s Herschel and Planck missions that will study the formation of stars and galaxies and the relic radiation from the Big Bang, respectively, have successfully completed their test campaigns in Europe. The two spacecraft will soon be shipped to Europe’s spaceport in Kourou, French Guiana.
On 6 February, ESA reviewed the status of the spacecraft and the results of the test campaigns that were carried out over the past year. Following the review, the spacecraft teams were given the green light to start the launch campaign, the final phase of spacecraft activities on Earth.
Built by Thales Alenia Space leading a host of subcontractors from across Europe, the Herschel spacecraft and its instruments were thoroughly tested at ESA’s Space Research and Technology Centre (ESTEC) in Noordwijk, the Netherlands, during 2008. The spacecraft, now packed in its container, will be transported to Kourou by plane on 11 February 2009.
Planck, also built by Thales Alenia Space and its subcontractors, was initially tested at Thales Alenia Space in Cannes and at ESTEC. Final tests took place at the facilities of the Centre Spatial Liège, Belgium. The spacecraft will be transported to Kourou by plane on 18 February 2009.
Once at the Guiana Space Centre, the two spacecraft will undergo their final checks and be prepared for launch. Herschel and Planck are scheduled for launch in a dual launch configuration on an Ariane 5 ECA rocket on 16 April 2009.

Planck Sees Light Billions Of Years Old

ScienceDaily (Aug. 16, 2009) — The Planck space telescope has begun to collect light left over from the Big Bang explosion that created our universe.

The mission, which is led by the European Space Agency with important participation from NASA, will help answer the most fundamental of questions: How did space itself pop into existence and expand to become the universe we live in today?
The answer is hidden in ancient light, called the cosmic microwave background, which has traveled more than 13 billion years to reach us. Planck will measure tiny variations in this light with the best precision to date.
The mission officially started collecting science data today, Aug. 13, as part of a test period. If all goes as planned, these observations will be the first of 15 or more months of data gathered from two full-sky scans. Science results are expected in about three years.

Warming Of Arctic Current Over 30 Years Triggers Release Of Methane Gas

Researchers in Germany have found that more than 250 plumes of bubbles of methane gas are rising from the seabed of the West Spitsbergen continental margin in the Arctic, in a depth range of 150 to 400 metres. (Credit: Image courtesy of National Oceanography Centre, Southampton)

ScienceDaily (Aug. 16, 2009) — The warming of an Arctic current over the last 30 years has triggered the release of methane, a potent greenhouse gas, from methane hydrate stored in the sediment beneath the seabed.
Scientists at the National Oceanography Centre Southampton working in collaboration with researchers from the University of Birmingham, Royal Holloway London and IFM-Geomar in Germany have found that more than 250 plumes of bubbles of methane gas are rising from the seabed of the West Spitsbergen continental margin in the Arctic, in a depth range of 150 to 400 metres.
Methane released from gas hydrate in submarine sediments has been identified in the past as an agent of climate change. The likelihood of methane being released in this way has been widely predicted.
The data were collected from the royal research ship RRS James Clark Ross, as part of the Natural Environment Research Council's International Polar Year Initiative. The bubble plumes were detected using sonar and then sampled with a water-bottle sampling system over a range of depths.
The results indicate that the warming of the northward-flowing West Spitsbergen current by 1°C over the last thirty years has caused the release of methane by breaking down methane hydrate in the sediment beneath the seabed.
Professor Tim Minshull, Head of the University of Southampton's School of Ocean and Earth Science based at the National Oceanography Centre, says: "Our survey was designed to work out how much methane might be released by future ocean warming; we did not expect to discover such strong evidence that this process has already started."
Methane hydrate is an ice-like substance composed of water and methane which is stable in conditions of high pressure and low temperature. At present, methane hydrate is stable at water depths greater than 400 metres in the ocean off Spitsbergen. However, thirty years ago it was stable at water depths as shallow as 360 metres.
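Hydrate stability depends on pressure as well as temperature, and pressure at depth is approximately hydrostatic, p = ρgh. The 40 m shift in the stability limit (360 m to 400 m) therefore corresponds to roughly 0.4 MPa of additional pressure needed to offset the warming. A rough sketch, assuming a standard seawater density rather than a value from the article:

```python
# Rough hydrostatic pressure difference across the 40 m shift in the
# hydrate stability depth described above (360 m -> 400 m).
# Seawater density and g are standard assumed values, not from the article.
RHO_SEAWATER = 1027.0  # kg/m^3, typical seawater density
G = 9.81               # m/s^2, gravitational acceleration

def hydrostatic_pressure_pa(depth_m: float) -> float:
    return RHO_SEAWATER * G * depth_m

delta_pa = hydrostatic_pressure_pa(400.0) - hydrostatic_pressure_pa(360.0)
print(f"Extra pressure over the 40 m shift: ~{delta_pa / 1e6:.2f} MPa")
```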
This is the first time that such behaviour in response to climate change has been observed in the modern period.
While most of the methane currently released from the seabed is dissolved in the seawater before it reaches the atmosphere, methane seeps are episodic and unpredictable and periods of more vigorous outflow of methane into the atmosphere are possible. Furthermore, methane dissolved in the seawater contributes to ocean acidification.
Graham Westbrook, Professor of Geophysics at the University of Birmingham, warns: "If this process becomes widespread along Arctic continental margins, tens of megatonnes of methane per year – equivalent to 5-10% of the total amount released globally by natural sources – could be released into the ocean."
The team is carrying out further investigations of the plumes; in particular they are keen to observe the behaviour of these gas seeps over time.

Surgeons Combine Heat, Chemistry to Bolster Anti-Cancer Drugs

December 1, 2005 — In efforts to boost the effectiveness of anti-cancer drugs, a new method called intraperitoneal hyperthermic chemotherapy works by flushing a heated chemotherapy drug through tissue surrounding a tumor. Immediately after the tumor is removed, heat boosts the drug's potency and weakens the tumor's ability to repair itself. The targeted delivery means a higher concentration of the anti-cancer drug reaches the tumor.

WINSTON-SALEM, N.C.--There are some cancers against which chemotherapy is virtually useless, but a new technique is making a major difference for some patients.
Day by day, cancer patient Fred Blum is regaining his strength. Just five months ago, he was battling intestinal cancer -- and the prognosis was bleak. "You just think of this tumor growing and enlarging and getting worse and spreading. And psychologically, it's very hard."
Standard chemo does little to fight advanced abdominal cancers like Blum's. But now, surgical oncologists are using a new twist to make chemo effective and extend survival for patients who often have no other options.
It's called intraperitoneal hyperthermic chemotherapy. It works by flushing a heated chemotherapy drug through tissue surrounding a tumor immediately after the tumor's removed. Perry Shen, an assistant professor of surgery at Wake Forest University Baptist Medical Center in Winston-Salem, N.C., says "That hopefully provides kind of a mop-up or cleanup of any residual cancer cells left behind."
Heat boosts the drug's potency and weakens the tumor's ability to repair itself. The targeted delivery means a higher concentration of the chemo reaches the cancer. "This procedure can provide them a longer-term survival than regular chemotherapy alone," Dr. Shen says.
Before the treatment, Blum's prognosis was about six months. Now, he's planning on staying healthy and strong for a long time.
Some patients are undergoing treatment in an effort to prevent cancer from spreading to the abdomen from nearby organs like the appendix. There's potential the heated chemo could be applied to some non-intestinal, hard-to-treat cancers, like pancreatic cancer.

BACKGROUND: Researchers at Wake Forest University Baptist Medical Center have shown that surgery plus inserting heated chemotherapy drugs directly into the abdomen can improve survival rates, as well as the quality of life, in patients who suffer from several types of cancer. These cancers include tumors of the abdominal cavity (peritoneal cancer, which affects the lining of the abdominal wall) that have spread to multiple organs, and advanced ovarian cancer. Patients with these types of cancer usually fare poorly with conventional cancer treatments.
HOW IT WORKS: There are two stages to the new procedure. First, as much as possible of the cancer is removed with surgery. This often involves multiple organs. Next, while the patient is still on the operating table, the surgeon injects a heated saline solution combined with the chemotherapy drug directly into the abdominal cavity. Scientists have found that tumor tissue is more sensitive to heat than normal tissue. So raising the temperature of the drug makes the tumor less resistant to chemotherapy.
THE RESULTS: The results from four separate studies conducted on animals indicate that the new procedure could kill cancer cells that have spread to other parts of the body. And delivering the drug immediately after the surgery means that more of the drug remains near the tumor instead of spreading throughout the entire body. This means fewer of the usual unpleasant side effects of chemotherapy.

Groundbreaking Treatment For Oxygen-deprived Newborns

A new treatment for oxygen-deprived newborn infants involves a two-week course of injections of erythropoietin, a hormone that stimulates the formation of red blood cells.

ScienceDaily (Aug. 16, 2009) — Until now immediate cooling of the newborn infant was the only treatment that could possibly prevent brain damage following oxygen deprivation during delivery. New research findings from the Sahlgrenska Academy at the University of Gothenburg and Sahlgrenska University Hospital, Sweden, in collaboration with Zhengzhou University in China, open up the possibility of a new and effective treatment that can be started as late as two days after birth.

This new treatment involves newborn infants being given a two-week course of injections of erythropoietin, a hormone that stimulates the formation of red blood cells.
“For the first time we can demonstrate that it is possible to influence the brain damage occurring as a result of oxygen deprivation during delivery considerably later than the six-hour window of opportunity for treating with cooling,” says Klas Blomgren, professor of paediatrics at the Sahlgrenska Academy and specialist at Queen Silvia Children’s Hospital.
The research findings, which are presented in the latest issue of the medical journal Pediatrics, are the result of cooperation between Swedish, Austrian and Chinese researchers. The study treated just over 150 term newborn infants, half of whom were given small doses of erythropoietin every other day. Once the children reached the age of eighteen months, their neurological condition was assessed.
“Only half as many of the children treated with erythropoietin had developed a severe neurological functional disability or had died of their injuries. Thus the hormone treatment improves the prognosis considerably in the longer perspective,” says Blomgren.
The children in the study had suffered moderate or severe hypoxic-ischemic encephalopathy (HIE) at birth, but it was only children with moderate HIE that were helped by this hormone treatment.
“We believe that erythropoietin has a regenerative and stimulating effect on recovery and on brain development following the injury. This appears to be a safe treatment, almost without side effects, and it is also cheaper and technically simpler to administer in comparison with cooling. This means that the treatment can be given a wide distribution, and can be used even in developing countries,” says Blomgren.

Facial Expressions Show Language Barriers, Too


New research offers insights into how different cultures interpret facial expressions.
ScienceDaily (Aug. 16, 2009) — People from East Asia tend to have a tougher time than people from European countries telling the difference between a face that looks fearful versus surprised, or disgusted versus angry, and a new report published online on August 13th in Current Biology, a Cell Press publication, explains why: rather than scanning evenly across a face as Westerners do, Easterners fixate their attention on the eyes.


New Class Of Astronomical Object: Super Planetary Nebulae

ScienceDaily (Aug. 16, 2009) — A team of scientists in Australia and the United States, led by Associate Professor Miroslav Filipović from the University of Western Sydney, has discovered a new class of object which they call “Super Planetary Nebulae.”

They report their work in the journal Monthly Notices of the Royal Astronomical Society.
Planetary nebulae are shells of gas and dust expelled by stars near the end of their lives, and are typically seen around stars comparable in size to, or smaller than, the Sun.
The team surveyed the Magellanic Clouds, the two companion galaxies to the Milky Way, with radio telescopes of the Commonwealth Scientific and Industrial Research Organisation (CSIRO) Australia Telescope National Facility. They noticed that 15 radio objects in the Clouds matched well-known planetary nebulae observed by optical telescopes.
The new class of objects are unusually strong radio sources. Whereas the existing population of planetary nebulae is found around small stars comparable in size to our Sun, the new population may be the long predicted class of similar shells around heavier stars.
Filipović’s team argues that the detection of these new objects may help to solve the so-called “missing mass problem” – the absence of planetary nebulae around central stars that were originally 1 to 8 times the mass of the Sun. Up to now, most known planetary nebulae have central stars and surrounding nebulae with only about 0.6 and 0.3 times the mass of the Sun respectively, but none have been detected around more massive stars.
The new Super Planetary Nebulae are associated with larger original stars (progenitors), up to 8 times the mass of the Sun. And the nebular material around each star may have as much as 2.6 times the mass of the Sun.
“This came as a shock to us,” says Filipović, “as no one expected to detect these objects at radio wavelengths and with the present generation of radio telescopes. We have been holding up our findings for some three years until we were 100% sure that they are indeed planetary nebulae.”
Some of the 15 newly discovered planetary nebulae in the Magellanic Clouds are three times more luminous than any of their Milky Way cousins. But to see them in greater detail, astronomers will need the power of a forthcoming radio telescope – the Square Kilometre Array, planned for the deserts of Western Australia.

Friday, August 14, 2009

Robots to get their own operating system

THE UBot whizzes around a carpeted conference room on its Segway-like wheels, holding aloft a yellow balloon. It hands the balloon to a three-fingered robotic arm named WAM, which gingerly accepts the gift.
Cameras click. "It blows my mind to see robots collaborating like this," says William Townsend, CEO of Barrett Technology, which developed WAM.
The robots were just two of the multitude on display last month at the International Joint Conference on Artificial Intelligence (IJCAI) in Pasadena, California. But this happy meeting of robotic beings hides a serious problem: while the robots might be collaborating, those making them are not. Each robot is individually manufactured to meet a specific need and more than likely built in isolation.
This sorry state of affairs is set to change. Roboticists have begun to think about what robots have in common and what aspects of their construction can be standardised, hopefully resulting in a basic operating system everyone can use. This would let roboticists focus their attention on taking the technology forward.
One of the main sticking points is that robots are typically quite unlike one another. "It's easier to build everything from the ground up right now because each team's requirements are so different," says Anne-Marie Bourcier of Aldebaran Robotics in Paris, France, which makes a half-metre-tall humanoid called Nao.
Some robots, like Nao, are almost autonomous. Others, like the UBot, are semi-autonomous, meaning they perform some acts, such as balancing, on their own, while other tasks, like steering, are left to a human operator.
Also, every research robot is designed for a specific objective. The UBot's key ability is that it can balance itself, even when bumped - crucial if robots are to one day work alongside clumsy human beings. The Nao, on the other hand, can walk and even perform a kung-fu routine, as long as it is on a flat, smooth surface. But it can't balance itself as robustly as the UBot and won't easily be able to learn how.
On top of all this, each robot has its own unique hardware and software, so capabilities like balance implemented on one robot cannot easily be transferred to others.
Bourcier sees this changing if robotics advances in a manner similar to personal computing. For computers, the widespread adoption of Microsoft's Disk Operating System (DOS), and later Windows, allowed programmers without detailed knowledge of the underlying hardware and file systems to build new applications and build on the work of others.

Solar Power Starting to Really Take Shape in India: Acme Group Plans 240 MW of Solar Thermal Plants

Acme Group has announced that it's on track to get the first half of its first 10 MW solar thermal power plant online at the beginning of 2010, Cleantech reports. The plant uses the same technology as eSolar's newly commissioned plant in California. But that's not the half of it: Acme has active plans for a further 230 MW, and thinks it can really bring the cost down:

Acme says the initial plant will cost about $3.14 million per megawatt, but that it can reduce costs to $1.68 million per megawatt as more projects get built. That's partly because of economies of scale and lower costs for materials in India, but more because of the nation's new National Solar Mission, which establishes new feed-in tariffs and financing possibilities.
All this could mean, according to Acme's founder Manoj Kumar Upadhyay, that solar thermal could drop in price to 12¢ per kilowatt-hour by 2015.
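Those per-megawatt figures imply a sizeable gap between the first-of-a-kind build and at-scale builds; a minimal sketch of the arithmetic, using only the plant size and per-MW costs quoted above:

```python
# Per-plant capital cost implied by Acme's quoted per-MW figures.
def plant_cost_usd(capacity_mw: float, cost_per_mw_usd: float) -> float:
    """Total capital cost for a plant of the given capacity."""
    return capacity_mw * cost_per_mw_usd

initial = plant_cost_usd(10, 3.14e6)   # the first 10 MW plant
at_scale = plant_cost_usd(10, 1.68e6)  # a same-size plant at the projected cost
print(f"initial: ${initial / 1e6:.1f}M, at scale: ${at_scale / 1e6:.1f}M")

reduction = 1 - 1.68e6 / 3.14e6
print(f"projected cost reduction: {reduction:.0%}")  # roughly 46%
```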
In addition to this 10 MW plant, located in Rajasthan, Acme is also building a 5 MW plant in Maharashtra, a 110 MW plant in Gujarat, and a 100 MW plant in Madhya Pradesh. What's more, Acme has an exclusive license for up to 1,000 MW of plants using this technology.

The Local Group of galaxies


The Milky Way Galaxy belongs to the Local Group, a small group of 3 large and over 30 small galaxies, and is the second largest (after the Andromeda Galaxy M31), but perhaps the most massive, member of this group. M31, at about 2.9 million light years, is the nearest large galaxy, but a number of faint galaxies are much closer: many of the dwarf Local Group members are satellites or companions of the Milky Way. The closest of all is the above-mentioned SagDEG at about 80,000 light years from us and some 50,000 light years from the Galactic Center, followed by the more conspicuous Large and Small Magellanic Clouds at 179,000 and 210,000 light years, respectively.

A fisheye view of the Southern Sky

From horizon to horizon, the night sky above Loomberah, New South Wales, Australia was photographed by astronomer Gordon Garradd on March 22, 1996. Garradd used a homemade all-sky camera with a fish-eye lens, resulting in a circular 200 degree field of view. This gorgeous sky view is dominated by the luminous band of the Milky Way cut by dramatic, dark interstellar dust clouds. Along with the bright stars of our Galaxy, the Large Magellanic Cloud is visible at the upper right (about 1 o'clock) and the long, lovely, bluish tail of comet Hyakutake can be seen toward the bottom of the image, near the bright star Arcturus. Bright city lights from nearby Tamworth glow along the northwestern horizon.

Milky Way may have a huge hidden neighbour

A LARGE satellite galaxy may be lurking, hidden from view, next door to our own.
Sukanya Chakrabarti and Leo Blitz of the University of California, Berkeley, suspected that the gravity of a nearby galaxy was causing perturbations that have been observed in gas on the fringes of the Milky Way. "We did a large range of simulations where we varied the mass of the perturber and the distance of closest approach," says Chakrabarti. In the best-fitting simulation, the unseen galaxy has about 1 per cent of the Milky Way's mass, or 10 billion times the mass of the sun.
That's a lot. It means the object has roughly the same mass as the Milky Way's brightest satellite galaxy, the Large Magellanic Cloud (LMC).
Right now, says Chakrabarti, the galaxy is roughly 300,000 light years away from us - about twice as far away as the LMC. But the simulations suggest it follows a highly elongated elliptical path, and about 300 million years ago it swept through our own galaxy just 16,000 light years from the galactic centre - closer in than Earth - disturbing the Milky Way's outskirts as it went.
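The two ways the article states the best-fit mass agree with each other, given a standard figure for the Milky Way's total mass. The total of about 10^12 solar masses used below is an assumption for illustration; the article itself quotes only the 1 per cent ratio and its 10-billion-sun equivalent.

```python
# Rough check of the best-fitting simulation's perturber mass.
milky_way_mass_suns = 1e12  # assumed total Milky Way mass, in solar masses

# "about 1 per cent of the Milky Way's mass", per the article
perturber_mass = 0.01 * milky_way_mass_suns

print(f"perturber mass: {perturber_mass:.0e} solar masses")
# 1e+10 solar masses, i.e. the quoted "10 billion times the mass of the sun"
```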
"Overall, it is a very plausible scenario," says Abraham Loeb at the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts, who was not part of the study. "Of course, the fact that we don't see such a massive satellite is an issue."
Chakrabarti suggests that the galaxy has remained hidden because it is not a brilliant spectacle. Whereas the LMC glistens with bright young stars and the gas that spawned them, the unseen galaxy may be dead, containing old stars and little gas.
To make matters worse, the simulations suggest that the galaxy orbits ours in the same plane as our galaxy's disc. If it is now on the opposite side of the galaxy from us, it could be hiding behind the thick gas and dust in the galactic plane. "It's very likely to be in a region of very high obscuration," says Chakrabarti. The work will appear in Monthly Notices of the Royal Astronomical Society.
By further studying the distribution of gas, Chakrabarti hopes to pinpoint the galaxy's location so that astronomers will know where to look for it. This parallels the way astronomers in the 1840s discovered Neptune from irregularities in the motion of Uranus caused by gravitational tugs from the more distant planet. If the unseen galaxy exists, it will be the first nearby galaxy detected through its gravity rather than its starlight.