Thursday, April 11, 2013

Jury finds Exxon liable for $236.4 million in U.S. pollution suit

The Exxon corporate logo is pictured at a gas station in Arlington, Virginia January 31, 2012.

Credit: Reuters/Jason Reed

By Jason McLure

LITTLETON, New Hampshire | Tue Apr 9, 2013 5:32pm EDT

LITTLETON, New Hampshire (Reuters) - A New Hampshire jury on Tuesday found Exxon Mobil Corp liable for $236.4 million in a civil lawsuit that charged the oil company had polluted groundwater in the state with a gasoline additive used to reduce smog in the 1970s and 1980s.

Following a three-month trial, jurors deliberated less than two hours before finding that the world's largest publicly traded oil company acted negligently in contaminating the groundwater with the additive MTBE, said Jessica Grant, a lawyer who represented the state.

"We're very pleased that the jury held Exxon accountable for the harm its defective product caused to the state's groundwater resources and that they also held Exxon responsible for its negligence," she said.

In the suit, originally filed in New Hampshire court in 2003, the state charged that Exxon and other major oil companies knew that MTBE was likely to contaminate groundwater and was more difficult to clean up than other pollutants. Some damages from the suit will help pay for the costs of testing and cleaning affected water supplies.

Exxon vowed to appeal.

"MTBE worked as intended to improve our air quality and the benefits of its use substantially outweighed the known risks," said spokeswoman Rachael Moore. "MTBE contamination in New Hampshire is rapidly decreasing and the state's current system for cleaning up gasoline spills ensures safe drinking water."

The U.S. Environmental Protection Agency today considers MTBE a potential human carcinogen, though much of the research on the chemical has focused on the health effects of inhaling it rather than drinking it. New Hampshire banned MTBE in the state in 2007.

Exxon was the only one of the 22 original defendants in the suit to go to trial. Other defendants either had the suits against them dismissed or agreed to settlements.

Those included Canada-based Irving Oil Co, which agreed to pay $57 million last year, and Venezuela's state-owned Citgo Petroleum Corp, which struck a $16 million agreement as the trial began.

The three-month trial on the suit, filed in state court, was moved to the state's federal courthouse in Concord to accommodate the large number of witnesses, lawyers and exhibits. The jury found that MTBE contamination had caused $816 million in damages in the state. Exxon's market share of 29 percent was used to compute damages, Grant said.
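
The damages arithmetic is simple enough to check. Below is a minimal sketch of the market-share apportionment described above, assuming straight proportionality; the jury's exact rounding is not reported in the article, and the variable names are illustrative.

```python
# Minimal sketch of the market-share apportionment described above (an
# assumption of straight proportionality; the jury's exact rounding is not
# reported, so the result differs slightly from the $236.4 million verdict).

total_damages = 816.0e6      # statewide MTBE damages found by the jury, in USD
exxon_market_share = 0.29    # Exxon's reported share of the market

exxon_liability = total_damages * exxon_market_share
print(f"Apportioned liability: ${exxon_liability / 1e6:.1f} million")
# -> Apportioned liability: $236.6 million, consistent with the reported award
#    if the actual share used was slightly under 29 percent
```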

(Reporting by Jason McLure; Editing by Scott Malone and Tim Dobbyn)


View the original article here

Genetics of life and death in an evolutionary arms-race


View the original article here

[Research Article] Annually Resolved Ice Core Records of Tropical Climate Variability Over the Past ~1800 Years

L. G. Thompson1,2,*, E. Mosley-Thompson1,3, M. E. Davis1, V. S. Zagorodnov1, I. M. Howat1,2, V. N. Mikhalenko4, P.-N. Lin1

1Byrd Polar Research Center, The Ohio State University, Columbus, OH 43210, USA.
2School of Earth Sciences, The Ohio State University, Columbus, OH 43210, USA.
3Department of Geography, The Ohio State University, Columbus, OH 43210, USA.
4Institute of Geography, Russian Academy of Sciences, Moscow, Russia.
*Corresponding author. E-mail: thompson.3{at}osu.edu
Ice cores from low latitudes can provide a wealth of unique information about past climate in the tropics, but they are difficult to recover and few exist. Here, we report annually resolved ice core records from the Quelccaya ice cap (5670 masl) in Peru, which extend back ~1800 years and provide a high-resolution record of climate variability there. Oxygen isotopic ratios (δ18O) are linked to sea-surface temperatures in the tropical eastern Pacific, while concentrations of ammonium and nitrate document the dominant role played by the migration of the Intertropical Convergence Zone in the region of the tropical Andes. Quelccaya continues to retreat and thin: Radiocarbon dates on wetland plants exposed along its retreating margins indicate it has not been smaller for at least six millennia.

Received for publication 18 December 2012. Accepted for publication 21 March 2013.


View the original article here

Science Signaling Podcast, 03 February 2009

View the original article here

[Research Articles] Amyloid Fibrils Composed of Hexameric Peptides Attenuate Neuroinflammation

Sci Transl Med 3 April 2013:
Vol. 5, Issue 179, p. 179ra42
Sci. Transl. Med. DOI: 10.1126/scitranslmed.3005681
Multiple Sclerosis
Michael P. Kurnellas1, Chris M. Adams2, Raymond A. Sobel3, Lawrence Steinman1,* and Jonathan B. Rothbard1,4

1Department of Neurology and Neurological Sciences, Stanford University School of Medicine, Stanford, CA 94305–5316, USA.
2Stanford University Mass Spectrometry, Stanford University School of Medicine, Stanford, CA 94305–5316, USA.
3Department of Pathology, Stanford University School of Medicine, Stanford, CA 94305–5316, USA.
4Division of Immunology and Rheumatology, Department of Medicine, Stanford University School of Medicine, Stanford, CA 94305–5316, USA.
*Corresponding author. E-mail: steinman{at}stanford.edu
The amyloid-forming proteins tau, αB crystallin, and amyloid P protein are all found in lesions of multiple sclerosis (MS). Our previous work established that amyloidogenic peptides from the small heat shock protein αB crystallin (HspB5) and from amyloid β fibrils, characteristic of Alzheimer’s disease, were therapeutic in experimental autoimmune encephalomyelitis (EAE), reflecting aspects of the pathology of MS. To understand the molecular basis for the therapeutic effect, we showed a set of amyloidogenic peptides composed of six amino acids, including those from tau, amyloid β A4, major prion protein (PrP), HspB5, amylin, serum amyloid P, and insulin B chain, to be anti-inflammatory and capable of reducing serological levels of interleukin-6 and attenuating paralysis in EAE. The chaperone function of the fibrils correlates with the therapeutic outcome. Fibrils composed of tau 623–628 precipitated 49 plasma proteins, including apolipoprotein B-100, clusterin, transthyretin, and complement C3, supporting the hypothesis that the fibrils are active biological agents. Amyloid fibrils thus may provide benefit in MS and other neuroinflammatory disorders.

Copyright © 2013, American Association for the Advancement of Science
Citation: M. P. Kurnellas, C. M. Adams, R. A. Sobel, L. Steinman, J. B. Rothbard, Amyloid Fibrils Composed of Hexameric Peptides Attenuate Neuroinflammation. Sci. Transl. Med. 5, 179ra42 (2013).


View the original article here

NASA Picks Next Exoplanet Mission for Launch in 2017

NASA plans to launch an exoplanet-hunting satellite and an instrument to study neutron stars in 2017. Both are small missions that could have a big impact.

The Transiting Exoplanet Survey Satellite (TESS) will aim to find terrestrial planets in the habitable zones of nearby stars. It will use an array of wide-field cameras to survey the brightest stars in the sun's neighborhood in hopes of detecting exoplanets such as gas giants and rocky, Earth-sized planets. Some of these planets, researchers hope, will become candidates for follow-up studies of their atmospheres by the James Webb Space Telescope, scheduled for launch in 2018.

The other mission chosen by NASA is the Neutron Star Interior Composition Explorer (NICER), which will be deployed on the International Space Station. The instrument will observe x-rays flashed by neutron stars, helping researchers understand the nature of matter contained in these dense, spinning objects that result from the collapse of massive stars.

"TESS will carry out the first space-borne all-sky transit survey, covering 400 times as much sky as any previous mission," said George Ricker, a research scientist at the Massachusetts Institute of Technology in Cambridge and principal investigator of the mission, in a statement. "It will identify thousands of new planets in the solar neighborhood, with a special focus on planets comparable in size to the Earth."

Both TESS and NICER have been selected as part of NASA's Explorer program, out of four concept studies submitted to the agency last fall. TESS will get up to $200 million, and NICER—to be led by Keith Gendreau of NASA's Goddard Space Flight Center in Greenbelt, Maryland—will receive up to $55 million.


View the original article here

Hated Invasive Species Helps Restore an Ecosystem

Tough guy. Invasive green crabs will freak out herbivorous marsh crabs, allowing cordgrass to regrow.

Credit: Catherine Matassa/Northeastern University

" href="http://news.sciencemag.org/sciencenow/assets/2013/04/03/sn-crabs.jpg">sn-crabs.jpg Tough guy. Invasive green crabs will freak out herbivorous marsh crabs, allowing cordgrass to regrow. Credit: Catherine Matassa/Northeastern University

It's not quite redemption, but one of the most loathed invasive species in the world—the European green crab (Carcinus maenas)—has had a surprisingly positive effect on an ecosystem. On Cape Cod, Massachusetts, researchers have found that the crab is reversing a decades-long trend of damage that another species has inflicted on salt marshes. It might be the first nice thing that the green crab has done for anyone.

Green crabs are aggressive and have colonized most of the world's coastlines, where they've chased away or killed native species. "It eats about everything," says marine ecologist Mark Bertness of Brown University. "In terms of biodiversity, it's hell on wheels." But in New England, green crabs are now fixing an ecological problem that humans caused.

Recreational fishing and crabbing on Cape Cod have removed most of the native predators that used to eat a common native species called the purple marsh crab (Sesarma reticulatum). With the predators off their backs, the marsh crabs are now up to four times more common than before. And they're hungry, feasting on the tender young shoots of cordgrass (Spartina alterniflora), which is the predominant plant in the salt marshes. In addition, the crab burrows make the soil more likely to erode. Over the past 3 decades, more and more salt marsh has been laid bare.

But about 3 years ago, Bertness noticed that cordgrass was starting to recover. "This was quite surprising," he says. Soon, he realized that salt marshes with lots of green crabs were doing the best. It was a bit odd to find the green crabs there, because these crabs, which invaded New England more than 100 years ago, tend to live near the shore, where they can hide under rocks.

Bertness thinks that the salt marshes became more appealing for green crabs once the marsh crabs had dug so many burrows. (Even though predators are half as common as before, both crabs need burrows to escape from birds.) The 5-centimeter-wide green crabs, Bertness figured, would have no trouble kicking the smaller marsh crabs out of their burrows and making them skitter away.

To test the idea that this was how green crabs are indirectly helping the cordgrass, Bertness and his lab manager, Tyler Coverdale, surveyed 10 marshes on Cape Cod in August 2012. They found that, in fact, recovering patches of salt marsh did have denser populations of green crabs. And they confirmed the brutal nature of green crabs: When confined in cages and pitted one against another, green crabs evicted marsh crabs from their burrows. More than 85% of the marsh crabs died in the cages, which were littered with broken shells and severed limbs, the team reports online this week in Ecology.

Just the fear factor alone may have a big effect in reducing herbivory. Another test in a larger enclosure showed that the presence of a single green crab caused marsh crabs to spend the entire month of the experiment in hiding. Even if the green crab was locked up in a cage, the marsh crabs rarely dared to venture out. By the end of the experiment, the marsh crabs had eaten an order of magnitude less cordgrass than they usually did. Once cordgrass gets a break, Bertness says, it can rapidly regrow.

What does it all mean? Bertness says that many ecologists have a "knee-jerk reaction to invasives" and that removing them just because they are invasive may not be the best use of limited conservation funds.

"It's important to acknowledge that introduced species will in some cases provide an unintended benefit, and this is a cool one," says Edwin Grosholz, an ecologist at the University of California, Davis, who was not involved in the study. "That means we have to pay attention as we eradicate species" that have invaded an ecosystem. But he's not ready to put the green crab on a pedestal. "It may have a positive effect in New England," he says. "Its track record elsewhere is quite different."


View the original article here

Face-to-face negotiations favor the powerful


View the original article here

[Perspectives] Cell-Based Therapeutics: The Next Pillar of Medicine

Sci Transl Med 3 April 2013:
Vol. 5, Issue 179, p. 179ps7
Sci. Transl. Med. DOI: 10.1126/scitranslmed.3005568
INNOVATION
Michael A. Fischbach1,2,*, Jeffrey A. Bluestone3 and Wendell A. Lim1,4,5,*

1UCSF Center for Systems and Synthetic Biology, University of California, San Francisco, San Francisco, CA 94158, USA.
2Department of Bioengineering and Therapeutic Sciences and the California Institute for Quantitative Biosciences, University of California, San Francisco, San Francisco, CA 94158, USA.
3Diabetes Center and the Department of Medicine, University of California, San Francisco, San Francisco, CA 94143, USA.
4Department of Cellular and Molecular Pharmacology, University of California, San Francisco, San Francisco, CA 94158, USA.
5Howard Hughes Medical Institute, University of California, San Francisco, San Francisco, CA 94158, USA.
*Corresponding authors. E-mail: fischbach{at}fischbachgroup.org (M.A.F.); lim{at}cmp.ucsf.edu (W.A.L.)
Two decades ago, the pharmaceutical industry—long dominated by small-molecule drugs—was revolutionized by the advent of biologics. Today, biomedicine sits on the cusp of a new revolution: the use of microbial and human cells as versatile therapeutic engines. Here, we discuss the promise of this “third pillar” of therapeutics in the context of current scientific, regulatory, economic, and perceptual challenges. History suggests that the advent of cellular medicines will require the development of a foundational cellular engineering science that provides a systematic framework for safely and predictably altering and regulating cellular behaviors.

Copyright © 2013, American Association for the Advancement of Science
Citation: M. A. Fischbach, J. A. Bluestone, W. A. Lim, Cell-Based Therapeutics: The Next Pillar of Medicine. Sci. Transl. Med. 5, 179ps7 (2013).


View the original article here

[Perspective] Developmental Biology: Programmed Cell Death in Neuronal Development

View the original article here

[News of the Week] Newsmakers

View the original article here

[Report] Translational Repression and eIF4A2 Activity Are Critical for MicroRNA-Mediated Gene Regulation

View the original article here

U.S. Plains states brace for more wild weather

By Kevin Murphy

KANSAS CITY, Missouri | Tue Apr 9, 2013 5:45pm EDT

KANSAS CITY, Missouri (Reuters) - Residents of the central United States braced for a night of nasty weather on Tuesday, with high wind, rain, sleet, hail and possible tornadoes forecast from north Texas through Nebraska.

Meteorologists said the stormy weather would result from a clash of warm southern air with a cold air mass sweeping through eastern Colorado, where heavy snow in Denver closed the airport and forced the cancellation of 535 flights on Tuesday.

"These are a couple of last hurrahs for winter," said Mike July, a meteorologist for the National Weather Service in Kansas City. "We are going through that phase of the season when we can have some rapid changes."

Moderate to heavy snow and gusty winds were forecast for Colorado, Nebraska, South Dakota and Wyoming. A day earlier in northeast Colorado, four tornado sightings were reported.

Areas south and east of the snowstorm will see sleet and freezing rain and potential flash flooding as the storm moves east, the weather service said.

Strong to severe thunderstorms were expected to move through north Texas, Oklahoma and Kansas through Tuesday evening, the weather service said.

The biggest threat in Kansas and Missouri on Tuesday evening will be hail of up to one inch in diameter, July said.

In Oklahoma, the National Weather Service warned that severe storms with possibly baseball-sized hail were likely to strike the area on Tuesday evening into Wednesday. Sleet and snow were expected, and winds gusting up to 45 miles per hour were forecast, the Weather Service said.

Oklahoma could also get some tornado activity, July said.

"There will be heavy rains through Thursday of 1.5 inches up to 3 inches in the Plains and central Midwest with 3 to 12 inches of snow in Nebraska, northeast Colorado, South Dakota, Minnesota and northern Iowa," said Don Keeney, a meteorologist for MDA Weather Services, a private forecaster.

In Denver, rain turned to snow overnight, with up to 11 inches of snow expected Tuesday. Temperatures that had been in the low 70s (Fahrenheit) on Monday dropped into the teens on Tuesday in Denver and in western Kansas, weather officials said.

In Washington County east of Denver, a microburst destroyed a mobile home and sheared off three 45-foot power poles, temporarily knocking out power to residents of Akron on Monday night, said Mike McCaleb, the county's director of emergency services.

A microburst is a sudden rush of air downward that is sometimes confused with a tornado and can do similar damage.

Large parts of western South Dakota including the Black Hills could get up to 20 inches of snow through Wednesday morning, the National Weather Service said.

Heavy snow was falling in western Nebraska, where a winter storm warning was in effect and Interstate 80 was closed in both directions due to blowing snow and poor visibility, according to the Nebraska Roads Department.

(Additional reporting by Jane Sutton in Miami, Katie Schubert in Omaha, Carey Gillam in Kansas City and Steve Olafson in Oklahoma City; Editing by Maureen Bavdek, Greg McCune and Jim Loney)


View the original article here

Editors' Choice

View the original article here

[Editors' Choice] Taking the Road Less Traveled: RAS Meets PI3K/PDK1

View the original article here

Satellite sandwich technique improves analysis of geographical data


View the original article here

[Books et al.] Books Received

View the original article here

There is no single sexy chin, study shows


View the original article here

Multiple genes robustly contribute to schizophrenia risk in replication study


View the original article here

Measuring microbes makes wetland health monitoring more affordable, says researcher


View the original article here

[Podcasts] Science Translational Medicine Podcast: 3 April 2013

Sci Transl Med 3 April 2013:
Vol. 5, Issue 179, p. 179pc2
Sci. Transl. Med. DOI: 10.1126/scitranslmed.3006201
DRUG DISCOVERY
Copyright © 2013, American Association for the Advancement of Science
Citation: J. Uslaner, O. Smith, Science Translational Medicine Podcast: 3 April 2013. Sci. Transl. Med. 5, 179pc2 (2013).


View the original article here

Podcast: A Better Sleep Drug, Spotting Cancer Cells, and the Impact of Flushed Pharmaceuticals

Have scientists developed a sleep drug without the side effects? How might a new device spot cancer cells before they metastasize? And what really happens to the drugs we flush down the toilet?

Science's Online News Editor David Grimm chats about these stories and more with Science's Sarah Crespi.

Listen to the full Science podcast.

Read the transcript.

Hear more podcasts.


View the original article here

Oceans may explain slowdown in climate change: study

The tide comes in as the sun sets on the seafront in Scarborough, northern England February 26, 2013.

Credit: Reuters/Dylan Martinez

By Environment Correspondent Alister Doyle

OSLO | Sun Apr 7, 2013 1:04pm EDT

OSLO (Reuters) - Climate change could get worse quickly if huge amounts of extra heat absorbed by the oceans are released back into the air, scientists said after unveiling new research showing that oceans have helped mitigate the effects of warming since 2000.

Heat-trapping gases are being emitted into the atmosphere faster than ever, and the 10 hottest years since records began have all taken place since 1998. But the rate at which the earth's surface is heating up has slowed somewhat since 2000, causing scientists to search for an explanation for the pause.

Experts in France and Spain said on Sunday that the oceans took up more warmth from the air around 2000. That would help explain the slowdown in surface warming but would also suggest that the pause may be only temporary.

"Most of this excess energy was absorbed in the top 700 meters (2,300 ft) of the ocean at the onset of the warming pause, 65 percent of it in the tropical Pacific and Atlantic oceans," they wrote in the journal Nature Climate Change.

Lead author Virginie Guemas of the Catalan Institute of Climate Sciences in Barcelona said the hidden heat may return to the atmosphere in the next decade, stoking warming again.

"If it is only related to natural variability then the rate of warming will increase soon," she told Reuters.

Caroline Katsman of the Royal Netherlands Meteorological Institute, an expert who was not involved in the latest study, said heat absorbed by the ocean will come back into the atmosphere if it is part of an ocean cycle such as the "El Nino" warming and "La Nina" cooling events in the Pacific.

She said the study broadly confirmed earlier research by her institute but that it was unlikely to be the full explanation of the warming pause at the surface, since it only applied to the onset of the slowdown around 2000.

THRESHOLD

The pace of climate change has big economic implications since almost 200 governments agreed in 2010 to limit surface warming to less than 2 degrees Celsius (3.6 F) above pre-industrial levels, mainly by shifting from fossil fuels.

Surface temperatures have already risen by 0.8 C. Two degrees is widely seen as a threshold for dangerous changes such as more droughts, mudslides, floods and rising sea levels.

Some governments, and skeptics who doubt that man-made climate change is a big problem, argue that the slowdown in the warming trend shows there is less urgency to act. Governments have agreed to work out, by the end of 2015, a global deal to combat climate change.

Last year was the ninth warmest since records began in the 1850s, according to the U.N.'s World Meteorological Organization, and 2010 was the warmest, just ahead of 1998. Apart from 1998, the 10 hottest years have all been since 2000.

Guemas's study, twinning observations and computer models, showed that natural La Nina weather events in the Pacific around the year 2000 brought cool waters to the surface that absorbed more heat from the air. In another set of natural variations, the Atlantic also soaked up more heat.

"Global warming is continuing but it's being manifested in somewhat different ways," said Kevin Trenberth, of the U.S. National Center for Atmospheric Research. Warming can go, for instance, to the air, water, land or to melting ice and snow.

Warmth is spreading to ever deeper ocean levels, he said, adding that pauses in surface warming could last 15-20 years.

"Recent warming rates of the waters below 700 meters appear to be unprecedented," he and colleagues wrote in a study last month in the journal Geophysical Research Letters.

The U.N. panel of climate scientists says it is at least 90 percent certain that human activities - rather than natural variations in the climate - are the main cause of warming in recent decades.

(Reporting by Alister Doyle, Environment Correspondent; Editing by Peter Graff)


View the original article here

[Report] Decameric SelA•tRNASec Ring Structure Reveals Mechanism of Bacterial Selenocysteine Formation

View the original article here

Air pollution scourge underestimated, green energy can help: U.N.

Artist Matt Hope adjusts the helmet linked to his air filtration bike in front of the China Central Television (CCTV) building on a hazy day in Beijing, March 26, 2013.

Credit: Reuters/Petar Kujundzic

OSLO | Tue Apr 9, 2013 12:02pm EDT

OSLO (Reuters) - Air pollution is an underestimated scourge that kills far more people than AIDS and malaria and a shift to cleaner energy could easily halve the toll by 2030, U.N. officials said on Tuesday.

Investments in solar, wind or hydropower would benefit both human health and a drive by almost 200 nations to slow climate change, blamed mainly on a build-up of greenhouse gases in the atmosphere from use of fossil fuels, they said.

"Air pollution is causing more deaths than HIV or malaria combined," Kandeh Yumkella, director general of the U.N. Industrial Development Organization, told a conference in Oslo trying to work out new U.N. development goals for 2030.

Most victims of indoor pollution, caused by wood fires and primitive stoves in developing nations, were women and children.

He suggested that new U.N. energy goals for 2030 should include halving the number of premature deaths caused by indoor and outdoor pollution.

A 2012 World Health Organization (WHO) study found that 3.5 million people die early annually from indoor air pollution and 3.3 million from outdoor air pollution. Toxic particles shorten lives by causing diseases such as pneumonia or cancer.

"The problem has been underestimated in the past," Maria Neira, the WHO's director of public health and environment, told Reuters. Smog is an acute problem from Beijing to Mexico City.

The data, published as part of a global review of causes of death in December 2012, were an upwards revision of previous figures of 1.9 million premature deaths caused by household pollution a year and 1.3 million outdoors, she said.

The revision reflects better measurements and changes in methods, such as including heart problems linked to pollutants, she said. The numbers cannot be added together because they include perhaps 500,000 from overlapping causes.

SIX MILLION

"Still, it means more than 6 million deaths every year caused by air pollution," she said. "The horrible thing is that this will be growing" because of rising use of fossil fuels.

By comparison, U.N. reports show there were about 1.7 million AIDS-related deaths in 2011 and malaria killed about 660,000 people in 2010.

Solutions were affordable, the experts said.

"If we increase access to clean energy ... the health benefits will be enormous. Maybe the health argument was not used enough" in debate on encouraging a shift from fossil fuels to renewable energies, she said.

Almost 200 governments have agreed to work out by the end of 2015 a deal to combat climate change. But negotiations have stalled, partly because of economic slowdown and divisions between nations about how to share out the burden of cuts.

Yumkella also urged the world to build 400,000 clinics and medical units in developing nations by 2030 as part of U.N. energy and health goals. Vaccines, for instance, are often useless without refrigeration, which depends on electricity.

The United Nations has previously urged 2030 targets for universal access to energy, doubling the global rate of improvement in energy efficiency and doubling the share of renewable energy in global consumption.

(Reporting by Alister Doyle, Environment Correspondent; Editing by Jon Hemming)


View the original article here

Do you get what you pay for? It depends on your culture


View the original article here

[Report] Rats and Humans Can Optimally Accumulate Evidence for Decision-Making

View the original article here

Electronic implants: New fast transcutaneous non-invasive battery recharger and energy feeder


View the original article here

Selling concert tickets? Consider parking when setting the price


View the original article here

U.S. blizzard, possible tornadoes forecast in nasty weather week

By Kevin Murphy

KANSAS CITY, Kansas | Mon Apr 8, 2013 5:31pm EDT

KANSAS CITY, Kansas (Reuters) - Forecasters called for large hail and possible tornadoes in western Kansas and a blizzard in four other states on Monday in the first of what are expected to be several days of nasty weather in the middle of the country.

The blizzard was expected to hit Colorado, Nebraska, South Dakota and Wyoming on Monday. An Arctic cold front has triggered winter weather warnings over most of Colorado, said National Weather Service meteorologist Jim Kalina.

Meanwhile, warm air from the south mixing with cold air from Colorado is expected to cause severe weather in western Kansas, including possible tornadoes, said weather service meteorologist Matt Gerard, based in Dodge City, Kansas.

"It's a clash of air masses going on," Gerard said, adding that forecasts call for large hail in western Kansas.

Denver and its urban area could get up to 11 inches of snow overnight and through Tuesday, said Kalina. He said temperatures could plunge some 40 degrees from the mid-60s on Monday to well below freezing when the front moves through.

Areas from Denver to Rapid City, South Dakota; Casper, Wyoming; and Scottsbluff, Nebraska are expected to see blizzard conditions between Monday night and Tuesday, with plunging temperatures, high winds and heavy snow, according to Accuweather.com. The blizzard is forecast to move into north central Nebraska and central Minnesota later Tuesday into Wednesday.

South Dakota transportation officials advised travelers to move up travel plans to reach intended destinations during daylight hours, and be prepared to stay in until the storm passes. Heavy snowfall is expected, from 3 to 16 inches in the state, with winds up to 40 miles per hour.

The nasty weather will move toward more populated areas on Tuesday evening, with hail, damaging winds and some possibility of tornadoes predicted around Kansas City, Oklahoma City, and the Dallas-Fort Worth area in Texas, according to Robert Thompson, lead forecaster with the National Storm Prediction Center in Norman, Oklahoma.

Forecasters expect the front to hit Arkansas Wednesday afternoon and evening, with a line of thunderstorms expected to bring as much as three inches of rain and damaging winds, according to the National Weather Service.

The tornado season in the United States typically starts in the Gulf Coast states in the late winter, and then moves north with the warming weather, peaking around May and trailing off by July.

(Reporting by Kevin Murphy in Kansas City, Suzi Parker in Arkansas, Keith Coffman in Denver and Mary Wisniewski in Chicago; Editing by Scott Malone, Sofina Mirza-Reid, Greg McCune and James Dalgleish)


View the original article here

[Report] Influence of HLA-C Expression Level on HIV Control

View the original article here

[Editorial] Climate Change Conversations

Science 5 April 2013:
Vol. 340 no. 6128 p. 9
DOI: 10.1126/science.1238241
Bassam Z. Shakhashiri1,*, Jerry A. Bell2,†
1Bassam Z. Shakhashiri holds the William T. Evjue Distinguished Chair for the Wisconsin Idea and is a professor in the Department of Chemistry at the University of Wisconsin, Madison, WI. He was president of the American Chemical Society in 2012.
2Jerry A. Bell is an emeritus professor in the Department of Chemistry at Simmons College, Boston, MA, and chair of the American Chemical Society's presidential working group on climate science.
*E-mail: bassam{at}chem.wisc.edu. †E-mail: j_bell{at}acs.org.
Credits: (left) Bryce Richter/University Communications, UW-Madison; (right) Sean Parsons/American Chemical Society
The thousands of presentations at next week's meeting of the American Chemical Society (ACS) in New Orleans exemplify one of the many ways scientists converse among themselves about the most recent advances in science. Science and technology continue to reshape the world we live in, and appreciating how these changes, both intended and unintended, come about is a necessity for all citizens in a democratic society. Scientists have a responsibility to help their fellow citizens understand what science and technology can and cannot do for them.

Communicating the science of climate change provides one example where the scientific community must do more. Climate change affects everyone, so everyone should understand why the climate is changing and what it means for them, their children, and generations to follow. Scientists are already members of groups that can facilitate this communication: neighborhoods, school boards, religious groups, service clubs, political organizations, and so on. These groups present opportunities to engage in respectful conversations on climate change and on the policies and actions that individuals, communities, and nations might take to mitigate and adapt to what is happening to our planet.

Credit: David Jones/iStockphoto
We know that the concentrations of greenhouse gases in Earth's atmosphere are higher and increasing faster than at any time in the past 1 million years.* The average temperature of Earth is increasing, ice is melting, oceans are acidifying, and extreme weather events are more frequent. Human activities, principally the combustion of fossil fuels, are a major source of greenhouse gases and a major driver of climate change. To share this knowledge with the public and be credible as a “scientist-citizen,” a scientist must acquire a good grasp of the science of climate change.

In recent years, U.S. scientific institutions and societies, including the National Academies, Environmental Protection Agency, National Aeronautics and Space Administration, and American Institute of Physics have prepared Web-based materials on the science of climate change suitable for communicating with the public.† Last year, the ACS released a Climate Science Toolkit on greenhouse gases, atmospheric and planetary warming, and Earth's energy balance, among other topics.‡ The Toolkit provides a succinct intellectual foundation at an introductory level that can be a guide to more extensive resources. Some of the materials are in forms (such as slide shows) that scientists may use to present this subject to the public, and there is a series of brief narratives designed to help scientists initiate informal conversations with others. Implicit in this resource is the message that the world must make adaptations to changes that have already occurred and that reducing emissions is required to avoid a warmer planet. Scientist-citizens can stress how lifestyle decisions that reduce energy consumption are actually meaningful steps. Supporting elected officials who promote policies and practices aimed to decrease the effects of global warming is another step that individuals and citizens' groups should take.

F. Sherwood Rowland was a central figure in the late–20th-century controversy about the effect of chlorofluorocarbons on stratospheric ozone. For years, he engaged audiences ranging from students to members of the U.S. Congress. As an exemplary scientist-citizen, his focus eventually led to the worldwide ban on these compounds. Rowland spoke to all scientist-citizens when he asked: “Isn't it the responsibility of scientists, if you believe that you have found something that can affect the environment, isn't it your responsibility to do something about it, enough so that action actually takes place?…If not us, who? If not now, when?”§

We pose these same questions and ask you to join the conversations now.


View the original article here

New chart shows the entire topography of the Antarctic seafloor in detail for the first time


View the original article here

Shale-rich Spanish region votes to ban fracking

MADRID | Mon Apr 8, 2013 1:52pm EDT

MADRID (Reuters) - Lawmakers in Spain's northern Cantabria region unanimously voted on Monday to ban hydraulic fracturing on environmental concerns, shooting down the central government's hopes for a project to boost jobs in a region believed to be rich in shale gas.

Spain, battling a deep recession and high unemployment, imports about 76 percent of its energy needs, and the technology to extract shale gas, known as fracking, could help reduce its dependence on foreign oil, coal and gas.

Early estimates indicate Spain has large shale gas reserves, but environmentalists have voiced concerns over the safety of the technique, which involves injecting water and chemicals at high pressure into underground rock formations.

Cantabria's ruling People's Party (PP), which has an absolute majority in the regional parliament, proposed the law to ban the practice.

The bill passed with support from all political parties in the Cantabrian parliament on Monday afternoon. The Cantabrian parliament said on its Twitter feed: "Unanimous. Law to ban hydraulic fracturing approved."

"In Cantabria, there is a very large social movement against fracking... the bill will be passed unanimously by the three parliamentary groups. The region is very small and highly populated," a PP source told Reuters earlier on Monday.

However, at a national level, the PP has voiced support for hydraulic fracturing as long as it complies with environmental rules. The ruling PP, which controls the Spanish parliament, could seek to appeal or overturn Cantabria's ban.

Experts say that if fracking is done according to best practice it is environmentally safe, but the technology still evokes much public concern, especially in Europe.

In the United States, shale gas has helped transform the energy market by lowering gas and coal prices, which are in turn helping to lure gas-intensive industries such as petrochemicals back to home soil thanks to the abundance of low-cost energy.

But in Europe, it has made far slower progress and has met with environmental concerns that have triggered bans on fracking in France and Bulgaria.

Shale Gas Europe, a lobby group, says Spanish shale gas reserves are among the biggest in the world.

"Spain's significant reserves, if technically recoverable, will transform its economy at a time when the country is struggling with a burgeoning debt and has been forced to adopt austerity measures," the group says.

Although there is no reliable data available, some analysts say Spain's shale gas reserves could be as high as 1.4 trillion cubic meters, enough to cover European Union demand for around three years.
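
For context, the "around three years" claim implies an EU demand figure that the article does not state. A minimal sketch of that implied arithmetic, assuming the full 1.4 trillion cubic meter estimate were recoverable:

```python
# Rough consistency check of the reserve figure quoted above. The EU demand
# number is only implied by the article (reserves divided by years of cover),
# not stated, and assumes all of the estimated gas were recoverable.

reserves_bcm = 1400.0        # 1.4 trillion cubic meters, in billion cubic meters
years_of_eu_cover = 3.0      # "around three years", per the article

implied_eu_demand_bcm = reserves_bcm / years_of_eu_cover
print(f"Implied EU gas demand: roughly {implied_eu_demand_bcm:.0f} bcm per year")
# -> roughly 467 bcm per year
```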

Early estimates have, however, proven unreliable in other cases.

Poland, which had hoped to be sitting in some of Europe's biggest reserves, had to slash its initial estimates by 90 percent last year after detailed follow-up surveys and drillings disappointed.

(Reporting By Tracy Rucinski; Additional reporting by Paul Day and Clare Kane in MADRID and Henning Gloystein in LONDON; editing by James Jukwey)


View the original article here

[Perspective] Ecology: Dynamics of Coral Reef Recovery

View the original article here

Advancing secure communications: A better single-photon emitter for quantum cryptography


View the original article here

Neutrons help explain ozone poisoning and links to thousands of premature deaths each year


View the original article here

Short-term benefits seen with repetitive transcranial magnetic stimulation for focal hand dystonia


View the original article here

A step toward optical transistors?


View the original article here

[Report] Transposition-Driven Genomic Heterogeneity in the Drosophila Brain

View the original article here

Nanotechnology imaging breakthrough


View the original article here

U.S. Navy to field first laser weapon, could shoot down a drone

The amphibious transport dock ship USS Ponce is seen underway in the U.S. 5th fleet area of responsibility in the Red Sea in this February 16, 2011 handout photo provided by the U.S. Navy.

Credit: Reuters/U.S. Navy/Mass Communication Specialist 3rd Class Scott Pittman/Handout

WASHINGTON | Mon Apr 8, 2013 7:49pm EDT

WASHINGTON (Reuters) - The U.S. Navy said on Monday it will deploy for the first time a laser weapon on one of its ships that could be capable of shooting down drones and disabling vessels.

"The future is here," said Peter Morrison at the Office of Naval Research's Solid-State Laser Technology Maturation Program.

The weapon is being billed as a step toward transforming warfare. Since it runs on electricity, it can fire as long as there is power, at a cost of less than $1 per shot.

"Compare that to the hundreds of thousands of dollars it costs to fire a missile, and you can begin to see the merits of this capability," Chief of Naval Research Rear Admiral Matthew Klunder, said in a statement.

The prototype, which one official said cost between $31 million and $32 million to make, will be installed aboard the USS Ponce, which is being used as a floating base in the Middle East, sometime in fiscal year 2014, which begins in October.

A Navy video showing the laser shooting down a drone can be seen at youtu.be/OmoldX1wKYQ

Klunder said the Navy expects that someday incoming missiles will not be able to "simply outmaneuver" a highly accurate laser beam traveling at the speed of light.

A new report from the Congressional Research Service praises the laser technology but also notes drawbacks, including the potential it could accidentally hit satellites or aircraft. Weather also affects lasers.

"Lasers might not work well, or at all, in rain or fog, preventing lasers from being an all-weather solution," it said in its report issued on March 14.

(Reporting by Phil Stewart; editing by Xavier Briand)


View the original article here

[Report] Broadband 2D Electronic Spectroscopy Reveals a Carotenoid Dark State in Purple Bacteria

View the original article here

Research holds revelations about an ancient society's water conservation, purification


View the original article here

Interior Department recommends removal of dams on Klamath River to aid salmon

By Laura Zuckerman

Thu Apr 4, 2013 11:16pm EDT

(Reuters) - The government">
(Reuters) - The government on Thursday recommended the removal of four hydroelectric dams on the Klamath River in Oregon and California to aid native salmon runs and help resolve a decades-long struggle over allocation of scarce water resources.

The Interior Department proposal, which comes as the largest dam removal project in U.S. history is nearing completion in Washington state, concerns a system of dams that straddle the Oregon-California border.

The proposal to dismantle the dams owned by utility PacifiCorp coincides with a broader push by environmentalists and others to restore salmon fisheries in the Klamath Basin and elsewhere in the nation.

The dams recommended for removal, two in Oregon and two in California, block upstream spawning migrations of salmon and place juvenile fish at risk by slowing their return to the Pacific Ocean.

Removing them would open 420 miles of salmon habitat for the first time in 100 years, eliminate turbines that grind up fish and restore the Klamath River channel, according to the government analysis.

The recommendation stems from a 2010 agreement among competing Klamath Basin water users that called for the government to determine if removing the dams would restore failing salmon runs and lessen conflicts in regional water management.

The Klamath River contains several fish species, including Coho salmon, that are on the federal threatened and endangered species list, and repeated droughts in the basin have periodically forced water managers to allocate flows to protected fish rather than to farmers for irrigation.

LEGAL WRANGLING

The recommendation, which came in an environmental impact statement released by Interior, follows years of legal wrangling and periods of low flows that saw massive die-offs of salmon, shut-offs of irrigation districts and tightening of rules for hydroelectric projects that caused them to operate at losses.

The near collapse of Klamath Basin Chinook salmon led the government in 2006 to severely restrict commercial and sport fishing in the Klamath River and along 700 miles of the California and Oregon coast.

In a statement, Interior Secretary Ken Salazar on Thursday described the dismantling of the dams as "a comprehensive solution addressing all of the needs of the Klamath Basin, including fisheries, agriculture, refuges and power."

Under the proposal, which must still gain congressional approval, the dams would be removed over 20 months at a cost of $450 million to be garnered from rate payers and bonds.

If the dams were to remain in place, PacifiCorp would incur more than $460 million in costs for relicensing, operation and maintenance of aging structures that have proved unprofitable, the analysis shows.

Glen Spain, regional director of the Pacific Coast Federation of Fishermen's Associations, said the analysis "confirms that dam removal is both feasible and cheaper than any other option." But Klamath County Commissioners have withdrawn their support for taking down the dams.

The largest dam-removal project in U.S. history is expected to be completed this summer with the dismantling of the second of two towering dams on the Elwha River in Olympic National Park in Washington.

The project is designed to allow salmon to return to their historic spawning areas and raise salmon counts from 3,000 to 400,000.

(Editing by Cynthia Johnston and Todd Eastham)


View the original article here

No regrets: Close that menu and enjoy your meal more


View the original article here

[Report] Nuclear Actin Network Assembly by Formins Regulates the SRF Coactivator MAL

Christian Baarlink, Haicui Wang, Robert Grosse*

Institute of Pharmacology, Biochemical-Pharmacological Center (BPC), University of Marburg, Germany
*Corresponding author. E-mail: robert.grosse{at}staff.uni-marburg.de
Formins are potent activators of actin filament assembly in the cytoplasm. In turn, cytoplasmic actin polymerization can promote release of actin from megakaryocytic acute leukemia (MAL) for serum response factor (SRF) transcriptional activity. Here, we found that formins polymerized actin inside the mammalian nucleus to drive serum-dependent MAL/SRF activity. Serum stimulated rapid assembly of actin filaments within the nucleus in a formin-dependent manner. Endogenous mDia was regulated using an optogenetic tool, which allowed for photoreactive release of nuclear formin autoinhibition. Activated mDia promoted rapid and reversible nuclear actin network assembly, subsequent MAL nuclear accumulation, and SRF activity. Thus, a dynamic polymeric actin structure within the nucleus is part of the serum response.

Received for publication 11 January 2013. Accepted for publication 26 March 2013.


View the original article here

Women with low self-esteem work harder to keep a keeper


View the original article here

[News Focus] Molecular Biology: 'Dead' Enzymes Show Signs of Life

View the original article here

[Book Review] Browsings

View the original article here

Wednesday, April 10, 2013

'Super solvents' voted 'Most Important British Innovation of the 21st Century'

Ken Seddon and Jim Swindall.

Research by scientists from Queen's University Belfast on ionic liquid chemistry has been named the 'Most Important British Innovation of the 21st Century'.

The work of staff in the Queen's University Ionic Liquid Laboratories (QUILL) Research Centre has been named as the innovation that will have the greatest impact in the coming century.

QUILL fought off stiff competition from 11 other innovations from across the United Kingdom to win the vote, which was part of the Science Museum's initiative on Great British past and future innovations. The initiative was also sponsored, amongst others, by Engineering UK, The Royal Society, the British Science Association, the Royal Academy of Engineering and the Department for Business Innovation & Skills.

A team of nearly 100 scientists are exploring the potential of ionic liquids at Queen's. Known as 'super solvents', they are salts that remain liquid at room temperature and do not form vapours. They can be used as non-polluting alternatives to conventional solvents and are revolutionising chemical processes by offering a much more environmentally friendly solution than traditional methods.

Professor Ken Seddon is Co-Director of QUILL. His seminal paper started the worldwide surge of interest in ionic liquids and has now attracted more than 1,000 citations. Speaking about their latest achievement, he said: "We are delighted to win as this shines a very public spotlight on how a team of chemists can dramatically improve the quality of the environment for everyone. Being named the most important British innovation of the 21st Century is recognition of the high calibre of research being undertaken at QUILL and throughout the University."

Professor Jim Swindall, Co-Director of QUILL at Queen's, said: "This is fantastic news for QUILL and for the University. This vote confirms that Queen's work on ionic liquid chemistry will eventually have a bearing on most of our lives. The liquids dissolve almost everything, from elements such as sulfur and phosphorus (that traditionally require nasty solvents) to polymers, including biomass. They can even remove bacterial biofilms such as MRSA. They are already being used in a process to remove mercury from natural gas by Petronas in Malaysia. Others can be used as heat pumps, compression fluids, or lubricants - the list is limitless."

Enterprise Minister Arlene Foster said: "I congratulate Queen's University on winning this most prestigious of accolades. It is a great achievement for Professors Ken Seddon and Jim Swindall and the entire team at QUILL and it is a great day for Northern Ireland science. This recognition underlines the strength of research being undertaken by Queen's and the impact this research has on the chemical and environmental industry around the world."

Robin Swann, Chairman of the Northern Ireland Assembly's Committee for Employment and Learning said: "The result of this public vote is terrific news for Northern Ireland as it demonstrates the importance of the research being undertaken at Queen's. The fact that global energy giant Petronas is already using the technology in its plants demonstrates the value and global impact of the research at the University and I congratulate Queen's on this significant achievement."

Provided by Queen's University Belfast


View the original article here

Communicating the science of the '65-degree egg'

Why do the “65-degree egg” and its “6X°C” counterparts continue to entice chefs and diners at chic restaurants, when the science underpinning that supposed recipe for perfection in boiling an egg is flawed?

It all boils down to the need for greater society-wide understanding of basic scientific concepts, an expert said here today at the 245th National Meeting & Exposition of the American Chemical Society. And in one of the keynote addresses at the meeting, which features almost 12,000 scientific reports, César Vega, Ph.D., explained why cooking ranks as an ideal way of fostering broader awareness about science.

"Cooking is chemistry, and the kitchen is a laboratory," said Vega. "Cooking and food are the single most direct and obvious personal experiences that people have with chemistry. Food is personal. Food is fun! Seemingly simple foods like cookies, fondue and eggs help illustrate key scientific principles. Why are some cookies chewy and others crunchy—or even better, both at the same time? Why do egg whites whip better if we add cream of tartar? Why does Gruyère cheese make the perfect fondue? The sights, the smells, the textures of food can help people remember the science."

The fascination by both diners and chefs de cuisine with that "6X°C" egg is a good example, said Vega, who has a Ph.D. in food science, culinary training from Le Cordon Bleu, and is research manager at Mars Botanical, a division of Mars, Incorporated. Vega also co-edited, with Job Ubbink and Erik van der Linden, The Kitchen as Laboratory: Reflections on the Science of Food and Cooking.

Heating an egg may seem like the simplest form of cooking, next to boiling water, Vega pointed out. But the best way of doing so remains a surprisingly contentious issue among great chefs. Eggs are a gastronomic enigma because the ovotransferrin and ovalbumin proteins in the white begin to coagulate or solidify at around 142 and 184 degrees Fahrenheit, respectively. The phosvitins and other egg yolk proteins, however, can start thickening even at temperatures as low as 130 degrees F. So what's the right temperature for the perfect egg?

Vega explained that some professional cooks have taken a relatively new approach by cooking eggs in temperature-controlled water circulators. Using these devices, chefs cook eggs at relatively low temperatures (such as 60 degrees C, or 140 degrees F), for relatively long periods of time (at least 1 hour). And what Vega terms the "6X°C egg" is now ubiquitous on menus in chic restaurants. The "X" varies depending on the cook, but usually is from 0 to 5, such as the "65°C egg." But chefs claim that temperature alone translates into the perfectly cooked egg, and cooking time—one hour or three hours—does not matter.

"The idea that cooking time does not matter is nonsense," Vega said, citing research he did and published in the peer-reviewed journal Food Biophysics that debunked the idea. It carefully documented that the texture of a cooked egg yolk depends on both temperature and time. The study gives chefs precise numbers of the time and temperature combinations needed to cook eggs to whatever firmness they want.

Eggs certainly are not the only entry on the menu of scientific misconceptions in the kitchen. Vega pointed out that research published last year challenged time-honored ideas about the browning of sugar, known as caramelization. Everyone thought that sugar had to first melt before undergoing that mouth-watering transformation into caramel. The new research showed, however, that sugar can caramelize when heated while it's still solid.

"It's dismaying to think that so many could be so wrong for so long about what actually happens to such basic ingredients like sugar or eggs during cooking," Vega said. "But it also provides a rare opportunity to rethink the possibilities of the basic, and to communicate accurate information and the fun and excitement of science to the public."

More information: Abstract

It is no secret that communicating science to the general public is a challenge. And it should be no secret that doing this effectively provides the non-scientist with a more realistic perspective of the reaches and limitations of science, and puts them in a better position to interpret and even apply scientific information. However, it seems that we have plenty of homework ahead of us. First, consider that for the most part the audience we aim to reach has less technical/scientific knowledge, and second, that it is others (i.e., the media) who take the lead on translating our findings – not always accurately. Inaccurate science interpretation results in misinformation and confusion among the lay public, which then minimizes the ability to make informed decisions based on science. Is there something we can do to improve the quality of the message? Absolutely. I will try to make a case for the above through food. Food is personal…and complex. Cooking makes food even more personal and brings up further complexity through the transformations and interactions that it promotes. I'm sure that most of your non-scientist friends wonder why using cream of tartar makes better meringues, do you? Cooking is a great tool to demonstrate scientific principles and science offers an exciting path into the kitchen. The challenge is in making the bridge, allowing others to cross it from both sides. I will provide a few examples of bridges that others and I have built.

Provided by American Chemical Society


View the original article here

When boron butts in: Bridging N–N ligand borylation in group 4 metallocene complexes

For nature and chemists alike, making atmospheric nitrogen available for the formation of more complex nitrogen compounds is both essential and difficult. In the European Journal of Inorganic Chemistry, Paul Chirik and Scott Semproni at Princeton University, USA, report the first examples of the use of group 4 metallocene complexes for boron–nitrogen bond formation from elemental N2.

The bond in molecular nitrogen (N2) is very hard to cleave. Transition metal complexes have increasingly been used to this end, and recently it was shown that N–N bond cleavage can be coupled to N–element bond formation using suitable reagents. Chirik and Semproni achieved this coupling by treating hafno- and zirconocene complexes containing a highly activated, side-on bound bridging N–N ligand with pinacolborane. Subsequent carbonylation of the borylated fragment leads to N–N bond cleavage and concomitant N–C bond formation.

In contrast, treatment of the borylated metallocene with cyclohexanecarbonitrile or tert-butylisocyanide results only in the insertion of the cyanide ligand into the metal–hydrogen bond but not in cleavage of the N–N bond.

The cyclopentadienyl rings used in the metallocene complexes are well suited for the construction of more elaborate nitrogen-based ligands after dinitrogen functionalization, as they are robust and do not give rise to undesired ancillary ligation. The clean reactivity reported was achieved by a systematic study of the substitution of these ligands to obtain the appropriate activation of the side-on bound N2 ligand. These reactions expand the scope of CO-induced N–N bond cleavage.

More information: Chirik, P. Dinitrogen Borylation with Group 4 Metallocene Complexes, European Journal of Inorganic Chemistry, Permalink to the article: dx.doi.org/10.1002/ejic.201300046

Journal reference: European Journal of Inorganic Chemistry

Provided by Wiley


View the original article here

New wastewater treatment technique protects fish from antidepressants

The membrane distillation technology at Hammarby Sjöstadsverket in Sweden.

Researchers at KTH Royal Institute of Technology in Stockholm have developed a new technique to prevent pharmaceutical residues from entering waterways and harming wildlife.

The new water treatment technology – called membrane distillation – separates drug residues from sewage with the help of district heating, says Andrew Martin, a professor at KTH's Institute of Energy Technology who worked on the development project with IVL and Scarab Development AB.

Martin says that water vapor passes through a thin, hydrophobic membrane of a material similar to Gore-Tex and across an air gap, where it condenses onto a cold surface. Drug residues collect on one side of the membrane and pure water on the other.

"There is currently no technology capable of doing this cleaning process on a large scale," Martin says. "And for the membrane distillation process to work, the water temperature does not need to be very high, which is good."

Pharmaceutical residues in wastewater have been found to alter fish behavior and could even affect the growth of algae. A recent study at Sweden's Umeå University shows that even low levels of Oxazepam, such as those detected in the Fyris River in central Sweden, caused perch to become more antisocial, risk-prone and active, making them an easier target for predators such as pike. The study found Oxazepam levels in the perch that were six times higher than in the water itself.

The study also indicated that the release of anti-anxiety drugs can affect entire ecosystems in a waterway, possibly contributing to increases or decreases in the incidence of algae.

In a test of the membrane distillation technique at Hammarby Sjöstadsverket in Sweden, researchers found a level of 282 nanograms of Oxazepam per litre of wastewater. After ordinary treatment, that level of pharmaceuticals would essentially remain unchanged when the water is returned to the local waterway. But when treated with the membrane distillation system, the concentration was reduced to less than 2 nanograms per litre.
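Taken at face value, those two concentrations imply that membrane distillation removed more than 99 percent of the Oxazepam; a back-of-the-envelope calculation from the figures quoted above:

\[ \text{removal} \;\geq\; \frac{282 - 2}{282} \times 100\,\% \;\approx\; 99.3\,\% \]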

"Of all the 20th century-tested drugs, it is only the remains of the antidepressant Sertraline that we failed to clear 100 percent," Martin says. "We have some theories, but cannot yet explain why."

Martin and his colleagues are now awaiting results from the next step in the evolution of the technique. They are testing membrane distillation with drug residue levels that are nearly 10 times higher. "These samples are out for analysis right now," he says.

Provided by KTH Royal Institute of Technology


View the original article here

New method for uncovering side effects before a drug hits the market

Side effects are a major reason that drugs are taken off the market and a major reason why patients stop taking their medications, but scientists are now reporting the development of a new way to predict those adverse reactions ahead of time. The report on the method, which could save patients from severe side effects and save drug companies time and money, appears in ACS' Journal of Chemical Information and Modeling.

Yoshihiro Yamanishi and colleagues explain that drug side effects are a major health problem—the fourth-leading cause of death in the U.S.—which by some estimates claim 100,000 lives every year. Serious side effects are the main reason why existing drugs must be removed from the market and why pharmaceutical companies halt development of new drugs after investing millions of dollars. Current methods of testing for side effects are costly and inaccurate. That's why the scientists sought to develop a new computer-based approach to predicting possible side effects.

They demonstrated the usefulness of their proposed method by simultaneously predicting 969 side effects of 658 drugs that are already in wide medical use. The method draws on chemical and biological information about the ingredients in these medications. They also used the approach to identify possible side effects for many uncharacterized molecules. Based on that work, the scientists conclude that the new method could help uncover serious side effects early in the development and testing of new drugs, avoiding costly investment in medications unsuitable for marketing.

More information: Drug Side-Effect Prediction Based on the Integration of Chemical and Biological Spaces, J. Chem. Inf. Model., 2012, 52 (12), pp 3284–3292. DOI: 10.1021/ci2005548

Abstract
Drug side-effects, or adverse drug reactions, have become a major public health concern and remain one of the main causes of drug failure and of drug withdrawal once they have reached the market. Therefore, the identification of potential severe side-effects is a challenging issue. In this paper, we develop a new method to predict potential side-effect profiles of drug candidate molecules based on their chemical structures and target protein information on a large scale. We propose several extensions of kernel regression model for multiple responses to deal with heterogeneous data sources. The originality lies in the integration of the chemical space of drug chemical structures and the biological space of drug target proteins in a unified framework. As a result, we demonstrate the usefulness of the proposed method on the simultaneous prediction of 969 side-effects for approved drugs from their chemical substructure and target protein profiles and show that the prediction accuracy consistently improves owing to the proposed regression model and integration of chemical and biological information. We also conduct a comprehensive side-effect prediction for uncharacterized drug molecules stored in DrugBank and confirm interesting predictions using independent information sources. The proposed method is expected to be useful at many stages of the drug development process.
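As a rough illustration of the kind of model the abstract describes, the sketch below runs a generic multi-output kernel ridge regression that combines a "chemical" kernel with a "biological" kernel. The kernel choice, the way the two kernels are combined, the random data and the hyperparameters are all illustrative assumptions; this is not the paper's actual model or data.

# Generic multi-output kernel ridge regression combining a chemical kernel and a
# biological kernel, in the spirit of the integration described above. All data
# here is randomly generated for illustration only.

import numpy as np

rng = np.random.default_rng(0)
n_drugs, n_chem, n_bio, n_side_effects = 50, 30, 20, 10

X_chem = rng.random((n_drugs, n_chem))   # stand-in for substructure fingerprints
X_bio = rng.random((n_drugs, n_bio))     # stand-in for target-protein profiles
Y = (rng.random((n_drugs, n_side_effects)) > 0.8).astype(float)  # side-effect labels

def gaussian_kernel(X, gamma=0.1):
    """Pairwise Gaussian (RBF) kernel matrix for the rows of X."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# One simple way to integrate the two data sources: sum their kernels.
K = gaussian_kernel(X_chem) + gaussian_kernel(X_bio)

# Multi-output kernel ridge regression: solve (K + lambda * I) A = Y.
lam = 1.0
A = np.linalg.solve(K + lam * np.eye(n_drugs), Y)

# In-sample predicted side-effect scores: one score per drug per side effect.
scores = K @ A
print(scores.shape)  # (50, 10)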

Provided by American Chemical Society


View the original article here

Widely used filtering material adds arsenic to beers

The mystery of how arsenic levels in beer sold in Germany could be higher than in the water or other ingredients used to brew the beer has been solved, scientists announced here today at the 245th National Meeting & Exposition of the American Chemical Society, the world's largest scientific society. The meeting, which features almost 12,000 reports and other presentations, continues through Thursday.

Mehmet Coelhan, Ph.D., and colleagues said the discovery could be of importance for breweries and other food processors elsewhere that use the same filtering technology implicated in the elevated arsenic levels in some German beers. Coelhan's team at the Technische Universität in Munich set out to solve that riddle after testing 140 samples of beers sold in Germany as part of a monitoring program. The monitoring checked levels of heavy metals like arsenic and lead, as well as natural toxins that can contaminate grain used in brewing beer, pesticides and other undesirable substances.

Coelhan explained that the World Health Organization uses 10 micrograms of arsenic per liter of drinking water as a limit. However, some beers contained higher arsenic levels. "When the arsenic level in beer is higher than in the water used during brewing, this excess arsenic must come from other sources," Coelhan noted. "That was a mystery to us. As a consequence, we analyzed all materials, including the malt and the hops used during brewing, for the presence of arsenic."

They concluded that the arsenic was released into the beer by a filtering material called kieselguhr, or diatomaceous earth, used to remove yeast, hops and other particles and to give the beer a crystal-clear appearance. Diatomaceous earth consists of the fossilized remains of diatoms, a type of hard-shelled algae that lived millions of years ago. It is widely used to filter beer and wine and is an ingredient in other products.

"We concluded that kieselguhr may be a significant source of arsenic contamination in beer," Coelhan said. "This conclusion was supported by analysis of kieselguhr samples. These tests revealed that some kieselguhr samples release arsenic. The resulting arsenic levels were only slightly elevated, and it is not likely that people would get sick from drinking beers made with this filtration method because of the arsenic. The arsenic is still at low levels—the risk of alcohol poisoning is a far more realistic concern, as stated in previous studies on the topic."

Coelhan pointed out that beers produced in at least six other countries had higher arsenic amounts than German beers, according to a report published four years ago. He said that breweries, wineries and other food processors that use kieselguhr should be aware that the substance can release arsenic. Substitutes for kieselguhr are available, he noted, and simple measures like washing kieselguhr with water can remove the arsenic before use.

More information: Abstract

The German brewing industry has more than 1,000 members and produces 100 million hectoliters of beer annually. Although per-capita beer consumption in Germany has been stagnating for many years, it is relatively high (around 100 liters) compared to many other countries. A large part of the beer produced is exported. Brewers take a great deal of care to ensure that the beers they produce are entirely safe. Heavy metals are subject to stringent legislation under German and EU law. In addition to these requirements, contaminants are subject to monitoring organized by the Association of German Beer Brewers, in order to check for any adverse effects on malt or beer quality or other effects on processing before they are accepted for use on malting barley. Hence, a malt monitoring program was started in 2011 in Germany to explore levels of heavy metals, mycotoxins, dioxins and dioxin-like PCBs, and a large number of pesticides. In the present study, results for arsenic levels are presented. Analyses revealed that kieselguhr used for filtration of beers, in particular, may be a significant source of arsenic contamination in beer.

Provided by American Chemical Society


View the original article here

Catalysts' outer coordination spheres take their place in the spotlight

Wendy Shaw wrote a comprehensive review article on outer coordination spheres.

(Phys.org) —Once dismissed as shrubbery, experimental and computational research shows the outer coordination sphere greatly influences a catalyst's effectiveness, according to Dr. Wendy Shaw at Pacific Northwest National Laboratory in her invited review article. The outer coordination sphere is the complex structure that wraps around the catalyst's central active site and controls the activity, selectivity and specificity of the catalyst. Shaw's Catalysis Reviews article focuses on bottom-up design research. In this approach, aspects of the outer coordination sphere are added as needed.

"The advantage is that you can add just the features you need to get the effects you want," said Shaw.

In her article, Shaw explores studies of a minimal outer coordination sphere based on amino acids. She goes beyond these simple arrangements to examine structured peptide use. These more complex structures allow scientists to add specific positioning of an amino acid near the active site to change the molecular properties at the metal, controlling the catalyst's behavior. She also examines the newer area of enzyme mimics. She notes several exciting studies are using computers to design enzymes from scratch that catalyze reactions that aren't found in nature.

Looking back at the 61-page review, with 226 references, she notes that many of the catalysts fall into two categories: those that function but have undefined outer coordination spheres and those that do not work but have rigorously defined spheres. Few, such as a PNNL rhodium-based catalyst, perform the task at hand and have defined structures. For her, the takeaway message is the large influence that changes far from the active site can exert over the reactivity of the catalyst, and the power of integrating computational chemistry and experimentation to create functional and structurally characterized catalysts.

More information: Shaw WJ. 2012. The Outer-Coordination Sphere: Incorporating Amino Acids and Peptides as Ligands for Homogeneous Catalysts to Mimic Enzyme Function. Catalysis Reviews 54(4):489-550. DOI: 10.1080/01614940.2012.679453

Provided by Pacific Northwest National Laboratory


View the original article here

Surfaces inspired by geckos can be switched between adhesive and non-adhesive states, study finds

Adhesives inspired by the gecko can be made to switch on and off reversibly and repeatedly. The key design parameters for these materials are identified in a study published in Journal of the Royal Society Interface today.

Geckos use thread-like fibres on their hands and feet to stick to surfaces. Synthetic gecko-inspired adhesives rely on the same fibrillar structures. In both cases nonchemical adhesion is created by concentrating the intermolecular forces between two bodies.

In 2007, researchers from the Leibniz Institute for New Materials in Germany created adhesive materials that could be switched on and off using differences in pressure. Now the same research group have shown precisely how to do this by adjusting the shape of the surface fibres.

Dr Paretkar and his team identified the key parameters that influence adhesion switchability; namely the fibrillar contact shape, radius, aspect ratio, orientation and the applied compressive load. They found that adding flap structures to the ends of the fibrils significantly enhanced how effectively adhesiveness could be switched on and off.

The synthetic adhesive materials are 'switched' on by pressing them against a surface and 'switched' off by increasing their pressure on the surface, which causes loss of adhesion.

The findings mean that new materials can be developed in which adhesiveness can be precisely controlled. This study was conducted using biocompatible material; if the same results can be repeated in biodegradable materials then they could be used during delicate medical procedures in which small objects have to be moved around. These adhesive materials could also be scaled-up and used as fillers in operations such as repairing a damaged ear drum without the use of stitches.

More information: Paretkar, D. et al. Preload responsive adhesion: effects of aspect ratio, tip shape, and alignment, Journal of the Royal Society Interface. dx.doi.org/10.1098/rsif.2013.0171

Journal reference: Journal of the Royal Society Interface

Provided by The Royal Society


View the original article here

Amberlyst-15 can act as a catalyst for the acylation of phenols and alcohols

Amberlyst-15 can act as a catalyst for the acylation of phenols and alcohols in solvent free conditions. Credit: Versita

Owing to the huge array of applications, catalysis has long been regarded as one of the most significant areas of process and synthetic chemistry. In fact, the vast majority of chemical industrial products – be it in the field of pharmaceutical, agricultural or polymer chemistry – involve catalysts at some stage of the manufacturing process.

Catalytic processes are generally conducted in a homogeneous phase using anhydrous organic solvents (e.g. halogenated solvents, toluene) that are toxic and hard to eliminate. Developing environmentally friendly and efficient catalytic systems therefore remains a major challenge, with researchers and industry teaming up to devise solutions that make catalytic reactions and catalysts not only more ecologically sound but also more efficient and cost-effective (i.e. recyclable). Growing demand for sustainable catalysis over the last decade has produced prolific research into Green Chemistry, with scientists investigating many aspects of applied catalysis. The "Twelve Principles of Green Chemistry" address numerous concerns, such as the use of toxic solvents, expensive reagents and catalysts, the number of chemical steps and their reaction conditions, and the atom economy of synthetic protocols.

A diverse set of approaches can make a process green and sustainable. First and foremost is preventing or minimizing the use of organic solvents, either by replacing hazardous solvents with ones that have better ecological, health and safety properties or, better still, by using solvent-free processes. Another important approach is the development of catalysts that can be repeatedly recycled and reused with minimal effort.

That aspect may be of particular importance for the industrial feasibility of a process, as catalysts are all too often expensive and there is considerable pressure to reduce costs via recyclability. Now, researchers in Mumbai have developed an inexpensive and entirely green procedure that enables catalytic acetylation of phenols and alcohols under solvent-free conditions.

In the paper "Amberlyst-15 catalyzed acetylation of phenols and alcohols under solvent free conditions", published recently in Recyclable Catalysis, an open-access journal from Versita, Prof. Manoj Pande and Prof. Shriniwas D. Samant of the University Institute of Chemical Technology in Mumbai, India, offer a novel and highly sustainable method for the acylation of phenols and alcohols. They found that Amberlyst-15 is an active catalyst for the acylation of phenols and alcohols using acetic anhydride as the acylating agent at room temperature under heterogeneous conditions. The scientists confirmed that this catalyst allows mild and highly selective transformations and syntheses in a facile and environmentally friendly manner. Equally worth mentioning is that the catalyst can be recovered by filtration and recycled several times without loss of activity or selectivity.

Pande and Samant established a mild, efficient and simple method for the acetylation of phenols and alcohols using acetic anhydride as the acylating agent in the presence of a catalytic amount of Amberlyst-15. The procedure was extended to a variety of phenols and alcohols, giving the corresponding acetates in good to excellent yields. They also tested the recyclability of Amberlyst-15 in the reaction of p-bromophenol with acetic anhydride: the catalyst performed outstandingly, showing excellent potential for sustainability. It was reusable for four cycles with no decrease in activity, giving an almost quantitative yield of 4-bromophenyl acetate in 20 minutes at room temperature.
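In general terms (a textbook acylation scheme rather than a detail taken from the paper itself), the acid-catalyzed acetylation described here converts a phenol or alcohol and acetic anhydride into the acetate ester plus acetic acid:

\[ \mathrm{ArOH + (CH_3CO)_2O \xrightarrow{\;\text{Amberlyst-15}\;} ArOCOCH_3 + CH_3COOH} \]

For the p-bromophenol test above, Ar is the 4-bromophenyl group, giving the 4-bromophenyl acetate product reported.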

An inexpensive material, Amberlyst-15 possesses unique properties such as environmental compatibility, nontoxicity, reusability, non-corrosiveness and chemical and physical stability – allowing for its versatile synthetic applications. It is also one of the most resistant catalysts and can be used over a prolonged period.

This cost-effective process is uncomplicated, workable and generates minimal waste. Given the sought-after recyclability of the catalyst, the method bodes well for scale-up to industrial purposes. Organic chemists usually perform the acylation of alcohols with acid chlorides in the presence of Et3N or pyridine. In contrast to those methodologies, no additives are needed here, which offers significant economic and environmental benefits.

More information: www.degruyter.com/view/j/recat.2012.1.issue/recat-2012-0002/recat-2012-0002.xml

Provided by Versita


View the original article here

2013 economic outlook for global chemical industry

The 2013 outlook for the global chemical industry—a $3 trillion enterprise that impacts virtually every other sector of the economy—is the topic of the cover story in this week's edition of Chemical & Engineering News. C&EN is the weekly newsmagazine of the American Chemical Society, the world's largest scientific society.

Titled "World Chemical Outlook" and compiled by a team of 10 editors and correspondents, the annual feature forecasts chemical industry growth rates in various regions, including a modest 1.9 percent increase in the United States (compared to 1.5 percent growth in 2012) and a 0.5 percent increase in Europe (an improvement from the 2.0 percent contraction in 2012).

The story describes several bright spots dotting that generally overcast landscape. U.S. chemical manufacturers, for instance, can look forward to another year of low-priced natural gas to fuel their facilities and provide cheap raw materials. Producers of "fine chemicals," highly pure substances produced in relatively small amounts for medications, pesticides and other products, should do better than the industry as a whole. Likewise, makers of scientific instruments for the energy, environmental, forensics and food markets also are upbeat about 2013 sales.

More information: "World Chemical Outlook"—cen.acs.org/articles/91/i2/World-Chemical-Outlook.html

Provided by American Chemical Society


View the original article here

First mobile app for green chemistry fosters sustainable manufacturing of medicines

Mention mobile applications, or mobile apps, and people think of games, email, news, weather, productivity and other software for Apple, Android and other smart phones and tablet computers. But an app with broader impact—the first mobile application to foster wider use of the environmentally friendly and sustainable principles of green chemistry—is the topic of a report in the American Chemical Society's new journal, ACS Sustainable Chemistry & Engineering.

Sean Ekins, Alex M. Clark and Antony Williams point out that the companies that manufacture medicines, electronics components and hundreds of other consumer products have a commitment to work in a sustainable fashion without damaging the environment. That's the heart of "green chemistry," often defined as "the utilization of a set of principles that reduces or eliminates the use or generation of hazardous substances in the design, manufacture and application of chemical products."

Their article describes a guide to doing so for solvents, which are key ingredients in processes for making medicines. Some traditional processes generate 25-100 times more waste than the chemical they are making (e.g., pharmaceuticals). The solvents guide was developed by the ACS Green Chemistry Institute's Pharmaceutical Roundtable, a group of 14 pharmaceutical companies. The Green Solvents mobile app version of the guide for Apple devices covers 60 different solvents and is available online at https://itunes.apple.com/us/app/green-solvents/id446670983?mt=8, and the Lab Solvents app for Android devices is available online at https://play.google.com/store/apps/details?id=com.mmi.android.labsolvents.

More information: "Incorporating Green Chemistry Concepts into Mobile Chemistry Applications and Their Potential Uses", ACS Sustainable Chem. Eng., 2013, 1 (1), pp 8–13. DOI: 10.1021/sc3000509

Abstract
Green Chemistry related information is generally proprietary, and papers on the topic are commonly behind pay walls that limit their accessibility. Several new mobile applications (apps) have been recently released for the Apple iOS platform, which incorporate green chemistry concepts. Because of the large number of people who now own a mobile device across all demographics, this population represents a highly novel way to communicate green chemistry, which has not previously been appreciated. We have made the American Chemical Society Green Chemistry Institute (ACS GCI) Pharmaceutical Roundtable Solvent Selection Guide more accessible and have increased its visibility by creating a free mobile app for the Apple iOS platform called Green Solvents. We have also used this content for molecular similarity calculations using additional solvents to predict potential environmental and health categories, which could help in solvent selection. This approach predicted the correct waste or health class for over 60% of solvents when the Tanimoto similarity was >0.5. Additional mobile apps that incorporate green chemistry content or concepts are also described including Open Drug Discovery Teams and Yield101. Making green chemistry information freely available or at very low cost via such apps is a paradigm shift that could be exploited by content providers and scientists to expose their green chemistry ideas to a larger audience.
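The Tanimoto similarity mentioned in the abstract is simply the ratio of shared to total "on" bits in two structure fingerprints. The sketch below illustrates the calculation on made-up toy fingerprints; it is not code or data from the Green Solvents app itself.

# Tanimoto (Jaccard) similarity between two binary structure fingerprints,
# the comparison referred to in the abstract above. The fingerprints below are
# toy examples for illustration only.

def tanimoto(fp_a, fp_b):
    """Shared 'on' bits divided by total 'on' bits across both fingerprints."""
    a, b = set(fp_a), set(fp_b)
    return len(a & b) / len(a | b)

# Fingerprints represented as sets of substructure indices that are present.
solvent_x = {1, 4, 7, 12, 19}
solvent_y = {1, 4, 9, 12, 21, 30}

print(f"Tanimoto similarity: {tanimoto(solvent_x, solvent_y):.2f}")  # 3 shared / 8 total = 0.38

A value above 0.5 corresponds to the similarity threshold cited in the abstract; the toy pair here falls below it.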

Provided by American Chemical Society


View the original article here

First tests of old patent medicine remedies from a museum collection

What was in Dr. F. G. Johnson's French Female Pills and other scientifically untested elixirs, nostrums and other quack cures that were the only medicines available to sick people during the 18th, 19th and early 20th centuries?

Scientists provided a glimpse today based on an analysis of a museum collection of patent medicines used in turn-of-the-century America. It was part of the 245th National Meeting & Exposition of the American Chemical Society, the world's largest scientific society, which is being held here this week.

Mark Benvenuto, Ph.D, who headed the study, explained that hundreds of untested products were sold in stores, by mail order or in traveling medicine shows during the patent medicine era. The products were called "patent medicines" not because they had been granted a government patent, but from an unrelated term that originated in 17th century England.

"This was an era long before the controlled clinical trials and federal regulations that ensure the safety and effectiveness of the medicines we take today," Benvenuto explained. "Many patent medicines had dangerous ingredients, not just potentially toxic substances like arsenic, mercury and lead, but cocaine, heroin and high concentrations of alcohol."

The samples came from the collection of the Henry Ford Museum, in Dearborn, Mich. The museum houses artifacts celebrating American inventors of various items, including planes, cars, trains, machines, furniture and more. The 50 patent medicines in the analysis were among hundreds in the museum's Health Aids collection. The results of Benvenuto's study are on display at the museum.

Undergraduate students working under Benvenuto's supervision performed the bulk of the research. Andrew Diefenbach, a senior and mechanical engineering major at the university, presented the group's research in a talk here today. He got involved in the project as a freshman in Benvenuto's general chemistry course. "I'm interested to see what other comments people have, and what kind of things they may have thought of that we haven't thought of so far that we can use to further the research," Diefenbach said.

Some of the ingredients in the samples of old patent medicines, including calcium and zinc, actually could have been healthy and are mainstays in modern dietary supplements, said Benvenuto, who is with the University of Detroit Mercy. But others were clearly dangerous. Analysis of Dr. F. G. Johnson's French Female Pills, for instance, revealed iron, calcium and zinc. But the nostrum also contained lead, which is potentially toxic. Others contained mercury, another potentially toxic heavy metal, and arsenic.

Benvenuto explained that the presence of heavy metals may have been due to contamination. On the other hand, there actually was a rationale for including some of them. Arsenic and mercury were mainstays for treatment of syphilis, for instance.

Provided by American Chemical Society


View the original article here

Smoke signals: The intriguing chemistry of a conclave chimney

The eyes of the world are focused on a thin chimney on top of the Sistine Chapel. Underneath, ensconced in the papal conclave, 115 cardinals are due to make their decision as to who will succeed Benedict XVI as Pope. And the answer to the all-important question comes in the form of a simple smoke signal - no tweets or digital communication allowed - but will it be white or black smoke?

So, when the Royal Society of Chemistry was contacted with a question on what goes up the conclave chimney, we turned to our very own holy smoke expert, Reverend Ron Lancaster, former chemistry teacher and founder of Britain's biggest pyrotechnic display company, Kimbolton Fireworks.

As well as explaining some of the chemistry behind smoke production, Revd Lancaster says he's intrigued to know what the Vatican are using to colour the smoke that will herald the new Pope.

"White smokes are easy chemically and often based on zinc chloride from hexachloroethane and zinc oxide. As for making smoke black, we're not sympathetic chemically to making the necessary carbon compounds - the principle of smoke production needs you to burn something, which unfortunately can have nasty environmental side effects.

"The easiest way to create the black colour is to burn a carbon-rich organic material but it disintegrates in the air and tends to turn grey or white quite quickly. In the old days we used anthracine, but that's now thought to be carcinogenic, so they had to stop using that. They then started using naphthalene, which was used in mothballs - it's not damaging to humans but is toxic to fish. Whatever you're burning, someone somewhere doesn't like it!"

Reverend Lancaster spent 25 years as Chaplain and chemistry teacher at Kimbolton School in Cambridgeshire, founding a workshop conducting research into pyrotechnics which led to the creation of his fledgling fireworks company in 1963. He says while white smoke is going to be welcome in Rome this week, it is not always a welcome side-effect for pyrotechnics experts.

"Smoke is often a nuisance to fireworks makers like me as it gets in the way during daylight displays or on a night with no up-draft. And while you can do some incredible things with coloured smoke as a screen, you really want the clearest possible view of the fireworks.

"If I were involved in looking at making smoke at the Vatican, the question I would be asking is what colour it has changed to by the time it gets in air at the top of the chimney - and how many rehearsals they have had.

"Maybe the chimney design is important - I can imagine they must have got hold of some pyrotechnic experts in Italy. But where they did the tests is beyond me!"

Provided by Royal Society of Chemistry


View the original article here