Thursday, April 18, 2013


Functional foods from the sea

Seaweeds are not only tasty, but they are also a source of nutrients that could benefit health and wellbeing. Like terrestrial plants, seaweeds contain significant amounts of fibre that reach the colon undigested. But does seaweed fibre have similarly positive effects on bacteria in the human gut? That is the question the EU-funded HYFFI project is trying to answer.

The project's objective is to find out whether low-molecular-weight polysaccharides (LMWPs) from seaweed fibre have functional effects as prebiotics; these are not to be confused with probiotics, which are live cultures of bacteria. "The most recent definition for a prebiotic is 'a selectively fermented ingredient that allows specific changes, both in the composition or activity in the gastrointestinal microflora that confers benefits,'" says Sarah Hotchkiss, a seaweed specialist, or phycologist, working for CyberColloids in Carrigaline, Ireland, one of the commercial partners on the project.

As part of the study, they performed a laboratory-based trial of 10 LMWPs. These were incubated with cultures derived from human faeces, because prebiotics such as fibre are fermented in the gut as they interact with the microbial community. Using gas chromatography, the team measured the fermentation products. They found that one of their alginate powder compounds, called CC2238, produced a significant increase in total bacterial populations.

Furthermore, a Gelidium seaweed compound, dubbed CC2253, produced a significant increase in bifidobacterial populations. "One fraction in particular was good at stimulating the growth of bifidobacteria," says gut microbiologist Arjan Narbad of the Institute of Food Research, Norwich, UK, who is not associated with the project. "Bifidobacteria are probiotics that are added to many foods and have been shown to impart many beneficial effects on the host, including immune stimulation and antipathogenic activities."

However, in vitro screening results cannot be interpreted directly in terms of human health potential, because many apparent benefits fall away in human trials. Human studies are therefore essential: "This is a 'must' in order to prove bioactivity to humans," says biochemist Lars Ove Dragsted of the department of nutrition, exercise and sports at the University of Copenhagen, Denmark. He tells youris.com that the HYFFI project is a good starting point for identifying potentially bioactive components in foods like seaweeds.

Another part of the project team therefore conducted a human trial with 60 volunteers at the Northern Ireland Centre for Food and Health (NICHE) at the University of Ulster. Unfortunately, Hotchkiss says, the small number of positive changes they observed in gut microflora was not enough to demonstrate prebiotic activity. "In general, evidence is required for the selective growth of 'good' species of bacteria Bifidobacterium, Lactobacillus and Eubacterium rectale at the expense of other less desirable 'bad' bacteria, in particular, species of Bacteroides and Clostridium," Hotchkiss explains.
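The selectivity Hotchkiss describes, growth of "good" genera at the expense of "bad" ones, is often summarized in the prebiotics literature as a "prebiotic index"-style score. The sketch below is a simplified, hypothetical illustration of that idea, not the analysis used by the HYFFI team; the counts, the weighting and the function name are invented for the example.

```python
# Simplified "prebiotic index"-style score (illustrative only).
# Each input value is the fold-change of a bacterial group after
# fermentation. Ratios are normalised by the fold-change in total
# bacteria; "good" genera count positively, "bad" ones negatively.

def selectivity_score(fold_changes, total_fold_change):
    good = ("Bifidobacterium", "Lactobacillus", "Eubacterium rectale")
    bad = ("Bacteroides", "Clostridium")
    score = 0.0
    for genus, change in fold_changes.items():
        ratio = change / total_fold_change
        if genus in good:
            score += ratio
        elif genus in bad:
            score -= ratio
    return score

# Hypothetical data: a substrate that doubles bifidobacteria while
# leaving less desirable groups flat scores positively.
example = {
    "Bifidobacterium": 2.0,
    "Lactobacillus": 1.4,
    "Eubacterium rectale": 1.2,
    "Bacteroides": 1.0,
    "Clostridium": 0.9,
}
print(selectivity_score(example, total_fold_change=1.3))
```

Published prebiotic index formulations differ in their details; the point of the sketch is only that selectivity, not total growth, is what defines a prebiotic effect.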

However, she adds that the failure of this study to demonstrate a prebiotic effect does not mean that seaweed-derived fibres have no potential. She concludes: "There are several projects running worldwide and the academic literature suggests that seaweed shows potential."

Provided by Youris.com



Egyptian wedding certificate key to authenticating controversial Biblical text

A scientist who helped verify authenticity of the fabled Gospel of Judas today revealed how an ancient Egyptian marriage certificate played a pivotal role in confirming the veracity of inks used in the controversial text. The disclosure, which sheds new light on the intensive scientific efforts to validate the gospel, was made here today at the 245th National Meeting & Exposition of the American Chemical Society (ACS).

"If we hadn't found a Louvre study of Egyptian wedding and land contracts, which were from the same time period and had ink similar to that used to record the Gospel of Judas, we would have had a much more difficult time discerning whether the gospel was authentic," said Joseph G. Barabe. A senior research microscopist at McCrone Associates, he led an analytical team of five scientists who worked on the project at McCrone, a consulting laboratory in microscopy and microanalysis in Westmont, Ill. "That study was the key piece of evidence that convinced us that the gospel ink was probably okay."

Barabe's team was part of a multidisciplinary effort organized in 2006 by the National Geographic Society to authenticate the Gospel of Judas, which was discovered in the late 1970s after having been hidden for nearly 1,700 years. The text, written in Egyptian Coptic, is compelling because—unlike other Biblical accounts that portray Judas Iscariot as a reviled traitor—it suggests that Jesus requested that his friend, Judas, betray him to authorities.

Barabe's presentation was part of an ACS symposium on archeological chemistry.

After analyzing a sample, Barabe and his colleagues concluded that the gospel was likely penned with an early form of iron gall ink that also included black carbon soot bound with a gum binder. While this finding suggested that the text may have been written in the third or fourth century A.D., one thing perplexed the researchers: the iron gall ink used in the gospel was unlike anything they had seen before. Typically, iron gall inks—at least those from the Middle Ages—were made from a concoction of iron sulfate and tannic acids, such as those extracted from oak gall nuts. But the iron gall ink used to produce the Gospel of Judas didn't contain any sulfur. And that, Barabe said, was troubling.

"We didn't understand it. It just didn't fit in with anything that we had ever encountered," he said. "It was one of the most anxiety-producing projects I've ever had. I would lie awake at night trying to figure it out. I was frantically searching for answers."

Ultimately, Barabe found a reference to a small French study conducted by scientists at the Louvre who analyzed Egyptian marriage and land records written in Coptic and Greek and dating from the first to third centuries A.D. Much to Barabe's relief, those researchers had determined that a wedding certificate and other documents were written in ink made with copper, but little or no sulfur.

"Finding that study, and realizing its implications, tilted my opinion a little in the direction of it being appropriate for the era," Barabe said. "My memory of that experience remains quite vivid. I had a sudden feeling of peace that things were okay, and that I could submit my data without qualms."

Barabe now suspects that the ink used in the Gospel of Judas was probably transitional, a "missing link" between the ancient world's carbon-based inks and the iron gall inks (made with iron sulfate) that became popular in medieval times.

More information: Abstract

Characterization of the ink on the Gospel of Judas: A collaborative approach

In 2006, the National Geographic Society (NGS) contracted with McCrone Associates to characterize the ink in a purportedly 3rd century document, the Gospel of Judas, in order to determine whether the ink was consistent with materials and manufacturing methods of 3rd century Egypt. McCrone's approach was to assemble a group of scientists with expertise in different aspects of microanalysis. The project required taking the initial ink samples in Geneva, Switzerland, specimen preparation for each of the instruments, and analysis by polarized light microscopy, scanning electron microscopy with high-resolution imaging and energy dispersive X-ray spectrometry (EDS), X-ray diffraction, transmission electron microscopy with EDS, and infrared and Raman spectroscopy. The ink turned out to be an unexpected mix of a traditional carbon black ink in a gum binder with an iron gall component which lacked the expected sulfur. Altogether, our findings are not inconsistent with 3rd century Egyptian ink.

Provided by American Chemical Society



Revealing hidden artwork with airport security full-body-scanner technology

In the latest achievement in efforts to see what may lie underneath the surface of great works of art, scientists today described the first use of an imaging technology like that used in airport whole-body security scanners to detect the face of an ancient Roman man hidden below the surface of a wall painting in the Louvre Museum in Paris.

They described unveiling the image, which scientists and art historians say may be thousands of years old, during the 245th National Meeting & Exposition of the American Chemical Society.

J. Bianca Jackson, Ph.D., who reported on the project, explained that it involved a fresco, which is a mural or painting done on a wall after application of fresh plaster. In a fresco, the artist's paint seeps into the wet plaster and sets as the plaster dries. The painting becomes part of the wall. The earliest known frescoes date to about 1500 B.C. and were found on the island of Crete in Greece.

"No previous imaging technique, including almost half a dozen commonly used to detect hidden images below paintings, forged signatures of artists and other information not visible on the surface has revealed a lost image in this fresco," Jackson said. "This opens to door to wider use of the technology in the world of art, and we also used the method to study a Russian religious icon and the walls of a mud hut in one of humanity's first settlements in what was ancient Turkey."

The technology is a new addition to the palette that art conservators and scientists use to see below the surface and detect changes, including fake signatures and other alterations in a painting. Termed terahertz spectroscopy, it uses beams of electromagnetic radiation that lie between microwaves, like those used in kitchen ovens, and the infrared rays used in TV remote controls. This radiation is relatively weak, does not damage paintings and does not involve exposure to harmful radiation.
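For a rough sense of where the terahertz band sits, here is an illustrative back-of-envelope calculation (not from the article) converting a representative 1 THz frequency into a wavelength and a photon energy:

```python
# Back-of-envelope scale of terahertz radiation.
# Standard physical constants (approximate values).
c = 2.998e8    # speed of light, m/s
h = 6.626e-34  # Planck constant, J*s
eV = 1.602e-19 # joules per electronvolt

f = 1.0e12  # 1 THz, roughly representative of the band

wavelength_mm = c / f * 1e3           # metres -> millimetres
photon_energy_meV = h * f / eV * 1e3  # eV -> millielectronvolts

print(f"wavelength: {wavelength_mm:.2f} mm")          # ~0.30 mm
print(f"photon energy: {photon_energy_meV:.1f} meV")  # ~4.1 meV
```

Photon energies of a few millielectronvolts are orders of magnitude below the several-eV energies needed to break chemical bonds or ionize atoms, which is why the radiation is considered harmless to both paintings and people.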

"Terahertz technology has been in use for some time, especially in quality control in the pharmaceutical industry to assure the integrity of pills and capsules, in biomedical imaging and even in homeland security with those whole-body scanners that see beneath clothing at airport security check points," said Jackson, who is now with the University of Rochester. "But its use in examining artifacts and artworks is relatively new."

Artists, including some of the great masters, sometimes re-used canvases, wiping out the initial image or covering old paintings with new works. They often did this to avoid the expense of buying a new canvas or to enhance the colors and shapes of a prior composition. Frescoes likewise got a refresh, especially when the originals faded, owners tired of the image on the wall or property changed hands.

The scientists turned to terahertz technology when suspicions surfaced that a hidden image might lie beneath the brushstrokes of a precious 19th century fresco, Trois hommes armés de lances, in the Louvre's Campana collection. Giampietro Campana was an Italian art collector in the 1800s whose treasures are now on display in museums around the world. When Campana acquired a work of art, he sometimes restored damaged parts or reworked the original. Art historians believe that Campana painted Trois hommes armés de lances after the fresco was removed from its original wall in Italy and entered his collection.

Jackson said that Campana's painting in itself is valuable, and the terahertz revelations may have added value by showing that an authentic Roman fresco lies under it.

To search for a hidden image, Jackson and colleagues, including Gerard Mourou, Ph.D., of École Polytechnique; Michel Menu, Ph.D., of the Centre de Recherche et de Restauration des Musées de France; and Vincent Detalle, of the Laboratoire de Recherche des Monuments Historiques, probed it with terahertz technology. The process is slow, requiring a few hours to analyze a section the size of a sheet of paper.

"We were amazed, and we were delighted," said Jackson. "We could not believe our eyes as the image materialized on the screen. Underneath the top painting of the folds of a man's tunic, we saw an eye, a nose and then a mouth appear. We were seeing what likely was part of an ancient Roman fresco, thousands of years old."

Who is the man in the fresco? An imperial Roman senator? A patrician? A plebian? A great orator? A ruler who changed the course of history? Or just a wealthy, egotistical landowner who wanted to admire his image on the wall?

Jackson is leaving those questions to art historians. The team already has moved ahead and used terahertz technology to study a Russian religious icon and the walls of a mud hut in one of the earliest known human settlements in what now is the country of Turkey.

More information: Abstract

Terahertz pulse imaging and spectroscopy is emerging as a tool of high potential for the nondestructive investigation of historical artworks, architecture and archaeological objects for the purpose of research and conservation. We studied a section of the fresco Trois hommes armés de lances from the Louvre's Campana collection using time-domain terahertz imaging. The top painting is 19th C, while the support is composed of wall sections recovered from Roman ruins. No previous technique, including X-ray radiography, XRF, infrared photography, infrared reflectometry and UV fluorescence, has produced an image of a lost fresco beneath the painting. A composite of the photograph of the section and the composite terahertz image reveals a face hidden beneath the 1st man's drape. Other examples of this application will also be presented, including a Russian icon, a wall painting from the Riga Dom cathedral and a Neolithic site at Catalhoyuk, Turkey.

Provided by American Chemical Society




[Technical Comment] Comment on “Apatite 4He/3He and (U-Th)/He Evidence for an Ancient Grand Canyon”

Science 12 April 2013:
Vol. 340 no. 6129 p. 143
DOI: 10.1126/science.1233982

Karl E. Karlstrom1,*, John Lee2, Shari Kelley3, Ryan Crow1, Richard A. Young4, Ivo Lucchitta5, L. Sue Beard5, Rebecca Dorsey6, Jason W. Ricketts1, William R. Dickinson7, Laura Crossey1

1University of New Mexico, Albuquerque, NM 87131, USA.
2U.S. Geological Survey, Denver, CO 80225, USA.
3New Mexico Bureau of Geology and Mineral Resources, Socorro, NM 87801, USA.
4State University of New York, Geneseo, NY 14454, USA.
5U.S. Geological Survey, Flagstaff, AZ 86001, USA.
6University of Oregon, Eugene, OR 97403, USA.
7University of Arizona, Tucson, AZ 85721, USA.

*Corresponding author. E-mail: kek1{at}unm.edu

Flowers and Farley (Reports, 21 December 2012, p. 1616; published online 29 November 2012) propose that the Grand Canyon is 70 million years old. Starkly contrasting models for the age of the Grand Canyon—70 versus 6 million years—can be reconciled by a shallow paleocanyon that was carved in the eastern Grand Canyon 25 to 15 million years ago (Ma), negating the proposed 70 Ma and 55 Ma paleocanyons. Cooling models and geologic data are most consistent with a 5 to 6 Ma age for western Grand Canyon and Marble Canyon.

The “old” Grand Canyon hypothesis, reinvigorated by modeling of recent 4He/3He data in Flowers and Farley (1), posits that an early phase of canyon carving was accomplished by a northeast-flowing river 80 to 70 million years ago (Ma) followed by establishment of a west-flowing river by 55 Ma, such that the western Grand Canyon was “excavated to within a few hundred meters of modern depths by ~70 million years ago” (1, 2). The dramatically different “young” Grand Canyon hypothesis states that a majority of the canyon was carved by the west-flowing Colorado River in the past 5 to 6 million years (3, 4). Geologic data supporting the “young” canyon model include (i) 5.3-million-year age of earliest Colorado Plateau–derived sediments in the Salton Trough (5); (ii) 4.4-million-year age of oldest known Colorado River gravels (6); (iii) lack of pre-6 Ma Colorado River sediment immediately downstream of the mouth of Grand Canyon (7); (iv) geometry of north-flowing 70 to 18 Ma paleocanyons in western Grand Canyon (8); (v) southward-transported 60 to 50 Ma Hindu fanglomerate that was deposited across the modern course of the western Grand Canyon (9); (vi) semisteady incision rates over the past 4 million years sufficient to carve most of the Grand Canyon in 6 million years (10); and (vii) lack of Colorado Plateau detritus in early Tertiary deposits of the Los Angeles basin (11).

We believe that the thermochronologic data and modeling of Flowers and Farley also are consistent with a "young" Grand Canyon when reinterpreted to correct for tenuous assumptions. Their thermal models were generated from 4He/3He diffusion profiles and apatite 4He/3He and (U-Th)/He (AHe) data for the eastern and western Grand Canyon (Fig. 1A), but we question their geological interpretations of these models for several reasons. Incomplete understanding of He diffusion in apatite poses considerable difficulties in assigning constrained cooling paths, requiring critical examination of modeling assumptions. One assumption used (1) was that apatite grains from each four-sample "ensemble" in the eastern and western Grand Canyon shared common cooling histories and can be modeled together. However, this is suspect because of structural complexities in both regions (12). Instead, existing AHe and apatite fission-track (AFT) data (12–15) show variability in thermochronologic ages and therefore nullify the extrapolation of results from a few samples to the entire Grand Canyon. Another questionable assumption (1) (see below) is that western Grand Canyon samples were heated enough to completely anneal apatite at 80 to 120 Ma.

Fig. 1. Thermochronology data from the Grand Canyon region. (A) Map of the Grand Canyon region showing apatite helium samples discussed in the text (1, 13–15). (B) Carving of an eastern paleocanyon from 25 to 15 Ma is indicated by different temperatures of rim- and river-level samples until ~25 Ma. (C) Western Grand Canyon thermal models are in conflict, but joint inversion of AFT and AHe data [purple curves, from (14)] suggests that the western Grand Canyon was carved in the past 6 million years. (D) The top left diffusion profile (1) may fit the "young canyon" model if modeled without the highest temperature step. (E) The full data set of AHe ages (top) resembles the predicted "young" canyon distribution of (1).

Joint inversion of independent AHe and AFT data sets is especially powerful and provides well-constrained cooling histories for river samples in the eastern Grand Canyon (14); these show that basement rocks cooled slowly from 80° to 70°C between 65 and 25 Ma, then cooled rapidly from 25 to 15 Ma. The geometry of their published rim-level samples (shown in our Fig. 1A) is not optimal for resolving paleocanyons, but all available data (12–15) suggest that rim- and river-level samples, now separated vertically by 1 to 1.5 km, resided at 45° to 55° and 80°C, respectively, from 60 to 25 Ma. There is no evidence for a paleocanyon until after 25 Ma, when rim- and river-level cooling paths converge (Fig. 1B). Similar data show that the Marble Canyon section of the eastern Grand Canyon was buried by ~2 km of rock, and hence no canyons existed there until after 10 Ma (14). The combined data (Fig. 1B) refute the hypothesis for carving of the eastern Grand Canyon by 55 Ma (1, 2).
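The burial depths quoted alongside these temperatures follow from an assumed geothermal gradient. A quick consistency check, using typical illustrative values rather than the authors' actual thermal model:

```python
# Back-of-envelope temperature -> burial depth conversion, assuming a
# typical continental geothermal gradient. Both parameter values are
# assumptions chosen for illustration.

surface_temp_c = 20.0     # assumed mean surface temperature, deg C
gradient_c_per_km = 25.0  # assumed geothermal gradient, deg C per km

def depth_km(temp_c):
    """Depth at which rock sits at temp_c under the assumed gradient."""
    return (temp_c - surface_temp_c) / gradient_c_per_km

# River-level samples at ~80 deg C vs rim-level samples at ~50 deg C:
print(depth_km(80.0))                   # ~2.4 km burial at river level
print(depth_km(50.0))                   # ~1.2 km burial at rim level
print(depth_km(80.0) - depth_km(50.0))  # ~1.2 km vertical separation
```

With these typical values, the ~80°C river-level and 45° to 55°C rim-level temperatures imply roughly 1 km of vertical separation, consistent with the 1 to 1.5 km cited in the text.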

The western Grand Canyon cooled earlier than the eastern Grand Canyon because of its proximity to the ancient Sevier/Laramide highlands. This region was eroded by northeast-flowing Laramide paleocanyons (9) and is cut by numerous faults with a history of recurring movement (12). A model from one 4He/3He sample (CP06-69) (Fig. 1C) suggests that rocks cooled to <30°C (~200 m depth) and have resided at these cool temperatures since 70 Ma (1). However, this interpretation conflicts with the joint inversion of AFT and AHe data from nearby samples (14), which suggests that these rocks cooled from ~60° to 40°C between 60 and 25 Ma (01GC-86) (Fig. 1C), compatible with ~1-km burial depth (the present depth below the rim). These conflicting results (1, 14) have several plausible explanations: (i) Sample "ensembles" from (1) span several known faults and therefore may not have shared a common cooling history. (ii) Western Grand Canyon samples accumulated considerable radiation damage during residence in the AHe partial retention zone for >600 million years and may not have been heated enough during Cretaceous time to fully anneal grains, such that western Grand Canyon models should be rerun starting ~600 Ma to account for any incomplete annealing and inherited helium. (iii) When the combined AFT and AHe data sets (1, 12–14) are merged, the results of (1) are more closely reproduced by the "young" canyon than the "old" canyon model (Fig. 1E). The conflicting models (Fig. 1C) could both be correct if (iv) sample CP06-69 (1) was situated beneath a north-flowing paleocanyon near Separation Canyon, whereas sample 01GC-86 (14) was from an interfluve; or (v) CP06-69 was cooled on the upthrown side of an unrecognized Laramide reverse fault relative to 01GC-86.
Although our knowledge of the north-flowing Laramide paleocanyon system is incomplete, existing thermochronologic data argue against a 70-Ma western Grand Canyon that followed the same path with nearly the same depth as the modern canyon.

A simple dichotomy of “old” canyon versus “young” canyon hypotheses is overly simplistic because the Grand Canyon includes different sections with different geologic histories. Older paleocanyons likely were reused or re-excavated once the river found its modern path and began eroding rapidly. Despite these complexities, existing data do not support the model for a 80- to 70-Ma northeast-flowing California river, nor a 55-Ma southwest-flowing Arizona river, that collectively carved the Grand Canyon to within a few hundred meters of its modern depth by Early Tertiary time. Instead, an overwhelming body of published geologic and thermochronologic evidence shows that a majority of the Grand Canyon—the canyon that we see from the rim today—has been carved in the past 5 to 6 million years by the Colorado River. Drainage integration at 5 to 6 Ma was likely facilitated by older paleocanyon segments, whose geometry is now coming into focus.

Received for publication 12 December 2012. Accepted for publication 25 February 2013.

Acknowledgments: Funding for the University of New Mexico coauthors (K.E.K., R.C., L.C., and J.W.R.) was from NSF EAR-0711546 and EAR-1119629.



Colleges say federal cuts could cause brain drain

Massachusetts Institute of Technology doctoral candidate in mechanical engineering Nikolai Begg poses in an MIT workshop in Cambridge, Mass., Friday, March 15, 2013. Begg is concerned about whether government funding losses could force undergraduates who are contemplating higher degrees to enter the workforce for financial reasons, meaning a loss of American ingenuity in the end. (AP Photo/Michael Dwyer)

At the Massachusetts Institute of Technology, faculty fret about the future of the school's Plasma Science and Fusion Center. Thirty miles (fifty kilometers) away, administrators at the state university campus in Lowell worry that research aimed at designing better body armor for soldiers could suffer.

The concerns have emerged because of automatic federal budget cuts that could reduce government funding for research done at educational institutions, spending that totaled about $33.3 billion in 2010, Department of Education statistics show. And the possible cuts raise another concern at those schools and others across the country: brain drain.

President Barack Obama and lawmakers failed to agree on a plan to reduce the nation's deficit that would have avoided the automatic spending cuts, the so-called sequester, which began to roll out this month. Included in the cuts are 5 percent of the money for programs that fund education research, a Department of Education spokesman said Friday. But because negotiations over how to balance the budget are ongoing, the timing and size of many cuts to be made by government agencies remain unclear.

"One of the questions we don't know is if agencies will elect to cut funding by not making new grants or cutting back on old grants," said Terry Hartle, a senior vice president at the American Council on Education.

In the meantime, professors are left wondering how many young scientists will become discouraged by domestic funding challenges and either leave for careers abroad or change fields.

At MIT, doctoral candidate Nikolai Begg said he's lucky the research he's working on now has corporate sponsorship.

"It's kind of scary to be hearing that a lot of that support is going away," he said of government cuts. "How do we keep America technologically relevant has been a question on everybody's mind. And the sequester only makes that harder."

The 25-year-old mechanical engineer recently won a $30,000 Lemelson-MIT award for inventions that aim to make surgical procedures less invasive. But Begg is concerned about whether government funding losses could force undergraduates who are contemplating higher degrees to enter the workforce for financial reasons, meaning a loss of American ingenuity in the end.

"I wonder if this whole issue is going to prevent people from going into more advanced research where they can really innovate ... We don't really know what it's going to do yet. There's not enough information out. You know the storm is coming."

Some university officials say a loss of federal funding from the cuts aggravates a current trend: Scientists already have less time to spend in their labs because they have to spend more time seeking grants.

"What the sequester has done is make more dramatic this trend," said Scott Zeger, Johns Hopkins University's vice provost for research. "... It means that people aren't spending quiet time thinking about how nature works."

Breast cancer researcher Dr. Debu Tripathy, a professor at the Keck School of Medicine of the University of Southern California, compared a scientist who doesn't spend enough time in a lab because of grant writing to a politician who is too busy campaigning for re-election to serve constituents.

He worries the country's commitment to a war on cancer, going back to the signing of the National Cancer Act in 1971, could falter. Tripathy said a lot of good science isn't getting funded and bright minds aren't coming into the field.

"If we don't engage the brightest minds to continue the trajectory we're on, then that will affect a whole generation," the doctor said.

At Washington University School of Medicine in St. Louis, dean Dr. Larry Shapiro said the automatic cuts are causing anxiety among young researchers who are wondering what career options they'll have if the current economic climate becomes "the new normal."

"This is all that's being discussed in the hallways and over coffee," he said.

He said two genetics researchers recently decided to leave the university and move their labs to the United Kingdom amid the climate of funding losses.

"Scientists are passionate about their work, and they'll go where they have the best opportunity to accomplish it," Shapiro said.

Washington University School of Medicine could be looking at $30 million to $40 million in budget cuts because of cutbacks at the National Institutes of Health, and possibly having to cut 300 scientific personnel jobs, according to Shapiro. The school is part of a consortium working on new therapies for Alzheimer's disease, and he said that work would be slowed considerably because the NIH is a big funding source.

At the University of Massachusetts-Amherst, school officials are projecting that they could lose about $8 million in research money, which could affect projects including biofuels research.

But UMass-Amherst chemistry professor Paul Lahti, who is leading research on better ways to harvest solar energy, said it's the job of senior faculty members to keep students encouraged and excited about the future of discovery despite negative economic factors.

"You carry on and do the best work you can," Lahti tells them.

"The science is going to get done," the professor said. "The younger people in the end are the ones that are our most important project."

Copyright 2013 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.



WRAPUP 2-IMF trims global growth forecast, sees bumpy recovery

* IMF sees world growth of 3.3 pct in 2013, down two-tenths

* IMF warns against fatigue in battling European debt crisis

* Japan economy to expand with new monetary steps

* Fund estimates first Fed rate increase in 2016

* Emerging economies picking up steam again

By Lesley Wroughton

WASHINGTON, April 16 (Reuters) - The International Monetary Fund on Tuesday trimmed projections for global economic growth for this year and next to take into account sharp government spending cuts in the United States and the latest struggles of recession-stricken Europe.

While it said economic prospects had improved in recent months with a fading of financial risks, it warned Europe against relaxing efforts to combat its debt crisis given the messy bailout in Cyprus and a political stalemate in Italy.

The IMF raised its forecast for Japan, welcoming the Bank of Japan's aggressive new monetary stimulus, which it said would boost growth and help vanquish deflation.

"While some tail risks have decreased it is not time for policymakers to relax," IMF chief economist Olivier Blanchard told a news conference to discuss the World Economic Outlook.

The report was released as global financial leaders gathered for the semiannual meetings of the IMF and World Bank later this week.

The IMF cut its 2013 forecast for global growth to 3.3 percent, down from its January projection of 3.5 percent. It also trimmed its 2014 forecast to 4.0 percent from 4.1 percent.

A more subdued outlook for the United States and for the euro zone led it to lower its growth forecast for advanced economies to 1.2 percent for 2013 while it kept its 2014 forecast at 2.2 percent.

While it lowered its projections for growth in emerging economies to 5.3 percent for this year, it also said growth was already accelerating and would hit 5.7 percent in 2014. Growth has returned to a healthy pace in China and activity is expected to recover in Brazil next year, the IMF said.

Strong domestic demand in sub-Saharan Africa should help boost growth in both resource-rich and poorer economies in that region, the Fund added. Meanwhile, growth in the Middle East and North Africa is likely to dip this year as oil production slows in some oil-exporting nations and "Arab Spring" countries struggle with political transitions.

"Notwithstanding old dangers and new turbulence, the near-term risk picture has improved as recent policy actions in Europe and the United States have addressed some of the gravest short-term risks," the Fund said.

BOJ ON TRACK BUT NEEDS HELP

Blanchard said the dramatic overhaul of monetary policy announced by the Bank of Japan was a necessary step and he hoped it would succeed.

The IMF said inflation in Japan would likely rise above zero in 2013 and temporarily jump in 2014 and 2015 in response to an increase in consumption taxes.

The Bank of Japan unleashed an intense burst of monetary stimulus earlier this month, pledging to inject about $1.4 trillion into the economy in less than two years, a major shift from its previous incremental steps.

Tokyo came under fire before a meeting of officials from the Group of 20 leading economies in February for comments that suggested it was targeting specific levels for the yen with its easing of monetary and fiscal policy. The yen last week hit a four-year low against the dollar.

But the IMF said it found "no large deviations of the major currencies from medium-term fundamentals" and dismissed talk of a "currency war" as overblown.

"We think it is a logical consequence of appropriate monetary policy," Blanchard said when asked about the yen's sharp decline.

The Fund said the U.S. dollar and euro "appear moderately overvalued" and the Chinese renminbi "moderately undervalued." Evidence on the value of the yen "is mixed," it added.

FIRST FED RATE INCREASE IN 2016

The IMF said Europe and the United States had dodged bullets by enacting policies that laid to rest the notion of a euro zone breakup and the possibility the world's richest economy would fall off a "fiscal cliff" of tax increases and budget cuts.

However, it suggested an easier monetary policy might be warranted in the euro zone.

"Given moderating inflation pressure, monetary policy should remain very accommodative. Room is still available for further conventional easing, as inflation is projected to fall below the European Central Bank's target in the medium term," it said.

The IMF forecast economic contractions in France, Spain and Italy this year. IMF economist Jorg Decressin said Italy's economic policy was on the right track and prospects would brighten next year with less need for government spending cuts. He also said fiscal policy in France is "appropriate" even if the country misses the goal to trim the deficit below an EU ceiling of 3 percent of GDP in 2013.

The Fund also made clear that, while a worst-case outcome had been avoided, fiscal policy in Washington had tightened more than it had expected - a key reason for its forecast downgrade.

The IMF said across-the-board spending cuts known as the "sequester" would shave about 0.3 percentage points from gross domestic product this year. If the sequester continued into the next fiscal year, it could trim another 0.2 percentage points from GDP growth, the Fund added.

Blanchard said that without fiscal consolidation, U.S. economic growth would probably be 1.5 percent to 2 percent higher this year.

As for U.S. monetary policy, the IMF said it expects the Federal Reserve to hold interest rates near zero into early 2016, although it cautioned that the Fed may need to tighten policy earlier "should upside risks to growth materialize."

The Fed last month maintained a controversial program of buying $85 billion of bonds a month, while pledging to keep interest rates near zero at least until unemployment falls to 6.5 percent, so long as inflation stays under 2.5 percent.

The Fund said developing a comprehensive medium-term deficit reduction framework that reformed so-called entitlement programs and raised additional revenues should be the top priority for the United States.

"Such a comprehensive plan should place fiscal consolidation on a gradual path in the short term, in light of the fragile recovery and limited room for monetary policy," the IMF added.



Biodiversity crisis: The impacts of socio-economic pressures on natural floras and faunas

This shows an endangered species: Trifolium saxatile - clover. Copyright: Stefan Dullinger

A new study on extinction risk has shown that the proportions of plant and animal species classified as threatened on national Red Lists are more closely related to socioeconomic pressure levels from the beginning than from the end of the 20th century. Stefan Dullinger of the University of Vienna and Franz Essl of the Austrian Environment Agency, together with an international group of researchers, report this finding in the current issue of PNAS.

It is well understood that the survival of a substantial and increasing number of species is put at risk by human activity via e.g. habitat destruction, environmental pollution or introduction of alien species. Accordingly, the most recent global IUCN Red List classifies 31% of the 65,518 plant and animal species assessed as endangered. However, the temporal scale of cause-effect relationships is little explored. If extended time lags between human pressure and population decline are common, then the full impact of current high levels of anthropogenic pressures on biodiversity will only be realized decades into the future.


This shows Spermophilus citellus - European ground squirrel. Copyright: Wolfgang Rabitsch

Historical legacy of species' population losses

Taking an historical approach, the new study provides circumstantial evidence that such time-lags are indeed substantial. The researchers demonstrate that proportions of vascular plants, bryophytes, mammals, reptiles, dragonflies and grasshoppers facing medium to high extinction risks are more closely matched to country-specific indicators of socio-economic pressures (i.e. human population density, per capita GDP, land use intensity) from the early or mid-20th century than from the late 20th century. Accordingly, their results suggest a considerable historical legacy of species' population losses. In a related analysis they also show that current spending on environmental conservation has only a weak mitigating effect. This finding implies that current conservation actions are effective but inadequate in scale to halt species losses.


This is Onosma helvetica ssp. austriaca. Copyright: Franz Essl

"The broad taxonomic and geographic coverage indicates that a so-called 'extinction debt' is a widespread phenomenon", says Stefan Dullinger from the University of Vienna. "This inertia is worrying as it implies that albeit numbers of species classified as threatened on Red Lists are increasing continuously and worldwide, these assessments might still underestimate true extinction risks", explains Franz Essl from the Austrian Environment Agency.

Increase in global conservation effort is urgently needed

The scientists therefore write that "mitigating extinction risks might be an even greater challenge if temporal delays mean many threatened species might already be destined towards extinction". A substantial increase in global conservation effort is thus urgently needed to conserve species diversity for future generations, warns Dullinger.

More information: Stefan Dullinger, Franz Essl, Wolfgang Rabitsch, Karl-Heinz Erb, Simone Gingrich, Helmut Haberl, Karl Hülber, Vojtěch Jarošík, Fridolin Krausmann, Ingolf Kühn, Jan Pergl, Petr Pyšek & Philip E. Hulme 2013: Europe's other debt crisis caused by the long legacy of future extinctions. Proceedings of the National Academy of Sciences (PNAS), April 15, 2013. DOI: 10.1073/pnas.1216303110

Journal reference: Proceedings of the National Academy of Sciences

Provided by University of Vienna



Predicting Collapse

Downfall. This graph shows the 1992 North Atlantic cod fishery crash.

Credit: Millennium Ecosystem Assessment of the United Nations

" href="http://news.sciencemag.org/sciencenow/assets/2013/04/10/sn-systemcollapse.jpg">sn-systemcollapse.jpg Downfall. This graph shows the 1992 North Atlantic cod fishery crash. Credit: Millennium Ecosystem Assessment of the United Nations

In 1990, North Atlantic fishers hauled in more than 200,000 metric tons of cod; in 1992 they caught almost none. The collapse cost thousands of Canadian fishers and plant workers their jobs, and the northern cod fishery has never recovered. Now, physicists studying laboratory yeast have found a new way to tell when such a collapse is imminent. The researchers hope their warning signal can help fishery and wildlife managers act in time to save stressed populations.

The team's work is "a really nice paper" that "could potentially lead to some new insights," says Stephen Carpenter, an ecologist at the University of Wisconsin-Madison, who has studied similar early warning signals in lakes.

The key to preventing a population collapse is spotting early signs of trouble. One recognized warning signal is that unhealthy systems often take longer than healthy ones to recover from a disturbance. Scientists call this "critical slowing down." For example, Carpenter and colleagues found that algae levels were slow to return to normal in a lake to which they had added largemouth bass, a predatory fish. However, measuring this slowing in nature can require years' worth of data—a luxury that many fishery and wildlife managers don't have.

Enter Saccharomyces cerevisiae, commonly known as brewer's yeast. In June 2012, physicist Jeff Gore of the Massachusetts Institute of Technology in Cambridge and colleagues reported that they had induced critical slowing down in laboratory populations of these single-celled fungi. Brewer's yeast cells break down inedible sugars in their environment into edible ones, meaning that individuals get a boost from the work of their neighbors—especially at high densities. Thus, the scientists were able to stress their populations by diluting them. The researchers found that at lower densities, the populations took longer to return to previous levels after being shocked with a dose of salt—which can harm or kill yeast—in their growth medium. But an isolated lab colony is a highly artificial system. In real ecosystems, creatures can move from one part of their environment to another.

So, in their new study, Gore and colleagues connected yeast populations by migration. The researchers grew groups of yeast colonies in rows of small, circular wells on plastic trays—imagine the bottom half of a miniature egg carton that has eight rows instead of two. Every morning, the scientists transferred a quarter of each population to the wells on either side of it, simulating a natural dispersal process like fish swimming from one region of a lake to another. The team then diluted the cells in one of the wells in each group to an extremely low density, creating what they call a "bad region." The researchers measured each population's size over the next week and found that colonies to either side of the stressed yeast in the bad region also declined. But populations two or more wells away remained healthy.

Then the scientists delivered the knockout blow. They began diluting yeast cells in the rest of the wells, bringing the entire group closer to collapse. They found that the distance between the bad region and the nearest well with a healthy population increased from two wells to three or more. Gore and his colleagues call this distance the "recovery length" and believe it could be observed in real-world environments with habitats of different levels of fitness. For instance, managers could monitor fish numbers in a marine reserve next to a fishery and curtail fishing if the distance from the fishery's edge to the nearest region of healthy fish populations increases. Recovery length is a "new category of indicator that has not been proposed in the field," says Gore, whose team published its results online today in Nature.

"The really cool thing about the insight is that it could be applied in field conditions," Carpenter says. He envisions ecologists studying satellite images of rangelands or other ecosystems to look for increases in recovery length. Carpenter also notes that while "real landscapes are far more complex than a one-dimensional gradient of yeast cultures," Gore's lab experiment adds something that whole-ecosystem studies like his often can't: easy replicability and control. "The yeast experiments give us one more angle to think about the problem," Carpenter writes in an e-mail.

For Gore, the next step is reaching out to people who could benefit from his results. He has recently started collaborations with scientists who work on economically and ecologically important systems like fisheries and honey bee colonies. "Nobody wants to necessarily save my microbial populations from collapsing," he admits. "We really want to save populations in the wild."



Patients with surgical complications provide greater hospital profit-margins



Dying supergiant stars implicated in hours-long gamma-ray bursts



FOREX-Yen under renewed pressure as gold jitters ease

* Yen down broadly as demand fades

* Steadier gold, after historic plunge, soothes jitters

By Masayuki Kitano and Ian Chua

SINGAPORE/SYDNEY, April 17 (Reuters) - The yen fell broadly on Wednesday, succumbing to renewed pressure after gold prices steadied somewhat from an eye-watering plunge earlier in the week.

The dollar was changing hands at about 98.14 yen, up 0.6 percent from late U.S. trade on Tuesday. It was still down about 1.8 percent from a four-year high of 99.95 yen set last week.

The euro climbed 0.6 percent to 129.29 yen, although it still remained some way off a three-year peak of 131.10 yen hit last week.

A historic plunge in gold prices on Monday, coupled with concerns about China's economic growth, had sapped risk sentiment and given a lift to the yen earlier this week, reversing a tide of selling sparked by the Bank of Japan's aggressive stimulus programme.

"We still believe that the recent volatility in the commodity prices was mainly driven by long position liquidation, while the underlying backdrop remains risk-positive due to expanding global monetary easing," said Vassili Serebriakov, strategist at BNP Paribas.

"Overall, we expect the focus to gradually shift back to JPY which remains the key driver of FX markets. We see renewed USD/JPY gains driven by Japanese investor outflows, targeting USD/JPY at 105 by year-end."

A focal point is whether Japanese investors will eventually look overseas for higher returns as the Bank of Japan injects about $1.4 trillion into the economy in two years as part of a dramatic plan to jump start growth.

"We expect significant outflows of capital from Japan and increased use of the yen as a funding currency," said Mitul Kotecha, Hong Kong-based head of foreign exchange strategy for Credit Agricole.

"We're forecasting 104 for dollar/yen by the end of the year," he said.

G20 MEETING

In the near-term, the market will be focused on the Group of 20 meeting beginning on Thursday in Washington, where finance ministers and central bankers from the world's leading economies will discuss the economic and financial market outlook, including the Cyprus crisis and asset price reactions.

It seems unlikely that Japan will face any significant criticism over the Bank of Japan's aggressive monetary easing at the G20 meeting, said Credit Agricole's Kotecha.

"Although there may be some warnings about not focusing on exchange rate levels, etc. I don't think there is going to be anything categoric to put pressure on Japan to change its monetary policy," Kotecha said.

A senior Canadian financial official said on Tuesday that Canada was supportive of Japan's effort to kick-start its economy and that the G20 believed policy should target domestic economies and not exchange rates.

Separately, a U.S. official said on Tuesday that ways to boost global demand to help the faltering recovery will be a key focus for the United States at the G20 meeting.

Asked about competitive devaluations and the impact of Japan's aggressive monetary policy on the yen, the U.S. official said the United States will be watching closely to see how effective the policies are at boosting Japanese demand.

The euro held steady against the dollar at $1.3174, after having hit a seven-week high of $1.3202 on Tuesday, partly helped by its bounce versus the yen.

The single currency had added to its gains on Tuesday after breaching resistance at its 100-day moving average at roughly $1.3155, a level which could now act as support for the euro.



Crucial step in human DNA replication observed for the first time

An elusive step in the process of human DNA replication has been observed, for the first time, by scientists at Penn State University in the lab of Stephen J. Benkovic. The step, which is crucial for DNA replication in humans and other forms of life, previously had remained mysterious and had not been well studied in human DNA. For illustrative purposes, this image represents a crucial molecular player in the process as a hand loading the sliding clamp ring onto DNA. Credit: Benkovic lab, Penn State University

(Phys.org) —For the first time, an elusive step in the process of human DNA replication has been demystified by scientists at Penn State University. According to senior author Stephen J. Benkovic, an Evan Pugh Professor of Chemistry and Holder of the Eberly Family Chair in Chemistry at Penn State, the scientists "discovered how a key step in human DNA replication is performed." The results of the research will be published in the journal eLife on 2 April 2013.

Part of the DNA replication process—in humans and in other life forms—involves loading of molecular structures called sliding clamps onto DNA. This crucial step in DNA replication had remained somewhat mysterious and had not been well studied in human DNA replication. Mark Hedglin, a post-doctoral researcher in Penn State's Department of Chemistry and a member of Benkovic's team, explained that the sliding clamp is a ring-shaped protein that acts to encircle the DNA strand, latching around it like a watch band. The sliding clamp then serves to anchor special enzymes called polymerases to the DNA, ensuring efficient copying of the genetic material. "Without a sliding clamp, polymerases can copy very few bases—the molecular 'letters' that make up the code of DNA—at a time. But the clamp helps the polymerase to stay in place, allowing it to copy thousands of bases before being removed from the strand of DNA," Hedglin said.

Hedglin explained that, due to the closed circular structure of sliding clamps, another necessary step in DNA replication is the presence of a "clamp loader," which acts to latch and unlatch the sliding clamps at key stages during the process. "The big unknown has always been how the sliding clamp and the clamp loader interact and the timing of latching and unlatching of the clamp from the DNA," said Hedglin. "We know that polymerases and clamp loaders can't bind the sliding clamp at the same time, so the hypothesis was that clamp loaders latched sliding clamps onto DNA, then left for some time during DNA replication, returning only to unlatch the clamps after the polymerase left so they could be recycled for further use."

To test this hypothesis, the team of researchers used a method called Förster resonance energy transfer (FRET), a technique of attaching fluorescent "tags" to human proteins and sections of DNA in order to monitor the interactions between them. "With these tags in place, we then observed the formation of holoenzymes—the active form of the polymerase involved in DNA replication, which consists of the polymerase itself along with any accessory factors that optimize its activity," Hedglin said. "We found that whenever a sliding clamp is loaded onto a DNA template in the absence of polymerase, the clamp loader quickly removed the clamp so that free clamps did not build up on the DNA. However, whenever a polymerase was present, it captured the sliding clamp and the clamp loader then dissociated from the DNA strand."

The team members also found that, during the moments when both the clamp loader and the clamp were bound to the DNA, they were not intimately engaged with each other. Rather, the clamp loader released the closed clamp onto the DNA, allowing an opportunity for the polymerase to capture the clamp, completing the assembly of the holoenzyme. Subsequently, the clamp loader dissociated from DNA. "Our research demonstrates that the DNA polymerase holoenzyme in humans consists of only a clamp and a DNA polymerase. The clamp loader is not part of it. It disengages from the DNA after the polymerase binds the clamp," Hedglin added.

Benkovic noted that this mechanism provides a means for the cell to recycle scarce clamps when they are not in use for productive replication.

Journal reference: eLife

Provided by Pennsylvania State University



Chevron grills U.S. lawyer in $19 billion Ecuador pollution case

Gas prices are displayed at a Chevron gas station in Los Angeles, California October 9, 2012. REUTERS/Mario Anzuoni


By Jonathan Stempel

NEW YORK | Tue Apr 16, 2013 4:45pm EDT

NEW YORK (Reuters) - Chevron Corp on Tuesday sought to persuade a New York federal judge to punish a U.S. lawyer representing Ecuadorean villagers who won a $19 billion environmental damages award, saying the lawyer is withholding documents from the oil company.

In an unusual court proceeding, a Chevron lawyer sharply questioned Steven Donziger, who represents residents of the Lago Agrio region who claim the company is responsible for contamination that sickened people in the Ecuadorean Amazon area.

At issue is the extent to which Donziger and others acted in bad faith by failing to turn over files and documents that Chevron claimed it needed for its case.

U.S. District Judge Lewis Kaplan in Manhattan is holding the hearing as part of a 2011 lawsuit in which Chevron accused Donziger and other defendants of racketeering and extortion. That case is scheduled to go to trial in October.

The two-decade fight between Chevron and Lago Agrio residents has included aggressive litigation tactics and accusations of coercion and bribery that each side has denied.

Under questioning from Chevron lawyer Randy Mastro, Donziger denied having directed his Ecuadorean counterpart Pablo Fajardo to keep documents from Chevron, the second-largest U.S. oil company.

"Mr. Fajardo's view is that responding to your document request would violate Ecuador law," Donziger told Mastro.

Donziger admitted that he lacks access to many documents, including documents stored on Fajardo's computers, and thus could not speak to their importance.

Mastro, meanwhile, sought to discredit Donziger's contention that he worked for Fajardo, not the other way around.

Donziger stands to earn more than $1 billion if the $19 billion judgment is upheld, while Fajardo would make just under one-third that amount, Mastro said.

"So you make more than three times as much as Mr. Fajardo does for working on this case, but you testify that you work for him?" Mastro said.

Chevron's lawyer also introduced other evidence suggesting that Donziger was in fact calling the shots, noting that Fajardo has sometimes called Donziger "Commander."

Donziger said that moniker should be seen "as a term of affection, not as a term of hierarchy."

NO CONTROL?

In 2011, the Lago Agrio plaintiffs won an $18.2 billion judgment in Ecuador, which has since grown to $19 billion, on claims that San Ramon, California-based Chevron is responsible for contamination of their water and soil.

The environmental damage was supposedly caused by Texaco, which operated in Ecuador from 1964 to 1992. Chevron took on Texaco's liabilities when it bought the company in 2001.

Chevron says the Ecuador ruling is unenforceable. The Ecuadorean residents have yet to collect on the award and are trying to enforce the judgment in countries where Chevron operates.

Donziger has led that charge in the United States. Under questioning from his lawyer, John Keker, he said his authority has been reduced in recent months, furthering his argument that he lacked control to get the documents Chevron wants.

The Lago Agrio plaintiffs "wanted an adviser, not a person in control or in command of their decisions," Donziger said. "It's preposterous to think I can order Pablo Fajardo to turn over his case files to me."

Kaplan regularly chastised lawyers at the hearing for interrupting one another.

"Look Mr. Keker, I'm going to run this courtroom, and you're not going to tell me how," he told Keker, who had objected to what he thought was Mastro's interrupting an answer by Donziger.

Last month, in challenging other rulings by Kaplan, the Lago Agrio plaintiffs urged a federal appeals court to replace him with a different judge, citing his alleged "contempt" for Ecuador and its courts and "ill will" for Donziger.

Chevron won a victory on Monday when U.S. Magistrate Judge James Francis recommended the dismissal of counterclaims by Donziger accusing the company of harassment and trying to block enforcement of the judgment.

Kaplan will review the recommendation. The hearing that began on Tuesday is expected to last several days.

The case is Chevron Corp v. Donziger et al, U.S. District Court, Southern District of New York, No. 11-00691.

(Reporting by Jonathan Stempel; Editing by Martha Graybow and Andrew Hay)



REFILE-UPDATE 3-Intel foresees Q2 sales decline as PC market shrinks

* Trims capex plans for 2013

* Maintains full-year revenue forecast

* Shares edge lower

By Noel Randewich

SAN FRANCISCO, April 16 (Reuters) - Intel Corp said its current-quarter revenue would decline as much as 8 percent and trimmed its 2013 capital spending plans, as personal computer sales drop due to the growing popularity of tablets and smartphones.

Shares in the world's largest chip maker rallied as much as 3 percent after hours but quickly gave up the gains. The stock had been battered over the past week after researcher IDC revealed that PC sales notched a record quarterly decline in the first quarter.

Despite persistently weak demand for PCs, Intel held firm on its previous forecast that 2013 revenue would grow by a low single-digit percentage, a target some analysts believe is becoming more difficult to hit.

Chief Financial Officer Stacy Smith told analysts on a conference call after Intel's earnings report on Tuesday that its upcoming Haswell chip, as well as new ultrathin laptops and an improving economy, would revive growth in the second half of the year.

"That scares the hell out of me. They are holding to the same ultra-bullish forecast they gave before," said Stacy Rasgon, an analyst at Bernstein Research. "They are presumably pretty bullish on the new products they are planning."

Personal computer sales plunged 14 percent in the first three months of the year, the biggest decline in the two decades on record, as tablets grew more popular and buyers seemed to be avoiding Microsoft Corp's new Windows 8 operating system, according to IDC.

Under pressure, Intel also said in its quarterly news release on Tuesday that it was reducing 2013 capital spending from $13 billion to $12 billion, plus or minus $500 million.

STICKING TO THEIR GUNS

Intel said its first-quarter revenue fell to $12.58 billion from $12.91 billion in the year-ago quarter.

The world's largest chipmaker forecast June-quarter revenue of $12.9 billion, plus or minus $500 million. Compared to the second quarter of last year, that amounts to roughly no change or a drop of as much as 8 percent.
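The "roughly no change or a drop of as much as 8 percent" range can be checked with a quick calculation. This sketch assumes a year-ago second-quarter revenue of about $13.5 billion, the figure implied by the ranges quoted here rather than stated in the article:

```python
# Implied year-over-year change in Intel's Q2 guidance.
year_ago = 13.5            # Q2 2012 revenue, $ billions (assumed)
midpoint, margin = 12.9, 0.5

low, high = midpoint - margin, midpoint + margin
worst = (low / year_ago - 1) * 100    # pessimistic end of the guidance
best = (high / year_ago - 1) * 100    # optimistic end of the guidance

print(f"{worst:+.1f}% to {best:+.1f}%")  # -> -8.1% to -0.7%
```

The low end of the guidance works out to roughly an 8 percent decline, while the high end is nearly flat, matching the description in the text.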

Analysts had expected $12.588 billion in revenue for the first quarter and $12.854 billion for the June quarter, according to Thomson Reuters I/B/E/S.

Intel posted first-quarter net income of $2.04 billion, or 40 cents a share, down from $2.74 billion, or 55 cents a share, in the year-ago period. Analysts on average had expected 41 cents per share.

"These numbers are not very solid, but the second-quarter guidance is better than feared. Conditions are probably not as bad as industry reports have suggested recently," said Doug Freedman, an analyst at RBC Capital.

Shares of Intel edged down less than 1 percent in extended trade after closing up 2.5 percent at $21.91 on Nasdaq.



Updyke receives 3 years for Auburn tree poisoning

(AP)—The Alabama fan who poisoned the iconic Toomer's Corner oak trees at rival Auburn has been sentenced to three years in prison.

Court filings Friday show that Harvey Updyke Jr. pleaded guilty to damaging an agricultural facility. The sentence requires him to serve at least six months in jail and spend five years on supervised probation. He has been credited with 104 days already served.

Lee County Circuit Judge Jacob A. Walker III also fined Updyke $1,000. The probation terms include a 7 p.m. curfew and prohibit Updyke from going onto Auburn's campus or attending a college sporting event.

Auburn fans traditionally gather at Toomer's Corner to celebrate victories. The resolution ends a case that highlighted the emotions in the year-round rivalry.

Copyright 2013 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.



[Editors' Choice] In Search of a (Functional) Cure



Material screening method allows more precise control over stem cells

William Murphy, the Harvey D. Spangler professor of biomedical engineering

(Phys.org) —When it comes to delivering genes to living human tissue, the odds of success come down to the molecule. The entire therapy - including the tools used to bring new genetic material into a cell - must have predictable effects.

Now, a new screening process will simplify non-viral transfection, providing a method researchers and clinicians can use to find an optimal set of biomaterials for delivering genes to cells.

Developed by William Murphy, the Harvey D. Spangler professor of biomedical engineering at the University of Wisconsin-Madison, the method gives researchers greater control over how cells react to the gene delivery mechanism. The broader implication is more nuanced, effective control over cell behavior. "We've been exploring using this concept for reprogramming of adult cells, as well as controlling differentiation of stem cell types," he says.

Murphy and his collaborators published news of their advance in the March 28, 2013 issue of Nature's Scientific Reports.

In a current successful approach, researchers use specialized viruses to deliver genetic material to cells. While efficient, that method also carries a greater risk of turning on unwanted genes or provoking an immune response from the body—making it less attractive for sensitive biomedical applications like controlling stem cell behavior, says Murphy.

His team has developed a process that does not rely on viruses. Rather, the researchers can grow specific calcium phosphate coatings that serve as a medium via which genetic material can be delivered to cells more efficiently. By matching a coating to a specific application for delivering genes, Murphy has seen up to a 70-fold increase in successful expression of those genes in human stem cells.

"From an application standpoint, the advance could be really impactful, and could enable gene delivery to become an integral part of medical device design and tissue engineering applications," says Murphy.

The process could be critical to further advances in regenerative medicine. Since researchers can apply it to any size or shape of tissue engineering structure, it could help provide engineers a simpler way to build the complex tissue structures required to deliver next-generation drug screening and patient therapies.

More information: www.nature.com/srep/2013/130328/srep01567/full/srep01567.html

Journal reference: Scientific Reports

Provided by University of Wisconsin-Madison


View the original article here

CANADA STOCKS-TSX gets bounce from banks, energy after U.S. data


View the original article here

Reliable packaging for chemical-free food

It is not obvious when examining a wrapped lettuce or a microwavable bowl of Chinese soup. But plastic food packaging is made of multiple layers designed to act as a barrier against oxygen or bacteria. "Each of these layers is made by a different manufacturer. Still, at the end of the chain, the food manufacturer who sells the packaged product is solely responsible for food safety," notes Olivier Vitrac, a researcher at the Genial joint research unit of the National Institute of Agricultural Research (INRA-AgroParisTech), located in Massy, near Paris. Scientists have been addressing the issue of such potentially harmful molecules diffusing from one layer of the packaging to the next, where they may ultimately contaminate the food. This work has been performed under the SafeFoodPackDesign project, coordinated by Vitrac and funded by the French National Research Agency (ANR).

Food packaging is designed to keep its contents as fresh and safe as possible. Its second function is to make the product look attractive to customers, using colourful prints. The problem is that this approach requires the use of potentially harmful substances such as UV stabilisers, antioxidants, glues and ink pigments. These substances have been suspected—albeit not always proven categorically—of triggering cancers. They have also been accused of endocrine disruption because some contain chemicals like bisphenol A. This hormone-like chemical has recently been banned in France. It has also been the object of the EU-funded ENDOCEVAL project, which aims to test new packaging that is free from bisphenol A.

The goal of the SafeFoodPackDesign project is to build tools that help packaging manufacturers assess the diffusion risks of potentially harmful molecules at every stage of the packaging's life, from manufacturing to final use, including transport and storage. For example, piling up plastic cups designed to hold, say, Chinese soup puts the inked external layer in contact with the inner layer of the cup immediately underneath. As a result, ink molecules migrate towards the inner layer, which will eventually be in contact with the food. "In this case, the most critical step regarding chemical risk is without doubt storage," Vitrac tells youris.com.

The first task of the project team has been to build a database of materials used for packaging to document their molecular content. Now, they are measuring the diffusion speed of these molecules, in order to build predictive models. "Building migration models can be challenging in the case of a multi-layer packaging containing several chemicals in contact with non-homogeneous food. Besides, the input data available to feed the model, such as packaging and food composition, is not always accurate," warns Peter Mercea, of FABES, a German company specialised in testing and evaluating migration of substances from packaging into foods.

By December 2014, the project team is expected to have developed open-source software, based on a method used in aeronautics called Failure Mode, Effects and Criticality Analysis (FMECA), to detect every critical point in the packaging lifecycle where contamination could occur. A first version is already available.
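The FMECA-style ranking described above can be sketched in a few lines. The scoring below uses the Risk Priority Number (RPN) from the closely related FMEA method, and every lifecycle stage and rating is a hypothetical illustration, not project data.

```python
# Sketch of an FMECA-style criticality ranking for packaging lifecycle
# stages. All stages and ratings here are invented examples.

def rpn(severity, occurrence, detection):
    """Risk Priority Number: product of three 1-10 ratings (higher = worse)."""
    return severity * occurrence * detection

# (stage, severity, occurrence, detection) -- hypothetical ratings
stages = [
    ("manufacturing", 4, 3, 2),
    ("transport",     3, 4, 5),
    ("storage",       6, 7, 8),  # e.g. stacked cups: ink contacts food side
    ("final use",     5, 2, 3),
]

# rank stages from most to least critical
ranked = sorted(
    ((name, rpn(s, o, d)) for name, s, o, d in stages),
    key=lambda item: item[1],
    reverse=True,
)

for name, score in ranked:
    print(f"{name}: RPN = {score}")
```

With these invented ratings, storage comes out as the most critical stage, matching the stacked-cup example in the article.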

This new tool could change the way plastic food packaging is designed and handled. At present, the EU Framework Regulation on food contact materials (EC) No 1935/2004 states that packaging must not transfer chemicals to food in quantities that may pose a threat to human health or significantly alter the composition and organoleptic characteristics of the food. "This regulation is already stringent and every new packaging must be proven individually safe before entering the market. But it says nothing about transport and storage and other possible critical points," explains Daniel Ribera, of Bio-tox, a consultancy specialised in sanitary and environmental risk assessment. "If, in addition, SafeFoodPackDesign can help assess the related risks, it will provide a real increase in safety for manufacturers and therefore for consumers."

Provided by Youris.com


View the original article here

Theory models, EMSL capabilities illuminate how particles grow in the atmosphere


Barrier breakthrough: The two-step sequential ammonia–sulfuric acid loss pathway shows the presence of barriers for addition of ammonia (NH3) and sulfuric acid (H2SO4) to the clusters. Current atmospheric particle models do not consider these barriers, which may limit their precision.

Determining the chemical mechanisms that govern new particle formation, or NPF, in the atmosphere is not something that can be pulled out of thin air. In the atmosphere, nucleating clusters are presumed to be composed of a few common species: sulfuric acid (the key chemical component), ammonia, amines (ammonia derivatives), and water—all of which have different effects on nucleation and particle growth. Moreover, these same clusters may significantly impact cloud condensation nuclei, the particles that spawn cloud formation, and, in turn, affect global climate. Scientists used EMSL's Fourier transform ion cyclotron resonance mass spectrometer equipped with surface-induced dissociation (SID) to examine and model fragmentation kinetics and energetics of small clusters that may serve as precursors to NPF.

Primarily, they evaluated the role of two positively charged ammonium bisulfate clusters (other studies have shown sulfuric acid-ammonia cluster growth follows an ammonium bisulfate pathway). SID of these ammonium bisulfate clusters indicated two unique pathways for cluster fragmentation: 1) a two-step pathway, where a cluster initially loses an ammonia molecule and then a sulfuric acid molecule, and 2) a one-step pathway that proceeds via the loss of an ammonium bisulfate molecule. They compared experimental data to quantum chemical calculations of cluster dissociation thermodynamics and determined that the measured energy required for loss of either an ammonia molecule or an ammonium bisulfate molecule is greater than the corresponding thermodynamic value.

These results suggest the presence of an activation barrier for ammonia incorporation into molecular clusters as they grow, which may impact cluster distributions in the atmosphere. Current atmospheric particle models typically do not consider activation barriers in NPF growth processes. Thus, this work paves the way for atmospheric NPF models with considerably improved accuracy.
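The barrier inference above can be made concrete with a one-line calculation: if the measured energy needed to remove a molecule exceeds the computed thermodynamic dissociation energy, the excess is an activation barrier for the reverse (addition) step. The numbers below are hypothetical, not values from the study.

```python
# Sketch of the barrier inference: excess of measured loss energy over
# the thermodynamic dissociation energy = activation barrier for addition.
# All numbers are invented for illustration.

def activation_barrier(measured_loss_energy, thermo_value):
    """Inferred barrier height, in the same units as the inputs."""
    return max(measured_loss_energy - thermo_value, 0.0)

# hypothetical values (kcal/mol) for an NH3-loss channel
measured = 28.0   # apparent threshold from SID experiments (invented)
thermo = 22.5     # quantum chemical dissociation energy (invented)

print(f"inferred NH3-addition barrier: "
      f"{activation_barrier(measured, thermo):.1f} kcal/mol")
```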

More information: Bzdek, B. et al. 2013. Fragmentation Energetics of Clusters Relevant to Atmospheric New Particle Formation. Journal of the American Chemical Society 135(8):3276-3285. DOI: 10.1021/ja3124509.

Journal reference: Journal of the American Chemical Society

Provided by Environmental Molecular Sciences Laboratory


View the original article here

[News & Analysis] Neuroscience: Tissue Imaging Method Makes Everything Clear


View the original article here

Tiny colorful snails are in danger of extinction with vanishing limestone ecosystems

This image shows the beautiful bright orange-colored Perrottetia dermapyrrhosa, one of the newly described species from Thailand. Credit: Somsak Panha

Researchers from Chulalongkorn University, Bangkok, and the Natural History Museum, London (Thanit Siriboon, Chirasak Sutcharit, Fred Naggs and Somsak Panha) discovered many new taxa of the brightly coloured carnivorous terrestrial snail family Streptaxidae. Terrestrial snails are primarily herbivores; only a rare few groups like this one are carnivorous. The animals come from several limestone areas across the world, including some threatened by human exploitation, especially quarrying.

Three new species from the genus Perrottetia were described from north and northeastern Thailand. The species show extraordinary endemism, with each of these colourful snails occurring as "one hill, one species". This is a very peculiar phenomenon in which each of these highly endemic snails is the only species inhabiting a particular mountain range. They live in rock crevices, feeding on smaller snails, insect larvae and some earthworm species. These beautiful animals are now at risk of extinction with the destruction of limestone ecosystems. The study was published in the open access journal ZooKeys.


This image shows Perrottetia aquilonaria, one of the newly described species. Credit: Somsak Panha

Limestone ecosystems around the world are now being destroyed at an alarming rate. This means we are losing biodiversity resources, a trend especially threatening for hotspot areas like Thailand. The new research findings show that key terrestrial invertebrates, such as these bright carnivorous land snails, still persist in such areas and are being described even from highly endangered quarried sites. This demonstrates that remnants of these fundamental ecosystems still survive, though they are struggling, and that there is much for mankind to learn from them.


This image shows a limestone hill being quarried. Under its rocks, there are still many living animals, including carnivorous snails struggling for survival. Credit: Somsak Panha

"The three new Perrottetia species exhibit distinct morphological characteristics, which make them a great example for evolutionary studies in unstable environments," comments one of the authors, Dr Somsak Panha. "More than 50% of the limestone ecosystems in this region have been or are still being destroyed. This astonishing case of biodiversity persistence gives a valuable reason to put effort into the conservation of this important world ecosystem."

More information: Siriboon T, Sutcharit C, Naggs F, Panha S (2013) Three new species of the carnivorous snail genus Perrottetia Kobelt, 1905 from Thailand (Pulmonata, Streptaxidae). ZooKeys 287: 41-57. doi: 10.3897/zookeys.287.4572

Journal reference: ZooKeys

Provided by Pensoft Publishers


View the original article here

Orange flour for gluten-free bread

Orange pomace.

During the processing of fruit and vegetables, one third is discarded as 'waste'. This waste or by-product comprises the core, pips and peel of the fruit or vegetable. It can be costly for the manufacturer to dispose of and may also have hazardous effects on the environment.

Research has shown that a high quantity of nutrients such as dietary fibre and bioactives is present in these by-products, so the full potential of the fruit or vegetable is not being fulfilled. For example, orange pomace, a by-product of the smoothie and juice industry, has been shown to have good nutritional attributes; it is low in fat (2% dry matter), high in dietary fibre (40% dry matter) and has the potential to be used as a food ingredient.

Researchers at Teagasc Food Research Centre and University College Cork have been looking into possible uses for this discarded material, for example the use of orange flour in gluten-free bread formulations.


Orange flour.

Presently in Ireland, approximately one in a hundred people suffer from coeliac disease, an autoimmune reaction to the prolamin fraction of the gluten protein that causes damage to the villi in the small intestine. The only treatment for coeliac disease is lifelong avoidance of foods containing wheat, barley, spelt, rye and some oats. Many gluten-free products available on the market are calorie dense, lacking in flavour, mouthfeel and nutrients, and costly to buy.

Gluten-free formulations

Dr Eimear Gallagher, Teagasc Food Research Centre, explains: "Developing gluten-free formulations can be challenging for the cereal technologist, as the structure-building protein (i.e., gluten) is absent. In the present study, a response surface design was created to statistically calculate the optimal level of orange pomace addition, water addition and ideal proofing time to produce an optimal bread formulation. This study investigated the effects of these three factors, in different combinations, on bread parameters such as loaf volume, crumb structure, crumb colour, texture, microstructure, and nutritional and sensory properties, and the optimised samples were assessed by sensory panellists."

"Sensory panellists scored the bread favourably with respect to appearance, flavour, texture and overall acceptability," says Norah O'Shea, a PhD student of Dr Gallagher.

"The inclusion of orange pomace as a novel structure-building and nutritious ingredient in a gluten-free formulation was illustrated in this study. Using response surface methodology as a tool, the researchers successfully created bread with favourable baking characteristics and enhanced dietary fibre. Orange pomace proved to be a viable, low-cost food by-product for improving the physical and nutritional characteristics of gluten-free breads," explains Dr Gallagher.
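As a rough illustration of the response surface approach described above, a quadratic model can be fitted to a small factorial design and used to predict the response at a candidate setting. Every factor level and loaf volume below is invented, not taken from the study.

```python
# Sketch of response surface methodology (RSM): fit a quadratic model of
# loaf volume to three factors, then predict a candidate formulation.
# All data are hypothetical.
import numpy as np

# factors: pomace (%), water (%), proofing time (min); response: volume (mL)
X = np.array([
    [5.0,  90, 30], [5.0, 110, 30], [5.0,  90, 60], [5.0, 110, 60],
    [10.0, 90, 30], [10.0, 110, 30], [10.0, 90, 60], [10.0, 110, 60],
    [7.5, 100, 45], [7.5, 100, 45],   # replicated centre point
])
y = np.array([770, 800, 790, 820, 750, 780, 765, 795, 840, 835])

def design_matrix(X):
    # columns: intercept, linear terms, pure quadratic terms
    return np.column_stack([np.ones(len(X)), X, X**2])

# least-squares fit of the 7 model coefficients
coef, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)

# predict loaf volume for a candidate setting
candidate = np.array([[7.5, 105.0, 50.0]])
pred = (design_matrix(candidate) @ coef)[0]
print(f"predicted loaf volume: {pred:.0f} mL")
```

A real RSM study would also include interaction terms and significance tests; this sketch keeps only the pure quadratic surface for brevity.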

"The addition of this ingredient is not limited to gluten-free bread; it has potential to be used in both gluten-containing and gluten-free breads and confectionery," adds Dr Gallagher.

More information: O'Shea, N. et al. (2012). Dietary fibre and phytochemical characteristics of fruit & vegetable by-products & their recent applications as novel ingredients in food products. Innovative Food Science and Emerging Technologies, 16: 1-10.

Provided by Teagasc


View the original article here

[Report] Wnt Stabilization of β-Catenin Reveals Principles for Morphogen Receptor-Scaffold Assemblies

Sung-Eun Kim1,*, He Huang1,*,†, Ming Zhao2,*, Xinjun Zhang1, Aili Zhang2,4, Mikhail V. Semonov1,‡, Bryan T. MacDonald1, Xiaowu Zhang5, Jose Garcia Abreu1,3, Leilei Peng2, Xi He1,§

1F. M. Kirby Center, Boston Children’s Hospital, Harvard Medical School, Boston, MA 02115, USA.
2College of Optical Sciences, University of Arizona, Tucson, AZ 85721, USA.
3Instituto de Ciencias Biomedicas, Universidade Federal do Rio de Janeiro, Rio de Janeiro, Brazil.
4School of Biomedical Engineering, Jiaotong University, Shanghai, China.
5Cell Signaling Technology, Danvers, MA 01923, USA.

§Corresponding author. E-mail: xi.he{at}childrens.harvard.edu

* These authors contributed equally to this work.

† Present address: Department of Pathology, University of Buffalo, Buffalo, NY 14203, USA.

‡ Present address: Veterans Administration Hospital and Department of Pathology, Boston University, Bedford, MA 01730, USA.

Wnt signaling stabilizes β-catenin through the LRP6 receptor signaling complex, which antagonizes the β-catenin destruction complex. The Axin scaffold and associated glycogen synthase kinase-3 (GSK3) have central roles in both assemblies, but the transduction mechanism from the receptor to the destruction complex is contentious. We report that Wnt signaling is governed by phosphorylation regulation of Axin scaffolding function. Phosphorylation by GSK3 kept Axin activated (“open”) for β-catenin interaction and poised for engagement of LRP6. Formation of the Wnt-induced LRP6-Axin signaling complex promoted Axin dephosphorylation by protein phosphatase-1 and inactivated (“closed”) Axin through an intramolecular interaction. Inactivation of Axin diminished its association with β-catenin and LRP6, thereby inhibiting β-catenin phosphorylation and enabling activated LRP6 to selectively recruit active Axin for inactivation reiteratively. Our findings reveal mechanisms for scaffold regulation and morphogen signaling.

Received for publication 5 November 2012. Accepted for publication 19 March 2013.


View the original article here

Companies jockey for position in changing U.S. space market

By Andrea Shalal-Esa

WASHINGTON | Fri Apr 12, 2013 5:35pm EDT

WASHINGTON (Reuters) - Space companies are shifting strategies to benefit from a change in how the U.S. government buys satellites, rockets and space services.

After years of billion-dollar cost overruns and schedule delays on complex satellite programs, U.S. officials are looking for smaller, less expensive spacecraft and exploring alternatives such as pay-for-service deals, or packing sensors onto government or commercial satellites.

Air Force General Robert Kehler, who heads the military command that oversees U.S. nuclear, satellite and cyber operations, warned thousands of top industry executives this week that big-ticket space programs would be reevaluated as part of a 60-day review ordered by Defense Secretary Chuck Hagel.

"There is no doubt that our space-based systems are expensive and will be part of that review," he said at the week-long National Space Symposium, which brought together over 9,000 experts from across the industry.

"Regardless of the outcome, we must find ways to drive costs down as we look to the future."

The White House budget proposal for fiscal 2014 includes $8 billion for unclassified space programs, about the same as this year, and billions more for additional programs in the "black world," but officials are forecasting deeper cuts in 2015.

Brigadier General Roger Teague, director of strategic plans at Air Force Space Command, told the conference that his office had already cut costs by $985 million by reducing staffing, testing and production costs, and was targeting additional cuts of $600 million across 20 programs this year.

INDUSTRY CHANGES TACTICS

The shift is changing the way big players like Lockheed Martin Corp, Boeing Co, Northrop Grumman Corp, and smaller firms like Orbital Sciences Corp, ITT Exelis Corp, Harris Corp and Alliant Techsystems Inc map out future business plans.

"If we keep doing things the same way and expect different results in this environment, that's not going to work," said retired Lieutenant General Trey Obering, the former director of the Missile Defense Agency who now works for consulting firm Booz Allen.

Rick Ambrose, who heads space programs for Lockheed, which is building new large missile warning, protected communications and global positioning satellites for the Air Force, said Lockheed is bracing for further declines in Pentagon spending.

He said the Pentagon was likely to stick to the existing programs for core missions, such as missile warning, targeting and protected communications, all contracts held by Lockheed, even as it explores new options. But all firms were under pressure to innovate, accelerate schedules and cut costs.

Ambrose said Lockheed's space division was halfway to its goal of consolidating 1.5 million square feet of facilities and would have reduced overhead by $300 million this year.

Lockheed also builds smaller satellites, and is taking part in Air Force studies about different approaches, but Ambrose warned against rushing headlong into new acquisition programs.

He said that approach backfired during the last drawdown, when the Pentagon curtailed existing programs before leap-ahead programs matured, leaving gaps in some important capabilities.

"If you're a wing walker you never let go of a rung until you grab the next rung," he said.

SPACEX VS BOEING

Lockheed and Boeing also face competition for the biggest rockets, with the Air Force laying the groundwork for new entrants like Space Exploration Technologies (SpaceX) and Orbital Sciences.

Boeing is rapidly expanding its commercial satellite orders after losing out to Lockheed on some bigger military contracts, but sees good opportunities for its new line of smaller satellites and the prospect of hosting sensors on its wideband communications satellites in coming years.

"It's clear that they want to move away from these big mega-programs that cost a lot of money and find cheaper solutions," Craig Cooning, vice president and general manager of Boeing Space and Intelligence Systems, told Reuters, citing space situational awareness and communications as promising areas.

"If you look at our business overall, we have recognized the changes in the marketplace and we have adapted for that," he said, adding that Boeing delivered 11 satellites last year and was on track to deliver 10 more this year.

The Air Force is wrapping up an analysis of alternatives for a next-generation weather satellite this summer that will likely include several of the new approaches.

Companies like Harris and Orbital Sciences are promoting the use of "hosted payloads" in which sensors are packed aboard other satellites, although government and industry officials say some technical and policy issues must still be worked out.

Harris is working with Iridium Communications on the largest hosted payload deal to date, an aircraft-tracking venture to kick off in 2015 that it says will save airlines money but also holds promise for U.S. government missions.

Meanwhile, Inmarsat PLC is getting ready to launch its new Global Express communications satellites, which it says will offer comprehensive mobile broadband services to a variety of users, including the U.S. military.

Obering said the new environment presented opportunities, but the Pentagon needed to revamp its acquisition processes to realize the benefits of technological advances in industry.

One model could be the way the Missile Defense Agency had been freed from some budget processes to rapidly develop missile defense capabilities, Obering said.

"We have to fundamentally change our acquisition approaches to do that. We have to be able to move at the speed of industry to be able to take full advantage of what ... the commercial market will be offering in the coming years," he said.

(Reporting By Andrea Shalal-Esa; Editing by Alwyn Scott and Andrew Hay)


View the original article here

Cell phone camera photographs microscopic cell samples

On April 11th JoVE (Journal of Visualized Experiments) will publish a new video article by Dr. Aydogan Ozcan demonstrating how a cell phone camera can capture images from a fluorescent microscope and flow cytometer, which will make it possible for areas with limited resources to easily run tests such as checking for contaminated water and monitoring HIV positive patients.

In the new video article, electrical and biomedical engineers from the University of California, Los Angeles show the construction of the device and how it can be modified to fit any cellphone with a camera. The team expects the device to be helpful to doctors and scientists in countries with limited supplies and in fast-paced clinical environments.

"There is a huge need for these [miniaturized] devices. Resource poor countries demand compact, cost effective and light weight devices to replace bulky equipment common in our labs and hospitals," Dr. Ozcan explains. "These devices bring the diagnostic, testing, and microanalysis capabilities of larger machines to your cellphone."

This video is not supported by your browser at this time.

The video as it appears in the JoVE article. Credit: jove.com

Flow cytometry is a way to count and characterize cells in a liquid sample, and was first developed by Wallace H. Coulter in 1953. Since then, flow cytometry has become ubiquitous in scientific research, particularly in the fields of molecular biology, pathology, and immunology.

Dr. Ozcan's device brings fluorescent microscopy and flow cytometry, two widely used tools in biomedical research, to the common cellphone. The vast network of cellphone subscribers around the world, estimated by the United Nations to top 6 billion subscribers, provides a massive infrastructure to be able to conduct complex biological tests. Dr. Ozcan's device can be constructed for less than $50 plus the cost of the cellphone, while full sized fluorescent flow cytometers can cost more than $150,000 and require expansive lab space to operate.

"A cellphone has almost the computing power of a super computer of the early 1990s, and with over 6 billion cellphone subscribers in the world there is a massive cost reduction to owning a cellphone. That is exactly why I and my colleagues are trying to deploy these micro-devices to cellphones." Dr. Ozcan and his colleagues have filed more than 20 intellectual property licenses as part of Holomic LLC, a startup focused on the development of laboratory equipment for mobile devices.

Dr. Nandita Singh, senior science editor at JoVE, says of Dr. Ozcan's publication: "We are very excited to publish this inexpensive cell phone based technology platform that enables the detection of white blood cells to monitor HIV positive patients in geographical regions with limited resources. This technology can also be extended to detect E. coli contamination in water and milk supplies."

More information: Ozcan et al.; www.jove.com/video/50451/wide-field-fluorescent-microscopy-fluorescent-imaging-flow-cytometry

Journal reference: Journal of Visualized Experiments

Provided by The Journal of Visualized Experiments


View the original article here

Survived cancer? Now look out for cardiovascular risks


View the original article here

Powerpot turns heat and water into electricity


View the original article here

Obama administration renews aviation biofuel program

Crew prepare a Virgin Atlantic Boeing 747 aircraft before the world's first commercial biofuel flight to Amsterdam from Heathrow Airport in London February 24, 2008. REUTERS/Luke MacGregor

Crew prepare a Virgin Atlantic Boeing 747 aircraft before the world's first commercial biofuel flight to Amsterdam from Heathrow Airport in London February 24, 2008.

Credit: Reuters/Luke MacGregor

By Ayesha Rascoe

WASHINGTON | Mon Apr 15, 2013 7:10pm EDT

WASHINGTON (Reuters) - The Obama administration on Monday renewed an interagency agreement that backs the development of biofuels for the aviation industry and reiterated its support for embattled federal renewable fuel targets.

U.S. Agriculture Secretary Tom Vilsack and Transportation Secretary Ray LaHood signed a pact extending a program that has worked with the private sector and rural communities to create an alternative to fossil fuels for aviation.

"We want to re-affirm the importance of this particular industry in this administration," Vilsack told reporters at an industry conference in Washington.

The "Farm to Fly" program aims to support annual production of 1 billion gallons of aviation biofuels by 2018.

The program will focus on evaluating various sources of renewable alternatives to jet fuel, while also developing state and local partnerships with private companies.

Federal support for biofuels has come under increased scrutiny amid complaints from livestock producers and refiners that the federal biofuels mandate has contributed to higher food prices and could threaten gasoline supplies.

Last week, lawmakers in the House of Representatives introduced legislation that would eliminate the corn-based ethanol portion of the mandate, which requires increasing amounts of renewable fuels to be blended into U.S. gasoline and diesel supplies.

The Obama administration's support for the mandate could block attempts to curtail the targets, though, especially as most lawmakers from major grain-producing states oppose any limits on the mandate.

Vilsack encouraged the biofuel industry representatives to remain "vigilant" in support of the mandate.

"There are industries and folks who are deeply concerned about the progress that is being made, who want to slow that progress down," Vilsack said. "Now is not the time to step back, now is the time to continue moving forward."

Vilsack told reporters that the mandate was lowering, not raising, gasoline prices for consumers and creating jobs in rural communities.

Oil refiners, who want the mandate rescinded, say the targets are approaching a point where compliance would require the industry to blend more ethanol into gasoline than can physically be done at the 10 percent per gallon level.

This problem is referred to as the "blend wall".
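The "blend wall" arithmetic described above is simple to sketch: compare the mandated ethanol volume with what a 10 percent blend of the national gasoline supply can absorb. The demand and mandate figures below are hypothetical round numbers, not official statistics.

```python
# Back-of-the-envelope "blend wall" check with invented round numbers.

gasoline_demand = 133e9   # gallons of gasoline sold per year (hypothetical)
ethanol_mandate = 14e9    # gallons of ethanol required per year (hypothetical)
blend_limit = 0.10        # E10: at most 10 percent ethanol per gallon

max_absorbable = gasoline_demand * blend_limit     # ethanol E10 can take up
share_required = ethanol_mandate / gasoline_demand # share the mandate implies

print(f"mandate implies a {share_required:.1%} ethanol share")
if ethanol_mandate > max_absorbable:
    print("mandate exceeds what E10 blending can absorb -- the blend wall")
```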

Supporters of ethanol argue the "blend wall" could be easily overcome if refiners drop their opposition to allowing gasoline with 15 percent ethanol content, or E15.

The Environmental Protection Agency has approved use of E15 in cars built since 2001, which now account for about two-thirds of U.S. passenger vehicles on the road, but gasoline station operators and oil refiners have voiced concerns that higher blends could hurt vehicle engines.

(Reporting by Ayesha Rascoe; Editing by Ros Krasny and Leslie Gevirtz)


View the original article here