Tuesday, March 31, 2009

Action Video Games Improve Vision, New Research Shows

This is a photo illustrating 58 percent better contrast perception versus "regular" contrast perception.

Video games that involve high levels of action, such as first-person-shooter games, increase a player's real-world vision, according to research published March 29 in Nature Neuroscience.

The ability to discern slight differences in shades of gray has long been thought to be an attribute of the human visual system that cannot be improved. But Daphne Bavelier, professor of brain and cognitive sciences at the University of Rochester, has discovered that very practiced action gamers become 58 percent better at perceiving fine differences in contrast.
"Normally, improving contrast sensitivity means getting glasses or eye surgery—somehow changing the optics of the eye," says Bavelier. "But we've found that action video games train the brain to process the existing visual information more efficiently, and the improvements last for months after game play stopped."
The finding builds on Bavelier's past work that has shown that action video games decrease visual crowding and increase visual attention. Contrast sensitivity, she says, is the primary limiting factor in how well a person can see. Bavelier says that the findings show that action video game training may be a useful complement to eye-correction techniques, since game training may teach the visual cortex to make better use of the information it receives.
To learn whether high-action games could affect contrast sensitivity, Bavelier, in collaboration with graduate student Renjie Li and colleagues Walt Makous, professor of brain and cognitive sciences at the University of Rochester, and Uri Polat, professor at the Eye Institute at Tel Aviv University, tested the contrast sensitivity function of 22 students, then divided them into two groups. One group played the action video games "Unreal Tournament 2004" and "Call of Duty 2"; the second group played "The Sims 2," which is a richly visual game but does not demand the level of visual-motor coordination of the other group's games. The volunteers played 50 hours of their assigned games over the course of nine weeks. At the end of the training, the students who played the action games showed an average 43% improvement in their ability to discern close shades of gray (close to the difference she had previously observed between game players and non-game players), whereas the Sims players showed none.
"To the best of our knowledge, this is the first demonstration that contrast sensitivity can be improved by simple training," says Bavelier. "When people play action games, they're changing the brain's pathway responsible for visual processing. These games push the human visual system to the limits and the brain adapts to it, and we've seen the positive effect remains even two years after the training was over."
Bavelier says that the findings suggest that despite the many concerns about the effects of action video games and the time spent in front of a computer screen, that time may not necessarily be harmful, at least for vision.
Bavelier is now taking what she has learned with her video game research and collaborating with a consortium of researchers to look into treatments for amblyopia, a problem caused by poor transmission of the visual image to the brain.
This research was funded by the National Eye Institute and the Office of Naval Research.
Adapted from materials provided by University of Rochester, via EurekAlert!, a service of AAAS.


Ice Storms Devastating To Pecan Orchards

This is the aftermath of an ice storm in a pecan grove near Eufaula, Okla.

Ice storms and other severe weather can have devastating impacts on agricultural crops, including perennial tree crops. Major ice storms occur at least once a decade, with truly catastrophic "icing events" recorded once or twice a century within a broad belt extending from eastern Texas through New England. Ice storms can result in overwhelming losses to orchards and expensive cleanup for producers.
Because the long limbs of pecan trees act as levers and increase the likelihood of breakage, pecan orchards and groves are particularly susceptible to damage from tornadoes, hurricanes, and ice storms. Ice damage is typically more severe in pecan orchards than in other orchard crops.

Oklahoma has 85,740 acres of pecans on 2,879 farms. Ice storms struck Oklahoma four times from 2000 through 2007. The crippling ice storm in December 2000, which hit the southeast quarter of Oklahoma, extended into parts of Texas, Louisiana, and Arkansas. An estimated 25,000 to 30,000 acres of pecans were damaged in Oklahoma during this storm alone.
Michael W. Smith from the Department of Horticulture and Landscape Architecture at Oklahoma State University, and Charles T. Rohla of the Samuel Roberts Noble Foundation published a research report in the latest issue of HortTechnology that provides pecan producers, government agencies, and insurance companies with important information concerning orchard management and economics following destructive ice storms.
Cleanup of pecan orchards following ice damage presents enormous challenges for producers. Typical damage, cleanup, and recovery from four ice storms that hit the region from 2000 to 2007 were reported in the study. Trees less than 15 feet tall typically had the least damage; trees 15 to 30 feet tall incurred as much damage as or more damage than larger trees, and their cleanup costs were greater.
The silver lining: pecan trees are resilient. Most trees can survive and eventually return to productivity following loss of most of their crown. But cleanup costs to ice-damaged pecan orchards are high, ranging from $207 to $419 per acre based on the dollar value in 2008. According to the researchers, these costs were consistent among orchards where the owner supervised the labor and had the resources to obtain equipment necessary to prune and remove debris from the orchard. The cleanup costs paid to "custom operators" for renovating orchards following ice storms were significantly more expensive, ranging from $500 to $800 per acre in 2008 for orchards with similar damage levels.
Explaining the outcomes of the research study, Smith stated: "Following damaging weather events, producers seek information concerning effective cleanup procedures, subsequent management, recovery duration, and economic impact. State and Federal agencies and insurance companies seek guidance concerning economic impact and how to assist producers. Our objective was to provide information for producers and others regarding the impact of an ice storm on pecans."
Adapted from materials provided by American Society for Horticultural Science, via EurekAlert!, a service of AAAS.


Mice And Humans Should Have More In Common In Clinical Trials

Purdue researcher Joseph Garner found that traditional testing methods in mice increase errors in lab results. His study suggests researchers vary the environmental conditions for mice during tests to lessen the possibility of false positives.

Just as no two humans are the same, a Purdue University scientist has shown that treating mice more as individuals in laboratory testing cuts down on erroneous results and could significantly reduce the cost of drug development.

Mice have long been used as test subjects for treatments and drugs before those products are approved for human testing. But new research shows that the customary practice of standardizing mice by trying to limit environmental variation in laboratories actually increases the chance of getting an incorrect result.
The study, done by Joseph Garner, a Purdue assistant professor of animal sciences, and professor Hanno Würbel of the Justus-Liebig University of Giessen in Germany, was published in the early online edition of Nature Methods on Monday (March 30). It suggests scientists should change their methods and test mice in deliberately varying environmental conditions. Garner said that will decrease the number of false positive test results and eliminate further costly testing of drugs or treatments destined to fail.
"In lab animals, we have this bizarre idea that we can control everything that happens," Garner said. "But we would never be able to do that with humans, and we wouldn't want to. You want to know if a drug is going to work in all people, so you test it on a wide range of different people. We should do the same thing with mice."
Garner said human testing uses a broad range of subjects, giving scientists an idea of how a drug or treatment might affect different types of people. But scientists often use mice that are basically genetically identical and try to limit internal and external environmental factors such as stress, diet and age to eliminate variables affecting the outcome.
Garner said there is no practical way to ensure that all environmental conditions are the same with mice, however, because they respond to cues humans cannot detect. For example, a researcher's odor in one lab might cause more stress for a mouse than another researcher's odor in a second lab with different mice, giving different results. But scientists, unaware of the odor difference, may believe a treatment worked when the mice were actually responding to an environmental cue, giving a false positive.
The study used three different strains of mice from previously published data and compared their behavioral characteristics against each other. The observations were done in three different labs, two different types of cages and at three different times to make 18 different replicates of the same experiment. Traditional testing theories say the results should have been the same in all those experiments.
Once the results were compared, however, the researchers found many false positives, or instances when one strain appeared to act differently from another when it actually should not.
"There were nearly 10 times more false positives than we would expect by chance," Garner said. "There had to be a gremlin causing these false positives."
The researchers suspected the problem was in the traditional lab experiment design. So they reevaluated the data, picking a mouse of each strain from each environment (similar to matching pairs in human clinical trials) and found only the same number of false positives as would be expected by chance.
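The logic of Garner and Würbel's argument can be illustrated with a small simulation. This is a hedged sketch, not their actual analysis: it assumes two strains with no true difference, plus a hypothetical strain-by-environment interaction (such as a reaction to a researcher's odor). Fixing a single environment per replicate lets that interaction masquerade as a strain effect; mixing environments averages it out.

```python
import random
import statistics

random.seed(1)

def mouse_score(strain, env, interaction=1.5, noise=0.5):
    # No true strain difference: the only systematic effect is a
    # strain-by-environment interaction plus individual noise.
    return interaction * strain * env + random.gauss(0, noise)

def experiment(envs, n=20):
    """Compare strain 0 vs. strain 1, drawing each mouse's environment
    from `envs`; return True if the strain means differ 'significantly'
    (a crude two-standard-error threshold)."""
    a = [mouse_score(0, random.choice(envs)) for _ in range(n)]
    b = [mouse_score(1, random.choice(envs)) for _ in range(n)]
    diff = statistics.mean(b) - statistics.mean(a)
    se = (statistics.pvariance(a) / n + statistics.pvariance(b) / n) ** 0.5
    return abs(diff) > 2 * se  # roughly p < 0.05

trials = 500
# Standardized design: each replicate fixes one environment (env = +1 or -1).
standardized = sum(experiment([random.choice([-1, 1])]) for _ in range(trials)) / trials
# Heterogenized design: each replicate deliberately mixes both environments.
heterogenized = sum(experiment([-1, 1]) for _ in range(trials)) / trials

print(f"false-positive rate, standardized:  {standardized:.2f}")
print(f"false-positive rate, heterogenized: {heterogenized:.2f}")
```

In this toy setup the standardized design flags a spurious strain difference in almost every replicate, while the heterogenized design stays near the nominal error rate, mirroring the "nearly 10 times more false positives" pattern the study reports.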
When mouse testing creates a false positive, leading a researcher to believe a drug has worked, the drug could be sent to further animal testing and human clinical trials at a cost of millions of dollars. Drugs that fail in clinical trials cannot be marketed, and the money is wasted. To recoup those losses, drug companies must increase the costs of marketable drugs.
"Drugs aren't expensive because they're costly to make," Garner said. "They're expensive because the company has to recoup the costs of the other drugs that have failed in human clinical trials. Numbers are hard to estimate, but for every drug that reaches the marketplace, well over 100 have been abandoned at some point in their development."
Garner said giving mice varying environments also could be better for the animals because fewer could be used. Weeding out an unsuccessful drug would eliminate an unnecessary second round of animal testing.
"The really exciting message is that we have shown how the false positives in early drug discovery can be drastically reduced without costing anything more than a change in experimental design," Garner said. "These are positive results for pharmaceutical research, patients and for mice."
Garner and Würbel, along with Würbel's doctoral student Helene Richter, received research funding from the German Research Foundation. Their research will now focus on which environmental factors have the most impact on results.
Adapted from materials provided by Purdue University.


Hollow Gold Nanospheres Show Promise For Biomedical And Other Applications

Partial view of a gold nanosphere (shown), magnified by a factor of one billion, as seen through an electron microscope. The darker ring shows the "wall" of the nanosphere, while the lighter area to the right of the ring shows the interior region of the shell.

A new metal nanostructure developed by researchers at the University of California, Santa Cruz, has already shown promise in cancer therapy studies and could be used for chemical and biological sensors and other applications as well.
The hollow gold nanospheres developed in the laboratory of Jin Zhang, a professor of chemistry and biochemistry at UCSC, have a unique set of properties, including strong, narrow, and tunable absorption of light. Zhang is collaborating with researchers at the University of Texas M. D. Anderson Cancer Center, who have used the new nanostructures to target tumors for photothermal cancer therapy. They reported good results from preclinical studies earlier this year (Clinical Cancer Research, February 1, 2009).

Zhang will describe his lab's work on the hollow gold nanospheres in a talk on Sunday, March 22, at the annual meeting of the American Chemical Society in Salt Lake City.
"What makes this structure special is the combination of the spherical shape, the small size, and the strong absorption in visible and near infrared light," Zhang said. "The absorption is not only strong, it is also narrow and tunable. All of these properties are important for cancer treatment."
Zhang's lab is able to control the synthesis of the hollow gold nanospheres to produce particles with consistent size and optical properties. The hollow particles can be made in sizes ranging from 20 to 70 nanometers in diameter, which is an ideal range for biological applications that require particles to be incorporated into living cells. The optical properties can be tuned by varying the particle size and wall thickness.
In the cancer studies, led by Chun Li of the M. D. Anderson Cancer Center, researchers attached a short peptide to the nanospheres that enabled the particles to bind to tumor cells. After injecting the nanospheres into mice with melanoma, the researchers irradiated the animals' tumors with near-infrared light from a laser, heating the gold nanospheres and selectively killing the cancer cells to which the particles were bound.
Cancer therapy was not the goal, however, when Zhang's lab began working several years ago on the synthesis and characterization of hollow gold nanospheres. Zhang has studied a wide range of metal nanostructures to optimize their properties for surface-enhanced Raman scattering (SERS). SERS is a powerful optical technique that can be used for sensitive detection of biological molecules and other applications.
Adam Schwartzberg, then a graduate student in Zhang's lab at UCSC, initially set out to reproduce work reported by Chinese researchers in 2005. In the process, he perfected the synthesis of the hollow gold nanospheres, then demonstrated and characterized their SERS activity.
"This process is able to produce SERS-active nanoparticles that are significantly smaller than traditional nanoparticle structures used for SERS, providing a sensor element that can be more easily incorporated into cells for localized intracellular measurements," Schwartzberg, now at UC Berkeley, reported in a 2006 paper published in Analytical Chemistry.
The collaboration with Li began when Zhang heard him speak at a conference about using solid nanoparticles for photothermal cancer therapy. Zhang immediately saw the advantages of the hollow gold nanospheres for this technique. Li uses near-infrared light in the procedure because it provides good tissue penetration. But the solid gold nanoparticles he was using do not absorb near-infrared light efficiently. Zhang told Li he could synthesize hollow gold nanospheres that absorb light most efficiently at precisely the wavelength (800 nanometers) emitted by Li's near-infrared laser.
"The heat that kills the cancer cells depends on light absorption by the metal nanoparticles, so more efficient absorption of the light is better," Zhang said. "The hollow gold nanospheres were 50 times more effective than solid gold nanoparticles for light absorption in the near-infrared."
Zhang's group has been exploring other nanostructures that can be synthesized using the same techniques. For example, graduate student Tammy Olson has designed hollow double-nanoshell structures of gold and silver, which show enhanced SERS activities compared to the hollow gold nanospheres.
The ability to tune the optical properties of the hollow nanospheres makes them highly versatile, Zhang said. "It is a unique structure that offers true advantages over other nanostructures, so it has a lot of potential," he said.
Adapted from materials provided by University of California - Santa Cruz, via EurekAlert!, a service of AAAS.


Food Choices Evolve Through Information Overload

Just as information overload leads people to repeatedly choose what they know, the same concept applies to hundreds of animal species, new research shows

Ever been so overwhelmed by a huge restaurant menu that you end up choosing an old favourite instead of trying something new?
Psychologists have long thought that information overload leads to people repeatedly choosing what they know. Now, new research has shown that the same concept applies equally to hundreds of animal species, too.
Researchers from the University of Leeds have used computer modelling to examine the evolution of specialisation, casting light on why some animal species have evolved to eat one particular type of food. For example, some aphids eat garden roses but not other plants that would offer similar nutritional value.
"This is a major leap forward in our understanding of the way in which animals interact with their environment," says lead researcher Dr Colin Tosh from the University's Faculty of Biological Sciences. "Our computer models show the way in which neural networks operate in different environments. They have made it possible for us to see how different species make decisions, based on what's happening – or in this case, which foods are available - around them."
Despite the prevalence of specialisation in the animal kingdom, very little is known about why it occurs. The work conducted at Leeds has provided strong evidence in support of the 'neural limitations' hypothesis put forward by academics in the 1990s. This hypothesis, derived from human psychology, is based on the concept of information overload.
"There are several hypotheses to explain specialisation: one suggests that animals adapt to eat certain foods and this prevents them from eating other types of food," says Dr Tosh.
"For example, cows have evolved flat teeth which allow them to chew grass but they are unable to efficiently process meat. However, the problem with these hypotheses is that they don't apply across the board. Some species – such as many plant eating insects – have evolved to specialise even though there are many other available foods they could eat perfectly well."
This is the first study to provide a realistic representation of neural information processing in animals and of how that processing interacts with the animals' environment. The research team believe that it could also have major implications for predicting the effects of environmental change.
"A good example of a struggling specialist is the giant panda, which relies on high mountain bamboo," says Dr Tosh. "In understanding how neural processes work, we may be able to gain an insight into how future environmental conditions – such as the dying out of particular types of plants - may affect a range of different animal species that utilise them for food."
This research was funded by the Natural Environment Research Council in the UK.
Adapted from materials provided by University of Leeds, via EurekAlert!, a service of AAAS.

Wednesday, March 11, 2009

Genetic Study Finds Treasure Trove Of New Lizards

New species of gecko that was once thought to be Diplodactylus tessellatus.

University of Adelaide research has discovered that there are many more species of Australian lizards than previously thought, raising new questions about conservation and management of Australia's native reptiles.
PhD student Paul Oliver, from the University's School of Earth and Environmental Sciences, has done a detailed genetic study of the Australian gecko genus Diplodactylus and found more than twice the recognised number of gecko species, from 13 species to 29. This study was done in collaboration with the South Australian Museum and Western Australian Museum.

"Many of these species are externally very similar, leading to previous severe underestimation of true species diversity," says Mr Oliver.
"One of the major problems for biodiversity conservation and management is that many species remain undocumented.
"This problem is widely acknowledged to be dire among invertebrates and in developing countries.
"But in this group of vertebrates in a developed nation, which we thought we knew reasonably well, we found more than half the species were unrecognised."
Mr Oliver says this has great significance for conservation. For instance, what was thought to be a single very widespread species of gecko has turned out to be eight or nine separate species with much narrower, more restricted habitats, which may make them much more vulnerable to environmental change, he says.
"This completely changes how we look at conservation management of these species," he says.
"Even at just the basic inventory level, this shows that there is a lot of work still to be done. Vertebrate taxonomy clearly remains far from complete with many species still to be discovered. This will require detailed genetic and morphological work, using integrated data from multiple sources. It will require considerable effort and expense but with potentially rich returns."
The research was supported by grants from the Australia Pacific Science Foundation and the Australian Biological Resources Study.
Adapted from materials provided by University of Adelaide.


Coral Reefs May Start Dissolving When Atmospheric Carbon Dioxide Doubles

Coral reef. If carbon dioxide reaches double pre-industrial levels, coral reefs can be expected to not just stop growing, but also to begin dissolving all over the world.

Rising carbon dioxide in the atmosphere and the resulting effects on ocean water are making it increasingly difficult for coral reefs to grow, say scientists. A study to be published online March 13, 2009 in Geophysical Research Letters by researchers at the Carnegie Institution and the Hebrew University of Jerusalem warns that if carbon dioxide reaches double pre-industrial levels, coral reefs can be expected to not just stop growing, but also to begin dissolving all over the world.
The impact on reefs is a consequence of both ocean acidification caused by the absorption of carbon dioxide into seawater and rising water temperatures. Previous studies have shown that rising carbon dioxide will slow coral growth, but this is the first study to show that coral reefs can be expected to start dissolving just about everywhere in just a few decades, unless carbon dioxide emissions are cut deeply and soon.

"Globally, each second, we dump over 1000 tons of carbon dioxide into the atmosphere and, each second, about 300 tons of that carbon dioxide is going into the oceans," said co-author Ken Caldeira of the Carnegie Institution's Department of Global Ecology, testifying to the U.S. House of Representatives Subcommittee on Insular Affairs, Oceans and Wildlife of the Committee on Natural Resources on February 25, 2009. "We can say with a high degree of certainty that all of this CO2 will make the oceans more acidic – that is simple chemistry taught to freshman college students."
The study was designed to determine the impact of this acidification on coral reefs. The research team, consisting of Jacob Silverman, Caldeira, and Long Cao of the Carnegie Institution as well as Boaz Lazar and Jonathan Erez from The Hebrew University of Jerusalem, used field data from coral reefs to determine the effects of temperature and water chemistry on coral calcification rates. Armed with this information, they plugged the data into a computer model that calculated global seawater temperature and chemistry at different atmospheric levels of CO2 ranging from the pre-industrial value of 280 ppm (parts per million) to 750 ppm. The current atmospheric concentration is over 380 ppm, and is rapidly rising due to human-caused emissions, primarily through the burning of fossil fuels.
Based on the model results for more than 9,000 reef locations, the researchers determined that at the highest concentration studied, 750 ppm, acidification of seawater would reduce calcification rates of three quarters of the world's reefs to less than 20% of pre-industrial rates. Field studies suggest that at such low rates, coral growth would not be able to keep up with dissolution and other natural as well as manmade destructive processes attacking reefs.
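The qualitative shape of this result can be sketched with a toy calculation. This is not the paper's model: the aragonite saturation states (omega) assigned to each CO2 level below are illustrative round numbers of the right order for tropical surface water, and the rate law G = k(omega − 1)^n with n = 1.7 is a commonly used empirical form for calcification, chosen here only to show how falling saturation compounds into a steep drop in growth.

```python
# Illustrative (assumed, not from the study) aragonite saturation
# states for tropical surface seawater at each atmospheric CO2 level.
omega = {280: 4.6, 380: 3.8, 560: 2.9, 750: 2.2}

def calcification(om, k=1.0, n=1.7):
    # Empirical-style rate law G = k * (omega - 1)^n; net growth
    # approaches zero as omega approaches 1 (the dissolution threshold).
    return k * max(om - 1.0, 0.0) ** n

base = calcification(omega[280])  # pre-industrial reference rate
for ppm, om in omega.items():
    pct = 100 * calcification(om) / base
    print(f"{ppm} ppm: omega = {om:.1f}, calcification ~ {pct:.0f}% of pre-industrial")
```

Even with these rough inputs, the toy model drops calcification to a small fraction of the pre-industrial rate at 750 ppm, consistent in spirit with the study's finding that most reefs fall below 20% of pre-industrial rates at that concentration.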
Prospects for reefs are even gloomier when the effects of coral bleaching are included in the model. Coral bleaching refers to the loss of symbiotic algae that are essential for healthy growth of coral colonies. Bleaching is already a widespread problem, and high temperatures are among the factors known to promote bleaching. Using their model, the researchers calculated that under present conditions 30% of reefs have already undergone bleaching and that at CO2 levels of 560 ppm (twice pre-industrial levels) the combined effects of acidification and bleaching will reduce the calcification rates of all the world's reefs by 80% or more. This lowered calcification rate will render all reefs vulnerable to dissolution, without even considering other threats to reefs, such as pollution.
"Our fossil-fueled lifestyle is killing off coral reefs," says Caldeira. "If we don't change our ways soon, in the next few decades we will destroy what took millions of years to create."
"Coral reefs may be the canary in the coal mine," he adds. "Other major pieces of our planet may be similarly threatened because we are using the atmosphere and oceans as dumps for our CO2 pollution. We can save the reefs if we decide to treat our planet with the care it deserves. We need to power our economy with technologies that do not dump carbon dioxide into the atmosphere or oceans."
Adapted from materials provided by Carnegie Institution, via EurekAlert!, a service of AAAS.


Teenage Boys Who Eat Fish At Least Once A Week Achieve Higher Intelligence Scores

New research has found that 15-year-old males who ate fish at least once a week displayed higher cognitive skills at the age of 18 than those who ate it less frequently.

Fifteen-year-old males who ate fish at least once a week displayed higher cognitive skills at the age of 18 than those who ate it less frequently, according to a study of nearly 4,000 teenagers published in the March issue of Acta Paediatrica.
Eating fish once a week was enough to increase combined, verbal and visuospatial intelligence scores by an average of six per cent, while eating fish more than once a week increased them by just under 11 per cent.
Swedish researchers compared the responses of 3,972 males who took part in the survey with the cognitive scores recorded in their Swedish Military Conscription records three years later.

"We found a clear link between frequent fish consumption and higher scores when the teenagers ate fish at least once a week" says Professor Kjell Torén from the Sahlgrenska Academy at the University of Gothenburg, one of the senior scientists involved in the study. "When they ate fish more than once a week the improvement almost doubled.
"These findings are significant because the study was carried out between the ages of 15 and 18 when educational achievements can help to shape the rest of a young man's life."
The research team found that:
  • 58 per cent of the boys who took part in the study ate fish at least once a week and a further 20 per cent ate fish more than once a week.
  • When male teenagers ate fish more than once a week their combined intelligence scores were on average 12 per cent higher than those who ate fish less than once a week. Teenagers who ate fish once a week scored seven per cent higher.
  • The verbal intelligence scores for teenagers who ate fish more than once a week were on average nine per cent higher than those who ate fish less than once a week. Those who ate fish once a week scored four per cent higher.
  • The same pattern was seen in the visuospatial intelligence scores, with teenagers who ate fish more than once a week scoring on average 11 per cent higher than those who ate fish less than once a week. Those who ate fish once a week scored seven per cent higher.
"A number of studies have already shown that fish can help neurodevelopment in infants, reduce the risk of impaired cognitive function from middle age onwards and benefit babies born to women who ate fish during pregnancy" says Professor Torén.

"However we believe that this is the first large-scale study to explore the effect on adolescents."
The exact mechanism that links fish consumption to improved cognitive performance is still not clear.
"The most widely held theory is that it is the long-chain polyunsaturated fatty acids found in fish that have positive effects on cognitive performance" explains Professor Torén.
"Fish contains both omega-3 and omega-6 fatty acids which are known to accumulate in the brain when the foetus is developing. Other theories have been put forward that highlight their vascular and anti-inflammatory properties and their role in suppressing cytokines, chemicals that can affect the immune system."
In order to isolate the effect of fish consumption on the study subjects, the research team looked at a wide range of variables, including ethnicity, where they lived, their parents' educational level, the teenagers' well-being, how frequently they exercised and their weight.
"Having looked very carefully at the wide range of variables explored by this study it was very clear that there was a significant association between regular fish consumption at 15 and improved cognitive performance at 18" concludes lead author Dr Maria Aberg from the Centre for Brain Repair and Rehabilitation at the University of Gothenburg.
"We also found the same association between fish and intelligence in the teenagers regardless of their parents' level of education."
The researchers are now keen to carry out further research to see if the kind of fish consumed - for example lean fish in fish fingers or fatty fish such as salmon - makes any difference to the results.
"But for the time being it appears that including fish in a diet can make a valuable contribution to cognitive performance in male teenagers" says Dr Aberg.
Adapted from materials provided by Wiley-Blackwell, via EurekAlert!, a service of AAAS.


Gray Wolves No Longer To Be Listed As Threatened And Endangered Species In Western Great Lakes, Portion Of Northern Rockies

Two gray wolves.

Secretary of the Interior Ken Salazar has affirmed on March 6 the decision by the U.S. Fish and Wildlife Service to remove gray wolves from the list of threatened and endangered species in the western Great Lakes and the northern Rocky Mountain states of Idaho and Montana and parts of Washington, Oregon and Utah. Wolves will remain a protected species in Wyoming.
“The recovery of the gray wolf throughout significant portions of its historic range is one of the great success stories of the Endangered Species Act,” Salazar said. “When it was listed as endangered in 1974, the wolf had almost disappeared from the continental United States. Today, we have more than 5,500 wolves, including more than 1,600 in the Rockies.”
“The successful recovery of this species is a stunning example of how the Act can work to keep imperiled animals from sliding into extinction,” he said. “The recovery of the wolf has not been the work of the federal government alone. It has been a long and active partnership including states, tribes, landowners, academic researchers, sportsmen and other conservation groups, the Canadian government and many other partners.”

The Fish and Wildlife Service originally announced the decision to delist the wolf in January, but the new administration decided to review the decision as part of an overall regulatory review when it came into office. The Service will now send the delisting regulation to the Federal Register for publication.
The Service decided to delist the wolf in Idaho and Montana because both states have approved wolf management plans in place that will ensure the conservation of the species in the future.
At the same time, the Service determined wolves in Wyoming would still be listed under the Act because Wyoming’s current state law and wolf management plan are not sufficient to conserve its portion of the northern Rocky Mountain wolf population.
Gray wolves were previously listed as endangered in the lower 48 states, except in Minnesota where they were listed as threatened. The Service oversees three separate recovery programs for the gray wolf; each has its own recovery plan and recovery goals based on the unique characteristics of wolf populations in each geographic area.
Wolves in other parts of the 48 states, including the Southwest wolf population, remain endangered and are not affected by the actions taken today.

About Northern Rocky Mountain Wolves
The northern Rocky Mountain Distinct Population Segment includes all of Montana, Idaho and Wyoming, the eastern one-third of Washington and Oregon, and a small part of north-central Utah. The minimum recovery goal for wolves in the northern Rocky Mountains is at least 30 breeding pairs and at least 300 wolves for at least three consecutive years, a goal that was attained in 2002 and has been exceeded every year since. There are currently about 95 breeding pairs and 1,600 wolves in Montana, Idaho, and Wyoming.
The Service believes that with approved state management plans in place in Montana and Idaho, all threats to the wolf population will be sufficiently reduced or eliminated in those states. Montana and Idaho will always manage for more than 15 breeding pairs and 150 wolves per state, and their target population levels are about 400 wolves in Montana and 500 in Idaho.
As a result of a Montana United States District Court decision on July 18, 2008, the Service reexamined Wyoming law, its management plans and implementing regulations. While the Service has approved wolf management plans in Montana and Idaho, it has determined that Wyoming’s state law and wolf management plan are not sufficient to conserve Wyoming’s portion of a recovered northern Rocky Mountain wolf population. Therefore, even though Wyoming is included in the northern Rocky Mountain Distinct Population Segment, the subpopulation of gray wolves in Wyoming is not being removed from protection of the Endangered Species Act.
Continued management under the Endangered Species Act by the Service will ensure that wolves in Wyoming will be conserved. Acting U.S. Fish and Wildlife Service Director Rowan Gould said the Service will continue to work with the State of Wyoming in developing its state regulatory framework so that the state can continue to maintain its share of a recovered northern Rocky Mountain population. Once adequate state regulatory mechanisms are in place, the Service could propose removing the Act’s protections for wolves in Wyoming. National parks and the Wind River Reservation in Wyoming already have adequate regulatory mechanisms in place to conserve wolves. However, at this time, wolves will remain protected as a nonessential, experimental population under the ESA throughout the state, including within the boundaries of the Wind River Reservation and national park and refuge units.

Western Great Lakes Region
The Service’s delisting of the gray wolf also applies to gray wolves in the Western Great Lakes Distinct Population Segment. As a result of another legal ruling, issued by the United States District Court for the District of Columbia on September 29, 2008, the Service reexamined its legal authorization to simultaneously identify and delist a population of wolves in the western Great Lakes. The Service today reissued the delisting decision in order to comply with the Court’s concerns.
The DPS boundary encompasses the states of Minnesota, Wisconsin and Michigan, as well as parts of North Dakota, South Dakota, Iowa, Illinois, Indiana and Ohio. The DPS includes all the areas currently occupied by wolf packs in Minnesota, Michigan, and Wisconsin, as well as nearby areas in these states in which wolf packs may become established in the future. It also includes surrounding areas into which wolves may disperse but are not likely to establish packs.
Rebounding from a few hundred wolves in Minnesota in the 1970s, when the species was listed as endangered, the region’s gray wolf population now numbers about 4,000 and occupies large portions of Wisconsin, Michigan and Minnesota. Wolf numbers in the three states have exceeded the numerical recovery criteria established in the species’ recovery plan for several years. In Minnesota, the population is estimated at 2,922. The estimated wolf population in Wisconsin is a minimum of 537, and about 520 wolves are believed to inhabit Michigan’s Upper Peninsula.
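As a quick sanity check, the per-state estimates quoted above do add up to the regional total. This throwaway sketch uses only the figures given in the article (note that Wisconsin's number is a minimum count):

```python
# Sum the state estimates quoted in the article and compare with the
# "about 4,000" regional total (Wisconsin's figure is a minimum).
state_estimates = {"Minnesota": 2922, "Wisconsin": 537, "Michigan": 520}
total = sum(state_estimates.values())
print(total)  # 3979 -- consistent with "about 4,000" wolves
```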
The Michigan, Minnesota, and Wisconsin Departments of Natural Resources have developed plans to guide wolf management actions in the future. The Service has determined that these plans establish a sufficient basis for long-term wolf management. They address issues such as protective regulations, control of problem animals, possible hunting and trapping seasons, and the long-term health of the wolf population, and will be governed by the appropriate state or tribe.
The Service will monitor the delisted wolf populations for a minimum of five years to ensure that they continue to sustain their recovery. At the end of the monitoring period, the Service will decide if relisting, continued monitoring or ending Service monitoring is appropriate.
Adapted from materials provided by U.S. Department of the Interior.


Big-hearted Fish Reveals Genetics Of Cardiovascular Condition

Enlarged heart of a 48-hour-post-fertilization zebrafish embryo lacking the gene for ccm2. Endothelial cell nuclei are shown in red; the junctions between them in green.

Researchers at the University of Pennsylvania School of Medicine have unlocked the mystery of a puzzling human disease and gained insight into cardiovascular development, all thanks to a big-hearted fish.
Mark Kahn, MD, Associate Professor of Medicine, graduate student Benjamin Kleaveland, and colleagues report in the February issue of Nature Medicine that a human vascular condition called Cerebral Cavernous Malformation (CCM) is caused by leaky junctions between cells in the lining of blood vessels. By combining studies with zebrafish and mice, the researchers found that the aberrant junctions are the result of mutated or missing proteins in a novel biochemical process, the so-called Heart-of-glass (HEG)-CCM pathway.
The HEG-CCM pathway "is essential to regulate endothelial cell-cell interaction, both during the time that vertebrates make the cardiovascular system and later in life," says Kahn. "Its loss later in life confers this previously unexplained disease, cerebral cavernous malformation."
CCM proteins, along with the receptor HEG, are responsible for building properly formed blood and lymphatic vessels during embryonic development by sealing the cell-cell junctions in the walls of vessels; loss of any of these proteins disrupts those seals, causing leaky vasculature.

Cerebral Cavernous Malformations are abnormal clusters of leaky blood vessels, typically in the brain, which can cause both seizures and strokes. The condition affects about 1 in 1,000 people, about 20% of whom carry a genetic predisposition for the condition. Researchers had already identified the genes responsible for the disease – indeed, they were named CCM1, CCM2, and CCM3 in recognition of that fact – but not what those genes did.
That's where the big-hearted fish come in. Several years ago, another research team discovered that mutations in CCM1, CCM2, or HEG (which had not previously been linked to CCM) caused zebrafish to develop enlarged hearts. Sensing that this observation could help unlock the mystery of what CCM proteins do, Kleaveland decided to see if these results could be extended to mice.
"Our notion was to take the zebrafish developmental studies and use the mouse as a way of bridging between what appeared to be a role in heart development in fish and blood vessel disease in people," says Kahn.
Kleaveland genetically engineered mice that both completely lack the HEG protein and produce diminished amounts of CCM2. This combination of genetic defects is fatal for the mice; they die during embryonic development. But examination of their cardiovascular system, as well as that of genetically altered fish, revealed several key findings, Kleaveland says.
First, loss of HEG produces cardiovascular defects—mainly leakiness—in the heart, in blood vessels in the lung, and in the lymphatic system. Second, loss of HEG with partial loss of CCM2 produces a worse cardiovascular defect—failure to even form critical blood vessels. Third, all of these defects are characterized by malformed cell-cell junctions in the endothelial cells that line these organs. And finally, HEG actually physically interacts with CCM proteins.
"It looks like the disease is a reflection of a disruption in endothelial cell-cell junctions, and this pathway is required to regulate them," Kahn says.
These data underscore the evolutionary significance of the biochemical process underlying CCM. "With millions of years of evolution between fish and mammals, genes typically acquire new roles and lose old roles," Kahn explains. "When things are that conserved, it just tends to mean that it's a highly important and central process, and it probably also tells us that whatever it's doing is fundamental to blood vessels and the whole cardiovascular system."
The study, Kahn adds, addresses a debate in the field as to whether the defects that cause CCM reside in the affected endothelial cells themselves or in the cells that surround them, such as neurons in the brain.
"We think the developmental model has shown us that the requirement is in the endothelial cell," he says.
Now Kahn, Kleaveland, and their colleagues are working to determine just what it is that HEG is doing in endothelial cell-cell junctions – what proteins it "talks" to on adjacent endothelial cells – and also, to build a true mouse model of the CCM disease.
The mice in this study died in utero, but CCM disease tends to affect humans in their 30s and older. With a good model, however, "you could watch the progression of it, and you could try to change that progression, essentially to treat a mouse," Kleaveland says.
The research was funded by the National Institutes of Health, the Swiss National Science Foundation, and the European Community, and involved researchers from the University of California, San Diego, Columbia University Medical Center, New York, and the University of Basel, Switzerland.
Adapted from materials provided by University of Pennsylvania School of Medicine.


Inactivity Of Proteins Behind Longer Shelf Life When Freezing

Frozen biological material, for example food, can be kept for a long time without perishing. A new study is close to providing answers as to why.

Frozen biological material, for example food, can be kept for a long time without perishing. A study by researchers at the University of Gothenburg, Sweden, is close to providing answers as to why.
A cell's proteins are programmed to carry out various biological functions. A protein's level of activity, and its ability to successfully carry out these functions, depends on the amount of water surrounding it. Dry proteins, for example, are completely inactive. A critical amount of water is required for function to get going, after which the protein's activity increases as the amount of water increases. Proteins achieve full biological activity when the surrounding water weighs approximately as much as the protein itself.
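The hydration rule of thumb described above can be sketched as a toy function. The full-activity point at a water-to-protein mass ratio of roughly 1 comes from the article; the 0.2 critical-hydration threshold is purely an illustrative assumption, since the article gives no number for it:

```python
# Toy model of protein activity versus hydration ratio h = m_water / m_protein.
# h >= 1.0 ("water weighs about as much as the protein") is from the article;
# the 0.2 critical threshold below is an assumed placeholder value.
def activity_regime(m_water_mg: float, m_protein_mg: float) -> str:
    h = m_water_mg / m_protein_mg  # hydration ratio
    if h < 0.2:        # assumed critical hydration needed for any activity
        return "inactive (dry)"
    if h < 1.0:        # activity rises with increasing water content
        return "partially active"
    return "fully active"

print(activity_regime(0.05, 1.0))  # inactive (dry)
print(activity_regime(0.5, 1.0))   # partially active
print(activity_regime(1.0, 1.0))   # fully active
```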

Researchers at the University of Gothenburg and Chalmers University of Technology have, together with a group of American researchers, used advanced experimental techniques to study how movements in the water that surrounds a protein cause movements in the protein itself. The study, which is being published in the journal PNAS, indicates that the dynamics in the surrounding water have a direct effect on the protein's dynamics, which, in turn, should affect its activity.
The results explain, for example, why biological material such as foodstuffs or research material can be stored at low temperatures for a long period of time without perishing.
"When the global movements in the surrounding water freeze, then significant movements within the protein also come to a stop. This results in the protein being preserved in a state of minimum energy and biological activity comes to a stop," says researcher Helén Jansson at the Swedish NMR Centre, University of Gothenburg, Sweden.
Adapted from materials provided by University of Gothenburg.


Rising Sea Levels Set To Have Major Impacts Around The World

Houses along the Mexican Riviera in Cabo San Lucas. The impacts of sea level rise - even in the lower ranges of the current predictions - look to be severe. Approximately ten percent of the world's population - 600 million people - live in low-lying areas in danger of being flooded.

Research presented March 10 at the International Scientific Congress on Climate Change in Copenhagen shows that the upper range of sea level rise by 2100 could be about one meter, or possibly more. At the lower end of the spectrum, it looks increasingly unlikely that the rise will be much less than 50 cm by 2100.
This means that if emissions of greenhouse gases are not reduced quickly and substantially, even the best-case scenario will hit hard the low-lying coastal areas that house one in ten people on the planet.
Dr John Church of the Centre for Australian Weather and Climate Research, Hobart, Tasmania, Australia and the lead speaker in the sea level session, told the conference, "The most recent satellite and ground-based observations show that sea level has continued to rise at 3 mm/yr or more since 1993, a rate well above the 20th century average. The oceans are continuing to warm and expand, the melting of mountain glaciers has increased, and the ice sheets of Greenland and Antarctica are also contributing to sea level rise."

New insights reported include the loss of ice from the Antarctic and Greenland Ice Sheets. "The ice loss in Greenland has accelerated over the last decade. The upper range of sea level rise by 2100 might be above 1m or more on a global average, with large regional differences depending where the source of ice loss occurs", says Konrad Steffen, Director of the Cooperative Institute for Research in Environmental Sciences (CIRES) at the University of Colorado, Boulder and co-chair of the congress session on sea level rise.
The last assessment report from the IPCC, from 2007, projected a sea level rise of 18 to 59 centimeters. However, the report also clearly stated that not all factors contributing to sea level rise could be calculated at that time. The uncertainty was centered on the ice sheets: how they react to the effects of a warmer climate and how they interact with the oceans, explains Eric Rignot, Professor of Earth System Science at the University of California Irvine and Senior Research Scientist at NASA's Jet Propulsion Laboratory.
"The numbers from the last IPCC are a lower bound because it was recognized at the time that there was a lot of uncertainty about ice sheets. The numerical models used at the time did not have a complete representation of outlet glaciers and their interactions with the ocean. The results gathered in the last 2-3 years show that these are fundamental aspects that cannot be overlooked. As a result of the acceleration of outlet glaciers over large regions, the ice sheets in Greenland and Antarctica are already contributing more and faster to sea level rise than anticipated. If this trend continues, we are likely to witness sea level rise of one meter or more by the year 2100", he says.
"Unless we undertake urgent and significant mitigation actions, the climate could cross a threshold during the 21st century committing the world to a sea level rise of metres", said John Church.
"Measurements around the world show that sea level has risen almost 20 centimeters since 1880," explains Professor Stefan Rahmstorf of the Potsdam Institute for Climate Impact Research, who will give the plenary speech on sea level rise at the congress. These data also reveal that the rate of sea level rise is closely linked to temperature: sea level rises faster the warmer it gets. "If sea level keeps rising at a constant pace, we will end up in the middle of that 18-59 cm IPCC range by 2100," says Rahmstorf. "But based on past experience I expect that sea level rise will accelerate as the planet gets hotter."
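Rahmstorf's constant-pace scenario is easy to check with back-of-envelope arithmetic. The following sketch uses the 3 mm/yr rate reported above; the choice of 2009 as the extrapolation start year is an assumption, and real projections must also model acceleration:

```python
# Linear extrapolation of sea level rise at the observed rate
# (illustrative only; acceleration would raise the total).
RATE_MM_PER_YEAR = 3.0   # observed rate since 1993, per the article
START_YEAR = 2009        # assumed start of the extrapolation
END_YEAR = 2100

rise_cm = RATE_MM_PER_YEAR * (END_YEAR - START_YEAR) / 10.0
print(f"Constant-rate rise by {END_YEAR}: {rise_cm:.1f} cm")
# 27.3 cm -- inside the IPCC's 18-59 cm range; any acceleration
# pushes the total toward or beyond the upper end.
```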
The impacts of sea level rise - even in the lower ranges of the current predictions - look to be severe. Approximately ten percent of the world's population - 600 million people - live in low-lying areas in danger of being flooded. A previously released study led by John Church shows that even a modest sea level rise of 50 centimeters will result in a major increase in the number of coastal flooding events.
"Our study centered on Australia showed that coastal flooding events that today we expect only once every hundred years will happen several times a year by 2100", says John Church.
John Church also brings new results on current sea level rise to the congress: "Sea level is currently rising at a rate that is above any of the model projections of 18 to 59 cm".
"Different groups may come to slightly different projections, but differences in the details of the projections should not cloud the overall picture where even the lower end of the projections looks to have very serious effects," says Konrad Steffen.

1. The rising tide: assessing the risks of climate change and human settlements in low elevation coastal zones. Gordon McGranahan, Deborah Balk, and Bridget Anderson; Environment and Urbanization, Apr 2007; vol. 19: pp. 17-37.
Adapted from materials provided by University of Copenhagen.

