Climate change and air pollution will combine to curb food supplies

Many studies have shown the potential for global climate change to cut food supplies. But these studies have, for the most part, ignored the interactions between increasing temperature and air pollution, specifically ozone pollution, which is known to damage crops.

A new study involving researchers at MIT shows that these interactions can be quite significant, suggesting that policymakers need to take both warming and air pollution into account in addressing food security.

The study looked in detail at global production of four leading food crops (rice, wheat, corn, and soy) that account for more than half the calories humans consume worldwide. It predicts that effects will vary considerably from region to region, and that some of the crops are much more strongly affected by one or the other of the factors: for example, wheat is very sensitive to ozone exposure, while corn is much more adversely affected by heat.

The research was carried out by Colette Heald, an associate professor of civil and environmental engineering (CEE) at MIT, former CEE postdoc Amos Tai, and Maria van Martin at Colorado State University. Their work is described this week in the journal Nature Climate Change.

Heald explains that while it’s known that both higher temperatures and ozone pollution can damage plants and reduce crop yields, “nobody has looked at these together.” And while rising temperatures are widely discussed, the impact of air quality on crops is less recognized.

The effects are likely to vary widely by region, the study predicts. In the United States, tougher air-quality regulations are expected to lead to a sharp decline in ozone pollution, mitigating its impact on crops. But in other regions, the outcome “will depend on domestic air-pollution policies,” Heald says. “An air-quality cleanup would improve crop yields.”

Overall, with all other factors being equal, warming may reduce crop yields globally by about 10 percent by 2050, the study found. But the effects of ozone pollution are more complex, since some crops are more strongly affected by it than others, which suggests that pollution-control measures could play a major role in determining outcomes.

Ozone pollution can also be tricky to identify, Heald says, because its damage can resemble other plant illnesses, producing flecks on leaves and discoloration.

Potential reductions in crop yields are worrisome: The world is expected to need about 50 percent more food by 2050, the authors say, due to population growth and changing dietary trends in the developing world. So any yield reductions come against a backdrop of an overall need to increase production significantly through improved crop selections and farming methods, as well as expansion of farmland.

While heat and ozone can each damage plants independently, the factors also interact. For example, warmer temperatures significantly increase production of ozone from the reactions, in sunlight, of volatile organic compounds and nitrogen oxides. …

Read more

Diet or exercise? ‘Energy balance’ real key to disease prevention

A majority of Americans are overweight or obese, a factor in the rapid rise in common diseases like diabetes, heart disease, cancer, high blood pressure and more. According to a paper published collaboratively in this month’s issues of the official journals of both the American College of Sports Medicine (ACSM) and the Academy of Nutrition and Dietetics, energy balance is a viable public health solution to address the obesity epidemic. The paper outlines steps to incorporate energy balance principles into public health outreach in the U.S.

“It is time we collectively move beyond debating nutrition or exercise and focus on nutrition and exercise,” said co-author and ACSM member Melinda Manore, Ph.D., R.D., C.S.S.D., FACSM, of Oregon State University. “Nutrition and exercise professionals working collaboratively, combined with effective public health messaging about the importance of energy balance, can help America shape up and become healthier.”

The paper, published in the July edition of Medicine & Science in Sports & Exercise and in the Journal of the Academy of Nutrition and Dietetics, gives the following recommendations:

• Integrate energy balance into curriculum and training for both exercise science and nutrition professionals, and strengthen collaborative efforts between them
• Develop competencies for school and physical education teachers and position them as energy balance advocates
• Develop core standards for schools that integrate the dynamic energy balance approach
• Work with federally funded nutrition programs like the Cooperative Extension Service and school lunch programs to incorporate energy balance solutions
• Develop messaging and promotional strategies about energy balance that American consumers can understand and apply to their lifestyle
• Map out and support existing programs that emphasize energy balance

“Our health professionals are currently working in silos and must work together to educate and promote energy balance as the key to better health,” said Manore. “The obesity crisis is one of the greatest public health challenges of our generation. Energy balance can help us work toward a solution so our children aren’t saddled with the same health challenges we currently face.”

The paper is an outcome of the October 2012 expert panel meeting titled “Energy Balance at the Crossroads: Translating the Science into Action,” hosted by ACSM, the Academy of Nutrition and Dietetics and the US Department of Agriculture (USDA)/Agricultural Research Service.

Story Source: The above story is based on materials provided by the American College of Sports Medicine (ACSM). Note: Materials may be edited for content and length.

Read more

Six new genetic risk factors for Parkinson’s found

Using data from over 18,000 patients, scientists have identified more than two dozen genetic risk factors involved in Parkinson’s disease, including six that had not been previously reported. The study, published in Nature Genetics, was partially funded by the National Institutes of Health (NIH) and led by scientists working in NIH laboratories.

“Unraveling the genetic underpinnings of Parkinson’s is vital to understanding the multiple mechanisms involved in this complex disease, and hopefully, may one day lead to effective therapies,” said Andrew Singleton, Ph.D., a scientist at the NIH’s National Institute on Aging (NIA) and senior author of the study.

Dr. Singleton and his colleagues collected and combined data from existing genome-wide association studies (GWAS), which allow scientists to find common variants, or subtle differences, in the genetic codes of large groups of individuals. The combined data included approximately 13,708 Parkinson’s disease cases and 95,282 controls, all of European ancestry.

The investigators identified potential genetic risk variants, which increase the chances that a person may develop Parkinson’s disease. Their results suggested that the more variants a person has, the greater the risk of developing the disorder, in some cases up to three times higher.

“The study brought together a large international group of investigators from both public and private institutions who were interested in sharing data to accelerate the discovery of genetic risk factors for Parkinson’s disease,” said Margaret Sutherland, Ph.D., a program director at the National Institute of Neurological Disorders and Stroke (NINDS), part of NIH. “The advantage of this collaborative approach is highlighted in the identification of pathways and gene networks that may significantly increase our understanding of Parkinson’s disease.”

To obtain the data, the researchers collaborated with multiple public and private organizations, including the U.S. Department of Defense, the Michael J. Fox Foundation, 23andMe and many international investigators.

Affecting millions of people worldwide, Parkinson’s disease is a degenerative disorder that causes movement problems, including trembling of the hands, arms, or legs, stiffness of limbs and trunk, slowed movements and problems with posture. Over time, patients may have difficulty walking, talking, or completing other simple tasks. Although nine genes have been shown to cause rare forms of Parkinson’s disease, scientists continue to search for genetic risk factors to provide a complete genetic picture of the disorder.

The researchers confirmed the results in another sample of subjects, including 5,353 patients and 5,551 controls. …

Read more

Genome analysis helps in breeding more robust cows

Genome analysis of 234 bulls has put researchers, including several from Wageningen Livestock Research, on the trail of DNA variants which influence particular characteristics in breeding bulls. For example, two variants have proven responsible for disruptions to the development of embryos and for curly hair, which is disadvantageous because more ticks and parasites occur in curly hair than in short, straight hair. These are the first results of the large 1000 Bull Genomes project on which some 30 international researchers are collaborating. They report on their research in the most recent edition of the science journal Nature Genetics.

Most breeding characteristics are influenced by not one but a multiplicity of variants. It is therefore important to be able to use all the variants in breeding, say the Wageningen researchers. In order to make this possible, Rianne van Binsbergen, PhD researcher at the Animal Breeding and Genomics Centre of Wageningen UR, investigated whether the genomes of all the common bulls in the Netherlands can be filled in with the help of these 234 bulls. Currently, these bulls have been genotyped with markers of 50,000 or 700,000 DNA variants. The positive results indicate the direction for further research into the practical use of genome information in breeding.

Dairy and beef cattle

The project demonstrates how useful large-scale DNA analyses can be, says Professor Roel Veerkamp, Professor of Numerical Genetics at Wageningen University and board member of the 1000 Bull Genomes project. He emphasises that the requirements for dairy and beef cattle are becoming ever more exacting: “Until the mid-nineties, a cow primarily had to produce a lot of milk. But since then, expectations have gone up. …

Read more

Soccer-related facial fractures examined

Fractures of the nose and other facial bones are a relatively common and potentially serious injury in soccer players, reports a Brazilian study in Plastic and Reconstructive Surgery — Global Open, the official open-access medical journal of the American Society of Plastic Surgeons (ASPS).

On the eve of the 2014 World Cup, a group of Brazilian plastic surgeons review their experience with soccer-related facial fractures requiring surgery. Dr. Dov Charles Goldenberg, MD, PhD, of the University of São Paulo and colleagues write, “Due to exposure and the lack of protection for the face, the occasional maxillofacial trauma sustained during soccer games often entails serious facial injuries requiring hospital admissions and invasive procedures.”

Soccer Players at Risk of Nasal and Other Facial Fractures

The researchers assembled data on 45 patients undergoing surgical treatment for soccer-related facial fractures at two large university hospital centers in São Paulo between 2000 and 2013. The 45 soccer injuries accounted for two percent of surgically treated facial fractures during that time. Forty-four of the patients were male; the average age was 28 years. All of the injured players were amateurs.

The nose and upper jaw (maxilla) accounted for 35 percent of fractures and the cheekbone (zygomatic bone) for another 35 percent. Most of the remaining fractures were of the lower jaw (mandible) and eye socket (orbit). Eighty-seven percent of the injuries were caused by collision with another player; the rest occurred when the player was struck by the ball.

Nasal fractures were treated by repositioning (reducing) the fractured bones to their proper place and splinting until they healed. Other types of facial fractures required open surgery and internal fixation (plates, screws) to reposition the bones. The patients remained in the hospital for about five days on average, and were told they could return to play after six to eight weeks of healing.

Emphasis on Awareness and Examination to Detect Soccer-Related Fractures

The results are consistent with previous studies of soccer-related facial injuries. …

Read more

Slow walking speed, memory complaints can predict dementia

A study involving nearly 27,000 older adults on five continents found that nearly 1 in 10 met criteria for pre-dementia based on a simple test that measures how fast people walk and whether they have cognitive complaints. People who tested positive for pre-dementia were twice as likely as others to develop dementia within 12 years. The study, led by scientists at Albert Einstein College of Medicine of Yeshiva University and Montefiore Medical Center, was published online on July 16, 2014 in Neurology, the medical journal of the American Academy of Neurology.

The new test diagnoses motoric cognitive risk syndrome (MCR). Testing for the newly described syndrome relies on measuring gait speed (our manner of walking) and asking a few simple questions about a patient’s cognitive abilities, both of which take just seconds. The test is not reliant on the latest medical technology and can be done in a clinical setting, diagnosing people in the early stages of the dementia process. Early diagnosis is critical because it allows time to identify and possibly treat the underlying causes of the disease, which may delay or even prevent the onset of dementia in some cases.

“In many clinical and community settings, people don’t have access to the sophisticated tests — biomarker assays, cognitive tests or neuroimaging studies — used to diagnose people at risk for developing dementia,” said Joe Verghese, M.B.B.S., professor in the Saul R. Korey Department of Neurology and of medicine at Einstein, chief of geriatrics at Einstein and Montefiore, and senior author of the Neurology paper. “Our assessment method could enable many more people to learn if they’re at risk for dementia, since it avoids the need for complex testing and doesn’t require that the test be administered by a neurologist. The potential payoff could be tremendous — not only for individuals and their families, but also in terms of healthcare savings for society. All that’s needed to assess MCR is a stopwatch and a few questions, so primary care physicians could easily incorporate it into examinations of their older patients.”

The U.S. …
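The screening described above is simple enough to sketch in a few lines of code. Below is a minimal illustration in Python; the 0.8 m/s gait-speed cutoff is a hypothetical placeholder chosen for the example, not the threshold used in the study, and a real MCR assessment applies clinical judgment and appropriate norms rather than a single fixed number.

```python
# Toy sketch of the stopwatch-plus-questions idea behind the MCR screen.
# The cutoff below is illustrative only, not taken from the study.

def gait_speed(distance_m: float, time_s: float) -> float:
    """Walking speed in meters per second over a timed walk."""
    return distance_m / time_s

def possible_mcr(distance_m: float, time_s: float,
                 cognitive_complaint: bool,
                 slow_gait_cutoff_m_per_s: float = 0.8) -> bool:
    """Flag a possible motoric cognitive risk (MCR) pattern: slow gait
    plus a self-reported cognitive complaint. A screen, not a diagnosis."""
    return gait_speed(distance_m, time_s) < slow_gait_cutoff_m_per_s and cognitive_complaint

# A 4-meter walk in 6 seconds (~0.67 m/s) plus a memory complaint is flagged;
# the same complaint with a brisker walk (~1.14 m/s) is not.
print(possible_mcr(4.0, 6.0, cognitive_complaint=True))   # True
print(possible_mcr(4.0, 3.5, cognitive_complaint=True))   # False
```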

Read more

Climate Change Increases Risk of Crop Slowdown in Next 20 Years

The world faces a small but substantially increased risk over the next two decades of a major slowdown in the growth of global crop yields because of climate change, new research finds.

The authors, from Stanford University and the National Center for Atmospheric Research (NCAR), say the odds of a major production slowdown of wheat and corn even with a warming climate are not very high. But the risk is about 20 times more significant than it would be without global warming, and it may require planning by organizations that are affected by international food availability and price.

“Climate change has substantially increased the prospect that crop production will fail to keep up with rising demand in the next 20 years,” said NCAR scientist Claudia Tebaldi, a co-author of the study.

Stanford professor David Lobell said he wanted to study the potential impact of climate change on agriculture in the next two decades because of questions he has received from stakeholders and decision makers in governments and the private sector.

“I’m often asked whether climate change will threaten food supply, as if it’s a simple yes or no answer,” Lobell said. “The truth is that over a 10- or 20-year period, it depends largely on how fast Earth warms, and we can’t predict the pace of warming very precisely. So the best we can do is try to determine the odds.”

Lobell and Tebaldi used computer models of global climate, as well as data about weather and crops, to calculate the chances that climatic trends would have a negative effect of 10 percent on yields of corn and wheat in the next 20 years. This would have a major impact on food supply. Yields would continue to increase, but the slowdown would effectively cut the projected rate of increase by about half at the same time that demand is projected to grow sharply.

They found that the likelihood of natural climate shifts causing such a slowdown over the next 20 years is only 1 in 200. But when the authors accounted for human-induced global warming, they found that the odds jumped to 1 in 10 for corn and 1 in 20 for wheat.

The study appears in this month’s issue of Environmental Research Letters. It was funded by the National Science Foundation (NSF), which is NCAR’s sponsor, and by the U.S. Department of Energy (DOE).

More crops needed worldwide

Global yields of crops such as corn and wheat have typically increased by about 1-2 percent per year in recent decades, and the U.N. Food and Agriculture Organization projects that global production of major crops will increase by 13 percent per decade through 2030, likely the fastest rate of increase during the coming century. …
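The “about 20 times” figure can be checked directly from the odds quoted above. A quick arithmetic sketch in Python, using only the probabilities reported in the article:

```python
# Probabilities quoted above for a major (roughly 10 percent) climate-driven
# yield slowdown over the next 20 years.
p_natural = 1 / 200        # natural climate shifts alone
p_corn = 1 / 10            # accounting for human-induced warming (corn)
p_wheat = 1 / 20           # accounting for human-induced warming (wheat)

print(p_corn / p_natural)   # 20.0 -> the "about 20 times" headline figure
print(p_wheat / p_natural)  # 10.0 -> roughly a tenfold increase for wheat
```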

Read more

Researchers work to save endangered New England cottontail

Scientists with the NH Agricultural Experiment Station are working to restore New Hampshire and Maine’s only native rabbit after new research based on genetic monitoring has found that in the last decade, cottontail populations in northern New England have become more isolated and seen a 50 percent contraction of their range.

The endangered New England cottontail is now at risk of becoming extinct in the region, according to NH Agricultural Experiment Station researchers at the University of New Hampshire College of Life Sciences and Agriculture, who believe that restoring habitats is the key to saving the species.

“The New England cottontail is a species of great conservation concern in the Northeast. This is our only native rabbit and is an integral component of the native New England wildlife. Maintaining biodiversity gives resilience to our landscape and ecosystems,” said NHAES researcher Adrienne Kovach, research associate professor of natural resources at UNH.

New England cottontails have been declining for decades. However, NHAES researchers have found that in the last decade, the New England cottontail population in New Hampshire and Maine has contracted by 50 percent; a decade ago, cottontails were found as far north as Cumberland, Maine.

The majority of research on New England cottontails has come out of UNH, much of it under the leadership of John Litvaitis, professor of wildlife ecology, who has studied the New England cottontail for three decades. Kovach’s research expands on this knowledge by using DNA analysis to provide new information on the cottontail’s status, distribution, genetic diversity, and dispersal ecology.

The greatest threat and cause of the decline of the New England cottontail is the reduction and fragmentation of their habitat, Kovach said. Fragmentation of habitats occurs when the cottontail’s habitat is reduced or eliminated due to the maturing of forests or land development. Habitats also can become fragmented by roads or natural landscape features, such as bodies of water.

“Cottontails require thicketed habitats, which progress from old fields to young forests. Once you have a more mature forest, the cottontail habitat is reduced. A lot of other species rely on these thicket habitats, including bobcats, birds, and reptiles. Many thicket-dependent species are on decline, and the New England cottontail is a representative species for this kind of habitat and its conservation,” Kovach said.

Kovach explained that for cottontail and most animal populations to be healthy and grow, it is important for adult animals to leave the place where they were born and relocate to a new habitat, which is known as dispersal. …

Read more

Changes in agriculture increase high river flow rates

Just as a leaky roof can make a house cooler and wetter when it’s raining as well as hotter and dryer when it’s sunny, changes in land use can affect river flow in both rainy and dry times, say two University of Iowa researchers.

While it may be obvious that changes in river water discharge across the U.S. Midwest can be related to changes in rainfall and agricultural land use, it is important to learn how these two factors interact in order to get a better understanding of what the future may look like, says Gabriele Villarini, UI assistant professor of civil and environmental engineering, assistant research engineer at IIHR — Hydroscience & Engineering and lead author of a published research paper on the subject.

“We wanted to know what the relative impacts of precipitation and agricultural practices played in shaping the discharge record that we see today,” he says. “Is it an either/or answer or a much more nuanced one?”

“By understanding our past we are better positioned in making meaningful statements about our future,” he says.

The potential benefits of understanding river flow are especially great in the central United States, particularly Iowa, where spring and summer floods have hit the area in 1993, 2008, 2013 and 2014, interrupted by the drought of 2012. Large economic damage and even loss of life have resulted, says co-author Aaron Strong, UI assistant professor in the Department of Urban and Regional Planning and with the Environmental Policy Program at the UI Public Policy Center.

“What is interesting to note,” says Strong, “is that the impacts, in terms of flooding, have been exacerbated. At the same time, the impacts of drought, for in-stream flow, have been mitigated with the changes in land use composition that we have seen over the last century.”

In order to study the effect of changes in agricultural practices on Midwest river discharge, the researchers focused on Iowa’s Raccoon River at Van Meter, Iowa. The 9,000-square-kilometer watershed has the advantage of having had its water discharge levels measured and recorded daily for most of the 20th century right on up to the present day. (The study focused on the period 1927-2012.) During that period, the number of acres used for corn and soybean production greatly increased, roughly doubling over the course of the 20th century.

Not surprisingly, they found that variability in rainfall is responsible for most of the changes in water discharge volumes. However, the water discharge rates also varied with changes in agricultural practices, as defined by soybean and corn harvested acreage in the Raccoon River watershed. In times of flood and in times of drought, water flow rates were exacerbated by more or less agriculture, respectively.

The authors suggest that although flood conditions may be exacerbated by increases in agricultural production, this concern “must all be balanced by the private concerns of increased revenue from agricultural production through increased cultivation.”

“Our results suggest that changes in agricultural practices over this watershed — with increasing acreage planted in corn and soybeans over time — translated into a seven-fold increase in rainfall contribution to the average annual maximum discharge when we compare the present to the 1930s,” Villarini says.

The UI research paper, “Roles of climate and agricultural practices in discharge changes in an agricultural watershed in Iowa,” can be found in the April 15 online edition of Agriculture, Ecosystems & Environment.

Story Source: The above story is based on materials provided by University of Iowa. …
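For readers curious how one might begin to separate the “relative impacts” of precipitation and agricultural practices on discharge, the sketch below shows a bare-bones regression of annual maximum discharge on rainfall and harvested row-crop acreage. It is not the authors’ model; the file and column names are hypothetical placeholders, and a published attribution analysis would handle trends, seasonality and nonlinearity far more carefully.

```python
import numpy as np
import pandas as pd

# Hypothetical annual series for a watershed: one row per year with the annual
# maximum discharge, total precipitation, and corn+soybean harvested acreage.
# File and column names are placeholders, not the study's data.
df = pd.read_csv("raccoon_river_annual.csv")  # columns: year, max_q, precip, row_crop_acres

X = np.column_stack([
    np.ones(len(df)),                  # intercept
    df["precip"].to_numpy(),           # precipitation term
    df["row_crop_acres"].to_numpy(),   # agricultural land-use term
])
y = df["max_q"].to_numpy()

# Ordinary least squares: max_q ~ b0 + b1*precip + b2*acres
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b_precip, b_acres = coef
print(f"precip coefficient: {b_precip:.3f}, acreage coefficient: {b_acres:.3f}")
```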

Read more

Experiences at every stage of life contribute to cognitive abilities in old age

Early life experiences, such as childhood socioeconomic status and literacy, may have greater influence on the risk of cognitive impairment late in life than such demographic characteristics as race and ethnicity, a large study by researchers with the UC Davis Alzheimer’s Disease Center and the University of Victoria, Canada, has found.

“Declining cognitive function in older adults is a major personal and public health concern,” said Bruce Reed, professor of neurology and associate director of the UC Davis Alzheimer’s Disease Center. “But not all people lose cognitive function, and understanding the remarkable variability in cognitive trajectories as people age is of critical importance for prevention, treatment and planning to promote successful cognitive aging and minimize problems associated with cognitive decline.”

The study, “Life Experiences and Demographic Influences on Cognitive Function in Older Adults,” is published online in Neuropsychology, a journal of the American Psychological Association. It is one of the first comprehensive examinations of the multiple influences of varied demographic factors early in life and their relationship to cognitive aging.

The research was conducted in a group of over 300 diverse men and women who spoke either English or Spanish. They were recruited from senior citizen social, recreational and residential centers, as well as churches and health-care settings. At the time of recruitment, all study participants were 60 or older and had no major psychiatric illnesses or life-threatening medical illnesses. Participants were Caucasian, African-American or Hispanic.

The extensive testing included multidisciplinary diagnostic evaluations through the UC Davis Alzheimer’s Disease Center in either English or Spanish, which permitted comparisons across a diverse cohort of participants.

Consistent with previous research, the study found that non-Latino Caucasians scored 20 to 25 percent higher on tests of semantic memory (general knowledge) and 13 to 15 percent higher on tests of executive functioning compared to the other ethnic groups. However, ethnic differences in executive functioning disappeared and differences in semantic memory were reduced by 20 to 30 percent when group differences in childhood socioeconomic status, adult literacy and extent of physical activity during adulthood were considered.

“This study is unusual in that it examines how many different life experiences affect cognitive decline in late life,” said Dan Mungas, professor of neurology and associate director of the UC Davis Alzheimer’s Disease Research Center. “It shows that variables like ethnicity and years of education that influence cognitive test scores in a single evaluation are not associated with rate of cognitive decline, but that specific life experiences like level of reading attainment and intellectually stimulating activities are predictive of the rate of late-life cognitive decline. This suggests that intellectual stimulation throughout the life span can reduce cognitive decline in old age.”

Regardless of ethnicity, advanced age and apolipoprotein E (APOE) genotype were associated with increased cognitive decline over an average of four years that participants were followed. APOE is the largest known genetic risk factor for late-onset Alzheimer’s. Less decline was experienced by persons who reported more engagement in recreational activities in late life and who maintained their levels of activity engagement from middle age to old age.
Single-word reading — the ability to decode a word on sight, which often is considered an indication of quality of educational experience — was also associated with less cognitive decline, a finding that was true for both English and Spanish readers, irrespective of their race or ethnicity. …

Read more

‘Lost in translation’ issues in Chinese medicine addressed by researchers

Millions of people in the West today utilize traditional Chinese medicine, including acupuncture, herbs, massage and nutritional therapies. Yet only a few U.S. schools that teach Chinese medicine require Chinese-language training, and only a handful of Chinese medical texts have so far been translated into English.

Given the complexity of the language and concepts in these texts, there is a need for accurate, high-quality translations, say researchers at UCLA’s Center for East-West Medicine. To that end, the center has published a document that includes a detailed discussion of the issues involved in Chinese medical translation, which is designed to help students, educators, practitioners, researchers, publishers and translators evaluate and digest Chinese medical texts with greater sensitivity and comprehension.

“This publication aims to raise awareness among the many stakeholders involved with the translation of Chinese medicine,” said principal investigator and study author Dr. Ka-Kit Hui, founder and director of the UCLA center.

The 15-page document, “Considerations in the Translation of Chinese Medicine,” was developed and written by a UCLA team that included a doctor, an anthropologist, a China scholar and a translator. It appears in the current online edition of the Journal of Integrative Medicine.

Authors Sonya Pritzker, a licensed Chinese medicine practitioner and anthropologist, and Hanmo Zhang, a China scholar, hope the publication will promote communication in the field and play a role in the development of thorough, accurate translations.

The document highlights several important topics in the translation of Chinese medical texts, including the history of Chinese medical translations, which individuals make ideal translators, and other translation-specific issues, such as the delicate balance of focusing translations on the source-document language while considering the language into which it will be translated.

It also addresses issues of technical terminology, period-specific language and style, and historical and cultural perspective. For example, depending on historical circumstances and language use, a translation may be geared toward a Western scientific audience or, alternately, may take a more natural and spiritual tone. The authors note that it is sometimes helpful to include dual translations, such as “windfire eye/acute conjunctivitis,” in order to facilitate a link between traditional Chinese medical terms and biomedical diagnoses.

The final section of the document calls for further discussion and action, specifically in the development of international collaborative efforts geared toward the creation of more rigorous guidelines for the translation of Chinese medicine texts.

“Considerations in the Translation of Chinese Medicine” was inspired by the late renowned translator and scholar Michael Heim, a professor in the UCLA departments of comparative literature and Slavic studies. A master of 12 languages, he is best known for his translation into English of Czech author Milan Kundera’s “The Unbearable Lightness of Being.” The new UCLA document is dedicated to him.

The document, the authors say, was influenced in large part by the American Council of Learned Societies’ “Guidelines for the Translation of Social Science Texts,” which are intended to promote communications in the social sciences across language boundaries.
It was also influenced by Pritzker’s longstanding anthropological study of translation in Chinese medicine, which is detailed in her new book, “Living Translation: Language and the Search for Resonance in U.S. …

Read more

New EMS system dramatically improves survival from cardiac arrest

A new system that sent patients to designated cardiac receiving centers dramatically increased the survival rate of victims of sudden cardiac arrest in Arizona, according to a study published online in Annals of Emergency Medicine.

“We knew lives would be saved if the hospitals implemented the latest cutting-edge guidelines for post-cardiac arrest care and we were able to get cardiac arrest patients to those hospitals, similar to what is done for Level 1 trauma patients,” said lead study author Daniel Spaite, MD, Director of EMS Research at the University of Arizona Emergency Medicine Research Center in Phoenix and Tucson and a professor and distinguished chair of emergency medicine at the University of Arizona College of Medicine. “Taking these patients directly to a hospital optimally prepared to treat cardiac arrest gave patients a better chance of survival and of preventing neurologic damage, a common result of these cardiac events.”

Under the study, 31 hospitals, serving about 80 percent of the state’s population, were designated as cardiac receiving centers between December 2007 and November 2010. Approximately 55 emergency medicine service agencies also participated in the study.

The study shows that the survival rate increased by more than 60 percent during the four-year period of the study, from 2007 to 2010. More importantly, when the results were adjusted for the various factors that significantly impact survival (such as age and how quickly the EMS system got to the patients after their arrest), the likelihood of surviving an arrest more than doubled. In addition, the likelihood of surviving with good neurological status also more than doubled.

This statewide effort was accomplished through the Save Hearts Arizona Registry and Education (SHARE) Program, a partnership involving the Arizona Department of Health Services, the University of Arizona, over 30 hospitals and more than 100 fire departments and EMS agencies. The SHARE Program is part of a network of statewide cardiac resuscitation programs dedicated to improving cardiac arrest survival and working together as the HeartRescue Project.

“We worked closely with the hospitals around the state to implement these guidelines and then formally recognized the hospitals as Cardiac Receiving Centers (CRCs),” said Ben Bobrow, MD, Medical Director of the Bureau of Emergency Medicine Services and Trauma System for the Arizona Department of Health Services in Phoenix, Ariz. “We then developed protocols for our EMS agencies to transport post-cardiac arrest patients to those centers. Our overarching goal was to have more cardiac arrest victims leave the hospital in good shape and be able to return to their families and careers. As we suspected, ‘regionalizing’ the care for these critically ill patients markedly increased their likelihood of survival and good neurologic outcome.”

Dr. Bobrow, who is also a professor of emergency medicine at the University of Arizona College of Medicine-Phoenix and an emergency physician at Maricopa Medical Center, said the study shows that just transporting these patients to the nearest emergency department does not maximize the likelihood of a positive outcome. …

Read more

Primary texting bans associated with lower traffic fatalities, study finds

Researchers at the University of Alabama at Birmingham School of Public Health examined the impact texting-while-driving laws have had on roadway crash-related fatalities, and the findings are published in the August issue of the American Journal of Public Health.

Of drivers in the United States ages 18-64 years, 31 percent reported they had read or sent text or email messages while driving at least once in the 30 days prior, according to 2011 data from the Centers for Disease Control and Prevention. That same year, 3,331 people were killed in crashes involving a distracted driver, and an additional 387,000 people were injured.

While completing her doctoral work in the Department of Health Care Organization and Policy, Alva O. Ferdinand, Dr.P.H., J.D., conducted a longitudinal panel study to examine within-state changes in roadway fatalities after the enactment of state texting-while-driving bans, using roadway fatality data captured in the Fatality Analysis Reporting System between 2000 and 2010.

“Very little is known about whether laws banning texting while driving have actually improved roadway safety,” Ferdinand said. “Further, given the considerable variation in the types of laws that states have passed and whom they ban from what, it was necessary to determine which types of laws are most beneficial in improving roadway safety.”

Some states have banned all drivers from texting while driving, while others have banned only young drivers from this activity, Ferdinand says. Additionally, some states’ texting bans entail secondary enforcement, meaning an officer must have another reason to stop a vehicle, like speeding or running a red light, before citing a driver for texting while driving. Other states’ texting bans entail primary enforcement, meaning an officer does not have to have another reason for stopping a vehicle.

“Our results indicated that primary texting bans were significantly associated with a 3 percent reduction in traffic fatalities among all age groups, which equates to an average of 19 deaths prevented per year in states with such bans,” Ferdinand said. “Primarily enforced texting laws that banned only young drivers from texting were the most effective at reducing deaths among the 15- to 21-year-old cohort, with an associated 11 percent reduction in traffic fatalities among this age group in states with such bans.”

States with secondarily enforced restrictions did not see any significant reductions in traffic fatalities.

“We were a little surprised to see that primarily enforced texting bans were not associated with significant reductions in fatalities among those ages 21 to 64, who are not considered to be young drivers,” Ferdinand said. “However, states with bans prohibiting the use of cellphones without hands-free technology altogether on all drivers saw significant reductions in fatalities among this particular age group. Thus, although texting-while-driving bans were most effective for reducing traffic-related fatalities among young individuals, handheld bans appear to be most effective for adults.”

Ferdinand says these results could aid policymakers interested in improving roadway safety, since they indicate which types of laws are most effective in reducing deaths among various age groups, and could also support those in states with secondarily enforced texting bans who are advocating for stricter, primarily enforced bans.

Ferdinand’s mentor, Nir Menachemi, Ph.D., professor in the Department of Health Care Organization and Policy, says it is a key responsibility of health policy researchers to generate high-quality evidence on the health impact of societal policies and laws.

“Clearly, distracted driving is a growing problem affecting everyone on the roadways,” Menachemi said. “It is my hope that policymakers act upon our findings so that motor-vehicle deaths can be prevented.”

Story Source: The above story is based on materials provided by the University of Alabama at Birmingham. …
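As a toy illustration of the within-state before/after idea (not the authors’ panel-model specification, and with a hypothetical file and hypothetical column names), a first-pass comparison of fatality counts around each ban’s effective year might look like this:

```python
import pandas as pd

# Hypothetical data: one row per state-year with traffic fatalities and the
# year a primary texting ban took effect (blank if no ban). Placeholder names.
df = pd.read_csv("state_year_fatalities.csv")  # columns: state, year, fatalities, ban_year

# Mark years at or after the ban; states without a ban stay False throughout.
df["post_ban"] = df["year"] >= df["ban_year"]

# Mean fatalities before vs. after the ban, within each state that enacted one.
banned = df.dropna(subset=["ban_year"])
summary = banned.groupby(["state", "post_ban"])["fatalities"].mean().unstack()
summary["pct_change"] = 100 * (summary[True] - summary[False]) / summary[False]
print(summary.sort_values("pct_change").head())
```

A raw comparison like this ignores nationwide trends and other confounders, which is exactly why the published analysis relies on a longitudinal panel design rather than simple before/after means.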

Read more

Biologists warn of early stages of Earth’s sixth mass extinction event

The planet’s current biodiversity, the product of 3.5 billion years of evolutionary trial and error, is the highest in the history of life. But it may be reaching a tipping point.

In a new review of scientific literature and analysis of data published in Science, an international team of scientists cautions that the loss and decline of animals is contributing to what appears to be the early days of the planet’s sixth mass biological extinction event.

Since 1500, more than 320 terrestrial vertebrates have become extinct. Populations of the remaining species show a 25 percent average decline in abundance. The situation is similarly dire for invertebrate animal life.

And while previous extinctions have been driven by natural planetary transformations or catastrophic asteroid strikes, the current die-off can be associated with human activity, a situation that lead author Rodolfo Dirzo, a professor of biology at Stanford, designates an era of “Anthropocene defaunation.”

Across vertebrates, 16 to 33 percent of all species are estimated to be globally threatened or endangered. Large animals, described as megafauna and including elephants, rhinoceroses, polar bears and countless other species worldwide, face the highest rate of decline, a trend that matches previous extinction events.

Larger animals tend to have lower population growth rates and produce fewer offspring. They need larger habitat areas to maintain viable populations. Their size and meat mass make them easier and more attractive hunting targets for humans.

Although these species represent a relatively low percentage of the animals at risk, their loss would have trickle-down effects that could shake the stability of other species and, in some cases, even human health.

For instance, previous experiments conducted in Kenya have isolated patches of land from megafauna such as zebras, giraffes and elephants, and observed how an ecosystem reacts to the removal of its largest species. Rather quickly, these areas become overwhelmed with rodents. Grass and shrubs increase and the rate of soil compaction decreases. Seeds and shelter become more easily available, and the risk of predation drops.

Consequently, the number of rodents doubles, and so does the abundance of the disease-carrying ectoparasites that they harbor.

“Where human density is high, you get high rates of defaunation, high incidence of rodents, and thus high levels of pathogens, which increases the risks of disease transmission,” said Dirzo, who is also a senior fellow at the Stanford Woods Institute for the Environment. …

Read more

New hope for powdery mildew resistant barley

New research at the University of Adelaide has opened the way for the development of new lines of barley with resistance to powdery mildew.

In Australia, annual barley production is second only to wheat with 7-8 million tonnes a year. Powdery mildew is one of the most important diseases of barley.

Senior Research Scientist Dr Alan Little and team have discovered the composition of special growths on the cell walls of barley plants that block the penetration of the fungus into the leaf.

The research, by the ARC Centre of Excellence in Plant Cell Walls in the University’s School of Agriculture, Food and Wine in collaboration with the Leibniz Institute of Plant Genetics and Crop Plant Research in Germany, will be presented at the upcoming 5th International Conference on Plant Cell Wall Biology and published in the journal New Phytologist.

“Powdery mildew is a significant problem wherever barley is grown around the world,” says Dr Little. “Growers with infected crops can expect up to 25% reductions in yield and the barley may also be downgraded from high quality malting barley to that of feed quality, with an associated loss in market value.

“In recent times we’ve seen resistance in powdery mildew to the class of fungicide most commonly used to control the disease in Australia. Developing barley with improved resistance to the disease is therefore even more important.”

The discovery means researchers have new targets for breeding powdery mildew resistant barley lines.

“Powdery mildew feeds on the living plant,” says Dr Little. “The fungus spore lands on the leaf and sends out a tube-like structure which punches its way through cell walls, penetrating the cells and taking the nutrients from the plant. The plant tries to stop this penetration by building a plug of cell wall material — a papilla — around the infection site. Effective papillae can block the penetration by the fungus.

“It has long been thought that callose is the main polysaccharide component of papillae. But using new techniques, we’ve been able to show that in the papillae that block fungal penetration, two other polysaccharides are present in significant concentrations and play a key role.

“It appears that callose acts like an initial plug in the wall but arabinoxylan and cellulose fill the gaps in the wall and make it much stronger.”

In his PhD project, Jamil Chowdhury showed that effective papillae contained up to four times the concentration of callose, arabinoxylan and cellulose as cell wall plugs which didn’t block penetration.

“We can now use this knowledge to find ways of increasing these polysaccharides in barley plants to produce more resistant lines available for growers,” says Dr Little.

Story Source: The above story is based on materials provided by the University of Adelaide. Note: Materials may be edited for content and length.

Read more

Designer potatoes on the menu to boost consumption

A decline in overall potato consumption has Texas A&M AgriLife Research breeders working on “designer” spuds that meet the time constraints and unique tastes of a younger generation.

Dr. Creighton Miller, AgriLife Research potato breeder from College Station, recently conducted the Texas A&M Potato Breeding and Variety Development Program field day at the farm of cooperator Bruce Barrett south of Springlake.

“Potatoes are an important delivery system for nutrients to humans,” Miller said. “The average consumption in the U.S. is 113 pounds per year per person. But overall potato consumption in the U.S. has generally declined somewhat.”

“So what we are doing now is developing unique varieties that have a tendency to appeal to the younger set with high income who are willing to try something different,” he said. “This has contributed to an increase in consumption of these types over the russets, which are still the standard.”

Miller said the objective of the Texas A&M potato breeding program is to develop improved varieties adapted specifically to Texas environmental conditions.

“However, some of our varieties are widely adapted across the U.S.,” he said. “Three of them collectively represent the fifth-largest number of acres certified for seed production in the U.S., so we’ve released some successful varieties, and we are developing more all the time.”

The Texas Potato Variety Development Program currently has 412 entries at the Springlake trials and 927 entries at the Dalhart trials. Additionally, the 2014 seedling selection trials at both Springlake and Dalhart include 115,408 seedlings from 634 families or crosses.

One selection named Best of Trial at Springlake this year is BTX2332-IR, a round red potato. And, he said, the traditional russet potatoes will always be a mainstay, as they are used primarily for baking and French fries. …

Read more

Increasing diversity of marketable raspberries

Raspberries are the third most popular berry in the United States. Their popularity is growing as a specialty crop for the wholesale industry, in smaller local markets, and in U-pick operations. As consumer interest in the health benefits of colorful foods increases, small growers are capitalizing on novelty fruit and vegetable crops such as different-colored raspberries.

Authors of a newly published study say that increasing the diversity of raspberry colors in the market will benefit both consumers and producers. “Producers will need to know how fruit of the other color groups compare with red raspberries with regard to the many postharvest qualities,” noted the University of Maryland’s Julia Harshman, corresponding author of the study published in HortScience (March 2014).

Raspberries have an extremely short shelf life, which can be worsened by postharvest decay. Postharvest susceptibility to gray mold (Botrytis cinerea) drastically reduces the shelf life of this delicate fruit. “The main goal of our research was to compare the postharvest quality of different-colored raspberries that were harvested from floricanes under direct-market conditions with minimal pesticide inputs,” Harshman said. The researchers said that, although there is abundant information in the literature regarding red raspberry production and gray mold, very little research has been conducted on the postharvest physiology of black, yellow, or purple raspberries.

The researchers analyzed 17 varieties of raspberries at the USDA’s Agricultural Research Center in Beltsville, Maryland, examining each cultivar for characteristics such as anthocyanins, soluble solids, titratable acids, pH, color, firmness, decay and juice leakage rates, ethylene evolution, and respiration.

“In comparing the four commonly grown colors of raspberry, we drew several important conclusions,” they said. “The mechanisms controlling decay and juice leakage are distinct and mediated by both biotic and abiotic factors. The colors that performed well for one area are opposite the ones that did well in the other.” For example, firmness was expected to track closely with either leakage or decay resistance; however, the analyses did not indicate this.

Red raspberries, in comparison with the other three colors analyzed during the study, had the highest titratable acids (TA) and the lowest ratio of soluble solids to TA, which, the authors say, accounts for the tart raspberry flavor consumers expect.

Yellow raspberries had the lowest levels of anthocyanins and phenolics. …

Read more

Don’t like the food? Try paying more

Restaurateurs, take note: by cutting your prices, you may be cutting how much people will like your food.

Researchers in nutrition, economics and consumer behavior often assume that taste is a given: a person naturally either likes or dislikes a food. But a new study suggests taste perception, as well as feelings of overeating and guilt, can be manipulated by price alone.

“We were fascinated to find that pricing has little impact on how much one eats, but a huge impact on how you interpret the experience,” said Brian Wansink, Ph.D., a professor at the Dyson School of Applied Economics and Management at Cornell University who oversaw the research. “Simply cutting the price of food at a restaurant dramatically affects how customers evaluate and appreciate the food.”

The researchers teamed up with a high-quality Italian buffet in upstate New York to study how pricing affects customers’ perceptions. They presented 139 diners with a menu that offered an all-you-can-eat buffet priced at either $4 or $8. Customers were then asked to evaluate the food and the restaurant and rate their first, middle and last taste of the food on a nine-point scale.

Those who paid $8 for the buffet reported enjoying their food on average 11 percent more than those who paid $4, though the two groups ate the same amount of food overall. People who paid the lower price also more often reported feeling like they had overeaten, felt more guilt about the meal, and reported liking the food less and less throughout the course of the meal.

“We were surprised by the striking pattern we saw,” said Ozge Sigirci, a researcher at the Cornell University Food and Brand Lab who conducted the study. “If the food is there, you are going to eat it, but the pricing very much affects how you are going to feel about your meal and how you will evaluate the restaurant.”

Public health researchers and health advocates have focused on how all-you-can-eat buffets influence people’s eating habits. On the theory that such restaurants foster overeating and contribute to obesity, some advocates have proposed imposing special taxes on buffet consumers or restaurant owners.

The study did not directly address the public health implications of all-you-can-eat buffets, but the researchers said the results could offer lessons about how to optimize a restaurant experience. “If you’re a consumer and want to eat at a buffet, the best thing to do is eat at the most expensive buffet you can afford. You won’t eat more, but you’ll have a better experience overall,” said Wansink.

The study fits within a constellation of other work by Wansink and others offering insights about how health behaviors can be manipulated by small changes, such as putting the most healthful foods first in a display or using a smaller dinner plate.

“This is an example of how a really small change can transform how a person interacts with food in a way that doesn’t entail dieting,” said Wansink, who is author of Slim by Design: Mindless Eating Solutions for Everyday Life, an upcoming book about how design choices influence eating behavior.

Ozge Sigirci presented the findings during the Experimental Biology 2014 meeting on Tuesday, April 29.

Read more

Novel drug cocktail may improve clinical treatment for pancreatic cancer

Pancreatic cancer is the fourth leading cause of cancer deaths in the U.S. and has the lowest overall survival rate of all major cancers (~6%). With current treatment options meeting with limited success, it is anticipated that pancreatic cancer will move up to the second leading cause of cancer deaths by as early as 2015. Surgical removal of the tumor presents the best chance of survival; however, only 15% of patients are eligible due to the late stage of diagnosis common with this disease. With very limited improvements in patient outcome over the last two decades, there remains an enormous need for new therapies and treatment options.

David Durrant, a Ph.D. student in the laboratory of Dr. Rakesh Kukreja from the Pauley Heart Center at Virginia Commonwealth University’s School of Medicine, is studying a novel combination therapy for the treatment of pancreatic cancer. The traditional chemotherapy drug doxorubicin (DOX) has long been used in the treatment of several cancers. However, patients commonly acquire resistance to DOX because of increased activation of specific survival proteins or through increased expression of drug transporters which reduce cellular levels of the drug. This is especially true for pancreatic cancer, which does not respond to multiple treatment strategies, including those that contain DOX. …

Read more
