Salamanders shrinking as their mountain havens heat up

Wild salamanders living in some of North America’s best salamander habitat are getting smaller as their surroundings get warmer and drier, forcing them to burn more energy in a changing climate.

That’s the key finding of a new study, published March 25 in the journal Global Change Biology, that examined museum specimens caught in the Appalachian Mountains from 1957 to 2007 and wild salamanders measured at the same sites in 2011-2012. The salamanders studied from 1980 onward were, on average, 8% smaller than their counterparts from earlier decades. The changes were most marked in the Southern Appalachians and at low elevations — settings where detailed weather records showed the climate has warmed and dried out most. Scientists have predicted that some animals will get smaller in response to climate change, and this is the strongest confirmation of that prediction.

“This is one of the largest and fastest rates of change ever recorded in any animal,” said Karen R. Lips, an associate professor of biology at the University of Maryland and the study’s senior author. “We don’t know exactly how or why it’s happening, but our data show it is clearly correlated with climate change.” And it’s happening at a time when salamanders and other amphibians are in distress, with some species going extinct and others dwindling in number.

“We don’t know if this is a genetic change or a sign that the animals are flexible enough to adjust to new conditions,” Lips said. “If these animals are adjusting, it gives us hope that some species are going to be able to keep up with climate change.”

The study was prompted by the work of University of Maryland Prof. Emeritus Richard Highton, who began collecting salamanders in the Appalachian Mountains in 1957. The geologically ancient mountain range’s moist forests and long evolutionary history make it a global hot spot for a variety of salamander species.
Highton collected hundreds of thousands of salamanders, now preserved in jars at the Smithsonian Institution’s Museum Service Center in Suitland, MD. But Highton’s records show a mysterious decline in the region’s salamander populations beginning in the 1980s. Lips, an amphibian expert, saw a similar decline in the frogs she studied in Central America, and tracked it to a lethal fungal disease. …


Radiation damage at the root of Chernobyl’s ecosystems

Radiological damage to microbes near the site of the Chernobyl disaster has slowed the decomposition of fallen leaves and other plant matter in the area, according to a study just published in the journal Oecologia. The resulting buildup of dry, loose detritus is a wildfire hazard that poses the threat of spreading radioactivity from the Chernobyl area.

Tim Mousseau, a professor of biology and co-director of the Chernobyl and Fukushima Research Initiatives at the University of South Carolina, has done extensive research in the contaminated area surrounding the Chernobyl nuclear facility, which exploded and released large quantities of radioactive compounds in the Ukraine region of the Soviet Union in 1986. He and frequent collaborator Anders Møller of Université Paris-Sud noticed something unusual in the course of their work in the Red Forest, the most contaminated part of the Chernobyl Exclusion Zone.

“We were stepping over all these dead trees on the ground that had been killed by the initial blast,” Mousseau said. “Some 15 or 20 years later, these tree trunks were in pretty good shape. If a tree had fallen in my backyard, it would be sawdust in 10 years or so.”

They set out to assess the rate at which plant material decomposed as a function of background radiation, placing hundreds of samples of uncontaminated leaf litter (pine needles and oak, maple and birch leaves) in mesh bags throughout the area. The locations were chosen to cover a range of radiation doses, and the samples were retrieved after nine months outdoors. A statistical analysis of the weight loss of each leaf litter sample after those nine months showed that higher background radiation was associated with less weight loss.
The response was proportional to radiation dose, and in the most contaminated regions, the leaf loss was 40 percent less than in control regions in Ukraine with normal background radiation levels. They also measured the thickness of the forest floor in the same areas where samples were placed, and found that it was thicker in places with higher background radiation.

The team concluded that the bacteria and fungi that decompose plant matter in healthy ecosystems are hindered by radioactive contamination. They showed a smaller effect for small invertebrates, such as termites, that also contribute to the decomposition of plant biomass. According to Mousseau, slower decomposition is likely to indirectly slow plant growth, too, given that the products of decomposition are nutrients for new plants. The team recently reported diminished tree growth near Chernobyl, which he says likely results both from direct radiation effects and indirect effects such as reduced nutrient supply.

“It’s another facet of the impacts of low-dose-rate radioactive contaminants on the broader ecosystem,” Mousseau says. …
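The dose-response relationship the researchers describe can be illustrated with a toy least-squares fit; the dose and mass-loss numbers below are invented for illustration and are not the study’s data:

```python
import numpy as np

# Hypothetical (invented) data: relative background radiation dose at each
# mesh-bag site, and the fraction of leaf-litter mass lost after nine months.
dose = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0, 100.0])
mass_loss = np.array([0.80, 0.78, 0.74, 0.65, 0.58, 0.42, 0.35])

# Fit mass loss against log-dose; a negative slope means decomposition
# slows as contamination increases, the pattern the study reports.
slope, intercept = np.polyfit(np.log10(dose), mass_loss, 1)
print(f"slope = {slope:.3f}")
```

With data shaped like the study’s result, the fitted slope comes out negative, quantifying “more radiation, less decomposition” in a single number.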


Retention leads to discipline problems in other kids

When students repeat a grade, it can spell trouble for their classmates, according to a new Duke University-led study of nearly 80,000 middle-schoolers. In schools with high numbers of grade repeaters, suspensions were more likely to occur across the school community. Discipline problems were also more common among other students, including substance abuse, fighting and classroom disruption.

Public debate typically focuses on how retention affects an individual student’s academic performance, said lead author Clara Muschkin. So she and her colleagues decided to take a wider view and consider how holding students back may affect the school as a whole.

“The decision to retain students has consequences for the whole school community,” said Muschkin, an associate director of the Duke Center for Child and Family Policy. “That wider effect is an issue worth considering as we debate this policy.”

The study by Muschkin, Elizabeth Glennie and Audrey Beck looked at 79,314 seventh-graders in 334 North Carolina middle schools. For information on retention and discipline problems, the authors turned to administrative data from the state’s public school system. The authors found that different schools have greatly varying numbers of older and retained students, with significant consequences. They took pains to account for a range of factors that might offer alternative explanations for their findings, including schools’ socioeconomic composition and parents’ educational status.
Even after controlling for such factors, the presence of older and retained students was still strongly linked with more discipline problems in the entire group. For instance, if 20 percent of children in seventh grade were older than their peers, the chance that other students would commit an infraction or be suspended increased by 200 percent.

“There’s a strong relationship here that we think is likely to be causal,” Muschkin said.

The study focused on two groups in particular: students who repeated a grade, and students who were a year older than their classmates, on average. When there were more older and retained students present, discipline problems increased for all subgroups in the study, including black and white students and boys and girls. Two groups saw a particularly large jump in discipline problems: white students and girls of all races.

“This finding took us by surprise,” Muschkin said. “These two groups appear to be a bit more affected than others by the influence of older peers.”

In early adolescence, a time of major physical and psychological change, students are particularly vulnerable to peer influence, Muschkin noted. However, more research is needed to understand why some subgroups appear to respond more strongly than others to the influence of their classmates, she said. Holding students back became a popular educational option as criticism of “social promotion” mounted. …


Drop in crime rates is smaller where Wal-Mart builds, study shows

Communities across the United States experienced an unprecedented decline in crime in the 1990s. But for counties where Wal-Mart built stores, the decline wasn’t nearly as dramatic.

“The crime decline was stunted in counties where Wal-Mart expanded in the 1990s,” says Scott Wolfe, assistant professor of criminology and criminal justice at the University of South Carolina and lead author of a new study. “If the corporation built a new store, there were 17 additional property crimes and 2 additional violent crimes for every 10,000 persons in a county.”

The study, titled “Rolling back prices and raising crime rates? The Wal-Mart effect on crime in the United States,” released last month in the British Journal of Criminology, was co-authored with David Pyrooz, assistant professor of criminal justice and criminology at Sam Houston State University. Wolfe says the commonly known “Wal-Mart effect” is the company’s overwhelming influence on numerous economic and social factors in communities, including jobs, poverty rates and retail prices.

The study was not intended to criticize Wal-Mart, he says. Instead, it attempted to answer the unexplored question of whether Wal-Mart could equate with either more or less crime.

“There have been dozens of studies on the ‘Wal-Mart effect’ showing the company impacts numerous outcomes closely related to crime. Our objective was to determine if the Wal-Mart effect extended to understanding crime rates during arguably one of the most pivotal historical periods in the study of crime,” Wolfe says.

Wolfe and Pyrooz based the study on 3,109 U.S. counties. They focused on Wal-Mart’s expansion in the 1990s, a time of dynamic growth for the company and falling crime rates nationally. During that decade Wal-Mart expanded in 767 of those counties.

“There are reasons why Wal-Mart ranks among the most successful commercial enterprises in U.S. history,” Wolfe says. …


To calculate long-term conservation payoff, factor in people

Paying people to protect their natural environment is a popular conservation tool around the world — but figuring out the return on investment, for both people and nature, is a thorny problem, especially since such efforts typically stretch on for years. “Short attention-span worlds with long attention-span problems” is how Xiaodong Chen, a former Michigan State University doctoral student now on the faculty at the University of North Carolina-Chapel Hill, sums it up.

Chen, with his adviser Jianguo “Jack” Liu, director of the MSU Center for Systems Integration and Sustainability (CSIS), and others, has developed a new way to evaluate and model the long-term effectiveness of conservation investments. Their achievement is not only factoring in ecological gains — like more trees growing — but also putting the actions and reactions of people into the equation. The paper, Assessing the Effectiveness of Payments for Ecosystem Services: an Agent-Based Modeling Approach, appears in this week’s online edition of Ecology and Society.

The paper examines payments for ecosystem services — the practice of paying people to perform tasks or engage in practices that aid conservation. The authors examined one of China’s most sweeping examples — the National Forest Conservation Program, in which residents in Wolong Nature Reserve are paid to stop chopping down trees for timber and fuel wood. Chen explained that they tapped into both social data and environmental information to create a computer model that simulates how the policy would fare over many years in a variety of scenarios. Studies documenting results on land cover change and panda habitat dynamics were merged with studies revealing how people were likely to behave if new households were formed or incentives for conservation activities were varied.

“Usually studies are developed in either the social sciences or the natural sciences, and the importance of the other perspectives is not built into scientific exploration,” Chen said.
“We were able to develop this kind of simulation because of collaborative interdisciplinary research — by putting people with different backgrounds together.”

He also said the model’s ability to run scenarios about how policy could work over decades is crucial, because many goals of conservation, like restoring wildlife habitat, can take decades. In the meantime, the actions of individuals living in the area can change.

Story Source: The above story is based on materials provided by Michigan State University. Note: Materials may be edited for content and length.
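For a flavor of what an agent-based model of such a payment program looks like, here is a deliberately minimal sketch — not the authors’ model, and every parameter is invented. Households choose each year between harvesting fuelwood and accepting a conservation payment, and forest cover responds to the total harvest:

```python
import random

# Minimal agent-based sketch (invented parameters, not the study's model):
# each household either harvests fuelwood or takes a conservation payment
# each year, and forest cover responds to the total harvest.
random.seed(0)

N_HOUSEHOLDS = 100
YEARS = 30
PAYMENT = 1.0           # annual payment for not harvesting (arbitrary units)
HARVEST_VALUE = 0.8     # value a household gets from harvesting
HARVEST_IMPACT = 0.002  # forest cover lost per harvesting household per year
REGROWTH = 0.05         # fraction of lost cover that regrows each year

forest_cover = 0.6      # fraction of land forested at the start
for year in range(YEARS):
    harvesters = 0
    for _ in range(N_HOUSEHOLDS):
        # Households compare payoffs, with noise in perceived harvest value.
        perceived = HARVEST_VALUE + random.gauss(0, 0.3)
        if perceived > PAYMENT:
            harvesters += 1
    forest_cover -= HARVEST_IMPACT * harvesters
    forest_cover += REGROWTH * (1.0 - forest_cover)
    forest_cover = min(1.0, max(0.0, forest_cover))

print(f"forest cover after {YEARS} years: {forest_cover:.2f}")
```

Rerunning the loop with a higher PAYMENT, or with households added over time, shows how such a model lets analysts compare policy scenarios over decades, which is the point Chen makes above.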


Forensic experts compile guide on how to ID child abuse, starvation

Forensic science experts from North Carolina State University are publishing a comprehensive overview of forensic research that can be used to identify child abuse and starvation.

“By pulling all of this information together in one place, we hope that we can save the lives of some children and find justice for others,” says Dr. Ann Ross, a professor of anthropology at NC State and lead author of the paper. Ross is also co-editor of the book “The Juvenile Skeleton in Forensic Abuse Investigations.”

“For example, we looked at issues of neglect in which children are starved to death,” Ross says. “These are supposedly rare, but I’ve unfortunately seen this a few times in my capacity as an advisor to medical examiners. In this paper we offer some guidelines on how to use the mineral density of bones to determine whether a child was being starved.”

Proving that a child was starved to death is difficult; it’s essentially impossible to assess normal indicators of starvation once a body has decomposed. But the paper explains that forensic investigators can use a DXA scan, like those used to assess osteoporosis in older adults, to assess bone density and determine whether a child was severely malnourished. Also, because teeth are not as affected by malnutrition as bones are, investigators can compare the development of an individual’s teeth and bones. Stunted growth of a child’s tibia can be a strong indicator of starvation, for example.

“These techniques are well-established but are not in widespread use in the United States,” Ross says.

“We also combed the existing literature to focus on skeletal injuries that are indicators of abuse and that are unlikely or impossible to be caused by accident,” says Dr. Chelsey Juarez, an assistant professor of anthropology at NC State and co-author of the paper. For example, rib fractures are very rare in accidental trauma, so the presence of rib fractures in children is highly suggestive of abuse. The paper also offers broader advice, such as noting that forensic investigators should determine whether the story they’re getting from a child’s caregiver is consistent with the injuries they see on the child.

“The portion of the paper dealing with injuries is particularly important,” Juarez says, “because while it can be used for post-mortem assessment, it can also be used to examine X-rays of living children who can still be saved from abuse.”

Story Source: The above story is based on materials provided by North Carolina State University. Note: Materials may be edited for content and length.


Software uses cyborg swarm to map unknown environs

Oct. 16, 2013 — Researchers from North Carolina State University have developed software that allows them to map unknown environments — such as collapsed buildings — based on the movement of a swarm of insect cyborgs, or “biobots.”

“We focused on how to map areas where you have little or no precise information on where each biobot is, such as a collapsed building where you can’t use GPS technology,” says Dr. Edgar Lobaton, an assistant professor of electrical and computer engineering at NC State and senior author of a paper on the research.

“One characteristic of biobots is that their movement can be somewhat random,” Lobaton says. “We’re exploiting that random movement to work in our favor.”

Here’s how the process would work in the field. A swarm of biobots, such as remotely controlled cockroaches, would be equipped with electronic sensors and released into a collapsed building or other hard-to-reach area. The biobots would initially be allowed to move about randomly. Because the biobots couldn’t be tracked by GPS, their precise locations would be unknown. However, the sensors would signal researchers via radio waves whenever biobots got close to each other.

Once the swarm has had a chance to spread out, the researchers would send a signal commanding the biobots to keep moving until they find a wall or other unbroken surface — and then continue moving along the wall. This is called “wall following.” The researchers repeat this cycle of random movement and “wall following” several times, continually collecting data from the sensors whenever the biobots are near each other.
The new software then uses an algorithm to translate the biobot sensor data into a rough map of the unknown environment.

“This would give first responders a good idea of the layout in a previously unmapped area,” Lobaton says.

The software would also allow public safety officials to determine the location of radioactive or chemical threats, if the biobots have been equipped with the relevant sensors. The researchers have tested the software using computer simulations and are currently testing the program with robots. …


Healthier Diets Possible in Low-Income, Rural Communities

Oct. 11, 2013 — In the United States, children don’t eat enough fruits, vegetables and whole grains. Instead, their diets typically include excessive amounts of sugars and solid fats, counter to the 2010 Dietary Guidelines for Americans recommendations, increasing the risk of obesity and diabetes. A team of investigators implemented a two-year intervention study in low-income, rural areas, where a disproportionately higher risk of overweight and obesity among children persists, leading to increased risk of diabetes and heart disease in adulthood. The children enrolled in the study consumed significantly more fruits and vegetables. The results are published in the Journal of the Academy of Nutrition and Dietetics.

To evaluate students’ diet quality at the beginning and end of the study, researchers designed the CHANGE (Creating Healthy, Active and Nurturing Growing-up Environments) study, a two-year randomized, controlled, community- and school-based intervention to prevent unhealthy weight gain among rural school-aged children.

“Our primary objectives were to improve the diets, physical activity levels, and weight status of rural children based on the successful model developed by Tufts University researchers for the Shape Up Somerville study,” says lead investigator Christina Economos, PhD, Friedman School of Nutrition Science and Policy, Tufts University, Boston. “The objective of our analysis was to examine changes in fruit, vegetable, legume, whole-grain and low-fat dairy consumption among rural elementary students who were exposed to the CHANGE study intervention compared with students in control schools,” says lead author Juliana F. W. Cohen, ScM, ScD, Department of Nutrition, Harvard School of Public Health, Boston.
The team wanted to test its hypothesis that students exposed to the study would improve their diet quality due to healthier food environments. Eight communities in rural California, Kentucky, Mississippi, and South Carolina participated in the study between 2007 and 2009. …


Four common genetic variants associated with blood pressure in African-Americans

Sep. 10, 2013 — Case Western Reserve University is part of a landmark study that has discovered four novel gene variations associated with blood pressure. The 19-site meta-analysis, involving nearly 30,000 African-Americans, also found that the set of genetic variants is associated with blood pressure across other populations.

Epidemiology and biostatistics professor Xiaofeng Zhu, PhD, is co-senior author of the paper, which appears in The American Journal of Human Genetics. The Continental Origins and Genetic Epidemiology Network (COGENT) consortium conducted the research, which is the largest genome-wide association study of blood pressure in individuals of African ancestry. Most gene discovery studies to date have been performed using individuals of European ancestry, and previous genome-wide association studies using samples from individuals of African descent failed to detect any replicable genes associated with blood pressure.

“In addition to their disproportionate suffering, hypertension occurs earlier in life for African-Americans compared to individuals of other ancestries,” Zhu explained. “Therefore, it is important to study this population to better understand genetic susceptibility to hypertension.”

Zhu and his colleagues also confirmed previous findings regarding other genes whose presence correlates with increased hypertension risk. “Although it is unknown how the genes regulate blood pressure,” Zhu added, “our findings contribute to a better understanding of blood pressure pathways that can lead to the future development of drug targets for hypertension and may guide therapy for clinical care.”

Experts estimate that genetic make-up accounts for roughly 40-50 percent of individuals’ susceptibility to hypertension. Other factors associated with the disease include lifestyle, diet, and obesity.
Compared to Americans of European ancestry, African-Americans’ increased hypertension prevalence contributes to a greater risk of stroke, coronary heart disease, and end-stage renal disease.

“We anticipated that individuals of African ancestry share similar biology to other populations. However, differences in genomic make-up between African ancestry and other populations have uncovered additional genes affecting blood pressure, in addition to genetic variants that are specific to individuals of African ancestry,” said Nora Franceschini, MD, MPH, nephrologist and research assistant professor of epidemiology at the University of North Carolina at Chapel Hill and first author on the paper.

The next phase of study involving the newly discovered gene mutations will investigate their function using human blood samples at the molecular level. …


Protecting 17 percent of Earth’s land could save two-thirds of plant species

Sep. 5, 2013 — Protecting key regions that comprise just 17 percent of Earth’s land may help preserve more than two-thirds of its plant species, according to a new Duke University-led study by an international team of scientists. The researchers from Duke, North Carolina State University and Microsoft Research used computer algorithms to identify the smallest set of regions worldwide that could contain the largest numbers of plant species. They published their findings today in the journal Science.

“Our analysis shows that two of the most ambitious goals set forth by the 2010 Convention on Biological Diversity — to protect 60 percent of Earth’s plant species and 17 percent of its land surface — can be achieved, with one major caveat,” said Stuart L. Pimm, Doris Duke Professor of Conservation Ecology at Duke’s Nicholas School of the Environment. “To achieve these goals, we need to protect more land, on average, than we currently do, and much more in key places such as Madagascar, New Guinea and Ecuador. Our study identifies regions of importance. The logical — and very challenging — next step will be to make tactical local decisions within those regions to secure the most critical land for conservation.”

Plant species aren’t haphazardly distributed across the planet. Certain areas, including Central America, the Caribbean, the Northern Andes and regions in Africa and Asia, have much higher concentrations of endemic species, that is, those which are found nowhere else.

“Species endemic to small geographical ranges are at a much higher risk of being threatened or endangered than those with large ranges,” said Lucas N. Joppa, a conservation scientist at Microsoft Research’s Computational Science Laboratory in Cambridge, U.K. “We combined regions to maximize the numbers of species in the minimal area.
With that information, we can more accurately evaluate each region’s relative importance for conservation, and assess international priorities accordingly.”

To identify which of Earth’s regions contain the highest concentrations of endemic species, relative to their geographic size, the researchers analyzed data on more than 100,000 different species of flowering plants, compiled by the Royal Botanic Gardens in Kew, England. Joppa and Piero Visconti, also of Microsoft Research’s Computational Science Laboratory, created and ran the complex algorithms needed to analyze the large spatial database. Based on their computations, Clinton N. …
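Choosing a small set of regions that together cover as many species as possible is a version of the classic set-cover problem, which is often attacked with a greedy heuristic. The toy sketch below (the regions, areas and species are all made up, and the study’s actual algorithms are more sophisticated) repeatedly picks the region that adds the most not-yet-covered species per unit area:

```python
# Toy greedy set-cover heuristic for choosing regions that cover many
# species in little area. Regions and species are invented for illustration.
regions = {
    "A": {"species": {"s1", "s2", "s3", "s4", "s7"}, "area": 2.0},
    "B": {"species": {"s3", "s4", "s5"}, "area": 1.0},
    "C": {"species": {"s6"}, "area": 0.5},
    "D": {"species": {"s1", "s5", "s6", "s7"}, "area": 3.0},
}

covered, chosen = set(), []
all_species = set().union(*(r["species"] for r in regions.values()))

while covered != all_species:
    # Pick the region adding the most new species per unit area.
    name = max(
        (n for n in regions if n not in chosen),
        key=lambda n: len(regions[n]["species"] - covered) / regions[n]["area"],
    )
    chosen.append(name)
    covered |= regions[name]["species"]

print(chosen, sorted(covered))
```

Here the heuristic covers all seven species with regions B, C and A, and never needs the large, redundant region D, which is the flavor of trade-off (species gained versus land committed) the researchers describe.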


First documented report of swimming and diving in apes

Aug. 14, 2013 — Two researchers have provided the first video-based observation of swimming and diving apes. Instead of the usual dog-paddle stroke used by most terrestrial mammals, these animals use a kind of breaststroke. The swimming strokes peculiar to humans and apes might be the result of an earlier adaptation to an arboreal life.

For many years, zoos have used water moats to confine chimpanzees, gorillas or orangutans. When apes ventured into deep water, they often drowned. Some argued that this indicated a definitive difference between humans and apes: people enjoy the water and are able to learn to swim, while apes prefer to stay on dry land. But it turns out that this distinction is not absolute. Renato Bender, who is working on a PhD in human evolution at the School of Anatomical Sciences at Wits University, and Nicole Bender, who works as an evolutionary physician and epidemiologist at the Institute of Social and Preventive Medicine at the University of Bern, have studied a chimpanzee and an orangutan in the US. These primates were raised and cared for by humans and have learned to swim and to dive.

‘We were extremely surprised when the chimp Cooper dived repeatedly into a swimming pool in Missouri and seemed to feel very comfortable,’ said Renato Bender.

To prevent the chimp from drowning, the researchers stretched two ropes over the deepest part of the pool. Cooper became immediately interested in the ropes and, after a few minutes, he started diving into the two-meter-deep water to pick up objects on the bottom of the pool. ‘It was very surprising behavior for an animal that is thought to be very afraid of water,’ said Renato Bender. …


Competition changes how people view strangers online: On sites like eBay, strangers no longer seen as ‘just like you’

Aug. 12, 2013 — An anonymous stranger you encounter on websites like Yelp or Amazon may seem to be just like you, and a potential friend. But a stranger on a site like eBay is a whole different story. A new study finds that on websites where people compete against each other, assumptions about strangers change.

Previous research has shown that people have a bias toward thinking that strangers they encounter online are probably just like them. But when they are competitors, strangers are seen as different, and not sharing your traits and values — and that changes how people act, said Rebecca Walker Naylor, co-author of the study and assistant professor of marketing at The Ohio State University’s Fisher College of Business.

“When you’re competing against people you don’t know, you actually bid much more aggressively than you might normally, because you assume that these strangers aren’t similar to you,” she said. “You feel you have the license to bid aggressively because the other bidders aren’t like you and you don’t have to be nice to them.”

Naylor said the results should serve as a caution for people who shop on auction sites like eBay. “You need to be aware that, whether you mean to or not, you will naturally see other anonymous bidders as different from you. That will get the competitive juices flowing and you might end up paying more than you really want,” she said.

Naylor conducted the study with Cait Poynor Lamberton of the Katz Graduate School of Business at the University of Pittsburgh, and David Norton at the University of Connecticut. Their findings appear in the August 2013 issue of the Journal of Consumer Research.

In several related experiments, the researchers conducted simulated online auctions in which people thought they were bidding for products like a popular energy drink.
In some cases, the study participants knew they were similar — or dissimilar — to the people they were bidding against, while other times they didn’t know anything about their competitors.

In one study, college students were told they would be bidding on a bottle of Five Hour Energy Drink. They were then shown one of three profiles of a “representative” bidder they were competing against. In one profile, the bidder was demographically similar to the participant. A second profile was demographically different. The third profile was ambiguous, so these participants couldn’t tell how similar they were to their competition. Before the bidding, the researchers primed some participants for competition by having them complete a word search involving words describing competition, such as “battle,” “challenge” and “contend.”

As expected, the participants’ bid was lowest — 75 cents — when they competed against someone similar to themselves. When they were bidding against people who were not like them, their high bid went up to $1.22. But the bid was highest when the participant competed against an anonymous person — all the way to $1.28.

“They automatically assumed that if they didn’t know anything about this person they were bidding against, it must be someone who is not like them,” Naylor said. …


Self-healing solar cells ‘channel’ natural processes

Aug. 7, 2013 — To understand how solar cells heal themselves, look no further than the nearest tree leaf or the back of your hand. The “branching” vascular channels that circulate life-sustaining nutrients throughout leaves and hands serve as the inspiration for solar cells that can restore themselves efficiently and inexpensively.

In a new paper, North Carolina State University researchers Orlin Velev and Hyung-Jun Koo show that creating solar cell devices with channels that mimic organic vascular systems can effectively reinvigorate solar cells whose performance deteriorates due to degradation by the sun’s ultraviolet rays. Solar cells that are based on organic systems hold the potential to be less expensive and more environmentally friendly than silicon-based solar cells, the current industry standard.

The nature-mimicking devices are a type of dye-sensitized solar cells (DSSCs), composed of a water-based gel core, electrodes, and inexpensive, light-sensitive, organic dye molecules that capture light and generate electric current. However, the dye molecules that get “excited” by the sun’s rays to produce electricity eventually degrade and lose efficiency, Velev says, and thus need to be replenished to reboot the device’s effectiveness in harnessing the power of the sun.

“Organic material in DSSCs tends to degrade, so we looked to nature to solve the problem,” Velev said. “We considered how the branched network in a leaf maintains water and nutrient levels throughout the leaf. Our microchannel solar cell design works in a similar way. Photovoltaic cells rendered ineffective by high intensities of ultraviolet rays were regenerated by pumping fresh dye into the channels while cycling the exhausted dye out of the cell.
This process restores the device’s effectiveness in producing electricity over multiple cycles.”

Velev, Invista Professor of Chemical and Biomolecular Engineering at NC State and the lead author of a paper in Scientific Reports describing the research, adds that the new gel-microfluidic cell design was tested against other designs, and that branched channel networks similar to the ones found in nature worked most effectively. Study co-author Dr. Hyung-Jun Koo is a former NC State Ph.D. student who is now a postdoctoral researcher at the University of Illinois. …


Length of human pregnancies can vary naturally by as much as five weeks

Aug. 6, 2013 — The length of a human pregnancy can vary naturally by as much as five weeks, according to research published online August 7 in the journal Human Reproduction.

Normally, women are given a date for the likely delivery of their baby that is calculated as 280 days after the onset of their last menstrual period. Yet only 4% of women deliver at 280 days, and only 70% deliver within 10 days of their estimated due date, even when the date is calculated with the help of ultrasound.

Now, for the first time, researchers in the USA have been able to pinpoint the precise point at which a woman ovulates and a fertilised embryo implants in the womb during a naturally conceived pregnancy, and follow the pregnancy through to delivery. Using this information, they have been able to calculate the length of 125 pregnancies.

“We found that the average time from ovulation to birth was 268 days — 38 weeks and two days,” said Dr Anne Marie Jukic, a postdoctoral fellow in the Epidemiology Branch at the National Institute of Environmental Health Sciences (Durham, USA), part of the National Institutes of Health. “However, even after we had excluded six pre-term births, we found that the length of the pregnancies varied by as much as 37 days.

“We were a bit surprised by this finding. We know that length of gestation varies among women, but some part of that variation has always been attributed to errors in the assignment of gestational age. Our measure of length of gestation does not include these sources of error, and yet there is still five weeks of variability. It’s fascinating.”

The possibility that the length of pregnancies can vary naturally has been little researched, as it is impossible to tell the difference between errors in calculation and natural variability without being able to measure the gestational age of a developing fetus correctly.
Previous studies, conducted as long ago as the 1970s and 1980s, had used the slight rise in a woman’s body temperature on waking as a way of detecting when ovulation occurred. This is an inexact measurement and cannot be used to detect when the embryo actually implants in the womb.

In the current study, the researchers took information from daily urine samples collected by women taking part in an earlier study, the North Carolina Early Pregnancy Study, which took place between 1982 and 1985 and followed 130 singleton pregnancies from unassisted conception through to birth. …
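The two dating conventions described above (280 days from the last menstrual period, versus the study's observed average of 268 days from ovulation) can be sketched in a few lines of Python. The example date and the textbook 14-day ovulation offset below are illustrative assumptions, not data from the study:

```python
from datetime import date, timedelta

def due_date_from_lmp(lmp: date) -> date:
    """Conventional due date: 280 days after the last menstrual period."""
    return lmp + timedelta(days=280)

def due_date_from_ovulation(ovulation: date) -> date:
    """Due date using the study's average of 268 days (38 weeks and
    two days) from ovulation to birth."""
    return ovulation + timedelta(days=268)

# Hypothetical example: LMP on 1 January, ovulation assumed 14 days later.
lmp = date(2013, 1, 1)
ovulation = lmp + timedelta(days=14)

print(due_date_from_lmp(lmp))              # 2013-10-08
print(due_date_from_ovulation(ovulation))  # 2013-10-10
```

For a textbook 28-day cycle the two conventions land within a couple of days of each other; the study's point is that the real spread around that average was as much as 37 days.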


A roadblock to personalized cancer care?

Doctors need a way to target treatments to the patients most likely to benefit and avoid treating those who will not. Tumor biomarker tests can help do this.

The problem, according to a new commentary paper, is that, unlike drugs or other therapies, cancer biomarker tests are undervalued by doctors and patients. The authors say that inconsistent regulatory rules, inadequate payment and underfunded tumor biomarker research have left us in a vicious cycle that prevents the development and testing of reliable biomarker tests that could be used to personalize clinical care of patients with cancer.

“Right now biomarkers are not valued nearly to the extent that we see with therapeutics. But if a tumor biomarker test is being used to decide whether a patient should receive a certain treatment, then it is as critical for patient care as a therapeutic agent. A bad test is as dangerous as a bad drug,” says Daniel F. Hayes, M.D., clinical director of the breast oncology program at the University of Michigan Comprehensive Cancer Center.

Hayes led a blue-ribbon panel of experts from universities, corporations, insurance and advocacy organizations to outline the issues in a commentary published today in Science Translational Medicine.

Tumor biomarker tests look at the genetic or molecular make-up of a tumor to determine whether the cancer is likely to progress, and if so, whether it is likely to respond to treatment. If the test is good, it can help doctors decide when a patient can safely skip further therapy, or it can be used to direct which drug might be most likely to help.
The result: “personalized medicine,” which means patients get treatments that benefit them specifically and avoid treatments — including their costs and side effects — that are not likely to make a difference for them.

The regulatory process, the research funding, the reimbursement, even the standards for journal publication of tumor biomarker tests are all meager compared to the robust support for drug development, the authors say. This creates a vicious cycle in which researchers and drug companies don’t invest in tumor biomarker research, tests are not fully evaluated in clinical trials, and tests with uncertain value in terms of predicting the success of treatment are published. This in turn means that few of these tests are included in evidence-based care guidelines, leaving health care professionals unsure of whether or how to use the tests, and third-party payers unsure of how much to pay for them.

The authors outline five recommendations and suggest that all five must be addressed to break the vicious cycle:

1. Reform regulatory review of tumor biomarker tests
2. …


Buying a used car? Be sure to flatter the seller

July 26, 2013 — Consumers set high prices when selling their possessions because they feel threatened, according to a new study in the Journal of Consumer Research.

“When consumers consider selling a product they own, they feel threatened by the impending loss. In order to counter this threat, they increase the product’s value,” write authors Promothesh Chatterjee (University of Kansas), Caglar Irmak (University of Georgia), and Randall L. Rose (University of South Carolina).

Due to a phenomenon called the “endowment effect,” consumers seek much higher prices when selling a product they own than they would be willing to pay to purchase the same product.

In one study, consumers were assigned either a seller or buyer role and presented with a coffee mug. Sellers were told they could keep the mug or sell it, while buyers were asked to evaluate the mug. Then, both sellers and buyers were shown a series of words on a computer screen consisting of threat-related words (endanger), neutral words (wood), and non-words (tlun). Sellers responded to threat-related words much more quickly than buyers, and this difference in their response time led to significantly higher selling prices compared to buying prices.

Consumers should be aware that sellers can feel threatened when parting with even the most mundane possessions. Complimenting or flattering a seller can make them feel less threatened and lead them to lower their selling prices.

“Affirming a seller leads to elimination of the endowment effect. Buyers may want to affirm sellers to make them feel less threatened by the loss of a possession and therefore willing to set lower prices. Next time you are buying a second-hand car, for example, you may want to start the negotiation by telling the car owner what a wonderful family she has,” the authors conclude.


Scientists discover new variability in iron supply to the oceans with climate implications

July 19, 2013 — The supply of dissolved iron to oceans around continental shelves has been found to be more variable by region than previously believed — with implications for future climate prediction.

Iron is key to the removal of carbon dioxide from Earth’s atmosphere, as it promotes the growth of microscopic marine plants (phytoplankton), which mop up the greenhouse gas and lock it away in the ocean.

A new study, led by researchers based at the National Oceanography Centre, Southampton, has found that the amount of dissolved iron released into the ocean from continental margins displays variability not currently captured by ocean-climate prediction models. This could alter predictions of future climate change because iron, a key micronutrient, plays an important role in the global carbon cycle.

The amount of iron leaking from continental margins (the seafloor sediments close to continents), previously assumed to reflect rates of microbial activity, is actually far more varied between regions because of local differences in weathering and erosion on land, the study found. The results are published this week in Nature Communications.

“Iron acts like a giant lever on marine life storing carbon,” says Dr Will Homoky, lead author and postdoctoral research fellow at University of Southampton Ocean and Earth Science, which is based at the Centre. “It switches on growth of microscopic marine plants, which extract carbon dioxide from our atmosphere and lock it away in the ocean.”

Continental margins are a major source of dissolved iron to the oceans and therefore an important factor for climate prediction models. But until now, measurements have only been taken in a limited number of regions across the globe, all of which have been characterised by low oxygen levels and high sedimentation rates.
The present study focussed on a region with contrasting environmental conditions — in Atlantic waters off the coast of South Africa.

“We were keen to measure iron from this region because it is so different to areas studied before. The seawater here contains more oxygen, and sediments accumulate much more slowly on the seafloor because the region is drier and geologically less active,” says Professor Rachel Mills, co-author at the University of Southampton.

The team found substantially smaller amounts of iron being supplied to seawater than measured anywhere before — challenging preconceptions of iron supply across the globe.

The researchers also identified two different mechanisms by which rocks dissolve on the seafloor. They did this by measuring the isotopic composition of the iron, using a technique developed with co-authors based at the University of South Carolina.

“We already knew that microbial processes dissolve iron in rocks and minerals,” says Dr Homoky, “but now we find that rocks also dissolve passively and release iron to seawater. A bit like sugar dissolving in a cup of tea.”

“The fact that we have found a new mechanism makes us question how much iron is leaking out from other areas of the ocean floor. If certain rocks are going to dissolve irrespective of microbial processes, suddenly there are whole regions that might be supplying iron that are presently unaccounted for.”

But how much can this one factor really affect changes in Earth’s climate? …


Air pollution responsible for more than 2 million deaths worldwide each year, experts estimate

July 12, 2013 — More than two million deaths occur worldwide each year as a direct result of human-caused outdoor air pollution, a new study has found.

In addition, while it has been suggested that a changing climate can exacerbate the effects of air pollution and increase death rates, the study shows that this has a minimal effect and only accounts for a small proportion of current deaths related to air pollution.

The study, which has been published today, 12 July, in IOP Publishing’s journal Environmental Research Letters, estimates that around 470,000 people die each year because of human-caused increases in ozone. It also estimates that around 2.1 million deaths are caused each year by human-caused increases in fine particulate matter (PM2.5): tiny particles suspended in the air that can penetrate deep into the lungs, causing cancer and other respiratory diseases.

Co-author of the study, Jason West, from the University of North Carolina, said: “Our estimates make outdoor air pollution among the most important environmental risk factors for health. Many of these deaths are estimated to occur in East Asia and South Asia, where population is high and air pollution is severe.”

According to the study, however, the number of these deaths that can be attributed to changes in the climate since the industrial era is relatively small. It estimates that a changing climate results in 1,500 deaths due to ozone and 2,200 deaths related to PM2.5 each year.

Climate change affects air quality in many ways, possibly leading to local increases or decreases in air pollution.
For instance, temperature and humidity can change the reaction rates that determine the formation or lifetime of a pollutant, and rainfall can determine how long pollutants accumulate in the atmosphere. Higher temperatures can also increase the emissions of organic compounds from trees, which can then react in the atmosphere to form ozone and particulate matter.

“Very few studies have attempted to estimate the effects of past climate change on air quality and health. We found that the effects of past climate change are likely to be a very small component of the overall effect of air pollution,” continued West.

In their study, the researchers used an ensemble of climate models to simulate the concentrations of ozone and PM2.5 in the years 2000 and 1850. A total of 14 models simulated levels of ozone and six models simulated levels of PM2.5. Previous epidemiological studies were then used to assess how the specific concentrations of air pollution from the climate models related to current global mortality rates.

The researchers’ results were comparable to those of previous studies that have analysed air pollution and mortality; however, there was some variation depending on which climate model was used.

“We have also found that there is significant uncertainty based on the spread among different atmospheric models. This would caution against using a single model in the future, as some studies have done,” continued West.
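Studies of this kind combine modelled pollutant concentrations with epidemiological concentration-response relationships to estimate attributable deaths. A common form in air-quality health-impact assessment is a log-linear concentration-response function; the sketch below uses that generic form with made-up numbers, not the study's actual models or coefficients:

```python
import math

def attributable_deaths(baseline_rate: float, population: float,
                        beta: float, delta_conc: float) -> float:
    """Deaths attributable to a pollutant increase, using the log-linear
    concentration-response form common in air-quality health-impact
    assessment (illustrative only; not this study's exact model).

    baseline_rate: baseline deaths per person per year
    beta:          concentration-response coefficient (per ug/m3)
    delta_conc:    human-caused change in concentration (ug/m3)
    """
    # Fraction of baseline deaths attributable to the concentration change.
    attributable_fraction = 1.0 - math.exp(-beta * delta_conc)
    return baseline_rate * population * attributable_fraction

# Hypothetical inputs: 1% baseline mortality, 1 million people,
# beta = 0.006 per ug/m3, and a 10 ug/m3 human-caused PM2.5 increase.
print(round(attributable_deaths(0.01, 1_000_000, 0.006, 10)))  # 582
```

Running such a function over gridded model output for each region, once per climate model, is also what produces the model-to-model spread West cautions about.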


Outdated practice of annual cervical-cancer screenings may cause more harm than good

July 9, 2013 — For decades, women between the ages of 21 and 69 were advised to get annual screening exams for cervical cancer. In 2009, however, accumulating scientific evidence led major guideline groups to agree on a new recommendation that women be screened less frequently: every three years rather than annually.

Despite the revised guidelines, about half of the obstetrician-gynecologists surveyed in a recent study said they continue to provide annual exams — an outdated practice that may be more harmful than helpful, said Drs. Russell Harris and Stacey Sheridan of the Cecil G. Sheps Center for Health Services Research at the University of North Carolina at Chapel Hill.

“Screening is not the unqualified good that we have advertised it to be,” they wrote in an editorial titled “The Times They (May Be) A-Changin’: Too Much Screening is a Health Problem.” The editorial accompanied a research study reviewing physician practices around cervical-cancer screening and vaccination for human papillomavirus (HPV), which has been linked to cervical cancer. The study, “Physicians Slow to Implement HPV Vaccination and Cervical Screening Guidelines,” was published July 9 in the American Journal of Preventive Medicine.

“Screening for cervical cancer, and other cancers such as breast and prostate, has clear potential for harms as well as benefits, and these must be carefully weighed before a rational decision about screening can be made,” wrote Harris and Sheridan, who are professor and assistant professor of medicine, respectively, at UNC’s School of Medicine. They also hold adjunct appointments at UNC’s Gillings School of Global Public Health.

The study noted that physicians said they were comfortable with longer testing intervals but were concerned their patients might not come in for annual check-ups if Pap tests, the screening test for cervical cancer, were not offered.
The problem, Harris said, is that annual Pap tests produce more abnormal results, leading to additional, invasive testing that itself brings risks.

“Many women have ‘abnormal’ [Pap test] findings that are not cancer, but may be a ‘cancer precursor.’ We know that the great majority of these abnormal findings would never progress to actual invasive cancer, yet these women are referred” for further, more invasive testing, Harris said.

One such test, called a “colposcopy” [cohl-PAH-scoh-pee], involves examining the cervix for possibly cancerous lesions, followed frequently by a biopsy — taking a small sample of the lesion — which can cause pain and bleeding, as well as potential psychological harm. “The screening test itself can raise concern about dreaded cancer; a positive screening test heightens this worry; finding a cancer precursor, even one of uncertain importance, just increases worry further,” they wrote.

The authors recognize the important benefit of screening for cervical and other cancers, but note that “screening every three years [for cervical cancer] retains about 95 percent of the benefit of annual screening, but reduces harms by roughly two-thirds.” Less-frequent screening also reduces costs significantly in terms of patient and physician time, laboratory testing supplies and other resources.

The newest cervical-cancer and HPV screening recommendations were released in March 2012, too recent to have been included in the July 9 study. Women should still begin Pap tests at age 21 and repeat them every three years afterward, but women between the ages of 30 and 65 may choose to extend the Pap-test interval to every five years, provided they also get an HPV test, according to the U.S. Preventive Services Task Force and the American Cancer Society, among others.
However, the authors added, “the debate about a do-less approach to screening — for cervical cancer and other conditions as well — is ongoing.”

The editorial concluded: “Bob Dylan sang about changing times before they actually changed, yet his singing moved the public discussion in a positive direction. …
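The harm-reduction figure Harris and Sheridan cite has a simple intuition: if harms scale roughly with the number of screens, moving from annual to triennial screening cuts the screen count, and hence the harms, by about two-thirds. A toy calculation (the 30-year screening window is an illustrative assumption, not a figure from the study):

```python
# Screens accumulated over an illustrative 30-year screening window.
years = 30
annual_screens = years           # one Pap test per year
triennial_screens = years // 3   # one Pap test every three years

# If harms scale with the number of screens, the reduction is:
harm_reduction = 1 - triennial_screens / annual_screens
print(harm_reduction)  # about 0.67, the "roughly two-thirds" cited
```

The asymmetry the authors emphasize is that the benefit does not fall by the same fraction: triennial screening keeps about 95 percent of the benefit while shedding roughly two-thirds of the harms.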


3-D structures built out of liquid metal

July 9, 2013 — Researchers from North Carolina State University have developed three-dimensional (3-D) printing technology and techniques to create free-standing structures made of liquid metal at room temperature.

“It’s difficult to create structures out of liquids, because liquids want to bead up. But we’ve found that a liquid metal alloy of gallium and indium reacts to the oxygen in the air at room temperature to form a ‘skin’ that allows the liquid metal structures to retain their shapes,” says Dr. Michael Dickey, an assistant professor of chemical and biomolecular engineering at NC State and co-author of a paper describing the work.

The researchers developed multiple techniques for creating these structures, which can be used to connect electronic components in three dimensions. While it is relatively straightforward to pattern the metal “in plane” — meaning all on the same level — these liquid metal structures can also form shapes that reach up or down.

One technique involves stacking droplets of liquid metal on top of each other, much like a stack of oranges at the supermarket. The droplets adhere to one another, but retain their shape — they do not merge into a single, larger droplet. Another technique injects liquid metal into a polymer template, so that the metal takes on a specific shape. The template is then dissolved, leaving the bare liquid metal in the desired shape. The researchers also developed techniques for creating liquid metal wires, which retain their shape even when held perpendicular to the substrate.

Dickey’s team is currently exploring how to further develop these techniques, as well as how to use them in various electronics applications and in conjunction with established 3-D printing technologies.

“I’d also like to note that the work by an undergraduate, Collin Ladd, was indispensable to this project,” Dickey says.
“He helped develop the concept, and literally created some of this technology out of spare parts he found himself.”

The paper, “3-D Printing of Free Standing Liquid Metal Microstructures,” is published online in Advanced Materials. Ladd, a recent NC State graduate, is lead author. Co-authors are Dickey; former NC State Ph.D. …

