Aug. 14, 2013 — For a new study in the Journal of Animal Science, researchers evaluated how different levels of selenium affect the immune system of adult horses. According to the researchers, the effects of selenium supplementation on the immune system have been evaluated in other species but not extensively in horses.

Dr. Laurie Lawrence, animal science professor at the University of Kentucky, said the amount of selenium in soil and forages varies across the United States. She said the researchers wanted to know whether horses grazing pasture that was marginal in selenium would respond differently to vaccines compared with horses that received additional dietary selenium.

Mieke Brummer, a PhD student working with the University of Kentucky and Alltech, said she was interested in the effects of selenium on the immune system, specifically the cell-mediated component, which is responsible for activating immune cells that can directly attack foreign antigens.

In the experiment, twenty-eight adult horses were grouped by age and sex and then randomly assigned to a dietary selenium treatment. Brummer said the first phase lasted 35 weeks, during which the researchers aimed to deplete selenium stores in the horses assigned to the low-selenium diet. In the next 29 weeks, some of the horses assigned to the low treatment were given additional dietary selenium supplements.
Aug. 28, 2013 — Teachers who practice “mindfulness” are better able to reduce their own levels of stress and prevent burnout, according to a new study conducted by the Center for Investigating Healthy Minds (CIHM) at UW-Madison’s Waisman Center.

The results of the study, led by Assistant Scientist Lisa Flook, were recently published in the journal Mind, Brain and Education.

Mindfulness, a notion that stems from centuries-old meditative traditions and is now taught in a secular way, is a technique for heightening attention, empathy and other pro-social emotions through an awareness of thoughts, external stimuli, or bodily sensations such as the breath.

While teachers play a critical role in nurturing children’s well-being, progress in addressing teacher stress has been elusive. Stress and burnout among teachers are a major concern for school districts nationwide, affecting the quality of education and increasing the cost of recruiting and retaining teachers.

For the study, a group of 18 teachers was recruited to take part in a Mindfulness-Based Stress Reduction (MBSR) course, a well-established and well-studied method of mindfulness training. The project team adapted the MBSR training to fit the particular needs and time demands of elementary school teachers. It was among the first efforts to train teachers, in addition to students, in mindfulness techniques and to examine the effects of this training in the classroom.

“We wanted to offer training to teachers in a format that would be engaging and address the concerns that were specifically relevant to their role as teachers,” says Flook, who has advanced degrees in education and psychology and whose primary interest is in exploring strategies to reduce stress and promote well-being in children and adolescents.

Teachers randomly assigned to receive the training were asked to practice a guided meditation at home for at least 15 minutes per day.
They also learned specific strategies for preventing and dealing with stressors in the classroom, such as “dropping in,” a term describing the process of bringing attention to the sensations of breath and other physical sensations, thoughts, and emotions for brief periods of time. The training also included caring practices to bring kind awareness to their experiences, especially those that are challenging.

One of the goals of the study was to evaluate outcomes using measures that could be affected by mindfulness training. The researchers found that those who received the mindfulness training displayed reductions in psychological stress, improvements in classroom organization and increases in self-compassion. In comparison, the group that did not receive the training showed signs of increased stress and burnout over the course of the school year. These results provide objective evidence that MBSR techniques are beneficial to teachers.

“The most important outcome that we observed is the consistent pattern of results, across a range of self-report and objective measures used in this pilot study, that indicate benefits from practicing mindfulness,” says Flook, who also leads CIHM’s “Kindness Curriculum” study involving 4-year-old preschoolers.

Madison teacher Elizabeth Miller discovered that mindfulness is a meditative technique that does not require “just sitting still and trying to observe your thoughts,” which she said was difficult for her.
Aug. 17, 2013 — Researchers have urged the use of a wearable computing system installed in a helmet to protect construction workers from carbon monoxide poisoning, a serious and potentially lethal threat in the industry. The award for their work will be presented at the Institute of Electrical and Electronics Engineers (IEEE) Conference on Automation Science and Engineering, August 17-21, 2013.

Carbon monoxide poisoning is a significant problem for construction workers in both residential and industrial settings. The danger exists because exhaust from gasoline-powered hand tools can quickly build up in enclosed spaces and easily overcome the tools’ users and nearby co-workers.

In the paper, the researchers explain how they integrated a pulse oximetry sensor into a typical construction helmet to allow continuous, noninvasive monitoring of workers’ blood gas saturation levels. Their results showed that a user of this helmet would be warned of impending carbon monoxide poisoning with a probability greater than 99 percent.

The award-winning paper was written by Jason B. Forsyth of Durham, N.C., a Ph.D. candidate in computer engineering; his adviser Thomas L. Martin, professor of electrical and computer engineering; Deborah Young-Corbett, assistant professor of civil and environmental engineering and a member of the Myers-Lawson School of Construction; and Ed Dorsa, associate professor of industrial design.

Ten Virginia Tech students participated in the study, conducted on the university campus, mimicking simple tasks of construction workers. To show the feasibility of monitoring for carbon monoxide poisoning without subjecting the users to dangerous conditions, the researchers used a prototype that monitored blood oxygen saturation instead.
Monitoring for oxygen and monitoring for carbon monoxide differ only in the number of wavelengths of light employed, so if oxygen monitoring proved feasible, carbon monoxide monitoring would be feasible as well.

The researchers selected a helmet for the wearable computer because they needed a design that could be worn year-round, which ruled out seasonal clothing such as overalls or coats. They also wanted a design that was socially acceptable and that struck a balance between comfort, usability, and feasibility.

“This helmet is only a first step toward our long-term vision of having a network of wearable and environmental sensors and intelligent personal protective gear on construction sites that will improve safety for workers,” according to their report.
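The pulse-oximetry principle behind the helmet can be sketched in a few lines. The "ratio of ratios" and the linear calibration below are the standard textbook formulation, not the authors' actual implementation; the signal values and calibration constants are illustrative assumptions.

```python
# Sketch of the classic pulse-oximetry "ratio of ratios" computation.
# A real device separates each photodetector signal into a pulsatile (AC)
# and a baseline (DC) component at two wavelengths, e.g. red (~660 nm)
# and infrared (~940 nm); adding further wavelengths is what extends the
# method from oxygen monitoring to carbon monoxide monitoring.

def spo2_estimate(ac_red: float, dc_red: float,
                  ac_ir: float, dc_ir: float) -> float:
    """Estimate blood oxygen saturation (percent) from AC/DC intensities."""
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    # Common textbook linear calibration; commercial devices use an
    # empirically fitted curve instead.
    return 110.0 - 25.0 * r

# Illustrative reading: a ratio R of 0.8 maps to roughly 90 percent SpO2,
# a level at which a helmet-mounted system might issue a warning.
print(round(spo2_estimate(0.02, 1.0, 0.025, 1.0), 1))
```

The helmet's actual alarm logic and calibration are not described in this summary; the sketch only shows why extra wavelengths are the key difference between the two measurements.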
Aug. 19, 2013 — When enough raindrops fall over land instead of the ocean, they begin to add up. New research led by the National Center for Atmospheric Research (NCAR) shows that when three atmospheric patterns came together over the Indian and Pacific oceans, they drove so much precipitation over Australia in 2010 and 2011 that the world’s ocean levels dropped measurably. Unlike on other continents, the soils and topography of Australia prevent almost all of its precipitation from running off into the ocean.

The 2010-11 event temporarily halted a long-term trend of rising sea levels caused by higher temperatures and melting ice sheets. Now that the atmospheric patterns have snapped back and more rain is falling over tropical oceans, the seas are rising again. In fact, with Australia in a major drought, they are rising faster than before.

“It’s a beautiful illustration of how complicated our climate system is,” says NCAR scientist John Fasullo, the lead author of the study. “The smallest continent in the world can affect sea level worldwide. Its influence is so strong that it can temporarily overcome the background trend of rising sea levels that we see with climate change.”

The study, with co-authors from NASA’s Jet Propulsion Laboratory and the University of Colorado at Boulder, will be published next month in Geophysical Research Letters. It was funded by the National Science Foundation, which is NCAR’s sponsor, and by NASA.

Consistent rising, interrupted

As the climate warms, the world’s oceans have been rising in recent decades by just over 3 millimeters (0.1 inches) annually.
This is partly because heat causes water to expand, and partly because runoff from retreating glaciers and ice sheets is making its way into the oceans. But for an 18-month period beginning in 2010, the oceans mysteriously dropped by about 7 millimeters (about 0.3 inches), more than offsetting the annual rise.

Fasullo and his co-authors published research last year demonstrating that the reason had to do with increased rainfall over tropical continents. They also showed that the drop coincided with the atmospheric oscillation known as La Niña, which cooled tropical surface waters in the eastern Pacific and suppressed rainfall there while enhancing it over portions of the tropical Pacific, Africa, South America, and Australia. But an analysis of the historical record showed that past La Niña events only rarely accompanied such a pronounced drop in sea level.

Using a combination of satellite instruments and other tools, the new study finds that the picture in 2010-11 was uniquely complex.
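The magnitudes quoted above can be checked with back-of-envelope arithmetic. The ocean surface area used below is a standard approximate figure, not a number from the study.

```python
# Rough volume of water that must move onto land to drop global sea level
# by 7 mm while also offsetting the ongoing ~3 mm/yr rise over 18 months.

OCEAN_AREA_M2 = 3.6e14        # approximate global ocean surface area
RISE_MM_PER_YEAR = 3.0        # background rise quoted in the article
DROP_MM = 7.0                 # observed 2010-11 drop
PERIOD_YEARS = 1.5            # the 18-month period

expected_rise_mm = RISE_MM_PER_YEAR * PERIOD_YEARS  # rise that never appeared
total_deficit_mm = expected_rise_mm + DROP_MM       # ~11.5 mm in all
volume_km3 = OCEAN_AREA_M2 * (total_deficit_mm / 1000.0) / 1e9

print(f"Sea-level deficit: {total_deficit_mm:.1f} mm")
print(f"Equivalent water volume: ~{volume_km3:.0f} km^3")
```

On these assumptions, roughly four thousand cubic kilometers of extra water had to be stored on land, which gives a sense of the scale of the Australian rainfall anomaly.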
Aug. 7, 2013 — A joint Group Health-University of Washington (UW) study in the New England Journal of Medicine has found that higher blood sugar levels are associated with higher dementia risk, even among people who do not have diabetes.

Blood sugar levels averaged over a five-year period were associated with rising risks for developing dementia in this report on more than 2,000 Group Health patients age 65 and older in the Adult Changes in Thought (ACT) study. For example, in people without diabetes, the risk of dementia was 18 percent higher for people with an average glucose level of 115 milligrams per deciliter than for those with an average glucose level of 100 mg/dl. And in people with diabetes, whose blood sugar levels are generally higher, dementia risk was 40 percent higher for people with an average glucose level of 190 mg/dl than for those with an average glucose level of 160 mg/dl.

“The most interesting finding was that every incrementally higher glucose level was associated with a higher risk of dementia in people who did not have diabetes,” said first author Paul K. Crane, MD, MPH, an associate professor of medicine at the UW School of Medicine, adjunct associate professor of health services at the UW School of Public Health, and affiliate investigator at Group Health Research Institute. “There was no threshold value for lower glucose values where risk leveled off.”

“One major strength of this research is that it is based on the ACT study, a longitudinal cohort study, where we follow people for many years as they lead their lives,” said senior author Eric B. Larson, MD, MPH, a senior investigator at Group Health Research Institute who also has appointments at the UW Schools of Medicine and Public Health. “We combine information from people’s research visits every other year with data from their visits to Group Health providers whenever they receive care.
And this gave us an average of 17 blood sugar measurements per person: very rich data.”

These measurements included blood glucose (some fasting, some not) and glycated hemoglobin (also known as HbA1c). Blood sugar levels rise and fall in peaks and valleys throughout each day, but glycated hemoglobin doesn’t vary as much over short intervals.
July 30, 2013 — Patients with diabetes who take certain types of medications to lower their blood sugar sometimes experience severe low blood-sugar levels, whether their diabetes is poorly or well controlled, according to a new study by Kaiser Permanente and Yale University School of Medicine. The finding, published in the current online issue of Diabetes Care, challenges the conventional wisdom that hypoglycemia is primarily a problem among patients with well-controlled diabetes (who have low average blood-sugar levels).

Low blood sugar, or hypoglycemia, can cause unpleasant symptoms but is typically treatable with food or a sweet drink. Severe hypoglycemia occurs when blood sugar gets so low that a patient needs assistance, and may result in dizziness or mental confusion, injury, car accidents, coma or, rarely, even death. Several recent studies have found that patients who experienced severe hypoglycemia were also at higher risk for dementia, falls, fractures and heart attacks, compared with patients who did not experience hypoglycemia.

“Many clinicians may assume that hypoglycemia is not much of a problem in poorly controlled type 2 diabetes, given these patients’ high average blood-sugar levels,” said senior author and study principal investigator Andrew Karter, PhD, of the Kaiser Permanente Division of Research. “This study suggests that we should pay much closer attention to hypoglycemia, even in poorly controlled patients. Providers should explain the symptoms of hypoglycemia, how to treat it, and how to avoid it — for example, by not skipping meals. Most of all, providers should ask all their diabetic patients whether they’ve experienced hypoglycemia, even those patients with very high average levels of blood sugar.”

The researchers surveyed patients with type 2 diabetes being treated with medications to lower their blood sugar and asked about their experiences with severe hypoglycemia.
Nearly 11 percent of the more than 9,000 respondents had experienced severe hypoglycemia in the prior year, and it occurred at all levels of blood-sugar control.

The researchers grouped patients into five categories of HbA1c, a measure of average blood sugar, ranging from lowest to highest, and calculated the prevalence of severe hypoglycemia for each category. Patients with the lowest and highest HbA1c values tended to be at higher risk for hypoglycemia than those with HbA1c values in the middle range.
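The category-by-category analysis described above can be sketched in a few lines. The HbA1c cutoffs and the sample records below are hypothetical, chosen only for illustration; the study's actual boundaries and data are not given in this summary.

```python
import bisect

# Hypothetical HbA1c cutoffs for illustration; the study's actual category
# boundaries are not given in this summary.
CUTOFFS = [6.0, 7.0, 8.0, 9.0]                               # percent HbA1c
LABELS = ["<6.0", "6.0-6.9", "7.0-7.9", "8.0-8.9", ">=9.0"]

def hba1c_category(value: float) -> str:
    """Place an HbA1c value into one of five ordered categories."""
    return LABELS[bisect.bisect_right(CUTOFFS, value)]

def prevalence_by_category(records):
    """Fraction of patients reporting severe hypoglycemia per category.

    `records` is an iterable of (hba1c, had_severe_hypoglycemia) pairs.
    """
    counts = {label: [0, 0] for label in LABELS}             # [events, total]
    for hba1c, event in records:
        cat = hba1c_category(hba1c)
        counts[cat][0] += int(event)
        counts[cat][1] += 1
    return {label: (e / n if n else 0.0) for label, (e, n) in counts.items()}

# Hypothetical records: events at both extremes, none in the middle,
# mirroring the U-shaped pattern the study reports.
sample = [(5.4, True), (7.2, False), (7.8, False), (9.5, True)]
print(prevalence_by_category(sample))
```

This mirrors the reported U-shaped pattern: prevalence is highest in the lowest and highest HbA1c categories and lower in the middle range.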
July 19, 2013 — Kidney patients who take calcium supplements to lower their phosphorus levels may be at a 22 per cent higher risk of death than those who take non-calcium-based treatments, according to a new study by Women’s College Hospital’s Dr. Sophie Jamal.

The study, published today in the Lancet, calls into question the long-time practice of prescribing calcium to lower phosphate levels in patients with chronic kidney disease. The researchers suggest some of the calcium is absorbed into the bloodstream and may expedite hardening of the arteries, leading to a higher risk of heart disease and even death. Cardiovascular disease is a leading cause of death for people with chronic kidney disease.

“Doctors commonly prescribe calcium supplements to prevent elevated phosphate levels, which can damage the body, but a growing number of studies have shown calcium supplements may actually increase the risk of heart disease,” explains Dr. Sophie Jamal, a physician at Women’s College Hospital and an associate professor of medicine at the University of Toronto. “Our study validates these claims and, for the first time, shows the long-term consequences of taking calcium supplements can be dangerous for patients with kidney disease.”

As part of their analysis, the researchers reviewed 11 randomized, controlled studies that included more than 4,600 patients. They assessed the risk of heart disease, including heart attack, stroke, and hardening of the arteries, along with death, among individuals prescribed medication containing calcium and those prescribed medication without calcium. They found a 22 per cent reduction in death among patients who took the non-calcium-based treatments sevelamer and lanthanum, and less artery calcification (hardening) in patients who did not take calcium supplements.
“Some researchers and physicians have been saying for years that kidney disease patients need to get off calcium; now we think our review shows there is much more solid evidence to argue for that change to clinical practice,” says the study’s senior author Ross Tsuyuki of the University of Alberta’s faculty of medicine and dentistry.

In the meantime, given the study’s findings, the researchers suggest non-calcium-containing treatments be used as a first-line treatment to lower phosphate in patients with chronic kidney disease. “The findings of our study provide the best evidence as to what doctors should be prescribing their patients, but further research is necessary to help us understand how exactly calcium increases the risk of death, whether non-calcium-based treatments reduce the risk of death, and whether certain types of treatments may be more effective and beneficial than others,” says Dr. …
July 11, 2013 — Delaying clamping of the umbilical cord after birth benefits newborn babies, according to a systematic review published in The Cochrane Library. The authors found babies’ blood and iron levels were healthier when the cord was clamped later.

In many high-income countries, it is standard practice to clamp the umbilical cord connecting mother and baby less than a minute after birth. However, clamping the cord too soon may reduce the amount of blood that passes from mother to baby via the placenta, affecting the baby’s iron stores. On the other hand, delayed cord clamping, carried out more than a minute after birth, may slightly increase the risk of jaundice. The World Health Organization now recommends cord clamping between one and three minutes after birth.

The researchers reviewed data from 15 trials involving a total of 3,911 women and their babies. They looked at outcomes for mothers and babies separately, using haemoglobin concentrations as an indicator of healthy blood and iron levels. While clamping the cord later made no difference to mothers’ risk of haemorrhage, blood loss or haemoglobin levels, babies were healthier in a number of respects. When cord clamping was delayed, babies had higher haemoglobin levels between one and two days after birth and were less likely to be iron-deficient three to six months after birth.
Birth weight was also higher with delayed cord clamping.

“In light of growing evidence that delayed cord clamping increases early haemoglobin concentrations and iron stores in infants, a more liberal approach to delaying clamping of the umbilical cord in healthy babies appears to be warranted,” said Philippa Middleton, one of the authors of the review, based at the Australian Research Centre for Health of Women and Babies, Robinson Institute at the University of Adelaide in Adelaide, Australia.

Clamping the cord later did lead to a slightly higher number of babies needing treatment for jaundice, which is typically treated with light therapy. “The benefits of delayed cord clamping need to be weighed against the small additional risk of jaundice in newborns,” said Middleton.
June 3, 2013 — As ice sheets melted during the deglaciation of the last ice age and global oceans warmed, oceanic oxygen levels decreased and “denitrification” accelerated by 30 to 120 percent, a new international study shows, creating oxygen-poor marine regions and throwing the oceanic nitrogen cycle off balance. By the end of the deglaciation, however, the oceans had adjusted to their new, warmer state and the nitrogen cycle had stabilized — though it took several millennia.

Recent increases in global warming, thought to be caused by human activities, are raising concerns that denitrification may adversely affect marine environments over the next few hundred years, with potentially significant effects on ocean food webs. Results of the study have been published this week in the journal Nature Geoscience.

“The warming that occurred during deglaciation some 20,000 to 10,000 years ago led to a reduction of oxygen gas dissolved in sea water and more denitrification, or removal of nitrogen nutrients from the ocean,” explained Andreas Schmittner, an Oregon State University oceanographer and author on the Nature Geoscience paper. “Since nitrogen nutrients are needed by algae to grow, this affects phytoplankton growth and productivity, and may also affect atmospheric carbon dioxide concentrations.”

“This study shows just what happened in the past, and suggests that decreases in oceanic oxygen that will likely take place under future global warming scenarios could mean more denitrification and fewer nutrients available for phytoplankton,” Schmittner added.

In their study, the scientists analyzed more than 2,300 seafloor core samples and created 76 time series of nitrogen isotopes in those sediments spanning the past 30,000 years. They discovered that during the last glacial maximum, Earth’s nitrogen cycle was at a near steady state.
In other words, the amount of nitrogen nutrients added to the oceans — known as nitrogen fixation — was sufficient to compensate for the amount lost to denitrification. A lack of nitrogen can essentially starve a marine ecosystem by not providing enough nutrients. Conversely, too much nitrogen can create an excess of plant growth that eventually decays and uses up the oxygen dissolved in sea water, suffocating fish and other marine organisms.

Following the period of enhanced denitrification and nitrogen loss during deglaciation, the world’s oceans slowly moved back toward a state of near stabilization. But there are signs that recent rates of global warming may be pushing the nitrogen cycle out of balance. “Measurements show that oxygen is already decreasing in the ocean,” Schmittner said. “The changes we saw during deglaciation of the last ice age happened over thousands of years. But current warming trends are happening at a much faster rate than in the past, which almost certainly will cause oceanic changes to occur more rapidly.” “It still may take decades, even centuries, to unfold,” he added.

Schmittner and Christopher Somes, a former graduate student in the OSU College of Earth, Ocean, and Atmospheric Sciences, developed a model of nitrogen isotope cycling in the ocean and compared it with the nitrogen measurements from the seafloor sediments. Their sensitivity experiments with the model helped to interpret the complex patterns seen in the observations.

This study was supported by the National Science Foundation.
May 6, 2013 — New scientific results show that Arctic foxes accumulate dangerous levels of mercury if they live in coastal habitats and feed on prey that lives in the ocean. Researchers from the Leibniz Institute for Zoo and Wildlife Research, Moscow State University and the University of Iceland just published their discovery in the online journal PLOS ONE.
Mercury is usually transferred up the food chain, so the researchers identified the main items in the foxes’ diet and measured mercury levels in the main prey of Arctic foxes.
The scientists compared three fox populations in different environments. Foxes on the small Russian Commander Island of Mednyi fed almost exclusively on sea birds, with some foxes eating seal carcasses. In Iceland, foxes living on the coast ate sea birds whereas those living inland ate non-marine birds and rodents.
In all three environments, different levels of mercury were present in the foxes’ hair. Foxes living in coastal habitats, on Mednyi Island and on the Icelandic coast, exhibited high levels of mercury.
What does this mean for the foxes? Using museum skin samples from the Commander Islands, the researchers could show that the foxes suffered exposure to mercury for a long time. The researchers confirmed that the source of contamination was their food, as they measured high mercury levels in the prey of foxes such as seals and sea birds.
However, the inland Arctic fox populations of Iceland had low mercury levels. Living inland and eating non-marine birds and rodents, rather than prey that feeds from the sea, thus protected the inland foxes from mercury exposure. This may have health and conservation implications. The Mednyi Island foxes are almost the opposite case to the inland Icelandic population: they live on a small island with no rodents and no alternative food source to seals and sea birds. They suffered a tremendous population crash, and while the population is currently stable, it is very small, and juvenile foxes in particular show high mortality rates. Foxes of all ages exhibit low body weight and poor coat condition.
“When going into this project we thought that an introduced pathogen would explain the poor condition of the foxes and their high mortality, but after extensive screening we did not find anything,” says Alex Greenwood, principal investigator of the study. Instead, the researchers began to suspect that something else was at play. “If pathogens were not the cause, we thought perhaps pollutants could be involved. We thought of mercury because it has been reported in high concentrations in other Arctic vertebrates, including in remote areas, and mercury intoxication is known to increase mortality in mammals. As mercury can have negative effects on overall health, particularly in young individuals, and as we knew that Mednyi foxes were feeding exclusively on potentially contaminated sources, we wanted to see whether contamination with mercury depended on feeding ecology and hence might have been the crucial factor in the population decline on Mednyi Island,” comments Gabriele Treu, one of the lead authors of the study. As it turned out, the observed high mercury levels were tightly associated with the feeding ecology and geographical distribution of the foxes.
For conservation and the long-term population health of the entire Arctic carnivore food chain, the researchers conclude, mercury pollution must be stopped.
May 10, 2012 — With high UV levels continuing in Queensland this autumn, young people are at risk of suffering the worst skin damage they will receive during their lifetime, research from Queensland University of Technology (QUT) has found.
Researcher Professor Michael Kimlin from QUT’s AusSun Research Lab said the study found UV exposure during a person’s first 18 years of life was the most critical for cancer-causing skin damage and skin aging.
Professor Kimlin said while people aged over 50 had the slowest rate of skin degradation, results indicated that damage still occurred even at that age, so lifetime sun protection was important.
The study used a unique, non-invasive “UV camera,” which took images of skin damage and aging invisible to the naked eye, to measure the relationship between lifetime sun exposure and skin cancer risk.
Professor Kimlin said the majority of skin damage occurred in the early years of sun exposure, with a much slower increase in damage in subsequent years, particularly over the age of 50.
“We looked at how age impacted on the skin damage we saw and found it’s not a simple one to one relationship,” said Professor Kimlin.
“The message from this research is to look after your skin when you are a child and teenager to prevent wrinkles and skin damage.
“Sun protection when you are young sets you on a lifetime of good skin health.”
One hundred and eighty people aged 18 to 83 were imaged with the UV camera and interviewed to determine the level of their sun exposure.
The study measured hyperpigmentation of the skin to determine the level of damage, and wrinkles to indicate skin aging.
Professor Kimlin said using the UV camera meant people’s skin could be examined for skin cancer risk factors without an invasive biopsy.