Under new guidelines from the Institute of Medicine, the estimated number of children who are at risk of having insufficient or deficient levels of vitamin D is drastically reduced from previous estimates, according to a Loyola University Chicago Stritch School of Medicine study.

The study, led by Holly Kramer, MD, MPH, and Ramon Durazo-Arvizu, PhD, is published online ahead of print in the Journal of Pediatric Endocrinology and Metabolism.

New Institute of Medicine guidelines say most people get sufficient vitamin D when their blood levels are at or above 20 nanograms per milliliter (ng/mL). The Pediatric Endocrine Society has a similar guideline. However, other guidelines recommend vitamin D levels above 30 ng/mL.

Loyola researchers studied vitamin D data from a nationally representative sample of 2,877 U.S. children and adolescents ages 6 to 18 who participated in the National Health and Nutrition Examination Survey.

The study found that under the Institute of Medicine guidelines, 10.3 percent of children ages 6 to 18 are at risk of inadequate or deficient vitamin D levels. (This translates to an estimated 5.5 million children.)

By comparison, a 2009 study in the journal Pediatrics, which defined sufficient vitamin D levels as greater than 30 ng/mL, found that an estimated 70 percent of people ages 1 to 21 had deficient or insufficient vitamin D levels.

Under previous guidelines, millions of children who had vitamin D levels between 20 and 30 ng/mL would have needed supplementation. Under the Institute of Medicine guidelines, children in this range no longer need to take vitamin D supplements.

The new study found that children at risk of vitamin D deficiency under the Institute of Medicine guidelines are more likely to be overweight, female, non-white and between the ages of 14 and 18.

The Institute of Medicine's new vitamin D guidelines are based on nearly 1,000 published studies and testimony from scientists and other experts.
The IOM found that vitamin D is essential for avoiding poor bone health, such as rickets. But studies on whether vitamin D can also protect against cancer, heart disease, autoimmune diseases and diabetes have produced mixed and conflicting results. Moreover, excessive vitamin D can damage the kidneys and heart, the IOM found.

Story Source: The above story is based on materials provided by Loyola University Health System. Note: Materials may be edited for content and length.
After the final play of the Super Bowl, millions of fans will go through withdrawal symptoms from not being able to watch football for months. Loyola University Medical Center psychiatrist Dr. Angelos Halaris describes the effects this has on the brain and offers tips on how fans can cope.

Halaris explains that when a person engages in a pleasurable activity, such as watching a football game, a neurotransmitter (brain chemical) called dopamine is released in a part of the brain called the nucleus accumbens. When the pleasurable activity ends, the person is left with a feeling of deprivation. It's similar to what a smoker feels when deprived of a cigarette, except there's no quick fix like a cigarette for the football fan.

"When the football season is over and there's no other game on the schedule for months, you're stuck, so you go through withdrawal," Halaris said. For hard-core fans, the feeling can be similar to post-holiday blues, he said.

Halaris offers these tips for fans who suddenly have to face months without football: Don't go cold turkey; watch football on YouTube, or on recordings, in gradually diminishing amounts. Share your feelings of withdrawal and letdown with a friend or spouse. While it can be unpleasant, football withdrawal is not serious enough to require antidepressants or other medications, and do not self-medicate with drugs or alcohol. Most important, buck up. "You're just going to have to basically tough it out until football starts up again," Halaris said.
Aug. 30, 2013 — Despite the NFL's $765 million settlement with retired players, there still is no credible scientific evidence that playing football causes Alzheimer's disease or other neurological disorders, according to Loyola University Medical Center clinical neuropsychologist Christopher Randolph, PhD, who has published multiple studies on the topic.

"The lawsuit is not a scientific issue, it's a legal and political issue," Randolph said. "There is absolutely no credible scientific data to suggest an increase of neurological risk from playing professional football."

Under the tentative settlement, the NFL would pay up to $5 million for each player who has Alzheimer's disease and up to $4 million for each death from chronic traumatic encephalopathy (CTE). But a recent study by Randolph and colleagues of retired NFL football players found no evidence that CTE even exists. The study was published in the Journal of the International Neuropsychological Society.

Randolph said there currently are no conclusive data that retired NFL players suffer a unique neuropathology. CTE is a vague condition, with no established clinical criteria and no consistent pathological criteria to diagnose it. And recent studies have found that NFL players have overall mortality rates that are only half of expected rates based upon men in the general population. Suicide rates are only about 40 percent of the rates in the general population.

"We still do not know if NFL players have an increased risk of late-life neurodegenerative disorders," Randolph said. "If there is a risk, it probably is not a great risk. And there is essentially no evidence to support the existence of any unique clinical disorder such as CTE."
Oct. 4, 2012 — While many marathon runners may be preoccupied with shin splints, chafing and blisters come race day, one thing they may not consider is their bladder health.
“The added stress on the body that comes with running a marathon can cause urinary stress incontinence problems during the race or down the road,” said Melinda Abernethy, MD, fellow, Division of Female Pelvic Medicine and Reconstructive Surgery, Loyola University Chicago Stritch School of Medicine. “People who already suffer from incontinence also are at risk for bladder-control issues while running.”
Urinary stress incontinence is the loss of urine during physical activity such as coughing, sneezing or running. It is the most common form of incontinence and affects women more often than men.
Researchers from Loyola University Health System will partner with the Chicago Area Runners Association to study the relationship between long-distance running and pelvic floor disorders.
“This study will help us to better understand the link between endurance running and pelvic floor disorders including incontinence,” Dr. Abernethy said.
Until we know more, Dr. Abernethy recommends that runners monitor their fluid intake and go to the bathroom at least every few hours during a marathon.
“Putting off going to the bathroom during the race is not healthy for your bladder,” Dr. Abernethy said. “Runners also should avoid diuretics, such as coffee or tea, before the race, because this can stimulate the bladder and cause you to visit the bathroom more frequently.”
Dr. Abernethy adds that pelvic floor exercises, such as Kegels, may help runners prevent urine leakage during the race. However, runners should speak with their doctor if they experience bladder-control problems during or after the marathon.