Facial transplantation: Almost a decade out, surgeons prepare for burgeoning demand

Plastic and reconstructive surgeons leading the first retrospective study of all known facial transplants worldwide conclude that the procedure is relatively safe, increasingly feasible, and a clear life-changer that can and should be offered to far more carefully selected patients.

Reporting in The Lancet online April 27, NYU Langone plastic and reconstructive surgeon and senior author Eduardo Rodriguez, MD, DDS, says results after nearly a decade of experience with what he calls the "Mount Everest" of medical-surgical treatments are "highly encouraging." The review team noted that the transplants still pose lifelong risks and complications from infection and sometimes toxic immunosuppressive drugs, but also are highly effective at restoring people to fully functioning lives after physically disfiguring and socially debilitating facial injuries.

Surgeons base their claims on the experience of 28 people known to have had full or partial face transplants since 2005, when the first such procedure was performed on a woman in France. Of the 22 men and six women whose surgeries were reported, including seven Americans, none have chronically rejected their new organs and tissues, says Dr. Rodriguez, chair of the Department of Plastic Surgery at NYU Langone Medical Center and director of its Institute of Reconstructive Plastic Surgery. All but three recipients are still living. Four have returned to work or school.

Dr. Rodriguez, the Helen L. Kimmel Professor of Reconstructive Plastic Surgery at NYU Langone, in 2012 performed what is widely considered the most extensive facial transplant (when he practiced at the University of Maryland Medical Center in Baltimore). The patient was a Virginia man who had lost the lower half of his face in a gunshot accident 10 years earlier. Dr. Rodriguez is currently readying his new team at NYU Langone to perform its first facial transplantation, expected later this year.

In The Lancet article, Dr. Rodriguez and his colleagues point out that although all recipients to date have experienced some complications from infection, and mild to moderate signs of rejection, the few deaths among patients were due to infection and cancer not directly related to their transplants. …


Improving understanding of valley-wide stream chemistry

A geostatistical approach for studying environmental conditions in stream networks and landscapes has been successfully applied at a valley-wide scale to assess headwater stream chemistry at high resolution, revealing unexpected patterns in natural chemical components.

"Headwater streams make up the majority of stream and river length in watersheds, affecting regional water quality," said Assistant Professor Kevin J. McGuire, associate director of the Virginia Water Resources Research Center in Virginia Tech's College of Natural Resources and Environment. "However, the actual patterns and causes of variation of water quality in headwater streams are often unknown."

"Understanding the chemistry of these streams at a finer scale could help to identify factors impairing water quality and help us protect aquatic ecosystems," said Gene E. Likens, president emeritus and distinguished senior scientist emeritus with the Cary Institute of Ecosystem Studies and the University of Connecticut.

Results of the study, which used a new statistical tool to describe spatial patterns of water chemistry in stream networks, are published in the April 21 issue of the Proceedings of the National Academy of Sciences by a team of ecosystem scientists, including McGuire and Likens.

The data used in the new analysis consist of 664 water samples collected every 300 feet throughout all 32 tributaries of the 14-square-mile Hubbard Brook Valley in New Hampshire. The chemistry results were originally reported in 2006 in the journal Biogeochemistry by Likens and Donald C. Buso, manager of field research with the Cary Institute. McGuire and other members of the National Science Foundation's Long-Term Ecological Research team at the Hubbard Brook Ecosystem Study decided that the huge, high-resolution dataset was ideal for a new statistical approach that examines how water flows both within the stream network and across the landscape.

"The goal was to visualize patterns that no one has been able to quantify before now and describe how they vary within headwater stream networks," said McGuire. "Some chemical constituents vary at a fine scale; that is, patterns of chemical change occur over very short distances, for example several hundred feet. But some constituents vary over much larger scales, for example miles. Several chemical constituents that we examined even varied at multiple scales, suggesting that nested processes within streams and across the landscape influence the chemistry of stream networks."

"The different spatial relationships permit the examination of patterns controlled by landscape versus stream network processes," the article reports. Straight-line and unconnected network spatial relationships indicate landscape influences, such as soil, geology, and vegetation controls of water chemistry. In contrast, flow-connected relationships provide information on processes occurring within the flowing streams. The researchers are very familiar with the Hubbard Brook Valley and could point to the varying influences of the geology and distinct soil types, including areas of shallow, acidic, organic-rich soils.

The findings revealed by the analysis technique showed how chemistry patterns vary across landscapes at two scales of variation, one around 1,500 feet and another at about 4 miles. …
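One way to see variation "at multiple scales" is an empirical semivariogram, which summarizes how different two samples tend to be as a function of the distance separating them. The sketch below is illustrative only, not the authors' stream-network method: it uses made-up concentrations sampled every 300 feet along a single transect and straight-line distances rather than flow-connected network distances.

import numpy as np

rng = np.random.default_rng(0)
distance_ft = np.arange(0, 20000, 300)   # hypothetical sample positions along one tributary
concentration = 2.0 + 0.5 * np.sin(distance_ft / 1500) + rng.normal(0, 0.1, distance_ft.size)

def empirical_semivariogram(x, z, lags, tol=150):
    # gamma(h) = 0.5 * mean[(z_i - z_j)^2] over pairs whose separation falls within tol of lag h
    gammas = []
    for h in lags:
        sq_diffs = [(z[i] - z[j]) ** 2
                    for i in range(len(x)) for j in range(i + 1, len(x))
                    if abs(abs(x[i] - x[j]) - h) <= tol]
        gammas.append(0.5 * np.mean(sq_diffs) if sq_diffs else np.nan)
    return np.array(gammas)

lags = np.arange(300, 6000, 300)   # lag distances in feet
print(np.round(empirical_semivariogram(distance_ft, concentration, lags), 3))

A semivariogram that flattens out near a 1,500-foot lag, then rises again toward multi-mile lags, would be one signature of the nested, multi-scale structure the researchers describe.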


Amazon Studied to Predict Impact of Climate Change

Three extreme weather events in the Amazon Basin in the last decade are giving scientists an opportunity to make observations that will allow them to predict the impacts of climate change and deforestation on some of the most important ecological processes and ecosystem services of the Amazon River wetlands.

Scientists from Virginia Tech, the Woods Hole Research Center, and the University of California, Santa Barbara, funded by NASA, are collaborating with Brazilian scientists to explore the ecosystem consequences of the extreme droughts of 2005 and 2010 and the extreme flood of 2009.

"The research fills an important gap in our understanding of the vulnerability of tropical river-forest systems to changes in climate and land cover," said the project's leader, Leandro Castello, assistant professor of fish and wildlife conservation in Virginia Tech's College of Natural Resources and Environment.

The huge study area encompasses 1.7 million square miles, the equivalent of half of the continental United States. In addition to historical records and ground observations, the researchers will use newly available Earth System Data Records from NASA — satellite images of the Amazon and its tributaries over the complete high- and low-water cycles. NASA is funding the study with a $1.53 million grant shared among the three institutions.

"Amazon floodplains and river channels — maintained by seasonal floods — promote nutrient cycling and high biological production, and support diverse biological communities as well as human populations with one of the highest per capita rates of fish consumption," said Castello.

The researchers will look at how the natural seasonality of river levels influences aquatic and terrestrial grasses, fisheries, and forest productivity in the floodplains, and how extreme events such as floods and droughts may disturb this cycle.

"We are confident that deforestation and climate change will, in the future, lead to more frequent and severe floods and droughts," said Michael Coe, a senior scientist at the Woods Hole Research Center. "It is important that we understand how the Amazon River and ecosystem services such as fisheries are affected so that we can devise mitigation strategies."

Amazonian grasses, sometimes called macrophytes, convert atmospheric carbon to plant biomass, which is then processed by aquatic microorganisms upon decomposition.

"Terrestrial grasses grow during the short window when water levels are low, sequestering some carbon, and then die when the floods arrive, releasing the carbon into the aquatic system," said Thiago Silva, an assistant professor of geography at São Paulo State University in Rio Claro, Brazil. "They are followed by aquatic grasses that need to grow extremely fast to surpass the rising floods and then die off during the receding-water period."

"Although most of the macrophyte carbon is released back to the atmosphere in the same form that it is assimilated, carbon dioxide, some of it is actually exported to the ocean as dissolved carbon or released to the atmosphere as methane, a gas that has a warming potential 20 times larger than carbon dioxide," said John Melack, a professor at the University of California, Santa Barbara.

Researchers will measure plant growth and gas exchange, and use photographs from the field and satellites.

Two other Amazon resources — fisheries and forests — are important to the livelihood of the people of the region.

"We will combine water level, fishing effort, and fish life-history traits to understand the impact of droughts and floods on fishery yields," said Castello, whose specialty is Amazon fisheries. "Floods in the Amazon are almost a blessing because in some years they can almost double the amount of fish in the river that is available for fishermen and society."

The fishery data include approximately 90,000 annual interview records of fisheries activities on the number of fishers, time spent fishing, characteristics of fishing boats and gear used, and weight of the catch for 40 species. The hydrological data include daily water level measurements recorded in the Madeira, Purus, and Amazonas-Solimões rivers.

The researchers will examine the potential impact of future climate scenarios on the extent and productivity of floodplain forests — those enriched by rising waters, called whitewater river forests, and nutrient-poor blackwater river forests. For example, extreme droughts may reduce productivity due to water stress and increases in the frequency and severity of forest fires. Prolonged periods of inundation, on the other hand, may decrease productivity or increase mortality due to water-logging stress.

"We will evaluate these responses for the first time at a regional scale using remotely sensed indicators of vegetation condition and fire-induced tree mortality to measure the response of floodplain forests to inter-annual flood variability and extreme climate events," said Marcia Macedo, a research associate at the Woods Hole Research Center.

Researchers will measure tree litter dry weight, depth of flooding, tree height and diameter, and stand density. They will also use photographs and satellite images.

Previous research has focused on Amazon upland forests and the potential impacts of deforestation, fire, and drought. The research team will compare new greenhouse gas simulations to previous simulations.

"Our research informs large river ecology globally because natural flowing rivers like the Amazon are rare these days, and most research to date, being done in North America and Europe, has focused on degraded systems," Castello said.


Bats inspire ‘micro air vehicle’ designs: Small flying vehicles, complete with flapping wings, may now be designed

By exploring how creatures in nature are able to fly by flapping their wings, Virginia Tech researchers hope to apply that knowledge toward designing small flying vehicles, known as "micro air vehicles," with flapping wings.

More than 1,000 species of bats have hand membrane wings, meaning that their fingers are essentially "webbed" and connected by a flexible membrane. But understanding how bats use their wings to manipulate the air around them is extremely challenging — primarily because both experimental measurements on live creatures and the related computer analysis are quite complex.

In Virginia Tech's study of fruit bat wings, the researchers used experimental measurements of the movements of the bats' wings in real flight, and then used analysis software to see the direct relationship between wing motion and airflow around the bat wing. They report their findings in the journal Physics of Fluids.

"Bats have different wing shapes and sizes, depending on their evolutionary function. Typically, bats are very agile and can change their flight path very quickly — showing high maneuverability for midflight prey capture, so it's of interest to know how they do this," explained Danesh Tafti, the William S. Cross Professor in the Department of Mechanical Engineering and director of the High Performance Computational Fluid Thermal Science and Engineering Lab at Virginia Tech. To give an idea of the size of a fruit bat: it weighs roughly 30 grams, and a single fully extended wing measures about 17 x 9 cm, according to Tafti.

Among the biggest surprises for the researchers was how the bat manipulates its wing motion with the correct timing to maximize the forces generated by the wing. "It distorts its wing shape and size continuously during flapping," Tafti noted. For example, the bat increases the area of the wing by about 30 percent to maximize favorable forces during the downward movement of the wing, and it decreases the area by a similar amount on the way up to minimize unfavorable forces. The force coefficients generated by the wing are "about two to three times greater than a static airfoil wing used for large airplanes," said Kamal Viswanath, a co-author who was a graduate research assistant working with Tafti when the work was performed and is now a research engineer at the U.S. Naval Research Lab's Laboratories for Computational Physics and Fluid Dynamics.

This study was just an initial step in the researchers' work. "Next, we'd like to explore deconstructing the seemingly complex motion of the bat wing into simpler motions, which is necessary to make a bat-inspired flying robot," said Viswanath. The researchers also want to keep the wing motion as simple as possible, but with the same force production as that of a real bat. "We'd also like to explore other bat wing motions, such as a bat in level flight or a bat trying to maneuver quickly, to answer questions including: What are the differences in wing motion and how do they translate to air movement and forces that the bat generates? …
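For readers unfamiliar with the quantity being compared, a force coefficient normalizes an aerodynamic force by dynamic pressure and wing area, roughly C_F = F / (0.5 * rho * U^2 * S). The back-of-envelope sketch below is illustrative only: the flight speed and force are assumed values, not measurements from the study, and only the roughly 30 percent area change and the 17 x 9 cm wing dimensions come from the article.

rho = 1.2           # air density, kg/m^3
U   = 5.0           # assumed flight speed, m/s (not from the article)
S   = 0.17 * 0.09   # one extended wing, ~17 cm x 9 cm (from the article), m^2

S_down = S * 1.3    # wing area grows ~30% on the downstroke (per the article)
S_up   = S * 0.7    # and shrinks by a similar amount on the upstroke

F = 0.30            # assumed instantaneous aerodynamic force on one wing, N
for label, area in [("downstroke", S_down), ("upstroke", S_up)]:
    C_F = F / (0.5 * rho * U**2 * area)
    print(f"{label}: C_F = {C_F:.2f}")

The point of the normalization is that, for the same force, a smaller effective wing area yields a larger coefficient, which is why active area modulation matters when comparing flapping wings against a fixed airfoil.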


First stroke guidelines for women created

While stroke occurrence has been on a consistent decline in the United States since the early 1900s, more women are still dying from strokes than men. To aid in curbing these deaths, first-of-their-kind stroke-prevention guidelines for women have been released with the help of one University of Alabama at Birmingham expert.

Stroke is the fourth-leading cause of death for all Americans, and 60 percent of strokes occur in women, according to the American Stroke Association.

"Men are physiologically different from women, so preventive tips cannot be one-size-fits-all," explained Virginia Howard, Ph.D., co-author of the new scientific statement, Guidelines for the Prevention of Stroke in Women, published by the American Heart Association and American Stroke Association Council on Stroke in the AHA journal Stroke.

"There are many considerations about stroke that might be different for women: reproductive factors and risk factors more common or stronger in women, like diabetes and atrial fibrillation, might get lost in a general guidelines document," said Howard, UAB professor of epidemiology and a lead investigator for the long-running Reasons for Geographic and Racial Differences in Stroke (REGARDS) study, the nation's largest study aimed at exploring racial and geographic differences in stroke risk factors and stroke occurrence.

The guidelines report stroke risks unique to women and provide scientifically based recommendations on how best to treat them, including:

• Women should be screened for high blood pressure before being prescribed birth control pills, which raise blood pressure in some women.

• Women with a history of high blood pressure before pregnancy should be considered for low-dose aspirin and/or calcium supplement therapy to lower pre-eclampsia risks.

• Women who have had pre-eclampsia have twice the risk of stroke and a fourfold risk of high blood pressure later in life. Therefore, pre-eclampsia should be recognized as a risk factor well after pregnancy, and other risk factors such as smoking, high cholesterol and obesity in these women should be treated early.

• Pregnant women with moderately high blood pressure (systolic 150-159 mm Hg or diastolic 100-109 mm Hg) may be considered for blood pressure medication, whereas expectant mothers with very high blood pressure (160/110 mm Hg or above) should be treated.

"Getting these preventive measures to doctors is exciting because it's an opportunity to start the conversation early; people think stroke is just an 'old person's disease,'" Howard said. "While it generally is, it's also preventable. There are many things women can do at younger ages, during child-bearing years, which can impact stroke risk later in life, so it's an important message to have physicians — especially OB/GYNs, who may be the only doctors some women see at younger ages — involved in stroke-prevention care early on."

Story Source: The above story is based on materials provided by University of Alabama at Birmingham. Note: Materials may be edited for content and length.
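A minimal sketch of the pregnancy blood-pressure thresholds listed above, written as a simple decision rule. The function name and category wording are mine, not part of the AHA/ASA statement, and a real clinical decision would of course rest on far more than two numbers.

def pregnancy_bp_recommendation(systolic, diastolic):
    # Thresholds as quoted in the guidelines summary above (mm Hg)
    if systolic >= 160 or diastolic >= 110:
        return "very high: should be treated with blood pressure medication"
    if 150 <= systolic <= 159 or 100 <= diastolic <= 109:
        return "moderately high: medication may be considered"
    return "below the thresholds discussed in the statement"

print(pregnancy_bp_recommendation(152, 95))   # moderately high
print(pregnancy_bp_recommendation(165, 105))  # very high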


Peaches can be profitable in three years: Researcher to growers

Florida peach growers, some of whom are looking for an alternative to citrus as greening takes a toll on that crop, could see a small profit by their third year of operation, a UF researcher says. Greening, a disease first found in Florida in 2005, has led to $4 billion in lost revenue and industry-related jobs since 2006 for the $9 billion-a-year citrus industry.

As some farmers turn to peaches, they want to know how long before they turn a profit and how long they can sustain that profit, said Mercy Olmstead, assistant professor in horticultural sciences at UF's Institute of Food and Agricultural Sciences. Growers should see steady profit through years 10-12, when the tree starts to decline in the South.

"This is good news," she said. "It is typically seven years before you get a commercial crop on citrus and probably eight before you are profitable."

Olmstead co-wrote a paper that created four-year peach orchard budgets and growing operation plans with former UF doctoral student Kim Morgan, now an assistant professor in agricultural and applied economics at Virginia Tech. Florida peaches go to market earlier than others around the nation, giving growers here a leg up on national competition, Olmstead said.

Growers invest about $11,600 in a peach orchard during the first two years before they see a profit. Third-year income is about $10,150 per acre, against $8,342 in grower costs, for a profit of about $1,800, she said.

A 2011 Florida grower survey showed peaches grown on about 670 acres, according to the paper. Another 300 to 400 acres were added in 2012. Those acres are now producing about 4.5 million pounds per year, at an estimated value of $6 million, the paper says.

While an assistant professor at Mississippi State University, Morgan interviewed 26 of the estimated 40 Florida peach growers and then created four-year budgets and operation plans for the growers. The growers had varying amounts of experience, from having just established an orchard to five or more years' experience, Olmstead said. The budget plans included prices of pest sprays, tree costs, fuel, repairs and more. Morgan presented the paper last summer; it appears in the Proceedings of the Florida State Horticultural Society and is online at the society's website, http://www.fcla.edu/fshs.

Story Source: The above story is based on materials provided by University of Florida Institute of Food and Agricultural Sciences. Note: Materials may be edited for content and length.
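The year-3 arithmetic quoted above can be checked directly. All dollar figures come from the article; the article does not say whether the two-year establishment figure is per acre, so it is kept separate here.

establishment_cost = 11_600      # investment during years 1-2, as quoted (basis not stated)
year3_income_per_acre = 10_150   # per-acre income in year 3
year3_costs_per_acre = 8_342     # per-acre grower costs in year 3

year3_profit_per_acre = year3_income_per_acre - year3_costs_per_acre
print(f"Year-3 profit per acre: ${year3_profit_per_acre:,}")       # about $1,800, as reported
print(f"Establishment cost, years 1-2: ${establishment_cost:,}")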


Doctors likely to accept new Medicaid patients as coverage expands

Oct. 16, 2013 — The upcoming expansion of Medicaid under the Affordable Care Act (ACA) won't lead physicians to reduce the number of new Medicaid patients they accept, suggests a study in the November issue of Medical Care, published by Lippincott Williams & Wilkins, a part of Wolters Kluwer Health.

However, doctors may be less likely to accept those patients who remain uninsured, according to an analysis of historical data by Lindsay M. Sabik, PhD, and Sabina Ohri Gandhi, PhD, of Virginia Commonwealth University, Richmond. They write, "Our results suggest that after increases in Medicaid coverage within a market, access may be limited for the remaining patients."

Doctors Likely to Continue Accepting Medicaid Patients After Expansion

As part of the ACA, Medicaid coverage will expand substantially beginning in 2014, with the goal of improving the health of people who were previously uninsured. Whether that goal is achieved will partly depend on how doctors respond to changes in their local market — and how those decisions affect low-income individuals who rely on "safety-net" care.

Drs. Sabik and Gandhi analyzed data from a long-term, nationwide study of changes in the health care system (the Community Tracking Study Physician Survey). Physician survey responses from the mid-1990s to the mid-2000s were analyzed to assess how market-level changes in Medicaid coverage affected doctors' acceptance of new patients: both patients covered by Medicaid and uninsured patients who were unable to pay.

For most of the period studied, Medicaid coverage rates increased while uninsurance rates trended lower. Both rates varied between different markets. About 70 percent of physicians surveyed were in solo or group medical practice.

The data suggested that changes in Medicaid coverage did not significantly affect doctors' acceptance of new Medicaid patients. "[P]hysicians who were already accepting (or not accepting) Medicaid patients before changes in Medicaid coverage rates continue to do so," Drs. Sabik and Gandhi write. On average, new Medicaid patients were accepted by about 72 percent of office-based and 90 percent of facility-based doctors (those who work at hospitals or other facilities). These rates remained about the same after changes in Medicaid coverage.

But May Not Accept Patients Who Remain Uninsured

However, when Medicaid coverage rates increased, physicians became less likely to accept new uninsured patients. …


Men feel worse about themselves when female partners succeed

Aug. 29, 2013 — Deep down, men may not bask in the glory of their successful wives or girlfriends. While this is not true of women, men's subconscious self-esteem may be bruised when their spouse or girlfriend excels, says a study published by the American Psychological Association.

Whether their significant other was an excellent hostess or highly intelligent, men were more likely to feel subconsciously worse about themselves when their female partner succeeded than when she failed, according to the study published online in the APA Journal of Personality and Social Psychology. However, women's self-esteem was not affected by their male partners' successes or failures, according to the research, which looked at heterosexual Americans and Dutch.

"It makes sense that a man might feel threatened if his girlfriend outperforms him in something they're doing together, such as trying to lose weight," said the study's lead author, Kate Ratliff, PhD, of the University of Florida. "But this research found evidence that men automatically interpret a partner's success as their own failure, even when they're not in direct competition."

Men subconsciously felt worse about themselves when they thought about a time when their female partner thrived in a situation in which they had failed, according to the findings. The researchers studied 896 people in five experiments.

In one experiment, 32 couples from the University of Virginia were given what was described as a "test of problem solving and social intelligence" and then told that their partner scored either in the top or bottom 12 percent of all university students. Hearing that their partner scored high or low on the test did not affect what the researchers called participants' explicit self-esteem — i.e., how they said they felt.

Participants were also given a test to determine how they felt subconsciously about their partners' performance, a measure the researchers called implicit self-esteem. In this test, a computer tracks how quickly people associate good and bad words with themselves. For example, participants with high implicit self-esteem who see the word "me" on a computer screen are more likely to associate it with words such as "excellent" or "good" rather than "bad" or "dreadful."

Men who believed that their partner scored in the top 12 percent demonstrated significantly lower implicit self-esteem than men who believed their partner scored in the bottom 12 percent. Participants did not receive information about their own performance. Findings were similar in two more studies conducted in the Netherlands. …
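The implicit measure described above is reaction-time based: faster responses when "me" is paired with positive words are read as more positive self-association. The heavily simplified sketch below illustrates that idea only; the trial data are invented, and scoring by a plain difference of mean response times is an illustrative stand-in, not the published scoring procedure.

import statistics

# Hypothetical response times (ms) for trials pairing "me" with positive vs. negative words
rt_me_good = [480, 512, 495, 530, 501]   # "excellent", "good", ...
rt_me_bad  = [610, 590, 645, 600, 622]   # "bad", "dreadful", ...

implicit_score = statistics.mean(rt_me_bad) - statistics.mean(rt_me_good)
print(f"Implicit self-esteem score: {implicit_score:.0f} ms (larger = stronger positive self-association)")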


Promising therapeutic target for hard-to-treat brain tumor

Aug. 27, 2013 — Johns Hopkins researchers say they have found a specific protein in nearly 100 percent of high-grade meningiomas — the most common form of brain tumor — suggesting a new target for therapies for a cancer that does not respond to current chemotherapy.

Importantly, the investigators say, the protein — NY-ESO-1 — is already at the center of a clinical trial underway at the National Cancer Institute. That trial is designed to activate the immune systems of patients with other types of tumors that express the protein, training the body to attack the cancer and eradicate it.

"Typically there is a lag time before a laboratory finding like this leads to a clear path forward to help patients. But in this case, since there is already a clinical trial underway, we have a chance of helping people sooner rather than later," says Gregory J. Riggins, M.D., Ph.D., a professor of neurosurgery at the Johns Hopkins University School of Medicine and the senior author of the study published online in the journal Cancer Immunology Research.

In the NCI trial, NY-ESO-1 is found in a much smaller percentage of tumors than Riggins and his team found in high-grade meningioma, suggesting that for the brain cancer, the target would be potentially more significant.

Most low-grade meningiomas located in easy-to-reach locations can be treated successfully with surgery and radiation. But more atypical, higher-grade tumors are much more difficult to eradicate and are deadlier.

Riggins and his colleagues, including Gilson S. Baia, Ph.D., and Otavia L. Caballero, M.D., Ph.D., set out to find cancer antigens in meningioma. Cancer antigens are proteins expressed in tumors but not in healthy cells, making them good targets for chemical or immune system attack. They looked specifically at 37 cancer/testis (CT) genes, which are not found in normal cells in the body except in germ cells and cells cordoned off in the testicles or, in some cases, ovaries. CT genes are activated, however, in various cancers. …


Human brains are hardwired for empathy, friendship

Aug. 22, 2013 — Perhaps one of the most defining features of humanity is our capacity for empathy — the ability to put ourselves in others' shoes. A new University of Virginia study strongly suggests that we are hardwired to empathize because we closely associate people who are close to us — friends, spouses, lovers — with our very selves.

"With familiarity, other people become part of ourselves," said James Coan, a psychology professor in U.Va.'s College of Arts & Sciences, who used functional magnetic resonance imaging brain scans to show that people closely associate those to whom they are attached with themselves. The study appears in the August issue of the journal Social Cognitive and Affective Neuroscience.

"Our self comes to include the people we feel close to," Coan said. In other words, our self-identity is largely based on whom we know and empathize with.

Coan and his U.Va. colleagues conducted the study with 22 young adult participants who underwent fMRI scans of their brains during experiments to monitor brain activity while under threat of receiving mild electrical shocks to themselves or to a friend or stranger. The researchers found, as they expected, that regions of the brain responsible for threat response — the anterior insula, putamen and supramarginal gyrus — became active under threat of shock to the self. In the case of threat of shock to a stranger, the brain in those regions displayed little activity. However, when the threat of shock was to a friend, the brain activity of the participant became essentially identical to the activity displayed under threat to the self.

"The correlation between self and friend was remarkably similar," Coan said. "The finding shows the brain's remarkable capacity to model self to others; that people close to us become a part of ourselves, and that is not just metaphor or poetry, it's very real. …


New early warning system for cholera epidemics

Aug. 15, 2013 — In two recently published papers, Tufts University School of Engineering researchers have established new techniques for predicting the severity of seasonal cholera epidemics months before they occur and with a greater degree of accuracy than other methods based on remote satellite imaging. Taken together, findings from these two papers may provide the essential lead time to strengthen intervention efforts before the outbreak of cholera in endemic regions.

Cholera is an acute diarrheal disease caused by the bacterium Vibrio cholerae. It occurs in the spring and fall in the Bengal delta. In past research, scientists have used chlorophyll, a surrogate for phytoplankton, as a measuring stick for cholera; the cholera bacterium lives and thrives among phytoplankton and zooplankton.

In the June issue of Remote Sensing Letters, Antarpreet Jutla, then a doctoral student at Tufts School of Engineering and now on the faculty at West Virginia University, was lead author on a study that measured chlorophyll and other organic matter. The team, which was led by Shafiqul Islam, Ph.D., professor of civil and environmental engineering at Tufts School of Engineering, used satellite data to measure chlorophyll and algae, organic substances, and flora that also support growth of the cholera bacteria.

Using satellite images, the researchers created a "satellite water marker" (SWM) index to estimate the presence of organic matter, including chlorophyll and plankton, based on wavelength measurements. A predominance of green, plankton-rich water — which is measured at 555 nanometers — indicated the degree to which the waters contained chlorophyll, plankton, and other impurities. Clear, blue water — measured at 412 nanometers — indicated low levels of these impurities, according to the researchers.

The researchers targeted the spring epidemic, which is a coastal phenomenon caused by water flow into the delta from three principal rivers — the Brahmaputra, Ganges, and Meghna. Unlike the spring outbreak, the fall epidemic is linked to flooding that follows the monsoons and the subsequent breakdown of sanitary conditions, rather than to coastal conditions.

In their study, the researchers correlated cholera incidence data from the International Center for Diarrheal Disease Research, Bangladesh, from 1997 to 2010 with satellite imaging data from the National Aeronautics and Space Administration for the same time period. They discovered a relationship between SWM index measurements taken in early winter — from October to December — and the severity of cholera epidemics in the following spring. "In short, the index for chlorophyll along with readings for other biological matter in early winter indicated severity of cholera incidence in the spring," says Jutla.

The SWM is a more accurate predictor of cholera than the algorithm that measures strictly chlorophyll levels because it also measures a broader range of organic matter, says Islam. "The probability for error in this index-based estimate is less than 10 percent, while the error in using the chlorophyll-based algorithm is about 30 percent," says Islam.

To validate their hypothesis that the index can be used in coastal areas outside of the Bengal Delta, the team applied the SWM to coastal waters around Mozambique's capital city, Maputo.

Additional authors on this paper are Abu Syed Golam Faruque and Rita Colwell of the Center for Bioinformatics and Computational Biology at the University of Maryland, and Anwar Huq of the Maryland Pathogen Research Institute at the University of Maryland. …
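The article describes the contrast behind the index (green, plankton-rich water reflecting near 555 nm versus clear blue water near 412 nm) but not the published formula itself, so the normalized-difference form below is an assumption made purely for illustration, with made-up reflectance values.

def swm_like_index(r_555, r_412):
    """Illustrative two-band index: higher values suggest greener, more organic-rich water."""
    return (r_555 - r_412) / (r_555 + r_412)

# Hypothetical remote-sensing reflectances for two water samples
print(swm_like_index(r_555=0.012, r_412=0.006))   # turbid, plankton-rich water -> positive
print(swm_like_index(r_555=0.004, r_412=0.009))   # clearer, bluer water -> negative

Whatever its exact form, an index of this kind computed from October-December imagery is what the researchers correlated with the severity of the following spring's epidemic.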


New data reveal extent of genetic overlap between major mental disorders

Aug. 11, 2013 — The largest genome-wide study of its kind has determined how much five major mental illnesses are traceable to the same common inherited genetic variations. Researchers funded in part by the National Institutes of Health found that the overlap was highest between schizophrenia and bipolar disorder; moderate for bipolar disorder and depression and for ADHD and depression; and low between schizophrenia and autism. Overall, common genetic variation accounted for 17-28 percent of risk for the illnesses.

"Since our study only looked at common gene variants, the total genetic overlap between the disorders is likely higher," explained Naomi Wray, Ph.D., University of Queensland, Brisbane, Australia, who co-led the multi-site study by the Cross-Disorder Group of the Psychiatric Genomics Consortium (PGC), which is supported by the NIH's National Institute of Mental Health (NIMH). "Shared variants with smaller effects, rare variants, mutations, duplications, deletions, and gene-environment interactions also contribute to these illnesses."

Dr. Wray, Kenneth Kendler, M.D., of Virginia Commonwealth University, Richmond, Jordan Smoller, M.D., of Massachusetts General Hospital, Boston, and other members of the PGC group report on their findings August 11, 2013, in the journal Nature Genetics.

"Such evidence quantifying shared genetic risk factors among traditional psychiatric diagnoses will help us move toward classification that will be more faithful to nature," said Bruce Cuthbert, Ph.D., director of the NIMH Division of Adult Translational Research and Treatment Development and coordinator of the Institute's Research Domain Criteria (RDoC) project, which is developing a mental disorders classification system for research based more on underlying causes.

Earlier this year, PGC researchers — more than 300 scientists at 80 research centers in 20 countries — reported the first evidence of overlap between all five disorders. People with the disorders were more likely to have suspect variation at the same four chromosomal sites. But the extent of the overlap remained unclear. In the new study, they used the same genome-wide information and the largest data sets currently available to estimate the risk for the illnesses attributable to any of hundreds of thousands of sites of common variability in the genetic code across chromosomes. They looked for similarities in such genetic variation among several thousand people with each illness and compared them to controls — calculating the extent to which pairs of disorders are linked to the same genetic variants.

The overlap in heritability attributable to common genetic variation was about 15 percent between schizophrenia and bipolar disorder, about 10 percent between bipolar disorder and depression, about 9 percent between schizophrenia and depression, and about 3 percent between schizophrenia and autism.

The newfound molecular genetic evidence linking schizophrenia and depression, if replicated, could have important implications for diagnostics and research, say the researchers. …


New protein discovered with vast potential for treatment of cancer and other diseases

July 31, 2013 — In cancer research, discovering a new protein that plays a role in cancer is like finding a key and a treasure map: follow the clues and eventually there could be a big reward. At least that's the hope from a new study published in the journal Nature that discovered a novel protein called ceramide-1-phosphate transfer protein (CPTP) — a finding that could eventually lead to the development of new drugs to treat a variety of cancers and other conditions involving inflammation and thrombosis, or blood clotting.

The identification of CPTP was the result of an international collaboration that built on prior research by co-lead author Charles Chalfant, Ph.D., Endowed Chair of Cancer Cell Signaling and member of the Cancer Cell Signaling program at Virginia Commonwealth University Massey Cancer Center, as well as professor in the Department of Biochemistry and Molecular Biology at the VCU School of Medicine.

The team discovered that CPTP regulates levels of biologically active lipids, which are molecules such as fatty acids that often play a role in cell signaling. As its name implies, the study determined that CPTP's main function is to transport ceramide-1-phosphate (C1P), a lipid that helps regulate cell growth, survival, migration and inflammation. Specifically, C1P increases the production of pro-inflammatory eicosanoids — powerful signaling molecules that contribute to chronic inflammation in diseases such as cancer, asthma, atherosclerosis and thrombosis — and the discovery of CPTP shines a light on the cellular mechanisms that contribute to these diseases.

"We may have identified the newest target for treating cancer," says Chalfant. "Because of the important role this protein plays in a number of cellular functions, it could also have large implications for a variety of diseases like cancer that are caused by inflammation."

With assistance from Massey's Lipidomics Developing Shared Resource core, the researchers were able to determine the composition of the bioactive lipids regulated by CPTP. The team found that CPTP resides in the cytosol, the liquid within cells, and regulates catabolism of C1P, a process that breaks down the molecule in order to release its energy. They also demonstrated that CPTP transports C1P to the cellular membrane, where it helps synthesize eicosanoids from fatty acids in the membrane.

Confirming a decade of research from Chalfant's laboratory, the scientists provided further proof that C1P regulates group IVA phospholipase A2, an enzyme that promotes inflammation through the production of a fatty acid known as arachidonic acid. The release of arachidonic acid via C1P activation of this enzyme was shown to trigger the production of eicosanoids. These findings help to explain the reported link between ceramide kinase, the enzyme responsible for C1P production, and poor prognosis in breast cancer patients, which further suggests that alleviation of systemic inflammation may lead to better prognosis and better treatment responses.

"Moving forward, we hope to use our knowledge of the structure of CPTP in order to find small molecules and other means that can block it," says Chalfant. …


Physicists discover theoretical possibility of large, hollow magnetic cage molecules

July 31, 2013 — Virginia Commonwealth University researchers have discovered, in theory, the possibility of creating large, hollow magnetic cage molecules that could one day be used in medicine as a drug delivery system to non-invasively treat tumors, and in other emerging technologies.

Approximately 25 years ago, scientists first discovered the C60 fullerene — better known as the buckminsterfullerene — a molecule composed of 60 carbon atoms that form a hollow cage. Due to its unique hollow cage structure, the molecule offers serious technological potential because it could hold other atoms or small molecules inside and, therefore, be used in applications such as drug delivery.

That potential has since spurred worldwide interest among scientists who have been searching for similar molecules. Although some hollow cage structures have been found, none of them is magnetic. Magnetic properties are of particular interest because a hollow magnetic structure carrying an embedded atom or molecule can be guided by an external magnetic field and may serve as an effective vehicle for targeted drug delivery.

In a new study, published online on July 22 in The Journal of Chemical Physics, two VCU scientists employing state-of-the-art theoretical methods show that magnetic hollow cages larger than the original C60 fullerene, and carrying giant magnetic moments, are possible. A magnetic moment is a measure of the magnetic strength of a cluster.

"The potential benefit of this finding is that it provides a route to the synthesis of molecular magnets with colossal magnetic moments," said co-lead investigator Puru Jena, Ph.D., distinguished professor of physics in the VCU College of Humanities and Sciences. Jena collaborated with Menghao Wu, Ph.D., co-author of the paper and a postdoctoral scholar in the VCU Department of Physics. "These molecules can be used for targeted non-invasive drug delivery. When assembled, the molecules can also form new high-strength magnets for device application," Jena said.

According to Jena, the VCU researchers demonstrated the magnetic moments of the molecules by focusing on hetero-atomic clusters consisting of transition metal atoms such as cobalt (Co) and manganese (Mn), together with carbon (C) atoms. In particular, Co12C6, Mn12C6, and Mn24C18 clusters, consisting of 12 cobalt and six carbon atoms, 12 manganese and six carbon atoms, and 24 manganese and 18 carbon atoms, respectively, carry magnetic moments as large as 14, 38 and 70 Bohr magnetons. In comparison, the magnetic moment of an iron (Fe) atom in crystalline iron is 2.2 Bohr magnetons.

According to Jena, the team is still early in its discovery process. "There is a long way to go. Experiments first have to be carried out to prove the predictions of our theory," said Jena. "Ways must be found to synthesize large quantities of these molecules and study their magnetic properties once they are assembled. …
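A quick per-atom comparison makes the "colossal moment" claim concrete: dividing each cluster's total moment by its number of metal atoms shows how the clusters compare with the 2.2 Bohr magnetons of an iron atom in crystalline iron. The arithmetic below simply restates the figures quoted above.

# Cluster name -> (total magnetic moment in Bohr magnetons, number of metal atoms), per the article
clusters = {"Co12C6": (14, 12), "Mn12C6": (38, 12), "Mn24C18": (70, 24)}

for name, (total_moment_muB, n_metal_atoms) in clusters.items():
    print(f"{name}: {total_moment_muB / n_metal_atoms:.1f} Bohr magnetons per metal atom")
print("Crystalline Fe, for reference: 2.2 Bohr magnetons per atom")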


Head hits can be reduced in youth football

July 29, 2013 — Less contact during practice could mean a lot less exposure to head injuries for young football players, according to researchers at Wake Forest Baptist Medical Center and Virginia Tech.

Their study of 50 youth-league players ages 9 to 12 — the largest ever conducted to measure the effects of head impacts in youth football — found that contact in practice, not games, was the most significant variable when the number and force of head hits incurred over the course of a season were measured. Numerous studies in this area have been done on high school and college players, but those findings do not necessarily apply to younger players.

Though more than 70 percent of the football players in the United States are under age 14, there is no clear, scientifically based understanding of the effect of repeated blows to the head in young players, said Steven Rowson, Ph.D., assistant professor at the Virginia Tech-Wake Forest University School of Biomedical Engineering and lead author of the study, which is published in the current online edition of the Annals of Biomedical Engineering.

To quantify youth football players' exposure to head impacts in practices and games over the course of a single season, the researchers employed sensors in the helmets of 50 players on three teams in two different leagues. The sensors were installed on an elastic base inside the helmet so that they remained in contact with the head throughout the duration of a head impact, allowing for measurement of head acceleration rather than that of the helmet. Data from the sensors were transmitted wirelessly to a computer on the sideline and processed to measure both the linear and rotational head acceleration caused by each impact. All data were analyzed on an individual-player basis and then averaged to represent the exposure level of a typical 9- to 12-year-old football player.

The most important finding was that substantial differences existed among the three teams in both the frequency and intensity of impacts, Rowson said. Over the entire season, players on team A experienced an average of 37 to 46 percent fewer impacts than players on teams B and C. For example, the average player on team A experienced 158 impacts during the season, compared to 294 and 251 on the other two teams. This can be attributed to several factors, but the primary reason was that team A had fewer practices during the season than teams B and C, the study showed. During games, impact frequency and acceleration magnitudes were not significantly different among the teams.

In addition, team A competed in a league that had implemented Pop Warner rule changes, including a limit on contact during practice sessions. Teams B and C had no such restrictions. Although practice contacts were limited for team A, there were no differences in the head acceleration magnitudes measured in games across the three teams: the 95th percentile head accelerations ranged from 41 g to 45 g for all three teams and were not significantly different. …
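A minimal sketch of the kind of per-player summary reported above: a season impact count and a 95th percentile linear acceleration computed from helmet-sensor readings. The accelerations below are invented; only the example impact counts of 158 and 294 echo figures quoted in the article.

import numpy as np

# Hypothetical season logs: player ID -> linear head accelerations (g) for each recorded impact
impacts_g = {
    "A01": np.random.default_rng(1).gamma(shape=2.0, scale=10.0, size=158),
    "B07": np.random.default_rng(2).gamma(shape=2.0, scale=10.0, size=294),
}

for player, accels in impacts_g.items():
    print(player,
          "impacts:", accels.size,
          "95th percentile:", round(float(np.percentile(accels, 95)), 1), "g")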


Scientists unable to find evidence of ’embryonic-like’ cells in marrow of adult mice

July 24, 2013 — Research on human embryonic stem cells has been a political and religious lightning rod for more than a decade. The cells long have been believed to be the only naturally occurring pluripotent cells. (Under the right conditions, pluripotent cells can become any other cell in the body.) But some people object to the fact that the embryo is destroyed during their isolation. Induced pluripotent stem cells, created by experimentally manipulating an adult cell such as a skin or nerve cell, are much more ethically palatable. But many researchers feel it is important to continue studying both types of cells.

In 2006, a group of researchers led by Mariusz Ratajczak, MD, PhD, at the University of Louisville, described another possible alternative: a special population of very small, pluripotent embryonic-like cells in the adult bone marrow of mice and humans. These cells, called VSEL (very small embryonic-like) cells, presumably arise through the self-renewal of embryonic stem cells during the developmental process and, as described, could provide all the benefits of embryonic stem cell research with none of the ethical controversy. However, subsequent research from other labs has provided conflicting results as to the pluripotency — and even the existence — of VSEL cells in bone marrow. A company, NeoStem, has proposed a human clinical trial of the cells for periodontitis to begin this year.

But scientists in the laboratory of Irving Weissman, MD, a professor of pathology at the Stanford University School of Medicine, say they have been unable to identify any very small, pluripotent cells in the bone marrow of mice, despite exhaustive efforts to duplicate the original experimental procedures.

"It has become important to know to what extent and where these VSEL cells exist to understand how they may affect the field of stem cell research," said Weissman, who directs Stanford's Institute for Stem Cell Biology and Regenerative Medicine and the Ludwig Center for Cancer Stem Cell Research and Medicine at Stanford. "We tried as hard as we could to replicate the original published results using the methods described and were unable to detect these cells in either the bone marrow or the blood of laboratory mice."

Although other groups have seemingly confirmed the existence of these cells as defined by size and the expression of key cell-surface molecules, Weissman's study is the first to evaluate the biological potency of the cells. The research will be published online July 24 in Stem Cell Reports. Weissman, who is also the Virginia & D.K. Ludwig Professor for Clinical Investigation in Cancer Research and a member of the Stanford Cancer Institute, shares senior authorship of the study with instructor Jun Seita, MD, PhD. Postdoctoral scholars Masanori Miyanishi, PhD, and Yasuo Mori, MD, PhD, are the lead authors.

Using a variety of methods, the researchers found that most of the very small (less than 5 micrometers in diameter) particles in mouse bone marrow were not cells, but were in fact cell debris or dead cells with a less-than-normal complement of DNA. …


Joint custody? Overnights away from home affect children’s attachments

July 19, 2013 — Babies have an innate biological need to be attached to caregivers, usually their parents. But what happens when babies spend a night or more per week away from a primary caregiver, as increasingly happens in cases where the parents share custody but do not live together?

In a new national study, University of Virginia researchers found that infants who spent at least one night per week away from their mothers had more insecure attachments to the mother compared to babies who had fewer overnights or saw their fathers only during the day. The finding is reported in the August edition of the Journal of Marriage and Family.

Attachments are defined as an enduring, deep, emotional connection between an infant and caregiver that develops within the child's first year of life, according to Samantha Tornello, the study's lead author and a Ph.D. candidate in psychology in U.Va.'s Graduate School of Arts & Sciences. Attachments during that critical first year serve as the basis for healthy attachments and relationships later in life, including adulthood, Tornello said.

She notes that growing numbers of parents are living apart due to nonmarital childbirth, the breakup of cohabitating parents, separation and divorce. Parents increasingly are choosing to share child rearing in some form of joint custody, and often the legal system must determine custody arrangements for the children of parents who do not live together.

"Judges often find themselves making decisions regarding custody without knowing what actually may be in the best interest of the child, based on psychology research," Tornello said. "Our study raises the question, 'Would babies be better off spending their overnights with a single caregiver, or at least less frequently in another home?'"

Tornello pointed out that either the mother or father could be the primary caregiver; the point is that the child ideally would be in the care each night of a loving and attentive caregiver, and that there may be something disruptive about an infant spending nights in different homes.

"We would want a child to be attached to both parents, but in the case of separation a child should have at least one good secure attachment," she said. "It's about having constant caregivers; that's important."

Tornello and her co-authors at U.Va. and the American Institutes for Research, including U.Va. psychology professor Robert Emery, analyzed data from the Fragile Families and Child Wellbeing Study, a national longitudinal study of about 5,000 children born in large U.S. cities from 1998 to 2000. The data were collected by researchers at Princeton University and Columbia University and consisted of interviews with both parents at the time of the child's birth, and at ages 1 and 3. …


Raising adopted children: How parents cooperate matters more than gay or straight

July 13, 2013 — A new study by psychology researchers suggests that whether parents are gay, lesbian or straight, how well they work together as a couple and support each other in parenting is linked to fewer behavior problems among their adopted children and is more important than their sexual orientation.

Rachel H. Farr at the University of Massachusetts Amherst and Charlotte J. Patterson at the University of Virginia report their findings from this first empirical examination of differences and similarities in co-parenting among lesbian, gay and heterosexual adoptive couples, and associations with child behavior, in the July/August issue of Child Development.

Farr, who led the study, says, "While actual divisions of childcare tasks such as feeding, dressing and taking time to play with kids were unrelated to children's adjustment, it was the parents who were most satisfied with their arrangements with each other who had children with fewer behavior problems, such as acting out or showing aggressive behavior."

"It appears that while children are not affected by how parents divide childcare tasks, it definitely does matter how harmonious the parents' relationships are with each other," she adds. She and Patterson also observed differences in the division of labor in lesbian and gay couples compared to heterosexual parents.

The study suggests that lesbian and gay couples may be creating new ways to live together and raise children outside of traditional gender roles, the authors say, and the results are important to adoption professionals and others who work with adoptive families. Further, the research is informative for those debating legal, political and policy questions about family dynamics and outcomes for children raised by same-sex couples.

For this study, Farr and Patterson recruited families from five adoption agencies across the United States. In total, 104 families agreed to participate: 25 headed by lesbian partners, 29 by gay male partners and 50 by heterosexual couples. Their adoptive children had been placed with them at birth or within the first few weeks of life; at the time of the study the children were all around three years old.

Parents were asked to report on the division of child-related labor between them and on factors of their child's adjustment. They were also observed by researchers, who coded their co-parenting behavior during videotaped parent-child play sessions along scales rated for "supportive" and "undermining" interactions, using an established test.

The researchers discovered that lesbian and gay couples were more likely to share childcare tasks equally, while heterosexual couples were likely to specialize, with mothers doing more of this work than fathers. In addition, Farr says, from the videotaped observations of family interactions, "it was clear that other aspects of co-parenting, such as how supportive parents were of each other, or how much they competed, were connected with children's behavioral problems."

Parents' dissatisfaction with the division of child-care labor, not the actual division of these tasks, was significantly associated with increased child behavior problems. As the researchers had expected, supportive co-parenting interactions, such as greater pleasure and engagement between parents, were associated with positive child behavior for all three types of couples. Overall, whether parents shared child care tasks or had a more specialized division of this work was not related to children's adjustment. …


Ethical quandary about vaccinations sparked by tension between parental rights and protecting public health

July 8, 2013 — Increased concerns about the perceived risk of vaccination, inconvenience, or religious tenets are leading more U.S. parents to opt out of vaccinating their children. Parents are increasingly able to do so in states that have relatively simple procedures for immunization exemption, report researchers at NYU Langone Medical Center in the July issue of Health Affairs. Some states, fearing a public health crisis, have responded by putting in place more burdensome procedures for parents of school-aged children to opt out. All this adds up to an ethical quagmire, say the researchers.

"Choosing not to vaccinate a child puts others at risk," said study co-author Arthur L. Caplan, PhD, the Drs. William F. and Virginia Connolly Mitty Professor of Bioethics, Department of Population Health, Division of Medical Ethics, NYU Langone Medical Center. "Though making the opt-out process more difficult may reduce the numbers of exemptions granted, it is also ethically problematic." According to Dr. Caplan, this conflict poses a challenge for public health policy makers and continues to incite social debate.

All 50 states and the District of Columbia allow vaccination exemptions for medical reasons; 49 states and the District of Columbia allow exemptions for religious reasons; and 18 states allow exemptions for school-aged children for philosophical reasons (for people who strongly object to immunization for reasons not associated with their religious beliefs).

For the purposes of the recent study, researchers examined the process that certain states require parents to follow to obtain non-medical exemptions for school-aged children, which are more prevalent than medical exemptions: about 80 percent of all vaccination exemptions in the 2011-2012 school year were non-medical. Researchers also studied the complexity of individual states' exemption processes to determine whether they played a role in parents' decisions to opt out. …


‘Dead zone’ impacts Chesapeake Bay fishes

July 8, 2013 — A 10-year study of Chesapeake Bay fishes by researchers at the Virginia Institute of Marine Science provides the first quantitative evidence on a bay-wide scale that low-oxygen "dead zones" are impacting the distribution and abundance of "demersal" fishes — those that live and feed near the Bay bottom. The affected species — which include Atlantic croaker, white perch, spot, striped bass, and summer flounder — are a key part of the Chesapeake Bay ecosystem and support important commercial and recreational fisheries.

The study, published in a recent issue of Marine Ecology Progress Series, was authored by Andre Buchheister, a Ph.D. student in William & Mary's School of Marine Science at VIMS, along with VIMS colleagues Chris Bonzek, Jim Gartland, and Dr. Rob Latour. All four authors are involved in VIMS' Chesapeake Bay Multi-Species Monitoring and Assessment Program (ChesMMAP), an ongoing effort to track and understand interactions between and among fishes and other marine life within the Bay ecosystem.

Buchheister says, "This is the first study to document that chronically low levels of dissolved oxygen in Chesapeake Bay can reduce the number and catch rates of demersal fish species on a large scale." He notes that other studies have looked at the effects of low oxygen on fishes within the water column and on demersal fishes within individual Bay tributaries.

Low-oxygen conditions — what scientists call "hypoxia" — form when excessive loads of nitrogen from fertilizers, sewage, and other sources feed algal blooms in coastal waters. When these algae die and sink, they provide a rich food source for bacteria, which in the act of decomposition take up dissolved oxygen from nearby waters. In Chesapeake Bay, low-oxygen conditions are most pronounced in mid-summer, and in the deep waters of the Bay's middle reaches. "This appears to displace fish biomass toward the northern and southern edges of the bay's mainstem channel," says Buchheister.

"The drastic decline we saw in species richness, species diversity, and catch rate under low-oxygen conditions is consistent with work from other systems," he adds. "It suggests that demersal fishes begin to avoid an area when levels of dissolved oxygen drop below about 4 milligrams per liter, as they start to suffer physiological stress." The fishes' response at this value is interesting, says Buchheister, "because it occurs at levels greater than the 2 milligrams per liter that scientists formally use to define hypoxia." Normal coastal waters contain from 7-8 milligrams of oxygen per liter.

Previous research suggests that oxygen-poor waters can stress fish directly, through increased respiration and elevated metabolism, and also by affecting their prey. "Low levels of dissolved oxygen stress or kill the bottom-dwelling invertebrates that demersal fishes rely on for food," says Buchheister. "Prolonged exposure of these invertebrates to hypoxic conditions in the mid-Bay represents a substantial reduction in the habitat available for foraging by demersal fishes baywide, and could reduce the quality of foraging habitat even after bottom waters become re-oxygenated."

The authors caution, however, that the limits on fish abundance and distribution brought on by low-oxygen conditions are to some degree balanced by the positive effects that nutrients have on production of mid-water and surface-dwelling fishes elsewhere in the Bay. The nutrient-rich waters that encourage dead-zone formation also fuel algal growth, thus turbocharging the base of a food web that ultimately supports fish and other predators.

ChesMMAP

The team's findings are based on an exhaustive study of the distribution and abundance of late juvenile and adult fishes caught and released in trawl nets during 48 sampling trips between 2002 and 2011, the largest quantitative assessment of the bay-wide demersal fish community ever conducted. The sampling took place at 3,640 ChesMMAP stations throughout the mainstem of Chesapeake Bay.

ChesMMAP, currently funded by Wallop-Breaux funds from the Virginia Marine Resources Commission, was established in 2002 as part of the growing international recognition that a single-species approach to fisheries management does not fully account for the complex interactions within marine ecosystems.

Latour, head of the Multispecies Research Group at VIMS, says, "The traditional approach to fisheries management looks at a single species as if it were independent, unaffected by other processes and having no effect on other species. In ChesMMAP and our other multispecies research programs we analyze the interactions between species and their environment, including studies of predator-prey dynamics, seasonal changes in distribution, and water-quality parameters such as temperature, salinity, and DO [dissolved oxygen]."

Salinity

Indeed, the team's research shows that salinity is the most important factor affecting the distribution of Bay fishes, whether they live near the bottom or towards the surface. "Salinity was the major environmental gradient structuring community composition, biodiversity, and catch rates in our 10-year dataset," says Buchheister. …
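The dissolved-oxygen (DO) bands discussed above translate naturally into a simple classification. The labels in the sketch below are mine; the 2 mg/L formal hypoxia definition, the roughly 4 mg/L avoidance threshold, and the 7-8 mg/L range for normal coastal water come from the article.

def classify_do(do_mg_per_l):
    if do_mg_per_l < 2:
        return "hypoxic (formal scientific definition)"
    if do_mg_per_l < 4:
        return "below ~4 mg/L: demersal fishes begin to avoid the area"
    return "adequately oxygenated (normal coastal water is about 7-8 mg/L)"

for value in (1.5, 3.2, 7.5):   # example readings in mg/L
    print(value, "mg/L ->", classify_do(value))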

