The Pregnancy Discrimination Act of 1978 makes it illegal for a woman to be fired just because she is pregnant. But that doesn’t stop it from happening, according to new research by Reginald Byron, assistant professor of sociology at Southwestern University, and Vincent Roscigno, professor of sociology at The Ohio State University.

What employers do to get around the law, Byron said, is vilify pregnant women as poor performers and tardy employees while also pointing to seemingly fair attendance policies and financial costs. Although such concerns may, at face value, seem legitimate in a business sense, Byron and Roscigno note that the same policies and rationales are often not invoked in the case of non-pregnant employees, including those with worse records of performance and attendance.

“This strategy of portraying pregnant workers as undependable and costly seems to legitimize their terminations to external audiences,” Byron said. “Such a strategy adds to existing employer-employee power disparities like employers’ ability to hire a lawyer in discrimination suits.”

The study was published online Feb. 20 in the journal Gender and Society and will appear in the June 2014 print edition. Byron and Roscigno analyzed 70 verified cases of pregnancy-based firing discrimination that were handled by the Ohio Civil Rights Commission between 1986 and 2003, plus an additional 15 cases that were processed between 2007 and 2011.
Their key findings included the following:

• Pregnancy accounted for 40 percent of all gender-related firing cases.
• Poor performance was the reason employers cited most frequently for terminating pregnant workers; about 30 percent gave this as the reason.
• Fifteen percent of employers claimed pregnant women were fired because of poor attendance and/or tardiness.
• About 10 percent of employers invoked “business needs, profit and efficiency” in reference to pregnancy discrimination cases.

One example Byron and Roscigno cite in their paper was the case of a woman who was fired from her job as an assistant restaurant manager after she became pregnant. Her supervisor claimed that the company was restructuring and needed to reduce its number of assistant managers from three to two. But after she was fired for “business reasons,” the company hired a man to fill the exact same position that was supposedly no longer needed.

“Some employers think pregnant women will be distracted both in the present and in the future,” Byron said.

Byron said many pregnancy-related firings stem from stereotypes of what “ideal” workers should look like. He also said existing laws are full of gender-laden economic loopholes. For example, the Family and Medical Leave Act of 1993, which provides a maximum of 12 weeks of unpaid, job-protected leave during any 12-month period, does not apply to private-sector employers with fewer than 50 employees, nor does it grant leave to employees with less than one year of tenure.

Some states have their own laws that are broader than the federal law. In Ohio, for example, companies with four or more employees are subject to state anti-discrimination law.
Scientists have pieced together sections of DNA from 12 individual cells to sequence the genome of a bacterium known to live in healthy human mouths.

With this new data about a part of the body considered “biological dark matter,” the researchers were able to reinforce a theory that genes in a closely related bacterium could be culprits in its ability to cause severe gum disease.

Why the dark matter reference? More than 60 percent of bacteria in the human mouth refuse to grow in a laboratory dish, meaning they have never been classified, named or studied. The newly sequenced bacterium, Tannerella BU063, is among those that to date have not successfully been grown in culture — and its genome is identified as “most wanted” by the Human Microbiome Project. The federal Human Microbiome Project aims to improve research about the microbes that play a role in health and disease.

Those 12 cells of BU063 are a good example of the complexity of life in the mouth: They came from a single healthy person but represented eight different strains of the bacterium.

BU063 is closely related to the pathogen Tannerella forsythia, a bacterium linked to the gum disease periodontitis. Despite the two being “cousins,” this research revealed that they have clear differences in their genetic makeup. Those genes lacking in BU063 but present in forsythia — meaning they are a likely secret behind forsythia’s virulence — are now identified as good targets for further study, researchers say.

“One of the tantalizing things about this study was the ability to do random searches of other bacteria whose levels are higher in periodontitis,” said Clifford Beall, research assistant professor of oral biology at The Ohio State University and lead author of the study. “We looked for genes that were present in these bacteria and forsythia and not in BU063.
There is one particular gene complex in a whole list of these periodontitis-related bacteria that could be involved with virulence.”

The research is published in the journal PLOS ONE.

Periodontitis results when extensive inflammation or infection of the gums spreads beyond the gums to damage structures that support the teeth, including bone. Pockets that form between the gums and teeth are filled with different kinds of bacteria. Treatment typically involves deep cleaning or surgery to remove these infected pockets. Because multiple bacteria are associated with the disease, antibiotics have not been considered effective for treatment. And though many bacteria in these pockets have been collected and at least partially identified, their characteristics remain a mystery.

“We think some of the gene differences we’ve found in this study are important, but it’s still not clear what all these genes do, meaning we still don’t know why certain bacteria in periodontitis are pathogenic in the first place.”
Single seniors lead a risky life: after a fall, they often lie on the floor for several hours before their predicament is discovered. A new sensor system detects these emergency situations automatically and sends an emergency signal.

Mr. S. is visually impaired and has depended on a cane since suffering a stroke. Nevertheless, as a 70-year-old living alone, he would rather not move into a care home. Most older people share this wish: they want to stay in their own familiar surroundings and continue to live independently for as long as possible. According to data from the German Federal Statistical Office, this applies to 70 percent of seniors. Against their better judgment, they are putting their health at risk, for not only does the risk of cardiovascular problems increase with age, but the risk of falling increases as well. According to estimates, about 30 percent of those over 65 years of age living at home experience a fall at least once a year.
Last spring, President Obama established the federal BRAIN Initiative to give scientists the tools they need to get a dynamic picture of the brain in action. To do so, the initiative’s architects envision simultaneously recording the activity of complete neural networks that consist of thousands or even millions of neurons. However, a new study indicates that it may be possible to accurately characterize these networks by recording the activity of properly selected samples of 50 neurons or fewer — an alternative that is much easier to realize.

The study was performed by a team of cognitive neuroscientists at Vanderbilt University and reported in a paper published the week of Feb. 3 in the online Early Edition of the Proceedings of the National Academy of Sciences. The paper describes the results of an ambitious computer simulation that the team designed to understand the behavior of the networks of hundreds of thousands of neurons that initiate different body movements: specifically, how the neurons are coordinated to trigger a movement at a particular point in time, called the response time.

The researchers were surprised to discover that the range of response times produced by the simulated population of neurons did not change with size: A network of 50 simulated neurons responded with the same speed as a network of 1,000 neurons.

For decades, response time has been a core measurement in psychology. “Psychologists have developed powerful models of human responses that explain the variation of response time based on the concept of single accumulators,” said Centennial Professor of Psychology Gordon Logan. In this model, the brain acts as an accumulator that integrates incoming information related to a given task and produces a movement when the amount of information reaches a preset threshold.
The model explains random variations in response times by how quickly the brain accumulates the information it needs to act. Meanwhile, neuroscientists have related response time to measurements of single neurons. “Twenty years ago we discovered that the activity of particular neurons resembles the accumulators of psychology models. We haven’t understood until now how large numbers of these neurons can act collectively to initiate movements,” said Ingram Professor of Neuroscience Jeffrey Schall.

No one really knows the size of the neural networks involved in initiating movements, but researchers estimated that about 100,000 neurons are involved in launching a simple eye movement. “One of the main questions we addressed is how ensembles of 100,000 neuron accumulators can produce behavior that is also explained by a single accumulator,” Schall said.

“The way that the response time of these ensembles varies with ensemble size clearly depends on the ‘stopping rules’ that they follow,” explained co-author Thomas Palmeri, associate professor of psychology. For example, if an ensemble doesn’t respond until all of its member neurons have accumulated enough activity, then its response time would be slower for larger networks. On the other hand, if the response time is determined by the first neurons that react, then the response time in larger networks would be shorter than in smaller networks.

Another important factor is the degree to which the ensemble is coordinated.
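The contrast between the two stopping rules can be made concrete with a toy simulation. The sketch below is not the Vanderbilt team’s actual model; every parameter (drift, noise, threshold) is illustrative. Each simulated neuron is a noisy accumulator that integrates evidence until it crosses a threshold, and the ensemble’s response time is taken either from the first accumulators to finish or from the last.

```python
import random

def accumulator_finish_time(drift=0.02, noise=0.05, threshold=1.0,
                            max_steps=10_000, rng=random):
    """Steps for one noisy accumulator to integrate evidence to threshold."""
    level, t = 0.0, 0
    while level < threshold and t < max_steps:
        level += drift + rng.gauss(0.0, noise)  # evidence plus moment-to-moment noise
        t += 1
    return t

def mean_response_time(n_units, rule, trials=100, seed=42):
    """Average ensemble response time under a given stopping rule."""
    rng = random.Random(seed)
    rts = []
    for _ in range(trials):
        finish = [accumulator_finish_time(rng=rng) for _ in range(n_units)]
        # "first": movement is triggered by the earliest accumulators to finish.
        # "all": movement waits until every accumulator has finished.
        rts.append(min(finish) if rule == "first" else max(finish))
    return sum(rts) / trials

for n in (10, 50, 200):
    print(n, mean_response_time(n, "first"), mean_response_time(n, "all"))
```

Running this shows the pattern Palmeri describes: under the “first” rule, larger ensembles respond faster (the minimum of more samples is smaller), while under the “all” rule they respond slower. Which rule, and how much coordination, a real ensemble follows determines whether a small recorded sample can stand in for the whole network.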
Oct. 11, 2013 — A recent study by a University of Missouri researcher shows that resveratrol, a compound found in grape skins and red wine, can make certain tumor cells more susceptible to radiation treatment. This research, which studied melanoma cells, follows a previous MU study that found similar results in the treatment of prostate cancer. The next step is for researchers to develop a successful method to deliver the compound to tumor sites and potentially treat many types of cancers.

“Our study investigated how resveratrol and radiotherapy inhibit the survival of melanoma cells,” said Michael Nicholl, MD, assistant professor of surgery at the MU School of Medicine and surgical oncologist at Ellis Fischel Cancer Center in Columbia, Mo. “This work expands upon our previous success with resveratrol and radiation in prostate cancer. Because of difficulties involved in delivery of adequate amounts of resveratrol to melanoma tumors, the compound is probably not an effective treatment for advanced melanoma at this time.”

The study found that melanoma cells became more susceptible to radiation if they were treated first with resveratrol. The MU researcher found that when the cancer was treated with resveratrol alone, 44 percent of the tumor cells were killed. When the cancer cells were treated with a combination of both resveratrol and radiation, 65 percent of the tumor cells died.

Nicholl said his findings could lead to more research into the cancer-fighting benefits of the naturally occurring compound. “We’ve seen glimmers of possibilities, and it seems that resveratrol could potentially be very important in treating a variety of cancers,” Nicholl said. “It comes down to how to administer the resveratrol. If we can develop a successful way to deliver the compound to tumor sites, resveratrol could potentially be used to treat many types of cancers.”
Sep. 4, 2013 — A University of Iowa physiologist has a new technique to measure the stiffness of the aorta, a common risk factor for heart disease. And it can be as simple as measuring the pulse in your finger.

The new procedure, developed by Gary Pierce, assistant professor in the Department of Health and Human Physiology, works by placing an instrument called a transducer on the finger or over the brachial artery, located inside the arm just beneath the elbow. The readout, combined with a person’s age and body mass index, lets physicians know whether the aorta has stiffened.

Currently, physicians see whether a patient has a hardened aorta by recording a pulse from the carotid artery, located in the neck, and the femoral artery, located in the groin. Taking a pulse from the finger or the arm is easier to record and nearly as accurate, Pierce says. It also works better with obese patients, whose femoral pulse can be difficult to obtain reliably, he adds.

“The technique is more effective in that it is easy to obtain just one pulse waveform in the finger or the brachial artery, and it’s less intrusive than obtaining a femoral waveform in patients,” says Pierce, first author on the paper, published in the American Journal of Physiology Heart and Circulatory Physiology. “It also can be easily obtained in the clinic during routine exams, similar to blood pressure tests.”

Heart disease is the leading cause of death for both men and women in the United States, killing about 600,000 people every year, according to the federal Centers for Disease Control and Prevention. One key to a healthy heart is a healthy aorta. A person’s heart has to work harder when the aorta, the large artery that leaves the heart and delivers blood to the body’s tissues, stiffens due to aging and an inactive lifestyle.
The harder a person’s heart needs to work, the higher the risk he or she has of developing high blood pressure, stroke and heart attack.

Since people can live for years without any knowledge of existing cardiovascular problems, this new measurement tool is especially important. It can provide useful diagnostic information for middle-aged and older patients, who are most susceptible to having hardened arteries that can lead to heart disease. Regular assessments of the aorta may help reduce those risks.
Aug. 29, 2013 — Neuroscientists at Freie Universität Berlin show a link between social media use and reward activity in the brain triggered by discovering one has a good reputation.

A person’s intensity of Facebook use can be predicted by activity in the nucleus accumbens, a reward-related area of the brain, according to a new study published by neuroscientists in the Languages of Emotion Cluster of Excellence at Freie Universität Berlin. Dr. Dar Meshi and his colleagues conducted this first-ever study relating brain activity (measured with functional MRI) to social media use.

The researchers focused on the nucleus accumbens, a small but critical structure located deep in the center of the brain, because previous research has shown that rewards — including food, money, sex, and gains in reputation — are processed in this region.

“As human beings, we evolved to care about our reputation. In today’s world, one way we’re able to manage our reputation is by using social media websites like Facebook,” says Dar Meshi, lead author of the paper.

Facebook is the world’s largest social media channel, with 1.2 billion monthly active users. It was used in the study because interactions on the website are carried out in view of a user’s friends or the public and can affect the user’s reputation. For example, Facebook lets users “like” posted information. This approval is positive social feedback and can be considered related to the poster’s reputation.

All 31 participants completed the Facebook Intensity Scale, which captures how many friends each participant had, how many minutes they each spent on Facebook, and their general thoughts about the site. The participants were selected to vary widely in their Facebook Intensity Scale scores. First, the subjects participated in a video interview.
Aug. 13, 2013 — Swiss researchers report an increased risk of inflammatory bowel disease (IBD) relapse in patients during heat wave periods. The study, published in The American Journal of Gastroenterology, also found an increase in infectious gastroenteritis during heat waves, with the strongest impact following a seven-day lag after the heat wave.

The authors noted, “There is evidence for an increase of IBD hospital admissions by 4-6 percent for each additional day within a heat wave period. Presence of a heat wave was estimated to increase the risk of infectious gastroenteritis by 4-7 percent for every additional day within a heat wave period. In the control group there was no evidence for a heat wave effect.”

Researchers from Zurich, Switzerland, studied the data of 738 IBD and 786 infectious gastroenteritis patients admitted to the University Hospital of Zurich over a five-year period (2001-2005) and compared the data with that of patients with other non-infectious chronic intestinal inflammations, who served as the control group. The Swiss Federal Office for Meteorology and Climatology provided the climate data. A total of 17 heat waves were identified during that period.

“The evidence of patients with IBD having a significantly increased risk of flare-ups compared to the control group shows a cause and effect between the climate and the disease,” said lead author Christine N. Manser, MD. “This study ties heat stress to digestive symptoms, supporting the observed seasonal variation in the clinical course of inflammatory bowel disease, and suggests that microbial infections of the gut might be additionally influenced by climate changes.”

Some people with IBD may experience flare-ups during significant weather changes. “Heat waves are known to cause physical stress, as evident from increased frequencies of other stress-dependent health events such as heart attacks.”
July 30, 2013 — Researchers from the Center for Neuroprosthetics at the Swiss Federal Institute of Technology (EPFL), Switzerland, show that people can be “tricked” into feeling that an image of a human figure — an “avatar” — is their own body. The study is published in the open-access journal Frontiers in Behavioral Neuroscience.

Twenty-two volunteers underwent a Full Body Illusion in which they were stroked with a robotic device while they watched an avatar being stroked in the same spot. The study is the first to demonstrate that Full Body Illusions can be accompanied by changes in body temperature.

Participants wore a 3D high-resolution head-mounted display to view the avatar from behind. They were then subjected to 40 seconds of stroking by a robot, on either their left or right back or on their left or right leg. Meanwhile, they were shown a red dot that moved synchronously over the same regions of the avatar.

After the stroking, the participants were prompted to imagine dropping a ball and to signal the moment when they felt that the ball would hit the floor. This allowed the researchers to objectively measure where the participants perceived their body to be. The volunteers were also asked questions about how much they identified with the avatar and where they felt the stroking originated. Furthermore, to test for physiological changes during the illusion, the participants’ skin temperature was measured at four locations on the back and legs across 20 time points.

Results showed that stroking the same body part simultaneously on the real body and the avatar induced a Full Body Illusion. The volunteers were confused as to where their body was, and they partly identified with the avatar.
More than 70% of participants felt that the touch they had felt on their body derived from the stroking they saw on the avatar.

The data revealed a continuous, widespread decrease in skin temperature that was not specific to the site of measurement and showed similar effects at all locations. The changes in body temperature “were highly significant, but very small,” the authors write in the study, adding that the decrease was in the range of 0.006-0.014 degrees Celsius. The recorded temperature change was smaller than the 0.24 degrees Celsius reported in an earlier study of fluctuations during the rubber hand illusion, probably because the latter used a hand-held thermometer over longer periods and on different regions of the body, the authors explain.

“When the brain is confronted with a multisensory conflict, such as that produced by the Full Body Illusion, the way we perceive our real body changes.”
July 24, 2013 — Commercial honey bees used to pollinate crops are exposed to a wide variety of agricultural chemicals, including common fungicides that impair the bees’ ability to fight off a potentially lethal parasite, according to a new study by researchers at the University of Maryland and the U.S. Department of Agriculture.

The study, published July 24 in the online journal PLOS ONE, is the first analysis of real-world conditions encountered by honey bees as their hives pollinate a wide range of crops, from apples to watermelons.

The researchers collected pollen from honey bee hives in fields from Delaware to Maine. They analyzed the samples to find out which flowering plants were the bees’ main pollen sources and what agricultural chemicals were commingled with the pollen. The researchers then fed the pesticide-laden pollen samples to healthy bees, which were tested for their ability to resist infection with Nosema ceranae — a parasite of adult honey bees that has been linked to a lethal phenomenon known as colony collapse disorder.

On average, the pollen samples contained nine different agricultural chemicals, including fungicides, insecticides, herbicides and miticides. Sublethal levels of multiple agricultural chemicals were present in every sample, with one sample containing 21 different pesticides. The pesticides found most frequently in the bees’ pollen were the fungicide chlorothalonil, used on apples and other crops, and the insecticide fluvalinate, used by beekeepers to control Varroa mites, common honey bee pests.

In the study’s most surprising result, bees that were fed the collected pollen samples containing chlorothalonil were nearly three times more likely to be infected by Nosema than bees that were not exposed to these chemicals, said Jeff Pettis, research leader of the USDA’s Bee Research Laboratory and the study’s lead author.
The miticides used to control Varroa mites also harmed the bees’ ability to withstand parasitic infection.

Beekeepers know they are making a trade-off when they use miticides, said University of Maryland researcher Dennis vanEngelsdorp, the study’s senior author. The chemicals compromise bees’ immune systems, but the damage is less than it would be if mites were left unchecked. But the study’s finding that common fungicides can be harmful at real-world dosages is new, and it points to a gap in existing regulations, he said.

“We don’t think of fungicides as having a negative effect on bees, because they’re not designed to kill insects,” vanEngelsdorp said. Federal regulations restrict the use of insecticides while pollinating insects are foraging, he said, “but there are no such restrictions on fungicides, so you’ll often see fungicide applications going on while bees are foraging on the crop.”
June 14, 2013 — Approaching the two-month anniversary of the April 15 Boston Marathon bombing, a new UMass Poll released today by the University of Massachusetts Amherst shows that only one in eight Massachusetts residents is very concerned about a terrorist attack where they live. The poll also indicated distinct party-line divisions regarding which government officials and agencies were to blame for failing to prevent the attack.

In an online survey of 500 registered Massachusetts voters conducted by YouGov America under the direction of the UMass Poll from May 30 to June 4, Boston-area residents expressed overall concern about a terrorist attack at a higher rate than those in the rest of the state, with 61 percent of Bostonians stating that they were “very concerned” or “somewhat concerned” about an attack, compared to 53 percent of other Bay State residents. The number of respondents who indicated that they were “very concerned,” however, was actually lower in the Boston area than elsewhere in Massachusetts, 11 percent to 15 percent.

“I wouldn’t necessarily call it complacency, but it’s more likely that residents are moving on with their lives,” said Raymond La Raja, associate director of the UMass Poll.

Maryann Barakso, associate director of the UMass Poll, noted that concern about terrorist attacks was lower in the Massachusetts poll than it has been in recent national polls. “Interestingly, Massachusetts voters seem less worried about another terrorist attack than Americans are as a whole,” Barakso said.

In the aftermath of the attack, two-thirds of Massachusetts residents surveyed supported increasing the number of video surveillance cameras (66 percent) and the number of police officers at public gatherings (69 percent) in an effort to prevent future attacks.
However, differences were evident depending on political party preference: Democrats were much more likely to favor increasing the number of police officers (78 percent) than Republicans (50 percent). The least popular option for increased security was the use of unmanned aerial surveillance vehicles, or drones (23 percent).

When given the opportunity to select multiple government agencies or individuals that bore at least some responsibility for failing to prevent the bombing, respondents assigned the greatest blame to federal agencies, with the Federal Bureau of Investigation and the U.S. Department of Homeland Security found at fault by 56 percent and 50 percent of respondents, respectively, followed by the Central Intelligence Agency at 42 percent. City and state representatives received the least blame, with Governor Deval Patrick found most liable at only 11 percent. Overall, 30 percent of respondents replied that no agency or individual — federal or state — should bear responsibility for the attack.

“The poll suggests that Massachusetts voters who believe one or more elected officials or agencies deserve a share of the blame see the attacks as a failure of national security intelligence rather than the mistakes of local law enforcement,” said La Raja.

The political affiliation of respondents appeared to play a major role in their views of responsibility for the attacks. Although President Barack Obama was seen as culpable by 20 percent of overall respondents, nearly 40 percent of Republicans surveyed found the President at fault for the attack, while only 6 percent of Democrats laid blame with the President.
June 12, 2013 — Brazilian paleontologists Taissa Rodrigues, of the Federal University of Espirito Santo, and Alexander W. A. Kellner, of the National Museum of the Federal University of Rio de Janeiro, have presented the most extensive review yet available of toothed pterosaurs from the Cretaceous of England. The study features detailed taxonomic information, diagnoses and photographs of 30 species and was published in the open-access journal ZooKeys.

Pterosaurs from the Cretaceous of England were first described by British naturalists Richard Owen and Harry Seeley in the 19th century, when little was known about the diversity of the group. This resulted in the description of dozens of species, all based on very fragmentary remains, represented mostly by the tips of the animals’ snouts. More recent findings of pterosaur fossils, however, have challenged views on their diversity.

The results show that these pterosaurs had a remarkable diversity in their appearances. Some species had head crests of different sizes and shapes, while others had none. Most had large teeth at the tip of their snouts and were fish eaters, but others had smaller teeth, suggesting different feeding preferences. The paleontologists were able to identify fourteen different species, belonging to at least five different genera, showing a greater diversity than previously thought.

Most of these fossils were found in a deposit known as the Cambridge Greensand, located in the eastern part of the country. This unit, one of the most important for the study of flying reptiles, records a past marine environment in which bones that were already fossilized and buried were eroded, exposed to weathering, and then buried again. These cycles of erosion and burial must have taken place over several years.
June 11, 2013 — Using wood for energy is considered cleaner than fossil fuels, but a Dartmouth College-led study finds that logging may release large amounts of carbon stored in deep forest soils.

Global atmospheric studies often don’t consider carbon in deep (or mineral) soil because it is thought to be stable and unaffected by timber harvesting. But the Dartmouth findings show deep soil can play an important role in carbon emissions under clear-cutting and other intensive forest management practices. The findings suggest that calls for an increased reliance on forest biomass be re-evaluated and that forest carbon analyses are incomplete unless they include deep soil, which stores more than 50 percent of the carbon in forest soils.

“Our paper suggests the carbon in the mineral soil may change more rapidly, and result in increases in atmospheric carbon dioxide, as a result of disturbances such as logging,” said Dartmouth Professor Andrew Friedland, a co-author. “Our paper suggests that increased reliance on wood may have the unintended effect of increasing the transfer of carbon from the mineral soil to the atmosphere. So the intended goal of reducing carbon in the atmosphere may not be met.”

The federal government is looking to wood, wind, solar, hydropower and other renewable energy sources to address concerns about climate change and energy security. Woody biomass, which includes trees grown on plantations, managed natural forests and logging waste, makes up about 75 percent of global biofuel production.
Mineral soil carbon responses can vary widely depending on harvesting intensity, surface disturbance and soil type.

“Analysis of forest carbon cycles is central to understanding and mitigating climate change, and understanding forest carbon cycles requires an in-depth analysis of the storage in and fluxes among different forest carbon pools, which include aboveground live and dead biomass, as well as the belowground organic soil horizon, mineral soil horizon and roots,” Friedland said.

Co-authors included Dartmouth’s Thomas Buchholz, a former post-doctoral student, and Claire Hornig, a recent undergraduate student, as well as researchers from the University of Vermont, Lund University in Sweden and the Vermont Department of Forests, Parks and Recreation. The research was supported by awards to Friedland from the Northeastern States Research Cooperative and the Porter Fund.

Friedland’s research focuses on understanding the effects of atmospheric deposition of pollutants and biomass harvesting on elemental cycling processes in high-elevation forests in the Northeastern United States. He considers many elements, including carbon, trace elements such as lead, and major elements such as nitrogen and calcium. He is also examining issues related to personal choices, energy use and environmental impact.

The results appear in the journal Global Change Biology-Bioenergy.
June 1, 2013 — Marine conservationists from Plymouth University and the Universidade Federal da Bahia in Brazil have spent more than 17 years analysing the diversity and density of coral colonies off the coast of South America. That period coincided with the catastrophic El Niño event of 1997-98, creating an opportunity for the first detailed assessment of the long-term impact a major environmental incident of this nature can have on coral assemblages.

Professor Martin Attrill, Director of Plymouth University’s Marine Institute, said: “Coral reefs are perhaps the most diverse marine ecosystem on Earth, potentially holding 25% of the known marine species. Yet they are under intense threat from a range of local human activities and, in particular, climate change. Any impact on the corals is going to have major knock-on effects on the organisms that live on coral reefs, such as the fish, and if climatic events become more frequent, as is suggested, it is likely corals will never be able to fully recover.”

The 1997-98 El Niño was the most extensive global event of its kind in history, with record global high seawater temperatures over an 18-month period. It prompted flooding in some parts of the world and droughts in others, but also caused severe coral bleaching and mortality in parts of Central America, the Indian Ocean, the Arabian Gulf, the tropical Pacific and Brazil.

For this study, the research team used their own observations of eight species of scleractinian corals, along with data from the Brazilian Meteorological Office, to create a full picture of the environmental conditions and the species’ responses. The data showed a significant rise in air and seawater temperatures in 1998, with increased mortality across all species; in one case, a species disappeared completely from the reefs for more than seven years. The density of coral in the area also fell after 1998, but then increased continuously until 2007, and recent measurements show it is now mostly back to pre-1998 levels.

Professor Attrill added: “El Niño events give us an indication of how changing climate affects ecosystems as major changes in the weather patterns within the Pacific impact the whole world. If the reefs can recover quickly, it is probable they can adapt and survive the likely changes in water temperature ahead of us. However, we found it took 13 years for the coral reef system in Brazil to recover, suggesting they may be very vulnerable to regular climate-related impacts. This has major consequences for how we consider climate change impacts on coral reefs.”
May 20, 2013 — The Amazon rain forest, popularly known as the lungs of the planet, inhales carbon dioxide as it exudes oxygen. Plants use carbon dioxide from the air to grow parts that eventually fall to the ground to decompose or get washed away by the region’s plentiful rainfall.
Until recently, people believed much of the rain forest’s carbon floated down the Amazon River and ended up deep in the ocean. University of Washington research showed a decade ago that rivers exhale huge amounts of carbon dioxide — though it left open the question of how that was possible, since bark and stems were thought to be too tough for river bacteria to digest.
A study published this week in Nature Geoscience resolves the conundrum, proving that woody plant matter is almost completely digested by bacteria living in the Amazon River, and that this tough stuff plays a major part in fueling the river’s breath.
The finding has implications for global carbon models, and for the ecology of the Amazon and the world’s other rivers.
“People thought this was one of the components that just got dumped into the ocean,” said first author Nick Ward, a UW doctoral student in oceanography. “We’ve found that terrestrial carbon is respired and basically turned into carbon dioxide as it travels down the river.”
Tough lignin, which helps form the main part of woody tissue, is the second most common component of terrestrial plants. Scientists believed that much of it got buried on the seafloor to stay there for centuries or millennia. The new paper shows that river bacteria break it down within two weeks, and that just 5 percent of the Amazon rain forest’s carbon ever reaches the ocean.
“Rivers were once thought of as passive pipes,” said co-author Jeffrey Richey, a UW professor of oceanography. “This shows they’re more like metabolic hotspots.”
When previous research showed how much carbon dioxide was outgassing from rivers, scientists knew it didn’t add up. They speculated there might be some unknown, short-lived carbon source that freshwater bacteria could turn into carbon dioxide.
“The fact that lignin is proving to be this metabolically active is a big surprise,” Richey said. “It’s a mechanism for the rivers’ role in the global carbon cycle — it’s the food for the river breath.”
The Amazon alone discharges about one-fifth of the world’s freshwater and plays a large role in global processes, but it also serves as a test bed for natural river ecosystems.
Richey and his collaborators have studied the Amazon River for more than three decades. Earlier research took place more than 500 miles upstream. This time the U.S. and Brazilian team sought to understand the connection between the river and ocean, which meant working at the mouth of the world’s largest river — a treacherous study site.
“There’s a reason that no one’s really studied in this area,” Ward said. “Pulling it off has been quite a challenge. It’s a humongous, sloppy piece of water.”
The team used flat-bottomed boats to traverse the three river mouths, each so wide that you cannot see land, in water so rich with sediment that it looks like chocolate milk. Tides raise the ocean by 30 feet, reversing the flow of freshwater at the river mouth, and winds blow at up to 35 mph.
Under these conditions, Ward collected river water samples in all four seasons. He compared the original samples with ones left to sit for up to a week at river temperatures. Back at the UW, he used newly developed techniques to scan the samples for some 100 compounds, covering 95 percent of all plant-based lignin. Previous techniques could identify only 1 percent of the plant-based carbon in the water.
Based on the results, the authors estimate that about 45 percent of the Amazon’s lignin breaks down in soils, 55 percent breaks down in the river system, and 5 percent reaches the ocean, where it may break down or sink to the ocean floor.
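The fate-of-lignin budget above can be tallied in a few lines (a minimal sketch using only the article’s round figures, which, being approximations, sum to slightly more than 100 percent):

```python
# Approximate fate of the Amazon's lignin, per the estimates above.
budget_pct = {
    "breaks down in soils": 45,
    "breaks down in the river system": 55,
    "reaches the ocean": 5,
}

for fate, pct in budget_pct.items():
    print(f"about {pct}% {fate}")

# Rounded, reported shares need not sum to exactly 100.
print(f"total of reported shares: {sum(budget_pct.values())}%")
```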
“People had just assumed, ‘Well, it’s not energetically feasible for an organism to break lignin apart, so why would they?'” Ward said. “We’re thinking that as rain falls over the land it’s taking with it these lignin compounds, but it’s also taking with it the bacterial community that’s really good at eating the lignin.”
The research was supported by the Gordon and Betty Moore Foundation, the National Science Foundation and the Research Council for the State of São Paulo. Co-authors are Richard Keil at the UW; Patricia Medeiros and Patricia Yager at the University of Georgia; Daimio Brito and Alan Cunha at the Federal University of Amapá in Brazil; Thorsten Dittmar at Carl von Ossietzky University in Germany; and Alex Krusche at the University of São Paulo in Brazil.
May 29, 2013 — EPFL researchers have detected microplastic pollution in one of Western Europe’s largest lakes, Lake Geneva, in large enough quantities to raise concern. While studies in the ocean have shown that these small bits of plastic can be harmful to fish and birds that feed on plankton or other small waterborne organisms, the full extent of their consequences in lakes and rivers is only now being investigated.
The study, which is being extended under a mandate from the Swiss Federal Office for the Environment, was published in the latest issue of the journal Archives des Sciences.
“We were surprised to find such high concentrations of microplastics, especially in an environmentally aware country like Switzerland,” says first author Florian Faure from EPFL. Faure’s study focused on Lake Geneva, where both beaches and lake water were shown to contain significant amounts of microplastic contamination — pieces of plastic waste up to 5 mm in diameter. The study is one of the first of its kind to focus on a continental freshwater lake. And according to Faure, given the massive efforts put into protecting the lake’s shores over the past decades, on both the French and Swiss sides, the situation is likely to be representative of freshwater bodies around the world.
Microplastics in continental waters may be the main source of microplastic pollution in oceans, where huge hotspots containing high concentrations of these pollutants have formed. Scientists estimate that only around 20 percent of oceanic microplastics are dumped straight into the sea. The remaining 80 percent are estimated to originate from terrestrial sources, such as waste dumps, street litter, and sewage.
Microplastic pollution also strains lake and river ecosystems, threatening the animals that inhabit them both physically and chemically. When inadvertently swallowed by aquatic birds and fish, the tiny bits of plastic can wind up stuck in the animals’ intestines, where they obstruct the digestive tract, or cause suffocation by blocking the airways. Ingested plastics may also leach toxic additives, such as bisphenol A (BPA) and phthalates, two carcinogenic agents used in transparent plastics, as well as hydrophobic water pollutants stuck to their surface, such as PCBs, into the animals that swallow them.
Like counting needles in a haystack
Florian Faure and his collaborators used a variety of approaches to quantify plastic and microplastic pollution in and around the lake, from combing the beaches of Lake Geneva for plastic litter to dissecting fish (pike, roach and bream) and birds from the lake, and examining bird droppings along its shores.
To measure the concentration of microplastics in the water, Faure worked in collaboration with Oceaneye, a Geneva-based non-profit organization. Using an approach developed to study plastic pollution in the Mediterranean Sea, they pulled a manta trawl — a floating fine-meshed net — behind a boat on Lake Geneva to pick up any solid matter in the top layer of the water. The samples were then sorted and dried, and the solid matter was analyzed for its composition.
“We found plastic in every sample we took from the beaches,” says Faure. Polystyrene beads were the most common culprits, but hard plastics, plastic membranes, and bits of fishing line were also widespread. In this preliminary study, the amount of debris caught in Lake Geneva using the manta trawl was comparable to measurements made in the Mediterranean Sea.
The scientists are now extending their focus to lakes and rivers across the country, backed by a mandate from the Swiss Federal Office for the Environment. According to the lab’s director, Luiz Felippe de Alencastro, this will involve studying microplastic pollution in lakes, rivers, and biota across the country, as well as the associated micropollutants, such as PCBs, which have already been found stuck on microplastics from Lake Geneva in significant concentrations.