Summary: Surrogacy typically uses a straightforward IVF cycle in which the surrogate mother takes fertility medication to stimulate her ovaries. …Read more
A few days ago, while I was walking along the beautiful and peaceful beach at a little seaside cove town in Tasmania called Greens Beach, there was no one else on the beach, so I decided on the spur of the moment to draw this heart in the sand with this powerful message, as I felt it reaches out worldwide with a very important message. I stood back and went to take a photograph when all of a sudden a couple appeared from nowhere and asked if they could take a look at my artwork. I showed them; they looked at each other, went a pale shade of grey, and said, "A friend of ours was recently diagnosed with pleural mesothelioma, and he lives in Launceston." (Launceston …Read more
The break-up of the supercontinent Gondwana about 130 million years ago could have led to a completely different shape of the African and South American continents, with an ocean south of today's Sahara desert, as geoscientists from the University of Sydney and the GFZ German Research Centre for Geosciences have shown through the use of sophisticated plate tectonic and three-dimensional numerical modelling.

The study highlights the importance of rift orientation relative to extension direction as the key factor deciding whether an ocean basin opens or an aborted rift basin forms in the continental interior.

For hundreds of millions of years, the southern continents of South America, Africa, Antarctica, Australia, and India were united in the supercontinent Gondwana. While the causes of Gondwana's fragmentation are still debated, it is clear that the supercontinent first split along the East African coast into a western and an eastern part before the separation of South America from Africa took place. Today's continental margins along the South Atlantic Ocean and the subsurface graben structure of the West African Rift system in the African continent, extending from Nigeria northwards to Libya, provide key insights into the processes that shaped present-day Africa and South America.

Christian Heine (University of Sydney) and Sascha Brune (GFZ) investigated why the South Atlantic part of this giant rift system evolved into an ocean basin, whereas its northern part along the West African Rift became stuck.

"Extension along the so-called South Atlantic and West African rift systems was about to split the African-South American part of Gondwana north-south into nearly equal halves, generating a South Atlantic and a Saharan Atlantic Ocean," geoscientist Sascha Brune explains.
"In a dramatic plate tectonic twist, however, a competing rift along the present-day Equatorial Atlantic margins won out over the West African rift, causing it to become extinct, avoiding the break-up of the African continent and the formation of a Saharan Atlantic Ocean."

The complex numerical models provide a strikingly simple explanation: the larger the angle between the rift trend and the extension direction, the more force is required to maintain a rift system. The West African rift featured a nearly orthogonal orientation with respect to the westward extension, which required distinctly more force than its ultimately successful Equatorial Atlantic opponent.

Story Source: The above story is based on materials provided by Helmholtz Centre Potsdam – GFZ German Research Centre for Geosciences. Note: Materials may be edited for content and length. Read more
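The obliquity effect the study describes can be illustrated with a toy calculation. This is only a crude stand-in for the study's 3-D numerical models: it assumes, purely for illustration, that the force needed to sustain a rift scales with the sine of the angle between rift trend and extension direction, and the 45-degree obliquity used for the Equatorial Atlantic rift is an assumed value, not a figure from the paper.

```python
import math

def relative_rift_force(angle_deg):
    """Toy proxy for the relative force needed to keep a rift active.

    angle_deg: angle between the rift trend and the extension
    direction (90 deg = orthogonal rifting).  sin(angle) is used here
    as a stand-in for the obliquity effect; the actual result comes
    from 3-D numerical modelling, not this formula.
    """
    return math.sin(math.radians(angle_deg))

# West African rift: nearly orthogonal to the westward extension.
# Equatorial Atlantic rift: strongly oblique (angle assumed ~45 deg).
west_african = relative_rift_force(90)  # 1.0, the maximum force
equatorial = relative_rift_force(45)    # ~0.71, distinctly less
print(west_african > equatorial)        # True: the oblique rift wins
```

Under this toy scaling, the orthogonal West African rift demands the most force, consistent with the study's explanation of why the oblique Equatorial Atlantic rift prevailed.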
Adenocarcinoma histology, female sex, never-smoking status, and Asian ethnicity have been considered the most important factors associated with EGFR mutations in non-small cell lung cancer and response to EGFR inhibitors. A recent study has found that, within the Asian population, the frequency of EGFR mutations across demographic and clinical subgroups is higher than previously reported, even in patients with a history of smoking, suggesting that mutation testing should be done on a broader basis among Asian patients with advanced adenocarcinoma of the lung.

The PIONEER study is the first prospective, multinational epidemiologic study to document the frequency of EGFR mutations in lung adenocarcinoma in the Asian population. The PIONEER authors found that EGFR mutations were present in 51.4% of stage IIIB or IV adenocarcinomas of the lung among 1,450 patients from seven regions of Asia. Previous reports have suggested a frequency of approximately 30% among the Asian population (compared with 20% among the white population). The findings of the PIONEER study are published in the February issue of the International Association for the Study of Lung Cancer's journal, the Journal of Thoracic Oncology (JTO).

The frequency of EGFR mutations was high among women (61.1%) and never-smokers (60.7%), but EGFR mutations were also common among men (44%), occasional smokers (51.6%), and previous smokers (43.2%). Across Asian regions, the frequency was highest in Vietnam (64.2%) and lowest in India (22.2%).

"The frequency of EGFR mutations in demographic and clinical subgroups of Asian patients in PIONEER suggests that EGFR mutation testing should be considered for all patients with stage IIIB or IV adenocarcinoma of the lung in Asian populations," says first author Yuankai Shi, MD, of the Department of Medical Oncology, Cancer Institute/Hospital, Beijing, China.
More widespread mutation testing would help to ensure the optimal identification and treatment of patients with lung adenocarcinomas that harbor EGFR mutations.

Story Source: The above story is based on materials provided by International Association for the Study of Lung Cancer. Note: Materials may be edited for content and length. Read more
Castor, grown in Florida during World War II and currently considered as a component for military jet fuel, can be grown here again, using proper management techniques, a new University of Florida study shows. Those techniques include spacing plants properly and using harvest aids to defoliate the plant when it matures.

Growers in the U.S. want to mechanically harvest castor, which is typically hand-picked in other parts of the world, the researchers said. Among other things, the UF/IFAS study evaluated whether the plant would grow too tall for mechanical harvesting machines.

Castor oil is used in paints, lubricants and deodorants, among other industrial products, said David Campbell, a former UF agronomy graduate student and lead author of the study. It has not been grown in the U.S. since 1972, because the federal government ceased giving price supports, the study says.

At UF research units in Citra and Jay, scientists tested Brigham and Hale, two types of castor that were bred in an arid part of west Texas near Lubbock in 1970 and 2003, respectively. These cultivars are shorter than castor found in the wild, said Diane Rowland, an associate professor of agronomy at UF's Institute of Food and Agricultural Sciences, and Campbell's faculty adviser.

Scientists tried to control the growth of the plants even more by spraying them with a chemical, she said. Even though the crop didn't respond to the chemicals, it did not grow taller than expected. So it appears these types of castor can be harvested mechanically, she said. While yields were lower than those reported in Texas research trials in 1993, results are promising for Florida.

"We were concerned that, in this environment, with all the moisture and the good growing conditions, it would grow too tall. But it didn't," Rowland said. "So it shows that shorter genetic types will still work, without the chemical application. …Read more
For the first time, scientists have discovered how tree roots in the mountains may play an important role in controlling long-term global temperatures. Researchers from Oxford and Sheffield Universities have found that temperatures affect the thickness of the leaf litter and organic soil layers, as well as the rate at which the tree roots grow. In a warmer world, this means that tree roots are more likely to grow into the mineral layer of the soil, breaking down rock into component parts which will eventually combine with carbon dioxide. This process, called weathering, draws carbon dioxide out of the atmosphere and cools the planet. The researchers say this theory suggests that mountainous ecosystems have acted like Earth's thermostat, addressing the risk of 'catastrophic' overheating or cooling over millions of years.

In their research paper published online in Geophysical Research Letters, the researchers carried out studies in tropical rain forests in Peru, measuring tree roots across different sites of varying altitude — from the warm Amazonian lowlands to the cooler mountain ranges of the Andes. They measured the growth of the tree roots to 30 cm beneath the surface, every three months over several years. At each of the sites, they also measured the thickness of the organic layer above the soil. This information was then combined with existing data of monthly temperature, humidity, rainfall, and soil moisture in order to calculate the likely breakdown process of the basalt and granite rocks found in the mountain ranges of Peru.

Using this model, based on field data in Peru, the scientists were able to scale up in order to calculate the likely contribution of mountain forests worldwide to global weathering rates. The researchers then calculated the likely amount of carbon to be pulled out of the atmosphere through weathering when Earth became very hot. They looked at the volcanic eruptions in India 65 million years ago (known as the Deccan Traps). …Read more
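The temperature sensitivity of weathering that underpins this "thermostat" idea is often approximated with an Arrhenius relation: warmer conditions speed up the chemical breakdown of rock. The sketch below is a generic illustration, not the authors' model; the 60 kJ/mol activation energy and 25 °C reference temperature are assumed mid-range values.

```python
import math

R_GAS = 8.314  # universal gas constant, J / (mol K)

def weathering_rate_ratio(temp_c, ref_temp_c=25.0, activation_kj=60.0):
    """Arrhenius-style temperature dependence of a weathering rate,
    expressed relative to the rate at a reference temperature.
    The activation energy is an assumed value for illustration."""
    t = temp_c + 273.15
    t0 = ref_temp_c + 273.15
    ea = activation_kj * 1000.0  # convert kJ/mol to J/mol
    return math.exp(-ea / R_GAS * (1.0 / t - 1.0 / t0))

# Cool Andean site vs. warm Amazonian lowland site:
print(weathering_rate_ratio(10.0) < 1.0)  # True: slower in the cold
print(weathering_rate_ratio(30.0) > 1.0)  # True: faster in the warmth
```

This is the feedback loop in miniature: a hotter climate accelerates weathering, which draws down CO2 and cools the planet again.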
Camels are mentioned as pack animals in the biblical stories of Abraham, Joseph, and Jacob. But archaeologists have shown that camels were not domesticated in the Land of Israel until centuries after the Age of the Patriarchs (2000-1500 BCE). In addition to challenging the Bible's historicity, this anachronism is direct proof that the text was compiled well after the events it describes.

Now Dr. Erez Ben-Yosef and Dr. Lidar Sapir-Hen of Tel Aviv University's Department of Archaeology and Near Eastern Cultures have used radiocarbon dating to pinpoint the moment when domesticated camels arrived in the southern Levant, pushing the estimate from the 12th to the 9th century BCE. The findings, published recently in the journal Tel Aviv, further emphasize the disagreements between Biblical texts and verifiable history, and define a turning point in Israel's engagement with the rest of the world.

"The introduction of the camel to our region was a very important economic and social development," said Dr. Ben-Yosef. "By analyzing archaeological evidence from the copper production sites of the Aravah Valley, we were able to estimate the date of this event in terms of decades rather than centuries."

Copper mining and camel riding

Archaeologists have established that camels were probably domesticated in the Arabian Peninsula for use as pack animals sometime towards the end of the 2nd millennium BCE. In the southern Levant, where Israel is located, the oldest known domesticated camel bones are from the Aravah Valley, which runs along the Israeli-Jordanian border from the Dead Sea to the Red Sea and was an ancient center of copper production. At a 2009 dig, Dr. …Read more
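The radiocarbon dating behind these estimates rests on a simple decay calculation. The sketch below uses the standard conventional-age formula with the Libby half-life; the 70% fraction of modern carbon is a made-up illustrative measurement, not a value from the Aravah Valley bones.

```python
import math

LIBBY_MEAN_LIFE = 8033  # years: the 5,568-year Libby half-life / ln(2)

def radiocarbon_age(fraction_modern):
    """Conventional radiocarbon age (years before present) from the
    measured 14C activity as a fraction of the modern standard.
    This is the standard textbook formula, not anything specific to
    the camel study, and it omits calibration to calendar years."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

# A bone retaining 70% of modern 14C (an illustrative value) dates
# to roughly 2,900 radiocarbon years before present.
print(round(radiocarbon_age(0.70)))  # 2865
```

Calibration against tree-ring records then converts such radiocarbon ages into calendar dates, which is how estimates at decadal precision become possible.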
Quite a number of patients afflicted with asbestos-related diseases such as asbestosis and mesothelioma nowadays use different types of complementary and alternative therapies in addition to conventional therapies like surgery and drugs. These alternative therapies are used by patients coping with asbestos-related disease as a form of pain management, to improve general health, and to provide symptomatic relief. Although these treatments do not offer a cure, they certainly help patients live more comfortably by providing relief from pain and stress.

The most commonly used alternative therapies include the following:

1) Acupuncture

This is one of the most common alternative therapies available today, and many insurance companies offer coverage for this type of treatment. Acupuncture involves the …Read more
Aug. 12, 2013 — New irrigation systems in arid regions benefit farmers but can increase the local malaria risk for more than a decade — which is longer than previously believed — despite intensive and costly use of insecticides, a new University of Michigan-led study in northwest India concludes.

The study's findings demonstrate the need to include a strong, binding commitment to finance and implement long-term public health and safety programs when building large-scale irrigation projects, according to the researchers.

"In these dry, fragile ecosystems, where an increase in water availability from rainfall is the limiting factor for malaria transmission, irrigation infrastructure can drastically alter mosquito population abundance to levels above the threshold needed to maintain malaria transmission," said lead author and U-M graduate student Andres Baeza, who works in the laboratory of Mercedes Pascual in the Department of Ecology and Evolutionary Biology.

"Our results highlight the need for considering health impacts in the long-term planning, assessment and mitigation of projects related to water resources," Baeza said.

The researchers studied changes in land use and malaria risk around a large irrigation project under construction in a semi-arid area in the northeast part of the Indian state of Gujarat. Water from the project is eventually expected to cover more than 47 million acres and will benefit about a million farmers.

Malaria risk in arid regions often rises when irrigation is introduced, due to increased amounts of standing water that serve as mosquito breeding sites.
Globally, the number of people at risk of contracting malaria due to proximity to irrigation canals and related infrastructure has been estimated at 800 million, which represents about 12 percent of the global malaria burden. Historical evidence shows that after irrigation is introduced into arid locations, the increased malaria risk eventually subsides and that this food-versus-disease dilemma is a temporary stage on the road to greater prosperity.

The new study demonstrates that this transition phase from high risk to low disease prevalence can last more than a decade. The study is the first to combine satellite imagery of vegetation cover with public health records of malaria cases over a large region to track changes that occur as a mega-irrigation project progresses. The findings are scheduled to be published online Aug. 12 in the Proceedings of the National Academy of Sciences.

"By following the changes in malaria incidence, vegetation and socioeconomic data at the level of sub-districts, we identified a transition phase toward sustainable low malaria risk lasting for more than a decade and characterized by an enhanced environmental malaria risk despite intensive mosquito control efforts," said Pascual, the Rosemary Grant Collegiate Professor of Ecology and Evolutionary Biology at U-M and a Howard Hughes Medical Institute Investigator.

Pascual said the findings show that environmental methods for sustainable disease control are urgently needed. Several of these methods — including intermittent irrigation and periodic flushing of canals — have proved to be affordable, effective and feasible to implement at local levels.

"The challenge ahead, then, will be to apply these methods over extensive regions and maintain them for long enough periods," said Pascual, a theoretical ecologist.

Malaria is caused by the Plasmodium parasite, which is transmitted via the bites of infected Anopheles mosquitoes.
In the human body, the parasites multiply in the liver and then infect red blood cells.

In the PNAS study, the researchers examined epidemiological data on microscopically confirmed malaria cases from rural areas, some dating back to 1997. Using satellite imagery, the researchers were able to discriminate irrigated crops from non-irrigated crops by their spectral signature. They were then able to determine how levels of malaria changed as the massive irrigation project progressed. They showed that elevated disease risk — despite heavy use of insecticides — is concentrated in the areas adjacent to the main irrigation canal that have experienced the most pronounced change in irrigation levels in the last decade. They tied the remote sensing and epidemiological findings to various socioeconomic factors. …Read more
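Discriminating irrigated from non-irrigated fields by spectral signature is commonly done with a vegetation index such as NDVI, since irrigated crops stay green when rain-fed vegetation has senesced. The sketch below is a generic illustration of that idea, not the study's actual classification method; the reflectance values and the 0.4 cutoff are assumptions.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and
    red surface reflectance (both on a 0-1 scale).  Dense green
    vegetation reflects strongly in the near-infrared and absorbs
    red light, so it yields a high NDVI."""
    return (nir - red) / (nir + red)

# Hypothetical dry-season reflectances for two fields:
irrigated = ndvi(nir=0.50, red=0.08)  # dense green canopy
rainfed = ndvi(nir=0.30, red=0.22)    # sparse, senescent cover

THRESHOLD = 0.4  # assumed cutoff; real studies calibrate this value
print(irrigated > THRESHOLD, rainfed > THRESHOLD)  # True False
```

Applied to a time series of satellite scenes, such an index lets researchers map where and when irrigation expanded as the canal network grew.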
Aug. 8, 2013 — Scientists from Harvard Medical School and the CSIR-Centre for Cellular and Molecular Biology in Hyderabad, India, provide evidence that modern-day India is the result of recent population mixture among divergent demographic groups. The findings, published August 8 in the American Journal of Human Genetics, describe how India transformed from a country where mixture between different populations was rampant to one where endogamy — that is, marrying within the local community, a key attribute of the caste system — became the norm.

"Only a few thousand years ago, the Indian population structure was vastly different from today," said co-senior author David Reich, professor of genetics at Harvard Medical School. "The caste system has been around for a long time, but not forever."

In 2009, Reich and colleagues published a paper based on an analysis of 25 different Indian population groups. The paper described how all populations in India show evidence of a genetic mixture of two ancestral groups: Ancestral North Indians (ANI), who are related to Central Asians, Middle Easterners, Caucasians, and Europeans; and Ancestral South Indians (ASI), who are primarily from the subcontinent. However, the researchers wanted to glean clearer data as to when in history such admixture occurred. For this, the international research team broadened their study pool from 25 to 73 Indian groups.

The researchers took advantage of the fact that the genomes of Indian people are a mosaic of chromosomal segments of ANI and ASI descent. Originally, when the ANI and ASI populations mixed, these segments would have been extremely long, extending the entire lengths of chromosomes.
However, after mixture these segments would have broken up at one or two places per chromosome, per generation, through the recombination of maternal and paternal genetic material that occurs during the production of egg and sperm. By measuring the lengths of the segments of ANI and ASI ancestry in Indian genomes, the authors were thus able to obtain precise estimates of the age of population mixture, which they infer occurred between about 1,900 and 4,200 years ago, depending on the population analyzed.

While the findings show that no groups in India are free of such mixture, the researchers did identify a geographic element. "Groups in the north tend to have more recent dates and southern groups have older dates," said co-first author Priya Moorjani, a graduate student in Reich's lab at Harvard Medical School. "This is likely because the northern groups have multiple mixtures."

"This genetic data tells us a three-part cultural and historical story," said Reich, who is also an associate member of the Broad Institute. "Prior to about 4,000 years ago there was no mixture. …Read more
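The segment-length dating described above can be sketched with the standard approximation that, after n generations of recombination, ancestry segments shorten to roughly 100/n centimorgans on average. The segment lengths and the 28-year generation time below are illustrative assumptions, not the paper's measured values.

```python
def admixture_date(mean_segment_cM, years_per_generation=28):
    """Rough date of population mixture inferred from the mean
    length of ancestry segments.  Recombination breaks segments at
    about one crossover per Morgan per generation, so after n
    generations the expected segment length is ~100/n centimorgans."""
    generations = 100.0 / mean_segment_cM
    return generations * years_per_generation

# Illustrative segment lengths (not the study's measurements):
print(round(admixture_date(1.5)))  # 1867 years: a more recent mixture
print(round(admixture_date(0.7)))  # 4000 years: an older mixture
```

Shorter surviving segments mean more generations of recombination, and hence an older mixture date, which is how the 1,900-4,200-year range across populations arises.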
July 31, 2013 — A science team that includes researchers from Scripps Institution of Oceanography at UC San Diego has linked increasing oxygen levels and the rise and evolution of carnivores (meat eaters) as the force behind a broad explosion of animal species and body structures millions of years ago.

Led by Erik Sperling of Harvard University, the scientists analyzed how low-oxygen zones in modern oceans limit the abundance and types of carnivores, helping lead them to the cause of the "Cambrian radiation," a historic proliferation of animals 500-540 million years ago that resulted in the animal diversity seen today. The study is published in the July 29 early online edition of the Proceedings of the National Academy of Sciences.

Although the cause of the influx of oxygen remains a matter of scientific controversy, Sperling called the Cambrian radiation that followed "the most significant evolutionary event in the history of animals."

"During the Cambrian period essentially every major animal body plan — from arthropods to mollusks to chordates, the phylum to which humans belong — appeared in the fossil record," said Sperling, who is scheduled to join Scripps as a postdoctoral researcher through National Science Foundation support. The authors linked this proliferation of life to the evolution of carnivorous feeding modes, which require higher oxygen concentrations. Once oxygen increased, animals started consuming other animals, stimulating the Cambrian radiation through an escalatory predator-prey "arms race."

Lisa Levin, a professor of biological oceanography at Scripps, along with graduate student researcher Christina Frieder, contributed to the study by providing expertise on the fauna of the ocean's low-oxygen zones, areas that have been increasing in recent decades due to a variety of factors.
While the Cambrian radiation exploded with new species and diversification, Levin believes this study suggests the reverse may ensue as oxygen declines and oxygen minimum zones expand.

"This paper uses modern oxygen gradients and their effects on marine worms to understand past evolutionary events," said Levin, director of Scripps's Center for Marine Biodiversity and Conservation and a 1982 Scripps graduate. "However, the study of oxygen's role in the past is also going to help us understand the effects of and manage for changes in ocean oxygen in the future."

As part of the research study, Sperling spent time at Scripps working with Levin and Frieder. He also participated in the San Diego Coastal Expedition (bit.ly/sdcoastex), a cruise led by Frieder aboard the Scripps/U.S. Navy research vessel Melville and funded by the UC Ship Funds program, which offers students unique access to at-sea training and research.

In addition to Sperling, Frieder, and Levin, coauthors of the paper include Akkur Raman of Andhra University (India) and Peter Girguis and Andrew Knoll of Harvard. Funding for the study was provided by the Ministry of Earth Sciences, New Delhi, Agouron Geobiology, the National Science Foundation, and NASA. Read more
July 30, 2013 — A team of researchers led by Dr. Matt Lewin of the California Academy of Sciences, in collaboration with the Department of Anesthesia at the University of California, San Francisco, has pioneered a novel approach to treating venomous snakebites — administering antiparalytics topically via a nasal spray. This new, needle-free treatment may dramatically reduce the number of global snakebite fatalities, currently estimated to be as high as 125,000 per year.

The team demonstrated the success of the new treatment during a recent experiment conducted at UCSF; their results have been published in the medical journal Clinical Case Reports.

Snakebite is one of the most neglected of tropical diseases — the number of fatalities is comparable to that of AIDS in some developing countries. It has been estimated that 75% of snakebite victims who die do so before they ever reach the hospital, predominantly because there is no easy way to treat them in the field. Antivenoms provide an imperfect solution for a number of reasons — even if the snake has been identified and the corresponding antivenom exists, venomous bites often occur in remote locations far from population centers, and antivenoms are expensive, require refrigeration, and demand significant expertise to administer and manage.

"In addition to being an occupational hazard for field scientists, snakebite is a leading cause of accidental death in the developing world, especially among otherwise healthy young people," says Lewin, the Director of the Center for Exploration and Travel Health at the California Academy of Sciences.
"We are trying to change the way people think about this ancient scourge and persistent modern tragedy by developing an inexpensive, heat-stable, easy-to-use treatment that will at least buy people enough time to get to the hospital for further treatment."

In his role as Director of the Academy's Center for Exploration and Travel Health, Lewin prepares field medicine kits for the museum's scientific expeditions around the world and often accompanies scientists as the expedition doctor. In 2011, Lewin put together snakebite treatment kits for the Academy's Hearst Philippine Biodiversity Expedition, which would have required scientists to inject themselves if they needed treatment. When he saw their apprehension about the protocol, Lewin began to wonder if there might be an easier way to treat snakebite in the field.

In some fatal snakebites, victims are paralyzed by the snake's neurotoxins, resulting in death by respiratory failure. A group of common drugs called anticholinesterases have been used for decades to reverse chemically induced paralysis in operating rooms and, in intravenous form, to treat snakebite when antivenoms are not available or not effective. However, it is difficult to administer intravenous drugs to treat snakebite outside of a hospital, so Lewin began to explore the idea of a different delivery vehicle for these antiparalytics — a nasal spray.

In early April of 2013, Lewin and a team of anesthesiologists, led by Dr. …Read more
July 22, 2013 — High levels of arsenic in rice have been shown to be associated with elevated genetic damage in humans, a new study has found. Over the last few years, researchers have reported high concentrations of arsenic in several rice-growing regions around the world.

Now, University of Manchester scientists, working in collaboration with scientists at the CSIR-Indian Institute of Chemical Biology in Kolkata, have proven a link between rice containing high levels of arsenic and chromosomal damage, as measured by micronuclei* in urothelial cells, in humans consuming rice as a staple. The researchers discovered that people in rural West Bengal eating rice as a staple with greater than 0.2 mg/kg arsenic showed higher frequencies of micronuclei than those consuming rice with less than this concentration of arsenic.

The study, published in Nature Publishing Group's Scientific Reports, looked at the frequency of micronuclei — a tell-tale sign of chromosomal damage that has been shown by others previously to be linked to cancer — by screening more than 400,000 individual cells extracted from urine samples from volunteers. The team, funded by the UK India Education and Research Initiative (UKIERI), chose a study population with relatively similar dietary and socio-economic status that was not otherwise exposed to arsenic, for example, through drinking water. They demonstrated that the trend of greater genetic damage with increasing arsenic in rice was observed for both men and women, for tobacco-users and non-users, and for those from three different locations within the study area.
The pattern observed was broadly similar to that previously seen for people exposed to arsenic through drinking high-arsenic well waters, which has caused devastating health impacts, including cancers, in many parts of the world. The authors say their work raises considerable concerns about the health impacts of consuming high-arsenic rice as a staple, particularly by people with relatively poor nutritional status — perhaps as many as a few hundred million people. How directly relevant the results are to people in the UK, with a generally lower consumption of rice and better nutritional status, remains to be fully determined but is an obvious focus for further research.

Professor David Polya, who led the Manchester team in the University's School of Earth, Atmospheric and Environmental Sciences, said: "Although concerns about arsenic in rice have been raised for some time now, to our knowledge, this is the first time a link between consumption of arsenic-bearing rice and genetic damage has been demonstrated. As such, it vindicates increasing concerns expressed by the European Food Safety Authority and others about the adequacy of regulation of arsenic in rice.

"In the absence of contamination, rice is an easily stored food that provides essential energy, vitamins and fibre to billions of people around the world, but a small proportion of rice contains arsenic at concentrations at which we have observed significant genetic damage in people who consume it as a staple food. We hope that our work will encourage efforts to introduce regulatory standards for arsenic in food, and particularly in rice, which are more consistent and protective of human health."

Dr Ashok K Giri, who led the Indian research team, added: "Although high arsenic in rice is a potential threat to human health, there should not be any panic about the consequences, particularly as the health risks arise from long-term chronic exposure.
We can avoid high-arsenic rice by adopting proper mitigation strategies for rice cultivation; moreover, one CSIR institute in India has already identified a number of Indian rice varieties which accumulate lower concentrations of arsenic, so we can easily address future human health risks with proper mitigation strategies. The results of this study will not only help us understand the toxic effects caused by this human carcinogen but will also help scientists and regulatory authorities to design further extensive research to set improved regulatory values for arsenic in rice, particularly for those billions of people who consume 10 to 50% rice in their daily diet."

* Most human cells have one nucleus, which contains 46 chromosomes, but when any of these chromosomes are damaged, the part of the chromosome not able to participate in cell division typically remains as small 'micronuclei' in any daughter cells. Increased frequency of these micronuclei has been shown by other groups to be linked to the development of cancers. Read more
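Micronucleus screening results like these are usually reported as a frequency per 1,000 cells, which makes groups of different sample sizes comparable. The sketch below shows that bookkeeping with hypothetical counts; the numbers are illustrative, not the study's data.

```python
def micronuclei_per_thousand(mn_count, cells_screened):
    """Micronucleus frequency expressed per 1,000 cells, the usual
    unit when screening large numbers of urothelial cells."""
    return 1000.0 * mn_count / cells_screened

# Hypothetical counts for two exposure groups (not the study's data):
high_arsenic = micronuclei_per_thousand(mn_count=450, cells_screened=100_000)
low_arsenic = micronuclei_per_thousand(mn_count=180, cells_screened=100_000)
print(high_arsenic, low_arsenic)   # 4.5 1.8
print(high_arsenic > low_arsenic)  # the direction of the reported trend
```

Screening hundreds of thousands of cells, as the study did, is what makes differences in these small frequencies statistically detectable.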
July 17, 2013 — Hematopoietic stem cells — bone marrow-derived adult stem cells that give rise to the wide variety of specialized blood cells — come in two flavors: the reserve force sits quietly, waiting to be called upon, while the active arm continually proliferates, spawning billions of blood cells every day. In their latest study, researchers at the Stowers Institute for Medical Research reveal a new mechanism that is critical in maintaining the delicate balance between the two.

Publishing in the July 17 advance online issue of Nature, the team led by Stowers Investigator Linheng Li, Ph.D., reports that genomic imprinting, a process that specifically shuts down one of the two gene copies found in each mammalian cell, prevents the reservists from being called up prematurely.

"Active HSCs (hematopoietic stem cells) form the daily supply line that continually replenishes worn-out blood and immune cells, while the reserve pool serves as a backup system that replaces damaged active HSCs and steps in during times of increased need," explains Li. "In order to maintain a long-term strategic reserve of hematopoietic stem cells that lasts a lifetime, it is very important to ensure that the back-up crew isn't mobilized all at once. Genomic imprinting provides an additional layer of regulation that does just that."

Sexual reproduction yields progeny with two copies, or alleles, of each gene, one from the mother and one from the father. Most genes are expressed from both copies, but in mammals and marsupials a small subset of genes receives a mark, or "imprint," during the development of egg or sperm cells. These genomic imprints not only differentiate between genes of maternal and paternal origin but also specifically shut down one copy of those genes in the offspring.

Genomic imprinting is an important mechanism for regulating fetal growth and development and, not surprisingly, faulty imprinting has been linked to human disease.
But whether imprinting also plays a role in adult stem cells had remained elusive. Earlier mouse studies by Li and his collaborators had indicated that the expression of several imprinted genes changes as hematopoietic stem cells embark on their journey from quiescent reserve cells to multi-lineage progenitor cells, which form the many highly specialized cell types that circulate within the blood stream.

For the current study, the Stowers researchers focused on a differentially imprinted control region, which drives the reciprocal expression of H19 from the maternal allele and Igf2 (insulin-like growth factor 2) from the paternal allele. The study's first author Aparna Venkatraman, Ph.D., formerly a postdoc in the Li Lab and now an independent investigator at the Centre for Stem Cell Research at the Christian Medical College in Vellore, India, developed a mouse model that allowed her to specifically excise the imprinting control region from the maternal allele. As a result, the H19 gene, which restricts growth, was no longer active, while the Igf2 gene, which promotes cell division, was now expressed from both the paternal and the maternal allele.

To gauge the effect of the loss of imprinting control on the maintenance of the quiescent hematopoietic stem cell pool, Venkatraman analyzed the numbers of quiescent, active and differentiated hematopoietic stem cells in mouse bone marrow. "A large number of quiescent hematopoietic stem cells was activated simultaneously when the epigenetic control provided by genomic imprinting was removed," explains Venkatraman.
“It created a wave of activated stem cells that moved through the different maturation stages.”

She then followed up with a closer look at the role of the Igf2 signaling pathway in coaxing quiescent hematopoietic stem cells to start dividing and maturing into multi-lineage progenitors that ultimately give rise to specialized blood cells.

Igf2, an important growth factor, is highly active during fetal development, and its misregulation leads to overgrowth disorders such as Beckwith-Wiedemann syndrome. It exerts its growth-promoting effects through the Igf1 receptor, which induces an intracellular signaling cascade that stimulates cell proliferation. The expression of the Igf1 receptor itself is regulated by H19. …
July 17, 2013 — It’s widely thought that Earth arose from violent origins: Some 4.5 billion years ago, a maelstrom of gas and dust circled in a massive disc around the sun, gathering in rocky clumps to form asteroids. These asteroids, gaining momentum, whirled around a fledgling solar system, repeatedly smashing into each other to create larger bodies of rubble — the largest of which eventually cooled to form the planets.

Countless theories, simulations and geologic observations support such a scenario. But there remains one lingering mystery: If Earth arose from the collision of asteroids, its composition should resemble that of meteoroids, the small particles that break off from asteroids.

But to date, scientists have found that, quite literally, something doesn’t add up: namely, Earth’s mantle — the layer between the planet’s crust and core — is missing an amount of lead found in meteorites whose composition has been analyzed following impact with Earth.

Much of Earth is composed of rocks with a high ratio of uranium to lead (uranium naturally decays to lead over time). However, according to standard theories of planetary evolution, Earth should harbor a reservoir of mantle somewhere in its interior that has a low ratio of uranium to lead, to match the composition of meteorites. But such a reservoir has yet to be discovered — a detail that leaves Earth’s origins hazy.

Now researchers in MIT’s Department of Earth, Atmospheric and Planetary Sciences have identified a “hidden flux” of material in Earth’s mantle that would make the planet’s overall composition much more similar to that of meteorites. This reservoir likely takes the form of extremely dense, lead-laden rocks that crystallize beneath island arcs, strings of volcanoes that rise up at the boundaries of tectonic plates.

As two massive plates push against each other, one plate subducts, or slides, under the other, pushing material from the crust down into the mantle.
At the same time, molten material from the mantle rises up to the crust and is ejected via volcanoes onto Earth’s surface. According to the MIT researchers’ observations and calculations, however, up to 70 percent of this rising magma crystallizes into dense rock — dropping, lead-like, back into the mantle, where it remains relatively undisturbed. The lead-heavy flux, they say, puts the composition of Earth’s mantle on a par with that of meteorites.

“Now that we know the composition of this flux, we can calculate that there’s tons of this stuff dropping down from the base of the crust into the mantle, so it is likely an important reservoir,” says Oliver Jagoutz, an assistant professor of geology at MIT. “This has a lot of implications for understanding how the Earth evolved through history.”

Jagoutz and his colleague Max Schmidt, of the Swiss Federal Institute of Technology in Zurich, have detailed their results in a paper published in Earth and Planetary Science Letters.

A mantle exposed

Measuring the composition of material that has dropped into the mantle is a nearly impossible task. Jagoutz estimates that such dense rocks would form at a depth of 40 to 50 kilometers below the surface, beyond the reach of conventional sampling techniques. There is, however, one place on Earth where such a depth of the crust and mantle is exposed: a region of northern Pakistan called the Kohistan arc. …
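The uranium-to-lead reasoning above rests on simple decay arithmetic. As a rough illustration (a sketch of ours, not a calculation from the paper; the decay constant is the standard published value for uranium-238, and the function name is invented), the radiogenic lead accumulated per surviving uranium atom follows directly from exponential decay:

```python
import math

# Decay constant of 238U in 1/year (half-life ~4.47 billion years);
# this is the standard geochronology value, not a number from the paper.
LAMBDA_U238 = 1.55125e-10

def radiogenic_pb_per_u(age_years: float) -> float:
    """Atoms of radiogenic 206Pb accumulated per atom of 238U remaining
    today: the standard decay relation Pb*/U = exp(lambda * t) - 1."""
    return math.exp(LAMBDA_U238 * age_years) - 1.0

# Over Earth's ~4.5-billion-year history, each surviving 238U atom has
# produced roughly one atom of 206Pb, which is why a rock's measured
# uranium-to-lead ratio records its history.
ratio = radiogenic_pb_per_u(4.5e9)
```

This is why reservoirs with different initial uranium-to-lead ratios, such as the dense crystallized rocks proposed here, end up with measurably different lead signatures over geologic time.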
June 18, 2013 — Researchers conclude in a new report that a global push for small hydropower projects, supported by various nations and also by the Kyoto Protocol to reduce greenhouse gas emissions, may cause unanticipated and potentially significant losses of habitat and biodiversity.

An underlying assumption that small hydropower systems pose fewer ecological concerns than large dams is not universally valid, scientists said in the report. A five-year study, one of the first of its type, concluded that for certain environmental impacts the cumulative damage caused by small dams is worse than that of their larger counterparts.

The findings were reported by scientists from Oregon State University in the journal Water Resources Research, in work supported by the National Science Foundation.

The conclusions were based on studies of the Nu River system in China but are relevant to national energy policies in many nations or regions — India, Turkey, Latin America — that seek to expand hydroelectric power generation. Hydropower is generally favored over coal in many developing areas because it uses a renewable resource and does not contribute to global warming. Also, the social and environmental problems caused by large dam projects have resulted in a recent trend toward increased construction of small dams.

“The Kyoto Protocol, under its Clean Development Mechanism, is funding the construction of some of these small hydroelectric projects, with the goal of creating renewable energy that’s not based on fossil fuels,” said Desiree Tullos, an associate professor in the OSU Department of Biological and Ecological Engineering. “The energy may be renewable, but this research raises serious questions about whether or not the overall process is sustainable.”

“There is damage to streams, fisheries, wildlife, threatened species and communities,” she said. “Furthermore, the projects are often located in areas where poverty and illiteracy are high.
The benefit to these local people is not always clear, as some of the small hydropower stations are connected to the national grid, indicating that the electricity is being sent outside of the local region.”

“The result can be profound and unrecognized impacts.”

This study was one of the first of its type to look at the complete range of impacts caused by multiple small hydroelectric projects — on a biophysical, ecological and geopolitical basis — and compare them to large dam projects. It focused on the remote Nu River in China’s Yunnan Province, where many small dams producing 50 megawatts of power or less are built on tributaries that fall rapidly out of steep mountains. There are already 750,000 dams in China, and about one new dam is being built every day, researchers say.

Among the findings of the report as they relate to this region of China:

- The cumulative amount of energy produced by small hydroelectric projects can be significant, but so can the ecological concerns they raise in this area, known to be a “hotspot” of biological diversity.
- Per megawatt of energy produced, small tributary dams in some cases can have negative environmental impacts many times greater than those of large, main-stem dams.
- Many dams in China are built as part of a state-mandated policy to “Send Western Energy East” toward the larger population and manufacturing centers. …
June 10, 2013 — Since 1996, farmers worldwide have planted more than a billion acres (400 million hectares) of genetically modified corn and cotton that produce insecticidal proteins from the bacterium Bacillus thuringiensis, or Bt for short. Bt proteins, used for decades in sprays by organic farmers, kill some devastating pests but are considered environmentally friendly and harmless to people. However, some scientists feared that widespread use of these proteins in genetically modified crops would spur rapid evolution of resistance in pests.

A team of experts at the University of Arizona has taken stock to address this concern and to figure out why pests became resistant quickly in some cases, but not others. Bruce Tabashnik and Yves Carrière in the department of entomology at the College of Agriculture and Life Sciences, together with visiting scholar Thierry Brévault from the Center for Agricultural Research for Development (CIRAD) in France, scrutinized the available field and laboratory data to test predictions about resistance. Their results are published in the journal Nature Biotechnology.

“When Bt crops were first introduced, the main question was how quickly would pests adapt and evolve resistance,” said Tabashnik, head of the UA department of entomology, who led the study. “And no one really knew; we were just guessing.”

“Now, with a billion acres of these crops planted over the past 16 years, and with the data accumulated over that period, we have a better scientific understanding of how fast the insects evolve resistance and why.”

Analyzing data from 77 studies of 13 pest species in eight countries on five continents, the researchers found well-documented cases of field-evolved resistance to Bt crops in five major pests as of 2010, compared with only one such case in 2005. Three of the five cases are in the United States, where farmers have planted about half of the world’s Bt crop acreage.
Their report indicates that in the worst cases, resistance evolved in 2 to 3 years, but in the best cases, the effectiveness of Bt crops has been sustained for more than 15 years. According to the paper, both the best and worst outcomes correspond with predictions from evolutionary principles.

“The factors we found to favor sustained efficacy of Bt crops are in line with what we would expect based on evolutionary theory,” said Carrière, explaining that conditions are most favorable if resistance genes are initially rare in pest populations; inheritance of resistance is recessive — meaning insects survive on Bt plants only if they have two copies of a resistance gene, one from each parent — and abundant refuges are present. Refuges consist of standard, non-Bt plants that pests can eat without ingesting Bt toxins.

“Computer models showed that refuges should be especially good for delaying resistance when inheritance of resistance in the pest is recessive,” explained Carrière.

Planting refuges near Bt crops reduces the chances that two resistant insects will mate with each other, making it more likely they will breed with a susceptible mate, yielding offspring that are killed by the Bt crop. The value of refuges has been controversial, and in recent years the EPA has relaxed its requirements for planting refuges in the U.S.

“Perhaps the most compelling evidence that refuges work comes from the pink bollworm, which evolved resistance rapidly to Bt cotton in India, but not in the U.S.,” Tabashnik said. …
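The refuge logic described above can be illustrated with a toy population-genetics model. This is a deterministic sketch of our own, not the authors' computer models; the starting allele frequency, refuge fractions, and function names are purely illustrative. Resistance is recessive, so only homozygous-resistant insects survive on Bt plants, while susceptible genotypes survive only in the refuge:

```python
def next_gen_resistance_freq(q: float, refuge_frac: float) -> float:
    """One generation of selection. The resistance allele r is recessive:
    only rr insects survive on Bt plants, while susceptible phenotypes
    (RR, Rr) survive only in the refuge, a fraction `refuge_frac` of the
    habitat. Random mating across the whole field is assumed."""
    p = 1.0 - q                      # frequency of the susceptible allele
    # Mean fitness: rr survives everywhere; RR and Rr only in the refuge.
    w_mean = q * q + (2 * p * q + p * p) * refuge_frac
    # Standard allele-frequency recursion under viability selection.
    return (q * q + p * q * refuge_frac) / w_mean

def generations_to_resistance(refuge_frac: float, q0: float = 0.001,
                              threshold: float = 0.5) -> int:
    """Generations until the resistance allele frequency passes `threshold`."""
    q, gens = q0, 0
    while q < threshold and gens < 5000:
        q = next_gen_resistance_freq(q, refuge_frac)
        gens += 1
    return gens
```

With no refuge, only resistant homozygotes survive and resistance fixes in a single generation; in this toy model a 20 percent refuge delays it by hundreds of generations, matching the qualitative claim that refuges slow resistance, and the effect is strongest precisely because resistance is recessive.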
June 10, 2013 — In the wake of concerns over climate change and other emergent environmental issues, both individuals and governments are examining the impact of consumer and producer behavior and policies. In two new studies, three researchers from the University of Maryland’s Department of Geographical Sciences publish groundbreaking findings on the environmental impact of globalization, production and trade on both regional and international scales.

Professor Klaus Hubacek and researchers Yang Yu and Kuishuang Feng’s “Tele-connecting local consumption to global land use” appeared in Global Environmental Change and is available now online. Hubacek and Feng, with co-authors from leading institutions worldwide, published “Outsourcing CO2 within China” in the Proceedings of the National Academy of Sciences.

“Tele-connecting local consumption to global land use”

As local consumption is increasingly met by global supply chains, often involving great geographical distances, the impact of consumer behavior on the environment is becoming increasingly apparent. Hubacek, Yu and Feng’s research concretely connects local consumption to global land use by tracking global commodity and value chains via international trade flows. Specifically, they have zeroed in on land use attributed to “unusual” sectors, including services, machinery and equipment, and construction.

Their findings show how developed countries, such as the United States, consume a large amount of goods and services from both domestic and international markets, and thus both put pressure on their domestic land resources and displace land use to other countries — consuming land that could potentially be used in more environmentally friendly ways. For example, 33 percent of total U.S.
land use for consumption purposes is displaced from other countries, which is actually at the lower end of the global spectrum: the ratio becomes much larger for the EU (more than 50 percent) and Japan (92 percent).

The researchers have also illustrated the vast gap between the consumption habits of rich and relatively poor countries. Their research shows that rich countries tend to displace land by consuming non-agricultural products, such as services, clothing and household appliances, which account for more than 50 percent of their total land displacement. For developing economies, such as African countries, the share of land use for non-agricultural products is much lower, with an average of seven percent.

“In addition, the emerging economies and population giants, China and India, are likely to further increase their appetite for land from other countries, such as Africa, Russia and Latin America, to satisfy their own land needs driven by their fast economic growth and the needs and lifestyles of their growing populations,” Hubacek said. “Obviously, there are significant global consequences when these types of demands exceed the supply of land. …
June 5, 2013 — No food for the human race without bees? It is not quite as straightforward as that. A case study by ecologists from ETH Zurich in a coffee-growing area in India reveals that pollinating insects are just one production factor among many. Farmers have several possibilities to increase their harvest.

All over the world, bees are dying and insect diversity is dwindling. Only recently, both the media and scientists expressed fears that insect pollination is in decline, which jeopardises food security. The (lack of) pollination has thus become a sound argument for the protection of species and natural habitats, and for organic farming.

ETH-Zurich researchers from the group headed by Jaboury Ghazoul, professor of ecosystem management, set about investigating this argument by studying the influence of pollinator insects on coffee harvests in an agroforestry system at coffee plantations in the province of Kodagu in southern India. They also included soil and forest management, environmental factors such as water and soil fertility, and tree cover for the crops in their study.

The research group thus obtained a different picture of the role of pollinators from the popular perception of this cultivation system as “no bees, no harvest.” According to their findings, pollinator bees are merely one production factor among many, and to some extent coffee farmers can increase the productivity of their plantations independently of the insects. The results of the study have just been published in the journal PNAS.

Important but not the only factor

“Pollinators are important for coffee farmers,” stresses Ghazoul; “as far as effective coffee growing and increasing harvests are concerned, however, they are much less important than irrigation or liming, for instance.” This encapsulates one of the central findings from coffee farming in the Kodagu province. Coffee is grown in a traditional agroforestry system in the region.
As coffee plants cannot be grown in direct sunlight, they are planted in the forest’s undergrowth or in the shade of large, isolated trees. The coffee plants all bloom at the same time, after heavy rains between February and March, and three species of bee pollinate the flowers: the giant honeybee Apis dorsata, Apis cerana and the solitary wild bee Tetragonula iridipennis. …
May 13, 2013 — Scientists have developed techniques for the genetic improvement of sunflowers using a non-GMO-based approach. The new technology platform can harness the plant’s own genes to improve the characteristics of sunflower and develop genetic traits that will enhance its role as an important oilseed crop.
The work was led by Dr Manash Chatterjee, an Adjunct Faculty member of Botany and Plant Science at NUI Galway, and has been published in the journal BMC Plant Biology.
Among oilseed crops, sunflowers are one of the most important sources of edible vegetable oil for human consumption worldwide. Sunflower and other oilseed crops are the source of the vast majority of vegetable oil used for cooking and food processing. The oils are also used in industrial processes such as making soaps, cosmetics, perfumes, paints and biofuels.
Dr Chatterjee is currently a Science Foundation Ireland (SFI) ETS Walton Fellow at NUI Galway, collaborating with the SFI Genetics and Biotechnology Lab of Professor Charles Spillane. Dr Chatterjee’s research uses an approach called TILLING (Targeting Induced Local Lesions IN Genomes), an established non-GM method for creating and discovering new traits in plants.
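Conceptually, the screening step in TILLING amounts to comparing gene sequences from chemically mutagenized lines against the reference gene to flag induced point mutations. The following is a toy sketch of that idea only, not the platform described in the paper; the sequences, the line name and the EMS-bias helper are invented for illustration (EMS, a mutagen commonly used in TILLING, predominantly induces G-to-A and C-to-T changes):

```python
# Hypothetical reference fragment of a target gene (invented sequence).
REFERENCE = "ATGGCTGAAGTTCGATGA"

def find_point_mutations(sample: str, reference: str = REFERENCE):
    """Return (position, ref_base, sample_base) for each single-base
    mismatch between a mutant line's amplicon and the reference."""
    return [(i, r, s) for i, (r, s) in enumerate(zip(reference, sample))
            if r != s]

def is_ems_type(ref_base: str, sample_base: str) -> bool:
    """EMS mutagenesis predominantly induces G->A and C->T transitions,
    so candidate hits are often filtered for this signature."""
    return (ref_base, sample_base) in {("G", "A"), ("C", "T")}

line_42 = "ATGGCTGAAATTCGATGA"   # hypothetical mutagenized line
hits = find_point_mutations(line_42)
```

In a real TILLING pipeline the detection step is done in the lab (classically by heteroduplex cleavage, more recently by sequencing); the sketch only shows the downstream logic of locating and classifying candidate lesions.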
According to Dr Chatterjee: “Over the centuries, the sunflower has been cultivated for traits such as yield. However, along the way many useful genetic variations have been lost. This new technology allows us to pinpoint key genetic information relating to various useful traits in the sunflower, including wild sunflower species. It gives us a method to quickly create variability for further breeding to enhance the quantity, quality and natural performance of the crop. In this era of increasing global food crisis and changing climatic regimes, such ability is highly desirable.”
The research breakthrough was part of a collaborative project between Bench Bio (India), URGV Lab INRA (France), NUI Galway Plant and AgriBiosciences Research Centre (Ireland) and Advanta Seeds Argentina. NUI Galway PhD student Anish PK Kumar has been working on the technology platform development as a component of his PhD research studies.
Dr Chatterjee is also involved in research in the NUI Galway Plant and AgriBiosciences Research Centre (PABC) to improve the bioenergy crop Miscanthus. Also known as elephant grass, miscanthus is one of a new generation of renewable energy crops; it can be converted into energy by being burned in biomass power stations.