The planet’s current biodiversity, the product of 3.5 billion years of evolutionary trial and error, is the highest in the history of life. But it may be reaching a tipping point.

In a new review of scientific literature and analysis of data published in Science, an international team of scientists cautions that the loss and decline of animals is contributing to what appears to be the early days of the planet’s sixth mass biological extinction event.

Since 1500, more than 320 terrestrial vertebrates have become extinct. Populations of the remaining species show a 25 percent average decline in abundance. The situation is similarly dire for invertebrate animal life.

And while previous extinctions have been driven by natural planetary transformations or catastrophic asteroid strikes, the current die-off can be attributed to human activity, a situation that the lead author Rodolfo Dirzo, a professor of biology at Stanford, designates an era of “Anthropocene defaunation.”

Across vertebrates, 16 to 33 percent of all species are estimated to be globally threatened or endangered. Large animals — described as megafauna and including elephants, rhinoceroses, polar bears and countless other species worldwide — face the highest rate of decline, a trend that matches previous extinction events.

Larger animals tend to have lower population growth rates and produce fewer offspring. They need larger habitat areas to maintain viable populations. Their size and meat mass make them easier and more attractive hunting targets for humans.

Although these species represent a relatively low percentage of the animals at risk, their loss would have trickle-down effects that could shake the stability of other species and, in some cases, even human health.

For instance, previous experiments conducted in Kenya have isolated patches of land from megafauna such as zebras, giraffes and elephants, and observed how an ecosystem reacts to the removal of its largest species. Rather quickly, these areas become overwhelmed with rodents. Grass and shrubs increase and the rate of soil compaction decreases. Seeds and shelter become more easily available, and the risk of predation drops.

Consequently, the number of rodents doubles — and so does the abundance of the disease-carrying ectoparasites that they harbor.

“Where human density is high, you get high rates of defaunation, high incidence of rodents, and thus high levels of pathogens, which increases the risks of disease transmission,” said Dirzo, who is also a senior fellow at the Stanford Woods Institute for the Environment. …
Fifty years of research has revealed the sad truth that the children of lower-income, less-educated parents typically enter school with poorer language skills than their more privileged counterparts. By some measures, 5-year-old children of lower socioeconomic status (SES) score two years behind on standardized language development tests by the time they enter school.

In recent years, Anne Fernald, a psychology professor at Stanford University, has conducted experiments revealing that the language gap between rich and poor children emerges during infancy. Her work has shown that significant differences in both vocabulary and real-time language processing efficiency were already evident at age 18 months in English-learning infants from higher- and lower-SES families. By age 24 months, there was a six-month gap between SES groups in processing skills critical to language development.

Fernald’s work has also identified one likely cause for this gap. Using special technology to make all-day recordings of low-SES Spanish-learning children in their home environments, Fernald and her colleagues found striking variability in how much parents talked to their children. Infants who heard more child-directed speech developed greater efficiency in language processing and learned new words more quickly. The results indicate that exposure to child-directed speech — as opposed to overheard speech — sharpens infants’ language processing skills, with cascading benefits for vocabulary learning.

Fernald and colleagues are now running a parent-education intervention study with low-income Spanish-speaking mothers in East San Jose, California, funded by the W. K. Kellogg Foundation. This new program, called Habla conmigo! …
Scientists with the Wildlife Conservation Society, Oregon State University, Stanford University, Columbia University, and the American Museum of Natural History have found that humpback whales swimming off the coast of western Africa encounter more than warm waters for mating and bearing young; new studies show that the whales share these waters with offshore oil rigs, major shipping routes, and potentially harmful toxicants.

With the aid of satellite tags affixed to more than a dozen whales, the researchers have quantified the amount of overlap between hydrocarbon exploration and extraction, environmental toxicants, shipping lanes, and humpback whales occurring in their nearshore breeding areas. The scientists also identified additional parts of the whales’ breeding range and migratory routes to sub-Antarctic feeding grounds.

The study appears in the latest edition of the journal Conservation Biology. The authors are: Howard Rosenbaum of the Wildlife Conservation Society and the American Museum of Natural History; Sara Maxwell of Stanford University; Francine Kershaw of Columbia University; and Bruce Mate of Oregon State University.

“Throughout numerous coastal and offshore areas, important whale habitats and migration routes are increasingly overlapping with industrial development, a scenario we have quantified for the first time in the eastern South Atlantic,” said Dr. Howard Rosenbaum, Director of WCS’s Ocean Giants Program. “Studies such as this one are crucial for identifying important habitats for humpback whales and how to best protect these populations from potential impacts associated with hydrocarbon exploration and production, shipping, and other forms of coastal and offshore activities.”

Rosenbaum added: “From understanding which habitats are most important to tracking their migrations, our work provides great insights into the current issues confronting these whales and how to best engage ocean industries to better protect and ensure the recovery of these leviathans.”

Growing to approximately 50 feet in length, humpback whales are characterized by their long pectoral fins, acrobatic behavior, and haunting songs. Like other great whales, the humpback whale was hunted for centuries by commercial whaling fleets, with experts estimating a reduction of possibly 90 percent in its global population size. The International Whaling Commission has protected humpback whales globally from commercial whaling since 1968.

While migration patterns of humpbacks have been the subject of extensive study in other ocean basins and regions, the migratory behaviors of humpbacks along the western African coast in the eastern South Atlantic are poorly described. To better understand the movements of humpback whales in the Gulf of Guinea, the researchers deployed satellite tags on 15 individual animals off the coast of Gabon between August and September of 2002.

“This study demonstrates clearly that all of the countries on the west coast of Africa need to work together on a range-wide humpback whale conservation strategy and consider the possibility of creating a whale sanctuary,” said Professor Lee White, CBE, director of Gabon’s National Parks Agency. “Gabon supports the concept of a South Atlantic Whale Sanctuary and will continue to work with other nations in the region to this end.”

Dr. Bruce Mate, who pioneered the satellite-monitored radio tagging of large whales, said: “This technology allows the science and conservation communities to discover detailed seasonal migration routes, timing and destinations, so we can characterize these important habitats and reduce potential impacts of human activities, even in the harshest possible marine environments.”

The major goal of the study was to elucidate the unknown migratory movement of whales from breeding areas off western Africa to areas where the whales likely feed in Antarctic or sub-Antarctic waters. …
Stanford researchers may have solved a riddle about the inner workings of the brain, which consists of billions of neurons, organized into many different regions, with each region primarily responsible for different tasks.

The various regions of the brain often work independently, relying on the neurons inside that region to do their work. At other times, however, two regions must cooperate to accomplish the task at hand.

The riddle is this: what mechanism allows two brain regions to communicate when they need to cooperate yet avoid interfering with one another when they must work alone?

In a paper published today in Nature Neuroscience, a team led by Stanford electrical engineering professor Krishna Shenoy reveals a previously unknown process that helps two brain regions cooperate when joint action is required to perform a task.

“This is among the first mechanisms reported in the literature for letting brain areas process information continuously but only communicate what they need to,” said Matthew T. Kaufman, who was a postdoctoral scholar in the Shenoy lab when he co-authored the paper.

Kaufman initially designed his experiments to study how preparation helps the brain make fast and accurate movements — something that is central to the Shenoy lab’s efforts to build prosthetic devices controlled by the brain. But the Stanford researchers used a new approach to examine their data that yielded some findings that were broader than arm movements.

The Shenoy lab has been a pioneer in analyzing how large numbers of neurons function as a unit. As they applied these new techniques to study arm movements, the researchers discovered a way that different regions of the brain keep results localized or broadcast signals to recruit other regions as needed.

“Our neurons are always firing, and they’re always connected,” explained Kaufman, who is now pursuing brain research at Cold Spring Harbor Laboratory in New York. “So it’s important to control what signals are communicated from one area to the next.”

Experimental design

The scientists derived their findings by studying monkeys that had been trained to make precise arm movements. The monkeys were taught to pause briefly before making the reach, thus letting their brain prepare for a moment before moving.

Remember, the goal was to help build brain-controlled prostheses. Because the neurons in the brain always send out signals, engineers must be able to differentiate the command to act from the signals that accompany preparation.

To understand how this worked with the monkey’s arm, the scientists took electrical readings at three places during the experiments: from the arm muscles, and from each of two motor cortical regions in the brain known to control arm movements.

The muscle readings enabled the scientists to ascertain what sorts of signals the arm receives during the preparatory state compared with the action step.

The brain readings were more complex. Two regions control arm movements. They are located near the top center of the brain, about an inch to the side. Each of the two regions is made up of more than 20 million neurons. The scientists wanted to understand the behavior of both regions, but they couldn’t probe millions of neurons. …
Oct. 15, 2013 — A brain region activated when people are asked to perform mathematical calculations in an experimental setting is similarly activated when they use numbers — or even imprecise quantitative terms, such as “more than” — in everyday conversation, according to a study by Stanford University School of Medicine scientists.

Using a novel method, the researchers collected the first solid evidence that the pattern of brain activity seen in someone performing a mathematical exercise under experimentally controlled conditions is very similar to that observed when the person engages in quantitative thought in the course of daily life.

“We’re now able to eavesdrop on the brain in real life,” said Josef Parvizi, MD, PhD, associate professor of neurology and neurological sciences and director of Stanford’s Human Intracranial Cognitive Electrophysiology Program. Parvizi is the senior author of the study, published Oct. 15 in Nature Communications. The study’s lead authors are postdoctoral scholar Mohammad Dastjerdi, MD, PhD, and graduate student Muge Ozker.

The finding could lead to “mind-reading” applications that, for example, would allow a patient who is rendered mute by a stroke to communicate via passive thinking. Conceivably, it could also lead to more dystopian outcomes: chip implants that spy on or even control people’s thoughts.

“This is exciting, and a little scary,” said Henry Greely, JD, the Deane F. and Kate Edelman Johnson Professor of Law and steering committee chair of the Stanford Center for Biomedical Ethics, who played no role in the study but is familiar with its contents and described himself as “very impressed” by the findings. “It demonstrates, first, that we can see when someone’s dealing with numbers and, second, that we may conceivably someday be able to manipulate the brain to affect how someone deals with numbers.”

The researchers monitored electrical activity in a region of the brain called the intraparietal sulcus, known to be important in attention and eye and hand motion. Previous studies have hinted that some nerve-cell clusters in this area are also involved in numerosity, the mathematical equivalent of literacy.

However, the techniques that previous studies have used, such as functional magnetic resonance imaging, are limited in their ability to study brain activity in real-life settings and to pinpoint the precise timing of nerve cells’ firing patterns. These studies have focused on testing just one specific function in one specific brain region, and have tried to eliminate or otherwise account for every possible confounding factor. …
Sep. 6, 2013 — Graphene is a sheet of carbon atoms arrayed in a honeycomb pattern, just a single atom thick. It could be a better semiconductor than silicon — if we could fashion it into ribbons 20 to 50 atoms wide. Could DNA help?

DNA is the blueprint for life. Could it also become the template for making a new generation of computer chips based not on silicon, but on an experimental material known as graphene? That’s the theory behind a process that Stanford chemical engineering professor Zhenan Bao reveals in Nature Communications.

Bao and her co-authors, former post-doctoral fellows Anatoliy Sokolov and Fung Ling Yap, hope to solve a problem clouding the future of electronics: consumers expect silicon chips to continue getting smaller, faster and cheaper, but engineers fear that this virtuous cycle could grind to a halt. The reason why has to do with how silicon chips work.

Everything starts with the notion of the semiconductor, a type of material that can be induced to either conduct or stop the flow of electricity. Silicon has long been the most popular semiconductor material used to make chips.

The basic working unit on a chip is the transistor. Transistors are tiny gates that switch electricity on or off, creating the zeroes and ones that run software.

To build more powerful chips, designers have done two things at the same time: they’ve shrunk transistors in size and also swung those gates open and shut faster and faster. The net result of these actions has been to concentrate more electricity in a diminishing space. So far that has produced smaller, faster, cheaper chips. But at a certain point, heat and other forms of interference could disrupt the inner workings of silicon chips.

“We need a material that will let us build smaller transistors that operate faster using less power,” Bao said.

Graphene has the physical and electrical properties to become a next-generation semiconductor material — if researchers can figure out how to mass-produce it. Graphene is a single layer of carbon atoms arranged in a honeycomb pattern. Visually it resembles chicken wire. …
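The heat problem behind “more electricity in a diminishing space” can be made concrete with a rough estimate. This is only a back-of-envelope sketch, not anything from the Stanford work: the C·V²·f expression is the standard first-order model of dynamic switching power, and every number below (capacitance, voltage, clock speed, activity factor, die area) is an illustrative assumption.

```python
# Rough sketch of why shrinking transistors while raising clock speed
# concentrates heat. Dynamic switching power per transistor is commonly
# approximated as activity * C * V^2 * f; all values here are
# illustrative assumptions, not measurements from any real chip.

def power_density_w_per_cm2(n_transistors, cap_f, volts, freq_hz,
                            activity, area_cm2):
    per_transistor = activity * cap_f * volts**2 * freq_hz
    return n_transistors * per_transistor / area_cm2

# Hypothetical chip: 1 billion transistors, 1 fF switched capacitance,
# 1 V supply, 2 GHz clock, 5% of gates switching per cycle, 1 cm^2 die.
density = power_density_w_per_cm2(
    n_transistors=1_000_000_000, cap_f=1e-15, volts=1.0,
    freq_hz=2e9, activity=0.05, area_cm2=1.0)
print(f"{density:.0f} W/cm^2")  # -> 100 W/cm^2
```

Even with these toy numbers the chip dissipates on the order of a hot stovetop per square centimeter, and packing the same switching activity into a smaller area only raises the density — the squeeze the article describes.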
Aug. 29, 2013 — You can’t resurrect a dead cell any more than you can breathe life into a brick, regardless of what you may have gleaned from zombie movies and Dr. Frankenstein. So when heart cells die from lack of blood flow during a heart attack, replacing those dead cells is vital to the heart muscle’s recovery.

But muscle tissue in the adult human heart has a limited capacity to heal, which has spurred researchers to try to give the healing process a boost. Various methods of transplanting healthy cells into a damaged heart have been tried, but have yet to yield consistent success in promoting healing.

Now, researchers at the Stanford University School of Medicine and Lucile Packard Children’s Hospital have developed a patch composed of structurally modified collagen that can be grafted onto damaged heart tissue. Their studies in mice have demonstrated that the patch not only speeds generation of new cells and blood vessels in the damaged area, it also limits the degree of tissue damage resulting from the original trauma.

The key, according to Pilar Ruiz-Lozano, PhD, associate professor of pediatrics, is that the patch doesn’t seek to replace the dead heart-muscle cells. Instead, it replaces the epicardium, the outer layer of heart tissue, which is not muscle tissue, but which protects and supports the heart muscle, or myocardium.

“This synthetic tissue has the mechanical properties of the embryonic epicardium,” said Ruiz-Lozano, who is the senior author of a study that describes the researchers’ findings. The study will be published online Aug. 29 in Biomaterials. Vahid Serpooshan, PhD, a postdoctoral scholar in cardiology, is the lead author.

Embryonic epicardium is significantly more flexible than adult epicardium, but more rigid and structured than existing materials, making it more conducive to growth of new tissue. …
Aug. 16, 2013 — Children with autism and average IQs consistently demonstrated superior math skills compared with nonautistic children in the same IQ range, according to a study by researchers at the Stanford University School of Medicine and Lucile Packard Children’s Hospital.

“There appears to be a unique pattern of brain organization that underlies superior problem-solving abilities in children with autism,” said Vinod Menon, PhD, professor of psychiatry and behavioral sciences and a member of the Child Health Research Institute at Packard Children’s.

The autistic children’s enhanced math abilities were tied to patterns of activation in a particular area of their brains — an area normally associated with recognizing faces and visual objects.

Menon is senior author of the study, published online Aug. 17 in Biological Psychiatry. Postdoctoral scholar Teresa Iuculano, PhD, is the lead author.

Children with autism have difficulty with social interactions, especially interpreting nonverbal cues in face-to-face conversations. They often engage in repetitive behaviors and have a restricted range of interests. But in addition to such deficits, children with autism sometimes exhibit exceptional skills or talents, known as savant abilities. For example, some can instantly recall the day of the week of any calendar date within a particular range of years — for example, that May 21, 1982, was a Friday. And some display superior mathematical skills.

“Remembering calendar dates is probably not going to help you with academic and professional success,” Menon said. “But being able to solve numerical problems and developing good mathematical skills could make a big difference in the life of a child with autism.”

The idea that people with autism could employ such skills in jobs, and get satisfaction from doing so, has been gaining ground in recent years.

The participants in the study were 36 children, ages 7 to 12. Half had been diagnosed with autism; the other half served as the control group. …
Aug. 13, 2013 — The steady accumulation of a protein in healthy, aging brains may explain seniors’ vulnerability to neurodegenerative disorders, a new study by researchers at the Stanford University School of Medicine reports. The study’s unexpected findings could fundamentally change the way scientists think about neurodegenerative disease.

The pharmaceutical industry has spent billions of dollars on futile clinical trials directed at treating Alzheimer’s disease by ridding brains of a substance called amyloid plaque. But the new findings have identified another mechanism, involving an entirely different substance, that may lie at the root not only of Alzheimer’s but of many other neurodegenerative disorders — and, perhaps, even the more subtle decline that accompanies normal aging.

The study, published Aug. 14 in the Journal of Neuroscience, reveals that with advancing age, a protein called C1q, well-known as a key initiator of immune response, increasingly lodges at contact points connecting nerve cells in the brain to one another. Elevated C1q concentrations at these contact points, or synapses, may render them prone to catastrophic destruction by brain-dwelling immune cells, triggered when a catalytic event such as brain injury, systemic infection or a series of small strokes unleashes a second set of substances on the synapses.

“No other protein has ever been shown to increase nearly so profoundly with normal brain aging,” said Ben Barres, MD, PhD, professor and chair of neurobiology and senior author of the study. Examinations of mouse and human brain tissue showed as much as a 300-fold age-related buildup of C1q.

The finding was made possible by the diligence and ingenuity of the study’s lead author, Alexander Stephan, PhD, a postdoctoral scholar in Barres’ lab. Stephan screened about 1,000 antibodies before finding one that binds to C1q and nothing else. (Antibodies are proteins, generated by the immune system, that adhere to specific “biochemical shapes,” such as surface features of invading pathogens.)

Comparing brain tissue from mice of varying ages, as well as postmortem samples from a 2-month-old infant and an older person, the researchers showed that these C1q deposits weren’t randomly distributed along nerve cells but, rather, were heavily concentrated at synapses. Analyses of brain slices from mice across a range of ages showed that as the animals age, the deposits spread throughout the brain.

“The first regions of the brain to show a dramatic increase in C1q are places like the hippocampus and substantia nigra, the precise brain regions most vulnerable to neurodegenerative diseases like Alzheimer’s and Parkinson’s disease, respectively,” said Barres. Another region affected early on, the piriform cortex, is associated with the sense of smell, whose loss often heralds the onset of neurodegenerative disease.

Other scientists have observed moderate, age-associated increases (on the order of three- or four-fold) in brain levels of the messenger-RNA molecule responsible for transmitting the genetic instructions for manufacturing C1q to the protein-making machinery in cells. …
Aug. 6, 2013 — Something big is about to happen on the sun. According to measurements from NASA-supported observatories, the sun’s vast magnetic field is about to flip.

“It looks like we’re no more than three to four months away from a complete field reversal,” said solar physicist Todd Hoeksema of Stanford University. “This change will have ripple effects throughout the solar system.”

The sun’s magnetic field changes polarity approximately every 11 years. It happens at the peak of each solar cycle as the sun’s inner magnetic dynamo re-organizes itself. The coming reversal will mark the midpoint of Solar Cycle 24. Half of “solar max” will be behind us, with half yet to come.

Hoeksema is the director of Stanford’s Wilcox Solar Observatory, one of the few observatories in the world that monitors the sun’s polar magnetic fields. The poles are a herald of change. Just as Earth scientists watch our planet’s polar regions for signs of climate change, solar physicists do the same thing for the sun. Magnetograms at Wilcox have been tracking the sun’s polar magnetism since 1976, and they have recorded three grand reversals — with a fourth in the offing.

Solar physicist Phil Scherrer, also at Stanford, describes what happens: “The sun’s polar magnetic fields weaken, go to zero and then emerge again with the opposite polarity. …
Aug. 1, 2013 — More than 7 billion people live on this planet — members of a single species that originated in one place and migrated all over Earth over tens of thousands of years. But even though we all trace our family lineage to a few common ancestors, scientists still don’t know exactly when and how those few ancestors started to give rise to the incredible diversity of today’s population.

A brand-new finding, made using advanced analysis of DNA from all over the world, sheds new light on this mystery. By studying the DNA sequence of Y chromosomes of men from many different populations, scientists have determined that their male most recent common ancestor (MRCA) lived sometime between 120,000 and 156,000 years ago. It’s the first time the human ancestry has been traced back through the male line by sequencing the DNA of many entire Y chromosomes.

And it agrees reasonably well with previous findings about our female most recent common ancestor, made by studying DNA carried down through the human race’s female line. Such studies used DNA from mitochondria — structures inside cells — and placed the time of the most recent common ancestor between 99,000 and 148,000 years ago. That agreement makes the new finding especially significant.

The research was done by a team of scientists from Stanford University, the University of Michigan Medical School, Stony Brook University, and their colleagues, and is published in the journal Science. The team hopes their work will lead to further research on Y chromosomes as vehicles for studying human history — and tracing male lineages back to the common “Adam” ancestors.

Jeffrey Kidd, Ph.D., an Assistant Professor of Human Genetics and Computational Medicine & Bioinformatics who worked on the new study, notes that only recently has it become possible to sequence Y chromosomes, because of technical limitations of previous approaches. The new paper details how the team was able to make reliable measurements of the sequence variation along the Y chromosome — which is handed down only from father to son without exchanging, or recombining, genetic material with other chromosomes.

Kidd notes that this initial paper on Y chromosome sequence diversity provides important first evidence that the male most recent common ancestor did not live more recently than the female most recent common ancestor.

“We’re interested in understanding the historical relationships between many different human populations, and the migration patterns that have led to the peopling of the world,” he says. “We hope that others will make use of this approach and sequence additional chromosomes of interest that are related to the peopling of specific places.”

The study involved Y chromosomes obtained through the Human Genome Diversity Project, and from other sources. It included chromosomes from 69 men in several populations in sub-Saharan Africa, and from Siberia, Cambodia, Pakistan, Algeria and Mexico.

The great migrations of our ancestors out of Africa, across Asia and Europe and into the Americas all helped shape today’s populations — as did more recent forces related to colonialism and ever-growing global mobility. Genetic studies such as this one may help anthropologists understand those migrations — and their timing — even better by giving them a genetic “clock” to use when studying today’s humans, or potentially DNA extracted from ancient bones. It may also help scientists understand the great genetic diversity seen across Africa, and the evolutionary process that led to modern humans.

The reconciliation of the timing of “Adam” and “Eve,” however, may be this study’s most important immediate implication. “This has been a conundrum in human genetics for a long time,” said Carlos D. Bustamante, PhD, a professor of genetics at Stanford and senior author of the study. “Previous research has indicated that the male MRCA lived much more recently than the female MRCA. …
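The genetic “clock” idea rests on standard molecular-clock arithmetic, which can be sketched in a few lines. To be clear, this is a toy illustration, not the study’s actual method or data: the mutation rate, sequence length, and variant count below are made-up round numbers chosen only to land in a plausible range.

```python
# Toy molecular-clock estimate. Two Y-chromosome lineages both accumulate
# mutations after splitting from their common ancestor, so the expected
# number of differences between them is 2 * rate * sites * time.
# Inverting gives T = differences / (2 * rate * sites).
# All inputs below are illustrative assumptions, not the paper's data.

def tmrca_years(differences, sites, mu_per_site_per_year):
    """Rough time (in years) to the most recent common ancestor."""
    return differences / (2 * mu_per_site_per_year * sites)

# Hypothetical comparison: 10 Mb of Y-chromosome sequence, 1,000 observed
# nucleotide differences, mutation rate of 4e-10 per site per year.
t = tmrca_years(differences=1_000, sites=10_000_000,
                mu_per_site_per_year=4e-10)
print(f"{t:,.0f} years")  # -> 125,000 years
```

The real analysis is far more involved (a phylogenetic tree across the 69 men, calibrated mutation rates, confidence intervals), but the same divide-by-rate logic is what converts counts of sequence differences into a date in the 120,000–156,000-year range.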
July 30, 2013 — Medical noncompliance — or failure to follow the doctor’s orders — is estimated to increase healthcare costs in the US by $100 billion per year. Patients sometimes opt not to take medicines, for instance, because the side effects are unbearable or the dosing regimens are too complicated. But medical noncompliance may also stem from sheer inertia — the tendency to stay in the current state, even when that state is undesirable.

In a series of studies, Gaurav Suri and colleagues from Stanford and Tel Aviv Universities tested whether this status-quo bias could result in behavior that is detrimental, and whether such a bias could be lessened with minimal interventions. Their results are published in Psychological Science, a journal of the Association for Psychological Science.

In the first study, participants were told that the research would involve receiving electric shocks. One group was told that they were required to choose one of two options: They could press a button to stop the shock 10 seconds earlier, or press another button to keep the waiting time the same. As the researchers expected, most people opted to get the shock over with early.

In contrast, those participants who were told that they could press a time-decrease button if they wanted to were more likely to stick with the status quo: Only about 40% chose to push the button in order to shorten the trial.

The researchers saw similar results when they told participants that pressing a button would reduce the chance of shock by as much as 90%. Those participants who had to make a proactive choice to press the button opted to leave it untouched about half the time, even though it meant they had to withstand shocks they themselves rated as highly undesirable.

These studies clearly demonstrate that, when faced with a choice that requires them to make a proactive decision, people often opt to do nothing, even when actions that are easy to perform could noticeably improve their current state.

Interestingly, the researchers found that simply requiring participants to press the button on an early trial made them more likely to hit the button on later trials. Thus, while medical noncompliance may sometimes result from patient inaction, the researchers conclude that people may be capable of making productive choices about their health if given a nudge in the right direction.
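The cost of inertia in the second shock study can be put in expected-value terms. A minimal sketch, with assumptions flagged: the 90% risk reduction and the roughly half-the-time opt-in rate come from the article, while the baseline shock probability and the press rate under forced choice are invented purely for illustration.

```python
# Expected shock probability across a group: pressers face the reduced
# risk, non-pressers the full baseline. The 0.9 reduction and the 0.5
# opt-in press rate echo the article; base_prob=1.0 and the forced-choice
# press rate of 0.9 are illustrative assumptions only.

def expected_shock_prob(base_prob, press_rate, reduction=0.9):
    # Weighted average over pressers (reduced risk) and non-pressers.
    return base_prob * (press_rate * (1 - reduction) + (1 - press_rate))

opt_in = expected_shock_prob(base_prob=1.0, press_rate=0.5)   # ~half press
forced = expected_shock_prob(base_prob=1.0, press_rate=0.9)   # assumed
print(round(opt_in, 2), round(forced, 2))  # -> 0.55 0.19
```

Under these toy numbers, leaving the choice to inertia nearly triples the expected harm relative to a condition where most people act, which is the gap the researchers’ early-trial nudge is meant to close.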
July 24, 2013 — Research on human embryonic stem cells has been a political and religious lightning rod for more than a decade.The cells long have been believed to be the only naturally occurring pluripotent cells. (Under the right conditions, pluripotent cells can become any other cell in the body.) But some people object to the fact that the embryo is destroyed during their isolation. Induced pluripotent stem cells, created by experimentally manipulating an adult cell such as a skin or nerve cell, are much more ethically palatable. But many researchers feel it is important to continue studying both types of cells.In 2006, a group of researchers led by Mariusz Ratajczak, MD, PhD, at the University of Louisville, described another possible alternative: a special population of very small, pluripotent embryonic-like cells in adult bone marrow of mice and humans. These cells, called VSEL (very small embryonic-like) cells, presumably arise through the self-renewal of embryonic stem cells during the developmental process and, as described, could provide all the benefits of embryonic stem cell research with none of the ethical controversy. 
However, subsequent research from other labs has provided conflicting results as to the pluripotency — and even the existence — of VSEL cells in bone marrow. A company, NeoStem, has proposed a human clinical trial of the cells for periodontitis to begin this year. But scientists in the laboratory of Irving Weissman, MD, a professor of pathology at the Stanford University School of Medicine, say they have been unable to identify any very small, pluripotent cells in the bone marrow of mice, despite exhaustive efforts to duplicate the original experimental procedures.

“It has become important to know to what extent and where these VSEL cells exist to understand how they may affect the field of stem cell research,” said Weissman, who directs Stanford’s Institute for Stem Cell Biology and Regenerative Medicine and the Ludwig Center for Cancer Stem Cell Research and Medicine at Stanford. “We tried as hard as we could to replicate the original published results using the methods described and were unable to detect these cells in either the bone marrow or the blood of laboratory mice.”

Although other groups have seemingly confirmed the existence of these cells as defined by size and the expression of key cell-surface molecules, Weissman’s study is the first to evaluate the biological potency of the cells. The research will be published online July 24 in Stem Cell Reports. Weissman, who is also the Virginia & D.K. Ludwig Professor for Clinical Investigation in Cancer Research and a member of the Stanford Cancer Institute, shares senior authorship of the study with instructor Jun Seita, MD, PhD. Postdoctoral scholars Masanori Miyanishi, PhD, and Yasuo Mori, MD, PhD, are the lead authors.

Using a variety of methods, the researchers found that most of the very small (less than 5 micrometers in diameter) particles in mouse bone marrow were not cells, but were in fact cell debris or dead cells with a less-than-normal complement of DNA. …Read more
July 17, 2013 — Extreme weather, sea level rise and degraded coastal systems are placing people and property at greater risk along the coast. Natural habitats such as dunes and reefs are critical to protecting millions of U.S. residents and billions of dollars in property from coastal storms, according to a new study by scientists with the Natural Capital Project at the Stanford Woods Institute for the Environment.

The study, “Coastal habitats shield people and property from sea-level rise and storms,” published July 14 in the journal Nature Climate Change, offers the first comprehensive map of the entire U.S. coastline showing where and how much protection communities get from natural habitats such as sand dunes, coral reefs, sea grasses and mangroves. Intact ecosystems near vulnerable coastal communities can reduce both the likelihood and the magnitude of losses. One map shows the predicted exposure of the United States coastline and coastal population to sea level rise and storms in the year 2100. An interactive map can be zoomed in on the West, Gulf or East coasts; Hawaii or Alaska; or the continental United States.

“The natural environment plays a key role in protecting our nation’s coasts,” said study lead author Katie Arkema, a Woods postdoctoral scholar. “If we lose these defenses, we will either have to have massive investments in engineered defenses or risk greater damage to millions of people and billions in property.”

With the release of the Obama administration’s Climate Action Plan on June 25, there is renewed interest in coastal resilience and climate adaptation planning, as well as in finding natural ways to protect America’s coastline. Billions of dollars will soon be spent on restoration activities in the Gulf of Mexico and in the Eastern Seaboard areas affected by Hurricane Sandy.
Leaders can act now to factor natural capital into decisions that could have long-term benefits.

“As a nation, we should be investing in nature to protect our coastal communities,” said Mary Ruckelshaus, managing director of the Natural Capital Project. “The number of people, poor families, elderly and total value of residential property that are most exposed to hazards can be reduced by half if existing coastal habitats remain fully intact.”

At a moment when many coastal planners are considering their options for dealing with the impacts of sea level rise, the study provides both a national and a localized look at coastal areas where restoration and conservation of natural habitats could make the biggest difference.

“Hardening our shorelines with sea walls and other costly engineering shouldn’t be the default solution,” said Peter Kareiva, the chief scientist at The Nature Conservancy and a co-author of the study. …Read more
July 17, 2013 — New research shows that great white sharks power their non-stop journeys of more than 2,500 miles with energy stored as fat and oil in their massive livers. The findings provide novel insights into the biology of these ocean predators.

Great white sharks are not exactly known as picky eaters, so it might seem obvious that these voracious predators would dine often and well on their migrations across the Pacific Ocean. But not so, according to new research by scientists at Stanford University and the Monterey Bay Aquarium. The researchers’ findings, published July 17 in Proceedings of the Royal Society B, reveal previously unknown details of how great white sharks power themselves and stay buoyant on non-stop trips of more than 2,500 miles. The discoveries have potentially broad implications for conservation and management of coastal waters.

“We have a glimpse now of how white sharks come in from nutrient-poor areas offshore, feed where elephant seal populations are expanding — much like going to an Outback Steakhouse — and store the energy in their livers so they can move offshore again,” said researcher Barbara Block, a professor of marine sciences and a senior fellow at the Stanford Woods Institute for the Environment. “It helps us understand how important their near-shore habitats are as fueling stations for their entire life history.”

Just as bears put on fat to keep them going through long months of hibernation, ocean-going mammals such as whales and sea lions build up blubber to burn on their long migrations. Until now, little was known about how sharks, which carry fat in their massive livers rather than in external blubber, make similar voyages. In a study that began as a summer project of Stanford undergraduate Gen Del Raye, researchers first looked at a well-fed juvenile great white shark at the Monterey Bay Aquarium.
They documented over time a steady increase in buoyancy as the shark’s body mass increased, presumably due to the addition of stored oils in its liver. The researchers then turned to detailed data records from electronically tagged white sharks free-swimming in the eastern Pacific Ocean. Using these data, which include location, depth and water temperature, the scientists identified periods of “drift diving,” a common behavior of marine animals in which they passively descend while momentum carries them forward like underwater hang gliders.

By measuring the rate at which sharks sink during drift dives, the researchers were able to estimate the amount of oil in the animals’ livers, which account for up to a quarter of their body weight. A quicker descent meant less oil was present to provide buoyancy; a slower descent equated with more oil.

“Sharks face an interesting dilemma,” said Sal Jorgensen, a research scientist at the Monterey Bay Aquarium. …Read more
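The descent-rate logic above can be sketched as a simple drag-balance model. This is not the authors’ method, and every constant and dimension below is an illustrative assumption; the sketch only shows how, all else being equal, a slower terminal sink rate maps to a larger inferred liver-oil fraction.

```python
# Hedged sketch, not the study's model: at terminal drift speed, a shark's
# net (negative) buoyancy is balanced by hydrodynamic drag, so a slower sink
# implies a lower mean body density, i.e. more low-density liver oil.
# All constants are illustrative assumptions.

RHO_SEAWATER = 1025.0  # kg/m^3
RHO_TISSUE = 1080.0    # kg/m^3, lean shark tissue (assumed)
RHO_OIL = 915.0        # kg/m^3, liver oil (assumed)
G = 9.81               # m/s^2
CD = 0.8               # drag coefficient for a passive glide (assumed)

def drag_force(speed: float, frontal_area: float) -> float:
    """Drag (N) that balances net weight at the terminal descent speed."""
    return 0.5 * RHO_SEAWATER * CD * frontal_area * speed ** 2

def oil_volume_fraction(speed: float, frontal_area: float, body_volume: float) -> float:
    """Infer the fraction of body volume occupied by oil from sink speed.

    Net weight = (mean density - seawater density) * volume * g, and the mean
    density is a tissue/oil mixture; solve the mixing rule
    mean = f * RHO_OIL + (1 - f) * RHO_TISSUE for the oil fraction f.
    """
    net_weight = drag_force(speed, frontal_area)
    mean_density = RHO_SEAWATER + net_weight / (body_volume * G)
    return (RHO_TISSUE - mean_density) / (RHO_TISSUE - RHO_OIL)

# A faster sink should come out leaner than a slower one:
lean = oil_volume_fraction(speed=0.5, frontal_area=0.3, body_volume=1.0)
fat = oil_volume_fraction(speed=0.3, frontal_area=0.3, body_volume=1.0)
print(f"inferred oil fraction: {lean:.2f} (fast sink) vs {fat:.2f} (slow sink)")
```

With these made-up parameters the slower-sinking shark comes out with a higher oil fraction, which is the qualitative relationship the tagging study exploited.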
July 2, 2013 — An existing FDA-approved drug improves cognitive function in a mouse model of Down syndrome, according to a new study by researchers at the Stanford University School of Medicine.

The drug, an asthma medication called formoterol, strengthened nerve connections in the hippocampus, a brain center used for spatial navigation, paying attention and forming new memories, the study said. It also improved contextual learning, in which the brain integrates spatial and sensory information. Both hippocampal function and contextual learning, which are impaired in Down syndrome, depend on the brain having a good supply of the neurotransmitter norepinephrine. This neurotransmitter sends its signal via several types of receptors on the neurons, including a group called beta-2 adrenergic receptors.

“This study provides the initial proof-of-concept that targeting beta-2 adrenergic receptors for treatment of cognitive dysfunction in Down syndrome could be an effective strategy,” said Ahmed Salehi, MD, PhD, the study’s senior author and a clinical associate professor of psychiatry and behavioral sciences. The study will be published online July 2 in Biological Psychiatry.

Down syndrome, which is caused by an extra copy of chromosome 21, results in both physical and cognitive problems. While many of the physical issues, such as vulnerability to heart problems, can now be treated, no treatments exist for poor cognitive function. As a result, children with Down syndrome fall behind their peers in cognitive development. In addition, adults with Down syndrome develop Alzheimer’s-type pathology in their brains by age 40. Down syndrome affects about 400,000 people in the United States and 6 million worldwide.

In prior Down syndrome research, scientists have seen deterioration of the brain center that manufactures norepinephrine in both people with Down syndrome and its mouse model.
Earlier work by Salehi’s team found that giving a norepinephrine precursor could improve cognitive function in a mouse model genetically engineered to mimic Down syndrome. The new study refined this work by targeting only one group of receptors that respond to norepinephrine: the beta-2 adrenergic receptors in the brain. The researchers began by giving mice a compound that blocks the action of beta-2 adrenergic receptors outside the brain. …Read more
July 1, 2013 — Long-term hearing loss from loud explosions, such as blasts from roadside bombs, may not be as irreversible as previously thought, according to a new study by researchers at the Stanford University School of Medicine.

Using a mouse model, the study found that loud blasts actually cause hair-cell and nerve-cell damage, rather than structural damage, to the cochlea, the auditory portion of the inner ear. This could be good news for the millions of soldiers and civilians who, after surviving these often devastating bombs, suffer long-term hearing damage. “It means we could potentially try to reduce this damage,” said John Oghalai, MD, associate professor of otolaryngology and senior author of the study, published July 1 in PLOS ONE. If the cochlea, an extremely delicate structure, had been shredded and ripped apart by a large blast, as earlier studies have asserted, the damage would be irreversible. (The researchers presume that the damage seen in those previous studies may have been due to the use of older, less sophisticated imaging techniques.)

“The most common issue we see veterans for is hearing loss,” said Oghalai, a scientist and clinician who treats patients at Stanford Hospital & Clinics and directs the hearing center at Lucile Packard Children’s Hospital.

The increasingly common use of improvised explosive devices, or IEDs, around the world provided the impetus for the new study, which was primarily funded by the U.S. Department of Defense. Among veterans with service-connected disabilities, tinnitus — a constant ringing in the ears — is the most prevalent condition; hearing loss is the second most prevalent. But the results of the study would hold true for anyone exposed to loud blasts from other sources, such as jet engines, air bags or gunfire. More than 60 percent of wounded-in-action service members have eardrum injuries, tinnitus or hearing loss, or some combination of these, the study says.
Twenty-eight percent of all military personnel experience some degree of hearing loss post-deployment. The most devastating effect of blast injury to the ear is permanent hearing loss due to trauma to the cochlea. …Read more
June 27, 2013 — A chemical code scrawled on histones — the protein husks that coat DNA in every animal or plant cell — determines which genes in that cell are turned on and which are turned off. Now, Stanford University School of Medicine researchers have taken a new step in deciphering that histone code.

In a study published June 27 in Cell Reports, a team led by Thomas Rando, MD, PhD, professor of neurology and neurological sciences and chief of the Veterans Affairs Palo Alto Health Care System’s neurology service, has identified characteristic differences in “histone signatures” between stem cells from the muscles of young mice and old mice. The team also distinguished histone-signature differences between quiescent and active stem cells in the muscles of young mice.

“We’ve been trying to understand both how the different states a cell finds itself in can be defined by the markings on the histones surrounding its DNA, and to find an objective way to define the ‘age’ of a cell,” said Rando, who is also director of Stanford’s Glenn Laboratories for the Biology of Aging and deputy director of the Stanford Center on Longevity.

While all cells in a person’s body share virtually the same genes, these cells can be as different from one another as a nerve cell is from a fat cell. This is because only a fraction of a cell’s genes are actually “turned on” — actively involved in the production of one or another protein. A muscle cell produces the proteins it uses to be a muscle cell, a liver cell produces those it needs in order to be a liver cell, and so forth. Rando’s team thinks the same kinds of on/off differences may distinguish old stem cells from young stem cells.

In human cells, the DNA in which genes are found doesn’t float loose inside the cell nucleus but is, rather, packaged inside protein “husks” called histones. Chemical marks on the histones, which sheathe our chromosomal DNA in each cell’s nucleus, act as “stop” and “go” traffic signals.
These signals tell the complex molecular machinery that translates genes’ instructions into newly produced proteins which genes to read and which ones to skip.

In 2005, Rando and his colleagues published a study in Nature showing that stem cells in several tissues of older mice, including muscle, seemed to act younger after continued exposure to younger mice’s blood. Their capacity to divide, differentiate and repopulate tissues, which typically declines with an organism’s advancing age, resembled that of their stem-cell counterparts in younger animals.

This naturally led to curiosity about exactly what is happening inside a cell to rejuvenate it, said Rando. One likely place to look for an answer was the histones, to see whether changes in the patterns of the chemical marks on them might reveal any secrets, at the cellular level, of the aging process we all experience — and, perhaps, whether there might be anything we can do about it. …Read more
May 30, 2013 — A chemical reaction between iron-containing minerals and water may produce enough hydrogen “food” to sustain microbial communities living in pores and cracks within the enormous volume of rock below the ocean floor and parts of the continents, according to a new study led by the University of Colorado Boulder.
The findings, published in the journal Nature Geoscience, also hint at the possibility that hydrogen-dependent life could have existed where iron-rich igneous rocks on Mars were once in contact with water.
Scientists have thoroughly investigated how rock-water reactions can produce hydrogen in places where the temperatures are far too hot for living things to survive, such as in the rocks that underlie hydrothermal vent systems on the floor of the Atlantic Ocean. The hydrogen gases produced in those rocks do eventually feed microbial life, but the communities are located only in small, cooler oases where the vent fluids mix with seawater.
The new study, led by CU-Boulder Research Associate Lisa Mayhew, set out to investigate whether hydrogen-producing reactions also could take place in the much more abundant rocks that are infiltrated with water at temperatures cool enough for life to survive.
“Water-rock reactions that produce hydrogen gas are thought to have been one of the earliest sources of energy for life on Earth,” said Mayhew, who worked on the study as a doctoral student in CU-Boulder Associate Professor Alexis Templeton’s lab in the Department of Geological Sciences.
“However, we know very little about the possibility that hydrogen will be produced from these reactions when the temperatures are low enough that life can survive. If these reactions could make enough hydrogen at these low temperatures, then microorganisms might be able to live in the rocks where this reaction occurs, which could potentially be a huge subsurface microbial habitat for hydrogen-utilizing life.”
When igneous rocks, which form when magma slowly cools deep within Earth, are infiltrated by ocean water, some of the minerals release unstable atoms of iron into the water. At high temperatures — warmer than 392 degrees Fahrenheit (200 degrees Celsius) — scientists know that the unstable atoms, known as reduced iron, can rapidly split water molecules and produce hydrogen gas, as well as new minerals containing iron in the more stable, oxidized form.
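The article does not name a specific mineral reaction; a standard textbook example of reduced iron splitting water at high temperature is the oxidation of fayalite (the iron end-member of olivine) to magnetite:

```latex
3\,\mathrm{Fe_2SiO_4} + 2\,\mathrm{H_2O} \;\longrightarrow\; 2\,\mathrm{Fe_3O_4} + 3\,\mathrm{SiO_2} + 2\,\mathrm{H_2}
```

Here iron moves from the reduced ferrous state in fayalite to the partially oxidized state in magnetite while water is reduced to hydrogen gas, which is the same overall chemistry the low-temperature experiments described below set out to probe.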
Mayhew and her co-authors, including Templeton, submerged rocks in water in the absence of oxygen to determine if a similar reaction would take place at much lower temperatures, between 122 and 212 degrees Fahrenheit (50 to 100 degrees Celsius). The researchers found that the rocks did create hydrogen — potentially enough hydrogen to support life.
To understand in more detail the chemical reactions that produced the hydrogen in the lab experiments, the researchers used “synchrotron radiation” — which is created by electrons orbiting in a human-made storage ring — to determine the type and location of iron in the rocks on a microscale.
The researchers expected to find that the reduced iron in minerals like olivine had converted to the more stable oxidized state, just as occurs at higher temperatures. But when they conducted their analyses at the Stanford Synchrotron Radiation Lightsource at Stanford University, they were surprised to find newly formed oxidized iron on “spinel” minerals found in the rocks. Spinels are minerals with a cubic structure that are highly conductive.
Finding oxidized iron on the spinels led the team to hypothesize that, at low temperatures, the conductive spinels were helping facilitate the exchange of electrons between reduced iron and water, a process that is necessary for the iron to split the water molecules and create the hydrogen gas.
“After observing the formation of oxidized iron on spinels, we realized there was a strong correlation between the amount of hydrogen produced and the volume percent of spinel phases in the reaction materials,” Mayhew said. “Generally, the more spinels, the more hydrogen.”
Not only is there a potentially large volume of rock on Earth that may undergo these low temperature reactions, but the same types of rocks also are prevalent on Mars, Mayhew said. Minerals that form as a result of the water-rock reactions on Earth have been detected on Mars as well, which means that the process described in the new study may have implications for potential Martian microbial habitats.
Mayhew and Templeton are already building on this study with their co-authors, including Thomas McCollom at CU-Boulder’s Laboratory for Atmospheric and Space Physics, to see if the hydrogen-producing reactions can actually sustain microbes in the lab.
This study was funded by the David and Lucile Packard Foundation and by a U.S. Department of Energy Early Career grant to Templeton. Read more
May 30, 2013 — Researchers at the Stanford University School of Medicine have found that a naturally occurring protein secreted only in discrete areas of the mammalian brain may act as a Valium-like brake on certain types of epileptic seizures.
The protein is known as diazepam binding inhibitor, or DBI. It calms the rhythms of a key brain circuit and so could prove valuable in developing novel, less side-effect-prone therapies not only for epilepsy but possibly for anxiety and sleep disorders, too. The researchers’ discoveries will be published May 30 in Neuron.
“This is one of the most exciting findings we have had in many years,” said John Huguenard, PhD, professor of neurology and neurological sciences and the study’s senior author. “Our results show for the first time that a nucleus deep in the middle of the brain generates a small protein product, or peptide, that acts just like benzodiazepines.” This drug class includes not only the anti-anxiety compound Valium (generic name diazepam), first marketed in 1963, but also its predecessor Librium, discovered in 1955, and the more recently developed sleep aid Halcion.
Valium, which is notoriously addictive, prone to abuse and dangerous at high doses, was an early drug treatment for epilepsy, but it has fallen out of use for this purpose because its efficacy quickly wears off and because newer, better anti-epileptic drugs have come along.
For decades, DBI has also been known to researchers under a different name: ACBP. In fact, it is found in every cell of the body, where it is an intracellular transporter of a metabolite called acyl-CoA. “But in a very specific and very important brain circuit that we’ve been studying for many years, DBI not only leaves the cells that made it but is — or undergoes further processing to become — a natural anti-epileptic compound,” Huguenard said. “In this circuit, DBI or one of its peptide fragments acts just like Valium biochemically and produces the same neurological effect.”
Other endogenous (internally produced) substances have been shown to cause effects similar to psychoactive drugs. In 1974, endogenous proteins called endorphins, with biochemical activity and painkilling properties similar to that of opiates, were isolated. A more recently identified set of substances, the endocannabinoids, mimic the memory-, appetite- and analgesia-regulating actions of the psychoactive components of cannabis, or marijuana.
DBI binds to receptors that sit on nerve-cell surfaces and are responsive to a tiny but important chemical messenger, or neurotransmitter, called GABA. The roughly one-fifth of all nerve cells in the brain that are inhibitory mainly do their job by secreting GABA, which binds to receptors on nearby nerve cells, rendering those cells temporarily unable to fire any electrical signals of their own.
Benzodiazepine drugs enhance GABA-induced inhibition by binding to a different site on GABA receptors from the one GABA binds to. That changes the receptor’s shape, making it hyper-responsive to GABA. These receptors come in many different types and subtypes, not all of which are responsive to benzodiazepines. DBI binds to the same spot to which benzodiazepines bind on benzodiazepine-responsive GABA receptors. But until now, exactly what this means has remained unclear.
Huguenard, along with postdoctoral scholar and lead author Catherine Christian, PhD, and several Stanford colleagues, zeroed in on DBI’s function in the thalamus, a deep-brain structure that serves as a relay station for sensory information, and which previous studies in the Huguenard lab have implicated in the initiation of seizures. The researchers used single-nerve-cell-recording techniques to show that within a GABA-secreting nerve-cell cluster called the thalamic reticular nucleus, DBI has the same inhibition-boosting effect on benzodiazepine-responsive GABA receptors as do benzodiazepines. Using bioengineered mice in which those receptors’ benzodiazepine-binding site was defective, they showed that DBI lost its effect, which Huguenard and Christian suggested makes these mice seizure-prone.
In another seizure-prone mouse strain in which that site is intact but the gene for DBI is missing, the scientists saw diminished inhibitory activity on the part of benzodiazepine-responsive GABA receptors. Re-introducing the DBI gene to the brains of these mice via a sophisticated laboratory technique restored the strength of the GABA-induced inhibition. In normal mice, a compound known to block the benzodiazepine-binding site weakened these same receptors’ inhibitory activity in the thalamic reticular nucleus, even in the absence of any administered benzodiazepines. This suggested that some naturally occurring benzodiazepine-like substance was being displaced from the benzodiazepine-binding site by the drug. In DBI-gene-lacking mice, the blocking agent had no effect at all.
Huguenard’s team also showed that DBI has the same inhibition-enhancing effect on nerve cells in an adjacent thalamic region — but also that, importantly, no DBI is naturally generated in or near this region; in the corticothalamic circuit, at least, DBI appears to be released only in the thalamic reticular nucleus. So, the actions of DBI on GABA receptors appear to be tightly controlled to occur only in specific brain areas.
Huguenard doesn’t know yet whether it is DBI per se, or one of its peptide fragments (and if so which one), that is exerting the active inhibitory role. But, he said, by finding out exactly which cells are releasing DBI under what biochemical circumstances, it may someday be possible to develop agents that could jump-start and boost its activity in epileptic patients at the very onset of seizures, effectively nipping them in the bud. Read more