Green tea extract boosts your brain power, especially working memory, new research shows

Green tea is said to have many positive effects on health. Now, researchers at the University of Basel are reporting the first evidence that green tea extract enhances cognitive function, in particular working memory. The Swiss findings suggest promising clinical implications for the treatment of cognitive impairments in psychiatric disorders such as dementia. The academic journal Psychopharmacology has published their results.

In the past, the main ingredients of green tea have been studied thoroughly in cancer research. More recently, scientists have also been investigating the beverage’s positive impact on the human brain. Several studies have linked green tea to beneficial effects on cognitive performance; however, the neural mechanisms underlying this cognitive-enhancing effect remained unknown.

Better memory

In a new study, the research teams of Prof. Christoph Beglinger from the University Hospital of Basel and Prof. Stefan Borgwardt from the Psychiatric University Clinics found that green tea extract increases the brain’s effective connectivity, meaning the causal influence that one brain area exerts over another. This effect on connectivity also led to improvement in actual cognitive performance: subjects tested significantly better on working memory tasks after administration of green tea extract.

For the study, healthy male volunteers received a soft drink containing several grams of green tea extract before they solved working memory tasks. …

Read more

Education attenuates impact of TBI on cognition

Kessler Foundation researchers have found that higher educational attainment (a proxy for intellectual enrichment) attenuates the negative impact of traumatic brain injury (TBI) on cognitive status. The brief report (Sumowski J, Chiaravalloti N, Krch D, Paxton J, DeLuca J. Education attenuates the negative impact of traumatic brain injury (TBI) on cognitive status) was published in the December issue of Archives of Physical Medicine & Rehabilitation, Volume 94, Issue 12: 2562-64.

Cognitive outcomes vary post-TBI, even among individuals with comparable injuries. To examine this finding, investigators looked at whether the cognitive reserve hypothesis helps to explain this differential cognitive impairment following TBI. Kessler Foundation investigators have previously supported the cognitive reserve hypothesis in persons with multiple sclerosis, demonstrating that lifetime intellectual enrichment protects patients from cognitive impairment, as published in Multiple Sclerosis Journal. In the current study, they sought to determine whether individuals with TBI who had greater intellectual enrichment pre-injury (estimated from education) are less vulnerable to cognitive impairment.

Researchers compared 44 people with moderate to severe TBI with 36 healthy controls. Cognitive status (processing speed, working memory, episodic memory) was evaluated with neuropsychological tasks. “Although cognitive status was worse in the TBI group,” said Dr. Sumowski, senior research scientist in Neuropsychology & Neuroscience Research at Kessler Foundation, “higher education attenuated the negative effect of TBI on cognitive status, such that persons with higher education were protected against TBI-related cognitive impairment.”

“These results support the hypothesis of cognitive reserve in TBI, i.e., as in MS, higher intellectual enrichment benefits cognitive status,” concluded Dr. Chiaravalloti, the Foundation’s director of TBI Research. …

Read more

I’m OK, you’re not OK: Right supramarginal gyrus plays an important role in empathy

Oct. 9, 2013 — Egoism and narcissism appear to be on the rise in our society, while empathy is on the decline. And yet, the ability to put ourselves in other people’s shoes is extremely important for our coexistence. A research team headed by Tania Singer from the Max Planck Institute for Human Cognitive and Brain Sciences has discovered that our own feelings can distort our capacity for empathy. This emotionally driven egocentricity is recognised and corrected by the brain. When, however, the right supramarginal gyrus doesn’t function properly or when we have to make particularly quick decisions, our empathy is severely limited.

When assessing the world around us and our fellow humans, we use ourselves as a yardstick and tend to project our own emotional state onto others. While cognition research has already studied this phenomenon in detail, nothing is known about how it works on an emotional level. It was assumed that our own emotional state can distort our understanding of other people’s emotions, in particular if these are completely different to our own. But this emotional egocentricity had not been measured before now.

This is precisely what the Max Planck researchers have accomplished in a complex marathon of experiments and tests. They also discovered the area of the brain responsible for this function, which helps us to distinguish our own emotional state from that of other people. …

Read more

Breakthrough distinguishes normal memory loss from disease

Sep. 11, 2013 — Cornell University researchers have developed a reliable method to distinguish memory declines associated with healthy aging from more serious memory disorders years before obvious symptoms emerge. The method also allows researchers to accurately predict who is more likely to develop cognitive impairment, without expensive tests or invasive procedures.

Their results hold promise for detecting cognitive impairment early and monitoring treatment, but also have implications for healthy adults, said Charles Brainerd, professor of human development and the study’s lead co-author with Valerie Reyna, director of the Institute for Human Neuroscience and professor of human development, both in Cornell’s College of Human Ecology. Their research, “Dual-retrieval models and neurocognitive impairment,” appears online in the Journal of Experimental Psychology: Learning, Memory and Cognition.

The memory abilities affected by cognitive impairment differ from those affected by healthy aging, the authors say, resulting in unique error patterns on neuropsychological tests of memory. Their theory-driven mathematical model detects these patterns by analyzing performance on such tests and measuring the separate memory processes used.

“With 10- or 15-minute recall tests already in common use worldwide, we can distinguish individuals who have or are at risk for developing cognitive impairment from healthy adults, and we can do so with better accuracy than any existing tools,” said Brainerd.

The notion that memory declines continuously throughout adulthood appears to be incorrect, they say. “When we separated out the cognitively impaired individuals, we found no evidence of further memory declines after the age of 69 in samples of nationally representative older adults and highly educated older adults,” said Reyna.

To develop their models, the team used data from two longitudinal studies of older adults — the Aging, Demographics and Memory Study, a nationally representative sample of older adults, and the Alzheimer’s Disease Neuroimaging Initiative — that include brain and behavioral measures as well as diagnoses of cognitive impairment and dementia.

Specifically, the researchers found that declines in reconstructive memory (recalling a word or event by piecing it together from clues about its meaning, for example, recalling that “dog” was presented in a word list by first remembering that household pets were presented in the list) were associated with mild cognitive impairment and Alzheimer’s dementia, but not with healthy aging. Declines in recollective memory — recalling a word or event exactly — were a feature of normal aging.

Over periods of one and a half to six years, declines in reconstructive memory processes were reliable predictors of future progression from healthy aging to mild cognitive impairment and Alzheimer’s dementia, and better predictors than the best genetic marker of such diseases. “Reconstructive memory is very stable in healthy individuals, so declines in this type of memory are a hallmark of neurocognitive impairment,” Reyna said.

Younger adults rely heavily on recollection, Brainerd said, but this method becomes increasingly inefficient throughout mid-adulthood. “Training people how to make better use of reconstructive recall as they age should assist healthy adult memory function,” he said. “Our analytical models are readily available for research and clinical use and could easily be incorporated into existing neuropsychological tests.”

Read more

Screening for minor memory changes will wrongly label many with dementia, warn experts

Sep. 10, 2013 — A political drive, led by the UK and US, to screen older people for minor memory changes (often called mild cognitive impairment or pre-dementia) is leading to unnecessary investigation and potentially harmful treatment for what is arguably an inevitable consequence of ageing, warn experts.

Their views come as the Preventing Overdiagnosis conference opens in New Hampshire, USA today (10 September), partnered by BMJ’s Too Much Medicine campaign, where experts from around the world will gather to discuss how to tackle the threat to health and the waste of money caused by unnecessary care.

A team of specialists in Australia and the UK says that expanding the diagnosis of dementia would result in up to 65% of people aged over 80 having Alzheimer’s disease diagnosed — and up to 23% of non-demented older people being labelled with dementia. They argue this policy is not backed by evidence and ignores the risks, harms and costs to individuals, families and societies. It may also divert resources that are badly needed for the care of people with advanced dementia.

Dementia is age related and, with an ageing population, is predicted to become an overwhelming and costly problem. But the evidence suggests that while 5-15% of people with mild cognitive impairment will progress to dementia each year, as many as 40-70% will not progress, and indeed their cognitive function may improve. Studies also show that the clinical tools used by doctors to diagnose dementia are not robust, and that many people who develop dementia do not meet definitions of mild cognitive impairment before diagnosis. But this has not deterred countries from developing policies to screen for pre-dementia.

For example, in the US, the Medicare insurance programme will cover an annual wellness visit to a physician that includes a cognitive impairment test. In England, the government has announced that it will reward general practitioners for assessing brain function in older patients — and has committed to having “a memory clinic in every town and every city” despite no sound evidence of benefit. This has led to the development of imaging techniques and tests that are increasingly used in diagnosis, despite uncertainty over their accuracy, say the authors.

The researchers say, however, that until such approaches are shown to be beneficial to individuals and societies, they should remain within the clinical research domain. Furthermore, there are no drugs that prevent the progression of dementia or are effective in patients with mild cognitive impairment, raising concerns that once patients are labelled with disease or pre-disease, they may try untested therapies and run the risk of adverse effects. They also question whether ageing of the population is becoming a “commercial opportunity” for developing screening, early diagnosis tests and medicines marketed to maintain cognition in old age.

The desire of politicians, dementia organisations, and academics and clinicians in the field to raise the profile of dementia is understandable, write the authors, “but we risk being conscripted into an unwanted war against dementia.” They suggest that the political rhetoric expended on preventing the burden of dementia would be much better served by efforts to reduce smoking and obesity, given current knowledge linking mid-life obesity and cigarettes with the risk of dementia.

“Current policy is rolling out untested and uncontrolled experiments in the frailest people in society without a rigorous evaluation of its benefits and harms to individuals, families, service settings, and professionals,” they conclude.
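To put the annual progression figures quoted above in context, here is a rough, illustrative calculation of our own (not part of the BMJ analysis): if one assumes a constant annual progression rate and treats each year independently, the chance of progressing from mild cognitive impairment to dementia within n years is 1 - (1 - p)^n.

```python
# Illustrative sketch only: compounds an assumed constant annual MCI-to-dementia
# progression rate over several years. The 5% and 15% bounds come from the article;
# the constant-rate and independence assumptions are ours.
def cumulative_progression(annual_rate: float, years: int) -> float:
    """Probability of progressing at least once within `years` years."""
    return 1.0 - (1.0 - annual_rate) ** years

for rate in (0.05, 0.15):  # the article's reported 5-15% per year
    risk = cumulative_progression(rate, years=5)
    print(f"annual rate {rate:.0%}: ~{risk:.0%} progress within 5 years, ~{1 - risk:.0%} do not")
```

Under these assumptions, roughly 23-56% of people with mild cognitive impairment would progress within five years, leaving about 44-77% who would not, which is broadly in line with the 40-70% non-progression figure quoted above.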

Read more

Mindfulness training improves attention in children

Sep. 5, 2013 — A short training course in mindfulness improves children’s ability to ignore distractions and concentrate better.

These are the findings of a study carried out by Dominic Crehan and Dr Michelle Ellefson at the University of Cambridge, being presented today, 6 September 2013, at the British Psychological Society’s Cognitive Developmental Psychology Annual Conference at the University of Reading.

Dominic explained: “Mindfulness involves paying attention in a particular way — on purpose, in the present moment, and non-judgementally. It has been shown to reduce levels of stress and depression, and to improve feelings of well-being, but to date researchers have not established a link between mindfulness and attention skills in children.”

The researchers recruited thirty children (girls and boys aged 10 to 11 years old) to take part in a mindfulness course as part of their school curriculum. The children took part in the mindfulness course in two groups at different times, and so the researchers were able to compare the groups and see the effects of the course. To do this, they measured the children’s levels of mindfulness using a questionnaire. They also measured their attention skills, using a computer game designed specifically for this purpose. They made these measurements on three occasions, at three-month intervals, so that they could measure changes in attention skills over time as a result of the mindfulness course.

The results indicated that an improvement in the children’s ability to focus and deal with distractions was associated with the mindfulness course.

Dominic said: “The ability to pay attention in class is crucial for success at school. Mindfulness appears to have an effect after only a short training course, which the children thoroughly enjoyed! Through their training, the children actually learn to watch their minds working and learn to control their attention. These findings could be particularly important for helping children with attention difficulties such as ADHD. …

Read more

Brain imaging study reveals the wandering mind behind insomnia

Aug. 30, 2013 — A new brain imaging study may help explain why people with insomnia often complain that they struggle to concentrate during the day even when objective evidence of a cognitive problem is lacking.

“We found that insomnia subjects did not properly turn on brain regions critical to a working memory task and did not turn off ‘mind-wandering’ brain regions irrelevant to the task,” said lead author Sean P.A. Drummond, PhD, associate professor in the department of psychiatry at the University of California, San Diego, and the VA San Diego Healthcare System, and Secretary/Treasurer of the Sleep Research Society. “Based on these results, it is not surprising that someone with insomnia would feel like they are working harder to do the same job as a healthy sleeper.”

The research team, led by Drummond and co-principal investigator Matthew Walker, PhD, studied 25 people with primary insomnia and 25 good sleepers. Participants had an average age of 32 years. The study subjects underwent a functional magnetic resonance imaging scan while performing a working memory task.

Results published in the September issue of the journal Sleep show that participants with insomnia did not differ from good sleepers in objective cognitive performance on the working memory task. However, the MRI scans revealed that people with insomnia could not modulate activity in brain regions typically used to perform the task. As the task got harder, good sleepers used more resources within the working memory network of the brain, especially the dorsolateral prefrontal cortex. Insomnia subjects, however, were unable to recruit more resources in these brain regions. Furthermore, as the task got harder, participants with insomnia did not dial down the “default mode” regions of the brain that are normally only active when our minds are wandering.

“The data help us understand that people with insomnia not only have trouble sleeping at night, but their brains are not functioning as efficiently during the day,” said Drummond. “Some aspects of insomnia are as much of a daytime problem as a nighttime problem. …

Read more

Human brains are hardwired for empathy, friendship

Aug. 22, 2013 — Perhaps one of the most defining features of humanity is our capacity for empathy — the ability to put ourselves in others’ shoes. A new University of Virginia study strongly suggests that we are hardwired to empathize because we closely associate people who are close to us — friends, spouses, lovers — with our very selves.

“With familiarity, other people become part of ourselves,” said James Coan, a psychology professor in U.Va.’s College of Arts & Sciences, who used functional magnetic resonance imaging brain scans to find that people closely associate those to whom they are attached with themselves. The study appears in the August issue of the journal Social Cognitive and Affective Neuroscience. “Our self comes to include the people we feel close to,” Coan said.

In other words, our self-identity is largely based on whom we know and empathize with. Coan and his U.Va. colleagues conducted the study with 22 young adult participants who underwent fMRI scans of their brains during experiments to monitor brain activity while under threat of receiving mild electrical shocks to themselves or to a friend or stranger.

The researchers found, as they expected, that regions of the brain responsible for threat response — the anterior insula, putamen and supramarginal gyrus — became active under threat of shock to the self. In the case of threat of shock to a stranger, those regions displayed little activity. However, when the threat of shock was to a friend, the brain activity of the participant became essentially identical to the activity displayed under threat to the self.

“The correlation between self and friend was remarkably similar,” Coan said. “The finding shows the brain’s remarkable capacity to model self to others; that people close to us become a part of ourselves, and that is not just metaphor or poetry, it’s very real. …

Read more

Playing video games can boost brain power

Aug. 21, 2013 — Certain types of video games can help to train the brain to become more agile and improve strategic thinking, according to scientists from Queen Mary University of London and University College London (UCL).

The researchers recruited 72 volunteers and measured their ‘cognitive flexibility’, described as a person’s ability to adapt and switch between tasks, and to think about multiple ideas at a given time to solve problems.

Two groups of volunteers were trained to play different versions of a real-time strategy game called StarCraft, a fast-paced game where players have to construct and organise armies to battle an enemy. A third group played a life simulation video game called The Sims, which does not require much memory or many tactics.

All the volunteers played the video games for 40 hours over six to eight weeks, and were subjected to a variety of psychological tests before and after. All the participants happened to be female, as the study was unable to recruit a sufficient number of male volunteers who played video games for less than two hours a week.

The researchers discovered that those who played StarCraft were quicker and more accurate in performing cognitive flexibility tasks than those who played The Sims.

Dr Brian Glass from Queen Mary’s School of Biological and Chemical Sciences said: “Previous research has demonstrated that action video games, such as Halo, can speed up decision making, but the current work finds that real-time strategy games can promote our ability to think on the fly and learn from past mistakes. Our paper shows that cognitive flexibility, a cornerstone of human intelligence, is not a static trait but can be trained and improved using fun learning tools like gaming.”

Professor Brad Love from UCL said: “Cognitive flexibility varies across people and at different ages. For example, a fictional character like Sherlock Holmes has the ability to simultaneously engage in multiple aspects of thought and mentally shift in response to changing goals and environmental conditions. Creative problem solving and ‘thinking outside the box’ require cognitive flexibility. Perhaps in contrast to the repetitive nature of work in past centuries, the modern knowledge economy places a premium on cognitive flexibility.”

Dr Glass added: “The volunteers who played the most complex version of the video game performed the best in the post-game psychological tests. We need to understand now what exactly about these games is leading to these changes, and whether these cognitive boosts are permanent or if they dwindle over time. Once we have that understanding, it could become possible to develop clinical interventions for symptoms related to attention deficit hyperactivity disorder or traumatic brain injuries, for example.”

Read more

Working-life training and maternity leave are related to slower cognitive decline in later life

Aug. 5, 2013 — Employment gaps may promote, but may also reduce, cognitive function in older age, as new research from the University of Luxembourg has shown. In particular, some of the findings suggest that leaves reported as unemployment and sickness are associated with a higher risk of cognitive impairment, indicating that these kinds of employment gaps may decrease cognitive reserve in the long run. The strongest evidence was found for training and maternity spells being related to slower cognitive decline, suggesting beneficial associations of these kinds of leaves with cognitive function.

In this new publication, Dr Anja Leist from the University’s Research Unit INSIDE concludes that employment gaps during working life have the potential to increase or decrease cognitive reserve. How different activities performed during employment gaps are associated with later cognitive function and change had not been systematically investigated until now.

Based on complete work histories and extensive cognitive assessments among respondents to the Survey of Health, Ageing and Retirement in Europe (SHARE) in 13 countries, the research team examined how employment gaps associated with unemployment, sickness, homemaking, training and maternity spells relate to cognitive function and aging-related cognitive decline at older age. These results provide the first evidence for possible beneficial effects of cognitively stimulating activities during employment gaps. In analyses stratified by occupational class, the team found that unemployment and sickness spells were more strongly associated with cognitive impairment for workers in higher occupations. Further research is needed to examine whether these associations are indeed causal.

“For me it was exciting to think of employment gaps as a possibility to increase cognitive reserve during working life. There may be different mechanisms at work; for instance, training spells may lead to higher socioeconomic status later on, whereas maternity spells may reduce the stress of balancing family and work tasks, and we need further research to disentangle these effects. The findings are in line with other studies that suggest that cognitively stimulating activities can indeed increase cognitive reserve and delay cognitive decline in older age,” says Anja Leist, who is supported by a postdoctoral research fellowship of the National Research Fund Luxembourg.

Story Source: The above story is based on materials provided by Université du Luxembourg, via AlphaGalileo. …

Read more

Exercise may be the best medicine for Alzheimer’s disease

July 30, 2013 — New research out of the University of Maryland School of Public Health shows that exercise may improve cognitive function in those at risk for Alzheimer’s by improving the efficiency of brain activity associated with memory. Memory loss leading to Alzheimer’s disease is one of the greatest fears among older Americans. While some memory loss is normal and to be expected as we age, a diagnosis of mild cognitive impairment, or MCI, signals more substantial memory loss and a greater risk for Alzheimer’s, for which there currently is no cure.

The study, led by Dr. J. Carson Smith, assistant professor in the Department of Kinesiology, provides new hope for those diagnosed with MCI. It is the first to show that an exercise intervention in older adults with mild cognitive impairment (average age 78) improved not only memory recall, but also brain function, as measured by functional neuroimaging (via fMRI). The findings are published in the Journal of Alzheimer’s Disease.

“We found that after 12 weeks of being on a moderate exercise program, study participants improved their neural efficiency — basically they were using fewer neural resources to perform the same memory task,” says Dr. Smith. “No study has shown that a drug can do what we showed is possible with exercise.”

Recommended Daily Activity: Good for the Body, Good for the Brain

Two groups of physically inactive older adults (ranging from 60-88 years old) were put on a 12-week exercise program that focused on regular treadmill walking and was guided by a personal trainer. Both groups — one of which included adults with MCI and the other adults with healthy brain function — improved their cardiovascular fitness by about ten percent at the end of the intervention. …

Read more

Brain picks out salient sounds from background noise by tracking frequency and time, study finds

July 23, 2013 — New research reveals how our brains are able to pick out important sounds from the noisy world around us. The findings, published online today in the journal eLife, could lead to new diagnostic tests for hearing disorders.

Our ears can effortlessly pick out the sounds we need to hear from a noisy environment — hearing our mobile phone ringtone in the middle of the Notting Hill Carnival, for example — but how our brains process this information (the so-called ‘cocktail party problem’) has been a longstanding research question in hearing science.

Researchers have previously investigated this using simple sounds such as two tones of different pitches, but now researchers at UCL and Newcastle University have used complicated sounds that are more representative of those we hear in real life. The team used ‘machine-like beeps’ that overlap in both frequency and time to recreate a busy sound environment and obtain new insights into how the brain solves this problem. In the study, groups of volunteers were asked to identify target sounds from within this noisy background in a series of experiments.

Sundeep Teki, a PhD student from the Wellcome Trust Centre for Neuroimaging at UCL and joint first author of the study, said: “Participants were able to detect complex target sounds from the background noise, even when the target sounds were delivered at a faster rate or there was a loud disruptive noise between them.”

Dr Maria Chait, a senior lecturer at the UCL Ear Institute and joint first author on the study, adds: “Previous models based on simple tones suggest that people differentiate sounds based on differences in frequency, or pitch. Our findings show that time is also an important factor, with sounds grouped as belonging to one object by virtue of being correlated in time.”

Professor Tim Griffiths, Professor of Cognitive Neurology at Newcastle University and lead researcher on the study, said: “Many hearing disorders are characterised by the loss of ability to detect speech in noisy environments. Disorders like this that are caused by problems with how the brain interprets sound information, rather than physical damage to the ear and hearing machinery, remain poorly understood. These findings inform us about a fundamental brain mechanism for detecting sound patterns and identify a process that can go wrong in hearing disorders. We now have an opportunity to create better tests for these types of hearing problems.”

The research was funded by the Wellcome Trust and Deafness Research UK.
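The grouping principle Dr Chait describes, that sound elements belonging to one object rise and fall together in time, can be illustrated with a toy calculation (our simplification, not the study's actual stimuli or analysis): elements whose amplitude envelopes are strongly correlated over time are candidates for being bound into a single auditory object, while uncorrelated elements are not.

```python
# Toy illustration of temporal-coherence grouping (not the study's method):
# two tones sharing a common amplitude modulation correlate strongly and would
# tend to be grouped; an unrelated element does not.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1000)

common = 0.5 * (1.0 + np.sin(2.0 * np.pi * 4.0 * t))   # shared 4 Hz modulation
env_a = common + 0.05 * rng.standard_normal(t.size)    # tone A follows it
env_b = common + 0.05 * rng.standard_normal(t.size)    # tone B follows it
env_c = rng.random(t.size)                              # tone C is unrelated

print(f"A-B envelope correlation: {np.corrcoef(env_a, env_b)[0, 1]:.2f}  (grouped)")
print(f"A-C envelope correlation: {np.corrcoef(env_a, env_c)[0, 1]:.2f}  (segregated)")
```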

Read more

Scientists show proof-of-principle for silencing extra chromosome responsible for Down syndrome

July 17, 2013 — Scientists at UMass Medical School are the first to establish that a naturally occurring X chromosome “off switch” can be rerouted to neutralize the extra chromosome responsible for trisomy 21, also known as Down syndrome, a genetic disorder characterized by cognitive impairment.

The discovery provides the first evidence that the underlying genetic defect responsible for Down syndrome can be suppressed in cells in culture (in vitro). This paves the way for researchers to study the cell pathologies and identify genome-wide pathways implicated in the disorder, a goal that has so far proven elusive. Doing so will improve scientists’ understanding of the basic biology underlying Down syndrome and may one day help establish potential therapeutic targets for future therapies. Details of the study by Jiang et al. were published online in Nature.

“The last decade has seen great advances in efforts to correct single-gene disorders, beginning with cells in vitro and in several cases advancing to in vivo and clinical trials,” said lead author Jeanne B. Lawrence, PhD, professor of cell & developmental biology. “By contrast, genetic correction of hundreds of genes across an entire extra chromosome has remained outside the realm of possibility. Our hope is that for individuals living with Down syndrome, this proof-of-principle opens up multiple exciting new avenues for studying the disorder now, and brings into the realm of consideration research on the concept of ‘chromosome therapy’ in the future.”

Humans are born with 23 pairs of chromosomes, including two sex chromosomes, for a total of 46 in each cell. People with Down syndrome are born with three (rather than two) copies of chromosome 21, and this “trisomy 21” causes cognitive disability, early-onset Alzheimer’s disease, and a greater risk of childhood leukemia, heart defects, and immune and endocrine system dysfunction. Unlike for genetic disorders caused by a single gene, genetic correction of a whole chromosome in trisomic cells has been beyond the realm of possibility, even in cultured cells.

Harnessing the power of the RNA gene called XIST, which is normally responsible for turning off one of the two X chromosomes found in female mammals, UMMS scientists have shown that the extra copy of chromosome 21 responsible for Down syndrome can be silenced in the laboratory using patient-derived stem cells. The natural function of the XIST gene, located on the X chromosome, is to effectively silence one of the two X chromosomes in female cells, making expression of X-linked genes similar to that of men, who have just one X chromosome. …

Read more

Path of plaque buildup in brain shows promise as early biomarker for Alzheimer’s disease

July 15, 2013 — The trajectory of amyloid plaque buildup — clumps of abnormal proteins in the brain linked to Alzheimer’s disease — may serve as a more powerful biomarker for early detection of cognitive decline than the total amount of plaque used to gauge risk, researchers from Penn Medicine’s Department of Radiology suggest in a new study published online July 15 in Neurobiology of Aging.

Amyloid plaque that starts to accumulate relatively early in the temporal lobe, compared to other areas and in particular the frontal lobe, was associated with cognitively declining participants, the study found. “Knowing that certain brain abnormality patterns are associated with cognitive performance could have pivotal importance for the early detection and management of Alzheimer’s,” said senior author Christos Davatzikos, PhD, professor in the Department of Radiology and the Center for Biomedical Image Computing and Analytics at the Perelman School of Medicine at the University of Pennsylvania.

Today, memory decline and Alzheimer’s — which 5.4 million Americans live with — are often assessed with a variety of tools, including physical and biofluid tests and neuroimaging of total amyloid plaque in the brain. Past studies have linked higher amounts of the plaque in dementia-free people with greater risk for developing the disorder. However, it has more recently been shown that nearly a third of people with plaque on their brains never showed signs of cognitive decline, raising questions about its specific role in the disease.

Now, Dr. Davatzikos and his Penn colleagues, in collaboration with a team led by Susan M. Resnick, PhD, Chief of the Laboratory of Behavioral Neuroscience at the National Institute on Aging (NIA), used Pittsburgh compound B (PiB) brain scans from the Baltimore Longitudinal Study of Aging’s Imaging Study and discovered a stronger association between memory decline and spatial patterns of amyloid plaque progression than with total amyloid burden.

“It appears to be more about the spatial pattern of this plaque progression, and not so much about the total amount found in brains. We saw a difference in the spatial distribution of plaques between cognitively declining and stable patients whose cognitive function had been measured over a 12-year period. They had similar amounts of amyloid plaque, just in different spots,” Dr. Davatzikos said. “This is important because it potentially answers questions about the variability seen in clinical research among patients presenting plaque. …

Read more

Combination of smoking and heavy drinking ‘speeds up cognitive decline’

July 11, 2013 — The combination of smoking and heavy drinking speeds up cognitive decline, according to new research published in the British Journal of Psychiatry. Researchers from UCL (University College London) found that smokers who drank alcohol heavily had a 36% faster cognitive decline compared to non-smoking moderate drinkers.

Smoking and heavier alcohol consumption often co-occur, and their combined effect on cognition may be larger than the sum of their individual effects. The research team assessed 6,473 adults (4,635 men and 1,838 women) aged between 45 and 69 years old over a 10-year period. The adults were part of the Whitehall II cohort study of British civil servants. All the participants were asked about their cigarette and alcohol consumption, and their cognitive function (including verbal and mathematical reasoning, short-term verbal memory and verbal fluency) was then assessed three times over 10 years.

The research team found that in current smokers who were also heavy drinkers, cognitive decline was 36% faster than in non-smoking moderate drinkers. This was equivalent to an age effect of 12 years — an additional two years over the 10-year follow-up period. Among smokers, cognitive decline was found to be faster as the number of alcohol units consumed increased.

Lead researcher Dr Gareth Hagger-Johnson said: “Our research shows that cognitive decline was 36% faster in those people who reported both cigarette smoking and drinking alcohol above the recommended limits (14 units per week for women, 21 units per week for men). When we looked at people who were heavy-drinking smokers, we found that for every 10 years that they aged their brains aged the equivalent of 12 years.

“From a public health perspective, the increasing burden associated with cognitive aging could be reduced if lifestyle factors can be modified, and we believe that people should not drink alcohol more heavily in the belief that alcohol is a protective factor against cognitive decline. Current advice is that smokers should stop or cut down, and people should avoid heavy alcohol drinking. Our study suggests that people should also be advised not to combine these two unhealthy behaviours — particularly from mid-life onwards. Healthy behaviours in midlife may prevent cognitive decline into early old age.”

Read more

Women suffer higher rates of decline in aging and Alzheimer’s disease

July 10, 2013 — The rates of regional brain loss and cognitive decline caused by aging and the early stages of Alzheimer’s disease (AD) are higher for women and for people with a key genetic risk factor for AD, say researchers at the University of California, San Diego School of Medicine in a study published online July 4 in the American Journal of Neuroradiology.

APOE ε4 — which codes for a protein involved in binding lipids or fats in the lymphatic and circulatory systems — was already documented as the strongest known genetic risk factor for sporadic AD, the most common form of the disease. But the connection between a person’s sex and AD has been less well recognized, according to the UC San Diego scientists.

“APOE ε4 has been known to lower the age of onset and increase the risk of getting the disease,” said the study’s first author Dominic Holland, PhD, a researcher in the Department of Neurosciences at UC San Diego School of Medicine. “Previously we showed that the lower the age, the higher the rates of decline in AD. So it was important to examine the differential effects of age and APOE ε4 on rates of decline, and to do this across the diagnostic spectrum for multiple clinical measures and brain regions, which had not been done before.”

The scientists evaluated 688 men and women over the age of 65 participating in the Alzheimer’s Disease Neuroimaging Initiative, a longitudinal, multi-institution study to track the progression of AD and its effects upon the structures and functions of the brain. They found that women with mild cognitive impairment (a condition precursory to an AD diagnosis) experienced higher rates of cognitive decline than men, and that all women, regardless of whether or not they showed signs of dementia, experienced greater regional brain loss over time than did men. The magnitude of the sex effect was as large as that of the APOE ε4 allele.

“Assuming larger population-based samples reflect the higher rates of decline for women than men, the question becomes what is so different about women,” said Holland. Hormonal differences or change seems an obvious place to start, but Holland said this is largely unknown territory — at least regarding AD.

“Another important finding of this study is that men and women did not differ in the level of biomarkers of Alzheimer’s disease pathology,” said co-author Linda McEvoy, PhD, an associate professor in the UCSD Department of Radiology. “This suggests that brain volume loss in women may also be caused by factors other than Alzheimer’s disease, or that in women, these pathologies are more toxic. We clearly need more research on how an individual’s sex affects AD pathogenesis.”

Holland acknowledged that the paper likely raises more questions than it answers. “There are many factors that may affect the sex differences we observed, such as whether the women in this study may have had higher rates of diabetes or insulin resistance than the men. …

Read more

Drug improves cognitive function in mouse model of Down syndrome

July 2, 2013 — An existing FDA-approved drug improves cognitive function in a mouse model of Down syndrome, according to a new study by researchers at the Stanford University School of Medicine.

The drug, an asthma medication called formoterol, strengthened nerve connections in the hippocampus, a brain center used for spatial navigation, paying attention and forming new memories, the study said. It also improved contextual learning, in which the brain integrates spatial and sensory information. Both hippocampal function and contextual learning, which are impaired in Down syndrome, depend on the brain having a good supply of the neurotransmitter norepinephrine. This neurotransmitter sends its signal via several types of receptors on the neurons, including a group called beta-2 adrenergic receptors.

“This study provides the initial proof-of-concept that targeting beta-2 adrenergic receptors for treatment of cognitive dysfunction in Down syndrome could be an effective strategy,” said Ahmed Salehi, MD, PhD, the study’s senior author and a clinical associate professor of psychiatry and behavioral sciences. The study will be published online July 2 in Biological Psychiatry.

Down syndrome, which is caused by an extra copy of chromosome 21, results in both physical and cognitive problems. While many of the physical issues, such as vulnerability to heart problems, can now be treated, no treatments exist for poor cognitive function. As a result, children with Down syndrome fall behind their peers in cognitive development. In addition, adults with Down syndrome develop Alzheimer’s-type pathology in their brains by age 40. Down syndrome affects about 400,000 people in the United States and 6 million worldwide.

In prior Down syndrome research, scientists have seen deterioration of the brain center that manufactures norepinephrine in both people with Down syndrome and its mouse model. Earlier work by Salehi’s team found that giving a norepinephrine precursor could improve cognitive function in a mouse model genetically engineered to mimic Down syndrome. The new study refined this work by targeting only one group of receptors that respond to norepinephrine: the beta-2 adrenergic receptors in the brain. The researchers began by giving mice a compound that blocks the action of beta-2 adrenergic receptors outside the brain. …

Read more

Higher education may be protective against multiple sclerosis-associated cognitive deficits

July 2, 2013 — Multiple sclerosis (MS) can lead to severe cognitive impairment as the disease progresses. Researchers in Italy have found that patients with high educational levels show less impairment on a neuropsychological evaluation compared with those with low educational levels. Their results are published in Restorative Neurology and Neuroscience.

MS is a progressive immunologic brain disorder with neuropsychological deficits including selective attention, working memory, executive functioning, information processing speed, and long-term memory. These deficits often impact daily life (the ability to do household tasks, interpersonal relationships, employment, and overall quality of life).

In this study, investigators first assessed the role of cognitive reserve, the brain’s active attempt to focus on how tasks are processed, in compensating for the challenge represented by brain damage. Earlier studies had reported that higher cognitive reserve protects MS subjects from disease-related cognitive inefficiency, but in these studies cognitive reserve was mainly estimated through a vocabulary test. Here, investigators considered educational level and occupational attainment instead of vocabulary. They also evaluated both educational and occupational experience, hypothesizing that an individual’s lifetime occupational attainment could also be considered a good proxy of cognitive reserve (CR), similar to the way in which higher occupational attainment reduces the risk of Alzheimer’s disease.

The second aim of the study was to investigate the possible role of perceived fatigue. Fatigue can have a great negative influence on daily life, so that higher perceived fatigue might result in lower cognitive performance.

Fifty consecutive clinically diagnosed MS patients took part in the study. A control group included 157 clinically healthy subjects with no psychiatric or neurological diagnosis. Individuals in both groups were, on average, comparable in age, education level and gender. The mean age was 40.41 (± 9.67) years, with 12.37 (± 4.42) years of education.

Cognitive performance was evaluated using the Paced Auditory Serial Addition Test (PASAT), in which a series of single-digit numbers is presented and the two most recent digits must be summed. …
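As a concrete illustration of the PASAT rule just described, here is a minimal scoring sketch of our own (the helper name and example values are hypothetical, not code from the study): after each digit from the second onward, the expected answer is the sum of the two most recently presented digits.

```python
# Minimal sketch of PASAT-style scoring (illustrative only, not the clinical instrument):
# responses[i] should equal digits[i] + digits[i + 1].
from typing import List

def pasat_score(digits: List[int], responses: List[int]) -> int:
    """Count how many responses match the sum of the two most recent digits."""
    expected = [a + b for a, b in zip(digits, digits[1:])]
    return sum(1 for exp, got in zip(expected, responses) if exp == got)

# Example: digits 3, 5, 2, 8 -> expected answers 8, 7, 10
print(pasat_score([3, 5, 2, 8], [8, 7, 10]))  # prints 3
```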

Read more

By trying it all, predatory sea slug learns what not to eat

June 6, 2013 — Researchers have found that a type of predatory sea slug that usually isn’t picky when it comes to what it eats has more complex cognitive abilities than previously thought, allowing it to learn the warning cues of dangerous prey and thereby avoid them in the future. The research appears in the Journal of Experimental Biology.

Pleurobranchaea californica is a deep-water species of sea slug found off the west coast of the United States. It has a relatively simple neural circuitry and set of behaviors. It is a generalist feeder, meaning, as University of Illinois professor of molecular and integrative physiology and leader of the study Rhanor Gillette put it, that members of this species “seem to try anything once.”

Another sea slug species, Flabellina iodinea, commonly known as the Spanish shawl because of the orange outgrowths called cerata that cover its purple back, also lives off the west coast. Unlike Pleurobranchaea, however, the Spanish shawl eats only one type of food, an animal called Eudendrium ramosum. According to Gillette, the Spanish shawl digests the Eudendrium’s entire body except for its embryonic, developing stinging cells. The Spanish shawl instead transports these stinging cells to its own cerata where they mature, thereby co-opting its victim’s body parts for its own defense.

The story of Gillette’s Pleurobranchaea-Flabellina research began with a happy accident that involved showing a lab visitor Pleurobranchaea’s penchant for predation. “I had a Pleurobranchaea in a small aquarium that we were about to do a physiological experiment with, and my supplier from Monterey had just sent me these beautiful Spanish shawls,” Gillette said. “So I said to the visitor, ‘Would you like to see Pleurobranchaea eat another animal?'”

Gillette placed the Spanish shawl into the aquarium. The Pleurobranchaea approached, smelled, and bit the purple and orange newcomer. However, the Flabellina’s cerata stung the Pleurobranchaea, the Spanish shawl was rejected and left to do its typical “flamenco dance of escape,” and Pleurobranchaea also managed to escape with an avoidance turn.

Some minutes later, his curiosity piqued, Gillette placed the Spanish shawl back into the aquarium with the Pleurobranchaea. Rather than try to eat the Spanish shawl a second time, the Pleurobranchaea immediately started its avoidance turn. “I had never seen that before! …

Read more

PET finds increased cognitive reserve levels in highly educated pre-Alzheimer’s patients

June 3, 2013 — Highly educated individuals with mild cognitive impairment that later progressed to Alzheimer’s disease cope better with the disease than individuals with a lower level of education in the same situation, according to research published in the June issue of The Journal of Nuclear Medicine. In the study “Metabolic Networks Underlying Cognitive Reserve in Prodromal Alzheimer Disease: A European Alzheimer Disease Consortium Project,” neural reserve and neural compensation were both shown to play a role in determining cognitive reserve, as evidenced by positron emission tomography (PET).

Cognitive reserve refers to the hypothesized capacity of an adult brain to cope with brain damage in order to maintain a relatively preserved functional level. Understanding the brain adaptation mechanisms underlying this process remains a critical question, and researchers of this study sought to investigate the metabolic basis of cognitive reserve in individuals with higher (more than 12 years) and lower (less than 12 years) levels of education who had mild cognitive impairment that progressed to Alzheimer’s disease, also known as prodromal Alzheimer’s disease.

“This study provides new insight into the functional mechanisms that mediate the cognitive reserve phenomenon in the early stages of Alzheimer’s disease,” said Silvia Morbelli, MD, lead author of the study. “A crucial role of the dorso-lateral prefrontal cortex was highlighted by demonstrating that this region is involved in a wide fronto-temporal and limbic functional network in patients with Alzheimer’s disease and high education, but not in poorly educated Alzheimer’s disease patients.”

In the study, 64 patients with prodromal Alzheimer’s disease and 90 control subjects — coming from the brain PET project (chaired by Flavio Nobili, MD, in Genoa, Italy) of the European Alzheimer Disease Consortium — underwent brain 18F-FDG PET scans. Individuals were divided into a subgroup with a low level of education (42 controls and 36 prodromal Alzheimer’s disease patients) and a highly educated subgroup (40 controls and 28 prodromal Alzheimer’s disease patients). Brain metabolism was compared between education-matched groups of patients and controls, and then between highly and poorly educated prodromal Alzheimer’s disease patients.

Higher metabolic activity was shown in the dorso-lateral prefrontal cortex for prodromal Alzheimer’s disease patients. More extended and significant correlations of metabolism between the right dorso-lateral prefrontal cortex and other brain regions were found in highly educated prodromal Alzheimer’s disease patients than in less educated patients, or even in highly educated controls. This result suggests that neural reserve and neural compensation are activated in highly educated prodromal Alzheimer’s disease patients.

The researchers concluded that evaluating the role of metabolic connectivity in cognitive reserve further confirms that adding a comprehensive evaluation of resting 18F-FDG PET brain distribution to standard inspection may allow a more complete understanding of Alzheimer’s disease pathophysiology and may possibly increase the diagnostic sensitivity of 18F-FDG PET.

“This work supports the notion that employing the brain in complex tasks and developing our own education may help in forming stronger ‘defenses’ against cognitive deterioration once Alzheimer knocks at our door,” noted Morbelli. “It’s possible that, in the future, a combined approach evaluating resting metabolic connectivity and cognitive performance can be used on an individual basis to better predict cognitive decline or response to disease-modifying therapy.”

Read more
