A new study correlating brain activity with how people make decisions suggests that when individuals engage in risky behavior, such as drunk driving or unsafe sex, it’s probably not because their brains’ desire systems are too active, but because their self-control systems are not active enough. This might have implications for how health experts treat mental illness and addiction, and for how the legal system assesses a criminal’s likelihood of committing another crime.

Researchers from The University of Texas at Austin, UCLA and elsewhere analyzed data from 108 subjects who sat in a magnetic resonance imaging (MRI) scanner — a machine that allows researchers to pinpoint brain activity in vivid, three-dimensional images — while playing a video game that simulates risk-taking.

The researchers used specialized software to look for patterns of activity across the whole brain that preceded a risky or a safe choice in one set of subjects. They then asked the software to predict what other subjects would choose during the game based solely on their brain activity. The software accurately predicted people’s choices 71 percent of the time.

“These patterns are reliable enough that not only can we predict what will happen in an additional test on the same person, but on people we haven’t seen before,” said Russell Poldrack, director of UT Austin’s Imaging Research Center and professor of psychology and neuroscience.

When the researchers trained their software on much smaller regions of the brain, they found that analyzing only the regions typically involved in executive functions such as control, working memory and attention was enough to predict a person’s future choices.
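The train-on-some-subjects, test-on-others procedure described here is standard cross-subject decoding. As an illustration only — the study used whole-brain fMRI patterns and specialized analysis software, not the toy classifier below — here is a minimal nearest-centroid sketch on synthetic two-feature “activity patterns”; all data and numbers are invented:

```python
import random

def centroid(rows):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def train(patterns, labels):
    """One mean activity pattern per choice label ('risky' / 'safe')."""
    return {lab: centroid([p for p, l in zip(patterns, labels) if l == lab])
            for lab in set(labels)}

def predict(model, pattern):
    """Pick the label whose mean pattern is nearest (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(model, key=lambda lab: dist(model[lab], pattern))

# Synthetic two-feature "activity patterns": risky trials are shifted
# along the first feature, safe trials the other way.
random.seed(0)
def trial(label):
    shift = 1.0 if label == "risky" else -1.0
    return [random.gauss(shift, 1.0), random.gauss(0.0, 1.0)]

train_labels = ["risky", "safe"] * 50   # trials from "seen" subjects
train_patterns = [trial(l) for l in train_labels]
test_labels = ["risky", "safe"] * 20    # trials from held-out subjects
test_patterns = [trial(l) for l in test_labels]

model = train(train_patterns, train_labels)
hits = sum(predict(model, p) == l
           for p, l in zip(test_patterns, test_labels))
accuracy = hits / len(test_labels)      # well above chance on this data
```

With cleanly separated synthetic data the sketch scores well above the 50 percent chance level, which is all the analogy needs; the published 71 percent figure came from real fMRI patterns and a far more careful analysis.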
Therefore, the researchers concluded, when we make risky choices, it is primarily because our control systems fail to stop us.

“We all have these desires, but whether we act on them is a function of control,” said Sarah Helfinstein, a postdoctoral researcher at UT Austin and lead author of the study, which appears online this week in the journal Proceedings of the National Academy of Sciences.

Helfinstein said that additional research could focus on how external factors, such as peer pressure, lack of sleep or hunger, weaken the activity of our brains’ control systems when we contemplate risky decisions. “If we can figure out the factors in the world that influence the brain, we can draw conclusions about what actions are best at helping people resist risks,” she said.

To simulate features of real-world risk-taking, the researchers used a video game called the Balloon Analogue Risk Task (BART), which past research has shown correlates well with self-reported risk-taking such as drug and alcohol use, smoking, gambling, driving without a seatbelt, stealing and engaging in unprotected sex.

While playing the BART, the subject sees a balloon on the screen and is asked to make either a risky choice (inflate the balloon a little and earn a few cents) or a safe choice (stop the round and “cash out,” keeping whatever money was earned up to that point). Sometimes inflating the balloon causes it to burst, and the player loses all the cash earned from that round. After each successful inflation, the game continues with the chance of earning another standard-sized reward or losing an increasingly large amount.
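The round structure just described — a fixed reward per pump, against a growing pot that a burst wipes out — can be sketched as a toy simulation. This is not the actual BART implementation; the burst probability and payoff size below are invented for illustration:

```python
import random

def play_bart_round(max_pumps, burst_prob=1/16, cents_per_pump=5, rng=random):
    """One BART balloon. Each pump risks a burst (losing the round's
    accumulated pot) in exchange for a small fixed reward; stopping
    early 'cashes out' whatever has been earned so far.

    max_pumps -- how many pumps the player attempts before cashing out
    Returns the cents kept from this round (0 if the balloon burst).
    """
    earned = 0
    for _ in range(max_pumps):
        if rng.random() < burst_prob:
            return 0              # burst: the whole round's pot is lost
        earned += cents_per_pump  # the pot (and the potential loss) grows
    return earned                 # cashed out safely

random.seed(1)
# Compare a cautious strategy (3 pumps) with a risky one (30 pumps)
# over many rounds.
cautious_total = sum(play_bart_round(3) for _ in range(1000))
risky_total = sum(play_bart_round(30) for _ in range(1000))
```

The interesting property, mirrored in real-world risks, is that each additional pump stakes an ever-larger accumulated pot on the same small per-step chance of disaster.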
Many health-relevant risky decisions share this same structure, such as deciding how many alcoholic beverages to drink before driving home or how much one can experiment with drugs or cigarettes before developing an addiction.

The data for this study came from the Consortium for Neuropsychiatric Phenomics at UCLA, which recruited adults from the Los Angeles area so researchers could examine differences in response inhibition and working memory between healthy adults and patients diagnosed with bipolar disorder, schizophrenia, or adult attention deficit hyperactivity disorder (ADHD). Only data collected from healthy participants were included in the present analyses.

Other researchers on the study include Tom Schonberg and Jeanette A. Mumford at The University of Texas at Austin; Katherine H. Karlsgodt at Zucker Hillside Hospital and the Feinstein Institute for Medical Research; Eliza Congdon, Fred W. …
New research from Case Western Reserve University and University of Toronto neuroscientists finds that the brains of autistic children generate more information at rest — a 42% increase on average. The study offers a scientific explanation for the most typical characteristic of autism — withdrawal into one’s own inner world. The excess production of information may explain a child’s detachment from his or her environment.

Published at the end of December in Frontiers in Neuroinformatics, this study is a follow-up to the authors’ prior finding that brain connections are different in autistic children. This paper determined that those differences account for the increased complexity within their brains.

“Our results suggest that autistic children are not interested in social interactions because their brains generate more information at rest, which we interpret as more introspection in line with early descriptions of the disorder,” said Roberto Fernández Galán, PhD, senior author and associate professor of neurosciences at Case Western Reserve School of Medicine.

The authors quantified information as engineers normally do, but instead of applying it to signals in electronic devices, they applied it to brain activity recorded with magnetoencephalography (MEG). They showed that autistic children’s brains at rest generate more information than those of non-autistic children. This may explain their lack of interest in external stimuli, including interactions with other people.

The researchers also quantified interactions between brain regions, i.e., the brain’s functional connectivity, and determined the inputs to the brain in the resting state, allowing them to interpret the children’s level of introspection.

“This is a novel interpretation because it is a different attempt to understand the children’s cognition by analyzing their brain activity,” said José L.
Pérez Velázquez, PhD, first author and professor of neuroscience at the University of Toronto Institute of Medical Science and Department of Pediatrics, Brain and Behavior Center. “Measuring cognitive processes is not trivial; yet, our findings indicate that this can be done to some extent with well-established mathematical tools from physics and engineering.”

This study provides quantitative support for the relatively new “Intense World Theory” of autism proposed by neuroscientists Henry and Kamila Markram of the Brain Mind Institute in Switzerland, which describes the disorder as the result of hyper-functioning neural circuitry, leading to a state of over-arousal. More generally, the work of Galán and Pérez Velázquez is an initial step in investigating how information generation in the brain relates to cognitive and psychological traits, and will begin to frame neurophysiological data in psychological terms. The team now aims to apply a similar approach to patients with schizophrenia.

Story Source: The above story is based on materials provided by Case Western Reserve University.
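One common engineering way to quantify the “information” a signal carries is Shannon entropy. The sketch below is only illustrative — the study’s actual MEG-based complexity measure is not specified here, and the binning scheme and synthetic signals are invented — but it shows the basic idea that an unpredictable signal carries more bits per sample than a predictable one:

```python
import random
from collections import Counter
from math import log2

def shannon_entropy(samples, n_bins=8):
    """Entropy (bits per sample) of a signal after amplitude binning.
    A flat, unpredictable amplitude distribution scores high; a
    constant signal scores zero."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / n_bins or 1.0          # avoid /0 for flat signals
    bins = [min(int((s - lo) / width), n_bins - 1) for s in samples]
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in Counter(bins).values())

random.seed(2)
noisy = [random.uniform(0, 1) for _ in range(4000)]   # unpredictable
steady = [0.5] * 4000                                 # perfectly predictable
```

With 8 bins, the uniform-noise signal scores close to the 3-bit maximum while the constant signal scores zero; “generating more information at rest” is a statement of this general kind, made with far more sophisticated estimators.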
Sep. 12, 2013 — Alzheimer’s disease is thought to be caused by the buildup of abnormal, thread-like protein deposits in the brain, but little is known about the molecular structures of these so-called beta-amyloid fibrils. A study published by Cell Press September 12th in the journal Cell has revealed that distinct molecular structures of beta-amyloid fibrils may predominate in the brains of Alzheimer’s patients with different clinical histories and degrees of brain damage. The findings pave the way for new patient-specific strategies to improve diagnosis and treatment of this common and debilitating disease.

“This work represents the first detailed characterization of the molecular structures of beta-amyloid fibrils that develop in the brains of patients with Alzheimer’s disease,” says senior study author Robert Tycko of the National Institutes of Health. “This detailed structural model may be used to guide the development of chemical compounds that bind to these fibrils with high specificity for purposes of diagnostic imaging, as well as compounds that inhibit fibril formation for purposes of prevention or therapy.”

Tycko and his team had previously noticed that beta-amyloid fibrils grown in a dish have different molecular structures depending on the specific growth conditions. Based on this observation, they suspected that fibrils found in the brains of patients with Alzheimer’s disease are also variable, and that these structural variations might relate to each patient’s clinical history. But it has not been possible to directly study the structures of fibrils found in patients because of their low abundance in the brain.

To overcome this hurdle, Tycko and his collaborators developed a new experimental protocol. They extracted beta-amyloid fibril fragments from the brain tissue of two patients with different clinical histories and degrees of brain damage and then used these fragments to grow a large quantity of fibrils in a dish.
They found that a single fibril structure prevailed in the brain tissue of each patient, but the molecular structures differed between the two patients.

“This may mean that fibrils in a given patient appear first at a single site in the brain, then spread to other locations while retaining the identical molecular structure,” Tycko says. “Our study also shows that certain fibril structures may be more likely than others to cause Alzheimer’s disease, highlighting the importance of developing imaging agents that target specific fibril structures to improve the reliability and specificity of diagnosis.”

Story Source: The above story is based on materials provided by Cell Press, via EurekAlert!, a service of AAAS.
Aug. 21, 2013 — Metals such as iron, copper, and zinc are important for many biological processes. In recent years, studies have shown that these nutritionally essential metals are elevated in human Alzheimer’s disease (AD) brains and in some animal models of AD. Scientists are now exploring whether these metals cause the neurodegeneration seen in AD or are indicative of other ongoing pathologic processes.

In a new study, investigators used synchrotron X-ray fluorescence microscopy to image metal ions in the brain, focusing on the amyloid plaques that are the hallmark of AD. They found that, in two AD mouse models that exhibit neurodegeneration, the plaques contained about 25% more copper than in an AD mouse model that shows little neurodegeneration. Looking at other metals, they found that none of the mouse models had significant increases in iron, and only very small increases in zinc. Metal content was not related to the age of the plaque. The study is reported in the current issue of Biomedical Spectroscopy and Imaging.

“Since excess copper should not be ‘free’ in the brain to bind to the plaques, these data suggest that the cellular control of copper is altered in AD, which may lead to toxic reactions between free copper ions and neurons,” comments lead investigator Lisa M. Miller, PhD, a biophysical chemist in the Photon Sciences Directorate at Brookhaven National Laboratory. In previous work, Dr. …
Aug. 7, 2013 — Drinking two cups of hot chocolate a day may help older people keep their brains healthy and their thinking skills sharp, according to a study published in the August 7, 2013, online issue of Neurology®, the medical journal of the American Academy of Neurology.

The study involved 60 people with an average age of 73 who did not have dementia. The participants drank two cups of hot cocoa per day for 30 days and did not consume any other chocolate during the study. They were given tests of memory and thinking skills. They also had ultrasound tests to measure the amount of blood flow to the brain during the tests.

“We’re learning more about blood flow in the brain and its effect on thinking skills,” said study author Farzaneh A. Sorond, MD, PhD, of Harvard Medical School in Boston and a member of the American Academy of Neurology. “As different areas of the brain need more energy to complete their tasks, they also need greater blood flow. This relationship, called neurovascular coupling, may play an important role in diseases such as Alzheimer’s.”

Of the 60 participants, 18 had impaired blood flow at the start of the study. Those people had an 8.3 percent improvement in blood flow to the working areas of the brain by the end of the study, while there was no improvement for those who started out with regular blood flow. The people with impaired blood flow also improved their times on a test of working memory, with scores dropping from 167 seconds at the beginning of the study to 116 seconds at the end. There was no change in times for people with regular blood flow.
Aug. 7, 2013 — Though one might think the brains of people who develop Alzheimer’s disease (AD) possess building blocks of the disease absent in healthy brains, for most sufferers this is not true. Every human brain contains the ingredients necessary to spark AD, but while an estimated 5 million Americans have AD — a number projected to triple by 2050 — the vast majority of people do not and will not develop the devastating neurological condition.

For researchers like Subhojit Roy, MD, PhD, associate professor in the Departments of Pathology and Neurosciences at the University of California, San Diego School of Medicine, these facts produce a singular question: Why don’t we all get Alzheimer’s disease?

In a paper published in the August 7 issue of the journal Neuron, Roy and colleagues offer an explanation — a trick of nature that, in most people, maintains critical separation between a protein and an enzyme that, when combined, trigger the progressive cell degeneration and death characteristic of AD.

“It’s like physically separating gunpowder and match so that the inevitable explosion is avoided,” said principal investigator Roy, a cell biologist and neuropathologist in the Shiley-Marcos Alzheimer’s Disease Research Center at UC San Diego. “Knowing how the gunpowder and match are separated may give us new insights into possibly stopping the disease.”

The severity of AD is measured in the loss of functioning neurons. In pathological terms, there are two tell-tale signs of AD: clumps or “plaques” of a protein called beta-amyloid that accumulate outside neurons, and threads or “tangles” of another protein, called tau, found inside neurons. Most neuroscientists believe AD is caused by accumulating assemblies of beta-amyloid protein triggering a sequence of events that leads to impaired cell function and death.
This so-called “amyloid cascade hypothesis” puts beta-amyloid protein at the center of AD pathology. Creating beta-amyloid requires the convergence of a protein called amyloid precursor protein (APP) and an enzyme that cleaves APP into smaller toxic fragments, called beta-secretase or BACE.

“Both of these proteins are highly expressed in the brain,” said Roy, “and if they were allowed to combine continuously, we would all have AD.”

But that doesn’t happen. Using cultured hippocampal neurons and tissue from human and mouse brains, Roy — along with first author Utpal Das, a postdoctoral fellow in Roy’s lab, and colleagues — discovered that healthy brain cells largely segregate APP and BACE-1 into distinct compartments as soon as they are manufactured, ensuring the two proteins do not have much contact with each other.

“Nature seems to have come up with an interesting trick to separate co-conspirators,” said Roy.

The scientists also found that conditions promoting greater production of beta-amyloid protein boost the convergence of APP and BACE. Specifically, an increase in neuronal electrical activity — known to increase the production of beta-amyloid — also led to an increase in APP-BACE convergence. Post-mortem examinations of AD patients revealed increased physical proximity of the proteins as well, adding support to the pathophysiological significance of this phenomenon in human disease.

Das said the findings are fundamentally important because they elucidate some of the earliest molecular events triggering AD and show how a healthy brain naturally avoids them.
July 17, 2013 — Honey bees (Apis mellifera) infected with the parasitic mite Varroa destructor or the microsporidian Nosema ceranae show changes in the chemical profile of their cuticle and in their brains, finds research in BioMed Central’s open access journal BMC Ecology. Despite this, parasitized bees were not expelled from the hive, which, the authors say, supports the hypothesis that stressed bees leave the hive altruistically to prevent the spread of infection.

This study from INRA (French National Institute for Agricultural Research) investigated the effect of parasitization on honey bees living in hives at Avignon. Individual bees were infected with either the ectoparasite Varroa, which lives on the bees, or the endoparasite Nosema, which invades their bodies, and were then reintroduced to the hive. After a few days, the effect of infection on the bees and their behavior was monitored.

Parasitization changed the levels of active genes in the brains of infected bees. Varroa altered the activity of 455 genes, including genes involved in GABA and serotonin signaling, while Nosema affected 57. Twenty genes were common to the two infections, and several of the up-regulated genes are involved in oxidative stress, neural function and foraging behavior. Parasitized bees also tended to have a higher viral load, adding to their disease burden, even if they did not show physical symptoms.

Hydrocarbons on the cuticle of bees provide a ‘family’ scent allowing bees from the same hive to recognize each other.
The levels of these chemicals were altered by infection with either the endo- or ecto-parasite; nevertheless, infected bees were treated as normal by other bees — social interactions including antennal contact, grooming, feeding and vibration continued — and they were not expelled from the hive.

Dr Cynthia McDonnell, who led this study, commented, “Parasitized bees tend to leave the colony earlier to perform foraging activity, which could lead to a significant depopulation of the colony. However, very few studies have analyzed the impact of parasites on bee phenotypes, e.g. brain and behavior. …”
July 8, 2013 — A hallmark of neurodegenerative diseases such as Alzheimer’s, Parkinson’s and Huntington’s is that by the time symptoms appear, significant brain damage has already occurred — and currently there are no treatments that can reverse it. A team of SRI International researchers has demonstrated that measurements of electrical activity in the brains of mouse models of Huntington’s disease could indicate the presence of disease before the onset of major symptoms.

The findings, “Longitudinal Analysis of the Electroencephalogram and Sleep Phenotype in the R6/2 Mouse Model of Huntington’s Disease,” are published in the July 2013 issue of the neurology journal Brain, published by Oxford University Press.

SRI researchers led by Stephen Morairty, Ph.D., a director in the Center for Neuroscience in SRI Biosciences, and Simon Fisher, Ph.D., a postdoctoral fellow at SRI, used electroencephalography (EEG), a noninvasive method commonly used in humans, to measure changes in neuronal electrical activity in a mouse model of Huntington’s disease. Identification of significant changes in the EEG prior to the onset of symptoms would add to evidence that the EEG can be used to identify biomarkers that screen for the presence of a neurodegenerative disease. Further research on such potential biomarkers might one day enable the tracking of disease progression in clinical trials and could facilitate drug development.

“EEG signals are composed of different frequency bands such as delta, theta and gamma, much as light is composed of different frequencies that result in the colors we call red, green and blue,” explained Thomas Kilduff, Ph.D., senior director, Center for Neuroscience, SRI Biosciences. “Our research identified abnormalities in all three of these bands in Huntington’s disease mice.
Importantly, the activity in the theta and gamma bands slowed as the disease progressed, indicating that we may be tracking the underlying disease process.”

EEG has shown promise as an indicator of underlying brain dysfunction in neurodegenerative diseases, which otherwise progresses undetected until symptoms appear. Until now, most investigations of EEG in patients with neurodegenerative diseases, and in animal models of those diseases, have shown significant changes in EEG patterns only after disease symptoms occurred.

“Our breakthrough is that we have found an EEG signature that appears to be a biomarker for the presence of disease in this mouse model of Huntington’s disease, one that can identify early changes in the brain prior to the onset of behavioral symptoms,” said Morairty, the paper’s senior author. “While the current study focused on Huntington’s disease, many neurodegenerative diseases produce changes in the EEG that are associated with the degenerative process. This is the first step in being able to use the EEG to predict both the presence and progression of neurodegenerative diseases.”

Although previous studies have shown there are distinct and extensive changes in EEG patterns in Alzheimer’s and Huntington’s disease patients, researchers are looking for changes that may occur decades before disease onset. Huntington’s disease is an inherited disorder that causes certain nerve cells in the brain to die, resulting in motor dysfunction, cognitive decline and psychiatric symptoms. It is the only major neurodegenerative disease whose cause is known with certainty: a genetic mutation that produces a change in a protein that is toxic to neurons.
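The delta, theta and gamma bands Kilduff describes are simply frequency ranges, and the power in each band can be read off a Fourier transform of the recording. The sketch below is illustrative only — a slow, dependency-free DFT on a synthetic signal, with conventional band edges that are an assumption here, not SRI’s actual analysis pipeline:

```python
from math import sin, cos, pi

def band_power(signal, fs, f_lo, f_hi):
    """Total spectral power in [f_lo, f_hi) Hz via a direct DFT.
    O(N^2) and slow, but dependency-free for a sketch."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq < f_hi:
            re = sum(signal[t] * cos(2 * pi * k * t / n) for t in range(n))
            im = sum(signal[t] * sin(2 * pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

fs = 128                                  # sampling rate (Hz)
t = [i / fs for i in range(2 * fs)]       # 2 seconds of samples
# Synthetic "EEG": a strong 6 Hz theta rhythm plus a weaker 40 Hz gamma rhythm.
eeg = [1.0 * sin(2 * pi * 6 * x) + 0.3 * sin(2 * pi * 40 * x) for x in t]

theta_power = band_power(eeg, fs, 4, 8)    # theta band, ~4-8 Hz
gamma_power = band_power(eeg, fs, 30, 80)  # gamma band, ~30-80 Hz
```

On this synthetic trace the theta band dominates, as constructed; a biomarker of the kind described would track how such band powers shift as disease progresses.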
June 27, 2013 — A study from Karolinska Institutet in Sweden shows that our imagination may affect how we experience the world more than we perhaps think. What we imagine hearing or seeing “in our head” can change our actual perception. The study, which is published in the scientific journal Current Biology, sheds new light on a classic question in psychology and neuroscience — how our brains combine information from the different senses.

“We often think about the things we imagine and the things we perceive as being clearly dissociable,” says Christopher Berger, doctoral student at the Department of Neuroscience and lead author of the study. “However, what this study shows is that our imagination of a sound or a shape changes how we perceive the world around us in the same way actually hearing that sound or seeing that shape does. Specifically, we found that what we imagine hearing can change what we actually see, and what we imagine seeing can change what we actually hear.”

The study consists of a series of experiments that make use of illusions in which sensory information from one sense changes or distorts one’s perception of another sense. Ninety-six healthy volunteers participated in total. In the first experiment, participants experienced the illusion that two passing objects collided rather than passed by one another when they imagined a sound at the moment the two objects met. In the second experiment, the participants’ spatial perception of a sound was biased towards a location where they imagined seeing the brief appearance of a white circle. In the third experiment, the participants’ perception of what a person was saying was changed by their imagination of a particular sound.

According to the scientists, the results of the current study may be useful in understanding the mechanisms by which the brain fails to distinguish between thought and reality in certain psychiatric disorders such as schizophrenia.
Another area of use could be research on brain-computer interfaces, where paralyzed individuals’ imagination is used to control virtual and artificial devices.

“This is the first set of experiments to definitively establish that the sensory signals generated by one’s imagination are strong enough to change one’s real-world perception of a different sensory modality,” says Professor Henrik Ehrsson, the principal investigator behind the study.

Story Source: The above story is reprinted from materials provided by Karolinska Institutet. Note: Materials may be edited for content and length.
June 19, 2013 — For years, Alzheimer’s researchers have focused on two proteins that accumulate in the brains of people with Alzheimer’s and may contribute to the disease: plaques made up of the protein amyloid-beta, and tangles of another protein, called tau. Now, for the first time, an Alzheimer’s researcher has looked closely not at the two proteins independently but at their interaction with each other — in post-mortem brain tissue from Alzheimer’s patients and in the brains of mice modeling Alzheimer’s disease. The research found that the interaction between the two proteins might be the key: as these interactions increased, the progression of Alzheimer’s disease worsened.

The research, by Hemachandra Reddy, Ph.D., an associate scientist at the Oregon National Primate Research Center at Oregon Health & Science University, is detailed in the June 2013 edition of the Journal of Alzheimer’s Disease. Reddy’s paper suggests that when the interaction between phosphorylated tau and amyloid-beta — particularly in its toxic form — happens at brain synapses, it can damage those synapses, and that damage can lead to cognitive decline in Alzheimer’s patients.

“This complex formation between amyloid-beta and tau — it is actually blocking the neural communication,” Reddy said. “If we could somehow find a molecule that could inhibit the binding of these two proteins at the synapses, that very well might be the cure to Alzheimer’s disease.”

To conduct the research, Reddy and his team studied three different kinds of mice that had been bred to have some of the brain characteristics of Alzheimer’s disease, including amyloid-beta and phosphorylated tau in their brains.
Reddy also analyzed post-mortem brain tissue from people who had Alzheimer’s disease. Using multiple antibodies that recognize amyloid-beta and phosphorylated tau, Reddy and Maria Manczak, Ph.D., a research associate in Reddy’s laboratory, looked specifically for evidence of amyloid-beta and phosphorylated tau interactions. They found amyloid-beta/tau complexes in the human Alzheimer’s brain tissue and in the Alzheimer’s disease mouse brains. The Reddy team also found far more of those amyloid-beta/tau complexes in brains where Alzheimer’s disease had progressed the most. Reddy found very little or no evidence of the same interaction in the “control” subjects — mice that did not have the Alzheimer’s traits and brain tissue from people who did not have Alzheimer’s.

“So much Alzheimer’s research has been done to look at amyloid-beta and tau,” Reddy said. “But ours is the first paper to strongly demonstrate that yes, there is an amyloid-beta/phosphorylated tau interaction. And that interaction might be causing the synaptic damage and cognitive decline in persons with Alzheimer’s disease.”

Reddy and his lab are already working on the next crucial questions. One is to define the binding site or sites and exactly where within the neuron the interaction of amyloid-beta and tau first occurs.
May 23, 2013 — A brief visual task can predict IQ, according to a new study. This surprisingly simple exercise measures the brain’s unconscious ability to filter out visual movement. The study shows that individuals whose brains are better at automatically suppressing background motion perform better on standard measures of intelligence. The test is the first purely sensory assessment to be strongly correlated with IQ and may provide a non-verbal and culturally unbiased tool for scientists seeking to understand neural processes associated with general intelligence.
“Because intelligence is such a broad construct, you can’t really track it back to one part of the brain,” says Duje Tadin, a senior author on the study and an assistant professor of brain and cognitive sciences at the University of Rochester. “But since this task is so simple and so closely linked to IQ, it may give us clues about what makes a brain more efficient, and, consequently, more intelligent.”
The unexpected link between IQ and motion filtering was reported online in the Cell Press journal Current Biology on May 23 by a research team led by Tadin and Michael Melnick, a doctoral candidate in brain and cognitive sciences at the University of Rochester.
In the study, individuals watched brief video clips of black and white bars moving across a computer screen. Their sole task was to identify which direction the bars drifted: to the right or to the left. The bars were presented in three sizes, with the smallest version restricted to the central circle where human motion perception is known to be optimal, an area roughly the width of the thumb when the hand is extended. Participants also took a standardized intelligence test.
As expected, people with higher IQ scores were faster at catching the movement of the bars when observing the smallest image. The results support prior research showing that individuals with higher IQs make simple perceptual judgments more swiftly and have faster reflexes. “Being ‘quick witted’ and ‘quick on the draw’ generally go hand in hand,” says Melnick.
But the tables turned when participants were presented with the larger images. The higher a person’s IQ, the slower they were at detecting movement. “From previous research, we expected that all participants would be worse at detecting the movement of large images, but high IQ individuals were much, much worse,” says Melnick. That counter-intuitive inability to perceive large moving images is a perceptual marker for the brain’s ability to suppress background motion, the authors explain. In most scenarios, background movement is less important than small moving objects in the foreground. Think about driving in a car, walking down a hall, or even just moving your eyes across the room. The background is constantly in motion.
The key discovery in this study is how closely this natural filtering ability is linked to IQ. The first experiment found a 64 percent correlation between motion suppression and IQ scores, a much stronger relationship than other sensory measures to date. For example, research on the relationship between intelligence and color discrimination, sensitivity to pitch, and reaction times has found only a 20 to 40 percent correlation. “In our first experiment, the effect for motion was so strong,” recalls Tadin, “that I really thought this was a fluke.”
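The “percent correlation” figures here presumably refer to Pearson correlation coefficients (r ≈ 0.64, and later 0.71). Computing r from paired scores is straightforward; the suppression-index and IQ values below are invented purely to illustrate, not taken from the study:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired scores: a motion-suppression index vs. IQ.
suppression = [0.2, 0.5, 0.9, 1.1, 1.4, 1.8, 2.0, 2.3]
iq          = [ 92,  98, 103, 101, 112, 118, 115, 126]

r = pearson_r(suppression, iq)   # close to +1 for this made-up data
```

An r of 0.64 between a 2-minute sensory task and full-scale IQ is unusually high for psychophysics, which is why the authors initially suspected a fluke.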
So the group tried to disprove the findings from the initial 12-participant study, conducted while Tadin was at Vanderbilt University working with co-author Sohee Park, a professor of psychology. They reran the experiment at the University of Rochester on a new cohort of 53 subjects, administering the full IQ test instead of an abbreviated version. The results were even stronger: the correlation rose to 71 percent. The authors also tested for other possible explanations for their findings.
For example, did the surprising link to IQ simply reflect a person’s willful decision to focus on small moving images? To rule out the effect of attention, the second round of experiments randomly ordered the different image sizes and tested other types of large images that have been shown not to elicit suppression. High IQ individuals continued to be quicker on all tasks, except the ones that isolated motion suppression. The authors concluded that high IQ is associated with automatic filtering of background motion.
“We know from prior research which parts of the brain are involved in visual suppression of background motion. This new link to intelligence provides a good target for looking at what is different about the neural processing, what’s different about the neurochemistry, what’s different about the neurotransmitters of people with different IQs,” says Tadin.
The relationship between IQ and motion suppression points to the fundamental cognitive processes that underlie intelligence, the authors write. The brain is bombarded by an overwhelming amount of sensory information, and its efficiency is built not only on how quickly our neural networks process these signals, but also on how good they are at suppressing less meaningful information. “Rapid processing is of little utility unless it is restricted to the most relevant information,” the authors conclude.
The researchers point out that this vision test could remove some of the limitations associated with standard IQ tests, which have been criticized for cultural bias. “Because the test is simple and non-verbal, it will also help researchers better understand neural processing in individuals with intellectual and developmental disabilities,” says co-author Loisa Bennetto, an associate professor of psychology at the University of Rochester.
Bryan Harrison, a doctoral candidate in clinical and social psychology at the University of Rochester, is also an author on the paper. The research was supported by grants from the National Institutes of Health.
According to a study, people with severe depression feel as if they are living in a separate time zone from everyone else. Why? Because the biological clock in their brains has been disrupted. In other words, the circadian …