Study finds brain system for emotional self-control
Different brain areas are activated when we choose to suppress an emotion than when we are instructed to inhibit one, according to a new study from the UCL Institute of Cognitive Neuroscience and Ghent University.
In this study, published in Brain Structure and Function, the researchers scanned the brains of healthy participants and found that a key brain system was activated when people chose for themselves to suppress an emotion.
“This result shows that emotional self-control involves a quite different brain system from simply being told how to respond emotionally,” said lead author Dr Simone Kuhn (Ghent University).
In most previous studies, participants were instructed to feel or inhibit an emotional response. However, in everyday life we are rarely told to suppress our emotions, and usually have to decide ourselves whether to feel or control our emotions.
In this new study the researchers showed fifteen healthy women unpleasant or frightening pictures. The participants were given the choice either to feel the emotion elicited by the image or to inhibit it by distancing themselves through an act of self-control.
The researchers used functional magnetic resonance imaging (fMRI) to scan the brains of the participants. They compared this brain activity to another experiment where the participants were instructed to feel or inhibit their emotions, rather than choose for themselves.
Different parts of the brain were activated in the two situations. When participants decided for themselves to inhibit negative emotions, the scientists found activation in the dorso-medial prefrontal area of the brain. They had previously linked this brain area to deciding to inhibit movement.
In contrast, when participants were instructed by the experimenter to inhibit the emotion, a second, more lateral area was activated.
“We think controlling one’s emotions and controlling one’s behaviour involve overlapping mechanisms,” said Dr Kuhn.
“We should distinguish between voluntary and instructed control of emotions, in the same way as we can distinguish between making up our own mind about what to do, versus following instructions.”
Regulating emotions is part of our daily life, and is important for our mental health. For example, many people have to conquer fear of speaking in public, while some professionals such as health-care workers and firemen have to maintain an emotional distance from unpleasant or distressing scenes that occur in their jobs.
Professor Patrick Haggard (UCL Institute of Cognitive Neuroscience), co-author of the paper, said the brain mechanism identified in this study could be a potential target for therapies.
“The ability to manage one’s own emotions is affected in many mental health conditions, so identifying this mechanism opens interesting possibilities for future research.
“Most studies of emotion processing in the brain simply assume that people passively receive emotional stimuli, and automatically feel the corresponding emotion. In contrast, the area we have identified may contribute to some individuals’ ability to rise above particular emotional situations.
“This kind of self-control mechanism may have positive aspects, for example making people less vulnerable to excessive emotion. But altered function of this brain area could also potentially lead to difficulties in responding appropriately to emotional situations.”
Research determines how the brain computes tool use
With a goal of helping patients with spinal cord injuries, Jason Gallivan and a team of researchers at Queen’s University’s Department of Psychology and Centre for Neuroscience Studies are probing deep into the human brain to learn how it manages basic daily tasks.
The team’s most recent research, in collaboration with a group at Western University, investigated how the human brain supports tool use. The researchers were especially interested in determining the extent to which brain regions involved in planning actions with the hand alone would also be involved in planning actions with a tool. They found that although some brain regions were involved in planning actions with either the hand or tool alone, the vast majority were involved in planning both hand- and tool-related movements. In a subset of these latter brain areas the researchers further determined that the tool was in fact being represented as an extension of the hand.
“Tool use represents a defining characteristic of high-level cognition and behaviour across the animal kingdom, but studying how the brain – and the human brain in particular – supports tool use remains a significant challenge for neuroscientists,” says Dr. Gallivan. “This work is a considerable step forward in our understanding of how tool-related actions are planned in humans.”
Over the course of one year, human participants had their brain activity scanned using functional magnetic resonance imaging (fMRI) as they reached towards and grasped objects using either their hand or a set of plastic tongs. The tongs had been designed so they opened whenever participants closed their grip, requiring the participants to perform a different set of movements when using the tongs than when using the hand alone.
The team found that, mere seconds before the action began, the neural activity in some brain regions was predictive of the type of action to be performed upon the object, regardless of whether the hand or tool was to be used (and despite the different movements being required). By contrast, the predictive neural activity in other brain regions was shown to represent hand and tool actions separately. Specifically, some brain regions only coded actions with the hand whereas others only coded actions with the tool.
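Decoding of this kind is typically done by training a classifier on pre-movement voxel patterns and testing it on held-out trials. The sketch below is purely illustrative: it uses synthetic data standing in for real fMRI patterns and a simple leave-one-out nearest-centroid decoder, not the study's actual analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for pre-movement fMRI patterns: 40 trials x 50 voxels,
# two planned action types (0 = grasp, 1 = reach), with a small
# class-dependent signal added to Gaussian noise.
n_trials, n_voxels = 40, 50
labels = np.repeat([0, 1], n_trials // 2)
signal = np.where(labels[:, None] == 0, 0.5, -0.5)
patterns = signal + rng.normal(size=(n_trials, n_voxels))

def loo_nearest_centroid(X, y):
    """Leave-one-out decoding accuracy with a nearest-centroid classifier."""
    correct = 0
    for i in range(len(y)):
        train = np.arange(len(y)) != i
        centroids = [X[train & (y == c)].mean(axis=0) for c in (0, 1)]
        dists = [np.linalg.norm(X[i] - c) for c in centroids]
        correct += int(np.argmin(dists) == y[i])
    return correct / len(y)

acc = loo_nearest_centroid(patterns, labels)
print(f"decoding accuracy: {acc:.2f}")  # well above the 0.5 chance level
```

Accuracy reliably above chance (0.5 for two classes) is what licenses the claim that a region's activity "predicts" the upcoming action.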
“Being able to decode desired tool use behaviours from brain signals takes us one step closer to using those signals to control those same types of actions with prosthetic limbs,” says Dr. Gallivan. “This work uncovers the brain organization underlying the planning of movements with the hand and hand-operated tools and this knowledge could help people suffering from spinal cord injuries.”
The research was recently published in eLife.
Hit a 95 mph baseball? Scientists pinpoint how we see it coming
How does San Francisco Giants slugger Pablo Sandoval swat a 95 mph fastball, or tennis icon Venus Williams see the oncoming ball, let alone return her sister Serena’s 120 mph serves? For the first time, vision scientists at the University of California, Berkeley, have pinpointed how the brain tracks fast-moving objects.
The discovery advances our understanding of how humans predict the trajectory of moving objects when it can take one-tenth of a second for the brain to process what the eye sees.
That 100-millisecond holdup means that in real time, a tennis ball moving at 120 mph would already have advanced nearly 18 feet before the brain registers the ball’s location. If our brains couldn’t make up for this visual processing delay, we’d be constantly hit by balls, cars and more.
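The arithmetic behind that figure is a straightforward unit conversion, sketched below for the speeds mentioned in the article (the 100 ms delay is the value the researchers cite; the script itself is just a back-of-the-envelope check).

```python
# Distance an object covers during the brain's ~100 ms visual
# processing delay, for a few real-world speeds.

MPH_TO_FPS = 5280 / 3600  # 1 mph ≈ 1.467 ft/s

def distance_during_delay(speed_mph, delay_s=0.1):
    """Feet traveled during the visual processing delay."""
    return speed_mph * MPH_TO_FPS * delay_s

for label, mph in [("95 mph fastball", 95), ("120 mph serve", 120)]:
    print(f"{label}: {distance_during_delay(mph):.1f} ft in 100 ms")
# 95 mph fastball: 13.9 ft in 100 ms
# 120 mph serve: 17.6 ft in 100 ms
```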
Thankfully, the brain “pushes” forward moving objects so we perceive them as further along in their trajectory than the eye can see, researchers said.
“For the first time, we can see this sophisticated prediction mechanism at work in the human brain,” said Gerrit Maus, a postdoctoral fellow in psychology at UC Berkeley and lead author of the paper published today (May 8) in the journal Neuron.
A clearer understanding of how the brain processes visual input – in this case life in motion – can eventually help in diagnosing and treating myriad disorders, including those that impair motion perception. People who cannot perceive motion cannot predict locations of objects and therefore cannot perform tasks as simple as pouring a cup of coffee or crossing a road, researchers said.
This study is also likely to have a major impact on other studies of the brain. Its findings come just as the Obama Administration initiates its push to create a Brain Activity Map Initiative, which will further pave the way for scientists to create a roadmap of human brain circuits, as was done for the Human Genome Project.
Using functional magnetic resonance imaging (fMRI), Maus and fellow UC Berkeley researchers Jason Fischer and David Whitney located the part of the visual cortex that makes calculations to compensate for our sluggish visual processing abilities. They saw this prediction mechanism in action, and their findings suggest that the middle temporal region of the visual cortex known as V5 is computing where moving objects are most likely to end up.
For the experiment, six volunteers had their brains scanned, via fMRI, as they viewed the “flash-drag effect,” a visual illusion in which brief flashes appear shifted in the direction of nearby motion.
“The brain interprets the flashes as part of the moving background, and therefore engages its prediction mechanism to compensate for processing delays,” Maus said.
The researchers found that the illusion – flashes perceived in their predicted locations against a moving background and flashes actually shown in their predicted location against a still background – created the same neural activity patterns in the V5 region of the brain. This established that V5 is where this prediction mechanism takes place, they said.
In a study published earlier this year, Maus and his fellow researchers pinpointed the V5 region of the brain as the most likely location of this motion prediction process by successfully using transcranial magnetic stimulation, a non-invasive brain stimulation technique, to interfere with neural activity in the V5 region of the brain, and disrupt this visual position-shifting mechanism.
“Now not only can we see the outcome of prediction in area V5,” Maus said, “but we can also show that it is causally involved in enabling us to see objects accurately in predicted positions.”
On a more evolutionary level, the latest findings reinforce the idea that it is actually advantageous not to see everything exactly as it is. In fact, it’s necessary for our survival:
“The image that hits the eye and then is processed by the brain is not in sync with the real world, but the brain is clever enough to compensate for that,” Maus said. “What we perceive doesn’t necessarily have that much to do with the real world, but it is what we need to know to interact with the real world.”
Been Thinking of Somebody? Brain Researchers Know Who
by Charles Q. Choi
Scientists scanning the human brain can now tell whom a person is thinking of, marking the first time researchers have been able to identify an imagined person from brain-imaging data alone.
Work to visualize thought is starting to pile up successes. Recently, scientists have used brain scans to decode imagery directly from the brain, such as what number people have just seen and what memory a person is recalling. They can now even reconstruct videos of what a person has watched based on their brain activity alone. Cornell University cognitive neuroscientist Nathan Spreng and his colleagues wanted to carry this research one step further by seeing if they could deduce the mental pictures of people that subjects conjure up in their heads.
“We are trying to understand the physical mechanisms that allow us to have an inner world, and a part of that is how we represent other people in our mind,” Spreng says.