Helpful for robotics: brain uses old information for new movements
Information from the senses has an important influence on how we move. For instance, you can see and feel when a mug is filled with hot coffee, and you lift it differently than if the mug were empty. Neuroscientist Julian Tramper discovered that the brain uses two forms of old information in order to execute new movements well. This discovery can be useful for the field of robotics. Tramper will receive his doctorate on Thursday 24 April from Radboud University Nijmegen.
Every time you move, the brain deals with two problems. First, there is a slight delay in the sensory information needed to execute the movement. Second, the command from the brain directing the muscles to move is not entirely precise, because neuronal signals contain a certain amount of natural noise. According to Tramper, the brain has a clever way of getting around both problems: it combines the slightly outdated information from the senses with experience gained from similar movements made in the past. In this way, the brain uses two forms of old information in order to make new movements.
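The idea of blending delayed, noisy sensory feedback with a prediction built from past experience is often modeled as a Kalman-style update. The sketch below is a hypothetical, one-dimensional illustration of that principle, not Tramper's actual model; the numbers in the example are invented for demonstration.

```python
# Minimal sketch (an assumption, not Tramper's model): blend a prediction
# from past experience with a delayed, noisy sensory measurement.

def combine(prediction, pred_var, measurement, meas_var):
    """Optimally blend a prediction with a noisy measurement.

    The weight (Kalman gain) favors whichever source is less uncertain.
    """
    gain = pred_var / (pred_var + meas_var)
    estimate = prediction + gain * (measurement - prediction)
    variance = (1 - gain) * pred_var
    return estimate, variance

# Example: experience predicts the hand is at 10.0 cm (variance 4.0),
# while delayed vision reports 12.0 cm (variance 1.0). The blended
# estimate leans toward the more reliable visual signal.
est, var = combine(10.0, 4.0, 12.0, 1.0)
print(est, var)  # 11.6 0.8000000000000002
```

Note that the blended estimate is always more certain (lower variance) than either source alone, which is one reason this kind of combination is attractive for both brains and robots.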
Computer versus test subject
Understanding the brain processes behind movement can be of great importance to fields like robotics. Therefore Tramper is trying to model his findings so that it will be possible to use them in robots in the future. He has already succeeded in this for certain hand-eye coordination experiments, to the extent that a computer can perform at about the same level as human test subjects. As a post-doctoral researcher within the Donders Institute, Tramper is researching how these types of models can be integrated into bio-inspired robots (robots based on biological principles).
Tramper is currently working on a project called SpaceCog. The goal of this project is to develop a robot which can independently orient itself in space, something that humans do automatically. This is difficult to achieve, because each time a robot moves, it must reinterpret the information from its cameras and other sensors in order to determine whether the changes to its input are the result of its own movement or an external cause. The researchers involved in SpaceCog want to figure out how our brain has solved this problem. Tramper has three years to come up with a good computer model addressing this issue.
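The self-versus-external attribution problem described above is commonly approached with an "efference copy": the system predicts how its sensors should change as a result of its own motor command, and attributes any leftover change to the outside world. The sketch below is a hypothetical illustration of that idea, not the SpaceCog implementation; the function name and gain parameter are invented for the example.

```python
# Hypothetical illustration (not SpaceCog code): use a copy of the robot's
# own motor command to predict the expected sensor shift, then attribute
# the remainder of the observed shift to external causes.

def attribute_change(observed_shift, motor_command, gain=1.0):
    """Split an observed sensor shift into self-generated and external parts.

    `gain` maps the motor command onto the expected sensor shift; in a
    real robot it would be learned from experience.
    """
    predicted_shift = gain * motor_command      # forward-model prediction
    external_shift = observed_shift - predicted_shift
    return predicted_shift, external_shift

# The robot turned its camera 5 degrees, but the scene shifted 7 degrees:
# 5 degrees are explained by its own movement, 2 by something external.
print(attribute_change(7.0, 5.0))  # (5.0, 2.0)
```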
Looking towards the future
Tramper is studying hand-eye coordination by having test subjects play a special computer game. The subjects use a game controller to move a digital right hand and left hand on a screen. They have to move the two hands independently of one another and make each hand follow a particular path in order to reach a final destination (see film 1). It turned out that the test subjects’ eyes moved ahead of the digital hands. In other words, the eyes looked at a point that the hands would reach in the future (see film 2). This type of eye movement is called smooth pursuit, and until now it had only been detected in response to external stimuli, when a subject was following an object’s movement. Tramper detected smooth pursuit eye movements at locations the hands had not yet reached, meaning these movements were triggered by internal stimuli.
Tramper explains, ‘We’d previously demonstrated for other types of eye movement that the eye anticipates and moves in advance of external movement. To our surprise, this is also the case with smooth pursuit. It is probable that this is a compromise between where you are at a particular moment and where you want to get to. When moving, you need to keep track of your current location (which is constantly changing) and your target destination. Smooth pursuit eye movements can help you do this by letting your eye “hover” between both locations. If we can teach robots to do something like this, it will help make their movements much more natural. This will increase the number of ways in which robots can be put to work.’
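The "hovering" compromise Tramper describes can be pictured as gaze landing at a weighted point between the hand's current position and its target. The sketch below is purely illustrative; the weight `w` is an assumed parameter, not a value from Tramper's experiments.

```python
# Illustrative sketch (assumed, not from Tramper's data): gaze as a
# weighted compromise between the hand's current position and the target.

def gaze_point(current, target, w=0.5):
    """Return a gaze location a fraction `w` of the way from current to target."""
    return tuple(c + w * (t - c) for c, t in zip(current, target))

# Hand at (0, 0), target at (10, 4): with w=0.5 the eye "hovers" midway.
print(gaze_point((0.0, 0.0), (10.0, 4.0)))  # (5.0, 2.0)
```

With w=0 the eye would track the hand exactly, and with w=1 it would jump straight to the goal; an intermediate weight keeps both locations in view at once.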
Rats take high-speed multisensory snapshots
When animals are on the hunt for food they likely use many senses, and scientists have wondered how the different senses work together.
New research from the laboratory of CSHL neuroscientist and Assistant Professor Adam Kepecs shows that when rats actively use the senses of smell (sniffing) and touch (through their whiskers), those two processes are locked in synchrony. The team’s paper, published today in the Journal of Neuroscience, shows that sniffing and “whisking” movements are synchronized even when they are running at different frequencies.
Studies in the 1960s suggested these two sensory activities were coordinated: sniffing, a sharp, deep intake of air; and whisking, the back-and-forth movement of the whiskers to sample the near environment, akin to the sensation of touch as felt through the fingers in humans. Such coordination could be important for decisions that depend on multiple types of sensory information, for instance, locating food. “The question is how two very different streams of sensory information, touch and smell, are integrated into a single multisensory ‘snapshot’ of the environment,” says Kepecs.
These snapshots can be taken at high frequency, up to 12 times a second. To determine whether these two sensorimotor rhythms are indeed phase-locked, Kepecs’ team, including postdocs Sachin Ranade and Balázs Hangya, simultaneously monitored sniffing and whisking in rats freely foraging for food pellets.
At frequencies between 4 and 12 cycles per second, they found strong 1:1 phase locking: in other words, every time the rats extended their whiskers to feel their vicinity, they also smelled it. Surprisingly, the rhythms remained phase locked even when sniffing and whisking were operating at different fundamental frequencies. The key is that the phases of the sensory input, the start of inhalation and the onset of whisking, are aligned, which facilitates multisensory integration.
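Phase locking of this kind is often quantified with a phase-locking value (PLV): the average of unit vectors at the phase difference between two rhythms. A PLV near 1 means the phase relationship is nearly constant; near 0 means no locking. The sketch below is an illustrative computation on simulated rhythms, not the analysis code from the Kepecs lab's paper.

```python
# Illustrative sketch (not the paper's analysis): phase-locking value (PLV)
# between two rhythmic signals, computed from their instantaneous phases.
import cmath
import math
import random

def phase_locking_value(phases_a, phases_b):
    """Mean resultant length of the phase differences (1 = locked, 0 = not)."""
    total = sum(cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b))
    return abs(total) / len(phases_a)

# Two 8 Hz rhythms with a constant phase offset are perfectly locked.
t = [i / 1000 for i in range(1000)]
a = [2 * math.pi * 8 * x for x in t]
b = [p + 0.3 for p in a]
print(round(phase_locking_value(a, b), 3))  # 1.0

# Random phase jitter destroys the locking: PLV falls close to zero.
random.seed(0)
jittered = [p + random.uniform(0, 2 * math.pi) for p in a]
print(phase_locking_value(a, jittered) < 0.2)  # True
```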
This is similar to how a person’s breathing rhythm settles into place while running and becomes synchronized with the steps. In both cases, the coordination could be advantageous in terms of energy efficiency. A crucial difference, though, is that in humans the breathing rate has to catch up to the running rhythm after changes in pace, while sniffing and whisking in rats lock into phase immediately.
Even though human behavior doesn’t seem to be overtly tied to rhythms, there are hints that it could be. “Underneath the smoothly executed movements of humans there are rhythm generators, which are sometimes revealed in some diseases, for example the tremors seen in Parkinson’s disease, or in the brain waves that result from the synchronized firing of neurons,” says Kepecs. Studying the rhythms of multisensory inputs in rodents could provide clues to a fundamental principle underlying sensory and brain rhythms that are essential to all animals, including humans.