People talk a lot about how animalistic Max is with his grunts and his survival instincts, but I feel like another way this was portrayed in the movie was through his keen hearing and how he often uses hearing alone to manage tasks.
The very opening shot depicts this (unless you want to go with Glory and her psychic visions). There’s this series of shots:
He doesn’t turn to see the lizard approaching him. Not once. He just listens to it as it scurries up to him, and when it’s within perfect reach of him, he makes his move. We’d normally expect to see a person turn. In fact, that last look, over his shoulder, is closer to one we’d expect from someone waiting for a small animal to approach them. But he doesn’t. He is able to determine both when it is close enough to strike and where to strike. This kind of sets up a later moment, which I’ll get to.
Early humans heard sounds differently than we do today, according to new research that provides intriguing clues on the environments our ancestors were living in, and how prehistoric humans communicated with each other.
A key finding of the study, published in the journal Science Advances, is that our ancestors living in South Africa around 2 million years ago were extremely sensitive to close-range sounds, hearing them more keenly than both our species and chimpanzees do now.
“We concluded that Australopithecus africanus and Paranthropus robustus had a heightened sensitivity to sound in the 1.0–3.0 kHz range compared with both chimpanzees and humans,” lead author Rolf Quam, an assistant professor of anthropology at Binghamton University, told Discovery News.
He added that these early humans “were capable of hearing softer sounds” than our species and chimps can.
Quam and colleagues made the determination after reconstructing the internal anatomy of the ears of the two prehistoric humans. To do so, the researchers used CT scans and virtual computer reconstructions based on fossils. The particular human species were selected for the study because their remains include preserved ear bones.
The sensitivity to short-range sounds likely would have facilitated close-range communication in an open habitat. Prior research on the tooth enamel of these prehistoric humans found evidence for consumption of foods from both forests and savannahs, so our South African ancestors likely divided their time between the two environments.
Concentrating attention on a visual task can render you momentarily
‘deaf’ to sounds at normal levels, reports a new UCL study.
The study, published in the Journal of Neuroscience,
suggests that the senses of hearing and vision share a limited neural
resource. Brain scans from 13 volunteers found that when they were
engaged in a demanding visual task, the brain response to sound was
significantly reduced. Participants also failed more often to detect
sounds during the visually demanding task, even though the sounds were
clearly audible and were readily detected when the visual task was easy.
“This was an experimental lab study which is one of the ways that we
can establish cause and effect. We found that when volunteers were
performing the demanding visual task, they were unable to hear sounds
that they would normally hear,” explains study co-author Dr Maria Chait
(UCL Ear Institute). “The brain scans showed that people were not only
ignoring or filtering out the sounds, they were not actually hearing
them in the first place.”
The phenomenon of ‘inattentional deafness’, where we fail to notice
sounds when concentrating on other things, has been observed by the
researchers before. However, this is the first time that they have been
able to determine, by measuring brain activity in real-time using MEG
(magnetoencephalography), that the effects are driven by brain
mechanisms at a very early stage of auditory processing which would be
expected to lead to the experience of being ‘deaf’ to these sounds.
“Inattentional deafness is a common experience in everyday life, and
now we know why,” says co-author Professor Nilli Lavie (UCL Institute
of Cognitive Neuroscience). “For example, if you try to talk to someone
who is focusing on a book, game or television programme and don’t
receive a response, they aren’t necessarily ignoring you, they might
simply not hear you! This could also explain why you might not hear your
train or bus stop being announced if you’re concentrating on your
phone, book or newspaper.
“This has more serious implications in situations such as the
operating theatre, where a surgeon concentrating on their work might not
hear the equipment beeping. It also applies to drivers concentrating on
complex satnav directions as well as cyclists and motorists who are
focusing intently on something such as an advert or even simply an
interesting-looking passer-by. Pedestrians engaging with their phone,
for example texting while walking, are also prone to inattentional
deafness. Loud sounds such as sirens and horns will be loud enough to
get through, but quieter sounds like bicycle bells or car engines are
likely to go unheard.”
Much like trying to watch a video with the audio out of synch, older
adults may have difficulty combining the stimuli they see and hear, and
it could have implications for rapid decision-making tasks such as
driving, according to new research.
A recent study from the University of Waterloo found that seniors
have a harder time distinguishing the order of events than younger
adults. When researchers presented them with both a light and sound at
the same or different times, they found that young and older adults
could determine whether they occurred simultaneously with similar
accuracy. But when asked to determine which appeared first, the light or
the sound, older adults performed much worse.
“To make sense of the world around us, the brain has to rapidly
decide whether to combine different sources of information,” said
Michael Barnett-Cowan, a professor in the Department of Kinesiology at
the University of Waterloo and senior author on the paper. “Older adults
often experience problems processing multisensory information, which in
turn can affect everyday tasks from following conversations, to
driving, to maintaining balance.”
In another test, researchers showed the study participants two lights
travelling towards one another. Usually the lights appear to stream
past each other, but when a sound occurs close to when the lights touch,
they seem to bounce off each other. In this test, older adults
continued to perceive the lights as bouncing even when the sound
occurred well before or after the lights touched, suggesting that older
adults combine sensory information that should not belong together.
This is the first study to test multiple ways in which younger and
older people combine sensory information in time. The findings offer
hope that strengthening the link between these brain processes as
people age could reduce impairments in distinguishing the order of
events and in perceiving illusory collisions. Possible interventions
for improving impaired timing perception in older adults include
training with video games or brain stimulation.
“Health professionals are able to address many changes in our vision
and hearing as we age using corrective lenses and hearing aids, for
example. But these interventions don’t help with changes in the brain’s
ability to combine sensory information,” said Barnett-Cowan. “If we can
identify and address impaired timing of events in the elderly, we could
potentially improve the quality of life, safety and independence for
many older people.”
Allow mainstream integrated deaf people to reclaim their Deafness and Deaf behaviours. Allow them to take as long as they need, cuz you don’t know what they were made to believe while growing up away from the Deaf world. Don’t punish them for acting ‘hearing’ cuz many of us didn’t get a fairly informed choice.
Humans Probably Not Alone in How We Perceive Melodic Pitch
The specialized human ability to perceive the sound quality known as
“pitch” can no longer be listed as unique to humans. Researchers at
Johns Hopkins report new behavioral evidence that marmosets, an ancient
lineage of monkeys, appear to use auditory cues similar to those humans
use to distinguish between low and high notes. The discovery suggests
that aspects of pitch perception may have evolved more than 40 million
years ago to enable vocal communication and songlike vocalizations.
“Pitch perception is essential to our ability to communicate and make music,” says Xiaoqin Wang, Ph.D.,
a professor of biomedical engineering at the Johns Hopkins University
School of Medicine, “but until now, we didn’t think any animal species,
including monkeys, perceived it the way we do. Now we know that
marmosets, and likely other primate ancestors, do.”
Marmosets are small monkeys native to South America that are highly
vocal and social. Wang, an auditory neuroscientist and biomedical
engineer, has been studying their hearing and vocalizations for the past
20 years. A decade ago, he says, he and his team of researchers identified a region in the marmoset brain that appears to process pitch. Nerve cells in that region, on the edge of the primary auditory cortex,
only “fired” after marmosets were exposed to sounds with pitch, like
the shifting in high and low notes associated with a melody, not those
without, such as noise. Human brains show similar activity in that
region, as other researchers have reported, he notes.
What was missing was behavioral evidence that the marmosets could
perceive and respond to differences in pitch the way humans do, and
Wang’s laboratory group spent years developing behavioral tests and
electrophysiological devices designed to monitor subtle changes in the
monkeys’ neural activity. Part of their work was to train a group of
marmosets to lick a waterspout only after hearing a change in pitch.
Wang says that other animal species have been reported to show pitch
perception, but none have shown the three specialized features of human
pitch perception. First, people are better at distinguishing pitch
differences at low frequencies than high. For example, people who hear
tones of 100, 200, 300 and 400 hertz played simultaneously hear four
separate sounds, but they hear only one sound when tones of 1,100,
1,200, 1,300 and 1,400 hertz are played together, even though the
frequency intervals are the same in both cases.
Second, humans are able to pick up on subtle changes in the spread
between pitches at low frequencies or hertz, so they notice if a series
of tones is increasing by 100 hertz each time and then introduces a tone
only 90 hertz higher.
And third, at high frequencies, people’s ability to perceive pitch
differences among tones played simultaneously is related to how
sensitive they are to the rhythm, or timed fluctuations, of sound waves.
Through a series of hearing tests, with waterspout licks as a readout,
Wang’s team, led by graduate student Xindong Song, determined that
marmosets share all three features with humans, suggesting that human
components of pitch perception evolved much earlier than previously
thought.
The American continent, with its marmosets in place, broke away from
the African land mass approximately 40 million years ago, before humans
appeared in Africa, so it’s possible that this humanlike pitch
perception evolved before that break and was maintained throughout
primate evolution in Africa until it was inherited by modern humans.
Another possibility is that only certain aspects of pitch perception
were in place before the split, with the rest of the mechanisms evolving
in parallel in Old and New World monkeys. According to Wang, more
stringent tests are needed to determine whether existing Old World
monkeys perceive pitch like humans do.
“In addition to the evolutionary implications of this discovery, I’m
looking forward to what we will be able to learn about human pitch
perception now that we have a primate relative we can study behaviorally
and physiologically,” says Wang. “Now we can explore questions about
what goes wrong in people who are tone deaf and whether perfect pitch is
an inherited or learned trait.”