Infants

Roboticists learn to teach robots from babies

Babies learn about the world by exploring how their bodies move in space, by grabbing toys and pushing things off tables, and by watching and imitating what adults do.

But when roboticists want to teach a robot how to do a task, they typically either write code or physically move a robot’s arm or body to show it how to perform an action.

Now a collaboration between University of Washington developmental psychologists and computer scientists has demonstrated that robots can “learn” much like kids — by amassing data through exploration, watching a human do something and determining how to perform that task on its own.

“You can look at this as a first step in building robots that can learn from humans in the same way that infants learn from humans,” said senior author Rajesh Rao, a UW professor of computer science and engineering.

“If you want people who don’t know anything about computer programming to be able to teach a robot, the way to do it is through demonstration — showing the robot how to clean your dishes, fold your clothes, or do household chores. But to achieve that goal, you need the robot to be able to understand those actions and perform them on its own.”

The research, which combines child development studies from the UW’s Institute for Learning & Brain Sciences (I-LABS) with machine learning approaches, was published in November in the journal PLOS ONE.

In the paper, the UW team developed a new probabilistic model aimed at solving a fundamental challenge in robotics: building robots that can learn new skills by watching people and imitating them.

The roboticists collaborated with UW psychology professor and I-LABS co-director Andrew Meltzoff, whose seminal research has shown that children as young as 18 months can infer the goal of an adult’s actions and develop alternate ways of reaching that goal themselves.

In one example, infants saw an adult try to pull apart a barbell-shaped toy, but the adult failed to achieve that goal because the toy was stuck together and his hands slipped off the ends. The infants watched carefully and then decided to use alternate methods — they wrapped their tiny fingers all the way around the ends and yanked especially hard — duplicating what the adult intended to do.

Children acquire intention-reading skills, in part, through self-exploration that helps them learn the laws of physics and how their own actions influence objects, eventually allowing them to amass enough knowledge to learn from others and to interpret their intentions. Meltzoff thinks that one of the reasons babies learn so quickly is that they are so playful.

“Babies engage in what looks like mindless play, but this enables future learning. It’s a baby’s secret sauce for innovation,” Meltzoff said. “If they’re trying to figure out how to work a new toy, they’re actually using knowledge they gained by playing with other toys. During play they’re learning a mental model of how their actions cause changes in the world. And once you have that model you can begin to solve novel problems and start to predict someone else’s intentions.”

Rao’s team used that infant research to develop machine learning algorithms that allow a robot to explore how its own actions result in different outcomes. Then the robot uses that learned probabilistic model to infer what a human wants it to do and complete the task, and even to “ask” for help if it’s not certain it can.
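The paper’s actual model is considerably more sophisticated, but the basic loop it describes can be sketched in a few lines. Below is a minimal, hypothetical Python illustration — the action and outcome names, the toy “world” probabilities, and the simple inference rule are all invented for this sketch, not taken from the study. The robot first learns outcome probabilities from its own exploration, then uses that forward model to guess the goal behind an observed action and to pick its own, possibly different, action for reaching it.

```python
import random
from collections import Counter, defaultdict

# Hypothetical action and outcome spaces, not the ones used in the paper.
ACTIONS = ["push", "pick_and_place"]
OUTCOMES = ["object_moved", "object_dropped", "no_change"]

def explore(n_trials=1000):
    """Self-exploration phase: the robot tries actions and records outcomes.
    The 'world' here is a toy simulator with made-up probabilities."""
    world = {
        "push": [("object_moved", 0.6), ("object_dropped", 0.1), ("no_change", 0.3)],
        "pick_and_place": [("object_moved", 0.9), ("object_dropped", 0.05), ("no_change", 0.05)],
    }
    counts = defaultdict(Counter)
    for _ in range(n_trials):
        action = random.choice(ACTIONS)
        outcomes, weights = zip(*world[action])
        counts[action][random.choices(outcomes, weights=weights)[0]] += 1
    # Learned forward model: P(outcome | action)
    return {a: {o: counts[a][o] / sum(counts[a].values()) for o in OUTCOMES} for a in ACTIONS}

def infer_goal(forward_model, observed_action, observed_outcome):
    """Assume the human's actions obey the same forward model as the robot's own.
    If the human succeeded, take the outcome as the goal; if not (e.g. 'no_change'),
    take the outcome their action most likely aims at."""
    if observed_outcome != "no_change":
        return observed_outcome
    return max(forward_model[observed_action], key=forward_model[observed_action].get)

def choose_action(forward_model, goal):
    """Pick the robot's own action that maximizes the probability of the goal,
    which need not be the action the human demonstrated."""
    return max(ACTIONS, key=lambda a: forward_model[a][goal])

model = explore()
goal = infer_goal(model, observed_action="push", observed_outcome="no_change")
print("Inferred goal:", goal, "-> robot chooses:", choose_action(model, goal))
```

In this toy run the robot watches a failed push, infers that the goal was to move the object, and chooses to pick it up instead — the same kind of ends-over-means flexibility described in the experiments below.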

The team tested its robotic model in two different scenarios: a computer simulation experiment in which a robot learns to follow a human’s gaze, and another experiment in which an actual robot learns to imitate human actions involving moving toy food objects to different areas on a tabletop.

In the gaze experiment, the robot learns a model of its own head movements and assumes that the human’s head is governed by the same rules. The robot tracks the beginning and ending points of a human’s head movements as the human looks across the room and uses that information to figure out where the person is looking. The robot then uses its learned model of head movements to fixate on the same location as the human.

The team also recreated one of Meltzoff’s tests that showed infants who had experience with visual barriers and blindfolds weren’t interested in looking where a blindfolded adult was looking, because they understood the person couldn’t actually see. Once the team enabled the robot to “learn” what the consequences of being blindfolded were, it no longer followed the human’s head movement to look at the same spot.

“Babies use their own self-experience to interpret the behavior of others — and so did our robot,” said Meltzoff.

In the second experiment, the team allowed a robot to experiment with pushing or picking up different objects and moving them around a tabletop. The robot used that model to imitate a human who moved objects around or cleared everything off the tabletop. Rather than rigidly mimicking the human action each time, the robot sometimes used different means to achieve the same ends.

“If the human pushes an object to a new location, it may be easier and more reliable for a robot with a gripper to pick it up to move it there rather than push it,” said lead author Michael Jae-Yoon Chung, a UW doctoral student in computer science and engineering. “But that requires knowing what the goal is, which is a hard problem in robotics and which our paper tries to address.”

Though the initial experiments involved learning how to infer goals and imitate simple behaviors, the team plans to explore how such a model can help robots learn more complicated tasks.

“Babies learn through their own play and by watching others,” says Meltzoff, “and they are the best learners on the planet — why not design robots that learn as effortlessly as a child?”

Better fine motor skills with delayed cord clamping

Several years ago, Swedish researchers showed in a widely acclaimed study that the umbilical cord matters not only for the fetus but also for the newborn infant. In a follow-up study in the journal JAMA Pediatrics, they have now shown an association between delayed cord clamping (DCC) and children’s fine motor skills at the age of four years, especially in boys.

Several years ago, in a clinical study of 400 newborns, Dr. Ola Andersson and colleagues demonstrated that the risk of iron deficiency at the age of four months was considerably lower in infants whose umbilical cords were clamped and cut three minutes after birth (‘delayed cord clamping’, DCC) than in those whose cords were clamped within 10 seconds (‘early cord clamping’, ECC). The newborns in the study were well-nourished babies born after full-term pregnancies to healthy mothers.

“If the cord is left in place for three minutes, the blood continues to flow into the newborn’s circulation. The baby receives about a deciliter of extra blood, which corresponds to two liters in an adult,” says Dr. Andersson, a researcher at Uppsala University and pediatrician in Halmstad.
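As a rough sanity check of that comparison — a back-of-the-envelope sketch only, using assumed typical blood volumes (about 85 mL/kg for a roughly 3.5 kg newborn and about 5 L for an adult) that are not figures from the study:

```python
# Back-of-the-envelope check of the 'deciliter in a newborn ~ two liters in an adult' comparison.
# Assumed typical values (not from the study): newborn blood volume ~85 mL/kg at ~3.5 kg,
# adult total blood volume ~5 L.
newborn_blood_ml = 85 * 3.5   # roughly 300 mL of circulating blood in a newborn
extra_ml = 100                # one deciliter of placental transfusion
fraction = extra_ml / newborn_blood_ml
adult_blood_l = 5.0
print(f"The extra blood is ~{fraction:.0%} of a newborn's blood volume,"
      f" equivalent to ~{fraction * adult_blood_l:.1f} L in an adult")
# -> about a third of the newborn's blood volume, on the order of 1.5-2 L for an adult
```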

Top Knot Newborn Cap

Sweet little hat for newborns, soft and comfy to keep baby’s little head warm those first few weeks as his body thermostat regulates itself. Made of soft knit cotton, just the right stretch to fit snugly but not too tightly.

  • Handmade item
  • Made of certified organic knit interlock cotton and organic cotton thread
  • Sized NB-3mo
  • Ships worldwide

BUY NOW

($16.00)

Mesh Credit | TOU

No real money is being made from this item.

The Ends Count Starting at Birth: Newborns use first and last syllables to recognize words

The cognitive system encodes the first and last syllables of words better than the middle ones. Researchers at SISSA, in collaboration with Udine Hospital (Azienda Ospedaliera di Udine), have demonstrated for the first time that this cognitive mechanism is present from birth. The study was published in the journal Developmental Science.

Most of us think of infants as tiny beings whose main business is to sleep, suck and cry, without much awareness of what is happening around them. It may come as somewhat of a surprise, then, to know that newborn brains are full of feverish activity and that they are already gathering and processing important information from the world around them. At just two days after birth, babies are already able to process language using mechanisms similar to those of adults. SISSA researchers have demonstrated that they are sensitive to the most important parts of words, the edges, a sensitivity that has been repeatedly observed in older children and adults.

It is well known that, in general, people remember the edges of sequences better. In language in particular, when we must remember and recognize words, the brain gives greater weight to information at the beginning and the end of the word. Languages around the world seem to capitalize on this better encoding at the edges. “The syllables at the beginnings and the ends of words often carry important information. For example, the parts of words that contain information about plurality of objects or verb tense are almost always found at the beginning or at the end of words in all known languages,” says Alissa Ferry, researcher at the International School for Advanced Studies of Trieste (SISSA) and author of the study.

“It is a pervasive phenomenon and our study shows that it is present from birth,” says Ana Flo, a SISSA researcher who was involved in the study. “Researchers here at SISSA had already shown that pre-linguistic babies of 7-8 months show this enhanced encoding of word edges, but we went further, showing that this mechanism is present in humans even during the first days of life.”

"The infants heard a sequence of six syllables and we examined if they could discriminate it from a very similar sequence, in which we switched the positions of two of the syllables. When we switched the edge syllables, the newborns’ brain responded to the change, but when we switched the two syllables in the middle, they did not respond to the change. This suggests that the newborns better encoded the syllables at the edges of the sequence,” says Perrine Brusini, a SISSA researcher and one of the study’s authors.

In real language there are signals, like prosody or very subtle pauses, that cue the boundaries between words and phrases, and these may help us remember words from even longer stretches of speech. “In another series of experiments, we tried to find out if neonates can use these cues to process the syllables in the middle of the sequence,” continues Flo. “To do that we introduced a small discontinuity between the two middle syllables, an almost imperceptible 25-millisecond pause, and examined whether infants would now notice the switch between the middle syllables. With this very subtle cue, the neonate brain treated the sequence as two shorter words and responded when the syllables switched.”

Humans better encode information from the edges of sequences and this cognitive mechanism can influence language acquisition even from the first days of life, conclude SISSA researchers.

Behind the scenes research fact…

How do you figure out what is happening in the brain of a newborn (without disturbing the baby too much)? While not an easy process, there are experimental methods that take advantage of the “habituation” phenomenon and can be used to figure out how children think and learn. When hearing a stimulus repeatedly, the brain response habituates: it responds strongly to a new stimulus, but after hearing the same thing repeatedly, the response to that stimulus decreases. If you change the stimulus, the brain response becomes strong again. Using non-invasive infrared spectroscopy, brain activity can be measured: “We had the newborns listen to the same word repeatedly and then we played the word with the syllables switched. If the newborn brain detected the difference, we would see an increased brain response. The brain response increased when we switched the syllables at the edge of a word but not when we switched the syllables in the middle of a word, indicating that edges were encoded better,” explains Ferry.
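As a toy illustration of that logic only (not the analysis actually used in the study), the habituation/dishabituation idea can be mimicked in a few lines of Python: simulated responses to a repeated stimulus decay, and a rebound after a switched stimulus is read as evidence that the change was detected.

```python
# Toy illustration of the habituation/dishabituation logic (not the study's actual analysis).
# Responses to a repeated stimulus decay; a rebound after a change suggests the brain noticed it.

def simulate_responses(n_repeats, baseline=1.0, decay=0.7, change_detected=True):
    """Return simulated response magnitudes: n_repeats of the same stimulus,
    then one 'switched' stimulus whose response rebounds only if the change is detected."""
    responses = [baseline * (decay ** i) for i in range(n_repeats)]
    responses.append(baseline if change_detected else responses[-1] * decay)
    return responses

def dishabituation(responses):
    """Compare the response to the switched stimulus with the last habituated response."""
    return responses[-1] - responses[-2]

edge_switch = simulate_responses(6, change_detected=True)     # e.g. edge syllables swapped
middle_switch = simulate_responses(6, change_detected=False)  # e.g. middle syllables swapped
print("Edge switch rebound:  ", round(dishabituation(edge_switch), 2))   # large -> change detected
print("Middle switch rebound:", round(dishabituation(middle_switch), 2)) # near zero -> no detection
```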

Babies’ Babbles Reflect Their Own Involvement in Language Development

“Dada” is a first word for many babies. Babbling sounds with consonant-vowel repetitions, such as “dada,” are common among infants once they reach 8 months old; however, these sounds are not prevalent among infants who have profound hearing loss – that is, until they receive cochlear implants. Now, University of Missouri research shows that babies’ repetitive babbles primarily are motivated by infants’ ability to hear themselves. Additionally, infants with profound hearing loss who received cochlear implants to improve their hearing soon babbled as often as their hearing peers, allowing them to catch up developmentally.

“Hearing is a critical aspect of infants’ motivation to make early sounds,” said Mary Fagan, an assistant professor of communication science and disorders in the MU School of Health Professions. “The fact that they attend to and learn from their own behaviors, especially in speech, highlights how infants’ own experiences help their language, social and cognitive development. This research doesn’t diminish the importance of the speech that babies hear from others – we know they need to learn from others – but it raises our awareness that infants are not just passive recipients of what others say to them. They are actively engaged in their own developmental process.”

Fagan studied the babbles of 27 hearing infants and 16 infants with profound hearing loss before and after they received cochlear implants, which are small electronic devices embedded into the bone behind the ear that replace some functions of the damaged inner ear. Before receiving cochlear implants, babies with profound hearing loss rarely produced repetitive vocalizations, such as ‘ba-ba’ or ‘da-da.’ Within a few months of receiving cochlear implants, the number of babies who produced repetitive vocalizations increased, the number of vocalizations that contained repetitive syllables increased, and the number of actual repetitions in the string, such as ‘ba-ba-ba-ba-ba,’ increased, Fagan said.

“The research tells us that infants are motivated by hearing the sounds they produce, so these sounds are functional in some way,” Fagan said. “Research conducted by others supports the idea that babies form mental representations of their own babbles, such as these strings of syllables, which may be the reason that infants tend to use the sounds that they have babbled in their first words rather than the sounds that are most common in the speech that adults use with them.”

Fagan says parents who have children with profound hearing loss should be well-informed about cochlear implants before making the decision for their children to get the devices.

“Many parents elect to have their children with profound hearing loss receive cochlear implants, and that’s a decision parents alone can make,” Fagan said. “Whatever decision the parents make, the data strongly show that if parents are going to choose a cochlear implant, the sooner the better. Studies like mine show how rapidly babies with hearing loss respond to cochlear implants, often minimizing the impact on their speech, language and vocabulary development.”

Fagan’s research, “Why repetition? Repetitive babbling, auditory feedback, and cochlear implantation,” was published in the Journal of Experimental Child Psychology.

cyanna-guardian asked:

I was looking through your age tags and wondered if you know of any reference specifically for children’s height/body structure with age. I mean from year to year or month to month in infants, rather than the child-teenager skip we see so often. Thanks~

Let’s see…

Googling “infant age progression” got me a lot of CG age ups, but I did eventually find [this], [this], and [this]. They only really seem to document the first twelve months, though.

OH GOD THEY’RE SO CUTE~!

Ahah, ahem. ;U;

Unfortunately, I’ve not been able to find anything that covers childhood or teen years in the same kind of detail :/

Turn-taking in communication may be more ancient than language

The central use of language is in conversation, where we take short turns in rapid alternation, a pattern found across unrelated cultures and languages. In the December issue of Trends in Cognitive Sciences, Stephen Levinson from the Max Planck Institute for Psycholinguistics reviews new research on turn-taking, focusing on its implications for how languages are structured and for how language and communication evolved.            

When we speak, we take turns responding to each other. The speed of response (about 200 milliseconds on average, about the same time as it takes to blink) is astonishing when we appreciate the slow nature of language encoding: it takes 600ms or more to prepare a word for delivery. This implies a substantial overlap between listening to the current speaker and preparing our own response. Levinson reviews research focused on this overlap of comprehension and production, and points out that this double-tasking may have systematic effects on language structure: it may motivate the compact clause found in all languages and the inferential reasoning that allows much to be meant by a few words.
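A quick bit of arithmetic, using only the averages quoted in the paragraph above, makes that overlap concrete:

```python
# Rough arithmetic using the average timings quoted above.
gap_ms = 200           # typical gap between conversational turns
prepare_word_ms = 600  # minimum time needed to prepare even a single word for delivery
overlap_ms = prepare_word_ms - gap_ms
print(f"Speech planning must start at least ~{overlap_ms} ms before the current "
      "speaker finishes, so listening and planning run in parallel.")
```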

In human infants, turn-taking is found in the ‘proto-conversations’ with caretakers, appearing around six months of age, long before infants know much about language. These infant-caretaker interactions are initially adult-like in terms of how fast infants can respond. But as they develop into more sophisticated communicators, infants’ turn-taking abilities slow down, likely due to both learning more and more complex linguistic structures, and having to find a way to squeeze these into short turns. Turn-taking is also exhibited in all the major branches of the primate family—partly innate and partly learned in some monkeys, just as with human infants. Even our nearest cousins the great apes take alternating turns in gestural communication, despite having a less complex vocal channel.

All of this suggests that humans may have inherited a primate turn-taking system. This may have started out as a gestural form of communication, as with the other great apes, then later (about 1 million years ago) became one primarily expressed through the vocal channel. If language complexity developed within a pre-existing turn-taking system, it might explain why so much complexity is crammed in such short turns with such short gaps between them, and also why infants struggle with responding with complex structures at adult-like speeds.

Babies born with drug withdrawal symptoms on the rise

The number of infants born in the United States with drug withdrawal symptoms, also known as neonatal abstinence syndrome (NAS), nearly doubled in a four-year period. By 2012, one infant was born every 25 minutes in the U.S. with the syndrome, accounting for $1.5 billion in annual health care charges, according to a new Vanderbilt study published in the Journal of Perinatology.

S W Patrick, M M Davis, C U Lehman and W O Cooper. Increasing incidence and geographic distribution of neonatal abstinence syndrome: United States 2009 to 2012. Journal of Perinatology, April 2015. DOI: 10.1038/jp.2015.36

A moving story of organ donation…

Gray’s Donation   (Radiolab Podcast)

 A donation leads Sarah and Ross Gray to places we rarely get a chance to see. In this surprising journey, they gain a view of science that is redemptive, fussy facts that are tender, and parts of a loved one that add up to something unexpected.

Before he was even born, Sarah and Ross knew that their son Thomas wouldn’t live long. But as they let go of him, they made a decision that reverberated through a world they had never thought much about. Years later, after a couple of awkward phone calls and an unexpected family road trip, they managed to meet the people and see the places for which Thomas’ short life was an altogether different kind of gift.

Thomas Gray (Photo Credit: Mark Walpole)

Telomere Length in Newborns Linked to Mom’s Education Level

A small study of new mothers suggests that not having graduated from high school – possibly an indicator of socioeconomic stress – may impact the likelihood of babies being born with shortened telomeres, molecules that cap the ends of chromosomes and protect them from damage. While the consequences of being born with shortened telomeres are not fully understood, reduced telomere length is a hallmark of cellular aging that, in adults, is associated with shorter lifespan and increased risk for conditions such as diabetes, obesity and cancer.

The research is in the Journal of Perinatology (full text behind a paywall).