The team found that humans are born with the memory capacity and language-processing abilities needed to use at least two types of cues to parse continuous speech, one of the first steps in acquiring a language. In other words, newborns come into the world already equipped with the skills they need to pick out words.
“Before infants can learn words, they must identify those words in continuous speech,” said Flo, the study’s first author. “Yet, the speech signal lacks obvious boundary markers, which poses a potential problem for language acquisition.”
Flo added that while infants overcome this problem by the middle of their first year, how they do so raises a further question: Do they rely on abilities present from birth, or does the solution come only with brain maturation and sufficient language exposure?
The team used experiments to look for clues on how infants segment human speech. They played infants an audio clip in which four meaningless words were buried in a continuous stream of syllables. They then measured how much the infants took in, including which parts of the brain were active, using a technique called near-infrared spectroscopy. This painless technique shines near-infrared light through the scalp to measure brain activity.
They found two mechanisms in three-day-old infants that give them the skills to pick out words in a stream of sounds. The first, prosody, is the melody and rhythm of speech, which signals where a word starts and stops. The second, the statistics of language, involves tracking how often sounds occur together: syllables that reliably follow one another tend to belong to the same word. Together, these two cues allowed the infants to identify individual words in the speech.
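That statistical cue can be illustrated with a short sketch. The Python example below is not the study’s actual analysis; it is a hypothetical toy model that computes transitional probabilities between adjacent syllables and places a word boundary wherever the probability drops. The made-up words (tupiro, golabu, bidaku, padoti), the syllable stream and the 0.75 threshold are all illustrative assumptions.

```python
from collections import Counter

def transitional_probabilities(syllables):
    """Estimate P(next syllable | current syllable) for each adjacent pair."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

def segment(syllables, threshold=0.75):
    """Insert a word boundary wherever the transition between syllables is weak."""
    tp = transitional_probabilities(syllables)
    words, current = [], [syllables[0]]
    for a, b in zip(syllables, syllables[1:]):
        if tp[(a, b)] < threshold:   # weak link -> likely word boundary
            words.append("".join(current))
            current = []
        current.append(b)
    words.append("".join(current))
    return words

# Toy stream: four made-up three-syllable "words" repeated in varying order,
# with no pauses or other boundary markers between them.
stream = ("tupiro bidaku golabu padoti tupiro golabu "
          "bidaku padoti golabu tupiro padoti bidaku").replace(" ", "")
syllables = [stream[i:i + 2] for i in range(0, len(stream), 2)]

print(segment(syllables))
# ['tupiro', 'bidaku', 'golabu', 'padoti', 'tupiro', 'golabu', ...]
```

Inside a word, each syllable predicts the next almost perfectly, while across word boundaries the following syllable is far less predictable, so a simple threshold on the transitional probability is enough to recover the words in this toy stream.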
For Flo and her team, the goal of the study was to understand how babies make sense of words the first time they hear them. Spoken language may look as if it is built from discrete words, but in continuous speech those words tend to blur together.
“One of the first steps to learn language is to pick out the words,” she added.
The study also provides insights for new parents on how quickly newborn babies absorb information just by listening. As infants listen to individual words, their brains respond differently to words they have already heard than to slightly different words.
An earlier study published in PNAS offers a similar perspective: Babies begin to learn words, and to understand what they mean, before they even start talking. In that study, Dr. Elika Bergelson of Duke University in North Carolina and her team found evidence that infants grasp how words and concepts relate to one another.
Using eye tracking techniques, the team studied six-month-old babies to see whether they recognized words in isolation or through their connections to other words. When a baby heard a word such as "car," the infant looked longer at the picture of the named object when it was paired with an unrelated object (such as a picture of juice) than when it was paired with a related one (such as a picture of a stroller).
The results showed that infants may understand words well enough to tell them apart from unrelated concepts. They also recognized words better when these were said in the presence of the object described, such as saying "this is your spoon" while actually holding it.
The researchers also noted that their study carries a practical message for new parents: it encourages them to talk more with their babies.
"I think one thing suggested by our work is that talking more with young babies, and focusing in on what they're looking at and caring about certainly won't hurt -- and it might even help -- with early language development,” added Bergelson. “Treat your baby like a real conversational partner.”
Read more interesting stories on brain function at Brain.news.