One of a baby’s earliest sounds is "da," which can refer to different things or nothing at all. How do adults make sense of this?
Using thousands of hours of transcribed audio recordings of adult-child interactions, researchers at Harvard and MIT built computational models that let them reverse-engineer how adults interpret what small children are saying.
- Models based only on the sounds children produced performed poorly at predicting what adults thought children said.
- The most accurate models made predictions based on extensive archives of adult-child conversations, which provided context for children’s words.
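The contrast between the two kinds of models can be illustrated with a toy noisy-channel calculation. This is not the researchers' actual model; the words and probabilities below are invented purely to show how a contextual prior can tip the interpretation of an ambiguous sound like "da":

```python
def interpret(sound_likelihood, context_prior):
    """Toy sketch: combine acoustic evidence with conversational context.
    Scores each candidate word by P(sound|word) * P(word|context),
    then normalizes so the scores sum to 1."""
    scores = {w: sound_likelihood[w] * context_prior[w] for w in sound_likelihood}
    total = sum(scores.values())
    return {w: s / total for w, s in scores.items()}

# Hypothetical numbers: on acoustics alone, "da" is equally
# consistent with "dog" and "dad"...
likelihood = {"dog": 0.5, "dad": 0.5}

# ...but in a conversation about the family pet, context strongly
# favors "dog".
prior = {"dog": 0.9, "dad": 0.1}

posterior = interpret(likelihood, prior)
best_guess = max(posterior, key=posterior.get)  # "dog"
```

A sound-only model would stay stuck at 50/50 here; adding the contextual prior resolves the ambiguity, which is the gap the bullet points above describe.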
“This problem of interpreting what we hear is even harder for child language than ordinary adult language understanding, which is actually not that easy either, even though we’re very good at it.” —Roger Levy, a professor of brain and cognitive sciences at MIT
The findings: Adults are highly skilled at making these context-based interpretations, which may provide crucial feedback that helps babies acquire language.
Why it matters
Understanding how adults interpret children's speech could uncover how this interaction contributes to children’s language development.
“An adult with lots of listening experience is bringing to bear extremely sophisticated mechanisms of language understanding, and that is clearly what underlies the ability to understand what young children say.” —Roger Levy, MIT
What's next: Researchers aim to build a model of how children use adult feedback and what they expect adults to understand.
1 big idea
Language is a two-way street. Understanding others depends on relevant knowledge and expectations.