Babies may not be able to clean or feed themselves, but they're pretty damn remarkable at a number of other things (things besides being pudgy and adorable). For instance, psychologists at NYU recently demonstrated that infants as young as nine months are capable of differentiating between speech and non-speech sounds, even when the sounds aren't made by a human. These findings, say the researchers, suggest that a baby's ability to perceive speech and language is more sophisticated than we once thought.
Not all speech is the same. It can come from the mouth of a person, it can come from the mouth of a parrot, and it can even be synthesized digitally by a computer. Speech signals generated by nonhuman sources are often considered by psychologists to be "atypical" or "degraded," but as NYU investigators Athena Vouloumanos and Hanna Gelfand point out in a recent issue of Developmental Psychology, that doesn't prevent adults from recognizing them as speech, or distinguishing them from non-speech (the scratch of static, or the sound of a cough, for example). This ability to decode intelligible information from an atypical signal, explain the researchers, "is a hallmark of speech perception [in human adults]."
Far less clear, however, is how early this capacity for speech recognition takes shape in children. To find out, Vouloumanos and Gelfand monitored the behavior of nine-month-old infants who were presented with recordings of speech and non-speech sounds made by a human and by a parrot.
Speech sounds, for both the parrot and human recordings, included words like "truck," "treat," and "dinner." Non-speech sounds included whistles and throat-clearing for human signals, and squawks and chirps for parrot signals. In order to assess the influence of visual cues on infant speech perception, human and parrot recordings were presented at the same time as an image of either a checkerboard or a human face. The folks over at Futurity provide a tidy summary of the team's findings:
Infants listened longer to human speech compared to human non-speech sounds regardless of the visual stimulus, revealing the ability [to] recognize human speech independent of the context.
Their findings on non-human speech were more nuanced. When paired with human-face visuals or human artifacts like cups, the infants listened to parrot speech longer than they did non-speech, such that their preference for parrot speech was similar to their preference for human speech sounds.
However, this did not occur in the presence of other visual stimuli. In other words, infants were able to distinguish animal speech from non-speech, but only in some contexts.
"Our results show that infant speech perception is resilient and flexible," explains Vouloumanos. "This means that our recognition of speech is more refined at an earlier age than we'd thought."
"Parrot speech is unlike human speech, so the results show infants have the ability to detect different types of speech, even if they need visual cues to assist in this process."
As I pointed out yesterday in this post on the limits of human perception: it's when our senses operate together that a stimulus achieves its greatest impact. Meanwhile, studies like this one are revealing that the combined effect of visual and auditory perception plays an important role in our cognitive development, as well.
Checkerboard face via Shutterstock