Culture Magazine

Is Language an Instinct?

By Realizingresonance @RealizResonance


In his book, The Language Instinct, Steven Pinker makes the case that universal grammar is an innate human ability acquired through evolution. This runs contrary to the Standard Social Science Model, which assumes that the human mind begins as a blank slate, ready to absorb all knowledge through experience with the environment. On that account, babies begin learning language through a form of “motherese”, words taught slowly by a parent. Linguist Noam Chomsky also proposed the idea of universal grammar, but he does not think natural selection played a part, a position the biologist Stephen Jay Gould has concurred with. The two think that innate grammar can be explained as a side effect of other innate functions of our brain. An alternative explanation for the innateness of language is that it is a divine gift from God, as suggested by Jocelynn Potter and Jeffrey Johnson. I will describe Pinker’s evidence and weigh it against these rivals using inference to the best explanation.

Complex languages with fully developed grammatical rules are found in all human societies, showing that this feature of culture is universal. The universal sophistication of language is virtually unmatched by any other cultural invention, whose proficiencies vary far more between societies, and even between individuals within a society. Children re-invent languages as a matter of course from generation to generation. Extreme examples of this process at work can be seen in the children of multi-lingual immigrant communities. The Japanese, Korean, Filipino, and Portuguese settlers who moved to Hawaii in the 1890s to work the sugar fields developed a pidgin that allowed them to mix their various languages into a simple form of expression, using disjointed nouns, adjectives, and verbs with the aid of physical gestures. However, the next generation of children developed this simple pidgin into a full-blown, grammatically intact hybrid language known as a creole. Apparently children can create a new language organically, without any express teaching by their parents or other adults.

Despite the common belief that the language of our thoughts is the same as our native tongue, thought is better understood as a kind of “mentalese”. Mentalese finds outward expression via language, but it is much more complex, and words often cannot fully articulate what we are thinking. Pinker calls the belief that language and thought are synonymous a “conventional absurdity” (47), and he indicates that many claims made by proponents of linguistic determinism have been debunked, like the Great Eskimo Vocabulary Hoax. Eskimos do not actually have 400 words for snow, but roughly 14, which is just about the same number as English, if words like powder and avalanche are considered snow terms in both languages. Research has also shown that babies, and even monkeys, can think in logical, reasoned steps without the benefit of language. Finally, many artists and creative people have described their thoughts as visual. Given this evidence, thoughts cannot be merely mental words.

The complexity of syntax, and the unlearned functioning of grammar, illustrate that innate complexity in the mind facilitates learning, rather than learning itself producing this complexity. Grammar is a “discrete combinatorial system” (75): reversing the order of the three words in a phrase like dog bites man changes the entire meaning of the phrase. This makes language extremely vast and practically infinite. Also, because grammar determines how combinations of words in general can coalesce to express meaning, it is independent of the specific meaning that a particular string of words conveys. We can make sense of phrases we recognize as ungrammatical, like the child seems sleeping, and we can correspondingly recognize correct grammar in nonsensical statements like Chomsky’s colorless green ideas sleep furiously (79). In addition, there is evidence that syntax is a top-down phenomenon, in which sentence formation happens not word by sequential word but through trees of discrete noun and verb phrases, described by X-bar theory. Word-chain devices, on the other hand, are ineffective at forming syntax, because sentences often have words that correspond to each other at a distance, as with if-then and either-or constructions.
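As a toy illustration (my own sketch, not Pinker’s notation), a handful of phrase-structure rules can be expanded into trees, and the same small rule set licenses dog bites man and man bites dog as distinct sentences:

```python
import itertools

# A toy phrase-structure grammar (illustrative only): a few discrete
# rules combine to license many distinct, order-sensitive sentences.
GRAMMAR = {
    "S":  [["NP", "VP"]],        # a sentence is a noun phrase plus a verb phrase
    "NP": [["N"]],
    "VP": [["V", "NP"]],
    "N":  [["dog"], ["man"]],
    "V":  [["bites"]],
}

def expand(symbol):
    """Yield every word sequence derivable from a grammar symbol."""
    if symbol not in GRAMMAR:    # terminal word
        yield [symbol]
        return
    for production in GRAMMAR[symbol]:
        # combine the expansions of each child in order (the tree structure)
        for parts in itertools.product(*(expand(child) for child in production)):
            yield [word for part in parts for word in part]

sentences = [" ".join(words) for words in expand("S")]
print(sentences)  # 'dog bites man' and 'man bites dog' appear as separate sentences
```

Adding even one more noun or verb to the rule table multiplies the set of licensed sentences, which is the “practically infinite” character of a discrete combinatorial system.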

Words themselves display top-down structures. Word morphology is akin to sentence syntax in the way we use a discrete combinatorial system to change the meanings of words in predictable ways. “Inflectional” morphology, like the use of -s to pluralize nouns and -ed to signify past tense, is less common in English than “derivational” morphology, such as the suffixes -able, -ize, -ity, and -ism, to name a few. For example, adding -able to the end of a verb changes it to an adjective meaning capable of having that verb done to it, as in skate to skateable. We can tell that morphological rules exist in the mind independent of brute memorization of sound sequences, because we inherently intuit the change from electric to electricity as a move from an adjective to a noun due to the suffix -ity, even though the hard -c sound also changes to a soft -c. Research has shown that children can guess the correct contextual meaning of a new word from the different morphologies presented (Pinker 152). The intuition of these rules cuts down dramatically on the number of words we need to stuff into memory, which has been estimated at 45,000 words for the average American high school graduate (Pinker 144). Also, the specific sound-to-meaning correspondence that defines a particular word is completely arbitrary, as testified to by the sheer quantity of different languages and dialects.
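A hypothetical rule table (my illustration, not the book’s) can mimic this intuition: each derivational suffix predictably maps a base category to a derived category, so new words need not be memorized one by one:

```python
# Hypothetical derivational rules: suffix -> (base category, derived category).
DERIVATIONS = {
    "-able": ("verb", "adjective"),     # skate -> skateable
    "-ity":  ("adjective", "noun"),     # electric -> electricity
    "-ize":  ("noun", "verb"),          # union -> unionize
}

def derive(word, category, suffix):
    """Apply a derivational suffix if its rule matches the word's category."""
    base_cat, derived_cat = DERIVATIONS[suffix]
    if category != base_cat:
        raise ValueError(f"{suffix} attaches to a {base_cat}, not a {category}")
    return word + suffix.lstrip("-"), derived_cat

print(derive("skate", "verb", "-able"))         # ('skateable', 'adjective')
print(derive("electric", "adjective", "-ity"))  # ('electricity', 'noun')
```

Real English morphology also adjusts sounds and spellings (the hard-to-soft -c shift in electricity, for instance), which this naive concatenation deliberately ignores; the point is only that the category change is rule-governed.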

The endless features of discrete combination don’t begin with words, but even deeper, with the phonemes that are merged together to create words. So we build sentences from words, which we build from morphemes, which we build from phonemes. Phonemes are the sounds that roughly correspond to letters, but the information in each is spread throughout the word and not in the exact sequence we think we hear. This is due to the many coordinated steps our speech organs must take to produce sounds by shaping the flow of breath from our exhalations. As air is expelled from the lungs, sounds are created by the larynx, soft palate, lips, tongue tip, and tongue body, through their different positions and sequences. The construction of phonemes also follows the now-familiar tree pattern that words and sentences exhibit. A syllable begins with an onset and ends with a rime, as in fr-eak, with further rules for branching consonants and vowels. In multi-syllable words, the branches of syllables form groups before forming the word. Phonological rules introduce redundancy that lets predictable patterns be picked out of speech, which further helps us understand different dialects and pronunciations of words. Speech recognition software has a hard time working properly for anyone other than the speaker it is attuned to, demonstrating that the informational content derived from the sound waves alone varies vastly from speaker to speaker. This indicates a top-down comprehension of words that humans perform without a thought, and which is not easily replicated by a machine.

Not only do computers have a difficult time recognizing anything more than simple words in human speech, the most sophisticated attempts at artificial intelligence have also failed miserably at comprehending the meaning of a human sentence. Humans understand sentences by parsing them into noun phrases and verb phrases, keeping dangling sentence fragments in memory for context while matching incoming words against the tree-structure rules set in motion by the parts already in memory. The combined semantics of each part form a holistic meaning. We have trouble parsing sentences that have other sentences embedded in them, in the manner of an onion, further indicating that memory alone is not sufficient for comprehension, since the sequencing rules themselves are what confound us. The rules for structure and context help us arrive at the correct meaning of sentences even where there is potential for ambiguity, as with time flies like an arrow. Metaphors and other creative uses of language to convey complex meaning show that conversations are understood in much deeper contextual terms than by merely accessing a mental dictionary for each word.
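The parsing strategy described above can be caricatured in a few lines (a recursive-descent sketch of my own, not Pinker’s model): unfinished phrases are held open on the call stack while each incoming word is matched against the expected phrase structure:

```python
# A recursive-descent caricature of human sentence parsing. While the VP
# is being parsed, the still-open S is "in memory" on the call stack.
NOUNS = {"dog", "man"}
VERBS = {"bites"}

def parse_np(words, i):
    """Match a noun phrase starting at position i; return (tree, next index)."""
    if i < len(words) and words[i] in NOUNS:
        return ("NP", words[i]), i + 1
    raise SyntaxError(f"expected a noun at position {i}")

def parse_vp(words, i):
    """Match a verb phrase: a verb followed by an object noun phrase."""
    if i < len(words) and words[i] in VERBS:
        obj, j = parse_np(words, i + 1)  # the VP stays open until its NP closes
        return ("VP", words[i], obj), j
    raise SyntaxError(f"expected a verb at position {i}")

def parse_sentence(sentence):
    words = sentence.split()
    subject, i = parse_np(words, 0)
    predicate, i = parse_vp(words, i)
    if i != len(words):
        raise SyntaxError("unparsed trailing words")
    return ("S", subject, predicate)

print(parse_sentence("dog bites man"))
# ('S', ('NP', 'dog'), ('VP', 'bites', ('NP', 'man')))
```

Each level of embedding adds another open phrase to be held in memory, which hints at why deeply onion-like sentences overwhelm us even though the rules themselves are simple.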

Pinker uses the biblical parable of the Tower of Babel to illustrate the intuition that the world’s great diversity of languages derived from a single proto-language. However, rather than this multitude of languages arriving instantly in one divine event, the change happened over time in an evolutionary progression similar to the propagation of genes. Learning, innovation, and migration are the factors that explain the branching of languages. Having the innate ability to apply a universal grammar to a culturally learned language, instead of being born knowing the whole language, allows us to adapt it to new discoveries and communicate with others. Innovation, or variation, within language occurs in many ways, one cause being the reanalysis of rapid speech into new meanings or derivations. For example, the Spanish word naranja became the English orange, the initial n having been lost through reanalysis. Migration and separation force the branching of languages by isolating groups so that changes happen independently within each, until a translator is eventually needed for communication between them.

Language acquisition happens in stages, and the ability to learn a language tends to expire. Babies begin with cries and grunts, moving on to repetitive syllables by seven to eight months, and strings of variable syllables by the end of the first year. They can comprehend words and their meanings before the first year is up, and not long after that they are able to speak simple words. At about eighteen months the language explosion happens: children learn at a minimum rate of a new word every two minutes and can make simple two-word sentences. By their late twos or early threes, children have acquired the ability to construct fully grammatical sentences with hardly any errors. Anecdotal evidence from abused or feral children who were deprived of social contact and never learned to speak reveals that after puberty the ability to pick up a language is severely diminished. I have tried to teach myself Spanish and Dutch, and I can speak to the difficulty of learning a second language as an adult. “Critical periods” (Pinker 298) for learning appear in the animal kingdom as well and suggest an innate biological basis. The linguistic virtuosity we display as kids does not age well.

Different aspects of language are related to different parts of the brain, which could be thought of as the language organs. Damage to Broca’s area in the brain’s frontal lobe can cause Broca’s aphasia, a difficulty with producing and comprehending grammar. A complementary disorder is Wernicke’s aphasia, in which trauma to Wernicke’s area causes people to uncontrollably use the wrong words for their intended meaning, though with proper grammar. Stuttering, dyslexia, and Specific Language Impairment (SLI) are disorders that impair people’s abilities to process language to varying degrees and show heritability through genes. The argument for innateness is supported by the evidence of specific genes and brain regions relating to different aspects of language.

The evidence above all supports the hypothesis that language is innate and instinctual for humans. Now that Pinker’s case is relatively well laid out in this regard, we need something more to demonstrate that this innateness is the result of natural selection, rather than accident or divine gift. Chomsky and Gould argue that not every biological fact about humans needs to be accounted for by natural selection, and that language is one such fact. Gould has suggested the comparison to a “spandrel”, the roughly triangular space left between an arch and its rectangular frame, which is an artifact of the surrounding architecture rather than a deliberately designed feature. Pinker refutes this argument. Although there is no specific evidence of natural selection producing grammar organs, the survival benefit of mastering grammar is clear, especially in the context of sexual selection and social cohesion. This, combined with the fact that no other animal on Earth comes close to demonstrating the human mastery of language, suggests a greater likelihood that natural selection was the cause.

I have schematized Pinker’s evidence below:

E1: New grammatical languages known as Creoles are created by children without being taught.

E2: Language and thoughts are not the same.

E3: Top-down tree formations in syntax, morphology, and phonetics demonstrate vast complexity through discrete combinations.

E4: Humans can understand and parse language in ways a computer cannot be programmed to.

E5: Fluent language acquisition has a biological window of development.

E6: The brain and genes have been linked to different aspects of language.


T0: Universal grammar is innate in humans.

E7: Having grammar confers survival benefits that make it more likely to be passed down than not having grammar.

E8: Only humans possess grammar.


T’0: Grammar is instinctual and evolved in humans through natural selection.

The rival explanations to Pinker’s account fit into a matrix of conclusions. Pinker, Chomsky, Gould, and Potter all agree that human language is equipped with a universal grammar, but their accounts as to why this developed differ. Pinker says natural selection, Gould and Chomsky say “spandrels”, and Potter says God was responsible. The Standard Social Science Model, on the other hand, denies that there is any universal grammar to begin with. I have ranked these alternatives below:

T’0: Grammar is instinctual and evolved in humans through natural selection.

T1: Grammar is instinctual and is an accidental byproduct of human evolution.

T2: Grammar is instinctual and is a gift endowed by a divine Creator.

T3: The mind is a blank slate and all grammar must be learned.

I think Pinker’s argument is the best explanation for the evidence, but I would not consider the runner-up that Chomsky and Gould propose too far below it on the list. Pinker admittedly does not have substantial positive evidence for natural selection that he can point to, but the argument that language is just one of those amazing skills that humans acquired by accident through the acquisition of other adaptations is less likely. The last two options are much less likely than the first two given the evidence. The reason I have ranked the option of divine Creator above the Social Science Model is because I think Pinker has made such a compelling case for innateness that if it turned out that evolution could not account for language, God would need to be invoked before the blank slate argument could be revived.

Jared Roy Endicott


Works Cited

Pinker, Steven. The Language Instinct: How the Mind Creates Language. New York: HarperCollins Publishers, 1994.

Johnson, Jeffrey L., and Jocelynn Potter. “The Argument From Language and the Existence of God”. Journal of Religion.
