
The Language Hoax: Why the World Looks the Same in Any Language - John McWhorter (2014)

Chapter 2. Having It Both Ways?

PART OF WHY IT can seem so counterintuitive that language does not significantly “shape thought” is that it is so natural to suppose that fundamentally, what languages are like parallels what their speakers are like.

We could reasonably assume that the mechanics and nuances of the Burmese language correspond to being Burmese in some way that they do not correspond to being Icelandic. We may question the idea that language by itself shapes thought significantly, especially after reading the previous chapter. Yet we might assume that, nevertheless, cultures’ thought patterns must somehow correspond to the languages they are couched in. After all, as I have specified, it isn’t that culture never affects how language works. One could start with the Pirahã’s innumeracy, meaning they don’t have numbers, and then think of the Guugu Yimithirr’s geographical needs and how they process front and back, and go from that to assuming that overall, what people are like is how their languages work. Not just in marginal splotches, but overall. Why not, really? Language is a part of a culture, and to speak, to express yourself, is what it is to be. It would certainly seem that the way a language works must reflect, then, what its people are like. Linguists are amply familiar with being asked whether this idea is true by students and by audience members in talks for the general public, and it fairly drips from a growing literature that calls attention to the number of obscure languages going extinct.

In that state of mind, seeking to make sense of things, it will be natural to assume that some kind of parallelism between language and what its speakers are like is salvageable with adjustment. As such, the Whorfian debate lends itself to an eternally useful approach: “Couldn’t it work both ways?”

Thus: maybe to say that language creates thought, and therefore what a people are like, is oversimplifying. Yet language and thought could exist in a complementary relationship. Maybe a people’s thoughts, their culture, have an effect on how their language works, whereupon it would then hardly be implausible that the language then reinforces the thoughts and the culture connected with them. Thus we can account for why trying to see things going in one direction from language to thought doesn’t work: the reality could be more holistic.

That argument is reasonable. It is, more specifically, appealing. It gratifies one to identify a system rather than a mere one-way cause and effect. Eternally warned not to be reductive, and steeped in an intellectual culture that stresses webs, feedback loops, and complementarities in fields like ecology, evolution, and quantum physics, we seek the approach that entails mutual reinforcement, or, in a near-irresistible anthropomorphizing sense, cooperation. There is quiet yet potent rhetorical power here. Picture the gesture that often accompanies such propositions, rotating the hands around one another, and note how the mere sight of someone doing that makes you want to nod.

Even after some acquaintance with languages and linguistics, it will seem compelling to many that languages evolve to support the cultures of those who speak them. Like animals, languages evolve over time: dinosaurs became birds, Latin became French. Like animals, languages have family relationships: as manatees and dugongs are branches on one tree of mammal, French and Spanish are puppies in a brood born of Latin. Animals can go extinct, as can languages.

If so, then just as animals evolve according to the needs of their environment, don’t languages evolve according to the particular, culture-internal needs of their speakers?

Actually, no. Not in any significant way.

Words versus Whorfianism

That seems counterintuitive. Languages evolve according to the needs of their speakers: what could seem more unassailable? And yet the more one knows about languages in the worldwide sense, the more hopeless the proposition becomes.

This is not always easy to accept. At a talk I once gave on Whorfianism, an earnest student asked me, “But why would people have something in their language if they didn’t need it?,” clearly finding the notion otherwise almost off-putting. It’s a good question, in that it points up the key juncture of misunderstanding: the very idea that language is primarily a cultural tool rather than primarily a shambolically magnificent accretion of random habits.

Note that I wrote “primarily.” I should be clear that my claim is not that language is utterly divorced from practicality, or even from certain particularities of its speakers. Of course all languages serve the basic needs of communication. However, I doubt many find that counterintuitive, and it isn’t the focus of Whorfianism. Who is impressed that a language has words for things, including churning out new ones as new objects emerge within the culture? Benjamin Lee Whorf certainly wasn’t—he was on to something much more specific.

English has a word for canines of a certain sort: dog. English has words for more specific things important in the cultures that speak it: computer, upload, blog, and even quirkier things like inferiority complex and jump the shark. In the same way, Guugu Yimithirr makes heavy use of the words north, south, west, and east because direction is highly important to its speakers. That kind of thing, terminology for realities, is no more special in a tiny language spoken in the rain forest than it is in Los Angeles. It is quite different from the more mysterious and dramatic hypothesis that less concrete aspects of a language can make the world look more colorful, or time feel more vertical. Whorf was clear about this, referring to a person’s

unperceived intricate systematizations of his own language—shown readily enough by a candid comparison and contrast with other languages, especially those of a different linguistic family. His thinking itself is in a language—in English, in Sanskrit, in Chinese. And every language is a vast pattern-system, different from others, in which are culturally ordained the forms and categories by which the personality not only communicates, but analyzes nature, notices or neglects types of relationship and phenomena, channels his reasoning, and builds the house of his consciousness.

Whorf, then, was referring to something deeper, and more interesting, than the fact that rain forest people have names for things that matter to them. He was supposing that the very essence of how that people’s language works, its constructions, overall grammatical patterns, what would be challenging in trying to learn how to form sentences in it, is profoundly consonant with what it is to be them, rather than anyone else.

If stressing instead the more mundane fact that a rain forest people have words for their tools, customs, and concerns has any purpose, it is not bolstering Whorfianism but dissuading dismissive views of indigenous, unwritten languages. Make no mistake, that problem is real: a traveler to Rossel Island off of Papua New Guinea once had this to say about the “dialects” she heard there: “Any that we heard were scarcely like human speech in sound, and were evidently very poor and restricted in expression. Noises like sneezes, snarls, and the preliminary stages of choking—impossible to reproduce on paper—represented the names of villages, people, and things.”

Yet the “dialects” she thought of herself as hearing were one magnificent language, called Yélî Dnye, which is expressed not in sneezes but in ninety different sounds, compared to English’s paltry forty-four. It has over one thousand prefixes and suffixes, and it’s hard to recognize “restricted” expression in a language with, for example, eleven different ways of saying “on” depending on whether something is on a horizontal surface, a vertical one, a peak, whether something is scattered, whether something is attached to the surface, and so on.

However, I take the liberty of presuming that anyone reading this book readily sees the error in absurd caricatures such as the one of Yélî Dnye. An impression does persist even among the educated that unwritten, small languages are likely less complex than “real” languages like English and French (an impression I work against in What Language Is). However, no one interested in language thinks anyone goes about with a language little better than what animals are stuck with. As such, our interest in whether language evolves for the purposes of its speakers will concern the meatier Whorfian orientation. The question is not “Do languages develop words for things their speakers often talk about?” Of course they do, and we can move on to the more suspenseful question that really interests us: “Do languages evolve according to ways of thinking?”

Here is where a “complementary” take on Whorfianism might seem useful, especially since we know that external conditions can influence language—such as the Guugu Yimithirr direction words—and that conversely, language can influence how people process those external conditions, such as material markers in Japanese and Yucatec.

We might propose that just as Guugu Yimithirr has its directional marking because of its speakers’ environment, the material suffixes in Yucatec must be there because of something in their environment that got them thinking that way in the first place. Then, if that works, certainly it is worth investigating whether among the Guugu Yimithirr the language also “reinforces” their sense of direction just as the sense of direction shapes their language. Thus we could see a kind of feedback loop—the culture affects the language, the language affects the culture, in a reciprocal relationship in which there is no point designating a chicken and an egg, at least not in the here and now.

The appeal of this “holistic” sense of language and thought would be in acknowledging that language does not create a “worldview” by itself while still preserving a sense that languages are like their speakers, and thus symptoms of diversity in the same way that cultures are. However, there is a fragility in the venture that tips us off to the reality. What would it be about the Yucatec’s environment that led them to be more sensitive to what things are made of than Estonians, Mongolians, or especially, countless other Native American groups whose languages are not sensitive to material in the Yucatec way?

That is, if told that any of these other peoples actually were, as they in fact are not, more sensitive to what things are made of than English speakers, would we find it any more or less plausible than hearing of it about the Yucatec? And meanwhile, what could it be about Russians that makes them name more blues than other people?

Try to link what people are like to certain words and expressions for obviously cultural features in their language and you’ll find plenty. No one would ever have thought otherwise. But try to link what people are like to how their languages work in a more general sense, along the lines of Whorf’s “unperceived intricate systematizations” such as whether they classify things according to shape or material or whether they have a future tense, and all you get is false leads and just-so stories. It seems so tempting and you keep reaching for it, but always and forever, poof and it’s gone. It’s like trying to get hold of a soap bubble.

The variety among the world’s languages in terms of how they work is unrelated to the variety among the world’s peoples, and thus Whorfianism cannot be saved even by fashioning a dynamic two-way relationship between cultures and the languages that they are spoken in. That cannot help but seem a strange declaration on first glance, but in this chapter I will demonstrate its empirical motivation.

Rules of the Rain Forest?

Evidential Markers

An eminently tempting case for linking how a language works and what its speakers are like is something that is interesting about another language of the Amazon called Tuyuca. In this language, to make a normal statement you have to include how you know that it’s true, or whether you do. This is so deeply entrenched in how you express yourself in Tuyuca that the way you explain how you know something is not with a phrase like “I heard” or “so they say,” but with certain suffixes that you tack on to sentences. This is similar to the way we’re used to doing it in English to make the past tense (-ed) or the plural (-s).

So, one does not, as a proper Tuyuca, say just He’s chopping trees. You have to add one of those suffixes. I am showing the suffixes appended to the English version of the sentence for the sake of clarity—obviously, it is rare that a Tuyuca chooses to express herself in English!

He is chopping trees- (… I hear.)

He is chopping trees-í (… I see.)

He is chopping trees-hĩi (… apparently, but I can’t tell for sure.)

He is chopping trees-yigï (… they say.)

And that’s just a sample. There are different versions of the suffixes for the past tense, for whether you are referring to a man, a woman, the person you’re talking to, yourself, and so on.

Linguists call these evidential markers. Any language has ways of doing what evidential markers do to an extent. In English, when we say after the doorbell rings That must be the Indian food, the must means roughly the same thing as the Tuyuca suffix used to indicate that you know something because of hearing it. However, Tuyuca takes this kind of thing to an extreme.

Here is where the “holistic” kind of approach may beckon. On the one hand, the previous chapter may have conditioned a skepticism about the classic Whorfian response to data like this. We might resist the idea that having evidential markers makes people magically sensitive to where information came from. Science would be behind us on that. Anna Papafragou at the University of Delaware and her colleagues have shown that Korean children, although having learned the evidential markers in Korean, are no better than English-speaking children at thinking about sources of information.

Yet there may remain a temptation to assume that there must be something about being Tuyuca that conditions this close attention to sources of information: that the culture is feeding into the language. One could suppose it must have something to do with living in a rain forest where one must always be on the alert to dangerous animals, or to the presence of other animals on which one depends for sustenance. Wouldn’t being a Tuyuca seem to require constant attention to whether one hears something, whether one can depend on someone’s statement that there is a new source of a certain food somewhere far off, and so on?

This sounds eminently plausible when we think only of the Tuyuca. However, as odd as evidential markers seem to an English speaker, they are actually quite common worldwide. Crucially, to perceive any kind of link between culture and evidential markers from a worldwide perspective is—and this is putting it the most open-mindedly—extremely difficult.

Basically, to link evidential markers to what a people are like is to say that some groups are more skeptical than others. However, that is a dicier proposition than it may seem. Evidential markers are rare in Europe, for example, which is much of why they seem so exotic to us. However, who among us is prepared to say that the Ancient Greeks, who produced some of the world’s first philosophical treatises scrupulously examining all propositions no matter how basic, and lived in a society always under siege from other empires as well as from rival Greeks themselves, were a relatively accepting, unskeptical people with only a modest interest in sources of information?

Or, I might venture: if you know any Greeks today, would you process them as not especially skeptical? I, for one, would say no. Yet Greek has no evidential markers along the lines of Tuyuca’s. It never has had them, doesn’t now, and shows no signs of ever developing them. That’s true even though, if it did, certainly many would readily link the evidential markers to the grand old Socratic tradition and its influence on Greek thought.

Or, if the Tuyuca have evidential markers because their culture requires them, then why in the world is the only European language that has anything like them Bulgarian? I happen to know some Bulgarians, and I would say that they are pretty skeptical as people go—but no more so than people from many other countries. What is it that Bulgarians have in common culturally with the Tuyuca tribespeople? And more to the point, what do they have in common with Tuyuca tribespeople that Czechs, Macedonians, and Poles do not? Note: it won’t do to say that maybe Bulgarian needed the evidential markers in earlier times when Bulgarians were living closer to the land with less technology. If languages furnish speakers’ “needs,” then why wouldn’t the evidential marking have been let go long ago once Bulgarians had central heating and canned food and no longer “needed” them?

Languages evolve according to the needs of their speakers: savor that sentence, but then venture to ask how that squares with Bulgarians being the only Europeans who “needed” evidential markers. Really: why would, say, the traditionally philosophic French, ever defending their geopolitical position, not “need” evidential markers? But no, only Bulgarian—just Bulgarian!—evolved according to that “need”?

Move eastward and another language with evidential marking is Turkish. Again, why them in particular, if evidential marking has anything to do with culture? I have actually encountered a Westerner who had spent some years in Turkey who happily—but with a certain insistence—assumed that it was because Turks were hypersensitive to sources of information. However, he had come to that conclusion based on the evidential marking in the language, not on having independently noted that Turks were hard to convince of anything. Are Turks really more wary of sources of information than, say, Persians? The idea will ring a bell with few if any who are familiar with people of both extractions, and no anthropological study I am aware of makes such an observation or even designates Turks as defined by an extreme wariness of rumor. In fact, if anything, it is Persian culture that is known explicitly as particularly skeptical. But Persian doesn’t have evidential markers.

The facts on where we find evidential markers even suggest that seeing them as cultural disrespects an alarmingly vast number of the world’s peoples. Basically, skepticism is a form of intelligence. It is certainly a keystone of sophisticated thought. It would not be inappropriate to even state, for general purposes, that skepticism—that is, a dedication to applying one’s mind to taking the measure of things before coming to a judgment—is the heart of intelligence. So: on the one hand, we celebrate the Tuyucas’ evidential markers as indicating their diligent skepticism. But then, something confronts us: evidential markers are all but unheard of in Africa or Polynesia.

We must restate that gorgeous proposition here: Languages evolve according to the needs of their speakers. But this time, cherishing that proposition means that Africans and Polynesians are not hypersensitive to sources of information. They are not skeptical. They are apparently not—let’s face it, this is where the logic takes us—terribly bright. We gifted the Tuyuca with intelligence but must deny it to Africans and Polynesians. Note that this requires harboring such an idea despite how many Africans and Polynesians live in intensely challenging environments, living lives quite similar to those of the Tuyuca. But it would seem that at the end of the day, the Tuyuca rose to the challenge with evidential markers while Africans and Polynesians just shrugged and hoped for the best.

Few will desire to rest there, and as such, we might open up to supposing that evidential markers are less linked to culture than it might seem when we encounter them in one group like the Tuyuca. Evidence for that perspective in fact abounds. If evidential markers emerge according to the “needs” of languages’ speakers, then why are they common in the Native American languages of western North America but not the ones in the east? Is it really true that Native Americans living in the Bay Area—not exactly the most rigorously demanding environment—“needed” to be more hypervigilant to sources of information than the ones the Pilgrims endured in those long, frigid winters in the Northeast? (“Squanto says there are blackberries still growing three miles that way …”)

Plus, the world over, one language will have evidential markers while the one next door, spoken by people living under the same circumstances, will not. In Australia an Aboriginal language called Kayardild has evidential markers—but if they emerged because its speakers “needed” them, then why did the Yukulta language right across the water not have evidential markers? (Yukulta is now extinct, but it was described while some of its speakers were still alive.) The Yukulta lived the same life as the Kayardild, and in fact their languages were basically variations on a single language, in the way that Swedish and Norwegian are.

Evidence of this kind goes on and on. Despite the initial plausibility of thinking Tuyuca has evidential markers because its speakers have a specific need for them, when we pull back the lens, it is clear that evidential markers are not distributed according to what cultures are like. In fact, there is a coherent explanation for where we find evidential markers and where we don’t. However, that explanation is not based on cultural needs. The explanation is, quite simply, chance.

The Irrelevance of Necessity

The evidence suggests that evidential markers also tend to spread from one language to another as a kind of grammatical meme carried by bilinguals, in which case the markers are blithely scattered across a wide range of cultures quite unconnected to how vigilant any given one of them is about scuttlebutt and animal noises. This is, essentially, another rendition of chance.

There is a comfort in this reality. At the end of the day, how much of a compliment is it when a Westerner praises a group of people for being skeptical? There is a certain condescension in it, a hair’s breadth from “Good show, you all are as bright as us!” A writer I shall not name praises a Third World people on the acknowledgments page of a book for, among other things, being witty and “irreverent”—that is, what goes often under the name “skepticism.” But why wouldn’t they be witty and irreverent? Which Homo sapiens aren’t? The passage is deeply condescending. And yet, for whatever it’s worth, the language of the people has no evidential markers!

However, it does have definite and indefinite articles, words for the and a. Those little words allow a language’s speakers to distinguish something already mentioned (the fact that some languages have evidential markers) from something new to the exchange (a related point about definite and indefinite articles). Maybe we can save this particular unskeptical group by celebrating its intelligent distinction of the definite from the indefinite? Not really, because overall, having words for the and a, as utterly normal as it feels to an English speaker, is something of a European kink. East of, roughly, the Baltic and the Balkans you don’t find much the and a.

As such, if the and a are based on speakers’ needs, then we have to say that Western Europeans are more given than most of the world’s peoples to distinguishing things already mentioned from things just brought up. Not only would this make little sense and even seem a tad arrogant, but there are microcosmic problems as well. Pity the ethnographer charged to determine why Finns have no “need” to distinguish the and a whereas the Dutch do. Plus, even if we could cobble together a solution to these conundrums (Finns are more reserved than the Dutch, and so they don’t need to … be as … specific …??) the reality of things throws us another curve ball. Having words for the and a is otherwise common in a strip of languages across the middle of Africa. Not the West Coast or the southern segment, mind you, but a band across the middle, composed of people with decidedly little in common with people in Barcelona or Copenhagen and in fact having had, historically, vastly little contact with them. Once again, the explanation here is not culture but chance.

Worldwide, chance is, itself, the only real pattern evident in the link between languages and what their speakers are like. As often as not, what seem like possible links end up not being what we would expect and would be highly unlikely to motivate a study. A key example is cases that force us into supposing that people don’t “need” something that they nevertheless clearly have, and that all people do. In New Guinea, for instance, it is quite common for a language to have one word that covers both eat and drink (and sometimes also smoke). Yet what “need” does this address? It is unlikely that anyone would propose that dozens of separate tribes on this massive island are actively uninterested in the differences between foods (“How many times do I have to tell you to stop calling attention to the fact that fruit is different from stew??”). Descriptions of such groups’ take on food in fact regularly include a wide variety of foodstuffs and preparations, with feasting as a regular aspect of communal life.

This then sheds light on what we might make of a superficially more auspicious situation. Navajo takes things to the opposite extreme: how you say eat depends on whether you are just eating in general or whether what you are eating is hard, soft, stringy, round, a bunch of little things, or meat. Could future research determine how the place of food differs in Native American cultures versus ones in New Guinea? Perhaps, but what do we make of the fact that an Aboriginal group across the water from New Guinea in Queensland, the Dyirbal, having lived lives over millennia that New Guineans would find thoroughly familiar, have three different eat verbs for eating fish, meat, and vegetables? Or that an Amazonian group called the Jarawara, living lives also quite like those of New Guinea folk, say eat differently depending on whether you have to chew something a lot or a little, whether you have to spit out its seeds, or whether you have to suck on it?

All of this is neat, but not in showing us anything about what people need in their language. A speculation about how something in a language “must” reflect something essential in its speakers is incomplete without considering the distribution of that something in languages worldwide.


The truth about how languages are different is that largely they differ in the degree to which they do the same things. Some take a trait further than others, not because their speakers “needed” it to, but because a bubble happened to pop up somewhere in the soup. In English, one bubble was the emergence of the. It was basically a matter of the word that going viral. That singles something out—Not that cat, the other one. The is the child of that: it’s what happened when that wore down. The wearing down meant that the word is shorter, for one, and then also that the meaning is less explicit: the throws a dim but useful little light on something—I meant the green one, not just any old crayon. English, then, is particular in marking definiteness even when context would have done the job just as well. New words emerge this way all the time: a and an started as one.

Yet we have seen that the birth of the cannot have been a cultural event. Bubbles generally aren’t. There is simply no reason we could identify that a word like that wore down into the word the so often in the western half of a peninsula called Europe, in a band running across the middle of Africa, but much less anywhere else. Crucially, no language leaves definiteness completely to context; it’s just that English happened to take that particular ball and run with it. Many languages use good old that (and this) to mark definiteness when explicitly needed. Chinese does it with word order: Train arrived means the train came, while if you say Arrived train it means that a train came. Languages all accomplish the same things despite how massively different human cultures are. It happens, however, that each language develops its random private obsessions, rather like a little fellow who can name all of the presidents’ wives for no real reason (that was me as a lad).

Evidential markers are examples; they emerge via the same kind of process as words like the and a. They seem so “cultural” from our vantage point, but then the and a would seem just as “cultural” to a Tuyuca. Both traits are bubbles in the soup. All languages mark evidentiality to some degree. English’s That must be the Indian food is paralleled by Spanish doing it with the future tense (Será Juan “That must be John”). It’s just that some languages happen to take that ball and run with it. Meanwhile, all people eat, drink, and like it. Some languages happen to bubble up a bouquet of words for different kinds of eating, some just bubble up with a word for eat and a word for drink, while some don’t even bubble in this area at all and just leave it at a single word for taking things into your mouth.

It is the nature of language for such bubbles to pop up. All languages are on the boil; none sit unheated. The only question is where a language’s bubbles will happen to occur. It’s exciting, actually—examining this language and that one, there is almost a suspense as to which intriguing feature will turn up in which one. Another way of seeing it is as a kind of extravagance. In any language, there are some things that it elevates to an art, sashaying rather than walking, performing instead of just going through the motions. “What will the fashions be next year?,” one might wonder—and in the same way you wonder what marvelous predilection the next language you encounter will happen to flaunt. Yet, unexpected as this may seem, these predilections do not track with culture. It’s more like someone opting to sport a certain scarf for one season just “because,” and maybe developing a penchant for a certain color for a while some years later. Serendipity plays a much vaster role in language than one would expect.

Nothing makes that clearer than the fact that many of the things we think of as absolutely fundamental to getting our thoughts across are, in grand view, more bubbles. There are languages, for example, where you do not have to mark tense at all—no past, no future. Context takes care of everything, and yet the people live life as richly as we do. What that means is that even having tense is, technically, a pretty scarf, a bauble—or bubble. After all, if most of the world’s languages developed tense because they “needed” it, then we must say that various peoples did not “need” to know whether something happened before or hasn’t happened yet. But what kind of people would these be? Who would be comfortable smiling at them and telling them that unlike us, they don’t need to situate themselves in time? Never mind that the people are in New Guinea—last time we checked, someone was already accusing them of not being gourmands!

Or, there are languages where there are simply first-, second-, and third-person pronouns, but no difference between singular and plural among any of them. We’re used to this in English with you applying to both one and more than one person (and that is odder, as languages go, than we are often aware). However, imagine if he, she, it, and they were the same word, and I and we were the same word. There are people who don’t have to imagine that, because that’s the way their languages are! However, it would be hard to tell them that they do not “need” to distinguish between he and they. For what reasons would a group of human beings “not need” to make that distinction? This time the Pirahã are among those whom we would have to designate as having such peculiarly sparse needs. This just in: “Tribe with No Words for We, They, or Y’all Cannot Distinguish Groups from Individuals”? Rather, most of the world’s languages, including English, make the distinction because all languages have bubbles. Context is capable of taking care of a great deal. All languages express much, much more than anything any human beings “need.”


Once we understand this, it is no longer surprising that languages seem almost willful in how little their makeup has to do with what their speakers are like. It’s all about the bubbles. The Nunamiut and Tareumiut Eskimo have distinctly different cultures: the Nunamiut are hunters living in family groups while the Tareumiut are whalers living in big villages. Yet they speak the exact same language. Another one: when t or d comes at the end of a word and after another consonant, we often let it go when speaking casually. We are much more likely to say Wes’ Side Story than Wes-t Side Story. Someone may well say I tol’ Allen not to rather than I tol-d Allen not to. This is true of all English speakers to varying extents. However, as it happens, before a pause in speaking, New Yorkers are more likely to drop t’s and d’s than people an hour-and-change down the road in Philadelphia! That is, if You’re gonna catch a cold is the last thing someone says before quieting down for a while, it’s more likely to come out as You’re gonna catch a col’ in New York than in Philadelphia.

Try to wrap your head around what this would mean culturally—are Philadelphians more properly spoken than New Yorkers? Note that the study that discovered this focused on ordinary people, not the hypereducated elite. Our question is therefore whether the people in the Rocky movies are more careful about pronunciation than the people in Saturday Night Fever. Plus, we mean only in that quirkily specific case, before a pause. In general both New Yorkers and Philadelphians drop their t’s and d’s all over the place just like anyone else—although in other subtly differing ways that no speaker could ever be aware of consciously. It’s all about bubbles again.

Why would a language have something its speakers don’t need? We can see now why the question, so reasonable in itself, misses something about language that only becomes evident in view of all of them at a time: most of a language’s workings are due not to need, but to happenstance. Whorf’s idea about “intricate systematizations” was that to learn a language’s grammar was to learn how its speakers think, how they are: master Tibetan’s grammatical patterns and you are mastering, as it were, Tibetanness. This is a plausible place to start when thinking about language, but less attractive as a place to remain. Tuyuca speakers no more “need” evidential markers than Western Europeans and Central Africans “need” words for the and a. Traits like this in a language do not emerge because of the way its speakers think, nor is there any motivation to suppose that these linguistic traits consequently shape speakers’ cultural essence. As tempting as this latter “holistic” approach is, while it allows the viscerally attractive idea that the Tuyuca are uniquely attuned to their environment, it also requires that millions of people in New Guinea don’t care about good eating.

Not Those Things?

There could be a sense that the traits that are rather obviously unamenable to any cultural analysis are not the ones Whorfianism applies to. However, it is unclear why they would not be. If referring to time with the words up and down makes Chinese people process life in a significantly different way than English speakers, then why doesn’t a single word for eat, drink, and smoke mean that people in New Guinea process ingestion differently than other people? One can even imagine ethnocentric Victorians cooking up—so to speak—an idea that these New Guinea verbs signal the primitive palates we would expect of “savages.” We dismiss that easily—but upon what grounds would people’s languages correspond to their cultures only in attractive ways? And upon what grounds would we even decide what, in the grand scheme of things, is immutably attractive?

Sheer logic forces a simple conclusion: the idea that Amazonians have evidential markers because they need to be alert to their environment is every bit as much a just-so story as the claim that New Guineans have an eat-drink verb because they can’t be bothered to savor their dinner.

“No Word for X”: Caveat Lector

One hears now and then of things about some language that suggest an actual robust correspondence with its speakers’ take on life, but in my experience they always turn out to be myths.

There are, to be sure, countless things that any language does not have a single word for that clearly do not reflect anything its speakers are or feel. A French person might wonder how there could be people without a word like their frileux, for the kind of person who always seems to be a little cold. Yet English doesn’t have one—we have to say, indeed, “I’m always cold.” But few would propose that this is because the French are more sensitive to breezes than others. Clearly, that the French have a word frileux and we don’t is just a jolly little accident, as is the fact that Swedish happens not to have a word for wipe. Let’s not even imagine telling Swedes they don’t wipe—it’s just that they use words like dry and erase, which serve just as well.

The propositions that really would suggest a different take on life always fall apart. The film Amistad taught us that the African language Mende has no word for may. The idea was to highlight the basic innocence of one of the African characters, his language supposedly requiring one to specify whether something is or isn’t, with no gray zones. It was great narrative drama, but cartoon linguistics. It is safe to say that no language lacks ways of conveying degrees of confidence in truth, given that all humans have the cognitive equipment for perceiving such gradation and urgently need to express it day in and day out. Mende, in fact, has a much more robust and elaborate subjunctive construction than English does. In that language, one not only does and doesn’t, but may and may not.

The Language Log website’s “No Word for X” department is a useful archive of how things like this never pan out. I have also heard that a dialect of Berber, spoken in northern Africa by people who were living on the land long before Arabic arrived, has no words for win or lose. We are supposed to think of them as in contact with the communal, cooperative essence that we acquisitive individualists in the West have fallen for. There is value in the lesson, but it would be more honestly conveyed by addressing what the Shilha Berbers are like as a culture, not their language. Anthropology tells us that all human groups have games, especially among children. Are the Berber really alien to children engaging in scrappy competitions in which one person comes out the victor and one doesn’t? If not, then right there, we know that these are not a people with no concept of winning and losing; the question becomes whether they watch winning and losing happening all the time and yet mysteriously lack a word for it. This would seem highly implausible, not to mention condescending.

And then, a dictionary of precisely the Shilha’s dialect of Berber reveals words for win and lose. Perhaps they do not use the words just as we do, indeed—especially since the dictionary is in French and gagner and perdre themselves overlap only partially with English’s win and lose. However, the same dictionary also had words for conquer and fail. Plus, as it happened, I once had a Berber-speaking cab driver, and when I asked him how to say win and lose he immediately tossed out exactly the two words for them I had seen in the dictionary! Let’s face it, these people not only know what winning and losing are, but talk about them with ease.

Who Thinks Otherwise?

Some readers may understandably wonder whether there are actually people informed about languages and cultures who would find anything I am putting forth at all novel. There are: how one perceives such things varies immensely depending on training, cultural predilections, and intent, and a robust strain in modern academia is quite committed to the idea that languages represent cultural thought patterns. For example, Swarthmore’s K. David Harrison has posited that depicting language diversity as marvelously random, as I have, is “stunningly obtuse.” He happens to have done so in a passing critique of an article I wrote in World Affairs. That personal aspect, however, is not my reason for using his position as an example here. For instance, his claims that I think languages’ complexities render them unfit for the modern world and that it would be better if all people were monolingual are so contrary to anything I have ever written that the proper response is silence.

However, Harrison’s take on the link between language and culture is useful to the argument here in demonstrating exactly the unwitting misimpressions I have described in this chapter. When it comes to grammar, as opposed to what they have names for, languages are awesomely different, but not in ways that correspond to how peoples are different. Harrison disagrees: “If so, then Stonehenge and Machu Picchu differ only because of different randomly evolved building methods, but tell us nothing interesting about the ancient Neolithic and fourteenth-century Inca cultures.”

But that’s just it—languages are not things. Stonehenge and Machu Picchu, as tokens of culture, tell us plenty about the people who built them. However, if we had records of the language Stonehenge’s builders spoke, its structure could tell us nothing about what they were like, nor would early Quechua teach us anything about what it was to be an Inca in the 1500s. Both languages, of course, had words for things important in their cultures. However, where does the idea come from that what shapes thought is the word for something rather than the thing itself?

Harrison continues to protest against the idea that language changes randomly: “It’s hard to imagine a lesser regard for the products of human genius and their great diversity that arises differently under different conditions. As people have spread out and populated the planet, they have continually adapted, applying their ingenuity to solve unique survival problems in each location, and inventing unique ways of conceptualizing ideas. Geographic isolation and the struggle for survival have been the catalyst for immense creativity.”

But languages are not like paintings. They do not develop via people applying their ingenuity or being creative. Languages develop via step-by-step driftings that operate below the level of consciousness, and this is not an opinion, but a fact, fundamental to any introductory class on language change. How else, after all, did Estonian end up with fourteen cases? A single Estonian noun takes a separate ending for each of them, such that beyond the plain form, “book” can be marked to mean “of the book,” “some book,” “into the book,” “in the book,” “out of the book,” “onto the book,” “on the book,” “off from the book,” “like a book,” “as far as the book,” “as a book,” “without the book,” and “with the book.”
On top of all of that, Estonian is one of those languages where irregularity is practically the rule. Does anyone plan such things? If this is creativity, I’m not sure we’re giving Estonians a compliment.

The impression that people “create” their grammars is easily maintained when we marvel at a language unlike ours spoken by indigenous people. When a language works so differently from ours, a natural gut-level impression is that it is a departure from normality, and even that this departure must have been deliberately effected, or must have arisen because of some pressing circumstance such as interesting cultural particularities. However, the notion falls apart when we turn the lens on ourselves. Spanish has subjunctive endings. Who “created” them? In what way do they correspond to life in Madrid as opposed to life in Tokyo? If we are to say that they are historical baggage from another time, why was a subjunctive more useful in Old Castile—or Ancient Rome, where Spanish’s ancestor Latin already had a subjunctive—than in feudal Japan?

Or, if we are to say that the Whorfian analysis isn’t supposed to apply to the subjunctive, why not? It is not clear from Whorfian work to date what would disqualify the subjunctive from the analysis while permitting numeral classifiers, color terms, and the future tense. After all, if European languages didn’t have the subjunctive and we encountered it instead in a tiny language spoken in a rain forest, wouldn’t it be the first thing treated as evidence of its speakers’ layered perspective on truth conditions?


The magnificence of how a language is built is not its correspondence with folkways, cosmology, and thought patterns, but in its protean, fecund independence from these things, ever happening to burgeon into new spaces of meaning and complexity, evidencing what one can barely help thinking of as a kind of irrepressibility. To think of the most interesting thing about language as being how it sheds light on its speakers’ thought processes is like cherishing Beethoven’s Seventh Symphony not for its nimble melodies, richness of harmony, surging thematic progressions, and stirring orchestration, but for the handful of dimly flickering hints that it just might lend us about what Beethoven was like as a dude.

In the synaesthetic sense, a language smells like mowed grass or a steamy jungle. It cooks—bubbles, as it were. However, it does not do this on assignment from a culture’s needs. Like culture, but largely apart from it, language is quite the marvel in itself.