The Language Hoax: Why the World Looks the Same in Any Language - John McWhorter (2014)

Chapter 5. What’s the Worldview from English?

WHORFIAN WORK COMPARES ENGLISH with other languages, with the goal of showing how other languages might make other people think differently from English speakers. However, something investigated too infrequently, which could be useful in evaluating the implications often drawn from Whorfian work, is how English might make us think differently from other people.

It could be said that this is what the work on other languages has shown, although not presented in quite that way. If Russians perceive dark blue and light blue as more distinct than we do, then we perceive them as rather less distinct than they do. If some native Australians process themselves as oriented toward geographical coordinates, then what defines us is that we do not.

However, facts such as these feel somewhat beside the point. So often it seems to be English speakers who don’t rather than do, who are somewhat less sensitive to something, who lack what others have. Surely there isn’t something inherently numbing about our language, however: any appearance of that would have to be an artifact of experiments typically done by English speakers for an Anglophone audience.

The question, then, is: How does English shape its speakers’ thought, in ways that would intrigue audiences if most Whorfian work were done from the perspective of Third World languages, or even Japanese or Chinese? Of course there is no need to suppose that English outright bars us from thinking anything, any more than any language exerts such an effect on its speakers. We established early in the book that modern Whorfianism is about statistical tendencies, not straw-man absolutes. But still: How does English influence the thinking patterns of those who speak it?

Many will already notice how peculiar the question feels. The idea that our language creates a uniquely Anglophone “worldview” can seem less intuitive than that Japanese creates a Japanese worldview. It isn’t hard to imagine a language called Guugu Yimithirr creating its own worldview, since its very name suggests a world of life vastly unlike our own. But when it comes to future tense markers, ways of saying before and after, or nonexistent gender markers on nouns, what worldview are they creating for the man reaching for a box of cereal at a Walmart outside of St. Louis?

The question bears exploration. In this chapter we will settle in with just a single sentence of English and train the Whorfian light upon it in the same fashion as we usually see it trained on other languages. However, we will not examine a passage from the Bible or Henry James or even Henry Miller: we want live, spoken language. And then, not from Walter Cronkite or Hillary Clinton either. We must keep front and center that in framing languages as shaping thought, we are referring not to icons speaking carefully to large audiences, but to real people speaking casually amid everyday life.

As real and live, for example, as a guy of about sixteen I overheard saying something one weekday morning in Jersey City on his way to school with a friend. He was black, for the record, and that aspect of him is in itself handy, in that any claims about how English shapes thought must be applicable to him as well as to a middle-aged person who subscribes to the Atlantic—in treating English as shaping thought, we must account for a vast array of people, and for that matter, not only in the United States but worldwide. Besides, to the extent that this guy’s rendition of the sentence was affected by the patterns of the dialect widely known as Ebonics, those, too, are richly pertinent to evaluating what to make of Whorfian findings on language and thought.

Recall: those findings certainly show that language can shape thought to an infinitesimal degree. The question is what implications we draw from that degree about what it is to be human. In that light, here is what a human said to his friend one morning in 2012: “Dey try to cook it too fast, I’m-a be eatin’ some pink meat!”

If anyone needs translation, the standard version would be If they try to cook it too fast, I’m going to be eating some pink meat! I didn’t catch what came before or afterward; I just kind of liked the feel of the sentence such that it stuck in my ear and later occurred to me as solidly, even pleasantly, representative modern American English.

So: we know that if we asked our teen to participate in certain kinds of psycholinguistic experiments, we would see that his modern American English shapes his thought in certain ways. However, how plausible would we find the assertion that his speech—Dey try to cook it too fast, I’m-a be eatin’ some pink meat!—conditions in him a worldview (1) different from that of an Indonesian or a Brazilian and (2) akin to that of Lindsay Lohan, Condoleezza Rice, Ben Kingsley, me, and probably you?

We shall see.

As If

The first order of business, according to Whorfian tradition, is to subject our teen to the same gloomy surmise that has been tried on the Chinese to such general dismay. Namely, if language shapes thought, then mustn’t we wonder what it might mean that the guy did not use the word if?

More precisely, sometimes he uses if and sometimes he doesn’t: black Americans shift in and out of the structures of Black English. Still, though, it is presumably reasonable to hypothesize that someone with lower rates of if usage is having their thoughts shaped less by if-ness than someone whose dialect requires them to use if always.

As it happens, there is a long history of treating Black English exactly in this vein. Often it has been with good intentions based on a fallacy: that black children need rescue from an illogical home dialect. Education expert Carl Bereiter and his associates in the 1960s argued that a sentence like They mine for They are mine was, in its lack of a be form, a broken locution hindering the learning process. However, since mighty languages like Russian and Indonesian also do not use a be verb in the same way, Bereiter was unknowingly diagnosing massive numbers of human beings as verbally handicapped (for the record, the writers of the Old Testament would also have to be included, as Biblical Hebrew was be-less in the same way).

It should be said that Bereiter’s analysis was rooted in a sincere desire to help poor black children learn to read more effectively. In fact, the method of teaching reading to poor kids that my friend Siegfried Engelmann and Bereiter spearheaded, which itself does not dwell on issues of Black English grammar, is one of the most tragically underconsulted secrets in education today. However, it should also be said that someone else treating Black English’s streamlined nature as evidence of deficit around the same time was psychologist Arthur Jensen, who famously suggested that black people are, on the average, less intelligent than others.

Ah, yes—now what were we saying about “language shapes thought” and our black teenager? Most will readily classify treating his casual relationship to the word if as having about as much cognitive import as Chinese people’s soft-pedaling of explicit ways of expressing would and would have. Anyone retaining sympathy for treating Black English as a deficit must also be prepared to assert that, languagewise, Chinese speakers are playing with less than a full deck. Reject that, and the only logical conclusion is that languages (and dialects of languages) differ in how explicit they are overall—here Atsugewi, there Chinese—but that this difference is independent of thought in any significant way, certainly not justifying metaphors about “how we see the world.”

When it comes to how someone speaks English, this isn’t even all that hard to wrap our heads around. After all, writing obscures things absolutely central to expression: context and intonation. The boy’s friend did not hear Dey try to cook it too fast as an independent declaration, because they were both aware of the situation they were talking about, in which there was presumably some question as to the quality of the food. Also, the vocal melody with which the boy expressed Dey try to cook it too fast made it clear that something else, a result of this potentially rapid cooking, was coming up immediately. That is, because of the melody, of a sort that any English speaker would use when uttering a dependent clause of this kind, even if the boy had for some reason said Dey try to cook it too fast and then lapsed into silence, his friend would have wondered what was coming next. If dey cook it too fast, den what? Say somethin’, dude!

This sentence was, then, one passage in a thoroughly coherent exchange. If language shapes thought, and what that boy was speaking was language, then apparently Whorfianism is not to be applied to his usage of if. We move on.

Dey In, Dey Out

To many, the idea that different languages condition different ways of feeling life is the most interesting thing about languages. Often, however, even just bits of language are interesting in a great many ways, quite apart from fragile Whorfian speculations. The little word they is a good example.

It started as a patch of sorts, entering the language from elsewhere, as a solution to a problem. Basically, in Old English the words for he and they had become rather inconveniently similar. He was pronounced “hay” and the word for they was roughly “hyay.” By early Middle English, both he and they were he.

Stranger things have happened. What’s the other language you have learned besides English where the word for you is the same in the singular and the plural? It would have seemed barbaric to earlier English speakers who kept thou for one person and you for more as religiously separate as we keep I and we apart today. There was even, in Old English, a pronoun just for saying “you two” as opposed to you all: git! But today, we consider the one-size-fits-all you as perfectly normal.

Yet languages have a way of keeping things organized to a certain extent. English speakers have always champed at the bit somewhat on the you issue, for example. Forms such as y’all, youse, and Pittsburgh’s y’uns, despite their backyard repute, are attempts to be more explicit and make English more “normal” in this regard. The almost suffocating influence of the standard language in education and the media keeps these novelties from ever becoming accepted speech, but things were quite different in the fourteenth century. Before widespread schooling or literacy, natural attempts to tidy a language up (or muss it up) could normalize much more easily.

As such, Scandinavian Vikings confronting the singular/plural he puzzle found it handy to bring their own Old Norse’s third-person plural pronoun into the slot.

That’s how we got they: a linguistic cross-fertilization. We all know languages borrow words for new things, like sushi. However, we’re less likely to suspect that as meat-and-potatoes a word as they started as a foreign intrusion. Thou never knowest!


Meanwhile, however, if we train that Whorfian lens on naturalized little they, we fall back into the issue that yielded so little in the last chapter: whether people differ in how richly they perceive plurality.

The experiment would have to take into account that they, as languages go, is about average in terms of explicitness in third-person pronouns. English gilds the lily in even having a pronoun especially marking more than one third person, as we have seen: some languages have the same word for he, she, it, and they. Meanwhile, Old English had even more fun than modern English. I simplified a bit before: Old English’s hie was the masculine they, but there was a feminine they too, heo, pronounced roughly as a doughty older character on Downton Abbey would pronounce hair: “hay-uh.” Thus English was like Arabic and Hebrew and other languages in keeping things tidy. Yes, tidy; if you’re going to have a he and a she, then shouldn’t there be a they and, as it were, a female “they-uh”?

But my, how much further languages can take this kind of hair-splitting. Among languages in the South Seas eastward of Australia, it is ordinary for a language to have separate words for they two, they three (or so), and they all. Elsewhere things are just plain different from anything we would imagine. In the Amazon’s Jarawara, if the things in question are inanimate objects, there is no pronoun for them at all. That’s right: the pronoun is, of all things, absence. You know, as a Jarawara speaker, that when no pronoun is used, then it is a “phantom” they, referring to things that are not living ones.

Consider, in the light of all of this, a conclusion along the lines of Mark Abley’s on Native American languages of the Algonquian group, such as Cree, Ojibwa, and Pocahontas’s Powhatan. Abley, a journalist, is deeply taken with the Whorfian perspective, and for all of the right reasons in the sociopolitical sense. But it means that to him, “to speak properly, in an Algonquian language, is to be aware of the identities and interrelationships of all the people you address.” He bases this statement on the fact that in such languages, when you use I and you in the same sentence such as I see you, the you comes before the I, such that one might think that the “I” is less central in Algonquians’ minds than in ours.

One might make a similar statement based on the proliferation of they words in languages like the ones I mentioned in the South Seas—especially since such languages are similarly fecund in their variations on we and you. Also, many might find the Abley approach welcome in comparing undersung, undervalued, and historically exploited groups of the Melanesian islands to speakers of boring, oppressive English. Under this analysis, to speak English would mean being relatively insensitive to people, their number, and their relationships to you and to one another, less socially fine-tuned than, say, a Melanesian.

However, besides the air of Noble Savage-style romanticization in this kind of thing, as well as how shaky the idea is that Anglophones worldwide are inherently a little chilly, does the Abley-style perspective on they seem as attractive when we compare the Melanesians not to Margaret Thatcher but to a black teenager in Jersey City? English is likely the only language he’s ever known, and yet the language that supposedly shapes his thoughts is the same one shaping the thoughts of Rush Limbaugh.

At such a point many will consider that the entire enterprise just doesn’t hold up. Adding fuel to that fire is that, by this logic, in having a distinction between third-person singular and plural in his pronouns at all, our Jersey City boy is “aware of the identities and interrelationships of all the people you address” to a greater extent than the Pirahã tribespeople. They have a single he/she/it/they pronoun. Yet they live in a small group, interacting closely all day long every day. Wouldn’t we expect them to be more attuned to shades of they-ness than an urban American? But if we would, then that’s yet another mark against a meaningful connection between how a language’s grammar plays out and how its speakers think.

And then who knows what kind of connection we could draw between grammar and reality based on the Jarawara’s lack of any pronoun at all for inanimate objects! If asked, I’m sure they would tell us that they perceive that birds are alive and sticks aren’t just as clearly as we do, thank you very much. I’m not sure who would tell them otherwise—or even venture an experiment seeking to reveal them as a hair less quick at pressing a button related to demonstrating that fact. And then, were Old English villagers more alert to the gender of groups of people than a teenage boy in Jersey City? Why?

And so it goes. Sometimes a cigar is just a cigar, and sometimes, a they is just a they. Onward!

Try, Try Again

Try is an orphan. No one knows where it came from beyond a certain point: roughly, somewhere around France. It’s one of the thousands of words that English borrowed (and never gave back) from French in the Middle English period, leaving English’s vocabulary the queer blend of grand old Germanic and fancy new French and Latin that it mainly is. The French word trier was one of assorted variants of that word kicking around in the French area and thereabouts. Just as one can know from comparing dogs, platypuses, kangaroos, and more that there was once an Ur-mammal with four legs and hair that gave birth to live young, comparing the variations on trier we can know that there was once a word triare in the Gallic area.

Usually, one can then compare a word like this to similar ones in other languages throughout Europe, and using the same comparative method, linguists have reconstructed thousands of words in what must have been the grandfather language to most of the languages of Europe, not to mention Iran and India. For example, father began as a word pəter in that ancient language, at this point pretty firmly placed as having been spoken in the southern Ukraine. It yielded French’s père, Spanish’s padre, German’s Vater, Hindi’s pitaa, Irish’s athair, Armenian’s hayr, and so on.

But there’s no word like triare in any other European language. That means try has no pedigree tracing back to some ancestral word that now has its spawn in Russian, Greek, Hindi, Persian, and Lithuanian. And only because of France’s temporary takeover of England in the late Middle Ages, when French was the language of writing and its words percolated into humble English speech, did the word make it into English at all.

Try, then, is a foster child, shipped across the English Channel around the time Thomas Aquinas was teaching at the University of Paris, and today it is used several times a day by English speakers worldwide, including on ordinary weekday mornings by adolescents in Jersey City, New Jersey.


And how our particular adolescent used try on one particular morning is especially interesting. Note he said Dey try to cook it too fast, I’m-a be eatin’ some pink meat! If you think about it, that usage of try is somewhat off in the logical sense if we take try as intended in its core meaning. It would be one thing if he said, If they try to cook it too fast, I’m going to tell them to turn down the heat or If they try to cook it too fast, I just won’t have any chicken. Overall, if he says, If they try to cook it too fast, we expect that he will follow this up with something about him either stopping them from doing so or turning away from what they cook.

Instead, though, his sentence has him eating the meat that the people “tried to” cook too fast—that is, they would appear to have not tried to, but succeeded in, cooking the meat too fast, which makes you wonder why the guy put it as “try to” when, after all, they quite simply did. One feels as if the sentence should have been simply If they cook it too fast, I’ll be eating pink meat—the try to seems extra.

And it is, but not in a random way. This usage of try to is actually an example of how the dialect of English that most black Americans switch in and out of all day, so often thought of as “bad” grammar, a deformation of “correct” English, is in many ways more complex than standard English. Our adolescent’s usage of try to is, of all things, a subjunctive mood a-borning in Black English.

Its air of extraness is analogous to how the subjunctive in languages like Spanish feels to English speakers. In Spanish, I doubt he will go is Dudo que él vaya, where the subjunctive form vaya conveys the hypotheticality of the going instead of the plain-vanilla indicative va. To an English speaker learning Spanish this seems a finicky add-on. One wonders why a language has to actually have a separate verb form to mark such a nuance. In the same way, the try to in Dey try to cook it too fast, I’m-a be eatin’ some pink meat is marking the hypothetical.

Indeed, taken literally the try to seems like clutter, “messy” grammar. However, black people use try to in precisely this way quite often. It is a regularity, a logical pattern of, of all things, grammar.

That is, try to has broken the bonds of the literal and now signifies “in the case that they cook it too fast.” This kind of thing happens to words in all languages all the time. In English, for example, going to now marks the future—I’m going to think about that—even though in terms of the original meaning of go, that doesn’t make sense: how do you “go” toward thinking? Going to has only been used that way since the 1600s. To a speaker of Old English, using going to to express the future would sound as odd as our teen’s use of try to does to many of us now.

“Us” would include the very people who are using it that way, if we were to tell them they were doing so. To be sure, black Americans are no more consciously aware that they are wielding a nascent subjunctive than standard English speakers know that when they say That must be the Indian food they are using what is termed the evidential mood by linguists. Sources such as the online Urban Dictionary note a black “expression” tryna. This, however, is not the subjunctive try to but a mere matter of colloquial pronunciation, namely of the ordinary try-ing to, used just as all English speakers use try in its default meaning. Our teen’s try to usage is something different—and just as cool as the aural “flava” of tryna for trying to.

And this try to as in try to cook it too fast is a grammatical feature more elaborate than in schoolbook English, where the subjunctive has been on the ropes for centuries. One can slip it in. If there be persons in opposition is the subjunctive version of If there are persons in opposition, but it’s decidedly hoity-toity. If I were the one versus If I was the one: the fact that grammar hounds must lecture us on how the were version, the subjunctive one, is better is a sign that it’s dying. Yet our teen pops off with his try to cook it too fast intending nothing remotely formal, and certainly with no one having told him to express himself that way. He was just talking—using a subjunctive as effortlessly as someone speaking French or Spanish.


Those are some of the ways that try to is interesting. How does the Whorfian take on it stack up? Language shapes thought—and so now we have to speculate that black Americans are possibly more alert to the hypothetical than other Americans. It’s one thing—although, as we have seen, a deeply fraught one—to speculate that Amazonian hunter-gatherers, with their evidential markers, might have an exotically different take on whether things are true and why than Westerners. But now, are we to say that the black cop in Oakland, or the black woman minister in Atlanta, or Kanye West, or Barack Obama, hearkens more keenly to the if over the is than Ashton Kutcher or Tom Friedman?

We should be wary of the whole approach after what came of Alfred Bloom’s attempt to delineate Chinese people as less sensitive to the hypothetical. However, this time we are treating a people as more alive to what might be versus what is. Might that seem perhaps more inviting? Especially since it might serve to counter the tragically prevalent sense that black American speech is a perversion of English rather than a fascinating variation upon it? I might note that I myself have been very much on the battlefront when it comes to spreading the word on this latter point.

However, on the specific issue of black people and higher subjunctive awareness, we’re asking for trouble in the scientific sense. For one, recall that this is the same dialect that can leave off the if in a sentence like Dey try to cook it too fast, I’m-a be eatin’ some pink meat. That would seem to indicate leaving the hypothetical to context to a greater extent than standard English’s obligatory use of the if. So which is it? If anything, the try to subjunctive combined with the absent if would seem to leave black Americans on par with, but not ahead of, standard English speakers on hypotheticality.

And then we run up against the bigger picture. Thought patterns drive culture. What, then, does the culture of black Americans have in common with that of Ancient Romans, whose Latin had a subjunctive, which then evolved into the subjunctive today used by speakers of the languages that developed from Latin, like French and Spanish? We might even ask what was the common thought pattern that meant that Ancient Romans, in addition to peasants in Gaul and Iberia, used a subjunctive—and, on top of that, not the Vietnamese, or any number of Australian Aboriginals, or Israelis or Finns, or countless other people one could easily parse as culturally likely to cotton to subjunctivity. To wit, Julius Caesar, Valéry Giscard d’Estaing, Pablo Picasso, Sophia Loren, and even Nicolae Ceauşescu have shared just what in common, that would indicate that the subjunctive in their language shaped their thoughts? And whatever that would be, now try to liken it to the way Jesse Jackson and Jay-Z process reality as well.

Here, it becomes attractive to consider those soup bubbles again, the ones that pop up on that side, this one, in the middle, God knows just where one will turn up—all you know is that some will, somewhere and always. There is an endless variety of life’s nuances that a language may end up marking. All languages mark some but not all, and which ones they mark is a matter not of what their speakers need or what their speakers are like, but of chance. Chance is what makes both Gérard Depardieu and our Jersey City black boy use subjunctive marking, just as chance is why both the Tuyuca Amazonians and Bulgarians have evidential marking while Polynesian islanders and Czechs do not.

In fact, the way try to is used in Black English shows us that in the end, all people think alike, not differently. Black English can leave off an if—Dey try to cook it too fast …—but then the try to subjunctive conveys the same kind of hypotheticality, just in a less obvious way. This is akin to what we saw in how Chinese, although lacking definite articles, can convey definiteness with word order, even though speakers do not consciously know it: train arrived means the train came, while arrived train means a train came.

Lesson: black Americans’ dialect is more subjunctive grammatically than standard English. However, any attempt to extend that into characterizing speakers of that dialect as fascinatingly attuned to the if over the is fails, once we consider how likely we would be to parse Leslie Caron and Ségolène Royal, all of the peoples of Portugal, Spain, Latin America, France, Italy, and even Romania as subject to the same influence of the subjunctive on thought as upon a black boy in New Jersey.

It seems a tad absurd, upon which we must re-evaluate the initially seductive nature of statements such as Languages evolve according to the needs of their speakers. Quite simply, they do not. Of course languages develop new words for new things: that is as undeniable as it is uninteresting. However, beyond this, how a language is put together structurally has nothing to do with what its speakers need. Language is intriguing for countless other reasons.

Undercooked?

Of course no one has said that every element in a sentence has a Whorfian significance. However, as we pass through this one vibrant sentence of English, Whorfianism seems fraught no matter where we turn.

Cook seems innocent enough, but then English borrowed it from French—before which English, and its early Germanic kin like Old Norse, had no single generic word for cooking. One baked, roasted, boiled things—but there was no more one word for just cooking in general than today’s English has one word that refers to both eating and drinking. (Ingest, technically, but it’s highly formal—no one says Man, I ingested too much meat and wine at Thanksgiving!—and it applies more readily to solids than liquids: who ingests lemonade?)

Yet: if Russians see blue more vividly because they have separate words for dark blue and light blue, then we must explore whether modern English speakers perceive cookery less vividly than Iron Age villagers. What do we make of a notion that a Viking was more sensitive to distinctions in cooking techniques than today’s foodie couple in San Francisco? Or, if the Jersey City schoolboy is less attuned to cooking techniques than Edward the Confessor, then he gave no evidence of it in his enthusiastic discussion of a future chicken dinner at eight in the morning.

Or is it that Whorfian effects are cancelled out by cultural developments that occur after a language has taken shape? If so, then how can we apply Whorfianism to any human group? Languages are typically much more ancient than their current culture. They were often imposed on people beyond the ones they originated among: Arabic started as the language of an obscure group of nomads in Arabia and was only later imposed on Coptic-speaking Egyptians, Berber-speaking North Africans, and others. Languages often change vastly over time anyway: Old English was much like German both in structure and vocabulary.

Which stage of language shapes the thoughts of speakers at which time—and then on top of that, exactly which kinds of thoughts, and why? Whorfianism must work harder on this kind of question to justify the implications many wish to draw from it.


Even humble little it has a story. Wouldn’t English seem to have one more of its ducks in a row if it were him, her, hit rather than him, her, it? As a matter of fact that’s the way it was in earlier English. However, hit was the only one of the three where eons of rapid pronunciation were so hard on the h that it truly wore away. With him and her, the h hangs on, although we say ’im and ’er as much as, if not more than, we actually enunciate him and her. However, in a sense modern English does have its ducks in a row in that in rapid speech, the little trio is properly ’im, ’er, and it.

The Whorfian story of it, in contrast, requires insulting the Chinese again. In many languages, pronouns are highly optional when context can do the job—so much so that an English speaker might wonder how communication occurs. In Chinese, if someone asks How’d you like the movie? you can, and probably will, say back, Didn’t like rather than I didn’t like it. Japanese and many East and Southeast Asian languages are similar, as are countless ones worldwide. European languages like English are just prissier about getting that pronoun in there.

Does that mean our teen has a greater sensitivity to who is doing what to whom than a Chinese person? In deciding, we should also know that there are languages where to say I met John you have to include a redundant him as well: I met-him John. Is the African tribesman who speaks a language like that more aware of who is doing what to whom than a black boy in Jersey City?

Perhaps the tribesman’s small, intimate social group conditions such an awareness? But then, equally small groups all over the world speak just the kind of languages where you don’t have to express pronouns. Random example: among the 2,500 Manambu of Papua New Guinea, someone was actually overheard saying If you feel like peeing, wake me up as Feel like peeing, wake. After all, no one had any reason to think the person was referring to the urinary inclinations of the guy two doors down, much less that it would be useful to wake him up about one’s own.

Thought is not the issue here. Language varies gorgeously astride the very same kinds of thoughts from group to group.


Too fast. Linguists and Whorfians, their thoughts perhaps shaped differently, will seize upon different things here.

The linguist sees how fast is like a feather. Feathers today aid birds’ flight. They began as insulation and decoration on dinosaurs; for some, the feathers came to be of help in gliding, step by step over millions of years until what started as downy plumage on a Compsognathus became the aerodynamically splendid feathers of an eagle.

Fast, too, is the end stage of a process that began at quite a different point. Old English’s word for fast was snel, just as German’s still is schnell. The word fast existed, but its meaning was firm, tight—as in a meaning it still has secondarily today: hold it fast. However, in this original meaning, one could say run fast in the sense of running with tight application, vigorously, keeping at it. To run in such a fashion is, by definition, to be doing it quickly, and over time, that indeed became fast’s main definition. Today the original meaning lurks in the margins, in words like steadfast and expressions like stuck fast and fast asleep, which, if you think about it, is kind of silly if fast means rapid: few sleepers sprint. Fast asleep hearkens back to when fast meant tight, tenacious, which describes how quite a few of us do sleep (myself regrettably not among them).

In any language, most words have histories like this, starting quite distinctly from what we know them as, and having reached their current state via a stepwise development of inferences few are ever aware of within the span of a human life. Quaint first meant clever or crafty, and by extension, fashionable—note the remnant way we can still refer to a modishly dressed person as looking “smart.” The extension continued over the years: the fashionable connotation acquired a negative air and sank into “elaborate,” “affected.” Time passed, and the extension drifted in a more arch direction, from “affected” to our modern sense of quaint as “enticingly weird in an old-fashioned way.” Fast is a case of this kind, perhaps even itself weird in an old-fashioned way.

But for Whorfianism, the potential meal in too fast is too. It is, for the record, an odder little word than we have reason to consider often. If asked, what would you say too meant? You might be surprised how much there would be to say. Have you ever learned a language in which there is a word referring both to addition (me, too) and excess (too hot)? In French, aussi but trop; in German, auch but zu; in Japanese, mo but ammari. Plus, too also has a specialized alternate meaning. In French, you deny a negation with si rather than oui: Guillaume: Tu n’as pas payé! (You didn’t pay!). Isabelle: Si, j’ai payé! (Yes I did pay!). German does the same thing with Doch, and an English speaker might wish we had such a thing—and we do. Craig: You didn’t do it. Laura: I did too!

One could consider these three meanings of too a neat little splotch; one never knows which related meanings one word might end up covering for various reasons. However, for Benjamin Lee Whorf, this kind of thing fell under the rubric of the “cryptotypes” that he thought of as the channels via which language shapes thought. One of his examples was that in Hopi, there is one word masa’ytaka for all things that fly except birds: such as insects, planes, pilots. There is also a different word for water occurring in nature as opposed to water that you cook with or drink. To Whorf this suggested that the Hopis’ language conditioned them to process the world in ways that a language without these particular configurations would not.

Modern Whorfians are explicit in rejecting the more extreme claims of Whorf’s writings. However, this is a matter of temperance; the basic orientation stays the same. No one today claims that languages prevent speakers from thinking in certain ways, or even make thinking in certain ways a strain; rather, we are to investigate whether languages make thinking in a particular way more likely. However, that likelihood is still treated as up for debate, and as such, the Hopis’ classification of flying things and water is akin to work on, say, Russian words for blue. Moreover, Whorfian adherents outside of the academy are especially given to reading words’ semantic spreads as indicative of weltanschauung. References to the Hopi masa’ytaka have been widespread and steady for eons now. Meanwhile, Mark Abley sees that French subdivides knowing between savoir for facts and connaître for people and supposes that “to a French speaker, that distinction is central to how the mind interacts with the world.”

As such, too leads to a question. Let’s say that masa’ytaka means that a people process flight as an especially vibrant distinguishing trait of moving objects, and that Europeans with their separate words for knowing people as opposed to knowing things have an insight into the contours of familiarity that others lack. If so, then when a word means “also,” “overly,” and “but I did!,” what kind of interaction with the world does it condition?

The answer can’t be that the things that too covers are too abstractly related to condition a way of thinking. After all, there is a short step from addition (me, too) to excess (too much), or from addition to refuting a denial by adding back the truth (I did too!). So: Do English speakers have a uniquely sensitive access to the concept of addition (me, too), as something potentially overdone (too much) but also useful in appending objections amid conversation (I did too!)? Many would not hesitate if such a claim were made about the Hopi as opposed to a lawyer in San Antonio, and one must admit that it’s hardly more abstract than the idea that to be French is to carefully distinguish the knowing of a fact and the knowing of a person.

Yet in the end, let’s suppose that in an experiment, our black adolescent in Jersey City could be shown to have a certain wisp of a readiness to associate addition, excess, and denial—a few milliseconds’ more alertness to this peculiar squiggle of cognition than someone from Seoul. In the grand scheme of things, of all the ways that we might be interested in how American adolescents think, black or not, or how any Americans of any age think, or how English speakers worldwide think, what insight could this wee discovery about too lend us on issues humanistic, political, societal, artistic, educational, medical, or even psychological?

Anglerfish Testicles and the Future

I’m-a in Black English is an awesome little eddy of a thing, where I am going to has coalesced into what is essentially a single word. Imagine the extraterrestrial assigned to make sense of English who happened to come upon Black English first, learning only by ear and trying to figure out what people meant by this I’m-a—pronounced Ah-muh—when otherwise people are using will and gonna to indicate the future. I’m-a is very particular, not just a random instance of running words together. No one says “youra” for you are going to or “theya” for they are going to. The extraterrestrial, to be successful, would have to figure out that I’m-a is of all things something as specific as a first-person singular future construction. It’s the kind of thing a person would often screw up on in an exam, if there were such a thing as Ebonics lessons.

It is one of those cluttered nodes that human languages can develop, seeming to almost willfully challenge those inclined to try unraveling them. French’s Qu’est-ce que c’est? for What is that? is an example. Only because of the written convention can we parcel out that the expression is composed of que, est, ce, que, ce again and then est again—and we still wonder why French has all that just to ask What’s that?

Just the -a part of I’m-a is rather gorgeous when we consider that it began as not one but two words, going to. Going to eroded to gonna, ’onna, and finally just a, as distant from its progenitor as French’s août, pronounced just “oo,” is from its Latin source augustus. The -a in I’m-a is the linguistic equivalent of the male anglerfish, tiny compared to the female, whose life cycle consists of sucking onto the female’s head permanently and gradually wearing away like a dying pimple until nothing is left but his testicles, whose sperm are absorbed into the female’s bloodstream to fertilize her eggs! In I’m-a, -a is stuck to I’m’s forehead, fertilizing it with future meaning.

And then the -m- part of I’m-a is a shard of am, itself part of English’s bizarrely multifarious community of be-verb forms. Irregular is one thing, but am, are, is, be, been, was, and were is a train wreck. The current situation is litter from no fewer than three different original verbs, beon, wesan, and aron, that collapsed together as if laughing at the same warmly potent joke. In any language, spots that undergo especial wear and tear tend to be messy—habit scorns logic. Be verbs are used a lot, and thus, like irregular plurals such as man and men, they tend not to be places to seek order. Thus the -m- in I’m-a and the be of be eatin’ that the Jersey City boy used are two shards from a three-verb traffic pileup that Germanic tribespeople allowed in early English two thousand years ago.


Meanwhile, recall that the Whorfian take on our adolescent’s future marker is that it will make him less likely to save money.


As we near the end of the sentence, the message holds steady: People think alike; it’s the languages that change.

Does the guy’s having a word for eat separate from one meaning drink mean he likes food better than tribespeople who have one word for both? It would be hard to say so when earlier in the sentence his having a general word for cook seemed to suggest on the contrary that he was less of a food person than Hagar the Horrible.

When our teen says some pink meat, the some doesn’t mean “a little bit”; rather, it is an extension of that meaning, suggesting a diminution of the meat’s quality, a pejorative evaluation. All languages have a way of conveying that flavor. Japanese would convey the same attitude toward, say, pink meat with a collection of words like nante and nado. In the Native American language Klamath of the Pacific Northwest there was a prefix that did the same job. There’s always something.

Then, plenty of unwritten languages have words for only a few colors, with pink certainly not one of them. In the 1960s at the University of California at Berkeley, linguist-anthropologists Brent Berlin and Paul Kay discovered that color terms emerge in languages in a rough order. After black and white comes red, then green and yellow, then blue, then brown, and only after them, purple, pink, orange, and gray. That is, there is no such thing as a language with words for only black, yellow, and pink, or even black, white, and green.

In this light, it has been noted that Homer tossed off bizarre usages of color, such as references to not only wine-dark seas but wine-dark oxen, green honey, and blue hair. There was an early temptation to attribute this to Homer’s reputed blindness, but then sighted Greeks were given to similar oddities, such as Euripides’s green tears. Are we really to suppose that these hypersensitive artists did not see the colors we do?

The philosopher Empedocles gave the game away in dividing colors into what we process as an oddly spare palette: light, dark, red, and yellow. That is exactly what Berlin and Kay’s flowchart predicts of a society that has yet to develop a prolific set of conventionalized terms for colors: black, white, red, and then yellow or green.

Yet Whorfian thought, with its Russian blues findings, teaches us to wonder whether the Ancient Greeks, as well as the peoples today with few color terms, actually processed color differently than we do. Does the Jersey City kid see pink flamingos and cherry blossoms as more distinctly un-red than Homer and Empedocles could have? Yet just as a difference of 124 milliseconds is hard to see as demonstrating a different way of seeing the world, it’s hard to believe that our Jersey City kid was imagining that pink meat he spoke of more “pinkly” than, say, Old English speakers would have perceived undercooked meat, despite the fact that they didn’t have a word for pink yet either.

At the end of our sentence not a single thing has seemed able to tell us much about how its speaker thinks. On meat, we might try the cryptotype route again: many African languages have the same word for both animal and meat. One may think first of people pointing to “meats” running around the savanna, but more properly, it’s that these people see themselves as eating “animal.” They do not make our prim distinction between living creatures and pieces of them sitting on our plate.

Is this a sign of a Western remove from the mundane reality that animal slaughter is required for our culinary delight? One is often taught so, in that English inherited from the French euphemistic distinctions like beef for cow on the plate and pork for pig on the plate. Now, truth be told, it would appear that this would classify as culture shaping language, not the other way around—but there is always the chicken and egg question, as well as the surmise that it might “go both ways.”

As such, there is inconvenient data beyond France and Africa no matter how we approach the subject. Generally human groups do not have the same word for animal and meat. More typically, humans of all societal types have a word for animal flesh as distinct from the living animals. In fact, even Old English had this basic distinction between beasts and meat, despite the fact that its speakers tended to be much more intimately familiar with animal slaughter than anyone in Jersey City! It is, rather, the Africans in question among whom that particular bubble in the soup happens not to have burbled up, just as in English we do not happen to keep knowledge of people and knowledge of things separate.

What’s Significant?

Dey try to cook it too fast, I’m-a be eatin’ some pink meat! He said it. He was expressing a thought. And an attempt to Whorfianize it simply doesn’t work. Not a single element in this boy’s utterance can be scientifically identified as distinguishing how he thinks from how his Mongolian or Peruvian equivalent does.

An objection that my approach in this chapter caricatures Whorfianism is as implausible as it is likely. It would be hard to say that the sentence I used is unrepresentative of English or of normal thought. As casual and even humorous as it may be, it is language, pure and simple, replete with exactly the kinds of semantic and grammatical categories that have fascinated Whorfians since the Roosevelt administration. Hypotheticality, tense, color terms, classification of objects—it’s all there. Might Whorfianism seem more intuitively applicable to that same sentence about the pink meat if it had been uttered by a farmer in the hills of Vietnam? It shouldn’t: surely one need not be a Southeast Asian, Native American, or Amazonian to have a “worldview.” Surely, that is, our Jersey City teen is a human being, with both thoughts and a language potentially shaping him.

Yet we have seen that it doesn’t hold up that his specific way of experiencing life is channeled by how his language happens to play out. I must repeat: we can assume that English does have minute effects on his thought; on this, the data is in, from the best Whorfian experiments, as I acknowledged in chapter 1. My interest is in the implication drawn from this kind of work, partly by interested writers and partly by researchers themselves, that these perceptual differences amount to significantly different ways of experiencing existence.

“Who’s to decide what’s significant?” one rightly asks. Yet it must be clear, for example, that if English conditions a worldview, then that has to be a worldview that encompasses the frames of reference of that Jersey City boy, Mary Tyler Moore, Margaret Cho, William Jennings Bryan, and Sting. What’s significant?—well, not whatever it is that unites the way those five people have processed life, I suspect most would agree. Clearly, that is a worldview so general as to be equivalent, essentially, to simply being human.

Is it that for some reason languages spoken as widely as English stop conditioning worldviews? For one, that would automatically disqualify Russian and Chinese from Whorfian experiments, as they are spoken by people of hundreds of distinct cultures. Yet, let’s imagine a proposition that it’s the “National Geographic” languages spoken by people of just one culture that shape thought. But why would language only shape thought among small groups? At what point in cultural development would we posit Whorfianism to peter out—and on what basis?

Besides, it’s hard to see just how disqualifying big languages would work. Would the English someone speaks in New Delhi not condition a specifically English-conditioned worldview in her because English is also spoken in Sydney and Spokane? Surely the axiom cannot be “language shapes thought unless the language is spoken by different groups of people.” The language in someone’s head cannot “know” that it is spoken in heads on the other side of the world. Presumably a language shapes thoughts in whatever head it finds itself in.

One suspects that if anything is shaping a worldview for that English speaker in New Delhi, it is her specific culture, not how the verbs in the English she speaks—the same verbs used daily in London, Chicago, and Jersey City—work. Here, if we are seeking to glean some overarching sense in things—that is, to do science—it becomes attractive to suppose that culture is what shapes thought not only for the woman in New Delhi but also for the speaker of a local, obscure language. An analysis that covers everything, small languages and big ones, is that what shapes worldviews is culture, with how a people’s grammar works having nothing significant to do with it. In the scientific sense, if language isn’t shaping thought significantly on the streets of Jersey City, it isn’t doing it in the Amazonian rain forest either.


Doing science indeed: Is science the bedrock of the almost narcotic appeal so many find in Whorfianism? One wonders. We are to think that the goal is simply the empirical one of investigating whether language shapes thought. Yet science never seems to quite seal the deal on this beyond tiny differentials gleaned from deeply artificial experiments. Recall that Guy Deutscher’s Through the Language Glass is, properly speaking, a chronicle of the failure of a paradigm, yielding squeaks rather than peaks.

Notwithstanding, the media taught the reading public that his book had less investigated whether language shapes thought than confirmed that it does. And this was just a symptom of a general orientation: the media as well as academia continue to promulgate the idea that the question as to whether each language is a special pair of lenses is an open one. The very prospect of Whorfianism gets people going like a call to dinner.

Yet some Whorfianism goes over better than other Whorfianism. If Alfred Bloom had written a book claiming that Chinese makes its speakers more insightful in some ways than English speakers, he might well have won a Pulitzer.

There’s a reason. It illuminates the core of Whorfianism’s sexiness, and yet is also antithetical to an authentic and respectful exploration of the human condition.