Chomsky: The Plato of Linguistics
¶
"There is an element of grim irony in the emergence of Linguistic Analysis on the philosophical scene. The assault on man’s conceptual faculty has been accelerating since Kant, widening the breach between man’s mind and reality. The cognitive function of concepts was undercut by a series of grotesque devices—such, for instance, as the “analytic-synthetic” dichotomy which, by a route of tortuous circumlocutions and equivocations, leads to the dogma that a “necessarily” true proposition cannot be factual, and a factual proposition cannot be “necessarily” true. The crass skepticism and epistemological cynicism of Kant’s influence have been seeping from the universities to the arts, the sciences, the industries, the legislatures, saturating our culture, decomposing language and thought. If ever there was a need for a Herculean philosophical effort to clean up the Kantian stables—particularly, to redeem language by establishing objective criteria of meaning and definition, which average men could not attempt—the time was now. As if sensing that need, Linguistic Analysis came on the scene for the avowed purpose of “clarifying” language—and proceeded to declare that the meaning of concepts is determined in the minds of average men, and that the job of philosophers consists of observing and reporting on how people use words.
The reductio ad absurdum of a long line of mini-Kantians, such as pragmatists and positivists, Linguistic Analysis holds that words are an arbitrary social product immune from any principles or standards, an irreducible primary not subject to inquiry about its origin or purpose—and that we can “dissolve” all philosophical problems by “clarifying” the use of these arbitrary, causeless, meaningless sounds which hold ultimate power over reality. . . .
Proceeding from the premise that words (concepts) are created by whim, Linguistic Analysis offers us a choice of whims: individual or collective. It declares that there are two kinds of definitions: “stipulative,” which may be anything anyone chooses, and “reportive,” which are ascertained by polls of popular use.
As reporters, linguistic analysts were accurate: Wittgenstein’s theory that a concept refers to a conglomeration of things vaguely tied together by a “family resemblance” is a perfect description of the state of a mind out of focus."
Ayn Rand, Intro. to Objectivist Epistemology
so, is there a "universal language" programmed into us at birth? Discuss
So why has he publicly expressed enthusiasm for Hamas, Cuba, North Korea, Maoist China, the former Soviet Union, and French Holocaust-denial journals (the last under the excuse of "academic freedom")?
No, he is not an opponent of state power. He is an opponent of WESTERN state power, even the classical liberal goal of a minimalist state.
Politically, the guy is simply a wingnut. Philosophically, the guy is simply an innate-ideas Platonist. Linguistically, the guy is simply wrong – BUT, his significance as a linguist lies in the fact that he was the first to ask two important questions regarding natural languages, both interrelated:
1) How does a human being acquire his natural language, given the (presumed) paucity of linguistic material for him to hear and mimic from adults? The simplified fragments and cooing that mothers direct toward infants are referred to in linguistics as "motherese", and Chomsky claims that this material, by itself, would not be enough for an infant to acquire the correct rules of syntactic construction that allow him to create, in just a few years, an unlimited number of "well formed" (i.e., grammatical) sentences.
2) Closely related to this is a problem Chomsky viewed as one of human mental computing: given an implicit knowledge of a few rules of syntax, how can the young child — in a very short amount of time — express himself by means of completely new sentences, in limitless variety?
Chomsky's background was in mathematics, so he analogized this sort of computation to a simple mathematical equation, e.g., x^2 + y^2 = 1, the equation for a circle of radius 1 (in any unit you choose). The continuity of the real numbers guarantees that infinitely many pairs of x's and y's satisfy that equation, which is why a circle, though BOUNDED and FINITE in every respect (radius, circumference, area, etc.), nevertheless comprises an infinite number of points; there are no mathematical gaps anywhere around the circumference. Chomsky likened the equation to a syntactic rule in grammar, and the points around the circumference to the individual sentences satisfying that rule, each one a "well formed" (i.e., grammatically correct) sentence.
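The circle analogy (a finite rule, an infinite set of instances) can be made concrete with a toy phrase-structure grammar: a handful of rules, but because an NP can contain a PP which contains another NP, the set of derivable sentences is unbounded. A minimal sketch (the grammar and all names are my own invention, purely for illustration, not Chomsky's actual formalism):

```python
import random

# Toy phrase-structure grammar: finitely many rules, unboundedly many
# sentences, because NP's second rule recurses through PP back to NP.
# Purely illustrative; not Chomsky's actual formalism.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "PP"]],   # second option recurses via PP
    "VP": [["V", "NP"]],
    "PP": [["P", "NP"]],
    "N":  [["dog"], ["cat"], ["park"]],
    "V":  [["saw"], ["chased"]],
    "P":  [["near"], ["behind"]],
}

def generate(symbol="S", depth=0, max_depth=4):
    """Expand a symbol by choosing one of its rules at random; past
    max_depth, take each symbol's first (non-recursive) rule so the
    demo always terminates."""
    if symbol not in GRAMMAR:
        return [symbol]                     # terminal word
    rules = GRAMMAR[symbol]
    rule = rules[0] if depth >= max_depth else random.choice(rules)
    words = []
    for sym in rule:
        words.extend(generate(sym, depth + 1, max_depth))
    return words

random.seed(0)
for _ in range(3):
    print(" ".join(generate()))   # three random well-formed toy sentences
```

Seven rule groups, yet arbitrarily long sentences ("the dog near the cat behind the park chased the dog...") fall out of them, just as infinitely many points fall out of one short equation.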
I don't regard Chomsky's questions as trivial, though I do regard his answers as wrong, and his approach to answering them as, frankly, unprofessional. He used to claim that linguistics was a "science just like any other science" (e.g., physics, chemistry); yet when Ved Mehta interviewed him in 1971 for The New Yorker and asked whether he had any experimental evidence to support his theories about innate grammar, Chomsky apparently snorted and replied, "Ha! I hate experiments!"
"I hate experiments"? And he calls himself a scientist after the pattern of physicists and chemists? I think not.
It's probably behind a pay wall, but you can find The New Yorker interview (May 8, 1971) here:
http://www.newyorker.com/archive/1971/05...
Ved Mehta interviews Noam Chomsky
The New Yorker Magazine, 1971
"JOHN IS EASY TO PLEASE"
by Ved Mehta May 8, 1971
For excellent critiques of Chomsky, his admirer Steven Pinker, and the idea of inborn universal grammar (known in the biz as "the theory of linguistic nativism") see the works of British linguist (and libertarian, by the way) Geoffrey Sampson:
http://www.grsampson.net/BLID.html
the language-instinct debate
http://www.grsampson.net/
home page
I have a few more things I'll post shortly. In the meantime:
Linguist Frederick Newmeyer writes about Chomsky in "The Politics of Linguistics" (1986, University of Chicago Press). He has some pretty eye-opening things to say.
Chomsky was originally working as a mathematician in the field of computer science after WWII, his work being funded by the US military (!). The research, apparently, had to do with creating voice-controlled weapons systems (e.g., a pilot would simply say "fire!" and a missile would launch). After reviewing the work done by Chomsky and his group at MIT, the military apparently said, "Useless. Thanks but no thanks," and cut off funding. It was at that point that Chomsky became a radical leftist and resuscitated his ("useless") theories of machine language for weapons systems as a new theory of "transformational grammar".
Coincidence?
Yeah, right.
Chomsky is not a brilliant eccentric; he is wrong, mad, and bad.
Thanks dk and h for the link.
Re Kant, what little I understood I did not like.
Wittgenstein: generally obscure, but I do like his statement about the meaning of words: "words have meaning only in the stream of life." I do not see a contradiction here with your point as I understand it.
Citing Chomsky, Economic Freedom asked: "How does a human infant learn to hone in on ONLY the area of a pin-head and ignore as "ungrammatical" the Jupiter-sized area of POSSIBLE combinations, especially given the fact that number of possible gibberish combinations is so much larger than the number of well formed constructions?"
I suggest that hardly any "random" sounds occur in the environment. In a "natural" state (rural village), an infant is surrounded by animal calls as well as human language, but few if any random noises. Even the winds in the leaves produce sibilants. In the city, an emergency vehicle's siren sends a very clear message, speaking to the deepest roots of our commonest understanding. Having grown up in America, the first time I heard a European police car, I did not mistake it for the ice cream cart. Language is very, very deep.
"The Case Against B. F. Skinner" (1971) by Noam Chomsky here:
http://www.chomsky.info/articles/1971123...
Chomsky's 1959 review of Skinner's "Verbal Behavior"
http://cogprints.org/1148/1/chomsky.htm
Agreed. But Chomsky's interest was in the sound combinations possible in the English language, not those emanating from non-human sources (trees, wind, sirens, animals, etc.): take the 26 letters of the English alphabet plus a space (27 symbols) and an arbitrary sentence length of 100 characters. The total number of letter combinations is then 27^100, which is about 10^143, an unimaginably huge number. The majority of those combinations, e.g., "qxxzpkmnn...", are gibberish and not actually encountered in real usage by speakers of English. The number of grammatically well formed 100-character sentences in English is more like 10^25 (according to Chomsky's estimate).
Again:
1) Total combinations (grammatical + gibberish) = 27^100, about 10^143
2) Grammatical combinations only = about 10^25
3) Gibberish only = total minus grammatical = 10^143 - 10^25, a breathtakingly large number.
The riddle for Chomsky was how a human child learns to ignore the huge number "10^143 - 10^25" and manages to locate and use only the much smaller number "10^25."
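For what it's worth, the arithmetic is easy to check in any language with arbitrary-precision integers. A quick sketch (the 10^25 figure for well-formed sentences is simply the estimate cited here, taken on faith):

```python
import math

ALPHABET = 27   # 26 letters plus a space
LENGTH = 100    # the arbitrary sentence length used in the example

total = ALPHABET ** LENGTH        # every possible 100-character string
grammatical = 10 ** 25            # the cited estimate of well-formed sentences

# 27^100 is about 10^143 (not a figure to trust to memory; compute it):
print(math.floor(math.log10(total)))                    # -> 143

# Gibberish outnumbers grammar by roughly 118 orders of magnitude:
print(math.floor(math.log10(total // grammatical)))     # -> 118
```

So a blind search testing even a billion strings per second would, on average, need on the order of 10^109 seconds per grammatical hit, vastly longer than the age of the universe.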
He and a linguistics colleague (Morris Halle) asked how many sentences could be formed in English comprising 100 characters (26 letters plus a space, so 27 symbols in all). Stated that way, it's a simple counting problem: there are 27 choices for the first character, times 27 choices for the second character, times 27 choices for the third, and so on. The total number of possible combinations is therefore 27^100, which is approximately 10^143. But let's face it: most of those combinations are going to be gibberish. To narrow the count down to the non-gibberish, we have to impose some linguistic constraints: for example, a sentence can't begin with a space; "e" appears more frequently than other letters; after "q", "u" follows with a probability of 100%; etc. After including the constraints they thought reasonable for English, they put the total number of WELL FORMED sentences in English (i.e., syntactically and grammatically correct constructions) comprising 100 characters, with an alphabet of 27 symbols, at about 10^25.
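To see how such constraints prune the space, here is a toy filter over a deliberately tiny alphabet and length (the rules below are illustrative stand-ins of my own devising, not Chomsky and Halle's actual constraints):

```python
from itertools import product

# Tiny illustration of how even crude constraints prune the string space.
# Alphabet of 4 letters plus a space keeps the full enumeration small.
ALPHABET = "abqu "
LENGTH = 4

def admissible(s):
    """Illustrative constraints, loosely echoing the examples in the text."""
    if s.startswith(" ") or s.endswith(" "):
        return False              # no leading or trailing space
    if "  " in s:
        return False              # no double spaces
    for i, ch in enumerate(s):    # every "q" must be followed by "u"
        if ch == "q" and (i == len(s) - 1 or s[i + 1] != "u"):
            return False
    return True

strings = ["".join(p) for p in product(ALPHABET, repeat=LENGTH)]
kept = [s for s in strings if admissible(s)]
print(len(strings), len(kept))   # the constraints discard a large share of 625
```

Scale the same idea up to 27 symbols and 100 characters and the surviving fraction collapses by well over a hundred orders of magnitude, which is the whole point of the estimate.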
Assuming this is correct just for the sake of argument, 10^25 is still a HUGE number . . . BUT it's smaller than the original total of 10^143 by some 118 orders of magnitude.
Think of it this way: if 10^143 (the total possible number of combinations) were represented by the area of a planet the size of Jupiter, then 10^25, by way of comparison, would be an area the size of a pin-head; the area of Jupiter minus the area of the pin-head is the immense region of ungrammatical, non-syntactical gibberish (again, given our example: English, 27 symbols, and a sentence 100 characters long).
Here's Chomsky's (and Halle's) question:
How does a human infant learn to hone in on ONLY the area of a pin-head and ignore as "ungrammatical" the Jupiter-sized area of POSSIBLE combinations, especially given the fact that number of possible gibberish combinations is so much larger than the number of well formed constructions?
It's an interesting insight, and a perfectly valid question. Quite reasonably, Chomsky argued that the language acquisition process (whether self-consciously directed by the human infant or not) could not be a Darwinian one of trial-and-error plus selection, discarding those combinations that fail to elicit a survival-enhancing response from adults; again, there are simply far too many gibberish combinations to sort through and not enough time. This much is true. But Chomsky's answer was to revive the old (and discredited) Platonic notion of Innate Ideas: the rules of syntax and grammatical construction are not searched for at all. They are inborn and "hardwired" into the brain, in the mental equivalent of machine code. Steven Pinker (an admirer of Chomsky's) calls this "mentalese." The idea is that the infant's mentalese collides with the particular concrete sound-formations (phonemes and morphemes) of its particular culture; these cultural sounds, combined with the innate mental syntactic structures, form the child's native language.
Again, the insight is interesting, and the question (how does the human mind, and an infant's mind at that, locate the meaningful pinhead of grammatical combinations when it's surrounded by a Jupiter-sized area of ungrammatical ones?) is a legitimate one. The search obviously cannot be a so-called "random walk" or blind statistical search (trial, error, lucky trial, selection), because the number of possible combinations is simply too large. Chance, therefore, is ruled out.
Chomsky's answer, of course, was to jump headlong into determinism: grammar is a neurological given, just as bone marrow is a biological given. Chomsky's admirer, Steven Pinker, even claims in his book, "The Language Instinct", that there are specific genes governing the transmission of knowledge about parts of speech, e.g., a "preposition gene" (an imbecilic notion, frankly). Linguist Geoffrey Sampson (mentioned above) clearly and thoroughly debunks these arguments in several of his books, most notably in "Educating Eve: The Language Instinct Debate". See following link for synopsis:
http://en.wikipedia.org/wiki/Educating_E...
PPS - It is important to note that Chomsky is credited with completely devastating B. F. Skinner's behaviorism. Again, see above: nice conclusions; faulty premises.
When I look at other areas, such as philology, the scholars there assume some basic knowledge of more than English, and unless they are writing for academics, they explain the examples from other languages. So if they say, "This is like Latin...," they expect you to know that much, but if the example is in Turkish or Urdu, they explain it after they show it. The point being: if Chomsky wanted to demonstrate a universal grammar, he had an audience capable of following his empirical evidence. He seemed not to have any.
Consider reconstructed hypothetical languages such as Proto-Indo-European. The evolution from PIE to Modern English just BEGS for a Chomskyan explanation of what universal grammar is and how it works. Yet no one who actively reconstructs lost languages seems to make any use of Chomsky's theories.
Some philologists claim to have at least an inkling of "Nostratic," a proposed common ancestor of many Eurasian and African language families. I never find any reference to Chomsky's universal grammar in any of these works.
From what I could follow of Chomsky's books, he sets up an algebra-like system of symbols. However, he never formally defines his operators or operations. He just seems to use letters and other symbols as a shorthand, like "I (heart) U."
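For contrast, here is what formally defining the operation would look like: the arrow in a rule such as "S → NP VP" can be given an explicit, checkable meaning in a dozen lines. (A sketch of my own, not Chomsky's notation; rules are simply (left-hand side, right-hand side) pairs.)

```python
# An explicit definition of the rewrite operation "A -> w" used in
# phrase-structure grammars: replace the leftmost occurrence of a
# nonterminal in a sentential form with the rule's right-hand side.
# (Names and representation are my own, purely illustrative.)

def rewrite(form, rule):
    """Apply rule (lhs, rhs) to the leftmost occurrence of lhs in form;
    if lhs does not occur, the form is returned unchanged."""
    lhs, rhs = rule
    for i, sym in enumerate(form):
        if sym == lhs:
            return form[:i] + tuple(rhs) + form[i + 1:]
    return form

# A derivation: S => NP VP => the N VP => the dog VP => the dog V => ...
form = ("S",)
for rule in [("S", ("NP", "VP")), ("NP", ("the", "N")), ("N", ("dog",)),
             ("VP", ("V",)), ("V", ("barks",))]:
    form = rewrite(form, rule)
print(" ".join(form))   # -> the dog barks
```

Nothing deep, but it shows that the operator can be pinned down precisely rather than left as suggestive shorthand.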
He is an impassioned opponent of state power. I remember his articles in the New York Review of Books from the Vietnam War days of 1967-1968. (I had a professor for an honors English class who used to read them to us. I kid you not.) That leaves him at best like Karl Popper: we can agree with what he says, politically, but not why he says it, philosophically.
Perhaps an interesting counterpoint is the nominally "conservative" S. I. Hayakawa, an expert in linguistics and semantics who, as a university president, stood on an automobile rooftop with a megaphone to call campus protesters "little bastards." They might have been; but it just seemed that a world-famous expert in communication would have said something more constructive... if his study had anything constructive to offer... or maybe it was just the fact that passions submerge reason. We may never know.