Language and information

Lecture 4: The Nature of Language

4.3. Stages of development

[Audio recording]

Now, certain features of the structure as presented here have implications for the manner and the time order of steps in the development of language.

We'll consider first of all the arising of elementary sentences. In the analysis presented here, sentences are created by a partial ordering of words in respect to their combinability, that is, their meaning in combination. If so, words, and varied combinations of them, had to have existed before this partial ordering developed, since the sentences are a regularization of the combinations. Suppose some words get to be used more and more with words of a particular informal set, because it makes sense that way; for instance, eat might be used with word pairs like child and berry (I'm trying to be primitive, you see) more than eat would be used alone, and more than it would be used with pairs like walk, sleep, to say walk eats sleep or something of this sort. Then it is not hard to see that the appearance of eat would come to be especially connected with the appearance of those word-pairs, the ones that it does indeed occur with, to such an extent that each occurrence of eat would be understood as associated with the pairs that are in use with it, so much so that if it occurs without them, those words, or words like them, would be implicit: if one says eats, one would understand that somebody is eating something, and so forth. But further, once one is accustomed to what eat occurs with, a combination that is not said with it, like walk eats sleep, would not only be something that is not said; if it ever were said, it would be counter to expectation. It would in effect be wrong. This is standardization. That is to say, eat, we can see, was standardized; we know it because by now it has been standardized. It is standardized to the point of institutionalization, something which is not unknown in social behavior.
And since, for eat, the rejection of such words as walk was due to the lack of occasion for use (the lack of making sense), the requiring of other words, pairs like child and berry, was due to the fact that eat made sense with them and was being said about them. This, approximately, as we see it today, is the meaning of this relation of the partial order: the meaning of saying something about, or of predicating about.
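The standardization just described can be sketched as a small selection check. The word list, the level assignments, and the function name below are illustrative assumptions introduced for the sketch, not part of the lecture's actual analysis.

```python
# Illustrative sketch: a first-level operator like "eat" comes to expect
# zero-level words as its arguments, so "eat(child, berry)" is expected
# while "eats(walk, sleep)" is counter to expectation.
# The lexicon and levels here are invented for illustration.

LEVEL = {
    "child": 0, "berry": 0,           # zero-level words (take no arguments)
    "eat": 1, "walk": 1, "sleep": 1,  # first-level words (operate on zero-level words)
}

def acceptable(operator: str, args: tuple) -> bool:
    """A combination is expected only when the operator is first-level
    and every argument is of the zero level it has come to require."""
    if LEVEL[operator] != 1:
        return False
    return all(LEVEL[a] == 0 for a in args)

print(acceptable("eat", ("child", "berry")))  # True: expected combination
print(acceptable("eat", ("walk", "sleep")))   # False: 'walk eats sleep'
```

The point of the sketch is only that expectation is a property of combinations, not of the words in isolation.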

Now, so much for the first elementary sentences, which had to have had something existing before them, namely, a respectable stock of words and word-combinations.

Now, to take a jump to the dependence-on-dependence relation. This is something else again, because the dependence of a particular word, like eat, a first-level word, on particular other words, zero-level words like child and berry, is a relation among words. It could be viewed as a relation to a list of words: of eat to a certain list, of some other word to another list, and so forth. The dependence of a second-level word, such as continue, on a first-level word, as you see in the child's eating berries continued, for instance, is also a relation between words; but as we know language now, one can easily show that this dependence is not a dependence on lists, it is a dependence on the dependence properties of words. Now, this is a generalization of what was actually being done. If you look at the words, they do indeed have such lists, and the lists fit the dependence-on-dependence properties, but one does not define eat by a list of words. New words come in and go out, and so forth; one would not want to define eat by a list of words. It is defined by a property. The generalization of this property (not its realization, because nobody had to realize it) had to come after the actual practice, after the situation in which there were lists of words involved.
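The shift from lists to dependence properties can be sketched as follows. Each word is characterized not by which words it may combine with, but by the levels of the words it depends on; the dictionary entries, names, and the sample sentence mapping are assumptions made for the sketch.

```python
# Illustrative sketch of dependence on dependence properties: a word is
# defined by the levels of its arguments, not by a list of permitted
# argument words. All entries are invented for illustration.

DEPENDS_ON = {
    "child": (),       # zero-level: depends on nothing
    "berry": (),
    "eat": (0, 0),     # first-level: depends on two zero-level words
    "continue": (1,),  # second-level: depends on one first-level word
}

def level(word: str) -> int:
    """A word's level is one more than the highest level it depends on."""
    deps = DEPENDS_ON[word]
    return 0 if not deps else 1 + max(deps)

def well_formed(operator: str, args: list) -> bool:
    """Check the dependence properties of the arguments, not their
    identity: any word of the required level will do."""
    required = DEPENDS_ON[operator]
    return len(args) == len(required) and all(
        level(a) == r for a, r in zip(args, required)
    )

# "the child's eating berries continued": continue operates on eat.
print(well_formed("continue", ["eat"]))    # True: eat is first-level
print(well_formed("continue", ["berry"]))  # False: berry is zero-level
```

Because `well_formed` consults only levels, new words can enter or leave the vocabulary without redefining `continue`, which is the force of the generalization.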

We thus have stages in the development of language. There had to have been, as we just saw, words and word combinations before there were sentences. There had to have been elementary sentences before there were expanded sentences; that is to say, first-level words had to have been used in sentences before second-level words were used in sentences. How do we know? Because the expanded sentences contain the elementary sentences. They are not simply derived from them, they contain them. So the elementary sentences had to exist prior.

The dependence-on-dependence relation could only have been made after the practice had already come to exist, as we just saw. Reductions could only have become a general method after all unreduced base sentences were constructable. They did not all have to exist, but the method of constructing them had to exist. This is because reductions are made in a regular way on the word entry into the base sentences; otherwise, every reduction would be something new again. They are only generalizable and regularizable in respect to the structure of base sentences.

These are therefore stages in the formation of successively complex sentences.

But there are other constraints in language also, which we have not discussed because they go beyond the formation of sentences; we have only touched upon some of them. These are also information-bearing, and they could have developed only after sentences existed, even in reasonable complexity. I'll give one or two examples; there is only a small list of them altogether.

One type is the permuted linearizations, the ones I had mentioned: the bringing of a word to the front of a sentence to make it the topic of the sentence.

Another is the interrupting of a sentence after a given word by a subsidiary sentence which begins with the same word, making the subsidiary a modifier of the given word. This also is a real constraint, and it does something very important: it creates modifiers. It had to develop after sentences already existed with reasonable complexity.

Another kind of additional formation is the power of words to refer to words, which I discussed yesterday, which creates the meta-power of language: the metalanguage, the say operator, and referentials, pronouns.

Then there are constraints on sentence sequences. It will take me a minute or so to explain this, but it is worth the trouble. Take two sentences with a conjunction between them, like he went because it was late, or something like that. Consider such sentence pairs where there is no word repetition between the two sentences (no word is common to both) and where the resulting sentence is peculiar, or uncomfortable, or unlikely: for example, I am writing this paper because tomorrow is Wednesday, which may sound foolish (I could make more foolish examples). This kind of sentence can be made more acceptable by conjoining another sentence which contains one word from each of the two sentences, with some verb connecting them, so to speak explaining the connection, as in I am writing the paper because tomorrow is Wednesday and the paper is due Wednesday. The added and the paper is due Wednesday, which makes the whole thing quite sensible, repeats paper from the first sentence and Wednesday from the second, and connects them.

Now, so much for raising the acceptability of a conjoined sentence pair by bringing in word-repetition where it is absent. Now let us take the case of two sentences with a conjunction which also have no word repeated between them, but which are fully acceptable, like I am going home because it is late, or something of that sort. In these cases, one can show that there always exists an adjoinable sentence which would complete the repetition in the same way, by taking a word from each sentence and connecting them, but which presents known information and is therefore zeroable by the considerations of the first lecture. That would be as if I said I am going home because it is late, and when it is late one had better go home, or some such sentence about this. This, of course, is unnecessary because everybody knows it; but the fact that such a sentence is always available, always exists, shows that it could have been there and been zeroed. I don't mean, of course, that it had to have been there, that it really was there; we are discussing the potentialities of words and their relations, rather than the actual history of sentences, as though they had been worked out in full and then abbreviated.
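The word-repetition condition on the bridging sentence can be sketched as a simple check: does the added sentence repeat at least one content word from each conjunct? The whitespace tokenization and the small stop-word list are simplifying assumptions.

```python
# Illustrative sketch of the bridging condition: a third sentence raises
# the acceptability of a conjoined pair when it repeats one content word
# from each conjunct. Tokenization and the stop-word list are
# deliberately crude.

STOP = {"i", "am", "is", "the", "this", "because", "and", "it"}

def content_words(sentence: str) -> set:
    return {w.strip(".,").lower() for w in sentence.split()} - STOP

def bridges(s1: str, s2: str, s3: str) -> bool:
    """Does s3 repeat at least one content word from s1 and one from s2?"""
    w3 = content_words(s3)
    return bool(w3 & content_words(s1)) and bool(w3 & content_words(s2))

s1 = "I am writing this paper"
s2 = "tomorrow is Wednesday"
s3 = "the paper is due Wednesday"
print(bridges(s1, s2, s3))  # True: s3 repeats 'paper' and 'Wednesday'
```

For the fully acceptable pairs discussed above, the lecture's claim is that such a bridging sentence always exists but is zeroable as known information; the sketch only tests the repetition itself.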

Now, I can give another example. Consider all discourses, that is to say, all sequences of sentences which originated together (they were said or written by somebody), without regard to whether they are foolish or not, whether they make sense or not, in contrast to any arbitrary collection of sentences which was not 'born' together, such as one you pull out by taking the first sentence of every 60th page in the encyclopedia. One finds a regularity in all the ones that are created as a discourse. Now, of course, by 'all' I mean [something like] 300, but 300 is 'all'. In all such discourses, one can find a property of the sentence sequences: it is possible to establish at least two sets of words proper to the discourse, determined by the discourse itself, sets such that one or another word of each set repeats, and they repeat in a fixed grammatical relation to each other [of one set to the other]. This is not just repetition as we saw in the case of the two sentences with a conjunction between them; this is a much, much stronger condition. It is that there are sets of words with a fixed grammatical relation to each other such that this grammatical relation on these sets repeats very heavily: not in every sentence of the discourse, but very heavily in the discourse. Or it may repeat very heavily in a section of the discourse, and a later section of the discourse would have other word-sets in another grammatical relation. The thing is a fact. It is, in effect, a constraint on discourses. It is the meaning of being a discourse, as against being just a conglomeration of sentences.
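The discourse regularity can be sketched by pre-reducing each sentence to its main operator-argument pair and counting how often a relation between two discourse-proper word sets recurs. The discourse, the word sets, and the reduction to pairs are all invented for the sketch; real discourse analysis would have to establish the sets from the text itself.

```python
# Illustrative sketch of the discourse constraint: two word sets proper
# to the discourse recur in a fixed grammatical relation (here, operator
# applied to argument). The data are invented for illustration.

# Each sentence reduced to its main operator-argument relation.
discourse = [
    ("infect", "virus"), ("attack", "virus"), ("infect", "bacterium"),
    ("attack", "virus"), ("infect", "virus"), ("resist", "host"),
]

# Two word sets determined by (assumed for) this discourse:
set_A = {"infect", "attack"}    # operator set
set_B = {"virus", "bacterium"}  # argument set

# The relation 'a member of set_A operating on a member of set_B'
# repeats heavily, though not in every sentence:
hits = sum(1 for op, arg in discourse if op in set_A and arg in set_B)
print(f"{hits} of {len(discourse)} sentences instantiate the relation")
```

An arbitrary collection of sentences pulled from unrelated sources would show no such heavily repeating set-to-set relation, which is the point of the contrast drawn above.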

Now, there are yet further, later constraints, under specialized language conditions. I want to mention two or three. One is in mathematics, whose special power vis-à-vis language rests on dispensing with likelihood differences in operator-argument relations, and on the syntactic demands of truth, that is, that certain sentence-sequences can be described as preventing loss of truth-value. In science we have the replacement of likelihood differences by subsets of arguments and subsets of operators in a way that preserves relevance in the language, that is, it precludes nonsense. A somewhat different situation is the possibility, long discussed by Henry Hiż, of formulating a relation, ultimately structural, between a sentence and its consequences, in such a way as would characterize the special sets of such sentences, and would have a very important effect on what one can do semantically with grammar.