NL-KR Digest             (6/03/87 17:22:19)            Volume 2 Number 51 

Today's Topics:
V2.N49
Re: In layman's terms.
winograd 2
parsing languages with free word order
syntax and NL
Re: more than one paradigm in linguistics

----------------------------------------------------------------------

Date: Wed, 3 Jun 87 00:49 EDT
From: Brad Miller <miller@ACORN.CS.ROCHESTER.EDU>
Subject: V2.N49

V2.N49 did not appear, due to a machine crash (the digester named it
V2.N50 instead).

Brad Miller
nl-kr-request@cs.rochester.edu
------
miller@cs.rochester.edu
miller@acorn.cs.rochester.edu

------------------------------

Date: Mon, 1 Jun 87 10:01 EDT
From: John Carl Zeigler <jcz@sas.UUCP>
Subject: Re: In layman's terms.

It is not be necessary, nor sumptimes suffic'nt, foah an Englizt
sentence to simptatically correct in oder to habe semantic content.
A'int Broca's area wunnerful?

--jcz
John Carl Zeigler
SAS Institute Inc.
Cary, NC 27511 (919) 467-8000 ...!mcnc!rti-sel!sas!jcz

------------------------------

Date: Mon, 1 Jun 87 18:10 EDT
From: William J. Rapaport <rapaport%buffalo.csnet@RELAY.CS.NET>
Subject: winograd 2

I've been told by the CS editor at Addison-Wesley that Winograd's second
volume will not be published. It exists in manuscript, but it's out of
date and Winograd apparently doesn't have the desire/time/energy/whatever
to bring it up to date. I suggested to the editor that it be published as
is, with suitable disclaimers, but he vetoed that. I've seen the manuscript.
It's wonderful, though 10 years old.

------------------------------

Date: Tue, 2 Jun 87 09:54 EDT
From: Elizabeth Hinkelman <eliz@cs.rochester.edu>
Subject: parsing languages with free word order

BNF can't capture languages with free word order nicely, if at all.
What do the pros do about it, overgenerate?
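To make the combinatorics concrete: a plain BNF production fixes one linear order, so covering free word order means writing a separate production for every permutation of the constituents. A minimal sketch (the constituent labels here are illustrative, not from any particular grammar):

```python
from itertools import permutations

# Three constituents of a hypothetical free-word-order clause.
constituents = ["Subj", "Obj", "Verb"]

# A fixed-order BNF production licenses exactly one ordering.
fixed_order_rules = [tuple(constituents)]

# To capture free word order in plain BNF, every permutation must be
# listed as a separate production for the same nonterminal: n! rules.
free_order_rules = list(permutations(constituents))

print(len(fixed_order_rules))  # 1
print(len(free_order_rules))   # 6 productions for just 3 constituents
```

The factorial blowup is why formalisms that separate immediate dominance from linear precedence (as in GPSG's ID/LP format) are usually preferred over raw enumeration.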

Elizabeth Hinkelman

------------------------------

Date: Wed, 3 Jun 87 14:44 EDT
From: Kent Wittenburg <HI.WITTENBURG@MCC.COM>
Subject: syntax and NL

I wanted to mention, in connection with the discussion on this net about
syntactic theories, that there has been a recent revival of another
nontransformational approach to syntax besides Generalized/Head Phrase
Structure Grammar and Lexical Functional Grammar, namely, Categorial
Grammar. The revival of this venerable approach to syntax has had a
distinctly computational flavor. There are working grammars as parts of
NL systems at at least the following sites that encompass some version
of "Categorial Unification Grammar": SRI, MCC, University of Edinburgh
(under Alvey and Esprit grants), IBM Germany (since Hans Uszkoreit's
move), and CGE, an Esprit funded project in France.
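For readers unfamiliar with the formalism, the core of classical Categorial Grammar is just functional application over categories of the form X/Y (seeks Y to the right) and X\Y (seeks Y to the left). The following toy sketch illustrates the combination rules only; it is not drawn from any of the systems cited above, and the naive string handling is adequate only for the small lexicon shown:

```python
def strip_parens(cat):
    """Drop one pair of outer parentheses: '(S\\NP)' -> 'S\\NP'."""
    if cat.startswith("(") and cat.endswith(")"):
        return cat[1:-1]
    return cat

def forward_apply(left, right):
    """Forward application: X/Y followed by Y yields X."""
    if "/" in left:
        result, arg = left.rsplit("/", 1)
        if arg == right:
            return strip_parens(result)
    return None

def backward_apply(left, right):
    """Backward application: Y followed by X\\Y yields X."""
    if "\\" in right:
        result, arg = right.rsplit("\\", 1)
        if arg == left:
            return strip_parens(result)
    return None

# An invented toy lexicon (result-leftmost notation).
lexicon = {"Kim": "NP", "sleeps": "S\\NP", "sees": "(S\\NP)/NP", "Lee": "NP"}

# "Kim sleeps": NP + S\NP -> S by backward application.
print(backward_apply(lexicon["Kim"], lexicon["sleeps"]))  # S

# "Kim sees Lee": first (S\NP)/NP + NP -> S\NP, then NP + S\NP -> S.
vp = forward_apply(lexicon["sees"], lexicon["Lee"])
print(vp)                                  # S\NP
print(backward_apply(lexicon["Kim"], vp))  # S
```

The "unification" in Categorial Unification Grammar replaces this atomic string matching with unification over feature structures, but the derivational skeleton is the same.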

Unfortunately, there is no comprehensive introductory overview of this
work as of yet. However, there will be two papers on the topic of
parsing with these grammars in this year's ACL proceedings. Other
references include Mark Steedman's paper in Language 1985, Lauri
Karttunen's paper in a forthcoming volume "Alternative Conceptions of
Phrase Structure" edited by Mark Baltin, a paper by Hans Uszkoreit in
last year's Coling proceedings, several papers from the University of
Edinburgh including the volume "Categorial Grammar, Unification Grammar,
and Parsing," several tech reports available from MCC, and a paper
I wrote about a chart parser for these grammars in AAAI proceedings 1986.

I was also tempted to flame about some of the recent discussions of
linguistics on this net, but, sigh, these debates seem to serve little
purpose. Despite some of the comments on this net, my impression is
that appreciation of linguists and linguistics is on the rise within the
AI community. Why? Because progress in NL systems requires solutions to
many of the problems in natural language syntax, semantics, and
discourse that linguists have been concerned with for some time. This
is not to say that all such work is relevant, but much of it clearly is.

------------------------------

Date: Tue, 2 Jun 87 18:02 EDT
From: Bruce Nevin <bnevin@cch.bbn.com>
Subject: Re: more than one paradigm in linguistics

From: David Pesetsky <PESETSKY%cs.umass.edu@RELAY.CS.NET>

DP> What is meant by "relying on anecdotal evidence"? Judging from
DP> the rest of the paragraph, this means using as subject matter
DP> paradigms DRAWN FROM the syntactic data displayed by the
DP> language, rather than some "complete" corpus of syntactic data.
DP> We have no "complete" corpus, of course, for principled reasons.

For a more detailed specification of what I meant by `anecdotal evidence',
and for a description of an alternative that has been pursued with some
very interesting results, please read:

On the Failure of Generative Grammar
Maurice Gross, _Language_ 55.4:859-885 (1979)

(Readers of this list will I think appreciate this paper, and especially
Gross's engineering kind of approach and his understanding of how things
are done in other sciences. Last Fall, he told me that he has so far
seen no rejoinder to this paper, despite its prominent publication and a
title that one would think invited reply. He and his colleagues at the
LADL, University of Paris, have published a great deal since then, but
this is probably the most accessible piece-- except perhaps the very
brief paper in COLING84--and is also the one that addresses our present
concerns most directly.)

David Pesetsky sees two choices: an impossibly `complete' corpus, or
many and various disconnected example sets or paradigms. We see here a
third alternative, namely, a large database of `paradigms drawn from the
syntactic data displayed by the language', a database that is more
comprehensive and systematic than the constant flux of isolated examples
and counterexamples that one finds in the literature of Generative
grammar.

This contentious use of fragmentary examples and counterexamples to make
and refute grand claims has developed because of the explicit assumption
in the Generative paradigm that there are many `descriptively adequate'
grammars, and that significant results are to be reached by ruling out one
or another of two proposed grammars, moving thus step by polemical step
toward some future grammar that is not only descriptively adequate but
also explanatorily adequate. But no descriptively adequate grammar has
ever been considered in this process--only fragments of proposed rule
formalisms. The whole program is vacuous. The practice of claims and
refutations--more appropriate for philosophical disputation than for
science--lacks the methodological foundation that is claimed for it. It
is a political and social fact about the community of generative
linguists, but has no scientific status.

There is a reluctance to accumulate a large database for linguistics.
It seems `taxonomic' (dirty word--see below), and it is hard work.

[S]yntacticians have never thought it possible to accumulate
significant data in the form of large lists of combinations of
words, i.e. lists of sentences or of sentence types. This
hesitancy in the face of large amounts of data is unjustified;
note that the size of such lists would be considerably smaller
than the number of pictures taken daily from bubble chambers and
analysed by physicists. (op. cit. p. 879)

Gross and his colleagues set out to construct a generative grammar of
French. They created matrices, in which the rows are labeled with long
lists of verbs, adjectives, etc. (`predicative' words) and the columns
are labelled with simple sentential environments in which the given word
may occur (mark the intersection +) or not occur (mark it -). The
matrices can then be sorted by rows and by columns of + and - marks.
The results of examining this sort of empirical database--with no claim
that it is an exhaustive corpus, only that it is based on very
extensive, in principle exhaustive, lists of verbs, etc--are in many
ways surprising. They found that exceptions to rules were far more
pervasive than expected. Where they expected their sorting of rows and
columns to fall out into a nice subcategorization of the vocabulary, it
turned out instead that each verb, etc., is uniquely specified by the
simple syntactic environments in which it can occur or not occur. There
were other surprises that required them to depart sharply from the
Generative model.
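A toy version of the matrix methodology described above can make the surprise concrete. The verbs and environment labels below are invented for illustration (loosely echoing the LADL-style N0 V N1 notation), not taken from the actual tables:

```python
# Rows are verbs, columns are simple sentential environments,
# cells record acceptability (+) or unacceptability (-).
environments = ["N0 V", "N0 V N1", "N0 V that S", "N0 V N1 to V"]

matrix = {
    "sleep":   ["+", "-", "-", "-"],
    "see":     ["+", "+", "+", "-"],
    "believe": ["-", "+", "+", "+"],
    "devour":  ["-", "+", "-", "-"],
}

# Sort rows by their +/- signature so that similar verbs cluster.
for verb in sorted(matrix, key=lambda v: matrix[v]):
    print(f"{verb:8s}", " ".join(matrix[verb]))

# Gross's finding, in miniature: the row signatures do not collapse
# into a few neat subcategories; each verb's signature is unique.
signatures = [tuple(row) for row in matrix.values()]
print(len(set(signatures)) == len(signatures))  # True
```

In the real tables the same sorting over thousands of verbs and hundreds of environments is what failed to yield the expected clean subcategorization.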

DP> While we aim at the deepest explanations we can come [up] with,
DP> staring at 3000 sentences does not -- or has not -- revealed in
DP> a flash to anyone the nature of the grammar as a whole.

`Staring at' the patterns and lack of predicted patterns in a large matrix
of this sort (~12K verbs * ~200 environments, i.e. ~2.4 million candidate
sentences just for the verb matrices, not counting the matrices based on
other word classes and the matrices for frozen expressions) can have a
salutary effect. For example, in all
the voluminous literature on `raising', there is no mention of the fact
that sentences like (a) involve only a handful of verbs:

a. It seems to me that these little islands of understanding
are getting smaller and farther apart.
These little islands of understanding seem [to me] to be
getting smaller and farther apart.

b. I believe that these islands are diffusing.
I believe these islands to be diffusing.

(In French, where the problem is essentially the same, there are just 3
verbs like seem, vs more than 600 like believe.)

Placing this fact in a theory of the language as a whole, we must
recognize that sentences of type (a) are not productive in the language
today, whereas those of type (b) are productive and are liable to affect
new verbs and new constructions. The status of the former is almost
that of a linguistic fossil, and the formalism to account for it would
make most sense (and perhaps should best be couched) in context of the
grammar of an earlier period in the development of the language, with a
specifiable relation to the grammar of the current stage. (Issues of
language change do not, however, have a prominent place in Generative
research.) The productive type (b) obviously deserves much more
attention whatever the status of type (a), and contentions e.g. whether
both involve transformations or one involves a different formalism (the
Chomsky-Postal polemic of the early 1970s) appear in this context
embarrassingly close to what Freud called the narcissism of small
differences. This controversy, while not recent, is paradigmatic of a
loss of perspective that recurs simply because issues and `facts' are
defined on too narrow an empirical base.

Another `fact' about which floods of ink have been spilled is so-called
`aspirated h' and liaison in French, which can supposedly motivate (or
refute) a cycle in French grammar. This is in fact only an artifact of
a completely artificial policy of the French educational system. It
demonstrates absolutely nothing one way or another about hypothesised
characteristics of an innate Universal Grammar. (See the Gross paper
cited above, pp 868-9 and citations.)

DP> In calling for a "complete generative grammar", Nevin is stating
DP> the goals of the field -- where by "complete generative grammar"
DP> we mean a full description of the contribution of the language
DP> faculty (Universal Grammar) and of the parameters/rules supplied
DP> by experience to the syntactic competence of the adult.

No, I meant `comprehensive coverage', comparable to that of
Jespersen's _A Modern English Grammar on Historical Principles_.

The book by Harris, _A Grammar of English on Mathematical Principles_,
describes an explanatory grammar of English that is `complete' in this
sense.

( Aside: This grammar is of course generative in the original sense )
( of the term, but not `Generative' insofar as that term has become a )
( mere trademark for a particular school of philosophy. When I say )
( that there is no complete Generative grammar of any language, I )
( intend the term in the second, sociology-of-science sense. )

There are two senses of `complete' here.

First, the sense in which David Pesetsky assumes I mean the term. Let
me refer to Harris's grammar and his theory of language as the
Constructive paradigm (as I proposed in my review of H's book), to
distinguish it from the Generative paradigm. No one has yet looked much
at the interpretation of the Constructive paradigm w.r.t.
neurophysiology, brain function, and cognition. That may be in part
because many researchers in the cognitive sciences are basing hypotheses
on the theoretical constructs of the Generative paradigm (which in turn
involve strong presuppositions about the character and complexity of a
biologically innate, universal language faculty--note the circularity
there). But it is also because those characteristics of language that
are universal turn out not to be nearly so complex in the Constructive
paradigm as they appear to be in the Generative paradigm.

Note that the hypothesis of an innate language-acquisition device
depends upon two presuppositions: the notion that linguistic structure
is very complex, so that learning by children is difficult to explain,
and the notion (due to Piaget) that small children have limited
cognitive abilities, and that they are incapable of learning structures
that are complex but learnable for adults. The complexity goes away in
the Constructive paradigm (examine it for yourself), and children have
turned out in more recent research to have much greater cognitive
abilities than Piaget recognized.

In broad brush strokes, those who follow Fodor in advocating notions of
modularity of mind, with specialized, genetically endowed functions in
each module, assume the correctness of the theoretical constructs of the
Generative paradigm (or at least find them congenial), and those who
argue for simple, general-purpose information-processing functions
operating in all of the postulated modules alike would find the
Constructive paradigm congenial.

In the same sense of the word `complete', but looking in another
direction, Harris's grammar in the 1982 book is also (and more
substantively) incomplete in that it does not describe *all* the details
of language variation and ongoing language change (though it gives means
for doing so in direct and obvious ways, and its explicit coverage is
quite astonishing). Also, the crucial matter of sublanguages and
discourse structure (and related questions of lexicography and semantic
representation) is only touched on here, to be taken up in greater
detail in:

Harris, Gottfried, Ryckman, et al., _The Form of Information In
Science: A Test Case in Immunology_, Boston Studies in the
Philosophy of Science, Reidel (forthcoming)

This work should be of special interest to this forum, since it
describes and exemplifies an empirically based approach to knowledge
representation.

The second sense of the term `complete' should be clear by now. I do
not mean that Harris's grammar is exhaustive in diachronic,
sociolinguistic and dialectological detail, and certainly not that it
satisfies the predictions and presuppositions of a Rationalist, Realist
philosophy to which I do not subscribe, but rather that it encompasses
the whole of language syntax, and that it shows how all the subsystems
of rules and the structures that they generate fit together. This, I
contrast with the isolated examples and counterexamples that I referred
to as anecdotal evidence.

DP> if many or most
DP> attempts at "generalizing" a theory of some set of data to a larger
DP> set of data fail, I am not surprised or too disappointed. You have
DP> to expect this: I repeat, linguistics is hard

I believe that the Generative paradigm makes linguistic research
unnecessarily hard. Here is a proposition that I think will be
controversial:

Generative theory is overstructured, and many of the
difficulties it encounters are artifacts of the theory.

I doubt this is the appropriate forum to debate that proposition. In
any case, philosophical debate about it is bound to be less productive
than examination of the following existence proof: there does exist a
`generative' theory (lowercase g, not the trademark) that is far
simpler. To examine the evidence (the alternative theory), you must
shift from one paradigm (Generative Grammar) to another (what I called
`Constructive Grammar' in my 1984 review of Harris 1982). Paradigm
shifts are not easy. For help in making this shift, you might read:

Tom Ryckman, _Grammar and Information: An Investigation in
Linguistic Metatheory_, unpublished PhD dissertation, Columbia
University (1986).

This is surely available from University Microfilms. If you are unable
to find it, I could send you a copy in exchange for postage. Tom is
working this up for publication, but don't look for it in your bookstore
or library tomorrow. :-)

A couple of other references on the Constructive paradigm:

Harris, _Mathematical Structures of Language_,
Wiley/Interscience (1968)

Harris, _A Mathematical Approach to the Theory of Language_,
Oxford University Press (forthcoming)

Be aware that Gross's work is not based on Harris's operator grammar.
It is based on the earlier distributional/transformational paradigm. It
is therefore not a primary source for understanding this new paradigm,
though its data and many of its results are relatively easy to
transfer.

There are also other paradigms that are alive and well in linguistics.
For example, there are structural linguists who are quite intelligent
and reasonable scholars--and who know full well that the `taxonomic
linguistics' of the 1960s was a Generativist straw man that was very
wide of its purported mark. (Cf. the long monograph _American
Structuralism_ by Dell Hymes and John Fought, in the 1975 volume of
_Current Trends in Linguistics_, pp. 903-1176, as well as the early
chapters of the book by Ryckman cited above.) Orthodox Generativist
texts represent this as a dead horse, but the victory is merely a
political one, as Generativists became what they accused Structuralists
of being: a hegemony with a stranglehold on journals and appointments.
The actual merits and demerits of structuralism have scarcely been
discussed.

Bruce Nevin
bn@cch.bbn.com

(This is my own personal communication, and in no way expresses or
implies anything about the opinions of my employer, its clients, etc.)

------------------------------

End of NL-KR Digest
*******************
