Chapter 7 - Phrase Structure Grammar
the same level. The Bloomfieldians’ preference for binary branching analyses like-
wise reemerges in later models of phrase structure, and their practice of extending
syntactic analysis below the word level, to include stems and inflectional forma-
tives, survives largely intact in the transformational tradition. Some other features
of IC analyses are less faithfully preserved. These include general properties such
as the recognition of discontinuous and overlapping constituents or the representa-
tion of intonation. More specific proposals, such as the classification of elements
(notably coordinating conjunctions) as ‘markers’ (Hockett 1958) were not rehabilitated until decades later (Gazdar et al. 1985; Sag et al. 1985; Pollard and Sag 1994). The encoding of dependency relations within a part-whole analysis (Nida 1966) was also suppressed until the development of feature-based models such as LFG (Kaplan and Bresnan 1982 and the LFG chapter of this volume) and HPSG (Pollard and Sag 1994 and the discussion below) that could explicitly express
valence dependencies within syntactic representations.
(Figure: IC analyses of the expressions ‘John is here’ and ‘John can go’.)
Bloomfieldians. While generally preferring continuous (and binary) analyses, they
also admitted a range of constructions that violated these preferences.
Most linguists operate on the principle that cuts will be made binary
whenever possible, but that cuts giving three or more ICs will not be
excluded a priori. In the same way, they will make cuts giving contin-
uous ICs wherever possible, but discontinuous ICs are not excluded on
principle. (Gleason 1965)
The descriptive challenges that arose in extending these formats to the descrip-
tion of discontinuous dependencies are illustrated by the representation of phrasal
verb constructions, which were taken to be discontinuous from at least Wells (1947).
(Figure: discontinuous IC analysis of ‘wake your friend up’, in which ‘wake’ and ‘up’ form a discontinuous constituent.)
analyses were interpreted as representing the successive segmentation of an expres-
sion into sub-expressions, each of which was annotated with a word class label and,
usually, other types of information. It was not until the early transformational ac-
counts that IC analyses were incorporated into explicit grammar formalisms rather
than treated as procedures of classification, and the fact that these procedures were
first formalized by the Bloomfieldians’ successors had the effect of simplifying them,
much as the Bloomfieldians had themselves simplified Bloomfield’s more intricate
constructional perspective (Manaster-Ramer and Kac 1990). In Chomsky (1957), phrase structure grammars are proposed as “the form of grammar [that] corresponds to [the] conception of linguistic structure” expressed by IC analysis. Chomsky’s insight consisted in recognizing how informal procedures for segmenting and
classifying expressions could be expressed by means of rules of the form A → ω
that would ‘rewrite’ a single word class label A by a string ω (which could consist
of labels along with words and formatives). Thus a rule such as S → NP VP would
rewrite a sentence S by a subject NP and a VP predicate, and the rule V → took would
classify took as a verb.
By starting with the sentence label ‘S’ and applying a sequence of phrase structure
rules, one could define a ‘derivation’ that terminated in the expression that would
be the starting point for procedures of IC analysis. The syntactic analysis assigned
to an expression by a phrase structure grammar was conventionally represented by
a phrase structure tree, though in Chomsky’s initial formulations, analyses are represented by stringsets that he termed phrase markers. These sets contain strings from equivalence classes of derivations differing from one another solely in that they apply the same rules in a different order (e.g., a derivation where the subject NP is rewritten before rewriting the VP and a second derivation where the VP is rewritten first).
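The rewriting conception is easy to make concrete. The following Python sketch is an illustrative assumption, not the chapter's own formalization: a toy grammar whose rules A → ω rewrite a single label by a string of labels and words, with a leftmost-rewriting policy. Rewriting the VP before the NP would apply the same rules in a different order and terminate in the same string, which is the sense in which derivations fall into equivalence classes.

```python
# Toy phrase structure grammar: each rule A -> omega rewrites a single
# label A by a string omega of labels and words.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["John"]],
    "VP": [["V", "NP"], ["V"]],
    "V":  [["took"], ["slept"]],
}

def derive(start, choose):
    """Rewrite the leftmost nonterminal until only words remain."""
    steps = [list(start)]
    while any(sym in RULES for sym in steps[-1]):
        current = list(steps[-1])
        i = next(k for k, sym in enumerate(current) if sym in RULES)
        current[i:i + 1] = choose(current[i])   # replace A by omega
        steps.append(current)
    return steps

first = lambda label: RULES[label][0]           # always pick the first expansion
print([" ".join(s) for s in derive(["S"], first)])
# ['S', 'NP VP', 'John VP', 'John V NP', 'John took NP', 'John took John']
```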
This [the treatment of ‘long components’ in the sense of Harris ]
is an important question, deserving a much fuller treatment, but it will
quickly lead into areas where the present formal apparatus may be inadequate. The difficult question of discontinuity is one such problem. Discontinuities are handled in the present treatment by construction of permutational mappings from P [the level of phrase structure, JPB/IAS] to W [the level of word structure, JPB/IAS], but it may turn out that they must ultimately be incorporated somehow into P itself. (Chomsky 1955)
The transformational tradition never did reconsider whether discontinuities could
be handled better within a phrase structure analysis and no general approach to this
issue was explored within constituency-based grammars until the development of
Head Grammars (Pollard 1984) and linearization-based models of HPSG (Reape 1994; Müller 1999, 2002; Kathol 2000). These models rehabilitated many of the same intuitions about syntactic and semantic units that had been explored in ‘wrapping’ analyses in the Montague grammar tradition, particularly in the accounts of Bach (1979) and Dowty (1982). However, Chomsky sought to reinforce the case
for ‘permutational mappings’ (i.e., transformations) by disputing the feasibility of
applying procedures of IC analysis to ‘derived’ constructions such as polar and in-
formation questions.
The case for indirect representation, not based on the relation of mem-
bership, becomes even stronger when we consider such sentences as
“did they see John” or “whom did they see”. […], i.e. no one would ask how they can be subdivided into two or three parts, each of which has several constituents, going on to use this subdivision as the basis for analysis of other sentences, and so on. Yet there is nothing in the formulation of principles of procedure for IC analysis that justifies excluding these sentences, or treating them somehow in terms of sentences already analyzed. (Chomsky 1957; emphasis added JPB/IAS)
This discrepancy between procedures of IC analysis and phrase structure gram-
mars is of more than purely historical interest. One of the criticisms levelled by
Chomsky against phrase structure grammars turned on their inability to represent
discontinuous dependencies, particularly within auxiliary verb phrases.
To put the same thing differently, in the auxiliary verb phrase we really have discontinuous elements … But discontinuities cannot be handled within [Σ, F] grammars [i.e. phrase structure grammars, JPB/IAS]. (Chomsky 1957)
‘nondistinctness’ condition on complex symbols in Chomsky (1965) anticipated
the unification operations of later constraint-based formalisms, this condition could
play no role in regulating the distribution of features within a projection.
(Figure: a slash-category analysis of an extraction clause, with a gap site ‘saw e’ linked to its filler through S[slash NP] categories.)
trast, the parallelism requirement on extraction from coordinate structures followed on a phrase structure analysis. Two conjuncts of category X[slash] were syntactically alike, whereas a conjunct of category X[slash] and one of category X were not. In the analysis in Figure ., the two conjuncts of category S[slash NP] are syntactically alike and can be conjoined, but neither could be conjoined with a full S to yield unacceptable examples such as *what Felix heard and Max saw the intruder or *what Felix heard the intruder and Max saw.
(Figure: coordination of two S[slash NP] conjuncts, ‘Felix heard e’ and ‘Max saw e’.)
Gazdar () also clarified how constraints on extraction, which had typically
been described in terms of conditions on rule application, could be recast in terms of
restrictions on the ‘paths’ of category-valued features that connected extraction sites
to dislocated fillers. In classical transformational accounts, there had been no rea-
son why information about missing constituents should trace a path along the con-
stituent structure links of a tree. But once extraction was characterized in terms of
the sharing of category-valued features along a sequence of mother-daughter links,
it became clear that any restrictions on the extraction of elements out of specified
‘island’ domains (Ross 1967) would correspond to paths in which those domains
occurred somewhere along the path between extraction sites and fillers.
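The path-based characterization can be sketched as follows. The tree encoding, node labels and the single boolean island flag are hypothetical simplifications of the text's picture (real accounts state island constraints over category-valued slash features, not a flag): extraction is licensed only if no island domain lies properly inside the path of mother-daughter links connecting the filler's scope to the gap.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    children: list = field(default_factory=list)
    island: bool = False            # stands in for a Ross-style island domain

def slash_path(node, gap):
    """Nodes on the mother-daughter path from `node` down to the gap."""
    if node is gap:
        return [node]
    for child in node.children:
        below = slash_path(child, gap)
        if below:
            return [node] + below
    return []

def extraction_ok(scope, gap):
    """Extraction is blocked if an island lies properly inside the path."""
    path = slash_path(scope, gap)
    return bool(path) and not any(n.island for n in path[1:-1])

gap1 = Node("NP-gap")
s1 = Node("S", [Node("NP"), Node("VP", [Node("V"), gap1])])
gap2 = Node("NP-gap")
s2 = Node("S", [Node("NP", [Node("Det"), Node("N"),
                            Node("RelCl", [Node("V"), gap2], island=True)])])
print(extraction_ok(s1, gap1), extraction_ok(s2, gap2))  # True False
```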
maximum expansion. The expansions were indeed almost as restricted as pronom-
inal clitic sequences in Romance languages, and, like these sequences, exhibited
some of the ordering rigidity characteristic of morphological formations. Even the
selectional dependencies tended to relate pairs of adjacent elements. So there was
nothing that presented any intrinsic difficulties for a phrase structure analysis.
The ‘affix hopping’ analysis of Chomsky (1957) had long been held as one of the crowning achievements of transformational approaches. However, Gazdar et al. (1982a) showed that the strategy of ‘hopping’ affixes from one point in a terminal string to another was a solution to a self-inflicted problem and hence dispensable in
a model with complex-valued features. If one auxiliary element could select the verb
form of the head of a phrasal complement, there was no need to assemble inflected
forms in the course of a syntactic derivation. Instead, the admissible expansions
could be determined by the subcategorization demands of individual elements. The
first component of this analysis is a feature classification of verbal elements that
distinguishes tense, aspect and voice properties, along with form variants, such as
participles, infinitives, etc. The second is a generalization of the X-bar feature con-
ventions that allows these ‘head’ features to be shared between a mother and head
daughter node. The final ingredient is, again, category-valued features that permit
a verbal element to select a complement headed by a particular form variant.
These components are set out in detail in Gazdar et al. (1982a) and in much subsequent work within Generalized Phrase Structure models. One type of analysis that they define is illustrated in Figure . below. The advantages of this analysis are summarized in Gazdar et al. (1982a), though one immediate benefit was the avoidance of the formal problems that had plagued the ‘affix-hopping’ analysis since its initial formulation (see, e.g., Akmajian and Wasow 1975; Sampson 1979).
Figure .: Passive auxiliary expansion (cf. Gazdar et al. 1982a): a layered V structure in which must selects a complement headed by have, which selects the perfect participle been, which selects the present participle being, which selects the passive participle persecuted, each step encoded in verbal form features.
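The chain of selections in this expansion can be sketched with an assumed toy lexicon; the form names used here (fin, base, perf_part, pres_part, pass_part) are stand-ins for the chapter's form-variant features, not its actual feature inventory. Each auxiliary selects the form of the head of its phrasal complement, so inflected forms never need to be assembled by ‘hopping’ affixes in a derivation.

```python
# word: (its own form, the form it demands of the head of its complement)
LEXICON = {
    "must":       ("fin",       "base"),
    "have":       ("base",      "perf_part"),
    "been":       ("perf_part", "pres_part"),
    "being":      ("pres_part", "pass_part"),
    "persecuted": ("pass_part", None),
}

def well_formed(sequence):
    """Each element's demand must match the form of the next head."""
    return all(LEXICON[selector][1] == LEXICON[head][0]
               for selector, head in zip(sequence, sequence[1:]))

print(well_formed(["must", "have", "been", "being", "persecuted"]))  # True
print(well_formed(["must", "been", "being", "persecuted"]))          # False
```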
The analyses in Gazdar et al. (1982a) thus established that the same basic feature-passing strategy used in the treatment of unbounded dependencies could provide an account of local dependencies. Patterns of subject-auxiliary inversion were amenable to a similar analysis using grammar rules systematically related to the basic rules via metarules, a device whose utility in the grammar of programming languages had previously been established. Figure . exhibits the analysis of the polar question cited by Gleason (1965) above. The invertibility of modals and auxiliaries is encoded here via compatibility with the [+inv] specification that is required of the verbal head in a phrase structure rule licensing the ‘inverted’ structure. Independent motivation for this feature comes from lexical restrictions on the distribution and interpretation of auxiliary elements. Some elements, such as 1sg aren’t, are obligatorily inverted, while others, such as better, are obligatorily uninverted, and yet others, such as may, have a different range of meanings depending on whether or not they are inverted.
(Figure: inverted structure for the polar question ‘did the man come’, in which the [+inv] verbal head did precedes the subject NP the man.)
opposite occurred. Transformational models abandoned their flirtation with a ‘representational’ interpretation, a perspective that had been developed particularly in Koster (1978, 1987), and adopted a more resolutely derivational orientation.
While transformational accounts were following the developmental pathway
that led to current Minimalist models (see chapter ), extended phrase structure
models began to incorporate insights and perspectives from other monostratal approaches. Following McCawley (1968), models of Generalized Phrase Structure Grammar (Gazdar et al. 1985) had already adopted – and, indeed, refined – a ‘node
admissibility’ interpretation of phrase structure rules. On this interpretation, a rule
such as S → NP VP is interpreted as directly ‘licensing’ a local subtree in which S im-
mediately and exhaustively dominates NP and VP daughters, and the NP daughter
immediately precedes the VP daughter. A node admissibility interpretation imme-
diately eliminated the need for string-rewrite derivations and string-based repre-
sentations of phrase structure (‘phrase markers’). Instead, rules could be regarded
as partial descriptions of the subtrees that they sanctioned and the admissibility of
a tree could be defined in terms of the admissibility of the subtrees that it contained.
In large part, this reinterpretation of phrase structure productions supplied graph-
theoretic modelling assumptions that were a better fit for the classes of IC analy-
ses initially proposed by the Bloomfieldians. The schematization adopted within
models of X-bar Theory similarly deprecated phrase structure rules within transfor-
mational models, though without substantially revising the string-based model of
phrase structure represented by phrase markers (as discussed in footnote ).
Furthermore, a node admissibility interpretation clarified the fact that conven-
tional phrase structure rules bundle information about structure (mother-daughter
links) together with information about order (linear arrangement of daughters).
GPSG accounts showed how these two types of information could be expressed sep-
arately, by means of a set of Immediate Dominance (ID) rules that just constrained
mother-daughter relations and a set of Linear Precedence statements that applied to
sisters in a local tree. For example, the information represented by the phrase struc-
ture rule S → NP VP would be expressed by an ID rule S → NP, VP and the general
LP statement NP ≺ VP. The absence of an applicable LP rule would not sanction
unordered trees, but rather trees in which the NP and VP occurred in either order.
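The ID/LP factorization can be sketched in a few lines; the category names and the particular rules below are illustrative assumptions, not the chapter's grammar. ID rules say only which daughters a mother may have; LP statements order sisters; and where no LP statement applies, either order is admissible.

```python
# ID rules: mother -> set of admissible daughter multisets (order-free).
ID_RULES = {"S": [{"NP", "VP"}], "VP": [{"V", "NP"}, {"V"}]}
# LP statements: (a, b) means a precedes a sister b.
LP = [("NP", "VP")]

def admissible(mother, daughters):
    """Node admissibility: check the ID rule, then every applicable LP pair."""
    if set(daughters) not in ID_RULES.get(mother, []):
        return False
    order = {c: i for i, c in enumerate(daughters)}
    return all(order[a] < order[b] for a, b in LP
               if a in order and b in order)

print(admissible("S", ["NP", "VP"]))   # True
print(admissible("S", ["VP", "NP"]))   # False: violates NP < VP
print(admissible("VP", ["NP", "V"]))   # True: no LP statement applies to V, NP
```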
An overriding consideration in the development of GPSG was the goal of keep-
ing analyses as explicit as possible and the underlying grammatical formalism as
formally restrictive as possible. The central role of context-free phrase structure
grammars largely reflected the fact that their properties were well-understood and
provided a formal basis for transparent analyses. In some cases, analyses were con-
strained so that they did not take GPSG models outside the class of phrase structure
grammars. For example, requiring that sets of ID rules and LP statements must op-
erate over the same local domains, ensured that they could in principle be ‘reconsti-
tuted’ as phrase structure grammars. LP statements were thus restricted to apply to
sister nodes. As a consequence, LP statements could allow free or partial ordering
of VP-internal elements, but they could not impose any ordering of subjects and
VP-internal elements other than those that followed from the ordering of a subject
and full VP expansion. Yet there was no direct empirical support for this restriction.
Hence the tight association between the domains of ID rules and LP statements
undermined the fundamental separation of structure and order in the ID/LP format
since certain types of ordering variation dictated a flat structure. This was perhaps
acceptable as long as there was some independent motivation for remaining within
the class of context-free phrase structure grammars. But by 1985, the demonstration of non-context-free patterns in Swiss German subordinate clauses (Shieber 1985) and Bambara compounds (Culy 1985) had weakened the empirical grounds for this restriction and the non-transformational community shifted their focus to identifying restricted classes of mildly context-sensitive grammars that were descriptively
adequate. This was a natural development within the family of phrase structure ap-
proaches, given that the interest in context-free grammars had been driven by an
interest in explicit formalisms with clearly-defined and well-understood properties.
Hence the move from the limited word order freedom defined by the ID/LP format in GPSG to ‘domain union’ in HPSG (Reape 1994) extended the dissociation
of structure and order in ways that allow for the interleaving of non-sisters in an
explicit but non-context-free formalism.
valence of a predicate was likewise represented implicitly by the other elements that
were introduced in the same rule expansions. GPSG enriched this spartan concep-
tion by locating terminal elements within lexical entries that specified distinctive
grammatical features of an element other than word class. Corresponding to the
preterminal rules of a simple phrase structure grammar was a class of ‘lexical ID
rules’ which introduced lexical heads indexed by a subcategorization index. This
index (technically the value of a subcat feature) was then cross-referenced with a class of lexical entries. For example, the rule VP → H[1] would license a local VP subtree that dominated a unary tree whose mother was V[1] and whose daughter was an intransitive verb, such as sleep, whose entry contained the index 1.
In effect, the use of subcategorization indices achieved a limited type of context sensitivity within a context-free formalism. Yet, as Jacobson (1987) pointed out, the fact that lexical items did not directly represent valence information created numerous complications in GPSG. The most acute arose in connection with the treatment of valence alternations. There was no way to formulate a passive rule that mapped the transitive entry for devour onto a (syntactically) detransitivized entry devoured, because entries themselves contained no direct representation of transitivity. This led to an analysis of passivization in terms of metarules that mapped a ‘transitive expansion’ such as VP → W, NP to a ‘detransitivized expansion’ such as VP[pas] → W (where W is any string). However, it then became necessary to constrain metarules so that they only applied to lexical ID rules. But lexical ID rules were serving as proxies for underinformative entries, so the obvious solution lay in associating valence information directly with lexical items and introducing a class of lexical rules to map between entries, as suggested by Pollard (1984).
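A GPSG-style passive metarule of this shape can be sketched as follows. The rule encoding, the subcategorization indices, and the feature label ‘pas’ are illustrative assumptions: the metarule maps a transitive lexical ID rule VP → W, NP onto a detransitivized rule VP[pas] → W, where W is whatever remains once the NP is removed.

```python
def passive_metarule(rule):
    """Map a lexical ID rule VP -> W, NP onto VP[pas] -> W."""
    mother, daughters = rule
    if mother == "VP" and "NP" in daughters:
        w = [d for d in daughters if d != "NP"]   # W = everything but the NP
        return ("VP[pas]", w)
    return None                                   # metarule does not apply

lexical_id_rules = [
    ("VP", ["H[2]", "NP"]),         # transitive, e.g. devour
    ("VP", ["H[1]"]),               # intransitive, e.g. sleep
    ("VP", ["H[3]", "NP", "PP"]),   # e.g. a verb taking NP and PP
]

derived = [out for r in lexical_id_rules if (out := passive_metarule(r))]
print(derived)
# [('VP[pas]', ['H[2]']), ('VP[pas]', ['H[3]', 'PP'])]
```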
(Figure: a fragment of a sort hierarchy for agreement values, in which a sort such as ref bears person, number and gender attributes with resolved values like 3rd, sing and masc.)
value space, which in the present case just represents traditional person, number and gender contrasts. The empirical effects of this type system derive from two additional assumptions. The first is that structures must be totally well-typed (Carpenter 1992) in the sense that they must be assigned a value for each appropriate attribute. This constraint precludes, for example, the assignment of a number-neutral structure as the analysis of English sheep, given that number is distinctive for English nouns (each occurrence of sheep is unambiguously singular or plural). A separate requirement that structures must be sort-resolved (Pollard and Sag 1994) permits only ‘fully specific’ feature values and thus bars disjunctive case values from occurring in a wellformed structure. Hence sheep could not be treated as neutral by assigning the num attribute a maximally general value such as number, which subsumes the resolved values sing and plur. Given that entries are interpreted as descriptions of lexical structures, the English lexicon can still contain a single underspecified entry for sheep, one that specifies either no num attribute or a num attribute with a non-sort-resolved value. But the lexical structures described by the entry must be totally well-typed and sort-resolved.
These general assumptions have the effect of ensuring that structures are maximally specific and that all underspecification is confined to descriptions. A neutral description is not satisfied by a correspondingly underspecified structure but by a set of structures, each of which supplies different, fully resolved values for underspecified attributes. This technical point has a number of consequences. On the positive
side, the assumption that structures must be totally well-typed and sort-resolved
does some of the work of the completeness and coherence conditions in LFG, and
facilitates type-based inferencing within HPSG. However, these assumptions also
lead to apparent difficulties in accounting for the types of patterns described in Ingria (1990), in which the neutrality of an item seems to permit it to satisfy incompatible
demands simultaneously, most prominently in coordinate structures.
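The two conditions, and the way an underspecified description denotes a set of resolved structures, can be sketched with an assumed toy type system (the attribute and value names below are illustrative, not the chapter's): a structure is admissible only if it carries every appropriate attribute (total well-typing) with a maximally specific value (sort resolution), so an underspecified entry like that for sheep is satisfied by one singular and one plural structure, not by a number-neutral structure.

```python
from itertools import product

# Appropriate attributes and their resolved (maximally specific) values.
APPROPRIATE = {"person": ["1st", "2nd", "3rd"],
               "number": ["sing", "plur"],
               "gender": ["masc", "fem", "neut"]}

def well_typed_and_resolved(structure):
    """Totally well-typed: every appropriate attribute present.
    Sort-resolved: every value maximally specific."""
    return (set(structure) == set(APPROPRIATE) and
            all(structure[a] in APPROPRIATE[a] for a in structure))

def satisfiers(description):
    """All resolved structures compatible with a partial description."""
    attrs = sorted(APPROPRIATE)
    options = [[description[a]] if a in description else APPROPRIATE[a]
               for a in attrs]
    return [dict(zip(attrs, combo)) for combo in product(*options)]

sheep = {"person": "3rd", "gender": "neut"}   # number left underspecified
models = satisfiers(sheep)
print(len(models))                                      # 2: sing and plur
print(all(well_typed_and_resolved(m) for m in models))  # True
print(well_typed_and_resolved(sheep))                   # False: not well-typed
```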
Note further that in a model theory that only contains fully specified structures, it is somewhat anachronistic to describe the processes that determine feature compatibility in terms of feature structure unification, as had been the practice in GPSG and PATR-based formalisms (Shieber 1986). A more accurate characterization of a model-theoretic linguistic framework would be as constraint-based, a term that has garnered a certain acceptance in the non-transformational community. Within HPSG, configurations in which a single object occurs as the value of multiple attributes are described in terms of structure-sharing, a term that refers to reentrance in the graph-theoretic models typically assumed in HPSG. (See Blevins (2011) for a recent review and discussion of these types of cases.)
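For comparison, description-level unification of the kind assumed in GPSG/PATR practice can be sketched minimally (flat descriptions with atomic values only; a hypothetical simplification of real feature logics): two partial descriptions unify if they agree wherever both are defined, and clash otherwise.

```python
def unify(d1, d2):
    """Unify two flat feature descriptions; return None on a clash."""
    result = dict(d1)
    for attr, val in d2.items():
        if attr in result and result[attr] != val:
            return None          # incompatible values: no unifier
        result[attr] = val
    return result

print(unify({"case": "acc"}, {"number": "sing"}))
# {'case': 'acc', 'number': 'sing'}
print(unify({"case": "acc"}, {"case": "nom"}))
# None
```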
(Figure: lexical entries for hana, a 3rd singular feminine accusative pronoun, and vantar, whose valence lists demand an accusative subject and an accusative complement, with an arg-st list ⟨1, 2⟩.)
gorial approaches, elements are ‘popped off’ valence lists as arguments are encountered. Hence the term in the comps list of the verb vantar is structure-shared with the syntactic object peninga in Figure ., producing a VP with an empty comps list. The subject term is in turn identified with the syntactic subject hana, yielding a ‘saturated’ clause, with empty subj and comps lists. The terms in the arg-st list of the verb vanta are also structure-shared with the syntactic arguments. However, in accordance with the locality constraints of HPSG, arg-st values are only associated at the lexical level, so that elements that combine syntactically with the clause in Figure . cannot access information about the dependents it contains.
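Valence cancellation can be sketched as follows; the dictionary-based signs and case-only matching are illustrative stand-ins for the Icelandic items in the text, not the chapter's AVMs. Combining a head with an argument removes the matching term from the relevant valence list, and a clause is ‘saturated’ when both lists are empty.

```python
def combine(head, argument, valence_list):
    """Pop the first term off the head's named valence list if it matches."""
    sign = dict(head)                       # copy: the entry itself is unchanged
    assert sign[valence_list] and sign[valence_list][0] == argument["case"], \
        "argument does not match the valence requirement"
    sign[valence_list] = sign[valence_list][1:]
    return sign

# vantar demands an accusative subject and an accusative complement.
vantar = {"subj": ["acc"], "comps": ["acc"]}

vp = combine(vantar, {"case": "acc"}, "comps")   # vantar peninga
clause = combine(vp, {"case": "acc"}, "subj")    # hana vantar peninga
print(vp["comps"], clause["subj"])               # [] []
```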
(Figure: analysis of hana vantar peninga, in which the comps term of vantar is identified with peninga and the subj term with hana, while the arg-st list ⟨1, 2⟩ remains lexical.)
Given this general treatment of valence, the transparency of virðist can be represented by the entry in Figure .. The cross-referencing of the two subj values (via the boxed integer ‘1’) indicates that the subj attribute of virðist literally shares its value with the subj value of its complement. Identifying the values of the two attributes ensures that any constraints that apply to the subj of the complement of virðist will apply to its own syntactic subj. Hence when vanta occurs as the head of the complement, as in Figure ., its accusative subject demands will be identified with the subject demands of virðist. Only an accusative subject such as hana can satisfy these demands. So this analysis forges a direct association between hana and the complement vanta peninga, but the association is established by means of structure sharing, rather than through constituent structure displacements.
(Figure: lexical entry for virðist, in which the single subj term 1 is identified with the subj value of its complement 2, and arg-st is ⟨1, 2⟩.)
(Figure: clausal analysis in which the accusative subject hana satisfies the subj demand shared between virðist and its complement vanta peninga.)
This analysis shows how the complex-valued features that provide an account of basic valence demands in Figure . interact with structure-sharing to allow the subject demands of a raising verb to be identified with those of its complement. Furthermore, precisely the same elements offer an analysis of ‘control’ constructions, in which the higher controller merely identifies the reference of the subject of the complement. The properties of control constructions are discussed in detail in Sag and Pollard (1991) but they can be broadly subsumed under the generalization that control verbs are not transparent to the syntactic demands of the head of their complement. The contrast with raising verbs is reflected in the fact that the subject of the control verb ‘hope’ in (.b) follows the default nominative pattern and does not inherit the accusative case governed by its complement in (.a) (repeated from (.a)).
A similar analysis is proposed within LFG in terms of ‘functional control’ (Bresnan 1982).
(.) Icelandic subject control constructions (cf. Andrews 1982)
a. Hana vantar peninga.
her.acc lack.3sg money.acc
‘She lacks money.’
b. Hún/*hana vonast til að vanta ekki peninga.
she.nom/*her.acc hope.3sg toward to lack not money.acc
‘She hopes not to lack money.’
The intuition that the subject of a control verb merely identifies the reference of its complement’s subject is expressed by the entry in Figure ., in which the index values of the two subj values are identified (i.e. structure-shared). The fact
(Figure: lexical entry for vonast, in which only the index 3 of its subj term is shared with the index of its complement’s subj, and arg-st is ⟨1, 2⟩.)
that index but not case values are shared in this entry allows the subject of vonast
to select a nominative subject and control a complement that selects an accusative
subject in Figure .. Exactly the same formal components determine the anal-
yses in Figures . and .; there is no analogue to distinct ‘raising’ and ‘equi’
transformations or to distinct PRO and ‘trace’ elements in the subordinate subject
positions. Instead it is solely the locus of structure sharing that distinguishes these
subconstructions.
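The raising/control contrast just described reduces to a difference in what is structure-shared, which can be sketched with assumed stand-in entries (the dictionaries and the share flag below are hypothetical simplifications of the chapter's AVMs): a raising entry shares the entire subj value with its complement's subj, so case demands transfer, while a control entry shares only the referential index and takes its own default nominative case.

```python
def subj_of(verb, complement):
    """Compute the higher verb's subject requirement from its complement's."""
    comp_subj = complement["subj"]
    if verb["share"] == "full":        # raising (cf. virðist): whole subj shared
        return comp_subj               # the very same object, case and all
    # control (cf. vonast): only the index is shared; case defaults to nominative
    return {"index": comp_subj["index"], "case": "nom"}

vanta = {"subj": {"index": "i", "case": "acc"}}   # governs an accusative subject
print(subj_of({"share": "full"}, vanta))    # {'index': 'i', 'case': 'acc'}
print(subj_of({"share": "index"}, vanta))   # {'index': 'i', 'case': 'nom'}
```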
(Figure: analysis of hún vonast til að vanta ekki peninga, in which the nominative subject hún shares only its index 3 with the accusative subj demand of the complement.)
(Figure: the internal organization of a sign, pairing a phonological representation with syntactic and semantic features, a head daughter (a single sign) and a list of daughter signs, illustrated by the phrase vantar peninga and its daughters vantar and peninga.)
basic intuitions and desiderata that underlie HPSG models, a more streamlined version of the formalism is presented in Sag et al. (2003).
As noted in Sag (2010, 2012), feature structure counterparts of the local trees from GPSG provide suitable candidates. Individual constructions can be represented by feature structures exhibiting the organization in Figure ., where mtr represents the mother sign and dtrs a list of daughter signs. Many of the construction-specific properties investigated in the modern Construction Grammar literature (typified by Kay and Fillmore (1999)) can be integrated into these unified data structures.
(Figure: the type construct, with a mtr sign and a dtrs list ⟨sign1, …, signn⟩, and the subtypes phrasal-cxt and lexical-cxt.)
The detailed treatment of English relative and filler-gap clauses in Sag (1997, 2010) presents a sustained argument for extending HPSG models to include a notion of construction. At the same time, these studies make a case for reconceptualizing grammatical constructions in the context of a constraint-based architecture,
rather than in the exemplar-based terms assumed in traditional grammars.
These studies also illustrate the ways that phrase structure models continue to
evolve, driven in part by the logic of their basic organizing principles, and in part
by their ability to incorporate and extend insights from other traditions. From their
origins in the string rewriting systems in Chomsky (1957), extended phrase structure models have assumed their modern form by successively integrating traditional
perspectives on grammatical features and units with more formal notions such as
inheritance hierarchies and constraint satisfaction. In addition to providing analy-
ses of a wide range of syntactic constructions, these models have clarified how ex-
plicit mechanisms for regulating the distribution of grammatical information within
a single syntactic representation can achieve any of the benefits that had, beginning
with the work of Harris (1957), been claimed to accrue to derivational analyses.
References
Akmajian, Adrian and Wasow, Thomas (1975). The constituent structure of VP and Aux and the position of the verb be. Linguistic Analysis.
Bear, John (). Gaps as syntactic features. Technical note, Indiana University Linguistics Club.
Blevins, James P. (2011). Feature-based grammar. See Borsley and Börjars (2011).
Bloch, Bernard (1946). Studies in colloquial Japanese II: Syntax. Language. Reprinted in Joos (1957).
Bloomfield, Leonard (1933). Language. University of Chicago Press, Chicago.
Bouma, Gosse, Malouf, Rob, and Sag, Ivan A. (2001). Satisfying constraints on extraction and adjunction. Natural Language and Linguistic Theory.
Bresnan, Joan (1982). Control and complementation. Linguistic Inquiry.
Carpenter, Bob (1992). The Logic of Typed Feature Structures. Cambridge University Press, New York.
Chomsky, Noam (1956). Three models for the description of language. Institute of Radio Engineers Transactions on Information Theory, IT-2. Reprinted in Luce et al. (1963).
Chomsky, Noam (1957). Syntactic Structures. Mouton, The Hague.
Chomsky, Noam (1965). Aspects of the Theory of Syntax. MIT Press, Cambridge, MA.
Dalrymple, Mary, Kaplan, Ronald M., Maxwell, III, John T., and Zaenen, Annie (eds.) (1995). Formal Issues in Lexical-Functional Grammar. CSLI, Stanford.
Dowty, David (1982). Grammatical relations and Montague Grammar. In The Nature of Syntactic Representation (ed. G. K. Pullum and P. Jacobson). Reidel, Dordrecht.
Gazdar, Gerald (1981). Unbounded dependencies and coordinate structure. Linguistic Inquiry.
Gazdar, Gerald, Klein, Ewan, Pullum, Geoffrey K., and Sag, Ivan A. (1985). Generalized Phrase Structure Grammar. Harvard University Press, Cambridge.
Gazdar, Gerald, Pullum, Geoffrey K., and Sag, Ivan A. (1982a). Auxiliaries and related phenomena in a restrictive theory of grammar. Language.
Gazdar, Gerald, Pullum, Geoffrey K., Sag, Ivan A., and Wasow, Thomas (1982b). Coordination and transformational grammar. Linguistic Inquiry.
Gleason, Henry Allan (1965). Linguistics and English Grammar. Holt, Rinehart and Winston, New York.
Harris, Zellig S. (1957). Co-occurrence and transformation in linguistic structure. Language. Reprinted in Harris (1981).
Harris, Zellig S. (1981). Papers in Syntax. Reidel, Dordrecht.
Hockett, Charles F. (1958). A Course in Modern Linguistics. Macmillan, New York.
Ingria, Robert J. (1990). The limits of unification. In Proceedings of the 28th Annual Meeting of the Association for Computational Linguistics, Morristown, NJ.
Jacobson, Pauline (1987). Review of Generalized Phrase Structure Grammar. Linguistics and Philosophy.
Johnson, David and Lappin, Shalom (1999). Local Constraints vs Economy. CSLI, Stanford.
Joos, Martin (ed.) (1957). Readings in Linguistics I. University of Chicago Press, Chicago.
Joshi, Aravind and Schabes, Yves (1997). Tree Adjoining Grammars. In Handbook of Formal Languages, Vol. 3 (ed. G. Rosenberg and A. Salomaa). Springer Verlag.
Kaplan, Ronald M. and Bresnan, Joan (1982). Lexical-functional grammar: A formal system for grammatical representation. In The Mental Representation of Grammatical Relations (ed. J. Bresnan). MIT Press, Cambridge.
Kathol, Andreas (2000). Linear Syntax. Oxford University Press, Oxford.
Kay, Paul and Fillmore, Charles J. (1999). Grammatical constructions and linguistic generalizations: The What’s X doing Y? construction. Language.
Kehler, Andrew (2002). Coherence, Reference, and the Theory of Grammar. CSLI Publications, Stanford.
Koster, Jan (1978). Locality Principles in Syntax. Foris, Dordrecht.
Koster, Jan (1987). Domains and Dynasties: The Radical Autonomy of Syntax. Foris, Dordrecht.
Ladusaw, William (). A proposed distinction between levels and strata. In Linguistics in the Morning Calm (ed. S.-D. Kim). Hanshin, Seoul.
Lakoff, George (1986). Frame semantic control of the coordinate structure constraint. In Chicago Linguistic Society, Volume 22.
Levine, Robert D. and Hukari, Thomas (2006). The Unity of Unbounded Dependency Constructions. CSLI Publications, Stanford.
Luce, R., Bush, R., and Galanter, E. (eds.) (1963). Readings in Mathematical Psychology. Wiley and Sons, New York.
Manaster-Ramer, Alexis and Kac, Michael B. (1990). The concept of phrase structure. Linguistics and Philosophy.
Manning, Christopher D. and Sag, Ivan A. (1999). Dissociations between argument structure and grammatical relations. In Lexical and Constructional Aspects of Linguistic Explanation (ed. G. Webelhuth, J.-P. Koenig, and A. Kathol). CSLI, Stanford.
McCawley, James D. (1968). Concerning the base component of a transformational grammar. Foundations of Language.
McCawley, James D. (1982). Parentheticals and discontinuous constituent structure. Linguistic Inquiry.
Müller, Stefan (1999). Deutsche Syntax deklarativ. Head-Driven Phrase Structure Grammar für das Deutsche. Linguistische Arbeiten. Niemeyer.
Müller, Stefan (2002). Complex Predicates: Verbal Complexes, Resultative Constructions, and Particle Verbs in German. Studies in Constraint-Based Lexicalism. CSLI Publications, Stanford.
Müller, Stefan (2004). Continuous or discontinuous constituents? A comparison between syntactic analyses for constituent order and their processing systems. Research on Language and Computation.
Müller, Stefan (2010). Grammatiktheorie. Stauffenburg Einführungen. Stauffenburg Verlag, Tübingen.
Nida, Eugene A. (1966). A Synopsis of English Grammar. Mouton, The Hague.
Orgun, C. Orhan (1996). Sign-Based Morphology and Phonology: with special attention to Optimality Theory. Ph.D. thesis, University of California, Berkeley.
Pike, Kenneth L. (1943). Taxemes and immediate constituents. Language.
Pollard, Carl (1984). Generalized Phrase Structure Grammars, Head Grammars and Natural Language. Ph.D. thesis, Stanford.
Pollard, Carl and Sag, Ivan A. (1987). Information-Based Syntax and Semantics. CSLI, Stanford.
Pollard, Carl and Sag, Ivan A. (1992). Anaphors in English and the scope of the binding theory. Linguistic Inquiry.
Pollard, Carl and Sag, Ivan A. (1994). Head-driven Phrase Structure Grammar. University of Chicago Press, Chicago.
Post, Emil L. (1943). Formal reductions of the general combinatorial decision problem. American Journal of Mathematics.
Post, Emil L. (1947). Recursive unsolvability of a problem of Thue. Journal of Symbolic Logic.
Postal, Paul M. (1998). Three Investigations of Extraction. MIT Press, Cambridge, MA.
Sag, Ivan A. and Fodor, Janet D. (1994). Extraction without traces. In Proceedings of the Thirteenth West Coast Conference on Formal Linguistics (ed. R. Aranovich, W. Byrne, S. Preuss, and M. Senturia), Stanford University. CSLI.
Sag, Ivan A., Gazdar, Gerald, Wasow, Thomas, and Weisler, Steven (1985). Coordination and how to distinguish categories. Natural Language and Linguistic Theory.
Sag, Ivan A. and Pollard, Carl (1991). An integrated theory of complement control. Language.
Sag, Ivan A., Wasow, Thomas, and Bender, Emily (2003). Syntactic Theory: A Formal Introduction (2nd edn). CSLI Publications, Stanford.
Sampson, Geoffrey R. (1979). What was transformational grammar? Lingua.
Scholz, Barbara C. and Pullum, Geoffrey K. (2007). Tracking the origins of transformational generative grammar. Journal of Linguistics.
Shieber, Stuart M. (1985). Evidence against the context-freeness of natural language. Linguistics and Philosophy.
Shieber, Stuart M. (1986). An Introduction to Unification-based Approaches to Grammar. CSLI, Stanford.
Steedman, Mark (2000). The Syntactic Process. MIT Press, Cambridge, MA.
Steedman, Mark and Baldridge, Jason (2011). Combinatory categorial grammar. See Borsley and Börjars (2011).