Syntax is a central subfield within linguistics and is important for the study of natural languages, since they all have syntax. Theories of syntax can vary drastically, though. They tend to be based on one of two competing principles: dependency or phrase structure. Surprisingly, the tests for constituents that are widely employed in syntax and linguistics research to demonstrate how words are grouped together into higher units of syntactic structure (phrases and clauses) actually support dependency over phrase structure. The tests identify much less sentence structure than phrase structure syntax assumes. This situation is surprising because phrase structure has been dominant in research on syntax over the past 60 years. This article examines the issue in depth. Dozens of texts were surveyed to determine how tests for constituents are employed and understood. Most of the tests identify phrasal constituents only; they deliver little support for the existence of subphrasal strings as constituents. This situation is consistent with dependency structure, since for dependency, subphrasal strings are not constituents to begin with.
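To make the contrast concrete, the sketch below (our illustration, not material from the article; the example sentence and both analyses are assumptions) counts the constituents that a phrase structure analysis and a dependency analysis posit for the same short sentence. Phrase structure treats every phrasal and subphrasal node as a constituent, while dependency yields exactly one constituent per word.

```python
# Illustrative comparison of constituent counts under the two principles.
sentence = "the dog chased the cat".split()

# Phrase structure analysis (assumed): every node, phrasal and subphrasal,
# is a constituent.  Each constituent is recorded as a (start, end) word span.
ps_constituents = {
    (0, 5),                                   # S
    (0, 2),                                   # NP  "the dog"
    (2, 5),                                   # VP  "chased the cat"
    (3, 5),                                   # NP  "the cat"
    (0, 1), (1, 2), (2, 3), (3, 4), (4, 5),   # D, N, V, D, N (subphrasal)
}

# Dependency analysis (assumed): a word's constituent is the word plus all
# words it dominates, so there are exactly as many constituents as words.
head_of = {0: 1, 1: 2, 2: None, 3: 4, 4: 2}   # child index -> head index

def dep_constituents(heads):
    """Return the set of word groups each word transitively dominates."""
    def dominated(i):
        out = {i}
        for j, h in heads.items():
            if h == i:
                out |= dominated(j)
        return out
    return {frozenset(dominated(i)) for i in heads}

print(len(ps_constituents))             # 9 constituents under phrase structure
print(len(dep_constituents(head_of)))   # 5 constituents under dependency
```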
In this paper we argue that, in addition to Immediate Constituent (IC) models, there is a relatively new variety of formalisms within what Hockett (1954) and Schmerling (1983) call Item-and-Arrangement grammars: we call them array-based (AB) grammars. These grammars aim to define a hierarchically ordered sequence of nodes rather than to establish distributional generalisations, and they define 1-dimensional arrays instead of proper containment and is-a relations. We argue that it is crucial to distinguish between IC and AB grammars in the interpretation of tree diagrams used in current generative theorising.
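As a rough picture of the contrast (our sketch under assumed labels and analyses, not the paper's formalism), an IC-style object records which nodes contain which parts, whereas an AB-style object is just a hierarchically ordered 1-dimensional array of nodes with no containment relations.

```python
# Immediate Constituent (IC) style: the defined object is a set of containment
# ("is-a") relations -- each phrase immediately contains its parts.
ic_analysis = {
    "S":  ["NP", "VP"],
    "NP": ["the", "dog"],
    "VP": ["barks"],
}

# Array-based (AB) style: the defined object is a single hierarchically ordered
# sequence of nodes -- a 1-dimensional array -- with no containment relations.
ab_analysis = ["S", "NP", "the", "dog", "VP", "barks"]

# With the IC object we can ask what a node contains; with the AB object we can
# only ask what position a node occupies in the ordered sequence.
print(ic_analysis["NP"])         # ['the', 'dog']
print(ab_analysis.index("VP"))   # 4
```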
Lexical Functional Grammar (LFG) is a non-transformational generative grammar, which excludes concepts such as deep structure, surface structure and transformation. Rather than deriving a surface form from a deep structure through transformations, LFG maintains that several structures exist at parallel levels. The two main structures in LFG are constituent structure and functional structure, abbreviated as c-structure and f-structure, respectively. LFG also comprises other structures, including argument structure, semantic structure, and information structure. The present paper mainly focuses on the f-structure to demonstrate the capability of LFG to explain linguistic phenomena and characteristics of the Persian language such as passivization, non-configurationality, and topicalization. Certain Persian structures such as simple and compound sentences, complement clauses, and genitive structures are studied and, following Lodrup’s (2011) model, 10 grammatical roles are introduced for Persian, which are classified as argument vs. non-argument and discourse vs. non-discourse.
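For readers unfamiliar with f-structures, the following is a minimal sketch of an f-structure rendered as a nested attribute-value structure; it uses an assumed English example rather than the Persian analyses developed in the paper.

```python
# An LFG-style f-structure for "the child saw the dog" (illustrative only):
# an attribute-value matrix in which grammatical functions such as SUBJ and
# OBJ are attributes of the clause's f-structure.
f_structure = {
    "PRED":  "see <SUBJ, OBJ>",
    "TENSE": "PAST",
    "SUBJ": {
        "PRED": "child",
        "DEF":  True,
        "NUM":  "SG",
    },
    "OBJ": {
        "PRED": "dog",
        "DEF":  True,
        "NUM":  "SG",
    },
}

# Because grammatical functions are encoded directly at this level, LFG can
# state generalisations (e.g. about passivization or topicalization) without
# appealing to transformations.
print(f_structure["SUBJ"]["PRED"])   # child
```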