Syntax

From New World Encyclopedia

In linguistics, syntax (from the Greek συν (syn), meaning "co-" or "together," and τάξις (táxis), meaning "sequence, order, or arrangement") is the study of the rules, or "patterned relations," that govern the way words combine to form phrases and phrases combine to form sentences. Syntax, in this sense, should be contrasted with two other kinds of study of linguistic expressions: semantics and pragmatics. The former studies the meanings of linguistic expressions, while the latter studies their practical use by agents or communities of interpretation in particular circumstances and contexts.

Overview

The combinatory behavior of words is governed to a first approximation by their part of speech (noun, adjective, verb, etc.), a categorization that goes back in the Western tradition to the Greek grammarian Dionysius Thrax. Modern research into natural language syntax attempts to systematize descriptive grammar and, for many practitioners, to find general laws that govern the syntax of all languages. It is unconcerned with prescriptive grammar.

Theories of syntax differ in their object of study. While formal grammars (especially in the generative grammar tradition) have focused on the mental process of language production (I-language), empirical grammars have focused on linguistic function, explaining language in use (corpus linguistics). The latter often encode frequency data in addition to production rules and provide mechanisms for learning the grammar (or at least its probabilities) from usage data. One way of considering the space of grammars is to distinguish those that do not encode rule frequency (the majority) from those that do (probabilistic grammars).

In Logic, "syntax" refers to the part of a formal system that determines (1) the vocabulary of a language in which the formal system is expressed, (2) the rules of formations of permissible sentences in the language, which are called well-formed formulas (denoted as "wffs"), and (3) the deductive elements (axioms and rules of inference) of the system.

Brief Historical Overview

Syntax, literally "composition," is an ancient Greek word, whereas the names of other domains of linguistics, such as semantics or morphology, are recent (nineteenth century). The history of this field is rather complicated. Two landmarks in the field are the first complete Greek grammar, written by Dionysius Thrax in the first century B.C.E. (a model for Roman grammarians, whose work led to the medieval and Renaissance vernacular grammars), and the Grammaire of Port-Royal, a Cistercian convent in the Vallée de Chevreuse southwest of Paris that launched a number of culturally important institutions.

The central role of syntax within theoretical linguistics became clear only in the last century, which could reasonably be called the "century of syntactic theory" as far as linguistics is concerned. One of the major approaches is transformational-generative grammar, initiated by Noam Chomsky, which has stimulated various kinds of later approaches. Other well-known approaches include dependency grammar (L. Tesnière), systemic functional grammar (M. A. K. Halliday), and tree adjoining grammar (A. Joshi).

For a detailed and critical survey of the history of syntax over the last two centuries, see the monumental work by Graffi (2001).

Formal Syntax

There are many theories of formal syntax, theories that have risen and fallen in influence over time. Most theories of syntax share at least two commonalities. First, they hierarchically group subunits into constituent units (phrases). Second, they provide a system of rules to explain patterns of acceptability/grammaticality and unacceptability/ungrammaticality.

Transformational-generative grammar (of which Government and Binding Theory and Minimalism are recent developments) represents the structure of sentences by phrase structure trees, otherwise known as phrase markers or tree diagrams. The hierarchical structure of such trees provides information about how the acceptable sentences of a given language are produced from their component parts.
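
To make the idea of hierarchical phrase structure concrete, the following sketch (a minimal illustration in Python, not the notation of any particular generative framework; the toy grammar and vocabulary are invented for the example) derives a sentence from a handful of phrase structure rules and represents the resulting tree as nested tuples.

    import random

    # A toy set of phrase structure rules: each category maps to a list of
    # possible expansions (right-hand sides); terminal words are plain strings.
    toy_grammar = {
        "S":   [["NP", "VP"]],
        "NP":  [["Det", "N"]],
        "VP":  [["V", "NP"]],
        "Det": [["the"]],
        "N":   [["dog"], ["cat"]],
        "V":   [["chased"]],
    }

    def derive(symbol):
        """Expand a category into a phrase structure tree (nested tuples),
        choosing among its productions at random; words expand to themselves."""
        productions = toy_grammar.get(symbol)
        if productions is None:                 # a terminal word, not a category
            return symbol
        rhs = random.choice(productions)
        return (symbol, [derive(child) for child in rhs])

    def leaves(tree):
        """Read the sentence (the terminal string) off the tree."""
        if isinstance(tree, str):
            return [tree]
        _, children = tree
        return [word for child in children for word in leaves(child)]

    tree = derive("S")
    print(tree)                    # e.g. ('S', [('NP', ...), ('VP', ...)])
    print(" ".join(leaves(tree)))  # e.g. "the dog chased the cat"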

In dependency grammar, the structure of a sentence is considered to be determined by the relations between words and their dependents. One difference from phrase structure grammar is that dependency grammar does not have phrasal categories. Algebraic syntax is one example of a dependency grammar.
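
By contrast, a dependency analysis can be recorded simply as links between each word and its head, with no phrasal nodes at all. The sentence and relation labels below are illustrative assumptions rather than the conventions of any particular dependency framework.

    # Dependency analysis of "the dog chased the cat": every word points to
    # its head, together with a relation label; the verb is the root, and
    # there are no phrasal categories.
    sentence = ["the", "dog", "chased", "the", "cat"]

    dependencies = [
        # (dependent_index, head_index, relation)
        (0, 1, "det"),      # "the"    depends on "dog"
        (1, 2, "subject"),  # "dog"    depends on "chased"
        (2, None, "root"),  # "chased" is the root
        (3, 4, "det"),      # "the"    depends on "cat"
        (4, 2, "object"),   # "cat"    depends on "chased"
    ]

    for dep, head, rel in dependencies:
        head_word = sentence[head] if head is not None else "ROOT"
        print(f"{sentence[dep]:>7} --{rel}--> {head_word}")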

A modern approach to combining accurate descriptions of the grammatical patterns of language with their function in context is that of "systemic functional grammar," an approach originally developed by Michael A.K. Halliday in the 1960s and now pursued actively on all continents. Systemic-functional grammar is related both to feature-based approaches, such as Head-driven phrase structure grammar, and to the older functional traditions of European schools of linguistics such as British Contextualism and the Prague School.

Tree adjoining grammar is a grammar formalism with interesting mathematical properties that has sometimes been used as the basis for the syntactic description of natural language. In monotonic and monostratal frameworks, variants of unification grammar are often the preferred formalisms.
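
Since unification grammars are mentioned here, a minimal sketch of their core operation may help: two feature structures are merged, and the merge fails if any feature receives conflicting values. The flat dictionaries below are an illustrative simplification; real formalisms use typed, re-entrant feature structures.

    def unify(fs1, fs2):
        """Unify two feature structures represented as flat dicts.
        Returns the merged structure, or None if any feature clashes."""
        result = dict(fs1)
        for feature, value in fs2.items():
            if feature in result and result[feature] != value:
                return None      # conflicting values: unification fails
            result[feature] = value
        return result

    # A singular noun phrase unifies with a requirement for a singular subject...
    print(unify({"cat": "NP", "num": "sg"}, {"num": "sg", "person": 3}))
    # {'cat': 'NP', 'num': 'sg', 'person': 3}

    # ...but not with a requirement for a plural one.
    print(unify({"cat": "NP", "num": "sg"}, {"num": "pl"}))
    # None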

Empirical Approaches to Syntax

Formal models of syntax face several problems. One is that often more than one rule of sentence production may apply to a structure, resulting in a conflict; an illustration follows below. The greater the coverage, the more frequent such conflicts, and all grammarians (starting with Panini) have spent considerable effort devising a prioritization for the rules, which usually turns out to be infeasible. Another difficulty is overgeneration, where unlicensed structures are also generated.
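
One familiar face of such conflicts is structural ambiguity: the same string of words can be built by different rule applications, yielding distinct trees. The bracketings below use the textbook prepositional phrase attachment example; the tree shapes are illustrative, not tied to a specific theory.

    # "saw the man with the telescope": the prepositional phrase can be
    # produced by a rule attaching it to the verb phrase (instrument reading)
    # or by a rule attaching it inside the noun phrase (modifier reading).
    vp_attachment = ("VP", [("VP", ["saw", ("NP", ["the", "man"])]),
                            ("PP", ["with", "the", "telescope"])])

    np_attachment = ("VP", ["saw", ("NP", [("NP", ["the", "man"]),
                                           ("PP", ["with", "the", "telescope"])])])

    print(vp_attachment != np_attachment)  # True: two analyses for one string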

Probabilistic grammars circumvent these conflicts by using the frequency of the various productions to order them, resulting in a "most likely" (winner-take-all) interpretation, which is, by definition, defeasible given additional data. As usage patterns shift diachronically, these probabilistic rules can be re-learned, thus updating the grammar.

One may construct a probabilistic grammar from a traditional formal syntax by placing a probability distribution over the production rules, estimated from empirical data about actual usage of sentences. On most samples of broad language, probabilistic grammars that tune these probabilities from data typically outperform hand-crafted grammars (although some rule-based grammars are now approaching the accuracy of probabilistic context-free grammars, PCFGs).
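
A minimal sketch of how such probabilities might be estimated is to count how often each production occurs in an annotated sample and normalize by the count of its left-hand side. The tiny hand-made sample below stands in for a real treebank, and no smoothing is applied.

    from collections import Counter, defaultdict

    # Productions observed in a (hypothetical) hand-annotated sample,
    # written as (left-hand side, right-hand side) pairs.
    observed = [
        ("VP", ("V", "NP")), ("VP", ("V", "NP")), ("VP", ("V", "NP", "PP")),
        ("NP", ("Det", "N")), ("NP", ("Det", "N")), ("NP", ("Det", "N", "PP")),
    ]

    counts = Counter(observed)
    lhs_totals = defaultdict(int)
    for (lhs, _), n in counts.items():
        lhs_totals[lhs] += n

    # Relative-frequency estimate: P(rule) = count(rule) / count(left-hand side)
    probabilities = {rule: n / lhs_totals[rule[0]] for rule, n in counts.items()}
    for (lhs, rhs), p in sorted(probabilities.items()):
        print(f"{lhs} -> {' '.join(rhs)}   p = {p:.2f}")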

Recently, probabilistic grammars appear to have gained some cognitive plausibility. It is well known that there are degrees of difficulty in accessing different syntactic structures (e.g., the Accessibility Hierarchy for relative clauses). Probabilistic versions of minimalist grammars have been used to compute information-theoretic entropy values that appear to correlate well with psycholinguistic data on comprehension and production difficulty.
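
As a toy illustration of the information-theoretic side, the entropy over the productions competing at a given point in a derivation can be computed directly; the probabilities below are invented for the example, not psycholinguistic estimates.

    import math

    # Hypothetical probabilities of three productions competing at some point
    # in a derivation; higher entropy has been argued to correlate with
    # greater processing difficulty.
    production_probs = [0.6, 0.3, 0.1]

    entropy = -sum(p * math.log2(p) for p in production_probs)
    print(f"entropy = {entropy:.3f} bits")   # about 1.295 bits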

Logic

In logic, syntax is the part of a formal system that defines the formal language in which the system is expressed, together with the rules for forming and deducing expressions in that language.

The formal language can be characterized by its vocabulary and its grammar, that is, the rules for forming permissible expressions, called "well-formed formulas." (An example of a vocabulary and formation rules for a formal language, specifically that of the propositional calculus, can be found in Propositional Calculus.)
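
As a sketch of formation rules, the checker below defines well-formed formulas for a tiny propositional language with a vocabulary of three atomic letters and the connectives "not" and "implies"; the choice of connectives and the nested-tuple representation are illustrative assumptions.

    ATOMS = {"P", "Q", "R"}   # the (illustrative) non-logical vocabulary

    def is_wff(expr):
        """A well-formed formula is an atom, a negation ('not', A) of a wff,
        or an implication ('implies', A, B) between two wffs."""
        if isinstance(expr, str):
            return expr in ATOMS
        if isinstance(expr, tuple):
            if len(expr) == 2 and expr[0] == "not":
                return is_wff(expr[1])
            if len(expr) == 3 and expr[0] == "implies":
                return is_wff(expr[1]) and is_wff(expr[2])
        return False

    print(is_wff(("implies", "P", ("not", "Q"))))  # True
    print(is_wff(("implies", "P")))                # False: a subformula is missing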

The deductive system of a formal system consists of axioms and rules of inference. The axioms in a deductive system are well-formed formulas of a distinguished kind. The rules of inference are rules by which well-formed formulas of certain distinguished forms are transformed in specific ways. Well-formed formulas that are either axioms or obtainable from axioms by repeated applications of the rules of inference are called "theorems" of the deductive system. A sequence of well-formed formulas that shows how a theorem of a given deductive system is derived from the axioms by applications of the rules of inference is called a "proof."
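
Continuing the same sketch, a deductive system can be illustrated with a single rule of inference (modus ponens) and a checker that verifies a purported proof: every line must be an axiom or follow from earlier lines. The axioms and the proof are invented for the example; real systems use axiom schemata and richer rule sets.

    def follows_by_modus_ponens(formula, earlier):
        """True if `formula` follows from two earlier lines A and
        ('implies', A, formula) by modus ponens."""
        return any(("implies", a, formula) in earlier for a in earlier)

    def is_proof(lines, axioms):
        """Check that every line is an axiom or follows from earlier lines;
        if so, the last line is a theorem of this tiny system."""
        for i, formula in enumerate(lines):
            earlier = lines[:i]
            if formula not in axioms and not follows_by_modus_ponens(formula, earlier):
                return False
        return True

    # Illustrative axioms: P, and P -> Q.  Then Q is provable.
    axioms = ["P", ("implies", "P", "Q")]
    proof = ["P", ("implies", "P", "Q"), "Q"]
    print(is_proof(proof, axioms))  # True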


Syntactic terms

  • Adjective
    • Attributive adjective and predicative adjective
  • Adjunct
  • Adverb
  • Appositive
  • Article
  • Aspect
  • Auxiliary verb
  • Case
  • Clause
  • Closed class word
  • Comparative
  • Complement
  • Compound noun and adjective
  • Conjugation
  • Conjunction
  • Dangling modifier
  • Declension
  • Determiner
  • Dual

  • Expletive
  • Function word
  • Gender (grammatical gender)
  • Gerund
  • Infinitive
  • Measure word (classifier)
  • Modal particle
  • Movement paradox
  • Modifier
  • Mood
  • Noun
  • Number
  • Object
  • Open class word
  • Parasitic gap
  • Part of speech
  • Particle
  • Person
  • Phrase

  • Phrasal verb
  • Plural
  • Predicate (also verb phrase)
  • Predicative (adjectival or nominal)
  • Preposition
  • Personal pronoun
  • Pronoun
  • Restrictiveness
  • Sandhi
  • Sentence
  • Singular
  • Subject
  • Superlative
  • Tense
  • Uninflected word
  • Verb
  • Voice
  • Wh-movement
  • Word order

References

  • Aronoff, M., and J. Rees-Miller. 2003. The Handbook of Linguistics. Blackwell Publishing Professional. ISBN 978-1405102520
  • Bender, E. M., I. A. Sag, and T. Wasow. 2003. Syntactic Theory: A Formal Introduction, 2nd ed. Center for the Study of Language and Information. ISBN 978-1575864006
  • Chomsky, N. 1969. Aspects of the Theory of Syntax. MIT Press. ISBN 978-0262530071
  • Chomsky, N. 2003. Syntactic Structures, 2nd ed. Walter de Gruyter. ISBN 978-3110172799
  • Enderton, H. B. 2000. A Mathematical Introduction to Logic, 2nd ed. Academic Press. ISBN 978-0122384523
  • Graffi, G. 2001. 200 Years of Syntax: A Critical Survey. Amsterdam: Benjamins. ISBN 978-1588110527
