What is meant by generative grammars in formal languages and automata theory?
Ans.
● In formal language theory, a grammar (when the context is not given, often called a formal grammar for clarity) is a set of production rules for strings in a formal language. The rules describe how to form strings from the language's alphabet that are valid according to the language's syntax. A grammar does not describe the meaning of the strings or what can be done with them in whatever context—only their form.
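As a concrete illustration (the grammar below is a standard textbook example, not taken from the text above), the two productions S → aSb and S → ε generate exactly the language { aⁿbⁿ : n ≥ 0 }. A minimal sketch of applying such production rules:

```python
def derive(n: int) -> str:
    """Derive a string from the grammar S -> aSb | "" by applying
    the recursive rule S -> aSb exactly n times, then erasing S.
    Every string produced this way is a valid sentence of the
    language { a^n b^n }, and every sentence arises this way."""
    s = "S"                          # the start symbol
    for _ in range(n):
        s = s.replace("S", "aSb", 1)  # rewrite the nonterminal
    return s.replace("S", "", 1)      # final rule: S -> empty string
```

For instance, derive(3) rewrites S → aSb → aaSbb → aaaSbbb and then erases the remaining S, yielding "aaabbb". Note the code only describes which strings are well-formed; it says nothing about what they mean, matching the point above.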
Formal language theory, the discipline that studies formal grammars and languages, is a branch of applied mathematics. Its applications are found in theoretical computer science, theoretical linguistics, formal semantics, mathematical logic, and other areas.
A formal grammar is a set of rules for rewriting strings, along with a "start symbol" from which rewriting starts. Therefore, a grammar is usually thought of as a language generator. However, it can also sometimes be used as the basis for a "recognizer"—a function in computing that determines whether a given string belongs to the language or is grammatically incorrect. To describe such recognizers, formal language theory uses separate formalisms, known as automata theory. One of the interesting results of automata theory is that it is not possible to design a recognizer for certain formal languages. Parsing is the process of recognizing an utterance (a string in natural languages) by breaking it down to a set of symbols and analyzing each one against the grammar of the language. Most languages have the meanings of their utterances structured according to their syntax—a practice known as compositional semantics. As a result, the first step to describing the meaning of an utterance in language is to break it down part by part and look at its analyzed form (known as its parse tree in computer science, and as its deep structure in generative grammar).
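To make the generator/recognizer contrast concrete, here is a hand-written recognizer (a sketch, equivalent in power to a one-counter pushdown automaton) for the same textbook language { aⁿbⁿ }. This language is context-free but not regular, so no finite automaton can recognize it — a small instance of the result that some recognizer classes cannot handle certain formal languages:

```python
def recognize(s: str) -> bool:
    """Return True iff s belongs to { a^n b^n : n >= 0 }.
    A single counter stands in for the stack of a pushdown
    automaton: count up over the a's, down over the b's."""
    count = 0
    i = 0
    while i < len(s) and s[i] == "a":  # consume the leading a's
        count += 1
        i += 1
    while i < len(s) and s[i] == "b":  # consume the trailing b's
        count -= 1
        i += 1
    # Accept only if the whole string was consumed (no stray
    # characters, no 'a' after a 'b') and the counts balance.
    return i == len(s) and count == 0
```

A generator produces members of the language; a recognizer answers the membership question for an arbitrary input string — the two views of the same language described above.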
● Generative grammar is a linguistic theory that regards grammar as a system of rules that generates exactly those combinations of words that form grammatical sentences in a given language. Noam Chomsky first used the term in relation to the theoretical linguistics of grammar that he developed in the late 1950s. Linguists who follow the generative approach have been called generativists. The generative school has focused on the study of syntax, but has also addressed other aspects of a language's structure, including morphology and phonology.
Early versions of Chomsky's theory were called transformational grammar, which is still used as a general term that includes his subsequent theories. The most recent of these is the minimalist program. Within this framework, Chomsky and other generativists have argued that many of the properties of a generative grammar arise from a universal grammar that is innate to the human brain, rather than being learned from the environment (see the poverty of the stimulus argument).
There are a number of versions of generative grammar currently practiced within linguistics. A contrasting approach is that of constraint-based grammars. Where a generative grammar attempts to list all the rules that result in all well-formed sentences, constraint-based grammars allow anything that is not otherwise constrained. Constraint-based grammars that have been proposed include certain versions of dependency grammar, head-driven phrase structure grammar, lexical functional grammar, categorial grammar, relational grammar, link grammar, and tree-adjoining grammar. In stochastic grammar, grammatical correctness is taken as a probabilistic variable, rather than a discrete (yes or no) property.
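The stochastic-grammar idea can be sketched in a few lines: attach a probability to each production, and score a derivation by the product of the probabilities of the rules it uses. The toy rules and numbers below are purely illustrative, not drawn from any real grammar or corpus:

```python
from math import prod

# Toy probabilistic context-free grammar: each right-hand side
# carries a probability, and the alternatives for a nonterminal
# sum to 1. All rules and weights here are made up for illustration.
PCFG = {
    "S":  [(("NP", "VP"), 1.0)],
    "NP": [(("dogs",), 0.5), (("cats",), 0.5)],
    "VP": [(("bark",), 0.7), (("sleep",), 0.3)],
}

def best_derivation_prob(symbol: str) -> float:
    """Probability of the most likely terminal string derivable
    from `symbol` (a tiny Viterbi-style recursion over the grammar)."""
    if symbol not in PCFG:  # terminal symbol: contributes factor 1
        return 1.0
    return max(p * prod(best_derivation_prob(s) for s in rhs)
               for rhs, p in PCFG[symbol])
```

Here best_derivation_prob("S") is 1.0 × 0.5 × 0.7 = 0.35, the score of the most probable sentence ("dogs bark" or "cats bark"). Grammaticality becomes a matter of degree rather than a yes-or-no property, as the paragraph above describes.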