Lia Aftanty / 21202241042 / PBI B
Summary of Transformational Grammar
It is called transformational grammar because we can create an infinite number of sentences that we have never seen or heard before with a finite set of words and rules. Sentences are generated by subconscious procedures or rules, and these procedures are thought of as transformations of basic structures. That is why generative grammar is also called transformational grammar or transformational generative grammar.
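To make the idea of infinitely many sentences from finitely many rules concrete, here is a minimal sketch of a toy rewrite-rule generator in Python. The rules and vocabulary are invented for illustration and do not come from the summary; the recursive NP option ("the N that VP") is what lets a handful of rules produce an unbounded number of new sentences.

import random

# Toy rewrite rules (illustrative only). The recursive NP option
# "the N that VP" is what makes the set of sentences unbounded.
RULES = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["the", "N", "that", "VP"]],
    "VP": [["V", "NP"], ["V"]],
    "N":  [["cat"], ["dog"], ["man"]],
    "V":  [["killed"], ["saw"], ["slept"]],
}

def generate(symbol="S"):
    # A symbol with no rule is a word; otherwise expand one chosen rule.
    if symbol not in RULES:
        return [symbol]
    words = []
    for part in random.choice(RULES[symbol]):
        words.extend(generate(part))
    return words

for _ in range(5):
    print(" ".join(generate()))
    # e.g. "the dog that saw the man killed the cat" -- a sentence the
    # grammar was never shown, built by reapplying the same finite rules.

Running the script repeatedly keeps producing new, longer sentences, which is the sense in which a finite grammar generates an infinite language.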
A. Generative Grammar
In generative grammar, sentences are generated by subconscious procedures or rules that are thought of as transformations of basic structures. Chomsky argued that the methods then being used to describe languages on the syntactic level were inadequate (or at least intolerably clumsy), and he proposed a new grammatical model to replace them.
This model had earlier been called a transformation by Harris. According to Harris, two sentences of quite different patterns can be transformationally related if the same lexical items co-occur in both.
For example:
- dog - cat - kill: The cat killed the dog ↔ the dog was killed by the cat
- man - biscuit - eat: The man ate the biscuit ↔ the man who ate the biscuit
- Mary - make - cake: Mary made a cake ↔ the cake which Mary made
Given such a chain, Chomsky's basic contribution is to single out one of the transforms (in this case Mary made a cake) and successively derive all the rest from it. Such a derivative (or generative) relationship is symbolised by an arrow (→): thus, to return to an earlier example, the cat killed the dog → the dog was killed by the cat. Transforms singled out in this way are what Chomsky calls kernel sentences.
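The derivational relation symbolised by → can also be pictured procedurally. Below is a minimal sketch, in Python, of a single passive transformation applied to a kernel sentence of the flat form NP1 - V - NP2. The string-splitting analysis and the small participle table are assumptions made for illustration only; they are not part of Harris's or Chomsky's formulation.

# Sketch of one transformation: kernel "NP1 V NP2" -> "NP2 was V-en by NP1".
# The participle table and the flat string analysis are illustrative assumptions.
PARTICIPLES = {"killed": "killed", "ate": "eaten", "made": "made"}

def passive(kernel):
    words = kernel.split()
    # Locate the verb: here, simply the word listed in the participle table.
    verb_index = next(i for i, w in enumerate(words) if w in PARTICIPLES)
    np1 = " ".join(words[:verb_index])        # subject of the kernel
    np2 = " ".join(words[verb_index + 1:])    # object of the kernel
    return f"{np2} was {PARTICIPLES[words[verb_index]]} by {np1}"

print(passive("the cat killed the dog"))   # the dog was killed by the cat
print(passive("the man ate the biscuit"))  # the biscuit was eaten by the man

The sketch only illustrates the general idea that a transformation maps one structure onto another, so a single kernel can feed many derived sentences.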
Both Chomsky and Lees maintain that no analyst need set out mechanical procedures for arriving at his results. But the fact that no mechanical explanation can be given for a series of decisions does not necessarily mean that no explanation need be given at all, and it would be interesting to know why Lees has decided that such-and-such a construction should be handled in the kernel and such-and-such in the transformational section. This is particularly so as at first sight there seem to be some inconsistencies.

Example (by a subsequent rule): John's there in the garden. There are two alternative derivations:
1. A single kernel: John's in the garden
2. Two kernels taken together: John's there and John's in the garden

B. Inter Alia
To develop this further we could return to the notion of predictability used earlier to handle sentences with Loc and Acc. That, for instance, the stone's ready to drink is collocationally abnormal can be predicted from the abnormality of John drank the stone, and that the sigh is ready to utter is so from the sigh is ready. Furthermore, it is sometimes necessary to distinguish classes of collocations which are relevant on the transformational level alone. Consider the following transformations:

- the boy's got a bicycle ↔ the boy's bicycle, but not the bicycle of the boy
- the boy's got a front brake ↔ the boy's front brake, but not the front brake of the boy
- the bicycle has got a front brake ↔ both the bicycle's front brake and the front brake of the bicycle

C. Stage of Analysis
This stage of analysis produces:
1. Two lexical elements. Example: mulberry ↔ *John mulled the berry, *the berry mulled the wine
2. One lexical element. Examples: cranberry ↔ *cran, whortleberry ↔ *whortle, milch-cow ↔ *milch

D. Formal Transform
Example: