Clark, Roger Brown's "A First Language", and several more. The data, research, test subjects, and experimental approaches used to write this paper are vast and numerous. Because of this review's two-page limit, I will restrict its content to Cook's findings and observations on those matters. The paper begins by discussing how remarkably complex language is to learn.
It suggests that something deeper, such as an innate capability, allows human beings to apprehend all aspects of language. To support this, two states of mind are proposed: S0 and Ss.
The first is the initial state of the child's mind, while the second is the steady state of the adult. To find which capabilities are innate, we must identify what is not acquired between those two states. The concept of Universal Grammar plays an important role here and is presented as an integral part of the initial state S0.
Much of the structure of the system underlying the way we speak is born with us, while the values of the parameters of that system, and the elements of the periphery, are set through experience.
Cook surveys other accounts as well, all of which he deems inadequate on their own. Chomsky is credited with providing the most useful treatment of the subject, and Cook agrees with most of Chomsky's work. Examples of all sorts appear throughout the reading, each illustrating a specific concept. The next section deals with the methods of language acquisition. It distinguishes two categories: the positive evidence of what the language allows, and the means by which we learn what it does not allow. Chomsky divides the evidence available to the learner into three types: positive evidence, direct negative evidence, and indirect negative evidence.
Other forms of acquisition are presented and analyzed. Imitation is shown to play an important role for children: in imitation, children use only the exact sentences they pick up from their environment. Next comes explanatory evidence, which is meant to counteract the inadequacies of acquisition from positive evidence alone.
One problem is that a child old enough to understand an explanation is hardly in need of it. Another is that those who teach children the language rarely have conscious knowledge of its workings and would rarely attempt to explain it anyway. Constant correction of the child's use of the language is also a possibility.
Chomsky is said to note that while a child may make certain errors, the general rules of UG are hardly ever broken and so are in no need of correction.
Figure 3.
For instance, looking at its top right corner, one can deduce that, for any language in the dataset, knowing the values of the parameters EZ3 and PLS is enough to determine the value of EZ2, and therefore of EZ1, too.
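A rule of this kind can be checked mechanically. The sketch below tests whether a set of parameters functionally determines another over a toy table; the parameter names are borrowed from the text, but the language rows and their values are invented for illustration, not taken from the actual dataset.

```python
# Check whether a set of parameters functionally determines another:
# languages that agree on the antecedent parameters must also agree
# on the target parameter. Data below are hypothetical.

def determines(data, antecedents, target):
    """Return True if all languages agreeing on `antecedents`
    also agree on `target`."""
    seen = {}
    for lang in data.values():
        key = tuple(lang[p] for p in antecedents)
        if key in seen and seen[key] != lang[target]:
            return False
        seen.setdefault(key, lang[target])
    return True

# Invented binary parameter settings for four toy languages.
data = {
    "lang_a": {"EZ3": 1, "PLS": 0, "EZ2": 1},
    "lang_b": {"EZ3": 1, "PLS": 1, "EZ2": 0},
    "lang_c": {"EZ3": 0, "PLS": 0, "EZ2": 0},
    "lang_d": {"EZ3": 1, "PLS": 0, "EZ2": 1},  # agrees with lang_a
}

print(determines(data, ["EZ3", "PLS"], "EZ2"))  # True on this toy data
print(determines(data, ["EZ3"], "EZ2"))         # False: EZ3 alone is not enough
```

On real data the same check would be run over the full parameter table, one candidate rule at a time.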
Some of the rules identified by the algorithm are not new: they are already contained in the dataset, encoded by the implicational system described in Section 1.
Even the rest of the learned rules are still just empirical findings: they may change with the addition of further example languages, or their validity may be questioned by linguists on theoretical grounds. Linguistic analysis of the results is ongoing; while no part of the results has been accepted as sufficient evidence to dispose of a parameter, implication rules have been revised on the basis of the learned decision trees, as in the case of the parameter PLS.
According to our definition, the parameter asks whether, in a language without grammaticalized Number, a plural marker can also appear outside a nominal phrase, marking a distributive relation between the plural subject and the constituent bearing it.

Learning Language Family Descriptions

The existing parameters (see Appendix A) were introduced in order to ensure that each language in the database can be uniquely described and separated from the rest on their basis.
On a more general level, one could search for the conditions that separate the languages of one linguistic family from all others. This is, of course, a classical machine learning task of training a classifier, which could serve two purposes: to classify new languages as they are added to the database, or to describe the conditions separating one family from the rest.
Again, a decision tree could be produced for this purpose. Instead, we adopt another algorithm, Candidate Elimination (CE) (Mitchell), to learn all possible hypotheses (classifiers). CE is a classical algorithm for learning in logic over propositional data. Each hypothesis covers (implies) all positive examples and covers none of the negative ones. If no hypothesis of this form with these properties can be produced, the result is an empty set of hypotheses.
The set of all such hypotheses is also known as the version space of hypotheses for the given dataset. While such a logic-based approach makes the algorithm rather sensitive to any noise (errors) in the data, here we assume that, at this stage of the work, our data are error-free.
The output of CE consists of three parts, the first of which is the set of most specific hypotheses S. We applied CE to learning descriptions of two families of languages, Romance and Indo-European (IE), in order to explore the insights it provides. Both families are well established, with the latter subsuming the former. There was a single most specific hypothesis (MSH) for each of the two families (see Table 1).
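Under the standard conjunctive-hypothesis formulation of CE over binary attributes, the MSH for a family is simply the set of parameter values shared by every one of its member languages, checked for consistency against the non-members. A minimal sketch follows; the parameter names and language vectors are invented for illustration and are not the paper's actual dataset.

```python
# Sketch of the S-boundary step of Candidate Elimination on binary
# parameter vectors: the single most specific hypothesis (MSH) is the
# conjunction of all parameter values shared by every positive example.
# Parameter names and settings below are hypothetical.

def most_specific_hypothesis(positives):
    """Constraints shared by all positive examples, as {param: value}."""
    msh = dict(positives[0])
    for vec in positives[1:]:
        msh = {p: v for p, v in msh.items() if vec[p] == v}
    return msh

def covers(hypothesis, example):
    """A hypothesis covers an example if all its constraints hold."""
    return all(example[p] == v for p, v in hypothesis.items())

# Two invented "Romance-like" positives and one negative example.
romance = [
    {"FGP": 1, "DGR": 1, "NSD": 0},
    {"FGP": 1, "DGR": 1, "NSD": 1},
]
non_romance = [{"FGP": 0, "DGR": 1, "NSD": 0}]

msh = most_specific_hypothesis(romance)
print(msh)                                       # {'FGP': 1, 'DGR': 1}
print(any(covers(msh, n) for n in non_romance))  # False: MSH is consistent
```

If the MSH computed this way covered some negative example, no consistent conjunctive hypothesis would exist and the version space would be empty.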
All parameter constraints for the IE family are shared with the Romance family, as expected, while the constraints listed in boldface are specific to the Romance family.
This distinction can help guide hypotheses about the last common ancestor of each family, thus providing insight into the evolution of the languages within each family and into the parameters that defined their divergent properties. Looking at the set G of most general hypotheses (MGHs) for each family can provide further insight in this direction.
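For boolean parameters, maximally general hypotheses often reduce to single-parameter constraints that alone suffice to exclude every language outside the family. The toy sketch below enumerates such one-parameter hypotheses; as before, the parameter names and data are invented, not drawn from the actual database.

```python
# Sketch of reading off maximally general one-parameter hypotheses:
# constraints {param: value} that hold for all family members while
# excluding every outside language. Data are hypothetical.

def single_param_generals(positives, negatives):
    gens = []
    for p in positives[0]:
        values = {vec[p] for vec in positives}
        if len(values) == 1:                    # shared by all positives
            v = values.pop()
            if all(neg[p] != v for neg in negatives):
                gens.append({p: v})             # alone rules out all negatives
    return gens

family = [{"A": 1, "B": 0, "C": 1}, {"A": 1, "B": 1, "C": 1}]
others = [{"A": 0, "B": 0, "C": 1}, {"A": 0, "B": 1, "C": 0}]

print(single_param_generals(family, others))  # [{'A': 1}]
```

In general the G boundary of CE may also contain multi-parameter conjunctions; the single-parameter case is shown because it is the one that most directly delineates family boundaries.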
A closer look at the parameters appearing in these hypotheses reveals that they are particularly useful for delineating boundaries between language families. The data contain more features (parameters) than examples (languages). The most obvious remedy, adding more languages, comes at a very high cost, as it requires well-trained linguists and an abundance of subtle yet typologically wide evidence.
Yet another approach is to collect data selectively for rules of interest, since only a small number of parameters is involved in each. This research could have important implications for the understanding of the processes underlying the faculty of language (potentially strengthening the case for UG by reinforcing its adequacy as a restrictive typological model and as a tool for insightful historical reconstruction), with consequences ranging from models of language acquisition to phylogenetic linguistics, where the syntactic relatedness between two languages may be more adequately measured.
However, the approach requires a close collaboration between a machine learning expert, who discovers empirical laws in the data, and a linguist, who can test their plausibility and theoretical consequences. There is also an open theoretical challenge for computational learning here: estimating the significance of empirical findings drawn from a given number of examples (languages) with respect to the available range of discriminative features in the dataset.
References

Baker, M. The Atoms of Language. New York: Basic Books.
Bortolussi, L. How many possible languages are there? In Bel-Enguix, V. (ed.). Amsterdam: IOS.
Chomsky, N. Current Issues in Linguistic Theory. The Hague: Mouton.
Chomsky, N. Lectures on Government and Binding. Dordrecht: Foris.
Clark, R. A computational model of language learnability and language change. Linguistic Inquiry, 24.
Creanza, N. A comparison of worldwide phonemic and genetic variation in human populations. Proceedings of the National Academy of Sciences.