Language Evolution and Computation Bibliography

Our site (www.isrl.uiuc.edu/amag/langev) has retired; please use https://langev.com instead.
J. L. Elman
2009
On the meaning of words and dinosaur bones: Lexical knowledge without a lexicon
Cognitive Science 33(4):547-582, 2009
Although for many years a sharp distinction has been made in language research between rules and words, with primary interest on rules, this distinction is now blurred in many theories. If anything, the focus of attention has shifted in recent years in favor of ...
2000
Hybrid Neural Symbolic Integration, 2000
This paper proposes an account of the acquisition of grammatical relations using the basic concepts of connectionism and a construction-based theory of grammar. Many previous accounts of first-language acquisition assume that grammatical relations (e.g., the ...
1999
The emergence of language: A conspiracy theory
Emergence of Language, 1999
Language is puzzling. On the one hand, there are compelling reasons to believe that the possession of language by humans has deep biological roots. We are the only species that has a communication system with the complexity and richness of language. There are ...
1998
Generalization, simple recurrent networks, and the emergence of structure
Proceedings of the Twentieth Annual Conference of the Cognitive Science Society, 1998
If human behavior were list-like, accounting for human behavior would be simple: Just enumerate the list of possible stereotypies. Alternatively, if behavior were predictable on the basis of abstract, fully-productive, context-insensitive rules, our task would be different but ...
1995
Language as a dynamical system
Mind as Motion: Explorations in the Dynamics of Cognition, pages 195-223, 1995
Despite considerable diversity among theories about how humans process language, there are a number of fundamental assumptions that are shared by most such theories. This consensus extends to the very basic question about what counts as a cognitive process. ...
Learning and morphological change
Cognition 56(1):61-98, 1995
An account is offered of change over time in English verb morphology, based on a connectionist approach to how morphological knowledge is acquired and used. A technique is first described that was developed for modeling historical change in connectionist networks, and that technique is applied to model English verb inflection as it developed from the highly complex past-tense system of Old English towards that of the modern language, with one predominant "regular" inflection and a small number of irregular forms. The model relies on the fact that certain input-output mappings are easier than others to learn in a connectionist network. Highly frequent patterns, or those that share phonological regularities with a number of others, are learned more quickly and with lower error than low-frequency, highly irregular patterns. A network is taught a data set representative of the verb classes of Old English, but learning is stopped before reaching asymptote, and the output of this network is used as the teacher of a new net. As a result, the errors in the first network are passed on to become part of the data set of the second. The patterns that are hardest to learn lead to the most errors and over time are "regularized" to fit a more dominant pattern. The results of the network simulations were highly consistent with the major historical developments. These results are predicted from well-understood aspects of network dynamics, which therefore provide a rationale for the shape of the attested changes.
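The generational transmission dynamic the abstract describes, where an undertrained learner's output becomes the next learner's training data, can be illustrated without a network at all. The sketch below is a hypothetical, non-connectionist stand-in: a frequency-sensitive learner that masters a past-tense form only if it hears it often enough, and otherwise falls back on the dominant pattern. The verbs, frequencies, exposure budget, and mastery threshold are all invented for illustration.

```python
import random

def regularize(verb):
    # The dominant "add -ed" pattern that rarely-heard irregulars drift toward.
    return verb + "ed"

def learn(target, exposure, rng):
    """One generation: sample limited input from the previous generation's
    output; forms heard too rarely to be mastered are regularized, so the
    learner's errors become part of the next generation's data."""
    verbs = list(target)
    weights = [target[v]["freq"] for v in verbs]
    counts = {v: 0 for v in verbs}
    for _ in range(exposure):
        counts[rng.choices(verbs, weights=weights)[0]] += 1
    learned = {}
    for v in verbs:
        form = target[v]["past"]
        if form != regularize(v) and counts[v] < 3:  # mastery threshold (assumed)
            form = regularize(v)
        learned[v] = {"past": form, "freq": target[v]["freq"]}
    return learned

rng = random.Random(0)
lexicon = {
    "go":   {"past": "went",   "freq": 200},  # high-frequency irregular
    "help": {"past": "halp",   "freq": 2},    # rare irregular (invented OE-style form)
    "walk": {"past": "walked", "freq": 50},   # already regular
}
for _ in range(20):  # twenty generations of imperfect transmission
    lexicon = learn(lexicon, exposure=100, rng=rng)
```

Over the generations the rare irregular is regularized to "helped", while the high-frequency "went" survives; this is the same frequency-conditioned regularization the network simulations exhibit.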
1993
Learning and development in neural networks: The importance of starting small
Cognition 48(1):71-99, 1993
It is a striking fact that in humans the greatest learning occurs precisely at that point in time - childhood - when the most dramatic maturational changes also occur. This report describes possible synergistic interactions between maturational change and the ability to learn a complex domain (language), as investigated in connectionist networks. The networks are trained to process complex sentences involving relative clauses, number agreement, and several types of verb argument structure. Training fails in the case of networks which are fully formed and 'adultlike' in their capacity. Training succeeds only when networks begin with limited working memory and gradually 'mature' to the adult state. This result suggests that rather than being a limitation, developmental restrictions on resources may constitute a necessary prerequisite for mastering certain complex domains. Specifically, successful learning may depend on starting small.
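The "limited working memory" regime in these simulations was implemented by wiping the recurrent context after every few words and letting that span grow over training. The sketch below illustrates only that schedule, applied to token sequences; the span values and example sentence are invented for illustration.

```python
def limited_memory_view(tokens, span):
    """Wipe the 'context' every `span` tokens, so dependencies longer than
    `span` are invisible to the learner at that stage of training."""
    context, views = [], []
    for i, tok in enumerate(tokens):
        if i % span == 0:
            context = []            # periodic reset of the context units
        context.append(tok)
        views.append(list(context))
    return views

sentence = ["boys", "who", "the", "dog", "chases", "run"]
# 'Maturation': the effective memory span grows across training phases.
phases = [limited_memory_view(sentence, span) for span in (3, 4, 6)]
```

With a span of 3 the agreement between "boys" and "run" never falls inside a single view, so the long-distance dependency is simply absent from the early learner's effective input; once the span covers the whole sentence, the dependency becomes learnable.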
1991
Distributed representations, simple recurrent networks, and grammatical structure
Machine Learning 7:195-224, 1991
In this paper three problems for a connectionist account of language are considered: 1. What is the nature of linguistic representations? 2. How can complex structural relationships such as constituency be represented? 3. How can the apparently open-ended nature of ...
1990
Finding structure in time
Cognitive Science 14(2):179-211, 1990
Time underlies many interesting human behaviors. Thus, the question of how to represent time in connectionist models is very important. One approach is to represent time implicitly by its effects on processing rather than explicitly (as in a spatial representation). The current report develops a proposal along these lines first described by Jordan (1986) which involves the use of recurrent links in order to provide networks with a dynamic memory. In this approach, hidden unit patterns are fed back to themselves; the internal representations which develop thus reflect task demands in the context of prior internal states. A set of simulations is reported which range from relatively simple problems (temporal version of XOR) to discovering syntactic/semantic features for words. The networks are able to learn interesting internal representations which incorporate task demands with memory demands; indeed, in this approach the notion of memory is inextricably bound up with task processing. These representations reveal a rich structure, which allows them to be highly context-dependent while also expressing generalizations across classes of items. These representations suggest a method for representing lexical categories and the type/token distinction.
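A minimal simple recurrent network of the kind described can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the original simulations: during backpropagation the copied-back context is treated as a frozen extra input (no unrolling through time), and the task is the temporal XOR mentioned in the abstract, where every third bit in the stream is the XOR of the previous two. The hidden-layer size, learning rate, and epoch count are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Temporal XOR stream: bits arrive in triples (a, b, a XOR b), so only
# every third bit is predictable from its predecessors.
bits = []
for _ in range(500):
    a, b = rng.integers(0, 2, size=2)
    bits += [int(a), int(b), int(a ^ b)]
bits = np.array(bits, dtype=float)

H = 8                              # hidden (= context) units, assumed size
Wx = rng.normal(0, 0.5, H)         # input -> hidden (the input is one bit)
Wh = rng.normal(0, 0.5, (H, H))    # context -> hidden (copy-back links)
bh = np.zeros(H)
Wo = rng.normal(0, 0.5, H)         # hidden -> output
bo = 0.0
lr = 0.1

def run_epoch(update):
    """One pass over the stream, predicting the next bit at each step."""
    global Wx, Wh, bh, Wo, bo
    h = np.zeros(H)
    total = 0.0
    for t in range(len(bits) - 1):
        x, target = bits[t], bits[t + 1]
        h_prev = h                                  # context = previous hidden state
        h = np.tanh(Wx * x + Wh @ h_prev + bh)
        y = 1.0 / (1.0 + np.exp(-(Wo @ h + bo)))    # next-bit prediction
        total += (y - target) ** 2
        if update:
            # Backprop through the current step only: the copied-back
            # context h_prev is treated as a fixed input, not unrolled.
            dy = 2.0 * (y - target) * y * (1.0 - y)
            dh = dy * Wo * (1.0 - h ** 2)
            Wo -= lr * dy * h
            bo -= lr * dy
            Wx -= lr * dh * x
            bh -= lr * dh
            Wh -= lr * np.outer(dh, h_prev)
    return total / (len(bits) - 1)

loss_before = run_epoch(update=False)
for _ in range(10):
    run_epoch(update=True)
loss_after = run_epoch(update=False)
```

Even this truncated, one-step training reduces the mean prediction error on the stream: the hidden state can carry the two preceding bits, which is exactly what the predictable third bit requires.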