1994 :: PROCEEDINGS
Artificial Life IV
Altruism in the evolution of communication
Artificial Life IV, pages 40-48, 1994
Computer models of evolutionary phenomena often assume that the fitness of an individual can be evaluated in isolation, but effective communication requires that individuals interact. Existing models directly reward speakers for improved behavior on the part of the listeners so that, essentially, effective communication is fitness. We present new models in which, even though 'speaking truthfully' provides no tangible benefit to the speaker, effective communication nonetheless evolves. A large population is spatially distributed so that 'communication range' approximately correlates with 'breeding range'; most of the time 'you'll be talking to family,' allowing kin selection to encourage the emergence of communication. However, the emergence of altruistic communication also creates niches that can be exploited by 'information parasites.' The new models display complex and subtle long-term dynamics as the global implications of such social dilemmas are played out.
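A minimal sketch of the kin-selection mechanism this abstract describes, assuming a ring population, a fixed cost to 'honest' speakers, a benefit to listeners in range, and fitness-weighted local reproduction; the topology, parameters, and fitness scheme are all illustrative assumptions, not the authors' model:

```python
# Sketch of the kin-selection idea: altruistic 'speaking' is costly to the
# speaker and benefits nearby listeners, while offspring are placed near
# their parents, so neighbours tend to be kin. All constants are assumed.
import random

SIZE, RANGE, COST, BENEFIT, GENS, MUT = 200, 2, 0.1, 0.3, 500, 0.01

pop = [random.random() < 0.05 for _ in range(SIZE)]  # True = honest speaker

for gen in range(GENS):
    fitness = [1.0] * SIZE
    for i, honest in enumerate(pop):
        if honest:
            fitness[i] -= COST                      # altruistic cost to speaker
            for d in range(-RANGE, RANGE + 1):
                if d != 0:
                    fitness[(i + d) % SIZE] += BENEFIT  # benefit to listeners
    # Local reproduction: each site is won by a fitness-weighted draw from its
    # neighbourhood, so 'breeding range' matches 'communication range'.
    new = []
    for i in range(SIZE):
        hood = [(i + d) % SIZE for d in range(-RANGE, RANGE + 1)]
        weights = [max(fitness[j], 0.0) for j in hood]
        parent = random.choices(hood, weights=weights)[0] if sum(weights) else i
        gene = pop[parent] if random.random() > MUT else not pop[parent]
        new.append(gene)
    pop = new

print(f"honest speakers after {GENS} generations: {sum(pop)}/{SIZE}")
```

Because offspring land near their parents, the listeners within communication range tend to carry the speaker's gene, which is what lets costly honesty persist in this toy setting.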
Innate biases and critical periods: Combining evolution and learning in the acquisition of syntax
Artificial Life IV, pages 160-171, 1994
Recurrent neural networks can be trained to recognize strings generated by context-free grammars, but the ability of the networks to do so depends on their having an appropriate set of initial connection weights. Simulations of evolution were performed on populations of simple recurrent networks where the selection criterion was the ability of the networks to recognize strings generated by grammars. The networks evolved sets of initial weights from which they could reliably learn to recognize the strings. In order to recognize whether a string was generated by a given context-free grammar, it is necessary to use a stack or counter to keep track of the depth of embedding in the string. The networks that evolved in our simulations are able to use the values passed along their recurrent connections for this purpose. Furthermore, populations of networks can evolve a bias towards learning the underlying regularities in a class of related languages. These results suggest a new explanation for the "critical period" effects observed in the acquisition of language and other cognitive faculties. Instead of being the result of an exogenous maturational process, the degraded acquisition ability may be the result of the values of innately specified initial weights diverging in response to training on spurious input.
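As a toy illustration of the evolution-plus-learning loop described above, the sketch below evolves initial weights for a one-unit recurrent 'counter' and scores each individual by its accuracy after a brief learning phase on a^n b^n strings; the architecture, the hill-climbing stand-in for gradient training, and every constant are assumptions for illustration, not the paper's simulations:

```python
# Evolve *initial* weights; fitness is accuracy *after* lifetime learning,
# mirroring the abstract's claim that networks evolve starting points from
# which they can reliably learn. All details here are assumed.
import random

def accepts(w, s):
    # Recurrent state acts as a counter: +w[0] on 'a', +w[1] on 'b'.
    h = 0.0
    for c in s:
        h += w[0] if c == 'a' else w[1]
        if h < -0.5:            # counter underflow -> reject early
            return False
    return abs(h) < 0.5         # balanced iff counter returns near zero

DATA = [('a' * n + 'b' * n, True) for n in range(1, 6)] + \
       [('a' * n + 'b' * (n + 1), False) for n in range(1, 6)]

def accuracy(w):
    return sum(accepts(w, s) == label for s, label in DATA) / len(DATA)

def learn(w, steps=30, lr=0.2):
    # 'Lifetime learning': greedy perturbation from the inherited weights.
    w = list(w)
    for _ in range(steps):
        cand = [wi + random.gauss(0, lr) for wi in w]
        if accuracy(cand) >= accuracy(w):
            w = cand
    return w

pop = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(20)]
for gen in range(30):
    scored = sorted(pop, key=lambda w: accuracy(learn(w)), reverse=True)
    elite = scored[:5]
    pop = elite + [[wi + random.gauss(0, 0.1) for wi in random.choice(elite)]
                   for _ in range(15)]

print("post-learning accuracy of best lineage:", accuracy(learn(pop[0])))
```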
Artificial Life III
Coevolving High-Level Representations
Artificial Life III, pages 55-71, 1994
Several evolutionary simulations allow for a dynamic resizing of the genotype. This is an important alternative to constraining the genotype's maximum size and complexity. In this paper, we add an additional dynamic to simulated evolution with the description of a ...
Proceedings of The IEEE Conference on Evolutionary Computation, IEEE World Congress on Computational Intelligence
Proceedings of The IEEE Conference on Evolutionary Computation, IEEE World Congress on Computational Intelligence, 1994
This paper investigates the use of genetic algorithms for inferring small regular and context-free grammars. Applied simply, a genetic algorithm is not very effective at this. To overcome this problem we investigate two methods of structuring the chromosomes. The ...
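The excerpt breaks off before describing the two chromosome-structuring methods, so the sketch below shows only what the 'applied simply' baseline might look like: chromosomes are flat DFA transition/accept tables (a stand-in for a small regular grammar), scored by agreement with labelled sample strings. The encoding, samples, and GA settings are all assumptions:

```python
# Baseline GA for inferring a small DFA from labelled strings. Chromosome
# layout: STATES*len(ALPHABET) transition entries, then STATES accept bits.
import random

ALPHABET = "ab"
STATES = 4
SAMPLES = [("ab", True), ("abab", True), ("aabb", False),
           ("ba", False), ("ababab", True), ("", False)]

def run(chrom, s):
    state = 0
    for c in s:
        state = chrom[state * len(ALPHABET) + ALPHABET.index(c)]
    return bool(chrom[STATES * len(ALPHABET) + state])

def fitness(chrom):
    return sum(run(chrom, s) == label for s, label in SAMPLES)

def random_chrom():
    trans = [random.randrange(STATES) for _ in range(STATES * len(ALPHABET))]
    accept = [random.randrange(2) for _ in range(STATES)]
    return trans + accept

pop = [random_chrom() for _ in range(50)]
for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    if fitness(pop[0]) == len(SAMPLES):
        break
    parents = pop[:10]
    children = []
    for _ in range(40):
        a, b = random.sample(parents, 2)
        cut = random.randrange(len(a))
        child = a[:cut] + b[cut:]               # one-point crossover
        i = random.randrange(len(child))        # point mutation
        limit = STATES if i < STATES * len(ALPHABET) else 2
        child[i] = random.randrange(limit)
        children.append(child)
    pop = parents + children

print("best fitness:", fitness(pop[0]), "of", len(SAMPLES))
```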
1994 :: JOURNAL
Nature
Nature 372:325, 1994
METHODS. Subjects were tested at Base Camp (altitude 5,300 m) before and after a summit climb attempt, at Camp Two (6,300 m), and at Camp Three (7,150 m), within a day of arriving at each location. No supplementary oxygen was used at the testing altitudes. Each subject ...
Artificial Life
Toward Synthesizing Artificial Neural Networks that Exhibit Cooperative Intelligent Behavior: Some Open Issues in Artificial Life
Artificial Life 1(1):111-134, 1994
The tasks that animals perform require a high degree of intelligence. Animals forage for food, migrate, navigate, court mates, rear offspring, defend against predators, construct nests, and so on. These tasks commonly require social interaction/cooperation and are accomplished by animal nervous systems, which are the result of billions of years of evolution and complex developmental/learning processes. The Artificial Life (AL) approach to synthesizing intelligent behavior is guided by this biological perspective. In this article we examine some of the numerous open problems in synthesizing intelligent animal behavior (especially cooperative behavior involving communication) that face the field of AL, a discipline still in its infancy.
Mathematical and Computer Modelling
Mathematical and Computer Modelling 19(12):27-36, 1994
This paper presents the finding that the invocation of new words in human language samples is governed by a slowly changing Poisson process with a time-dependent rate constant λ(t). The form of λ(t) implies that there are opening, middle and final phases to the introduction of new words, each distinguished by a dominant rate constant or, equivalently, rate of decay. With the occasional exception of the phase transition from beginning to middle, the rate λ(t) decays monotonically. Thus, λ(t) quantifies how the penchant of humans to introduce new words declines with the progression of their narratives, written or spoken.
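The paper's formula for λ(t) is not reproduced above, so the sketch below assumes, purely for illustration, a three-phase exponential decay consistent with the description (opening, middle and final phases, each with its own decay rate) and draws word-introduction times with the standard thinning method for inhomogeneous Poisson processes:

```python
# Simulate new-word arrivals from an inhomogeneous Poisson process whose
# rate decays in three phases. The phase boundaries and decay constants are
# assumptions for illustration, not the paper's fitted values.
import math, random

A = 1.0                                          # assumed opening-phase rate

def lam(t):
    if t < 100:
        return A * math.exp(-0.020 * t)          # opening phase
    m = A * math.exp(-2.0)                       # rate at end of opening phase
    if t < 500:
        return m * math.exp(-0.004 * (t - 100))  # middle phase
    f = m * math.exp(-1.6)                       # rate at end of middle phase
    return f * math.exp(-0.001 * (t - 500))      # final phase

def thin(T, lam, lam_max):
    # Thinning: draw candidates from a rate-lam_max Poisson process and keep
    # each candidate at time t with probability lam(t) / lam_max.
    t, events = 0.0, []
    while True:
        t += random.expovariate(lam_max)
        if t > T:
            return events
        if random.random() < lam(t) / lam_max:
            events.append(t)

words = thin(1000.0, lam, A)
print(f"{len(words)} new word types introduced; first few at",
      [round(t, 1) for t in words[:5]])
```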
Mind and Language
Generalization and connectionist language learning
Mind and Language 9:273-287, 1994
The performance of any learning system may be assessed by its ability to generalize from past experience to novel stimuli. Hadley (this issue) points out that in much connectionist research, this ability has not been viewed in a sophisticated way. Typically, the 'test-set' ...
1994 :: EDIT BOOK
Origins of Semiosis: Sign Evolution in Nature and Culture
Language Evolution: A Darwinian Process
Origins of Semiosis: Sign Evolution in Nature and Culture, pages 269-292, 1994
Unlike biology, where science was able to prevail upon metaphysics as early as the last century, linguistics still lives in the cozy world where the postulated distinction between body and mind provides a rationale for thinking that evolution must be confined to bones and ...
1994 :: BOOK
The evolution of grammar: tense, aspect and modality in the languages of the world
University of Chicago Press, 1994
Joan Bybee and her colleagues present a new theory of the evolution of grammar that links structure and meaning in a way that directly challenges most contemporary versions of generative grammar. This study focuses on the use and meaning of grammatical markers ...
On language change: The invisible hand in language
Routledge, London, 1994
In the twentieth century, linguistics has been dominated by two paradigms: those of Saussure and Chomsky. In both these philosophies of linguistics, language change was left aside as an unsolvable mystery which challenged theoretical entirety. In On Language ...
The Language Instinct: How the Mind Creates Language
HarperCollins, 1994
In this classic study, the world's leading expert on language and the mind lucidly explains everything you always wanted to know about language: how it works, how children learn it, how it changes, how the brain computes it, and how it evolved. With wit, erudition, and deft use of everyday examples of humor and wordplay, Steven Pinker weaves our vast knowledge of language into a compelling story: language is a human instinct, wired into our brains by evolution like web-spinning in spiders or sonar in bats. The Language Instinct received the William James Book Prize from the American Psychological Association and the Public Interest Award from the Linguistic Society of America.
The Origin of Language: Tracing the Evolution of the Mother Tongue
Wiley, 1994
Language and History: Voices from the Past.
Language Families: What Is Known.
Controversy: What Is Debated.
Native Americans: Language in the New World.
The Origin of Language: Are There Global Cognates?
A Window on the World: What Has Been Resolved.
Genes: Biology and Language.
The Emerging Synthesis: On the Origin of Modern Humans.
Epilogue.
Lawrence Erlbaum Associates, 1994
This volume is the direct result of a conference in which a number of leading researchers from the fields of artificial intelligence and biology gathered to examine whether there was any ground to assume that a new AI paradigm was forming itself and what the essential ...
1994 :: PHD THESIS
Infinite Languages, Finite Minds: Connectionism, Learning and Linguistic Structure
University of Edinburgh, Scotland, 1994
This thesis presents a connectionist theory of how infinite languages may fit within finite minds. Arguments are presented against the distinction between linguistic competence and observable language performance. It is suggested that certain kinds of finite state automata--i.e., recurrent neural networks--are likely to have sufficient computational power, and the necessary generalization capability, to serve as models for the processing and acquisition of linguistic structure. These arguments are further corroborated by a number of computer simulations, demonstrating that recurrent connectionist models are able to learn complex recursive regularities and have powerful generalization abilities. Importantly, the performance evinced by the networks is comparable with observed human behavior on similar aspects of language. Moreover, an evolutionary account is provided, advocating a learning and processing based explanation of the origin and subsequent phylogenetic development of language. This view construes language as a nonobligate symbiont, arguing that language has evolved to fit human learning and processing mechanisms, rather than vice versa. As such, this perspective promises to explain linguistic universals in functional terms, and motivates an account of language acquisition which incorporates innate, but not language-specific, constraints on the learning process. The purported poverty of the stimulus is re-appraised in this light, and it is concluded that linguistic structure may be learnable by bottom-up statistical learning models such as connectionist neural networks.