Language Evolution and Computation Bibliography

Our site (www.isrl.uiuc.edu/amag/langev) has retired; please use https://langev.com instead.
Pieter Wellens
2013
Fluid Construction Grammar for Historical and Evolutionary Linguistics
ACL, pages 127-132, 2013
Fluid Construction Grammar (FCG) is an open-source computational grammar formalism that is becoming increasingly popular for studying the history and evolution of language. This demonstration shows how FCG can be used to operationalise the cultural processes and cognitive mechanisms that underlie language evolution and change.
2012
Multi-Dimensional Meanings in Lexicon Formation
Experiments in Cultural Language Evolution, pages 143-166, 2012
This chapter introduces a language game experiment for studying the formation of a shared lexicon when word meanings are not restricted to a single domain, but instead consist of any combination of perceptual features from many different domains. The main difficulty for the ...
2010
Proceedings of the 8th International Conference on the Evolution of Language, pages 344-351, 2010
According to recent developments in (computational) Construction Grammar, language processing occurs through the incremental buildup of meaning and form according to constructional specifications. If the number of available constructions becomes large, however, this results in a search process that quickly becomes cognitively infeasible without the aid of additional guiding principles. One of the main mechanisms the brain recruits (in all sorts of tasks) to optimise processing efficiency is priming. Priming in turn requires a specific organisation of the constructions. Processing efficiency must therefore have been one of the main evolutionary pressures driving the organisation of linguistic constructions. In this paper we show how constructions can be organised in a constructional dependency network in which constructions are linked through semantic and syntactic categories. Using Fluid Construction Grammar, we show how such a network can be learned incrementally in a usage-based fashion, and how it can be used to guide processing by priming the suitable constructions.
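The core idea of the abstract above, i.e. constructions linked through shared categories, with priming spreading activation to neighbours, can be sketched as a toy model. This is an illustrative simplification, not the FCG implementation; all class, construction, and category names here are invented:

```python
from collections import defaultdict

class ConstructionNetwork:
    """Toy constructional dependency network: constructions are linked
    through shared semantic/syntactic categories, and applying one
    construction primes the constructions linked to it."""

    def __init__(self):
        self.by_category = defaultdict(set)   # category -> constructions using it
        self.categories = defaultdict(set)    # construction -> its categories
        self.activation = defaultdict(float)  # current priming level

    def add(self, construction, categories):
        self.categories[construction] = set(categories)
        for cat in categories:
            self.by_category[cat].add(construction)

    def prime(self, construction, boost=1.0):
        # Spread activation to every construction sharing a category.
        for cat in self.categories[construction]:
            for neighbour in self.by_category[cat]:
                if neighbour != construction:
                    self.activation[neighbour] += boost

    def ranked(self, candidates):
        # During processing, try more strongly primed constructions first.
        return sorted(candidates, key=lambda c: -self.activation[c])

net = ConstructionNetwork()
net.add("transitive-cxn", ["verb-phrase", "agent-role"])
net.add("passive-cxn", ["verb-phrase"])
net.add("colour-cxn", ["modifier"])
net.prime("transitive-cxn")  # primes passive-cxn via the shared verb-phrase category
```

After priming, `net.ranked(["colour-cxn", "passive-cxn"])` puts `passive-cxn` first, which is how priming would dampen search: semantically and syntactically related constructions are tried before unrelated ones.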
2008
Coping with Combinatorial Uncertainty in Word Learning: A Flexible Usage-Based Model
Proceedings of the 7th International Conference on the Evolution of Language, pages 370-377, 2008
Scaling up the complexity of a language game brings about a towering scale-up in the uncertainty the agents are faced with when acquiring (lexical) form-meaning associations. The two most prominent assumptions influencing the uncertainty in models of word meaning concern (1) meaning transfer and (2) whether a form can be associated with only one part of meaning or any subset of parts of meaning. If meaning has internal structure (e.g. sets of attributes) this second assumption amounts to whether a form can be associated with only one attribute, giving rise to linear uncertainty, or any subset of attributes, resulting in exponential uncertainty. We first present a short overview of different models that each tried to tackle at least one of these assumptions. We propose a new model borrowing ideas from many of these models that can handle the exponential increase in uncertainty when removing both assumptions and allows scaling towards very large meaning spaces (i.e. worlds).
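The flexible coupling the abstract argues for can be approximated by keeping a graded weight between a word and each individual attribute, rather than committing to one attribute or one fixed subset. A minimal cross-situational sketch, with invented parameter values (the paper's actual update rules may differ):

```python
from collections import defaultdict

class FlexibleWordLearner:
    """Toy flexible word learner: each word holds weighted associations
    with individual attributes; its operational meaning is whatever
    subset currently exceeds a threshold, so meanings can shift between
    specific and general as evidence accumulates."""

    def __init__(self, reward=0.2, punish=0.1):
        self.weights = defaultdict(lambda: defaultdict(float))
        self.reward = reward    # illustrative value
        self.punish = punish    # illustrative value

    def observe(self, word, context_attributes):
        scores = self.weights[word]
        # Strengthen attributes that co-occur with the word ...
        for attr in context_attributes:
            scores[attr] = min(1.0, scores[attr] + self.reward)
        # ... and weaken previously seen attributes that are absent.
        for attr in list(scores):
            if attr not in context_attributes:
                scores[attr] = max(0.0, scores[attr] - self.punish)

    def meaning(self, word, threshold=0.5):
        return {a for a, w in self.weights[word].items() if w > threshold}

learner = FlexibleWordLearner()
for context in [{"round", "red"}, {"round", "blue"},
                {"round", "red"}, {"round", "blue"}]:
    learner.observe("ball", context)
```

After these four exposures only the consistently co-occurring attribute survives the threshold, so `learner.meaning("ball")` is `{"round"}`: the uncertainty over attribute subsets is exponential, but the per-attribute bookkeeping stays linear.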
Connection Science 20(2-3):173-191, 2008
Learning the meanings of words requires coping with referential uncertainty - a learner hearing a novel word cannot be sure which aspects or properties of the referred object or event comprise the meaning of the word. Data from developmental psychology suggest that human learners grasp the important aspects of many novel words after just a few exposures, a phenomenon known as fast mapping. Traditionally, word learning is viewed as a mapping task, in which the learner has to map a set of forms onto a set of pre-existing concepts. We criticise this approach and argue instead for a flexible nature of the coupling between form and meanings as a solution to the problem of referential uncertainty. We implemented and tested the model in populations of humanoid robots that play situated language games about objects in their shared environment. Results show that the model can handle an exponential increase in uncertainty and allows scaling towards very large meaning spaces, while retaining the ability to grasp an operational meaning almost instantly for a great number of words. In addition, the model captures some aspects of the flexibility of form-meaning associations found in human languages. Meanings of words can shift between being very specific (names) and general (e.g. 'small'). We show that this specificity is biased not by the model itself but by the distribution of object properties in the world.
2007
ECAL07 4648:425-434, 2007
Language can be viewed as a complex adaptive system which is continuously shaped and reshaped by the actions of its users as they try to solve communicative problems. To maintain coherence in the overall system, different language elements (sounds, words, grammatical constructions) compete with each other for global acceptance. This paper examines what happens when a language system uses systematic structure, in the sense that certain meaning-form conventions are themselves parts of larger units. We argue that in this case multi-level selection occurs: at the level of elements (e.g. tense affixes) and at the level of larger units in which these elements are used (e.g. phrases). Achieving and maintaining linguistic coherence in the population under these conditions is non-trivial. This paper shows that it is nevertheless possible when agents take multiple levels into account both for processing meaning-form associations and for consolidating the language inventory after each interaction.
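The multi-level consolidation step described above can be sketched as a lateral-inhibition update applied at every level at once. This is an illustrative sketch under assumed update rules, not the authors' exact consolidation strategy:

```python
def update_inventory(inventory, used, competitors, success, delta=0.1):
    """After an interaction, reward the item used at each level (affix,
    phrase, ...) on success and inhibit its direct competitors at that
    level; on failure, punish the used item. `delta` is illustrative."""
    for level, item in used.items():
        if success:
            inventory[level][item] = min(1.0, inventory[level][item] + delta)
            for rival in competitors.get(level, []):
                if rival != item:
                    inventory[level][rival] = max(0.0, inventory[level][rival] - delta)
        else:
            inventory[level][item] = max(0.0, inventory[level][item] - delta)
    return inventory

# Two levels: a tense affix and the phrasal construction it occurs in.
inventory = {"affix": {"-ed": 0.5, "-t": 0.5}, "phrase": {"vp-cxn": 0.5}}
used = {"affix": "-ed", "phrase": "vp-cxn"}
update_inventory(inventory, used, {"affix": ["-t"]}, success=True)
```

Because the update touches both levels, a successful phrase also strengthens the affixes it contains, which is what lets the population converge on coherent conventions at the element level and the phrase level simultaneously.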
2006
Symbol Grounding and Beyond: Proceedings of the Third International Workshop on the Emergence and Evolution of Linguistic Communication, pages 76-88, 2006
According to the functional approach to language evolution (inspired by cognitive linguistics and construction grammar), grammar arises to deal with issues in communication among autonomous agents, particularly maximisation of communicative success and expressive power and minimisation of cognitive effort. Experiments in the emergence of grammar should hence start from a simulation of communicative exchanges between embodied agents, and then show how a particular issue that arises can be solved or partially solved by introducing more grammar. This paper shows a case study of this approach, focusing on the issue of search during parsing. Multiple hypotheses arise in parsing when the same syntactic pattern can be used for multiple purposes or when one syntactic pattern partly overlaps with another one. It is well known that syntactic ambiguity rapidly leads to combinatorial explosions and hence an increase in memory use and processing power, possibly to a point where the sentence can no longer be handled. Additional grammar, such as syntactic or semantic subcategorisation or word order and agreement constraints, can help to dampen search because it provides the hearer with information about which hypotheses are most likely. The paper shows an operational experiment where avoiding search is used as the driver for the introduction and negotiation of syntax. The experiment is also a demonstration of how Fluid Construction Grammar is well suited for experiments in language evolution.
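The combinatorial explosion during parsing, and how extra grammatical constraints prune it, can be made concrete with a toy enumeration. This is an invented illustration (lexicon and constraint are made up), not the FCG machinery:

```python
from itertools import product

def parse_hypotheses(words, lexicon, constraints=()):
    """Enumerate parse hypotheses: each word may fill several roles, so
    hypotheses multiply combinatorially; each constraint function then
    prunes the hypotheses it rules out."""
    role_options = [lexicon[w] for w in words]
    hypotheses = [dict(zip(words, roles)) for roles in product(*role_options)]
    for constraint in constraints:
        hypotheses = [h for h in hypotheses if constraint(h)]
    return hypotheses

# Classic ambiguity: "her duck" (possessive + noun vs. pronoun + verb).
lexicon = {"her": ["det", "pronoun"], "duck": ["verb", "noun"]}

# Without grammar, every role combination is a live hypothesis.
unconstrained = parse_hypotheses(["her", "duck"], lexicon)

# A word-order/agreement-style constraint: a determiner must introduce a noun.
det_needs_noun = lambda h: h["her"] != "det" or h["duck"] == "noun"
pruned = parse_hypotheses(["her", "duck"], lexicon, [det_needs_noun])
```

The unconstrained search has 4 hypotheses and the constraint eliminates one; with longer sentences the unconstrained count grows exponentially in sentence length, which is the pressure the paper proposes as a driver for introducing syntax.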