Edward P. Stabler
2009
Computational Models of Language Universals: Expressiveness, Learnability, and Consequences
Language Universals, pages 200-224, 2009
This chapter reports on research showing that it may be a universal structural property of human languages that they fall into a class of languages defined by mildly context-sensitive grammars. It also investigates whether there are properties of language that are needed to guarantee that it is learnable. It suggests that a class of languages is learnable if it has finite Vapnik-Chervonenkis (VC) dimension (the VC dimension provides a combinatorial measure of complexity for a set of languages). Informally, a finite VC dimension requires that there be restrictions on the set of languages to be learned such that they do not differ from one another in arbitrary ways. These restrictions can be construed as universals that are required for language to be learnable (given formal language learnability theory). The chapter concludes by pointing out that formalizations of the semantic contribution (e.g., compositionality) to language learning might yield further insight into language universals.
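To make the VC notion concrete, here is a minimal Python sketch of the definition the abstract appeals to; the toy language class below is invented for illustration and is not from the chapter. A sample of strings is shattered by a class when every subset of the sample can be carved out by intersection with some language in the class, and the VC dimension is the size of the largest shattered sample.

```python
from itertools import chain, combinations

def shatters(languages, sample):
    """True if every subset of `sample` equals L & sample for some L."""
    sample = frozenset(sample)
    realized = {frozenset(L & sample) for L in languages}
    subsets = chain.from_iterable(
        combinations(sample, r) for r in range(len(sample) + 1))
    return all(frozenset(t) in realized for t in subsets)

def vc_dimension(languages, universe):
    """Largest d such that some d-element subset of `universe` is
    shattered; brute force, adequate for small finite classes."""
    best = 0
    for d in range(1, len(universe) + 1):
        if any(shatters(languages, s) for s in combinations(universe, d)):
            best = d
        else:
            break  # shattering is downward-monotone, so we can stop
    return best

# Toy class of languages over three strings.
universe = ['a', 'b', 'c']
languages = [set(), {'a'}, {'b'}, {'a', 'b'}, {'a', 'b', 'c'}]
print(vc_dimension(languages, universe))  # 2: {'a','b'} is shattered
```

The class above shatters {'a', 'b'} but no three-string sample, so its VC dimension is 2; the learnability results the abstract mentions concern exactly this kind of finiteness, scaled up to infinite classes.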
2008
Cohesion of languages in grammar networks
Cooperative Control of Distributed Multi-Agent Systems, 2008
2005
ECAL05, pages 624-633, 2005
The complexity, variation, and change of languages make evident the importance of representation and learning in the acquisition and evolution of language. For example, analytic studies of simple language in unstructured populations have shown complex dynamics, depending on the fidelity of language transmission. In this study we extend these analyses of evolutionary dynamics to include grammars inspired by the principles and parameters paradigm. In particular, the space of languages is structured so that some pairs of languages are more similar than others, and mutations tend to change languages to nearby variants. We found that coherence emerges with lower learning fidelity than predicted by earlier work with an unstructured language space.
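The dynamics described follow the replicator-mutator tradition for language evolution. The sketch below is an assumed reconstruction, not the paper's model: languages are binary parameter vectors in the principles-and-parameters spirit, mutual intelligibility decays with Hamming distance, learning errors land on neighbouring variants, and coherence is read off as the dominant language's share. All constants are illustrative.

```python
import numpy as np

n_params = 4
n_lang = 2 ** n_params
langs = [tuple((i >> k) & 1 for k in range(n_params)) for i in range(n_lang)]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# Similarity matrix A: intelligibility decays with parameter distance.
A = np.array([[0.5 ** hamming(a, b) for b in langs] for a in langs])

def q_matrix(fidelity):
    """Learning matrix Q: a learner acquires the teacher's language with
    probability `fidelity`; errors fall on Hamming-neighbour variants."""
    Q = np.zeros((n_lang, n_lang))
    for i, a in enumerate(langs):
        nbrs = [j for j, b in enumerate(langs) if hamming(a, b) == 1]
        Q[i, i] = fidelity
        for j in nbrs:
            Q[i, j] = (1.0 - fidelity) / len(nbrs)
    return Q

def run(fidelity, steps=2000, dt=0.1, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.dirichlet(np.ones(n_lang))
    Q = q_matrix(fidelity)
    for _ in range(steps):
        f = A @ x                          # fitness of each language
        phi = f @ x                        # mean fitness
        x = x + dt * ((x * f) @ Q - phi * x)
        x = np.clip(x, 0, None)
        x /= x.sum()
    return x.max()                         # coherence: dominant share

for q in (0.7, 0.9, 0.99):
    print(q, round(run(q), 3))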
The role of population structure in language evolution
Proceedings of the 10th International Symposium on Artificial Life and Robotics, 2005
The question of language evolution is of interest to linguistics, biology, and, more recently, the engineering of communicating networks. Previous work on these problems has focused mostly on a fully-connected population. We are extending this study to structured populations, which are generally more realistic and offer rich opportunities for linguistic diversification. Our work focuses on the convergence properties of a spatially structured population of learners acquiring a language from one another. We investigate several metrics, including mean language coherence and the critical learning fidelity threshold.
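As a hedged illustration of this kind of experiment (the ring topology, update rule, and constants below are assumptions, not the paper's setup), the following Python sketch places learners on a ring, lets each relearn from a random neighbour with a given fidelity, and reports the dominant language's population share as a coherence proxy. Sweeping `fidelity` exposes a threshold below which no language comes to dominate.

```python
import random

def simulate(n_agents=200, n_lang=10, fidelity=0.9, steps=20000, seed=0):
    """Spatially structured learning on a ring (illustrative model)."""
    rng = random.Random(seed)
    pop = [rng.randrange(n_lang) for _ in range(n_agents)]
    for _ in range(steps):
        i = rng.randrange(n_agents)
        teacher = pop[(i + rng.choice((-1, 1))) % n_agents]  # ring neighbour
        # Faithful acquisition with prob. `fidelity`, else a random variant.
        pop[i] = teacher if rng.random() < fidelity else rng.randrange(n_lang)
    counts = [pop.count(lang) for lang in range(n_lang)]
    return max(counts) / n_agents  # dominant language's share

for q in (0.6, 0.8, 0.95):
    print(q, simulate(fidelity=q))
```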
2003
Grounding as Learning
Proceedings of Language Evolution and Computation Workshop/Course at ESSLLI, pages 87-94, 2003
Communication among agents requires (among many other things) that each agent be able to identify the semantic values of the generators of the language. This is the ``grounding'' problem: how do agents with different cognitive and perceptual experiences successfully ...
ECAL03, pages 525-534, 2003
This paper describes a framework for studies of the adaptive acquisition and evolution of language, with the following components: language learning begins by associating words with cognitively salient representations (``grounding''); the sentences of each language are determined by properties of lexical items, and so only these need to be transmitted by learning; the learnable languages allow multiple agreements, multiple crossing agreements, and reduplication, as mildly context-sensitive and human languages do; infinitely many different languages are learnable; many of the learnable languages include infinitely many sentences; in each language, inferential processes can be defined over succinct representations of the derivations themselves; the languages can be extended by innovative responses to communicative demands. Preliminary analytic results and a robotic implementation are described.
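The first component, associating words with salient representations (``grounding''), can be illustrated with a simple cross-situational learner; this is an assumed toy mechanism for exposition, not the paper's algorithm: each exposure pairs a word with the set of candidate referents present in the scene, and the learner retains only the intersection across exposures.

```python
def learn(exposures):
    """Intersect candidate referent sets across exposures, per word."""
    hyp = {}
    for word, referents in exposures:
        referents = set(referents)
        hyp[word] = hyp[word] & referents if word in hyp else referents
    return hyp

# Ambiguity resolves as scenes vary: "ball" narrows to a unique referent,
# while "dog" stays ambiguous until a disambiguating exposure arrives.
exposures = [
    ("ball", {"ball", "dog"}),
    ("ball", {"ball", "cup"}),
    ("dog",  {"dog", "cup"}),
]
print(learn(exposures))  # {'ball': {'ball'}, 'dog': {'dog', 'cup'}}
```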
1999
ECAL99, pages 709-719, 1999
What permits some systems to evolve and adapt more effectively than others? Gell-Mann [3] has stressed the importance of ``compression'' for adaptive complex systems. Information about the environment is not simply recorded as a look-up table, but is rather compressed in a theory or schema. Several conjectures are proposed: (I) compression aids in generalization; (II) compression occurs more easily in a ``smooth'', as opposed to a ``rugged'', string space; and (III) constraints from compression make it likely that natural languages evolve towards smooth string spaces. We have been examining the role of such compression for learning and evolution of formal languages by artificial agents. Our system does seem to conform generally to these expectations, but the trade-off between compression and the errors that sometimes accompany it needs careful consideration.
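Conjecture (II) can be seen in miniature with off-the-shelf compression; here zlib merely stands in for the agents' compression scheme, and the strings and constants are invented for illustration. Strings drawn from a ``smooth'', regularly patterned space compress to a small fraction of their length, while strings from a ``rugged'', arbitrary space barely compress at all.

```python
import random
import string
import zlib

random.seed(1)

# Smooth: repetitions of a simple pattern; rugged: arbitrary letters.
smooth = "".join("ab" * random.randint(1, 10) + "." for _ in range(50))
rugged = "".join(random.choice(string.ascii_lowercase)
                 for _ in range(len(smooth)))

def ratio(s):
    """Compressed size over raw size; lower means more compressible."""
    return len(zlib.compress(s.encode())) / len(s)

print("smooth:", round(ratio(smooth), 2))  # well below 1
print("rugged:", round(ratio(rugged), 2))  # close to 1
```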