Tracy K. Teal
2000
Artificial Life 6(2):129-143, 2000
For many adaptive complex systems, information about the environment is not simply recorded in a look-up table, but is rather encoded in a theory, schema, or model, which compresses the information. The grammar of a language can be viewed as such a schema or theory. In a prior study [Teal et al., 1999] we proposed several conjectures about the learning and evolution of language that should follow from these observations: (C1) compression aids in generalization; (C2) compression occurs more easily in a smooth, as opposed to a rugged, problem space; and (C3) constraints from compression make it likely that natural languages evolve toward smooth string spaces. This previous work found general, if not complete, support for these three conjectures. Here we build on that study to clarify the relationship between Minimum Description Length (MDL) and error in our model and examine the evolution of certain languages in more detail. Our results suggest a fourth conjecture: that all else being equal, (C4) more complex languages change more rapidly during evolution.
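The contrast between a look-up table and a compressed grammar can be made concrete with a two-part MDL-style cost comparison. The following sketch is illustrative only (it is not taken from the paper's model): both encoding schemes and the bit-cost formulas are simplified assumptions, chosen to show why a small rule describing a regular string set is cheaper than listing every string verbatim.

```python
# Hypothetical two-part MDL comparison: describe a set of strings either
# as a raw look-up table or as a compact grammar plus per-string indices.
import math

def lookup_table_bits(strings, alphabet_size=2):
    """Bits to list every string verbatim: log2(alphabet) bits per symbol,
    plus a crude length prefix per string."""
    bits = 0.0
    for s in strings:
        bits += len(s) * math.log2(alphabet_size)
        bits += math.log2(len(s) + 1)  # simplistic length field
    return bits

def grammar_bits(rule_len, num_strings, alphabet_size=2):
    """Bits for a grammar encoding: pay for the rule once (model cost),
    then log2(num_strings) bits to select each generated string (data cost)."""
    model_cost = rule_len * math.log2(alphabet_size)
    data_cost = num_strings * math.log2(num_strings)
    return model_cost + data_cost

# A highly regular set, {"ab", "abab", ..., ("ab" * 8)}, compresses well:
regular_set = ["ab" * k for k in range(1, 9)]
table_cost = lookup_table_bits(regular_set)
rule_cost = grammar_bits(rule_len=4, num_strings=len(regular_set))
print(table_cost > rule_cost)  # → True: the grammar description is shorter
```

Under these toy costs the table needs roughly 97 bits while the grammar needs about 28, which is the sense in which a schema "compresses" environmental regularity; for irregular string sets the model cost of the grammar would dominate and the advantage disappears.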
1999
ECAL99, pages 709-719, 1999
What permits some systems to evolve and adapt more effectively than others? Gell-Mann [3] has stressed the importance of ``compression'' for adaptive complex systems. Information about the environment is not simply recorded as a look-up table, but is rather compressed in a theory or schema. Several conjectures are proposed: (I) compression aids in generalization; (II) compression occurs more easily in a ``smooth'', as opposed to a ``rugged'', string space; and (III) constraints from compression make it likely that natural languages evolve towards smooth string spaces. We have been examining the role of such compression for learning and evolution of formal languages by artificial agents. Our system does seem to conform generally to these expectations, but the trade-off between compression and the errors that sometimes accompany it needs careful consideration.