Self-sustaining iterated learning

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

An important result from psycholinguistics (Griffiths & Kalish, 2005) states that no language can be learned iteratively by rational agents in a self-sustaining manner. We show how to modify the learning process slightly in order to achieve self-sustainability. Our work is in two parts. First, we characterize iterated learnability in geometric terms and show how a slight, steady increase in the lengths of the training sessions ensures self-sustainability for any discrete language class. In the second part, we tackle the nondiscrete case and investigate self-sustainability for iterated linear regression. We discuss the implications of our findings for issues of non-equilibrium dynamics in natural algorithms.
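The iterated-learning setting the abstract refers to can be illustrated with a minimal simulation. The sketch below is not the authors' construction; it is a standard toy model of iterated Bayesian linear regression, assuming a one-parameter linear language y = w·x, Gaussian observation noise, and a conjugate Gaussian prior centered at 0. Each generation's learner sees a short training session of noisy samples from the previous agent and passes on its posterior-mean estimate; the session length `n` plays the role of the training-session length whose growth the paper studies.

```python
import numpy as np

rng = np.random.default_rng(0)

def iterated_linear_regression(w0, generations=20, n=5, noise=0.5, prior_var=1.0):
    """Simulate a chain of Bayesian learners of the line y = w * x.

    Each generation observes n noisy samples of the previous agent's
    function and transmits the posterior mean of w under a N(0, prior_var)
    prior with known noise variance noise**2 (conjugate update).
    """
    w = w0
    history = [w]
    for _ in range(generations):
        x = rng.uniform(-1.0, 1.0, n)              # training inputs for this session
        y = w * x + rng.normal(0.0, noise, n)      # noisy utterances from the teacher
        # Conjugate Gaussian posterior for w: precision-weighted combination
        # of the prior (mean 0) and the least-squares evidence.
        precision = 1.0 / prior_var + (x @ x) / noise**2
        w = (x @ y / noise**2) / precision         # posterior mean becomes next teacher
        history.append(w)
    return history

hist = iterated_linear_regression(2.0)
```

In this toy model, short sessions let the prior pull successive estimates toward its mean, so the transmitted parameter drifts away from its starting value; increasing `n` over the generations slows that drift, which is the intuition behind the paper's lengthening training sessions.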

Original language: English (US)
Title of host publication: 8th Innovations in Theoretical Computer Science Conference, ITCS 2017
Editors: Christos H. Papadimitriou
Publisher: Schloss Dagstuhl – Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing
ISBN (Electronic): 9783959770293
DOIs
State: Published - Nov 1 2017
Event: 8th Innovations in Theoretical Computer Science Conference, ITCS 2017 - Berkeley, United States
Duration: Jan 9 2017 – Jan 11 2017

Publication series

Name: Leibniz International Proceedings in Informatics, LIPIcs
Volume: 67
ISSN (Print): 1868-8969

Other

Other: 8th Innovations in Theoretical Computer Science Conference, ITCS 2017
Country: United States
City: Berkeley
Period: 1/9/17 – 1/11/17

All Science Journal Classification (ASJC) codes

  • Software

Keywords

  • Iterated Bayesian linear regression
  • Iterated learning
  • Language evolution
  • Non-equilibrium dynamics

