## Abstract

Previous work on iterated learning, a standard language learning paradigm in which each learner in a sequence learns a language from the previous learner, has found that if learners use a form of Bayesian inference, then the distribution of languages in a population will come to reflect the prior distribution assumed by the learners (Griffiths and Kalish 2007). We extend these results to allow for more complex population structures, and demonstrate that for learners on undirected graphs the distribution of languages will also reflect the prior distribution. We then use techniques borrowed from statistical physics to obtain deeper insight into language evolution, finding that although population structure does not influence the probability that an individual speaks a given language, it does influence how likely neighbors are to speak the same language. These analyses lift a restrictive assumption of iterated learning, and suggest that experimental and mathematical findings using iterated learning may apply to a wider range of settings.
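The convergence-to-the-prior result can be illustrated with a minimal simulation: a chain of Bayesian learners, each sampling a language from the posterior given one utterance produced by the previous learner. This is a sketch of the general setup, not the paper's model; the two-language hypothesis space, the prior of 0.7, and the production noise rate are illustrative assumptions.

```python
import random

def iterated_learning(prior0=0.7, noise=0.2, steps=200_000, seed=0):
    """Simulate a chain of Bayesian learners who sample from the posterior.

    Two languages, 0 and 1. A speaker of language L produces an utterance
    matching L with probability 1 - noise. Each learner hears one utterance
    from the previous speaker and samples a language from the posterior.
    Per Griffiths and Kalish (2007), the stationary distribution of this
    chain is the prior, so the long-run fraction of language-0 speakers
    approaches prior0 regardless of the starting language.
    (prior0 and noise are illustrative values, not taken from the paper.)
    """
    rng = random.Random(seed)
    prior = [prior0, 1.0 - prior0]

    def likelihood(u, lang):
        # P(utterance u | language lang): matches with prob 1 - noise
        return 1.0 - noise if u == lang else noise

    lang = 1  # deliberately start in the lower-prior language
    count0 = 0
    for _ in range(steps):
        # Production: the current speaker emits one noisy utterance.
        u = lang if rng.random() < 1.0 - noise else 1 - lang
        # Learning: the next learner samples from the posterior given u.
        post0 = likelihood(u, 0) * prior[0]
        post1 = likelihood(u, 1) * prior[1]
        lang = 0 if rng.random() < post0 / (post0 + post1) else 1
        count0 += (lang == 0)
    return count0 / steps

frac0 = iterated_learning()
print(f"long-run fraction speaking language 0: {frac0:.3f} (prior: 0.7)")
```

The chain visits language 0 about 70% of the time, matching the prior rather than the initial condition; the paper's contribution is showing that this marginal behavior persists when the single chain is replaced by learners arranged on an undirected graph.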

| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1-6 |
| Number of pages | 6 |
| Journal | Journal of Mathematical Psychology |
| Volume | 76 |
| DOIs | |
| State | Published - Feb 1 2017 |
| Externally published | Yes |

## All Science Journal Classification (ASJC) codes

- General Psychology
- Applied Mathematics

## Keywords

- Bayesian models
- Cultural evolution
- Iterated learning
- Language evolution
- Voter models