TY - JOUR
T1 - Iterated learning in dynamic social networks
AU - Chazelle, Bernard
AU - Wang, Chu
N1 - Funding Information:
We wish to thank the anonymous referees for their useful comments and suggestions. Some results of this work were previously published by the same authors at the Innovations in Theoretical Computer Science conference (ITCS 2017) and the American Control Conference (ACC 2017). Chu Wang did this work prior to joining Amazon. The research of Bernard Chazelle was sponsored by the Army Research Office and the Defense Advanced Research Projects Agency and was accomplished under Grant Number W911NF-17-1-0078. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the Army Research Office, the Defense Advanced Research Projects Agency, or the U.S. Government. The U.S. Government is authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation herein.
Publisher Copyright:
© 2019 Bernard Chazelle and Chu Wang.
PY - 2019/1/1
Y1 - 2019/1/1
N2 - A classic finding by Kalish et al. (2007) shows that no language can be learned iteratively by rational agents in a self-sustained manner. In other words, if A teaches a foreign language to B, who then teaches what she learned to C, and so on, the language will quickly get lost and agents will wind up teaching their own common native language. If so, how can linguistic novelty ever be sustained? We address this apparent paradox by considering the case of iterated learning in a social network: we show that by varying the lengths of the learning sessions over time or by keeping the networks dynamic, it is possible for iterated learning to endure forever with arbitrarily small loss.
AB - A classic finding by Kalish et al. (2007) shows that no language can be learned iteratively by rational agents in a self-sustained manner. In other words, if A teaches a foreign language to B, who then teaches what she learned to C, and so on, the language will quickly get lost and agents will wind up teaching their own common native language. If so, how can linguistic novelty ever be sustained? We address this apparent paradox by considering the case of iterated learning in a social network: we show that by varying the lengths of the learning sessions over time or by keeping the networks dynamic, it is possible for iterated learning to endure forever with arbitrarily small loss.
UR - http://www.scopus.com/inward/record.url?scp=85072648430&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85072648430&partnerID=8YFLogxK
M3 - Article
AN - SCOPUS:85072648430
SN - 1532-4435
VL - 20
JO - Journal of Machine Learning Research
JF - Journal of Machine Learning Research
ER -