Transformer Working Memory Enables Regular Language Reasoning And Natural Language Length Extrapolation

Ta-Chung Chi, Ting-Han Fan, Alexander I. Rudnicky, Peter J. Ramadge

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citations
