The automaticity of visual statistical learning

Nicholas B. Turk-Browne, Justin A. Jungé, Brian J. Scholl

Research output: Contribution to journal › Article › peer-review

578 Scopus citations

Abstract

The visual environment contains massive amounts of information involving the relations between objects in space and time, and recent studies of visual statistical learning (VSL) have suggested that this information can be automatically extracted by the visual system. The experiments reported in this article explore the automaticity of VSL in several ways, using both explicit familiarity and implicit response-time measures. The results demonstrate that (a) the input to VSL is gated by selective attention, (b) VSL is nevertheless an implicit process because it operates during a cover task and without awareness of the underlying statistical patterns, and (c) VSL constructs abstracted representations that are then invariant to changes in extraneous surface features. These results fuel the conclusion that VSL both is and is not automatic: It requires attention to select the relevant population of stimuli, but the resulting learning then occurs without intent or awareness.

Original language: English (US)
Pages (from-to): 552-564
Number of pages: 13
Journal: Journal of Experimental Psychology: General
Volume: 134
Issue number: 4
DOI: 10.1037/0096-3445.134.4.552
State: Published - Nov 2005

All Science Journal Classification (ASJC) codes

  • Experimental and Cognitive Psychology
  • Developmental Neuroscience
  • General Psychology

Keywords

  • Implicit learning
  • Nonadjacent dependencies
  • Selective attention
  • Specificity
  • Statistical learning
