Incidental biasing of attention from visual long-term memory

Judith E. Fan, Nicholas B. Turk-Browne

Research output: Contribution to journal › Article › peer-review


Abstract

Holding recently experienced information in mind can help us achieve our current goals. However, such immediate and direct forms of guidance from working memory are less helpful over extended delays or when other related information in long-term memory is useful for reaching these goals. Here we show that information that was encoded in the past but is no longer present or relevant to the task also guides attention. We examined this by associating multiple unique features with novel shapes in visual long-term memory (VLTM), and subsequently testing how memories for these objects biased the deployment of attention. In Experiment 1, VLTM for associated features guided visual search for the shapes, even when these features had never been task-relevant. In Experiment 2, associated features captured attention when presented in isolation during a secondary task that was completely unrelated to the shapes. These findings suggest that long-term memory enables a durable and automatic type of memory-based attentional control.

Original language: English (US)
Pages (from-to): 970-977
Number of pages: 8
Journal: Journal of Experimental Psychology: Learning, Memory, and Cognition
Volume: 42
Issue number: 6
State: Published - Jun 1 2016

All Science Journal Classification (ASJC) codes

  • Language and Linguistics
  • Experimental and Cognitive Psychology
  • Linguistics and Language

Keywords

  • Attentional capture
  • Episodic memory
  • Features and objects
  • Memory-guided attention
  • Working memory
