The contents of predictions in sentence comprehension: Activation of the shape of objects before they are referred to

Joost Rommers (Corresponding Author), Antje S. Meyer, Peter Praamstra, Falk Huettig

Research output: Contribution to journal › Article



When comprehending concrete words, listeners and readers can activate specific visual information such as the shape of the words' referents. In two experiments we examined whether such information can be activated in an anticipatory fashion. In Experiment 1, listeners' eye movements were tracked while they were listening to sentences that were predictive of a specific critical word (e.g., “moon” in “In 1969 Neil Armstrong was the first man to set foot on the moon”). Five hundred milliseconds before the acoustic onset of the critical word, participants were shown four-object displays featuring three unrelated distractor objects and a critical object, which was either the target object (e.g., moon), an object with a similar shape (e.g., tomato), or an unrelated control object (e.g., rice). In a time window before shape information from the spoken target word could be retrieved, participants already tended to fixate both the targets and the shape competitors more often than the control objects, indicating that they had anticipatorily activated the shape of the upcoming word's referent. This was confirmed in Experiment 2, an ERP experiment without picture displays. Participants listened to the same lead-in sentences as in Experiment 1. The sentence-final words corresponded to the predictable target, the shape competitor, or the unrelated control object (yielding, for instance, “In 1969 Neil Armstrong was the first man to set foot on the moon/tomato/rice”). N400 amplitude in response to the final words was significantly attenuated in the shape-related condition compared to the unrelated condition. Taken together, these results suggest that listeners can activate perceptual attributes of objects before they are referred to in an utterance.
  • Original language: English
  • Pages (from-to): 437-447
  • Number of pages: 11
  • Issue number: 3
  • Early online date: 10 Dec 2012
  • Publication status: Published - Feb 2013



  • Predictive sentence processing
  • Visual representations
  • Eye-tracking
  • Event-related potentials (ERPs)
  • N400
