Posted on Mar 24, 2015

We have a new paper in press:

Mack, M.L., & Palmeri, T.J. (in press). The dynamics of categorization: Unraveling rapid categorization. Journal of Experimental Psychology: General. [PDF]

We explore a puzzle of visual object categorization: under normal viewing conditions, you spot something fastest as a dog, but at a glance, you spot it faster as an animal. During speeded category verification, a classic basic-level advantage is commonly observed (Rosch, Mervis, Gray, Johnson, & Boyes-Braem, 1976), with categorization as a dog (basic) faster than as an animal (superordinate) or a Golden Retriever (subordinate). A different story emerges during ultra-rapid categorization with limited exposure duration (<30 ms), with superordinate categorization faster than basic or subordinate categorization (Thorpe, Fize, & Marlot, 1996). These two widely cited findings paint contrary theoretical pictures of the time course of object categorization, yet no previous study has investigated them together.

Across five experiments, we systematically examined two experimental factors that could explain the qualitative difference in categorization between the two paradigms: exposure duration and category trial context. Mapping out the time course of object categorization by manipulating exposure duration and the timing of a post-stimulus mask revealed that brief exposure durations favor superordinate-level categorization, but that with more time a basic-level advantage emerges. This superordinate advantage was also significantly modulated by target category trial context: with randomized target categories, the superordinate advantage was eliminated, and with "blocks" of only four repetitions of superordinate categorization within an otherwise randomized context, the basic-level advantage was eliminated.

Contrary to some theoretical accounts that dictate a fixed priority for certain levels of abstraction in visual processing and access to semantic knowledge, the dynamics of object categorization are flexible, depending jointly on the level of abstraction, the time available for perceptual encoding, and the category context.
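For concreteness, here is a minimal, purely illustrative sketch (not code or materials from the paper) of how one might generate randomized versus blocked target-category trial sequences like the trial-context manipulation described above; the category labels, proportions, and the build_sequence helper are hypothetical.

```python
import random

# Hypothetical category labels at each level of abstraction (illustrative only).
LEVELS = {
    "superordinate": ["animal", "vehicle"],
    "basic": ["dog", "car"],
    "subordinate": ["Golden Retriever", "sports car"],
}

def build_sequence(n_trials=120, block_superordinate=False, block_length=4, seed=0):
    """Return a list of (level, target) category-verification trials.

    With block_superordinate=False, the target level is fully randomized
    across trials. With block_superordinate=True, short runs of
    `block_length` consecutive superordinate trials are embedded in an
    otherwise randomized sequence, loosely mirroring the "blocks of four
    repetitions" manipulation described above.
    """
    rng = random.Random(seed)
    trials = []
    while len(trials) < n_trials:
        if block_superordinate and rng.random() < 0.25:
            # Insert a short block of superordinate-level trials.
            for _ in range(block_length):
                trials.append(("superordinate", rng.choice(LEVELS["superordinate"])))
        else:
            # Otherwise pick a level (and a target at that level) at random.
            level = rng.choice(list(LEVELS))
            trials.append((level, rng.choice(LEVELS[level])))
    return trials[:n_trials]

# Example: compare the two trial-context conditions.
randomized = build_sequence(block_superordinate=False)
blocked = build_sequence(block_superordinate=True)
print(randomized[:6])
print(blocked[:6])
```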