Research Article: Dissociating Attention Effects from Categorical Perception with ERP Functional Microstates

Date Published: September 22, 2016

Publisher: Public Library of Science

Author(s): Benjamin Dering, David I. Donaldson, Hisao Nishijo.


When faces appear in our visual environment we naturally attend to them, possibly to the detriment of other visual information. Evidence from behavioural studies suggests that faces capture attention because they are more salient than other types of visual stimuli, reflecting a category-dependent modulation of attention. By contrast, neuroimaging data have led to a domain-specific account of face perception that rules out the direct contribution of attention, suggesting a dedicated neural network for face perception. Here we sought to dissociate effects of attention from categorical perception using Event Related Potentials (ERPs). Participants viewed physically matched face and butterfly images, with each category acting as a target stimulus during different blocks in an oddball paradigm. Using a data-driven approach based on functional microstates, we show that the locus of endogenous attention effects in ERPs occurs in the N1 time range. Earlier categorical effects were also found around the level of the P1, reflecting either an exogenous increase in attention towards face stimuli or a putative face-selective measure. Category and attention effects were dissociable from one another, hinting that faces may capture attention early, before top-down control of attention is observed. Our data support the conclusion that certain object categories, in this experiment faces, may capture attention before top-down voluntary control of attention is initiated.

Partial Text

We are able to recognise objects with only a momentary glance around our visual environment, and some of these objects will capture our attention more than others. This capture of attention is guided both by bottom-up structural analysis of images and by top-down control of attention, such that, in a particular context, one stimulus can become most salient: for example, noticing a fire alarm in a corridor only in the event of your office burning down. Because of their social and biological importance in comparison to most other stimuli, faces are a prominent example of a visual stimulus that automatically captures attention, often to the detriment of other stimuli in the environment [1]. Faces have even been found to capture attention in visual search paradigms when they are not the explicit target [2]. These behavioural findings contrast with some evidence from neuroimaging suggesting that effects of attention do not modulate early domain-specific processes in the perception of faces [3, 4]. Here, we use Event Related Potentials (ERPs) to identify the locus of attention within face perception, asking whether face-specific processes operate entirely independently of attention.

Our behavioural measures showed the expected pattern of accuracy and reaction time (RT) for a rapid presentation task involving a rare target response switch. We subjected accuracy and RT data from deviant conditions to a repeated measures ANOVA with factors of Deviant (Target, Non-Target), Category (Butterflies, Faces), and Colour (Blue, Green). Deviant accuracy differed such that non-target conditions were correctly identified more often than target conditions [F(1,18) = 33.783, p < .05, ηp² = 0.652]; on average, non-target accuracy = 93.71 ± 4.6%, target accuracy = 68.3 ± 21.1%. There were no differences in response accuracy for category or colour, and no significant interaction effects.

We aimed to test whether ERP correlates of face perception were impervious to effects of attention, which would support a domain-specific view of face perception, or whether attention was a driving factor in determining face-sensitive ERPs. First, we replicated previous findings of P1 face-sensitivity in ERPs, supported by topographic differences between faces and butterflies (see also [16, 34]), yet found no topographic effects of task-driven attention at the P1. Since the P1 findings match the proposed outcomes given in Fig 2A, these results could support a domain-specific account of P1 face-sensitivity, which will be examined further below. Critically, the N170 showed no signs of category-sensitivity: no topographic differences encompassing the N170 peak maximum, and no topographic distribution implying a category effect for faces. The pattern of results at the N170 follows the predicted outcome given in Fig 2C, namely that task-driven attention affects both object categories equally. In line with previous studies, attention significantly modulated the N170 component, supported by a distinct topography focused around the N170 that was present for target deviant conditions only [25, 55].
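As a quick consistency check on the behavioural statistics reported above, partial eta squared can be recovered from an F statistic and its degrees of freedom via ηp² = (F × df_effect) / (F × df_effect + df_error). The sketch below is illustrative only and is not part of the original analysis pipeline; the function name is our own.

```python
def partial_eta_squared(f_value: float, df_effect: int, df_error: int) -> float:
    """Recover partial eta squared from an F statistic and its degrees of freedom."""
    return (f_value * df_effect) / (f_value * df_effect + df_error)

# Reported main effect of Deviant: F(1,18) = 33.783
eta = partial_eta_squared(33.783, 1, 18)
print(round(eta, 3))  # 0.652, matching the reported effect size
```

Applying the formula to the reported F(1,18) = 33.783 reproduces the stated ηp² of 0.652, confirming the internal consistency of the reported values.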
We have previously challenged the notion that the N170 is a face-selective measure and highlighted methodological reasons for this view [16, 34, 45, 56], and the present data further suggest that any face-sensitivity observed in the N170 range is not a product of increased salience of faces [57, 58].