ORIGINAL RESEARCH article

Front. Psychol., 08 February 2011
Sec. Cognition
This article is part of the Research Topic Embodied and Grounded Cognition.

Modality Switching in a Property Verification Task: An ERP Study of What Happens When Candles Flicker after High Heels Click

Jennifer Collins1*, Diane Pecher2, René Zeelenberg2 and Seana Coulson1

  • 1 Brain and Cognition Lab, Department of Cognitive Science, University of California San Diego, San Diego, CA, USA
  • 2 Memory Lab, Department of Psychology, Faculty of Social Sciences, Erasmus University Rotterdam, Rotterdam, Netherlands

The perceptual modalities associated with property words, such as flicker or click, have previously been demonstrated to affect subsequent property verification judgments (Pecher et al., 2003). Known as the conceptual modality switch effect, this finding supports the claim that brain systems for perception and action help subserve the representation of concepts. The present study addressed the cognitive and neural substrate of this effect by recording event-related potentials (ERPs) as participants performed a property verification task with visual or auditory properties in key trials. We found that for visual property verifications, modality switching was associated with an increased amplitude N400. For auditory verifications, switching led to a larger late positive complex. Observed ERP effects of modality switching suggest property words access perceptual brain systems. Moreover, the timing and pattern of the effects suggest perceptual systems impact the decision-making stage in the verification of auditory properties, and the semantic stage in the verification of visual properties.

Introduction

Over the past decade, cognitive scientists have gradually moved away from the assumption that concepts are symbolic, that is, arbitrarily related to the things they represent, and amodal, or independent of any sensory modality (see Murphy, 2002 for a review of traditional models), and have increasingly come to embrace an embodied or grounded approach. These more recent accounts have focused on how concepts are grounded in our perception of, and interaction with, the physical and social world, and stressed their modal characteristics (see Barsalou, 2008 for a review). The perceptual symbol system hypothesis, for example, is that conceptual knowledge involves schematized perceptual and motor representations derived from one’s prior experience with the concept’s referent (Barsalou, 1999). On this account, a concept is a sensorimotor simulation involving the partial reactivation of brain regions that participated in the acquisition of that concept. For example, the concept of a dog is a simulation involving brain areas that represent one’s visual, auditory, tactile, olfactory, affective, and motoric experiences with dogs. Importantly, simulations are not holistic records of experience, but can be flexibly adapted to the current context and task (Barsalou et al., 2003).

The use of visual mental images for ostensibly conceptual tasks has been demonstrated with the property verification task, in which participants are asked whether or not a particular property (e.g., has-a-head) is true for a given concept (e.g., CAT). The perceptual symbol system hypothesis suggests that accessing conceptual knowledge involves the activation of associated visual images, and thus predicts a systematic relationship between the difficulty of property verification and that of activating the relevant visual image. Consistent with this prediction, Solomon and Barsalou (2004) found that participants took less time to verify visually large properties of a concept (e.g., that a CAT has a head) than visually smaller properties of the same concept (e.g., that a CAT has a paw). The fact that performance on this conceptual task was modulated in a similar way as performance on a visual imagery task was argued to implicate the importance of visual processes in conceptual representations.

Moreover, a functional magnetic resonance imaging (fMRI) study in which participants performed the property verification task employed by Solomon and Barsalou (2004) revealed activation in the left fusiform gyrus, an area important for object recognition and visual imagery (Kan et al., 2003). The recruitment of perceptual brain areas for the conceptual task of property verification is consistent with the perceptual symbol system hypothesis, and is also in keeping with other fMRI studies in which conceptual tasks have activated brain regions used to perceive the concepts’ referents (Goldberg et al., 2006; Martin, 2007; Simmons et al., 2007).

Modality Switch Effects

Although the bulk of empirical support for the perceptual symbol system hypothesis concerns the involvement of specifically visual representations, the hypothesis is, in fact, broader, extending to the full multimodal character of human experience. The concept of a lemon, for example, should not only represent its color, but also its taste, its smell, and its texture. Moreover, because simulations involve the coordination of information from multiple perceptual modalities, the perceptual symbol system hypothesis predicts that conceptual operations will display many of the same properties as complex perceptual operations, and be subject to similar constraints. Accordingly, Pecher et al. (2003) tested whether a property verification task using properties from several modalities, including vision, audition, and touch, was modulated by factors known to affect perceptual detection tasks with stimuli from multiple modalities.

In particular, Pecher et al. (2003) focused on the modality switch effect, a phenomenon observed in the literature on perceptual processing. In a study designed to assess cross-modal effects of spatial attention, Spence et al. (2001) asked participants to detect brief auditory, visual, or tactile targets at peripheral locations. The modality switch effect is the finding that reaction times were longer for all stimulus types when they were preceded by a stimulus from a different modality than when they were preceded by a stimulus from the same modality; this cost has been interpreted as an exogenously driven attentional cost for the switch trials (Spence et al., 2001; Rodway, 2005).

Pecher et al. (2003) reasoned that if conceptual processing relies on perceptual systems, the well-known cost for successive trials from different modalities in perceptual tasks might also be expected to occur on a property verification task employing properties from multiple modalities. In their conceptual analog to the modality switch studies, Pecher et al. (2003) asked participants to determine whether a property (e.g., yellow or sour) applied to the preceding concept (e.g., LEMON or MOUSE). The manipulation of interest was whether a pair of trials was from the same modality (LEAVES–rustling followed by BLENDER–loud) or different modalities (CRANBERRIES–tart followed by BLENDER–loud). As predicted by the perceptual symbol system hypothesis, Pecher et al. (2003) found longer reaction times for the second trial in a pair of different modality (switch) trials than for the second trial in a pair of same modality (no-switch) trials, the conceptual modality switch effect.

Variations on the conceptual modality switch paradigm have shown that results cannot be attributed to alternative explanations, such as word association (Pecher et al., 2003), or category overlap (Marques, 2006). The generality of the effect is supported by the demonstration of a similar switch effect on a property verification task using perceptual and emotional attributes (Vermeulen et al., 2007). Importantly, property verification has also been shown to be speeded by the presentation of a perceptual stimulus from the same modality relative to one from a different modality (van Dantzig et al., 2008). The finding that the verification of visual features of a concept is faster after the perceptual detection of visual than auditory or tactile stimuli provides strong support for the suggestion that the conceptual task of property verification recruits perceptual processing resources, as opposed to an amodal re-representation of perceptual information.

Another direction this research has taken has been to investigate the neural substrate of modality specific concepts using cognitive neuroscience methods. Goldberg et al. (2006) recorded participants’ brain activity using fMRI while they engaged in a property verification task. The experiment used a design in which different blocks required participants to make decisions about properties referring to different modalities – visual, auditory, tactile, and gustatory. The brain regions uniquely activated for each property category were regions related to the perception of stimuli in the different domains. These results are particularly important given that reaction time results for similar conceptual tasks have not distinguished between responses to properties of different modalities (Pecher et al., 2009).

Neuroimaging data thus provide compelling evidence that conceptual tasks are associated with the activation of perceptual brain regions. At issue, however, is whether perceptual systems play a central or a peripheral role in cognition (Barsalou, 2008). Perceptual activations might, for example, be an artifact of the blocked design used by Goldberg et al. (2006). Alternatively, perceptual activations might reflect top-down processing initiated only after the meaning of the property words has been activated.

The Present Study

The present study addressed the cognitive and neural basis of the conceptual modality switch effect by recording event-related potentials (ERPs) as participants made property verification judgments about the visual and auditory properties of objects. ERPs are patterned voltage changes in the on-going electroencephalogram (EEG) that are time-locked to specific classes of processing events. As a continuous, real-time measure of brain activity, ERPs are well-suited for investigating the neural processes relevant to the conceptual modality switch effect, allowing us to better understand when a perceptual system is accessed by a related concept. In particular, the present study was designed to address whether the modality manipulation affected ERP components associated with the visual processing of property terms, such as the N1 and P2, semantic processing of property terms, such as the N400, or their task-relevant categorization as typical properties of the relevant concept, indexed by the P3 or late positive complex (LPC).

We used stimuli similar to those employed by Pecher et al. (2003), but included only visual (CANDLES–flicker) and auditory (NEWSPAPERS–rustle) trials in our critical conditions. This reduction in variation was important in order to have enough trials in critical conditions for averaging ERPs. Participants’ task was to determine whether or not the property applied to the concept. The correct response on all experimental trials was “true,” and a large number of filler trials requiring a “false” response (e.g., COCKROACHES–ablaze) were included to discourage the development of a particular response bias. A subset of false filler trials included properties and concepts that were lexically associated (e.g., STRAWBERRIES–cream) and were intended to discourage the use of word association strategies (Solomon and Barsalou, 2004). The critical manipulation concerned whether the target concept–property trial (e.g., NEWSPAPERS–rustle) was preceded by a prime concept–property trial from the same modality (e.g., HIGH HEELS–click), or a different modality (e.g., CHERRIES–ruby). Half of the experimental trials involved visual and half auditory properties, and were equally likely to follow a concept–property trial from the same modality (a visual property following a visual property, or an auditory property following an auditory property, viz. no-switch trials) as one from a different modality (visual–auditory or auditory–visual, viz. switch trials).

The primary goal of the study was thus to identify electrophysiological correlates of the conceptual modality switch effect in order to determine which stage or stages of processing the switch manipulation would modulate. If concepts automatically engage early sensory processing, then the mention of a visual property such as “flicker” could modulate the actual perception of visual word forms presented shortly afterward. The converse of this type of effect was found behaviorally by van Dantzig et al. (2008). Low-level perceptual engagement of this sort would be indexed by modulation of visual ERP components to the word form, such as the N1 and P2.

Alternatively, perceptual access might be part of an extended, standard semantic network that subserves the representation of concepts. The N400, a negative-going wave evident between 200 and 700 ms after the visual presentation of a word, was of particular interest due to its association with the processing of meaningful events. The N400 is elicited by words in all modalities, whether written, spoken, or signed (Holcomb and Neville, 1990). Words preceded by semantically related words elicit smaller amplitude N400 than do words preceded by unrelated words, the N400 priming effect (Bentin, 1987; Holcomb, 1988; Smith and Halgren, 1989). The N400 is also sensitive to contextual factors related to meaning at the sentence and text level. In general, N400 amplitude varies inversely with the predictability of the target word: N400s are large for unexpected items, smaller for words of intermediate predictability, and are barely detectable for highly predictable words (Kutas and Hillyard, 1984; see Kutas and Federmeier, 2011 for a review).

Yet another possibility is that the conceptual modality switch effect is attributable to decision processes specifically induced by the property verification task. If this is the case, we would expect the conceptual modality switch paradigm to modulate later, decision-related components such as the P3, or LPC. This family of ERP components is generally thought to index the updating of mental representations modulated by processes such as allocation of attention and task-dependent target classification (Polich, 2007).

A secondary goal of the study was to test whether property terms from different modalities (viz. visual versus auditory) would activate different modality specific brain areas as found in related fMRI studies (e.g., Goldberg et al., 2006). Although the spatial resolution of the EEG is limited, such differences might be detectable as subtle differences in the scalp topography of ERPs to visual versus auditory properties. An interaction between the modality factor in our analysis and electrode site would suggest that non-overlapping neural generators underlie the brain response to auditory and visual properties, viz. that the exact same brain regions do not subserve the processing of visual and auditory properties (Urbach and Kutas, 2002). More generally, differences between the modality switch process in the visual and auditory domains would connect this paradigm with Pecher et al.’s (2003) claim that the conceptual modality switch effect results from switching between different perceptual networks.

As a time-sensitive measure of online cognitive processing, ERPs can provide more information about whether the real-time processing of property terms involves the recruitment of perceptual brain areas during early perceptual processing, during semantic processing, or whether the switch effect would be evident only later, during decision-related stimulus processing. Given Barsalou’s (1999) claim that sensorimotor simulations comprise an intrinsic component of concept meaning, we hypothesized that the facilitative impact of a same modality prime would involve the semantic processing of the target trial, and thus would modulate the amplitude of the N400 component of the ERP. In particular, we predicted that no-switch trials would elicit reduced amplitude N400 relative to switch trials.

Materials and Methods

The protocol for this study was approved by the University of California, San Diego Social and Behavioral Science Institutional Review Board. Informed consent was obtained from all participants prior to their enrollment.

Participants

Twenty undergraduates from the UCSD community (13 women) participated as part of a course requirement. Data from six additional participants were not included in the analysis due to the presence of an excessive number of artifacts (greater than 30% of trials in a critical condition). All participants were between 18 and 40 years of age. As reported in a screening questionnaire, all participants had normal vision, and none had any history of neurological or psychiatric disorders within the previous 10 years.

Materials

Each trial in the study consisted of a concept–property combination such as HIGH HEELS (concept) and click (property). Experimental trials involved 48 visual properties (such as flicker), and 48 auditory properties (such as click). Each property was presented with two different concepts for a total of 192 experimental trials; all properties were repeated once over the course of the experiment, while all concepts were unique. Half (96) of the concept–property combinations served as prime trials (48 involving auditory properties, and 48 involving visual properties), and half (96) served as target trials (48 involving auditory properties, and 48 involving visual ones). Experimental trials were presented in pairs, so that a prime trial was immediately followed by a target trial that was either from the same modality (no-switch condition), or the other modality (switch condition). Materials thus comprised 96 trial pairs in which the modality of the probe property was crossed with the modality switch dimension (24 auditory prime/auditory target, 24 visual prime/auditory target, 24 visual prime/visual target, and 24 auditory prime/visual target pairs). Apart from the modality manipulation, the prime–target pairs were unrelated. All properties in experimental trials were valid for their concept so that the correct response on the property verification task was always “true.”

Materials also included 384 filler trials, 96 of which involved auditory properties that did not pertain to their concept (e.g., LOBSTERS–bark) and 96 of which involved visual properties also eliciting false responses (e.g., LAWNS–scarlet). These two sets were included so that participants could not strategically respond true to any trial involving an auditory or visual property. Another 96 filler trials involved tactile properties, half of which were valid for their concept (e.g., CAVES–damp), and half of which were not (e.g., TOASTERS–damp; one response for each property repetition). The final 96 filler trials were lexical associates (e.g., BUFFALOS–winged), included to discourage participants from shallow processing strategies relying on word association (as in Solomon and Barsalou, 2004). Half of the associated trials were true trials, and half were false trials. Of the 384 filler trials, the correct response on the property verification task was true for 96, and false for 288. When including the 192 experimental trials as well, the correct response on the task was thus true for half of the total trials, and false for the other half. Moreover, even though the experimental trials always involved two true responses in a row (viz. one for the prime, and one for the probe), the inclusion of filler trials guaranteed that a correct true response was equally likely to be followed by a correct false response as by another correct true response.

Two lists were employed so that any given target property occurred once in a switch trial (that is, following a prime from the other modality), and once in a no-switch trial (that is, following a prime from the same modality). Two variants of each were created by swapping the first and second half of each list. In this way, each concept–property combination was presented equally often in the first and second half of the experiment.
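To make the pairing scheme concrete, the sketch below (a hypothetical illustration in Python, not the software used to run the study) crosses target modality with the switch manipulation, 24 pairs per cell; item names and index conventions are placeholders. In the actual experiment this assignment was additionally counterbalanced across the two lists described above.

```python
import random

def build_pairs(visual_items, auditory_items, n_per_cell=24):
    """Cross target modality (visual/auditory) with switch condition (same/different prime modality).

    visual_items and auditory_items each hold 96 concept-property combinations:
    here the first 48 of each list serve as primes and the last 48 as targets.
    """
    vis, aud = list(visual_items), list(auditory_items)
    pairs = []
    for i in range(n_per_cell):
        # visual targets: 24 follow a visual prime (no-switch), 24 follow an auditory prime (switch)
        pairs.append({"prime": vis[i], "target": vis[48 + i], "modality": "visual", "switch": False})
        pairs.append({"prime": aud[i], "target": vis[72 + i], "modality": "visual", "switch": True})
        # auditory targets, analogously
        pairs.append({"prime": aud[24 + i], "target": aud[48 + i], "modality": "auditory", "switch": False})
        pairs.append({"prime": vis[24 + i], "target": aud[72 + i], "modality": "auditory", "switch": True})
    random.shuffle(pairs)  # pair order is randomized; fillers would be interleaved separately
    return pairs

# placeholder stimuli standing in for items such as CANDLES-flicker or HIGH HEELS-click
visual_items = [f"visual_item_{i}" for i in range(96)]
auditory_items = [f"auditory_item_{i}" for i in range(96)]
pairs = build_pairs(visual_items, auditory_items)  # 96 pairs, 24 per design cell
```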

Procedure

Participants were seated in a dim, sound attenuating chamber approximately 50 inches from a 17-inch computer monitor. They read a standard set of instructions telling them to “read the entity (such as objects, people, animals, etc.) and property words, … and respond true if the property was typical or often possible for the entity, and false if the property was highly unusual for the entity.” They read several examples and were presented with practice trials on which they received feedback. Participants were told, “after you read the property, decide as quickly and accurately as possible whether the property is true or false,” but no explicit feedback was given on either of these dimensions during the course of the experiment.

The timing of events in the experimental paradigm is presented in Figure 1. Each trial began with the presentation of a white fixation cross for 250 ms. The inter-stimulus interval (ISI) between the fixation cross and the concept was randomly varied in 50 ms steps between 200 and 400 ms. The concept appeared at the center of the screen in capital letters for 150 ms, followed by a 250 ms ISI and then the property word in lowercase letters for 200 ms. In order to limit the potential for eye-movement artifacts in our EEG signal, we chose to present both concepts and properties centrally and to eliminate the phrase “can be” from the original paradigm, which is not a necessary aspect of the conceptual modality switching procedure (e.g., Pecher et al., 2004). All type was presented in white font on a black background. Participants had 2600 ms to make their decision and prepare for the next trial. Responses were made via a button press in which a right-hand response indicated true and a left-hand response indicated false. Trials were presented in ten blocks, each lasting about 3.5 min, with time in between for participants to rest. The first block began with eight practice trials that were not included in the analysis. All blocks had 60 trials except for the last block, which had 44 trials.
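The event sequence for a single trial can be summarized with the following sketch. It is a plain-Python illustration of the timing parameters stated above (not the presentation software actually used); the function name and event labels are hypothetical.

```python
import random

def trial_schedule(concept, prop):
    """Return the ordered list of (event, duration_ms) pairs for one trial."""
    jittered_isi = random.choice(range(200, 401, 50))  # 200, 250, ..., 400 ms
    return [
        ("fixation_cross", 250),
        ("blank_isi", jittered_isi),
        ("concept_" + concept.upper(), 150),
        ("blank_isi", 250),
        ("property_" + prop.lower(), 200),
        ("response_window", 2600),  # right-hand button = true, left-hand button = false
    ]

print(trial_schedule("high heels", "click"))
```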

Figure 1. Participants saw pairs of words – a concept (in capitals) followed by a property (in lowercase) – after which they would make a true/false judgment during a 2600-ms blank screen. Both examples shown in the figure should elicit true responses because the properties are typical of their respective concepts. The critical manipulation in this experiment concerns the perceptual modalities evoked by successive trials. In this example, the first is a visual decision and the second is an auditory decision; together they make up an item in the “switch” condition.

EEG Recording and Analysis

Participants’ EEG was recorded with tin electrodes mounted in an electrode cap with 29 scalp sites (see Figure 2). Scalp electrodes were referenced online to the left mastoid, and subsequently re-referenced to the average of the left and right mastoid electrodes. Blinks were monitored with an electrode below the right eye. Horizontal eye movements were monitored via a bipolar derivation of electrodes placed over the outer canthi. EEG was recorded and amplified with an SA Instruments isolated bioelectric amplifier with a bandpass of 0.1–100 Hz, digitized online at 250 Hz, and stored on a hard drive for subsequent averaging. The EEG was later inspected offline for blinks and eye movements, which were rejected manually. ERPs were time-locked to the onset of property words on probe trials.
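As a rough numpy sketch of the offline steps just described (re-referencing to the averaged mastoids and epoching time-locked to property-word onset), one might proceed as follows. Array shapes, channel indices, and the 100-ms pre-stimulus baseline correction are assumptions made for illustration rather than details taken from the original pipeline.

```python
import numpy as np

def rereference_to_mastoids(eeg, left_mastoid_idx, right_mastoid_idx):
    """eeg: (n_channels, n_samples) array recorded against the left mastoid."""
    mastoid_avg = 0.5 * (eeg[left_mastoid_idx] + eeg[right_mastoid_idx])
    return eeg - mastoid_avg  # subtract the averaged-mastoid reference from every channel

def extract_epochs(eeg, property_onsets, sfreq=250, tmin=-0.1, tmax=0.8):
    """Cut epochs around property-word onsets (onsets given in samples)."""
    pre, post = int(-tmin * sfreq), int(tmax * sfreq)
    epochs = np.stack([eeg[:, onset - pre:onset + post] for onset in property_onsets])
    # baseline-correct each epoch with the 100 ms preceding word onset (an assumed, standard step)
    return epochs - epochs[:, :, :pre].mean(axis=-1, keepdims=True)
```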

Figure 2. Relative placement of 29 scalp electrodes at which EEG was recorded.

For each time interval of interest we performed a 2 × 2 × 29 repeated measures ANOVA with the factors switch (switch/no-switch), target property modality (visual/auditory), and electrode site (29 levels). The dependent measure was the mean amplitude within the time intervals of interest. In cases where the overall analysis revealed a significant interaction between modality switch and property modality, follow-up analyses were conducted separately for the visual and auditory properties. Follow-up analyses thus involved factors switch (switch/no-switch) and electrode site (29 levels). The Huynh–Feldt correction was applied where relevant. For clarity, however, we report the original degrees of freedom.
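A minimal sketch of this analysis, assuming the mean amplitude in each window has already been tabulated per subject, switch condition, modality, and electrode in a long-format table (column and file names are hypothetical), could use the repeated-measures ANOVA in statsmodels. Note that AnovaRM does not apply the Huynh–Feldt correction itself; that adjustment would have to be computed separately.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# long_df columns: subject, switch ("switch"/"no-switch"),
# modality ("visual"/"auditory"), electrode (29 sites), mean_amp (mean microvolts in the window)
long_df = pd.read_csv("mean_amplitudes_200_500ms.csv")  # hypothetical file name

# 2 x 2 x 29 repeated-measures ANOVA on mean amplitude
res = AnovaRM(
    data=long_df,
    depvar="mean_amp",
    subject="subject",
    within=["switch", "modality", "electrode"],
).fit()
print(res)  # uncorrected degrees of freedom; apply the Huynh-Feldt correction separately
```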

Results

Behavioral Results

Analysis of reaction times failed to reveal any statistically significant effects in a 2 × 2 ANOVA testing switch (switch/no-switch) and modality (visual/auditory; all Fs < 2). Given that behavioral studies of this phenomenon typically do not test the modalities separately and employ data from at least 60 participants (cf. the 20 employed in the present study), these null results are likely due to a lack of power. The pattern of reaction times was in the expected direction for the visual properties (switch = 902 ms, SD = 152 ms; no-switch = 891 ms, SD = 155 ms) but not for the auditory properties (switch = 908 ms, SD = 148 ms; no-switch = 917 ms, SD = 163 ms).

Analysis of accuracy rates revealed a main effect of modality type with auditory properties showing worse accuracy than visual properties [F(1,19) = 13.81, p < 0.01]. There were no significant effects of switch condition for either the visual (switch = 0.92, SD = 0.07; no-switch = 0.94, SD = 0.05) or the auditory properties (switch = 0.86, SD = 0.09; no-switch = 0.87, SD = 0.09; Fs < 1), but both modality types showed slightly worse performance in the switch condition.

ERP Results

Probe properties elicited ERPs typical of visually presented words, an N1–P2 complex followed by the N400 and an LPC. The switch manipulation did not affect ERP waveforms in the early 100–200 ms interval. The switch manipulation modulated the amplitude of the N400 (measured 200–500 ms post-stimulus) and the LPC (measured 500–800 ms), but did so differently for visual and auditory properties. Whereas visual properties elicited a larger N400 for switch than no-switch trials, auditory properties elicited a larger LPC for the same comparison.

100–200 ms

Analysis of ERPs measured 100–200 ms after stimulus onset did not show any differences for analyses of switch effects (all Fs < 1). Nor did it reveal differences based on the modality elicited by the properties (all Fs < 1.4).

200–500 ms

Overall analysis of ERPs measured 200–500 ms after stimulus onset revealed a significant interaction between the switch and the modality factors [F(1,19) = 4.61, p < 0.05, MSE = 147.25]. Follow-up analyses of each individual modality revealed no effects in the auditory modality (Fs < 1; auditory switch = 5.08 μV, auditory no-switch = 4.76 μV), but a reliable switch effect in the visual one [F(1,19) = 4.93, p < 0.05, MSE = 135.52]. The latter reflects the slightly more negative (0.7 μV) ERPs elicited in the visual switch (4.53 μV) than the visual no-switch (5.21 μV) condition (Figure 3). Although this difference showed up as a main effect in the analysis, visual inspection suggests it was largest over centro-parietal sites characteristic of the classic N400 effect (Figure 4).

Figure 3. The N400 elicited by visual property verification targets in the switch (red) and no-switch (black) conditions. Each graph represents data recorded from a midline electrode over frontal (top), central (middle), and parietal (bottom) regions of the scalp. Time is plotted on the x-axis against voltage on the y-axis. By convention, negative polarity is plotted upward.

Figure 4. Topography of the switch effect for visual property verification targets.

500–800 ms

Overall analysis of ERPs measured 500–800 ms after stimulus onset revealed a significant interaction between the modality and the switch factors [F(1,19) = 5.27, p < 0.05, MSE = 162.78], as well as a marginal interaction between modality and electrode site [F(28,532) = 1.81, p = 0.10, ε = 0.20, MSE = 3.49]. Follow-up analyses suggested the interaction between modality and switch results from a positive-going switch effect evident only for auditory properties. Separate analysis of the visual modality revealed no effect of the switch factor, either as a main effect (F < 1; visual switch = 6.00 μV; visual no-switch = 6.22 μV), or in interaction with electrode site (F < 1). Separate analysis of the auditory modality suggested a trend for switch trials to elicit a slightly larger positivity (switch = 6.70 μV) than did no-switch trials [5.86 μV; F(1,19) = 3.02, p = 0.098, MSE = 201.31; see Figures 5 and 6].

Figure 5. The late positive complex (LPC) to auditory targets in the switch (red) relative to the no-switch (black) conditions. Each graph represents data recorded from a midline electrode over frontal (top), central (middle), and parietal (bottom) regions of the scalp. Time is plotted on the x-axis against voltage on the y-axis and negative polarity is plotted upward.

Figure 6. Topography of the switch effect for auditory targets.

We also followed up on the marginal interaction between modality and electrode site as the possible topographic differences were of interest to our question of access to underlying perceptual modalities by property words. We tested midline, medial, and lateral sites separately. Our midline test included factors of modality (visual, auditory) and anteriority (seven midline electrodes, see Figure 2). This test revealed a marginal interaction between modality and anteriority [F(6,114) = 2.67, p = 0.057, ε = 0.45, MSE = 1.35]. Our test of the medial sites was similar and also included a factor of hemisphere (right, left). This test also revealed a difference between modalities that interacted with anteriority [F(6,114) = 3.55, p < 0.05, ε = 0.41, MSE = 3.70], but no hemispheric differences were significant (Fs < 1.8). No differences at the lateral sites were observed (Fs < 2). The interaction effects between modality and scalp location can be seen in Figure 7 with the current source density (CSD) plots. These figures show that the visual and auditory properties result in different patterns of voltage change during this time interval.

Figure 7. Current source density (CSD) maps of responses to visual and auditory properties including both switch and no-switch conditions. The units are normalized values of microamperes per square meter. CSD maps highlight local differences between electrode sites likely to reflect nearby neural generators. These maps suggest a subtle difference in the configuration of neural generators and timing of activation for the visual versus auditory property stimuli during the 500–800 ms interval, particularly at 600 ms.

Discussion

The present study investigated the electrophysiological correlates of the conceptual modality switch effect, an effect used to argue that conceptual tasks recruit perceptual processing systems. We predicted that the sequencing of property verification trials in same modality versus different modality pairs would be reflected in semantic processing of target properties, and thus would modulate the amplitude of the N400 component of the ERP. While this was indeed the case for the visual properties we tested, it was not the case for the auditory properties. Relative to the no-switch trials, visual properties in the switch condition elicited a larger negativity in the N400 time window; by contrast, auditory properties elicited a larger positivity 500–800 ms after stimulus onset in the switch condition. No early differences emerged for the N1–P2 components, arguing against the suggestion that the switch effect involves low-level visual processing.

N400 Effect

The first effect of interest was the negativity observed 200–500 ms after the onset of visual property terms. As predicted, no-switch trials elicited a smaller negativity than did the switch trials during a time interval typically associated with the semantic processing of words and the elicitation of the N400 component. Experts differ on the exact functional significance of this component, with some arguing it indexes lexical access (Kutas and Federmeier, 2000; Lau et al., 2008), and others contextual integration processes (e.g., Hagoort, 2008). There is widespread agreement, however, that the N400 indexes processing events associated with the construction of meaning, and, further, that its amplitude is related to processing difficulty (see Wu and Coulson, 2005 for a review). In general, contextual factors that facilitate processing lead to reduced amplitude N400; for example, words elicit smaller N400 when preceded by related than unrelated words, and smaller N400 when preceded by supportive than unsupportive sentence and paragraph contexts (see Kutas and Federmeier, 2011 for extensive review).

Results of the present study suggest that the perceptual modality of the property term on a previous trial can comprise a supportive semantic context, and that N400 priming effects can be observed between subsequent decisions disguised to participants as completely independent trials. The smaller negativity observed here for the no-switch trials thus suggests that semantic processing of visual target properties was facilitated by processing a visual prime property relative to an auditory prime property. We attribute this facilitated processing to the use of modality specific sensory simulations to mentally represent objects. While perceptual modalities are recruited automatically during concept processing in general, attention can focus more or less on specific modalities. In the property verification task, the presentation of a modality specific property can direct attention to the relevant modality. If the next trial has a property from a different modality (as in the switch condition) the focus shifts to a simulation in the newly relevant modality in order to represent the property. This shift incurs a processing cost which is evident in the ERP differences observed in the present study and reaction time differences of previous studies (Pecher et al., 2003).

Our results are consistent with those of a recent study by Hald et al. (submitted). Hald et al. (submitted) also used a modality switch paradigm in which they presented visual and tactile properties and obtained N400 differences between switch and no-switch trials. Thus, it seems that the N400 effect for modality switching is robust. The identification of the N400 as an ERP index of the conceptual modality switch effect suggests that the cost of shifting between modalities, in this case driven by visual property words, is reflected in semantic processes. This further implies that the semantic activation indexed by this ERP component includes the activation of perceptual features. Results of the present study are thus consistent with ERP studies that have demonstrated modulation of the N400 based on categorical relations that imply similar visual features (Federmeier and Kutas, 1999), and so-called perceptual priming between items such as pizza and coin that share a salient visual feature (Kellenbach et al., 2000). In sum, results of the present study are in keeping with an account of concepts as involving sensorimotor simulations (e.g., Barsalou, 1999) and suggest that the access of visual features occurs during meaning processing.

LPC Effects

Two effects of interest were observed in the interval 500–800 ms after the onset of property terms. First, visual versus auditory properties elicited ERPs with subtle topographic differences (modality effects). Second, the switch manipulation modulated the ERPs to auditory but not visual property terms (modality switch effects).

Modality effects

Between 500 and 800 ms ERP patterns differed across midline and medial electrode sites for auditory versus visual property decisions. The positivity elicited by auditory properties was more fronto-centrally focused than that elicited by visual properties. Figure 7 illustrates this relatively subtle difference in the scalp topography, particularly visible at 500 and 600 ms after stimulus onset. The CSD maps plot the second spatial derivative of the ERP waveforms, and as such highlight differences in the voltage recorded at adjacent electrode sites. The electrode montage used in the present study was too sparse to allow localization, but the observed scalp topography differences imply differences in the neural generators underlying the brain response to visual versus auditory property terms. These differences observed between visual and auditory processing are compatible with related fMRI studies that show areas of unique brain activity for properties describing different modalities (Goldberg et al., 2006). The timing of observed topographic differences is later than initial semantic activation implicated in the generation of the N400 component. Semantic and pragmatic manipulations have, however, been observed to modulate the amplitude of the ERP in this interval (see e.g., Regel et al., 2010 for a review). Differences in the brain response to visual and auditory properties are consistent with the hypothesis that perceptual networks help subserve the neural representations of concepts, and the corollary that such networks would be different for concepts that predominantly activate one perceptual modality over another.
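For readers unfamiliar with CSD, the quantity plotted in Figure 7 follows the standard relation between current source density and the scalp potential; the expression below is the textbook form of that relation (a sketch for orientation, not a formula taken from the article’s own analysis).

```latex
% CSD as the negative surface Laplacian (second spatial derivative) of the
% scalp potential V at scalp location (x, y); standard relation, not article-specific.
\mathrm{CSD}(x, y) \;\propto\; -\nabla^{2} V(x, y)
  \;=\; -\left( \frac{\partial^{2} V}{\partial x^{2}} + \frac{\partial^{2} V}{\partial y^{2}} \right)
```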

Modality switch effects

The other effect of interest in the present study was a positive deflection of the LPC for auditory switch trials relative to the auditory no-switch trials between 500 and 800 ms, primarily at anterior electrode sites (see Figure 6). This effect is likely related to the P3, a family of ERP components that index memory processing, whose amplitude reflects the allocation of attention, and whose latency is proportional to the duration of task-relevant stimulus evaluation (see Polich, 2007 for a review). In view of the relatively long reaction times on the property verification task (>900 ms), the timing of the late positivity observed in the present study (500–800 ms after the onset of the auditory property term) is consistent with its interpretation as an index of the property verification decision. In studies of the P3, the same target stimulus has been shown to elicit a larger positivity in the ERP in difficult than in easy discrimination tasks (Comerchero and Polich, 1999). On this interpretation, the larger late positivity on the switch trials suggests the auditory property verification judgments were more difficult when preceded by a visual prime trial than another auditory one. Alternatively, the anterior distribution of the LPC switch effect suggests the predominance of the P3a sub-component associated with attentional orienting to novel stimuli (see Polich, 2007 for review). On this interpretation, the larger late positivity we observed need not imply greater processing difficulty, but rather an appreciation of the switch trials as involving more novelty than the no-switch trials – presumably because the switch trials required participants to activate semantic features from a different modality.

Hald et al. (submitted) also found a positivity for switch items elicited by a conceptual modality switch task but only over posterior electrodes, differing from the distribution described here (Figure 6). Their finding of a posterior positivity co-occurred with a larger negativity for switch trials over anterior electrodes in the same time intervals. The timing and scalp distribution of these effects were interpreted as a unified frontal N400 effect similar to that elicited by pictures. The different ERP patterns found by Hald et al. (submitted) at anterior and posterior electrode sites were revealed as a topography difference but this scalp difference cannot be compared to that reported in the current study because the topographic differences reported here were driven by different modalities, a dimension not tested by Hald et al. (submitted).

Differences between Visual and Auditory Property Verification

The most surprising result of the present study was the observed difference in the conceptual modality switch effect for visual versus auditory properties. As noted above, visual properties elicited reduced N400 in no-switch relative to switch trials, suggesting our experimental manipulation affected semantic processing of the targets. Auditory properties, however, elicited an enhanced LPC, suggesting the manipulation impacted neural processes occurring later than those indexed by the N400, and were more likely related to making the decision about whether the property was typical of the concept.

Whereas neither finding is surprising alone – that is, a conceptual modality switch might reasonably be predicted to impact either the semantic processing of the stimuli, or the difficulty of decisions regarding property verification, or, indeed, both sets of processes – our finding of semantic effects for visual properties and decision-related effects for auditory properties was unexpected. Prior reports of the conceptual modality switch effect using reaction time measures have found similar sized switch effects for properties from different modalities (Pecher et al., 2009). Similarly, studies of the perceptual modality switch effect also report similar sized switch effects for visual, auditory, and tactile stimuli, with the only difference being a trend for tactile primes to yield longer reaction times for subsequent visual and auditory probes (Spence et al., 2001). However, reaction times measure only the end point of a property verification process, while ERPs provide an index of brain activity from the onset of the stimulus until the generation of the behavioral response on the task. ERP data in the present study suggest the switch manipulation affects different aspects of processing in the verification of visual versus auditory properties.

Our observed differences between auditory and visual switch effects are consistent with a prior ERP study of the perceptual modality shift effect by Gondan et al. (2007) in which stimuli involved either LED flashes (visual targets) or bursts of white noise (auditory targets). They found that visual targets following visual primes, compared to visual targets following auditory primes, elicited ERP effects similar to those found for increased visual attention – namely, an amplified N1 component. In contrast, auditory targets elicited smaller N1 and P2 components when they followed auditory primes than when they followed visual primes. The fact that ERP differences for the switch effect were opposite in the visual and auditory domains was an unexpected asymmetry. The authors explain this asymmetry by suggesting that different mechanisms drive the switch effects in the two perceptual domains. They suggest a “neural trace” explanation for the auditory domain in which residual activity from an auditory prime speeds the response to and processing of a subsequent auditory stimulus. The result of this priming is a smaller ERP component for the target auditory stimulus. For the visual targets, ERP amplification in the same-modality condition is explained through attentional mechanisms, because increased attention tends to result in amplified perceptual ERP components. These different patterns suggest that different mechanisms might be driving the modality switch effect in the visual and auditory domains. Likewise, results of the present study suggest that different mechanisms were involved in the conceptual modality switch for visual versus auditory property terms.

One account for why different mechanisms would drive the conceptual switch effects in the present study is that the particular visual and auditory property words we used access the perceptual domains differently. In particular, the visual property words may refer to relatively pure visual experiences, whereas auditory properties may refer to mixed visual and auditory experiences. For example, green (as for asparagus) might refer to a purely visual perception while clicking (as for high heels) might refer to a combined auditory and visual experience. We examined this possibility using the Lynott and Connell (2009) norms. Lynott and Connell (2009) asked participants to what extent each of 423 property words was experienced via each of the five sensory modalities. Of the 48 property words used in each modality category of our study, 37 visual properties and 27 auditory properties were represented in their list. Our subset of visual property words had an average visual ranking of 4.65 (out of 5.0 possible) and the subset of auditory words had an average auditory ranking of 4.60 (two-tailed t-test, t < 1), confirming that the visual and auditory properties used in our study were strongly associated with their intended modalities.

However, when considering both visual and auditory rankings for each of these sets, our auditory properties appear more multimodal than our visual properties, as indicated by a smaller difference between their auditory and visual rankings (auditory property difference = 2.44, visual property difference = 4.18; t(41) = 8.99, p < 0.01). This classification is also consistent with the modality exclusivity scores available in Lynott and Connell’s (2009) norms. For each property word, the modality exclusivity score expresses the strength of the rating for an individual modality relative to the ratings for all five sensory modalities. The visual properties used in our study had a higher modality exclusivity ranking (0.73, of possible values between 0.0 and 1.0) than our auditory properties [0.58; t(59) = 4.31, p < 0.01].
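To make the exclusivity measure concrete, the following sketch computes it from the five per-modality ratings. The range-divided-by-sum formula is our reading of the usual description of the Lynott and Connell (2009) norms and should be treated as an assumption; the example numbers are made up.

```python
def modality_exclusivity(ratings):
    """ratings: mean ratings (0-5) for the five perceptual modalities of one property word."""
    values = list(ratings.values())
    # assumed formula: range of the five ratings divided by their sum
    # (values near 1 indicate a unimodal property, values near 0 a multimodal one)
    return (max(values) - min(values)) / sum(values)

# a strongly unimodal visual property versus a more multimodal auditory one (illustrative values)
print(modality_exclusivity({"visual": 4.8, "auditory": 0.5, "haptic": 1.0,
                            "olfactory": 0.2, "gustatory": 0.1}))  # ~0.71
print(modality_exclusivity({"visual": 3.5, "auditory": 4.6, "haptic": 1.2,
                            "olfactory": 0.3, "gustatory": 0.2}))  # ~0.45
```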

While it is clear that our auditory properties are characterized as typically experienced via hearing [as indicated by values derived from the Lynott and Connell (2009) norming study], their greater multimodal characteristic might have led to a weaker switch effect than that seen for the visual properties. In perceptual studies of the modality switch effect, a bimodal target stimulus (e.g., simultaneous beep and flash) following a unimodal stimulus (e.g., a flash) produces a smaller switch cost than unimodal targets following unimodal primes (e.g., a beep following a flash; Gondan et al., 2004). The reduction in the switch effect is presumed to result because only one of the two modalities making up the bimodal target stimulus requires a switch from the modality of the previous stimulus; the other, in fact, is primed. The absence of an observed N400 effect in our ERP results for auditory properties could reflect a lack of power to see an attenuated modality switch for these auditory properties that are more multimodal than the visual properties for which we did find an N400 effect. The decision-related LPC effect on the other hand would thus index more effort required to attribute the multimodal (auditory) property to a concept in the context of a visual prime.

Using a combination of published norms and dictionary definitions, we identified four of the visual target properties and eight of the auditory target properties employed in our study as being multimodal, that is, either having a modality exclusivity score (as defined by Lynott and Connell, 2009) of less than 0.51, or a dictionary definition that mentioned more than one modality. We eliminated multimodal items from ERP waveforms and conducted a post hoc analysis of ERPs elicited by the remaining unimodal stimuli. In the 200–500 ms window the same pattern of significant effects was obtained as for the complete dataset. Reanalysis restricted to unimodal items thus suggested the N400 switch effect for visual properties was robust and slightly larger than that measured for the full set of experimental items, but still failed to reveal an N400 switch effect for the auditory properties.

A similar reanalysis of the LPC interval failed to reveal either the modality by switch interaction (F < 1) or the auditory switch effect (F < 1) observed in the original analyses. This raises the possibility that the LPC switch effect observed in the present study primarily reflects the brain response to the multimodal items. Consistent with this, further analysis also suggested a trend for multimodal visual and multimodal auditory properties to elicit slightly more positive ERPs in the 500–800 ms interval than unimodal visual [1.13 μV, F(1,19) = 3.57, p = 0.074, MSE = 370.59] and unimodal auditory [0.84 μV, F(1,19) = 2.86, p = 0.107, MSE = 205.51] properties, respectively. According to this interpretation, the auditory switch effect observed in our original analysis reflects the greater difficulty of responding to multimodal auditory properties following (more likely unimodal) visual than (more likely multimodal) auditory primes. The greater multimodality of the auditory properties also suggests an alternative explanation for the different topography of ERPs elicited by auditory and visual properties measured 500–800 ms post-stimulus onset (discussed in “Modality Effects”). These reanalyses must, however, be interpreted with caution since the comparison of unimodal versus multimodal stimuli involves ERPs derived from different numbers of trials, and the number of visual multimodal trials was particularly low. More definitive conclusions regarding brain activity elicited by multimodal versus unimodal items in a property verification task would require a stimulus set specifically designed for this purpose.

Conclusion

The present study contributes to evidence demonstrating that concepts referring to perceptual properties recruit perceptual processing resources. Whereas previous studies have shown similar modality switch effects in conceptual processing, the present study provides more detailed information about the locus of this switch effect. ERP measures showed that the elicitation of perceptual meaning, as typically demonstrated by switching costs, is evident at the semantic level or at later decision-making stages of processing. The switch effect for visual properties differed from the switch effect for auditory properties, due either to different underlying mechanisms driving the processes or to different modal representations of these properties. Both explanations support a theory of concepts as a reactivation of brain areas important for the perception of the world. Just as seeing candles flicker generates different neural activity from hearing high heels click, we expect the concepts representing these events to differ as well.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

This research was supported by grants from the Netherlands Organization for Scientific Research (NWO) to Diane Pecher and from the Erasmus University Trustfonds to Diane Pecher and René Zeelenberg. We would like to thank Lauren Cardoso, Nafees Hamid, and Rubén Moreno for their assistance collecting data. The authors would also like to thank Lea Hald and Frederico Marques for their feedback and reviews that allowed us to improve this paper.

References

Barsalou, L. W. (1999). Perceptual symbol systems. Behav. Brain Sci. 22, 577–660.

Barsalou, L. W. (2008). Grounded cognition. Annu. Rev. Psychol. 59, 617–645.

Barsalou, L. W., Simmons, W. K., Barbey, A. K., and Wilson, C. D. (2003). Grounding conceptual knowledge in modality-specific systems. Trends Cogn. Sci. 7, 84–91.

Bentin, S. (1987). Event-related potentials, semantic processes, and expectancy factors in word recognition. Brain Lang. 31, 308–327.

Comerchero, M. D., and Polich, J. (1999). P3a and P3b from typical auditory and visual stimuli. Clin. Neurophysiol. 110, 24–30.

Federmeier, K. D., and Kutas, M. (1999). A rose by any other name: long-term memory structure and sentence processing. J. Mem. Lang. 41, 469–495.

Goldberg, R. F., Perfetti, C. A., and Schneider, W. (2006). Perceptual knowledge retrieval activates sensory brain regions. J. Neurosci. 26, 4917–4921.

Gondan, M., Lange, K., Rösler, F., and Röder, B. (2004). The redundant target effect is affected by modality switch costs. Psychon. Bull. Rev. 11, 307–313.

Gondan, M., Vorberg, D., and Greenlee, M. W. (2007). Modality shift effects mimic multisensory interactions: an event-related potential study. Exp. Brain Res. 182, 199–214.

Hagoort, P. (2008). The fractionation of spoken language understanding by measuring electrical and magnetic brain signals. Philos. Trans. R. Soc. Lond., B, Biol. Sci. 363, 1055–1069.

Holcomb, P. (1988). Automatic and attentional processing: an event-related brain potential analysis of semantic priming. Brain Lang. 35, 66–85.

Holcomb, P., and Neville, H. (1990). Semantic priming in visual and auditory lexical processing. Lang. Cogn. Process. 5, 281–312.

Kan, I. P., Barsalou, L. W., Solomon, K. O., Minor, J. K., and Thompson-Schill, S. L. (2003). Role of mental imagery in a property verification task: fMRI evidence for perceptual representations of conceptual knowledge. Cogn. Neuropsychol. 20, 525–540.

Kellenbach, M. L., Wijers, A. A., and Mulder, G. (2000). Visual semantic features are activated during the processing of concrete words: event-related potential evidence for perceptual semantic priming. Cogn. Brain Res. 10, 67–75.

Kutas, M., and Federmeier, K. D. (2000). Electrophysiology reveals semantic memory use in language comprehension. Trends Cogn. Sci. 4, 463–470.

Kutas, M., and Federmeier, K. D. (2011). Thirty years and counting: finding meaning in the N400 component of the event related brain potential (ERP). Annu. Rev. Psychol. 62, 621–647.

Kutas, M., and Hillyard, S. A. (1984). Brain potentials during reading reflect word expectancy and semantic association. Nature 307, 161–163.

Lau, E. F., Phillips, C., and Poeppel, D. (2008). A cortical network for semantics: (de)constructing the N400. Nat. Rev. Neurosci. 9, 920–933.

Lynott, D., and Connell, L. (2009). Modality exclusivity norms for 423 object properties. Behav. Res. Methods 41, 558–564.

Marques, J. F. (2006). Specialization and semantic organization: evidence for multiple semantics linked to sensory modalities. Mem. Cogn. 34, 60–67.

Martin, A. (2007). The representation of object concepts in the brain. Annu. Rev. Psychol. 58, 25–45.

Murphy, G. L. (2002). The Big Book of Concepts. Cambridge, MA: MIT Press.

Pecher, D., Van Dantzig, S., and Schifferstein, H. N. J. (2009). Concepts are not represented by imagery. Psychon. Bull. Rev. 16, 914–919.

Pecher, D., Zeelenberg, R., and Barsalou, L. W. (2003). Verifying properties from different modalities for concepts produces switching costs. Psychol. Sci. 14, 119–124.

Pecher, D., Zeelenberg, R., and Barsalou, L. W. (2004). Sensorimotor simulations underlie conceptual representations: modality-specific effects of prior activation. Psychon. Bull. Rev. 11, 164–167.

Polich, J. (2007). Updating P300: an integrative theory of P3a and P3b. Clin. Neurophysiol. 118, 2128–2148.

Regel, S., Coulson, S., and Gunter, T. C. (2010). The communicative style of a speaker can affect language comprehension? ERP evidence from the comprehension of irony. Brain Res. 1311, 121–135.

Rodway, P. (2005). The modality shift effect and the effectiveness of warning signals in different modalities. Acta Psychol. 120, 199–226.

Simmons, W. K., Ramjee, V., Beauchamp, M. S., McRae, K., Martin, A., and Barsalou, L. W. (2007). A common neural substrate for perceiving and knowing about color. Neuropsychologia 45, 2802–2810.

Smith, M. E., and Halgren, E. (1989). Dissociation of recognition memory components following temporal lobe lesions. J. Exp. Psychol. Learn. Mem. Cogn. 15, 50–60.

Solomon, K., and Barsalou, L. W. (2004). Perceptual simulation in property verification. Mem. Cogn. 32, 244–259.

Spence, C., Nicholls, M. E. R., and Driver, J. (2001). The cost of expecting events in the wrong sensory modality. Percept. Psychophys. 63, 330–336.

Urbach, T. P., and Kutas, M. (2002). The intractability of scaling scalp distributions to infer neuroelectric sources. Psychophysiology 39, 791–808.

van Dantzig, S., Pecher, D., Zeelenberg, R., and Barsalou, L. W. (2008). Perceptual processing affects conceptual processing. Cogn. Sci. 32, 579–590.

Vermeulen, N., Niedenthal, P. M., and Luminet, O. (2007). Switching between sensory and affective systems incurs processing costs. Cogn. Sci. 31, 183–192.

Wu, Y. C., and Coulson, S. (2005). Meaningful gestures: electrophysiological indices of iconic gesture comprehension. Psychophysiology 42, 654–667.

Keywords: property verification, modality switch, ERPs, grounded cognition

Citation: Collins J, Pecher D, Zeelenberg R and Coulson S (2011) Modality switching in a property verification task: an ERP study of what happens when candles flicker after high heels click. Front. Psychology 2:10. doi: 10.3389/fpsyg.2011.00010

Received: 16 July 2010; Paper pending published: 29 July 2010;
Accepted: 09 January 2011; Published online: 08 February 2011.

Edited by:

Anna M. Borghi, University of Bologna and Institute of Cognitive Sciences and Technologies, Italy

Reviewed by:

Lea Hald, Canterbury Christ Church University, UK
Frederico Marques, Lisbon University, Portugal

Copyright: © 2011 Collins, Pecher, Zeelenberg and Coulson. This is an open-access article subject to an exclusive license agreement between the authors and Frontiers Media SA, which permits unrestricted use, distribution, and reproduction in any medium, provided the original authors and source are credited.

*Correspondence: Jennifer Collins, Department of Cognitive Science, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92093-0515, USA. e-mail: jcollins@cogsci.ucsd.edu

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.