Research Article: Evidence for a common mechanism of spatial attention and visual awareness: Towards construct validity of pseudoneglect

Date Published: March 7, 2019

Publisher: Public Library of Science

Author(s): Jiaqing Chen, Jagjot Kaur, Hana Abbas, Ming Wu, Wenyi Luo, Sinan Osman, Matthias Niemeier, Alastair Smith.

http://doi.org/10.1371/journal.pone.0212998

Abstract

Present knowledge of attention and awareness centres on deficits in patients with right brain damage who show severe forms of inattention to the left, called spatial neglect. Yet the functions that are lost in neglect are poorly understood. In healthy people, they might produce “pseudoneglect”—subtle biases to the left found in various tests that could complement the leftward deficits in neglect. But pseudoneglect measures are poorly correlated. Thus, it is unclear whether they reflect anything but distinct surface features of the tests. To probe for a common mechanism, here we asked whether visual noise, known to increase leftward biases in the grating-scales task, has comparable effects on other measures of pseudoneglect. We measured biases using three perceptual tasks that require judgments about size (landmark task), luminance (greyscales task) and spatial frequency (grating-scales task), as well as two visual search tasks that permitted serial and parallel search or parallel search alone. In each task, we randomly selected pixels of the stimuli and set them to random luminance values, much like a poor TV signal. We found that participants biased their perceptual judgments more to the left with increasing levels of noise, regardless of task. Also, noise amplified the difference between long and short lines in the landmark task. In contrast, biases during visual searches were not influenced by noise. Our data provide crucial evidence that different measures of perceptual pseudoneglect, but not exploratory pseudoneglect, share a common mechanism. It can be speculated that this common mechanism feeds into specific, right-dominant processes of global awareness involved in the integration of visual information across the two hemispheres.
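The article does not include its stimulus-generation code; the following is a minimal sketch, assuming greyscale stimuli stored as NumPy arrays, of the kind of pixel-noise corruption described above (a chosen proportion of pixels replaced by random luminance values). The function name add_pixel_noise and its parameters are illustrative, not taken from the paper.

```python
import numpy as np

def add_pixel_noise(image, noise_level, rng=None):
    """Replace a proportion of pixels with random luminance values.

    image: 2-D array of luminance values in [0, 1] (greyscale stimulus).
    noise_level: proportion of pixels to corrupt, e.g. 0.0, 0.42, 0.84.
    """
    rng = np.random.default_rng() if rng is None else rng
    noisy = image.copy()
    n_noisy = int(round(noise_level * image.size))
    # Choose which pixels to corrupt, without replacement.
    idx = rng.choice(image.size, size=n_noisy, replace=False)
    # Give each selected pixel an independent random luminance.
    noisy.flat[idx] = rng.uniform(0.0, 1.0, size=n_noisy)
    return noisy

# Example: corrupt 42% of the pixels of a uniform mid-grey 256 x 256 image.
stimulus = np.full((256, 256), 0.5)
noisy_stimulus = add_pixel_noise(stimulus, noise_level=0.42)
```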

Partial Text

The term pseudoneglect refers to a set of intact functions of spatial attention and perceptual awareness in healthy people that feature small but robust leftward biases [1–3] and that are thought to complement some of the left-sided deficits that patients exhibit after right-brain damage, called spatial neglect [4, 5]. However, using a neuropsychological syndrome is a rather unsatisfactory way to delineate intact functions. It shows how little is known about the functions underlying pseudoneglect. As we will argue in the following, this is a problem of unclear validity of pseudoneglect research data.

In the landmark task, we examined the influence of pixel noise on biases using stimuli of different lengths, where only long lines yielded pseudoneglect; short lines produced no leftward biases, or only small numerical trends in the opposite direction. A Pixel Noise (0%/42%/84%) × Task (long vs. short lines) repeated-measures ANOVA produced an effect of Task (F(1, 21) = 17.00, p < 0.001, partial η2 = 0.447). Pixel noise had no main effect on biases (F(1.03, 21.56) = 3.54, p = 0.073). Crucially, however, the interaction between Noise and Task was significant (F(1.03, 21.56) = 7.76, p = 0.010, partial η2 = 0.270), indicating that pixel noise amplifies the difference between long and short lines (Fig 3A). The interaction was mainly driven by an influence of noise in the long line condition, as we observed when we submitted the long line data to a follow-up one-factorial ANOVA with factor Pixel Noise (0%/42%/84%; F(1.03, 21.61) = 8.37, p = 0.008, partial η2 = 0.285; significant at a Bonferroni-corrected level of 2.5%). Submitting the short line condition to an equivalent follow-up ANOVA yielded no significant effect (F(1.01, 21.29) = 0.88, p = 0.361). Furthermore, we conducted more detailed post-hoc analyses of the noise effect on long lines by organizing the data into two linear contrasts. The bias at 84% noise differed from the bias at the other noise levels (84% vs. average of 0% and 42%: t(21) = 2.915, p = 0.008, effect size = 0.288; significant after Bonferroni correction), whereas biases for 0% vs. 42% noise were not significantly different from each other (t(21) = -1.217, p = 0.237). Finally, we contrasted each of the six conditions with zero using one-sample t-tests. Leftward long line biases at all three noise levels were significantly different from zero (0% noise: average PSE = -1.459, SD = 2.011; t(21) = -3.403, p = 0.003, effect size = 0.726; 42% noise: average PSE = -1.096, SD = 1.829; t(21) = -2.810, p = 0.010, effect size = 0.599; 84% noise: average PSE = -6.772, SD = 9.155; t(21) = -3.470, p = 0.002, effect size = 0.739; significant after serial Bonferroni correction). Short line biases were not significantly different from zero (0% noise: average PSE = 0.028, SD = 1.084, t(21) = 0.122, p = 0.904; 42% noise: average PSE = -0.013, SD = 1.155, t(21) = 0.052, p = 0.959; 84% noise: average PSE = 1.300, SD = 6.858, t(21) = 0.889, p = 0.384).

The aim of the current study was to establish convergent and discriminant validity for pseudoneglect. To this end, we tested whether different pseudoneglect measures would be similarly influenced by the same experimental manipulation. Specifically, we manipulated pseudoneglect with pixel noise. We have previously shown that visual images corrupted by pixel noise amplify pseudoneglect biases in the grating-scales task [59], apparently due to visual activation [60] of attentional processes [40]. Therefore, in the current study we tested whether pixel noise has comparable effects on other measures of pseudoneglect. We found that biases measured with the landmark task [67, 71], the greyscales task [12, 13], and the grating-scales task [14] were influenced similarly by noise, whereas pseudoneglect observed during visual search [16] was not.
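The excerpt does not state which analysis software the authors used. As a rough illustration of the landmark-task analysis structure reported above (a 2 × 3 repeated-measures ANOVA on PSEs, a follow-up one-way ANOVA for long lines, and Bonferroni-corrected one-sample t-tests against zero), the sketch below uses the pingouin package on a hypothetical long-format table of per-participant PSEs. The column names and simulated values are assumptions for illustration only, not the study's data.

```python
import numpy as np
import pandas as pd
import pingouin as pg

# Hypothetical long-format table of landmark-task PSEs: one row per
# participant x line length x noise level. Column names and the simulated
# values are illustrative only.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "participant": np.repeat(np.arange(22), 6),
    "line_length": np.tile(np.repeat(["long", "short"], 3), 22),
    "noise":       np.tile([0, 42, 84], 44),
    "pse":         rng.normal(-1.0, 2.0, 22 * 6),
})

# 2 (line length) x 3 (pixel noise) repeated-measures ANOVA on the PSEs;
# pingouin reports partial eta-squared by default.
aov = pg.rm_anova(data=df, dv="pse", within=["line_length", "noise"],
                  subject="participant")
print(aov)

# Follow-up one-way ANOVA over noise for the long lines only, requesting the
# sphericity (Greenhouse-Geisser) correction implied by the paper's
# fractional degrees of freedom.
long_lines = df[df["line_length"] == "long"]
print(pg.rm_anova(data=long_lines, dv="pse", within="noise",
                  subject="participant", correction=True))

# One-sample t-tests of each condition's PSEs against zero, with
# Bonferroni correction across the six tests.
pvals = []
for (length, noise), grp in df.groupby(["line_length", "noise"]):
    pvals.append(pg.ttest(grp["pse"], 0)["p-val"].iloc[0])
reject, p_corrected = pg.multicomp(pvals, method="bonf")
print(reject, p_corrected)
```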
